Why your Enterprise AI strategy needs both open and closed models: the TCO reality check

This article is part of VentureBeat's special issue, “The real cost of AI: Performance, efficiency and ROI at scale.” Read more from this special issue.

Over the past two decades, enterprises have had a choice between open-source and closed proprietary technologies.

The original enterprise choice was largely about operating systems, with Linux offering an open-source alternative to Microsoft Windows. In the developer realm, open-source languages such as Python and JavaScript dominate, while open-source technologies, including Kubernetes, are standards in the cloud.

Enterprises now face the same type of open-versus-closed choice for AI, with multiple options in both categories. On the proprietary closed-model front are some of the largest and most widely used models on the planet, including those from OpenAI and Anthropic. On the open-source side are models such as Meta's Llama, IBM Granite, Alibaba's Qwen and DeepSeek.

Understanding when to use an open or closed model is a critical decision for enterprise AI decision-makers in 2025 and beyond. The choice carries both financial and customization implications that enterprises must understand and weigh for each option.

Understanding the difference between open and closed licenses

There is no shortage of hyperbole around the decades-old rivalry between open and closed licenses. But what does it actually mean for enterprise users?

A closed-source proprietary technology, such as OpenAI's GPT-4o, does not make its model code, training data or model weights open for anyone to see. The model is not readily available to be fine-tuned and, generally speaking, is only available for real enterprise use at a cost (sure, ChatGPT has a free tier, but that's not going to cut it for a real enterprise workload).

An open technology, such as Meta's Llama, IBM Granite or DeepSeek, has openly available code. Enterprises can use the models freely, generally without restrictions, including fine-tuning and customization.

Rohan Gupta, a director with Deloitte, told VentureBeat that the open versus closed source debate isn't unique or native to AI, nor is it likely to be resolved anytime soon.

Gupta explained that closed-source providers typically offer various wrappers around their models that deliver ease of use, simplified scaling, more seamless upgrades and downgrades, and a steady stream of improvements. They also provide substantial developer support, including documentation as well as hands-on advice, and often deliver tighter integrations with both infrastructure and applications. In exchange, an enterprise pays a premium for these services.

“Open-source models, on the other hand, can provide greater control, flexibility and customization options, and are supported by a vibrant, enthusiastic developer ecosystem,” Gupta said. “These models are increasingly accessible via fully managed APIs across cloud providers, broadening their distribution.”

Making the choice between open and closed models for enterprise AI

The question many enterprise users might ask is which is better: an open or a closed model? The answer, however, is not necessarily one or the other.

“We don’t see this as a binary choice,” David Guarrera, generative AI leader at EY Americas, told VentureBeat. “Open versus closed is increasingly a fluid design space, where models are selected, or even automatically orchestrated, based on tradeoffs between accuracy, latency, cost, interpretability and security at different points in a workflow.”

Guarrera noted that closed models limit how deeply organizations can optimize or adapt behavior. Proprietary model vendors often restrict fine-tuning, charge premium rates or hide the process in black boxes. While API-based tools simplify integration, they abstract away much of the control, making it harder to build highly specific or interpretable systems.

Open-source models, by contrast, allow for targeted fine-tuning, guardrail design and optimization for specific use cases. This matters more in an agentic future, where models are no longer monolithic general-purpose tools but interchangeable components within dynamic workflows. The ability to finely shape model behavior, at low cost and with full transparency, becomes a major competitive advantage when deploying task-specific agents or tightly regulated solutions.

“In practice, we foresee an agentic future where model selection is abstracted away,” Guarrera said.

For example, a user may draft an email with one AI tool, summarize legal documents with another, search enterprise documents with a fine-tuned open-source model and interact with AI locally via an on-device LLM, all without ever knowing which model is doing what.

“The real question becomes: which mix of models best fits the specific demands of your workflow?” Guarrera said.
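The workflow-level model mixing Guarrera describes can be sketched as a simple task router. Everything below is illustrative: the model names, task types and routing table are invented placeholders, not recommendations or real product names.

```python
# Minimal sketch of task-based model routing. Model names and routing
# rules are hypothetical placeholders for illustration only.
from dataclasses import dataclass


@dataclass
class Route:
    model: str       # which model serves the task
    deployment: str  # "vendor_api", "self_hosted" or "on_device"


# Hypothetical routing table: each task type maps to the model whose
# accuracy/latency/cost/control tradeoff fits it best.
ROUTES = {
    "draft_email":       Route("closed-frontier-model", "vendor_api"),
    "summarize_legal":   Route("closed-frontier-model", "vendor_api"),
    "enterprise_search": Route("fine-tuned-open-model", "self_hosted"),
    "local_assistant":   Route("small-open-model", "on_device"),
}


def route(task_type: str) -> Route:
    """Pick a model for a task; fall back to a general-purpose default."""
    return ROUTES.get(task_type, Route("closed-frontier-model", "vendor_api"))


print(route("enterprise_search").model)  # fine-tuned-open-model
print(route("unknown_task").deployment)  # vendor_api
```

The point of the sketch is the shape, not the entries: end users call `route()` indirectly and never see which model answered, which is the "abstracted model selection" Guarrera predicts.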

Considering total cost of ownership

With open models, the basic idea is that the model is freely available to use, whereas enterprises always pay for closed models.

The reality, when it comes to weighing total cost of ownership (TCO), is more nuanced.

Praveen Akkiraju, a director at Insight Partners, explained to VentureBeat that TCO has many different layers. Key considerations include infrastructure hosting costs and engineering: Are the open-source models hosted by the enterprise or by the cloud provider? How much engineering, including fine-tuning, safety and security testing, is needed to operationalize the model safely?

Akkiraju noted that fine-tuning an open-weight model can also sometimes be a very complex task. Closed frontier model companies invest enormous engineering effort to guarantee performance across many tasks. In his view, unless enterprises deploy comparable engineering expertise, they face a complex balancing act when fine-tuning open-source models. This creates cost implications when organizations choose their model deployment strategy. For example, enterprises can fine-tune multiple model versions for different tasks or use one API for multiple tasks.

Ryan Gross, head of data and applications at cloud-native services provider Caylent, told VentureBeat that, from his perspective, licensing terms don't matter, except in edge-case scenarios. The biggest restrictions often pertain to model availability when data residency requirements are in place. In that case, deploying an open model on infrastructure such as Amazon SageMaker may be the only way to get a state-of-the-art model that still satisfies the requirements. When it comes to TCO, Gross noted that the tradeoff lies between per-token costs and hosting and maintenance costs.

“There is a clear break-even point where the economics shift from closed to open models being cheaper,” Gross said.

According to him, for most organizations, closed models, with hosting and scaling handled on the organization's behalf, will have a lower TCO. However, for large enterprises, SaaS companies with very high demand on their LLMs but simpler use cases that don't require frontier performance, or AI-centric product companies, running distilled open models can be more cost-effective.
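Gross's break-even point can be made concrete with back-of-the-envelope arithmetic. All numbers below are made-up assumptions for illustration, not real vendor pricing: a closed API billed per token versus a self-hosted open model with a roughly flat monthly hosting cost.

```python
# Back-of-the-envelope TCO break-even between a pay-per-token closed
# API and a self-hosted open model. All prices are illustrative
# assumptions, not real vendor pricing.

def closed_monthly_cost(tokens: float, price_per_m_tokens: float) -> float:
    """Closed model via API: cost scales linearly with token volume."""
    return tokens / 1_000_000 * price_per_m_tokens


def open_monthly_cost(fixed_hosting: float) -> float:
    """Self-hosted open model: roughly flat GPU + ops cost,
    independent of volume (until more capacity is needed)."""
    return fixed_hosting


PRICE_PER_M = 10.0   # assumed $ per 1M tokens for a closed API
HOSTING = 20_000.0   # assumed $ per month for GPUs + engineering

# Break-even volume: fixed hosting cost divided by the per-token price.
break_even_tokens = HOSTING / PRICE_PER_M * 1_000_000
print(f"break-even at {break_even_tokens / 1e9:.1f}B tokens/month")  # 2.0B

# Below break-even the closed API is cheaper; above it, self-hosting wins.
assert closed_monthly_cost(1e9, PRICE_PER_M) < open_monthly_cost(HOSTING)
assert closed_monthly_cost(5e9, PRICE_PER_M) > open_monthly_cost(HOSTING)
```

This matches Gross's framing: a low-volume organization stays below the break-even line and is better served by a managed closed API, while a high-volume SaaS workload crosses it and makes fixed-cost self-hosting attractive.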

How one enterprise software developer evaluated open versus closed models

Josh Bosquez, CTO at Second Front Systems, is among the many who have had to consider and evaluate open versus closed models.

“We use both open and closed AI models, depending on the specific use case, security requirements and strategic objectives,” Bosquez told Venturebeat.

Bosquez explained that open models let his company integrate advanced capabilities without the time or cost of training models from scratch. For internal experimentation or rapid prototyping, open models help his company iterate quickly and benefit from community-driven advances.

“Closed models, on the other hand, are our choice when data sovereignty, enterprise-grade support and security guarantees are essential, particularly for customer-facing applications or deployments involving sensitive or regulated environments,” he said. “These models often come from trusted vendors, who offer strong performance, compliance support and self-hosting options.”

Bosquez said the model selection process is cross-functional and risk-informed, evaluating not only technical fit but also data handling policies, integration requirements and long-term scalability.

Looking at TCO, he said it varies significantly between open and closed models and that neither approach is universally cheaper.

“It depends on the deployment scale and the organization's maturity,” Bosquez said. “Ultimately, we evaluate TCO not just on dollars spent, but on delivery speed, compliance risk and the ability to scale securely.”

What this means for enterprise AI strategy

For technical decision-makers evaluating AI investments in 2025, the open versus closed debate isn't about picking sides. It's about building a strategic portfolio approach that optimizes for different use cases within your organization.

The immediate action items are straightforward. First, audit your current AI workloads and map them to the decision framework outlined by the experts, considering accuracy requirements, latency needs, cost constraints, security needs and compliance obligations for each use case. Second, honestly assess your organization's engineering capabilities for model fine-tuning, hosting and maintenance, as this directly affects your true total cost of ownership.

Third, begin experimenting with model orchestration platforms that can automatically route tasks to the most suitable model, whether open or closed. This positions your organization for the agentic future that market leaders, such as EY's Guarrera, predict, in which model selection becomes invisible to end users.
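The audit step above, mapping each workload's requirements to an open or closed lean, can be sketched as a toy scoring function. The attribute names, weights and threshold are invented assumptions for illustration; a real audit would weigh these factors per the experts quoted above, not with hard-coded integers.

```python
# Toy scoring sketch for the workload audit step. Attribute names,
# weights and the threshold are invented for illustration only.

def recommend(workload: dict) -> str:
    """Return a rough open/closed lean for one workload."""
    open_score = 0
    if workload.get("data_residency_required"):
        open_score += 2   # self-hosting open weights eases residency
    if workload.get("needs_deep_customization"):
        open_score += 2   # fine-tuning and guardrails favor open models
    if workload.get("needs_frontier_accuracy"):
        open_score -= 2   # frontier performance favors closed APIs
    if workload.get("low_engineering_capacity"):
        open_score -= 1   # managed closed APIs lower the ops burden
    return "open" if open_score > 0 else "closed"


print(recommend({"data_residency_required": True,
                 "needs_deep_customization": True}))   # open
print(recommend({"needs_frontier_accuracy": True,
                 "low_engineering_capacity": True}))   # closed
```

Even a crude rubric like this makes the portfolio point concrete: different workloads in the same organization will land on different sides of the line, which is exactly why the strategy should hold both model types.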
