Fine-Tuning the Future of AI: Argonautic is proud to support the evolution of the AI ecosystem and the entrepreneurs powering that innovation

SEATTLE, Sept. 30, 2024 /PRNewswire/ — In the evolving landscape of artificial intelligence and machine learning, foundation models – the backbone of predictive tasks – have captivated the tech world. Their ability to perform learning tasks is transforming the way we approach natural language processing, computer vision, and signal processing. At Argonautic, while we acknowledge the pivotal role foundation models play in the AI value chain, our perspective is that verticalizing these models, under the direction of teams with unparalleled subject matter expertise and access to proprietary training data, aligns with our capital-efficient thesis better than new generalized foundation models, which require high upfront training costs and face price commoditization in the long run.

Foundation models are pre-trained deep learning models that serve as a versatile, general base for computationally intensive predictive tasks. They are then ‘fine-tuned’ to perform function-specific tasks for a given use case. The term was coined by Stanford academics in 2021 and surged in popularity in 2022 to describe the models breaking out at the time. Foundation models are especially recognized for their ability to perform ‘zero-shot’ and ‘few-shot’ learning tasks, in which a task is performed with zero or only a few examples previously given to the model.
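
To make the ‘zero-shot’ idea concrete, the sketch below classifies a sentence into labels the model was never explicitly trained on. It is a minimal illustration only; the Hugging Face transformers library, the public facebook/bart-large-mnli checkpoint, and the example labels are our assumptions, not named in this release.

```python
# A minimal sketch of zero-shot classification, assuming the Hugging Face
# transformers library and the public facebook/bart-large-mnli checkpoint.
# The input sentence and candidate labels are illustrative.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "The borrower missed three consecutive loan payments.",
    candidate_labels=["credit risk", "marketing", "logistics"],
)
print(result["labels"][0])  # top label, with zero task-specific training examples
```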

Argonautic believes that while the evolution of foundation models is crucial for the advancement of AI and technology overall, business models focused on building these baseline models have high capital requirements and a high likelihood of commoditization over time. Instead, we believe teams building models with strong subject matter expertise and knowledge of the problem to be solved (which may be built on these generalized models) will reliably come out ahead in solving the most important problems. These teams have unique access to proprietary data that can be used to train their fine-tuned models, and unique distribution channels that more seamlessly insert AI-powered tools into business workflows.

Foundation models are commonly used for a variety of natural language processing, computer vision, and signal processing tasks. OpenAI’s “GPT-N” series captured the general public’s attention with its ability to craft coherent-seeming text given a wide range of prompts. At its core, GPT-N simply predicts the next word in a sentence, which, when scaled, produces coherent-seeming responses. It is trained on a large “corpus” of text data sourced mainly from the open internet, including forums, publications, and books.
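
The next-word mechanic described above can be observed directly with the openly available GPT-2, a small predecessor in the GPT-N series. The sketch below is illustrative only and assumes the Hugging Face transformers library and PyTorch; the prompt is our own.

```python
# Illustrative sketch of next-token prediction, the core mechanic described
# above, using the openly available GPT-2 via Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Foundation models are trained on a large corpus of"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence, vocabulary)

# The "prediction" is simply the highest-probability token at the last position.
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode(next_token_id))
```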

Argonautic maintains a strategic focus on industries with use cases that require verticalized models. While we recognize the importance of foundation models in advancing AI and machine learning, the concentration on constructing baseline models carries inherent limitations. Instead, Argonautic partners with teams that have strong subject matter expertise to build models tailored to specific problem domains, leveraging their unique access to proprietary data and distribution channels. Argonautic acknowledges the widespread applicability of foundation models in natural language processing, computer vision, and signal processing tasks. By emphasizing fine-tuning as the route to verticalization, Argonautic underscores the ability of companies to specialize their models while still benefiting from the underlying foundation model’s conversational interface. In this landscape, Argonautic steers away from significant capital deployment in general foundation models due to diversification risks and concerns about the potential disruption posed by open-source models and new architectures.

For a generalized foundation model to become ‘verticalized’, it must be “fine-tuned” by passing in an extra set of domain-specific data that tailors the generalized model to a use case. This allows companies to specialize their models while still benefiting from the conversational interface of the underlying foundation model. OpenAI’s ChatGPT, Alphabet’s Gemini, Meta’s Llama, and others are foundation-model-driven businesses that enable teams to build on top of them while avoiding billions of dollars of initial training costs.
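
As a rough illustration of what “passing in an extra set of domain-specific data” can look like in practice, the sketch below uses OpenAI’s hosted fine-tuning API as one concrete example; the JSONL file name, its contents, and the base model choice are hypothetical placeholders, not details from this release.

```python
# A minimal sketch of fine-tuning a hosted foundation model on domain data,
# using OpenAI's fine-tuning API. File name, data, and base model are
# hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Upload domain-specific prompt/response pairs (hypothetical file).
training_file = client.files.create(
    file=open("domain_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# 2. Launch a fine-tuning job on top of a generalized base model.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # an assumed fine-tunable base model
)
print(job.id)  # the tuned model keeps the base model's conversational interface
```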

Training a foundation model from scratch is a large, expensive, and complex data engineering undertaking. The architecture of these models typically relies on transformers, which have been the industry standard for a number of years. Where a foundation model differs is in scale: its success depends on the ability to seamlessly aggregate vast amounts of data into models with up to trillions of parameters. At the time of writing, this costs on the order of billions of dollars and will only grow as customer demands outpace the cost trends of computation and storage.
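
To ground the scale claim, the back-of-the-envelope sketch below applies the standard rough formula of 12 × layers × d_model² non-embedding parameters for a decoder-only transformer, using GPT-3’s published configuration as a reference point. The formula is a widely used approximation, not something stated in this release.

```python
# Back-of-the-envelope parameter count for a decoder-only transformer.
# Each layer holds ~4*d^2 attention weights (Q, K, V, output) plus ~8*d^2
# feed-forward weights, so ~12 * layers * d_model**2 overall (embeddings ignored).
def approx_params(layers: int, d_model: int) -> int:
    return 12 * layers * d_model ** 2

# GPT-3's published configuration: 96 layers, hidden size 12,288.
print(f"~{approx_params(96, 12_288) / 1e9:.0f}B parameters")  # ~174B vs. 175B reported
```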

Argonautic does not believe general foundation models are an area for capital deployment given our investment style. First, they require large checks, which create diversification risk for our investors. Second, we are wary of the risk of open-source foundation models and new architectures disrupting the economics of proprietary models. For example, Retrieval Augmented Generation (RAG) has changed the way enterprises look at retraining. Staying on or ahead of the curve is expensive and risky.
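
For readers unfamiliar with the technique, the sketch below shows the core RAG loop: embed documents once, retrieve the most relevant passage at query time, and inject it into the prompt rather than retraining the model. The library choice (sentence-transformers), the embedding model, and the toy documents are illustrative assumptions.

```python
# A minimal sketch of the RAG pattern, assuming the sentence-transformers
# library; documents and query are toy examples.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Q3 revenue grew 12% year over year.",
    "The new credit policy takes effect in January.",
    "Engineering headcount doubled last year.",
]
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

query = "What changed about the credit policy?"
query_vector = embedder.encode(query, normalize_embeddings=True)

# With normalized vectors, cosine similarity reduces to a dot product.
best = int(np.argmax(doc_vectors @ query_vector))

# The retrieved passage is prepended as context; no retraining required.
prompt = f"Context: {documents[best]}\n\nQuestion: {query}"
print(prompt)
```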

Since before the explosion of interest in private-sector machine learning models, Argonautic has believed the value of a model comes from a few areas: (1) a unique architecture that gives it a technical or economic advantage; (2) proprietary data that lets the model produce unique insights; and (3) the ability to integrate seamlessly into existing workflows. Unique architectures are often spun out of academic institutions with heavy financial backing. As such, our area of interest is teams who have demonstrated the ability to use their unique insight to solve a specific problem. Teams in this space tend to work on verticalized foundation models, which take general foundation models a step further with proprietary expertise.

For example, our portfolio company Cognaize, which automates financial spreading for large financial institutions, has accumulated years of financial data that allows it to fine-tune a defensible, verticalized foundation model in the financial technology space. Similarly, Document Crunch, which analyzes construction contracts for conflicting language, uses a corpus built over a number of years to produce exceedingly accurate results for its customers. ConCntric’s platform allows it to collect data that will eventually inform its own powerful predictive model. The specific problems our protein engineering teams solve cannot be adequately addressed by a general model. For our teams, the model’s differentiation is possible only because of the expertise of the overall team and is not reliant on a lasting technical edge.

As such, now more than ever, Argonautic is interested in teams that know the problem and market they are solving better than anyone else. This also protects companies from future disruption. Even with the next generation of trends, such as automated ‘AI agents’, we believe that teams with strong subject matter expertise are equipped to stay ahead of the pack. It is our view that general foundation models will never solve a specific problem more reliably than the combination of an elite team that understands the problem and a verticalized foundation model.

Argonautic is proud to have been deploying capital into AI since our founding. As technologists, we are excited to watch the field continue to change the world, and as investors, we see the opportunity to support this growth.

About Argonautic:

Founded in 2017, Argonautic is an AI/ML B2B venture capital fund investing across Fintech, Construction Tech, and Biotech. Argonautic invests in entrepreneurs who are redefining the future of technology and innovation.

For more information, visit argonauticventures.com.

SOURCE Argonautic
