AI Musings #4 – How To Differentiate As An AI Applications Startup?

Discussing two areas startups can focus on to create competitive differentiation in the AI applications layer – (1) data and (2) product flows.

Over the last few weeks, I have met a few exciting startups building in the application layer of AI. As the landscape stands today, the foundational LLM layer is likely to be dominated by a mix of open-source models, Big Tech, and perhaps one or two hyperscalers (e.g., Anthropic). The cloud infrastructure (compute, safety, security, etc.) to deliver these model capabilities will definitely be served by the Big Tech cloud players.

This leaves two categories for startups to exploit against these large competitors – (1) Applications and (2) Dev Tools. On the latter, I don't understand it deeply enough to have a view (yet). The application layer, however, is something I do get, and therefore I have some working POVs on it.

Almost all AI application-layer startups I am seeing right now are essentially using ChatGPT (plus Bard and Llama in a few cases) to build features that solve sharp use cases in specific verticals. Based on observation, some low-hanging verticals founders are going after include Insurance, Marketing, Sales, and HRTech, with AI-generated content as a horizontal ingredient in most of these products (e.g., automated email generation, stitching together a marketing video, crafting a training course outline).

In all these cases, I still struggle to see how these startups can create competitive differentiation or moats purely by building features on top of hyperscaler APIs. To take a step back:

For a new technology inflection to create viable startup opportunities, there need to be sizable areas where new companies are significantly better positioned than incumbents to leverage this new technology and solve unaddressed customer problems.

This is a really important point. For a startup to be viable, it is not enough to be an early adopter of cool technology and build new products before anyone else. The startup also has to create significant differentiation against entrenched competition. For example, Apple beat IBM in the PC inflection, Amazon beat offline retailers in the Internet inflection, Instagram and WhatsApp beat Facebook in the mobile inflection, and Figma beat Adobe XD in the cloud inflection.

This is the aspect I am pushing all AI application founders I meet to start thinking through and strategizing about from Day 0. A couple of ways to potentially drive competitive differentiation have emerged from these working sessions:

1/ Access to data

While ChatGPT is great for bootstrapping specific use cases, eventual product differentiation will come from startups fine-tuning their own LLMs (with open-source models as a starting point) on proprietary data sets for industry-specific use cases.

To put it simply, foundational models will keep doing a great job of adding horizontal knowledge. Startups will need to do the work of incorporating deep vertical knowledge into the models.
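To make the "horizontal base + vertical fine-tuning" intuition concrete, here is a deliberately tiny sketch – not a real LLM, just a character-bigram language model in plain Python. The corpora and numbers are made up for illustration: a model "pretrained" on general text gets noticeably better at domain text once it sees even a small vertical corpus (here, hypothetical insurance-claims language).

```python
# Toy illustration (NOT a real LLM): a character-bigram language model
# "pretrained" on general text, then fine-tuned on a small vertical corpus.
# All corpora below are invented for illustration purposes.
import math
from collections import defaultdict

def train(corpus, counts=None):
    # Count character-to-character transitions; passing existing counts
    # continues training (our stand-in for "fine-tuning").
    counts = counts if counts is not None else defaultdict(lambda: defaultdict(int))
    for text in corpus:
        for a, b in zip(text, text[1:]):
            counts[a][b] += 1
    return counts

def avg_neg_log_likelihood(counts, text, vocab_size=128):
    # Lower is better; add-one smoothing over an assumed 128-char vocabulary.
    nll = 0.0
    for a, b in zip(text, text[1:]):
        total = sum(counts[a].values())
        p = (counts[a][b] + 1) / (total + vocab_size)
        nll -= math.log(p)
    return nll / max(len(text) - 1, 1)

general = ["the quick brown fox jumps over the lazy dog"] * 50
vertical = ["claim denied pending adjuster review of policy endorsement"] * 5

base = train(general)                 # horizontal "foundation" knowledge
domain_text = "policy endorsement review"

before = avg_neg_log_likelihood(base, domain_text)
tuned = train(vertical, counts=base)  # fine-tune: continue training on vertical data
after = avg_neg_log_likelihood(tuned, domain_text)
```

Even in this crude setup, a small amount of the "right" vertical data moves the needle on domain text far more than more general data would – which is the whole argument for a proprietary-data moat.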

Here, access to the ‘right’ customer data will be critical. But then, entrenched incumbents would already have access to much more data than a 0-to-1 startup. So, how does a startup create a data advantage?

One way could be to identify unsolved pain points for customers that large pre-AI competitors aren’t going after, either because they are contextually unviable (Innovator’s Dilemma), were unsolvable pre-AI, or due to organizational inertia.

In these cases, AI-native startups can leverage their speed to get to the ‘right’ customer data sets before anyone else, and start creating an edge via custom fine-tuning and benefiting from faster learning cycles.

It’s interesting that the underlying driver of this differentiation is still good-old startup execution, rather than just building AI-first features. The company would still require classic software execution (founder-led sales, figuring out ICP, setting up GTM motions, etc.) to succeed.

2/ New product flows

Another area where startups could do better than the entrenched competition is putting in the work to develop AI-first product flows. We saw this happen in previous tech inflections, where new capabilities and form factors gave rise to new ways of doing specific jobs. For example, Apple cracked the smartphone user experience while Nokia struggled. Figma figured out how designers should work and collaborate with other functions in a fully hosted, in-browser experience, while Adobe remained stuck in its old UX.

Given the vast range of new capabilities that AI is unlocking (e.g., chat-based UX, AI 'agents' that deliver the specific tasks underlying use cases), it is reasonable to expect a plethora of new workflows to emerge across customer segments. Many of them will require absolutely fresh product thinking to crack, something that pre-AI product teams at established companies might struggle with.

Similar to the 'access to data' point earlier, the underlying driver of this product-flow differentiation will again be good-old, startup-style product management: Paul Graham's "do things that don't scale", starting with a wedge of a very specific customer persona and pain point, iterating frantically on it, and, in the words of Brian Chesky, "focusing on 100 people that love you, rather than getting a million people to kind of like you".

Putting things together…

If one looks at both the above areas of potential startup differentiation, the source of viable startup opportunities in AI is unlikely to be the LLM technology itself, which will become baseline, widely available (like cloud today), and likely open source (similar to programming languages like Java and Python).

Rather, the drivers of value creation by startups will be in:

(1) What is needed to effectively leverage these LLMs to solve verticalized, deeply industry-specific problems – e.g., pre-AI, a 10x backend engineer was needed to leverage the cloud; post-AI, specific datasets will be needed to leverage LLMs.

(2) The second- and third-order impact of AI on product experiences and workflows – Figma and Notion took years of fresh thinking and iteration to reimagine collaboration UX in the cloud. AI-first use cases will require similar untethered, ground-up product thinking to deliver these capabilities effectively to customers.

What does this mean for venture investing?

It means that even in the post-AI world, investors should continue to look for founding teams that demonstrate the classical startup traits, a few of them being – (1) the ability to unearth a unique customer insight, (2) the product thinking to solve for it, (3) the GTM skillset to create a differentiated business out of it, and (4) the grit to last through the journey.

Essentially, it might be a good idea to avoid AI overthink and keep doing more of the basics of venture capital.

Note: check out the previous post #3 in this AI Musings series – LLMs for Beginners.


Author: Soumitra Sharma

Operator-Angel | Product Leader | US-India corridor | Believer in Power Laws | Love building & learning
