Last week, my colleagues Shuvi and Dev wrote about new opportunities for founders in India to pursue, including desi models/LLMs and generative SaaS.
There’s another area that is seeing a massive amount of activity…
These are ground-up new infra and devtools to develop, deploy and manage AI-based applications. These products would be geared toward the generative AI developer community, as well as regular developers, data scientists and now even non-technical users.
The opportunity here is immense & we are beginning to see new companies tackle exciting problems across a wide variety of use cases. A few trends that stand out are:
- Increased developer productivity: inside organizations, developers are among the leading adopters of generative AI & are leveraging a plethora of tools to increase their productivity & code quality. A few value propositions in this context include 1) code generation, i.e. automatically writing boilerplate code and unit tests & generating complete applications based on certain inputs (a minimal sketch follows this list); 2) auto-completion of code using tools like GitHub Copilot & Amazon CodeWhisperer; 3) automated documentation of code. Improved developer productivity will likely have implications for R&D budgets, hiring funnels, product roadmap timelines & iteration cycles. Moreover, this could also significantly improve legacy business models like offshore technology services, given organizations can do a lot more with a lot less!
- Emergence of a new AI stack: enterprises are looking to leverage the full potential of AI on proprietary datasets, which typically include massive amounts of unstructured data (think text, audio, video, images, etc.). We are seeing the emergence of tools that aid the ingestion, transformation & storage of unstructured data in AI-relevant formats like vector embeddings (the embedding-and-retrieval sketch after this list shows the basic pattern). Tools like vector databases have been around for the last few years but have seen sharp growth in adoption over the last year, primarily because of advances in AI. Going forward, we expect to see many tools that will either augment or replace existing layers in the data engineering and machine learning stack.
- Rise of LLMOps: as organizations think about creating first-party & leveraging third-party LLMs, we expect increased maturity in the tooling ecosystem that enhances the usability of these models. In particular, use cases around performance monitoring, model observability & workflow management are likely to be the initial categories that see startup activity and customer adoption (the monitoring sketch below illustrates the kind of telemetry these tools capture).
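To make the code-generation value proposition concrete, here is a minimal sketch of asking an LLM to draft unit tests for an existing function. It assumes the OpenAI Python SDK (v1+) with an API key in the environment; the model name, prompts and the `slugify` function are purely illustrative, and any hosted or open-source model could be swapped in.

```python
# Minimal sketch: using an LLM to generate unit tests for existing code.
# Assumes the OpenAI Python SDK (v1+) and an OPENAI_API_KEY in the environment;
# the model name is illustrative and any comparable model could be swapped in.
from openai import OpenAI

client = OpenAI()

source_code = '''
def slugify(title: str) -> str:
    return "-".join(title.lower().split())
'''

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; use whichever model you have access to
    messages=[
        {"role": "system", "content": "You write concise pytest unit tests."},
        {"role": "user", "content": f"Write pytest tests for this function:\n{source_code}"},
    ],
)

print(response.choices[0].message.content)  # generated tests, ready for human review
```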
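The new AI stack is, at its core, an embed-and-retrieve loop: unstructured text is converted into vector embeddings and then queried by similarity. The sketch below assumes the open-source sentence-transformers library and uses a plain NumPy array as a stand-in for a vector database; the model name and documents are illustrative.

```python
# Minimal sketch of the embed-and-retrieve pattern behind vector databases.
# Uses sentence-transformers for embeddings and NumPy for similarity search;
# a production system would swap the NumPy array for a vector database.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative embedding model

documents = [
    "Quarterly revenue grew 18% on the back of new enterprise deals.",
    "The support team resolved 92% of tickets within 24 hours.",
    "Our mobile app crash rate dropped after the March release.",
]

# Ingest: embed each document into a fixed-length vector (normalized for cosine similarity).
doc_vectors = model.encode(documents, normalize_embeddings=True)

# Query: embed the question and rank documents by cosine similarity (dot product of unit vectors).
query_vector = model.encode(["How is customer support performing?"], normalize_embeddings=True)[0]
scores = doc_vectors @ query_vector
print(documents[int(np.argmax(scores))])
```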
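Finally, to illustrate what LLMOps tooling productizes, here is a minimal sketch of performance monitoring: each model call is wrapped so that latency, token usage and errors are emitted as structured logs. The `call_llm` argument is a hypothetical stand-in for whichever model API an application uses.

```python
# Minimal sketch of LLM performance monitoring: wrap every model call and emit
# structured metrics (latency, token counts, errors) that an observability
# pipeline can aggregate. call_llm is a hypothetical stand-in for a real model API.
import json
import time
from typing import Callable

def monitored(call_llm: Callable[[str], dict]) -> Callable[[str], dict]:
    def wrapper(prompt: str) -> dict:
        start = time.perf_counter()
        result, status = {}, "error"
        try:
            result = call_llm(prompt)
            status = "ok"
            return result
        finally:
            record = {
                "latency_ms": round((time.perf_counter() - start) * 1000, 1),
                "prompt_chars": len(prompt),
                "completion_tokens": result.get("usage", {}).get("completion_tokens"),
                "status": status,
            }
            print(json.dumps(record))  # in practice, ship this to a metrics/observability store
    return wrapper

# Usage (with any model client, hypothetical here):
# chat = monitored(lambda p: my_client.generate(p))
# chat("Summarize this support ticket...")
```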
The opportunity that generative AI represents at the infra layer is large & exciting! India has a massive developer base (9 million, the 2nd largest in the world!) and a large base of software founders. Time to start building!