THE NEXT SUPERCYCLE

Why AI-Enabled DeepTech Will Define the Next Era of VC

Every decade or so, venture capital undergoes a structural reset. Converging forces (technological, macroeconomic, societal, and occasionally geopolitical) change the rules of the game and define an entirely new class of category-defining companies. We saw it with the internet in the 1990s. We saw it again with mobile and cloud software in the 2010s. Investors across the VC landscape believe we are at the beginning of the next one, led by AI. At Odyssey, we believe we are only seeing the start of the positive disruption from AI. While LLMs and agents will undoubtedly bring massive efficiencies and labour improvements, the biggest and most genuinely groundbreaking transformations to our world will come from the application of AI to fundamental science and engineering.

Science and engineering breakthroughs, or ‘DeepTech’, have historically delivered huge returns for the venture investors brave enough to back them: think of chipmaker Cerebras, biotech Nimbus, or quantum breakout IonQ, each of which delivered more than 100x for their early investors. Indeed, LLMs and broader AI models, built on decades of deep IP, are themselves DeepTech. The very home of venture capital, Silicon Valley, was originally established to back the new silicon ‘chips’ emerging from the likes of Intel and AMD. However, the fundamentals that delivered those venture returns (enormous markets, deep moats) also required long development cycles and often heavy upfront investment. AI is re-wiring our entire approach to research and development, and in doing so dismantling the barriers that historically made DeepTech difficult to back. This powerful combination of AI and DeepTech is unlocking a generation of start-ups that promise to be the most consequential, and the most valuable, that venture has yet seen.

“AI is not replacing the scientist. It is giving science and engineering its venture capital moment.”

Part I: Why DeepTech Has Always Been Worth Backing

DeepTech, meaning ventures built on radical scientific and engineering advancement, is structurally advantaged relative to conventional technology investing. Rather than funding incremental improvements that win share in existing markets, DeepTech investors back fundamental breakthroughs that create entirely new markets, redefine industries, and erect barriers to entry that competitors cannot breach.

Defensibility that software alone cannot provide

Software has always had a relatively low barrier to entry, and AI coding assistants, open-source frameworks, and no-code platforms have reduced it further still. By contrast, the value of a DeepTech start-up is rooted in proprietary, hard-to-replicate IP developed in labs and validated through years of intense research and development. AI now accelerates the creation of that IP, working in partnership with founders to vastly expand datasets and compress R&D cycle times.

Hardware-enabled DeepTech further embeds this advantage: when a technology is literally embedded within core infrastructure (bolted to a grid, integrated into a manufacturing process, or delivering a medical treatment), the switching cost is not merely technological. It is political, logistical, and financial. Software can be updated overnight. Hardware cannot be replaced quite so easily.

Patents amplify this structural advantage. Start-ups built with patents achieve a gross IRR of 25%, compared to 20% for non-patented peers — a meaningful delta that reflects the compounding value of protected IP in markets where first-mover advantage is difficult to reverse.

Category creation, not category capture

Software investing is typically about disrupting incumbents and capturing a share of an existing market more efficiently. DeepTech investing is different. The most exciting companies in this space are not optimising markets; they are revolutionising established industries and, in many cases, creating categories that simply did not exist before. Think of IonQ’s quantum computers, Nimbus Therapeutics’ computational drug design, or Cerebras’s entirely new class of semiconductor. These are not faster horses; they are entirely new modes of transport.

And the numbers bear this out: DeepTech investments deliver an average IRR of 26% for their investors, versus 21% for traditional tech.

“These are not marginal outperformances. They reflect a structural truth: when you solve a genuinely hard problem, the market rewards you generously — and for a long time.”
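That five-point gap is worth making concrete. A simple illustrative calculation (treating the quoted averages as constant annual IRRs, which real funds never are) shows how it compounds over a ten-year horizon:

```python
def multiple(irr: float, years: int) -> float:
    """Gross multiple implied by a constant annual IRR held for `years`."""
    return (1 + irr) ** years

deep_tech = multiple(0.26, 10)   # DeepTech average gross IRR quoted above
trad_tech = multiple(0.21, 10)   # traditional tech average

print(f"DeepTech:         {deep_tech:.1f}x")   # ~10.1x
print(f"Traditional tech: {trad_tech:.1f}x")   # ~6.7x
print(f"Premium:          {deep_tech / trad_tech - 1:.0%}")  # ~50%
```

Held for a decade, a five-point IRR advantage becomes roughly half again as much capital returned.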

Part II: The Honest Case Against DeepTech

It would be intellectually dishonest to talk about DeepTech's promise without confronting its very real challenges. For decades, the sector has attracted brilliant scientists and frustrated investors in roughly equal measure. Understanding why is essential to understanding why the moment we are entering is so significant.

Capital intensity and long development timelines

DeepTech companies burn cash in a very particular way. The initial R&D phase requires substantial upfront capital, and extensive testing and iteration cycles, before anything resembling a product exists. Then comes the MVP — historically an expensive undertaking in physical hardware that cannot be prototyped on a laptop and shipped to first customers for immediate feedback. Then comes performance validation: proving durability over time and reliability in industrial or commercial environments. Each stage demands capital, and the runway to revenue can stretch to a decade or more.

Technology-first, customer-second

The culture of deep science is one of rigour, thoroughness, and intellectual integrity. These are extraordinary qualities in a laboratory, but the flip side can be disengagement from customers: a technology or product developed in complete isolation from the intended market. Too often, DeepTech founders have fallen into the trap of building what they know how to build rather than what the market actually needs. The result is technically brilliant products with poorly tested product-market fit.

Business models built for the wrong era

DeepTech has also been slow to borrow from its software cousins. While software sells into customers’ operating budgets, so that costs rise only with usage, hardware developers typically default to selling hardware: a capital-intensive transaction that requires customers to commit investment dollars before the product has been fully tested and adopted, and that often requires senior executive or board-level sign-off. Sales cycles grow long and complex, and the risk of losing deals at the final stage is uncomfortably high. The SaaS revolution that transformed software economics has, until recently, barely touched the deep technology world.

“The barriers to DeepTech were never insurmountable. They were waiting for the right tool to dismantle them.”

Part III: AI as the Great Enabler

AI changes all of this. Each of the structural disadvantages described above — the capital intensity, the long timelines, the product-market fit risk, the difficult business models — is being directly addressed by artificial intelligence, today.

Silicon over steel: compressing the R&D cycle

Perhaps the most significant shift is the ability to conduct R&D computationally rather than physically. AI-driven platforms can design experiments, operate robotic laboratory equipment, analyse results, and generate new hypotheses in a closed loop that runs around the clock. What previously required a six-to-twelve-month cycle of physical iteration — synthesising materials, characterising their properties, testing, and starting again — can now be compressed into one to two months. The cost savings from reduced materials, laboratory time, and human labour are substantial. More importantly, the speed of discovery changes the fundamental risk profile of the investment.

  • Cambridge, Massachusetts-based Lila Sciences is building an ‘AI Science Factory’: a fully autonomous platform in which AI models design experiments, direct robotic laboratory equipment, analyse results, and generate new hypotheses in a closed loop that runs without interruption. Rather than relying on pure simulation, which narrows the search space but misses critical findings only possible with physical synthesis, Lila's platform combines prediction in compute with experimentation in the real world. Lila has demonstrated its platform, producing catalysts and sorbents for industry, and is now using the $550m raised to date to build a 235,000 square foot AI Science Factory, which it will offer to corporates and start-ups as shared infrastructure for AI-driven materials discovery.
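The closed loop described above can be caricatured as a propose-run-analyse optimisation routine. This is a deliberately toy sketch, not any real platform's API: `propose` stands in for the AI model, `run_experiment` for the robotic lab, and the objective is an invented function peaking at 1.0.

```python
import random

random.seed(0)  # deterministic for the sketch

def propose(best_guess: float, n: int = 8) -> list[float]:
    """Stand-in for the AI model: candidate experiments near the current best."""
    return [best_guess + random.gauss(0, 0.2) for _ in range(n)]

def run_experiment(candidate: float) -> float:
    """Stand-in for the robotic lab: a toy objective that peaks at 1.0."""
    return -(candidate - 1.0) ** 2

def closed_loop(rounds: int = 20) -> float:
    """Design -> run -> analyse -> re-hypothesise, round after round."""
    best_score, best_guess = run_experiment(0.0), 0.0
    for _ in range(rounds):
        for candidate in propose(best_guess):   # design the experiments
            score = run_experiment(candidate)   # execute them
            if score > best_score:              # analyse the results
                best_score, best_guess = score, candidate
    return best_guess                           # the refined hypothesis

print(f"best candidate after 20 rounds: {closed_loop():.2f}")
```

The point of the caricature is the structure, not the maths: each pass through the loop narrows the search, and because it runs without human hand-offs it can iterate around the clock.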

Digital twins and rapid prototyping

AI-powered simulation can now create high-fidelity digital twins of physical products. These models predict material properties and device performance across thousands of conditions before a single physical prototype is built. Surrogate models can predict long-term degradation from short-term accelerated tests or infer internal cell states from external electrical signatures — compressing what used to be years of cycle-life testing into weeks. For startups operating in capital-constrained environments, this is transformative.

  • London-based PhysicsX is building precisely this infrastructure for advanced industrial engineering. Its platform combines AI-driven multi-physics inference with high-performance numerical simulation, enabling engineers to explore millions of design configurations — across fluid dynamics, thermodynamics, and structural analysis — in the time it previously took to evaluate dozens. In aerospace, automotive, semiconductor, and energy applications, PhysicsX customers are replacing multi-hour computational fluid dynamics runs with sub-second AI inference, compressing design iteration cycles by an order of magnitude. Its customers — working on systems ranging from advanced manufacturing components to energy transition hardware — are building things that would previously have required years of iterative physical prototyping, in a fraction of the time.
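The surrogate-model idea behind digital twins can be sketched in a few lines. This is an illustrative toy, not PhysicsX's method: the "expensive simulation" here is an invented function, and the surrogate is a simple polynomial fit where real platforms use neural networks or Gaussian processes.

```python
import numpy as np

def expensive_simulation(x: np.ndarray) -> np.ndarray:
    """Stand-in for a physics solver; pretend each evaluation takes hours."""
    return np.sin(2 * x) + 0.5 * x

# 1. Run the expensive simulator at a handful of design points.
train_x = np.linspace(0.0, 3.0, 12)
train_y = expensive_simulation(train_x)

# 2. Fit a cheap surrogate to those samples.
surrogate = np.polynomial.Polynomial.fit(train_x, train_y, deg=5)

# 3. Sweep thousands of candidate designs at near-zero cost.
candidates = np.linspace(0.0, 3.0, 10_000)
best = candidates[np.argmax(surrogate(candidates))]
print(f"best design parameter (surrogate): {best:.2f}")
```

Twelve expensive evaluations buy a model cheap enough to interrogate ten thousand times, which is the economic trick: the simulator sets the cost of training, not the cost of exploration.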

Rapid, evidence-based product-market fit

AI enables minimum viable products to be developed quickly and cheaply, tested with real customers, and iterated upon with the kind of velocity that was previously the exclusive domain of software. The feedback loop between product development and customer insight can now operate at a pace that keeps research investment firmly focused on genuine pain points — building products that customers pull towards them, rather than products that founders push into an indifferent market.

  • San Francisco-based Flow Engineering addresses the specific failure mode that most often derails DeepTech product development: the disconnect between what engineers build and what the system actually needs to do. Its AI-powered platform acts as a unified system of record for hardware development — linking requirements, architecture, design parameters, and verification in a single continuously updated model that AI agents can reason over directly. When a design changes, the system immediately surfaces which requirements are affected, which tests need updating, and which downstream components are at risk — feedback that previously required days of cross-team coordination to surface. The result is that hardware teams can iterate at speeds approaching software velocity: testing assumptions against real engineering constraints, identifying gaps between product and customer need, and redirecting R&D effort before months of development are sunk in the wrong direction. For DeepTech founders, tools like Flow represent a structural shift in how quickly that alignment between product and market can be found and maintained.
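The change-impact mechanism described above is, at its core, a traversal over a dependency graph. The sketch below is hypothetical and not Flow's actual data model or API; the artefact names and links are invented for illustration.

```python
from collections import deque

# Invented example graph: each artefact lists what it depends on.
DEPENDS_ON = {
    "thrust-requirement": ["nozzle-design"],
    "mass-requirement":   ["nozzle-design", "tank-design"],
    "vibration-test":     ["thrust-requirement"],
    "integration-test":   ["mass-requirement", "vibration-test"],
}

def impacted_by(changed: str) -> set[str]:
    """Everything downstream of a changed artefact, via breadth-first search."""
    # Invert the edges: which artefacts depend on each artefact?
    dependents: dict[str, list[str]] = {}
    for node, deps in DEPENDS_ON.items():
        for dep in deps:
            dependents.setdefault(dep, []).append(node)
    seen: set[str] = set()
    queue = deque([changed])
    while queue:
        node = queue.popleft()
        for nxt in dependents.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Changing the nozzle design surfaces the affected requirements and tests.
print(sorted(impacted_by("nozzle-design")))
# ['integration-test', 'mass-requirement', 'thrust-requirement', 'vibration-test']
```

Days of cross-team coordination collapse into a graph query: the hard part is keeping the graph current, which is exactly what a unified system of record provides.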

Business model evolution: DeepTech learns from SaaS

A new generation of AI-savvy DeepTech founders is learning from enterprise SaaS: licensing models that distribute services rather than products; toll manufacturing and fabless production models that outsource capital-intensive production to incumbents with existing infrastructure; and pay-per-use structures that convert customers' capital expenditure into operating expenditure. Together these dramatically shorten sales cycles, reduce customer commitment risk, and create the kind of recurring revenue dynamics that investors understand and value.

  • Houston-based Solugen has developed a proprietary Bioforge platform — modular enzymatic reactors that convert biomass into commodity and specialty chemicals without fossil feedstocks, heavy metals, or harmful byproducts. Rather than attempting to build and operate standalone chemical plants at scale — the approach that has sunk most green chemistry companies before first revenue — Solugen co-locates its Bioforge units within the facilities of established industrial partners. Customers pay for the chemical output on a volume basis; Solugen retains ownership of the process technology and earns a margin that more closely resembles a software licensor's than a commodity manufacturer's. The result: revenue exceeding $100 million with margins of approximately 60% — figures that would be unremarkable in enterprise SaaS but are almost unheard of in chemicals.

AI empowers and augments the experts

At its heart, AI empowers scientists and engineers to turn their incredible domain expertise into life-changing technologies, and this is the shift we find most exciting of all. For much of the last decade, the dominant narrative in technology entrepreneurship has centred on business model innovation and distribution strategy. What made founders successful was often their ability to navigate venture networks, execute growth playbooks, and hire full-stack engineering teams. The brilliant scientist, the exceptional materials engineer, the clinician who understood disease at a mechanistic level: these individuals often found the gap between their expertise and venture-scale entrepreneurship difficult to bridge.

AI is changing this. The technical expert is no longer constrained by the speed of experimental iteration or the cost of computational modelling. AI doesn't replace the scientist — it removes the scaffolding that previously required large teams, long timelines, and deep pockets. The domain expert who knows which questions to ask, which hypotheses are scientifically plausible, and which experimental results deserve deeper investigation is now the most valuable person in the room. AI needs them to direct the research and interrogate the outputs. Without that domain expertise, AI is a very powerful tool operating without a compass.

But perhaps the most profound shift is this: AI does not merely give domain experts better tools within their field; it allows them to extend beyond it. A chemist who deeply understands molecular behaviour can now build across biology, materials science, and pharmacology without requiring a separate expert team for each discipline. The computational scaffolding that once demanded specialised knowledge in adjacent fields (bioinformatics, process engineering, clinical data analysis) can now be navigated, and in some cases built, by a single expert with the right AI stack. This is genuinely new. The result is a generation of companies that are both deep and broad, built with fewer resources than the previous era of DeepTech would have thought possible. These companies are the ones that will define the next generation of VC.

“The era of the scientist-founder is arriving — and it will produce some of the most important companies of the next generation.”

Part IV: AI as the Great Accelerator — Solving our biggest challenges

AI-enabled DeepTech now provides us with the tools to solve the biggest problems humanity faces. The world is grappling with challenges of a scale and urgency that incremental solutions cannot address: energy resilience and availability, planetary health, defence expansion, constrained supply chains, labour restructuring. These are not niche market opportunities. They are the defining challenges of the next quarter-century.

Energy & the Clean Transition

Solar cell costs per watt have fallen 99.7% since 1975 — one of the most dramatic cost curves in economic history. AI is now accelerating the next chapter of that story by sifting through billions of chemical simulations to identify optimal materials for photovoltaics and battery storage. The materials discovery implications are extraordinary: while humanity identified approximately 20,000 stable inorganic compounds in the entire history of science up to 2023, Google's GNoME AI model has since increased that number to 421,000. The periodic table is no longer the constraint it once was.

System level advances are also set to transform energy. AI systems that process data from distributed generation, transmission, and demand sources can forecast, optimise, and balance electricity grids in ways that human operators or rule-based systems cannot approach.

Industry & Advanced Manufacturing

AI-driven robotics is set to transform labour economics in dangerous and physically demanding fields. Meanwhile, AI is enabling materials scientists to identify substitutes for expensive rare-earth elements — replacing them with abundant alternatives like zirconium, silicon, and graphene — potentially making whole categories of goods dramatically cheaper and supply chains significantly less fragile. For countries and companies building manufacturing and energy resilience, this is not a nice-to-have. It is strategic infrastructure.

Human & Planetary Health

In 2022, DeepMind's AlphaFold 2 determined the three-dimensional shapes of over 200 million proteins, compared to the approximately 190,000 that had been mapped in the entire prior history of structural biology. This is not an incremental advance. It is a civilisational shift in our ability to understand life at a molecular level — and it directly accelerates drug discovery, materials design, and our understanding of disease. AI is moving medicine from trial-and-error towards an exact science.

System-level advances are also transforming how we improve and maintain planetary health. AI platforms that combine satellite imagery with ground-level sensors can monitor crop health, detect pests, predict yield, and optimise intervention at the farm level, across hundreds of thousands of acres simultaneously. These are not incremental product improvements; they are capabilities that require the unique combination of deep hardware capability and AI intelligence to exist at all.

“We are not building better software. We are building the infrastructure for the next century of human progress.”

Part V: What This Means for Venture Capital

The implications for venture capital are profound. DeepTech's share of venture capital has already doubled in a decade and, in Europe, now exceeds every other sector. Novel DeepTech segments (space, robotics, compute, energy) hit a record $7.8 billion in 2024, up 56% year-on-year. DeepTech already delivers superior returns, averaging a 26% IRR versus 21% for traditional tech. However, it has historically taken 25–40% longer to develop and required up to 60% more capital.

This is why AI will fundamentally reshape the economics of DeepTech investment. The capital intensity, the timelines, and the execution risk are all being structurally mitigated. What remains is the upside: proprietary IP in large, underserved markets; business models with recurring revenue characteristics; and products that create genuine switching costs.

The companies being built at this intersection are not competing for the next marginal improvement in productivity software. They are competing to define how energy is generated and distributed, how disease is diagnosed and treated, how materials are discovered and manufactured, and how the physical world is managed at scale. These are not small markets. They are the foundations of the global economy. The energy transition alone is attracting over $2 trillion in annual investment. The global market for new materials, accelerated industrial processes, and AI-accelerated drug discovery runs to trillions more. These markets dwarf the productivity software and consumer app opportunities of the last supercycle.

For all these reasons, we are at a tipping point, and the beginning of the next venture capital supercycle. The last one produced search engines, social networks, and smartphone apps. This one will produce the companies that define how energy is generated, how disease is treated, how materials are made — and the investors who recognised the inflection early will look, in retrospect, like those who backed Google in 2000.