Hi, it's Alex from 20VC. I'm investing in seed & Series A European vertical solutions (vSol): industry-specific solutions aiming to become industry operating systems by combining dynamics from SaaS, marketplaces, and fintech. Overlooked is a weekly newsletter about venture capital and vSol. Today, I'm sharing the most insightful tech news from September.
Sunday, Sep. 1st: Ramp has released a report analyzing AI spending trends based on the purchasing behavior of its customer base. - Ramp
"In Q2, mean AP spend with AI vendors rose 375% year over year. As expected, these AP expenditures were dominated by OpenAI, with companies spending $181K on average."
"Anthropic is seeing particularly rapid adoption when we look at the percentage of businesses purchasing AI models on Ramp cards. After hovering at 4% at the start of the year, its market share jumped to 17% in Q2."
"Our data show customers who started spending with top AI vendors in 2023 are likelier to stick with these vendors than those who started spending in 2022 or 2021 - a signal that companies are increasingly convinced of AI's value."
"Vendors that offer novel approaches to everyday work are becoming commonplace, as companies turn to AI to boost employee productivity.
Suno AI, for instance, lets anyone generate music using simple text prompts.
Limitless runs in the background to capture your audio and screen automatically so you can generate meeting summaries, emails, and more at a later time.
Instantly offers AI for sales engagement and lead intelligence.
Cursor makes an elegant AI code editor. All of these vendors are popular with small and large businesses alike."
Monday, Sep. 2nd: Marie Brayer explored LLMs' adoption in enterprises. - SOTA 1, SOTA 2
There are currently three main enterprise use-cases: (i) transforming human language into computer actions (e.g. streamlining customer support, reducing admin tasks, removing the need for complex interfaces), (ii) helping people make sense of information overload and (iii) helping humans to produce content at a faster pace.
Today, people are using open-source because they want control and customisability but next year, cost will be the main reason to pick open source when people realize how much scaled LLM pipelines cost.
"However, seeing how many non-'pure-player' companies will keep investing 9 figures+ into training models in the long term will be interesting, as the game is a bit more expensive than traditional open-source."
"Historical data & ML platforms that already have access to enterprise data seem ideally positioned to train or help fine-tune models that could sweep the market. However, no killer use case or feature has added a clear line to their P&L yet."
"Whereas overall sentiment towards LLMs is excitement about the potential, the reality is that, except for a few determined use cases in high-tech companies (startups & software vendors), the market has not yet successfully produced clear wins that directly translate to an increase in P&L."
"The limits of LLM technology are overall very misunderstood even by seasoned data scientists because this field is brand new; plus, most of the LLM science is not mature. For example, getting RAG to deliver something other than a disappointing result is 'a dark art.' A lot of people who get '33% accuracy' on their pilot don't measure how difficult (or impossible) it is to reach 99% accuracy with the current models."
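To make the "33% accuracy" point concrete, here is a minimal sketch of how such a RAG pilot is typically measured. The `retriever` and `llm` objects and their methods are hypothetical stand-ins, not a specific library; the point is that retrieval misses and generation errors compound, so a pilot score says little about the effort needed to approach 99%.

```python
# Minimal sketch (hypothetical components): end-to-end accuracy of a RAG pipeline
# on a labelled evaluation set.

def rag_answer(question, retriever, llm, k=5):
    """Retrieve top-k passages, then ask the LLM to answer using only them."""
    passages = retriever.search(question, top_k=k)            # retrieval step
    context = "\n\n".join(p.text for p in passages)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return llm.generate(prompt)                               # generation step

def evaluate(eval_set, retriever, llm):
    """eval_set: list of (question, expected_answer) pairs."""
    correct = 0
    for question, expected in eval_set:
        prediction = rag_answer(question, retriever, llm)
        correct += int(expected.lower() in prediction.lower())  # crude string match
    return correct / len(eval_set)
```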
"The business model of an LLM pipeline is not a given, as the system can cost 1 to 10 euros per inference request (read that again) or more in compute or API costs for complex use cases. It's not as cheap as running 'normal data science' pipelines where the only cost is the team. Everything from fine-tuning to inference is expensive. Massive ROI needs to be present to justify the cost."
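A quick back-of-the-envelope calculation shows how a complex pipeline lands in that 1-10 euro range. All prices and token counts below are illustrative assumptions, not figures from the article.

```python
# Sketch of per-request cost for an LLM pipeline (assumed prices and token counts).

PRICE_PER_1K_INPUT = 0.01    # EUR per 1k input tokens (assumed)
PRICE_PER_1K_OUTPUT = 0.03   # EUR per 1k output tokens (assumed)

def request_cost(calls_per_request, input_tokens, output_tokens):
    """Cost of one end-user request that fans out into several LLM calls
    (retrieval re-ranking, drafting, self-critique, final answer, ...)."""
    per_call = (input_tokens / 1000) * PRICE_PER_1K_INPUT \
             + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT
    return calls_per_request * per_call

# A single chat completion is cheap...
print(round(request_cost(1, 2_000, 500), 3))      # ~0.035 EUR
# ...but an agentic pipeline with 20 calls over long contexts is not.
print(round(request_cost(20, 30_000, 1_000), 2))  # ~6.6 EUR
```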
"[LLMs are used] notably for their vision capabilities, to empower forms of RPA (robotic process automation)."
"LLMs helped document processors build more robust, higher-value use cases and might be killing the game."
UiPath is trying to incorporate LLMs into its platform via its seed investment into H. "H's goal is to develop a large action model to make LLM-based RPA a reality and deliver it to UiPath's customers through a tight-knit partnership."
ServiceNow acquired ElementAI for $500m to integrate AI into its platform. "The tier one AI team, created from the ElementAI acquisition, delivered on many fronts:
an impressive collaboration with Huggingface called BigCode, that led to the training of one of the best code generation model families,
a bunch of AI-powered internal tools and improvements, customer-facing or not,
a smart chatbot, 'Now Assist,' was released in Nov. 2023. You can think about it as a Copilot-Chatbot for all the functions supported by ServiceNow."
"LLM in the Enterprise is in the pilot phase for now, and the very high expectations of AI doing complex stuff face scientific and technical blockers before being delivered."
Tuesday, Sep. 3rd: Dan Shipper wrote about OpenAI's new Strawberry model. - Chain of Thought
"OpenAI has created a new model called Strawberry that could represent a major leap forward in its ability to reason."
"It appears to be a language model with significantly better reasoning abilities than current frontier models. This makes it better at solving problems that it hasn't seen before, and less likely to hallucinate and make weird reasoning mistakes."
"Process supervision means that during training, a model is rewarded for correctly moving through each reasoning step that will lead it to the answer. By comparison, most of today's language models are trained via 'outcome supervision.' They're only rewarded if they get the answer right."
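A schematic way to see the difference between the two training signals described above (the helper names are hypothetical; real reward-model training is far more involved):

```python
# Schematic contrast between outcome supervision and process supervision.

def outcome_reward(final_answer, correct_answer):
    """Outcome supervision: one reward signal, based only on the final answer."""
    return 1.0 if final_answer == correct_answer else 0.0

def process_reward(steps, step_verifier):
    """Process supervision: each intermediate reasoning step is scored, so the
    model is rewarded for how it got to the answer, not just the answer."""
    return sum(step_verifier(step) for step in steps) / len(steps)

# Example: a 4-step chain of thought where step 3 contains a mistake.
steps = ["parse the problem", "set up the equation", "drop a minus sign", "solve"]
verifier = lambda s: 0.0 if "drop a minus sign" in s else 1.0
print(process_reward(steps, verifier))                        # 0.75, partial credit
print(outcome_reward(final_answer=42, correct_answer=41))     # 0.0, no credit at all
```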
"OpenAI originally developed Strawberry to create training data for its newest foundation model, codenamed Orion. Strawberry can be prompted to generate a vast custom training set of problems with step-by-step solutions that Orion can learn to solve and, therefore, hopefully hallucinate less when it encounters similar problems in the wild."
"While Strawberry was originally built to create training data, OpenAI has plans to release a smaller, faster version of it as a part of ChatGPT as soon as this fall, potentially representing a major upgrade to the LLM's current reasoning abilities."
"OpenAI has chosen to use Strawberry to generate more synthetic data of a certain type: logic and reasoning problems, and their solutions."
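In pseudocode, the synthetic-data loop described in these excerpts looks roughly like this: a stronger reasoner model is prompted for problems with step-by-step solutions, and the results are collected as training examples. The `reasoner` object and its `generate()` method are assumptions for illustration, not OpenAI's actual pipeline.

```python
# Hypothetical sketch of generating step-by-step synthetic training data.
import json

def make_training_example(reasoner, topic):
    prompt = (
        f"Write one {topic} problem, then solve it step by step. "
        "Return JSON with keys 'problem' and 'solution_steps'."
    )
    raw = reasoner.generate(prompt)
    return json.loads(raw)          # {'problem': ..., 'solution_steps': [...]}

def build_dataset(reasoner, topics, examples_per_topic=1000):
    dataset = []
    for topic in topics:
        for _ in range(examples_per_topic):
            dataset.append(make_training_example(reasoner, topic))
    return dataset
```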
Wednesday, Sep. 4th: Meta shared data on Llama's usage trends and practical applications. - Meta
Llama's open-source nature has spurred rapid adoption, with 350m downloads on Hugging Face. Companies like AT&T, DoorDash, and Goldman Sachs are integrating Llama for AI-driven innovation.
"Accenture is using Llama 3.1 to build a custom LLM for ESG reporting that they expect to improve productivity by 70% and quality by 20-30%, compared with the company's existing way of generating Accenture's annual ESG report."
"DoorDash uses Llama to streamline and accelerate daily tasks for its software engineers, such as leveraging its internal knowledge base to answer complex questions for the team and delivering actionable pull request reviews to improve its codebase."
"Shopify is continuing to experiment with best-in-class open source models, including LLaVA, which is built on the foundations of Llama. They use finetunes of LLaVA for multiple specialized tasks and are currently doing 40M-60M LLaVA inferences per day supporting the company's work on product metadata and enrichment."
"Llama models are approaching 350 million downloads to date (more than 10x the downloads compared to this time last year), and they were downloaded more than 20 million times in the last month alone, making Llama the leading open source model family."
"Monthly usage (token volume) of Llama grew 10x from January to July 2024 for some of our largest cloud service providers."
Thursday, Sep. 5th: Ikea is adapting to changing consumer habits by expanding e-commerce, smaller urban stores, and sustainability efforts. Online sales now make up 23% of its total, and new "planning centres" and stores in cities like New York and London aim to bring Ikea closer to customers. A key initiative is its second-hand furniture platform, allowing customers to resell Ikea products with Ikea-provided pricing and instructions. This move supports both sales and environmental goals, as the company works to reduce waste and carbon emissions while promoting sustainability. - The Economist
"IKEA products already account for around a third of second-hand-furniture sales by volume."
"Since 2016 the company has expanded its sales by a third but reduced carbon emissions from its operations by half."
Friday, Sep. 6th: 6 Degrees Capital shared a presentation on the UK's fintech ecosystem. - 6 Degrees
Saturday, Sep. 7th: Fidelity wrote about the wealth management industry. - Fidelity
"In the United States alone, an estimated $84tn will change hands, primarily moving from Baby Boomers to Millennials and Gen Z."
"Many traditional wealth management firms are caught at a crossroads, struggling with legacy technology and ever-evolving client expectations. These cumbersome systems are expensive to maintain and lack the flexibility needed to keep pace with modern market demands. Today, nearly 50% of wealth management firms report that their technology platforms are not fully integrated. This lack of integration leads to inefficiencies, higher operational costs, and ultimately higher fees for clients. Additionally, these outdated systems often rely on manual processing, limiting scalability and slowing down service delivery."
"The number of people aged 65+ in the US is projected to rise from 60 million today to 100 million by 2060 (U.S. Census Bureau's Population Projections). This demographic shift presents significant opportunities to develop digital-first solutions for a tech-savvy generation entering retirement."
"Mid-market wealth managers and family offices have traditionally relied on a mishmash of legacy platforms to perform their administrative back and middle office tasks such as portfolio management, books of record and risk analysis. Designed for large institutions, this expensive and cumbersome software is a drain on the squeezed margins of mid-market firms. Companies like Performativ are already innovating in the space, offering a fully integrated, front-to-back wealth management operating system. Their platform seeks to consolidate all functionalities (portfolio management, CRM, risk analysis, back office) into a single solution, enabling wealth managers to deliver personalised services to their clients effectively."
Sunday, Sep. 8th: Paul Graham wrote an article arguing that conventional advice for running large companies often harms rather than helps founders. He emphasized the importance of a "founder mode," which allows founders to engage more directly with their teams instead of relying solely on traditional management practices. Understanding and developing founder mode could lead to better outcomes for startups as they grow. - Paul Graham
"I hope in a few years founder mode will be as well understood as manager mode."
"Whatever founder mode consists of, it's pretty clear that it's going to break the principle that the CEO should engage with the company only via his or her direct reports."
"You tell your direct reports what to do, and it's up to them to figure out how. But you don't get involved in the details of what they do. That would be micromanaging them, which is bad. Hire good people and give them room to do their jobs."
Monday, Sep. 9th: Hayden Capital wrote about AppLovin in its latest investor update. AppLovin offers marketing, monetization, and analytics solutions to mobile app developers to boost user engagement and revenue. - Hayden Capital
"Applovin is an advertising network for mobile apps (in particular, casual mobile games). Essentially, they are a market-maker for those looking to buy and sell ads - helping apps acquire users and monetize themselves, in an extremely competitive industry."
"The company facilitates over $10 billion of volume annually for its mobile gaming clients, and is expected to make $4.4BN in revenue, $2.5BN in EBITDA, and $1.8BN in Free Cash Flow this year."
"This is a two-sided market, and might be easiest thought of like a real estate transaction. AppDiscovery is the 'buyer's broker' and the Max mediation platform is the 'seller's broker'. They are one of the largest players on both the 'buyer's' side (3rd largest ad network, behind Google and Meta) and the 'seller's' side (~60-70% market share). Applovin takes a cut in the process, typically 20-30% of the total ad spend with the rest going to the publisher."
"This is why an effective ad network is extremely important to the mobile gaming ecosystem. Monetizing quickly & then pushing users to the next game is vital to the casual game business model. It's possible that without effective ad networks like Applovin, the entire genre of free-to-play games might not exist."
"There have been a few developments in the past few years though, that make this company especially exciting. Apple rolling out ATT, Unity's missteps, and Applovin's improved AI engine have all been enormous tailwinds."
"Applovin shrewdly recognized the need to 'own' first-party data early on, and spent ~$1 billion to acquire or partner with gaming studios, starting in 2018. These studios have a combined ~200 games, which provide data on over 200 million users to Applovin's advertising engine, that is outside the confines of ATT."
Tuesday, Sep. 10th: SaaS companies need to focus on NRR for sustainable growth in 2024, as expansion now accounts for a larger portion of growth. Achieving an NRR of over 100% is becoming more difficult, but it remains crucial for driving higher growth rates. Companies with strong NRR tend to grow faster, highlighting the importance of customer retention and finding the right customers. - ChartMogul
"Retention is now critical for long-term, sustainable growth. In 2024, companies with $15M-30M+ ARR are seeing 40% of their growth driven by expansion, compared to 30% in early 2021, when growth peaked."
"To succeed today, SaaS companies must focus on capital-efficient growth. Best-in-class SaaS companies now focus on retention and expansion, and mature companies are shifting their growth strategies to prioritize existing customers."
"During the first half of 2021, companies across all ARR segments reached their highest growth peak. They did this by acquiring new customers. But over the past three years, the way companies grew has relied much more on expansion. Bigger companies with larger workforces and customer bases naturally tend to focus more on retention. But today, not only is retention a focus, it's a growth driver. Expansion now contributes up to 40% of growth for companies with $15M-30M+ ARR."
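A worked example of how NRR and "expansion-driven growth" fit together, using assumed numbers (not ChartMogul's data) chosen to line up with the ~40% figure quoted above:

```python
# Net revenue retention (NRR) and expansion's share of new ARR, with assumed inputs.

starting_arr = 20_000_000         # ARR a year ago (assumed)
expansion = 2_000_000             # upsells/cross-sells to existing customers
churn_and_contraction = 1_000_000
new_business = 3_000_000          # ARR from customers acquired this year

nrr = (starting_arr + expansion - churn_and_contraction) / starting_arr
expansion_share_of_growth = expansion / (expansion + new_business)

print(f"NRR: {nrr:.0%}")                                               # 105%
print(f"Expansion share of new ARR: {expansion_share_of_growth:.0%}")  # 40%
```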
Wednesday, Sep. 11th: Rick Zullo at Equal Ventures wrote about the potential uniformisation and decline of VC firms. Concentration of capital among a few large firms is stifling innovation and diversity within the industry. Many emerging managers are pressured to imitate established firms, creating homogeneity. It could lead to a "VC extinction event" where only mega-firms survive, similar to asset managers like BlackRock. To avoid this, stakeholders must embrace more diverse strategies and thinking to prevent the industry's collapse. - Rick Zullo
"The majority of the LP landscape remains focused on these big names and they are heavily incentivized to do so."
"When LPs do invest in a new emergent fund, their preference is to back spin-outs of these large organizations or those with deep ties to those firms. I personally believe this is misguided (most of the best firms from the last 2 decades (e.g. USV, First Round, Founder Collective, IA Capital, Forerunner, Emergence, etc.) were not spin-outs from large Bay Area venture shops), but also creates an insular feedback loop to reinforce the mega fund machine." "LP preference for this pattern has incentivized the landscape of emerging managers, pressuring managers to look more and more like miniature copies of these firms."
"When a small subset of funds force the industry to uniform, consensus investing, then I think we've lost the essential nature of VC. At that point, smaller firms are just outsourced resources, not independent investors."
"These incentives may yield a terminal state where venture capital ceases to exist in its current form - one where our industry becomes so uniform in feeding the mega fund machine (making those firms even bigger and these incentive structures stronger) that our industry ultimately looks more like BlackRock than Benchmark."
"My fear is that we will see an extinction level event to the venture industry if left unaddressed."
Thursday, Sep. 12th: I listened to a Colossus podcast episode on AI with Gavin Baker, who is managing partner and CIO at Atreides Management. - Colossus, Gavin Baker
Reaching AGI is seen as the ultimate race between tech giants, which are not focused on ROI at the moment. "The people who actually control these companies... believe they're in a race to create a Digital God... if they lose that race, losing the race is an existential threat to the company." "Mark Zuckerberg, Satya and Sundar just told you in different ways, we are not even thinking about ROI."
GPT models may not stay commodities forever. "Once we get to GPT-7 or 8 that literally costs $500 billion to train, I don't think they're going to stay commodities."
Multiple competing LLMs are crucial for humanity. "It is supremely important for humans that we do not end up in a world where there is just one dominant model. That is the most dystopian future I can imagine."
"ROI on AI has been positive thus far. This is undeniable. ROIC at the largest spenders on AI has gone up significantly since they ramped their datacenter/GPU capex spend last year. Most of the ROI on AI thus far has come from improved advertising targeting and creative leading to higher ROAS for customers."
"The only way to generate ROI on a model is to have unique, valuable data and internet scale distribution. Absent unique data and internet scale distribution, these models are essentially commodities today (might change as scale becomes a barrier) which is why it was relatively easy for Meta to open-source Llama - the value will mostly come from the data, not the model, at least for now."
Robotics combined with AI will redefine the labor market. "Putting LLMs into these humanoid robots I think is going to be so transformational for the world and make a lot of blue-collar labor optional."
Synthetic data is allowing AI models to continue scaling despite data limitations. While no one fully understands why synthetic data works, it is already proving to be a solution to a potential bottleneck in AI training.
AI tools are replacing repetitive tasks in industries like research, advertising, and software development. Even simple AI wrappers around LLMs are already transforming workflows, making workers significantly more efficient.
Friday, Sep. 13th: I listened to Josh Wolfe on the Logan Bartlett Show. - Logan Bartlett
"I'm competitive. I want to be smarter than the next person. I want to know something that somebody else doesn't know. That is an addictive feeling." "I need to know more than you about this particular topic."
"Today, a lot of people are working on all the different variations of GLP-1. It's a crowded space. I would not be very bullish on the incremental advance there because it has to be so much better that it's improbable to really get the attention to get the investor demand, to get the media attention and to get the desire for people to want to work for that company."
"If you look at all the risks of a company (financing risk, market risk, technology risk, people risk, competition risk), every risk that we can identify and kill with time or talent or money thrown at it should be rewarded. A later investor that comes should pay a much higher price and demand a lower quantum of return. I should get rewarded in the form of value creation."
"We want people to agree with us, just later. The contrarian aspect [of my investing style] is just being earlier."
"I just start reading scientific papers [on a topic I'm interested in] and my goal is to be incrementally smarter in every conversation so that by the sixth or seventh person I'm talking to [on this topic, they are impressed by my knowledge]. That to me is sort of like a competitive, obsessive thing. From nuclear waste, to geopolitical controversy in the Sahel and Maghreb, to aerospace and defense, to satellite and manufacturing. I need to know more than the next person about whatever that new thing is."
"I get obsessed about something typically on like a six, eight week basis. Sometimes, I get obsessed with something and it just sits there. Sometimes it's something where I'm like, this is inevitable. But it might just take forever and I'm waiting for the entrepreneur, or maybe there's a technological breakthrough."
Saturday, Sep. 14th: Nichole Wischoff shared the fundraising deck she used to raise her $50m third fund. - Nichole Wischoff
Sunday, Sep. 15th: Chris Paik at Pace Capital wrote an essay on being intentional in understanding one's own judgment and biases while investing. - Chris Paik
Investing involves two areas of study: the outward-facing world of evaluating businesses and markets, and the inward-facing world of understanding oneβs own judgment and biases. While most investors focus on the former, true success often comes from studying oneself.
"In the world of investing, there are two areas of study. The first world is outward-facing - the study of what makes a good investment opportunity, a good business, fundamentals, frameworks, etc. The second world of study is inward-facing - the study of one's own judgment, mental biases, where intuition is perfectly right and where it is perfectly wrong, motivators that skew incentives, and our natural tendency to want to outsmart ourselves."
"Objectively, you are capable of having 100% conviction in something and also being 100% wrong. Sitting with that as a truth about one's judgment, while uncomfortable, ought to prompt us to be hungry to debug ourselves (rather than dismiss it to protect our ego). If we do this well enough, we can study ourselves at a distance, like an anthropologist, and come to valuable conclusions and workarounds."
"If I don't do the work because I'm too lazy, that means I don't feel strongly enough about the opportunity and we should not make the investment."
"I should pursue investment opportunities alone because if I have a team that does the work, we will end up with a memo and have to make an investment decision but I won't know how convicted I am in the opportunity."
"The challenge with this inward-facing world of study is that it is fractal. We constantly try to outsmart ourselves once we know how we operate. It is a constant cat and mouse game between our rational self and meta-analysis, each vying for superposition. But it's also where the most alpha as investors comes from and is wholly unique to each individual."
Monday, Sep. 16th: Jordan Nel at Hummingbird wrote about the growing gap between large venture funds and small/emerging funds. He argues that small funds are the only ones that can be non-consensus and high-alpha, while larger funds face challenges due to inflated valuations and over-reliance on consensus deals. - Jordan Nel
"Big funds are forced to look for deals which can both accommodate large checks and have potentially enormous outcomes. So they invest in companies with pedigreed teams, obvious markets, and which could be enormous, but which also raise big dilutive rounds and have a lot of expectation baked into their valuations."
"Venture is meant to be the riskiest asset class, where conviction matters the most. With the big funds now facing deployment pressure, doing more expensive deals, and having trained a generation of investors to source and sell rather than pick, the field is more open than most think."
"Investing with emerging managers requires parsing their real unique insights (for instance: why did they invest in this specific team and not their competitor with the same narrative and more pedigree?) and getting comfortable that they'll nail the transitory period of non-consensus to consensus."
Tuesday, Sep. 17th: Tanay Jaipuria compared OpenAI and Anthropic's revenues. - Tanay
OpenAI is forecasted to generate $3.7bn in revenues in 2024, with 73% coming from ChatGPT subscriptions and 27% coming from its API. OpenAI will burn $5bn in 2024. Its API business operates at a 50% gross margin, meaning that the losses mostly come from its ChatGPT business and operating costs.
Anthropic is forecasted to reach $1bn in annualised run rate revenues at the end of the year (10x YoY growth) with 60% coming from its API distributed via Amazon, 25% from its own API and 15% from its Claude subscription. Anthropic will burn $2bn in 2024.
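Working through the figures in the two paragraphs above (rounded; these are forecasts, not audited numbers):

```python
# Breaking down the forecasted revenue splits quoted above.

openai_revenue = 3.7e9
openai_chatgpt = 0.73 * openai_revenue        # ~$2.7bn from ChatGPT subscriptions
openai_api = 0.27 * openai_revenue            # ~$1.0bn from the API
openai_api_gross_profit = 0.5 * openai_api    # ~$0.5bn at a 50% gross margin

anthropic_run_rate = 1.0e9
anthropic_via_amazon = 0.60 * anthropic_run_rate    # ~$600m API via Amazon
anthropic_own_api = 0.25 * anthropic_run_rate       # ~$250m own API
anthropic_claude_subs = 0.15 * anthropic_run_rate   # ~$150m Claude subscriptions

print(f"OpenAI ChatGPT: ${openai_chatgpt/1e9:.1f}bn, API: ${openai_api/1e9:.1f}bn")
print(f"Anthropic via Amazon: ${anthropic_via_amazon/1e6:.0f}m")
```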
"Distribution remains king, even for developer products. Anthropic generating ~65% of their revenue from their third party API highlights how important distribution is."
Wednesday, Sep. 18th: AI start-ups are generating revenue faster than previous tech companies, with many reaching $1m within 11 months, compared to 15 months for SaaS companies. AI firms achieving $30 million in revenue did so in 20 months, 5x faster than SaaS counterparts. - FT
Thursday, Sep. 19th: I listened to a BG2 podcast episode with Bill Gurley and Brad Gerstner. - BG2
OpenAI's early lead in consumer AI is fuelled by network effects and strategic advantages: more users generate more data, leading to better models and attracting even more users.
A combination of factors is contributing to the decrease in initial public offerings, including regulatory hurdles for going public, increased access to liquidity through secondary markets, and evolving motivations for both founders and investors.
There is concern about the potentially detrimental effects of too much capital flowing into the VC market. This "overfeeding" of companies could lead to operational inefficiencies, unsustainable burn rates, and hinder innovation rather than foster it.
People are enthusiastic about the potential for advanced AI voice applications. Advanced voice applications are likely to be very "reinforcing" for users, leading them to engage with AI products more and more. This could give companies that develop these applications a significant advantage over their competitors. However, there are some concerns that the high cost of running advanced AI voice applications, particularly when it comes to inference, may prevent these companies from offering them at an affordable price. It is still unclear what the best business model is for advanced AI voice applications, and how much consumers would be willing to pay for them. Some have suggested that consumers would be willing to pay a lot for a "perfect" AI assistant, but no such product exists yet.
The demand for AI inference is increasing and is expected to increase dramatically in the future. The increasing demand for inference is being driven by new AI models that rely heavily on inference, such as those that use inference-time reasoning (a technique where the model performs reasoning steps during the process of generating a response to a prompt, rather than just generating a response in a single step). OpenAI's new AI model, Strawberry, uses inference-time reasoning and requires 100x more inference than models that use single-shot prompting. In the future, there will be much more machine-to-machine communication, which will also drive up the demand for inference.
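To see why inference-time reasoning multiplies inference demand, compare it to single-shot prompting: instead of one completion per request, the model produces many longer reasoning traces and picks one, so token volume grows with samples times trace length. The `llm` client below is a hypothetical stand-in, not a specific vendor's API.

```python
# Illustrative sketch: single-shot prompting vs. inference-time reasoning.

def single_shot(llm, prompt):
    return llm.generate(prompt)                       # 1 completion per request

def reasoned_answer(llm, prompt, n_samples=16, scorer=None):
    """Sample several step-by-step solutions, keep the best-scoring one."""
    traces = [
        llm.generate(f"{prompt}\nThink step by step before answering.")
        for _ in range(n_samples)                     # n completions, each longer
    ]
    if scorer is None:
        return traces[0]
    return max(traces, key=scorer)
```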
The increasing demand for AI inference is creating constraints for companies developing AI products and services. This is because inference requires a lot of computational power, which is expensive. This is reflected in the fact that companies like OpenAI and Microsoft are reportedly "inference constrained," and that the release of OpenAI's new voice model has been slow, perhaps because it is very expensive to run.
Friday, Sep. 20th: Thomas Laffont, GP at Coatue, talked about the unicorn economy. - All-In Summit
Funding remains healthy but exits are challenging. "Funding is still actually pretty healthy... but if we look at exits... we're at pre-levels without really any substantial increases."
Traditional exit options are limited. "The three kind of traditional exits for companies are blocked today... fewer buyouts, fewer IPOs, and M&A is restricted."
VC distributions are at an all-time low. "Distributions from VCs back to their investors are essentially at all-time lows... almost back to Global Financial Crisis levels."
Private companies are struggling to raise funds. "In today's market, it takes longer to raise funds... and down rounds and bridge rounds make up almost 63% of total rounds."
Unprofitable tech and SaaS companies are underperforming in the public market. "Unprofitable Tech and SaaS companies have been hit the hardest and have recovered the least since 2019."
Public markets are demanding profitability, growth, and scale. "What the public market is telling you is that we want it all - we want you to be profitable, to grow, and to have scale."
Restricting big companies from acquiring small ones reduces the value of small companies, as investors downgrade their worth. More critically, it removes the pressure on large companies to act quickly, since they no longer need to worry about competitors acquiring those small companies due to regulatory barriers.
1,440 private companies are unicorns. There are more private tech companies worth over $1bn than public ones.
Many unicorns in recent cohorts have yet to be re-priced and may never secure additional capital.
"Of all of the IPOs since 2020, if you look at the value created or destroyed from their IPO price, you can see that as a cohort we've destroyed almost $225bn in market cap, offset by value creation of $84bn - so net negative as a cohort."
"Technology is the great resetter of the business world. It can take an incredible company and turn it into dust (e.g. Nokia). The reason why I'm still an incredible optimist about our industry and about technology is because technology is still the most disruptive force."
Saturday, Sep. 21st: Harry interviewed Eric Vishria. He is a general partner at Benchmark who invested in companies including Cerebras, Confluent, Amplitude and Benchling. - 20VC
"I'm not a sector specialist and nobody at Benchmark is. In venture, you have to be moving. You have to be looking at new stuff because that's where the disruption's happening."
Even if you're not a sector specialist, you can always evaluate three things in any company: (i) is the entrepreneur extraordinary? (ii) does the entrepreneur have a very interesting and unique insight? (iii) can the addressed market sustain a big company?
"I think that a lot of venture capitalists tend to follow a very spreadsheet-y, investment-banker-y approach, which does not work."
"In a hyper-competitive market, you have to really believe in the entrepreneur, in the heart of the insight and his execution ability."
"We almost never think about like fund cycle or fund timing. We almost never think about portfolio construction. Finding good investment ideas is hard enough. Finding great companies is hard enough. Let's not over constrain it."
Sunday, Sep. 22nd: Cerebras filed to go public. It's an AI chipmaker. It aims to raise $1bn at a $7-8bn valuation. Cerebras is not profitable and its business heavily relies on one customer, G42, an Abu Dhabi-based AI company, accounting for over 80% of its revenue. - S1, Matt Turck
"For the most part, public market investors have had very limited options to play the Generative AI wave: essentially NVIDIA, and indirect bets on AI through the hyperscalers."
"Timing is everything. AI is hot and Cerebras plays a very strategic part of the AI market (the well-documented AI infrastructure build, happening right now), at a very sensitive time (the even more well-documented GPU shortage, which may or may not last)."
"Return of the small IPO? While it's growing very fast, the company is reasonably early in revenue - $136 million in revenue in the first six months of 2024, and $78.7M for the full year 2023."
"A big part of NVIDIA's competitive advantage lies not just in the hardware, but also in the software - CUDA, both a computing platform and programming model enabling developers to program NVIDIA GPUs directly."
"87% of revenue in the first half of this year is from just one customer, G42 in the UAE."
"Cerebras provides chips for high-performance computing, specifically for use in training & inferencing LLMs and other models. The company also provides a software layer (CSoft) closely integrated with their chips."
"Cerebras has raised $715M in venture capital, most recently at a $4.1B valuation. A number of well-known VC investors on the cap table, with Alpha Wave, Altimeter, Benchmark, Coatue, Eclipse Ventures and Foundation Capital all holding >5% of outstanding shares."
"We design processors for AI training and inference. We build AI systems to power, cool, and feed the processors data. We develop software to link these systems together into industry-leading supercomputers that are simple to use, even for the most complicated AI work, using familiar ML frameworks like PyTorch."
"AI compute is comprised of training and inference. For training, many of our customers have achieved over 10 times faster training time-to-solution compared to leading 8-way GPU systems of the same generation and have produced their own state-of-the-art models. For inference, we deliver over 10 times faster output generation speeds than GPU-based solutions from top CSPs, as benchmarked on leading open-source models."
"Our proprietary software platform, CSoft, is core to our solution and provides intuitive usability and improved developer productivity. CSoft seamlessly integrates with industry-standard ML frameworks like PyTorch and with popular software development tools, allowing developers to easily migrate to the Cerebras platform. CSoft eliminates the need for low-level programming in CUDA, or other hardware-specific languages."
"Our wafer-scale chip architecture eliminates the need for distributed computing. This enables AI developers to use up to 97% less code when working with large models on our platform compared to on clusters of GPUs and greatly accelerates the speed of AI model development for larger-scale models."
Monday, Sep. 23rd: a16z published a post discussing the impact of AI on vertical SaaS companies. - Angela Strange
"With AI, many customers of VSaaS can dramatically reduce internal and external labor spend on sales, marketing, customer service, operations, and finance."
"With the right solution, many businesses consuming VSaaS can dramatically reduce internal and external labor spend on sales, marketing, customer service, operations, and finance. This should further increase the take rate of VSaaS companies by an additional 2-10x."
"AI is unlocking a new era for vertical SaaS. In functions like marketing, sales, customer service, and finance, AI will augment, automate away, or in some cases, replace, many of the rote tasks currently performed by people, allowing VSaaS companies to offer even more with their software."
Tuesday, Sep. 24th: Sarah Tavel at Benchmark discussed the potential impact of LLM players on B2B AI startups if the LLM players start to move up the stack. - Sarah Tavel
"It seems inevitable that as the underlying foundation models become more powerful, the LLM players will seek to justify the enormous investment that has gone into training their models by moving 'up the stack', and evolve from an API or chat interface, to async agents."
"Right now, OpenAI/Anthropic/et al have an API consumable by almost anyone, but it's not hard to imagine a world in which they begin to compete with some of their API developers."
"I'd guess the async coding agents are most vulnerable to this potential in the near term given the seemingly unbounded economic value of owning this use case, and the already existing product market fit LLMs have found with coding."
"Why buy a specialized AI application that lets you automate internal IT ticket response, when the foundation model companies offer an AI agent that, if you point it in the right direction with a job spec, will read your knowledge base, build its own integrations to connect to your existing systems of record (e.g., Jira), and then handle all the internal requests automatically?"
Sarah highlights 3 paths for startups to create defensibility: (i) build a network effect, (ii) capture proprietary data/hard to access data, (iii) land grab in an overlooked vertical.
Wednesday, Sep. 25th: Paul Graham wrote on the debate about whether it's a good idea to follow your passion. Following your passion doesn't always guarantee financial success but can be crucial for creating extraordinary work or launching startups. The best startup ideas often arise from personal interest. - Paul Graham
"If your main goal is to make money, you can't usually afford to work on what interests you the most. People pay you for doing what they want, not what you want. But there's an obvious exception: when you both want the same thing."
"Many if not most of the biggest startups began as projects the founders were doing for fun. Apple, Google, and Facebook all began that way. Why is this pattern so common? Because the best ideas tend to be such outliers that you'd overlook them if you were consciously looking for ways to make money."
"If you don't need to make much, you can work on whatever you're most interested in; if you want to become moderately rich, you can't usually afford to; but if you want to become super rich, and you're young and good at technology, working on what you're most interested in becomes a good idea again."
"You don't necessarily need a job doing x in order to work on x; often you can just start doing it in some form yourself."
Thursday, Sep. 26th: CRV will return the remaining $275m of uninvested capital from its $500m growth fund to its LPs, arguing that the risk-reward ratio at growth stages is poor, with most companies overvalued compared to their potential. CRV will refocus on its core business of deploying its $1bn early-stage fund. - NYT, Techcrunch
"The reason was the maths no longer work. In order to generate the kind of returns that CRV's investors expected, many start-ups - far more than ever before - would have to wind up being worth $10bn or more. 'The data just doesn't support that,' said Saar Gur, a partner at the firm. 'There aren't many really big foundational companies and big outcomes.'"
"This is the second time the firm has cut its fund size. In 2002, after the dot-com bubble broke, CRV slashed its $1.2 billion fund to just $450 million."
"Some limited partners have been frustrated with the expansion of venture capital funds in recent years."
"Bigger funds also tend to have middling investment performance. Smaller venture capital funds have historically minted the highest returns."
Friday, Sep. 27th: Molten acquired a 97% stake in Connect's first fund for about £18.6m. Molten made several acquisitions of LP positions in the past, including in Seedcamp Funds I, II, and III, as well as Earlybird DWES Fund IV and Earlybird Digital East Fund I. This acquisition illustrates the growing need for GPs to generate DPI for their investors as well as the emerging liquidity options for GPs that can sell their funds without having realised exits. - Molten
"This strategic move is part of Molten's ongoing commitment to acquiring LP positions in funds made up of later stage portfolios with strong potential for near-term realisation."
"85% of the value of the portfolio is driven by two standout companies: Typeform, a leading cloud-based SaaS platform for survey data collection, and Soldo, an expense management platform."
Saturday, Sep. 28th: Lightspeed published a deep-dive on conversational AI. - Lightspeed
"Commercial voice applications have evolved dramatically over the last 50 years.
The first interactive voice response (IVR) system appeared in the 1970s, requiring end users to use a keypad to navigate through voice prompts.
In the last two decades, we've seen this traditional, touch-tone model give way to something smarter: voice-driven phone trees, allowing customers to use natural language commands instead of just pressing buttons.
Now, we are entering the era of LLM-based systems, where end users don't just talk to software, but have conversations with it. These systems understand the nuances and context just like a human would."
"In the short term, the most successful voice companies will be focused on vertical apps in fields like healthcare and hospitality, as well as apps designed for relatively simple tasks like scheduling. Eventually, however, these new voice apps are poised to wedge their way into broader SaaS platforms, significantly expanding the total addressable market."
"AI-driven voice apps rely on three basic reference architectures devoted to ingesting natural language, interpreting it, and generating an intelligent response:
Speech-to-Text (STT) ingestion. Capturing spoken words and translating them into text.
Text-to-Text (TTT) reasoning. Utilising an LLM to tokenise the text transcription and formulate a written response.
Text-to-Speech (TTS) generation. Translating that written response into spoken language."
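A minimal sketch of the three-stage architecture quoted above. The `stt`, `llm`, and `tts` components and their method names are assumptions for illustration, not a specific vendor's API.

```python
# One conversational turn through the STT -> LLM -> TTS reference architecture.

def handle_turn(audio_in, stt, llm, tts, history):
    """Audio in, audio out, with conversation history carried between turns."""
    user_text = stt.transcribe(audio_in)              # 1. Speech-to-Text ingestion
    history.append({"role": "user", "content": user_text})

    reply_text = llm.chat(history)                    # 2. Text-to-Text reasoning
    history.append({"role": "assistant", "content": reply_text})

    return tts.synthesize(reply_text)                 # 3. Text-to-Speech generation
```

Latency is the design constraint Lightspeed flags: each of the three stages adds delay, which is why near real-time performance at acceptable cost is one of the key challenges listed below.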
There are 3 key challenges for voice AI startups: (i) having humans in the loop when AI alone is not enough, (ii) providing near real-time performance while lowering latency and cost, (iii) finding the right GTM strategy to grow at an exponential pace.
Sunday, Sep. 29th: I read an interview with Stripe's founder John Collison on Ireland. - The Currency
"John Collison retains deep roots in the country of his birth. He has funded a think-tank, backed university programmes, renovated a stately pile in the midlands, and even bought an airport."
"The general worry I have is, is there some complacency that we need to be quite paranoid about? Lots of our successes came out of quite radical ideas."
Collison argues that much of what the company has achieved has taken longer than initially planned. "Because we set the business up to be self-sustaining, to be in a good spot, to not be reliant upon investor capital, it is okay if certain things take longer than we anticipated."
"In Ireland, we are blown back by the fire hose of demand and it is the supply of housing, of infrastructure, of all these things where the issues are."
"When you look at a lot of the housing debates, there is an implicit, sometimes explicit, concern about overbuilding despite the fact that overbuilding is the very last problem we should be worried about right now."
Monday, Sep. 30th: Duolingo is integrating AI into its platform with the vision of replacing human tutoring with AI-driven tools like conversational characters and interactive lessons. - Forbes
"Late last year, Duolingo decided not to renew the contracts of about 10% of its contracted workforce who did translations and lesson writing, instead opting to use AI for those tasks in some cases."
"Duolingo's AI may put one-on-one human tutors out of business. I understand that. But I think net-net it is better if everybody has access to one."
"AI could unlock new possibilities for learning, bringing high-quality education to the masses. He thinks languages can help lift people out of poverty, noting that, for non-native speakers, learning English instantly broadens a person's earning potential and opens up a whole new world of jobs."
"Duolingo unveiled its first step in that direction: An interactive feature in which users partake in 'video calls' with Lily, one of Duolingo's beloved mascots - a purple-haired, sarcastic, cartoon woman. Chatting with Lily allows people to practice conversing in other languages as if FaceTiming with an AI friend, with dialogue generated by OpenAI's GPT-4o model. It's part of a $30 a month subscription tier, called Duolingo Max, which the company debuted last year for its premium AI features, including one that tells people why they answered a question wrong during a lesson. Another new AI addition is a mini-game called Adventures, which puts users in interactive situations to practice their language skills, like ordering a coffee from a cafe or getting their passport checked."
"In addition to its AI tutor efforts, the company has made another big AI investment in its Duolingo English Test (DET), the app's version of TOEFL, or the Test Of English as a Foreign Language, which is widely used to certify English proficiency for university admissions or visa applications. The DET, which costs $59 and first launched in 2016, gained traction during the pandemic because it could be taken remotely. Duolingo now uses AI for every element of the DET exam, von Ahn said, from generating the questions to making sure people don't cheat. One security feature, for example, uses facial recognition to make sure a test taker isn't looking offscreen at notes. Right now, the test accounts for 10% of Duolingo revenue, and von Ahn wants it to become a bigger part of the pie as the company focuses increasingly on non-English speaking users."
Thanks to Julia for the feedback! Thanks for reading! See you next week for another issue!