Disclosure: As an Amazon Associate I earn from qualifying purchases. This site contains affiliate links.

tech news

Mira Murati's Thinking Machines Lab: The $50B AI Startup Shaking Up the Industry

Former OpenAI CTO Mira Murati launched Thinking Machines Lab with $2B in seed funding and a massive Nvidia deal. Here's what it means for the AI landscape.

18 min read
March 11, 2026
Wayne Lowry

10+ years in Digital Marketing & SEO

Mira Murati's Thinking Machines Lab Is Rewriting the AI Playbook

The AI industry just got a seismic jolt. On March 10, 2026, Nvidia announced a massive strategic investment in Thinking Machines Lab — the startup founded by former OpenAI CTO Mira Murati. The deal includes a multiyear commitment to deploy at least one gigawatt of Nvidia's next-generation Vera Rubin AI accelerators, making it one of the largest AI compute partnerships ever announced for a startup. Reports from the Financial Times indicate the deal is valued at tens of billions of dollars.

I have been following Murati's trajectory closely since she stepped down from OpenAI in September 2024, and Thinking Machines Lab is shaping up to be one of the most consequential AI companies to emerge in this era. With a $2 billion seed round, a $50 billion target valuation, a product already in beta, and now Nvidia's hardware muscle behind it, this is not just another AI startup — it is a legitimate challenger to OpenAI, Anthropic, and Google DeepMind.

Let me walk you through everything: Murati's remarkable journey, what Thinking Machines Lab is actually building, how it compares to the incumbents, the Nvidia partnership details, and what this all means for the future of AI.

Who Is Mira Murati? From Albania to the Top of AI

To understand Thinking Machines Lab, you first need to understand the person behind it. Mira Murati's story is one of the most compelling in the tech world — and it does not get talked about enough.

Early Life and Education

Murati was born on December 16, 1988, in Vlorë, Albania. At just 16 years old, she won a scholarship to the prestigious United World Colleges program, studying at Pearson College on Vancouver Island in Canada. After graduating in 2007, she earned a BA from Colby College in 2011 and a Bachelor of Engineering degree from Dartmouth College's Thayer School of Engineering in 2012.

Her career started with an internship at Goldman Sachs in Tokyo in 2011, which gave her a taste of the intersection of technology and global business. She briefly worked at Zodiac Aerospace before joining Tesla in 2013 as a product manager on the Model X — the company's ambitious falcon-wing SUV that pushed the boundaries of automotive engineering.

From 2016 to 2018, Murati worked at Leap Motion, an augmented reality startup focused on hand-tracking technology. Each role built a different dimension of her skill set: finance, hardware engineering, consumer products, and emerging tech.

The OpenAI Years: Overseeing ChatGPT, DALL-E, and Sora

Murati joined OpenAI in 2018 as VP of Applied AI and Partnerships. In May 2022, she was promoted to Chief Technology Officer — a role that put her at the helm of some of the most transformative AI products in history.

Under her technical leadership, OpenAI launched:

  • ChatGPT — the conversational AI that became the fastest-growing consumer application in history
  • DALL-E — the text-to-image generation system that sparked the AI art revolution
  • Codex — the AI model behind GitHub Copilot that changed how developers write code
  • Sora — the text-to-video model that demonstrated AI could generate photorealistic video

Murati oversaw OpenAI's research, product, and safety teams — the trifecta of functions that determined what got built and how it was deployed to the public. During the tumultuous November 2023 board crisis that temporarily ousted CEO Sam Altman, Murati briefly served as interim CEO before Altman's return.

The Departure

In September 2024, Murati announced she was stepping down as CTO. She reportedly framed her departure as an opportunity to pursue her own exploration — a diplomat's way of saying she had a bigger vision that could not be realized within OpenAI's increasingly complex structure.

According to Fortune, she reportedly declined a $1.5 billion retention offer to stay. That tells you everything about her conviction.

What Is Thinking Machines Lab Building?

Murati launched Thinking Machines Lab in February 2025 as a public benefit corporation — a deliberate structural choice that signals her priorities. Unlike a traditional C-corp focused solely on shareholder returns, a public benefit corporation is legally committed to balancing profit with positive impact.

The Mission

Thinking Machines Lab aims to make AI systems that are, in Murati's words, "more widely understood, customizable, and generally capable." That might sound like standard Silicon Valley mission statement language, but the emphasis on "customizable" is key. Where OpenAI builds monolithic models accessed through APIs, and Anthropic focuses on safety-first general AI, Thinking Machines is betting on a future where organizations can deeply customize AI systems for their specific needs.

Tinker: The First Product

The startup released its first product, Tinker, in October 2025 in a tightly controlled beta. Tinker is an API platform that allows researchers and developers to fine-tune AI models with significantly less overhead than current approaches require.

Think of it this way: today, if you want to customize a large language model for your specific use case, you need ML engineers, massive compute budgets, and weeks of experimentation. Tinker aims to compress that process, making model customization accessible to a much broader set of developers and enterprises.
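Tinker's API is still in closed beta, so its exact interface is not public. But the overhead reduction it targets is well illustrated by low-rank adaptation (LoRA), the standard lightweight fine-tuning technique that platforms in this space typically build on. The sketch below (illustrative only, not Tinker code) shows the parameter math: instead of updating a full weight matrix W, LoRA trains two small matrices B and A and applies W + B @ A.

```python
# Illustrative parameter math for LoRA (low-rank adaptation), the standard
# technique for cheap fine-tuning. Not Tinker's actual API, which is in
# closed beta.

def full_trainable_params(d_out: int, d_in: int) -> int:
    """Trainable parameters when fine-tuning a full d_out x d_in weight matrix."""
    return d_out * d_in

def lora_trainable_params(d_out: int, d_in: int, rank: int) -> int:
    """Trainable parameters for a low-rank update B @ A of that same matrix."""
    return d_out * rank + rank * d_in

# A single 4096 x 4096 attention projection, typical of a 7B-class model:
d = 4096
full = full_trainable_params(d, d)          # 16,777,216 params
lora = lora_trainable_params(d, d, rank=8)  # 65,536 params

print(f"full fine-tune: {full:,} params")
print(f"LoRA (r=8):     {lora:,} params ({lora / full:.2%} of full)")
```

Training well under 1% of the parameters per layer is what turns "weeks of experimentation and massive compute budgets" into something a small team can run, which is the gap Tinker is aiming at.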

The platform works with open-source LLMs, which positions Thinking Machines as an enabler of the open AI ecosystem rather than a walled-garden competitor. This is a strategic differentiator that aligns with the broader trend of AI agents becoming more accessible.

2026 Roadmap: Proprietary Models and Multimodal Capabilities

Thinking Machines plans to release its own proprietary models in 2026. The company has also announced plans to add multimodal capabilities to Tinker, meaning the platform will support fine-tuning across text, image, audio, and potentially video models.

This roadmap places Thinking Machines in direct competition with not just OpenAI and Anthropic, but also with enterprise AI platforms like Google Vertex AI and AWS Bedrock. The difference is the customization-first approach — building tools that put control in the hands of developers rather than locking them into pre-built solutions.

The Team: World-Class Talent (and Some Drama)

Murati did not build Thinking Machines quietly. She assembled a team of approximately 30 leading researchers and engineers poached from competitors including Meta, Mistral, and OpenAI itself. The founding team included some heavy hitters:

  • Barret Zoph — Former Google Brain researcher and OpenAI veteran, brought on as co-founder and CTO
  • Luke Metz — Co-founder, previously at OpenAI working on optimization and training methods
  • John Schulman — OpenAI co-founder serving as an advisor
  • Alec Radford — OpenAI researcher (co-creator of GPT) serving as an advisor
  • Bob McGrew — Former OpenAI VP of Research serving as an advisor

The January 2026 Departures

I would not be giving you the full picture if I did not address the turbulence. In January 2026, TechCrunch reported that two of Thinking Machines Lab's co-founders — Barret Zoph and Luke Metz — left the company to return to OpenAI. Sam Schoenholz, another key team member, also departed for OpenAI. Earlier, co-founder Andrew Tulloch had left to join Meta in October 2025.

According to reporting from Fortune, the departures were driven by a combination of factors: money, compute constraints, and a perceived lack of clarity on products and business model during the early stage. OpenAI apparently made aggressive offers to bring the researchers back.

Here is how the team timeline looks:

| Event | Date | Details |
| --- | --- | --- |
| Murati departs OpenAI | September 2024 | Steps down as CTO |
| Thinking Machines Lab founded | February 2025 | Public benefit corporation structure |
| Core team assembled | Q1-Q2 2025 | ~30 researchers from Meta, Mistral, OpenAI |
| $2B seed round closes | July 2025 | Led by Andreessen Horowitz at $12B valuation |
| Tinker beta launches | October 2025 | First product: model fine-tuning API |
| Andrew Tulloch departs | October 2025 | Joins Meta |
| Zoph, Metz, Schoenholz depart | January 2026 | Return to OpenAI |
| Nvidia partnership announced | March 10, 2026 | Significant investment + 1 GW Vera Rubin deal |

The departures are concerning but not necessarily fatal. Losing your CTO and co-founders 11 months after launch is a real blow, but Murati still has the advisory support of OpenAI co-founders and a war chest that can attract top-tier replacements. The Nvidia partnership announcement just days ago suggests the company has moved past the turbulence and is focused on execution.

The Nvidia Partnership: A Game-Changing Compute Deal

Now let us dig into the news that dropped on March 10, 2026, because it is a massive development.

What the Deal Includes

Nvidia has made what both companies describe as a "significant investment" in Thinking Machines Lab as part of a new multiyear strategic partnership. The key terms:

  1. Equity investment: Nvidia has taken an equity stake in Thinking Machines (exact amount undisclosed)
  2. 1 gigawatt of Vera Rubin systems: Thinking Machines will deploy at least 1 GW of Nvidia's next-generation Vera Rubin AI accelerators — the chipmaker's most advanced offering, expected to ship in the second half of 2026
  3. Co-designed training infrastructure: The partnership includes efforts to design training and serving systems optimized for Nvidia architectures
  4. Enterprise AI access: A joint commitment to broaden access to frontier AI and open models for enterprises, research institutions, and the scientific community

Why This Matters

To put the 1-gigawatt figure in perspective: that is an enormous amount of compute. A single modern AI data center facility typically operates at 100 to 200 megawatts. One gigawatt is enough to power multiple large-scale training clusters simultaneously — the kind of infrastructure needed to train frontier models that compete with GPT-5 and Claude.
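A quick back-of-the-envelope sketch makes the scale concrete, using the rough 100 to 200 megawatt range for a single facility cited above (illustrative numbers, not disclosed deal specifics):

```python
# Rough scale check: how many typical AI data center facilities does a
# 1 GW power budget represent? Uses the article's 100-200 MW per-facility
# range; these are illustrative figures, not deal terms.

def equivalent_facilities(total_gw: float, facility_mw: float) -> float:
    """Number of facilities of a given size that a power budget covers."""
    return (total_gw * 1_000) / facility_mw

for mw in (100, 200):
    print(f"1 GW at {mw} MW per facility: about {equivalent_facilities(1, mw):.0f} facilities")
```

So the commitment is on the order of five to ten full-scale data centers' worth of next-generation accelerators, dedicated to a single startup.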

The Vera Rubin platform is Nvidia's next architectural generation beyond the current Blackwell lineup. If you have been following Nvidia's roadmap (I covered their record Q4 earnings recently), Vera Rubin promises dramatic improvements in training efficiency and inference cost. Getting early access to these chips gives Thinking Machines a hardware advantage that money alone cannot buy — supply of cutting-edge AI chips remains constrained, and Nvidia is selective about who gets priority allocation.

The Valuation Question

Thinking Machines raised its initial $2 billion seed round at a $12 billion valuation in July 2025. Reports from late 2025 indicated the company was in talks to raise an additional $5 billion at a $50 billion valuation — a 4x jump in roughly six months. The Nvidia partnership likely strengthens that valuation case significantly.

For context, here is how Thinking Machines stacks up against other AI labs:

| Company | Reported Valuation (2026) | Total Funding | Founded | Key Product |
| --- | --- | --- | --- | --- |
| OpenAI | ~$300B | $110B+ | 2015 | GPT-5, ChatGPT |
| Anthropic | ~$60B | $15B+ | 2021 | Claude, Constitutional AI |
| Thinking Machines Lab | ~$50B (target) | $2B+ | 2025 | Tinker, upcoming models |
| xAI | ~$50B | $12B+ | 2023 | Grok |
| Mistral | ~$6B | $1B+ | 2023 | Open-source LLMs |

Reaching $50 billion in under two years from founding — before even releasing a proprietary model — would be extraordinary, but in the current AI investment climate, not unprecedented.

How Thinking Machines Differs from OpenAI, Anthropic, and Google

The AI landscape is getting crowded, so it is worth examining what makes Thinking Machines Lab's approach distinct. Understanding these differences among AI companies matters for anyone building with or investing in AI.

vs. OpenAI

OpenAI has evolved into a general-purpose AI platform company with consumer products (ChatGPT), enterprise APIs, and increasingly closed-source models. Murati's departure was partly driven by disagreements about OpenAI's direction — specifically, the tension between commercial ambitions and research openness. Thinking Machines is structured as a public benefit corporation and is explicitly building tools that work with open-source models, signaling a different philosophy about who should control AI capabilities.

vs. Anthropic

Anthropic positions itself as the "safety-first" AI lab, with Constitutional AI as its core differentiator. Thinking Machines is not competing on the safety messaging axis — instead, it is competing on customizability and accessibility. Where Anthropic wants to build AI you can trust, Thinking Machines wants to build AI you can shape. These are complementary but different value propositions.

vs. Google DeepMind

Google has the advantage of integrating AI across Search, Cloud, Android, and Workspace. Thinking Machines cannot match Google's distribution, but it can offer something Google cannot: platform neutrality. Enterprise customers who do not want to be locked into the Google ecosystem may find Thinking Machines' independent positioning appealing.

The Open Model Bet

The most interesting strategic decision is the focus on open-source model fine-tuning. By making Tinker work with open-source LLMs rather than building a proprietary-only ecosystem, Thinking Machines is positioning itself as the "Switzerland" of AI — a neutral platform that empowers builders regardless of which base model they prefer. This aligns with the broader industry trend of AI agents becoming composable rather than monolithic.

What This Means for the AI Landscape

Thinking Machines Lab's emergence — and especially the Nvidia partnership — has several implications for the broader AI ecosystem that I think are worth watching:

More Competition at the Frontier

The AI industry has been consolidating around a few major players. Thinking Machines adds a credible new competitor with the funding, talent, and now compute access to train frontier models. More competition means faster innovation, more diverse approaches, and potentially better outcomes for end users.

The Compute Arms Race Intensifies

The Nvidia partnership underscores a reality I have been writing about: access to compute is the new oil in AI. Companies that secure long-term chip supply agreements with Nvidia have a structural advantage over those fighting for allocation on the spot market. Expect more of these strategic partnerships as the AI agent revolution demands ever more compute.

Enterprise AI Customization as a Category

Thinking Machines' focus on model customization could define a new product category. Today, most enterprises use off-the-shelf AI models through APIs. If Tinker delivers on its promise of making fine-tuning accessible, it could shift the market toward customized AI deployments — a potentially massive market that no single company has won yet.

Talent Wars Continue

The co-founder departures in January show that the AI talent war is far from over. Top researchers are the scarcest resource in the industry, and companies will continue poaching aggressively. For anyone building a career in AI/ML, this is the most competitive hiring market in tech history.

What to Watch Next

As I look ahead through the rest of 2026, here are the key milestones that will determine whether Thinking Machines Lab lives up to the hype:

  1. Proprietary model release: The company has said it will release its own models in 2026. The quality and capabilities of these models will be the ultimate test of the team's research prowess.

  2. Multimodal Tinker launch: Adding support for image, audio, and video fine-tuning could dramatically expand Tinker's addressable market.

  3. Vera Rubin deployment: The Nvidia chips are expected in the second half of 2026, with Thinking Machines deploying them in early 2027. The training runs enabled by this compute will determine what kind of models the company can build.

  4. New funding round: Whether the company closes its reported $5 billion raise at $50 billion will signal investor confidence post-departures.

  5. CTO replacement: Filling the Barret Zoph-shaped hole in the org chart with a credible research leader is critical for maintaining team morale and research direction.

Murati has proven she can build world-class AI products. The question is whether she can build a world-class AI company from scratch, navigating talent drama, investor expectations, and competition from organizations with 10x her resources. Based on what I have seen so far, I would not bet against her.

Frequently Asked Questions

What is Thinking Machines Lab?

Thinking Machines Lab is an artificial intelligence startup founded by Mira Murati in February 2025. Structured as a public benefit corporation, it focuses on making AI systems more widely understood, customizable, and generally capable. The company's first product, Tinker, is a platform that helps developers fine-tune open-source AI models.

Who is Mira Murati?

Mira Murati is an Albanian-American technology executive who served as Chief Technology Officer of OpenAI from 2022 to 2024. She oversaw the development and launch of ChatGPT, DALL-E, Codex, and Sora. Before OpenAI, she worked at Tesla on the Model X and at augmented reality startup Leap Motion. She founded Thinking Machines Lab after departing OpenAI in September 2024.

How much funding has Thinking Machines Lab raised?

Thinking Machines Lab raised $2 billion in a seed round led by Andreessen Horowitz in July 2025 at a $12 billion valuation — the largest seed round in Silicon Valley history. The company was reportedly seeking an additional $5 billion at a $50 billion valuation as of late 2025. In March 2026, Nvidia announced a significant investment as part of a multiyear strategic partnership.

What is the Nvidia partnership with Thinking Machines Lab?

Announced on March 10, 2026, Nvidia made a significant equity investment in Thinking Machines Lab and committed to supplying at least 1 gigawatt of its next-generation Vera Rubin AI accelerators. The multiyear deal includes co-designed training infrastructure and a joint effort to broaden enterprise access to frontier AI.

What happened with the co-founders leaving?

In January 2026, co-founders Barret Zoph (CTO) and Luke Metz departed Thinking Machines Lab to return to OpenAI. Sam Schoenholz also left for OpenAI, while co-founder Andrew Tulloch had previously departed for Meta in October 2025. The departures were reportedly driven by compensation offers, compute constraints, and product direction uncertainty.

How does Thinking Machines Lab differ from OpenAI?

While OpenAI has evolved into a general-purpose AI company with closed-source models and consumer products, Thinking Machines Lab is focused on customizable AI tools that work with open-source models. It is structured as a public benefit corporation rather than a capped-profit entity, and its flagship product Tinker enables model fine-tuning rather than offering pre-built AI solutions.

Key Takeaways

  1. Mira Murati's Thinking Machines Lab has secured a massive Nvidia partnership with a significant investment and access to 1 gigawatt of Vera Rubin AI chips — a computing deal that positions it alongside the largest AI labs.
  2. The $2B seed round at $12B valuation was the largest seed in Silicon Valley history, and the company is reportedly targeting a $50B valuation for its next round.
  3. Tinker, the company's first product, enables developers to fine-tune open-source AI models, positioning Thinking Machines as a platform for AI customization rather than a monolithic model provider.
  4. Co-founder departures in January 2026 were a setback, with key researchers returning to OpenAI, but the Nvidia partnership signals the company has recovered its momentum.
  5. Proprietary models and multimodal capabilities are planned for 2026, which will be the real test of whether Thinking Machines can compete at the frontier.
  6. The public benefit corporation structure differentiates Thinking Machines from profit-driven competitors and signals a commitment to broader AI accessibility.
  7. For the AI industry, this represents a credible new entrant that could reshape how enterprises build and customize AI systems.

What do you think about Mira Murati's next chapter? Can Thinking Machines Lab compete with OpenAI and Anthropic? Share your thoughts on X (@wikiwayne).

Recommended Gear

These are products I personally use and recommend for AI developers and tech enthusiasts. Click to view on Amazon.

Raspberry Pi 5 8GB — The Raspberry Pi 5 is the best affordable platform for experimenting with edge AI and running lightweight models locally. The quad-core Arm Cortex-A76 at 2.4GHz and 8GB of RAM handle inference tasks surprisingly well. Perfect for anyone exploring AI without a massive compute budget.

Samsung T7 Portable SSD 1TB — AI datasets and model weights eat storage fast. The T7 gives you 1TB of blazing-fast portable storage with read speeds up to 1,050 MB/s. I use mine to move training data between machines without waiting forever for transfers.

Sony WH-1000XM5 Noise Canceling Headphones — Whether you are reading research papers, coding ML pipelines, or following an Nvidia earnings call, the XM5s eliminate distractions completely. Industry-leading noise cancellation with 30-hour battery life. Essential for deep work.

Logitech MX Master 3S Wireless Mouse — The MagSpeed scroll wheel is a game-changer for scrolling through long code files and documentation. Quiet clicks, 8K DPI tracking, and multi-device pairing make this the mouse every developer should own.

Logitech MX Keys S Wireless Keyboard — Pairs perfectly with the MX Master 3S. Low-profile keys with smart backlighting that activates when your hands approach. Connects to three devices simultaneously — switch between your workstation, laptop, and tablet with one button.


This article contains affiliate links. As an Amazon Associate I earn from qualifying purchases. See our full disclosure.

