A viral claim that Builder.ai “faked AI with 700 engineers” following the startup’s bankruptcy has been debunked by former employees who confirm the company actually built legitimate AI systems. The misinformation originated from an unverified social media post and was amplified across major publications, potentially damaging the career prospects of engineers who worked on genuine AI technology.
What actually happened: Builder.ai developed Natasha, a legitimate AI-powered code generation platform that used large language models like GPT and Claude to automate software development tasks.
- A team of approximately 15 engineers built the core AI system, with the broader AI team reaching around 30 people at its peak.
- The platform could refine app ideas, create user stories, generate code following test-driven development, and create pull requests (a rough sketch of this pipeline appears after this list).
- Natasha was first showcased in 2021, well before ChatGPT’s release, initially as a “personal app builder” working with a “network of geeks.”
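To make that workflow concrete, here is a minimal sketch of what such an idea-to-pull-request pipeline can look like. Everything in it is a hypothetical illustration, not Builder.ai’s actual code: call_llm stands in for whatever GPT/Claude API the team used, and the stage functions simply mirror the steps listed above.

```python
from dataclasses import dataclass

def call_llm(prompt: str) -> str:
    """Stand-in for a hosted model such as GPT or Claude; a real
    implementation would call the provider's API here."""
    return f"<model output for: {prompt[:40]}...>"

@dataclass
class UserStory:
    title: str
    acceptance_criteria: str

def refine_idea(raw_idea: str) -> str:
    # Step 1: turn a vague app idea into a concrete specification.
    return call_llm(f"Refine this app idea into a spec: {raw_idea}")

def break_into_stories(spec: str) -> list[UserStory]:
    # Step 2: decompose the spec into user stories. A real system
    # would parse structured (e.g. JSON) model output here.
    raw = call_llm(f"List user stories for this spec: {spec}")
    return [UserStory(title=raw, acceptance_criteria="...")]

def implement_story(story: UserStory) -> str:
    # Step 3: test-driven generation - ask for failing tests first,
    # then for code that makes those tests pass.
    tests = call_llm(f"Write failing tests for: {story.title}")
    return call_llm(f"Write code that passes these tests:\n{tests}")

def open_pull_request(code: str) -> None:
    # Step 4: in production this would push a branch and open a PR.
    print("PR opened with generated code:", code[:60])

def build_app(raw_idea: str) -> None:
    spec = refine_idea(raw_idea)
    for story in break_into_stories(spec):
        open_pull_request(implement_story(story))

build_app("a to-do app with reminders")
```

The test-first ordering inside implement_story mirrors the test-driven development flow described above; a real orchestrator would also run the generated tests before opening a pull request.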
The 700 engineers myth explained: The viral number appears to stem from Builder.ai’s legitimate outsourced development network, not engineers pretending to be AI.
- Around 500-1,000 external developers worked through outsourcing companies like Globant and TatvaSoft across Vietnam, Romania, Ukraine, Poland, and India.
- These developers used Builder.ai’s custom IDE and AI tools to build over 500 client applications.
- Roughly 300 additional internal engineers worked on rebuilding existing tools in-house, such as project management and communication platforms.
Technical architecture: Natasha’s tech stack demonstrates sophisticated AI integration rather than human deception.
- A Python orchestrator coordinated AI agent workflows, while Ruby on Rails and React handled backend and frontend operations.
- The team maintained coding benchmarks to evaluate and select the best-performing large language models for their use cases (a minimal harness is sketched after this list).
- A knowledge graph vector database stored relationships between features and customer requirements (a toy version is also sketched below).
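On the benchmarking point, a common approach is to score each candidate model by running its generated code against a fixed test suite and comparing pass rates. The harness below is a toy sketch under that assumption; the task, checker, and models are all invented for illustration, and a production harness would sandbox code execution far more carefully than a bare exec.

```python
from typing import Callable

# A benchmark task pairs a prompt with a checker that runs the
# generated code and reports whether it behaves correctly.
Task = tuple[str, Callable[[str], bool]]

def passes_addition_check(generated_code: str) -> bool:
    # Execute the candidate code in a scratch namespace and test it.
    scope: dict = {}
    try:
        exec(generated_code, scope)
        return scope["add"](2, 3) == 5
    except Exception:
        return False

TASKS: list[Task] = [
    ("Write a Python function add(a, b) that returns their sum.",
     passes_addition_check),
]

def pass_rate(model: Callable[[str], str]) -> float:
    passed = sum(check(model(prompt)) for prompt, check in TASKS)
    return passed / len(TASKS)

def pick_best(models: dict[str, Callable[[str], str]]) -> str:
    # Select the model with the highest benchmark pass rate.
    return max(models, key=lambda name: pass_rate(models[name]))

# Toy stand-ins for real hosted models:
models = {
    "model_a": lambda p: "def add(a, b):\n    return a + b",
    "model_b": lambda p: "def add(a, b):\n    return a - b",  # buggy
}
print(pick_best(models))  # -> model_a
```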
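As for the knowledge graph vector database, the article gives no implementation details, but the underlying idea combines two lookups: embedding similarity to surface related features, and graph edges linking features to the customer requirements they satisfy. The sketch below is a toy version with hard-coded vectors; a real system would use a learned embedding model and a proper graph or vector store.

```python
import numpy as np

# Toy embeddings; a real system would use a learned embedding model.
EMBEDDINGS = {
    "user login": np.array([0.9, 0.1, 0.0]),
    "password reset": np.array([0.8, 0.2, 0.1]),
    "push notifications": np.array([0.1, 0.9, 0.2]),
}

# Graph edges: which customer requirements each feature satisfies.
FEATURE_TO_REQUIREMENT = {
    "user login": ["accounts"],
    "password reset": ["accounts"],
    "push notifications": ["engagement"],
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def related_features(query: str, top_k: int = 2) -> list[str]:
    # Vector side: rank stored features by similarity to the query.
    q = EMBEDDINGS[query]
    ranked = sorted(EMBEDDINGS, key=lambda f: cosine(q, EMBEDDINGS[f]),
                    reverse=True)
    return [f for f in ranked if f != query][:top_k]

def requirements_for(feature: str) -> list[str]:
    # Graph side: follow edges from a feature to its requirements.
    return FEATURE_TO_REQUIREMENT[feature]

print(related_features("user login"))      # ['password reset', 'push notifications']
print(requirements_for("password reset"))  # ['accounts']
```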
In plain English: Think of Natasha as a sophisticated digital assistant that could understand what kind of app you wanted to build, break that down into specific programming tasks, write the actual code, and even test it to make sure it worked—all using the same AI technology that powers tools like ChatGPT, but specialized for software development.
Why the company failed: Builder.ai collapsed due to alleged accounting fraud, not technological shortcomings.
- Financial audits revealed the company had misled investors about revenue, with the 2024 revenue estimate cut from $220 million to $55 million.
- Previously reported 2023 sales of $180 million were restated to roughly $45 million.
- Lenders seized remaining funds once the discrepancies were discovered, leading to bankruptcy.
The misinformation spread: The false claim originated from an unverified X (Twitter) post and was amplified by finance newsletter writer Linas Beliūnas to his 500,000+ LinkedIn followers.
- Major publications including Mashable, MSN, and India’s Business Standard repeated the unsubstantiated claim.
- Former employees report concern that the viral misinformation could harm their career prospects despite their legitimate AI work.
What former employees are saying: Engineers who worked at Builder.ai expressed disappointment about the company’s collapse and frustration with the false narrative.
- “I didn’t spot any warning signs,” one developer noted, pointing to Microsoft’s investment in April 2024 as a sign of confidence.
- Former associate product director Yash Mittal confirmed that fraud issues were related to external developer billing practices, not AI deception.
- Multiple engineers confirmed they built AI systems comparable to tools like Devin and Factory.
The bottom line: Builder.ai did not “fake AI with 700 engineers.” It built a real LLM-based product and collapsed over alleged accounting fraud.