
Want to build your own chatbot for $100? A glimpse into AI’s small, cheap, DIY future

By Sharon Goldman, AI Reporter
October 15, 2025 at 9:00 AM UTC
Small, cheap, and surprisingly capable—a new generation of mini-models is proving that AI power doesn’t have to be massive.

Andrej Karpathy, a former OpenAI researcher and Tesla’s former director of AI, calls his latest project the “best ChatGPT $100 can buy.” 

Called “nanochat,” the open-source project, released yesterday through his AI education startup, Eureka Labs, shows how anyone with a single GPU server and about $100 can build their own mini-ChatGPT that can answer simple questions and write stories and poems. 
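Nanochat is a full training-and-chat pipeline, but the basic recipe behind any “micro model” fits in a few dozen lines. The sketch below is a toy illustration, not Karpathy’s code; the corpus, model size, and training length are placeholder assumptions. It trains a tiny character-level next-token predictor in PyTorch, the same kind of next-token training that nanochat scales up to a roughly $100 GPU run.

```python
# Conceptual sketch only -- not nanochat's actual code. It shows, at toy scale,
# the basic recipe: turn text into tokens, then train a small network to
# predict the next token.
import torch
import torch.nn as nn
import torch.nn.functional as F

text = "hello tiny chatbot " * 200          # stand-in for a real training corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}  # map each character to an integer id
data = torch.tensor([stoi[c] for c in text])

class TinyLM(nn.Module):
    """A character-level next-token predictor with a single hidden layer."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, idx):
        return self.head(self.embed(idx))    # logits over the next character

model = TinyLM(len(chars))
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

for step in range(200):                      # a real run would use far more steps
    i = torch.randint(0, len(data) - 1, (32,))
    x, y = data[i], data[i + 1]              # target is the character that follows
    loss = F.cross_entropy(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final loss: {loss.item():.3f}")
```

Swapping the toy corpus for real web text, the single hidden layer for a stack of transformer blocks, and the 200 steps for hours on a rented GPU is, in essence, what the $100 budget buys.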

Karpathy, who called nanochat a “micro model,” wrote on X that models like his should be thought of as “very young children” that “don’t have the raw intelligence of their larger cousins.” Scale up your spending to $1,000, however, and such a model “quickly becomes a lot more coherent and can solve simple math/code problems and take multiple choice tests.” 

The announcement garnered millions of views on X, with Shopify CEO Tobi Lütke calling it a “gift” to developers, researchers, and students. But it’s also an example of a growing trend: smaller, cheaper, and more specialized models with fewer parameters, the “knobs” inside a model that get adjusted during training to help it make sense of language, images, or data. Massive large language models (LLMs) may have trillions of parameters, requiring access to GPUs in the cloud and enormous computational power, while the latest small models may have just a few billion. 
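To make “parameters” concrete: they are the learned numbers stored in a model’s layers, and counting them takes one line of code. The snippet below is an illustrative toy network, not any production model; it builds a small two-layer network in PyTorch and tallies its parameters, the same bookkeeping that puts frontier LLMs in the hundreds of billions or trillions.

```python
# Illustrative only: a toy two-layer network, to show what "parameters" means.
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 1024),   # weight matrix (512 x 1024) + bias (1024)
    nn.ReLU(),
    nn.Linear(1024, 512),   # weight matrix (1024 x 512) + bias (512)
)

# Every weight and bias is one trainable "knob" adjusted during training.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")   # about 1.05 million for this toy model
```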

With fewer parameters, these small models don’t try to match the power of frontier models like GPT-5, Claude, and Gemini. But they are good enough for specific tasks, affordable to train, lightweight enough to use on devices like phones and laptops, and easy for startups, researchers, and hobbyists to build and deploy. 

The small-model approach was echoed by researchers at Samsung AI Lab last week, who released a paper showing off their Tiny Recursive Model. It uses a new neural network architecture that shows remarkable efficiency on complex reasoning and puzzle tasks like sudoku, outperforming popular LLMs while using a minuscule fraction of the computational resources.

There has been a wave of other organizations releasing small AI models, showing that size isn’t everything when it comes to power. Last week, Israel’s AI21 unveiled Jamba Reasoning 3B, a 3-billion-parameter open-source model that can “remember” and reason over massive amounts of text and run at high speed even on consumer devices. In September, the UAE’s Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) and G42 introduced K2 Think, an open-source reasoning model with only 32 billion parameters that in trials rivaled systems more than 20 times its size. Meanwhile, Big Tech companies like Google, Microsoft, IBM, and OpenAI have all joined the small-but-mighty club, with models that are a fraction of the size of their bigger counterparts. 

Much of this momentum traces back to China’s DeepSeek, whose lean, low-cost models upended industry assumptions at the beginning of this year and kicked off a race to make AI smaller, faster, and smarter. But it’s important to note that these models, while impressive, aren’t designed to match the broad capabilities of frontier systems like GPT-5. Instead, they’re built for narrower, specialized tasks—and often shine in specific use cases. 

For example, this week IBM Research, along with NASA and others, released open-source, “drastically smaller” versions of its Prithvi and TerraMind Earth-observation models that can run on almost any device, from satellites orbiting Earth to the smartphone in your pocket, all while maintaining strong performance. “These models could reshape how we think about doing science in regions far from the lab—whether that’s in the vacuum of space or the savanna,” the company wrote in a blog post.

None of this means the era of massive, trillion-parameter models is coming to an end. As companies like OpenAI, Google, and Anthropic push toward artificial general intelligence, which demands ever-greater reasoning capability, those massive models will remain the ones pushing the frontier. But the rise of smaller, cheaper, and more efficient models shows that AI’s future won’t be defined by size alone. 

About the Author

Sharon Goldman, AI Reporter

Sharon Goldman is an AI reporter at Fortune and co-authors Eye on AI, Fortune’s flagship AI newsletter. She has written about digital and enterprise tech for over a decade.