Can You Be a Great AI Engineer Without a Million Dollars?
10/14/2025
Andrej Karpathy recently released a new project called nanochat (https://github.com/karpathy/nanochat), a tiny, end-to-end ChatGPT-like pipeline you can train for around a hundred bucks. It’s minimal, hackable, and meant to teach the fundamentals of how large language models actually work.
The tagline reads, “The best ChatGPT that $100 can buy.”
It’s clever, almost poetic, and it caught me off guard. Because under the humor, there’s a truth I can’t stop thinking about: it takes a rare kind of creativity to build something like that today, when AI feels increasingly locked behind data centers and venture money.
nanochat isn’t the most powerful model out there, not even close, but that’s not the point. It’s a reminder that understanding still matters. That curiosity can still produce something real, even when computation feels out of reach.
And yet, it also highlights the other side of the story.
Because when I look at the state of AI today, I can’t help but wonder: can you really be a great AI engineer without a million dollars?
The Dream We Were Sold
For years, we were told that tech was a meritocracy. If you’re curious, hardworking, and creative, you can build anything. The tools are free, the knowledge is online, and your laptop is your workshop.
That story used to feel true. You could learn ML on Coursera, spin up a Jupyter notebook, train a small model overnight, and actually see results.
Now it feels different. You need a corporate budget just to play in the same sandbox.
The Compute Wall
Let’s be real. The barrier isn’t knowledge anymore, it’s compute.
Training something meaningful takes access to GPUs that cost more than most cars. The big players have clusters worth hundreds of millions. You, me, and most independent researchers have Colab Pro and a prayer.
It’s wild how fast this flipped. What used to be an open field now feels like a gated community. If you don’t have access to high-end hardware, you’re not even in the race. You’re just watching from the stands while companies like OpenAI, Anthropic, and Google trade laps.
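To put rough numbers on that, here’s a back-of-envelope sketch, not anyone’s actual bill. It uses the standard approximation of about 6 FLOPs per parameter per training token; the model sizes, token counts, GPU throughput, utilization, and rental price below are all assumptions I’ve picked for illustration.

```python
# Back-of-envelope training cost estimate (all inputs are assumptions).
# Uses the common ~6 * params * tokens approximation for transformer training FLOPs.

def training_cost_usd(params, tokens, gpu_flops_per_s=1e15,
                      utilization=0.4, price_per_gpu_hour=2.50):
    """Rough dollar cost to train a dense transformer once.

    params             -- number of model parameters
    tokens             -- number of training tokens
    gpu_flops_per_s    -- peak throughput of one GPU (assumed ~1e15 for an H100-class card)
    utilization        -- fraction of peak you actually sustain (assumed 40%)
    price_per_gpu_hour -- assumed cloud rental price in USD
    """
    total_flops = 6 * params * tokens
    gpu_seconds = total_flops / (gpu_flops_per_s * utilization)
    gpu_hours = gpu_seconds / 3600
    return gpu_hours * price_per_gpu_hour

# A nanochat-scale run vs. a frontier-style run (both illustrative):
print(f"~0.5B params, 10B tokens:  ${training_cost_usd(0.5e9, 10e9):,.0f}")
print(f"~70B params, 1.4T tokens:  ${training_cost_usd(70e9, 1.4e12):,.0f}")
```

With those made-up but plausible inputs, a nanochat-scale run lands in the tens of dollars, while a 70-billion-parameter, trillion-token run clears a million in GPU rental alone, before data, salaries, or failed experiments.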
Compute Is the New Capital
In capitalism, whoever controls the resources controls the future.
In AI, that resource is compute.
Owning GPUs is like owning oil rigs in 1900. You’re sitting on the thing everyone else needs. And that’s created a weird kind of monopoly on innovation. The frontier models come from the same few companies because they’re the only ones who can afford to train them.
Meanwhile, the rest of us build wrappers, fine-tuners, and side projects around whatever they release. We’ve gone from creating to customizing.
The Loop That Keeps Spinning
Big tech has figured out a pretty efficient loop. Make money, buy GPUs, train better models, make more money, repeat.
Independent engineers can’t keep up. Even open-source models, the ones we celebrate as community-driven, are often trained with funding from the same corporations we’re supposedly pushing back against.
So “open” sometimes means open weights, but not open access. You can download the model, sure. Just don’t ask for the five-million-dollar compute bill it took to train it.
Open Source: Rebellion or Reliance?
Open-source AI is the one bright spot. It’s scrappy. It’s community-driven. It’s people saying, “We’re not waiting for permission.”
But even the open stuff depends on big funding or donated compute credits. It’s not free in the sense we like to think. It’s just subsidized resistance.
Still, there’s hope there. Every time someone squeezes more performance out of smaller models, or builds something clever on a single GPU, it feels like a small rebellion.
nanochat, in that sense, is part of that rebellion. Not because it’s big or competitive, but because it’s small and honest. It reminds us that you can still learn, build, and understand the full stack of an AI system on your own hardware.
That’s a pretty radical statement in 2025.
The Learning vs. Frontier Gap
To be fair, AI isn’t completely closed off. You can still do a lot with Google Colab, Kaggle, Hugging Face, or projects like nanochat. You can fine-tune open models, train toy ones, and even publish solid research if you’re clever about optimization.
There’s a healthy open-source community and plenty of free-tier compute floating around. It’s not hopeless.
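To make that tier concrete, here’s a minimal sketch of fine-tuning a small open model with a LoRA adapter using Hugging Face’s transformers, peft, and datasets libraries. The model name, dataset, and hyperparameters are illustrative placeholders, not recommendations, but the whole thing fits on a single modest GPU.

```python
# Minimal LoRA fine-tune of a small open model on one modest GPU.
# Model name, dataset, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "HuggingFaceTB/SmolLM2-135M"   # any small causal LM works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Train a tiny low-rank adapter instead of the full weight matrices.
lora = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)
model.print_trainable_parameters()           # typically well under 1% of the model

dataset = load_dataset("roneneldan/TinyStories", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", per_device_train_batch_size=4,
                           num_train_epochs=1, learning_rate=2e-4, logging_steps=50),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("lora-out")            # saves only the adapter weights
```

Nothing in that script needs a cluster; an afternoon and a free GPU will do.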
But let’s be honest, that’s the learning tier, not the frontier tier.
You can experiment, but you can’t train a billion-parameter model. You can learn, but you can’t compete with labs running 10,000 GPUs in parallel. Colab and nanochat are great classrooms, but they’re not launchpads for competing with Anthropic or OpenAI.
It’s like being able to practice flying on a simulator while the real planes are locked in corporate hangars.
That doesn’t make AI inaccessible to everyone, but it does mean the definition of “great” in this field is being quietly rewritten, from ingenuity to infrastructure.
Compared to Other Tech Roles
It’s also worth asking how AI compares to the rest of the tech world.
Because if you look around, most roles still reward skill more than spending power.
A web developer can build real projects with a laptop and some free tools.
A data engineer can practice using open datasets and cloud credits.
A security researcher can experiment in virtual labs.
A designer can create stunning interfaces with open-source software.
Even DevOps engineers, who work closest to the infrastructure layer, can learn using free-tier cloud services, local clusters, or container tools like Docker and Kubernetes.
Sure, DevOps gets complex fast, and enterprise systems are expensive. But you can still learn the principles, build automations, and run small setups on your own. What matters most there is still mastery, not money.
AI feels different.
Machine learning used to be part of that same do-it-yourself culture. You could explore ideas in notebooks and learn by doing. Now the distance between what an independent engineer can achieve and what a well-funded lab can do has turned into a canyon.
A front-end developer can still build something that goes viral.
A backend engineer can still design smart systems with limited resources.
A DevOps engineer can still prove their skill by building reliable systems from scratch.
But a solo AI engineer? Without serious compute, you’re boxed out of the frontier.
It’s not about intelligence or effort. It’s about the field itself shifting from code to capital. AI has become the first major area of software engineering where the tools of creation are priced like industrial equipment.
And that’s a turning point for tech as a whole. The garage-style innovation that built Silicon Valley doesn’t fit easily in a world where progress depends on who can afford the next GPU cluster.
The Only Field Where Progress Scales with Money
Most tech fields still let you grow with skill.
A DevOps engineer can automate cloud systems on a laptop.
A security researcher can hack simulated networks for free.
A web developer can launch something real on a free-tier host.
In those fields, curiosity still counts more than capital.
AI flipped that.
In this world, progress scales directly with money. The more GPUs you have, the smarter your model gets. The more data you buy, the better your results. The faster your cluster, the further ahead you pull.
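That isn’t just a feeling; it’s what the scaling-law literature measured. As a rough sketch of the form fitted by Hoffmann et al. (2022), with N the parameter count and D the number of training tokens (the exponents are roughly their fitted values and shift with the setup):

```latex
L(N, D) \;\approx\; E \;+\; \frac{A}{N^{\alpha}} \;+\; \frac{B}{D^{\beta}},
\qquad \alpha \approx 0.34,\quad \beta \approx 0.28
```

Loss keeps falling, smoothly and predictably, as you add parameters and tokens, and both of those are things you buy.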
That’s what makes AI feel different from the rest of tech. It’s not just an engineering discipline anymore, it’s a resource economy.
It’s the only field where being “great” has a price tag.
Redefining “Great”
Maybe that’s what greatness means now. Not building the biggest model, but doing something meaningful with what you have.
Maybe the new mark of a great AI engineer isn’t scale, it’s efficiency. Cleverness per FLOP. The ability to make something powerful without needing a data center.
Projects like nanochat hint at that shift, an attempt to make the field hackable again, understandable again, and maybe even fun again.
Because when you strip away the hype and the compute, that’s what drew most of us here in the first place. Curiosity.
The Real Question
So, can you be a great AI engineer without a million dollars?
By the current definition, probably not.
But maybe that definition is broken.
Maybe being great isn’t about scale or resources anymore. Maybe it’s about creativity, efficiency, and the courage to build anyway.
Because someone has to make intelligence, and innovation itself, affordable again. It might as well be one of us.