An Australian company is growing human brain cells in petri dishes and hooking them up to computers. Actual living neurons—not simulations, not code mimicking neural networks. Real brain tissue is doing computational work. My first reaction when I heard about this? Complete disbelief.
Cortical Labs built something called the CL1. It's got about 800,000 neurons sitting on a chip (Kagan et al., 2022). They demonstrated it at a tech expo in Barcelona this year. Most attendees probably walked right past it. But this tech? It's kept me up countless nights reading papers.
2022 marked the beginning. Researchers at Cortical Labs cultivated neurons and connected them to the video game Pong. The cells learned to play. How long did it take? Five minutes (Kagan et al., 2022). No learning algorithm was written for them; software only closed the feedback loop. The neurons just... figured it out, and their rallies measurably improved within the session.
Consider what that means. Cells in a dish—disconnected from any body, lacking any brain structure—were learning video game mechanics. That shouldn't be possible based on everything we thought we knew.
Billions of dollars have gone into making computers faster. Silicon chips? They're mathematical powerhouses. Give them any calculation and they'll solve it instantly.
But learning is where things get complicated. AI needs enormous datasets plus massive power consumption just to identify cats in photographs. You have to show it millions of cat pictures before it develops that capability.
Human brains work differently. You saw maybe a dozen cats as a child. That was enough. Now you recognise them in any lighting, from any angle, regardless of breed. The reason? Neurons constantly reorganise themselves—creating pathways, eliminating others (Smirnova et al., 2023). Real-time rewiring happens continuously in your brain.
A Johns Hopkins professor captured the distinction perfectly: "While silicon-based computers are certainly better with numbers, brains are better at learning" (Smirnova et al., 2023). We've spent years forcing silicon to behave like biological brains. Using actual brains might've been the better approach all along.
There's another advantage with organoids. They grow three-dimensionally, unlike the flat architecture of traditional chips. More neurons fit into smaller spaces, and each neuron can potentially connect to as many as 200,000 others. Try replicating that density with conventional circuits and you'd need a warehouse-sized building.
GPT-3's training required an estimated 1,300 megawatt-hours of electricity. That's equivalent to powering 130 homes for an entire year. And that was a single training run of a single model. Companies keep producing these systems, each iteration larger than before.
Your brain uses twenty watts. A desk lamp uses more power. Yet your brain performs tasks that make state-of-the-art AI look simplistic.
Biological computing could be a billion times more efficient than silicon, according to research (Hartung & Elferink, 2023). I checked that paper multiple times when I first read it. A billion times. We're not discussing incremental improvements here—this is a completely different category of efficiency.
Traditional chips running AI workloads typically draw around 300 watts. These neural systems? They use 0.02 watts for comparable tasks. The difference is almost absurd.
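The arithmetic here is easy to sanity-check. A quick back-of-the-envelope sketch; the 10,000 kWh per year household figure is my own assumption, while the rest comes straight from the numbers above:

```python
# Back-of-the-envelope check of the energy figures in this section.
GPT3_TRAINING_KWH = 1_300 * 1_000   # 1,300 MWh converted to kWh
HOME_KWH_PER_YEAR = 10_000          # assumed average annual household use

print(GPT3_TRAINING_KWH / HOME_KWH_PER_YEAR)   # 130.0 homes for a year

CHIP_WATTS = 300      # typical AI accelerator draw, per the text
NEURAL_WATTS = 0.02   # reported draw of the biological system

print(f"{CHIP_WATTS / NEURAL_WATTS:,.0f}x")    # 15,000x raw power gap
```

Note that the raw power gap works out to roughly 15,000-fold; the billion-fold figure presumably measures energy per useful operation rather than instantaneous draw.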
Cooling requirements add another layer of complexity. Data centres generate tremendous heat. Industrial air conditioning prevents equipment failure. Biological systems sidestep this problem entirely.
Building a computer from neurons sounds complicated. The process is actually more straightforward than you'd expect.
Stem cells are the starting point. These versatile cells can become any cell type. Scientists provide them with specific chemical signals and nutrients. Several weeks later, they've transformed into brain cells. The cells start communicating, forming connections, and developing into small masses of brain tissue. You've got your organoid.
Next, the tissue is placed onto a microelectrode array, or MEA (Cai et al., 2023). Picture a platform covered in tiny electrodes. These electrodes send electrical signals to the neurons and detect the signals coming back. It's a translation layer between biology and electronics.
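To make the idea concrete, here's a minimal sketch of what that translation layer looks like from the software side. Every name here is hypothetical, not Cortical Labs' actual interface:

```python
# Minimal sketch of the read/write loop an MEA enables. The class and
# method names are hypothetical, not any vendor's actual API.
import numpy as np

class MicroelectrodeArray:
    """Translation layer: voltage patterns out to neurons, spikes back in."""

    def __init__(self, n_electrodes: int = 64):
        self.n_electrodes = n_electrodes

    def stimulate(self, pattern: np.ndarray) -> None:
        """Deliver one voltage value per electrode to the tissue."""
        assert pattern.shape == (self.n_electrodes,)
        # The real hardware call would go here.

    def record(self, window_ms: int = 100) -> np.ndarray:
        """Return spike counts per electrode over the sampling window."""
        # Placeholder: real hardware returns measured neural activity.
        return np.random.poisson(lam=2.0, size=self.n_electrodes)
```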
Training is where things get interesting. Dopamine—that chemical your brain releases during positive experiences—gets delivered to neurons when they perform correctly. The neurons respond by strengthening the connections that led to that dopamine reward. Like training an animal, except you're working with brain tissue floating in a dish.
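In software terms, that training loop is a plain closed loop: stimulate, record, decode, reinforce. This sketch reuses the hypothetical MEA class above; `reward()` stands in for whatever reinforcement a given rig delivers (dopamine here, though Kagan et al. used predictable versus unpredictable stimulation as the feedback signal):

```python
# Closed-loop training sketch, building on the hypothetical MEA above.
def train_step(mea, encode, decode, reward, task_input, target):
    mea.stimulate(encode(task_input))     # present the task as voltages
    response = mea.record(window_ms=100)  # read the tissue's answer
    action = decode(response)             # map spike counts to a move
    if action == target:                  # did the culture act correctly?
        reward(mea)                       # reinforce those connections
    return action
```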
Keeping them alive requires effort. These aren't silicon chips you can throw in a drawer. Precise temperature control is necessary. Sterile environments are mandatory. Constant nutrient supply is required. They survive about 100 days currently before replacement becomes necessary. Each unit costs $35,000, and yes, they're available for purchase. Shipping began this past summer.
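Out of curiosity, here's what a slice of that life-support logic might look like in code. A toy vitals check with made-up thresholds, nothing like a real incubator controller:

```python
# Illustrative life-support check for a neural culture. Thresholds are
# invented for this sketch; real systems track these (and more) constantly.
WETWARE_LIMITS = {
    "temperature_c": (36.5, 37.5),   # narrow body-temperature band
    "ph": (7.2, 7.5),                # culture-medium acidity
    "glucose_mM": (2.0, 25.0),       # nutrient supply
}

def check_vitals(readings: dict) -> list[str]:
    """Return an alarm for any reading outside its band."""
    return [
        f"{key} out of range: {readings[key]}"
        for key, (low, high) in WETWARE_LIMITS.items()
        if not low <= readings[key] <= high
    ]

print(check_vitals({"temperature_c": 37.0, "ph": 7.6, "glucose_mM": 5.0}))
# ['ph out of range: 7.6']
```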
Drug testing represents the most immediate application. Pharmaceutical companies face challenges in testing medications that affect the brain. Animal brains aren't identical to human brains. Computer simulations provide approximations at best. Actual human neurons, though? You can directly observe their response to experimental treatments for conditions like Alzheimer's. This changes everything for drug development.
The speech recognition demonstration was noteworthy. They created something called Brainoware that identified different speakers from audio recordings with 78% accuracy (Cai et al., 2023). Smartphones perform better, sure. But smartphones aren't using actual neurons in dishes. This proved that biological tissue can process complex audio input and generate meaningful output.
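The published description of Brainoware is essentially reservoir computing: the organoid transforms stimulation patterns into high-dimensional activity, and a simple trained readout does the classifying. Here's a sketch of that readout stage, with the tissue's responses faked as random data:

```python
# Reservoir-style readout sketch: the organoid turns audio input into
# activity patterns, and a lightweight linear classifier is trained on
# those patterns. Responses are random stand-ins here; in Brainoware
# they come from the recorded tissue.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_clips, n_electrodes, n_speakers = 240, 64, 8

responses = rng.normal(size=(n_clips, n_electrodes))  # stand-in for MEA recordings
speakers = rng.integers(0, n_speakers, size=n_clips)  # ground-truth labels

readout = LogisticRegression(max_iter=1000).fit(responses, speakers)
print(f"Training accuracy: {readout.score(responses, speakers):.0%}")
```

The notable part is that only the small readout gets trained in the conventional sense; the heavy lifting happens in the living tissue.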
Adaptive robotics has the most potential in my view. Traditional AI is inflexible. Train it for one specific task—that becomes its permanent function. Modify conditions even slightly, and you're retraining the entire system from scratch. Biological networks adapt without that hassle. They continue learning naturally, no massive updates required. Robots that genuinely improve at their jobs every single day just by performing them—that's the vision here.
Cortical Labs isn't the only player. FinalSpark, a Swiss company, chose a different business model. Instead of selling hardware, they rent access to their neural systems. "Wetware-as-a-Service" is what they call it. Possibly the best tech terminology I've encountered this year.
Researchers pay $500-$1000 monthly to remotely access neural computing capabilities without maintaining their own laboratories. FinalSpark's organoids are smaller—roughly 10,000 neurons each. They connect them in arrays, though. Nine research institutions have established partnerships already. Universities want access. Entertainment companies want access. Everyone's interested.
When multiple companies invest significant capital into emerging technology, it signals that real commercial potential exists. This has moved beyond academic curiosity.
Lifespan presents the first major obstacle. A hundred days might sound acceptable initially. Then you remember your laptop has been running for five years. Replacing biological components every few months isn't practical for most real-world applications. Biological limitations are hard to overcome.
Scale is another challenge. Even the largest organoids contain fewer than a million neurons. A cat's brain has hundreds of millions; a human's, tens of billions. Competing with advanced AI systems apparently requires at least 10 million neurons per organoid, all maintaining health and proper connectivity. That's not just an engineering problem. It's a question of feasibility.
Ethics complicate matters significantly. These are human brain cells we're discussing. Actual human neurons growing in laboratory settings. As complexity increases, questions arise. Could they become conscious? Could they experience suffering? (Lavazza & Massimini, 2018). They're too simple currently for those concerns. But where should the line be drawn? Who makes that determination?
Cell sourcing raises concerns, too. These cells originate from stem cell lines. As this technology scales up, clear regulations about consent and donor rights become necessary. The Henrietta Lacks case should have taught us to approach these matters carefully.
Silicon won't be replaced entirely by biological computing. That's not really the objective here. Silicon remains superior for pure mathematical calculations and tasks requiring perfect repeatability.
Hybrid systems seem more likely for the future. Silicon handles the computational work. Biological networks handle learning and adaptation. Each component does what it does best.
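I can only guess at the plumbing, but the division of labour might look something like this sketch, with illustrative names throughout:

```python
# Toy sketch of a hybrid system: silicon for exact, repeatable maths,
# a biological network for adaptive pattern tasks. Illustrative only;
# no real vendor API is implied.

def silicon_compute(x: float) -> float:
    """Deterministic number-crunching: silicon's strength."""
    return x ** 0.5

def wetware_classify(features: list[float]) -> str:
    """Stand-in for a query to living tissue, which would also keep
    adapting from every input it sees."""
    return "cat" if sum(features) > 0 else "not-cat"

def handle(kind: str, payload):
    """Route each task to whichever substrate suits it."""
    return silicon_compute(payload) if kind == "exact" else wetware_classify(payload)

print(handle("exact", 2.0))            # 1.4142...
print(handle("adaptive", [0.3, 0.9]))  # cat
```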
The Australian military is already funding this research. That demonstrates how seriously major institutions are taking it. They recognise potential in autonomous systems where continuous learning provides more value than raw processing speed (Kagan et al., 2022).
The timing is noteworthy. AI power consumption is hitting practical limits just as this biological alternative emerges. That's probably not coincidental. Maybe exploring silicon computing fully was necessary before we could appreciate what biological computing offers.
Three years elapsed between neurons playing Pong and commercial biological computers becoming available. That's remarkably rapid for such a fundamental technological shift. We're observing something genuinely novel developing here. Not just improved chips—an entirely different computing paradigm.
Will your next smartphone run on brain cells? Unlikely. Medical research, though? Adaptive robotics? Energy-efficient AI? Those applications seem probable. I'm genuinely excited to see how this develops. We're literally cultivating the future in laboratory dishes. That's exactly as remarkable as it sounds.
References: