Biological Computing Breakthrough: Human Neurons Play DOOM in Petri Dish

Cortical Labs has successfully trained 200,000 human brain cells to play the classic video game DOOM, marking a significant leap toward Synthetic Biological Intelligence. This biological computing approach could solve AI's massive energy consumption problem while enabling new forms of adaptive learning.

Mar 8, 2026 · via @rohanpaul_ai

Human Brain Cells in a Dish Learn to Play DOOM: The Dawn of Biological Computing

In a remarkable fusion of biology and technology, researchers at Australian biotech company Cortical Labs have achieved what sounds like science fiction: they've trained approximately 200,000 human brain cells grown on a silicon chip to play the 1993 video game DOOM. This isn't merely a quirky laboratory demonstration—it represents a fundamental shift toward what the company calls Synthetic Biological Intelligence (SBI), potentially solving some of artificial intelligence's most pressing limitations.

How 200,000 Neurons Learned to Play

The experiment, detailed in social media reports from AI commentator Rohan Paul, involves growing human neurons derived from adult stem cells directly onto specialized silicon chips. Rather than providing the cells with visual input through a screen, Cortical Labs translated DOOM's digital data into electrical signals.

When a monster appears on the left side of the game environment, the left side of the neural network receives an electrical "zap." The neurons naturally respond by firing electrical spikes, which Cortical's software then decodes into game commands like shooting or dodging. Through neuroplasticity—the same biological mechanism by which human brains learn—the cells gradually identify which neural pathways lead to success and strengthen those connections.
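The closed loop described above — stimulate one side of the array, read spike activity, decode it into a game command, and reinforce successful pathways — can be sketched in Python. Note that Cortical Labs' actual hardware interface is not documented in this article, so everything below (the `SimulatedCulture` class, electrode counts, the reward rule) is a hypothetical, simulated stand-in meant only to illustrate the shape of the loop:

```python
import random

ACTIONS = ["shoot", "dodge_left", "dodge_right"]

class SimulatedCulture:
    """Stand-in for the neural culture: per-electrode weights per action
    channel that strengthen when an action is rewarded (crude plasticity)."""
    def __init__(self, n_electrodes=8):
        self.weights = {a: [random.random() for _ in range(n_electrodes)]
                        for a in ACTIONS}

    def stimulate_and_read(self, zone):
        # "Zap" one region of the array, then read activity per action channel.
        return {a: sum(w[zone:zone + 4]) + random.random()
                for a, w in self.weights.items()}

    def reinforce(self, action, reward):
        # Strengthen (or weaken) the pathways behind the chosen action.
        self.weights[action] = [w + 0.1 * reward for w in self.weights[action]]

def decode(spikes):
    # Software layer: the strongest channel wins and becomes the game command.
    return max(spikes, key=spikes.get)

culture = SimulatedCulture()
for step in range(100):
    monster_zone = random.choice([0, 4])         # monster on left or right
    spikes = culture.stimulate_and_read(monster_zone)
    action = decode(spikes)
    reward = 1.0 if action == "shoot" else -0.5  # toy success criterion
    culture.reinforce(action, reward)
```

Over repeated iterations the rewarded channel's weights grow, so the "culture" drifts toward the successful behavior — a loose software analogue of the neuroplasticity mechanism the article describes.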

This approach stands in stark contrast to traditional AI training, which requires billions of parameters and massive datasets. Biological cells possess inherent plasticity, allowing them to adapt organically to new, sparse, and chaotic data streams almost instantly as they seek to optimize their environment.

Solving AI's Energy Crisis

The energy implications of this breakthrough are potentially revolutionary. Traditional silicon-based AI systems, particularly the massive GPU clusters used for training models like GPT-4, consume staggering amounts of electricity—often measured in megawatts. In contrast, biological brains represent the most efficient computing systems known to exist.

The human brain operates on approximately 20 watts of power—less than a standard lightbulb. According to Cortical Labs, a server rack containing 30 of their CL1 biological computers consumes less than 1,000 watts combined. This represents orders of magnitude improvement in energy efficiency compared to conventional AI hardware.
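The quoted figures make for a quick back-of-the-envelope comparison. The per-GPU wattage below is an illustrative assumption (roughly a modern datacenter training GPU), not a number from the article:

```python
# Figures cited in the article
human_brain_w = 20            # ~20 W, less than a standard lightbulb
cl1_rack_w = 1_000            # <1 kW for a rack of 30 CL1 units (Cortical Labs)
cl1_units_per_rack = 30

per_unit_w = cl1_rack_w / cl1_units_per_rack   # ~33 W per biological computer

# Assumed comparison point: one modern training GPU draws on the order of 700 W.
gpu_w = 700

print(f"CL1 unit: ~{per_unit_w:.0f} W vs. one GPU: {gpu_w} W "
      f"(~{gpu_w / per_unit_w:.0f}x more per device)")
```

Even under these rough assumptions, a single CL1 unit lands in the same power class as the human brain, while one GPU draws an order of magnitude more — and training clusters contain thousands of GPUs.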

Wetware-as-a-Service: Biological Computing Goes Commercial

Perhaps most surprisingly, this technology has already moved beyond the laboratory. Cortical Labs has commercialized their system through what they call "Wetware-as-a-Service" (WaaS). Their "Cortical Cloud" platform allows developers worldwide to write Python code and deploy it via an API to physical jars of living human brain cells sitting in server racks.

This means researchers no longer need sterile laboratory facilities to experiment with biological computing—they simply need a software subscription. The platform effectively democratizes access to biological neural networks, potentially accelerating research and development in this emerging field.
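The article says developers write Python and deploy it via an API, but does not document that API. The sketch below is therefore entirely hypothetical — the client class, method names, and payload fields are invented to illustrate what a "deploy code to a remote culture" workflow might look like, and the remote call is stubbed out locally so the example runs:

```python
import json

class CorticalCloudClient:
    """Hypothetical client, not a real SDK: shows the shape of a
    Wetware-as-a-Service deployment call, with the network step stubbed."""
    def __init__(self, api_key):
        self.api_key = api_key

    def deploy(self, experiment_code, culture_id):
        # A real service would POST this payload to a remote culture in a
        # server rack; here we only build and return the request payload.
        return json.dumps({
            "culture": culture_id,
            "code": experiment_code,
            "authenticated": bool(self.api_key),
        })

client = CorticalCloudClient(api_key="demo-key")
request = client.deploy(
    experiment_code="stimulate(zone=0); read_spikes()",  # invented experiment DSL
    culture_id="culture-42",
)
print(request)
```

The point is the workflow, not the names: the researcher's only interface is code plus credentials, which is what makes the "software subscription instead of a sterile lab" claim plausible.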

Ethical Considerations and Consciousness

Cortical Labs emphasizes that these neural systems are not conscious. To achieve sentience, self-awareness, or the capacity to suffer, a brain requires complex structures like a prefrontal cortex, limbic system, and sensory organs. The CL1 system contains just 200,000 neurons in a microscopic, disorganized web—compared to approximately 86 billion neurons in a human brain.

The cells don't "know" what DOOM is, don't "see" demons, and don't experience fear. They function as biological circuits seeking electrical equilibrium, responding to stimuli through basic biological mechanisms rather than conscious thought.

The Future of Biological Computing

This development represents what researchers describe as "the very bleeding edge of a transition from simulating neural networks in silicon to utilizing actual neural networks in biology." While still in early stages, the implications are profound:

  1. Energy-efficient AI: Biological computing could dramatically reduce the carbon footprint of advanced AI systems
  2. Adaptive learning: Biological neural networks may excel at tasks requiring rapid adaptation to changing environments
  3. Hybrid systems: Future computing architectures might combine silicon and biological components for optimal performance
  4. Neuroscience research: These systems provide unprecedented windows into how neural networks process information

As with any emerging technology, biological computing raises important ethical questions about the use of human-derived cells, potential future developments toward consciousness, and the appropriate applications of such systems. However, the current technology remains firmly in the realm of sophisticated biological circuits rather than artificial consciousness.

Source: Rohan Paul AI (@rohanpaul_ai) on X/Twitter, reporting on Cortical Labs research

AI Analysis

This development represents a paradigm shift in computing architecture that could address two fundamental limitations of current AI systems: energy inefficiency and lack of true adaptive learning. Traditional AI models require massive computational resources for both training and inference, creating sustainability concerns as AI adoption grows. Biological computing offers a potential solution by leveraging the inherent efficiency of neural systems that have evolved over millions of years.

The commercial implementation through Cortical Cloud is particularly significant, as it lowers barriers to entry for researchers and developers. This could accelerate innovation in biological computing much as cloud computing democratized access to GPU resources for AI development. However, the technology remains in its infancy — 200,000 neurons represent only about 0.0002% of a human brain's roughly 86 billion, suggesting substantial scaling challenges ahead.

Ethically, the use of human-derived cells in computing systems will require careful consideration as the technology matures. While current systems lack consciousness, future developments with more complex neural structures may necessitate new ethical frameworks. The technology also raises questions about biological security and the potential vulnerabilities of living computing systems compared to traditional silicon-based hardware.