A small number of companies are developing biological computers
Floriana/Getty Images
Data centers consume vast amounts of energy even as demand for computer chips continues to soar. Could brain cells be the solution?
Australian startup Cortical Labs is pioneering this field, planning to establish two “biological” data centers in Melbourne and Singapore. These facilities will house chips integrated with lab-grown neurons.
Cortical Labs stands out as one of the few firms creating biological computers that link nerve cells to microelectrode arrays, enabling the stimulation and measurement of cell responses during data input. Recently, the company successfully showcased that its primary model, the CL1, can learn to play games like Doom within just a week.
The first data center in Melbourne is set to accommodate around 120 CL1 units, while a second facility in collaboration with the National University of Singapore will initially support 20 CL1 systems, with plans to expand to 1,000 pending regulatory approval. This initiative aims to enhance cloud-based brain computing services.
According to Michael Barros from the University of Essex, UK, while biological computers have been constructed and tested globally, they remain challenging to build and use. He states, “We invest a lot of time and resources developing these systems.”
Barros further elaborates that Cortical Labs is democratizing access to biocomputers at scale, pioneering an accessible approach in the industry.
These systems can be trained for simple tasks, such as playing Doom, yet there are challenges in understanding how neurons function and training them for more complex tasks like machine learning. Reinhold Scherer, also from the University of Essex, notes, “When you access this technology, it opens doors to exploration in learning, training, and programming, but neurons cannot be programmed like standard computers.”
Cortical Labs asserts that its biological data centers use significantly less energy than traditional computing systems: each CL1 draws only 30 watts of power, compared with the thousands of watts consumed by leading conventional AI chips.
Paul Roach from Loughborough University, UK, emphasizes that scaling biocomputers into entire rooms, akin to traditional data servers, could yield substantial energy savings. Notably, while biological data centers may necessitate nutrients to sustain neuron chips, they require less cooling energy than conventional computing infrastructures, suggesting significant potential for energy conservation.
Nevertheless, experts such as Tjeerd Olde Scheper from Oxford Brookes University, UK, caution that the technology remains nascent. “Will it perform as expected? We are still in the early developmental phase,” he comments.
Although direct comparisons between the sizes of biological and silicon AI systems remain complex, it’s notable that the envisioned biological data center would integrate hundreds of biological chips in contrast to the hundreds of thousands of GPUs typically found in large-scale AI data centers.
“We have a long way to go before these systems are production-ready. Transitioning from a small network playing games to a large language model is a substantial leap,” says Steve Furber from the University of Manchester, UK.
A pressing concern is the lack of clarity on how to store training outcomes within neurons as memory, or how to execute computational algorithms beyond specific tasks, such as video gaming.
Additionally, retraining neurons after a task is completed poses challenges, since whatever the cells have learned is lost when they reach the end of their lifespan. “Proper retraining is essential,” Scherer states. “If retraining is required every 30 days, it may hinder technological continuity.”
Source: www.newscientist.com