Rollin Justin can navigate obstacles and serve beverages.
Henrik Sporer/laif/CameraPress
Human fascination with our own likeness is longstanding. The term “robot” was introduced by Czech writer Karel Čapek in his 1920 play R.U.R. (Rossum’s Universal Robots), which depicted human-like beings built to work in factories. Since then, numerous humanoid robots have been designed to interact with their environment in human-like ways.
Henrik Sporer, an acclaimed photographer with the agency laif, investigates such themes in his project Tomorrow is the Problem. The main image above showcases the metal craftsmanship of the Institute of Robotics and Mechatronics at the German Aerospace Center, in the form of a 200-kilogram robot named Rollin’ Justin. It can cover long distances on wheels, capture images, navigate around obstacles, and serve drinks. Its potential applications include assisting astronauts and aiding people with disabilities.
Lola is the ideal robot for challenging terrains and new surroundings
Henrik Sporer/laif/CameraPress
The image above depicts Lola, a 1.8-meter tall robot developed by the Technical University of Munich. It can navigate new and unstable surfaces with ease.
Next up is ARMAR-6, a robot standing 1.9 meters tall, designed by the Karlsruhe Institute of Technology. This machine can converse with people and help carry heavy loads.
ARMAR-6 is designed to assist with heavy lifting
Henrik Sporer/laif/CameraPress
The final image below features ZAR5, a dual-armed robot engineered at the Technical University of Berlin, equipped to pick and place items with finesse.
ZAR5 can pick up and place items with dexterity
Feedback is New Scientist’s popular column that keeps an eye on the latest developments in science and technology. To share stories you think might be of interest to our readers, email feedback@newscientist.com.
Mechanical Turk
Feedback can be quite cynical, and when faced with enforced entertainment we tend to shy away. This might explain why we find buying ice cream in Türkiye a bit of a challenge: to get your cone, you must either truly enjoy, or at least tolerate, an extended prank.
Turkish ice cream shops are notorious for toying with their customers, presenting them with a filled cone only to whip it away with a flourish of sleight of hand. These performances are impressive and take years to perfect, but when Feedback says we want ice cream, we really are just after the ice cream, not an immersive magic act.
So we sighed inwardly when reporter Matthew Sparks tipped us off about a new preprint on arXiv describing a robot that mimics the antics of a Turkish ice cream vendor. As Matt noted, “all the essential research has been completed.”
The result is a robotic arm capable of twisting, turning, and swinging in various directions. Researchers programmed it to perform five entertaining tricks typical of Turkish ice cream vendors.
In one instance, the robot “bounces” the cone from side to side, giving the impression that it’s moving away from the user. In another scenario, it “evades” the user’s hand as they reach for the cone, creating a large arc before pulling back. And then there’s the “dance,” where it playfully levitates the cone in a circular motion just out of the user’s grasp.
Next, the robot was trialed on real people. Its tricks were rated as “more deceptive” than a control condition in which the robot merely handed over the ice cream without any theatrics. Notably, the trickery “enhanced enjoyment-related responses (pleasure, engagement, challenge) and the robot’s perceived efficacy, while also potentially undermining performance reliability, perceived safety, and self-efficacy.”
In effect, “playful deception introduces a structural trade-off: it can delight and retain attention but often at the cost of predictability and trust.” The authors advise that “in safety-sensitive contexts, the resulting decline in trust and security may be intolerable.” Really? You think?
Appropriate acronyms
When Feedback first solicited suggestions for the best and worst scientific acronyms, we had no inkling of the flood of responses to come. Our inboxes were overwhelmed with convoluted phrases reduced to mere strings of capital letters.
For instance, Stuart McGlashan shared information about a conservation initiative focused on “revitalizing the marine and coastal environment of the River Solway,” on the border of England and Scotland. It was aptly named the Solway Coast And Marine Project, or SCAMP.
Stuart felt the project’s creators were shortchanging themselves. Given its focus on restoring marine life, wouldn’t one more word have made the acronym even better? Feedback concurs: tack an “Initiative” onto the end and SCAMP becomes SCAMPI.
On the other side of the globe, Jamie Pittock and Jenny Marella from the Australian National University recently secured funding for a project examining the management of rivers flowing into the Indian Ocean. Creatively, they titled it “Management of Rivers Discharging to the Marine Domain (MORDOR).”
However, this serves as a cautionary lesson. Jamie recounted: “We posted a call for a Research Fellow, and Mr. Bilbo Baggins from the Shire applied. Thankfully, there were far more qualified candidates, and he was not selected.”
Shakespeare shakes up
Feedback previously pointed out that two of William Shakespeare’s sonnets need revising to remove erroneous references to roses possessing thorns: those sharp projections are, botanically speaking, prickles. Reader James Fradgley wrote in to note that Shakespeare’s scientific inaccuracies extend well beyond botany and into astronomy.
In Julius Caesar, Act 3, Scene 1, the titular dictator proclaims: “But I am constant as the northern star, / Of whose true-fixed and resting quality / There is no fellow in the firmament.” Caesar is referring to the North Star, which sits so close to the celestial north pole that, while the other stars appear to wheel around it through the night, it remains almost fixed in the sky.
However, as James points out, at the time of Caesar’s assassination in 44 BC, “Polaris was not the North Star.” Instead, a star named Kochab, or Beta Ursae Minoris, lay nearest to the celestial north pole, yet even it wasn’t close enough to serve as a reliable navigation point.
“To complicate matters, Polaris is a Cepheid variable star,” James adds, meaning its brightness fluctuates regularly rather than shining with a stable intensity. “Honestly, I don’t understand why anyone bothers with Shakespeare,” he concludes.
Feedback is inclined to be more lenient. Our grasp of astronomical history isn’t solid enough to say whether Europeans of Shakespeare’s era knew the pole star had shifted, and he was surely busy enough that he can be forgiven for overlooking it. Cepheid variable stars, meanwhile, weren’t identified until 168 years after his death, which seems like a valid excuse.
Have a story for Feedback?
You can send stories to Feedback by email at feedback@newscientist.com. Please include your home address. This week’s and past Feedbacks can be seen on our website.
I found myself in a softly illuminated room, making my way to the table. The beat of “Mamushi” by Megan Thee Stallion filled the air, while two large white circles moved rhythmically along the silhouette of my body displayed on the screen.
Is this an exclusive sex club in Germany at 2 AM?
Unfortunately, no. I’m actually in a suburban shopping center on a Tuesday afternoon, experiencing a massage from an Aescape robot.
Aescape, let me explain, is a “groundbreaking lifestyle robot company” that “transforms the wellness sector by delivering outstanding massage experiences.” To put it simply, they provide robot massages: a cushioned table paired with two large robotic arms that knead your body according to your preferences and a preset program.
I absolutely love massages—there’s nothing that makes me happier than having the sack of flesh I call a body manipulated like Wagyu beef. So, I opted for the “power-up” option for 30 minutes, which costs $60 and promises to make you feel “good and alert.”
According to Mayo Clinic Health System, massage therapy can benefit numerous conditions including anxiety, depression, sports injuries, digestive issues, headaches, and soft tissue sprains. It also enhances circulation, boosts the immune system, reduces stress, and increases energy levels.
But can a robotic massage compare to one delivered by a human?
“There are many gimmicks out there. My mother owns a massage chair, and I use a Theragun at home,” explains Christa de La Garza, a board-certified massage therapist from Colorado. While these devices can be beneficial, De La Garza believes there’s no cause for concern about robots taking over.
Primarily, there are tangible physical advantages to interacting with humans. During the early days of the pandemic, many felt “skin hungry,” a term reflecting a need for physical touch that comes from prolonged isolation.
A paper published in Nature in 2024 indicated that receiving touch is “very significant,” revealing that touch interventions help alleviate pain, depression, and anxiety in adults. The study found that while touch from objects and robots could provide physical benefits, the mental health advantages were minimal.
“Receiving safe and therapeutic touch is incredibly powerful,” shared De La Garza.
Aescape clarifies on their website that they do not intend to replace therapists but rather to complement their services and address workforce shortages in the industry.
An Aescape robot masseuse. Photo: Provided by Aescape
Upon my arrival, the efficient, blonde receptionist assured me, “Humans can’t be replaced.” Although I was late due to parking complications, the machine didn’t seem to mind. It felt like a typical spa, complete with nail polish, pedicures, and crystals for sale at the front desk. As we walked to the robot room, she mentioned that Aescape is quite popular, especially among clients who “are uneasy about being touched by strangers.”
Once inside, she provided me with high-compression, Aescape-branded leggings and top to wear. As my treatment began, I learned these garments help improve the machines’ “body detection.” Lying on the table, I remained very still as the Aescape robot scanned my shape. The staff explained the screen controls and an emergency button to halt the process if needed.
“We don’t need it!” she said cheerfully, before leaving me to change.
Once I was facing the screen, I was greeted with a variety of options. What kind of music would I prefer? Choices included a playlist named lo-fi ambient jazz, classic rock, or “brat.” I could also see clear outlines revealing my body’s quirks—did they have to outline my saddlebags so distinctly?—or watch calming videos of the ocean, snowy mountains, or rainy forests. The headrest could be adjusted for different pressure settings on the machine.
One of De La Garza’s concerns about robotic massages is the potential for overthinking. With a human therapist, you tend to surrender more, as they steer most of the experience, allowing you to switch off your mind.
I, however, was distracted by the temptation to fine-tune the music and visuals. Did I want to see the snow? It was okay. Was a “Brat” playlist relaxing? Nope—back to ambient lo-fi. Is that really what my body looks like? This is unsettling.
Several friends expressed concern that a robotic massage might make me anxious. What if the powerful robotic arms tore through my muscles like a pile of deli meat? Conversely, my greater worry was that the massage wouldn’t be firm enough, leaving my tension knots intact.
Nevertheless, the massage proved enjoyable. The robot’s smooth plastic hands felt pleasantly warm, and while they lack the finesse and precision of human touch, they make for a rather fun experience. The hands themselves are knobby shapes, a bit like small Dutch clogs, that press firmly into your back.
By the end of the session, I felt significantly more relaxed than when I began. Although I wasn’t exactly elated, I sensed that I could comfortably drift into a deep, dreamless sleep.
It certainly can’t replace the human touch: I missed the surrender and attention to detail a traditional massage provides, and Aescape doesn’t touch the head, hands, or feet. But if you’re feeling sore after a workout or have been trudging around a suburban shopping center for hours, I could see it as a viable option. It’s relatively affordable, and there’s no need to tip.
Presence: Everywhere, particularly on social media.
That seems somewhat derogatory. Indeed, it’s considered a slur.
What type of slur? A slur targeting robots.
Is it because they are made of metal? Yes, it’s often used to insult actual robots like delivery bots and autonomous vehicles, but it increasingly targets platforms like AI chatbots and ChatGPT.
I’m not familiar with this – why would I want to belittle AI? Take your pick: AI systems confidently promote utterly false narratives, churn out “slop” (low-quality, obviously machine-generated content), or simply lack human qualities.
Does AI care about being insulted? It’s a complex philosophical issue, and the consensus is “no.”
So why does it matter? People feel frustrated with technology that can become widespread and potentially disrupt job markets.
Coming over here, taking our jobs! That’s the notion.
Where did this slur originate? It was first used in the 2005 video game Star Wars: Republic Commando as an insult for battle droids, but “clanker” gained popularity through the Clone Wars TV series. It then spread to Reddit, memes, and TikTok.
Is that truly the best we can do? Popular culture has birthed other anti-robot slurs. There’s “Toaster” from Battlestar Galactica and “Skin Job” from Blade Runner, but “Clanker” seems to have taken the lead for now.
It seems like a frivolous waste of time, but I suppose it’s largely harmless. You might think so, yet some worry that throwing “clanker” around could normalize real bigotry.
Oh, come on. Popular memes and parody videos often use “clanker” as a stand-in for racial slurs.
So what? They’re just clankers. “This inclination to use such terms reveals more about our insecurities than about the technology itself,” says linguist Adam Aleksic.
I’m not anti-robot, but I wouldn’t want my daughter to marry one. Can you hear how that sounds?
I feel like I’ll be quite embarrassed about all this in ten years. Probably. Some argue that by mocking AI, we risk elevating it to a human level that isn’t guaranteed.
That’s definitely my view. There is also “Roko’s basilisk,” a thought experiment suggesting that a future AI could punish those who didn’t help it come into being.
In which case it seems unwise to call it a clanker. Quite. We might find ourselves apologizing to our robot overlords for past injustices.
Will they see the funny side? Perhaps one day clankers will have a sense of humor about it.
Do say: “This desire to create a slur reflects more on our insecurities than the technology itself.”
Don’t say: “Some of my best friends are Clankers.”
A swift left hook, a front kick to the chest, a series of cross jabs, and the crowd erupts in cheers. However, it isn’t traditional kickboxing skills that determine the outcome of the match; instead, an attempted roundhouse kick goes awry, leading to the kickboxer from a prestigious university team tumbling to the floor.
While conventional kickboxing involves bloodshed, sweat, and the risk of severe head injuries, competitors in Friday’s bouts at the inaugural World Humanoid Robot Games in Beijing faced a different set of challenges, including balance, battery life, and a deeper question of purpose.
The compact kickboxing humanoids, entered by teams from leading Chinese technology universities, are part of the latest jamboree for humanoid robots at China’s tech gatherings. The government-backed competition opened on Friday morning with a performance of the Chinese national anthem before an audience of about 12,000 at the National Speed Skating Oval, a venue built for the 2022 Winter Olympics.
“I came here out of curiosity,” remarked Hong Yun, a 58-year-old retired engineer seated in the front row. He mentioned that watching a robot compete was “far more thrilling than seeing real humans doing the same.”
The robot is set to compete in five soccer matches on the event’s first day in Beijing. Photo: Tingshu Wang/Reuters
The event showcases China’s proficiency in humanoid robotics, a sector prominently featured within the country’s artificial intelligence landscape. The promotional efforts are in full swing.
As well as kickboxing, humanoids competed in athletics, soccer, and dance. One robot stumbled during a 1500-meter event, losing its head mid-course. “Keeping [the head] on was our goal,” shared Wang Ziyi, a 19-year-old student from Beijing Union University who was part of the robotics team.
A troupe of humanoid dance robots took to the stage during the 2025 Spring Festival Gala, a televised celebration that captivated nearly 1.7 billion viewers online.
One robot got derailed midway through a 1500m event as its head detached. Photo: Kevin Frayer/Getty Images
These social media-friendly activities reflect more serious geopolitical dynamics, highlighting the intensifying technological rivalry between the US and China, which may reshape the AI landscape.
This technology has become a pivotal factor in relations between the two nations. Despite the US’s continued lead in frontier research, Beijing is heavily investing in practical applications like robotics, partly in response to Washington’s restrictions on exports of advanced chips to China.
Several cities, including Beijing and Shanghai, have created 100 billion yuan (around 10 billion pounds) funds for the robotics industry. In January, state-owned banks revealed plans to offer 1 trillion yuan in financial support for the AI sector over the next five years.
“If there’s a sector where [Beijing] has heavily invested, it’s this one,” noted Kyle Chan, a researcher at Princeton University.
The robot is seen being transported after a kickboxing match during the competition’s opening day. Photo: China News Service/Getty Images
There’s something inherently unsettling about witnessing a jerky, human-like robot with two arms and legs, being dragged out of the ring by a human operator.
In the realm of humanoids, the Chinese industry possesses many strengths. While US firms like Tesla and Boston Dynamics remain dominant overall, several Chinese companies—including Ubtech and Unitree Robotics, who provided the boxing robots for Friday’s match—are swiftly catching up.
Tesla relies on China for many of the components needed to build its humanoids. According to the investment bank Morgan Stanley, even robots made by non-Chinese manufacturers are projected to depend on the China-based supply chain for around a third of their parts. “It appears remarkably challenging to disentangle this area from China completely,” Sheng Zhong, head of China industrial research at the bank, noted in a recent report.
The robot, developed by the Chinese firm Unitree Robotics, is seen playing traditional drums. Photo: Tingshu Wang/Reuters
Beyond just generating positive attention on social media, China envisions humanoids as potential solutions to challenges posed by its aging population and shrinking workforce. A recent article from the People’s Daily, the mouthpiece of the Chinese Communist Party, suggested that robots could provide both practical and emotional support to the elderly. “The vision for robot-assisted elderly care is not far off,” it asserted. Humanoid robots could also replace factory workers as China seeks to retrain its workforce for more advanced technological roles.
However, there remains a significant gap between humanoids that can stumble through a sports match and those capable of managing everyday tasks. Ensuring safe interactions with vulnerable populations represents another considerable challenge. “The home is likely one of the last environments where humanoid robots will be welcomed for safety reasons,” Chan stated. “Overall, I maintain a somewhat skeptical view regarding the humanoid explosion.”
A technician works on humanoid robots in the vicinity of the game. Photo: Kevin Frayer/Getty Images
Two significant obstacles to deploying technology that is useful beyond PR stunts are the complexity of human environments and the dexterity required to navigate them.
While other forms of AI, like large language models, can be trained on vast amounts of digital data, there are far fewer datasets available to train algorithms to walk through crowded restaurants or climb stairs. China’s initiatives to integrate robots into everyday settings might help businesses gather more data, but that remains a major bottleneck, according to Chan.
Dr. Jonathan Aitken, a robotics lecturer at the University of Sheffield, echoed this sentiment. “The current AI state is not yet prepared for humanoids operating in uncontrolled environments,” he asserted.
Impressive displays such as a robot jumping or kicking showcase remarkable capabilities, but executing mundane tasks—like using a knife or folding laundry—demands far greater finesse. Human hands have approximately 27 “degrees of freedom,” independent ways in which they can move. By contrast, one of the most advanced models available, Tesla’s Optimus humanoid, has only 22.
Nevertheless, China has defied the odds before with rapid advancement. Just a decade ago, the country exported fewer than 375,000 cars annually. Today, it is the world’s largest car exporter, shipping nearly 6 million vehicles each year, and the European Union has raised tariffs on electric vehicles made in China to curb the trend.
In China, both the government and the populace are firmly behind the push for humanoids. Zhan Guangtao attended the Humanoid Games alongside her two daughters on Friday. “It’s essential to expose my kids to advanced robotics from around the world,” Zhan remarked. “Such exposure broadens their perspectives.”
The reasons audiobooks resonate are deeply human: the catch in a narrator’s throat, the genuine smile you can hear behind a spoken word.
Melbourne actor and audiobook narrator Annabelle Tudor refers to narration as a storyteller’s innate ability, a fundamental and priceless skill. “The voice easily reveals our true feelings,” she explains.
However, this art form might be facing challenges.
In May, Audible, the Amazon-owned audiobook service, revealed plans to let authors and publishers produce audiobooks narrated in English, Spanish, French, and Italian by more than 100 voices generated by artificial intelligence.
With a dwindling number of audiobook companies, emerging talents like Tudor are increasingly reliant on these opportunities, sparking concerns regarding job security, transparency, and overall quality.
Having narrated 48 books, Tudor is uncertain whether AI can replicate her work, yet fears that a dip in quality may alienate listeners.
“I once narrated a particularly explicit scene. The AI lacks understanding of how an orgasm sounds,” she remarks. “I’m curious to know how they plan to address such nuances, including more delicate scenes like childbirth.”
Audiobook giant Audible says it aims to use AI to enhance human narration rather than replace it. Photo: M4OS Photos/Alamy
The Audiobook Boom
A 2024 report from Nielseniq Bookdata indicates that over half of Australian audiobook consumers have increased their listening in the last five years. On an international scale, US audiobook sales have risen by 13% from 2023 to 2024. Meanwhile, the UK has seen audiobook revenues soar to £268 million, marking a 31% increase in 2023, as reported by the Publishers Association.
As demand for audio content surges, companies are seeking quicker and cheaper production methods. In January 2023, Apple unveiled a new catalog featuring AI-narrated audiobooks. Later that year, Amazon introduced a feature allowing self-published authors to convert their Kindle ebooks into audiobooks using “virtual audio” technology, resulting in tens of thousands of Audible titles now available in AI-generated formats.
Additionally, in February, Spotify announced support for AI audiobooks, making it easier for authors to reach wider audiences. Audible says its goal is not to supersede human narrators but to enable more authors and titles to reach larger audiences. In the US, Audible is also testing voice replicas, which let professional narrators create AI copies of their own voices to boost their capacity to produce high-quality audiobooks.
“In 2023 and 2024, Audible Studios has hired more [human narrators],” a spokesperson shared with the Guardian. “We continually engage with creators eager to have their work available in audio format and reach new audiences across languages.”
Yet, robot narrators remain a more economical choice than human talent, raising fears among industry professionals about potential job threats.
Volume vs. Quality?
Narrating the books of bestselling Australian crime author Chris Hammer helped launch Dorje Swallow’s career as a narrator. Swallow believes AI narration is a tool designed by people who fail to grasp the intricacies, techniques, and skills needed to produce a quality audiobook.
“Some assume we just press a button for a similar or sufficient quality result,” he notes.
Simon Kennedy, president of the Australian Association of Voice Actors, says narrators have long fought for fair pay in Australia. Recording an audiobook can take a narrator up to three times the length of the finished product, not counting the initial read-through to understand the narrative and characters.
“In my view, AI narrators prioritize volume over quality and aim to cut costs,” he asserts.
Kennedy founded the Australian Association of Voice Actors in 2024 in response to AI’s looming threat. In a submission to a parliamentary committee last year, the organization warned that 5,000 Australian voice acting jobs were at stake.
While not surprised by Audible’s recent announcement, he dismisses it as a “foolish decision.”
“Audiobook narrators hold a truly special and intimate connection with their listeners; pursuing an approach that lacks this connection is misguided,” he suggests.
Regarding voice cloning opportunities, he states that voice actors should be involved in the process, but warns that it may lead to a homogenized robotic voice that listeners quickly tire of.
“If a monotonous, emotionless narration suffices for ‘high quality,’ then perhaps,” he counters. “However, if you seek an evocative, captivating listening experience, don’t expect to find it there.”
Another pressing concern is the absence of AI regulation in Australia. The EU has its AI Act, and China and Spain also have measures in place, whereas Australia has no rules on the labeling of AI-produced content.
“No laws exist to prevent data scraping or voice cloning,” Kennedy explains. “There’s no labeling or transparency requirement for AI-generated material or its origins, nor any regulation governing the proper use of AI-generated deepfakes, audio clones, or text.”
Author Hannah Kent expresses concern that AI will “devalue creativity” in the arts. Photo: Carrie Jones/Guardian
Earlier this year, Hannah Kent, author of Burial Rites and Devotion, was astonished to discover that pirated copies of her work had been used to train Meta’s AI systems. Despite initial resistance to, and frustration with, AI’s infiltration of creative spaces, she is curious about Audible’s AI developments and the prospective trials for translating texts into other languages.
“It’s evident that the primary motive behind AI adoption is cost-efficiency, and the effect is to devalue artistic and creative work,” Kent reflects.
Both Tudor and Swallow doubt that large corporations can fully substitute for human narration, and many Australian authors are opposed.
Yet, it remains unclear whether audiences can discern the difference.
“We are rushing straight into a dystopia,” Tudor warns. “Will I be listening to humans or robots?”
Abroad, particularly in states like California, robots are more commonplace in daily life. Following initial self-driving vehicle tests in cities like San Francisco, humans now share sidewalks with robots.
Delivery robots have also been delivering meals in Europe for years. Countries like Sweden, Finland, and the UK allow customers to summon robots through food delivery apps.
However, autonomous robots are still a rarity in the Australian market.
Legal “Minefield”
One of the main obstacles hindering this technology in Australia is the uncertainty surrounding the legal status and safety of delivery robots.
When Australia trialed a robotic “mobile parcel locker” in Brisbane in 2017, its effectiveness was questioned as it required human accompaniment and could only transport one parcel at a time.
Unlike drone food delivery, which has gone ahead in trials, the legal status of delivery robots remains undetermined.
Christine Eldridge, a lawyer specializing in motor vehicle accidents, noted that robots fall under a patchwork of road and sidewalk regulations, creating grey areas across states and local councils.
She compared the limitations faced by delivery robots to those of e-scooters.
Because there are no specific legal guidelines for these emerging vehicles, they are generally not permitted in public spaces, except in certain council areas running trials.
An Uber Eats food delivery robot navigates past pedestrians during a media demonstration in Tokyo, Japan, on March 5, 2024. Photo: Shuji Kajiyama/AP
“For instance, current laws concerning liability and compensation do not adequately address robotics. The law is struggling to keep pace with technology,” she remarked.
“It’s quite the minefield,” said Eldridge.
Hussein Dia, a professor of future urban mobility at Swinburne University, concurred, saying the current legal landscape is ambiguous.
“There is no legislation stating they’re permitted, nor is there any stating they’re forbidden,” Dia said.
The federal government is working on a comprehensive legal framework for self-driving vehicles, including those transporting passengers, with regulations expected by 2026.
Dia said regulators should be prepared to “accept more risk” and announce rules that help Australia keep pace with overseas advances and “demystify” the technology for pedestrians and other road users.
“Evidence suggests that they are remarkably safe,” he said.
Creating spaces where self-driving cars and robots can co-exist with pedestrians and drivers requires extensive planning, including adjustments to streets, sidewalks, and terrain.
Delivery robots must also travel slowly, generally below 10 km/h, and carry a range of sensors to detect obstacles and potential hazards, stopping the device when needed.
Reducing Pollution, Traffic, and Labor Costs
Once challenges are addressed, delivery robots can offer substantial advantages.
“In city centers, vehicles that pollute and exacerbate traffic can be substituted, freeing up parking spaces.”
Professor Michael Bell from the Institute for Transport and Logistics at the University of Sydney believes that Australia trails behind in utilizing delivery robots compared to densely populated foreign cities with simpler terrain. He noted that agriculture and mining are currently leading the way for robotics in Australia.
The attraction of delivery robots lies in their potential to lower labor costs, to navigate lifts in high-rise buildings so residents don’t have to meet couriers at the entrance, and to improve efficiency in controlled environments like university campuses.
The Connected Autonomous Vehicle team at Monash University has created delivery robots tailored for defined areas such as campuses, industrial zones, shopping centers, and hospitals. Photo: Eugene Highland/Guardian
“Courier delivery is costly, so there is an economic incentive here. Any situation that reduces delivery time will be appealing,” Bell said.
Kate Lötel, an associate professor at the Peter Faber Business School at Australian Catholic University, anticipates that robots will lead to more affordable delivery solutions.
“In the end, we may witness a shift towards reduced or tiered service delivery based on whether items are transported by land, air, humans, or technology-assisted humans,” she stated.
“Initially, we may not see changes in costs but rather an increase in value for customers by addressing general inconveniences associated with deliveries,” she added.
The unclear legal status of delivery robots in Australia hasn’t stifled local innovation. Startups are focusing on implementing technology in private settings.
A group of student engineers from the Connected Autonomous Vehicle team at Monash University has designed delivery robots specifically for circumscribed areas, including campuses, industrial parks, shopping centers, and hospitals.
A robot named ARI operates at speeds of up to 6 km/h, stands around 1 meter tall, and is equipped with a set of camera-like sensors for navigation.
ARI communicates with these sensors as it moves between the restaurants, where staff load the orders, and the customers who receive them, without needing a stable internet connection.
This setup entails significant initial costs in deploying a network of sensors, but it ultimately leads to lower costs for individual robots, making them easier to scale.
The idea is that in high-density environments, a single network of cameras can serve many robots at once, making the system more economical as demand grows.
ARI also has a feature its creators are counting on to set it apart.
The heated and cooled compartments enable each ARI robot to deliver multiple orders while maintaining the appropriate temperatures. This ensures that pizza arrives hot, ice cream stays frozen, and medicine arrives safely.
ARI has begun distributing foods like burgers and burritos throughout Monash University’s Clayton campus, with plans to commercialize the technology underway.
Beyond labor savings, 24-year-old inventor John Bui said the temperature-controlled compartments give ARI an edge over competing robots and traditional delivery workers.
“Imagine receiving a hot coffee or warm burrito,” Bui expressed.
Ultimately, beyond legal and technical limitations, behavioral and psychological factors also pose significant barriers to the adoption of delivery robots.
“There is already tension between pedestrians and e-scooter riders; you can imagine someone walking home late at night confronting a delivery robot carrying a pizza,” Dia suggested.
“Of course, there are locks to protect the food, but I hope people treat these robots with respect.”
Humanoid robot waltzes with the help of AI trained on human motion capture recordings
Xuxin Cheng and Mazeyu Ji
AI that helps humanoid robots mirror human movements could allow robots to walk, dance, and fight in more human-like ways.
The most agile and fluid robot movements, such as Boston Dynamics’ impressive demonstration of robotic acrobatics, are typically narrow, pre-programmed sequences. Teaching robots a wide repertoire of persuasive human movements remains difficult.
To overcome this hurdle, Peng Xuanbin at the University of California, San Diego, and his colleagues have developed an artificial intelligence system called ExBody2, which allows a robot to imitate a wide range of human movements more realistically and execute them smoothly.
Peng and his team began by building a database of movements that a humanoid robot could perform, from simple ones such as standing and walking to more complex ones such as tricky dance moves. The database contained motion capture recordings of hundreds of human volunteers collected in previous research projects.
“Humanoid robots share a similar physical structure with us, so it makes sense to leverage the vast amount of human movement data that is already available,” Peng says. “By learning to imitate this kind of behavior, robots can quickly learn a variety of human-like behaviors. It means that anything humans can do, robots have the potential to learn.”
To teach a simulated humanoid robot how to move, Peng and his team used reinforcement learning, in which the AI is given examples of what makes a successful movement and then challenged to figure out how to achieve it through trial and error. They started by training ExBody2 with full access to all the data about the virtual robot, including the exact coordinates of each joint, so it could mimic human movements as closely as possible. A second policy then learned from these movements using only data accessible in the real world, such as inertia and velocity measurements from sensors on the actual robot’s body.
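This two-stage recipe is often described as teacher-student distillation. The Python sketch below is a minimal illustration of that idea under stated assumptions, not the ExBody2 code: a "teacher" network sees privileged simulator state (exact joint coordinates), while a "student" network that only sees sensor-style signals is trained to copy the teacher's actions. The network sizes, observation dimensions, and the random stand-in for the simulator are all invented for illustration.

```python
# Minimal teacher-student distillation sketch (illustrative, not the ExBody2 code).
import torch
import torch.nn as nn

PRIVILEGED_DIM = 60   # e.g. exact joint coordinates, known only in simulation (assumed)
SENSOR_DIM = 24       # e.g. inertia and velocity readings available on hardware (assumed)
ACTION_DIM = 19       # target joint commands (assumed)

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                         nn.Linear(128, 128), nn.ReLU(),
                         nn.Linear(128, out_dim))

teacher = mlp(PRIVILEGED_DIM, ACTION_DIM)   # assumed already trained with RL in simulation
student = mlp(SENSOR_DIM, ACTION_DIM)       # sees only real-world-available signals
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def simulate_batch(batch_size=256):
    """Stand-in for simulator rollouts: paired privileged and sensor
    observations of the same states (random placeholders here)."""
    privileged = torch.randn(batch_size, PRIVILEGED_DIM)
    sensors = torch.randn(batch_size, SENSOR_DIM)
    return privileged, sensors

for step in range(1000):
    privileged, sensors = simulate_batch()
    with torch.no_grad():
        target_actions = teacher(privileged)   # what the privileged policy would do
    loss = nn.functional.mse_loss(student(sensors), target_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if step % 200 == 0:
        print(f"step {step}: imitation loss {loss.item():.4f}")
```

The point of the split is that the student ends up usable on the physical robot, where the privileged simulator state is simply not observable.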
After ExBody2 was trained on the database, it was able to control two different commercially available humanoid robots. It was able to smoothly combine simple movements such as walking in a straight line and crouching, as well as perform tricky movements such as following a 40-second dance routine, throwing punches, and waltzing with humans.
“Humanoid robots work best when all limbs and joints work together,” Peng says. “Many tasks and movements require coordination between the arms, legs, and torso, and whole-body coordination greatly increases the range of a robot’s capabilities.”
A team of researchers at Cornell University has created a new class of magnetically controlled microscopic robots (microbots) that operate at the diffraction limit of visible light. These microbots, called diffractive robots, can interact with visible light waves while moving independently, allowing them to travel to specific locations, take images, and measure forces at the scale of the body’s smallest structures.
Diffractive robotics connects untethered robots with imaging techniques that rely on visible light diffraction (the bending of light waves as they pass through an aperture or around something).
Imaging techniques require an aperture with a size comparable to the wavelength of light.
For the optics to work, the robot must be at that scale, and for the robot to reach the target it is imaging, it must be able to move on its own.
The robot is controlled by a magnet that performs a pinching motion, allowing it to move inchworm-like across solid surfaces. The same motion can also be used to “swim” through a fluid.
The combination of maneuverability, flexibility, and sub-diffractive optical technology represents a major advance in the field of robotics.
“A walking robot that is small enough to interact with light and effectively shape it would place a microscope lens directly into the microworld,” said Paul McEuen, a professor at Cornell University.
“We can perform close-up imaging in a way that would never be possible with a regular microscope.”
“These robots are 2 to 5 microns in size. They're tiny. And by controlling the magnetic fields that drive their movement, we can make them do whatever we want them to do.”
“I'm really excited about the fusion of microrobotics and micro-optics,” said Dr. Francesco Monticone of Cornell University.
“The miniaturization of robotics has finally reached a stage where these actuated mechanical systems can interact with and actively shape light on the scale of just a few wavelengths, about a millionth of a meter.”
To drive a robot magnetically at this scale, the research team patterned it with hundreds of nanometer-scale magnets of two different shapes, long and thin or short and stubby, each containing the same volume of material.
Professor Itai Cohen of Cornell University says, “Long, thin objects require a larger magnetic field to switch from pointing in one direction to pointing in another, whereas short, stubby objects require a smaller magnetic field.”
“So if you apply a large magnetic field, you can align them all, but if you apply a smaller field, only the short and thick ones will flip.”
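As a rough illustration of this two-field addressing trick, the Python sketch below models two magnets with assumed switching thresholds: a strong field reorients every magnet, while a weaker field flips only the short, stubby ones. The threshold numbers are invented placeholders, not measured values from the Cornell work.

```python
# Illustrative sketch of two-threshold magnetic addressing (assumed numbers).
THIN_SWITCH_FIELD = 100.0    # field needed to flip a long, thin magnet (arbitrary units)
STUBBY_SWITCH_FIELD = 30.0   # field needed to flip a short, stubby magnet

class NanoMagnet:
    def __init__(self, switching_field, direction=+1):
        self.switching_field = switching_field
        self.direction = direction  # +1 or -1 along the easy axis

    def apply_field(self, strength, direction):
        # The magnet reorients only if the applied field exceeds its threshold.
        if abs(strength) >= self.switching_field:
            self.direction = direction

magnets = [NanoMagnet(THIN_SWITCH_FIELD), NanoMagnet(STUBBY_SWITCH_FIELD)]

# Step 1: a large field aligns every magnet the same way.
for m in magnets:
    m.apply_field(150.0, +1)
print([m.direction for m in magnets])   # [1, 1]

# Step 2: a smaller, reversed field flips only the short, stubby magnet,
# leaving the long, thin one untouched, which is what produces relative motion.
for m in magnets:
    m.apply_field(50.0, -1)
print([m.direction for m in magnets])   # [1, -1]
```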
To create the robot, the authors combined this principle with a very thin film.
“One of the main challenges on the optical engineering side was to find the best approach for the three tasks, light conditioning, focusing, and super-resolution imaging, on this particular platform, because different approaches carry different performance trade-offs,” said Dr. Monticone. “The microrobots can move and change shape.”
“There are advantages to being able to mechanically move the diffractive elements to enhance imaging,” Professor Cohen says.
The robot itself can be used as a diffraction grating, or a diffractive lens can be added. In this way, the robot can act as a local extension of the microscope lens looking down from above.
The robot measures force using the same magnet-driven pinching motions used to push structures while walking.
“These robots are very compliant springs, so if something pushes on them, it can squeeze them,” Professor Cohen said.
“That changes the diffraction pattern and allows us to measure it very well.”
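One simple way to read this, assuming the robot behaves like a linear spring, is Hooke's law: the force equals the spring constant times the compression inferred from the shifted diffraction pattern. The values below are invented placeholders, purely to show the arithmetic.

```python
# Illustrative Hooke's-law estimate (assumed values, not measured data).
SPRING_CONSTANT = 1e-3          # N/m, assumed compliance of the microrobot
measured_compression = 2e-7     # m, displacement inferred from the diffraction pattern

force = SPRING_CONSTANT * measured_compression
print(f"Estimated force: {force:.2e} N ({force * 1e12:.2f} pN)")  # 2.00e-10 N (200.00 pN)
```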
The force measurements and optical capabilities could be applied to basic research, such as exploring the structure of DNA, or eventually be introduced into clinical practice.
“Looking to the future, we can imagine swarms of diffractive microbots walking along the surface of samples to perform super-resolution microscopy and other sensing tasks,” Professor Monticone said.
“I think we have just scratched the surface of what is possible with this new paradigm of combining robotics and optics at the microscale.”
Robots that can take off like birds could eliminate the need for runways for small fixed-wing drones.
Birds use the powerful explosive force generated by their legs to jump into the sky and begin flight, but it has proven difficult to build robots that can withstand the strong accelerations and forces exerted during this process.
Now, Won Dong Shin and his colleagues at the Swiss Federal Institute of Technology Lausanne (EPFL) have developed a propeller-driven flying robot called RAVEN. The robot has bird-like legs and can walk, hop, and jump into the air to begin flight.
“Fixed-wing aircraft like airplanes always need a runway or a launcher, but that's not available everywhere. You really need designated infrastructure to get the plane off the ground,” Shin says. “But if you look at birds, they just walk around, jump, and take off. It's very easy for them. They don't need any outside help.”
A real bird's legs have joints at the hip, knee, and ankle, but RAVEN's legs have only two joints, the hip and knee, and are driven by motors. Each leg also has a spring that can store and release elastic energy. By using fewer parts, Shin and his team were able to keep RAVEN's weight to about 600 grams, the same as a crow.
In indoor tests, RAVEN was able to jump approximately 0.5 meters into the air at a speed of 2.4 meters per second, similar to birds of the same size; at that point the propellers take over. Because it can launch itself from almost anywhere, RAVEN could be useful for disaster relief missions where regular fixed-wing drones cannot take off or land, Shin said. But first, he says, the team needs to develop RAVEN's ability to land safely.
“We've seen a lot of work on flying robots that land on perches, but not many focused on taking off with their feet,” says Raphael Zufferey, also at EPFL, who was not involved in this work. “I think the two disciplines of landing, or perching, and take-off will be integrated into one platform that will allow robots to fly, detect branches, land, recover, and take off again to perform missions.”
A Cheerios-inspired robot emitting its alcohol fuel, visualized with fluorescent dye.
Jackson K. Wilt et al. 2024
The same phenomena that let beetles skim across a pond and make Cheerios clump together in a cereal bowl can be combined to make small floating robots.
One of these, the Marangoni effect, occurs when a fluid with low surface tension spreads rapidly across the surface of a fluid with higher surface tension. Stenus beetles have evolved to exploit it, skimming across ponds by secreting a substance called stenusin, and soap-powered toy boats run on the same trick.
To explore how engineers can use this, Jackson Wilt at Harvard University and his colleagues 3D printed round plastic pucks about 1 centimeter in diameter. Each had an air chamber for buoyancy and a small fuel tank containing 10 to 50 percent alcohol, which has a lower surface tension than water. As the alcohol gradually leaks out, it propels the puck across the surface of the water.
The researchers used alcohol, which evaporates, as fuel, unlike soap, which ends up contaminating the water and ruining the Marangoni effect. It turns out that the stronger the alcohol, the better the results. “Beer would be pretty bad,” Wilt says. “Vodka is probably the best thing you can use. Absinthe… that's a lot of propulsion.” At top speed the robots move at 6 centimeters per second, and in some experiments a puck kept going for as long as 500 seconds.
By printing pucks with multiple fuel outlets and gluing them together, the researchers can also create larger devices that carve wide curves or rotate in place. Using multiple pucks also lets them study the “Cheerios effect,” in which cereal and other similar floating objects cluster together. This occurs because each object forms a meniscus, or curved surface, in the fluid, and these surfaces are attracted to each other.
Wilt says the 3D-printed devices could be useful in education, helping students build intuition about surface tension, but they could also be designed more carefully to produce more complex and elegant behavior, with potential applications in environmental and industrial processes.
For example, if a substance needs to be dispersed through the environment and also happens to work as a fuel, a robot could spread it automatically. “Say you have a body of water that needs to release a chemical and you want it to be distributed more evenly, or say you have a chemical process that needs to deposit material over time,” Wilt says. “I feel like there's some really interesting behavior here.”
A pigeon-inspired robot has uncovered the mystery of bird flight without vertical tails found in human-designed aircraft. The prototype has the potential to lead to passenger planes that can reduce drag and fuel consumption.
The vertical stabilizer, or tail fin, on an aircraft enables side-to-side turns and prevents unintentional changes in direction. Some military aircraft, like the Northrop B-2 Spirit, are designed without tails to reduce radar visibility; instead, they rely on less efficient methods, such as flaps that create extra drag on one side.
Research by David Lentink at the University of Groningen in the Netherlands led to the development of the PigeonBot II to study how birds maintain control without vertical stabilizers.
PigeonBot II, a robot designed to imitate bird flight techniques
Eric Chan
The previous model, built in 2020, mimicked bird flight by flapping wings but had a traditional tail. The new design, featuring 52 real pigeon feathers, incorporates a bird-like tail, and successful test flights confirm its functionality.
Lentink explains that PigeonBot II’s success lies in its programmed, reflexive tail movements resembling those of birds. The intricate tail movements contribute to stability, proven by the robotic replica’s flight.
PigeonBot II's nine servo motors are commanded by an autopilot, with propellers on each wing used for steering and the tail adjusting in response to its commands. Lentink notes that the complexity of these reflex movements prevents direct human control of PigeonBot II.
After numerous unsuccessful tests, the control system was refined, enabling safe takeoff, cruising, and landing. Lentink envisions a future where vertical stabilizers are unnecessary, reducing weight and drag in aircraft designs.
This tuna-inspired robot borrows some nifty tricks from the real fish
Lin, Z. et al. (2024).
The tuna-shaped robot harnesses the secret to the speed and agility of real fish – the ability to selectively fold and extend their fins – which could improve underwater robot design.
Tuna are among the fastest-swimming fish in the ocean, thanks in part to their ability to retract and fold their fins to reduce drag. Researchers led by Z. Lin at Xiamen University in China investigated how such fins could improve the agility of robots.
The researchers built a 50-centimeter-long tuna-shaped robot that can be controlled by motors attached to its head, a dorsal fin on its back, and a fluke at the end of its tail. The researchers filmed the robot swimming in a pool and tested the effects of flattening or erecting the dorsal fin on the robot's acceleration, direction changes, and steady forward motion.
They found that folding and unfolding the dorsal fin had significant effects on factors such as speed, efficiency and linear acceleration. When the robot tuna was changing direction, keeping the dorsal fin erect increased its speed by about 33%. However, keeping the fin erect when the robot was moving steadily forward reduced the efficiency of its movement by up to 13%, increasing the robot's energy consumption.
Lin says these findings are consistent with how tuna in nature raise their dorsal fin to make fast, precise movements, such as when catching prey, then fold it back to continue swimming. “By designing similar flexible control systems, underwater vehicles can improve balance, navigation, and agility at high speed,” he says.
“Understanding this high level of swimming performance in tuna is intrinsically interesting because it is something that even the best human swimmers cannot achieve,” says Frank Fish at West Chester University in Pennsylvania.
But Fish adds that the tail fin may play a bigger role than the dorsal fin in a tuna's swimming ability, and his own research on these animals suggests this is especially true when it comes to turning. “We measured the turning ability of Pacific bluefin tuna and found that it far surpasses the capabilities of a robot,” he says. This may mean that tuna-inspired robots could also be improved by studying the animals' tails in more detail.
A robot can peel vegetables almost as easily as a human can, demonstrating a level of dexterity that could be useful for moving delicate objects on production lines.
Prototype robots are often tasked with peeling vegetables to test their ability to carefully handle tricky objects, but these tasks are typically simplified, such as immobilizing the vegetable or testing only a single fruit or vegetable, like peeling a banana.
Now, Pulkit Agrawal and his colleagues at the Massachusetts Institute of Technology have developed a robotic system that can rotate different types of fruits and vegetables using the fingers of one hand while peeling them with another arm.
“This extra step of rotating is something that's very easy for humans to do and they don't even think about it,” Agrawal says, “but it makes it difficult for a robot.”
First, the robot was trained in a simulated environment, where the algorithm rewarded it for correct turns and punished it for turning in the wrong direction or not turning at all.
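A minimal sketch of that kind of reward signal, not the MIT team's code, is shown below: the simulated hand is rewarded in proportion to rotation in the desired direction, penalized more heavily for turning the wrong way, and given a small penalty for barely moving. The threshold and scaling values are illustrative assumptions.

```python
# Illustrative reward function for "rotate the vegetable correctly" (assumed values).
def rotation_reward(angle_change_rad: float, desired_direction: int = +1,
                    min_rotation: float = 0.01) -> float:
    """angle_change_rad: signed rotation of the object this step, in radians.
    desired_direction: +1 for the peeling direction, -1 for the reverse."""
    progress = angle_change_rad * desired_direction
    if progress > min_rotation:
        return progress            # reward proportional to correct rotation
    if progress < -min_rotation:
        return 2.0 * progress      # stronger penalty for turning the wrong way
    return -0.05                   # small penalty for barely moving at all

print(rotation_reward(0.05))    # 0.05: turned the right way
print(rotation_reward(-0.05))   # -0.1: turned the wrong way
print(rotation_reward(0.0))     # -0.05: did not turn at all
```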
The robot was then tested in real-world conditions peeling fruits and vegetables, including pumpkins, radishes, and papayas, using feedback from touch sensors to rotate the vegetables with one hand while a human-operated robotic arm did the peeling.
The robot can grab and spin vegetables with one hand and peel them with the other.
Tao Chen, Eric Cousineau, Naveen Kuppuswamy, Pulkit Agrawal
Agrawal said the algorithm struggles with small, awkwardly shaped vegetables like ginger, but the team hopes to expand its capabilities.
Grasping and orienting an object is a difficult task for any robot, but the speed and firm grip of this one are impressive, says Jonathan Aitken at the University of Sheffield in the UK. He says the technology could be useful in factories where objects need to be moved from machine to machine in the correct orientation.
But Aitken said it was unlikely to be used industrially to peel vegetables because other methods already exist, such as automated potato peelers.
A Spot robot equipped with a burner for weed removal
Song, Deok-jin et al. (2024)
Robot dogs equipped with burners could be used to prevent weeds from growing on farms, offering a potential alternative to harmful herbicides.
Even highly targeted herbicides can cause environmental problems and affect local wildlife, and “superweeds” are rapidly evolving resistance to the most common herbicides like glyphosate.
Looking for alternative solutions, Deok-jin Song and his colleagues at Texas A&M University have developed a weed control system that uses short bursts of heat from a propane gas torch controlled by a robotic arm attached to a Boston Dynamics Spot robot.
Rather than incinerating weeds, the robot is designed to identify and heat the core of the plant, which can stop weed growth for weeks, Song said. “It doesn’t kill the weeds, it just inhibits their growth, giving the crop a chance to fight them.”
Song and his team first tested the flame nozzle to see if it could accurately target the center of the weeds, then deployed the robot in a cotton field that had also been planted with weeds, including sunflowers (Helianthus annuus), which are native to Texas, and giant ragweed (Ambrosia trifida). Five tests showed the robot could find weeds and focus an average of 95 percent of its flames on them.
Song said a major limitation of the Spot robot is its battery life — in this setup it can only operate for about 40 minutes before needing to be recharged — but the team is working on upgrading to a longer-lasting device. They’re also considering equipping the robot dog with an electric shock device that can deliver more than 10,000 volts of current, which would stop weeds from growing for longer.
“With other machines, people use a fairly broad, inaccurate flame to kill weeds. That’s been around for a while, but I’ve never seen anything as precise as this,” says Simon Pearson at the University of Lincoln in the UK. He says the robot’s success will depend on how precisely it can deliver the flames without damaging valuable crops.
Article updated on July 24, 2024
The article has been revised to more accurately describe battery life for burning tools and robots.
Among the many strange robot designs of recent years, a new contender has emerged: what its creators describe as the world's first robot powered by real human brain tissue, making it more human-like than ever.
Researchers from Tianjin University and Southern University of Science and Technology have managed to control the robot’s movements, such as tracking, grasping, and obstacle avoidance, using what they call “mini-brains.”
These miniature brains are not taken from human bodies but rather grown in labs for research purposes and then integrated into robots.
The researchers used these lab-grown organoids to create “brains on a chip”, which give the robot a degree of intelligence but still need supporting hardware and software to function fully.
By integrating the chips, scientists can read signals from the tissue, send signals to it from outside, and control specific robot functions such as grasping.
Professor Min Dong, Vice President of Tianjin University, explains that this brain-computer interface on a chip combines ex vivo cultured brains with electrode chips to interact with the outside world through encoding, decoding, and stimulation feedback.
With the brain chip, robots can perform tasks like tracking targets, avoiding obstacles, and learning to move their arms using electrical signals fed by the chips.
The robot does not look human, but its “brain” processes information through electrical signals from the chips. It can be trained in simulated environments, although understanding the real world remains a complex challenge.
The brain chip, known as MetaBOC, was developed as an open-source project and has been used in various experiments, including one where Neanderthal DNA was used to create mini-brains for robot control.
The latest research on robot-brain interaction focuses on utilizing ball-shaped organoids to create a more complex neural network for the brain-on-a-chip to function effectively.
Additionally, artificial intelligence algorithms have been integrated to enhance the robot’s capabilities through its mini-brain.
Although the advances are groundbreaking, there is still progress to be made: the brain currently shown inside the robot is a model, while the actual brain tissue is kept separately for testing purposes.
Humanoid robots that can drive cars may one day be used as chauffeurs, but their creators acknowledge that this could be at least 50 years away.
Most driverless cars work completely differently than a human driver, using artificial intelligence and custom mechanical systems to directly control the steering wheel and pedals. This approach is much more efficient and simpler than using a humanoid robot to drive, but it needs to be customized for each specific car.
Kento Kawarazuka and Professor Takeru Sato at the University of Tokyo, together with their team, have developed a humanoid robot called Musashi that can drive a car just like a human. Musashi has a human-like “skeleton” and “muscles”, and is equipped with cameras in both eyes and force sensors in its limbs. An artificial intelligence system determines the movements required to drive the car and responds to events such as changes in the color of traffic lights and people cutting in front of the vehicle.
Currently, the robot can only perform a limited range of driving tasks, such as driving straight ahead or turning right, at speeds of around 5 kilometers per hour on non-public roads. “The pedal speed and car speed are not high, and the car handling is also not as fast as a human,” Kawarazuka said.
Musashi is a humanoid robot that operates cars just like a human would.
Kento Kawarazuka et al. 2024
But Kawarazuka hopes that as the system is improved it could be used in any car, which could be useful when humanoid robots are routinely produced. “I’m not looking 10 or 20 years out, I’m looking 50 or 100 years out,” he says.
“This research could be of interest to people developing humanoid robots, but it doesn't tell us much about autonomous driving,” says Jack Stilgoe at University College London. “Self-driving cars cannot and should not drive like humans. Because the technology doesn't need to rely on limbs and eyes, it can rely on digital maps and dedicated infrastructure to find safer, more convenient ways to navigate the world.”
Created by scientists from the University of Nottingham and artists from the group Blast Theory, Cat Royale is a multispecies world centered on a custom-built enclosure in which three cats and a robotic arm coexist for six hours a day during a 12-day installation period.
Professor Steve Benford from the University of Nottingham and colleagues said: “Robots are finding a place in everyday life, from cleaning houses and mowing lawns to moving goods around hospitals and delivering parcels.”
“In doing so, they will inevitably have interactions and encounters with animals.”
“They could be companion animals, pets that share a home, guide dogs that help people navigate public places, but they could also be wild animals.”
“Often these encounters are unplanned and incidental to the robot's primary mission, for example cats that ride Roombas, guide dogs that are confused by delivery robots, or hedgehogs that encounter robot lawnmowers.”
“But it could also be intentional. We could also design robots to serve animals.”
“However, little is known about how to design robots for animals, even though such encounters, whether planned or not, are inevitable.”
“We present Cat Royale, a creative quest to design a domestic robot to enrich cats’ lives through play.”
Schneiders et al. suggest we need more than carefully designed robots to care for cats: in addition to human interaction, the environment in which the robot operates is also important. Image credit: Schneiders et al., doi: 10.1145/3613904.3642115.
Cat Royale was unveiled at the World Science Festival in Brisbane, Australia in 2023, has been touring ever since, and just won a Webby Award for its creative experience.
The installation centers on a robotic arm that provides activities intended to make the cats happier, including dragging a “mouse” toy along the floor, raising a feathered “bird” into the air and offering the cats treats.
The team then trained the AI to learn which games cats liked best so they could personalize their experience.
“At first glance, this project is about designing a robot that can play with cats and enrich the lives of families,” Professor Benford says.
“But beneath the surface, we are exploring the question of what it takes to entrust robots to care for our loved ones, and in some cases, ourselves.”
By working with Blast Theory to develop and study Cat Royale, researchers gained important insights into robot design and interaction with cats.
They had to design a robot that would pick up toys and deploy them in a way that excited the cats, all while learning which games each cat liked.
They also designed an entire world for the cats and robot to live in, providing safe spaces from which the cats could observe and sneak up on the robot, and decorating it so that the robot had the best chance of spotting an approaching cat.
This means that robot design involves not only engineering and AI, but also interior design.
If you want to bring a robot into your home to take care of your loved ones, you will likely need to redesign your home.
Dr Eike Schneiders, a researcher at the University of Nottingham, said: “As we learned through Cat Royale, creating a multispecies system in which cats, robots and humans are all taken into account requires much more than simply designing the robot.”
“We needed to ensure the animal’s health at all times, while also ensuring that the interactive installation would attract a global (human) audience.”
“Many factors were considered in this, including the design of the enclosure, the robot and its underlying systems, the different roles of the humans, and of course the selection of the cat.”
Eike Schneiders et al. Designing Multispecies Worlds for Robots, Cats, and Humans. CHI '24: Proceedings of the CHI Conference on Human Factors in Computing Systems, article #593; doi: 10.1145/3613904.3642115
A humanoid robot can predict one second in advance whether someone will smile and match the smile on its own face. The creators hope this technology will make interactions with robots more realistic.
Artificial intelligence is now able to imitate human language to an impressive degree, but interacting with physical robots often falls into the “uncanny valley.” One reason for this is that robots cannot reproduce the complex nonverbal cues and mannerisms that are essential to communication.
Now, Hod Lipson and his colleagues at Columbia University in New York have developed a robot called Emo that uses AI models and high-resolution cameras to predict and attempt to reproduce people's facial expressions. It anticipates whether someone is about to smile roughly 0.9 seconds in advance and smiles back accordingly. “I'm a jaded roboticist, but when I see this robot, I smile back,” Lipson says.
Emo consists of a face with cameras in its eyeballs and flexible plastic skin attached by magnets to 23 individual motors. The robot uses two neural networks: one looks at people's faces and predicts their expressions, while the other works out how to produce expressions on the robot's own face.
The first network was trained on YouTube videos of people making faces, while the second network was trained by watching the robot itself make faces on a live camera feed. “You learn what your face looks like when you pull all your muscles,” Lipson says. “It's like being in front of a mirror. Even if you close your eyes and smile, you know what your face looks like.”
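To make that two-network pipeline easier to picture, here is a minimal Python sketch of how such a setup could be wired together. It is an illustration only: the use of face landmarks as input, the layer sizes and the class names are assumptions made for the sketch, and the only detail taken from the article is the figure of 23 facial motors.

```python
# Illustrative sketch only: a two-network pipeline loosely modelled on the
# description of Emo. The landmark input, layer sizes and class names are
# assumptions; only the 23 facial motors come from the article.
import torch
import torch.nn as nn

N_LANDMARKS = 68 * 2   # assume a face is summarised as 68 (x, y) landmarks
N_MOTORS = 23          # Emo is reported to have 23 facial motors


class ExpressionPredictor(nn.Module):
    """Network 1: watches a person's face and predicts its near-future expression."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_LANDMARKS, 128), nn.ReLU(),
            nn.Linear(128, N_LANDMARKS),  # predicted landmarks a moment ahead
        )

    def forward(self, landmarks_now):
        return self.net(landmarks_now)


class SelfModel(nn.Module):
    """Network 2: maps a target expression to motor commands for the robot's own face."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_LANDMARKS, 128), nn.ReLU(),
            nn.Linear(128, N_MOTORS), nn.Tanh(),  # motor commands scaled to [-1, 1]
        )

    def forward(self, target_landmarks):
        return self.net(target_landmarks)


predictor, self_model = ExpressionPredictor(), SelfModel()
face_now = torch.randn(1, N_LANDMARKS)       # stand-in for a tracked face
predicted_face = predictor(face_now)         # what the person is about to do
motor_commands = self_model(predicted_face)  # how the robot mirrors it
print(motor_commands.shape)                  # torch.Size([1, 23])
```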
Lipson and his team hope Emo's technology will improve human-robot interaction, but first they need to expand the range of expressions the robot can perform. Lipson also wants to train it to produce expressions in response to what people say, rather than simply imitating them.
Will robots eat us, or will we eat them? Tech lovers and tech haters alike want to know which will happen first. An answer has now arrived, in a report from a team at the University of Electro-Communications in Tokyo and Osaka University in Japan.
Reader Bruce Gitelman alerted us to this passage from the paper's abstract: “We developed a pneumatically driven edible robot using gelatin and sugar. We investigated the robot's appearance and impressions when eating it.”
The researchers investigated the psychological reactions of the participants: “We evaluated two conditions: one in which the robot was moving and one in which it was stationary. Our results show that participants perceived a mobile robot differently from a stationary robot, and that the robot can elicit different cognitions upon consumption. We also observed differences in the perceived texture when biting the robot under the two conditions.”
This is yet another example of the foresight Stephen Sondheim showed when he wrote the musical Sweeney Todd: The Demon Barber of Fleet Street (previous Feedbacks have mentioned a case involving a duck and a monkey). Sondheim has Sweeney declare: “The history of the world is about who gets eaten and who gets to eat!”
Ketchup inside
Sliceable sauces of most kinds have yet to catch on; for now, technical hopes and resources are being poured into ketchup. Ketchup eaters and food technologists alike can satisfy their thirst for knowledge, at least in part, by reading the study “Texture and rheological properties of sliceable ketchup”, published in the journal Gels.
“There is a lack of knowledge about sliceable ketchup,” explain the authors, who are based at three Iranian institutions: Islamic Azad University, Allameh Tabatabai University, and Institute of Food Science and Technology.
For readers who are not familiar with the field of sliced sauces, they explain: “Ketchup to be used in conjunction with sausages must be viscous as a final product, elastic in terms of textural properties, solid, and, if cool, can be cut and sliced like sausages.” If this research is successful, ketchup could become more than just a sticky outer coating. The interior beckons.
The research objective was to “investigate the influence of gelling hydrocolloids on the physical, textural and rheological properties of ketchup and develop new formulations of sliceable ketchup and their combined use as fillers in meat products such as sausages”.
So there we have it: that rare thing, state-of-the-art ketchuppery.
Ketchup on glass
Feedback's pursuit of ketchup news leads us to the 7th European Conference on Precision Optical Component Manufacturing, held in Teisnach, Germany, in 2020, just as the coronavirus pandemic was grabbing everyone's attention. At that meeting, researchers explained the benefits of applying ketchup to glass.
Max Schneckenburger and colleagues at the Center for Optical Technology in Aalen, Germany, introduced attendees to what was, to some, a new concept: “High-precision glass polishing with ketchup”.
Their presentation explained the benefits of polishing with a “non-conventional” non-Newtonian fluid that “flows slowly under its own weight and acts like a solid under short-term stress as its viscosity increases.”
Ketchup, then, behaves as a non-Newtonian fluid in some situations, and the researchers admire that behavior: “Tomato ketchup changes its viscosity over time. The longer the ketchup is subjected to shear stress, the lower the viscosity will be. Therefore, in this article we propose a new process for polishing glass surfaces with ketchup containing micro-sized CeO2. Besides traditional ketchup, we also tested curry ketchup and organic products.”
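For readers who want a number to hang on that claim: strictly speaking the quote describes thixotropy (viscosity falling the longer shear is applied), but the closely related shear-thinning behavior is often approximated with a power-law model in which apparent viscosity drops as the shear rate rises. The tiny Python sketch below illustrates the idea; the constants are invented for the example and are not measured values for ketchup.

```python
# Rough illustration of shear-thinning with a power-law (Ostwald-de Waele) model:
# apparent viscosity = K * shear_rate ** (n - 1). With n < 1 the fluid thins as it
# is sheared. K and n below are invented, not measured values for ketchup.

def apparent_viscosity(shear_rate, K=15.0, n=0.3):
    """Apparent viscosity in Pa.s of a power-law fluid at a given shear rate (1/s)."""
    return K * shear_rate ** (n - 1)

for rate in (0.1, 1.0, 10.0, 100.0):
    print(f"shear rate {rate:6.1f} 1/s -> viscosity {apparent_viscosity(rate):7.2f} Pa.s")
```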
Schneckenburger's team used an industrial robot to guide the polishing head. To Feedback's knowledge, this was the first reported instance of a robot deliberately putting ketchup on glass.
Financial smirks
It's fair to wonder whether there are smirks inside the financial industry, hidden deep behind the sombre and serious exterior of the buildings, business suits and hairstyles. Many top financial analysts investigate these smirks in their daily work.
In Feedback's shaky understanding of the concept, this kind of smirk is a lopsided smile that shows up in plots when you have access to certain types of financial data.
But outside the industry, few people ever see these smirks.
That obscurity chimes with an observation the economist John Kenneth Galbraith made half a century ago about the demeanor financial executives choose to adopt. “No one wants a funny banker,” Galbraith said.
Marc Abrahams hosts the Ig Nobel prize ceremony and co-founded the magazine Annals of Improbable Research. He previously worked on unusual uses of computers. His website is improbable.com.
Have a story for Feedback?
You can send stories to Feedback by email at feedback@newscientist.com. Please include your home address. This week's and past Feedbacks can be found on our website.
Fascinated by the shelled mollusk, Saravana and colleagues decided to build a large, soft, single-legged version of the snail and use it as the basis for a robot that moves like a snail.
Saravana explained in his presentation that the team chose to make the leg out of a soft material that could be partially inflated with a small pneumatic pump. Although the chemistry of snail mucus has been studied in detail, the way a snail's foot moves has only been hypothesized from biologists' observations, he says. These studies propose that different parts of the snail's foot touch down, lift off and touch down again, and that these movements are not synchronized with each other. This creates a wave pattern travelling across the foot, causing the snail to glide forward on its mucus.
Researchers have successfully reproduced this “pedal wave” motion, which can also expel mucus, in an experimental robot, allowing it to move forward and change direction without falling over. Saravana said that in some experiments, the robot was able to climb steep slopes.
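One crude way to picture a pedal wave is as a row of foot segments whose lift follows phase-shifted sine waves, so that a wave of ground contact travels along the foot. The Python toy below sketches that idea; the segment count, frequency and phase step are arbitrary choices for illustration, not parameters of the team's robot.

```python
# Toy model of a "pedal wave": segments along the foot lift and touch down out of
# phase, so a wave of ground contact travels along the foot. All numbers are
# arbitrary illustrative choices, not parameters of the team's robot.
import math

N_SEGMENTS = 8     # discretise the foot into 8 sections
PHASE_STEP = 0.8   # phase offset between neighbouring segments (radians)
FREQUENCY = 2.0    # oscillations per second


def segment_lift(segment_index, t):
    """Vertical lift of one foot segment at time t; zero means it is on the ground."""
    phase = 2 * math.pi * FREQUENCY * t - PHASE_STEP * segment_index
    return max(0.0, math.sin(phase))  # clamp: a segment cannot push below the ground


for t in (0.0, 0.1, 0.2):
    profile = [round(segment_lift(i, t), 2) for i in range(N_SEGMENTS)]
    print(f"t = {t:.1f} s, lift along the foot: {profile}")
```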
Snail robot without shell
Saravana Prashanth Murali Babu/University of Southern Denmark
Although the bot is still in the experimental stage, Saravana hopes it will be the first robot ever to propel itself like a snail. To make it more self-contained, the team is experimenting with placing the pump inside a snail-like shell on top of the robot: a slightly enlarged plastic replica of a real snail's shell. The shell contains electronics to remotely control the robot and can also accommodate a syringe system that dispenses mucus under the robot's foot to mimic the slimy trail of a real snail.
But the team’s ultimate goal is to make the robot’s inflatable legs even softer, making it more like a real snail, whose body is mostly made of water. The researchers hope that a robot that successfully navigates on mucus could eventually inform the design of soft medical robots that can navigate inside the mucus-rich human body.
A machine learning model figured out how to keep the robot stable on three legs while opening a door with one leg.
Philip Arm, Mayank Mittal, Hendrik Kolvenbach, Marco Hutter/Robot Systems Laboratory
The robot dog can open doors, press buttons, and pick up backpacks with one leg while balancing on its other three legs.
Quadruped robots like Spot, the star of Boston Dynamics' viral videos, typically require arms attached to their bodies to open doors or lift objects, which adds significantly to their weight. This can make it difficult for the robots to maneuver through tight spaces.
Philip Arm and his colleagues at ETH Zurich in Switzerland used a machine learning model to teach an off-the-shelf robot dog to perform tasks using one of its legs while standing still or moving on the other three.
“We can't do everything with the legs that we can do with our arms. We're much more dexterous with our hands at the moment. The idea is to make this work in applications where there are mass constraints, or where you don't want the added complexity, such as space exploration, where every kilogram counts,” Arm says.
To train the dog, an ANYmal robot made by ANYbotics, Arm and his team gave the machine learning model the goal of reaching a specific point in space with one of the robot's legs. The model then took control of the remaining three legs and independently worked out how to keep the robot balanced while standing and walking.
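The team's exact training setup isn't given here, but a reward of the general shape described above, paying the policy for getting the chosen foot close to a target point while penalizing any loss of balance, might look something like the hypothetical Python sketch below. The terms and weights are guesses for illustration, not the ETH Zurich team's actual formulation.

```python
# Hypothetical sketch of a reward for "reach a point with one leg while the other
# three keep the robot balanced". Terms and weights are illustrative guesses, not
# the ETH Zurich team's actual formulation.
import numpy as np


def reward(foot_pos, target_pos, base_roll, base_pitch, fell_over):
    """Higher is better: track the target with the free foot and stay upright."""
    tracking = -np.linalg.norm(np.asarray(foot_pos) - np.asarray(target_pos))
    upright = -(abs(base_roll) + abs(base_pitch))  # penalise body tilt
    alive = -10.0 if fell_over else 0.0            # large penalty for falling over
    return 2.0 * tracking + 0.5 * upright + alive


# Example: the free foot is 10 cm from its target, the body tilts slightly, no fall.
print(reward(foot_pos=[0.4, 0.0, 0.3], target_pos=[0.5, 0.0, 0.3],
             base_roll=0.05, base_pitch=0.02, fell_over=False))
```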
Arm and his team can now remotely control the robot to perform actions such as picking up backpacks and putting them in boxes, or collecting rocks. Currently, the robot can only perform these tasks when controlled by a human, but Arm hopes future improvements will allow the dog to autonomously manipulate objects with its paws.
A wheeled robot released on a college campus has discovered how to roll around the real world and open all kinds of doors and drawers.
The robot adapted itself to new challenges, paving the way for machines that can independently interact with physical objects. “We want the robots to be able to operate autonomously, without having to rely on humans to keep giving them examples of all new kinds of scenarios during testing,” says Deepak Pathak at Carnegie Mellon University (CMU) in Pennsylvania.
Pathak and his colleagues initially trained the robot through imitation learning, providing visual examples of how to open objects such as doors, cabinets, drawers and refrigerators. They then let it loose around CMU's campus to open doors and cabinets it had never encountered before. This required the robot to adapt to each new object, using a form of artificial intelligence that rewards it for working out how to open things.
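As a cartoon of that two-stage recipe, imitate first and then adapt using a reward for success, here is a hypothetical Python sketch in which a single grasp parameter copied from demonstrations is nudged towards whatever actually opens an unfamiliar door. The environment stub, the parameter and the numbers are invented for the illustration and are not taken from the CMU system.

```python
# Hypothetical sketch of "imitate, then adapt on the fly": start from a grasp
# parameter copied from demonstrations, then keep whatever earns the reward of
# actually opening a new door. The stub environment and numbers are invented.
import random


def try_open(grasp_offset, true_best_offset=0.12):
    """Stub door: success is more likely the closer we are to the unknown best offset."""
    return random.random() < max(0.0, 1.0 - 8.0 * abs(grasp_offset - true_best_offset))


grasp_offset = 0.0   # metres; initial value taken from human demonstrations
step = 0.02          # how far the robot explores around its current guess
successes = 0
for trial in range(50):
    candidate = grasp_offset + random.uniform(-step, step)  # explore nearby actions
    if try_open(candidate):          # reward signal: did the door actually open?
        grasp_offset = candidate     # keep parameters that worked
        successes += 1
print(f"final grasp offset: {grasp_offset:.3f} m, successes: {successes}/50")
```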
The robot typically spent 30 minutes to an hour learning how to open each object consistently. Haoyu Xiong, a PhD student at CMU, took the robot to various testing locations around campus. The team used 12 objects for training practice, then eight more to test the robot's abilities.
Initial success rates averaged about 50%, but the robot sometimes completely failed to open new objects when first started. Eventually, that success rate increased to about 95%.
In addition to learning on the fly, the robot had to be able to physically handle heavy doors, says Russell Mendonca at CMU. Achieving both goals meant a system costing about $25,000, which he says is much cheaper than other robotic systems with adaptive learning capabilities.
Demonstrating the robot outside the lab “represents a concrete step towards more general robotic manipulation systems”, says Yunzhu Li at the University of Illinois at Urbana-Champaign. “Opening a door or a drawer is a seemingly simple task for humans, but it's actually surprisingly difficult for robots,” he says.
Robots that can grow around trees and rocks like vines could be used to construct buildings or measure pollution in hard-to-reach natural environments.
Vine-like robots are not new, but they are often designed to rely on just a single cue, such as heat or light, to guide their growth, which means they don't work well in certain environments.
Emanuela del Dottore at the Italian Institute of Technology and her colleagues have developed a new version, called FiloBot, that can use light, shadow or gravity as a guide. It grows by winding a plastic filament into a cylindrical shape, adding a new layer to its body just behind the head, which contains the sensors.
“Our robot has a built-in microcontroller that can process multiple stimuli and direct growth at a precise location, namely at the tip, ensuring that the structure of the body is preserved,” she says.
According to del Dottore, having such fine control over the direction of the tip means the robot can navigate unfamiliar terrain, wrapping around trees and using the shade cast by leaves as a guide.
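One simple way to imagine how a tip-growing robot might fuse several cues is as a weighted sum of direction vectors that steers each new increment of growth. The Python sketch below shows that idea; the cue names and weights are assumptions for illustration, not FiloBot's actual control law.

```python
# Hypothetical sketch of multi-stimulus steering for a tip-growing robot: combine
# unit vectors for each cue into a single growth direction. Cue names and weights
# are illustrative assumptions, not FiloBot's actual control law.
import numpy as np


def growth_direction(cues, weights):
    """Weighted sum of cue direction vectors, normalised to a unit heading."""
    combined = sum(weights[name] * np.asarray(vec, dtype=float)
                   for name, vec in cues.items())
    norm = np.linalg.norm(combined)
    return combined / norm if norm > 0 else combined


cues = {
    "towards_light": [0.7, 0.0, 0.7],    # grow towards a light source
    "against_gravity": [0.0, 0.0, 1.0],  # grow upwards, against gravity
}
weights = {"towards_light": 0.6, "against_gravity": 0.4}
print(growth_direction(cues, weights))  # unit vector steering the next growth step
```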
FiloBot grows at approximately 7 millimeters per minute. Although slower than many traditional robots, this gentler progress could mean less disruption to sensitive natural environments, she says.
The researchers don't know exactly what the robot will be used for at this point, but they hope it can be deployed to collect data in areas that are difficult for humans to reach, such as the tops of trees.
Robot dodecahedron mounted on a submersible (circled area)
Brennan Phillips
The robotic dodecahedron can capture fragile deep-sea animals, collect tissue samples, and build three-dimensional scans of the creatures, potentially speeding up the cataloging of deep-sea life. Up to 66% of marine species are still unknown to science.
Brennan Phillips and colleagues at the University of Rhode Island have developed the RAD2 sampler, which is designed to be mounted on any submersible and to collect fresh tissue samples in situ from living animals. They hope this will reveal more about the creatures than existing techniques, in which animals are typically stressed by being hauled up from the depths.
RAD2 is a dodecahedron with an internal volume large enough to hold a basketball. It is designed to fold and unfold on command, temporarily capturing organisms for detailed examination and taking small tissue samples that are stored on board the submersible for later genetic analysis.
The ultimate goal is to take a small biopsy and release the animal relatively unscathed, but RAD2's current technique (called tissue cutting) is “a little more crude,” Phillips said.
RAD2 has already been tested on two expeditions, collecting up to 14 tissue samples a day at a depth of around 1200 meters. “We could get small pieces of tissue, and sometimes we could get whole animals,” he says. “It depended on how big it was. So I can't say we've been able to release the animal unharmed after that, but we're moving towards that.”
The robot sampler is also equipped with a 4K-resolution video camera to capture high-quality footage of an animal in motion, while various 3D scanning devices build a virtual model of it. In the future, Phillips says, he may be able to put sensors on each of the dodecahedron's 12 faces and take several different measurements of an organism at once.
Phillips called other sampling methods “outdated”, saying they essentially require people to manually put things into jars for later analysis, or to use submersibles to do the same thing.
Preserving samples at the point of collection with RAD2 improves their quality and could also allow researchers to detect which genes are being expressed, shedding further light on an animal's behavior and physiology, Phillips said. “This is a luxury item,” he says. “This is the best you can get with this animal, better than anything we've ever had.”
Eva Stewart, a researcher at the University of Southampton in the UK, says that while digital data on deep-sea life can be a useful research tool, there is no substitute for capturing and preserving whole specimens.
“There are thousands of type specimens here [at the university],” says Stewart. Some of them were collected by the Swedish scientist Carl Linnaeus, who died in 1778. “Once you have the specimen, you are done. Even as our science changes, we can keep coming back to it.”
But Stewart agrees that underwater scans could be useful for gelatinous and other delicate animals that are difficult to collect intact, and could help reveal how creatures behave in their natural environment rather than after being hoisted onto the deck of a boat.
“We've been conducting research to examine gene expression in sea cucumbers, because we want to understand how they behave when they're stressed or affected by things like climate change,” says Stewart. “But collecting them and bringing them to the surface is itself stressful. Being able to harvest tissue from them in a more natural way means you know what their natural baseline is, so we may be able to see more clearly what happens when they are placed in different environments.”
See robot dogs perform alongside models at Paris Fashion Week
François Durand/Getty Images
While the majority of robots have remained in labs, there were signs in 2023 that they are becoming more commonplace. These images show some of the most attention-grabbing machines of the past year, illustrating the growing presence of the technology in our daily lives.
Spot, the robotic dog, makes its appearance on the runway. Originally unveiled in 2016, Boston Dynamics’ Spot has become more prevalent in real-world settings since its commercial release in 2019. The New York City Police Department has even acquired two Spot robots to use in various scenarios. Additionally, Spot was witnessed removing jackets from models during a Paris Fashion Week show.
Joining actors and writers at Paramount Studios in Los Angeles, the robot dog Gato partook in a demonstration against artificial intelligence. The SAG-AFTRA and Writers Guild of America strike highlighted concerns about the potential threat of advanced AI, ultimately leading to an agreement between the union and the Alliance of Motion Picture and Television Producers.
Adam, the robotic barista and bartender, was showcased at the Consumer Electronics Show in Las Vegas, exhibiting the growing automation in the food and beverage industry. While the prospect of automated food and beverage service looms, the closure of a San Francisco-based automated pizza truck company indicates that this shift is not inevitable.
At the World Robotics Conference in Beijing, humanoid robots displayed their emotional range, showcasing the advancements in technology that are narrowing the gap between humans and robots. Despite the existence of the “uncanny valley,” in which minor differences between humans and human-like robots can evoke uneasiness, there has been progress in refining details such as skin, facial expressions, and eyes.
The humanoid robot Amy, created as a visual artwork by Dutch artist Dries Verhoeven, represents the growing number of humanoid robots aimed at helping people cope with challenging realities. Pharmacists may not have much to fear at the moment, but the emergence of robots like Amy suggests such machines will increasingly be used to support those facing difficult circumstances.