Can Your Brain Communicate with Others While You Sleep? This Experiment Says Yes.

Modern machine learning technology has demonstrated the ability to visualize our dreams. But what if you wish to take it a step further and share your dreams?

At present, we are capable of interpreting brain signals to obtain a vague understanding of imaginary scenes and overarching concepts, yet there is no method for transferring these ideas from one brain to another. Perhaps this is for the best. Many might feel uneasy at the thought of a computer implanting ideas into our minds while we sleep.

Our current means of communication rely on our sensory capabilities. Words that are whispered into your ears during sleep could serve as a method to convey information between two sleeping individuals. However, how can people communicate while asleep? The answer is more complicated than it seems.

Individuals who talk in their sleep (known as somniloquists) often do so as a result of stress, and their peculiar utterances are not under their conscious control. Moreover, our capacity to hear while asleep is limited; sounds during sleep can disrupt it, causing the sleeper stress and seeping into their dreams.

Yet there is a particular kind of dream that may be beneficial: the lucid dream. This unique type of dream allows the dreamer to recognize that they are dreaming while still asleep. With some practice and various techniques, it can even be induced deliberately.

In this state, could two dreamers actually communicate?

The company REMspace claims not only that this is possible but also that they have achieved it.

They employed external stimuli to help one sleeper transition into a lucid dreaming state. A message was then transmitted to that sleeper through earphones, and a computer recorded the lucid dreamer repeating the words in their sleep.

Eight minutes later, the message was played back to the second lucid dreamer, who confirmed hearing the words upon waking. While this may not serve a practical purpose in our current state, it did represent a form of communication within a dream.

There is, however, another type of shared thought that might prove more useful.

Research has demonstrated that individuals who work closely together begin to synchronize their brain waves. This phenomenon can occur when musicians play in tight synchrony, or in social groups where a strong connection is felt.

Inter-brain synchronization is observable through “hyperscanning,” in which electroencephalography (EEG) scanners track the brain waves of several people at once. The synchronized waves can be theta waves (produced when we are deeply relaxed), alpha waves (when we are calm), or beta waves (when we are focused and active).
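As an illustration, the frequency bands mentioned above can be estimated from a raw EEG trace with a standard power-spectrum calculation. The sketch below is a minimal, self-contained example using a synthetic signal, not data from any hyperscanning study:

```python
import numpy as np
from scipy.signal import welch

# Synthetic 10-second EEG trace sampled at 256 Hz: a 10 Hz (alpha-band)
# oscillation plus noise, standing in for a real recording.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Estimate the power spectral density, then sum the power inside each band.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
power = {name: psd[(freqs >= lo) & (freqs < hi)].sum()
         for name, (lo, hi) in bands.items()}

# The injected 10 Hz rhythm should dominate the alpha band.
dominant = max(power, key=power.get)
print(dominant)  # alpha
```

A hyperscanning analysis would compute band power (or phase) like this for two people simultaneously and then measure how strongly the two time courses co-vary.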

When these brain waves, particularly beta waves, synchronize among two or more individuals, they often collaborate more effectively, show enhanced empathy, and even display a reduced sensitivity to pain. Teams with synchronized neural activity typically perform better overall.

The best part is that no artificial intelligence or brain scanners are required!

To cultivate neural synchronization among those who wish to share experiences: engage in activities like listening to music together, dancing, collaborating, solving problems, or simply conversing. This sort of spiritual connection is available to us for free and brings substantial benefits.


This article responds to the question posed by Idris Wise via email: “Can you communicate in a dream?”

Feel free to email us your questions at questions@sciencefocus.com or send a message through Facebook, Twitter or Instagram (don’t forget to include your name and location).

Check out our ultimate fun facts page for more incredible science.


Source: www.sciencefocus.com

Chimpanzees Utilize Various Linguistic Attributes to Communicate About One Another

Recent research indicates that wild chimpanzees have established a more nuanced communication system than previously thought, employing various mechanisms that merge vocalizations to convey new meanings.

These aspects of chimpanzee communication, detailed in a study published Friday in the journal Science Advances, resemble some basic elements of human language.

Researchers examined recordings from three groups of chimpanzees living in Ivory Coast, revealing that the animals can combine vocalizations much as humans use idioms and rearrange words to form new phrases.

This study marks the first documentation of such complexity in non-human communication systems, suggesting that chimpanzees’ capabilities reflect an evolutionary turning point between basic animal communication and human language.

“The ability to combine sounds to create new meanings is a hallmark of human language,” stated Catherine Crockford, a researcher at the Max Planck Institute for Evolutionary Anthropology and co-director of the Taï Chimpanzee Project. “It is crucial to explore whether similar capabilities exist in our closest living relatives, chimpanzees and bonobos.”

Another study published last month provided similar evidence indicating that bonobos can also combine calls to form phrases. Together, these studies imply that both species are evolving fundamental components of human language.

Bonobos and chimpanzees are the species most closely linked to humans in evolutionary history, suggesting all three may have derived from a common ancestor with this capability.

“Our findings indicate a highly generative vocal communication system that is unmatched in the animal kingdom. This aligns with recent discoveries about bonobos and implies that complex combinatorial abilities may have already existed in a common human ancestor.”

Researchers identified these new complexities in chimpanzee vocal systems by tracking specific animals in the field from dawn to dusk for approximately 12 hours daily, capturing the sounds they produced and their interactions with others in the group. They documented over 4,300 vocalizations from 53 wild chimpanzees.

While observing the vocalizations, researchers noted the activities, social interactions, and environmental changes occurring simultaneously, indicating whether the chimpanzees were eating, playing, or encountering predators.

The team performed statistical analyses on particular two-call combinations, such as “bark followed by bark,” recorded across various animals.

Their findings revealed that chimpanzees combine sounds to reference everyday experiences, with combinations that can express a range of meanings.
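The two-call analysis described above can be sketched as a bigram count compared against the frequency expected if successive calls occurred independently. The call names and sequence below are entirely hypothetical; the real analysis drew on thousands of recordings and formal statistics:

```python
from collections import Counter

# Hypothetical call sequence from one recording session; the call names and
# counts are made up for illustration. The real analysis drew on more than
# 4,300 vocalizations from 53 chimpanzees.
calls = ["bark", "bark", "bark", "hoo", "grunt",
         "bark", "bark", "scream", "hoo", "grunt"]

# Observed counts of each adjacent two-call combination ("bigram").
bigrams = Counter(zip(calls, calls[1:]))

# Expected count if successive calls were independent:
# P(a) * P(b) * number of adjacent pairs.
unigrams = Counter(calls)
n, n_pairs = len(calls), len(calls) - 1

def expected(a, b):
    return (unigrams[a] / n) * (unigrams[b] / n) * n_pairs

obs = bigrams[("bark", "bark")]
exp = expected("bark", "bark")
print(obs, round(exp, 2))  # 3 2.25: the pair occurs more often than chance
```

A combination that appears significantly more often than its independence baseline is a candidate for carrying a meaning of its own, which is the intuition behind the statistical tests in the study.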

Simon Townsend, a professor at the University of Zurich who studies primate cognition and contributed to the bonobo study, was not involved in this particular research.

He suggested that the common evolutionary ancestors of bonobos, humans, and chimpanzees likely possessed this ability.

“This suggests that our linguistic capabilities were already developing about 6-7 million years ago,” Townsend stated, referring to the time when these species likely diverged in the evolutionary tree.

Not all primates showcase such intricate communication. Townsend noted that forest monkeys, with simpler social structures, primarily utilize vocalizations to address predatory threats.

However, he believes that increasingly larger and more intricate social groups—a common trait among great apes and humans—have catalyzed the evolution of more sophisticated communication and ultimately, language.

For bonobos and chimpanzees, “Their biggest challenge is managing their intricate social environment. They exist in larger groups… There are conflicts, reconciliations, territorial disputes, and intergroup interactions. Vocalization is likely one evolutionary response to navigating these complex social dynamics.”

In human language, syntax refers to a set of rules that create a system capable of expressing infinite meanings.

“Syntax pertains to conveying increasingly precise and sophisticated information, which probably becomes necessary as social interactions grow more complex,” Townsend stated.

Source: www.nbcnews.com

Research: Bottlenose dolphins communicate through “smiles” during playful interactions

Play is a widespread behavior among even distantly related species, and its social form relies on complex communication. Yet playful communication has been largely ignored in marine mammals. In a new study, scientists from the University of Pisa focused on playful visual communication in the bottlenose dolphin (Tursiops truncatus).

“We revealed that bottlenose dolphins have a distinctive open-mouth facial expression, and showed that dolphins can also mirror others’ facial expressions,” said Dr. Elisabetta Palagi, an evolutionary biologist at the University of Pisa.

“Open-mouth cues and rapid mimicry appear again and again across the mammalian family tree, which suggests that in many species, not just dolphins, visual communication has played a role in shaping complex social interactions.”

Dolphin play includes acrobatics, surfing, playing with objects, chasing and fighting, but it is important that these activities are not mistaken for aggression.

Other mammals use facial expressions to convey playfulness, but it has not been investigated whether marine mammals also use facial expressions to signal play.

“The mouth-opening gesture probably evolved from the chewing motion, breaking down the chewing sequence to leave only the ‘intention to bite’ without contact,” Palagi said.

“The relaxed open mouth seen in social carnivores, the play faces of monkeys, and even human laughter is a universal sign of playfulness that signals fun and helps animals, and us, avoid conflict.”

Maglieri et al. investigated the presence and possible functions of open-mouth displays in solitary play, interspecific (human-dolphin) play, and intraspecific free play. Image credit: Maglieri et al., doi: 10.1016/j.isci.2024.110966.

To investigate whether dolphins visually communicate playfulness, Dr. Palagi and colleagues recorded captive bottlenose dolphins while they played in pairs and while they played freely with a human trainer.

They showed that dolphins frequently use the open-mouth expression when playing with other dolphins, but do not seem to use it when playing with humans or alone.

Although only one open-mouth event was recorded during solitary play, the researchers recorded a total of 1,288 open-mouth events during social play sessions, and 92% of these occurred during dolphin-dolphin play.

Dolphins were also more likely to make open-mouth expressions when their faces were within their playmates’ field of view: 89% of recorded open-mouth expressions were produced in this situation. When this “smile” was seen, the playmate smiled back 33% of the time.

“Given that dolphins frequently engage in the same activities and situations, some might argue that they are simply copying each other’s open-mouth expressions by chance. But that does not explain why the probability of imitating another dolphin’s open-mouth expression within one second is 13 times higher when the recipient actually saw the original expression,” said Dr. Palagi.

“This rate of mimicry in dolphins is consistent with what has been observed in certain carnivores, such as meerkats and sun bears.”

The study was published in the journal iScience.

_____

Veronica Maglieri et al. Smiling underwater: exploring playful signals and rapid mimicry of bottlenose dolphins. iScience, published online October 2, 2024; doi: 10.1016/j.isci.2024.110966

Source: www.sci.news

Marmosets use names to communicate among themselves

Scientists at the Hebrew University of Jerusalem recorded the natural “phee call” conversations between pairs of marmosets. They found that the marmosets use these calls to vocally address each other. Moreover, these non-human primates respond more consistently and accurately to calls directed at them.



Humans, dolphins, elephants, and marmosets are the only species known to vocalize names for other members of their own species. Image credit: Oren et al., doi: 10.1126/science.adp3757.

In the study, Guy Oren, a graduate student at the Hebrew University of Jerusalem, and his colleagues recorded natural conversations between pairs of marmosets and interactions between the monkeys and a computer system.

The researchers discovered that these monkeys use the “phee” call to address specific individuals.

Even more interesting, the marmosets were able to discern calls directed at them and responded more accurately when called.

“This discovery highlights the complexity of social communication between marmosets,” Omer said.

“These calls are not simply used to locate themselves, as previously thought. Marmosets use these specific calls to label and call to specific individuals.”

The authors also found that family members within marmoset groups use similar phonetic labels when calling different individuals and use similar phonetic features when encoding different names, which is similar to human use of names and dialects.

This learning appears to occur even among unrelated adult marmosets, suggesting that they pick up both the phonetic labels and the dialect from other members of their group.

Scientists think that the acoustic signatures may have evolved to help marmosets stay connected in dense forest habitats where visibility is often limited.

These calls allow primates to maintain social bonds and keep their groups cohesive.

“Marmosets live in small, monogamous family groups and care for their young together, just like humans do,” Omer said.

“These similarities suggest that they faced similar evolutionary social challenges as their early ancestors before acquiring language, which may have led to the development of similar ways of communicating.”

This study provides new insights into how social communication and human language have evolved.

“Our findings shed light on the complexity of social vocalizations in non-human primates and suggest that marmoset vocalizations may serve as a model for understanding aspects of human language and provide new insights into the evolution of social communication,” the researchers said.

The findings were published in the journal Science.

_____

Guy Oren et al. 2024. Vocal labeling of others by non-human primates. Science 385 (6712): 996-1003; doi: 10.1126/science.adp3757

Source: www.sci.news

Chimpanzees communicate with each other at a speed comparable to human conversation

Chimpanzees in Budongo Forest, Uganda

Catherine Hobaiter

When chimpanzees socialize, they exchange gestures at a rate similar to how humans converse.

The researchers studied 8,559 gestures made by 252 chimpanzees across five wild communities of chimpanzees (Pan troglodytes) in East Africa, one of the largest studies of its kind. They recorded face-to-face interactions between the apes, noting the timing of one chimpanzee's gestures relative to those of the other.

An analysis of the ape “conversations” found that chimpanzees' signalling intervals are remarkably similar to those of human interactions, and even a little faster. “On average, there are 120 milliseconds between the end of one gesture and the start of the next. In humans, the average is about 200 milliseconds, so this is very close,” said Gal Badihi, a researcher at the University of St Andrews in the UK.

All chimpanzee groups responded quickly, but the exact timing varied from group to group: for example, chimpanzees from Sonso, Uganda, took a few milliseconds longer to return the gesture than the other chimpanzee groups studied.

Such differences in timing exist in human languages too. For example, Japanese speakers generally have faster turn changes in conversation than Danish speakers. “We don't know exactly why,” says Badihi. “As with humans, we don't know if it's a cultural difference, something learned over time, or a reaction to the environment.”

Chimpanzees interacting in the Budongo Forest in Uganda

Adrian Soldati

Only 14 percent of the interactions the researchers observed between chimpanzees involved an exchange of gestures. Most consisted of a single gesture, such as “go away” or “follow me,” after which the other chimpanzee ran away or followed. But exchanges were more frequent when the chimpanzees were negotiating over food or grooming.

“What's really exciting about this study is that it shows that communication is a cooperative, socially engaged process in non-human animals,” Badihi says, “and that the processes involved in human language may have actually evolved much earlier than we thought.”


Source: www.newscientist.com

Research: African elephants use individualized calls similar to nicknames to communicate with each other

A team of scientists from Colorado State University, Save the Elephants, and ElephantVoices used machine learning to show that calls of the African savanna elephant (Loxodonta africana) include name-like elements identifying the intended recipient. When the authors played back the recorded calls, the elephants responded positively, either by calling back or by approaching the speaker.

Two young elephants greet each other in the Samburu National Reserve in Kenya. Image by George Wittemyer.

“Dolphins and parrots call each other by name, imitating each other's distinctive sounds,” says Dr. Michael Pardo, a postdoctoral researcher at Colorado State University and Save the Elephants.

“In contrast, our data suggest that elephants do not imitate the sounds of their mates when calling, but rather use a method that resembles the way humans communicate names.”

“The ability to learn to produce new sounds is unusual among animals, but it is necessary for identifying individuals by name.”

“Arbitrary communication, expressing ideas through sounds but not imitating them, greatly expands communication abilities and is considered a next-level cognitive skill.”

“If we could only make sounds that resembled what we say, our ability to communicate would be severely limited,” added George Wittemyer, a professor at Colorado State University and chairman of Save the Elephants' science committee.

“The use of arbitrary phonetic labels suggests that elephants may be capable of abstract thought.”

For their study, the researchers used machine learning techniques to analyze 469 recordings of rumbles made by wild female African elephants and their calves in the Samburu and Buffalo Springs National Reserves and Amboseli National Park, Kenya, between 1986 and 2022.

The machine learning model correctly identified the recipient in 27.5% of these calls, which the researchers noted was a considerably higher rate than the model achieved when fed control audio.
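Comparing a model's accuracy against chance, as described above, is commonly done with a permutation (label-shuffling) test. The sketch below uses synthetic labels and predictions, not the elephant data, purely to illustrate the logic:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup, not the elephant data: 469 calls, each addressed to one
# of 20 possible recipients. `predicted` mimics a model that is right more
# often than chance would allow.
n_calls, n_recipients = 469, 20
true = rng.integers(0, n_recipients, n_calls)
predicted = np.where(rng.random(n_calls) < 0.25,
                     true, rng.integers(0, n_recipients, n_calls))

observed_acc = (predicted == true).mean()

# Chance baseline: shuffle the true labels many times and rescore the same
# predictions. Shuffling destroys any structure linking calls to recipients.
null_accs = np.array([(predicted == rng.permutation(true)).mean()
                      for _ in range(1000)])

# Empirical p-value: how often shuffled labels do at least as well.
p_value = (null_accs >= observed_acc).mean()
print(observed_acc > null_accs.mean(), p_value < 0.05)
```

If the observed accuracy sits far above the shuffled-label distribution, as in this sketch, the model has found real signal rather than a quirk of the class frequencies.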

The researchers also compared the responses of 17 wild elephants to recordings of calls that were originally directed at them or at other elephants.

The researchers observed that the elephants approached the speaker playing the recordings more quickly and were more likely to respond vocally when they were called to, compared to when other elephants were called to.

This suggests that elephants recognise individual calls addressed to them.

“The discovery that elephants are not simply mimicking the calls of calling individuals is most intriguing,” said Dr. Kurt Fristrup, a researcher at Colorado State University.

“The ability to use arbitrary acoustic labels for other individuals suggests that other kinds of labels or descriptors may exist for elephant calls.”

The new insights revealed by this study into elephant cognition and communication reinforce the need to protect elephants.

Elephants are classified as endangered; they are threatened by poaching for their ivory and by habitat loss due to development.

Due to their large size, they require a lot of space and can cause damage to property and pose a danger to people.

“Communicating with pachyderms is still a distant dream, but being able to communicate with them could be a game changer for their conservation,” Prof Wittemyer said.

“Living with elephants is difficult when you are trying to share the land but the elephants eat the crops.

“I want to warn them: 'Don't come here. If you come here, you will be killed.'”

The paper describing the findings was published in the journal Nature Ecology & Evolution.

_____

M.A. Pardo et al. African elephants call out to each other with individually specific names. Nat Ecol Evol, published online June 10, 2024; doi: 10.1038/s41559-024-02420-w

Source: www.sci.news

The Hidden Method Your Dog Uses to Communicate With You

Our dogs have been our companions for thousands of years. Every wag of a tail, flick of an ear, and furrowed brow speaks volumes if you know how to interpret them.

Despite thinking we know our dogs well, research suggests that dogs are actually better than humans at reading body language. To help us understand dog communication better, we sought advice from experts in animal behavior, such as Dr. Zazie Todd. Learn more about the hidden meaning behind your dog’s behavior, from their nose to their tail, ears to paws.


Understanding Dog Facial Expressions

Humans often rely on facial expressions to understand each other, but can we do the same with dogs? Some dogs have very expressive faces, which can help us interpret their emotions. A relaxed jaw and slightly open mouth can be the equivalent of a “smile” in dogs, while a grinning dog may be signaling aggression. It’s important not to anthropomorphize too much, as dogs may have different expressions than humans.

In some cases, what may seem like a “guilty look” from a dog may actually be their fear of getting scolded. Research has shown that dogs may not fully understand their actions but are responding to the owner’s potential reaction. Eye contact and ear positioning can also reveal a lot about a dog’s feelings and intentions.

Signs of Stress in Dogs

While it’s easy to spot when a dog is happy, signs of anxiety or fear can be harder to detect. Yawning, licking lips, and other subtle cues may indicate stress in dogs. Understanding these signals can help prevent misunderstandings and improve communication between you and your pet.

Interpreting Dog Posture

Dog posture can reveal a lot about their emotions. A low, hunched body may indicate fear, while a playful “play bow” posture signals a desire to engage. Observing your dog’s body language can help you understand their intentions and mood better.

Decoding Tail Wagging

Tail wagging is a common form of communication for dogs. A big, loose wag can indicate happiness, while a stiff, vertical tail may signal stress or aggression. Pay attention to the direction of the wag to better understand your dog’s emotions. Research has shown that the direction of the wag can reflect the dog’s mood.

Understanding Vocalizations

Barking and growling are essential forms of vocal communication for dogs. Different sounds can convey various emotions or intentions. Research has shown that dogs can use growls to express their size and feelings honestly in different situations. Understanding your dog’s vocalizations can help you better respond to their needs.

About Our Expert

Zazie Todd is an animal behavior expert and award-winning author. She founded the blog Companion Animal Psychology in 2012 to explore how science can improve the happiness of cats and dogs. With over 50,000 monthly visitors, Companion Animal Psychology is a valuable resource for pet owners.

Source: www.sciencefocus.com

How did humans acquire the ability to communicate through speech?

Scientists researching human speech believe that this ability likely evolved in the human brain during our evolution from primates, but the exact process remains unclear. These researchers can compare the human brain to that of other primates to study how it changed over time and gave rise to language.

Previous studies have proposed that groove-like structures in the front of the primate brain may aid humans in learning language. To explore if these and other brain changes are involved in language evolution, an international team of scientists recently compared the speech-related regions of human and primate brains. The primates they studied included baboons and chimpanzees.

Using high-resolution scans from sources like the National Chimpanzee Brain Resource and the Human Connectome Project database, the scientists analyzed specific areas of the human and primate brains to identify differences that may have contributed to the development of language.

They focused on brain regions controlling speech, facial expressions, and language, such as the prefrontal extent of the frontal operculum (PFOp). They found that the PFOp is fully developed in humans, only partially developed in chimpanzees, and absent in Old World monkeys.

Another notable difference in the human brain was that the operculum was more pronounced on the left side, suggesting that the left hemisphere of the human brain has a larger PFOp than the right hemisphere, a feature not found in other primates.

In chimpanzee brains, by contrast, the researchers found that the PFOp was the same size on both sides, indicating that its full, lateralized development occurred relatively recently, in the human lineage.

The scientists also examined the distance between two brain grooves, the circular sulcus and the operculum. Previous studies linked these grooves to communication sounds in chimpanzees, leading the researchers to investigate their role in human language development.

Based on their findings, the scientists suggested that the development of certain brain structures like the D-FO and V-FO grooves contributed to the emergence of human language. They emphasized the need for further research to understand how these structures function in the human brain.

In conclusion, changes in brain structures like the operculum and cerebral sulci likely play a role in human language acquisition, but more research is needed to fully understand this association. Future studies should explore how features like the PFOp function in the human brain to better comprehend their role in speech development.



Source: sciworthy.com

Agility uses large language models to communicate with humanoid robots

I’ve spent most of the past year discussing generative AI and large language models with robotics experts. It is becoming increasingly clear that this type of technology is poised to revolutionize the way robots communicate, learn, look, and are programmed.

Therefore, many leading universities, research institutes, and companies are exploring the best ways to leverage these artificial intelligence platforms. Agility, a well-funded Oregon-based startup, has been experimenting with the technology for some time with its bipedal robot Digit.

Today, the company is showcasing some of its accomplishments in a short video shared across its social channels.

“We were curious to see what we could accomplish by integrating this technology into Digit,” the company said. “As a physical embodiment of artificial intelligence, we created a demo space with a series of numbered towers of several heights, as well as three boxes with multiple characteristics. Digit was given information about this environment, but no specific information about the tasks, just to see if it could execute natural language commands of varying complexity.”

In the video example, Digit is instructed to pick up a box the color of “Darth Vader’s lightsaber” and move it to the tallest tower. As you might expect from an early demo, the process is not instantaneous, but rather slow and methodical. Still, the robot performs the task as described.

Agility says: “Our innovation team developed this interactive demo to show how LLMs can make our robots more versatile and faster to deploy. In the demo, people can talk to Digit in natural language and ask it to perform tasks, giving a glimpse into the future.”
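A pipeline like the one in the demo can be sketched as three stages: an LLM receives a scene description plus a natural-language instruction, emits a structured action plan, and a dispatcher executes each step. Everything below is hypothetical; a hard-coded stub stands in for the model, and none of these names reflect Agility's actual API:

```python
import json

def fake_llm(instruction: str, scene: dict) -> str:
    """Stand-in for a real LLM call. A production system would send the
    instruction and scene description to a model API and get a plan back;
    here the plan is hard-coded to keep the sketch self-contained."""
    # Darth Vader's lightsaber is red, and tower_3 is the tallest tower.
    return json.dumps([
        {"action": "pick_up", "target": "red_box"},
        {"action": "move_to", "target": "tower_3"},
        {"action": "place", "target": "tower_3"},
    ])

def execute_plan(plan_json: str, log: list) -> None:
    """Dispatch each plan step to a (stubbed) robot primitive."""
    for step in json.loads(plan_json):
        log.append(f"{step['action']}({step['target']})")

# Hypothetical scene description matching the demo: three boxes, three towers.
scene = {"boxes": ["red_box", "blue_box", "green_box"],
         "towers": {"tower_1": 0.5, "tower_2": 1.0, "tower_3": 1.5}}

log = []
plan = fake_llm("Pick up the box the color of Darth Vader's lightsaber "
                "and move it to the tallest tower.", scene)
execute_plan(plan, log)
print(log)  # ['pick_up(red_box)', 'move_to(tower_3)', 'place(tower_3)']
```

The key design choice is the structured intermediate plan: constraining the model to emit machine-checkable steps lets the robot validate each action against the scene before moving.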




Natural language communication is an important potential application of this technology, along with the ability to program systems through low-code and no-code technologies.

On my Disrupt panel, Gill Pratt explained how Toyota Research Institute is using generative AI to accelerate robot learning.

We figured out how to use the latest generative AI techniques that allow humans to demonstrate both position and force, essentially teaching the robot from just a handful of examples. The code hasn’t changed at all. This is based on something called diffusion policy, work we conducted in collaboration with Columbia and MIT. We have taught 60 different skills so far.

MIT CSAIL’s Daniela Rus also told me recently: “Generative AI turns out to be very powerful in solving even motion planning problems. It provides much faster solutions, and more fluid and human-like control solutions, than model-predictive methods. I think this is very powerful, because the robots of the future will be much less robotic. Their movements will be more fluid and human-like.”

The potential applications here are wide and exciting. And Digit, as an advanced commercial robotic system being piloted in Amazon fulfillment centers and other real-world locations, seems like a prime candidate. If robots are to work alongside humans, they will also need to learn to listen to us.

Source: techcrunch.com