US Student Handcuffed After AI Mistakes Bag of Chips for Gun in Baltimore

A system powered by artificial intelligence (AI) mistakenly identified a high school student’s Doritos bag as a firearm, prompting a report to local police that the student was armed.

Taki Allen was eating snacks with friends outside Kenwood High School in Baltimore on Monday night when armed police officers approached him.

“Initially, I was unsure of their intentions until they started approaching me with weapons drawn, ordering me to ‘Get on the ground,’ and I thought, ‘What is happening?'” Allen recounted to WBAL-TV 11 News.

Allen stated that they forced him to his knees, handcuffed him, and searched him but found nothing. They then showed him the image that triggered the alert.

“I was just holding a bag of Doritos – they said it looked like a gun because I was holding it with two hands and had a finger out,” Allen explained.

Last year, high schools in Baltimore County began using a gun detection system that leverages AI and school cameras to identify potential weapons. If anything suspicious is detected, both the school and police are notified.

In a letter to families, which was acquired by WBAL TV 11 News, the school stated: “We recognize how distressing this situation must have been for the individual who was searched as well as for other students who witnessed the event. Our counselors are ready to provide direct support to those involved and are available to talk with anyone needing assistance.”

Baltimore County police informed the media: “Officers from Essex Precinct 11 responded to Kenwood High School after a report of an individual carrying a weapon. Upon arrival, they searched the individual and confirmed that he did not possess a weapon.”

“Nobody wants their child to experience this. No one wants such incidents to occur,” said Allen’s grandfather, Lamont Davis, to the news station.

Source: www.theguardian.com

Memory Chips Just 10 Atoms Thick Could Boost Capacity Significantly

Current silicon chips are highly compact, but using ultrathin 2D materials could enhance their density even further.

Wu Kailiang/Alamy

A memory chip with a thickness of just 10 atoms could revolutionize the storage capacity of electronic gadgets like smartphones.

After decades of scaling down, modern computer chips pack tens of billions of transistors into an area comparable to a fingernail. But although individual silicon components have shrunk dramatically, the silicon wafers they are built on remain relatively thick, which limits how much further a chip’s complexity can be increased by stacking layers.

Researchers have been exploring the potential of thinner chips made from 2D materials like graphene. Graphene consists of a single layer of carbon atoms and represents the thinnest known material. However, until recently, only basic chip designs could be implemented with these materials, complicating their connection to traditional processors and integration into electrical devices.

Recently, Liu Chunsen and his team from Fudan University in Shanghai successfully integrated a 2D chip only 10 atoms thick with a CMOS chip currently utilized in computers. The manufacturing method for these chips yields a rough surface, making it challenging to layer a 2D sheet on top. The researchers addressed this issue by placing a glass layer between the 2D and CMOS chips, although this step is not yet part of the industrial process and requires further development for mass production.

The prototype memory module the team created achieved over 93% accuracy during testing. While this falls short of the reliability needed for consumer-grade devices, it serves as an encouraging proof of concept.

“This technology holds significant promise, but there’s still a considerable journey ahead before it can be commercialized,” says Steve Furber from the University of Manchester, UK.

Kai Shu, a researcher at King’s College London, says that shrinking current chip designs further without 2D materials is challenging, because traditional components leak signal when they are made at very narrow widths. Thinner layers could mitigate that leakage, so going thinner may in turn allow components to be made narrower.

“Silicon is encountering hurdles,” he says. “2D materials might provide solutions. With their minimal thickness, gate control becomes more uniform and comprehensive, resulting in reduced leakage.”

Source: www.newscientist.com

Nobel Prize in Physics Awarded to Trio Pioneering Quantum Computing Chips

John Clarke, Michel Devoret and John Martinis were awarded the 2025 Nobel Prize in Physics

Jonathan Nackstrand/AFP via Getty Images

The prestigious 2025 Nobel Prize in Physics was awarded to John Clarke, Michel Devoret, and John Martinis. Their research showed how quantum particles can tunnel through barriers, a critical process that underpins the superconducting quantum technology integral to modern quantum computers.

“I was completely caught off guard,” Clarke remarked upon hearing the news from the Nobel Committee. “This outcome was unimaginable; it felt like a dream to be considered for the Nobel Prize.”

Quantum particles exhibit numerous peculiar behaviors, including their probabilistic nature and their restriction to specific energy levels rather than a continuous range. These properties sometimes lead to surprising effects, such as particles tunneling through solid barriers. Such unusual characteristics were first revealed by pioneers like Erwin Schrödinger during the early years of quantum mechanics.

The implications of these discoveries are profound, particularly for explaining phenomena such as nuclear decay; however, earlier research was limited to individual particles and simple systems. It remained uncertain whether more intricate systems, such as electronic circuits conventionally described by classical physics, also obeyed these principles. The quantum tunneling effect, for instance, seemed to vanish in larger systems.

In 1985, the trio from the University of California, Berkeley – Clarke, Martinis, and Devoret – set out to change this. They investigated the behavior of charge traversing a superconducting circuit known as a Josephson junction, a device for which the British physicist Brian Josephson won the Nobel Prize in Physics in 1973. These junctions consist of superconducting wires with zero electrical resistance, separated by a thin insulating barrier.
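For readers unfamiliar with the device, the two standard Josephson relations (general textbook background, not taken from the article) describe how a supercurrent flows across the insulating barrier according to the quantum phase difference between the two superconductors, and how a voltage appears when that phase changes in time:

```latex
% Standard Josephson relations (textbook background, not from the article).
% I_c is the junction's critical current, \varphi the superconducting phase
% difference across the barrier, and V the voltage that appears as \varphi evolves.
\begin{aligned}
  I &= I_c \sin\varphi, \\
  V &= \frac{\hbar}{2e}\,\frac{\mathrm{d}\varphi}{\mathrm{d}t}
\end{aligned}
```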

The researchers demonstrated that the charge moving through these junctions behaved as a single quantum entity: it occupied discrete energy levels – a clear quantum signature – and could escape its zero-voltage state by tunneling through an energy barrier, producing a measurable voltage.

This groundbreaking discovery significantly deepened our understanding of how to harness similar superconducting quantum systems, transforming the landscape of quantum science and enabling other scientists to conduct precise quantum physics experiments on silicon chips.

Moreover, superconducting quantum circuits became foundational to the essential components of quantum computers, known as qubits. Developed by companies like Google and IBM, the most advanced quantum computers today consist of hundreds of superconducting qubits, building on the insights gained from Clarke, Martinis, and Devoret’s research. “In many respects, our findings serve as the cornerstone of quantum computing,” stated Clarke.

Both Martinis and Devoret have been affiliated with Google Quantum AI, where the first superconducting quantum computer to demonstrate an advantage over traditional machines was built in 2019. However, Clarke told the Nobel Committee that it was surprising to consider the extent of the impact their 1985 study has had. “Who could have imagined that this discovery would hold such immense significance?”

Topics:

  • Nobel Prize
  • Quantum Computing

Source: www.newscientist.com

Trump Sparks Concerns Over Nvidia’s Potential Sale of Advanced AI Chips in China

Donald Trump has indicated that Nvidia can sell more advanced chips in China than is currently allowed.

During a Monday briefing, Trump discussed his recent agreements with Nvidia and AMD, under which he authorized export licenses allowing the sale of previously restricted chips to China in return for the US government receiving 15% of the sales revenue. The US president defended the deal after analysts said it could resemble “shakedown” payments or an unconstitutional export tax. He also expressed hope for further negotiations over a more advanced Nvidia chip.

Trump said that Nvidia’s latest chip, Blackwell, would not be made available for sale, but that he is considering allowing “a slightly negatively impacted version of Blackwell,” downgraded by 30-50%.

“I believe he’ll be back to discuss it, but it will be a significant yet unenhanced version,” he remarked, referring to Nvidia’s CEO, Jensen Huang, who has had multiple discussions with Trump about the limits on chip exports to China.



Huang has yet to comment on the revenue-sharing agreement pertaining to the sales of Nvidia’s H20 chips and AMD’s Mi308 chips in China.

The H20 and Mi308 chips were barred from sale to China in April, even though the low-power H20 had been specially designed to meet the restrictions set by the Biden administration. Nvidia said last month that it hoped to receive clearance to resume shipments soon.

Nvidia’s chips are a major driver of the AI boom, making them highly sought after in both China and the US – a position that has drawn heightened scrutiny from analysts in Washington and concern from Chinese officials.

“I’m worried about reports indicating the US government might take revenue from sales of advanced chips such as the H20,” he told the Financial Times.

Trump justified the agreement on Monday: “I said, ‘Listen, I want 20% if I approve this for you,’” emphasizing that he has not received any personal money from the deal. He indicated that Huang negotiated the figure down to 15%.

“I permitted him only for the H20,” Trump clarified.

He referred to the H20 as an “outdated” chip that China “already has in a different form.”

However, Harry Cleja, research director at the Washington office of the Carnegie Mellon Institute of Strategic Technology, labeled the H20 as a “second tier” AI chip.

“The H20 is not the premier training chip available, but the kind of computing that dominates AI tasks today – particularly ‘inference’ models and ‘agent’ products – is where the field is focused,” Cleja told the Guardian, referring to systems that use advanced reasoning to autonomously resolve complex problems.

“Lifting H20 export restrictions undoubtedly provides Beijing with the necessary tools to compete in the AI realm.”

For several years, the US government has sought to protect national security by restricting such exports, particularly where the technology could accelerate artificial intelligence development or be put to military use.

China’s Foreign Ministry remarked on Monday that the country has consistently articulated its stance on US chip exports, accusing Washington of utilizing technology and trade measures to “maliciously suppress and hinder China.”

Revenue-sharing contracts of this kind are quite rare in the US, and the deal marks Trump’s latest intervention in corporate decisions after pressuring executives to invest in American manufacturing. He has also called for the resignation of Intel’s new CEO, Lip-Bu Tan, over Tan’s reported connections with Chinese companies.

Trump has also suggested imposing 100% tariffs on the global semiconductor market, exempting businesses that commit to investing in the US.

Taiwan’s TSMC, a leading semiconductor manufacturer, announced plans in April to expand its US operations with a $100 billion investment. However, foreign investments of this magnitude require approval from Taiwan’s government.

The Guardian confirmed that TSMC has yet to apply for this approval. The company has not responded to requests for comment.

Source: www.theguardian.com

Musk Celebrates $16.5 Billion Samsung Deal for AI Chips Supplying Tesla

On Monday, Samsung entered into a contract worth $16.5 billion (£12.3 billion) to produce artificial intelligence chips for Tesla, a deal that Tesla’s chief executive, Elon Musk, confirmed on social media.

The South Korean technology giant disclosed a partnership with an unnamed client in a regulatory filing. Musk shared details on the platform X.

Musk stated that Samsung will manufacture Tesla’s next-generation AI6 chips at its new facility in Texas.

He emphasized, “The strategic significance of this is hard to overstate.”

Back in December, the Biden administration allocated $4.755 billion to support Samsung’s semiconductor manufacturing operations in Texas under the Chips Act. At that time, former U.S. Commerce Secretary Gina Raimondo remarked that this funding would ensure a “steady flow” of chips vital for AI and national security.

In a post on X on Monday, Musk indicated that he had allowed Samsung to assist Tesla in “maximizing manufacturing efficiency” and “expediting” production processes.

He added that the Samsung facility in Taylor, a suburb of Austin, Texas, is “conveniently located not far from my home.”

This agreement is poised to revive projects that have faced significant delays due to Samsung’s struggle to maintain and attract major clients. Ryu Young-Ho, a senior analyst at NH Investment & Securities in Seoul, noted that the Taylor plant “currently has no customers,” making this deal “a logical move.”

In October, Reuters reported that Samsung had postponed shipments of chip manufacturing equipment from ASML, a Dutch supplier, as it had not yet secured key customers. The facility’s opening has already been delayed until 2026.

Samsung currently produces Tesla’s AI4 chips, which power the automaker’s driver assistance system. Taiwan’s TSMC will initially manufacture Tesla’s AI5 chips in Taiwan before moving production to Arizona.

The regulatory filing announcing Samsung’s chip supply agreement did not name the client, saying the counterparty had requested confidentiality for the deal, which runs until the end of 2033.

Samsung has been approached for comment.

Reuters contributed to this article

Source: www.theguardian.com

Nvidia confirms US will restrict sales of additional AI chips to China

Nvidia said on Tuesday that the US government will no longer allow it to sell some of its artificial intelligence chips to China without a license, and will require a license for any future sales.

These restrictions are the first major limits imposed by President Trump’s administration on semiconductor sales overseas. The decision could see Nvidia’s sales to China shrink in the near future, as the US tightens its curbs on chip exports to its geopolitical rival.

Nvidia has been striving to maintain sales to China amid increasing government restrictions. In response to rules imposed by the Biden administration in 2022, Nvidia modified its main AI chip, the H100, to comply with the US government’s regulations. The resulting H20 chip became a product sold exclusively in China.

Nvidia said it expects to take a $5.5 billion charge against current quarterly revenues for H20 inventory, purchase commitments, and related reserves.

The impact of these restrictions is more strategic than financial. Nvidia holds a dominant position in the semiconductor market for artificial intelligence systems. Selling chips to China is vital for its future, and losing access to this market could potentially benefit Huawei, a leading Chinese AI chip manufacturer, in challenging Nvidia globally.

“This decision will limit Nvidia’s reach in key markets and weaken its influence in the country,” stated Patrick Moorhead, a technology analyst at Moor Insights & Strategy. “Chinese companies may simply turn to Huawei as an alternative.”

Nvidia declined to provide a comment. The company’s stock price dropped over 5% in after-hours trading on Tuesday.

Commerce Department spokesperson Benno Kass announced on Tuesday that the government will be enforcing new export licensing requirements for Nvidia’s H20 chips, AMD’s MI308 chip, and their equivalents.

“The Commerce Department is dedicated to implementing the President’s directive to safeguard our national and economic security,” Kass remarked.

Nvidia disclosed the change in a regulatory filing on Tuesday, a day after earning praise from the White House for committing to invest $500 billion in US AI infrastructure. The company plans to begin manufacturing servers at a Houston plant and to work with a chip packaging company based in Arizona.

Despite these commitments, the regulatory filing revealed that Nvidia will need to obtain licenses from the US government to sell AI chips to China, following notification from the Trump administration. The administration confirmed on Tuesday that the licensing requirements will remain in place indefinitely.

This development follows a meeting between Nvidia’s CEO, Jensen Huang, and Trump at a $1 million-per-person dinner at Mar-a-Lago, which had fueled speculation that the US government might relent on its plans to restrict Nvidia’s sales to China.

Since taking office, the Trump administration has vowed to crack down on US support for Chinese AI companies. The emergence in recent months of the Chinese startup DeepSeek, which developed an AI system at a fraction of the cost typically spent by US companies, has raised concerns in Washington.

During his nomination hearing, Commerce Secretary Howard Lutnick emphasized the need to prevent Chinese companies from leveraging American technology to compete against the US.

Nvidia reported $17 billion in sales to China last year. As US government restrictions continue, the company’s operations face significant challenges, with sales to China dropping from about a fifth of Nvidia’s revenue in 2023 to 13% last year.

In its filing, Nvidia did not indicate the impact of the licensing requirements on future sales. Analysts suggest the impact may be limited because the H20 chips were modified to offer less performance than the H100 chips that can still be sold to US and European companies.

Source: www.nytimes.com

9 valuable lessons I learned from TikTok, including how to avoid stale potato chips

The average TikTok user spends approximately one hour a day on the app. But with the app set to be banned in the US, American users’ screen time is about to decline rapidly.

On Friday, the US Supreme Court rejected an appeal against a law banning the social media platform, citing national security concerns. TikTok’s China-based parent company, ByteDance, had challenged the law, arguing that it violated free speech protections for the app’s more than 170 million users in the United States. But the ruling was unanimous, requiring the app to find an approved buyer for its US operation by Sunday or be blocked. Outgoing US president Joe Biden has said he has no intention of enforcing the ban, instead deferring that decision to president-elect Donald Trump’s administration. While there are rumors that the ban may still be thwarted, a major US exodus has already begun as users flock to the alternative Chinese video-sharing app RedNote.

Many users have been posting “farewell TikTok” videos. The app was first released in the US in 2016. Some of the farewells satirize the national security concerns, with users bidding goodbye to their “personal Chinese spy.” Others are more heartfelt, such as video montages of a teenager growing up to turn 18, or a newborn puppy growing into a white-haired dog. Many commenters said, “I grew up with this app.”

But it is the “what I learned on TikTok that changed my life” videos that are the most compelling. While social media is regularly criticized for spreading fake news and harming mental health, these videos show there is a more positive side, at least for some people.

When life gives you lemons… Photo: TikTok

One video, by Brigitte Muller, has racked up over 1 million views. Her tips include putting a yoga mat under sofa cushions to stop them slipping off and spraying vodka on vintage clothing to get rid of musty odors. Thousands of users flocked to the comments section with lessons of their own, from recognizing the traits of neurodivergence to making jammy eggs. Some describe TikTok as the parent or grandmother they never had.

So, in honor of TikTok's final hours (at least in the US), I'm sharing nine of my own favorite lessons.

1. Always roll limes and lemons before squeezing them to maximize the amount of juice. I also ditched the glass lemon squeezer and replaced it with a fork.

2. Ever envied how the sofas in Architectural Digest houses always seem to have plump cushions rather than squashed ones? The secret is to buy a cushion insert that is two inches larger than the cover.

3. To prevent a cutting board from slipping, put a damp tea towel underneath it. And, somehow, placing a damp kitchen towel next to an onion while chopping stops your eyes from streaming.

4. You are tying your dressing gown incorrectly. Remove the tie and rethread it through the loops so that the ends hang in front of you. Then, instead of tying it at the back, pull the ends together and knot them at the front – and voila, it will not fall open unexpectedly again.

Let's tie…the correct way to tie a dressing gown. Photo: TikTok

5. A former American POW shows how to keep a bag of potato chips fresh without using rubber bands or clips. Lift the bag and fold in the two corners so that the top forms a triangle. Next, roll the top over and tuck it under, creating two “pockets” on either side. Turn the bag over and it stays closed on its own – no clips, and no more stale chips.

6. Do you leave your cosmetics scattered near the sink, or dump your bag by the front door as soon as you walk in? I now follow this mantra: “Don’t put it down, put it away.” If you put something straight back in its proper place after using it, there is nothing to organize later.

7. Using a turntable – aka a lazy Susan – to store spices means no more rooting around in the back of the cupboard for that one seasoning you can never seem to find.

Say cheese…turn the grater sideways. Photo: TikTok

8. Turn a box grater on its side to make grating cheese easier. The same goes for can openers: lay the opener flat on top of the can rather than working it around the side, and you avoid sharp, jagged edges.

9. A hair dryer helps remove stubborn adhesive labels from glass and plastic. It also works on address labels on cardboard boxes – very useful for reusing boxes after buying from eBay or Vinted.

Source: www.theguardian.com

Chips linked with light could speed up AI training while reducing energy consumption

IBM optical module prototype for connecting chips with optical fibers

IBM’s Ryan Rabin

Fiber optic technology helps chips communicate with each other at the speed of light, allowing them to transmit 80 times more information than using traditional electrical connections. This could significantly reduce the training time required for large-scale artificial intelligence models from months to weeks, while also reducing data center energy and emissions costs.

Most cutting-edge computer chips still communicate using electrical signals sent over copper wires. But as the tech industry rushes to train AI models at enormous scale – a process that requires networks of AI superchips to shuttle large amounts of data between them – companies are keen to link chips together with fiber-optic, speed-of-light communications.

This technology is not new. The internet already relies on undersea fiber-optic cables that stretch thousands of kilometers between continents. But to transmit data between fingernail-sized chips, companies need to connect as many hair-thin optical fibers as possible to the edge of each chip.

“As everyone knows, the best communication technology is fiber optics. That’s why fiber optics is used everywhere for long-distance communications,” said Mukesh Khare at IBM Research, who gave a preview of the technology at a press conference. “This co-packaged optical innovation essentially brings the power of fiber optics to the chip itself.”

Khare and his colleagues have developed an optical module that allows chipmakers to add six times more optical fibers to the edge of a chip than with current technology. This module uses a structure called an optical waveguide to connect 51 optical fibers per millimeter. It also prevents optical signals from one fiber from interfering with adjacent fibers.
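As a rough illustration of what those figures imply, here is a back-of-the-envelope sketch; only the 51 fibers per millimeter and the sixfold and eightyfold figures come from the article, while the 20 mm chip-edge length is an assumed, illustrative value.

```python
# Back-of-the-envelope estimate for IBM-style co-packaged optics.
# Only FIBERS_PER_MM, DENSITY_GAIN and BANDWIDTH_GAIN come from the article;
# the chip-edge length is an assumption used purely for illustration.

FIBERS_PER_MM = 51      # fibers the optical waveguide packs into each millimeter
CHIP_EDGE_MM = 20       # assumed edge length of a large AI accelerator die
DENSITY_GAIN = 6        # article: six times more fibers than current technology
BANDWIDTH_GAIN = 80     # article: 80 times more information than copper links

fibers_per_edge = FIBERS_PER_MM * CHIP_EDGE_MM
fibers_previously = fibers_per_edge / DENSITY_GAIN

print(f"Fibers along one {CHIP_EDGE_MM} mm chip edge: {fibers_per_edge}")
print(f"Roughly {fibers_previously:.0f} fibers with current packaging")
print(f"Claimed chip-to-chip bandwidth gain over copper: {BANDWIDTH_GAIN}x")
```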

“What IBM has really done here is take advantage of all of its materials and packaging technology, and its history of leadership in that field, to really break through in how waveguides can be used to achieve high-density optical fiber,” says Dan Hutcheson at TechInsights, a semiconductor technology research company headquartered in Canada. “For me, when I saw this, it was a big step forward.”

The result is enhanced chip-to-chip communication, potentially allowing AI developers to train large language models in less than three weeks instead of three months. Switching from wires to fiber optics for chip communications could also mean cutting energy costs for training such AI models by a factor of five.

IBM has already put its optical modules through stress tests that include high humidity and temperatures ranging from -40°C (-40°F) to 125°C (257°F). Hutcheson expects large semiconductor manufacturing companies may be interested in licensing the technology.

“We are in the early days of all of this, but semiconductor technology is the hottest area right now in terms of high-performance computing and AI technology,” he says.

Topics:

  • Artificial intelligence
  • Computing

Source: www.newscientist.com

Google’s AI asserts it can create superior chips compared to human designers, but experts are skeptical

Can AI design chips that are more efficient than those made by humans?

Yuichiro Kayano/Getty Images

Google DeepMind says its artificial intelligence is already helping design chips used in data centers and even smartphones. But some chip design experts are skeptical of the company’s claims that these AIs can plan new chip layouts better than humans.

The new method, dubbed AlphaChip, can design “superhuman chip layouts” in hours rather than relying on weeks or months of human effort, according to Google DeepMind researchers Anna Goldie and Azalia Mirhoseini in a blog post. The AI approach uses reinforcement learning to figure out the relationships between chip components and receives rewards based on the quality of the final layout. However, independent researchers say the company has yet to prove that such AI can outperform expert human chip designers or commercial software tools, and they want AlphaChip’s performance to be tested on public benchmarks that include cutting-edge circuit designs.
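To make the reinforcement-learning framing concrete, here is a minimal, purely illustrative sketch: components are placed one per step and only the finished layout is scored, which is the structure described above. The component names, grid size and scoring rule are assumptions for illustration, and random choice stands in for a trained policy; this is not AlphaChip’s actual algorithm.

```python
# Toy sketch of sequential chip placement with a terminal reward.
# Everything here is illustrative; it is not AlphaChip's method or reward.

import random

GRID = 8                                          # assumed placement grid
COMPONENTS = ["cpu", "cache", "dram_ctrl", "io", "dsp"]

def layout_quality(placement):
    # Stand-in reward: a tighter bounding box around all components scores higher.
    xs = [x for x, _ in placement.values()]
    ys = [y for _, y in placement.values()]
    return -((max(xs) - min(xs)) + (max(ys) - min(ys)))

def pick_cell(partial_placement):
    # A trained policy would choose cells based on experience; we pick at random.
    return random.randrange(GRID), random.randrange(GRID)

def run_episode():
    placement = {}
    for component in COMPONENTS:                  # one placement decision per step
        placement[component] = pick_cell(placement)
    return layout_quality(placement)              # the reward arrives only at the end

best_score = max(run_episode() for _ in range(1000))
print("Best layout score found by random search:", best_score)
```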

“If Google provides experimental results for these designs, we’ll be able to make a fair comparison, and we hope everyone will accept the results,” says Patrick Madden at Binghamton University in New York. “Experiments take a day or two to run at most, and Google has nearly infinite resources. The fact that these results aren’t being provided speaks volumes to me.” Google DeepMind declined to comment.

Google DeepMind’s blog post accompanies an update to the company’s 2021 Nature journal paper describing the AI process. Since then, Google DeepMind says, AlphaChip has helped design three generations of Google’s tensor processing units (TPUs). TPUs are specialized chips used to train and run generative AI models for services such as Google’s Gemini chatbot.

The company also claims that its AI-assisted chip designs outperform those designed by human experts and are steadily improving. The AI accomplishes this by reducing the overall length of wire needed to connect chip components, which can reduce a chip’s power consumption and increase processing speed. Google DeepMind also said AlphaChip created the layout for a general-purpose chip used in Google’s data centers, and that it helped MediaTek develop a chip used in Samsung phones.
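One common proxy for that “overall length of wire” is half-perimeter wirelength (HPWL): for each net, take the bounding box of the components it connects and add its width and height. The sketch below uses an invented netlist and coordinates purely for illustration; the article does not say which wirelength metric AlphaChip optimizes.

```python
# Half-perimeter wirelength (HPWL), a standard estimate of total wiring length.
# The netlist and placements below are invented for illustration only.

def hpwl(nets, positions):
    """nets: lists of component names; positions: component name -> (x, y)."""
    total = 0.0
    for net in nets:
        xs = [positions[name][0] for name in net]
        ys = [positions[name][1] for name in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

nets = [["alu", "regfile", "cache"], ["cache", "mem_ctrl"]]
compact = {"alu": (0, 0), "regfile": (1, 0), "cache": (1, 1), "mem_ctrl": (2, 1)}
spread  = {"alu": (0, 0), "regfile": (5, 0), "cache": (1, 4), "mem_ctrl": (6, 5)}

print("Compact layout wirelength:", hpwl(nets, compact))  # lower is better
print("Spread-out layout wirelength:", hpwl(nets, spread))
```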

However, the code published by Google lacks support for common industry chip data formats, which suggests the AI method is currently more suited to Google’s own chips, says Igor Markov, a chip design researcher. “We have no idea what AlphaChip is today, what it does or doesn’t do,” he says. “We know that reinforcement learning requires two to three orders of magnitude more computational resources than techniques used in commercial tools, and typically lags behind [in terms of] results.”

Markov and Madden also criticized the original paper’s controversial claim that AlphaChip outperforms anonymous human experts. “Comparisons to unnamed human designers are subjective, non-reproducible, and very easily fooled – it is possible the human designer was not trying hard enough or was underqualified – so there are no scientific results here,” says Markov. “Imagine if AlphaGo were reported to have won against an unknown Go player.”

In 2023, after independent experts reviewed Google’s paper, Nature retracted an explanatory article that had initially praised Google’s efforts. One of those experts, Andrew Kahng at the University of California, San Diego, led a public benchmarking effort that tried to replicate Google’s AI method and found it could not consistently outperform human experts or traditional computer algorithms. The best results came from commercial chip design software from companies such as Cadence and Nvidia.

“Reinforcement learning appears to lag significantly behind the state-of-the-art in every benchmark that would be considered a fair comparison,” Madden says. “I don’t think that’s a promising research direction when it comes to circuit placement.”

Source: www.newscientist.com