“Wearing non-smart glasses created a reality that was not augmented at all…”
Ekaterina Goncharova/Getty Images
By the mid-2020s, the world had become inundated with “AI slop.” Images, videos, music, emails, advertisements, speeches and TV shows were churned out by artificial intelligence, and most of it felt unoriginal and unengaging. Occasionally it amused, but much of it was dull and soulless, and some of it was misleading or outright harmful. Even ordinary interactions bred doubt: was the person on the other end of the call genuine? Many people were repulsed and eager to escape this perplexing landscape.
There was no Butlerian Jihad, the fictional revolt against thinking machines whose name nods to Samuel Butler and his prescient 1863 letter on machine evolution, “Darwin among the Machines.” Ironically, the solution emerged through innovative applications of AI itself.
One tech firm unveiled a series of smart glasses with an augmented reality (AR) display and built-in cameras, microphones and headphones. By 2028, engineers at the Reclaim Reality Foundation had adapted the technology, using bespoke AI to filter out anything AI-generated. The result was a kind of negative AR: glasses that presented a reality not augmented at all.
Roaming the streets with DumbGlasses, later dubbed X-ray specs for their ability to see through the synthetic surface of things, felt akin to subscribing to ad-free media. The glasses stripped away AI-generated billboards, seamlessly filling the gaps with natural scenery, and flagged whether a conversation or song had been made by classic analog means. Users embraced X-ray specs as a way to unwind, declutter their minds and break free from the deluge of AI. Many proudly displayed their status with T-shirts and badges touting slogans like “AI Vegan,” “Real or Nothing” and “Slop Free Zone.”
As we moved into the 2030s, electronic contact lenses and tiny ear implants emerged that could perform similar functions.
The online domain posed a different challenge. There, escaping the grip of AI and relentless algorithmic profiling proved far more difficult.
One method was to use search engines that did not trigger AI summaries; in the 2020s, one such option was startpage.com. Some clever hacks inserted expletives into search queries, which suppressed the AI-generated summaries. Even with these workarounds, though, evading AI profiling and targeting on social media remained nearly impossible. Given the overwhelming dominance of the major tech companies over social media, navigation and the online realm, disengaging was far easier said than done. Yet few were willing to abandon everything the internet revolution had given us; people still yearned for a digital universe to explore and rich online experiences.
The solution arrived as a new kind of network. Alongside the standard internet and the dark web emerged the veriweb (from veritas, Latin for truth), accessible only via specific browsers and passcodes and carrying content entirely free from AI influence. Working with Reclaim Reality, artists, musicians and writers devised a tamper-proof system, akin to the blockchains used to verify cryptocurrency transactions, that guaranteed every piece of content had verifiable human origins. The veriweb, or transparent web, became the trusted haven for reliable information and journalism, since users could trace where their content came from. Wikipedia, which had struggled with AI-generated material throughout the 2020s, moved to the veriweb in 2029, and traditional news organizations followed, eager to assert their credibility in a post-AI landscape. On the veriweb, moreover, users went unmonitored, unprofiled and untouched by AI algorithms.
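As a rough illustration of how such a provenance chain might work, here is a minimal sketch (hypothetical code; the veriweb is fictional and every name below is invented). Each work is hashed, chained to the previous ledger entry and signed by a vetted human creator, so the record cannot be silently rewritten:

```python
# Hypothetical sketch of a veriweb-style provenance ledger.
# Assumes each vetted human creator holds an Ed25519 keypair; the hard
# part, vetting that a human actually made the work, happens outside the code.
import hashlib
from dataclasses import dataclass

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

@dataclass
class Entry:
    content_hash: bytes  # SHA-256 of the work itself
    prev_hash: bytes     # hash linking this entry to the one before it
    signature: bytes     # creator's signature over both hashes

def append(ledger: list, content: bytes, key: Ed25519PrivateKey) -> None:
    """Hash the work, chain it to the last entry and sign the pair."""
    content_hash = hashlib.sha256(content).digest()
    prev_hash = (hashlib.sha256(ledger[-1].signature).digest()
                 if ledger else b"\x00" * 32)
    ledger.append(Entry(content_hash, prev_hash,
                        key.sign(content_hash + prev_hash)))

def verify(ledger: list, pubkeys: list) -> bool:
    """Check every chain link; raises InvalidSignature on a forged entry."""
    prev = b"\x00" * 32
    for entry, pub in zip(ledger, pubkeys):
        pub.verify(entry.signature, entry.content_hash + entry.prev_hash)
        if entry.prev_hash != prev:
            return False
        prev = hashlib.sha256(entry.signature).digest()
    return True
```

A real system would also need the social layer the story gestures at: some authority to certify that a given public key belongs to a human working by analog means.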
As millions flocked to the network, humanity rediscovered connection and creativity. AI remained in wide use for practical tasks, such as medical diagnosis, but the intellectual stagnation that had gripped society since the 2020s began to lift as people did more for themselves rather than leaving it to machines.
People found that navigating the vast digital world without algorithmic guidance meant giving up curated, personalized experiences, but few seemed to mind. The mass harvesting of sensitive data by tech giants, and the colossal revenues built on its targeted exploitation, faded into distant memory, mourned by almost no one.
Rowan Hooper is podcast editor at New Scientist and author of How to Spend a Trillion Dollars: The 10 Global Problems We Can Actually Fix. Follow him on Bluesky @rowwhoop.bsky.social
The Oakley Meta Vanguard represents a new era of display-less AI glasses, crafted for running, cycling and action sports, with robust Garmin and Strava integration. They could be the first genuinely functional sports smart glasses on the market.
These innovative glasses serve as running eyewear, open-ear headphones, and alternatives to head-mounted action cameras—all in one. They mark the latest collaboration between Meta and the sunglasses giant, Essilor Luxottica, which includes brands like Ray-Ban and Oakley.
Unlike the popular Ray-Ban Meta Wayfarer, which focuses on leisurely use, the Vanguard is specifically engineered for athletes, enabling them to move fast and track their performance without hesitation.
As one of the priciest screenless AI glasses available, they start at £499 (€549/$499/AU$789), surpassing the £399 Oakley Meta HSTN and £379 Ray-Ban Meta Wayfarer.
The included silicone nose pads come in three sizes, allowing for a customized fit that remains secure and balanced. Photo: Samuel Gibbs/The Guardian
The Vanguard distinguishes itself from traditional smart glasses thanks to a design led by Oakley, with a great fit and finish. The large wraparound visor and three-point fit system take cues from Oakley’s renowned Sphaera sports glasses, keeping the 66g frame snugly in place, even under a helmet.
They come in two frame colors with four contrast-enhancing lens options, though no prescription lenses are available. A small button on the left arm powers the electronics on and off, so the Vanguard can double as a plain pair of sunglasses.
The speaker sits next to a large camera button and a smaller action button whose functions can be configured in the Meta AI app. Photo: Samuel Gibbs/The Guardian
The hidden speakers within each arm can be directed towards your ear for listening to music, taking calls, and engaging with Meta’s AI chatbot. They offer superior sound quality compared to other open-ear speakers, and are loud enough to be heard over wind and road noise. Although they may not deliver the bass of traditional headphones, they are well-suited for music during physical activity. The five beamforming microphones effectively minimize background noise during calls or interactions with the AI, even in busy environments.
A touch panel on the right arm allows for manual control of playback and volume. Music automatically pauses when the glasses are removed, and the volume adapts to match surrounding noise—a great feature for road use.
Specifications
Size: 136×120×59mm
Weight: 66g (258g case)
Water resistance: IP67
Camera: 12MP/3K Ultra Wide
Speaker: Stereo open-ear
Microphone: 5-microphone array
Battery life: 6 hours of music playback (30 hours with case)
Connectivity: Bluetooth 5.3, Wi-Fi 6
Storage: 32GB
The ultra-wide camera is discreetly mounted between your eyes and remains unseen when worn. An LED directly above indicates when the camera is in use. Photo: Samuel Gibbs/The Guardian
The central camera captures good 12-megapixel photos and stabilized video at up to 3K resolution in clips of up to five minutes. It also shoots hyperlapses, which compress a stretch of action into a single fast-paced clip, and slow-motion video at up to 120 frames per second in 720p.
While it cannot compete with high-end smartphones, the camera captures live action well, much like the mid-range action cameras commonly used by adventurers. You can take photos by pressing the camera button, start video recording with a long press, or ask Meta AI for hands-free photo or video capture.
All functionalities, including voice controls for music, auto volume adjustments, and camera operations, work seamlessly with Bluetooth-connected devices, like running watches. For enhanced AI features, however, connections to Android or iPhone devices are necessary, via the Meta AI app.
The Meta AI app enables you to configure settings for the glasses, sync media, and translate or transcribe conversations with the chatbot. Composition: Samuel Gibbs/The Guardian
Similar to any chatbot, you can query Meta AI about its visual input. It can identify plants, translate text, and address inquiries about your surroundings. Notably, it also quickly converts distance and pace between metric and imperial units while running, although it falls short of the comprehensive capabilities of Google’s Gemini on mobile.
Furthermore, users can send messages, share photos, and make hands-free calls using their glasses, connected to devices or apps like WhatsApp, Facebook, or Instagram.
One of the Vanguard’s standout features is its integration with the latest Garmin running watches and bike computers. Link the free Meta AI app to your Garmin account and you can hear real-time stats during activities just by asking.
An LED indicator next to the right eye flashes to inform you when the AI is listening, interacting, or capturing an image. Photo: Samuel Gibbs/The Guardian
Ask Meta AI for “my stats” and it reads out personal metrics such as pace, distance, heart rate and cadence, pulled live from the Garmin at that moment. It can also announce your pace and other metrics at each lap, mile or kilometer, while the small LED by your right eye signals your pace or heart-rate zone status, turning red when you stray outside the desired range.
Garmin integration also lets the camera document your workouts automatically, whether walking, hiking, running or biking, as sketched below. It captures five-second videos every kilometer or mile, or at significant moments such as sprint finishes or heart-rate spikes. After the activity, the Meta AI app compiles the automatic clips, along with any manually captured video, into a highlight reel. You can overlay workout stats such as distance, pace, altitude, time, heart rate and power, then share the result through Meta’s apps or post it straight to your Strava activity log.
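The trigger rules Meta describes are simple in outline. Here is a hedged sketch of that sort of logic (hypothetical code, not Meta’s or Garmin’s actual API; every name below is invented for illustration):

```python
# Hypothetical sketch of highlight auto-capture rules, assuming the
# glasses receive a live telemetry stream from a paired watch.
from dataclasses import dataclass

CLIP_SECONDS = 5

@dataclass
class Sample:               # one telemetry reading from the watch
    distance_km: float
    pace_kmh: float
    heart_rate: int

@dataclass
class State:                # rolling context kept between samples
    last_captured_km: int = 0
    avg_pace_kmh: float = 10.0   # rolling average pace so far
    hr_ceiling: int = 175        # top of the user's target HR zone

def should_capture(sample: Sample, state: State) -> bool:
    """Return True when a 5-second clip should be recorded."""
    if int(sample.distance_km) > state.last_captured_km:
        state.last_captured_km = int(sample.distance_km)
        return True                                   # one clip per km
    if sample.pace_kmh > 1.3 * state.avg_pace_kmh:
        return True                                   # sprint finish
    return sample.heart_rate > state.hr_ceiling       # heart-rate spike
```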
The Meta AI app showcases a workout summary and processes any videos automatically captured via the glasses. Composition: Samuel Gibbs/The Guardian
It all works well, and while not every run needs video documentation, being able to capture highlight moments automatically and without hassle is a substantial bonus.
Video capture drains the battery, however. During a one-hour run I interacted with Meta AI multiple times, listened to music, automatically recorded 14 five-second videos, and manually shot 13 minutes of 1080p/30fps video plus 14 photos; the glasses ended the run with 25% battery remaining. They should survive a full marathon with automatic capture turned on, but go easy on prolonged manual filming if you need them to last the whole 42km.
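A back-of-the-envelope check supports that advice (my arithmetic, assuming the one-clip-per-kilometer cadence described above):

$$42.2\ \text{km} \times 5\ \text{s/km} \approx 211\ \text{s} \approx 3.5\ \text{min of automatic footage},$$

far less than the 13 minutes I recorded manually in the one-hour test run, so the automatic clips themselves are a modest drain; it is sustained manual filming that threatens the battery.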
For those without Garmin devices, the Meta AI app can sync with Apple Health, Google Health Connect, or Strava to retrieve data retrospectively from other trackers, such as Apple Watch and Coros devices. However, the metrics displayed on videos are limited, and there are no automatic capture features or running statistics available for these integrations.
The glasses last up to six hours of continuous music playback and ship in a sturdy Oakley case that provides four additional charges, uploading photos and videos to the cloud while the glasses recharge. Photo: Samuel Gibbs/The Guardian
Sustainability
The battery is rated to retain at least 80% of its capacity after 500 full charge cycles. Replacement lenses (£69), charging case (£139) and nose pads (£10) are available, but the glasses themselves cannot be repaired and the battery is not replaceable, ultimately making them disposable. The glasses contain no recycled materials, and Meta offers no trade-in scheme or environmental impact report for the product.
Price
The Oakley Meta Vanguard is priced at £499 (€549/$499/AU$789).
For reference, the pricing for other models includes the Ray-Ban Meta Wayfarer (gen 2) at £379, the Oakley Meta HSTN at £399, the Oakley Sphaera at £191, and the Shokz OpenRun Pro 2 at £169.
Verdict
The Oakley Meta Vanguard stands out as the finest sports smart glasses available. While expensive, they are clearly crafted for athletes by individuals who comprehend the needs of runners and sports enthusiasts.
By concentrating on the essentials for runners, cyclists, and adventure seekers—like an impeccable fit, lenses that effectively shield from the wind, decent water resistance, clear open-ear audio, extended battery life, and user-friendly controls—you end up with an excellent product without unnecessary bells and whistles.
The nose-mounted camera is excellent and could easily substitute for action cameras for many users. The microphone performs adequately, and Meta’s integrated AI is beneficial for answering inquiries that arise while on the move.
The primary highlight is unquestionably the Garmin integration, which lets you hear your stats on demand and get an automatic highlight reel from the camera during activities. It does require carrying your phone while running, which is a minor inconvenience. For non-Garmin users, the glasses also link to Strava and other fitness apps.
If you utilize them with your running watch to listen to music without needing your phone, basic voice commands for volume, playback, and camera will still operate smoothly.
The steep price is hard to justify, but the main drawback remains the non-replaceable battery. The lenses and nose pads can be swapped, but, like most earbuds and other smart glasses, the Vanguard cannot be repaired, which detracts from the overall score.
Pros: Large, high-quality lenses; secure fit; powerful and clear open-ear speakers; nose-mounted camera; IP67 waterproof rating; long battery life; protective case; excellent Garmin integration; compatible with Strava and other fitness apps; customizable lenses and nose pads.
Cons: very costly; non-repairable; advanced features require a phone; relying on voice commands on the move can feel awkward.
The Oakley logo doubles as part of the touchpad for controlling music playback and volume adjustments. Photo: Samuel Gibbs/The Guardian
Study participant measuring reading capacity post-retinal implant
Moorfields Eye Hospital
Individuals experiencing significant vision impairment can regain the ability to read, thanks to a compact wireless chip implanted in one eye along with advanced glasses.
Age-related macular degeneration (AMD) is a prevalent condition that impacts central vision and tends to progress over time. While the precise cause remains unknown, this condition arises from damage to the light-sensitive photoreceptor cells and neurons located in the central retina, leading to difficulties in facial recognition and reading. Available treatments are primarily designed to slow down the progression.
In an advanced form of AMD called geographic atrophy, people typically retain some photoreceptor cells that provide peripheral vision, along with enough retinal neurons to relay visual information to the brain.
Leveraging this, Daniel Palanker and his team at Stanford University in California created the PRIMA device. A small camera mounted on the glasses captures images and projects them, as infrared light, onto a 2-by-2-millimeter photovoltaic wireless chip implanted at the back of the eye.
The chip then transforms the image data into electrical signals, which the retinal neurons transmit to the brain. Infrared light is employed for this process as it is invisible to the human eye, thereby ensuring it does not interfere with any remaining vision. “This allows patients to utilize both the prosthesis and their peripheral vision simultaneously,” explains Palanker.
To evaluate its efficacy, researchers enrolled 32 participants aged 60 and over, all with geographic atrophy and visual acuity of 20/320 or worse in at least one eye, meaning they must be as close as 20 feet (6 meters) to see what someone with 20/20 vision can see from 320 feet (97.5 meters).
The team implanted a chip in one eye of each participant. After a healing period of four to five weeks, the volunteers began using the glasses in their everyday activities. The glasses let them magnify their view up to 12 times and adjust brightness and contrast as needed.
After a year of using the device, 27 of the participants could read again and recognize shapes and patterns, improving by an average of five lines on a standard eye chart compared with their baseline measurements. Some participants achieved 20/42 vision.
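To put those gains in context, here is a rough conversion, assuming the standard convention that each chart line is worth 0.1 logMAR:

$$\text{logMAR}\left(\tfrac{20}{320}\right) = \log_{10}\!\frac{320}{20} \approx 1.20, \qquad 1.20 - 5 \times 0.1 = 0.70 \approx \text{Snellen } \tfrac{20}{100}.$$

So an average five-line improvement takes a participant from the 20/320 study threshold to roughly 20/100; the best performers, at 20/42 (logMAR ≈ 0.32), did considerably better.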
“Witnessing them progress from reading letters to full words brought immense joy to everyone involved. One patient said, ‘I believed my eyes were irreparably damaged, but now they’re coming back to life,’” says José-Alain Sahel at the University of Pittsburgh School of Medicine.
While stem cell therapy and gene therapy may potentially restore vision lost due to AMD, these approaches are still in early experimental trials. PRIMA stands out as the first artificial eye designed to restore functional vision in individuals with the condition, allowing them to perceive shapes and patterns.
Approximately two-thirds of the volunteers experienced temporary side effects, such as increased intraocular pressure, as a result of the implants; however, this did not hinder their vision improvement.
A trial participant’s untreated eye (left) and the eye with the retinal implant (right)
Science Corporation
“This research is both exciting and significant,” remarks Francesca Cordeiro from Imperial College London. “It provides hope for delivering vision improvements that have previously seemed more like science fiction.”
The improved visibility experienced by participants is limited to black and white. “Our next objective is to develop software to provide grayscale resolution and enhance facial recognition,” states Palanker. Nevertheless, researchers do not anticipate achieving color vision in the near future.
Palanker also aims to increase PRIMA’s resolution, which is currently constrained by the size and number of pixels that can fit on a chip; a more advanced version is being tested in rats. “This version equates to human vision of 20/80, but electronic zoom can enable vision as sharp as 20/20,” he explains.
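That zoom claim follows from how Snellen acuity scales with magnification, on a deliberately simplified reading of Palanker’s remark that ignores the narrowed field of view zooming entails:

$$\text{effective acuity} = \frac{20}{80/M}, \qquad M = 4 \ \Rightarrow\ \tfrac{20}{20},$$

comfortably within the 12-times electronic magnification the current glasses already offer.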
As we near the threshold of the AI apocalypse, glimmers of hope remain. The technology may not always function as intended.
This was evident last week, when Mark Zuckerberg attempted to show off his company’s latest AI-powered smart glasses. “I’m not sure what to say, folks,” he told the audience after multiple failed attempts to make a video call through the glasses.
The mishap capped an ambitious start to Meta Connect 2025, a developer conference held in Menlo Park, California. The keynote was meant to unveil the Ray-Ban Meta Display, essentially a wearable iPhone for people too lazy to dig the device out of their pockets, with an aesthetic to please fans of both Buddy Holly and the Terminator. Yet despite the alluring design, the presentation was riddled with technical blunders, an unintentionally fitting tribute to the latest, meaningless iterations of our digital devices.
The event kicked off with quite the spectacle. Attendees watched Zuckerberg make his way to the stage, bobbing to the beat and dispensing fist bumps along the way, while the glasses’ camera showed the audience “Mark’s POV,” complete with an avalanche of incoming texts that undoubtedly reflected genuine excitement: “Let’s Gooo” trailed by rocket emojis, GIFs of two guys yelling, “The audience is hyped,” and “It’s Time.”
Zuckerberg eventually reached the stage, clad in his trademark baggy T-shirt and tousled hair. He spoke of the company’s dedication to building attractive eyewear, of the ironic notion that the technology “doesn’t interrupt” human interactions, and of the equally ironic claim that “superintelligence” is the cornerstone of our age. “AI must serve humanity, not just those in data centers automating our lives,” he stated.
Things flowed smoothly until it was time to actually use the AI features. Zuckerberg opened a video call with the chef Jack Mancuso, who was making a dish inspired by “probably Korean-style, like, steak sauce.”
“What should I do first?” he asked the Oracle.
“You’ve already combined the basic ingredients,” the AI mistakenly informed him, leading to an awkward silence.
“What do I do first?” Mancuso asked again.
“You’ve already combined the base ingredients, so grate the pears and gently mix them into the base sauce,” the AI patiently reminded him.
“I think the Wi-Fi is acting up. Sorry. Back to you, Mark.” (Certainly the fault lay with the Wi-Fi, not the AI itself.)
To his credit, Zuckerberg maintained his composure. “It’s all good. What can you do? It’s all good,” he said. “The irony is that you can spend years crafting technology, only for the Wi-Fi of the day to trip you up.”
Failed AI demonstrations are nothing new; they have become a tradition. At a Google event last year, a presenter tried to use the Gemini assistant to scan a poster for a Sabrina Carpenter concert and find her tour dates. The bot stayed silent at the request to “check my calendar for my availability when she comes to San Francisco this year”; it eventually worked on the presenter’s third attempt, on another device.
This year, Google demonstrated live translation on its own smart glasses, which failed just 15 seconds into the presentation. To be fair, a blunder in a high-stakes demo doesn’t mean a product doesn’t work, as anyone who remembers a certain Tesla Cybertruck presentation knows. That demo flopped when the designer threw metal balls at the truck’s supposedly shatterproof “armor glass,” yet the vehicle still made it to market, where it went on to earn the dubious distinction of being called more fatal than the Ford Pinto.
At this juncture in the presentation, one might assume Zuckerberg would play it safe. Instead, when it came time to demonstrate the new wristband for the Ray-Ban Meta Display, he chose live trials over slides.
The wristband, which he dubbed a “neural interface,” detects minimal hand gestures by picking up electrical signals from muscle activity. “You can be among others, yet still type without drawing attention,” Zuckerberg explained. In essence, the combination of glasses and wristbands is practically a stalker’s fantasy.
At least, that is, when it operates correctly. Zuckerberg repeatedly attempted to call his colleague Andrew Bosworth, but each attempt was met with failure. “What a letdown. I’m not sure what went wrong,” he said after the first unsuccessful attempt. He tried again: “I’ll pick it up with my neural band,” he quipped, but still couldn’t connect.
“I’m not sure what to tell you guys. It’s impressive when it works. We’ll bring Boz out here, move on to the next part of the presentation and hope that one works.” A sign at the back of the room, caught on camera, read: “Live Demo – Good Luck.”
If the aim was to humanize Zuckerberg, it succeeded: he did his best in the face of disaster and smiled through it all, with an almost childlike earnestness.
Still, the overall event felt like a misfiring millennial dream, a bizarre echo of early-2000s optimism that only Silicon Valley billionaires could still buy into. The spectacle mirrored Steve Jobs’s iPhone unveiling in 2007, with two key contrasts: back then the U.S. hadn’t yet crumbled behind the scenes, and it was obvious why people were eager to see the device launch. The internet! In your pocket! Can you believe this incredible human innovation?
This event, by contrast, was mired in hardware and software that most of us get along fine without, an AI push hoping to summon that same energy without anything comparable on offer.
To the casual observer, consumer technology seems to have entered an era of solutions in search of problems, and watching our high-tech overlords stumble on stage only sharpens the question: what is all this actually for?
Glasses featuring LCD lenses offer a convenient way to switch between correcting for near and distant vision.
Bifocal glasses date back to the 18th century, and their design hasn’t changed much since: the lower section of each lens is shaped for near viewing and the upper section for distance, forcing wearers to shift their gaze frequently. Now a team led by Yi-Hsin Lin at National Yang Ming Chiao Tung University in Taiwan has developed a pair of glasses that can toggle between the two modes at the press of a button.
Rather than conventional solid lenses, the new glasses use lenses containing a thin layer of liquid crystals controlled by an electric field; changing the field rearranges the molecules so that they bend light in different ways.
The concept of adjustable glasses using this principle, which produces the necessary electric field through thin electrodes embedded in the frame, is nearly 50 years old. However, advancements have been slow due to various technical obstacles. Lin and her colleagues have addressed some of these challenges, creating a wearable prototype.
The current iteration offers a limited range of optical powers, a functional viewing area of about 10 millimeters (3 to 4 times larger than that of traditional bifocals) and a switching time of roughly 5 seconds after pressing the button on the frame. Nonetheless, Lin’s team has worked out the physical principles that make the glasses function effectively, as sketched below, and is optimistic about future improvements such as faster switching through better electronics. “Almost everyone will need prescription eyewear at some point, so the market potential is enormous,” says team member Victor Reshetnyak at Taras Shevchenko National University of Kyiv, Ukraine.
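The constraint on the viewing area can be sketched with textbook optics (a simplification, not the team’s exact design). A lens of power $P$ must impose a radially varying phase delay, while a liquid crystal layer of thickness $d$ and switchable index change $\Delta n$ has only a fixed phase budget:

$$\varphi(r) = \frac{\pi r^2}{\lambda f} = \frac{\pi r^2 P}{\lambda}, \qquad \varphi_{\max} = \frac{2\pi\, d\, \Delta n}{\lambda},$$

so the product of aperture area and optical power $P$ (in diopters) is capped by how thick a cell can be switched quickly, which is why a wearable prototype delivers useful power only over a small zone unless the phase profile is wrapped, Fresnel-style.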
Viola Kanevsky, an optometrist in New York, says those most likely to benefit are people whose work mixes fine close-up focus with a broad view: architects evaluating long, straight lines, for instance, or stockbrokers monitoring multiple large screens both near and far, without constantly tilting their heads.
However, the prototypes are still far from ready for clinical use because of the lenses’ limited working area, says Mark Rosenfield at the SUNY College of Optometry in New York. The electronic components also add cost and manufacturing complexity to LCD glasses. And Joshua Silver at the University of Oxford points out that he and his colleagues developed alternative self-adjustable eyeglasses, based on fluid-filled lenses, back in 2004.
A device that can convert infrared light into visible light
Laura Valencia Molina et al. 2024
Glasses coated with lithium compounds may one day help us see clearly in the dark.
For more than a decade, researchers have been searching for the best lightweight materials that can convert infrared light, invisible to the human eye, into visible light in order to provide an alternative to night-vision goggles, which are often heavy and cumbersome.
Until recently, the leading candidate was gallium arsenide. Now, Laura Valencia Molina at the Australian National University in Canberra and her colleagues have found that a film of lithium niobate coated with a lattice of silicon dioxide performs better.
“Through improved design and material properties, we have achieved a tenfold increase in the conversion rate from infrared to visible light compared with gallium arsenide films,” says team member Maria del Rocio Camacho Morales, also at the Australian National University.
Through a series of experiments, the team demonstrated that the lithium niobate film could convert high-resolution images from infrared light with a wavelength of 1,550 nanometers to visible light with a wavelength of 550 nanometers, exceeding the capabilities of gallium arsenide.
Night-vision goggles work by letting infrared particles of light, called photons, pass through a lens and strike a device called a photocathode, which converts them into electrons. These electrons then hit a phosphor screen and are converted back into visible-light photons. The whole process requires cryogenic cooling to prevent distortion of the image.
In the new approach, says Valencia Molina, the lithium niobate film is struck by infrared light coming from an object while simultaneously being illuminated by a laser. The film combines the two, up-converting the infrared light into visible light.
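This mixing is known as sum-frequency generation, and conservation of photon energy fixes the wavelengths involved. Plugging in the reported values gives the pump wavelength (my calculation from the stated figures; the paper’s exact laser wavelength may differ):

$$\frac{1}{\lambda_{\text{vis}}} = \frac{1}{\lambda_{\text{IR}}} + \frac{1}{\lambda_{\text{pump}}} \ \Rightarrow\ \frac{1}{\lambda_{\text{pump}}} = \frac{1}{550\ \text{nm}} - \frac{1}{1550\ \text{nm}} \approx \frac{1}{852\ \text{nm}},$$

so a near-infrared laser at roughly 852 nanometers would up-convert the 1550-nanometer light into the visible 550-nanometer output.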
Camacho Morales says that one day, lattices of lithium niobate and silicon dioxide could be made into a film thinner than plastic wrap that could be coated over regular glasses to improve night vision.
The work is still at the research stage: in the experiments, the laser had to be positioned so it shone onto the film along with the infrared light from the object. The team is now working on an array of nanolasers that could sit on top of the lithium niobate film.
The research is an important step toward lightweight night-vision devices, and perhaps a film that could be applied to ordinary glasses, says Camacho Morales. It could also help drones navigate in the dark, she adds, because current night-vision hardware is too heavy for some vehicles to carry.
Watching a total solar eclipse is an experience you’ll never forget, but without the right precautions it could become memorable for the wrong reasons. Looking directly at the sun can be dangerous, so read on to learn how to observe a solar eclipse safely and what to prepare in advance.
On April 8, 2024, a total solar eclipse will be visible to more than 42 million people across North America. The path of totality is only about 185 kilometers wide, touching Mexico, 13 U.S. states and parts of Canada; most people in North America will experience a partial rather than a total solar eclipse.
“For those outside the path of totality, the moon will never completely cover the sun,” says Jeff Todd at Prevent Blindness, a Chicago-based eye health advocacy group. However you watch it, eye protection is essential.
“To avoid damaging your eyes, you should wear eclipse glasses throughout the eclipse,” says Todd. Otherwise, you risk burning your retina. This phenomenon, also known as “eclipse blindness,” can occur painlessly and can be permanent. It may take several days after seeing a solar eclipse before you realize something is wrong. Sunglasses do not provide sufficient protection. However, it is perfectly safe to wear eclipse glasses over your prescription glasses.
How to safely view a solar eclipse
The prize for those traveling to the path of totality is seeing the sun’s corona with the naked eye, though it is only visible for a few short minutes during totality. The partial phases before and after must be observed through eclipse glasses. Todd says people in the path of totality should wear their eclipse glasses at all times except during totality itself, the brief period of darkness when the sun is completely hidden by the moon. “Only then can you take off your eclipse glasses,” he says.
During totality itself, those in the path should in fact view the sun with the naked eye. “You have to look without a protective filter, otherwise you won’t see anything,” says Ralph Chou at the University of Waterloo, Canada.
Just before totality ends, light from the sun’s photosphere streams between the peaks and valleys at the moon’s edge. Called Baily’s beads, they appear for a few seconds and eventually swell into a flashing “diamond ring” as enough of the photosphere is exposed for sunlight to return. “It gives us ample warning that it’s time to resume viewing the partial eclipse through protective filters,” says Chou.
Which solar eclipse glasses should I buy?
It is important to wear eclipse glasses that meet the ISO 12312-2 international standard, which applies to products for viewing the sun directly. “Look for the ISO standard label and buy your glasses from a trusted source,” says Todd. “Get your glasses early, in time for the eclipse.” Before you buy, check that the company or brand appears on the American Astronomical Society’s vetted list of suppliers and resellers.
Do not use eclipse glasses with binoculars or telescopes. If you want to use these instruments to observe a solar eclipse, you’ll need to fit a solar filter over the objective lens (the lens opposite the one you look through). Never place solar filters or eclipse glasses between your eye and the eyepiece or binocular eyecup.
Another way to safely view the eclipse is with a pinhole projector, a simple device that projects an image of the sun onto paper or cardboard through a small hole. An even easier method is to use a colander or a slotted spoon: each hole projects a small crescent sun onto any surface below.
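The size of the projected image follows from the sun’s angular diameter, about 0.53 degrees or 9.3 milliradians:

$$d_{\text{image}} \approx L \times \theta_{\odot} \approx L \times 0.0093,$$

so a pinhole held 1 meter from the paper casts a solar image about 9 millimeters across; a longer projection distance gives a larger but dimmer image.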