Understanding ImageFX: A Comprehensive Guide to Google’s New AI Image Generator

Google has been lagging behind in AI image generation. OpenAI’s groundbreaking DALL·E image generator was released two years ago, and Google has only recently released a competing product.

The software, known as ImageFX, is backed by one of the world’s largest technology companies and the vast amount of data at its disposal. But where does that data come from?

In brief, ImageFX has produced some impressive images that rival the best. But how does it work? Can you access it now? And has it solved the AI art world’s biggest problems?

How to use Google ImageFX

Google ImageFX is currently available in countries like the United States, Kenya, New Zealand, and Australia.

If you attempt to access the site in a country like the UK, you’ll see a warning stating, “This tool is not yet available in your country.”

To access it from one of the supported countries, visit Google’s AI Test Kitchen and create an account. Once everything is set up, you can start entering prompts.

Even if you’re not in one of the listed countries, the website is still worth visiting. Google allows you to sign up for notifications about when the platform becomes available in your area.

How good is Google ImageFX?

There’s no denying that Google is late to the game. OpenAI’s DALL·E was released in January 2021, and Midjourney arrived about a year later. So did Google’s delay pay off in terms of quality?

Two images generated by ImageFX. On the left is a room with an art desk, and on the right is a painting of a vampire – Credit: ImageFX

The images released so far show that ImageFX can produce content at a very high level. Its output is detailed and contextually accurate; ImageFX is, unsurprisingly, a capable image generator.

But that’s expected. AI art has made significant progress over the years, and Google’s main competitors are producing similarly high-quality work and have been doing so for much longer.

The significant advantage of ImageFX at the moment is that it’s free (in select countries). Both Midjourney and DALL·E sit largely behind paywalls or restricted services, so it’s worth making the most of ImageFX before anything changes.

ImageFX also includes a unique feature called the “Expressive Chip.” This allows users to quickly edit the prompt and try a variation. For example, if you request a portrait of a woman, you can quickly switch the style to abstract, hand-drawn, or even oil painting.

How does it work?

Fundamentally, Google ImageFX works like any other AI art generator. The process involves several steps, starting with assembling an image database large enough for training.

Google has not disclosed the source of its training data, but it likely includes a combination of internal sources, collaborations, and possibly web scraping and user-generated content.

Once the database is built, a model is trained on these images to learn the relationships between words and the visual concepts they describe. In ImageFX’s case, Google says the tool is powered by Imagen 2, a diffusion model.

A diffusion model starts from random noise and progressively refines it, guided by the training images and their accompanying text descriptions. By repeating this process over an enormous number of image-caption pairs, the model effectively learns the relationships between words, images, and context.

This training is what lets ImageFX and other AI image generators understand prompts: the model has learned which words are associated with which visual features.
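To make that concrete, here is a minimal, illustrative sketch of a text-conditioned diffusion training step in Python, using PyTorch. Everything in it is a toy stand-in: Google has not published ImageFX’s internals, and a real system would use a large U-Net or transformer denoiser, a learned text encoder, and a carefully designed noise schedule rather than the simple linear mix below.

```python
# Toy sketch of text-conditioned diffusion training.
# Illustrative only -- not Google's actual model; all names are invented.
import torch
import torch.nn as nn

class ToyDenoiser(nn.Module):
    """Stand-in for the real denoising network (normally a U-Net)."""
    def __init__(self, image_dim: int, text_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(image_dim + text_dim + 1, 256),
            nn.ReLU(),
            nn.Linear(256, image_dim),
        )

    def forward(self, noisy_image, text_embedding, t):
        # Predict the noise that was added, conditioned on the text.
        x = torch.cat([noisy_image, text_embedding, t], dim=-1)
        return self.net(x)

def training_step(model, image, text_embedding):
    # 1. Pick a random noise level t for each example in the batch.
    t = torch.rand(image.shape[0], 1)
    # 2. Mix the clean image with Gaussian noise in proportion to t.
    noise = torch.randn_like(image)
    noisy_image = (1 - t) * image + t * noise
    # 3. Ask the model to recover the noise, given the text prompt.
    predicted_noise = model(noisy_image, text_embedding, t)
    # 4. The loss measures how far the guess is from the true noise.
    return nn.functional.mse_loss(predicted_noise, noise)

# Toy usage: a batch of 8 flattened 64-pixel "images" with 32-dim captions.
model = ToyDenoiser(image_dim=64, text_dim=32)
loss = training_step(model, torch.randn(8, 64), torch.randn(8, 32))
loss.backward()  # in training, repeated over millions of image-caption pairs
```

The key idea survives the simplification: by repeatedly learning to undo noise given a caption, the model gradually absorbs which words go with which visual features.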

How is it linked to Google Bard?

Google Bard is probably the biggest competitor to the AI chatbot ChatGPT. Google had been working on the chatbot for some time before releasing it publicly in 2023.

If ImageFX handles the pictures, Bard handles the words and context. The goal is to combine the two into a single, more capable AI model, much as OpenAI pairs ChatGPT with DALL·E, its image generator.

Google Bard is currently in testing but will soon be fully integrated with the recently announced Google Gemini model.

In theory, this could mean a platform where you ask the model to create a board game and it returns the rules and lore along with all the images, boards, and content. Or you could write a series of books complete with matching illustrations.

Does ImageFX produce bad images?

There’s a problem with AI art: people. When a model is trained on artwork made by humans and then used by humans, the less savory side of human nature tends to creep in.

Previous AI art generators have produced sexist, biased, and sometimes intensely graphic images. This is a problem all the major technology companies are trying to tackle, including Google with ImageFX.

Google says: “All images generated with ImageFX are marked with SynthID, a tool developed by Google DeepMind that adds digital watermarks directly to the content we generate.”

“SynthID watermarks are imperceptible to the human eye but can be detected for identification. Additionally, all images contain metadata, so when you encounter an AI-generated image, you can get more information.”
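The SynthID watermark itself can only be detected with Google’s own tooling, but the ordinary metadata half of that claim can be inspected with standard libraries. A minimal sketch using Pillow, where the file name is a hypothetical local example:

```python
# Inspect the metadata attached to an image file.
# Note: SynthID's pixel-level watermark requires Google's own detector;
# this only reads the ordinary metadata fields mentioned above.
from PIL import Image

img = Image.open("imagefx_output.png")  # hypothetical file name

# Format-level metadata (e.g. PNG text chunks).
for key, value in img.info.items():
    print(f"{key}: {value}")

# EXIF tags, if present (more common in JPEGs).
for tag_id, value in img.getexif().items():
    print(f"EXIF tag {tag_id}: {value}")
```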

In addition to this, Google announced that it has improved the safety of its training data, reducing problematic output such as violent, offensive, or sexually explicit content. This extends to a reduced ability to create images of real people.

Source: www.sciencefocus.com

UK Artists Unite to Combat AI Image Generation Software

Since the advent of Midjourney and other image generators, artists have been watching and wondering whether AI represents a major opportunity or an existential threat. A list submitted in a US lawsuit names 16,000 artists whose work is said to have been used to train Midjourney’s AI, including Bridget Riley, Damien Hirst, Rachel Whiteread, Tracey Emin, David Hockney, and Anish Kapoor. Now that the list has been revealed, the art world is mobilizing.

British artists have contacted US lawyers to discuss joining a class action lawsuit against Midjourney and other AI companies, the Observer has learned, while other artists argue that they could bring legal action themselves in the UK.

“What we need to do is come together,” said Tim Flach, president of the Association of Photographers and an internationally acclaimed photographer whose name also appears on the list.

“Releasing this list of names to the public is a great opportunity for artists to come together and take on this issue. Personally, I am in favor of doing so.”

The 24-page list of names constitutes Exhibit J in a class action lawsuit filed by 10 American artists in California against Midjourney, Stability AI, Runway AI, and DeviantArt. “We’re seeing interest from artists around the world, including in the UK,” said Matthew Butterick, one of the artists’ lawyers.

Tech companies have until February 8 to respond to the complaint. Midjourney did not respond to requests for comment.




Bridget Riley and her painting Rajasthan (2012)
Photo: Nils Jorgensen/Rex Shutterstock

Source: www.theguardian.com

Detailed Image of Black Hole Unveiled in New Fiery Donut Visualization

The image on the right is the latest and best image of a black hole.

EHT collaboration

Thanks to follow-up observations made a year after those behind the world’s first image of a black hole, we now have the most detailed view of a black hole to date.

In 2019, researchers released an image of the supermassive black hole known as M87*, located 55 million light-years away at the center of galaxy M87. The image, the world’s first glimpse of a black hole, was taken during the first observations in 2017 by a network of radio astronomical observatories around the world called the Event Horizon Telescope (EHT).

Now, the EHT collaboration has released follow-up images of M87*, taken during 2018 observations that added a telescope in Greenland to the network.

As the name suggests, black holes emit no light, so the glow in the image does not come from the black hole itself. What we see instead is its silhouette against a mass of hot material being pulled inward by the black hole’s powerful gravity.

“This image tells us that the black hole’s shadow is permanent and still exists,” says EHT scientist Eduardo Ros. “You can see that the ring is a beautiful circle. It’s very circular, not an oval or anything. We also see an enhancement on the south side of this ring, which is what we expected.”

This enhancement, visible as a slightly brighter glow below the slightly displaced shadow of M87*, is caused by the distortion of space-time associated with the black hole’s rotation, as described by Albert Einstein’s theory of general relativity.

The additional telescope slightly increased the resolution of the images and greatly increased the amount of data that can be cross-referenced with observations from other telescopes. However, poor weather made observing conditions difficult, which means the resolution is not as high as theoretically possible, says Ros.

Source: www.newscientist.com

Newly color-corrected image shows that Uranus and Neptune have a greenish-blue hue

The so-called ice giants Uranus and Neptune are the most distant giant planets in the solar system. Our knowledge of these worlds was revolutionized by the flybys of NASA’s Voyager 2 spacecraft on January 24, 1986 and August 25, 1989, respectively. Since those encounters, our picture of the visible appearance of these worlds has come primarily from images reconstructed from observations by Voyager 2’s Imaging Science System (ISS), recorded through several separate filters spanning ultraviolet to orange wavelengths. In these images Uranus appears pale green and Neptune dark blue, and this perception of the planets’ relative colors became generally accepted. New research, however, reveals that the two ice giants are actually much closer in color.

Voyager 2/ISS images of Uranus and Neptune released shortly after the flybys in 1986 and 1989, respectively, compared with reprocessed versions representing the study’s best estimate of the planets’ true colors. Image credit: Irwin et al., doi: 10.1093/mnras/stad3761.

Professor Patrick Irwin of the University of Oxford said: “While the familiar Voyager 2 image of Uranus was published in a form close to ‘true’ color, the images of Neptune were in fact stretched and enhanced, and therefore made artificially too blue.”

“Although the artificially saturated colors were known to planetary scientists at the time, and the images were published with captions explaining this, that distinction had become lost over time.”

“By applying our model to the original data, we were able to reconstruct the most accurate representation to date of the colors of both Neptune and Uranus.”

In the study, Professor Irwin and his colleagues used data from the Space Telescope Imaging Spectrograph (STIS) on board the NASA/ESA Hubble Space Telescope and the Multi Unit Spectroscopic Explorer (MUSE) on ESO’s Very Large Telescope. In both instruments, each pixel contains a continuous spectrum of colors.

This means that the STIS and MUSE observations can be processed unambiguously to determine the actual apparent colors of Uranus and Neptune.

The astronomers used these data to rebalance the composite color images recorded by Voyager 2’s camera, and also those from Hubble’s Wide Field Camera 3 (WFC3).
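As a toy illustration of what such rebalancing involves, the sketch below scales each channel of a composite image by a per-channel correction factor. The factors and arrays are invented for illustration; in the actual study, the corrections were derived from the STIS and MUSE spectra.

```python
# Toy illustration of rebalancing an RGB composite.
# The correction factors here are made up; in the real study they
# come from spectra measured by STIS and MUSE.
import numpy as np

def rebalance(image, factors):
    """Scale each channel of an (H, W, 3) float image and clip to [0, 1]."""
    return np.clip(image * np.asarray(factors), 0.0, 1.0)

# Fake "too blue" composite: red and green suppressed, blue exaggerated.
rng = np.random.default_rng(0)
composite = rng.random((4, 4, 3)) * np.array([0.5, 0.7, 1.0])

# Hypothetical factors that undo the exaggeration.
true_color = rebalance(composite, factors=(1.4, 1.2, 0.8))
print(true_color.mean(axis=(0, 1)))  # channel means move closer together
```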

This revealed that Uranus and Neptune are actually quite similar shades of greenish-blue.

The main difference is that Neptune has a slight hint of additional blue. Models revealed that this is due to Neptune’s thin haze layer.

The study also answers the long-standing mystery of why Uranus’ color changes slightly during its 84-year orbit of the sun.

The authors reached their conclusion by comparing images of the ice giant with measurements of its brightness at blue and green wavelengths, recorded by the Lowell Observatory in Arizona between 1950 and 2016.

These measurements showed that Uranus appears slightly greener during the summer and winter solstices, when one of the planet’s poles points toward our star.

At its equinoxes, however, when the sun is above the equator, Uranus takes on a somewhat bluer hue.

Part of the reason for this is that Uranus has a highly unusual spin.

It rotates almost on its side as it orbits, meaning that at the planet’s solstices either its north or south pole points almost directly toward the sun and Earth.

This matters because any changes in the reflectivity of the polar regions have a large effect on Uranus’ overall brightness as seen from Earth.

What astronomers have been less clear about is how and why this reflectivity differs.

This led the researchers to develop a model that compares the spectra of Uranus’ polar and equatorial regions.

They found that in polar regions, green and red wavelengths are more reflective than blue wavelengths. Part of the reason is that red-absorbing methane is about half as abundant near the poles as it is at the equator.

But this wasn’t enough to fully explain the color change, so the researchers added a new variable to the model: a “hood” of gradually thickening icy haze that builds up over the planet’s sunlit pole as it moves from equinox to summer solstice.

Astronomers believe it is likely made up of particles of methane ice.

When simulated in the model, the ice particles further increased reflection in green and red wavelengths at the poles, providing an explanation for why Uranus is green at the summer solstice.

“This is the first study to match quantitative models with image data to explain why Uranus’s color changes during its orbit,” Professor Irwin said.

“In this way, we demonstrate that Uranus is greener at the solstice not only because the polar regions have a reduced methane abundance, but also because of an increased thickness of brightly scattering methane ice particles.”

“The misperception of Neptune’s color, as well as the unusual color changes of Uranus, have puzzled us for decades. This comprehensive study should finally put both issues to rest,” said Dr. Heidi Hammel, a researcher at the Association of Universities for Research in Astronomy (AURA).

The results appear in the Monthly Notices of the Royal Astronomical Society.

_____

Patrick G.J. Irwin et al. 2024. Modelling the seasonal cycle of Uranus’s colour and magnitude, and comparison with Neptune. MNRAS 527 (4): 11521-11538; doi: 10.1093/mnras/stad3761

Source: www.sci.news

Juno spacecraft captures incredibly detailed image of Jupiter’s moon Io

This image of Io was taken by the Juno spacecraft.

NASA/SwRI/MSSS

Thanks to NASA's Juno spacecraft, which flew past the moon on December 30, we now have our closest look at Jupiter's volcanic moon Io in decades.

Juno, which has been orbiting Jupiter since 2016, has taken increasingly close images of Io in recent months as its orbit around the planet shifts.

This latest image was taken just 1500 kilometers above the moon's surface. In it, you can see some of Io's hundreds of towering mountains, some of which can exceed 10 kilometers in height, and their long, sharp shadows.

Io is thought to be the most volcanically active body in the solar system, with hundreds of active volcanoes. These volcanoes tend to be smaller than the largest mountains, averaging only 1 to 2 kilometers in height, and are difficult to see in images.

But by comparing the data with images from Juno's 56 previous orbits, astronomers can begin to work out how these volcanoes have changed over time and why they are so active.

Juno has also been exploring Jupiter's other moons, including Europa and Ganymede, collecting data and taking the closest images since NASA's Galileo spacecraft. In February, Juno will again fly very close to Io, passing about 1,500 kilometers above its surface.

Juno will fly close to Io seven more times before its mission at Jupiter ends in late 2025, but that won't be the end of learning about Jupiter's moons. NASA's Europa Clipper spacecraft, scheduled to launch in October this year, is designed to fly just 25 kilometers above the surface of Europa, providing important information about the moon's mysterious subsurface ocean, considered one of the most promising sites for life in the solar system. The spacecraft is scheduled to arrive at Europa in 2030.

Source: www.newscientist.com

Best-Ever Image of the Running Chicken Nebula Captured by OmegaCAM

Astronomers using the OmegaCAM camera on ESO’s VLT Survey Telescope (VST) have captured a 1.5-billion-pixel image of a vast stellar nursery called the Running Chicken Nebula.



The Running Chicken Nebula is made up of several clouds, all of which can be seen in this OmegaCAM/VST image. Image credit: ESO / VPHAS+ Team / CASU.

The Running Chicken Nebula, also known as IC 2944, is located in the constellation Centaurus, about 6,500 light-years from Earth.

ESO astronomers said in a statement: “The Running Chicken Nebula is actually made up of several regions, all of which can be seen in this vast image, which spans an area of the sky roughly the size of 25 full moons.”

“The brightest region in the nebula is IC 2948, where some people see the chicken’s head and others its rear end.”

“The wispy pastel outlines are fantastic plumes of gas and dust.”

“IC 2944 is characterized by a bright, vertical, almost column-like structure toward the center of the image.”

“The brightest glow in this particular area comes from Lambda Centauri, a star visible to the naked eye that is much closer to us than the nebula itself.”

But IC 2948 and IC 2944 themselves are full of young stars. And while they may be bright, they are definitely not cheerful.

“They cut through their environment much as chickens do, releasing enormous amounts of radiation,” the astronomers said.

“Some regions of the nebula, known as Bok globules, can withstand the intense assault of the ultraviolet radiation that pervades this region.”

“If you zoom in on the image, you may see small, dark, dense pockets of dust and gas scattered throughout the nebula.”

“Other regions pictured here include Gum 39 and Gum 40, with Gum 41 in the lower right.”

“Besides the nebula, there are countless orange, white, and blue stars in the sky, like fireworks.”

“Overall, this image is a wonder to behold: a feast for the eyes as you zoom in and pan around.”

“This image is a large mosaic made of hundreds of individual frames carefully stitched together.”

“Individual images were taken through filters that let different colors of light through and combined into the final result presented here.”

“The data used to create this mosaic were acquired as part of the VST Photometric Hα Survey of the Southern Galactic Plane and Bulge (VPHAS+), a project aimed at better understanding the life cycle of stars.”

Source: www.sci.news

JWST Captures Spectacular Image of Uranus Revealing 13 Rings and 9 Moons

Uranus showing all its rings and 9 of the planet’s 27 moons

NASA, ESA, CSA, STScI

This amazing shot of Uranus, taken by the James Webb Space Telescope (JWST), gives us the most complete view of the planet yet, revealing its rings and turbulent atmosphere in stunning detail.

In April, JWST imaged Uranus in the infrared, clearly revealing the ice giant’s rings of rock and dust, which had previously been directly imaged only twice, by the Voyager 2 spacecraft and by the Keck Observatory on Earth. Eleven of Uranus’s 13 known rings were visible in that image, but the final two were too faint to see.

JWST has now followed up on these observations using a wider field of view and more wavelengths of infrared light, revealing the rings in even more detail and showing us the elusive final two rings.

The image above also shows nine of Uranus’ 27 moons. These are all tilted at a 98-degree angle away from the sun, the same as the planet itself. Another new JWST image, below, shows five more moons (Oberon, Umbriel, Ariel, Miranda, and Titania) shining like blue stars, bringing the total shown to 14.

This JWST photo of Uranus shows five more moons, shining like blue stars around the planet. They are (clockwise from top) Oberon, Umbriel, Ariel, Miranda, and Titania.

Credit: NASA, ESA, CSA, STScI

The planet’s tilt produces long stretches of sunlight and darkness on different sides of Uranus, with each season lasting 21 Earth years and driving polar caps and atmospheric storms, both of which can be seen more clearly in this picture. The storm lies just below the southern edge of the broad white polar cap, appearing as a white wisp against the blue background.

Although Uranus takes 84 years to orbit the sun, it completes a rotation in just 17 hours, so its atmosphere and moons move visibly over the course of standard telescope exposures. Astronomers created the image above by combining long and short JWST exposures to smooth out the changing features.

Source: www.newscientist.com

NASA’s Webb and Hubble team up to capture the most vivid image of the universe

This panchromatic view of galaxy cluster MACS0416 was created by combining infrared observations from NASA’s James Webb Space Telescope with visible-light data from NASA’s Hubble Space Telescope. Credits: NASA, ESA, CSA, STScI, Jose M. Diego (IFCA), Jordan C.J. D’Silva (UWA), Anton M. Koekemoer (STScI), Jake Summers (ASU), Rogier Windhorst (ASU), Haojing Yan (University of Missouri)

The result is a vivid landscape of galaxies, along with more than a dozen newly found time-varying objects.

When the two flagship observatories come together, they reveal a wealth of new detail that is only possible through their combined power. Webb and Hubble collaborated on studying MACS0416, a galaxy cluster about 4.3 billion light-years from Earth. Combining their data yields a prismatic panorama of blues and reds, colors that provide clues to the galaxies’ distances. And while the images themselves are striking, researchers are already using the observations to fuel new scientific discoveries, such as identifying gravitationally lensed supernovae and even individual magnified stars.

This side-by-side comparison of galaxy cluster MACS0416, seen in optical light by the Hubble Space Telescope (left) and in infrared light by the James Webb Space Telescope (right), reveals different details. Both images show hundreds of galaxies, but the Webb image reveals galaxies that are invisible or only barely visible in the Hubble image, because Webb’s infrared vision can detect galaxies too distant or too dust-covered for Hubble to see. (Light from distant galaxies is redshifted by the expansion of the universe.) Webb’s total exposure time was about 22 hours, compared with 122 hours for the Hubble image. Credit: NASA, ESA, CSA, STScI

NASA’s Webb and Hubble space telescopes have combined to create the most colorful view of the universe yet. The James Webb Space Telescope and the Hubble Space Telescope teamed up to study the vast galaxy cluster known as MACS0416. The resulting panchromatic image combines visible and infrared light to assemble one of the most comprehensive views of the universe ever taken. Located approximately 4.3 billion light-years from Earth, MACS0416 is a pair of colliding galaxy clusters that will eventually merge to form an even larger cluster.

The image reveals a wealth of detail that is only possible by combining the power of both space telescopes. It includes an abundance of galaxies outside the cluster and a scattering of light sources that vary over time, most likely thanks to gravitational lensing, the distortion and amplification of light from distant background sources.

The galaxy cluster was the first of a series of unprecedented ultra-deep views of the universe from an ambitious collaborative Hubble program called Frontier Fields, launched in 2014. Hubble pioneered the search for some of the faintest and youngest galaxies ever detected; Webb’s infrared vision strengthens this deep look by reaching even further into the early universe.

This image of galaxy cluster MACS0416 highlights one particular gravitationally lensed background galaxy, which existed about 3 billion years after the Big Bang. That galaxy contains a transient object, which the science team nicknamed Mothra, whose brightness changes over time. Mothra is a star magnified by a factor of at least 4,000. The researchers believe Mothra is magnified not only by the gravity of galaxy cluster MACS0416, but also by an object known as a millilens that weighs about as much as a globular cluster. Credits: NASA, ESA, CSA, STScI, Jose M. Diego (IFCA), Jordan C.J. D’Silva (UWA), Anton M. Koekemoer (STScI), Jake Summers (ASU), Rogier Windhorst (ASU), Haojing Yan (University of Missouri)

Rogier Windhorst of Arizona State University, principal investigator of the PEARLS program (Prime Extragalactic Areas for Reionization and Lensing Science), which carried out the Webb observations, said: “We are building on Hubble’s legacy by pushing to greater distances and fainter objects.”

Understanding image color and scientific goals

To create the images, the shortest wavelengths of light were generally color-coded as blue, the longest wavelengths as red, and intermediate wavelengths as green. The wide range of wavelengths, from 0.4 to 5 microns, produces a particularly vivid galactic landscape.

These colors provide clues to the galaxies’ distances. The bluest galaxies, most readily detected by Hubble, are relatively nearby and often show intense star formation, while the redder galaxies, detected by Webb, tend to be more distant. Some galaxies appear very red because they contain large amounts of cosmic dust, which tends to absorb the bluer starlight.
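A minimal sketch of that color-coding scheme: each single-filter frame is assigned to a channel by wavelength, shortest to blue and longest to red. The arrays and filter wavelengths below are random stand-ins rather than real Webb or Hubble data, and a real pipeline would align, calibrate, and contrast-stretch each frame first.

```python
# Sketch of the wavelength-to-color coding described above.
# The frames are random stand-ins for calibrated telescope images.
import numpy as np

rng = np.random.default_rng(42)
H, W = 128, 128

# Three single-filter frames, ordered from short to long wavelength.
frame_short = rng.random((H, W))  # e.g. ~0.4 microns -> blue channel
frame_mid = rng.random((H, W))    # intermediate      -> green channel
frame_long = rng.random((H, W))   # e.g. ~5 microns   -> red channel

# Stack as (H, W, 3) in RGB order: the longest wavelength becomes red.
rgb = np.stack([frame_long, frame_mid, frame_short], axis=-1)
rgb = np.clip(rgb, 0.0, 1.0)  # ready to display as a color image
print(rgb.shape)  # (128, 128, 3)
```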

“Until we combine the Webb data with the Hubble data, we won’t get the full picture,” Windhorst said.

Scientific discoveries and the “Christmas Tree Galaxy Cluster”

The new Webb observations contribute to this aesthetic view, but they were taken for a specific scientific purpose. The research team combined three epochs of Webb observations, each taken weeks apart, with a fourth epoch from the CANUCS (Canadian NIRISS Unbiased Cluster Survey) research team. The goal was to search for objects whose observed brightness changes over time, known as transients.
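In its simplest form, a transient search like this subtracts aligned images from different epochs and flags pixels whose brightness changes significantly. The toy sketch below injects a fake transient and recovers it; a real pipeline would also handle image alignment, PSF matching, and proper noise modeling.

```python
# Toy transient detection: difference two aligned epochs and flag
# pixels that brightened or faded beyond a crude noise threshold.
import numpy as np

rng = np.random.default_rng(1)
epoch1 = rng.normal(1.0, 0.05, size=(64, 64))           # first observation
epoch2 = epoch1 + rng.normal(0.0, 0.05, size=(64, 64))  # weeks later
epoch2[20, 30] += 1.5                                   # inject a transient

diff = epoch2 - epoch1
threshold = 5 * diff.std()             # crude 5-sigma significance cut
candidates = np.argwhere(np.abs(diff) > threshold)
print(candidates)                      # -> [[20 30]]
```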

They identified 14 such transients across the field of view. Twelve of them are located in three galaxies that are highly magnified by gravitational lensing, and are likely to be individual stars or multiple-star systems that are briefly very highly magnified. The remaining two are in more moderately magnified background galaxies and are likely to be supernovae.

“We are calling MACS0416 the Christmas Tree Galaxy Cluster, both because it is so colorful and because of the flashing lights we find within it. We can see transients everywhere,” said Haojing Yan of the University of Missouri-Columbia, lead author of one paper describing the scientific results.

Among the transients the team identified, one in particular stood out. It is located in a galaxy that existed about 3 billion years after the Big Bang and has been magnified by at least 4,000 times. The research team nicknamed the system “Mothra” for its “monstrous nature” of being extremely bright and highly magnified. It joins another lensed star that researchers previously identified and named “Godzilla.” (Godzilla and Mothra are both giant monsters known as kaiju in Japanese movies.)

Interestingly, Mothra is also visible in Hubble observations taken nine years earlier. This is unusual, because magnifying a star this much requires a very specific alignment between the foreground galaxy cluster and the background star, and the relative motions of the star and the cluster should eventually have broken that alignment.

Source: scitechdaily.com