Google ‘working to fix’ AI picture bot after inaccuracy row

Google has said it is working to fix its new AI-powered image generation tool, after users claimed it was creating historically inaccurate images in an attempt to over-correct for long-standing racial bias problems within the technology (Tim Goode/PA)

Google has said it is working to fix its new AI-powered image generation tool, after users claimed it was creating historically inaccurate images in an attempt to over-correct for long-standing racial bias problems within the technology.

Users of the Gemini generative AI chatbot have claimed that the app generated images showing a range of ethnicities and genders, even when doing so was historically inaccurate.

Several examples have been posted to social media, including where prompts to generate images of certain historical figures – such as the US founding fathers – returned images depicting women and people of colour.

Google has acknowledged the issue, saying in a statement that Gemini’s AI image generation purposefully produces a wide range of people because the tool is used by people around the world and that should be reflected, but admitting the tool was “missing the mark here”.

“We’re working to improve these kinds of depictions immediately,” the company’s statement, posted to X, said.

“Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Jack Krawczyk, senior director for Gemini experiences at Google, said in a post on X: “We are aware that Gemini is offering inaccuracies in some historical image generation depictions, and we are working to fix this immediately.

“As part of our AI principles, we design our image generation capabilities to reflect our global user base, and we take representation and bias seriously.

“We will continue to do this for open-ended prompts (images of a person walking a dog are universal!).

“Historical contexts have more nuance to them and we will further tune to accommodate that.”

He added that it was part of the “alignment process” of rolling out AI technology, and thanked users for their feedback.

Some critics have labelled the tool “woke” in response to the incident, while others have suggested Google has over-corrected in an effort to avoid repeating previous incidents involving artificial intelligence, racial bias and diversity.

There have been several examples in recent years involving technology and bias, including facial recognition software struggling to recognise, or mislabelling, black faces, and voice recognition services failing to understand accented English.

The incident comes as debate around the safety and influence of AI continues, with industry experts and safety groups warning AI-generated disinformation campaigns will likely be deployed to disrupt elections throughout 2024, as well as to sow division between people online.