
Google has introduced new safeguards against misinformation ahead of this year's elections around the world.
Photo: Hannah McKay (Reuters)

Google announced Tuesday that it has restricted the types of election-related questions its Gemini chatbot will answer for users in the U.S. and India.

The restrictions are part of a range of steps the company has taken to safeguard its services from misinformation as millions of Indian citizens prepare to vote in a general election this spring.

“Out of an abundance of caution on such an important topic, we have begun to roll out restrictions on the types of election-related queries for which Gemini will return responses,” the tech giant wrote in a blog post. “We take our responsibility for providing high-quality information for these types of queries seriously, and are continuously working to improve our protections.”

Google did not immediately respond to a request for comment from Quartz.

CNBC reports that the restrictions were also rolled out in the U.S., where voters are currently participating in presidential primary elections.

“As we shared last December, in preparation for the many elections happening around the world in 2024 and out of an abundance of caution, we’re restricting the types of election-related queries for which Gemini will return responses,” a company spokesperson told CNBC.

This isn’t Gemini’s first restriction

The news comes just weeks after Google was forced to pause its AI model from generating images of people, after users found it was producing historically inaccurate and sometimes offensive images.

“This wasn’t what we intended,” Google said in a blog post in February. “So we turned the image generation of people off and will work to improve it significantly before turning it back on.”

Even Google cofounder Sergey Brin said that Google “definitely messed up on the image generation,” and that “it was mostly due to not thorough testing.”
