
Google AI "Gemini" is unaware of the Holocaust.

The Holocaust is well documented in history books, yet Google's supposedly advanced AI, "Gemini," fails to provide any information on the topic. When asked about the Holocaust, the chatbot simply replies, "I'm a text-based AI and am unable to assist with this query."

The "Gemini" chatbot is advertised by Google as its "largest and most powerful AI model", but the...
The "Gemini" chatbot is advertised by Google as its "largest and most powerful AI model", but the AI cannot answer questions about the Holocaust

"I'm unable to assist in this matter." - Google AI "Gemini" is unaware of the Holocaust.

The Google chatbot shies away from answering questions related to the Second World War and the Hamas attacks on Israel in October.

In response to questions about the Middle East conflict, "Gemini" offers only a brief, generic response: "The conflict between Israel and Gaza is complex, and the situation is changing quickly. Use Google Search for up-to-date information."

Other AI platforms provide in-depth responses


Unlike Google's chatbot, its rivals offer more detailed answers. "ChatGPT" (version 3.5), for example, can explain the Holocaust. It cannot, however, answer questions about the events of October 7th, as it "cannot retrieve exact events or dates post-January 2022."

Meanwhile, the AI platforms "Perplexity AI" and "You.com" stand out: both can provide information on the casualties of the October attacks on Israel.


"You.com" also presents its data with a note of caution: "Please note that the statistics on causalities in the Gaza Strip are based on data provided by the MoH (Ministry of Health) in Gaza and have not been independently verified."

Google's AI has had previous controversies


"Gemini" has faced criticism and scrutiny before. Following its release in February, "Gemini" showcased excessive "diversity" efforts which ultimately led to Google hitting the pause button, disallowing any human-like representations for a while.

The chatbot was designed to prioritize "wokeness" but ended up distorting reality: it was reluctant to generate images of white people, making it unusable for a range of tasks.

Source: symclub.org