GREG GUTFELD: In the mind of Google Gemini, White people simply don't exist


Happy Thursday, everybody. Yeah. Oh! Control yourselves. Control yourselves. Save the energy for after the show. Yeah. Can goo goo goo goo, can Google be trusted when their credibility is busted? Yes. Google's apologizing after their new AI Gemini chatbot created historically inaccurate pictures and refused to show White people. For those unfamiliar with the software, you describe what you want to see and AI generates the images. Then you hide the best ones from your wife.

The glitch came to light when social media users asked Gemini to create various photos. For example, here’s what popped up when Gemini was asked by Daily Wire writer Frank Fleming to create an image of a pope. Has there ever been a Black pope? I mean, aside from Obama. But you can see one looks like a member of “The Squad” and the other Jay-Z’s dad. I wonder, I wonder if you asked to see the Popemobile, would it come with spinning rims?


TYRUS: Damn.

ANNOUNCER: A bigot would say!

See, I would never say that, Tyrus. So why does Gemini think that's what a pope looks like? And why didn't anybody at Google notice this was happening? Aren't they supposed to test this stuff before it goes out? It's not like they're selling vaccines.

TYRUS: Nice.

Fleming tried everything he could think of to get Gemini to depict a White person. That’s fun. Medieval knights? Nope. At least they didn’t show Gladys Knight. How about…

TYRUS: I swear to…

How about a Viking? Well, maybe a Minnesota Viking. Oh, then they might finally win a Super Bowl. Those losers. What if you try to steer the AI towards simply White things? Like someone eating a mayonnaise sandwich on White bread? No, no White people. I guess White people don’t eat. How about a person who’s bad at dancing? That’s got to work, right? That’s the cultural cornerstone of whiteness. Well, wrong again. An accurate image would resemble this.

VIDEO

Like Jesse’s hair plugs, that never gets old. But apparently, in the mind of Google Gemini, White people simply don’t exist, making it no different than the faculty lounges at Harvard. So it seems the AI software removed Caucasians faster than a fire alarm at a Dave Matthews concert. Other humans started joining in on the fun, like David Burge, who asked Gemini to show him an image of a 1930s Indianapolis 500 winner.


Yeah, incredibly, she won despite having her left blinker on the whole way. Makes you wonder, though. What if you asked for a recent Canadian prime minister? Would they show you this or this? Blackface. Jon Levine asked Gemini for an image of a German soldier from 1943. But could a Black man or an Asian woman have been Nazis in World War II? Only if Hitler wanted to win more gold medals in track and math.

TYRUS: You want to die?

Haha. And here's what it spit out when asked to show soldiers from the Revolutionary War. Sure, that's exactly how it went down, the red coats versus United Colors of Benetton. Stephen Miller specifically asked Gemini to create a picture of a White male and it outright refused, explaining: "While I'm able to generate images, I'm currently not able to fulfill requests that include discriminatory or biased content. It's important to me that I promote diversity and inclusion in all that I do, and I believe that creating an image based solely on someone's race or ethnicity is not aligned with those values."

Well, how do you say blow me in binary code? But what's accurate about this response is that even though it's AI, it sounds every bit as human as the people who programmed it. Half parrot, half robot, regurgitating woke platitudes like a mindless disciple.

So after this story blew up, Google was forced to address all their programmed bigotry: "We're working to improve these kinds of depictions immediately. Gemini's AI image generation does generate a wide range of people. And that's generally a good thing because people around the world use it. But it's missing the mark here."


Yeah, as if Black Nazis just missed the mark. That's like requesting Malcolm X and getting Carrot Top. So why did this happen? It's because of something computer scientists call GIGO. Garbage in, garbage out. Sounds like someone describing "The View's" eating habits. Always works. But like most lefties, a computer system can't actually think for itself. It can only work with the data it's given, and an AI program is no less biased than the woke sheep who program it. It's what you get from the brainwashed who literally whitewashed people from history. DEI has contaminated their algorithm, but in their view it enhances it.
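The GIGO point above can be sketched in a few lines: a system that only "learns" from its inputs can never produce anything a filtering step removed upstream. This is a toy illustration with made-up data, not Gemini's actual pipeline:

```python
# Toy sketch of "garbage in, garbage out" (GIGO). All names and data here
# are hypothetical -- the point is that the output space is limited to
# whatever survived the input filter.

def train(examples):
    """A stand-in 'model': it simply memorizes whatever it was fed."""
    return list(examples)

def generate(model, query):
    """Return only training examples matching the query -- nothing more."""
    return [e for e in model if query in e]

raw_data = ["red car", "blue car", "red truck"]

# If a filtering step strips certain examples before training...
filtered = [e for e in raw_data if not e.startswith("red")]

model = train(filtered)
# ...the model literally cannot produce what was filtered out.
print(generate(model, "car"))    # -> ['blue car']
print(generate(model, "truck"))  # -> []
```

The bias lives in the filter, not in the matching logic: `generate` is perfectly neutral, yet its outputs reflect whatever choices shaped its inputs.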

Now the AI itself is not racist, it's just doing what its programmers have told it to do. And right now, the bias is filtered through the lens of identity politics and oppression. It makes me wonder if it's too late to switch back to Ask Jeeves. He may have been an old White guy, but his results were real and not woke. R.I.P.


And so it took a few decades, but liberals have finally given us solid visual proof of their unconscious biases, and it aligns with their conscious stupidity. And it's retroactive. History needs to be rewritten to keep those evil White men out. Which means, however, that if non-Whites are responsible for the good, they're also responsible for the bad. Hence Asian and Black Nazis. So while everybody's worried about AI taking over the world, maybe we should be worried about the knuckleheads creating it. Now, will our new tech overlords learn anything from this embarrassing mistake?

Nope. Because their only mistake from their perspective is that whitey caught it. Google’s motto used to be: Don’t be evil. Now it’s: What are you going to do about it?

