
Google released another AI banger

SJRB

Gold Member
They inject """"AI"""" into their search engine and it goes as expected.


[Screenshots of Google AI Overview answers]



In the last one it based its recommendation on an article scraped from The Onion, which, as we all know, is a satirical website. But because these morons have such a poor grasp on their models and deploy their trash as soon as possible, the models are completely unfit to deal with satire, comedy or sarcasm, and just present everything as cookie-cutter fact.

With Google, Microsoft and all these tech juggernauts stumbling over each other to push their unwanted and extremely poorly optimized "AI" slop to production, we're in for a wild ride.
 
Last edited:

nush

Member
How do I get this to work for me? I want to see some istaphobic suggestion from Google.
 

winjer

Gold Member
This just serves to show that AI still isn't intelligence.
It's just probabilistic, using large databases.
In reality, those models don't know the true meaning of what they are saying, just the likelihood of the next word.
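
For illustration, here's a toy sketch of what "just the likelihood of the next word" means in practice. It's a hand-rolled bigram counter with made-up training text, nothing like a real transformer, but the principle is the same: pick whatever is statistically likely, with no notion of truth.

```python
# Toy sketch, not a real language model: a bigram counter that "predicts" the
# next word purely from how often it followed the previous word in the data.
from collections import Counter, defaultdict
import random

training_text = (
    "the cheese slides off the pizza because the cheese is not glued to the pizza"
)

# Count which word tends to follow which.
follows = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    counts = follows[prev]
    # Sample proportionally to frequency: likelihood, not understanding.
    return random.choices(list(counts), weights=list(counts.values()))[0]

print(next_word("the"))  # "cheese" or "pizza", whichever was seen more often
```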
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
Have we ever had something as simultaneously mind-blowingly impressive and embarrassingly awful as the current slate of AI trends?
 

Hudo

Member
""AI"" itself is not the problem, the problem is taking the entirety of the internet as training data. 90% of the publicly written stuff is just fucking garbage. So you end up with shit like this.

The only reasonable way to combat this is to develop "common sense" reasoning skills for LLMs, which is already a branch of active research. But it'll take a while.
 
Last edited:

JBat

Gold Member
Google seems to be scrubbing some stuff that has gotten attention. Or I'm using it wrong. I couldn't make it tell me to add glue to my pizza, eat rocks, or that Wario is a trans drag queen, but Pokemon are still gay. The language it spits out is hilarious to me. You can tell it was pulled from someone's headcanon.
[Screenshot of the AI answer]
 

JayK47

Member
AI is just adding to all the misinformation out there. It is not differentiating between fact and fiction, just pulling from the entire online cesspool.
 

Mr Reasonable

Completely Unreasonable
As if the internet wasn't trustworthy enough, I'm gonna have to go to my colleagues and get them to overhaul this. My apologies everyone.
 

StreetsofBeige

Gold Member
And people think AI is going to take over the world. lol

Then again, there will be some people swapping out cereal for a rock at breakfast. lol. But I'm more of a mid-sized rocker.
 
Last edited:

diffusionx

Gold Member
Google pays some of the most intelligent and advanced data scientists millions of dollars a year in salary to develop their AI, and this is what it does, and I am supposed to be afraid of an AI future?
 

Kilau

Member
Maybe AI will kill us all more subtly than using nukes or robot armies. It’s quickly picked up how fucking stupid people are and will just let us do it ourselves with tiny nudges.
 

AlphaDump

Gold Member
This just highlights the fact that AI solutions use the data lake method to get you information, and it is quite easy to poison that well.
 
It is hilarious how inept Google is with AI. Now they just need to run their only source of revenue - search - into the ground :messenger_tears_of_joy:

Google pays some of the most intelligent and advanced data scientists millions of dollars a year in salary to develop their AI, and this is what it does, and I am supposed to be afraid of an AI future?
Maybe those people are just not as intelligent as we believe. Not to mention the folks who are driven by ideology.
 
Last edited:

Drew1440

Member
There are probably a lot of restrictions placed on the AI so it can avoid unconscious bias and answers that could be perceived as problematic, with the end result being that it pulls sources from several years ago.
 
Doesn't their "AI" use Reddit as one of the sources for training? Many of the large subreddits are spammed with numerous bot accounts. It's like having it learn from YouTube comments LMAO. Bots learning from bots.
 

StreetsofBeige

Gold Member
Google pays some of the most intelligent and advanced data scientists millions of dollars a year in salary to develop their AI, and this is what it does, and I am supposed to be afraid of an AI future?
I have no idea how AI algorithms work, but I assumed it has to do with scraping tons of content off the net and outputting an answer based on consensus answers or some shit. If most people in the world say 1+1 = 2, then that's the answer.

But what if everyone and every math/science site trolled the net and said 1 + 1 = 3, would AI be smart enough to know that is wrong, or just go with it since that's the data scrape?

Who knows. But it's obvious AI can be dumb as rocks, so whatever it's doing to formulate an answer can be garbage.
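
For what it's worth, that failure mode looks roughly like this in toy form: a dumb majority vote over scraped pages. The numbers are made up and real systems aggregate in far more complicated ways, but the poisoning problem is the same.

```python
# Toy sketch: an "answer" chosen purely by majority vote over scraped pages.
# There is no arithmetic check, so a coordinated troll campaign flips the answer.
from collections import Counter

honest_scrape = ["1+1=2"] * 950 + ["1+1=3"] * 50
trolled_scrape = ["1+1=2"] * 300 + ["1+1=3"] * 700  # made-up troll campaign

def consensus_answer(scraped_pages):
    # Return whatever most pages said, right or wrong.
    return Counter(scraped_pages).most_common(1)[0][0]

print(consensus_answer(honest_scrape))   # 1+1=2
print(consensus_answer(trolled_scrape))  # 1+1=3  <- goes with the poisoned scrape
```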
 
Last edited:

diffusionx

Gold Member
I have no idea how AI algorithms work, but I assumed it has to do with scraping tons of content off the net and outputting an answer based on consensus answers or some shit. If most people in the world say 1+1 = 2, then that's the answer.

But what if everyone trolled the net and said 1 + 1 = 3, would AI be smart enough to know that is wrong, or just go with it since that's the data scrape?

Who knows. But it's obvious AI can be dumb as rocks, so whatever it's doing to formulate an answer can be garbage.
More or less. When people say they are training AI, they are basically feeding that info into it and telling it what it is. Like, you know when we did those captchas like "click on all the pictures of a bicycle"? We were telling their AI what a bicycle looked like.
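
Something like this, in toy form (a nearest-neighbour stand-in, not anything resembling Google's actual training pipeline; the feature vectors and labels are invented for the example):

```python
# Toy sketch: human captcha clicks become labeled examples, and "training" just
# means fitting a model that reproduces those labels. The humans supply the
# knowledge of what a bicycle is; the machine only copies the pattern.
labeled_examples = [
    # (made-up feature vector [has_two_wheels, has_pedals, has_engine], human label)
    ([1, 1, 0], "bicycle"),
    ([1, 0, 1], "motorcycle"),
    ([0, 0, 1], "car"),
]

def classify(features):
    # Pick the label of the closest labeled example (squared distance).
    def dist(example):
        return sum((a - b) ** 2 for a, b in zip(features, example[0]))
    return min(labeled_examples, key=dist)[1]

print(classify([1, 1, 0]))  # "bicycle" -- only because humans labeled it first
```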

But this is the point. AI is just a machine. It's not thinking, it has no soul, it can't create anything new; it's basically a glorified search tool. But as we are seeing, even the preeminent search company on earth can't make an AI that is better than their search tool (which is also way worse than it used to be). Even worse, it has the idiocies of the ruling class permanently coded into it, as we saw with Google's AI drawing tool (which, btw, still hasn't been turned on again).

Doesn't their "AI" use Reddit as one of the sources for training? Many of the large subreddits are spammed with numerous bot accounts. It's like having it learn from YouTube comments LMAO. Bots learning from bots.

IIRC Google paid $60 million to be able to use it LOL.
 
Last edited:

Romulus

Member
This just serves to show that AI still isn't intelligence.
It's just probabilistic, using large databases.
In reality, those models don't know the true meaning of what they are saying, just the likelihood of the next word.


Google AI is a horrible example to judge AI by. To me, it's basically regular Google, but almost worse, because when I use a regular search engine at least I can scan for the right answer.

But I would argue many humans are similar, regurgitating shit they've read or heard with no actual experience or understanding of what they're even saying.
 
Last edited:

StreetsofBeige

Gold Member
More or less. When people say they are training AI, they are basically feeding that info into it and telling it what it is. Like, you know when we did those captchas like "click on all the pictures of a bicycle"? We were telling their AI what a bicycle looked like.

But this is the point. AI is just a machine. It's not thinking, it has no soul, it can't create anything new; it's basically a glorified search tool. But as we are seeing, even the preeminent search company on earth can't make an AI that is better than their search tool (which is also way worse than it used to be). Even worse, it has the idiocies of the ruling class permanently coded into it, as we saw with Google's AI drawing tool (which, btw, still hasn't been turned on again).
That's retarded, but it makes sense that's how AI is done.

But the whiz-bang aura of it makes it sound like the tech company has a bunch of coders who make an AI program, unleash the tool, and then it does its own thing, like Skynet learning and doing its own shit.

AI doesn't sound too magical anymore if it's just tons of programmers constantly changing it like any other software program.
 

diffusionx

Gold Member
That's retarded, but it makes sense that's how AI is done.

But the whiz-bang aura of it makes it sound like the tech company has a bunch of coders who make an AI program, unleash the tool, and then it does its own thing, like Skynet learning and doing its own shit.

AI doesn't sound too magical anymore if it's just tons of programmers constantly changing it like any other software program.

Obviously AI has automatic info gathering and processing capabilities. Programmers are not coding every single answer into it. But in the end it is just processing tons of data really quickly as well as parsing natural language prompts. It's an incredible technology and could be a great tool but that's all it is and all it ever will be.
 
Last edited: