
Tinfoil Hat: Gaming Could "Accidentally" Lead to A.G.I. (Artificial General Intelligence), and Google & Amazon Know This.

I'm throwing spaghetti at the wall here.

Gaming doesn't seem like a big focus for either Google or Amazon, and why would it be, when they dominate in other services and hardware: robotics and A.I. (machine learning).

When we think of A.I. we usually think of robotics, but that industry costs a lot in R&D.

But in gaming, with A.I. being programmed and coded in the millions, it's possible that one day someone might stumble onto A.G.I.

Do you think gaming is a likely industry to discover A.I.? And separately, do you think Google and Amazon are smart enough to realize this and to establish their gaming platforms around it?
 

SJRB

Gold Member
Wild statement with zero scientific reasoning, lol.

But as to your question: no, gaming will not discover "AI" because as a whole videogame development is incredibly limited in scope and bound by comparatively low computational power.

Both Amazon and Google more likely entered the market because it is a multi-billion-dollar industry and they want in on that.
 
Companies realized a long time ago that they could use seemingly unrelated user inputs to train machine learning algorithms (e.g. CAPTCHA). I can see this being part of the rationale for moving gaming to the cloud, along with eliminating the cost of physical distribution. I'm sure a bunch of those gacha apps are doing the same, though they are driven more by maximizing profit than by creating better experiences.

Wild statement with zero scientific reasoning, lol.

But as to your question: no, gaming will not discover "AI" because as a whole videogame development is incredibly limited in scope and bound by comparatively low computational power.

Both Amazon and Google more likely entered the market because it is a multi-billion-dollar industry and they want in on that.

"Discover" in this case would be for some algorithm that is getting tweaked by constant exposure to human behavior and constant improvement by programmers to adapt to us to the extent that it appears life-like. Of course, there is not an algorithm like that, but with enough data and constant refinement it should be possible. Our brains weren't developed in a day, after all, but by millions of years of refinement of chemical algorithms stacked together.

With all the push by Google and others to move gaming to the cloud, those scope and power constraints would no longer apply if they succeeded on a large scale.
 

Tesseract

Banned
I mean... Maybe quantum computing a hundred years from now will change things but until then: No.
 
Interesting that you post this. I'm just familiarising myself with microtubules and how quantum fluctuations are harnessed for consciousness... so in a nutshell, no, your PS5 will not become self-aware.

It's not about your PS5 becoming self-aware, at least to me. It's about the cloud side of the PS20, 40-50 years from now, having gathered enough data and modifications to produce something that looks approximately "human". We only look approximately "human" anyway - not like there is a hard definition of it.

Maybe it just emerges on its own within that environment, like simple replicating compounds did under the right conditions on early Earth. All it takes is one momentary alignment to start removing entropy and building complexity.

You don't want that stuff in your navigational system, do you?

IDK, I'd love if my navigation system could safely drive me through walls.
 

Con-Z-epT

Live from NeoGAF, it's Friday Night!
When you compare an artificial intelligence that would be close to the performance of the human brain with the A.I. we have in gaming now, you can clearly see that this thought is not reasonable.

The main focus of A.I. in gaming is still entertainment: to make it somewhat believable, and it is mostly movement. It's not as if the NPCs talk to you on their own. Everything is prerecorded.


The only reasonable answer in my opinion is this:

But as to your question: no, gaming will not discover "AI" because as a whole videogame development is incredibly limited in scope and bound by comparatively low computational power.

Both Amazon and Google more likely entered the market because it is a multi-billion-dollar industry and they want in on that.
To add to this.

The amount of money videogame companies invest in A.I. for games is surely nothing compared to the financial efforts of Google or others.
 

Con-Z-epT

Live from NeoGAF, it's Friday Night!
Maybe it just emerges on its own within that environment, like simple replicating compounds did under the right conditions on early Earth. All it takes is one momentary alignment to start removing entropy and building complexity.
:messenger_clapping:

Well put!
 

Miles708

Member
The gaming world is surely already immensely interesting as a data-harvesting resource.
Not only personal data, but behavioural data coming from MMOs and online games. And, yes, also personal data (I'm fairly sure there's something about recording your voice for marketing and research purposes in some of the ToS nobody ever reads).

You can create, test and modify various scenarios and study the reactions of hundreds of thousands of people simply by adding a time-limited event.
You then have the aggregate behavioural data to make part of a piece of software perform more "convincingly" in a specific situation. I'm fairly sure that's already happening; this is nothing sci-fi (nor an "AI" in the strict sense).
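For a rough idea of what that looks like on the backend, here is a minimal sketch in Python, assuming a hypothetical event/control cohort split and made-up metric names; real telemetry pipelines are far bigger, but the aggregation logic is about this simple.

# Hypothetical sketch of "study reactions to a time-limited event": split players
# into cohorts, roll the event out to one cohort, compare aggregate behaviour.
# All names and numbers below are illustrative, not from any real service.
from statistics import mean

players = [
    {"id": 1, "cohort": "event",   "minutes_played": 95, "purchases": 2},
    {"id": 2, "cohort": "event",   "minutes_played": 80, "purchases": 1},
    {"id": 3, "cohort": "control", "minutes_played": 60, "purchases": 0},
    {"id": 4, "cohort": "control", "minutes_played": 55, "purchases": 1},
]

def summarise(cohort):
    group = [p for p in players if p["cohort"] == cohort]
    return {
        "avg_minutes": mean(p["minutes_played"] for p in group),
        "avg_purchases": mean(p["purchases"] for p in group),
    }

print("event:  ", summarise("event"))
print("control:", summarise("control"))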
 

V4skunk

Banned
Not for years.
They struggle to get AI to recognise letters and words.
There is a conspiracy theory that CAPTCHA is used to help develop AI recognition.
 
The idea of emergent consciousness is an interesting one but it could never happen within the context you speak of here because the infrastructure isn't designed for it.

Increasing complexity through the composition of arbitrary simple logical functions does not in all cases yield the kinds of emergent sophisticated behaviours we see in say cellular automata for example.

In the overwhelming majority of cases you'll simply get lots and lots of dumb simple stuff.

Also brains evolved over billions of years and are composed of radically complicated neurons, which themselves are evolutionary in many of their structural functions.

Without the right foundational building block structures I highly doubt digital consciousness could ever be emergent, because exactly none of the components of the foundational systems that underpin modern AI/ML efforts are evolutionary in their inception, outside of the algorithms and data models they produce.

EDIT: Also if you believe in Roger Penrose's line of thinking, then such apparatus may give rise to intelligent function without any form of sentience/intelligence to "live" and operate it (i.e. consciousness is non-computable and therefore may exist within a realm of biological quantum phenomena we are still yet to understand or conceive of).
 
The idea of emergent consciousness is an interesting one but it could never happen within the context you speak of here because the infrastructure isn't designed for it.

Increasing complexity through the composition of arbitrary simple logical functions does not in all cases yield the kinds of emergent sophisticated behaviours we see in say cellular automata for example.

In the overwhelming majority of cases you'll simply get lots and lots of dumb simple stuff.

Also brains evolved over billions of years and are composed of radically complicated neurons, which themselves are evolutionary.

Without the right foundational building block structures I highly doubt digital consciousness could ever be emergent, because exactly none of the components of the foundational systems that underpin modern AI/ML efforts are evolutionary in their inception, outside of the algorithms and data models they produce.

That's the point, though. All it took was that chance of RNA forming from some transient conditions in a pool a billion years ago, and then it built from there. It's a tiny random chance but with countless opportunities to occur. In computing that situation is even better for it to happen IMO once we have the right pieces. Not there yet, of course.
 
That's the point, though. All it took was that chance of RNA forming from some transient conditions in a pool a billion years ago, and then it built from there. It's a tiny random chance but with countless opportunities to occur. In computing that situation is even better for it to happen IMO once we have the right pieces. Not there yet, of course.
No because computing is made up of three rigid constructs: hardware, software and data.

Unlike natural systems and biological systems which have no fixed parameters, the exact inverse is true with modern computing, in that the only variable construct is data. With that you have no hope of ever evolving say an ML model for example past the narrow confines of its hardwired and hardcoded forms in terms of the hardware and software that make it up. Doesn't matter how much data you throw at it. It may evolve intelligent (more "complex") functions but they will not be anything more than lots and lots of functions, completely devoid of any ability to e.g. spontaneously spring themselves into action or e.g. a self-driving car evolving wings and flying.

The primordial basis for biological sentience was far more amorphous and more importantly structurally evolutionary, not just functionally evolutionary.
 
No because computing is made up of three rigid constructs: hardware, software and data.

Unlike natural systems and biological systems which have no fixed parameters, the exact inverse is true with modern computing, in that the only variable construct is data. With that you have no hope of ever evolving, say, an ML model past the narrow confines of its hardwired and hardcoded forms in terms of the hardware and software that make it up. It doesn't matter how much data you throw at it. It may evolve intelligent functions, but they will not be anything more than lots and lots of functions, completely devoid of any ability to e.g. spontaneously spring themselves into action, or e.g. a self-driving car evolving wings and flying.

The primordial basis for biological sentience was far more amorphous and more importantly structurally evolutionary, not just functionally evolutionary.

Biological systems have fixed parameters, as did the chemistry that started life. All matter and energy are just data objects in the end, with inherently fixed systems of interaction. The universe is not really different from a computer program, and thus neither are we. If you can model a protein folding, you can theoretically model a human being eventually. It's all the same thing, just at much different levels of detail.

So if there were widespread quantum computers and an environment where a theoretical program could interact with new data and adjust itself, why wouldn't "something" be able to emerge from it? RNA did not spontaneously turn into a brain - it took countless iterations to get there, building up from one nucleic acid forming, to a string of them polymerizing, to them forming catalysts, to them binding with peptides, and so on. So if you start out with something simple in the right environment (e.g. one Google or someone created for that purpose), I see no reason why the same couldn't occur. The iterations could happen much faster as well, speeding up the evolution.

And back to the topic, gaming is a great way for that program to learn about how to act human.
 

Tesseract

Banned
Not for years.
They struggle to get AI to recognise letters and words.
There is a conspiracy theory that CAPTCHA is used to help develop AI recognition.
some of that money was ported to OCRs and the like - hypothesis testing, optimization as reinforcement

it's not a conspiracy
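Roughly how that worked, as a minimal sketch: one word the system already knows gates the test, and consistent human answers on the word it can't read become OCR training labels. The image names, answers and agreement threshold below are made up for illustration.

# Sketch of the reCAPTCHA-style idea: the known word verifies the human, the
# unknown word harvests a label for OCR training. Everything here is hypothetical.
from collections import Counter

known = {"image_A": "street"}            # word the system can already read
unknown_votes = {"image_B": Counter()}   # human answers for the unreadable word

def submit_captcha(answer_known, answer_unknown):
    # Only trust the unknown answer if the known word was typed correctly.
    if answer_known.strip().lower() == known["image_A"]:
        unknown_votes["image_B"][answer_unknown.strip().lower()] += 1
        return True
    return False

submit_captcha("street", "bakery")
submit_captcha("street", "bakery")
submit_captcha("street", "bakerv")       # one noisy answer

label, count = unknown_votes["image_B"].most_common(1)[0]
if count >= 2:                           # simple agreement threshold
    print("new OCR training label for image_B:", label)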
 

Xenon

Member
Not sure about true AI. But I can see some sort of basic NPC logic tools emerging, like the physics middleware currently available, which could have other applications.

Personally I feel that actual AI will be born out of a smaller self replicating piece of code rather than a huge program. So it could come from gaming.
 

Sejan

Member
No. Even the best video game AIs are typically relatively primitive by design. Games are made to run on relatively weak hardware (consoles) that has to split its resources with much more intensive tasks such as video processing and physics engines. On top of that, games have to run their AI scripts for multiple instances of each enemy at once, so they can't be very resource-intensive at all. Game AIs are in no way comparable to those used for things such as self-driving cars, for example.

We are still a ways off from strong AI that could lead to a dystopian future. Even then, it is far more likely that it will come from a company actively pursuing strong AI than as a byproduct of game programming.

With that said, I don't think anybody truly even wants decent AI for enemies in their games. People are far more interested in the appearance of AI that can still be relatively easily overcome with even basic tactics. Even AIs that are lauded for being the best (in games like F.E.A.R.) run relatively simple scripts that merely appear tactical or intelligent at a glance. A powerful AI for controlling enemies would likely be perceived by most players as too difficult or somehow unfair. We tend to be much happier playing against enemies that mindlessly stick their heads out of cover for an easy headshot or seemingly never question why their buddy hasn't reported in for the past hour. I believe that investing game resources in significantly better AI would largely be a waste that simply wouldn't be worth the programming time or computational resources.
 
Yup. You cracked the code. Stadia exists so Google can overtake the world when the next Assassin's Creed accidentally creates sentient NPCs.
 

Rat Rage

Member
Yo, MegaZoneEX, if you don't already know it, go watch the Anime "Ghost in the Shell" (from 1995), you'll like it ;)
 

IntentionalPun

Ask me about my wife's perfect butthole
It's because they think eventually game streaming will be the norm, and they will be the leaders.

It's all just about trying to take a chunk of what Apple has: without doing much at all, Apple has the third-highest gaming revenue of any company in the world - more than Microsoft, almost as much as Sony. And they do it without devices contributing to that revenue, which is pretty incredible.

Gaming has been lighting up as an industry for a decade.
 
And back to the topic, gaming is a great way for that program to learn about how to act human.
No it's not.

All this shows is your severely limited understanding of what we call "AI" in videogames.

"AI" in videogames isn't actually AI at all. It does next to no learning of anything and in the vast majority of cases comprises simple pathfinding and simulated behavioural algorithms that haven't meaningfully changed in the past 30 years.

ML, DL, Neural Networks are the domain in which any meaningful computer intelligence might emerge, and these fields barely find any utilisation in videogames beyond limited examples like graphical tricks to improve image quality like DLSS, and offline procedural generation of art assets to help speed-up the art production workflow of the dev process.

The subroutines that determine enemy and NPC behaviour in games are simple non-learning functions that take in a limited number of inputs and spit out the exact same pre-canned behavioural outputs regardless of the contextual backdrop of the game state. It's not AI.
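To make the contrast concrete, here is a minimal sketch of the kind of non-learning, pre-canned NPC logic described above. The state fields and thresholds are hypothetical; the point is the structure - fixed rules over a few inputs, nothing trained, nothing remembered.

# Minimal sketch of typical game "AI": a fixed decision function, no learning.
# Field names and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class NPCState:
    health: float              # 0.0 to 1.0
    distance_to_player: float  # in metres
    can_see_player: bool

def npc_behaviour(state: NPCState) -> str:
    """Returns a pre-canned action; the same inputs always give the same output."""
    if state.health < 0.25:
        return "flee_to_cover"
    if state.can_see_player and state.distance_to_player < 10.0:
        return "attack"
    if state.can_see_player:
        return "move_toward_player"   # hand off to the pathfinding system
    return "patrol_waypoints"

print(npc_behaviour(NPCState(health=0.8, distance_to_player=5.0, can_see_player=True)))  # attack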
 
No it's not.

All this shows is your severely limited understanding of what we call "AI" in videogames.

"AI" in videogames isn't actually AI at all. It does next to no learning of anything and in the vast majority of cases comprises simple pathfinding and simulated behavioural algorithms that haven't meaningfully changed in the past 30 years.

ML, DL, Neural Networks are the domain in which any meaningful computer intelligence might emerge, and these fields barely find any utilisation in videogames beyond limited examples like graphical tricks to improve image quality like DLSS, and offline procedural generation of art assets to help speed-up the art production workflow of the dev process.

The subroutines that determine enemy and NPC behaviour in games are simple non-learning functions that take in a limited number of inputs and spit out the exact same pre-canned behavioural outputs regardless of the contextual backdrop of the game state. It's not AI.

You aren't understanding - player interaction with game systems could be used for machine learning, just like machine learning can be applied to other human behavior.

I'm not saying that an existing scripted NPC will turn into an AI - I'm saying that a machine learning system applied to player actions and interactions in games could theoretically begin to create an AI over time. Games offer varied environments to look at human behavior.

So let's say that basically all games are a cloud service in 10 years. Your inputs are analyzed on a Google/Amazon/etc. server during games and are used to improve NPC behavior. Over time, the system will be updated with new functions added to it. That combination of learning human behavior + updates will improve how "human" it acts. What happens after a decade or two of constant refinement and adaptation? I'd say it would be as "alive" as any insect, at the least.
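A minimal sketch of the loop being described, assuming a hypothetical telemetry store and made-up state/action labels: log what players do server-side, periodically aggregate it, and ship the aggregate back as updated NPC behaviour. None of this is any real platform's API.

# Hypothetical cloud-side loop: record player actions, then periodically refit a
# crude "imitate the players" table that gets pushed back to NPCs in an update.
from collections import defaultdict

telemetry_log = []  # stand-in for a cloud data store

def record_event(player_id, game_state, action):
    telemetry_log.append({"player": player_id, "state": game_state, "action": action})

def refit_behaviour_table(events):
    """Crude model: the most common human action for each coarse game state."""
    counts = defaultdict(lambda: defaultdict(int))
    for e in events:
        counts[e["state"]][e["action"]] += 1
    return {state: max(actions, key=actions.get) for state, actions in counts.items()}

# Simulated traffic from a couple of players.
record_event(1, "low_health_near_cover", "hide")
record_event(2, "low_health_near_cover", "hide")
record_event(1, "full_health_enemy_spotted", "push")

npc_policy = refit_behaviour_table(telemetry_log)
print(npc_policy)  # {'low_health_near_cover': 'hide', 'full_health_enemy_spotted': 'push'}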

The neat thing would be that the genre of game can lead to different information. I'd say in the next 5-10 years there will be a proliferation of VR life sims. What happens as developers try to make a reactive, realistic VR friend? A reactive, realistic VR parent? A reactive, realistic VR girlfriend?
 

Arkam

Member
No it's not.

All this shows is your severely limited understanding of what we call "AI" in videogames.

"AI" in videogames isn't actually AI at all. It does next to no learning of anything and in the vast majority of cases comprises simple pathfinding and simulated behavioural algorithms that haven't meaningfully changed in the past 30 years.

ML, DL, Neural Networks are the domain in which any meaningful computer intelligence might emerge, and these fields barely find any utilisation in videogames beyond limited examples like graphical tricks to improve image quality like DLSS, and offline procedural generation of art assets to help speed-up the art production workflow of the dev process.

The subroutines that determine enemy and NPC behaviour in games are simple non-learning functions that take in a limited number of inputs and spit out the exact same pre-canned behavioural outputs regardless of the contextual backdrop of the game state. It's not AI.

Bro, you might want to look at yourself here. So many people are using ML for game AI right now. In fact, you will see a major racing franchise utilize it THIS year. The NPC isn't learning/adapting, obviously. The sessions are logged in the cloud, patterns are observed, and the NPC's "AI" is updated.
 
You are right and wrong. It is the perfect sandbox to develop/train agents and provide truly emergent gameplay. However, Amazon and Google are failing (see Crucible and Stadia, for example) while Microsoft and OpenAI get it (see Project Malmo and Dota, for example). Graphics trump AI in triple-A gaming (see the hype for ray tracing vs. RTS development).
 

KungFucius

King Snowflake
Accidentally, as in randomly emerging? No fucking way. Purposely, as in big data companies running experiments through games to train and improve AI until it goes all fucking Skynet? Maybe.

It isn't even clear to me how an emergent AI could sustain itself. Life developed the means to sustain itself before intelligence emerged.
 

Griffon

Member
edit: nvm that was mean and pedantic of me.

Better answer: no OP, this isn't possible. It's as if you were asking if your toaster oven could become self aware.
 
Absolutely OP.

Game worlds give researchers a perfect little simulation in which to train agents. Reinforcement learning is incredibly effective for training AI to maximize a defined reward function for a given problem domain/environment. OpenAI's work on Dota (OpenAI Five) and DeepMind's god-like chess AI (AlphaZero) are perfect examples of applying this technique.

The key is defining a good reward function and a network structure to optimize parameters for. Unfortunately, we are a long way off from formulating a reward function to evaluate and ultimately reward "believable" AI concerning interactions with players in the game world (i.e. something that passes the Turing Test under constraints). With the rise of generative adversarial networks, this doesn't seem like quite the pipe dream some are making it out to be...
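For a concrete (if toy-sized) picture of "define a reward function and optimize against it", here is a minimal tabular Q-learning sketch on a five-state corridor. It is nowhere near OpenAI Five or AlphaZero in scale or method, but the reward-driven learning loop is the same basic idea.

# Tabular Q-learning on a toy corridor: the agent only ever sees states, actions
# and a reward signal, and still learns the policy "walk right to the goal".
import random

N_STATES, GOAL = 5, 4            # states 0..4, reward given at state 4
ACTIONS = [+1, -1]               # step right or left
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(300):
    s = 0
    while s != GOAL:
        # epsilon-greedy action selection
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == GOAL else 0.0   # the reward function
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s_next, b)] for b in ACTIONS) - Q[(s, a)])
        s = s_next

# Learned greedy policy per state (should be +1, i.e. "move right", everywhere).
print([max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)])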
 

Self

Member
It isn't even clear to me how an emergent AI could sustain itself. Life developed the means to sustain itself before intelligence emerged.

True. That's the reason why A.I. is a misnomer. It should be called "Artificial Unintelligence", because it's the lack of intelligence that determines their behavior.

Meanwhile I just enjoy A.U. for what it is - right, Cyberpunk?!
 

ripeavocado

Banned
I'm throwing spaghetti at the wall here.

Gaming doesn't seem like a big focus for either Google or Amazon, and why would it be, when they dominate in other services and hardware: robotics and A.I. (machine learning).

When we think of A.I. we usually think of robotics, but that industry costs a lot in R&D.

But in gaming, with A.I. being programmed and coded in the millions, it's possible that one day someone might stumble onto A.G.I.

Do you think gaming is a likely industry to discover A.I.? And separately, do you think Google and Amazon are smart enough to realize this and to establish their gaming platforms around it?

Bro, life/science is not like a poorly written sci-fi movie.
 
You aren't understanding - player interaction with game systems could be used for machine learning, just like machine learning can be applied to other human behavior.

I'm not saying that an existing scripted NPC will turn into an AI - I'm saying that a machine learning system applied to player actions and interactions in games could theoretically begin to create an AI over time. Games offer varied environments to look at human behavior.

So let's say that basically all games are a cloud service in 10 years. Your inputs are analyzed on a Google/Amazon/etc. server during games and are used to improve NPC behavior. Over time, the system will be updated with new functions added to it. That combination of learning human behavior + updates will improve how "human" it acts. What happens after a decade or two of constant refinement and adaptation? I'd say it would be as "alive" as any insect, at the least.

The neat thing would be that the genre of game can lead to different information. I'd say in the next 5-10 years there will be a proliferation of VR life sims. What happens as developers try to make a reactive, realistic VR friend? A reactive, realistic VR parent? A reactive, realistic VR girlfriend?

Ok this clarifies your position a little better, but I still think you're vastly overstating what you're calling "human behavior" in gaming; in terms of the player data to be analyzed.

Consider for a moment that games at their fundamental level are simple systems whose demand of the player is a relatively miniscule selection of inputs from a controller input device. At best, the most complex NN learning from this data would only learn how humans play within the rulesets of the systems defined by the selection of games available in the training data. This is essentially a microcosm of human behavior, focused in a fairly narrow and abstract domain; a far cry from anything that might lead to the emergence of general intelligence.

For the most state-of-the-art NNs to generate something approaching general intelligence from human behavior, the pool of training data needs to encompass a waaaay broader spectrum of actual human behavioral data. Even the best data repositories we have, e.g. the internet at large and in particular social media, represent only a simulacrum of the totality of human behavior. Gaming is not even close to social media in terms of its ability to reflect who and what we as intelligent beings are.

Bro, you might want to look at yourself here. So many people are using ML for game AI right now. In fact, you will see a major racing franchise utilize it THIS year. The NPC isn't learning/adapting, obviously. The sessions are logged in the cloud, patterns are observed, and the NPC's "AI" is updated.

Not at all. The example you cite is Drivatars in a single racing franchise. Even the NNs used for this are rudimentary in comparison to the actual state of the art in machine and deep learning.

The vast majority of "AI" in games isn't neural-network based. It's not even close. So my point stands.
 
Ok this clarifies your position a little better, but I still think you're vastly overstating what you're calling "human behavior" in gaming; in terms of the player data to be analyzed.

Consider for a moment that games at their fundamental level are simple systems whose demand of the player is a relatively miniscule selection of inputs from a controller input device. At best, the most complex NN learning from this data would only learn how humans play within the rulesets of the systems defined by the selection of games available in the training data. This is essentially a microcosm of human behavior, focused in a fairly narrow and abstract domain; a far cry from anything that might lead to the emergence of general intelligence.

For the most state-of-the-art NNs to generate something approaching general intelligence from human behavior, the pool of training data needs to encompass a waaaay broader spectrum of actual human behavioral data. Even the best data repositories we have, e.g. the internet at large and in particular social media, represent only a simulacrum of the totality of human behavior. Gaming is not even close to social media in terms of its ability to reflect who and what we as intelligent beings are.

The possible input into machine learning isn't just controller inputs - it's movement relative to location, correlation of actions, how low health correlates with player behavior, etc. And like I said, I can see different genres providing different information. Imagine if you start putting odd games into that system, like SpaceChem. That's where the different perspectives into human behavior and thinking come from.
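As a minimal sketch of that idea - deriving behavioural features from gameplay logs rather than feeding raw button presses to a learner - here is one possibility, assuming a made-up per-tick log format with position, health and a named action; no real game exposes exactly this.

# Hypothetical feature extraction from gameplay telemetry: distance travelled and
# how strongly low health correlates with retreating. All field names are invented.
def extract_features(frames):
    """frames: chronological list of dicts with x, y, health and action per tick."""
    low_health_ticks = 0
    low_health_retreats = 0
    distance_moved = 0.0
    for prev, cur in zip(frames, frames[1:]):
        dx, dy = cur["x"] - prev["x"], cur["y"] - prev["y"]
        distance_moved += (dx * dx + dy * dy) ** 0.5
        if cur["health"] < 0.3:
            low_health_ticks += 1
            if cur["action"] == "retreat":
                low_health_retreats += 1
    return {
        "distance_moved": distance_moved,
        "retreat_rate_when_hurt": low_health_retreats / max(low_health_ticks, 1),
    }

frames = [
    {"x": 0, "y": 0, "health": 1.0, "action": "push"},
    {"x": 3, "y": 4, "health": 0.2, "action": "retreat"},
    {"x": 3, "y": 9, "health": 0.2, "action": "retreat"},
]
print(extract_features(frames))  # {'distance_moved': 10.0, 'retreat_rate_when_hurt': 1.0}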
 
The possible input into machine learning isn't just controller inputs - it's movement relative to location, correlation of actions, how low health correlates with player behavior, etc. And like I said, I can see different genres providing different information. Imagine if you start putting odd games into that system, like SpaceChem. That's where the different perspectives into human behavior and thinking come from.
Again, the data is still biased by the rulesets of the systems available in the gaming training data.

It's still a microcosm of human behavior, because the full spectrum of human behavior isn't anywhere close to being encapsulated within the confines of the complete collection of every gaming genre that has ever existed since the inception of gaming as a pastime.

There's a whole host of human behavior that simply isn't expressed in games, because games that would require the expression of those behaviors haven't been invented yet, or never will be, either because of gaming-interface limitations or because there wouldn't be a market for such gaming experiences. Gaming is still fundamentally a commercially driven industry, and so the types of experiences available to players are fundamentally limited by the commercial interest of selling games as products.
 