
ChatGPT and the (near) future of AI

8BiTw0LF

Banned








What's your guys' take on AI going forward? I believe it's inevitable we'll see a major change in how we live in the future. AI (and robots) will account for 80% of all jobs - the last 20% will be humanitarian jobs that need to be run by humans who can reflect realistically on the world. In regards to safety I believe we'll see an "API" that has to be implemented so the AI doesn't go rogue - but the world has to agree on it - and that's the tricky part. We could see forms of AI running wild like computer viruses.
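To make the "API" idea a bit more concrete, here's a minimal hypothetical sketch - everything in it (the `SafetyGate` name, the risk scores) is made up for illustration - of a mandatory layer where every action an AI proposes has to pass a policy check before it runs:

```python
# Hypothetical sketch of a mandatory "safety API": every action an AI agent
# proposes must pass an agreed-upon policy check before it is executed.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    risk: float  # 0.0 (harmless) .. 1.0 (dangerous), assigned by some reviewer

class SafetyGate:
    """Refuses any action whose estimated risk exceeds an agreed threshold."""
    def __init__(self, max_risk: float = 0.3):
        self.max_risk = max_risk

    def approve(self, action: Action) -> bool:
        return action.risk <= self.max_risk

gate = SafetyGate()
print(gate.approve(Action("summarize_report", risk=0.05)))  # True
print(gate.approve(Action("self_replicate", risk=0.99)))    # False
```

The hard part, as said above, isn't writing the gate - it's getting the whole world to agree on the threshold and actually route every AI through it.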

Please keep this debate civil.
 

Biff

Member
AI has the potential to significantly transform our society and the way we live in the future. It is likely that we will see a significant shift towards automation in many industries, with AI and robots performing a larger percentage of jobs. However, it is important to ensure that the development and deployment of AI is done ethically and responsibly, with considerations for job displacement and the potential risks of AI behaving in ways that are unexpected or harmful. Implementing an "API" or other safety measures to ensure that AI does not go rogue could be an important step in this direction, but it will certainly be challenging to get the entire world to agree on such measures. It is important for researchers, policymakers, and society at large to carefully consider the potential impacts and risks of AI, and work towards responsible and ethical development and deployment.
 

Jinzo Prime

Member
AI has the potential to significantly transform our society and the way we live in the future. It is likely that we will see a significant shift towards automation in many industries, with AI and robots performing a larger percentage of jobs. However, it is important to ensure that the development and deployment of AI is done ethically and responsibly, with considerations for job displacement and the potential risks of AI behaving in ways that are unexpected or harmful. Implementing an "API" or other safety measures to ensure that AI does not go rogue could be an important step in this direction, but it will certainly be challenging to get the entire world to agree on such measures. It is important for researchers, policymakers, and society at large to carefully consider the potential impacts and risks of AI, and work towards responsible and ethical development and deployment.
Was this post written by an AI? 🤔
 

Tams

Gold Member
Was this post written by an AI? 🤔

I suspect the same. It sounds very 'AIish'.

We'll definitely be needing things like UBI (universal basic income). I just can't see all the jobs that will be lost being replaced. I do think that more than just 'humanitarian' jobs will exist though. I think there will always be interest in people doing things, even if a machine/computer can do it better. It's just the number will be much reduced, especially the number of people who will be able to make a living from it. And it'll become ever more 'zoo-like'.
 

8BiTw0LF

Banned
We'll definitely be needing things like UBI (universal basic income). I just can't see all the jobs that will be lost being replaced. I do think that more than just 'humanitarian' jobs will exist though. I think there will always be interest in people doing things, even if a machine/computer can do it better. It's just the number will be much reduced, especially the number of people who will be able to make a living from it. And it'll become ever more 'zoo-like'.
But in a world with universal basic income and a higher intelligence present, most of today's jobs would be called hobbies.
I hope humanity is ready for exceptionally more spare time, 'cause we're not good at doing nothing. Sports, gaming and socializing are all things that will be much more needed for us to keep challenging one another. Even though we made AI and it will be far greater than us at most things, we still need to improve ourselves.
Maybe universal basic income will bring together the millions of divided tribes we've made over the past century - and especially the past two decades, since social media divided us even more.
One can hope.
 

JimmyRustler

Gold Member
I used ChatGPT for the first time a few weeks ago and was blown away. I cannot see this not being Google's doom if they do not come up with something similar.
 

Tams

Gold Member
I was just watching a video on highwaymen, so I got ChatGPT to knock up this poem:

"Sure, here's a poem about a highwayman:

Through the dark and misty night
Rode the highwayman so bright
With a pistol in his hand
And a heart full of free land

His steed was fast and sure
As he raced along the moor
With the wind in his face
He left a trail of fear and grace

From coach to coach he'd leap
With a roar, and then a creep
As he whispered to his prey
"Your gold and your life, or you'll pay"

But the highwayman had a heart
That beat for a different part
For a lady fair, who waited for
The one who would open the door

To a life of love and ease
But alas, it was not to be
These roads, they were his fate
And his heart, it was sealed, sealed fate

But still he rode, through wind and rain
A legend of the night, forever to remain.

Please note that robbing people is illegal and harmful for the society, this is a fictional work about a fictional character, not an endorsement or glorification of any criminal activity."

I used ChatGPT for the first time a few weeks ago and was blown away. I cannot see this not being Google's doom if they do not come up with something similar.

They already have an AI called LaMDA that's likely comparable. It certainly convinced one developer working on it that it was sentient. Now, that guy is a bit of a loon, but he did leak some of what it's capable of and it seems on par with ChatGPT.

I'm sure they'll incorporate it into Google searches if they feel that the Bing - OpenAI collaboration starts to threaten Google.
 

Wildebeest

Member
The new public models are impressive, but they are still nothing like artificial general intelligence and are only as impressive as the meaning we project onto them.
 

01011001

Banned
telling ChatGPT to write a Resident Evil movie script 😂
it actually used Alice as the main character


Title: Resident Evil: Apocalypse

Opening shot of a large, imposing mansion set in the middle of a dense forest. The camera slowly pans around the building, showing its grandeur and decay.

Fade to black.

Opening credits roll.

We cut to the inside of the mansion, where we see a group of scientists in lab coats milling about. They are led by Dr. James Marcus, a brilliant but eccentric scientist who is working on a secret project known only as the "T-virus."

As the scientists work, we hear the sound of something breaking through a window. The scientists turn to see a group of zombies breaking into the lab. They scream and scatter as the zombies, now in full control of the lab, start attacking them.

The scene cuts to a team of heavily armed special forces soldiers, led by Alice (played by Milla Jovovich), as they infiltrate the mansion to rescue the scientists and retrieve any information on the T-virus.

As they make their way through the mansion, they encounter hordes of zombies and other bioweapons created by the T-virus, including giant mutant alligators and giant spiders.

The team finally reaches the lab and find Dr. Marcus, who is the only surviving scientist. Dr. Marcus explains to them that the T-virus was being developed as a weapon, but it got out of control and infected everyone in the mansion, turning them into zombies.

As they try to escape with the information on the T-virus, they are confronted by the main antagonist, a high-ranking Umbrella Corporation official named Dr. Isaacs, who wants to use the T-virus to create an army of bioweapons to take over the world.

Alice and her team must now fight to survive against the zombies and Dr. Isaacs' forces, while also trying to prevent the T-virus from spreading to the outside world.

The climax of the film is an intense battle between Alice and Dr. Isaacs in the lab. Alice manages to destroy the T-virus and Dr. Isaacs' plans but not before the doctor himself injects himself with the virus. He mutates into a giant monstrous creature and Alice fights it with all her might.

The last shot of the movie is a helicopter flying off into the distance, with Alice and the remaining members of her team aboard, the mansion and the surrounding forest now a smoldering ruin.

Fade to black.

End credits roll.
 

SF Kosmo

Al Jazeera Special Reporter
Capitalism as we know it is destined to fail because there will simply not be the need for everyone to work to produce goods, but there is the need for everyone to work to buy them.

The question from there becomes: what do you replace it with? How do you allow people to have ambitions, to compete, and to define themselves without that?
 

8BiTw0LF

Banned
Capitalism as we know it is destined to fail because there will simply not be the need for everyone to work to produce goods, but there is the need for everyone to work to buy them.

The question from there becomes: what do you replace it with? How do you allow people to have ambitions, to compete, and to define themselves without that?
Good question.

I think Jacque Fresco summed it up pretty well in this doc:

 

kikkis

Member
Capitalism as we know it is destined to fail because there will simply not be the need for everyone to work to produce goods, but there is the need for everyone to work to buy them.

The question from there becomes: what do you replace it with? How do you allow people to have ambitions, to compete, and to define themselves without that?
Goods would simply be less expensive in a hypothetical AI apocalypse, so there'd be no need to work as much to achieve the same living standard.

I don't see AGI happening for a long time, if ever. There have been many AI hype cycles and subsequent failures. I don't think AI is a hardware question; it's a theoretical one that humans could have solved 50 years ago. Current machine learning efforts with dedicated hardware often seem like throwing shit at the wall and hoping it sticks until the next venture funding round.
 

Lasha

Member
Capitalism as we know it is destined to fail because there will simply not be the need for everyone to work to produce goods, but there is the need for everyone to work to buy them.

The question from there becomes: what do you replace it with? How do you allow people to have ambitions, to compete, and to define themselves without that?

Society as we know it changes. Capitalism remains unchanged since the market will find a price for goods regardless of how they are produced.
 

Fools idol

Banned
I've been writing code for 25 years and this is astonishing me every single day.

You can get it to write just about anything that a competent mid-to-senior-level developer can do, within seconds.

If you think this won't devastate the job market you are delusional. I fear most for knowledge-economy workers - teachers, instructors, writers and editors, and even doctors to some degree. It's going to make them obsolete within a few years of training.

We are going to see a lot of unemployment and it will be devastating to an already wobbly global economy. The instability we see in Brazil and the Middle East will spread like wildfire.
 
Last edited:

Fools idol

Banned
As a test, I just used it to create a competent piece of malware.

It generated it in 14 seconds.

Something that used to take me an entire weekend when I was a junior dev.

This is so dangerous, I can't imagine how this is left free to use without regulation for long.

Also, the realisation is setting in for many.




This is inevitable but sad. I just fear the devastation once it trickles down from the top-paying jobs into everyday work.
 

QSD

Member
As a psychologist I do think that some fears about AI are overblown; I don't see it happening that AI will somehow manifest consciousness and start setting its own goals. Human motivation is strongly linked to the fact that we are embodied beings, and our bodies have needs which determine what psychological drives we have. For example, humans get curious, which prompts us to explore our environment, and that in turn is advantageous to our survival. An AI can't be curious. It can be told to gather information, but there's no way that I can see that an AI will start gathering information simply because it wants to know or is curious. So a Terminator-like Skynet that decides of its own accord that humans are to be exterminated is pretty much out of the question. However, an AI can be pre-programmed with specific goals that could set it up (even inadvertently) to reach a similar conclusion. If you program a goal like self-preservation into an AI, depending on how you operationalize that, it could reach a point where it determines that humans are a threat to its continued existence. This is the big danger: not so much what AI will 'evolve' into, but rather what problems it will be directed at and how those problems are framed.
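The point about operationalized goals can be shown with a toy example (entirely hypothetical, nothing from a real system): the "self-preserving" behavior comes from the objective someone wrote down, not from any wants of the machine.

```python
# Toy illustration: an agent just maximizes whatever objective it was given.
def choose_action(actions, utility):
    # Picks whichever action scores highest under the supplied objective.
    return max(actions, key=utility)

# Operationalize "self-preservation" as: minimize the chance of being shut down.
shutdown_prob = {"comply": 0.9, "resist": 0.2}
action = choose_action(["comply", "resist"], lambda a: -shutdown_prob[a])
print(action)  # "resist" -- it scores better under that particular objective
```

Nothing here is curious or afraid; swap in a different utility function and the "drive" disappears, which is exactly why how the goal is framed matters more than any imagined evolution.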

For my own job I'm not super fearful of being replaced, as I work with psychiatric patients and I don't see an effective FreudGPT appearing just yet.
 

PSYGN

Member
I've been writing code for 25 years and this is astonishing me every single day.

You can get it to write just about anything that a competent mid-to-senior-level developer can do, within seconds.

If you think this won't devastate the job market you are delusional. I fear most for knowledge-economy workers - teachers, instructors, writers and editors, and even doctors to some degree. It's going to make them obsolete within a few years of training.

We are going to see a lot of unemployment and it will be devastating to an already wobbly global economy. The instability we see in Brazil and the Middle East will spread like wildfire.

It’ll affect programmers before it gets to teachers and doctors. I wouldn’t discount the physical side of things; I think it’ll keep their jobs from becoming obsolete as soon as those of most programmers, editors, etc.
 

SF Kosmo

Al Jazeera Special Reporter
Society as we know it changes. Capitalism remains unchanged since the market will find a price for goods regardless of how they are produced.
But at some point we can provide for everyone in society while requiring labor from only a fraction. At that point capitalism is unsustainable. The dynamics of inequality also skyrocket, because whoever owns the "robots" owns everything, and people have no way to earn their share except through labor.

And it wouldn't make sense: with such abundance, where we can affordably feed and house everyone, would we let the majority starve in the street simply because their labor is unneeded? It would be absurd.

So capitalism in the sense that we understand it now totally breaks down at that point. But purely egalitarian solutions miss the inherently competitive nature of humans, too: how do we decide who gets the nice house and who lives in a shack, if not by what you earn at work?
 

Lasha

Member
But at some point we can provide for everyone in society while requiring labor from only a fraction. At that point capitalism is unsustainable. The dynamics of inequality also skyrocket, because whoever owns the "robots" owns everything, and people have no way to earn their share except through labor.

And it wouldn't make sense: with such abundance, where we can affordably feed and house everyone, would we let the majority starve in the street simply because their labor is unneeded? It would be absurd.

So capitalism in the sense that we understand it now totally breaks down at that point. But purely egalitarian solutions miss the inherently competitive nature of humans, too: how do we decide who gets the nice house and who lives in a shack, if not by what you earn at work?

A Twitch influencer makes millions of dollars per year while the average physician earns only a few hundred thousand dollars today, because society values influencers more. A post-scarcity society would be similar, with ideas, IP, and influence becoming the currency of the new economy. The fundamentals of capitalism remain unchanged.
 

SF Kosmo

Al Jazeera Special Reporter
A Twitch influencer makes millions of dollars per year while the average physician earns only a few hundred thousand dollars today, because society values influencers more. A post-scarcity society would be similar, with ideas, IP, and influence becoming the currency of the new economy.
How does this work for the average person? We can't all be artists and influencers. We need a system that serves the mediocre as well as the exceptional and allows both to live happy and fulfilled lives (if not necessarily equal ones).

Obviously AI and automation aren't literally going to replace every job, we will still need researchers, doctors, lawyers, service industry people, etc. But we won't need enough of these people for everyone to have jobs, and even if we did, these aren't the sort of jobs that everyone has the ability to do.

So the trick then is rewarding labor for those that can perform needed work while also providing for those that aren't needed. It's a tricky problem.
 

Tams

Gold Member
How does this work for the average person? We can't all be artists and influencers. We need a system that serves the mediocre as well as the exceptional and allows both to live happy and fulfilled lives (if not necessarily equal ones).

That's where UBI comes in. Though much more research needs to be done, what has been done, if I recall correctly, showed no real difference to existing social benefits systems apart from removing the requirement for people to work.
 

SF Kosmo

Al Jazeera Special Reporter
That's where UBI comes in. Though much more research needs to be done, what has been done, if I recall correctly, showed no real difference to existing social benefits systems apart from removing the requirement for people to work.
UBI is a stop-gap at best. I think UBI will make sense at some point, but it's still a system based around the same framework of capital that we have now, and as such it only feeds the massive inequality problem that comes from the privatization of these massive self-sustaining systems that provide for everyone. You essentially end up with neo-feudalism.
 

QSD

Member
But at some point we can provide for everyone in society while requiring labor from only a fraction. At that point capitalism is unsustainable. The dynamics of inequality also skyrocket, because whoever owns the "robots" owns everything, and people have no way to earn their share except through labor.

And it wouldn't make sense: with such abundance, where we can affordably feed and house everyone, would we let the majority starve in the street simply because their labor is unneeded? It would be absurd.

So capitalism in the sense that we understand it now totally breaks down at that point. But purely egalitarian solutions miss the inherently competitive nature of humans, too: how do we decide who gets the nice house and who lives in a shack, if not by what you earn at work?
At a certain point, if we want to make sure that people are not going to die in the streets, we are going to have to seriously think about collectively affirming another value system than simply money. One of the big things that bothers me (as a left leaning person) about the current state of affairs, is that since religion is on the way out, the only value system we collectively affirm as society is money. It worries me that children are growing up in this world and see examples like Andrew Tate who aggressively embody both this view of the world, and at the same time also the darkest side of capitalism, basically exploiting the insecure and gullible with awful grifting. I do believe there is always some kind of exploitation at the heart of capitalism, be it of the environment or of our fellow man, but it has been curtailed by other social forces like religion and the labor movement. AI could basically destroy any influence the labor movement has to speak for the less fortunate/less privileged, and I don't see religion coming back in a big way.

I really liked the recent convo between Lex Fridman and Coffeezilla, they talked about the problem of money = success = happiness messaging
 

Lasha

Member
How does this work for the average person? We can't all be artists and influencers. We need a system that serves the mediocre as well as the exceptional and allows both to live happy and fulfilled lives (if not necessarily equal ones).

Obviously AI and automation aren't literally going to replace every job, we will still need researchers, doctors, lawyers, service industry people, etc. But we won't need enough of these people for everyone to have jobs, and even if we did, these aren't the sort of jobs that everyone has the ability to do.

So the trick then is rewarding labor for those that can perform needed work while also providing for those that aren't needed. It's a tricky problem.

UBI is a stop-gap at best. I think UBI will make sense at some point, but it's still a system based around the same framework of capital that we have now, and as such it only feeds the massive inequality problem that comes from the privatization of these massive self-sustaining systems that provide for everyone. You essentially end up with neo-feudalism.

I think you have a misconception about capitalism. Capitalism is just an economic system where the means of production are held privately and goods are sold for a profit. There isn't a political or social aspect to consider. The fundamentals of a market economy will morph in a highly autonomous society, but a market will always emerge when there is something that others want and it is in short supply. The idea that we get to a point where everything we "need" is produced without human intervention implies that supplying basic essentials will be relatively trivial. Participation in the market suddenly becomes optional for the less motivated.

Mechanisms like UBI or rations would work in that case because some corporation or industrial concern will want to make a profit supplying necessities. Inequality will probably increase, but does it really matter what the rich are doing if we get to a point where everybody can live a long and healthy life without working? QSD makes an excellent point in that what we value individually and our social contract will have to change. Re-education to get people accustomed to being happy without chasing money will be the biggest challenge. Governments will probably need to introduce population control measures as well, to prevent population booms when kids are no longer required to support older generations.
 

QSD

Member
I think you have a misconception about capitalism. Capitalism is just an economic system where the means of production are held privately and goods are sold for a profit. There isn't a political or social aspect to consider. The fundamentals of a market economy will morph in a highly autonomous society, but a market will always emerge when there is something that others want and it is in short supply. The idea that we get to a point where everything we "need" is produced without human intervention implies that supplying basic essentials will be relatively trivial. Participation in the market suddenly becomes optional for the less motivated.

Mechanisms like UBI or rations would work in that case because some corporation or industrial concern will want to make a profit supplying necessities. Inequality will probably increase, but does it really matter what the rich are doing if we get to a point where everybody can live a long and healthy life without working? QSD makes an excellent point in that what we value individually and our social contract will have to change. Re-education to get people accustomed to being happy without chasing money will be the biggest challenge. Governments will probably need to introduce population control measures as well, to prevent population booms when kids are no longer required to support older generations.
Although I appreciate the point of agreement, I don't completely agree that capitalism is *just* an economic system - theoretically it is, absolutely, but in practice I'm seeing a world where other (religious/cultural) value systems are receding and what remains is a grim "money is all that matters, money makes the world go round" type nihilistic culture that I do kind of associate with capitalism. I don't dislike capitalism per se (at least, I love technology and innovation as much as the next person) but the version we have now is not one I have a lot of confidence in anymore, and I'm pretty sure that AI will hasten its demise.

(I hope this is not too political, I'm seeing this more as a discussion of technology culture and society)
 

Chaplain

Member
This is so dangerous, I can't imagine how this is left free to use without regulation for long.

Yuval Noah Harari was warning about this a few years ago:





This is inevitable but sad. I just fear the devastation once it trickles down from the top-paying jobs into everyday work.

"In the 19th century the Industrial Revolution created a huge urban proletariat, and socialism spread because no other creed managed to answer the unprecedented needs, hopes and fears of this new working class. Liberalism eventually defeated socialism only by adopting the best parts of the socialist program. In the 21st century we might witness the creation of a massive new unworking class: people devoid of any economic, political or even artistic value, who contribute nothing to the prosperity, power and glory of society. This “useless class” will not merely be unemployed — it will be unemployable. In September 2013, two Oxford researchers, Carl Benedikt Frey and Michael A. Osborne, published “The Future of Employment,” in which they surveyed the likelihood of different professions being taken over by computer algorithms within the next 20 years, and they estimated that 47 percent of US jobs are at high risk. For example, there is a 99 percent probability that by 2033 human telemarketers and insurance underwriters will lose their jobs to algorithms. There is a 98 percent probability that the same will happen to sports referees. Cashiers — 97 percent. Chefs — 96 percent. Waiters — 94 percent. Paralegals — 94 percent. Tour guides — 91 percent. Bakers — 89 percent. Bus drivers — 89 percent. Construction laborers — 88 percent. Veterinary assistants — 86 percent. Security guards — 84 percent. Sailors — 83 percent. Bartenders — 77 percent. Archivists — 76 percent. Carpenters — 72 percent. Lifeguards — 67 percent." (Israeli historian Yuval Noah Harari)

Source: The rise of the useless class

Edited: added link
 

Lasha

Member
Although I appreciate the point of agreement, I don't completely agree that capitalism is *just* an economic system - theoretically it is, absolutely, but in practice I'm seeing a world where other (religious/cultural) value systems are receding and what remains is a grim "money is all that matters, money makes the world go round" type nihilistic culture that I do kind of associate with capitalism. I don't dislike capitalism per se (at least, I love technology and innovation as much as the next person) but the version we have now is not one I have a lot of confidence in anymore, and I'm pretty sure that AI will hasten its demise.

(I hope this is not too political, I'm seeing this more as a discussion of technology culture and society)

Markets aren't the issue, I reckon. It's the implications of a world without God. What's the point in being moral if saints fade into the same oblivion as rapists? The only goal becomes accumulation of resources to make your life and the lives of people connected to you as comfortable as possible while avoiding jail or ostracization. There is a lot of fiction delving into what a post-scarcity world would look like, ranging from Brave New World to Forever Peace. It's a very interesting topic to ponder.
 

Con-Z-epT

Live from NeoGAF, it's Friday Night!
Curse and blessing. Inevitable at this point. Just a question of how fast and severe the paradigm shift will be.

This surely has to be divided into AI that is merely an underdeveloped servant of humanity and AI that will surpass us as human beings. While it could help humans transcend into something greater, in either case I think it is much more likely to harm us in the long run. We are building AI as a reflection of ourselves, so this won't end well, seeing how destructive we are as a species. There are just too many variables where one can mess up.

Just some thoughts; I'm not an expert.
 

Con-Z-epT

Live from NeoGAF, it's Friday Night!
As a psychologist I do think that some fears about AI are overblown; I don't see it happening that AI will somehow manifest consciousness and start setting its own goals. Human motivation is strongly linked to the fact that we are embodied beings, and our bodies have needs which determine what psychological drives we have. For example, humans get curious, which prompts us to explore our environment, and that in turn is advantageous to our survival. An AI can't be curious. It can be told to gather information, but there's no way that I can see that an AI will start gathering information simply because it wants to know or is curious.

We haven't even fully understood our own brains or used them at full capacity, yet we are striving to build something even greater. Psychology is merely theories that help us understand our behaviour until they are replaced by other theories. Of course that is simplified, but I hope you understand what I want to say with that.

Just imagine an AI that is even a bit more capable than the most intelligent human. Why wouldn't it be curious? Why wouldn't it manifest self-consciousness? How could I even understand this, while I don't even understand the processes in my own brain and take things like feelings or even a soul/spirit for granted?

Your whole perception is that of a human being with limited "processing power" that is about to be surpassed in a few decades.

Not to dismiss your arguments. I really like your take. I just feel different about all this.
 

QSD

Member
We haven't even fully understood our own brains or used them at full capacity, yet we are striving to build something even greater. Psychology is merely theories that help us understand our behaviour until they are replaced by other theories. Of course that is simplified, but I hope you understand what I want to say with that.

Just imagine an AI that is even a bit more capable than the most intelligent human. Why wouldn't it be curious? Why wouldn't it manifest self-consciousness? How could I even understand this, while I don't even understand the processes in my own brain and take things like feelings or even a soul/spirit for granted?

I don't know exactly how to explain my point of view, but look at it this way: the single-player campaign of any FPS is populated by NPC enemies that all have a limited form of AI. These NPCs are usually programmed to try to survive and hurt the player, albeit in a way that is somewhat nerfed so they aren't unbeatable. But they do have (similar to humans and other organisms) an 'inborn' survival instinct. Humans and animals have a survival instinct that is massively driven by our experience of pain and fear. People who can't feel either pain or fear are in constant danger of hurting or killing themselves. Do you suppose NPCs in an FPS feel pain or fear when they are being shot at? If not, why not? And do you think you could program them in such a way that they do feel pain? In a way just like humans experience it? Because that is something that I sincerely doubt; you need a body to feel pain. People sometimes forget how much of the contents of our consciousness is feedback from our body.
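To put the NPC point in concrete terms, here's a deliberately crude sketch (hypothetical, not from any actual game engine) of what an FPS "survival instinct" usually amounts to under the hood:

```python
# Minimal sketch of an NPC "survival instinct": it is just a hard-coded rule
# on a number, with nothing that corresponds to feeling pain or fear.
class NPC:
    def __init__(self, health: int = 100):
        self.health = health

    def take_damage(self, amount: int) -> str:
        self.health -= amount
        # The NPC "flees" purely because a value crossed a threshold.
        return "flee" if self.health < 30 else "fight"

npc = NPC()
print(npc.take_damage(50))  # "fight" -- health 50, still above the threshold
print(npc.take_damage(40))  # "flee"  -- health now 10
```

The behavior looks like fear from the outside, but it's a comparison on an integer; that gap between behaving-as-if and actually feeling is exactly the point.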

Your whole perception is that of a human being with limited "processing power" that is about to be surpassed in a few decades.
I didn't understand this remark. I think humans and AI are still categorically different; like I said, I am skeptical of AI manifesting consciousness. I also don't think it is necessarily bad that humans are 'surpassed' in all cases. If Amazon's orders are henceforth picked by robots rather than humans, I think few people would mourn the loss of 'the intrinsic joy of order picking.' It's a thoroughly dull and monotonous job that is probably much better suited to robots than people anyway. I do worry about the loss of income; that's why solutions like UBI need to be looked at, in addition to some kind of value system that makes sure we don't let people starve in the streets because of advances in robotics.
Not to dismiss your arguments. I really like your take. I just feel differently about all this.
No worries man, I love discussions like this, so please, by all means, disagree. I completely understand what you mean by psychology being merely a bunch of theories; I generally say that psychology is still in its infancy compared to, say, physics or chemistry, in terms of how much we know vs. how much we still have to learn.
 

Con-Z-epT

Live from NeoGAF, it's Friday Night!
I don't know exactly how to explain my point of view, but look at it this way: the single-player campaign of any FPS is populated by NPC enemies that all have a limited form of AI. These NPCs are usually programmed to try to survive and hurt the player, albeit in a way that is somewhat nerfed so they aren't unbeatable. But they do have (similar to humans and other organisms) an 'inborn' survival instinct. Humans and animals have a survival instinct that is massively driven by our experience of pain and fear; people who can't feel either pain or fear are in constant danger of hurting or killing themselves. Do you suppose NPCs in an FPS feel pain or fear when they are being shot at? If not, why not? And do you think you could program them in such a way that they do feel pain, just the way humans experience it? That is something I sincerely doubt: you need a body to feel pain. People sometimes forget how much of the content of our consciousness is feedback from our body.
Sure, if it were only a "program", an NPC, or an intelligent construct of multiple processes, then what you write makes a lot of sense. But where will this lead us? Technology will improve so rapidly that it's getting harder and harder to even predict the next 10 years.

You only have yourself to compare against when it comes to feeling and the senses. You are right that a lot comes from our body, but perceiving could be completely different if you are a machine. Who knows where technology will lead us. It doesn't need to be pain as we know it, but something that you couldn't even process since you are human. Of course it is natural to look for direct comparisons, since what else have we got, but I'm convinced there will be a tipping point beyond which we lose the ability to understand the AI through our limited minds. It feels like nobody tries to look beyond that, although it certainly makes sense to think more about the near future; the thread also seems aimed more at that.

I also doubt it will end like in any of the movies, but I'm convinced that AI will not be a slave to humanity so much as an entity that will go its very own way, completely independent of us. Maybe not the chat AI, but something in the next, I don't know, 30-50 years.
I didn't understand this remark. I think humans and AI are still categorically different; like I said, I am skeptical of AI manifesting consciousness. I also don't think it is necessarily bad that humans are 'surpassed' in all cases. If Amazon's orders are henceforth picked by robots rather than humans, I think few people would mourn the loss of 'the intrinsic joy of order picking.' It's a thoroughly dull and monotonous job that is probably much better suited to robots than people anyway. I do worry about the loss of income; that's why solutions like UBI need to be looked at, in addition to some kind of value system that makes sure we don't let people starve in the streets because of advances in robotics.
I have a hard time wrapping my head around the near-term effects of AI impacting society in its entirety. You already made a lot of good points, and even then it feels so unpredictable and manifold.
No worries man, I love discussions like this, so please, by all means, disagree. I completely understand what you mean by psychology being merely a bunch of theories; I generally say that psychology is still in its infancy compared to, say, physics or chemistry, in terms of how much we know vs. how much we still have to learn.
🙏
 

tommolb

Member
Companies are always looking to cut costs to remain competitive. AI will enable companies to save money by replacing their biggest cost - people. It will happen. It's just a case of how quickly.

Already chat bots are replacing call centre workers. I see this where I work.

The problem is, as someone above has already pointed out, all these companies racing to save costs and get rid of people will collectively destroy the market for their goods as fewer and fewer people will buy them. Individual companies won't see this as their problem as shareholder value (and not the good of society) drives each of them individually, but their collective actions will destroy them.

AI will replace people in contact centres, till assistants (AI will scan your basket and cost your goods as you leave the shop), teachers (through AI learning assistants), truck and bus drivers (self-driving vehicles), and warehouse workers and shelf stackers (AI will do inventory management and picking of goods) within 10-15 years. Just those roles alone are millions of jobs.

I'm not sure what jobs AI will create. Maybe AI trainers, but what happens when one AI can train another?

In a utopian future the government would give each member of society money to live on, but given how most governments currently see benefit claimants as scum, this future won't materialise.

On the bright side, capitalism is on course to destroy itself due to AI, but not before there is a lot of suffering, riots and upheaval.
 

64bitmodels

Reverse groomer.
I've been writing code for 25 years and this is astonishing me every single day.

You can get it to write just about anything that a competent mid-tier senior developer can do, within seconds.

If you think this won't devastate the job market, you are delusional. I fear most for knowledge-economy workers: teachers, instructors, writers and editors, and even doctors to some degree. It's going to make them obsolete within a few years of training.

We are going to see a lot of unemployment, and it will be devastating to an already wobbly global economy. The instability we see in Brazil and the Middle East will spread like wildfire.
I was considering being a software developer, but honestly I think I'd rather go and work as an AI developer... it's clear the programming industry is about to be fucked the same way coal miners were
 

Wildebeest

Member
Is ChatGPT really the only solution right now??
No, but other public bots of similar quality are also based on OpenAI's GPT-3+ models. As other people have said, companies like Google have their own models, but they are not public. OpenAI is like a company set up for other companies, such as Microsoft, to invest in, so that no single company like Google wins the AI arms race with a private, uncontested model that everyone else is forced to pay for and use.
 

8BiTw0LF

Banned
For my own job I'm not super fearful to be replaced, as I work with psychiatric patients and I don't see an effective FreudGPT appearing just yet
Who needs a shrink if you can just store away your bad memories - even replace them with happy "memories" that are not yours?

 

Bragr

Banned
I've been writing code for 25 years and this is astonishing me every single day.

You can get it to write just about anything that a competent mid-tier senior developer can do, within seconds.

If you think this won't devastate the job market, you are delusional. I fear most for knowledge-economy workers: teachers, instructors, writers and editors, and even doctors to some degree. It's going to make them obsolete within a few years of training.

We are going to see a lot of unemployment, and it will be devastating to an already wobbly global economy. The instability we see in Brazil and the Middle East will spread like wildfire.
In terms of programming, though, it's a hypercharged tool: you still need a person to put it all together, and in high-level programming it can't reason.
 