
Why did earlier consoles consume so little power?

Elysion

Banned
If you look back at the PS1 for example, it only consumed around 10W or so, which is less than early Switch models use in handheld mode! The N64 used less than 20W, and the same is true for all consoles before that afaik. Even the Dreamcast only consumed 22W. These consoles used so little power that they didn't even need any active cooling at all. It's only with the PS2 that consoles started to use a bit more power (40-50W for the PS2), before it really exploded with the PS3 and 360, whose launch models used between 150 and 200W. PS4 and XBOne were a bit lower, but PS5 and Series X are right back to 200W.

My question is, why didn't Sony, Nintendo or Sega let their earlier consoles consume more power to get more performance out of them? It wouldn't have to be all the way to 200W, but surely a PS1, for example, that ran at higher clocks and consumed around 50W or so would've been much more capable? I understand that higher power consumption means more heat, which means there's a need for things like a fan and/or heatsinks, which means higher costs etc., but from the PS2 onwards consoles did that anyway, and they still managed to be sold at mass-market prices. Is there a technical reason I'm not aware of why earlier consoles couldn't run hotter?

I mean, hypothetically speaking, if someone had decided to release a 150W console in the late 80s or early 90s, wouldn’t it have been a generation ahead of all these other consoles, like the SNES or Genesis, that ran on like 15W back then? Could a 150W console in 1990 be comparable to the PS1 in 1994, if only through brute force?
 

daveonezero

Banned
Probably cost.

It is now cheaper to consume more power and in turn push more visuals.

It would not only have cost more, it would also have been diminishing returns for the time.
 

lem0n

Member
More power wouldn't have equaled more complex games, IMO. The consoles were purely a means to play games at that time, and the games that were being developed in that era didn't need much computing power. A compact hatchback doesn't need a V8, so why, as a manufacturer, shell out the extra cash for more parts and more development time... just use what's needed to get it around reliably.
 

mansoor1980

Gold Member
i want to know about the power consumption of arcade boards like virtua fighter/tekken etc , generally the watt consumption of cutting edge arcade games
 

ReBurn

Gold Member
They didn't require as much power and more raw power probably would have just killed the consoles earlier. These things were passively cooled.
 
i want to know about the power consumption of arcade boards like virtua fighter/tekken etc , generally the watt consumption of cutting edge arcade games
I 'member when I was a kid my Dad would rent several arcade machines for a New Year's Eve party and put them in the game room. Us kids got to play them before the party, and then we were sent off to the grandparents' for New Year's Eve. :(
I remember the breaker in the house being tripped several times. So I'm gonna assume quite a bit of juice on those old arcade machines.
 

Keihart

Member
Simply put, power mostly just depends on the number and type of transistors you are using; you can't just keep increasing voltage or current and expect to get better performance.
You can to a certain extent, but that doesn't mean the old consoles weren't already consuming as much as made sense performance-wise.

It's not like you can't keep overclocking a PS5 or Xbox One, but at some point it doesn't make sense anymore.
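Roughly speaking, dynamic power scales with switched capacitance, voltage squared and clock frequency (P ≈ C·V²·f), and higher clocks usually need a voltage bump too, so power climbs much faster than performance. A minimal back-of-the-envelope sketch in Python, with purely illustrative numbers (not measurements of any real console):

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f
# Illustrative numbers only -- not measurements of any real console.

def scaled_power(base_power_w, clock_scale, voltage_scale):
    """Scale a baseline power figure by clock and voltage^2."""
    return base_power_w * clock_scale * voltage_scale ** 2

base = 10.0  # e.g. a ~10 W PS1-class system, per the OP's figure

# +50% clock usually needs a voltage bump too, say +20%:
print(scaled_power(base, clock_scale=1.5, voltage_scale=1.2))  # ~21.6 W
# Doubling the clock with a +40% voltage bump:
print(scaled_power(base, clock_scale=2.0, voltage_scale=1.4))  # ~39.2 W
```

Even in this toy model you burn roughly 4x the power for at best 2x the clock speed, which is why just cranking an existing chip is a poor substitute for a wider design on a newer process.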
 

EDMIX

Member
less powerful, thus uses less power......hey listen did you know that Hitler's dog was called Blondi and he killed it along with himself after he lost the war?
 

Connxtion

Member
Dreamcast had a fan for cooling, though it can be run at 3.3V and it's a hell of a lot quieter 😁 The console is basically silent when combined with an ODE (optical drive emulator).
 

MikeM

Member
More processing power generally needs more fuel. You see it in cars, power lifting etc. No different here unless you change the fundamentals of the processing.
 

ReBurn

Gold Member
I 'member when I was a kid my Dad would rent several arcade machines for a New Year's Eve party and put them in the game room. Us kids got to play them before the party, and then we were sent off to the grandparents' for New Year's Eve. :(
I remember the breaker in the house being tripped several times. So I'm gonna assume quite a bit of juice on those old arcade machines.
Probably just too many on the same circuit.
 

nush

Gold Member
i want to know about the power consumption of arcade boards like virtua fighter/tekken etc , generally the watt consumption of cutting edge arcade games

The older cabinets were basically run off a PC power supply; JAMMA boards were 12 volts. Newer stuff I'd assume has a rating comparable to a modern PC.

You've also got to factor in monitor, speakers, lights and the game board.
 

Northeastmonk

Gold Member
It's interesting to read about Atari and how boards with fewer circuits were in high demand. I'd imagine you go with the most affordable option that does the job. Of course the industry is beyond the days of Atari; I'd imagine production costs as well as warranty agreements make it hard to produce for the masses. As long as it works, who really cares how much power their console draws? That's not trying to undercut this topic. I seriously wonder why it even matters as long as consoles work the way they should.

Here’s a very interesting article:

 

BlackTron

Member
Successful game consoles were all about getting as much performance as possible out of a very affordable price point.

With the tech back then, the best available at that price point did not need to be cooled. The tech is actually a LOT more efficient than back then when you think about it: PS5 uses 4x the power of PS2, but pushes MUCH more than 4x the performance.
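To put rough numbers on that, using ballpark figures quoted elsewhere in this thread (PS2 at ~45W and ~6.2 GFLOPS, PS5 at ~200W and ~10 TFLOPS; treat them as illustrative, not measured):

```python
# Perf-per-watt sketch with ballpark figures quoted in this thread.
ps2_gflops, ps2_watts = 6.2, 45.0          # PS2: ~6.2 GFLOPS at ~45 W
ps5_gflops, ps5_watts = 10_000.0, 200.0    # PS5: ~10 TFLOPS at ~200 W

ps2_eff = ps2_gflops / ps2_watts   # ~0.14 GFLOPS per watt
ps5_eff = ps5_gflops / ps5_watts   # ~50 GFLOPS per watt

print(f"PS2: {ps2_eff:.2f} GFLOPS/W, PS5: {ps5_eff:.0f} GFLOPS/W")
print(f"Efficiency gain: ~{ps5_eff / ps2_eff:.0f}x")   # roughly 360x
```

So roughly 4-5x the wall power buys over 1,000x the raw FLOPS on those numbers.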
 

M1chl

Currently Gif and Meme Champion
Mac Studio consumes like 60w max and still rapes a 1000w+ top end PC. Maybe next time go for ARM instead of the aging x86.
In CPU terms, yeah: ARM64 + HW-based ASICs (for example for video in the M1 chip) are more efficient in these workloads than x86, but the main energy draw is the GPU, and I don't think there is really a way to lower its power consumption.

- Posted from my M1 MacBook
 

Elysion

Banned
OP could have answered his own question by simply doing a Google search

No I couldn’t, actually. If you type my question in the OP into Google then the first thing that (now) comes up is this thread, and the rest is about energy efficiency of modern consoles. There isn’t really anything about why older consoles couldn’t go beyond 25W pre-2000. I couldn’t find anything on tech forums like Beyond3d either.
 
If you look back at the PS1 for example, it only consumed around 10W or so, which is less than early Switch models use in handheld mode! The N64 used less than 20W, and the same is true for all consoles before that afaik. Even the Dreamcast only consumed 22W. These consoles used so little power that they didn't even need any active cooling at all. It's only with the PS2 that consoles started to use a bit more power (40-50W for the PS2), before it really exploded with the PS3 and 360, whose launch models used between 150 and 200W. PS4 and XBOne were a bit lower, but PS5 and Series X are right back to 200W.

My question is, why didn't Sony, Nintendo or Sega let their earlier consoles consume more power to get more performance out of them? It wouldn't have to be all the way to 200W, but surely a PS1, for example, that ran at higher clocks and consumed around 50W or so would've been much more capable? I understand that higher power consumption means more heat, which means there's a need for things like a fan and/or heatsinks, which means higher costs etc., but from the PS2 onwards consoles did that anyway, and they still managed to be sold at mass-market prices. Is there a technical reason I'm not aware of why earlier consoles couldn't run hotter?

I mean, hypothetically speaking, if someone had decided to release a 150W console in the late 80s or early 90s, wouldn’t it have been a generation ahead of all these other consoles, like the SNES or Genesis, that ran on like 15W back then? Could a 150W console in 1990 be comparable to the PS1 in 1994, if only through brute force?
Moore's law. As the gains in performance from die shrinks have decreased, the need to push more power for greater performance has arisen.
 
As Moore's law slowed down, it became necessary to use more and more watts to keep seeing steady increases in performance.

N64 was the last console without a fan.

Honestly I wish it'd stayed that way at least on console.

A console with cartridges and super low heat = looooooong lasting. It'd be interesting to see how powerful a passively cooled console that size could be today.
 
Successful game consoles were all about getting as much performance as possible out of a very affordable price point.

With the tech back then, the best available at that price point did not need to be cooled. The tech is actually a LOT more efficient than back then when you think about it: PS5 uses 4x the power of PS2, but pushes MUCH more than 4x the performance.
But the size of the die(s) is not 4x PS2 so that shows the PS5 is less efficient with its silicon area.
 

Damigos

Member
Because it is realistically impossible to have 4K/60/RT etc. with so few watts.

Consoles still use less power than the equivalent graphics cards of PCs. Just the graphics cards alone, never mind the rest of the system.
 

ZywyPL

Banned
Because that's what the technology was at the time: the chips ran at a mere 100-300 MHz, not a couple of GHz, and there were millions of transistors instead of billions. In other words, it just wasn't possible to bump up the power consumption any further.
 
The PlayStation's blueprint has always put the power brick inside the console.
 

Bo_Hazem

Banned
In CPU terms, yeah: ARM64 + HW-based ASICs (for example for video in the M1 chip) are more efficient in these workloads than x86, but the main energy draw is the GPU, and I don't think there is really a way to lower its power consumption.

- Posted from my M1 MacBook

Shouldn't APUs be the future then? Their next thing is basically 2x the M1 Ultra, more like two M1 Ultras fused together. Cutting latency by having the CPU, GPU, and their shared memory so close together makes it compete against the highest-specced PCs, just falling behind in graphics by like 30-50% while consuming ~17x less power!
 

tusharngf

Member
Smaller SoC, less power consumption. A 6.2 GFLOPS PS2 vs a 10 TFLOPS PS5: the jump is insane in numbers.


Console         | FLOPS        | Release Year
Dreamcast       | 1.4 GFLOPS   | 1998
PlayStation 2   | 6.2 GFLOPS   | 2000
GameCube        | 9.4 GFLOPS   | 2001
Xbox            | 20 GFLOPS    | 2001
Xbox 360        | 240 GFLOPS   | 2005
PlayStation 3   | 230.4 GFLOPS | 2006
Wii             | 12 GFLOPS    | 2006
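Just for fun, here's the generation-to-generation jump implied by those figures (taking the table's numbers at face value; they're rough marketing FLOPS, not apples-to-apples benchmarks):

```python
# Generation-to-generation FLOPS jumps from the table above.
# Figures as listed in the post; rough marketing numbers, not benchmarks.
consoles = [
    ("Dreamcast",       1.4, 1998),
    ("PlayStation 2",   6.2, 2000),
    ("GameCube",        9.4, 2001),
    ("Xbox",           20.0, 2001),
    ("Xbox 360",      240.0, 2005),
    ("PlayStation 3", 230.4, 2006),
    ("Wii",            12.0, 2006),
]

prev_name, prev_gflops, _ = consoles[0]
for name, gflops, year in consoles[1:]:
    print(f"{prev_name} -> {name} ({year}): {gflops / prev_gflops:.1f}x")
    prev_name, prev_gflops = name, gflops

# And PS2 -> PS5 (~10 TFLOPS) works out to roughly a 1,600x jump on its own.
```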
 

Chastten

Banned
I don't know the exact answer or anything, just keep in mind that even PCs in those days barely had or needed any cooling. I ran a Pentium 3 for 2 years without any fans: the case didn't have any, and the CPU fan made so much noise that I just decided to remove it. The PC kept working like a charm until I replaced it some years later. You wouldn't want to try something like that now.

Honestly I miss those days of dead silent electronics. I don't care much for graphics, but as soon as something makes annoying noises, it goes out the window. When I see YouTube videos about PS4 Pros or recently some Steam Decks that sound like a jet engine... Just gimme quiet, energy efficient stuff any day of the week.
 

M1chl

Currently Gif and Meme Champion
Shouldn't APUs be the future then? Their next thing is basically 2x the M1 Ultra, more like two M1 Ultras fused together. Cutting latency by having the CPU, GPU, and their shared memory so close together makes it compete against the highest-specced PCs, just falling behind in graphics by like 30-50% while consuming ~17x less power!
It's more about architecture than packaging. x86 has a ton of back-compat units, features and so on; it cannot be fundamentally changed. ARM is a RISC architecture and x86 is CISC, meaning the set of instructions an ARM CPU supports is smaller. Granted, that doesn't matter much in practice: if I can run a whole system on it without any limitation, it's not a big deal.

Bottom line is that x86 is very old and it has a lot of added features, instructions and so on, on top of a 40-year-old CPU design. Meaning that you are going to power a lot of "dead weight".

An APU would be a wet dream for manufacturers, because if you wanted to upgrade your system, you would have to swap out at the very least the CPU+GPU. However, looking at the power consumption of the big GPUs, that silicon would probably turn into a small sun very quickly unless some advancement were made on the GPU side. I am talking about high-end GPUs; mid-range GPUs would obviously be possible.

HOWEVER, nVidia (as we all know, the very best company in the PC space) is preparing an APU based on ARM64 and one of their GPUs. The problem with this is that it would require a new motherboard, so basically a whole new platform, which hopefully comes out. Another problem is MS having a bad time with its ARM64 <> x86 emulator, because people have always hated ARM builds of Windows: you can't run shit on them, only supported apps. Apple on the other hand did such a good job with Rosetta that you don't even know if you are running an x86 or ARM64 app. So hopefully that also gets resolved.

Because it's time™ to leave legacy tech behind.
 

Bo_Hazem

Banned
It's more about architecture than packaging. x86 has a ton of back-compat units, features and so on; it cannot be fundamentally changed. ARM is a RISC architecture and x86 is CISC, meaning the set of instructions an ARM CPU supports is smaller. Granted, that doesn't matter much in practice: if I can run a whole system on it without any limitation, it's not a big deal.

Bottom line is that x86 is very old and it has a lot of added features, instructions and so on, on top of a 40-year-old CPU design. Meaning that you are going to power a lot of "dead weight".

An APU would be a wet dream for manufacturers, because if you wanted to upgrade your system, you would have to swap out at the very least the CPU+GPU. However, looking at the power consumption of the big GPUs, that silicon would probably turn into a small sun very quickly unless some advancement were made on the GPU side. I am talking about high-end GPUs; mid-range GPUs would obviously be possible.

HOWEVER, nVidia (as we all know, the very best company in the PC space) is preparing an APU based on ARM64 and one of their GPUs. The problem with this is that it would require a new motherboard, so basically a whole new platform, which hopefully comes out. Another problem is MS having a bad time with its ARM64 <> x86 emulator, because people have always hated ARM builds of Windows: you can't run shit on them, only supported apps. Apple on the other hand did such a good job with Rosetta that you don't even know if you are running an x86 or ARM64 app. So hopefully that also gets resolved.

Because it's time™ to leave legacy tech behind.

Man, you're so deep into this shit, and even if I'm not as knowledgeable as you, you made it look pretty simple to understand!

 

M1chl

Currently Gif and Meme Champion
Man, you're so deep into this shit, and even if I'm not as knowledgeable as you, you made it look pretty simple to understand!

Well, I studied at an IT school, so you remember stuff, and for this topic I am using one of the actual next-gen CPUs and I couldn't be happier.

Cuz I am currently sitting in the woods, working and NeoGAFin', and have 10 whole hours to go! (MacBook M1 13 Pro) It's just annoying that it's the weekend and a lot of city people also had the idea to come out here too :messenger_pouting:

Need to find a better spot, but if I move too far, I will lose my precious 5G signal.
 

winjer

Gold Member
All chips from way back then used less power.
Consider that a Pentium 200, which was at one time the top end, consumed about 35W.

But chip manufacturing improved. And that meant higher clock speeds, which also meant higher power usage.
Power delivery also improved a lot, with the energy fed to these systems becoming more precise and stable.

Also, adding more execution pipelines to a CPU doesn't scale linearly. After a point, it becomes extremely difficult to build a front-end that can keep that many pipelines fed with instructions.
The same goes for adding CPU cores. So after a point, adding more just wastes die space.
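The core-count part is essentially Amdahl's law: if some fraction of the work is serial, extra cores give rapidly shrinking returns. A quick sketch (the 10% serial fraction below is just an assumed example, not a measurement of any real workload):

```python
# Amdahl's law: speedup = 1 / (serial + (1 - serial) / n_cores)
# Assumed example: 10% of the workload is inherently serial.

def amdahl_speedup(n_cores, serial_fraction=0.10):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

for n in (1, 2, 4, 8, 16, 64):
    print(f"{n:3d} cores -> {amdahl_speedup(n):.1f}x speedup")
# 2 cores ~1.8x, 8 cores ~4.7x, 64 cores only ~8.8x: more silicon, little gain.
```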

And finally, advancements in process nodes are slowing down.
 
Because platform holders back then had the common sense to know that an entertainment device for the living room, primarily targeted at children, shouldn't be a ridiculous power hog. Whereas today, these boxes seem to compete with 3K+ PC gaming rigs.
 

Alphagear

Member
Because they were less powerful.

Then again even the Top End CPUs and GPUs of the day used less power.
 

BlackTron

Member
But the size of the die(s) is not 4x PS2 so that shows the PS5 is less efficient with its silicon area.

Huh? That doesn't show that. If you really wanted to know with certainty, you'd have to calculate the ratio of silicon area to performance for both consoles and compare them. I'm pretty sure the PS5 will be an order of magnitude more efficient per unit of silicon.

The PS5 CPU is 95 times faster than the PS2's. If the PS5 were somehow LESS efficient per unit of silicon, it would need at least 95 times more of it, just based on that one statistic.

Imagine you have a machine that's 1 foot wide that spits out 1 pancake per minute, and then a machine that is two feet wide that spits out 100 pancakes a minute. But because it's twice as large, you call it less efficient for the materials in the machine. That is the type of comparison being made here, and it's silly!!
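To make that concrete: even with a deliberately generous assumption about how much bigger the PS5's silicon is, the per-area efficiency gain is huge. The die-area ratios below are hypothetical placeholders, not real measurements; only the 95x figure comes from the post above.

```python
# Perf-per-silicon-area sanity check.
# The 95x figure is from the post above; the area ratios are assumed.
perf_ratio = 95.0   # PS5 CPU vs PS2 CPU performance

for assumed_area_ratio in (2.0, 4.0):   # hypothetical "PS5 die is Nx bigger"
    gain = perf_ratio / assumed_area_ratio
    print(f"If PS5 silicon were {assumed_area_ratio:.0f}x the area: "
          f"~{gain:.0f}x more performance per unit of silicon")
```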
 

Drew1440

Member
A combination of using lower lithography and better processor design. PCs around that time (1995) also used a fraction of the power of today's high-end systems. Back in 2000 the highest-rated PSU you could get was 400W; now 1200W PSUs are common, but it wasn't until the Pentium 4 / NetBurst era that power usage in the PC space increased enough to warrant it. Over time efficiency has also increased, as there's no use in a CPU pulling 125W if most of it is being wasted as heat.

Xbox 360 is a good example of this, with the early 90nm units pulling around 205W of power, later reduced to 175W (65nm), then 150W (45nm), and eventually 125W.
Xbox 360 Revisions
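Put as percentages (wattages as quoted above), each shrink cut roughly 15% off the console's draw:

```python
# Xbox 360 power draw across revisions, figures as quoted above.
revisions = [("90 nm", 205), ("65 nm", 175), ("45 nm", 150), ("final", 125)]

for (node_a, w_a), (node_b, w_b) in zip(revisions, revisions[1:]):
    cut = 100 * (w_a - w_b) / w_a
    print(f"{node_a} -> {node_b}: {w_a} W -> {w_b} W ({cut:.0f}% less)")

total_cut = 100 * (revisions[0][1] - revisions[-1][1]) / revisions[0][1]
print(f"Total over the console's life: {total_cut:.0f}% less power")  # ~39%
```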

OP could have answered his own question by simply doing a Google search
Google? Why not use DuckDuckGo or Brave search?
 

AJUMP23

Gold Member
Electricity was a new invention when consoles hit the markets in the 70s and 80s. They hadn’t had a long time to gather up more electricity to put more in each console.
 