
AMD VEGA: Leaked TimeSpy DX12 benchmark?

llien

Member
AMD catching up to Intel was always a less likely scenario than catching up to Nvidia.

And this kind of comment puzzles me.

Intel has superior fabs on top of everything. nVidia doesn't.

I look at 290 vs 780Ti.
Then at Fury X vs 980Ti.

It's only after the switch to 14nm that there is a huge gap (due to the lack of any high-end card from AMD whatsoever) between Nvidia and AMD.


oh brother

I think that kind of myth is why people seriously expected the Switch to beat the Xbone/PS4 and even touch the PS4 Pro,
/sensible chuckle.
 

Renekton

Member
The hype for Vega is low to non-existent right now, so the mood in r/AMD is more of a "get it over with" thing. The upshot of this is there won't be a Fury/Polaris meltdown like previously (thanks for nothing, WCCFTech).
 

Firenze1

Banned
The hype for Vega is low to non-existent right now, so the mood in r/AMD is more of the "get it over with" thing. The upshot for this is there won't be a Fury/Polaris meltdown like previously (thanks for nothing WCCFTech).
Time to wait for Navi
 

Neizel

Member
The hype for Vega is low to non-existent right now, so the mood in r/AMD is more of the "get it over with" thing. The upshot for this is there won't be a Fury/Polaris meltdown like previously (thanks for nothing WCCFTech).

You can't generate hype if you haven't shown anything.

I'm just waiting to see what AMD can deliver to upgrade, but it's impossible to be hyped if I don't know what they have to offer.
 

tuxfool

Banned
Actually, Ryzen is indeed impressive, but it still cannot compete with Intel in per-core performance. Maybe it is possible for AMD to catch Intel in later years (I don't believe this at all), but it is impossible for AMD to catch or even compete with Nvidia. Nvidia isn't a company that only produces graphics cards for individual users. It is a company that contributes to AI, deep learning, etc. The technology that Nvidia uses is years ahead of AMD's.
Gosh you seem like the type that will last here.
 

Luigiv

Member
That can't possibly be right. My i7 3770k, GTX 1070 PC gets a score of 5905, and the only reason it's that low is because it's pulled down by my now 4 year old CPU. Comparatively the score in the OP is pulled up a bit by the CPU test.

If we compare just the graphics score then my 1070 gets a score over 6300, whilst the GPU in the OP only scored 5700. That would be pretty bad for AMD if that's all Vega is capable of. That can't be a top-end retail Vega card right, it has to be something else. Maybe a cut down variant or a new APU (though admittedly an 8 Zen core APU with that graphics performance would have an insane power draw).

Edit: Oops, misread the graphics score in the OP. Amended for accuracy. Point still stands though.
 
That can't possibly be right. That can't be a top-end retail Vega card right, it has to be something else. Maybe a cut down variant or a new APU (though admittedly an 8 Zen core APU with that graphics performance would have an insane power draw).



I too hope it's not. We need some form of competitiveness here
 

tuxfool

Banned
That can't possibly be right. My i7 3770k, GTX 1070 PC gets a score of 5905, and the only reason it's that low is because it's pulled down by my now 4 year old CPU. Comparatively the score in the OP is pulled up considerably by the CPU test.

If we compare just the graphics score then my 1070 gets a score over 6300, whilst the GPU in the OP only scored 5200. That would be disastrous for AMD if that's all Vega is capable of. That can't be a top-end retail Vega card right, it has to be something else. Maybe a cut down variant or a new APU (though admittedly an 8 Zen core APU with that graphics performance would have an insane power draw).
I should note that the CPU doesn't actually factor that heavily into the final score. If you compare 3D scores, as one should, it matters even less.
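For reference, a rough sketch of why that is, assuming the overall Time Spy score is roughly a weighted harmonic mean of the graphics and CPU sub-scores (the exact weights below are assumptions, around the commonly cited 85/15 split):

def timespy_overall(graphics_score, cpu_score, w_gpu=0.85, w_cpu=0.15):
    # Weighted harmonic mean of the two sub-scores (weights are assumed).
    return (w_gpu + w_cpu) / (w_gpu / graphics_score + w_cpu / cpu_score)

# Same graphics score, wildly different CPU scores:
print(round(timespy_overall(5700, 5000)))   # ~5583
print(round(timespy_overall(5700, 8000)))   # ~5957

A 3000-point swing in the CPU test moves the total by only a few hundred points, so comparing the graphics sub-scores directly is the safer bet.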
 

DSN2K

Member
dumb picture


That pic smells of fanboyism. Ryzen 5, for example, has pretty much made i5s redundant for new builds. Ryzen 7 is no slouch either... trading blows with Intel's £1000 CPUs.

I have Intel and Nvidia in my PC because it was built a year ago but if I was starting from scratch now I'd be sitting with Ryzen.
 
that pic smells of fanboyism. Ryzen 5 for example has pretty much made i5's redundant to new builds. Ryzen 7 is no slouch...trading blows with Intels £1000 CPU's.

I have Intel and Nvidia in my PC because it was built a year ago but if I was starting from scratch now I'd be sitting with Ryzen.

Outside of the 7700K there's little reason to pay the premium for an Intel CPU.

If Vega isn't good what the hell am I supposed to get that will make good use of my new Freesync monitor? :(

We need AMD to be more competitive so situations like this don't exist. Nvidia would absolutely support FreeSync if they didn't have such a stranglehold
 

ISee

Member
That can't possibly be right. My i7 3770k, GTX 1070 PC gets a score of 5905, and the only reason it's that low is because it's pulled down by my now 4 year old CPU. Comparatively the score in the OP is pulled up a bit by the CPU test.

If we compare just the graphics score then my 1070 gets a score over 6300, whilst the GPU in the OP only scored 5700. That would be pretty bad for AMD if that's all Vega is capable of. That can't be a top-end retail Vega card right, it has to be something else. Maybe a cut down variant or a new APU (though admittedly an 8 Zen core APU with that graphics performance would have an insane power draw).

Edit: Oops, misread the graphics score in the OP. Amended for accuracy. Point still stands though.

Your 1070 is clearly overclocked. Looking at your score I'd say it is running at about 1.95-2 GHz. The 3DMark database entry is probably from a prototype that is running at a lower clock speed and without optimised drivers. I'm confident that Vega will be able to run at least at a 1400 MHz standard boost clock, maybe even more. This could bring it to non-OC 1080 levels of performance. Not the breakthrough many hoped for, but a viable option if the price is right.
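A back-of-the-envelope version of that estimate, assuming the graphics score scales roughly linearly with core clock (a first-order approximation at best):

def scale_score(known_score, known_clock_mhz, target_clock_mhz):
    # Crude assumption: graphics score is roughly proportional to core clock.
    return known_score * target_clock_mhz / known_clock_mhz

# If the leaked entry (~5700 graphics) really ran at ~1200 MHz, as speculated
# later in the thread, a card boosting to ~1400 MHz might land around:
print(round(scale_score(5700, 1200, 1400)))   # 6650, i.e. closer to stock 1080 territory

Memory bandwidth, drivers and power limits all break that linearity, so treat it as a ceiling estimate rather than a prediction.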
 
that pic smells of fanboyism. Ryzen 5 for example has pretty much made i5's redundant to new builds. Ryzen 7 is no slouch...trading blows with Intels £1000 CPU's.

I have Intel and Nvidia in my PC because it was built a year ago but if I was starting from scratch now I'd be sitting with Ryzen.

Yeah, it's extremely funny if you're the type of person who likes to laugh at your own misfortune, and funny in the "market monopolies are funny" kind of way...
 

Kayant

Member
ArtificialIntelligence is actually a perfect name in hindsight based on posts made.

Anyway, I hope AMD does bring out something that is at least in the 1080 range with Vega. That said, I'm more interested in 1060-to-1070-range performance to upgrade from my 970, and in getting a decent but "cheap" FreeSync monitor.
 

Luigiv

Member
Your 1070 is clearly overclocked. Looking at your score I'd say it is running at about 1.95-2 GHz. The 3d mark database entry is probably from a prototype that is running at a lower clock speed and without optimised drivers. I'm confident that Vega will be able to run at least at 1400 MHz standart boost clock, maybe even more. This could bring it to non oc 1080 levels of performance. Not the breakthrough many hoped for, but a viable option if the price is right.

It only has a tiny 26MHz factory overclock (I can't manually set it any higher without it becoming unstable). It does boost to a little over 1.9GHz in actual operation, but that's entirely automatic, so it's still out-of-the-box performance. AMD GPUs are supposed to do the same thing, right?

That said, I am running a 900MHz overclock on the GDDR5 memory but that really shouldn't be a point of contention when Vega supposedly has HBM2.
 

ISee

Member
It only has a tiny 26MHz factory overclock (I can't manually set any higher without it becoming unstable). It does boost to a little over 1.9GHz in actual operation but that's entire automatically so it's still out of the box performance. AMD GPUs are suppose to do the same thing, right?

That said, I am running a 900MHz overclock on the GDDR5 memory but that really shouldn't be a point of contention when Vega supposedly has HBM2.

Doesn't matter if the overclock was applied by the card's manufacturer (like in this case) or by you. An overclock is an overclock, and comparing an overclocked GPU with a downclocked prototype doesn't work, no matter how you put it.
1900+ isn't just a tiny OC btw. Especially not on pre-OCed cards.
 
It only has a tiny 26MHz factory overclock (I can't manually set any higher without it becoming unstable). It does boost to a little over 1.9GHz in actual operation but that's entire automatically so it's still out of the box performance. AMD GPUs are suppose to do the same thing, right?

That said, I am running a 900MHz overclock on the GDDR5 memory but that really shouldn't be a point of contention when Vega supposedly has HBM2.

You're within a couple percent of the realistic maximum OC a user can expect
 

Durante

Member
To be fair, he's also not far above the minimum OC pretty much anyone can expect on a 1070. These GPUs clock very consistently.

I'd say the most representative comparison is always of the common clock you'll achieve in games with a given architecture. Of course, we'll only be able to do that with Vega once it's released.

Really? Which one?
Well, Vega will introduce tiled rasterization on AMD GPUs, while Nvidia introduced it with Maxwell in 2014 (without telling anyone).
 
To be fair, he's also not far above the minimum OC pretty much anyone can expect on a 1070. These GPUs clock very consistently.

I'd say the most representative comparison is always of the common clock you'll achieve in games with a given architecture. Of course, we'll only be able to do that with Vega once it's released.

Well, Vega will introduce tiled rasterization on AMD GPUs, while Nvidia introduced it with Maxwell in 2014 (without telling anyone).

I mean, AMD's been able to compete just fine at every performance bracket so far without it (assuming they released a GPU). I can't agree with the general sentiment of his statement at all. It's like saying AMD is years ahead of Nvidia in tech since they allow mixed compute and graphics to be freely intertwined in the same SM
 

Durante

Member
I mean amds been abe to compete just fine at every performance bracket so far without it(assuming they released a gpu). I cant agree with the general sentiment of his statement at all
In recent times they often competed while throwing more power/transistors/bandwidth at the problem though, for similar performance. Which isn't really sustainable.

That said, I also don't fully agree with the general sentiment of the original poster, but llien asked a specific question and I answered it.
 
In recent times they often competed while throwing more power/transistors/bandwidth at the problem though, for similar performance. Which isn't really sustainable.

That said, I also don't fully agree with the general sentiment of the original poster, but llien asked a specific question and I answered it.

That's fair.

The extra power and transistors were to some degree because AMD's architecture offered things on the compute side that Nvidia cut out starting with Kepler
 
What do people expect in terms of power draw?

I'm hoping for <200W, but I fear it'll be higher. GTX 1080 is 180W though, so it's not completely unrealistic, right?
 

Hesemonni

Banned
The hype for Vega is low to non-existent right now, so the mood in r/AMD is more of the "get it over with" thing. The upshot for this is there won't be a Fury/Polaris meltdown like previously (thanks for nothing WCCFTech).
Can't really blame them :/ That Vega event really was ... something.
 
Gosh you seem like the type that will last here.
that pic smells of fanboyism. Ryzen 5 for example has pretty much made i5's redundant to new builds. Ryzen 7 is no slouch...trading blows with Intels £1000 CPU's.

I have Intel and Nvidia in my PC because it was built a year ago but if I was starting from scratch now I'd be sitting with Ryzen.

$300+ Ryzens are listed in top 10 bestsellers on mindfactory.de, to address your picture about bestsellers.

8-core Ryzens are beating Intel's 8-cores at a number of practical tasks.
Intel still holds the single-core performance crown, but the gap is nowhere near as drastic as before, and there is a wide range of Intel CPUs that are much slower than that.

Competition is back in the CPU market, at least for now.



Really? Which one?
AMD catching up to Intel was always a less likely scenario than catching up to Nvidia. Granted, Intel basically giving up on anything more than small incremental gains in performance gave AMD an opening, but AMD has come reasonably close to Intel's IPC and exceeded their performance per dollar in the i5 range with the Ryzen 5s, and their 8-cores are on par with Intel's 8-cores in a lot of tasks already, at a drastically lower price.

I highly doubt that AMD spent all this time making a chip that is within Fury X range. I also doubt that AMD will have anywhere near the performance per watt of NVIDIA in the near future, but if they have a powerful chip that's priced right, people can ignore the fact that the chip will guzzle power.

oh brother


Guys, I am really sorry if I offend people. That is not my intent. I am currently a postdoctoral researcher at a university working in the aerospace industry. We are currently trying to improve UAVs' ability to self-control and react to different situations using machine learning. I can't even tell you how much Nvidia-based cards help us out there. Yes, I accept the 1800X has an amazing P/P ratio compared to the Intel-based chips, but in individual core performance the situation is a little bit different. On the Nvidia side, CUDA really helps us out there. And I am talking about my PC at the office. Sometimes we have access to the university's main computer, which has Tesla GPUs, and they really do some amazing jobs. Again, I truly apologize for that post. But that is the case from my perspective.
 

Luigiv

Member
Doesn't matter if the overclock was applied by the cards manufacturer (like in this case) or by you. An overclock is an overclock and comparing an overclocked gpu with a downclocked prototype doesn't work, no matter how you put it.
1900+ isn't just a tiny oc btw. Especially not on pre OCed cards.

I don't think you're aware of how GPU Boost works. The card does that by itself based on internal telemetry, which has little to do with any overclocking applied. Even if I turn off the factory overclock, the card will still automatically run at more or less the same speeds when stressed. The 26MHz overclock is not making any appreciable difference to my benchmark score. I would have to intentionally go out of my way to disable GPU Boost in order not to hit those metrics. My GPU performance is not atypical of the out-of-the-box experience many 1070 owners get. Given that's what AMD is competing against, I don't think it's unfair to use it as a point of comparison.

Anyway, it was never my point to perform an apples to apples comparison (how could I possibly do that when we have no freaking idea what this unnamed card actually is?), my point was that the score in the OP is too low, considering the market, and that can't possibly be representative of the top Vega's true performance (which is the same argument you're making).
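For illustration only, a toy model of the behaviour being described here. This is not Nvidia's actual GPU Boost algorithm, just the idea that the card opportunistically adds clock on top of its rated boost until it hits a power or thermal limit, so a 26 MHz factory offset barely registers; every number below is a placeholder except the 1683 MHz reference 1070 boost clock:

def observed_clock(rated_boost_mhz, factory_offset_mhz,
                   temp_c, temp_limit_c=83, mhz_per_degree=10):
    # Simplified GPU Boost-style behaviour: add clock for every degree of
    # thermal headroom on top of the rated boost, down to zero at the limit.
    headroom = max(temp_limit_c - temp_c, 0) * mhz_per_degree
    return rated_boost_mhz + factory_offset_mhz + headroom

# With or without the small factory OC, the card lands in the same ballpark:
print(observed_clock(1683, 26, temp_c=63))   # 1909 MHz
print(observed_clock(1683, 0, temp_c=63))    # 1883 MHz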
 

rrs

Member
Come on, ~1080 performance for $300, so I might have a chance of affording a GPU upgrade before my current GPU goes out
 
Guys, I am really sorry if I offend people. That is not my intent. I am currently a postdoctoral researcher at a university working in the aerospace industry. We are currently trying to improve UAVs' ability to self-control and react to different situations using machine learning. I can't even tell you how much Nvidia-based cards help us out there. Yes, I accept the 1800X has an amazing P/P ratio compared to the Intel-based chips, but in individual core performance the situation is a little bit different. On the Nvidia side, CUDA really helps us out there. And I am talking about my PC at the office. Sometimes we have access to the university's main computer, which has Tesla GPUs, and they really do some amazing jobs. Again, I truly apologize for that post. But that is the case from my perspective.

CUDA is software. From a hardware perspective, AMD's GPUs are probably better suited for the jobs you just mentioned

I don't think you're aware how GPU Boost works. The card does that by itself based on internal telemetry, which has little do with any overclocking applied. Even if I turn off the factory overclock, the card will still automatically run at more or less the speeds when stressed. the 26MHz overclock is not making any appreciable difference to my benchmark score. I would have to intentionally go out of my way to disable GPU boost in order not to hit those metrics. My GPU performance is not atypical to the out of the box experience many 1070 owners get. Given that's what AMD is competing against, I don't think it's unfair to use it as a point of comparison.

Anyway, it was never my point to perform an apples to apples comparison (how could I possibly do that when we have no freaking idea what this unnamed card actually is?), my point was that the score in the OP is too low, considering the market, and that can't possibly be representative of the top Vega's true performance (which is the same argument you're making).

Because higher-end custom cards usually have custom BIOSes. 1900+ is not what every 1070 owner is getting. Not everyone overclocks
 

ISee

Member
I don't think you're aware how GPU Boost works. The card does that by itself based on internal telemetry, which has little do with any overclocking applied. Even if I turn off the factory overclock, the card will still automatically run at more or less the speeds when stressed. the 26MHz overclock is not making any appreciable difference to my benchmark score. I would have to intentionally go out of my way to disable GPU boost in order not to hit those metrics. My GPU performance is not atypical to the out of the box experience many 1070 owners get. Given that's what AMD is competing against, I don't think it's unfair to use it as a point of comparison.

Anyway, it was never my point to perform an apples to apples comparison (how could I possibly do that when we have no freaking idea what this unnamed card actually is?), my point was that the score in the OP is too low, considering the market, and that can't possibly be representative of the top Vega's true performance (which is the same argument you're making).

I'm absolutely aware of how GPU Boost works. I've overclocked several Kepler, Maxwell and Pascal cards in the last couple of years, and I've also edited several boost clock states directly in the BIOS. This doesn't make me an OC guru, but it gives me enough experience to know that you're running an overclocked card in the first place and to know what 3DMark/TimeSpy scores to expect.
You're also ignoring an important factor that I've now tried to explain to you on two occasions: we don't know how fast the RX Vega is going to boost; the 1200 MHz is probably the standard non-boost clock speed. Because, just in case you aren't aware, AMD also uses variable clock states, aka "GPU boost".
If you want to compare two different GPU architectures you have to either max both out or let them run at AMD/Nvidia specifications. Any other way, you can make things look either worse or better than they really are.

Quick example out of my 3d mark database.

GTX 980 highly overclocked: ~5000
GTX 1070 overclocked: ~6300
GTX 1080 running at nVidia standard clock: ~ 6700

The 1080 looks like a failure here and the 980 and 1070 like very good cards. But if you max out the 1080:

GTX 1080 highly overclocked: 8150

It's another story.

But you'll probably just ignore this, again.
 

Luigiv

Member
Because higher end custom cards usually have custom bioses. 1900+ is not what every 1070 owner is getting. Not everyone overclocks
In game, or when checking their "boost" clock in an external application? Out of genuine curiosity. Because according to Gigabyte Xtreme, my "boost clock" is only 1823MHz, which I've never actually seen it run at. I only found out it was boosting as high as it was by turning on the GPU monitor in 3DMark. GPU Boost is a weird setup; I don't quite get the point of it.

I'm absolutely aware how GPU boost works. I've overclocked several keplar, maxwell and pascal cards in the last couple of years and I also edited several boost clock states directly in bios. This doesn't make me an OC guru, but it gives me enough experience to know that you're running an overclocked card in the first place and to know what 3d mark/TimeSpy scores to expect.
You're also ignoring an important factor that I tried to explain to you now on two occasion: We don't know how fast the RX Vega is going to boost, the 1200 MHz is probably the standard non boost clock speed. Because just in case you aren't aware, AMD is also using variable clock states aka "GPU boost".
If you want to compare two different gpu architectures you have to either max both out or let them run at amd/nvidia specifications. Because any other way you can make things either look worse or better than they really are.

Quick example out of my 3d mark database.

GTX 980 highly overclocked: ~5000
GTX 1070 overclocked: ~6300
GTX 1080 running at nVidia standard clock: ~ 6700

The 1080 looks like a failure here and the 980 and 1070 like very good cards. But if you max out the 1080:

GTX 1080 highly overclocked: 8150

It's another story.

But you'll probably just ignore this, again.

Ok, I get it, my card is running at a high clock. That doesn't change the point that 5700 is still too low for what's supposed to be AMD's flagship (when comparing it to stock 1070s and 1080s), and I believe retail Vega will score considerably better (for whatever reason that may ultimately be).
 
In game or when checking their "boost" clock in an external application? Because according to Gigabyte Xtreme, my "boost clock" is only 1823MHz, which I've never actually seen it run at. I only found out it was boosting as high as it was by turning on the GPU monitor in 3Dmark. GPU Boost is a weird set up, I don't quite entirely get the point of.



Ok, I get it, my card is running at high clock. That doesn't change the point that 5700 is still too low for what's suppose to be AMDs flagship (when comparing it to stock 1070 and 1080s) and I believe retail Vega will score considerably better (for whatever reason that may ultimately be).

The Gigabyte Extreme 1070 isn't the only board on the market
 

Luigiv

Member
Gigabyte extreme 1070 isnt the only board on the market

Gigabyte Extreme is the GPU settings and tweaking tool I use, not the board. It reports a boost clock of 1823MHz, despite the real boost clock being much higher. Had I not known better, I would have just believed my boost clocks were only that. From my understanding this sort of disparity is normal, due to how GPU Boost 3.0 works (which is a standard feature of all Pascal cards, not just Gigabyte ones). That's why I ask, out of genuine curiosity, given how easy it is to be misled.

Also, I'm pretty sure Polaris cards do the same thing, though I'm less familiar with them, so I could be wrong.
 

ethomaz

Banned
Seems similar to the Sonic cycle...

Also, my condolences to people that bought AMD shares recently... it plummeted 25% on Tuesday following their latest quarterly earnings call... 25%!!!
I remember some guy in the Ryzen thread saying he was happy because of the AMD shares he bought, lol.

Curious to see his reaction now.
 
Gigabyte Extreme is the GPU settings and tweaking tool I use, not the board. It reports a boost clock of 1823MHz, despite the real boost clock being much higher. Had I not known better I would have just believed my boost clocks were only that. From my understanding this sort of disparity is normal, due to how GPU Boost 3.0 work (which is a standard feature of all Pascal cards, not just Gigabyte ones). That's why I ask, out of genuine curiosity, given how easy it is to be mislead.

Also I'm pretty sure Polaris cards do the same thing, though I'm less familiar with them, do I could be wrong.

It's how boost works; my point is there are lots of 1070s out there that aren't boosting that high without manual overclocks
 

ethomaz

Banned
Perf/watt is irrelevant for 99% of consumers
Tell that to the world because every silicon company is working hard to reach better perf/watt.

That is really a big deal in CPUs and even more in GPUs... anybody that has an RX 480 knows how bad its power usage is, to the point that it doesn't reach its default clocks most of the time.

No. It is not irrelevant.
 
Tell that to the world because every silicon company is working hard to reach better perf/watt.

That is really a big deal in CPUs and even more in GPUs... anybody that has an RX 480 knows how bad its power usage is, to the point that it doesn't reach its default clocks most of the time.

No. It is not irrelevant.

Not reaching default clocks? Evidence needed. I said it doesn't matter for consumers. If product A performs as well as or better than product B while using 25% more power, it doesn't matter outside of fanboy drivel

Eg:

390/390x v 970/980
Furyx v 980ti
480/580 v 1060
 

Steel

Banned
What do people expect in terms of powerdraw?

I'm hoping for <200W, but I fear it'll be higher. GTX 1080 is 180W though, so it's not completely unrealistic, right?

No, the one thing you can expect from AMD is that it will guzzle power. That's a given.
 

Caayn

Member
Tell that to the world because every silicon company is working hard to reach better perf/watt.

That is really a big deal in CPUs and even more in GPUs... anybody that has an RX 480 knows how bad its power usage is, to the point that it doesn't reach its default clocks most of the time.

No. It is not irrelevant.
Fully anecdotal: As a consumer I've never looked at the power draw of my components when building a desktop. I always looked at the performance I could get within my budget.

The same performance with a lower power draw is nice to have, but won't be the deciding factor for me.
 

ethomaz

Banned
Fully anecdotal: As a consumer I've never looked at the power draw of my components when building a desktop. I always looked at the performance I could get within my budget.

The same performance with a lower power draw is nice to have, but won't be the deciding factor for me.
How do you do a build without looking at the power draw??? You need to choose the power supply based on your build's power draw.

Even PCPartpicker works based on this premise.
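A minimal sketch of the kind of estimate PCPartpicker does (the component wattages here are placeholder figures, not measurements):

def recommended_psu_watts(component_watts, headroom=1.3):
    # Sum the estimated draw of every component and add ~30% headroom so
    # the PSU isn't being run at its redline.
    return sum(component_watts.values()) * headroom

build = {
    "cpu": 95,                  # rough TDP for an 8-core Ryzen under load
    "gpu": 200,                 # ballpark worst case for an RX 480-class card
    "board_ram_ssd_fans": 60,   # everything else, ballpark
}
print(recommended_psu_watts(build))   # ~460W -> a 500-550W unit is comfortable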
 
Fully anecdotal: As a consumer I've never looked at the power draw of my components when building a desktop. I always looked at the performance I could get within my budget.

The same performance with a lower power draw is nice to have, but won't be the deciding factor for me.
Edit: I hate auto-correct and touch screens
How do you pick a power supply?
 
Fully anecdotal: As a consumer I've never looked at the power draw of my components when building a desktop. I always looked at the performance I could get within my budget.

The same performance with a lower power draw is nice to have, but won't be the deciding factor for me.

You mean you don't monitor your WattMan while you play?? Get out!!
 

Caayn

Member
How do you do a build without looks at the power draw??? You need to choose the power supply based on your build power draw.

Edit: I hate auto-correct and touch screens
How do you pick a powers supply?
Allow me to clarify, not when deciding between components. I don't pick my PSU based on the power draw of the current system. And even then I always pick a PSU with more capacity than I need. So that I've got room to swap and add components if I desire to do so in the future.
Even PCPartpicker works based in this premisse.
It's not strange that PCPartpicker keeps the power draw in mind as it needs to validate the PSU compatibility with the rest of the components.
 

tuxfool

Banned
How do you do a build without looks at the power draw??? You need to choose the power supply based on your build power draw.

Even PCPartpicker works based in this premisse.
Most power supplies are fine if you're not buying a psu at the redline. This reeks of concern trolling.


Also do people buy a new psu every time they change graphics cards?
 

ethomaz

Banned
Allow me to clarify, not when deciding between components. I don't pick my PSU based on the power draw of the current system. And even then I always pick a PSU with more capacity than I need. So that I've got room to swap and add components if I desire to do so in the future.

It's not strange that PCPartpicker keeps the power draw in mind as it needs to validate the PSU compatibility with the rest of the components.
So you do take power draw into consideration when building a system, even if you get more capacity than you need.

Most power supplies are fine if you're not buying a psu at the redline. This reeks of concern trolling.


Also do people buy a new psu every time they change graphics cards?
They are fine for sub-100W GPUs... the RX 480 can reach 200W in some cases without any overclocking... with anything below a 500W PSU you will need to check the other components.

It's sad/disappointing that an "enthusiast" would actually post this
The same can be said about these "enthusiasts" saying perf/watt is irrelevant on actual modern hardware lol
 
Gonna coast on my 480 for a bit still. Hope they have better high-end cards when I'm ready to upgrade.
Allow me to clarify, not when deciding between components. I don't pick my PSU based on the power draw of the current system. And even then I always pick a PSU with more capacity than I need. So that I've got room to swap and add components if I desire to do so in the future.

It's not strange that PCPartpicker keeps the power draw in mind as it needs to validate the PSU compatibility with the rest of the components.
Yeah, fair. I guess I've also gone the overkill route with my current PSU and ignored power draw. I only wound up doing that 'cause I found something on sale, though
 
How do you do a build without looks at the power draw??? You need to choose the power supply based on your build power draw.

Even PCPartpicker works based in this premisse.

Of course power draw is important. But when you are talking about 150W versus 200W I couldn't care less (1060 vs 480). If Vega performs slightly faster than a 1080 and draws twice the amount of power it's a problem. If it has that performance and needs 25% more juice I couldn't give a monkeys.
 