
4K Video Gaming is already here (Toshiba Video Demo)

majik13

Member
Did you miss the part where I could read the words "solar charge level"?


People trying to take 1080P images & convert them to 4K are missing the point of 4K. The point of 4K is to have 8 million pixels, each carrying its own bit of detail; upscaling a 1080P image to 4K gives you about 6 million interpolated "fake" pixels to fill in the gaps, so a lot of detail will be missing. And even then it's no longer 1080P, it's 1080P upscaled to 4K, so it's still not giving you a real comparison.


Look at 1/4 of a 4K image on a 1080P monitor & picture three more 1080P monitors surrounding that monitor with the rest of the scene in them; that's what you will be getting with 4K.

It's not about making a 1080P scene bigger, it's about being able to see that 1080P scene plus everything around it - 4x the area in total.

& that 1/4 of the scene that used to fill the whole screen at 1080P will still have the same detail it had before; the difference is you can now see everything that was surrounding it.


Again, I am guessing he made this as a comparison to 4K, not to simulate 4K - just a way to judge the exact same image at 2K res and 4K res. This is just simulating an upscale so they line up.

As a comparison: my job has a lot to do with resolutions; we have made 10K still renders for theatrical posters from some of our 2K renders for theater screens.

If I remember correctly, when I would scale the 10K render down to 2K just to compare them, it still had significantly more detail.

I don't handle the renders and don't know the technicalities, but I am guessing it has to do with the algorithms throwing out info/detail that isn't necessary, and the AA that is built in. This is using the same renderer that Pixar uses, or at least made (RenderMan).

I'll try to actually check on this tomorrow to see if I am remembering this correctly, but I just wanted to say that down-scaling 4K renders to 2K may not completely and accurately match the same thing rendered at 2K.
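Whether a supersampled-then-downscaled render beats a native low-res render is an empirical question, but the mechanism is easy to sketch: a 4K-to-2K resize averages neighbouring pixels, which acts like 4x supersampling anti-aliasing. A toy box-filter downscale (pure Python, function name mine):

```python
def downscale_2x(img):
    """Box-filter 2x downscale: average each 2x2 block of pixels.

    `img` is a list of rows of grayscale values; width and height must
    be even. Rendering high and resizing down effectively supersamples,
    which is why the result can look cleaner than a native low-res
    render with little or no AA.
    """
    h, w = len(img), len(img[0])
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A hard black/white edge in the high-res image becomes a smoothed
# gradient in the downscaled one - the jagged step is averaged away.
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
low_res = downscale_2x(hi_res)
```

Real renderers use better filters than a plain box average, but the averaging principle is the same.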
 

majik13

Member
Also, not sure where all this "4K is 4 times more resolution than HD" comes from. I even see journalists on tech websites say so. But correct me if I am wrong.

HD is essentially
1920x1080

4k is either
3840 x 2160
4096 x 2160

At most the width is only 2.1333 times larger, and the height is exactly 2x.

HD resolution is much closer to 2K (2048 × 1556, down to ~1000 lines) than to 1K.

So I think it is safe to say 4k is 2 times larger than HD.
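The "2x vs 4x" confusion above is linear dimensions versus total pixel count; both can be checked in a few lines (the resolution figures are the standard ones, variable names mine):

```python
# Standard resolution figures; "4K" can mean either the consumer UHD
# standard or the DCI cinema container.
HD = (1920, 1080)
UHD_4K = (3840, 2160)   # consumer "4K" TVs
DCI_4K = (4096, 2160)   # digital cinema

def pixel_count(res):
    width, height = res
    return width * height

width_ratio = DCI_4K[0] / HD[0]                         # ~2.1333 (the figure above)
uhd_area_ratio = pixel_count(UHD_4K) / pixel_count(HD)  # exactly 4.0
dci_area_ratio = pixel_count(DCI_4K) / pixel_count(HD)  # ~4.27
```

So both camps are right about different things: the width only roughly doubles, but the total pixel count quadruples.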
 

Zeppelin

Member
People trying to take 1080P images & convert them to 4K are missing the point of 4K. The point of 4K is to have 8 million pixels, each carrying its own bit of detail; upscaling a 1080P image to 4K gives you about 6 million interpolated "fake" pixels to fill in the gaps, so a lot of detail will be missing. And even then it's no longer 1080P, it's 1080P upscaled to 4K, so it's still not giving you a real comparison.

You've got a lot of "fake" data in 4K footage too. The RED, for example, uses pretty heavy lossy compression when shooting at 4K.

Also, not sure where all this "4K is 4 times more resolution than HD" comes from. I even see journalists on tech websites say so. But correct me if I am wrong.

HD is essentially
1920x1080

4k is either
3840 x 2160
4096 x 2160

At most the width is only 2.1333 times larger, and the height is exactly 2x.

HD resolution is much closer to 2k than 1k

4096 x 2160 = 8 847 360 pixels
1920 x 1080 = 2 073 600 pixels

8 847 360 / 2 073 600 = 4.26666667
 

Fredrik

Member
640x480 = 480p
1280x720 = 720p
1920x1080 = 1080p

3840x2160 = 4k?

Why are they suddenly focusing on the horizontal resolution instead of the vertical resolution?

2k or 2160p doesn't sound impressive enough?
 

senahorse

Member
Also, not sure where all this "4K is 4 times more resolution than HD" comes from. I even see journalists on tech websites say so. But correct me if I am wrong.

HD is essentially
1920x1080

4k is either
3840 x 2160
4096 x 2160

At most the width is only 2.1333 times larger, and the height is exactly 2x.

HD resolution is much closer to 2k than 1k


edit: fixed
(3840x2160) / (1920x1080) = 4
 

majik13

Member
You've got a lot of "fake" data in 4K footage too. The RED, for example, uses pretty heavy lossy compression when shooting at 4K.



4096 x 2160 = 8 847 360 pixels
1920 x 1080 = 2 073 600 pixels

8 847 360 / 2 073 600 = 4.26666667

Ah yes, multiplication, duh, thanks. I knew I was forgetting something.
Basically the 2.133 × 2 from my previous math.
 
Not on the RED it isn't.

I think he means quarter HD, which would be "qHD," not QHD ("Quad HD").

EDIT:
Just for clarification, Quad HD is:

3840 x 2160

NOT

4096 x 2160.

Most of the industry has adopted "4K" as 4 times the HD standard, to keep the same aspect ratio.
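The naming split in this post boils down to aspect ratios; a small sketch (table follows the thread's usage of "Quad HD" for quadrupled Full HD, i.e. UHD, not the 2560x1440 monitor resolution):

```python
from fractions import Fraction

# Rough naming table as used in this thread:
formats = {
    "Full HD": (1920, 1080),
    "Quad Full HD / UHD": (3840, 2160),  # what TVs sell as "4K"
    "DCI 4K": (4096, 2160),              # digital-cinema container
}

def aspect_ratio(res):
    width, height = res
    return Fraction(width, height)

# UHD keeps HD's 16:9 aspect ratio and exactly quadruples the pixel
# count; DCI 4K is slightly wider (256:135, about 1.9:1), which is why
# it cannot be a clean multiple of the 16:9 HD frame.
```

That wider DCI frame is the reason "4K" numbers don't divide evenly by the HD ones in the posts above.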
 

SappYoda

Member
Also, not sure where all this "4K is 4 times more resolution than HD" comes from. I even see journalists on tech websites say so. But correct me if I am wrong.

HD is essentially
1920x1080

4k is either
3840 x 2160
4096 x 2160

At most the width is only 2.1333 times larger, and the height is exactly 2x.

HD resolution is much closer to 2K (2048 × 1556, down to ~1000 lines) than to 1K.

So I think it is safe to say 4k is 2 times larger than HD.

Maybe this can make it clearer

[image: Ue9pJ.jpg]
 

Neo C.

Member
Eventually people will buy 4k TVs. I don't think the resolution will be the reason for most though. Even now I don't see many people using their TVs mostly for HD content, at least when they aren't console gamers.
 

Hyphen

Member
All I know is that I spent £1600 on a 37" Pioneer LCD screen about 4 years ago, and look what I can buy with that kind of money now. Whatever the next best/big tv thing is, I'm waiting until the dust settles...
 

mrklaw

MrArseFace
Ok. So I've done some research and testing myself - I'll concede that there is certainly more nuance to the nature of visual acuity than the familiar 1-arcminute rule of thumb cited as the limit of normal 20/20 visual acuity.

Great post Zaptruder.
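The 1-arcminute rule of thumb mentioned above translates directly into a viewing-distance estimate: beyond the distance at which one pixel subtends less than an arcminute, extra resolution is invisible to 20/20 vision. A simplified small-angle sketch (function name mine; as the post says, real acuity is more nuanced than this):

```python
import math

ONE_ARCMINUTE = math.radians(1 / 60)  # ~0.00029 rad, the 20/20 acuity limit

def resolving_distance_m(screen_width_m, horizontal_pixels):
    """Distance beyond which adjacent pixels blur together for 20/20 vision.

    Small-angle approximation: distance = pixel pitch / subtended angle.
    """
    pixel_pitch = screen_width_m / horizontal_pixels
    return pixel_pitch / ONE_ARCMINUTE

# Doubling the horizontal resolution halves the distance at which the
# pixel grid stops being visible - you have to sit twice as close to a
# 4K panel of the same size to get any benefit from it.
```

For a 1 m wide 1080p panel this comes out to roughly 1.8 m, which is why viewing distance dominates the 4K debate.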
 

onQ123

Member
Also, not sure where all this "4K is 4 times more resolution than HD" comes from. I even see journalists on tech websites say so. But correct me if I am wrong.

HD is essentially
1920x1080

4k is either
3840 x 2160
4096 x 2160

At most the width is only 2.1333 times larger, and the height is exactly 2x.

HD resolution is much closer to 2K (2048 × 1556, down to ~1000 lines) than to 1K.

So I think it is safe to say 4k is 2 times larger than HD.

1920 x 1080 = 2073600

3840 x 2160 = 8294400

that's 4x the pixels: 2x more lines going left to right & 2x more going top to bottom.

That's 4x the resolution.


Edit: should have read the post after yours, it would have saved me some time SMH lol.
 

xenist

Member
It is a bit off topic but it makes me sad seeing young people, probably technologically inclined, post on a message board with opinions like 'good enough' regarding tech stuff.

Am I the only one that dreams of machines with a thousand cores, running at 1 THz, using the entirety of Earth's silicon, feeding video signal with a million lines of horizontal resolution straight into my optic nerve? Who cares about me noticing the difference? When did technology for its own sake stop being cool for these people? I love looking at my PC's resources and seeing I'm only using a third of them when I'm playing a game. More power is cool. More pixels are cool.

Seriously, screw 'good enough.'
 

leadbelly

Banned
If you have a camera, take a 2MP picture then an 8MP picture, print them both out at the same size, and that should give you a good idea of how much more detail you get from 4K compared to 1080P.

That would be reliant on developers creating assets at that resolution, though. It will make the graphics look that little bit sharper, but the detail is only going to be as high as the textures and assets the developers use for the game. At this moment in time I imagine it is completely irrelevant.

I wonder if the latest graphics cards are even powerful enough for true 4K gaming.
 

onQ123

Member
It is a bit off topic but it makes me sad seeing young people, probably technologically inclined, post on a message board with opinions like 'good enough' regarding tech stuff.

Am I the only one that dreams of machines with a thousand cores, running at 1 THz, using the entirety of Earth's silicon, feeding video signal with a million lines of horizontal resolution straight into my optic nerve? Who cares about me noticing the difference? When did technology for its own sake stop being cool for these people? I love looking at my PC's resources and seeing I'm only using a third of them when I'm playing a game. More power is cool. More pixels are cool.

Seriously, screw 'good enough.'


I've never seen anything like it in my life: thanks to the early days of the PS3 vs Xbox 360 war, there is now a war against higher resolution & higher-capacity disc formats.



& a war on $599



Edit:


That would be reliant on developers creating assets at that resolution, though. It will make the graphics look that little bit sharper, but the detail is only going to be as high as the textures and assets the developers use for the game. At this moment in time I imagine it is completely irrelevant.

I wonder if the latest graphics cards are even powerful enough for true 4K gaming.


Games are already created with far higher-detail models than what you see in the shipped games.
 

Zaptruder

Banned
It is a bit off topic but it makes me sad seeing young people, probably technologically inclined, post on a message board with opinions like 'good enough' regarding tech stuff.

Am I the only one that dreams of machines with a thousand cores, running at 1 THz, using the entirety of Earth's silicon, feeding video signal with a million lines of horizontal resolution straight into my optic nerve? Who cares about me noticing the difference? When did technology for its own sake stop being cool for these people? I love looking at my PC's resources and seeing I'm only using a third of them when I'm playing a game. More power is cool. More pixels are cool.

Seriously, screw 'good enough.'

Some of us care about efficacy and are wise enough to stop when that goes away.
 

M3d10n

Member
Would take games rendered in actual 1080p with good AA + locked fps over 4k, at least for the next console cycle. Not ready for 4k yet.
I agree with this. Without antialiasing, higher resolutions are meaningless for me. I have a Retina iPad and I can still see the jaggies in games even at normal viewing distances. Text, photos and other antialiased/supersampled content looks glorious, however. The human eye is way too good at picking out high-contrast details, it seems.
 

SapientWolf

Trucker Sexologist
It is a bit off topic but it makes me sad seeing young people, probably technologically inclined, post on a message board with opinions like 'good enough' regarding tech stuff.

Am I the only one that dreams of machines with a thousand cores, running at 1 THz, using the entirety of Earth's silicon, feeding video signal with a million lines of horizontal resolution straight into my optic nerve? Who cares about me noticing the difference? When did technology for its own sake stop being cool for these people? I love looking at my PC's resources and seeing I'm only using a third of them when I'm playing a game. More power is cool. More pixels are cool.

Seriously, screw 'good enough.'
I'm not a Luddite. There is just a point where increases in resolution have no appreciable effect on image quality on an LCD at a certain viewing distance, and you get severe diminishing returns well before that. Gamers should be pushing for better contrast ratios, lower input lag, 120Hz input capability, better viewing angles, better color reproduction, less ghosting, etc. Most consumer LCD HDTVs are complete ass when it comes to those things, and 4K resolution alone won't change that.

When it comes to IQ, I'm more excited about the new post processing AA solutions on the horizon. The reduction in aliasing is like night and day and the performance hit is negligible.
 

DCharlie

And even i am moderately surprised
Would take games rendered in actual 1080p with good AA + locked fps over 4k, at least for the next console cycle. Not ready for 4k yet.

Not particularly aimed at you, but it seems that the inclusion of 4K support (or should that be "4K Support") is being interpreted as WELL THAT'S IT, EVERYTHING IS 4K!

It's not - it's an optional resolution for the machines to support; quite how well they can support it is a completely different matter. This gen has just run for 6+ years. You think 4K won't be a factor in 6 years?

Supporting it in whatever capacity the machines can (and I'm not deluded enough to think we'll be seeing every game on the next-gen machines supporting this) is an added tick box, an added boon. And if Blu-ray as-is, if HDMI cabling standards etc. suffice to let the PS4 or whatever play new media and give us at least a couple of games that support the new standard? What's the big fucking deal?!

It doesn't matter - even if the next-gen machines can do 1080P at 60fps locked with all details and all manner of effects, there will be MANY devs who decide to go 720P at 30fps so they can ramp up detail.

All 4K does is add ANOTHER choice for developers - it might be completely borked by hardware limitations, but it's a possible supported standard. I don't understand WHY anyone would rail against this. It's not like it makes 1080P/720P any less of an option. No one has a gun to devs' heads saying "4K or else!"
 

xenist

Member
Some of us care about efficacy and are wise enough to stop when that goes away.

What does efficiency have to do with not pushing for more powerful hardware?

Efficiency is important when you want to exploit a resource that exists in a finite amount. Does having more powerful video chips subtract processing cycles from some underground deposit? The world is not going to run out of teraflops.

Once the hardware is out then yes, be efficient at exploiting it. But I fail to understand how increased resolutions are inefficient by themselves.
 

Zaptruder

Banned
What does efficiency have to do with not pushing for more powerful hardware?

Efficiency is important when you want to exploit a resource that exists in a finite amount. Does having more powerful video chips subtract processing cycles from some underground deposit? The world is not going to run out of teraflops.

Once the hardware is out then yes, be efficient at exploiting it. But I fail to understand how increased resolutions are inefficient by themselves.

1. cost
2. materials required to construct new panels

If you go back a page, you'll see my reconsidered viewpoint on 4k - yes, there'll be some benefits, but it's not something that really excites me.

At some point, you'll simply be in the situation where there are no longer any perceivable benefits - and with 4k panels we're nearing that border in practical situations - and then it'd just be a real waste of money and silicon.

Of course the tech industry doesn't quite work this way, but it'd be nice to see them spending the R&D money on things that have a greater effect on the end-user experience.

It's why I keep trumpeting the VR revolution - I can see few things making a bigger impact on the end-user experience, and it's also the ultimate convergent destination of all the various computing technologies developed over the last century.
 

AlStrong

Member
Why are they suddenly focusing on the horizontal resolution instead of the vertical resolution?

2k or 2160p doesn't sound impressive enough?

Simply because digital cinematography isn't always done at a 16:9 resolution (it depends on the camera they use).
 
That would be reliant on developers creating assets at that resolution, though. It will make the graphics look that little bit sharper, but the detail is only going to be as high as the textures and assets the developers use for the game. At this moment in time I imagine it is completely irrelevant.

I wonder if the latest graphics cards are even powerful enough for true 4K gaming.
Gears of War 1

Not 3.

1.

 

Antiochus

Member
The cardinal question:

Can 2160p resolution not only fit on TVs under 50 inches (or computer monitors under 30"), but do so in a way that is still visibly, demonstrably, unequivocally superior to the standard 1080p resolution for the general consumer?

If not, then the whole standard is dead in the water.
 
Great post. Not enough people get this concept.

They really don't. It's annoying to be honest. They start from the top down. It's easier that way.

The cardinal question:

Can 2160p resolution not only fit on TVs under 50 inches (or computer monitors under 30"), but do so in a way that is still visibly, demonstrably, unequivocally superior to the standard 1080p resolution for the general consumer?

If not, then the whole standard is dead in the water.

This has been addressed several times before.

Let's not read the thread though, that's ok.
 

Salacious Crumb

Junior Member
I was listening to some old 1up yours episodes from '06 the other day and most of the guys thought 360 and PS3 supporting 1080p was pointless.

I'm all for 4K support, but all it will amount to in next-gen consoles is an upscaler and an HDMI spec that supports the resolution. The vast majority of 3D games will not be rendered natively at 4K. That's the reality, thanks to the GPU tech available now.

You will not see Watch Dogs rendered natively at 4k on next gen consoles.
 

tzare

Member
As many have said, people didn't care about 720p or 1080p a few years ago either, and we have spent all generation discussing sub-HD gaming; Digital Foundry's face-offs and the forums are full of pixel counters. Plus people get mad about the iPad's display resolution, and now nobody cares? Interesting, to say the least.
 

Antiochus

Member
This has been addressed several times before.

Let's not read the thread though, that's ok.

On the contrary, many arguments and counterarguments have been made in the fashion of a debate, but an actual, solid answer to those questions has not been given yet.
 

xenist

Member
1. cost
2. materials required to construct new panels

If you go back a page, you'll see my reconsidered viewpoint on 4k - yes, there'll be some benefits, but it's not something that really excites me.

At some point, you'll simply be in the situation where there are no longer any perceivable benefits - and with 4k panels we're nearing that border in practical situations - and then it'd just be a real waste of money and silicon.

Of course the tech industry doesn't quite work this way, but it'd be nice to see them spending the R&D money on things that have a greater effect on the end-user experience.

It's why I keep trumpeting the VR revolution - I can see few things making a bigger impact on the end-user experience, and it's also the ultimate convergent destination of all the various computing technologies developed over the last century.

VR is the endgame for me too, but the small, light, pixel-dense, fast displays needed won't get built and perfected if the industry stops pushing the tech. We need to keep pushing. A 4K display has four times the pixels of a 1080p one. More 4K displays means economies of scale start having an effect. It means fab processes evolve and improve. This will inevitably make the displays needed for good VR even better.

Maybe we get to a point where you can have an HMD with two 4K screens instead of 480p ones or whatever the Oculus has.
 

Zaptruder

Banned
VR is the endgame for me too, but the small, light, pixel-dense, fast displays needed won't get built and perfected if the industry stops pushing the tech. We need to keep pushing. A 4K display has four times the pixels of a 1080p one. More 4K displays means economies of scale start having an effect. It means fab processes evolve and improve. This will inevitably make the displays needed for good VR even better.

Maybe we get to a point where you can have an HMD with two 4K screens instead of 480p ones or whatever the Oculus has.

I certainly made that concession in my posts - recognizing that ultimately, even if 4K display tech is kind of neither here nor there, the advancement is both necessary and helpful for the endgame tech.

I guess the flat-panel industry is just running out of ideas for how to keep pushing and marketing the tech. We hit commodity pricing on large 1080p flat panels so quickly (relative to the previous standard-bearers). Even 3D was only premium for 12-18 months before it got commoditized as well.
 

Tarin02543

Member
When I buy a 4k television it will come as an afterthought. I'll be unboxing at home and say "oh look, it's also 4k. that's nice."
 
On the contrary many arguments and counterarguments are made in a fashion of debate, but an actual, solid answer to those questions have not been made yet.

1) 4k content can get down sampled to 1080p, increasing the image quality of the content by 20%.

2) 4k content on a 1080p screen can allow one to zoom in without reducing the image quality.

These are two things possible with the new codecs and formats that will include 4k.
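Point 2 can be sketched directly: a UHD frame contains exactly four 1080p-sized windows, so a 2x "zoom" on a 1080p display is a crop, not an upscale. A toy grid stands in for the frame (names mine):

```python
def crop(frame, x, y, width, height):
    """Cut a width x height window out of a frame (list of pixel rows)."""
    return [row[x:x + width] for row in frame[y:y + height]]

# Scaled-down stand-in: an 8x4 grid plays the role of the 4K frame and
# a 4x2 window plays the 1080p viewport. Each "pixel" records its own
# source coordinates, so we can verify that every pixel shown after the
# zoom is a real source pixel rather than an interpolated one.
frame = [[(x, y) for x in range(8)] for y in range(4)]
viewport = crop(frame, 2, 1, 4, 2)
```

With real resolutions the same crop of 1920x1080 pixels out of a 3840x2160 frame fills a 1080p screen one-to-one.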
 

Woo-Fu

Banned
Are people honestly still arguing over 720p vs 1080p? Is it 2006?

The original argument wasn't even 720p vs. 1080p, it was 720p vs. 1080i. ;)

My stance on 4k is that more resolution is usually good in and of itself but you can't ignore every other factor involved when deciding what is better from a practical standpoint.
 

sinseers

Member
If you have a camera, take a 2MP picture then an 8MP picture, print them both out at the same size, and that should give you a good idea of how much more detail you get from 4K compared to 1080P.

I think what should be asked is "Will the average consumer be able to tell the difference? And if so, is it enough of a difference to justify the asking price?" It will always boil down to price. Let's stop acting like that will ever change.
 

onQ123

Member
I think what should be asked is "Will the average consumer be able to tell the difference? And if so, is it enough of a difference to justify the asking price?" It will always boil down to price. Let's stop acting like that will ever change.

Put it this way: the difference between 1080P & 4K is bigger than going from playing your PS3/Xbox 360 on an SD TV to playing it on a 720P HDTV/monitor. Remember how it still looked good to you on an SDTV until you hooked it up to an HDTV, and now you can't go back to SDTV because it just looks funny to you?

720P from SDTV is about 3X the resolution

1080P from SDTV is a little less than 7X the resolution

4K from 720P is 9X the resolution

& 4K from 1080P is 4X the resolution

It's a really big difference, bigger than the jump from SDTV to 720P HDTV.

& the jump from 720P to 4K is going to be bigger than the jump from SDTV to 1080P HDTV.

& yes the price is going to have to come down a lot for it to even matter to people, but chances are prices will come down a lot in the next few years once these panels go into mass production.
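The multipliers in this exchange are easy to recompute from the pixel counts (using 640x480 for SD; broadcast SD formats differ slightly):

```python
PIXELS = {
    "SD": 640 * 480,        # 307,200 (VGA; broadcast SD differs slightly)
    "720p": 1280 * 720,     # 921,600
    "1080p": 1920 * 1080,   # 2,073,600
    "4K UHD": 3840 * 2160,  # 8,294,400
}

def jump(src, dst):
    """Pixel-count multiplier when moving between formats."""
    return PIXELS[dst] / PIXELS[src]

# SD -> 720p is 3x, SD -> 1080p is 6.75x, 1080p -> 4K is 4x,
# and 720p -> 4K is a full 9x.
```

So the 1080p-to-4K jump (4x) is indeed larger than the SD-to-720p jump (3x), and 720p-to-4K (9x) beats SD-to-1080p (6.75x).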

1080p @ 60fps >> 4k @ <30fps

But that might just be me.

how about 4K @ 30FPS & 1080P @ 60FPS >>> 1080P @ 60fps
 
You've got a lot of "fake" data in 4K footage to. The RED for example uses pretty heavy lossy compression when shooting at 4K.



4096 x 2160 = 8 847 360 pixels
1920 x 1080 = 2 073 600 pixels

8 847 360 / 2 073 600 = 4,26666667

REDCODE RAW lossy compression is visually indistinguishable from uncompressed capture since the update to the MX sensor.
 

BKK

Member
Actual games won't be rendered at 4K. Lower your expectations and save yourself the disappointment.

From the OP:

get a 4K Quad-FHD screen from Toshiba and connect a powerful PC with a 4K-capable latest/fastest GPU such as the ATI 7970 or Nvidia 680; you can then play many of the latest big high-end games, rendering the full 3840x2160 at 30fps
 