
EDGE: "How critical and consumer feedback made Xbox One a contender again"

Maybe for the knee-jerkers, but my reading comprehension seems more than capable.

What do you think the average reader would take away from the DR3 and Infamous comparison?

Do you really think they won't read it as if Edge is saying Dead Rising looks to be the better game generally? Context is just as important, and to the average person it certainly reads as if Edge is trying to sell DR3 as the better-looking game right now.

But when, all of a sudden, they find an updated, better-informed middle ground to walk, one that takes into account the changes MS has made since, out come the accusations of shilling for MS, thrown by people who, judging by their post histories alone, are quite obviously in the tank for one side themselves.

This is an informed middle ground?

Letting MS sell the upclock of the CPU while discounting the obvious power advantage of the PS4's GPU?

Letting MS continue the FUD campaign about ESram?

Making what look like factual statements about comparisons between the games instead of identifying them as opinion?

I mean, they host Digital Foundry, a feature factory that thrives off the hits it generates for the console-warrior crowd. They know their readership. Shit, GAF must be the single-largest referrer to their site, with big-ass threads that attract views in the neighborhood of six figures.

Eurogamer hosts Digital Foundry, not Edge.
 
Dead Rising 3 looks like the same moribund gameplay that would definitely be a huge flop if it were released on current-gen, instead of pretending to be a next-gen exclusive tentpole.

Infamous SS actually looks like the first interesting next-gen title coming out.

What a genuinely bizarre statement to make in comparing the two.
 
Microsoft doesn't exist in a vacuum. The "third party" you're thinking of is all the retailers with their own motivations, all the publishers with their own motivations, all the other consoles and games that exist (PS4/PC/iOS/Wii U/3DS/Vita/Xbox 360/PS3/etc.), and all the other forms of entertainment people could spend their money on.

This idea that Microsoft could magically control all of these entities, and get away with, I dunno, making every game $70-$80 for all time, doesn't really make any sense.

The third party issue I was talking about was strictly reselling. Not first sales.

As for the first-party stores, MS and Sony ABSOLUTELY control pricing to a fair degree. Promotional sales typically need to be slotted (on both stores), and we already have more than a few stories about MS indirectly controlling the pricing of XBLA games and requiring (as part of contracts) parity on "other digital stores". Otherwise, isn't it kind of funny that every game hitting both PSN and XBLA is $14.99, but tons of games hitting JUST PSN arrive at $9.99? Terraria being the most blatant and recent example ($9.99 on PC, $4.99 on iPad, $14.99 on consoles OH BECAUSE IT'S ON XBLA).

You are right, though, that under MS' original DRM scheme there is nothing stopping retailers from bomba'ing disc titles at retail. But that was the ONLY thing unchanged by what MS was proposing.
 
Show me where you get this info about the PS4 CPU, I can't find it anywhere.

That's because the specs that Sony has published thus far have omitted any reference to a clock speed for the CPU. Based on the performance numbers of the system overall, though, several hardware GAFfers have extrapolated that the CPU speed is likely at the stock value.

Based on what we know right now, the CPU in the XB1 is clocked 10% higher than the one in the PS4.

Which doesn't amount to a hill of beans. I'm not saying it does.
 
Crazy bandwidth? I thought this was debunked already; you can't just add the numbers. Because if you did, the 360 would have more bandwidth than the Xbox One, according to Major Nelson.

They are probably referring to the 204 GB/s max bandwidth of the ESRAM, which is a bit disingenuous. First, it's only 32 MB of memory. Second, it's supposedly 109 GB/s write max and 109 GB/s read max happening simultaneously, not 204 GB/s in one particular direction whenever you want it, like you can do with the GDDR5 memory in PS4. And lastly, the overall effective throughput of the memory subsystem is significantly constrained by the effective bandwidth of the larger 8 GB pool of DDR3 memory, which runs at 68 GB/s.

So, running through some typical example scenarios at theoretical maximums for throughput at the GPU (remember, this is my best guess with the incomplete information that we have):

PS4 (assuming 26 GB/s from GDDR5 for CPU):

-GDDR5 Exclusive Read: 150 GB/s
-GDDR5 Exclusive Write: 150 GB/s
-GDDR5 Combo Read/Write: 85 GB/s Read & 65 GB/s Write

X1 (assuming 30 GB/s from DDR3 for CPU; also, assume a range of 0.5 to 0.667 inherent efficiency for the ESRAM since it is initially and likely intermittently thereafter dependent on being fed from the larger pool and is thus limited by the larger pool's throughput)

-Exclusive Read: (ESRAM @ 109 GB/s * 0.667) + DDR3 @ 38 GB/s = ~111 GB/s
-Exclusive Write: (ESRAM @ 109 GB/s * 0.667) + DDR3 @ 38 GB/s = ~111 GB/s
-Combo Read/Write:
(ESRAM @ 109 GB/s * 0.5) + DDR3 @ 19 GB/s = ~74 GB/s Read &
(ESRAM @ 109 GB/s * 0.5) + DDR3 @ 19 GB/s = ~74 GB/s Write

These numbers seem plausible, since MS themselves told Eurogamer's Digital Foundry that they were getting real-world effective throughput at the GPU of 140 to 150 GB/s combined read/write (in my example above, ~74 GB/s read + ~74 GB/s write = ~148 GB/s).

So, at the end of the day, PS4 has significantly better unidirectional numbers for effective throughput at the GPU. Both have similar effective throughputs at the GPU for combo read/write, with PS4 averaging a better effective Read throughput and X1 averaging a better effective Write throughput. For combo Read/Write in particular, this sounds a lot like what was said in this recent Edge article.
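As a sanity check, the arithmetic behind those estimates can be reproduced in a few lines. This is a rough sketch using the assumed figures from the post above, which are guesses, not official specs:

```python
# Reproducing the back-of-envelope X1 throughput figures above.
# Every number here is the poster's assumption, not an official spec.

ESRAM_PEAK = 109.0  # GB/s, one direction
DDR3_TOTAL = 68.0   # GB/s, shared DDR3 pool
CPU_SHARE = 30.0    # GB/s assumed reserved for the CPU
DDR3_GPU = DDR3_TOTAL - CPU_SHARE  # 38 GB/s left for the GPU

# Exclusive read (or write): assumed 0.667 ESRAM efficiency
exclusive = ESRAM_PEAK * 0.667 + DDR3_GPU
print(round(exclusive))          # ~111 GB/s

# Combined read/write: assumed 0.5 efficiency, DDR3 remainder split in half
per_direction = ESRAM_PEAK * 0.5 + DDR3_GPU / 2
print(round(per_direction))      # ~74 GB/s each way
print(round(per_direction * 2))  # ~147 GB/s total, near MS's 140-150 figure
```

The point of writing it out is just that the ~148 GB/s combined figure is entirely a product of the assumed 0.5 efficiency and 30 GB/s CPU reservation; change either assumption and the total moves with it.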
 
Xbox One now has more CPU power than PS4, while PS4 retains the advantage in GPU speed. “They maybe have a little more GPU,” says Lobb. “We have eSRAM [embedded memory] and crazy bandwidth to that eSRAM. Which is going to be better in the long run from a developer [perspective]?”

Granted, I'm not a game developer, but I AM a software developer and have been for 15 years now. I think that qualifies me to say that the statement above isn't just stretching the truth; it's a flat-out lie.

"But we got clear feedback that some of the things we were proposing were perhaps a little too far into the future"

Not even the slightest admission that their original plans were anti-consumer. But it IS an admission that they're going to try that crap again in the future (as if anyone with a brain didn't realize that already).

I always chuckle when the people who say they don't want a camera in their living room brought to them by a company who caved to the NSA at the expense of individual privacy are called "paranoid". Not sure if people have been paying attention, but the "paranoid" people have a pretty good track record of being right lately.

All we have are the choices we make. As I get older, it becomes more and more important to me to both carefully consider and feel good about the choices I make. I'm not going to support a company that harbors contempt for its consumers. Especially when I have a good gaming PC and a PS4 on the way.

I have two Xbox 360's so I'm not some Sony Fanboy saying this. But seriously, F**K Microsoft.

I will never buy an Xbone. Never.
 
-Exclusive Read: (ESRAM @ 109 GB/s * 0.667) + DDR3 @ 38 GB/s = ~111 GB/s
-Exclusive Write: (ESRAM @ 109 GB/s * 0.667) + DDR3 @ 38 GB/s = ~111 GB/s
-Combo Read/Write:
(ESRAM @ 109 GB/s * 0.5) + DDR3 @ 19 GB/s = ~74 GB/s Read &
(ESRAM @ 109 GB/s * 0.5) + DDR3 @ 19 GB/s = ~74 GB/s Write

umm... you can't just add the eSRAM and DDR3 bandwidths together.. that's not at all how it works.
 
That article had me laughing. The game comparisons, saying "second-tier" games on Xbox One look better than anything on PS4, just lol. Also, MS PR is still on about that "we listened to your voices and that's why we changed our policies" shit. They didn't give a flying fuck what anyone had to say if it wasn't "bear my children, please!" levels of worship, and they were constantly telling those who didn't approve of what they were originally doing to piss off. They didn't care what people had to say; they just couldn't stand seeing the pre-order comparison numbers.
 
I'm genuinely confused by the comparison between DR3 and Infamous, seeing as how we've seen very little of Infamous.

But then again, is Edge really saying that Titanfall is the most next gen game out there because it uses the power of the cloudz?
 
That's because the specs that Sony has published thus far have omitted any reference to a clock speed for the CPU. Based on the performance numbers of the system overall, though, several hardware GAFfers have extrapolated that the CPU speed is likely at the stock value.

Based on what we know right now, the CPU in the XB1 is clocked 10% higher than the one in the PS4.

Which doesn't amount to a hill of beans. I'm not saying it does.

Sounds like speculation, not facts. So since we don't know, we don't know.
 
I'm genuinely confused by the comparison between DR3 and Infamous, seeing as how we've seen very little of Infamous.

But then again, is Edge really saying that Titanfall is the most next gen game out there because it uses the power of the cloudz?

What's funnier is that it's running on the Source engine, is it not?

I highly doubt the graphics are going to be overly impressive. The gameplay looks to be good, but I seriously doubt there's anything here that couldn't be done on the X360... wait, it is being done on the X360.
 
They are probably referring to the 204 GB/s max bandwidth of the ESRAM, which is a bit disingenuous. First, it's only 32 MB of memory. Second, it's supposedly 109 GB/s write max and 109 GB/s read max happening simultaneously, not 204 GB/s in one particular direction whenever you want it, like you can do with the GDDR5 memory in PS4. And lastly, the overall effective throughput of the memory subsystem is significantly constrained by the effective bandwidth of the larger 8 GB pool of DDR3 memory, which runs at 68 GB/s.

So, running through some typical example scenarios at theoretical maximums for throughput at the GPU (remember, this is my best guess with the incomplete information that we have):

PS4 (assuming 26 GB/s from GDDR5 for CPU):

-GDDR5 Exclusive Read: 150 GB/s
-GDDR5 Exclusive Write: 150 GB/s
-GDDR5 Combo Read/Write: 85 GB/s Read & 65 GB/s Write

X1 (assuming 30 GB/s from DDR3 for CPU; also, assume a range of 0.5 to 0.667 inherent efficiency for the ESRAM since it is initially and likely intermittently thereafter dependent on being fed from the larger pool and is thus limited by the larger pool's throughput)

-Exclusive Read: (ESRAM @ 109 GB/s * 0.667) + DDR3 @ 38 GB/s = ~111 GB/s
-Exclusive Write: (ESRAM @ 109 GB/s * 0.667) + DDR3 @ 38 GB/s = ~111 GB/s
-Combo Read/Write:
(ESRAM @ 109 GB/s * 0.5) + DDR3 @ 19 GB/s = ~74 GB/s Read &
(ESRAM @ 109 GB/s * 0.5) + DDR3 @ 19 GB/s = ~74 GB/s Write

These numbers are likely, since MS themselves told Eurogamer's Digital Foundry that they were getting real-world effective throughput at the GPU of 140 to 150 GB/s combo read/write (in my example above, ~74 GB/s Read + ~74 GB/s Write = ~148 GB/s).

So, at the end of the day, PS4 has significantly better unidirectional numbers for effective throughput at the GPU. Both have similar effective throughputs at the GPU for combo read/write, with PS4 averaging a better effective Read throughput and X1 averaging a better effective Write throughput. For combo Read/Write in particular, this sounds a lot like what was said in this recent Edge article.

Here's what the MS guy said:

The same discussion with ESRAM as well - the 204GB/s number that was presented at Hot Chips is taking known limitations of the logic around the ESRAM into account. You can't sustain writes for absolutely every single cycle. The writes is known to insert a bubble [a dead cycle] occasionally... One out of every eight cycles is a bubble, so that's how you get the combined 204GB/s as the raw peak that we can really achieve over the ESRAM. And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM. That's real code running. That's not some diagnostic or some simulation case or something like that. That is real code that is running at that bandwidth. You can add that to the external memory and say that that probably achieves in similar conditions 50-55GB/s and add those two together you're getting in the order of 200GB/s across the main memory and internally.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
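For what it's worth, the 204 GB/s "raw peak" in that quote falls straight out of the one-bubble-in-eight figure he gives: reads every cycle, plus writes on seven of every eight cycles, against the 109 GB/s per-direction peak. A quick check of that reading of the quote:

```python
# Deriving the 204 GB/s combined raw peak described in the quote above:
# reads run every cycle; writes lose one cycle (a "bubble") in eight.
READ_PEAK = 109.0           # GB/s, reads, no bubbles
WRITE_PEAK = 109.0 * 7 / 8  # GB/s, writes at 7/8 duty -> 95.375
COMBINED = READ_PEAK + WRITE_PEAK
print(round(COMBINED, 1))   # 204.4 GB/s, matching the Hot Chips figure
```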
 
Here, right now, Microsoft has an edge. The £80 price disparity between the two consoles will be a decision maker for many, but for others, Microsoft’s exclusives are more convincing than anything Sony has shown on PS4 to date. Killzone is no Halo; DriveClub, now delayed, never looked like measuring up to Forza; and Dead Rising 3 now looks like a smarter bet for open-world mayhem than Second Son.

Even a second-tier title like Ryse makes a stronger case for its host hardware’s graphical capabilities, at least, than anything set for PS4’s launch day

oooh. That's gonna ruffle a few feathers.
 
I resubscribed when they said PS4 was the best thing ever but now they're saying Xbone stands a chance I'm cancelling it ASAP.
 
I don't get how shit like this gets approved on some of these big gaming sites. It doesn't take much reading between the lines to see what the article really is.

Too many spun narratives on both sides, all across the industry, make it seem so far behind any other industry of this size.
 
umm... you can't just add the eSRAM and DDR3 bandwidths together.. that's not at all how it works.

I wasn't "just" adding them. If you look at the X1 numbers, you'll see that I've assumed an inherent efficiency of just 50 to 67 percent (not even factoring in other real-world system constraints) due to being bottlenecked by the larger, slower pool of DDR3 RAM which is periodically feeding it.
 
 
Microsoft's comment about its plans being "too far into the future" is mind-boggling and a bit irritating. That plan only works if, in the future, consumers are mindless drones who have no issue with bending over and taking it from Microsoft as they see fit.
 
Dead Rising 3 now looks like a smarter bet for open-world mayhem than Second Son.

When I read this in the actual magazine a few weeks ago, I assumed the next paragraph would go into some detail supporting this claim.

It did not.
 
Putting on my serious hat, though, I still have literally no idea what people see in inFamous.

Eh, from someone who actually really liked the first two (and from what I've seen on GAF, it seems like I'm the only one): I like open-world games with superpowers.

I'll have to play Saints Row, but it's the combination of open world, parkour, and superpowers that I like best.

Crackdown fits the same mold, and it had better gameplay for the most part, but no story to speak of and literally no characters.

I'd be super excited for a Crackdown 3, to tell the truth.
 
I wasn't "just" adding them. If you look at the X1 numbers, you'll see that I've assumed an inherent efficiency of just 50 to 67 percent (not even factoring in other real-world system constraints) due to being bottlenecked by the larger, slower pool of DDR3 ram which is periodically feeding it.

right.. you assumed an inherent efficiency of the throughput on the eSRAM and then ADDED that throughput to the DDR3 throughput.

you can't do that.

apples + oranges != Appanges

eSRAM bandwidth is the bandwidth when reading/writing to the eSRAM
DDR3 bandwidth is the bandwidth when reading/writing to the DDR3 pool

It's not a unified pool, doesn't act like a unified pool, can't be programmed to act like a unified pool, and thus its numbers (in this case throughput) can't magically be presented as a unified pool's.

It's as silly/incorrect as saying the XBONE has 8.032 GB of RAM.

edit - CORRECT would be to say "when reading/writing to system memory, the PS4 is around twice as fast. When reading/writing to the 32 MB of eSRAM, the XBONE is about as fast as the PS4 is to its main pool." Which of course is true. The problem is that BECAUSE the eSRAM isn't a unified pool, devs are having (according to them) a somewhat frustrating time effectively utilizing it to make up for the main pool's throughput deficiency.
 
I don't get how shit like this gets approved on some of these big gaming sites. It doesn't take much reading between the lines to see what the article really is.

Too many spun narratives on both sides, all across the industry, make it seem so far behind any other industry of this size.

When clickbait stops working, the "big gaming sites" will stop using clickbait to pad advertising numbers.

Until then, they'll keep making stupid arguments with little to no supporting logic, just to fan the flames and seem like more legitimate internet waypoints than they really are.
 
But when, all of a sudden, they find an updated, better informed middle ground to walk that takes into account the changes MS has made since...
Really? That's meant to refer to this article? I certainly think there are things MS has done to improve the XBO outlook since May, but this particular article isn't presenting that stance in a "better informed" way at all. It's mostly relying on tearing down the competition to make MS look better. That's not "better informed", that's just sensationalized editorial.
 
This is an informed middle ground?
Yes, updated with newer information and not necessarily coming into their piece with a conclusion to start, as they did with their 'this is your next console' article and cover.

Letting MS sell the upclock of the CPU while discounting the obvious power advantage of the PS4's GPU?
Edge is discounting the GPU spec angle? They aren't playing it down. They let someone from MS have a bite with the CPU upclock and a mildly dismissive take on the difference between the units on GPU. It's not Edge's place to say any more, because they already acknowledge the spec war is won by Sony's machine.

Letting MS continue the FUD campaign about ESram?
What FUD? All I ever read about on forums, including this one, is how it and MS' choice of memory subsystem is inferior to the simplicity of Sony's choice of a single access point to memory. Yet there's so-far unsubstantiated and simply echoed negativity and doubt coming from hundreds of posters rather than from MS themselves. They let the Gaijinworks guy's quote, as a professional opinion from someone with authority on the matter, further establish the difference.

Making what look like factual statements about comparisons between the games instead of identifying them as opinion?
It's obviously their opinion. It doesn't need to be further clarified, any more than it did when Edge went out on a limb to squat on MS' face with their cover story months back.


Eurogamer hosts Digital Foundry, not Edge.
Ah, I stand corrected. Mildly embarrassed about mixing them up.
 
I'm genuinely confused by the comparison between DR3 and Infamous, seeing as how we've seen very little of Infamous.

But then again, is Edge really saying that Titanfall is the most next gen game out there because it uses the power of the cloudz?

Well how can we forget the infinite POWAH of the cloudz? Or its 5 billion transistorz. Its a new era I say! A NEW EPOCH!
 
Putting on my serious hat, though, I still have literally no idea what people see in inFamous.

You ever play Crackdown? inFamous is like Crackdown where the guns and vehicles have been replaced with super powers and faster overall gameplay. With inFamous 2 they also built the most interesting open world city outside of the GTA franchise.
 
What I figured was ~148 GB/s at the GPU + 30 GB/s at the CPU = ~178 GB/s of system-wide throughput, which is probably closer to the truth. It's a good number nonetheless and is essentially the same as PS4's system-wide throughput.

Yes. Combined bandwidth is quite good. It's just going to be a bit trickier for Xbox One devs to keep all the pipelines fed relative to PS4 development.
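Plugging in the same assumed figures from earlier in the thread (guesses, not official numbers), the system-wide totals do land very close together:

```python
# System-wide totals under the thread's assumed figures (not official specs).
x1_total = 74 * 2 + 30      # ~148 GB/s combo read/write at the GPU + 30 GB/s at the CPU
ps4_total = 150 + 26        # ~150 GB/s GPU share + 26 GB/s assumed CPU share
print(x1_total, ps4_total)  # 178 176
```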
 
Is that really "lol"-worthy? Based on the latest footage we've seen from launch games on both sides, I don't completely disagree with him on that one.

I personally think that probably nothing will touch Shadow Fall for the first year... that is, until the first-party and big third-party games come in. Calling it! Other games will look amazing, though.
 
lol, I see MS is trying to get Edge back on their side. Yeah, there's some horseshit in that article, but one of the quotes is from Ken Lobb, not an unbiased source, so naturally he's going to spin for Microsoft.


End of the day, let this be a lesson to those people who claim bitching never does anything. If there hadn't been such a huge fuss, we would still be looking at a DRM machine with a mandatory camera. The internet backlash was immense, and word of mouth certainly helped to harm those early pre-orders. The fanboys would have just had you bend over and take it; they basically get a better machine due to the fact that we dared to call MS out on their shit. I'm glad that GAF actually supported the rantings and petitions.
 
Going by what we've seen, I don't see where the scepticism comes from regarding DR3 and I:SS. We've seen more of DR3, and judging from what we've seen, objectively speaking, it is much more impressive than anything I:SS has shown. Granted, I:SS is a 2014 title and still in development, but even if you go by information alone, DR3 still pulls ahead with its open-world mayhem.

Also, I find it odd how some people are oh-so-sceptical of EDGE when they're actually citing credible sources and giving names, when the majority here were seemingly a-okay accepting that these "anonymous developers" were all of the opinion that the PS4 would trounce the XB1.

EDGE plays a nasty game; they're not partisan, they just write whatever gets them clicks. But if we're going to take them seriously, then people shouldn't cherry-pick which articles they find credible purely to suit their preference. It's unbecoming. Take a spoonful of salt if need be, but don't pretend this somehow makes no sense if you're nodding along when they say equally outrageous stuff in line with your preference. Again, it's unbecoming.
 
The third party issue I was talking about was strictly reselling. Not first sales.

As for the first-party stores, MS and Sony ABSOLUTELY control pricing to a fair degree. Promotional sales typically need to be slotted (on both stores), and we already have more than a few stories about MS indirectly controlling the pricing of XBLA games and requiring (as part of contracts) parity on "other digital stores". Otherwise, isn't it kind of funny that every game hitting both PSN and XBLA is $14.99, but tons of games hitting JUST PSN arrive at $9.99? Terraria being the most blatant and recent example ($9.99 on PC, $4.99 on iPad, $14.99 on consoles OH BECAUSE IT'S ON XBLA).

The very nature of being a platform holder by definition means you have some control over those things, so I definitely agree with you there.

But I don't see people saying that consoles are inherently "anti-consumer" though (well, some PC diehards might say that, heh). Especially when we're in an environment with 3 console companies all competing with each other (so it's not like the old NES days where Nintendo almost literally did have a monopoly).

You are right though in that under MS' original DRM scheme, there is nothing stopping retailers from bomba'ing disc titles at retail.. But that was the ONLY thing unchanged by what MS was proposing.

I get that. But retailers "bombaing" disc titles is still a pretty big deal! I'm not saying there would be no changes. I just feel that some of the "MS would have had a monopoly!" talk is a bit overblown (at least, not drastically more of a monopoly than being a console manufacturer already allows you to be). Hell, the very fact that MS changed their plans due to customer/preorder pressure, and the fact that so many people want to buy a PS4 now is direct evidence that we're not the helpless consumers we're sometimes painted as, lol.

On a separate note, it would be nice if the "no DRM" movement was as popular for digital games as it was for disc-based console games. It seems like it'd be more inclusive, at least. Why can't I freely transfer licenses digitally, the same way I can "freely transfer" a disc? Let me log in, email my friend, and it deactivates the purchase on my hardware and account and activates it on his. If only one person can ever have an active license at once, it's conceptually the same as passing a disc around.
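Conceptually, the scheme being described is just a single atomic owner swap on the platform holder's side. A toy sketch, with all names hypothetical and purely illustrative:

```python
# Minimal sketch of disc-style digital license transfer: exactly one
# account holds a playable license at any time, and a transfer moves
# it in a single step. All names here are hypothetical.

class LicenseServer:
    def __init__(self):
        self.owner_of = {}  # game title -> account currently holding the license

    def grant(self, game, account):
        self.owner_of[game] = account

    def transfer(self, game, sender, receiver):
        # The sender loses access in the same step the receiver gains it,
        # so at most one account can ever play -- the digital equivalent
        # of handing someone a disc.
        if self.owner_of.get(game) != sender:
            raise PermissionError("sender does not hold this license")
        self.owner_of[game] = receiver

srv = LicenseServer()
srv.grant("Terraria", "alice")
srv.transfer("Terraria", "alice", "bob")
print(srv.owner_of["Terraria"])  # bob; alice can no longer transfer it
```

The single-dictionary update is what enforces the "only one active license at once" property the post is asking for; a second transfer attempt from the original owner fails.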
 
Really? That's meant to refer to this article? I certainly think there are things MS has done to improve the XBO outlook since May, but this particular article isn't presenting that stance in a "better informed" way at all. It's mostly relying on tearing down the competition to make MS look better. That's not "better informed", that's just sensationalized editorial.
I'm not reading any sense of 'tearing down' the PS4 in any way at all. Maybe you can explain further.
 
Yes. Combined bandwidth is quite good. It's just going to be a bit trickier for Xbox One devs to keep all the pipelines fed relative to PS4 development.

There is no such thing as combined bandwidth. They are two separate stacks of memory. And yes, utilizing the eSRAM to make up for the slower DDR3 is an obstacle for the devs. While they have GDDR5 to work with on PC GPUs and the PS4, they have a DDR3 unified pool on the XBONE with an eSRAM workaround that is relatively complicated compared to what they have on the other two. It IS better than nothing, but typically when multi-plat devs are faced with "more complicated", the results generally end up reflecting that (see the first many years of the PS3, and all years of the Sega Saturn, for more examples).
 
Eh, from someone who actually really liked the first two (and from what I've seen on GAF, it seems like I'm the only one): I like open-world games with superpowers.

I'll have to play Saints Row, but it's the combination of open world, parkour, and superpowers that I like best.

Crackdown fits the same mold, and it had better gameplay for the most part, but no story to speak of and literally no characters.

I'd be super excited for a Crackdown 3, to tell the truth.

You might not be saying that once you play it... I feel this game took the 'zaniness' a bit too far. The game occasionally had too many things going on, and getting stuff for power-ups was just a chore. Plus, I loved the idiotic vehicles, and this game made all vehicles extremely redundant. SR3 had a decent niche carved out for itself, away from the likes of GTA, but SR4 imo is just not as good as SR3.
 
right.. you assumed an inherent efficiency of the throughput on the eSRAM and then ADDED that throughput to the DDR3 throughput.

you can't do that.

apples + oranges != Appanges

eSRAM bandwidth is the bandwidth when reading/writing to the eSRAM
DDR3 bandwidth is the bandwidth when reading/writing to the DDR3 pool

It's not a unified pool, doesn't act like a unified pool, can't be programmed to act like a unified pool, and thus its numbers (in this case throughput) can't magically be presented as a unified pool's.

It's as silly/incorrect as saying the XBONE has 8.032GB of RAM.

I was only assuming throughput at the GPU (i.e., the rates at which the GPU is reading and writing); I know that aside from the actual throughput, there is other system overhead to factor in, like the cycle time it takes to copy data back and forth between memory pools. Since PS4 has more of a true hUMA-like memory architecture, it won't suffer from some of the copy-overhead inefficiencies that the X1 suffers from (i.e., you effectively need less bandwidth at the end of the day if you don't always have to copy data back and forth between the two pools). This puts PS4 at a significant advantage, both in terms of ease of development and in overall memory subsystem performance at the end of the day.
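The copy-overhead point can be made concrete with a toy model (the figures below are illustrative, not measured): shuttling a buffer between two pools consumes bandwidth on both of them, which a unified pool never has to spend.

```python
# Toy model of copy overhead between two memory pools. Moving a buffer
# burns bandwidth on the pool being read AND the pool being written;
# a unified pool pays nothing. Numbers are illustrative only.

def copy_cost_gbps(buffer_mb, copies_per_frame, fps=30):
    """Bandwidth consumed per pool just by shuttling a buffer around."""
    bytes_per_sec = buffer_mb * 1e6 * copies_per_frame * fps
    return bytes_per_sec / 1e9  # GB/s read from one pool, written to the other

# e.g. bouncing a 32 MB render target twice per frame at 30 fps:
print(copy_cost_gbps(32, 2))  # 1.92 GB/s per direction, pure overhead
```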
 
Putting on my serious hat, though, I still have literally no idea what people see in inFamous.
I could say the same of Dead Rising but I appreciate that others have different tastes than me. Actually, I wouldn't go so far as to say I have "literally no idea" because I can see some appeal and don't feel the need to downplay that.
 
There is no such thing as combined bandwidth. They are two separate stacks of memory.

Which can both be accessed at the same time, so yes, it's fair to say you can combine bandwidth.
 