
Red Dead Redemption 2 Is Having A Rough Launch On PC

Graciaus

Member
You missed the word 'can', and yes, there are occasions where it can take hours. I like to point these things out because I wish someone would have given me straight talk before I bought my overpriced gaming hardware.
I have never played a game that had hours of troubleshooting. Type your issue into Google. You'll find your solution very quickly, since someone else has probably had the same one. Whether you understand how to actually implement what's being said is a different matter. Setting up mods or emulators is different. What looks complicated usually isn't once you know what to do.
I ask myself how many Witcher 3 PC players never changed their settings once crossing the Pontar river, i.e. the forest over there is lush & kills the framerate nearly ten hours into the game. You can play for an hour on one setting & then realize "shit, I need to change here because the stutter sucks". I ask myself how many Assassin's Creed Odyssey players didn't have to change their settings & resolution once they reached the first city, or lit their torch at night? (In that game the benchmark is seriously flawed.) Or how many Fallout players realized Fallout 4 is a stutter fest without locking the framerate in the user settings? These are rhetorical questions FYI, i.e. don't spread bullshit about click & play on PC, because even stuff like RivaTuner Statistics Server is often required to get a smooth framerate beyond what the Nvidia panel can or cannot do.

I'm also referring to the biggest titles as well, aka the blockbuster AAA games which people want. If you're coming in here attempting to get console players to go pc & tell them "don't worry about settings, these assholes who tell you you'll be tinkering with the graphics & framerates are liars spreading propaganda!", I'll call you out.
For all those examples you gave, all you would do is go into the options and lower the settings, and that's only if you wanted to improve performance because your settings are too high for what your machine can do. The majority of games auto-calibrate to your PC specs and you don't need to do anything. So yeah, click play and check the settings and go, for the majority of games.

Don't spread propaganda.
 

888

Member
Very close:



Quite a difference in textures at points. Character models up close are more detailed and less muddy. Water reflections are massively improved. Fire effects are improved. Frame rates more than doubled depending on hardware. At one point in that video, when Arthur and Dutch are talking, you can see the motion difference between the consoles. I think that input lag and framerate were the biggest reasons I stopped playing it on X1X. Man, they were rough.
 

Esppiral

Member
It's the most impressive game in terms of graphics this gen. I don't understand why people get upset because they can't max out the game on their €1,200 GPU. Does no one remember the '90s or early 2000s?
 

ruvikx

Banned
The majority of games auto-calibrate to your PC specs and you don't need to do anything. So yeah, click play and check the settings and go, for the majority of games.

Don't spread propaganda.

You must be joking. Are you getting paid to spread this sort of shit? Unless someone has cataracts & doesn't notice frame pacing issues, stutter & a whole bunch of nasty stuff which requires fixes, PC gamers need to be always ready to spend a load of time delving deep into the game settings & their own machine's settings. That's a fucking fact. Try click & play on Witcher 3, Assassin's Creed Odyssey, Origins, Evil Within 1 & 2, Resident Evil 2, Fallout 4 (hell, many older games require extra work because they're not compatible with Windows 10). And what's with this weird "ur a propagandist!" attack dog warrior schtick?

Cringe, cringe & more cringe.
 

Kadayi

Banned
Quite a difference in textures at points. Character models up close are more detailed and less muddy. Water reflections are massively improved. Fire effects are improved. Frame rates more than doubled depending on hardware. At one point in that video, when Arthur and Dutch are talking, you can see the motion difference between the consoles. I think that input lag and framerate were the biggest reasons I stopped playing it on X1X. Man, they were rough.

Cool stuff, though I have to say that the PC kind of looks a little too sharp when it comes to things in the distance on some of the larger landscape shots, however, I doubt that will be a concern for me with my modest 1070Ti come December :messenger_grinning_smiling:
 

lefty1117

Gold Member
The main thing I did was go into the Rockstar Launcher and clear my profile. That fixed the startup issues I was having.

Regarding performance, I've settled on DirectX 12 as the renderer. Even though the benchmark gives me a few extra fps with Vulkan, DirectX 12 seems to run more smoothly overall, with a consistently high frame rate.

Benchmark Results:
Minimum FPS: 12.0911
Maximum FPS: 79.5412
Average FPS: 53.8034
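For context on those numbers: FPS converts to per-frame render time as 1000 / fps, which is why the minimum matters so much. A quick sketch (plain arithmetic, using the benchmark figures above):

```python
# Convert the benchmark FPS figures above into per-frame render times.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering one frame at a given FPS."""
    return 1000.0 / fps

for label, fps in [("min", 12.0911), ("avg", 53.8034), ("max", 79.5412)]:
    print(f"{label}: {fps} fps -> {frame_time_ms(fps):.1f} ms/frame")
# min: 12.0911 fps -> 82.7 ms/frame
# avg: 53.8034 fps -> 18.6 ms/frame
# max: 79.5412 fps -> 12.6 ms/frame
```

A dip to ~12 fps means one frame hangs on screen for ~83 ms, which reads as a visible hitch even when the average looks healthy.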

Specs:

i7 5820k
32GB RAM
RTX 2070
Windows 10 w/ SSD

Settings:

2560 x 1440 @ 144hz
VSync Off (I have gsync though probably not relevant here)
Windows Borderless
Texture Quality: Ultra
Anisotropic Filtering: X16
Lighting Quality: Ultra
Global Illumination: Ultra
Shadow Quality: High
Far Shadow Quality: High
Screen Space Ambient Occlusion: High
Reflection Quality: High
Mirror Quality: Ultra
Water Quality: High
Volumetrics Quality: High
Particle Quality: High
Tessellation: High
TAA: Medium
FXAA: Off
MSAA: Off

Graphics API: DirectX 12
Near Volumetric Resolution: High
Far Volumetrics Resolution: High
Volumetric Lighting Quality: High
Unlocked Volumetric Raymarch Resolution: On
Particle Lighting: High
Soft Shadows: Ultra
Grass Shadows: High
Long Shadows: On
Full Resolution Screen Space Ambient Occlusion: On
Water Refraction Quality: High
Water Reflection Quality: High
Water Physics Quality: Appx 75% (bar)
Resolution Scale: Off
TAA Sharpening: 100%
Motion Blur: On
Reflection MSAA: Off
Geometry LOD: 100%
Grass LOD: 75%
Tree Quality: High
Parallax Occlusion Mapping Quality: Ultra
Decal Quality: High
Fur Quality: Medium

Still messing around with settings, but this is where I'm at right now and it seems to be a good combination of great visuals and pretty smooth gameplay.
 

Jayjayhd34

Member
You must be joking. Are you getting paid to spread this sort of shit? Unless someone has cataracts & doesn't notice frame pacing issues, stutter & a whole bunch of nasty stuff which requires fixes, PC gamers need to be always ready to spend a load of time delving deep into the game settings & their own machine's settings. That's a fucking fact. Try click & play on Witcher 3, Assassin's Creed Odyssey, Origins, Evil Within 1 & 2, Resident Evil 2, Fallout 4 (hell, many older games require extra work because they're not compatible with Windows 10). And what's with this weird "ur a propagandist!" attack dog warrior schtick?

Cringe, cringe & more cringe.

That's your experience; that doesn't mean it's like that for everyone.

I've played most of the games you listed and didn't need to spend more than a few seconds to get a perfectly playable experience. As has been stated, most of the time games automatically calibrate to your system.

I can only speak for myself, but if PC gaming were as you say, I wouldn't be gaming on the platform.
 

Lanrutcon

Member
You must be joking. Are you getting paid to spread this sort of shit? Unless someone has cataracts & doesn't notice frame pacing issues, stutter & a whole bunch of nasty stuff which requires fixes, PC gamers need to be always ready to spend a load of time delving deep into the game settings & their own machine's settings. That's a fucking fact. Try click & play on Witcher 3, Assassin's Creed Odyssey, Origins, Evil Within 1 & 2, Resident Evil 2, Fallout 4 (hell, many older games require extra work because they're not compatible with Windows 10). And what's with this weird "ur a propagandist!" attack dog warrior schtick?

Cringe, cringe & more cringe.

Yeah, older games require extra work because they're...older games? That's kind of par for the course as technology changes. Plus, modern consoles are pretty limited in terms of BC and PCs play older console games better than older consoles. But anyway...

You're a propagandist because you seem to use sensationalism and hyperbole to make PC gaming out to be this exercise in computer engineering. A phrase like "need to be always ready" is blatantly false. I'll freely admit some games require tweaking, but most modern games don't. Your measure of how many games require troubleshooting is a bit out of date compared to my own experiences. A decade ago you'd be right on the money, but technology has come a long way.

How often do you game on your PC these days?
 

thelastword

Banned
Rockstar are very smart the way they stagger launches.
Hmmmm!…… 🤔 Are you saying Rockstar may just launch RDR2 with raytracing at 4k 60fps on PS5/Scarlett on November 2019 launch day and say, this is the only way to experience RDR2...…"NOW with raytracing, and for the first time a locked 4k 60fps at ultra settings".....Another staggered launch....... they keep releasing and launching to other/newer platforms with improvements and we keep buying.......Rockstar keeps raking in that dough.....Perhaps it's unoptimized now on PC, so it can be optimized later on PS5......:messenger_grinning:
 

Jayjayhd34

Member
Hmmmm!…… 🤔 Are you saying Rockstar may just launch RDR2 with raytracing at 4k 60fps on PS5/Scarlett on November 2019 launch day and say, this is the only way to experience RDR2...…"NOW with raytracing, and for the first time a locked 4k 60fps at ultra settings".....Another staggered launch....... they keep releasing and launching to other/newer platforms with improvements and we keep buying.......Rockstar keeps raking in that dough.....Perhaps it's unoptimized now on PC, so it can be optimized later on PS5......:messenger_grinning:

Are you dreaming? Not even the best GPU in the world could do raytracing at 4K 60fps.

Current AMD GPUs can barely do 4K at an acceptable fps without raytracing, and people expect them to magic up a card able to do raytracing at 4K 60fps.

People's expectations for next-gen consoles are going to lead to massive disappointment.
 
Yeah, but on average it's 8GB, so there's not much room for improvement. Yes, you can add system RAM, but that's slower. And games are console ports designed to consume no more than 6GB, so besides higher resolutions and a few effects here and there, the games all look similar across the board.
And even worse, games will look even more similar come next gen.
Do I really need to explain how having 8GiB of dedicated VRAM is wildly different than having 8GiB of shared RAM?
They're not comparable at all.
 
P.S. I've seen a video; it needs low settings to achieve 60fps. Some port, huh!
Honestly...who the fuck cares? I sure as shit don't care what some text on a menu says. I care about how the game looks. If it still looks good with some settings on low, and it runs markedly better, then low it is (except textures; textures should always be as high as your VRAM affords you, since they have next to no performance hit anyway).
 

Justin9mm

Member
Because it runs at 4K 30 on a $400 console and in many ways looks almost exactly the same while that is a $1,200 GPU in a $3,000+ system which should computationally outperform it by leaps and bounds.

The end result is you either get a game that looks just like it does on the X at 60 FPS or you get it where it looks mostly the same with some marketable improvements at the same 30 FPS...

yeshrug.png
And because AAA games are for the most part developed and optimised for console, next-gen consoles are gonna close the gap a little. I have a mid-tier gaming rig, a Pro and an X. I hate the troubleshooting and tweaking involved in running a PC game optimally. It's always some sort of hassle. I've pretty much left PC gaming and now I just put up with my 75-inch 4K HDR with my X and Pro. Driving games/FPS shooters are pretty much 60fps, and I can easily put up with 30fps on story-driven games. But PC Master Race says "you fool"; I'm not sure who the fools are, really. The game devs/publishers only care about profit, and console is where they get it from!
 

Kenpachii

Member
That Xbox One X isn't running Red Dead Redemption anywhere near ultra settings; it's probably more a mixture of low/medium settings.

Also, people shitting on PC because a game doesn't run at 4K and 60fps at ultra settings, far beyond what consoles offer, is laughable at best.

But hey, having no visual settings and the one janky preset consoles have is better than having options to customize the experience for yourself. Console gamer logic.

Picking low/mid/high/ultra settings is hard, guys. It's rocket science.

If a game on PC runs on a single card at 60fps, 4K and ultra, that means ultra isn't pushing settings far enough.
 

lukilladog

Member
And you're ignorant of the facts, but you don't see me pointing it out...oh...wait...actually you do. Nevermind. Come back when you have years of experience finding the optimal balance between performance and visuals. Then we'll talk.

Oh, I do. I've made several mods for several games, from texture work to world design and building.
 
Oh, I do. I've made several mods for several games, from texture work to world design and building.
Sure you have. That's why you spent two seconds gawking at the textures instead of eyeballing the scene as a whole and realising that model complexity was similar, that foliage density was similar, that lighting was similar, that shadows were basically console quality, that clouds, while a tad blurry, weren't radically different and knocking them up to medium is likely about as much as they'd actually need, and that about the only things you were really missing out on were AO, which could be enabled for a small hit, and ultra textures, which would incur a 1-2 FPS penalty at most if you actually have the VRAM to hold them.

Like I said. Come back when you have years of experience finding the optimal balance between performance and visuals. I've been eyeballing games like this for a few years now, that's why I actually notice this stuff rather than just going "ermagerd, look, shit textures!!!!!!11!!!"
 

Södy

Member
Running this at 4K @ 80% resolution with everything on medium and textures on high, at 55-60fps on my Ryzen 5 2600 and RTX 2060 Super.

Looks very good and feels so smooth compared to the One X version last year. 4K @ 100% gave me only 35-40fps.
 

lukilladog

Member
Sure you have. That's why you spent two seconds gawking at the textures instead of eyeballing the scene as a whole and realising that model complexity was similar, that foliage density was similar, that lighting was similar, that shadows were basically console quality, that clouds, while a tad blurry, weren't radically different and knocking them up to medium is likely about as much as they'd actually need, and that about the only things you were really missing out on were AO, which could be enabled for a small hit, and ultra textures, which would incur a 1-2 FPS penalty at most if you actually have the VRAM to hold them.

Like I said. Come back when you have years of experience finding the optimal balance between performance and visuals. I've been eyeballing games like this for a few years now, that's why I actually notice this stuff rather than just going "ermagerd, look, shit textures!!!!!!11!!!"

Foliage density is different, fx like god rays are missing, the lighting of the scene is different or cut to a couple of meters, shadows are PS3 era, ambient occlusion is almost missing, shaders are destroyed, fog is lower quality, etc. Higher textures ain't fixing that. And don't say that thing about the years of experience; my first mods were for GTR 2, with physics and AI.
 
Foliage density is different, fx like god rays are missing, the lighting of the scene is different or cut to a couple of meters, shadows are PS3 era, ambient occlusion is almost missing, shaders are destroyed, fog is lower quality, etc. Higher textures ain't fixing that.
  1. I'm sorry. Did I say foliage density was exactly the same? No. I said it was similar. And it is. It's a little thinner, but speaking honestly...do you really think you're going to notice while playing? What about on medium?
  2. Volumetrics like god rays are literally the very first thing you should look at turning down. They are notorious performance hogs. Would I go to off? No. But, I don't know if you know this, low -> ultra isn't a binary toggle. You can also mix and match. You can have low foliage density with ultra volumetrics. Magic, I know right!
  3. Lighting of one scene is drastically different. The rest are similar. Setting that to medium would resolve that instantly. There is no clip in which it is "cut to a couple meters". This isn't a PS1 game where stuff outside a short range is pitch black.
  4. Don't kid yourself on the shadows, we both know the base PS4 and Xbone are running low shadows. Maybe medium at best.
  5. Ambient occlusion is missing, as I pointed out. But again, it's not an all or nothing ordeal.
  6. "shaders are destroyed" - I'm going to assume you're referring to the post FX like DOF, motion blur, bloom etc. Yeah, I can do without all that shit.
  7. Fog is lower quality...but again...outside of a side by side...are you really going to notice that whilst actually playing? Even if you are...medium exists.
Honestly...if you wanted to make a point you picked a pretty shit video to do it. You didn't get water in there, there's no reflections, neither cubemaps nor SSR, no major population centers, nor particularly long view distances.

And don't say that thing about the years of experience; my first mods were for GTR 2, with physics and AI.
Yes. I'm sure modding physics and AI taught you so much about balancing visuals and performance.
 

lukilladog

Member
  1. I'm sorry. Did I say foliage density was exactly the same? No. I said it was similar. And it is. It's a little thinner, but speaking honestly...do you really think you're going to notice while playing? What about on medium?
  2. Volumetrics like god rays are literally the very first thing you should look at turning down. They are notorious performance hogs. Would I go to off? No. But, I don't know if you know this, low -> ultra isn't a binary toggle. You can also mix and match. You can have low foliage density with ultra volumetrics. Magic, I know right!
  3. Lighting of one scene is drastically different. The rest are similar. Setting that to medium would resolve that instantly. There is no clip in which it is "cut to a couple meters". This isn't a PS1 game where stuff outside a short range is pitch black.
  4. Don't kid yourself on the shadows, we both know the base PS4 and Xbone are running low shadows. Maybe medium at best.
  5. Ambient occlusion is missing, as I pointed out. But again, it's not an all or nothing ordeal.
  6. "shaders are destroyed" - I'm going to assume you're referring to the post FX like DOF, motion blur, bloom etc. Yeah, I can do without all that shit.
  7. Fog is lower quality...but again...outside of a side by side...are you really going to notice that whilst actually playing? Even if you are...medium exists.
Honestly...if you wanted to make a point you picked a pretty shit video to do it. You didn't get water in there, there's no reflections, neither cubemaps nor SSR, no major population centers, nor particularly long view distances.


Yes. I'm sure modding physics and AI taught you so much about balancing visuals and performance.

The point was that it had to go to low settings to get 60fps on a 1060, and by my standards that just means a crappy port.

This is another video; he's also on mostly low settings and he's at 40-50fps. Could be worse, I guess:

 
The point was that it had to go to low settings to get 60fps on a 1060, and by my standards that just means a crappy port.

This is another video; he's also on mostly low settings and he's at 40-50fps. Could be worse, I guess:


With all due respect...I think your standards are stupid. Especially when we're talking about a GPU that was mid range 3 years ago. Now it's basically low end. Of course it's gonna struggle a little. That's how things go. Settings are also relative. Rockstar has always pushed the boundaries a little. GTA5 has the infamous grass setting that enables extremely taxing grass shadows that will still bring GPUs to their knees, as well as the extended draw distance settings, which equally are no pushover. This is no Arkham Knight situation, it's just a case of people being far too used to being able to push max settings with no problems. In other words it's a Crysis 2 title, one that is built with future hardware in mind.
 
The point was that it had to go to low settings to get 60fps on a 1060, and by my standards that just means a crappy port.

This is another video; he's also on mostly low settings and he's at 40-50fps. Could be worse, I guess:



He's averaging 45 - 55 in game, frame times vary between 4 and 33 ms from what I could see. That is a high amount of variability. He's using a 144 hz display.

He's using Vulkan, which may be a factor, as D3D averages are a little higher, as may be his use of ultra textures and 16x aniso. I don't trust either the VRAM usage figure or how the game is handling textures resident in GPU memory.

The game is the most impressive this generation, for my money. Even on a mix of low and med settings the game looks a step above just about everything else.

Edit: silly me, recording the video is probably costing him some fps and adding some frame time variability too.
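A rough sketch of how that kind of frame-time variability can be quantified; the sample trace below is invented for illustration, not taken from the video:

```python
# Quantifying a frame-time swing like the 4-33 ms one described above.
# The sample data here is made up for illustration.
import statistics

def pacing_stats(frame_times_ms):
    """Return (mean, worst frame, stutter count) for a frame-time trace.

    A 'stutter' here is any frame taking over 1.5x the mean time: a crude
    threshold, but enough to flag the spikes that feel like hitches.
    """
    mean = statistics.fmean(frame_times_ms)
    worst = max(frame_times_ms)
    stutters = sum(1 for t in frame_times_ms if t > 1.5 * mean)
    return mean, worst, stutters

# Mostly ~16 ms frames (around 60 fps) with two big spikes:
sample = [16, 15, 17, 16, 33, 16, 15, 16, 32, 16]
mean, worst, stutters = pacing_stats(sample)
print(f"mean {mean:.1f} ms, worst {worst} ms, stutters {stutters}")
# mean 19.2 ms, worst 33 ms, stutters 2
```

The 1.5x cutoff is an arbitrary choice; an overlay such as RivaTuner Statistics Server can graph frame times directly, which is where figures like these come from.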
 
Do I really need to explain how having 8GiB of dedicated VRAM is wildly different than having 8GiB of shared RAM?
They're not comparable at all.
Yes, it is a bit comparable, because the games are designed to use 6GB. The assets in any game today don't exceed 6GB per frame unless you turn the graphics settings up to ultra on PC, where it'll go beyond 7GB at 4K 60 or 8K.

You can go beyond 8GB in any game. Take San Andreas and bump the resolution to 8K, and there you have it: an ugly-looking game using 8GB at 8K. It's pointless. The elephant in the room is that the same assets are used on consoles and PC; it's only your resolution and a few toggles that are better.
 
Yes, it is a bit comparable, because the games are designed to use 6GB. The assets in any game today don't exceed 6GB per frame unless you turn the graphics settings up to ultra on PC, where it'll go beyond 7GB at 4K 60 or 8K.

You can go beyond 8GB in any game. Take San Andreas and bump the resolution to 8K, and there you have it: an ugly-looking game using 8GB at 8K. It's pointless. The elephant in the room is that the same assets are used on consoles and PC; it's only your resolution and a few toggles that are better.

It's 5 GB on consoles, shared. 2 ~ 3 GB for buffers and textures.

PC can easily hit 6 or 7 GB for textures and buffers in vram, plus however many GB they want in main ram for everything else including texture overcommit or swapping textures into vram over PCI-E.

A PC using high settings could easily expect to have double or triple the ram, and to be able to pull data off the HDD faster (faster HDD and more CPU to decompress).
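To put rough numbers on "GB for buffers and textures": an uncompressed RGBA texture costs width x height x 4 bytes, a full mip chain adds about a third on top, and block compression divides the base cost by roughly 4-8. A back-of-the-envelope sketch (generic math, not Rockstar's actual asset budgets):

```python
# Back-of-the-envelope VRAM cost for one texture. Generic math only;
# these are illustrative figures, not RDR2's real budgets.

def texture_bytes(width: int, height: int, bytes_per_pixel: float = 4,
                  mips: bool = True) -> int:
    """Approximate memory for one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mips else int(base)

# One uncompressed 4096x4096 RGBA texture with mips: ~85 MiB.
print(texture_bytes(4096, 4096) // 2**20)      # -> 85
# Block-compressed at ~1 byte/pixel (BC7-like): ~21 MiB.
print(texture_bytes(4096, 4096, 1) // 2**20)   # -> 21
```

Compression is why flat "X GB of assets per frame" claims are slippery: the same source art can occupy very different amounts of VRAM depending on format and which mips are resident.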
 
Averaging above 60fps despite recording (the description says it would be +5 fps without), on a Ryzen 5 3600 and RX 580. Mostly medium settings and ultra textures with 16x aniso.

Looks amazing. Doing far better than the GTX 1060 video above, though that person could be facing other issues not related to the GPU.

 
It's 5 GB on consoles, shared. 2 ~ 3 GB for buffers and textures.

PC can easily hit 6 or 7 GB for textures and buffers in vram, plus however many GB they want in main ram for everything else including texture overcommit or swapping textures into vram over PCI-E.

A PC using high settings could easily expect to have double or triple the ram, and to be able to pull data off the HDD faster (faster HDD and more CPU to decompress).
The games are designed to use 6GB of assets per frame. Whether you add more resolution and higher graphics settings is up to you, but the assets in that game are console quality. You're putting makeup on a frog 🐸.
 
Averaging above 60fps despite recording (the description says it would be +5 fps without), on a Ryzen 5 3600 and RX 580. Mostly medium settings and ultra textures with 16x aniso.

Looks amazing. Doing far better than the GTX 1060 video above, though that person could be facing other issues not related to the GPU.


It's a console port. Every frame has the same assets as the console version. It doesn't matter how much FPS or what resolution you can pull on PC; it's still a console game at heart. This isn't Crysis 1.
 
The games are designed to use 6GB of assets per frame. Whether you add more resolution and higher graphics settings is up to you, but the assets in that game are console quality. You're putting makeup on a frog 🐸.

Base consoles only have access to 5GB for games, not sure why you keep saying 6.

The assets are actually designed at higher levels than base consoles can manage - it's a deliberate choice during development for supporting better hardware. PC gamers get access to some of that now.

It's a console port. Every frame has the same assets as the console version. It doesn't matter how much FPS or what resolution you can pull on PC; it's still a console game at heart. This isn't Crysis 1.

"Console game at heart" doesn't actually mean anything. Consoles use PC architectures, and this gen the RAM allocation was good by 2013 standards. PC games have always been highly scalable. RDR2 even seems to make good use of the six cores available on consoles.
 
This and FFXV:WE have similar specs. Someone who owns both, how would you compare the two ports? Does one run better than the other?
 
Base consoles only have access to 5GB for games, not sure why you keep saying 6.

The assets are actually designed at higher levels than base consoles can manage - it's a deliberate choice during development for supporting better hardware. PC gamers get access to some of that now.



"Console game at heart" doesn't actually mean anything. Consoles use PC architectures, and this gen the RAM allocation was good by 2013 standards. PC games have always been highly scalable. RDR2 even seems to make good use of the six cores available on consoles.
You're talking bullshit. You can take any game at all since this gen started and put it in a side-by-side comparison, and they all look similar from console to PC. Besides the resolution and RTX or whatnot features, it's the same game. If you can't agree on that part then there's no point arguing; it's pointless.
bOa7M6m.png


9ckfEKB.png


If this doesn't look familiar I don't know what does!
 

Lanrutcon

Member
You're talking bullshit. You can take any game at all since this gen started and put it in a side-by-side comparison, and they all look similar from console to PC. Besides the resolution and RTX or whatnot features, it's the same game. If you can't agree on that part then there's no point arguing; it's pointless.
bOa7M6m.png


9ckfEKB.png


If this doesn't look familiar I don't know what does!

Were those PC shots taken from a potato plugged into the wall? They look like crap. Could someone with the game provide some non-crap screenshots please. I can't believe the game looks that bad on an actual PC.
 
You're talking bullshit. You can take any game at all since this gen started and put it in a side-by-side comparison, and they all look similar from console to PC. Besides the resolution and RTX or whatnot features, it's the same game. If you can't agree on that part then there's no point arguing; it's pointless.
bOa7M6m.png


9ckfEKB.png


If this doesn't look familiar I don't know what does!
You know what's hilarious? Even in your horrifically compressed 480p screencaps I can see extra detail in the skin texture on...you guessed it...PC. On console his face looks kinda like it's made of playdough, it's too smooth. I can see what looks like pores on PC.
 
You know what's hilarious? Even in your horrifically compressed 480p screencaps I can see extra detail in the skin texture on...you guessed it...PC. On console his face looks kinda like it's made of playdough, it's too smooth. I can see what looks like pores on PC.
What's hilarious is what you're explaining. So a few textures is what you call an upgrade to an asset? If that's what you're looking for, then you're nuts!
 
What's hilarious is what you're explaining. So a few textures is what you call an upgrade to an asset? If that's what you're looking for, then you're nuts!
Oh...didn't you realise textures were assets? I mean...they take up VRAM...and you were quite clear (and verifiably wrong) when you said "every frame has the same assets as the console version", so I guess you must not have. Either that or...you're talking out of your ass and going out of your way to source the absolute shittiest example screenshots you could to cover up your bullshit, and even then it wasn't enough.
 
Oh...didn't you realise textures were assets? I mean...they take up VRAM...and you were quite clear (and verifiably wrong) when you said "every frame has the same assets as the console version", so I guess you must not have. Either that or...you're talking out of your ass and going out of your way to source the absolute shittiest example screenshots you could to cover up your bullshit, and even then it wasn't enough.
You're talking monkey manure now!
 