Seriously? Not sure how much more I can say here, man - I covered this earlier...
Please read my posts - and then read the post that I responded to - and try to
understand the context beyond what you're hyper-focused on: pure FPS differences on one particular cutscene in this one game.
There's additional context here with respect to the resolution and settings differences between the consoles in this particular game (and others) that you're either blatantly ignoring or just don't want to include, because it directly undermines your proposed analysis (comparing the minimum FPS as "like for like").
I'll try to spell it out in basic terms so that we can move on...
The basics of comparing performance of a GPU are quite straightforward (watch any number of YT reviewers):
- Pick a game
- Set up a clean OS install
- Set the game to a baseline of settings (resolution, AA, shadows, etc.)
- Insert the first GPU - clean out the old drivers and install the latest
- Run a section of the game and count the FPS (typically the average across a specific area is captured and used - see the sketch after this list)
- Insert the second GPU - clean out the old drivers and install the latest
- Run the same section of the game and again count the FPS (average)
- Make a chart - done.
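To make the "count the FPS" step concrete, here's a minimal sketch of how that averaging typically works, assuming a PresentMon-style frame-time CSV with a MsBetweenPresents column (the column name and file names here are assumptions - swap in whatever your capture tool actually produces):

```python
# Minimal sketch: average FPS across a captured benchmark run.
# Assumes a PresentMon-style CSV where "MsBetweenPresents" holds the
# per-frame time in milliseconds - adjust for your capture tool.
import csv

def average_fps(csv_path: str) -> float:
    frame_times_ms = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            frame_times_ms.append(float(row["MsBetweenPresents"]))
    # Average FPS is total frames divided by total seconds rendered -
    # not the mean of per-frame instantaneous FPS values.
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical capture files from the two runs:
print(f"GPU 1: {average_fps('gpu1_run.csv'):.1f} fps")
print(f"GPU 2: {average_fps('gpu2_run.csv'):.1f} fps")
```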
This works great on a PC because we can easily set the game to run at an unlocked/uncapped FPS with a very specific target resolution and additional game settings that would impact overall performance.
But it's nearly impossible when comparing consoles, because the FPS on consoles is almost always locked/capped, and the other settings that directly affect performance are almost always different between the consoles being tested. This is because developers do a lot of tweaking on their games to get the best performance based on a customized target profile per console.
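To show why the cap hides real differences, here's a toy sketch with completely made-up frame times: two hypothetical consoles with very different GPU headroom both report an identical locked 60fps.

```python
# Toy illustration with invented numbers: a 60fps cap masks GPU headroom.
# Raw (uncapped) frame times in milliseconds for two hypothetical consoles.
console_a_ms = [10.0, 11.0, 10.5, 12.0]  # ~92fps if uncapped
console_b_ms = [14.0, 15.5, 15.0, 16.0]  # ~66fps if uncapped

CAP_MS = 1000.0 / 60.0  # 16.67ms frame-cap/vsync interval

def capped_fps(frame_times_ms: list[float]) -> float:
    # No frame can present faster than the cap allows.
    displayed = [max(t, CAP_MS) for t in frame_times_ms]
    return 1000.0 / (sum(displayed) / len(displayed))

print(f"Console A: {capped_fps(console_a_ms):.1f} fps")  # 60.0
print(f"Console B: {capped_fps(console_b_ms):.1f} fps")  # 60.0
# Both read a flat 60fps even though A has far more headroom - only the
# moments where a console misses the cap reveal anything at all.
```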
To that end there are three paths:
- Option 1: Positive testing - unlocked framerates and the same settings.
- Option 2: Negative testing (comparing sections of the game where both consoles struggle and framerates dip) - locked framerates but the same exact ("like for like") console settings.
- Option 3: Negative testing - locked framerates but different console settings, comparing/contrasting them against PC-equivalent "guesstimates" that attempt to re-create each console's settings on a PC configuration to infer performance differences (a rough sketch of that math follows this list).
  - Even this process is suspect and debatable, but "closer", because the data is interpreted and isn't a full or exact "like for like" comparison of the actual consoles.
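To give a flavor of what those Option 3 "guesstimates" look like, here's a back-of-envelope sketch. The resolutions and FPS readings are invented, and it assumes GPU cost scales roughly linearly with pixel count - a crude approximation at best:

```python
# Back-of-envelope Option 3 math with hypothetical numbers:
# normalize a measured FPS dip by the pixel-count gap between each
# console's rendering resolution.
def pixels(width: int, height: int) -> int:
    return width * height

# Hypothetical: console X renders 3840x2160, console Y 3200x1800,
# and both dip during the same heavy cutscene.
x_res, y_res = pixels(3840, 2160), pixels(3200, 1800)
x_dip_fps, y_dip_fps = 52.0, 55.0  # invented minimum-FPS readings

# Estimate what console Y "would" do at console X's resolution,
# assuming (crudely) that FPS scales linearly with pixel count.
y_normalized = y_dip_fps * (y_res / x_res)
print(f"Pixel ratio (Y/X): {y_res / x_res:.2f}")  # ~0.69
print(f"Y's dip normalized to X's resolution: {y_normalized:.1f} fps")
# Y "wins" the raw comparison (55 vs 52) but loses once the resolution
# gap is factored in (~38 vs 52) - which is exactly why a raw
# minimum-FPS comparison at different settings is meaningless.
```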
In the case of comparing Hitman 3 - the only way to begin any form of comparison is Option 3. Options 1 and 2 are not possible because the framerates are capped and the settings are different.
Maybe we're saying the same thing in different ways? Not sure - but at the end of the day, this is why 30+ pages of nitpicking over minor FPS drops in one specific section of a game - one that isn't running at the exact same resolution/shadow detail on both consoles, and that otherwise holds a locked 60fps 99.999% of the time - is a completely pointless exercise.
There's no "winning" here for either console because it's not a competition.