Just tested some benchmark videos on my OG Xbox One. Not very scientific, but here are the results anyway:
H.264 (8-bit), average 10 Mbit/s, 1920x1080 progressive, hardware decoded: WORKS
H.264 (8-bit), average 10 Mbit/s, 1920x1080 progressive, software decoded: WORKS
HEVC Main (8-bit), average 10 Mbit/s, 1920x1080 progressive, software decoded: LAG
HEVC Main10 (10-bit), average 10 Mbit/s, 1920x1080 progressive, software decoded: LAG
H.264 (8-bit), average 20 Mbit/s, 1920x1080 progressive, hardware decoded: WORKS
H.264 (8-bit), average 20 Mbit/s, 1920x1080 progressive, software decoded: WORKS
HEVC Main (8-bit), average 20 Mbit/s, 1920x1080 progressive, software decoded: LAG
HEVC Main10 (10-bit), average 20 Mbit/s, 1920x1080 progressive, software decoded: LAG
H.264 (8-bit), average 90 Mbit/s, 1920x1080 progressive, hardware decoded: WORKS
H.264 (8-bit), average 90 Mbit/s, 1920x1080 progressive, software decoded: WORKS
HEVC Main10 (10-bit), average 90 Mbit/s, 1920x1080 progressive, software decoded: LAG
HEVC software decoding is terrible across the board, though not very surprising considering the strain it puts on the poor little CPU. What is perhaps surprising is that it can software-decode 90 Mbit/s H.264 yet cannot even handle 10 Mbit/s HEVC, be it Main or Main10.
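
If anyone wants to cook up similar test clips for their own box, something along these lines should work. This is only a sketch using ffmpeg driven from Python; the source filename, output names, and encoder settings are illustrative guesses, not the exact settings behind the results above.

```python
import subprocess

def make_clip(src, codec, bitrate, pix_fmt, out):
    """Transcode `src` into a 1080p progressive test clip.

    codec:   "libx264" (H.264 8-bit) or "libx265" (HEVC; the pix_fmt
             decides Main vs Main10)
    bitrate: target average bitrate, e.g. "10M", "20M", "90M"
    pix_fmt: "yuv420p" for 8-bit, "yuv420p10le" for 10-bit (Main10 --
             assumes your libx265 build supports 10-bit)
    """
    subprocess.run([
        "ffmpeg", "-y", "-i", src,
        "-vf", "scale=1920:1080",  # force 1920x1080 (assumes a progressive source)
        "-c:v", codec,
        "-b:v", bitrate,           # average-bitrate mode
        "-pix_fmt", pix_fmt,
        "-an",                     # drop audio so only video decode is under test
        out,
    ], check=True)

# Example: a 10 Mbit/s HEVC Main10 clip like the one in the list above
make_clip("source.mkv", "libx265", "10M", "yuv420p10le", "hevc_main10_10mbit.mkv")
```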