– Whether you’re an early adopter
or stubbornly think 1080p will be
good enough for the rest of your life,
4K is here and it’s not going away.
If you do find yourself
in the market for a new TV,
entry-level 4K has become so affordable
that 1080p just doesn’t seem
to make much sense anymore.
But there are also plenty of hidden costs and challenges,
along with new terminology,
specs, and settings to wrangle.
And what does it all mean for using
your older consoles on your big fancy new screen?
– When the PlayStation 4 and Xbox One released in 2013,
it was a real treat to see console games
take much fuller advantage
of our 1080p TVs than their predecessors,
which typically rendered 3D graphics
at a more modest 720p.
At the time,
a 4K screen seemed pretty useless
unless you had a truly monstrous gaming PC.
But then Sony and Microsoft broke the rules
of hardware generations with the
PS4 Pro and Xbox One X,
making resolutions between 1080p
and native 4K possible on consoles
much sooner than expected.
Whether these powered-up systems
or your newest PC build
have piqued your interest in 4K,
chances are your next TV will be 4K,
and you’re gonna need to figure out
how to play games on it,
whether they’re new,
old, or really old.
Marketing buzzwords have kind of made
a mess of things in the 4K world,
making it much more confusing than it needs to be.
But the key points that you really need to know
are actually pretty simple once you boil it all down.
On televisions, 4K is 3840 x 2160,
over 8 million pixels across the screen.
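That "over 8 million" figure is just arithmetic; here's a quick, purely illustrative Python sanity check:

```python
# 4K UHD as used on televisions: 3840 x 2160
width, height = 3840, 2160
total_pixels = width * height
print(f"{total_pixels:,}")  # 8,294,400 pixels

# For comparison, that's exactly four times the pixels of 1080p
print(total_pixels // (1920 * 1080))  # 4
```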
“4K” is itself a generous approximation
of the horizontal resolution of 3840.
The more appropriate term for 4K,
which you won’t likely see in any advertising, is 2160p.
These days it’s hard to hear about 4K without
also hearing about HDR, High Dynamic Range.
This is another confusing term because
it has also been used to describe various
other things that aren’t exactly the same
HDR that we’re talking about here.
For example, for years games have
used a technique called “HDR”
to dynamically adjust exposure
as scenes move from indoors to outdoors,
or you might find an “HDR” setting
on your phone’s camera,
which blends multiple exposures to
retain detail in highlights and shadows.
The type of HDR we’re talking about
is a different thing altogether,
and one that must be specifically implemented
by the screen and the content.
A TV that supports HDR can display
a greater range of color and luminance.
The screen must be able to achieve
a certain actual maximum brightness, measured in “nits,”
to properly implement HDR.
HDR rendering doesn’t necessarily
make standard dynamic range look bad,
but it does give creators more control
over the correct luminance and color tones
represented throughout an image.
But be warned: many cheaper 4K TVs that
“support” HDR may indeed accept the signal,
but cannot output enough nits or have high
enough of a contrast ratio against black pixels
to actually meet the qualifications for a proper HDR experience.
In addition, HDR requires dynamic backlight control,
so if HDR is important to you,
research what type of backlight is used,
and read reviews that evaluate its HDR performance.
We’ll talk more in-depth about HDR and how
to be sure it’s set up properly later in the video.
But the biggest mess of all are HDMI standards, cables,
and how no one selling HDMI stuff knows
how to communicate what you actually need.
Unlike analog video,
which can vary in quality due to interference,
digital video isn’t susceptible to the same issues
because the device displays exactly
what the source tells it to.
But at the end of the day,
HDMI cables are still just wires.
Once the signal is in the wilderness
between the console and TV,
anything could happen,
and it might not make it to its destination in one piece.
When you’re dealing with 1080p,
sure, almost any cable should get the job done.
But 4K and HDR push HDMI cables to their absolute limits;
if you continue to believe the old adage that
“an HDMI cable is just an HDMI cable,”
you WILL have problems.
There’s a big difference between
HDMI cables and HDMI specifications.
The HDMI 2.0 specification allows
for 2160p at up to 60 frames per second,
along with HDR and all that good stuff.
Your HDMI cable is not an “HDMI 2.0 cable,”
but the devices you connect with it,
your PS4 Pro or your 4K TV for example,
THOSE are HDMI 2.0.
It’s what you plug the cable INTO,
not the cable itself.
If you ever see someone call a cable
“HDMI 2.0” or “HDMI 2.1”…
kindly correct them.
However, the cables do have specifications
of their own that define their bandwidth capacity.
Virtually all HDMI cables sold today
are called “High Speed” cables.
Rest assured that if you’ve been using
1080p with your game consoles,
you’ve already got “High Speed” cables,
not the extinct “Standard” HDMI cables.
4K and HDR also require High Speed cables,
but the problem is that not all
High Speed cables are created equal;
in fact, chances are that even if you’ve got
bins full of High Speed HDMI cables,
most of them probably aren’t up to the task.
At best, sub-par cables will give you “speckling”
in the picture and occasional video drops.
Or you just won’t see anything at all.
Many cables are advertised as supporting 4K,
but if you look closer,
you might see that they can only claim
working with 4K at 24Hz or 30Hz,
usable only for video content.
Games, regardless of their framerate,
are output at 60Hz,
so more bandwidth is needed.
Look past the messy titles and any claims
of HDMI 2.0 and HDR and 4K and all that.
There are only two key phrases that matter:
4K60, and 18 gigabits per second.
Many “High Speed” cables are actually
only rated to carry something like
10.2 gigabits per second.
If the cable can claim to successfully carry
18 gigabits per second from end to end,
then it should do the trick.
Longer cables are riskier,
even just 15 feet is pushing it.
If you want just a bit more assurance,
simply add “Premium Certified” to your search.
Premium Certified cables are guaranteed
to stand up to everything that HDMI 2.0 can do,
and they don’t even have to be
much more expensive; trust us,
this can save you a lot of headaches.
For an extra long Xbox One X run,
we’ve had great success using two
of Monoprice’s 15ft Premium Certified cables
coupled with their Blackbird HDMI 2.0 repeater.
Even with 4K and HDR,
the signal doesn’t break down at all.
We’ve also tested the Cabernet Ultra Series 30ft cable,
which uses active circuitry to sustain longer runs,
but our tests yielded inconsistent results
when dealing with HDR at 2160p.
As of the time of this video,
a new category of HDMI cable,
Ultra High Speed,
is just barely starting to creep into the market.
These are expected to support anything
that the newest HDMI 2.1 spec or beyond
might throw at them (like 8K video),
carrying up to 48 Gigabits per second.
While these should be highly reliable cables,
they aren’t likely to be relevant for
gaming for years to come.
Even if a device supports HDMI 2.1
(such as the Xbox One X),
game consoles really don’t need more
than 18 Gigabits per second for the time being.
Alright, you got all that?
I promised that the basics aren’t all that complicated,
so here’s a quick review:
The HDMI 2.0 hardware spec allows
for 4K at 60Hz and HDR; 4K is 2160p;
HDR allows for higher color precision
and a greater range of luminance in
an image when properly supported;
and your HDMI cables must be rated to carry
18 Gigabits per second to get full functionality
from consoles like the PS4 Pro and Xbox One X.
But that’s just your starting point.
Are there any particular TV settings to be aware of?
What if you need to split the signal
to stream and capture?
Are there any special considerations
for earlier HDMI consoles,
and what about upscaling with
the Framemeister or OSSC?
Well, Try’s had quite an adventure getting
his own 4K setup up and running,
so I’m going to leave that to him.
– If you’ve been watching My Life in Gaming,
then you probably know that
we like to have all of our consoles,
both new and old,
connected and ready to play at any time,
whether for recording,
streaming, or just for fun.
When I added a 4K TV to the mix,
well, no doubt it opened up exciting possibilities,
but I wasn’t expecting how much more time
I would have to spend to get
my setup running like I needed it.
The TV that you’ll see most featured
in this episode is LG’s 55 inch C7 OLED,
the 2017 model.
Yes, I know that there is a lot of concern
about burn-in potential on OLEDs,
but I talked to a few people educated
and experienced enough on the subject
to conclude that the risk was extremely minimal,
almost like you have to try to make it happen.
Scenes like the Xbox One or Switch dashboards
that feature some flat middle-gray tones
do have a tendency to show
some image retention (not burn-in)
if you let them linger a moment,
which is totally normal and is nothing to worry about.
From retro games to modern games
to extensive use as a PC monitor,
I’ve got no burn-in after nearly a year of use,
but I do try to vary up what’s on screen every now and then,
especially when I’m editing all day.
The beautiful colors and famously
perfect black levels of OLED
made it well worth saving up for.
But of course there are 4K TVs for virtually any budget;
our friend Drumble of backloggery.com
instead opted for a Vizio M50-D1,
which is excellent for gaming in its own right.
But don’t think of this as a review
of any particular TVs or brands,
but rather a look at what obstacles
and other considerations might
come into play when going 4K.
The TV was expensive enough that
I thought I could save money in other ways,
like choosing relatively affordable
HDMI 2.0 switchers and splitters,
and settling for a soundbar instead
of a 4K-ready AV receiver.
Well, I can say from experience that
it’s a mistake to make all of these disparate devices
(of potentially dubious manufacturing origin)
try to talk to each other when 4K and HDR
are already pushing the limits of what
you can pump through HDMI,
and when retro hardware can deal
in off-spec signals that don’t always
fit neatly into digital scenarios.
That’s not to say you can’t have success
with HDMI 2.0 switchers and splitters.
The switchers I initially tried could be
spotty with signal detection,
but for the most part were alright when used
to simply connect multiple consoles to the TV.
But when paired with a splitter,
they became much less reliable.
So if you’re not using a splitter,
cheaper switchers on Amazon may do the trick,
or Monoprice’s Blackbird series.
Remember to read the specs of any device
very very closely because many that claim 4K support,
including much of the Blackbird 4K Pro series,
are only good for 4K at 30Hz.
Always look for the key words before buying:
18 gigabits per second, and 4K60.
For me, the solution was to bring all those functions,
switching, splitting, and audio, into a single device.
An AV receiver.
Something I didn’t really have any interest
in investing in at the time,
but in the end it was so worth it to finally
have everything just work.
I ended up with the Denon AVR-X3400H.
But I did first have to
return a far more affordable receiver
that appeared to have everything I needed.
So I learned a few key things to watch out for.
When shopping for a TV and AV receiver,
check the manufacturer’s website to get
specifics on which HDR formats are supported,
for example, Dolby Vision is typically used for movies,
but PS4 and Xbox One use HDR10 for gameplay,
a license-free format that you’d think
would just be supported by default,
but no, not necessarily.
Note that on some receivers and some TVs,
HDR features may be limited only to certain inputs.
Now this isn’t necessarily 4K-specific,
but considering my previous AV receiver
was bought around the dawn of HDMI
and didn’t even properly support it,
I wasn’t aware that ports labeled “Monitor 2”
do NOT carry audio, only video,
making the output unsuitable
for sending to capture devices.
The feature you need to look for is “Zone 2.”
That lets you send both video AND audio
through a second output.
Also, if you want to use the OSSC,
your TV isn’t the only device
that needs to be friendly with it.
The Onkyo receiver I initially bought
didn’t work very well with it at all,
but the Denon receiver lets me do basically
anything I want with the OSSC.
If you’re new to the channel,
you might be confused about what we mean by “OSSC.”
It stands for Open Source Scan Converter
and in theory it’s just about the best thing
out there for playing analog consoles
on modern displays.
But what makes it so great is
also its biggest weakness:
it digitizes and multiplies the video input
and spits out the exact same refresh rate that it takes in,
which means if you have a low-lag TV or monitor,
then you can have a nearly
lag-free gameplay experience and it’ll look GREAT.
The problem is that these old consoles,
some of which might output
unwholesome refresh rates like 60.08Hz,
may do some loosey-goosey stuff
that analog TVs are A-OK with,
but conflict with the more rigidly-defined
worldviews that a digital display may hold.
Sadly, the OSSC was all but useless
with my older 1080p Samsung TV.
The OSSC’s main competitor,
the soon-to-be-discontinued XRGB-Mini Framemeister,
is most useful because it outputs
a very normal signal that should work
with just about any TV or capture card,
but it does have its own shortcomings,
including about a frame and a half of lag,
along with some color and noise issues
associated with its analog to digital conversion.
Overall it’s still a great device,
but I was really looking forward to getting
more use out of the OSSC with my 4K TV.
In fact, it was thanks to reports from
our good friend John Linneman at Digital Foundry,
saying that the OSSC works great with both
his 2016 LG OLED and that his
HDR10-equipped Denon receiver
plays nice with it too,
that I ultimately ended up with
newer versions of the same stuff myself.
It might seem a bit strange to most folks,
but good OSSC compatibility with various
systems was right at the top of my list of
priorities for my 4K setup,
and indeed I’ve thus far played more with the
OSSC on my OLED than I have actual high res
or HDR games on PS4 Pro and Xbox One X!
The OSSC also works great on Drumble’s Vizio,
and anecdotally, I’ve been hearing
more and more that 4K TVs tend to be
more tolerant of unusual signals than HDTVs
from even just a few years ago.
Both his TV and mine take SNES
and RGB-modded NES via the OSSC without a hitch;
those are the signals that tend
to cause the most compatibility issues.
The OSSC has advanced significantly
since the first time we featured it in 2016,
now offering stunning
4x and 5x output modes for 240p sources,
resulting in 960p and 1200p respectively.
Even if 1200p just barely takes advantage
of a 4K screen,
I still think it’s pretty fun to push
my retro consoles just a bit beyond 1080p.
If 5x is incompatible,
you can also try a 1080p variant
that cuts off some overscan.
Of course you can always use 3x
with scanlines if you need that PVM-like look.
If crisp pixels are your thing and your TV takes 960p,
you might also enjoy the 480p 2x mode.
The OSSC is technically a line multiplier
rather than a scaler,
which does give it a bit less flexibility
than something like the Framemeister.
For example, you can use the Framemeister’s zoom functions
so that stuff like Super Game Boy,
Game Boy Player,
and PSP video output can fill more
of the screen than they normally would.
I can do something similar with the OSSC
thanks to my LG TV’s “All Direction Zoom,”
which lets me fill quite a large portion of the
screen with these lower res portable games.
I’m not sure how common this feature is on
other brands or even other LG TVs,
but I think the result is as nice as the Framemeister.
Retro game emulation via your PC can also
be served very well by a native 4K resolution,
which is Drumble’s preferred way to play on his Vizio.
It goes without saying that pixel graphics look crazy sharp,
but he was a bit surprised by how well his
computer seemed to handle 4K emulation of
more advanced consoles too.
With the OSSC performing better
in my setup than the Framemeister
for 240p and 480p games,
the Framemeister’s quality deinterlacing
of 480i content had been the most
compelling reason to keep it hooked up.
But I have to admit,
with the OSSC’s 4x 480i mode,
I’ve been getting more and more used to the
so-called “bob deinterlacing” technique,
which is good,
because my trusty ol’ Framemeister has
actually been a bit of a thorn in my side ever
since making the move to 4K,
but the reason why is a bit strange.
Somewhere in your TV menu there should be
some options regarding “Deep Color.”
On my TV it’s buried weirdly deep,
not in Picture Settings,
but under General Settings,
way at the bottom.
I can activate Deep Color on all four HDMI inputs,
but note that some TVs may support
Deep Color only on one or two inputs.
To use HDR features on modern consoles,
Deep Color must be supported and activated on your TV.
The problem: if I do this,
the Framemeister gets seriously screwed up,
with a pinkish screen and no sound,
and we’ve gotten so many questions
about pink Framemeisters that I think
it’s a fairly widespread issue.
But if nothing is causing any incompatibilities for you,
just leave Deep Color on so that
HDR works when you need it.
– OK, so if you want to use HDR with a console
like a PS4 or Xbox One,
you have to activate Deep Color in your TV settings.
But what is Deep Color and what does it have
to do with HDR?
As you probably know,
each pixel on your TV is made up of blips of red,
green, and blue color, called subpixels.
Each subpixel is driven by its own color channel,
and as the light blends together,
it makes possible all the colors that your screen can resolve.
Most traditional digital displays are
driven by 8-bit color channels,
but don’t think of it like the limited colors
you’d see in an 8-bit video game.
8-bit color is actually pretty impressive,
with each channel being defined on a scale of 256 values,
making possible over 16 million colors!
The real-world impact of an 8-bit color model
is limited by issues such as display quality and
the way an image has been processed,
rendered, or compressed.
Deep Color is simply an umbrella term
for digital color models that use
10, 12, or 16 bits per color channel.
Deep Color is a core requirement
for HDR specifications.
HDR10 uses 10-bit color for instance.
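The color counts behind those bit depths are easy to work out; here's a small illustrative Python sketch (the function name is our own):

```python
# Each of the three channels (red, green, blue) can take 2**bits values,
# so the total palette is that count cubed.
def total_colors(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(total_colors(8))   # 16777216 -> the "over 16 million colors" of 8-bit
print(total_colors(10))  # 1073741824 -> over a billion colors for 10-bit HDR10
```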
More colors may not seem so impressive by themselves.
I mean, it’s been decades since color seemed
like a real limiting factor in video game hardware.
But there’s another layer to HDR
that is more easily caught by the human eye:
control over a larger range of luminance.
Keep in mind that we can’t present
this video to you in HDR,
so you’ll have to use a bit of imagination.
Even our cameras can’t see
the darkest shadows and brightest highlights
in a single exposure.
A medium exposure such as this
is missing detail in the columns and sky,
but you can actually see it in person.
In a direct feed clip like this,
the PS4 is outputting HDR,
but the image is tone-mapped by the
AVerMedia Live Gamer 4K capture card
so that it looks acceptable on a standard screen.
You’ll find a segment about recording 4K,
HDR, and the OSSC near the end of this episode.
A high-quality HDR implementation takes
advantage of the way our eyes actually
perceive light to make a more realistic scene.
Think of it this way:
a standard display shows white at maximum luminance.
But when an HDR game displays white,
usually it’s not at maximum luminance,
but rather something like a “paper white” value.
Parts of the scene that should be brighter
than paperwhite, well,
with HDR they can be.
It’s the ability to distinguish between
something that illuminates,
something that is illuminated,
and how that light bounces through the scene.
All PS4 and Xbox One systems
(except for the original Xbox One model)
are capable of HDR gameplay with up-to-date firmware,
so this is a feature you may be able
to take advantage of even if you don’t have
a PS4 Pro or Xbox One X.
Just keep in mind that each game
has to be specifically designed for HDR,
though some older games may
have HDR features patched in after the fact.
Many games do support it,
but at this point it is still by no means a sure thing.
When choosing video output settings,
you might be wondering whether
to use RGB or compressed YUV modes.
RGB is “technically” the best;
the lack of compression is often denoted
by a 4 to 4 to 4 ratio,
but you shouldn’t get so hung up on that.
YUV is a category of color encoding formats
that require less bandwidth
to transmit compared to raw RGB,
and can be extremely high quality when used properly.
Component video would be
a familiar example from the analog realm.
While there does exist a lossless
four-four-four YUV mode,
we’re currently mostly concerned with
YUV four-two-zero and YUV four-two-two.
These work by taking advantage of limitations
in how the human eye perceives light
and color by reducing the resolution
of certain color channels.
The negative effects of this compression
are often illustrated by subsampling artifacts
on very fine PC text,
but this is a fringe case;
console game graphics are
extremely well-suited to YUV encoding
and are unlikely to ever cause visible issues,
especially at higher resolutions.
YUV has no negative impact
on color vibrancy or the overall range of color.
In fact, HDR is impossible with
10-bit RGB at 4K60 on HDMI 2.0
because it would require more
than 18 Gigabits per second to transmit.
Even if you do choose RGB,
your system automatically switches
to a YUV mode to play HDR content anyway,
so the choice is effectively
only relevant for non-HDR games.
With HDMI 2.1 and Ultra High Speed cables,
compatible devices will be able to deliver
uncompressed color for those who demand it,
but in the here and now it’s better to think of
four-two-zero and four-two-two options as a
means of reducing bandwidth to make for
easier compatibility with certain screens,
switchers, and AV receivers,
rather than a choice in perceptible quality.
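To make the bandwidth argument concrete, here's a rough back-of-the-envelope Python sketch of uncompressed data rates for the active pixels only; it deliberately ignores blanking intervals and HDMI's TMDS encoding overhead, which is why real link requirements come out higher than these numbers:

```python
# Average samples carried per pixel under each chroma subsampling scheme:
# 4:4:4 keeps full chroma, 4:2:2 halves horizontal chroma,
# 4:2:0 halves chroma both horizontally and vertically.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def active_gbps(width, height, fps, bits_per_sample, subsampling):
    bits = width * height * fps * bits_per_sample * SAMPLES_PER_PIXEL[subsampling]
    return bits / 1e9

# 10-bit 4K60: full 4:4:4 already nears HDMI 2.0's 18 Gbps ceiling
# before overhead is counted, while 4:2:2 and 4:2:0 leave headroom.
for scheme in SAMPLES_PER_PIXEL:
    rate = active_gbps(3840, 2160, 60, 10, scheme)
    print(f"{scheme}: {rate:.1f} Gbps of active pixel data")
```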
Video output settings are presented differently
between PS4 and Xbox One.
For demonstration purposes here,
we’re using a PS4 Pro and Xbox One X.
PS4 Pro offers two 2160p output modes,
RGB and YUV420.
As we’ve established,
this is not a choice you need to stress over.
RGB Range is irrelevant for YUV,
but if you do use RGB mode,
we recommend Limited for most scenarios;
more on that in a bit.
HDR and Deep Color Output must both
be set to Automatic to use HDR.
Check “Video Output Information”
to confirm that 60Hz is available for HDR
and non-HDR modes.
If HDR is not available,
then check your TV’s Deep Color setting,
make sure you’re using 18 Gigabit HDMI cables
with a compatible TV input,
and that all devices in your video chain
are at least HDMI 2.0.
Many PS4 games also have the option
to disable HDR rendering if for any reason
you don’t prefer it.
If you’re a fan of PlayStation VR
like our buddy Game Dave,
note that you’ll have to unplug
the Processor Unit in order to use HDR signals
unless you have the revised PSVR hardware
that supports HDR passthrough.
For Xbox One,
first go to the “Video Modes” page under “Advanced.”
Here you can make sure “Allow 4K” is enabled,
as well as HDR.
“Allow YCC 422” is a YUV option;
you can leave it unchecked
unless you’re having difficulty with 4K modes.
“Auto Low-Latency Mode” and “Variable Refresh Mode”
are future-proof HDMI 2.1 functions.
In theory, the former forces a TV into
its most gaming-friendly low-lag mode,
while Variable Refresh allows screens to not be
bound to a set refresh like 60Hz or 120Hz,
theoretically making games with
unstable frame rates appear smoother.
Note that 4K output is possible on S as well as X models,
but only X consoles are capable of rendering
gameplay at resolutions beyond 1080p.
Under “Video Fidelity,”
you can choose 8-bit, 10-bit,
or 12-bit color if your TV is compatible,
but regardless of your choice here,
10-bit HDR10 will activate when needed.
Color Space is for choosing what is normally
labeled as “Limited Range” or “Full Range,”
which we’ll talk about shortly,
just leave it on “Standard” for now.
From what we’ve heard,
HDR support in the PC realm is unfortunately
considered to be a bit of a mess at the moment,
although interestingly a few Dolby Vision games
do exist in the PC space.
Whether HDR is the game-changer
that it’s hyped up to be is up to you.
It’s important to know that not all HDR
implementations are created equal,
and in some games you might
even prefer the standard renderer.
I mean, HDR shouldn’t necessarily
be calling attention to itself and blowing
your mind at every second,
but in some games it does
feel like an afterthought.
Not every game needs HDR,
but many studios are getting better
and better at implementing it.
HDR gives creators more control
to make the image appear
just the way they want it to,
and that can only be a good thing moving forward.
– Of course,
not every game or system
you’ll want to play on your 4K TV supports HDR.
Not much should have to change between
HDR and non-HDR settings,
but you will probably need to turn
your TV’s backlight down a bit.
On my TV, this is marked as “OLED Light,”
which thankfully remembers that I like it
set to 100 for HDR and 75 for non-HDR.
That’s just what feels comfortable to me.
Otherwise it’s kind of painfully bright because
standard dynamic range just isn’t fine-tuned to
work with that kind of luminance range.
Gotta reel it in.
For HDR though,
you may as well let it use all the luminance your TV has.
Don’t confuse your backlight
with the “brightness” setting.
We recommend that you do not
touch brightness at all;
you may lose your TV’s true black level
if you bump it up even one notch.
If you have an Xbox One,
it conveniently has a very nice set
of calibration images,
and chances are,
settings that are good for Xbox One
are good for most other sources too.
The screens with the gray ramp and open
and closed eyes are especially useful.
If you can see the closed eye,
then your TV’s black level is too high.
Now hold up, that might sound counter-intuitive.
Aren’t I missing detail if I don’t see the eye?
Don’t I want to see the top and bottom steps
on the gray ramp?
Well, no, actually.
While those details may exist in the signal,
they are outside the range that is intended to be seen.
If I set the TV’s black level to “High”
so that I can see the eye,
then when a game shows a screen
that I know should be perfectly black,
we aren’t getting the black levels
that we should see here.
This is caused by a mismatch of Limited Range
versus Full Range settings between the console and TV.
Of course this isn’t a concern specific to 4K sets,
but your new TV may handle these differently
than your old HDTV or computer monitor,
so it’s important to know how to set this correctly.
Full Range on an 8-bit digital RGB signal
defines each color channel with a scale of 0 to 255.
Limited Range is a standard
that developed separately,
mainly intended for TV use,
which in comparison is represented
by a range of 16 to 235,
where anything below 16 is intended to be pure black,
while anything above 235 should be pure white.
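The relationship between the two ranges is a simple linear mapping; this little Python sketch (function name is our own, purely illustrative) shows the conversion and why a mismatch hurts:

```python
# Map an 8-bit Full Range value (0-255) onto Limited Range (16-235).
def full_to_limited(value):
    return round(16 + value * (235 - 16) / 255)

print(full_to_limited(0))    # 16  -> reference black
print(full_to_limited(255))  # 235 -> reference white

# If one side sends Limited but the other expects Full, black is shown
# at level 16 -> a washed-out dark gray; the reverse crushes shadow detail.
```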
This makes it seem like Limited Range
is missing 30-some steps of color per channel,
which definitely sounds like a bad thing.
But here’s the reality: your life will be way easier
and you won’t be missing a darn thing
if you just set all of your systems
to LIMITED Range whenever possible.
Sounds completely mathematically wrong, right?
But you don’t have to take my word for it;
here’s Digital Foundry’s John Linneman to explain.
– OK, it’s true,
if you drill down into the numbers,
full RGB slightly increases the number
of steps between full black and white.
However, in practice,
this simply isn’t something you’re going to notice.
Provided your display and console
use matching settings,
the two should look identical by eye,
and Limited Range will not increase
any visible color banding,
while an adjusted gamma curve
keeps color tones matched between Full Range
and Limited Range.
If you’re seeing any differences,
such as increased saturation,
then you’ve simply made an error in configuring
either your display or the console.
The reality is,
sticking with RGB limited across your setup
can prove easier in the long run without
negatively impacting image quality.
Why is that, you ask?
Aside from the fact that video content,
such as Netflix or Blu-rays,
are not designed for use with full RGB in the first place,
many game consoles don’t properly
support this setting either.
When using any modern game systems on your TV,
sticking with limited ensures
that you’ll be seeing content correctly.
Beyond that, as Coury previously mentioned,
if you’re using HDR,
then the output switches over
to YUV anyway, where this no longer even applies.
If you prefer to stick with RGB full,
that’s certainly fine,
but you won’t gain anything tangible
and increase the likelihood of running into
configuration issues with certain consoles
and types of content.
So… that’s why if your TV has a setting
called “Black Level” or an equivalent option,
you need to pay it special mind depending
on whether you’re using RGB Limited Range,
RGB Full Range, or YUV.
If your system is set to “Full,”
your Black Level should be “High”;
if your system is set to “Limited” or YUV,
you should bring your Black Level
back down to “Low.”
Avoid automatic range settings because
then you can’t be sure what black level to use.
You might find it helpful to know that
some versions of the 240p Test Suite
have both Full and Limited patterns
that can aid you in calibration.
Note that the PS3 does have a Deep Color option;
we aren’t aware of any tangible in-game benefits,
but it doesn’t hurt if your TV properly supports Deep Color.
The “Super-White” option is only relevant
for certain types of video content.
Microsoft has a habit of calling some of these
settings something a bit different from the norm.
On the Xbox 360,
under “Reference Levels,”
“Standard” is Limited Range,
“Expanded” is Full Range,
and I… don’t even know why that exists.
Xbox One again says “Standard” for Limited,
which, hey, they recommend!
while “PC RGB” is Full Range.
If your PC color doesn’t seem quite right,
check what kinds of options you’ve got in your
graphics card’s dedicated control panel and
make sure you’ve set your TV settings
on that input to match.
A few notable cases where you don’t have a choice:
the Wii U is Limited Range only,
while many retro-focused devices,
such as the OSSC
or much of Kevtris’s work,
tend to be geared towards Full Range output.
I’ve also run into situations where
Limited Range or Full Range may not
pass correctly through a cheap switch box,
so that’s one area to troubleshoot if you’re
having black level frustrations.
At one point
my TV’s Auto Black Level function
did seem to mostly work just fine,
but after a firmware update,
it kind of broke,
which is something else to be aware of.
Whenever a firmware update is released for your TV,
investigate what people are saying about it online
before you decide to install it.
It’s probably fine most of the time;
oftentimes firmware updates
could even improve input lag!
Or… it could make input lag worse.
Or it could even do something obscure,
like break an OSSC mode that was previously working.
These are things I’ve heard about
happening on one TV or another,
so you just never know.
One of the coolest things about 4K
is how 2160p is a multiple of nearly every
common resolution from video game history:
240p, 720p, 1080p…
480 is the only big one left out.
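Those multiples are quick to verify in Python (treating each label as its count of active lines):

```python
# How cleanly does each classic vertical resolution divide into 2160?
for lines in (240, 720, 1080, 480):
    scale = 2160 / lines
    verdict = "integer scale" if scale.is_integer() else "not an integer scale"
    print(f"{lines} lines -> {scale}x ({verdict})")
# 240 -> 9.0x, 720 -> 3.0x, and 1080 -> 2.0x, but 480 -> 4.5x,
# which is why 480 is the odd one out.
```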
Now mind you that this is no guarantee
that every 4K TV
will scale these resolutions beautifully,
but in theory it may help.
You’re unlikely to find a “nearest neighbor” option,
and in some cases,
sharpening may be forced even when just upscaling 1080p.
Overall I’m really happy
with how my LG OLED handles HD resolutions;
I see no obvious scaling artifacts
from a normal viewing distance,
and systems like the Switch that can’t
output 4K still generally look quite good,
although admittedly stuff like compression
artifacts in pre-rendered videos may not be
masked as well as on a 1080p screen.
I rarely find myself thinking a game
looks poor just because it’s 1080p on a 4K screen.
Artificial scanlines are generally considered
to look better at 720p than 1080p,
so perhaps the best use of a clean
720p-to-4K upscale is retro-focused hardware
with optional scanline generation.
Try setting your system to 720p
if you’re a fan of that look.
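As a toy illustration only (real scanline generators like the OSSC’s are far more configurable), the basic effect amounts to dimming alternating lines of the image:

```python
# Crude artificial scanlines: dim every other line by a fixed amount.
# At a clean integer upscale the dimmed bands stay perfectly even,
# which is part of why the effect tends to look best at tidy scales.

def apply_scanlines(rows, strength=0.5):
    """Dim every odd-numbered line by the given strength (0..1)."""
    return [
        [int(v * (1 - strength)) if y % 2 else v for v in row]
        for y, row in enumerate(rows)
    ]

frame = [[200, 200], [200, 200], [200, 200]]
print(apply_scanlines(frame))  # [[200, 200], [100, 100], [200, 200]]
```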
So what if you set, say,
your Xbox 360 or Wii U to 720p
so that native 1280×720 games
aren’t upscaled to 1080p by the system
before reaching your 4K TV?
Well, for these systems,
on my TV at least,
I’m not seeing much of a visible difference.
PS3 usually switches to 720p for 720p titles,
but some games may use software scaling
to output 720p as 1080p.
In this case,
Final Fantasy XIII appears just a bit crisper
when letting my TV handle
the full 720p to 2160p upscale.
PlayStation TV is an interesting case
because the Vita’s native resolution
is considered 1/4 HD,
more or less a 2x scale to 1080p,
but no 1080p output option is available.
Because of this,
depending on how well your 4K TV handles 1080i,
you may actually want to consider that over 720p.
One situation where you must set
your system resolution
to match the game resolution is with
the mCable Gaming Edition by Marseille.
This is an HDMI cable with active circuitry
that claims to improve game image quality
with its own layer of post-process anti-aliasing
with no perceptible lag.
It does actually have some merit,
but for a huge chunk of Xbox 360,
PS3, and Wii U games,
you’ll have to set your system to 720p.
The mCable also works with 1080p and 480p
content if the system output matches.
But frankly, Marseille’s marketing is
pretty shady, creating some confusion
over whether 4K is supported.
The mCable can only pass 4K60 signals
in a compressed 4:2:0 format with no processing.
The color shift and lack of control overall
make the mCable something we don’t prefer,
even if the results can be interesting.
Since we’ve spent so much time talking
about games from the 720p generation,
we can’t not give a shoutout to Microsoft’s
truly commendable backwards compatibility on Xbox One.
The ever-growing list of supported Xbox 360
games is much more comprehensive than
I think anyone should’ve ever expected it to be.
Of course it is emulation,
but from what I’ve seen, flaws are rare,
and I do understand that there’s just a bit of
360 DNA in the silicon that they tap into
to make it so accurate.
But the power of Xbox One
at the very least allows for improved performance
in almost every instance.
And on top of that,
a slowly growing number of Xbox 360 games
are actually rendered in 4K on Xbox One X.
Many 2D elements may appear softer
than the rest of the image,
but the overall impression is stunning.
I noticed that a light shaft effect
in Gears of War 3 renders improperly in 4K,
but that’s the only slight blemish
that I’ve managed to spot.
What’s even wilder is that,
owing to 10-bit color information
in the original game code,
a few games like Mirror’s Edge
are actually presented with a basic form of HDR.
Of course, original Xbox games look fantastic too,
and are presented at a higher resolution
on all Xbox consoles,
typically at or around 1080p on earlier
Xbox One consoles,
and approaching 4K on Xbox One X.
Original Xbox support is a much more recent development,
and the game rollout has been
disappointingly slow compared to Xbox 360,
but a number of key titles are already available,
and hopefully the overall list will look much
more impressive over the coming years.
So, all of these different signals,
from 480p to 4K and all the weird
stuff that the OSSC can output,
how exactly are we capturing this stuff?
If you’re just looking for something
to record the OSSC,
Coury is now using the StarTech PEXHDCAP60L.
It’s not 4K, but it does take the OSSC at up to 5x,
although when it comes to NES and SNES,
he needs to set it to 4x to hold a stable signal.
Otherwise, all direct feed clips in this episode
were recorded via
my Denon receiver’s Zone 2 output
to one of two capture cards.
My first solution for recording both 4K
and the OSSC was the
Magewell Pro Capture HDMI 4K Plus card.
The price kinda hurt,
but Magewell does have a really good
reputation for accepting oddball signals.
All of the critical consoles and resolutions
that the OSSC can output,
even Super Nintendo at 5x,
the Magewell card does it all without a hitch.
But be aware that recording 4K is no joke,
no matter how good your CPU may be.
OBS’s software x264 encoder will almost
definitely result in constant dropped frames,
so consider using the Nvidia hardware encoder instead.
During production of this episode,
I became aware of AVerMedia’s new
Live Gamer 4K via an in-depth review by Epos Vox,
and it’s a big deal because AVerMedia
is first to market with a consumer-grade
HDR capture card.
Luckily, AVerMedia was able to send us a
review unit in time to include in this episode.
So yes, the Live Gamer 4K accepts
2160p signals at 60Hz with HDR.
While you can record HDR files
directly with the RECentral software,
editing in HDR would require additional tools
and workflow changes that we just aren’t
prepared to handle at this time,
but that’s OK because the
Live Gamer 4K tone-maps HDR signals
on the fly so that the image looks good
in recordings or on streams,
while you get to enjoy full proper HDR on your TV.
The tone-mapped image isn’t necessarily
as good as if you’d just played with HDR off,
but I suspect that most of the time you
wouldn’t know the difference without a direct comparison.
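AVerMedia doesn’t document the exact algorithm it uses, but tone mapping in general means compressing HDR luminance, which can far exceed SDR’s ceiling, into the 0–1 range an SDR display expects; the classic Reinhard operator is a minimal example:

```python
# Generic illustration of tone mapping (not the Live Gamer 4K's
# actual method): the Reinhard operator squeezes unbounded HDR
# luminance into SDR's 0..1 range, compressing bright highlights
# instead of clipping them.

def reinhard(luminance: float) -> float:
    """Map an HDR luminance value (0..inf) into SDR's 0..1 range."""
    return luminance / (1.0 + luminance)

for hdr in (0.25, 1.0, 4.0, 16.0):
    print(f"{hdr:>5} -> {reinhard(hdr):.3f}")
```

Highlights are rolled off smoothly, which is why a tone-mapped stream can look convincing even though it no longer carries true HDR data.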
But look, HDR shmachdeearr.
This card takes SNES 5x via the OSSC.
Now that is a big freakin’ deal.
In the past,
brands like AVerMedia and Elgato have not had
good track records with OSSC support,
but here we are:
960p, 1200p, RGB-modded NES,
PC Engine, VGA Dreamcast, GameCube,
they all work and I haven’t noticed any drops yet.
It’s still early days for me with the Live Gamer 4K,
and I’m not ready to remove
the Magewell card just yet,
but right now it feels like it could become
my all-in-one capture card for
everything retro on up to Xbox One X,
and I strongly suspect that it will be
a key piece of our production equipment
for years to come.
– Of course, 4K will continue to evolve,
especially as the HDMI 2.1 spec takes hold
and features like variable refresh become
more widely supported.
And who knows what the next generation
of consoles might bring?
No doubt the creative hardware designers
in the community who love vintage hardware
will also be sure to keep us busy with new ways
to connect old games to new screens.
There’s always going to be something
new just over the horizon,
so it can be a bit tough to know when
to jump on board;
you just have to get what you can at some point.
After all, we’re undeniably somewhere
in the realm of diminishing returns by this point.
The final image quality comes
down to more than just raw pixel count:
filtering and anti-aliasing quality,
HDR implementation, scaling techniques.
But the 4K canvas gives creators so much
to work with that it’s hard to imagine that our
screens will be limiting their art anytime soon.
(Closed captioning by Jordan “Link584”)