Modern & Retro Gaming on 4K TVs :: RGB106 / MY LIFE IN GAMING


– Whether you’re an early adopter
or stubbornly think 1080p will be
good enough for the rest of your life,
4K is here and it’s not going away.
If you do find yourself
in the market for a new TV,
entry-level 4K has become so affordable
that 1080p just doesn’t seem
to make much sense anymore.
But there are also plenty of hidden costs and challenges,
along with new terminology,
specs, and settings to wrangle.
And what does it all mean for using
your older consoles on your big fancy new screen?
(Intro Music)
– When the PlayStation 4 and Xbox One released in 2013,
it was a real treat to see console games
take much more full advantage
of our 1080p TVs than their predecessors,
which typically render 3D graphics
at a more modest 720p.
At the time,
a 4K screen seemed pretty useless
unless you had a truly monstrous gaming PC.
But then Sony and Microsoft broke the rules
of hardware generations with the
PS4 Pro and Xbox One X,
making resolutions between 1080p
and native 4K possible on consoles
much sooner than expected.
Whether these powered-up systems
or your newest PC build
have piqued your interest in 4K,
chances are your next TV will be 4K,
and you’re gonna need to figure out
how to play games on it,
whether they’re new,
old, or really old.
Marketing buzzwords have kind of made
a mess of things in the 4K world,
making it much more confusing than it needs to be.
But the key points that you really need to know
are actually pretty simple once you boil it all down.
On televisions, 4K is 3840 x 2160,
over 8 million pixels across the screen.
“4K” is itself a generous approximation
of the horizontal resolution of 3840.
The more appropriate term for 4K,
which you won’t likely see in any advertising, is 2160p.
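As a quick back-of-envelope check on those figures (just arithmetic, nothing shown in the episode itself):

```python
# "4K" UHD is 3840 x 2160: the "over 8 million pixels" figure, and exactly
# four times the pixels of a 1080p screen. The horizontal count of 3840 is
# what gets rounded up to "4K" in marketing.
width, height = 3840, 2160
print(width * height)                   # 8,294,400 pixels
print(width * height // (1920 * 1080))  # 4 -> four 1080p screens' worth
```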
These days it’s hard to hear about 4K without
also hearing about HDR, High Dynamic Range.
This is another confusing term because
it has also been used to describe various
other things that aren’t exactly the same
HDR that we’re talking about here.
For example, for years games have
used a technique called “HDR”
to dynamically adjust exposure
as scenes move from indoors to outdoors,
or you might find an “HDR” setting
on your phone’s camera,
which blends multiple exposures to
retain detail in highlights and shadows.
The type of HDR we’re talking about
is a different thing altogether,
and one that must be specifically implemented
by the screen and the content.
A TV that supports HDR can display
a greater range of color and luminance.
The screen must be capable of achieving
a certain level of actual maximum brightness, measured in “nits,”
to properly implement HDR.
HDR rendering doesn’t necessarily
make standard dynamic range look bad,
but it does give creators more control
over the correct luminance and color tones
represented throughout an image.
But be warned: many cheaper 4K TVs that
“support” HDR may indeed accept the signal,
but cannot output enough nits or achieve a high
enough contrast ratio against black pixels
to actually meet the qualifications for a proper
HDR implementation.
In addition, HDR requires dynamic backlight control,
so if HDR is important to you,
research what type of backlight is used,
and read reviews that evaluate its HDR performance.
We’ll talk more in-depth about HDR and how
to be sure it’s set up properly later in the video.
But the biggest mess of all is HDMI standards, cables,
and how no one selling HDMI stuff knows
how to communicate what you actually need.
Unlike analog video,
which can vary in quality due to interference,
digital video isn’t susceptible to the same issues
because the device displays exactly
what the source tells it to.
But at the end of the day,
HDMI cables are still just wires;
once the signal is in the wilderness
between the console and TV,
anything could happen,
and it might not make it to its destination in one piece.
When you’re dealing with 1080p,
sure, almost any cable should get the job done.
But 4K and HDR push HDMI cables to their absolute limits…
if you continue to believe the old adage that
“an HDMI cable is just an HDMI cable,”
you WILL have problems.
There’s a big difference between
HDMI cables and HDMI specifications.
The HDMI 2.0 specification allows
for 2160p at up to 60 frames per second,
along with HDR and all that good stuff.
Your HDMI cable is not an “HDMI 2.0 cable,”
but the devices you connect with it,
your PS4 Pro or your 4K TV for example,
THOSE are HDMI 2.0.
It’s what you plug the cable INTO,
not the cable itself.
If you ever see someone call a cable
“HDMI 2.0” or “HDMI 2.1”…
kindly correct them.
However, the cables do have specifications
of their own that define their bandwidth capacity.
Virtually all HDMI cables sold today
are called “High Speed” cables.
Rest assured that if you’ve been using
1080p with your game consoles,
you’ve already got “High Speed” cables,
not the extinct “Standard” HDMI cables.
4K and HDR also require High Speed cables…
but the problem is that not all
High Speed cables are created equal;
in fact, chances are that even if you’ve got
bins full of High Speed HDMI cables,
most of them probably aren’t up to the task.
At best, sub-par cables will give you “speckling”
in the picture and occasional video drops.
Or you just won’t see anything at all.
Many cables are advertised as supporting 4K,
but if you look closer,
you might see that they can only claim
working with 4K at 24Hz or 30Hz,
usable only for video content.
Game consoles,
regardless of a game’s framerate,
output at 60Hz,
so more bandwidth is needed.
Look past the messy titles and any claims
of HDMI 2.0 and HDR and 4K and all that.
There are only two key phrases that matter:
4K60, and 18 gigabits per second.
Many “High Speed” cables are actually
only rated to carry something like
10.2 gigabits per second.
If the cable can claim to successfully carry
18 gigabits per second from end to end,
then it should do the trick.
Longer cables are riskier,
even just 15 feet is pushing it.
If you want just a bit more assurance,
simply add “Premium Certified” to your search.
Premium Certified cables are guaranteed
to stand up to everything that HDMI 2.0 can do,
and they don’t even have to be
much more expensive. Trust us,
this can save you a lot of headaches.
For an extra long Xbox One X run,
we’ve had great success using two
of Monoprice’s 15ft Premium Certified cables
coupled with their Blackbird HDMI 2.0 repeater.
Even with 4K and HDR,
the signal doesn’t break down at all.
We’ve also tested the Cabernet Ultra Series 30ft cable,
which uses active circuitry to sustain longer runs,
but our tests yielded inconsistent results
when dealing with HDR at 2160p.
As of the time of this video,
a new category of HDMI cable,
Ultra High Speed,
is just barely starting to creep into the market.
These are expected to support anything
that the newest HDMI 2.1 spec or beyond
might throw at them (like 8K video),
carrying up to 48 Gigabits per second.
While these should be highly reliable cables,
they aren’t likely to be relevant for
gaming for years to come.
Even if a device supports HDMI 2.1
(such as the Xbox One X),
game consoles really don’t need more
than 18 Gigabits per second for the time being.
Alright… you got all that?
I promised that the basics aren’t all that complicated,
so here’s a quick review:
The HDMI 2.0 hardware spec allows
for 4K at 60Hz and HDR… 4K is 2160p…
HDR allows for higher color precision
and a greater range of luminance in
an image when properly supported…
and your HDMI cables must be rated to carry
18 Gigabits per second to get full functionality
from consoles like the PS4 Pro and Xbox One X.
But that’s just your starting point.
Are there any particular TV settings to be aware of?
What if you need to split the signal,
stream and capture?
Are there any special considerations
for earlier HDMI consoles,
and what about upscaling with
the Framemeister or OSSC?
Well, Try’s had quite an adventure getting
his own 4K setup up and running,
so I’m going to leave that to him.
– If you’ve been watching My Life in Gaming,
then you probably know that
we like to have all of our consoles,
both new and old,
connected and ready to play at any time…
whether for recording,
streaming, or just for fun.
When I added a 4K TV to the mix,
well, no doubt it opened up exciting possibilities,
but I wasn’t expecting how much more time,
and money,
I would have to spend to get
my setup running like I needed it.
The TV that you’ll see most featured
in this episode is LG’s 55 inch C7 OLED,
the 2017 model.
Yes, I know that there is a lot of concern
about burn-in potential on OLEDs,
but I talked to a few people educated
and experienced enough on the subject
to conclude that the risk was extremely minimal,
almost like you have to try to make it happen.
Scenes like the Xbox One or Switch dashboards
that feature some flat middle-gray tones
do have a tendency to show
some image retention (not burn-in)
if you let them linger a moment,
which is totally normal and is nothing to worry about.
From retro games to modern games
to extensive use as a PC monitor,
I’ve got no burn-in after nearly a year of use,
but I do try to vary up what’s on screen every now and then,
especially when I’m editing all day.
The beautiful colors and famously
perfect black levels of OLED
made it well worth saving up for.
But of course there are 4K TVs for virtually any budget;
our friend Drumble of backloggery.com
instead opted for a Vizio M50-D1,
which is excellent for gaming in its own right.
But don’t think of this as a review
of any particular TVs or brands,
but rather a look at what obstacles
and other considerations might
come into play when going 4K.
The TV was expensive enough,
so I thought I could save money in other ways,
like choosing relatively affordable
HDMI 2.0 switchers and splitters,
and settling for a soundbar instead
of a 4K-ready AV receiver.
Well, I can say from experience that
it’s a mistake making all of these disparate devices
(of potentially dubious manufacturing origin)
try to talk to each other when 4K and HDR
are already pushing the limits of what
you can pump through HDMI,
and when retro hardware can deal
in off-spec signals that don’t always
fit neatly into digital scenarios.
That’s not to say you can’t have success
with HDMI 2.0 switchers and splitters.
The switchers I initially tried could be
spotty with signal detection,
but for the most part were alright when used
to simply connect multiple consoles to the TV.
But when paired with a splitter,
they became much less reliable.
So if you’re not using a splitter,
cheaper switchers on Amazon may do the trick,
or Monoprice’s Blackbird series.
Remember to read the specs of any device
very very closely because many that claim 4K support,
including much of the Blackbird 4K Pro series,
are only good for 4K at 30Hz.
Always look for the key words before buying:
18 gigabits per second, and 4K60.
For me, the solution was to bring all those functions,
switching, splitting, and audio, into a single device.
An AV receiver.
Something I didn’t really have any interest
in investing in at the time,
but in the end it was so worth it to finally
have everything just work.
I ended up with the Denon AVR-X3400H.
But I did first have to
return a far more affordable receiver
that appeared to have everything I needed.
So I learned a few key things to watch out for.
When shopping for a TV and AV receiver,
check the manufacturer’s website to get
specifics on which HDR formats are supported.
For example, Dolby Vision is typically used for movies,
but PS4 and Xbox One use HDR10 for gameplay,
a license-free format that you’d think
would just be supported by default,
but no, not necessarily.
Note that on some receivers and some TVs,
HDR features may be limited only to certain inputs.
Now this isn’t necessarily 4K-specific,
but considering my previous AV receiver
was bought around the dawn of HDMI
and didn’t even properly support it,
I wasn’t aware that ports labeled “Monitor 2”
do NOT carry audio, only video,
making the output unsuitable
for sending to capture devices.
The feature you need to look for is “Zone 2.”
That lets you send both video AND audio
through a second output.
Also, if you want to use the OSSC,
your TV isn’t the only device
that needs to be friendly with it.
The Onkyo receiver I initially bought
didn’t work very well with it at all,
but the Denon receiver lets me do basically
anything I want with the OSSC.
If you’re new to the channel,
you might be confused about what we mean by “OSSC.”
It stands for Open Source Scan Converter
and in theory it’s just about the best thing
out there for playing analog consoles
on modern displays.
But what makes it so great is
also its biggest weakness:
it digitizes and multiplies the video input
and spits out the exact same refresh rate that it takes in,
which means if you have a low-lag TV or monitor,
then you can have a nearly
lag-free gameplay experience and it’ll look GREAT.
The problem is that these old consoles,
some of which might output
unwholesome refresh rates like 60.08Hz,
may do some loosey-goosey stuff
that analog TVs are A-OK with,
but conflict with the more rigidly-defined
worldviews that a digital display may hold.
Sadly, the OSSC was all but useless
with my older 1080p Samsung TV.
By contrast, the OSSC’s main competitor,
the soon-to-be-discontinued XRGB-Mini Framemeister,
is most useful because it outputs
a very normal signal that should work
with just about any TV or capture card,
but it does have its own shortcomings,
including about a frame and a half of lag,
along with some color and noise issues
associated with its analog to digital conversion.
Overall it’s still a great device,
but I was really looking forward to getting
more use out of the OSSC with my 4K TV.
In fact, it was thanks to reports from
our good friend John Linneman at Digital Foundry,
saying that the OSSC works great with
his 2016 LG OLED and that his
HDR10-equipped Denon receiver
plays nice with it too,
that I ultimately ended up with
newer versions of the same stuff myself.
It might seem a bit strange to most folks,
but good OSSC compatibility with various
systems was right at the top of my list of
priorities for my 4K setup,
and indeed I’ve thus far played more with the
OSSC on my OLED than I have actual high res
or HDR games on PS4 Pro and Xbox One X!
The OSSC also works great on Drumble’s Vizio,
and anecdotally, I’ve been hearing
more and more about 4K TVs tending to be
more tolerant of unusual signals than HDTVs
from even just a few years ago.
Both his TV and mine take SNES
and RGB-modded NES via the OSSC without a hitch,
and those are the signals that tend
to cause the most compatibility issues.
The OSSC has also advanced significantly
since the first time we featured it in 2016,
now offering stunning
4x and 5x output modes for 240p sources,
resulting in 960p and 1200p respectively.
Even if 1200p just barely takes advantage
of a 4K screen,
I still think it’s pretty fun to push
my retro consoles just a bit beyond 1080p.
If 5x is incompatible,
you can also try a 1080p variant
that cuts off some overscan.
Of course you can always use 3x
with scanlines if you need that PVM-like look.
If crisp pixels are your thing and your TV takes 960p,
you might also enjoy the 480p 2x mode.
The OSSC is technically a line multiplier
rather than a scaler,
which does give it a bit less flexibility
than something like the Framemeister.
For instance,
you can use the Framemeister’s zoom functions
so that stuff like Super Game Boy,
Game Boy Player,
and PSP video output can fill more
of the screen than they normally would.
But luckily,
I can do something similar with the OSSC
thanks to my LG TV’s “All Direction Zoom,”
which lets me fill quite a large portion of the
screen with these lower res portable games.
I’m not sure how common this feature is on
other brands or even other LG TVs,
but I think the result is as nice as the Framemeister’s.
Of course,
retro game emulation via your PC can also
be served very well by a native 4K resolution,
which is Drumble’s preferred way to play on his Vizio.
It goes without saying that pixel graphics look crazy sharp,
but he was a bit surprised by how well his
computer seemed to handle 4K emulation of
more advanced consoles too.
With the OSSC performing better
in my setup than the Framemeister
for 240p and 480p games,
the Framemeister’s quality deinterlacing
of 480i content had been the most
compelling reason to keep it hooked up.
But I have to admit,
with the OSSC’s 4x 480i mode,
I’ve been getting more and more used to the
so-called “bob deinterlacing” technique…
which is good,
because my trusty ol’ Framemeister has
actually been a bit of a thorn in my side ever
since making the move to 4K,
but the reason why is a bit strange.
Somewhere in your TV menu there should be
some options regarding “Deep Color.”
On my TV it’s buried weirdly deep,
not in Picture Settings,
but under General Settings,
way at the bottom.
I can activate Deep Color on all four HDMI inputs,
but note that some TVs may support
Deep Color only on one or two inputs.
To use HDR features on modern consoles,
Deep Color must be supported and activated on your TV.
The problem: if I do this,
the Framemeister gets seriously screwed up,
with a pinkish screen and no sound…
and we’ve gotten so many questions
about pink Framemeisters that I think
it’s a fairly widespread issue.
But if nothing is causing any incompatibilities for you,
just leave Deep Color on so that
HDR works when you need it.
– OK, so if you want to use HDR with a console
like a PS4 or Xbox One,
you have to activate Deep Color in your TV settings.
But what is Deep Color and what does it have
to do with HDR?
As you probably know,
each pixel on your TV is made up of blips of red,
green, and blue color, called subpixels.
Each subpixel is driven by its own color channel,
and as the light blends together,
it makes possible all the colors that your screen can resolve.
Most traditional digital displays are
driven by 8-bit color channels,
but don’t think of it like the limited colors
you’d see in an 8-bit video game.
8-bit color is actually pretty impressive,
with each channel being defined on a scale of 256 values,
making possible over 16 million colors!
However,
the real-world impact of an 8-bit color model
is limited by issues such as display quality and
the way an image has been processed,
rendered, or compressed.
Deep Color is simply an umbrella term
for digital color models that use
10, 12, or 16 bits per color channel.
Deep Color is a core requirement
for HDR specifications.
HDR10 uses 10-bit color for instance.
More colors may not seem so impressive by itself.
I mean, it’s been decades since color seemed
like a real limiting factor in video game hardware.
But there’s another layer to HDR
that is more easily caught by the human eye:
control over a larger range of luminance.
Keep in mind that we can’t present
this video to you in HDR,
so you’ll have to use a bit of imagination.
Even our cameras can’t see
the darkest shadows and brightest highlights
in a single exposure.
A medium exposure such as this
is missing detail in the columns and sky,
but you can actually see it in person.
In a direct feed clip like this,
the PS4 is outputting HDR,
but the image is tone-mapped by the
AVerMedia Live Gamer 4K capture card
so that it looks acceptable on a standard screen.
You’ll find a segment about recording 4K,
HDR, and the OSSC near the end of this episode.
A high-quality HDR implementation takes
advantage of the way our eyes actually
perceive light to make a more realistic scene.
Think of it this way:
a standard display shows white at maximum luminance.
But when an HDR game displays white,
usually it’s not at maximum luminance,
but rather something like a “paper white” value.
Parts of the scene that should be brighter
than paper white, well,
with HDR they can be.
It’s the ability to distinguish between
something that illuminates,
something that is illuminated,
and how that light bounces through the scene.
All PS4 and Xbox One systems
(except for the original Xbox One model)
are capable of HDR gameplay with up-to-date firmware,
so this is a feature you may be able
to take advantage of even if you don’t have
a PS4 Pro or Xbox One X.
Just keep in mind that each game
has to be specifically designed for HDR,
or, in the case of some older games,
have HDR features patched in after the fact.
Many games do support it,
but at this point it is still by no means a sure thing.
When choosing video output settings,
you might be wondering whether
to use RGB or compressed YUV modes.
RGB is “technically” the best;
the lack of compression is often denoted
by a 4:4:4 ratio,
but you shouldn’t get so hung up on that.
YUV is a category of color encoding formats
that require less bandwidth
to transmit compared to raw RGB,
and can be extremely high quality when used properly.
Component video would be
a familiar example from the analog realm.
While there does exist a lossless
4:4:4 YUV mode,
we’re currently mostly concerned with
YUV 4:2:0 and YUV 4:2:2.
These work by taking advantage of limitations
in how the human eye perceives light
and color by reducing the resolution
of certain color channels.
The negative effects of this compression
are often illustrated by subsampling artifacts
on very fine PC text,
but this is a fringe case…
console game graphics are
extremely well-suited to YUV encoding
and are unlikely to ever cause visible issues,
especially at higher resolutions.
YUV has no negative impact
on color vibrancy or the overall range of color.
In fact, HDR is impossible with
10-bit RGB at 4K60 on HDMI 2.0
because it would require more
than 18 Gigabits per second to transmit.
Even if you do choose RGB,
your system automatically switches
to a YUV mode to play HDR content anyway,
so the choice is effectively
only relevant for non-HDR games.
With HDMI 2.1 and Ultra High Speed cables,
compatible devices will be able to deliver
uncompressed color for those who demand it,
but in the here and now it’s better to think of
the 4:2:0 and 4:2:2 options as a
means of reducing bandwidth to make for
easier compatibility with certain screens,
switchers, and AV receivers,
rather than a choice in perceptible quality.
Video output settings are presented differently
between PS4 and Xbox One.
For demonstration purposes here,
we’re using a PS4 Pro and Xbox One X.
PS4 Pro offers two 2160p output modes,
RGB and YUV420.
As we’ve established,
this is not a choice you need to stress over.
RGB Range is irrelevant for YUV,
but if you do use RGB mode,
we recommend Limited for most scenarios;
more on that in a bit.
HDR and Deep Color Output must both
be set to Automatic to use HDR.
Check “Video Output Information”
to confirm that 60Hz is available for HDR
and non-HDR modes.
If HDR is not available,
then check your TV’s Deep Color setting,
make sure you’re using 18 Gigabit HDMI cables
with a compatible TV input,
and that all devices in your video chain
are at least HDMI 2.0.
Many PS4 games also have the option
to disable HDR rendering if for any reason
you don’t prefer it.
If you’re a fan of PlayStation VR
like our buddy Game Dave,
note that you’ll have to unplug
the Processor Unit in order to use HDR signals
unless you have the revised PSVR hardware
that supports HDR passthrough.
For Xbox One,
first go to the “Video Modes” page under “Advanced.”
Here you can make sure “Allow 4K” is enabled,
as well as HDR.
“Allow YCC 422” is a YUV option;
you can leave it unchecked
unless you’re having difficulty with 4K modes.
“Auto Low-Latency Mode” and “Variable Refresh Mode”
are future-proof HDMI 2.1 functions.
In theory, the former forces a TV into
its most gaming-friendly low-lag mode,
while Variable Refresh allows screens to not be
bound to a set refresh like 60Hz or 120Hz,
theoretically making games with
unstable frame rates appear smoother.
Note that 4K output is possible on S as well as X models,
but only X consoles are capable of rendering
gameplay at resolutions beyond 1080p.
Under “Video Fidelity,”
you can choose 8-bit, 10-bit,
or 12-bit color if your TV is compatible,
but regardless of your choice here,
10-bit HDR10 will activate when needed.
Color Space is for choosing what is normally
labeled as “Limited Range” or “Full Range,”
which we’ll talk about shortly,
just leave it on “Standard” for now.
From what we’ve heard,
HDR support in the PC realm is unfortunately
considered to be a bit of a mess at the moment,
although interestingly a few Dolby Vision games
do exist in the PC space.
Whether HDR is the game-changer
that it’s hyped up to be is up to you.
It’s important to know that not all HDR
implementations are created equal,
and in some games you might
even prefer the standard renderer.
I mean, HDR shouldn’t necessarily
be calling attention to itself and blowing
your mind at every second,
but in some games it does
feel like an afterthought.
Not every game needs HDR,
but many studios are getting better
and better at implementing it.
HDR gives creators more control
to make the image appear
just the way they want it to,
and that can only be a good thing moving forward.
(Adventure music)
– Of course,
not every game or system
you’ll want to play on your 4K TV supports HDR.
Not much should have to change between
HDR and non-HDR settings,
but you will probably need to turn
your TV’s backlight down a bit.
On my TV, this is marked as “OLED Light,”
which thankfully remembers that I like it
set to 100 for HDR and 75 for non-HDR.
That’s just what feels comfortable to me.
Otherwise it’s kind of painfully bright because
standard dynamic range just isn’t fine-tuned to
work with that kind of luminance range.
Gotta reel it in.
For HDR though,
you may as well let it use all the luminance your TV has.
Don’t confuse your backlight
with the “brightness” setting.
We recommend that you do not
touch brightness at all…
you may lose your TV’s true black level
if you bump it up even one notch.
If you have an Xbox One,
it conveniently has a very nice set
of calibration images,
and chances are,
settings that are good for Xbox One
are good for most other sources too.
The screens with the gray ramp and open
and closed eyes are especially useful.
If you can see the closed eye,
then your TV’s black level is too high.
Now hold up, that might sound counter-intuitive.
Aren’t I missing detail if I don’t see the eye?
Don’t I want to see the top and bottom steps
on the gray ramp?
Well, no, actually.
While those details may exist in the signal,
they are outside the range that is intended to be seen.
If I set the TV’s black level to “High”
so that I can see the eye,
then when a game shows a screen
that I know should be perfectly black,
we aren’t getting the black levels
that we should see here.
This is caused by a mismatch of Limited Range
versus Full Range settings between the console and TV.
Of course this isn’t a concern specific to 4K sets,
but your new TV may handle these differently
than your old HDTV or computer monitor,
so it’s important to know how to set this correctly.
Full Range on an 8-bit digital RGB signal
defines each color channel with a scale of 0 to 255.
Limited Range is a standard
that developed separately,
mainly intended for TV use,
which in comparison is represented
by a range of 16 to 235,
where anything below 16 is intended to be pure black,
while anything above 235 should be pure white.
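For reference, the mapping between the two ranges is just a linear scale; here’s a small sketch of the conversion:

```python
# 8-bit full range (0..255) vs. limited/"video" range (16..235).
def full_to_limited(v):
    return round(16 + v * 219 / 255)   # 219 = 235 - 16

def limited_to_full(v):
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
# As long as console and TV agree on the range, black stays black and white
# stays white; a mismatch is what crushes shadows or washes out the picture.
```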
This makes it seem like Limited Range
is missing 30-some steps of color per channel,
which definitely sounds like a bad thing.
But here’s the reality: your life will be way easier
and you won’t be missing a darn thing
if you just set all of your systems
to LIMITED Range whenever possible.
Sounds completely mathematically wrong, right?
But you don’t have to take my word for it…
here’s Digital Foundry’s John Linneman to explain.
– OK, it’s true,
if you drill down into the numbers,
full RGB slightly increases the number
of steps between full black and white.
However, in practice,
this simply isn’t something you’re going to notice.
Provided your display and console
use matching settings,
the two should look identical by eye,
and Limited Range will not increase
any visible color banding,
while an adjusted gamma curve
keeps color tones matched between Full Range
and Limited Range.
If you’re seeing any differences,
such as increased saturation,
then you’ve simply made an error in configuring
either your display or the console.
The reality is,
sticking with RGB limited across your setup
can prove easier in the long run without
negatively impacting image quality.
Why is that, you ask?
Well, compatibility.
Aside from the fact that video content,
such as Netflix or Blu-rays,
is not designed for use with full RGB in the first place,
many game consoles don’t properly
support this setting either.
When using any modern game systems on your TV,
sticking with limited ensures
that you’ll be seeing content correctly.
Beyond that, as Coury previously mentioned,
if you’re using HDR,
then the output switches over
to YUV anyways where this no longer even applies.
If you prefer to stick with RGB full,
that’s certainly fine,
but you won’t gain anything tangible
and increase the likelihood of running into
configuration issues with certain consoles
and types of content.
So… that’s why if your TV has a setting
called “Black Level” or an equivalent option,
you need to pay it special mind depending
on whether you’re using RGB Limited Range,
RGB Full Range, or YUV.
If your system is set to “Full,”
your Black Level should be “High”;
if your system is set to “Limited” or YUV,
you should bring your Black Level
back down to “Low.”
Avoid automatic range settings because
then you can’t be sure what black level to use.
You might find it helpful to know that
some versions of the 240p Test Suite
have both Full and Limited patterns
that can aid you in calibration.
Note that the PS3 does have a Deep Color option;
we aren’t aware of any tangible in-game benefits,
but it doesn’t hurt if your TV properly supports Deep Color.
The “Super-White” option is only relevant
for certain types of video content.
Microsoft has a habit of calling some of these
settings something a bit different from the norm.
On the Xbox 360,
under “Reference Levels,”
“Standard” is Limited Range,
“Expanded” is Full Range,
while “Intermediate”…
I don’t even know why that exists.
Xbox One again says “Standard” for Limited,
which, hey, they recommend,
while “PC RGB” is Full Range.
If your PC color doesn’t seem quite right,
check what kinds of options you’ve got in your
graphics card’s dedicated control panel and
make sure you’ve set your TV settings
on that input to match.
A few notable cases where you don’t have a choice…
the Wii U is Limited Range only,
while many retro-focused devices,
such as the OSSC
or much of Kevtris’s work,
tend to be geared towards Full Range output.
I’ve also run into situations where
Limited Range or Full Range may not
pass correctly through a cheap switch box…
so that’s one area to troubleshoot if you’re
having black level frustrations.
At one point
my TV’s Auto Black Level function
did seem to mostly work just fine…
but after a firmware update,
it kind of broke…
which is something else to be aware of.
Whenever a firmware update is released for your TV,
investigate what people are saying about it online
before you decide to install it.
It’s probably fine most of the time;
oftentimes firmware updates
could even improve input lag!
Or… it could make input lag worse.
Or it could even do something obscure,
like break an OSSC mode that was previously working.
These are things I’ve heard about
happening on one TV or another…
so you just never know.
One of the coolest things about 4K
is how 2160p is a multiple of nearly every
common resolution from video game history…
240p, 720p, 1080p…
480 is the only big one left out.
Now mind you that this is no guarantee
that every 4K TV
will scale these resolutions beautifully,
but in theory it may help.
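The scale factors are easy to check for yourself:

```python
# How cleanly common game resolutions divide into 2160 lines.
for lines in (240, 480, 720, 1080):
    print(f"{lines}p x {2160 / lines:g} = 2160")
# 240p x 9, 720p x 3, and 1080p x 2 are clean integer scales;
# 480 lands on an awkward 4.5x, which is why it's the one left out.
```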
You’re unlikely to find a “nearest neighbor”
upscaling feature,
and in some cases,
sharpening may be forced even when just upscaling 1080p.
Overall I’m really happy
with how my LG OLED handles HD resolutions,
I see no obvious scaling artifacts
from a normal viewing distance,
and systems like the Switch that can’t
output 4K still generally look quite good,
although admittedly stuff like compression
artifacts in pre-rendered videos may not be
masked as well as on a 1080p screen.
But honestly,
I rarely find myself thinking a game
looks poor just because it’s 1080p on a 4K screen.
Artificial scanlines are generally considered
to look better at 720p than 1080p,
so perhaps the best use of a clean
720p-to-4K upscale is retro-focused hardware
with optional scanline generation.
Try setting your system to 720p
if you’re a fan of that look.
So what if you set, say,
your Xbox 360 or Wii U to 720p
so that native 1280×720 games
aren’t upscaled to 1080p by the system
before reaching your 4K TV?
Well, for these systems,
on my TV at least,
I’m not seeing much of a visible difference.
PS3 usually switches to 720p for 720p titles,
but some games may use software scaling
to output 720p as 1080p.
In this case,
Final Fantasy XIII appears just a bit crisper
when letting my TV handle
the full 720p to 2160p upscale.
PlayStation TV is an interesting case
because the Vita’s native resolution
is considered 1/4 HD,
more or less a 2x scale to 1080p,
but no 1080p output option is available.
Because of this,
depending on how well your 4K TV handles 1080i,
you may actually want to consider that over 720p.
One situation where you must set
your system resolution
to match the game resolution is with
the mCable Gaming Edition by Marseille.
This is an HDMI cable with active circuitry
that claims to improve game image quality
with its own layer of post-process anti-aliasing
with no perceptible lag.
It does actually have some merit…
but for a huge chunk of Xbox 360,
PS3, and Wii U games,
you’ll have to set your system to 720p.
The mCable also works with 1080p and 480p
content if the system output matches.
But frankly, Marseille’s marketing is
pretty shady, creating some confusion
over whether 4K is supported.
The mCable can only pass 4K60 signals
in a compressed 4:2:0 format with no processing.
The color shift and lack of control overall
make the mCable something we don’t prefer,
even if the results can be interesting.
Since we’ve spent so much time talking
about games from the 720p generation,
we can’t not give a shoutout to Microsoft’s
truly commendable backwards compatibility on Xbox One.
The ever-growing list of supported Xbox 360
games is much more comprehensive than
I think anyone should’ve ever expected it to be.
Of course it is emulation,
but from what I’ve seen, flaws are rare,
and I do understand that there’s just a bit of
360 DNA in the silicon that they tap into
to make it so accurate.
But the power of Xbox One
at the very least allows for improved performance
in almost every instance.
But on top of that,
a slowly growing number of Xbox 360 games
are actually rendered in 4K on Xbox One X.
Many 2D elements may appear softer
than the rest of the image,
but the overall impression is stunning.
I noticed that a light shaft effect
in Gears of War 3 renders improperly in 4K,
but that’s the only slight blemish
that I’ve managed to spot.
What’s even wilder is that,
owing to 10-bit color information
in the original game code,
a few games like Mirror’s Edge
are actually presented with a basic
HDR implementation.
Of course, original Xbox games look fantastic too,
and are presented at a higher resolution
on all Xbox consoles,
typically at or around 1080p on earlier
Xbox One consoles,
and approaching 4K on Xbox One X.
Original Xbox support is a much more recent development,
and the game rollout has been
disappointingly slow compared to Xbox 360,
but a number of key titles are already available,
and hopefully the overall list will look much
more impressive over the coming years.
So… all of these different signals…
from 480p to 4K and all the weird
stuff that the OSSC can output,
how exactly are we capturing this stuff?
If you’re just looking for something
to record the OSSC,
Coury is now using the StarTech PEXHDCAP60L.
It’s not 4K, but it does take the OSSC at up to 5x,
although when it comes to NES and SNES,
he needs to set it to 4x to hold a stable signal.
Otherwise, all direct feed clips in this episode
were recorded via
my Denon receiver’s Zone 2 output
to one of two capture cards.
My first solution for recording both 4K
and the OSSC was the
Magewell Pro Capture HDMI 4K Plus card.
The price kinda hurt,
but Magewell does have a really good
reputation for accepting oddball signals.
All of the critical consoles and resolutions
that the OSSC can output,
even Super Nintendo at 5x,
the Magewell card does it all without a hitch.
But be aware that recording 4K is no joke,
no matter how good your CPU may be.
OBS’s software x264 encoder will almost
definitely result in constant dropped frames,
so consider using the Nvidia hardware encoder instead.
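Part of why it’s so demanding is simple pixel throughput; a quick comparison against 1080p60 (raw pixel rates only, before any encoder-specific costs):

```python
# Raw pixels per second the encoder has to chew through.
def mpix_per_sec(w, h, fps):
    return w * h * fps / 1e6

print(mpix_per_sec(1920, 1080, 60))   # ~124 megapixels/s for 1080p60
print(mpix_per_sec(3840, 2160, 60))   # ~498 megapixels/s for 2160p60: 4x the work
```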
During production of this episode,
I became aware of AVerMedia’s new
Live Gamer 4K via an in-depth review by Epos Vox,
and it’s a big deal because AVerMedia
is first to market with a consumer-grade
HDR capture card.
Luckily, AVerMedia was able to send us a
review unit in time to include in this episode.
So yes, the Live Gamer 4K accepts
2160p signals at 60Hz with HDR.
While you can record HDR files
directly with the RECentral software,
editing in HDR would require additional tools
and workflow changes that we just aren’t
prepared to handle at this time…
but that’s OK because the
Live Gamer 4K tone-maps HDR signals
on the fly so that the image looks good
in recordings or on streams,
while you get to enjoy full proper HDR on your TV.
The tone-mapped image isn’t necessarily
as good as if you’d just played with HDR off,
but I suspect that most of the time you
wouldn’t know the difference without a direct comparison.
But look, HDR shmachdeearr.
Listen, people.
This card takes SNES 5x via the OSSC.
Now that is a big freakin’ deal.
In the past,
brands like AVerMedia and Elgato have not had
good track records with OSSC support,
but here we are,
960p, 1200p, RGB modded NES,
PC Engine, VGA Dreamcast, GameCube…
they all work and I haven’t noticed any drops yet.
It’s still early days for me with the Live Gamer 4K,
and I’m not ready to remove
the Magewell card just yet,
but right now it feels like it could become
my all-in-one capture card for
everything retro on up to Xbox One X,
and I strongly suspect that it will be
a key piece of our production equipment
for years to come.
– Of course, 4K will continue to evolve,
especially as the HDMI 2.1 spec takes hold
and features like variable refresh become
more widely supported.
And who knows what the next generation
of consoles might bring?
No doubt the creative hardware designers
in the community who love vintage hardware
will also be sure to keep us busy with new ways
to connect old games to new screens.
There’s always going to be something
new just over the horizon,
so it can be a bit tough to know when
to jump onboard,
you just have to get what you can at some point.
After all, we’re undeniably somewhere
in the realm of diminishing returns by this point.
The final image quality comes
down to more than just raw pixel count:
filtering and anti-aliasing quality,
HDR implementation, scaling techniques,
but the 4K canvas gives creators so much
to work with that it’s hard to imagine that our
screens will be limiting their art anytime soon.
(Closed captioning by Jordan “Link584”)

100 thoughts on “Modern & Retro Gaming on 4K TVs :: RGB106 / MY LIFE IN GAMING”

  1. Hey I have a question if anyone can help. Looking to hook up my ps2 to my ROG SWIFT PG348Q Monitor. Thinking of getting a framemeister as i will use for other retro consoles as well..

    When i used a $30 porta upscaler (component to hdmi) it worked but it was very stretched and flickered alot on the monitor. Then when I went back to my pc (displayport) there was a flickering line near the top of screen for about 45 minutes then it went away… Does anyone know if this will still happen when i use framemeister? and what caused this? Please let me know if anyone can help. Thanks!

  2. I suspect you will likely need a capture card suitable max of 1080i and 1080p with dips into hdmi 1.2-1.4 standard you might have hdcp handshake issues with hdmi 2.0 devices with hdmi 1.2-1.4 devices not everything is backward and forwards compatible with hdmi2.x
    you can use the highest hdmi cable you want though the ram limiter what it is in consoles you will need add either an external caching ssd or use a phone or a tablet to improve processing power for all 4k/4lhdr content..

    even with 16GB cards being available on the pc world game may not be enough in processing 4k/4khdr content and trying to upscale something 720×576 may cause image degradation

    Not including adaptions and proper splitter boxes,

    if you were to sink each console on every platform..

    playstation would cater for 4-5 consoles
    xbox caters 5 revisions
    nintendo 5 revisions
    9 revisions for sega

    AT THE MOMENT

    ONLY 4-5 consoles support hdmi as a connection in xbox
    3 in playstation
    2-3 in nintendo

    Before someone comments hdmi converter dongles and cables for the original xbox and ps2/ps1 and original 360 without hdmi output..

    yes i know about the pound cable solution however i don't have a clue if it is compatible for hdmi 2.0/2.1

    I do know both ps4 pro and xbox 1x autodetect if you have a 4k and 4khdr screen and will update your games automatically..

    you might get some loose support on nintendos wiiu and switch hardware for 1080 support though they would have to match or pass the current specs of the ps4/xbox 1 x if they want to farm games within the 4k/4khdr sphere,

    8k in next gen consoles is a novelty at this time as you are looking 65 inch and to support the format if it dwindle down to lower screen size i don't think it will venture past 55inch which is the happy medium where 4khdr resides everything smaller may look great for the 1st few hours of gameplay if you are above the age of 40 i don't recommend lower screen sizes for 4k material

    Yes it looks great as a sales gimmick on a 32-40 inch monitor though may lead excess eyestrain for the 4k format…

    at the moment unless you have a rich relative or you the likes of samsung/sony/JVC/hitachi or any other display builder wanting a review their monitor for free to keep for a year..

    the likelihood you owning an 8k display as they are likely to within 6-7.5 USD as the retail here for 8-8.5 k AUD here and they start 65 inch and up..

    sad to say i think we are at the limits of what both vga/dvi/hdmi can do in terms of data transport i pretty much think even with usb-c display port or even firewire I think we are at the limits of what a copper interface can provide in terms of raw speed..

  3. 4K is garbage for gaming, and ALWAYS WILL BE. Graphical advancement is not going to stop in our lifetimes, and visual effect quality/complexity AND high framerates (120-240FPS+) are always going to be more beneficial to an immersive gaming experience than merely increasing resolution beyond what the human eye is even capable of discerning from the distance you should be from your display.

    For FILM, of course, the higher the resolution the better, sure it takes up a ton of space to store the file, but in most cases we are talking a blue-ray disc that doesn't really cost extra to fill up vs having room left over, on top of the ever-increasing internet speed and bandwidth for streaming options. If film started embracing high framerate standards, even that would still be more versatile than gaming because in almost all cases you would not be sacrificing anything critical to achieve that, you'd just be buying a 4K+ TV that's capable of 120hz playback which already exist in abundance, you wouldn't have to "turn the graphics down" to maintain the framerate because everything is pre-rendered.

    1080P is not an optimal stopping point, I'll give you that, 1440P is as close to the perfect balance as we're going to get. 77% more intensive is far more manageable than the 300% increase of 4K. Lets say 100hz is a as low as I would consider impressively playable. That would mean you have to push you game at ~180FPS at 1080P to consider moving to 1440P. With a high-end GPU that is actually feasible in the newest games, but honestly only because the consoles AND the focus on 4K has been holding graphical advancement back, otherwise games would be a lot more intensive at lower resolutions. Contrast that with being expected to hit ~400 FPS at 1080P to justify moving to 4K, and it is unlikely that you won't be making gigantic scarifies to achieve that responsive 100Hz experience, all for what is essentially just less blur when you stick your face to the screen in a way that you wouldn't normally, suffering greatly diminished lighting/shadows/textures/LOD/etc. And yes I know framerate doesn't always scale linearly with resolution, but it's not necessarily untrue either as some games are heavily GPU-bound in that way.

    The big shame, is that if you also want to console game, the 1440P display isn't optimal for that, unless you find one with a 1080p picture-in-picture mode that works very well. If you're serious about it, you basically need a high refresh adaptive-sync 1440P display, and a 4K display that can still cleanly scale up from 1080 when a console game has a performance mode.

  4. This video demonstrates both the extremely high level of polish and production quality of this channel, while simultaneously being a perfect distillation of how much of a shameful mess the implementation of these standards has been

  5. 4:15 – At the end of the day? Why just at the end of the day? What about the beginning of the day, or the middle of the day?

  6. I personally use the HDMI cable that came with the PS4 Pro that I totally wasn't expecting and it's done well with the 4k and HDR experience.

  7. Stereoscopic 3d trumps resolution every time. I use my 3d monitor 99% of the time. Dolphin emulator in 3d is so badass.. I hardly ever use my 4k TV. Almost useless to me.

  8. I have HDR on an edge-lit LED and can confirm it's absolutely useless. Don't worry about HDR unless you are getting OLED!

  9. How much $$ to just come over to my place and set this up? I'm not sure I "captured" all of this info. Great work!

  10. I have a 36 inch Sony CRT and a 32 inch JVC CRT. I don't have plans to use LCD or HDMI for some time to come. It seems expensive too.

  11. About the stuff you said about hdmi cables and resolution:
    I mean that's all true and all but, how is it possible to buy a cable for less than 3 bucks, that have 4K/60 compatibility, while feeling very slim and flimsy? 😐
    https://www.amazon.de/gp/aw/d/B001JIQS8U/ref=ppx_yo_mob_b_inactive_ship_o0_img?ie=UTF8&psc=1

  12. 8K will be the holy grail of retro gaming, handling every resolution ever with pixel perfect scaling.

  13. Do you guys really still like playing videogames?
    I only really play them when I'm drunk or high, too many things on my head

  14. And all of the games are laggy, choppy messes. Appropriate since most of these graphically intense games are just one long cutscene

  15. 1080p is good enough though. And they need to find a way to make lower resolutions look more realistic in games. 480i recordings of actual people look real because they are real people being recorded. Do the same thing with video game graphics.

  16. First off, great video guys. Learned a ton about 4k that I never knew.
    Secondly I purchased the same Denon receiver as Try and just hooked it up. Bought it because it is supposed to play nice with the OSSC which is a huge deal to me. However when I turn the OSSC to 480i passthru mode while its running through the receiver it massively zooms in on the picture and has large black alternating lines. This does not occur when I'm running the OSSC directly into my TV. When I turn the OSSC to 480i 4x (bob) while running through the receiver it returns to a normal picture.
    Any Advice on how to correct this strange occurrence would be most appreciated as I would really like to play in 480i passthru for the picture quality while its running into the receiver.
    If it helps my setup is a 70001 PS2 slim > HD Retrovision cable > OSSC > Denon receiver > Samsung TV.

  17. I picked up the DVDO Edge, it has a De-interlacing technology better than the DVDO ABT-102 card used with the dVdo vp30-50. Personally when using the ABT-102 on a 480p BVM monitor showed a streamless picture I've never seen with any other device. I can't explain it, it was perfect framerate.

  18. i can attest to the fact about HDR on a cheaper 4K Tv , my LG was really good value , 50" & only £400 , good picture & Os reasonable connectivity 3X HDMI & HDR , but the HDR is not good , everything i have tried is horribly blurry , looks like i just turned my contrast down to 0 , no joking !! so if HDR is your thing , i would spend more cash , for me its not really a thing thankfully , but its a little annoying when there is no option in a certain game to turn off HDR (DMC 5 being one of them) & its no use to just turn off Deep color as on PC if it senses a HDR input it will automatically turn it on so keep this in mind (at leased on my LG) even if you have it disabled in windows too , its a pain in the neck believe me , but thankfully not many games Force HDR only DMC 5 from what i have seen & 1 Moto GP game back along

  19. The [email protected] was legit informative and eye-opening, I was trying to figure out why my PC would drop video to my TV occasionally. I thought the TV was being fucky. Turns out the cable is trash.

  20. what would you guys siggest me. i want to play wii should i use a component cable or the wii hdmi adapter? i have an 40 inch lcd sony tv and a 65 inch 4k hdr sony oled which tv would be better for wii?

  21. Should we hold HDMI selector boxes to the same standards as HDMI cables? As in, is it important that they also transfer data at 18gbs?

  22. OLED displays will experience burn in with time. My Oneplus 5 has an OLED screen. The toolbar at the top of the screen has mild burn in after a couple years of use.

  23. After watching several of your videos this week I have been thinking about how to get my old systems to look a little better for me. It occurs to me that it might be beneficial to look into 480i projectors. I agree that gaming on CRT's looks way better than an HDTV. I just played through Ace Combat 4 again for old time sake on my PS2. I started on an old 720p tv, then I moved over to an older CRT I luckily still have. It's big and heavy though. I don't imagine I'll keep it long. What of a standard def lightweight projector though?

  24. So what you guys are saying is to get the OSSC to convert the Classic Xbox component to HDMI correct? Been in the market for a proper component to HDMI converter for the Xbox, the HD Link cable by pound is trash sadly.

    Is there any decent alternatives that can get one by till they can afford the OSSC? Maybe you guys should dedicate an episode to Component to HDMI converters on amazon to help narrow them down and help people like me find the right alternative.

  25. about the live gamer 4k, is resolution changes from the ossc tolerable? my elgato hd60s is so bad with it, the screen blacks out and i can't get it to display anything unless i unplug and plug it back in. drives me crazy

  26. For the newer 4K consoles I just hook those 2 direct to the tv, since the LG OLED has 4 HDMI ports. The other two ports I have splitters on and hook my emulation PC and all my other hdmi consoles through those splitters. The only non HDMI console I still own is the original Xbox. Only because I have some old Ninja Gaiden Black mission mode scores that took many hundreds of hours back then. I have a converter hooked through my avermedia device to record and display it through HDMI. I may have also had to run the cable through a switcher to get the avermedia to allow it to record, but unsure if Im mistaking that with PS3 recording.

  27. The LG 65 inch C8 and C7 have been amazing for most content. I did have some bad grey banding that got worse on C7 and ended up with a C8 as a replacement. The issue is MINIMAL on this set and I never see it in regular content. It was a big problem it seemed before but I think its been being improved on. Its still a gamble on what set your going to get with any tv tech.

    360 games especially look pretty grainy and PS3 as well. Switch looks surprisingly good most of the time. It must be higher res then 360 and ps3 most of the time. I still think ps3 and 360 looked better on my plasma panasonic VT at 1080 p, though the colors pop more on this set and it would be hard to go back to that resolution after getting used to the clarity on this set.

    For newer consoles at 4k the games look incredibly good. My emulation dedicated pc, most games look amazing , especially with graphic filters for the ps2 and ps1 games. They all look remastered for the most part. So with emulation and the current 4K games, they look awesome mostly. Wii U doesnt look too bad either for some reason. 360 is the worst offender so far. But if you dont mind the extra pixelation for xbox 360 its not too bad, just not as clean as it was on my plasma. Its a very small sacrifice for the overall massive visual improvements. No burn in at all and I play ALOT of old games as well as new and everything inbetween, shown on the channel.

    Luckily the 360 (and possibly ps3 games with ps5 upcoming) has some great games upscaled on the Xbox one X. Hoping to see the continued. More original Xbox games would be great to see too. Those are extremely impressive on the TV. So there are ways around it. When it comes to games not enhanced on Xbox One X , pixelation will vary. Some of the best games are enhanced though and for the others Ive accepted that since I dont play that previous console gen as heavily as the current or even really old games on emulation (Wii and below). Emulation often has graphic filters or the games just look cleaner at default. My emulation PC only has a 1080 p video card but it looks great since sharpening those visuals on old consoles can only go so far on a 4k display.

  28. Entry Level 4k is dog shit compared to even decent 1080p screens. Give it a few more years and we'll start seeing affordable 120+ hz with decent brightness and maybe even OLED/MicroLED monitors.

  29. Once I went 4K I find not acceptable to go back to 1080p. And I said that about 720p TV as compared to 1080p that once you go 1080p from 720p you don't go back. I have a 4K LG monitor and I game in windowed 1440p on high to ultra settings and the visuals are crisp and gorgeous ^^. And in the future I will say once you go 8K no going back to 4K. And with Display Port 2.0 being able to do 16K I'll say the same thing about 8K in the distant future. DisplayPort 2.0 can do a billion colors, 16K resolution, and is in the Thunderbolt 3 form factor. With more pixel density the better the experience. I can do 1080p games and video in window on my 4K monitor and the windowed 1080p looks crisp and sharp due to the larger pixel count of the screen being 4K rather than smaller 1080p.

  30. Want a long HDMI cable for 2160p 60fps? Get a HDMI fibrewire cable. My PC and PS4 Pro are in an other room. Sure I needed two usb active extension cables for mouse and keyboard, But that's it. Just a big 4K TV on the wall. Switching did not work with any switch so I needed two cables.

  31. This video singehandedly inspired me to upgrade to 4k and get into creating a home theater. I can't thank you guys enough for making this.

  32. I know someone (personally) that has his retro consoles hooked up to a really big really good crt. Instead, my dream retro gaming setup would be the biggest flatscreen tv available with the biggest resolution available. As for the consoles I would have all the Analogue available at the time with all the consoles that haven’t got the Analogue revision yet I would use the original console with the colour the console is known for / the most common colour(, non of that limited edition shit.) with a Hyperkin HDMI adapter for each of them. To put the consoles I would have a big Ikea kallax shelf. It is basically a grid of 33 cm by 33 cm spaces with the biggest version being a grid of 5*5. Then I would have room for 25 consoles. (I don’t know if that would be enough but we will see.)

  33. Great video. I already own 4K since 2014 and its cost me thousands to keep updating a pc to sustain the resolution. but I still watched the entire thing, very informative

  34. since CRTs are heavy, bulky, use up lots of power, and are also kinda unreliable due to the capacitors, flybacks, etc. LCDs are a good option but adapters and converters and a decent LCD is recommended.

  35. First off, you are making incredible videos guys!! thank you so much! i am watching every episode! 🙂

    My Name is Goran and im from Croatia but i live in germany. I have a Question about my Setup and it would be very nice if someone can help me please. I have an OSSC 1.6, an LG OLED C7 in 55 inch and a Denon x2200w Receiver. now when i hook up my PS2 Slim via original PS3 component cable directly to the OSSC and then to my C7 then i have in passthrough mode a very poor picture. i know the ossc is not that good in deinterlacing. also the picture is grainy with less detail at silent hill 2. but when i hook up the ps2 directly to my denon receiver and select at say upscale to 720p or 4K then the picture is more detailed and less grainy. would that mean that the denon receiver upscales better than the C7 oled?? or what does this mean. is it recommended to hook up the ps2 to the receiver rather then the OSSC?! thank your very much guys!

  36. no matter what filter or effect they use, retro games are no meant to look nice on modern display. They look pixelated or blurry.

  37. Will enabling Game mode cause the same effect as disabling all the unnecessary gunk in the picture settings? When I have PC mode enabled on my LG TV most of the picture options are grayed out (but shown as "off"), and for some reason it forces me to use a Wide color gamut that oversaturates reds and greens (Standard color gamut feels more natural).

    I tried to compare input lag between PC mode and non-PC mode (with all gunk off), and honestly I could tell no difference. I looked at a few forums and couldn't really find a decisive answer if PC mode produced less input lag (if ever so slightly).

    Also, do you have any knowledge of what the color gamut actually does? My TV is around 10 years old so it's unlikely it has something to do with HDR.

  38. How do i find someone like these guys to come to my house and set me up, im being serious, this is all SO confusing

  39. The OSSC's scanlines look terrible, this should be possible to do much better on 4K…they need to put one single aliased line between the actual image line and the scanline.

  40. I really don't see the need to switch to 4k. 1080p looks great. The difference is not enough to justify the switch.

  41. Someone needs a compilation of their intros they are very well scripted I love them always gives me chills I say to myself I would have never thought about some things to put into an opening

  42. I told myself 2K would be where I am stopping. Many of the DVDs released today look comparable to BluRay. The technology is just going to keep advancing and advancing. BluRay hasnt even been out that long. VHS was around for at least 20 years before DVD showed up. It sucks having to constantly upgrade

  43. Does the OSSC fix the problem that the Framemesiter has while playing games that switch from 240p to 480i for menu screens? It makes it frustrating playing Castlevania on my Sega Saturn and having a long pause while it switches between looking at the map and then switching back to gameplay.

  44. Hello! Can anyone help me please or test something.
    I have an OSSC connected to my LG C7 55 inch.
    When i play PAL PS1 Games in 3x mode then i can use the TVs vertical Zoom Function to fill the empty Vertical space. The horizontal stays the same. But if i play on my ps2 on passthrough mode with 576i and use the vertical zoom function then it stretches the image horizontally?! Why is that the case? Regards

  45. So is an OSSC utterly essential, if I want to display retro games like Gamecube, N64, or Dreamcast in their intended respective visual fidelities, while playing them on something like a 4K tv, let alone an HDTV?

  46. THAT IS NOT BACKWARDS COMPATIBILITY 🙁 they just sell us the same game over and over. Ps2 had true backwards compatibility

  47. Q: could I get a direct link to where you got the double-sided pads or tape you used for mounting your Wii sensor bar. I have no idea which one I should buy. I too have a OLED panel and there is no thick base to sit the bar on!

  48. Many Made in China cables are problematic, if not always, occasionally glitching, because they barely meet the standard.
    Costs more, but name brand cables are better. A good sign is when the cables are gold plated. Beware of fake name brand cables from China. About a third of my Amazon purchases turn out to be fake Chinese products.
