
Support RGB colorspaces and explicit tone mapping#126

Draft
ns6089 wants to merge 2 commits into moonlight-stream:master from ns6089:csc_ext

Conversation

@ns6089 (Contributor) commented Feb 18, 2026

It's here as a placeholder for the most part, but comments are always appreciated.
There's no point in merging this in any form without adding application support at the same time,
and it will take me quite a bit of time to get to that personally.


To control the bit depth and colorspace of the video stream, we presently have the following tools in master:

  1. SDP x-nv-video[0].encoderCscMode >> 1
    • COLORSPACE_REC_601
    • COLORSPACE_REC_709
    • COLORSPACE_REC_2020
  2. SDP x-nv-video[0].encoderCscMode & 1
    • COLOR_RANGE_LIMITED
    • COLOR_RANGE_FULL
  3. SDP x-nv-video[0].dynamicRangeMode
    • 0
    • 1
  4. URL /launch?hdrMode=
    • 0
    • 1

Although 3 and 4 seem to always be used together by moonlight-qt.

I want to address two issues:

  1. Add support for RGB 4:4:4 encoding, which bypasses banding artifacts that stem from rounding inaccuracies in RGB->YUV->RGB conversion
  2. Support requesting tone mapped HDR->SDR 10-bit video, since this is currently impossible due to protocol ambiguity

Comment thread: src/Limelight.h (Outdated)
// If not set and selected format is 10-bit, HDR content will be encoded as is.
// The reason behind this field is the ambiguity of 'supportedVideoFormats',
// as it's impossible to request 10-bit tone mapped content through that field alone.
int toneMapHDR;
@ns6089 (Contributor, Author)
Not sure which one is better, pushing like this or pulling through renderer capabilities.

@andygrundman (Contributor)

Love this, I want to add tone-mapping to Xbox and would be happy to work on testing any protocol changes. I am having a tough time thinking about what the right user options should be for something like this though.

@ns6089 (Contributor, Author) commented Mar 4, 2026

I'm lagging behind on the server side of things, unfortunately, so there's nothing to test at the moment. But in my opinion, in our architecture tone mapping belongs on the server. If we delegate it to the client, we'll be wasting bitrate and introducing a bit (pun unintended) of additional banding for no reason. Inverse tone mapping (SDR to HDR) is a different matter, but it can get very complicated and computationally intensive, so it probably makes more sense to put the server into HDR mode and use Windows Auto HDR or Special K for older games.

As for user options, this depends on whether the client controls the HDR state of its displays or not. Most clients probably don't, but I don't know if this holds true for Xbox. And adding options for things we don't control makes no sense to me. If the client has an HDR display, request HDR; if it has an SDR display, request SDR; and if moonlight is moved from an HDR display to an SDR one or vice versa, notify the server through a control message (which will need a protocol extension). This is my take at least.

@ns6089 (Contributor, Author) commented Mar 6, 2026

@andygrundman I've updated the PR. Server-side tone mapping is now controlled by the renderer capabilities mask, in an opt-in manner, so if you add client-side tone mapping there will be no clash. Feel free to, by the way. Even though I consider server-side tone mapping the better choice, adding it to all rendering paths in sunshine (particularly on Linux) is not realistic.

@andygrundman (Contributor)

You're right, a server-side tone mapper does make a lot of sense. Do you think hooking into something like ReShade is feasible? There's a lot of good stuff there. I would really like to be able to turn on one of those cool HDR visualizers, for example.

@ns6089 (Contributor, Author) commented Mar 6, 2026

Hard to say, I'm not familiar enough with them. And since they can't cover all cases, I still need to work on a generic solution first. But some degree of integration would certainly be nice. I also considered placing capture (and possibly encoding) there; in theory this could provide negative latency (we may get the frame before it's sent to the screen). But again, same problem: the generic pipelines need to be ironed out first.
