From: Brian Starkey
Subject: Re: DRM Atomic property for color-space conversion

Hi Ville,

On Thu, Mar 16, 2017 at 07:36:56PM +0200, Ville Syrjälä wrote:
>On Thu, Mar 16, 2017 at 07:05:12PM +0200, Sharma, Shashank wrote:
>> On 3/16/2017 5:55 PM, Brian Starkey wrote:
>> > On Thu, Mar 16, 2017 at 05:14:07PM +0200, Sharma Shashank wrote:
>> >> On 3/16/2017 4:37 PM, Liviu Dudau wrote:
>> >>> On Thu, Mar 16, 2017 at 04:30:59PM +0200, Ville Syrjälä wrote:
>> >>>> On Thu, Mar 16, 2017 at 04:20:29PM +0200, Sharma, Shashank wrote:
>> >>>>> On 3/16/2017 4:07 PM, Ville Syrjälä wrote:

[snip]

>> >>>>>>
>> >>>>>> So what we might need is something like:
>> >>>>>> enum YCBCR_TO_RGB_CSC
>> >>>>>> * YCbCr BT.601 limited to RGB BT.709 full
>> >>>>>> * YCbCr BT.709 limited to RGB BT.709 full <this would be the
>> >>>>>> likely default value IMO>
>> >>>>>> * YCbCr BT.601 limited to RGB BT.2020 full
>> >>>>>> * YCbCr BT.709 limited to RGB BT.2020 full
>> >>>>>> * YCbCr BT.2020 limited to RGB BT.2020 full
>> >>>>>>
>> >>>>>> And thanks to BT.2020 we'll need a RGB->RGB CSC property as well.
>> >>>>>> Eg:
>> >>>>>> enum RGB_TO_RGB_CSC
>> >>>>>> * bypass (or separate 709->709, 2020->2020?) <this would be the
>> >>>>>> default>
>> >>>>>> * RGB BT.709 full to RGB BT.2020 full
>> >
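
For illustration, here's a rough sketch of how that proposal could be
wired up on the kernel side with drm_property_create_enum(). The
my_plane wrapper and the attach_csc_props() helper are made up, and the
enum strings are abbreviated so they fit in DRM_PROP_NAME_LEN; treat it
as a sketch, not a finished ABI:

#include <linux/errno.h>
#include <linux/kernel.h>
#include <drm/drm_device.h>
#include <drm/drm_plane.h>
#include <drm/drm_property.h>

struct my_plane {
        struct drm_plane base;          /* hypothetical driver wrapper */
};

static const struct drm_prop_enum_list ycbcr_to_rgb_csc_list[] = {
        { 0, "YCbCr 601 ltd -> RGB 709 full" },
        { 1, "YCbCr 709 ltd -> RGB 709 full" },   /* likely default */
        { 2, "YCbCr 601 ltd -> RGB 2020 full" },
        { 3, "YCbCr 709 ltd -> RGB 2020 full" },
        { 4, "YCbCr 2020 ltd -> RGB 2020 full" },
};

static const struct drm_prop_enum_list rgb_to_rgb_csc_list[] = {
        { 0, "Bypass" },                          /* default */
        { 1, "RGB 709 full -> RGB 2020 full" },
};

static int attach_csc_props(struct my_plane *p)
{
        struct drm_device *dev = p->base.dev;
        struct drm_property *prop;

        /* Per-plane YCbCr->RGB conversion, default: 709 ltd -> 709 full. */
        prop = drm_property_create_enum(dev, 0, "YCBCR_TO_RGB_CSC",
                                        ycbcr_to_rgb_csc_list,
                                        ARRAY_SIZE(ycbcr_to_rgb_csc_list));
        if (!prop)
                return -ENOMEM;
        drm_object_attach_property(&p->base.base, prop, 1);

        /* Per-plane RGB->RGB gamut conversion, default: bypass. */
        prop = drm_property_create_enum(dev, 0, "RGB_TO_RGB_CSC",
                                        rgb_to_rgb_csc_list,
                                        ARRAY_SIZE(rgb_to_rgb_csc_list));
        if (!prop)
                return -ENOMEM;
        drm_object_attach_property(&p->base.base, prop, 0);

        return 0;
}
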
>> > I like this approach, from a point of view of being explicit and
>> > discoverable by userspace. It also happens to map quite nicely to our
>> > hardware... we have a matrix before degamma, so we could do a
>> > CSC + Gamut conversion there in one go, which is apparently not 100%
>> > mathematically correct, but in general is good enough.
>> >
>> > ... however having talked this over a bit with someone who understands
>> > the detail a lot better than me, it sounds like the "correct" thing to
>> > do as per the spec is:
>> >
>> > CSC -> DEGAMMA -> GAMUT
>> >
>> > e.g.
>> >
>> > YCbCr bt.601 limited to RGB bt.601 full -> degamma ->
>> > RGB bt.601 full to RGB bt.709 full
>> >
>> > So that sounds like what we need to support in the API, and also
>> > sounds more like the "separate properties" approach.
>> I agree.
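
To make that ordering concrete, here's a small illustrative program (not
driver code) that pushes one pixel through csc -> degamma -> gamut. The
YCbCr coefficients are the usual BT.601 limited-range approximations and
the degamma is the inverse of the BT.601/709 OETF; the gamut matrix is
left as an identity placeholder rather than a real 601->709 primaries
matrix:

#include <math.h>
#include <stdio.h>

/* Stage 1 (CSC): YCbCr BT.601 limited range -> non-linear R'G'B',
 * normalised and clamped to 0..1. */
static void csc_601_limited_to_rgb(double y, double cb, double cr,
                                   double rgb[3])
{
        rgb[0] = 1.164 * (y - 16.0) + 1.596 * (cr - 128.0);
        rgb[1] = 1.164 * (y - 16.0) - 0.392 * (cb - 128.0)
                                    - 0.813 * (cr - 128.0);
        rgb[2] = 1.164 * (y - 16.0) + 2.017 * (cb - 128.0);

        for (int i = 0; i < 3; i++)
                rgb[i] = fmin(fmax(rgb[i] / 255.0, 0.0), 1.0);
}

/* Stage 2 (DEGAMMA): inverse of the BT.601/709 OETF, giving linear light. */
static double degamma_709(double v)
{
        return v < 0.081 ? v / 4.5 : pow((v + 0.099) / 1.099, 1.0 / 0.45);
}

/* Stage 3 (GAMUT): 3x3 matrix applied to *linear* RGB. */
static void gamut_map(const double m[3][3], const double in[3], double out[3])
{
        for (int i = 0; i < 3; i++)
                out[i] = m[i][0] * in[0] + m[i][1] * in[1] + m[i][2] * in[2];
}

int main(void)
{
        /* Identity placeholder instead of a real 601->709 primaries matrix. */
        const double m[3][3] = { { 1, 0, 0 }, { 0, 1, 0 }, { 0, 0, 1 } };
        double rgb[3], lin[3], out[3];

        csc_601_limited_to_rgb(81, 90, 240, rgb);   /* ~pure red in BT.601 */

        for (int i = 0; i < 3; i++)
                lin[i] = degamma_709(rgb[i]);

        gamut_map(m, lin, out);
        printf("linear RGB after gamut mapping: %.3f %.3f %.3f\n",
               out[0], out[1], out[2]);
        return 0;
}

The point being that the RGB-to-RGB gamut step only makes sense once the
degamma has produced linear values, which is why it has to sit after
DEGAMMA in the property layout.
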
>> >
>> >>>>>>
>> >>>>>> Alternatives would involve two properties to define the input
>> >>>>>> and output from the CSC separately, but then you lose the
>> >>>>>> capability to see which combinations are actually supported.
>> >>>>> I was thinking about this too, or would it make more sense to
>> >>>>> create two properties:
>> >>>>> - one for gamut mapping (cases like RGB 709 -> RGB 2020)
>> >>>>> - another for color space conversion (cases like YUV 709 -> RGB 709)
>> >>>>>
>> >>>>> Gamut mapping can represent any of the fixed-function mappings,
>> >>>>> whereas CSC can represent any programmable matrix.
>> >>>>>
>> >>>>> Internally these properties can use the same HW unit or even the
>> >>>>> same function.
>> >>>>> Does that sound any good?
>> >
>> > It seems to me that actually the two approaches can be combined into
>> > the same thing:
>> > * We definitely need a YCbCr-to-RGB conversion before degamma
>> > (for converting YUV data to RGB, in some flavour)
>> > * We definitely need an RGB-to-RGB conversion after degamma to handle
>> > Rec.709 layers blended with Rec.2020 layers.
>> > The exact conversion each of those properties represents (CSC + gamut,
>> > CSC only, gamut only) can be implicit in the enum name.
>> >
>> > For hardware which has a fixed-function CSC before DEGAMMA with a
>> > matrix after DEGAMMA, I'd expect to see something like below. None of
>> > the YCBCR_TO_RGB_CSC values include a gamut conversion, because that
>> > is instead exposed with the RGB_TO_RGB_CSC property (which represents
>> > the hardware matrix).
>> >
>> > YCBCR_TO_RGB_CSC (before DEGAMMA):
>> > YCbCr BT.601 limited to RGB BT.601 full
>> > YCbCr BT.709 limited to RGB BT.709 full
>> > YCbCr BT.2020 limited to RGB BT.2020 full
>> >
>> > RGB_TO_RGB_CSC (after DEGAMMA):
>> > RGB BT.601 full to RGB BT.709 full
>> > RGB BT.709 full to RGB BT.2020 full
>> >
>> >
>> > On the other hand, on hardware which does a CSC + Gamut conversion in
>> > one go, before DEGAMMA (like ours), you might have:
>> >
>> > YCBCR_TO_RGB_CSC (before DEGAMMA):
>> > YCbCr BT.601 limited to RGB BT.601 full
>> > YCbCr BT.601 limited to RGB BT.709 full
>> > YCbCr BT.709 limited to RGB BT.709 full
>> > YCbCr BT.2020 limited to RGB BT.2020 full
>> >
>> > RGB_TO_RGB_CSC (after DEGAMMA):
>> > Not supported
>> >
>> > Userspace can parse the two properties to figure out its options to
>> > get from desired input -> desired output. It is perhaps a little
>> > verbose, but it's descriptive and flexible.
>> >
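
Just to illustrate that last point, a hedged libdrm sketch of the
userspace side; the property names are the ones proposed in this thread
(not an existing ABI) and dump_csc_options() is a made-up helper:

#include <stdio.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* List every enum value a plane exposes for one of the proposed CSC
 * properties, so a compositor can see which conversions are reachable
 * before deciding how to map its layers onto planes. */
static void dump_csc_options(int fd, uint32_t plane_id, const char *prop_name)
{
        drmModeObjectPropertiesPtr props =
                drmModeObjectGetProperties(fd, plane_id, DRM_MODE_OBJECT_PLANE);
        if (!props)
                return;

        for (uint32_t i = 0; i < props->count_props; i++) {
                drmModePropertyPtr prop = drmModeGetProperty(fd, props->props[i]);
                if (!prop)
                        continue;

                if (!strcmp(prop->name, prop_name) &&
                    drm_property_type_is(prop, DRM_MODE_PROP_ENUM)) {
                        printf("plane %u, %s:\n", plane_id, prop_name);
                        for (int j = 0; j < prop->count_enums; j++)
                                printf("  %llu: %s\n",
                                       (unsigned long long)prop->enums[j].value,
                                       prop->enums[j].name);
                }
                drmModeFreeProperty(prop);
        }
        drmModeFreeObjectProperties(props);
}

Calling it once for "YCBCR_TO_RGB_CSC" and once for "RGB_TO_RGB_CSC" on
each plane would give exactly the per-hardware lists above.
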
>> Seems like a good idea, Ville?
>
>Looks pretty sane to me.
>
>Though how we'll do the degamma/gamma is rather unclear still.
>
>I think we might be looking at two variants of hardware:
>- A fully programmable one with separate stages:
> csc -> degamma -> gamut -> gamma
>- A totally fixed one with just a few different variants
> of the pipeline baked into the hardware
>
>If we want to expose the gamma/degamma to the user, how exactly are we
>going to do that with the latter form of hardware? I guess we could
>specify that if the degamma property is not exposed, there will be an
>implicit degamma stage between the csc and gamut stages. And if it is
>exposed, the output from the first csc is non-linear and thus needs
>the degamma programmed to linearize it before the gamut mapping.
>
>And perhaps a similar rule could work for the gamma: if it's present,
>the output from the gamut mapping is expected to be linear, and
>otherwise non-linear. Not quite sure about this. In fact I don't yet
>know what our hardware would output from the end of the fixed pipeline.

I don't really like the implicit nature of that, and I also don't
think it fits with practical use-cases - for instance on current
Android, blending is assumed to be done on non-linear data, so an
implicit linearisation doesn't work there.
I think it's much better to make sure each element in the pipeline is
discoverable and configurable from userspace.

There's also the fact that we already have GAMMA/DEGAMMA properties
with a defined interface and semantics.

Mali-DP falls in the "fully programmable" camp - we can use a
programmable de-gamma curve in the plane pipeline, and for this, the
existing DEGAMMA property is a good fit.

For non-programmable hardware, would a second "DEGAMMA_FIXED" property
make sense, as an enum type exposing which curves are supported?
(with an analogous GAMMA_FIXED as well)

Drivers should expose one or the other (but not both) for each
gamma/degamma conversion the hardware implements.
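
As a strawman, the driver side of that could be as simple as the sketch
below; "DEGAMMA_FIXED", the curve names and attach_degamma_fixed() are
all assumptions at this point, not existing ABI:

#include <linux/errno.h>
#include <linux/kernel.h>
#include <drm/drm_plane.h>
#include <drm/drm_property.h>

static const struct drm_prop_enum_list degamma_fixed_curves[] = {
        { 0, "sRGB" },
        { 1, "BT.709" },
};

static int attach_degamma_fixed(struct drm_plane *plane)
{
        struct drm_property *prop;

        /* Expose only the curves the fixed-function block can do,
         * instead of the blob-based DEGAMMA_LUT. */
        prop = drm_property_create_enum(plane->dev, 0, "DEGAMMA_FIXED",
                                        degamma_fixed_curves,
                                        ARRAY_SIZE(degamma_fixed_curves));
        if (!prop)
                return -ENOMEM;

        drm_object_attach_property(&plane->base, prop, 0);
        return 0;
}
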

-Brian

>
>>
>> - Shashank
>> >>>> It's certainly possible. One problem is that we can't inform userspace
>> >>>> upfront which combinations are supported. Whether that's a real
>> >>>> problem
>> >>>> I'm not sure. With atomic, userspace can of course check upfront if
>> >>>> something can be done or not, but the main problem is then coming up
>> >>>> with a fallback strategy that doesn't suck too badly.
>> >>>>
>> >
>> > The approach above helps limit the set exposed to userspace to be only
>> > those which are supported - because devices which don't have separate
>> > hardware for the two stages won't expose values for both.
>> >
>> >>>> Anyways, I don't think I have any strong favorites here. Would be nice
>> >>>> to hear what everyone else thinks.
>> >>> I confess to a lack of experience in the subject here, but what is
>> >>> the more common request coming from userspace: converting YUV <-> RGB
>> >>> but keeping the gamut mapping separate, or YUV (gamut x) <-> RGB
>> >>> (gamut y)? In other words: I can see the usefulness of having an
>> >>> explicit way of decomposing the color mapping process and controlling
>> >>> the parameters, but how often do apps or compositors go through the
>> >>> whole chain?
>> >> Right now, more or less the interest is on the plain YUV <-> RGB
>> >> conversion side, because until now the BT.2020 gamut was not in the
>> >> picture. Rec.601 and Rec.709 have very close gamuts, so it was OK to
>> >> blend frames mostly without bothering about gamut, but going forward,
>> >> once Rec.2020 comes into the picture, we need to care about mapping
>> >> gamuts too, otherwise blending Rec.709 buffers and Rec.2020 buffers
>> >> together would cause a very visible gamut mismatch.
>> >>
>> >> So considering future developments, it might be OK to consider both.
>> >> Still, as Ville mentioned, it would be good to hear from others too.
>> >>
>> >
>> > Yeah I agree that we definitely need to consider both for anything we
>> > come up with now.
>> >
>> > Cheers,
>> > Brian
>> >
>> >> - Shashank
>> >>>
>> >>> Best regards,
>> >>> Liviu
>> >>>
>> >>>> --
>> >>>> Ville Syrjälä
>> >>>> Intel OTC
>> >>
>
>--
>Ville Syrjälä
>Intel OTC
