Subject: Re: Support for 2D engines/blitters in V4L2 and DRM
On Wed, 17 Apr 2019 20:10:15 +0200
Paul Kocialkowski <paul.kocialkowski@bootlin.com> wrote:

> There's also the possibility of writing up a drm-render DDX to handle
> these 2D blitters that can make things a lot faster when running a
> desktop environment. As for wayland, well, I don't really know what to
> think. I was under the impression that it relies on GL for 2D
> operations, but am really not sure how true that actually is.

Hi Paul,

Wayland does not rely on anything, really; it does not even have any
rendering commands, and is completely agnostic to how applications or
display servers might be drawing things. Wayland (the protocol) does
care about buffer types and fences though, since those are the things
passed between applications and servers.
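To make that concrete, here is a minimal sketch of that hand-off in C,
using only stock libwayland-client calls. It assumes "surface" and
"buffer" were created earlier through the usual wl_compositor and
wl_shm or linux-dmabuf paths; the point is that how the pixels were
produced (GL, a 2D blitter, the CPU) is invisible to the protocol.

/* Minimal sketch, assuming surface and buffer already exist. Only the
 * generic attach/damage/commit mechanics are shown; Wayland never sees
 * how the pixels were produced. */
#include <wayland-client.h>

static void submit_frame(struct wl_surface *surface, struct wl_buffer *buffer,
                         int32_t width, int32_t height)
{
    wl_surface_attach(surface, buffer, 0, 0);        /* hand the buffer over */
    wl_surface_damage(surface, 0, 0, width, height); /* mark what changed */
    wl_surface_commit(surface);                      /* make it current */
}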

In a Wayland architecture, each display server (called a Wayland
compositor, corresponding to Xorg + window manager + compositing
manager) uses whatever it wants for putting the screen contents
together. OpenGL is a popular choice, yes, but they may also use
Vulkan, Pixman, Cairo, Skia, DRM KMS planes, or any mix of these.
Sometimes it may so happen that the display server does not need to
render at all, because the display hardware can realize the screen
contents through e.g. KMS planes.
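For the "no rendering at all" case, a rough sketch of what a compositor
does with the atomic KMS API: take a client buffer that has already
been imported as a DRM framebuffer and put it straight on a plane. The
plane property IDs have to be discovered at runtime with
drmModeObjectGetProperties(); here they are assumed to be known
already, and error handling is trimmed.

/* Rough sketch: scan out an imported client framebuffer on a KMS plane.
 * Property IDs in struct plane_props are assumed to have been looked up
 * at startup; fb_id comes from drmModeAddFB2WithModifiers() or similar. */
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

struct plane_props {            /* looked up once per plane (assumption) */
    uint32_t fb_id, crtc_id;
    uint32_t src_x, src_y, src_w, src_h;
    uint32_t crtc_x, crtc_y, crtc_w, crtc_h;
};

static int scanout_client_buffer(int drm_fd, uint32_t plane_id,
                                 uint32_t crtc_id,
                                 const struct plane_props *p,
                                 uint32_t fb_id, uint32_t w, uint32_t h)
{
    drmModeAtomicReq *req = drmModeAtomicAlloc();
    int ret;

    /* Source rectangle is in 16.16 fixed point, destination in pixels. */
    drmModeAtomicAddProperty(req, plane_id, p->fb_id, fb_id);
    drmModeAtomicAddProperty(req, plane_id, p->crtc_id, crtc_id);
    drmModeAtomicAddProperty(req, plane_id, p->src_x, 0);
    drmModeAtomicAddProperty(req, plane_id, p->src_y, 0);
    drmModeAtomicAddProperty(req, plane_id, p->src_w, (uint64_t)w << 16);
    drmModeAtomicAddProperty(req, plane_id, p->src_h, (uint64_t)h << 16);
    drmModeAtomicAddProperty(req, plane_id, p->crtc_x, 0);
    drmModeAtomicAddProperty(req, plane_id, p->crtc_y, 0);
    drmModeAtomicAddProperty(req, plane_id, p->crtc_w, w);
    drmModeAtomicAddProperty(req, plane_id, p->crtc_h, h);

    ret = drmModeAtomicCommit(drm_fd, req, DRM_MODE_ATOMIC_NONBLOCK, NULL);
    drmModeAtomicFree(req);
    return ret;
}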

Writing a hardware-specific driver (like a DDX for Xorg) for one
display server (or a display server library like wlroots or libweston)
is no longer reasonable: there are simply too many display server
projects. What really makes it infeasible is the hardware-specific
aspect: people would have to write a driver for every display server
project for every hardware model. That's just not realistic today.

Some display server projects even refuse to take hardware-specific code
upstream, because keeping it working has a high cost and only very few
people can test it.

The only way, as I see it, that you could have Wayland compositors at
large take advantage of 2D hardware units is to come up with a common
userspace API, in a sense similar to Vulkan or OpenGL, so that each
display server would only need to support the API, and the API
implementation would handle the hardware-specific parts. OpenWF by
Khronos may have been the most serious effort in that direction, but
good luck finding any users or implementations today. Although maybe
Android's hwcomposer could be the next one.
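Just to illustrate the idea, a purely hypothetical header for such a
common 2D API. None of these names exist; they only show the shape a
shared blit interface could take: display servers would code against
it, and each vendor would ship the implementation driving its own
blitter.

/* Purely hypothetical illustration of a shared 2D blit API; none of
 * these names are real. */
#include <stdint.h>

struct blit2d_device;
struct blit2d_image;     /* wraps a dmabuf or similar shareable buffer */

struct blit2d_rect { int32_t x, y, w, h; };

/* Open the vendor implementation for a given DRM render node. */
struct blit2d_device *blit2d_open(const char *render_node);

/* Import a dmabuf so the 2D unit can read or write it. */
struct blit2d_image *blit2d_import_dmabuf(struct blit2d_device *dev,
                                          int dmabuf_fd, uint32_t fourcc,
                                          uint32_t width, uint32_t height,
                                          uint32_t stride);

/* Composite src onto dst; returns a sync-file fd the caller can wait on. */
int blit2d_blit(struct blit2d_device *dev,
                struct blit2d_image *dst, const struct blit2d_rect *dst_rect,
                struct blit2d_image *src, const struct blit2d_rect *src_rect);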

However, if someone is doing a special Wayland compositor to be used on
specific hardware, they can of course use whatever they like to put the
screen contents together in a downstream fork. Wayland does not
restrict that in any way, not even by buffer or fence types, because
you can extend Wayland to deal with anything you need, as long as you
also modify the apps or toolkits to do it too. The limitations are
really more political and practical if you aim for upstream and
widespread use of 2D hardware blocks.
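As a sketch of how such a downstream extension gets wired up on the
client side: the private global is bound through the ordinary
wl_registry mechanism. The "acme_2d_buffers" name and its interface are
invented for illustration and would come out of a private protocol XML
run through wayland-scanner; only the wl_registry calls are stock
libwayland-client.

/* Sketch: binding a vendor-private global advertised by a downstream
 * compositor. acme_2d_buffers_interface is assumed to be generated by
 * wayland-scanner from a private protocol file. */
#include <string.h>
#include <wayland-client.h>

extern const struct wl_interface acme_2d_buffers_interface;

static void *acme_buffers;      /* proxy for the private global */

static void handle_global(void *data, struct wl_registry *registry,
                          uint32_t name, const char *interface,
                          uint32_t version)
{
    if (strcmp(interface, "acme_2d_buffers") == 0)
        acme_buffers = wl_registry_bind(registry, name,
                                        &acme_2d_buffers_interface, 1);
}

static void handle_global_remove(void *data, struct wl_registry *registry,
                                 uint32_t name)
{
    /* nothing to do for this sketch */
}

static const struct wl_registry_listener registry_listener = {
    .global = handle_global,
    .global_remove = handle_global_remove,
};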


Thanks,
pq