Created: 3 years, 8 months ago by Daniele Castagna
Modified: 3 years, 8 months ago
CC: chromium-reviews, kalyank, ozone-reviews_chromium.org, piman+watch_chromium.org
Target Ref: refs/heads/master
Project: chromium
Visibility: Public.
Description

ozone: Create RGBX (not RGBA) buffers in GbmSurface.
We used to always create opaque DRM framebuffers.
After crrev.com/2790553002 we started allowing non-opaque framebuffers.
GbmSurface, which was used in some tests, would allocate a BGRA
buffer and use it as the primary plane. On some devices, formats
with alpha are not supported on the primary plane.
This CL makes sure that GbmSurface allocates RGBX buffers.
BUG=707108
TEST=video_decode_accelerator_unittest now passes.
Review-Url: https://codereview.chromium.org/2825933002
Cr-Commit-Position: refs/heads/master@{#465640}
Committed: https://chromium.googlesource.com/chromium/src/+/5d308f5270f55633e074bdaa2b2e06d303057d53
Patch Set 1 #
Total comments: 3
Messages
Total messages: 19 (9 generated)
The CQ bit was checked by dcastagna@chromium.org to run a CQ dry run
Dry run: CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.chromium.or...
dcastagna@chromium.org changed reviewers: + dnicoara@chromium.org, kylechar@chromium.org
dcastagna@chromium.org changed reviewers: + kcwu@chromium.org
The CQ bit was unchecked by commit-bot@chromium.org
Dry run: This issue passed the CQ dry run.
https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
File ui/ozone/platform/drm/gpu/gbm_surface.cc (right):

https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
ui/ozone/platform/drm/gpu/gbm_surface.cc:133: new gl::GLImageNativePixmap(GetSize(), GL_RGB);
Why is this GL_RGB? Wouldn't this mean a pixel with only 3 components? (vs 4 in BufferFormat::BGRX_8888)
https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
File ui/ozone/platform/drm/gpu/gbm_surface.cc (right):

https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
ui/ozone/platform/drm/gpu/gbm_surface.cc:133: new gl::GLImageNativePixmap(GetSize(), GL_RGB);
On 2017/04/19 at 14:31:09, dnicoara wrote:
> Why is this GL_RGB? Wouldn't this mean a pixel with only 3 components? (vs 4 in BufferFormat::BGRX_8888)

That's the internal pixel format; it doesn't need to match the number of channels of the buffer format. In practice, we just use it for validation. Take a look at ValidInternalFormat in gl_image_native_pixmap.cc.
https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
File ui/ozone/platform/drm/gpu/gbm_surface.cc (right):

https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
ui/ozone/platform/drm/gpu/gbm_surface.cc:133: new gl::GLImageNativePixmap(GetSize(), GL_RGB);
On 2017/04/19 14:51:06, Daniele Castagna wrote:
> On 2017/04/19 at 14:31:09, dnicoara wrote:
> > Why is this GL_RGB? Wouldn't this mean a pixel with only 3 components? (vs 4 in BufferFormat::BGRX_8888)
>
> That's the internal pixel format, it doesn't need to match the number of channels of the buffer format.
>
> In practice, we just use it for validation, take a look at ValidInternalFormat in gl_image_native_pixmap.cc

Yeah, I saw that and it also confused me. So, what's the relationship between the two then? It seems highly confusing, as I expected GL_RGB to be used in GL operations, or at the very least I expected BufferFormat to be converted to a GL_* format at some point.

Should this at the very least use something like GLImageNativePixmap::GetInternalFormatForTesting()?
On 2017/04/19 at 15:08:23, dnicoara wrote:
> https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
> File ui/ozone/platform/drm/gpu/gbm_surface.cc (right):
>
> https://codereview.chromium.org/2825933002/diff/1/ui/ozone/platform/drm/gpu/g...
> ui/ozone/platform/drm/gpu/gbm_surface.cc:133: new gl::GLImageNativePixmap(GetSize(), GL_RGB);
>
> Yeah, I saw that and it also confused me. So, what's the relationship between the two then? It seems highly confusing, as I expected GL_RGB to be used in GL operations, or at the very least I expected BufferFormat to be converted to a GL_* format at some point.
>
> Should this at the very least use something like GLImageNativePixmap::GetInternalFormatForTesting()?

When a client of the GPU service creates an image with CreateImageCHROMIUM, it specifies an internal format that is later used service-side to validate the buffer format along with the internalformat (in CreateImageValidInternalFormat and IsImageFormatCompatibleWithGpuMemoryBufferFormat). The internalformat is usually used by our GL implementation.

In this case we're creating the image directly, so we skip all the GPU service validation. Saying GL_RGB just suggests the alpha channel should be ignored.

In general, the data format and internalformat don't need to match. See https://github.com/phracker/MacOSX-SDKs/blob/master/MacOSX10.8.sdk/System/Lib... for example.
On 2017/04/19 at 15:29:33, Daniele Castagna wrote:
> When a client of the GPU service creates an image with CreateImageCHROMIUM, it specifies an internal format that is later used service-side to validate the buffer format along with the internalformat (in CreateImageValidInternalFormat and IsImageFormatCompatibleWithGpuMemoryBufferFormat). The internalformat is usually used by our GL implementation.
>
> In this case we're creating the image directly, so we skip all the GPU service validation. Saying GL_RGB just suggests the alpha channel should be ignored.
>
> In general, the data format and internalformat don't need to match. See https://github.com/phracker/MacOSX-SDKs/blob/master/MacOSX10.8.sdk/System/Lib... for example.
>
> You can use a data format GL_UNSIGNED_INT_8_8_8_8_REV with GL_RGB or GL_RGBA. If you use it with GL_RGB, you'll get 1.0 when you read the alpha channel.

This code is similar to what we do when we create the output surface: https://cs.chromium.org/chromium/src/content/browser/compositor/gpu_process_t...
lgtm

Thank you for explaining!

nit: Not as important now, but should this use something like GLImageNativePixmap::GetInternalFormatForTesting() in case we ever change the primary display format and it wouldn't pass ValidInternalFormat()?
The CQ bit was checked by dcastagna@chromium.org
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.chromium.or...
CQ is committing da patch. Bot data: {"patchset_id": 1, "attempt_start_ts": 1492620811083790, "parent_rev": "786ed37e758793e92b323bcad36fad924ca305ad", "commit_rev": "5d308f5270f55633e074bdaa2b2e06d303057d53"}
Message was sent while issue was closed.
Description was changed from

==========
ozone: Create RGBX (not RGBA) buffers in GbmSurface.

We used to always create opaque DRM framebuffers.
After crrev.com/2790553002 we started allowing non-opaque fb.
GbmSurface, that was used in some tests, would allocate a BGRA
buffer and use it as primary plane. On some devices formats
with alpha are not supported on primary plane.

This CL makes sure that GbmSurface allocates RGBX buffers.

BUG=707108
TEST=video_decode_accelerator_unittest now passes.
==========

to

==========
ozone: Create RGBX (not RGBA) buffers in GbmSurface.

We used to always create opaque DRM framebuffers.
After crrev.com/2790553002 we started allowing non-opaque fb.
GbmSurface, that was used in some tests, would allocate a BGRA
buffer and use it as primary plane. On some devices formats
with alpha are not supported on primary plane.

This CL makes sure that GbmSurface allocates RGBX buffers.

BUG=707108
TEST=video_decode_accelerator_unittest now passes.

Review-Url: https://codereview.chromium.org/2825933002
Cr-Commit-Position: refs/heads/master@{#465640}
Committed: https://chromium.googlesource.com/chromium/src/+/5d308f5270f55633e074bdaa2b2e...
==========
Message was sent while issue was closed.
Committed patchset #1 (id:1) as https://chromium.googlesource.com/chromium/src/+/5d308f5270f55633e074bdaa2b2e...