Off-screen rendering with webview2 #547
Comments
Currently this is not possible. We have offscreen rendering on our backlog, and are tracking it in #20, but I think this ask is clearer so I'll also add this issue to our item. Thanks! |
Hi, do you have any updates on this which you're able to share? |
Unfortunately not yet. We have not begun work on this. It is a large amount of work, and while it's very high on our priority list, it gets bumped each quarter as higher-priority asks come in. I really want to do this work as it's currently one of our top asks, but the earliest that could happen is Q1 2022 at this point. |
Any news on this? |
We are starting on the design phase of this work. Would you mind sharing your use case so that we can consider it in our plans? |
We are currently using CefSharp to render into an offscreen buffer that is sent to an FPGA to be composited onto a live video feed. The render includes playback of MPEG4-encoded video. We were looking into using WebView because CefSharp's default Chromium build does not include the MPEG4 codec. But without offscreen rendering, that's a moot point. We've since produced our own custom build of Chromium to include the codec. |
Our use case is to show web content in our immersive VR spaces. See https://www.igloovision.com/software/enterprise-package/igloo-web We need access to the rendered web textures so that we can warp and blend multiple images to projector outputs to create a clear, seamless view on the inside of a cylinder or room. Currently we use CEF for the web input with a user-maintained patch for getting the shared textures. See https://bitbucket.org/chromiumembedded/cef/pull-requests/285 and read down to the end of the comments for full details. We build CEF with proprietary codecs included. The builds of our app and CEF are done with C++. This is unsatisfactory because it uses the deprecated custom compositing (which could be removed in the future) instead of Skia, and it is difficult to keep updated. We would look at moving to WebView2 if access to the rendered textures could be provided and supported, along with the ability to include proprietary codecs. |
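For context on what "access to the rendered textures" looks like in practice, here is a minimal sketch of the kind of callback the shared-texture patch above exposes. It assumes the patch-era `OnAcceleratedPaint` shape (newer CEF passes a `CefAcceleratedPaintInfo` instead of a raw handle), and `WarpAndBlend` is a hypothetical downstream compositor call:

```cpp
#include <d3d11.h>
#include "include/cef_render_handler.h"

// Sketch only: a CefRenderHandler for the shared-texture patch referenced
// above. Each composited frame arrives as a shared D3D11 texture handle,
// which we open on our own device and hand to a (hypothetical) warp/blend
// compositor. Signatures follow the patch-era API; newer CEF passes a
// CefAcceleratedPaintInfo struct instead of a raw handle.
class SharedTextureHandler : public CefRenderHandler {
 public:
  explicit SharedTextureHandler(ID3D11Device* device) : device_(device) {}

  void GetViewRect(CefRefPtr<CefBrowser> browser, CefRect& rect) override {
    rect = CefRect(0, 0, 1920, 1080);  // size of the offscreen surface
  }

  void OnAcceleratedPaint(CefRefPtr<CefBrowser> browser,
                          PaintElementType type,
                          const RectList& dirtyRects,
                          void* shared_handle) override {
    ID3D11Texture2D* texture = nullptr;
    // Open the shared handle on our own D3D11 device so the warp/blend
    // pass can sample the browser output directly, with no CPU copy.
    if (SUCCEEDED(device_->OpenSharedResource(
            shared_handle, __uuidof(ID3D11Texture2D),
            reinterpret_cast<void**>(&texture)))) {
      WarpAndBlend(texture);  // hypothetical downstream compositor call
      texture->Release();
    }
  }

  void OnPaint(CefRefPtr<CefBrowser> browser, PaintElementType type,
               const RectList& dirtyRects, const void* buffer,
               int width, int height) override {
    // Software fallback path; unused when shared textures are enabled.
  }

 private:
  void WarpAndBlend(ID3D11Texture2D* texture);  // placeholder, defined elsewhere
  ID3D11Device* device_;

  IMPLEMENT_REFCOUNTING(SharedTextureHandler);
};
```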
Thanks for the info @rjx-ray! If you don't need high-frequency rendering you could consider using the CapturePreview function to get an image, but this isn't great for things like videos or other animations. |
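For completeness, a rough sketch of the CapturePreview approach mentioned above, using the Win32/COM API with WRL; error handling is omitted and the handling of the resulting stream is illustrative only:

```cpp
#include <wrl.h>
#include <shlwapi.h>  // SHCreateMemStream (link with Shlwapi.lib)
#include "WebView2.h"

using Microsoft::WRL::Callback;
using Microsoft::WRL::ComPtr;

// Grab a single PNG snapshot of the current WebView2 content into an
// in-memory stream. Fine for occasional captures, but unsuitable for
// video/animation, since each call is a one-shot capture rather than a
// continuous frame stream.
void CaptureSnapshot(ICoreWebView2* webview)
{
    ComPtr<IStream> stream;
    stream.Attach(SHCreateMemStream(nullptr, 0));

    webview->CapturePreview(
        COREWEBVIEW2_CAPTURE_PREVIEW_IMAGE_FORMAT_PNG,
        stream.Get(),
        Callback<ICoreWebView2CapturePreviewCompletedHandler>(
            [stream](HRESULT errorCode) -> HRESULT
            {
                if (SUCCEEDED(errorCode))
                {
                    // The PNG bytes are now in `stream`; hand them to the
                    // consumer (file, compositor, etc.) as needed.
                }
                return S_OK;
            }).Get());
}
```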
Hi @champnic, thanks for the response but we do need high-frequency rendering, typically for YouTube and other web video display. |
Game UI development, specifically within DirectX 9 (Ex) / 10 / 11 contexts. Ability to render into an offscreen texture and display that as overlay of the game. |
Currently, we utilize CefSharp Offscreen for generating a PDF from HTML content or using it to generate screenshots of HTML in a server side environment. |
Hi, are there any updates on this feature? |
CAD editor software, I want to compose web ui and opengl/d3d rendering together. |
Any XR Application wanting to be able to have some 2D UI inside the 3D Environment: it means being able to render into a DirectX Texture and to be able to inject input in the off-screen view (pointer + keyboard) |
Thanks for the info! Unfortunately our design work is slow going due to high priority bugs, but we're still making progress. |
Is #579 related? |
My use case would be Game Overlay. |
I'd say an API like CefSharp's would be quite pleasant, and would make it easy to port over existing CefSharp code. |
Hi, do you have any updates you're able to share? |
Adding another use case: Presentations on screens that are not just Windows desktops. From our experience with CEF we'd need the following (in descending priority order) to switch:
It might be tempting to keep these APIs as close to the corresponding Windows APIs as possible, but it's not strictly necessary - this feature will mostly be used from deep within client code, very possibly behind a bunch of abstraction layers, so I'd aim for minimal and clean first. Hope this was not too much/too harsh, but currently a whole industry depends on the CEF pull request above, which has been in PR limbo for years and will soon stop working altogether, so an alternative would be a very appreciated thing ;) |
@champnic - Is there any news you can share with us? |
+1 |
It would require modifications to the Chromium source, not only the framework - which is how I had understood your comment. Sorry for the misunderstanding.
Maybe you should just create a new issue... |
Missed this, so replying to @fredemmott
To be clear -- I require the ability to run custom, perfectly frame-paced, arbitrary frame rates below max Hz. E.g. 57fps that looks exactly like 57Hz, with a green-colored VALID and a very flat www.testufo.com/animation-time-graph. Or 123.5fps that looks like perfectly stutter-free 123.5Hz. Etc.

I need fully dynamic-framerate-capable (not hardcoded to max Hz) VRR frame pacing that correctly refreshes at dynamic, asynchronous refresh cycles, and not at Windows' default schedule of max-Hz refreshes per second during DWM+VRR operation. Chromium's code design somehow forces the Chromium framebuffer to refresh at every DWM-scheduled refresh cycle, spoiling VRR's raison d'être. Before W3C moved to WHATWG, I tried to post a suggestion for a VSYNC API that would solve this: w3c/html#375.

For now, I have given up on browsers' complete inability to do proper, true VRR. So the current fallback plan is to use some offscreen CEF-style system and simply seize control over frame presentation to do it the proper way. Hopefully there's no hardcoded tick-tock built into Chromium (Chrome automatically uses 60fps on displays it cannot sync to the refresh rate of, e.g. older Linux distributions), or it ruins my plans.

Part of the reason is skills silos -- browser developers are VERY skilled at designing browsers, but don't understand VRR engineering. I believe browser developers don't quite fully understand VRR, so this "browser bug" (non-true VRR during sub-max-Hz frame rates) has existed for almost a decade. So if I set a 57fps cap (by any means, like busywaits inside requestAnimationFrame(), or via a provided sanctioned technique), I need the monitor sending photons to my eyeballs exactly 1/57sec apart (as VRR is designed to do) and not rounded off to the next refresh cycle scheduled by Microsoft Windows DWM (even when Windowed G-SYNC is enabled). In true VRR operation, the frame Present() or glXSwapBuffers() call immediately causes the display to refresh. That's how video games do it. Why?
Yes, I have tried the framerate cap setting in Chrome, and it doesn't bypass the forced DWM refreshing the way many other apps can.
Desired solution:
I am still deeply disappointed that browsers still don't do proper VRR (during sub-max-Hz frame rates). This is why I am moving my plans toward offscreen rendering, to work around this "browsers cant do VRR" bug (a sketch of the presentation control I'm after follows below this comment).
Developer TIP (for Chromium engineers): a debugging suggestion for Chromium software developers (if anyone reads this):
|
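For reference, the presentation control described in the comment above is roughly what a plain DXGI swap chain already permits once the host owns the frame. This is a sketch, not part of any WebView2 API: it paces an arbitrary rate (e.g. 57fps) on a VRR display by presenting with sync interval 0 and tearing allowed, so each Present drives a refresh instead of waiting for DWM's max-Hz schedule. It assumes the swap chain was created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING on hardware that supports it, and DrawWebContent is a placeholder for compositing the latest offscreen browser frame.

```cpp
#include <dxgi1_5.h>
#include <chrono>
#include <thread>

// Placeholder: copies/composites the latest offscreen browser frame into
// the swap chain's back buffer. Hypothetical; not part of any real API.
void DrawWebContent();

// Sketch: present frames at an arbitrary rate (e.g. 57 fps) on a VRR
// display. Assumes the swap chain was created with
// DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING on hardware/OS that supports
// DXGI_FEATURE_PRESENT_ALLOW_TEARING.
void PresentLoop(IDXGISwapChain1* swapChain, double targetFps)
{
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::duration<double>(1.0 / targetFps);
    auto nextPresent = clock::now();

    for (;;)
    {
        DrawWebContent();

        // Sync interval 0 + ALLOW_TEARING: on a VRR display the refresh is
        // driven by this Present call instead of being rounded up to the
        // next DWM-scheduled max-Hz refresh.
        swapChain->Present(0, DXGI_PRESENT_ALLOW_TEARING);

        // Coarse pacing to the target frame time; a short busy-wait at the
        // end would tighten this toward exact 1/targetFps spacing.
        nextPresent += std::chrono::duration_cast<clock::duration>(frameTime);
        std::this_thread::sleep_until(nextPresent);
    }
}
```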
Hi. These days, I think there are solutions that should allow you to provide WebView2 on Linux, and especially to add this mode. So, is it possible to provide offscreen rendering using the Ozone layer? (cf. chromiumembedded/cef#3263) The CEF team will be migrating to this architecture. So if, like them, your heart is in Chromium, you can either do what they do or do it better. Regards |
As you can see in |
Hello @reitolab I don't think it was merged, because this pull request was rejected. The CEF strategy is described here: chromiumembedded/cef#3685 What is certain is that Microsoft has what it takes to offer OSR on both Linux and Windows. |
@reitolab
So what's the target date? When will we have this mode? |
this hasn't been done in 4 years, so I wouldn't expect it in this decade either |
I'm not a WebView2 dev, so I can't give any date. I was posting the CEF solution here just to provide an example, hoping someone at MS could do it. I've been working on that viz solution for 3 months, and it finally got merged into CEF 3 days ago. |
@microdee One thing's for sure: if they need help, they can count on us. Right? @reitolab |
Hello @bradp0721 Don't you have any specific information to give us on the subject?
Regards |
Maybe it would be faster to apply for an MS job and implement it yourself |
I'm fine. I'm just helping out. @reitolab Are you employed by CEF? |
No, I'm not. I think we shouldn't continue this chat here. |
Are CoreWebView2Texture, CoreWebView2TextureStream or CoreWebView2WebTexture classes related to this request? Documentation seems a bit sparse. |
wow, seems promising |
Apparently those have been in there for a year and a half; see the WebView2 release notes. However, I think it's more about sharing a texture resource between JS and the host, not rendering the entire HTML page onto a texture. I'd be glad to be proven wrong there, though ;) |
is there any js api to use the texture? |
I have no idea but I found this old example |
Ah, bummer. It's indeed not rendering HTML to a texture, which is what this thread is about, but serving a texture as a media source for JS. Regardless, it's a cool feature; I don't remember CEF having anything like that, for example. |
That's also something I want. I did a little digging into WebGPU, and I think it is possible to do that with a GPUTexture via Dawn native (not standard WebGPU), unless Chromium is modified further. I also saw some WebView2 API that receives a texture from JS? So perhaps we can also get a texture of the screen? Not sure. |
The PR linked above uses the API to modify a stream from the host (a video stream), with the host reading back the modifications; unless you saw some API for getting access to the browser's texture/framebuffer, it's probably just that: video editing capabilities. |
Any update on this? |
Another ping for this thread.... |
Me too! |
Hi
Is it possible to get WebView2 to render into a shared memory region like CEF does?
https://bitbucket.org/chromiumembedded/cef/wiki/GeneralUsage#markdown-header-off-screen-rendering
In my application I use a 2-process architecture where the main process does not have network access, and a second process uses CEF to render web content into a shared memory region from which the main app can read the pixels. I am wondering if I can achieve this using WebView2 (see the sketch below for roughly what the CEF side looks like today).
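A minimal sketch of how this currently works with CEF's windowless mode, assuming a CefRenderHandler whose BGRA OnPaint buffer is copied into a memory mapping created elsewhere (names like shared_pixels are placeholders):

```cpp
#include <cstring>
#include "include/cef_render_handler.h"

// Sketch: CEF windowless (off-screen) rendering into a shared memory
// region. `shared_pixels` is assumed to point at a mapping created with
// CreateFileMapping/MapViewOfFile (or shm on other platforms) that the
// network-restricted main process also maps and reads.
class SharedMemoryRenderHandler : public CefRenderHandler {
 public:
  SharedMemoryRenderHandler(void* shared_pixels, int width, int height)
      : shared_pixels_(shared_pixels), width_(width), height_(height) {}

  void GetViewRect(CefRefPtr<CefBrowser> browser, CefRect& rect) override {
    rect = CefRect(0, 0, width_, height_);
  }

  void OnPaint(CefRefPtr<CefBrowser> browser,
               PaintElementType type,
               const RectList& dirtyRects,
               const void* buffer,
               int width,
               int height) override {
    // `buffer` is a width*height BGRA32 bitmap; copy it into the shared
    // region for the main process to consume. A real implementation would
    // copy only dirtyRects and synchronize with the reader.
    std::memcpy(shared_pixels_, buffer, size_t(width) * height * 4);
  }

 private:
  void* shared_pixels_;
  int width_;
  int height_;

  IMPLEMENT_REFCOUNTING(SharedMemoryRenderHandler);
};
```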
Thanks
AB#28491736