We've seen an issue where, when capturing with WGC, the whole computer feels sluggish. What's weird, though, is that WGC can be capturing at 60 fps while the whole computer feels like it's only running at, say, 15 fps (judging by clicking a window and dragging it around, or just making circles with the mouse cursor). This happens especially when pushing the resource limits of whatever machine the app is running on (and isn't terribly difficult to trigger on, say, a high-end laptop).
Even more interesting, the underlying WGC capture actually is running at that high framerate (video recorded via WGC shows data updating at the higher rate). I've even tried taking a slow-motion video with my phone against a timecode video (showing individual frame numbers on screen): I can see the video on the monitor skipping frames, but the underlying WGC captured data has not skipped those frames.
It may be helpful to define some terminology I'll use here (it may differ from official terminology):
Monitor refresh rate (what the hardware can update at, and what DWM reports): 59.97 Hz or thereabouts.
WGC capture rate (how many OnFrameArrived callbacks WGC delivers over a given period of time): somewhere between 50 and 59.97 Hz over a fairly long test period right now.
Effective display rate (what the user actually sees, which may be lower than the monitor refresh rate): 41.84 Hz (measured over the exact same span of time, computed over several seconds from a phone slow-motion video where I verified timecode frame numbers showing up in the WGC capture but not on the monitor).
So I guess my questions are:
1. Is it possible for the WGC pipeline to capture at a high enough framerate that its resource usage actually slows down the effective display rate, making the computer feel sluggish, even though from every diagnostic point of view we're still getting the full framerate from WGC? (DWM just always seems to report the hardware refresh rate, never the effective display rate.)
2. For diagnostic/testing purposes, is it possible to retrieve the effective display rate, as opposed to just the static monitor refresh rate? I've tried quite a lot of APIs and none that I found seemed to do this.
3. Is it possible to slow WGC down slightly to match the effective display rate, rather than capturing frames that get decimated before the user ever sees them while making the whole computer feel sluggish? (The user experience takes a real hit when this happens.)
As a side note, we have already implemented one of the suggestions from a prior thread about holding frames and not releasing them until later, to force pacing to a given X fps -- that works great, so ty for that suggestion! However, because we can't figure out how to grab the effective display rate, we're a little in the dark on diagnostics, and we're probably still capturing at too high a framerate for what a user's computer resources allow.
Thanks!
That's certainly surprising; the capture shouldn't outpace the monitor...
What build of Windows are you running? I'll need to do some digging and get back to you.
Also, one thing I should mention: we recently added a "MinUpdateInterval" feature that lets you cap the frame rate. If you don't want the capture to exceed 30 Hz, you can set the min update interval to 33 ms. However, that was added in build 26100, which I think is currently only available on new machines.
I'm currently running Version 10.0.22631.4037, and I know at least one other person who reported the issue was on 22631 as well. I'm not sure whether the full set of reports spans multiple OS versions; likely the same or other Windows 11 builds.
And, excellent to hear about "MinUpdateInterval", ty!