mediaTime: The media presentation timestamp (PTS) in seconds of the frame presented #59
Yes, the media time in the callback should map exactly to the frame's timestamp. There are some rare cases where we do rewrite timestamps, but generally the media time should be exactly what the demuxer emits. If you have a sample to look at, we can investigate if you think this isn't working right.
To expand with details and capture more information for future users of the API: suppose you have a sequence of frames with known presentation timestamps.
If the callback is in sync with the frame presented, the callback fires before the frame reaches the screen, and its expectedDisplayTime points at the upcoming v-sync.
If the callback is 1 v-sync behind the frame presented, the frame is already on screen by the time the callback fires, and its expectedDisplayTime is at or before the callback's now timestamp.
You can check whether the frame is already on screen, or is about to be on screen, using the expectedDisplayTime field of the callback metadata.
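For future readers, a minimal sketch of that check, using the standard requestVideoFrameCallback metadata fields (mediaTime, expectedDisplayTime); the exact comparison against the callback's now argument is an illustration of the idea rather than code from this thread:

```js
const video = document.querySelector('video');

function onFrame(now, metadata) {
  // mediaTime: the frame's presentation timestamp (PTS), in seconds.
  // expectedDisplayTime: when the compositor expects the frame to be visible, in ms.
  const delta = metadata.expectedDisplayTime - now;
  if (delta <= 0) {
    console.log(`frame ${metadata.mediaTime}s should already be on screen`);
  } else {
    console.log(`frame ${metadata.mediaTime}s is due on screen in ~${delta.toFixed(2)} ms`);
  }
  video.requestVideoFrameCallback(onFrame);
}

video.requestVideoFrameCallback(onFrame);
```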
Thank you for your responses. I was out by 10 frames, and as you pointed out, it was the encoding. It is working really well!
Does Chrome have a JavaScript API that tells us the refresh rate of the monitor? Without this information we can't programmatically decide whether the callback is 1 v-sync late.
Curious, what kind of encoding could cause this drift? H.264 with B-frames? How can we check whether the timestamps produced by the encoder are correct? Will such a video be played back correctly?
The video encoding has no impact on this API. The frame delay is related to the compositor painting before the frame callback can be delivered to the main thread. The API tries to tell you what frame will be painted, but sometimes the notification goes out too late since the operation happens across multiple threads.
You do not need to know what the display rate is (although you could calculate it by taking the median of the times between successive callbacks).
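A rough sketch of that estimate; the use of expectedDisplayTime as the timestamp and the sample count are assumptions made for illustration:

```js
// Estimate the presentation interval by taking the median of the deltas
// between successive callbacks' expectedDisplayTime values.
const video = document.querySelector('video');
const deltas = [];
let last = null;

function sample(now, metadata) {
  if (last !== null) deltas.push(metadata.expectedDisplayTime - last);
  last = metadata.expectedDisplayTime;

  if (deltas.length < 60) {
    video.requestVideoFrameCallback(sample);
    return;
  }
  const sorted = [...deltas].sort((a, b) => a - b);
  const medianMs = sorted[Math.floor(sorted.length / 2)];
  console.log(`~${medianMs.toFixed(2)} ms between presented frames (~${(1000 / medianMs).toFixed(1)} Hz)`);
}

video.requestVideoFrameCallback(sample);
```

Note that requestVideoFrameCallback only fires when a new frame is presented, so for a video whose frame rate is below the display rate this measures the video cadence rather than the raw refresh rate.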
I'm trying to determine the exact threshold on expectedDisplayTime for deciding whether the frame is already on screen or is about to be shown.
Can we compare expectedDisplayTime against the callback's now timestamp for this?
What is the state when the difference is between 0.000100 seconds and 16 ms? Uncertain?
I would try a threshold of roughly one v-sync; I would consider every other case as late.
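One way to code up that heuristic; the cutoff values here (one v-sync at an assumed 60 Hz, plus a small epsilon) are assumptions, not numbers from this thread:

```js
const VSYNC_MS = 16.7;   // assumed 60 Hz display; could be measured as sketched above
const EPSILON_MS = 0.1;  // tolerance for timestamps that are effectively "now"

function classifyCallback(now, metadata) {
  const delta = metadata.expectedDisplayTime - now;
  if (delta > EPSILON_MS && delta <= VSYNC_MS + EPSILON_MS) {
    return 'upcoming'; // frame will be painted on the next v-sync
  }
  return 'late';       // frame is already on screen, or the callback ran behind the paint
}
```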
Does it conflict with the conclusion made earlier in this thread?
More specifically, is the following case possible? When we get the rVFC callback for frame A, at frame A's expectedDisplayTime, frame C is on screen.
Yes, that is a conflict, and you should disregard the above statement. It seems like you are after 100% certainty; the above should be true most of the time, but not 100% of the time. As for your example, are you asking whether the 3rd frame can show up on screen before the 1st frame does? Not really.
Oh, it seems like there's disagreement on this topic. Is that statement true in #69?
If you are paused, and the pause has completed and the internal state has stabilized, and then you seek, that statement is true.
In the onpause handler we record currentTime as targetTime, then seek to an adjacent time. In the following onseeked handler, we seek back to targetTime. Can I guarantee this way that the PTS I have is the latest?
I'm going to say probably, but you would have to try it out yourself to see if that algorithm works. You have to seek far enough to cause new frames to show up, which might be visible to the user.
My videos are all constant frame rate, so we know how far to seek to trigger an rVFC / a new frame. (Accuracy is more important; we can sacrifice a little bit of user experience for this.)
Is there a programmatic way to know this state? onpause alone does not mean this state has been reached?
Is it possible that the internal state is not yet stabilized even at that time?
Actually, I'm wondering why we need to care about whether the internal state has stabilized. The seek is kicked off in the onpause handler, and when we get the onseeked event it must be after the completion of the pause (and whether the internal state was stable for that pause no longer matters, since we are already at the next stage, the seek after the pause). Then we seek back to the time at which we paused, and the rVFC of this second seek is what we care about.
I'm not 100% sure, but I assume that the pause can complete (e.g. currentTime will no longer advance), and then a later v-sync updates the current frame to the one matching the paused timestamp. Perhaps there could be a callback in that window whose mediaTime does not yet match currentTime.
You're probably right that the two seeks remove the need to care about the internal state being stable. Just keep in mind that seeked events are fired as soon as the seek operation completes, which is not necessarily after the new frame has been presented.
To avoid a mismatch, we need to make sure the following steps happen sequentially: onpause -> seek -> rVFC -> onseeked -> seek -> rVFC -> onseeked. This can be examined programmatically by checking the mediaTime reported by each rVFC against the time we seeked to, as in the sketch below.
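A minimal sketch of that sequence, assuming a constant-frame-rate source; the helper names and the two-frame seek offset are illustrative, not part of any API or of the code discussed above:

```js
function nextFrameMetadata(video) {
  return new Promise(resolve =>
    video.requestVideoFrameCallback((now, metadata) => resolve(metadata)));
}

function once(target, type) {
  return new Promise(resolve =>
    target.addEventListener(type, resolve, { once: true }));
}

async function latestPresentedTime(video, frameDurationSec) {
  const targetTime = video.currentTime; // the paused position we care about

  // Seek away by a couple of frames to force a new frame to be presented,
  // then wait for both the frame callback and the seeked event.
  let frame = nextFrameMetadata(video);
  let seeked = once(video, 'seeked');
  video.currentTime = targetTime + 2 * frameDurationSec;
  await frame;
  await seeked;

  // Seek back to the target and wait for that frame to be presented too.
  frame = nextFrameMetadata(video);
  seeked = once(video, 'seeked');
  video.currentTime = targetTime;
  const metadata = await frame;
  await seeked;

  // metadata.mediaTime is the PTS of the frame now on screen; it should be
  // within one frame duration of targetTime.
  return metadata.mediaTime;
}
```

Called from a pause handler, the returned value should be the PTS of the frame actually on screen for the paused position, under the assumptions above.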
I would like to ask whether the mediaTime I receive in the requestVideoFrameCallback callback represents the current frame being displayed.
I have a video with a known start timecode and a known frame rate, and I am trying to work out the current SMPTE timecode from the mediaTime data.
It works really well until I get roughly halfway through the video (and it always happens at this particular time). The mediaTime being passed appears to drift in relation to the frame being displayed.
Thanks
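For reference, here is one way to derive an SMPTE timecode from mediaTime, assuming a constant, non-drop-frame rate and a start offset expressed in frames; the function name and parameters are illustrative, not from this thread:

```js
// Illustrative only: convert mediaTime (seconds) into an SMPTE timecode,
// assuming a constant, non-drop-frame rate and a start offset given in frames.
function mediaTimeToSmpte(mediaTimeSec, frameRate, startOffsetFrames = 0) {
  const fps = Math.round(frameRate);
  // mediaTime is the frame's PTS, so for constant-frame-rate video it should
  // sit very close to an exact frame boundary; round to the nearest frame.
  const totalFrames = Math.round(mediaTimeSec * frameRate) + startOffsetFrames;

  const frames = totalFrames % fps;
  const totalSeconds = Math.floor(totalFrames / fps);
  const pad = n => String(n).padStart(2, '0');
  return `${pad(Math.floor(totalSeconds / 3600))}:` +
         `${pad(Math.floor(totalSeconds / 60) % 60)}:` +
         `${pad(totalSeconds % 60)}:${pad(frames)}`;
}

// e.g. mediaTimeToSmpte(3.2, 25) === '00:00:03:05'
```

If the drift grows steadily over time, one common culprit (a guess, without seeing the file) is computing with a nominal 30 fps for a stream that is actually 29.97 fps; that alone accumulates roughly one frame of error every 33 seconds.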