
Measuring Latency on HMDs

As latency is a big concern in virtual reality applications, I believe it would be highly beneficial to add a way to measure the latency into the (output) device itself. Here I want to describe how I think such latency measurements could be done live for HMDs. The technique is not limited to HMDs and could be used on any type of combined input-output device (e.g. touch screens), but HMDs might benefit the most, as applications using them are very sensitive to latency.

State-of-the-art latency tests

This idea came to me when I thought about how to measure the latency of the Oculus Rift. You can use a high-speed camera, as I have described in an earlier blogpost. The advantage is that this technique works with every device, but each test takes some time and you can’t use the HMD during testing.

An alternative – at least for the Rift – is the latency tester from Oculus itself. It works by sending a USB command to the computer, which the application answers by drawing a coloured square in one “eye”. The latency tester detects this colour change with a photodiode and then shows the measured latency on its display (multiple runs are performed, of course). While the testing is automated, the Rift still can’t be used while testing.
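To make this handshake concrete, here is a minimal sketch of the application side in C. The helpers latency_tester_poll() and draw_test_square() are hypothetical stand-ins, not the actual Oculus SDK API:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical helpers -- not the real Oculus SDK API. */
    extern bool latency_tester_poll(uint32_t *requested_colour); /* USB in  */
    extern void draw_test_square(uint32_t colour);               /* one eye */

    void render_frame(void)
    {
        uint32_t colour;

        /* If the tester requested a colour change this frame, draw the
         * square its photodiode will detect. */
        if (latency_tester_poll(&colour))
            draw_test_square(colour);

        /* ... normal scene rendering ... */
    }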

So here’s an idea for an in-device test:

The application sends a command via USB to the HMD to start the test (let’s call this the START_TEST command). The HMD stores its internal timestamp and sends a command back (the DRAW_MARKER command). The application parses the DRAW_MARKER command together with the other USB signals (head rotation etc.), so it takes the same “route” through the application as all the user inputs. What changes in the rendering is the colour of the top-left pixel: the renderings for the Rift are normally surrounded by a black border due to the distortion shader, but even on HMDs that use the whole screen, changing one pixel (which should normally be black so the test can’t trigger by accident) won’t be noticed by the user. So let’s say the final shader – the distortion shader – changes it from black to white. The HMD parses the incoming pixel stream – at least the first pixel – for this magic colour value, and once it is detected, it calculates how long it has waited for this to happen. Remember, it stored the internal time when it sent the DRAW_MARKER command! Now it can send the time difference back in a LATENCY_RESULT command.
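To make the flow concrete, here is a minimal firmware-side sketch in C. The command IDs, the marker colour and the hooks clock_us(), usb_send() and first_pixel() are all assumptions for illustration – a real device would define its own protocol:

    #include <stdbool.h>
    #include <stdint.h>

    /* Made-up command IDs for the proposed protocol. */
    enum {
        CMD_START_TEST     = 0x01, /* host -> HMD: begin a measurement   */
        CMD_DRAW_MARKER    = 0x02, /* HMD -> host: please flip the pixel */
        CMD_LATENCY_RESULT = 0x03, /* HMD -> host: measured time in us   */
    };

    #define MARKER_COLOUR 0xFFFFFFu /* white; the pixel is black otherwise */

    /* Hypothetical firmware hooks. */
    extern uint32_t clock_us(void);    /* internal HMD clock               */
    extern void     usb_send(uint8_t cmd, uint32_t payload);
    extern uint32_t first_pixel(void); /* top-left pixel of current frame  */

    static bool     test_running = false;
    static uint32_t test_start_us;

    /* Called when the application issues START_TEST. */
    void on_start_test(void)
    {
        test_start_us = clock_us();   /* store the internal timestamp     */
        test_running  = true;
        usb_send(CMD_DRAW_MARKER, 0); /* travels alongside sensor reports */
    }

    /* Called by the display controller for every incoming frame. */
    void on_frame_received(void)
    {
        if (test_running && first_pixel() == MARKER_COLOUR) {
            test_running = false;
            usb_send(CMD_LATENCY_RESULT, clock_us() - test_start_us);
        }
    }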

What is not measured is the additional time the display needs to change its colour, as well as the delay of the sensors themselves. Both values, however, are constants for one specific HMD and can be added to the return value of the LATENCY_RESULT command. If a higher frequency of latency measurements is desirable, each DRAW_MARKER can also carry a unique ID, and the colour of the marker pixel can encode this ID (more than one pixel can be used as well).
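A small sketch of both corrections, continuing the C example above – the two constants are purely illustrative and would have to be measured once per device:

    #include <stdint.h>

    /* Per-device constants -- illustrative numbers, not real measurements. */
    #define DISPLAY_SWITCH_US 4000 /* time the panel needs to change colour */
    #define SENSOR_DELAY_US   1000 /* delay of the sensors themselves       */

    /* Full latency = measured round trip + the two constant offsets. */
    static uint32_t total_latency_us(uint32_t latency_result_us)
    {
        return latency_result_us + DISPLAY_SWITCH_US + SENSOR_DELAY_US;
    }

    /* Encode an 8-bit marker ID in the green channel; red and blue stay
     * at 0xFF so a normal black frame can never match by accident. */
    static uint32_t marker_colour(uint8_t id)
    {
        return 0xFF00FFu | ((uint32_t)id << 8);
    }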

This kind of testing can be done at runtime, while a user is wearing the HMD, and won’t be noticed, as this part of the screen is often not visible at all (e.g. in the Rift) or is at least very small.

Advantages of in-device testing

Obviously this is useful during development to get feedback on the latency at all times. But it can also have benefits for the users: they can test which settings on their PC have an influence on latency, how fast individual games are and how low the latency has to be for them not to get sick. The application can also use the real-time latency to adjust the prediction filter to the actual latency! This filter (as implemented in the Oculus Rift SDK) can predict where the user’s head will be in a specified amount of time. If set correctly (but only for small values), this can increase the immersion. The problem is that the setting has to match the real latency as closely as possible – real-time measurements would provide the correct value here. The application could also reduce rendering effects to try to lower the latency, or at least warn the user when the latency gets too high that the experience will suffer and the risk of motion sickness is increased.
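As a sketch of how an application might react to live measurements – the hooks and the 40 ms threshold are made up for illustration:

    #include <stdint.h>

    #define LATENCY_WARN_US 40000 /* illustrative threshold, ~40 ms */

    /* Hypothetical application hooks. */
    extern void set_prediction_time_us(uint32_t us);
    extern void reduce_render_effects(void);
    extern void warn_user_about_latency(uint32_t us);

    /* Called whenever a LATENCY_RESULT arrives from the HMD. */
    void on_latency_result(uint32_t latency_us)
    {
        /* Predict the head pose exactly as far ahead as we lag behind. */
        set_prediction_time_us(latency_us);

        if (latency_us > LATENCY_WARN_US) {
            reduce_render_effects(); /* try to win some time back */
            warn_user_about_latency(latency_us);
        }
    }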

Update 4/21/2014: Looks like the Oculus Rift DK2 does the latency test just as proposed here, by analysing the colour of the first pixel in the display controller and sending the value back with a timestamp to the application. Check out the GDC talk, beginning at 50:00.

