Curious about the Oculus Rift? (Question and Answer thread)

I have an Oculus Rift DK2 and have been keeping up with news about the Oculus Rift over time. If you have any questions regarding development for it, use of it, etc., post them in this thread and I will do my best to answer.

You all seem to be semi-curious about AR and VR recently, so I figured it might help spur your interest further :smile:.

I had a go on one a while ago. It was a sort of rollercoaster simulator where you could look around and experience the ride. It was cool and all, but it made me nauseous. Is this the norm, and if so, do you get used to it after a while?

Also, are there any interesting things you've done with the Oculus Rift that casuals like myself wouldn't know about?


You do get used to the nauseous part of the Oculus Rift. It's just your mind trying to comprehend too many things at once, and when you're exposed to it more often, you get used to it. Oddly enough (at least for me), your mind even develops a bit of a thrill addiction.

The great thing about the Oculus Rift is actually the nausea part. The Rift has made leaps in terms of nausea; it does not make you sick very often anymore. Sickness is more a programming problem than an Oculus Rift problem, and it comes from things such as shaky camera movement in-game (think grenades, or how your head bobs when you run in Battlefield). So I wouldn't expect to get sick very much if you were to buy it. There is a list of suggestions for developers to follow when creating games, which can be found in the Best Practices Guide. I'll post it here so you can read it without downloading it.

Best Practices Guide (January 9, 2015 version, © Oculus VR, LLC)

Executive Summary of Best Practices
Rendering
● Use the Oculus VR distortion shaders. Approximating your own distortion solution, even when
it “looks about right,” is most often discomforting for users.
● Get the projection matrix exactly right and use the default Oculus head model. Any deviation
from the optical flow that accompanies real world head movement creates oculomotor and
bodily discomfort.
● Maintain VR immersion from start to finish – don’t affix an image in front of the user (such as a
full-field splash screen that does not respond to head movements), as this can be disorienting.
● The images presented to each eye should differ only in terms of viewpoint; post-processing
effects (e.g., light distortion, bloom) must be applied to both eyes consistently as well as
rendered in z-depth correctly to create a properly fused image.
● Consider supersampling and/or anti-aliasing to remedy low apparent resolution, which will
appear worst at the center of each eye’s screen.
Minimizing Latency
● Your code should run at a frame rate equal to or greater than the Rift display refresh rate, vsynced
and unbuffered. Lag and dropped frames produce judder which is discomforting in VR.
● Ideally, target 20ms or less motion-to-photon latency (measurable with the Rift’s built-in latency
tester). Organise your code to minimize the time from sensor fusion (reading the Rift sensors) to
rendering.
● Game loop latency is not a single constant and varies over time. The SDK uses some tricks
(e.g., predictive tracking, TimeWarp) to shield the user from the effects of latency, but do
everything you can to minimize variability in latency across an experience.
● Use the SDK’s predictive tracking, making sure you feed an accurate time parameter into the
function call. The predictive tracking value varies based on application latency and must be
tuned per application.
● Consult the OculusRoomTiny source code as an example for minimizing latency and applying
proper rendering techniques in your code.
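(An aside from me, not part of the guide: the predictive-tracking idea above boils down to extrapolating the head pose forward by your pipeline's latency. Here's a toy one-axis sketch; the real SDK fuses sensor data and predicts full orientation quaternions, and all names here are mine, not SDK calls.)

```python
# Toy sketch of predictive tracking on a single rotation axis (yaw).
# Illustrates only the core idea: feed an accurate look-ahead time
# (your motion-to-photon latency) into the prediction.

def predict_yaw(yaw_deg, yaw_velocity_deg_s, latency_s):
    """Extrapolate head yaw to the moment the frame hits the display."""
    return yaw_deg + yaw_velocity_deg_s * latency_s

# Head turning at 90 deg/s; render pipeline latency of 20 ms.
predicted = predict_yaw(10.0, 90.0, 0.020)  # 11.8 degrees
```

If your latency estimate is wrong, the prediction over- or under-shoots, which is exactly why the guide says the value must be tuned per application.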
Optimization
● Decrease eye-render buffer resolution to save video memory and increase frame rate.
● Although dropping display resolution can seem like a good method for improving performance,
the resulting benefit comes primarily from its effect on eye-render buffer resolution. Dropping
the eye-render buffer resolution while maintaining display resolution can improve performance
with less of an effect on visual quality than doing both.
Head-tracking and Viewpoint
● Avoid visuals that upset the user’s sense of stability in their environment. Rotating or moving
the horizon line or other large components of the user’s environment in conflict with the user’s
real-world self-motion (or lack thereof) can be discomforting.
● The display should respond to the user’s movements at all times, without exception. Even in
menus, when the game is paused, or during cutscenes, users should be able to look around.
● Use the SDK’s position tracking and head model to ensure the virtual cameras rotate and move
in a manner consistent with head and body movements; discrepancies are discomforting.
Positional Tracking
● The rendered image must correspond directly with the user’s physical movements; do not
manipulate the gain of the virtual camera’s movements. A single global scale on the entire
head model is fine (e.g. to convert feet to meters, or to shrink or grow the player), but do not
scale head motion independent of inter-pupillary distance (IPD).
● With positional tracking, users can now move their viewpoint to look places you might not have
expected them to, such as under objects, over ledges, and around corners. Consider your
approach to culling and backface rendering, etc.
● Under certain circumstances, users might be able to use positional tracking to clip through the
virtual environment (e.g., put their head through a wall or inside objects). Our observation is
that users tend to avoid putting their heads through objects once they realize it is possible,
unless they realize an opportunity to exploit game design by doing so. Regardless, developers
should plan for how to handle the cameras clipping through geometry. One approach to the
problem is to trigger a message telling them they have left the camera’s tracking volume
(though they technically may still be in the camera frustum).
● Provide the user with warnings as they approach (but well before they reach) the edges of the
position camera’s tracking volume as well as feedback for how they can re-position themselves
to avoid losing tracking.
● We recommend you do not leave the virtual environment displayed on the Rift screen if the user
leaves the camera’s tracking volume, where positional tracking is disabled. It is far less
discomforting to have the scene fade to black or otherwise attenuate the image (such as
dropping brightness and/or contrast) before tracking is lost. Be sure to provide the user with
feedback that indicates what has happened and how to fix it.
● Augmenting or disabling position tracking is discomforting. Avoid doing so whenever possible,
and darken the screen or at least retain orientation tracking using the SDK head model when
position tracking is lost.
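(My aside, not from the guide: the "warn, then fade before tracking is lost" advice can be sketched as a simple distance-to-edge fade. The axis-aligned box model of the tracking volume, the margin, and all names here are illustrative assumptions, not anything from the SDK.)

```python
# Sketch: attenuate scene brightness as the user's head approaches the
# edge of the position camera's tracking volume, modeled here as a
# simple axis-aligned box in meters.

def fade_factor(head_pos, box_min, box_max, warn_margin=0.15):
    """Return 1.0 well inside the volume, falling to 0.0 at the edge."""
    # Distance from the head to the nearest face of the box.
    dist = min(
        min(p - lo, hi - p)
        for p, lo, hi in zip(head_pos, box_min, box_max)
    )
    if dist <= 0.0:          # outside the volume: fully faded
        return 0.0
    if dist >= warn_margin:  # comfortably inside: full brightness
        return 1.0
    return dist / warn_margin  # linear fade inside the warning band

box_min, box_max = (-0.5, -0.5, 0.5), (0.5, 0.5, 2.0)
fade_factor((0.0, 0.0, 1.2), box_min, box_max)   # 1.0, well inside
fade_factor((0.45, 0.0, 1.2), box_min, box_max)  # ~0.33, near the +x face
```

Multiplying scene brightness by this factor gives the gradual darkening the guide recommends instead of an abrupt tracking dropout.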
Accelerations
● Acceleration creates a mismatch among your visual, vestibular, and proprioceptive senses;
minimize the duration and frequency of such conflicts. Make accelerations as short (preferably
instantaneous) and infrequent as you can.
● Remember that “acceleration” does not just mean speeding up while going forward; it refers to
any change in the motion of the user. Slowing down or stopping, turning while moving or
standing still, and stepping or getting pushed sideways are all forms of acceleration.
● Have accelerations initiated and controlled by the user whenever possible. Shaking, jerking, or
bobbing the camera will be uncomfortable for the player.
Movement Speed
● Viewing the environment from a stationary position is most comfortable in VR; however, when
movement through the environment is required, users are most comfortable moving through
virtual environments at a constant velocity. Real-world speeds will be comfortable for longer—
for reference, humans walk at an average rate of 1.4 m/s.
● Teleporting between two points instead of walking between them is worth experimenting with in
some cases, but can also be disorienting. If using teleportation, provide adequate visual cues
so users can maintain their bearings, and preserve their original orientation if possible.
● Movement in one direction while looking in another direction can be disorienting. Minimize the
necessity for the user to look away from the direction of travel, particularly when moving faster
than a walking pace.
● Avoid vertical linear oscillations, which are most discomforting at 0.2 Hz, and off-vertical-axis
rotation, which are most discomforting at 0.3 Hz.
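(Another aside from me: the concrete numbers in this section, roughly 1.4 m/s walking speed and a vertical-oscillation discomfort peak near 0.2 Hz, lend themselves to simple design-time sanity checks. The function names and the tolerance bands are my own illustrative choices, not from any SDK.)

```python
# Toy comfort heuristics built from the figures quoted in the guide:
# average human walking speed ~1.4 m/s; vertical linear oscillation is
# most discomforting near 0.2 Hz.

WALK_SPEED = 1.4         # m/s, average human walking pace
WORST_VERTICAL_HZ = 0.2  # vertical bobbing discomfort peak

def speed_is_comfortable(speed_m_s, tolerance=1.0):
    """Flag locomotion speeds above a real-world walking pace."""
    return speed_m_s <= WALK_SPEED * tolerance

def oscillation_is_risky(freq_hz, band=0.1):
    """Flag vertical oscillation frequencies near the discomfort peak."""
    return abs(freq_hz - WORST_VERTICAL_HZ) <= band

speed_is_comfortable(1.2)   # True: at or below walking pace
oscillation_is_risky(0.25)  # True: close to the 0.2 Hz peak
```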
Cameras
● Zooming in or out with the camera can induce or exacerbate simulator sickness, particularly if
they cause head and camera movements to fall out of 1-to-1 correspondence with each other.
We advise against using “zoom” effects until further research and development finds a
comfortable and user-friendly implementation…
● For third-person content, be aware that the guidelines for accelerations and movements still
apply to the camera regardless of what the avatar is doing. Furthermore, users must always
have the freedom to look all around the environment, which can add new requirements to the
design of your content.
● Avoid using Euler angles whenever possible; quaternions are preferable. Try looking straight
up and straight down to test your camera; it should always be stable and consistent with your
head orientation.
● Do not use “head bobbing” camera effects; they create a series of small but uncomfortable
accelerations.
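(My aside: here's a minimal demonstration of the "prefer quaternions over Euler angles" point. Rotating the forward vector by a quaternion stays stable even looking straight up, which is exactly where Euler pitch hits gimbal lock. This is textbook quaternion math, not Oculus SDK code.)

```python
import math

# Minimal quaternion rotation for a VR-style camera.

def quat_from_axis_angle(axis, angle_rad):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    x, y, z = axis
    s = math.sin(angle_rad / 2.0)
    return (math.cos(angle_rad / 2.0), x * s, y * s, z * s)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = v + w*t + cross(q.xyz, t), with t = 2 * cross(q.xyz, v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

# Pitch the forward vector (0, 0, -1) up by 90 degrees: the camera
# looks straight up (0, 1, 0) with no singularity or flip.
q = quat_from_axis_angle((1.0, 0.0, 0.0), math.pi / 2.0)
up = quat_rotate(q, (0.0, 0.0, -1.0))  # ~(0, 1, 0)
```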
Managing and Testing Simulator Sickness
● Test your content with a variety of un-biased users to ensure it is comfortable to a broader
audience. As a developer, you are the worst test subject. Repeated exposure to and familiarity
with the Rift and your content makes you less susceptible to simulator sickness or content
distaste than a new user.
● People’s responses and tolerance to sickness vary, and visually induced motion sickness
occurs more readily in virtual reality headsets than with computer or TV screens. Your
audience will not “muscle through” an overly intense experience, nor should they be expected
to do so.
● Consider implementing mechanisms that allow users to adjust the intensity of the visual
experience. This will be content-specific, but adjustments might include movement speed, the
size of accelerations, or the breadth of the displayed FOV. Any such settings should default to
the lowest-intensity experience.
● For all user-adjustable settings related to simulator sickness management, users may want to
change them on-the-fly (for example, as they become accustomed to VR or become fatigued).
Whenever possible, allow users to change these settings in-game without restarting.
● An independent visual background that matches the player’s real-world inertial reference frame
(such as a skybox that does not move in response to controller input but can be scanned with
head movements) can reduce visual conflict with the vestibular system and increase comfort
(see Appendix G for details).
● High spatial frequency imagery (e.g., stripes, fine textures) can enhance the perception of
motion in the virtual environment, leading to discomfort. Use—or offer the option of—flatter
textures in the environment (such as solid-colored rather than patterned surfaces) to provide a
more comfortable experience to sensitive users.
Degree of Stereoscopic Depth (“3D-ness”)
● For individualized realism and a correctly scaled world, use the middle-to-eye separation
vectors supplied by the SDK from the user’s profile.
● Be aware that depth perception from stereopsis is sensitive up close, but quickly diminishes
with distance. Two mountains miles apart in the distance will provide the same sense of depth
as two pens inches apart on your desk.
● Although increasing the distance between the virtual cameras can enhance the sense of depth
from stereopsis, beware of unintended side effects. First, this will force users to converge their
eyes more than usual, which could lead to eye strain if you do not move objects farther away
from the cameras accordingly. Second, it can give rise to perceptual anomalies and discomfort
if you fail to scale head motion equally with eye separation.
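(My aside: the "stereopsis fades with distance" point follows from plain geometry. The vergence angle between the two eyes' lines of sight shrinks roughly as IPD over distance, so two mountains miles apart subtend essentially zero disparity. The 64 mm IPD below is a typical value I'm assuming, not an SDK constant.)

```python
import math

IPD = 0.064  # meters, a typical inter-pupillary distance (assumed)

def vergence_angle_deg(distance_m, ipd=IPD):
    """Angle between the two eyes' lines of sight to a point ahead."""
    return math.degrees(2.0 * math.atan(ipd / (2.0 * distance_m)))

vergence_angle_deg(0.5)     # ~7.3 degrees: a strong depth signal
vergence_angle_deg(1000.0)  # ~0.004 degrees: effectively flat
```

This is also why widening the virtual camera separation "enhances" depth: it inflates this angle, with the eye-strain side effects the guide warns about.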
User Interface
● UIs should be a 3D part of the virtual world and sit approximately 2-3 meters away from the
viewer—even if it’s simply drawn onto a flat polygon, cylinder, or sphere that floats in front
of the user.
● Don’t require the user to swivel their eyes in their sockets to see the UI. Ideally, your UI should
fit inside the middle 1/3rd of the user’s viewing area; otherwise, they should be able to examine
it with head movements.
● Use caution for UI elements that move or scale with head movements (e.g., a long menu that
scrolls or moves as you move your head to read it). Ensure they respond accurately to the
user’s movements and are easily readable without creating distracting motion or discomfort.
● Strive to integrate your interface elements as intuitive and immersive parts of the 3D world. For
example, ammo count might be visible on the user’s weapon rather than in a floating HUD.
● Draw any crosshair, reticle, or cursor at the same depth as the object it is targeting; otherwise, it
can appear as a doubled image when it is not at the plane of depth on which the eyes are
converged.
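(My aside: you can check the "middle 1/3rd of the viewing area" rule with basic trigonometry. The 100-degree FOV and the panel sizes below are illustrative assumptions; only the 2-3 m distance comes from the guide.)

```python
import math

# Sketch: does a flat UI panel fit inside the middle third of the
# user's field of view at a given distance?

def panel_angular_width_deg(panel_width_m, distance_m):
    """Horizontal angle the panel subtends at the viewer's eye."""
    return math.degrees(2.0 * math.atan(panel_width_m / (2.0 * distance_m)))

def fits_middle_third(panel_width_m, distance_m, fov_deg=100.0):
    return panel_angular_width_deg(panel_width_m, distance_m) <= fov_deg / 3.0

fits_middle_third(1.0, 2.5)  # True: ~22.6 degrees, under ~33
fits_middle_third(4.0, 2.5)  # False: ~77.3 degrees
```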
Controlling the Avatar
● User input devices can’t be seen while wearing the Rift. Allow the use of familiar controllers as
the default input method. If a keyboard is absolutely required, keep in mind that users will have
to rely on tactile feedback (or trying keys) to find controls.
● Consider using head movement itself as a direct control or as a way of introducing context
sensitivity into your control scheme.
Sound
● When designing audio, keep in mind that the output source follows the user’s head movements
when they wear headphones, but not when they use speakers. Allow users to choose their
output device in game settings, and make sure in-game sounds appear to emanate from the
correct locations by accounting for head position relative to the output device.
● Presenting NPC (non-player character) speech over a central audio channel or left and right
channels equally is a common practice, but can break immersion in VR. Spatializing audio,
even roughly, can enhance the user’s experience.
● Keep positional tracking in mind with audio design; for example, sounds should get louder as
the user leans towards their source, even if the avatar is otherwise stationary.
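(My aside: the lean-in loudness point just means driving audio gain from the tracked head position rather than the avatar position. The inverse-distance model below is a common audio convention, and every name here is mine, not Oculus-specific.)

```python
import math

# Sketch: compute a gain from the tracked head position so leaning
# toward a source makes it louder, even while the avatar stands still.

def gain_for(head_pos, source_pos, ref_distance=1.0):
    """Inverse-distance gain, clamped to 1.0 inside the reference radius."""
    d = math.dist(head_pos, source_pos)
    return 1.0 if d <= ref_distance else ref_distance / d

base = gain_for((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))    # 0.5
leaned = gain_for((0.5, 0.0, 0.0), (2.0, 0.0, 0.0))  # ~0.67: louder
```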
Content
● For recommendations related to distance, one meter in the real world corresponds roughly to
one unit of distance in Unity.
● The optics of the DK2 Rift make it most comfortable to view objects that fall within a range of
0.75 to 3.5 meters from the user’s eyes. Although your full environment may occupy any range
of depths, objects at which users will look for extended periods of time (such as menus and
avatars) should fall in that range.
● Converging the eyes on objects closer than the comfortable distance range above can cause
the lenses of the eyes to misfocus, making clearly rendered objects appear blurry as well as
lead to eyestrain.
● Bright images, particularly in the periphery, can create noticeable display flicker for sensitive
users; if possible, use darker colors to prevent discomfort.
● A virtual avatar representing the user’s body in VR can have pros and cons. On the one hand,
it can increase immersion and help ground the user in the VR experience, when contrasted to
representing the player as a disembodied entity. On the other hand, discrepancies between
what the user’s real-world and virtual bodies are doing can lead to unusual sensations (for
example, looking down and seeing a walking avatar body while the user is sitting still in a chair).
Consider these factors in designing your content.
● Consider the size and texture of your artwork as you would with any system where visual
resolution and texture aliasing is an issue (e.g. avoid very thin objects).
● Unexpected vertical accelerations, like those that accompany traveling over uneven or
undulating terrain, can create discomfort. Consider flattening these surfaces or steadying the
user’s viewpoint when traversing such terrain.
● Be aware that your user has an unprecedented level of immersion, and frightening or shocking
content can have a profound effect on users (particularly sensitive ones) in a way past media
could not. Make sure players receive warning of such content in advance so they can decide
whether or not they wish to experience it.
● Don’t rely entirely on the stereoscopic 3D effect to provide depth to your content; lighting,
texture, parallax (the way objects appear to move in relation to each other when the user
moves), and other visual features are equally (if not more) important to conveying depth and
space to the user. These depth cues should be consistent with the direction and magnitude of
the stereoscopic effect.
● Design environments and interactions to minimize the need for strafing, back-stepping, or
spinning, which can be uncomfortable in VR.
● People will typically move their heads/bodies if they have to shift their gaze and hold it on a
point farther than 15-20° of visual angle away from where they are currently looking. Avoid
forcing the user to make such large shifts to prevent muscle fatigue and discomfort.
● Don’t forget that the user is likely to look in any direction at any time; make sure they will not
see anything that breaks their sense of immersion (such as technical cheats in rendering the
environment).
Health and Safety
● Carefully read and implement the warnings that accompany the Rift (Appendix L) to ensure the
health and safety of both you, the developer, and your users.
● Refrain from using any high-contrast flashing or alternating colors that change with a frequency
in the 1-30 Hz range. This can trigger seizures in individuals with photosensitive epilepsy.
● Avoid high-contrast, high-spatial-frequency gratings (e.g., fine, black-and-white stripes), as they
can also trigger epileptic seizures.
● SDK rendered applications will automatically implement a “Health and Safety Warning” flash
screen that appears on startup of your content. If using app rendering, you must implement the
flash screen yourself.

As you can see, a lot is done to help ensure you do not feel sick. I have personally only felt sick once, and I attribute that more to my eating habits than to the Rift.

As for anything interesting: when you fall in a game, you get a tingling sensation as if you were actually falling. Horror games are also very, very good on a headset, as it really does immerse you quite nicely. There are many things to do with the Oculus Rift outside of games too!

Check out

Experience Japan

and Titans of Space

Everything also scales very nicely and seems very realistic. Mammoths in Skyrim were astounding to look at. Turns out they would be pretty scary to fight in real life!

Last time I went to buy one, their website was so hard to navigate and get the proper info from.

Where is the “add to cart” button so I can properly purchase one? I.e., a website link to purchase it.

https://www.oculus.com/order/

https://www.oculus.com/order/

Figures.

I’ll be ordering one shortly. @Manga any game recommendations?

Actually, you should wait! The Oculus Rift consumer version is coming out pre-2016, so this year. Unless you have plans to develop games or applications for it, that is.

For games I would suggest


Windlands:

http://youtu.be/Lj_Yqr9WbCw

Dreadhalls can be fun:

http://youtu.be/fl7fz__6B-4

Blue Marble:

http://youtu.be/emKab5REmMk

Birdy King Land (a bit strange):


http://youtu.be/5NpXh7Wx7AI

and Euro Truck Simulator is interesting!

For already-released games that work well: Skyrim, Outlast, and Mirror’s Edge. Know that I use Vorpx for these, however, which is a driver purchased separately from the Oculus Rift that hooks into DirectX 9-11 games and adds a stereoscopic 3D mode. For a full list of games Vorpx may help with, look here. Note that it costs $40.

Scariest game so far that works very well? Mental Torture.

Here are some cool things to look forward to!


http://youtu.be/jLp3W1gbhRk

https://www.youtube.com/watch?v=y90hK650xOQ

The top video involves a LEAP, which costs around $70-80.
Also know that the realism the Rift is able to display is insane. When you do get the Oculus Rift, there is a demo in the runtime download that shows this off. It is very, very realistic.

The Stanley Parable is also good fun if you are into that.

Had to break the post into two separate parts due to video limits.