
// Art & Design

Tomorrow in Montreal, audience members at the IX Symposium will see one of Jupiter’s moons appear inside a 60-foot dome. But this isn’t something you could spot through a telescope – it’s a trippy virtual environment with stark geometric shapes and classical forms.

We at Thomas Street have been eyeing the Oculus Rift for quite some time, paying particular attention to demos featuring novel interfaces. We all grew tired of talking about how we wanted to explore VR development, so we allocated several weeks to tinkering with the Oculus Rift and Leap Motion – staffing one full-time developer and a […]

A few weeks back, a first-person VR animation experiment hit Reddit. True to form, the ever-investigative VR community immediately began unpacking the possibilities a tool like this could bring to the field of animation. Does virtual reality have the potential to unlock new technical and artistic workflows? What new freedoms (or constraints) does it offer creative professionals? Could this proof of concept be transformed into actual software in the near future?

One of the most exciting things about VR is its power to play tricks on the mind. From creating new senses to improving old ones, here are four ways that VR developers are experimenting with human perception.

Hand tracking and virtual reality are both emerging technologies, and combining the two into a fluid and seamless experience can be a real challenge. This month, we’re exploring the bleeding edge of VR design with a closer look at our VR Best Practices Guidelines.

Jody Medich is a UX designer and researcher who believes that the next giant leap in technology involves devices and interfaces that can “speak human.” In this essay, she asks how a 3D user interface could transform how we explore and understand content – by giving our brains a whole new dimension of working memory.

Once the most underrated element of virtual reality, sound is now widely recognized as essential to creating VR with “presence.” In this post, we take a look at four ways that sound, VR, and motion controls can be a powerful combination.

In yesterday’s post, I talked about the need for 3D design tools for VR that can match the power of our imaginations. After being inspired by street artists like Sergio Odeith, I made sketches and notes outlining the functionality I wanted. From there I researched the space, hoping that someone had created and released exactly what I was looking for. Unfortunately, I didn’t find it: either the output wasn’t compatible with the DK2, the system was extremely limited, the input relied on a device I didn’t own, or it was prohibitively expensive.

What if you could create art outside the boundaries of physics, but still within the real world? For artists like Sergio Odeith, this means playing tricks with perspective. Sergio makes stunning anamorphic (3D-perspective-based) art using spray paint, a surface with a right angle, and his imagination.
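The geometric trick behind anamorphic art can be sketched in a few lines. This is an illustrative reconstruction, not Odeith’s actual process: pick the one viewpoint the piece should work from, cast a ray from that eye point through each point of the imagined 3D object, and paint wherever the ray hits the flat surface. From that single vantage point, the flat marks line up into the intended shape.

```python
def project_to_plane(eye, point, plane_z=0.0):
    """Intersect the ray from `eye` through `point` with the
    horizontal surface z == plane_z (e.g. the floor being painted).
    Returns the spot on the surface where that 3D point should be drawn."""
    ex, ey, ez = eye
    px, py, pz = point
    if ez == pz:
        raise ValueError("ray is parallel to the plane")
    t = (plane_z - ez) / (pz - ez)   # ray parameter where it crosses the plane
    return (ex + t * (px - ex), ey + t * (py - ey), plane_z)

# An eye 2m up looking at a point floating 1m up: the floor mark lands
# twice as far away as the point appears to be.
mark = project_to_plane((0, 0, 2.0), (1, 0, 1.0))
```

That stretching away from the viewer is exactly the distortion you see when an anamorphic street piece is photographed from anywhere other than its intended viewpoint.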

Creative 3D thinkers like Odeith should have the ability to use their freehand art skills to craft beautiful volumetric pieces. Not just illusions on the corners of walls, but three-dimensional works that people can share the same space with. This was what inspired me to create Graffiti 3D – a VR demo that I entered into the Leap Motion 3D Jam. It’s available free for Windows, Mac, and Linux on my itch.io site.
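The core loop of a freehand drawing tool like this is simpler than it sounds. As a minimal sketch (not the actual Graffiti 3D source), a typical approach is to sample the tracked fingertip position every frame and keep a new point only once the finger has moved a minimum distance from the last kept point – filtering out tracking jitter and leaving a clean polyline that can later be extruded into a ribbon or tube mesh. The function name and threshold here are illustrative assumptions.

```python
import math

def capture_stroke(samples, min_step=0.005):
    """Downsample raw fingertip positions (in metres) into a stable polyline.
    A sample is kept only once the finger has moved at least `min_step`
    away from the last kept point, so hand tremor doesn't pollute the stroke."""
    stroke = []
    for p in samples:
        if not stroke or math.dist(p, stroke[-1]) >= min_step:
            stroke.append(p)
    return stroke
```

`math.dist` requires Python 3.8+. In a real app the resulting polyline would be rebuilt into geometry each frame so the stroke appears to flow out of the fingertip.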

What if you could disassemble a robot at a touch? Motion control opens up exciting possibilities for manipulating 3D designs, with VR adding a whole new dimension to the mix. Recently, Battleship VR and Robot Chess developer Nathan Beattie showcased a small CAD experiment at the Avalon Airshow. Supported by the School of Engineering, Deakin University, the demo lets users take apart a small spherical robot created by engineering student Daniel Howard.

Nathan has since open-sourced the project, although the laboratory environment is only available in the executable demo for licensing reasons. Check out the source code at github.com/Zaeran/CAD-Demo.
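The actual demo is built in Unity, but the grab logic at the heart of “take apart a robot at a touch” can be sketched language-agnostically. Assuming a pinch gesture (thumb and index fingertips close together) as the grab trigger – a common Leap Motion pattern, not necessarily Nathan’s exact implementation – the logic reduces to: detect the pinch, then attach the nearest part within reach. All names and thresholds below are illustrative.

```python
import math

PINCH_THRESHOLD = 0.03   # max metres between thumb and index tips for a pinch
GRAB_RADIUS = 0.1        # how close the pinch must be to a part's centre

def try_grab(thumb_tip, index_tip, part_centers):
    """Return the index of the grabbed part, or None.
    A grab requires a pinch gesture occurring near enough to a part."""
    if math.dist(thumb_tip, index_tip) > PINCH_THRESHOLD:
        return None                      # fingers apart: no pinch, no grab
    pinch = tuple((a + b) / 2 for a, b in zip(thumb_tip, index_tip))
    best, best_d = None, GRAB_RADIUS
    for i, centre in enumerate(part_centers):
        d = math.dist(pinch, centre)
        if d <= best_d:
            best, best_d = i, d          # closest part within reach wins
    return best
```

Once a part is grabbed it simply follows the pinch point until the fingers separate, which is what makes disassembly feel direct rather than menu-driven.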

The “Augmented Hand Series” (by Golan Levin, Chris Sugrue, and Kyle McDonald) is a real-time interactive software system that presents playful, dreamlike, and uncanny transformations of its visitors’ hands. It consists of a box into which the visitor inserts their hand, and a screen which displays their ‘reimagined’ hand—for example, with an extra finger, or with fingers that move autonomously. Critically, the project’s transformations operate within the logical space of the hand itself, which is to say: the artwork performs “hand-aware” visualizations that alter the deep structure of how the hand appears.
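What “operating within the logical space of the hand” means can be illustrated with a toy example – this is not the artists’ code, just a sketch of the idea. Because the system works with the hand’s own structure (fingers in a hand-local frame) rather than raw pixels, an edit like “add an extra finger” becomes a simple structural operation: synthesize a new fingertip between two existing neighbours.

```python
def add_extra_finger(fingertips, between=(1, 2)):
    """Toy 'hand-aware' transformation: insert a synthetic fingertip
    midway between two existing ones (default: index and middle),
    working entirely in the hand's own coordinate frame."""
    i, j = between
    a, b = fingertips[i], fingertips[j]
    extra = tuple((x + y) / 2 for x, y in zip(a, b))
    return fingertips[:j] + [extra] + fingertips[j:]
```

The real artwork goes much further – re-rendering a full textured hand mesh in real time – but the principle is the same: transformations are expressed on the hand’s structure, so the result still reads as *your* hand.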