WebVR Experiment: #2 — Semi-Static Discography Preview

6/17–3 Hour Sprint

July 8, 2018

Code & Design

Experiment 2 Goal:

Single-User, Dynamic Scene Album Preview:

  • Display Multiple Album Buttons
  • On-Gaze Change Selected Album Box, Album Environment, & Track

Experiment 2 Shipped:

Single-User, Semi-Static Scene Album Preview: https://musicworldtest1.neocities.org/


  • Loads a preset environment with the popular a-environment component
  • Plays a single track, The Ride by Anonymuz, with the native sound component a-sound.
  • Displays multiple album covers spanning Anonymuz’s discography as the material skin for a native a-box.
  • On-Click Change Album Environment & Track — no Selected Album change*


Experiment sprint number two is underway with a sharper focus on a much more manageable scope — an extension of the original sprint’s attempt at creating an immersive multi-album artist discography.

Well, we have one album…so let’s tackle that issue first by simply hard-coding two additional <a-box> components along the same x-axis as our first album component. We’ll need to repeat the previous steps we took for preparing assets, mainly uploading the images to Imgur & creating the appropriate <img> tags within our <a-assets> container.
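As a sketch, the three hard-coded boxes & their asset tags might look something like this — the Imgur URLs, IDs, & positions below are illustrative placeholders, not the actual values used:

```html
<!-- Cover images preloaded in the asset container (URLs are placeholders) -->
<a-assets>
  <img id="cover-1" src="https://i.imgur.com/cover1.jpg">
  <img id="cover-2" src="https://i.imgur.com/cover2.jpg">
  <img id="cover-3" src="https://i.imgur.com/cover3.jpg">
</a-assets>

<!-- Three album boxes along the same x-axis, each skinned with a cover -->
<a-box id="album-1" src="#cover-1" position="-2 1.5 -3"></a-box>
<a-box id="album-2" src="#cover-2" position="0 1.5 -3"></a-box>
<a-box id="album-3" src="#cover-3" position="2 1.5 -3"></a-box>
```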

Design-wise, especially in an interface as new as mixed reality, it’s critical to continuously give users feedback so they can map actions to outputs. For example, since we know the order of the albums & their matching skins/tracks, we instantly know what album is playing. For users, this isn’t immediately obvious — which one of the three boxes in front of them corresponds to the context they currently find themselves in?

The answer we arrived at here is to provide the selected album component with a slightly larger scale (greater depth, height, & width) relative to its peers.
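In markup form, that feedback cue can be as simple as a scale attribute on the currently selected box — the 1.25 factor & the IDs here are illustrative guesses, not the exact values used:

```html
<!-- The selected album is rendered slightly larger than its peers -->
<a-box id="album-1" src="#cover-1" position="-2 1.5 -3" scale="1.25 1.25 1.25"></a-box>
```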

With three albums now neatly displayed in front of us, in less than 30 mins, it’s time to tackle the fun part…

Defining & Handling State

Our A-Frame scene has a default state of Album Selected (the larger box), an associated Album Environment, & an associated Album Track. Now that we have three different album buttons, we have three different possible states. Each of these three states is simply the change we & the user expect to see in our scene when one of the three buttons is clicked. To recap, we need to account for a general state that consists of the following three properties:

  • Album Selected — via the enlarged <a-box> component
  • Track Playing — via the native A-Frame <a-sound> component
  • Album Environment — via the public A-Frame Environment component

One of the core maintainers of A-Frame & an overall WebVR pioneer, Kevin Ngo, as usual, comes in very handy when deciding how to approach the state problem in front of us. He released an A-Frame State component that was quite painless to install; we imported the following two lines into our <head></head> tag & fired up a blank state.js file that would hold our scene’s state:

<script src="https://unpkg.com/aframe-state-component@^3.0.0/dist/aframe-state-component.min.js"></script>
<script src="state.js"></script>

Away we went to our state.js file in order to define our state with its three internal properties that we’ll later bind to our overall scene. Brief familiarity with React/Redux came in quite handy here, as creating the state & its associated dispatchers was straightforward. We soon had our three properties & three action dispatchers neatly summed up in fewer than 25 lines of code.
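The actual state.js isn’t reproduced in this post, but a sketch of it might look roughly like the following. The property & action names are assumptions, & where the real file used three separate dispatchers, this sketch combines them into a single handler for brevity; with the state component, handlers simply mutate the state object in response to emitted actions:

```javascript
// state.js — a sketch of the scene state (property & action names are assumptions).
var sceneState = {
  initialState: {
    selectedAlbum: 'album-1',          // which <a-box> is enlarged
    selectedAlbumTrack: '#track-1',    // src for the <a-sound> component
    selectedAlbumEnvironment: 'forest' // preset for the environment component
  },
  handlers: {
    // Fired when the user clicks/gazes at an album box.
    albumSelected: function (state, action) {
      state.selectedAlbum = action.albumId;
      state.selectedAlbumTrack = action.track;
      state.selectedAlbumEnvironment = action.environment;
    }
  }
};

// Only register when running inside A-Frame, so the file also loads standalone.
if (typeof AFRAME !== 'undefined') {
  AFRAME.registerState(sceneState);
}
```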

We headed back to our index.html file in order to bind our state to the appropriate components — this is done by prepending the "bind__" keyword to a component’s attribute. For example, the sound component:

<a-sound bind__src="value: selectedAlbumTrack" autoplay="true" position="0 2 5"></a-sound>

We quickly wired (bound) our environment & scaled album box to our default state, held our breath, & pressed refresh…bingo! Nothing broke. Visually our A-Frame scene looks the same, but behind the scenes (sorry) we made some pretty big progress, as the default scene is no longer hard-coded but dynamically loaded through our global state defined in state.js. All that’s left now is to create a cursor for our user & to write an event-handler for our <a-box> album components to change our state when the user gazes at an album. We’re moving pretty fast; however, there’s just under an hour left in this sprint.
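For reference, binding the environment follows the same bind__ pattern as the sound component — the state property name here is an assumption:

```html
<!-- Environment preset driven by global state rather than hard-coded -->
<a-entity bind__environment="preset: selectedAlbumEnvironment"></a-entity>
```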

On-Gaze Selected Album Change

The gaze-cursor, which uses the native A-Frame <a-cursor> component, was straightforward in implementation. Now it’s only a matter of attaching event-listeners to our albums that will emit changes to our state. Simple enough.
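A minimal gaze-cursor setup nests the cursor inside the camera; the fuse attribute is what turns a sustained gaze into a click event (the timeout value here is illustrative):

```html
<a-camera>
  <!-- fuse="true" fires a click after gazing for fuse-timeout milliseconds -->
  <a-cursor fuse="true" fuse-timeout="1500"></a-cursor>
</a-camera>
```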

And yet here’s where we finally ran into multiple problems that led to us falling just short of our shipping goal.

First, a quick disclaimer: the A-Frame documentation, in retrospect, certainly does give an example of event-listeners; however, it’s not immediately obvious that this example works with a simple change to the passed parameter. For a good 15-plus minutes I scoured the docs for an answer on the default way to “create a button” from an existing component, with no clear answer. Considering the richness of the A-Frame documentation, I was surprised by this lack of information on such a key user feature.

Fortunately, the GitHub branch contained a gist that more thoroughly explained how event-handling works for components. Simply enough: create a new component &, within the init lifecycle function, use this.el.addEventListener('click', function (evt) { /* our logic here */ }). We named our component “cursor-listener” & wrote a pretty straightforward if-then statement, based on the HTML ID of the <a-box> clicked, that emitted the three state changes required for an album scene change. Now for the moment of truth, with less than 30 minutes left…time to test it out!
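A sketch of that component, under the assumption that actions are dispatched by emitting a bubbling event that the state component’s matching handler picks up — the IDs, track URLs, & environment presets are placeholders:

```javascript
// cursor-listener — sketch of the click handler described above.
var cursorListener = {
  init: function () {
    var el = this.el;
    // Hypothetical lookup table mapping each box's HTML ID to a state payload.
    var albums = {
      'album-1': { albumId: 'album-1', track: '#track-1', environment: 'forest' },
      'album-2': { albumId: 'album-2', track: '#track-2', environment: 'starry' },
      'album-3': { albumId: 'album-3', track: '#track-3', environment: 'egypt' }
    };
    el.addEventListener('click', function () {
      var next = albums[el.getAttribute('id')];
      // Emitting a bubbling event lets the state component's handler run.
      if (next) el.emit('albumSelected', next, true);
    });
  }
};

// Only register when running inside A-Frame.
if (typeof AFRAME !== 'undefined') {
  AFRAME.registerComponent('cursor-listener', cursorListener);
}
```

The component is then attached in markup, e.g. cursor-listener as an attribute on each album’s <a-box>.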

[GIF: the shipped demo in action]

It worked! At least, at first glance it worked well…at second glance it became clear that we have multiple bugs to face in our next sprint. First off, the scale of the selected album, strangely enough, did not change when we clicked on the second, middle album (the leftmost album remained the largest). Second was our delayed realization that gaze wasn’t automatically mapped to a click — it’s probably not obvious from the above GIF, but we’re clicking here, not gazing.

On to WebVR Practice 3!

New Public Components & Technologies Used: