July 19, 2018
Code & Design
Dynamic Album Preview w/ A Music Visualizer
The example below works only if auto-play is turned on. If you’re on Chrome, that’s not likely — I suggest trying the following link on Firefox.
This is a very exciting sprint, as we try to integrate the music discography scene we built in Experiments 1–3 with the simple music visualizer made in Experiment 4.
I don't often recommend this, but if you haven't read the previous articles in this series, they're likely worth your time: I'm using those practice outcomes as this article's starting point. Let's get into it!
My initial approach to this sprint was to move the logic from the visualizer (found here: https://musicworldtest3.neocities.org/) into our single-scene album discography (found here: https://musicworldtest2.neocities.org/). The idea was to refactor the music visualizer logic into a script tag registered as the last component in our index.html, then point that visualizer logic at the A-Frame native sound component (through its ID) to both play & analyze the mp3 file currently attached to that component (recall from our previous articles that "clicking/gazing" an album updated the src attribute of our sound component). Simple enough: instead of creating a new audio context, destination, & audio source, we'll just piggy-back off the one already playing.
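In sketch form, the piggy-back idea looks something like the helper below. To be clear, this is a hypothetical sketch of what I was attempting, not working code: `attachAnalyser` is a name I'm inventing here, and how to actually reach the underlying source node inside A-Frame's sound component is exactly the part that proved unclear.

```javascript
// Hypothetical sketch: reuse an already-playing source instead of
// building a fresh audio graph from scratch.
function attachAnalyser(audioCtx, sourceNode) {
  var analyser = audioCtx.createAnalyser();
  analyser.fftSize = 64;                  // small bin count suits a column visualizer
  sourceNode.connect(analyser);           // tap the existing signal...
  analyser.connect(audioCtx.destination); // ...while keeping it audible
  return analyser;
}

// Intended (never-achieved) usage against the scene's sound entity:
// var soundEl = document.querySelector('#album-sound'); // assumed ID
// var analyser = attachAnalyser(existingCtx, existingSourceNode);
```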
It’s likely my hopeful naivety came off with a bit of irony in that closing sentence, but in case it didn’t I’ll spell it out: this time around I very much failed to leverage the robust Web Audio API effectively.
I ended up wasting a lot of valuable time here, probably upwards of 90 minutes, confounding myself by trying to directly map the little knowledge I had gained in the previous sprint onto this clearly more complicated exercise. I believe the majority of my pain came from dealing with the Web Audio API with an unknown layer of functionality thrown in by the A-Frame <a-sound> component. How could I access this component's audio context? Was it even possible to manipulate audio nodes with that layer in-between?
After losing count of my different approaches, I threw in the towel when, on one iteration, clicking the different albums did switch, play, & analyze the corresponding track; however, the previous track would not stop playing, which led to multiple tracks playing at once & distorting each other while the poor single analyzer node desperately tried to analyze all the incoming gain. Meanwhile the visualizer columns were struggling to keep up with the analysis output. Less than 15 seconds into testing this build, my browser froze, followed by my graphics driver crashing & forcibly restarting my laptop…
Routing our audio visualizer logic through the native <a-sound> component would have to wait. Back to the drawing board…
I was stuck. So I decided to head to the source of it all, the A-Frame documentation, to check if any overlooked component would stimulate some creativity. A curious component caught my eye that I hadn’t used before: the <a-link> native component.
My first thought jumped at the possibility of pushing this sprint through an admittedly monotonous & repetitive solution: we could deploy three individual, separate websites that all function alike & link to each other. Users would visit one site initially, & then be re-routed to another site entirely when they clicked on a different album. My follow-up thought was that this solution was not only mediocre, but that it'd also provide quite an unbearable user experience. Linking to exterior html files was a no-go…
But that led to the first productive alternative: could we somehow link to multiple interior html files? This would require significantly less manual work & provide a better user experience.
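As a sketch, the interior routing could be as simple as a click handler that swaps in a local page per album. The folder layout & the `sceneUrlFor` helper below are my own illustrative assumptions, not necessarily how the files ended up organized:

```javascript
// Assumed layout: one folder per album, each holding its own
// index.html & visualizer.js pair. The helper just builds the
// relative URL for a given album.
function sceneUrlFor(albumIndex) {
  return 'album' + albumIndex + '/index.html';
}

// Hypothetical usage inside the main scene:
// albumEl.addEventListener('click', function () {
//   window.location.href = sceneUrlFor(2); // jump to album 2's scene
// });
```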
A quick ~15 minutes later we had three different sets of "index.html" & "visualizer.js" files, each corresponding to one of the three album states. Gazing at an album still took the user to a new scene with an associated environment & track. Awesome: the track playing confirmed that our Web Audio API & visualizer logic were perfectly intact. All that was left was to generate the entities we'd use to visualize the gain data passed through our analyzer node… aka build the actual visualizer.
For our visualizer, I wanted to entirely surround the user with three concentric circles, each containing differently-sized columns positioned along its perimeter. At this point, it only displayed five sad columns in a straight line along our X-axis.
Fortunately, the front-end logic for placing each ring of columns is pretty straightforward geometry. We want to place columns in a perfect circle around the user at (0,0,0). To do that, we need all N columns to be perfectly equidistant, at distance R, from the user. The trick is to remember that the eye-level axes pair is not X & Y, but X & Z. We can pick an arbitrary distance R, like 20 units, then use simple sin() & cos() functions to calculate the exact X & Z values for circular positioning. An example is given below:
var entityEl = document.createElement('a-entity');
entityEl.setAttribute('position', {
  x: 20 * Math.cos(degreeCounter * (Math.PI / 180)),
  y: 0.5,
  z: 20 * Math.sin(degreeCounter * (Math.PI / 180))
});

Looping through the function above results in a single ring of columns; we want three rings. So the next step is to replicate the loop two more times, each time with a different R value.
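Put together, ring generation can be sketched as a pure position helper plus an entity loop. The `ringPositions` name, the column count, & the three radii below are illustrative assumptions, not the exact values used in the build:

```javascript
// Compute n points on a circle of radius r around the origin,
// spaced evenly on the X/Z (eye-level) plane.
function ringPositions(n, r, y) {
  var positions = [];
  for (var i = 0; i < n; i++) {
    var deg = i * (360 / n);
    positions.push({
      x: r * Math.cos(deg * (Math.PI / 180)),
      y: y,
      z: r * Math.sin(deg * (Math.PI / 180))
    });
  }
  return positions;
}

// Hypothetical usage: three concentric rings at increasing radii.
// [10, 20, 30].forEach(function (r) {
//   ringPositions(12, r, 0.5).forEach(function (pos) {
//     var col = document.createElement('a-entity');
//     col.setAttribute('geometry', { primitive: 'box', height: 1 });
//     col.setAttribute('position', pos);
//     sceneEl.appendChild(col);
//   });
// });
```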
With just under 15 minutes left, we mapped the analyzer node's data output to the height attribute of our visualizer circle columns & went off to Neocities to deploy the following:
As the GIF above indicates, we successfully implemented a beautiful three-ring visualizer that surrounds the user's origin position. Neat! I think it might be time to actually deploy something like the figure above to a custom domain next.
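The height mapping mentioned above can be sketched as a pure scaling step over the analyzer's frequency bins. The `maxHeight` scale, the visibility floor, & the one-bin-per-column pairing are my assumptions for illustration:

```javascript
// Map frequency bins (0-255, the range getByteFrequencyData fills in)
// to column heights in scene units, one bin per column.
function binsToHeights(bins, maxHeight) {
  var heights = [];
  for (var i = 0; i < bins.length; i++) {
    // Keep a small floor so silent columns stay visible.
    heights.push(Math.max(0.1, (bins[i] / 255) * maxHeight));
  }
  return heights;
}

// Hypothetical per-frame usage:
// var bins = new Uint8Array(analyser.frequencyBinCount);
// analyser.getByteFrequencyData(bins);
// binsToHeights(bins, 8).forEach(function (h, i) {
//   columns[i].setAttribute('geometry', 'height', h);
// });
```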
On to WebVR Practice 6!
This experiment can be found here: https://musicworldtest4.neocities.org/
New Public Components & Technologies Used: