See The Sound: The Code Behind Music Visualizers

Music visualization is the process of generating animated imagery based on a piece of music. The job of a visualizer is (usually) to perform this process in real time for any given song. This topic has always fascinated me — I still remember being blown away by the Windows Media Player visualizer when I was 12.

this hit wayyy different in 2006.

This is a really great curated list of audio visualizers by GitHub user willianjusten. You’ll notice that many of them use some variation of WebGL and/or the Web Audio API. WebGL is a JavaScript API for real-time rendering of 2D and 3D graphics. It’s compatible with browsers on everything from a desktop to a smartphone, and it supports hardware acceleration, which means the device’s GPU can be utilized to create faster, more complex animations. It’s not easy to learn, which is why there are several JavaScript libraries dedicated to making WebGL more accessible (three.js and Babylon.js are two of the most popular). Here’s an example of a visualizer created using some of the tools mentioned so far:
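To make the WebGL part concrete: under the hood, WebGL mostly draws triangles from flat arrays of vertex coordinates in “clip space” (x and y both run from −1 to 1), and a bar-style visualizer is just one quad — two triangles — per frequency band. Here’s a rough sketch of building that vertex array on the CPU (the function name and layout are my own invention, not part of any library); a real app would upload it with `gl.bufferData` and redraw every frame as the bar heights change:

```javascript
// Build clip-space vertices for a row of bars. `heights` is an array
// of values in [0, 1]; each bar becomes two triangles (6 vertices,
// 12 floats). `gap` is the fraction of each bar's slot left empty.
function barVertices(heights, gap = 0.1) {
  const n = heights.length;
  const slot = 2 / n;                     // each bar's share of [-1, 1]
  const barWidth = slot * (1 - gap);
  const verts = new Float32Array(n * 12); // 6 vertices * 2 coords per bar
  heights.forEach((h, i) => {
    const x0 = -1 + slot * i;             // left edge of this bar
    const x1 = x0 + barWidth;             // right edge
    const y0 = -1;                        // bars grow up from the bottom
    const y1 = -1 + 2 * h;                // h = 1 reaches the top of the canvas
    // two triangles covering the quad (x0,y0)-(x1,y1)
    verts.set([x0, y0, x1, y0, x0, y1, x0, y1, x1, y0, x1, y1], i * 12);
  });
  return verts;
}
```

The actual drawing — compiling shaders, binding the buffer, calling `gl.drawArrays` — is the part libraries like three.js hide from you, which is exactly why they’re popular.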

12 year old me would’ve said this is “sooo trippy” despite having no idea what that meant.

The Web Audio API, on the other hand, is a JavaScript API designed to handle (you guessed it) audio. This powerful tool can be used to synthesize & process audio, mimicking functions you would find in modern game audio & music production software. Combined with WebGL, it can give your visualizer an extra layer of interactivity (such as letting listeners apply filters & effects to the music) or even turn your browser into an instrument.
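The piece of the Web Audio API that visualizers lean on most is the `AnalyserNode`, which exposes the frequency spectrum of whatever is currently playing. The browser wiring below is sketched in comments (the element id `"player"` is a made-up example); the runnable part is the plain-JavaScript step of squeezing the raw frequency bins down to a handful of bar heights:

```javascript
// Browser-side wiring (sketch, not runnable outside a page):
//
//   const ctx = new AudioContext();
//   const source = ctx.createMediaElementSource(document.getElementById("player"));
//   const analyser = ctx.createAnalyser();
//   analyser.fftSize = 256;               // gives 128 frequency bins
//   source.connect(analyser).connect(ctx.destination);
//
//   const bins = new Uint8Array(analyser.frequencyBinCount);
//   analyser.getByteFrequencyData(bins);  // refresh this every animation frame
//
// Averaging those bins into a few bars, normalized to [0, 1]:
function barHeights(bins, barCount) {
  const perBar = Math.floor(bins.length / barCount);
  const heights = [];
  for (let i = 0; i < barCount; i++) {
    let sum = 0;
    for (let j = 0; j < perBar; j++) sum += bins[i * perBar + j];
    heights.push(sum / perBar / 255); // byte frequency values run 0–255
  }
  return heights;
}
```

In a real visualizer you’d call `getByteFrequencyData` and then something like `barHeights` inside a `requestAnimationFrame` loop, and use the results to scale whatever you’re drawing — which is where the WebGL side comes back in.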

me after learning hedwig’s theme

There are a huge number of resources out there regarding visualization, so this is really just the tip of the iceberg. I would encourage anyone interested in learning more to browse the previously mentioned GitHub list and find a developer who inspires you (my current favorite is Jordan Machado)!