Spatial Music Visualizer

Inside Music is a Google WebVR Experiment that lets you step inside a song for a closer look at how music is made. As a bonus, the audio is spatialized, so you get a completely different listening experience from a normal stereo mix.

Open the Song Visualizer in a new tab: https://sonicviz.gitlab.io/sonicviz-spatial-music/
You can move around using the WASD keys and mouse, just like standard first-person game controls.
Note: Best used with the Google Chrome browser with no other tabs open.

“Interaction
Select a song from the menu. The stems of the song will appear in a circle around you, each represented by a sphere. In 360 Mode, tap the spheres to turn them on or off. In VR Mode, you can use your controller to toggle their state. On Google Cardboard, you will have a reticle (a small circle in front of your eye) which can be used to turn the stems on and off.”
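The interaction described above boils down to two pieces of state per stem: a position on a circle around the listener and a mute flag. Here is a minimal TypeScript sketch of that logic (the names `Stem`, `layoutStems`, and `toggleStem` are illustrative, not from the Inside Music source):

```typescript
// Illustrative sketch of the stem layout and toggle logic, not the app's code.

interface Stem {
  name: string;
  x: number; // position on the horizontal plane, listener at the origin
  z: number;
  muted: boolean;
}

// Place n stems evenly on a circle of the given radius around the listener.
function layoutStems(names: string[], radius: number): Stem[] {
  return names.map((name, i) => {
    const angle = (2 * Math.PI * i) / names.length;
    return {
      name,
      x: radius * Math.cos(angle),
      z: radius * Math.sin(angle),
      muted: false,
    };
  });
}

// Toggling a stem flips its muted flag. In a real Web Audio implementation
// the returned value would drive a GainNode (gain 0 or 1) feeding a
// spatializer such as a PannerNode positioned at (x, z).
function toggleStem(stem: Stem): number {
  stem.muted = !stem.muted;
  return stem.muted ? 0 : 1; // gain to apply
}
```

In the browser, each stem's `(x, z)` position would be handed to the spatial audio layer so that muting and unmuting a sphere audibly removes or restores that instrument in its place around you.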

I thought it would be a good opportunity to pull it apart and test it with a couple of my own songs:

There’s huge potential for spatial music to revolutionize music production and delivery, and we’re only just getting started. For more on this, see my blog post “Immersive Audio and Musical AI”.

There’s a bit of a process to go through, including configuring your development workflow and tools, but in the end it’s a pretty cool way of getting inside the music. I also used it as an opportunity to test GitLab CI and Pages hosting.

The next step will be to extend it with some custom visualizations and refine the asset pipeline workflow. I’ve had a similar concept in mind for Unity3D, so I’ll probably build that at some point.

See also: https://www.canvas.co.com/creations/3901

Streaming Live Music

A long-term action research project on networked virtual live-streamed music performance, virtual experiences, and interfaces:
solo performances and collaborative real-time networked music jams with musicians around the planet.

Live music has long been the “killer app” of the Second Life online social experience, acting as the glue that brings people together for shared experiences.
I’ve produced and performed at 1000+ events in SL and have learned a lot over that time in areas as varied as:

  • Virtual event production
  • Virtual set production
  • Platform evaluation
  • Technology strategy
  • Streaming Audio and Video
  • Virtual platform economics
  • Virtual event management

See also:

I developed an interactive, AI-controlled rhythm section that responds to my playing dynamics in real time.
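To illustrate the “responds to playing dynamics” idea (this is a sketch of the general technique, not the actual implementation): an envelope follower smooths the incoming signal level with separate attack and release coefficients, and the smoothed value can then drive the rhythm section's intensity. The function name and parameters are hypothetical.

```typescript
// Hypothetical sketch: a simple attack/release envelope follower. The
// returned closure takes one audio sample at a time and yields a smoothed
// level in 0..1 that an accompaniment engine could map to rhythm intensity.
function makeEnvelopeFollower(attack: number, release: number) {
  let env = 0; // current smoothed level
  return (sample: number): number => {
    const level = Math.abs(sample);
    // Rise quickly on louder playing, fall slowly when it softens.
    const coeff = level > env ? attack : release;
    env = env + coeff * (level - env);
    return env;
  };
}
```

With `attack` near 1 and `release` near 0, the accompaniment reacts immediately to a loud phrase but backs off gradually, which tends to feel more musical than tracking the raw level.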

Networked live music with musicians from two countries (Japan/Canada) simultaneously.

Networked live music with musicians from two countries (Japan/Taiwan) simultaneously.

Music Flow

Adaptive music composition driven interactively by real time 3D artificial intelligence.

Prototype for a VR project in the health and art therapy market.

Built using Unity3D

Neurofeedback and Biofeedback “InnerActive” games

Mind/body/spirit integration and performance enhancement.

Developing a series of single- and multiplayer biofeedback audiovisual instruments/apps for Second Life and other application environments, utilizing skin conductance (SCL), heart rate variability (HRV), and EEG interfaces:

  • In game events, mini-games, and audiovisual music instruments
Download this interview about neurofeedback and biofeedback performance in Second Life, Second Life Newspaper, 04/04/06
  • Workshop Presenter at free-play 05 Indy GameDev conference
    “An Introduction to Biofeedback Interfaces for games and instruments” sponsored by WildDivine
Live performances utilizing various technologies and interfaces, e.g. Pink Cow, Tokyo, 8/01/06
  • Real Life/SL mixed reality bioPerformance ARTTour 2006 20/21 May 06
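As an illustration of the kind of signal these interfaces expose (a sketch of a standard metric, not code from the apps): RMSSD is a common time-domain HRV measure computed from the differences between successive RR intervals, and a biofeedback instrument could map it to audio or visual parameters.

```typescript
// RMSSD: root mean square of successive differences between adjacent
// RR intervals (milliseconds). Higher values broadly indicate greater
// parasympathetic activity, which a biofeedback app can reward.
function rmssd(rrIntervalsMs: number[]): number {
  if (rrIntervalsMs.length < 2) {
    throw new Error("Need at least two RR intervals");
  }
  let sumSq = 0;
  for (let i = 1; i < rrIntervalsMs.length; i++) {
    const diff = rrIntervalsMs[i] - rrIntervalsMs[i - 1];
    sumSq += diff * diff;
  }
  // Average over the number of successive differences, then take the root.
  return Math.sqrt(sumSq / (rrIntervalsMs.length - 1));
}
```

In an instrument, the RMSSD over a sliding window could control, say, reverb depth or visual calmness, closing the biofeedback loop.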

