Fopra – Focused Practice

Update Oct 2018: This project has now been retired and is no longer live.

FOcused PRActice (https://fopra.live) is a timing tool to help you practice music (and other things) more effectively. It's a session timer that lets you split your precious practice time into three stages: Warmup, Practice, and Perform. You can read more about how to use it on the about page at https://fopra.live/#/about

I'm interested in developing educational tools that help people learn faster and more effectively, so this is another small step in that direction.

Please use it and give me some feedback! Note: it works best in the Google Chrome browser.

Technical Development

This is also a prototype app to test a number of things I've been exploring in the rapid web application and PWA (Progressive Web App) development space. PWAs are single-page applications that can work both online and offline. You can also add the app to your mobile device's home screen just like a "normal" mobile app, but without the friction of deploying it through an app store, since it's a pure web application. You can find the "Add to Home screen" option in your mobile browser settings.
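For the curious, the offline and home-screen behaviour of a PWA comes from two standard pieces: a web app manifest and a service worker. Below is a minimal, generic sketch of the registration side (file names are placeholders, not Fopra's actual code):

```js
// index.html links a manifest that describes the app name, icons, and start URL:
// <link rel="manifest" href="/manifest.json">

// Registering a service worker is what lets the app keep working offline.
// "sw.js" is a placeholder file name, not Fopra's actual code.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/sw.js')
      .then(reg => console.log('Service worker registered for', reg.scope))
      .catch(err => console.error('Service worker registration failed:', err));
  });
}
```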

Fopra is built with a Vue.js-based framework called Quasar (http://quasar-framework.org). Having worked with other reactive JavaScript frameworks such as Meteor, I am really impressed by the quality of Quasar, especially considering it comes from a one-man operation.
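To give a rough idea of the core logic (this is an illustrative sketch, not Fopra's actual source, and the durations are arbitrary), the session timer just steps through a list of timed stages while Quasar provides the UI around it:

```js
// Illustrative sketch of a three-stage session timer.
const stages = [
  { name: 'Warmup',   minutes: 5 },
  { name: 'Practice', minutes: 20 },
  { name: 'Perform',  minutes: 5 },
];

function runSession(stages, onTick, onStageChange) {
  let index = 0;
  let remaining = stages[0].minutes * 60;   // seconds left in the current stage
  onStageChange(stages[0]);

  const timer = setInterval(() => {
    remaining -= 1;
    onTick(stages[index], remaining);
    if (remaining <= 0) {
      index += 1;
      if (index >= stages.length) {         // session finished
        clearInterval(timer);
        return;
      }
      remaining = stages[index].minutes * 60;
      onStageChange(stages[index]);
    }
  }, 1000);
  return timer;
}

// Usage: log stage changes and the remaining seconds.
runSession(
  stages,
  (stage, secs) => console.log(`${stage.name}: ${secs}s left`),
  stage => console.log(`--- ${stage.name} ---`)
);
```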

Some other things I was testing with this app were GitLab's CI (continuous integration) pipelines and SPA hosting via Netlify.com, complete with HTTPS and CDN support.
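For anyone who wants to try the same setup, a GitLab CI pipeline for this kind of SPA deploy can be sketched as below. This is illustrative only (job names and the build command are placeholders, and Netlify can also build straight from the repository without CI):

```yaml
# .gitlab-ci.yml — illustrative sketch, not the project's actual pipeline.
image: node:latest

stages:
  - build
  - deploy

build_spa:
  stage: build
  script:
    - npm install
    - npm run build            # e.g. "quasar build" for a Quasar SPA
  artifacts:
    paths:
      - dist/

deploy_netlify:
  stage: deploy
  script:
    - npm install -g netlify-cli
    # NETLIFY_AUTH_TOKEN and NETLIFY_SITE_ID are provided as CI variables.
    - netlify deploy --prod --dir=dist
  only:
    - master
```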

Spatial Music Visualizer

Inside Music is a Google WebVR Experiment that lets you step inside a song, giving you a closer look at how music is made. The bonus is that the music is spatialized as well, so you get a completely different audio experience from a normal stereo mix.

Open the Song Visualizer in a new tab: https://sonicviz.gitlab.io/sonicviz-spatial-music/
You can move around using the WASD keys and mouse, just like standard first-person game controls.
Note: best used in the Google Chrome browser with no other tabs open.

“Interaction
Select a song from the menu. The stems of the song will appear in a circle around you, each represented by a sphere. In 360 Mode, tap the spheres to turn them on or off. In VR Mode, you can use your controller to toggle their state. On Google Cardboard, you will have a reticle (a small circle in front of your eye) which can be used to turn the stems on and off.”
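To give a sense of what spatialized stems mean in plain Web Audio terms: each stem gets its own source routed through a 3D panner positioned around the listener, and toggling a sphere simply mutes or unmutes that stem. Inside Music has its own spatial audio stack, so the sketch below is only a generic illustration with made-up file names:

```js
// Generic illustration: place each stem on a circle around the listener.
const ctx = new AudioContext();   // note: browsers need a user gesture before audio starts

async function loadStem(url, angle, distance = 3) {
  const buffer = await fetch(url)
    .then(res => res.arrayBuffer())
    .then(data => ctx.decodeAudioData(data));

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;

  const panner = ctx.createPanner();          // 3D positioning for this stem
  panner.panningModel = 'HRTF';
  panner.setPosition(Math.cos(angle) * distance, 0, Math.sin(angle) * distance);

  const gain = ctx.createGain();              // used to toggle the stem on/off
  source.connect(panner).connect(gain).connect(ctx.destination);
  source.start();

  return { toggle: on => { gain.gain.value = on ? 1 : 0; } };
}

// Usage sketch: five hypothetical stems spread evenly in a circle.
const stems = ['drums.mp3', 'bass.mp3', 'keys.mp3', 'guitar.mp3', 'vocals.mp3'];
stems.forEach((url, i) => loadStem(url, (i / stems.length) * 2 * Math.PI));
```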

I thought it would be a good opportunity to pull it apart and test it with a couple of my own songs.

There's huge potential for spatial music to revolutionize music production and delivery, and we're only just getting started. For some more info on this you can read my blog post on “Immersive Audio and Musical AI”.

There's a bit of a process to go through, including configuring your development workflow and tools, but in the end it's a pretty cool way of getting inside the music. I also used it as an opportunity to test GitLab CI and GitLab Pages hosting.
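The GitLab Pages side is also driven by CI. A minimal pipeline for a static WebVR build looks roughly like this (the job name and the public directory are GitLab Pages conventions; the build step is a placeholder, not the exact project config):

```yaml
# .gitlab-ci.yml — sketch of a GitLab Pages deploy.
pages:                       # the job must be named "pages"
  image: node:latest
  script:
    - npm install
    - npm run build          # placeholder build step
    - mv dist public         # GitLab Pages serves the "public" directory
  artifacts:
    paths:
      - public
  only:
    - master
```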

The next step will be to extend it with some custom visualizations and refine the asset pipeline workflow. I've had a similar concept bouncing around for Unity3D, so I'll probably build that at some point.

See also: https://www.canvas.co.com/creations/3901

Streaming Live Music

A long-term action research project on networked live-streamed music performance, virtual experiences, and interfaces.
It covers solo performances and collaborative real-time networked music jams with musicians around the planet.

Live music has long been the "killer app" of the Second Life online social experience, acting as the glue that brings people together for shared experiences.
I've produced and performed at 1000+ events in SL and have learned a lot over that time in areas as varied as:

  • Virtual event production
  • Virtual set production
  • Platform evaluation
  • Technology strategy
  • Streaming Audio and Video
  • Virtual platform economics
  • Virtual event management

See also:

I developed an interactive, AI-controlled rhythm section that responds to my playing dynamics in real time (a rough sketch of the general idea follows below).

Networked live music with musicians from two countries (Japan/Canada) simultaneously.

Networked live music with musicians from two countries (Japan/Taiwan) simultaneously.
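The rhythm section itself isn't public, but the basic idea of responding to playing dynamics can be sketched very simply: continuously measure the input level and map it to an intensity value that the accompaniment uses to pick denser or sparser patterns. The snippet below is a generic Web Audio illustration with made-up scaling, not my actual implementation:

```js
// Generic illustration: follow the live input level and expose a 0..1
// "intensity" value that an accompaniment engine could react to.
async function followDynamics(onIntensity) {
  const ctx = new AudioContext();   // browsers need a user gesture before audio starts
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const samples = new Float32Array(analyser.fftSize);
  setInterval(() => {
    analyser.getFloatTimeDomainData(samples);
    // RMS level of the current buffer, mapped to a rough 0..1 intensity.
    const rms = Math.sqrt(samples.reduce((sum, s) => sum + s * s, 0) / samples.length);
    onIntensity(Math.min(1, rms * 10));   // scaling factor is arbitrary
  }, 100);
}

// Usage sketch: print how "busy" the accompaniment should currently be.
followDynamics(i => console.log('accompaniment intensity:', i.toFixed(2)));
```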

Music Flow

Adaptive music composition driven interactively by real-time 3D artificial intelligence.

Prototype for a VR project in the health and art therapy market.

Built using Unity3D
