Future of Web Apps – Session 1 – APIs – Outside the API Box – Ruth John
This was live blogged – there will be mistakes.
It’s been 11 years since Winamp, and 11 years since Ruth started VJing. Can we reproduce Winamp’s amazing visualisations on the web?
To do this we would need a moving visual and a sound wave.
We can create a moving visual very easily with CSS animations.
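As a rough sketch of that idea (using the Web Animations API, which is the JavaScript equivalent of a CSS @keyframes animation — the element id and function name here are made up for illustration):

```javascript
// Pulse keyframes: scale up and fade, then back — a simple "moving visual".
const PULSE_KEYFRAMES = [
  { transform: 'scale(1)', opacity: 1 },
  { transform: 'scale(1.5)', opacity: 0.5 },
  { transform: 'scale(1)', opacity: 1 },
];

// Start an infinitely looping one-second pulse on an element.
function startPulse(el) {
  return el.animate(PULSE_KEYFRAMES, { duration: 1000, iterations: Infinity });
}

// Browser-only: animate a hypothetical #visual element.
if (typeof document !== 'undefined') {
  const el = document.getElementById('visual');
  if (el) startPulse(el);
}
```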
We can also use the Web Audio API to process the audio. But it’s quite a complex API. It’s built on nodes, which isn’t how web developers are normally used to working.
We can control volume, create filters and create sounds. We can also analyse. For example, we can find the frequencies in the sound in order to create a visualisation based on them — lighting up a different light for each piano note, say. But this isn’t what you would want to see in a club. We can do a lot more with it to make it more functional.
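A minimal sketch of that node graph and the analysis step (the function names, the bar elements and the 200px maximum are illustrative assumptions, not from the talk):

```javascript
// Connect a source node through an AnalyserNode to the speakers.
function setupAnalyser(audioCtx, source) {
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;              // gives us 128 frequency bins
  source.connect(analyser);
  analyser.connect(audioCtx.destination);
  return analyser;
}

// Pure helper: scale a 0-255 frequency value to a bar height in pixels.
function barHeight(value, maxPx) {
  return Math.round((value / 255) * maxPx);
}

// Browser-only render loop: read the frequency data each frame and
// set the height of one DOM "bar" per frequency bin.
function draw(analyser, bars) {
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data);
  bars.forEach((bar, i) => {
    bar.style.height = barHeight(data[i], 200) + 'px';
  });
  requestAnimationFrame(() => draw(analyser, bars));
}
```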
We can also use the getUserMedia API to get sound from the mic to create a visualisation too.
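Something like this, as a sketch (the function name is made up; note the stream is only analysed, not routed back to the speakers, to avoid feedback):

```javascript
// Ask for microphone access and wire the stream into an analyser node.
async function micAnalyser() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(stream);
  const analyser = audioCtx.createAnalyser();
  source.connect(analyser);  // analyse only — don't connect to destination
  return analyser;
}
```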
There are also loads and loads more Web APIs we can include too: Geolocation, the Ambient Light API, the Vibration API, etc.
With the Vibration API, we can make a user’s device vibrate. We could make a user’s device buzz when the drums play, for example.
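The API itself is tiny — `navigator.vibrate` takes a duration or a pattern of vibrate/pause milliseconds. A hedged sketch (the drum pattern here is made up):

```javascript
// Buzz in a short drum-like pattern: vibrate 100ms, pause 50ms, vibrate 100ms.
function drumBuzz() {
  if (typeof navigator !== 'undefined' && typeof navigator.vibrate === 'function') {
    navigator.vibrate([100, 50, 100]);
  }
}
```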
You can also do clever things like mixing two videos together based on a track. The example showed mixing He-Man and ThunderCats, switching between them based on the music.
We can also choose which video to play using the Web Speech API.
It’s made up of two parts: synthesis, where your device talks to you, and recognition, where the device detects what you say.
The voice will be detected and we can switch which video to show. But this wouldn’t work in a club, as it would be too noisy.
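A sketch of how recognition could drive the video switch (the video names and the `matchVideo` helper are illustrative; in Chrome the constructor is still prefixed as `webkitSpeechRecognition`):

```javascript
// Pure helper: find which video name, if any, appears in the transcript.
function matchVideo(transcript, names) {
  const words = transcript.toLowerCase();
  return names.find((name) => words.includes(name)) || null;
}

// Browser-only: listen continuously and switch video on a match.
if (typeof window !== 'undefined') {
  const SR = window.SpeechRecognition || window.webkitSpeechRecognition;
  if (SR) {
    const recognition = new SR();
    recognition.continuous = true;
    recognition.onresult = (event) => {
      const result = event.results[event.results.length - 1];
      const video = matchVideo(result[0].transcript, ['he-man', 'thundercats']);
      if (video) console.log('switching to', video); // swap the <video> here
    };
    recognition.start();
  }
}
```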
But what about an alternative? Introducing the gamepad API.
You could press a different button on the gamepad to show a different video or move the thumb sticks to rotate the video.
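The Gamepad API is polled rather than event-driven, so you check the pad state every frame. A sketch under those assumptions (the button-to-video mapping and the stick-to-degrees scaling are made up):

```javascript
// Pure helper: map a thumbstick axis value (-1..1) to a rotation in degrees.
function stickToDegrees(axisValue) {
  return axisValue * 180;
}

// Poll the first connected gamepad each frame: pressed buttons pick a
// video, the left stick's x axis rotates it.
function pollGamepad(onButton, onRotate) {
  const pad = navigator.getGamepads()[0];
  if (pad) {
    pad.buttons.forEach((btn, i) => { if (btn.pressed) onButton(i); });
    onRotate(stickToDegrees(pad.axes[0]));
  }
  requestAnimationFrame(() => pollGamepad(onButton, onRotate));
}
```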
We could, of course, just use the keyboard of the device, but it doesn’t give us as many features (see the rotation option on the gamepad).
We can, however, use things like the DeviceOrientation API. This picks up where the device is pointing, how it’s tilting and rotating, etc. We can interact with things here.
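For instance, tilting the phone could rotate a visual — a sketch, assuming a hypothetical `#visual` element and a ±45° range (the event’s `gamma` value is the left-right tilt):

```javascript
// Pure helper: keep a value inside [min, max].
function clamp(value, min, max) {
  return Math.min(max, Math.max(min, value));
}

// Browser-only: rotate a visual with the device's left-right tilt.
if (typeof window !== 'undefined') {
  window.addEventListener('deviceorientation', (event) => {
    // alpha: compass heading, beta: front-back tilt, gamma: left-right tilt
    const tilt = clamp(event.gamma, -45, 45);
    const el = document.getElementById('visual');
    if (el) el.style.transform = `rotate(${tilt}deg)`;
  });
}
```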
With devices and WebSockets, we can think about audience participation. We could let the users decide which video they want to see. We could play the video on their individual screens if we wanted. We can do the vibration we spoke about earlier, and also let the users be part of the performance.
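A minimal sketch of the voting idea — the server URL and message shape are entirely made up for illustration:

```javascript
// Pure helper: count one vote for a video choice.
function tallyVote(tally, choice) {
  tally[choice] = (tally[choice] || 0) + 1;
  return tally;
}

// Browser-only: each audience device sends its choice over a WebSocket;
// the VJ's machine tallies the votes and shows the winner.
if (typeof window !== 'undefined' && typeof WebSocket !== 'undefined') {
  const socket = new WebSocket('wss://example.com/votes'); // hypothetical server
  const tally = {};
  socket.onmessage = (event) => {
    const { choice } = JSON.parse(event.data);
    tallyVote(tally, choice);  // the most-voted video gets played
  };
}
```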
We are using a lot of these APIs beyond the purpose they were designed for. We are taking their intended use and extending it.
There are many new Web APIs available, so go out and investigate them for your own needs.
Browser support isn’t that great yet, but check caniuse.com for current support.