Previously, we created an audio analyser which sets up everything we need to start analysing music. We left off by using the onaudioprocess
event handler of the script processor node to call our analyseBoost
function, which analyses the music. In this tutorial we'll finish by looking at the analyseBoost
function and how it measures the sound level of a track by leveraging the audio analyser.
Analyse Boost
We left off by using the onaudioprocess
event handler of the script processor node to call our analyseBoost function:
scriptProcessorNode.onaudioprocess = function(e) {
    audioAnalyserInstance.analyseBoost();
}
This means that for as long as audio is being processed (in other words, while the song is playing), we will be analysing the music. Therefore, if we set the music to play during the init()
function of a three.js scene, we will constantly be analysing the boost of the song, meaning that within the render()
function of the scene we will have real-time access to the music data.
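As a rough illustration, a render loop might use that value along these lines. This is only a sketch: the audioAnalyserInstance, scene, camera, renderer and the sphere being scaled are all assumed for the sake of the example, and aren't part of the analyser itself.

// A minimal sketch: scale a mesh by the current boost each frame.
// Everything here except audioAnalyserInstance.boost is assumed setup.
var sphere = new THREE.Mesh(
    new THREE.SphereGeometry(1, 32, 32),
    new THREE.MeshNormalMaterial()
);
scene.add(sphere);

function render() {
    requestAnimationFrame(render);

    // boost is an average of byte values, so it sits in the 0-255 range
    var scale = 1 + audioAnalyserInstance.boost / 255;
    sphere.scale.set(scale, scale, scale);

    renderer.render(scene, camera);
}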
Now let's see how the boost is actually analysed for use in the three.js render()
function.
The code
analyseBoost: function() {
    var audioArray = new Uint8Array(this.analyser.frequencyBinCount);
    this.analyser.getByteFrequencyData(audioArray);

    this.boost = 0;
    for (var i = 0; i < audioArray.length; i++) {
        this.boost += audioArray[i];
    }
    this.boost = this.boost / audioArray.length;
}
To analyse the sound frequency of our file, we use the frequencyBinCount
property of the analyser node, which gives us the number of frequency data values available for the visualization, and use it to size a Uint8Array
(audioArray
). We then pass audioArray
to getByteFrequencyData()
, which copies the current frequency data into it.
Next, the boost property (which is read by external classes) is reset to 0. The for loop then adds every value in audioArray
to boost, and finally we divide boost by the number of items in audioArray
to get the average level across the whole spectrum.
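Since each value in audioArray is a byte, the averaged boost also falls in the 0 to 255 range. If a 0 to 1 value is easier to work with, it can simply be normalised when it's read; this is an optional step, not part of the function above:

// Optional: convert the 0-255 average into a 0-1 value
var normalisedBoost = audioAnalyserInstance.boost / 255;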
Improvements
The boost detected here is only a 'loudness' value, and doesn't allow us to analyse the track for specific instruments or frequency ranges. A possible improvement to this JavaScript object would be to add further functions that detect different characteristics of the music.
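As a rough sketch of what such an addition might look like, the frequency array could be split into bands and averaged separately. The analyseBands name, the bass and treble properties, and the split point are all assumptions for illustration, not part of the original object:

analyseBands: function() {
    var audioArray = new Uint8Array(this.analyser.frequencyBinCount);
    this.analyser.getByteFrequencyData(audioArray);

    // Treat the lower quarter of the bins as 'bass' and the rest as 'treble'.
    var split = Math.floor(audioArray.length / 4);
    var bass = 0;
    var treble = 0;

    for (var i = 0; i < split; i++) {
        bass += audioArray[i];
    }
    for (var j = split; j < audioArray.length; j++) {
        treble += audioArray[j];
    }

    this.bass = bass / split;
    this.treble = treble / (audioArray.length - split);
}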