Audio Sampling Bit Depth in HTML5/JavaScript

This post continues on from the previous one, which introduced sampling and sample rate, and is here if you would like to refresh your memory. This one discusses the bit depth of digital audio: the number of bits used to represent each sample. The more bits, the larger the set of values available to describe the amplitude of a sample, and the more accurately a signal can be captured. Bit depth affects the dynamic range and the Signal to Noise Ratio (SNR). The dynamic range is the difference between the largest and smallest amplitudes the signal can represent. The SNR is a little different and will be explained later in the post. Increasing the bit depth increases both of these measurements.
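As a rule of thumb, each extra bit adds about 6 dB of dynamic range, since 20·log10(2) ≈ 6.02. A quick sketch of that relationship:

```javascript
// Approximate dynamic range in decibels for a given bit depth.
// Doubling the number of levels (one extra bit) adds ~6.02 dB.
var dynamicRangeDb = function (bits) {
    return 20 * Math.log10(Math.pow(2, bits));
};

console.log(dynamicRangeDb(8).toFixed(1));  // 8-bit
console.log(dynamicRangeDb(16).toFixed(1)); // 16-bit (CD audio): ~96.3
```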

 

Quantization

The amplitude of an analogue signal exists on a continuum, so there is an infinite number of possible amplitudes. A digital signal has a discrete number of possible amplitudes, determined by the number of bits. In the digitisation process the analogue amplitude will almost never exactly match one of these discrete values, so it must be quantized. This means the amplitude is rounded to the nearest discrete digital value.
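A minimal sketch of that rounding, assuming amplitudes on the -1 to +1 scale the Web Audio API uses (the level spacing here is slightly simplified compared to the effect code later in the post):

```javascript
// Round an amplitude (-1 to +1) to the nearest of 2^bits evenly spaced levels.
var quantize = function (amplitude, bits) {
    var interval = 2 / (Math.pow(2, bits) - 1); // spacing between adjacent levels
    return Math.round(amplitude / interval) * interval;
};

// At 3 bits there are only 8 levels, so 0.3 lands on the nearest one: 2/7.
quantize(0.3, 3); // 0.2857...
```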

 

Bit Reduction Effect

This demonstration is an audio effect for reducing bit depth. As with the sample rate, it is not possible to actually adjust the bit depth in the Web Audio API so this effect only simulates bit reduction.

Below is the code for the effect processor. You can find the rest of the code at the bottom of this page or on GitHub here.
 

Scripts/com/littleDebugger/daw/dsp/bitDepthReduction.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw.dsp");

// Pseudo bit depth reduction effect.
// <bitReductionControl> Control for reducing bit depth.
// - value - The 'pseudo' bit depth of the output.
// <ditherControl> Control for adding dither.
// - value - Decimal 0 to 1 for the amount of dither.
com.littleDebugger.daw.dsp.bitDepthReduction = function (bitReductionControl, ditherControl) {

    // Process audio buffer.
    // <inputBuffer> The buffer to be processed.
    // <outputBuffer> The processed buffer.
    return function (inputBuffer, outputBuffer) {

        // Number of bits for the output.
        var bits = bitReductionControl.value;
        // Number of unique values represented by the bits.
        var values = Math.pow(2, bits);
        // The interval between values when on a scale between -1 and +1 which is what Web Audio API uses.
        // This also takes into account 0 (the mid point) so 1 number is lost. 
        // This only makes a significant difference for the very low bit depths.
        var interval = 1 / ((values / 2) - 1);

        // Iterate over each sample in the buffer.
        for (var sample = 0; sample < inputBuffer.length; sample++) {
            // Generate the dither value
            var dither = (((Math.random() * (interval * 2)) - interval) * ditherControl.value);

            // Quantize the amplitude + dither (if any).
            var quantizedAmplitude = Math.round((inputBuffer[sample] + dither) / interval) * interval;

            // Assign output sample.
            outputBuffer[sample] = quantizedAmplitude;
        }
    }
};
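Because the processor only reads a `value` property from each control, it can be driven without any DOM at all. Below is a condensed, stand-alone copy of the processor with plain objects standing in for the range inputs (a sketch for experimentation, not part of the project code):

```javascript
// Condensed copy of the bit depth reduction processor, runnable on its own.
var bitDepthReduction = function (bitReductionControl, ditherControl) {
    return function (inputBuffer, outputBuffer) {
        var values = Math.pow(2, bitReductionControl.value);
        var interval = 1 / ((values / 2) - 1);
        for (var sample = 0; sample < inputBuffer.length; sample++) {
            var dither = ((Math.random() * (interval * 2)) - interval) * ditherControl.value;
            outputBuffer[sample] = Math.round((inputBuffer[sample] + dither) / interval) * interval;
        }
    };
};

// Any object with a numeric `value` works in place of a DOM control.
var process = bitDepthReduction({ value: 4 }, { value: 0 });
var input = new Float32Array([0.1, -0.42, 0.73]);
var output = new Float32Array(input.length);
process(input, output);
// At 4 bits the interval is 1/7, so every output sample is a multiple of 1/7.
```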

 

Bit Reduction Demonstration

This visualiser has the bit depth reduction processor attached. As usual, the red wave is the input and the blue is the processed output. This time a yellow wave has been added; the reason for this will be revealed shortly.

Clicking/touching at different heights on the visualiser will affect the output bit depth: at the top the output will be 16-bit, going down to 3-bit at the bottom.

There are also a couple of extra controls. The check box enables the sample rate reduction effect, so you can use the bit reduction and sample rate reduction effects together. The sample rate reduction effect is controlled using the y-axis of the visualiser, as described in the previous post.

The output bit depth is displayed next to the new controls.

 

Bit Reduction Outcome

Reducing the bit depth introduces artefacts to the output which sound like noise. Noise will be discussed in a later post but, for now, let’s just say that it sounds like a radio which is not tuned into a station well (if you can remember what that sounded like) or like rain hitting the ground. Noise is generated with random sample amplitudes.

Referring back to the sum of sines post, it makes sense that the additional artefacts should sound like noise. We can think of the output signal as a composite of the original signal plus an additional (somewhat random) signal made of the difference between the original signal and the quantized signal. The yellow wave on the visualiser shows this additional signal, which is known as the quantization error. It is very much like noise because it is seemingly random how far each sample will need to be rounded. This quantization error is the ‘noise’ in the SNR. As with the dynamic range, the ‘signal’ part of the SNR is the maximum possible amplitude.

The quantization error is not always random, though. Listen to the 300Hz sine wave at 4 bits and you will notice that there is no noise introduced, just other frequencies. When the quantized signal is a repeated pattern, the quantization error also becomes a repeated pattern, and repeated patterns become frequencies rather than noise. You can see the repeated pattern in the form of the yellow wave. In general, the lower bit depths start to introduce more repeated patterns, as several samples in a row might be quantized to the same discrete value.
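That periodicity is easy to check numerically. Building two identical cycles of a sine from one lookup table and quantizing them shows the error repeating exactly with the signal (a sketch using the same 4-bit spacing as the effect):

```javascript
// The quantization error of a periodic signal is itself periodic,
// so it is heard as extra frequencies rather than noise.
var samplesPerCycle = 64;
var cycle = [];
for (var i = 0; i < samplesPerCycle; i++) {
    cycle[i] = Math.sin(2 * Math.PI * i / samplesPerCycle);
}

var interval = 1 / ((Math.pow(2, 4) / 2) - 1); // 4-bit spacing, as in the effect
var error = [];
for (var j = 0; j < 2 * samplesPerCycle; j++) {
    var amplitude = cycle[j % samplesPerCycle];
    var quantized = Math.round(amplitude / interval) * interval;
    error[j] = amplitude - quantized; // the yellow wave on the visualiser
}
// error[j] === error[j + samplesPerCycle] for every j: the error repeats.
```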

Listen to each of the audio files from the drop down menu to hear how the bit depth reduction and dithering affect them.
 

Dithering

Dithering is the process of adding/subtracting a small amount of random noise to the sample amplitude before quantization. The new (unlabeled) range input on the visualiser controls the dither. It ranges from zero noise up to the interval between 2 discrete values.

The dithering is most effective at around the midpoint of the range. This would equate to +/- half an interval of noise.
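A quick numerical check of why half an interval works well: a constant amplitude smaller than half an interval always rounds to zero, but with dither the rounded output is correct on average (a sketch using the same 4-bit spacing as before):

```javascript
// Without dither, a small constant amplitude vanishes entirely.
// With +/- half an interval of random dither, the quantized output is
// correct on average - the signal survives at the cost of added noise.
var interval = 1 / ((Math.pow(2, 4) / 2) - 1); // 4-bit spacing (1/7)
var amplitude = 0.05;                          // less than half an interval

var undithered = Math.round(amplitude / interval) * interval; // always 0

var sum = 0;
var n = 100000;
for (var i = 0; i < n; i++) {
    var dither = (Math.random() - 0.5) * interval; // +/- half an interval
    sum += Math.round((amplitude + dither) / interval) * interval;
}
var average = sum / n; // close to 0.05
```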

You should hear that the level of noise in the output increases with the dithering, but the original audio is much more recognisable, and few or no artefacts other than the noise are introduced. So in this case, randomly distorting the signal actually improves the sampled representation.

At 2 bits the audio is almost impossible to make out – this is most apparent on the speech clips. When the signal is dithered it is still possible to hear and understand it. The trade-off introduced by dithering is an increase in noise for a more comprehensible signal.

 

Bit Depth Comparison

When comparing bit depths, it is helpful to look at examples of well-known computers and consoles with different audio bit depths.

The Commodore 64 had 4-bit audio; 4 bits can represent 16 values. Consoles like the NES and the SEGA Master System had 8-bit audio, which can represent 256 values.

Below is a wave that is 256 pixels tall from peak to peak. You can see it is quite an accurate representation of a sinusoidal wave, as it is not noticeably pixellated. If we say that an average monitor is 30cm tall and has 1080 pixels of height (1080p), then the 8-bit wave is about 7cm tall.

8 bit depth Sine wave

 
These are just visual representations of the waves but they should give you an idea about the difference in fidelity between some common bit depths.
 

Next are the 16-bit systems; the Sega Mega Drive and Super NES fall into this category. The maximum peak-to-peak wave using the same scale as above (1080 pixels/30cm tall) would be the size below.

16 bit depth Sine wave

 

Next, we have 24-bit, which is what most PCs support today. Below is the size of the wave on the same scale.

24 bit depth sine wave

 

Some high-end sound cards support 32-bit. Again, the maximum size wave on the same scale is below.

32 bit depth sine wave
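Putting numbers on that same 30cm/1080px scale shows how quickly the heights grow (a quick calculation, treating one discrete level as one pixel):

```javascript
// Peak-to-peak height of a full-scale wave, one discrete level per pixel,
// on the scale used above: 1080 pixels across a 30cm tall monitor.
var cmPerPixel = 30 / 1080;

[8, 16, 24, 32].forEach(function (bits) {
    var levels = Math.pow(2, bits);
    var heightCm = levels * cmPerPixel;
    console.log(bits + "-bit: " + levels + " levels, ~" + Math.round(heightCm) + "cm");
});
// Only the 8-bit wave fits on screen; the 32-bit wave would be
// over a thousand kilometres from peak to peak.
```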

 

Change Log

  • Bit Depth Reduction effect

 

Source Code (click to expand)

index.html

<!DOCTYPE html>
<html>

<head>
    <meta charset="UTF-8">
    <title>JavaScript Audio Visualiser</title>
    <script src="Scripts/com/littleDebugger/namespacer.js"></script>
    <script src="Scripts/com/littleDebugger/daw/dsp/sampleRateReduction.js"></script>
    <script src="Scripts/com/littleDebugger/daw/dsp/bitDepthReduction.js"></script>
    <script src="Scripts/com/littleDebugger/daw/dsp/gain.js"></script>
    <script src="Scripts/com/littleDebugger/daw/audioContext.js"></script>
    <script src="Scripts/com/littleDebugger/daw/audioLoader.js"></script>
    <script src="Scripts/com/littleDebugger/daw/dsp/visualiser.js"></script>
    <script src="Scripts/com/littleDebugger/daw/player.js"></script>
    <script src="Scripts/com/littleDebugger/utility/ui/controlHelpers.js"></script>
    <script src="Scripts/com/littleDebugger/ui/fullScreenEvent.js"></script>
    <script src="Scripts/com/littleDebugger/ui/screenInput.js"></script>
    <link rel="stylesheet" type="text/css" href="Styles/index.css"/>
</head>

<body class="non-selectable">
    <div id="container">
        <div id="canvasContainer">
            <div id="controls">
                <div id="topControls">
                    <div>
                        <button id="maximise" style="font-size: x-small">Full Screen</button>
                        <button id="showHideControls" style="font-size: x-small">Show controls</button>
                        <input id="playButton" type="button" value="Play" style="font-weight: bold"/>
                        <input id="stopButton" type="button" value="Stop" style="color: red"/>
                        <input id="processSampleRate" type="checkbox"/>
                        <input id="ditherControl" type="range" min="0" max="1" step="0.01" value="0"/>
                        <input id="sampleRate" type="range" min="1" max="140" value="1" style="display:none"/>
                        <input id="bitDepth" type="range" min="2" max="16" value="16" style="display:none"/>
                        <span id="sampleRateDisplay"></span>
                        <span class="message" id="loadingMessage"> Loading...</span>
                    </div>
                    <div>
                        <input id="gain" type="range" min="0" max="2" value="1" step="0.01" style="display:none"/>
                    </div>
                </div>
                <div id="innerControls">
                    <div class="inline">
                        <span>Buffer:</span>
                        <span>
                            <select id="bufferSizeSelect">
                            <option>256</option>
                            <option>512</option>
                            <option>1024</option>
                            <option>2048</option>
                            <option>4096</option>
                            <option>8192</option>
                            <option selected>16384</option>
                        </select>
                        </span>
                    </div>
                    <div class="inline">
                        <span>
                            Resolution:
                        </span>
                        <span>
                            <select id="resolutionSelect">
                            <option data-width="1920" data-height="1000">V</option>
                            <option data-width="1024" data-height="864">IV</option>
                            <option data-width="800" data-height="640">III</option>
                            <option data-width="640" data-height="480" selected>II</option>
                            <option data-width="320" data-height="200">I</option>
                        </select>
                        </span>
                    </div>
                    <div class="inline">
                        <span>
                            Width:
                        </span>
                        <span>
                            <input id="lineWidth" type="range" min="1" max="15" value="1"/>
                        </span>
                    </div>
                    <div class="inline">
                        <span>Fit:</span>
                        <span>
                            <input id="fitToCanvasCheckbox" type="checkbox"/>
                        </span>
                    </div>
                    <div class="inline">
                        <span>
                        Refresh:
                    </span>
                        <span>
                        <input id="refreshRate" type="range" min="1" max="20" value="1"/>
                    </span>
                    </div>
                    <div class="inline">
                        <span><input id="audioSourceSwitch" type="button" value="Filesystem"/></span>
                        <select id="fileToPlay">
                            <option value="1" data-audioFile='Audio/bensound-funnysong.mp3'>
                                Funny Song
                            </option>
                            <option value="2" data-audioFile='Audio/300hzSine.mp3'>
                                300Hz Sine Wave
                            </option>
                            <option value="3" data-audioFile='Audio/MLKDream.mp3'>
                                I Have A Dream
                            </option>
                            <option value="3" data-audioFile='Audio/Donut.mp3'>
                                I Am A Donut
                            </option>
                        </select>
                        <input id="file" type="file" accept="audio/*" style="display: none">
                    </div>
                </div>
            </div>
            <canvas id="visualiserCanvas"></canvas>
        </div>
    </div>
</body>

</html>
<script src="Scripts/index.js"></script>

images/index.html

<!DOCTYPE html>
<html>

<head>
    <meta charset="UTF-8">
    <title>JavaScript Audio Visualiser</title>
    <script src="Scripts/com/littleDebugger/namespacer.js"></script>
    <script src="Scripts/com/littleDebugger/daw/dsp/sampleRateReduction.js"></script>
    <script src="Scripts/com/littleDebugger/daw/audioContext.js"></script>
    <script src="Scripts/com/littleDebugger/daw/audioLoader.js"></script>
    <script src="Scripts/com/littleDebugger/daw/dsp/visualiser.js"></script>
    <script src="Scripts/com/littleDebugger/daw/player.js"></script>
    <script src="Scripts/com/littleDebugger/utility/ui/controlHelpers.js"></script>
    <script src="Scripts/com/littleDebugger/ui/fullScreenEvent.js"></script>
    <link rel="stylesheet" type="text/css" href="Styles/index.css"/>
</head>

<body class="non-selectable">
    <div id="test"></div>
    <div id="container">
        <div id="canvasContainer">
            <div id="topControls">
                <div style="float:right">
                    <span class="message" id="loadingMessage">Loading...</span>
                    <button id="maximise" style="font-size: x-small">Maximise</button>
                    <button id="showHideControls" style="font-size: x-small">Show controls</button>
                    <input id="playButton" type="button" value="Play" style="font-weight: bold"/>
                    <input id="stopButton" type="button" value="Stop" style="color: red"/>
                </div>
                <div>
                    <input id="bitReduction" type="range" min="1" max="140" value="1" style="display:none"/>
                    <button id="bitReductionDecrease" style="display:none">-</button>
                    <button id="bitReductionIncrease" style="display:none">+</button>
                    <span id="sampleRate"></span>
                    <span><a href="https://goo.gl/toxBhH">Instructions</a></span>
                </div>
            </div>
            <div id="controls">
                <div id="innerControls">
                    <div class="inline">
                        <span>Buffer:</span>
                        <span>
                            <select id="bufferSizeSelect">
                            <option>256</option>
                            <option>512</option>
                            <option>1024</option>
                            <option>2048</option>
                            <option>4096</option>
                            <option>8192</option>
                            <option selected>16384</option>
                        </select>
                        </span>
                    </div>
                    <div class="inline">
                        <span>
                            Resolution:
                        </span>
                        <span>
                            <select id="resolutionSelect">
                            <option data-width="1920" data-height="1000">V</option>
                            <option data-width="1024" data-height="864">IV</option>
                            <option data-width="800" data-height="640">III</option>
                            <option data-width="640" data-height="480" selected>II</option>
                            <option data-width="320" data-height="200">I</option>
                        </select>
                        </span>
                    </div>
                    <div class="inline">
                        <span>
                            Width:
                        </span>
                        <span>
                            <input id="lineWidth" type="range" min="1" max="15" value="1"/>
                        </span>
                    </div>
                                        <div class="inline">
                        <span>Fit:</span>
                        <span>
                            <input id="fitToCanvasCheckbox" type="checkbox"/>
                        </span>
                    </div>
                    <div class="inline">
                        <span>
                        Refresh:
                    </span>
                        <span>
                        <input id="refreshRate" type="range" min="1" max="20" value="1"/>
                    </span>
                    </div>
                    <div class="inline">
                        <span><input id="audioSourceSwitch" type="button" value="Filesystem"/></span>
                        <select id="fileToPlay">
                            <option value="1" data-audioFile='Audio/bensound-funnysong.mp3'>
                                Funny Song
                            </option>
                        </select>
                        <input id="file" type="file" accept="audio/*" style="display: none">
                    </div>
                </div>
            </div>
            <canvas id="visualiserCanvas"></canvas>
        </div>
    </div>
</body>

</html>
<script src="Scripts/index.js"></script>

Scripts/index.js

var audioSourceIsFileSystem = false;
// Audio volume warning is shown the first time audio is played only.
var showAudioVolumeWarning = true;

// Reference to audioLoader module.
var audioLoader = com.littleDebugger.daw.audioLoader;
// Reference to controlHelpers module.
var controlHelpers = com.littleDebugger.utility.ui.controlWrapper;

// Reference to the audio processor used for this workshop.
var audioProcessor;

var loadingMessage = document.getElementById('loadingMessage');

var audioSourceText = {
    1: "Filesystem",
    0: "Server"
};

// Array of objects with colour and alpha (opacity) properties.
// The first object represents the configuration for the input buffer and the second for the output.
// This is configurable so that the visualiser can show many different waves at the same time.
var waveDisplayConfigs = [{
        colour: "rgb(205,0,40)",
        alpha: 1
    },
    {
        colour: "rgb(0, 225,255)",
        alpha: 1
    },
    {
        colour: "rgb(255, 225,0)",
        alpha: 1
    }
];

var canvas = document.getElementById('visualiserCanvas');

// Initialise visualiser.
var visualiser = com.littleDebugger.daw.dsp.visualiser(
    waveDisplayConfigs,
    canvas,
    document.getElementById('lineWidth'),
    document.getElementById('fitToCanvasCheckbox'),
    document.getElementById('refreshRate'));

var playControl = document.getElementById('playButton');
var stopControl = document.getElementById('stopButton');
var filesystemFileControl = document.getElementById('file');
var audioFileControl = document.getElementById('fileToPlay');
var fileSourceControl = document.getElementById('audioSourceSwitch');

// The callback for the audioProcessingEvent from the audio player.
// The code is not in the audio player because it is currently doing more than it should be
//  due to calling the visualiser.
var processAudio = function (audioProcessingEvent) {
    var inputBuffer = audioProcessingEvent.inputBuffer;
    var outputBuffer = audioProcessingEvent.outputBuffer;

    var updateVisualiser = true;
    for (var channel = 0; channel < outputBuffer.numberOfChannels; channel++) {
        var inputData = inputBuffer.getChannelData(channel);

        // Reduce the original gain so amplification can be demonstrated too.
        for (var sample = 0; sample < inputBuffer.length; sample++) {
            inputData[sample] = inputData[sample] * 0.5;
        }

        var outputData = outputBuffer.getChannelData(channel);

        audioProcessor(inputData, outputData);

        // Visualiser should only be updated for 1 channel.
        if (updateVisualiser) {
            // Make a new wave of the difference between the input and output.
            var difference = [];
            for (var i = 0; i < inputData.length; i++) {
                difference[i] = inputData[i] - outputData[i];
            }

            visualiser.drawWave([inputData, outputData, difference]);
            updateVisualiser = false;
        }
    }
};

// Wire up control events.
// Some of the following event handling could be contained in a module.
// I am not exactly sure how it will all be grouped and split yet so it is just in the main page JS file. 

fileSourceControl.onclick = function () {
    this.value = audioSourceText[audioSourceIsFileSystem * 1];
    if (audioSourceIsFileSystem) {
        filesystemFileControl.style.display = 'none';
        audioFileControl.style.display = 'inline';
        audioSourceIsFileSystem = false;
        audioFileControl.onchange();
    } else {
        filesystemFileControl.style.display = 'inline';
        audioFileControl.style.display = 'none';
        audioSourceIsFileSystem = true;
        filesystemFileControl.value = null;
        filesystemFileControl.click();
    }
};

window.addEventListener(audioLoader.audioLoadingStartedEventName, function () {
    loadingMessage.style.display = 'inline';
    playControl.disabled = true;
    playControl.style.color = "grey";
});

window.addEventListener(audioLoader.audioLoadingCompletedEventName, function () {
    loadingMessage.style.display = 'none';
    playControl.disabled = false;
    playControl.style.color = "green";
});

playControl.onclick = function () {
    if (showAudioVolumeWarning) {
        alert('Please make sure the audio volume is set to an appropriate level!');
        showAudioVolumeWarning = false;
    }
    player.startAudio();
};

stopControl.onclick = function () {
    player.stopAudio();
};

audioFileControl.onchange = function () {
    player.stopAudio();
    playControl.disabled = true;
    player.cueAudioFile(this.selectedOptions[0].getAttribute('data-audioFile'));
};

filesystemFileControl.onchange = function () {
    player.stopAudio();
    var localFile = window.URL.createObjectURL(this.files[0]);
    player.cueAudioFile(localFile);
};

var resolutionControl = document.getElementById('resolutionSelect');

// Add event to change the canvas resolution when the resolution select is changed.
resolutionControl.onchange = function () {
    var width = this.options[this.selectedIndex].getAttribute('data-width');
    var height = this.options[this.selectedIndex].getAttribute('data-height');
    visualiser.setDimensions(width, height);
};

resolutionControl.onchange();

// Initialise the player.
var player = com.littleDebugger.daw.player(
    document.getElementById('bufferSizeSelect'),
    processAudio);

var sampleRateControl = document.getElementById('sampleRate');
var bitDepthControl = document.getElementById('bitDepth');
var processSampleRate = document.getElementById('processSampleRate');
var ditherControl = document.getElementById('ditherControl');

var sampleRateReduction = com.littleDebugger.daw.dsp.sampleRateReduction(
    sampleRateControl);
var bitDepthReduction = com.littleDebugger.daw.dsp.bitDepthReduction(
    bitDepthControl, ditherControl);

audioProcessor = function (inputData, outputData) {
    bitDepthReduction(inputData, outputData);
    if (processSampleRate.checked == true) {
        sampleRateReduction(outputData, outputData);
    }
};

var sampleRateDisplay = document.getElementById('sampleRateDisplay');

var updateText = function () {
    var text = bitDepthControl.value + " BIT ";

    if (processSampleRate.checked) {
        text += (player.sampleRate / sampleRateControl.value).toFixed(0) + "Hz";
    }

    sampleRateDisplay.innerHTML = text;
};

sampleRateControl.onchange = updateText;
bitDepthControl.onchange = updateText;
processSampleRate.onchange = updateText;

// Show sample rate.
sampleRateControl.onchange();

var canvasContainer = document.getElementById('canvasContainer');
var visualiserCanvas = document.getElementById('visualiserCanvas');
var controls = document.getElementById('controls');

// This displays the page better when in a blog post Iframe.
// It will be modularised or replaced (hopefully with CSS only).
var setDimensions = function () {
    visualiserCanvas.style.width = (window.innerWidth) + "px";
    visualiserCanvas.style.height = (window.innerHeight - controls.clientHeight - 20) + "px";
};

window.onresize = function () {
    setDimensions();
};

setDimensions();

// Button to enter full screen.
com.littleDebugger.ui.fullScreenEvent(
    canvasContainer,
    document.getElementById('maximise'));

// Handle hiding block of controls.
var controlsHidden = false;
var hidableControls = document.getElementById('innerControls');
document.getElementById('showHideControls').onclick = function () {
    if (controlsHidden) {
        controlsHidden = false;
        hidableControls.style.display = "block";
        setDimensions();
    } else {
        controlsHidden = true;
        hidableControls.style.display = "none";
        setDimensions();
    }
};

// Cue the audio.
audioFileControl.onchange();

com.littleDebugger.ui.screenInput(canvas, sampleRateControl, bitDepthControl);

Scripts/com/littleDebugger/namespacer.js

// Simple pattern used for namespacing in JavaScript.
// The module pattern will be used to group related functionality. 
// Modules are not yet supported natively in the main browsers.

// I do not plan to use any 3rd party libraries. 
// This may mean reinventing the wheel in some cases but I do not want anything 
// going on under the hood which I am not aware of.
// I will 'borrow' functions and snippets where required. This will be referenced.

if (typeof (com) === 'undefined') {
    com = {};
}

if (typeof (com.littleDebugger) === 'undefined') {
    com.littleDebugger = {};
}

if (typeof (com.littleDebugger.namespacer) === 'undefined') {
    com.littleDebugger.namespacer = {};
}

// Creates a namespace in the global space.
// <namespaceText> Dot-separated namespace to be created.
com.littleDebugger.namespacer.createNamespace = function (namespaceText) {
    var namespaces = namespaceText.split('.');
    if (typeof (window[namespaces[0]]) === 'undefined') {
        window[namespaces[0]] = {};
    }

    var currentSpace = window[namespaces[0]];

    for (var i = 1; i < namespaces.length; i++) {
        var namespace = namespaces[i];
        if (typeof (currentSpace[namespace]) === 'undefined') {
            currentSpace[namespace] = {};
        }

        currentSpace = currentSpace[namespace];
    };
};

Scripts/com/littleDebugger/daw/audioContext.js

// Create namespace.
com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw");

// This module provides the audio context.
com.littleDebugger.daw.getAudioContext = (function () {
    // Support the Web Audio API in the different supported browsers.
    // Taken from http://chimera.labs.oreilly.com/books/1234000001552/ch01.html#s01_2
    var getAudioContext = function () {
        var ContextClass = (
            window.AudioContext ||
            window.webkitAudioContext ||
            window.mozAudioContext ||
            window.oAudioContext ||
            window.msAudioContext);
        if (ContextClass) {
            return new ContextClass();
        } else {
            alert("Web Audio API is not available. Please use a supported browser.");
            throw new Error("Web Audio API is not available.");
        }
    };

    return getAudioContext;
})();

Scripts/com/littleDebugger/daw/audioLoader.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw");

com.littleDebugger.daw.audioLoader = function () {
    this.audioLoadingStartedEventName = 'audio-loading-started';
    this.audioLoadingCompletedEventName = 'audio-loading-completed';

    // Load audio file.
    // <fileName> Name of the file to be loaded. This can be on local machine if it has been loaded correctly.
    // <audioCtx> Audio context on which the audio file should be played.
    // <sourceReturnCallback> Callback to attach the audio the context when loaded.
    this.loadAudioFile = function (fileName, audioCtx, sourceReturnCallback) {
        this.loadAudioBufferFromFile(fileName, audioCtx, function (buffer) {
            sourceReturnCallback(this.createBuffer(buffer, audioCtx));
        })
    };

    // Load audio file and return the buffer.
    // This function is public but is not yet called from outside of this module. 
    // It will be though, which might give you an idea about how I plan to play audio later on
    // in the series.

    // Function was based on the example here: 
    // https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/decodeAudioData
    this.loadAudioBufferFromFile = function (fileName, audioCtx, bufferReturnCallback) {
        fireEvent(audioLoadingStartedEventName, fileName);

        var request = new XMLHttpRequest();
        request.open('GET', fileName, true);
        request.responseType = 'arraybuffer';
        request.onload = function () {
            var audioData = request.response;
            audioCtx.decodeAudioData(audioData, function (buffer) {
                    fireEvent(audioLoadingCompletedEventName, fileName);
                    bufferReturnCallback(buffer);
                },
                function (e) {
                    console.error("Error decoding audio file. " + e.err);
                });
        }

        request.send();
    };

    // Creates an audio buffer.
    this.createBuffer = function (buffer, audioCtx) {
        var source = audioCtx.createBufferSource();
        source.buffer = buffer;
        return source;
    };

    // Fires event.
    var fireEvent = function (eventName, detail) {
        var event = new CustomEvent(eventName, {
            'detail': detail
        });
        window.dispatchEvent(event);
    };

    return this;
}();

Scripts/com/littleDebugger/daw/player.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw");

// This is the audio player.
// It handles the audio context for loading, playing and stopping the audio.
// <audioLoader>Reference to the audioLoader.js module.
// <getAudioContext>Reference to the audioContext.js module.
com.littleDebugger.daw.player = (function (audioLoader, getAudioContext) {
    var initialise = function (
        bufferSizeControl,
        processAudioCallback) {
        var that = {};
        that.audioPlayingEventName = "audio-playing";
        that.audioStoppedEventName = "audio-stopped";

        var playingAudio = false;
        var audioCtx = getAudioContext();
        var source = audioCtx.createBufferSource();
        var scriptNode;

        // Reloads the audio file.
        that.cueAudioFile = function (fileName) {
            audioLoader.loadAudioFile(fileName, audioCtx, setSource);
        };

        // Starts the audio playing.
        that.startAudio = function () {
            if (!playingAudio) {
                fireEvent(that.audioPlayingEventName);
                bufferSizeControl.disabled = true;
                scriptNode = audioCtx.createScriptProcessor(bufferSizeControl.value, 1, 1);
                scriptNode.onaudioprocess = function (audioProcessingEvent) {
                    processAudioCallback(audioProcessingEvent);
                }

                playingAudio = true;
                source.connect(scriptNode);
                scriptNode.connect(audioCtx.destination);
                source.start();
            }
        };

        // Stops the audio.
        that.stopAudio = function () {
            if (playingAudio) {
                fireEvent(that.audioStoppedEventName);
                source.stop();
                playingAudio = false;
                bufferSizeControl.disabled = false;
                source.disconnect(scriptNode);
                scriptNode.disconnect(audioCtx.destination);
                setSource(audioLoader.createBuffer(source.buffer, audioCtx));
            }
        };

        var fireEvent = function (eventName) {
            var event = new Event(eventName);
            window.dispatchEvent(event);
        };

        // Used as a callback to set the local source variable. 
        var setSource = function (src) {
            source = src;
            setOnended();
        };

        // When the buffer source stops playing, disconnect everything.
        var setOnended = function () {
            source.onended = that.stopAudio;
        };

        that.sampleRate = audioCtx.sampleRate;

        return that;
    };

    return initialise;
})(
    com.littleDebugger.daw.audioLoader,
    com.littleDebugger.daw.getAudioContext);

Scripts/com/littleDebugger/daw/dsp/bitDepthReduction.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw.dsp");

// Pseudo bit depth reduction effect.
// <bitReductionControl> Control for reducing bit depth.
// - value - The 'pseudo' bit depth of the output.
// <ditherControl> Control for adding dither.
// - value - Decimal 0 to 1 for amount of dither.
com.littleDebugger.daw.dsp.bitDepthReduction = function (bitReductionControl, ditherControl) {

    // Process audio buffer.
    // <inputBuffer> The buffer to be processed.
    // <outputBuffer> The processed buffer.
    return function (inputBuffer, outputBuffer) {

        // Number of bits for the output.
        var bits = bitReductionControl.value;
        // Number of unique values represented by the bits.
        var values = Math.pow(2, bits);
        // The interval between values on the -1 to +1 scale used by the Web Audio API.
        // This also takes 0 (the midpoint) into account, so one value is lost.
        // This only makes a significant difference at very low bit depths.
        var interval = 1 / ((values / 2) - 1);

        // Iterate over each sample in the buffer.
        for (var sample = 0; sample < inputBuffer.length; sample++) {
            // Generate the dither value
            var dither = (((Math.random() * (interval * 2)) - interval) * ditherControl.value);

            // Quantize the amplitude + dither (if any).
            var quantizedAmplitude = Math.round((inputBuffer[sample] + dither) / interval) * interval;

            // Assign output sample.
            outputBuffer[sample] = quantizedAmplitude;
        }
    }
};
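To make the interval arithmetic above concrete, here is a stand-alone sketch of the quantization step with plain numbers in place of the DOM controls (the `quantize` helper is hypothetical, not part of the project):

```javascript
// Hypothetical stand-alone version of the quantization step above,
// with a plain bit depth argument instead of a DOM control.
var quantize = function (amplitude, bits) {
    // Number of unique values the bits can represent.
    var values = Math.pow(2, bits);
    // Spacing between values on the -1 to +1 scale.
    var interval = 1 / ((values / 2) - 1);
    // Round the amplitude to the nearest representable value.
    return Math.round(amplitude / interval) * interval;
};

// At 2 bits the interval is 1, so only -1, 0 and +1 survive:
// quantize(0.3, 2) rounds down to 0, quantize(0.6, 2) rounds up to 1.
```

At 16 bits the interval is tiny (about 0.00003), which is why quantization error is effectively inaudible at CD bit depth unless you deliberately reduce it like this effect does.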

Scripts/com/littleDebugger/daw/dsp/gain.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw.dsp");

// Applies gain to signal.
com.littleDebugger.daw.dsp.gain = function (gainControl) {
    // Process audio buffer.
    // <inputBuffer> The buffer to be processed.
    // <outputBuffer> The processed buffer.
    return function (inputBuffer, outputBuffer) {
        var gain = gainControl.value;
        for (var sample = 0; sample < inputBuffer.length; sample++) {
            outputBuffer[sample] = inputBuffer[sample] * gain;
        }
    }
};
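Gain here is a bare multiplier applied to the -1 to +1 samples. If a control expressed gain in decibels instead (as mixers usually do), converting it to this linear multiplier would look something like the following sketch (the `dbToLinear` helper is hypothetical, not part of the project):

```javascript
// Convert a gain in decibels to the linear multiplier the loop above expects.
var dbToLinear = function (db) {
    return Math.pow(10, db / 20);
};

// 0 dB leaves the signal untouched; -6 dB roughly halves the amplitude.
```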

Scripts/com/littleDebugger/daw/dsp/sampleRateReduction.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw.dsp");

// Pseudo sample rate reduction effect.
com.littleDebugger.daw.dsp.sampleRateReduction = function (sampleRateReductionControl) {

    var currentPos = 0;
    var heldSample = 0;

    // Process audio buffer.
    // <inputBuffer> The buffer to be processed.
    // <outputBuffer> The processed buffer.
    return function (inputBuffer, outputBuffer) {
        var reductionFraction = sampleRateReductionControl.value;

        for (var sample = 0; sample < inputBuffer.length; sample++) {
            currentPos++;
            if (currentPos >= reductionFraction) {
                heldSample = inputBuffer[sample];
                currentPos = 0;
            }

            outputBuffer[sample] = heldSample;
        }
    };
};
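The hold-and-repeat loop above can be tried outside the Web Audio API with ordinary arrays. This sketch (names hypothetical) mirrors the same logic with a plain number in place of the control:

```javascript
// Hypothetical stand-alone version of the sample-and-hold loop above.
// Every time the counter reaches keepEvery, a fresh input sample is
// grabbed; otherwise the last grabbed sample is repeated.
var sampleAndHold = function (input, keepEvery) {
    var held = 0;
    var pos = 0;
    var output = [];
    for (var i = 0; i < input.length; i++) {
        pos++;
        if (pos >= keepEvery) {
            held = input[i];
            pos = 0;
        }
        output.push(held);
    }
    return output;
};

// sampleAndHold([1, 2, 3, 4, 5, 6], 2) gives [0, 2, 2, 4, 4, 6]:
// half the input samples are discarded and each kept one is doubled up.
```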

Scripts/com/littleDebugger/daw/dsp/visualiser.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw.dsp");

// Visualiser module.
com.littleDebugger.daw.dsp.visualiser = function () {
    // Function to create a visualiser. 
    // The parameters represent the visualiser controls, but since they are passed in there is no 
    // dependency on the DOM.
    // The controls do not need to be a specific type of element; they just need the 
    // appropriate properties (child properties indented with '-').

    // <waveDisplayConfigs> Array of objects. Each object has properties for the configuration of each waveform 
    // drawn on the visualiser.
    // <canvas> Canvas element which where the visualiser will be drawn.
    // <waveWidthControl> Control for the width of the waveform lines. 
    // -<value> Line width in pixels.
    // <fitToVisualiserWidthControl> Control for stretching/contracting the buffer to fit neatly into the width of the 
    // visualiser. 
    // -<checked> Boolean property.
    // <refreshRateControl> Sets how many buffers the visualiser should receive before updating.
    // -<value> Integer property.
    var initialise = function (
        waveDisplayConfigs,
        canvas,
        waveWidthControl,
        fitToVisualiserWidthControl,
        refreshRateControl) {
        // Setup refresh rate.
        var visualFrame = 1;

        var visualiser = {};

        // Get canvas context.
        var ctx = canvas.getContext('2d');

        // Will be the vertical midpoint of the canvas.
        var verticalMidpoint;

        // Get the vertical point on the canvas for an amplitude.
        var getVerticalPoint = function (verticalMidpoint, amplitude) {
            return verticalMidpoint + (amplitude * verticalMidpoint);
        };

        // Draw wave on canvas.
        // <inputData> Audio buffer.
        // <ctx> Canvas context.
        // <strokeStyle> Colour of wave line.
        // <alpha> Alpha of wave line.
        var drawLine = function (inputData, ctx, strokeStyle, alpha) {
            ctx.globalAlpha = alpha;
            ctx.beginPath();
            ctx.strokeStyle = strokeStyle;
            ctx.lineWidth = waveWidthControl.value;
            ctx.moveTo(0, getVerticalPoint(verticalMidpoint, inputData[0]));

            var fit = fitToVisualiserWidthControl.checked;
            var inputLength = inputData.length;
            var canvasWidth = canvas.width;

            for (var sample = 1; sample < inputLength; sample++) {
                var x = fit ? (sample / inputLength) * canvasWidth : sample;
                ctx.lineTo(x, getVerticalPoint(verticalMidpoint, inputData[sample]));
            }

            ctx.stroke();
        };

        // Refresh the canvas with new buffers.
        // <buffers> Array of buffers to display.
        visualiser.drawWave = function (buffers) {
            // Check if the canvas should be updated.
            if (visualFrame % refreshRateControl.value == 0) {
                visualFrame = 1;
            } else {
                visualFrame++;
                return;
            }

            // Clear the canvas (could be optimised).
            ctx.clearRect(0, 0, canvas.width, canvas.height);

            // Iterate over each buffer and draw the wave.
            var i = 0;
            buffers.forEach(function (buffer) {
                var colour = waveDisplayConfigs[i].colour;
                var alpha = waveDisplayConfigs[i].alpha;
                drawLine(buffer, ctx, colour, alpha);
                i++;
            });
        };

        visualiser.setDimensions = function (width, height) {
            canvas.width = width;
            canvas.height = height;
            verticalMidpoint = canvas.height / 2;
        };

        return visualiser;
    };

    return initialise;
}();

Scripts/com/littleDebugger/daw/dsp/generator/sineWave.js

// The sine (sinusoidal) wave generator.
com.littleDebugger.namespacer.createNamespace("com.littleDebugger.daw.dsp.generator.sineWave");

// 'Constructor'
// <offset> The offset of the initial phase in degrees. - Starting from 9 o'clock.
// <sampleRate> The sample rate of the audio player. This is required for correct frequency waves.
com.littleDebugger.daw.dsp.generator.sineWave = function(offset, sampleRate) {
    var circleRadians = 2 * Math.PI;
    var circleDegrees = 360;

    var that = {};
    var phase;

    // Reset the state of the generator so it can be reused.
    that.reset = function() {
        phase = circleRadians * (offset / circleDegrees);
    };

    // Get the next sample in the cycle.
    // <frequency> The frequency of the wave.
    that.getSample = function(frequency) {
        var nextValue = Math.sin(phase);
        phase += circleRadians * (frequency / sampleRate);
        if (phase > circleRadians) {
            phase = phase % circleRadians;
        }

        return nextValue;
    };

    that.reset();

    return that;
};
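The phase-accumulation idea in `getSample` is easy to check with small numbers: at a (deliberately tiny) sample rate of 4 and a frequency of 1, one cycle takes exactly 4 samples. A minimal sketch, with a hypothetical `makeSine` factory standing in for the module above:

```javascript
// Hypothetical minimal version of the phase accumulator above.
var makeSine = function (sampleRate) {
    var phase = 0;
    return function (frequency) {
        var value = Math.sin(phase);
        // Advance the phase by the fraction of a cycle one sample covers.
        phase = (phase + 2 * Math.PI * (frequency / sampleRate)) % (2 * Math.PI);
        return value;
    };
};

// With sampleRate = 4 and frequency = 1 the samples cycle through
// roughly 0, 1, 0, -1 and repeat every 4th sample.
```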

Scripts/com/littleDebugger/ui/fullScreenEvent.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.ui");


com.littleDebugger.ui.fullScreenEvent = function (elementToFillScreen, controlElement, exitFullScreenCallback) {
    controlElement.onclick = function () {
        if (elementToFillScreen.requestFullscreen) {
            elementToFillScreen.requestFullscreen();
        } else if (elementToFillScreen.webkitRequestFullscreen) {
            elementToFillScreen.webkitRequestFullscreen();
        } else if (elementToFillScreen.mozRequestFullScreen) {
            elementToFillScreen.mozRequestFullScreen();
        } else if (elementToFillScreen.msRequestFullscreen) {
            elementToFillScreen.msRequestFullscreen();
        }
    };

    if (typeof (exitFullScreenCallback) != 'undefined') {

        var exitHandler = function (e) {
            if (document.fullscreenElement === null
                || document.webkitIsFullScreen === false
                || document.mozFullScreen === false
                || document.msFullscreenElement === null) {
                exitFullScreenCallback();
            }
        };

        document.addEventListener('webkitfullscreenchange', exitHandler, false);
        document.addEventListener('mozfullscreenchange', exitHandler, false);
        document.addEventListener('fullscreenchange', exitHandler, false);
        document.addEventListener('MSFullscreenChange', exitHandler, false);
    }
};

Scripts/com/littleDebugger/ui/screenInput.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.ui");

// Handles touch/click events within an element. Clicks/drags on the X axis set controlX.value
// from controlX.min (left hand side of the element) to controlX.max (right hand side of the
// element); the Y axis sets controlY.value the same way, from bottom to top.
// Each control needs min, max and value properties and an onchange() event.
com.littleDebugger.ui.screenInput = function (el, controlX, controlY) {
    // http://www.html5canvastutorials.com/advanced/html5-canvas-mouse-coordinates/
    function getMousePos(el, evt) {
        var rect = el.getBoundingClientRect();
        return {
            x: evt.clientX - rect.left,
            y: evt.clientY - rect.top
        };
    }

    var update = false;

    var updateMouse = function (evt) {
        var mousePos = getMousePos(el, evt);

        if (typeof controlX !== 'undefined') {
            // need actual dimensions http://stackoverflow.com/a/4032188/6830533
            var x = mousePos.x / el.scrollWidth;
            controlX.value = parseInt(controlX.min) + (x * (controlX.max - controlX.min));
            controlX.onchange();
        }

        if (typeof controlY !== 'undefined') {
            var y = 1 - (mousePos.y / el.scrollHeight);
            controlY.value = parseInt(controlY.min) + (y * (controlY.max - controlY.min));
            controlY.onchange();
        }
    };

    el.addEventListener('mousedown',
        function (evt) {
            update = true;
            updateMouse(evt);
            // Stop the browser's default drag behaviour.
            evt.preventDefault();
        }, false);


    // Stop updating when the mouse button is released anywhere on the page.
    window.addEventListener('mouseup',
        function () {
            update = false;
        }, false);


    el.addEventListener('mousemove',
        function (evt) {
            if (update)
                updateMouse(evt);
        }, false);


    el.addEventListener('touchstart',
        function (evt) {
            update = true;
            updateTouch(evt);
        }, false);

    el.addEventListener('touchend',
        function () {
            update = false;
        }, false);


    el.addEventListener('touchmove',
        function (evt) {
            if (update)
                updateTouch(evt);
        }, false);


    var updateTouch = function (evt) {
        var mousePos = getTouchPos(el, evt);

        if (typeof controlX !== 'undefined') {
            // need actual dimensions http://stackoverflow.com/a/4032188/6830533
            var x = mousePos.x / el.scrollWidth;
            controlX.value = parseInt(controlX.min) + (x * (controlX.max - controlX.min));
            controlX.onchange();
        }

        if (typeof controlY !== 'undefined') {
            var y = 1 - (mousePos.y / el.scrollHeight);
            controlY.value = parseInt(controlY.min) + (y * (controlY.max - controlY.min));
            controlY.onchange();
        }
    };

    function getTouchPos(el, evt) {
        var rect = el.getBoundingClientRect();
        return {
            x: evt.touches[0].clientX - rect.left,
            y: evt.touches[0].clientY - rect.top
        };
    }
};
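Both `updateMouse` and `updateTouch` apply the same position-to-value mapping. Pulled out with plain numbers (the helper name is hypothetical), it is just a linear scale from the control's min to its max:

```javascript
// Hypothetical stand-alone version of the mapping used above: fraction
// is the pointer position across the element as a value from 0 to 1.
var mapToRange = function (fraction, min, max) {
    return min + (fraction * (max - min));
};

// The far left maps to min, the far right to max, halfway to the middle.
```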

Scripts/com/littleDebugger/utility/ui/controlHelpers.js

com.littleDebugger.namespacer.createNamespace("com.littleDebugger.utility.ui");

// Hides the logic to show/hide elements on the DOM.
com.littleDebugger.utility.ui.controlWrapper = (function () {
    var hiddenClass = 'hidden';
    var that = {};

    that.wrapControl = function (control) {
        var wrapper = {};

        var inputType = control.nodeName == "INPUT";

        // Hides an element.
        wrapper.hideControl = function () {
            control.classList.add(hiddenClass);
        };

        // Shows an element.
        wrapper.showControl = function () {
            control.classList.remove(hiddenClass);
        };

        wrapper.setDisabled = function(){
            control.disabled = true;
        }

        wrapper.setEnabled = function(){
            control.disabled = false;
        }        

        wrapper.setValue = function (newValue) {
            if (inputType) {
                control.value = newValue;
            } else {
                control.innerHTML = newValue;
            }
        }

        wrapper.value = control.value;

        control.onchange = function(){
            wrapper.value = control.value;
            wrapper.onchange();
        }

        control.onclick = function(){

            wrapper.onclick();
        }

        wrapper.onchange = function() {};
        wrapper.onclick = function() {};
        wrapper.getValue = function(){
            if (inputType) {
                return control.value;
            } else {
                return control.innerHTML;
            }
        }

        

        return wrapper;
    };

    that.getWrappedControl = function(element){
        return that.wrapControl(document.getElementById(element));
    };

    return that;
})();

Styles/index.css

/* Set the oscilloscope background image on the canvas. */

#visualiserCanvas {
    background: url('../images/osc.jpg');
    background-size: 100% 100%;
}

/* Remove default spacing from the container. */

#container {
    padding: 0 0;
    margin: 0 0;
}

/* Loading message. */
.message {
    color: red;
}


/* Show/hide audio source controls. */
.hidden {
    display: none;
}


/* To stop the controls jumping around when changing audio source. */
#audioFileSelect {
    height: 1.5em;
}

.inline {
    display: inline-block;
    border-right: 2px solid white;
    margin: 0 0;
    padding: 0 1px;
    width:min-content;
    vertical-align: top;
}

#innerControls{
    padding: 0 0;
    margin: auto auto;
    max-width:max-content;
}

.non-selectable {
    -moz-user-select: none;
    -khtml-user-select: none;
    -webkit-user-select: none;
    -o-user-select: none;
    user-select: none;
} 

body {
    margin: 0 !important;
    background: black;
    color: white;
}