Handling Audio Events with Channels in Redux Saga

Settling on Redux Saga channels to handle audio in a React application

Recently I was developing a React app that required audio playback. The first option I considered was the native HTML5 audio element; rendering the following element will play a simple audio clip.

<audio src="/audio-sample.mp3" autoPlay />

Browsers have started blocking most autoplay audio and video, so unless you manually whitelist the site the previous example won’t play automatically on page load. For more information on what gets blocked and how to work around it, I would suggest this article.
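One common way to cope with blocked autoplay is to inspect the promise returned by play(). Here is a minimal sketch under that assumption; the helper name startPlayback and the onBlocked callback (which might, say, swap the UI to a manual play button) are my own, not part of any API:

```javascript
// Starts playback and invokes onBlocked if the browser refuses autoplay.
// In modern browsers play() returns a Promise that rejects when playback
// is not allowed; older browsers return undefined.
function startPlayback(player, onBlocked) {
  const result = player.play();
  if (result !== undefined) {
    return result.catch(onBlocked);
  }
  return Promise.resolve();
}

// Usage (showPlayButton is a hypothetical fallback handler):
// startPlayback(new Audio('/audio-sample.mp3'), showPlayButton);
```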

This would have worked fine for a crude version 1 of what I was trying to achieve, but to create a good user experience I needed the UI to update in response to audio events such as play and ended.

One way we might handle this in vanilla JavaScript is by grabbing a reference to the DOM element, then attaching event handlers as required.

function playbackEnded() {
  // hide the audio playback indicator
}

function play() {
  const audioElement = document.querySelector('#my_audio_element');
  audioElement.addEventListener('ended', playbackEnded);
  audioElement.play();
  // show the audio playback indicator
}

There is an issue with the previous example: when using React, the changes we make are applied to the virtual DOM, which React then uses to calculate the changes it needs to make to the “actual” DOM during render. This means it’s important for us to avoid working directly with the DOM, otherwise our code and the React rendering lifecycle will be stepping on each other’s toes. If you would like to learn more, the React docs are a good place to start.

With this in mind, a more appropriate way to utilise audio events in React is to assign the event handlers as attributes on the audio element when it’s defined in a render method. We could use the native audio element featured previously, or a library such as React Sound for some additional functionality.

This is a basic example from the React Sound docs…

render() {
  return (
    <Sound
      url="cool_sound.mp3"
      playStatus={Sound.status.PLAYING}
      playFromPosition={300 /* in milliseconds */}
      onLoading={this.handleSongLoading}
      onPlaying={this.handleSongPlaying}
      onFinishedPlaying={this.handleSongFinishedPlaying}
    />
  );
}

Triggering Redux actions from onPlaying and onFinishedPlaying would have fit my needs nicely, and it’s an approach that would be perfectly sufficient for most people’s use cases. However, there’s something about using DOM elements to handle audio that doesn’t sit right with me; I’m more in favour of handling audio playback entirely with JavaScript, which might look like this…

const player = new Audio(uri);

const onPlay = () => {
  //update UI
};

const onEnd = () => {
  //update UI
};

player.addEventListener('play', onPlay);
player.addEventListener('ended', onEnd);

player.play();

Great: audio playback is handled entirely in JavaScript. But where do we fit this into a (reasonably) well-architected React + Redux application?

Although it would get the job done, I didn’t like the idea of cluttering my reducer functions with this audio-related code. The operations we are performing are asynchronous in nature: we call a function (play), then there may be a delay of unknown length if the audio clip needs to be loaded from a server, then the clip will play, and finally playback will be complete.

Audio playback is a kind of side effect, and in this regard very similar to AJAX requests, so it would be appropriate to handle it as a saga using Redux Saga.

redux-saga is a library that aims to make application side effects (i.e. asynchronous things like data fetching and impure things like accessing the browser cache) easier to manage, more efficient to execute, easy to test, and better at handling failures. - redux-saga.js.org

To better understand how Redux Saga is used to handle asynchronous operations I can suggest following the beginner tutorial in the Redux Saga docs.

In my simple use case I register my saga “playSoundSaga” to fire any time the PLAY_SOUND action is dispatched.

import { all, call, put, take, takeLatest } from 'redux-saga/effects';
import { ActionTypes } from 'constants/index';
import { createPlaySoundChannel } from './createPlaySoundChannel';

export function* playSoundSaga({ payload }) {
  const channel = yield call(createPlaySoundChannel, payload.uri);

  while (true) {
    const { play, ended } = yield take(channel);
    if (play) {
      yield put({
        type: ActionTypes.PLAYBACK_STARTED,
        payload: {},
      });
    }
    if (ended) {
      yield put({
        type: ActionTypes.PLAYBACK_ENDED,
        payload: {},
      });
      return;
    }
  }
}

export default function* root() {
  yield all([takeLatest(ActionTypes.PLAY_SOUND, playSoundSaga)]);
}

At a high level the saga creates an event channel that yields any time an audio event that we are interested in fires (in this case play and ended).

When the play event occurs I dispatch my PLAYBACK_STARTED action so that a playback-in-progress animation can be displayed. Because we are inside a while (true) loop, we then go back to waiting for the channel to yield another event. In our simple use case this will be the ended event, in which case we dispatch the PLAYBACK_ENDED action. We then return, which exits us from the infinite while loop and also from playSoundSaga (until it is triggered again by another PLAY_SOUND action).
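For completeness, the saga is kicked off from the UI by dispatching a PLAY_SOUND action whose payload carries the clip’s uri (the shape playSoundSaga reads). A sketch of what that action creator might look like; the name playSound is my own, and in the real code the type would come from the ActionTypes constants:

```javascript
// Hypothetical action creator; playSoundSaga reads payload.uri.
const playSound = uri => ({
  type: 'PLAY_SOUND',
  payload: { uri },
});

// In a connected component:
// dispatch(playSound('/audio-sample.mp3'));
```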

The audio player object is now created, and its event handlers attached, inside the definition of the channel. This time, rather than firing actions directly from our event handlers, we emit events from the channel to be handled inside playSoundSaga.

import { buffers, eventChannel, END } from 'redux-saga';

export const createPlaySoundChannel = uri =>
  eventChannel(emitter => {
    const player = new Audio(uri);

    const onPlay = () => {
      emitter({ play: true });
    };

    const onEnd = () => {
      emitter({ ended: true });
      emitter(END);
    };

    player.addEventListener('play', onPlay);
    player.addEventListener('ended', onEnd);

    player.play();

    return () => {
      player.removeEventListener('play', onPlay);
      player.removeEventListener('ended', onEnd);
    };
  }, buffers.sliding(2));
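A note on buffers.sliding(2): a channel’s buffer decides what happens to messages emitted while no taker is waiting. A sliding buffer keeps only the most recent n messages, silently dropping the oldest when full, so the emitter never blocks or throws. A toy model of that behaviour in plain JavaScript (this is an illustration of the buffering policy, not redux-saga’s actual implementation):

```javascript
// Minimal model of a sliding buffer: new messages always fit, and the
// oldest message is dropped when the buffer is already at capacity.
function slidingBuffer(size) {
  const items = [];
  return {
    put(item) {
      if (items.length === size) items.shift(); // drop the oldest
      items.push(item);
    },
    take: () => items.shift(),
    isEmpty: () => items.length === 0,
  };
}
```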

This approach is a fairly complex way of handling my simple use case, but it would really come into its own when dealing with more complex chains of events, like firing actions to indicate progress. Full code examples can be found here and here; please leave a comment if you have any thoughts to share!