How to build a xylophone app with Audio API, React Native, and Expo

React Native, when used with Expo as a toolchain, eases the common pain points of managing iOS and Android applications. On top of that, there's a real delight in working with this ever-growing open source mobile application framework.

Expo has gained a lot of credibility as a framework that provides a collective set of solutions for building React Native applications, reducing the time and effort developers have to spend on them. The team continues to enhance it to keep up with the latest changes in the React Native community. That said, Expo SDK 33 is a blast.

With that, let's dive into one of Expo's APIs. In this step-by-step tutorial, you're going to use Expo's Audio API to develop a toy xylophone app.

Requirements

To follow this tutorial, please make sure you have the following installed on your local development environment and have access to the services mentioned below:

  • Node.js (>=8.x.x) with npm/yarn installed.
  • expo-cli (>= 2.19.4), which replaces create-react-native-app.
  • watchman, the file change watcher for React Native projects.
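
Before moving ahead, you can quickly verify that everything is in place by checking the versions from a terminal window (the exact version numbers on your machine may differ; if you use npm instead of yarn, check npm --version instead):

# verify the local development environment
node --version
yarn --version
expo --version
watchman --version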

Getting Started

To create a new Expo project, the only requirement is to have expo-cli installed. Then, execute the following command to create a new project directory.

expo init rn-xylophone-app

# navigate inside the app folder
cd rn-xylophone-app

# install the following dependency
yarn add expo-av

Once the project directory is generated, navigate inside it as shown in the above command. Then install the required dependency to add the functionality of playing an audio file inside the React Native app. The dependency expo-av will help you use the Audio API and its promise-based asynchronous methods to play the audio files. You’re going to implement this functionality later.

The last step is to have some sound files saved in your assets folder. You can, of course, use your own audio files, but if you want to use the same audio files used in this tutorial, you can download them at the link given below.
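
Assuming you save the seven notes with the file names used throughout this tutorial, the assets folder should end up looking something like this:

assets/
  note1.wav
  note2.wav
  note3.wav
  note4.wav
  note5.wav
  note6.wav
  note7.wav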

You might already have an idea of what the user interface is going to look like from the demo shown earlier. For each button, you're going to need a different color. Hence, create a new file called constants/Colors.js and add the following code.

export const NoteOne = 'red'
export const NoteTwo = 'orange'
export const NoteThree = 'yellow'
export const NoteFour = 'green'
export const NoteFive = '#00FFFF'
export const NoteSix = '#000080'
export const NoteSeven = '#B266FF'

Import this file and all of the color constants inside the App.js file, after the other imports.

// ...after other imports

import {
	NoteOne,
	NoteTwo,
	NoteThree,
	NoteFour,
	NoteFive,
	NoteSix,
	NoteSeven
} from './constants/Colors'

The color names are numbered to mark each audio file, which are named and numbered the same way. To import all the sound files needed to build the application from the assets folder, add the object below before the App component as shown.

const xyloSounds = {
	one: require('./assets/note1.wav'),
	two: require('./assets/note2.wav'),
	three: require('./assets/note3.wav'),
	four: require('./assets/note4.wav'),
	five: require('./assets/note5.wav'),
	six: require('./assets/note6.wav'),
	seven: require('./assets/note7.wav')
}

The above object xyloSounds contains the path to each sound file. This will be helpful when you're writing the business logic to play these audio files and need to determine which audio file to play for a specific note.
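
For example, once this object is defined, the source for any note can be looked up by its key. A tiny illustrative snippet (the key 'three' here is just an example):

// look up the bundled asset for the third note
const source = xyloSounds['three'] // same module as require('./assets/note3.wav')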

Build the first UI button

In this section, you’re going to create a button using TouchableOpacity that’s going to play the sound for the note when pressed. To start, make sure in the file App.js you’ve imported the following APIs from the react-native core.

import { StyleSheet, Text, View, TouchableOpacity } from 'react-native'

Then, you'll have to replace the default boilerplate JSX that any new Expo application comes with. Each button lives in its own View container, which has a fixed height and a margin of 5 to add some spacing between the buttons.

<View style={styles.container}>
	<View style={styles.buttonContainer}>
		<TouchableOpacity
			style={[styles.button, { backgroundColor: NoteOne }]}
			onPress={() => handlePlaySound('one')}
		>
			<Text style={styles.buttonText}>Note 1</Text>
		</TouchableOpacity>
	</View>
</View>

Notice that each button will have its background color specified in the file constants/Colors.js. This is done by using the inline styling method. To combine multiple styles in React Native, you can use an array notation like above.

The button has an onPress handler that's going to be responsible for playing the correct sound for the note. You'll be creating the method handlePlaySound in the next section. However, do note that the value one being passed to this method is the key you specified earlier for each audio file in the xyloSounds object. Lastly, the button displays some text showing the note's number.

The above snippet is followed by the styles that are created using the StyleSheet.create() method.

const styles = StyleSheet.create({
	container: {
		flex: 1,
		backgroundColor: '#fff',
		marginTop: 50
	},
	buttonContainer: {
		height: 40,
		margin: 5
	},
	button: {
		flex: 1,
		alignItems: 'center',
		justifyContent: 'center'
	},
	buttonText: {
		color: '#fff',
		fontSize: 18
	}
})

To see the current state of the application in action, go back to the terminal window and run yarn start, or expo start if you don't have yarn installed. In the simulator, you'll be greeted with the screen shown in the image below.

Add functionality to play audio files

To play a sound in an Expo application, you need to import the Audio class from expo-av. So at the top of the App.js file, after the other imports, add the following line.

import { Audio } from 'expo-av'

Next, you have to add the method handlePlaySound inside the App function, before the return statement. Inside this function, create a new sound object. Whenever you want to play a sound using the expo-av library, you have to create a new object. This object represents an instance of the class Audio.Sound.

const handlePlaySound = async note => {
	// create a new sound object for every playback
	const soundObject = new Audio.Sound()

	try {
		// load the first audio file from the assets folder
		let source = require('./assets/note1.wav')
		await soundObject.loadAsync(source)
		await soundObject
			.playAsync()
			.then(async playbackStatus => {
				// unload the sound from memory once it has finished playing
				setTimeout(() => {
					soundObject.unloadAsync()
				}, playbackStatus.playableDurationMillis)
			})
			.catch(error => {
				console.log(error)
			})
	} catch (error) {
		console.log(error)
	}
}

In the above snippet, notice that the method handlePlaySound accepts one parameter: the note's number, hence the parameter name note. The first line inside the function creates an instance of the class Audio.Sound().

Since the async/await syntax is being used, it's a good idea to wrap these calls in a try/catch block so that the app doesn't throw an unhandled error at runtime. Inside this block, the method loadAsync is used to create and load the sound from its source. Hence, the explicitly defined variable source, which contains the path of the first audio file from the assets folder, is passed to it.

To play the sound, the playAsync() method is used. This method returns a promise that resolves with a playbackStatus object, whose playableDurationMillis property indicates how long the loaded audio can play from memory.

Once the audio file has played, the soundObject calls the method unloadAsync(), which unloads the media file from memory. This allows the media file to be played again and again. The setTimeout delay is based on the duration of the media file being played from memory.
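
As a side note, if you'd prefer not to rely on setTimeout, expo-av can also notify you when playback finishes. Here's a minimal sketch of that alternative, assuming the same soundObject as above:

// unload the sound from memory as soon as playback finishes
soundObject.setOnPlaybackStatusUpdate(status => {
	if (status.didJustFinish) {
		soundObject.unloadAsync()
	}
})

await soundObject.loadAsync(require('./assets/note1.wav'))
await soundObject.playAsync()

Either approach works for short notes like these; this tutorial sticks with the setTimeout version.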

Go back to the simulator or the device the current app is running on and try to press the first button. You’ll hear the sound of the first note.

Finishing the App

To finish building the application, you have to read the path of each audio file from the object xyloSounds instead of hard-coding the first note. Edit the value of source inside the method handlePlaySound(), and add a button for each note, making sure to pass the correct note key to the onPress() method, as shown below.
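
The only line that changes inside handlePlaySound is the one that picks the source; it now reads the path from the xyloSounds object using the note key:

// read the path of the audio file for the pressed note
let source = xyloSounds[note]

With that change, and a button wired up for each of the seven notes, here's the complete code of the App.js file for your reference.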

import React from 'react'
import { StyleSheet, Text, View, TouchableOpacity } from 'react-native'
import { Audio } from 'expo-av'

import {
	NoteOne,
	NoteTwo,
	NoteThree,
	NoteFour,
	NoteFive,
	NoteSix,
	NoteSeven
} from './constants/Colors'

const xyloSounds = {
	one: require('./assets/note1.wav'),
	two: require('./assets/note2.wav'),
	three: require('./assets/note3.wav'),
	four: require('./assets/note4.wav'),
	five: require('./assets/note5.wav'),
	six: require('./assets/note6.wav'),
	seven: require('./assets/note7.wav')
}

export default function App() {
	const handlePlaySound = async note => {
		const soundObject = new Audio.Sound()

		try {
			let source = xyloSounds[note]
			// let source = require('./assets/note1.wav')
			await soundObject.loadAsync(source)
			await soundObject
				.playAsync()
				.then(async playbackStatus => {
					setTimeout(() => {
						soundObject.unloadAsync()
					}, playbackStatus.playableDurationMillis)
				})
				.catch(error => {
					console.log(error)
				})
		} catch (error) {
			console.log(error)
		}
	}

	return (
		<View style={styles.container}>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteOne }]}
					onPress={() => handlePlaySound('one')}
				>
					<Text style={styles.buttonText}>Note 1</Text>
				</TouchableOpacity>
			</View>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteTwo }]}
					onPress={() => handlePlaySound('two')}
				>
					<Text style={styles.buttonText}>Note 2</Text>
				</TouchableOpacity>
			</View>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteThree }]}
					onPress={() => handlePlaySound('three')}
				>
					<Text style={styles.buttonText}>Note 3</Text>
				</TouchableOpacity>
			</View>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteFour }]}
					onPress={() => handlePlaySound('four')}
				>
					<Text style={styles.buttonText}>Note 4</Text>
				</TouchableOpacity>
			</View>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteFive }]}
					onPress={() => handlePlaySound('five')}
				>
					<Text style={styles.buttonText}>Note 5</Text>
				</TouchableOpacity>
			</View>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteSix }]}
					onPress={() => handlePlaySound('six')}
				>
					<Text style={styles.buttonText}>Note 6</Text>
				</TouchableOpacity>
			</View>
			<View style={styles.buttonContainer}>
				<TouchableOpacity
					style={[styles.button, { backgroundColor: NoteSeven }]}
					onPress={() => handlePlaySound('seven')}
				>
					<Text style={styles.buttonText}>Note 7</Text>
				</TouchableOpacity>
			</View>
		</View>
	)
}

const styles = StyleSheet.create({
	container: {
		flex: 1,
		backgroundColor: '#fff',
		marginTop: 50
	},
	buttonContainer: {
		height: 40,
		margin: 5
	},
	button: {
		flex: 1,
		alignItems: 'center',
		justifyContent: 'center'
	},
	buttonText: {
		color: '#fff',
		fontSize: 18
	}
})

Now run the application in the simulator, and you’ll get the following screen.

Conclusion

You have reached the end of this tutorial. I hope you've learned how to integrate the expo-av library and its Audio class to play audio media files in your cross-platform applications. The important things to note in this demo application are how to use available methods like loadAsync() and unloadAsync(), and how to leverage the duration of the playing media via the playbackStatus object.

You can find the complete code for this tutorial at the GitHub repository below.

For a complete reference on what else you can do with the Audio class, please refer to the official documentation.
