HertzJs
HertzJs is a TypeScript audio engine library designed to help developers build web-based multi-track audio editors.
Notes
This project is still a work in progress. Please do not use it in production yet: we are still working on the initial version, and the API might change in breaking ways.
Built with
This project is built with TypeScript and relies on the Web Audio API (AudioContext).
How to install
npm install hertzjs
How to use
Creating & importing a project
First, we create a project instance. We can either create a new, empty project
let project = new AudioProject();
Or, if we have a JSON serialization of the project, we can import it instead.
let project = AudioProject.createFromJson(`
{
    "cursor": 0,
    "tracks": [
        {
            "clips": [
                {
                    "path": "file3.mp3",
                    "startsAt": 3,
                    "offset": 3,
                    "duration": 6,
                    "effects": [
                        {
                            "name": "fade-out",
                            "params": {
                                "duration": 4
                            }
                        }
                    ]
                }
            ]
        }
    ]
}
`);
In the previous code snippet, we are importing a project with one track, which contains one clip. The clip references a file called file3.mp3, starts playing at second 3 of the project, skips the first 3 seconds of the file's content, and plays for 6 seconds. It also has one fade-out effect covering its last 4 seconds. So if we play the project, we should hear the music start at second 3, play until second 5, and then fade out until second 9.
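To make that timing arithmetic explicit, here is a small illustration (plain numbers, not part of the HertzJs API):
// startsAt = 3, offset = 3 (skipped file content), duration = 6, fade-out = 4
const startsAt = 3, duration = 6, fadeOutDuration = 4;
const playEnd = startsAt + duration;            // 3 + 6 = 9
const fadeOutStart = playEnd - fadeOutDuration; // 9 - 4 = 5
console.log(`Audible from ${startsAt}s to ${playEnd}s, fading out from ${fadeOutStart}s`);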
Adding clips to a project
Adding a clip is simple: we get the track that we want to add a clip to, and then use the newClip method to insert a new clip, as follows
let track = project.getTracks()[0]; // Get the first track
let clip = track.newClip('file1.wav'); // Insert a new clip into that track

// We can also change the clip's properties
clip.setStartsAt(3);
clip.setDuration(8);
clip.setOffset(2);

// We can even add effects to the clip
clip.addEffect('fade-in', { duration: 2 });
Exporting the project
If we want to save the project, for example to a database, we call the toJson() method as follows.
let json = project.toJson();
// Save the JSON content to a database or somewhere safe.
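As a quick illustration, the exported JSON can be round-tripped through any string store; localStorage is used here purely as an example:
localStorage.setItem('my-project', json);

// Later, restore the project from the stored JSON
let restored = AudioProject.createFromJson(localStorage.getItem('my-project'));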
Undo & Redo
The project supports undo and redo.
Whenever you need to commit the project to the history, simply use the commit()
method as follows.
project.commit(); // Saves the current version in the history; should be called whenever the user performs an action.
To undo the changes, we simply use the undo()
method as follows
project.undo();
To perform a redo we can use the redo()
method as follows
project.redo();
Note: if we undo and then commit, we can no longer redo, because the future versions in the history have been overwritten.
How does the undo/redo work internally?
The undo and redo features are possible thanks to the History class, which manages the various versions of the project. Every version stored in that class is simply the JSON of the project, extracted when we performed a commit. Undo simply clears the project and imports the JSON of the previous version; redo does the same with the next version.
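Based on that description, the mechanism can be pictured roughly like the sketch below. This is only an illustration; the actual History class in HertzJs may use different names and details.
class HistorySketch {
    private versions: string[] = [];
    private index = -1; // position of the current version

    commit(json: string): void {
        // Committing after an undo discards the "future" versions,
        // which is why redo is no longer possible afterwards.
        this.versions.splice(this.index + 1);
        this.versions.push(json);
        this.index = this.versions.length - 1;
    }

    undo(): string | undefined {
        return this.index > 0 ? this.versions[--this.index] : undefined;
    }

    redo(): string | undefined {
        return this.index < this.versions.length - 1 ? this.versions[++this.index] : undefined;
    }
}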
Playback
To play the project, we can simply use the play()
method as follows
project.play();
Rendering
To render the project into an audio buffer, we can simply use the render() method, which returns a promise, as follows
project.render().then((buffer) => {
    console.log('buffer1', project.playBuffer(buffer)); // You can play the buffer, or use it in some other way
});
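Since render() returns a promise, the same call also works with await inside an async function:
const buffer = await project.render();
project.playBuffer(buffer); // Play the rendered buffer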
Events
The project fires various events that we can listen to in order to react to changes. For example
// Fires when the player starts playing
project.on('play', () => {
    console.log('playing...');
});

// Fires when the cursor position updates, for example to drive UI interactions
project.on('cursor:update', (second) => {
    if (playerUI)
        playerUI.style.width = `${second * 10}px`;
});
Classes & relationships
Every audio project is represented by an instance of the AudioProject class.
An AudioProject has an array of AudioTrack objects, and each AudioTrack has an array of AudioClip objects. An AudioClip references an audio file in the file system, for example file1.mp3.
The AudioClip defines when the audio starts (startsAt), how long it plays (duration), and how much of the file to skip (offset). It also has an array of AudioEffect objects, which represents the list of effects applied to that clip.
When we render the project, the system iterates over all the tracks; for each track it iterates over all of its clips, and for each clip it applies all of its effects. In reality, what we are doing here is constructing the audio graph: connecting the Web Audio API nodes with the right effects and with each other.
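That loop can be pictured roughly as in the sketch below. The minimal interfaces and accessor names (getClips(), getEffects(), getPath()) are assumptions made for the sketch; the actual HertzJs internals may differ.
// Minimal shapes assumed for this sketch only
interface ClipLike {
    getPath(): string;
    getStartsAt(): number;
    getOffset(): number;
    getDuration(): number;
    getEffects(): { apply(clip: ClipLike, ctx: OfflineAudioContext): AudioNode }[];
}
interface TrackLike { getClips(): ClipLike[]; }
interface ProjectLike { getTracks(): TrackLike[]; }

async function renderSketch(
    project: ProjectLike,
    buffers: Map<string, AudioBuffer>, // decoded audio, keyed by file path
    totalSeconds: number
): Promise<AudioBuffer> {
    const ctx = new OfflineAudioContext(2, 44100 * totalSeconds, 44100);
    for (const track of project.getTracks()) {
        for (const clip of track.getClips()) {
            const source = ctx.createBufferSource();
            source.buffer = buffers.get(clip.getPath())!;
            // Chain each effect's node after the source
            let tail: AudioNode = source;
            for (const effect of clip.getEffects()) {
                const node = effect.apply(clip, ctx); // e.g. a GainNode for a fade
                tail.connect(node);
                tail = node;
            }
            tail.connect(ctx.destination);
            // When to start, how far into the file to begin, and how long to play
            source.start(clip.getStartsAt(), clip.getOffset(), clip.getDuration());
        }
    }
    return ctx.startRendering();
}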
Creating new effects
Effects are stored in the ./src/effects/ folder. They extend the AudioEffect class and implement the AudioEffectInterface.
Here is an example fade-in effect.
import AudioClip from "../AudioClip";
import AudioEffect from "../AudioEffect";
import AudioEffectInterface from "../interfaces/AudioEffectInterface";

export default class FadeInEffect extends AudioEffect implements AudioEffectInterface {
    protected name = 'fade-in';

    public apply(audioClip: AudioClip, audioContext: AudioContext | OfflineAudioContext): any {
        console.log('Applying fade-in with params', this.params);

        const gainNode = audioContext.createGain();
        gainNode.gain.value = 0;
        const gainParam = gainNode.gain;

        const fadeInDuration = this.params?.duration ?? 2; // duration of the fade-in in seconds
        const fadeInStartTime = audioClip.getStartsAt(); // start time of the fade-in in seconds

        // Ramp the gain from 0 to 1 over the fade-in duration
        gainParam.setValueAtTime(0, fadeInStartTime);
        gainParam.linearRampToValueAtTime(1, fadeInStartTime + fadeInDuration);

        return gainNode;
    }
}
After creating an effect, we must register it in the ./effect/Utils.ts file.
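Once registered, the effect can be applied to a clip by name, exactly as shown earlier:
clip.addEffect('fade-in', { duration: 2 });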
Authors
This project was built by Nidhal Abidi for the Colmena project.