recording-app-saturn
v0.4.7
## VideoJSRecord Props:
```html
<VideoJSRecord
  v-if="isReady"
  ref="videoRecorder"
  position="1-2"
  :activeDevices="activeDevices"
  :customizations="customVideo"
  :onError="handleDeviceErrors"
  :onStart="triggerStartEvent"
  :onStream="streamHandler"
  :onComplete="handleSave"
  :refresh="stopRecording"
  :downloadLocally="false"
  :formallyRequest="false"
  :screenRecord="false"
  :mediaSelect="false"
  :autoStart="true"
  :isHidden="true"
  :noAudio="false"
  :pickDevice="2"
/>
```
- v-if - If isReady is true, the component is created; otherwise, it is destroyed.
- ref - The ref allows a certain degree of control over the videoRecorder component, including the ability to start, stop, pause, and resume the recording.
- position - This prop is expected to be a string. If so, it allows for placing the recording anywhere on the screen using a percentage-based coordinate system, so placement scales across multiple screen configurations.
- activeDevices - This prop is expected to be an array. If so, it allows the component to keep track of which cameras are currently being used.
- customizations - This prop is expected to be an object. If so, it allows the developer to customize the colors of the VideoJSRecord component, as well as change its body text.
- onError - This prop is expected to be a function. If so, it will be triggered when videoJS experiences a problem.
- onStart - This prop is expected to be a function. If so, it will be triggered as soon as the recording starts.
- onStream - This prop is expected to be a function. If so, it will execute while the video is recording. The entire player object is provided as an argument to this function. The timeSlice property (within VideoJSRecord) determines how often the onStream function is called, in milliseconds. DEFAULT: 1000 ms.
- onComplete - This prop is expected to be a function. If so, it will execute when the video recording has finished. The recorded data is returned as an array.
- refresh - This prop is expected to be a function. If so, it will allow for reloading the VideoJSRecord component.
- downloadLocally - This prop is expected to be a boolean. If so, it causes an automatic download of the recorded video when the recording has finished.
- formallyRequest - This prop is expected to be a boolean. If so, it will formally request that the user accept recording permissions before the browser requests it. This is considered a soft-landing solution to "accidentally" clicking "decline".
- screenRecord - This prop is expected to be a boolean. If so, it records the screen instead of using the camera.
- mediaSelect - This prop is expected to be a boolean. If so, it allows the end-user to select a camera or microphone. If disabled, the default camera/mic will be used.
- autoStart - This prop is expected to be a boolean. If so, it makes the video recording start as soon as the user agrees to share their camera/mic/screen feed.
- isHidden - This prop is expected to be a boolean. It hides the video recording from being visible but still indicates to the user that they're being recorded. This indication is done with the red "recording" dot and the acronym "REC".
- noAudio - This prop is expected to be a boolean. If set to true, it disables audio in the recording.
- pickDevice - This prop is expected to be a number. If so, it will pick the selected device out of the list of possible devices available in the system.
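For instance, a customizations object might look like the following. The key names mirror the commented-out example in App.js further below; the full set of supported keys is an assumption, so check the component before relying on it.

```js
// Hypothetical customizations object; key names taken from the
// commented-out example in App.js below.
const customVideo = {
  REC_color: "#F79e44", // color of the red "REC" indicator
  txt_color: "#fff1e3", // color of the body text
  bg_color: "#4e1d00",  // background color of the component
  btn_color: "#F79e44", // button color
  body: "",             // custom body text (empty string keeps the default)
};
console.log(Object.keys(customVideo).length); // 5
```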
Example Functions (in App.js):
```js
methods: {
  startRecording() {
    if (this.videoRecorder) {
      console.log("turning video back on...");
      this.isReady = true;
    }
  },
  pauseRecording() {
    if (this.videoRecorder) {
      this.$refs.videoRecorder.controls.pauseRecord = true;
    }
  },
  resumeRecording() {
    if (this.videoRecorder) {
      this.$refs.videoRecorder.controls.resumeRecord = true;
    }
  },
  stopRecording(restart = false) {
    if (this.videoRecorder) {
      this.$refs.videoRecorder.controls.stopRecord = true;
      setTimeout(() => {
        this.isReady = false;
        setTimeout(() => {
          if (restart === true) {
            this.startRecording();
          }
        }, 500);
      }, 1500);
    }
  },
  streamHandler(player, { deviceId }) {
    console.log("Stream Handler (player): ", player);
    console.log("Stream Handler (record): ", player.record());
    console.log("The device being used is:", deviceId);
  },
  handleSave(recording) {
    console.log("Saving:", recording);
    console.log("Event Triggered - recording has ENDED.");
  },
  handleDeviceErrors(error) {
    if (error != "camera-disconnected") this.isReady = false;
    console.log("Error -", error);
  },
  triggerStartEvent(player) {
    console.log("Event Triggered - recording has STARTED.", player.record());
  },
},
```
The methods above use the videoRecorder ref that is set on the component in the template.
Want to make this work on your end?
Copy and Paste the following code into your App.js
```vue
<template>
  <div id="app">
    <VideoJSRecord
      v-if="isReady"
      ref="videoRecorder"
      position="1-2"
      :activeDevices="activeDevices"
      :customizations="customVideo"
      :onError="handleDeviceErrors"
      :onStart="triggerStartEvent"
      :onStream="streamHandler"
      :onComplete="handleSave"
      :refresh="stopRecording"
      :downloadLocally="false"
      :formallyRequest="false"
      :screenRecord="false"
      :mediaSelect="false"
      :autoStart="true"
      :isHidden="true"
      :noAudio="false"
      :pickDevice="2"
    />
    <button @click="startRecording" :style="{ marginTop: '100px' }">
      Start Recording
    </button>
    <button @click="pauseRecording" :style="{ marginTop: '100px' }">
      Pause Recording
    </button>
    <button @click="resumeRecording" :style="{ marginTop: '100px' }">
      Resume Recording
    </button>
    <button @click="stopRecording" :style="{ marginTop: '100px' }">
      Stop Recording
    </button>
  </div>
</template>
<script>
import VideoJSRecord from "./VideoJSRecord.vue";
export default {
  name: "app",
  data() {
    return {
      isReady: true,
      customVideo: null,
      videoRecorder: null,
      activeDevices: [],
    };
  },
  created() {
    // this.customVideo = {
    //   REC_color: "#F79e44",
    //   txt_color: "#fff1e3",
    //   bg_color: "#4e1d00",
    //   btn_color: "#F79e44",
    //   body: "",
    // };
  },
  mounted() {
    if (this.$refs.videoRecorder) {
      this.videoRecorder = this.$refs.videoRecorder;
    }
  },
  methods: {
    startRecording() {
      if (this.videoRecorder) {
        console.log("turning video back on...");
        this.isReady = true;
      }
    },
    pauseRecording() {
      if (this.videoRecorder) {
        this.$refs.videoRecorder.controls.pauseRecord = true;
      }
    },
    resumeRecording() {
      if (this.videoRecorder) {
        this.$refs.videoRecorder.controls.resumeRecord = true;
      }
    },
    stopRecording(restart = false) {
      if (this.videoRecorder) {
        this.$refs.videoRecorder.controls.stopRecord = true;
        setTimeout(() => {
          this.isReady = false;
          setTimeout(() => {
            if (restart === true) {
              this.startRecording();
            }
          }, 500);
        }, 1500);
      }
    },
    streamHandler(player, { deviceId }) {
      console.log("Stream Handler (player): ", player);
      console.log("Stream Handler (record): ", player.record());
      console.log("The device being used is:", deviceId);
    },
    handleSave(recording) {
      console.log("Saving:", recording);
      console.log("Event Triggered - recording has ENDED.");
    },
    handleDeviceErrors(error) {
      if (error != "camera-disconnected") this.isReady = false;
      console.log("Error -", error);
    },
    triggerStartEvent(player) {
      console.log("Event Triggered - recording has STARTED.", player.record());
    },
  },
  components: {
    VideoJSRecord,
  },
};
</script>
<style>
/* change player background color */
#myVideo {
  background-color: rgb(0, 0, 0);
}
#app {
  display: flex;
  flex-direction: column;
  justify-content: center;
  align-items: center;
}
</style>
```
The only change required is the import statement of the component.
Want to livestream the recorded video?
For the best guide I've found, visit: https://joshuatz.com/posts/2020/appending-videos-in-javascript-with-mediasource-buffers/
Sending Data To The Backend
Reminder: streamHandler() will trigger every 1000 ms because of the timeSlice property.
Within streamHandler(), the last index of recordedData was utilized, similar to the following:
```js
const { length } = this.player.recordedData;
this.player.recordedData[length - 1].arrayBuffer().then(response => {
  ...;
  ...;
});
```
Within this function, a new Uint8Array() was created from the response; this variable is called bytes. These bytes are converted to binary by iterating through the entire bytes variable using its byteLength property. The converted values are stored in a variable called binary.
Following this conversion process, you send the result to the backend. You can pass it in the form of an object with an ID, like so:

```js
{ bytes: window.btoa(binary), id: this.videoId }
```
In the example below, we perform this conversion process by extracting the last blob of the player instance. Our backend handler is called meetingRoom; videoDataSendToServer() is a function on that handler that sends the data to the backend.
The Sample Code:
```js
streamHandler(player) {
  const { length: recordedDataLength } = player.recordedData;
  if (recordedDataLength != 0) {
    player.recordedData[recordedDataLength - 1].arrayBuffer().then(response => {
      let binary = '';
      let bytes = new Uint8Array(response);
      let len = bytes.byteLength;
      for (let i = 0; i < len; i++) {
        binary += String.fromCharCode(bytes[i]);
      }
      this.$meetingRoom.videoDataSendToServer({ bytes: window.btoa(binary), id: this.videoId });
    }).catch(error => {
      console.log('Error in streamHandler():\t', error);
    });
  }
},
```
Sending Data Back To The Frontend
The backend can raise an event elsewhere in the application. This event must be handled inside the mounted() hook of the component. We can call the event "sendBackBuffer". Let's continue with the same backend handler (meetingRoom) for this example:
```js
this.$meetingRoom.$on("sendBackBuffer", async (data) => {});
```
Inside this event handler we must convert the data back from binary to bytes. This can be done in the following manner:

```js
let binary_string = window.atob(data.bytes);
let len = binary_string.length;
let bytes = new Uint8Array(len);
for (let i = 0; i < len; i++) {
  bytes[i] = binary_string.charCodeAt(i);
}
console.log("Bytes Received:", bytes.byteLength);
```
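Taken together, the encode step in streamHandler() and this decode step form a lossless round trip. Here is a minimal standalone sketch, with the two loops pulled out into plain functions outside any component (btoa/atob are available in browsers and in Node 16+):

```js
// Encode a Uint8Array to base64, as done before sending to the backend.
function bytesToBase64(bytes) {
  let binary = "";
  for (let i = 0; i < bytes.byteLength; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary);
}

// Decode base64 back to a Uint8Array, as done in the "sendBackBuffer" handler.
function base64ToBytes(b64) {
  const binaryString = atob(b64);
  const bytes = new Uint8Array(binaryString.length);
  for (let i = 0; i < binaryString.length; i++) {
    bytes[i] = binaryString.charCodeAt(i);
  }
  return bytes;
}

// Round trip: the decoded bytes match the original recording chunk.
const original = new Uint8Array([26, 69, 223, 163, 0, 255]);
const encoded = bytesToBase64(original);
const decoded = base64ToBytes(encoded);
console.log(decoded.byteLength === original.byteLength); // true
```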
After re-converting the data back to bytes, we must re-create the blob from them. Finally, create an arrayBuffer from this blob. This can be done in the following manner:

```js
const mimeType = 'video/webm; codecs="vp9,opus"';
let blob = new Blob([bytes], { type: mimeType });
this.videoBuff = await blob.arrayBuffer();
```
The rest of the steps to livestreaming in the frontend are provided in the link at the beginning of this section.
Linking the MediaSource
Before this event handler, but still inside the mounted() hook, we must create a new MediaSource instance and link it to the <video /> tag in which you want to display the livestream. For this example, we'll give that video a ref called liveStreamPlayer. Let's not forget to first check whether MediaSource is supported in the client's browser.
```js
if (window.MediaSource) {
  this.mediaSource = new MediaSource();
  this.$refs.liveStreamPlayer.src = URL.createObjectURL(this.mediaSource);
  this.mediaSource.addEventListener("sourceopen", this.handleSourceOpen);
} else {
  console.log("The Media Source Extensions API is not supported on your browser.");
}
```
The handleSourceOpen function is covered by the guide linked at the beginning of this section.
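The guide covers the details, but the general shape of such a handler might look like the sketch below. This is an assumption based on the Media Source Extensions API, not code taken from this package:

```js
// Sketch only: the real handleSourceOpen is described in the linked guide.
// Assumes this.mediaSource and this.videoBuff exist on the component.
handleSourceOpen() {
  const mimeType = 'video/webm; codecs="vp9,opus"';
  // One SourceBuffer receives all incoming WebM chunks.
  this.sourceBuffer = this.mediaSource.addSourceBuffer(mimeType);
  this.sourceBuffer.addEventListener("updateend", () => {
    // Append the next chunk once the previous append has finished.
    if (this.videoBuff && !this.sourceBuffer.updating) {
      this.sourceBuffer.appendBuffer(this.videoBuff);
      this.videoBuff = null;
    }
  });
},
```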
How The Receiving Frontend Should Look:
```vue
<template>
  <video ref="liveStreamPlayer" autoplay="true" controls="true" />
</template>
<script>
export default {
  name: "appName",
  data() {
    return {
      mediaSource: null,
      videoBuff: null,
    };
  },
  mounted() {
    if (window.MediaSource) {
      this.mediaSource = new MediaSource();
      this.$refs.liveStreamPlayer.src = URL.createObjectURL(this.mediaSource);
      // handleSourceOpen comes from the guide linked above.
      this.mediaSource.addEventListener("sourceopen", this.handleSourceOpen);
    } else {
      console.log("The Media Source Extensions API is not supported on your browser.");
    }
    this.$meetingRoom.$on("sendBackBuffer", async (data) => {
      const mimeType = 'video/webm; codecs="vp9,opus"';
      let binary_string = window.atob(data.bytes);
      let len = binary_string.length;
      let bytes = new Uint8Array(len);
      for (let i = 0; i < len; i++) {
        bytes[i] = binary_string.charCodeAt(i);
      }
      console.log("Bytes Received:", bytes.byteLength);
      let blob = new Blob([bytes], { type: mimeType });
      this.videoBuff = await blob.arrayBuffer();
    });
  },
};
</script>
<style></style>
```