Can't record a canvas that consists of objects that react to an audio file #28
Comments
Hi, thanks for filing this! When I wrote the video handling code, I wondered whether anyone would have this use case, so thanks for a real-world example! Currently, audio elements aren't handled. You may have some luck just changing the audio element to a video element: I believe audio files can work as video files, and there is some support for video elements. In a future version, I'll try to add audio support.
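For illustration, a minimal sketch of that workaround, assuming the page builds its media element in script (the file path below is hypothetical):

```js
// Play the MP3 through a <video> element instead of an <audio> element,
// since the recorder already has some support for <video> nodes.
const media = document.createElement('video');
media.src = '/audio/track.mp3'; // hypothetical path
media.controls = true;
media.loop = true;
document.body.appendChild(media);
```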
Thanks a lot for your time! I tried this and it works: I adjusted the node name checks to match 'audio' instead of 'video'. I drew an MP3 player and it gets played and rendered correctly. But for some reason, the audio visualization (connected via AudioContext/AnalyserNode) doesn't show up. I draw the animation on a canvas with requestAnimationFrame, using data from the analyser, and it works perfectly fine if I open the page directly in the browser. I'm trying to figure out what the cause could be; do you have any suggestions? Once again, thanks a lot for your time!
I tried to do this first, but it gives me the same problem. Before I edited the …
I'm not very familiar with how …
I should also add that videos modified via …
Yes, this is a very simple example of what I'm trying to record:

```js
let audio = new Audio();
audio.src = '/audio/track.mp3';
audio.controls = true;
audio.loop = true;
audio.autoplay = false;

// Establish all variables that the analyser will use
let canvas, ctx, source, context, analyser, fbc_array, bars, bar_x, bar_width, bar_height;

function initMp3Player() {
  document.getElementById('audio').appendChild(audio);
  window.AudioContext = window.AudioContext || window.webkitAudioContext;
  context = new AudioContext();
  analyser = context.createAnalyser();
  canvas = document.getElementById('visualizer');
  ctx = canvas.getContext('2d');
  // Route the audio element through the analyser and on to the speakers
  source = context.createMediaElementSource(audio);
  source.connect(analyser);
  analyser.connect(context.destination);
  frameLooper();
}

function frameLooper() {
  window.requestAnimationFrame(frameLooper);
  // Grab the current frequency data and draw one bar per bin
  fbc_array = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(fbc_array);
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = '#00CCFF';
  bars = 100;
  for (let i = 0; i < bars; i++) {
    bar_x = i * 3;
    bar_width = 2;
    bar_height = -(fbc_array[i] / 2); // negative height draws upward from the bottom edge
    ctx.fillRect(bar_x, canvas.height, bar_width, bar_height);
  }
}
```
It took me a minute, but I found a way to do it: preprocess the audio by using an OfflineAudioContext.
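A minimal sketch of that preprocessing idea, assuming the goal is to sample the analyser once per output frame ahead of time so the visualization no longer depends on real-time playback (the file URL, frame rate, and FFT size below are hypothetical):

```js
// Decode the track and snapshot analyser data at fixed time steps with an
// OfflineAudioContext, instead of reading a live analyser during capture.
async function precomputeFrequencyFrames(url = '/audio/track.mp3', fps = 60) {
  const response = await fetch(url);
  const encoded = await response.arrayBuffer();

  // Use a throwaway context just to decode the file and learn its length/rate.
  const decodeCtx = new (window.AudioContext || window.webkitAudioContext)();
  const buffer = await decodeCtx.decodeAudioData(encoded);

  const offline = new OfflineAudioContext(
    buffer.numberOfChannels,
    buffer.length,
    buffer.sampleRate
  );
  const source = offline.createBufferSource();
  source.buffer = buffer;

  const analyser = offline.createAnalyser();
  analyser.fftSize = 256;
  source.connect(analyser);
  analyser.connect(offline.destination);

  const frames = [];
  const frameCount = Math.floor(buffer.duration * fps);

  // Suspend offline rendering at each frame time and snapshot the frequency bins.
  for (let i = 1; i < frameCount; i++) {
    offline.suspend(i / fps).then(() => {
      const bins = new Uint8Array(analyser.frequencyBinCount);
      analyser.getByteFrequencyData(bins);
      frames.push(bins);
      offline.resume();
    });
  }

  source.start(0);
  await offline.startRendering();
  return frames; // one Uint8Array of frequency data per video frame
}
```

With the frequency data precomputed, a frameLooper like the one above can index into the returned array by frame number during capture instead of reading a live analyser.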
Awesome! Glad to hear that you got it working. If you eventually want to share the end result and the code, I'm interested in seeing it.
I'd be interested to see, too, @frizurd!
Hey there,
I really love the plugin and thank you very much for sharing it with us 🙏
I'm trying to record a local webpage that consists of multiple HTML canvases and an HTML audio element. The canvases react and move based on the audio file; I'm hoping to record the movement and glue the MP3 file to the video after it's created (a muxing sketch follows at the end of this post).
In the preparePage function I trigger the page to play the audio element, which makes the canvases animate, and that all works fine. But the actual video isn't real-time/aligned with the audio file; it skips a lot of frames in between. It feels like it's only recording at 1 FPS.
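For reference, a hedged sketch of how that trigger might look, assuming a timecut/timesnap-style recorder whose preparePage hook receives the Puppeteer page (the module name, option values, and selector below are all assumptions for illustration):

```js
// Hypothetical invocation: the exact recorder options depend on the plugin and version.
const timecut = require('timecut');

timecut({
  url: 'file:///path/to/page.html', // hypothetical local page
  fps: 60,
  duration: 10,
  output: 'canvas-only.mp4',
  preparePage: function (page) {
    // Start the audio element so the canvases begin animating before capture.
    return page.evaluate(() => document.querySelector('audio').play());
  }
});
```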
Is there some way of making this awesome plugin work for my use case or am I misunderstanding something?
Thank you in advance.
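Since the audio is meant to be glued on after the capture, here is a minimal muxing sketch from Node, assuming ffmpeg is installed and on the PATH (the file names are hypothetical):

```js
// Mux the silent canvas recording with the original MP3 track.
const { execFileSync } = require('child_process');

execFileSync('ffmpeg', [
  '-i', 'canvas-only.mp4', // silent video produced by the recorder
  '-i', 'track.mp3',       // original audio track
  '-c:v', 'copy',          // keep the video stream untouched
  '-c:a', 'aac',           // re-encode the audio for MP4 compatibility
  '-shortest',             // stop at the shorter of the two inputs
  'final.mp4'
]);
```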