This is an addon to https://github.com/avinashpaliwal/Super-SloMo for event-based simulation. The code is standalone and updated to the latest PyTorch; you do not need to install avinashpaliwal's code.
For now, you can reuse the model trained by avinashpaliwal on the adobe240fps dataset.
This addon creates a video with motion-dependent interpolation: it produces more or fewer intermediate frames depending on the maximum optical flow between two consecutive frames.
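As a rough illustration of the idea (not the exact rule used by `async_slomo.py`), the number of in-between frames can grow with the peak optical-flow magnitude, modulated by a factor like `--lambda_flow`; the function name and formula below are hypothetical:

```python
import math

def num_interp_frames(max_flow_px, lambda_flow=0.5):
    # Hypothetical sketch: more interpolated frames when the maximum
    # optical-flow displacement (in pixels) between two frames is larger.
    # async_slomo.py's actual rule may differ.
    return max(1, math.ceil(lambda_flow * max_flow_px))

print(num_interp_frames(10.0))  # fast motion -> more frames
print(num_interp_frames(0.5))   # slow motion -> a single frame
```

The point is that the output frame rate is no longer constant, which is why per-frame timestamps are written alongside the video.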
Be sure the repo is in your PYTHONPATH. For example:
export PYTHONPATH=$PYTHONPATH:path/to/repo/Super-SloMo/
Now, you can run the main tool for one video:
python async_slomo.py path/to/video.mp4 path/to/output.mp4 --sf -1 --checkpoint path/to/checkpoint.ckpt --video_fps M --lambda_flow 0.5 --viz 1
Or for an entire folder:
python async_slomo.py input_path/to/ output_path/to/ --sf -1 --checkpoint path/to/checkpoint.ckpt --video_fps M
After running the script, you should see 2 files per video:
- output.mp4
- output_ts.npy
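The `*_ts.npy` file is assumed here to hold one timestamp per frame of the output video; a small self-contained sketch (the file name and values below are synthetic, for illustration only):

```python
import numpy as np

# Synthetic stand-in for an output_ts.npy file: one timestamp
# (in seconds) per frame of the interpolated video.
np.save("demo_ts.npy", np.array([0.0, 0.004, 0.012, 0.016]))

ts = np.load("demo_ts.npy")  # per-frame timestamps
dt = np.diff(ts)             # inter-frame intervals, non-uniform by design
print(ts.shape, dt)
```

Because interpolation is motion-dependent, the intervals in `dt` vary from frame to frame, which is exactly the information a downstream event-based simulator needs.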
These files can then be used by our event-based simulator, available with Metavision Intelligence.