Apr 12, 2023
Thin Plate Spline Motion Model
A practical guide to using the Thin Plate Spline Motion Model for animating static images by transferring motion from a driving video, with setup instructions and observations about optimal video dimensions and motion tracking limitations.

This is my tentative workflow for using this repository to animate static images using a driving video.
GitHub Repository
Conda Environment and Usage
```shell
conda activate thin-plate-spline
cd C:\Users\trima\Documents\GitHub\Thin-Plate-Spline-Motion-Model
python demo.py --config config/vox-256.yaml --checkpoint checkpoints/vox.pth.tar --source_image assets/source.png --driving_video assets/driving.mp4
python demo.py --config config/vox-256.yaml --checkpoint checkpoints/vox.pth.tar --find_best_frame --source_image assets/0014.png --driving_video assets/driving.mp4 --result_video output.mp4
```
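The driving video generally needs to be square before it goes into demo.py, since the model was trained on 256x256 data. A small stdlib-only sketch for computing the largest centered square crop of an arbitrary resolution; the helper name is my own, not part of this repository:

```python
def center_crop_box(width: int, height: int) -> tuple:
    """Return (x, y, side) for the largest centered square crop."""
    side = min(width, height)
    x = (width - side) // 2
    y = (height - side) // 2
    return (x, y, side)

# A 1920x1080 driving clip crops to the centered 1080x1080 square.
print(center_crop_box(1920, 1080))  # (420, 0, 1080)
```

Those numbers can then be handed to ffmpeg, e.g. `ffmpeg -i driving.mp4 -vf "crop=1080:1080:420:0,scale=256:256" driving_square.mp4` (the `crop` filter takes w:h:x:y).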
First Test
- As far as I can tell, this program requires a `driving_video` with a 1:1 aspect ratio. Makes sense, because the model was trained on 256x256 data.
- It really doesn't like a zooming or panning camera. I diffused frame 420, and the frames nearest that frame are definitely where the `result_video` is most coherent; farthest from frame 420, it loses motion tracking entirely.
- The `result_video` is 256x256. By default, this program doesn't output the individual frames. It probably wouldn't be hard to change that so the frames can be upscaled.
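One way to get individual frames for upscaling would be to write the frames to disk alongside the mp4. A library-agnostic sketch; `save_frames` and the injected writer are hypothetical, not code from this repository (imageio's `imwrite`, which the repo already depends on, would be a natural writer to pass in):

```python
import os

def save_frames(frames, out_dir, write_image):
    """Write each frame to out_dir as a zero-padded PNG and return the paths.

    write_image(path, frame) is injected so the sketch stays library-agnostic;
    in this repo's context, imageio.imwrite would fit.
    """
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for i, frame in enumerate(frames):
        path = os.path.join(out_dir, f"{i:04d}.png")
        write_image(path, frame)
        paths.append(path)
    return paths
```

The resulting 256x256 PNGs could then be run through an upscaler and reassembled into a video with ffmpeg.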
