DancingThings: Synthetic Training Data for Optical Flow with Spline-based Procedural Deformation
Stamatis Alexandropoulos, Karhan Kayan, Jia Deng
Publication details coming soon.
We introduce DancingThings, the first optical flow dataset focused on general non-rigid motion. We generate DancingThings by warping 3D assets with a spline-based procedural deformation method: we build free-form deformation fields around each asset and animate them with 4D NURBS, in which an additional time curve controls how the control points move over time. DancingThings contains 400k image pairs with ground-truth flow annotations and occlusion masks, making it larger than FlyingChairs, FlyingThings3D, Sintel, and Spring combined. We show that baseline methods trained on our dataset achieve higher accuracy on synthetic and real-world benchmarks with non-rigid motion than the same methods trained on existing datasets.
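To make the idea concrete, the following is a minimal sketch of free-form deformation with time-animated control points. It is not the paper's implementation: it uses a Bézier (Bernstein) lattice instead of 4D NURBS, a hand-picked sinusoidal time curve for one control point, and hypothetical names (`ffd_warp`, `lattice_at`); it only illustrates how moving lattice control points over time deforms embedded points and yields per-point motion between two instants.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    # Bernstein basis polynomial B_{i,n}(t) (Bézier stand-in for NURBS basis)
    return comb(n, i) * t**i * (1 - t)**(n - i)

def ffd_warp(points, control, bbox_min, bbox_max):
    """Free-form deformation: warp 3D points through a Bézier control lattice.

    points:  (N, 3) array of points inside the bounding box.
    control: (l+1, m+1, n+1, 3) array of (possibly displaced) lattice positions.
    """
    l, m, n = np.array(control.shape[:3]) - 1
    # Normalized lattice coordinates (s, t, u) in [0, 1]^3.
    stu = (points - bbox_min) / (bbox_max - bbox_min)
    out = np.zeros_like(points)
    for i in range(l + 1):
        for j in range(m + 1):
            for k in range(n + 1):
                w = (bernstein(l, i, stu[:, 0]) *
                     bernstein(m, j, stu[:, 1]) *
                     bernstein(n, k, stu[:, 2]))
                out += w[:, None] * control[i, j, k]
    return out

# Undeformed 3x3x3 lattice over the unit cube (identity deformation).
bbox_min, bbox_max = np.zeros(3), np.ones(3)
grid = np.stack(np.meshgrid(*[np.linspace(0, 1, 3)] * 3, indexing="ij"), axis=-1)

def lattice_at(t):
    # Assumed time curve: one interior control point oscillates along z,
    # mimicking a time dimension that drives the lattice.
    c = grid.copy()
    c[1, 1, 1] += np.array([0.0, 0.0, 0.3 * np.sin(2 * np.pi * t)])
    return c

pts = np.array([[0.5, 0.5, 0.5], [0.25, 0.5, 0.75]])
p0 = ffd_warp(pts, lattice_at(0.0), bbox_min, bbox_max)   # identity at t = 0
p1 = ffd_warp(pts, lattice_at(0.25), bbox_min, bbox_max)  # deformed lattice
flow3d = p1 - p0  # per-point 3D motion between the two times
```

In a full pipeline, these per-point 3D displacements would be projected through the camera to obtain 2D optical flow; replacing the Bernstein basis with rational B-spline (NURBS) bases adds local control and weights, which is what allows a fourth, temporal parameter to animate the control points smoothly.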