Dynamic Textures (DTs) are sequences of images of moving scenes, such as smoke, vegetation and fire, that exhibit certain stationarity properties in time. The analysis of DTs is important for recognition, segmentation, synthesis and retrieval in a range of applications including surveillance, medical imaging and remote sensing. Convolutional Neural Networks (CNNs) have recently proven well suited to texture analysis thanks to a design similar to dense filter banks. The repetitive nature of DTs in space and time allows us to treat them as volumes and to analyze regularly sampled spatial and temporal slices. We train CNNs on spatial frames and temporal slices extracted from the DT sequences and combine their predictions in a late fusion approach to obtain a competitive DT classifier trained end-to-end.
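
To make the slicing and late-fusion idea concrete, below is a minimal sketch, not the authors' implementation: it assumes a grayscale DT volume of shape (T, H, W), uses placeholder networks (make_cnn, cnn_xy, cnn_xt, cnn_yt) in place of the texture-oriented CNNs used in the work, and averages class posteriors over all slices rather than a regularly sampled subset.

```python
# Illustrative sketch of three-orthogonal-plane slicing with late fusion.
# The tiny CNN and the averaging scheme are assumptions for demonstration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_cnn(num_classes: int) -> nn.Module:
    # Placeholder stand-in for the deeper texture CNN used in the paper.
    return nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, num_classes),
    )


def slice_planes(video: torch.Tensor):
    # video: (T, H, W) grayscale dynamic texture volume.
    xy = video                    # spatial frames: T slices of shape (H, W)
    xt = video.permute(1, 0, 2)   # temporal slices: H slices of shape (T, W)
    yt = video.permute(2, 0, 1)   # temporal slices: W slices of shape (T, H)
    return xy, xt, yt


def late_fusion_predict(video, cnn_xy, cnn_xt, cnn_yt):
    # Run one CNN per plane, then average the per-plane class posteriors.
    probs = []
    for slices, cnn in zip(slice_planes(video), (cnn_xy, cnn_xt, cnn_yt)):
        logits = cnn(slices.unsqueeze(1))              # (N_slices, num_classes)
        probs.append(F.softmax(logits, dim=1).mean(dim=0))
    return torch.stack(probs).mean(dim=0)              # fused class probabilities


if __name__ == "__main__":
    num_classes = 10
    nets = [make_cnn(num_classes) for _ in range(3)]
    video = torch.rand(50, 64, 64)                     # dummy DT sequence
    print(late_fusion_predict(video, *nets))
```

In this sketch the fusion is a simple average of softmax outputs; other late-fusion rules (e.g. weighted sums or training a classifier on the concatenated plane descriptors) fit the same structure.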


Dynamic texture analysis with deep learning on three orthogonal planes
Dr. Vincent Andrearczyk, HES-SO
19 April 2018 · 10:44 a.m.