Coherent Online Video Style Transfer

Title: Coherent Online Video Style Transfer
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Chen, D., Liao, J., Yuan, L., Yu, N., Hua, G.
Conference Name: 2017 IEEE International Conference on Computer Vision (ICCV)
Date Published: October 2017
Keywords: Coherence, coherent online video style transfer, efficient network, end-to-end network, feed-forward network, flickering results, image fast neural style transfer, image sequences, image stylization networks, Integrated optics, Metrics, naive extension, neural nets, neural style transfer, Optical filters, Optical imaging, optimisation, optimization-based video style transfer, per-frame baseline, pubcrawl, Recurrent neural networks, resilience, Resiliency, Scalability, short-term coherence, temporally coherent stylized video sequences, Training, video processing, video sequences, video signal processing, visually comparable coherence
Abstract: Training a feed-forward network for fast neural style transfer of images has proven successful, but the naive extension of processing videos frame by frame is prone to producing flickering results. We propose the first end-to-end network for online video style transfer, which generates temporally coherent stylized video sequences in near real time. Two key ideas are an efficient network design that incorporates short-term coherence, and the propagation of short-term coherence to the long term, which ensures consistency over longer periods of time. Our network can incorporate different image stylization networks and clearly outperforms the per-frame baseline both qualitatively and quantitatively. Moreover, it achieves visually comparable coherence to optimization-based video style transfer while being three orders of magnitude faster.
DOI: 10.1109/ICCV.2017.126
Citation Key: chen_coherent_2017