Artistic Stylization using Convolutional Neural Networks

In our recent work on artistic stylization we use Convolutional Neural Networks (CNNs) to extract and recombine the content and style of arbitrary images. The algorithm builds on texture representations defined on feature activations in the CNN. The original work on texture synthesis with CNNs is available on arXiv, and a collection of natural textures synthesised using this method can be found on our website.
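As a rough illustration of what a CNN-based texture representation looks like (this is a minimal numpy sketch, not our actual implementation): the style of a layer can be summarised by the Gram matrix of its feature activations, which records filter correlations summed over all spatial positions and therefore discards the spatial arrangement that carries the content.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of one layer's feature activations.

    features: array of shape (channels, positions) -- the layer's
    activations with the spatial dimensions flattened out.
    Entry (i, j) is the correlation of filter i with filter j,
    summed over all spatial positions.
    """
    return features @ features.T

# The Gram matrix is invariant to permuting the spatial positions,
# which is why it captures texture-like "style" statistics rather
# than the spatial layout of the image.
rng = np.random.default_rng(0)
F = rng.standard_normal((8, 100))                # 8 filters, 100 positions
G = gram_matrix(F)
G_shuffled = gram_matrix(F[:, rng.permutation(100)])
assert np.allclose(G, G_shuffled)
```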

Please note that in both the texture synthesis and the artistic stylization work we use the L-BFGS optimisation method, as it appears to give the best results.
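For readers unfamiliar with the optimiser interface, here is a toy sketch using SciPy's L-BFGS-B routine (an assumption for illustration; it is not the code used in the papers). The quadratic loss below is a stand-in for the actual content/style loss computed from CNN activations.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the stylization loss: match a target vector.
# In the real setup the objective combines content and style terms
# computed from CNN feature activations.
target = np.array([1.0, -2.0, 0.5])

def loss_and_grad(x):
    diff = x - target
    # Return both the loss value and its analytic gradient;
    # jac=True below tells SciPy to expect this pair.
    return 0.5 * diff @ diff, diff

result = minimize(loss_and_grad, x0=np.zeros(3),
                  jac=True, method="L-BFGS-B")
assert np.allclose(result.x, target, atol=1e-5)
```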

Finally, the images in both papers were generated using a normalised version of the 19-layer VGG network described in the work by Simonyan and Zisserman. The weights in the normalised network are scaled such that the mean activation of each filter, over images and positions, is equal to one. Such rescaling can always be done without changing the output of the network, as long as its non-linearities are rectified linear units. We found that this normalisation helps with image generation and apologise that this was not made clear in the preprint versions of the manuscripts.
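Why the rescaling is output-preserving can be seen in a tiny two-layer example (a hedged sketch, not the VGG normalisation code itself): since relu(c*z) = c*relu(z) for any c > 0, dividing a filter's weights by a constant can be exactly compensated by multiplying the corresponding input weights of the next layer.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 3))   # first layer: 4 filters
W2 = rng.standard_normal((2, 4))   # second layer reads those 4 channels
x = rng.standard_normal(3)

y = W2 @ relu(W1 @ x)

# Divide each filter (row of W1) by a positive scale and multiply the
# matching input weights of the next layer (column of W2) by the same
# scale. Positive homogeneity of ReLU makes the output identical.
scales = np.array([0.5, 2.0, 3.0, 0.25])
W1_scaled = W1 / scales[:, None]
W2_scaled = W2 * scales[None, :]
y_scaled = W2_scaled @ relu(W1_scaled @ x)
assert np.allclose(y, y_scaled)
```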
The normalised network can be downloaded here.
University of Tübingen | BCCN | CIN | MPI