Style Transfer based on Generative Adversarial Network

SURF2017


This project is part of the XJTLU 2017 Summer Undergraduate Research Fellowship. It aims to design a generative adversarial network that transfers the style of a style image onto a content image. Related literature can be found in the Wiki.

1. Overview


Neural Style Transfer is one of the cutting-edge topics in deep learning. Given a colored content image, like the one proposed here, and another image containing the desired style, the two can be combined using Neural Style Transfer to produce a result like this.
dawn sky anime

dawn sky style transfer anime
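The classic formulation behind results like the one above (Gatys et al.) matches feature statistics: content is compared via raw feature maps, style via their Gram matrices. A minimal NumPy sketch on dummy feature maps (the function names and shapes here are illustrative assumptions, not this project's code):

```python
import numpy as np

def gram_matrix(features):
    """features: (H, W, C) feature map -> (C, C) Gram matrix of channel correlations."""
    h, w, c = features.shape
    f = features.reshape(h * w, c)
    return f.T @ f / (h * w)

def content_loss(f_content, f_generated):
    # Mean squared difference between feature maps.
    return np.mean((f_generated - f_content) ** 2)

def style_loss(f_style, f_generated):
    # Mean squared difference between Gram matrices.
    return np.mean((gram_matrix(f_generated) - gram_matrix(f_style)) ** 2)

f_c = np.random.rand(8, 8, 4)   # "content" features from some network layer
f_s = np.random.rand(8, 8, 4)   # "style" features
# Both losses vanish when the generated features match their targets.
loss = content_loss(f_c, f_c) + style_loss(f_s, f_s)
```

In a full implementation the feature maps come from a pretrained network and the generated image is optimized to minimize a weighted sum of the two losses.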


Our goal is to implement neural style transfer using CycleGAN. We also want to go one step further by using CAN, which can generate images on its own after sufficient training.
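CycleGAN's key idea is the cycle-consistency loss: a generator G maps one domain to the other, a second generator F maps back, and a round trip should reproduce the input. A minimal NumPy sketch, where the generators are stand-in linear maps rather than the convolutional networks a real CycleGAN would use:

```python
import numpy as np

# Hypothetical generator stand-ins: G maps content -> style domain, F maps back.
# Here they are exact inverses, so the cycle loss is (numerically) zero.
def G(x):
    return 2.0 * x + 1.0

def F(y):
    return (y - 1.0) / 2.0

def cycle_consistency_loss(x, y):
    """L1 cycle loss: |F(G(x)) - x| + |G(F(y)) - y|, averaged over pixels."""
    return np.abs(F(G(x)) - x).mean() + np.abs(G(F(y)) - y).mean()

x = np.random.rand(4, 8, 8, 3)  # batch of "content" images
y = np.random.rand(4, 8, 8, 3)  # batch of "style-domain" images
loss = cycle_consistency_loss(x, y)
```

In the full CycleGAN objective this term is added to the two adversarial losses, which is what removes the need for paired training images.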

2. Framework

Although many mature and well-performing deep learning frameworks exist (such as Caffe and Chainer), our group chose TensorFlow for its reliability and adaptability.

Edge detection

Edge detection based on the Keras deep learning framework has been implemented; the test image is:

Input test image Output test image


The performance is reasonable; for a non-anime photo, the output is:

Output test image
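At its core, convolutional edge detection like the above slides small gradient filters over the image. A NumPy sketch with fixed Sobel kernels (the project's Keras network learns its filters instead; this is only an illustration of the underlying operation):

```python
import numpy as np

# Fixed Sobel kernels for horizontal and vertical gradients.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def conv2d(img, kernel):
    """Valid 2-D cross-correlation (no padding) of a grayscale image."""
    h, w = img.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def edge_magnitude(img):
    gx, gy = conv2d(img, SOBEL_X), conv2d(img, SOBEL_Y)
    return np.sqrt(gx ** 2 + gy ** 2)

# A vertical step edge produces a strong response along the boundary.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
edges = edge_magnitude(img)
```

A learned edge detector replaces the hand-picked kernels with trainable convolution layers, which is why results on anime-style images improve after training on that domain.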


More results produced with the Keras framework are available at this [link](http://stellarcoder.com/surf/anime_test) created by DexHunter. The network was trained on Professor Flemming's workstation with 4 Titan X GPUs, which took two weeks.

VGG-19 Pretrained Very Deep Network

This file is essential for the network; the download link is available here.
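The pretrained VGG-19 weights commonly used for style transfer ship as a MATLAB `.mat` file and are typically read with `scipy.io.loadmat`. A hedged sketch: the filename, layer names, and `'layers'` key below are assumptions based on the usual layout of that release, not verified against this repository's download.

```python
# Convolutional/activation layers of VGG-19 up to relu5_4, the portion
# style-transfer implementations typically use (layer names are assumed).
VGG19_LAYERS = [
    'conv1_1', 'relu1_1', 'conv1_2', 'relu1_2', 'pool1',
    'conv2_1', 'relu2_1', 'conv2_2', 'relu2_2', 'pool2',
    'conv3_1', 'relu3_1', 'conv3_2', 'relu3_2',
    'conv3_3', 'relu3_3', 'conv3_4', 'relu3_4', 'pool3',
    'conv4_1', 'relu4_1', 'conv4_2', 'relu4_2',
    'conv4_3', 'relu4_3', 'conv4_4', 'relu4_4', 'pool4',
    'conv5_1', 'relu5_1', 'conv5_2', 'relu5_2',
    'conv5_3', 'relu5_3', 'conv5_4', 'relu5_4',
]

def load_vgg_weights(path):
    """Load the pretrained .mat file; each entry of the 'layers' array is
    expected to hold the kernel and bias for the matching VGG19_LAYERS name."""
    import scipy.io
    data = scipy.io.loadmat(path)
    return data['layers'][0]
```

Only the convolutional features matter for style transfer, so the fully connected layers of the original classifier are not listed here.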
