One, foreword

Hi, I’m Jack.

I saw a very interesting new paper posted on arXiv: “Stylized Neural Painting”.

This is an algorithm made for art. Take a look at the result:

You read that correctly: given the picture we provide, the algorithm automatically creates an oil painting, stroke by stroke!

Image style transfer algorithms have been studied for years, but earlier results were only mildly interesting, and the newly published “Stylized Neural Painting” is a clear step up.

Get your brushes and keyboard ready; today is a hands-on, step-by-step tutorial.

Algorithm principles, environment setup, and results, one-stop service, all below!

Two, Stylized Neural Painting

This paper proposes an algorithm that transforms pictures into paintings, producing realistic artwork with controllable styles.

Unlike previous image style transfer algorithms, this one infers every brushstroke from the provided picture, building up the painting stroke by stroke.

The authors design a new neural renderer that imitates the behavior of a vector renderer, and cast stroke prediction as a parameter search process that maximizes the similarity between the input and the rendered output.

Color and shape are decoupled by a dual-pathway neural renderer consisting of a rasterization network and a shading network. Images generated this way have high fidelity in both overall appearance and local texture.
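To make the “parameter search” idea concrete, here is a minimal PyTorch sketch of my own (not the authors' code): a tiny MLP stands in for the trained neural renderer, and the stroke parameters are optimized by gradient descent so that the rendered canvas matches a target image. The network shapes, the max-based compositing, and the plain L2 loss are all simplifying assumptions.

# A toy stand-in for the paper's pipeline: freeze a neural renderer and
# search over stroke parameters by gradient descent.
import torch
import torch.nn as nn

class ToyNeuralRenderer(nn.Module):
    """Maps one stroke parameter vector to a small grayscale canvas."""
    def __init__(self, n_params=8, canvas=32):
        super().__init__()
        self.canvas = canvas
        self.net = nn.Sequential(
            nn.Linear(n_params, 256), nn.ReLU(),
            nn.Linear(256, canvas * canvas), nn.Sigmoid(),
        )

    def forward(self, stroke_params):                 # (N, n_params)
        out = self.net(stroke_params)                 # (N, canvas * canvas)
        return out.view(-1, self.canvas, self.canvas)

def paint(target, n_strokes=20, steps=200, lr=0.05):
    """Search stroke parameters so the rendered strokes match the target."""
    renderer = ToyNeuralRenderer()
    for p in renderer.parameters():                   # the renderer stays frozen;
        p.requires_grad_(False)                       # only the strokes are optimized
    params = torch.rand(n_strokes, 8, requires_grad=True)
    opt = torch.optim.Adam([params], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        strokes = renderer(params)                    # render every stroke
        canvas = strokes.max(dim=0).values            # naive stroke compositing
        loss = ((canvas - target) ** 2).mean()        # pixel similarity loss
        loss.backward()                               # gradients flow through the renderer
        opt.step()
    return canvas.detach()

if __name__ == "__main__":
    target = torch.rand(32, 32)                       # placeholder target image
    print(paint(target).shape)                        # torch.Size([32, 32])

In the real project, the frozen renderer is a trained checkpoint and the similarity loss is more sophisticated, but the optimization loop follows the same pattern.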

The smaller picture in the lower-right corner is the original, while the larger picture is the oil painting produced by the algorithm.

In addition to the oil painting style, there are also watercolor and marker-pen styles:

There are also color and texture style transfers:

For more details, please refer to the paper:

Address: arxiv.org/abs/2011.08…

Three, effect test

GitHub project address: github.com/jiupinjia/s…

Step 1: Set up the test environment.

Installing the dependency libraries listed in requirements.txt is easy; there are no unusual libraries.
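For reference, a minimal setup sketch, assuming the project has already been cloned and Python 3 with pip is available:

# From inside the cloned project directory:
pip install -r requirements.txt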

Step 2: Download the weight files of the trained models. There are four models in total, and I have packed them all together.

I have also packaged the program along with the weight files; if you don't want the hassle, you can download that and use it directly.

Download address (extraction code: Jack):

pan.baidu.com/s/1i9OsVHmd…

Step 3: In the project directory, run the program.

python demo_prog.py --img_path ./test_images/apple.jpg --canvas_color 'white' --max_m_strokes 500 --max_divide 5 --renderer oilpaintbrush --renderer_checkpoint_dir checkpoints_G_oilpaintbrush

--img_path: the input image.

--canvas_color: can be understood as the background color of the canvas.

--max_m_strokes and --max_divide: control the maximum number of strokes and how finely the painting is divided.

--renderer: the painting style. Options include watercolor, markerpen, oilpaintbrush, and rectangle.

--renderer_checkpoint_dir: the path to the weight files.
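For example, switching the last two parameters should produce a watercolor version of the same image. The checkpoint directory name below simply mirrors the oil-paint example's naming pattern and is an assumption; point it at wherever your downloaded watercolor weights actually live.

python demo_prog.py --img_path ./test_images/apple.jpg --canvas_color 'white' --max_m_strokes 500 --max_divide 5 --renderer watercolor --renderer_checkpoint_dir checkpoints_G_watercolor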

The results are saved in the output folder, stroke by stroke in painting order.

Here is how it runs:

On an RTX 2060 Super, it takes about 5 minutes. The final result:

Four, afterword

Illustrated deep learning tutorials and other technical write-ups are still in the works; let me get them sorted out, and please stay tuned!

I’m Jack, and I’ll see you next time.

This article will be updated continuously. You can also find it on my WeChat official account by searching [JackCui-ai]. GitHub: github.com/Jack-Cheris… Stars are welcome.