All articles have been moved to my personal site: me.harley-xk.studio. Please visit there to read and comment.

Admittedly the name borrows the fame of a legend; I can only hope the library lives up to it >_<

What’s this?

MaLiang is an OpenGL ES3 based drawing (doodling) library for iOS. It is written entirely in Swift, supports custom textures and pressure-sensitive brush strokes, and offers some room for customization.

This article can be seen as a detailed extension of and supplement to the README instructions on GitHub.

Usage

My philosophy is to keep things as simple and elegant as possible; that is sometimes hard to achieve, but I try to live up to it. MaLiang is easy to integrate and use, and a lot of internal logic that the user neither needs nor benefits from seeing is hidden away. Of course, if you're curious, you can read the source code yourself. This article also introduces some of the ideas behind the internal implementation.

Integration

MaLiang has been pushed to the official CocoaPods repo, so all you need to do is add a pod directive to your Podfile and run pod install to use it in your project:

pod 'MaLiang'

Then import the module wherever it is needed, otherwise you will get a "Module not found" error:

import MaLiang

Several major classes

1. Canvas

MaLiang’s most fundamental component is the Canvas, where all graffiti takes place. Canvas is essentially a UIView, so you can use whatever method you used to create a UIView to create a Canvas and add it to your interface.

  • If you prefer to work in code, just call UIView’s standard constructor init(frame:).

  • If you prefer Interface Builder, just drag a UIView from a XIB or storyboard onto the interface, change its class to Canvas, and press Enter. Xcode should automatically set the module to MaLiang.

Set up the proper layout constraints on the Canvas and you can start doodling, for example writing something like this 🙂
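If you take the code route, the whole setup above (creation plus constraints) boils down to a few lines. Here is a minimal sketch; the view controller is just a hypothetical host, and the constraint code is ordinary Auto Layout with nothing MaLiang-specific:

import UIKit
import MaLiang

class DoodleViewController: UIViewController {

    // Create the canvas like any other UIView
    let canvas = Canvas(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()

        // Add the canvas and pin it to the edges of the controller's view
        canvas.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(canvas)
        NSLayoutConstraint.activate([
            canvas.topAnchor.constraint(equalTo: view.topAnchor),
            canvas.bottomAnchor.constraint(equalTo: view.bottomAnchor),
            canvas.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            canvas.trailingAnchor.constraint(equalTo: view.trailingAnchor),
        ])
    }
}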

Well, to draw something like that, a few more pieces are still missing 🙂

Canvas inherits from MLView (ML is short for MaLiang, not that machine learning thing). MLView does pretty much all of the OpenGL work, and although it is declared as an open class, there is little reason to use it directly. Still, it doesn't hurt to know some of the principles behind it.

The core of OpenGL doodling is the texture: drawing is essentially the process of continuously stamping textures onto the canvas along the path of your finger. So what kind of stroke gets drawn depends entirely on the texture used, as well as its size, color, spacing, and other parameters.

When MLView is initialized it creates a default texture from a built-in image. This texture is a simple opaque dot, so only the simplest lines can be drawn with it. If you want to draw something like the image above, you need a more complex texture. MaLiang's sample project offers several textures that mimic the effects of a pencil, a pen, and a brush, and the text above was written with them.
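To make the stacking idea concrete, here is a purely illustrative sketch (not MaLiang's actual rendering code) of how stamp positions can be generated between two touch points at a fixed spacing; a renderer would then draw one texture at each returned position:

import CoreGraphics

// Illustrative only: compute where textures would be stamped between two
// touch points, spaced `pointStep` points apart along the segment.
func stampPositions(from start: CGPoint, to end: CGPoint, pointStep: CGFloat) -> [CGPoint] {
    let dx = end.x - start.x
    let dy = end.y - start.y
    let distance = (dx * dx + dy * dy).squareRoot()
    guard distance > 0, pointStep > 0 else { return [start] }

    let count = Int(distance / pointStep)
    return (0...count).map { i in
        let t = CGFloat(i) * pointStep / distance
        return CGPoint(x: start.x + dx * t, y: start.y + dy * t)
    }
}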

Snapshots

Canvas provides a simple snapshot function:

open func snapshot() -> UIImage?

Calling this method generates a snapshot of the current contents of the canvas and returns it as a UIImage. The snapshot logic is simple; you can also implement more complex snapshot logic yourself.
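For example, assuming the signature above and a view controller that owns the canvas, saving the current canvas contents as a PNG file could look like this:

// Take a snapshot of the canvas and write it out as a PNG file
if let image = canvas.snapshot(), let data = image.pngData() {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("doodle.png")
    try? data.write(to: url)
}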

2. Brush

Using textures directly is still a bit cumbersome, and there are also related parameters such as color and line thickness to manage, so the Brush class is provided to wrap all of this data.

Changes to a Brush's properties take effect immediately for all subsequent drawing.

  • opacity: transparency

As mentioned above, the essence of doodling is superimposing textures onto the canvas. Therefore, to produce strokes of different depths, the texture needs some transparency, which can be adjusted through the opacity property.

  • pointSize: stroke thickness

pointSize directly affects the thickness of your strokes and is measured in the standard iOS unit of size (points), so it adapts to screen pixel density automatically. You don't need to calculate actual pixels based on the device type; just specify the size as it should appear to the eye.

  • pointStep: point spacing

As above, since strokes are built by superimposing textures, the distance between two textures also affects the depth of the stroke, in addition to transparency. Moreover, if the point spacing is larger than the stroke size, you can even draw a dotted-line effect. The unit of pointStep is also the point.

  • forceSensitive: pressure sensitivity

pointSize affects the thickness of a stroke rather than determining it directly, because of pressure sensitivity. The actual stroke size fluctuates around the specified pointSize depending on the pressure: the higher the pressure, the thicker the stroke. forceSensitive controls how strongly the stroke reacts to pressure. A value between 0 and 1 is recommended; if it is set too high, the stroke will react too violently to pressure and become distorted. If forceSensitive is set to 0, pressure sensitivity is turned off for the brush and the stroke thickness no longer changes with pressure.

MaLiang enables pressure sensitivity by default on iOS devices that support it, and simulates it on devices that don't. The simulated pressure is estimated from the speed of the gesture: the faster the movement, the lower the pressure.

  • color: stroke color

The actual drawn color is combined with the opacity value. However, because overlapping textures build on one another, the effects largely cancel each other out, so you generally do not need to give the color an additional alpha value.

  • texture: the texture

texture is a non-public property; you only need to pass a texture image when initializing the Brush object and don't need to care about how the texture is implemented.

The actual drawing color is the result of mixing the brush color with the color of the texture, so the texture image needs to be white to ensure that the correct color is drawn. This may improve in the future.

The texture is actually an object of type MLTexture. MLTexture contains the OpenGL implementation of texture creation, texture binding when switching brushes, and so on.
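Putting these properties together, setting up a brush might look roughly like the sketch below. The Brush(texture:) initializer follows the description above, while the "pencil" asset name and the canvas.brush assignment are assumptions made for illustration; check the sample project for the exact API:

// "pencil" is a placeholder asset name; Brush(texture:) and canvas.brush
// are assumed from the description above, verify them against the real API
if let pencilTexture = UIImage(named: "pencil") {
    let pencil = Brush(texture: pencilTexture)
    pencil.color = .black        // base stroke color, no extra alpha needed
    pencil.opacity = 0.6         // transparency of each stamped texture
    pencil.pointSize = 3         // stroke thickness in points
    pencil.pointStep = 2         // spacing between stamped textures in points
    pencil.forceSensitive = 0.5  // 0 turns off pressure sensitivity
    canvas.brush = pencil        // make it the active brush (assumed property)
}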

3. Document

Document is not a necessary component for doodling; it is designed to provide some deeper functionality. Document keeps all of the stroke data of the canvas that owns it, and relies on this data to implement undo and redo. MaLiang already implements both by default.

With the data held by Document, you can also easily implement logic for saving doodle data to a file. In turn, the saved data can be restored onto a canvas, enabling data synchronization across devices.

The Document feature is not enabled by default and needs to be turned on manually in code:

try canvas.setupDocument()

Document needs some disk space to store temporary data, so if the device is running out of storage, the call above will throw an error. To keep the program robust, it is recommended to wrap the call in do-catch and handle possible errors:

do {
    try canvas.setupDocument()
} catch {
    // do something when an error occurs
}
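After the document is set up, undo and redo come along with it. The calls below are only a hypothetical sketch; the exact method names are assumptions here, so check the README for the real API:

// Hypothetical sketch: the canvas is assumed to expose undo()/redo()
// once a Document is attached; the actual names may differ
@IBAction func undoTapped(_ sender: Any) {
    canvas.undo()   // revert the most recent stroke (assumed method name)
}

@IBAction func redoTapped(_ sender: Any) {
    canvas.redo()   // re-apply the last undone stroke (assumed method name)
}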

Some features planned for implementation

There are still some unimplemented features in MaLiang that will be added gradually. Of course, you are also welcome to implement them yourself and submit a PR to me 🙂

  • Undo & redo (already implemented)

  • Image export (already implemented)

  • Drawing text at a specified position on the canvas

  • Drawing a specified image at a specified position on the canvas

  • Texture rotation, which can be used to achieve some more special stroke effects

Origin

MaLiang started as a doodling project years ago. It was implemented in Objective-C and OpenGL ES 1, which didn't have very good anti-aliasing support, so the results weren't great. And because I was too inexperienced at the time, the design and structure of the whole framework were messy. It made it to the App Store for a while, but for various reasons the project died along with the company.

I picked the project up again last year, planning to completely rewrite it based on Swift and OpenGL ES3, improve the things I hadn't handled well back then, and add some ideas I'd had more recently. The result is this library.

Why Swift?

Using Swift to talk to OpenGL directly is not easy. People advised me to use Objective-C or C as a middle layer to call OpenGL and to use Swift to encapsulate the upper-level logic, which would achieve the desired effect at the lowest cost.

As a side project, though, cost wasn't my primary concern. The library is OpenGL based, but only a few hundred lines of code actually deal with OpenGL. Sacrificing the unity and cleanliness of the whole project structure for that little bit of saved effort and convenience was unacceptable to me.

In addition, introducing Objective-C code means introducing the Objective-C dynamic runtime, which has some impact on the execution efficiency of Swift. Granted, as an iOS project there's no way to get rid of the Objective-C runtime right now, so my paranoia doesn't seem to make much sense, but who knows 🙂

Applications

After all that talk, what is this library actually good for? I honestly don't know. Maybe for signatures? Although CoreGraphics is already enough for signatures. Maybe I could use it to make a drawing app to amuse kids, maybe...

In the end, this is mainly a memento of that naive early experience. Anyone who is interested is welcome to take it and play with it 🙂

The next step might be to develop a doodling app based on this library. Of course, the project from years ago will not be revived; the new app will be a brand new project incorporating many of my own ideas. I hope I don't give up halfway this time!