Original article: How to Use Emojis as Icons
Author: Preethi Sam

Using monochrome emoji

Icons are becoming more and more important in web design, and there are plenty of icon resources on the web, both free and paid. Here, we will look at a familiar alternative: using emoji as icons.

One benefit of emoji is that they are already built into the operating system, while icon resources have to be fetched from the site's server. Emoji are just characters, like regular text, so they can be a good alternative to traditional icon images.

Using emojis is as simple as adding them to HTML as text via the keyboard or writing their Unicode code points directly.
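
For example, a quick JavaScript check (just a sketch; the \u{...} escape syntax is covered in more detail later) shows that a typed emoji and one written from its code point produce the same string:

const typed = '🚲' // typed or pasted directly
const escaped = '\u{1F6B2}' // written via its Unicode code point
console.log(typed === escaped) // true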

There is a bit of a problem with using them as icons, though: typically, icons are monochrome shapes, but emoji are not.

You can use text-shadow to make them monochrome. Start with some HTML:

<ul>
  <li><span class="icon">🚲</span> Bicycles</li>
  <li><span class="icon">✈️</span> Planes</li>
  <li><span class="icon">🚂</span> Trains</li>
</ul>

<ul>
  <li><span class="icon">📥</span> Inbox</li>
  <li><span class="icon">📤</span> Outbox</li>
  <li><span class="icon">📁</span> Folder</li>
</ul>

Add some CSS:

.icon {
    color: transparent;
    text-shadow: 0 0 #ec2930;
}

Emojis as Icons

Setting the color to transparent hides the original emoji, and what we see is its shadow.

Another way to do this is to use background-clip:

.icon {
    color: transparent;
    background-color: #ec2930;
    background-clip: text;
    -webkit-background-clip: text;
}

In this way, you can also set a gradient background to get a gradient icon:

.icon {
    color: transparent;
    background-image: linear-gradient(45deg, blue, red);
    background-clip: text;
    -webkit-background-clip: text;
}

Emojis as Icons: gradient

Unicode and emoji

Of the six emojis in the example above, one is of a different type than the other five: ✈️. As we already know, an emoji is essentially just a Unicode character, and different systems treat these characters differently. For example, on Windows 7 and earlier they were rendered as plain monochrome glyphs, while Windows 10 ships a variety of colorful emoji. Take a look at the code points of the six emoji in this example:

'🚲'.codePointAt().toString(16) // 1f6b2
'✈️'.codePointAt().toString(16) // 2708
'🚂'.codePointAt().toString(16) // 1f682
'📥'.codePointAt().toString(16) // 1f4e5
'📤'.codePointAt().toString(16) // 1f4e4
'📁'.codePointAt().toString(16) // 1f4c1

Except for ‘✈️’, none of the emojis are in the Unicode base plane (U+0000 to U+FFFF).
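
A quick check confirms this (a small sketch, assuming a modern JavaScript engine with codePointAt):

['🚲', '✈️', '🚂', '📥', '📤', '📁'].map(e => e.codePointAt(0) > 0xFFFF)
// [true, false, true, true, true, true]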

Once upon a time, it was thought that 16 bits, or 65,536 code points, would be enough for all the world's symbols. When it became clear that this wasn't enough, Unicode was extended with 16 supplementary planes, taking the range up to U+10FFFF. JavaScript uses the UTF-16 encoding, which represents a base-plane character with two bytes and a supplementary-plane character with four bytes. In JavaScript, a Unicode character can be written as follows:

\uxxxx

Here xxxx is the Unicode code point of the character, ranging from 0000 to FFFF. For example:

console.log('\u2708') // ✈

This plane ✈ looks a bit odd and is not quite the same as the ✈️ in the example. Why?

'✈️'.length // 2

‘✈️’ also has a length of 2, so let's look at the code point at its second position:

'✈️'.codePointAt(1).toString(16) // fe0f

U+FE0F is actually an emoji variation selector: it is used to make characters that already existed in earlier Unicode versions render as colorful emoji.

So, the two are used together like this:

console.log('\u2708\ufe0f') // ✈️

For the remaining five emoji, which are characters with code points above U+FFFF, ES6 added a new notation:

\u{xxxxxx}

Something like this:

console.log('\u{1f6b2}') // 🚲
console.log('\u{1f682}') // 🚂
console.log('\u{1f4e5}') // 📥
console.log('\u{1f4e4}') // 📤
console.log('\u{1f4c1}') // 📁

Of course, it is also possible to use charCodeAt to read the code units at each position and write them with the old-fashioned notation:

const h = '🚂'.charCodeAt(0).toString(16) // d83d
const l = '🚂'.charCodeAt(1).toString(16) // de82
console.log(String.fromCharCode(0xD83D, 0xDE82)) // 🚂
console.log('\ud83d\ude82') // 🚂

In the base plane, the range U+D800 to U+DFFF is reserved and contains no characters. The supplementary planes contain 2^20 code points. In UTF-16, the high surrogate is mapped into U+D800 to U+DBFF (a space of size 2^10, i.e. 0x400) and the low surrogate into U+DC00 to U+DFFF. The mapping is calculated as follows:

H = Math.floor((char - 0x10000) / 0x400) + 0xD800

L = (char - 0x10000) % 0x400 + 0xDC00

Therefore, once the code point is known, the surrogate pair can be calculated directly.
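
As a rough sketch, here is the calculation applied to 🚂 (U+1F682), using the H and L formulas above, with String.fromCodePoint shown only as a cross-check:

const char = 0x1F682 // code point of 🚂
const H = Math.floor((char - 0x10000) / 0x400) + 0xD800 // high surrogate
const L = (char - 0x10000) % 0x400 + 0xDC00 // low surrogate
console.log(H.toString(16), L.toString(16)) // d83d de82
console.log(String.fromCharCode(H, L)) // 🚂
console.log(String.fromCodePoint(char)) // 🚂 (same result, computed automatically)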