【 8BIT 】 Active reduction of risk

【8 Bit Universe】 Many popular Western pop songs have been remade in 8-bit. Can you still recognize them? Super nice!

Why write this article

It’s been a busy year, and everyone has been cramming the knowledge they need for their jobs, leaving little room for the fun side of technology.

Recently, I was listening to an episode of the GCORES (机核) podcast that used 8-bit music principles to explain how FC game music is made. Much of the explanation overlapped with Web Audio concepts, so I wrote this article, hoping that readers with no music background can also write their own 8-bit music after reading it.

Since this is a hands-on article, it involves a fair amount of music knowledge. I’ll try to explain what you need in the simplest possible way, so everyone can pick it up quickly, and I’ll also attach relevant learning materials.

There’s not much code in this article, so if you’re only here for the technology, you can stop at this point. The underlying technical principles and code I’ll save for another article early next year; consider this a placeholder. Now, on to the main text.

8bit music introduction

What is 8bit music

8Bit music is also called chip music.

Game consoles at the time (such as the FC) had too little memory to store high-resolution PCM recordings, so game music had to be synthesized in real time: a basic sound-synthesis engine was built into the hardware, and the music code (equivalent to sheet music) was stored on the game cartridge. Chip music was born from this constraint and gradually became a style of its own.

How to create 8Bit music

We’ll mainly use the FC sound system as our example. The FC sound system has no concept of musical instruments; instead, it provides different waveforms for the arranger to use, and calling different waveforms while composing produces different sound effects.

FC provides the following five channels:

Square wave (2 channels)

The waveform is shown as follows:

Square waves occupy two channels. By varying the duty cycle (the proportion of each period the waveform spends high), they can produce different timbres, and can be likened to the two guitars in an electric band.
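To make the idea of “different proportions of waveform” concrete, here is a minimal sketch (plain JavaScript; the function name is my own, not an FC or Tone.js API) that generates one period of a square wave at a given duty cycle:

```javascript
// Generate one period of a square wave as raw samples.
// dutyCycle is the fraction of the period spent at +1; the FC's
// square channels offered 12.5%, 25%, 50%, and 75% duty cycles.
function squarePeriod(samplesPerPeriod, dutyCycle) {
  var samples = []
  for (var i = 0; i < samplesPerPeriod; i++) {
    samples.push(i / samplesPerPeriod < dutyCycle ? 1 : -1)
  }
  return samples
}

// A 25% duty cycle sounds thinner than the symmetric 50% square:
squarePeriod(8, 0.25)  // [1, 1, -1, -1, -1, -1, -1, -1]
squarePeriod(8, 0.5)   // [1, 1, 1, 1, -1, -1, -1, -1]
```

Playing these samples in a loop at different speeds produces different pitches; changing only the duty cycle keeps the pitch but changes the timbre.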

Triangle wave (1 channel)

A triangular wave, as its name implies, is a wave with its peaks and troughs all triangular, as shown below:

It can be likened to the bass of an electro-acoustic band.

Noise (1 channel)

Noise is one of the most common sounds in FC game music and sound effects. Explosions, footsteps, collisions, and rhythm points of music are all made with noise.

For noise, the rhythm matters more than the pitch. Because noise is more distinctive than square and triangle waves, it creates a stronger sense of rhythm, and can be likened to the drums in an electric band.
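As a rough illustration of what the noise channel outputs (plain JavaScript, my own sketch; the real FC generates noise with a linear-feedback shift register, but plain random samples give a similar hiss):

```javascript
// Fill a buffer with white noise: random samples in [-1, 1].
// Short bursts of this, shaped by a fast-decaying volume envelope,
// are what FC drums, explosions, and footsteps are made of.
function whiteNoise(length) {
  var samples = new Float32Array(length)
  for (var i = 0; i < length; i++) {
    samples[i] = Math.random() * 2 - 1
  }
  return samples
}

var burst = whiteNoise(4410)  // a 0.1 s burst at a 44.1 kHz sample rate
```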

Sampling (1 channel)

Finally, there is the sampling channel. Because sampling has more patterns and is relatively complex, this article won’t cover it.

Note modification

Finally, the FC supports modifying individual notes. Developers can shape a note with volume, envelope, vibrato, and so on.

Web Audio profile

Web Audio supports sine, square, triangle, and sawtooth waves. You can hear how the different waveforms produce different sounds:

demo

However, you may have noticed that none of the console’s channels is a sine wave; it was not among the waveforms the FC provided. The sine wave’s tone is much milder, and we won’t use it in practice here.

For this demo, we decided to use Tone.js for development, for two main reasons:

  • Tone.js accepts pitch names rather than requiring raw frequencies: we can simply play 'A4' instead of 440 Hz.
  • The Attack and Release of ADSR are already encapsulated, which makes development easier. Those interested in ADSR can read up on envelopes.

Hands-on practice

First, demo: Link

We’ll arrange an 8-bit piece of music, focusing on three parts: the single-voice melody, the harmony, and the sound effects. Each is explained below.

Step 1: Transcribing the score

Consider Tone.js’s key sound-producing function:

synth.triggerAttackRelease(note, duration, time, velocity)

It takes four parameters:

  • note: the frequency or pitch name of the note to play
  • duration: how long the note lasts (its shape in the score, e.g. '4n')
  • time: when the note is triggered
  • velocity: how hard the note is struck; higher velocity means greater amplitude and greater volume

As you can see, time works hand in hand with duration: together they determine where each note sits on the instrument’s timeline.

And a song’s volume is usually fairly uniform, so when transcribing a score we mainly care about the notes and their durations (i.e. the beat). For example:

var synth = new Tone.Synth().toMaster()

synth.triggerAttackRelease('C4', '4n', '8n', 1)
synth.triggerAttackRelease('E4', '8n', '4n + 8n', 1)

'C4' and 'E4' give the pitch (the note’s position in the score), '4n' and '8n' give the duration (the note’s shape in the score), the time parameter accumulates from note to note, and the velocity parameter is held fixed.

So how do we figure out C4, E4, 4n, 8n?

Take “Ode To Joy” for example, here’s the piano score:

Note position

The “1=C” in the top left means the song is in C major: the C key on the piano serves as “do”. Next, the clef:

The score uses the treble clef, so it is played with the “middle C” key as “do”.

What is the “middle C” key? A piano has 52 white keys spanning a little over 7 octaves; the first white key is A0, then B0, C1, and so on. Counting from left to right, the fourth C key, C4, is “middle C”, and the “do” of this score is the 'C4' we saw in the code above.

How do you read the staff? It’s simple: each line and space of the staff corresponds to a piano key, as shown below:

There are also 36 black keys on the piano, as shown below:

The black key between C4 and D4 can be written as either C#4 (C-sharp) or Db4 (D-flat). Under twelve-tone equal temperament, every pair of adjacent keys, black or white, is one semitone apart.
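Twelve-tone equal temperament also tells us exactly what frequency each key produces: each semitone multiplies the frequency by 2^(1/12), with A4 fixed at 440 Hz. Here is a minimal sketch (plain JavaScript; noteToFrequency is my own helper, not part of Tone.js) of the conversion Tone.js performs when you hand it a pitch name:

```javascript
// Semitone offset of each note letter within an octave.
var SEMITONES = { C: 0, 'C#': 1, D: 2, 'D#': 3, E: 4, F: 5,
                  'F#': 6, G: 7, 'G#': 8, A: 9, 'A#': 10, B: 11 }

// Convert a note name such as 'C4' or 'F#3' to a frequency in Hz,
// using twelve-tone equal temperament with A4 = 440 Hz.
function noteToFrequency(name) {
  var letter = name.slice(0, -1)              // e.g. 'C' or 'F#'
  var octave = parseInt(name.slice(-1), 10)   // e.g. 4
  var midi = 12 * (octave + 1) + SEMITONES[letter]  // MIDI: C4 = 60, A4 = 69
  return 440 * Math.pow(2, (midi - 69) / 12)
}

noteToFrequency('A4')  // 440
noteToFrequency('C4')  // ≈ 261.63, "middle C"
```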

Note shape

Let’s look again at the top left corner of the score: “4/4” means this is a score in 4/4 time. What is 4/4 time? It means a quarter note gets one beat and each measure has 4 beats; a vertical bar line separates the measures. What is a quarter note? See below:

The 4n and 8n in the previous example code actually stand for quarter notes and eighth notes.

At this point you may wonder: the shape of a note gives its beat value, but beats are relative. How do they map onto absolute time? This is where BPM comes in. BPM is short for Beats Per Minute: the number of beats in one minute. In a 4/4 score with BPM set to 120, each beat lasts 60/120 = 0.5 seconds, which means each quarter note lasts exactly 0.5 seconds.
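That conversion is easy to sketch in code (plain JavaScript; noteValueToSeconds is my own helper, Tone.js handles this internally when you pass it values like '4n'):

```javascript
// Convert a note value like '4n' (quarter) or '8n' (eighth)
// to its duration in seconds at a given BPM.
// In 4/4 time, one beat is a quarter note lasting 60/bpm seconds,
// so a whole note spans 4 beats.
function noteValueToSeconds(value, bpm) {
  var denominator = parseInt(value, 10)  // '4n' -> 4
  var wholeNote = 4 * (60 / bpm)
  return wholeNote / denominator
}

noteValueToSeconds('4n', 120)  // 0.5
noteValueToSeconds('8n', 120)  // 0.25
```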

Putting it into practice

After all this music talk, back to the code. Take the first bar as an example: it has four beats, each a quarter note, namely “mi”, “mi”, “fa”, “so”. How do we write that in code? As follows:

var synth = new Tone.Synth().toMaster()
Tone.Transport.bpm.value = 120

synth.triggerAttackRelease('E4', '4n', '0', 1)
synth.triggerAttackRelease('E4', '4n', '4n', 1)
synth.triggerAttackRelease('F4', '4n', '2n', 1)
synth.triggerAttackRelease('G4', '4n', '2n + 4n', 1)

And with that, the transcription of this bar’s score is complete.
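Writing every triggerAttackRelease call by hand gets tedious for longer scores. As a sketch (plain JavaScript; the score-array format and helper names are my own, not a Tone.js API), the same bar can be driven from a data array, with each note’s cumulative start time computed automatically:

```javascript
// The first bar of "Ode to Joy" as [note, duration] pairs.
var bar = [['E4', '4n'], ['E4', '4n'], ['F4', '4n'], ['G4', '4n']]

// Convert a note value like '4n' to seconds (a quarter note is 60/bpm s).
function valueToSeconds(value, bpm) {
  return 4 * (60 / bpm) / parseInt(value, 10)
}

// Attach a cumulative start time (in seconds) to each note,
// matching what triggerAttackRelease's time parameter expects.
function schedule(score, bpm) {
  var time = 0
  return score.map(function (step) {
    var entry = { note: step[0], duration: step[1], time: time }
    time += valueToSeconds(step[1], bpm)
    return entry
  })
}

schedule(bar, 120)
// [{ note: 'E4', duration: '4n', time: 0 },
//  { note: 'E4', duration: '4n', time: 0.5 }, ...]
```

Each resulting entry could then be fed to synth.triggerAttackRelease(entry.note, entry.duration, entry.time).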

Step 2: Introduce harmony

What is harmony? Harmony is the combination of two or more different notes sounding simultaneously according to certain rules. Different musical instruments have different timbres, and only by combining them in a harmonious way do we get good music. Timbre, in essence, is just a different waveform.

After finishing the single-voice score, we introduce some new instruments (waveforms):

var triangleOptions = {
  oscillator: {
    type: 'triangle'
  }
}

var squareOptions = {
  oscillator: {
    type: 'square'
  }
}

var squareSynth = new Tone.Synth(squareOptions).toMaster()
var triangleSynth = new Tone.Synth(triangleOptions).toMaster()
var noiseSynth = new Tone.NoiseSynth().toMaster()

Each synth has a different waveform and corresponds to different instruments: the squareSynth square wave mainly covers melodic instruments such as piano and guitar, the triangleSynth triangle wave mainly simulates the bass, and the noiseSynth noise mainly simulates percussion.

Each instrument then needs its own adapted score.

Step 3: Sound effects

Finally, you can use the envelope to shape each note’s volume over time, adding dynamics that make the music sound more vivid. This is controlled by Tone.Envelope():

  • attack: the time for the volume to rise from 0 to its peak
  • decay: the time for the volume to fall from the peak down to the sustain level
  • sustain: the level held until release begins, expressed as a fraction of the peak volume
  • release: the time for the volume to fall from the sustain level back to 0
envelope: {
  attack: 0.01,
  decay: 0.1,
  sustain: 0.5,
  release: 1,
  attackCurve: 'linear',
  releaseCurve: 'exponential'
}

A variety of sound effects can be achieved by controlling Attack, Decay, Sustain, and Release, but it’s up to you to experiment.
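To get a feel for how the four phases shape a note, here is a minimal sketch (plain JavaScript, my own illustration using straight-line segments only; Tone.js additionally supports curved attackCurve/releaseCurve shapes) that computes the envelope’s amplitude at any point in time:

```javascript
// Amplitude (0..1) of a linear ADSR envelope at time t seconds.
// attack/decay/release are durations in seconds; sustain is a
// level in 0..1; holdTime is when the note is released.
function adsrAmplitude(t, attack, decay, sustain, release, holdTime) {
  if (t < attack) return t / attack                    // ramp 0 -> 1
  if (t < attack + decay) {                            // ramp 1 -> sustain
    return 1 - (1 - sustain) * (t - attack) / decay
  }
  if (t < holdTime) return sustain                     // hold sustain level
  if (t < holdTime + release) {                        // ramp sustain -> 0
    return sustain * (1 - (t - holdTime) / release)
  }
  return 0
}

// With attack 0.01, decay 0.1, sustain 0.5, release 1, held for 2 s:
adsrAmplitude(0.005, 0.01, 0.1, 0.5, 1, 2)  // 0.5 (halfway up the attack)
adsrAmplitude(1, 0.01, 0.1, 0.5, 1, 2)      // 0.5 (the sustain level)
adsrAmplitude(2.5, 0.01, 0.1, 0.5, 1, 2)    // 0.25 (halfway through release)
```

A short attack with a fast decay gives a percussive pluck; a long attack and release give a soft pad, which is exactly the kind of per-note shaping the FC’s note modification provided.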

Conclusion

This article walked through the overall steps of developing 8-bit music with Web Audio and the Tone.js library. It breaks down into three main steps:

  • Single instrument score modification
  • Multi-instrument harmony
  • Add sound effects

For more, check out my demo, which also includes the scores of Super Mario and The Legend of Zelda transcribed by other enthusiasts, my own adaptation of Ode to Joy, and a bass part written from a simple piano score.

Hope everyone can create their own 8bit music!