Contents

  • What is on-heap memory?
  • How is JVM heap memory divided?
  • What happens when the JVM heap is full?
  • Solving system GC lag with off-heap memory

Today let's talk about a very interesting piece of knowledge: off-heap memory. Whether you are out interviewing or researching some technology, you will often run into this thing called off-heap memory, but many people don't really know what it is, so today I'll give you an in-depth analysis.

What is on-heap memory?

To talk about off-heap memory, we first have to talk about on-heap memory, which most people should be familiar with. The Java system we write runs as a JVM process, and that JVM process holds a chunk of memory space managed by the JVM itself. This memory space is the on-heap memory, roughly as shown in the figure below.
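To make this concrete, here is a minimal sketch (the class name HeapInfo is just for illustration) that prints the heap limits of the current JVM process; the actual numbers depend on the -Xms/-Xmx flags the process was started with.

// A minimal sketch: inspect the JVM's own heap limits at runtime.
// The values reflect the -Xms/-Xmx startup flags of this JVM process.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap   = " + rt.maxMemory() / (1024 * 1024) + " MB");   // upper bound (~ -Xmx)
        System.out.println("total heap = " + rt.totalMemory() / (1024 * 1024) + " MB"); // currently committed
        System.out.println("free heap  = " + rt.freeMemory() / (1024 * 1024) + " MB");  // free within committed
    }
}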

How is JVM heap memory divided?

So what's the problem with this in general? Usually none, but if you cache a lot of data in the JVM's heap memory, you can run into trouble. Caching data means you store a large amount of data in heap memory that is used all the time, so it can never be reclaimed. As a result, a lot of data may stay resident in the JVM's heap memory, as shown below.

The next question is this: the JVM heap is divided into regions, one being the young generation and the other the old generation. Cached data stays in heap memory for a long time, so it typically lives in the young generation for a while and then, because it cannot be reclaimed, gets promoted to the old generation, as shown below.
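As a rough illustration (the class and field names here are made up for the example), a typical in-heap cache is just a long-lived map holding strong references, which is exactly the kind of data that survives young-generation collections and ends up in the old generation:

// A minimal sketch of an in-heap cache: the static map keeps strong references
// for the life of the process, so its entries can never be reclaimed and are
// eventually promoted from the young generation to the old generation.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LocalCache {
    private static final Map<String, byte[]> CACHE = new ConcurrentHashMap<>();

    public static void put(String key, byte[] value) {
        CACHE.put(key, value);
    }

    public static byte[] get(String key) {
        return CACHE.get(key);
    }
}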

What happens when the JVM heap is full?

But if too much cached data is put into the old generation, the remaining available space there shrinks. The old generation then tends to get filled up by other data frequently, and once it is full, the JVM triggers a full GC, in which a garbage collection thread reclaims the old data in the old generation, as shown in the figure below.

What can generally be reclaimed, though, is only the space outside the cached data. Even after a collection, the cached data still exists and cannot be recovered, so each time you only get part of the space back while a lot of cached data remains. That cached data keeps occupying too much space in the old generation.

While a full GC is running, the JVM has to stop processing external requests (a stop-the-world pause). If this happens frequently, you will feel your system's performance jitter constantly: it lags for a moment, runs for a while, then lags again.

So, caching a lot of data inside the JVM is likely to cause the old generation to fill up frequently, which triggers full GC frequently and makes the system stop processing requests over and over, as shown below.
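If you want to watch this happening in your own system, one simple approach (a sketch, not tied to any particular collector) is to poll the JVM's garbage collector MXBeans and observe the collection counts and accumulated collection times growing:

// A minimal sketch: observe how often the JVM's collectors run and how much
// time they have spent, using the standard java.lang.management API.
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcWatcher {
    public static void main(String[] args) throws InterruptedException {
        while (true) {
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.println(gc.getName()
                        + ": count=" + gc.getCollectionCount()
                        + ", totalTimeMs=" + gc.getCollectionTime());
            }
            Thread.sleep(5000); // print a snapshot every 5 seconds
        }
    }
}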

Solving system GC lag with off-heap memory

So in this situation, a common optimization is to move the data out of the JVM heap and into off-heap memory. The question then is: what is off-heap memory? As the name implies, it is the area of memory that is not managed by the JVM but by the OS; that part of memory is what we call off-heap memory.

So we can actually write a lot of data directly into off-heap memory. That data will not take up old-generation space in the JVM heap, so the old generation will not fill up frequently, full GC will not be triggered frequently, and system performance will not jitter all the time, as shown in the figure below.
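You can also verify that such data really lives outside the heap. As a sketch, the JDK's "direct" buffer pool MXBean tracks memory held by direct buffers separately from the heap, and its usage grows as you allocate them with the same ByteBuffer.allocateDirect call we will look at in detail below:

// A minimal sketch: the "direct" BufferPoolMXBean tracks memory held by
// direct (off-heap) ByteBuffers, separately from the JVM heap.
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;
import java.nio.ByteBuffer;

public class DirectMemoryCheck {
    public static void main(String[] args) {
        ByteBuffer buffer = ByteBuffer.allocateDirect(64 * 1024 * 1024); // 64 MB off-heap
        for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            if ("direct".equals(pool.getName())) {
                System.out.println("direct buffers: count=" + pool.getCount()
                        + ", memoryUsed=" + pool.getMemoryUsed() / (1024 * 1024) + " MB");
            }
        }
    }
}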

Since off-heap memory is so good, the next question is: what's the downside? Of course there is one. If you use memory inside the JVM heap, then after you write a lot of data and the memory fills up, the JVM automatically runs garbage collection to free up some memory for you. It's all automatic.

But if you use off-heap memory, there is no JVM to manage it for you; you have to manage that memory space yourself. That is, after writing your data, you must remember to release parts of that memory when they are no longer needed. So although off-heap memory keeps your JVM from doing frequent GC, it does make your code harder to manage, as shown below.

So how is off-heap memory typically allocated in Java code? Middleware such as Netty and RocketMQ needs to manage large amounts of in-memory data, so they choose to request a block of off-heap memory and store their data in it under their own fine-grained management, roughly like the code below.

import java.nio.ByteBuffer;

// Define the size of off-heap memory to request, in this case 1GB
int memorySize = 1024 * 1024 * 1024;
// In Java, the ByteBuffer.allocateDirect method requests a block of off-heap (direct) memory
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(memorySize);
// Write data to off-heap memory
byte[] bytes = "hello world".getBytes();
byteBuffer.put(bytes);
// Read data from off-heap memory
byteBuffer.flip();
byte[] readBytes = new byte[bytes.length];
byteBuffer.get(readBytes, 0, bytes.length);

Now that you have seen in this piece of code how we request off-heap memory, write to it, and read from it, how do we free it? The off-heap memory is actually referenced by a ByteBuffer object in the JVM heap, so once that ByteBuffer object is reclaimed, its associated off-heap memory is freed as well, as shown in the figure below.
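Middleware usually does not want to wait for the JVM to reclaim the ByteBuffer object; it releases the off-heap memory explicitly. Here is a minimal sketch, assuming Netty (the io.netty:netty-buffer module mentioned above) is on the classpath; the class name NettyOffHeapDemo is made up for the example:

// A minimal sketch: Netty allocates a pooled direct (off-heap) buffer and
// returns the memory explicitly via release(), instead of relying on GC.
import io.netty.buffer.ByteBuf;
import io.netty.buffer.PooledByteBufAllocator;
import java.nio.charset.StandardCharsets;

public class NettyOffHeapDemo {
    public static void main(String[] args) {
        // request a pooled direct (off-heap) buffer
        ByteBuf buf = PooledByteBufAllocator.DEFAULT.directBuffer(1024);
        try {
            buf.writeBytes("hello world".getBytes(StandardCharsets.UTF_8));
            byte[] read = new byte[buf.readableBytes()];
            buf.readBytes(read);
            System.out.println(new String(read, StandardCharsets.UTF_8));
        } finally {
            // explicitly return the off-heap memory to the pool;
            // forgetting this leaks native memory
            buf.release();
        }
    }
}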

OK, that's it for today's knowledge point. I believe that after reading this you have a clear understanding of what off-heap memory is.

END
