Why change the original log file generation system of XXL-job

The original client-side log file policy of xxl-job is that each log record generates its own file: for every log entry (logId) in the database, the corresponding client generates a file. Scheduled tasks run frequently, and some have very short trigger intervals (a few seconds), so the client ends up generating a huge number of files, each containing very little content. These scattered small files occupy a disproportionate amount of disk space, strain disk resources, hurt file system performance, and over time trigger resource alarms. So, unless you want to clean up log files constantly, some way of consolidating these fragmented files is urgently needed.

This article is long and contains a fair amount of code.

Modified log file generation policy

A basic description

To reduce the number of log files, the scattered log files are merged, and external index files are created to maintain the mapping from each logId to its log content. When reading, the index file is consulted first, and then the actual log content is read.

Here is a diagram describing the log file layout:

  • For an executor, only one logId_jobId_index.log index file is generated per day for all scheduled tasks under that executor; it maintains the mapping between log ids and task ids.
  • For each scheduled task, only one jobId.log file is generated per day, storing all of that task's log content for the day.
  • The jobId_index.log file maintains an index into the log content, so you can tell which logId a given range of lines in jobId.log belongs to, making lookups easy. A sketch of the resulting layout follows this list.
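
For example, with a hypothetical jobId of 45 and logId of 345, one day's log directory would look roughly like this (file names and entry formats follow the code later in this article):

2019-07-01/
    logId_jobId_index.log     one per executor per day, lines like "345->45" (logId -> jobId)
    45_index.log              one per job per day, lines like "345->(577,010)" (logId -> (start line, line count))
    45.log                    all of job 45's log content for the day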

That’s the general idea; you can expect a significant reduction in the number of log files and an improvement in disk usage.

The author’s modified version has been running in production for nearly a year with no problems so far. If you need it, adapt it to your own business and test it carefully to guard against unknown errors.

The code

The modifications in this article are based on xxl-job version 1.8.2; other versions have not been tested.

Open the xxl-job-core module in the source tree; the changes mainly involve the following files:

  • XxlJobFileAppender.java
  • XxlJobLogger.java
  • JobThread.java
  • ExecutorBizImpl.java
  • LRUCacheUtil.java

XxlJobFileAppender.java

Methods that are unchanged from the original are not shown here.

package com.xxl.job.core.log;

import com.xxl.job.core.biz.model.LogResult;
import com.xxl.job.core.util.LRUCacheUtil;
import org.apache.commons.io.FilenameUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.StringUtils;

import java.io.*;
import java.text.DecimalFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Pattern;

/**
 * store trigger log in each log-file
 * @author xuxueli 2016-3-12 19:25:12
 */
public class XxlJobFileAppender {
	private static Logger logger = LoggerFactory.getLogger(XxlJobFileAppender.class);

	// for JobThread (support log for child thread of job handler)
	//public static ThreadLocal<String> contextHolder = new ThreadLocal<String>();
	public static final InheritableThreadLocal<String> contextHolder = new InheritableThreadLocal<String>();
	// for logId, record the logId
	public static final InheritableThreadLocal<Integer> contextHolderLogId = new InheritableThreadLocal<>();
	// for JobId, the Id of the scheduled task
	public static final InheritableThreadLocal<Integer> contextHolderJobId = new InheritableThreadLocal<>();

	// Use an LRU cache map to store index offset information
	public static final LRUCacheUtil<Integer, Map<String ,Long>> indexOffsetCacheMap = new LRUCacheUtil<>(80);

	private static final String DATE_FOMATE = "yyyy-MM-dd";

	private static final String UTF_8 = "utf-8";

	// File name suffix
	private static final String FILE_SUFFIX = ".log";

	private static final String INDEX_SUFFIX = "_index";

	private static final String LOGID_JOBID_INDEX_SUFFIX = "logId_jobId_index";

	private static final String jobLogIndexKey = "jobLogIndexOffset";

	private static final String indexOffsetKey = "indexOffset";

	/**
	 * log base path
	 *
	 * struct like:
	 * ---/
	 * ---/gluesource/
	 * ---/gluesource/10_1514171108000.js
	 * ---/gluesource/10_1514171108000.js
	 * ---/2017-12-25/
	 * ---/2017-12-25/639.log
	 * ---/2017-12-25/821.log
	 */
	private static String logBasePath = "/data/applogs/xxl-job/jobhandler";
	private static String glueSrcPath = logBasePath.concat("/gluesource");
	public static void initLogPath(String logPath) {
		// init
		if (logPath != null && logPath.trim().length() > 0) {
			logBasePath = logPath;
		}
		// mk base dir
		File logPathDir = new File(logBasePath);
		if (!logPathDir.exists()) {
			logPathDir.mkdirs();
		}
		logBasePath = logPathDir.getPath();
		// mk glue dir
		File glueBaseDir = new File(logPathDir, "gluesource");
		if (!glueBaseDir.exists()) {
			glueBaseDir.mkdirs();
		}
		glueSrcPath = glueBaseDir.getPath();
	}

	public static String getLogPath() {
		return logBasePath;
	}

	public static String getGlueSrcPath() {
		return glueSrcPath;
	}

	/**
	 * log filename, like "logPath/yyyy-MM-dd/jobId.log"
	 *
	 * @param triggerDate
	 * @param jobId
	 * @return
	 */
	public static String makeLogFileNameByJobId(Date triggerDate, int jobId) {

		// filePath/yyyy-MM-dd
		// avoid concurrent problem, can not be static
		SimpleDateFormat sdf = new SimpleDateFormat(DATE_FOMATE);
		File logFilePath = new File(getLogPath(), sdf.format(triggerDate));
		if (!logFilePath.exists()) {
			logFilePath.mkdir();
		}
		// Generate a log index file
		String logIndexFileName = logFilePath.getPath()
				.concat("/")
				.concat(String.valueOf(jobId))
				.concat(INDEX_SUFFIX)
				.concat(FILE_SUFFIX);
		File logIndexFilePath = new File(logIndexFileName);
		if (!logIndexFilePath.exists()) {
			try {
				logIndexFilePath.createNewFile();
				logger.debug("Generate log index file, file path: {}", logIndexFilePath);
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
			}
		}
		// Generate the day's global logId -> jobId index file in the yyyy-MM-dd folder
		String logIdJobIdIndexFileName = logFilePath.getPath()
				.concat("/")
				.concat(LOGID_JOBID_INDEX_SUFFIX)
				.concat(FILE_SUFFIX);
		File logIdJobIdIndexFileNamePath = new File(logIdJobIdIndexFileName);
		if (!logIdJobIdIndexFileNamePath.exists()) {
			try {
				logIdJobIdIndexFileNamePath.createNewFile();
				logger.debug("Generate logId and jobId index file, file path: {}", logIdJobIdIndexFileNamePath);
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
			}
		}
		// filePath/yyyy-MM-dd/jobId.log log file
		String logFileName = logFilePath.getPath()
				.concat("/")
				.concat(String.valueOf(jobId))
				.concat(FILE_SUFFIX);
		return logFileName;
	}


	/**
	 * admin read log, generate logFileName by logId
	 *
	 * @param triggerDate
	 * @param logId
	 * @return
	 */
	public static String makeFileNameForReadLog(Date triggerDate, int logId) {
		// filePath/yyyy-MM-dd
		SimpleDateFormat sdf = new SimpleDateFormat(DATE_FOMATE);
		File logFilePath = new File(getLogPath(), sdf.format(triggerDate));
		if (!logFilePath.exists()) {
			logFilePath.mkdir();
		}
		String logIdJobIdFileName = logFilePath.getPath().concat("/")
				.concat(LOGID_JOBID_INDEX_SUFFIX)
				.concat(FILE_SUFFIX);
		// find logId->jobId mapping
		// Get the index map
		String infoLine = readIndex(logIdJobIdFileName, logId);
		String[] arr = infoLine.split("->");
		int jobId = 0;
		try {
			jobId = Integer.parseInt(arr[1]);
		} catch (Exception e) {
			logger.error("makeFileNameForReadLog StringArrayException,{},{}", e.getMessage(), e);
			throw new RuntimeException("StringArrayException");
		}
		String logFileName = logFilePath.getPath().concat("/")
				.concat(String.valueOf(jobId)).concat(FILE_SUFFIX);
		return logFileName;
	}

	/**
	 * appends content to the log file and appends an index entry to the index file
	 *
	 * @param logFileName
	 * @param appendLog
	 */
	public static void appendLogAndIndex(String logFileName, String appendLog) {

		// log file
		if (logFileName == null || logFileName.trim().length() == 0) {
			return;
		}
		File logFile = new File(logFileName);

		if (!logFile.exists()) {
			try {
				logFile.createNewFile();
			} catch (Exception e) {
				logger.error(e.getMessage(), e);
				return;
			}
		}
		// start append, count line num
		long startLineNum = countFileLineNum(logFileName);
		logger.debug("Start append log file, start line: {}", startLineNum);
		// log
		if (appendLog == null) {
			appendLog = "";
		}
		appendLog += "\r\n";

		// append file content
		try {
			FileOutputStream fos = null;
			try {
				fos = new FileOutputStream(logFile, true);
				fos.write(appendLog.getBytes("utf-8"));
				fos.flush();
			} finally {
				if (fos != null) {
					try {
						fos.close();
					} catch (IOException e) {
						logger.error(e.getMessage(), e);
					}
				}
			}
		} catch (Exception e) {
			logger.error(e.getMessage(), e);
		}
		// end append, count line num, count again
		long endLineNum = countFileLineNum(logFileName);
		Long lengthTmp = endLineNum - startLineNum;
		int length = 0;
		try {
			length = lengthTmp.intValue();
		} catch (Exception e) {
			logger.error("Long to int Exception", e);
		}
		logger.debug("End append log file, end line: {}, length: {}", endLineNum, length);
		Map<String, Long> indexOffsetMap = new HashMap<>();
		appendIndexLog(logFileName, startLineNum, length, indexOffsetMap);
		appendLogIdJobIdFile(logFileName, indexOffsetMap);
	}


	/**
	 * Create a mapping between logId and jobId
	 *
	 * @param logFileName
	 * @param indexOffsetMap
	 */
	public static void appendLogIdJobIdFile(String logFileName, Map indexOffsetMap) {
		// Get the value of the variable stored in ThreadLocal
		int logId = XxlJobFileAppender.contextHolderLogId.get();
		int jobId = XxlJobFileAppender.contextHolderJobId.get();
		File file = new File(logFileName);
		// Get the parent directory, find the index file under the same folder
		String parentDirName = file.getParent();
		// logId_jobId_index fileName
		String logIdJobIdIndexFileName = parentDirName.concat("/")
				.concat(LOGID_JOBID_INDEX_SUFFIX)
				.concat(FILE_SUFFIX);
		// Get the logId from the cache
		boolean jobLogIndexOffsetExist = indexOffsetCacheMap.exists(logId);
		Long jobLogIndexOffset = null;
		if (jobLogIndexOffsetExist) {
			jobLogIndexOffset = indexOffsetCacheMap.get(logId).get(jobLogIndexKey);
		}
		if (jobLogIndexOffset == null) {
			// If it is empty, add it
			StringBuffer stringBuffer = new StringBuffer();
			stringBuffer.append(logId).append("->").append(jobId).append("\r\n");
			Long currentPoint = getAfterAppendIndexLog(logIdJobIdIndexFileName, stringBuffer.toString());
			indexOffsetMap.put(jobLogIndexKey, currentPoint);
			indexOffsetCacheMap.save(logId, indexOffsetMap);
		}
		// If it is not null, the cache already exists and no other processing is performed
	}

	/**
	 * appends index file contents, returns the offset
	 *
	 * @param fileName
	 * @param content
	 * @return
	 */
	private static Long getAfterAppendIndexLog(String fileName, String content) {
		RandomAccessFile raf = null;
		Long point = null;
		try {
			raf = new RandomAccessFile(fileName, "rw");
			long end = raf.length();
			// The pointer is placed at the end of the file because it is appended
			raf.seek(end);
			raf.writeBytes(content);
			// Record the pointer offset to cache.
			// Note: the offset recorded is the one from BEFORE the append (the old end of file),
			// not the position after appending, which would point past the new content.
			point = end;
		} catch (IOException e) {
			logger.error(e.getMessage(), e);
		} finally {
			try {
				raf.close();
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
			}
		}
		return point;
	}

	/**
	 * append index entries like "345->(577,10)"
	 *
	 * @param logFileName
	 * @param from
	 * @param length
	 * @param indexOffsetMap
	 */
	public static void appendIndexLog(String logFileName, Long from, int length, Map indexOffsetMap) {
		int strLength = logFileName.length();
		// Derive the index file name by stripping the ".log" suffix
		String prefixFilePath = logFileName.substring(0, strLength - 4);
		String logIndexFilePath = prefixFilePath.concat(INDEX_SUFFIX).concat(FILE_SUFFIX);
		File logIndexFile = new File(logIndexFilePath);
		if (!logIndexFile.exists()) {
			try {
				logIndexFile.createNewFile();
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
				return;
			}
		}
		int logId = XxlJobFileAppender.contextHolderLogId.get();

		StringBuffer stringBuffer = new StringBuffer();
		// Determine whether to add or modify
		boolean indexOffsetExist = indexOffsetCacheMap.exists(logId);
		Long indexOffset = null;
		if (indexOffsetExist) {
			indexOffset = indexOffsetCacheMap.get(logId).get(indexOffsetKey);
		}
		if (indexOffset == null) {
			// append
			String lengthStr = getFormatNum(length);
			stringBuffer.append(logId).append("->(")
					.append(from).append(",").append(lengthStr).append(")\r\n");
			// Add a new index to record the offset
			Long currentIndexPoint = getAfterAppendIndexLog(logIndexFilePath, stringBuffer.toString());
			indexOffsetMap.put(indexOffsetKey, currentIndexPoint);
		} else {
			String infoLine = getIndexLineIsExist(logIndexFilePath, logId);
			// Modify index file contents
			int startTmp = infoLine.indexOf("(");
			int endTmp = infoLine.indexOf(")");
			String[] lengthTmp = infoLine.substring(startTmp + 1, endTmp).split(",");
			int lengthTmpInt = 0;
			try {
				lengthTmpInt = Integer.parseInt(lengthTmp[1]);
				from = Long.valueOf(lengthTmp[0]);
			} catch (Exception e) {
				logger.error("appendIndexLog StringArrayException,{},{}", e.getMessage(), e);
				throw new RuntimeException("StringArrayException");
			}
			int modifyLength = length + lengthTmpInt;
			String lengthStr2 = getFormatNum(modifyLength);
			stringBuffer.append(logId).append("->(")
					.append(from).append(",").append(lengthStr2).append(")\r\n");
			modifyIndexFileContent(logIndexFilePath, infoLine, stringBuffer.toString());
		}
	}

	/**
	 * handle getFormatNum
	 * like 5 to 005
	 *
	 * @return
	 */
	private static String getFormatNum(int num) {
		DecimalFormat df = new DecimalFormat("000");
		String str1 = df.format(num);
		return str1;
	}

	/**
	 * check whether the index line exists
	 *
	 * @param filePath
	 * @param logId
	 * @return
	 */
	private static String getIndexLineIsExist(String filePath, int logId) {
		// Logging is invoked each time a line is generated, so entries in the index file with the same logId must be merged
		String prefix = logId + "->";
		Pattern pattern = Pattern.compile(prefix + ".*?");
		String indexInfoLine = "";
		RandomAccessFile raf = null;
		try {
			raf = new RandomAccessFile(filePath, "rw");
			String tmpLine = null;
			// the offset
			boolean indexOffsetExist = indexOffsetCacheMap.exists(logId);
			Long cachePoint = null;
			if (indexOffsetExist) {
				cachePoint = indexOffsetCacheMap.get(logId).get(indexOffsetKey);
			}
			if (null == cachePoint) {
				cachePoint = Long.valueOf(0);
			}
			raf.seek(cachePoint);
			while ((tmpLine = raf.readLine()) != null) {
				final long point = raf.getFilePointer();
				boolean matchFlag = pattern.matcher(tmpLine).find();
				if (matchFlag) {
					indexInfoLine = tmpLine;
					break;
				}
				cachePoint = point;
			}
		} catch (IOException e) {
			logger.error(e.getMessage(), e);
		} finally {
			try {
				raf.close();
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
			}
		}
		return indexInfoLine;
	}

	/**
	 * read the index line for the given logId from the index file
	 *
	 * @param filePath
	 * @param logId
	 * @return
	 */
	private static String readIndex(String filePath, int logId) {
		filePath = FilenameUtils.normalize(filePath);
		String prefix = logId + "->";
		Pattern pattern = Pattern.compile(prefix + ".*?");
		String indexInfoLine = "";
		BufferedReader bufferedReader = null;
		try {
			bufferedReader = new BufferedReader(new FileReader(filePath));
			String tmpLine = null;
			while ((tmpLine = bufferedReader.readLine()) != null) {
				boolean matchFlag = pattern.matcher(tmpLine).find();
				if (matchFlag) {
					indexInfoLine = tmpLine;
					break;
				}
			}
			bufferedReader.close();
		} catch (IOException e) {
			logger.error(e.getMessage(), e);
		} finally {
			if (bufferedReader != null) {
				try {
					bufferedReader.close();
				} catch (IOException e) {
					logger.error(e.getMessage(), e);
				}
			}
		}
		return indexInfoLine;
	}

	/**
	 * Modifies the logIndexFile contents
	 *
	 * @param indexFileName
	 * @param oldContent
	 * @param newContent
	 * @return
	 */
	private static boolean modifyIndexFileContent(String indexFileName, String oldContent, String newContent) {
		RandomAccessFile raf = null;
		int logId = contextHolderLogId.get();
		try {
			raf = new RandomAccessFile(indexFileName, "rw");
			String tmpLine = null;
			// the offset
			boolean indexOffsetExist = indexOffsetCacheMap.exists(logId);
			Long cachePoint = null;
			if (indexOffsetExist) {
				cachePoint = indexOffsetCacheMap.get(logId).get(indexOffsetKey);
			}
			if (null == cachePoint) {
				cachePoint = Long.valueOf(0);
			}
			raf.seek(cachePoint);
			while ((tmpLine = raf.readLine()) != null) {
				final long point = raf.getFilePointer();
				if (tmpLine.contains(oldContent)) {
					String str = tmpLine.replace(oldContent, newContent);
					raf.seek(cachePoint);
					raf.writeBytes(str);
				}
				cachePoint = point;
			}
		} catch (IOException e) {
			logger.error(e.getMessage(), e);
		} finally {
			try {
				raf.close();
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
			}
		}
		return true;
	}

	/**
	 * Count the number of lines in the file
	 *
	 * @param logFileName
	 * @return
	 */
	private static long countFileLineNum(String logFileName) {
		File file = new File(logFileName);
		if (file.exists()) {
			try {
				FileReader fileReader = new FileReader(file);
				LineNumberReader lineNumberReader = new LineNumberReader(fileReader);
				lineNumberReader.skip(Long.MAX_VALUE);
				// getLineNumber() starts counting at 0, so increments by 1
				long totalLines = lineNumberReader.getLineNumber() + 1;
				fileReader.close();
				lineNumberReader.close();
				return totalLines;
			} catch (IOException e) {
				logger.error(e.getMessage(), e);
			}
		}
		return 0;
	}


	/**
	 * Read the log by index: 1. read the logIndexFile; 2. read the logFile
	 *
	 * @param logFileName
	 * @param logId
	 * @param fromLineNum
	 * @return
	 */
	public static LogResult readLogByIndex(String logFileName, int logId, int fromLineNum) {
		int strLength = logFileName.length();
		// Strip the ".log" suffix to get the file name prefix
		String prefixFilePath = logFileName.substring(0, strLength - 4);
		String logIndexFilePath = prefixFilePath.concat(INDEX_SUFFIX).concat(FILE_SUFFIX);
		// valid logIndex file
		if (StringUtils.isEmpty(logIndexFilePath)) {
			return new LogResult(fromLineNum, 0, "readLogByIndex fail, logIndexFile not found", true);
		}
		logIndexFilePath = FilenameUtils.normalize(logIndexFilePath);
		File logIndexFile = new File(logIndexFilePath);
		if (!logIndexFile.exists()) {
			return new LogResult(fromLineNum, 0, "readLogByIndex fail, logIndexFile not exists", true);
		}
		// valid log file
		if (StringUtils.isEmpty(logFileName)) {
			return new LogResult(fromLineNum, 0, "readLogByIndex fail, logFile not found", true);
		}
		logFileName = FilenameUtils.normalize(logFileName);
		File logFile = new File(logFileName);
		if (!logFile.exists()) {
			return new LogResult(fromLineNum, 0, "readLogByIndex fail, logFile not exists", true);
		}

		// read logIndexFile
		String indexInfo = readIndex(logIndexFilePath, logId);
		int startNum = 0;
		int endNum = 0;
		if (!StringUtils.isEmpty(indexInfo)) {
			int startTmp = indexInfo.indexOf("(");
			int endTmp = indexInfo.indexOf(")");
			String[] fromAndTo = indexInfo.substring(startTmp + 1, endTmp).split(",");
			try {
				startNum = Integer.parseInt(fromAndTo[0]);
				endNum = Integer.parseInt(fromAndTo[1]) + startNum;
			} catch (Exception e) {
				logger.error("readLogByIndex StringArrayException,{},{}", e.getMessage(), e);
				throw new RuntimeException("StringArrayException");
			}
		}
		// read file
		StringBuffer logContentBuffer = new StringBuffer();
		int toLineNum = 0;
		LineNumberReader reader = null;
		try {
			reader = new LineNumberReader(new InputStreamReader(new FileInputStream(logFile), UTF_8));
			String line = null;

			while ((line = reader.readLine()) != null) {
				// [from, to], start as fromNum(logIndexFile)
				toLineNum = reader.getLineNumber();
				if (toLineNum >= startNum && toLineNum < endNum) {
					logContentBuffer.append(line).append("\n");
				}
				// break when read over
				if (toLineNum >= endNum) {
					break;
				}
			}
		} catch (IOException e) {
			logger.error(e.getMessage(), e);
		} finally {
			if (reader != null) {
				try {
					reader.close();
				} catch (IOException e) {
					logger.error(e.getMessage(), e);
				}
			}
		}
		LogResult logResult = new LogResult(fromLineNum, toLineNum, logContentBuffer.toString(), false);
		return logResult;
	}
}
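
A minimal write-then-read sketch of how these methods fit together (hypothetical logId/jobId values; in the real flow, the write side is driven by JobThread and XxlJobLogger, and the read side by ExecutorBizImpl, both shown below):

// write side: set the thread-local context, then append log content plus indexes
XxlJobFileAppender.contextHolderJobId.set(45);
XxlJobFileAppender.contextHolderLogId.set(345);
String logFileName = XxlJobFileAppender.makeLogFileNameByJobId(new Date(), 45);
XxlJobFileAppender.appendLogAndIndex(logFileName, "hello xxl-job");

// read side: resolve logId -> jobId via the global index, then read the indexed line range
String fileNameForRead = XxlJobFileAppender.makeFileNameForReadLog(new Date(), 345);
LogResult result = XxlJobFileAppender.readLogByIndex(fileNameForRead, 345, 1);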

XxlJobLogger.java

package com.xxl.job.core.log;

import com.xxl.job.core.util.DateUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.helpers.FormattingTuple;
import org.slf4j.helpers.MessageFormatter;

import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.Date;

/** * Created by xuxueli on 17/4/28. */
public class XxlJobLogger {
    private static Logger logger = LoggerFactory.getLogger("xxl-job logger");

    /**
     * append log
     *
     * @param callInfo
     * @param appendLog
     */
    private static void logDetail(StackTraceElement callInfo, String appendLog) {


        /*// "yyyy-MM-dd HH:mm:ss [ClassName]-[MethodName]-[LineNumber]-[ThreadName] log"; StackTraceElement[] stackTraceElements = new Throwable().getStackTrace(); StackTraceElement callInfo = stackTraceElements[1]; * /

        StringBuffer stringBuffer = new StringBuffer();
        stringBuffer.append(DateUtil.formatDateTime(new Date())).append(" ")
            .append("[" + callInfo.getClassName() + "#" + callInfo.getMethodName() + "]").append("-")
            .append("[" + callInfo.getLineNumber() + "]").append("-")
            .append("[" + Thread.currentThread().getName() + "]").append(" ")
            .append(appendLog != null ? appendLog : "");
        String formatAppendLog = stringBuffer.toString();

        // appendlog
        String logFileName = XxlJobFileAppender.contextHolder.get();
        if (logFileName != null && logFileName.trim().length() > 0) {
            // XxlJobFileAppender.appendLog(logFileName, formatAppendLog);
			  // Modify the method call here
            // modify appendLogAndIndex for addIndexLogInfo
            XxlJobFileAppender.appendLogAndIndex(logFileName, formatAppendLog);
        } else {
            logger.info("> > > > > > > > > > > {}", formatAppendLog); }}}Copy the code

JobThread.java

@Override
	public void run() {
		// ... (preceding code unchanged)
		// execute
		while (!toStop) {
			running = false;
			idleTimes++;

            TriggerParam triggerParam = null;
            ReturnT<String> executeResult = null;
            try {
				// to check the toStop signal we need to loop, so we cannot use queue.take(); use poll(timeout) instead
				triggerParam = triggerQueue.poll(3L, TimeUnit.SECONDS);
				if (triggerParam != null) {
					running = true;
					idleTimes = 0;
					triggerLogIdSet.remove(triggerParam.getLogId());

					// log filename, like "logPath/yyyy-MM-dd/9999.log"
					// String logFileName = XxlJobFileAppender.makeLogFileName(new Date(triggerParam.getLogDateTim()), triggerParam.getLogId());

					// modification: name the generated log file after the jobId
					String logFileName = XxlJobFileAppender.makeLogFileNameByJobId(new Date(triggerParam.getLogDateTim()), triggerParam.getJobId());
					XxlJobFileAppender.contextHolderJobId.set(triggerParam.getJobId());
					// Adjust this conversion based on your xxl-job version
					XxlJobFileAppender.contextHolderLogId.set(Integer.parseInt(String.valueOf(triggerParam.getLogId())));

					XxlJobFileAppender.contextHolder.set(logFileName);
					ShardingUtil.setShardingVo(new ShardingUtil.ShardingVO(triggerParam.getBroadcastIndex(), triggerParam.getBroadcastTotal()));
					// ... (rest of the method unchanged)
				}

ExecutorBizImpl.java

package com.xxl.job.core.biz.impl;

import com.xxl.job.core.biz.ExecutorBiz;
import com.xxl.job.core.biz.model.LogResult;
import com.xxl.job.core.biz.model.ReturnT;
import com.xxl.job.core.biz.model.TriggerParam;
import com.xxl.job.core.enums.ExecutorBlockStrategyEnum;
import com.xxl.job.core.executor.XxlJobExecutor;
import com.xxl.job.core.glue.GlueFactory;
import com.xxl.job.core.glue.GlueTypeEnum;
import com.xxl.job.core.handler.IJobHandler;
import com.xxl.job.core.handler.impl.GlueJobHandler;
import com.xxl.job.core.handler.impl.ScriptJobHandler;
import com.xxl.job.core.log.XxlJobFileAppender;
import com.xxl.job.core.thread.JobThread;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.Date;

/** * Created by xuxueli on 17/3/1. */
public class ExecutorBizImpl implements ExecutorBiz {
    private static Logger logger = LoggerFactory.getLogger(ExecutorBizImpl.class);


    /**
     * overridden log reading method
     *
     * @param logDateTim
     * @param logId
     * @param fromLineNum
     * @return
     */
    @Override
    public ReturnT<LogResult> log(long logDateTim, long logId, int fromLineNum) {
        // log filename: logPath/yyyy-MM-dd/9999.log
        String logFileName = XxlJobFileAppender.makeFileNameForReadLog(new Date(logDateTim), (int)logId);

        LogResult logResult = XxlJobFileAppender.readLogByIndex(logFileName, Integer.parseInt(String.valueOf(logId)), fromLineNum);
        return new ReturnT<LogResult>(logResult);
    }
}

LRUCacheUtil.java

Implement a cache container via LinkedHashMap

package com.xxl.job.core.util;

import java.util.LinkedHashMap;
import java.util.Map;

/**
 * @Author: liangxuanhao
 * @Description: Implement a fixed-size cache using LinkedHashMap
 */
public class LRUCacheUtil<K, V> extends LinkedHashMap<K, V> {

    // Maximum cache capacity
    private static final int CACHE_MAX_SIZE = 100;

    private int limit;

    public LRUCacheUtil() {
        this(CACHE_MAX_SIZE);
    }

    public LRUCacheUtil(int cacheSize) {
        // accessOrder = true: accessed entries are moved to the end
        super(cacheSize, 0.75f, true);
        this.limit = cacheSize;
    }

    /**
     * synchronized to prevent multithreaded safety issues
     */
    public synchronized V save(K key, V val) {
        return put(key, val);
    }

    public V getOne(K key) {
        return get(key);
    }

    public boolean exists(K key) {
        return containsKey(key);
    }

    /**
     * Check whether the capacity limit has been exceeded
     *
     * @param eldest
     * @return true if the limit is exceeded, false otherwise
     */
    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after put or putAll; when the capacity limit is exceeded,
        // the least recently used entry is removed (LRU)
        return size() > limit;
    }
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<K, V> entry : entrySet()) {
            sb.append(String.format("%s:%s ", entry.getKey(), entry.getValue()));
        }
        return sb.toString();
    }
}
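
A quick usage sketch of the cache (hypothetical values, not from the original article): with a capacity of 2, inserting a third entry evicts the least recently used one.

LRUCacheUtil<Integer, String> cache = new LRUCacheUtil<>(2);
cache.save(1, "a");
cache.save(2, "b");
cache.getOne(1);      // touch key 1, so key 2 becomes the least recently used
cache.save(3, "c");   // exceeds the limit of 2, so key 2 is evicted
System.out.println(cache.exists(2)); // false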

The results

The effect is shown in the figure: it achieves what was expected, and the result is good.