While working on a recent project I needed a logging feature: operation logs are written to a file, and a scheduled task later reads the file and saves the entries to the database. Because the scheduled task uses a RandomAccessFile to pick up the incremental part of the log, I ran into garbled text when reading the file through that RandomAccessFile object. After struggling with it for a while I finally found a fix. Since I have also hit other encoding problems in this project, I am writing down here the solutions to the various kinds of garbled-text problems I have met.

First, configure log4j in Apache Karaf so that the log files are generated inside Karaf. The logging configuration is as follows:

# Write running and error logs to files
log4j.logger.com.report=info, RunningLog, ErrorLog

log4j.appender.RunningLog=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.RunningLog.key=bundle.name
log4j.appender.RunningLog.default=karaf
log4j.appender.RunningLog.appender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RunningLog.appender.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.RunningLog.appender.file=${karaf.data}/log/RunningLog.log
log4j.appender.RunningLog.appender.append=true
log4j.appender.RunningLog.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.RunningLog.appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss}-> %-5.5p-> %m(%l)%n
# Set the log encoding to UTF-8 to avoid garbled text (this line does not seem to take effect)
log4j.appender.RunningLog.encoding=UTF-8

log4j.appender.ErrorLog=org.apache.log4j.DailyRollingFileAppender
log4j.appender.ErrorLog.DatePattern='.'yyyy-MM-dd
log4j.appender.ErrorLog.file=${karaf.data}/log/ErrorLog/ErrorLog.log
log4j.appender.ErrorLog.layout=org.apache.log4j.PatternLayout
log4j.appender.ErrorLog.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss}-> %-5.5p-> %m(%l)%n
log4j.appender.ErrorLog.Threshold=ERROR
# Set the log encoding to UTF-8 to avoid garbled text (this line does not seem to take effect)
log4j.appender.ErrorLog.encoding=UTF-8


# Log configuration for the data-reporting modules
log4j.logger.com.gzydt.report.base=info, RunningLog, ErrorLog
log4j.logger.com.gzydt.report.library=info, RunningLog, ErrorLog
log4j.logger.com.gzydt.report.message=info, RunningLog, ErrorLog
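
For reference, a line in this format comes from an ordinary log4j call made in one of the configured packages. Here is a minimal sketch of my own (not code from the project), reusing the NoticeRestImpl class name that appears in the sample log line further down; the method body and field values are purely illustrative:

package com.gzydt.report.message.rest.impl;

import org.apache.log4j.Logger;

// Illustrative sketch: writes an operation-log line that the
// "%d{yyyy-MM-dd HH:mm:ss}-> %-5.5p-> %m(%l)%n" pattern above turns into
// "2015-07-25 14:04:16-> INFO -> {...}(...NoticeRestImpl.get(...))".
public class NoticeRestImpl {

	private static final Logger LOG = Logger.getLogger(NoticeRestImpl.class);

	public void get(String userId) {
		// log4j adds the timestamp, level and location (%l) by itself;
		// the message is the JSON payload that analyLine() parses later.
		LOG.info("{\"modelName\":\"企业填报\",\"modelOperation\":\"查看企业填报信息\",\"userId\":\"" + userId + "\"}");
	}
}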

With this configuration in place, the log files are generated under the specified folder. Once the log files existed, reading the incremental part of them with RandomAccessFile produced garbled text. Here is my scheduled task:


package com.gzydt.report.logging.rest.impl;

import java.io.File;
import java.io.RandomAccessFile;
import java.text.SimpleDateFormat;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.json.JSONObject;

import com.gzydt.report.logging.entity.OperateLog;
import com.gzydt.report.logging.service.OperateLogService;

/**
 * Reads the log file on a schedule
 * 
 * @since 2015-7-25
 * 
 */
public class OperateLogResourceImpl{

	private long lastTimeSize = 0;

	private static boolean isRunning = false;

	String runningLogPath = "data/log/RunningLog.log";
	private OperateLogService operateLogService;

	public void setOperateLogService(OperateLogService operateLogService) {
		this.operateLogService = operateLogService;
	}

	/**
	 * Scheduled task: every minute, read the new log entries from the file and write them to the database
	 */
	public void init() {
		if (!isRunning) {
			isRunning = true;
			// Scheduled task: read from the file every minute
			try {
				File file = new File(runningLogPath);
				if (!file.exists()) {
					file.createNewFile();
				}
				// Use a RandomAccessFile to pick up the incremental part of the log file
				final RandomAccessFile accessFile = new RandomAccessFile(file,
						"r");
				ScheduledExecutorService schService = Executors
						.newScheduledThreadPool(1);
				schService.scheduleWithFixedDelay(new Runnable() {
					@Override
					public void run() {
						try {
							accessFile.seek(lastTimeSize);
							String tmp = "";
							while ((tmp = accessFile.readLine()) != null) {
								// Work around the garbled Chinese text
								tmp = new String(tmp.getBytes("8859_1"), "gbk");
								System.out.println(tmp);
								OperateLog entity = analyLine(tmp);
								operateLogService.add(entity);
							}
							lastTimeSize = accessFile.length();
						} catch (Exception e) {
							e.printStackTrace();
						}
					}
				}, 0, 60, TimeUnit.SECONDS);
			} catch (Exception e) {
				e.printStackTrace();
			}
			isRunning = false;
		} else {
			System.out.println("上一轮的扫描入库还没有结束,不能立马进入到下一轮的扫描入库操作");
		}
	}

	private OperateLog analyLine(String line) throws Exception {
		final String text = "2015-07-25 14:04:16-> INFO -> {\"modelName\":企业填报,\"modelOperation\":查看企业填报信息,\"userId\":12342353}(com.gzydt.report.message.rest.impl.NoticeRestImpl.get(NoticeRestImpl.java:90))";
		if (line == null) {
			throw new Exception("Invalid log line format. Expected format:\n" + text + "\n");
		}
		String[] split = line.split("->");
		OperateLog log = new OperateLog();
		SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
		log.setOperateTime(df.parse(split[0]));
		JSONObject message = new JSONObject(split[2].substring(0,
				split[2].indexOf("(")));
		log.setModelName(message.optString("modelName"));
		log.setModelOperation(message.optString("modelOperation"));
		log.setUserId(message.optString("userId"));
		return log;
	}

}

The garbled text shows up at this statement:

tmp = accessFile.readLine()

The likely cause is that under the OSGi environment the log file is written in the platform's ANSI encoding (on a Chinese Windows system that means GBK), while RandomAccessFile.readLine() never applies a charset: it simply widens each raw byte into a char, which is effectively ISO-8859-1 decoding. The original bytes therefore have to be recovered and decoded with the file's real encoding. The workaround is:

// Work around the garbled Chinese text
tmp = new String(tmp.getBytes("8859_1"), "gbk");
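
Instead of patching every line after readLine(), another option is to read the new bytes directly and decode the whole increment in one go. This is a variant of my own, not what the project uses, and the charset name ("GBK" in the example below) is an assumption that has to match whatever encoding the file was really written with:

import java.io.IOException;
import java.io.RandomAccessFile;

// Sketch of an alternative: read the bytes appended since the last scan and
// decode them explicitly, instead of fixing up the result of readLine().
// The charset passed in (e.g. "GBK") is an assumption.
public class IncrementalLogReader {

	private long lastTimeSize = 0;

	public String readIncrement(RandomAccessFile accessFile, String charset)
			throws IOException {
		long newLength = accessFile.length();
		if (newLength <= lastTimeSize) {
			return "";
		}
		byte[] buffer = new byte[(int) (newLength - lastTimeSize)];
		accessFile.seek(lastTimeSize);
		accessFile.readFully(buffer); // exactly the new bytes
		lastTimeSize = newLength;
		return new String(buffer, charset); // e.g. readIncrement(accessFile, "GBK")
	}
}

The returned block can then be split into lines and fed to analyLine() just as before.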



The bean is wired up as follows; init-method="init" starts the scheduled task as soon as the container creates the bean:


<bean id="operateLog" class="com.gzydt.report.logging.rest.impl.OperateLogResourceImpl"  init-method="init">
		<property name="operateLogService" ref="operateLogService" />
	</bean>

Another garbled-text fix I have used, this time for Chinese file names:

// Fix the garbled Chinese file name
filename = new String(file.getName().getBytes("UTF-8"), "ISO-8859-1");
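
A common place for this trick is a Chinese file name that has to travel through an HTTP header, typically the Content-Disposition header of a download: servlet containers write header values as ISO-8859-1, so the UTF-8 bytes are smuggled through that encoding. A minimal servlet-style sketch of my own (the helper class and method are hypothetical):

import java.io.File;
import java.io.UnsupportedEncodingException;

import javax.servlet.http.HttpServletResponse;

// Hypothetical sketch: re-encode a Chinese file name before putting it into
// the Content-Disposition header, which is written out as ISO-8859-1.
public class DownloadNameHelper {

	public static void setDownloadName(HttpServletResponse response, File file)
			throws UnsupportedEncodingException {
		// Fix the garbled Chinese file name
		String filename = new String(file.getName().getBytes("UTF-8"), "ISO-8859-1");
		response.setHeader("Content-Disposition", "attachment; filename=" + filename);
	}
}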

And yet another one, for the body of an HTTP request:

RequestEntity entity = new StringRequestEntity(json.toString(), "application/json", "UTF-8");
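
That constructor comes from Apache Commons HttpClient 3.x; passing "UTF-8" makes the request body be encoded as UTF-8 instead of the platform default, so Chinese text in the JSON survives the trip. A minimal posting sketch of my own, with a placeholder URL:

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.PostMethod;
import org.apache.commons.httpclient.methods.RequestEntity;
import org.apache.commons.httpclient.methods.StringRequestEntity;

import org.json.JSONObject;

// Sketch: post a JSON body as UTF-8 with Commons HttpClient 3.x so that
// Chinese characters are not mangled by the platform default encoding.
// The URL below is a placeholder.
public class ReportPoster {

	public static void post(JSONObject json) throws Exception {
		PostMethod method = new PostMethod("http://example.com/report");
		RequestEntity entity =
				new StringRequestEntity(json.toString(), "application/json", "UTF-8");
		method.setRequestEntity(entity);
		try {
			HttpClient client = new HttpClient();
			int status = client.executeMethod(method);
			System.out.println("HTTP status: " + status);
		} finally {
			method.releaseConnection();
		}
	}
}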