Problem Diagnosis
Some batch update operations in our system were timing out. Investigation showed that the database connection pool was exhausted, which caused the updates to fail. My original understanding was that batch updates should not be able to saturate the connection pool; it turned out the JDBC configuration was missing the rewriteBatchedStatements parameter, without which even "batch" operations are submitted to MySQL one statement at a time. Most articles I found via Google do not explain in detail how this parameter actually works, so I am recording it here.
useCursorFetch: when querying large amounts of data, MySQL by default fetches everything at once (FetchAll) and piles all rows up on the client. To stream results instead, turn on the useCursorFetch parameter (and give the statement a positive fetch size via setFetchSize); this parameter defaults to false. (Minimum MySQL version: 5.0.2) (https://www.jianshu.com/p/64bd83f3bcc5)
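For example, a streaming-enabled connection URL could look like the following (host, port, and schema here are placeholders, not values from this project; for cursor fetch to actually take effect, the statement also needs a positive fetch size via Statement.setFetchSize):

```
jdbc:mysql://127.0.0.1:3306/test?useCursorFetch=true
```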
rewriteBatchedStatements: there are three common ways to do batch writes:
MyBatis batch executor (ExecutorType.BATCH),
native JDBC batch (addBatch/executeBatch),
and MyBatis foreach, which concatenates one multi-value SQL statement.
Apart from the last one, the foreach SQL-concatenation approach, which rewriteBatchedStatements does not affect, the other two only achieve high-performance batching when the rewriteBatchedStatements parameter is enabled (minimum MySQL driver version: 5.1.13).
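To make the difference concrete, here is a simplified simulation (not Connector/J's actual implementation; the table and column names are made up) of the rewrite the driver performs on an INSERT batch when rewriteBatchedStatements=true: the queued parameter sets are folded into a single multi-value INSERT, so the whole batch costs one server round trip instead of one per row.

```java
import java.util.List;
import java.util.StringJoiner;

public class RewriteDemo {
    // Simulates the INSERT rewrite: "insert into t(a,b) values(?,?)" batched
    // N times becomes one statement "insert into t(a,b) values (...),(...),...".
    static String rewriteInsertBatch(String sql, List<List<String>> batchParams) {
        int valuesPos = sql.toLowerCase().indexOf("values");
        String head = sql.substring(0, valuesPos + "values".length());
        String rowTemplate = sql.substring(valuesPos + "values".length()).trim(); // e.g. "(?,?)"
        StringJoiner rows = new StringJoiner(",");
        for (List<String> params : batchParams) {
            String row = rowTemplate;
            for (String p : params) {
                row = row.replaceFirst("\\?", "'" + p + "'"); // naive inlining, demo only
            }
            rows.add(row);
        }
        return head + " " + rows;
    }

    public static void main(String[] args) {
        String sql = "insert into notify_record(partner_no, trade_no) values(?,?)";
        List<List<String>> batch = List.of(List.of("1", "100"), List.of("1", "101"));
        // Prints: insert into notify_record(partner_no, trade_no) values ('1','100'),('1','101')
        System.out.println(rewriteInsertBatch(sql, batch));
    }
}
```

Without the parameter, each queued parameter set is sent to the server as its own single-row INSERT, which is exactly why the "batch" tests below degenerate into ten thousand round trips.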
public class NotifyRecordDaoTest extends BaseTest {

    @Resource(name = "masterDataSource")
    private DataSource dataSource;

    // Approach 1: native JDBC batch (addBatch/executeBatch)
    @Test
    public void insert() throws Exception {
        Connection connection = dataSource.getConnection();
        connection.setAutoCommit(false);
        String sql = "insert into notify_record(" +
                " partner_no," +
                " trade_no, loan_no, notify_times," +
                " limit_notify_times, notify_url, notify_type, notify_content," +
                " notify_status)" +
                " values(?,?,?,?,?,?,?,?,?) ";
        PreparedStatement statement = connection.prepareStatement(sql);
        for (int i = 0; i < 10000; i++) {
            statement.setString(1, "1");
            statement.setString(2, i + "");
            statement.setInt(3, 1);
            statement.setInt(4, 1);
            statement.setString(5, "1");
            statement.setString(6, "1");
            statement.setString(7, "1");
            statement.setString(8, "1");
            statement.setString(9, "1");
            statement.addBatch();
        }
        long start = System.currentTimeMillis();
        statement.executeBatch();
        connection.commit();
        statement.close();   // close the statement before its connection
        connection.close();
        System.out.println(System.currentTimeMillis() - start);
    }

    // Approach 2: MyBatis foreach, concatenating one multi-value INSERT in the mapper
    @Test
    public void insertB() {
        List<NotifyRecordEntity> notifyRecordEntityList = Lists.newArrayList();
        for (int i = 0; i < 10000; i++) {
            NotifyRecordEntity record = new NotifyRecordEntity();
            record.setLastNotifyTime(new Date());
            record.setPartnerNo("1");
            record.setLimitNotifyTimes(1);
            record.setNotifyUrl("1");
            record.setLoanNo("1");
            record.setNotifyContent("1");
            record.setTradeNo("" + i);
            record.setNotifyTimes(1);
            record.setNotifyType(EnumNotifyType.DAIFU);
            record.setNotifyStatus(EnumNotifyStatus.FAIL);
            notifyRecordEntityList.add(record);
        }
        long start = System.currentTimeMillis();
        Map<String, Object> params = Maps.newHashMap();
        params.put("notifyRecordEntityList", notifyRecordEntityList);
        DaoFactory.notifyRecordDao.insertSelectiveList(params);
        System.out.println(System.currentTimeMillis() - start);
    }

    @Resource
    SqlSessionFactory sqlSessionFactory;

    // Approach 3: MyBatis batch executor (ExecutorType.BATCH)
    @Test
    public void insertC() {
        SqlSession sqlsession = sqlSessionFactory.openSession(ExecutorType.BATCH, false);
        NotifyRecordDao notifyRecordDao = sqlsession.getMapper(NotifyRecordDao.class);
        int num = 0;
        for (int i = 0; i < 10000; i++) {
            NotifyRecordEntity record = new NotifyRecordEntity();
            record.setLastNotifyTime(new Date());
            record.setPartnerNo("1");
            record.setLimitNotifyTimes(1);
            record.setNotifyUrl("1");
            record.setLoanNo("1");
            record.setNotifyContent("1");
            record.setTradeNo("s" + i);
            record.setNotifyTimes(1);
            record.setNotifyType(EnumNotifyType.DAIFU);
            record.setNotifyStatus(EnumNotifyStatus.FAIL);
            notifyRecordDao.insert(record);
            num++;
            // optionally flush every 1000 rows:
            // if (num >= 1000) {
            //     sqlsession.commit();
            //     sqlsession.clearCache();
            //     num = 0;
            // }
        }
        long start = System.currentTimeMillis();
        sqlsession.commit();
        sqlsession.clearCache();
        sqlsession.close();
        System.out.println(System.currentTimeMillis() - start);
    }
}
Testing the insertion of ten thousand rows, the SQL-concatenation approach took a bit over 5 seconds, while MyBatis batch and native JDBC batch each took over 50 seconds. That seemed impossible; the code looked correct, so the problem had to be in the database or the connection configuration.
It turned out that for batch execution to work, the JDBC connection URL needs an extra parameter: rewriteBatchedStatements=true
master.jdbc.url=jdbc:mysql://112.126.84.3:3306/outreach_platform?useUnicode=true&characterEncoding=utf8&allowMultiQueries=true&rewriteBatchedStatements=true
About the rewriteBatchedStatements parameter:
Add rewriteBatchedStatements to the MySQL JDBC URL, and use driver version 5.1.13 or later, to get high-performance batch inserts.
By default, the MySQL JDBC driver ignores the intent of executeBatch(): it splits the group of SQL statements you expect to run as a batch and sends them to MySQL one by one, so a "batch insert" is really a series of single-row inserts, which directly causes poor performance.
Only when rewriteBatchedStatements is set to true does the driver actually execute the SQL in batches.
Note that this option applies to INSERT, UPDATE, and DELETE.
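For UPDATE and DELETE the driver cannot merge rows into one VALUES list; as I understand it (worth confirming against the Connector/J source), it instead concatenates the batched statements into a single semicolon-separated packet, which still reduces the batch to one round trip. A rough simulation, with made-up table and column names:

```java
import java.util.List;
import java.util.StringJoiner;

public class MultiStatementDemo {
    // Simulates the UPDATE/DELETE rewrite: the batched statements are joined
    // with ";" and sent together, instead of one network round trip each.
    static String rewriteUpdateBatch(String sqlTemplate, List<List<String>> batchParams) {
        StringJoiner packet = new StringJoiner(";");
        for (List<String> params : batchParams) {
            String stmt = sqlTemplate;
            for (String p : params) {
                stmt = stmt.replaceFirst("\\?", "'" + p + "'"); // naive inlining, demo only
            }
            packet.add(stmt);
        }
        return packet.toString();
    }

    public static void main(String[] args) {
        String sql = "update notify_record set notify_status=? where trade_no=?";
        List<List<String>> batch = List.of(List.of("SUCCESS", "100"), List.of("FAIL", "101"));
        // Prints both UPDATEs joined by ";" as one packet
        System.out.println(rewriteUpdateBatch(sql, batch));
    }
}
```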
Execution speed after adding rewriteBatchedStatements=true:
Approximate times for inserting ten thousand rows into the same table, fastest first:
JDBC batch, about 1.1 s > MyBatis batch, about 2.2 s > SQL concatenation, about 4.5 s