1. Overall Approach

     Redis is thriving these days; whether in government, enterprise, or internet systems, Redis caching has been widely adopted. I haven't written any code for more than half a year, my hands were itching, so I decided to learn Redis.

     Since I am just getting started, I'll begin with something simple:

                (1) Load 2 million rows of data into MySQL.

                (2) Cache 10,000 of those rows in Redis.

    Later I plan to build a more realistic scenario: access the data through a web front end and, with the same front-end framework, compare the time it takes to read from Redis versus MySQL.

 

2. Configuration

     (1) Spring + Hibernate + Redis configuration

      I did not use Maven because I wanted to see the dependencies between the individual jars, and it nearly killed me. Without further ado, a screenshot of all the jars:

(Screenshot: all jar files required by the project)

(Note: the jars Redis depends on include commons-pool2-2.4.2.jar, spring-data-redis-1.4.1.jar, and jedis-2.4.1.jar.)

     (2) Directory structure

(Screenshot: project directory structure)

      Note: there is no separate Service layer; the business logic is written directly in the Dao layer, and Spring manages the classes under the com.RedisTest.Dao package via dependency injection.

     (3) Database instance and table

(Screenshot: database instance and table)

       Note: create the data table redistest_effctive.

     (4) Redis property file

      The redis.properties file:

redis.hostName=127.0.0.1
redis.port=6379
redis.timeout=15000

redis.maxIdle=6
redis.minEvictableIdleTimeMillis=300000
redis.numTestsPerEvictionRun=3
redis.timeBetweenEvictionRunsMillis=60000
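
      For readers who prefer to see these pool settings in code, the same four values map one-to-one onto JedisPoolConfig setters. A minimal programmatic sketch (the class name PoolConfigSketch is my own; the post itself wires this via the XML below):

import redis.clients.jedis.JedisPoolConfig;

public class PoolConfigSketch {
    public static JedisPoolConfig build() {
        JedisPoolConfig config = new JedisPoolConfig();
        config.setMaxIdle(6);                            // redis.maxIdle
        config.setMinEvictableIdleTimeMillis(300000);    // redis.minEvictableIdleTimeMillis
        config.setNumTestsPerEvictionRun(3);             // redis.numTestsPerEvictionRun
        config.setTimeBetweenEvictionRunsMillis(60000);  // redis.timeBetweenEvictionRunsMillis
        return config;
    }
}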

      The spring-redis.xml file (imported by the main Spring configuration below):

<beans xmlns="http://www.springframework.org/schema/beans"   
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"  
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd" default-autowire="byName">  
    <bean id="jedisPoolConfig" class="redis.clients.jedis.JedisPoolConfig">  
        
        <!-- maxIdle: maximum number of idle connections kept in the pool; excess idle connections are released (a negative value means no limit) -->
        <property name="maxIdle" value="${redis.maxIdle}"></property>  
       
        <!-- minEvictableIdleTimeMillis: minimum time a connection may sit idle in the pool before it becomes eligible for eviction -->
        <property name="minEvictableIdleTimeMillis" value="${redis.minEvictableIdleTimeMillis}"></property>  
       
        <!-- numTestsPerEvictionRun: how many connections the background evictor examines on each run (default 3).
        If the value is positive, each run checks min(numTestsPerEvictionRun, number of idle connections);
        if it is negative, each run checks roughly ceil(idleCount / |value|) connections. -->
        <property name="numTestsPerEvictionRun" value="${redis.numTestsPerEvictionRun}"></property> 
        
        <!-- timeBetweenEvictionRunsMillis:
             1) the interval at which the evictor thread checks idle connections
             2) also the basis for testWhileIdle (see that property for details) -->
        <property name="timeBetweenEvictionRunsMillis" value="${redis.timeBetweenEvictionRunsMillis}"></property>
    </bean>  
    <bean id="jedisConnectionFactory" class="org.springframework.data.redis.connection.jedis.JedisConnectionFactory" destroy-method="destroy">  
        <property name="poolConfig" ref="jedisPoolConfig"></property>  
        <property name="hostName" value="${redis.hostName}"></property>  
        <property name="port" value="${redis.port}"></property>  
        <property name="timeout" value="${redis.timeout}"></property>  
       <!--  <property name="usePool" value="${redis.usePool}"></property>   -->
    </bean>  
    <bean id="jedisTemplate" class="org.springframework.data.redis.core.RedisTemplate">  
        <property name="connectionFactory" ref="jedisConnectionFactory"></property>  
        <property name="keySerializer">  
            <bean class="org.springframework.data.redis.serializer.StringRedisSerializer"/>  
        </property>  
        <property name="valueSerializer">  
            <bean class="org.springframework.data.redis.serializer.JdkSerializationRedisSerializer"/>  
        </property>  
    </bean>  
</beans>
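
      Before going further, it is worth smoke-testing the jedisTemplate bean. A minimal sketch (the class name RedisSmokeTest is my own; it assumes Redis is running locally on 127.0.0.1:6379 and that applicationContext.xml, which imports spring-redis.xml, is on the classpath):

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.data.redis.core.RedisTemplate;

public class RedisSmokeTest {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
        RedisTemplate<String, Object> template =
                (RedisTemplate<String, Object>) ctx.getBean("jedisTemplate");
        // write one value and read it back
        template.opsForValue().set("smoke:test", "hello redis");
        System.out.println(template.opsForValue().get("smoke:test"));
    }
}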

     (5) Hibernate configuration file

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
        "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
        "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory>
        <!-- Basic Hibernate properties -->
        <!-- 1. The data source is configured in the Spring IoC container -->
        <!-- 2. The .hbm.xml mapping files are wired into the SessionFactory bean in the IoC container -->
        <!-- 3. Basic Hibernate settings: dialect, SQL logging/formatting, table-generation strategy, second-level cache -->
        <property name="hibernate.dialect">org.hibernate.dialect.MySQL5Dialect</property>
        <property name="hibernate.show_sql">true</property>
        <property name="hbm2ddl.auto">update</property>
        <property name="current_session_context_class">thread</property>
    </session-factory>
</hibernate-configuration>

     (6) Spring configuration file (applicationContext.xml)

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:tx="http://www.springframework.org/schema/tx"
    xmlns:aop="http://www.springframework.org/schema/aop"
    xmlns:context="http://www.springframework.org/schema/context"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-4.1.xsd
        http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-4.1.xsd
        http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-4.1.xsd">
    
    <context:component-scan base-package="com.RedisTest"></context:component-scan>    
    <bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
        <property name="driverClassName" value="com.mysql.jdbc.Driver" />
        <property name="url" value="jdbc:mysql://localhost:3306/contractmanage" />
        <property name="username" value="root"></property>
        <property name="password" value="admin"></property>
    </bean>     
    <bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean" lazy-init="false">
        <!-- Inject the dataSource so the SessionFactory bean has a data source -->
        <property name="dataSource" ref="dataSource" />
        <property name="configLocation" value="classpath:hibernate.cfg.xml"></property>
        <!-- Location and name pattern of the entity mapping (.hbm.xml) files -->
        <property name="mappingLocations" value="classpath:com/RedisTest/Mapper/*.hbm.xml"></property>
    </bean>      
    <!-- Load redis.properties so the Redis settings can be injected via ${...} placeholders -->
    <bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
        <property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE" />
        <property name="ignoreResourceNotFound" value="true" />
        <property name="locations" value="classpath:redis.properties"/>
    </bean>    
    <!-- Configure Spring declarative transactions -->
    <bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager">
        <property name="sessionFactory" ref="sessionFactory"></property>
    </bean> 
      <!-- Configure transaction attributes -->
     <tx:advice id="txAdvice" transaction-manager="transactionManager">
        <tx:attributes>
            <tx:method name="get*" read-only="true"/>
            <tx:method name="*"/>
        </tx:attributes>
    </tx:advice>
    <!-- Configure the transaction pointcut and tie it to the transaction attributes -->
    <aop:config>

        <!-- Pointcut definition -->
        <aop:pointcut expression="execution(* com.RedisTest.Dao.*.*(..))" id="txPointcut"/>
        <aop:advisor advice-ref="txAdvice" pointcut-ref="txPointcut"/>
    </aop:config>
    <import resource="spring-redis.xml" />
</beans>

3. Code Walkthrough

(1) The com.RedisTest.Application package:

package com.RedisTest.Application;

import java.util.ArrayList;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.data.redis.core.BoundSetOperations;
import org.springframework.data.redis.core.HashOperations;
import org.springframework.data.redis.core.ListOperations;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.core.ValueOperations;
import org.springframework.stereotype.Repository;

@Repository
public class RedisCacheUtil<T> {

    /**
     * @Autowired injects by type; if the Spring context contains more than one bean of the
     * same type, a BeanCreationException is thrown. Using @Qualifier together with @Autowired resolves this.
     */
    @Autowired @Qualifier("jedisTemplate")
    public RedisTemplate redisTemplate;
    
    
    /**
     * Cache a simple object: Integer, String, entity class, etc.
     * @param key   cache key
     * @param value cache value
     * @return the ValueOperations handle used for the write
     */
    public <T> ValueOperations<String,T> setCacheObject(String key,T value){
        ValueOperations<String,T> operation = redisTemplate.opsForValue();
        operation.set(key, value);
        return operation;
    }
    
    /**
     * Get a cached simple object
     * @param key cache key
     * @return
     */
    public <T> T getCacheObject(String key){
        ValueOperations<String, T> operation = redisTemplate.opsForValue(); 
        return operation.get(key);
    }
    
    /**
     * Cache a List
     * @param key      cache key
     * @param dataList the List to cache
     * @return the ListOperations handle used for the write
     */
    public <T> ListOperations<String,T> setCacheList(String key,List<T> dataList){
        ListOperations listOperation = redisTemplate.opsForList();
        if(null != dataList){
            int size = dataList.size();
            for (int i = 0; i < size; i++) {
                listOperation.rightPush(key, dataList.get(i));
            }
        }
        return listOperation;
    }
    
    /**
     * Get a cached List (note: leftPop removes the elements from Redis as it reads them)
     * @param key cache key
     * @return
     */
    public <T> List<T> getCacheList(String key){
        List<T> dataList = new ArrayList<T>();
        ListOperations<String,T> listOperation = redisTemplate.opsForList();
        long size = listOperation.size(key);
        for (int i = 0; i < size; i++) {
            dataList.add((T) listOperation.leftPop(key));
        }
        return dataList;
    }
    
    /**
     * Cache a Set
     * @param key     cache key
     * @param dataSet the Set to cache
     * @return the BoundSetOperations handle used for the write
     */
    public <T> BoundSetOperations<String,T> setCacheSet(String key,Set<T> dataSet){
        BoundSetOperations<String,T> setOperation = redisTemplate.boundSetOps(key);
        Iterator<T> it = dataSet.iterator();
        while(it.hasNext())
        {
            setOperation.add(it.next());
        }
        return setOperation;
    }
    
    /**
     * Get a cached Set (note: pop removes the elements from Redis as it reads them)
     * @param key 
     * @return
     */
    public Set<T> getCacheSet(String key){
        Set<T> dataSet = new HashSet<T>();
        BoundSetOperations<String, T> operation = redisTemplate.boundSetOps(key);
        Long size = operation.size();
        for (int i = 0; i < size; i++) {
            dataSet.add(operation.pop());
        }
        return dataSet;
    }
    
    /**
     * Cache a Map as a Redis hash
     * @param key     cache key
     * @param dataMap the Map to cache
     * @return the HashOperations handle used for the write
     */
    public <T> HashOperations<String,String,T> setCacheMap(String key,Map<String,T> dataMap)
    {
        HashOperations hashOperations = redisTemplate.opsForHash();
        if(null != dataMap){
            for (Map.Entry<String, T> entry :dataMap.entrySet()) {
                hashOperations.put(key, entry.getKey(), entry.getValue());
            }
        }
        return hashOperations;
    }

    
    /**
     * Get a cached Map
     * @param key 
     * @return
     */
    public <T> Map<String,T> getCacheMap(String key){
        Map<String,T> map = redisTemplate.opsForHash().entries(key);
        return map;
    }
    
    /**
     * Cache a Map with Integer keys as a Redis hash
     * @param key 
     * @param dataMap
     * @return
     */
    public <T> HashOperations<String,Integer,T> setCacheIntegerMap(String key,Map<Integer,T> dataMap){
        HashOperations hashOperation = redisTemplate.opsForHash();
        if(null != dataMap){
            for(Map.Entry<Integer, T> entry : dataMap.entrySet()){
                hashOperation.put(key, entry.getKey(), entry.getValue());
            }
        }
        return hashOperation;
    }
    
    /**
     * Get a cached Map with Integer keys
     * @param key cache key
     * @return
     */
    public <T> Map<Integer,T> getCacheIntegerMap(String key){
        Map<Integer,T> map = redisTemplate.opsForHash().entries(key);
        return map;
    }
    
    
}
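
A quick usage sketch of RedisCacheUtil (the caller class RedisCacheUtilDemo is hypothetical and not part of the original project; it assumes applicationContext.xml is on the classpath and a local Redis instance is running):

package com.RedisTest.Application;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

import com.RedisTest.Model.RedisTest;

public class RedisCacheUtilDemo {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
        RedisCacheUtil<RedisTest> cacheUtil = ctx.getBean(RedisCacheUtil.class);

        // cache one entity under a plain string key
        RedisTest demo = new RedisTest();
        demo.setId(1);
        demo.setCode("C-0001");
        demo.setName("Ozil-demo");
        cacheUtil.setCacheObject("redisTest:demo", demo);

        // read it back
        RedisTest cached = cacheUtil.getCacheObject("redisTest:demo");
        System.out.println("cached name = " + cached.getName());
    }
}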

(2) The com.RedisTest.Dao package:

Interfaces IImportData.java and IRedisCacheData.java:

package com.RedisTest.Dao;

public interface IImportData {

    /**
     * Batch-insert the test records
     */
    public void saveList();
}
package com.RedisTest.Dao;

import java.util.List;

import com.RedisTest.Model.RedisTest;

public interface IRedisCacheData {
    
    /**
     * Cache the list into Redis
     */
    public void cacheList();
    
    /**
     * Read the data table from MySQL
     * @return
     */
    public List<RedisTest> phyList();
    
    /**
     * Read the cached list from Redis
     */
    public void readCacheList();
    
}

Implementation classes ImportData.java and RedisCacheData.java:

ImportData.java is mainly responsible for inserting data into the table:

package com.RedisTest.Dao;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import com.RedisTest.Application.InitCode;
import com.RedisTest.Model.RedisTest;

@Repository
public class ImportData implements IImportData{

    private static int i =0;
    
    @Autowired
    private SessionFactory sessionFactory;
    
    private Session getSession(){
        return sessionFactory.openSession();
    }
    
    public ImportData(){
        ++i;
        System.out.println("-----------我被实例化了第 "+i+" 次啦!!!-----------");
    }
    
    /**
     * Dummy method; does nothing but print a message
     * @param fake unused
     */
    public void saveList(String fake){
        System.out.println("----我是假的方法-----");
    }
    
    /**
     * Batch-insert 2,000,000 rows
     */
    public void saveList() {
        Session session = null;
        try {
            session = getSession();
            session.beginTransaction();
            for(int i=0;i<2000000;i++){
                RedisTest test = new RedisTest();
                InitCode code = new InitCode(i);
                test.setCode(code.getLastedCode());
                test.setName("Ozil"+i);
                session.save(test);
                //flush and clear the session periodically so the batched objects are written to the database and memory is freed
                if(i%100 == 0){
                    session.flush();
                    session.clear();
                }
            }
            session.getTransaction().commit(); // commit the transaction
        } 
        catch (Exception e) {
            e.printStackTrace();
            if (session != null) {
                session.getTransaction().rollback();
            }
        } finally {
            if (session != null) {
                session.close();
            }
        }
    }
}
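
The InitCode class used by saveList() is not shown in the post. Purely as an illustration of what it might look like (the real implementation may differ), a hypothetical version that turns the loop index into a fixed-width code string:

package com.RedisTest.Application;

//Hypothetical sketch of the InitCode helper referenced in ImportData.saveList();
//the original implementation is not shown in the post, so this is only a guess.
public class InitCode {

    private final int seed;

    public InitCode(int seed) {
        this.seed = seed;
    }

    //Returns a zero-padded code such as "CODE-0000042" derived from the seed
    public String getLastedCode() {
        return String.format("CODE-%07d", seed);
    }
}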

RedisCacheData.java

package com.RedisTest.Dao;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.hibernate.Query;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

import com.RedisTest.Application.RedisCacheUtil;
import com.RedisTest.Model.RedisTest;

@Repository
public class RedisCacheData implements IRedisCacheData{

    @Autowired
    private SessionFactory sessionFactory;
    
    @Autowired
    private RedisCacheUtil<RedisTest> redisCache;
    
    private Session getSession(){
        return sessionFactory.openSession();
    }
    
    /**
     * Default constructor; only logs that the bean has been loaded by Spring
     */
    public RedisCacheData(){
        System.out.println("----------------RedisCacheData被spring 装载啦!!!!!-------------");
    }
    
    /**
     * Read the cached list from Redis and print it
     */
    @Override
    public void readCacheList() {
        if(null != redisCache){
            Map<Integer,RedisTest> readRedisMap = redisCache.getCacheIntegerMap("redisTestMap");
            System.out.println(readRedisMap.size());
            for(int key:readRedisMap.keySet()){
                System.out.println("key="+Integer.toString(key)+",value="+readRedisMap.get(key).getName());
            }
        }
    }
    
    /**
     * Read the data from MySQL and cache it in Redis as a hash
     */
    @Override
    public void cacheList() {
        try {
            List<RedisTest> list = this.phyList();
            Map<Integer,RedisTest> redisTestMap = new HashMap<Integer,RedisTest>();
            int listcount = list.size();
            for (int i = 0; i < listcount; i++) {
                redisTestMap.put(list.get(i).getId(), list.get(i));
            }
            redisCache.setCacheIntegerMap("redisTestMap", redisTestMap);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    
    /**
     * Read data from MySQL
     */
    @Override
    public List<RedisTest> phyList() {
        Session session = null;
        List<RedisTest> list = null;
        try {
            session = getSession();
            String hql = "from RedisTest where Id < 10000";
            Query query = session.createQuery(hql);
            list = query.list();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // read-only query: no transaction was begun, so there is nothing to roll back;
            // just make sure the session is closed
            if (session != null) {
                session.close();
            }
        }
        return list;
    }
}
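
One aside on phyList(): filtering with "Id < 10000" only returns 10,000 rows if the IDs are contiguous and start at 1. If that assumption does not hold (for example after deletes), the same "first 10,000 rows" can be fetched with setMaxResults. A variant of the method (my own sketch, not in the original code; it would live in RedisCacheData alongside phyList):

    //Alternative to the "Id < 10000" filter: let Hibernate limit the result size directly
    public List<RedisTest> phyListLimited() {
        Session session = null;
        List<RedisTest> list = null;
        try {
            session = getSession();
            Query query = session.createQuery("from RedisTest order by id");
            query.setMaxResults(10000);   // fetch at most 10,000 rows
            list = query.list();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (session != null) {
                session.close();
            }
        }
        return list;
    }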

(3) The com.RedisTest.Mapper package:

The RedisTestEntity.hbm.xml file:

<?xml version="1.0"?>
<!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
"http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
<hibernate-mapping>
    <class name="com.RedisTest.Model.RedisTest" table="RedisTest_Effctive">
        <id name="id" type="java.lang.Integer">
            <column name="ID" />
            <generator class="native" />
        </id>
        <property name="Code" type="java.lang.String">
            <column name="Code" />
        </property>
        <property name="Name" type="java.lang.String">
            <column name="Name" />
        </property>
    </class>
</hibernate-mapping>

(4) The com.RedisTest.Model package:

package com.RedisTest.Model;

import java.io.Serializable;

//The entity class must implement Serializable (JdkSerializationRedisSerializer is used to store values in Redis)
public class RedisTest implements Serializable{

    /**
     * 
     */
    private static final long serialVersionUID = 1L;
    
    public int getId() {
        return id;
    }
    public void setId(int id) {
        this.id = id;
    }
    public String getName() {
        return Name;
    }
    public void setName(String name) {
        Name = name;
    }
    public String getCode() {
        return Code;
    }
    public void setCode(String code) {
        Code = code;
    }
    private int id;
    private String Name;
    private String Code;
}

(5) The com.RedisTest.Test package:

package com.RedisTest.Test;

import javax.sql.DataSource;
import org.junit.Test;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

import com.RedisTest.Dao.ImportData;
import com.RedisTest.Dao.RedisCacheData;

public class RunTest {

    public RunTest(){
        System.out.println("--------------RunTest测试类,我已经被实例化了!------------");
    }
    private ApplicationContext context  = null;
    private ImportData importData = null;
    private RedisCacheData redisCacheData = null;
    
    {
        context = new ClassPathXmlApplicationContext("applicationContext.xml");
        importData = (ImportData)context.getBean(ImportData.class);
        redisCacheData = (RedisCacheData)context.getBean(RedisCacheData.class);
    }
    
    @Test
    public void Test(){
        DataSource dataSource = (DataSource)context.getBean(DataSource.class);
        System.out.println(dataSource);
    }
    
    /**
     * Batch-insert test. Note: saveList("fake") is the dummy overload and only prints a
     * message; call importData.saveList() (no argument) to actually insert the 2,000,000 rows.
     */
    @Test
    public void Test2(){
        importData.saveList("fake");
    }
    
    
    /**
     * Redis test
     */
    @Test
    public void TestRedisCache(){
        //Read the physical table into a Map, then store the map in Redis
        redisCacheData.cacheList();
        //Read back from Redis and print
        redisCacheData.readCacheList();
    }
}
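
Ahead of the web-based comparison mentioned in section 5, a rough timing test could be added to RunTest along these lines (my own sketch, not part of the original code; readCacheList() also prints every entry, so the numbers are only indicative):

    /**
     * Rough timing comparison (sketch): read the same ~10,000 rows once from MySQL
     * and once from the Redis cache, and print the elapsed time of each call.
     */
    @Test
    public void TestCompareReadTime(){
        long t1 = System.currentTimeMillis();
        redisCacheData.phyList();          // read from MySQL via Hibernate
        long t2 = System.currentTimeMillis();
        redisCacheData.readCacheList();    // read from the Redis hash
        long t3 = System.currentTimeMillis();
        System.out.println("MySQL read: " + (t2 - t1) + " ms, Redis read: " + (t3 - t2) + " ms");
    }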

4. Results

(1) Redis Desktop Manager is a very handy Redis client.

(Screenshot: the cached data viewed in Redis Desktop Manager)

  Only 10,000 records were stored in Redis; I was worried the machine could not handle reading all 2 million rows from the database and writing them into Redis. Because the values are stored in JDK-serialized form, they show up as unreadable garbled bytes in the client.

(2) Reading the data back from Redis, you can see that the entries come back unordered, as expected for a Map/hash structure.

(Screenshot: the entries read back from Redis, in no particular order)

5. Afterword

Next, I plan to use the same front-end list framework to read and page data from both the MySQL database and the Redis cache, and observe the effect of the cache.