The previous post walked through setting up the ServiceMix environment. This one covers how to develop a program for the ServiceMix platform and deploy it into the Karaf container, how to send and receive messages with the ActiveMQ broker built into Karaf, and how to use Kafka, a widely adopted distributed, high-throughput messaging system, from inside the Karaf container.

First, a look at where the program goes once it is deployed: the jar built by Maven is placed in the deploy directory shown in the screenshot. If configuration files are needed, they can be placed in the etc directory, which corresponds to the etc directory in your Maven project.
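
If the bundle does need a setting from etc, one simple option is to read the file straight from that directory. The sketch below is an assumption rather than part of the original project: the file name com.danejiang.bundletest.cfg is a placeholder, and it relies on Karaf's karaf.etc system property, which should be verified against the Karaf version bundled with your ServiceMix.

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.Properties;

public final class EtcConfig {
    // Load a plain properties file from Karaf's etc directory.
    public static Properties load() throws Exception {
        // Karaf publishes the path of its etc directory via the "karaf.etc" system property.
        File file = new File(System.getProperty("karaf.etc"), "com.danejiang.bundletest.cfg");
        Properties cfg = new Properties();
        try (InputStream in = new FileInputStream(file)) {
            cfg.load(in);
        }
        return cfg;
    }
}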


1. Create a Maven project in IntelliJ and edit the pom: set the project coordinates, configure the Apache Felix maven-bundle-plugin, and add the Kafka dependency. The key parts are shown below, followed by a sketch of the plugin section; see the source files linked at the end for the complete pom:

<groupId>com.danejiang</groupId>
<artifactId>BundleTest</artifactId>
<version>1.0.0</version>
<packaging>bundle</packaging>
<name>DaneJiang BundleTest</name>
<description>DaneJiang BundleTest</description>

<dependencies>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>javax.jmdns</groupId>
        <artifactId>jmdns</artifactId>
        <version>3.4.1</version>
    </dependency>
</dependencies>
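
The pom also needs the maven-bundle-plugin (the Apache Felix plugin mentioned above) so that the bundle packaging type works and the manifest points at the Activator from step 5. The sketch below is a minimal version: the plugin version, the com.danejiang.Activator package name and the Import-Package directive are assumptions, so check them against the source files linked at the end.

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.felix</groupId>
            <artifactId>maven-bundle-plugin</artifactId>
            <version>4.1.0</version>
            <extensions>true</extensions>
            <configuration>
                <instructions>
                    <Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
                    <!-- Adjust to the actual package of the Activator class from step 5 -->
                    <Bundle-Activator>com.danejiang.Activator</Bundle-Activator>
                    <!-- Optional imports avoid unresolved-package errors for transitive
                         Kafka dependencies that are not needed at runtime -->
                    <Import-Package>*;resolution:=optional</Import-Package>
                </instructions>
            </configuration>
        </plugin>
    </plugins>
</build>

Step 6 below drops kafka-clients and its helper jars into the deploy directory by hand; an <Embed-Dependency> instruction in this plugin would be an alternative way to package them inside the bundle.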

2. Create activemq.java, which implements sending messages from an ActiveMQ producer, receiving and processing messages in a consumer, and stopping the consumer. The producer code is as follows:

    public static boolean send(String topicType, String topicName, String topicMessage) {
        try {
            // Create the connection factory
            ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory("tcp://0.0.0.0:61616");

            // Create and start the JMS connection
            Connection connection = factory.createConnection("smx", "smx");
            connection.start();

            // Create a non-transacted session with automatic acknowledgement
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            // Create the destination and producer according to the destination type
            MessageProducer producer = null;
            if (topicType.toLowerCase().equals("queue")) {
                // Create the queue
                Queue queue = session.createQueue(topicName);

                // Create the producer
                producer = session.createProducer(queue);
            } else if (topicType.toLowerCase().equals("topic")) {
                // Create the topic
                Topic topic = session.createTopic(topicName);

                // Create the producer
                producer = session.createProducer(topic);
            } else {
                logger.info("Send MQ Message error: topic type not set.");
                return false;
            }

            // Messages are persistent by default; uncomment the next line for non-persistent delivery
            //producer.setDeliveryMode(DeliveryMode.NON_PERSISTENT);
            producer.setDeliveryMode(DeliveryMode.PERSISTENT);

            // Create a text message (other message types can be created the same way)
            TextMessage message = session.createTextMessage(topicMessage);

            // Send the message: non-persistent messages are sent asynchronously by default, persistent ones synchronously
            producer.send(message);

            // Close the producer, session and connection
            producer.close();
            session.close();
            connection.close();

            logger.info("Send MQ Message:" + topicName);
            return true;
        } catch (Exception e) {
            logger.info("Send MQ Message error:" + e.toString());
            return false;
        }
    }

The ActiveMQ consumer code is as follows:

    private static Connection rConnection = null;
    private static Session rSession = null;
    private static MessageConsumer rMessageConsumer = null;
   
    public static boolean receive(String topicType, String topicName) {
        try {
            // Create the connection factory
            ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory("tcp://0.0.0.0:61616");

            // Create and start the JMS connection
            rConnection = connectionFactory.createConnection("smx", "smx");
            rConnection.start();

            // Create a non-transacted session with automatic acknowledgement
            rSession = rConnection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            // Create the destination and consumer according to the destination type
            if (topicType.toLowerCase().equals("queue")) {
                // Create the queue
                Queue queue = rSession.createQueue(topicName);

                // Create the consumer
                rMessageConsumer = rSession.createConsumer(queue);
            } else if (topicType.toLowerCase().equals("topic")) {
                // Create the topic
                Topic topic = rSession.createTopic(topicName);

                // Create the consumer
                rMessageConsumer = rSession.createConsumer(topic);
            } else {
                logger.info("Start MQ Message error: topic type not set.");
                return false;
            }

            // Consume asynchronously via a message listener
            rMessageConsumer.setMessageListener(new MessageListener() {
                @Override
                public void onMessage(Message message) {
                    TextMessage mess = (TextMessage) message;
                    try {
                        // Process the received message
                        logger.info("Receive MQ Message:" + topicName+",Result:"+ mainService.doMQ(topicName, mess.getText()));
                    } catch (JMSException e) {
                        logger.info("Receive MQ Message error:" + e.toString());
                    }
                }
            });

            logger.info("Started receive MQ Message:" + topicName);
            return true;
        } catch (Exception e) {
            logger.info("Start receive MQ Message error:" + e.toString());
            return false;
        }
    }

The following method stops the consumer's asynchronous message listener:

    public static void stop() {
        try {
            // Close the consumer, session and connection
            if (rMessageConsumer != null) rMessageConsumer.close();
            if (rSession != null) rSession.close();
            if (rConnection != null) rConnection.close();

            logger.info("Stopped MQ Message.");
        } catch (Exception e) {
            logger.info("Stop MQ Message error:" + e.toString());
        }
    }
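
For a quick end-to-end check, the three methods can be exercised together; the snippet below is illustrative and not part of the original bundle. Given the handler in step 4, sending "Hello" to the DaneJiang topic should produce a log line ending in "Result:Hello World!".

// Hypothetical smoke test: start the listener, publish one text message, then stop.
// The listener registered by receive() hands the message to mainService.doMQ(),
// which turns "Hello" into "Hello World!".
activemq.receive("Topic", "DaneJiang");
activemq.send("Topic", "DaneJiang", "Hello");
activemq.stop();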

3. Create kafka.java, which implements sending messages from a Kafka producer, receiving and processing messages in a consumer, and stopping the consumer. The producer code is as follows:

    public static boolean send(String topicName, String topicMessage) {
        try {
            // Clear the thread context classloader so the Kafka client loads its
            // serializers with its own classloader (a common workaround under OSGi)
            Thread.currentThread().setContextClassLoader(null);
            Properties props = new Properties();
            props.put("bootstrap.servers", "hadoop01:9092");
            props.put("acks", "all");
            props.put("retries", 0);
            props.put("batch.size", 16384);
            props.put("linger.ms", 1);
            props.put("buffer.memory", 33554432);
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            Producer<String, String> producer = null;

            try {
                producer = new KafkaProducer<>(props);
                producer.send(new ProducerRecord<String, String>(topicName, topicMessage));
            } catch (Exception e) {
                logger.info("Send Kafka Message error:" + e.toString());
                return false;
            } finally {
                // Guard against a failed constructor before closing
                if (producer != null) producer.close();
            }

            logger.info("Send Kafka Message:" + topicName);

            return true;
        } catch (Exception ex) {
            logger.info("Send Kafka Message error:" + ex.toString());
            return false;
        }
    }

The Kafka consumer code is as follows:

    private static KafkaConsumer<String, String> kafkaConsumer = null;
    public static boolean receive() {
        try {
            // Clear the thread context classloader so the Kafka client loads its
            // deserializers with its own classloader (a common workaround under OSGi)
            Thread.currentThread().setContextClassLoader(null);
            Properties props = new Properties();
            props.put("bootstrap.servers", "hadoop01:9092");
            props.put("group.id", "Group-1");
            props.put("enable.auto.commit", "true");
            props.put("auto.commit.interval.ms", "1000");
            props.put("auto.offset.reset", "earliest");
            props.put("session.timeout.ms", "30000");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            kafkaConsumer = new KafkaConsumer<>(props);
            // Subscribe to the "test" topic; adjust the topic list as needed
            kafkaConsumer.subscribe(Collections.singletonList("test"));

            SimpleDateFormat df = new SimpleDateFormat("yyyyMMddHHmmssSSS");
            new Thread(df.format(new Date())) {
                public void run() {
                    while (true) {
                        ConsumerRecords<String, String> records = kafkaConsumer.poll(100);
                        for (ConsumerRecord<String, String> record : records) {
                            logger.info("Receive Kafka Message:" + record.topic()+",Result:"+ mainService.doKafka(record.topic(),record.value()));
                        }
                    }
                }
            }.start();

            return true;
        } catch (Exception e) {
            logger.info("Start receive Kafka Message error:" + e.toString());
            return false;
        }
    }

The following method stops the Kafka consumer:

    public static void stop() {
        try {
            // Close the Kafka consumer
            if (kafkaConsumer != null) kafkaConsumer.close();

            logger.info("Stopped Kafka Message.");
        } catch (Exception e) {
            logger.info("Stop Kafka Message error:" + e.toString());
        }
    }
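
One caveat: KafkaConsumer is not thread-safe, so closing it here while the polling thread created in receive() is still inside poll() can fail with a ConcurrentModificationException. A common alternative, sketched below as an assumption rather than the original design, is to call wakeup() from the stopping thread and let the polling thread do the actual close:

// Sketch of a wakeup-based shutdown (an assumption, not the original code).
public static void stop() {
    // wakeup() may be called from any thread; it makes the blocked poll()
    // in the consumer thread throw a WakeupException.
    if (kafkaConsumer != null) kafkaConsumer.wakeup();
}

// ...and the polling loop in receive() would then be wrapped like this:
// try {
//     while (true) {
//         ConsumerRecords<String, String> records = kafkaConsumer.poll(100);
//         // ... handle records ...
//     }
// } catch (org.apache.kafka.common.errors.WakeupException e) {
//     // expected during shutdown
// } finally {
//     kafkaConsumer.close();
// }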

4. Create the main service, which simulates starting and stopping the messaging components and handles the messages they deliver. The start function launches the consumer listeners and the stop function shuts them down; these two functions will later be bound to the bundle's start and stop events in Karaf:

    public static void start() {
        try {
            // Start the ActiveMQ consumer
            activemq.receive("Topic", "DaneJiang");

            // Start the Kafka consumer
            kafka.receive();

            logger.info("MainService start success.");
        } catch (Exception ex) {
            logger.info("MainService start error:" + ex.toString());
        }
    }

    public static void stop() {
        try {
            // Stop the ActiveMQ consumer
            activemq.stop();

            // Stop the Kafka consumer
            kafka.stop();
        } catch (Exception ex) {
            logger.info("MainService stop error:" + ex.toString());
        }
    }

Below are the handlers invoked when a message is received; adjust the processing logic to your own needs:

    public static String doMQ(String topicName, String topicMessage) {
        try {
            String result = "";
            switch (topicName.toUpperCase()) {
                case "DANEJIANG":
                    result = topicMessage + " World!";
                    break;
                default:
                    result = "Receive Error Type:Type=" + topicName + ",Message=" + topicMessage;
                    break;
            }

            return result;
        } catch (Exception ex) {
            logger.info("doMQ error:" + ex.toString());
            return ex.toString();
        }
    }

    public static String doKafka(String topicName, String topicMessage) {
        try {
            String result = "";
            switch (topicName.toUpperCase()) {
                case "DANEJIANG":
                    result = topicMessage + " World!";
                    break;
                default:
                    result = "Receive Error Type:Type=" + topicName + ",Message=" + topicMessage;
                    break;
            }

            return result;
        } catch (Exception ex) {
            logger.info("doKafka error:" + ex.toString());
            return ex.toString();
        }
    }

5. Finally, create Activator.java, which triggers the corresponding ActiveMQ and Kafka operations when the bundle is started or stopped:

import com.danejiang.service.mainService;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {
    @Override
    public void start(BundleContext arg0) throws Exception {
        mainService.start();
        System.out.println("start bundle!");
    }

    @Override
    public void stop(BundleContext arg0) throws Exception {
        mainService.stop();
        System.out.println("stop bundle!");
    }
}

6. Once the code is done, package it into a jar with Maven and upload it, together with jmdns-3.4.1.jar, kafka-clients-2.1.0.jar and lz4-java-1.5.0.jar from the project's lib directory, to the deploy directory under ServiceMix:


Use bin/client to enter the Karaf console. Running bundle:list shows that the bundle and its dependencies are active, and log:display shows how the component started up:

7. The source code for this test is available at the link below. My knowledge is limited, so corrections are welcome. Thanks! github.com/danejiang/S…