3.3. UDF Development Example

3.3.1. Step 1: Create a Maven Project

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-exec -->
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>3.1.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.1.1</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <encoding>UTF-8</encoding>
                </configuration>
            </plugin>
        </plugins>
    </build>
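
The jar in Step 4 is named original-day_05_hive_udf-1.0-SNAPSHOT.jar; the original- prefix is what the maven-shade-plugin leaves behind for the unshaded jar, so the project presumably also declares that plugin even though it is omitted above. A minimal sketch of such an entry, assuming the plugin's default configuration, would be:

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.1</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>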

3.3.2. Step 2: Write a Java Class That Extends UDF

package cn.itcast.udf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MyUDF extends UDF {
    // Returns the input string with its first letter capitalized;
    // a NULL or empty input yields an empty string.
    public Text evaluate(final Text str) {
        if (str == null || str.toString().equals("")) {
            return new Text("");
        }
        String tmp_str = str.toString();
        String str_ret = tmp_str.substring(0, 1).toUpperCase() + tmp_str.substring(1);
        return new Text(str_ret);
    }
}
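
Hive locates and invokes the evaluate method by reflection, passing each input value in as a Text object; with this logic, abc becomes Abc, while a NULL or empty input returns an empty string. The package declaration cn.itcast.udf matches the fully qualified class name used when registering the function in Step 5.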

3.3.3. Step 3: Package the Project and Upload the Jar to Hive's lib Directory

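A typical way to build and ship the jar, assuming it is built on a development machine and copied to the Hive host over SSH (the hostname node03 below is only an example), is:

mvn clean package
scp target/original-day_05_hive_udf-1.0-SNAPSHOT.jar node03:/export/servers/apache-hive-3.1.1-bin/lib/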

3.3.4. Step 4: Add the Jar

First, rename our jar:

cd /export/servers/apache-hive-3.1.1-bin/lib 
mv original-day_05_hive_udf-1.0-SNAPSHOT.jar myudf.jar

Then add the jar in the Hive client:

add jar /export/servers/apache-hive-3.1.1-bin/lib/myudf.jar;
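
The add jar statement only affects the current Hive session. To confirm the jar is on the session classpath, it can be listed from the same session:

list jars;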

3.3.5. Step 5: Associate a Function Name with Our Custom Class

create temporary function my_upper as 'cn.itcast.udf.MyUDF';

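A temporary function is likewise visible only in the session that created it. If the function should survive across sessions, Hive can also create a permanent function from a jar stored in HDFS, roughly as follows (the HDFS path here is only illustrative):

create function my_upper as 'cn.itcast.udf.MyUDF' using jar 'hdfs:///user/hive/jars/myudf.jar';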

3.3.6. Step 6: Use the Custom Function

select my_upper('abc');
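
Given the evaluate logic above, this query should return Abc. The function can be applied to table columns in the same way, for example (the student table and its name column here are hypothetical):

select my_upper(name) from student;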