Spark standalone (single-machine) deployment issues

1. The port cannot be bound

$SPARK_HOME/bin/run-example SparkPi 10

15/02/27 16:14:36 INFO Remoting: Starting remoting

15/02/27 16:14:36 ERROR NettyTransport: failed to bind to bt-199-037.bta.net.cn/202.106.199.37:0, shutting down Netty transport

15/02/27 16:14:36 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.

15/02/27 16:14:36 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.

15/02/27 16:14:36 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.

15/02/27 16:14:36 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.

15/02/27 16:14:36 INFO Slf4jLogger: Slf4jLogger started

15/02/27 16:14:36 INFO Remoting: Starting remoting

15/02/27 16:14:36 ERROR NettyTransport: failed to bind to bt-199-037.bta.net.cn/202.106.199.37:0, shutting down Netty transport

Exception in thread "main" java.net.BindException: Failed to bind to: bt-199-037.bta.net.cn/202.106.199.37:0: Service 'sparkDriver' failed after 16 retries!

        at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)

        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)



Fix

Judging from the log, Akka cannot bind to the host's IP or port: the machine's hostname (bt-199-037.bta.net.cn) resolves to a public address (202.106.199.37) that is not configured on any local network interface, so the driver has nothing to bind to. A fix was found at http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/<9A13072E9AA64A9B846FACA846FCA7C8@gmail.com>.
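A quick way to confirm this (a sketch assuming standard Linux tools; the hostname is the one from the log above) is to compare what the hostname resolves to with the addresses actually configured on the machine:

hostname -f                            # prints the FQDN, e.g. bt-199-037.bta.net.cn
getent hosts "$(hostname -f)"          # shows the IP that name resolves to
ip addr show | grep 'inet '            # lists the IPs the machine can actually bind

If the resolved IP does not appear in the interface list, the bind failure above is expected.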


Add the following to $SPARK_HOME/conf/spark-env.sh:

export SPARK_MASTER_IP=127.0.0.1

export SPARK_LOCAL_IP=127.0.0.1
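After saving the file, restart any running master/worker processes and re-run the example; the driver should now bind to the loopback address:

$SPARK_HOME/bin/run-example SparkPi 10

Note that with SPARK_MASTER_IP=127.0.0.1 the master is reachable only from this machine, which is fine for a single-node setup.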