Remote environment: CentOS 7.6

Remote Python: 3.6.8

Local environment: Windows 10

Local Python: 3.6.8

PyCharm version: 2019.1


I. Remote Debugging with PyCharm

1. Open PyCharm and go to Tools -> Deployment -> Configuration, as shown below:

(screenshot)


2. In the new window, click the + button in the upper-left corner to add a remote server, as shown below:

(screenshots)


3. Configure the connection details for the remote server, as shown below:

3.1 Set the server connection information:

(screenshots)


There is a Test Connection button here to check whether the server can be reached. (I was honestly a bit confused at this point: the first attempt failed, and after clicking around and then restoring the original connection information, it connected again. The failure was probably caused by a VPN I had used earlier redirecting the connection; after closing the VPN and restarting the program it worked. I had also checked Save password at first, and it seemed to connect only after I turned that off and clicked Test Connection again.)

(screenshot)


Authentication supports three types:

  • Password: password authentication, the simplest option (recommended)
  • Key pair (OpenSSH or PuTTY): private-key authentication
  • OpenSSH config and authentication agent
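
If Test Connection keeps failing and you want to rule out PyCharm itself, one independent way to sanity-check SSH reachability from the local machine is a small Python script. This is only a rough sketch, assuming the paramiko package is installed locally; the password value is a placeholder you must replace with your own:

import paramiko

# Connection details for the remote CentOS server; the password here is a placeholder
HOST, PORT, USER, PASSWORD = "192.168.61.128", 22, "root", "your_password"

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # accept the host key on first connect
try:
    client.connect(HOST, port=PORT, username=USER, password=PASSWORD, timeout=10)
    # Run a trivial command to confirm that password authentication works
    stdin, stdout, stderr = client.exec_command("hostname && which python")
    print(stdout.read().decode())
finally:
    client.close()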

3.2 Set the local and remote server directories

(screenshot)


Local path: the local project directory, e.g. d:\pywork

Deployment path on server: the remote directory on the Linux server. Note that this is a path relative to the root path configured earlier! Since the root path was already set to /data/python, specifying /class18 here means that once the configuration takes effect you will only see the code files under the class18 directory. Set it to whatever suits your project.

Web path on server: the web path does not need to be set for now; keep the default.

4. After completing the steps above, you can browse the remote code in the panel on the right side of the PyCharm window, as shown below:

(screenshot)


Important: check Automatic Upload so that local files are automatically synced to the remote server.

II. Configuring the Python Interpreter

Since the local and remote Python versions are identical here, this part of the setup is slightly simpler than in the original article I followed.

(screenshots)


Select the location of the Python interpreter on the remote server, and set the Sync folders (the remote sync directories on the server; more than one can be selected). If you do not know where Python is installed, connect to the server over SSH and run the command which python to find the installation path. A quick way to double-check the interpreter is shown after the screenshots below.

(screenshots)
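
Once the remote interpreter is selected, a simple sanity check is to run a throwaway script through it and confirm that the code really executes on the CentOS host. A minimal sketch (the file name check_env.py is just an example):

# check_env.py - run with the newly configured remote interpreter
import platform
import sys

print("Interpreter:", sys.executable)   # should point at the remote python3.6
print("Version:", sys.version)          # should report 3.6.8
print("OS:", platform.platform())       # should report Linux, not Windows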


Next, configure the environment variables for the remote run configuration in PyCharm:

(screenshots)


PYTHONPATH=/usr/local/spark/spark-2.4.3-bin-hadoop2.7/python:/usr/local/spark/spark-2.4.3-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH

This PYTHONPATH value is adapted from the environment variables I had already set on CentOS:

(screenshot)
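
If you prefer not to keep these variables in the PyCharm run configuration, a rough alternative is to add the same paths at the top of the script itself, before pyspark is imported. This is only a sketch, assuming the same Spark layout under /usr/local/spark/spark-2.4.3-bin-hadoop2.7 as above:

import os
import sys

# Same Spark installation path as in the PYTHONPATH above
SPARK_HOME = "/usr/local/spark/spark-2.4.3-bin-hadoop2.7"
os.environ.setdefault("SPARK_HOME", SPARK_HOME)
# Make the bundled pyspark and py4j modules importable
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.10.7-src.zip"))

from pyspark import SparkContext  # import only after the paths are in place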


Now run some test code:

from pyspark import SparkContext
from pyspark import SparkConf

# Run Spark locally on the remote host, using all available cores
conf = SparkConf().setMaster("local[*]").setAppName("test")
sc = SparkContext.getOrCreate(conf)
# Read a text file on the CentOS server and count how often each word appears
textFile = sc.textFile("file:///home/You Have Only One Life.txt")
wordCount = textFile.flatMap(lambda line: line.split(" ")).map(lambda word: (word, 1)).reduceByKey(lambda a, b: a + b)
wordCount.foreach(print)  # print each (word, count) pair to the console
sc.stop()
quit()

Execution result:

ssh://root@192.168.61.128:22/usr/local/python3/bin/python3.6 -u /tmp/pycharm_project_466/temp.py
19/06/05 08:22:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[Stage 0:>                                                          (0 + 1) / 1]('There', 1)
('are', 2)
('moments', 1)
('in', 4)
('life', 4)
('when', 4)
('you', 26)
('miss', 2)
('someone', 1)
('so', 2)
('much', 1)
('that', 6)
('just', 3)
('want', 6)
('to', 15)
('pick', 1)
('them', 3)
('from', 1)
('your', 4)
('dreams', 1)
('and', 6)
('hug', 1)
('for', 2)
('real!', 1)
('Dream', 1)
('what', 2)
('dream;go', 1)
('where', 1)
('go;be', 1)
('be,because', 1)
('have', 7)
('only', 2)
('one', 4)
('chance', 1)
('do', 1)
('all', 1)
('the', 8)
('things', 2)
('do.', 1)
('May', 1)
('enough', 1)
('happiness', 1)
('make', 6)
('sweet,enough', 1)
('trials', 1)
('strong,enough', 1)
('sorrow', 1)
('keep', 1)
('human,enough', 1)
('hope', 1)
('happy', 1)
('Always', 1)
('put', 1)
('yourself', 1)
('others’shoes.If', 1)
('feel', 1)
('it', 1)
('hurts', 2)
('you,it', 1)
('probably', 1)
('other', 1)
('person,', 1)
('too.', 1)
('The', 1)
('happiest', 1)
('of', 6)
('people', 3)
('don’t', 2)
('necessarily', 1)
('best', 1)
('everything;they', 1)
('most', 1)
('everything', 1)
('comes', 1)
('along', 1)
('their', 3)
('way.Happiness', 1)
('lies', 1)
('those', 8)
('who', 10)
('cry,those', 1)
('hurt,', 1)
('searched,and', 1)
('tried,for', 1)
('they', 1)
('can', 1)
('appreciate', 2)
('importance', 1)
('touched', 2)
('lives.Love', 1)
('begins', 1)
('with', 4)
('a', 4)
('smile,grows', 1)
('kiss', 1)
('ends', 1)
('tear.The', 1)
('brightest', 1)
('future', 1)
('will', 3)
('always', 1)
('be', 1)
('based', 1)
('on', 3)
('forgotten', 1)
('past,', 1)
('can’t', 1)
('go', 2)
('well', 1)
('lifeuntil', 1)
('let', 2)
('past', 1)
('failures', 1)
('heartaches.', 1)
('When', 1)
('were', 2)
('born,you', 1)
('crying', 1)
('everyone', 2)
('around', 2)
('was', 1)
('smiling.Live', 1)
("die,you're", 1)
('is', 2)
('smiling', 1)
('crying.', 1)
('Please', 1)
('send', 1)
('this', 2)
('message', 1)
('mean', 1)
('something', 1)
('you,to', 1)
('way', 1)
('or', 1)
('another,to', 1)
('smile', 1)
('really', 2)
('need', 1)
('it,to', 1)
('see', 1)
('brighter', 1)
('side', 1)
('down,to', 1)
('know', 1)
('friendship.And', 1)
('if', 1)
('don’t,', 1)
('worry,nothing', 1)
('bad', 1)
('happen', 1)
('you,you', 1)
('out', 1)
('opportunity', 1)
('brighten', 1)
('someone’s', 1)
('day', 1)
('message.', 1)

Process finished with exit code 0

That counts as a successful run.