Python Spark environment setup
阿新 • Published: 2018-08-09
1. Download the spark-2.3.0-bin-hadoop2.7 package.
Place it on the D: drive.
Add an environment variable SPARK_HOME = D:\spark-2.3.0-bin-hadoop2.7.
- Add %SPARK_HOME%/bin to the PATH environment variable.
- Then open a command prompt and run the pyspark command. If it launches successfully, the environment variables are set correctly.
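Beyond launching pyspark, a quick sanity check can confirm that SPARK_HOME points at a plausible Spark install. This is a minimal sketch; the check_spark_home helper is illustrative, not part of Spark, and the D: drive path is the one assumed above:

```python
import os

def check_spark_home(spark_home):
    """Return True if spark_home looks like a Spark install: the
    directory exists and contains a bin/ subdirectory."""
    return os.path.isdir(spark_home) and os.path.isdir(os.path.join(spark_home, "bin"))

# Check the configured value, falling back to the path used in this guide
print(check_spark_home(os.environ.get("SPARK_HOME", r"D:\spark-2.3.0-bin-hadoop2.7")))
```

This only verifies the directory layout; running pyspark remains the real test.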
Find PyCharm's site-packages directory.
Right-click to open the directory, then copy the python/pyspark folder from D:\spark-2.3.0-bin-hadoop2.7 into that site-packages directory.
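The manual copy step can also be scripted. This is a hedged sketch using shutil.copytree; the copy_pyspark helper is hypothetical, and both paths are assumptions you should adjust to your own install:

```python
import os
import shutil

def copy_pyspark(spark_home, site_packages):
    """Copy the pyspark package from a Spark install into site-packages.
    Skips the copy if a pyspark directory is already present.
    Returns the destination path."""
    src = os.path.join(spark_home, "python", "pyspark")
    dst = os.path.join(site_packages, "pyspark")
    if not os.path.isdir(dst):
        shutil.copytree(src, dst)
    return dst
```

For example: copy_pyspark(r"D:\spark-2.3.0-bin-hadoop2.7", path_to_your_site_packages).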
Install py4j.
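py4j is typically installed with pip (assuming pip is on your PATH):

```shell
pip install py4j
```
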
Test with the following code:
from __future__ import print_function
import os
import sys

# Path to the Spark install (raw strings avoid backslash-escape issues on Windows)
os.environ['SPARK_HOME'] = r"D:\spark-2.3.0-bin-hadoop2.7"

# Make pyspark importable; the py4j zip name must match the file
# actually shipped in your Spark's python\lib directory
sys.path.append(r"D:\spark-2.3.0-bin-hadoop2.7\python")
sys.path.append(r"D:\spark-2.3.0-bin-hadoop2.7\python\lib\py4j-0.9-src.zip")

from pyspark import SparkContext

if __name__ == '__main__':
    inputFile = r"D:\Harry.txt"
    outputFile = r"D:\Harry1.txt"  # saveAsTextFile creates a directory of part files

    sc = SparkContext()
    text_file = sc.textFile(inputFile)
    counts = text_file.flatMap(lambda line: line.split(' ')) \
                      .map(lambda word: (word, 1)) \
                      .reduceByKey(lambda a, b: a + b)
    counts.saveAsTextFile(outputFile)
If the word count completes successfully, the environment is set up correctly.
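For reference, the flatMap/map/reduceByKey pipeline above computes an ordinary word count. A minimal pure-Python equivalent (no Spark needed) makes the logic easy to verify locally:

```python
from collections import Counter

def word_count(lines):
    """Split each line on single spaces and count word occurrences,
    mirroring flatMap -> map -> reduceByKey in the Spark job."""
    return Counter(word for line in lines for word in line.split(' '))

print(word_count(["to be or", "not to be"]))
```

The Spark version produces the same (word, count) pairs, but partitioned across part files in the output directory.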