Solved: Exception: Python in worker has different version 2.7 than that in driver 3.6
Posted by 阿新 • 2020-12-18
Tags: big data, Python, Spark, Hadoop, Linux
Full error message: Exception: Python in worker has different version 2.7 than that in driver 3.6, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
While running a PySpark program on an Alibaba Cloud server, the error above was reported.
Server environment (CentOS): both python (which defaults to Python 2) and python3 are installed, i.e. a dual-Python setup.
pyspark==2.1.2 was installed under the python3 environment. Note that the pyspark version must match the installed Spark version (Spark 2.1.1 in this case).
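A quick way to confirm the pairing is to ask pyspark for its own version from the interpreter it was installed into; a minimal check (the 2.1.x figures are just this article's setup):

    # run this under python3, the environment where pyspark was installed
    import pyspark
    print(pyspark.__version__)  # expect "2.1.2"; major.minor should match the Spark install (2.1.x)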
Running the script as python3 xxx.py, as shown below, produces the following error:
    [root@localhost code]# python3 foreach.py
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    20/12/17 15:30:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    20/12/17 15:30:27 WARN Utils: Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 172.16.1.186 instead (on interface eth0)
    20/12/17 15:30:27 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
    20/12/17 15:30:30 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
    org.apache.spark.api.python.PythonException: Traceback (most recent call last):
      File "/opt/software/spark/python/lib/pyspark.zip/pyspark/worker.py", line 125, in main
        ("%d.%d" % sys.version_info[:2], version))
    Exception: Python in worker has different version 2.7 than that in driver 3.6, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.

        at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
        at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
        at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    20/12/17 15:30:30 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
      File "/opt/software/spark/python/lib/pyspark.zip/pyspark/worker.py", line 125, in main
        ("%d.%d" % sys.version_info[:2], version))
Solution:
The error tells us that PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON should point at python3, but because they are not set, the workers fall back to the default python, which is version 2.7 and which also lacks pyspark and the other required libraries, while the driver runs under Python 3.6; the mismatched minor versions trigger the exception.
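For context, the check that fails here (pyspark/worker.py, line 125 in the traceback) boils down to comparing the worker's major.minor version string against the one the driver reported. A simplified sketch of that logic, reconstructed from the traceback rather than copied from Spark's source:

    import sys

    def check_worker_version(driver_version):
        # driver_version is the "major.minor" string sent by the driver, e.g. "3.6"
        worker_version = "%d.%d" % sys.version_info[:2]
        if worker_version != driver_version:
            raise Exception(
                "Python in worker has different version %s than that in driver %s, "
                "PySpark cannot run with different minor versions."
                % (worker_version, driver_version))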
Use the which python3 command to find where python3 lives, then point the two variables above at that interpreter inside the program, as follows:
    from pyspark import SparkContext
    # the following three lines are the additions
    import os
    os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
    os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"
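Putting it together, a minimal, hypothetical script in the spirit of foreach.py (the app name and sample data are my assumptions, not from the original post); the key point is that both variables are set before the SparkContext is created:

    import os
    os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"         # interpreter for the workers
    os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"  # interpreter for the driver

    from pyspark import SparkContext

    sc = SparkContext("local", "foreach_demo")
    rdd = sc.parallelize([1, 2, 3, 4])
    rdd.foreach(print)  # the action now executes in a Python 3 worker
    sc.stop()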
Save and run it again; the program now runs normally.
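As an optional sanity check (my addition, not part of the original fix), the driver and a worker can each report their Python version from inside the same script; both lines should print the same major.minor, e.g. 3.6:

    import sys
    from pyspark import SparkContext

    sc = SparkContext("local", "version_check")
    print("driver:", "%d.%d" % sys.version_info[:2])
    worker_versions = sc.parallelize([0], 1).map(lambda _: "%d.%d" % sys.version_info[:2]).collect()
    print("worker:", worker_versions[0])
    sc.stop()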