
Fixing "A master URL must be set in your configuration"

Environment: Windows 7 + IDEA + Spark.
The relevant pom.xml is as follows:

<properties>
    <spark.version>2.0.2</spark.version>
    <scala.version>2.11</scala.version>
</properties>

<dependencies>
    <!--
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
    </dependency>
    -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <version>2.15.2</version>
            <executions>
                <execution>
                    <id>scala-compile-first</id>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                    <configuration>
                        <includes>
                            <include>**/*.scala</include>
                        </includes>
                    </configuration>
                </execution>
                <execution>
                    <id>scala-test-compile</id>
                    <goals>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Running the WordCount example below fails with the error: A master URL must be set in your configuration. The SparkConf in the code never calls setMaster, and nothing in the run environment supplies spark.master, so SparkContext refuses to start.

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Note: no setMaster() here, so the master must come from the environment.
    val conf = new SparkConf().setAppName("wordCount")
    val sc = new SparkContext(conf)
    // args(0): input path, args(1): output path
    val rdd = sc.textFile(args(0))
    // Split lines into words and count occurrences of each word.
    val wordcount = rdd.flatMap(_.split(" ")).map(x => (x, 1)).reduceByKey(_ + _)
    // Sort by count descending: swap key/value, sort, then swap back.
    val wordsort = wordcount.map(x => (x._2, x._1)).sortByKey(false).map(x => (x._2, x._1))
    wordsort.saveAsTextFile(args(1))
    sc.stop()
  }
}
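
Why the VM-option fix below works: new SparkConf() copies every JVM system property whose name starts with spark. into the configuration, so a spark.master value supplied with -D is picked up automatically. A minimal sketch illustrating this (the object name MasterFromSystemProps is mine; System.setProperty stands in for the -D flag):

import org.apache.spark.SparkConf

object MasterFromSystemProps {
  def main(args: Array[String]): Unit = {
    // Equivalent to launching the JVM with -Dspark.master=local
    System.setProperty("spark.master", "local")
    val conf = new SparkConf().setAppName("demo") // loads spark.* system properties
    println(conf.get("spark.master"))             // prints "local"
  }
}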

Solution: in IDEA's Run/Debug Configuration, add "-Dspark.master=local" to the VM options. With that property set, the example runs without the error.
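
An alternative that avoids the VM option is to set the master directly in code. A minimal sketch (the local[*] value and the object name WordCountLocal are my choices, not from the original post):

import org.apache.spark.{SparkConf, SparkContext}

object WordCountLocal {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("wordCount")
      .setMaster("local[*]") // run inside the IDE using all local CPU cores
    val sc = new SparkContext(conf)
    // ... same WordCount logic as above ...
    sc.stop()
  }
}

Hard-coding the master is convenient for IDE debugging, but it should be removed (or made conditional) before submitting the job to a real cluster, where the master is normally passed via spark-submit --master.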