Spark Example Walkthrough: SparkPi
阿新 • Published: 2019-01-10
1. The source code
import org.apache.spark.{SparkConf, SparkContext}

object SparkPi {
  def main(args: Array[String]) {
    // The master URL and jar path below are specific to this cluster setup.
    val conf = new SparkConf()
      .setAppName("Spark Pi")
      .setMaster("spark://hadoop1:7077")
      .setJars(List("/home/kaiseu/MyProject/IdeaProjects/week2/out/artifacts/week2/week2.jar"))
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = 100000 * slices
    // Sample n random points in the square [-1, 1] x [-1, 1] and count
    // how many fall inside the unit circle.
    val count = spark.parallelize(1 to n, slices).map { i =>
      val x = Math.random * 2 - 1
      val y = Math.random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
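The estimator behind this job is plain Monte Carlo: points drawn uniformly from the square [-1, 1] x [-1, 1] land inside the unit circle with probability π/4, so 4 × (hits / n) approximates π. Here is a minimal local sketch of the same idea without Spark; the object name `MonteCarloPi` is hypothetical and not part of the original program:

```scala
import scala.util.Random

object MonteCarloPi {
  // Estimate pi by sampling n uniform points in [-1, 1] x [-1, 1]
  // and measuring the fraction that land inside the unit circle.
  def estimate(n: Int): Double = {
    val hits = (1 to n).count { _ =>
      val x = Random.nextDouble() * 2 - 1
      val y = Random.nextDouble() * 2 - 1
      x * x + y * y < 1 // inside the circle of radius 1
    }
    4.0 * hits / n
  }

  def main(args: Array[String]): Unit =
    println("Pi is roughly " + estimate(1000000))
}
```

The Spark version distributes exactly this loop across `slices` partitions with `parallelize(...).map(...).reduce(_ + _)`.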
2. Start the Spark cluster
Go to the Spark installation directory: /app/hadoop/spark100
3. Visit the Spark cluster web UI at "http://hadoop1:8080" to confirm the master and workers are running.
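Starting the standalone cluster and submitting the job can be sketched as shell commands. The install directory, class name, and jar path below are assumptions taken from the configuration shown earlier and may differ in your environment:

```shell
# Start the standalone master and workers (run on hadoop1,
# from the Spark installation directory).
cd /app/hadoop/spark100
sbin/start-all.sh

# Submit the packaged application. The class name and jar path
# mirror the SparkConf above; adjust them for your own project.
bin/spark-submit \
  --master spark://hadoop1:7077 \
  --class SparkPi \
  /home/kaiseu/MyProject/IdeaProjects/week2/out/artifacts/week2/week2.jar 4
```

Note that because the program hard-codes `setMaster` and `setJars`, the corresponding `spark-submit` flags are redundant here; in practice it is cleaner to drop those calls from the code and let `spark-submit` supply them.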
4. Execution results