Calling Spark's API from Java to run WordCount
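
The original post never shows the source of JavaWordCount, only its run output, so the following is a reconstruction rather than the author's exact code: a minimal word count written against the Spark 1.6 / JDK 1.7 API used here (anonymous inner classes, since Java 7 has no lambdas). Its shape is consistent with the call sites in the log below (textFile at JavaWordCount.java:21, two mapToPair shuffle stages at lines 30 and 45, one to count and one to sort by frequency, and saveAsTextFile at line 59) and with the paths file:/D:/china.txt and file:/D:/1.txt; the whitespace separator and the descending sort are assumptions.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public class JavaWordCount {
    public static void main(String[] args) {
        // Local mode, matching "Starting executor ID driver on host localhost" in the log.
        SparkConf conf = new SparkConf().setAppName("JavaWordCount").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // textFile at JavaWordCount.java:21; input path from "Input split: file:/D:/china.txt".
        JavaRDD<String> lines = sc.textFile("D:/china.txt");

        // Split each line into words (the space separator is an assumption).
        JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public Iterable<String> call(String line) {
                return Arrays.asList(line.split(" "));
            }
        });

        // First mapToPair (JavaWordCount.java:30): emit (word, 1), then reduceByKey sums the
        // ones per word -- this is ShuffleMapStage 0 in the log.
        JavaPairRDD<String, Integer> counts = words
                .mapToPair(new PairFunction<String, String, Integer>() {
                    @Override
                    public Tuple2<String, Integer> call(String word) {
                        return new Tuple2<String, Integer>(word, 1);
                    }
                })
                .reduceByKey(new Function2<Integer, Integer, Integer>() {
                    @Override
                    public Integer call(Integer a, Integer b) {
                        return a + b;
                    }
                });

        // Second mapToPair (JavaWordCount.java:45): swap to (count, word) so sortByKey can
        // order by frequency -- this accounts for the second shuffle, ShuffleMapStage 1.
        JavaPairRDD<Integer, String> sorted = counts
                .mapToPair(new PairFunction<Tuple2<String, Integer>, Integer, String>() {
                    @Override
                    public Tuple2<Integer, String> call(Tuple2<String, Integer> pair) {
                        return new Tuple2<Integer, String>(pair._2(), pair._1());
                    }
                })
                .sortByKey(false);

        // saveAsTextFile at JavaWordCount.java:59; the log commits output under file:/D:/1.txt.
        sorted.saveAsTextFile("D:/1.txt");

        sc.stop();
    }
}

Running it from IntelliJ IDEA produced the launch command and console output below.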

"F:\Program Files\Java\jdk1.7.0_15\bin\java" -Didea.launcher.port=7533 "-Didea.launcher.bin.path=F:\Program Files (x86)\JetBrains\IntelliJIDEASpark\bin" -Dfile.encoding=UTF-8 -classpath "F:\Program Files\Java\jdk1.7.0_15\jre\lib\charsets.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\deploy.jar;F:\Program
Files\Java\jdk1.7.0_15\jre\lib\ext\access-bridge-64.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\dnsns.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\jaccess.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\localedata.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\sunec.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\sunjce_provider.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\sunmscapi.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\ext\zipfs.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\javaws.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\jce.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\jfr.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\jfxrt.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\jsse.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\management-agent.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\plugin.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\resources.jar;F:\Program Files\Java\jdk1.7.0_15\jre\lib\rt.jar;C:\Users\Administrator.WIN-20160809ARI\Desktop\temp\spark\JavaSpark\target\classes;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\scala-lang\scala-library\2.10.6\scala-library-2.10.6.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\spark\spark-core_2.10\1.6.2\spark-core_2.10-1.6.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\avro\avro-mapred\1.7.7\avro-mapred-1.7.7-hadoop2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7-tests.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\twitter\chill_2.10\0.5.0\chill_2.10-0.5.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\esotericsoftware\kryo\kryo\2.21\kryo-2.21.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\esotericsoftware\reflectasm\reflectasm\1.07\reflectasm-1.07-shaded.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\esotericsoftware\minlog\minlog\1.2\minlog-1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\objenesis\objenesis\1.2\objenesis-1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\twitter\chill-java\0.5.0\chill-java-0.5.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\xbean\xbean-asm5-shaded\4.4\xbean-asm5-shaded-4.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\spark\spark-launcher_2.10\1.6.2\spark-launcher_2.10-1.6.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\spark\spark-network-common_2.10\1.6.2\spark-network-common_2.10-1.6.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\spark\spark-network-shuffle_2.10\1.6.2\spark-network-shuffle_2.10-1.6.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.4.4\jackson-annotations-2.4.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\spark\spark-unsafe_2.10\1.6.2\spark-unsafe_2.10-1.6.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\net\java\dev\jets3t\jets3t\0.7.1\jets3t-0.7.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m
2\repository\commons-codec\commons-codec\1.3\commons-codec-1.3.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\curator\curator-recipes\2.4.0\curator-recipes-2.4.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\curator\curator-framework\2.4.0\curator-framework-2.4.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\zookeeper\zookeeper\3.4.5\zookeeper-3.4.5.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\jline\jline\0.9.94\jline-0.9.94.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\google\guava\guava\14.0.1\guava-14.0.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\eclipse\jetty\orbit\javax.servlet\3.0.0.v201112011016\javax.servlet-3.0.0.v201112011016.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\commons\commons-lang3\3.3.2\commons-lang3-3.3.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\commons\commons-math3\3.4.1\commons-math3-3.4.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\slf4j\jul-to-slf4j\1.7.10\jul-to-slf4j-1.7.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\slf4j\slf4j-log4j12\1.7.10\slf4j-log4j12-1.7.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\ning\compress-lzf\1.0.3\compress-lzf-1.0.3.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\xerial\snappy\snappy-java\1.1.2.1\snappy-java-1.1.2.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\net\jpountz\lz4\lz4\1.3.0\lz4-1.3.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\roaringbitmap\RoaringBitmap\0.5.11\RoaringBitmap-0.5.11.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-net\commons-net\2.2\commons-net-2.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\typesafe\akka\akka-remote_2.10\2.3.11\akka-remote_2.10-2.3.11.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\typesafe\akka\akka-actor_2.10\2.3.11\akka-actor_2.10-2.3.11.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\typesafe\config\1.2.1\config-1.2.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\io\netty\netty\3.8.0.Final\netty-3.8.0.Final.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\uncommons\maths\uncommons-maths\1.2.2a\uncommons-maths-1.2.2a.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\typesafe\akka\akka-slf4j_2.10\2.3.11\akka-slf4j_2.10-2.3.11.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\json4s\json4s-jackson_2.10\3.2.10\json4s-jackson_2.10-3.2.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\json4s\json4s-core_2.10\3.2.10\json4s-core_2.10-3.2.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\json4s\json4s-ast_2.10\3.2.10\json4s-ast_2.10-3.2.10.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\scala-lang\scalap\2.10.0\scalap-2.10.0.jar;C
:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\scala-lang\scala-compiler\2.10.0\scala-compiler-2.10.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\sun\jersey\jersey-server\1.9\jersey-server-1.9.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\asm\asm\3.1\asm-3.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\mesos\mesos\0.21.1\mesos-0.21.1-shaded-protobuf.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\io\netty\netty-all\4.0.29.Final\netty-all-4.0.29.Final.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\clearspring\analytics\stream\2.7.0\stream-2.7.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.2\metrics-core-3.1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.2\metrics-jvm-3.1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.2\metrics-json-3.1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\io\dropwizard\metrics\metrics-graphite\3.1.2\metrics-graphite-3.1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.4.4\jackson-databind-2.4.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.4.4\jackson-core-2.4.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\fasterxml\jackson\module\jackson-module-scala_2.10\2.4.4\jackson-module-scala_2.10-2.4.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\scala-lang\scala-reflect\2.10.4\scala-reflect-2.10.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\thoughtworks\paranamer\paranamer\2.6\paranamer-2.6.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\ivy\ivy\2.4.0\ivy-2.4.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\oro\oro\2.0.8\oro-2.0.8.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\tachyonproject\tachyon-client\0.8.2\tachyon-client-0.8.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-lang\commons-lang\2.4\commons-lang-2.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-io\commons-io\2.4\commons-io-2.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\tachyonproject\tachyon-underfs-hdfs\0.8.2\tachyon-underfs-hdfs-0.8.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\tachyonproject\tachyon-underfs-s3\0.8.2\tachyon-underfs-s3-0.8.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\tachyonproject\tachyon-underfs-local\0.8.2\tachyon-underfs-local-0.8.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\net\razorvine\pyrolite\4.9\pyrolite-4.9.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\net\sf\py4j\py4j\0.9\py4j-0.9.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\spark-project\spark\unused\1.0.0\unused-1.0.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-client\2.6.4\hadoop-client-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-common\2.6.4\hadoop-common-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-collections\commons-collecti
ons\3.2.2\commons-collections-3.2.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\avro\avro\1.7.4\avro-1.7.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-auth\2.6.4\hadoop-auth-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\httpcomponents\httpclient\4.2.5\httpclient-4.2.5.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\httpcomponents\httpcore\4.2.4\httpcore-4.2.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\directory\server\apacheds-kerberos-codec\2.0.0-M15\apacheds-kerberos-codec-2.0.0-M15.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\directory\server\apacheds-i18n\2.0.0-M15\apacheds-i18n-2.0.0-M15.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\directory\api\api-asn1-api\1.0.0-M20\api-asn1-api-1.0.0-M20.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\directory\api\api-util\1.0.0-M20\api-util-1.0.0-M20.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\curator\curator-client\2.6.0\curator-client-2.6.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\htrace\htrace-core\3.0.4\htrace-core-3.0.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\commons\commons-compress\1.4.1\commons-compress-1.4.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\tukaani\xz\1.0\xz-1.0.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.6.4\hadoop-hdfs-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\xerces\xercesImpl\2.9.1\xercesImpl-2.9.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\xml-apis\xml-apis\1.3.04\xml-apis-1.3.04.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.6.4\hadoop-mapreduce-client-app-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.6.4\hadoop-mapreduce-client-common-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-yarn-client\2.6.4\hadoop-yarn-client-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.6.4\hadoop-yarn-server-common-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.6.4\hadoop-mapreduce-client-shuffle-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.6.4\hadoop-yarn-api-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.6.4\hadoop-mapreduce-cl
ient-core-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.6.4\hadoop-yarn-common-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\javax\xml\bind\jaxb-api\2.2.2\jaxb-api-2.2.2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\javax\xml\stream\stax-api\1.0-2\stax-api-1.0-2.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\javax\activation\activation\1.1\activation-1.1.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\javax\servlet\servlet-api\2.5\servlet-api-2.5.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.9.13\jackson-jaxrs-1.9.13.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\codehaus\jackson\jackson-xc\1.9.13\jackson-xc-1.9.13.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.6.4\hadoop-mapreduce-client-jobclient-2.6.4.jar;C:\Users\Administrator.WIN-20160809ARI\.m2\repository\org\apache\hadoop\hadoop-annotations\2.6.4\hadoop-annotations-2.6.4.jar;F:\Program Files (x86)\JetBrains\IntelliJIDEASpark\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain JavaWordCount
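
In short, the Maven classpath above amounts to spark-core_2.10 1.6.2 (plus its Akka, Netty, Kryo, and Tachyon transitive dependencies) and hadoop-client 2.6.4, running on JDK 1.7.0_15 with Scala 2.10.6.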

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/12/04 21:02:04 INFO SparkContext: Running Spark version 1.6.2
16/12/04 21:02:06 INFO SecurityManager: Changing view acls to: Administrator
16/12/04 21:02:06 INFO SecurityManager: Changing modify acls to: Administrator
16/12/04 21:02:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Administrator); users with modify permissions: Set(Administrator)
16/12/04 21:02:07 INFO Utils: Successfully started service 'sparkDriver' on port 55299.
16/12/04 21:02:07 INFO Slf4jLogger: Slf4jLogger started
16/12/04 21:02:07 INFO Remoting: Starting remoting
16/12/04 21:02:07 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.164.1:55312]
16/12/04 21:02:07 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 55312.
16/12/04 21:02:07 INFO SparkEnv: Registering MapOutputTracker
16/12/04 21:02:07 INFO SparkEnv: Registering BlockManagerMaster
16/12/04 21:02:07 INFO DiskBlockManager: Created local directory at C:\Users\Administrator.WIN-20160809ARI\AppData\Local\Temp\blockmgr-335551fd-4327-4871-b594-28c901093e15
16/12/04 21:02:08 INFO MemoryStore: MemoryStore started with capacity 1807.0 MB
16/12/04 21:02:08 INFO SparkEnv: Registering OutputCommitCoordinator
16/12/04 21:02:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/12/04 21:02:08 INFO SparkUI: Started SparkUI at http://192.168.164.1:4040
16/12/04 21:02:08 INFO Executor: Starting executor ID driver on host localhost
16/12/04 21:02:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55319.
16/12/04 21:02:08 INFO NettyBlockTransferService: Server created on 55319
16/12/04 21:02:08 INFO BlockManagerMaster: Trying to register BlockManager
16/12/04 21:02:08 INFO BlockManagerMasterEndpoint: Registering block manager localhost:55319 with 1807.0 MB RAM, BlockManagerId(driver, localhost, 55319)
16/12/04 21:02:08 INFO BlockManagerMaster: Registered BlockManager
16/12/04 21:02:09 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 156.3 KB, free 156.3 KB)
16/12/04 21:02:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 14.1 KB, free 170.3 KB)
16/12/04 21:02:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:55319 (size: 14.1 KB, free: 1807.0 MB)

16/12/04 21:02:09 INFO SparkContext: Created broadcast 0 from textFile at JavaWordCount.java:21
16/12/04 21:02:11 WARN : Your hostname, WIN-20160809ARI resolves to a loopback/non-reachable address: fe80:0:0:0:0:5efe:c0a8:c789%20, but we couldn't find any external IP address!
16/12/04 21:02:11 INFO FileInputFormat: Total input paths to process : 1
16/12/04 21:02:12 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
16/12/04 21:02:12 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
16/12/04 21:02:12 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
16/12/04 21:02:12 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
16/12/04 21:02:12 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
16/12/04 21:02:12 INFO SparkContext: Starting job: saveAsTextFile at JavaWordCount.java:59
16/12/04 21:02:12 INFO DAGScheduler: Registering RDD 3 (mapToPair at JavaWordCount.java:30)
16/12/04 21:02:12 INFO DAGScheduler: Registering RDD 5 (mapToPair at JavaWordCount.java:45)
16/12/04 21:02:12 INFO DAGScheduler: Got job 0 (saveAsTextFile at JavaWordCount.java:59) with 1 output partitions
16/12/04 21:02:12 INFO DAGScheduler: Final stage: ResultStage 2 (saveAsTextFile at JavaWordCount.java:59)
16/12/04 21:02:12 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 1)
16/12/04 21:02:12 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 1)
16/12/04 21:02:12 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:30), which has no missing parents
16/12/04 21:02:12 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.7 KB, free 175.1 KB)
16/12/04 21:02:12 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.6 KB, free 177.7 KB)
16/12/04 21:02:12 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:55319 (size: 2.6 KB, free: 1807.0 MB)
16/12/04 21:02:12 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
16/12/04 21:02:12 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:30)
16/12/04 21:02:12 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/12/04 21:02:12 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 2109 bytes)
16/12/04 21:02:12 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/12/04 21:02:12 INFO HadoopRDD: Input split: file:/D:/china.txt:0+19
16/12/04 21:02:12 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2253 bytes result sent to driver
16/12/04 21:02:12 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 261 ms on localhost (1/1)
16/12/04 21:02:12 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
16/12/04 21:02:12 INFO DAGScheduler: ShuffleMapStage 0 (mapToPair at JavaWordCount.java:30) finished in 0.287 s
16/12/04 21:02:12 INFO DAGScheduler: looking for newly runnable stages
16/12/04 21:02:12 INFO DAGScheduler: running: Set()
16/12/04 21:02:12 INFO DAGScheduler: waiting: Set(ShuffleMapStage 1, ResultStage 2)
16/12/04 21:02:12 INFO DAGScheduler: failed: Set()
16/12/04 21:02:12 INFO DAGScheduler: Submitting ShuffleMapStage 1 (MapPartitionsRDD[5] at mapToPair at JavaWordCount.java:45), which has no missing parents
16/12/04 21:02:12 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 4.0 KB, free 181.7 KB)
16/12/04 21:02:12 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 2.3 KB, free 184.0 KB)
16/12/04 21:02:12 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:55319 (size: 2.3 KB, free: 1807.0 MB)
16/12/04 21:02:12 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006
16/12/04 21:02:12 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 1 (MapPartitionsRDD[5] at mapToPair at JavaWordCount.java:45)
16/12/04 21:02:12 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
16/12/04 21:02:12 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, partition 0,NODE_LOCAL, 1883 bytes)
16/12/04 21:02:12 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
16/12/04 21:02:12 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
16/12/04 21:02:12 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 8 ms
16/12/04 21:02:12 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1374 bytes result sent to driver
16/12/04 21:02:12 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 96 ms on localhost (1/1)
16/12/04 21:02:12 INFO DAGScheduler: ShuffleMapStage 1 (mapToPair at JavaWordCount.java:45) finished in 0.096 s
16/12/04 21:02:12 INFO DAGScheduler: looking for newly runnable stages
16/12/04 21:02:12 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
16/12/04 21:02:12 INFO DAGScheduler: running: Set()
16/12/04 21:02:12 INFO DAGScheduler: waiting: Set(ResultStage 2)
16/12/04 21:02:12 INFO DAGScheduler: failed: Set()
16/12/04 21:02:12 INFO DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[8] at saveAsTextFile at JavaWordCount.java:59), which has no missing parents
16/12/04 21:02:12 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 66.1 KB, free 250.1 KB)
16/12/04 21:02:12 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 23.1 KB, free 273.2 KB)
16/12/04 21:02:12 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:55319 (size: 23.1 KB, free: 1807.0 MB)
16/12/04 21:02:12 INFO SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1006
16/12/04 21:02:12 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[8] at saveAsTextFile at JavaWordCount.java:59)
16/12/04 21:02:12 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
16/12/04 21:02:12 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, partition 0,NODE_LOCAL, 1894 bytes)
16/12/04 21:02:12 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)
16/12/04 21:02:12 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
16/12/04 21:02:12 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/12/04 21:02:13 INFO FileOutputCommitter: Saved output of task 'attempt_201612042102_0002_m_000000_2' to file:/D:/1.txt/_temporary/0/task_201612042102_0002_m_000000
16/12/04 21:02:13 INFO SparkHadoopMapRedUtil: attempt_201612042102_0002_m_000000_2: Committed
16/12/04 21:02:13 INFO Executor: Finished task 0.0 in stage 2.0 (TID 2). 2080 bytes result sent to driver
16/12/04 21:02:13 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 181 ms on localhost (1/1)
16/12/04 21:02:13 INFO DAGScheduler: ResultStage 2 (saveAsTextFile at JavaWordCount.java:59) finished in 0.182 s
16/12/04 21:02:13 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool 
16/12/04 21:02:13 INFO DAGScheduler: Job 0 finished: saveAsTextFile at JavaWordCount.java:59, took 0.818282 s
16/12/04 21:02:13 INFO SparkUI: Stopped Spark web UI at http://192.168.164.1:4040
16/12/04 21:02:13 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/12/04 21:02:13 INFO MemoryStore: MemoryStore cleared
16/12/04 21:02:13 INFO BlockManager: BlockManager stopped
16/12/04 21:02:13 INFO BlockManagerMaster: BlockManagerMaster stopped
16/12/04 21:02:13 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/12/04 21:02:13 INFO SparkContext: Successfully stopped SparkContext
16/12/04 21:02:13 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/12/04 21:02:13 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/12/04 21:02:13 INFO ShutdownHookManager: Shutdown hook called
16/12/04 21:02:13 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator.WIN-20160809ARI\AppData\Local\Temp\spark-5409914b-5d2d-40cf-95ce-e1383c67fbcd


Process finished with exit code 0
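
Note that saveAsTextFile treats D:/1.txt as a directory rather than a single file: the counts land in part files inside it (part-00000 here, since the job has one output partition), which is why the commit in the log goes through file:/D:/1.txt/_temporary/0/task_201612042102_0002_m_000000.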
