
CancelledKeyException when running ADAM with a separate Spark cluster #17

Open
davidonlaptop opened this issue Apr 14, 2015 · 1 comment


@davidonlaptop (Member)

Consequence

The worker is listed in the removed executors list with its state as KILLED.

However, an output file is being produced, so it is not yet clear whether this is a real problem.

Steps to reproduce

adamcloud@Adams-Mac-mini-Ubuntu:~$ docker exec adam adam-submit --master spark://spark-master:7077 transform hdfs://hdfs-namenode:9000/SRR062634.sam hdfs://hdfs-namenode:9000/SRR062634b.adam
Spark assembly has been built with Hive, including Datanucleus jars on classpath
2015-04-14 18:56:10 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

STDERR

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/14 18:56:11 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
15/04/14 18:56:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/04/14 18:56:12 INFO SecurityManager: Changing view acls to: root
15/04/14 18:56:12 INFO SecurityManager: Changing modify acls to: root
15/04/14 18:56:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/04/14 18:56:12 INFO Slf4jLogger: Slf4jLogger started
15/04/14 18:56:12 INFO Remoting: Starting remoting
15/04/14 18:56:12 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@spark-worker1:60308]
15/04/14 18:56:12 INFO Remoting: Remoting now listens on addresses: [akka.tcp://driverPropsFetcher@spark-worker1:60308]
15/04/14 18:56:12 INFO Utils: Successfully started service 'driverPropsFetcher' on port 60308.
15/04/14 18:56:12 INFO SecurityManager: Changing view acls to: root
15/04/14 18:56:12 INFO SecurityManager: Changing modify acls to: root
15/04/14 18:56:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/04/14 18:56:12 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/04/14 18:56:12 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/04/14 18:56:12 INFO Slf4jLogger: Slf4jLogger started
15/04/14 18:56:12 INFO Remoting: Starting remoting
15/04/14 18:56:12 INFO Remoting: Remoting shut down
15/04/14 18:56:12 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
15/04/14 18:56:12 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@spark-worker1:37876]
15/04/14 18:56:12 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@spark-worker1:37876]
15/04/14 18:56:12 INFO Utils: Successfully started service 'sparkExecutor' on port 37876.
15/04/14 18:56:12 INFO CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://sparkDriver@adam:33997/user/CoarseGrainedScheduler
15/04/14 18:56:12 INFO WorkerWatcher: Connecting to worker akka.tcp://sparkWorker@spark-worker1:43766/user/Worker
15/04/14 18:56:12 INFO WorkerWatcher: Successfully connected to akka.tcp://sparkWorker@spark-worker1:43766/user/Worker
15/04/14 18:56:12 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
15/04/14 18:56:12 INFO SecurityManager: Changing view acls to: root
15/04/14 18:56:12 INFO SecurityManager: Changing modify acls to: root
15/04/14 18:56:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/04/14 18:56:12 INFO Slf4jLogger: Slf4jLogger started
15/04/14 18:56:12 INFO Remoting: Starting remoting
15/04/14 18:56:13 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@spark-worker1:46753]
15/04/14 18:56:13 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@spark-worker1:46753]
15/04/14 18:56:13 INFO Utils: Successfully started service 'sparkExecutor' on port 46753.
15/04/14 18:56:13 INFO AkkaUtils: Connecting to MapOutputTracker: akka.tcp://sparkDriver@adam:33997/user/MapOutputTracker
15/04/14 18:56:13 INFO AkkaUtils: Connecting to BlockManagerMaster: akka.tcp://sparkDriver@adam:33997/user/BlockManagerMaster
15/04/14 18:56:13 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150414185613-f576
15/04/14 18:56:13 INFO Utils: Successfully started service 'Connection manager for block manager' on port 37422.
15/04/14 18:56:13 INFO ConnectionManager: Bound socket to port 37422 with id = ConnectionManagerId(spark-worker1,37422)
15/04/14 18:56:13 INFO MemoryStore: MemoryStore started with capacity 1060.3 MB
15/04/14 18:56:13 INFO BlockManagerMaster: Trying to register BlockManager
15/04/14 18:56:13 INFO BlockManagerMaster: Registered BlockManager
15/04/14 18:56:13 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@adam:33997/user/HeartbeatReceiver
15/04/14 18:56:13 INFO CoarseGrainedExecutorBackend: Got assigned task 0
15/04/14 18:56:13 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/paranamer-2.3.jar with timestamp 1429037770711
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/paranamer-2.3.jar to /tmp/fetchFileTemp9138374815620003401.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./paranamer-2.3.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-cli-1.2.jar with timestamp 1429037770686
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-cli-1.2.jar to /tmp/fetchFileTemp4300381861535729741.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-cli-1.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-httpclient-3.1.jar with timestamp 1429037770688
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-httpclient-3.1.jar to /tmp/fetchFileTemp2955185790565174009.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-httpclient-3.1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/jcommander-1.27.jar with timestamp 1429037770786
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/jcommander-1.27.jar to /tmp/fetchFileTemp3393129393988809981.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./jcommander-1.27.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/objenesis-1.2.jar with timestamp 1429037770720
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/objenesis-1.2.jar to /tmp/fetchFileTemp5267878146906553193.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./objenesis-1.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/fastutil-6.4.4.jar with timestamp 1429037770755
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/fastutil-6.4.4.jar to /tmp/fetchFileTemp1363741256222612084.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./fastutil-6.4.4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-jexl-2.1.1.jar with timestamp 1429037770782
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-jexl-2.1.1.jar to /tmp/fetchFileTemp4854967959179966282.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-jexl-2.1.1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/grizzled-slf4j_2.10-1.0.2.jar with timestamp 1429037770856
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/grizzled-slf4j_2.10-1.0.2.jar to /tmp/fetchFileTemp2750787710904710876.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./grizzled-slf4j_2.10-1.0.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-jackson-1.6.0rc4.jar with timestamp 1429037770764
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-jackson-1.6.0rc4.jar to /tmp/fetchFileTemp7793053302073882282.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-jackson-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/bcel-5.2.jar with timestamp 1429037770781
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/bcel-5.2.jar to /tmp/fetchFileTemp6684520750287603408.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./bcel-5.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/hadoop-bam-7.0.0.jar with timestamp 1429037770767
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/hadoop-bam-7.0.0.jar to /tmp/fetchFileTemp4154138373923122267.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./hadoop-bam-7.0.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/servo-core-0.5.5.jar with timestamp 1429037770791
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/servo-core-0.5.5.jar to /tmp/fetchFileTemp9045351472743461951.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./servo-core-0.5.5.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/snappy-java-1.0.5.3.jar with timestamp 1429037770703
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/snappy-java-1.0.5.3.jar to /tmp/fetchFileTemp6129776536072113626.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./snappy-java-1.0.5.3.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/cofoja-1.1-r150.jar with timestamp 1429037770770
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/cofoja-1.1-r150.jar to /tmp/fetchFileTemp7506307979723596237.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./cofoja-1.1-r150.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/json4s-ast_2.10-3.2.10.jar with timestamp 1429037770849
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/json4s-ast_2.10-3.2.10.jar to /tmp/fetchFileTemp5860121585304136161.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./json4s-ast_2.10-3.2.10.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scala-library-2.10.4.jar with timestamp 1429037770808
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scala-library-2.10.4.jar to /tmp/fetchFileTemp4255564728918789950.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scala-library-2.10.4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-format-2.2.0-rc1.jar with timestamp 1429037770766
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-format-2.2.0-rc1.jar to /tmp/fetchFileTemp165001245190906794.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-format-2.2.0-rc1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-io-1.3.2.jar with timestamp 1429037770704
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-io-1.3.2.jar to /tmp/fetchFileTemp5067147027478568727.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-io-1.3.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/asm-4.0.jar with timestamp 1429037770720
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/asm-4.0.jar to /tmp/fetchFileTemp318227717983144312.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./asm-4.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-compress-1.4.1.jar with timestamp 1429037770690
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-compress-1.4.1.jar to /tmp/fetchFileTemp4348044553948052935.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-compress-1.4.1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/slf4j-log4j12-1.7.5.jar with timestamp 1429037770809
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/slf4j-log4j12-1.7.5.jar to /tmp/fetchFileTemp1260749043253037096.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./slf4j-log4j12-1.7.5.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/kryo-2.21.jar with timestamp 1429037770719
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/kryo-2.21.jar to /tmp/fetchFileTemp7213170923379151436.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./kryo-2.21.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/args4j-2.0.23.jar with timestamp 1429037770809
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/args4j-2.0.23.jar to /tmp/fetchFileTemp8135928262602238416.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./args4j-2.0.23.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-hadoop-1.6.0rc4.jar with timestamp 1429037770761
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-hadoop-1.6.0rc4.jar to /tmp/fetchFileTemp4657446138937067937.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-hadoop-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/joda-convert-1.6.jar with timestamp 1429037770861
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/joda-convert-1.6.jar to /tmp/fetchFileTemp5262321399304079929.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./joda-convert-1.6.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-avro-1.6.0rc4.jar with timestamp 1429037770756
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-avro-1.6.0rc4.jar to /tmp/fetchFileTemp328522683892592644.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-avro-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalatra-json_2.10-2.3.0.jar with timestamp 1429037770846
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalatra-json_2.10-2.3.0.jar to /tmp/fetchFileTemp891212980175908144.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalatra-json_2.10-2.3.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scala-reflect-2.10.4.jar with timestamp 1429037770867
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scala-reflect-2.10.4.jar to /tmp/fetchFileTemp27874364891379654.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scala-reflect-2.10.4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1429037770710
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/jackson-mapper-asl-1.9.13.jar to /tmp/fetchFileTemp8180814965879888474.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./jackson-mapper-asl-1.9.13.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scala-compiler-2.10.0.jar with timestamp 1429037770846
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scala-compiler-2.10.0.jar to /tmp/fetchFileTemp6701382586835769484.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scala-compiler-2.10.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/bsh-2.0b4.jar with timestamp 1429037770786
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/bsh-2.0b4.jar to /tmp/fetchFileTemp2388925456893403133.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./bsh-2.0b4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/rl_2.10-0.4.10.jar with timestamp 1429037770857
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/rl_2.10-0.4.10.jar to /tmp/fetchFileTemp7265779172490063278.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./rl_2.10-0.4.10.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/minlog-1.2.jar with timestamp 1429037770720
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/minlog-1.2.jar to /tmp/fetchFileTemp7393705099077145723.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./minlog-1.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/mime-util-2.1.3.jar with timestamp 1429037770859
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/mime-util-2.1.3.jar to /tmp/fetchFileTemp304383920554853460.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./mime-util-2.1.3.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalap-2.10.0.jar with timestamp 1429037770851
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalap-2.10.0.jar to /tmp/fetchFileTemp4091359871699194773.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalap-2.10.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/xz-1.0.jar with timestamp 1429037770690
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/xz-1.0.jar to /tmp/fetchFileTemp1898496628024199440.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./xz-1.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/httpcore-4.3.1.jar with timestamp 1429037770790
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/httpcore-4.3.1.jar to /tmp/fetchFileTemp7892039994601518338.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./httpcore-4.3.1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/jakarta-regexp-1.4.jar with timestamp 1429037770781
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/jakarta-regexp-1.4.jar to /tmp/fetchFileTemp1426692907134838681.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./jakarta-regexp-1.4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-common-1.6.0rc4.jar with timestamp 1429037770759
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-common-1.6.0rc4.jar to /tmp/fetchFileTemp5086786835973030426.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-common-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/slf4j-api-1.7.5.jar with timestamp 1429037770697
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/slf4j-api-1.7.5.jar to /tmp/fetchFileTemp6896817311355197448.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./slf4j-api-1.7.5.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/log4j-1.2.17.jar with timestamp 1429037770698
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/log4j-1.2.17.jar to /tmp/fetchFileTemp4527060373665486864.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./log4j-1.2.17.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/ant-1.8.2.jar with timestamp 1429037770778
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/ant-1.8.2.jar to /tmp/fetchFileTemp6733947518044913252.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./ant-1.8.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-generator-1.6.0rc4.jar with timestamp 1429037770760
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-generator-1.6.0rc4.jar to /tmp/fetchFileTemp2787981656240273372.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-generator-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/httpclient-4.3.2.jar with timestamp 1429037770788
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/httpclient-4.3.2.jar to /tmp/fetchFileTemp767377273667406719.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./httpclient-4.3.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/joda-time-2.3.jar with timestamp 1429037770860
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/joda-time-2.3.jar to /tmp/fetchFileTemp8092498449699013036.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./joda-time-2.3.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/adam-core-0.15.0.jar with timestamp 1429037770717
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/adam-core-0.15.0.jar to /tmp/fetchFileTemp6699838462856191513.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./adam-core-0.15.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/htsjdk-1.118.jar with timestamp 1429037770773
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/htsjdk-1.118.jar to /tmp/fetchFileTemp8057782358742865819.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./htsjdk-1.118.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-codec-1.4.jar with timestamp 1429037770688
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-codec-1.4.jar to /tmp/fetchFileTemp2904536354338960242.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-codec-1.4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/testng-6.8.8.jar with timestamp 1429037770785
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/testng-6.8.8.jar to /tmp/fetchFileTemp177021206433593691.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./testng-6.8.8.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/adam-cli-0.15.0.jar with timestamp 1429037770868
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/adam-cli-0.15.0.jar to /tmp/fetchFileTemp1198597501853490421.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./adam-cli-0.15.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalate-core_2.10-1.6.1.jar with timestamp 1429037770815
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalate-core_2.10-1.6.1.jar to /tmp/fetchFileTemp1544095283491947968.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalate-core_2.10-1.6.1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1429037770704
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalac-scoverage-plugin_2.10-0.99.2.jar to /tmp/fetchFileTemp2418724355007354737.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalac-scoverage-plugin_2.10-0.99.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/commons-logging-1.1.3.jar with timestamp 1429037770689
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/commons-logging-1.1.3.jar to /tmp/fetchFileTemp2193111831579078333.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./commons-logging-1.1.3.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/jackson-core-asl-1.9.13.jar with timestamp 1429037770708
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/jackson-core-asl-1.9.13.jar to /tmp/fetchFileTemp2508275494061804380.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./jackson-core-asl-1.9.13.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/avro-1.7.6.jar with timestamp 1429037770707
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/avro-1.7.6.jar to /tmp/fetchFileTemp3201061414883736034.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./avro-1.7.6.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/ant-launcher-1.8.2.jar with timestamp 1429037770779
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/ant-launcher-1.8.2.jar to /tmp/fetchFileTemp8671223998556853064.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./ant-launcher-1.8.2.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/adam-apis-0.15.0.jar with timestamp 1429037770792
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/adam-apis-0.15.0.jar to /tmp/fetchFileTemp452417511092631568.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./adam-apis-0.15.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/guava-14.0.1.jar with timestamp 1429037770696
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/guava-14.0.1.jar to /tmp/fetchFileTemp5078180267466695914.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./guava-14.0.1.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/reflectasm-1.07-shaded.jar with timestamp 1429037770719
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/reflectasm-1.07-shaded.jar to /tmp/fetchFileTemp1888943180233620703.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./reflectasm-1.07-shaded.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/bdg-formats-0.4.0.jar with timestamp 1429037770705
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/bdg-formats-0.4.0.jar to /tmp/fetchFileTemp7247129694857228863.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./bdg-formats-0.4.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/json4s-core_2.10-3.2.10.jar with timestamp 1429037770848
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/json4s-core_2.10-3.2.10.jar to /tmp/fetchFileTemp5003181348880161547.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./json4s-core_2.10-3.2.10.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/juniversalchardet-1.0.3.jar with timestamp 1429037770858
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/juniversalchardet-1.0.3.jar to /tmp/fetchFileTemp3163585313980219109.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./juniversalchardet-1.0.3.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalatra_2.10-2.3.0.jar with timestamp 1429037770855
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalatra_2.10-2.3.0.jar to /tmp/fetchFileTemp8229749564150709625.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalatra_2.10-2.3.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalatra-common_2.10-2.3.0.jar with timestamp 1429037770856
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalatra-common_2.10-2.3.0.jar to /tmp/fetchFileTemp7222374969558899826.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalatra-common_2.10-2.3.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-encoding-1.6.0rc4.jar with timestamp 1429037770760
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-encoding-1.6.0rc4.jar to /tmp/fetchFileTemp3713177571411689667.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-encoding-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/annotations-2.0.0.jar with timestamp 1429037770792
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/annotations-2.0.0.jar to /tmp/fetchFileTemp2076170929340010228.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./annotations-2.0.0.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/parquet-column-1.6.0rc4.jar with timestamp 1429037770758
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/parquet-column-1.6.0rc4.jar to /tmp/fetchFileTemp2383576209123461404.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./parquet-column-1.6.0rc4.jar to class loader
15/04/14 18:56:13 INFO Executor: Fetching http://172.17.0.16:45360/jars/scalate-util_2.10-1.6.1.jar with timestamp 1429037770816
15/04/14 18:56:13 INFO Utils: Fetching http://172.17.0.16:45360/jars/scalate-util_2.10-1.6.1.jar to /tmp/fetchFileTemp1285303994951399648.tmp
15/04/14 18:56:13 INFO Executor: Adding file:/usr/local/spark-1.1.0-bin-hadoop2.3/work/app-20150414185611-0000/0/./scalate-util_2.10-1.6.1.jar to class loader
15/04/14 18:56:13 INFO TorrentBroadcast: Started reading broadcast variable 1
15/04/14 18:56:13 INFO SendingConnection: Initiating connection to [adam/172.17.0.16:52263]
15/04/14 18:56:13 INFO SendingConnection: Connected to [adam/172.17.0.16:52263], 1 messages pending
15/04/14 18:56:33 INFO ConnectionManager: Accepted connection from [adam/172.17.0.16:34797]
15/04/14 18:56:33 INFO MemoryStore: ensureFreeSpace(22331) called with curMem=0, maxMem=1111794647
15/04/14 18:56:33 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 21.8 KB, free 1060.3 MB)
15/04/14 18:56:33 INFO BlockManagerMaster: Updated info of block broadcast_1_piece0
15/04/14 18:56:33 INFO TorrentBroadcast: Reading broadcast variable 1 took 20.130715702 s
15/04/14 18:56:34 INFO MemoryStore: ensureFreeSpace(64336) called with curMem=22331, maxMem=1111794647
15/04/14 18:56:34 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 62.8 KB, free 1060.2 MB)
15/04/14 18:56:34 INFO TorrentBroadcast: Started reading broadcast variable 0
15/04/14 18:56:44 INFO MemoryStore: ensureFreeSpace(15731) called with curMem=86667, maxMem=1111794647
15/04/14 18:56:44 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 15.4 KB, free 1060.2 MB)
15/04/14 18:56:44 INFO BlockManagerMaster: Updated info of block broadcast_0_piece0
15/04/14 18:56:44 INFO TorrentBroadcast: Reading broadcast variable 0 took 10.02929592 s
15/04/14 18:56:44 INFO MemoryStore: ensureFreeSpace(264748) called with curMem=102398, maxMem=1111794647
15/04/14 18:56:44 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 258.5 KB, free 1059.9 MB)
15/04/14 18:56:44 INFO deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
15/04/14 18:56:44 INFO NewHadoopRDD: Input split: hdfs://hdfs-namenode:9000/SRR062634.sam:0+94792807
15/04/14 18:56:45 INFO CodecPool: Got brand-new compressor [.gz]
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
15/04/14 18:57:00 INFO FileOutputCommitter: Saved output of task 'attempt_201504141856_0002_r_000000_0' to hdfs://hdfs-namenode:9000/SRR062634b.adam/_temporary/0/task_201504141856_0002_r_000000
15/04/14 18:57:00 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1846 bytes result sent to driver
15/04/14 18:57:01 INFO ConnectionManager: Removing ReceivingConnection to ConnectionManagerId(adam,52263)
15/04/14 18:57:01 INFO ConnectionManager: Removing SendingConnection to ConnectionManagerId(adam,52263)
15/04/14 18:57:01 INFO ConnectionManager: Removing SendingConnection to ConnectionManagerId(adam,52263)
15/04/14 18:57:01 INFO ConnectionManager: key already cancelled ? sun.nio.ch.SelectionKeyImpl@2662e5cf
java.nio.channels.CancelledKeyException
    at sun.nio.ch.SelectionKeyImpl.ensureValid(SelectionKeyImpl.java:73)
    at sun.nio.ch.SelectionKeyImpl.readyOps(SelectionKeyImpl.java:87)
    at java.nio.channels.SelectionKey.isConnectable(SelectionKey.java:336)
    at org.apache.spark.network.ConnectionManager.run(ConnectionManager.scala:375)
    at org.apache.spark.network.ConnectionManager$$anon$4.run(ConnectionManager.scala:139)
15/04/14 18:57:01 ERROR CoarseGrainedExecutorBackend: Driver Disassociated [akka.tcp://sparkExecutor@spark-worker1:37876] -> [akka.tcp://sparkDriver@adam:33997] disassociated! Shutting down.

STDOUT

Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.codec.CodecConfig: Compression: GZIP
Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.ParquetOutputFormat: Dictionary is on
Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.ParquetOutputFormat: Validation is off
Apr 14, 2015 6:56:45 PM INFO: parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
Apr 14, 2015 6:56:59 PM INFO: parquet.hadoop.InternalParquetRecordWriter: Flushing mem store to file. allocated memory: 96,056,480
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 41,000B for [contig, contigName] BINARY: 308,846 values, 77,721B raw, 40,977B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 8B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 41,000B for [contig, contigLength] INT64: 308,846 values, 77,721B raw, 40,977B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 8B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 40,646B for [contig, contigMD5] BINARY: 308,846 values, 77,717B raw, 40,623B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 40,646B for [contig, referenceURL] BINARY: 308,846 values, 77,717B raw, 40,623B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 40,646B for [contig, assembly] BINARY: 308,846 values, 77,717B raw, 40,623B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 40,646B for [contig, species] BINARY: 308,846 values, 77,717B raw, 40,623B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 338,355B for [start] INT64: 308,846 values, 596,823B raw, 338,332B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [oldPosition] INT64: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 338,429B for [end] INT64: 308,846 values, 596,823B raw, 338,406B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 81,665B for [mapq] INT32: 308,846 values, 105,817B raw, 81,642B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 71 entries, 284B raw, 71B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 903,697B for [readName] BINARY: 308,846 values, 6,544,586B raw, 903,530B comp, 7 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 9,287,498B for [sequence] BINARY: 308,846 values, 32,120,231B raw, 9,286,756B comp, 31 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 10,872,009B for [qual] BINARY: 308,846 values, 32,120,231B raw, 10,871,267B comp, 31 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 141,052B for [cigar] BINARY: 308,846 values, 452,474B raw, 141,006B comp, 2 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 13,084 entries, 255,861B raw, 13,084B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [oldCigar] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 97B for [basesTrimmedFromStart] INT32: 308,846 values, 24B raw, 59B comp, 2 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 4B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 97B for [basesTrimmedFromEnd] INT32: 308,846 values, 24B raw, 59B comp, 2 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 4B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [readPaired] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [properPair] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 31,185B for [readMapped] BOOLEAN: 308,846 values, 38,614B raw, 31,162B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [mateMapped] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [firstOfPair] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [secondOfPair] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [failedVendorQualityChecks] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [duplicateRead] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 22,508B for [readNegativeStrand] BOOLEAN: 308,846 values, 38,614B raw, 22,485B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [mateNegativeStrand] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 31,185B for [primaryAlignment] BOOLEAN: 308,846 values, 38,614B raw, 31,162B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [secondaryAlignment] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 104B for [supplementaryAlignment] BOOLEAN: 308,846 values, 38,614B raw, 82B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mismatchingPositions] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [origQual] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 82,289B for [attributes] BINARY: 308,846 values, 142,604B raw, 81,848B comp, 21 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 16 entries, 1,110B raw, 16B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 147B for [recordGroupName] BINARY: 308,846 values, 36B raw, 90B comp, 3 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 9B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [recordGroupSequencingCenter] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [recordGroupDescription] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [recordGroupRunDateEpoch] INT64: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [recordGroupFlowOrder] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [recordGroupKeySequence] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 97B for [recordGroupLibrary] BINARY: 308,846 values, 24B raw, 59B comp, 2 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 6B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [recordGroupPredictedMedianInsertSize] INT32: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 196B for [recordGroupPlatform] BINARY: 308,846 values, 48B raw, 120B comp, 4 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 12B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 97B for [recordGroupPlatformUnit] BINARY: 308,846 values, 24B raw, 59B comp, 2 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 6B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 97B for [recordGroupSample] BINARY: 308,846 values, 24B raw, 59B comp, 2 pages, encodings: [RLE, BIT_PACKED, PLAIN_DICTIONARY], dic { 1 entries, 6B raw, 1B comp}
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateAlignmentStart] INT64: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateAlignmentEnd] INT64: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateContig, contigName] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateContig, contigLength] INT64: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateContig, contigMD5] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateContig, referenceURL] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateContig, assembly] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
Apr 14, 2015 6:57:00 PM INFO: parquet.hadoop.ColumnChunkPageWriteStore: written 47B for [mateContig, species] BINARY: 308,846 values, 8B raw, 28B comp, 1 pages, encodings: [RLE, BIT_PACKED, PLAIN]
@sebastienbonami (Member)

@davidonlaptop @flangelier Since you are using hostnames, have you tried setting the property dfs.namenode.datanode.registration.ip-hostname-check to true instead of false in hdfs-site.xml?

https://github.com/GELOG/adamcloud/blob/a66c69d6f78678b123dc7b9600e0d5d4281c3aae/env/macmini/hdfs-site.xml#L36-39
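
For reference, the suggested change would look roughly like this in hdfs-site.xml (a minimal sketch; the property name comes from the suggestion above, and the surrounding file contents are assumed):

    <property>
      <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
      <value>true</value>
    </property>

With this set to true, the NameNode requires that a registering DataNode's IP address resolve to a hostname (via reverse DNS if necessary), so hostnames need to resolve consistently across the cluster containers.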
