I guess that's just a warning, but I get the following error at runtime:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/sbt_8fbaec8d/target/54c6dd23/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/sbt_8fbaec8d/target/be4b3c56/slf4j-simple-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark’s default log4j profile: org/apache/spark/log4j-defaults.properties
[error] (run-main-0) java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.mapred.FileInputFormat
[error] java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.mapred.FileInputFormat
[error] at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:312)
[error] at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:204)
[error] at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
[error] at scala.Option.getOrElse(Option.scala:138)
[error] at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
[error] at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49)
[error] at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
[error] at scala.Option.getOrElse(Option.scala:138)
[error] at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
[error] at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49)
[error] at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
[error] at scala.Option.getOrElse(Option.scala:138)
[error] at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
[error] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
[error] at org.apache.spark.rdd.RDD.count(RDD.scala:1168)
[error] at org.apache.spark.graphx.GraphLoader$.edgeListFile(GraphLoader.scala:94)
[error] at example.MyPPRPlay$.main(MyPRDriver.scala:26)
[error] at example.MyPPRPlay.main(MyPRDriver.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.lang.reflect.Method.invoke(Method.java:498)
19/04/07 12:51:55 ERROR Utils: uncaught error in thread spark-listener-group-appStatus, stopping SparkContext
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:97)
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:83)
…
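
For what it's worth, my reading of the IllegalAccessError is that org.apache.hadoop.mapred.FileInputFormat was compiled against an older Guava, where the no-arg Stopwatch constructor was still public, while a Guava 17.0-or-newer jar (which made that constructor package-private) ends up first on the run classpath. The sketch below is the kind of build.sbt change I'd expect to work around it -- the Guava version is a guess on my part, and the slf4j-simple exclusion just assumes that is where the second logging binding in the warning above comes from:

// Sketch only, not verified against this Spark/Hadoop combination:
// force a pre-17 Guava so Hadoop's FileInputFormat can still call new Stopwatch(),
// and drop the extra SLF4J backend so only Spark's slf4j-log4j12 binding remains.
dependencyOverrides += "com.google.guava" % "guava" % "15.0"
libraryDependencies := libraryDependencies.value.map(_.exclude("org.slf4j", "slf4j-simple"))

Does that sound like the right direction, or am I misreading the error?
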
Thanks,
Eshwar