Error using spark (sharedRDD.savePairs(rdd)) with Ignite 1.5.0, but not ignite 1.4.0

Richard Pelavin
Error using spark (sharedRDD.savePairs(rdd)) with Ignite 1.5.0, but not ignite 1.4.0

I am using spark-shell (Spark 1.5.1) against a standalone Spark cluster, and I see the error at the bottom of this post on the Spark workers when I execute 'sharedRDD.savePairs(rdd)' in the Spark shell.

It works fine when I use Ignite 1.4.0.

This might be due to starting things differently. For Ignite 1.4.0 I use:

./bin/spark-shell \
  --packages org.apache.ignite:ignite-spark:1.4.0 \
  --jars /usr/lib/ignite/libs/ignite-query-objects.jar

while, after building 1.5.0 from source, I use:

BASE="/usr/lib/ignite/libs"
VER="1.5.0-SNAPSHOT"
IGNITE_JARS="${BASE}/ignite-query-objects.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-core-${VER}.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/cache-api-1.0.0.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-shmem-1.0.0.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-spring/*"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-indexing/*"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/optional/ignite-log4j/*"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/optional/ignite-spark/*"
./bin/spark-shell --jars $IGNITE_JARS
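One thing worth noting about the script above: `spark-shell --jars` expects a comma-separated list of jar paths, not a colon-separated classpath string, and Spark does not expand wildcard entries like `ignite-spring/*`. As a sketch only (the jar paths are assumptions), the list could be built with commas instead:

```shell
# Join any number of jar paths with commas, the format --jars expects.
join_jars() {
  local IFS=,
  echo "$*"
}

IGNITE_JARS=$(join_jars \
  /usr/lib/ignite/libs/ignite-core-1.5.0-SNAPSHOT.jar \
  /usr/lib/ignite/libs/cache-api-1.0.0.jar \
  /usr/lib/ignite/libs/ignite-shmem-1.0.0.jar)
echo "$IGNITE_JARS"
```

The shell would then be launched as `./bin/spark-shell --jars "$IGNITE_JARS"`; wildcard directories would need to be expanded (for example with `find`) before joining.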

and get the error below.

Any pointers would be appreciated.
Thanks,
Rich
---

Error stack:

15/11/16 19:54:34 ERROR TaskSetManager: Task 0 in stage 2.0 failed 4 times; aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 7, 10.0.0.206): java.lang.ClassNotFoundException: org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1
        at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:69)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1
        at java.lang.ClassLoader.findClass(ClassLoader.java:531)
        at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:34)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
        at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:64)
        ... 30 more
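For reference, the missing class in the trace above is the compiled closure Scala generates for `savePairs`, so the exception usually means the `ignite-spark` jar the driver used never reached the executors. One quick check (the jar path in the usage comment is hypothetical) is to list a jar's entries and look for the class:

```shell
# Check whether a jar contains a class whose name matches the given
# fragment (e.g. the generated savePairs closure).
jar_has_class() {
  unzip -l "$1" | grep -q "$2"
}

# Hypothetical usage against the jar shipped to the executors:
# jar_has_class ignite-spark-1.5.0-SNAPSHOT.jar 'anonfun.savePairs'
```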
Anton Vinogradov
Re: Error using spark (sharedRDD.savePairs(rdd)) with Ignite 1.5.0, but not ignite 1.4.0

Richard, 

Possible solutions:
1) Ignite was built from source incorrectly.
Could you please build 1.4 from source in the same way and check whether it works?

2) ignite-spark.jar is not on the classpath.
Put all required jars in one folder and check whether it works.

3) You have problems with classloaders.
Restart all JVMs related to ignite-spark usage.
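Putting all the required jars in one folder, as suggested in (2), can be scripted. This is only a sketch and the paths in the usage comments are assumptions:

```shell
# Copy every jar under a source tree ($1) into one folder ($2).
collect_jars() {
  mkdir -p "$2"
  find "$1" -name '*.jar' -exec cp {} "$2"/ \;
}

# Emit the jars in a folder as the comma-separated list --jars expects.
list_jars() {
  find "$1" -name '*.jar' | paste -s -d, -
}

# Usage (paths assumed):
# collect_jars /usr/lib/ignite/libs /tmp/ignite-jars
# ./bin/spark-shell --jars "$(list_jars /tmp/ignite-jars)"
```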

Also, please properly subscribe to the user list (that way we will not have to approve your emails manually). All you need to do is send an email to “[hidden email]” and follow the simple instructions in the reply.

Richard Pelavin
Re: Error using spark (sharedRDD.savePairs(rdd)) with Ignite 1.5.0, but not ignite 1.4.0

Thanks Anton,
You write:
>..
>2) ignite-spark.jar is not inside classpath. 
>Put all required jars to one folder and check it works.

I checked Ignite 1.4.0, starting the Spark shell with local jars, and saw the same problem; so it is not an Ignite 1.5.0 problem but rather a problem in my local environment.
- Rich

Anton Vinogradov
Re: Error using spark (sharedRDD.savePairs(rdd)) with Ignite 1.5.0, but not ignite 1.4.0

Richard,

Could you please give me more details about the working and non-working configurations?
