Error in running shared RDD in Ignite

mehdi sey

Hi. I have a program that writes into an Ignite RDD: it reads data from a
Spark RDD and caches it in an Ignite shared RDD. I run it from the command
line on Ubuntu Linux, but partway through execution I hit the error below. I
checked the Spark UI to see whether the job completed; it did not — it
failed. Why? I have attached the relevant piece of code, the command I ran,
and its output.
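
For context, the writer follows the standard shared-RDD pattern from the
ignite-and-spark-integration example project. This is a minimal sketch of
what my RDDWriter does, not the exact attached file; the value range and the
(Int, Int) pair type are illustrative, and the config path is the one that
appears in the log below:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.ignite.spark.{IgniteContext, IgniteRDD}

object RDDWriter extends App {
  // Plain Spark context; the master URL comes from spark-submit.
  val sc = new SparkContext(new SparkConf().setAppName("RDDWriter"))

  // Ignite context configured from the Spring XML file shipped with Ignite.
  val ic = new IgniteContext(sc,
    "/usr/local/apache-ignite-fabric-2.6.0-bin/examples/config/spark/example-shared-rdd.xml")

  // Shared RDD backed by the "sharedRDD" cache seen in the log.
  val sharedRDD: IgniteRDD[Int, Int] = ic.fromCache("sharedRDD")

  // Write a Spark RDD of pairs into the Ignite cache; the 10 partitions
  // match the "10 output partitions" reported by the DAGScheduler below.
  sharedRDD.savePairs(sc.parallelize(1 to 1000, 10).map(i => (i, i)))

  ic.close(true)
  sc.stop()
}
```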

$SPARK_HOME/bin/spark-submit --class "com.gridgain.RDDWriter" --master
spark://linux-client:7077 ~/spark\ and\ ignite\
issue/ignite-and-spark-integration-master/ignite-rdd/ignite-spark-scala/target/ignite-spark-scala-1.0.jar
2019-01-05 11:47:02 WARN  Utils:66 - Your hostname, linux-client resolves to
a loopback address: 127.0.1.1, but we couldn't find any external IP address!
2019-01-05 11:47:02 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind
to another address
2019-01-05 11:47:03 WARN  NativeCodeLoader:62 - Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
2019-01-05 11:47:03 INFO  SparkContext:54 - Running Spark version 2.4.0
2019-01-05 11:47:03 INFO  SparkContext:54 - Submitted application: RDDWriter
2019-01-05 11:47:03 INFO  SecurityManager:54 - Changing view acls to: mehdi
2019-01-05 11:47:03 INFO  SecurityManager:54 - Changing modify acls to:
mehdi
2019-01-05 11:47:03 INFO  SecurityManager:54 - Changing view acls groups to:
2019-01-05 11:47:03 INFO  SecurityManager:54 - Changing modify acls groups
to:
2019-01-05 11:47:03 INFO  SecurityManager:54 - SecurityManager:
authentication disabled; ui acls disabled; users  with view permissions:
Set(mehdi); groups with view permissions: Set(); users  with modify
permissions: Set(mehdi); groups with modify permissions: Set()
2019-01-05 11:47:03 WARN  MacAddressUtil:136 - Failed to find a usable
hardware address from the network interfaces; using random bytes:
88:26:00:23:5d:50:a0:61
2019-01-05 11:47:03 INFO  Utils:54 - Successfully started service
'sparkDriver' on port 36233.
2019-01-05 11:47:03 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-01-05 11:47:03 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-01-05 11:47:03 INFO  BlockManagerMasterEndpoint:54 - Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
2019-01-05 11:47:03 INFO  BlockManagerMasterEndpoint:54 -
BlockManagerMasterEndpoint up
2019-01-05 11:47:03 INFO  DiskBlockManager:54 - Created local directory at
/tmp/blockmgr-6e47832e-855a-4305-a293-662379733b7f
2019-01-05 11:47:03 INFO  MemoryStore:54 - MemoryStore started with capacity
366.3 MB
2019-01-05 11:47:03 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-01-05 11:47:03 INFO  log:192 - Logging initialized @2024ms
2019-01-05 11:47:04 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build
timestamp: unknown, git hash: unknown
2019-01-05 11:47:04 INFO  Server:419 - Started @2108ms
2019-01-05 11:47:04 INFO  AbstractConnector:278 - Started
ServerConnector@5ba745bc{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-01-05 11:47:04 INFO  Utils:54 - Successfully started service 'SparkUI'
on port 4040.
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@606fc505{/jobs,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2c30b71f{/jobs/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1d81e101{/jobs/job,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@bf71cec{/jobs/job/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@22d6cac2{/stages,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@30cdae70{/stages/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1654a892{/stages/stage,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6c000e0c{/stages/stage/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5f233b26{/stages/pool,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@44f9779c{/stages/pool/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6974a715{/storage,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5e8a459{/storage/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@43d455c9{/storage/rdd,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4c9e9fb8{/storage/rdd/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@9ec531{/environment,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@40147317{/environment/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@210f0cc1{/executors,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@19542407{/executors/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6f95cd51{/executors/threadDump,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@c7a977f{/executors/threadDump/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@55caeb35{/static,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@3ae66c85{/,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@16943e88{/api,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@3961a41a{/jobs/job/kill,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5a4ed68f{/stages/stage/kill,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started
at http://linux-client:4040
2019-01-05 11:47:04 INFO  SparkContext:54 - Added JAR
file:/home/mehdi/spark%20and%20ignite%20issue/ignite-and-spark-integration-master/ignite-rdd/ignite-spark-scala/target/ignite-spark-scala-1.0.jar
at spark://linux-client:36233/jars/ignite-spark-scala-1.0.jar with timestamp
1546676224149
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Connecting
to master spark://linux-client:7077...
2019-01-05 11:47:04 INFO  TransportClientFactory:267 - Successfully created
connection to linux-client/127.0.1.1:7077 after 40 ms (0 ms spent in
bootstraps)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Connected to Spark
cluster with app ID app-20190105114704-0003
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/0 on worker-20190105103259-127.0.1.1-43911
(127.0.1.1:43911) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/0 on hostPort 127.0.1.1:43911 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/1 on worker-20190105103304-127.0.1.1-44569
(127.0.1.1:44569) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/1 on hostPort 127.0.1.1:44569 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/2 on worker-20190105103301-127.0.1.1-34465
(127.0.1.1:34465) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/2 on hostPort 127.0.1.1:34465 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/3 on worker-20190105103256-127.0.1.1-46653
(127.0.1.1:46653) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/3 on hostPort 127.0.1.1:46653 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  Utils:54 - Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 41343.
2019-01-05 11:47:04 INFO  NettyBlockTransferService:54 - Server created on
linux-client:41343
2019-01-05 11:47:04 INFO  BlockManager:54 - Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/2 is now RUNNING
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/1 is now RUNNING
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/3 is now RUNNING
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/0 is now RUNNING
2019-01-05 11:47:04 INFO  BlockManagerMaster:54 - Registering BlockManager
BlockManagerId(driver, linux-client, 41343, None)
2019-01-05 11:47:04 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager linux-client:41343 with 366.3 MB RAM, BlockManagerId(driver,
linux-client, 41343, None)
2019-01-05 11:47:04 INFO  BlockManagerMaster:54 - Registered BlockManager
BlockManagerId(driver, linux-client, 41343, None)
2019-01-05 11:47:04 INFO  BlockManager:54 - Initialized BlockManager:
BlockManagerId(driver, linux-client, 41343, None)
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@70e02081{/metrics/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:05 INFO  EventLoggingListener:54 - Logging events to
file:/tmp/spark-events/app-20190105114704-0003
2019-01-05 11:47:05 INFO  StandaloneSchedulerBackend:54 - SchedulerBackend
is ready for scheduling beginning after reached minRegisteredResourcesRatio:
0.0
2019-01-05 11:47:06 INFO  XmlBeanDefinitionReader:317 - Loading XML bean
definitions from URL
[file:/usr/local/apache-ignite-fabric-2.6.0-bin/examples/config/spark/example-shared-rdd.xml]
2019-01-05 11:47:06 INFO  GenericApplicationContext:583 - Refreshing
org.springframework.context.support.GenericApplicationContext@2ec3633f:
startup date [Sat Jan 05 11:47:06 IRST 2019]; root of context hierarchy
Can't load log handler "org.apache.ignite.logger.java.JavaLoggerFileHandler"
java.lang.ClassNotFoundException:
org.apache.ignite.logger.java.JavaLoggerFileHandler
java.lang.ClassNotFoundException:
org.apache.ignite.logger.java.JavaLoggerFileHandler
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.util.logging.LogManager$5.run(LogManager.java:965)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.util.logging.LogManager.loadLoggerHandlers(LogManager.java:958)
        at
java.util.logging.LogManager.initializeGlobalHandlers(LogManager.java:1578)
        at java.util.logging.LogManager.access$1500(LogManager.java:145)
        at
java.util.logging.LogManager$RootLogger.accessCheckedHandlers(LogManager.java:1667)
        at java.util.logging.Logger.getHandlers(Logger.java:1777)
        at
org.apache.ignite.logger.java.JavaLogger.findHandler(JavaLogger.java:411)
        at org.apache.ignite.logger.java.JavaLogger.configure(JavaLogger.java:241)
        at org.apache.ignite.logger.java.JavaLogger.<init>(JavaLogger.java:181)
        at org.apache.ignite.logger.java.JavaLogger.<init>(JavaLogger.java:135)
        at
org.apache.ignite.internal.LongJVMPauseDetector.<clinit>(LongJVMPauseDetector.java:44)
        at org.apache.ignite.internal.IgniteKernal.<clinit>(IgniteKernal.java:300)
        at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:2009)
        at
org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1723)
        at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:1151)
        at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:671)
        at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:611)
        at org.apache.ignite.Ignition.getOrStart(Ignition.java:419)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
        at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:63)
        at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:99)
        at
com.gridgain.RDDWriter$.delayedEndpoint$com$gridgain$RDDWriter$1(SparkIgniteTest.scala:26)
        at com.gridgain.RDDWriter$delayedInit$body.apply(SparkIgniteTest.scala:23)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at com.gridgain.RDDWriter$.main(SparkIgniteTest.scala:23)
        at com.gridgain.RDDWriter.main(SparkIgniteTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2019-01-05 11:47:07 INFO  IgniteKernal:566 -

>>>    __________  ________________  
>>>   /  _/ ___/ |/ /  _/_  __/ __/  
>>>  _/ // (7 7    // /  / / / _/    
>>> /___/\___/_/|_/___/ /_/ /___/  
>>>
>>> ver. 2.6.0#19700101-sha1:DEV
>>> 2018 Copyright(C) Apache Software Foundation
>>>
>>> Ignite documentation: http://ignite.apache.org

2019-01-05 11:47:07 INFO  IgniteKernal:566 - Config URL: n/a
2019-01-05 11:47:07 INFO  IgniteKernal:566 - IgniteConfiguration
[igniteInstanceName=null, pubPoolSize=8, svcPoolSize=8, callbackPoolSize=8,
stripedPoolSize=8, sysPoolSize=8, mgmtPoolSize=4, igfsPoolSize=8,
dataStreamerPoolSize=8, utilityCachePoolSize=8,
utilityCacheKeepAliveTime=60000, p2pPoolSize=2, qryPoolSize=8,
igniteHome=/usr/local/apache-ignite-fabric-2.6.0-bin/,
igniteWorkDir=/usr/local/apache-ignite-fabric-2.6.0-bin/work,
mbeanSrv=com.sun.jmx.mbeanserver.JmxMBeanServer@76b47204,
nodeId=f512a374-f6b6-4918-9790-6183a43f8064,
marsh=org.apache.ignite.internal.binary.BinaryMarshaller@1f1cddf3,
marshLocJobs=false, daemon=false, p2pEnabled=false, netTimeout=5000,
sndRetryDelay=1000, sndRetryCnt=3, metricsHistSize=10000,
metricsUpdateFreq=2000, metricsExpTime=9223372036854775807,
discoSpi=TcpDiscoverySpi [addrRslvr=null, sockTimeout=0, ackTimeout=0,
marsh=null, reconCnt=10, reconDelay=2000, maxAckTimeout=600000,
forceSrvMode=false, clientReconnectDisabled=false, internalLsnr=null],
segPlc=STOP, segResolveAttempts=2, waitForSegOnStart=true,
allResolversPassReq=true, segChkFreq=10000, commSpi=TcpCommunicationSpi
[connectGate=null, connPlc=null, enableForcibleNodeKill=false,
enableTroubleshootingLog=false,
srvLsnr=org.apache.ignite.spi.communication.tcp.TcpCommunicationSpi$2@3fd05b3e,
locAddr=null, locHost=null, locPort=47100, locPortRange=100, shmemPort=-1,
directBuf=true, directSndBuf=false, idleConnTimeout=600000,
connTimeout=5000, maxConnTimeout=600000, reconCnt=10, sockSndBuf=32768,
sockRcvBuf=32768, msgQueueLimit=0, slowClientQueueLimit=0, nioSrvr=null,
shmemSrv=null, usePairedConnections=false, connectionsPerNode=1,
tcpNoDelay=true, filterReachableAddresses=false, ackSndThreshold=32,
unackedMsgsBufSize=0, sockWriteTimeout=2000, lsnr=null, boundTcpPort=-1,
boundTcpShmemPort=-1, selectorsCnt=4, selectorSpins=0, addrRslvr=null,
ctxInitLatch=java.util.concurrent.CountDownLatch@4eb9f2af[Count = 1],
stopping=false,
metricsLsnr=org.apache.ignite.spi.communication.tcp.TcpCommunicationMetricsListener@4a481728],
evtSpi=org.apache.ignite.spi.eventstorage.NoopEventStorageSpi@4ace284d,
colSpi=NoopCollisionSpi [], deploySpi=LocalDeploymentSpi [lsnr=null],
indexingSpi=org.apache.ignite.spi.indexing.noop.NoopIndexingSpi@64deb58f,
addrRslvr=null, clientMode=true, rebalanceThreadPoolSize=1,
txCfg=org.apache.ignite.configuration.TransactionConfiguration@2b1cd7bc,
cacheSanityCheckEnabled=true, discoStartupDelay=60000, deployMode=SHARED,
p2pMissedCacheSize=100, locHost=null, timeSrvPortBase=31100,
timeSrvPortRange=100, failureDetectionTimeout=10000,
clientFailureDetectionTimeout=30000, metricsLogFreq=60000, hadoopCfg=null,
connectorCfg=org.apache.ignite.configuration.ConnectorConfiguration@33379242,
odbcCfg=null, warmupClos=null, atomicCfg=AtomicConfiguration
[seqReserveSize=1000, cacheMode=PARTITIONED, backups=1, aff=null,
grpName=null], classLdr=null, sslCtxFactory=null, platformCfg=null,
binaryCfg=null, memCfg=null, pstCfg=null, dsCfg=null, activeOnStart=true,
autoActivation=true, longQryWarnTimeout=3000, sqlConnCfg=null,
cliConnCfg=ClientConnectorConfiguration [host=null, port=10800,
portRange=100, sockSndBufSize=0, sockRcvBufSize=0, tcpNoDelay=true,
maxOpenCursorsPerConn=128, threadPoolSize=8, idleTimeout=0,
jdbcEnabled=true, odbcEnabled=true, thinCliEnabled=true, sslEnabled=false,
useIgniteSslCtxFactory=true, sslClientAuth=false, sslCtxFactory=null],
authEnabled=false, failureHnd=null, commFailureRslvr=null]
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Daemon mode: off
2019-01-05 11:47:07 INFO  IgniteKernal:566 - OS: Linux 4.15.0-43-generic
amd64
2019-01-05 11:47:07 INFO  IgniteKernal:566 - OS user: mehdi
2019-01-05 11:47:07 INFO  IgniteKernal:566 - PID: 7165
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Language runtime: Scala ver.
2.11.12
2019-01-05 11:47:07 INFO  IgniteKernal:566 - VM information: Java(TM) SE
Runtime Environment 1.8.0_192-ea-b04 Oracle Corporation Java HotSpot(TM)
64-Bit Server VM 25.192-b04
2019-01-05 11:47:07 INFO  IgniteKernal:566 - VM total memory: 0.89GB
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Remote Management [restart:
off, REST: off, JMX (remote: off)]
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Logger: Log4JLogger
[quiet=false, config=null]
2019-01-05 11:47:07 INFO  IgniteKernal:566 -
IGNITE_HOME=/usr/local/apache-ignite-fabric-2.6.0-bin/
2019-01-05 11:47:07 INFO  IgniteKernal:566 - VM arguments: [-Xmx1g]
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Configured caches [in
'sysMemPlc' dataRegion: ['ignite-sys-cache'], in 'null' dataRegion:
['sharedRDD']]
2019-01-05 11:47:07 WARN  GridDiagnostic:571 - Default local host is a
loopback address. This can be a sign of potential network configuration
problem.
2019-01-05 11:47:07 INFO  IgniteKernal:566 - 3-rd party licenses can be
found at: /usr/local/apache-ignite-fabric-2.6.0-bin//libs/licenses
2019-01-05 11:47:07 WARN  GridDiagnostic:571 - No live network interfaces
detected. If IP-multicast discovery is used - make sure to add 127.0.0.1 as
a local address.
2019-01-05 11:47:07 WARN  GridDiagnostic:571 - Initial heap size is 126MB
(should be no less than 512MB, use -Xms512m -Xmx512m).
2019-01-05 11:47:08 INFO  IgnitePluginProcessor:566 - Configured plugins:
2019-01-05 11:47:08 INFO  IgnitePluginProcessor:566 -   ^-- None
2019-01-05 11:47:08 INFO  IgnitePluginProcessor:566 -
2019-01-05 11:47:08 INFO  FailureProcessor:566 - Configured failure handler:
[hnd=StopNodeOrHaltFailureHandler [tryStop=false, timeout=0]]
2019-01-05 11:47:08 INFO  TcpCommunicationSpi:566 - Successfully bound
communication NIO server to TCP port [port=47101, locHost=0.0.0.0/0.0.0.0,
selectorsCnt=4, selectorSpins=0, pairedConn=false]
2019-01-05 11:47:08 WARN  TcpCommunicationSpi:571 - Message queue limit is
set to 0 which may lead to potential OOMEs when running cache operations in
FULL_ASYNC or PRIMARY_SYNC modes due to message queues growth on sender and
receiver sides.
2019-01-05 11:47:08 WARN  NoopCheckpointSpi:571 - Checkpoints are disabled
(to enable configure any GridCheckpointSpi implementation)
2019-01-05 11:47:08 WARN  GridCollisionManager:571 - Collision resolution is
disabled (all jobs will be activated upon arrival).
2019-01-05 11:47:08 INFO  IgniteKernal:566 - Security status
[authentication=off, tls/ssl=off]
2019-01-05 11:47:09 INFO  ClientListenerProcessor:566 - Client connector
processor has started on TCP port 10801
2019-01-05 11:47:09 INFO  GridRestProcessor:566 - REST protocols do not
start on client node. To start the protocols on client node set
'-DIGNITE_REST_START_ON_CLIENT=true' system property.
2019-01-05 11:47:09 INFO  IgniteKernal:566 - Non-loopback local IPs: N/A
2019-01-05 11:47:09 INFO  IgniteKernal:566 - Enabled local MACs: N/A
2019-01-05 11:47:09 WARN  IgniteKernal:571 - Ignite is starting on loopback
address... Only nodes on the same physical computer can participate in
topology.
2019-01-05 11:47:09 WARN  TcpDiscoveryMulticastIpFinder:571 - Failed to send
multicast message (is multicast enabled on this node?).
2019-01-05 11:47:10 INFO  time:566 - Started exchange init
[topVer=AffinityTopologyVersion [topVer=4, minorTopVer=0], crd=false,
evt=NODE_JOINED, evtNode=f512a374-f6b6-4918-9790-6183a43f8064,
customEvt=null, allowMerge=true]
2019-01-05 11:47:11 INFO  GridCacheProcessor:566 - Started cache
[name=ignite-sys-cache, id=-2100569601, memoryPolicyName=sysMemPlc,
mode=REPLICATED, atomicity=TRANSACTIONAL, backups=2147483647]
2019-01-05 11:47:11 INFO  TcpCommunicationSpi:566 - Established outgoing
communication connection [locAddr=/0:0:0:0:0:0:0:1:43032,
rmtAddr=/0:0:0:0:0:0:0:1%lo:47100]
2019-01-05 11:47:11 INFO  GridCacheProcessor:566 - Started cache
[name=sharedRDD, id=-1581581875, memoryPolicyName=null, mode=PARTITIONED,
atomicity=ATOMIC, backups=1]
2019-01-05 11:47:11 INFO  time:566 - Finished exchange init
[topVer=AffinityTopologyVersion [topVer=4, minorTopVer=0], crd=false]
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49300) with ID 0
2019-01-05 11:47:11 INFO  GridDhtPartitionsExchangeFuture:566 - Received
full message, will finish exchange
[node=0a01ccfc-6d3a-4490-bdd8-90cf3b71928d, resVer=AffinityTopologyVersion
[topVer=4, minorTopVer=0]]
2019-01-05 11:47:11 INFO  GridDhtPartitionsExchangeFuture:566 - Finish
exchange future [startVer=AffinityTopologyVersion [topVer=4, minorTopVer=0],
resVer=AffinityTopologyVersion [topVer=4, minorTopVer=0], err=null]
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49296) with ID 3
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49294) with ID 2
2019-01-05 11:47:11 INFO  IgniteKernal:566 - Performance suggestions for
grid  (fix if possible)
2019-01-05 11:47:11 INFO  IgniteKernal:566 - To disable, set
-DIGNITE_PERFORMANCE_SUGGESTIONS_DISABLED=true
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Enable G1 Garbage
Collector (add '-XX:+UseG1GC' to JVM options)
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Set max direct memory
size if getting 'OOME: Direct buffer memory' (add
'-XX:MaxDirectMemorySize=<size>[g|G|m|M|k|K]' to JVM options)
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Disable processing of
calls to System.gc() (add '-XX:+DisableExplicitGC' to JVM options)
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Decrease number of
backups (set 'backups' to 0)
2019-01-05 11:47:11 INFO  IgniteKernal:566 - Refer to this page for more
performance suggestions:
https://apacheignite.readme.io/docs/jvm-and-system-tuning
2019-01-05 11:47:11 INFO  IgniteKernal:566 -
2019-01-05 11:47:11 INFO  IgniteKernal:566 - To start Console Management &
Monitoring run ignitevisorcmd.{sh|bat}
2019-01-05 11:47:11 INFO  IgniteKernal:566 -
2019-01-05 11:47:11 INFO  IgniteKernal:566 -

>>> +---------------------------------+
>>> Ignite ver. 2.6.0#19700101-sha1:DEV
>>> +---------------------------------+
>>> OS name: Linux 4.15.0-43-generic amd64
>>> CPU(s): 8
>>> Heap: 0.89GB
>>> VM name: 7165@linux-client
>>> Local node [ID=F512A374-F6B6-4918-9790-6183A43F8064, order=4,
>>> clientMode=true]
>>> Local node addresses: [0:0:0:0:0:0:0:1%lo, 127.0.0.1]
>>> Local ports: TCP:10801 TCP:47101 UDP:47400

2019-01-05 11:47:11 INFO  GridDiscoveryManager:566 - Topology snapshot
[ver=4, servers=1, clients=1, CPUs=8, offheap=1.6GB, heap=1.9GB]
2019-01-05 11:47:11 INFO  GridDiscoveryManager:566 -   ^-- Node
[id=F512A374-F6B6-4918-9790-6183A43F8064, clusterState=ACTIVE]
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49298) with ID 1
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:38343 with 127.2 MB RAM, BlockManagerId(1, 127.0.1.1,
38343, None)
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:42897 with 127.2 MB RAM, BlockManagerId(2, 127.0.1.1,
42897, None)
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:32861 with 127.2 MB RAM, BlockManagerId(3, 127.0.1.1,
32861, None)
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:32913 with 127.2 MB RAM, BlockManagerId(0, 127.0.1.1,
32913, None)
2019-01-05 11:47:12 INFO  SparkContext:54 - Starting job: foreachPartition
at IgniteRDD.scala:233
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Got job 0 (foreachPartition at
IgniteRDD.scala:233) with 10 output partitions
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Final stage: ResultStage 0
(foreachPartition at IgniteRDD.scala:233)
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Parents of final stage: List()
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Missing parents: List()
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Submitting ResultStage 0
(MapPartitionsRDD[2] at map at SparkIgniteTest.scala:28), which has no
missing parents
2019-01-05 11:47:12 INFO  MemoryStore:54 - Block broadcast_0 stored as
values in memory (estimated size 4.6 KB, free 366.3 MB)
2019-01-05 11:47:12 INFO  MemoryStore:54 - Block broadcast_0_piece0 stored
as bytes in memory (estimated size 2.6 KB, free 366.3 MB)
2019-01-05 11:47:12 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on linux-client:41343 (size: 2.6 KB, free: 366.3 MB)
2019-01-05 11:47:12 INFO  SparkContext:54 - Created broadcast 0 from
broadcast at DAGScheduler.scala:1161
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Submitting 10 missing tasks from
ResultStage 0 (MapPartitionsRDD[2] at map at SparkIgniteTest.scala:28)
(first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2019-01-05 11:47:12 INFO  TaskSchedulerImpl:54 - Adding task set 0.0 with 10
tasks
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 0.0 in stage 0.0
(TID 0, 127.0.1.1, executor 1, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 1.0 in stage 0.0
(TID 1, 127.0.1.1, executor 0, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 2.0 in stage 0.0
(TID 2, 127.0.1.1, executor 2, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 3.0 in stage 0.0
(TID 3, 127.0.1.1, executor 3, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 4.0 in stage 0.0
(TID 4, 127.0.1.1, executor 1, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 5.0 in stage 0.0
(TID 5, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 6.0 in stage 0.0
(TID 6, 127.0.1.1, executor 2, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 7.0 in stage 0.0
(TID 7, 127.0.1.1, executor 3, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:38343 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:32913 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:32861 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:42897 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 8.0 in stage 0.0
(TID 8, 127.0.1.1, executor 1, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 WARN  TaskSetManager:66 - Lost task 0.0 in stage 0.0
(TID 0, 127.0.1.1, executor 1): java.lang.NoClassDefFoundError: Could not
initialize class org.apache.ignite.internal.util.IgniteUtils
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)

2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 0.1 in stage 0.0
(TID 9, 127.0.1.1, executor 1, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 WARN  TaskSetManager:66 - Lost task 4.0 in stage 0.0
(TID 4, 127.0.1.1, executor 1): java.lang.ExceptionInInitializerError
        at
org.apache.ignite.internal.util.IgniteUtils.<clinit>(IgniteUtils.java:769)
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.lang.RuntimeException: jdk.internal.misc.JavaNioAccess class
is unavailable.
        at
org.apache.ignite.internal.util.GridUnsafe.javaNioAccessObject(GridUnsafe.java:1453)
        at org.apache.ignite.internal.util.GridUnsafe.<clinit>(GridUnsafe.java:112)
        ... 17 more
Caused by: java.lang.IllegalAccessException: class
org.apache.ignite.internal.util.GridUnsafe cannot access class
jdk.internal.misc.SharedSecrets (in module java.base) because module
java.base does not export jdk.internal.misc to unnamed module @2c5781b6
        at
java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:360)
        at
java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:589)
        at java.base/java.lang.reflect.Method.invoke(Method.java:556)
        at
org.apache.ignite.internal.util.GridUnsafe.javaNioAccessObject(GridUnsafe.java:1450)
        ... 18 more

2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 4.1 in stage 0.0
(TID 10, 127.0.1.1, executor 3, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 3.0 in stage 0.0
(TID 3) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 1]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 3.1 in stage 0.0
(TID 11, 127.0.1.1, executor 3, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 7.0 in stage 0.0
(TID 7) on 127.0.1.1, executor 3: java.lang.ExceptionInInitializerError
(null) [duplicate 1]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 7.1 in stage 0.0
(TID 12, 127.0.1.1, executor 2, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 2.0 in stage 0.0
(TID 2) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 2]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 2.1 in stage 0.0
(TID 13, 127.0.1.1, executor 1, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 9.0 in stage 0.0
(TID 14, 127.0.1.1, executor 1, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 0.1 in stage 0.0
(TID 9) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 3]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 8.0 in stage 0.0
(TID 8) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 4]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 8.1 in stage 0.0
(TID 15, 127.0.1.1, executor 0, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 1.0 in stage 0.0
(TID 1) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 5]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 1.1 in stage 0.0
(TID 16, 127.0.1.1, executor 3, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 0.2 in stage 0.0
(TID 17, 127.0.1.1, executor 2, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 4.1 in stage 0.0
(TID 10) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 6]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 5.0 in stage 0.0
(TID 5) on 127.0.1.1, executor 0: java.lang.ExceptionInInitializerError
(null) [duplicate 2]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 5.1 in stage 0.0
(TID 18, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 6.0 in stage 0.0
(TID 6) on 127.0.1.1, executor 2: java.lang.ExceptionInInitializerError
(null) [duplicate 3]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 6.1 in stage 0.0
(TID 19, 127.0.1.1, executor 2, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 7.1 in stage 0.0
(TID 12) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 7]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 9.0 in stage 0.0
(TID 14) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 8]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 9.1 in stage 0.0
(TID 20, 127.0.1.1, executor 1, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 7.2 in stage 0.0
(TID 21, 127.0.1.1, executor 1, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 2.1 in stage 0.0
(TID 13) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 9]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 2.2 in stage 0.0
(TID 22, 127.0.1.1, executor 2, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 0.2 in stage 0.0
(TID 17) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 10]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 0.3 in stage 0.0
(TID 23, 127.0.1.1, executor 3, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 3.1 in stage 0.0
(TID 11) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 11]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 5.1 in stage 0.0
(TID 18) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 12]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 5.2 in stage 0.0
(TID 24, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 3.2 in stage 0.0
(TID 25, 127.0.1.1, executor 1, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 9.1 in stage 0.0
(TID 20) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 13]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 9.2 in stage 0.0
(TID 26, 127.0.1.1, executor 3, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 4.2 in stage 0.0
(TID 27, 127.0.1.1, executor 3, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 0.3 in stage 0.0
(TID 23) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 14]
2019-01-05 11:47:14 ERROR TaskSetManager:70 - Task 0 in stage 0.0 failed 4
times; aborting job
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 1.1 in stage 0.0
(TID 16) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 15]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 7.2 in stage 0.0
(TID 21) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 16]
2019-01-05 11:47:14 INFO  TaskSchedulerImpl:54 - Cancelling stage 0
2019-01-05 11:47:14 INFO  TaskSchedulerImpl:54 - Killing all running tasks
in stage 0: Stage cancelled
2019-01-05 11:47:14 INFO  TaskSchedulerImpl:54 - Stage 0 was cancelled
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 8.1 in stage 0.0
(TID 15) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 17]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 2.2 in stage 0.0
(TID 22) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 18]
2019-01-05 11:47:14 INFO  DAGScheduler:54 - ResultStage 0 (foreachPartition
at IgniteRDD.scala:233) failed in 2.358 s due to Job aborted due to stage
failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task
0.3 in stage 0.0 (TID 23, 127.0.1.1, executor 3):
java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)

Driver stacktrace:
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 3.2 in stage 0.0
(TID 25) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 19]
2019-01-05 11:47:15 INFO  DAGScheduler:54 - Job 0 failed: foreachPartition
at IgniteRDD.scala:233, took 2.443962 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure:
Lost task 0.3 in stage 0.0 (TID 23, 127.0.1.1, executor 3):
java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)

Driver stacktrace:
        at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
        at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at scala.Option.foreach(Option.scala:257)
        at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:933)
        at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
        at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:933)
        at org.apache.ignite.spark.IgniteRDD.savePairs(IgniteRDD.scala:233)
        at
com.gridgain.RDDWriter$.delayedEndpoint$com$gridgain$RDDWriter$1(SparkIgniteTest.scala:28)
        at com.gridgain.RDDWriter$delayedInit$body.apply(SparkIgniteTest.scala:23)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at com.gridgain.RDDWriter$.main(SparkIgniteTest.scala:23)
        at com.gridgain.RDDWriter.main(SparkIgniteTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.lang.Thread.run(Thread.java:844)
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 6.1 in stage 0.0
(TID 19) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 20]
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 4.2 in stage 0.0
(TID 27) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 21]
2019-01-05 11:47:15 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 9.2 in stage 0.0
(TID 26) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 22]
2019-01-05 11:47:15 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 5.2 in stage 0.0
(TID 24) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 23]
2019-01-05 11:47:15 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool
2019-01-05 11:48:11 INFO  IgniteKernal:566 -
Metrics for local node (to disable set 'metricsLogFrequency' to 0)
    ^-- Node [id=f512a374, uptime=00:01:00.027]
    ^-- H/N/C [hosts=1, nodes=2, CPUs=8]
    ^-- CPU [cur=0.43%, avg=0.85%, GC=0%]
    ^-- PageMemory [pages=0]
    ^-- Heap [used=392MB, free=56.85%, comm=602MB]
    ^-- Non heap [used=83MB, free=-1%, comm=85MB]
    ^-- Outbound messages queue [size=0]
    ^-- Public thread pool [active=0, idle=0, qSize=0]
    ^-- System thread pool [active=0, idle=0, qSize=0]
2019-01-05 11:48:20 INFO  GridUpdateNotifier:566 - Update status is not
available.

<http://apache-ignite-users.70518.x6.nabble.com/file/t2160/Screenshot_from_2019-01-05_12-21-06.png>







--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/
ilya.kasnacheev

Re: error in running shared rdd in ignite

Hello!

I guess you should add the following JVM options, since you are running under Java 9:
--add-exports=java.base/jdk.internal.misc=ALL-UNNAMED
--add-exports=java.base/sun.nio.ch=ALL-UNNAMED

If you run the application from your IDE, specify these JVM options in its run configuration as well.
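The failing frames in your log sit inside executor tasks, so the options have to reach the executor JVMs, not only the driver. A sketch of how the spark-submit command from the original post could pass them through Spark's standard `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions` properties — assuming these two flags are sufficient for your Ignite/JDK combination:

```shell
# Sketch: pass the suggested --add-exports flags to both the driver and the
# executor JVMs. Class name and jar path are taken from the original post.
J9_OPTS="--add-exports=java.base/jdk.internal.misc=ALL-UNNAMED --add-exports=java.base/sun.nio.ch=ALL-UNNAMED"

"$SPARK_HOME"/bin/spark-submit \
  --class "com.gridgain.RDDWriter" \
  --master spark://linux-client:7077 \
  --conf "spark.driver.extraJavaOptions=$J9_OPTS" \
  --conf "spark.executor.extraJavaOptions=$J9_OPTS" \
  ~/spark\ and\ ignite\ issue/ignite-and-spark-integration-master/ignite-rdd/ignite-spark-scala/target/ignite-spark-scala-1.0.jar
```

Any standalone Ignite server nodes would need the same flags in their own JVM options as well.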

Regards,
--
Ilya Kasnacheev


Sat, Jan 5, 2019 at 11:59, mehdi sey <[hidden email]>:
2019-01-05 11:47:03 WARN  MacAddressUtil:136 - Failed to find a usable
hardware address from the network interfaces; using random bytes:
88:26:00:23:5d:50:a0:61
2019-01-05 11:47:03 INFO  Utils:54 - Successfully started service
'sparkDriver' on port 36233.
2019-01-05 11:47:03 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-01-05 11:47:03 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-01-05 11:47:03 INFO  BlockManagerMasterEndpoint:54 - Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
2019-01-05 11:47:03 INFO  BlockManagerMasterEndpoint:54 -
BlockManagerMasterEndpoint up
2019-01-05 11:47:03 INFO  DiskBlockManager:54 - Created local directory at
/tmp/blockmgr-6e47832e-855a-4305-a293-662379733b7f
2019-01-05 11:47:03 INFO  MemoryStore:54 - MemoryStore started with capacity
366.3 MB
2019-01-05 11:47:03 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-01-05 11:47:03 INFO  log:192 - Logging initialized @2024ms
2019-01-05 11:47:04 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build
timestamp: unknown, git hash: unknown
2019-01-05 11:47:04 INFO  Server:419 - Started @2108ms
2019-01-05 11:47:04 INFO  AbstractConnector:278 - Started
ServerConnector@5ba745bc{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-01-05 11:47:04 INFO  Utils:54 - Successfully started service 'SparkUI'
on port 4040.
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@606fc505{/jobs,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2c30b71f{/jobs/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1d81e101{/jobs/job,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@bf71cec{/jobs/job/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@22d6cac2{/stages,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@30cdae70{/stages/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1654a892{/stages/stage,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6c000e0c{/stages/stage/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5f233b26{/stages/pool,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@44f9779c{/stages/pool/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6974a715{/storage,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5e8a459{/storage/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@43d455c9{/storage/rdd,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4c9e9fb8{/storage/rdd/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@9ec531{/environment,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@40147317{/environment/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@210f0cc1{/executors,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@19542407{/executors/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6f95cd51{/executors/threadDump,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@c7a977f{/executors/threadDump/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@55caeb35{/static,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@3ae66c85{/,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@16943e88{/api,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@3961a41a{/jobs/job/kill,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5a4ed68f{/stages/stage/kill,null,AVAILABLE,@Spark}
2019-01-05 11:47:04 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started
at http://linux-client:4040
2019-01-05 11:47:04 INFO  SparkContext:54 - Added JAR
file:/home/mehdi/spark%20and%20ignite%20issue/ignite-and-spark-integration-master/ignite-rdd/ignite-spark-scala/target/ignite-spark-scala-1.0.jar
at spark://linux-client:36233/jars/ignite-spark-scala-1.0.jar with timestamp
1546676224149
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Connecting
to master spark://linux-client:7077...
2019-01-05 11:47:04 INFO  TransportClientFactory:267 - Successfully created
connection to linux-client/127.0.1.1:7077 after 40 ms (0 ms spent in
bootstraps)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Connected to Spark
cluster with app ID app-20190105114704-0003
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/0 on worker-20190105103259-127.0.1.1-43911
(127.0.1.1:43911) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/0 on hostPort 127.0.1.1:43911 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/1 on worker-20190105103304-127.0.1.1-44569
(127.0.1.1:44569) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/1 on hostPort 127.0.1.1:44569 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/2 on worker-20190105103301-127.0.1.1-34465
(127.0.1.1:34465) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/2 on hostPort 127.0.1.1:34465 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
added: app-20190105114704-0003/3 on worker-20190105103256-127.0.1.1-46653
(127.0.1.1:46653) with 2 core(s)
2019-01-05 11:47:04 INFO  StandaloneSchedulerBackend:54 - Granted executor
ID app-20190105114704-0003/3 on hostPort 127.0.1.1:46653 with 2 core(s),
512.0 MB RAM
2019-01-05 11:47:04 INFO  Utils:54 - Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 41343.
2019-01-05 11:47:04 INFO  NettyBlockTransferService:54 - Server created on
linux-client:41343
2019-01-05 11:47:04 INFO  BlockManager:54 - Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/2 is now RUNNING
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/1 is now RUNNING
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/3 is now RUNNING
2019-01-05 11:47:04 INFO  StandaloneAppClient$ClientEndpoint:54 - Executor
updated: app-20190105114704-0003/0 is now RUNNING
2019-01-05 11:47:04 INFO  BlockManagerMaster:54 - Registering BlockManager
BlockManagerId(driver, linux-client, 41343, None)
2019-01-05 11:47:04 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager linux-client:41343 with 366.3 MB RAM, BlockManagerId(driver,
linux-client, 41343, None)
2019-01-05 11:47:04 INFO  BlockManagerMaster:54 - Registered BlockManager
BlockManagerId(driver, linux-client, 41343, None)
2019-01-05 11:47:04 INFO  BlockManager:54 - Initialized BlockManager:
BlockManagerId(driver, linux-client, 41343, None)
2019-01-05 11:47:04 INFO  ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@70e02081{/metrics/json,null,AVAILABLE,@Spark}
2019-01-05 11:47:05 INFO  EventLoggingListener:54 - Logging events to
file:/tmp/spark-events/app-20190105114704-0003
2019-01-05 11:47:05 INFO  StandaloneSchedulerBackend:54 - SchedulerBackend
is ready for scheduling beginning after reached minRegisteredResourcesRatio:
0.0
2019-01-05 11:47:06 INFO  XmlBeanDefinitionReader:317 - Loading XML bean
definitions from URL
[file:/usr/local/apache-ignite-fabric-2.6.0-bin/examples/config/spark/example-shared-rdd.xml]
2019-01-05 11:47:06 INFO  GenericApplicationContext:583 - Refreshing
org.springframework.context.support.GenericApplicationContext@2ec3633f:
startup date [Sat Jan 05 11:47:06 IRST 2019]; root of context hierarchy
Can't load log handler "org.apache.ignite.logger.java.JavaLoggerFileHandler"
java.lang.ClassNotFoundException: org.apache.ignite.logger.java.JavaLoggerFileHandler
java.lang.ClassNotFoundException: org.apache.ignite.logger.java.JavaLoggerFileHandler
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.util.logging.LogManager$5.run(LogManager.java:965)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.util.logging.LogManager.loadLoggerHandlers(LogManager.java:958)
        at java.util.logging.LogManager.initializeGlobalHandlers(LogManager.java:1578)
        at java.util.logging.LogManager.access$1500(LogManager.java:145)
        at java.util.logging.LogManager$RootLogger.accessCheckedHandlers(LogManager.java:1667)
        at java.util.logging.Logger.getHandlers(Logger.java:1777)
        at org.apache.ignite.logger.java.JavaLogger.findHandler(JavaLogger.java:411)
        at org.apache.ignite.logger.java.JavaLogger.configure(JavaLogger.java:241)
        at org.apache.ignite.logger.java.JavaLogger.<init>(JavaLogger.java:181)
        at org.apache.ignite.logger.java.JavaLogger.<init>(JavaLogger.java:135)
        at org.apache.ignite.internal.LongJVMPauseDetector.<clinit>(LongJVMPauseDetector.java:44)
        at org.apache.ignite.internal.IgniteKernal.<clinit>(IgniteKernal.java:300)
        at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:2009)
        at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1723)
        at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:1151)
        at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:671)
        at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:611)
        at org.apache.ignite.Ignition.getOrStart(Ignition.java:419)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
        at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:63)
        at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:99)
        at com.gridgain.RDDWriter$.delayedEndpoint$com$gridgain$RDDWriter$1(SparkIgniteTest.scala:26)
        at com.gridgain.RDDWriter$delayedInit$body.apply(SparkIgniteTest.scala:23)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at com.gridgain.RDDWriter$.main(SparkIgniteTest.scala:23)
        at com.gridgain.RDDWriter.main(SparkIgniteTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2019-01-05 11:47:07 INFO  IgniteKernal:566 -

>>>    __________  ________________ 
>>>   /  _/ ___/ |/ /  _/_  __/ __/ 
>>>  _/ // (7 7    // /  / / / _/   
>>> /___/\___/_/|_/___/ /_/ /___/   
>>>
>>> ver. 2.6.0#19700101-sha1:DEV
>>> 2018 Copyright(C) Apache Software Foundation
>>>
>>> Ignite documentation: http://ignite.apache.org

2019-01-05 11:47:07 INFO  IgniteKernal:566 - Config URL: n/a
2019-01-05 11:47:07 INFO  IgniteKernal:566 - IgniteConfiguration
[igniteInstanceName=null, pubPoolSize=8, svcPoolSize=8, callbackPoolSize=8,
stripedPoolSize=8, sysPoolSize=8, mgmtPoolSize=4, igfsPoolSize=8,
dataStreamerPoolSize=8, utilityCachePoolSize=8,
utilityCacheKeepAliveTime=60000, p2pPoolSize=2, qryPoolSize=8,
igniteHome=/usr/local/apache-ignite-fabric-2.6.0-bin/,
igniteWorkDir=/usr/local/apache-ignite-fabric-2.6.0-bin/work,
mbeanSrv=com.sun.jmx.mbeanserver.JmxMBeanServer@76b47204,
nodeId=f512a374-f6b6-4918-9790-6183a43f8064,
marsh=org.apache.ignite.internal.binary.BinaryMarshaller@1f1cddf3,
marshLocJobs=false, daemon=false, p2pEnabled=false, netTimeout=5000,
sndRetryDelay=1000, sndRetryCnt=3, metricsHistSize=10000,
metricsUpdateFreq=2000, metricsExpTime=9223372036854775807,
discoSpi=TcpDiscoverySpi [addrRslvr=null, sockTimeout=0, ackTimeout=0,
marsh=null, reconCnt=10, reconDelay=2000, maxAckTimeout=600000,
forceSrvMode=false, clientReconnectDisabled=false, internalLsnr=null],
segPlc=STOP, segResolveAttempts=2, waitForSegOnStart=true,
allResolversPassReq=true, segChkFreq=10000, commSpi=TcpCommunicationSpi
[connectGate=null, connPlc=null, enableForcibleNodeKill=false,
enableTroubleshootingLog=false,
srvLsnr=org.apache.ignite.spi.communication.tcp.TcpCommunicationSpi$2@3fd05b3e,
locAddr=null, locHost=null, locPort=47100, locPortRange=100, shmemPort=-1,
directBuf=true, directSndBuf=false, idleConnTimeout=600000,
connTimeout=5000, maxConnTimeout=600000, reconCnt=10, sockSndBuf=32768,
sockRcvBuf=32768, msgQueueLimit=0, slowClientQueueLimit=0, nioSrvr=null,
shmemSrv=null, usePairedConnections=false, connectionsPerNode=1,
tcpNoDelay=true, filterReachableAddresses=false, ackSndThreshold=32,
unackedMsgsBufSize=0, sockWriteTimeout=2000, lsnr=null, boundTcpPort=-1,
boundTcpShmemPort=-1, selectorsCnt=4, selectorSpins=0, addrRslvr=null,
ctxInitLatch=java.util.concurrent.CountDownLatch@4eb9f2af[Count = 1],
stopping=false,
metricsLsnr=org.apache.ignite.spi.communication.tcp.TcpCommunicationMetricsListener@4a481728],
evtSpi=org.apache.ignite.spi.eventstorage.NoopEventStorageSpi@4ace284d,
colSpi=NoopCollisionSpi [], deploySpi=LocalDeploymentSpi [lsnr=null],
indexingSpi=org.apache.ignite.spi.indexing.noop.NoopIndexingSpi@64deb58f,
addrRslvr=null, clientMode=true, rebalanceThreadPoolSize=1,
txCfg=org.apache.ignite.configuration.TransactionConfiguration@2b1cd7bc,
cacheSanityCheckEnabled=true, discoStartupDelay=60000, deployMode=SHARED,
p2pMissedCacheSize=100, locHost=null, timeSrvPortBase=31100,
timeSrvPortRange=100, failureDetectionTimeout=10000,
clientFailureDetectionTimeout=30000, metricsLogFreq=60000, hadoopCfg=null,
connectorCfg=org.apache.ignite.configuration.ConnectorConfiguration@33379242,
odbcCfg=null, warmupClos=null, atomicCfg=AtomicConfiguration
[seqReserveSize=1000, cacheMode=PARTITIONED, backups=1, aff=null,
grpName=null], classLdr=null, sslCtxFactory=null, platformCfg=null,
binaryCfg=null, memCfg=null, pstCfg=null, dsCfg=null, activeOnStart=true,
autoActivation=true, longQryWarnTimeout=3000, sqlConnCfg=null,
cliConnCfg=ClientConnectorConfiguration [host=null, port=10800,
portRange=100, sockSndBufSize=0, sockRcvBufSize=0, tcpNoDelay=true,
maxOpenCursorsPerConn=128, threadPoolSize=8, idleTimeout=0,
jdbcEnabled=true, odbcEnabled=true, thinCliEnabled=true, sslEnabled=false,
useIgniteSslCtxFactory=true, sslClientAuth=false, sslCtxFactory=null],
authEnabled=false, failureHnd=null, commFailureRslvr=null]
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Daemon mode: off
2019-01-05 11:47:07 INFO  IgniteKernal:566 - OS: Linux 4.15.0-43-generic
amd64
2019-01-05 11:47:07 INFO  IgniteKernal:566 - OS user: mehdi
2019-01-05 11:47:07 INFO  IgniteKernal:566 - PID: 7165
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Language runtime: Scala ver.
2.11.12
2019-01-05 11:47:07 INFO  IgniteKernal:566 - VM information: Java(TM) SE
Runtime Environment 1.8.0_192-ea-b04 Oracle Corporation Java HotSpot(TM)
64-Bit Server VM 25.192-b04
2019-01-05 11:47:07 INFO  IgniteKernal:566 - VM total memory: 0.89GB
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Remote Management [restart:
off, REST: off, JMX (remote: off)]
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Logger: Log4JLogger
[quiet=false, config=null]
2019-01-05 11:47:07 INFO  IgniteKernal:566 -
IGNITE_HOME=/usr/local/apache-ignite-fabric-2.6.0-bin/
2019-01-05 11:47:07 INFO  IgniteKernal:566 - VM arguments: [-Xmx1g]
2019-01-05 11:47:07 INFO  IgniteKernal:566 - Configured caches [in
'sysMemPlc' dataRegion: ['ignite-sys-cache'], in 'null' dataRegion:
['sharedRDD']]
2019-01-05 11:47:07 WARN  GridDiagnostic:571 - Default local host is a
loopback address. This can be a sign of potential network configuration
problem.
2019-01-05 11:47:07 INFO  IgniteKernal:566 - 3-rd party licenses can be
found at: /usr/local/apache-ignite-fabric-2.6.0-bin//libs/licenses
2019-01-05 11:47:07 WARN  GridDiagnostic:571 - No live network interfaces
detected. If IP-multicast discovery is used - make sure to add 127.0.0.1 as
a local address.
2019-01-05 11:47:07 WARN  GridDiagnostic:571 - Initial heap size is 126MB
(should be no less than 512MB, use -Xms512m -Xmx512m).
2019-01-05 11:47:08 INFO  IgnitePluginProcessor:566 - Configured plugins:
2019-01-05 11:47:08 INFO  IgnitePluginProcessor:566 -   ^-- None
2019-01-05 11:47:08 INFO  IgnitePluginProcessor:566 -
2019-01-05 11:47:08 INFO  FailureProcessor:566 - Configured failure handler:
[hnd=StopNodeOrHaltFailureHandler [tryStop=false, timeout=0]]
2019-01-05 11:47:08 INFO  TcpCommunicationSpi:566 - Successfully bound
communication NIO server to TCP port [port=47101, locHost=0.0.0.0/0.0.0.0,
selectorsCnt=4, selectorSpins=0, pairedConn=false]
2019-01-05 11:47:08 WARN  TcpCommunicationSpi:571 - Message queue limit is
set to 0 which may lead to potential OOMEs when running cache operations in
FULL_ASYNC or PRIMARY_SYNC modes due to message queues growth on sender and
receiver sides.
2019-01-05 11:47:08 WARN  NoopCheckpointSpi:571 - Checkpoints are disabled
(to enable configure any GridCheckpointSpi implementation)
2019-01-05 11:47:08 WARN  GridCollisionManager:571 - Collision resolution is
disabled (all jobs will be activated upon arrival).
2019-01-05 11:47:08 INFO  IgniteKernal:566 - Security status
[authentication=off, tls/ssl=off]
2019-01-05 11:47:09 INFO  ClientListenerProcessor:566 - Client connector
processor has started on TCP port 10801
2019-01-05 11:47:09 INFO  GridRestProcessor:566 - REST protocols do not
start on client node. To start the protocols on client node set
'-DIGNITE_REST_START_ON_CLIENT=true' system property.
2019-01-05 11:47:09 INFO  IgniteKernal:566 - Non-loopback local IPs: N/A
2019-01-05 11:47:09 INFO  IgniteKernal:566 - Enabled local MACs: N/A
2019-01-05 11:47:09 WARN  IgniteKernal:571 - Ignite is starting on loopback
address... Only nodes on the same physical computer can participate in
topology.
2019-01-05 11:47:09 WARN  TcpDiscoveryMulticastIpFinder:571 - Failed to send
multicast message (is multicast enabled on this node?).
2019-01-05 11:47:10 INFO  time:566 - Started exchange init
[topVer=AffinityTopologyVersion [topVer=4, minorTopVer=0], crd=false,
evt=NODE_JOINED, evtNode=f512a374-f6b6-4918-9790-6183a43f8064,
customEvt=null, allowMerge=true]
2019-01-05 11:47:11 INFO  GridCacheProcessor:566 - Started cache
[name=ignite-sys-cache, id=-2100569601, memoryPolicyName=sysMemPlc,
mode=REPLICATED, atomicity=TRANSACTIONAL, backups=2147483647]
2019-01-05 11:47:11 INFO  TcpCommunicationSpi:566 - Established outgoing
communication connection [locAddr=/0:0:0:0:0:0:0:1:43032,
rmtAddr=/0:0:0:0:0:0:0:1%lo:47100]
2019-01-05 11:47:11 INFO  GridCacheProcessor:566 - Started cache
[name=sharedRDD, id=-1581581875, memoryPolicyName=null, mode=PARTITIONED,
atomicity=ATOMIC, backups=1]
2019-01-05 11:47:11 INFO  time:566 - Finished exchange init
[topVer=AffinityTopologyVersion [topVer=4, minorTopVer=0], crd=false]
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49300) with ID 0
2019-01-05 11:47:11 INFO  GridDhtPartitionsExchangeFuture:566 - Received
full message, will finish exchange
[node=0a01ccfc-6d3a-4490-bdd8-90cf3b71928d, resVer=AffinityTopologyVersion
[topVer=4, minorTopVer=0]]
2019-01-05 11:47:11 INFO  GridDhtPartitionsExchangeFuture:566 - Finish
exchange future [startVer=AffinityTopologyVersion [topVer=4, minorTopVer=0],
resVer=AffinityTopologyVersion [topVer=4, minorTopVer=0], err=null]
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49296) with ID 3
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49294) with ID 2
2019-01-05 11:47:11 INFO  IgniteKernal:566 - Performance suggestions for
grid  (fix if possible)
2019-01-05 11:47:11 INFO  IgniteKernal:566 - To disable, set
-DIGNITE_PERFORMANCE_SUGGESTIONS_DISABLED=true
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Enable G1 Garbage
Collector (add '-XX:+UseG1GC' to JVM options)
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Set max direct memory
size if getting 'OOME: Direct buffer memory' (add
'-XX:MaxDirectMemorySize=<size>[g|G|m|M|k|K]' to JVM options)
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Disable processing of
calls to System.gc() (add '-XX:+DisableExplicitGC' to JVM options)
2019-01-05 11:47:11 INFO  IgniteKernal:566 -   ^-- Decrease number of
backups (set 'backups' to 0)
2019-01-05 11:47:11 INFO  IgniteKernal:566 - Refer to this page for more
performance suggestions:
https://apacheignite.readme.io/docs/jvm-and-system-tuning
2019-01-05 11:47:11 INFO  IgniteKernal:566 -
2019-01-05 11:47:11 INFO  IgniteKernal:566 - To start Console Management &
Monitoring run ignitevisorcmd.{sh|bat}
2019-01-05 11:47:11 INFO  IgniteKernal:566 -
2019-01-05 11:47:11 INFO  IgniteKernal:566 -

>>> +---------------------------------+
>>> Ignite ver. 2.6.0#19700101-sha1:DEV
>>> +---------------------------------+
>>> OS name: Linux 4.15.0-43-generic amd64
>>> CPU(s): 8
>>> Heap: 0.89GB
>>> VM name: 7165@linux-client
>>> Local node [ID=F512A374-F6B6-4918-9790-6183A43F8064, order=4,
>>> clientMode=true]
>>> Local node addresses: [0:0:0:0:0:0:0:1%lo, 127.0.0.1]
>>> Local ports: TCP:10801 TCP:47101 UDP:47400

2019-01-05 11:47:11 INFO  GridDiscoveryManager:566 - Topology snapshot
[ver=4, servers=1, clients=1, CPUs=8, offheap=1.6GB, heap=1.9GB]
2019-01-05 11:47:11 INFO  GridDiscoveryManager:566 -   ^-- Node
[id=F512A374-F6B6-4918-9790-6183A43F8064, clusterState=ACTIVE]
2019-01-05 11:47:11 INFO  CoarseGrainedSchedulerBackend$DriverEndpoint:54 -
Registered executor NettyRpcEndpointRef(spark-client://Executor)
(127.0.0.1:49298) with ID 1
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:38343 with 127.2 MB RAM, BlockManagerId(1, 127.0.1.1,
38343, None)
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:42897 with 127.2 MB RAM, BlockManagerId(2, 127.0.1.1,
42897, None)
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:32861 with 127.2 MB RAM, BlockManagerId(3, 127.0.1.1,
32861, None)
2019-01-05 11:47:12 INFO  BlockManagerMasterEndpoint:54 - Registering block
manager 127.0.1.1:32913 with 127.2 MB RAM, BlockManagerId(0, 127.0.1.1,
32913, None)
2019-01-05 11:47:12 INFO  SparkContext:54 - Starting job: foreachPartition
at IgniteRDD.scala:233
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Got job 0 (foreachPartition at
IgniteRDD.scala:233) with 10 output partitions
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Final stage: ResultStage 0
(foreachPartition at IgniteRDD.scala:233)
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Parents of final stage: List()
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Missing parents: List()
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Submitting ResultStage 0
(MapPartitionsRDD[2] at map at SparkIgniteTest.scala:28), which has no
missing parents
2019-01-05 11:47:12 INFO  MemoryStore:54 - Block broadcast_0 stored as
values in memory (estimated size 4.6 KB, free 366.3 MB)
2019-01-05 11:47:12 INFO  MemoryStore:54 - Block broadcast_0_piece0 stored
as bytes in memory (estimated size 2.6 KB, free 366.3 MB)
2019-01-05 11:47:12 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on linux-client:41343 (size: 2.6 KB, free: 366.3 MB)
2019-01-05 11:47:12 INFO  SparkContext:54 - Created broadcast 0 from
broadcast at DAGScheduler.scala:1161
2019-01-05 11:47:12 INFO  DAGScheduler:54 - Submitting 10 missing tasks from
ResultStage 0 (MapPartitionsRDD[2] at map at SparkIgniteTest.scala:28)
(first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2019-01-05 11:47:12 INFO  TaskSchedulerImpl:54 - Adding task set 0.0 with 10
tasks
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 0.0 in stage 0.0
(TID 0, 127.0.1.1, executor 1, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 1.0 in stage 0.0
(TID 1, 127.0.1.1, executor 0, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 2.0 in stage 0.0
(TID 2, 127.0.1.1, executor 2, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 3.0 in stage 0.0
(TID 3, 127.0.1.1, executor 3, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 4.0 in stage 0.0
(TID 4, 127.0.1.1, executor 1, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 5.0 in stage 0.0
(TID 5, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 6.0 in stage 0.0
(TID 6, 127.0.1.1, executor 2, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:12 INFO  TaskSetManager:54 - Starting task 7.0 in stage 0.0
(TID 7, 127.0.1.1, executor 3, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:38343 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:32913 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:32861 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:13 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in
memory on 127.0.1.1:42897 (size: 2.6 KB, free: 127.2 MB)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 8.0 in stage 0.0
(TID 8, 127.0.1.1, executor 1, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 WARN  TaskSetManager:66 - Lost task 0.0 in stage 0.0 (TID 0, 127.0.1.1, executor 1): java.lang.NoClassDefFoundError: Could not initialize class org.apache.ignite.internal.util.IgniteUtils
        at org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)

2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 0.1 in stage 0.0
(TID 9, 127.0.1.1, executor 1, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 WARN  TaskSetManager:66 - Lost task 4.0 in stage 0.0 (TID 4, 127.0.1.1, executor 1): java.lang.ExceptionInInitializerError
        at org.apache.ignite.internal.util.IgniteUtils.<clinit>(IgniteUtils.java:769)
        at org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.lang.RuntimeException: jdk.internal.misc.JavaNioAccess class is unavailable.
        at org.apache.ignite.internal.util.GridUnsafe.javaNioAccessObject(GridUnsafe.java:1453)
        at org.apache.ignite.internal.util.GridUnsafe.<clinit>(GridUnsafe.java:112)
        ... 17 more
Caused by: java.lang.IllegalAccessException: class org.apache.ignite.internal.util.GridUnsafe cannot access class jdk.internal.misc.SharedSecrets (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @2c5781b6
        at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:360)
        at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:589)
        at java.base/java.lang.reflect.Method.invoke(Method.java:556)
        at org.apache.ignite.internal.util.GridUnsafe.javaNioAccessObject(GridUnsafe.java:1450)
        ... 18 more

2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 4.1 in stage 0.0
(TID 10, 127.0.1.1, executor 3, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 3.0 in stage 0.0
(TID 3) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 1]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 3.1 in stage 0.0
(TID 11, 127.0.1.1, executor 3, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 7.0 in stage 0.0
(TID 7) on 127.0.1.1, executor 3: java.lang.ExceptionInInitializerError
(null) [duplicate 1]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 7.1 in stage 0.0
(TID 12, 127.0.1.1, executor 2, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 2.0 in stage 0.0
(TID 2) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 2]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 2.1 in stage 0.0
(TID 13, 127.0.1.1, executor 1, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 9.0 in stage 0.0
(TID 14, 127.0.1.1, executor 1, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 0.1 in stage 0.0
(TID 9) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 3]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 8.0 in stage 0.0
(TID 8) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 4]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 8.1 in stage 0.0
(TID 15, 127.0.1.1, executor 0, partition 8, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 1.0 in stage 0.0
(TID 1) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 5]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 1.1 in stage 0.0
(TID 16, 127.0.1.1, executor 3, partition 1, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 0.2 in stage 0.0
(TID 17, 127.0.1.1, executor 2, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 4.1 in stage 0.0
(TID 10) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 6]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 5.0 in stage 0.0
(TID 5) on 127.0.1.1, executor 0: java.lang.ExceptionInInitializerError
(null) [duplicate 2]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 5.1 in stage 0.0
(TID 18, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 6.0 in stage 0.0
(TID 6) on 127.0.1.1, executor 2: java.lang.ExceptionInInitializerError
(null) [duplicate 3]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 6.1 in stage 0.0
(TID 19, 127.0.1.1, executor 2, partition 6, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 7.1 in stage 0.0
(TID 12) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 7]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 9.0 in stage 0.0
(TID 14) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 8]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 9.1 in stage 0.0
(TID 20, 127.0.1.1, executor 1, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 7.2 in stage 0.0
(TID 21, 127.0.1.1, executor 1, partition 7, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 2.1 in stage 0.0
(TID 13) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 9]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 2.2 in stage 0.0
(TID 22, 127.0.1.1, executor 2, partition 2, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 0.2 in stage 0.0
(TID 17) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 10]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 0.3 in stage 0.0
(TID 23, 127.0.1.1, executor 3, partition 0, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 3.1 in stage 0.0
(TID 11) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 11]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 5.1 in stage 0.0
(TID 18) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 12]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 5.2 in stage 0.0
(TID 24, 127.0.1.1, executor 0, partition 5, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 3.2 in stage 0.0
(TID 25, 127.0.1.1, executor 1, partition 3, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 9.1 in stage 0.0
(TID 20) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 13]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 9.2 in stage 0.0
(TID 26, 127.0.1.1, executor 3, partition 9, PROCESS_LOCAL, 7927 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Starting task 4.2 in stage 0.0
(TID 27, 127.0.1.1, executor 3, partition 4, PROCESS_LOCAL, 7870 bytes)
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 0.3 in stage 0.0
(TID 23) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 14]
2019-01-05 11:47:14 ERROR TaskSetManager:70 - Task 0 in stage 0.0 failed 4
times; aborting job
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 1.1 in stage 0.0
(TID 16) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 15]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 7.2 in stage 0.0
(TID 21) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 16]
2019-01-05 11:47:14 INFO  TaskSchedulerImpl:54 - Cancelling stage 0
2019-01-05 11:47:14 INFO  TaskSchedulerImpl:54 - Killing all running tasks
in stage 0: Stage cancelled
2019-01-05 11:47:14 INFO  TaskSchedulerImpl:54 - Stage 0 was cancelled
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 8.1 in stage 0.0
(TID 15) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 17]
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 2.2 in stage 0.0
(TID 22) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 18]
2019-01-05 11:47:14 INFO  DAGScheduler:54 - ResultStage 0 (foreachPartition at IgniteRDD.scala:233) failed in 2.358 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 23, 127.0.1.1, executor 3): java.lang.NoClassDefFoundError: Could not initialize class org.apache.ignite.internal.util.IgniteUtils
        at org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)

Driver stacktrace:
2019-01-05 11:47:14 INFO  TaskSetManager:54 - Lost task 3.2 in stage 0.0
(TID 25) on 127.0.1.1, executor 1: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 19]
2019-01-05 11:47:15 INFO  DAGScheduler:54 - Job 0 failed: foreachPartition
at IgniteRDD.scala:233, took 2.443962 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure:
Lost task 0.3 in stage 0.0 (TID 23, 127.0.1.1, executor 3):
java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:844)

Driver stacktrace:
        at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
        at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at scala.Option.foreach(Option.scala:257)
        at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:933)
        at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
        at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:933)
        at org.apache.ignite.spark.IgniteRDD.savePairs(IgniteRDD.scala:233)
        at
com.gridgain.RDDWriter$.delayedEndpoint$com$gridgain$RDDWriter$1(SparkIgniteTest.scala:28)
        at com.gridgain.RDDWriter$delayedInit$body.apply(SparkIgniteTest.scala:23)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at com.gridgain.RDDWriter$.main(SparkIgniteTest.scala:23)
        at com.gridgain.RDDWriter.main(SparkIgniteTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class
org.apache.ignite.internal.util.IgniteUtils
        at
org.apache.ignite.spark.IgniteContext$.setIgniteHome(IgniteContext.scala:195)
        at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:142)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:234)
        at
org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1.apply(IgniteRDD.scala:233)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.lang.Thread.run(Thread.java:844)
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 6.1 in stage 0.0
(TID 19) on 127.0.1.1, executor 2: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 20]
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 4.2 in stage 0.0
(TID 27) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 21]
2019-01-05 11:47:15 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 9.2 in stage 0.0
(TID 26) on 127.0.1.1, executor 3: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 22]
2019-01-05 11:47:15 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool
2019-01-05 11:47:15 INFO  TaskSetManager:54 - Lost task 5.2 in stage 0.0
(TID 24) on 127.0.1.1, executor 0: java.lang.NoClassDefFoundError (Could not
initialize class org.apache.ignite.internal.util.IgniteUtils) [duplicate 23]
2019-01-05 11:47:15 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose
tasks have all completed, from pool
2019-01-05 11:48:11 INFO  IgniteKernal:566 -
Metrics for local node (to disable set 'metricsLogFrequency' to 0)
    ^-- Node [id=f512a374, uptime=00:01:00.027]
    ^-- H/N/C [hosts=1, nodes=2, CPUs=8]
    ^-- CPU [cur=0.43%, avg=0.85%, GC=0%]
    ^-- PageMemory [pages=0]
    ^-- Heap [used=392MB, free=56.85%, comm=602MB]
    ^-- Non heap [used=83MB, free=-1%, comm=85MB]
    ^-- Outbound messages queue [size=0]
    ^-- Public thread pool [active=0, idle=0, qSize=0]
    ^-- System thread pool [active=0, idle=0, qSize=0]
2019-01-05 11:48:20 INFO  GridUpdateNotifier:566 - Update status is not
available.

<http://apache-ignite-users.70518.x6.nabble.com/file/t2160/Screenshot_from_2019-01-05_12-21-06.png>

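For context on the trace above: `NoClassDefFoundError: Could not initialize class org.apache.ignite.internal.util.IgniteUtils` is thrown on the executors, which usually means the Ignite jars never reached the executor classpath (so the `IgniteUtils` static initializer fails there), and the `java.base/...` frames suggest the JVM is Java 9+, where older Ignite releases also need Java 8 or the extra `--add-exports` JVM options documented by Ignite. A minimal sketch of a submit command that ships the Ignite libraries with the job — `$IGNITE_HOME` and the jar path are placeholders, not taken from the original post:

```shell
# Sketch only: collect the Ignite core and ignite-spark jars into a
# comma-separated list, which is the format spark-submit's --jars expects.
# Assumes IGNITE_HOME points at an Ignite installation on the driver host.
IGNITE_LIBS=$(echo "$IGNITE_HOME"/libs/*.jar "$IGNITE_HOME"/libs/optional/ignite-spark/*.jar | tr ' ' ',')

# --jars distributes these libraries to every executor, so
# org.apache.ignite.internal.util.IgniteUtils can be initialized there
# instead of failing with NoClassDefFoundError.
"$SPARK_HOME"/bin/spark-submit \
  --class "com.gridgain.RDDWriter" \
  --master spark://linux-client:7077 \
  --jars "$IGNITE_LIBS" \
  target/ignite-spark-scala-1.0.jar
```

Alternatively, `spark.executor.extraClassPath` can point at a local Ignite install present on each worker; either way the key is that the classes exist where the `foreachPartition` in `IgniteRDD.savePairs` actually runs.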
--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/