How to configure the "query-indexing" module in the node?

F7753

How to configure the "query-indexing" module in the node?

I have already set the classpath on my nodes, but when I run my app, it still complains that the "ignite-indexing" module is missing:
---------------------------------------------------------------------------------------------
Exception in thread "main" javax.cache.CacheException: Failed to execute query. Add module 'ignite-indexing' to the classpath of all Ignite nodes.
        at org.apache.ignite.internal.processors.cache.IgniteCacheProxy.validate(IgniteCacheProxy.java:684)
        at org.apache.ignite.internal.processors.cache.IgniteCacheProxy.query(IgniteCacheProxy.java:619)
        at org.apache.ignite.spark.IgniteRDD.sql(IgniteRDD.scala:125)
        at main.scala.StreamingJoin$$anonfun$main$2.apply(StreamingJoin.scala:331)
        at main.scala.StreamingJoin$$anonfun$main$2.apply(StreamingJoin.scala:270)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:50)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
        at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:426)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:49)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
        at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
        at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:224)
        at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
        at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:223)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
^C[17:01:06] Ignite node stopped OK [uptime=00:01:02:115]
Exception in thread "submit-job-thread-pool-1" java.lang.Error: java.lang.InterruptedException
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:503)
        at org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73)
        at org.apache.spark.SimpleFutureAction.org$apache$spark$SimpleFutureAction$$awaitResult(FutureAction.scala:165)
        at org.apache.spark.SimpleFutureAction$$anon$1.run(FutureAction.scala:147)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        ... 2 more
Exception in thread "submit-job-thread-pool-2" Exception in thread "submit-job-thread-pool-0" java.lang.Error: java.lang.InterruptedException
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:503)
        at org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73)
        at org.apache.spark.SimpleFutureAction.org$apache$spark$SimpleFutureAction$$awaitResult(FutureAction.scala:165)
        at org.apache.spark.SimpleFutureAction$$anon$1.run(FutureAction.scala:147)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        ... 2 more
java.lang.Error: java.lang.InterruptedException
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:503)
        at org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73)
        at org.apache.spark.SimpleFutureAction.org$apache$spark$SimpleFutureAction$$awaitResult(FutureAction.scala:165)
        at org.apache.spark.SimpleFutureAction$$anon$1.run(FutureAction.scala:147)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        ... 2 more

---------------------------------------------------------------------------------------------
F7753

Re: How to configure the "query-indexing" module in the node?

I added this config to my spark-env.sh
------------------------------------------------------------
# Optionally set IGNITE_HOME here.
# IGNITE_HOME=/path/to/my-ignite-home
IGNITE_LIBS="${IGNITE_HOME}/libs/*"
for file in ${IGNITE_HOME}/libs/*
do
    if [ -d ${file} ] && [ "${file}" != "${IGNITE_HOME}"/libs/optional ]; then
        IGNITE_LIBS=${IGNITE_LIBS}:${file}/*
    fi
done
export SPARK_CLASSPATH=$IGNITE_LIBS
------------------------------------------------------------
It seems the "query-indexing" module is still absent on my cluster.
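For reference, in a standard Ignite distribution the indexing module ships under libs/optional, and the loop above deliberately skips that folder. A minimal sketch of moving it into place, assuming a default IGNITE_HOME layout:

```shell
# Copy the indexing module out of libs/optional so the
# spark-env.sh loop above includes it on the classpath.
cp -r "${IGNITE_HOME}/libs/optional/ignite-indexing" "${IGNITE_HOME}/libs/"

# Confirm it now sits in a folder the loop walks:
ls "${IGNITE_HOME}/libs/ignite-indexing"
```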
Alexey Kuznetsov

Re: How to configure the "query-indexing" module in the node?

Hi!

A folder named "ignite-indexing" should be on the classpath.

Do you have such a folder? It should contain: commons-codec-1.6.jar, h2-1.3.175.jar, ignite-indexing.jar, lucene-core-3.5.0.jar.

The ignite-indexing.jar must match the version of Ignite you are using.

Please check.
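A quick way to check from the shell (the path assumes a default Ignite install layout):

```shell
# List the indexing module folder; it should show the four jars above.
ls "${IGNITE_HOME}/libs/ignite-indexing"
```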


On Wed, Apr 6, 2016 at 5:50 PM, F7753 <[hidden email]> wrote:
I added this config to my spark-env.sh
It seems the "query-indexing" module still absent here on my cluster.



--
View this message in context: http://apache-ignite-users.70518.x6.nabble.com/How-to-config-the-query-indexing-module-in-the-node-tp3962p3963.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.



--
Alexey Kuznetsov
GridGain Systems
www.gridgain.com
F7753

Re: How to configure the "query-indexing" module in the node?

Yes, I do have this folder, and it contains all the jars you listed; in fact, that folder was there originally.
Here is the folder with its contents:
F7753

Re: How to configure the "query-indexing" module in the node?

In reply to this post by Alexey Kuznetsov
I added "export CLASSPATH=.:$IGNITE_HOME/libs" to /etc/profile and then sourced it, but it did not help.
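For what it's worth, a CLASSPATH exported in /etc/profile generally only affects login shells, not the JVMs Spark launches on worker nodes. An alternative sketch using Spark's own classpath properties (the Ignite path and class name below are taken from this thread; adjust to the actual install):

```shell
# Pass the Ignite indexing jars through Spark's driver and executor
# classpath settings instead of the shell environment.
/opt/spark-1.6.1/bin/spark-submit \
    --conf spark.driver.extraClassPath="${IGNITE_HOME}/libs/ignite-indexing/*" \
    --conf spark.executor.extraClassPath="${IGNITE_HOME}/libs/ignite-indexing/*" \
    --class main.scala.StreamingJoin \
    ./myapp-1.0-SNAPSHOT-jar-with-dependencies.jar
```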
vkulichenko

Re: How to configure the "query-indexing" module in the node?

How do you launch the application? If you are using the spark-submit script, you can provide Maven coordinates for all required dependencies via the --packages parameter [1].

Can you try this?

[1] http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management

-Val
F7753

Re: How to configure the "query-indexing" module in the node?

Thank you, Val.
The "query-indexing" exception disappeared. Here is how I start the app:
---------------------------------------------------------------------------------------------
/opt/spark-1.6.1/bin/spark-submit \
    --class main.scala.StreamingJoin \
    --properties-file conf/spark.conf \
    --packages commons-codec:commons-codec:1.6,org.apache.ignite:ignite-indexing:1.5.0.final,org.apache.lucene:lucene-core:3.5.0,com.h2database:h2:1.3.175 \
    --repositories http://www.gridgainsystems.com/nexus/content/repositories/external \
    ./myapp-1.0-SNAPSHOT-jar-with-dependencies.jar "Socket" $1 $2 $3
---------------------------------------------------------------------------------------------