Unable to load more than 5g data through sqlline

9 messages

debashissinha

<http://apache-ignite-users.70518.x6.nabble.com/file/t1918/20181031_005158.jpg>
<http://apache-ignite-users.70518.x6.nabble.com/file/t1918/20181031_005224.jpg>
<http://apache-ignite-users.70518.x6.nabble.com/file/t1918/20181031_004447.jpg>
<http://apache-ignite-users.70518.x6.nabble.com/file/t1918/20181031_004532.jpg>
<http://apache-ignite-users.70518.x6.nabble.com/file/t1918/20181031_004559.jpg>

Hi all,
I'd appreciate help from anyone who can assist with a critical error.
I am trying to benchmark Ignite on a single node (TCP discovery disabled)
with TPC-DS benchmark data. For this I am using the customer table (image
attached) and loading a 5 GB CSV file through sqlline.

The config (image attached) is a 20 GB default data region with WAL mode
NONE and LRU page eviction only. I have also enabled native persistence,
and in my ignite.sh script I am adding the G1GC option to the JVM opts.

After almost 1.63 GB of data is inserted, which corresponds to roughly
12,000,000 rows, Ignite silently restarts with an error: killed, ignite.sh
line 181 (with JVM opts). The error is attached.

My configuration:
number of CPUs: 2
heap memory: 1 GB
data region max size (off-heap): 20 GB
cluster mode enabled
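
For reference, the setup described above roughly corresponds to this Ignite
XML sketch (the attached config images are authoritative; the values below
are only the ones stated in the text):

```xml
<bean class="org.apache.ignite.configuration.IgniteConfiguration">
    <property name="dataStorageConfiguration">
        <bean class="org.apache.ignite.configuration.DataStorageConfiguration">
            <!-- WAL disabled, as described above -->
            <property name="walMode" value="NONE"/>
            <property name="defaultDataRegionConfiguration">
                <bean class="org.apache.ignite.configuration.DataRegionConfiguration">
                    <!-- native persistence on, 20 GB off-heap region -->
                    <property name="persistenceEnabled" value="true"/>
                    <property name="maxSize" value="#{20L * 1024 * 1024 * 1024}"/>
                </bean>
            </property>
        </bean>
    </property>
</bean>
```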

Can someone kindly advise me where I am going wrong?

Thanks & Regards
Debashis Sinha





--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/
Ivan Pavlukhin

Re: Unable to load more than 5g data through sqlline


On Tue, Oct 30, 2018 at 23:07, debashissinha <[hidden email]> wrote:


--
Best regards,
Ivan Pavlukhin
wt

Re: Unable to load more than 5g data through sqlline

A better option would be to drop sqlline and write your own client that
reads the CSV files and loads them into the database. That way you can have
multiple threads loading multiple files concurrently, and for each load you
can set the streamer parameters, including batch sizes and flush frequency.
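
A minimal sketch of that approach in Java (the file contents, the pool size,
and the "first column is the key" convention are illustrative assumptions;
the sink callback stands in for IgniteDataStreamer.addData so the skeleton
runs without a cluster):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.function.BiConsumer;

// One worker per CSV file, each pushing parsed records into a sink.
// In a real loader the sink would wrap an IgniteDataStreamer (with
// allowOverwrite, perNodeBufferSize, autoFlushFrequency set on it);
// here it is a plain callback so the skeleton is self-contained.
public class ParallelCsvLoader {

    public static int loadAll(List<List<String>> files,
                              BiConsumer<Long, String[]> sink)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(10); // pool of 10 threads
        List<Future<Integer>> results = new ArrayList<>();
        for (List<String> lines : files) {
            results.add(pool.submit(() -> {
                int n = 0;
                for (String line : lines) {
                    String[] cols = line.split(",");
                    long key = Long.parseLong(cols[0]); // assume first column is the key
                    sink.accept(key, cols);             // real code: streamer.addData(key, value)
                    n++;
                }
                return n;
            }));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        int total = 0;
        for (Future<Integer> f : results) {
            try {
                total += f.get();
            } catch (ExecutionException e) {
                throw new RuntimeException(e);
            }
        }
        return total;
    }
}
```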



debashissinha

Re: Unable to load more than 5g data through sqlline

In reply to this post by Ivan Pavlukhin
Hi,
I am running sqlline from a different machine that is not part of the
cluster. I also tried with smaller chunks, but the issue persists after
around 2.5 GB.

Thanks & Regards
Debashis Sinha



wt

Re: Unable to load more than 5g data through sqlline

A different machine is fine.

All you need to do is build a client using Java, C++, .NET, or Scala (and
probably PowerShell as well).

I have something like this in .NET and can load files in parallel (one per
thread in a pool of 10 threads):

  using (var ldr = igniteClient.GetDataStreamer<dynamic, dynamic>(tableName))
  {
      ldr.AllowOverwrite = true;
      ldr.AutoFlushFrequency = 5000;       // flush buffered data every 5 s
      ldr.PerNodeParallelOperations = 20;
      ldr.PerNodeBufferSize = 15000;

      foreach (var record in lines)        // lines read from the CSV file
      {
          // 1) create a class object and populate it with the CSV data,
          //    yielding a key (id) and a value (instanceRecord)
          ldr.AddData(id, instanceRecord);
      }
  }

It will be much faster than the way you are loading now, which I presume is
one file at a time, and you won't run into this 2.5 GB limitation, which is
likely a property of the tool you are using and outside your control.





debashissinha

Re: Unable to load more than 5g data through sqlline

Hi,
I am using the sqlline utility to do this. The exact error I am getting is:

ignite.sh: line 181: 1387 killed "$JAVA" ${JVM_OPTS} ${QUIET} "${RESTART_SUCCESS_OPT}" ${JMX_MON} \
    -DIGNITE_HOME="${IGNITE_HOME}" \
    -DIGNITE_PROG_NAME="$0" ${JVM_XOPTS} -cp "${CP}" ${MAIN_CLASS} "${CONFIG}"



debashissinha

Re: Unable to load more than 5g data through sqlline

Also, at the same time the Ignite node is restarting.



Ivan Pavlukhin

Re: Unable to load more than 5g data through sqlline

Hi Debashis,

Sorry for the late answer. How much RAM does your server have?
You configured your data region with a 20 GB max size. This size
defines how much RAM can be allocated for the region. If your
server does not have enough RAM, the OS cannot allocate enough
memory for Ignite and kills the process. With persistence enabled
you can store as much data in Ignite as your disk capacity allows,
so try decreasing the data region max size. I hope this helps.
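
Concretely, that means lowering maxSize on the default data region so that
region size plus JVM heap fits in the server's physical RAM, for example
(the 4 GB value is purely illustrative):

```xml
<property name="defaultDataRegionConfiguration">
    <bean class="org.apache.ignite.configuration.DataRegionConfiguration">
        <property name="persistenceEnabled" value="true"/>
        <!-- fit within physical RAM; persistence keeps the rest on disk -->
        <property name="maxSize" value="#{4L * 1024 * 1024 * 1024}"/>
    </bean>
</property>
```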

On Thu, Nov 1, 2018 at 16:41, debashissinha <[hidden email]> wrote:
Also at the same time the ignite node is restarting





--
Best regards,
Ivan Pavlukhin
ilya.kasnacheev

Re: Unable to load more than 5g data through sqlline

In reply to this post by debashissinha
Hello!

It looks like you have allowed more heap/off-heap memory for Ignite than
your operating system can allocate. Are you sure you're not starting
multiple nodes on the same box?

Regards,


