Question on On-Heap Caching

naresh.goty

Question on On-Heap Caching

Hi All,

We enabled on-heap caching for our application, as per the documentation. The
only change required is to set the onheapCacheEnabled property of
CacheConfiguration. Once we made this change and started our application, the
Ignite metrics still show both on-heap and off-heap memory.

Note: we are using Ignite v2.3.
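
For reference, the change amounts to roughly the following (a minimal sketch; the cache name and key/value types here are illustrative):

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.CacheConfiguration;

public class OnHeapCacheExample {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // The only configuration change: enable the on-heap cache layer.
            // Cache name and types are illustrative.
            CacheConfiguration<Long, String> cfg = new CacheConfiguration<>("deliveryCache");
            cfg.setOnheapCacheEnabled(true);

            ignite.getOrCreateCache(cfg);
        }
    }
}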

Metrics for local node (to disable set 'metricsLogFrequency' to 0)
    ^-- Node [id=0223cd4a, name=delivery, uptime=00:01:00.174]
    ^-- H/N/C [hosts=1, nodes=1, CPUs=8]
    ^-- CPU [cur=2.63%, avg=13.84%, GC=0.03%]
    ^-- PageMemory [pages=111265]
    ^-- Heap [used=468MB, free=75.98%, comm=1948MB]
    ^-- Non heap [used=139MB, free=-1%, comm=143MB]
    ^-- Public thread pool [active=0, idle=0, qSize=0]
    ^-- System thread pool [active=0, idle=8, qSize=0]
    ^-- Outbound messages queue [size=0]


1) Does Ignite still have off-heap memory even if onheapCacheEnabled is true?
2) How do we ensure that onheapCacheEnabled is honored?
3) Do we need to make any other configuration changes?

Regards,
Naresh



Alexey Kukushkin

Re: Question on On-Heap Caching

Hi,

Ignite always stores data off-heap. Enabling on-heap caching just turns the Java heap into a cache for the off-heap memory, allowing you to configure eviction policies specific to such a heap cache.
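
For example, a heap-cache eviction policy can be set up roughly like this (a sketch; the cache name, types and maximum size are illustrative):

import org.apache.ignite.cache.eviction.lru.LruEvictionPolicy;
import org.apache.ignite.configuration.CacheConfiguration;

// Sketch: keep at most 100,000 entries in the on-heap layer, evicting the
// least recently used entries from the heap (the data itself stays off-heap).
CacheConfiguration<Long, String> cfg = new CacheConfiguration<>("deliveryCache");
cfg.setOnheapCacheEnabled(true);
cfg.setEvictionPolicy(new LruEvictionPolicy<>(100_000));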

I believe the idea is that in some cases accessing data on-heap is faster than accessing it off-heap, although I have never seen any benchmarks or recommendations about which data access scenarios would benefit from on-heap caching. Remember that storing data on-heap negatively impacts GC. Maybe the community will help. You can also benchmark your use case with and without on-heap caching and share the results with the community.

naresh.goty

Re: Question on On-Heap Caching

Thanks Alexey for the info. Our application is read-heavy, and we are seeing
high latencies (based on our perf benchmark) when measuring response times
during load tests. Based on one of the thread's recommendations
(http://apache-ignite-users.70518.x6.nabble.com/10X-decrease-in-performance-with-Ignite-2-0-0-td12637.html#a12655),
we are trying to check whether the on-heap cache gives any reduction in
latencies, but we did not see any noticeable difference in performance with the
on-heap cache enabled versus disabled. We are using Ignite v2.3.

Thanks,
Naresh



dsetrakyan

Re: Question on On-Heap Caching

Naresh, several questions:
  1. How are you accessing data, with SQL or key-value APIs?
  2. Are you accessing data locally on the server or remotely from a client? If remotely, then you might want to enable near caching.
D.

colinc

Re: Question on On-Heap Caching

In reply to this post by naresh.goty
With Ignite 2+, I have found that the on-heap option makes only modest
improvements to performance in most cases. Likewise for copyOnRead=false,
which works in conjunction with on-heap. These options work best in the case
where you have a small number of cache entries that are read many times. In
benchmarks, I've seen performance gains of around 30% for this type of
scenario.
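
A sketch of that combination (the cache name and types are illustrative; copyOnRead=false means reads return the cached on-heap object rather than a fresh copy):

import org.apache.ignite.configuration.CacheConfiguration;

// Sketch: on-heap caching plus copyOnRead=false for read-mostly hot data
// (cache name and types are illustrative).
CacheConfiguration<Long, String> cfg = new CacheConfiguration<>("hotDataCache");
cfg.setOnheapCacheEnabled(true);
cfg.setCopyOnRead(false);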



naresh.goty

Re: Question on On-Heap Caching

In reply to this post by dsetrakyan
Hi Dmitriy,

1. We are accessing the data using the key-value APIs (getAll(); a rough sketch is below).
2. Our data is partitioned across a three-node server cluster, but our test case
primarily requests the data through one specific node.
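
For reference, the getAll() access mentioned in point 1 looks roughly like this (the cache name, key/value types and keys are illustrative):

import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

// Sketch: bulk read through the key-value API (cache name, types and keys
// are illustrative).
Ignite ignite = Ignition.ignite();
IgniteCache<Long, String> cache = ignite.cache("deliveryCache");
Map<Long, String> values = cache.getAll(new HashSet<>(Arrays.asList(1L, 2L, 3L)));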

Regards,
Naresh



Mikhail

Re: Question on On-Heap Caching

Hi Naresh,

One way to reduce latency is to use data collocation and send compute tasks
to the data:

https://apacheignite.readme.io/docs/affinity-collocation

Instead of gathering data from several data nodes on the client node, you can
collocate the data required for your task on one node and send a compute task
there.
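
A minimal sketch of this (the cache name and key are illustrative; affinityRun executes the closure on the node that owns the key):

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

// Sketch: run the computation on the node that owns key 42 instead of pulling
// the data to the client (cache name and key are illustrative).
Ignite ignite = Ignition.ignite();

ignite.compute().affinityRun("deliveryCache", 42L, () -> {
    IgniteCache<Long, String> cache = Ignition.localIgnite().cache("deliveryCache");
    String val = cache.localPeek(42L);
    // ... process 'val' locally on the data node ...
});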

Also, as Dmitriy said, a near cache can help in this case too:

https://apacheignite.readme.io/docs/near-caches
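
On the client node a near cache can be created roughly like this (a sketch; the cache name, types and eviction size are illustrative):

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.eviction.lru.LruEvictionPolicy;
import org.apache.ignite.configuration.NearCacheConfiguration;

// Sketch: client-side near cache holding up to 10,000 hot entries locally
// (cache name, types and size are illustrative).
Ignite ignite = Ignition.ignite();

NearCacheConfiguration<Long, String> nearCfg = new NearCacheConfiguration<>();
nearCfg.setNearEvictionPolicy(new LruEvictionPolicy<>(10_000));

IgniteCache<Long, String> cache = ignite.getOrCreateNearCache("deliveryCache", nearCfg);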

Both options should be checked.

Thanks,
Mike.




