We enabled on-heap caching for our application, as described in the documentation. The
only change required is to set the OnheapCacheEnabled property of
CacheConfiguration. Once we made this change and restarted our
application, the Ignite metrics still show both on-heap and off-heap memory usage.
Note: we are using Ignite v2.3
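For reference, this is roughly the only change we made (the cache name and key/value types here are illustrative, not our real ones):

```java
import org.apache.ignite.configuration.CacheConfiguration;

// Enable the on-heap cache layer for this cache ("myCache" is an example name)
CacheConfiguration<Integer, String> cacheCfg = new CacheConfiguration<>("myCache");
cacheCfg.setOnheapCacheEnabled(true);
```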
Metrics for local node (to disable set 'metricsLogFrequency' to 0)
^-- Node [id=0223cd4a, name=delivery, uptime=00:01:00.174]
^-- H/N/C [hosts=1, nodes=1, CPUs=8]
^-- CPU [cur=2.63%, avg=13.84%, GC=0.03%]
^-- PageMemory [pages=111265]
^-- Heap [used=468MB, free=75.98%, comm=1948MB]
^-- Non heap [used=139MB, free=-1%, comm=143MB]
^-- Public thread pool [active=0, idle=0, qSize=0]
^-- System thread pool [active=0, idle=8, qSize=0]
^-- Outbound messages queue [size=0]
1) Does Ignite still use off-heap memory even if onheapCacheEnabled is true?
2) How can we ensure onheapCacheEnabled is honored?
3) Do we need to make any other configuration changes?
Ignite always stores data off-heap. Enabling on-heap caching simply turns the Java heap into a cache for the off-heap memory, and allows you to configure eviction policies specific to that heap cache.
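As a sketch of what such an eviction policy looks like (Ignite 2.3 API; the cache name and maximum size are illustrative assumptions):

```java
import org.apache.ignite.cache.eviction.lru.LruEvictionPolicy;
import org.apache.ignite.configuration.CacheConfiguration;

CacheConfiguration<Integer, String> cacheCfg = new CacheConfiguration<>("myCache");

// Turn the Java heap into a cache in front of off-heap storage
cacheCfg.setOnheapCacheEnabled(true);

// Cap the heap cache at 100,000 entries (LRU eviction);
// off-heap memory still holds the full data set
LruEvictionPolicy<Integer, String> evictionPlc = new LruEvictionPolicy<>();
evictionPlc.setMaxSize(100_000);
cacheCfg.setEvictionPolicy(evictionPlc);
```

Note that without an eviction policy the heap cache is unbounded, which is one reason storing data on-heap can put pressure on GC.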
I believe the idea is that in some cases accessing data on-heap is faster than accessing it off-heap, although I have never seen benchmarks or recommendations about which data-access scenarios benefit from on-heap caching. Remember that storing data on-heap negatively impacts GC. Maybe the community can help. You could also benchmark your use case with and without on-heap caching and share the results with the community.
With Ignite 2+, I have found that the on-heap option yields only modest
performance improvements in most cases. The same is true of copyOnRead=false,
which works in conjunction with the on-heap cache. These options work best when
you have a small number of cache entries that are read many times. In
benchmarks, I've seen performance gains of around 30% for this type of