Stream: troubleshooting

Topic: Datasets with large number of files failed to Re-index v6.4


Bikram (Jan 28 2025 at 21:15):

We recently upgraded to v6.4, and re-indexing got stuck three or four times on two datasets that are really huge. We run re-indexing on a separate server that is not part of the production workload. While re-indexing was stuck, the Payara server hung as well and the app was unavailable. It ran for hours; once the Payara server crashed with an OutOfMemory error and restarted, and another time it did not come back after the crash.
I had to manually set the indextime of these two datasets in the DB to proceed with re-indexing the other datasets. On the 6.2 upgrade it also took a while to re-index, but we did not run into a frozen server.
The first dataset has 1,057 files totaling 321 GB on disk.
The second dataset has 2,872 files totaling 84 GB.
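The manual workaround above can be sketched as a one-off SQL update. This is an illustrative sketch only: the `dvobject` table and `indextime` column names, the `dvndb` database name, and the dataset id are assumptions; verify them against your own Dataverse schema before running anything like this.

```shell
# Hypothetical sketch of the workaround described above: stamp indextime
# on a stuck dataset so the re-index job moves past it.
# DATASET_ID is a placeholder for the stuck dataset's dvobject id;
# dvndb and the dvobject/indextime names are assumptions about the schema.
DATASET_ID=12345
psql -d dvndb -c \
  "UPDATE dvobject SET indextime = NOW() WHERE id = ${DATASET_ID};"
```

Note that this only marks the dataset as indexed in the database; its Solr documents remain stale until it is successfully re-indexed later.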

Steve Baroti (Jan 31 2025 at 20:53):

Did you try increasing the memory on your system, Bikram? Did the indexing finish eventually?

Bikram (Jan 31 2025 at 20:56):

The system has 32 GB of memory, which I think should be enough. The indexing did not finish and got stuck. I had to manually update indextime in the DB to proceed with the other datasets.

Steve Baroti (Jan 31 2025 at 21:21):

Bikram, go ahead and try 64 GB, and please keep us posted! :fingers_crossed:

Philip Durbin 🚀 (Feb 25 2025 at 18:41):

@Bikram thanks for opening #11284 about this.

Don Sizemore (Feb 26 2025 at 11:57):

@Bikram what's Payara's Xmx value?

Bikram (Feb 26 2025 at 13:32):

Hi @Don Sizemore, it's -Xmx24576m
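For context, -Xmx24576m is a 24 GB heap on a 32 GB host, which leaves relatively little room for the OS and the JVM's non-heap memory (metaspace, thread stacks, direct buffers). A quick back-of-the-envelope check; the 4 GB non-heap allowance here is an illustrative assumption, not a measurement:

```shell
# Rough headroom check: total RAM minus JVM heap minus an assumed
# non-heap allowance. All values in MB; non_heap_estimate_mb is a guess.
xmx_mb=24576               # current -Xmx
total_mb=32768             # 32 GB host
non_heap_estimate_mb=4096  # assumed metaspace/native/stack overhead
headroom_mb=$((total_mb - xmx_mb - non_heap_estimate_mb))
echo "headroom: ${headroom_mb} MB"   # prints: headroom: 4096 MB
```

With only ~4 GB left for the OS, page cache, and anything else on the box, a long re-index can push the host into swapping, which matches the "frozen server" symptom; lowering -Xmx or adding RAM (as suggested above) both widen that margin.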


Last updated: Oct 30 2025 at 06:21 UTC