04-15-2021 06:10 AM
Hello,
I want to process a large number of documents (around 3 million). The processing consists of updating one metadata field of each document from the value of another field (so each document requires a read and a write against the database).
For that, I was thinking of using Elasticsearch's scroll API.
The problem is that I get a "java.lang.OutOfMemoryError: GC overhead limit exceeded" exception in the middle of processing (even though I have -Xms = -Xmx = 24g in JAVA_OPTS).
I have tried different configurations for the garbage collector, but none of them had much effect.
Can someone help me or give me an idea of how to process a large batch of documents in Nuxeo?
Thank you in advance.
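For context, the kind of loop I have in mind looks roughly like this (a minimal sketch assuming Nuxeo's CoreSession.scroll API; the NXQL query and the field names are placeholders):

```java
import org.nuxeo.ecm.core.api.CoreSession;
import org.nuxeo.ecm.core.api.DocumentModel;
import org.nuxeo.ecm.core.api.IdRef;
import org.nuxeo.ecm.core.api.ScrollResult;
import org.nuxeo.runtime.transaction.TransactionHelper;

public class MetadataMigration {

    // Placeholder NXQL query selecting the documents to update.
    private static final String QUERY = "SELECT * FROM Document WHERE ecm:mixinType = 'MyFacet'";

    public void run(CoreSession session) {
        // Scroll in small batches; the last argument is the keep-alive in seconds.
        ScrollResult<String> result = session.scroll(QUERY, 100, 60);
        while (result.hasResults()) {
            for (String id : result.getResults()) {
                DocumentModel doc = session.getDocument(new IdRef(id));
                // Copy one metadata field into another (placeholder fields).
                doc.setPropertyValue("dc:description", doc.getPropertyValue("dc:title"));
                session.saveDocument(doc);
            }
            session.save();
            // Commit and restart the transaction between batches so session
            // and transaction state do not accumulate across all 3M documents.
            TransactionHelper.commitOrRollbackTransaction();
            TransactionHelper.startTransaction();
            result = session.scroll(result.getScrollId());
        }
    }
}
```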
04-16-2021 04:13 AM
Hello,
I have used Nuxeo's Bulk Action framework and it works well.
Thank you
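In case it helps anyone else, submitting a bulk command looks roughly like this (a minimal sketch assuming the Bulk Service Java API and the built-in setProperties action, as in Nuxeo 10.10; the query and property values are placeholders):

```java
import org.nuxeo.ecm.core.api.CoreSession;
import org.nuxeo.ecm.core.bulk.BulkService;
import org.nuxeo.ecm.core.bulk.message.BulkCommand;
import org.nuxeo.runtime.api.Framework;

public class BulkMetadataUpdate {

    public String submit(CoreSession session) {
        // Placeholder NXQL query selecting the documents to update.
        String nxql = "SELECT * FROM Document WHERE ecm:mixinType = 'MyFacet'";

        BulkService bulkService = Framework.getService(BulkService.class);

        // "setProperties" is the built-in bulk action that writes property
        // values on every document matched by the query.
        BulkCommand command = new BulkCommand.Builder("setProperties", nxql)
                .user(session.getPrincipal().getName())
                .repository(session.getRepositoryName())
                .param("dc:description", "migrated") // placeholder property and value
                .build();

        // Returns a command id that can be used to check progress.
        return bulkService.submit(command);
    }
}
```

The returned command id can then be passed to bulkService.getStatus(commandId) to poll the progress of the run.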
04-21-2021 01:05 AM
Can you explain how to use the bulk action mechanism with custom metadata?