<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: OutOfMemory exception when processing a large number of documents - Nuxeo Forum</title>
    <link>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314052#M1053</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;</description>
    <pubDate>Wed, 21 Apr 2021 09:38:53 GMT</pubDate>
    <dc:creator>Ahmad_BenMaallem</dc:creator>
    <dc:date>2021-04-21T09:38:53Z</dc:date>
    <item>
      <title>OutOfMemory exception when processing a large number of documents</title>
      <link>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314049#M1050</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;I want to process a large number of documents (around 3 million).
The process consists of updating one metadata field of each document based on another field, so each document requires both a read and a write against the database.&lt;/P&gt;
&lt;P&gt;For that, I was considering Elasticsearch's scroll API to iterate over the documents.&lt;/P&gt;
&lt;P&gt;The problem is that I get a "java.lang.OutOfMemoryError: GC overhead limit exceeded" in the middle of processing, even with Xmx = Xms = 24g set in JAVA_OPTS.&lt;/P&gt;
&lt;P&gt;I have tried different garbage collector configurations, but none had a noticeable effect.&lt;/P&gt;
&lt;P&gt;Can someone help me, or suggest an approach for processing a large batch of documents in Nuxeo?&lt;/P&gt;
&lt;P&gt;Thank you in advance.&lt;/P&gt;</description>
      <pubDate>Thu, 15 Apr 2021 10:10:38 GMT</pubDate>
      <guid>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314049#M1050</guid>
      <dc:creator>Ahmad_BenMaallem</dc:creator>
      <dc:date>2021-04-15T10:10:38Z</dc:date>
    </item>
    <item>
      <title>Re: OutOfMemory exception when processing a large number of documents</title>
      <link>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314050#M1051</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;I have used the Bulk Action mechanism in Nuxeo, and it works well for this.&lt;/P&gt;
&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Apr 2021 08:13:13 GMT</pubDate>
      <guid>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314050#M1051</guid>
      <dc:creator>Ahmad_BenMaallem</dc:creator>
      <dc:date>2021-04-16T08:13:13Z</dc:date>
    </item>
    <item>
      <title>Re: OutOfMemory exception when processing a large number of documents</title>
      <link>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314051#M1052</link>
      <description>&lt;P&gt;Could you explain how to use bulk upload with custom metadata?&lt;/P&gt;</description>
      <pubDate>Wed, 21 Apr 2021 05:05:24 GMT</pubDate>
      <guid>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314051#M1052</guid>
      <dc:creator>sujoy_debnath</dc:creator>
      <dc:date>2021-04-21T05:05:24Z</dc:date>
    </item>
    <item>
      <title>Re: OutOfMemory exception when processing a large number of documents</title>
      <link>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314052#M1053</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;</description>
      <pubDate>Wed, 21 Apr 2021 09:38:53 GMT</pubDate>
      <guid>https://connect.hyland.com/t5/nuxeo-forum/outofmemory-exception-when-processing-a-large-number-of/m-p/314052#M1053</guid>
      <dc:creator>Ahmad_BenMaallem</dc:creator>
      <dc:date>2021-04-21T09:38:53Z</dc:date>
    </item>
  </channel>
</rss>

