Performance Issues

zmsil
Champ in-the-making
Hi,

I have recently exported a large number of documents (more than 15,000) from Ephesoft into a space in Alfresco, where rules run a script that moves those documents into my RM site based on their metadata (e.g. creating a folder in the RM site per author, a subfolder in it per title, and then moving the documents there). This mostly works, but at a certain point the rule seems to stop running and Share can't load all of the folders that were created. At first I thought it was still running in the background, but after a while the records still had not moved. I think it's a memory or indexing issue: when I do a full reindex I can see more folders, but still not all of them.
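
For reference, the per-document logic of that script is roughly the following, shown here as a Java sketch against the foundation NodeService/FileFolderService APIs rather than the actual rule script. The cm:author and cm:title properties are assumed to be the driving metadata, and the RM destination is treated as a plain folder tree for simplicity (a real RM site uses record categories and record folders, so the types would differ):

import java.io.Serializable;

import org.alfresco.model.ContentModel;
import org.alfresco.service.cmr.model.FileFolderService;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;

/**
 * Rough outline of the per-document filing logic: a folder per author,
 * a subfolder per title, then move the document into that subfolder.
 */
public class MoveToRmLogic
{
    private final NodeService nodeService;
    private final FileFolderService fileFolderService;
    private final NodeRef rmRoot;   // destination root inside the RM site (assumed)

    public MoveToRmLogic(NodeService nodeService, FileFolderService fileFolderService, NodeRef rmRoot)
    {
        this.nodeService = nodeService;
        this.fileFolderService = fileFolderService;
        this.rmRoot = rmRoot;
    }

    public void fileDocument(NodeRef document) throws Exception
    {
        String author = asString(nodeService.getProperty(document, ContentModel.PROP_AUTHOR));
        String title  = asString(nodeService.getProperty(document, ContentModel.PROP_TITLE));

        // Folder named after the author, then a subfolder named after the title
        NodeRef authorFolder = getOrCreateFolder(rmRoot, author);
        NodeRef titleFolder  = getOrCreateFolder(authorFolder, title);

        // Move the document into the title folder, keeping its current name
        fileFolderService.move(document, titleFolder, null);
    }

    private NodeRef getOrCreateFolder(NodeRef parent, String name)
    {
        NodeRef existing = fileFolderService.searchSimple(parent, name);
        if (existing != null)
        {
            return existing;
        }
        return fileFolderService.create(parent, name, ContentModel.TYPE_FOLDER).getNodeRef();
    }

    private String asString(Serializable value)
    {
        return value == null ? "Unknown" : value.toString();
    }
}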

I also suspected that too much memory was being used between importing all the documents at once and running the rules, so I increased my JVM memory pool to an initial size of 2048 and a maximum size of 6144. Performance is better, but still not optimal. I also increased a few cache sizes in the cache-context.xml file, but still with no success.

How can I optimize Alfresco so that it can import more than 15,000 files, run the rules that move them into the Records Management site, and then load the resulting content without problems?

Zmsil
1 REPLY

marcus_svensson
Champ in-the-making
Are you getting any warnings in the Alfresco logs about "Transactional cache full" or something like that?

In custom-built bulk imports I've worked with, we've sometimes run into problems where a single long-running transaction spans the whole import; this slows the system down and eventually makes it unusable.

Our solution in those cases was to divide the import into separate transactions, for instance starting a new transaction for every x imported documents, or committing the current transaction at regular intervals. A rough sketch of that pattern is below.
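
A minimal sketch of that batching idea using Alfresco's RetryingTransactionHelper; the batch size, the ImportItem type and the importOne() helper are placeholders for whatever your importer actually does per document:

import java.util.List;

import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.transaction.TransactionService;

public class BatchedImporter
{
    private static final int BATCH_SIZE = 500;   // documents per transaction; tune for your setup

    private final TransactionService transactionService;

    public BatchedImporter(TransactionService transactionService)
    {
        this.transactionService = transactionService;
    }

    /**
     * Imports the given items in batches, committing a fresh transaction per batch
     * instead of holding one long-running transaction for the whole import.
     */
    public void importAll(final List<ImportItem> items)
    {
        RetryingTransactionHelper txnHelper = transactionService.getRetryingTransactionHelper();

        for (int start = 0; start < items.size(); start += BATCH_SIZE)
        {
            final List<ImportItem> batch =
                    items.subList(start, Math.min(start + BATCH_SIZE, items.size()));

            // readOnly = false, requiresNew = true: each batch runs in its own transaction,
            // so work is committed and per-transaction state is released between batches.
            txnHelper.doInTransaction(new RetryingTransactionCallback<Void>()
            {
                public Void execute() throws Throwable
                {
                    for (ImportItem item : batch)
                    {
                        importOne(item);   // hypothetical per-document import logic
                    }
                    return null;
                }
            }, false, true);
        }
    }

    private void importOne(ImportItem item)
    {
        // ... create the node, set metadata, move it into the RM site, etc.
    }

    /** Placeholder for whatever describes one document to import. */
    public static class ImportItem { }
}

Because each batch commits on its own, a failure only rolls back that batch, and the per-transaction caches behind the "Transactional cache full" warning are cleared between batches instead of growing for the duration of the whole import.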