Loading the Data List...

june_cataquez1
Confirmed Champ

Hello fellow Alfrescians,

I have a large custom Data List (around 3,000 rows). When I open the list it takes a very long time to load, and when it finishes it reports 'No list items'. I checked the JVM memory in the Solr admin console at https://localhost:8443/solr4/#/, and it is at 99.9% usage. Could you give me some advice on how to tune Alfresco to fix this?

Best Regards,

June


4 REPLIES

cesarista
World-Class Innovator

Hi June:

To diagnose this, it may help to inspect the JavaScript and Solr queries in catalina.out (and solr.log). Check this to obtain more information about the query:

- How to get logs for Alfresco querys - zylk 
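As a concrete example of turning on query logging (assuming a 4.x/5.x repository; the path and logger name can differ between versions), an override in an extension log4j file makes the repository log every query it sends to Solr:

```properties
# Hedged example: place in an extension override such as
# tomcat/shared/classes/alfresco/extension/custom-log4j.properties
# (the exact path and logger name may differ in your version).
log4j.logger.org.alfresco.repo.search.impl.solr.SolrQueryHTTPClient=debug
```

With this enabled, the generated Solr queries show up in catalina.out, which makes it easier to see what the Data List component is actually asking for.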

On the other hand, more JVM resources (a larger heap) or faster disks for the Solr indexes can help a little, but on their own they are not a silver bullet.

Regards.

--C.

Hi Cesar,

What I did was increase the JVM memory (JAVA_OPTS) to 5 GB. Am I doing it right?
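For reference, on installer-based setups the heap is usually set through JAVA_OPTS in Tomcat's setenv.sh; the exact file and the flags already present vary per installation, so the following is only a sketch:

```shell
# Sketch only: the file differs per install (installer-based setups often
# use tomcat/bin/setenv.sh). -Xms/-Xmx set the initial and maximum heap to
# 5 GB; make sure the machine has enough physical RAM to back this.
export JAVA_OPTS="-Xms5G -Xmx5G ${JAVA_OPTS}"
```

Setting -Xms equal to -Xmx avoids heap resizing pauses at the cost of committing the memory up front.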

Also, do you know by any chance how to tune solr4 so that reindexing is not needed? I run into broken indexes most of the time.

Best Regards,

June

Hi June:

5 GB may be a good starting point for the JVM, but the heap you need depends on many things, such as the number of concurrent users and heavy batch processes. You need to profile the JVM to see how much memory you actually need.
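One lightweight way to profile heap usage over time (assuming a JDK is installed on the server) is to sample the Tomcat JVM with jstat:

```shell
# Print GC and heap utilisation percentages every 5 seconds.
# Replace <pid> with the Tomcat process id (e.g. found via `jps -l`).
jstat -gcutil <pid> 5000
```

If the old-generation column stays near 100% and full GCs are frequent, the heap really is too small; if it stays low, the problem is elsewhere.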

But if you are experiencing other kinds of problems with Solr, such as broken indexes, you should check your installation and find the cause of the frequent index corruption in your solr.log (for example, a faulty hot backup process). By the way, Solr is a heavy consumer of CPU and memory, and the memory it needs depends on the number of nodes in the repository:

Calculate the memory needed for Solr nodes | Alfresco Documentation  

Finally, as I pointed out in my previous message, all of this does not guarantee good behaviour for such a big query.

Regards.

--C.

Thank you so much, this is good information to start looking into. I appreciate the help.

Best Regards,

June