09-01-2020 11:49 PM
Hello all,
I think I am not the only person impacted by this issue (I did some Google searching).
When we chose Alfresco as our ECM, we were told that only populated custom properties would consume database space. This assumption was used to size our database server.
Later, we found that this is only partly true:
- True: when you do a bulk import, only populated custom metadata values are written to ALF_NODE_PROPERTIES.
- True: when you add a new property to an existing aspect, ALF_NODE_PROPERTIES remains unchanged; no null properties are created.
- False: when you create a new document (through Share or CMIS), every property of the applied aspects gets a row in ALF_NODE_PROPERTIES, whether a value was provided or not.
- False: when you update a single property of an aspect on a bulk-imported document, all the other (null) properties of that aspect are also added to ALF_NODE_PROPERTIES.
The consequence is that our size estimate was correct for the migration of our previous document base, but since going live the database volume has been growing rapidly with many useless rows holding empty values, and queries on these new documents are very slow compared to the bulk-created ones.
My two questions are:
- Is it possible to delete these property rows without corrupting the repository? I believe Solr does not index null metadata anyway.
- Would a database trigger be a reasonable way to prevent this behaviour at creation or update time, or should we prefer a daily batch that removes the targeted null custom metadata? A rough sketch of the batch I have in mind is below.
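For the second option, this is roughly the repository-side batch I am thinking of, going through the public Java foundation services rather than raw SQL. It is a rough, untested sketch: the aspect name and namespace URI are placeholders for our own model, and it ignores transaction handling and result paging.

// Rough sketch of a daily cleanup component (untested, names are placeholders).
// It looks up nodes carrying our custom aspect and removes properties whose value
// is null, so the corresponding rows are deleted from ALF_NODE_PROPERTIES through
// the public API instead of direct SQL.
import java.io.Serializable;
import java.util.Map;

import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.search.ResultSet;
import org.alfresco.service.cmr.search.SearchService;
import org.alfresco.service.namespace.QName;

public class NullPropertyCleaner {

    // Placeholders: our own model namespace and aspect.
    private static final String CUSTOM_URI = "http://www.example.com/model/custom/1.0";
    private static final QName CUSTOM_ASPECT = QName.createQName(CUSTOM_URI, "customAspect");

    private NodeService nodeService;
    private SearchService searchService;

    public void cleanNullProperties() {
        // Find candidate nodes by aspect (null values themselves are not indexed,
        // so we cannot search for them directly).
        ResultSet results = searchService.query(
                StoreRef.STORE_REF_WORKSPACE_SPACESSTORE,
                SearchService.LANGUAGE_FTS_ALFRESCO,
                "ASPECT:\"" + CUSTOM_ASPECT + "\"");
        try {
            for (NodeRef nodeRef : results.getNodeRefs()) {
                if (!nodeService.exists(nodeRef)) {
                    continue;
                }
                Map<QName, Serializable> props = nodeService.getProperties(nodeRef);
                for (Map.Entry<QName, Serializable> entry : props.entrySet()) {
                    // Only touch properties from our custom namespace that hold no value.
                    if (entry.getValue() == null
                            && CUSTOM_URI.equals(entry.getKey().getNamespaceURI())) {
                        nodeService.removeProperty(nodeRef, entry.getKey());
                    }
                }
            }
        } finally {
            results.close();
        }
    }

    public void setNodeService(NodeService nodeService) {
        this.nodeService = nodeService;
    }

    public void setSearchService(SearchService searchService) {
        this.searchService = searchService;
    }
}

The idea is to only ever go through NodeService, so the caches and the Solr index stay consistent, rather than deleting rows behind Alfresco's back.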
Thank you in advance for your help.
Antoine.