10-03-2016 10:14 AM
Hello,
I am using a Bitnami-hosted Alfresco ECM instance. There is a 400GB drive attached to the AWS image and I am at max capacity. I've deleted the backup images via the Bitnami console (I'm not sure whether those are stored locally or affect the available storage) and am looking for anything else (logs, etc.) that I could clear to reduce the space being used.
It may just be that my content store is now that big, but the last time I checked it was only around half the available storage.
Any ideas would be much appreciated.
Thanks.
10-03-2016 11:31 AM
Maybe this can help you: Alfresco: storage volume estimation | Programming and So
It's not Bitnami-based, but it describes the classic directory structure created by the installer.
10-03-2016 01:47 PM
This was an interesting read... it seems like 400GB should be more than enough for my current storage requirements. I was able to get TreeSize Free to connect over WebDAV and it estimated around 100GB for my content, so it's still not clear where my space is going.
10-03-2016 11:49 AM
Take a look at alf_data/contentstore.deleted. I've seen a lot of cases where people forget to clean that out. You can set up a cron job to delete the contents of that directory on a schedule.
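For example, a daily cron entry along these lines would keep it trimmed (the path and the 7-day window are just placeholders; point it at your actual alf_data directory and pick whatever retention you're comfortable with):

# purge files in contentstore.deleted older than 7 days, every night at 02:00
0 2 * * * find /path/to/alf_data/contentstore.deleted -type f -mtime +7 -delete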
Look at the tomcat temp and work directories.
The logs can also grow to be very big. Look in $ALFRESCO_HOME/tomcat/logs.
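If you want to see what those are actually holding before you delete anything, a quick check along these lines does it (assuming $ALFRESCO_HOME points at your install root as above; the 30-day cutoff for old logs is just an example):

du -sh $ALFRESCO_HOME/tomcat/temp $ALFRESCO_HOME/tomcat/work $ALFRESCO_HOME/tomcat/logs
find $ALFRESCO_HOME/tomcat/logs -type f -mtime +30 -delete   # drop log files older than 30 days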
10-03-2016 01:41 PM
OK, so I have cleared out contentstore.deleted / temp / logs, and contentstore.deleted was the obvious big one... I'm not sure exactly what is in work, so I've not yet deleted its contents.
My Bitnami console now shows 4GB free, which is better than 0 but still not ideal.
Anything else I can look for?
10-03-2016 02:26 PM
I'm not sure what this means: "I'm not sure what exactly is in work so I've not yet deleted the content."
If it is regarding the contentstore.deleted directory, that can safely be cleared out. Those files are not recoverable without a lot of work.
You might log in as an admin, go to your profile, then click Trashcan and see if there are a lot of files there. People often forget to clean out their own trashcan, and if you aren't running something like the trashcan cleaner those files will just sit there indefinitely.
With all of that said, you mentioned you think you may have 100 GB of content. Consider that the Solr indices could take up space equivalent to 40% - 60% of your content store, so that may be another place where a large chunk of space is being used.
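You can measure the index directories directly to confirm that; on a typical install they sit under alf_data, though the exact directory name depends on your version (solr vs. solr4), so treat this as a sketch:

du -sh /path/to/alf_data/solr4   # total space used by the Solr 4 indexes and caches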
You can always switch into any of these directories and run "du -h ." and it will add up the space used for that directory and its children.
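If you'd rather get a per-directory breakdown sorted by size, something along these lines on the Linux box itself usually shows where the space is going:

du -h --max-depth=1 . | sort -h   # one line per immediate subdirectory, largest at the bottom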