Alfresco 5.0c backup script WINDOWS
05-28-2015 07:23 AM
Hi everyone,
I am currently trying to set up a backup strategy for our installation of Alfresco. Having read a lot of tutorials about Alfresco backup scripts, I noticed they are mostly written for Linux systems, and the few I could find for Windows are getting slightly old.
I wanted to submit to the forum a sample script that I modified from a Linux version, to get some advice from specialists and find out whether I am on the right track.
Tutorials also mention creating a .pgpass file with the following content:

```
*:*:*:postgres:[DB-PASSWORD]
```

Is this Linux-specific, or does it have to be done on Windows too? If so, where should I put that file?
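For reference: according to the PostgreSQL documentation, `.pgpass` is the name used on Linux/macOS; on Windows the same file is called `pgpass.conf` and lives in the user's application-data folder. The one-line format is identical:

```
REM %APPDATA%\postgresql\pgpass.conf  (Windows counterpart of ~/.pgpass)
REM format: hostname:port:database:username:password
*:*:*:postgres:[DB-PASSWORD]
```

On Linux the file must not be group- or world-readable (chmod 0600) or PostgreSQL ignores it; on Windows no permission check is made, so rely on the folder's ACLs instead.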
Here is the script. Please tell me if everything is OK; I haven't tested it yet, and I am not sure of what I've written because it is a port of a shell script:
```
#!/bin/sh
# Archive Dynamics script - Backup of Alfresco

# Configuration:
CURRENT_FOLDER=$(pwd)                 # Script folder
TIMESTAMP=$(date +%Y%m%d%H%M%S)       # Create timestamp
DUMP_NUM=10                           # Number of days of backups to keep
AL_FOLDER="C:\alfresco"               # Alfresco folder
AL_DATA="C:\alfresco\alf_data"        # Alfresco data folder
DB_HOME="C:\alfresco\postgresql"      # PostgreSQL folder

# Function - Stop Alfresco
function al_stop() {
    $AL_FOLDER\servicerun STOP
    # If Alfresco does not stop we MUST exit the script.
    # Backing up files while Alfresco is working may
    # corrupt data and indexes!!
    if [ "$?" != "0" ]; then
        echo "Alfresco Stop FAILED - STOP SCRIPT!"
        exit 1
    fi
}

# Function - Start Alfresco
function al_start() {
    $AL_FOLDER\servicerun START
}

# Function - Start PostgreSQL server
function p_start() {
    $DB_HOME\scripts\servicerun START
}

# Verify that an argument was provided
if [ -d "$1" ]; then
    # A folder has been provided, save it
    TARGET_FOLDER="$1"
else
    # No argument was provided for the backup location
    echo "Usage: $0 [TARGET_PATH]"
    exit 0
fi

#----------------------------------------------
# 1 - Begin by stopping Alfresco
#----------------------------------------------
al_stop

#----------------------------------------------
# 2 - Backup the Alfresco database
#----------------------------------------------
# Start the PostgreSQL database (which is stopped automatically
# by the Alfresco stop script)
p_start

# Create a filename for the database tar
DB_DUMP=alfresco_db_${TIMESTAMP}.tar

# Backup the database to the target folder
# -Ft = Export database as a tar file
$DB_HOME\bin\pg_dump -Ft alfresco > $TARGET_FOLDER\$DB_DUMP

# Check if an error was returned
if [ "$?" = "0" ]; then
    echo "DB EXPORT WORKED!"
else
    echo "DB EXPORT FAILED!"
fi

#----------------------------------------------
# 3 - Backup the Alfresco content folder
#----------------------------------------------
# Create a file name with timestamp
AL_DUMP=alfresco_data_${TIMESTAMP}.tgz

# Tar the Alfresco data folder to the backup folder specified
tar zcf $TARGET_FOLDER\$AL_DUMP $AL_DATA

#----------------------------------------------
# 4 - Merge the database and data files
#----------------------------------------------
# Create a backup filename with timestamp
BACKUP_FILE="alfresco_bak_${TIMESTAMP}.tgz"
tar zcf $TARGET_FOLDER\$BACKUP_FILE $TARGET_FOLDER\$AL_DUMP $TARGET_FOLDER\$DB_DUMP

# If the files were merged, delete the duplicates
if [ -f "$TARGET_FOLDER\$BACKUP_FILE" ]; then
    echo "BACKUP SUCCESSFUL"
    rm $TARGET_FOLDER\$AL_DUMP
    rm $TARGET_FOLDER\$DB_DUMP
    SUCCESS=1
fi

#----------------------------------------------
# 5 - We're done, start the Alfresco service
#----------------------------------------------
al_start

#----------------------------------------------
# 6 - Remove backups older than DUMP_NUM days
#----------------------------------------------
if [ "$SUCCESS" = 1 ]; then
    find $TARGET_FOLDER -name '*.tgz' -type f -mtime +${DUMP_NUM} -exec rm {} \;
fi
```
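Step 6 is the part of the script most worth testing in isolation. A minimal, runnable sketch of that retention logic, assuming a POSIX shell (so on Windows it needs an environment like Cygwin or Git Bash, since the bash syntax above won't run in cmd.exe natively); the folder and retention count are parameters, not Alfresco specifics:

```shell
#!/bin/sh
# Delete backup archives older than a given number of days.
# A standalone sketch of step 6 of the script above.
prune_backups() {
    target="$1"   # folder holding the *.tgz backup files
    days="$2"     # keep backups newer than this many days
    # -mtime +N matches files last modified more than N days ago
    find "$target" -name '*.tgz' -type f -mtime +"$days" -exec rm -f {} \;
}
```

Note that `find DIR -name '*.tgz'` is safer than the original `find $TARGET_FOLDER\*.tgz`, which relies on the shell expanding the glob before `find` runs and breaks when the folder is empty.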
As I understand it, if the script is OK and works, I should schedule its execution with a Windows scheduled task. Is it recommended to script a backup of the Solr 4 indexes too?
Thanks for your help
Labels:
- Archive
3 REPLIES
05-28-2015 11:45 AM
I've never used an Alfresco version on Windows, so I can only provide a little information (sorry about that), but you are on the right track. The biggest thing is to make sure you back up the alf_data folder (it contains your data and database AND Solr). You don't need to create a separate backup for Solr if you're backing up the entire alf_data directory.
My Windows shell scripting is rusty, but are you combining the two compressed files together (data and database)?
05-29-2015 06:07 AM
Thank you for your answer,
Yes, as you can see, section 4 of the script handles merging the data files and the database. But about Solr 4: are you saying I can actually back up the whole alf_data folder without any risk of corrupting data?
05-29-2015 12:12 PM
The short answer is yes.
You still need to create a database dump, though. I would also recommend a backup strategy that compresses your backups, to head off the space issues that could come up.
You should still read these references on best practices:
http://fcorti.com/2013/02/06/alfresco-backup-script/
http://wiki.alfresco.com/wiki/Backup_and_Restore
I've adopted fcorti's approach to backing up (using Linux): basically, the entire alf_data folder gets backed up and is then compressed. Then I create a dump of the database and compress it as well.
I think the biggest key is that if you adopt this same strategy you MUST always keep the two backups together.
Meaning: you can't restore a data backup from last week and use the current database as of today. You have to restore the database from last week as well… Otherwise you screw yourself…
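One cheap way to enforce "keep the two backups together" is to check, before restoring, that both halves carrying the same timestamp actually exist. A minimal sketch, assuming the file-name pattern from the script earlier in the thread (the folder path and timestamp are placeholders):

```shell
#!/bin/sh
# Return success (0) only if both the data archive and the database dump
# for one timestamp are present, so a restore never mixes two backup runs.
pair_complete() {
    folder="$1"   # backup folder
    stamp="$2"    # shared timestamp, e.g. 20150529120000
    [ -f "$folder/alfresco_data_${stamp}.tgz" ] &&
    [ -f "$folder/alfresco_db_${stamp}.tar" ]
}
```

A restore script could call this first and abort with an error if either half is missing.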
