thijslemmens

Overview

This page describes the architecture and installation options for the Alfresco Benchmark Framework.  Specific tests are described here: Benchmark Testing with Alfresco.

There are some additional Instructional Videos available.

Architecture

(Architecture diagram)

Concepts

Scheduled Events

Data Storage and Transmission

See the sample code's  ScheduleProcesses class.

Event Data

An event processor or event producer can generate any number of scheduled events and pass arbitrary data to them. This mechanism is designed for data that is pertinent to the specific event only.

The ScheduleProcesses class passes the processName value as a String to future events.

              // We will attach process name as the event data
              Event eventOut = new Event(eventNameProcess, scheduled, processName);

Any format that can be persisted in MongoDB will support event execution by any load driver. If you want your test to be fully distributable, ensure that you use a String, a Number or a MongoDB-compatible DBObject; the latter is a rich and descriptive way of passing data between events.
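As an illustration, plain Java types can stand in for the structure a DBObject would carry. This sketch uses only java.util; the buildEventData method and its keys are hypothetical, not part of the framework:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EventDataExample {
    // Build a nested, MongoDB-style document from plain Java types.
    // String and Number values map directly to BSON; the nested Map
    // plays the role a DBObject would in the real framework.
    public static Map<String, Object> buildEventData(String processName) {
        Map<String, Object> details = new LinkedHashMap<>();
        details.put("attempts", 1);
        details.put("priority", "high");

        Map<String, Object> data = new LinkedHashMap<>();
        data.put("processName", processName);
        data.put("details", details);
        return data;
    }

    public static void main(String[] args) {
        System.out.println(buildEventData("Process-001"));
    }
}
```

Because every value is a String, a Number or a nested document, data shaped like this can be persisted to the events collection and picked up by any load driver.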

If the event data is not transportable between load drivers (the CMIS Session is a good example), the framework will keep the data in memory and store a reference to it in the events collection. The event will then only be executed by the server that created it.

Session Data

If event data is transient but applies to a sequence of events, the SessionDataService can be used. This data is only available during the test run itself, but it can be reused or manipulated by events that occur *in sequence*. To start a session, create a session ID and assign it to an event, as is done in the sample's ScheduleProcesses class:

           // We want to carry some arbitrary data around in a 'session'
           DBObject sessionObj = new BasicDBObject()
                   .append("key1", "value1")
                   .append("key2", "value2");
           String sessionId = sessionService.startSession(sessionObj);
           eventOut.setSessionId(sessionId);

Any attempt to branch an event will result in the session ending.  Sessions should be terminated when they are logically complete.

       // Session IDs are, by default, carried from originating events to new events.
       // Exceptions that escape to the framework will automatically trigger session closure.
       // Stop the session so that it is possible to count the active sessions
       sessionService.endSession(sessionId);

Data Mirrors

Data mirrors are collections in MongoDB that give the load drivers long-lived access to data that would not easily be obtained while a test is running. Very often, a server-in-test will be alive for weeks, being hit with all manner of load tests and investigations. During this process, the server retains much of its state, such as the user base created. When this data is required by a test run, it is not feasible to rebuild it by querying the test server: that could take a long time and is not really of any interest to the tester. By allowing test implementations direct access to MongoDB, they can store any data states that might be useful for the life of the server-in-test. Additionally, test runs can use a mirror to prevent attempts to create duplicate data and to avoid relying on naming conventions to generate long-lived data.

A data mirror collection follows the naming convention:

mirrors.<target-server-ip>.<function>
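A hypothetical helper that derives a collection name from the convention (mirrorCollection is illustrative, not a framework API):

```java
public class MirrorNames {
    // Build a data mirror collection name following the
    // mirrors.<target-server-ip>.<function> convention.
    public static String mirrorCollection(String targetServer, String function) {
        return "mirrors." + targetServer + "." + function;
    }

    public static void main(String[] args) {
        // Matches the sample project's mirror shown later on this page.
        System.out.println(mirrorCollection("procCentral", "processes"));
        // mirrors.procCentral.processes
    }
}
```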

If the server-in-test is reset or destroyed, the data mirrors are defunct at best and need to be destroyed if the IP address of the server-in-test is going to be reused.  As of V2.0.1 of the framework, this process must still be done manually:

mongo <mongo-data-host>
use bm20-data
db.mirrors.<target-server-ip>.<function>.drop();

The sample project provided stores arbitrary process data using a DAO.  The ScheduleProcesses class uses the DAO for long-term storage to mirror data on the fictional server-in-test.

           processDataDAO.createProcess(processName);
           processDataDAO.updateProcessState(processName, DataCreationState.Scheduled);

This allows subsequent test runs to know exactly which fictional processes have already been triggered on the fictional server-in-test. The default server name used for the sample is 'procCentral', which results in a data mirror:

 mirrors.procCentral.processes

Later, the ExecuteProcess event, which is called for each scheduled process creation, advances the process and records the process state for long-term storage:

               processDataDAO.updateProcessState(processName, DataCreationState.Failed);


Installation

Software Requirements

  • Testers
    • Java version: 1.8
    • Tomcat: apache-tomcat-7.0.55 or later
    • MongoDB: db version v2.6.3 to 3.0.8 - Note: 3.2 currently NOT supported
    • Access to Alfresco released artifacts
    • RoboMongo 0.8.4 (or equivalent required for access to raw results)
  • Developers
    • Apache Maven 3.1.1  (only required when building from source)
    • git version 1.9.2.msysgit.0 (only required when building from source)
    • svn, version 1.8.5 (only required when building Alfresco-specific tests from source)
    • Access to Alfresco released artifacts and, possibly, snapshot artifacts

Benchmark Server Setup

Java

Install Java 1.8.

Mongo DB

Follow the installation instructions for MongoDB.

Use the following version or later:

 db version v2.6.3

Open port 27017 for access from load driver machines, developers and other analysis software.

Typically, all configuration options are stored in a central bm20-config database. Each test (or even each test run) can specify a new database, but starting with a single MongoDB instance is quickest; the bm20-data database will be stored alongside the configuration database.
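For example, an illustrative mongo shell session to confirm the layout (database names as above; the collections shown will depend on which tests you have deployed):

```
mongo <mongo-data-host>
> show dbs
> use bm20-config
> show collections
```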

Tomcat 7

Install Apache Tomcat 7.0.55 or later.

Linux Service

 /var/lib> useradd tomcat
/var/lib> tar -xzf apache-tomcat-7.0.59.tar.gz
/var/lib> echo JAVA_OPTS=\'-server -XX:MaxPermSize=1024m -Xms256M -Xmx2G -Dmongo.config.host=172.xxx.xxx.xxx\' > apache-tomcat-7.0.59/bin/setenv.sh
/var/lib> chmod +x apache-tomcat-7.0.59/bin/setenv.sh
/var/lib> chown -R tomcat apache-tomcat-7.0.59
/var/lib> vi /etc/init.d/tomcat


#!/bin/bash
# description: Tomcat Start Stop Restart
# processname: tomcat
# chkconfig: 234 20 80
PATH=$JAVA_HOME/bin:$PATH
export PATH
CATALINA_HOME=/var/lib/apache-tomcat-7.0.59

case "$1" in
start)
su - tomcat -c "sh $CATALINA_HOME/bin/startup.sh"
;;
stop)
su - tomcat -c "sh $CATALINA_HOME/bin/shutdown.sh"
;;
restart)
su - tomcat -c "sh $CATALINA_HOME/bin/shutdown.sh"
su - tomcat -c "sh $CATALINA_HOME/bin/startup.sh"
;;
esac
exit 0

 /var/lib> chmod +x /etc/init.d/tomcat
/var/lib> chkconfig --add tomcat
/var/lib> chkconfig --list | grep tomcat
/var/lib> service tomcat start

Configuration

You can use any port for Tomcat but the load tests assume port 9080 by default.

To change this, edit <tomcat>/conf/server.xml
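For example, the HTTP connector in server.xml might be changed as follows (a standard Tomcat connector element; only the port attribute matters here, the other attributes are shown with their typical defaults):

```xml
<Connector port="9080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />
```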

 <tomcat>/conf/tomcat-users.xml might contain:
<role rolename="tomcat"/>
<role rolename="manager-gui"/>
<role rolename="manager-script"/>
<role rolename="manager-status"/>
<user username="admin" password="****" roles="tomcat,manager-gui,manager-script,manager-status"/>

The username and password will be required when deploying from Maven during development.

Deployment

At this point, you should have a capable server (equivalent to an AWS m3.xlarge) with MongoDB and Tomcat 7 running.

The JAVA_OPTS for the Tomcat VM must point to the IP address of the local machine; the intention is for the benchmark server to be as close to the data as possible in order to generate reports efficiently.

  1. Download the server application: alfresco-benchmark-server-2.0.5.war.
  2. Connect to your Tomcat server and open the manager application: <tomcat-ip>:9080/manager
  3. Deploy the alfresco-benchmark-server-2.0.5.war to Tomcat using context path alfresco-benchmark-server.
  4. Connect using <tomcat-ip>:9080/alfresco-benchmark-server

Alternatively, deploy using Maven:

  mvn tomcat7:redeploy -DskipTests -Dbm.tomcat.ip=<tomcat-ip> -Dbm.tomcat.port=9080 -Dbm.tomcat.server=bm-remote

where bm-remote is a server entry with credentials defined in your Maven settings.
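A minimal sketch of the corresponding entry in your Maven settings.xml (a standard Maven server element; the id must match the value of -Dbm.tomcat.server, and the credentials are the Tomcat manager ones configured above):

```xml
<server>
  <id>bm-remote</id>
  <username>admin</username>
  <password>****</password>
</server>
```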

If the Tomcat server rejects the upload, the maximum allowed file size may need to be increased by modifying <tomcat>/webapps/manager/WEB-INF/web.xml:

   <multipart-config>
     <max-file-size>104857600</max-file-size>
     <max-request-size>104857600</max-request-size>
     <file-size-threshold>0</file-size-threshold>
   </multipart-config>


Benchmark Load Driver Setup

If a snapshot of the benchmark server machine was taken with just Java and Tomcat installed, then it can be reused for the load drivers.

Java

Install Java 1.8

Tomcat 7

This is exactly the same as for the server, except that the JAVA_OPTS must point to the server's MongoDB instance:

 JAVA_OPTS='-Xmx2048M -Dmongo.config.host=54.xxx.xxx.xxx'

Access to the MongoDB data storage is configured via the server UI, which allows both the drivers and the server to locate the test results; the tests produce data and the server produces reports from that data.

Deployment

  1. Download the sample test application: alfresco-benchmark-sample-2.0.5.war.
  2. Connect to your Tomcat server and open the manager application: <tomcat-ip>:9080/manager
  3. Deploy the alfresco-benchmark-sample-2.0.5.war to Tomcat.
  4. Connect using <tomcat-ip>:9080/alfresco-benchmark-server
  5. Create a new test using 'Add another test' and select alfresco-benchmark-sample-2.0.5-schema:9
  6. On the property editor page, there should be one registered driver under Driver Details
  7. NB: Expand the MongoDB Connection section and click on the ---:27017 entry to assign the location of the test data storage, e.g. 54.xxx.xxx.xxx
  8. Set any other parameters required.  Each test will have a few parameters that need setting when the test is first created.  The sample has no additional mandatory requirements.
  9. Click on the name of your test in the top left
  10. Create a new test run
  11. Press play
  12. Once the test run has finished running, click through and download either the CSV or XLSX results

Additional Reading
