Database violation error when starting a workflow through job scheduling

kranthi
Star Contributor

While trying to start the workflow from the job scheduler, I am getting a database violation error:

ERROR [node.db.DeletedNodeCleanupWorker] [DefaultScheduler_Worker-10] Failed to purge nodes.  If the purgable set is too large for the available DB resources
  then the nodes can be purged manually as well.
  Set log level to WARN for this class to get exception log:
   Max commit time: 1486464301224
   Error:
### Error updating database.  Cause: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
### The error may involve alfresco.node.delete_NodesByTxnCommitTime-Inline
### The error occurred while setting parameters
### Cause: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
; SQL []; ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".; nested exception is org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".

My Java code is below; please go through it and tell me whether anything needs to change.

StartWfBeforeDocExpireJobs.java

public class StartWfBeforeDocExpireJobs extends AbstractScheduledLockedJob {

  private static final Logger logger = Logger.getLogger(StartWfBeforeDocExpireJobs.class);
  private static NodeService nodeService;
  private static SearchService searchService;
  private static WorkflowService workflowService;
  private static AuthorityService authorityService;
  private static AuthenticationComponent authenticationComponent;
  private static TransactionService transactionService;

  @SuppressWarnings("static-access")
  @Override
  public void executeJob(JobExecutionContext jobContext) throws JobExecutionException {
    setUp(jobContext);
    this.authenticationComponent.setCurrentUser("admin");
    expireDoc();
  }

  public void setUp(JobExecutionContext jobContext) {
    logger.debug("in StartWfBeforeDocExpireJobs");
    StartWfBeforeDocExpireJobs.nodeService = (NodeService) jobContext.getJobDetail().getJobDataMap().get("nodeService");
    StartWfBeforeDocExpireJobs.searchService = (SearchService) jobContext.getJobDetail().getJobDataMap().get("searchService");
    StartWfBeforeDocExpireJobs.workflowService = (WorkflowService) jobContext.getJobDetail().getJobDataMap().get("workflowService");
    StartWfBeforeDocExpireJobs.authorityService = (AuthorityService) jobContext.getJobDetail().getJobDataMap().get("authorityService");
    StartWfBeforeDocExpireJobs.authenticationComponent = (AuthenticationComponent) jobContext.getJobDetail().getJobDataMap().get("authenticationComponent");
    StartWfBeforeDocExpireJobs.transactionService = (TransactionService) jobContext.getJobDetail().getJobDataMap().get("transactionService");
    logger.debug("end StartWfBeforeDocExpireJobs");
  }

  @SuppressWarnings({ "static-access", "unchecked" })
  public void expireDoc() {
    @SuppressWarnings("rawtypes")
    RetryingTransactionHelper.RetryingTransactionCallback txnWork = new RetryingTransactionHelper.RetryingTransactionCallback() {
      public Object execute() throws Exception {
        StartWfBeforeDocExpire stwfbfdocExp = new StartWfBeforeDocExpire(StartWfBeforeDocExpireJobs.nodeService,
            StartWfBeforeDocExpireJobs.searchService, StartWfBeforeDocExpireJobs.workflowService,
            StartWfBeforeDocExpireJobs.authorityService);
        logger.debug("method:" + stwfbfdocExp);
        stwfbfdocExp.StartWfBeforeDocExp();
        return null;
      }
    };
    this.transactionService.getRetryingTransactionHelper().doInTransaction(txnWork);
  }
}

StartWfBeforeDocExpire.java

public class StartWfBeforeDocExpire {

  private static Logger logger = Logger.getLogger(StartWfBeforeDocExpire.class);
  private NodeService nodeService;
  private static SearchService searchService;
  private WorkflowService workflowService;
  private AuthorityService authorityService;

  private final String GROUP_NAME = "REVIEWERS";
  private final String WORKFLOW_DESCRIPTION = "Request for approval";
  private final String TASK_DESCRIPTION = "Check contents and review.";
  private final String ACTIVITI_PARALLEL_GROUP_REVIEW_NAME = "activiti$activitiParallelGroupReview";

  @SuppressWarnings("static-access")
  public StartWfBeforeDocExpire(NodeService nodeService, SearchService searchService,
      WorkflowService workflowService, AuthorityService authorityService) {
    this.nodeService = nodeService;
    this.searchService = searchService;
    this.workflowService = workflowService;
    this.authorityService = authorityService;
  }

  public void StartWfBeforeDocExp() throws Exception {
    try {
      String query = "PATH:\"//app:company_home/st:sites/*/cm:documentLibrary//*\" ";
      logger.debug("query: " + query);
      List<NodeRef> nRef1 = luceneSearch(query, "cm:created", true);
      if (nRef1 != null) {
        logger.debug("nodes found: " + nRef1.size());
        for (NodeRef nRef : nRef1) {
          logger.debug("files:" + nRef);
          List<WorkflowInstance> workflowInstances = workflowService.getActiveWorkflows();
          logger.debug("workflowInstances:" + workflowInstances);
          if (workflowInstances.isEmpty()) {
            startWorkflow1(nRef);
            logger.debug("workflow started for selected document");
            break;
          }
        }
      }
    } catch (Exception e) {
      System.out.println("Error: " + e.getMessage());
      e.printStackTrace();
      throw e;
    }
  }

  protected void startWorkflow1(NodeRef contentNodeRef) {
    NodeRef workflowNodeRef = workflowService.createPackage(null);
    Map<QName, Serializable> parameters = new HashMap<QName, Serializable>();
    parameters.put(WorkflowModel.ASSOC_PACKAGE, workflowNodeRef);
    // parameters.put(WorkflowModel.PROP_WORKFLOW_DESCRIPTION, WORKFLOW_DESCRIPTION);
    parameters.put(WorkflowModel.PROP_DESCRIPTION, TASK_DESCRIPTION);
    parameters.put(WorkflowModel.ASSOC_GROUP_ASSIGNEE,
        authorityService.getAuthorityNodeRef("GROUP_" + GROUP_NAME));
    parameters.put(WorkflowModel.PROP_PERCENT_COMPLETE, 50);

    // Add zero or more items
    nodeService.addChild(
        workflowNodeRef,
        contentNodeRef,
        WorkflowModel.ASSOC_PACKAGE_CONTAINS,
        QName.createQName(NamespaceService.CONTENT_MODEL_1_0_URI,
            QName.createValidLocalName(nodeService.getProperty(
                contentNodeRef, ContentModel.PROP_NAME).toString())));

    WorkflowDefinition wfDefinition =
        workflowService.getDefinitionByName(ACTIVITI_PARALLEL_GROUP_REVIEW_NAME);
    workflowService.startWorkflow(wfDefinition.getId(), parameters);
  }

  public static List<NodeRef> luceneSearch(String s, String sortField, Boolean isAsc) {
    SearchParameters sp = new SearchParameters();
    sp.addStore(StoreRef.STORE_REF_WORKSPACE_SPACESSTORE);
    sp.addSort(sortField, isAsc);
    sp.setLanguage(SearchService.LANGUAGE_LUCENE);
    sp.setQuery(s);
    sp.setLimitBy(LimitBy.FINAL_SIZE);
    org.alfresco.service.cmr.search.ResultSet rs = searchService.query(sp);
    if (rs == null) return null;
    if (rs.length() == 0) return null;
    return rs.getNodeRefs();
  }
}

6 REPLIES

afaust
Legendary Innovator

I do not believe the error in DeletedNodeCleanupWorker is in any way related to your attempt to start a workflow. Both are independent jobs which do not call / trigger each other, and I cannot determine any relation via the log excerpts you posted. If you set the log level for org.alfresco.repo.node.db.DeletedNodeCleanupWorker to WARN (as the message states), you will get more detail about this particular error.
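That suggestion amounts to a one-line logging change. Assuming a standard Alfresco installation, a custom-log4j.properties file under the extension directory (e.g. tomcat/shared/classes/alfresco/extension/) is the usual place for it; adjust the path for your setup:

```properties
# Raise the cleanup worker's logger so the full exception is written to the log
log4j.logger.org.alfresco.repo.node.db.DeletedNodeCleanupWorker=WARN
```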

kranthi
Star Contributor

I added that to the log settings; after that I got this:

2017-03-08 16:15:17,085  WARN  [node.db.DeletedNodeCleanupWorker] [DefaultScheduler_Worker-3] Failed to purge nodes.  If the purgable set is too large for the available DB resources
  then the nodes can be purged manually as well.
  Set log level to WARN for this class to get exception log:
   Max commit time: 1486377910675
   Error:
### Error updating database.  Cause: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
### The error may involve alfresco.node.delete_NodesByTxnCommitTime-Inline
### The error occurred while setting parameters
### Cause: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
; SQL []; ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".; nested exception is org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
org.springframework.dao.DataIntegrityViolationException:
### Error updating database.  Cause: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
### The error may involve alfresco.node.delete_NodesByTxnCommitTime-Inline
### The error occurred while setting parameters
### Cause: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
; SQL []; ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".; nested exception is org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
  at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:245)
  at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
  at org.mybatis.spring.MyBatisExceptionTranslator.translateExceptionIfPossible(MyBatisExceptionTranslator.java:71)
  at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:346)
  at com.sun.proxy.$Proxy9.delete(Unknown Source)
  at org.mybatis.spring.SqlSessionTemplate.delete(SqlSessionTemplate.java:259)
  at org.alfresco.repo.domain.node.ibatis.NodeDAOImpl.deleteNodesByCommitTime(NodeDAOImpl.java:393)
  at org.alfresco.repo.domain.node.AbstractNodeDAOImpl.purgeNodes(AbstractNodeDAOImpl.java:2044)
  at org.alfresco.repo.node.db.DeletedNodeCleanupWorker$1.execute(DeletedNodeCleanupWorker.java:110)
  at org.alfresco.repo.node.db.DeletedNodeCleanupWorker$1.execute(DeletedNodeCleanupWorker.java:107)
  at org.alfresco.repo.transaction.RetryingTransactionHelper.doInTransaction(RetryingTransactionHelper.java:450)
  at org.alfresco.repo.node.db.DeletedNodeCleanupWorker.purgeOldDeletedNodes(DeletedNodeCleanupWorker.java:122)
  at org.alfresco.repo.node.db.DeletedNodeCleanupWorker.doCleanInternal(DeletedNodeCleanupWorker.java:62)
  at org.alfresco.repo.node.cleanup.AbstractNodeCleanupWorker$1.doWork(AbstractNodeCleanupWorker.java:173)
  at org.alfresco.repo.node.cleanup.AbstractNodeCleanupWorker$1.doWork(AbstractNodeCleanupWorker.java:168)
  at org.alfresco.repo.security.authentication.AuthenticationUtil.runAs(AuthenticationUtil.java:548)
  at org.alfresco.repo.node.cleanup.AbstractNodeCleanupWorker.doCleanAsSystem(AbstractNodeCleanupWorker.java:182)
  at org.alfresco.repo.node.cleanup.AbstractNodeCleanupWorker.doClean(AbstractNodeCleanupWorker.java:124)
  at org.alfresco.repo.node.cleanup.NodeCleanupRegistry.doClean(NodeCleanupRegistry.java:62)
  at org.alfresco.repo.node.cleanup.NodeCleanupJob.execute(NodeCleanupJob.java:54)
  at org.quartz.core.JobRunShell.run(JobRunShell.java:216)
  at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:563)
Caused by: org.postgresql.util.PSQLException: ERROR: update or delete on table "alf_node" violates foreign key constraint "fk_alf_nprop_n" on table "alf_node_properties"
  Detail: Key (id)=(1584) is still referenced from table "alf_node_properties".
  at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102)
  at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835)
  at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257)
  at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:512)
  at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:388)
  at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:381)
  at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
  at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
  at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
  at org.apache.ibatis.executor.statement.PreparedStatementHandler.update(PreparedStatementHandler.java:22)
  at org.apache.ibatis.executor.statement.RoutingStatementHandler.update(RoutingStatementHandler.java:51)
  at org.apache.ibatis.executor.SimpleExecutor.doUpdate(SimpleExecutor.java:29)
  at org.apache.ibatis.executor.BaseExecutor.update(BaseExecutor.java:87)
  at org.apache.ibatis.executor.CachingExecutor.update(CachingExecutor.java:46)
  at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java:118)
  at org.apache.ibatis.session.defaults.DefaultSqlSession.delete(DefaultSqlSession.java:131)
  at sun.reflect.GeneratedMethodAccessor436.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:338)
  ... 18 more

afaust
Legendary Innovator

This stack trace confirms that the code causing the issue runs in an isolated job and is not triggered by your workflow start job.

The real cause of the problem may be completely unrelated. Apparently a node has been marked as deleted, but not all of its properties were removed when they should have been. This looks like a potential error / bug in the NodeService / NodeDAO area of responsibility.
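If that diagnosis is right, the affected rows could be listed in one pass with a read-only query. This is a hypothetical sketch: it assumes deleted-but-not-yet-purged nodes carry the sys:deleted type (a qname row with local_name 'deleted'); verify the join conditions against your actual schema before relying on it:

```sql
-- Hypothetical check: nodes marked deleted that still own property rows
SELECT n.id
FROM alf_node n
JOIN alf_qname q ON q.id = n.type_qname_id
JOIN alf_node_properties p ON p.node_id = n.id
WHERE q.local_name = 'deleted'
GROUP BY n.id;
```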

Which Alfresco version are you running?

kranthi
Star Contributor

I am using Alfresco Enterprise version 4.2.2.

douglascrp
World-Class Innovator

I have just faced and fixed 101 of these broken records in a PostgreSQL database to which an Alfresco 5.2.f instance is connected.

I had to take every single ID printed in the log file and execute the following SQL statements:

DELETE FROM alf_node_properties WHERE node_id = <ID>;

DELETE FROM alf_node WHERE id = <ID>;

After every execution, I had to re-execute the nodeServiceCleanupJobDetail job, because each run surfaced only 2 IDs.
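The per-ID routine above can be scripted instead of typed by hand. A minimal sketch (the class name, the hard-coded id list, and printing to stdout are illustrative; the real ids come from the "Key (id)=(...)" lines in the log) that emits the two DELETE statements in the required order, properties first so the foreign key is satisfied:

```java
import java.util.Arrays;
import java.util.List;

public class PurgeSqlGenerator {

    // Builds the two cleanup statements for one orphaned node id.
    // The property rows must be deleted before the node row, otherwise
    // the fk_alf_nprop_n constraint fires again.
    static String cleanupSql(long nodeId) {
        return "DELETE FROM alf_node_properties WHERE node_id = " + nodeId + ";\n"
                + "DELETE FROM alf_node WHERE id = " + nodeId + ";";
    }

    public static void main(String[] args) {
        // Illustrative list: ids harvested from the "Key (id)=(1584)" log lines
        List<Long> orphanedIds = Arrays.asList(1584L);
        for (long id : orphanedIds) {
            System.out.println(cleanupSql(id));
        }
    }
}
```

Feed the printed statements to psql, then re-run the cleanup job as described above.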

I used the OOTB Support Tools to manually execute the job.

The problem with this is that I am not sure what exactly caused such inconsistencies.

I hope this information is useful for others facing the same issue.

For anyone looking, the OOTBee Support Tools can be found on GitHub: OrderOfTheBee/ootbee-support-tools