Adding item to sub process loop

kccoyote
Champ in-the-making
I have a (working) process where a query gets executed as a service task, and the results get stored in a database.
The next step is modelled as a sub process that runs once for each of these items (the collection is populated by a function defined in the multiInstanceLoopCharacteristics).

Once all of these items' subprocesses are handled (or aborted), the main process continues.

Now a new requirement has come up: the query must be able to run again at any time (as long as the main process hasn't continued, that is). If any new items are returned, each of them should get an instance of the sub process as well (as if it had been part of the original collection in the multiInstanceLoopCharacteristics).

Is this possible?

It doesn't really matter if it can't be modelled in BPMN: if I can add an item to the sub process programmatically, that's fine as well.

Excerpt:


    <signal id="endHandleItemsSignal" name="endHandleItemsSignal" />

    <process id="processId">

        <startEvent id="startProcess" name="starts the porcess"/>

        <sequenceFlow sourceRef="startProcess" targetRef="executyQuery"/>

        <serviceTask id="executyQuery" name="executes the query"
                     activiti:expression="#{springBean.executeQuery(instance_id)}"
                     activiti:async="${activiti.execution.async}"/>

        <sequenceFlow sourceRef="executyQuery" targetRef="handleItems"/>

        <subProcess id="handleItems">
            <multiInstanceLoopCharacteristics isSequential="false"
                                              activiti:collection="${springBean.getSubprocessItems(instance_id)}"
                                              activiti:elementVariable="item_id">
            </multiInstanceLoopCharacteristics>

            <startEvent id="startHandleItems" />
            …
            <endEvent id="endHandlePatient" />
        </subProcess>

        <boundaryEvent id="boundary" attachedToRef="handleItems" cancelActivity="true">
            <signalEventDefinition signalRef="endHandleItemsSignal"/>
        </boundaryEvent>

        <sequenceFlow sourceRef="boundary" targetRef="endProcess"/>

        <sequenceFlow sourceRef="handleItems" targetRef="endProcess"/>

        <endEvent id="end"/>
    </process>

trademak
Star Contributor
The only solution I can think of is to start the multi-instance sub process again so that it creates the new number of instances. If you add logic to put the already-running instances back in the right state, that might work.
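
A minimal sketch of what that state-keeping could look like, assuming the items inside the sub process end up as user tasks and that the Activiti RuntimeService and TaskService are injected; the class and method names (SubprocessSnapshot, ItemState, snapshot) are illustrative only, not part of this thread:
<code>
import java.util.ArrayList;
import java.util.List;

import org.activiti.engine.RuntimeService;
import org.activiti.engine.TaskService;
import org.activiti.engine.task.Task;

/**
 * Takes a snapshot of the sub process instances that are still running:
 * the item_id each one handles and the user task it is waiting on,
 * so restore logic can put recreated instances back into that state.
 */
public class SubprocessSnapshot {

    public static class ItemState {
        public final String itemId;
        public final String taskDefinitionKey;
        public final String assignee;

        public ItemState(String itemId, String taskDefinitionKey, String assignee) {
            this.itemId = itemId;
            this.taskDefinitionKey = taskDefinitionKey;
            this.assignee = assignee;
        }
    }

    private final RuntimeService runtimeService;
    private final TaskService taskService;

    public SubprocessSnapshot(RuntimeService runtimeService, TaskService taskService) {
        this.runtimeService = runtimeService;
        this.taskService = taskService;
    }

    public List<ItemState> snapshot(String processInstanceId) {
        List<ItemState> states = new ArrayList<ItemState>();

        // every open task in the process instance belongs to one instance of
        // the "handleItems" loop; its execution carries the item_id variable
        List<Task> tasks = taskService.createTaskQuery()
                .processInstanceId(processInstanceId)
                .list();

        for (Task task : tasks) {
            Object itemId = runtimeService.getVariable(task.getExecutionId(), "item_id");
            states.add(new ItemState(
                    itemId == null ? null : itemId.toString(),
                    task.getTaskDefinitionKey(),
                    task.getAssignee()));
        }
        return states;
    }
}
</code>
With a snapshot like this taken just before the sub process is restarted, the restore logic knows which item_ids were in flight and which task each of them was waiting on.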

Best regards,

kccoyote
Champ in-the-making
What would be the best way to 'start the multi instance sub process again'?
I tried with a boundaryEvent:
<code>
        <boundaryEvent id="reExecuteQuery" attachedToRef="handleItems" cancelActivity="false">
            <signalEventDefinition signalRef="reExecuteQuerySignal"/>
        </boundaryEvent>
        <sequenceFlow sourceRef="reExecuteQuery" targetRef="executyQuery"/>
</code>
When I put cancelActivity=true, all existing tasks in the sub process disappear (logical, but not what I want).
When I put cancelActivity=false (and adjust the service method to only return new items), all existing tasks remain and new tasks get created for the new items. The problem is that the new tasks live in their own execution of the loop, so now I have two loop executions.
When I fire the 're-execute' signal again, both executions react and executyQuery gets called twice; next time three times, and so on, causing a cascade of service calls (don't want that either).

Regarding that logic to put the already-running sub processes in the right state again: could you give me some pointers on how to achieve that?

Perhaps it shouldn't be done with a boundary event at all?

kccoyote
Champ in-the-making
Still stuck. Any pointers?

trademak
Star Contributor
You need to add some logic around the multi-instance sub process. This logic needs to determine which instances are already running, so that only the new ones get started. You could implement it in a service task preceding the multi-instance sub process. I do think that a boundary event is the right solution.
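
A sketch of that filtering logic, assuming it is invoked from such a preceding service task (or from getSubprocessItems itself) right before the multi-instance sub process is re-entered; NewItemFilter and filterToNewItems are hypothetical names, and whether each child execution exposes item_id this way should be checked against your Activiti version:
<code>
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

import org.activiti.engine.RuntimeService;
import org.activiti.engine.runtime.Execution;

/**
 * Drops every item that already has an active instance of the
 * "handleItems" sub process, so that a re-run of the query only
 * creates sub process instances for genuinely new items.
 */
public class NewItemFilter {

    private final RuntimeService runtimeService;

    public NewItemFilter(RuntimeService runtimeService) {
        this.runtimeService = runtimeService;
    }

    public List<String> filterToNewItems(String processInstanceId, List<String> queriedItemIds) {
        // collect the item_ids currently being handled inside the sub process
        Set<String> running = new HashSet<String>();
        List<Execution> executions = runtimeService.createExecutionQuery()
                .processInstanceId(processInstanceId)
                .activityId("handleItems")
                .list();
        for (Execution execution : executions) {
            Object itemId = runtimeService.getVariable(execution.getId(), "item_id");
            if (itemId != null) {
                running.add(itemId.toString());
            }
        }

        // keep only the items that do not have a running instance yet
        List<String> newItems = new ArrayList<String>();
        for (String itemId : queriedItemIds) {
            if (!running.contains(itemId)) {
                newItems.add(itemId);
            }
        }
        return newItems;
    }
}
</code>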

Best regards,