I'm looking for advice on how to approach simulating an Order to Cash process in which the size of the orders (measured in terms of the number of distinct line items [aka "part numbers"]) varies widely and has a significant impact on the time it takes to perform many activities. These "line-item-dependent" activities essentially get repeated once for each line item in the order. I'm looking for two mechanisms to simulate this:



(1) A function (or other object) in which I can set the number of line items in an order based on a distribution. The practical effect would be that if I run a simulation with a start event frequency of 100/day, I would get 100 orders, each with a different number of line items, and that value would be retained for each order/process instance (see the sketch after point (2) below).



(2) A looping mechanism to repeat the line-item-dependent activities the same number of times as there are line items in the order. This would require the looping mechanism to be aware of how many loops have been performed and to be able to compare that count with the number of line items in the order.
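
To make the two mechanisms concrete, here is a rough sketch in plain Python (not in any particular simulation tool) of the behavior I'm after; the lognormal distribution and its parameters are just placeholders I would tune to real order data:

```python
import random

def sample_line_item_count():
    # (1) Draw the order size from a distribution. Heavy-tailed, so most
    # orders are small and a few are very large (placeholder parameters).
    return max(1, round(random.lognormvariate(mu=1.5, sigma=0.8)))

def perform_line_item_activities(order, line_index):
    # Stand-in for the line-item-dependent work (pick, pack, invoice line, ...).
    pass

def process_order(order):
    # (2) Repeat the line-item-dependent activities once per line item,
    # exiting when the loop counter reaches the order's line-item count.
    loops_done = 0
    while loops_done < order["line_items"]:
        perform_line_item_activities(order, loops_done)
        loops_done += 1

# A start event frequency of 100/day would yield 100 process instances,
# each retaining its own sampled line-item count.
orders = [{"order_id": i, "line_items": sample_line_item_count()} for i in range(100)]
for order in orders:
    process_order(order)
```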



Most of my modeling is in BPMN, though I've played around a bit with EPCs.

In the case of BPMN, I'm not sure how to implement (1). With BPMN I can start to realize (2) by creating a function with a Loop Maximum attribute, but I don't know how to set that attribute dynamically during the simulation (i.e., read the value generated in (1) and set Loop Maximum to that value for each process instance).



In the case of EPC, there are ERD Attributes that could possibly be used to set the number of line items (1) and to keep track of how many times a loop has executed, but I'm not sure how to compare the two ERD Attributes (# of line items; # of loops) to control when the loop exits (exit if # of loops = # of line items).
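
Put differently, the rule I want the exit decision (presumably an XOR connector in the EPC) to evaluate is just a comparison of those two per-instance attributes, sketched here in Python for clarity:

```python
def should_exit_loop(loops_executed, line_item_count):
    # Exit once every line item in the order has been processed.
    return loops_executed == line_item_count
```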



This seems like a pretty common business pattern, and I’m hoping there are standard approaches to modeling and simulating it.



I'd appreciate any advice any of you could provide.
