ProcessMaker I/O Adds Process Tokens Endpoint


Last week we released a new version of the ProcessMaker I/O workflow engine API that opens up an additional endpoint for working with process tokens.  Let’s look at how process tokens work.

The findTokens API Endpoint

The ProcessMaker I/O engine creates a process token whenever a process instance starts.  Tokens are used to manage the lifecycle of an instance.  You can think of the token flow as the heartbeat of the engine.

The new API endpoint uses the GET method and looks as follows:

[Image: Process Tokens Endpoint in the workflow API]
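As a rough sketch of what calling this endpoint looks like from code, here is a Python helper using only the standard library. The base URL, path layout, and bearer-token auth header are assumptions for illustration; consult the ProcessMaker I/O API reference for the exact values.

```python
# Sketch of calling the findTokens endpoint. The base URL, path segments,
# and auth scheme below are hypothetical placeholders, not the documented API.
import json
import urllib.request

BASE_URL = "https://yourworkspace.api.processmaker.io/api/v1"  # hypothetical

def find_tokens_url(process_id: str, instance_id: str) -> str:
    """Build the findTokens URL for a given process and instance."""
    return f"{BASE_URL}/processes/{process_id}/instances/{instance_id}/tokens"

def find_tokens(process_id: str, instance_id: str, access_token: str) -> dict:
    """GET the process tokens for one instance (requires a valid access token)."""
    req = urllib.request.Request(
        find_tokens_url(process_id, instance_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Note that both a process ID and an instance ID appear in the path, matching the endpoint's two required inputs.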

How the Process Tokens Concept Works

To use this endpoint, you supply a process ID and an instance ID, because process tokens are specific to each instance of a process.

But what is a process token?

Process tokens embody the very concept of “workflow” in a workflow engine; the engine cannot run without them.  Whenever the engine needs to process something, it issues tokens to compute what will happen in the next task of the workflow.  To better understand token flow, imagine a chessboard.  After each player takes a turn, there is a unique configuration of the pieces on the board, and from that configuration we know all the possible next moves.

Token flow works the same way.  Look at the image below.  At the start event, the only possible token path is the sequence flow out of the start event, represented by “T1.”  Once the token reaches the first task, there is still only one path outward from that task (still “T1”).  When the token reaches the parallel gateway, however, there are two paths forward, and something interesting happens: the initial token “T1” dies and is exchanged for two new tokens (T2 and T3).  Because these two tokens flow out of the same parallel gateway, they share the same token key.  They then die at the parallel join, and a fourth token is born: “T4.”

[Image: Process Token Flow for a simple process]
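The lifecycle described above can be sketched in a few lines of Python. This is an illustrative simulation, not the engine's internal model; the token structure and key-matching rule are assumptions made for the example.

```python
# Minimal simulation of the token lifecycle: a parallel gateway consumes the
# incoming token and emits one new token per outgoing sequence flow (all
# sharing a key), and the parallel join consumes those tokens and emits one.
import itertools

_ids = itertools.count(1)

def new_token(key=None):
    """Create a token with a sequential id (T1, T2, ...) and an optional key."""
    return {"id": f"T{next(_ids)}", "key": key}

def parallel_split(token, outgoing_flows):
    """The incoming token dies; one token per outgoing flow is born,
    all sharing the same key so the join can match them later."""
    key = f"split-{token['id']}"
    return [new_token(key) for _ in outgoing_flows]

def parallel_join(tokens):
    """All incoming tokens must share a key; they die and one token is born."""
    assert len({t["key"] for t in tokens}) == 1, "join requires matching keys"
    return new_token()

t1 = new_token()                                    # start event issues T1
t2, t3 = parallel_split(t1, ["flow_a", "flow_b"])   # T1 dies; T2 and T3 born
t4 = parallel_join([t2, t3])                        # T2 and T3 die; T4 born
print(t1["id"], t2["id"], t3["id"], t4["id"])       # T1 T2 T3 T4
```

The shared key is what lets the join recognize that T2 and T3 belong together before producing T4.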


What you can do with Process Token Flow

By providing access to the token flow, we let a process administrator see the position of every process in the engine at a given moment in time.  If you are a developer integrating the ProcessMaker I/O API into your application, you can now add interfaces, indicators, and views that let users or admins do the following:

  1. Find the place and object where a running instance is paused.
  2. Calculate how many tokens are in an instance.
  3. Detect objects with a large number of tokens waiting for an action before the process can continue.  For example, a user approval task with many pending tokens may indicate a user who is not processing cases fast enough; based on the token count, you might build indicators that trigger escalations, depending on the type of software you are building on ProcessMaker I/O.
  4. Develop statistics and reporting on hot zones and bottlenecks in the process.
  5. Identify the object where a token sits at any given time.
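Use case 3 above can be sketched concretely. The token records here are invented for the example; in practice the fields would come from whatever the tokens endpoint returns for an instance.

```python
# Illustrative bottleneck detection: count tokens waiting at each flow
# element and flag elements whose count meets an escalation threshold.
# The token records and field names below are made up for this sketch.
from collections import Counter

tokens = [
    {"id": "T10", "element": "user_approval", "status": "WAITING"},
    {"id": "T11", "element": "user_approval", "status": "WAITING"},
    {"id": "T12", "element": "user_approval", "status": "WAITING"},
    {"id": "T13", "element": "send_invoice",  "status": "WAITING"},
]

def waiting_counts(tokens):
    """Count WAITING tokens per flow element."""
    return Counter(t["element"] for t in tokens if t["status"] == "WAITING")

def bottlenecks(tokens, threshold=3):
    """Elements whose waiting-token count meets the escalation threshold."""
    return [el for el, n in waiting_counts(tokens).items() if n >= threshold]

print(bottlenecks(tokens))  # ['user_approval']
```

Here the approval task has three pending tokens while invoicing has one, so only the approval task is flagged for escalation.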

Exploring the Tokens Endpoint Further

Tokens are created by either hidden or explicit start events that kick off a new process instance, and a token dies when it reaches the end event of a process.  A token’s movement can be traced through the full sequence flow of a process as it passes through gateways, activities, and events.  Tokens can also be created and killed by the different flow elements: as we saw above, a parallel gateway produces one token for each of the sequence flows leaving it.

Note: a token does not travel along a Message Flow.  A message flow, drawn with a dashed line in BPMN, is not the same as a sequence flow; tokens follow only the sequence flow of a process.  In other words, sending a message from a message event does not generate or push a token forward.  However, if the message reaches a start event in another process, a new token is created at that start event.

Our team is working hard to deliver more engine improvements that make building applications with ProcessMaker I/O even better.

As always, if you have any questions or feedback, you can email us or ask your questions on our public portal.



  1. Putcha V. Narasimham

    The imaginary Token concept was very difficult to understand and apply but this explanation makes the concept simple. To get around this problem of understanding and using tokens, I have adopted a convention of all processes “notifying” to all the connected processes soon after the process in question is completed. So all the “notices” are tokens in a way. In what way is the use of “tokens” better than using “notices”?

    • Brian Reale

      First off, there is a concept in the BPMN world of items in a process notifying each other. This is done through Message Intermediate events. So consider that a process can be broken into “lanes” of responsibility, and one lane may wait for another lane to complete a task and send a message to it. A great intro to intermediate events, covering message events, is available at: .

      How I like to envision this concept of token passing is imagine a line of firefighters trying to put out a fire with a bucket. The fireman at the hydrant/hose starts the process. He fills the bucket with water. The bucket is our token. The first firefighter is done with his task, so he PASSES the bucket to the next man in line. This is the token passing along the process. And once the bucket reaches the last man in line at the fire, the last firefighter empties the bucket on the fire and the process is done.

      • Putcha V. Narasimham

        Thanks Brian. I will read the article you cited.
        Your example of “line of firefighters” just calls “the real water-filled bucket flow” as “token flow”. There is no flow of imaginary tokens in the example you have given. Each firefighter is receiving an input bucket of water and “passing it on” as output bucket of water. Here there is NO “processing of bucket of water” except “transporting it as it is”. Since it is a single sequence of object flow, the imaginary token accompanies the object that flows. There is no need to imagine any token flow. The need for token flow arises when there are non-sequential flows or parallel flows. Even in such non-sequential flows, I see that flow of notice “done” in the absence of object flow has the same effect as imaginary token flow. Such a scheme of flow of notices eliminates the need for imaginary tokens. Sorry for repeating my proposal. Is my proposal valid? Can we do away with imaginary tokens without misrepresenting a process with parallel flows?
