Automatiko 0.5.0 released

Workflow instance import, export and archive

Posted by Automatiko team on June 01, 2021 · 6 mins read

We are pleased to announce a new release of Automatiko - 0.5.0


  • Export and import of workflow instances
  • Archive of workflow instances
  • Conflict detection on persistence layer
  • Function Flow execution improvements
  • Serverless workflow support for functions and function flows

Export and import of workflow instances

In many cases there is a need to get hold of a particular workflow instance in another environment, be it troubleshooting or debugging a production problem, moving between data stores, migrating between workflow versions, or simply inspecting the complete state of the instance. Exactly for such use cases Automatiko brings you export and import functionality for workflow instances. It comes with complete coverage of exporting and importing that includes:

  • all workflow instance variables
  • all active node instances
  • sub workflow instances if there are any
  • timers of any kind if there are any

This gives you a complete set of information about a given workflow instance and its dependencies. An exported instance can be imported into any environment that runs the same service, even one backed by a different data store. When moving between workflow definition versions, the exported instance may need to be adjusted if there were non backward compatible changes in the definitions.

Export and import operations are provided to your service via the process management addon. Have a look at the Automatiko documentation for more information about exporting and importing workflow instances.
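To make the scope of an export concrete, here is a rough sketch, in plain Java, of what an exported instance carries. The field names and sample values are illustrative assumptions, not Automatiko's actual export format:

```java
import java.util.List;
import java.util.Map;

// Hypothetical model of an exported workflow instance; field names are
// illustrative only, not Automatiko's real export document structure.
class ExportedInstance {
    final String instanceId;
    final Map<String, Object> variables;       // all workflow instance variables
    final List<String> activeNodes;            // all active node instances
    final List<ExportedInstance> subInstances; // sub workflow instances, if any
    final List<String> timers;                 // timers of any kind, if any

    ExportedInstance(String instanceId, Map<String, Object> variables,
                     List<String> activeNodes, List<ExportedInstance> subInstances,
                     List<String> timers) {
        this.instanceId = instanceId;
        this.variables = variables;
        this.activeNodes = activeNodes;
        this.subInstances = subInstances;
        this.timers = timers;
    }

    // Importing only requires an environment running the same service
    // (workflow definition); the data store behind it may differ.
    static ExportedInstance sample() {
        return new ExportedInstance(
            "order-123",
            Map.of("customer", "jdoe", "total", 99.50),
            List.of("WaitForApproval"),
            List.of(),
            List.of("timer-escalation-PT2H"));
    }
}
```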

You can see export and import in action in the video below.

Archive of workflow instances

Archiving a workflow instance provides a way to collect the complete set of information about a given instance. The workflow instance is extracted from the system and shipped to the caller as a zip archive. Importantly, the zip archive has a predefined structure:

  • exported workflow instance file - based on the export functionality
  • each variable stored in a separate file as a JSON document
  • sub workflow instances (if any) stored in a subdirectory with the same structure as the top level (export file and variables)

The archive operation is provided to your service via the process management addon. Have a look at the Automatiko documentation for more information about archiving workflow instances.
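The layout described above can be sketched with plain `java.util.zip`; the entry names below are illustrative assumptions, not Automatiko's exact file names:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

class ArchiveSketch {
    // Builds a zip mirroring the described layout: one export file,
    // one JSON file per variable, sub instances in their own subdirectory.
    static byte[] buildArchive() {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bytes)) {
            put(zip, "instance.json", "{\"id\":\"order-123\"}");                 // exported instance
            put(zip, "variables/customer.json", "{\"name\":\"jdoe\"}");          // one file per variable
            put(zip, "sub/approval-7/instance.json", "{\"id\":\"approval-7\"}"); // sub instance, same layout
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bytes.toByteArray();
    }

    static void put(ZipOutputStream zip, String name, String json) throws IOException {
        zip.putNextEntry(new ZipEntry(name));
        zip.write(json.getBytes(StandardCharsets.UTF_8));
        zip.closeEntry();
    }

    // Lists entry names, e.g. to inspect a downloaded archive.
    static List<String> entryNames(byte[] archive) {
        List<String> names = new ArrayList<>();
        try (ZipInputStream zip = new ZipInputStream(new ByteArrayInputStream(archive))) {
            for (ZipEntry e; (e = zip.getNextEntry()) != null; ) names.add(e.getName());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return names;
    }
}
```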

Conflict detection on persistence layer

Conflict detection on the persistence layer is very important when workflow instances are accessed concurrently. It might not matter much if there is always a single request for a given workflow instance, but whenever the same instance can be accessed concurrently, dealing with conflicts is a must. A common case of concurrent access is working on a user task assigned to a group: many human actors can access the same task (and thereby the same workflow instance) at the same time. There can also be multiple messages coming into the service that target the same workflow instance (based on correlation attributes).

To mitigate side effects of concurrent updates to the same workflow instance, Automatiko detects conflicts based on a version identifier. Each operation is tagged with a version number that is checked upon completion. If the version matches, the result is stored directly in the data store. If the version does not match, the execution is discarded and retried, ensuring that newer data is never overwritten. This is similar to optimistic locking in the relational database world. It does not cover all possible scenarios but certainly improves data consistency. The main drawback is the retry: operations that failed the version check are re-executed, so it is important that the operations performed are idempotent.
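The mechanism can be illustrated with a small sketch. This mirrors the optimistic-locking idea described above, not Automatiko's internal implementation:

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.UnaryOperator;

// Sketch of version-based conflict detection: each stored state carries a
// version; an update commits only if the version it started from is still
// current, otherwise the whole operation is retried.
class VersionedStore<T> {
    record Versioned<T>(long version, T value) {}

    private final AtomicReference<Versioned<T>> state;

    VersionedStore(T initial) {
        state = new AtomicReference<>(new Versioned<>(0, initial));
    }

    // Retries until the compare-and-set succeeds, so `operation` must be
    // idempotent: it may run more than once under contention.
    T update(UnaryOperator<T> operation) {
        while (true) {
            Versioned<T> current = state.get();
            T result = operation.apply(current.value());
            // commit only if nobody stored a newer version in the meantime
            if (state.compareAndSet(current, new Versioned<>(current.version() + 1, result))) {
                return result;
            }
        }
    }

    long version() { return state.get().version(); }
}
```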

Function Flow execution improvements

Workflow as a Function Flow is a very powerful concept that allows high scalability based on KNative Eventing while letting developers focus on the complete business use case rather than just individual functions. This release comes with a number of improvements to make it even more powerful.

  • control over activities included in a single function - developers can declare with a custom attribute (functionFlowContinue) that a given activity should be part of the same function. In other words, activities can be chained to form a single function execution. In most cases this is useful when there is no need to invoke an operation in isolation
  • REST invocation (based on OpenAPI) - Function Flows can now use OpenAPI based client invocation of REST endpoints as part of the workflow definition. This serves as a cornerstone for service orchestration use cases, which function flows fit very well.
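The chaining idea can be illustrated with a conceptual sketch. This is not the Automatiko API (the functionFlowContinue attribute itself is set on the workflow definition); the flag below is a made-up stand-in meaning "the next activity runs in the same function":

```java
import java.util.ArrayList;
import java.util.List;

// Conceptual sketch: activities marked to continue are chained into one
// function execution instead of each becoming a separately invoked function.
class FunctionFlowSketch {
    record Activity(String name, boolean continueInSameFunction) {}

    // Groups consecutive activities into function boundaries: an activity
    // whose flag is false closes the current function.
    static List<List<Activity>> toFunctions(List<Activity> activities) {
        List<List<Activity>> functions = new ArrayList<>();
        List<Activity> current = new ArrayList<>();
        for (Activity a : activities) {
            current.add(a);
            if (!a.continueInSameFunction()) {
                functions.add(current);
                current = new ArrayList<>();
            }
        }
        if (!current.isEmpty()) functions.add(current);
        return functions;
    }
}
```

With three activities where the first is chained, the flow collapses from three function invocations to two, saving one round trip through the event infrastructure.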

Serverless workflow support for functions and function flows

Serverless Workflow is a CNCF specification (currently a sandbox project) that promotes a declarative approach to defining workflows. Automatiko has had basic support for it since 0.1.0, and this release brings additional enhancements to take advantage of serverless workflow based definitions with the concepts of

  • Workflow as a Function
  • Workflow as a Function Flow

With this, users can build event driven workflows that execute as functions on platforms like KNative.
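For reference, a minimal Serverless Workflow definition has roughly this shape (the workflow name, function name, and OpenAPI operation reference below are made up for illustration; the exact required fields depend on the spec version):

```json
{
  "id": "userregistration",
  "version": "1.0",
  "name": "User registration",
  "start": "RegisterUser",
  "functions": [
    { "name": "registerUser", "operation": "specs/users.json#register" }
  ],
  "states": [
    {
      "name": "RegisterUser",
      "type": "operation",
      "actions": [
        { "functionRef": { "refName": "registerUser" } }
      ],
      "end": true
    }
  ]
}
```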

A complete example can be found in the Automatiko examples GitHub repository, and the concept behind this example is described in the User registration example.

Photographs by Unsplash.