We are pleased to announce a new release of Automatiko - 0.5.0
In many cases there is a need to get hold of a particular workflow instance in another environment, be it troubleshooting a production problem, moving between data stores, migrating between different workflow versions, or simply inspecting the complete state of the instance. Exactly for such use cases, Automatiko brings you export and import functionality for workflow instances. It comes with complete coverage of exporting and importing that includes:
Export and import operations are provided to your service via the process management addon. Have a look at the Automatiko documentation for more information about exporting and importing workflow instances.
You can see export and import in action in the video below.
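Conceptually, an export captures the complete state of an instance in a portable form that can later be imported into another environment. The sketch below illustrates that round trip with a simplified, made-up instance structure — it is not Automatiko's actual export format or API:

```python
import json

def export_instance(instance):
    """Serialize a workflow instance (illustrative structure only)
    into a portable JSON document."""
    return json.dumps({
        "id": instance["id"],
        "processId": instance["processId"],
        "variables": instance["variables"],
        "activeNodes": instance["activeNodes"],
    })

def import_instance(document):
    """Recreate the workflow instance state from an exported document."""
    return json.loads(document)

# Round trip: export from one environment, import into another
original = {
    "id": "order-42",
    "processId": "orders",
    "variables": {"customer": "john", "total": 120.0},
    "activeNodes": ["Verify payment"],
}
restored = import_instance(export_instance(original))
assert restored == original
```

The key property is that the exported document is self-describing, so the receiving environment can rebuild the instance without access to the original data store.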
Archiving a workflow instance provides a way to collect the complete set of information about a given instance: the workflow instance is extracted from the system and shipped to the caller as a zip archive. Note that the zip archive has a predefined structure.
The archive operation is provided to your service via the process management addon. Have a look at the Automatiko documentation for more information about archiving workflow instances.
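To make the "predefined structure" idea concrete, the sketch below builds and reads back a zip archive with one top-level instance descriptor and one entry per variable. This layout is purely illustrative — consult the documentation for the structure Automatiko actually produces:

```python
import io
import json
import zipfile

# Illustrative layout (not Automatiko's actual one):
#   instance.json          - metadata about the workflow instance
#   variables/<name>.json  - one entry per workflow variable
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("instance.json",
                     json.dumps({"id": "order-42", "processId": "orders"}))
    archive.writestr("variables/customer.json", json.dumps({"name": "john"}))
    archive.writestr("variables/total.json", json.dumps(120.0))

# Reading the archive back mirrors the same structure
buffer.seek(0)
with zipfile.ZipFile(buffer) as archive:
    names = sorted(archive.namelist())
    metadata = json.loads(archive.read("instance.json"))

print(names)     # ['instance.json', 'variables/customer.json', 'variables/total.json']
print(metadata)  # {'id': 'order-42', 'processId': 'orders'}
```

A fixed, documented layout like this is what makes the archive usable by external tooling rather than only by Automatiko itself.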
Conflict detection on the persistence layer is very important when workflow instances are accessed concurrently. It might not matter much if there is always a single request for a given workflow instance, but if the same instance can be accessed concurrently then dealing with conflicts is a must. A common case of concurrent access is working on a user task that is assigned to a group: many human actors can access the same task (and thereby the same workflow instance) at exactly the same time. There can also be multiple messages coming into the service that target the same workflow instance (based on correlation attributes).
To mitigate the side effects of concurrent updates to the same workflow instance, Automatiko detects conflicts based on a version identifier. Each operation is tagged with a version number that is checked upon completion. If the version matches, the result is stored directly in the data store. If the version does not match, the execution is discarded and retried. This guarantees that newer data is never overridden. It is similar to optimistic locking in the relational database world: it does not cover all possible scenarios but certainly improves data consistency. The main drawback is the retry, which means that failed operations will be re-executed — so it is important that the operations performed are idempotent.
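The check-version-then-retry loop described above can be sketched generically. This is the standard optimistic-locking pattern, not Automatiko's internal code:

```python
class ConflictError(Exception):
    pass

class Store:
    """In-memory data store with optimistic version checking."""
    def __init__(self, data, version=0):
        self.data, self.version = data, version

    def save(self, data, expected_version):
        # Reject the write if someone else saved in the meantime
        if expected_version != self.version:
            raise ConflictError("stale version")
        self.data, self.version = data, self.version + 1

def execute_with_retry(store, operation, max_retries=3):
    """Run an operation against the latest state; on a version conflict,
    discard the result and re-execute (so the operation must be idempotent)."""
    for _ in range(max_retries):
        version = store.version
        result = operation(store.data)
        try:
            store.save(result, version)
            return result
        except ConflictError:
            continue  # someone stored newer data; retry on fresh state
    raise ConflictError("gave up after retries")

store = Store({"approvals": 0})
# Two updates; each re-reads the latest state, so no update is lost
execute_with_retry(store, lambda d: {"approvals": d["approvals"] + 1})
execute_with_retry(store, lambda d: {"approvals": d["approvals"] + 1})
print(store.data)  # {'approvals': 2}

# A stale write is rejected instead of overriding newer data
version = store.version
store.save({"approvals": 3}, version)       # first writer wins
try:
    store.save({"approvals": 99}, version)  # stale version -> conflict
except ConflictError:
    print("conflict detected")
```

The retry path is exactly why idempotency matters: the lambda above can safely run twice, whereas an operation with external side effects (sending an email, charging a card) would need its own deduplication.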
Workflow as a Function Flow is a very powerful concept that allows high scalability based on Knative Eventing while letting developers focus on the complete business use case rather than on individual functions. This release comes with a number of improvements to make it even more powerful.
Activities can now be marked with a dedicated attribute (functionFlowContinue) to indicate that a given activity should be part of the same function. In other words, activities can be chained to form a single function execution. In most cases this is useful when there is no need to invoke operations in separation.

Serverless Workflow is a CNCF specification (currently a sandbox project) that promotes a declarative approach to defining workflows. Automatiko has had basic support for it since 0.1.0, and this release brings additional enhancements to take advantage of serverless workflow based definitions with concepts of
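For readers new to the specification, a definition is a plain JSON (or YAML) document describing states and transitions. The sketch below parses a minimal definition in the general shape of the CNCF Serverless Workflow spec (a single inject state that ends the workflow) — it is a hand-written illustration, not taken from the Automatiko examples:

```python
import json

# A minimal definition in the general shape of the CNCF Serverless
# Workflow specification: one inject state that ends the workflow
definition = """
{
  "id": "greeting",
  "version": "1.0",
  "specVersion": "0.8",
  "start": "Greet",
  "states": [
    {
      "name": "Greet",
      "type": "inject",
      "data": { "message": "Hello" },
      "end": true
    }
  ]
}
"""

workflow = json.loads(definition)
start_state = next(s for s in workflow["states"]
                   if s["name"] == workflow["start"])
print(start_state["type"])  # inject
```

Because the definition is declarative data rather than code, the same document can be validated, visualized, or executed by any engine that implements the specification.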
A complete example can be found in the Automatiko examples GitHub repository, and the concept behind this example is described in the User registration example.
Photographs by Unsplash.