How to use dataflow analysis
1. Dataflow analysis
Dataflow analysis is the process of finding errors in your workflow.
The editor has a dedicated tab where any errors and warnings that were found are displayed.
Of course, the analysis cannot find every mistake, especially not those that only follow from the workflow's logic. The dataflow analysis covers basic semantic errors and mainly ensures that a workflow can be executed at all.
2. Handling Problems in the Workflow
2.1. The Basics
Let's start with an example. The first error you will always get is:
The error will automatically show up under the Dataflow Analysis tab.
To solve this, all you need to do is add an End Event. Then the analysis will be green again!
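If you are curious what such a check amounts to, the following is a minimal sketch in Python of a rule that flags a workflow without an End Event. It is purely illustrative: the node representation and function name are assumptions and do not reflect the editor's actual implementation.

```python
# Hypothetical sketch of a "missing End Event" rule; the node model and
# function name are illustrative, not the editor's real API.
def check_end_event(nodes):
    """Return an error message if the workflow has no End Event node."""
    has_end = any(node.get("type") == "EndEvent" for node in nodes)
    if not has_end:
        return ["Error: the workflow does not contain an End Event."]
    return []

# Example: a workflow without an End Event is flagged;
# adding an End Event makes the check pass again.
workflow = [{"type": "StartEvent"}, {"type": "WriteValue"}]
print(check_end_event(workflow))        # -> one error
workflow.append({"type": "EndEvent"})
print(check_end_event(workflow))        # -> []
```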
As you can see, the editor tries its best to tell you exactly what is wrong in the workflow so you can fix it.
Now let's move on to the second most common error: missing mandatory keys.
Again, the analysis already tells you which node is missing which key. All you need to do here is add the readValue key for the Write Value node.
You can do so by adding a mapping for the key. Further information on mapping can be found here.
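Conceptually, a mandatory-key check like this one could look like the sketch below (again hypothetical Python; the node type Write Value and the key readValue come from the example above, while the mapping structure and names are assumptions):

```python
# Hypothetical sketch of a "missing mandatory key" rule; the mapping
# structure is assumed for illustration and is not the editor's real format.
MANDATORY_KEYS = {"WriteValue": ["readValue"]}

def check_mandatory_keys(node):
    """Report every mandatory key of the node that has no mapping."""
    missing = [key for key in MANDATORY_KEYS.get(node["type"], [])
               if key not in node.get("mappings", {})]
    return [f"Error: node '{node['name']}' is missing mandatory key '{k}'."
            for k in missing]

node = {"type": "WriteValue", "name": "Write Value", "mappings": {}}
print(check_mandatory_keys(node))                  # -> missing 'readValue'
node["mappings"]["readValue"] = "someSourceValue"  # add a mapping for the key
print(check_mandatory_keys(node))                  # -> []
```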
2.2. Views
If you have multiple errors from different tasks and nodes, you can select one of two views to display your errors.
3. Triggering the Analysis
In the menu bar, under "Workflow", you will find two options for the dataflow analysis.
The lower one, "Analyse Automatically", can be selected to automatically refresh the analysis with every change made to the workflow.
For bigger workflows, where the analysis can take some time, you might not want to refresh that often.
In that case, you can deselect the automatic analysis and trigger the analysis yourself.
When you want to trigger an analysis, simply hit the Analyse Dataflow button.
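The trade-off behind the two options can be illustrated with a small sketch (hypothetical Python; the analysis function and editor hooks are assumptions, not the editor's real API): automatic mode re-runs the analysis after every change, which is convenient for small workflows but can become slow for big ones, whereas manual mode only runs when you explicitly ask for it.

```python
# Hypothetical sketch of automatic vs. manual triggering; the analyse()
# function and the editor hooks are assumed for illustration only.
import time

class AnalysisTrigger:
    def __init__(self, analyse, automatic=True):
        self.analyse = analyse      # the (potentially slow) analysis function
        self.automatic = automatic  # corresponds to "Analyse Automatically"

    def on_workflow_changed(self, workflow):
        # In automatic mode, every change refreshes the results immediately.
        if self.automatic:
            return self.analyse(workflow)

    def analyse_dataflow(self, workflow):
        # Manual trigger, corresponds to the "Analyse Dataflow" button.
        return self.analyse(workflow)

def slow_analysis(workflow):
    time.sleep(0.1)                 # stands in for a long-running analysis
    return []

trigger = AnalysisTrigger(slow_analysis, automatic=False)
trigger.on_workflow_changed(["..."])   # does nothing: automatic mode is off
trigger.analyse_dataflow(["..."])      # runs only when explicitly requested
```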