1. What you will learn

After finishing this tutorial you will have a basic overview of what mappings are, why they are needed and how to use them.

2. Prerequisites

To complete this guide you will need:

  • Roughly 10 minutes

  • Basic understanding of Workflows

  • The PAK Bpmn Editor (Download)

  • The PAK Workflow Executor (Download)

3. Data Transportation in PAK

In order to demonstrate the usage of mappings, let’s first set up a simple workflow in the Bpmn Editor:

Figure 1. Blueprint of the workflow used to demonstrate mappings

3.1. Which Commands Did We Use?

  • Simple Text Input: Prompts the user with a field to enter text

  • Check Equals: Checks if two inputs are the same

  • Log Value: Prints a value on the screen

3.2. What Do We Want to Achieve?

The workflow is currently only an enumeration of commands with no real purpose, so let’s give it one. We want the workflow to take a user input, check whether that input is "Hello World", and print the result.
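
For orientation, the intended behavior corresponds roughly to the following plain Java sketch (an illustration only; in PAK the logic is composed from the commands above rather than written by hand, and the class name is made up):

    // Conceptual sketch of the desired workflow behavior (not PAK code)
    import java.util.Scanner;

    public class HelloWorldWorkflowSketch {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            System.out.println("Input your text here");         // Simple Text Input
            String outputText = in.nextLine();
            boolean result = "Hello World".equals(outputText);  // Check Equals
            System.out.println(result);                         // Log Value
        }
    }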

3.3. How Do We Achieve That?

All data in PAK passes through the so-called datastore. The datastore can essentially be seen as a collection of key-value pairs. Assuming the user types "Hello World" in the text prompt, the datastore would contain a pair of the form (outputText → "Hello World").
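
Purely as an illustration (this is not PAK’s actual API, and the class name is made up), you can picture the datastore like a plain Java map:

    // Illustration only: the datastore pictured as a key-value map
    import java.util.HashMap;
    import java.util.Map;

    public class DatastoreSketch {
        public static void main(String[] args) {
            Map<String, Object> datastore = new HashMap<>();
            // After the user confirms the prompt, Simple Text Input writes:
            datastore.put("outputText", "Hello World");
            System.out.println(datastore); // {outputText=Hello World}
        }
    }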

Figure 2. Retrieving a command’s generated keys

The keys which a command generates can be retrieved by hovering over the respective box in the editor; all generated keys are listed under Output. Likewise, all keys that a command consumes are found under Input.

Note that a command might also consume and/or generate no keys at all.

Not all parameters are necessary for a command to run; the Dataflow Analysis tab of the Bpmn Editor shows the minimum set of keys required for the workflow to be ready to run.

Figure 3. The Dataflow Analysis tab

4. Basic Mappings

As the reasoning behind mappings might be hard to grasp, let’s demonstrate them in our example. As you can see, our workflow is currently missing 4 datastore entries. We can satisfy those entries by providing the respective commands with mappings to fill the gaps.

In general there are two kinds of mappings, illustrated in the sketch after this list and discussed in more detail below.

  • Constant Mapping: Provides a hard-coded value for the key in question

  • Key/Datastore Mapping: Provides a key to look up in the datastore.
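
A hypothetical Java model of these two kinds, purely to illustrate the distinction (these are not PAK’s actual classes):

    // Hypothetical model of the two mapping kinds (illustration only)
    sealed interface Mapping permits ConstantMapping, KeyMapping {}

    // Provides a hard-coded JSON value for the target input
    record ConstantMapping(String jsonValue) implements Mapping {}

    // Provides a datastore key whose value is looked up at run time
    record KeyMapping(String datastoreKey) implements Mapping {}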

4.1. Constant Mapping

The first command we’ll look into is Simple Text Input. When comparing its inputs with the warnings in the Dataflow Analysis tab, we can see that this command does not require any additional data to function. However, we still want to modify the optional descriptiveText input, as this key tells the user what they need to do.

Figure 4. Create new mapping
Figure 5. Select key to map
Figure 6. Provide constant mapping

In order to provide the mapping, we first left-click the Simple Text Input (Fig 4. 1) and hit the Add Input Mapping button or the plus sign on the right side of Input (Fig 4. 2). Select descriptiveText from the key dropdown (Fig 5.), press the Constant Mapping button (Fig 6. 4), and type the desired description (Fig 6. 5), in our case "Input your text here". For mandatory keys this should resolve any Dataflow Analysis warning about missing or empty keys.

To ensure correct data transportation, make sure that your inputs conform to the JSON format. For example (see the decoding sketch after these examples):

  • To supply an empty text, enter ""

  • To supply "", enter "\"\""
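
PAK’s exact parsing is not part of this tutorial; the following sketch simply demonstrates how the two encodings above decode as JSON, using the Jackson library (an assumption made for illustration, not necessarily what PAK uses internally):

    // How the JSON-encoded constants above decode (Jackson used for illustration)
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonConstantSketch {
        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();

            // The JSON text ""  decodes to an empty string.
            String empty = mapper.readValue("\"\"", String.class);
            System.out.println(empty.isEmpty()); // true

            // The JSON text "\"\""  decodes to a string of two quote characters.
            String quotes = mapper.readValue("\"\\\"\\\"\"", String.class);
            System.out.println(quotes); // ""
        }
    }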

Figure 7. Key generated by the Check Equals command

Let’s look at the Check Equals command next. By hovering over the command we can see that it needs two inputs, namely inputA and inputB. As mentioned above, inputB should contain the value that was generated as output by the Simple Text Input (the expected text "Hello World" can be supplied to inputA with a constant mapping, just like descriptiveText above). Thus, in order to transfer the user input to the field inputB of the Check Equals command, use a Key Mapping, which is described in the following section.

4.2. Key/Datastore Mapping

Key mappings define a binding between an input parameter and a value residing in the datastore. Recalling 3.3, we found out that Simple Text Input writes a pair with the key outputText to the datastore. We now need to somehow reference this key in the Check Equals command.

Figure 8. Provide datastore mapping (builds upon Fig 6.)

In order to do so, simply select Key Mapping instead of Constant Mapping (Fig 8. 4) and select the outputText key (Fig 8. 5). This essentially means that instead of providing a hard-coded value for this input, it will be fetched dynamically while the workflow is running. The input inputB is now mapped to outputText and will be filled with that key’s value from the datastore.
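
Conceptually, the run-time lookup behaves roughly like this (a hypothetical sketch, not PAK’s engine code; the class name is made up):

    // Hypothetical sketch of how a key mapping is resolved at run time
    import java.util.HashMap;
    import java.util.Map;

    public class KeyMappingSketch {
        public static void main(String[] args) {
            Map<String, Object> datastore = new HashMap<>();
            datastore.put("outputText", "Hello World"); // written by Simple Text Input

            // Key mapping configured in the editor: inputB <- outputText
            Map<String, String> inputMappings = Map.of("inputB", "outputText");

            // The value is fetched dynamically while the workflow is running
            Object inputB = datastore.get(inputMappings.get("inputB"));
            System.out.println(inputB); // Hello World
        }
    }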

To print the result of our comparison we can either repeat the steps above and map logOutput of Log Value to result or map the output key of Check Equals as described below.

Figure 9. Provide write key mapping

With a Key Mapping you also have the possibility to change a parameter name of a command. This way you can define a customized key for the datastore and keep a better overview in a larger workflow.
In our example we achieve this by clicking the Check Equals command, clicking the Add Output Mapping button or the plus sign on the right of Output (Fig 4. 2), selecting the output field result (Fig 9. 1), and specifying the customized key, in this case logOutput, in the text field (Fig 9. 2).
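
Again purely as an illustration (not PAK’s actual code), an output mapping only changes the key under which the command’s result is written to the datastore:

    // Hypothetical sketch: an output mapping renames the stored result key
    import java.util.HashMap;
    import java.util.Map;

    public class OutputMappingSketch {
        public static void main(String[] args) {
            Map<String, Object> datastore = new HashMap<>();

            boolean result = true; // produced by Check Equals under the name "result"

            // Output mapping configured in the editor: result -> logOutput
            datastore.put("logOutput", result);

            System.out.println(datastore); // {logOutput=true}
        }
    }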

If the mapping value for that key is left empty, you will get a warning and it will automatically be filled with the name of the key, in this case "result".

5. Testing the Workflow

Save the workflow to a location of your choice and open the resulting .bpmn file in the PAK Workflow Executor.

Figure 10. Load the workflow
Figure 11. Run the workflow
Figure 12. Validate results

The analysis should yield no errors, and you can simply run the workflow. You will be prompted to enter a text with the description we provided above. After providing the correct input (in our case "Hello World"), true will be printed to the console.

Sonatype Nexus

PAK features connectors and commands for Sonatype Nexus. This means the software can directly interact with Nexus repositories for storing and managing artifacts. Through these connectors, PAK can automate tasks like uploading binaries or retrieving dependencies, ensuring efficient artifact management within Nexus.

Jenkins

PAK has connectors and commands for Jenkins. This allows the software to directly communicate with Jenkins servers, enabling the automation of CI/CD (Continuous Integration/Continuous Deployment) tasks. Through these connectors, PAK can trigger builds, fetch build statuses, or manage job configurations, streamlining the CI/CD processes within Jenkins.

GitHub

PAK possesses connectors and commands for GitHub. This means the software can interface directly with GitHub repositories, facilitating actions like code pushes, pull requests, or issue tracking. Through these connectors, PAK can automate various GitHub operations, enhancing code collaboration and repository management.

Atlassian Confluence

PAK is equipped with connectors and commands for Atlassian Confluence. This enables the software to directly interact with Confluence spaces and pages. Through these connectors, PAK can automate actions such as creating, updating, or retrieving documentation, ensuring efficient content management and collaboration within Confluence.

Codebeamer

PAK features connectors and commands for Codebeamer. This allows the software to seamlessly integrate with Codebeamer’s ALM (Application Lifecycle Management) platform. Through these connectors, PAK can automate tasks like issue tracking, test management, or requirements tracing, enhancing the coordination and management of software development processes.

JFrog Artifactory

PAK has connectors and commands for JFrog Artifactory. This means the software can directly interface with Artifactory repositories, enabling actions like artifact storage, retrieval, and management. Through these connectors, PAK can automate tasks such as deploying artifacts or managing repository configurations, streamlining the integration and management of binary artifacts within Artifactory.

Amazon Web Services (AWS)

PAK has connectors and commands for Amazon Web Services (AWS). This means the software possesses specialized interfaces to directly interact with AWS services and execute actions on the AWS platform. Through these connectors, PAK can automate AWS-specific commands, such as launching EC2 instances, managing S3 buckets, or configuring Lambda functions. This allows for efficient integration, management, and automation of AWS resources and services directly from PAK.

Atlassian Jira

PAK features integration tools and capabilities for Atlassian Jira. These tools allow for a direct connection to Jira and the execution of specific actions. Using these integration tools, PAK can automate Jira actions such as adding comments or changing ticket priorities, ensuring seamless handling and coordination of Jira processes.

Git

PAK has connectors and commands for Git. This means it has interfaces to directly communicate with Git and execute actions. Through these connectors, the software can automate Git commands such as retrieving changes or creating branches, enabling efficient integration and management of Git tasks.

Generic Human Tasks

PAK offers you a standard set of commands which require creative input from the user. This enables you to start automating workflows that still need a bit of human input.

Generic Commands

PAK offers a standard set of commands giving you the first steps to automate your workflows.

Nexus Maven Command Pool

Nexus is an artifact repository manager for storing binaries, libraries, and artifacts, supporting formats like Maven. Maven, a software project management tool, is based on the Project Object Model (POM) and allows developers to consistently define projects and dependencies. Our Command Pool offers commands for interactions between Maven and Nexus, such as artifact uploads or dependency retrieval.

Artifactory Maven Command Pool

Artifactory allows developers to store, retrieve, and manage binary files and artifacts, providing a central source for all binaries used in a development process. Apache Maven is a software project management and comprehension tool that enables developers to consistently describe a project and its dependencies. Our Command Pool offers a collection of commands used to facilitate interactions between Maven and Artifactory, such as uploading artifacts or retrieving dependencies.

Open API Command Interpreter

The OpenApi Command Interpreter allows the automatic parsing of commands from an OpenAPI definition. No additional code needs to be written; just add the address of the definition and our framework does the rest!

Kotlin Command Interpreter

The Kotlin Command Interpreter allows the parsing and execution of commands within a Kotlin environment to automate various tasks or processes.

Bpmn Interpreter

Workflows come in many shapes and forms. The BPMN (Business Process Model and Notation) Interpreter enables the parsing of workflows defined in the BPMN format into PAK’s internal model.

Human Task Interpreter

The Human Task Interpreter allows the parsing and running of commands within an HTML and JavaScript environment. Use it to build commands which need the creative input of a workflow user!

Java Command Interpreter

The Java Command Interpreter allows the parsing and execution of commands within a Java environment to automate various tasks or processes.

Core

The heart of the PAK-Framework. It contains the means to run workflows with the PAK engine, but also the possibility to enrich the framework’s interfaces with your own implementations and solutions.

RocksDB Persistence

Data that is generated by a workflow run needs to be saved for the short or long term. Our solution for the Persistence Interface of the PAK-Framework uses the high-performance, key-value-based RocksDB developed by Facebook.

PAK online

PAK Online is a web-based application that provides an OpenAPI-based REST API. It enables you to upload workflows and run them periodically or on demand via REST.

Command Line App

Run tasks and workflows on the console or as part of a CI/CD Pipeline with our Command Line Interface.

Workflow Editor

With our specially developed editor, a wide variety of workflows can be easily modeled in the widely known BPMN process format.

Workflow Executor

The Workflow Executor is the application to run your workflows. It features a multilingual UI and easy management of your favorite workflows.

Support

We offer a community website where you can exchange ideas and support each other. For our Pro packages we also offer full support via email.