
1. What You Will Learn

After finishing this tutorial you will be able to write your own PAK App that uses a JarCommandPool and is capable of running workflows.

This guide builds on the previous guide: how to build and run commands.

2. Prerequisites

To complete this guide you need:

3. Setup

Now you have to set up your project. We provide setup guides for both Gradle and Maven.

3.1. Setup with Gradle

First open your selected IDE and create a new Gradle project with the following configuration:

Figure 1. Configuration for a Gradle project

Select a name for your project:

Figure 2. Set project name

Inside the 'build.gradle' file, the following build script is required:

Figure 3. build.gradle file
plugins {
    id 'java-library'
}

ext {
    // dependency version
    pakVersion = '1.9.16'
}

repositories {
    mavenLocal()
    mavenCentral()
    maven {
        name = 'pak-explorer-maven'
        url 'https://pak.asap.de/nexus/repository/pak-explorer-maven/'
    }
}

dependencies {
    implementation "de.asap.pak.core:pak-engine:${pakVersion}"
    implementation "de.asap.pak.core:pak-simple:${pakVersion}"
    implementation "de.asap.pak.bpmn-model:bpmn-interpreter:${pakVersion}"
    implementation "de.asap.pak.extra:pak-jarpool:${pakVersion}"
    implementation "de.asap.pak.extra:pak-default-datatransformer:${pakVersion}"

    runtimeOnly "de.asap.pak.core:pak-commandjson:${pakVersion}"
    runtimeOnly "de.asap.pak.jlcint:jlcint-interpreter:${pakVersion}"
    runtimeOnly "de.asap.pak.jlcint:jlcint-pakbridge:${pakVersion}"
    runtimeOnly "org.example:example-commands:1.0.0"

    // Optional but useful for logging the work
    implementation 'org.slf4j:slf4j-api:1.7.25'
    implementation 'ch.qos.logback:logback-classic:1.4.0'
}

3.2. Setup with Maven

First open your selected IDE and create a new Maven project with the following configuration:

Figure 4. Configuration for a Maven project

Select a name for your project:

Figure 5. Set project name

The 'pom.xml' requires the following XML:

Figure 6. pom.xml file
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>de.pak.app</groupId>
    <artifactId>HowTo-PAK-App</artifactId>
    <version>1.0.0</version>

    <properties>
        <pak-version>1.9.16</pak-version>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
    </properties>

    <repositories>
        <repository>
            <id>pak-explorer-maven</id>
            <url>https://pak.asap.de/nexus/repository/pak-explorer-maven/</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>de.asap.pak.core</groupId>
            <artifactId>pak-engine</artifactId>
            <version>${pak-version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>de.asap.pak.core</groupId>
            <artifactId>pak-simple</artifactId>
            <version>${pak-version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>de.asap.pak.bpmn-model</groupId>
            <artifactId>bpmn-interpreter</artifactId>
            <version>${pak-version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>de.asap.pak.extra</groupId>
            <artifactId>pak-jarpool</artifactId>
            <version>${pak-version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>de.asap.pak.extra</groupId>
            <artifactId>pak-default-datatransformer</artifactId>
            <version>${pak-version}</version>
            <scope>compile</scope>
        </dependency>

        <dependency>
            <groupId>de.asap.pak.core</groupId>
            <artifactId>pak-commandjson</artifactId>
            <version>${pak-version}</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>de.asap.pak.jlcint</groupId>
            <artifactId>jlcint-pakbridge</artifactId>
            <version>${pak-version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>de.asap.pak.jlcint</groupId>
            <artifactId>jlcint-interpreter</artifactId>
            <version>${pak-version}</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>jakarta.xml.bind</groupId>
            <artifactId>jakarta.xml.bind-api</artifactId>
            <version>2.3.2</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>org.example</groupId>
            <artifactId>example-commands</artifactId>
            <version>1.0.0</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.25</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.4.0</version>
            <scope>runtime</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
            </plugin>
        </plugins>
    </build>
</project>

4. Implementation

For our example app we need to implement a few things:

  • an Engine that runs the workflow.

  • an EngineObserver that logs the events fired by the Engine.

  • an App class that starts the whole application.

4.1. The Engine Observer

First, we will implement an EngineObserver that logs events from our Engine.

import de.asap.pak.core.engine.spi.interceptors.EngineEvent;
import de.asap.pak.core.engine.spi.interceptors.IEngineListenerEvent;
import de.asap.pak.core.engine.spi.interceptors.IEngineObserver;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Example Observer to show the registration of an engine observer
 */
public class EngineObserver implements IEngineObserver {

	// Create a logger for this class
	private static final Logger LOG = LoggerFactory.getLogger(EngineObserver.class);

	@Override
	public void observe(IEngineListenerEvent event){
		EngineEvent engineEvent = event.getEvent();

		switch (engineEvent) {
            case WORKFLOW_STARTED:
                handleWorkflowStarted();
                break;
            case WORKFLOW_FINISHED:
                handleWorkflowFinished();
                break;
		}
	}

	private void handleWorkflowStarted(){
		LOG.info("The workflow has been started!");
	}

	private void handleWorkflowFinished(){
		LOG.info("The workflow has finished!");
	}
}
If you want to know more about Engine Observers & Callbacks, see the dedicated guide.
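
If you want to react to more than these two events, the same switch can simply be extended. The following is a minimal sketch of such a variation; it assumes nothing beyond the types already imported above and only adds a catch-all default branch that logs every other event.

	@Override
	public void observe(IEngineListenerEvent event) {
		EngineEvent engineEvent = event.getEvent();

		switch (engineEvent) {
			case WORKFLOW_STARTED:
				handleWorkflowStarted();
				break;
			case WORKFLOW_FINISHED:
				handleWorkflowFinished();
				break;
			default:
				// Log any engine event that is not handled explicitly
				LOG.debug("Received engine event: {}", engineEvent);
				break;
		}
	}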

4.2. The Engine

Now we are going to implement the engine, for which we need a few components.

import de.asap.pak.core.commandpool.spi.ICommandPool;
import de.asap.pak.core.context.api.IContext;
import de.asap.pak.core.context.impl.ContextBuilder;
import de.asap.pak.core.context.services.spi.IDataTransformer;
import de.asap.pak.core.context.services.spi.IJsonMapper;
import de.asap.pak.core.context.services.spi.IPersistenceService;
import de.asap.pak.core.engine.api.IEngine;
import de.asap.pak.core.engine.impl.EngineBuilder;
import de.asap.pak.core.model.api.IModel;
import de.asap.pak.core.simple.context.SimplePersistenceService;
import de.asap.pak.core.simple.context.SimpleServiceProviderFactory;
import de.asap.pak.extra.impl.datatransformer.ObjectMapperFacade;
import de.asap.pak.extra.impl.datatransformer.PakDefaultDataTransformer;
import de.asap.pak.extra.jarpool.JarCommandPool;
import de.asap.pak.modelinterpreter.bpmn.BPMNModelInterpreter;
import de.asap.pak.modelinterpreter.bpmn.model.validator.ModelException;

import java.io.InputStream;

/**
 * Class for creating an engine
 */
public final class EngineCreator {

	private EngineCreator() {
		//hidden
	}

	/**
	 * This method configures an engine using the PAK Engine builder by using default implementations
	 *
	 * @param is Input-Stream object, which refers to a BPMN-File
	 */
	public static IEngine createEngine(final InputStream is) throws ModelException {
		// In our case we want to execute a BPMN file, therefore we create the interpreter to parse it and create a model
		final BPMNModelInterpreter interpreter = new BPMNModelInterpreter();
		// Creating an instance of the model which is going to be executed
		final IModel model = interpreter.interpret(is, true); (1)
		// In this example we use the JarCommandPool for providing the commands to the app
		final ICommandPool pool = new JarCommandPool(); (2)
		// Each Engine needs a "Context" which provides all needed data and services to the engine
		final ContextBuilder cb = new ContextBuilder(); (3)
		// In order to create a context, a ServiceProviderFactory needs to be passed. The factory will create a ServiceProvider which holds all important services for the engine
		final SimpleServiceProviderFactory factory = new SimpleServiceProviderFactory(); (4)
		// The persistence service will save all data which is created while executing the workflow model
		factory.addService(IPersistenceService.class, new SimplePersistenceService()); (5)
		factory.addService(IJsonMapper.class, new ObjectMapperFacade());
		factory.addService(IDataTransformer.class, new PakDefaultDataTransformer());
		cb.setServiceProviderFactory(factory);
		//Create the Context instance
		final IContext context = cb.build();
		// Now we have everything we need to build an engine
		final IEngine engine = new EngineBuilder().setCommandPool(pool).setContext(context).setModel(model).build(); (6)

		// We also register our EngineObserver we have written before
		final EngineObserver engineObserver = new EngineObserver();
		engine.registerEngineObserver(engineObserver);
		return engine;
	}
}

In the code above we see some PAK core components, which we want to explain in a little more detail:

IModel

(1) Is the model that is going to be executed by the engine.

ICommandPool

(2) In general, a CommandPool provides a collection of commands and the services they need. In the specific case of a JarCommandPool, the commands are provided directly by one or multiple JARs. For a real app we would suggest the MavenCommandPool.

IContext

(3) The context contains all information and services needed by the engine.

IServiceProvider

(4) The ServiceProvider provides all kinds of services that are needed for executing the model.

IPersistenceService

(5) A PersistenceService is responsible for persisting and providing all data created/read by the commands.

IEngine

(6) The central class which controls the complete execution of the given workflow model.

Click the links for more in-depth information about each topic.
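
To illustrate the ServiceProvider mechanism a bit further, the following sketch registers an additional, purely hypothetical service alongside the persistence service from EngineCreator. IGreetingService and its lambda implementation are placeholders and not part of PAK; only factory.addService(...) is taken from the code above.

import de.asap.pak.core.context.services.spi.IPersistenceService;
import de.asap.pak.core.simple.context.SimplePersistenceService;
import de.asap.pak.core.simple.context.SimpleServiceProviderFactory;

/**
 * Sketch only: shows how additional services could be registered on the factory.
 */
public final class CustomServiceExample {

	// Placeholder service interface (assumption, not a PAK type)
	public interface IGreetingService {
		String greet(String name);
	}

	public static SimpleServiceProviderFactory createFactory() {
		final SimpleServiceProviderFactory factory = new SimpleServiceProviderFactory();
		// The persistence service as registered in EngineCreator above
		factory.addService(IPersistenceService.class, new SimplePersistenceService());
		// Any further service is registered the same way
		factory.addService(IGreetingService.class, name -> "Hello, " + name + "!");
		return factory;
	}
}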

4.3. The App class

The App class will create the engine and execute the workflow.

import de.asap.pak.core.engine.api.IEngine;
import de.asap.pak.core.engine.api.WorkflowException;
import de.asap.pak.modelinterpreter.bpmn.model.validator.ModelException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;
import java.io.InputStream;

public class App {
	//Create a logger for this class
	private static final Logger LOG = LoggerFactory.getLogger(App.class);

	public void start() {

		IEngine engine = null;

		try (final InputStream inputStream = getClass().getClassLoader().getResourceAsStream("Hello_World.bpmn")) {
			engine = EngineCreator.createEngine(inputStream);
		} catch (IOException | ModelException e) {
			LOG.error("Unable to parse the BPMN-File to a model", e);
			// Without a valid engine there is nothing to run
			return;
		}

		try {
			// Start the engine
			engine.start();
		} catch (WorkflowException e) {
			LOG.error("Unable to run the workflow", e);
		}
	}

	public static void main(String[] args) {
		//run the start method of the app
		new App().start();
	}
}
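
If the BPMN does not live on the classpath, the engine can be fed from any other InputStream as well. The sketch below reads the model from a file-system path instead; the path is a placeholder and the error handling simply mirrors the App class above.

import de.asap.pak.core.engine.api.IEngine;
import de.asap.pak.core.engine.api.WorkflowException;
import de.asap.pak.modelinterpreter.bpmn.model.validator.ModelException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileBasedApp {

	private static final Logger LOG = LoggerFactory.getLogger(FileBasedApp.class);

	public static void main(String[] args) {
		// Placeholder path - adjust it to wherever your BPMN file is stored
		final Path bpmnPath = Path.of("workflows/Hello_World.bpmn");

		try (final InputStream inputStream = Files.newInputStream(bpmnPath)) {
			final IEngine engine = EngineCreator.createEngine(inputStream);
			// Start the engine exactly as in the App class
			engine.start();
		} catch (IOException | ModelException e) {
			LOG.error("Unable to parse the BPMN-File to a model", e);
		} catch (WorkflowException e) {
			LOG.error("Unable to run the workflow", e);
		}
	}
}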

5. Run the App

The app is now basically ready, but before executing the code we need to provide the BPMN file and the commands.

5.1. Provide the BPMN

Place the BPMN file (from the 'Create your First Command' guide) into the resources folder of your project, or download it from here.

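
If you are unsure whether the file really ended up on the classpath, a quick sanity check in plain Java can save some debugging time. This check is optional and not part of the PAK API:

// Optional: verify that Hello_World.bpmn was placed in src/main/resources
// and is therefore visible on the classpath.
if (App.class.getClassLoader().getResource("Hello_World.bpmn") == null) {
	throw new IllegalStateException("Hello_World.bpmn not found on the classpath");
}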

5.2. Provide the Commands

In order for the commands (from the 'Create your First Command' guide) to be found by the CommandPool, we need to add the dependency to our project.

Make sure the example command is already published to Maven local or a remote Maven repository. See the corresponding guide if you need more information on publishing commands locally.
For Gradle:
//this is the dependency to the example-commands jar which was published locally to mavenLocal
runtimeOnly "org.example:example-commands:1.0.0"
For Maven:
<dependency>
    <groupId>org.example</groupId>
    <artifactId>example-commands</artifactId>
    <version>1.0.0</version>
    <scope>runtime</scope>
</dependency>

5.3. Run the App

The App is now ready to run!

To do so, select the created App class, click the green play icon in the gutter, and select Run 'App.main()'.

Figure 7. Run the App first time

You can now see the expected output from the Command in the logs.


In the logs, our simple EngineObserver provides additional information.

Figure 8. Ran engine logs
