1. What you will learn

In this guide you will learn how to implement a service, usable by commands, that automatically registers itself with the engine so that it is cleaned up on engine termination.

2. Prerequisites

  • Roughly 10 minutes

  • JDK11+ installed with JAVA_HOME configured appropriately

  • An IDE (we recommend IntelliJ)

  • A build tool (such as Gradle, Maven or Ant)

  • Know how to configure the service processor

  • Understand the basics of command services

3. The IRequireCleanup interface

3.1. Service states

Some services might be stateful, meaning their state can change during an engine run.
A state is a representation of data and processes managed by the instance.
Before a new engine run, the state should be reset so that leftover data or still-running processes do not consume resources unnecessarily after engine termination.

3.2. How does it work

The interface provides two main points of interest needing further explanation:

  • The cleanup method

  • Automatically registering services implementing the interface to the engine

The cleanup method is designed to implement service-specific cleanup that resets a service's state. This method is called by the engine whenever the engine enters its terminating state and therefore shuts down. That can happen if an error occurs, the user aborts or suspends the execution of an engine run, or the current run simply finishes executing.

A service implementing the interface does not need to be registered manually with the engine as long as it ends up in the engine's service provider. For example, an IPakService or an IHumanTaskJavaBridge will be registered automatically.

Apps might also want cleanups for some of their services. The same rules regarding the engine's service provider apply there as well.

3.3. Special Case ILiteService

Services of type ILiteService will also be registered automatically; however, their cleanup works a little differently from that of regular services.

Instances of ILiteService are stateless by design because an instance only persists for the one command it is injected into. This design ensures that instance state has no effect on other usages of the same ILiteService. However, if the ILiteService uses static state, it might affect other instances of the class. A cleanup is therefore only useful for static state on an ILiteService.

A static state is state that belongs to the service class itself rather than to its instances.
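
The following sketch illustrates this for a hypothetical ILiteService. Apart from the ILiteService and IRequireCleanup interfaces themselves, all names are illustrative, and any annotation that might be required for an ILiteService implementation is omitted:

public class CounterLiteService implements ILiteService, IRequireCleanup {

    // Static state: shared by all instances of this class during an engine run
    private static int invocationCount = 0;

    // Instance state: discarded together with the instance after the one command it was injected into
    private int lastValue = 0;

    public void count(final int value) {
        this.lastValue = value;
        invocationCount++;
    }

    public void cleanup() {
        // Only the static state needs to be reset; instance state dies with the instance
        invocationCount = 0;
    }
}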

3.4. Interface structure

The basic structure of a cleanup service looks like this:

public class CleanupStructureService implements IRequireCleanup { (1)

    // [...]

    public void cleanup() { (2)
        // cleanup impl
    }
}
1 The service class must implement the IRequireCleanup interface
2 The cleanup() method of the interface containing the cleanup behaviour for the specific class

4. Example Cleanup Service Step by Step

4.1. Remarks

After understanding how the interface works and what the basic structure looks like, it is time to implement such a service.

The example focuses on implementing an IPakService.
Configuration of the service processor is omitted in the following example and is assumed to have been done beforehand.

4.2. Implementing the Service

First the service interface registered to the service processor needs to be defined:

public interface ICacheService extends IPakService { (1)
    void cacheValue(final int value); (2)
    int getCachedValue();
}
1 The interface extends IPakService and is therefore a service loaded by the engine's service provider
2 Two methods for the service's users to either cache a value or read the cached value

Afterwards, behaviour can be added in the implementation of the interface just defined:

@PakService (1)
public class CacheService implements ICacheService, IRequireCleanup { (2)

    private int cachedValue = 0; (3)

    public void cacheValue(final int value) { (4)
        this.cachedValue = value;
    }

    public int getCachedValue() { (5)
        return this.cachedValue;
    }

    public void cleanup() { (6)
        this.cachedValue = 0;
    }
}
1 The annotation required by instances of IPakService
2 The definition of the service implementation implementing the previously defined ICacheService and the IRequireCleanup interface
3 A private variable that defines the state of the service with initial state value being 0
4 A method from the service interface to change the cached value
5 A method from the service interface to read out a state value
6 The cleanup method to reset the state variable to its initial state value
Logging is omitted from the listing above. However, in order to make the cleanup visible in the Workflow Executor, some logging was added before and after the state variable reset within the cleanup() method.
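
For illustration, that logging could look like the following sketch. It assumes an SLF4J logger, which is an assumption for this example and not something prescribed by this guide:

// Imports assuming SLF4J is available
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Inside CacheService:
private static final Logger LOGGER = LoggerFactory.getLogger(CacheService.class);

public void cleanup() {
    LOGGER.info("Cached value before cleanup: {}", this.cachedValue);
    this.cachedValue = 0;
    LOGGER.info("Cached value after cleanup: {}", this.cachedValue);
}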

4.3. Using the service in commands

After defining the service, we can use it in a command just like any other service. If everything was configured correctly, there should be no difference between regular services and services requiring cleanup when implementing a command.

/**
 * @workflowDocu Simple command that uses a service with self-cleanup
 */
@JavaCommand
@CommandGroup("org.example")
public class CacheWriteRead {
    /**
     * @workflowDocu Service that cleans itself up
     */
    @Service (1)
    private CacheService cacheService;

    /**
     * @workflowDocu The Value to cache in the service
     */
    @Persistent
    private int valueToCache;

    /**
     * @workflowDocu The cache value before the write operation
     */
    @Persistent(scope = FieldScope.WRITE_ONLY)
    private int oldCacheValue;

    /**
     * @workflowDocu The cache value after the write operation
     */
    @Persistent(scope = FieldScope.WRITE_ONLY)
    private int newCacheValue;

    @Run
    public void cacheValue() { (2)
        this.oldCacheValue = this.cacheService.getCachedValue();
        this.cacheService.cacheValue(this.valueToCache);
        this.newCacheValue = this.cacheService.getCachedValue();
    }
}
1 Injection of the service so it can be used in the command
2 Simulating two read operations with a write operation on the state variable in between

The state was changed by the command, so on a new engine run the cache is expected to have been reset.

4.4. Running the command

After building the command, it can be executed using the Workflow Executor:

Figure 1. Example command in the single command runner

The expected outcome is the following:

Figure 2. Expected output

The datastore now stores the values that were read during the command execution. The oldCacheValue is the initial value, and the newCacheValue is the same value that was given to the command as input. If the service were to persist over engine runs, by adjusting the @PakService annotation to @PakService(persistentClassLoader = true, singletonInstance = true), the initial value of the cache would now be the same as the newCacheValue of the previous run. Since a cleanup was implemented, that is not the case, as can be seen from the logs of the cleanup method:

Figure 3. Logs of the engine run
1 This log was written before the cached value was reset and shows the value from the engine run
2 This log was written after the cached value was reset and shows the initial value again
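
For comparison, the persistent variant of the service mentioned above would only differ in its annotation. A minimal sketch, based solely on the annotation parameters named in this guide:

@PakService(persistentClassLoader = true, singletonInstance = true)
public class CacheService implements ICacheService, IRequireCleanup {
    // [...] implementation unchanged from section 4.2
    // Without cleanup(), cachedValue would survive into the next engine run
}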

5. Use Cases

Finally, here are some basic use cases where a cleanup might be helpful.

  • Service caching data

  • Service creating temporary files

  • Service managing processes

5.1. Caching Data

A service might handle requests made to an API that returns big response payloads. In order to reduce the number of requests made to the API, the service might want to cache the responses and reuse them later on. The cleanup then clears the cache to ensure the response data does not consume memory beyond the engine run.

The cleanup in this use case would be useful if the service instance persists over multiple engine runs.
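
A minimal sketch of such a caching service, assuming hypothetical names for everything except the @PakService annotation and the IRequireCleanup interface used earlier in this guide (the service interface and the actual request logic are omitted):

import java.util.HashMap;
import java.util.Map;

@PakService
public class ApiResponseCacheService implements IApiResponseCacheService, IRequireCleanup {

    // Responses cached per request URL to avoid repeated calls to the API
    private final Map<String, String> responseCache = new HashMap<>();

    public String getResponse(final String url) {
        return this.responseCache.computeIfAbsent(url, this::requestFromApi);
    }

    private String requestFromApi(final String url) {
        // [...] actual request logic omitted
        return "";
    }

    public void cleanup() {
        // Drop all cached payloads so they do not consume memory beyond the engine run
        this.responseCache.clear();
    }
}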

5.2. Creating Temp files

Another way a service might process data is to create temporary files, store some information in them, and discard them later on. In order to prevent those files from persisting beyond an engine run, a cleanup that deletes those temporary files might be wise.
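
A minimal sketch of such a service, again with hypothetical names for everything except the @PakService annotation and the IRequireCleanup interface (the service interface is omitted):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

@PakService
public class TempFileService implements ITempFileService, IRequireCleanup {

    // Temporary files created on behalf of commands during the current engine run
    private final List<Path> tempFiles = new ArrayList<>();

    public Path createTempFile(final String prefix) throws IOException {
        final Path file = Files.createTempFile(prefix, ".tmp");
        this.tempFiles.add(file);
        return file;
    }

    public void cleanup() {
        for (final Path file : this.tempFiles) {
            try {
                // Delete the file so it does not persist beyond the engine run
                Files.deleteIfExists(file);
            } catch (final IOException e) {
                // A failed deletion leaves the file behind; error handling omitted here
            }
        }
        this.tempFiles.clear();
    }
}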

5.3. Managing Processes

A different variant of a service might handle processes running on the machine. These processes might only be needed for the runtime of the engine and can be discarded afterwards. This can be done by managing the processes in a service that kills them when the engine terminates.
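
A minimal sketch of such a process-managing service, once more with hypothetical names for everything except the @PakService annotation and the IRequireCleanup interface (the service interface is omitted):

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

@PakService
public class ManagedProcessService implements IManagedProcessService, IRequireCleanup {

    // Processes started on behalf of commands during the current engine run
    private final List<Process> processes = new ArrayList<>();

    public Process start(final String... command) throws IOException {
        final Process process = new ProcessBuilder(command).start();
        this.processes.add(process);
        return process;
    }

    public void cleanup() {
        // Kill any process that is still running when the engine terminates
        for (final Process process : this.processes) {
            if (process.isAlive()) {
                process.destroy();
            }
        }
        this.processes.clear();
    }
}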
