Java command annotations are used for developing Java commands in PAK.
This chapter describes the functionality of the @Persistent annotation.
It also gives an example of how you can use this annotation in your implementation.

1. Basics

The @Persistent annotation marks a variable of a Java class as relevant input or output for PAK.
Thus, its value is injected from the datastore before command execution or stored to the datastore after successful command execution.

Without any further options, a mandatory, read-only persistent variable is created with the variable name as the datastore key.
If the variable is initialized with a value, that value is used unless another value is found in the datastore.
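
A minimal sketch of this default behavior is shown below (the command and variable names are illustrative and not taken from an existing PAK command pool):

/**
 * @workflowDocu Sketch: greets a user by name.
 */
@JavaCommand
@CommandGroup("Example")
public class GreetUser {

	/**
	 * @workflowDocu Name of the user to greet; mandatory, read-only, datastore key "userName"
	 */
	@Persistent
	private String userName;

	/**
	 * @workflowDocu Greeting phrase; the initialized value is used unless the datastore provides another one
	 */
	@Persistent
	private String greeting = "Hello";

	[...]

}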

2. Properties

The @Persistent annotation defines several properties, which are described in the following.

  • mandatory:
    This property marks whether the variable is mandatory or not.
    If a variable is mandatory, you cannot execute the command as long as no value is assigned to that variable in the datastore.
    By default, the value is true, which means the variable is mandatory.

  • scope:
    This property defines the scope of the variable as well as the writing direction.
    The scope is defined by the enum FieldScope and can either be READ_ONLY, WRITE_ONLY or READ_WRITE.
    For READ_ONLY variables the values are read from the datastore before the command execution.
    Values for WRITE_ONLY variables are written to the datastore after the command execution.
    Lastly, values of READ_WRITE variables are read from and written to the datastore.
    A variable is by default read-only.

  • name:
    This property allows you to define a custom name for the variable.
    If the name property is set, it will also be used as the datastore key.
    By default, the name property is an empty String, which means the variable name is used.

  • getter and setter:
    These properties allow you to define the names of customized getter/setter methods that will be used for fetching/setting the variable value.
    By default, no customized getter/setter method name is given.
    This means that the getter/setter derived from the variable name is used if it exists; otherwise, the variable is accessed directly (see the sketch after this list).
    If a customized getter/setter is given, the method with that name is used.
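
The default getter/setter resolution can be sketched as follows (a hypothetical command, not one of the listings below): since no getter or setter property is set, the accessors derived from the variable name are used because they exist; without them, the field would be accessed directly.

@JavaCommand
@CommandGroup("Example")
public class TrimText {

	/**
	 * @workflowDocu Text to trim; no getter/setter property set
	 */
	@Persistent(mandatory = false, scope = FieldScope.READ_WRITE)
	private String text;

	// Derived accessors: they match the variable name, so they are used instead of direct field access.
	public String getText() {
		return this.text;
	}

	public void setText(final String text) {
		this.text = text == null ? null : text.trim();
	}

	[...]

}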

Please ensure that no other annotated method, such as @Run, @PostConstruct or @PreDestroy, is named like a possible getter, as this would cause a compile error.
An example of that case can be found in Listing 2.

3. Usage

The example code in Listing 1 shows the usage of the @Persistent annotation and its properties.

Listing 1. Example usage of the different properties
/**
 * @workflowDocu This command gets an element by its id.
 */
@JavaCommand
@CommandGroup("Element")
public class GetElement {

	[...]

	/**
	 * @workflowDocu Id of the element to fetch
	 */
	@Persistent (1)
	private String id;

	/**
	 * @workflowDocu Username of the current user
	 */
	@Persistent(getter = "getInputUsername", setter = "setInputUsername") (1)
	private String username;

	/**
	 * @workflowDocu Name of the element
	 */
	@Persistent(mandatory = false, scope = FieldScope.READ_WRITE, name = "elementName") (2)
	private String name;

	/**
	 * @workflowDocu Fetched element
	 */
	@Persistent(scope = FieldScope.WRITE_ONLY) (3)
	private Element element;

	public String getInputUsername() {
		return "username/" + this.username;
	}

	public void setInputUsername(final String username) {
		this.username = "username_" + username;
	}

	[...]

}
1 Mandatory variable with default read-only scope
2 Optional variable with read-write scope and customized name for the datastore
3 Mandatory variable with write-only scope

In the single command runner of the Workflow Executor, the example of Listing 1 will look as shown in Figure 1.
As you can see, there are separate collapsible boxes for input (marked in red) and output (marked in purple) variables.

Since id and username are input variables, they are placed in the box for input variables.
Additionally, they are surrounded by a red border to mark them as mandatory; the border disappears once the user provides a value.

For the variable name, the alias elementName is displayed, and it is marked as optional in the box for input variables.
Since this variable has the scope READ_WRITE, it also appears in the box for output variables.
There you can define a custom variable name, which represents the datastore key the value will be written to after the command execution.
The same applies to the WRITE_ONLY variable element, except that it only appears in the box for output variables.

Figure 1. Single command runner view for the example of Listing 1
Listing 2 shows an example of how you must not implement a command, as it causes a compile error.

In Listing 2, no customized getter/setter is given for the variable element, which is annotated with @Persistent.
Thus, the reflection-based implementation tries to find a getter with the name getElement.
As the method annotated with @Run is also named getElement, this causes a compile error.

Listing 2. Wrong implementation
/**
 * @workflowDocu This command gets an element by its name.
 */
@JavaCommand
@CommandGroup("Element")
public class GetElement {

	[...]

	/**
	 * @workflowDocu Name of the element to fetch
	 */
	@Persistent (1)
	private String element;

	/**
	 * Fetch the element with a given name.
	 */
	@Run
	public void getElement() { (2)
		// Command functionality
		[...]
	}

}
1 Marks element as a persistent variable
2 @Run method named like the getter of the persistent variable
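
One way to resolve this clash, sketched below on the basis of Listing 2, is to rename the @Run method so that it no longer matches a derived getter; alternatively, the getter and setter properties of @Persistent can point to customized accessors, as shown in Listing 1.

/**
 * @workflowDocu This command gets an element by its name.
 */
@JavaCommand
@CommandGroup("Element")
public class GetElement {

	[...]

	/**
	 * @workflowDocu Name of the element to fetch
	 */
	@Persistent
	private String element;

	/**
	 * Fetch the element with a given name.
	 */
	@Run
	public void fetchElement() { // no longer collides with the derived getter getElement
		// Command functionality
		[...]
	}

}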

Sonatype Nexus

PAK features connectors and commands for Sonatype Nexus. This means the software can directly interact with Nexus repositories for storing and managing artifacts. Through these connectors, PAK can automate tasks like uploading binaries or retrieving dependencies, ensuring efficient artifact management within Nexus.

Jenkins

PAK has connectors and commands for Jenkins. This allows the software to directly communicate with Jenkins servers, enabling the automation of CI/CD (Continuous Integration/Continuous Deployment) tasks. Through these connectors, PAK can trigger builds, fetch build statuses, or manage job configurations, streamlining the CI/CD processes within Jenkins.

GitHub

PAK possesses connectors and commands for GitHub. This means the software can interface directly with GitHub repositories, facilitating actions like code pushes, pull requests, or issue tracking. Through these connectors, PAK can automate various GitHub operations, enhancing code collaboration and repository management.

Atlassian Confluence

PAK is equipped with connectors and commands for Atlassian Confluence. This enables the software to directly interact with Confluence spaces and pages. Through these connectors, PAK can automate actions such as creating, updating, or retrieving documentation, ensuring efficient content management and collaboration within Confluence.

Codebeamer

PAK features connectors and commands for Codebeamer. This allows the software to seamlessly integrate with Codebeamer’s ALM (Application Lifecycle Management) platform. Through these connectors, PAK can automate tasks like issue tracking, test management, or requirements tracing, enhancing the coordination and management of software development processes.

JFrog Artifactory

PAK has connectors and commands for JFrog Artifactory. This means the software can directly interface with Artifactory repositories, enabling actions like artifact storage, retrieval, and management. Through these connectors, PAK can automate tasks such as deploying artifacts or managing repository configurations, streamlining the integration and management of binary artifacts within Artifactory.

Amazon Web Services (AWS)

PAK has connectors and commands for Amazon Web Services (AWS). This means the software possesses specialized interfaces to directly interact with AWS services and execute actions on the AWS platform. Through these connectors, PAK can automate AWS-specific commands, such as launching EC2 instances, managing S3 buckets, or configuring Lambda functions. This allows for efficient integration, management, and automation of AWS resources and services directly from PAK.

Atlassian Jira

PAK features integration tools and capabilities for Atlassian Jira. These tools allow for a direct connection to Jira and the execution of specific actions. Using these integration tools, PAK can automate Jira actions such as adding comments or changing ticket priorities, ensuring seamless handling and coordination of Jira processes.

Git

PAK has connectors and commands for Git. This means it has interfaces to directly communicate with Git and execute actions. Through these connectors, the software can automate Git commands such as retrieving changes or creating branches, enabling efficient integration and management of Git tasks.

Generic Human Tasks

PAK offers you a standard set of commands which require creative input from the user. This enables you to start automating workflows that still need a bit of human input.

Generic Commands

PAK offers a standard set of commands that gives you the first steps toward automating your workflows.

Nexus Maven Command Pool

Nexus is an artifact repository manager for storing binaries, libraries, and artifacts, supporting formats like Maven. Maven, a software project management tool, is based on the Project Object Model (POM) and allows developers to consistently define projects and dependencies. Our Command Pool offers commands for interactions between Maven and Nexus, such as artifact uploads or dependency retrieval.

Artifactory Maven Command Pool

Artifactory allows developers to store, retrieve, and manage binary files and artifacts, providing a
central source for all binaries used in a development process. Apache Maven is a software project
management and comprehension tool that enables developers to consistently describe a project and
its dependencies. Our Command Pool offers a collection of commands used to facilitate interactions
between Maven and Artifactory, such as uploading artifacts or retrieving dependencies.

Open API Command Interpreter

The OpenApi Command Interpreter allows you to automatically parse commands from an OpenAPI definition. No additional code needs to be written; just add the address of the definition and our framework does the rest!

Kotlin Command Interpreter

The Kotlin Command Interpreter allows you to parse and execute commands within a Kotlin environment to automate various tasks or processes.

Bpmn Interpreter

Workflows come in many shapes and forms. The BPMN (Business Process Model and Notation) Interpreter enables the parsing of workflows defined in the BPMN format into the internal PAK model.

Human Task Interpreter

The Human Task Interpreter allows you to parse and run commands within an HTML and JavaScript environment. Use this to build commands which need the creative input of a workflow user!

Java Command Interpreter

The Java Command Interpreter allows you to parse and execute commands within a Java environment to automate various tasks or processes.

Core

The heart of the PAK-Framework. It contains the means to run workflows with the PAK engine, but also the possibility to enrich the framework's interfaces with your own implementations and solutions.

RocksDB Persistence

Data generated by a workflow run needs to be saved for the short or long term. Our solution for the Persistence Interface of the PAK-Framework uses the high-performance key-value store RocksDB, developed by Facebook.

PAK Online

PAK Online is a web-based application that provides an OpenAPI-based REST API. It enables you to upload workflows and run them periodically or on demand via REST.

Command Line App

Run tasks and workflows on the console or as part of a CI/CD Pipeline with our Command Line Interface.

Workflow Editor

With our specially developed editor, a wide variety of workflows can be easily modeled in the widely known BPMN process format.

Workflow Executor

The Workflow Executor is the application to run your workflows. It features a multilingual UI and easy management of your favorite workflows.

Support

We offer a community website where you can exchange ideas and support each other. For our Pro packages we also offer full support via email.