
The following example workflows are provided to simplify getting started with Flux. These workflows are easily modified and extended to meet your requirements.

Action prescripts and postscripts are used in these flows to provide status messaging and details of the file name being moved. See an example prescript that updates the status message in the Operations Console during a file copy: Prescript Example.png

Output File Information from a Directory (Basic)

This workflow scans a directory and outputs information about all the files it finds. This example illustrates how to query a directory, iterate through the files, and format each file's information into a console message.
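The same loop can be sketched in plain Java (outside Flux) to make the steps concrete; the directory path and message format below are illustrative assumptions, not the workflow's exact output:

```java
import java.io.File;

public class ListFiles {
    // Builds one line of information per file found in a directory,
    // mirroring what the workflow's console message would contain.
    public static String describe(File dir) {
        StringBuilder sb = new StringBuilder();
        File[] files = dir.listFiles();
        if (files != null) {
            for (File f : files) {
                if (f.isFile()) {
                    sb.append(f.getName())
                      .append(" (").append(f.length()).append(" bytes)\n");
                }
            }
        }
        return sb.toString();
    }
}
```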


Download here:  Output File Information.ffc

Bulk File Copy (Basic)

This workflow scans a directory and copies all the found files. The bulk file copy is useful when all the files must be copied at a single point in time. If the bulk file copy fails, rerunning the flow will copy all the files over again.

This example assumes that a runtime configuration file has been defined, and that runtime configuration properties identifying the file source directory and target directory are specified as follows:


Required Runtime Configuration File Properties

Download here: Bulk-Copy.ffc

File Copy using Java Code (Basic)

This example illustrates using Java code to do the following:

  • create a Flux flowchart, 
  • create the criteria for files to include, exclude, and filter out (using a regular expression filter) in the file copy,
  • set the target directory,
  • define a renamer to rename files once they are copied,
  • create a file copy action within the flowchart and populate it with the above criteria,
  • submit the flowchart to the engine for execution, and then 
  • stop (i.e., dispose) the engine. Here is the Javadoc for the Flux engine.

The complete code for this example is in the examples\software_developers\file_copy directory of a Flux installation.


Java Code Example
FlowChart flowchart = EngineHelper.makeFlowChart("FlowChart");

FileFactory fileFactory = flowchart.makeFileFactory();

IncludesFileCriteria sources = fileFactory.makeIncludesFileCriteria();

// List of source files to copy from. (The paths and the criteria/engine
// method names below are illustrative; consult the Flux Javadoc for the
// exact API.)
sources.add("C:/source/*.txt");

// List of source files to exclude.
sources.addExclude("C:/source/backup-*.txt");

// List of files to exclude using regular expressions.
sources.addRegExExclude(".*\\.tmp");

TargetFileCriteria targets = fileFactory.makeTargetFileCriteria();

// Specify the target directory.
targets.add("C:/target");

// Create a renamer to rename files when copying from
// source to target.
Renamer renamer = fileFactory.makeGlobRenamer("*.txt", "*.java");

// Construct a File Copy Action and populate it with the above criteria.
FileCopyAction fileCopyAction = fileFactory.makeFileCopyAction("File Copy Action");
fileCopyAction.setSources(sources);
fileCopyAction.setTargets(targets);
fileCopyAction.setRenamer(renamer);

// Create a factory.
Factory factory = Factory.makeInstance();

// Create an engine.
Engine engine = factory.makeEngine();

// Add the job to the engine.
engine.put(flowchart);

// Start the engine.
engine.start();

// Give the engine a chance to run the job. In this case 120 seconds, or 2 minutes.
Thread.sleep(120 * 1000);

// Clean up resources and shutdown engine.
engine.dispose();

Mass File Copy and Prepend Today's Date to Each Copied File (Basic)

This workflow scans a directory and copies all the found files. The workflow consists of just a single File Copy Action. A Regular Expression Renamer is configured to prepend the date to each copied file.

Note in the illustration below that the From Pattern is (.*). This regular expression captures the file name into a group, which is later referenced in the To Pattern as \1 (since it is the first, and only, captured group).

The To Pattern is ${date yyyy-MM-dd}-\1. This pattern prepends the date in a form such as 2014-10-18, with a dash '-', in front of each file name copied from the current directory into c:\test.
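The renamer's behavior is easy to verify with plain Java regex calls. A minimal sketch (the date formatting mirrors ${date yyyy-MM-dd}; the helper class and method names are illustrative):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class PrependDateRenamer {
    // Applies the same From/To patterns as the Regular Expression Renamer:
    // "(.*)" captures the whole filename as group 1, and the replacement
    // prepends the formatted date plus a dash.
    public static String rename(String filename, Date today) {
        String date = new SimpleDateFormat("yyyy-MM-dd").format(today);
        return filename.replaceFirst("(.*)", date + "-$1");
    }
}
```

Note the use of replaceFirst rather than replaceAll: with replaceAll, (.*) would also match the empty string at the end of the input and prepend the date twice.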


Download here: MassFileCopyWithRegexToPrependDate.ffc

Generic File Move, Copy, Rename, Delete (Basic)

This workflow creates a file, tests for its existence, moves it somewhere, copies it back to where it was moved from, renames it, and deletes both files and tests to make sure they both no longer exist.
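The same sequence can be sketched in plain Java with java.nio.file, outside Flux, to make the individual steps concrete (the paths and file names are illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class FileActionsDemo {
    // Mirrors the workflow: create a file, test for it, move it, copy it
    // back, rename the copy, then delete both and verify they are gone.
    public static boolean run(Path source, Path target) throws IOException {
        Files.write(source, "sample".getBytes());                         // create
        if (!Files.exists(source)) return false;                          // test existence
        Files.move(source, target, StandardCopyOption.REPLACE_EXISTING);  // move
        Files.copy(target, source);                                       // copy back
        Path renamed = source.resolveSibling("renamed.txt");              // rename
        Files.move(source, renamed);
        Files.delete(renamed);                                            // delete both
        Files.delete(target);
        return !Files.exists(renamed) && !Files.exists(target);           // verify
    }
}
```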

This example assumes that a runtime configuration file has been defined, and that runtime configuration properties identifying the file source directory and target directory are specified as follows:


Required Runtime Configuration File Properties

Download here: FLUX-SAMPLES-FileActions.ffc

Read a CSV File and Print Result (Basic)

This example uses the CsvJdbc library to connect to a CSV file located on the local system and perform a query.

This example assumes that you have 'csvjdbc-1.0-20.jar' on your Flux classpath and a CSV file named 'customers.csv' in the path '/Users/Documents'. Using Flux's built-in Database Query action, the workflow creates a connection to the location of the file and then performs the query set in the 'Query' property of the Database Query action.

The JDBC URL property specifies the path to the file. To point to another location, just append the path to 'jdbc:relique:csv:' as shown below:


In the Query we specify the file name (without file extension):

SELECT Name,email FROM customers

Where 'customers' is the name of our CSV file. The output should look like:
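Conceptually, the query just projects two columns out of the CSV file. As a rough plain-Java stand-in for what the Database Query action returns (the column layout and helper class are assumptions, not the CsvJdbc API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CsvSelect {
    // Projects the requested columns out of CSV text whose first line is
    // the header row, roughly what SELECT Name,email FROM customers does.
    public static List<String[]> select(String csv, String... columns) {
        String[] lines = csv.split("\\R");
        List<String> header = Arrays.asList(lines[0].split(","));
        List<String[]> rows = new ArrayList<>();
        for (int i = 1; i < lines.length; i++) {
            String[] cells = lines[i].split(",");
            String[] row = new String[columns.length];
            for (int c = 0; c < columns.length; c++) {
                row[c] = cells[header.indexOf(columns[c])];
            }
            rows.add(row);
        }
        return rows;
    }
}
```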


Download here: FLUX-SAMPLES-CSVJDBC.ffc, customers.csv, csvjdbc-1.0-20.jar

Simple Database Query Using the Database Query Action (Basic)

This example illustrates how to query a database table and output values from the returned results. This example queries the FLUX_FLOW_CHART table created within every Flux installation. When the Database Query Action runs the query select * from FLUX_FLOW_CHART, it returns one database row at a time and passes it to the Console Action. The Console Action outputs the ActionResult and also the contents of the first column (in this case, a numeric flowchart identifier).


Shown below are the properties for the flow from the Database Query Action to the Console Action (the flow in blue above). Note that the Condition is "Custom". The Database Query Action returns a RESULT object that contains the fields "row" and "result". The "result" field, if set to true, indicates there are additional rows in the query. The "row" field contains the database row contents. So while the RESULT.result field is true, the Console Action is sent a row to print out.  

Here the RESULT object is printed out, and the first column of the RESULT's row. Printing out the RESULT object just prints out the object's name and identifier, which is generally only useful if one is trying to confirm the type of returned result. Returning the column values in the row is generally much more interesting!
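The "while RESULT.result is true" behavior described above amounts to a simple row loop. A minimal sketch of the control flow, with the RESULT fields simplified to a queue (these names are illustrative, not the Flux API):

```java
import java.util.Deque;

public class RowLoop {
    // Simplified model of the Database Query Action's RESULT object:
    // "result" is true while more rows remain, "row" holds the current row.
    // Each iteration hands one row to the Console Action (the StringBuilder
    // stands in for its output here).
    public static int printRows(Deque<String> rows, StringBuilder console) {
        int count = 0;
        while (!rows.isEmpty()) {             // RESULT.result == true
            String row = rows.poll();         // RESULT.row
            console.append(row).append('\n'); // Console Action prints the row
            count++;
        }
        return count;
    }
}
```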

Download here: FLUX-SAMPLES-Simple-DB-Select-using-Database-Query-Action.ffc

Ping a Host and Notify if Down (Basic)

This workflow sends a ping request to a host and emails a notification if the host is down.

This example assumes that a shell script has been defined in a directory located at /home/flux/FluxTest/Ping. Modify the Process Action to point to a directory of your choosing, or enter the full path to the script or executable in the Command text field. Paths can be relative to the directory where Flux was installed, or absolute (as in this example).

You will also have to modify the Mail Action to input the correct mail server settings like so:


Required Shell Script for Linux/Unix
for myHost in $host
do
  ping=$(ping -c 1 $myHost | grep 'received' | awk -F',' '{ print $2 }' | awk '{ print $1 }')
  if [ $ping -eq 0 ]; then
    exit 1
  fi
done
exit 0
Required Batch File for Windows
@echo off
ping -n 1 %1


Note: You'll need to check that your antivirus software doesn't have any rules blocking SMTP. For example, McAfee Enterprise Virus protection - standard for many corporations - has a default setting that blocks SMTP ports 25 and 587. The default rule needs to be modified to add java.exe and telnet.exe as programs allowed to use the mail ports. This is illustrated in the attached screenshot.


Download here: FLUX-SAMPLES-Ping.ffc

Mail Trigger that prints the Subject and Body of the mail to stdout (Basic)

This workflow polls the 'Inbox' of a test mail user and fires when new mail arrives; it then prints the email subject and body to stdout. Export the workflow to the engine, send an email to the monitored test mailbox, and check that the Console Action's output matches the subject and body.

Download here: FLUX-SAMPLES-MailTrigger.ffc

Mail Trigger that downloads Attachments and prints Filenames to stdout (Basic)

This workflow polls the 'Inbox' of a test mail user and fires when new mail arrives; when it does, it retrieves the mail, downloads any attachments using a postscript, and prints details about the mail and attachments to stdout.

This workflow is already configured with credentials to monitor a test mailbox – to use the example, just export it to your engine, send an email to the monitored mailbox, and watch the stdout on your engine. The mail trigger can also receive emails with any number of attachments (or none at all) – feel free to experiment to see how the workflow handles different numbers of attachments.

Download here: FLUX-SAMPLES-MailTriggerThatDownloadsAttachment.ffc

This workflow uses a postscript on the Mail Trigger that invokes the Guava library (embedded within Flux) to download files:

for (int i = 0; i < flowContext.get("RESULT").attachments.size(); i++) {
  Files.write(flowContext.get("RESULT").attachments.get(i).body, new File(flowContext.get("RESULT").attachments.get(i).filename));
}

By default, this will download files to the Flux home directory. You can modify this to use another directory, which can be either absolute or relative to the Flux home directory:

for (int i = 0; i < flowContext.get("RESULT").attachments.size(); i++) {
  Files.write(flowContext.get("RESULT").attachments.get(i).body, new File("/path/to/folder/" + flowContext.get("RESULT").attachments.get(i).filename));
}

The Console Action uses the List operator in Variable Substitution to enumerate the attachment filenames and display them to stdout.

Received Message: ${RESULT.subject}
Downloaded Attachments: <#list RESULT.attachments as THIS_ATTACHMENT>${THIS_ATTACHMENT.filename}<#if THIS_ATTACHMENT_has_next>, </#if></#list>
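The <#list> loop above renders the filenames as a comma-separated list. The equivalent joining logic in plain Java (the filenames shown are illustrative):

```java
import java.util.List;

public class AttachmentList {
    // Produces the same comma-separated output as the FreeMarker
    // <#list>...<#if THIS_ATTACHMENT_has_next>, </#if></#list> template.
    public static String format(List<String> filenames) {
        return String.join(", ", filenames);
    }
}
```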

More on this Guava API call can be found in the Guava documentation for the Files class.

Flow Chart Trigger that waits for another workflow to finish copying files and then iterates through them and prints to STDOUT (Basic)

This workflow waits on a Flow Chart trigger until the workflow defined in the 'Namespace' property finishes and then polls a folder and iterates through the list of files found. The 'Namespace' can also be used to define entire namespaces, specific workflows, etc. as explained in Flow Chart Trigger.

You will need to edit the File Exist trigger to point to a valid directory in your system. 

This example waits for the /Incoming/Daily workflow attached to finish. You will need to change the 'Source' and 'Target' of the File Copy action to valid directories in your system. 

Download here: Daily-FlowChartTrigger.ffc, Incoming-Daily.ffc

Using a For Each Collection Element action to iterate through a collection (Basic)

This workflow scans a folder for files, iterates through them and prints out the filenames one at a time.

It uses a File Exist trigger to poll for files, then Runtime Data Mapping to pass the resulting collection to a For Each Collection Element action like so:

We're calling ${i} from the 'Print Filenames' Console action. 'i' is the Loop Index of the For Each Collection Element action, and we can use Variable Substitution to reference that variable from anywhere in the workflow (keep in mind that until the For Each Collection Element action runs, the variable is empty).

You will need to modify the File Exist trigger's source folder to a valid path in your environment. 

Download here: FLUX-SAMPLES-IterateFiles.ffc

Calling a workflow using a FlowChart action (Basic)

This workflow exports a workflow that's in the Workflow Repository under a new name, waits for it to finish, and then continues executing. 

To run this example you'll need to download both workflows and upload them to your Repository. Then you can start the /FLUX-SAMPLES/FlowChartAction workflow which will in turn start the child workflow. 

Download here: FLUX-SAMPLES-FlowChartAction.ffc, FLUX-SAMPLES-ChildWorkflow.ffc

File Trigger that Times Out and Sends Notification Email (Basic)

This workflow scans two different folders for files and then copies them to the same location. The File Exist triggers have a timeout – the triggers will poll the source folders and if nothing is found after 15 minutes, the triggers time out and exit. When the triggers time out, they flow into an Email action, which sends a notification email to let you know that the files were not found in the source folder as expected. 

To run this workflow, the sources for the File Exist triggers, the targets for the File Copy actions and the mail user, password and SMTP server settings of the Mail actions need to be updated to suit the environment where the workflow will be run. Sample values are provided for all, to make it easier to understand and change them.

Download here: FLUX-SAMPLES-FileTriggerWithTimeout.ffc

Send Email Reminders that Stop After Receiving a Mail (Basic)

Consider the use case of receiving reminders during the day at specific times. Until you respond to a reminder, it keeps reminding you every 10 minutes (or quits after a set number of attempts). This example sends email reminders to users, waits 10 minutes, and re-sends the reminder. To stop the reminders for a given time of day, the user must reply to one of the reminder emails. The next reminder is then sent at its next scheduled time.

To run this workflow you'll need:

  • Access to a mail server to send the notifications from
  • Access to a subfolder in the sender's mail inbox (where the 'stop' emails sent by users will be stored)

Before you run this workflow you must:

1- Create a subfolder/mailbox in a mail account that you want the Flux workflow to access. In this example the Flux engine connects to a Gmail account configured with a subfolder/label named 'Inboxx'.

2- Once the subfolder is created, create an email rule so that response emails from the user being sent reminders are automatically redirected to that folder (e.g., Inboxx). NOTE: this step is the most important; if the mail trigger is watching the Inbox instead of a custom folder, it will treat ANY incoming mail as the "stop reminders for this hour" email.

Here are the instructions to create rules for Gmail:

3- Modify (if needed) the times each timer trigger fires, and all the email settings on both the Email triggers (watch for incoming mail from the user being reminded) as well as the Email actions (that send the reminder emails). 

4- Export the workflow to the engine.

Download here: FLUX-SAMPLES-EmailReminder.ffc

Mail Box Connections

Some mail servers (e.g., Gmail) will complain if you try to connect to the mail server too frequently. This may cause the workflow to fail with the following message:

javax.mail.AuthenticationFailedException: 454 4.7.0 Too many login attempts, please try again later. 67sm2929465qkx.38 - gsmtp

Simply increase the mail trigger polling intervals from the +1m as defined in this example.

File Trigger that Attaches Files to an Email (Basic)

This workflow polls a folder for files, then sends those files as an email attachment (one file per email).

Before you run this workflow you'll need to:

  • Edit the Source in the File Exist trigger
  • Edit the path to the file (not the filename) in the attachments property of the Mail action

Download here: FLUX-SAMPLES_AttachToMail.ffc

File Trigger that Follows Different Conditional Flows Depending on the File Parent Directory (Basic)

This workflow polls a directory for files, and, using conditional flows, takes one out of three courses of action depending on the parent directory of the file being processed.

The conditions set are based on the RESULT.fileinfo_matches from the File Trigger, which are mapped to a For Each Collection Element action's collection. Then, the conditions are accessing the FECEA's loop index (in this example we've named it 'fileinfo' for convenience) and comparing the value to a hard coded value for the parent path.
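The branching logic amounts to extracting the parent directory of each file and comparing it against hard-coded values. A plain-Java sketch of that comparison (the directory names and branch labels are hypothetical):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class ParentDirRouter {
    // Chooses one of three branches based on the parent directory of the
    // file being processed, as the workflow's conditional flows do.
    public static String route(String file) {
        Path parent = Paths.get(file).getParent();
        String name = parent == null ? "" : parent.getFileName().toString();
        switch (name) {
            case "orders":   return "process-order";
            case "invoices": return "process-invoice";
            default:         return "quarantine";
        }
    }
}
```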

Download here: FLUX-SAMPLES_FileTriggerConditionalFlows.ffc

File Copy that Fails and Exits on Error (Intermediate)

This workflow scans a directory and copies one file at a time. If a file copy fails, the workflow is set to ‘FAILED’ in the Operations Console. From the console, you can either ‘restart’ the workflow to run from the beginning, or ‘recover’ the workflow, which resumes the flow at the point where the file copy failed.

This example assumes that a runtime configuration file has been defined, and that runtime configuration properties identifying the file source directory and target directory are specified as follows:


Required Runtime Configuration File Properties

Download here: FLUX-SAMPLES-FileCopy-ExitOnError.ffc

Invoking a Rest Service and Print Result to STDOUT  (Intermediate)

This example illustrates how to invoke a REST service on a remote system using Flux. The Rest Action has only one required property: baseURL, which specifies the location of the HTTP service to invoke. Arguments can be passed to the REST service and the REST service can also return values which can be used in the flow chart.

The above screenshot illustrates a Rest Action that invokes the Yahoo StockQuote service. The service takes two arguments: 1. a quotes string argument, “GOOG+YHOO+HPQ+IBM+MSFT+ORCL”, and 2. a format string argument, "snl1d1t1ohgdr", both of which are passed to the REST service.

The result returned from the REST service is of type RestAction.RestActionResult. The Rest Action postscript maps result.result to a flow context variable "quote".

Variable substitution is used in the Console Action to display the value of the quote flow context variable on stdout.

Download here: FLUX-SAMPLES-GetStockQuote.ffc

File Copy that Continues on Error (Intermediate)

This workflow scans a directory, copies one file at a time, and logs a notification in the event of a file copy failure. The workflow then continues processing the remaining files.


This example assumes that a runtime configuration file has been defined, and that runtime configuration properties identifying the file source directory and target directory are specified as follows:


Required Runtime Configuration File Properties

Download here: FLUX-SAMPLES-FileCopy-ContinueOnError.ffc

Report or Data Delivery (Intermediate)

This Flux workflow starts with a simple database table that contains the email address, file name, and a timestamp of when the file was sent (initially an empty field).  Every five minutes a timer trigger fires, triggering a database query to search the database for records where the sent timestamp is empty. For each record found, a Flux mail action builds an email using a customizable email template, and attaches the file specified in the record. The mail is sent, and the time that the mail was sent is updated in the database record. In the event of a mail failure or a file not found issue, a notification is sent.

This example assumes that a database table named 'Customers' has been defined with an email address, file path (to the file that needs to be delivered), and timestamp field (all defined as character or varchar fields). 

Download here: FLUX-SAMPLES-DB+Email.ffc

Run a Windows / Linux Program Under a Different User (Intermediate - Requires Light Scripting Knowledge)

Generally speaking, all processes and activities executed by Flux or a Flux Agent are executed under the user or service account that Flux is started with.

There are instances where processes – such as database scripts (not just SQL statements) and operating system commands – require running under a different user than the user, or the service account, that is running Flux.

In instances where a process requires the privileges granted to a different user or service account, you have three choices:

  1. For running such processes on servers that support SSH (Unix, Linux, and Windows machines running a third-party SSH server like Bitvise Server), you can execute the process action using Flux’s SSH Command Action. This action allows you to specify a different user to run the command.
  2. Run the Windows program/batch file/process under a specific user account using Sysinternals' PsExec.
  3. Install a Flux Agent for each user account that is required. Configure each agent to start under the required user or service account and assign processes to agents based on the processes’ required privileges.

Note here that the Flux workflow itself will still run under the user or service account that starts Flux. Only individual process actions or SSH actions will run under the different user or service account.

An example of #2 is illustrated below. This workflow demonstrates how to call a Windows program/batch file/process under a specific user account using Sysinternals' PsExec.

The Process Action executes the command below:

c:\Sysinternals\psexec -u DOMAIN\username -p password -i \\localhost -d notepad.exe

Here the username is provided as 'DOMAIN\username' (or 'MACHINE-NAME\username' if running on the local host), and the -d flag tells PsExec not to wait for the process to terminate (non-interactive).

Download here: FLUX-SAMPLES_RunAs-PsExec.ffc

Flow Chart Action Child Workflow that Prints the Parent Name (Intermediate - Requires Light Scripting)

This example shows how to pass the parent workflow's name to a child workflow through a Flow Chart action.

You can download the parent and child workflows here: FLUX-SAMPLES_FC-PrintParent.ffc, FLUX-SAMPLES_FC-Parent.ffc

Ping an FTP Host and Test Latency Before Copying (Intermediate - Requires Light Scripting)

This Flux workflow pings an FTP host for latency and performs a conditional file copy if the latency is acceptable. The workflow uses a few simple actions:

  • A Process Action performs the ping operation and writes its result to stdout.
  • A Regular Expression Action uses variable substitution to retrieve the results, parses the ping time, and uses a postscript (shown below) to convert the ping time for use in a conditional flow.
  • The flow branches to a file copy (if the regular expression found a matching ping time and it was less than 100ms), or to a delay trigger, to wait 5 minutes and retry (if the ping results could not be parsed – that is, if the ping failed – or the ping took longer than 100ms).

The postscript on the Regular Expression action is necessary only to convert the action's result from array format (which cannot be used in a conditional flow) to a long value, which can:

Postscript for the Regular Expression Action
boolean matched = flowContext.get("RESULT").matched;
String[] groups = flowContext.get("RESULT").groups;
if (matched) {
  flowContext.put("pingTime", Long.parseLong(groups[1]));
}
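The regular expression the action uses is not shown in the text; one plausible pattern for capturing the ping time into group 1 is sketched below (the sample ping output line and class name are assumptions):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PingTimeParser {
    // Captures the integer part of the ping round-trip time as group 1,
    // matching the groups[1] reference in the postscript; returns null
    // when the ping output could not be parsed (i.e., the ping failed).
    public static Long parse(String pingOutput) {
        Matcher m = Pattern.compile("time=(\\d+)(?:\\.\\d+)?\\s*ms").matcher(pingOutput);
        return m.find() ? Long.parseLong(m.group(1)) : null;
    }
}
```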

Download here: PingHostAndCopyFiles.ffc

Message Queuing with One-to-many Delivery Using the Audit Trail Trigger (Intermediate – Requires Light Scripting)


This example provides a workflow template that can be instantiated to watch for messages on the audit trail. In this example, instances of the workflow are created using the Flow Chart Action (invoked from a separate workflow) – this allows a workflow to create new listeners on the fly and deliver messages to them as needed.

Variable substitution allows the template to watch for messages with a mandatory event name and optional message filter. As a result, the template is highly reusable – the same template can be reused in a variety of circumstances to watch for messages from different sources without modifying the underlying workflow.

For this example, ten instances of the template will be spun off, each with a unique namespace but otherwise identical parameters, taking full advantage of Flux's concurrency model and ensuring that new messages are always processed immediately by whichever instance is available.


This simple workflow uses a Delay Trigger -> Flow Chart Action loop to instantiate 10 instances of the workflow template above. The "Variables" property on the Flow Chart Action populates the variables that are needed for the Audit Trail Trigger's message filter in the template, and a postscript on the Delay Trigger maintains a running count of active templates.

Received a message from /AuditTrailMessage with timestamp Oct 20, 2015 12:02:15 AM.
Message contents: EventMessage
Message sent from action: Null Action

Put a Job from the Repository to the Engine with a Set of Supplied Variables (Intermediate - Requires CURL or Java Coding)

Starting with Flux 8.0.10, Flux exposes a REST API that allows you to submit a workflow from the Flux repository to the Flux engine and initialize that workflow with a set of variables (or, more precisely, variable values). This is useful for making workflows reusable: variables such as server names or email addresses can be exposed and filled in at workflow submission time.

The following code snippet assumes a workflow exists in the repository named '/Test', and that it has a console action in its workflow that outputs the variable ${Message}. This code submits 500 instances of this workflow to the engine, and each instance will simply output a number. 


Putting a Job with Variables to Flux using Java API
Factory factory = Factory.makeInstance();
Engine engine = factory.lookupEngine("localhost", 7520);
engine.login("admin", "admin");
HashMap map = new HashMap();
// Create 500 instances of the repository workflow named /OutputMessage
// and submit them to the engine as /Test1 ... /Test500. Each workflow has a
// Console Action that will output the workflow number, 1 to 500.
for (int i = 1; i <= 500; i++) {
  map.put("Message", i);
  engine.putFromRepository("/OutputMessage", "/Test" + i, map, false);
}
System.out.println("Finished outputting");


The following example uses a cURL command to submit the JSON shown below, which supplies some order details. The JSON keys and values are as follows:

      • repositoryNamespace: The name of the workflow in the repository
      • engineNamespace: The namespace the workflow should have when submitted to the engine for execution
      • overwrite: true | false - Whether to overwrite the engine namespace if the workflow is already on the engine
      • variables: Variable names and their string or boolean values. These should match the workflow variables defined for the workflow.


Putting a Job using CURL
curl -i -k -X POST --user admin:admin -H "Content-Type:application/json" -H "Version:Flux 8.0.11" -d @sample.json https://localhost:7520/put_from_repo
Sample JSON to Supply Variables
{
  "repositoryNamespace": "/Order Processing",
  "engineNamespace": "/Order Processing/1234",
  "overwrite": "true",
  "variables": {
    "order_id": "1234",
    "order_name": "doe's order"
  }
}


Removing FAILED Workflows from the ENGINE (Intermediate - Requires Scripting)

This workflow polls the engine for any workflows in the FAILED state and removes them. You can run this workflow as is (it reschedules itself to check periodically for FAILED workflows), or remove the flow that loops back to the Timer Trigger so that you can export it to the engine only when needed.

It polls the engine and removes failed workflows using a script on the Console Action:

FlowChartIterator fcit = engine.getByState("/", SuperState.ERROR, SubState.FAILED);
while (fcit.hasNext()) {
    FlowChart fc = fcit.next();
    fc.remove(); // remove the failed workflow (see the Flux Javadoc for the exact method)
}

This workflow can easily be tweaked to fetch only a specific namespace from the engine and remove its FAILED workflows, or to restart the FAILED workflows instead of removing them from the engine.

Download here: FLUX-SAMPLES_RemoveFailedWF.ffc

Adding and Using a Business Interval (Intermediate - Requires Scripting)

The following script code creates a business interval for use in restricting workflows to only run within specified times and dates. 

Source for Creating and Saving a Business Interval
import flux.repository.RepositoryAdministrator;
RepositoryAdministrator repoAdmin = engine.getRepositoryAdministrator();
// Start every day at 8 am
Date startDate = EngineHelper.applyTimeExpression("^d@8H", null, null);
// Lunch hour starts every day at noon
Date startLunchHour = EngineHelper.applyTimeExpression("^d@12H", null, null);
BusinessInterval officeTimes = EngineHelper.makeBusinessInterval("Between 8:00 am and 5:00 pm");
officeTimes.include("Office time", startDate, "+9H", "0 0 0 8 * * *");
officeTimes.exclude("Lunch hour", startLunchHour, "+1H", "0 0 0 12 * * *");
officeTimes.excludeDay("Company wide holiday", new Date());
repoAdmin.put("Office-Open-Hours-8-5", officeTimes, true);
// Create a business interval that makes Saturdays and Sundays the weekend
BusinessIntervalFactory bif = EngineHelper.makeBusinessIntervalFactory();
BusinessInterval weekend = bif.makeSaturdaySundayWeekend();
repoAdmin.put("Saturdays-and-Sundays", weekend, true);
// Create a business interval that excludes the weekend
BusinessInterval difference = bif.makeIntersection(officeTimes, weekend);
repoAdmin.put("Weekdays-Office-Open-Hours", difference, true);
System.out.println("Calendars Loaded.");

Create a workflow and then add a Null Action to the workflow. Take the above code and paste it into the Prescript for the Null Action and then export the workflow to the engine. Flux will run the above script and create the business intervals/calendars defined and add them into the Flux Repository, as illustrated below:

 After this business interval is loaded to the repository, it can be used in time expressions throughout Flux. You can use this Business Interval in any Timer Trigger as shown below:

If you have a Timer Trigger where you would like to apply your business interval, you will need to include some special characters in your time expression. These characters are:

  • b
  • h

In a Cron-style time expression, the "b" character indicates all values for the given column that are included in the business interval, while the "h" character indicates all values that are excluded. For example, the Cron-style expression 0 0 0 *b indicates "fire at the top of the hour for every hour value included in the interval".

All the information along with more usages of Cron expressions and business intervals can be found in Business Intervals.

Download the above workflow with the Null Action already preconfigured with the Prescript here: FLUX-SAMPLES-Create-Business-Calendars.ffc

Forecasting a Timer Trigger with a Business Interval applied (Intermediate - Requires Scripting)

This example is useful for checking the next firings of a given time expression (to be run in a Timer Trigger in another workflow) with a Business Interval applied to it.

The following script creates a cron expression (edit it to add the expression your Trigger will be using) and applies a Business Interval from the Repository (you'll need to edit the Business Interval name).

import flux.*;
import flux.repository.RepositoryAdministrator;
RepositoryAdministrator repoAdmin = engine.getRepositoryAdministrator();
cron = EngineHelper.makeCron();
// Set the time expression you'll be running in the Timer Trigger
cron.accept("0 0 0 *b");
// Set the name of the Business Interval as shown in the Repository
// (look it up through the RepositoryAdministrator; the exact calls are
// described in the Flux Javadoc)
for (int i = 0; i < 60; ++i) {
  // compute and print the next firing time, with the Business Interval applied
}
In this example, we're checking the next 60 firings for a Timer Trigger that should "fire at the top of the hour for every hour value included in the interval".

Download the above workflow here: FLUX-SAMPLES_PrintSchedule.ffc

Prepare and Read CSV file, Do Something with the Text in the File (Intermediate – Requires Light Scripting)

It's pretty common to receive a file from a customer and have to do some processing using the info in that file.

This workflow polls a landing folder for a customer file (in the example we're polling 'c:\TEST\SAP\') then, using a batch script, it adds a header to the file, and removes any blank lines. After that's done, the CSV file is read using the CSVJDBC jar, and the first line of text is used for additional processing (in the example we're just printing it out to the console).

Once the processing is finished, the workflow moves the processed file to a new location (using 'c:\TEST\SAP\Processed\' in the example) so it's not picked up again for processing.

To run this workflow, you'll need to download the CSVJDBC.jar file (provided) and place it in the 'lib' directory of your Flux installation folder. An engine restart will be required for the engine to pick up the new jar file.
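If you only need the first data line, the read-the-file, skip-blanks, skip-header step can also be sketched with the standard library alone (the file layout and column names below are illustrative, not taken from the sample workflow):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadFirstCsvLine {

    // Return the first data row of a CSV file, skipping blank lines and
    // the header line. Returns null if there is no data row.
    static String[] firstDataRow(Path csv) throws IOException {
        try (BufferedReader in = Files.newBufferedReader(csv)) {
            String line;
            boolean headerSeen = false;
            while ((line = in.readLine()) != null) {
                if (line.isBlank()) continue;                     // drop blank lines
                if (!headerSeen) { headerSeen = true; continue; } // skip header
                return line.split(",", -1);  // naive split: no quoted commas
            }
        }
        return null;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("sample", ".csv");
        Files.writeString(tmp, "ID,NAME\n\n1,Widget\n2,Gadget\n");
        String[] row = firstDataRow(tmp);
        System.out.println(String.join("|", row)); // 1|Widget
    }
}
```

The CSVJDBC route in the downloadable workflow remains the better fit when you want to query the file with SQL rather than read it line by line.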

Download the files needed here: FLUX-SAMPLES_PrepareFileAndProcess.ffc, AddHeadertoCSV.bat, Sample.csv, csvjdbc-1.0-20.jar

Bulk Upload Workflows from a Folder to the Repository (Intermediate – Requires Light Scripting)

This workflow polls a given folder for workflows (*.ffc files) and uploads them to the Flux Repository using cURL. It uses a File Exist trigger to pick up the files, then passes that info (one file at a time) to a Process action, which in turn runs the batch/shell script to upload the files.

To run this workflow you'll need to install cURL (this applies to Windows; most *NIX-based systems include cURL by default), which you can download from the cURL website. You will also need to edit the File Trigger source to point to the location of your workflows, and the Process action's command and working directory to point to the location of the 'import_wf' script.

Download the files needed here: FLUX-SAMPLES_UploadFromFolderToRepo.ffc, import_wf.bat,

SFTP File Transfer Workflow (Advanced)

A common pattern encountered by workflow designers involves a workflow that fires at a given time every weekday, then polls a remote server for files that are fetched and processed. Such workflows often need to select files for processing based on a naming convention, such as a file description followed by a valid date in the filename.
A best practice in Flux workflow design is accommodating exceptional situations during processing. Customers often use wildcard processing to select and process sets of files, not realizing that if one file in the set fails to process correctly, all following files in the set will also fail to process.

The workflow below starts with a Timer Trigger that fires at specified intervals. A File Exists Trigger then collects the set of filenames available to fetch from the remote server. A For Each Collection Element Action submits each filename to a Regular Expression Action, which compares a substring of the filename against a valid date (in yyyy-MM-dd format). If a filename matches the regular expression, it is passed to a File Move Action, which moves the file to a local folder for decryption. If a filename does not match, the flow returns to the For Each Collection Element Action to pick up the next filename.
After fetching the remote files, the workflow uses another file trigger to pick up the filenames of local files to be decrypted. This pattern of using a For Each Collection Element Action provides per-file processing: if the decryption action fails, the Flux Operations Console has access to the filename of the file that failed (rather than a list of files with no indication of which one failed to decrypt). The same pattern is used again to re-encrypt the files.
If anything fails in the workflow, mail actions send an email to a set of specified addresses to alert the customer's team. After the email (containing the failed filenames) has been sent, normal processing of the remaining files continues.


Runtime Configuration

Directories, server names, and numerous other elements of this workflow are externalized to runtime configuration properties, allowing the same workflow to be used as a template for many purposes and providing consistency of deployment. This example assumes that a runtime configuration file has been defined as follows:


Runtime Configuration Properties
#Recommended Runtime Configuration File for FLUX for easy maintenance of WF
/MAILuser=<username for connecting to the SMTP server>


Download Here: FLUX-SAMPLES-MFT-Example.ffc

Managed File Transfer (MFT) Workflow (Advanced)

The following workflow template (i.e., "/MFT/Template") illustrates Flux's MFT capabilities. This reusable, configurable workflow template can be deployed to a Flux engine to execute any number of file exchange transfers. For example, submitting the template to a Flux engine under the name "/Acme/weekly" causes the engine to configure the workflow based on a runtime configuration defined for the Acme customer. The same workflow template can be submitted repeatedly under different names for various customers and time periods (e.g., customers Acme and Baker; weekly, daily, and monthly periods).

The workflow template diagram is depicted below. It can be reused without modification for many SFTP file exchange partners.


Each customer (or file exchange partner) is configured with a set of runtime parameters – loaded at the time the workflow starts for that partner. The runtime configuration is refreshed automatically when the configuration file is changed (note that this feature is configurable). These parameters include when to trigger the workflow, where to place downloaded files, and what files to download. Adding a new customer or exchange partner involves adding the required entries to this file, and then submitting the workflow template to the Flux engine for that customer.

MFT Runtime Configuration Properties
#Locations for specific downloads
# Server and Password and user names
# File include patterns
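Such a per-partner file is ordinary Java properties format, so loading it can be sketched with java.util.Properties. The key names below are hypothetical; real keys depend on how your template references its runtime configuration:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class RuntimeConfigSketch {

    // Load runtime configuration entries from properties-format text.
    static Properties load(String text) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(text));
        return props;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical per-partner entries for an "/Acme/weekly" submission.
        String config = String.join("\n",
                "/Acme/weekly/SOURCE_DIR=/incoming/acme",
                "/Acme/weekly/TARGET_DIR=/processed/acme",
                "/Acme/weekly/FILE_PATTERN=*.csv");
        Properties props = load(config);
        System.out.println(props.getProperty("/Acme/weekly/SOURCE_DIR")); // /incoming/acme
    }
}
```

In the real template the Flux engine performs this load (and the automatic refresh) for you; the sketch only shows the file format involved.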

A workflow template for generating reports is also provided, as depicted below. This workflow template also has a set of runtime configuration parameters that allow the template to be dynamically configured at runtime to execute selected reports over particular time ranges. These reports were built using Jasper Report Community Edition. No report server installation is required.

Reporting Runtime Configuration Properties
# -- Reporting Component Parameters

The following are the provided reports for this MFT example. Since reports are generated within a Flux workflow, these reports can be emailed, file transferred, and made visible via hyperlinks from the Flux Operations Console.


Finally, a common requirement is allowing external customers to view the status of their exchanges and workflows. Utilizing Flux’s security model, customers can be configured to only view selected workflows and their execution based on their names. So any number of users can be configured to see only those items they are permitted to view. 

The workflows and report files are available for download in this zip file. The .jasper and .jrxml files need to be placed into the FLUX-INSTALL/reports/examples directory. The .ffc file needs to be imported into your Flux Workflow Repository.


For additional information or assistance, contact Flux technical support. The provided zip file is for Windows installations; a Linux version is available from support.

This example requires that two database tables be added to the Flux schema. These tables are used to track the file transfer runs and the individual files transferred.

Required Database Tables
CREATE TABLE XFERDETAIL (
  FILENAME      varchar(256) NULL,
  NAMESPACE     varchar(256) NULL,
  URL           varchar(256) NULL,
  DETECTED      datetime     NULL,
  FILESIZE      bigint       NULL,
  INITIATED     datetime     NULL,
  COMPLETED     datetime     NULL,
  FAILED        datetime     NULL,
  ERROR         text         NULL,
  CUSTOMER      varchar(64)  NULL,
  CATEGORY      varchar(64)  NULL,
  TARGET        varchar(256) NULL,
  SOURCE        varchar(256) NULL,
  RUN_STARTED   datetime     NULL,
  LAST_MODIFIED varchar(50)  NULL
);


Monitor Deadlines and SLAs (Advanced)

A very common scenario is the need to meet SLAs (or Service Level Agreements). Flux offers a robust way of dealing with SLAs: you can set deadlines on individual workflows (as shown below), so that if a workflow exceeds its deadline, it will publish an audit trail event indicating that the deadline has passed. The Operations Console will also graphically indicate any workflows that are approaching (or have exceeded) their deadlines.

The Deadline Time expression specifies the time frame within which the workflow (or each workflow run) is expected to complete. For example, a deadline time expression of "+7m" means the workflow (or each run of the workflow) is expected to finish within seven minutes.

The Deadline Window specifies how soon before the deadline the engine will publish the event flux.audittrail.server.DeadlineApproachingEvent to the audit trail. This event allows you to use the audit trail to view any workflows that are approaching their deadlines. For example, a time expression of "-7m" means "seven minutes before the date and time of the deadline".
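The arithmetic behind these relative expressions is simple. A minimal sketch that handles only the minutes unit (real Flux time expressions support far more than this) looks like:

```java
import java.time.Duration;
import java.time.LocalDateTime;

public class RelativeTimeSketch {

    // Parse a simple "+7m" / "-7m" style expression into a Duration.
    // Only the minutes ("m") unit is handled in this sketch.
    static Duration parseMinutes(String expr) {
        if (!expr.endsWith("m")) throw new IllegalArgumentException(expr);
        long minutes = Long.parseLong(expr.substring(0, expr.length() - 1));
        return Duration.ofMinutes(minutes);
    }

    public static void main(String[] args) {
        LocalDateTime started = LocalDateTime.of(2024, 1, 1, 12, 0);
        // Deadline: "+7m" after the run starts.
        LocalDateTime deadline = started.plus(parseMinutes("+7m"));
        // Deadline window: "-7m" relative to the deadline itself.
        LocalDateTime warnAt = deadline.plus(parseMinutes("-7m"));
        System.out.println(deadline); // 2024-01-01T12:07
        System.out.println(warnAt);   // 2024-01-01T12:00
    }
}
```

So for a run starting at noon, a "+7m" deadline with a "-7m" window means the DeadlineApproachingEvent can be published as soon as the run begins.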

In this example, we have a pair of workflows – one that polls for files, and another in charge of monitoring the SLAs of the first one. 

The first workflow watches a folder for files; for the sake of the example, we've pointed it to a folder that contains no files because we want the workflow to exceed its deadline.

The second workflow watches for the Audit Trail events flux.audittrail.server.DeadlineApproachingEvent and flux.audittrail.server.DeadlineExceededEvent, and prints a message when each of those events is published to the Audit Trail.

Download here: FileWatcher with deadline.ffc and SLA Monitor.ffc

Delete Files Older than 15 Days (Advanced)

This workflow polls a folder for files, calculates how long it's been since they were last updated/modified, and deletes all files that have a modification date of 15 or more days ago. 

It uses Runtime Data Mapping to pass the File Exists Trigger's RESULT.fileinfo_matches to a Collection, and to map each file's lastModified and url to flow context variables used in the Null Action. The Null Action has a script in place that calculates how many days have passed since each file was modified, like so:

import java.util.Date;

Date d1 = (Date) flowContext.get("lastModified");
Date d2 = new Date();
long diff = d2.getTime() - d1.getTime();
long diffDays = diff / (24 * 60 * 60 * 1000);
System.out.print(diffDays + " days, ");
if (diffDays >= 15) {
  // flag the file for deletion (variable name illustrative; see the download)
  flowContext.put("deleteFile", true);
}

Then, if 15 or more days have passed since the file was modified, the workflow flows into a File Delete action, which deletes the file and flows back to the For Each Collection Element action to process the next file in line. 
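A java.time variant of the same age check (a sketch, not the script shipped in the sample workflow) reads the modification time straight from the file system:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Duration;
import java.time.Instant;

public class FileAgeSketch {

    // True when the file's last-modified timestamp is 'days' or more days
    // before 'now'. Passing 'now' explicitly keeps the check testable.
    static boolean olderThan(Path file, long days, Instant now) throws IOException {
        Instant modified = Files.getLastModifiedTime(file).toInstant();
        return Duration.between(modified, now).toDays() >= days;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("age", ".txt");
        // A file created just now is not older than 15 days.
        System.out.println(olderThan(tmp, 15, Instant.now())); // false
        // Pretend "now" is 20 days later: the same file now qualifies.
        System.out.println(olderThan(tmp, 15,
                Instant.now().plus(Duration.ofDays(20)))); // true
    }
}
```

Duration.between avoids the manual millisecond division and behaves the same way across time zones, since file timestamps and Instant are both UTC-based.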

Download here: FLUX-SAMPLES-RemoveOldFiles.ffc

Query a Database, Build a Collection with the Results, and Iterate Through That Collection to Query a Second Database (Advanced)

This example demonstrates how to query a database, build a collection with the results and then iterate through the collection using the data to query a second database (or database table).

Why Can't I Run A Query Inside of Another Query?

Flux has a number of triggers and actions that provide the ability to iterate over their results within a workflow. These include the Database Query Action, Database Stored Procedure Action, and Regular Expression Action. To iterate over the results of these actions, a looping flow has to be defined to process all the results from the action or trigger.

Note that if, for instance, a database action tries to execute another database query action within a looping flow, a transaction break occurs at the start of the second database query action, closing the database result set from the first query. This results in a Flux exception, and the workflow fails.

This is a feature of Flux in its transaction management to provide for failover and load balancing. Since the transaction is committed, resources (including the result set from the database action) are closed. Database Result Sets, since they are operating system resources, cannot be serialized and saved to the Flux database and restored at a later point in the case of system failover or a restart. Therefore, the contents of a database result set need to be accumulated into an intermediate collection that CAN be serialized and stored for recovery.
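To see why the intermediate collection works where the result set does not, the sketch below round-trips a plain list of row values through Java serialization, which is essentially what Flux must be able to do with workflow state at a transaction boundary (a JDBC ResultSet would fail this round trip):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;

public class SerializableRowsSketch {

    // Serialize an object to bytes and read it back, simulating a
    // save-and-restore across a transaction boundary or failover.
    static Object roundTrip(Object o) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(o);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Rows drained from a result set into plain serializable values...
        List<List<String>> rows = new LinkedList<>();
        rows.add(Arrays.asList("pk-1", "pending"));
        rows.add(Arrays.asList("pk-2", "pending"));
        // ...survive the save/restore cycle, unlike the result set itself.
        Object restored = roundTrip(rows);
        System.out.println(restored.equals(rows)); // true
    }
}
```

The row values here are illustrative; the point is only that the accumulated collection is made of serializable types.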

This example illustrates how to avoid this issue and still perform queries within queries while abiding by Flux's transaction management.


This example uses the built-in database connection to Flux's internal database, and queries the Flux Ready Table. If you are querying a database other than Flux's, make sure you uncheck the checkbox "Built-In Database Connection Used."

The Start Action has a postscript added that creates a new List and saves it to the flowContext to be used later:

import java.util.*;
List list = new LinkedList();
flowContext.put("list", list);

We query the database and map the RESULT.row to a Flow Context variable using Runtime Data Mapping (check the flow properties of the flow between the DB Query and the Console action). Using a postscript on the Console action 'Accumulate Query Results Into List', we fill the List we created with the values retrieved from the database:

import java.util.*;
List list = flowContext.get("list");
// add the row value mapped from RESULT.row (variable name depends on your mapping)
list.add(flowContext.get("row"));
flowContext.put("list", list);

Once all resulting rows from the DB query are exhausted, we map the List to a For Each Collection Element action's Collection property. This is done, again, using Runtime Data Mapping; only this time it's in the ELSE flow (orange) between the DB Query for Pending Sends and the FECEA.

After that, the workflow uses the collection of data from the FECEA to query another database/table/etc. In this example we are simply querying the Flux Ready table again, using the primary keys collected in rows from the first query. When all the items in the collection have been exhausted, the workflow prints out a Finished message and exits.

Download Here: FLUX-SAMPLES-QueryDBIterateThroughResultAndQuery2DB.ffc

Query a Database, Build a Collection with the Results, and Initiate Multiple Workflows (Advanced)

A variation of the workflow shown above is to use a database query to start other workflows. 

See the parent workflows here: FLUX-SAMPLES-InitiateWorkflowsFromDatabaseQuery.ffc initiating this simple child workflow (asynchronously) here: DumpVariablesToConsole.ffc

import java.util.HashMap;
import flux.*;

Factory factory = Factory.makeInstance();
Engine engine = factory.lookupEngine("localhost", 7520);
engine.login("admin", "admin");
HashMap map = new HashMap();
for (int i = 1; i < 500; i++) {
  map.put("Message", i);
  engine.putFromRepository("/OutputMessage", "/Test" + i, map, false);
}
System.out.println("Finished outputting");

Run Database Stored Procedure and Print the RESULT fields (Advanced)

This example shows how to set up a Database Stored Procedure action and then print out the results returned.

You'll need a stored procedure to call in your database; in this example, a MySQL stored procedure is used:

-- --------------------------------------------------------------------------------
-- Routine DDL
-- Note: comments before and after the routine body will not be stored by the server
-- --------------------------------------------------------------------------------
CREATE DEFINER=`root`@`localhost` PROCEDURE `test`(IN param1 INT)
  SELECT * FROM example LIMIT param1;

Depending on the number of rows returned by the stored procedure, you might need to edit the postscript of the Database Stored Procedure action. Here's what's set up (the result contains 3 rows):


Download the workflow here: FLUX-SAMPLES_DatabaseStoredProcedure.ffc

Sample Code to Read from a JMS Queue

The following is unsupported example code to interact with a JMS queue from within Flux. Download the source code here:  jmstrigger(1).zip
