Advanced Mirth Functionality

Rick Wattras

Chaining together channels

It's been mentioned a few times in this guide, but one useful application of the ability to route messages to other channels on the same Mirth instance is to offload complex transforms, or some other separate functionality, to dedicated channels. The core pattern is a primary channel using a Channel Writer (or the router class) to send data to a second channel that listens with a Channel Reader. The second channel performs some action, typically a transform, though it can also simply act as an alternate way to send to a secondary downstream destination. It then usually returns a Response that the initial sending channel parses, either to receive the transformed data or to confirm that the processing succeeded. The chain can be as many channels long as your workflow requires (a minimal sketch follows the examples below).

Some examples of where "chaining" channels can come in handy:

  • Disparate feeds require a unified complex transform
    • In this situation, you only need to maintain one copy of the source-of-truth transformer channel, which simply receives data and responds with the translated data. The initial sending channel then reads the response to receive the transformed message.
  • Using an 'aggregator' channel to receive messages that will be processed differently downstream depending on content in the message
    • Say your integration partner requires that all HL7 messages be passed over one interface, but you need to map different message types to different models. You can have a primary TCP Listener channel accept the firehose of messages, then create a secondary channel for each message type you want to map. Each destination on the primary channel can be configured to filter on its appropriate message type and use a Channel Writer to send those messages to the corresponding downstream mapping channel.
  • Allow for multithreading without affecting message order
    • Multithreading can be useful for processing messages at a faster rate, but it can lead to messages being processed out of order if it's configured on the receiving channel. Offloading processing to a downstream channel, as in the first bullet point above, and enabling multithreading there lets you process messages concurrently downstream while the primary receiving channel still accepts them in order.
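
As a minimal sketch of the pattern (the channel name here is hypothetical, and the downstream channel is assumed to respond with the transformed data), a destination JavaScript Writer on the primary channel might look like this:

// Route the raw message to the downstream transformer channel and capture its Response
// "Complex Transform Channel" is a placeholder name; that channel must use a Channel Reader source
var response = router.routeMessage("Complex Transform Channel", connectorMessage.getRawData());

// If the downstream channel errored, fail this message; otherwise pick up the transformed data
if (response.getStatus() == Status.ERROR) {
	throw(response.getError());
}
channelMap.put("transformedData", response.getMessage());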

Auth0 Integration

Auth0 is a popular platform for authenticating your API requests by retrieving a token. Here is an example Destination JavaScript Writer that you can use as a template for authenticating against a valid Auth0 implementation and parsing the response to retrieve the token.

importPackage(Packages.org.apache.http.client);
importPackage(Packages.org.apache.http.client.methods);
importPackage(Packages.org.apache.http.impl.client);
importPackage(Packages.org.apache.http.message);
importPackage(Packages.org.apache.http.client.entity);
importPackage(Packages.org.apache.http.entity);
importPackage(Packages.org.apache.http.util);

var httpclient = new DefaultHttpClient();

var httpPost = new HttpPost("<<Insert your Auth0 '/oauth/ro' authentication URL here>>");

// Fill in each of the fields below by entering your values between the ""'s
var authJSON = {
	"client_id": "",
	"username": "",
	"password": "",
	"connection": "",
	"grant_type": "",
	"response_type": "",
	"scope": ""
};

httpPost.setEntity(new StringEntity(JSON.stringify(authJSON)));
httpPost.addHeader('Content-Type', 'application/json');

// Execute the HTTP POST
var resp;
try {
	// Get the response
	resp = httpclient.execute(httpPost);
	var statusCode = resp.getStatusLine().getStatusCode();
	var entity = resp.getEntity();
	var responseString = EntityUtils.toString(entity, "UTF-8");
	
	// Save off the response and status code to Channel Maps for any potential troubleshooting
	channelMap.put("responseString", responseString);
	channelMap.put("statusCode", statusCode);
	
	// Parse the JSON response
	var responseJson = JSON.parse(responseString);

	// If an error is returned, manually throw an exception
	//	Else save the token to a channel map for use later in the processing
	if (statusCode >= 300) {
		throw(responseString);
	} else {
		channelMap.put("token", "Bearer " + responseJson.id_token);
	}
} catch (err) {
	logger.debug(err);
	throw(err);
} finally {
	// Guard against the case where execute() threw before resp was assigned
	if (resp) {
		resp.close();
	}
}
  • Just insert your implementation-specific information into the template above
  • Typically, Auth0 tokens are only valid for a certain amount of time. You can either retrieve a new token before each request (for example, use multiple destinations where the first retrieves the token and saves it to a Channel Map, and the subsequent destination(s) use that token to make their requests; token retrieval then becomes part of the workflow each time a message is processed, so you always have a valid token), or you can build a channel whose sole purpose is to refresh tokens on a regular basis and save them to the Global Map for use by any channel that needs one (see the sketch below).
    • Just be aware with the latter approach that you'll need a way for messages to not fail if their token is refreshed in the middle of processing. One option is to have the destination automatically retry if it fails to send due to an expired token. Feel free to reach out to Datica for guidance or help troubleshooting.
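
As a minimal sketch of the Global Map approach (the map key "authToken" is just an example name), the refresher channel saves the token and any consuming channel reads it:

// In the token-refresher channel, after parsing the Auth0 response as in the template above:
globalMap.put("authToken", "Bearer " + responseJson.id_token);

// In any channel that needs to authenticate, read the current token when building the request:
httpPost.addHeader('Authorization', $('authToken'));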

Custom JAR libraries

If you've created a JAR library containing custom functions that you would like to call within Mirth, contact Datica and provide them with your .jar file; they will place it in your Mirth container's custom-libs directory so that Mirth can pick it up. Once it's in place on the container, simply go to Settings → Resources tab → Reload Resource and you should see your .jar listed in the Loaded Libraries section.

Once you've confirmed that your library is loaded into Mirth, you can use its classes just as you would any of the built-in libraries. Import them using the importPackage(Packages.<path>) syntax, such as "importPackage(Packages.org.apache.http.client);".
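
For instance, a minimal sketch assuming a hypothetical custom library containing a com.example.hl7utils package with a Formatter class:

// com.example.hl7utils and Formatter.normalize() are hypothetical names; substitute your own
importPackage(Packages.com.example.hl7utils);
var normalized = Formatter.normalize(connectorMessage.getRawData());
channelMap.put("normalizedMsg", normalized);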

Advanced message search

When viewing a channel's messages in the Dashboard, you'll notice a button labeled "Advanced..." below the normal message search filtering options. Clicking it opens the Advanced Search Filter, where you can fine-tune what you're looking for beyond the date range and text search offered by the basic search. Here you can limit the search to certain destinations, a specific message ID range, or a certain number of send attempts, or even look for a data string within a specific content type (ex. "Encoded"). Here's an example Advanced Search configuration:

  • Note that any advanced search parameters are applied in addition to any basic search parameters such as date range. 

AWS S3 Integration

A common workflow may entail sending data to an AWS S3 cloud storage repository. By default, Datica's Mirth implementations come with the AWS SDK libraries already loaded into Mirth, so you can take advantage of this right out of the box. Here's a quick template Destination JavaScript Writer that you can use as a foundation for setting up your own S3 integration (note this example is for sending JSON content):

// Create a temp file to send containing JSON content
var file = Packages.java.io.File.createTempFile("tempfile", ".json"); // createTempFile is static, so no "new"; name this whatever you want, it will be deleted once it's sent
var bw = new Packages.java.io.BufferedWriter(new Packages.java.io.FileWriter(file));
bw.write(JSON.stringify($('jsonObj'))); // In this example the data we want to send is in a Channel Map called "jsonObj"
bw.close();

// Set the filename and key name
var fileName = $('Channel Name') + "_" + connectorMessage.getMessageId() + ".json"; // Dynamic file name for the object created in S3
var keyName = "path/of/your/choosing/" + fileName; // Full S3 object key: the directory path plus the file name

// Set up the Request
var bucketName = "<<Insert the bucket name here>>";
var awsCreds = new Packages.com.amazonaws.auth.BasicAWSCredentials("<<Insert AWS account Access Key here>>", "<<Insert AWS account Secret Key here>>");
var s3client = new Packages.com.amazonaws.services.s3.AmazonS3Client(awsCreds);
var putRequest = new Packages.com.amazonaws.services.s3.model.PutObjectRequest(bucketName, keyName, file);

// Server-Side Encryption
var objectMetadata = new Packages.com.amazonaws.services.s3.model.ObjectMetadata();
objectMetadata.setSSEAlgorithm(Packages.com.amazonaws.services.s3.model.ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
putRequest.setMetadata(objectMetadata);

// Send Request
s3client.putObject(putRequest);
file.delete();


Processing Binary Files

Should you have a need to process binary files such as images or PDFs, Mirth has a lot of built-in capability to make this relatively easy. If you're reading in binary files, the File Reader source connector type has a "File Type" setting that can be changed to Binary from the default Text. With that enabled, any files you pick up will be assumed to have binary content, and Mirth will automatically convert it to base64-encoded text in order to display it in the Dashboard:

Example binary message as base64 in the Mirth channel dashboard:


If you'd like to write binary files out, you can use the File Writer destination and similarly set the "File Type" to Binary. Then, in the "Template" text box, simply pull in the Channel Map or Destination Map variable containing the binary data, and Mirth will automatically store it appropriately in the file, even if it appeared base64-encoded in the dashboard.
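
If you need to handle the content in a script instead, here's a minimal sketch (assuming the message content is the base64 text Mirth produced, and the output path is just an example) using Mirth's built-in FileUtil class:

// Decode the base64 message content back into raw bytes
var bytes = FileUtil.decode(connectorMessage.getEncodedData());

// Write the bytes out; FileUtil.write(path, append, bytes) is part of Mirth's built-in userutil classes
FileUtil.write("/path/of/your/choosing/output.pdf", false, bytes);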

Using the built-in Mirth API

Mirth version 3.4 introduced an easily accessible API that can be used to perform, from an external location, many functions typically only available from within the Mirth GUI (the caller still needs access to the Mirth network). You can use the API to automate tasks, deploy channels, retrieve messages, and more. To access the API from within Mirth, click the "View Client API" option in the Other pane in the bottom left corner of the display. This should launch a web browser to https://<Your Mirth IP>:8443/api, which displays the API documentation (if you run into a security warning, you can add the URL to the exception list and continue):

Clicking each of the categories in the list will display all of the available methods and API calls, plus their functions and uses:

As you can see, there is a lot of available functionality and some pretty solid documentation to make using it decently straightforward. If you click a specific call, it opens a GUI where you can view the response model and even build your own request to test right from the web page (make sure you're logged in via the login section at the top right of the page; your Mirth account credentials should work here). Be aware that, as good as the documentation is, we have come across cases where the listed models differ slightly from what the API actually expects. If you run into any trouble using the API, you can check out the available Mirth Connect resources or their user forums, or feel free to reach out to Datica for guidance.
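
As a minimal sketch of calling the API from a script (assuming Mirth 3.4+, the default 8443 port, credentials with API access, and that the server's certificate is trusted by the client; exact endpoint paths can vary by version), you can log in once and reuse the same client, which retains the session cookie:

importPackage(Packages.org.apache.http.client.methods);
importPackage(Packages.org.apache.http.impl.client);
importPackage(Packages.org.apache.http.entity);
importPackage(Packages.org.apache.http.util);

// The default client keeps a cookie store, so the session from the login call carries over
var client = HttpClients.createDefault();

// Log in with a form-encoded POST; substitute real credentials
var login = new HttpPost("https://<Your Mirth IP>:8443/api/users/_login");
login.addHeader("Content-Type", "application/x-www-form-urlencoded");
login.setEntity(new StringEntity("username=<user>&password=<password>"));
EntityUtils.consume(client.execute(login).getEntity());

// Fetch the status of every channel (the API returns XML by default)
var statusResp = client.execute(new HttpGet("https://<Your Mirth IP>:8443/api/channels/statuses"));
logger.info(EntityUtils.toString(statusResp.getEntity(), "UTF-8"));

Note that the default client will reject Mirth's self-signed certificate; if that's an issue, you can build the client around a custom SSL context, similar to the certificate authentication example below but using loadTrustMaterial to trust the server's certificate.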

Certificate authentication

In cases where you need to authenticate to a destination endpoint with a certificate, Mirth unfortunately does not have the needed functionality within any of its default destination connectors (in particular, the HTTP Sender). However, you can build your own HTTP requests using the Apache libraries included with the Mirth build and a custom JavaScript Writer. The only caveat is that you'll have to work with Datica to install the certificates into a keystore on the Mirth server, but once they're in place you should be able to use the keystore path to build a custom HttpClient object with which to build your HTTP requests, like so:

// Client keystore path with password
var cksPassword = new java.lang.String("<<Insert keystore password here>>");
var cks = java.security.KeyStore.getInstance("pkcs12"); // update the keystore type if necessary, pkcs12 is standard for Datica
cks.load(new java.io.FileInputStream("<<Insert keystore path here>>"), cksPassword.toCharArray());

// Build the SSL context
var sslcontext = org.apache.http.conn.ssl.SSLContexts.custom()
       .loadKeyMaterial(cks, cksPassword.toCharArray()) 
       .build();

// Create the custom httpclient object
// (ALLOW_ALL_HOSTNAME_VERIFIER disables hostname verification; be aware of the security implications)
var csf = new org.apache.http.conn.ssl.SSLConnectionSocketFactory(sslcontext,
			org.apache.http.conn.ssl.SSLConnectionSocketFactory.ALLOW_ALL_HOSTNAME_VERIFIER);
var httpclient = org.apache.http.impl.client.HttpClients.custom().setSSLSocketFactory(csf).build();

// Build your HTTP request here in the same way you would otherwise
//
//
///////////////////////////////////////////////////////////////////

// Later, your execute call will refer to the httpclient object from before
var resp = httpclient.execute(httpPost);
  • If you see errors mentioning "PKIX" when you attempt to send your requests, there may be an issue with the certificate path, the keystore, or the certificate itself. Contact Datica support for help troubleshooting as necessary.