
eiConsole for Splunk


    Normalizing Data to Ingest into Splunk with PilotFish’s eiConsole for Splunk

    This is the PilotFish eiConsole for Splunk. It’s the integrated development environment that you will use to build, maintain, test and deploy all of your integrations.

    In this demonstration today, we’ll show how the eiConsole’s built-in Splunk Event Logging Transport can be leveraged to take input from multiple disparate systems and then normalize that data for use in Splunk.

    It’s important to note that with the PilotFish common model for integration, every one of these integrations, no matter how complex or how many sources and targets it has, goes through the same 7-stage assembly-line process. Data enters the system flow at the Listener stage; it is then transformed from its incoming data format into a canonical XML representation in the Source Transform stage. From there it moves on to the Routing stage, where the data can be routed arbitrarily to any number of target systems.
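
    To make that assembly line concrete, here is a minimal Python sketch of the same flow. The stage names mirror the eiConsole’s, but the functions are illustrative stand-ins rather than PilotFish code, and the optional processor stages on each side are omitted.

        def listen():
            """Listener: data enters the system (a hard-coded sample here)."""
            return "raw input from a source system"

        def source_transform(raw):
            """Source Transform: normalize the input into canonical XML."""
            return "<record><payload>" + raw + "</payload></record>"

        def route(xml):
            """Routing: fan out to any number of target systems (one here)."""
            return [xml]

        def target_transform(xml):
            """Target Transform: render the XML in the outgoing format."""
            return xml  # pass-through in this sketch

        def transport(payload):
            """Transport: deliver the data to its final destination system."""
            print("delivering:", payload)

        for message in route(source_transform(listen())):
            transport(target_transform(message))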

    The data then moves to the target side where, in the Target Transform stage, it is taken from that underlying XML format and turned into whatever data format the outgoing system expects, whether that’s JSON, EDI, HL7, or something else. Finally, at the Transport stage, the data is sent to its final destination system.
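
    As a rough illustration of a target transform, the sketch below takes a small canonical XML record and renders it as JSON using only the Python standard library. The record and its element names are invented for the example.

        import json
        import xml.etree.ElementTree as ET

        # A canonical XML record, as produced on the source side of the flow.
        canonical = "<record><patient>12345</patient><status>admitted</status></record>"

        # Turn each child element into a JSON field for the outgoing system.
        root = ET.fromstring(canonical)
        outgoing = {child.tag: child.text for child in root}

        print(json.dumps(outgoing))  # {"patient": "12345", "status": "admitted"}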

    As we mentioned, at a high level, what we’re doing in this demonstration is taking EDI from a generic socket listener, exposing a RESTful endpoint that consumes JSON, and polling a Microsoft SQL Server database. We start transactions with the data we receive from each of these sources. Then, in our Source Transform stage, we take the incoming EDI, the JSON, and the SQL result sets and normalize all of that data into XML.
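
    As a hypothetical stand-in for the second of those sources, here is a minimal RESTful endpoint that accepts JSON POSTs; the port and the print-based handoff are invented for the sketch.

        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class JsonListener(BaseHTTPRequestHandler):
            """Accepts JSON POSTs and starts a transaction for each one."""

            def do_POST(self):
                length = int(self.headers.get("Content-Length", 0))
                record = json.loads(self.rfile.read(length))
                print("starting transaction with:", record)  # hand off to the flow
                self.send_response(200)
                self.end_headers()
                self.wfile.write(b"accepted")

        if __name__ == "__main__":
            HTTPServer(("localhost", 8080), JsonListener).serve_forever()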

    To do that, we’re using our transformation modules. One thing you’ll notice here is that each of our modules is listed, just as with our transports. Transformation modules follow the same basic design principle, where each module has a base set of configuration that’s required for use and then some more advanced configuration options if necessary.

    Here we’re using our EDI transformation module to take that raw EDI input and turn it into the baseline canonical XML representation. We do the same for our JSON input and for the XML results from our SQL database listener.
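
    For instance, an incoming JSON payload might be normalized into canonical XML along these lines. This is a standard-library sketch with invented field names, not the module’s actual output format.

        import json
        import xml.etree.ElementTree as ET

        incoming = '{"orderId": "A-100", "quantity": 3}'  # sample JSON input

        # Wrap each JSON field in an XML element under a common root.
        root = ET.Element("record")
        for key, value in json.loads(incoming).items():
            ET.SubElement(root, key).text = str(value)

        print(ET.tostring(root, encoding="unicode"))
        # <record><orderId>A-100</orderId><quantity>3</quantity></record>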

    Next, we move on to the Routing stage, and since we only have one target system, we’re not doing a whole lot here; we’re simply routing our data through to that target system.

    And then finally, we move on to our Transport stage. For this, we’re utilizing one of our new pieces of Splunk functionality: the Splunk Event Logging Transport. What it does, essentially, is post any incoming transaction data to a running Splunk Enterprise instance via HTTP.

    All you need to do is define your connection and some basic index, host, and source name information. Any data that comes through this system will then, when it reaches the Transport stage, be inserted into the appropriate Splunk index as normal event logs, ready for use there.
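
    The transport is described here only as posting data to Splunk over HTTP; the standard mechanism for that is Splunk’s HTTP Event Collector (HEC), so the sketch below shows an equivalent raw HEC call. The host, token, and index values are placeholders to replace with your own.

        import json
        import urllib.request

        HEC_URL = "https://splunk.example.com:8088/services/collector/event"
        HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

        # Event metadata (index, host, source) plus the normalized event body.
        payload = {
            "index": "main",
            "host": "eiconsole-demo",
            "source": "eiconsole",
            "event": {"patient": "12345", "status": "admitted"},
        }

        request = urllib.request.Request(
            HEC_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={
                "Authorization": "Splunk " + HEC_TOKEN,
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(request) as response:
            print(response.read().decode())  # e.g. {"text":"Success","code":0}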

    Just to recap this demonstration: we’re using our Splunk Event Logging Transport to take data from multiple disparate systems, normalize that data, and post it into Splunk as transaction event logs.

    So if you’d like to take a test drive, you can Download a Free 90-Day Trial of the eiConsole for Splunk from our website.

    If you’re curious about the software features, free trial, or even a demo – we’re ready to answer any and all questions. Please call us at 813 864 8662 or click the button.

    *Splunk is a registered trademark of Splunk Inc. in the United States and other countries.
    HL7 is the registered trademark of Health Level Seven International. 
