
X12 EDI – eiConsole IDE


    X12 EDI Supply Chain Workflow Demo: Configure an 850 Ingestion Interface and Return an 855 Acknowledgment

    In this short demo (built and tested in under 10 minutes), you will see how PilotFish eiConsole is used to configure a purchase order ingestion workflow that receives EDI 850, validates key fields, writes purchase order data to a database and returns an EDI 855 acknowledgment when required.

    [Try the Free 90-day Trial]     [Talk to an Expert]

     

    30-second summary

    • Receive X12 EDI 850 purchase orders over SFTP/FTP, apply pre-processing, then transform EDI to XML for mapping and validation
    • Use the 3-pane drag and drop Data Mapper to apply validation rules, while generating W3C-compliant XSLT under the hood
    • Route transactions based on payload values and sender preferences, then generate and send an EDI 855 only when requested
    • Scale integration across protocols, queues, cloud and databases with 40+ Listeners, 35+ Transports and 140+ Processors

     

    Inside the Purchase Order Ingestion Pipeline

    1. Inbound connectivity: Listener stage
      This route begins by picking up inbound X12 EDI from SFTP/FTP using a Listener configured with host, port, credentials and polling frequency.
    2. Pre-processing: Processor stage
      Before transformation, Processors can normalize the payload, for example decrypting encrypted EDI, handling character encoding, or decompressing archives. In the demo, an asymmetric decryption processor is applied conditionally.
    3. Transform EDI to XML: Source Transform stage
      The Source Transform converts inbound EDI into a common XML structure using the EDI transformation module and X12 table data for EDI 850. The route also enables “Friendly Naming” (human-readable naming) and segment indexing to simplify downstream mapping.
    4. Validate with visual mapping: Data Mapper
      In the Data Mapper, the workflow applies field validation without changing the structure of the EDI 850. In the demo, a validation rule compares PO1 line item count to the CTT total and annotates the message when a mismatch occurs. The mapping generates W3C-compliant XSLT and supports inline test execution.
    5. Route based on trading partner rules: Routing stage
      Routing rules determine which targets receive the message. In the demo, database updates occur for every EDI 850, while an EDI 855 is only generated when criteria are met (for example BEG07 value and an acknowledgment request).
    6. Outbound transformations and delivery: Target Transform and Transport
      On the target side, the workflow maps into an 855 XML structure and then produces outbound EDI 855. For delivery, the route uses a database transport for inserts and an HTTP POST transport to send the 855 responses.
    7. Test, troubleshoot and iterate: Testing stage
      The eiConsole testing mode runs byte-for-byte identical to the eiPlatform runtime and supports starting tests from any stage, replaying saved test configs, reviewing pass/fail markers and drilling into detailed exceptions when a stage fails.
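    The seven stages above can be sketched as a small pipeline of plain functions. This is illustrative Python only — PilotFish configures these stages graphically, and the function names, target names and segment handling below are assumptions for the sketch:

    ```python
    # Illustrative sketch of the assembly-line flow; not PilotFish code.

    def preprocess(payload: str) -> str:          # 2. Processor: normalize payload
        return payload.strip()

    def to_segments(edi: str):                    # 3. Source Transform: EDI -> structure
        return [s.split("*") for s in edi.split("~") if s]

    def validate(segments):                       # 4. Data Mapper: annotate on mismatch
        po1 = sum(1 for s in segments if s[0] == "PO1")
        ctt = next((int(s[1]) for s in segments if s[0] == "CTT"), po1)
        return None if po1 == ctt else "CTT vs PO1 mismatch"

    def route(segments):                          # 5. Routing: DB always, 855 on request
        targets = ["database"]
        beg = next((s for s in segments if s[0] == "BEG"), None)
        if beg and len(beg) > 7 and beg[7] == "AC":   # BEG07 == "AC"
            targets.append("855-response")
        return targets
    ```

    Feeding a one-line-item 850 through the sketch yields a clean validation and both targets; a CTT total that disagrees with the PO1 count produces the mismatch annotation instead.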

     

    Build Supply Chain EDI Interfaces in 7 Stages

    PilotFish eiConsole structures route configuration into a repeatable 7-stage flow that maps well to supply chain EDI patterns: define source, receive data, transform to XML, apply routing, map to targets, deliver to endpoints, then test and debug.

     

    Supported Supply Chain and Cross-Use X12 Transactions

    This demo focuses on 850 and 855, but PilotFish supports common supply chain and cross-use X12 transaction sets such as 810, 832, 846, 856, 860, 864 and 997. PilotFish is an X12 Partner.

     

    Deployment with eiPlatform

    Routes authored in the eiConsole IDE are deployed for unattended execution on the eiPlatform, which supports hot-deploy patterns for operational continuity and runs well in Docker containers.

     

    Security, Observability and Governance

    • Encryption in transit and at rest with least-privilege credential patterns
    • Audit logging for message flow and configuration changes
    • Metrics, alerts and dashboards for throughput, error rates and SLAs

     

    Why PilotFish?

    • Faster time to first supply chain interface through visual tooling and reusable maps
    • Lower maintenance with a consistent assembly line model from build to production
    • Better operational visibility through testing workflows and production reporting options

    FAQ


    Can a route always update the database but only return an 855 when the sender requests one?

    Yes. The demo shows routing rules that always send to the database target but only send to the 855 target when acknowledgment criteria are met.


    Does the Data Mapper lock you into a proprietary mapping format?

    No. The mapping generates W3C-compliant XSLT, enabling portability across XSLT engines.


    How do I handle encrypted, compressed or oddly encoded inbound EDI?

    Use the Processor stage to chain pre-transform operations such as decryption, decompression and character set normalization before the Source Transform.


    Can I test routes before deploying them to production?

    Yes. The eiConsole testing mode runs byte-for-byte identical to the eiPlatform runtime and supports stage-level test execution and debugging.


    Does PilotFish run in Docker?

    Yes. PilotFish is optimized for Docker containerization.


    Check out our FAQ pages for more.


    X12 EDI Supply Chain Workflow Interface Configuration in the eiConsole

    This is the PilotFish eiConsole. It’s the integrated development environment you’ll use to build, test, maintain and deploy all of your integration interfaces. 

    For this video today, we’ll be looking at the sample supply chain EDI workflow and doing some purchase order ingestion (EDI 850). We’ll be taking in some EDI, doing some field validation, a little bit of tweaking and then updating a database and then also sending an acknowledgment back to the sender (EDI 855). So let’s get into it. 

    Main Route Configuration

    When opening our route here, you’ll see our main route grid, which is where you’ll do the majority of the construction of your route. This is where you’ll define your source and target systems, set up your validation work in the mapper, and all those sorts of things. 

    This is organized by having our source systems on the left (these are all the places that we’re taking data from) and we have our target systems on the right (these are all the places that we’re sending data to). The data enters the system and flows through the 7-stage assembly line process as it’s finally sent to its destination system. 

    X12 Purchase Order Route Overview

    So, for this route, as we can see from our sources, we’re picking up some X12 EDI from our SFTP and looking at our transport systems (our target systems). Here, we’re updating our purchase order archive database. And then, in our second target system, we’re sending that 855 acknowledgment back to our sender. 

    Identify Source System in Route

    So, at this point, let’s look at each of the 7 stages and the role that they play in the integration process. The first stage on the far left is the Source System stage – which is mainly documentation. So, for each of our different source systems, we can give it a name, some metadata (if desired) and also a customized icon. This is very useful when you’re working collaboratively on these different routes – to be able to quickly identify the source and target systems, what they’re doing, where the data is coming from – all those sorts of things. 

    Listener Configuration for Incoming Data

    Next, we’ll move into the first functional stage of the integration pipeline, which is our Listener stage. So, if we want to integrate some data, the first thing we have to do is get that data, and we use the Listener stage for that. 

    We have pre-built listeners for all kinds of ways to get data into the system. Whether it’s an FTP (as we’re using here), a directory, an email, a database query, or something from the cloud (like AWS or Google Cloud), we’ve got a Listener for it. 

    Module Configuration

    All of our different modules work by taking a base level of configuration (these are the required field values) plus more advanced configuration options (with sensible defaults, if needed). So, for our FTP listener here, all we really need to set up is our host, port, our credentials, and how often we want to poll, and then it’s ready to go and receive files. 

    Data Processor Stage

    Next, we move into the Processor Stage. The Processor Stage is where we perform any pre-transformative operations on this data. Let’s say we’re getting our EDI from our SFTP, and it’s encrypted, in some foreign character encoding, or in a zip file that we need to decompress. We’ll chain processors to perform all those operations – so that when we get to the next stage, we know the data is ready for transformation. Here in our purchase order workflow, we’re using an asymmetric decryption processor with conditional execution to say, “If the EDI payload that I’m picking up is encrypted, I want to go ahead and decrypt it before I send it on to the next stage”. 
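    Chained pre-transform processors behave roughly like this sketch. The functions here are stand-ins for PilotFish's processor modules, not their actual implementation; only the gzip-decompress and re-encode steps are shown, and the decision is made from the payload's magic bytes:

    ```python
    # Sketch of chaining pre-transform processors before the Source Transform.
    import gzip

    def maybe_decompress(payload: bytes) -> bytes:
        # gzip payloads start with the magic bytes 0x1f 0x8b
        return gzip.decompress(payload) if payload[:2] == b"\x1f\x8b" else payload

    def normalize_encoding(payload: bytes) -> str:
        # re-encode whatever arrived into a predictable string
        return payload.decode("utf-8", errors="replace")

    def run_processors(payload: bytes) -> str:
        # each processor runs conditionally, then hands off to the next
        for step in (maybe_decompress,):
            payload = step(payload)
        return normalize_encoding(payload)
    ```

    Either way the data arrives — compressed or plain — the Source Transform downstream sees the same normalized text.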

    Source Transformation into Common XML Structure

    Next, we’ll move into the Source Transform stage. This stage converts any non-XML formats into an XML structure that we understand. It is a two-part process. 

    First, we use a Transformation module to take any non-XML formats and give us a baseline XML representation. Second, we’ll do a logical mapping – this is an XML-to-XML mapping using our data mapper. Since we’re ingesting some EDI here, we’re using our EDI transformation module. We’re supplying the X12 table data for the transaction that we’re trying to work with, in this case – EDI 850. And since we’re also dealing with EDI, we’re going to add some enhancements to the XML that we’re producing. We’re going to use friendly naming here to add more human-readable names to that XML. To help us later on with our mapping, we’re going to also add some segment indexing. 
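    Conceptually, the EDI-to-XML step with friendly naming and segment indexing looks something like this sketch. The element names in the lookup table (e.g. `PurchaseOrderNumber`) and the attribute names are assumptions for illustration, not PilotFish's actual output schema:

    ```python
    # Sketch: raw X12 segments become XML elements; a lookup table attaches
    # human-readable "friendly names" and each segment gets an index attribute.
    import xml.etree.ElementTree as ET

    FRIENDLY = {                      # assumed names for a few 850 elements
        ("BEG", 3): "PurchaseOrderNumber",
        ("BEG", 5): "Date",
        ("PO1", 2): "QuantityOrdered",
    }

    def edi_to_xml(edi: str) -> ET.Element:
        root = ET.Element("X12-850")
        for i, seg in enumerate(s for s in edi.strip().split("~") if s):
            parts = seg.split("*")
            node = ET.SubElement(root, parts[0], index=str(i + 1))  # segment indexing
            for j, value in enumerate(parts[1:], start=1):
                el = ET.SubElement(node, f"{parts[0]}{j:02d}")      # e.g. BEG03
                friendly = FRIENDLY.get((parts[0], j))
                if friendly:
                    el.set("name", friendly)                        # friendly naming
                el.text = value
        return root
    ```

    The indexed, friendly-named elements are what make the later drag & drop mapping readable.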

    Mapping the Data in Drag & Drop Mapping Tool

    The second piece of our Transformation stage is this logical mapping using our data mapper, which I’ll show now. 

    Here we have our Data Mapper – our 3-pane graphical drag & drop mapping tool. This is organized by having our source format on the left – this is our incoming expected XML structure. We have our target format on the right – this is the XML structure that we’re trying to produce. You drag & drop from the left to the right (use the tool palettes at the top for more complicated operations) to construct your mapping. Since we’re taking in some EDI data, we’ve got our EDI XML format on the left – you should recognize our loops here (our PO1 & CTT from that incoming 850). 

    Field Validation in Data Mapper

    What we’re really doing in this mapping is some field validation. We’re not changing the structure at all—we’re just adding to that data structure if any of our validation rules aren’t met. Our target format is also EDI 850. We’re using an identity template to essentially copy over all of the XML, so we’re copying directly across. And then, for any of our loops or defined elements that match, we’re going to perform this validation rule. 

    Here, we’re using a pretty simple validation rule to say, “Get the count of all of our PO1s and make sure that they’re equal to our total.” We’re saying, “If my line item total doesn’t equal the count of actual line items that I have, then I want to go ahead and identify with this CTT versus PO1 mismatch text attribute.” 
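    In XSLT terms this is an identity template plus one extra rule; a minimal Python sketch of the same check follows. The attribute name mirrors the demo's "CTT versus PO1 mismatch" wording but is an assumption, as are the element names:

    ```python
    # Sketch of the CTT-vs-PO1 rule: count the PO1 line items, compare them
    # to the CTT01 total, and annotate the message only when they disagree.
    import xml.etree.ElementTree as ET

    def check_line_item_count(root: ET.Element) -> ET.Element:
        po1_count = len(root.findall(".//PO1"))
        ctt01 = root.find(".//CTT/CTT01")
        if ctt01 is not None and int(ctt01.text) != po1_count:
            # structure is untouched; we only add the annotation
            root.set("validation-error", "CTT vs PO1 mismatch")
        return root
    ```

    As in the demo, a matching count passes through unchanged, so downstream stages see the original 850 structure either way.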

    Data Mapper Generates XSLT

    A little more about our data mapper – under the hood, this is all generating W3C-compliant XSLT, so there are no binary formats – this will run in any XSLT engine. Here in the data mapper, we also have a built-in testing mode. So you can feed in a sample file (as I’ve done here), then you can push “go” and immediately see the results of that incoming data pushed through the mapping that we’ve put together. Taking a step back – we ingested our data via the Listener, and the Source Transform converted it into XML and did a little field validation. Next, we move into the Routing stage. 

    Data Routing Rules in the Routing Stage

    The Routing stage performs a couple of different operations, but the main one that we’re going to focus on here is our routing rules. Routing rules are our primary way of taking our incoming source data and sending it only to the target systems that we want. We can define arbitrarily complex routing rules that say: based on some information in the transaction (or some metadata that we’ve set), follow these rules, flag the ones that match and then only send the data to those target systems. 

    Here, I’ve got a couple of simple rules set up. My first rule here is to look at that EDI XML and say if my BEG07 is AC and if my sender has requested an acknowledgment (EDI 855), then I want to send this message to my second target system here. Otherwise, I just want to go ahead and, in all cases, send it on to the database transport. So we’ll always be updating our database, but only if our sender wants an acknowledgment will we send the data to the second target system. 
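    Those two rules reduce to a small decision function. This is a hedged sketch — the target names and the sender-preference flag are assumptions, and PilotFish expresses the same logic declaratively in the Routing stage rather than in code:

    ```python
    # Sketch of the demo's routing rules: every 850 goes to the database
    # target; the 855 target is added only when BEG07 is "AC" and the
    # sender has asked for an acknowledgment.

    def pick_targets(segments, sender_wants_ack: bool):
        targets = ["purchase-order-archive-db"]          # unconditional target
        beg = next((s for s in segments if s[0] == "BEG"), None)
        if beg and len(beg) > 7 and beg[7] == "AC" and sender_wants_ack:
            targets.append("edi-855-response")           # conditional target
        return targets
    ```

    The unconditional database target is what keeps the archive complete even when no acknowledgment is owed.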

    Configuring the Target Transformation

    Moving on to the target side, it operates basically the same as the Source side, but in reverse. 

    First, we’re doing our Target Transform using logical mapping, and then we’re using our transformation module to produce any non-XML formats. For instance, here, looking at our second target system, since we’re producing that 855 response, we’re using our logical mapping to map into that 855 XML and then we’re using our EDI transformation module to actually produce that final EDI. 

    Route Transport System Setup

    Next, let’s look at the Transport stage – Transports operate analogously to Listeners, except here we’re pushing data to a target system. For instance, since we’re updating this purchase order archive database, we’re using our database SQL transport to perform these database inserts. And then in our second target system, we’re using our HTTP POST transport to send our 855 response back to this target URL. Our last stage (Target System) is identical to the Source System stage, where we can give our target systems a name, icon and some metadata. 

    Route Testing Stage

    Now that we’ve got all of our pieces in place, let’s see this in action. Here in the eiConsole, we also have a built-in testing mode. This runs byte-for-byte identical to the eiPlatform – the headless runtime execution engine – so there’s no compilation necessary to test your routes here. In the testing mode, you can start and stop your tests from any stage, feed in alternate data and transaction attributes to test those crazy edge cases, and save your testing configurations for easy replayability later. 

    That’s what I’ll do here, using this sample 850 file that I’m feeding in. When I hit “execute tests,” we’ll see the data move left to right through the system. We’ll get green checkmarks for any successes and then we’ll get red Xs in the event that we have any failures on any stages. 

    So here, looking at the data quickly, we’re starting with just a basic single-line-item EDI 850. Next, we move into our Source Transform stage – the transformation module that took the incoming EDI and transformed it into this XML structure. Here, you can see the friendly names (mentioned earlier), where each of our segments and elements has a really easy-to-understand, human-readable name attached to it. 

    Route Validation Stage

    Next, we move into our Route Validation stage. As we can see, all we really did was copy the data over. If any of our validation rules had not been met, we would have had that extra element that we would inject into it. But since our number of total line items equals the number of purchased line items, we didn’t break our validation, so there’s no change here. 

    Moving into the Routing stage, we can see that since our 850 message included that AC, we did end up sending to both of our target systems. Looking at our database target, we’re using XSLT to produce some SQLXML – an XML representation of a database operation. So we’ve taken that 850 and transformed it into these two insert statements, saying “I want to insert into this transaction table” and “Here’s my summary purchase order information”. 

    Down here is my second insert. I’m going to insert into my order item table and use the identity from this first insert as a foreign key. Here is some more information that I want to insert into my order item table. Looking at our second target system, we’re using XSLT to generate our 855 message and then looking at the results of our EDI transformation. Here we see that we’ve produced that EDI 855 message that’s ready for sending. 
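    The linked parent/child inserts behave like this sketch. PilotFish emits SQLXML rather than Python, and the table and column names below are assumptions; `sqlite3` stands in for the archive database so the identity/foreign-key hand-off is visible:

    ```python
    # Sketch of the demo's two inserts: write the PO summary first, then use
    # its generated identity as the foreign key on each order item row.
    import sqlite3

    def archive_po(conn, po_number, items):
        cur = conn.cursor()
        cur.execute("INSERT INTO po_transaction (po_number) VALUES (?)", (po_number,))
        po_id = cur.lastrowid                      # identity from the first insert
        for line_no, qty in items:
            cur.execute(
                "INSERT INTO order_item (po_id, line_no, qty) VALUES (?, ?, ?)",
                (po_id, line_no, qty),
            )
        conn.commit()
        return po_id
    ```

    Capturing the identity between the two statements is the same dependency the demo's second SQLXML insert expresses.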

    It looks like our Transport here was successful, so our database operations worked. But it looks like we had a failure on our post transport. For any places where we have errors, we can view the stage output and get a really detailed exception message to tell us exactly what went wrong. This allows us to go back, fix the issues as needed and then run our test again. So that’s really all there is to it with the eiConsole and this purchase order ingestion. 

    Summary of Purchase Order Ingestion Workflow

    To recap – we took in some EDI 850 data via FTP, transformed it, did some field validation, updated our purchase order archive database and sent our EDI 855 response back to our sender. 

    It’s really as simple as that! In just under 10 minutes, we were able to build, test and deploy this purchase order ingestion interface using the PilotFish Graphical Automated Assembly Line process. Thank you for watching!


    If you’re curious about the software features, free trial, or even a demo – we’re ready to answer any and all questions. Please call us at 813 864 8662 or click the button.

    X12, chartered by the American National Standards Institute for more than 35 years, develops and maintains EDI standards and XML schemas.
