
Getting started with hierarchical files


    A hierarchical file is a file containing multiple record types, which can have different structures and can be nested.

    Generally, each line starts with a "record type" field which helps identify the type of record it contains.

    For example, a file storing Customers, their addresses and mails may look like this:

    [Screenshot: hierarchical file example]
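    As an illustration, a minimal sketch of such a file could look like the following (the field values and the 'M' record type for mails are assumptions for this example):

        C,1,John,Smith
        A,10,Main Street,Paris
        M,100,john.smith@example.com
        C,2,Jane,Doe
        A,20,Oak Avenue,Lyon
        M,200,jane.doe@example.com

    Each line starts with the record type ('C' for a customer, 'A' for an address, 'M' for a mail), and the address and mail lines follow the customer they belong to.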

    This article demonstrates how to handle this kind of file with Stambia DI.

     

    Defining the Metadata

    General structure

    The first step is to create a File Server Metadata, and a directory in it.

    Then add a file to it with New > File and configure its properties.

    Here is an example for our file, which is DELIMITED with a comma as field separator:

    [Screenshot: file Metadata example]
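    As an indication, the main properties set here would be along these lines (the exact property labels may differ in your Designer version):

        Physical Name    customers.txt      (any file name)
        Type             DELIMITED
        Field Separator  ,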

    Next, create the record structure in a hierarchical way by adding new records with New > Record

    [Screenshot: records]

    Finally, add the records' fields with New > Field

    [Screenshot: records with fields]

    Note:

    • The Physical Name and Type properties of each field must be set.
    • Do not forget to set the Position Core property of each field, which indicates the position of the field in the record. You can also set the Position property of each record, which indicates its position in the output file. This can be useful when you need to create a header or a footer in your file. A sketch of these properties follows this list.
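
    For instance, the customer record of the sample above could be described along these lines (the field names and types are assumptions for this illustration, not taken from the screenshots):

        CUSTOMER record (Position 1)
          RECORDTYPE   Position Core 1, Type String
          CUS_ID       Position Core 2, Type Integer
          FIRST_NAME   Position Core 3, Type String
          LAST_NAME    Position Core 4, Type String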

     

    Filters

    The next step is to add a filter on each record with New > Filter.

    The filters will be used when reading the file, to distinguish the different record types.

    They are required if you plan to read the file (using it as a source in Mappings, etc.); otherwise, Stambia has no way to differentiate the records.

    [Screenshot: records with Filters]

    Each filter must be configured to test the value of the field that identifies the record.

    For our example, it is the RECORDTYPE field, which is the first character of the line (Starts At 1, Length 1).

    For customer records the value of RECORDTYPE is 'C', for address records it is 'A', and so on.
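
    A sketch of the resulting filter conditions (the 'M' value for mail records is an assumption, as in the sample above):

        CUSTOMER filter : RECORDTYPE = 'C'
        ADDRESS filter  : RECORDTYPE = 'A'
        MAIL filter     : RECORDTYPE = 'M'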

    If the file exists on your machine, you can check whether a filter is correctly configured by consulting the record's data with Right click > Consult data.

    The generated request should return only the customer lines:

    [Screenshot: Consult Data result]
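    With the sample file sketched earlier, consulting the CUSTOMER record would return only the 'C' lines, for example:

        C,1,John,Smith
        C,2,Jane,Doe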

     

    Computed Fields

    The last step is to add computed fields, if necessary.

    Computed fields are specific fields that are calculated at execution, when the file is read.

    They are very useful for simulating a Primary Key/Foreign Key relationship between records, for example to join records in Mappings.

    Please refer to this article to learn how to use computed fields.

    We are going to use them on our file for this purpose:

    [Screenshot: records with Computed Fields]

     

    Loading a hierarchical file

    To load a hierarchical file:

    1. Create a Mapping.
    2. Drag and drop the source tables and the target file.
    3. Map the fields.
    4. Change the Integration template to Action Process INTEGRATION Hierarchical File.

    [Screenshot: load Mapping]

    Note:

    • The record nodes must be mapped with a repetition key. For each value of this key, a record will be created. Here a customer record will be created for each CUS_ID, a mail record for each EML_ID, ...
    • The computed fields must not be mapped. They are only used at reading.

     

    Using a hierarchical file as source

    To use a hierarchical file as source, create a Mapping.

    Next, drag and drop all the needed source records/sub-records from the hierarchical file. It is not possible to use fields from sub-records directly; they must be instantiated on the mapping:

    [Screenshot: hierarchical file as source]

    As you can see, the MAIL and ADDRESS records are instantiated on the mapping, and joined with the help of the computed fields.
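
    As an indication, the joins between the instantiated records typically compare the computed fields, along these lines (the field names follow the Computed Fields section below and are only illustrative for this mapping):

        CUSTOMER.cusPosition = ADDRESS.cusPosNumber
        CUSTOMER.cusPosition = MAIL.cusPosNumber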

    For the example, we only load a stage table but it can be any RDBMS table.

    If you want to load another file directly from a hierarchical file, you'll have to go through a stage, as explained in this article.

     

    Using Computed Fields

      When working with files, Stambia offers the possibility to create and use Computed Fields.

      These particular fields, on which an expression is configured, are calculated at execution when the file is read.

      This can be useful, for example, to simulate a Primary/Foreign Key in hierarchical files.

      Note that Computed Fields should only be used for reading. They are "virtual" fields: they do not exist in the file, and the calculated values can vary at each execution.

       

      Creating Computed Fields

      To create a Computed Field, right click on a file or record and choose New > Computed Field

      [Screenshot: adding a Computed Field]

       

      Then fill in the parameters according to your needs.

      [Screenshot: Computed Field properties]

       

      1 - Computed Fields have a different icon, to differentiate them from normal fields.

      2 - The Expression is the function which will be used to calculate the field.

      The available functions are:

      id()              Generates an ID
      localPosition()   Indicates the record/row number
      timestamp()       Generates a timestamp
      uuid()            Generates a UUID

       

      Using the ancestor parameter

      It is possible to make "links" between Computed Fields, like a Primary/Foreign key would do in a table.

      For that, the Ancestor Level Base parameter has to be set.

      Example:

      [Screenshot: ancestors example]

      The two fields have their expression set to localPosition(), and the cusPosNumber field in the sub-record has the Ancestor Level Base parameter set to 1.

      With these settings, each Phone of the same customer will have the same cusPosNumber, like a Foreign Key in a table would.
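
      To make this concrete, a hypothetical read of two customers and their phones could produce values like these (the data and the cusPosition name of the parent field are invented for the illustration):

          CUSTOMER John     cusPosition = 1
            PHONE 0102030405    cusPosNumber = 1
            PHONE 0607080910    cusPosNumber = 1
          CUSTOMER Jane     cusPosition = 2
            PHONE 0504030201    cusPosNumber = 2

      The sub-record field takes the localPosition() value of its ancestor (level 1, the customer), which is what makes the join between records possible.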

       

      Note:

      • If the ancestor is not set, a new value will be attributed to the field.
      • For Primary Key/Foreign Key simulation, you must use localPosition(), especially if you have multiple sub-records, because the other expressions are generated on the fly and can vary for each of them.

       

      Loading data when source and target are hierarchical Files

        During the integration process, Stambia creates and uses temporary tables to load and integrate the data into the target. But when both the source and the target are files, these tables cannot be created, as no database is involved.

        This article demonstrates a solution to create a mapping to load data when the source and target are both hierarchical files.

        The structure of our files looks like this:

        [Screenshot: Metadata]

        We are using computed fields here to simulate a Primary/Foreign Key between the record and its sub-records. See this article for more information about computed fields.

         

        The use of staging

        Since the S18 Stambia Designer, it is possible to make multi-target Mappings and to create stages. Stages are a powerful functionality which provides a temporary area in a mapping, in which we can perform operations on data before insertion (unions, minus, ...).

        We are going to use this to accomplish our goal.

        Note that even though stages are only available since the S18 Designer, older Runtimes can understand and execute the processes generated in S18. It is therefore possible to enjoy the comfort of S18 without having to upgrade the Runtimes.

        To use a stage, drag and drop a database schema in which you want the temporary tables used by the stage to be created.

        [Screenshot: dragging a stage]

         

        Note

        In a mapping, all source records have to be instantiated. It is currently not possible to map fields of the sub-records, like the MAIL one, directly from the CUSTOMER main record.

        The sub-records have to be put on the mapping and joined. Here we used the computed fields to make the join.

         

        Now, we can drag and drop onto the stage the fields we want to use from the sources, and then map them to the target:

        [Screenshot: final Mapping]

         

        Note

        The Record Keys on the target have to be mapped.

         

        With S17 Designer

        The S17 Designer does not support stages and multi-target mappings. The solution is to create a table in a database and split the process into two Mappings.

        Mapping 1: Source File > Table

        Mapping 2: Table > Target File

        A process can then be created to execute the two mappings.

         

        Ordering data when loading a hierarchical file

          This article explains how to perform an ORDER BY in a Mapping when loading hierarchical files.

           

          The metadata

          The hierarchical file we are loading looks like this:

          [Screenshot: file Metadata]

          See this article if you need help creating it.

          Our goal is to order the data by TIT_CODE when loading the file from the demo database.

           

          The mapping

          The mapping is very similar to what we would usually do when loading a hierarchical file.

          The source fields are mapped to the target and all the records are mapped with a repetition key.

          [Screenshot: Mapping]

          Now, to order the customers by TIT_CODE, we are going to modify the expression of the customer record node.

          The idea is to concatenate all the columns we would have put in a SQL ORDER BY clause.

          We set it here to CONCAT(T_CUSTOMER.TIT_CODE, T_CUSTOMER.CUS_ID).

          The data will then be ordered by TIT_CODE and CUS_ID.
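
          To put it another way, the repetition key now plays roughly the role that the following clause would play in plain SQL (a sketch only, assuming the same source table and columns):

              SELECT ...
              FROM T_CUSTOMER
              ORDER BY TIT_CODE, CUS_ID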

          [Screenshot: updated Mapping]

           
