Replication to PI creates wrong AF hierarchy

Historian is configured to replicate data to PI. The data itself replicates correctly; however, the asset hierarchy in AF does not follow the area hierarchy in System Platform, so objects end up in the wrong path. Has anyone encountered the same issue?

  • I haven't heard of this, but I do want to understand it more. It might warrant reaching out to Tech Support, but here are a few questions:

    1. Which Historian release/patch level are you using?
    2. Are the discrepancies possibly related to renaming objects/areas or moving objects to different areas?
    3. Compare the Galaxy model with the "Location" extended property in the "Runtime" database--are these consistent?
    4. Compare the AF model with the "Location" extended property--is the AF model consistent?

    You can use this SQL query on the "Runtime" database to list the current "Location" values:

    select * from TagExtendedPropertyInfo where PropertyName='Location'
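
    To make the comparison with the Galaxy tree easier, the same query can be sorted by path and narrowed to a single object. This is only a sketch: the TagName and Value column names are assumptions, so verify them against the SELECT * output first.

    SELECT TagName, Value              -- Value is assumed to hold the full /Area/.../Object path
    FROM TagExtendedPropertyInfo
    WHERE PropertyName = 'Location'
      AND TagName LIKE 'MyObject%'     -- hypothetical name filter; remove this line to list everything
    ORDER BY Value, TagName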
  • Hi, thank you for your feedback. As for your questions, here are my answers:

    1. The Historian in question is 2023 P03.
    2. This is possible, but unlikely.
    3. The Galaxy model seems to be consistent with the Location extended property.
    4. The AF model is inconsistent with the Location extended property here and there, meaning that it is consistent for some objects but not for others.

    My best guess as to what is happening is that the Galaxy model has many areas in different branches with the same contained name, and somewhere down the replication line something gets confused and puts the element in the wrong place, whether because of too many containment levels or something similar. I might be terribly wrong, but that is my gut feeling.

    While writing this reply, I added several tags for replication, and they ended up in the wrong place in the AF hierarchy.

    The Location for the newly configured tags is
    /UPS/ProizvodnjaPriprema/P/SREBAN/GAS/BD/OTSP/SGS_BD/BD_MMG_KO01
    while they ended up in the following AF location
    /UPS/ProizvodnjaPriprema/P/SEVBAN/KG/OTSP/SGS_BD/BD_MMG_KO01
    even though the element
    /UPS/ProizvodnjaPriprema/P/SREBAN
    did get created in the AF hierarchy (it did not exist there before).

  • The discrepancies between the "Location" value and the AF model are odd (as you said). Here are the two paths laid out to highlight the differences:

    GR: /UPS/ProizvodnjaPriprema/P/SREBAN/GAS/BD/OTSP/SGS_BD/BD_MMG_KO01

    AF: /UPS/ProizvodnjaPriprema/P/SEVBAN/KG/OTSP/SGS_BD/BD_MMG_KO01

    Just to make sure I understand correctly, the discrepancies are:

    a. Somehow SREBAN became SEVBAN

    b. Somehow GAS/BD was collapsed into KG

    A few more questions:

    5. Are any of the characters in the object/area names specific to the ISO-8859-2 character set?

    6. What is the default database collation for the "Runtime" database?

    7. Any ideas behind "a" and "b" above, even if just a theory?

    For #6, you can use this in SQL Management Studio:

    SELECT DATABASEPROPERTYEX('Runtime', 'Collation')
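
    If the database collation looks fine, it may also be worth checking the column-level collations on that table, since they can differ from the database default. This is plain SQL Server metadata, nothing Historian-specific:

    SELECT name, collation_name        -- lists each character column and its collation
    FROM sys.columns
    WHERE object_id = OBJECT_ID('TagExtendedPropertyInfo')
      AND collation_name IS NOT NULL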
  • I would formulate it a bit differently: it is not that SREBAN became SEVBAN, as SEVBAN does actually exist in the models, but for some reason BD_MMG_KO01 ended up in the SEVBAN branch.

    5. There might be, as we are based in Serbia, but I cannot vouch for that. Is there any query I might perform to determine this? (A rough check is sketched after this list.)

    6. SQL_Latin1_General_CP1_CI_AS

    7. A change in the area hierarchy did occur for the previously existing elements under the /UPS/ProizvodnjaPriprema/P/SEVBAN branch, so the AF hierarchy for most of the elements under this branch is out of sync. What is strange is that the newly created elements end up in this part of the branch even though the element that should host them did get created.
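
    Regarding #5: as a rough check, something along these lines should flag Location values containing characters outside plain ASCII. It is only a sketch; the Value column name is an assumption taken from the earlier SELECT * query:

    SELECT *
    FROM TagExtendedPropertyInfo
    WHERE PropertyName = 'Location'
      AND Value LIKE '%[^ -~]%' COLLATE Latin1_General_BIN   -- matches any character outside the printable ASCII range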

    Does it make sense to delete the whole AF hierarchy and recreate the replication configuration in Historian, to see if we get the correct results when starting from scratch (given the changes in the area hierarchy)?

  • There are some problems in the Historian 2023 and 2023 R2 "Replicate to PI" feature where updates to the Galaxy model are not correctly propagated to Asset Framework. Currently, the only available solution is to delete the AF hierarchy and restart both PI Web API and replication so that the hierarchy is recreated based on the current model.

  • I have performed the following procedure:

    1. Deleted all replication tags on Historian
    2. Stopped Historian
    3. Stopped PI Web API service
    4. Deleted AF hierarchy
    5. Started Historian
    6. Started PI Web API service
    7. Added replication tags

    The AF hierarchy is still all over the place. Out of 196 objects, only 31 are in the correct place in the AF hierarchy. I have made an Excel sheet that compares the Location property to the resulting AF hierarchy. I believe I have concluded that the issue is that the AF hierarchy ends up having only one element with a particular name even though there are multiple areas with that contained name in the Galaxy hierarchy. Consider the example below:

    AF hierarchy: /UPS/ProizvodnjaPriprema/P/SEVBAN/SP/ADA/OTSP/KOT/CHA_MMG_SP01
    Galaxy: /UPS/ProizvodnjaPriprema/P/SEVBAN/SP/CHA/OTSP/KOT/CHA_MMG_SP01

    The AF hierarchy ends up having only one element called OTSP, KOT, etc. even though there are several in the Galaxy hierarchy. So, as another example, the Galaxy hierarchy contains the following elements:
    /UPS/ProizvodnjaPriprema/P/SEVBAN/SP/ADA/OTSP
    /UPS/ProizvodnjaPriprema/P/SREBAN/GAS/BD/OTSP
    /UPS/ProizvodnjaPriprema/P/SREBAN/GAS/BDZ/OTSP
    /UPS/ProizvodnjaPriprema/P/SEVBAN/SP/BT/OTSP
    but only one element called OTSP is created in the AF hierarchy, seemingly the first one found alphabetically, and eventually all objects contained by any area with that contained name end up under this one element in the AF hierarchy.
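
    For reference, a query along these lines lists every distinct branch in which an area with a given contained name (OTSP in this example) appears in the Location values. It is only a sketch; the Value column name is an assumption taken from the SELECT * query earlier in this thread:

    SELECT DISTINCT
        LEFT(Value, CHARINDEX('/OTSP/', Value) + LEN('OTSP')) AS AreaPath   -- path up to and including the OTSP segment
    FROM TagExtendedPropertyInfo
    WHERE PropertyName = 'Location'
      AND CHARINDEX('/OTSP/', Value) > 0                                    -- only Locations that pass through an area named OTSP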

    The complete list is available here if you would like to take a look.

  • Thanks for the details. Unfortunately, at this point, I believe you need to reach out to the Tech Support team to get more dedicated and direct support on this. That will also help get the Development team engaged as needed.

  • Just to conclude this topic: after the Tech Support investigation, it turns out that PI Web API has a limitation that is causing this issue.