I am using Wonderware Historian 2023 R2 and would like to log tags based on events

I am using Wonderware Historian 2023 R2, getting data from ABCIP, and would like to log tags based on events. For example, log Tag A only when Tag B equals 1. However, I couldn't find a solution for this; I only found the Delta and Forced logging options.


Does anyone know how to achieve this?

Thank you

  • I think you can use an event tag to achieve this. Please refer to the documentation at the link below: docs.aveva.com/.../CE_Sum_TagProp.html.html

  • The "Event Tags" are legacy functionality preserved so that applications which used them will still work, but they normally store values in conventional SQL Server tables rather than in the high-performance extension tables. You can use event tags for cases like taking a snapshot of the "Tag A" value at the moment "Tag B becomes 1.0", but it is not as good a solution for "while Tag B = 1.0". This legacy event system also has some scalability challenges--not an issue if you're applying this type of filter for 10s of tags, but expect to have problems with 1000s of them.

    It does depend on your use case, but keep in mind that time-series data is stored very efficiently in the extension tables and there are good facilities for filtering data at retrieval time. 
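
    For example (a sketch only, not tested against your system: I'm assuming tag names "TagA"/"TagB" and the standard INSQL linked server), you can retrieve Tag A only for the periods when Tag B = 1 by querying the WideHistory view and filtering on Tag B's column at retrieval time:

    SELECT DateTime, [TagA]
    FROM OPENQUERY(INSQL,
      'SELECT DateTime, [TagA], [TagB]
       FROM WideHistory
       WHERE wwRetrievalMode = ''Cyclic''
         AND wwResolution = 1000
         AND DateTime >= ''2024-01-01 00:00''
         AND DateTime <= ''2024-01-01 08:00''')
    WHERE [TagB] = 1

    The inner query resamples both tags onto a common 1-second cycle so the rows line up; the outer WHERE then keeps only the rows where Tag B was 1.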

  • I am using just 2 tags, but Tag B changes quite rapidly. When I tested capturing data, it seems like Historian couldn't log the values fast enough.

    Did I misunderstand something, or could there be a misconfiguration on my end?

  • How did you conclude Historian can't store your high-frequency data? It officially supports a sustained data rate of 150K values/second, with short bursts to 2x that. If your rate approaches that, some tuning of the "buffer count" might be required, but that is quite simple. Internal stress testing has shown much higher rates. For smaller tag counts, we've seen sustained update rates faster than one value per 25 milliseconds on modest hardware.

    In practice, other parts of the data acquisition stack are often the limiting factor, not the Historian itself (e.g. the PLC's interface, the OPC server, the communication driver). In some cases, additional tuning, or spreading the data acquisition load across multiple instances of the interface/OPC server/communications driver, will increase throughput.

    Your "Event Tag" configuration isn't going to work as you expect:

    1. The "time interval" is the detection latency--you can have a value that changes every 250 milliseconds and still reliably detect that every 60 seconds and it will execute much more efficiently, you will just have up to 60 seconds of delay before the data is available. The results will otherwise be identical. Arguably, this feature shouldn't let you specify the value in milliseconds
    2. The "Snapshot" only triggers when the "Camera_Complete" becomes "ON", not while it is "ON"
    3. The "Snapshot" action doesn't leverage the time-series extension tables in Historian, but stores values in native SQL Server tables. These will be far less efficient than using the extension tables. 
    4. The values captured for snapshots have to come from another tag (e.g. it won't work if you aren't already storing "Tag B")
  • Thanks for the quick response! Your answer made things clear to me. However, I have changed my data storage method: instead of using Historian, I now write an INSERT to SQL from a condition script in InTouch.
    Thank you

  • If you had throughput problems with Historian, I'd be quite surprised if you can sustain that rate in SQL Server. Be sure you've adequately planned for sustained loads, retention volume, and archive policies. As table sizes grow, expect insertion rates to degrade. Also plan your storage requirements: properly indexed, SQL Server uses roughly 50x as much space as Historian's extension tables.

    I won't claim to be completely objective when it comes to Historian vs. alternatives, and I'll acknowledge there are viable alternatives, but I wouldn't include SQL Server among them. For a more complete perspective, see this.

  • Actually, I was worried about the size of the data, so I tried to insert into the History table, but I got an error:

    INSERT INSQL.Runtime.dbo.History (DateTime, TagName, vValue, QualityDetail, wwVersion)
    VALUES (GETDATE(), 'SKUBarcode_PC', '320033434', 192, 'LATEST')

    Msg 8114, Level 16, State 11, Line 3
    Error converting data type nvarchar to (null).


    Do you have any suggestions? Thank you

  • I'm not sure if this changed from older releases, but with Historian 2023 R2, "INSERT History (DateTime, vValue..." doesn't work for me either; you can replace it with "INSERT StringHistory (DateTime, Value...".

    Also, since you're always appending new values with more recent timestamps, this will be more efficient using:

    INSERT StringHistory (TagName, Value, OPCQuality, wwVersion)
    VALUES ('SKUBarcode_PC', '320033434', 192, 'REALTIME')
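
    As a quick sanity check (again a sketch, with the same assumed tag name), you can read recent values back with a Delta query against StringHistory to confirm the insert landed:

    SELECT DateTime, TagName, Value
    FROM StringHistory
    WHERE TagName = 'SKUBarcode_PC'
      AND DateTime >= DATEADD(minute, -5, GETDATE())
      AND wwRetrievalMode = 'Delta'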