Hello All,
Links to the images, since [img] is not working:
https://ibb.co/C0y83ZQ
https://ibb.co/Jq6vKny
You can see two datapoint types: in the first one, every DPE is connected to the PLC; the second one is used for archiving and has the same DPEs as the first one, except for "timestamp". Note that there are over 1000 of these datapoints. The PLC connection datapoint has a DPE called "timestamp", and the customer wants that whenever the "timestamp" DPE changes, we take the other DPE values from the connection datapoint and write them to the archive datapoint. They also want that when we set the values on the archive datapoint, their internal timestamp is the same as the "timestamp" DPE value.
We can achieve what the customer wants by connecting to the connection datapoint, checking whether the timestamp has changed, and when it has, gathering the other values from the connection datapoint and writing them to the archive datapoint with dpSetTimed, using the connection datapoint's "timestamp" DPE value as the source time. However, since we can only use one timestamp per dpSetTimed call, we cannot set over 1000 datapoints at the same time with one dpSet, which is very bad for performance.
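Roughly, what we do per datapoint pair looks like the following sketch (the names Conn_0001, Arch_0001, value and status are only placeholders, not our real structure):
[code]
// Sketch only: DP/DPE names are placeholders, not the real structure
main()
{
  // React whenever the PLC writes a new timestamp
  dpConnect("copyToArchiveCB", false, "Conn_0001.timestamp");
}

void copyToArchiveCB(string dpe, time plcTime)
{
  float value;
  int   status;

  // Read the remaining elements of the connection datapoint
  dpGet("Conn_0001.value",  value,
        "Conn_0001.status", status);

  // Write them to the archive datapoint, using the PLC timestamp
  // as the source time of the values
  dpSetTimed(plcTime,
             "Arch_0001.value",  value,
             "Arch_0001.status", status);
}
[/code]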
So what can we do to solve this synchronization problem?
Thanks
Timestamp Synchronization
- shokkul
- leoknipp
Re: Timestamp Synchronization
Which driver are you using in WinCC OA to get the values from the PLC?
How is it ensured that the "timestamp" value is received after all other elements are updated?
If there is no guarantee that the timestamp is received last, you can run into the following problem:
-- DPE "status" is updated
-- DPE "value" is updated
-- DPE "timestamp" is updated
-- Now the values are written to the "archive" datapoint
-- DPE "status" is updated
-- DPE "validity" is updated
-- DPE "timestamp" is updated
-- Now the values are written to the "archive" datapoint.
For"value" an old value is written.
Best Regards
Leopold Knipp
Senior Support Specialist
- shokkul
Re: Timestamp Synchronization
Sorry for the late reply; for some reason I couldn't log in to the forum.
We are using the s7Plus driver, and the customer said we can assume that the timestamp is updated only after all the other values have been updated. We also have a script that checks whether the incoming timestamp is newer than the previous one; if it is, we write the other values to the archive.
We made a simulation synchronizing 600 datapoints and 1800 DPEs with a poll rate of 100 ms. We used dpSetTimed, and synchronizing the 1800 DPEs separately costs us about 30 ms per cycle. Even though that seems good, I don't know what effect this many dpSetTimed calls has on the Event Manager and the Database. A function like dpSetTimed that accepts multiple timestamps for multiple datapoints in one call would be really helpful.
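For a single datapoint, the check looks roughly like the sketch below (placeholder names again). Since dpSetTimed accepts several element/value pairs for one timestamp, all elements of one connection datapoint already go into a single call; it is only the different timestamps of the different datapoints that force separate calls:
[code]
// Sketch with placeholder names; one connect per connection datapoint
time gLastStamp;   // last PLC timestamp that was archived

main()
{
  // The callback delivers the timestamp together with the other elements,
  // so no extra dpGet is needed
  dpConnect("syncCB", false,
            "Conn_0001.timestamp",
            "Conn_0001.value",
            "Conn_0001.status");
}

void syncCB(string dpeT, time plcTime,
            string dpeV, float value,
            string dpeS, int status)
{
  // Only archive when a newer PLC timestamp has arrived
  if (plcTime <= gLastStamp) return;
  gLastStamp = plcTime;

  // One dpSetTimed per connection datapoint: all of its elements
  // share the same PLC timestamp anyway
  dpSetTimed(plcTime,
             "Arch_0001.value",  value,
             "Arch_0001.status", status);
}
[/code]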
Thanks
- leoknipp
Re: Timestamp Synchronization
If I understood correctly what you have written, you are writing 1,800 values in a 100 ms interval.
Then you will have 18,000 value changes per second which shall be saved in the archive.
Is this calculation correct?
Are you sure that you need this large amount of data in your database?
For which time range shall this type of data be available in your system?
According to the following entry in the Knowledge Base, you need about 50 bytes to store one value change in the HDB archive:
https://www.winccoa.com/knowledge-base/ ... a48b1008fa
When doing a calculation based on the available information, you get a huge amount of data per day:
18,000 VC/s × 86,400 s/day = 1,555,200,000 VC/day; × 50 bytes/VC = 77,760,000,000 bytes
77,760,000,000 bytes = 75,937,500 kB ≈ 74,158 MB ≈ 72.4 GB per day
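For reference, the same estimate as a small script snippet (the 18,000 VC/s and ~50 bytes per value change are the figures from above):
[code]
// Rough storage estimate based on the figures above
main()
{
  float vcPerSec   = 18000.0;   // value changes per second
  float bytesPerVC = 50.0;      // approx. bytes per value change in the HDB

  float vcPerDay    = vcPerSec * 86400.0;      // 1,555,200,000 VC per day
  float bytesPerDay = vcPerDay * bytesPerVC;   // 77,760,000,000 bytes
  float gbPerDay    = bytesPerDay / 1024.0 / 1024.0 / 1024.0;

  DebugN("Archive growth per day [GB]:", gbPerDay);   // roughly 72.4
}
[/code]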
Best Regards
Leopold Knipp
Senior Support Specialist