AVEVA Product Feedback


Status: No status
Created by Guest
Created on Nov 14, 2022

Improve Data Cache Performance by parallelizing Data Archive update fetches

Currently, when the Data Cache is signed up for updates on multiple data archives, updates from each archive are fetched in series, one after another. When many data archives are involved (especially if they are geographically dispersed), this can make the baseline update cycle take quite a long time. If one or more data archives lose connectivity, the performance hit can lead to loss of signups on the other data archives.

Fetching the updates from each data archive in parallel would make the Data Cache scale much better for systems with many different sites of varying reliability. With this enhancement, the overall update fetch would take about as long as the slowest-responding data archive, instead of the sum of all DA response times.
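A minimal sketch of the timing argument, using simulated latencies and a hypothetical `fetch_updates` stand-in for the per-archive update call (archive names and delays are illustrative, not real Data Archive behavior):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for fetching updates from one data archive;
# the sleep simulates the network round trip to that site.
def fetch_updates(archive, latency):
    time.sleep(latency)
    return f"{archive}: ok"

# Illustrative archives with differing simulated response times.
archives = {"DA-East": 0.05, "DA-West": 0.10, "DA-EMEA": 0.20}

# Serial fetch: total time is roughly the SUM of all response times.
start = time.perf_counter()
serial = [fetch_updates(name, lat) for name, lat in archives.items()]
serial_time = time.perf_counter() - start

# Parallel fetch: total time tracks the SLOWEST archive, so one slow
# or unreachable site no longer delays the others.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(archives)) as pool:
    parallel = list(pool.map(fetch_updates, archives, archives.values()))
parallel_time = time.perf_counter() - start

print(f"serial:   {serial_time:.2f}s")   # roughly the sum (~0.35s here)
print(f"parallel: {parallel_time:.2f}s") # roughly the max (~0.20s here)
```

With three sites the difference is modest, but with dozens of geographically dispersed archives the serial sum grows with every site while the parallel time stays pinned to the slowest responder.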
