A Markov-Based Update Policy for Constantly Changing Data in Database Systems
In order to maximize the value of an organization's data assets, it is important to keep data in its databases up-to-date. In the era of big data, however, constantly changing data sources make it a challenging task to assure data timeliness in enterprise systems. For instance, due to the high frequency of purchase transactions, purchase data stored in an Enterprise Resource Planning (ERP) system can easily become outdated, affecting the accuracy of inventory data and the quality of inventory replenishment decisions. Despite the importance of data timeliness, updating a database as soon as new data arrives is typically not optimal because of high update cost. Therefore, a critical problem in this context is to determine the optimal update policy for database systems. In this study, we develop a Markov decision process model, solved via dynamic programming, to derive the optimal update policy that minimizes the sum of data staleness cost and update cost. Based on real-world enterprise data, we conduct experiments to evaluate the performance of the proposed update policy in relation to benchmark policies analyzed in the prior literature. The experimental results show that the proposed update policy outperforms fixed-interval update policies and can lead to significant cost savings.
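The kind of trade-off the abstract describes can be illustrated with a toy Markov decision process. The sketch below is an assumption-laden simplification, not the authors' model: the state is the number of unapplied source changes (capped at N), each period the system either waits (paying a per-change staleness cost) or updates (paying a fixed update cost and resetting staleness to zero), and value iteration recovers the cost-minimizing policy. All parameter names and values (N, c_stale, c_update, p_arrival, gamma) are illustrative.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the paper):
N = 20            # maximum tracked staleness level
c_stale = 1.0     # staleness cost per pending change per period
c_update = 15.0   # fixed cost of refreshing the database
p_arrival = 0.6   # probability a new source change arrives each period
gamma = 0.95      # discount factor

def q_values(V):
    """One Bellman backup: expected discounted cost of WAIT vs UPDATE in each state."""
    s = np.arange(N + 1)
    nxt = np.minimum(s + 1, N)
    # WAIT: pay staleness cost now; staleness grows with probability p_arrival.
    q_wait = c_stale * s + gamma * (p_arrival * V[nxt] + (1 - p_arrival) * V[s])
    # UPDATE: pay the fixed cost; staleness resets to 0, a new change may arrive next.
    q_update = c_update + gamma * (p_arrival * V[1] + (1 - p_arrival) * V[0])
    return q_wait, np.full(N + 1, q_update)

# Value iteration: iterate the Bellman operator to a fixed point.
V = np.zeros(N + 1)
for _ in range(2000):
    q_wait, q_update = q_values(V)
    V_new = np.minimum(q_wait, q_update)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

q_wait, q_update = q_values(V)
policy = np.where(q_update <= q_wait, "update", "wait")
```

Under these costs the resulting policy has a threshold structure: the system waits while staleness is low and updates once the accumulated staleness cost outweighs the fixed update cost, which is the intuition behind comparing against fixed-interval benchmarks.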
Year of publication: 2017
Authors: Zong, Wei
Other Persons: Wu, Feng (contributor); Jiang, Zhengrui (contributor)
Publisher: [2017]: [S.l.] : SSRN
Freely available
Similar items by person
- A Markov-based update policy for constantly changing database systems / Zong, Wei (2017)
- Li, Yulong (2017)
- Joint inventory and transshipment decisions with consumer behavioral heterogeneity / Feng, Pingping (2023)