A business process modeling framework is used for data quality analysis. The framework represents the sources of transactions entering the information processing system, the tasks within the process that manipulate or transform these transactions, and the data repositories in which the transactions are stored or aggregated. A subset of these tasks is identified as potential error-introduction points, and the rate and magnitude of the various error classes at each such task are modeled probabilistically. The resulting model can be used to predict how changes in transaction volumes and business processes affect data quality at the aggregate level in the data repositories. It can also account for the presence of error-correcting controls and assess how the placement and effectiveness of these controls alter the propagation and aggregation of errors.

Optimization techniques are used to place error-correcting controls so that target quality requirements are met while the cost of operating these controls is minimized. This analysis also contributes to the development of business “dashboards” that allow decision-makers to monitor and react to key performance indicators (KPIs) based on aggregation of the transactions being processed. Real-time data quality estimation quantifies the accuracy of these KPIs, expressed as the probability that a KPI is above or below a given value, which may inform the action taken by the decision-maker.
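To make the error-propagation model concrete, the following Python sketch simulates transactions flowing through a linear pipeline. It is a minimal illustration, not the framework itself: the task names, error rates, magnitude distributions (Bernoulli introduction with normally distributed magnitudes), and the assumption that a control corrects the entire accumulated error are all hypothetical choices made for the example.

```python
import random

# Hypothetical linear process: each task may introduce an additive error
# with probability `rate`; magnitudes are drawn from Normal(mu, sigma).
TASKS = [
    {"name": "capture",   "rate": 0.05, "mu": 10.0, "sigma": 3.0},
    {"name": "transform", "rate": 0.02, "mu": 25.0, "sigma": 8.0},
    {"name": "load",      "rate": 0.01, "mu": 5.0,  "sigma": 1.0},
]

def simulate_transaction(rng, controls):
    """Propagate one transaction through the pipeline and return the
    error it carries into the repository. `controls` maps a task name
    to the detection probability of a control placed after that task."""
    error = 0.0
    for task in TASKS:
        if rng.random() < task["rate"]:
            error += rng.gauss(task["mu"], task["sigma"])
        if error and rng.random() < controls.get(task["name"], 0.0):
            error = 0.0  # the control detects and corrects the error
    return error

def aggregate_error(n_transactions, controls, seed=0):
    """Monte Carlo estimate of the total error in the repository aggregate."""
    rng = random.Random(seed)
    return sum(simulate_transaction(rng, controls) for _ in range(n_transactions))

# Compare aggregate quality with no controls vs. one control after "transform".
print(aggregate_error(100_000, {}))
print(aggregate_error(100_000, {"transform": 0.9}))
```

Varying the rates or the transaction count in this sketch shows how volume and process changes translate into aggregate-level quality, which is the kind of prediction the model supports.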
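The control-placement optimization can likewise be sketched under the same illustrative assumptions. In a linear pipeline where a control corrects the whole accumulated error, an error introduced at task i survives to the repository only if every control at or after task i misses it, which gives a closed-form expected aggregate error. The exhaustive enumeration below is one simple way to minimize control cost subject to a quality target; all costs, rates, and the uniform effectiveness value are invented for the example, and a real process would likely call for a more scalable search.

```python
from itertools import combinations

# Per-task error model: (name, introduction rate, mean magnitude) — illustrative.
TASKS = [("capture", 0.05, 10.0), ("transform", 0.02, 25.0), ("load", 0.01, 5.0)]
CONTROL_COST = {"capture": 120.0, "transform": 200.0, "load": 80.0}
EFFECTIVENESS = 0.9        # assumed uniform detection probability
N_TRANSACTIONS = 100_000
QUALITY_TARGET = 20_000.0  # maximum tolerable expected aggregate error

def expected_aggregate_error(placement):
    """Closed-form expectation: an error introduced at task i reaches the
    repository only if every control at or after task i fails to catch it."""
    total = 0.0
    for i, (_, rate, mu) in enumerate(TASKS):
        survival = 1.0
        for later_name, _, _ in TASKS[i:]:
            if later_name in placement:
                survival *= 1.0 - EFFECTIVENESS
        total += rate * mu * survival
    return N_TRANSACTIONS * total

def cheapest_feasible_placement():
    """Enumerate every subset of candidate sites (fine for small processes)
    and keep the cheapest placement that meets the quality target."""
    best, best_cost = None, float("inf")
    sites = [name for name, _, _ in TASKS]
    for k in range(len(sites) + 1):
        for placement in map(set, combinations(sites, k)):
            cost = sum(CONTROL_COST[s] for s in placement)
            if cost < best_cost and expected_aggregate_error(placement) <= QUALITY_TARGET:
                best, best_cost = placement, cost
    return best, best_cost

print(cheapest_feasible_placement())
```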
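Finally, the probability that a KPI is above or below a given value can be estimated from the same error model. The sketch below treats the total error in the KPI aggregate as approximately normal by the central limit theorem and computes P(true KPI ≥ threshold) from the observed dashboard value; the per-task parameters, the independence assumption across tasks, and the example figures are again hypothetical.

```python
import math

# Per-task error model: (rate, mean magnitude, magnitude std-dev) — illustrative.
TASKS = [(0.05, 10.0, 3.0), (0.02, 25.0, 8.0), (0.01, 5.0, 1.0)]
N_TRANSACTIONS = 100_000

def error_moments():
    """Mean and variance of the total error in the KPI aggregate.
    Each task contributes B*M per transaction, with B ~ Bernoulli(rate)
    and M ~ Normal(mu, sigma); contributions are assumed independent."""
    mean = var = 0.0
    for rate, mu, sigma in TASKS:
        mean += rate * mu
        var += rate * (sigma**2 + mu**2) - (rate * mu) ** 2
    return N_TRANSACTIONS * mean, N_TRANSACTIONS * var

def prob_kpi_at_least(observed_kpi, threshold):
    """P(true KPI >= threshold), treating the aggregate error as Normal:
    true = observed - error, so P = Phi((observed - threshold - mu_e)/sigma_e)."""
    mu_e, var_e = error_moments()
    z = (observed_kpi - threshold - mu_e) / math.sqrt(var_e)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# E.g., the dashboard shows 5.2M; how likely is the true value still >= 5.0M?
print(prob_kpi_at_least(5_200_000.0, 5_000_000.0))
```

An estimate of this kind, refreshed as transactions are processed, is what would let a dashboard qualify each KPI with a confidence figure before the decision-maker acts on it.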