
Uncertainty assessment in dynamic microsimulation: the case of MikroSim (Germany)
Spatial dynamic microsimulations probabilistically project geographically referenced units with individual characteristics over time. Like any stochastic projection method, their outcomes are inherently uncertain and sensitive to multiple factors. In discrete-time dynamic microsimulations, each unit passes through a series of modules in every simulated time step (often years), each module addressing a different life event (births, deaths, ageing, partnership, employment, …) and evaluating via a Monte Carlo experiment whether a status transition occurs. This inherently introduces uncertainty due to the method's stochastic nature. However, simulations may also be sensitive to other factors, such as the choice of model types and complexity, as well as the parameter estimates, among others.
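To illustrate, the following is a minimal sketch of such a Monte Carlo transition step for a single module, assuming a hypothetical logistic transition model; the function names, variables, and coefficients are illustrative only and do not correspond to MikroSim code.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def transition_probability(age, employed, beta):
    """Hypothetical logistic model for the yearly probability of a
    status transition, given a unit's characteristics."""
    linear = beta[0] + beta[1] * age + beta[2] * employed
    return 1.0 / (1.0 + np.exp(-linear))

def employment_module(population, beta, rng):
    """One simulated year: draw a uniform random number per unit and
    compare it with the model probability (the Monte Carlo experiment)."""
    p = transition_probability(population["age"], population["employed"], beta)
    transitions = rng.random(len(p)) < p          # stochastic component
    population["employed"] = np.where(transitions,
                                      1 - population["employed"],
                                      population["employed"])
    return population

# Toy population of five units; in a real simulation, each time step
# chains several such modules (births, deaths, partnership, ...).
population = {"age": np.array([25, 40, 33, 58, 47]),
              "employed": np.array([1, 0, 1, 1, 0])}
beta = np.array([-2.0, 0.01, -0.5])               # illustrative coefficients
population = employment_module(population, beta, rng)
```

Repeating this experiment with different random seeds yields different trajectories for the same inputs, which is the Monte Carlo component of the uncertainty discussed here.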
Few articles detail the uncertainty in dynamic microsimulations, and the importance of its components is often overlooked. One reason is the high computational effort involved: the analysis requires numerous simulation configurations and individual runs. A complete sensitivity analysis covering every single parameter in every module would be infeasible, given the complex structure of these microsimulations and the computational power required to run them. Moreover, since dynamic microsimulations are typically developed to address specific problems and vary significantly in design and complexity, one-size-fits-all solutions are unattainable. Lastly, there is no commonly agreed-upon standard for reporting uncertainty in dynamic microsimulations.
Applying variance-based sensitivity analyses to both direct and indirect effects within the employment module of the MikroSim model for Germany, we show that commonly considered sources of uncertainty, namely coefficient and parameter uncertainty, are less influential than qualitative modelling choices. Because dynamic microsimulations are inherently complex and computationally intensive, it is crucial to consider potential sources of uncertainty and their influence on simulation outputs in order to design simulation setups more carefully and communicate results better. We find that simple summary measures do not adequately capture overall model uncertainty and therefore urge modellers to account for these broader sources when designing microsimulations and interpreting their results.
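As an illustration of the variance-based approach, the sketch below estimates first-order and total-order Sobol indices using the Saltelli sampling scheme with Jansen's estimators. The simulation_output function is a hypothetical stand-in that maps uncertain inputs (e.g. model coefficients) to a scalar output; it is not the MikroSim employment module.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def simulation_output(x):
    """Hypothetical stand-in for one simulation run mapping k uncertain
    inputs to a scalar output; includes an interaction term so that
    total-order indices exceed first-order indices."""
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

def sobol_indices(model, k, n, rng):
    """First-order (S) and total-order (ST) Sobol indices via the
    Saltelli sampling scheme and Jansen's estimators."""
    A = rng.random((n, k))                 # two independent sample matrices
    B = rng.random((n, k))
    f_A, f_B = model(A), model(B)
    var = np.var(np.concatenate([f_A, f_B]))
    S, ST = np.empty(k), np.empty(k)
    for i in range(k):
        AB = A.copy()
        AB[:, i] = B[:, i]                 # replace column i of A with B's
        f_AB = model(AB)
        S[i] = np.mean(f_B * (f_AB - f_A)) / var        # first-order effect
        ST[i] = 0.5 * np.mean((f_A - f_AB) ** 2) / var  # total effect
    return S, ST

S, ST = sobol_indices(simulation_output, k=3, n=2**14, rng=rng)
```

A first-order index measures the direct contribution of one input to the output variance; the gap between total-order and first-order indices captures its indirect contribution through interactions with other inputs.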