
Demand and Supply of Care Over the Life Course
We project the effects of changes in fertility and mortality rates on both the receipt and provision of care in the UK. We investigate the impact on the level and cost of care, as well as its share of total GDP, through the life course and across the income and wealth distributions. SimPaths, an open-source dynamic microsimulation model, is employed to simulate alternative scenarios over a half-century period. This framework projects life histories over time, developing detailed representations of career paths, family and intergenerational relationships, health, and financial circumstances. Our estimates show that the value of care, as a share of GDP, almost doubles over the five decades of our analysis, with informal care accounting for most of the projected rise.
Introducing an asset test into client fees for long-term social care: a simulation study using Finnish administrative data
As the population ages and the sustainability gap in Finland’s public finances widens, new solutions are needed to ensure sufficient funding for public services. One potential solution is to place greater emphasis on private wealth in the financing of care services. At present, client fees for long-term social and health care services in Finland are determined based on clients’ income. This study examines the potential effects of also taking clients’ assets into account, focusing on the fiscal and distributional implications of such a reform. The analysis is based on the SOTE-SISU static microsimulation model and a unique administrative register dataset that includes information on wealth.
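The mechanism under study can be sketched in a few lines. The rule below is purely illustrative: the rates, the income disregard, and the asset threshold are all invented for this example, while the actual Finnish fee rules and any reform parameters are inputs to the SOTE-SISU model.

```python
# Illustrative sketch of an income-based client fee extended with an asset
# test. Every parameter here is hypothetical, not the Finnish fee schedule.

def monthly_fee(income, assets, income_rate=0.85, income_disregard=600,
                asset_threshold=20_000, asset_rate=0.05):
    """Fee = a share of monthly income above a disregard, plus an
    annualised share of assets above a threshold, spread over 12 months."""
    income_part = max(income - income_disregard, 0) * income_rate
    asset_part = max(assets - asset_threshold, 0) * asset_rate / 12
    return income_part + asset_part

# A client with 1,500 EUR/month income and 50,000 EUR in assets:
fee_with_assets = monthly_fee(1500, 50_000)
fee_income_only = monthly_fee(1500, 0)
```

Varying the asset threshold and rate across the wealth distribution is exactly the kind of lever whose fiscal and distributional effects such a simulation study quantifies.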
Microsimulation at Scale for Chronic Disease Modelling: Executing 100 Million Individual Life-Course Simulations in 100 Seconds
Microsimulation is a uniquely powerful technique for chronic disease modelling because it simulates outcomes at the level of the individual over time, capturing heterogeneity, history-dependent progression, multimorbidity, and complex clinical pathways that cohort averages cannot. In an era when chronic diseases account for the majority of global mortality and impose escalating pressure on health systems, decisions about their prevention, treatment, pricing, and resource allocation carry profound long-term clinical and financial consequences. Consequently, accurate long-horizon modelling of these diseases has become central to policy, reimbursement, and investment decisions. Historically, however, microsimulation has been constrained by computational performance. Statistical precision requires large simulated populations to reduce Monte Carlo error, and probabilistic sensitivity analysis multiplies this burden through repeated parameter sampling. Many models built in spreadsheets or high-level languages require hours or days to run, limiting scenario exploration, delaying iteration, and reducing their practical utility in time-sensitive decision environments. To address these limitations, a legacy microsimulation stack was rebuilt into a high-performance platform capable of executing 100 million life-course simulations in approximately 100 seconds. Performance gains were achieved through several core engineering innovations. The microsimulation core was implemented in modern C++, enabling direct control over memory allocation, cache locality, and execution flow. Compared with interpreted (e.g. Python, R) or spreadsheet-based environments, compiled C++ dramatically reduces runtime overhead and enables predictable, deterministic execution, strengthening validation processes and supporting regulatory-grade transparency and auditability. Memory architecture was optimised to maximise Central Processing Unit (CPU) cache efficiency and minimise allocation costs.
Modelled individuals’ attributes, state transitions, and event processes were encoded in compact, structured formats, allowing large virtual populations to be simulated without performance degradation. The engine exploited modern multi-core CPU architectures through multi-threading, allowing independent patient simulations to run concurrently. Because individual life trajectories are largely independent within Monte Carlo microsimulation, the model parallelises naturally, enabling near-linear scaling with available cores. Beyond single-machine performance, the system supports horizontal scaling via containerised simulation instances, allowing elastic expansion across the infrastructure based on workload demand, without reliance on specialised high-performance computing clusters. The platform includes integrated pipelines for data ingestion, preprocessing, simulation execution, and post-processing. Outputs are automatically aggregated into epidemiological and economic metrics, including incidence, prevalence, costs, and healthcare resource use outcomes, ready for decision analysis. A user-facing interface abstracts technical complexity, allowing domain experts to configure scenarios and execute simulations without interacting directly with the code or infrastructure. The entire platform is securely hosted in the cloud, allowing for easy setup and access anywhere in the world. The system comprises cross-cloud components that allow it to be hosted on any of the major cloud providers. These advances represent a fundamental shift in capability: complex simulations once requiring hours or days may now be completed in seconds, enabling real-time exploration of uncertainty and rapid scenario iteration to expedite decision-making. Microsimulation can therefore operate at the scale and speed demanded by modern policy, reimbursement, and investment strategies, amid growing chronic disease complexity and multimorbidity.
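The parallelisation argument above can be sketched in a few lines. The sketch below uses Python and invented transition probabilities rather than the C++ engine the abstract describes, but it shows the key property: with a deterministic per-individual seed, each life course is an independent Monte Carlo draw, so results are identical however the work is split across threads.

```python
# Sketch of embarrassingly parallel life-course simulation. Disease and
# death probabilities are invented; in the real engine each worker would be
# an OS thread running compiled, CPU-bound C++ code.
from concurrent.futures import ThreadPoolExecutor
import random

P_DISEASE, P_DEATH_SICK, P_DEATH_HEALTHY = 0.02, 0.10, 0.01
MAX_YEARS = 60

def simulate_individual(seed):
    """Simulate one life course; return years lived (a stand-in for the
    richer per-person outputs a full engine would record)."""
    rng = random.Random(seed)   # deterministic per-individual generator
    sick = False
    for year in range(MAX_YEARS):
        if rng.random() < (P_DEATH_SICK if sick else P_DEATH_HEALTHY):
            return year
        if not sick and rng.random() < P_DISEASE:
            sick = True
    return MAX_YEARS

def simulate_population(n, workers=4):
    # Individuals are independent, so the pool can schedule them in any
    # order; per-individual seeding keeps the output reproducible.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_individual, range(n)))

years = simulate_population(10_000)
mean_years = sum(years) / len(years)
```

Because each individual owns its random stream, the parallel run reproduces a sequential run exactly, which is the property that supports validation and auditability at scale.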
Modelling cancer incidences and mortality in the Austrian population using dynamic microsimulation
Population projections indicate that by 2045, the Austrian population aged 65 and older will increase by approximately 47% compared to 2023. Since the likelihood of a cancer diagnosis increases with age, a corresponding rise in cancer cases is expected. To address this and support evidence-based decision-making, a model has been developed on behalf of the Ministry of Health to project cancer incidence, prevalence, and mortality within the population up to the year 2045. Our cancer projection model builds on the microsimulation model used by Statistics Austria for official population projections (Pohl et al., 2025). It introduces a new module for calculating cancer diagnoses and refines existing ones, such as the module for calculating mortality. A key advantage of microsimulation is its ability to account for individual characteristics, allowing factors such as existing diagnoses to influence future disease states and determine cause-specific mortality outcomes. In addition, microsimulation offers the possibility of further developing the model in the future, e.g. through extensions such as the consideration of risk factors, as is already done in well-known microsimulation models such as OncoSim (Ruan et al., 2023). The model parameters are calculated using administrative register data, including the Central Population Register and Cause of Death Statistics, linked with the National Cancer Register – enabling detailed tracking of individual life histories.
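The dependence of cause-specific mortality on diagnosis history can be illustrated with a toy transition function. All probabilities below are invented for the sketch; in the actual model they are estimated from the linked register data by characteristics such as age and sex.

```python
# Toy illustration of history-dependent transitions: a prior cancer
# diagnosis changes both future disease states and cause-specific
# mortality. Probabilities are hypothetical, not Austrian estimates.
import random

def annual_transition(person, rng):
    """Advance one person by one year."""
    if not person["alive"]:
        return person
    p_cancer_death = 0.08 if person["cancer"] else 0.0005
    p_other_death = 0.01
    u = rng.random()
    if u < p_cancer_death:
        person.update(alive=False, cause="cancer")
    elif u < p_cancer_death + p_other_death:
        person.update(alive=False, cause="other")
    elif not person["cancer"] and rng.random() < 0.005:
        person["cancer"] = True   # new incident diagnosis
    return person

def run_cohort(cohort, years, seed=1):
    rng = random.Random(seed)
    for _ in range(years):
        for person in cohort:
            annual_transition(person, rng)
    return cohort

diagnosed = run_cohort([{"alive": True, "cancer": True, "cause": None}
                        for _ in range(1000)], years=50)
undiagnosed = run_cohort([{"alive": True, "cancer": False, "cause": None}
                          for _ in range(1000)], years=50)
```

The point of the sketch is structural: because each simulated individual carries their own history, survival in the diagnosed cohort diverges from the undiagnosed cohort without any cohort-level adjustment.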
Populations remember: projecting the intergenerational consequences of heat extremes
Extreme heat events cause substantial excess mortality, yet their long-term demographic consequences extend far beyond immediate death counts. Each heatwave creates demographic memory: the cascading effects of lost individuals who would have reproduced, aged, and shaped future population structures. In this work, I will develop a microsimulation framework to quantify how a single extreme heat event reshapes population trajectories over subsequent decades, comparing outcomes with and without the heatwave to isolate its lasting demographic imprint.
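The counterfactual design can be sketched as two projections of the same starting population, one with a one-off mortality shock and one without; the persistent gap between the trajectories is the demographic memory of the event. All rates and the shock size below are invented, and a fuller design would use common random numbers per individual to sharpen the comparison.

```python
# Sketch of the with/without-heatwave comparison. Mortality, fertility,
# and the shock size are hypothetical placeholders.
import random

def project_ages(population_ages, years, heat_shock=0.0, seed=42,
                 p_death=0.01, p_birth=0.02):
    """Project a list of ages forward; heat_shock is extra mortality
    applied in year 0 only. Returns the population size each year."""
    rng = random.Random(seed)
    ages = list(population_ages)
    sizes = []
    for year in range(years):
        extra = heat_shock if year == 0 else 0.0
        survivors = [a + 1 for a in ages if rng.random() >= p_death + extra]
        # Forgone births compound the initial losses over the decades.
        births = [0 for a in survivors
                  if 20 <= a <= 45 and rng.random() < p_birth]
        ages = survivors + births
        sizes.append(len(ages))
    return sizes

start = [i % 80 for i in range(10_000)]   # uniform age structure, 0-79
base = project_ages(start, years=50)
shocked = project_ages(start, years=50, heat_shock=0.05)
```

The gap between `base` and `shocked` does not close after year 0, because the missing individuals also fail to contribute births, which is the cascading effect the abstract sets out to quantify.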
Recent developments of the SimPaths dynamic microsimulation framework
SimPaths is an open-source framework for modelling individual and household life course events, jointly developed at the Centre for Microsimulation and Policy Analysis and the University of Glasgow (Bronka et al., 2025). The framework is designed to project life histories through time, building up a detailed picture of career paths, family (inter)relations, health, and financial circumstances. The modular nature of the SimPaths framework facilitates analysis of alternative assumptions concerning the tax and benefit system, sensitivity to parameter estimates, and alternative approaches for projecting labour/leisure and consumption/savings decisions. SimPaths builds upon standardised assumptions and data sources, which facilitates adaptation to other countries.
Strengthening Validation Frameworks in Dynamic Microsimulation: Evidence from SimPaths
Dynamic microsimulation models such as SimPaths are increasingly used to evaluate long-term policy impacts by generating synthetic trajectories for individuals and households. Their credibility, however, depends on rigorous validation: demonstrating that simulated outcomes can reliably reproduce observed data. Despite their growing role in policy analysis, validation practices remain fragmented and only partially automated (e.g., O’Donoghue et al., 2015; Gosseries & Van der Heyden, 2018). This paper presents ongoing work on strengthening validation frameworks in SimPaths, with a focus on discriminator-based methods and econometric consistency checks. First, we apply classifiers (e.g., Gradient Boosted Machines) to distinguish between simulated and survey data. Discriminator accuracy provides an interpretable quantitative score of similarity: the closer performance is to random guessing, the more realistic the simulated data. Second, we explore the re-estimation of key behavioural regressions using simulated data and assess parameter recovery. This helps identify whether discrepancies arise from implementation issues, estimation limitations, or structural differences between datasets. By combining machine-learning discriminators with regression-based diagnostics, the paper contributes to more automated, transparent, and reproducible validation practices for complex microsimulation models.
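The discriminator score can be illustrated in a few lines: label observed rows 0 and simulated rows 1, fit any classifier, and read its held-out accuracy as a similarity measure, with accuracy near 0.5 meaning the discriminator cannot tell the datasets apart. The sketch below substitutes a one-feature threshold rule and synthetic normal samples for the Gradient Boosted Machines and survey data used in the paper, purely to keep the example dependency-free.

```python
# Discriminator-based similarity check on synthetic data. The classifier
# here is a deliberately simple threshold rule, not the GBM of the paper.
import random

def discriminator_accuracy(observed, simulated, rng):
    data = [(x, 0) for x in observed] + [(x, 1) for x in simulated]
    rng.shuffle(data)
    half = len(data) // 2
    train, test = data[:half], data[half:]
    # "Fit": threshold at the midpoint of the two class means, predicting
    # the class whose training mean is nearer.
    def mean(lbl):
        vals = [x for x, y in train if y == lbl]
        return sum(vals) / max(1, len(vals))
    m0, m1 = mean(0), mean(1)
    thr = (m0 + m1) / 2
    def predict(x):
        return (1 if x > thr else 0) if m1 > m0 else (1 if x < thr else 0)
    return sum(predict(x) == y for x, y in test) / len(test)

rng = random.Random(0)
observed = [rng.gauss(0.0, 1.0) for _ in range(5000)]
good_sim = [rng.gauss(0.0, 1.0) for _ in range(5000)]  # matches observed
bad_sim = [rng.gauss(2.0, 1.0) for _ in range(5000)]   # clearly shifted

acc_good = discriminator_accuracy(observed, good_sim, rng)
acc_bad = discriminator_accuracy(observed, bad_sim, rng)
```

With a well-matched simulation the discriminator hovers near chance, while a shifted simulation is easy to detect, which is exactly how the accuracy score is read as a validation metric.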