
Building an Estimation Framework for FEMA Emergency Deployments: Setting up a Monte Carlo - Part 3

In our story so far, we've learned about using Monte Carlo simulations and worked through how they are set up. We left the most recent post with the stunning visual in Figure 1 of a single forecast iteration showing the simulated magnitude of deployed people per week.




Figure 1. Single example Monte Carlo draw. Concurrent deployments per week are shown on the Y axis. Coloring reflects the total number of deployments per quarter, with deep green marking the highest values.


This visualization of a single draw doesn’t show us the full picture of expected outcomes, however. It represents just one possibility among the vast array of potential scenarios generated by the Monte Carlo simulation. Since we simulated 30,000 different possible paths, each path provides a different outcome based on random draws from the defined probability distributions. By aggregating the results from all 30,000 simulations, we can compute meaningful forecast metrics such as the mean, median, and various percentiles (e.g., 90th or 99.9th), which offer a clearer understanding of deployment needs across a range of possible situations. This aggregation not only highlights typical scenarios but also reveals the likelihood of extreme events, allowing for more informed decision-making and robust contingency planning.
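As a sketch of how this aggregation step might look in code (the simulated paths below are synthetic stand-in data, and the dimensions are illustrative assumptions, not the actual model output):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dimensions: 30,000 simulated paths of weekly deployments.
n_sims, n_weeks = 30_000, 52

# Stand-in for the simulation output: each row is one Monte Carlo draw
# of concurrent deployments per week (synthetic Poisson data here).
paths = rng.poisson(lam=3_000, size=(n_sims, n_weeks))

# Aggregate across the simulation axis, week by week.
summary = {
    "mean": paths.mean(axis=0),
    "median": np.median(paths, axis=0),
    "p90": np.percentile(paths, 90, axis=0),
    "p99_9": np.percentile(paths, 99.9, axis=0),
}
```

Each entry in `summary` is a length-52 array, giving the forecast statistic for every week, which is exactly what Figure 2 plots.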




Figure 2. Aggregated simulation statistics for the mean, 90th, and 99.9th percentile deployment expectations in thousands (vertical axis). The yellow line (mean) shows that we'd typically anticipate about 5,000 or fewer people deployed each week by Q3. In a very extreme event, we could see 25,000 people deployed in a single week. Planning for such extremes ensures an organization has procedures in place for extreme outcomes.


As can be seen, a simple Monte Carlo forecasting approach has led directly to actionable insights, enabling us to understand in a data-driven manner how deployment needs are likely to evolve.


Next Steps - Extending Monte Carlo simulations

At this point, we've worked through a basic Monte Carlo simulation approach. There are a number of ways you may want to extend the approach for your use case, such as:

  1. Workforce types/disaster needs: Breaking out different types of disaster events and workforce needs. For example, FEMA breaks out wildfire response separately from other types of natural disasters.

  2. Incorporating assumptions: Probability distributions can be extended to account for different assumptions. Our example here only considers historical data, but there is no reason things like increasing event frequency, population growth, or other factors cannot be included.

  3. Tailor simulation draws: We sampled univariate probabilities because, in our case, days until deployment and days deployed on site were independent. Variables that are dependent, such as deployments of related teams, instead require joint probability distributions. You can test for dependence in a few ways, including correlation analysis and transfer entropy analysis. In fact, this is a need recently highlighted in the national news.

  4. Needs in different regions: Regional deployments may also be considered, adding a geographic element to the estimation.
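On point 3, one common way to introduce dependence between two quantities is a Gaussian copula: draw correlated standard normals, map them to uniforms, then push the uniforms through each variable's inverse CDF. The correlation value and exponential marginals below are illustrative assumptions, not values estimated from the FEMA data:

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(0)

# Assumed correlation between days-until-deployment and days-on-site.
rho = 0.6
cov = np.array([[1.0, rho],
                [rho, 1.0]])

# Gaussian copula: correlated standard normals -> uniforms -> marginals.
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
u = norm.cdf(z)

days_until_deploy = expon(scale=5).ppf(u[:, 0])   # illustrative marginal
days_on_site = expon(scale=30).ppf(u[:, 1])       # illustrative marginal
```

The two output arrays preserve their individual (marginal) distributions while moving together, so a draw with an unusually long lead time also tends to have a longer stay on site.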


In addition to extending the model to incorporate more real-world uses and needs, we can also vary the simulation architecture itself to improve predictive performance. These alternative mathematical approaches are described in Table 1, along with their improvements over our basic approach.


Table 1. Different versions of Monte Carlo simulation, used for specific purposes, improved performance efficiency, or increased process complexity.

| Extension Name | Description |
| --- | --- |
| Quasi-Monte Carlo | Uses low-discrepancy sequences instead of random sampling to improve convergence speed. |
| Markov Chain Monte Carlo (MCMC) | Generates dependent samples using a Markov chain to approximate distributions, often used in Bayesian inference. |
| Multilevel Monte Carlo (MLMC) | Reduces variance and computational cost by combining simulations at different levels of accuracy. |
| Importance Sampling | Focuses sampling in areas with high impact on the result to improve efficiency. |
| Variance Reduction Techniques | Techniques such as control variates, antithetic variates, and stratified sampling reduce output variance. |
| Adaptive Monte Carlo | Adjusts sampling dynamically during the simulation based on intermediate results. |
| Sequential Monte Carlo (SMC) | Generates samples progressively; used in filtering problems, especially in tracking systems. |
| Parallel Monte Carlo | Distributes simulations across multiple processors or machines to increase computational efficiency. |
| Monte Carlo Tree Search (MCTS) | Uses a tree structure to explore decision spaces in games or optimization problems. |
| Stochastic Gradient Descent Monte Carlo (SGD-MC) | Combines Monte Carlo sampling with stochastic gradient methods to optimize machine learning models. |
| Perturbation Monte Carlo | Adds small perturbations to assess sensitivity and uncertainty in outputs. |
| Rare Event Simulation | Tailored Monte Carlo methods that focus on efficiently simulating rare but critical events. |
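To make one of these concrete, here is a minimal sketch of antithetic variates, one of the variance reduction techniques in Table 1, applied to a toy integral (the integrand is illustrative; for U ~ Uniform(0, 1), E[e^U] = e - 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

def f(u):
    # Toy integrand: E[f(U)] = e - 1 for U ~ Uniform(0, 1).
    return np.exp(u)

# Plain Monte Carlo estimate using 2n independent draws.
plain = f(rng.uniform(size=2 * n)).mean()

# Antithetic variates: pair each draw u with 1 - u. Because f is
# monotone, the pairs are negatively correlated and errors partly cancel.
u = rng.uniform(size=n)
antithetic = (0.5 * (f(u) + f(1 - u))).mean()
```

Both estimators use the same number of function evaluations, but the antithetic version typically lands much closer to the true value of e - 1.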


Conclusion

Over the course of this blog series, we've introduced Monte Carlo simulation by building a basic forecast of weekly FEMA deployments using open data. By walking through the process of setting up and executing these simulations, we've demonstrated how they can model complex, uncertain scenarios. Monte Carlo simulations provide valuable insights by generating a range of possible outcomes, from average scenarios to extreme events, allowing organizations to plan more effectively.


This forecasting method isn't just limited to FEMA deployments—it has broader applications across various fields where understanding uncertainty is key to making informed decisions. As we've shown, the ability to simulate thousands of outcomes based on historical data enables more data-driven planning and resource allocation, ensuring preparedness for both expected and rare events. Ultimately, Monte Carlo simulations serve as a powerful forecasting tool that transforms uncertainty into actionable insights.

 
Tom is Chief AI Officer of Flamelit, a leading Data Science and AI/ML consultancy specializing in delivering cutting-edge analytical solutions across various industries, where he spearheads data strategies for major U.S. government agencies, healthcare organizations, technology companies, and Fortune 500 companies. He holds a PhD in Economics from The University of Texas at Austin and has extensive experience with advanced AI technologies, data engineering, and AI-driven decision systems.


