Using augmented reality to help inform decisions about forest fires
by Eddie López
To evacuate one’s home is a big deal. It uproots you; it displaces you. No one wants to evacuate haphazardly, yet this is a situation that many Californians face as they are constantly saturated with fire evacuation warnings. How do I, as a resident, know when to go?
Our team sought to empower users by allowing them to better visualize fires relative to their location. In lieu of satellite data – which can often be too slow – our potential solution is a fire visualization app: something that can use fire models to predict a fire’s location, overlay the results onto the real world, and then allow users to visualize these results by using their phone’s camera.
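To make the overlay idea concrete: placing a predicted fire in the camera view ultimately requires converting the fire's geographic coordinates into an offset from the user's position. The sketch below is illustrative Python, not the app's actual code; the function name and the equirectangular approximation are my own assumptions, chosen because they are adequate over the few-kilometer ranges relevant to evacuation decisions.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def fire_offset_enu(user_lat, user_lon, fire_lat, fire_lon):
    """Approximate east/north offset in meters of a fire from the user.

    Uses an equirectangular approximation: longitude differences are
    scaled by cos(latitude) so east-west distances are not overstated.
    An AR framework could then place an anchor at this local offset.
    """
    lat0 = math.radians(user_lat)
    d_lat = math.radians(fire_lat - user_lat)
    d_lon = math.radians(fire_lon - user_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(lat0)
    north = EARTH_RADIUS_M * d_lat
    return east, north
```

A fire 0.01 degrees of latitude away works out to roughly 1.1 km north of the user, which matches the intuition that one degree of latitude is about 111 km.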
As a proof of concept, I had the opportunity to create some preliminary code, select some locations, and then place fire simulations at those locations. A short demo of this can be found below, where I placed some fire simulations outside of RAND proper:
One of the most significant implementation challenges is placing virtual objects at altitudes different from the user's. The code's current altitude function relies on the user's altitude data being correct, so in places where phones struggle to account for altitude (e.g., inside or near buildings), this can be an issue. If future development continues, one approach for addressing this may be to incorporate GIS data, which accounts for a location's height independently of the user's reported altitude. Additionally, a graphical user interface (e.g., a home screen, an evacuation checklist) and support for various fire models are also of interest.
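The GIS idea amounts to looking up ground elevation from terrain data rather than trusting the phone's altitude sensor. As a rough sketch of that lookup (illustrative Python only; the grid format, function name, and bilinear interpolation scheme are my own assumptions, not the project's design):

```python
def ground_elevation(grid, cell_size_m, x_m, y_m):
    """Bilinearly interpolate terrain height (m) from a GIS elevation grid.

    grid[row][col] holds elevation samples spaced cell_size_m apart;
    (x_m, y_m) is the query point in the grid's local coordinates.
    Anchors placed at this height sit on the terrain regardless of any
    error in the user's own altitude reading.
    """
    col = x_m / cell_size_m
    row = y_m / cell_size_m
    c0, r0 = int(col), int(row)
    c1 = min(c0 + 1, len(grid[0]) - 1)  # clamp at the grid edge
    r1 = min(r0 + 1, len(grid) - 1)
    fx, fy = col - c0, row - r0
    top = grid[r0][c0] * (1 - fx) + grid[r0][c1] * fx
    bot = grid[r1][c0] * (1 - fx) + grid[r1][c1] * fx
    return top * (1 - fy) + bot * fy
```

A real implementation would read such a grid from a digital elevation model (DEM) for the region of interest, but the interpolation step would look much the same.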
If you want to see one of the fire models that we were considering as a preliminary input, you can see Tim Gulden’s fire model here. Otherwise, until next time!