Open Source Geospatial Takes Center Stage for Resilience at FedGeoDay 2025
Washington, D.C. – On April 22nd, the air in Washington, D.C. was charged with a focused, positive energy as the geospatial community converged for FedGeoDay 2025. Project Geospatial was on hand to cover this pivotal event, and the energy was palpable: the main event space was packed, making this the most active FedGeoDay we've seen in several years of covering the event. With an impressive turnout of over 200 experts and practitioners from government, industry, academia, and non-profits, the event underscored the community's deep commitment, and its sense of urgency, around leveraging open source geospatial tools and data to build resilience against increasing environmental and security challenges.
The day's discussions weren't confined to theory, but rather delved into the practical application of open geospatial technologies for societal benefit. Sessions explored innovative flood mapping, hurricane decision support systems, the complexities of cybersecurity in a connected world, the cautious yet necessary integration of AI, and the foundational importance of open data initiatives like OpenStreetMap and Overture Maps. The clear, unifying thread woven throughout every presentation and conversation was the undeniable power of collaboration and community in developing and deploying effective, scalable solutions to help communities prepare for, respond to, and recover from disasters.
The Cornerstones of Resilience: Collaboration, Access, and Speed
Several key themes echoed throughout the day as essential operational principles for building resilience. Cross-functional and cross-jurisdictional collaboration was highlighted as critical for breaking down silos that can impede effective response efforts. The democratization of data and analysis through open means was championed as a way to empower a wider range of actors, from federal agencies to local emergency managers and even citizens. And, perhaps most critically in high-stakes situations, the speed and timeliness of delivery enabled by open source and open data were repeatedly emphasized as crucial advantages for climate resilience and emergency response.
These advantages translate directly into community impact:
Increased Collaboration: Enables faster, more coordinated responses across different levels of government and organizations when disaster strikes.
Data Democratization: Puts critical information and powerful analytical tools into the hands of more people, fostering local capacity and enabling informed decisions closer to the point of impact.
Enhanced Speed and Timeliness: Allows for rapid deployment of critical information and services, saving time, resources, and potentially lives during emergencies.
Setting the Stage: Earth Observation Data in Action
Katie Picchione | FEDGEODAY 2025 Keynote Speaker
A major highlight that set a foundational tone for the day was the keynote presentation by Katie Picchione from the NASA Disasters Response Coordination System (DRCS). As the response coordination lead for DRCS, Katie works to translate NASA's vast Earth observation capabilities into actionable information for emergency managers. Her talk, "Expanding on the 'What, So What, and Now What' Applications of Earth Observation Data," framed the discussion using the "what, so what, and now what" framework familiar to disaster responders.
Katie detailed how NASA provides data to answer the "what happened" (describing hazards through imagery, flood maps, and change detection from sources like Sentinel-2 and SAR, including analysis following the Lahaina fire) and the "so what" (understanding impacts by intersecting hazard data with foundational datasets like population or infrastructure, with examples using Black Marble nighttime lights to map power outages in Houston). She clarified that NASA primarily provides the data and analysis, enabling partners to determine the "now what" – the necessary actions for decision-making – with Earth observation data supplementing traditional field surveys by filling spatial and temporal gaps.
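The "so what" step Katie described, intersecting a hazard footprint with a foundational dataset, can be illustrated with a minimal sketch. This is not NASA's pipeline; the bounding box, settlement names, and population counts below are invented for illustration:

```python
# Illustrative "so what" step: intersect a hazard footprint with a
# foundational population dataset. All data here is hypothetical.

def point_in_bbox(lon, lat, bbox):
    """bbox = (min_lon, min_lat, max_lon, max_lat)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

# Hazard footprint, e.g. a flood extent derived from SAR change detection.
flood_bbox = (-95.6, 29.5, -95.0, 30.1)

# Foundational dataset: population counts at settlement centroids (made up).
settlements = [
    {"name": "A", "lon": -95.4, "lat": 29.8, "pop": 12000},
    {"name": "B", "lon": -95.2, "lat": 30.0, "pop": 4500},
    {"name": "C", "lon": -94.9, "lat": 29.7, "pop": 8000},  # outside extent
]

# Population exposed to the hazard: the core of the "so what" answer.
exposed = sum(s["pop"] for s in settlements
              if point_in_bbox(s["lon"], s["lat"], flood_bbox))
print(exposed)  # → 16500
```

Real workflows do this with rasters and polygons rather than points and boxes, but the shape of the analysis is the same: overlay hazard extent on exposure data to turn "what happened" into "who is affected."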
Katie highlighted the accessibility of NASA's open Earth observation data through various platforms: Worldview for quick looks, FIRMS for fire hotspots, the DRCS Portal during active incidents, APIs like GIBS for technical users, and the Earthdata website for downloads. She also pointed to training resources like the ARSET program. Katie concluded with a clear call to action for the FedGeoDay community: to build a stronger community of practice around using open Earth observation data in disaster response, recognizing its immense potential to complement commercial data streams. Her insights clearly resonated, drawing an enthusiastic reception from the engaged audience.
Key Demonstrations: Open Source Geospatial Applications for Resilience
Xan Fredericks (USGS) giving her presentation.
A significant portion of the day was dedicated to practical demonstrations showcasing real-world open source geospatial applications for resilience. This session, titled "Demonstrations: Open Source Geospatial Applications for Resilience," was moderated by Ryan Burley of GeoSolutions. His organization specializes in core code development and steering committee membership for open source geospatial software like GeoServer, GeoNode, and MapStore, supporting customers including HURREVAC, NOAA's nowCOAST, and Critical Response Group, so Ryan was well placed to set the stage for a panel focused on tangible applications of open geospatial tools and data in critical resilience and response scenarios. The session featured presentations from four speakers: Karen Townsend, Daniel Dufour, Henry Rodman, and Xan Fredericks. Common themes of cross-functional collaboration, data democratization, and speed of delivery were woven throughout.
Karen Townsend of Sea Island Software presented on HURREVAC, a widely adopted tool for US government emergency managers in hurricane-prone communities, supported by FEMA, USACE, and NOAA. She highlighted HURREVAC's remarkable history, evolving over 30 years from DOS to a web-based application and tracking 268 Atlantic hurricanes since 1988. Available free to relevant government employees worldwide, HURREVAC's core function is combining forecast products (primarily from the National Hurricane Center), storm surge modeling, and USACE evacuation studies. Karen focused on the 2024 season, particularly Hurricanes Helene and Milton, showing how HURREVAC aggregates frequently updating forecast data and includes live layers like river gauges, crucial for inland flooding alerts like those during Helene. She candidly touched upon the significant data integration effort required for their small team to pull together diverse, sometimes text-based, data sources, noting the ongoing technical lift and the value a system like STAC could offer. She also mentioned keeping pace with NOAA in integrating new products like raster-based wind timing and probabilities.
Daniel Dufour from the City of Chattanooga presented on a near real-time map of reported flooding and road closures developed during Hurricane Helene. Prompted by the question, "What if we could create a real-time map of flooding?", Daniel described the resulting intentionally simple map, which uses circles for flooding and lines for closures. He drew a key distinction between mapping reported flooding (from human observation via emails and calls to 311/911) and predicted flooding from complex hydrological models, emphasizing that their project focused solely on the former, which is critical for traffic and captures hard-to-predict blockages. Daniel stressed the project's equal reliance on humans and technology. The technical stack involved GitHub Actions, GDAL, OpenStreetMap, MapLibre for display, and integration with their proprietary Tyler Technologies data warehouse and Safe Software's FME. Key lessons from this rapid, two-day development effort included the importance of preparation, a psychologically safe organizational culture for risk-taking, minimizing sign-offs, building capabilities incrementally, and leveraging open data (no passwords) and open source (no licensing issues), noting how proprietary and open source tools successfully worked together.
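The circles-and-lines design Daniel described maps naturally onto GeoJSON, which MapLibre can render directly. A minimal sketch of that conversion step, with invented field names and coordinates (not Chattanooga's actual intake schema):

```python
import json

# Hypothetical incident reports as they might arrive from 311/911 intake.
# Field names and coordinates are invented for illustration.
reports = [
    {"type": "flooding", "lon": -85.31, "lat": 35.05},
    {"type": "closure", "path": [[-85.30, 35.04], [-85.29, 35.05]]},
]

features = []
for r in reports:
    if r["type"] == "flooding":
        # Rendered as a circle on the map.
        geom = {"type": "Point", "coordinates": [r["lon"], r["lat"]]}
    else:
        # Road closures rendered as lines along the affected segment.
        geom = {"type": "LineString", "coordinates": r["path"]}
    features.append({"type": "Feature",
                     "geometry": geom,
                     "properties": {"kind": r["type"]}})

collection = {"type": "FeatureCollection", "features": features}
print(json.dumps(collection, indent=2))
```

In a pipeline like the one described, a scheduled job (e.g. a GitHub Action) would regenerate this FeatureCollection from fresh reports and publish it for the MapLibre front end to fetch.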
Henry Rodman from Development Seed presented on eoAPI, their cloud-native geospatial tech stack designed to make geospatial data more accessible more quickly. Development Seed builds geospatial software "out in the open," and eoAPI serves as an "easy on-ramp" into cloud-native geospatial technology, which he noted can be challenging to start with despite its availability. Defining cloud-native by its goal of rapid data access and minimal complex pipelines, Henry highlighted the core role of the SpatioTemporal Asset Catalog (STAC) as a de facto standard for cataloging geospatial assets. He explained that while STAC is powerful, it is hard to use directly, so eoAPI provides sensible defaults through its four main components: pgstac (a PostgreSQL backend for STAC), stac-fastapi (serving cataloged data), titiler-pgstac (a tile server for visualization), and tipg (serving vector data via OGC API - Features). Henry demonstrated how cataloging data in STAC using pgstac makes it immediately available via OGC services and interoperable with STAC-based tools like STAC Browser and NASA's VEDA, showcasing disparate data layers being easily combined. He cited the Planetary Computer as a "gold standard" built on eoAPI components, where data becomes visualization-ready as soon as it's in cloud storage. Henry emphasized that eoAPI is open and deployable, even locally, and that Development Seed offers workshops on it.
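The core STAC idea Henry described, a catalog of items you can filter by space and time, can be shown in plain Python. In a real deployment pgstac stores the items and stac-fastapi answers the queries; this sketch just emulates the search semantics over two invented items:

```python
from datetime import datetime, timezone

# Minimal STAC-style items (simplified; real Items carry more metadata
# and assets). IDs, bboxes, and hrefs are invented for illustration.
items = [
    {"id": "scene-001",
     "bbox": [-85.5, 34.9, -85.0, 35.3],
     "properties": {"datetime": "2024-09-27T14:00:00Z"},
     "assets": {"visual": {"href": "s3://example-bucket/scene-001.tif"}}},
    {"id": "scene-002",
     "bbox": [-90.0, 29.0, -89.5, 29.5],
     "properties": {"datetime": "2024-08-01T14:00:00Z"},
     "assets": {"visual": {"href": "s3://example-bucket/scene-002.tif"}}},
]

def bbox_overlaps(a, b):
    """True if two (min_lon, min_lat, max_lon, max_lat) boxes intersect."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def search(items, bbox, start, end):
    """Emulate a STAC search: spatial + temporal filter over cataloged items."""
    hits = []
    for it in items:
        t = datetime.fromisoformat(
            it["properties"]["datetime"].replace("Z", "+00:00"))
        if bbox_overlaps(it["bbox"], bbox) and start <= t <= end:
            hits.append(it["id"])
    return hits

aoi = [-86.0, 34.5, -84.5, 35.5]  # area of interest
start = datetime(2024, 9, 1, tzinfo=timezone.utc)
end = datetime(2024, 10, 1, tzinfo=timezone.utc)
print(search(items, aoi, start, end))  # → ['scene-001']
```

This is the "sensible defaults" payoff: once items are cataloged this way, any STAC-aware client can run the same bbox-plus-datetime query without knowing where the underlying files live.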
Xan Fredericks, the Emergency Response Coordinator for the USGS National Geospatial Program, presented on how USGS leverages open data resources to support emergency response and resilience. In her vital role, she oversees the bureau's geospatial information response team and collaborates with partners to deliver actionable, science-based information during emergencies. Her presentation was particularly well-received by the audience. Xan's talk focused on the USGS's critical function in providing foundational geospatial data to support emergency management, addressing how the bureau's authoritative data contributes across the spectrum of a crisis, helping responders understand impacts and inform necessary actions. While the "what, so what, and now what" framework is a key structure used in emergency management, and one that keynote speaker Katie Picchione credited Xan with suggesting as valuable for geospatial applications, the core of Xan's presentation highlighted the indispensable role of trusted government data itself. She spotlighted key USGS datasets, such as elevation data from the 3D Elevation Program (3DEP), emphasizing the importance of these fundamental (and openly accessible) national data assets for critical resilience applications like flood modeling and landslide assessment. Her presentation effectively conveyed how USGS data, made accessible through open initiatives, is a cornerstone for building national resilience by providing the essential geographic truth needed in times of crisis. Xan also discussed USGS "event support maps" categorized by hazard and highlighted the public applications page (usgs.gov/girt), which links to open resources including training videos explaining products like US Topos – directly addressing the democratization of data and analysis.
She passionately demonstrated the 3DEP viewer, showing how its 70+ trillion lidar points (over 200,000 per US resident) are available online, free, and easily shareable via link. A disaster, she stressed, is no time to discover data incompatibility, underscoring the need for interoperable, OGC-compliant datasets: "there's no room for ego in emergency response."
Following the presentations, a brief Q&A session allowed one question for Karen Townsend on data aggregation challenges before time constraints prompted moderator Ryan Burley to encourage attendees to connect with the presenters at the social event. In summary, this segment provided a robust overview of how diverse entities – software companies, city governments, development firms, and federal agencies – leverage open source software, open data, and collaborative efforts to enhance disaster response and resilience through practical geospatial applications.
Lightning Talks: Rapid Insights and Innovations
In addition to the key demonstrations, a series of focused "lightning talks" provided further rapid-fire insights into diverse applications and foundational elements within the geospatial domain relevant to resilience. While many valuable insights were shared during these shorter sessions, a few particularly resonated with our coverage team for their illustration of key challenges, innovative solutions, and direct impact:
Quincy Morgan delivered a talk focusing on OpenStreetMap (OSM), highlighting its status as the world's largest repository of free, editable vector data built through global collaboration. His presentation emphasized the value of this community-driven data for global and local resilience efforts and showcased tools like Layer Cake (for cloud-native access to OSM data in GeoParquet) and Slice OSM (for easy area-based downloads) that improve data accessibility and usability for analysis. This talk was compelling because it reinforced the power of the crowd and community in creating a vital open data asset, while also demonstrating practical steps being taken to make that data more readily available for widespread adoption and use in resilience applications.
Emma Paz from Development Seed tackled the significant challenge of making petabytes of commercial satellite data (from programs like CSDA) accessible to researchers. Her talk showcased an approach leveraging metadata and the SpatioTemporal Asset Catalog (STAC) specification, implemented using a PostgreSQL-based backend (pgstac) and exposed via stac-fastapi. This was compelling because it highlighted a practical, open approach to standardizing the discovery and access of geospatial data at an immense scale (over 30 petabytes) – a critical need for getting valuable commercial data into the hands of those who can use it for resilience efforts. The robust infrastructure supporting this, capable of hosting 100 million STAC records, demonstrated a scalable solution to a major data management hurdle.
Ghermay Araya delivered a compelling talk focusing on address data, framing it not just as simple geographic information, but as a "hidden spatial data infrastructure" absolutely critical to everyday use cases, including life-saving emergency response. By drawing attention to the lack of a single, unified address management process across the country, Araya tied poor address data directly to real-world, costly consequences, including potential delays in 911 response measured in lives lost. This elevated the discussion of data quality from a technical issue to a critical infrastructure concern with immediate human impact, making a powerful case for modern, open-sourced, and open access address management systems adhering to standards like those from the FGDC.
Jason Gilman from Element 84 explored the cutting-edge intersection of AI and geospatial data, specifically addressing the critical issue of reliability in geospatial AI. His talk focused on using Large Language Models (LLMs) to allow natural language queries about geographic areas, while proposing a method to mitigate the inherent non-deterministic nature of LLMs. The approach involves translating natural language into a deterministic, testable graph of spatial operations. This was particularly compelling for tackling head-on the challenge of ensuring accuracy and trustworthiness in AI systems used for critical geospatial applications, offering an innovative technical solution to make powerful AI tools more reliable for resilience purposes.
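The idea Jason described, having the LLM emit a graph of spatial operations that a deterministic engine then evaluates, can be sketched in a few lines. The node vocabulary, region names, and coordinates below are invented for illustration and are not Element 84's actual design:

```python
# Sketch of "translate natural language into a deterministic, testable
# graph of spatial operations." Instead of answering directly, the LLM
# emits a small operation graph; a plain evaluator executes it, so the
# spatial result is reproducible and unit-testable.

REGIONS = {  # toy dataset: named regions as (min_lon, min_lat, max_lon, max_lat)
    "floodplain": (-85.5, 35.0, -85.2, 35.2),
    "school_district": (-85.4, 35.1, -85.1, 35.4),
}

def evaluate(node):
    """Deterministically evaluate an operation graph over bounding boxes."""
    op = node["op"]
    if op == "load":
        return REGIONS[node["name"]]
    if op == "intersection":
        a, b = evaluate(node["a"]), evaluate(node["b"])
        box = (max(a[0], b[0]), max(a[1], b[1]),
               min(a[2], b[2]), min(a[3], b[3]))
        return box if box[0] < box[2] and box[1] < box[3] else None
    raise ValueError(f"unknown op: {op}")

# A graph an LLM might emit for "Which part of the school district floods?"
graph = {"op": "intersection",
         "a": {"op": "load", "name": "floodplain"},
         "b": {"op": "load", "name": "school_district"}}

print(evaluate(graph))  # → (-85.4, 35.1, -85.2, 35.2)
```

The design benefit is that only the translation step is probabilistic; the graph itself can be inspected, tested against known inputs, and re-run with identical results, which is what makes the approach attractive for critical applications.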
These insightful lightning talks provided concrete examples demonstrating how open approaches and innovative technical solutions are being applied to fundamental challenges in the geospatial domain, from managing vast data archives and standardizing critical infrastructure data to ensuring the reliability of advanced AI and deploying critical applications.
Connecting the Dots: Themes Woven Together
As the day progressed, it became increasingly clear how the diverse topics discussed were interconnected and mutually reinforcing. The need for reliable, accessible foundational data highlighted in demonstrations by Xan Fredericks on USGS data (including 3DEP) or lightning talks like Ghermay Araya's on addresses and Quincy Morgan's on OpenStreetMap, directly feeds into the potential for powerful analytics, including those using AI, as discussed by Jason Gilman. The challenges of managing immense data volumes, showcased by Emma Paz, underscore the necessity of the cloud-native tools and standards highlighted throughout the day. And the real-world open-source applications presented by Ryan Burley are the tangible outcomes enabled by progress in all these underlying areas – robust data, reliable systems, and accessible tools. This interconnectedness highlighted that advancing resilience requires a holistic approach, addressing challenges across the entire geospatial data lifecycle and technology stack.
Deep Dives: Panels Explore Key Challenges and Opportunities
Beyond the inspiring keynote and practical showcases, dedicated panel discussions offered deeper dives into critical challenges and opportunities facing the community:
Callout: The AI Panel - Demystifying AI
The "Demystifying AI" panel tackled the potential and challenges of integrating artificial intelligence into geospatial applications for resilience. Panelists discussed various AI technologies, acknowledging that "AI" is often used loosely, but emphasizing that ML, computer vision, and LLMs are tools whose effectiveness hinges on data quality and relevance. Key challenges in the federal context included the scarcity of labeled data needed for training and the complexity of data cleaning, metadata, and standards required to make data AI-ready. Foundation models capable of understanding remote sensing data were seen as a promising opportunity.
The panel placed significant emphasis on ethical considerations and risks, particularly privacy and biases (geographical, temporal, social, and even in crowdsourced data like OpenStreetMap). They stressed the need for explainability – understanding data sources, quality, biases, and methodologies – to build trustworthiness in AI products. For critical applications like disaster response, human verification was deemed essential due to the potential for AI models to produce inaccurate or fabricated outputs. Security risks, such as "RAG poisoning" of LLM retrieval sources, reinforced the need for trusted data sources and adversarial testing.
While acknowledging a relatively slow penetration of AI in some federal areas, hindered by resource constraints and clarity issues, the panel saw immense potential for collaboration across government, academia, and industry. They advocated for a "problem first, model second" approach, focusing on the specific challenge before selecting the AI tool. Discussions also touched on adhering to OMB guidance on sharing AI code and data, and the ongoing question of when AI models are "ready" for critical use cases, suggesting prototyping, understanding limitations, and investing in building trust over time.
From Left to Right: Moderator: Eddie Pickle, Panelists: Jed Sundwall (Radiant Earth), Angelina Calderon (Meta), John Crowley (MapAction), Maggie Cawley (OpenStreetMap US)
Callout: The Open Data for Resilience Panel
The "Open Data for Resilience" panel underscored the vital role of open data, open source software, and community collaboration. Panelists included Maggie Cawley from OpenStreetMap US, John Crowley from MapAction, Angelina Calderon from Meta/Overture Maps, and Jed Sundwall from Radiant Earth.
A point of particular focus, reflecting notable tensions within the community, was the relationship between the long-standing, community-driven OpenStreetMap project and the newer, corporate-backed Overture Maps initiative. Angelina Calderon from Meta, a founding Overture member, framed Overture as a complementary, cloud-native effort focused on scale, usability, and creating a Global Entity Reference System (GERS) for linking data, emphasizing Meta's support for both initiatives.
However, Maggie Cawley from OpenStreetMap US frankly articulated concerns about the impact of Overture on the core OSM community. While acknowledging OSM's role in supporting projects like Overture with foundational data, she highlighted worries that the resources and attention of large companies, including former OSM sponsors, are being diverted away from directly supporting the foundational OpenStreetMap community that many rely on. This shift, she argued, makes it harder for community-driven entities like OpenStreetMap US to secure necessary resources for maintaining and growing the core dataset.
Jed Sundwall from Radiant Earth further elaborated on why large corporations might approach Overture differently than OSM. He pointed to significant legal concerns some corporate lawyers have had with the OpenStreetMap license (ODbL), and to the difficulty of identifying a clear legal entity within the globally distributed OSM community structure, as factors making a separate consortium like Overture a seemingly more straightforward path for certain types of large-scale corporate engagement and data use. John Crowley from MapAction corroborated this, mentioning the necessity of incorporating entities like the Humanitarian OpenStreetMap Team (HOT) to facilitate interactions with governments. These points underscored that the tension lies not necessarily in conflicting missions, but in the challenges of large-scale corporate investment and participation models interacting with established, community-driven open data structures and their long-term sustainability.
Despite these points of tension, the panel also highlighted successful collaborations, such as MapAction's use of diverse open data for international disaster response and coordination efforts enabled by groups like FEMA's Geospatial Resource Team, which support organizations delivering aid. Challenges remain, including ensuring data feedback loops and addressing fragmentation and lack of standardization across agencies. Nevertheless, the panelists expressed optimism due to increasing tools and a growing recognition that building truly resilient data infrastructure requires community, partnerships, and collaboration, advocating for lowering the barrier to participation for state and local entities.
Callout: The Cybersecurity Fireside Chat - Achieving Resilience in the Cyber War
The "Achieving Resilience in the Cyber War" fireside chat brought a critical focus on securing the geospatial data, systems, and people vital for resilience. Framing cybersecurity as integral to risk management, the discussion highlighted challenges like sharing vulnerability information (partly due to overclassification) and the risks inherent in open source data (malicious code changes, metadata inconsistencies).
Panelists stressed the need for continuous monitoring, validating data sources before ingestion, and adhering to established security frameworks. Managing stakeholders, educating senior leadership on cyber risks (including supply chain), and building accountability for security within organizations were deemed crucial for governance and funding. Team resilience, involving identifying critical functions, having backup contacts, assessing skills, and training, was also emphasized. Testing methods like "red teaming" were recommended to find vulnerabilities and train staff.
Ultimately, building national cyber resilience requires investing in people, prioritizing technology based on risk, and leveraging frameworks like MITRE ATT&CK. The most critical investment highlighted was validating data "coming in" to prevent downstream issues. Collaboration across sectors was again stressed as essential, requiring agreement on frameworks and a shared understanding of security requirements. The chat reinforced that security is not just a technical checklist but a continuous process involving organizational culture, governance, and human factors.
A Community Driving Forward for Resilience
In conclusion, FedGeoDay 2025, fueled by intense positive energy and marked by a packed room of over 200 dedicated professionals, offered a compelling snapshot of the evolving landscape for leveraging open source and hybrid geospatial technologies for resilience. The event highlighted the immense potential of Earth observation data, the critical role of open data initiatives and underlying open source software, the promising but complex integration of AI, and the absolute necessity of robust cybersecurity practices and strong, human-driven processes.
Discussions throughout the day, from the keynote and demonstrations to the lightning talks and candid panel exchanges on AI, open data dynamics, and cybersecurity, consistently reinforced key takeaways: the vital need for rapid, trustworthy access to diverse, reliable data; the foundational importance of collaboration, data standardization, and community building (while acknowledging inherent tensions in its evolution); the cautious but necessary exploration of AI technologies; and the indispensable role of human expertise, verification, and organizational resilience alongside technological advancements.
The event's success and the palpable energy of the attendees served as a powerful testament to the community's dedication and its collective pursuit of a more resilient future for all. As the community looks forward, the insights and connections made at FedGeoDay 2025 will undoubtedly continue to drive innovation and collaboration in leveraging geospatial power for societal benefit.