The Phoenix Project: A Challenge to the Community to Forge an Open-Source Successor to HIFLD
This is Not a Map, It's a Compass
Let me be clear from the outset: what follows is not a solution served on a silver platter. It is not a finished project plan, complete with budgets and timelines that can be executed tomorrow. To offer such a thing would be arrogant and miss the point entirely.
Instead, this is a white paper in the truest sense—a conceptual foundation, a challenge, and an impassioned plea meant to spark a difficult but necessary conversation. It is born from a sense of profound loss for what the Homeland Infrastructure Foundation-Level Data (HIFLD) portal represented, but also an unshakeable belief in our collective ability to build something far more resilient in its place. This is not a map to a finished destination. It is a compass, offered in the hope that we, as a community, can agree on a direction and begin the journey forward. The government abdicated its responsibility as a steward of our shared digital commons; now, the duty and the opportunity fall to us.
The Digital Public Square Goes Dark
Imagine an emergency operations center on the Gulf Coast as a hurricane churns toward shore. An analyst, tasked with predicting the storm's impact on critical medical infrastructure, seamlessly overlays authoritative national datasets: the location of every hospital and nursing home, the intricate web of electric power transmission lines that serve them, the designated hurricane evacuation routes, and the latest flood hazard zone data. This common operational picture, a shared map of reality, allows for the proactive staging of resources, the targeted evacuation of vulnerable populations, and the anticipation of cascading failures. This life-saving clarity was not a hypothetical scenario; it was the direct product of the HIFLD Open portal, a cornerstone of American preparedness for two decades.
On August 26, 2025, this digital public square went dark. The Department of Homeland Security (DHS), the very agency that had championed a "Whole of Nation" approach to resilience, unilaterally dismantled this vital public asset, declaring that hosting it was "no longer a priority for their mission". In its place, the agency offered a spreadsheet—a list of hyperlinks to the disparate, unstandardized data sources that HIFLD had so painstakingly aggregated and curated. With this decision, a national treasure trove of geospatial data, once celebrated as a tool to empower every citizen, was effectively removed from easy public access, shattering the common picture it had taken years to build.
This event must be understood not as a simple bureaucratic cost-saving measure, but as a critical failure in the stewardship of our national digital infrastructure. The unceremonious shuttering of HIFLD Open exposes the inherent fragility of any centralized, government-hosted public good, which remains perpetually vulnerable to the shifting winds of political priorities and budget cycles. Yet, from these ashes comes an urgent and profound opportunity. The quiet shutdown of a government portal has ignited a grassroots data rescue movement, but this reactive scramble, while necessary, is not a sufficient long-term solution. The moment demands that the geospatial community—from volunteer mappers and academic researchers to corporate innovators and public servants—move beyond mere data preservation and architect a resilient, community-owned successor.
This report issues a challenge and provides a conceptual blueprint for that successor: the Phoenix Project. Continuing the thread of our previous analysis, "The Rise, Power, and Uncertain Future of America's Open Infrastructure Data," which concluded by asking what it would take to build the next common map, this document provides the answer. It argues that the community must now forge a new digital commons for national infrastructure data, one that is antifragile by design. This document will first chronicle the rise and unceremonious fall of HIFLD Open, establishing the profound value of what was lost and the inadequacy of its replacement. It will then survey the current landscape of open geospatial data, conducting a rigorous analysis of OpenStreetMap (OSM) and the Overture Maps Foundation as potential foundations upon which to build. Finally, it will present a detailed architectural and governance blueprint for a new, semantically rich, community-stewarded critical infrastructure atlas, concluding with a direct call to action for the stakeholders who can make this vision a reality. The loss of HIFLD is a warning; the Phoenix Project is the necessary response.
A Common Picture Shattered: The Rise, Value, and Unceremonious Fall of HIFLD Open
To comprehend the magnitude of the void left by HIFLD Open's termination, one must first understand the chaotic world from which it emerged and the unique, multifaceted value it provided. It was more than a data repository; it was a common language, a tool for proactive resilience, and a symbol of a transparent, "Whole of Nation" approach to security. Its story is a testament to the power of open data and a cautionary tale about the fragility of digital public goods.
Genesis in Chaos: Forging a Common Language Post-9/11
In the immediate aftermath of the September 11th attacks, as the United States grappled with a new era of asymmetric threats, a terrifying realization dawned within the nascent homeland security enterprise: nobody was looking at the same map. Federal agencies, state governments, military commands, and local first responders each maintained their own, often proprietary and incompatible, geospatial datasets. This led to what insiders called the "M: drive problem," a reference to the shared network drives in every organization filled with a chaotic jumble of shapefiles and map documents. In early crisis response exercises, this data discord proved catastrophic. A power plant on one agency's map was a mile away from its location on another's. Road networks were inconsistent between jurisdictions. In a crisis demanding unprecedented coordination, the nation lacked a common operational picture.
HIFLD was not born from a grand, top-down federal mandate. It emerged from the operational trenches, driven by a "coalition of the willing"—a determined group of public servants, military officers, and private contractors who recognized the existential danger of this data chaos. As early participants recall, the project was often kept alive "out of sheer will," with little dedicated funding, because its mission was self-evidently critical. The goal was simple yet revolutionary: agree on what foundational data was needed for a common picture, acquire that data, and make it available to everyone who needed it.
The program's evolution mirrored the technological and bureaucratic journey of the post-9/11 era. In the early days, data distribution was painfully analog: batches of DVDs were physically mailed to stakeholders. This was less a limitation of technology and more a product of restrictive IT and security policies that made it nearly impossible for government agencies to host servers for open data sharing. Over time, as technology demonstrated what was possible and policies slowly adapted, HIFLD transitioned from physical media to a sophisticated, cloud-based portal powered by Esri's ArcGIS platform.
A pivotal moment in HIFLD's institutionalization was the formal, multi-year transfer of its data procurement and management responsibilities. For years, the National Geospatial-Intelligence Agency (NGA), with its deep expertise, had taken the lead on acquiring crucial datasets. However, the program's logical home was always within the Department of Homeland Security. Finalizing this move was an arduous process, requiring, according to one DHS official, "little acts of Congress" to complete. By fiscal year 2023, the transition was finalized, and HIFLD was fully managed by DHS's Geospatial Management Office (GMO), cementing its status as a cornerstone of the U.S. homeland security enterprise.
The Anatomy of a National Asset: More Than Just a Data Portal
The fully realized HIFLD program operated through two distinct but related channels, reflecting the dual needs of the security enterprise.
HIFLD Secure: This was the restricted-access repository, a digital vault containing sensitive, proprietary, and controlled information. It housed commercially licensed data and datasets marked "For Official Use Only" (FOUO). Access was tightly controlled, limited to vetted mission partners within the Homeland Security Enterprise (HSE) who had a direct operational need and had signed formal Data Use Agreements (DUAs). This secure environment was, and remains, essential for law enforcement, critical infrastructure protection, and emergency management missions that rely on data not suitable for public release.
HIFLD Open: This public-facing portal was a radical act of transparency, born from the philosophy that national resilience requires empowering "every citizen to have an active role in our security". It provided free, public domain data for "community preparedness, resiliency, research, and more". This was a deliberate act of data democratization, making hundreds of high-quality, national-level foundational datasets accessible to anyone with an internet connection—from urban planners and university researchers to non-profits and journalists.
The catalog of HIFLD Open was a sprawling digital library, containing over 300, and by some counts nearly 400, geospatial datasets that formed a detailed anatomical chart of the United States. It was the product of a massive collaborative effort, aggregating data from a constellation of authoritative sources including Oak Ridge National Laboratory (ORNL), the U.S. Census Bureau, the Federal Railroad Administration (FRA), and the U.S. Geological Survey (USGS). The datasets spanned nearly every critical sector imaginable:
Energy: Electric Power Transmission Lines, Electric Substations, Natural Gas Processing Plants, Ethanol Plants.
Transportation: Railroad Bridges, Hurricane Evacuation Routes, Airports, Ports.
Communications: Cellular Towers, Antenna Structures.
Public Health and Safety: Hospitals, Pharmacies, Urgent Care Facilities, Nursing Homes, EMS Stations, Police Stations.
Community and Commerce: Schools, Colleges and Universities, Places of Worship, Major Sport Venues, Fortune 500 Headquarters.
Hazards: National Flood Hazard Layer, Historical Tornado Tracks, Historical Fire Perimeters.
Crucially, the immense value of HIFLD Open was not merely in the collection of these datasets, but in the intellectual and technical labor of curation. It was not just a list of links. It was a centralized, standardized repository where data from disparate agencies was aggregated into a common platform, made available in multiple user-friendly formats (CSV, KML, Shapefile), and accessible via modern APIs and web services. To ensure the integrity of this collection, DHS instituted the HIFLD Acceptance Review Process (HARP), a rigorous quality control system for all new and updated data submissions. This aggregation and curation function was the core of its value proposition. The government's own justification for the shutdown—that most of the data is available elsewhere—betrays a fundamental misunderstanding of this principle. The community's frantic efforts to archive the portal's contents before its demise serve as the most powerful evidence of this "curator's value." The convenience of finding hundreds of vetted, national-scale layers in one place, ready for analysis, was a force multiplier for countless users.
This curated accessibility had impacts that rippled far beyond disaster response. By offering free, reliable data on national infrastructure, HIFLD Open lowered the barrier to entry for local governments and private businesses to conduct analyses that could spur new investments and drive economic development. The concern voiced by one user that "Fortune 500s are about to feel some real pain" following the shutdown speaks to the deep integration of this public data into the private sector's analytical workflows. Furthermore, in an era of waning public trust, HIFLD Open served as a powerful trust-building mechanism. By allowing any citizen to access and verify the same foundational data the government was using, the portal fostered accountability and a shared, fact-based understanding of national challenges.
The Shutdown: A Reversal of Philosophy and a Return to Fragmentation
The end came abruptly. On June 24, 2025, the community was notified that HIFLD Open would be discontinued by September 30, 2025. The timeline was later revised, accelerating the shutdown to August 26, 2025. The official rationale provided by DHS was startling in its brevity and dismissiveness: hosting the site and providing "Public domain data for community preparedness, resiliency, research, and more" was simply "no longer a priority for their mission".
This decision represented a profound and jarring reversal of the "Whole of Nation" philosophy that DHS itself had championed for years. The very proactive, often invisible work of planning and analysis that HIFLD Open was designed to support was deemed non-essential. The move was particularly confounding given the department's obligations under federal law. An August 2025 report from the DHS Office of Inspector General (OIG) lauded the department for its "significant progress" in fulfilling the responsibilities of the Geospatial Data Act of 2018 (GDA). The report specifically praised DHS for meeting the requirement to ensure its geospatial products "can be readily shared with other Federal and non-Federal users". The decision to shutter HIFLD Open, announced just months prior to the report's publication, stands in stark and hypocritical contradiction to the spirit and letter of the GDA, revealing a deep disconnect between high-level policy compliance and on-the-ground operational decisions within the same department.
The proposed alternative to the portal only deepened the community's dismay. In place of a curated, centralized, and standardized platform, users were offered a "crosswalk"—a spreadsheet with links to the original data sources scattered across various agency websites. This solution willfully ignores the core value that HIFLD provided. It forces every individual user—every emergency manager, academic researcher, and local planner—to become their own data aggregator. They must now hunt for data across dozens of disparate federal portals, each with its own format, update cycle, quality control standard (or lack thereof), and terms of use. This is not a replacement; it is a regression. It dismantles two decades of progress and intentionally recreates the very "M: drive problem" of data fragmentation and incompatibility that HIFLD was born to solve.
Furthermore, the transition was not a simple migration of public data to a more secure platform. The official crosswalk revealed that many of the open layers were not being migrated to the restricted HIFLD Secure site; they were simply being removed from easy public access altogether, creating a new and unnecessary digital divide between official government partners and the broader public they serve.
The Digital Scramble: Community Efforts to Save a Public Good
The announcement of the shutdown triggered an immediate and ad-hoc scramble within the geospatial community to preserve what was about to be lost. Individuals and groups, coordinating via platforms like Reddit, blogs, and professional listservs, began a frantic effort to download and archive the 300-plus datasets before the deadline. One user on Reddit spent two days crawling over 340 data layers to make them available via a public Google Drive folder, noting the lack of transparency and concluding, "better safe than sorry". Another blogger created and shared a detailed spreadsheet with direct download links to help others in their archival efforts, and began coordinating with the Data Rescue Project to host the saved copies elsewhere.
This specific, urgent effort to save the HIFLD data connects to a broader, more formalized movement that has gained momentum in recent years. The "Data Rescue Project" is a volunteer organization of librarians, archivists, researchers, and technologists focused on preserving at-risk federal data. This movement gained significant impetus from concerns that politically sensitive data, particularly related to climate change and public health, could be removed from government websites due to shifting administrative priorities. These groups have developed an independent infrastructure to safeguard vital federal data, creating tools like the Data Rescue Tracker and repositories like DataLumos to preserve and provide continued public access to datasets that have been taken down. Other initiatives, such as the Internet Archive's "End of Term Web Archive," have been systematically crawling federal websites since 2008 to preserve a record of government information across presidential administrations.
The desperate, grassroots campaign to archive HIFLD Open before it vanished is both a testament to the immense perceived value of the data and a stark indictment of the government's decision to dismantle a vital piece of public digital infrastructure. It demonstrates that when the official stewards of public data abdicate their responsibility, a passionate community will attempt to step into the breach. However, this reactive, piecemeal approach is not a sustainable model for ensuring the long-term availability of critical national data. It is a stopgap, a digital lifeboat launched from a sinking ship. A more permanent, resilient, and community-owned vessel is required.
Surveying the Terrain: Evaluating OpenStreetMap and Overture Maps as a New Foundation
The demise of HIFLD Open necessitates a fundamental rethinking of how national infrastructure data is stewarded. A direct, government-funded replacement would likely be susceptible to the same political and budgetary whims that led to the original's downfall. A truly resilient successor must be rooted in a different paradigm: a distributed, community-owned model that is antifragile by design. Fortunately, the open geospatial ecosystem has matured significantly, offering powerful foundational platforms upon which such a successor could be built. The two most viable candidates are the long-standing, community-driven OpenStreetMap (OSM) and the newer, corporate-backed Overture Maps Foundation. A rigorous comparative analysis of these two platforms is essential to architecting a workable blueprint for the Phoenix Project.
The Digital Commons: OpenStreetMap (OSM)
OpenStreetMap is a free, open map database of the world, created and maintained by a global community of volunteers through open collaboration. Since its founding in 2004, it has grown into the largest crowdsourced, open database on the planet, a vibrant "digital commons" for geospatial information.
Governance and Community: OSM's governance is fundamentally bottom-up and community-centric. The project is supported by the OpenStreetMap Foundation (OSMF), an international non-profit organization registered in the UK. The OSMF's role is to support the project by maintaining its core infrastructure (servers, domains), handling legal and financial matters, and organizing the annual State of the Map conference, but it does not control the map's content. The foundation is run by a board elected by its members and operates through a series of volunteer-led Working Groups (e.g., Operations, Data, Licensing). This structure is replicated at national levels through official Local Chapters, such as OpenStreetMap U.S., which has its own board and working groups focused on local community needs and partnerships, including a Government Working Group that liaises with public agencies. The core principle is that the data is owned and maintained collectively by its editors, with no special privileges or super-users; edits go live immediately and are subject to peer review and improvement by the entire community.
Data Model: OSM employs a unique and highly flexible data model that differs markedly from traditional GIS layers. It is a topological data structure built on three basic primitives: nodes (points), ways (ordered lists of nodes, forming lines or polygons), and relations (which group other elements together). The meaning of these geometric features is defined by a "folksonomy" of key-value pairs called tags. For example, a way might be tagged with highway=motorway and name="Interstate 5". This free-tagging system is enormously extensible, allowing an almost unlimited number of attributes to describe each feature. While informal standards for common features are maintained on the community wiki, there is no rigid, pre-ordained schema, which allows the map to evolve organically to capture new types of information.

Data Quality and Validation: Given its open, collaborative nature, data quality in OSM is a dynamic process rather than a static guarantee. The quality assurance ecosystem is multi-faceted. It begins with the community itself, where experienced mappers often review the work of newer contributors, providing feedback and corrections. This is often formalized in humanitarian mapping projects through the HOT Tasking Manager, which uses a workflow where one person maps a task and an experienced validator checks it. This is supplemented by a powerful suite of automated and semi-automated tools. The JOSM desktop editor includes a built-in Validator that can check for a wide range of topological and tagging errors (e.g., disconnected roads, overlapping buildings). Web-based tools like Osmose scan the entire database for potential issues and present them on a map for mappers to fix. This combination of peer review and automated checks has allowed OSM to achieve a level of detail and currency that often surpasses proprietary datasets, particularly for features like pedestrian and cycling infrastructure.
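As a minimal sketch (assuming nothing about OSM's actual storage internals or API), the three primitives and the tagging folksonomy can be expressed in a few lines of Python; the toy validator at the end echoes the kind of checks JOSM's Validator performs:

```python
from dataclasses import dataclass, field

# Illustrative model of OSM's core primitives. Real OSM elements also
# carry version, changeset, timestamp, and user metadata.

@dataclass
class Node:
    id: int
    lat: float
    lon: float
    tags: dict = field(default_factory=dict)

@dataclass
class Way:
    id: int
    node_ids: list          # ordered list of node references
    tags: dict = field(default_factory=dict)

    def is_closed(self) -> bool:
        # A way whose first and last nodes coincide forms a polygon.
        return len(self.node_ids) > 1 and self.node_ids[0] == self.node_ids[-1]

# The meaning of a geometry comes entirely from its tags.
i5 = Way(id=1001, node_ids=[1, 2, 3],
         tags={"highway": "motorway", "name": "Interstate 5"})

def validate(way: Way) -> list:
    """Toy validator in the spirit of JOSM's built-in checks."""
    problems = []
    if len(way.node_ids) < 2:
        problems.append("way has fewer than two nodes")
    if not way.tags:
        problems.append("way is untagged")
    return problems

print(validate(i5))  # -> []
```

The design point is that geometry and semantics are decoupled: the same `Way` structure describes a motorway, a pipeline, or a hospital footprint, distinguished only by its tags.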
Licensing: OSM data is published under the Open Database License (ODbL). This is a "share-alike" license, meaning that any public use of a database derived from OSM must, in turn, be shared under the same license. This legal mechanism is designed to protect the database as a commons, ensuring that improvements and additions flow back to the community and preventing the data from being enclosed within proprietary products.
Relevance to Infrastructure: The OSM community has long been active in mapping critical infrastructure. The flexible tagging system allows for detailed representation of features across various sectors. Specialized projects like OpenInfraMap demonstrate the richness of this data by creating dedicated visualizations of power grids, telecommunications networks, and pipelines directly from the live OSM database. Tags like power, telecom, man_made, and emergency are well-established, providing a robust, if informal, schema for many of the asset types previously found in HIFLD.
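To make this concrete, here is a hedged sketch of how a HIFLD-style layer could be pulled from the live OSM database using Overpass QL, the query language served by public Overpass API instances. The tag pairs are real OSM conventions; the endpoint choice, timeout, and country filter are assumptions:

```python
# Build an Overpass QL query string for one infrastructure "layer".
# The query is not executed here; it would be POSTed to OVERPASS_URL
# as the `data` form field.

OVERPASS_URL = "https://overpass-api.de/api/interpreter"  # a public instance

def layer_query(key: str, value: str, iso_code: str = "US") -> str:
    """Query every node, way, and relation carrying the given tag
    inside one country, identified by its ISO 3166-1 code."""
    return (
        '[out:json][timeout:180];\n'
        f'area["ISO3166-1"="{iso_code}"][admin_level=2]->.a;\n'
        '(\n'
        f'  node["{key}"="{value}"](area.a);\n'
        f'  way["{key}"="{value}"](area.a);\n'
        f'  relation["{key}"="{value}"](area.a);\n'
        ');\n'
        'out center;'   # return ways/relations with a center point
    )

# e.g. every electric substation OSM knows about in the United States:
q = layer_query("power", "substation")
```

The same one-liner covers many former HIFLD layers by swapping the tag pair: `layer_query("emergency", "fire_hydrant")`, `layer_query("man_made", "pipeline")`, and so on.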
The Corporate Consortium: Overture Maps Foundation
Launched in December 2022, the Overture Maps Foundation represents a different approach to open map data. It is a collaborative effort founded by Amazon Web Services (AWS), Meta, Microsoft, and TomTom, and is organized under the Linux Foundation. Its stated goal is to create reliable, easy-to-use, and interoperable open map data to power commercial and next-generation map products.
Governance and Membership: Overture's governance is more top-down and corporate-driven than OSM's. It is led by its founding members, and its structure consists of two core Working Groups (Map Data and Schema) which oversee various Task Forces (e.g., Buildings, Transportation, Places). These groups are led by senior engineers and product managers from the member companies, with support from Overture staff. While membership has expanded to include other companies, including Esri, the strategic direction is set by the consortium of major technology firms.
Data Model: In contrast to OSM's folksonomy, Overture employs a highly structured, schema-driven data model. The data is organized into distinct "themes"—Addresses, Base, Buildings, Divisions, Places, and Transportation. The schema is formally defined using JSON schema standards, and the data is distributed in a cloud-native, analysis-ready format called GeoParquet. A cornerstone of Overture's technical strategy is the Global Entity Reference System (GERS). GERS assigns a unique, stable identifier to every real-world feature in the database. This is designed to solve the persistent and costly problem of data conflation—the process of merging different datasets that refer to the same real-world objects. With GERS, a building in Overture has a single ID that can be used to link it to data from any other source, such as an address database or a property assessment file.
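A toy example makes the point about GERS. Because every feature carries a stable identifier, independently maintained tables can be joined by key instead of by fragile geometric matching. All IDs and attribute values below are invented for illustration:

```python
# Conflation-by-identifier: two datasets that reference the same
# real-world buildings are merged with a plain dictionary join.
# GERS IDs and attributes here are fabricated examples.

overture_buildings = {
    "08b2a100d2c5afff0200": {"subtype": "medical", "height_m": 24.0},
    "08b2a100d2c58fff0200": {"subtype": "education", "height_m": 12.5},
}

# A hypothetical county assessment file keyed by the same GERS IDs.
assessments = [
    {"gers_id": "08b2a100d2c5afff0200", "assessed_value": 18_500_000},
]

def enrich(buildings: dict, records: list) -> dict:
    """Left-join external records onto Overture entities by GERS ID."""
    out = {gid: dict(attrs) for gid, attrs in buildings.items()}
    for rec in records:
        gid = rec["gers_id"]
        if gid in out:
            out[gid].update({k: v for k, v in rec.items() if k != "gers_id"})
    return out

joined = enrich(overture_buildings, assessments)
```

Without a shared identifier, the same merge would require matching footprints by geometry and name, a far more error-prone process.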
Data Quality and Curation: Overture's primary value proposition is data quality and ease of use for enterprise applications. It is not a platform for direct, crowdsourced editing. Instead, it produces its datasets by ingesting and conflating data from multiple sources, including OSM, government open data, and data contributed by its member companies. This raw data is then processed through centralized, platform-agnostic pipelines that perform validation, quality checks, and standardization to produce a clean, consistent, and interoperable final product. The quality philosophy is that accuracy improves as the data is used in large-scale applications, which in turn provides feedback to correct errors.
Licensing: Overture data is primarily licensed under the Community Database License Agreement – Permissive v2 (CDLA-Permissive). This is a non-share-alike, "permissive" license. It allows users to take the data and use it in any way they see fit, including in proprietary products, without an obligation to share their modifications back with the community. This choice is explicitly designed to maximize adoption and ease of use in commercial applications. Data derived from OSM, however, remains under the ODbL to comply with its share-alike terms.
Relevance to Infrastructure: Infrastructure is a key component of Overture's "base" theme. The infrastructure feature type is explicitly defined in the schema and includes subtypes like power, communication, bridge, and airport, which are largely derived from corresponding OSM tags. The structured nature of this data, combined with the building footprints and transportation networks in other themes, provides a strong, analysis-ready foundation for many of the datasets formerly in HIFLD.
Analysis and Recommendation for the Phoenix Project
The choice between OSM and Overture is not a simple one; they are fundamentally different projects with different goals, philosophies, and strengths. OSM represents a "digital commons"—a shared resource built and governed by its users, designed for mass participation, and protected by a share-alike license intended to preserve its openness. Overture is more akin to a "digital public utility"—engineered by an industrial consortium to provide a reliable, standardized product for mass consumption, using a permissive license to encourage widespread commercial adoption.
A successor to HIFLD cannot simply choose one model over the other. It must harness the vibrant, on-the-ground energy of the commons while leveraging the reliability and structure of the utility. Attempting to build a new, separate community from scratch would be a monumental task, while relying solely on a corporate consortium for data vital to national security raises concerns about long-term control and alignment of priorities.
Therefore, the most viable path forward is a hybrid approach:
Use OpenStreetMap as the primary platform for community contribution, data editing, and governance. Its established global community, proven collaborative model, and truly open, non-profit governance structure provide the ideal environment for a grassroots, mission-driven project to take root. The OSMF's Local Chapter model, particularly OSM U.S., offers a ready-made legal and organizational framework.
Leverage Overture Maps data as a high-quality, structured reference and validation layer. Overture's clean, conflated datasets for buildings, transportation, and base infrastructure provide an excellent baseline. Its GERS IDs offer the critical technical mechanism needed to link community-contributed data to a stable, globally-referenced entity.
The critical technical linchpin for this hybrid model is the establishment of a robust link between OSM features and Overture's GERS IDs. While Overture already uses OSM data, the real breakthrough for a HIFLD successor would be a dedicated, community-driven effort to systematically map GERS IDs to their corresponding OSM feature IDs. This would create a powerful synergy, allowing an analyst to start with a high-confidence building footprint from Overture (with its stable GERS ID) and then enrich it with detailed, up-to-the-minute attributes—such as the number of hospital beds, the type of construction, or the presence of a backup generator—contributed and maintained by the OSM community. This creates a "best of both worlds" dataset that is both structurally sound and dynamically detailed.
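The enrichment step described above can be sketched as a simple keyed merge, assuming the proposed overture:id bridge tag is in place. The values are invented, though beds, emergency, and generator:source are established OSM tagging practice:

```python
# "Best of both worlds" join: an OSM feature carrying the proposed
# overture:id tag lets its community-maintained attributes ride on
# Overture's stable, versioned footprint.

overture_footprint = {"gers_id": "08b2a100d2c5afff0200",
                      "theme": "buildings", "subtype": "medical"}

osm_hospital_tags = {
    "amenity": "hospital",
    "beds": "220",
    "emergency": "yes",
    "generator:source": "diesel",           # on-site backup power
    "overture:id": "08b2a100d2c5afff0200",  # the bridge tag proposed here
}

def merge_if_linked(footprint: dict, osm_tags: dict) -> dict:
    """Attach OSM attributes to an Overture entity when the bridge tag
    matches its GERS ID; otherwise return the footprint unchanged."""
    if osm_tags.get("overture:id") != footprint["gers_id"]:
        return dict(footprint)
    merged = dict(footprint)
    merged["osm_attributes"] = {k: v for k, v in osm_tags.items()
                                if k != "overture:id"}
    return merged

record = merge_if_linked(overture_footprint, osm_hospital_tags)
```

The analyst ends up with one record: a stable Overture entity whose operational details (bed count, emergency capability, backup power) stay as fresh as the OSM community keeps them.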
Table 1: Comparative Analysis of OpenStreetMap and Overture Maps as a Foundation for a HIFLD Successor
| Criterion | OpenStreetMap (OSM) | Overture Maps |
| --- | --- | --- |
| Governance Model | Bottom-up, community-driven non-profit (OSMF). | Top-down, corporate consortium led by founding members. |
| Primary Goal | Create a free, open map of the world as a "digital commons." | Create reliable, interoperable open map data for commercial products. |
| Data Model | Flexible, user-defined "folksonomy" of key-value tags. | Highly structured, pre-defined schema with stable identifiers (GERS). |
| Licensing | ODbL (Open Database License), share-alike. | CDLA-Permissive v2, permissive (non-share-alike). |
| Quality Control | Decentralized; community peer review and automated tools. | Centralized; ingestion and conflation from multiple sources. |
| Community Structure | Global, volunteer-based, with local chapters. Mass participation. | Corporate members and partners. Enterprise-focused. |
| Key Strength | Vibrant community, high detail, and truly open governance. | Data quality, structural consistency, and ease of use for enterprise. |
| Key Weakness | Inconsistent data quality and structure. | Reliance on corporate priorities; less community control. |
This comparative analysis makes it clear that neither platform alone is a perfect replacement for HIFLD. OSM provides the community and governance model essential for a resilient public good, while Overture provides the structured, high-quality data and stable identifiers needed for serious analysis. The blueprint for the Phoenix Project must, therefore, be a bridge that connects these two powerful worlds.
A Blueprint for Resilience: Architecting a Community-Owned Critical Infrastructure Atlas
The failure of HIFLD Open was a failure of a centralized model. The solution is not to replicate that model outside of government, but to architect a decentralized, community-owned system that is resilient by design. This blueprint outlines the core components of such a system: a hybrid foundational data layer, a powerful semantic ontology for analysis, a robust governance and stewardship framework, and a novel approach to establishing data trust through verifiable provenance.
The Foundational Layer: A Hybrid Contribution Model
The base of the Phoenix Project will be built upon the hybrid OSM-Overture model recommended in the previous section. This is not simply a matter of choosing a platform, but of designing a workflow that leverages the distinct strengths of each.
The proposed workflow would operate as follows:
Community Contribution to OSM: The primary data creation and editing activity will occur within the existing OpenStreetMap ecosystem. A dedicated project, perhaps managed through the HOT Tasking Manager or a similar tool, will be established to guide and coordinate the mapping of critical infrastructure features. Mappers will contribute data directly to OSM using established and enhanced tagging conventions for infrastructure, such as those used by OpenInfraMap (power=plant, man_made=pipeline, etc.). This approach taps into OSM's existing global community of millions of mappers and its mature editing tools and workflows.
Overture Data as a Reference Layer: In parallel, a dedicated automated process will ingest Overture's monthly data releases. The themes of particular interest will be Buildings, Transportation, and the Base theme's infrastructure features. This provides a high-quality, globally consistent, and professionally vetted baseline dataset.
The GERS-OSM Bridge: The crucial step is the creation of a "bridge" that links Overture features to their OSM counterparts. This will be a significant technical undertaking, involving automated and semi-automated methods to correlate features based on geometry and attributes. Once a link is established, the Overture GERS ID will be added as a tag to the corresponding OSM feature (e.g., overture:id=...). This creates a persistent, stable link between the dynamic, community-edited OSM feature and the curated, versioned Overture entity. This bridge allows data to flow in both directions: Overture data can be used to identify gaps or errors in OSM, while the rich, detailed attributes from OSM can be joined to the stable Overture entities for analysis.
This hybrid model creates a virtuous cycle. The community's work enriches a global commons, while the structured data from the corporate consortium provides a high-quality scaffold and a mechanism for interoperability that would be difficult for a volunteer project to create on its own.
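To make the bridge step concrete, here is a minimal conflation sketch in Python, matching features by centroid proximity and name similarity and emitting the proposed overture:id tag. Everything shown is illustrative: the feature records, thresholds, and the GERS ID are invented for the example, and a production pipeline would use real geometry libraries and Overture's actual identifiers.

```python
import math
from difflib import SequenceMatcher

def distance_m(a, b):
    """Approximate ground distance in metres between two (lat, lon) points
    using an equirectangular approximation (adequate at conflation scales)."""
    lat = math.radians((a[0] + b[0]) / 2)
    dy = (a[0] - b[0]) * 111_320
    dx = (a[1] - b[1]) * 111_320 * math.cos(lat)
    return math.hypot(dx, dy)

def match_features(osm_feats, overture_feats, max_dist_m=50, min_name_sim=0.6):
    """Propose OSM<->Overture links; returns {osm_id: gers_id}."""
    links = {}
    for o in osm_feats:
        best = None
        for v in overture_feats:
            d = distance_m(o["centroid"], v["centroid"])
            if d > max_dist_m:
                continue
            sim = SequenceMatcher(None, o.get("name", "").lower(),
                                  v.get("name", "").lower()).ratio()
            if sim < min_name_sim:
                continue
            score = sim - d / max_dist_m  # prefer close, similarly named pairs
            if best is None or score > best[0]:
                best = (score, v["gers_id"])
        if best:
            links[o["osm_id"]] = best[1]
    return links

# Illustrative inputs; the GERS ID below is made up.
osm = [{"osm_id": "way/123", "name": "Riverside Substation",
        "centroid": (29.951, -90.072)}]
ovt = [{"gers_id": "GERS-EXAMPLE-0001", "name": "Riverside substation",
        "centroid": (29.9512, -90.0721)}]

links = match_features(osm, ovt)
for osm_id, gers in links.items():
    print(f"{osm_id} -> add tag overture:id={gers}")
```

In a real workflow the proposed links would be reviewed by a mapper before the tag is written back to OSM, keeping the community in control of the bridge.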
The Semantic Core: A Prioritized Infrastructure Ontology
The raw points, lines, and polygons on a map, even with rich tags, are just geometry. The true power of HIFLD was in its curated nature, which enabled sophisticated analysis for preparedness and resilience. To replicate and surpass this capability, the Phoenix Project needs more than just tags; it requires a formal semantic layer—an ontology—that allows for computational reasoning about dependencies, interconnections, and cascading failures. Simply importing old HIFLD data into OSM would be a "dumb" transfer of geometry; the ontology provides the "smart" analytical engine.
To this end, this blueprint proposes the development of the Open Resilience and Infrastructure Ontology (ORIO).
Technical Foundation: ORIO will be developed using open standards from the World Wide Web Consortium (W3C), such as the Web Ontology Language (OWL) and the Resource Description Framework (RDF). Geospatial components will be modeled using standards like GeoSPARQL, which allows for complex spatial queries within the semantic framework.
Prioritization Framework: The structure of ORIO will not be arbitrary. It will be explicitly organized and prioritized according to FEMA's seven Community Lifelines framework. This framework is the standard doctrine used by emergency managers across the United States to stabilize incidents and is organized around providing the most fundamental services to a community: Safety & Security; Food, Water, Shelter; Health & Medical; Energy; Communications; Transportation; and Hazardous Materials. By aligning the ontology with this operational framework, the Phoenix Project ensures that its data is immediately relevant and useful to its primary target audience.
Operational Mechanism: The ontology will function as a semantic layer on top of the OSM/Overture base data. For example:
An OSM feature with the tag power=substation would be defined in the ontology as an instance of the class orio:ElectricalSubstation.
orio:ElectricalSubstation would be a subclass of orio:EnergyTransmissionAsset, which in turn is a subclass of the top-level orio:EnergyLifelineAsset.
This class would have defined data properties (e.g., orio:hasVoltage, orio:hasTransformerCount) and object properties that define its relationship to other entities (e.g., orio:isFedBy which links to an orio:ElectricalTransmissionLine, and orio:feeds which links to an orio:DistributionNetwork).
Leveraging Existing Work: The development of ORIO will not start from scratch. It will build upon the significant body of existing work in open-source domain ontologies for specific sectors, such as the Open Energy Ontology (OEO) for the power sector and various transportation ontologies for modeling road and transit networks.
This ontological layer is the true successor to HIFLD's analytical power. It enables semantic queries that are impossible with simple tags alone, such as, "Show all hospitals (Health & Medical Lifeline) that depend on a single electrical substation (Energy Lifeline) located within a 100-year flood plain, and highlight the primary evacuation routes (Transportation Lifeline) from those facilities."
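As a rough illustration of the mechanism described above, the following pure-Python sketch stands in for what an OWL/RDF reasoner (for example, one driven by rdflib and GeoSPARQL) provides natively: a mapping from raw OSM tags to ORIO classes, and transitive subclass inference. The class names mirror those proposed in this paper; the mapping tables themselves are invented for the example.

```python
# A stand-in for rdfs:subClassOf assertions in the ORIO ontology.
SUBCLASS_OF = {
    "orio:ElectricalSubstation": "orio:EnergyTransmissionAsset",
    "orio:EnergyTransmissionAsset": "orio:EnergyLifelineAsset",
    "orio:Hospital": "orio:MedicalFacility",
    "orio:MedicalFacility": "orio:HealthMedicalLifelineAsset",
}

# A stand-in for the tag-to-class mapping layer over OSM data.
TAG_TO_CLASS = {
    ("power", "substation"): "orio:ElectricalSubstation",
    ("amenity", "hospital"): "orio:Hospital",
}

def classify(osm_tags):
    """Map raw OSM tags to an ORIO class, if one is defined."""
    for (key, value), cls in TAG_TO_CLASS.items():
        if osm_tags.get(key) == value:
            return cls
    return None

def is_a(cls, ancestor):
    """Transitive subclass check (what rdfs:subClassOf inference gives for free)."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

feature = {"power": "substation", "voltage": "115000"}
cls = classify(feature)
print(cls, is_a(cls, "orio:EnergyLifelineAsset"))
```

The same subclass walk is what lets a lifeline-level query ("all Energy Lifeline assets") pick up a substation that was only ever tagged power=substation.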
The Broader FOSS4G Ecosystem: Beyond the Database
The Phoenix Project's hybrid data layer and semantic core represent the heart of the initiative, but its utility is magnified when it is integrated into the broader Free and Open Source Software for Geospatial (FOSS4G) ecosystem. The project is not just about creating a static database; it's about building a living data infrastructure that can be easily accessed, shared, and utilized.
A key component for this dissemination is GeoServer, a robust, open-source server that publishes data from any major spatial data source using open standards. By connecting GeoServer to the project's underlying PostGIS database (which is populated by tools like osm2pgsql), the curated infrastructure data can be served to a vast array of applications via standard OGC (Open Geospatial Consortium) web services like the Web Map Service (WMS) and Web Feature Service (WFS). This makes the data immediately usable in desktop GIS software like QGIS, web mapping libraries like OpenLayers, and countless other analytical tools without requiring users to interact directly with the database. This standards-based approach ensures maximum interoperability and lowers the barrier to entry for users across all sectors.
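For instance, a client could pull substation features from a project GeoServer instance with a standard WFS 2.0 GetFeature request. The sketch below only builds the request URL; the endpoint and layer name are hypothetical, while the query parameters follow the OGC WFS 2.0 key-value encoding.

```python
from urllib.parse import urlencode

# Hypothetical GeoServer endpoint and workspace, for illustration only.
GEOSERVER = "https://geo.example.org/geoserver/phoenix/ows"

def wfs_getfeature_url(type_name, bbox=None, max_features=None):
    """Build an OGC WFS 2.0 GetFeature request URL (KVP encoding)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "outputFormat": "application/json",
    }
    if bbox:
        params["bbox"] = ",".join(str(v) for v in bbox)
    if max_features:
        params["count"] = max_features
    return f"{GEOSERVER}?{urlencode(params)}"

url = wfs_getfeature_url("phoenix:substations",
                         bbox=(-90.2, 29.8, -89.9, 30.1), max_features=100)
print(url)
```

Because the interface is a published OGC standard, the same URL pattern works unchanged from QGIS, OpenLayers, or any scripting environment that can fetch JSON.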
Fueling the Next Generation of Geospatial AI
The true long-term potential of the Phoenix Project extends beyond direct human analysis and into the realm of artificial intelligence and machine learning. The combination of a high-quality baseline (Overture), rich community-sourced detail (OSM), a formal analytical structure (ORIO), and a transparent quality framework creates an unparalleled source of training data for geospatial AI.
This semantically-rich, validated data can fuel a new generation of analytical capabilities:
Automated Change Detection: Machine learning models, particularly deep learning approaches, can be trained on the project's data to automatically extract features like buildings and roads from high-resolution satellite or aerial imagery. By comparing newly extracted features against the existing database, the system can automatically flag new construction, demolished structures, or infrastructure damage after a disaster, creating a near-real-time update cycle.
Predictive Analytics: The dataset enables the training of models to predict socio-economic indicators based on the density, type, and condition of infrastructure. This could help planners identify underserved communities, forecast areas of rapid growth, or model the economic impact of a proposed infrastructure project.
Vulnerability and Risk Modeling: AI can analyze the complex network of dependencies encoded in the ORIO ontology to identify hidden vulnerabilities and predict cascading failures. For example, a model could learn the patterns of infrastructure stress that precede power outages or transportation network collapses, allowing for proactive mitigation efforts.
AI-Assisted Mapping: The data serves as a powerful training set for tools that assist human mappers. AI can suggest features to be mapped from imagery, pre-populate attributes, and flag potential errors, dramatically increasing the speed and accuracy of community contributions, as has been demonstrated in humanitarian mapping contexts.
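The comparison step behind automated change detection can be reduced to a simple sketch: ML-extracted footprints are checked against the existing database by nearest-neighbour distance, and unmatched detections are flagged for human review. The coordinates and tolerance below are illustrative, and a real pipeline would compare polygons rather than centroids.

```python
import math

def dist_m(a, b):
    """Approximate metres between two (lat, lon) points."""
    lat = math.radians((a[0] + b[0]) / 2)
    return math.hypot((a[0] - b[0]) * 111_320,
                      (a[1] - b[1]) * 111_320 * math.cos(lat))

def flag_changes(extracted, existing, tol_m=15):
    """Split ML-extracted footprints into probable new construction
    vs. already-mapped features, by nearest-neighbour distance."""
    new, matched = [], []
    for e in extracted:
        if any(dist_m(e, x) <= tol_m for x in existing):
            matched.append(e)
        else:
            new.append(e)
    return new, matched

existing = [(29.9500, -90.0700)]
extracted = [(29.95001, -90.07001),   # same building, re-detected
             (29.9600, -90.0600)]     # nothing nearby: flag for review
new, matched = flag_changes(extracted, existing)
print(f"{len(new)} candidate new feature(s), {len(matched)} matched")
```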
By creating a trusted, machine-readable representation of the nation's infrastructure, the Phoenix Project becomes more than a map; it becomes an engine for automated insight and intelligent decision-making.
Governance and Stewardship: An Open Infrastructure Consortium
A project of this scale and importance requires a clear and transparent governance structure to guide its development, ensure its sustainability, and build trust among its contributors and users. A purely informal, ad-hoc structure would not be sufficient.
This blueprint proposes the formation of a new governance body, the Open Infrastructure Consortium (OIC). The most effective path to establishing the OIC would be to structure it as a charter project under OpenStreetMap U.S. This model, successfully pioneered by OpenHistoricalMap, allows a specialized project to benefit from the established non-profit legal and administrative framework of an official OSM Local Chapter while maintaining its own distinct mission, advisory group, and development team. This approach firmly roots the project within the OSM community while giving it the autonomy it needs to focus on its specific goals.
The proposed structure of the OIC would include:
A Steering Committee: This group would provide high-level strategic direction. It should include representatives from the core OSM community, the Overture Maps Foundation, academic institutions with expertise in disaster management and GIS, and key non-profit partners like the National Alliance for Public Safety GIS (NAPSG) Foundation.
A Technical Working Group: This group of volunteer and potentially sponsored developers would be the custodians of the project's technical assets. Their responsibilities would include maintaining and extending the ORIO ontology, managing the data ingestion and conflation pipelines, and developing the validation tools and workflows.
A Government and Industry Liaison Group: Modeled directly on the successful OSM U.S. Government Working Group, this committee's mission would be to seek out mutually beneficial relationships with authoritative data providers. It would engage with federal agencies like FEMA, CISA, USGS, and DOT, as well as state and local governments, to encourage the contribution of open data and to establish partnerships for data validation.
A Framework for Trust: Multi-Tiered Data Validation
A key challenge for any community-driven data project, especially one dealing with information critical to life and safety, is establishing trust and communicating data quality. HIFLD derived its trust from its centralized, authoritative source: the U.S. Department of Homeland Security. A decentralized project cannot replicate this model of authority. Instead, it must build trust through a transparent and robust system of verifiable provenance.
The Phoenix Project will not claim that every piece of data is perfect. Instead, it will transparently communicate to the user the origin and validation status of each feature, allowing them to make informed decisions based on their specific needs. This will be accomplished through a proposed three-tiered quality model:
Tier 1 (Community Contributed): This is the baseline data as it exists in OpenStreetMap. It is subject to the standard, organic community validation processes, including peer review and automated error-checking tools. This data is incredibly valuable for its currency and detail but has variable consistency.
Tier 2 (Expert Vetted): This tier represents data that has been reviewed and flagged as "vetted" by members of the Open Infrastructure Consortium with verified domain expertise. For example, a power systems engineer could validate the connectivity of a substation, or a local emergency manager could confirm the location of an evacuation shelter. This process would add a tag to the OSM data (e.g., oic:vetting_level=expert) and would be managed through a dedicated workflow.
Tier 3 (Authoritatively Linked): This represents the highest level of confidence. Data in this tier is programmatically linked, via its GERS ID or another stable identifier, to an official open dataset from a government agency or other authoritative source. For instance, a power plant feature in OSM could be linked to its corresponding entry in the Energy Information Administration's (EIA) database. This tier would include metadata on the source, the date of the last update, and a direct link to the source record.
This tiered system shifts the paradigm of trust. Instead of asking users to "trust us because of who we are," it enables them to "trust the data because you can see precisely where it came from and how it has been checked." An emergency planner preparing for an imminent disaster could filter their analysis to use only Tier 3 data. A researcher studying long-term infrastructure trends might be comfortable using Tier 2 and Tier 1 data. This flexible, transparent approach to quality is the only viable way for a community-driven project to provide data for mission-critical applications.
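In practice, a feature's tier could be computed directly from its tags, making tier-based filtering a one-line operation for any downstream tool. The sketch below assumes the tag conventions proposed in this paper (oic:vetting_level plus a hypothetical oic:authoritative_ref tag for Tier 3 links); the features and the authoritative URL are invented for illustration.

```python
def tier(feature_tags):
    """Assign the proposed quality tier from a feature's tags.
    oic:authoritative_ref and oic:vetting_level are the tag names
    assumed by this paper's three-tier model."""
    if "oic:authoritative_ref" in feature_tags:
        return 3  # programmatically linked to an authoritative record
    if feature_tags.get("oic:vetting_level") == "expert":
        return 2  # reviewed by a verified domain expert
    return 1      # baseline community-contributed data

def filter_by_min_tier(features, min_tier):
    return [f for f in features if tier(f["tags"]) >= min_tier]

features = [
    {"name": "Mercy Hospital",
     "tags": {"amenity": "hospital",
              "oic:authoritative_ref": "https://data.example.gov/hosp/42"}},
    {"name": "Riverside Substation",
     "tags": {"power": "substation", "oic:vetting_level": "expert"}},
    {"name": "Cell tower (new)",
     "tags": {"man_made": "mast"}},
]

for f in filter_by_min_tier(features, 2):
    print(f["name"], "-> Tier", tier(f["tags"]))
```

An emergency planner would run the same filter with min_tier=3; a researcher might accept min_tier=1 and weight results by tier instead.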
Table 2: Sample Structure of the Open Resilience and Infrastructure Ontology (ORIO) based on FEMA Lifelines
Proposed Governance and Funding Model
Component | Proposed Model | Rationale / Key Functions |
---|---|---|
Legal Entity | 501(c)(3) Non-Profit Organization. | Provides legal/financial backbone; shields contributors from liability; can accept tax-deductible donations. |
Governance Board | Elected representatives from membership tiers (e.g., individual, corporate, academic). | Sets strategic direction, manages budget, ensures mission alignment, hires/fires key staff. |
Technical Steering Committee | Merit-based appointment/election of key technical contributors. | Defines data schemas and standards, oversees technical infrastructure, resolves disputes, manages QA/QC. |
Community Management | Hired staff, reporting to the Governance Board. | Moderates forums, organizes events, manages outreach and communications, supports new contributors. |
Primary Funding | Diversified Model: Membership fees, foundation grants, corporate sponsorships, and service contracts. | Ensures financial resilience and prevents corporate capture by avoiding reliance on a single funding source. |
The Business Case for a Public-Private Digital Commons
A blueprint for a technically superior successor to HIFLD is necessary, but not sufficient. The open-source world is littered with promising projects that withered due to a lack of sustained resources. For years, the OpenStreetMap community and other open-source ecosystems have highlighted the challenges of relying on volunteer efforts and inconsistent donations to maintain what has become critical digital infrastructure. The Phoenix Project cannot afford to repeat this pattern. To be truly resilient, it requires not just a resilient architecture, but a resilient business model.
From Volunteer Project to Essential Infrastructure
The Phoenix Project, by directly addressing a mission-critical need created by a government failure, transcends the definition of a typical volunteer mapping project. It is, by its very nature, essential public infrastructure. Open source software already runs much of the world's critical systems, from government services to national security. By providing the foundational data layer for understanding and managing the nation's physical infrastructure, the Phoenix Project assumes a similar level of importance. Treating it as a hobbyist endeavor funded by spare change and goodwill is a recipe for failure. Instead, it must be recognized and funded as the public good it is.
A Hybrid Funding Model for a Hybrid Data Platform
A sustainable future for the Phoenix Project requires a diversified, hybrid funding model that mirrors its hybrid data approach, drawing support from the public and private sectors that directly benefit from its existence.
Government Investment and Grants: Federal, state, and local agencies are the most direct beneficiaries of a reliable, open, and detailed national infrastructure map. For these agencies, investing in the Phoenix Project is not charity; it is a highly cost-effective method of procuring essential data and analytical capabilities. Rather than each agency attempting to build and maintain its own disparate datasets, they can pool resources to support a single, shared commons. This can be achieved through direct funding, grants, and procurement contracts that mandate contributions back to the project. The principle of "Public Money, Public Code" should apply: when taxpayer money is used to create or enhance this data, the results should be available to all. Numerous government grant programs, from the NSF to NASA and the NIH, already exist to fund open-source and open-science initiatives, providing a clear pathway for support.
Corporate Sponsorship and Membership: The private sector was a major user of HIFLD Open for everything from logistics and supply chain management to risk analysis and market intelligence. These same corporations have a clear financial incentive to ensure the long-term stability of its successor. The Open Infrastructure Consortium (OIC) can establish a tiered corporate sponsorship program, modeled on the success of organizations like the Open Source Geospatial Foundation (OSGeo) and the Open Geospatial Consortium (OGC). In exchange for financial support, sponsors can receive benefits such as technical advisory roles, brand recognition, and a direct say in the strategic direction of a data platform that is critical to their operations.
Value-Added Services: A common and successful business model for open-source projects is to provide the core data and software for free while offering paid, value-added services to enterprise users. The OIC, or partner organizations within the ecosystem, could offer services such as:
Enterprise Support: Guaranteed service-level agreements (SLAs) for data access, technical support, and training.
Custom Development and Data Extracts: Bespoke development of analytical tools or the creation of specialized, pre-packaged data extracts for specific industries.
Certified Hosting: Providing managed, certified instances of the Phoenix Project database and associated software stack (like GeoServer) for organizations that lack the in-house expertise to manage it themselves.
By combining direct public investment, private sector sponsorship, and a market for professional services, the Phoenix Project can build a stable financial foundation, ensuring it can retain the core staff and maintain the infrastructure necessary to serve the nation for decades to come.
A Call to Action: From Data Rescue to Data Renaissance
The shutdown of HIFLD Open was a failure of stewardship, but it does not have to be a permanent loss. It can, and must, be the catalyst for something better: a transition from a frantic, reactive period of data rescue to a proactive, collaborative era of data renaissance. The blueprint outlined in this report is ambitious but achievable. It provides a path toward an open, community-owned critical infrastructure atlas that is more resilient, more detailed, and more democratically governed than its predecessor. Realizing this vision, however, requires a concerted effort from all corners of the geospatial and resilience communities.
The Path Forward: A Phased Approach
A project of this magnitude cannot be realized overnight. A deliberate, phased approach is necessary to build momentum, demonstrate value, and scale effectively.
Phase 1: Mobilization & Scaffolding (First 6 months): The immediate priority is to formalize the effort and build the foundational structures. This involves:
Forming the Consortium: Officially establishing the Open Infrastructure Consortium (OIC) as a charter project under OpenStreetMap U.S., recruiting the initial members of the Steering Committee and working groups.
Consolidating the Archive: Building upon the ad-hoc community efforts, consolidating the "rescued" HIFLD datasets into a single, well-documented, and publicly accessible archive. This serves as the project's foundational dataset and a valuable public good in its own right.
Initial Ontology Development: The Technical Working Group will begin the development of the Open Resilience and Infrastructure Ontology (ORIO), focusing initially on drafting the schema for one or two key lifelines, such as Energy and Transportation. This development can be based on the existing HIFLD data structure while also drawing on best practices from frameworks like the military's Modernized Integrated Database (MIDB) and adapting existing open ontologies.
Phase 2: Pilot Project & Tooling (Next 12 months): With the initial structures in place, the project must demonstrate its viability through a focused pilot. This involves:
Selecting a Pilot Region: Choose a single state or major metropolitan area with active OSM communities and available government open data to serve as a testbed.
Data Conflation and Linking: For the pilot region, begin the technical work of conflating the rescued HIFLD data, the current OSM data, and the latest Overture data release. Develop and test the initial tools and workflows for linking features and assigning GERS IDs within OSM.
Testing the Tiers: Implement and test the three-tiered validation framework. Engage local domain experts (e.g., from the state's emergency management agency or department of transportation) to participate in the "Tier 2" vetting process and work with state data portals to establish "Tier 3" authoritative links.
Phase 3: Scaling & Expansion (24+ months): Based on the lessons learned from the pilot, the project will scale to a national level. This involves:
Nationwide Rollout: Systematically expand the data conflation and validation process to cover the entire United States, prioritizing regions based on risk and community capacity.
Community and Partner Recruitment: Launch a broad campaign to recruit volunteer mappers, domain experts for the vetting process, and government partners at all levels to contribute data and expertise.
Continuous Improvement: The ORIO ontology will be continuously refined and expanded to cover all seven lifelines in greater detail. The validation tools and data processing pipelines will be improved based on community feedback and technological advancements.
A Challenge to the Stakeholders
This vision cannot be realized by a small group alone. It requires a commitment from the entire ecosystem.
To the Data Rescuers and the GIS Community: Your work to archive HIFLD was a vital act of digital preservation. It was triage. Now is the time to transition from archivists to architects. The passion, technical skills, and collaborative spirit demonstrated in the frantic days before the shutdown are the seed corn from which this new digital commons will grow. Your expertise is needed to build the tools, vet the data, and steward this permanent solution.
To the OpenStreetMap Community: You have built the world's most successful and vibrant collaborative mapping project, a testament to the power of open, distributed contribution. The Phoenix Project offers an opportunity to apply this powerful model to a mission-critical use case that directly supports national and community resilience. By embracing this challenge, you can demonstrate unequivocally the power of your platform not just for navigation or general-purpose mapping, but as essential public infrastructure.
To the Overture Maps Foundation: You are building the enterprise-grade, analysis-ready open map data of the future. Your work on a structured schema and the Global Entity Reference System is a game-changer for data interoperability. Partnering with the Phoenix Project is a powerful way to ensure your data is leveraged for public good. It also creates an invaluable feedback loop, allowing a dedicated community of infrastructure experts to improve and enrich the very data that underpins your commercial members' products.
To Government Agencies (FEMA, CISA, USGS, DOT, and State/Local Counterparts): The shutdown of HIFLD Open created a void in public data access that your own missions depend on filling. Supporting this community-driven effort is not an abdication of responsibility, but a strategic embrace of a more resilient, cost-effective, and collaborative model for stewarding public data. Become a key partner in this endeavor. Contribute your authoritative open data to serve as the foundation for the "Tier 3" validation process. Encourage your subject matter experts to participate in the "Tier 2" vetting process. A thriving, community-maintained infrastructure atlas is a direct asset to your own operational effectiveness and a fulfillment of your mandate under the Geospatial Data Act.
Forging an Antifragile Commons
The story of HIFLD Open is a cautionary tale about the fragility of centralized digital public goods. Its value was immense, but its foundation was brittle. The solution is not to simply ask the government to rebuild the same structure, hoping for a different outcome in the next budget cycle. The solution is to build something fundamentally different: a distributed, community-owned, and semantically rich resource that is antifragile—one that grows stronger and more reliable with every new contributor, every new data source, and every new application.
The Phoenix Project is more than a replacement for a defunct government portal. It is an opportunity to build the shared map that our collective security and resilience truly depend on—a map built not by a single agency, but by the "Whole of Nation" it is meant to serve. The terrain has been surveyed, the blueprint has been drawn. The challenge is now before us. It is time to build.
Humanitarian OSM Team - OpenStreetMap Wiki, accessed September 15, 2025, https://wiki.openstreetmap.org/wiki/Humanitarian_OSM_Team
Spatial cyberinfrastructures, ontologies, and the humanities - PubMed Central (PMC), accessed September 15, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC3078361/
City infrastructure ontologies - NERC Open Research Archive, accessed September 15, 2025, https://nora.nerc.ac.uk/id/eprint/535389/1/1-s2.0-S0198971523000546-main.pdf
INFRASTRUCTURE RESEARCH ONTOLOGIES FINAL REPORT Authors - DAFNI, accessed September 15, 2025, https://www.dafni.ac.uk/wp-content/uploads/2021/05/IRO-final-report-31-03-2021.pdf
OWL Web Ontology Language Reference - W3C, accessed September 15, 2025, https://www.w3.org/TR/owl-ref/
OGC Benefits of Representing Spatial Data Using Semantic and Graph Technologies, accessed September 15, 2025, https://docs.ogc.org/wp/19-078r1/19-078r1.html
CUSEC Critical Infrastructure Template, accessed September 15, 2025, https://risp-cusec.opendata.arcgis.com/pages/cusec-cikr-template
A high-level electrical energy ontology with weighted attributes - ResearchGate, accessed September 15, 2025, https://www.researchgate.net/publication/276075897_A_high-level_electrical_energy_ontology_with_weighted_attributes
OEO Ontology - Open Energy Platform, accessed September 15, 2025, https://openenergyplatform.org/ontology/
OEMA Ontologies, accessed September 15, 2025, https://innoweb.mondragon.edu/ontologies/oema/index-en.html
Ontologies for Transportation Research: A Survey - Enterprise Integration Laboratory (EIL), University of Toronto, accessed September 15, 2025, https://eil.mie.utoronto.ca/wp-content/uploads/2015/06/TransportationOntologiesSurvey_Jan2018.pdf
Transportation System Ontology - GitHub Pages, accessed September 15, 2025, https://enterpriseintegrationlab.github.io/icity/TransportationSystem/doc/index-en.html
A1-D4 Ontology of Transportation Networks - School of Mathematical and Computer Sciences, accessed September 15, 2025, http://www.macs.hw.ac.uk/bisel/rewerse/deliverables/m18/a1-d4.pdf
Applying the LOT methodology to a Public Bus Transport Ontology aligned with Transmodel: Challenges and Results - Semantic Web Journal, accessed September 15, 2025, https://www.semantic-web-journal.net/content/applying-lot-methodology-public-bus-transport-ontology-aligned-transmodel-challenges-and-0
GeoServer - OpenStreetMap Wiki, accessed September 15, 2025, https://wiki.openstreetmap.org/wiki/GeoServer
GeoServer and OpenStreetMap, accessed September 15, 2025, https://geoserver.org/tips%20and%20tricks/tutorials/2009/01/30/geoserver-and-openstreetmap.html
How do I render my own maps for my website? - OpenStreetMap Help, accessed September 15, 2025, https://help.openstreetmap.org/questions/136/how-do-i-render-my-own-maps-for-my-website/
osm2pgsql and conversion to layers in Geoserver - General talk, accessed September 15, 2025, https://community.openstreetmap.org/t/osm2pgsql-and-conversion-to-layers-in-geoserver/54871
Machine learning - OpenStreetMap Wiki, accessed September 15, 2025, https://wiki.openstreetmap.org/wiki/Machine_learning
AI-generated buildings in OpenStreetMap: frequency of use and ..., accessed September 15, 2025, https://www.tandfonline.com/doi/full/10.1080/17538947.2025.2473637
Automated road surface classification in OpenStreetMap using MaskCNN and aerial imagery - Frontiers, accessed September 15, 2025, https://www.frontiersin.org/journals/big-data/articles/10.3389/fdata.2025.1657320/full
Automate Building Footprint Extraction using Deep learning | ArcGIS API for Python, accessed September 15, 2025, https://developers.arcgis.com/python/latest/samples/automate-building-footprint-extraction-using-instance-segmentation/
Using OpenStreetMap Data and Machine Learning to Generate Socio-Economic Indicators, accessed September 15, 2025, https://www.mdpi.com/2220-9964/9/9/498
Research on mapping with open machine learning | Humanitarian OpenStreetMap Team, accessed September 15, 2025, https://www.hotosm.org/projects/reseach-on-mapping-with-machine-learning/
OpenHistoricalMap - Wikipedia, accessed September 15, 2025, https://en.wikipedia.org/wiki/OpenHistoricalMap
OpenHistoricalMap/FAQ - OpenStreetMap Wiki, accessed September 15, 2025, https://wiki.openstreetmap.org/wiki/OpenHistoricalMap/FAQ
OpenStreetMap for Government, accessed September 15, 2025, https://wiki.openstreetmap.org/wiki/OpenStreetMap_for_Government
Finances - OpenStreetMap Foundation, accessed September 15, 2025, https://osmfoundation.org/wiki/Finances
Resilience Initiative - Sustainability in OpenStreetMap - Global Facility for Disaster Reduction and Recovery (GFDRR), accessed September 15, 2025, https://www.gfdrr.org/sites/default/files/publication/Sustainability-in-OSM-Erica-Hagen.pdf
Sustainability in OpenStreetMap: Building a More Stable Ecosystem in OSM for Development and Humanitarianism - World Bank Documents and Reports, accessed September 15, 2025, https://documents1.worldbank.org/curated/en/738531592553760735/pdf/Sustainability-in-OpenStreetMap-Building-a-More-Stable-Ecosystem-in-OSM-for-Development-and-Humanitarianism.pdf
Funding Open Source like public infrastructure | Dries Buytaert, accessed September 15, 2025, https://dri.es/funding-open-source-like-public-infrastructure
Funding Opportunities For Open Source - gw ospo - The George Washington University, accessed September 15, 2025, https://ospo.gwu.edu/funding-opportunities-open-source
Public-private funding models in open source software development: A case study on scikit-learn - arXiv, accessed September 15, 2025, https://arxiv.org/html/2404.06484v1
How to sponsor? - OSGeo, accessed September 15, 2025, https://www.osgeo.org/about/how-to-become-a-sponsor/
Hosting and Sponsorship Opportunities - Open Geospatial Consortium (OGC), accessed September 15, 2025, https://www.ogc.org/event-sponsorship-opportunities/
Open Geospatial Consortium (OGC), accessed September 15, 2025, https://www.ogc.org/
Exploring Business Models for Public Open Data Resources, accessed September 15, 2025, https://data.europa.eu/sites/default/files/report/Exploring%20business%20models%20for%20public%20open%20data%20resources.pdf
An analysis of open source GIS software business models and case studies, accessed September 15, 2025, https://pure.ewha.ac.kr/en/publications/an-analysis-of-open-source-gis-software-business-models-and-case-
Businesses Using Open Source GIS - Geography Realm, accessed September 15, 2025, https://www.geographyrealm.com/businesses-using-open-source-gis/