Detailed Data Collection Streamlines Project Permitting

Accurate and thorough collection of geographic information system (GIS) datasets is integral to successful project review and permitting. Equally important is submitting datasets with ample detail so they can be easily understood, analyzed, and manipulated by project owners and regulators.

Unintelligible data slows permitting

Often, regulators receive datasets that make perfect sense to the data collector but are unintelligible to someone without project-specific background knowledge. Strict data collection, curation, and management standards ensure high-quality, detailed data that can be understood by all project stakeholders and permitters.

Detailed data collection and management improves workflow

Tyler Friesen, a Dudek GIS analyst, offers the following tips for efficiently collecting and curating large datasets:

  • Establish logical naming conventions so the viewer can get a quick overview of the data and its source. For example, a dataset named “Soils” could be project-specific soils mapping or it could come from a third-party source. By contrast, a dataset named “Soils_USDA_PlanArea” tells the viewer the dataset contains USDA soils data covering the geographic extent of the entire plan area.
  • Provide complete, logical attribute information that clearly identifies geographic features and their associated tabular data. For example, a dataset containing a project’s impact footprint should clearly distinguish permanent from temporary impacts and direct from indirect impacts.
  • Fill in metadata for all authoritative master datasets. At minimum, metadata should include Summary, Description, Date, Credits, and Use Limitations. Additionally, metadata should be formatted to the standard the permitting agency requires (usually the Federal Geographic Data Committee standard); see the sketch following this list.
  • Structure databases to mirror work products. For example, a database for an environmental impact report should organize the data under the same headings as in the document (Biology, Noise, Geology, Traffic, etc.).
  • Document the assumptions used in dataset development so regulatory agencies can easily verify data quality. For example, providing the assumptions behind a given species habitat model (land cover types, elevation ranges, proximity to other suitable habitat, etc.) allows regulators to easily vet the model and its accuracy as questions arise during the review process.
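To make the first and third tips concrete, here is a minimal Python sketch that checks a hypothetical dataset inventory against a Theme_Source_Extent naming pattern and the minimum metadata fields listed above. The DatasetRecord class, the field names, and the check functions are illustrative assumptions, not part of any particular GIS platform:

    from dataclasses import dataclass, field

    # Hypothetical minimum metadata fields, mirroring the list above.
    REQUIRED_METADATA = ("summary", "description", "date", "credits", "use_limitations")

    @dataclass
    class DatasetRecord:
        """One entry in a hypothetical dataset inventory."""
        name: str  # e.g., "Soils_USDA_PlanArea"
        metadata: dict = field(default_factory=dict)

    def check_name(name: str) -> list[str]:
        """Flag names that do not follow a Theme_Source_Extent pattern."""
        if len(name.split("_")) < 3:
            return [f"'{name}': expected Theme_Source_Extent (e.g., Soils_USDA_PlanArea)"]
        return []

    def check_metadata(record: DatasetRecord) -> list[str]:
        """Flag any minimum metadata field that is missing or empty."""
        return [
            f"'{record.name}': metadata field '{key}' is missing or empty"
            for key in REQUIRED_METADATA
            if not record.metadata.get(key)
        ]

    if __name__ == "__main__":
        inventory = [
            DatasetRecord(
                name="Soils_USDA_PlanArea",
                metadata={
                    "summary": "USDA soils mapping clipped to the plan area",
                    "description": "SSURGO soil map units within the plan area",
                    "date": "2013-06-01",
                    "credits": "USDA NRCS",
                    "use_limitations": "For planning purposes only",
                },
            ),
            DatasetRecord(name="Soils"),  # fails both checks
        ]
        for rec in inventory:
            for issue in check_name(rec.name) + check_metadata(rec):
                print(issue)

A check like this can run as a routine pre-submittal step, so naming and metadata gaps are caught before a dataset ever reaches a regulator.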

It is also necessary to include a version date and to archive data versions at each step, from collection and creation through analysis and manipulation. This ensures both high-quality data and the ability to trace any analytical step if questions arise about how the data was created or how the results of an analysis were derived. Archiving also lets the project owner easily revert to a previous version if need be, saving the time and money that would be spent undoing work completed since the last version.
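As a rough illustration of that archiving step, the sketch below copies a file-based dataset folder under a date-stamped name before each processing step. The folder layout is an assumed convention, and geodatabase-backed data would use the platform’s own archiving tools instead:

    import shutil
    from datetime import date
    from pathlib import Path

    def archive_version(dataset_dir: str, archive_root: str, step: str) -> Path:
        """Copy a dataset folder to a date-stamped archive before the next
        processing step, so any version can be restored or audited later.
        The layout (archive_root/<dataset>_<step>_<YYYY-MM-DD>) is an
        assumed convention, not a fixed standard.
        """
        src = Path(dataset_dir)
        stamp = date.today().isoformat()  # the version date, e.g., 2013-06-01
        dest = Path(archive_root) / f"{src.name}_{step}_{stamp}"
        shutil.copytree(src, dest)  # raises if this version already exists
        return dest

    # Example: snapshot the working data before running an analysis step.
    # archive_version("working/Soils_USDA_PlanArea", "archive", "pre-analysis")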

Managing data for the South Sacramento Habitat Conservation Plan

Dudek successfully employed this method on the South Sacramento Habitat Conservation Plan (SSHCP), which takes a regional approach to balancing development with conservation and protection of habitat, open space, and agricultural lands within a 374,000-acre study area.

In 2013, Dudek provided GIS services for the Sacramento County project. At that time, there was approximately 18 years’ worth of data to review, catalog, and analyze, and the data required constant updating and versioning as the project and planning process evolved. GIS staff created an automated process to model the individual watersheds of approximately 7,000 vernal pools covering more than 40,000 acres. Tasks involved building, maintaining, and analyzing large GIS datasets, including more than 100 gigabytes of LIDAR-derived elevation models, which were used to model vernal pool micro-watershed boundaries.
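Dudek’s actual automation is not reproduced here, but as a general sketch of the technique, batch watershed delineation from a LIDAR-derived DEM might look like the following using ArcGIS Spatial Analyst (arcpy). The workspace path, layer names, pour point field, and snap distance are hypothetical:

    # Illustrative sketch only: a generic batch watershed delineation with
    # ArcGIS Spatial Analyst, not Dudek's actual SSHCP workflow.
    import arcpy
    from arcpy.sa import Fill, FlowDirection, FlowAccumulation, SnapPourPoint, Watershed

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\SSHCP\analysis.gdb"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # Derive the flow model once from the LIDAR-derived DEM, then reuse it.
    filled = Fill("lidar_dem")       # remove spurious sinks in the DEM
    fdir = FlowDirection(filled)     # flow direction grid
    facc = FlowAccumulation(fdir)    # flow accumulation grid

    # Snap each vernal pool point to the nearest high-accumulation cell,
    # then delineate every pool's contributing micro-watershed in one pass.
    snapped = SnapPourPoint("vernal_pool_points", facc, 10, "POOL_ID")
    sheds = Watershed(fdir, snapped, "Value")
    sheds.save("vernal_pool_watersheds")

Deriving the flow direction grid once and delineating all pour points together, rather than pool by pool, is one way to keep an analysis of roughly 7,000 watersheds tractable.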

At the end of the project, the analysis results were packaged and sent to the U.S. Fish and Wildlife Service for review. The data was structured so that anyone savvy with GIS could easily understand its contents, source, version date, and the analysis performed. Most importantly, it included complete and detailed metadata. Rich Radmacher, Senior Planner with the County of Sacramento, praised the ease of assessment and said the results were some of the most complete and detailed datasets he had seen.

Delivering high-quality, complete GIS data avoids back-and-forth between the consultant/project owner and the regulatory agencies. Friesen says, “Being transparent in this manner also fosters trust between the consultant, project owner, and the regulatory agencies.” If regulators can see and understand the data and the results of analysis, they are more likely to accept them, which streamlines the entire permitting process, saving a project time and money.


Tyler Friesen has 5 years’ experience as a GIS analyst. Mark McGinnis is the GIS manager at Dudek, and has more than 15 years’ professional experience in geospatial technologies and application development. For more information, contact Mr. McGinnis at mmcginnis@dudek.com.