To solve this problem, I first created a collection of Linear Referencing tools on a custom tab in the ArcGIS Pro ribbon. The accident and pavement data were converted to route event layers and then intersected by overlaying them on the roadway route data. Using a road rating threshold of 75, the number of accidents per mile was calculated for the high- and low-rated portions of the roadway.
Before beginning the analysis, a custom tool tab called “Linear Referencing” was created using the “Customize the Ribbon” interface in the software options. I created three groups within the tab and labeled them by their focus: Routes, Events, and Add-In tools. The Routes group was set up with tools for calibrating, creating, and locating features along routes. The Events group had four tools added for dissolving events, making layers from events, overlaying events, and transforming events. The Add-In group had two tools downloaded from GitHub: one to identify route locations and one to set from/to measures.
To analyze the data, I first familiarized myself with the information in the accident and pavement data tables. Focusing on Route 30000030, commonly known as North Carolina State Highway 30, I found that the 15-mile road had 25 accidents and rating values ranging from 66 to 88. The accident and pavement data were both converted to route event layers, named “Accident Events” and “Pavement Events”, which allowed the data to be displayed and ultimately combined to determine whether road rating influences the number of auto accidents. The two layers were intersected by overlaying the two sets of events on the Some Routes feature class. Normally this is easiest with the Overlay Route Events tool, which was added to the customized tab; however, due to a bug in this version of ArcGIS Pro, the tool fails. In lieu of the tool, the overlay was completed using a Python script that created a new table named “Accident and Pavement Events” (a sketch of the workaround is shown below). This table was then displayed using the Make Route Event Layer tool, and the attribute table was analyzed.
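The following is a minimal ArcPy sketch of that workaround, not the exact script used. The geodatabase path, the route feature class name, and the field names (ROUTE1, MEASURE, BEGIN_MP, END_MP, RATING) are assumptions that would need to be matched to the actual accident and pavement tables.

```python
import arcpy
from collections import defaultdict

# Assumed workspace; replace with the project geodatabase.
arcpy.env.workspace = r"C:\LinearRef\Highways.gdb"

routes = "SomeRoutes"    # the Some Routes feature class from the project (name assumed)
accidents = "accident"   # point events: route ID plus a single measure (fields assumed)
pavement = "pavement"    # line events: route ID plus from/to measures and a rating (fields assumed)

# Overlay (intersect) the accident point events with the pavement line events,
# standing in for the Overlay Route Events ribbon tool that fails in this build.
arcpy.lr.OverlayRouteEvents(
    accidents, "ROUTE1 POINT MEASURE",
    pavement, "ROUTE1 LINE BEGIN_MP END_MP",
    "INTERSECT",
    "Accident_and_Pavement_Events",
    "ROUTE1 POINT MEASURE",
)

# Display the combined table along the routes so its attribute table can be reviewed.
arcpy.lr.MakeRouteEventLayer(
    routes, "ROUTE1",
    "Accident_and_Pavement_Events", "ROUTE1 POINT MEASURE",
    "Accident and Pavement Events",
)

# Accidents per mile above and below the rating threshold of 75 on Route 30000030.
route_id, threshold = "30000030", 75
miles = defaultdict(float)
counts = defaultdict(int)

# Section lengths come from the pavement events, since the accident points carry no length.
with arcpy.da.SearchCursor(pavement, ["ROUTE1", "BEGIN_MP", "END_MP", "RATING"]) as cur:
    for rid, begin, end, rating in cur:
        if str(rid) == route_id:
            miles["high" if rating >= threshold else "low"] += abs(end - begin)

# Accident counts come from the combined table, which now carries the pavement rating.
with arcpy.da.SearchCursor("Accident_and_Pavement_Events", ["ROUTE1", "RATING"]) as cur:
    for rid, rating in cur:
        if str(rid) == route_id:
            counts["high" if rating >= threshold else "low"] += 1

for cls in ("high", "low"):
    if miles[cls]:
        print(f"{cls}-rated sections: {counts[cls] / miles[cls]:.2f} accidents per mile")
```

The two cursors are kept separate by design: the accident events alone cannot give a per-mile rate, so the mileage of high- and low-rated pavement is summed from the pavement events before dividing the accident counts.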
Linear referencing the stream would allow for examining flow rates along small sections of the stream and water quality at various points. One example is the Crabtree Creek project proposal for Raleigh. Crabtree Creek flows through the city of Raleigh; the corridor is already heavily urbanized, polluted, and known for flooding during heavy rains. The proposal seeks to redevelop the area where the flooding occurs to create a waterfront business district and alleviate flooding. To do this, understanding variation in stream flow rates is important so that a system of low dams and levees can be installed. However, if the water becomes too slow or stagnant, it will lead to sedimentation of the creek bed, deposition of pollutants, and reduced water quality.
Currently, the National Weather Service collects data on the creek at five locations in the affected area. Unfortunately, flow rate data are only available from two of those sites. To segment the river into smaller sections and improve the resolution of the linear referencing, additional data would have to be collected, or flow rates estimated from the 12-hour river stage data at the other locations (a rough sketch of one estimation approach follows below). For water quality and sedimentation, the USGS and the US Fish & Wildlife Service have collections of sediment pollution data sampled along the creek.
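As a rough illustration of how flow might be estimated from stage alone, the sketch below fits a simple power-law rating curve at a site that reports both stage and flow, then applies it at a stage-only site. All station values, the river-mile measure, and the curve form are hypothetical placeholders, not Crabtree Creek data.

```python
import numpy as np

def fit_rating_curve(stage, flow, h0=0.0):
    """Least-squares fit of log Q = log a + b * log(h - h0) for Q = a * (h - h0) ** b."""
    x = np.log(np.asarray(stage) - h0)
    y = np.log(np.asarray(flow))
    b, log_a = np.polyfit(x, y, 1)
    return np.exp(log_a), b

def estimate_flow(stage, a, b, h0=0.0):
    """Apply the fitted rating curve to a stage observation."""
    return a * (np.asarray(stage) - h0) ** b

# Hypothetical paired 12-hour stage (ft) and flow (cfs) observations from a gauged site.
gauged_stage = [2.1, 2.8, 3.5, 4.9, 6.2]
gauged_flow = [40.0, 95.0, 180.0, 430.0, 780.0]
a, b = fit_rating_curve(gauged_stage, gauged_flow)

# Estimated flow at a stage-only site, tagged with its route measure (river mile) for
# later use as a linear-referencing event; values are placeholders.
stage_only_site = {"measure_mi": 7.4, "stage_ft": 3.1}
q_est = float(estimate_flow(stage_only_site["stage_ft"], a, b))
print(f"Estimated flow at mile {stage_only_site['measure_mi']}: {q_est:.1f} cfs")
```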
Linear referencing of the creek data would be done in a similar manner to the project described above, with point data collected on sediment pollutant events merged with linear stream-flow variation events (see the sketch following this paragraph). These data could then be used to model different scenarios of dam and levee implementation to determine the best course of action for minimizing flooding and maximizing water quality.
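A hedged sketch of that merge, mirroring the highway workflow: it assumes the creek has already been built as a measured route, and the geodatabase, table, and field names (CREEK_ID, MEASURE, FROM_M, TO_M) are placeholders.

```python
import arcpy

# Assumed workspace and names for the creek project; adjust to the real datasets.
arcpy.env.workspace = r"C:\LinearRef\Crabtree.gdb"

# Intersect sediment-sample point events with flow-variation line reaches.
arcpy.lr.OverlayRouteEvents(
    "sediment_samples", "CREEK_ID POINT MEASURE",
    "flow_reaches", "CREEK_ID LINE FROM_M TO_M",
    "INTERSECT",
    "Sediment_and_Flow_Events",
    "CREEK_ID POINT MEASURE",
)

# Display the merged events along the measured creek route for scenario modeling.
arcpy.lr.MakeRouteEventLayer(
    "CrabtreeRoute", "CREEK_ID",
    "Sediment_and_Flow_Events", "CREEK_ID POINT MEASURE",
    "Sediment and Flow Events",
)
```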