The paper presents a cross-comparison of different estimation methods for modeling pedestrian and bicycle crashes. The study contributes to macro-level safety research by providing further methodological and empirical evidence on the various factors that influence the frequency of pedestrian and bicycle crashes at the planning level. Random-parameter negative binomial (RPNB) models are estimated to explore the effects of various planning factors associated with total, serious-injury, and minor-injury crashes while accounting for unobserved heterogeneity. Results of the RPNB models are compared with those of a non-spatial negative binomial (NB) model and a Poisson-Gamma-CAR model. Key findings are: (1) the RPNB model performed best, with the lowest mean absolute deviation (MAD), mean squared predicted error (MSPE), and Akaike information criterion (AIC) measures; and (2) the signs of estimated parameters are consistent across models with the same response variable wherever those variables are significant. We found that vehicle kilometers traveled (VKT), population, the percentage of commuters cycling or walking to work, and the percentage of households without motor vehicles have a significant positive association with the number of pedestrian and bicycle crashes. Mixed land use is also positively associated with pedestrian and bicycle crash frequency. The results have planning and policy implications for encouraging sustainable modes of transportation while ensuring the safety of pedestrians and cyclists.
DOI: http://dx.doi.org/10.1016/j.aap.2016.05.001
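As a hedged illustration of the model comparison described above, the sketch below fits the non-spatial NB baseline and computes the three reported comparison metrics (MAD, MSPE, AIC). The data file and column names are hypothetical stand-ins for the study's planning-level variables; the RPNB and Poisson-Gamma-CAR models require specialized estimation (e.g., simulated maximum likelihood or Bayesian CAR software) and are not reproduced here.

```python
# Illustrative sketch (not the authors' code): fit the non-spatial
# negative binomial (NB) baseline and compute the comparison metrics
# named in the abstract. "zone_crashes.csv" and all column names are
# hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("zone_crashes.csv")  # hypothetical zone-level data
X = sm.add_constant(df[["vkt", "population", "pct_active_commute",
                        "pct_no_vehicle", "mixed_land_use"]])
y = df["crashes"]  # e.g., total pedestrian and bicycle crashes

# GLM takes a fixed dispersion alpha; statsmodels' discrete
# NegativeBinomial model would estimate alpha jointly instead.
nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
pred = nb.predict(X)

mad = np.mean(np.abs(y - pred))   # mean absolute deviation
mspe = np.mean((y - pred) ** 2)   # mean squared predicted error
print(f"AIC={nb.aic:.1f}  MAD={mad:.2f}  MSPE={mspe:.2f}")
```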
Sensors (Basel)
November 2024
School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China.
For drone-based detection tasks, accurately identifying small-scale targets such as people, bicycles, and pedestrians remains a key challenge. In this paper, we propose DV-DETR, an improved detection model based on the Real-Time Detection Transformer (RT-DETR), specifically optimized for small-target detection in high-density scenes. To achieve this, we introduce three main enhancements: (1) ResNet18 as the backbone network, improving feature extraction while reducing model complexity; (2) recalibration attention units and deformable attention mechanisms integrated into the neck network, enhancing multi-scale feature fusion and localization accuracy; and (3) the Focaler-IoU loss function, which better handles the imbalanced distribution of target scales and focuses training on challenging samples.
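For readers unfamiliar with the loss named in point (3), the following is a minimal sketch of the Focaler-IoU idea: the raw IoU is linearly remapped over an interval [d, u] so the loss concentrates on a chosen band of sample difficulty. The thresholds here are illustrative assumptions, not DV-DETR's published settings.

```python
# Minimal sketch of the Focaler-IoU remapping (d and u are illustrative
# hyperparameters, not DV-DETR's settings).
import torch

def focaler_iou_loss(iou: torch.Tensor, d: float = 0.0, u: float = 0.95) -> torch.Tensor:
    """iou: per-box IoU values in [0, 1] for matched prediction/target pairs."""
    # Linearly rescale IoU onto [d, u]; values below d count as 0 and
    # values above u count as 1, focusing the loss on the band between.
    iou_focaler = ((iou - d) / (u - d)).clamp(min=0.0, max=1.0)
    return (1.0 - iou_focaler).mean()

# Usage: compute IoU first (e.g., torchvision.ops.box_iou on matched
# boxes), then apply the loss above in place of a plain 1 - IoU term.
```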
Traffic Inj Prev
November 2024
Waymo LLC, Mountain View, California.
Objective: Understanding and modeling baseline driving safety risk in dense urban areas represents a crucial starting point for automated driving system (ADS) safety impact analysis. The purpose of this study was to leverage naturalistic vulnerable road user (VRU) collision data to quantify collision rates, crash severity, and injury risk distributions in the absence of objective injury outcome data.
Methods: From over 500 million vehicle miles traveled, a total of 335 collision events involving VRUs were video-verified and reconstructed (126 pedestrians, 144 cyclists, and 65 motorcyclists).
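As a rough illustration of how such exposure-based rates can be expressed, the sketch below computes per-mode collision rates per million miles with an exact Poisson 95% confidence interval. It uses the counts from the abstract and treats 500 million miles as the exposure (the stated lower bound); this is a standard textbook method, not necessarily the one used in the study.

```python
# Back-of-envelope rates (not Waymo's analysis): per-mode collisions per
# million miles with an exact (Garwood) Poisson 95% CI.
from scipy.stats import chi2

def rate_per_million(events: int, miles: float, conf: float = 0.95):
    exposure = miles / 1e6  # exposure in millions of miles
    a = 1.0 - conf
    lo = chi2.ppf(a / 2, 2 * events) / 2 / exposure if events else 0.0
    hi = chi2.ppf(1 - a / 2, 2 * (events + 1)) / 2 / exposure
    return events / exposure, (lo, hi)

for mode, n in [("pedestrian", 126), ("cyclist", 144), ("motorcyclist", 65)]:
    r, (lo, hi) = rate_per_million(n, 500e6)
    print(f"{mode}: {r:.3f}/1M miles (95% CI {lo:.3f}-{hi:.3f})")
```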
Transp Res Rec
December 2023
Department of Psychology, Western Michigan University, Kalamazoo, MI.
The gateway in-street sign configuration has been demonstrated to be a low-cost method for increasing the rate at which motorists yield the right of way to pedestrians at crosswalks. It has previously been hypothesized that the gateway is effective because it visually narrows the travel lane. In the present study, gateway widths (i.e., …
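To make the study's outcome measure concrete, here is a hypothetical worked example (the counts are invented, not taken from the paper) of testing whether yielding improved after a gateway installation, using a standard two-proportion z-test:

```python
# Hypothetical worked example (counts invented, not from the study):
# did driver yielding increase after installing the gateway signs?
from statsmodels.stats.proportion import proportions_ztest

yielded = [22, 78]       # drivers yielding: baseline, gateway condition
crossings = [100, 100]   # staged pedestrian crossings observed per condition
z, p = proportions_ztest(count=yielded, nobs=crossings)
print(f"z = {z:.2f}, p = {p:.4f}")  # small p -> yielding rates differ
```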
J Trauma Inj
March 2024
Department of Surgery, Saint Francis Hospital and Medical Center, Hartford, CT, USA.
Otolaryngol Head Neck Surg
October 2024
Department of Otolaryngology-Head and Neck Surgery, Penn State College of Medicine, Hershey, Pennsylvania, USA.