The Internet of Vehicles (IoV) is transforming the automobile industry by connecting vehicles to communication infrastructure, improving traffic control, safety, and information and entertainment services. However, challenges remain, including data protection, privacy, interoperability with other protocols and systems, and the availability of stable, continuous connectivity. Specific problems concern the energy consumed in transmitting information, distributing energy loads across a vehicle's sensors and communication units, and designing energy-efficient approaches to processing received data and making decisions in the IoV environment. For this setting we propose OptiE2ERL, an advanced Reinforcement Learning (RL)-based model designed to optimize energy efficiency and routing. The model leverages a reward matrix and the Bellman equation to determine the optimal path from source to destination while keeping communication overhead manageable. It considers critical parameters such as Remaining Energy Level (REL), Bandwidth and Interference Level (BIL), Mobility Pattern (MP), Traffic Condition (TC), and Network Topological Arrangement (NTA), ensuring a comprehensive approach to route optimization. Extensive simulations conducted with NS2 and Python demonstrate that OptiE2ERL significantly outperforms existing models such as LEACH, PEGASIS, and EER-RL across various performance metrics: it extends the network lifetime, delays the occurrence of the first dead node, and maintains a higher residual energy rate. Furthermore, OptiE2ERL enhances network scalability and robustness, making it well suited to IoV applications. The simulation results highlight the effectiveness of the model in achieving energy-efficient routing while maintaining network performance under different scenarios. By incorporating a diverse set of parameters and applying RL techniques, OptiE2ERL provides a robust solution to the challenges faced in IoV networks. This research contributes to the field by presenting a model that optimizes energy consumption and ensures reliable, efficient communication in dynamic vehicular environments.
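To make the mechanism concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of how a reward matrix built from the five parameters above can be combined with the Bellman optimality equation to select a route. The topology, per-link scores, and weighting are assumptions made purely for demonstration; in the actual model they would come from measured REL, BIL, MP, TC, and NTA values and the paper's reward design.

# Illustrative sketch only: value iteration over a small vehicular graph.
# Link rewards are a weighted sum of the abstract's five parameters.
# Topology, scores, and weights are hypothetical placeholders.
import numpy as np

N_NODES = 5
GAMMA = 0.9          # discount factor in the Bellman equation
DEST = 4             # destination node

# Hypothetical per-link scores in [0, 1]; higher is better.
rng = np.random.default_rng(42)
rel = rng.uniform(0.2, 1.0, (N_NODES, N_NODES))   # Remaining Energy Level
bil = rng.uniform(0.2, 1.0, (N_NODES, N_NODES))   # Bandwidth and Interference Level
mp  = rng.uniform(0.2, 1.0, (N_NODES, N_NODES))   # Mobility Pattern stability
tc  = rng.uniform(0.2, 1.0, (N_NODES, N_NODES))   # Traffic Condition
nta = rng.uniform(0.2, 1.0, (N_NODES, N_NODES))   # Network Topological Arrangement

# Assumed adjacency: which directed links exist between nodes.
adj = np.array([[0, 1, 1, 0, 0],
                [0, 0, 1, 1, 0],
                [0, 0, 0, 1, 1],
                [0, 0, 0, 0, 1],
                [0, 0, 0, 0, 0]], dtype=bool)

# Reward matrix: weighted combination of the five parameters (weights assumed).
w = dict(rel=0.3, bil=0.2, mp=0.2, tc=0.15, nta=0.15)
R = (w["rel"] * rel + w["bil"] * bil + w["mp"] * mp
     + w["tc"] * tc + w["nta"] * nta)
R = np.where(adj, R, -np.inf)          # non-existent links are unusable
R[adj[:, DEST], DEST] += 1.0           # bonus for links that reach the destination

# Value iteration on the Bellman optimality equation:
#   V(s) = max_a [ R(s, a) + gamma * V(a) ]
V = np.zeros(N_NODES)
for _ in range(100):
    Q = R + GAMMA * V[np.newaxis, :]                     # Q(s, a) for every link
    V_new = np.where(adj.any(axis=1), Q.max(axis=1), 0.0)
    V_new[DEST] = 0.0                                    # destination is terminal
    if np.allclose(V_new, V, atol=1e-6):
        break
    V = V_new

# Greedy route extraction from source node 0 to the destination.
route, node = [0], 0
while node != DEST:
    node = int(np.argmax(R[node] + GAMMA * V))
    route.append(node)
print("Selected route:", route)

Running the sketch prints one source-to-destination path chosen by the learned values; swapping in different link scores changes which relays are preferred, which is the behaviour the abstract attributes to the reward-matrix design.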

Source
DOI: http://dx.doi.org/10.1038/s41598-025-86608-5
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11759676

Publication Analysis

Top Keywords

reinforcement learning: 8
learning based: 8
route optimization: 8
energy efficiency: 8
internet vehicles: 8
energy consumption: 8
model: 7
energy: 7
iov: 5
based route: 4
