Mostafa Kishanifarahani, Ph.D.
All publications
Reducing Storage and Communication Latencies in Vehicular Edge Cloud
- Authors: Mostafa Kishanifarahani, Ph.D., prof. Ing. Zdeněk Bečvář, Ph.D., Nikooroo, M., Asadi, H.
- Publication: 2022 Joint European Conference on Networks and Communications & 6G Summit (EuCNC/6G Summit). New Jersey: IEEE, 2022. p. 291-296. ISSN 2575-4912. ISBN 978-1-6654-9871-5.
- Year: 2022
- DOI: 10.1109/EuCNC/6GSummit54941.2022.9815597
- Link: https://doi.org/10.1109/EuCNC/6GSummit54941.2022.9815597
- Department: Department of Telecommunications Engineering
Annotation:
Low-latency data access is crucial in edge clouds serving autonomous vehicles. Storage I/O caching is a promising way to deliver the desired storage performance at a reasonable cost in vehicular edge platforms. Current storage I/O caching methods, however, are not tailored to the workload characteristics and demands of autonomous vehicles and/or do not consider the communication latency between the vehicle and the base station hosting the edge cloud node. In this work, we propose a storage mechanism for vehicular edge cloud platforms that takes communication, I/O cache, and storage latencies into account. We evaluate the proposed framework using realistic storage traces of vehicular services. Compared to state-of-the-art works, our framework reduces the overall average latency and the average latency of high-priority services by up to 1.56x and 2.43x, respectively.
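The idea of jointly accounting for communication and storage-tier latencies can be illustrated with a minimal sketch. All service names, tier latencies, and capacities below are illustrative assumptions, not values from the paper; the greedy placement is only one simple way to realize latency- and priority-aware storage allocation.

```python
# Hypothetical sketch: pick a storage tier per vehicular service so that the
# total (communication + storage-tier) latency is small, serving higher-priority
# services from faster tiers first. Numbers are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    priority: int      # lower value = higher priority
    size_gb: float     # working-set size

# Per-tier access latency (ms) and capacity (GB) -- illustrative values.
TIER_LATENCY_MS = {"dram": 0.1, "nvm": 0.5, "ssd": 2.0}
TIER_CAPACITY_GB = {"dram": 4.0, "nvm": 16.0, "ssd": 128.0}

def place_services(services, comm_latency_ms):
    """Greedy placement: each service, in priority order, goes to the fastest
    tier with enough free space; returns {name: (tier, total latency in ms)}."""
    free = dict(TIER_CAPACITY_GB)
    placement = {}
    for svc in sorted(services, key=lambda s: s.priority):
        for tier in sorted(TIER_LATENCY_MS, key=TIER_LATENCY_MS.get):
            if free[tier] >= svc.size_gb:
                free[tier] -= svc.size_gb
                placement[svc.name] = (tier, comm_latency_ms + TIER_LATENCY_MS[tier])
                break
    return placement

services = [
    Service("collision_avoidance", priority=0, size_gb=2.0),
    Service("map_upgrade", priority=2, size_gb=10.0),
    Service("infotainment", priority=3, size_gb=30.0),
]
print(place_services(services, comm_latency_ms=1.0))
```

With these assumed numbers, the safety-critical service lands in DRAM while bulk services fall back to NVM and SSD, so its end-to-end latency is dominated by the communication term rather than the storage access.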
PADSA: Priority-Aware Block Data Storage Architecture for Edge Cloud Serving Autonomous Vehicles
- Authors: Mostafa Kishanifarahani, Ph.D., prof. Ing. Zdeněk Bečvář, Ph.D., Asadi, H.
- Publication: 2021 IEEE VEHICULAR NETWORKING CONFERENCE (VNC). New York: IEEE, 2021. p. 170-177. ISSN 2157-9857. ISBN 978-1-6654-4450-7.
- Year: 2021
- DOI: 10.1109/VNC52810.2021.9644617
- Link: https://doi.org/10.1109/VNC52810.2021.9644617
- Department: Department of Telecommunications Engineering
Annotation:
An efficient Input/Output (I/O) caching mechanism for data storage can deliver the desired performance at a reasonable cost to edge nodes serving autonomous vehicles. Current storage caching solutions target common autonomous-vehicle applications that are less demanding in terms of latency (e.g., map or software upgrades). However, these solutions need serious revision for autonomous vehicles that rely on safety- and time-critical communication for services, such as collision avoidance, requiring very low latency. In this paper, we propose a three-level storage caching architecture for virtualized edge cloud platforms serving autonomous vehicles. The architecture prioritizes safety-critical services and allocates the two top-level caches, Dynamic Random Access Memory (DRAM) and Non-Volatile Memory (NVM), to the top-priority services. We further determine the optimal cache space allocated to each service to minimize the average latency. The experimental results show that the proposed architecture reduces the average latency of safety-critical applications by up to 70% compared to the state-of-the-art.
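The effect of a three-level cache on a service's latency can be sketched with a standard expected-latency model: a request hits DRAM, otherwise hits NVM, otherwise falls through to the backing disk. The latency values and hit ratios below are illustrative assumptions, not figures from the paper, but they show why giving the two top-level caches to safety-critical services pays off.

```python
# Hypothetical sketch: expected I/O latency under a DRAM -> NVM -> disk cache
# hierarchy. Tier latencies and hit ratios are illustrative assumptions only.
def avg_latency_ms(h_dram, h_nvm, l_dram=0.1, l_nvm=0.5, l_disk=5.0):
    """Expected latency: DRAM hit, else NVM hit, else backing-disk access.
    h_dram is the DRAM hit ratio; h_nvm is the hit ratio among DRAM misses."""
    miss_dram = 1.0 - h_dram
    return (h_dram * l_dram
            + miss_dram * h_nvm * l_nvm
            + miss_dram * (1.0 - h_nvm) * l_disk)

# A safety-critical service favored with large top-level caches (high hit
# ratios) versus a bulk service mostly served from disk:
critical = avg_latency_ms(h_dram=0.8, h_nvm=0.9)
bulk = avg_latency_ms(h_dram=0.1, h_nvm=0.3)
print(f"critical: {critical:.2f} ms, bulk: {bulk:.2f} ms")
```

Sweeping the hit ratios (i.e., the cache space granted to each service) over such a model is one simple way to search for the allocation that minimizes the average latency across services.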