Milad Rafiee

Orcid: 0000-0002-2199-0184

According to our database, Milad Rafiee authored at least 12 papers between 2019 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Cross Network Layer Cognitive Service Orchestration in Edge Computing Systems.
Proceedings of the IEEE International Conference on Edge Computing and Communications, 2024

2022
Efficient pipelined flow classification for intelligent data processing in IoT.
Digit. Commun. Networks, 2022

2021
Optimal Distribution of Workloads in Cloud-Fog Architecture in Intelligent Vehicular Networks.
IEEE Trans. Intell. Transp. Syst., 2021

Efficient Flow Processing in 5G-Envisioned SDN-Based Internet of Vehicles Using GPUs.
IEEE Trans. Intell. Transp. Syst., 2021

2020
MBitCuts: optimal bit-level cutting in geometric space packet classification.
J. Supercomput., 2020

A CRC-Based Classifier Micro-Engine for Efficient Flow Processing in SDN-Based Internet of Things.
Mob. Inf. Syst., 2020

An efficient parallel genetic algorithm solution for vehicle routing problem in cloud implementation of the intelligent transportation systems.
J. Cloud Comput., 2020

Efficient parallelisation of the packet classification algorithms on multi-core central processing units using multi-threading application program interfaces.
IET Comput. Digit. Tech., 2020

Investigating the efficiency of multithreading application programming interfaces for parallel packet classification in wireless sensor networks.
Turkish J. Electr. Eng. Comput. Sci., 2020

Efficient resource management and workload allocation in fog-cloud computing paradigm in IoT using learning classifier systems.
Comput. Commun., 2020

2019
A calibrated asymptotic framework for analyzing packet classification algorithms on GPUs.
J. Supercomput., 2019

Enhancing the performance of the aggregated bit vector algorithm in network packet classification using GPU.
PeerJ Comput. Sci., 2019

