Edge Assisted Real-time Object Detection for Mobile AR at MobiCom 2019
This article summarizes the YouTube video "MobiCom 2019 - Edge Assisted Real-time Object Detection for Mobile Augmented Reality" by ACM SIGMOBILE ONLINE.
TLDR: Offloading vision tasks to an edge cloud improves object detection accuracy and speed in mobile augmented reality, reducing latency and computational load on mobile devices.
Mobile augmented reality has the potential to enable many applications, but mobile devices are limited in their ability to detect and classify complex objects in the real world in real time.
Offloading vision tasks to an edge cloud can improve object detection accuracy and speed by exploiting powerful server-side resources.
The proposed system reduces detection latency from 80 milliseconds to only 15 milliseconds, enabling real-time object detection on mobile devices.
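To see why the drop from 80 ms to 15 ms matters, compare both latencies to the frame interval at 60 fps. The sketch below is a back-of-the-envelope calculation based only on the numbers quoted above; the staleness metric is an illustration, not a figure from the talk.

```python
# Frame interval at 60 fps vs. offloading latency: a rough check of why
# cutting latency from 80 ms to 15 ms matters for real-time AR.
FPS = 60
FRAME_INTERVAL_MS = 1000 / FPS  # ~16.7 ms between displayed frames

def frames_of_staleness(latency_ms: float) -> int:
    """How many new frames are displayed before a detection result returns."""
    return int(latency_ms // FRAME_INTERVAL_MS)

print(frames_of_staleness(80))  # detection result lags ~4 frames behind
print(frames_of_staleness(15))  # result arrives within one frame interval
```

At 80 ms the detections are several frames stale, so rendered annotations visibly trail moving objects; at 15 ms a result can be applied to the very next frame.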
By exploiting the spatial and temporal correlation between consecutive frames to identify regions of interest, the encoder can prioritize those regions with high quality, transmitting information to the server more efficiently.
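The region-of-interest idea can be sketched as block-wise quantization: blocks overlapping likely object regions get a fine quantization step (more bits), while background blocks get a coarse one. This is a minimal illustration of the concept, not the paper's actual encoder; the block size and step values are assumptions.

```python
import numpy as np

# Illustrative sketch (not the talk's actual encoder): quantize 8x8 blocks,
# spending more bits (a finer step) on regions of interest predicted from
# previous detections, and fewer bits on the background.
BLOCK = 8

def roi_quantize(frame: np.ndarray, roi_mask: np.ndarray,
                 fine_step: int = 4, coarse_step: int = 32) -> np.ndarray:
    """frame: (H, W) grayscale array; roi_mask: (H, W) bool, True inside ROIs."""
    out = np.empty_like(frame)
    h, w = frame.shape
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            block = frame[y:y+BLOCK, x:x+BLOCK]
            # A block counts as ROI if any of its pixels fall inside an ROI.
            in_roi = roi_mask[y:y+BLOCK, x:x+BLOCK].any()
            step = fine_step if in_roi else coarse_step
            out[y:y+BLOCK, x:x+BLOCK] = (block // step) * step
    return out
```

After quantization, coarse background blocks compress far better under entropy coding, so most of the bitrate goes to the regions the detector actually needs.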
The proposed technique of slicing each frame and streaming the slices in parallel can significantly reduce latency in real-time object detection for mobile augmented reality.
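The slicing idea can be sketched as a small pipeline: cut the frame into slices and encode/transmit them concurrently, so the server can start working on the first slice while later slices are still in flight. The slicing scheme and helper names below are illustrative assumptions, not the system's actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_slices(frame_rows: list, n_slices: int) -> list:
    """Split a frame (modeled here as a list of pixel rows) into slices."""
    size = (len(frame_rows) + n_slices - 1) // n_slices
    return [frame_rows[i:i + size] for i in range(0, len(frame_rows), size)]

def encode_and_send(slice_rows: list) -> int:
    # Stand-in for per-slice encoding + network transmission;
    # returns the number of pixel values "sent" for this slice.
    return sum(len(row) for row in slice_rows)

def stream_frame(frame_rows: list, n_slices: int = 4) -> int:
    """Encode and transmit all slices of one frame concurrently."""
    slices = split_into_slices(frame_rows, n_slices)
    with ThreadPoolExecutor(max_workers=n_slices) as pool:
        return sum(pool.map(encode_and_send, slices))
```

Because encoding and transmission of different slices overlap, the end-to-end time approaches the cost of one slice plus network transfer, rather than the sequential sum.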
This edge-assisted real-time object detection method could potentially enhance mobile augmented reality experiences by maintaining high-quality object detection while reducing computational load on mobile devices.
The edge-assisted real-time object detection technique aims to reduce latency and motion sickness in mobile augmented reality by using the motion vectors already embedded in encoded video frames instead of traditional methods like SIFT or optical flow.
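The steps above can be sketched as follows: between offloaded detections, each bounding box is shifted by the average motion vector of the encoder macroblocks it covers, avoiding any SIFT or optical-flow computation on the device. The data layout below is a hypothetical simplification, not the system's actual decoder interface.

```python
# Illustrative sketch of motion-vector-based box tracking. Each pair in
# motion_vectors is ((mx, my), (cx, cy)): a motion vector and the center
# of the macroblock it belongs to, as decoded from the video stream.

def shift_box(box, motion_vectors):
    """box: (x0, y0, x1, y1). Returns the box moved by the mean motion
    vector of the macroblocks whose centers fall inside it."""
    x0, y0, x1, y1 = box
    inside = [(mx, my) for (mx, my), (cx, cy) in motion_vectors
              if x0 <= cx <= x1 and y0 <= cy <= y1]
    if not inside:
        return box  # no motion information: leave the box where it was
    dx = sum(mx for mx, _ in inside) / len(inside)
    dy = sum(my for _, my in inside) / len(inside)
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)
```

Since motion vectors come free with the decoded video, this tracking step costs almost nothing compared to recomputing feature matches every frame.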
The end-to-end system enables high-quality object detection for augmented reality and mixed reality, running at 60 frames per second.