Augmented Reality for Indoor Navigation System - An Android application for navigating a user in an indoor environment with augmented arrows, without continuous data connectivity. You can find a demonstration video here. The application consists of three main modules:
- Source Detection (using text recognition for boards)
- Navigation (restricted to 1 floor)
- AR rendering (Sceneform renderer with ARCore)
Source detection identifies nearby landmarks (mostly boards, logos and objects). In the given code, boards are detected using text recognition with ML Kit.
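As an illustration of the text-recognition step, here is a minimal sketch using ML Kit's on-device text recognition client. It assumes the current standalone ML Kit API (the repository may use the older Firebase ML Kit flavour), and the function `detectBoardText` and its callback are hypothetical names for illustration only:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Hypothetical helper: recognise the text on a board from a camera frame bitmap.
fun detectBoardText(bitmap: Bitmap, onResult: (List<String>) -> Unit) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

    recognizer.process(image)
        .addOnSuccessListener { visionText ->
            // Each text block is a contiguous region of text on the board;
            // its lines can then be matched against known landmark names.
            val lines = visionText.textBlocks.flatMap { it.lines }.map { it.text }
            onResult(lines)
        }
        .addOnFailureListener { e ->
            e.printStackTrace()
            onResult(emptyList())
        }
}
```

The recognised lines can then be compared against the known board labels of the mapped indoor space to fix the user's starting position.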
Navigation can be built using the phone's sensors: the accelerometer (for step counts), the magnetometer (for direction) and the barometer (for handling multiple floors). This code provides navigation support only for one particular indoor space.
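A minimal sketch of the sensor plumbing, assuming the standard Android `SensorManager` APIs. It uses the hardware step detector in place of a hand-rolled accelerometer step counter, and the class name `DeadReckoningTracker` is purely illustrative, not taken from this repository:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative tracker: counts steps for distance and fuses
// accelerometer + magnetometer readings into a compass heading.
class DeadReckoningTracker(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    private val gravity = FloatArray(3)
    private val geomagnetic = FloatArray(3)

    var stepCount = 0
        private set
    var headingDegrees = 0f
        private set

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_STEP_DETECTOR)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_STEP_DETECTOR -> stepCount++          // one event per detected step
            Sensor.TYPE_ACCELEROMETER -> event.values.copyInto(gravity)
            Sensor.TYPE_MAGNETIC_FIELD -> event.values.copyInto(geomagnetic)
        }
        updateHeading()
    }

    private fun updateHeading() {
        val rotation = FloatArray(9)
        val orientation = FloatArray(3)
        if (SensorManager.getRotationMatrix(rotation, null, gravity, geomagnetic)) {
            SensorManager.getOrientation(rotation, orientation)
            // Azimuth (rotation about the z axis), 0 = magnetic north.
            headingDegrees = Math.toDegrees(orientation[0].toDouble()).toFloat()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not used */ }
}
```

Steps multiplied by an average stride length give the distance walked, and the heading tells which corridor edge of the floor graph is being traversed; a barometer (`Sensor.TYPE_PRESSURE`) would be added the same way if multi-floor support were enabled.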
AR applications can be developed with any AR SDK on a suitable platform; popular options include Vuforia, ARCore and ARKit. For this application, we have used ARCore in Android Studio, with the ArFragment class providing the AR view inside the app.
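A minimal sketch of hosting an `ArFragment` and placing an arrow renderable on a tapped plane with Sceneform. The layout and resource ids and the `arrow.sfb` asset name are assumptions for illustration, not taken from this repository:

```kotlin
import android.net.Uri
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

// Illustrative activity: the layout hosts an ArFragment with id ar_fragment (assumed names).
class ArNavigationActivity : AppCompatActivity() {

    private lateinit var arFragment: ArFragment

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_ar_navigation)

        arFragment = supportFragmentManager
            .findFragmentById(R.id.ar_fragment) as ArFragment

        // Place an arrow model on the plane the user taps.
        arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
            ModelRenderable.builder()
                .setSource(this, Uri.parse("arrow.sfb"))   // assumed Sceneform asset name
                .build()
                .thenAccept { renderable ->
                    val anchorNode = AnchorNode(hitResult.createAnchor())
                    anchorNode.setParent(arFragment.arSceneView.scene)
                    TransformableNode(arFragment.transformationSystem).apply {
                        setParent(anchorNode)
                        this.renderable = renderable
                        select()
                    }
                }
        }
    }
}
```

In the navigation flow, the arrows would instead be anchored programmatically along the computed route rather than on user taps, but the anchor-and-node pattern stays the same.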
Google demonstrated AR walking navigation integrated with Google Maps on the Pixel 3a at I/O 2019. Awaiting the same beauty and experience for indoor navigation... :) Google has also pushed for more on-device applications and algorithms to enhance privacy and minimise the need to share user data, which pushes this project idea further. Yay!