Augmented Reality Office App


The client is a US-based company in California. The project is a client-server app that combines three distinct techniques: augmented reality (AR) visualization, indoor navigation, and face recognition, all within our client's office environment.

The app has two main functions. First, it lets the client's guests and customers use their mobile devices to scan and uniquely identify an employee's face and view that employee's information.
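Under the hood, this kind of face identification typically reduces to nearest-neighbor matching of face embeddings (DLib, from the project's stack, produces 128-dimensional face descriptors). The sketch below shows only that matching step, with hypothetical employee data and a hypothetical distance threshold; it is not the project's actual implementation.

```python
import numpy as np

def identify(embedding, known_embeddings, names, threshold=0.6):
    """Match a face embedding (e.g. a 128-d DLib descriptor) against a
    database of known employee embeddings by Euclidean distance.
    Returns the matched name, or None if no one is close enough."""
    dists = np.linalg.norm(known_embeddings - embedding, axis=1)
    best = int(np.argmin(dists))
    return names[best] if dists[best] < threshold else None

# Hypothetical database: one embedding per employee.
rng = np.random.default_rng(1)
known = rng.normal(size=(3, 128))
names = ["Alice", "Bob", "Carol"]
```

A scanned face whose embedding is close to a stored one resolves to that employee; anything else is rejected as unknown, which is what lets the app show information only for recognized staff.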

The second function involves using the app with Epson Moverio BT-300 augmented reality smart glasses. Here, the user has immediate access to information about the client's completed projects.

The flow goes like this: the user wears the AR glasses, which have a high-resolution camera, and walks through the office's glory alley, where the icons of completed projects are displayed.

When the user looks at one of the special marker icons, the camera points at the corresponding project. Once it locks onto the icon or project logo, a colored image of the project name becomes visible. The user can then tap a project to view its details, such as name, location, duration, technologies used, and lifecycle.

The Challenge

The user could not see the image directly from the camera when using the app with see-through devices such as the Epson glasses, and using the projection matrix provided by ARToolkit was not possible.
The sensor data contained additional, meaningless information, so we could not use it for long-term position tracking.
The magnetometer data was uncalibrated and had a different scale on each axis, which made the readings dependent on device orientation. In other words, when the user walked the same route twice in opposite directions, the magnetometer produced different values.


The Solution

We decided to use a calibration method to derive the eye-to-camera relations. Next, we obtained the intrinsic matrix of each eye in the camera coordinate system by calibrating on ten points with known coordinates. This was done independently for the left and right eye.
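Calibrating from a set of points with known coordinates is the classic Direct Linear Transform (DLT) setup: each 3D point and its observed 2D position contribute two linear constraints on the 3x4 projection matrix, which is then recovered via SVD (the intrinsic matrix can afterwards be factored out of it). The following is a minimal sketch of that standard technique, not the team's exact calibration code:

```python
import numpy as np

def calibrate_dlt(world_pts, image_pts):
    """Estimate a 3x4 projection matrix P from 3D-2D point
    correspondences with the Direct Linear Transform (needs >= 6 points;
    the project used ten)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        # Each correspondence yields two linear equations in P's entries.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # P (up to scale) is the right singular vector of the smallest
    # singular value of A.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, pts):
    """Project Nx3 world points through P to Nx2 pixel coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    proj = pts_h @ P.T
    return proj[:, :2] / proj[:, 2:3]
```

Running this once per eye, with the calibration points expressed in the camera coordinate system, gives one projection matrix per eye, which is what makes the augmented overlay line up with what each eye actually sees.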

The values we derived from the magnetic field were used to create a map of our client's office. When a user walked a route, we traced the changes in the magnetic field to reconstruct the user's walk-through path. We used a custom-designed calibration method to obtain correction coefficients.
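One common way to obtain such correction coefficients is a min/max magnetometer calibration: from samples collected while rotating the device, estimate a per-axis offset (hard-iron bias) and a per-axis scale, so that corrected readings no longer depend on orientation. This is a generic sketch of that idea under those assumptions, not the team's proprietary method:

```python
import numpy as np

def magnetometer_coefficients(samples):
    """Derive per-axis correction coefficients from raw magnetometer
    samples (Nx3), e.g. collected while rotating the device through
    many orientations."""
    samples = np.asarray(samples, dtype=float)
    mn, mx = samples.min(axis=0), samples.max(axis=0)
    offset = (mx + mn) / 2.0        # hard-iron bias per axis
    radius = (mx - mn) / 2.0        # per-axis half-range of the field
    scale = radius.mean() / radius  # equalize the scale of each axis
    return offset, scale

def correct(sample, offset, scale):
    """Apply the coefficients to a raw reading (or an Nx3 batch)."""
    return (np.asarray(sample, dtype=float) - offset) * scale
```

After correction, the field magnitude at a fixed spot is the same regardless of which way the device is facing, which is exactly the direction-independence needed to match a walked route against the magnetic map.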


The Results

Users can now see augmented images perfectly aligned with real objects, with 20/20 vision, using see-through devices.
The user's position can be determined with an accuracy of 5 to 6.5 feet.
Our custom-designed calibration method produced stable, direction-independent values of the magnetic field.

*All case studies are for illustration purposes only. Due to NDA agreements between the client and the development team, project details cannot be disclosed.


Professional Services


Android, ARToolkit, DLib, GoogleVision, PostgreSQL+cube, Spring Boot

Areas of Expertise

Augmented Reality, Computer Vision, Neural Networks

Other Technologies

Projective and single-view geometry, MATLAB/Octave


Project Duration

8 Months


Team

1 machine learning developer, 1 designer, 1 business analyst