Google Unveils Android Mobile Vision API to Detect and Track Human Faces
Google Inc. (NASDAQ:GOOG) has rolled out Google Play services 7.8 for Android software developers. Among the new features is Nearby Messages, a cross-platform API that uses a combination of Bluetooth, Wi-Fi, and an ultrasound audio modem to connect nearby mobile devices and beacons.
Another feature is the Mobile Vision API, which bundles two components: a Face API and a Barcode API.
With the new update to Google Play Services rolling out, we take a look at a new API that developers can experiment with when planning camera-based apps.
“The Face API [which is inside the Mobile Vision API] allows developers to find human faces in images and video,” Magnus Hyttsten, developer advocate on the Google Play Services team, wrote in a blog post announcing the release.
While basic face detection is already built into many smartphones, the Face API goes further: it locates the nose, eyes, and mouth within a human face and tracks their positions, both in images and video. Using classifications, the API can also report whether particular facial states are present, such as whether the eyes are open or closed and how strongly the person is smiling. “It’s faster, more accurate and provides more information than the Android FaceDetector,” Google says. Google stresses it’s not facial recognition: the API can tell that the device is looking at a face, in multiple orientations, but it does not use your face to determine your identity.

The second component, the Barcode API, supports a range of barcode formats and can detect multiple barcodes at once.

In preparation for the Android M release, Google also added high and normal priority levels to Google Cloud Messaging, letting developers send messages that need immediate attention, like an incoming voice call alert, at a higher priority than others.
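The Face API’s classification values are reported as probabilities, with a sentinel value for “not computed,” so app code typically thresholds them. The following plain-Java sketch shows that consuming logic; the 0.5 cut-off and the helper names are our own illustration, not part of the API:

```java
public class FaceClassification {
    // The Mobile Vision Face API uses -1 to mean a probability was not
    // computed (e.g. classification was not enabled on the detector).
    static final float UNCOMPUTED = -1.0f;

    // Illustrative threshold; apps choose their own cut-off.
    static final float THRESHOLD = 0.5f;

    /** Maps a smiling probability to a human-readable label. */
    static String describeSmile(float p) {
        if (p == UNCOMPUTED) return "unknown";
        return p >= THRESHOLD ? "smiling" : "not smiling";
    }

    public static void main(String[] args) {
        System.out.println(describeSmile(0.92f));
        System.out.println(describeSmile(0.10f));
        System.out.println(describeSmile(UNCOMPUTED));
    }
}
```

On Android, such probabilities come from faces returned by the Mobile Vision `FaceDetector`, with classification enabled on its builder; the detection itself runs on-device via Google Play services.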
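The priority distinction lives in the downstream message an app server sends to GCM. As a rough sketch, the server marks urgent messages with a priority field in the JSON payload; the topic name and notification text below are hypothetical, and a real server would use a proper JSON library and authenticated HTTP rather than string building:

```java
public class GcmPriorityDemo {
    /**
     * Builds a minimal GCM downstream payload. "high" priority asks for
     * immediate delivery (e.g. an incoming voice call alert); "normal"
     * allows delivery to be deferred to save battery.
     */
    static String buildPayload(String to, String priority, String body) {
        return String.format(
            "{\"to\":\"%s\",\"priority\":\"%s\",\"notification\":{\"body\":\"%s\"}}",
            to, priority, body);
    }

    public static void main(String[] args) {
        System.out.println(
            buildPayload("/topics/calls", "high", "Incoming voice call"));
    }
}
```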