Plantronics has been focusing on an API set for its headsets/earbuds that enables apps to interact with the headset. The APIs can be integrated into WebRTC apps and can create some very interesting value propositions. The headsets carry a whole set of sensors, and the API exposes them to an app:
- Wearing state: is the headset being worn or not?
- Proximity: is the headset near or far from the device it’s paired to?
- Head orientation: get the orientation of the user’s head in Euler angles or quaternions.
- Pedometer: use the device as a pedometer to get a step count.
- Free-fall detection: determine if the user or the device is in free fall.
- Tap detection: measure the direction and count of taps on the device’s surface.
- Nod detection: determine if the user has nodded or shaken their head.
- Raw sensors: get raw accelerometer, magnetometer, and gyroscope data to do your own processing.
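Since head orientation can arrive as either Euler angles or quaternions, an app will often need to convert between the two. The function below is a generic illustration of that conversion, not part of the Plantronics API; the axis convention (roll about x, pitch about y, yaw about z) is an assumption:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to Euler angles (roll, pitch, yaw) in radians.

    Assumes the common aerospace convention: roll about x, pitch about y,
    yaw about z. This is a generic conversion, not Plantronics SDK code.
    """
    # Roll: rotation about the x-axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Pitch: rotation about the y-axis; clamp to avoid math domain errors
    # from floating-point noise near gimbal lock
    sinp = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the z-axis
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```

For example, a quaternion representing a 90-degree head turn about the vertical axis yields a yaw of π/2 and zero roll and pitch.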
The APIs are simple to use and let an app subscribe to and query sensor information and events. They also allow the headset to interact with a secure element for safe, closed transactions and to create and respond to always-on voice commands. The APIs are supported across popular platforms: Android, iOS, Windows, and Mac OS X.
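The subscribe-and-query pattern described above can be sketched as a minimal event interface. The class, method, and event names here are illustrative stand-ins, not the actual Plantronics API:

```python
class HeadsetSensors:
    """A minimal mock of a subscribe-and-query sensor interface.

    Illustrative only -- the real SDK would be driven by headset events
    arriving over Bluetooth rather than the _emit() helper below.
    """

    def __init__(self):
        self._listeners = {}                         # event name -> callbacks
        self._state = {"wearing": False, "steps": 0}  # last known values

    def subscribe(self, event, callback):
        """Register a callback invoked whenever `event` fires."""
        self._listeners.setdefault(event, []).append(callback)

    def query(self, key):
        """Poll the last known value of a sensor."""
        return self._state[key]

    def _emit(self, event, value):
        """Simulate the headset pushing a new sensor reading."""
        self._state[event] = value
        for cb in self._listeners.get(event, []):
            cb(value)

# Usage: react to wearing-state changes and poll the pedometer.
sensors = HeadsetSensors()
sensors.subscribe("wearing", lambda worn: print("worn" if worn else "doffed"))
sensors._emit("wearing", True)   # prints "worn"
sensors._emit("steps", 42)
print(sensors.query("steps"))    # prints 42
```

The split between event subscriptions (for state changes like wearing/doffing) and polled queries (for counters like steps) mirrors how such sensor APIs are typically structured.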
Now Plantronics is sponsoring a contest with $85,000 in prize money for the best apps integrating with this technology. The contest is open to teams of one to five developers. Plantronics will review proposals and pick 20 teams to receive prototype devices for initial development. From that group, 10 will be chosen for the next round, each paired with a Plantronics mentor to develop a prototype. Finally, five teams will be chosen as finalists for live presentations to Plantronics, with the eventual winner receiving $50,000, second place $25,000, and third place $10,000. Teams can register here.
Plantronics sees a wide range of possibilities and has identified some typical applications that could be built using the APIs and sensors:
- FIDO enrollment/2-factor authentication for Website access
- HID security system keyless door entry via an NFC reader
- Computer unlock via Bluetooth/NFC
- Identity certification via wear state
- Payment system via NFC for Visa, MasterCard, etc.
- “Where am I looking?” app that transmits the wearer’s head orientation to a map application
- Remote control via head gesture (quad copter/camera/RC device)
- 3D audio control via head orientation
- Step tracking for exercise apps
- Enhanced presence detection for headset-enabled desktop apps
- Head gesture detection (nod/shake) for meeting systems
- Free-fall emergency dialing
I think there are very interesting applications in combining speech with the other sensors. For example, using the sensors on multiple attendees in a meeting room with an outbound video feed, an app could detect which person the majority of attendees were looking at and focus the camera on that person for transmission to the other parties on a video conference. Using two-factor device plus voice/word-based security for access is another obvious use.
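The meeting-room idea above reduces, in sketch form, to tallying which attendee each wearer's head is pointed at and picking the most-watched person. The simplified 2D geometry below (known seat positions, head yaw only, a fixed angular tolerance) is my own assumption for illustration, not a shipping algorithm:

```python
import math
from collections import Counter

def gaze_target(positions, yaws, tolerance=math.radians(15)):
    """Given each attendee's 2D seat position and head yaw (radians),
    return the index of the attendee most others are looking at.

    An attendee "votes" for anyone whose bearing from their seat falls
    within `tolerance` of their gaze direction. Returns None if nobody
    is looking at anyone.
    """
    votes = []
    for i, ((x, y), yaw) in enumerate(zip(positions, yaws)):
        for j, (tx, ty) in enumerate(positions):
            if i == j:
                continue
            bearing = math.atan2(ty - y, tx - x)
            # Smallest signed angle between gaze direction and bearing
            diff = math.atan2(math.sin(bearing - yaw),
                              math.cos(bearing - yaw))
            if abs(diff) <= tolerance:
                votes.append(j)
    return Counter(votes).most_common(1)[0][0] if votes else None
```

With three attendees where two face the third, the function returns the third attendee's index; a camera controller could then pan to that seat.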
Plantronics has been touting the value of this integration and of sensors in the headset; now it is putting up a real prize to get the market rolling. By using WebRTC as the basis for development, teams will be able to use an easy communications paradigm, whether for PC implementations or native mobile apps. The deadline for idea submission is November 15, 2014.
Edited by Maurice Nagle