Zensei

Embedded, Multi-electrode Bioimpedance Sensing
for Implicit, Ubiquitous User Recognition

Introduction

Interaction and connectivity are increasingly expanding into shared objects and environments, such as furniture, vehicles, lighting, and entertainment systems. For transparent personalization in such contexts, we see an opportunity for embedded recognition to complement traditional, explicit authentication.

We introduce Zensei, an implicit sensing system that leverages bio-sensing, signal processing, and machine learning to classify uninstrumented users by their body's electrical properties. Zensei could allow many objects to recognize users: for example, phones that unlock when held, cars that automatically adjust mirrors and seats, or power tools that restore user settings.

We introduce wide-spectrum bioimpedance hardware that measures both amplitude and phase. It extends previous approaches through multi-electrode sensing and high-speed wireless data collection for embedded devices. We implement the sensing in devices and furniture, where unique electrode configurations generate characteristic profiles based on each user's unique electrical properties. Finally, we discuss results from a comprehensive, longitudinal 22-day data collection experiment with 46 subjects. Our analysis shows promising classification accuracy and a low false acceptance rate.

This project was completed at MIT Media Lab in collaboration with Takram London and Google ATAP.

Zensei: New User Recognition Technology

Zensei enables physical objects to identify users by sensing the body's electrical properties and touch/grasping behavior through electrical frequency response sensing. It can make almost any object capable of user recognition: for example, a phone that unlocks as it is picked up, a car that adjusts the mirrors and seat based on the driver, or a shared tablet that activates parental mode when a child holds it.

The sensor hardware is small and works by sensing the electrical frequency response properties of a user touching a surface. An array of electrodes measures this response from multiple perspectives, building up a virtual representation of the user's electrical profile, which depends on their skin characteristics and touch/grasping style. Thousands of frequency properties are then collected and fed into a machine learning algorithm for subsequent user recognition. This electrode array data, combined with the use of an AC signal, allows Zensei to be implemented in a wide variety of surfaces and configurations, both with and without direct skin contact.
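The recognition step described above can be sketched as a standard supervised-learning pipeline. This is a minimal illustration, not the authors' exact implementation: the feature layout, the synthetic data, and the choice of an SVM classifier are all assumptions.

```python
# Hedged sketch: per-touch feature vectors (amplitude/phase values across
# electrode pairs and frequencies) fed to a classifier. Synthetic data
# stands in for real sensor readings.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical dataset: each row is one touch event, flattened into a
# vector of frequency-response measurements.
n_users, samples_per_user, n_features = 4, 20, 112
X = np.vstack([
    rng.normal(loc=u, scale=0.5, size=(samples_per_user, n_features))
    for u in range(n_users)
])
y = np.repeat(np.arange(n_users), samples_per_user)

# Standardize features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify a new touch event (here, a training sample for illustration).
pred = clf.predict(X[:1])
```

In practice the classifier would be trained per deployment, with each enrolled user contributing labeled touch events.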

Zensei Board

Design of a Fully Integrated, Wireless, and Embeddable Sensing Board

Zensei works by sensing the amplitude and phase response of a very small AC signal through an array of up to eight electrodes. The signal is transmitted through one electrode at a time and received at the remaining sensing electrodes, for every possible electrode rotation.
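The rotation scheme above can be sketched as a simple enumeration of transmit/receive pairs. The function name is illustrative, not taken from the Zensei firmware.

```python
# Minimal sketch of the electrode rotation: each electrode takes a turn
# transmitting while every remaining electrode receives.
def electrode_rotations(n_electrodes=8):
    """Yield (transmit, receive) index pairs for one full rotation."""
    for tx in range(n_electrodes):
        for rx in range(n_electrodes):
            if rx != tx:
                yield (tx, rx)

pairs = list(electrode_rotations(8))
# 8 transmitters x 7 receivers = 56 measurement pairs per sweep.
```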

On-board signal generators create sine waves at a range of programmable frequencies (~1 kHz to 1.5 MHz). The signal is then amplified and output at a selected electrode pair. A part of the user's body touches the electrodes, and the return signal's amplitude and phase components are captured with the Analog-to-Digital Converter (ADC) port of the microprocessor and an RF gain and phase detector IC. Our current sensor board hardware supports up to eight electrodes. The captured amplitude and phase data are then transferred to a PC or phone to be processed by a machine learning classifier.
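A single frequency sweep on one electrode pair might look like the sketch below. `read_gain_phase` is a hypothetical stand-in for the board's ADC and gain/phase detector readout, not a real API, and the response model is purely illustrative.

```python
# Illustrative frequency sweep on one electrode pair: ~1 kHz to 1.5 MHz
# in 50 logarithmically spaced steps, recording (amplitude, phase) at each.
import math

def read_gain_phase(freq_hz):
    # Placeholder for the hardware readout; returns (amplitude, phase_deg)
    # from a made-up, smoothly decaying response.
    return (1.0 / (1.0 + freq_hz * 1e-6), 45.0 * math.exp(-freq_hz * 1e-6))

freqs = [1e3 * (1.5e6 / 1e3) ** (i / 49) for i in range(50)]
sweep = [read_gain_phase(f) for f in freqs]  # one (amplitude, phase) per step
```

Repeating such a sweep over all electrode rotations yields the thousands of frequency properties mentioned above.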


Experimental Evaluation

To evaluate the sensor's performance, a comprehensive, longitudinal 22-day data collection experiment was performed with 46 subjects on three different test rigs: a fixed hand print (direct skin contact, constrained posture), a chair (no skin contact, constrained posture), and a smartphone (direct skin contact, unconstrained posture).

Best performance was achieved with direct skin contact combined with either a very constrained posture (a pure intrinsic biometric) or a highly variable posture (capturing the user's unique grasping style).
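The false acceptance rate reported in the evaluation can be illustrated as follows; the threshold and scores are made-up example values, not results from the study.

```python
# Hedged sketch of the false acceptance rate (FAR) metric: the fraction
# of impostor attempts whose classifier score clears the acceptance
# threshold, i.e. attempts that are wrongly accepted.
def false_acceptance_rate(impostor_scores, threshold):
    accepted = sum(1 for s in impostor_scores if s >= threshold)
    return accepted / len(impostor_scores)

# Example: 1 of 8 impostor attempts scores above a 0.9 threshold.
scores = [0.2, 0.5, 0.95, 0.1, 0.3, 0.6, 0.4, 0.7]
far = false_acceptance_rate(scores, 0.9)  # 1/8 = 0.125
```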

Sensor Exploded View

Smartphone Case

To further demonstrate the concept of casual identification on everyday devices, we built a round smartphone case to house our sensor board. The case's round shape invites unique grasping styles while simultaneously adding surface area for the painted-on electrodes.

The Zensei sensor board is mounted within the case and communicates with the attached smartphone via Bluetooth for continuous classification.

Publications

Munehiko Sato, Rohan S Puri, Alex Olwal, Yosuke Ushigome, Lukas Franciszkiewicz, Deepak Chandra, Ivan Poupyrev, and Ramesh Raskar. “Zensei: Embedded, Multi-electrode Bioimpedance Sensing for Implicit, Ubiquitous User Recognition,” ACM CHI ’17, Denver, CO, USA, May 2017.
Download Paper PDF
Visit ACM Digital Library page

Team

Munehiko Sato

Research Scientist
Camera Culture Group
MIT Media Lab

Rohan S Puri

Research Engineer
Camera Culture Group
MIT Media Lab

Alex Olwal

Senior Research Scientist
Google

Yosuke Ushigome

Design Engineer
Takram London

Lukas Franciszkiewicz

Design Engineer
Takram London

Deepak Chandra

Technical Program Lead
Google ATAP

Ivan Poupyrev

Technical Program Lead
Google ATAP

Ramesh Raskar

Associate Professor
Camera Culture Group
MIT Media Lab