
Novel Sensing

MultiSoft

UbiComp 2018

We introduce MultiSoft, a multilayer soft sensor capable of real-time contact localization, classification of deformation types, and estimation of deformation magnitudes. We propose a multimodal sensing pipeline that carries out both inverse-problem solving and machine learning tasks: electrical impedance tomography (EIT) for contact localization, and a support vector machine (SVM) for classifying deformations and regressing their magnitudes. The system is deformation-aware, maintaining a persistent single-point contact localization throughout a deformation. By updating a time-varying distribution of the conductivity change caused by deformations, single-point contact localization can be maintained and restored, supporting interaction that combines contact localization and deformations. We devise a multilayer structure to fabricate a highly stretchable and flexible soft sensor with a short settling time after excitation. Through a series of experiments and evaluations, we validate both raw sensor performance and multimodal sensing performance with the proposed method. We further demonstrate the applicability and feasibility of MultiSoft with example applications.
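As a rough illustration of the machine learning stage, the sketch below pairs an SVM classifier (deformation type) with an SVM regressor (magnitude), assuming scikit-learn and randomly generated stand-ins for EIT feature frames; the feature dimension, class set, and kernels are illustrative assumptions, not the paper's configuration.

import numpy as np
from sklearn.svm import SVC, SVR

# Toy training data: each row stands in for a flattened EIT voltage frame.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 64))           # 200 frames, 64 electrode-pair readings (assumed)
y_type = rng.integers(0, 3, size=200)          # 0: press, 1: stretch, 2: bend (assumed classes)
y_magnitude = rng.uniform(0.0, 1.0, size=200)  # normalized deformation magnitude

clf = SVC(kernel="rbf").fit(X_train, y_type)        # deformation-type classifier
reg = SVR(kernel="rbf").fit(X_train, y_magnitude)   # magnitude regressor

frame = rng.normal(size=(1, 64))               # one incoming sensor frame
print(clf.predict(frame), reg.predict(frame))  # predicted type and magnitude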

iSoft

We present iSoft, a single-volume soft sensor capable of sensing real-time continuous contact and unidirectional stretching. We propose a low-cost, easy way to fabricate such piezoresistive elastomer-based soft sensors for instant interactions. We employ an electrical impedance tomography (EIT) technique to estimate the changes in resistance distribution on the sensor caused by fingertip contact. To compensate for the rebound elasticity of the elastomer and achieve real-time continuous contact sensing, we apply a dynamic baseline update for EIT, triggered by fingertip contact and movement detection. Further, we support unidirectional stretch sensing using a model-based approach that works separately from continuous contact sensing. We also provide a software toolkit for users to design and deploy personalized interfaces with customized sensors. Through a series of experiments and evaluations, we validate the performance of contact and stretch sensing, and through example applications we show the variety of interactions enabled by iSoft.
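The dynamic-baseline idea can be sketched as follows: when no contact is detected, each incoming boundary-voltage frame is blended into the baseline so that the elastomer's slow rebound is tracked instead of being reported as signal. The threshold, blending factor, and frame shape below are assumptions for illustration, not values from the paper.

import numpy as np

CONTACT_THRESHOLD = 0.05   # assumed: mean |dV| above this indicates a fingertip contact

baseline = None

def process_frame(voltages):
    """Return the difference signal fed into EIT reconstruction."""
    global baseline
    if baseline is None:
        baseline = voltages.copy()
    diff = voltages - baseline
    if np.mean(np.abs(diff)) < CONTACT_THRESHOLD:
        # No contact: fold this frame into the baseline so the elastomer's
        # slow rebound is absorbed rather than reported as a touch.
        baseline = 0.9 * baseline + 0.1 * voltages
    return diff

# A rebound-sized drift (0.001) is absorbed; a contact-sized change (0.2) is not.
for frame in [np.zeros(32), np.zeros(32) + 0.001, np.zeros(32) + 0.2]:
    print(np.mean(np.abs(process_frame(frame))))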

TRing

We present TRing, a finger-worn input device that provides instant and customizable interactions. TRing offers a novel method for making plain objects interactive using an embedded magnet and a finger-worn device. With a magnetic sensing technique built around a particle filter, we compute the fingertip's position relative to the embedded magnet. We also offer a magnet-placement algorithm that guides the installation location of the magnet based on the user's interface customization. By simply inserting or attaching a small magnet, we bring interactivity to both fabricated and existing objects. In our evaluations, TRing shows an average tracking error of 8.6 mm in 3D space and a 2D targeting error of 4.96 mm, sufficient for implementing average-sized conventional controls such as buttons and sliders. A user study validates the input performance of TRing on a targeting task (92% accuracy within a 45 mm distance) and a cursor-control task (91% accuracy for a 10 mm target). Furthermore, we show examples that highlight the interaction capability of our approach.
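A minimal sketch of particle-filter magnetic tracking under a point-dipole field model is shown below; the particle count, noise parameters, and units are illustrative assumptions, and real magnetometer calibration is omitted. Particles are kept on one side of the magnet to sidestep the dipole field's point symmetry, B(-p) = B(p).

import numpy as np

rng = np.random.default_rng(1)
N = 500  # number of particles (assumed)

def dipole_field(p, m=np.array([0.0, 0.0, 1.0])):
    """Field of a unit point dipole m at displacement(s) p (unnormalized units)."""
    r = np.linalg.norm(p, axis=-1, keepdims=True)
    r_hat = p / r
    proj = np.sum(r_hat * m, axis=-1, keepdims=True)
    return (3.0 * r_hat * proj - m) / r**3

# Candidate fingertip positions, initialized in the half-space z > 0.
particles = rng.uniform([-1.0, -1.0, 0.05], [1.0, 1.0, 1.0], (N, 3))

def pf_step(measured_B, motion_noise=0.02, meas_noise=0.5):
    """One predict / weight / resample cycle; returns the position estimate."""
    global particles
    particles += rng.normal(0.0, motion_noise, particles.shape)  # motion model
    particles[:, 2] = np.abs(particles[:, 2])                    # stay in z > 0
    err = np.linalg.norm(dipole_field(particles) - measured_B, axis=1)
    weights = np.exp(-0.5 * (err / meas_noise) ** 2) + 1e-12     # likelihoods
    weights /= weights.sum()
    particles = particles[rng.choice(N, N, p=weights)]           # resample
    return particles.mean(axis=0)

true_pos = np.array([0.4, 0.2, 0.6])
for _ in range(20):
    estimate = pf_step(dipole_field(true_pos))
print(estimate)  # should approach true_pos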

TMotion

We present TMotion, a self-contained 3D input device that enables spatial interactions around a mobile device using a magnetic sensing technique. We embed a permanent magnet and an inertial measurement unit (IMU) in a stylus. When the stylus moves around the mobile device, we obtain continuous magnetometer readings. By numerically solving the nonlinear magnetic field equations with the orientation known from the IMU, we achieve 3D position tracking at an update rate greater than 30 Hz. Our experiments evaluated position-tracking accuracy, showing an average error of 4.55 mm within an 80 mm × 120 mm × 100 mm space, and confirmed tracking robustness across stylus orientations and dynamic traces. In task evaluations with users, we verified tracking and targeting performance in spatial interactions. We demonstrate example applications that highlight TMotion's interaction capability.
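Conceptually, position solving reduces to inverting a nonlinear field model given the magnet's orientation from the IMU. The sketch below uses a point-dipole model and scipy.optimize.least_squares as a generic nonlinear solver with made-up units; it is not necessarily the solver used in the paper.

import numpy as np
from scipy.optimize import least_squares

def dipole_field(pos, moment):
    """Field of a dipole with moment vector `moment` at displacement `pos`."""
    r = np.linalg.norm(pos)
    r_hat = pos / r
    return (3.0 * r_hat * np.dot(r_hat, moment) - moment) / r**3

def solve_position(measured_B, moment, x0):
    """Find the stylus position whose predicted field matches the reading."""
    residual = lambda p: dipole_field(p, moment) - measured_B
    return least_squares(residual, x0).x

moment = np.array([0.0, 0.0, 1.0])    # magnet orientation from the IMU (assumed unit moment)
true_pos = np.array([0.5, 0.8, 0.9])  # ground-truth position (arbitrary units)
B = dipole_field(true_pos, moment)    # simulated magnetometer reading
print(solve_position(B, moment, x0=np.array([0.3, 0.5, 0.5])))  # recovers true_pos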

Smart Textile

We propose a textile wearable device that enables multimodal sensing input for eyes-free mobile interaction during daily activities. Although existing input devices possess multimodal sensing capabilities in a small form factor, they still suffer from deficiencies in compactness and softness due to the nature of their embedded materials and components. For our prototype, we paint conductive silicone rubber onto a single layer of textile and stitch in conductive threads. From this single textile layer, multimodal sensing values (strain and pressure) are extracted via voltage dividers. Regression analysis, multi-level thresholding, and a temporal position-tracking algorithm are applied to capture the different levels and modes of finger interaction that make up the input taxonomy. We then demonstrate example applications whose interaction designs let users control existing mobile, wearable, and digital devices. The evaluation results confirm that the prototype achieves an accuracy of ≥80% across all input types and ≥88% for locating the specific interaction areas in eyes-free use, and that it is robust during motions related to daily activities. A multitasking study reveals that our prototype enables relatively fast responses with low perceived workload compared to existing eyes-free input methods.
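As a sketch of the readout path, the snippet below inverts a voltage divider to recover the sensor resistance and applies multi-level thresholding. The supply voltage, reference resistor, and resistance bands are hypothetical, and the paper's regression model is not reproduced; piezoresistive rubber is assumed to drop in resistance under pressure.

V_CC = 3.3        # supply voltage (V), assumed
R_REF = 10_000.0  # fixed divider resistor (ohms), assumed

def sensor_resistance(v_out):
    """Invert the divider: V_out = V_CC * R_sensor / (R_sensor + R_REF)."""
    return R_REF * v_out / (V_CC - v_out)

# Hypothetical resistance bands mapped to pressure levels (high R = no press).
PRESSURE_LEVELS = [(30_000, "none"), (15_000, "light"), (0, "firm")]

def classify_pressure(v_out):
    r = sensor_resistance(v_out)
    for threshold, label in PRESSURE_LEVELS:
        if r >= threshold:
            return label
    return "firm"

print(classify_pressure(1.2))  # maps a 1.2 V reading to a pressure level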
