A single-motor haptic interface that generates realistic soft-tissue feedback for palpation training.

(More information and visual content will be disclosed only after our research paper is published, in order to protect Morph Lab's intellectual property.)
Two hands pressing on a palpation training device, with a laptop screen displaying a virtual patient's face in pain.

Background and Motivation

Palpation is a key diagnostic technique used by medical professionals to assess a person's health by feeling for abnormalities in the body. It is often used in physical examinations to detect conditions such as tumours, swelling, or tenderness (pain or discomfort when an area is touched).

However, conventional methods of training medical professionals in palpation face several challenges, including:

1. Difficulty in obtaining consent. Palpation can only be performed on a consenting individual, and the person being examined must be comfortable with being touched.

2. Skill generalisability. The skills developed through palpation training on specific patients / volunteers may not transfer well to other individuals, as each person's body is unique and may have different characteristics.

3. Safety concerns. Palpation training on a real human body can be harmful if it is not performed correctly.

These limitations make conventional palpation training expensive and unreliable. Hence, there is a need to develop effective and reliable training tools that can provide medical professionals with the opportunity to practise and improve their palpation skills in a safe and controlled environment.

In this Robotic Research Project, I teamed up with Chuankai, who specialises in material design, manufacturing processes, and control systems. We set out to develop a novel haptic interface that addresses these limitations, building on the RoboPatient project of the Morph Lab. We had only 8 weeks to complete physical prototyping, testing, and evaluation with a human-subject experiment. This was a great challenge for us, but we completed all of our milestones, including multi-modal integration with a visual feedback programme.

Research Journey

Our four-stage research journey

First, to ensure the haptic responses simulated by our interface were realistic, we obtained baseline stiffness data using an indentation test: a 3D-printed finger mounted on a force sensor measured the force response (stiffness) with respect to the indentation depth on a piece of dummy soft tissue. We compared the stiffness curves from regions with and without abnormalities (simulated with a plastic tumour embedded within the tissue).
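Conceptually, the stiffness at each depth is the local slope of the force-depth curve, and a stiff inclusion shows up as a consistently steeper curve. The sketch below illustrates this with made-up numbers; none of the values are our actual measurements.

```python
import numpy as np

# Hypothetical force-vs-depth samples from an indentation test (mm, N).
# These values are illustrative only, not the study's measured data.
depth = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
force_normal = np.array([0.0, 0.3, 0.7, 1.2, 1.8, 2.5, 3.3])   # plain tissue
force_tumour = np.array([0.0, 0.5, 1.2, 2.1, 3.3, 4.8, 6.6])   # stiff inclusion

def local_stiffness(depth_mm, force_n):
    """Approximate stiffness (N/mm) as the numerical derivative dF/dz."""
    return np.gradient(force_n, depth_mm)

k_normal = local_stiffness(depth, force_normal)
k_tumour = local_stiffness(depth, force_tumour)

# The region over the inclusion should read stiffer at every depth.
print(np.all(k_tumour >= k_normal))  # True
```

Comparing the two stiffness profiles in this way is what lets a trainee (or a controller replaying the curves) distinguish the abnormal region from healthy tissue.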

Next, with that baseline data, we selected a brushless DC (BLDC) motor as the actuator for generating the haptic response force. The rationale for choosing a BLDC motor over other actuation methods included the ease and reliability of output control, configurable compliance, space and weight constraints, and production cost. With a BLDC motor driven by SimpleFOC, we developed our own control algorithm and tested its performance margin. The results revealed that a better hardware design was required to simulate a suitable range of force responses.
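To make the control idea concrete, here is a minimal Python sketch of the force-to-torque mapping such a controller performs: read the indentation depth, look up the target force from a stiffness model, and command the equivalent motor torque. The lever arm, stiffness gains, and depth are placeholder values, and the real controller ran on an Arduino via SimpleFOC, which this sketch does not reproduce.

```python
def desired_force(depth_mm, k_soft=1.0, k_tumour=3.0, over_tumour=False):
    """Piecewise-linear stiffness model: F = k * z (force in N, k in N/mm).
    The gains are placeholders for curves fitted from an indentation test."""
    k = k_tumour if over_tumour else k_soft
    return k * depth_mm

def force_to_torque(target_force_n, lever_arm_m=0.03):
    """Convert a desired fingertip force into a motor torque command (N*m).
    The 0.03 m lever arm is an assumed value, not our real geometry."""
    return target_force_n * lever_arm_m

# One step of the loop (conceptually): depth in -> torque command out.
depth = 2.0  # mm, as if read back from the motor encoder
torque_cmd = force_to_torque(desired_force(depth, over_tumour=True))
print(round(torque_cmd, 3))  # 6.0 N * 0.03 m = 0.18
```

Running this mapping at the motor driver's loop rate is what makes the interface feel stiffer over the simulated tumour than over plain tissue.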

As a result, we implemented a gear system to increase the torque output (by a factor of six). With satisfactory performance, we designed the final part of the research: a pilot study in which human participants tried the system and provided perceptual feedback. This would help us debug any issues and make sure our multi-modal integration was robust.
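The effect of the gear stage comes down to simple arithmetic: a 6:1 reduction multiplies output torque by six while dividing output speed by the same factor. The 0.05 N·m motor torque below is an assumed example, not our measured value.

```python
def output_torque(motor_torque_nm, gear_ratio=6.0, efficiency=1.0):
    """Output torque scales with the gear ratio (gear losses ignored here)."""
    return motor_torque_nm * gear_ratio * efficiency

def output_speed(motor_speed_rpm, gear_ratio=6.0):
    """Output speed drops by the same ratio: the classic torque/speed trade."""
    return motor_speed_rpm / gear_ratio

print(round(output_torque(0.05), 3))  # 0.3  (0.05 N*m -> 0.3 N*m at the output)
print(output_speed(600.0))            # 100.0 rpm
```

The speed penalty is acceptable for palpation, where indentation motions are slow; the extra torque headroom is what widened the simulable force range.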

Finally, we obtained consent from the participants, collected their results, and compared their palpation performance to gain insights into the effectiveness of such an interface in facilitating abnormality detection.

Design Research Outcome

We cannot disclose details of our research findings until our research paper has been officially published.

However, I have summarised the skills I have gained from this Robotic Research Programme.

Technical Skills

  • Improved prototyping skills in mechatronics (e.g. Arduino and BLDC motors)

  • Developed a torque control algorithm

  • Designed experiments to obtain high-quality data from the ATI force sensor

  • Programmed in C# to integrate the haptic interface with a training system

Communication Skills

  • Weekly updates to our supervisor and the module leader

  • Produced a written report to the IEEE standard

  • Delivered a viva-style presentation and handled questions

Research Management Skills

  • Project management and task-tracking

  • Agile development and task prioritisation under constraints

  • Resource management (budgeting and documenting equipment inventory)

  • Risk assessments, ethics clearance, and the participant consent process

Special Thanks

I would like to thank those who helped with the execution of this study, especially Dr Thilina Dulantha Lalitharatne, our supervisor, and Professor Thrishantha Nanayakkara, who runs this research module.

Yukun (Morph Lab) and Xinran (REDS Lab) helped with 3D printing and motor debugging. Rusne (Bioengineering) built the Unity application, which formed the foundation of the multi-modal integration. Barry (Morph Lab) helped with equipment supply and guidance on using the motor driver. All the participants in the pilot trial helped us improve the system design; without them, this study would not have been possible.