Publications by authors named "Etienne Burdet"

93 Publications

A Three-Limb Teleoperated Robotic System with Foot Control for Flexible Endoscopic Surgery.

Ann Biomed Eng 2021 Apr 8. Epub 2021 Apr 8.

Robotics Research Center, School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore, Singapore.

Flexible endoscopy requires considerable skill to manipulate both the endoscope and the associated instruments. In most robotic flexible endoscopic systems, the endoscope and instruments are controlled separately by two operators, which may result in communication errors and inefficient operation. Our solution is to enable the surgeon to control both the endoscope and the instruments. Here, we present a novel teleoperated robotic endoscopic system commanded by one operator using the continuous and simultaneous movements of their two hands and one foot. This 13-degree-of-freedom (DoF) system integrates a foot-controlled robotic flexible endoscope and two hand-controlled robotic endoscopic instruments, a robotic grasper and a robotic cauterizing hook. A dedicated foot interface transfers the natural foot movements to the 4-DoF movements of the endoscope, while two commercial hand interfaces map the movements of the two hands to the two instruments individually. An ex-vivo experiment was carried out by six subjects without surgical experience, in which the simultaneous control with foot and hands was compared with a sequential clutch-based hand control. The participants could successfully teleoperate the endoscope and the two instruments to cut tissue at scattered target areas in a porcine stomach. Foot control yielded 43.7% faster task completion and required less mental effort than the clutch-based hand control scheme, demonstrating the concept of three-limb teleoperated surgery and validating the developed flexible endoscopic system.
Source
http://dx.doi.org/10.1007/s10439-021-02766-3
April 2021

Arm movement adaptation to concurrent pain constraints.

Sci Rep 2021 Mar 24;11(1):6792. Epub 2021 Mar 24.

Chair of Robotics and System Intelligence, Munich School of Robotics and Machine Intelligence, Technical University Munich, 80797, Munich, Germany.

How do humans coordinate their movements in order to avoid pain? This paper investigates a motor task in the presence of concurrent potential pain sources: the arm must be withdrawn to avoid a slap on the hand while avoiding an obstacle at the elbow that delivers a noxious electrical stimulus. The results show that our subjects learned to control the hand retraction movement in order to avoid the potential pain. Subject-specific motor strategies were used to modify the joint movement coordination to avoid hitting the obstacle with the elbow, at the cost of an increased risk of a hand slap. Furthermore, subjects used a conservative strategy, behaving as if the obstacle were present in 100% of the trials.
Source
http://dx.doi.org/10.1038/s41598-021-86173-7
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7991641
March 2021

EEG measures of sensorimotor processing and their development are abnormal in children with isolated dystonia and dystonic cerebral palsy.

Neuroimage Clin 2021 Jan 19;30:102569. Epub 2021 Jan 19.

Children's Neurosciences Department, Evelina London Children's Hospital, Guy's and St Thomas NHS Foundation Trust, London SE1 7EH, United Kingdom. Electronic address:

Dystonia is a disorder of sensorimotor integration associated with abnormal oscillatory activity within the basal ganglia-thalamo-cortical networks. Event-related changes in spectral EEG activity reflect cortical processing but are sparsely investigated in relation to sensorimotor processing in dystonia. This study investigates modulation of sensorimotor cortex EEG activity in response to a proprioceptive stimulus in children with dystonia and dystonic cerebral palsy (CP). Proprioceptive stimuli, comprising brief stretches of the wrist flexors, were delivered via a robotic wrist interface to 30 young people with dystonia (20 isolated genetic/idiopathic and 10 dystonic CP) and 22 controls (mean age 12.7 years). Scalp EEG was recorded using the 10-20 international system, and the relative change in post-stimulus power with respect to baseline was calculated for the alpha (8-12 Hz) and beta (14-30 Hz) frequency bands. A clear developmental profile in event-related spectral changes was seen in controls. Controls showed a prominent early alpha/mu band event-related desynchronisation (ERD) followed by an event-related synchronisation (ERS) over the contralateral sensorimotor cortex following movement of either hand. The alpha ERD was significantly smaller in the dystonia groups for both dominant and non-dominant hand movement (ANCOVA across the 3 groups with age as covariate: dominant hand F(2,47) = 4.45, p = 0.017; non-dominant hand F(2,42) = 9.397, p < 0.001). Alpha ERS was significantly smaller in dystonia for the dominant hand (ANCOVA F(2,47) = 7.786, p = 0.001). There was no significant difference in ERD or ERS between genetic/idiopathic dystonia and dystonic CP. Conclusion: Modulation of alpha/mu activity by a proprioceptive stimulus is reduced in dystonia, demonstrating a developmental abnormality of sensorimotor processing which is common to isolated genetic/idiopathic and acquired dystonia/dystonic CP.
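The core measure described above, the relative change in post-stimulus band power with respect to a pre-stimulus baseline (negative for ERD, positive for ERS), can be sketched as follows. This is a generic illustration rather than the study's analysis pipeline; the sampling rate, window lengths and Welch-based band power are assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band):
    """Mean power spectral density of signal x within a frequency band [Hz]."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), int(fs)))
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

def erd_ers(epoch, fs, band, baseline_s=1.0):
    """Relative post-stimulus power change (%) with respect to the pre-stimulus baseline.

    epoch: 1-D EEG trace with the stimulus occurring `baseline_s` seconds after its start.
    Negative values indicate desynchronisation (ERD), positive values synchronisation (ERS).
    """
    n0 = int(baseline_s * fs)
    p_base = band_power(epoch[:n0], fs, band)
    p_post = band_power(epoch[n0:], fs, band)
    return 100.0 * (p_post - p_base) / p_base

# Example with synthetic data: alpha (8-12 Hz) and beta (14-30 Hz) bands
fs = 256
rng = np.random.default_rng(0)
epoch = rng.standard_normal(3 * fs)          # 1 s baseline + 2 s post-stimulus
print(erd_ers(epoch, fs, (8, 12)), erd_ers(epoch, fs, (14, 30)))
```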
Source
http://dx.doi.org/10.1016/j.nicl.2021.102569
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8044718
January 2021

Analogous adaptations in speed, impulse and endpoint stiffness when learning a real and virtual insertion task with haptic feedback.

Sci Rep 2020 12 18;10(1):22342. Epub 2020 Dec 18.

Imperial College of Science, Technology and Medicine, South Kensington, London, SW7 2AZ, UK.

Humans have the ability to use a diverse range of handheld tools. Owing to its versatility, a virtual environment with haptic force feedback is ideally suited to investigating motor learning during tool use. However, few simulators exist to recreate the dynamic interactions of real tool use, and no study has compared the correlates of motor learning between a real and a virtual tooling task. To this end, we compared two groups of participants who learned to insert either a real or a virtual tool into a fixture. The trial duration, the movement speed, the force impulse after insertion and the endpoint stiffness magnitude decreased as a function of trials, and they changed at comparable rates in both environments. A ballistic insertion strategy observed in both environments suggests some interdependence between controlling motion and controlling interaction, contradicting a prominent theory that these two control modalities are independent of one another. Our results suggest that the brain learns real and virtual insertion in a comparable manner, thereby supporting the use of a virtual tooling task with haptic feedback to investigate motor learning during tool use.
Source
http://dx.doi.org/10.1038/s41598-020-79433-5
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7749137
December 2020

Abnormal microscale neuronal connectivity triggered by a proprioceptive stimulus in dystonia.

Sci Rep 2020 11 27;10(1):20758. Epub 2020 Nov 27.

Department of Basic and Clinical Neuroscience, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, SE5 9RX, UK.

We investigated modulation of functional neuronal connectivity by a proprioceptive stimulus in sixteen young people with dystonia and eight controls. A robotic wrist interface delivered controlled passive wrist extension movements, the onset of which was synchronised with scalp EEG recordings. Data were segmented into epochs around the stimulus and up to 160 epochs per subject were averaged to produce a Stretch Evoked Potential (StretchEP). Event-related network dynamics were estimated using a methodology that features Wavelet Transform Coherency (WTC). Global Microscale Nodal Strength (GMNS) was introduced to estimate overall engagement of areas into short-lived networks related to the StretchEP, and Global Connectedness (GC) estimated the spatial extent of the StretchEP networks. Dynamic Connectivity Maps showed a striking difference between dystonia and controls, with particularly strong theta band event-related connectivity in dystonia. GC also showed a trend towards higher values in dystonia than controls. In summary, we demonstrate the feasibility of this method to investigate event-related neuronal connectivity in relation to a proprioceptive stimulus in a paediatric patient population. Young people with dystonia show an exaggerated network response to a proprioceptive stimulus, displaying both excessive theta-band synchronisation across the sensorimotor network and widespread engagement of cortical regions in the activated network.
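A minimal sketch of the nodal-strength idea: given an event-related coherence matrix between channels (for instance, wavelet coherence averaged over the post-stimulus window and band of interest), each node's strength is its mean coherence with all other nodes, and a global value averages over nodes; a thresholded pair count serves as a crude connectedness proxy. Function names, the threshold and the input format are assumptions, not the paper's WTC implementation.

```python
import numpy as np

def global_nodal_strength(coh):
    """Average coherence of each electrode (node) with all other electrodes,
    then averaged over nodes; `coh` is a symmetric [n_channels x n_channels]
    coherence matrix with ones on the diagonal."""
    coh = np.asarray(coh, dtype=float)
    n = coh.shape[0]
    off_diag = coh.sum(axis=1) - np.diag(coh)   # exclude self-coherence
    nodal_strength = off_diag / (n - 1)
    return nodal_strength.mean()

def global_connectedness(coh, threshold=0.5):
    """Fraction of channel pairs whose coherence exceeds a threshold,
    a crude proxy for the spatial extent of the activated network."""
    coh = np.asarray(coh, dtype=float)
    iu = np.triu_indices_from(coh, k=1)
    return np.mean(coh[iu] > threshold)

coh = np.array([[1.0, 0.8, 0.3],
                [0.8, 1.0, 0.6],
                [0.3, 0.6, 1.0]])
print(global_nodal_strength(coh), global_connectedness(coh))
```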
Source
http://dx.doi.org/10.1038/s41598-020-77533-w
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7695825
November 2020

Flexible assimilation of human's target for versatile human-robot physical interaction.

IEEE Trans Haptics 2020 Nov 23;PP. Epub 2020 Nov 23.

Recent studies on the physical interaction between humans have revealed their ability to read the partner's motion plan and use it to improve their own control. Inspired by these results, we develop an intention assimilation controller (IAC) that enables a contact robot to estimate the human's virtual target from the interaction force and combine it with its own target to plan motion. While the virtual target depends on the control gains assumed for the human, we show that this does not affect the stability of the human-robot system, and our novel scheme covers a continuum of interaction behaviours from assistance to competition. Simulations and experiments illustrate how the IAC can assist the human or compete with them to prevent collisions. We demonstrate the IAC's advantages over related methods, such as faster convergence to a target, guidance with less force, safer obstacle avoidance and a wider range of interaction behaviours.
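The idea of inferring the partner's virtual target from the interaction force can be illustrated with a simple impedance model: if the human is assumed to act like a spring-damper pulling toward their target, that target can be recovered from the measured force and the assumed gains, then blended with the robot's own goal. This is only a schematic reconstruction; the gains, the blending weight and all names are assumptions, not the published IAC equations.

```python
import numpy as np

def estimate_virtual_target(x, dx, f_int, k_h=200.0, b_h=10.0):
    """Invert an assumed human impedance model f = k_h*(x_h* - x) - b_h*dx
    to recover the human's virtual target x_h* from the interaction force."""
    return x + (f_int + b_h * dx) / k_h

def blended_target(x_robot_goal, x_human_goal, alpha=0.5):
    """Combine the robot's own goal with the inferred human goal;
    alpha = 1 corresponds to pure assistance toward the human's target."""
    return alpha * x_human_goal + (1.0 - alpha) * x_robot_goal

x, dx, f_int = 0.10, 0.02, 4.0          # position [m], velocity [m/s], force [N]
x_h = estimate_virtual_target(x, dx, f_int)
print(x_h, blended_target(0.15, x_h, alpha=0.7))
```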
Source
http://dx.doi.org/10.1109/TOH.2020.3039725
November 2020

A Multimodal Intention Detection Sensor Suite for Shared Autonomy of Upper-Limb Robotic Prostheses.

Sensors (Basel) 2020 Oct 27;20(21). Epub 2020 Oct 27.

Department of Mechanical Engineering, UK Dementia Research Institute Care-Research and Technology Centre (DRI-CRT) Imperial College London, London SW7 2AZ, UK.

Neurorobotic augmentation (e.g., robotic assist) is now in regular use to support individuals suffering from impaired motor functions. A major unresolved challenge, however, is the excessive cognitive load necessary for the human-machine interface (HMI). Grasp control remains one of the most challenging HMI tasks, demanding simultaneous, agile, and precise control of multiple degrees-of-freedom (DoFs) while following a specific timing pattern in the joint and human-robot task spaces. Most commercially available systems use either an indirect mode-switching configuration or a limited sequential control strategy, limiting activation to one DoF at a time. To address this challenge, we introduce a shared autonomy framework centred around a low-cost multi-modal sensor suite fusing: (a) mechanomyography (MMG) to estimate the intended muscle activation, (b) camera-based visual information for integrated autonomous object recognition, and (c) inertial measurement to enhance intention prediction based on the grasping trajectory. The complete system predicts user intent for grasp based on measured dynamical features during natural motions. A total of 84 motion features were extracted from the sensor suite, and tests were conducted with 10 able-bodied participants and 1 amputee grasping common household objects with a robotic hand. Real-time grasp classification accuracy using visual and motion features reached 100%, 82.5%, and 88.9% across all participants for detecting and executing grasping actions for a bottle, lid, and box, respectively. The proposed multimodal sensor suite is a novel approach for predicting different grasp strategies and automating task performance using a commercial upper-limb prosthetic device. The system also shows potential to improve the usability of modern neurorobotic systems due to the intuitive control design.
Source
http://dx.doi.org/10.3390/s20216097
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7662487
October 2020

An fMRI Compatible Smart Device for Measuring Palmar Grasping Actions in Newborns.

Sensors (Basel) 2020 Oct 23;20(21). Epub 2020 Oct 23.

Unit of Measurements and Biomedical Instrumentation, Università Campus Bio-Medico di Roma, Via Alvaro del Portillo, 00128 Rome, Italy.

Grasping is one of the first dominant motor behaviors that enable a newborn infant to interact with its surroundings. Although atypical grasping patterns are considered predictive of neuromotor disorders and injuries, their clinical assessment suffers from examiner subjectivity, and the underlying neuropathophysiology is poorly understood. Therefore, the combination of technology with functional magnetic resonance imaging (fMRI) may help to precisely map the brain activity associated with grasping and thus provide important insights into how functional outcomes can be improved following cerebral injury. This work introduces an MR-compatible device (i.e., a smart graspable device (SGD)) for detecting grasping actions in newborn infants. Immunity to electromagnetic interference (EMI) is achieved using a fiber Bragg grating sensor. Its biocompatibility and the absence of electrical signals propagating through the fiber make the safety profile of the SGD particularly favorable for use with fragile infants. Firstly, the SGD design, fabrication, and metrological characterization are described, followed by preliminary assessments on a preterm newborn infant and an adult during an fMRI experiment. The results demonstrate that the combination of the SGD and fMRI can safely and precisely identify the brain activity associated with grasping behavior, which may enable early diagnosis of motor impairment and help guide tailored rehabilitation programs.
Source
http://dx.doi.org/10.3390/s20216040
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7660640
October 2020

A Clustering-Based Approach to Identify Joint Impedance During Walking.

IEEE Trans Neural Syst Rehabil Eng 2020 08 29;28(8):1808-1816. Epub 2020 Jun 29.

Mechanical impedance, which changes with posture and muscle activation, characterizes how the central nervous system regulates the interaction with the environment. Traditional approaches to impedance estimation, based on averaging of movement kinetics, require a large number of trials and may introduce bias to the estimation due to the high variability in a repeated or periodic movement. Here, we introduce a data-driven modeling technique to estimate joint impedance that accounts for this large gait variability. The proposed method can be used to estimate impedance in both the stance and swing phases of walking. A two-pass clustering approach is used to extract groups of unperturbed gait data and estimate candidate baselines. Patterns of perturbed data are then matched with the most similar unperturbed baseline. The kinematic and torque deviations from the baselines are regressed locally to compute joint impedance at different gait phases. Simulations using the trajectory data of a subject's gait at different speeds demonstrate a more accurate estimation of ankle stiffness and damping with the proposed clustering-based method when compared with two alternatives: i) using average unperturbed baselines, and ii) matching shifted and scaled average unperturbed velocity baselines. Furthermore, the proposed method requires fewer trials than methods based on average unperturbed baselines. Experimental results on human hip impedance estimation show the feasibility of the clustering-based technique and verify that it reduces the estimation variability.
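A stripped-down sketch of the pipeline: cluster unperturbed gait cycles to form candidate baselines, match each perturbed cycle to its most similar baseline, and regress the torque deviation on the kinematic deviations to obtain stiffness and damping. It uses a single KMeans pass and a global regression for brevity; the paper's two-pass clustering and phase-local estimates are not reproduced, and the data shapes and names are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def estimate_impedance(unpert_angle, unpert_torque, pert_angle, pert_vel, pert_torque,
                       n_clusters=3, dt=0.01):
    """Cluster unperturbed cycles (rows = cycles, columns = samples of one stride),
    match each perturbed cycle to the nearest cluster baseline, and regress the
    torque deviation on the angle/velocity deviations: dtau = K*dq + B*dq_dot."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(unpert_angle)
    angle_baselines = np.vstack([unpert_angle[km.labels_ == c].mean(axis=0)
                                 for c in range(n_clusters)])
    torque_baselines = np.vstack([unpert_torque[km.labels_ == c].mean(axis=0)
                                  for c in range(n_clusters)])
    vel_baselines = np.gradient(angle_baselines, dt, axis=1)

    dq, dqd, dtau = [], [], []
    for qa, qv, tq in zip(pert_angle, pert_vel, pert_torque):
        c = km.predict(qa[None, :])[0]               # most similar unperturbed baseline
        dq.append(qa - angle_baselines[c])
        dqd.append(qv - vel_baselines[c])
        dtau.append(tq - torque_baselines[c])
    X = np.column_stack([np.concatenate(dq), np.concatenate(dqd)])
    y = np.concatenate(dtau)
    (K, B), *_ = np.linalg.lstsq(X, y, rcond=None)   # stiffness, damping
    return K, B
```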
Source
http://dx.doi.org/10.1109/TNSRE.2020.3005389
August 2020

Cable-Driven Robotic Interface for Lower Limb Neuromechanics Identification.

IEEE Trans Biomed Eng 2021 Feb 20;68(2):461-469. Epub 2021 Jan 20.

This paper presents a versatile cable-driven robotic interface to investigate the single-joint neuromechanics of the hip, knee and ankle in the sagittal plane. This endpoint-based interface offers highly dynamic interaction and accurate position control (as is typically required for neuromechanics identification), and provides measurements of position, interaction force and electromyography (EMG) of leg muscles. It can be used with the subject upright, corresponding to a natural posture during walking or standing, and does not impose kinematic constraints on a joint, in contrast to existing interfaces. Mechanical evaluations demonstrated that the interface yields a rigidity above 500 N/m with low viscosity. Tests with a rigid dummy leg and linear springs show that it can identify the mechanical impedance of a limb accurately. A smooth perturbation is developed and tested with a human subject, which can be used to estimate hip neuromechanics.
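The validation with a rigid dummy leg and linear springs amounts to fitting a stiffness-damping-inertia model to a recorded perturbation response; a generic least-squares version is sketched below. The model structure, names and synthetic check are assumptions, not the paper's identification procedure.

```python
import numpy as np

def fit_impedance(pos, force, dt):
    """Least-squares fit of f = k*x + b*x_dot + m*x_ddot to a recorded
    perturbation response; returns stiffness k [N/m], damping b [Ns/m]
    and apparent mass m [kg]."""
    x = np.asarray(pos) - np.mean(pos)          # displacement about the operating point
    xd = np.gradient(x, dt)
    xdd = np.gradient(xd, dt)
    A = np.column_stack([x, xd, xdd])
    (k, b, m), *_ = np.linalg.lstsq(A, np.asarray(force), rcond=None)
    return k, b, m

# Synthetic check with a known 600 N/m spring and light damping
dt, t = 0.001, np.arange(0, 2, 0.001)
x = 0.01 * np.sin(2 * np.pi * 3 * t)
f = 600 * x + 5 * np.gradient(x, dt)
print(fit_impedance(x + 0.1, f, dt))            # approximately (600, 5, ~0)
```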
Source
http://dx.doi.org/10.1109/TBME.2020.3004491
February 2021

Estimating Human Wrist Stiffness during a Tooling Task.

Sensors (Basel) 2020 Jun 8;20(11). Epub 2020 Jun 8.

Robotics Research Center, School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore 639798, Singapore.

In this work, we propose a practical approach to estimate human joint stiffness during tooling tasks for the purpose of programming a robot by demonstration. More specifically, we estimate the stiffness along the wrist radial-ulnar deviation while a human operator performs flexion-extension movements during a polishing task. The joint stiffness information allows skills to be transferred from expert human operators to industrial robots. A typical hand-held abrasive tool used by humans during finishing tasks was instrumented at the handle (through which both robots and humans are attached to the tool) to assess the 3D force/torque interactions between operator and tool during the finishing task, as well as the 3D kinematics of the tool itself. Building upon stochastic methods for human arm impedance estimation, the novelty of our approach is that we rely on the natural variability taking place during the multi-pass task itself to estimate (neuro-)mechanical impedance during motion. Our apparatus (a hand-held finishing tool instrumented with motion capture and multi-axis force/torque sensors) and algorithms (for filtering and impedance estimation) were first tested on an impedance-controlled industrial robot carrying out the finishing task of interest, where the impedance could be pre-programmed. We were able to accurately estimate impedance in this case. The same apparatus and algorithms were then applied to the same task performed by human operators. The stiffness values of the human operator, at different force levels, correlated positively with the muscular activity measured during the same task.
Source
http://dx.doi.org/10.3390/s20113260
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7308925
June 2020

The Influence of Posture, Applied Force and Perturbation Direction on Hip Joint Viscoelasticity.

IEEE Trans Neural Syst Rehabil Eng 2020 05 26;28(5):1138-1145. Epub 2020 Mar 26.

Limb viscoelasticity is a critical neuromechanical factor used to regulate the interaction with the environment. It plays a key role in modelling human sensorimotor control, and can be used to assess the condition of healthy and neurologically affected individuals. This paper reports the estimation of hip joint viscoelasticity during voluntary force control using a novel device that applies a leg displacement without constraining the hip joint. The influence of hip angle, applied limb force and perturbation direction on the stiffness and viscosity values was studied in ten subjects. No difference was detected in the hip joint stiffness between the dominant and non-dominant legs, but a small dependency was observed on the perturbation direction. Both hip stiffness and viscosity increased monotonically with the applied force magnitude, with posture being observed to have a slight influence. These results are in line with previous measurements carried out on upper limbs, and can be used as a baseline for lower limb movement simulation and further neuromechanical investigations.
Source
http://dx.doi.org/10.1109/TNSRE.2020.2983515
May 2020

Identification of the best strategy to command variable stiffness using electromyographic signals.

J Neural Eng 2020 02 18;17(1):016058. Epub 2020 Feb 18.

Department of Biomedical and Dental Sciences and Morphofunctional Imaging, University of Messina, Messina, Italy. Department of Mechanical and Aerospace Engineering, Politecnico di Torino, Turin, Italy. Author to whom any correspondence should be addressed.

Objective: In recent decades, many EMG-controlled robotic devices have been developed. Since stiffness control may be required to perform skillful interactions, different groups have developed devices whose stiffness is controlled in real time based on EMG signal samples collected from the operator. However, this control strategy may be fatiguing. In this study, we proposed and experimentally validated a novel stiffness control strategy based on the average muscle co-contraction estimated from EMG samples collected over the previous 1 or 2 s.

Approach: Nine subjects performed a tracking task with their right wrist in five different sessions. In four sessions a haptic device (Hi-5) applied a sinusoidal perturbing torque. In the Baseline session, co-contraction reduced the effect of the perturbation only by stiffening the wrist. In contrast, during the aided sessions the perturbation amplitude was also reduced (mimicking the effect of additional stiffening provided by an EMG-driven robotic device), either proportionally to the co-contraction exerted by the subject sample-by-sample (Proportional), or according to the average co-contraction exerted over the previous 1 s (Integral 1s) or 2 s (Integral 2s). Task error, metabolic cost during the tracking task, perceived fatigue, and the median EMG frequency calculated during sub-maximal isometric torque generation tasks that alternated with the tracking were compared across sessions.

Main Results: Positive effects of the reduction of the perturbation provided by co-contraction estimation were identified in all the investigated variables. The Integral 1s session showed a lower metabolic cost than the Proportional session, and lower perceived fatigue than both the Proportional and the Integral 2s sessions.

Significance: This study's results showed that controlling the stiffness of an EMG-driven robotic device proportionally to the operator's co-contraction, averaged over the previous 1 s, represents the best control strategy, because it incurred a lower metabolic cost and led to lower perceived fatigue.
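The "Integral 1s" scheme can be sketched as a causal moving average of a co-contraction index computed from the rectified EMG of an antagonist pair; the window length, the co-contraction definition (minimum of the two normalised envelopes) and the gain are illustrative assumptions.

```python
import numpy as np

def cocontraction_index(emg_flexor, emg_extensor):
    """Sample-by-sample co-contraction of an antagonist pair,
    taken as the minimum of the two normalised, rectified envelopes."""
    return np.minimum(np.abs(emg_flexor), np.abs(emg_extensor))

def integral_stiffness_command(cci, fs, window_s=1.0, gain=1.0):
    """Stiffness command proportional to the co-contraction averaged over the
    previous `window_s` seconds (the 'Integral 1s' scheme), rather than the
    instantaneous ('Proportional') value."""
    n = int(window_s * fs)
    kernel = np.ones(n) / n
    # causal moving average: each output sample only uses the preceding window
    padded = np.concatenate([np.full(n - 1, cci[0]), cci])
    return gain * np.convolve(padded, kernel, mode='valid')

fs = 1000
t = np.arange(0, 5, 1 / fs)
flex = 0.5 + 0.1 * np.sin(2 * np.pi * 0.5 * t)
ext = 0.4 + 0.1 * np.cos(2 * np.pi * 0.5 * t)
cmd = integral_stiffness_command(cocontraction_index(flex, ext), fs)
print(cmd.shape, cmd[:3])
```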
Source
http://dx.doi.org/10.1088/1741-2552/ab6d88
February 2020

Prediction of Gait Freezing in Parkinsonian Patients: A Binary Classification Augmented With Time Series Prediction.

IEEE Trans Neural Syst Rehabil Eng 2019 09 6;27(9):1909-1919. Epub 2019 Aug 6.

This paper presents a novel technique to predict freezing of gait in advanced-stage Parkinsonian patients using movement data from wearable sensors. A two-class approach is presented, consisting of autoregressive predictive models that project the feature time series forward, followed by machine-learning classifiers that discriminate freezing from non-freezing based on the predicted features. To implement and validate our technique, a set of time-domain and frequency-domain features was extracted from the 3D acceleration data, which was then analyzed using information-theoretic and feature selection approaches to determine the most discriminative features. Predictive models were trained to predict the features from their past values, then fed into binary classifiers based on support vector machines and probabilistic neural networks, which were rigorously cross-validated. We compared the results of this approach with a three-class classification approach proposed in previous literature, in which a pre-freezing class was introduced and the prediction of gait freezing was reduced to a three-class classification problem. The two-class approach resulted in a sensitivity of 93±4% and a specificity of 91±6%, with an expected prediction horizon of 1.72 s. Our subject-specific gait freezing prediction algorithm outperformed existing algorithms, yielded consistent results across different subjects, and was robust against the choice of classifier, with only slight variations in the selected features. In addition, we analyzed the merits and limitations of different families of features for predicting gait freezing.
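The two-stage idea, projecting each feature time series forward with an autoregressive model and then classifying the projected feature vector as freezing vs. non-freezing, can be sketched as follows, here with statsmodels' AutoReg and a scikit-learn SVM standing in for the paper's specific models; the lag order, horizon, feature content and labels are assumptions.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from sklearn.svm import SVC

def forecast_features(feature_history, lags=5, horizon=10):
    """Fit one AR model per feature time series and forecast `horizon` steps ahead;
    returns the predicted feature vector at the prediction horizon."""
    preds = []
    for series in feature_history.T:                     # columns = features
        model = AutoReg(series, lags=lags).fit()
        forecast = model.predict(start=len(series), end=len(series) + horizon - 1)
        preds.append(forecast[-1])
    return np.array(preds)

# Train a freezing / non-freezing classifier on labelled feature vectors
rng = np.random.default_rng(0)
X_train = rng.standard_normal((200, 4))                  # e.g. freeze index, band powers, ...
y_train = (X_train[:, 0] > 0).astype(int)                # dummy labels for illustration
clf = SVC(kernel='rbf').fit(X_train, y_train)

history = rng.standard_normal((100, 4))                  # recent feature history
print(clf.predict(forecast_features(history)[None, :]))  # 1 = predicted freezing
```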
Source
http://dx.doi.org/10.1109/TNSRE.2019.2933626
September 2019

Influence of visual-coupling on bimanual coordination in unilateral spastic cerebral palsy.

IEEE Int Conf Rehabil Robot 2019 06;2019:1013-1018

Controlling two objects simultaneously during a bimanual task is a cognitively demanding process; both hands need to be temporally and spatially coordinated to achieve the shared task goal. Children with unilateral spastic cerebral palsy (USCP) exhibit severe sensory and motor impairments on one side of their body that make the process of coordinating bimanual movements particularly exhausting. Prior studies have shown that performing a visually-coupled task could reduce the cognitive interference associated with performing 'two tasks at once' in an uncoupled bimanual task. For children with USCP, who also present with cognitive delay, performing this type of task may allow them to process and plan their movement faster. We tested this hypothesis by examining the grip force control of 7 children with USCP during unimanual and visually-coupled bimanual tasks. Results demonstrated that, despite the visual coupling, the bimanual coordination of these children remained impaired. However, there may be a potential benefit of a visually-coupled task in encouraging both hands to initiate movement in concert. The implications of the study for children with USCP are discussed.
Source
http://dx.doi.org/10.1109/ICORR.2019.8779390
June 2019

The effect of skill level matching in dyadic interaction on learning of a tracing task.

IEEE Int Conf Rehabil Robot 2019 06;2019:824-829

Dyadic interaction between humans has gained great research interest in recent years. The effects of factors that influence the interaction, such as roles or skill-level matching, are still not well understood. In this paper, we further investigated the effect of skill-level matching between partners on learning of a visuo-motor task. Understanding the effect of skill-level matching is crucial for applications in collaborative rehabilitation. Fifteen healthy participants were asked to trace a path while being subjected to a visuo-motor rotation (Novices). The Novices were paired with a partner, forming one of three Dyad Types: a) haptic connection to another Novice, b) haptic connection to an Expert (no visuo-motor rotation), or c) no haptic connection. The intervention consisted of a Familiarization phase, followed by a Training phase, in which the Novices learned the task in the respective Dyad Type, and a Test phase in which learning was assessed (haptic connection removed, if any). Results suggest that learning of the task with a haptic connection to an Expert was least beneficial. However, during the Training phase the dyads comprising an Expert clearly outperformed the dyads with matched skill levels. The results point in the same direction as previous findings in the literature and can be explained by current motor-learning theories. Future work needs to corroborate these preliminary results.
Source
http://dx.doi.org/10.1109/ICORR.2019.8779485
June 2019

Optimizing self-exercise scheduling in motor stroke using Challenge Point Framework theory.

IEEE Int Conf Rehabil Robot 2019 06;2019:435-440

An important challenge for technology-assisted, self-led rehabilitation is how to automate appropriate schedules of exercise that are responsive to patients' needs and optimal for learning. While random scheduling has been found to be superior to fixed scheduling for long-term learning (Contextual Interference), this method is limited by not adequately accounting for task difficulty or for skill acquisition during training. One method that combines contextual interference with adaptation of the challenge to the skill level of the player is Challenge Point Framework (CPF) theory. In this pilot study, we test whether self-led motor training based upon CPF scheduling achieves faster learning than deterministic, fixed scheduling. Training was implemented on a mobile gaming device adapted for arm disability, allowing grip and wrist exercises. We tested 11 healthy volunteers and 12 hemiplegic stroke patients in a single-blinded, no-crossover, randomized controlled trial. Results suggest that patients training with CPF-based adaptation performed better than those training with fixed conditions. This was not seen for healthy volunteers, whose performance was close to ceiling. Further data collection is required to determine the significance of the results.
Source
http://dx.doi.org/10.1109/ICORR.2019.8779497
June 2019

Individuals physically interacting in a group rapidly coordinate their movement by estimating the collective goal.

Elife 2019 02 12;8. Epub 2019 Feb 12.

Imperial College London, London, United Kingdom.

How can a human collective coordinate, for example to move a banquet table, when each person is influenced by the inertia of others who may be inferior at the task? We hypothesized that large groups cannot coordinate through touch alone, leading to a zero-sum scenario where individuals inferior at the task hinder superior ones. We tested this hypothesis by examining how dyads, triads and tetrads, whose right hands were physically coupled together, followed a common moving target. Surprisingly, superior individuals followed the target accurately even when coupled to an inferior group, and the interaction benefits increased with the group size. A computational model shows that these benefits arise as each individual uses their respective interaction force to infer the collective's target and enhance their movement planning, which permits coordination within seconds independent of the collective's size. By estimating the collective's movement goal, its individuals make physical interaction beneficial, swift and scalable.
Source
http://dx.doi.org/10.7554/eLife.41328
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6372281
February 2019

Bimanual coordination during a physically coupled task in unilateral spastic cerebral palsy children.

J Neuroeng Rehabil 2019 01 3;16(1). Epub 2019 Jan 3.

Department of Bioengineering, Imperial College London, South Kensington, London, SW7 2AZ, UK.

Background: Single-object bimanual manipulations, or physically-coupled bimanual tasks, are ubiquitous in daily life. However, the predominant focus of previous studies has been on uncoupled bimanual actions, where the two hands act independently to manipulate two disconnected objects. In this paper, we explore interlimb coordination in children with unilateral spastic cerebral palsy (USCP) by investigating upper limb motor control during a single-object bimanual lifting task.

Methods: 15 children with USCP and 17 typically developing (TD) children performed a simple single-object bimanual lifting task. The object was an instrumented cube that recorded the contact force on each of its faces and estimated its trajectory during a prescribed two-handed lifting motion. The subjects' performance was measured in terms of the duration of individual phases, the linearity and monotonicity of the grasp-to-load force synergy, interlimb force asymmetry, and movement smoothness.

Results: Similar to their TD counterparts, USCP subjects were able to produce a linear grasp-to-load force synergy. However, they demonstrated difficulties in producing monotonic forces and generating smooth movements. No impairment of anticipatory control was observed within the USCP subjects. However, our analysis showed that the USCP subjects shifted the weight of the cube onto their more-abled side, potentially to minimise the load on the impaired side, which suggests a developed strategy of compensating for inter-limb asymmetries, such as muscle strength.

Conclusion: Bimanual interaction with a single mutual object has the potential to facilitate anticipation and sequencing of force control in USCP children unlike previous studies which showed deficits during uncoupled bimanual actions. We suggest that this difference could be partly due to the provision of adequate cutaneous and kinaesthetic information gathered from the dynamic exchange of forces between the two hands, mediated through the physical coupling.
Source
http://dx.doi.org/10.1186/s12984-018-0454-z
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6318978
January 2019

A novel sensor design for accurate measurement of facial somatosensation in pre-term infants.

PLoS One 2018 16;13(11):e0207145. Epub 2018 Nov 16.

Department of Bioengineering, Imperial College of Science, Technology and Medicine, South Kensington Campus, London, United Kingdom.

Facial somatosensory feedback is critical for breastfeeding in the first days of life. However, its development has never been investigated in humans. Here we develop a new interface to measure facial somatosensation in newborn infants. The novel system measures neuronal responses to touch on the subject's face by synchronously recording scalp electroencephalography (EEG) and the force applied by the experimenter. It is based on a dedicated force transducer that can be worn on the finger underneath a clinical nitrile glove and linked to a commercial EEG acquisition system. The calibrated device measures the pressure applied by the investigator when tapping the skin concurrently with the resulting brain response. With this system, we were able to demonstrate that taps with a mean force of 192 mN reliably elicited facial somatosensory responses in 7 pre-term infants. These responses had a time course similar to those following limb stimulation, but a more lateral topographical distribution consistent with body representations in primary somatosensory areas. The method introduced can therefore be used to reliably measure facial somatosensory responses in vulnerable infants.
Source
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0207145
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6239299
April 2019

Muscle patterns underlying voluntary modulation of co-contraction.

PLoS One 2018 19;13(10):e0205911. Epub 2018 Oct 19.

Laboratory of Neuromotor Physiology, IRCCS Fondazione Santa Lucia, Rome, Italy.

Manipulative actions involving unstable interactions with the environment require controlling mechanical impedance through muscle co-contraction. While much research has focused on how the central nervous system (CNS) selects the muscle patterns underlying a desired movement or end-point force, the coordination strategies used to achieve a desired end-point impedance have received considerably less attention. We recorded isometric forces at the hand and electromyographic (EMG) signals in subjects performing a reaching task with an external disturbance. In a virtual environment, subjects displaced a cursor by applying isometric forces and were instructed to reach targets in 20 spatial locations. The motion of the cursor was then perturbed by disturbances whose effects could be attenuated by increasing co-contraction. All subjects could voluntarily modulate co-contraction when disturbances of different magnitudes were applied. For most muscles, activation was modulated by target direction according to a cosine tuning function with an offset and an amplitude increasing with disturbance magnitude. Co-contraction was characterized by projecting the muscle activation vector onto the null space of the EMG-to-force mapping. Even in the baseline the magnitude of the null space projection was larger than the minimum magnitude required for non-negative muscle activations. Moreover, the increase in co-contraction was not obtained by scaling the baseline null space projection, scaling the difference between the null space projections in any block and the projection of the non-negative minimum-norm muscle vector, or scaling the difference between the null space projections in the perturbed blocks and the baseline null space projection. However, the null space projections in the perturbed blocks were obtained by linear combination of the baseline null space projection and the muscle activation used to increase co-contraction without generating any force. The failure of scaling rules in explaining voluntary modulation of arm co-contraction suggests that muscle pattern generation may be constrained by muscle synergies.
Source
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0205911
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6195298
April 2019

Is EMG a Viable Alternative to BCI for Detecting Movement Intention in Severe Stroke?

IEEE Trans Biomed Eng 2018 12 21;65(12):2790-2797. Epub 2018 Mar 21.

Objective: In light of the shortcomings of current restorative brain-computer interfaces (BCI), this study investigated the possibility of using EMG to detect hand/wrist extension movement intention to trigger robot-assisted training in individuals without residual movements.

Methods: We compared movement intention detection using an EMG detector with a sensorimotor rhythm based EEG-BCI using only ipsilesional activity. This was carried out on data of 30 severely affected chronic stroke patients from a randomized control trial using an EEG-BCI for robot-assisted training.

Results: The results indicate the feasibility of using EMG to detect movement intention in this severely handicapped population; the probability of detecting EMG when patients attempted to move was higher (p < 0.001) than at rest. Interestingly, 22 out of 30 (or 73%) patients had sufficiently strong EMG in their finger/wrist extensors. Furthermore, in patients with detectable EMG, there was poor agreement between the EEG and EMG intent detectors, which indicates that these modalities may detect different processes.

Conclusion: A substantial segment of severely affected stroke patients may benefit from EMG-based assisted therapy. Compared to EEG, a surface EMG interface requires less preparation time, is easier to don/doff, and is more compact.

Significance: This study shows that a large proportion of severely affected stroke patients have residual EMG, which yields a direct and practical way to trigger robot-assisted training.
Source
http://dx.doi.org/10.1109/TBME.2018.2817688
December 2018

Sensory integration of apparent motion speed and vibration magnitude.

IEEE Trans Haptics 2018 Jul-Sep;11(3):455-463. Epub 2017 Nov 15.

Tactile apparent motion can display directional information in an intuitive way. It can, for example, be used to give directions to visually impaired individuals, or for waypoint navigation while cycling on busy streets, when vision or audition should not be loaded further. However, although humans can detect very short tactile patterns, discriminating between similar motion speeds has been shown to be difficult. Here we develop and investigate a method in which the speed of tactile apparent motion around the user's wrist is coupled with vibration magnitude. This redundant coupling is used to produce tactile patterns ranging from slow&weak to fast&strong. We compared the just noticeable difference (JND) of the coupled and the individual variables. The results show that the perception of the coupled variable can be characterised by a JND smaller than the JNDs of the individual variables. This allowed us to create short tactile patterns (tactons) for the display of direction and speed, which can be distinguished significantly better than tactons based on motion alone. Additionally, most subjects were also able to identify the coupled-variable tactons better than the magnitude-based tactons.
Source
http://dx.doi.org/10.1109/TOH.2017.2772232
March 2019

Haptic communication between humans is tuned by the hard or soft mechanics of interaction.

PLoS Comput Biol 2018 03 22;14(3):e1005971. Epub 2018 Mar 22.

Department of Bioengineering, Imperial College of Science, Technology and Medicine, South Kensington, London, United Kingdom.

To move a hard table together, humans may coordinate by following the dominant partner's motion [1-4], but this strategy is unsuitable for a soft mattress where the perceived forces are small. How do partners readily coordinate in such differing interaction dynamics? To address this, we investigated how pairs tracked a target using flexion-extension of their wrists, which were coupled by a hard, medium or soft virtual elastic band. Tracking performance monotonically increased with a stiffer band for the worse partner, who had higher tracking error, at the cost of the skilled partner's muscular effort. This suggests that the worse partner followed the skilled one's lead, but simulations show that the results are better explained by a model where partners share movement goals through the forces, whilst the coupling dynamics determine the capacity of communicable information. This model elucidates the versatile mechanism by which humans can coordinate during both hard and soft physical interactions to ensure maximum performance with minimal effort.
Source
http://dx.doi.org/10.1371/journal.pcbi.1005971
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5863953
March 2018

Balancing the playing field: collaborative gaming for physical training.

J Neuroeng Rehabil 2017 Nov 20;14(1):116. Epub 2017 Nov 20.

Department of Bioengineering, Imperial College of Science, Technology and Medicine, London, UK.

Background: Multiplayer video games promoting exercise-based rehabilitation may facilitate motor learning, by increasing motivation through social interaction. However, a major design challenge is to enable meaningful inter-subject interaction, whilst allowing for significant skill differences between players. We present a novel motor-training paradigm that allows real-time collaboration and performance enhancement, across a wide range of inter-subject skill mismatches, including disabled vs. able-bodied partnerships.

Methods: A virtual task, consisting of a dynamic ball on a beam, is controlled at each end using independent digital force-sensing handgrips. Interaction is mediated through simulated physical coupling and locally-redundant control. Game performance was measured in 16 healthy-healthy and 16 patient-expert dyads, where patients were hemiparetic stroke survivors using their impaired arm. Dual-player performance was compared to single-player performance in terms of score, target tracking, stability, effort and smoothness, and through questionnaires probing user experience and engagement.

Results: Performance of less-able subjects (as ranked from single-player ability) was enhanced by dual-player mode, by an amount proportionate to the partnership's mismatch. The more abled partners' performances decreased by a similar amount. Such zero-sum interactions were observed for both healthy-healthy and patient-expert interactions. Dual-player was preferred by the majority of players independent of baseline ability and subject group; healthy subjects also felt more challenged, and patients more skilled.

Conclusion: This is the first demonstration of implicit skill balancing in a truly collaborative virtual training task leading to heightened engagement, across both healthy subjects and stroke patients.
Source
http://dx.doi.org/10.1186/s12984-017-0319-x
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5694911
November 2017

Positioning the endoscope in laparoscopic surgery by foot: Influential factors on surgeons' performance in virtual trainer.

Annu Int Conf IEEE Eng Med Biol Soc 2017 Jul;2017:3944-3948

We have investigated how surgeons can use the foot to position a laparoscopic endoscope, a task that normally requires an extra assistant. Surgeons need to train in order to exploit the possibilities offered by this new technique and to safely manipulate the endoscope together with their hand movements. A realistic abdominal cavity was developed as a training simulator to investigate this multi-arm manipulation. In this virtual environment, the surgeon's biological hands are modelled as laparoscopic graspers while the viewpoint is controlled by the dominant foot. Twenty-three surgeons and medical students performed single-handed and bimanual manipulation in this environment. The results show that residents had superior performance compared to both medical students and more experienced surgeons, suggesting that residency is an ideal period for this training. Performing the single-handed task improved performance in the bimanual task, whereas the converse was not true.
Source
http://dx.doi.org/10.1109/EMBC.2017.8037719
July 2017

Anticipatory detection of turning in humans for intuitive control of robotic mobility assistance.

Bioinspir Biomim 2017 09 26;12(5):055004. Epub 2017 Sep 26.

School of Electronic Engineering and Computer Science, Queen Mary University of London, United Kingdom. Department of Bioengineering, Imperial College of Science, Technology and Medicine, London, United Kingdom.

Many wearable lower-limb robots for walking assistance have been developed in recent years. However, it remains unclear how they can be commanded in an intuitive and efficient way by their user. In particular, providing robotic assistance to neurologically impaired individuals during turning remains a significant challenge. The control should be safe for the users and their environment, yet yield sufficient performance and enable natural human-machine interaction. Here, we propose using the anticipatory behaviour of the head and trunk to detect the intention to turn in a natural, non-intrusive way, and to use it for triggering turning movement in a robot for walking assistance. We therefore study head and trunk orientation during locomotion of healthy adults, and investigate upper body anticipatory behaviour during turning. The collected walking and turning kinematics data are clustered using the k-means algorithm, and cross-validation tests and the k-nearest neighbours method are used to evaluate the performance of turning detection during locomotion. Tests with seven subjects exhibited accurate turning detection. The head anticipated turning by more than 400-500 ms on average across all subjects. Overall, the proposed method detected turning 300 ms after its initiation and 1230 ms before the turning movement was completed. Using head anticipatory behaviour enabled turning to be detected about 100 ms faster than detection using only pelvis orientation measurements. Finally, it was demonstrated that the proposed turning detection can improve the quality of human-robot interaction by improving control accuracy and transparency.
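A minimal sketch of this detection scheme: characterise straight-walking vs. turning kinematics with k-means, and classify new samples with cross-validated k-nearest neighbours. The features (head-pelvis yaw difference and head yaw rate) and all numbers are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Illustrative features per time window: [head-pelvis yaw difference (deg), head yaw rate (deg/s)]
straight = rng.normal([0, 0], [3, 5], size=(300, 2))
turning = rng.normal([25, 40], [8, 15], size=(300, 2))
X = np.vstack([straight, turning])
y = np.array([0] * 300 + [1] * 300)                 # 0 = straight walking, 1 = turning

# Unsupervised structure of the walking/turning data
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Supervised turning detection, evaluated with cross-validation
knn = KNeighborsClassifier(n_neighbors=5)
print(cross_val_score(knn, X, y, cv=5).mean())      # detection accuracy

knn.fit(X, y)
print(knn.predict([[20.0, 35.0]]))                  # new sample classified as turning (1)
```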
Source
http://dx.doi.org/10.1088/1748-3190/aa80ad
September 2017

SITAR: a system for independent task-oriented assessment and rehabilitation.

J Rehabil Assist Technol Eng 2017 Jan-Dec;4:2055668317729637. Epub 2017 Sep 18.

1Department of Bioengineering, Imperial College of Science, Technology and Medicine, London, UK.

Introduction: Over recent years, task-oriented training has emerged as a dominant approach in neurorehabilitation. This article presents a novel, sensor-based system for independent task-oriented assessment and rehabilitation (SITAR) of the upper limb.

Methods: The SITAR is an ecosystem of interactive devices including a touch and force-sensitive tabletop and a set of intelligent objects enabling functional interaction. In contrast to most existing sensor-based systems, SITAR provides natural training of visuomotor coordination through collocated visual and haptic workspaces alongside multimodal feedback, facilitating learning and its transfer to real tasks. We illustrate the possibilities offered by the SITAR for sensorimotor assessment and therapy through pilot assessment and usability studies.

Results: The pilot data from the assessment study demonstrates how the system can be used to assess different aspects of upper limb reaching, pick-and-place and sensory tactile resolution tasks. The pilot usability study indicates that patients are able to train arm-reaching movements independently using the SITAR with minimal involvement of the therapist and that they were motivated to pursue the SITAR-based therapy.

Conclusion: SITAR is a versatile, non-robotic tool that can be used to implement a range of therapeutic exercises and assessments for different types of patients, which is particularly well-suited for task-oriented training.
Source
http://dx.doi.org/10.1177/2055668317729637
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6453030
September 2017

A simple tool to measure spasticity in spinal cord injury subjects.

IEEE Int Conf Rehabil Robot 2017 07;2017:1590-1596

This work presents a wearable device and algorithms for quantitative modelling of joint spasticity, and their application in a pilot group of subjects with different levels of spinal cord injury. The device comprises light-weight instrumented handles to measure the interaction force between the subject and the physical therapist performing the tests, EMG sensors, and inertial measurement units to measure muscle activity and joint kinematics. Experimental tests included the passive movement, at different velocities, of the body segments where spasticity was expected. Tonic stretch reflex thresholds and their velocity modulation factor are computed as a quantitative index of spasticity, using the kinematic data at the onset of the spasm detected by thresholding the EMG data. This technique was applied to two spinal cord injury subjects. The proposed method allowed the analysis of spasticity at muscle and joint levels. The obtained results are in line with the expert diagnosis and the qualitative spasticity characterisation of each individual.
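The core computation, namely the joint angle at EMG-detected spasm onset as a function of stretch velocity, with the regression intercept giving the tonic stretch reflex threshold and the slope its velocity modulation factor, might be sketched as below; the onset rule (envelope exceeding baseline mean + 3 SD) and all names are assumptions.

```python
import numpy as np

def emg_onset_index(emg, fs, baseline_s=0.5, k=3.0):
    """First sample at which the rectified EMG exceeds baseline mean + k*SD."""
    env = np.abs(emg)
    n0 = int(baseline_s * fs)
    thr = env[:n0].mean() + k * env[:n0].std()
    above = np.where(env[n0:] > thr)[0]
    return n0 + above[0] if above.size else None

def tonic_stretch_reflex_threshold(trials, fs):
    """For passive stretches at several velocities, collect the joint angle at
    EMG onset and fit angle_onset = TSRT + mu * velocity_onset; returns the
    zero-velocity threshold TSRT and the velocity modulation factor mu."""
    angles, vels = [], []
    for emg, angle, velocity in trials:           # one tuple per stretch trial
        i = emg_onset_index(emg, fs)
        if i is not None:
            angles.append(angle[i])
            vels.append(velocity[i])
    mu, tsrt = np.polyfit(vels, angles, 1)
    return tsrt, mu
```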
Source
http://dx.doi.org/10.1109/ICORR.2017.8009475
July 2017

Validity of a sensor-based table-top platform to measure upper limb function.

IEEE Int Conf Rehabil Robot 2017 07;2017:652-657

Objective measurement is an essential part of the assessment process in neurological dysfunction such as stroke. However, current clinical scores are insensitive and based on subjective observation by experts. Technology provides an opportunity for enhanced accuracy and specificity of objective measurement. This study describes the use of an interactive force-sensitive table-top platform for the assessment of reach in post-stroke patients, admitted as part of a three-week intensive upper limb training programme. Objective measures extracted from the reachable workspace included normalised reach distance, normalised reach speed and reach dragging. The data were compared to standardised Fugl-Meyer (FM) clinical scores recorded at admission (FMPRE) and discharge (FMPOST). Results indicate strong relationships between the three objective measures and the subjective FM scores, with significant Spearman correlations found in all cases (|ρ| > 0.5, p < 0.05). The results highlight the validity of a sensor-based table-top system to provide a simple, flexible, and objective platform for the assessment of impaired upper limb motor function.
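The reported validity check, a Spearman correlation between each sensor-derived measure and the Fugl-Meyer score, is straightforward to reproduce; a generic sketch with illustrative numbers (not the study's data) follows.

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative values only, not the study's data
fugl_meyer = np.array([22, 35, 41, 18, 50, 29, 44, 33])
norm_reach_distance = np.array([0.42, 0.55, 0.70, 0.30, 0.85, 0.47, 0.74, 0.52])

rho, p = spearmanr(norm_reach_distance, fugl_meyer)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")     # validity criterion: |rho| > 0.5, p < 0.05
```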
Source
http://dx.doi.org/10.1109/ICORR.2017.8009322
July 2017