Please use this identifier to cite or link to this item:
https://hdl.handle.net/1822/90824
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Moreira, Luís Carlos Rodrigues | por |
dc.contributor.author | Figueiredo, Joana | por |
dc.contributor.author | Cerqueira, João José | por |
dc.contributor.author | Santos, Cristina | por |
dc.date.accessioned | 2024-04-09T07:04:30Z | - |
dc.date.issued | 2024 | - |
dc.identifier.uri | https://hdl.handle.net/1822/90824 | - |
dc.description.abstract | Robotic assistive devices (orthoses and exoskeletons) usually feature non-intrusive sensors and intelligent algorithms. These enable the decoding of different locomotion modes (LMs), allowing the devices to tailor their assistance and support users in executing everyday walking tasks. Despite recent advances, most LM decoding tools (i) decode a small number of LMs simultaneously; (ii) have high recognition delays and low prediction times in advance; and (iii) do not consider the usually slow preferred speeds of neurologically impaired users. This study aims to address these shortcomings by presenting an automatic and user-independent kinematic-based LM decoding tool to classify, in real-time, 4 LMs (standing (St), level-ground walking (LGW), stair descent (SD), and stair ascent (SA)) when using a robotic assistive device at slow speeds. The proposed deep learning tool revealed an average F1-score, Matthews Correlation Coefficient, and accuracy of 0.96 ± 0.014, 0.95 ± 0.018, and 0.98 ± 0.0072, respectively. Real-time experiments with the robotic device showed the ability to predict the upcoming LM with an average computation time of 2 ± 1 ms and an average prediction time in advance of 480 ± 240 ms across six transitions (St-LGW, LGW-St, LGW-SD, SD-LGW, LGW-SA, and SA-LGW). These results suggest that the proposed LM decoding tool has the potential to be used in real-time to adapt the assistance of robotic assistive devices in advance. | por |
dc.description.sponsorship | This work was funded by the FAIR project under grant 2022.05844.PTDC and by the Fundação para a Ciência e Tecnologia (FCT) under the PhD scholarship 2020.05711.BD, under the Scientific Employment Stimulus grant 2020.03393.CEECIND, and under the national support to R&D units through the reference projects UIDB/04436/2020 and UIDP/04436/2020. | por |
dc.language.iso | eng | por |
dc.publisher | IEEE | por |
dc.relation | 2022.05844.PTDC | por |
dc.relation | info:eu-repo/grantAgreement/FCT/POR_NORTE/2020.05711.BD/PT | por |
dc.relation | info:eu-repo/grantAgreement/FCT/CEEC IND 3ed/2020.03393.CEECIND%2FCP1600%2FCT0011/PT | por |
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F04436%2F2020/PT | por |
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F04436%2F2020/PT | por |
dc.rights | restrictedAccess | por |
dc.subject | Deep learning | por |
dc.subject | Exoskeletons | por |
dc.subject | Locomotion mode recognition and prediction | por |
dc.subject | Wearable sensors | por |
dc.title | A real-time kinematic-based locomotion mode prediction algorithm for an ankle orthosis | por |
dc.type | conferencePaper | por |
dc.peerreviewed | yes | por |
dc.date.embargo | 10000-01-01 | - |
sdum.conferencePublication | The 24th IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC2024) | por |
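The abstract above reports macro evaluation metrics (F1-score, Matthews Correlation Coefficient, and accuracy) for a 4-class locomotion-mode classifier. As an illustrative sketch only (not taken from the paper), these metrics could be computed with scikit-learn as follows; the label arrays below are hypothetical, with integers 0–3 standing for St, LGW, SD, and SA:

```python
# Illustrative sketch: computing the evaluation metrics named in the
# abstract for a 4-class locomotion-mode classifier. The label arrays
# are made-up example data, not results from the paper.
from sklearn.metrics import f1_score, matthews_corrcoef, accuracy_score

y_true = [0, 1, 1, 2, 3, 1, 0, 2, 3, 1]  # ground-truth locomotion modes
y_pred = [0, 1, 1, 2, 3, 1, 0, 2, 2, 1]  # classifier output (one error)

f1 = f1_score(y_true, y_pred, average="macro")  # macro-average over the 4 modes
mcc = matthews_corrcoef(y_true, y_pred)
acc = accuracy_score(y_true, y_pred)
print(f"F1={f1:.2f}  MCC={mcc:.2f}  accuracy={acc:.2f}")
```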
Appears in Collections: | CMEMS - Artigos em livros de atas/Papers in proceedings |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
PaperRepositorium.pdf Restricted access! | | 548.28 kB | Adobe PDF | View/Open |