Use this identifier to reference this record: https://hdl.handle.net/1822/90824

Full record
DC Field | Value | Language
dc.contributor.author | Moreira, Luís Carlos Rodrigues | por
dc.contributor.author | Figueiredo, Joana | por
dc.contributor.author | Cerqueira, João José | por
dc.contributor.author | Santos, Cristina | por
dc.date.accessioned | 2024-04-09T07:04:30Z | -
dc.date.issued | 2024 | -
dc.identifier.uri | https://hdl.handle.net/1822/90824 | -
dc.description.abstract | Robotic assistive devices (orthoses and exoskeletons) usually feature non-intrusive sensors and intelligent algorithms. These enable the decoding of different locomotion modes (LMs), allowing the devices to tailor their assistance and support users in executing everyday walking tasks. Despite recent advances, most LM decoding tools (i) decode a small number of LMs simultaneously; (ii) suffer from high recognition delays and short prediction horizons; and (iii) do not consider the usually slow preferred speeds of neurologically impaired users. This study aims to address these shortcomings by presenting an automatic and user-independent kinematic-based LM decoding tool to classify, in real time, 4 LMs (standing (St), level-ground walking (LGW), stair descent (SD), and stair ascent (SA)) when using a robotic assistive device at slow speeds. The proposed deep learning tool achieved an average F1-score, Matthews Correlation Coefficient, and accuracy of 0.96 ± 0.014, 0.95 ± 0.018, and 0.98 ± 0.0072, respectively. Real-time experiments with the robotic device showed the ability to predict the upcoming LM with an average computational load and prediction time in advance of 2 ± 1 ms and 480 ± 240 ms, respectively, across six transitions (St-LGW, LGW-St, LGW-SD, SD-LGW, LGW-SA, and SA-LGW). These results suggest that the proposed LM decoding tool has the potential to be used in real time to adapt the assistance of robotic assistive devices in advance. | por
dc.description.sponsorship | This work was funded by the FAIR project under grant 2022.05844.PTDC and by the Fundação para a Ciência e Tecnologia (FCT) under the scholarship reference 2020.05711.BD, under the Stimulus of Scientific Employment grant 2020.03393.CEECIND, and under the national support to R&D units through the reference projects UIDB/04436/2020 and UIDP/04436/2020. | por
dc.language.iso | eng | por
dc.publisher | IEEE | por
dc.relation | 2022.05844.PTDC | por
dc.relation | info:eu-repo/grantAgreement/FCT/POR_NORTE/2020.05711.BD/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/CEEC IND 3ed/2020.03393.CEECIND%2FCP1600%2FCT0011/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F04436%2F2020/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F04436%2F2020/PT | por
dc.rights | restrictedAccess | por
dc.subject | Deep learning | por
dc.subject | Exoskeletons | por
dc.subject | Locomotion mode recognition and prediction | por
dc.subject | Wearable sensors | por
dc.title | A real-time kinematic-based locomotion mode prediction algorithm for an ankle orthosis | por
dc.type | conferencePaper | por
dc.peerreviewed | yes | por
dc.date.embargo | 10000-01-01 | -
sdum.conferencePublication | The 24th IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC2024) | por
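The abstract above reports three classification metrics for the 4 locomotion modes: macro F1-score, the Matthews Correlation Coefficient (in its multiclass generalisation), and accuracy. As a minimal, hypothetical sketch of how such metrics are computed (the toy labels and helper names below are illustrative and are not taken from the paper), one could write:

```python
from collections import Counter
import math

# Hypothetical toy predictions over the 4 locomotion modes
# (label encoding assumed for illustration: 0=St, 1=LGW, 2=SD, 3=SA)
y_true = [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
y_pred = [0, 1, 1, 1, 1, 2, 3, 3, 3, 3]

def accuracy(yt, yp):
    # Fraction of samples where the predicted LM matches the true LM
    return sum(t == p for t, p in zip(yt, yp)) / len(yt)

def macro_f1(yt, yp, classes):
    # One-vs-rest F1 per class, averaged with equal class weight
    f1s = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(yt, yp))
        fp = sum(t != c and p == c for t, p in zip(yt, yp))
        fn = sum(t == c and p != c for t, p in zip(yt, yp))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

def multiclass_mcc(yt, yp):
    # Gorodkin's generalisation of the Matthews correlation to K classes
    s = len(yt)
    c = sum(t == p for t, p in zip(yt, yp))
    t_counts = Counter(yt)   # true samples per class
    p_counts = Counter(yp)   # predicted samples per class
    cov = c * s - sum(t_counts[k] * p_counts[k] for k in set(yt) | set(yp))
    denom = math.sqrt((s * s - sum(v * v for v in p_counts.values())) *
                      (s * s - sum(v * v for v in t_counts.values())))
    return cov / denom if denom else 0.0

print(accuracy(y_true, y_pred))  # → 0.8 on this toy data
print(macro_f1(y_true, y_pred, [0, 1, 2, 3]))
print(multiclass_mcc(y_true, y_pred))
```

Scores of 1.0 on all three metrics would indicate perfect agreement; the paper's reported averages (0.96, 0.95, and 0.98) are close to that ceiling.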
Appears in collections: CMEMS - Artigos em livros de atas/Papers in proceedings

Files in this record:
File | Description | Size | Format
PaperRepositorium.pdf (Restricted access!) | | 548.28 kB | Adobe PDF
