Use this identifier to reference this record: https://hdl.handle.net/1822/81667

Full record
DC Field | Value | Language
dc.contributor.author | Ribeiro, Nuno Miguel Ferrete | por
dc.contributor.author | Mouta, Pedro | por
dc.contributor.author | Santos, Cristina | por
dc.date.accessioned | 2023-01-09T22:50:46Z | -
dc.date.issued | 2021-05-25 | -
dc.identifier.citation | Ribeiro, N.F., Mouta, P. & Santos, C.P. Two kinematic data-based approaches for cane event detection. J Ambient Intell Human Comput (2021). https://doi.org/10.1007/s12652-021-03313-7 | por
dc.identifier.issn | 1868-5137 | por
dc.identifier.uri | https://hdl.handle.net/1822/81667 | -
dc.description.abstract | Detecting cane events in real-life walking situations is needed to assess human gait indirectly without using wearable systems, which may be undesirable, uncomfortable, or difficult to wear, especially for patients undergoing rehabilitation. This article presents two reliable kinematic data-based approaches, a machine learning classifier and an adaptive rule-based finite-state machine (FSM), to detect four cane events in real time at diverse gait speeds and on diverse real-life terrains. A comparative analysis was performed to identify the most suitable machine learning classifier and the most relevant subset of features. The FSM only uses the cane's angular velocity and acceleration to facilitate its integration for daily and repeated use. Repeated measures from two groups of seven healthy subjects each were acquired to validate both approaches. The first group (23.29 ± 1.16 years) performed trials under controlled situations on a treadmill at different speeds (from 1.0 to 1.5 km/h) and slopes (from 0 to 10%). Heterogeneous gait patterns were further collected from the second group (24.14 ± 0.83 years) during forward walking on flat, rough, and inclined surfaces and while climbing staircases. The CNN-LSTM, when using the first 32 features ranked by the Relief-F method, was more accurate than the FSM. The CNN-LSTM detects cane events with an accuracy higher than 99% under controlled and real-life situations, except for the maximum support moment (MSM) (accuracy > 85.53%). The FSM detects most of the cane events accurately (accuracy > 90.63%); misdetection was more pronounced for the MSM (43.75% < accuracy < 84.91%). The low computational load, together with the high performance, makes these two approaches suitable for gait assessment in the rehabilitation field. | por
dc.description.sponsorship | This work has been supported by FCT (Fundação para a Ciência e a Tecnologia) with national funds, under the scholarship reference PD/BD/141515/2018 and the national support to R&D units grant, through the reference projects UIDB/04436/2020 and UIDP/04436/2020. | por
dc.language.iso | eng | por
dc.publisher | Springer | por
dc.relation | info:eu-repo/grantAgreement/FCT/POR_NORTE/PD%2FBD%2F141515%2F2018/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F04436%2F2020/PT | por
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F04436%2F2020/PT | por
dc.rights | restrictedAccess | por
dc.subject | Human gait analysis | por
dc.subject | Real-time gait event detection | por
dc.subject | Adaptive computational methods | por
dc.subject | Inertial sensors | por
dc.subject | Artificial intelligence | por
dc.title | Two kinematic data-based approaches for cane event detection | por
dc.type | article | -
dc.peerreviewed | yes | por
dc.relation.publisherversion | https://link.springer.com/article/10.1007/s12652-021-03313-7 | por
oaire.citationStartPage | 567 | por
oaire.citationEndPage | 579 | por
oaire.citationIssue | 1 | por
oaire.citationVolume | 14 | por
dc.date.updated | 2023-01-09T17:28:01Z | -
dc.identifier.doi | 10.1007/s12652-021-03313-7 | por
dc.date.embargo | 10000-01-01 | -
dc.subject.fos | Ciências Naturais::Ciências da Computação e da Informação | por
dc.subject.wos | Science & Technology | -
sdum.export.identifier | 11081 | -
sdum.journal | Journal of Ambient Intelligence and Humanized Computing | por
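
The abstract in this record names a CNN-LSTM classifier operating on the first 32 features ranked by the Relief-F method. The snippet below is a minimal, purely illustrative sketch of what such an architecture can look like in Keras; it is not the authors' implementation, and the window length, number of output classes, layer sizes, and training settings are assumptions made only for illustration.

```python
# Illustrative CNN-LSTM sketch (NOT the article's implementation).
# Assumptions: Keras/TensorFlow, windows of cane kinematic data with the
# 32 Relief-F-ranked features per time step, hypothetical window length
# and class count.
import tensorflow as tf
from tensorflow.keras import layers

WINDOW_LEN = 50   # hypothetical number of time steps per input window
N_FEATURES = 32   # top 32 features ranked by Relief-F (per the abstract)
N_CLASSES = 5     # assumption: 4 cane events plus a "no event" class

model = tf.keras.Sequential([
    layers.Input(shape=(WINDOW_LEN, N_FEATURES)),
    # 1-D convolution extracts local temporal patterns within each window
    layers.Conv1D(filters=64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # LSTM captures longer-range temporal dependencies across the window
    layers.LSTM(64),
    layers.Dropout(0.5),
    layers.Dense(N_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The convolution-then-recurrence layout is the usual motivation for a CNN-LSTM on inertial time series: the convolutional front end summarizes short-range motion patterns, and the recurrent layer relates them across the window before classification.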
Appears in collections: DEI - Artigos em revistas internacionais

Files in this record:
File | Description | Size | Format
Ribeiro2021TwoKinematicDatabasedApproach.pdf | Restricted access | 2.18 MB | Adobe PDF
