Use this identifier to cite this record: https://hdl.handle.net/1822/70952

Full record
DC Field | Value | Language
dc.contributor.author | Silva, Vinicius Corrêa Alves | por
dc.contributor.author | Soares, Filomena | por
dc.contributor.author | Esteves, João Sena | por
dc.contributor.author | Vercelli, Gianni | por
dc.date.accessioned | 2021-03-24T11:48:16Z | -
dc.date.issued | 2020-10 | -
dc.identifier.isbn | 9781728192819 | por
dc.identifier.issn | 2157-0221 | -
dc.identifier.uri | https://hdl.handle.net/1822/70952 | -
dc.description.abstract | Researchers have been using different technological solutions (platforms) as intervention tools for children with Autism Spectrum Disorder (ASD), who typically have difficulty engaging and interacting with their peers. Social robots, one example of these technological solutions, are often unaware of their game partners, preventing the automatic adaptation of their behaviour to the user. Therefore, enriching the interaction between the user and the platform, lightening the cognitive burden on the human operator, may be a valuable contribution. Information that can be used to enrich this interaction and, consequently, adapt the system behaviour is the recognition of different user actions through skeleton pose data from depth sensors. The present work proposes a method to automatically detect, in real time, typical and stereotypical actions of children with ASD by using the Intel RealSense and the Nuitrack SDK to detect and extract the user's joint coordinates. A Convolutional Neural Network learning model trained on the different actions is used to classify the different patterns of behaviour. The model achieved an average accuracy of 92.6±0.5% on the test data. The entire pipeline runs at 31 FPS on average. | por
dc.description.sponsorship | The authors thank the teachers and students of the Elementary School of Gualtar (EB1/JI Gualtar) in Braga for their participation. This work has been supported by FCT – Fundação para a Ciência e a Tecnologia within the R&D Units Project Scope UIDB/00319/2020 and by the PhD scholarship SFRH/BD/133314/2017. This work has also been partly supported by the RISEWISE (RISE Women with disabilities In Social Engagement) H2020 EU project under Agreement No. 690874. | por
dc.language.iso | eng | por
dc.publisher | IEEE | por
dc.relation | UIDB/00319/2020 | por
dc.relation | SFRH/BD/133314/2017 | por
dc.relation | info:eu-repo/grantAgreement/EC/H2020/690874/EU | -
dc.rights | restrictedAccess | por
dc.subject | Autism spectrum disorder | por
dc.subject | Human action recognition | por
dc.subject | Human computer interaction | por
dc.title | Human action recognition using an image-based temporal and spatial representation | por
dc.type | conferencePaper | por
dc.peerreviewed | yes | por
dc.relation.publisherversion | https://ieeexplore.ieee.org/document/9222408 | por
oaire.citationStartPage | 41 | por
oaire.citationEndPage | 46 | por
oaire.citationVolume | 2020-October | por
dc.date.updated | 2021-03-24T09:57:26Z | -
dc.identifier.doi | 10.1109/ICUMT51630.2020.9222408 | por
dc.date.embargo | 10000-01-01 | -
dc.identifier.eisbn | 978-1-7281-9281-9 | -
dc.subject.wos | Science & Technology | por
sdum.export.identifier | 9140 | -
sdum.journal | International Congress on Ultra Modern Telecommunications and Control Systems and Workshops | por
sdum.conferencePublication | 12th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT) | por
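The record gives no implementation details beyond the abstract and the title, but the "image-based temporal and spatial representation" the title names can be sketched roughly as follows: a sequence of skeleton joint coordinates (as extracted with the Nuitrack SDK) is encoded as an RGB image that a Convolutional Neural Network can then classify. The function name, the (frames, joints, xyz) input layout, the min-max normalisation, and the mapping of x/y/z to colour channels are all assumptions for illustration, not the authors' actual encoding.

```python
import numpy as np

def skeleton_to_image(frames):
    """Encode a sequence of skeleton poses as an RGB image.

    frames: array of shape (T, J, 3) -- T frames, J joints,
    (x, y, z) coordinates per joint (hypothetical layout).
    Returns a uint8 array of shape (J, T, 3): joints map to rows,
    frames to columns, and x/y/z to the R/G/B channels, so the
    spatial and temporal structure both survive in one image.
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Min-max normalise each coordinate axis to [0, 255] so the
    # pose sequence becomes a valid image for a CNN classifier.
    lo = frames.min(axis=(0, 1), keepdims=True)
    hi = frames.max(axis=(0, 1), keepdims=True)
    span = np.where(hi - lo == 0, 1, hi - lo)  # guard static axes
    scaled = (frames - lo) / span * 255
    return np.transpose(scaled, (1, 0, 2)).astype(np.uint8)

# Example: 30 frames of a 19-joint skeleton (Nuitrack tracks 19 joints).
rng = np.random.default_rng(0)
img = skeleton_to_image(rng.normal(size=(30, 19, 3)))
print(img.shape)  # (19, 30, 3)
```

The image would then be fed to a small CNN; the 92.6±0.5% accuracy and 31 FPS figures quoted in the abstract refer to the authors' trained model and full pipeline, not to this sketch.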
Appears in collections: CAlg - Artigos em livros de atas/Papers in proceedings

Files in this record:
File | Description | Size | Format
Human-action-Silva.pdf | Restricted access! | 737.76 kB | Adobe PDF
