
dc.contributor.author: Napier, Emily
dc.date.accessioned: 2024-04-25T13:10:56Z
dc.date.available: 2024-04-25T13:10:56Z
dc.date.issued: 2024-04-16
dc.identifier.uri: http://hdl.handle.net/10222/84070
dc.description.abstract: Human motion-capture data can be represented, modeled, and generated through computational techniques. This thesis explores representations and strategies for querying, interpolating, and sequence modeling of motion-capture data. We employ spectral analysis of motion-capture data to facilitate the querying and comparison of movements, and to identify target features for interpolation. We train a decoder-only transformer model on text-encoded motion-capture data, which we fine-tune for dance generation and movement classification. Our core contributions are defining interpolation and language-model training procedures for generating motion-captured dance. [en_US]
dc.language.iso: en [en_US]
dc.subject: motion-capture [en_US]
dc.subject: machine learning [en_US]
dc.title: Modeling Human Motion-Capture Data for Creativity [en_US]
dc.date.defence: 2023-08-11
dc.contributor.department: Faculty of Computer Science [en_US]
dc.contributor.degree: Master of Computer Science [en_US]
dc.contributor.external-examiner: n/a [en_US]
dc.contributor.thesis-reader: Joseph Malloch [en_US]
dc.contributor.thesis-reader: Carlos Hernandez Castillo [en_US]
dc.contributor.thesis-supervisor: Sageev Oore [en_US]
dc.contributor.thesis-supervisor: Gavia Gray [en_US]
dc.contributor.ethics-approval: Not Applicable [en_US]
dc.contributor.manuscripts: Yes [en_US]
dc.contributor.copyright-release: Not Applicable [en_US]