
Reconstruction of Human Motions Based on Low-Dimensional Control Signals

dc.contributor.advisor: Weber, Andreas
dc.contributor.author: Tautges, Jochen
dc.date.accessioned: 2020-04-18T00:38:18Z
dc.date.available: 2020-04-18T00:38:18Z
dc.date.issued: 09.08.2012
dc.identifier.uri: https://hdl.handle.net/20.500.11811/5362
dc.description.abstract: This thesis addresses the question of to what extent human full-body motions can be reconstructed from very sparse control signals.
To this end, we first investigate the use of multi-linear representations of human motions. We show that multi-linear motion models, together with knowledge from prerecorded motion capture databases, can be used to realize a basic motion reconstruction framework that relies only on very sparse inertial sensor input. However, because the motion to be reconstructed must be semantically pre-classified and the database requirements are rather restrictive, the described framework is not suitable for a more general motion capture scenario.
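The sketch below illustrates the kind of multi-linear (Tucker-style) motion model such a framework could build from a prerecorded motion capture database. The tensor layout, mode sizes, and ranks are illustrative assumptions, not the parametrization used in the thesis.

```python
# Minimal sketch of a multi-linear motion model via HOSVD, assuming a
# hypothetical data tensor of shape (actors, styles, frames * pose DOFs)
# assembled from a motion capture database. All sizes are illustrative.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the given mode to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_factors(T, ranks):
    """Truncated factor matrices from the SVD of each mode unfolding (HOSVD)."""
    return [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
            for m, r in enumerate(ranks)]

rng = np.random.default_rng(0)
# Illustrative data tensor: 5 actors x 4 styles x 300 (frames * pose DOFs).
data = rng.standard_normal((5, 4, 300))

U = hosvd_factors(data, ranks=(3, 2, 10))

# Core tensor: contract each original mode with its factor matrix in turn.
core = data
for Um in U:
    core = np.tensordot(core, Um, axes=([0], [0]))   # final shape (3, 2, 10)

# A new motion is synthesized by combining rows of the actor/style factors
# with the core and the time/DOF factor; sparse sensor input then constrains
# which combinations of coefficients are plausible.
```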
We address these issues in a second, more flexible approach that relies only on sparse accelerometer readings. Specifically, we attach four 3D accelerometers to the extremities of a human actor and learn a series of local models of human poses at runtime. The main challenge in generating these local models is to find a reliable mapping from the low-dimensional space of accelerations to the high-dimensional space of human poses or motions. We describe a novel online framework that successfully deals with this challenge. In particular, we introduce a novel method for very efficiently retrieving poses and motion segments from a large motion capture database based on a continuous stream of accelerometer readings, as well as a novel prior model that minimizes reconstruction ambiguities while simultaneously accounting for temporal and spatial variations.
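The following sketch shows, under assumed feature and database layouts, how a short window of accelerometer readings can be matched against precomputed database features to retrieve candidate poses. The windowing, distance measure, and database dimensions are assumptions for illustration; the online lazy neighborhood graph described in the thesis adds temporal consistency on top of such per-window neighbor sets, which is not shown here.

```python
# Minimal sketch of retrieving candidate poses from a motion capture database
# using a window of accelerometer readings as the query feature. All sizes
# and the nearest-neighbor scheme are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_frames, n_sensors = 10_000, 4
db_accel = rng.standard_normal((n_frames, n_sensors * 3))   # 4 sensors x 3 axes
db_poses = rng.standard_normal((n_frames, 60))               # e.g. 60 joint-angle DOFs

WIN = 8  # query window length in frames

def window_feature(accel, end):
    """Stack the last WIN acceleration frames ending before 'end' into one vector."""
    return accel[end - WIN:end].ravel()

# Precompute one feature vector per valid window end frame in the database.
db_feat = np.stack([window_feature(db_accel, e) for e in range(WIN, n_frames)])

def knn_poses(query_window, k=16):
    """Return the k database poses whose acceleration windows are closest."""
    q = query_window.ravel()
    d = np.linalg.norm(db_feat - q, axis=1)
    idx = np.argpartition(d, k)[:k]
    return db_poses[idx + WIN - 1], d[idx]     # poses at the window end frames

# Example query: a window of live sensor readings (random stand-in here).
live = rng.standard_normal((WIN, n_sensors * 3))
cand_poses, dists = knn_poses(live)
```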
Third, we outline a conceptually simple yet very effective framework for reconstructing motions based on sparse sets of marker positions. Here, the sparsity of the control signal results from problems that occurred during a motion capture session and is thus unintentional. As a consequence, we do not control which information we can access, which introduces several new challenges. The basic idea of the presented framework is to approximate the original performance by rearranging suitable, time-warped motion subsequences retrieved from a knowledge base of motion capture data that is known to be similar to the original performance.
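The sketch below shows a plain dynamic-time-warping alignment that could be used to score candidate database subsequences against a query segment of the degraded performance. The cost function, feature dimensions, and candidate generation are assumptions for illustration, not the thesis' actual subsequence-graph machinery.

```python
# Minimal dynamic-time-warping sketch for scoring how well a database motion
# subsequence (e.g. marker trajectories) matches a query segment of the
# original performance. Feature sizes are illustrative assumptions.
import numpy as np

def dtw_cost(query, candidate):
    """Accumulated DTW cost between two sequences of feature vectors."""
    n, m = len(query), len(candidate)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = np.linalg.norm(query[i - 1] - candidate[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(2)
query = rng.standard_normal((40, 9))        # e.g. 3 surviving markers x 3D
candidates = [rng.standard_normal((rng.integers(30, 60), 9)) for _ in range(5)]

# Pick the candidate subsequence that aligns best with the query segment;
# stitching such picks back-to-back approximates the original performance.
best = min(range(len(candidates)), key=lambda k: dtw_cost(query, candidates[k]))
```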
dc.language.iso: eng
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Computeranimation
dc.subject: Bewegungserfassung
dc.subject: Bewegungsrekonstruktion
dc.subject: Bewegungssynthese
dc.subject: Beschleunigungssensoren
dc.subject: Inertialsensoren
dc.subject: Multilineares Modell
dc.subject: Tensoren
dc.subject: computer animation
dc.subject: motion capture
dc.subject: motion synthesis
dc.subject: motion reconstruction
dc.subject: motion retrieval
dc.subject: accelerometers
dc.subject: inertial sensors
dc.subject: multi-linear model
dc.subject: tensors
dc.subject: online lazy neighborhood graph
dc.subject: subsequence graph
dc.subject: dynamic time warping
dc.subject.ddc: 004 Informatik
dc.title: Reconstruction of Human Motions Based on Low-Dimensional Control Signals
dc.type: Dissertation or Habilitation
dc.publisher.name: Universitäts- und Landesbibliothek Bonn
dc.publisher.location: Bonn
dc.rights.accessRights: openAccess
dc.identifier.urn: https://nbn-resolving.org/urn:nbn:de:hbz:5n-29462
ulbbn.pubtype: Erstveröffentlichung (first publication)
ulbbnediss.affiliation.name: Rheinische Friedrich-Wilhelms-Universität Bonn
ulbbnediss.affiliation.location: Bonn
ulbbnediss.thesis.level: Dissertation
ulbbnediss.dissID: 2946
ulbbnediss.date.accepted: 27.07.2012
ulbbnediss.fakultaet: Mathematisch-Naturwissenschaftliche Fakultät
dc.contributor.coReferee: Badler, Norman I.


The following terms of use are associated with this resource:

In Copyright