\vspace{-0.125 in}
Dressing is one of the most common activities in human society. Perfecting
the skill of dressing can take an average child three to four years of
daily practice. The challenge is primarily due to the combined difficulty
of coordinating different body parts and manipulating soft and deformable
objects (clothes). We present a technique to synthesize human dressing by
controlling a human character to put on an article of simulated clothing.
We identify a set of \emph{primitive actions} that account for the vast
majority of motions observed in human dressing. These primitive actions
can be assembled into a variety of motion sequences for dressing different
garments with different styles. Exploiting both feed-forward and feedback
control mechanisms, we develop a dressing controller to handle each of the
primitive actions. The controller plans a path to achieve the action goal
and, when necessary, makes local adjustments based on the current state of
the simulated cloth. We demonstrate that our framework is
versatile and able to animate dressing with different clothing types
including a jacket, a pair of shorts, a robe, and a vest. Our controller
is also robust to different cloth mesh resolutions, which can cause the
cloth simulator to generate significantly different cloth motions. In
addition, we show that the same controller can be extended to assistive
dressing.
% Because human dressing motion is difficult to animate or motion capture, the input motion does not need to be exact or complete (a few keyframes or pretend-dressing mocap). We develop a feedback controller that takes into account the state of the cloth.