Inverting the new_joint_vecs? #4
Hi @andrewnc, we actually do not use rotations in our representation, based on our experiments in the appendix, and we followed the HumanML3D format to visualize the motion using positions. If you are seeking to visualize the motion in software like Blender, you may try this. I also visualized the positions with rotations after FK in my script, and the result looks good.
Shunlin
Thank you for the response. To make sure I understand correctly: I would take the 623-dimensional data, extract the XYZ positions for each joint, and run something like https://github.com/IDEA-Research/HumanTOMATO/blob/main/src/tomato_represenation/common/skeleton.py#L103 to recover the rotations for each joint?
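For concreteness, here is a minimal sketch of that idea, assuming the linked skeleton.py mirrors HumanML3D's Skeleton class (i.e. it exposes an inverse_kinematics_np method) and that the 623-dimensional layout follows the HumanML3D convention of 4 root channels + 51×3 local positions + 51×6 rotations + 52×3 velocities + 4 foot contacts, which would make it a 52-joint skeleton. The import path, file name, raw_offsets, kinematic_chain, and face-joint indices below are placeholders, not values confirmed by the repo.

```python
# Hedged sketch: recover per-joint quaternions by running inverse kinematics on
# joint positions, using a HumanML3D-style Skeleton class as in the linked file.
import numpy as np
import torch
from common.skeleton import Skeleton  # assumed import path within tomato_represenation

def positions_to_quaternions(positions, raw_offsets, kinematic_chain,
                             face_joint_idx=(2, 1, 17, 16)):
    """Run IK on global joint positions to obtain per-joint quaternions.

    positions:       (T, J, 3) global joint positions, e.g. a file from new_joints
    raw_offsets:     (J, 3) unit bone-offset array from the repo's skeleton definition
    kinematic_chain: list of joint-index chains matching the dataset's joint order
    face_joint_idx:  hip/shoulder indices used to fix the facing direction;
                     (2, 1, 17, 16) is the HumanML3D body convention and is only
                     an assumption for this 52-joint skeleton.
    Returns (T, J, 4) quaternions (root = global orientation, the rest local).
    """
    skel = Skeleton(torch.from_numpy(raw_offsets).float(), kinematic_chain, "cpu")
    return skel.inverse_kinematics_np(positions, list(face_joint_idx),
                                      smooth_forward=True)

# Usage (file name is hypothetical):
# positions = np.load("new_joints/000001.npy")   # (T, 52, 3)
# quats = positions_to_quaternions(positions, raw_offsets, kinematic_chain)
```

If the Skeleton class also provides forward_kinematics_np (as HumanML3D's does), running FK on the recovered quaternions and comparing the result against the corresponding new_joints file is a quick sanity check, which seems to be what Shunlin describes above.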
This is great work! I have a quick question. After I have processed the motions as described, I have three folders: joint, new_joints, and new_joint_vecs. When training a generative model, as described in your paper, you would use the full 623-dimensional vector in new_joint_vecs.

If you wanted to extract the rotations in new_joint_vecs to apply to an fbx, how would you do this? What I mean is, it seems the joint order has changed, there is no jaw joint, and directly taking the continuous 6D rotations and converting them into quaternions doesn't yield the desired effect.

I'm curious if you have any insights here.
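On the 6D-to-quaternion point, here is a minimal sketch of slicing the rotation block out of a new_joint_vecs file and converting it, again assuming the HumanML3D-style layout with 52 joints (so the 6D block starts right after the 4 root channels and the 51×3 local positions); the offsets and file name are assumptions, not confirmed by the repo. Even with a correct conversion, these are local rotations in the dataset's own joint order and canonical frame, so applying them to an fbx rig with a different hierarchy (for example one that includes a jaw joint) still needs a joint-mapping or retargeting step; and given the note above that the representation is not built around rotations, recovering rotations from positions via IK may be the more dependable route.

```python
# Hedged sketch: slice the continuous-6D block out of new_joint_vecs and convert
# it to quaternions. Offsets assume the HumanML3D-style layout and J = 52.
import numpy as np
from scipy.spatial.transform import Rotation as R

J = 52                                   # assumption: 623 = 4 + 51*3 + 51*6 + 52*3 + 4
ROT6D_START = 4 + (J - 1) * 3            # 157
ROT6D_END = ROT6D_START + (J - 1) * 6    # 463

def cont6d_to_quat(cont6d):
    """Convert (..., 6) continuous 6D rotations to (..., 4) quaternions (x, y, z, w)."""
    a1, a2 = cont6d[..., :3], cont6d[..., 3:]
    b1 = a1 / np.linalg.norm(a1, axis=-1, keepdims=True)
    b2 = a2 - (b1 * a2).sum(-1, keepdims=True) * b1    # Gram-Schmidt orthogonalization
    b2 = b2 / np.linalg.norm(b2, axis=-1, keepdims=True)
    b3 = np.cross(b1, b2)
    # Columns of the rotation matrix are b1, b2, b3; if results look off, match the
    # exact orthonormalization used by the repo's cont6d_to_matrix instead.
    mats = np.stack([b1, b2, b3], axis=-1)
    quats = R.from_matrix(mats.reshape(-1, 3, 3)).as_quat()
    return quats.reshape(cont6d.shape[:-1] + (4,))

feats = np.load("new_joint_vecs/000001.npy")                       # (T, 623); hypothetical file
rot6d = feats[:, ROT6D_START:ROT6D_END].reshape(len(feats), J - 1, 6)
quats = cont6d_to_quat(rot6d)                                       # (T, 51, 4) local rotations
```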