

I've been using iClone since version 2, and I've had more fun playing during the last couple of weeks with Version 5 and the motion capture plugin than I had with any of the then-new features of the prior versions.

I have been playing with the motion capture, and although my first experiments produced rather disappointing results, practice and experimentation, as well as laying down animations in multiple layers for different body parts, have dramatically improved the results I get. This practice has forced me to face a few limitations of the device and software, but it has also given me a list of relatively minor improvements that would lead to a substantially more usable product. One of the things I love about iClone is the fact that I, as an individual, can do things that just years ago would have taken a team of people to accomplish. Working by myself, however, often leaves me with nobody to press the space key for me to begin recording. As a result, every captured animation begins with my avatar reaching over to press the space key.

1. While you can cut this out of the animation, it would be substantially easier to have a countdown timer that gives you time to position yourself/your avatar before the recording starts.

2. The realtime nature of the capture and the limitations of the capture device/software tend to make captured motions extremely "twitchy" compared to manually animated motions. It would be really nice to have a smoothing filter that could be applied to captured animations after the capture, with variable strength, to make the motions appear less spastic. (I've sketched the idea right after this list.)

3. Limb calibration: maybe it's just my body shape/proportions, but I find that when I capture myself with my arms straight at my sides, my avatar's arms protrude at about 30 degrees from the torso. While this can be manually corrected by editing the motion layer, it makes capture a bit of a challenge. Just like the driver captures an H pose, it should be possible to have a character calibration, where your character appears on the screen and presents several different poses for you to mimic; the captured information could then be used to adjust your animation prior to capture so that the avatar's arms go straight down/up when your arms are straight down/up, its fingers touch when you touch your fingers together, its legs come together when you stand with your feet together, etc. This simple act of calibration would make interaction with a virtual environment much more intuitive, and would eliminate a great deal of motion layer editing. (The second sketch after this list shows the offset idea.)

4. Sometimes there is a complicated motion that needs to be captured (for example, some dance steps) that the sad human operator (i.e. me) is incapable of adequately performing in realtime. A feature that would allow the capture to be made in "slow motion" and then played back in realtime would be useful for capturing complicated or precise motions. (The third sketch after this list shows the retiming idea.)
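To make #2 concrete, here is a minimal sketch of the kind of post-capture pass I have in mind, in plain Python. The per-frame joint angles and the moving-average approach are my own illustration for this post; the plugin doesn't actually expose anything like this.

    # Hypothetical post-capture smoothing: a simple moving-average filter
    # over one joint's per-frame values. The "window" parameter is the
    # variable-strength knob: wider windows remove more jitter but also
    # soften fast, intentional motions.

    def smooth_curve(frames, window=5):
        half = window // 2
        smoothed = []
        for i in range(len(frames)):
            lo = max(0, i - half)                 # clamp at the clip start
            hi = min(len(frames), i + half + 1)   # clamp at the clip end
            smoothed.append(sum(frames[lo:hi]) / (hi - lo))
        return smoothed

    # A "twitchy" elbow-angle curve (degrees, one value per captured frame).
    elbow = [90, 91, 88, 95, 87, 93, 90, 89, 94, 86]
    print(smooth_curve(elbow, window=3))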
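And here is the calibration idea from #3 in the same spirit: capture one known pose, store the per-joint difference between what the avatar should read and what the sensor actually reads, and subtract that difference from every subsequent frame. Again, the joint names, angle format, and numbers are hypothetical illustrations, not anything from the actual software.

    # Hypothetical limb calibration via per-joint offsets (degrees).
    REFERENCE_POSE = {
        "l_shoulder": 0.0,   # arms straight down should read 0 degrees
        "r_shoulder": 0.0,   # of protrusion from the torso
    }

    def compute_offsets(reference, measured):
        """What to subtract from every captured frame, per joint."""
        return {j: measured[j] - reference[j] for j in reference}

    def apply_offsets(frame, offsets):
        return {j: a - offsets.get(j, 0.0) for j, a in frame.items()}

    # Calibration step: I stand with my arms at my sides, but the capture
    # reads roughly 30 degrees of protrusion (the problem described above).
    measured_pose = {"l_shoulder": 30.0, "r_shoulder": 28.0}
    offsets = compute_offsets(REFERENCE_POSE, measured_pose)

    # Every later frame is corrected before it lands in the motion layer.
    frame = {"l_shoulder": 42.0, "r_shoulder": 15.0}
    print(apply_offsets(frame, offsets))  # {'l_shoulder': 12.0, 'r_shoulder': -13.0}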
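Finally, the slow-motion feature from #4 is really just a retiming problem: perform the motion at, say, half speed, then scale the keyframe times by the same factor so it plays back at the intended tempo. A toy sketch, with made-up keyframe data:

    # Hypothetical slow-motion capture: scale keyframe times so a
    # half-speed performance plays back at full speed.

    def retime(keyframes, speed_factor):
        """speed_factor=0.5 means the motion was performed at half speed,
        so every timestamp is halved to restore realtime playback."""
        return [(t * speed_factor, pose) for t, pose in keyframes]

    # Performed over 4 seconds at half speed...
    captured = [(0.0, "poseA"), (2.0, "poseB"), (4.0, "poseC")]
    # ...plays back in 2 seconds at the intended tempo.
    print(retime(captured, 0.5))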
The mocap wishlist is endless, but these seemed like four improvements that could be made relatively easily. NOT easy to implement, but would be awesome:

5. The ability to simultaneously use two or three Kinects or cameras to create more accurate 360 degree motion capture.

6. A hand/gesture capture mode using closeup views of the hands. This would substantially speed the animation process where hands play a large part of the animation, and would even open the door to using the product for producing sign-language animations (although two captures would need to be made: one for the arms, and the other to lay the gestures down).

7. It would be really nifty if an alternate capture mode could be created to calibrate and track only the upper body. Leg animation could then be captured in a second pass using the arms.

