iPi Soft

Motion Capture for the Masses

Archive for the ‘All’ Category

The New Update 4.6.3 Released

Posted on: April 4th, 2024

What’s new:

  • Added an Auto-recover option to AI-based Tracking. It automatically detects and fixes tracking errors with the help of AI-based tracking (see the sketch after this list).
  • [BUG fix] Fixed rare crashes in AI-based tracking.
  • [BUG fix] Fixed a crash on project closing when the Pose Mismatch view mode is on.
  • [BUG fix] Projects recorded with Real-time Tracking for Live Preview could not be opened.
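The release notes do not describe how Auto-recover works internally, but the general idea of detecting and repairing tracking errors can be sketched in a few lines. Everything below is an assumption for illustration only: fit_error and retrack_span are hypothetical stand-ins for the tracker's internals, and the threshold and padding values are invented.

    # Conceptual sketch only, not iPi Soft's published code: flag frames whose
    # model-to-data fit error is abnormally high, then re-run tracking on a
    # padded span around them to repair the result.

    def auto_recover(frames, fit_error, retrack_span, threshold=2.0, pad=5):
        # fit_error(frame) -> float; retrack_span(start, end) -> list of poses.
        # Both callables are hypothetical stand-ins for the tracker's internals.
        bad = [i for i, f in enumerate(frames) if fit_error(f) > threshold]
        if not bad:
            return
        # Simplified: re-track one padded span covering all flagged frames;
        # a real implementation would merge nearby spans instead.
        start = max(0, bad[0] - pad)
        end = min(len(frames) - 1, bad[-1] + pad)
        for i, pose in zip(range(start, end + 1), retrack_span(start, end)):
            frames[i].pose = pose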

For more information and download links, see the Release notes.

The New Update 4.6.2 Released

Posted on: December 26th, 2023

What’s new:

  • Trajectory filtering for AI poses, improving overall AI-based Tracking quality.
  • Interpolation of AI poses missed due to misdetections, further improving tracking quality (see the sketch after this list).
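iPi Soft has not published how this interpolation works; a minimal sketch of the general idea, assuming poses are stored as numeric arrays and missed detections appear as None, might look like this:

    import numpy as np

    def interpolate_missing_poses(poses):
        # Linearly fill frames where the AI detector returned no pose (None).
        # Conceptual sketch only; the product's actual method is not published.
        poses = list(poses)
        valid = [i for i, p in enumerate(poses) if p is not None]
        for i, p in enumerate(poses):
            if p is not None:
                continue
            lo = max((v for v in valid if v < i), default=None)
            hi = min((v for v in valid if v > i), default=None)
            if lo is None or hi is None:
                continue  # cannot interpolate at the sequence edges
            t = (i - lo) / (hi - lo)
            poses[i] = (1 - t) * poses[lo] + t * poses[hi]
        return poses

    # Example: frame 1 is a misdetection and gets the average of its neighbors.
    print(interpolate_missing_poses([np.zeros(2), None, np.ones(2)]))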

For more information and download links, see the Release notes.

30% Holiday Discount: Merry Christmas and Happy New Year! Stay safe and creative!
Use the coupon code NY2024 to get 30% off any product till Jan 17th, 2024.

The New Update 4.6.1 Released

Posted on: November 8th, 2023

The new version 4.6.1 includes improvements to AI-based pose detection and a number of bug fixes.

What’s new:

  • Detect Pose (AI) now sets head and feet poses and hand orientations when the Track or Track (AI-Based) tracking option is selected.
  • Improved quality of Detect Pose (AI) for arms, legs and feet.
  • Detect Pose (AI) now sets realistic arm and leg twists (previously set to zero).
  • Prevented extreme arm twists in jitter removal and Detect Pose (AI).

Bug fixes:

  • Trajectory filtering was not applied after AI-based tracking.
  • AI-based feet tracking sometimes returned an invalid pose that led to a crash.
  • Rare crashes in trajectory filtering and jitter removal related to AI-based tracking.

For more information and download links, see the Release notes.

Using iPi Motion Capture To Teach 3D Animation In A Public School

Posted on: September 19th, 2023

A Q&A With Josep Maria Duque, Technical Coordinator at EMAV (School of Audiovisual Media), Barcelona

Since 1970, EMAV («Escola de Mitjans Audiovisuals», https://www.emav.com/), a public school under the Barcelona Education Consortium, has been actively involved in training professionals in the field of Audiovisual Communication. The sector's exciting evolution over these decades has shaped the school's convictions about course content and teaching methodology.

As a keystone of audiovisual production, 3D animation holds an important place in the school's higher education curriculum; the 2-year course comprises 2,000 hours. Motion capture is an important technique in modern production pipelines, so the school needs effective tools for teaching it. EMAV's choice was iPi Motion Capture, and it has been successfully used in the school for over 5 years.

We spoke with Josep Maria Duque, EMAV’s technical coordinator, to find out how the school is using iPi Soft (https://www.ipisoft.com/) technology.

Q. Please describe the areas at EMAV that are using iPi Motion Capture.

iPi Motion Capture is used at the «Escola de Mitjans Audiovisuals» (EMAV) to make the motion captures used in the "3D Animation" and "Videogame Design" subjects of the "3D Animation and Videogames" cycle, where the movements made by the students are captured and applied to the bipedal characters that are imported into 3D Studio and Unreal.

Q. How long has iPi Motion Capture been in use at EMAV?

We have been using iPi for 5 years, first evaluating its possibilities with the trial version and later acquiring the commercial version.

Q. How did the EMAV team first hear about the software?

We found out about iPi on the Internet, reading about its capabilities in various forums (like https://www.cgchannel.com).

Q. Please describe in detail how iPi Motion Capture is being used at EMAV.

We use the "Basic" version to capture an actor with iPi Recorder and two Kinect v2 cameras (sometimes with up to 6 Sony PS3 Eye cameras instead), then process the recordings in iPi Mocap Studio and export the "bip" file to 3D Max, where we import it onto a Character Studio model for rendering, or export it to Unreal to make a video game character. We have also sometimes used a Live Link connection, streaming from iPi to Unreal via the Real-Time Tracking feature to capture in real time for a live performance.

Q. How many people (students, teachers) have the opportunity to work with iPi Mocap?

Normally we have 30 students per course, but the captures are made in small groups of 3 or 4 students so as not to overwhelm the capture set, and later they are processed individually on a PC that has the program installed.

Q. Does using iPi Motion Capture improve upon a previous workflow, or is this something completely new?

Before iPi we used manual video rotoscoping techniques, which required a lot of work.

Q. What are the major advantages of iPi Motion Capture vs competing products that made you choose it for your work?

The main advantage is that it can be used with affordable hardware, such as Kinect cameras or webcams, and that the process is largely automatic. When we need an ultra-precise capture, we use the capture system with VIVE trackers and Valve Index controllers for finger capture.

Q. Where do you see the future of markerless motion capture use in your area?

It is increasingly common to use simple webcams that, supported by “AI” techniques, simplify the process and greatly reduce costs and capture time, so systems with “markers” or complex hardware will be relegated to high-budget productions or technical-scientific applications.

Q. Please describe a particular project iPi Motion Capture is being used for.

For several years, iPi has been used in the MP ("Project module") subject at the end of the course to animate the 3D characters of the video games made by the students of the cycle.

The New Version 4.6 is Out!

Posted on: September 6th, 2023

For the past year our dev team has been hard at work on the new version, which has just come out. We experimented a lot with recently introduced AI-based tracking algorithms and found an effective way to use them for the highly accurate full-body motion capture we have been mastering since 2008.

Despite all the hype around AI algorithms, they have not only widely advertised advantages but also limitations. In human motion capture, the major one is that AI algorithms, which guess based on the movement patterns contained in their training datasets, cannot track unique, rare, or complex motions well. So we combined AI tracking with our proprietary high-accuracy algorithms, which find the pose that best matches the actual input data, to make the tracking pipeline easier and faster and to add more features without trading off accuracy.
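As a toy illustration of this hybrid idea (not iPi Soft's actual algorithm; every name and the cost function here are invented for the sketch), an AI guess can seed a model-based optimizer that then fits the pose to the input data:

    import numpy as np

    def ai_detect(observation):
        # Stand-in for an AI detector: a rough pose guess, or None on a miss.
        return observation.get("ai_guess")

    def refine(pose, observation, steps=50, lr=0.2):
        # Toy gradient descent on a squared-error cost against the input data.
        pose = pose.copy()
        target = observation["data"]
        for _ in range(steps):
            pose = pose - lr * 2.0 * (pose - target)  # grad of ||pose-target||^2
        return pose

    def track_frame(observation, prev_pose):
        guess = ai_detect(observation)                # fast pattern-based guess
        init = guess if guess is not None else prev_pose
        return refine(init, observation)              # accurate data-driven fit

    # Example: the AI guess seeds the optimizer, which converges to the data.
    obs = {"ai_guess": np.array([0.9, 0.1]), "data": np.array([1.0, 0.0])}
    print(track_frame(obs, prev_pose=np.zeros(2)))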

Read More>>

30% Fall Discount:

To celebrate this important software update, we offer a special 30% discount on any perpetual, one-year, or annual support prolongation plan in our Online Store. Use coupon code NV4623 (offer expires Sept 26th, 2023).

A Conversation With Blood & Clay Film Director

Posted on: August 20th, 2023

A Q&A With Martin Rahmlow, Director of Blood & Clay Short Film

Blood & Clay (https://bloodandclay.com/bnc.php) is an animated short movie (~20 min) co-directed by Martin Rahmlow and Albert Radl. The movie tells the story of the orphan Lizbeth, who tries to escape her nemesis Director Kosswick and his Golem. The movie is produced in a hybrid stop-motion/CGI technique and is created by a three-member team: Martin Rahmlow, Albert Radl and Onni Pohl.

We spoke with Martin Rahmlow, film director, to find out how they are using iPi Soft (https://www.ipisoft.com/) technology in their production pipeline.

Q. Please tell a few words about yourself and the team.

All three of us met 20 years ago at film school: Filmakademie Baden-Württemberg. Albert supported me during the production of my first stop-motion movie 'Jimmy' (https://vimeo.com/manage/videos/66549889), and I did the VFX for Albert's 'Prinz Ratte' movie (https://www.amazon.de/Prinz-Ratte-Albert-Radl/dp/B07HM1WMGD). 'Blood & Clay' (https://bloodandclay.com/bnc.php) is the first movie on which all three of us are collaborating. Albert and I are the directors of the movie, and Onni focuses on the technical part.

Q. How did the team first hear about iPiSoft Mocap? And, how long has the team been using it?

In May 2015 Onni was assigned to set up lifelike 3D characters for a Mercedes-Benz Sprinter advertisement (production: Infected). It was a freeze moment at a festival: there was a live shoot, but all the background people were given poses extracted from tracking data made with iPi. A colleague of Onni's, Fabian Schaper, had proposed this concept, as he had a Kinect sensor and a suitable notebook.

A year ago we discussed this possibility with our team, and after trying the software we decided that it could help us with 'Blood & Clay', as we have human-like characters and a lot of animation. We did a few tests in the beginning but had our first serious recording session (with an actress and our two directors) just this May. We still have to process and clean the data and will have another session for some missing details.

Q. What is the storyline for the film?

The movie tells the story of the orphan Lizbeth, who tries to escape her nemesis Director Kosswick and his Golem.

Q. Please describe briefly your production pipeline, and the role of iPi Mocap in the pipeline.

  • We use Prism as our production pipeline and scripted small add-ons for our project
  • Texturing and detailing is done with Substance and ZBrush
  • Modelling and rigging is done in Maya
  • Characters get an individual (simplified) mocap rig
  • The sets are scratch-built and painted miniatures (scale 1:4), then scanned and transferred into 3D assets
  • The layout version of the film (animatic) may contain raw parts of the first mocap session
  • During the animation phase, we plan to use our recorded animation (and loops created from it) as a basis for all possible body movements
  • We use Ragdoll to simulate cloth and hair and YETI for fur and hair FX
  • For final rendering we're considering several options: either the classic path-tracing route with Arnold, or rendering with a game engine like Unity or Unreal

Q. How long has the team been working on the film? What were the overall technical challenges the team faced, and how does iPi Mocap help to meet them?

We started production in 2020. Since it will be a 20-minute short, we have a lot of animation with 3 very different but (partly) human characters. iPi's ability to load an individual rig for export lets us fit the recording to the character perfectly at export time. Eventually, all scenes with intensive human body movement that take place on the ground (not hanging or dangling) are planned to get a base body animation with head tracking from iPi; sometimes we use separate hand tracking. Fingers and faces will be animated separately.

We had a first technical test in January. It took some hours until we found the right way to calibrate our two cams. We did our recording session in May, using 2 Kinect v2 cams and 2 JoyCon controllers. We also have an Azure Kinect, but unfortunately it can't be used together with Kinect v2. Nevertheless, our calibration was good, and the tests we did with our recordings turned out very promising.

At the moment we are experimenting with iPi to assist with a roto scene. Hands are shaping clay in this shot, and we want to replace the hands and fingers with 3D hands. We shot a close-up of hands shaping clay and recorded with iPi from another perspective, using special trackers for the hands. We hope this will give us a base for the roto and facilitate the finger roto-motion, as hands are always quite difficult to track while the fingers move a lot. Because we had limited space, we modified the T-pose: arms were stretched out to the front.

In our recording session, we recorded movements like crawling and somebody pushing himself forward with his arms while sitting on a rolling board, but we have not processed those recordings yet and are very curious whether iPi will be able to help us with this task.

Renderosity 2022 Animation Halloween Contest

Posted on: October 27th, 2022

A great opportunity to win a perpetual license of iPi Motion Capture – just send your animation to the Renderosity Animation Halloween Contest. The submission deadline is Oct 31st.

1st prize is the Pro edition, 2nd prize is Basic, and 3rd prize is Express. Many other sponsors are giving away their software licenses as well.

Good luck in the competition!

https://www.renderosity.com/contests/1594/2022-animation-halloween-contest

iPi Mocap Live Link Plugin for Unreal Engine 5.0 Released

Posted on: July 22nd, 2022

The iPi Mocap Live Link plugin for Unreal Engine 5.0 has been released.

Use it in the same way as the UE4 plugins. The Live Link menu item is now in the Window > Virtual Production group in UE5.

Plugin download link: https://files.ipisoft.com/iPiMocap-Unreal-5-0.zip

Docs on how to use the plugin: https://docs.ipisoft.com/Animation_Streaming#Usage_in_Unreal

Important Updates and Enhancements in Motion Transfer Profile Editing

Posted on: June 27th, 2022

We have released updates and fixes for editing of motion transfer profiles.
What’s new:

  • Improvements in the UI for editing of motion transfer profiles:
    • Check that no duplicate targets are used when updating a symmetric bone mapping.
    • Update a symmetric bone when adding/removing a target bone.
    • Enabled the “(Unused)” item in the combo list.
  • Fixes in editing of motion transfer profiles:
    • Fixed a crash when trying to select an already used target bone in the viewport.
    • Fixed inability to map a symmetric bone when it can't be assigned automatically.
    • Fixed an error when reading swing/twist weights from an XML file.

See details in the release notes:

http://docs.ipisoft.com/iPi_Mocap_Studio_Release_Notes#ver._4.5.7.258

The New Version Includes Unreal Engine’s MetaHuman Character Support

Posted on: April 18th, 2022

The recent release of iPi Mocap Studio 4.5.6 includes support for Unreal Engine's MetaHuman character and other motion transfer enhancements:

  • Support for Unreal Engine's MetaHuman character in animation export and streaming.
  • Motion transfer enhancements (see details):
    • Allow for multiple target bones in character mapping.
    • Allow for separate swing and twist rotation channels (see the sketch after this list).
  • Added a built-in motion transfer profile for the Daz Genesis 8 character.
  • Use separate swing and twist rotation channels in the UE4 Mannequin and Endorphin built-in motion transfer profiles.
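For readers unfamiliar with the term, "swing/twist" refers to splitting a bone's rotation into a twist around the bone's own axis and a remaining swing. Below is a minimal sketch of the standard decomposition (illustrative only, using the (w, x, y, z) quaternion convention; this is not iPi Soft's code):

    import numpy as np

    def quat_conj(q):
        # Conjugate (= inverse for a unit quaternion).
        return np.array([q[0], -q[1], -q[2], -q[3]])

    def quat_mul(a, b):
        # Hamilton product of two quaternions.
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])

    def swing_twist(q, axis):
        # Split unit quaternion q into q = swing * twist, where twist rotates
        # about `axis` and swing carries the remaining rotation.
        axis = axis / np.linalg.norm(axis)
        proj = np.dot(q[1:], axis) * axis        # vector part along the axis
        twist = np.array([q[0], *proj])
        n = np.linalg.norm(twist)
        if n < 1e-9:                             # 180-degree swing: twist undefined
            twist = np.array([1.0, 0.0, 0.0, 0.0])
        else:
            twist = twist / n
        swing = quat_mul(q, quat_conj(twist))    # q * twist^-1
        return swing, twist

    # Check: recombining swing and twist reproduces the original rotation.
    q = np.array([0.9239, 0.0, 0.3827, 0.0])     # ~45 degrees about Y
    swing, twist = swing_twist(q, np.array([0.0, 0.0, 1.0]))
    print(quat_mul(swing, twist))                # ~ q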

See details in the release notes:

http://docs.ipisoft.com/iPi_Mocap_Studio_Release_Notes#ver._4.5.6.256
