Issues • Re: Ipi Recorder Crashes with FrameGap
vmaslov wrote:

Does also Mocap Studio crash when tracking this video?

The problem may be related to the Windows Camera Frame Server service, which sometimes causes cameras to produce corrupted frames. Before your next recording, try closing all apps that can use the camera - e.g. video messengers, video recording apps, etc. This may help prevent frame gap and bad frame issues.


Studio crashes as well. I will send the video to the support email. Thank you.

Statistics: Posted by TTrue — Thu Feb 20, 2020 7:19 am


Issues • Re: Ipi Recorder Crashes with FrameGap
Does also Mocap Studio crash when tracking this video?

The problem may be related to the Windows Camera Frame Server service, which sometimes causes cameras to produce corrupted frames. Before your next recording, try closing all apps that can use the camera - e.g. video messengers, video recording apps, etc. This may help prevent frame gap and bad frame issues.
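For context, a "frame gap" is just a missing stretch of frames, which can be spotted by comparing successive frame timestamps against the expected frame period. A minimal sketch (hypothetical timestamps in milliseconds, not iPi's actual file format):

```python
def find_frame_gaps(timestamps_ms, fps=30, tolerance=1.5):
    """Return (index, gap_ms) pairs where the interval between
    consecutive frames exceeds `tolerance` times the expected
    frame period for the given fps."""
    expected_ms = 1000.0 / fps
    gaps = []
    for i in range(1, len(timestamps_ms)):
        delta = timestamps_ms[i] - timestamps_ms[i - 1]
        if delta > tolerance * expected_ms:
            gaps.append((i, delta))
    return gaps

# Hypothetical 30 fps stream where frames were dropped after index 2.
ts = [0, 33, 67, 167, 200]
print(find_frame_gaps(ts))  # [(3, 100)]
```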

Statistics: Posted by vmaslov — Thu Feb 20, 2020 4:15 am


Issues • Re: Ipi Recorder Crashes with FrameGap
Hi
Can you share an .ipivideo file so we can take a look? You may send a link to our support email if you don't want to post it here.

Statistics: Posted by vmaslov — Thu Feb 20, 2020 2:19 am


Issues • Ipi Recorder Crashes with FrameGap
Hello, I am just trying to replay a recorded video in Recorder, and it seems to be throwing an error, maybe when there is a frame gap.
Recorded on iPi Recorder 4.4.4.81 with Kinect for Azure.
It will play fine up to this frame, and if I skip over the frame and start after it, it works fine... until the next frame gap.
Any help is appreciated.

Code:
iPi Recorder 4.4.4.81

OS: Microsoft Windows NT 10.0.18362.0 (64 bits)
.NET: 4.7.2.528040
CPU count: 8
GC: workstation (Interactive)
Running in 64 bits

System.AggregateException: One or more errors occurred. ---> System.NotSupportedException: No imaging component suitable to complete this operation was found. ---> System.Runtime.InteropServices.COMException: The component cannot be found. (Exception from HRESULT: 0x88982F50)
   --- End of inner exception stack trace ---
   at System.Windows.Media.Imaging.BitmapDecoder.SetupDecoderFromUriOrStream(Uri uri, Stream stream, BitmapCacheOption cacheOption, Guid& clsId, Boolean& isOriginalWritable, Stream& uriStream, UnmanagedMemoryStream& unmanagedMemoryStream, SafeFileHandle& safeFilehandle)
   at System.Windows.Media.Imaging.BitmapDecoder..ctor(Stream bitmapStream, BitmapCreateOptions createOptions, BitmapCacheOption cacheOption, Guid expectedClsId)
   at System.Windows.Media.Imaging.JpegBitmapDecoder..ctor(Stream bitmapStream, BitmapCreateOptions createOptions, BitmapCacheOption cacheOption)
   at #=zOF8faGOR5fOX2r_1DuvO$mevsJnvuHw3Zu31sl8=.#=zmPwhZXeKO3BRQCquKg==(Byte[] #=zLd1rFDI=, Int32 #=z8$QiVTU=, Int32 #=z6tC6uYY=, Byte[] #=zQX2WQlJcigFR, BitmapInfo& #=zxVPTWjU=)
   at #=zbkPomS2pQjbPGc_MC0rb30n0tShigTli2w3pYZhhdWZYALtzTw==.#=zgnfp53E=(Byte[] #=zLd1rFDI=, Int32 #=z8$QiVTU=, Int32 #=z6tC6uYY=, Byte[] #=zQX2WQlJcigFR)
   at #=z1AN6ahD6c7CnF3iddyU23Bw=.#=zCSdAgO3hS4GdjB2s44M9U1s=.#=znnScZmx6O5otWxeKNQ==(Byte[] #=zHAaBasuOGGvL)
   at #=zqHlsIeVNbYwpQFcZv5$Xqfc6c05E.#=z5ys9DzWbiqPK(Action`1 #=zOnXLDB6Gif7i)
   at #=z1AN6ahD6c7CnF3iddyU23Bw=.#=zhL2fdpPk7MKM(#=zKPCl8M_LM5nC62OHMA== #=zj6YrC78=)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=zvU4uygibZkhmWLFJr7QGEPw=.#=z$9r_6iUsU2H9WVcn9Q==(#=zZNv5x5the5m2y5Z0_vE69uI= #=z0MPiCiw=)
   at System.Threading.Tasks.Parallel.<>c__DisplayClass17_0`1.<ForWorker>b__1()
   at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
   at System.Threading.Tasks.Task.<>c__DisplayClass176_0.<ExecuteSelfReplicating>b__0(Object <p0>)
   --- End of inner exception stack trace ---
   at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
   at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at System.Threading.Tasks.Parallel.ForWorker[TLocal](Int32 fromInclusive, Int32 toExclusive, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Func`4 bodyWithLocal, Func`1 localInit, Action`1 localFinally)
   at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 source, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
   at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 source, Action`1 body)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=zhL2fdpPk7MKM(#=zKPCl8M_LM5nC62OHMA== #=zj6YrC78=)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=zKv8UeVej7nU5j_bYeE8nzP0=(#=zKPCl8M_LM5nC62OHMA== #=zj6YrC78=)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=znIZM1UByvuvB.#=z6bHYsAo=(Int32 #=z3TPwzs4=, Action`1 #=zmwsGAOY=)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=zhL2fdpPk7MKM(Int32 #=zJsVsdHE=)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=zsnXLD5__sxaQ(Int32 #=zFT06bto=)
   at iPiRecorder.ViewModel.PlaybackController.SetTimeTrackValue(Double value)
   at iPiRecorder.ViewModel.PlaybackController.set_TimeTrackValue(Double value)
   at -.dje_zBGGQPL8ESYSJ783WZU5WBDGJDSNQ_ejd.#=zPfwdb10OBzwF()
   at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
   at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)
---> (Inner Exception #0) System.NotSupportedException: No imaging component suitable to complete this operation was found. ---> System.Runtime.InteropServices.COMException: The component cannot be found. (Exception from HRESULT: 0x88982F50)
   --- End of inner exception stack trace ---
   at System.Windows.Media.Imaging.BitmapDecoder.SetupDecoderFromUriOrStream(Uri uri, Stream stream, BitmapCacheOption cacheOption, Guid& clsId, Boolean& isOriginalWritable, Stream& uriStream, UnmanagedMemoryStream& unmanagedMemoryStream, SafeFileHandle& safeFilehandle)
   at System.Windows.Media.Imaging.BitmapDecoder..ctor(Stream bitmapStream, BitmapCreateOptions createOptions, BitmapCacheOption cacheOption, Guid expectedClsId)
   at System.Windows.Media.Imaging.JpegBitmapDecoder..ctor(Stream bitmapStream, BitmapCreateOptions createOptions, BitmapCacheOption cacheOption)
   at #=zOF8faGOR5fOX2r_1DuvO$mevsJnvuHw3Zu31sl8=.#=zmPwhZXeKO3BRQCquKg==(Byte[] #=zLd1rFDI=, Int32 #=z8$QiVTU=, Int32 #=z6tC6uYY=, Byte[] #=zQX2WQlJcigFR, BitmapInfo& #=zxVPTWjU=)
   at #=zbkPomS2pQjbPGc_MC0rb30n0tShigTli2w3pYZhhdWZYALtzTw==.#=zgnfp53E=(Byte[] #=zLd1rFDI=, Int32 #=z8$QiVTU=, Int32 #=z6tC6uYY=, Byte[] #=zQX2WQlJcigFR)
   at #=z1AN6ahD6c7CnF3iddyU23Bw=.#=zCSdAgO3hS4GdjB2s44M9U1s=.#=znnScZmx6O5otWxeKNQ==(Byte[] #=zHAaBasuOGGvL)
   at #=zqHlsIeVNbYwpQFcZv5$Xqfc6c05E.#=z5ys9DzWbiqPK(Action`1 #=zOnXLDB6Gif7i)
   at #=z1AN6ahD6c7CnF3iddyU23Bw=.#=zhL2fdpPk7MKM(#=zKPCl8M_LM5nC62OHMA== #=zj6YrC78=)
   at #=zZ1ZaC4T3aTAVNk$AxA==.#=zvU4uygibZkhmWLFJr7QGEPw=.#=z$9r_6iUsU2H9WVcn9Q==(#=zZNv5x5the5m2y5Z0_vE69uI= #=z0MPiCiw=)
   at System.Threading.Tasks.Parallel.<>c__DisplayClass17_0`1.<ForWorker>b__1()
   at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
   at System.Threading.Tasks.Task.<>c__DisplayClass176_0.<ExecuteSelfReplicating>b__0(Object <p0>)<---
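The inner exception (HRESULT 0x88982F50, the WIC "component cannot be found" error) is thrown when the JPEG decoder cannot handle a frame's image data, which fits the corrupted-frame theory above. As a rough illustration only (not how iPi Recorder actually validates frames), a truncated JPEG payload can often be detected by checking its start/end markers before attempting to decode:

```python
def looks_like_valid_jpeg(data: bytes) -> bool:
    """Heuristic check: a well-formed JPEG stream starts with the SOI
    marker (FF D8) and ends with the EOI marker (FF D9). A frame that
    fails this check is likely truncated or corrupted and would make
    a decoder throw, much like the JpegBitmapDecoder failure above."""
    return (
        len(data) >= 4
        and data[:2] == b"\xff\xd8"
        and data[-2:] == b"\xff\xd9"
    )

# Hypothetical frames: one intact, one truncated mid-stream.
good = b"\xff\xd8" + b"\x00" * 16 + b"\xff\xd9"
bad = b"\xff\xd8" + b"\x00" * 16  # missing EOI marker

print(looks_like_valid_jpeg(good))  # True
print(looks_like_valid_jpeg(bad))   # False
```

This marker check is only a heuristic; a frame can pass it and still contain corrupted scan data that fails to decode.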

Statistics: Posted by TTrue — Wed Feb 19, 2020 3:12 pm


Issues • Re: Demo Data : dual-k4a-180-wfov-rolls
Please check that the actor model is actually aligned with the point cloud (roughly) by rotating the scene and viewing it from different angles. Sometimes it looks aligned in two dimensions but is not aligned in the third. Please refer to the attached screenshot.

If you still experience the issue, please record a screencast of your actions so we can track down the problem.

Actor align screenshots.jpg

Statistics: Posted by pas — Wed Feb 19, 2020 12:15 am


Issues • Re: Demo Data : dual-k4a-180-wfov-rolls
It's an Nvidia RTX 2070 card.
I'll check at work on Friday to see if I have the latest drivers.

Thx,

Iwan

Statistics: Posted by e1two — Tue Feb 18, 2020 11:53 am


Issues • Re: Demo Data : dual-k4a-180-wfov-rolls
Hmm, this may be a GPU/driver-specific glitch. What is your GPU model? Please ensure you have the latest graphics driver installed.

Statistics: Posted by vmaslov — Tue Feb 18, 2020 8:14 am


Issues • Re: Demo Data : dual-k4a-180-wfov-rolls
Hi Ipi,

Thanks for your quick reply!

Yes. I followed the tutorial, displayed both cameras in the 3D view, and lined up the actor pose
globally, as well as changing the pose to fit them as best as possible. Once that was done, and
after double-checking the actor size, I just hit the Refit Pose button.

Thx,
Iwan

Statistics: Posted by e1two — Tue Feb 18, 2020 7:21 am


General • Re: question which camera?
If not yet, check this comparison of camera setups in our docs
http://docs.ipisoft.com/Multiple_Depth_Sensors_vs_Web_Cameras_vs_Action_Cameras_Comparison

As you mention martial arts, my first impression is that the moves are fast and there is a high likelihood of self-occlusion. In that case, you'll benefit from the greater frame rate and greater number of POVs provided by PS Eye cameras.

Statistics: Posted by vmaslov — Tue Feb 18, 2020 3:29 am


General • question which camera?
I was thinking about getting a 6-camera PS3 Eye setup. I want to do motion capture for martial arts moves. From what I see in the example videos, that system should capture the actions well. Is this a good choice, or is there a better camera path, like the 2 Azure Kinects? Thanks

Statistics: Posted by pbright — Mon Feb 17, 2020 3:21 pm


Issues • Re: Demo Data : dual-k4a-180-wfov-rolls
Hi Iwan
Do you align an actor model with a point cloud before you start tracking?

Statistics: Posted by vmaslov — Mon Feb 17, 2020 10:33 am


Issues • Demo Data : dual-k4a-180-wfov-rolls
Hi all,

I'm getting ready for our first dual Azure capture this coming Friday and downloaded
the sample dual Azure data to get a head start. I can get the data to track using the fast
tracking algorithm; however, when I uncheck this option or try Refit Pose, the skeleton
goes all funky.

Is this normal behaviour, or have I skipped a step? Or do you always need to use the fast
tracking algorithm when using the Azure cameras? Any extra info would help.

Thanks,

Iwan

Statistics: Posted by e1two — Mon Feb 17, 2020 9:30 am


General • Re: Kinect and Kinect 2 - big difference for single capture?
No, the data formats for each type of Kinect sensor are different and not compatible with each other.

For example, you can use multiple Kinect 1 sensors (a.k.a., XBox 360) together, and you can use multiple Kinect 2 sensors (a.k.a., Kinect for XBox One) together, but you can't mix Kinect 1 with Kinect 2 sensors.

Since 'Kinect for Windows' is almost identical to Kinect 1, these two are compatible with each other when you use identical settings.

Just to cover all bases with the Microsoft sensors, Azure Kinect sensors are only compatible with other Azure Kinect sensors.

Statistics: Posted by Greenlaw — Sat Feb 08, 2020 8:11 pm


General • Re: Kinect and Kinect 2 - big difference for single capture?
Another question: can I use a Kinect 1 and a Kinect 2 at the same time?

Statistics: Posted by akira32 — Sat Feb 08, 2020 7:00 pm


General • Re: What direction should I mount the move controller on my
Sure, glad to help.

The only downside to holding the controller is maintaining a firm grip while allowing the wrist to rotate naturally. It's easy to forget and loosen your grip on the controller while performing, but with practice it can become second nature to do both and make it look natural.

Statistics: Posted by Greenlaw — Sat Feb 08, 2020 2:38 pm


General • Re: What direction should I mount the move controller on my
The legendary Greenlaw! Thanks for the response. Really appreciate it. I see you everywhere and reading your experiences has been helpful in getting my stuff started. I think I'll stick to holding the controller as well, at least for now.

Statistics: Posted by Eumenes — Sat Feb 08, 2020 1:25 pm


General • Re: What direction should I mount the move controller on my
Tip: Early on, I experimented with attaching a controller to the back of a workout glove. To do this, I stitched a dense foam rubber riser to the back of the glove and used a wide velcro strap to attach the controller. The riser was required because when I bent my wrist, the long handle of the sensor would hit my arm and limit the range of movement. By lifting the controller, it was less likely to hit my arm.

After some successful testing, I decided to abandon the glove and just hold the controller in my hands. I prefer this because I can use the controller's buttons as a remote to operate iPi Mocap Studio. I can't do this easily when it's attached to a glove.

Some users will disassemble the controller and just attach the inner parts to a glove. I've never bothered to go that far with it. For me, holding the sensor in my hand has worked fine.

Statistics: Posted by Greenlaw — Fri Feb 07, 2020 8:35 pm


General • Re: What direction should I mount the move controller on my
You just need to be able to point the controller directly at a sensor at the beginning of your recording. This gives Mocap Studio the reference frame it needs for the adjustments you make later. It is important that the orientation of the controller remains fixed relative to the hand, whether you're holding it or it's attached to the hand.

For example, if you're holding the controller in your hand, you need to keep a firm grip on it. A loose, sloppy grip is bound to record poor data.

When the controller is mounted to the back of the hand, it's typically aligned with the fingers, mainly because it's easier to point at a sensor this way. As you noticed, this orientation is different from when the controller is held in the hand, but it really doesn't matter. You just need to make sure the orientation remains constant relative to the hand. So when it's attached to the hand, it must be attached firmly and not allowed to bounce, twist, or slide when you move around.

After recording and tracking the body mocap, you can adjust the orientation of the controller on the Actor skeleton. When the controller is manually aligned to match the orientation seen in the reference frame, the hands/head motion can be applied. The key is to make sure the character's body motion is tracked, corrected, and smoothed before you apply the hands/head data.

The Wiki has more detailed information about this:

http://docs.ipisoft.com/Motion_Controllers

I hope this helps. Good luck!

Statistics: Posted by Greenlaw — Fri Feb 07, 2020 8:28 pm


General • What direction should I mount the move controller on my hand
So, I finally got my hands on some Move controllers and I want to make them a little less bulky. Threads like this one: viewtopic.php?f=2&t=9665 have shown that it's not too difficult to strip the outer casing and leave only the essentials, like the board and battery. So, my only concern right now is which direction to point the stripped-down Move controller board.


https://www.youtube.com/watch?v=4mV7yITXyBU <= Seed Crystal Games stripped their Move controller and has it pointing forward on the back of the hand, like some sort of wrist-mounted gun. But the earlier-mentioned thread has theirs mounted horizontally across the palm. I'm confused by this, as wouldn't you have to bend your wrist in a strange way to calibrate it when pointing at the camera? What's the best way to mount it?

Statistics: Posted by Eumenes — Thu Feb 06, 2020 9:40 pm


General • Re: Kinect and Kinect 2 - big difference for single capture?
Yeah, naming is confusing:

1. Kinect 1 = Kinect for XBox 360.

2. Kinect for Windows is similar to the above but has 'Near Mode', good for scanning small objects and face tracking (with other software).

3. Kinect 2 = Kinect for XBox One

4. The latest version is Azure Kinect, which is a completely new and different device. This one records the highest quality yet.

For iPi Mocap Studio, options 1 & 2 deliver the same quality. Option 2 gives you a few extra features useful for adjusting the image quality, but it's not critical. Mocap quality is okay. Option 1 is what I used for the early Brudders projects like Happy Box; option 2 is what I used for the Brudders music video excerpt, and for VFX in a few SyFy channel feature films.

With option 3, the mocap data is less noisy, which means less jittering. This is what I've been using recently, and I've been mostly satisfied with it. I used it for a little VFX work a few years ago.

Option 4 is the highest quality depth sensor available for iPi Mocap Studio. I got a set of these last week and I'm setting aside time to test them this weekend. I'll post more info on my progress shortly after.

Re: 1 sensor vs 2 sensors, there is a HUGE difference in quality. Here's more info:

Using 1 sensor in iPi Mocap Studio results in higher quality than using 1 sensor with the Kinect SDK (realtime capture) because of the software's separate tracking process. However, it's prone to the same occlusion errors you get with single sensors in general. A single sensor is adequate for 'standing in place' performances with limited arm movements.

Using 2 or more sensors removes the occlusion issue, allowing for more accurate recordings for walking, spinning, jumping, and other complex movements. I highly recommend using 2 sensors; the results are significantly better even when using the old Xbox 360 Kinect.

Hope this helps.

Statistics: Posted by Greenlaw — Thu Feb 06, 2020 8:35 am