Comments:
How can I learn swiftUI and ARKit ???
Wow! It works! Thank you!
Is it possible to generate a predefined set of movements and compare it with my skeleton's movements? To be clear, compare two skeletons' positions?
Is there any way to detect and track wrist, arm, and hand points only from a tight angle, like if I am holding my phone and looking at my arm?
Can Motion Capture be used in macOS Catalyst apps? So far I have had no luck with that.
Nice demo! Is it possible to use the body tracking data and apply it to a mesh in real time with RealityKit?
Hello Ryan,
I followed your tutorial step by step. The project build is successful but still, skeleton points aren't visible. Can you please help?
Please can you help
I get an error in the ARViewContainer on line 19: ARView is underlined in red and it says fatal error. What should I do?
Thank you
Are the tracking points better now? They seemed unreliable when accurate data was needed.
Thank you! Do you have a suggestion on how to also run other inference while or after body tracking is running (e.g., object detection using the Vision API)? I can imagine a pipeline that looks like: Camera -> ARKit (ARBodyTrackingConfiguration) -> Vision (or maybe Core ML). For example, if we can track that the human body is doing a certain pose, we can then trigger detection of the kind of shirt they're wearing.
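[Editor's note: a minimal, speculative sketch of the pipeline described above, feeding ARKit camera frames into a Vision request. The class name `FrameAnalyzer` and the throttling approach are assumptions, not anything from the video; the Core ML shirt classifier is left as a placeholder.]

```swift
import ARKit
import Vision

// Sketch: run a Vision request on ARKit camera frames while an
// ARBodyTrackingConfiguration session is active.
final class FrameAnalyzer: NSObject, ARSessionDelegate {
    private var isProcessing = false

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Throttle: skip frames while a request is in flight, so the
        // delegate never holds a backlog of ARFrames.
        guard !isProcessing else { return }
        isProcessing = true
        let pixelBuffer = frame.capturedImage

        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            defer { self?.isProcessing = false }
            // Stand-in request; swap in a VNCoreMLRequest with a custom
            // model for something like shirt classification.
            let request = VNClassifyImageRequest()
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                                orientation: .right)
            try? handler.perform([request])
            // Inspect request.results here, e.g. only after the body
            // tracking data has matched the pose you care about.
        }
    }
}
```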
Thank you 😊
This is so ridiculous. They took Microsoft Kinect technology and can't even match results from 10 years ago. Kinect v1 and v2 had better mocap than this garbage. Even the full-body skeleton is resized every frame and floats in mid-air.
Hello, new to this sort of stuff. Great video, by the way. Could this be used for a real-time augmented reality garment try-on if I have a large screen and camera set up?
Such a great tutorial! You are extremely articulate and describe your steps super clearly. Thanks for putting this together!
Hi Ryan - thanks - super helpful! I got the code from Patreon, and my skeleton is mirrored, i.e., when I raise my right hand, the skeleton raises its left. Have you seen that? Anyone else? Any solution?
Followed the tutorial and the app builds just fine, but it freezes whenever I point my camera at a person. The console prints out: <0x113939bb0>: ARSessionDelegate is retaining 15 ARFrames. This can lead to future camera frames being dropped.
How can I fix this?
Tested on iPhone XS, iOS 15.6
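[Editor's note: a hedged sketch of one common cause of the "ARSessionDelegate is retaining N ARFrames" warning above: capturing the whole ARFrame in an async closure keeps its camera buffer alive. Whether this matches the tutorial's code is an assumption; the fix is to copy only the lightweight values you need and let the frame go.]

```swift
import ARKit

final class BodyDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // BAD:  DispatchQueue.global().async { self.process(frame) }
        //       -- the closure retains the ARFrame and its pixel buffer.
        // GOOD: extract cheap value types synchronously instead.
        guard let body = frame.anchors.compactMap({ $0 as? ARBodyAnchor }).first
        else { return }
        let rootTransform = body.transform  // simd_float4x4, cheap to copy

        DispatchQueue.global(qos: .userInitiated).async {
            // Work with the copied transform, never the ARFrame itself.
            _ = rootTransform
        }
    }
}
```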
case leftShoulderToLeftArm
case leftArmToLeftForearm
case leftForearmToLeftHand
case rightShoulderToRightArm
case rightArmToRightForearm
case rightForearmToRightHand
case spine7ToLeftShoulder
case spine7ToRightShoulder
case neck1ToSpine7
case spine7ToSpine6
case spine6ToSpine5
case hipsToLeftUpLeg
case leftUpLegToLeftLeg
case leftLegToLeftFoot
case hipsToRightUpLeg
case rightUpLegToRightLeg
case rightLegToRightFoot
Can we count the body actions with this?
Hi Ryan, could you please list the supported devices for this particular code?
And another thing is not clear to me.
In the init method we instantiate jointEntity and boneEntity and add them as children of the parent entity.
But these child entities (in particular jointEntity) are given no translation, rotation, or even scale...
So they should collapse on top of each other at the center of the anchor entity that contains the parent entity!
Why doesn't this happen and we see the whole skeleton appear correctly?
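[Editor's note: a hedged sketch of the usual answer to the question above: the joint entities do start at the parent's origin, but they get repositioned every frame from the body anchor's skeleton in the session update callback, so they never stay collapsed. Whether the tutorial names its entities after the joint names is an assumption here.]

```swift
import ARKit
import RealityKit

// Reposition each joint entity from the tracked skeleton, relative to
// the body anchor's root (the hips). Called once per frame update.
func updateJoints(parent: Entity, bodyAnchor: ARBodyAnchor) {
    let skeleton = bodyAnchor.skeleton
    let jointNames = ARSkeletonDefinition.defaultBody3D.jointNames
    for (index, jointName) in jointNames.enumerated() {
        // Model transform = joint pose relative to the body anchor.
        let modelTransform = skeleton.jointModelTransforms[index]
        let position = simd_make_float3(modelTransform.columns.3)
        parent.findEntity(named: jointName)?.position = position
    }
}
```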
Hello,
first of all thanks for this tutorial!
I have followed all the steps and the app works,
however, there are two things I have noticed that are not working well:
1. if the person rotates in place, the skeleton does not rotate
2. if the device is rotated before framing the person, the skeleton appears rotated differently from the person (it is rotated the way the device was rotated when the app started)
The two problems are probably a single problem, but I cannot figure out how and where to give the skeleton the correct rotation (starting from the ARBodyAnchor, I suppose)...
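[Editor's note: a speculative sketch for the rotation issue above, assuming the code copies only the body anchor's translation to the skeleton root. Copying its orientation as well should make the skeleton turn with the person; `skeletonRoot` is a placeholder name, not from the tutorial.]

```swift
import ARKit
import RealityKit

// Copy both the position AND the rotation of the tracked body's root
// from the ARBodyAnchor onto the skeleton's root entity.
func updateRoot(of skeletonRoot: Entity, from bodyAnchor: ARBodyAnchor) {
    let t = bodyAnchor.transform              // hip root in world space
    skeletonRoot.position = simd_make_float3(t.columns.3)
    // The piece that is missing when only the position is copied:
    skeletonRoot.orientation = simd_quatf(t)  // rotation of the hip root
}
```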
Hey :)
Thanks for all your work!
Is it possible to record and export the motion data to other programs, to later smooth the animations and use them for characters in, e.g., Blender?
Hi Ryan, I just tested it on my phone and it worked pretty well. Actually, my phone went to sleep during the test. It might be good to add a line of code (UIApplication.shared.isIdleTimerDisabled = true) to avoid sleep. What do you think? Thanks.
Thank you, Ryan, for that fantastic video. It was so helpful and instructive.
Thank you Ryan, another great video.
Welcome back! Looking forward to more Apple AR content!
Great video, very well explained. Have you already reviewed the new features of ARKit 6?
Nice to see you again!
You’re back 😍