
How to Get Face Landmarks, Hand Landmarks, and Body Landmarks? #4

Open
Epoch2022-iOS opened this issue Dec 8, 2022 · 8 comments
Labels: question (Further information is requested)

Epoch2022-iOS commented Dec 8, 2022

Hello, I'm the developer of an iOS application.
I need to use MediaPipe to track the human body, face, and hands. I tried running the example demo and got the realtime pixel buffer, but I couldn't get the face, hand, and body landmarks from that code.

I need your help to get those landmarks.

Language: Swift/Objective-C
Xcode version: 14.0.1
Device: iPhone 12 Pro Max

61315 self-assigned this Dec 10, 2022
61315 added the question label Dec 10, 2022
61315 (Owner) commented Dec 23, 2022

@Epoch2022-iOS Have you tried the Holistic example from the official repo? They provide examples for JavaScript, Android, and iOS, and I highly advise you to try them.

Please provide a complete description of the issue as well. Otherwise, I honestly don't know how to help you.

If this is a request for adding a prebuilt version of the holistic example, please say so.

I'm also willing to change the current iOS playground project so that it includes all of the examples in the mediapipe iOS example package. This would make it one big universal project where you can play around with everything.

Best of luck.

Epoch2022-iOS (Author) commented:


Thanks. We want to use face and hand tracking together in one project with MediaPipe, not Holistic.
Here is my framework demo: https://github.com/Epoch2022-iOS/DoubleFrameworkTest-iOS

61315 (Owner) commented Dec 24, 2022

I get the gist.

Apparently, the interface shown in both the hand and face examples does not provide access to the landmark data.
I'll see to it that these projects have appropriate member functions so that you can get the landmark data.

Thank you.

Epoch2022-iOS (Author) commented:


Thanks. Waiting for your response!

61315 (Owner) commented Dec 26, 2022

Check out the new delegate:

- (void)tracker:(MPPBFaceGeometry *)tracker didOutputGeometry:(NSArray<NSNumber *> *)indices withVertices:(NSArray<NSNumber *> *)vertices withFace:(NSInteger)index;

Also see the implementation of how to consume the landmark data.

let vertexSource = SCNGeometrySource(
    data: vertexData, semantic: .vertex, vectorCount: vertices.count / 5,  // 5 floats per vertex
    usesFloatComponents: true, componentsPerVector: 3,                     // x, y, z position
    bytesPerComponent: MemoryLayout<Float>.size,
    dataOffset: MemoryLayout<Float>.stride * 0,                            // positions start at byte 0
    dataStride: MemoryLayout<Float>.size * 5)                              // packed as x, y, z, u, v
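
For reference, a fuller sketch of consuming that callback end to end might look like the following. Only the vertex layout is taken from the snippet above; the helper name and the texture-coordinate source are illustrative assumptions, not code from this repo.

import SceneKit

// Sketch (not the repo's code): build an SCNGeometry from the delegate's
// packed arrays, assuming the x, y, z, u, v layout implied by the stride.
func makeFaceGeometry(indices: [NSNumber], vertices: [NSNumber]) -> SCNGeometry {
    let floats = vertices.map { $0.floatValue }
    let vertexData = Data(bytes: floats, count: floats.count * MemoryLayout<Float>.size)

    let vertexSource = SCNGeometrySource(
        data: vertexData, semantic: .vertex, vectorCount: floats.count / 5,
        usesFloatComponents: true, componentsPerVector: 3,
        bytesPerComponent: MemoryLayout<Float>.size,
        dataOffset: 0, dataStride: MemoryLayout<Float>.size * 5)

    // Assumption: the two trailing floats per vertex are texture coordinates.
    let texcoordSource = SCNGeometrySource(
        data: vertexData, semantic: .texcoord, vectorCount: floats.count / 5,
        usesFloatComponents: true, componentsPerVector: 2,
        bytesPerComponent: MemoryLayout<Float>.size,
        dataOffset: MemoryLayout<Float>.size * 3, dataStride: MemoryLayout<Float>.size * 5)

    let triangles = indices.map { $0.int32Value }
    let indexData = Data(bytes: triangles, count: triangles.count * MemoryLayout<Int32>.size)
    let element = SCNGeometryElement(
        data: indexData, primitiveType: .triangles,
        primitiveCount: triangles.count / 3, bytesPerIndex: MemoryLayout<Int32>.size)

    return SCNGeometry(sources: [vertexSource, texcoordSource], elements: [element])
}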

As far as it goes, this answers your problem, I believe. Use it in good health and happy new year.

20221226-214218-954.mp4

Epoch2022-iOS (Author) commented:


Thanks for your reply!
From your example video, I see that you use MediaPipe's face tracking via MPPBFaceGeometry.h, but we want to use face and hand tracking together. When I export the face and hand tracking SDKs into the same project, I get an error: "Class CaptureDelegate is implemented in both...".
This is my project; you can download it and run the target and you will see the error.

Thanks again!
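
For what it's worth, the setup being asked for here — one camera feed driving both a face tracker and a hand tracker — could be wired up roughly as below once the duplicate-class issue is resolved. The LandmarkTracker protocol is a hypothetical stand-in for whatever interface each framework exposes; only the AVFoundation plumbing is standard API.

import AVFoundation

// Hypothetical stand-in for the landmark interface of each framework.
protocol LandmarkTracker {
    func processVideoFrame(_ pixelBuffer: CVPixelBuffer)
}

// Sketch: one AVCaptureSession fanning frames out to several trackers,
// e.g. one face tracker and one hand tracker.
final class DualTrackingCamera: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let trackers: [LandmarkTracker]

    init(trackers: [LandmarkTracker]) {
        self.trackers = trackers
        super.init()
    }

    func start() {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // The same frame goes to every tracker; each reports landmarks
        // through its own delegate.
        trackers.forEach { $0.processVideoFrame(pixelBuffer) }
    }
}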

Epoch2022-iOS (Author) commented:


Thank you very much for your patient answer!
Let me describe my problem again: we need to obtain hand landmarks as well as face landmarks.
After referring to the demo you provided, I did not find an effective solution. For the same problem I also consulted Google's MediaPipe engineers, who told me I needed to customize a .pbtxt graph file and then write a new BUILD file for it.
Here is my Q&A with Google: https://github.com/google/mediapipe/issues/3936#event-8092845671

ghost commented Dec 26, 2022


Hello! The demo you made is amazing, and the texture material fits the face very well! I am an iOS AR developer; I use the leftEyeTransform and rightEyeTransform data in ARKit for funny expressions and eye tracking.

My question: how can I obtain leftEyeTransform and rightEyeTransform data from MediaPipe's Face Mesh? In ARKit these are simd_float4x4 transform matrices indicating the position and orientation of the left and right eyes.
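
One possible approach, since MediaPipe's Face Mesh does not expose eyeball poses directly, is to approximate each eye transform from landmark positions. This is only a sketch under assumptions: the indices are the commonly cited eye-corner/nose/chin indices of the 468-point mesh (verify them against your mesh), and the orientation comes from the face plane, not actual gaze.

import simd

// Sketch: approximate an ARKit-style eye transform from face-mesh landmarks.
// Position = midpoint of the eye corners; orientation = basis built from the
// eye line and a rough "up" direction along the face.
func approximateEyeTransform(landmarks: [SIMD3<Float>],
                             innerCorner: Int, outerCorner: Int,
                             noseTip: Int = 1, chin: Int = 152) -> simd_float4x4 {
    let center = (landmarks[innerCorner] + landmarks[outerCorner]) / 2

    // Orthonormal basis: x along the eye line, y roughly "up" the face.
    let xAxis = simd_normalize(landmarks[outerCorner] - landmarks[innerCorner])
    let roughUp = simd_normalize(landmarks[noseTip] - landmarks[chin])
    let zAxis = simd_normalize(simd_cross(xAxis, roughUp))
    let yAxis = simd_cross(zAxis, xAxis)

    return simd_float4x4(columns: (
        SIMD4<Float>(xAxis, 0),
        SIMD4<Float>(yAxis, 0),
        SIMD4<Float>(zAxis, 0),
        SIMD4<Float>(center, 1)))
}

// Usage sketch (362/263 = left-eye corners, 133/33 = right-eye corners in the
// 468-point mesh — verify before relying on them):
// let leftEye  = approximateEyeTransform(landmarks: pts, innerCorner: 362, outerCorner: 263)
// let rightEye = approximateEyeTransform(landmarks: pts, innerCorner: 133, outerCorner: 33)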
