◻ Google MLKit integration
◻ Face detection
◻ Face contours
◻ Face expressions
◻ Face movement
Add the following line to your Podfile:
pod 'YoonitFacefy'
And run in the root of your project:
pod install
This is a basic usage of the YoonitFacefy component.
Feel free to use the demo.
import YoonitFacefy

...

// Load the input image and create the Facefy instance.
let image = UIImage(contentsOfFile: "image path")
let facefy: Facefy = Facefy()

facefy.detect(image!) { faceDetected in
    if let faceDetected: FaceDetected = faceDetected {

        // Face expression probabilities.
        if let leftEyeOpenProbability = faceDetected.leftEyeOpenProbability {
            print(String(format: "%.2f", leftEyeOpenProbability))
        }
        if let rightEyeOpenProbability = faceDetected.rightEyeOpenProbability {
            print(String(format: "%.2f", rightEyeOpenProbability))
        }
        if let smilingProbability = faceDetected.smilingProbability {
            print(String(format: "%.2f", smilingProbability))
        }

        // Head direction angles in degrees (vertical, horizontal and tilt).
        if let headEulerAngleX = faceDetected.headEulerAngleX {
            print(String(format: "%.2f", headEulerAngleX))
        }
        if let headEulerAngleY = faceDetected.headEulerAngleY {
            print(String(format: "%.2f", headEulerAngleY))
        }
        if let headEulerAngleZ = faceDetected.headEulerAngleZ {
            print(String(format: "%.2f", headEulerAngleZ))
        }

        if let cgImage = image?.cgImage {
            // Crop the face from the input image using the bounding box.
            let faceImage = UIImage(
                cgImage: cgImage.cropping(to: faceDetected.boundingBox)!
            ).withHorizontallyFlippedOrientation()
        }
    }
} onError: { message in
    print(message)
}
Function | Parameters | Return Type | Description |
---|---|---|---|
detect | image: UIImage, onSuccess: @escaping (FaceDetected?) -> Void, onError: @escaping (String) -> Void | void | Detects a face in the image and returns the result as a FaceDetected in the closure. |
Attribute | Type | Description |
---|---|---|
boundingBox | CGRect | The face bounding box related to the image input. |
leftEyeOpenProbability | Float? | The left eye open probability. |
rightEyeOpenProbability | Float? | The right eye open probability. |
smilingProbability | Float? | The smiling probability. |
headEulerAngleX | Float? | The angle in degrees that indicates the vertical head direction. See Head Movements. |
headEulerAngleY | Float? | The angle in degrees that indicates the horizontal head direction. See Head Movements. |
headEulerAngleZ | Float? | The angle in degrees that indicates the head tilt direction. See Head Movements. |
contours | [CGPoint] | List of points that represent the shape of the detected face. |
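The contours attribute can be used to draw the detected face shape on top of the input image. Below is a minimal sketch of that idea, assuming the contour points and the bounding box are in the coordinate space of the input image; drawFaceShape is a hypothetical helper written for this example, not part of the library API.

import UIKit
import YoonitFacefy

// Hypothetical helper: draws the face contour points and the bounding box
// of a FaceDetected result on top of the input image.
func drawFaceShape(on image: UIImage, faceDetected: FaceDetected) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: image.size)

    return renderer.image { context in
        // Draw the original image as the background.
        image.draw(at: .zero)

        // Mark each contour point with a small dot.
        context.cgContext.setFillColor(UIColor.green.cgColor)
        for point in faceDetected.contours {
            let dot = CGRect(x: point.x - 2, y: point.y - 2, width: 4, height: 4)
            context.cgContext.fillEllipse(in: dot)
        }

        // Outline the face bounding box.
        context.cgContext.setStrokeColor(UIColor.red.cgColor)
        context.cgContext.setLineWidth(2)
        context.cgContext.stroke(faceDetected.boundingBox)
    }
}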
Here we explain the GIF above and how the results are reached. Each movement (vertical, horizontal and tilt) is a state, based on the angle in degrees that indicates the head direction:
Head Direction | Attribute | v < -36° | -36° < v < -12° | -12° < v < 12° | 12° < v < 36° | 36° < v |
---|---|---|---|---|---|---|
Vertical | headEulerAngleX | Super Down | Down | Frontal | Up | Super Up |
Horizontal | headEulerAngleY | Super Left | Left | Frontal | Right | Super Right |
Tilt | headEulerAngleZ | Super Right | Right | Frontal | Left | Super Left |
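As a rough illustration of how these thresholds translate into the states above, here is a small sketch. The headMovement function and the label strings are only illustrative and are not part of the library; only the angle attributes and the threshold values come from the table.

// Hypothetical helper that classifies one head Euler angle (in degrees)
// using the thresholds from the table above.
func headMovement(angle: Float, negative: String, positive: String) -> String {
    switch angle {
    case ..<(-36): return "Super " + negative
    case -36 ..< -12: return negative
    case -12 ..< 12: return "Frontal"
    case 12 ..< 36: return positive
    default: return "Super " + positive
    }
}

// For example, the vertical direction from headEulerAngleX inside the detect closure:
if let headEulerAngleX = faceDetected.headEulerAngleX {
    print(headMovement(angle: headEulerAngleX, negative: "Down", positive: "Up"))
}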
Clone the repo, change what you want and send a PR. For commit messages we use Conventional Commits.
Contributions are always welcome!
Code with ❤ by the Yoonit Team