AI Face SDK update #45402

Merged (21 commits, Oct 22, 2024)
Changes from 17 commits
15 changes: 15 additions & 0 deletions sdk/face/Azure.AI.Vision.Face/CHANGELOG.md
@@ -2,14 +2,29 @@

## 1.0.0-beta.2 (Unreleased)

- Added support for the Large Face List and Large Person Group:
  - Added clients `LargeFaceListClient` and `LargePersonGroupClient`.
  - Added operations `FindSimilarFromLargeFaceList`, `IdentifyFromLargePersonGroup` and `VerifyFromLargePersonGroup` to `FaceClient`.
  - Added models for supporting Large Face List and Large Person Group.
- Added support for the latest Detect Liveness Session API:
  - Added operations `GetSessionImage` and `DetectFromSessionImage` to `FaceSessionClient`.
  - Added properties `EnableSessionImage` and `LivenessSingleModalModel` to model `CreateLivenessSessionContent`.
  - Added model `CreateLivenessWithVerifySessionContent`.

### Features Added

### Breaking Changes

- Changed the parameter of `CreateLivenessWithVerifySession` from model `CreateLivenessSessionContent` to `CreateLivenessWithVerifySessionContent`.

### Bugs Fixed

- Removed `Mask` from `FaceAttributes.Detection01`, which is not supported.

### Other Changes

- Changed the default service API version to `v1.2-preview.1`.

## 1.0.0-beta.1 (2024-05-27)

This is the first preview Azure AI Face client library that follows the [.NET Azure SDK Design Guidelines](https://azure.github.io/azure-sdk/dotnet_introduction.html).
29 changes: 26 additions & 3 deletions sdk/face/Azure.AI.Vision.Face/README.md
@@ -6,6 +6,7 @@ The Azure AI Face service provides AI algorithms that detect, recognize, and ana
- Liveness detection
- Face recognition
- Face verification ("one-to-one" matching)
- Face identification ("one-to-many" matching)
- Find similar faces
- Group faces

@@ -97,18 +98,40 @@ AzureKeyCredential credential = new AzureKeyCredential("<your apiKey>");
var client = new FaceClient(endpoint, credential);
```

### Specify the service version

The Azure Face service has multiple versions available, and the client library supports the following versions:

- v1.1-preview.1
- v1.2-preview.1

The service version can be specified when creating the client:

```C# Snippet:CreateFaceClientWithVersion
Uri endpoint = new Uri("<your endpoint>");
DefaultAzureCredential credential = new DefaultAzureCredential();
var client = new FaceClient(endpoint, credential, new AzureAIVisionFaceClientOptions(AzureAIVisionFaceClientOptions.ServiceVersion.V1_2_Preview_1));
```

## Key concepts

### FaceClient

`FaceClient` provides operations for:

- Face detection and analysis: Detect human faces in an image and return the rectangle coordinates of their locations, optionally with landmarks and face-related attributes. This operation is required as a first step in all the other face recognition scenarios.
- Face recognition: Confirm that a user is who they claim to be based on how closely their face data matches the target face.
Support Face verification ("one-to-one" matching).
- Face recognition: Confirm that a user is who they claim to be based on how closely their face data matches the target face. It includes Face verification ("one-to-one" matching) and Face identification ("one-to-many" matching).
- Finding similar faces from a smaller set of faces that look similar to the target face.
- Grouping faces into several smaller groups based on similarity.
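
The one-to-many identification flow added in this release can be sketched as follows. This is a minimal, hypothetical sketch that reuses the `FaceClient` created above and assumes a trained Large Person Group already exists; only the operation name `IdentifyFromLargePersonGroup` comes from this release's changelog, while the overload shape and the result properties (`Candidates`, `PersonId`, `Confidence`) are assumptions and may differ from the generated API.

```C#
// Hypothetical sketch: detect faces, then identify them against a trained Large Person Group.
// The IdentifyFromLargePersonGroup overload and result property names are assumptions.
var detectResponse = client.Detect(new Uri("<your image url>"), FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, true);
var faceIds = detectResponse.Value.Select(face => face.FaceId.Value).ToList();

var identifyResponse = client.IdentifyFromLargePersonGroup(faceIds, "<your large person group id>");
foreach (var result in identifyResponse.Value)
{
    foreach (var candidate in result.Candidates)
    {
        Console.WriteLine($"Face {result.FaceId} matched person {candidate.PersonId} with confidence {candidate.Confidence}.");
    }
}
```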

### FaceAdministrationClient

`FaceAdministrationClient` is provided to interact with the following data structures that hold data on faces and
persons for Face recognition:

- LargeFaceList
- LargePersonGroup
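
A rough end-to-end sketch of preparing a Large Person Group for identification is shown below. It is hypothetical: how the group client is obtained (`GetLargePersonGroupClient`) and the administration method names (`Create`, `CreatePerson`, `AddFace`, `Train`) are assumptions modeled on the service's REST operations; see the Sample5_LargePersonGroup sample for the authoritative shape.

```C#
// Hypothetical sketch: create a Large Person Group, add a person with a face, and train it.
// The accessor and method names below are assumptions and may differ from the generated API.
var administrationClient = new FaceAdministrationClient(endpoint, credential);
var largePersonGroupClient = administrationClient.GetLargePersonGroupClient("<your large person group id>");

largePersonGroupClient.Create("My person group", recognitionModel: FaceRecognitionModel.Recognition04);
var person = largePersonGroupClient.CreatePerson("Bill").Value;
largePersonGroupClient.AddFace(person.PersonId, new Uri("<image url of Bill>"), detectionModel: FaceDetectionModel.Detection03);

// Training is required before the group can be used for identification or verification.
largePersonGroupClient.Train(WaitUntil.Completed);
```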

### FaceSessionClient

`FaceSessionClient` is provided to interact with sessions, which are used for liveness detection.
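
As a minimal, hypothetical sketch, a liveness session that also stores the session image (using the `EnableSessionImage` property added in this release) could be created roughly as follows, assuming a `FaceSessionClient` named `sessionClient` built the same way as the `FaceClient` above; the `CreateLivenessSession` call and the `SessionId`/`AuthToken` result properties are assumptions here.

```C#
// Hypothetical sketch: create a liveness detection session that keeps the session image.
// CreateLivenessSession and the result properties below are assumptions, not the confirmed API.
var content = new CreateLivenessSessionContent(LivenessOperationMode.Passive)
{
    DeviceCorrelationId = Guid.NewGuid().ToString(),
    EnableSessionImage = true,
};

var createResponse = sessionClient.CreateLivenessSession(content);
Console.WriteLine($"Session id: {createResponse.Value.SessionId}");
Console.WriteLine($"Auth token for the client device: {createResponse.Value.AuthToken}");
```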
@@ -163,7 +186,7 @@ foreach (var detectedFace in detectedFaces)
{
Console.WriteLine($"Face Rectangle: left={detectedFace.FaceRectangle.Left}, top={detectedFace.FaceRectangle.Top}, width={detectedFace.FaceRectangle.Width}, height={detectedFace.FaceRectangle.Height}");
Console.WriteLine($"Head pose: pitch={detectedFace.FaceAttributes.HeadPose.Pitch}, roll={detectedFace.FaceAttributes.HeadPose.Roll}, yaw={detectedFace.FaceAttributes.HeadPose.Yaw}");
Console.WriteLine($"Mask: {detectedFace.FaceAttributes.Mask}");
Console.WriteLine($"Mask: NoseAndMouthCovered={detectedFace.FaceAttributes.Mask.NoseAndMouthCovered}, Type={detectedFace.FaceAttributes.Mask.Type}");
Console.WriteLine($"Quality: {detectedFace.FaceAttributes.QualityForRecognition}");
Console.WriteLine($"Recognition model: {detectedFace.RecognitionModel}");
Console.WriteLine($"Landmarks: ");

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion sdk/face/Azure.AI.Vision.Face/assets.json
@@ -2,5 +2,5 @@
"AssetsRepo": "Azure/azure-sdk-assets",
"AssetsRepoPrefixPath": "net",
"TagPrefix": "net/face/Azure.AI.Vision.Face",
"Tag": "net/face/Azure.AI.Vision.Face_7088055bd6"
"Tag": "net/face/Azure.AI.Vision.Face_a2e6c14099"
}
3 changes: 3 additions & 0 deletions sdk/face/Azure.AI.Vision.Face/samples/README.md
@@ -17,3 +17,6 @@ Azure AI Vision Face is a cloud service that gives you access to advanced algor
- From URL
- Detect liveness in faces with session [synchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample2_DetectLivenessWithSession.cs) or [asynchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample2_DetectLivenessWithSessionAsync.cs)
- Detect liveness with face verification with session [synchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample3_DetectLivenessWithVerifyWithSession.cs) or [asynchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample3_DetectLivenessWithVerifyWithSessionAsync.cs)
- Stateless face recognition [synchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample4_StatelessFaceRecognition.cs) or [asynchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample4_StatelessFaceRecognitionAsync.cs)
- Verification and identification from Large Person Group [synchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample5_LargePersonGroup.cs) or [asynchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample5_LargePersonGroupAsync.cs)
- Find similar faces from a large face list [synchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample6_LargeFaceList.cs) or [asynchronously](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/tests/samples/Sample6_LargeFaceListAsync.cs)
@@ -37,7 +37,7 @@ foreach (var detectedFace in detectedFaces)
{
Console.WriteLine($"Face Rectangle: left={detectedFace.FaceRectangle.Left}, top={detectedFace.FaceRectangle.Top}, width={detectedFace.FaceRectangle.Width}, height={detectedFace.FaceRectangle.Height}");
Console.WriteLine($"Head pose: pitch={detectedFace.FaceAttributes.HeadPose.Pitch}, roll={detectedFace.FaceAttributes.HeadPose.Roll}, yaw={detectedFace.FaceAttributes.HeadPose.Yaw}");
Console.WriteLine($"Mask: {detectedFace.FaceAttributes.Mask}");
Console.WriteLine($"Mask: NoseAndMouthCovered={detectedFace.FaceAttributes.Mask.NoseAndMouthCovered}, Type={detectedFace.FaceAttributes.Mask.Type}");
Console.WriteLine($"Quality: {detectedFace.FaceAttributes.QualityForRecognition}");
Console.WriteLine($"Recognition model: {detectedFace.RecognitionModel}");
Console.WriteLine($"Landmarks: ");
@@ -37,7 +37,7 @@ foreach (var detectedFace in detectedFaces)
{
Console.WriteLine($"Face Rectangle: left={detectedFace.FaceRectangle.Left}, top={detectedFace.FaceRectangle.Top}, width={detectedFace.FaceRectangle.Width}, height={detectedFace.FaceRectangle.Height}");
Console.WriteLine($"Head pose: pitch={detectedFace.FaceAttributes.HeadPose.Pitch}, roll={detectedFace.FaceAttributes.HeadPose.Roll}, yaw={detectedFace.FaceAttributes.HeadPose.Yaw}");
Console.WriteLine($"Mask: {detectedFace.FaceAttributes.Mask}");
Console.WriteLine($"Mask: NoseAndMouthCovered={detectedFace.FaceAttributes.Mask.NoseAndMouthCovered}, Type={detectedFace.FaceAttributes.Mask.Type}");
Console.WriteLine($"Quality: {detectedFace.FaceAttributes.QualityForRecognition}");
Console.WriteLine($"Recognition model: {detectedFace.RecognitionModel}");
Console.WriteLine($"Landmarks: ");
@@ -29,7 +29,7 @@ var sessionClient = new FaceSessionClient(endpoint, credential);
Before you can detect liveness in a face, you need to create a liveness detection session with the Azure AI Face service. The service creates a liveness session and responds with a session authorization token.

```C# Snippet:CreateLivenessWithVerifySession
var parameters = new CreateLivenessSessionContent(LivenessOperationMode.Passive) {
var parameters = new CreateLivenessWithVerifySessionContent(LivenessOperationMode.Passive) {
SendResultsToClient = true,
DeviceCorrelationId = Guid.NewGuid().ToString(),
};
@@ -29,7 +29,7 @@ var sessionClient = new FaceSessionClient(endpoint, credential);
Before you can detect liveness in a face, you need to create a liveness detection session with the Azure AI Face service. The service creates a liveness session and responds with a session authorization token.

```C# Snippet:CreateLivenessWithVerifySessionAsync
var parameters = new CreateLivenessSessionContent(LivenessOperationMode.Passive) {
var parameters = new CreateLivenessWithVerifySessionContent(LivenessOperationMode.Passive) {
SendResultsToClient = true,
DeviceCorrelationId = Guid.NewGuid().ToString(),
};
@@ -0,0 +1,113 @@
# Stateless face recognition

This sample demonstrates how to recognize faces in an image without relying on a stored data structure such as a Large Face List or Large Person Group.

To get started you'll need an Azure AI resource or a Face resource. See [README][README] for prerequisites and instructions.

## Creating a `FaceClient`

To create a new `FaceClient` you need the endpoint and credentials from your resource. In the sample below you'll use a `DefaultAzureCredential` object to authenticate. You can set `endpoint` based on an environment variable, a configuration setting, or any way that works for your application. See [Authenticate the client][README_authticate] for instructions.

```C# Snippet:CreateFaceClient
Uri endpoint = new Uri("<your endpoint>");
DefaultAzureCredential credential = new DefaultAzureCredential();
var client = new FaceClient(endpoint, credential);
```

## Verify whether two faces belong to the same person

To verify whether two faces belong to the same person, you can use the `VerifyFaceToFace` method. This method returns a `FaceVerificationResult` object that contains a `Confidence` score indicating the similarity between the two faces.

```C# Snippet:VerifyFaceToFace
var data = new (string Name, Uri Uri)[] {
("Dad image 1", new Uri(FaceTestConstant.UrlFamily1Dad1Image)),
("Dad image 2", new Uri(FaceTestConstant.UrlFamily1Dad2Image)),
("Son image 1", new Uri(FaceTestConstant.UrlFamily1Son1Image))
};
var faceIds = new List<Guid>();

foreach (var tuple in data)
{
var detectResponse = client.Detect(tuple.Uri, FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, true);
Console.WriteLine($"Detected {detectResponse.Value.Count} face(s) in the image '{tuple.Name}'.");
faceIds.Add(detectResponse.Value.Single().FaceId.Value);
}

var verifyDad1Dad2Response = client.VerifyFaceToFace(faceIds[0], faceIds[1]);
Console.WriteLine($"Verification between Dad image 1 and Dad image 2: {verifyDad1Dad2Response.Value.Confidence}");
Console.WriteLine($"Is the same person: {verifyDad1Dad2Response.Value.IsIdentical}");

var verifyDad1SonResponse = client.VerifyFaceToFace(faceIds[0], faceIds[2]);
Console.WriteLine($"Verification between Dad image 1 and Son image 1: {verifyDad1SonResponse.Value.Confidence}");
Console.WriteLine($"Is the same person: {verifyDad1SonResponse.Value.IsIdentical}");
```

## Find similar faces from a list of faces

To find similar faces from a list of faces, you can use the `FindSimilar` method. This method returns a list of `FaceFindSimilarResult` objects that contain the `FaceId` of the face and a `Confidence` score indicating the similarity between the face and the query face.

```C# Snippet:FindSimilar
var dadImage = new Uri(FaceTestConstant.UrlFamily1Dad1Image);
var detectDadResponse = client.Detect(dadImage, FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, true);
Console.WriteLine($"Detected {detectDadResponse.Value.Count} face(s) in the Dad image.");
var dadFaceId = detectDadResponse.Value.Single().FaceId.Value;

var targetImage = new Uri(FaceTestConstant.UrlIdentification1Image);
var detectResponse = client.Detect(targetImage, FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, true);
Console.WriteLine($"Detected {detectResponse.Value.Count} face(s) in the image.");
var faceIds = detectResponse.Value.Select(face => face.FaceId.Value);

var response = client.FindSimilar(dadFaceId, faceIds);
var similarFaces = response.Value;
Console.WriteLine($"Found {similarFaces.Count} similar face(s) in the target image.");
foreach (var similarFace in similarFaces)
{
Console.WriteLine($"Face ID: {similarFace.FaceId}, confidence: {similarFace.Confidence}");
}
```

## Group faces

To group faces, you can use the `Group` method. This method returns a `FaceGroupingResult` object that contains a two-dimensional array of face IDs, where each inner array represents a group of faces that are likely to belong to the same person. It also contains a `MessyGroup` array with the faces that could not be grouped.

```C# Snippet:Group
var targetImages = new (string, Uri)[] {
("Group image", new Uri(FaceTestConstant.UrlIdentification1Image)),
("Dad image 1", new Uri(FaceTestConstant.UrlFamily1Dad1Image)),
("Dad image 2", new Uri(FaceTestConstant.UrlFamily1Dad2Image)),
("Son image 1", new Uri(FaceTestConstant.UrlFamily1Son1Image))
};
var faceIds = new Dictionary<Guid, (FaceDetectionResult, string)>();

foreach (var (imageName, targetImage) in targetImages)
{
var detectResponse = client.Detect(targetImage, FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, true);
Console.WriteLine($"Detected {detectResponse.Value.Count} face(s) in the image '{imageName}'.");
foreach (var face in detectResponse.Value)
{
faceIds[face.FaceId.Value] = (face, imageName);
}
}

var groupResponse = client.Group(faceIds.Keys);
var groups = groupResponse.Value;

Console.WriteLine($"Found {groups.Groups.Count} group(s) in the target images.");
foreach (var group in groups.Groups)
{
Console.WriteLine($"Group: ");
foreach (var faceId in group)
{
Console.WriteLine($" {faceId} from '{faceIds[faceId].Item2}', face rectangle: {faceIds[faceId].Item1.FaceRectangle.Left}, {faceIds[faceId].Item1.FaceRectangle.Top}, {faceIds[faceId].Item1.FaceRectangle.Width}, {faceIds[faceId].Item1.FaceRectangle.Height}");
}
}

Console.WriteLine($"Found {groups.MessyGroup.Count} face(s) that are not in any group.");
foreach (var faceId in groups.MessyGroup)
{
Console.WriteLine($" {faceId} from '{faceIds[faceId].Item2}', face rectangle: {faceIds[faceId].Item1.FaceRectangle.Left}, {faceIds[faceId].Item1.FaceRectangle.Top}, {faceIds[faceId].Item1.FaceRectangle.Width}, {faceIds[faceId].Item1.FaceRectangle.Height}");
}
```

[README]: https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face#getting-started
[README_authticate]: https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face#authenticate-the-client