# coding: utf-8
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
"""
FILE: sample_face_liveness_detection_async.py
DESCRIPTION:
This sample demonstrates how to determine if a face in an input video stream is real (live) or fake (spoof).
The liveness solution integration involves two different components: a mobile application and
an app server/orchestrator, and here we demonstrate the entire process from the perspective of the server side.
For more information about liveness detection, see
https://learn.microsoft.com/azure/ai-services/computer-vision/tutorials/liveness.
USAGE:
python sample_face_liveness_detection_async.py
Set the environment variables with your own values before running this sample:
1) AZURE_FACE_API_ENDPOINT - the endpoint to your Face resource.
2) AZURE_FACE_API_ACCOUNT_KEY - your Face API key.
"""
import asyncio
import os
import uuid

from dotenv import find_dotenv, load_dotenv

from shared.constants import (
    CONFIGURATION_NAME_FACE_API_ACCOUNT_KEY,
    CONFIGURATION_NAME_FACE_API_ENDPOINT,
    DEFAULT_FACE_API_ACCOUNT_KEY,
    DEFAULT_FACE_API_ENDPOINT,
)
from shared.helpers import beautify_json, get_logger


class DetectLiveness:
    def __init__(self):
        load_dotenv(find_dotenv())
        self.endpoint = os.getenv(CONFIGURATION_NAME_FACE_API_ENDPOINT, DEFAULT_FACE_API_ENDPOINT)
        self.key = os.getenv(CONFIGURATION_NAME_FACE_API_ACCOUNT_KEY, DEFAULT_FACE_API_ACCOUNT_KEY)
        self.logger = get_logger("sample_face_liveness_detection_async")

    async def wait_for_liveness_check_request(self):
        # The logic to wait for a liveness check request from the mobile application.
        pass
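        # A minimal sketch of what a real app server might do here (hypothetical design, not part
        # of the Face SDK): await an incoming "start liveness check" request from the mobile
        # client, e.g. an HTTP handler being invoked or an item arriving on an asyncio.Queue:
        #
        #     request = await self.incoming_requests.get()  # hypothetical queue of client requests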

    async def send_auth_token_to_client(self, token):
        # The logic to provide the session-authorization-token to the mobile application.
        pass
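        # A minimal sketch (hypothetical design): in a real deployment the app server would return
        # the short-lived session authorization token to the mobile client, for example in the
        # HTTP response to the request received in wait_for_liveness_check_request:
        #
        #     await self.respond_to_client({"authToken": token})  # hypothetical helper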

    async def wait_for_liveness_session_complete(self):
        # The logic to wait for the completion notification from the mobile application.
        self.logger.info(
            "Please refer to https://learn.microsoft.com/azure/ai-services/computer-vision/tutorials/liveness"
            " and use the mobile client SDK to perform liveness detection on your mobile application."
        )
        input("Press Enter to continue once you have completed these steps, so the sample can get the session results ...")
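        # In a real app server the blocking input() above would be replaced by an asynchronous
        # signal, e.g. awaiting an asyncio.Event set by a completion callback from the client
        # (hypothetical design; the console prompt is only for this interactive sample).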

    async def livenessSession(self):
        """This example demonstrates liveness detection from the app server-side perspective.

        For the full picture of all the steps involved, see
        https://learn.microsoft.com/azure/ai-services/computer-vision/tutorials/liveness#orchestrate-the-liveness-solution.  # noqa: E501
        """
        from azure.core.credentials import AzureKeyCredential
        from azure.ai.vision.face.aio import FaceSessionClient
        from azure.ai.vision.face.models import (
            CreateLivenessSessionContent,
            LivenessOperationMode,
        )

        async with FaceSessionClient(
            endpoint=self.endpoint, credential=AzureKeyCredential(self.key)
        ) as face_session_client:
            # 1. Wait for liveness check request.
            await self.wait_for_liveness_check_request()

            # 2. Create a session.
            self.logger.info("Create a new liveness session.")
            created_session = await face_session_client.create_liveness_session(
                CreateLivenessSessionContent(
                    liveness_operation_mode=LivenessOperationMode.PASSIVE,
                    device_correlation_id=str(uuid.uuid4()),
                    send_results_to_client=False,
                    auth_token_time_to_live_in_seconds=60,
                )
            )
            self.logger.info(f"Result: {beautify_json(created_session.as_dict())}")
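            # The create-session response carries the two values used below: `session_id`, which
            # the app server keeps in order to query the outcome later, and `auth_token`, the
            # short-lived session authorization token handed to the mobile client (its lifetime is
            # set by `auth_token_time_to_live_in_seconds` above).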

            # 3. Provide the session authorization token to the mobile application.
            token = created_session.auth_token
            await self.send_auth_token_to_client(token)

            # 4 ~ 6. The mobile application uses the session-authorization-token to perform the liveness detection.
            # To learn how to integrate the UI and the code into your native mobile application, see
            # https://learn.microsoft.com/azure/ai-services/computer-vision/tutorials/liveness#integrate-liveness-into-mobile-application  # noqa: E501

            # 7. Wait for the session completion notification from the client.
            await self.wait_for_liveness_session_complete()
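            # Because the session was created with `send_results_to_client=False`, the mobile
            # client is not expected to receive the detection outcome itself; the app server
            # queries it in the next step instead.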

            # 8. Query for the liveness detection result once the session is completed.
            self.logger.info("Get the liveness detection result.")
            liveness_result = await face_session_client.get_liveness_session_result(created_session.session_id)
            self.logger.info(f"Result: {beautify_json(liveness_result.as_dict())}")
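            # The exact payload shape is described in the tutorial linked above; it typically
            # includes the session status and the real/spoof classification for the attempt.
            # This sample just prints the raw result.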

            # Furthermore, you can query all requests and responses for this session, and list all the
            # sessions you have, by calling `get_liveness_session_audit_entries` and `get_liveness_sessions`.
            self.logger.info("Get the audit entries of this session.")
            audit_entries = await face_session_client.get_liveness_session_audit_entries(created_session.session_id)
            for idx, entry in enumerate(audit_entries):
                self.logger.info(f"----- Audit entry #{idx + 1} -----")
                self.logger.info(f"Entry: {beautify_json(entry.as_dict())}")

            self.logger.info("List all liveness sessions.")
            sessions = await face_session_client.get_liveness_sessions()
            for idx, session in enumerate(sessions):
                self.logger.info(f"----- Session #{idx + 1} -----")
                self.logger.info(f"Session: {beautify_json(session.as_dict())}")

            # Clean up: delete the session.
            self.logger.info("Delete the session.")
            await face_session_client.delete_liveness_session(created_session.session_id)


async def main():
    sample = DetectLiveness()
    await sample.livenessSession()


if __name__ == "__main__":
    asyncio.run(main())