# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) Python Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from copy import deepcopy
from typing import Any, TYPE_CHECKING, Union
from typing_extensions import Self
from azure.core import PipelineClient
from azure.core.credentials import AzureKeyCredential
from azure.core.pipeline import policies
from azure.core.rest import HttpRequest, HttpResponse
from ._configuration import ConversationAnalysisClientConfiguration
from ._operations import _ConversationAnalysisClientOperationsMixin
from ._utils.serialization import Deserializer, Serializer

if TYPE_CHECKING:
    from azure.core.credentials import TokenCredential


class ConversationAnalysisClient(_ConversationAnalysisClientOperationsMixin):
    """The language service conversations API is a suite of natural language processing (NLP)
    skills that can be used to analyze structured conversations (textual or spoken). The
    synchronous API in this suite accepts a request and mediates among multiple language projects,
    such as LUIS Generally Available, Question Answering, and Conversational Language
    Understanding, and then calls the best candidate service to handle the request. Finally, it
    returns a response with the candidate service's response as a payload.

    In some cases, this API needs to forward requests and responses between the caller and an
    upstream service. The asynchronous APIs in this suite enable tasks like Conversation
    Summarization and Conversational PII detection.

    :param endpoint: Supported Cognitive Services endpoint (e.g.,
     https://<resource-name>.api.cognitiveservices.azure.com). Required.
    :type endpoint: str
    :param credential: Credential used to authenticate requests to the service. Is either a key
     credential type or a token credential type. Required.
    :type credential: ~azure.core.credentials.AzureKeyCredential or
     ~azure.core.credentials.TokenCredential
    :keyword api_version: The API version to use for this operation. Default value is
     "2025-05-15-preview". Note that overriding this default value may result in unsupported
     behavior.
    :paramtype api_version: str
    :keyword int polling_interval: Default waiting time between two polls for LRO operations if no
     Retry-After header is present.
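
    A minimal construction sketch (illustrative only; the endpoint and key shown are placeholders,
    and the top-level import assumes the ``azure-ai-language-conversations`` package layout):

    >>> from azure.core.credentials import AzureKeyCredential
    >>> from azure.ai.language.conversations import ConversationAnalysisClient
    >>> client = ConversationAnalysisClient(
    ...     "https://<resource-name>.api.cognitiveservices.azure.com",
    ...     AzureKeyCredential("<api-key>"),
    ... )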
"""

    def __init__(self, endpoint: str, credential: Union[AzureKeyCredential, "TokenCredential"], **kwargs: Any) -> None:
        # The {Endpoint} placeholder is filled with the configured endpoint when each request URL
        # is formatted (see send_request below).
        _endpoint = "{Endpoint}/language"
        self._config = ConversationAnalysisClientConfiguration(endpoint=endpoint, credential=credential, **kwargs)

        # Callers may supply a fully custom policy list via the "policies" keyword; otherwise the
        # default pipeline below is built from the client configuration.
        _policies = kwargs.pop("policies", None)
        if _policies is None:
            _policies = [
                policies.RequestIdPolicy(**kwargs),
                self._config.headers_policy,
                self._config.user_agent_policy,
                self._config.proxy_policy,
                policies.ContentDecodePolicy(**kwargs),
                self._config.redirect_policy,
                self._config.retry_policy,
                self._config.authentication_policy,
                self._config.custom_hook_policy,
                self._config.logging_policy,
                policies.DistributedTracingPolicy(**kwargs),
                policies.SensitiveHeaderCleanupPolicy(**kwargs) if self._config.redirect_policy else None,
                self._config.http_logging_policy,
            ]
        self._client: PipelineClient = PipelineClient(base_url=_endpoint, policies=_policies, **kwargs)

        self._serialize = Serializer()
        self._deserialize = Deserializer()
        self._serialize.client_side_validation = False

    def send_request(self, request: HttpRequest, *, stream: bool = False, **kwargs: Any) -> HttpResponse:
        """Runs the network request through the client's chained policies.

        >>> from azure.core.rest import HttpRequest
        >>> request = HttpRequest("GET", "https://www.example.org/")
        <HttpRequest [GET], url: 'https://www.example.org/'>
        >>> response = client.send_request(request)
        <HttpResponse: 200 OK>

        For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request

        :param request: The network request you want to make. Required.
        :type request: ~azure.core.rest.HttpRequest
        :keyword bool stream: Whether the response payload will be streamed. Defaults to False.
        :return: The response of your network call. Does not do error handling on your response.
        :rtype: ~azure.core.rest.HttpResponse
        """
        request_copy = deepcopy(request)
        # Substitute the configured endpoint into the {Endpoint} placeholder of the request URL.
        path_format_arguments = {
            "Endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
        }

        request_copy.url = self._client.format_url(request_copy.url, **path_format_arguments)
        return self._client.send_request(request_copy, stream=stream, **kwargs)  # type: ignore

    def close(self) -> None:
        self._client.close()

    def __enter__(self) -> Self:
        self._client.__enter__()
        return self

    def __exit__(self, *exc_details: Any) -> None:
        self._client.__exit__(*exc_details)
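

# Illustrative usage sketch (not part of the generated client): because the client implements the
# context-manager protocol above, it can be used in a ``with`` block so the underlying pipeline is
# closed deterministically. The endpoint and key below are placeholders.
#
#     from azure.core.credentials import AzureKeyCredential
#
#     with ConversationAnalysisClient(
#         "https://<resource-name>.api.cognitiveservices.azure.com",
#         AzureKeyCredential("<api-key>"),
#     ) as client:
#         ...  # call operations from the mixin; the pipeline is closed on exit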