diff --git a/sdk/ai/azure-ai-projects/.github/skills/SKILLS_README.md b/sdk/ai/azure-ai-projects/.github/skills/SKILLS_README.md
new file mode 100644
index 000000000000..7ea120484d10
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/.github/skills/SKILLS_README.md
@@ -0,0 +1,33 @@
+# Copilot skills for azure-ai-projects development
+
+## Prerequisites
+
+* Clone the `azure-sdk-for-python` repo to your local machine, if you don't already have it:
+  ```
+  git clone https://github.com/Azure/azure-sdk-for-python.git
+  ```
+* Change to the directory `sdk\ai\azure-ai-projects`.
+* Switch to the current feature branch: `git switch feature/azure-ai-projects/2.2.0`.
+* Make sure you don't have any files edited or added in this branch (a clean `git status` state).
+
+## Emit from TypeSpec and create a PR
+
+### Using GitHub Copilot in VS Code
+
+* Open VS Code in the current folder.
+* Open the Copilot chat window ("Toggle Chat").
+* Make sure you are in "Agent" mode.
+* Start typing `/azure-ai-projects` and press Tab to auto-complete it to `/azure-ai-projects-emit-from-typespec`, then press Enter.
+* Answer the questions and approve execution to complete the workflow.
+
+### Using Copilot CLI or Agency Copilot CLI
+
+* Install [GitHub Copilot CLI](https://docs.github.com/copilot/how-tos/copilot-cli/set-up-copilot-cli/install-copilot-cli) or [Agency Copilot CLI](https://aka.ms/agency) (VPN required) if you don't already have it.
+* Run Copilot CLI by typing `copilot`.
+* Start typing `/azure-ai-projects` and press Tab to auto-complete it to `/azure-ai-projects-emit-from-typespec`, then press Enter.
+* Answer the questions and approve execution to complete the workflow.
+
+
+
+
diff --git a/sdk/ai/azure-ai-projects/.github/skills/azure-ai-projects-emit-from-typespec/SKILL.md b/sdk/ai/azure-ai-projects/.github/skills/azure-ai-projects-emit-from-typespec/SKILL.md
new file mode 100644
index 000000000000..4f95c93e92bd
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/.github/skills/azure-ai-projects-emit-from-typespec/SKILL.md
@@ -0,0 +1,189 @@
+---
+name: azure-ai-projects-emit-from-typespec
+license: MIT
+metadata:
+  version: "1.0.0"
+  distribution: local
+description: "Emit the azure-ai-projects Python SDK from TypeSpec, apply post-emitter fixes, update changelog, and create a Pull Request. WHEN: \"emit SDK from TypeSpec\", \"generate azure-ai-projects SDK\", \"update azure-ai-projects from TypeSpec\", \"emit from TypeSpec\", \"regenerate azure-ai-projects\". DO NOT USE FOR: other Azure SDK packages, manual code edits without TypeSpec. INVOKES: azsdk-common-generate-sdk-locally skill, post-emitter-fixes.cmd script, git commands, gh CLI for PR creation."
+compatibility:
+  requires: "azure-sdk-mcp server, local azure-sdk-for-python clone, git, gh CLI"
+---
+
+# Emit azure-ai-projects Python SDK from TypeSpec
+
+This skill guides Copilot through emitting the azure-ai-projects Python SDK from TypeSpec,
+applying post-emitter fixes, updating the changelog, installing the package from sources, and creating a Pull Request.
+
+**Working directory:** `sdk/ai/azure-ai-projects`
+
+**Skills:** This workflow relies on skills defined under `.github/skills/` at the root of the repository. Use those skills for SDK generation, building, changelog updates, and other SDK lifecycle operations instead of running commands directly. In particular:
+
+- **`azsdk-common-generate-sdk-locally`** – For generating the SDK from TypeSpec, building, running checks/tests, and updating the changelog, metadata, and version.
+
+---
+
+## Step 1: Gather information from the user
+
+Ask the user the following questions **one at a time**, waiting for each answer before proceeding.
+
+### 1a. Topic branch name
+
+Ask the user to choose **one** of the following three options for the target topic branch:
+
+1. **Create a new topic branch (with default branch name)** – Create a new topic branch for the emitted changes. If selected, the default branch name `github-userid/emit-from-typespec-DD-MM-HHMM` will be used, where `github-userid` is the user's GitHub ID and `DD-MM-HHMM` is the current day, month, hour, and minute. For example, if the GitHub ID is "dargilco" and the current date and time is May 1st, 2026 at 8:13am, the default branch name would be `dargilco/emit-from-typespec-01-05-0813`. This should be the default option, and the default branch name should be displayed. If the user presses Enter without typing anything, this option will be selected.
+
+2. **Create a new topic branch (branch name given by user)** – Ask the user for the branch name. Mention that a common format is `github-userid/branch-name`. If the user enters the branch name `feature/azure-ai-projects/2.2.0`, stop and report that they cannot emit directly to the current feature branch.
+
+3. **Emit to current branch** – Emit directly to the current branch without creating a new topic branch. This is not common, but may be necessary if the user is re-running this workflow after a previous failure, where the topic branch was already created. If the current branch is named `feature/azure-ai-projects/2.2.0`, stop and report that they cannot emit directly to the current feature branch.
+
+### 1b. TypeSpec source
+
+Ask the user to choose **one** of the following three options for the TypeSpec source:
+
+1. **Latest commit on `feature/foundry-release`** – Automatically find the latest commit to the `feature/foundry-release` branch in [Azure/azure-rest-api-specs](https://github.com/Azure/azure-rest-api-specs) that touched files under `specification/ai-foundry/data-plane/Foundry`, and use that commit hash. This should be the default option. If the user presses Enter without typing anything, this option will be selected.
+
+2. **Local TypeSpec folder** – Emit from a local clone of the [azure-rest-api-specs](https://github.com/Azure/azure-rest-api-specs) repository. If selected, ask for the **full folder path** to the TypeSpec project. This is the folder ending with `\specification\ai-foundry\data-plane\Foundry`. If it does not end with that string, stop and report the error to the user. Do not continue.
+
+3. **TypeSpec commit hash** – Emit from a specific commit in the [azure-rest-api-specs](https://github.com/Azure/azure-rest-api-specs) repository. If selected, ask for the **full commit SHA** (40 characters).
+
+
+---
+
+## Step 2: Record the current branch
+
+Before creating the topic branch, record the name of the **current Git branch**. This is the branch that the topic branch will be created from, and the branch the PR will target.
+
+```
+git branch --show-current
+```
+
+Save this as `BASE_BRANCH`.
+
+---
+
+## Step 3: Create the topic branch
+
+Create the topic branch off the current branch and switch to it:
+
+```
+git fetch
+git switch -c <topic-branch-name>
+```
+
+Replace `<topic-branch-name>` with the name provided by the user in Step 1a.
+
+---
+
+## Step 4: Emit SDK from TypeSpec
+
+Use the **`azsdk-common-generate-sdk-locally`** skill to generate the SDK code. The skill knows how to invoke `azsdk_package_generate_code` and related MCP tools.
+
+Provide the skill with the TypeSpec source selected by the user. This is either:
+
+- **Local folder:** Pass the local spec repo path for local generation.
+- **Commit hash:** Update `commit:` in `tsp-location.yaml` to the full SHA first, then invoke the skill for generation.
+
+Note:
+- You are only allowed to use the `tsp-client update` command. Do not use any of the other `tsp-client` commands.
+- If you are generating from a local TypeSpec folder, do not edit the file `tsp-location.yaml`. Leave it as is. It should not be used by the emitter.
+- If you are generating from a local TypeSpec folder, make sure that the local folder path you pass to `tsp-client update --local-spec-repo` ends with `specification\ai-foundry\data-plane\Foundry`.
+- **If the generation fails**, stop and report the error to the user. Do not continue.
+
+---
+
+## Step 5: Run post-emitter fixes
+
+After a successful emit, run the post-emitter fix script located in the `sdk/ai/azure-ai-projects` folder:
+
+```
+post-emitter-fixes.cmd
+```
+
+This script applies azure-ai-projects-specific corrections to the emitted code (restores `pyproject.toml`, fixes enum names, patches Sphinx doc-string issues, and runs `black` formatting).
+
+**If the script fails**, stop and report the error to the user. Do not continue. Do not attempt to analyze the script failures and fix them with Copilot. The script should be fixed by the engineering team if it is not working.
+
+---
+
+## Step 6: Fix patched code
+
+The emitted code may have introduced another beta sub-client (a new property on class `BetaOperations`). It may have also added another enum value to the existing internal class `_FoundryFeaturesOptInKeys`. This means that the client library needs to set a new HTTP request header when making REST API calls to the service, to opt in to the new service features which are still in preview. If that's the case, do the following:
+
+* Update the dictionary `_BETA_OPERATION_FEATURE_HEADERS` defined in `azure\ai\projects\models\_patch.py` to include a new key-value pair that maps the new beta sub-client name to the proper value from `_FoundryFeaturesOptInKeys`.
If no new beta sub-client was introduced, but a new enum value was added to `_FoundryFeaturesOptInKeys`, you will need to update one of the existing key-value pairs in `_BETA_OPERATION_FEATURE_HEADERS` to a comma-separated join of multiple values from `_FoundryFeaturesOptInKeys`.
+
+* Make a similar change to the dictionary `EXPECTED_FOUNDRY_FEATURES` defined in the test file `tests\foundry_features_header\foundry_features_header_test_base.py`: add a new key-value pair if a new beta sub-client was introduced, or update an existing key-value pair to include the new enum value if no new beta sub-client was introduced.
+
+* Finally, look at the two files `azure\ai\projects\operations\_patch.py` and `azure\ai\projects\aio\operations\_patch.py`. They define the public `BetaOperations` classes for the sync and async clients. To support a new sub-client, you will need to add a new property to this class with a proper docstring. You will need to update the import statement at the top of the file to import the new sub-client class, and you will need to update the `__all__` statement at the bottom of the file to include the new sub-client class name. Follow the examples you see there for `BetaDatasetsOperations` or `BetaSkillsOperations`.
+
+If a new enum value was added to `_AgentDefinitionOptInKeys`, print a note on screen that mentions which value was added, and tell the user that a review is needed to make sure this new value is properly used. Otherwise, continue on.
+
+---
+
+## Step 7: Install package from sources
+
+In the folder `sdk\ai\azure-ai-projects`, run `pip install -e .` to install the package from sources. If there are any errors, stop and report the error to the user. Do not continue.
+
+---
+
+## Step 8: Update CHANGELOG.md
+
+Use the **`azsdk-common-generate-sdk-locally`** skill's changelog capability (`azsdk_package_update_changelog_content`) to update `CHANGELOG.md` in the `sdk/ai/azure-ai-projects` folder with a summary of changes from the TypeSpec emit. Some guidelines to follow:
+* Start by examining the public SDK API surface of the latest released version of the azure-ai-projects package. The source code for this version can be found in the `main` branch of the `azure-sdk-for-python` repository, in the folder `sdk\ai\azure-ai-projects`.
+* Then compare it to the public SDK API surface of the current version in this topic branch.
+* Look at the existing changelog entry for the latest version (if one exists) and edit or add to it to capture all the changes you see. If a changelog entry does not exist for the current version at the top of `CHANGELOG.md`, create a new one.
+* If a new method was added, there is no need to list all the new classes that define the inputs and outputs of the method. It's enough to mention that the new method was added.
+* Show the user the proposed changelog entry and ask for confirmation or edits before saving.
+
+---
+
+## Step 9: Update samples and tests
+
+If there were any breaking changes in existing APIs, like class or method renames, update the samples and tests accordingly to reflect those changes. Changes should be made in the "samples" and "tests" folders under `sdk/ai/azure-ai-projects`.
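The API-surface comparison described in Step 8 can be approximated with a plain list diff. The listings below are illustrative stand-ins, not the package's real surface; only the `AgentEndpoint` to `AgentEndpointConfig` rename is taken from this release's changelog.

```python
import difflib

# Hypothetical public-surface listings: latest released version vs. the
# current topic branch. Real listings would come from an API-view export.
released_surface = [
    "AIProjectClient",
    "AgentEndpoint",
    "BetaDatasetsOperations",
]
current_surface = [
    "AIProjectClient",
    "AgentEndpointConfig",
    "BetaDatasetsOperations",
]

diff = difflib.unified_diff(released_surface, current_surface, lineterm="")
# Keep only real additions/removals, skipping the '---'/'+++' file headers.
changes = [
    line
    for line in diff
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
]
print(changes)  # ['-AgentEndpoint', '+AgentEndpointConfig']
```

A removed name paired with an added one, as here, is the signature of a rename and belongs under "Breaking Changes".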
+
+---
+
+## Step 10: Commit and push
+
+Stage all changes (excluding file names that start with `.env`), commit, and push the topic branch:
+
+```
+git add -A -- ':!.env*'
+git commit -m "Emit SDK from TypeSpec and apply post-emitter fixes
+
+Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>"
+git push -u origin <topic-branch-name>
+```
+
+---
+
+## Step 11: Create a Pull Request
+
+Create a draft PR from the **topic branch** to the **base branch** (recorded in Step 2):
+
+```
+gh pr create --draft --base <base-branch> --head <topic-branch-name> --assignee @me --title "<title>" --body "<body>"
+```
+
+- **Title:** Use a descriptive title such as `[azure-ai-projects] Emit SDK from TypeSpec`.
+- **Body:** Include which TypeSpec source was used and a summary of the changelog entry.
+
+Show the user the PR URL when done.
+
+---
+
+## Step 12: Optionally run tests locally
+
+Prompt the user with this message: "Tests will run as part of the Pull Request. However, you can optionally run tests locally in a Python virtual environment, right now. It will take a few minutes. Do you want to run tests locally? (yes/no)"
+
+If the user answers "yes", run all tests from recordings. Follow these guidelines:
+* Run tests in a local Python virtual environment. Create this virtual environment if it does not already exist:
+  ```
+  python -m venv .venv
+  ```
+  and activate it:
+  ```
+  .venv\Scripts\activate
+  ```
+* Show test progress on screen as tests are run.
+
+
diff --git a/sdk/ai/azure-ai-projects/CHANGELOG.md b/sdk/ai/azure-ai-projects/CHANGELOG.md
index d6a0253d9f0a..32fca73275fb 100644
--- a/sdk/ai/azure-ai-projects/CHANGELOG.md
+++ b/sdk/ai/azure-ai-projects/CHANGELOG.md
@@ -1,9 +1,40 @@
 # Release History
 
+## 2.2.0 (Unreleased)
+
+### Features Added
+
+* New Agent tool `FabricIQPreviewTool`.
+* New Agent tool `ToolboxSearchPreviewTool`.
+* New string properties `description` and `name` added to all Agent tools.
+* New `.beta.datasets` sub-client with data generation job operations: `create_generation_job`, `get_generation_job`, `list_generation_jobs`, `cancel_generation_job`, `delete_generation_job`.
+* New read-only property `content_hash` on `CodeConfiguration`, returning the SHA-256 hex digest of the uploaded code zip.
+* New evaluator generation job operations on `.beta.evaluators`: `create_generation_job`, `get_generation_job`, `list_generation_jobs`, `cancel_generation_job`, `delete_generation_job`.
+* New methods on `.beta.agents` sub-client for code-based hosted agents: `update_agent_from_code()`, `create_agent_version_from_code()`, `download_agent_version_code()`, `download_agent_code()`.
+
+### Breaking Changes
+
+Breaking changes in beta operations:
+* Renamed model `AgentEndpoint` to `AgentEndpointConfig`.
+* Required property `isolation_key_source` removed from class `EntraAuthorizationScheme`.
+* Required keyword argument `isolation_key` removed from the `.beta.agents.create_session()` and `.beta.agents.delete_session()` methods.
+* Argument `body` in methods `.beta.evaluation_taxonomies.create()` and `.beta.evaluation_taxonomies.update()` renamed to `taxonomy`.
+
+### Bugs Fixed
+
+* Fixed the telemetry instrumentor to correctly call `is_recording()` as a method on spans, ensuring non-recording spans are properly skipped (e.g., when sampling is configured) ([GitHub issue 46544](https://github.com/Azure/azure-sdk-for-python/issues/46544)).
+
+### Sample updates
+
+* Added Hosted Agent creation sample `sample_hosted_agent_create.py`, demonstrating hosted agent version creation and retrieval with `AIProjectClient`.
+* The Hosted Agent creation sample also demonstrates assigning the hosted agent's managed identity the Azure AI User RBAC role on the backing Azure AI account.
+* Updated the other Hosted Agent samples to reuse an existing Hosted Agent as a prerequisite, instead of creating a new hosted agent version in each sample.
+ ## 2.1.0 (2026-04-20) ### Features Added +* New `WorkIQPreviewTool`. * `get_openai_client()` on `AIProjectClient` now takes an optional input argument `agent_name`. If provided, the returned OpenAI client will use a base URL of Agent endpoint instead of Foundry Project endpoint. As Agent endpoints are a preview feature, you need to set `allow_preview=True` on the `AIProjectClient` constructor. diff --git a/sdk/ai/azure-ai-projects/README.md b/sdk/ai/azure-ai-projects/README.md index b2f754b91fd2..a442f76d91fc 100644 --- a/sdk/ai/azure-ai-projects/README.md +++ b/sdk/ai/azure-ai-projects/README.md @@ -1,6 +1,6 @@ # Azure AI Projects client library for Python -The AI Projects client library (in preview) is part of the Microsoft Foundry SDK, and provides easy access to +The AI Projects client library is part of the Microsoft Foundry SDK, and provides easy access to resources in your Microsoft Foundry Project. Use it to: * **Create and run Agents** using methods on the `.agents` client property. @@ -13,6 +13,7 @@ resources in your Microsoft Foundry Project. Use it to: * Browser Automation (Preview) * Code Interpreter * Computer Use (Preview) + * Fabric IQ (Preview) * File Search * Function Tool * Image Generation @@ -23,6 +24,7 @@ resources in your Microsoft Foundry Project. Use it to: * OpenAPI * Web Search * Web Search (Preview) + * Work IQ (Preview) * **Get an OpenAI client** using `.get_openai_client()` method to run Responses, Conversations, Evaluations and Fine-Tuning operations with your Agent. * **Manage memory stores (preview)** for Agent conversations, using `.beta.memory_stores` operations. * **Explore additional evaluation tools (some in preview)** to assess the performance of your generative AI application, using `.evaluation_rules`, @@ -55,8 +57,7 @@ To report an issue with the client library, or request additional features, plea * An [Azure subscription][azure_sub]. 
* A [project in Microsoft Foundry](https://learn.microsoft.com/azure/foundry/how-to/create-projects). * A Foundry project endpoint URL of the form `https://your-ai-services-account-name.services.ai.azure.com/api/projects/your-project-name`. It can be found in your Microsoft Foundry Project home page. Below we will assume the environment variable `FOUNDRY_PROJECT_ENDPOINT` was defined to hold this value. -* To authenticate using API key, you will need the "Project API key" as shown in your Microsoft Foundry Project home page. -* To authenticate using Entra ID, your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/python/api/azure-core/azure.core.credentials.tokencredential) interface. Code samples here use [DefaultAzureCredential](https://learn.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential). To get that working, you will need: +* Client authentication is done using Entra ID. To authenticate, your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/python/api/azure-core/azure.core.credentials.tokencredential) interface. Code samples here use [DefaultAzureCredential](https://learn.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential). To get that working, you will need: * An appropriate role assignment. See [Role-based access control in Microsoft Foundry portal](https://learn.microsoft.com/azure/foundry/concepts/rbac-foundry). Role assignment can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal. * [Azure CLI](https://learn.microsoft.com/cli/azure/install-azure-cli) installed. * You are logged into your Azure account by running `az login`. 
@@ -116,12 +117,12 @@ async with ( For comprehensive examples covering Agents, tool usage, evaluation, fine-tuning, datasets, indexes, and more, see: -* **[Microsoft Foundry Agents overview](https://learn.microsoft.com/azure/foundry/agents/overview)** — concepts, setup, and quickstarts. +* **[Microsoft Foundry Agents overview](https://learn.microsoft.com/azure/foundry/agents/overview)** — concepts, setup, and quick-starts. * **[Runtime components](https://learn.microsoft.com/azure/foundry/agents/concepts/runtime-components?tabs=python)** — deep-dive into agent architecture. * **[Tool catalog](https://learn.microsoft.com/azure/foundry/agents/concepts/tool-catalog)** — all available tools and agent capabilities. * **[SDK samples folder][samples]** — fully runnable Python code for synchronous and asynchronous clients covering all operations below. -The sections below cover SDK-specific behaviours (authentication variants, exception handling, logging, tracing) that are not documented in the above Learn pages. +The sections below cover SDK-specific behaviors (authentication variants, exception handling, logging, tracing) that are not documented in the above Learn pages. ### Performing Responses operations using OpenAI client @@ -179,219 +180,13 @@ For product guidance, see: For SDK usage examples in this package, see `samples/hosted_agents/`, including CRUD, file upload/download, and skills scenarios. -## Tracing +## Client-side tracing -### Experimental Feature Gate +See [Add client-side tracing to Foundry agents (preview)](https://learn.microsoft.com/azure/foundry/observability/how-to/trace-agent-client-side?tabs=python). -**Important:** GenAI tracing instrumentation is an experimental preview feature. Spans, attributes, and events may be modified in future versions. To use it, you must explicitly opt in by setting the environment variable: +**Important:** GenAI tracing instrumentation is an experimental preview feature. 
Spans, attributes, and events may be modified in future versions. -```bash -AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING=true -``` - -This environment variable must be set before calling `AIProjectInstrumentor().instrument()`. If the environment variable is not set or is set to any value other than `true` (case-insensitive), tracing instrumentation will not be enabled and a warning will be logged. - -Only enable this feature after reviewing your requirements and understanding that the tracing behavior may change in future versions. - -### Getting Started with Tracing - -You can add an Application Insights Azure resource to your Microsoft Foundry project. See the Tracing tab in your Microsoft Foundry project. If one was enabled, you can get the Application Insights connection string, configure your AI Projects client, and observe traces in Azure Monitor. Typically, you might want to start tracing before you create a client or Agent. - -For tracing concepts in Microsoft Foundry, see [Trace an agent](https://learn.microsoft.com/azure/foundry/observability/concepts/trace-agent-concept). - -### Installation - -Make sure to install OpenTelemetry and the Azure SDK tracing plugin via - -```bash -pip install "azure-ai-projects>=2.0.0b4" opentelemetry-sdk azure-core-tracing-opentelemetry azure-monitor-opentelemetry -``` - -You will also need an exporter to send telemetry to your observability backend. You can print traces to the console or use a local viewer such as [Aspire Dashboard](https://learn.microsoft.com/dotnet/aspire/fundamentals/dashboard/standalone?tabs=bash). - -To connect to Aspire Dashboard or another OpenTelemetry compatible backend, install OTLP exporter: - -```bash -pip install opentelemetry-exporter-otlp -``` - -### How to enable tracing - -**Remember:** Before enabling tracing, ensure you have set the `AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING=true` environment variable as described in the [Experimental Feature Gate](#experimental-feature-gate) section. 
- -Here is a code sample that shows how to enable Azure Monitor tracing: - - - -```python -# Enable Azure Monitor tracing -application_insights_connection_string = project_client.telemetry.get_application_insights_connection_string() -configure_azure_monitor(connection_string=application_insights_connection_string) -``` - - - -You may also want to create a span for your scenario: - - - -```python -tracer = trace.get_tracer(__name__) -scenario = os.path.basename(__file__) - -with tracer.start_as_current_span(scenario): -``` - - - -See the full sample in file `\agents\telemetry\sample_agent_basic_with_azure_monitor_tracing.py` in the [Samples][samples] folder. - -**Note:** In order to view the traces in the Microsoft Foundry portal, the agent ID should be passed in as part of the response generation request. - -In addition, you might find it helpful to see the tracing logs in the console. Remember to set `AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING=true` before running the following code: - - - -```python -# Setup tracing to console -# Requires opentelemetry-sdk -span_exporter = ConsoleSpanExporter() -tracer_provider = TracerProvider() -tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter)) -trace.set_tracer_provider(tracer_provider) -tracer = trace.get_tracer(__name__) - -# Enable instrumentation with content tracing -AIProjectInstrumentor().instrument() -``` - - - -See the full sample in file `\agents\telemetry\sample_agent_basic_with_console_tracing.py` in the [Samples][samples] folder. - -### Enabling trace context propagation - -Trace context propagation allows client-side spans generated by the Projects SDK to be correlated with server-side spans from Azure OpenAI and other Azure services. When enabled, the SDK automatically injects W3C Trace Context headers (`traceparent` and `tracestate`) into HTTP requests made by OpenAI clients obtained via `get_openai_client()`. 
- -This feature ensures that all operations within a distributed trace share the same trace ID, providing end-to-end visibility across your application and Azure services in your observability backend (such as Azure Monitor). - -Trace context propagation is **enabled by default** when tracing is enabled (for example through `configure_azure_monitor` or the `AIProjectInstrumentor().instrument()` call). To disable it, set the `AZURE_TRACING_GEN_AI_ENABLE_TRACE_CONTEXT_PROPAGATION` environment variable to `false`, or pass `enable_trace_context_propagation=False` to the `AIProjectInstrumentor().instrument()` call. - -**When does the change take effect?** -- Changes to `enable_trace_context_propagation` (whether via `instrument()` or the environment variable) only affect OpenAI clients obtained via `get_openai_client()` **after** the change is applied. Previously acquired clients are unaffected. -- To apply the new setting to all clients, call `AIProjectInstrumentor().instrument(enable_trace_context_propagation=)` before acquiring your OpenAI clients, or re-acquire the clients after making the change. - -**Security and Privacy Considerations:** -- **Trace IDs are sent to external services**: The `traceparent` and `tracestate` headers from your client-side originating spans are injected into requests sent to service. This enables end-to-end distributed tracing, but note that the trace identifier may be shared beyond the initial API call. -- **Enabled by Default**: If you have privacy or compliance requirements that prohibit sharing trace identifiers with services, disable trace context propagation by setting `enable_trace_context_propagation=False` or the environment variable to `false`. - -#### Controlling baggage propagation - -When trace context propagation is enabled, you can separately control whether the baggage header is included. By default, only `traceparent` and `tracestate` headers are propagated. 
To also include the `baggage` header, set the `AZURE_TRACING_GEN_AI_TRACE_CONTEXT_PROPAGATION_INCLUDE_BAGGAGE` environment variable to `true`: - -If no value is provided for the `enable_baggage_propagation` parameter with the `AIProjectInstrumentor.instrument()` call and the environment variable is not set, the value defaults to `false` and baggage is not included. - -**Note:** The `enable_baggage_propagation` flag is evaluated dynamically on each request, so changes take effect **immediately** for all clients that have the trace context propagation hook registered. However, the hook is only registered on clients acquired via `get_openai_client()` **while trace context propagation was enabled**. Clients acquired when trace context propagation was disabled will never propagate baggage, regardless of the `enable_baggage_propagation` value. - -**Why is baggage propagation separate?** - -The baggage header can contain arbitrary key-value pairs added anywhere in your application's trace context. Unlike trace IDs (which are randomly generated identifiers), baggage may contain: - -- User identifiers or session information -- Authentication tokens or credentials -- Business-specific data or metadata -- Personally identifiable information (PII) - -Baggage is automatically propagated through your entire application's call chain, meaning data added in one part of your application will be included in requests to Azure OpenAI unless explicitly controlled. - -**Important Security Considerations:** - -- **Review Baggage Contents**: Before enabling baggage propagation, audit what data your application (and any third-party libraries) adds to OpenTelemetry baggage. -- **Sensitive Data Risk**: Baggage is sent to Azure OpenAI and may be logged or processed by Microsoft services. Never add sensitive information to baggage when baggage propagation is enabled. 
-- **Opt-in by Design**: Baggage propagation is disabled by default (even when trace context propagation is enabled) to prevent accidental exposure of sensitive data. -- **Minimal Propagation**: `traceparent` and `tracestate` headers are generally sufficient for distributed tracing. Only enable baggage propagation if your specific observability requirements demand it. - -### Enabling content recording - -Content recording controls whether message contents and tool call related details, such as parameters and return values, are captured with the traces. This data may include sensitive user information. - -To enable content recording, set the `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` environment variable to `true`. If the environment variable is not set and no value is provided with the `AIProjectInstrumentor().instrument()` call for the content recording parameter, content recording defaults to `false`. - -**Important:** The environment variable only controls content recording for built-in traces. When you use custom tracing decorators on your own functions, all parameters and return values are always traced. - -### Disabling automatic instrumentation - -The AI Projects client library automatically instruments OpenAI responses and conversations operations through `AiProjectInstrumentation`. You can disable this instrumentation by setting the environment variable `AZURE_TRACING_GEN_AI_INSTRUMENT_RESPONSES_API` to `false`. If the environment variable is not set, the responses and conversations APIs will be instrumented by default. - -### Tracing Binary Data - -Binary data are images and files sent to the service as input messages. When you enable content recording (`OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` set to `true`), by default you only trace file IDs and filenames. To enable full binary data tracing, set `AZURE_TRACING_GEN_AI_INCLUDE_BINARY_DATA` to `true`. 
In this case: - -* **Images**: Image URLs (including data URIs with base64-encoded content) are included -* **Files**: File data is included if sent via the API - -**Important:** Binary data can contain sensitive information and may significantly increase trace size. Some trace backends and tracing implementations may have limitations on the maximum size of trace data that can be sent to and/or supported by the backend. Ensure your observability backend and tracing implementation support the expected trace payload sizes when enabling binary data tracing. - -### How to trace your own functions - -The decorator `trace_function` is provided for tracing your own function calls using OpenTelemetry. By default the function name is used as the name for the span. Alternatively you can provide the name for the span as a parameter to the decorator. - -**Note:** The `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` environment variable does not affect custom function tracing. When you use the `trace_function` decorator, all parameters and return values are always traced by default. - -This decorator handles various data types for function parameters and return values, and records them as attributes in the trace span. The supported data types include: - -* Basic data types: str, int, float, bool -* Collections: list, dict, tuple, set - * Special handling for collections: - * If a collection (list, dict, tuple, set) contains nested collections, the entire collection is converted to a string before being recorded as an attribute. - * Sets and dictionaries are always converted to strings to ensure compatibility with span attributes. - -Object types are omitted, and the corresponding parameter is not traced. - -The parameters are recorded in attributes `code.function.parameter.` and the return value is recorder in attribute `code.function.return.value` - -#### Adding custom attributes to spans - -You can add custom attributes to spans by creating a custom span processor. 
Here's how to define one: - - - -```python -class CustomAttributeSpanProcessor(SpanProcessor): - def __init__(self) -> None: - pass - - def on_start(self, span: Span, parent_context=None): - # Add this attribute to all spans - span.set_attribute("trace_sample.sessionid", "123") - - # Add another attribute only to create_thread spans - if span.name == "create_thread": - span.set_attribute("trace_sample.create_thread.context", "abc") - - def on_end(self, span: ReadableSpan): - # Clean-up logic can be added here if necessary - pass -``` - - - -Then add the custom span processor to the global tracer provider: - - - -```python -provider = cast(TracerProvider, trace.get_tracer_provider()) -provider.add_span_processor(CustomAttributeSpanProcessor()) -``` - - - -See the full sample in file `\agents\telemetry\sample_agent_basic_with_console_tracing_custom_attributes.py` in the [Samples][samples] folder. - -### Additional resources - -For more information see [Agent tracing overview (preview)](https://learn.microsoft.com/azure/foundry/observability/concepts/trace-agent-concept). +Samples can be found in the sub-folders `agents/telemetry` and `telemetry` in the [Samples][samples] folder. 
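The content-recording precedence described in this section can be sketched as follows. This is an illustrative helper, not part of the SDK: an explicit value passed to `AIProjectInstrumentor().instrument()` wins, then the `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` environment variable, and otherwise content recording defaults to `false`:

```python
import os
from typing import Optional


def content_recording_enabled(instrument_param: Optional[bool] = None) -> bool:
    # Illustrative precedence only (not the SDK's internal code):
    # 1. an explicit argument to instrument() takes priority,
    # 2. then the environment variable,
    # 3. otherwise content recording is off.
    if instrument_param is not None:
        return instrument_param
    value = os.environ.get("OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", "false")
    return value.strip().lower() == "true"
```

Note that, as stated above, this switch only affects built-in traces; the `trace_function` decorator always records parameters and return values.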
## Troubleshooting diff --git a/sdk/ai/azure-ai-projects/apiview-properties.json b/sdk/ai/azure-ai-projects/apiview-properties.json index f6caeb745b80..0700d1af4fcc 100644 --- a/sdk/ai/azure-ai-projects/apiview-properties.json +++ b/sdk/ai/azure-ai-projects/apiview-properties.json @@ -10,10 +10,14 @@ "azure.ai.projects.models.AgentClusterInsightRequest": "Azure.AI.Projects.AgentClusterInsightRequest", "azure.ai.projects.models.InsightResult": "Azure.AI.Projects.InsightResult", "azure.ai.projects.models.AgentClusterInsightResult": "Azure.AI.Projects.AgentClusterInsightResult", + "azure.ai.projects.models.DataGenerationJobSource": "Azure.AI.Projects.DataGenerationJobSource", + "azure.ai.projects.models.AgentDataGenerationJobSource": "Azure.AI.Projects.AgentDataGenerationJobSource", "azure.ai.projects.models.AgentDefinition": "Azure.AI.Projects.AgentDefinition", "azure.ai.projects.models.AgentDetails": "Azure.AI.Projects.AgentObject", - "azure.ai.projects.models.AgentEndpoint": "Azure.AI.Projects.AgentEndpoint", "azure.ai.projects.models.AgentEndpointAuthorizationScheme": "Azure.AI.Projects.AgentEndpointAuthorizationScheme", + "azure.ai.projects.models.AgentEndpointConfig": "Azure.AI.Projects.AgentEndpointConfig", + "azure.ai.projects.models.EvaluatorGenerationJobSource": "Azure.AI.Projects.EvaluatorGenerationJobSource", + "azure.ai.projects.models.AgentEvaluatorGenerationJobSource": "Azure.AI.Projects.AgentEvaluatorGenerationJobSource", "azure.ai.projects.models.BaseCredentials": "Azure.AI.Projects.BaseCredentials", "azure.ai.projects.models.AgenticIdentityPreviewCredentials": "Azure.AI.Projects.AgenticIdentityPreviewCredentials", "azure.ai.projects.models.AgentIdentity": "Azure.AI.Projects.AgentIdentity", @@ -81,6 +85,8 @@ "azure.ai.projects.models.EvaluationRuleAction": "Azure.AI.Projects.EvaluationRuleAction", "azure.ai.projects.models.ContinuousEvaluationRuleAction": "Azure.AI.Projects.ContinuousEvaluationRuleAction", "azure.ai.projects.models.CosmosDBIndex": 
"Azure.AI.Projects.CosmosDBIndex", + "azure.ai.projects.models.CreateAgentVersionFromCodeContent": "Azure.AI.Projects.CreateAgentVersionFromCodeContent", + "azure.ai.projects.models.CreateAgentVersionFromCodeRequest": "Azure.AI.Projects.CreateAgentVersionFromCodeRequest", "azure.ai.projects.models.Trigger": "Azure.AI.Projects.Trigger", "azure.ai.projects.models.CronTrigger": "Azure.AI.Projects.CronTrigger", "azure.ai.projects.models.CustomCredential": "Azure.AI.Projects.CustomCredential", @@ -90,7 +96,18 @@ "azure.ai.projects.models.CustomToolParam": "OpenAI.CustomToolParam", "azure.ai.projects.models.RecurrenceSchedule": "Azure.AI.Projects.RecurrenceSchedule", "azure.ai.projects.models.DailyRecurrenceSchedule": "Azure.AI.Projects.DailyRecurrenceSchedule", + "azure.ai.projects.models.DataGenerationJob": "Azure.AI.Projects.DataGenerationJob", + "azure.ai.projects.models.DataGenerationJobInputs": "Azure.AI.Projects.DataGenerationJobInputs", + "azure.ai.projects.models.DataGenerationJobOptions": "Azure.AI.Projects.DataGenerationJobOptions", + "azure.ai.projects.models.DataGenerationJobOutput": "Azure.AI.Projects.DataGenerationJobOutput", + "azure.ai.projects.models.DataGenerationJobResult": "Azure.AI.Projects.DataGenerationJobResult", + "azure.ai.projects.models.DataGenerationModelOptions": "Azure.AI.Projects.DataGenerationModelOptions", + "azure.ai.projects.models.DataGenerationTokenUsage": "Azure.AI.Projects.DataGenerationTokenUsage", "azure.ai.projects.models.DatasetCredential": "Azure.AI.Projects.AssetCredentialResponse", + "azure.ai.projects.models.DatasetDataGenerationJobOutput": "Azure.AI.Projects.DatasetDataGenerationJobOutput", + "azure.ai.projects.models.DatasetDataGenerationJobSource": "Azure.AI.Projects.DatasetDataGenerationJobSource", + "azure.ai.projects.models.DatasetEvaluatorGenerationJobSource": "Azure.AI.Projects.DatasetEvaluatorGenerationJobSource", + "azure.ai.projects.models.DatasetReference": "Azure.AI.Projects.DatasetReference", 
"azure.ai.projects.models.DatasetVersion": "Azure.AI.Projects.DatasetVersion", "azure.ai.projects.models.DeleteAgentResponse": "Azure.AI.Projects.DeleteAgentResponse", "azure.ai.projects.models.DeleteAgentVersionResponse": "Azure.AI.Projects.DeleteAgentVersionResponse", @@ -100,8 +117,6 @@ "azure.ai.projects.models.EmbeddingConfiguration": "Azure.AI.Projects.EmbeddingConfiguration", "azure.ai.projects.models.EntraAuthorizationScheme": "Azure.AI.Projects.EntraAuthorizationScheme", "azure.ai.projects.models.EntraIDCredentials": "Azure.AI.Projects.EntraIDCredentials", - "azure.ai.projects.models.IsolationKeySource": "Azure.AI.Projects.IsolationKeySource", - "azure.ai.projects.models.EntraIsolationKeySource": "Azure.AI.Projects.EntraIsolationKeySource", "azure.ai.projects.models.EvalResult": "Azure.AI.Projects.EvalResult", "azure.ai.projects.models.EvalRunResultCompareItem": "Azure.AI.Projects.EvalRunResultCompareItem", "azure.ai.projects.models.EvalRunResultComparison": "Azure.AI.Projects.EvalRunResultComparison", @@ -117,10 +132,18 @@ "azure.ai.projects.models.ScheduleTask": "Azure.AI.Projects.ScheduleTask", "azure.ai.projects.models.EvaluationScheduleTask": "Azure.AI.Projects.EvaluationScheduleTask", "azure.ai.projects.models.EvaluationTaxonomy": "Azure.AI.Projects.EvaluationTaxonomy", + "azure.ai.projects.models.EvaluatorGenerationArtifacts": "Azure.AI.Projects.EvaluatorGenerationArtifacts", + "azure.ai.projects.models.EvaluatorGenerationInputs": "Azure.AI.Projects.EvaluatorGenerationInputs", + "azure.ai.projects.models.EvaluatorGenerationJob": "Azure.AI.Projects.EvaluatorGenerationJob", + "azure.ai.projects.models.EvaluatorGenerationTokenUsage": "Azure.AI.Projects.EvaluatorGenerationTokenUsage", "azure.ai.projects.models.EvaluatorMetric": "Azure.AI.Projects.EvaluatorMetric", "azure.ai.projects.models.EvaluatorVersion": "Azure.AI.Projects.EvaluatorVersion", "azure.ai.projects.models.FabricDataAgentToolParameters": "Azure.AI.Projects.FabricDataAgentToolParameters", 
+ "azure.ai.projects.models.FabricIQPreviewTool": "Azure.AI.Projects.FabricIQPreviewTool", + "azure.ai.projects.models.FabricIQPreviewToolParameters": "Azure.AI.Projects.FabricIQPreviewToolParameters", "azure.ai.projects.models.FieldMapping": "Azure.AI.Projects.FieldMapping", + "azure.ai.projects.models.FileDataGenerationJobOutput": "Azure.AI.Projects.FileDataGenerationJobOutput", + "azure.ai.projects.models.FileDataGenerationJobSource": "Azure.AI.Projects.FileDataGenerationJobSource", "azure.ai.projects.models.FileDatasetVersion": "Azure.AI.Projects.FileDatasetVersion", "azure.ai.projects.models.FileSearchTool": "OpenAI.FileSearchTool", "azure.ai.projects.models.VersionSelectionRule": "Azure.AI.Projects.VersionSelectionRule", @@ -130,7 +153,6 @@ "azure.ai.projects.models.FunctionShellToolParamEnvironmentContainerReferenceParam": "OpenAI.FunctionShellToolParamEnvironmentContainerReferenceParam", "azure.ai.projects.models.FunctionShellToolParamEnvironmentLocalEnvironmentParam": "OpenAI.FunctionShellToolParamEnvironmentLocalEnvironmentParam", "azure.ai.projects.models.FunctionTool": "OpenAI.FunctionTool", - "azure.ai.projects.models.HeaderIsolationKeySource": "Azure.AI.Projects.HeaderIsolationKeySource", "azure.ai.projects.models.TelemetryEndpointAuth": "Azure.AI.Projects.TelemetryEndpointAuth", "azure.ai.projects.models.HeaderTelemetryEndpointAuth": "Azure.AI.Projects.HeaderTelemetryEndpointAuth", "azure.ai.projects.models.HostedAgentDefinition": "Azure.AI.Projects.HostedAgentDefinition", @@ -190,6 +212,8 @@ "azure.ai.projects.models.PromptAgentDefinition": "Azure.AI.Projects.PromptAgentDefinition", "azure.ai.projects.models.PromptAgentDefinitionTextOptions": "Azure.AI.Projects.PromptAgentDefinitionTextOptions", "azure.ai.projects.models.PromptBasedEvaluatorDefinition": "Azure.AI.Projects.PromptBasedEvaluatorDefinition", + "azure.ai.projects.models.PromptDataGenerationJobSource": "Azure.AI.Projects.PromptDataGenerationJobSource", + 
"azure.ai.projects.models.PromptEvaluatorGenerationJobSource": "Azure.AI.Projects.PromptEvaluatorGenerationJobSource", "azure.ai.projects.models.ProtocolVersionRecord": "Azure.AI.Projects.ProtocolVersionRecord", "azure.ai.projects.models.RaiConfig": "Azure.AI.Projects.RaiConfig", "azure.ai.projects.models.RankingOptions": "OpenAI.RankingOptions", @@ -198,6 +222,8 @@ "azure.ai.projects.models.RedTeam": "Azure.AI.Projects.RedTeam", "azure.ai.projects.models.ResponseUsageInputTokensDetails": "OpenAI.ResponseUsageInputTokensDetails", "azure.ai.projects.models.ResponseUsageOutputTokensDetails": "OpenAI.ResponseUsageOutputTokensDetails", + "azure.ai.projects.models.RubricBasedEvaluatorDefinition": "Azure.AI.Projects.RubricBasedEvaluatorDefinition", + "azure.ai.projects.models.RubricCriterion": "Azure.AI.Projects.RubricCriterion", "azure.ai.projects.models.SASCredentials": "Azure.AI.Projects.SASCredentials", "azure.ai.projects.models.Schedule": "Azure.AI.Projects.Schedule", "azure.ai.projects.models.ScheduleRun": "Azure.AI.Projects.ScheduleRun", @@ -207,6 +233,7 @@ "azure.ai.projects.models.SessionLogEvent": "Azure.AI.Projects.SessionLogEvent", "azure.ai.projects.models.SharepointGroundingToolParameters": "Azure.AI.Projects.SharepointGroundingToolParameters", "azure.ai.projects.models.SharepointPreviewTool": "Azure.AI.Projects.SharepointPreviewTool", + "azure.ai.projects.models.SimpleQnADataGenerationJobOptions": "Azure.AI.Projects.SimpleQnADataGenerationJobOptions", "azure.ai.projects.models.SkillObject": "Azure.AI.Projects.SkillObject", "azure.ai.projects.models.SkillReferenceParam": "OpenAI.SkillReferenceParam", "azure.ai.projects.models.ToolChoiceParam": "OpenAI.ToolChoiceParam", @@ -223,6 +250,7 @@ "azure.ai.projects.models.TextResponseFormatText": "OpenAI.TextResponseFormatConfigurationResponseFormatText", "azure.ai.projects.models.ToolboxObject": "Azure.AI.Projects.ToolboxObject", "azure.ai.projects.models.ToolboxPolicies": "Azure.AI.Projects.ToolboxPolicies", + 
"azure.ai.projects.models.ToolboxSearchPreviewTool": "Azure.AI.Projects.ToolboxSearchPreviewTool", "azure.ai.projects.models.ToolboxVersionObject": "Azure.AI.Projects.ToolboxVersionObject", "azure.ai.projects.models.ToolChoiceAllowed": "OpenAI.ToolChoiceAllowed", "azure.ai.projects.models.ToolChoiceCodeInterpreter": "OpenAI.ToolChoiceCodeInterpreter", @@ -236,6 +264,10 @@ "azure.ai.projects.models.ToolChoiceWebSearchPreview20250311": "OpenAI.ToolChoiceWebSearchPreview20250311", "azure.ai.projects.models.ToolDescription": "Azure.AI.Projects.ToolDescription", "azure.ai.projects.models.ToolProjectConnection": "Azure.AI.Projects.ToolProjectConnection", + "azure.ai.projects.models.ToolUseFineTuningDataGenerationJobOptions": "Azure.AI.Projects.ToolUseFineTuningDataGenerationJobOptions", + "azure.ai.projects.models.TracesDataGenerationJobOptions": "Azure.AI.Projects.TracesDataGenerationJobOptions", + "azure.ai.projects.models.TracesDataGenerationJobSource": "Azure.AI.Projects.TracesDataGenerationJobSource", + "azure.ai.projects.models.TracesEvaluatorGenerationJobSource": "Azure.AI.Projects.TracesEvaluatorGenerationJobSource", "azure.ai.projects.models.UpdateToolboxRequest": "Azure.AI.Projects.UpdateToolboxRequest", "azure.ai.projects.models.UserProfileMemoryItem": "Azure.AI.Projects.UserProfileMemoryItem", "azure.ai.projects.models.VersionIndicator": "Azure.AI.Projects.VersionIndicator", @@ -267,6 +299,7 @@ "azure.ai.projects.models.ContainerSkillType": "OpenAI.ContainerSkillType", "azure.ai.projects.models.SearchContextSize": "OpenAI.SearchContextSize", "azure.ai.projects.models.AgentProtocol": "Azure.AI.Projects.AgentProtocol", + "azure.ai.projects.models.CodeDependencyResolution": "Azure.AI.Projects.CodeDependencyResolution", "azure.ai.projects.models.TelemetryEndpointKind": "Azure.AI.Projects.TelemetryEndpointKind", "azure.ai.projects.models.TelemetryDataKind": "Azure.AI.Projects.TelemetryDataKind", "azure.ai.projects.models.TelemetryEndpointAuthType": 
"Azure.AI.Projects.TelemetryEndpointAuthType", @@ -278,7 +311,6 @@ "azure.ai.projects.models.VersionSelectorType": "Azure.AI.Projects.VersionSelectorType", "azure.ai.projects.models.AgentEndpointProtocol": "Azure.AI.Projects.AgentEndpointProtocol", "azure.ai.projects.models.AgentEndpointAuthorizationSchemeType": "Azure.AI.Projects.AgentEndpointAuthorizationSchemeType", - "azure.ai.projects.models.IsolationKeySourceKind": "Azure.AI.Projects.IsolationKeySourceKind", "azure.ai.projects.models.VersionIndicatorType": "Azure.AI.Projects.VersionIndicatorType", "azure.ai.projects.models.AgentSessionStatus": "Azure.AI.Projects.AgentSessionStatus", "azure.ai.projects.models.PageOrder": "Azure.AI.Projects.PageOrder", @@ -290,6 +322,8 @@ "azure.ai.projects.models.EvaluatorDefinitionType": "Azure.AI.Projects.EvaluatorDefinitionType", "azure.ai.projects.models.EvaluatorMetricType": "Azure.AI.Projects.EvaluatorMetricType", "azure.ai.projects.models.EvaluatorMetricDirection": "Azure.AI.Projects.EvaluatorMetricDirection", + "azure.ai.projects.models.EvaluatorGenerationJobSourceType": "Azure.AI.Projects.EvaluatorGenerationJobSourceType", + "azure.ai.projects.models.JobStatus": "Azure.AI.Projects.JobStatus", "azure.ai.projects.models.OperationState": "Azure.Core.Foundations.OperationState", "azure.ai.projects.models.InsightType": "Azure.AI.Projects.InsightType", "azure.ai.projects.models.SampleType": "Azure.AI.Projects.SampleType", @@ -304,6 +338,11 @@ "azure.ai.projects.models.RecurrenceType": "Azure.AI.Projects.RecurrenceType", "azure.ai.projects.models.DayOfWeek": "Azure.AI.Projects.DayOfWeek", "azure.ai.projects.models.ScheduleTaskType": "Azure.AI.Projects.ScheduleTaskType", + "azure.ai.projects.models.DataGenerationJobSourceType": "Azure.AI.Projects.DataGenerationJobSourceType", + "azure.ai.projects.models.DataGenerationJobType": "Azure.AI.Projects.DataGenerationJobType", + "azure.ai.projects.models.SimpleQnAFineTuningQuestionType": 
"Azure.AI.Projects.SimpleQnAFineTuningQuestionType", + "azure.ai.projects.models.DataGenerationJobScenario": "Azure.AI.Projects.DataGenerationJobScenario", + "azure.ai.projects.models.DataGenerationJobOutputType": "Azure.AI.Projects.DataGenerationJobOutputType", "azure.ai.projects.models.EvaluationRuleActionType": "Azure.AI.Projects.EvaluationRuleActionType", "azure.ai.projects.models.EvaluationRuleEventType": "Azure.AI.Projects.EvaluationRuleEventType", "azure.ai.projects.models.ConnectionType": "Azure.AI.Projects.ConnectionType", diff --git a/sdk/ai/azure-ai-projects/assets.json b/sdk/ai/azure-ai-projects/assets.json index 87ee354cd9f9..59aa2d33f39a 100644 --- a/sdk/ai/azure-ai-projects/assets.json +++ b/sdk/ai/azure-ai-projects/assets.json @@ -2,5 +2,5 @@ "AssetsRepo": "Azure/azure-sdk-assets", "AssetsRepoPrefixPath": "python", "TagPrefix": "python/ai/azure-ai-projects", - "Tag": "python/ai/azure-ai-projects_64257c2deb" + "Tag": "python/ai/azure-ai-projects_f15b61b44c" } diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py b/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py index 950bfaf73334..8b68c0515aec 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py @@ -24,6 +24,86 @@ logger = logging.getLogger(__name__) +# --------------------------------------------------------------------------- +# Shared helpers used by both the sync and async AIProjectClient.get_openai_client() +# implementations. Defined at module level so the async client can import and reuse +# them without duplicating the logic. +# --------------------------------------------------------------------------- + + +def _resolve_openai_base_url(config: Any, agent_name: Optional[str], kwargs: dict) -> str: + """Resolve the base URL for the (Async)OpenAI client. + + :param config: Generated client configuration carrying ``endpoint`` and ``allow_preview``. 
+ :type config: Any + :param agent_name: Optional hosted-agent name. + :type agent_name: str or None + :param kwargs: Caller keyword arguments; ``base_url`` is popped when present. + :type kwargs: dict + :return: The base URL to use for the (Async)OpenAI client. + :rtype: str + :raises ValueError: If ``agent_name`` is provided but ``allow_preview=True`` was not set. + """ + if "base_url" in kwargs: + return kwargs.pop("base_url") + if agent_name is not None: + if config.allow_preview: + return config.endpoint.rstrip("/") + f"/agents/{agent_name}/endpoint/protocols/openai" + raise ValueError( + "Calling `get_openai_client` method with an `agent_name` requires you to set `allow_preview=True`" + "\nwhen constructing the AIProjectClient. Note that preview features are under development and " + "\nsubject to change. They should not be used in production environments." + ) + return config.endpoint.rstrip("/") + "/openai/v1" + + +def _resolve_openai_query_params(config: Any, agent_name: Optional[str], kwargs: dict) -> dict: + """Build the ``default_query`` dict for the (Async)OpenAI client. + + :param config: Generated client configuration carrying ``api_version``. + :type config: Any + :param agent_name: Optional hosted-agent name. + :type agent_name: str or None + :param kwargs: Caller keyword arguments; ``default_query`` is popped when present. + :type kwargs: dict + :return: Query parameters to forward to the (Async)OpenAI client. + :rtype: dict + """ + default_query = dict[str, str](kwargs.pop("default_query", None) or {}) + if agent_name is not None and "api-version" not in default_query: + default_query["api-version"] = config.api_version + return default_query + + +def _resolve_openai_default_headers(agent_name: Optional[str], kwargs: dict) -> dict: + """Build the ``default_headers`` dict for the (Async)OpenAI client. + + :param agent_name: Optional hosted-agent name. 
+ :type agent_name: str or None + :param kwargs: Caller keyword arguments; ``default_headers`` is popped when present. + :type kwargs: dict + :return: Headers to forward to the (Async)OpenAI client. + :rtype: dict + """ + default_headers = dict[str, str](kwargs.pop("default_headers", None) or {}) + if agent_name is not None and not _has_header_case_insensitive(default_headers, _FOUNDRY_FEATURES_HEADER_NAME): + default_headers[_FOUNDRY_FEATURES_HEADER_NAME] = _BETA_OPERATION_FEATURE_HEADERS["agents"] + return default_headers + + +def _build_openai_user_agent(custom_user_agent: Optional[str], openai_default_user_agent: str) -> str: + """Build the SDK-prefixed User-Agent string for the (Async)OpenAI client. + + :param custom_user_agent: Caller-supplied user_agent kwarg captured at construction time. + :type custom_user_agent: str or None + :param openai_default_user_agent: The OpenAI client's own default user-agent. + :type openai_default_user_agent: str + :return: Combined User-Agent string. + :rtype: str + """ + return "-".join(ua for ua in [custom_user_agent, "AIProjectClient"] if ua) + " " + openai_default_user_agent + + class AIProjectClient(AIProjectClientGenerated): # pylint: disable=too-many-instance-attributes """AIProjectClient. @@ -101,6 +181,35 @@ def __init__( self.telemetry = TelemetryOperations(self) # type: ignore + def _get_openai_api_key(self, kwargs: dict): + """Resolve the API key for the OpenAI client. + + :param kwargs: Caller keyword arguments; ``api_key`` is popped when present. + :type kwargs: dict + :return: The API key string or a bearer-token-provider callable. + :rtype: str or Callable + """ + if "api_key" in kwargs: + return kwargs.pop("api_key") + return get_bearer_token_provider( + self._config.credential, # pylint: disable=protected-access + "https://ai.azure.com/.default", + ) + + def _get_openai_http_client(self, kwargs: dict): + """Resolve the HTTP transport client for the OpenAI client. 
+ + :param kwargs: Caller keyword arguments; ``http_client`` is popped when present. + :type kwargs: dict + :return: An httpx.Client instance configured with logging transport, or ``None``. + :rtype: httpx.Client or None + """ + if "http_client" in kwargs: + return kwargs.pop("http_client") + if self._console_logging_enabled: + return httpx.Client(transport=_OpenAILoggingTransport()) + return None + @distributed_trace def get_openai_client(self, *, agent_name: Optional[str] = None, **kwargs: Any) -> OpenAI: """Get an authenticated OpenAI client from the `openai` package. @@ -131,51 +240,17 @@ def get_openai_client(self, *, agent_name: Optional[str] = None, **kwargs: Any) kwargs = kwargs.copy() if kwargs else {} - # Allow caller to override base_url - if "base_url" in kwargs: - base_url = kwargs.pop("base_url") - elif agent_name is not None: - if self._config.allow_preview: - base_url = ( - self._config.endpoint.rstrip("/") + f"/agents/{agent_name}/endpoint/protocols/openai" - ) # pylint: disable=protected-access - else: - raise ValueError( - "Calling `get_openai_client` method with an `agent_name` requires you to set `allow_preview=True`" - "\nwhen constructing the AIProjectClient. Note that preview features are under development and " - "\nsubject to change. They should not be used in production environments." 
- ) - else: - base_url = self._config.endpoint.rstrip("/") + "/openai/v1" # pylint: disable=protected-access - - default_query = dict[str, str](kwargs.pop("default_query", None) or {}) - if agent_name is not None and "api-version" not in default_query: - default_query["api-version"] = self._config.api_version # pylint: disable=protected-access + base_url = _resolve_openai_base_url(self._config, agent_name, kwargs) + default_query = _resolve_openai_query_params(self._config, agent_name, kwargs) logger.debug( # pylint: disable=specify-parameter-names-in-call "[get_openai_client] Creating OpenAI client using Entra ID authentication, base_url = `%s`", # pylint: disable=line-too-long base_url, ) - # Allow caller to override api_key, otherwise use token provider - if "api_key" in kwargs: - api_key = kwargs.pop("api_key") - else: - api_key = get_bearer_token_provider( - self._config.credential, # pylint: disable=protected-access - "https://ai.azure.com/.default", - ) - - if "http_client" in kwargs: - http_client = kwargs.pop("http_client") - elif self._console_logging_enabled: - http_client = httpx.Client(transport=_OpenAILoggingTransport()) - else: - http_client = None - - default_headers = dict[str, str](kwargs.pop("default_headers", None) or {}) - if agent_name is not None and not _has_header_case_insensitive(default_headers, _FOUNDRY_FEATURES_HEADER_NAME): - default_headers[_FOUNDRY_FEATURES_HEADER_NAME] = _BETA_OPERATION_FEATURE_HEADERS["agents"] + api_key = self._get_openai_api_key(kwargs) + http_client = self._get_openai_http_client(kwargs) + default_headers = _resolve_openai_default_headers(agent_name, kwargs) openai_custom_user_agent = default_headers.get("User-Agent", None) @@ -195,11 +270,7 @@ def _create_openai_client(**kwargs) -> OpenAI: if openai_custom_user_agent: final_user_agent = openai_custom_user_agent else: - final_user_agent = ( - "-".join(ua for ua in [self._custom_user_agent, "AIProjectClient"] if ua) - + " " - + openai_default_user_agent - ) + 
final_user_agent = _build_openai_user_agent(self._custom_user_agent, openai_default_user_agent) default_headers["User-Agent"] = final_user_agent diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.pyi b/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.pyi index c40454fc9320..42009c1c227e 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.pyi +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.pyi @@ -108,6 +108,11 @@ class AIProjectClient(AIProjectClientGenerated): # To make mypy happy... otherwise imports of the below result in mypy "attr-defined" error class _AuthSecretsFilter(logging.Filter): ... +def _resolve_openai_base_url(config: Any, agent_name: Optional[str], kwargs: dict) -> str: ... +def _resolve_openai_query_params(config: Any, agent_name: Optional[str], kwargs: dict) -> dict: ... +def _resolve_openai_default_headers(agent_name: Optional[str], kwargs: dict) -> dict: ... +def _build_openai_user_agent(custom_user_agent: Optional[str], openai_default_user_agent: str) -> str: ... + __all__: List[str] = ["AIProjectClient"] def patch_sdk() -> None: ... 
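The helpers extracted in this refactor (`_build_openai_user_agent`, `_resolve_openai_base_url`) are pure functions, which makes their behavior easy to check in isolation. The sketch below reimplements their logic standalone for illustration; these are not importable SDK functions, just mirrors of the hunks above:

```python
from typing import Optional


def build_openai_user_agent(custom_user_agent: Optional[str], openai_default_user_agent: str) -> str:
    # Mirrors _build_openai_user_agent: join non-empty parts with "-",
    # then append the OpenAI client's own default user-agent.
    prefix = "-".join(ua for ua in [custom_user_agent, "AIProjectClient"] if ua)
    return prefix + " " + openai_default_user_agent


def resolve_base_url(endpoint: str, agent_name: Optional[str], allow_preview: bool) -> str:
    # Mirrors _resolve_openai_base_url for the default case where the caller
    # did not pass a base_url override in kwargs.
    if agent_name is not None:
        if not allow_preview:
            raise ValueError("Using agent_name requires allow_preview=True on the AIProjectClient.")
        return endpoint.rstrip("/") + f"/agents/{agent_name}/endpoint/protocols/openai"
    return endpoint.rstrip("/") + "/openai/v1"
```

Factoring these out at module level lets the async client in `aio/_patch.py` import and reuse them (as the later hunks do), instead of duplicating the branching in both `get_openai_client` implementations.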
diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_utils/utils.py b/sdk/ai/azure-ai-projects/azure/ai/projects/_utils/utils.py index 707b7d8fac75..bd821750f4c6 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_utils/utils.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_utils/utils.py @@ -10,6 +10,7 @@ from .._utils.model_base import Model, SdkJSONEncoder + # file-like tuple could be `(filename, IO (or bytes))` or `(filename, IO (or bytes), content_type)` FileContent = Union[str, bytes, IO[str], IO[bytes]] diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py b/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py index ccb75164d3bc..8abefc77cabd 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py @@ -6,4 +6,4 @@ # Changes may cause incorrect behavior and will be lost if the code is regenerated. # -------------------------------------------------------------------------- -VERSION = "2.1.0" +VERSION = "2.2.0" diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_patch.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_patch.py index 34528053a55c..6eafa71053e6 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_patch.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_patch.py @@ -16,8 +16,13 @@ from azure.core.tracing.decorator import distributed_trace from azure.core.credentials_async import AsyncTokenCredential from azure.identity.aio import get_bearer_token_provider -from .._patch import _AuthSecretsFilter -from ..models._patch import _BETA_OPERATION_FEATURE_HEADERS, _FOUNDRY_FEATURES_HEADER_NAME, _has_header_case_insensitive +from .._patch import ( + _AuthSecretsFilter, + _build_openai_user_agent, + _resolve_openai_base_url, + _resolve_openai_default_headers, + _resolve_openai_query_params, +) from ._client import AIProjectClient as AIProjectClientGenerated from .operations import TelemetryOperations @@ -101,6 +106,35 @@ def 
__init__( self.telemetry = TelemetryOperations(self) # type: ignore + def _get_openai_api_key(self, kwargs: dict): + """Resolve the API key for the AsyncOpenAI client. + + :param kwargs: Caller keyword arguments; ``api_key`` is popped when present. + :type kwargs: dict + :return: The API key string or a bearer-token-provider callable. + :rtype: str or Callable + """ + if "api_key" in kwargs: + return kwargs.pop("api_key") + return get_bearer_token_provider( + self._config.credential, # pylint: disable=protected-access + "https://ai.azure.com/.default", + ) + + def _get_openai_http_client(self, kwargs: dict): + """Resolve the HTTP transport client for the AsyncOpenAI client. + + :param kwargs: Caller keyword arguments; ``http_client`` is popped when present. + :type kwargs: dict + :return: An httpx.AsyncClient instance configured with logging transport, or ``None``. + :rtype: httpx.AsyncClient or None + """ + if "http_client" in kwargs: + return kwargs.pop("http_client") + if self._console_logging_enabled: + return httpx.AsyncClient(transport=_OpenAILoggingTransport()) + return None + @distributed_trace def get_openai_client(self, *, agent_name: Optional[str] = None, **kwargs: Any) -> AsyncOpenAI: """Get an authenticated AsyncOpenAI client from the `openai` package. @@ -131,51 +165,17 @@ def get_openai_client(self, *, agent_name: Optional[str] = None, **kwargs: Any) kwargs = kwargs.copy() if kwargs else {} - # Allow caller to override base_url - if "base_url" in kwargs: - base_url = kwargs.pop("base_url") - elif agent_name is not None: - if self._config.allow_preview: - base_url = ( - self._config.endpoint.rstrip("/") + f"/agents/{agent_name}/endpoint/protocols/openai" - ) # pylint: disable=protected-access - else: - raise ValueError( - "Calling `get_openai_client` method with an `agent_name` requires you to set `allow_preview=True`" - "\nwhen constructing the AIProjectClient. Note that preview features are under development and " - "\nsubject to change. 
They should not be used in production environments." - ) - else: - base_url = self._config.endpoint.rstrip("/") + "/openai/v1" # pylint: disable=protected-access - - default_query = dict[str, str](kwargs.pop("default_query", None) or {}) - if agent_name is not None and "api-version" not in default_query: - default_query["api-version"] = self._config.api_version # pylint: disable=protected-access + base_url = _resolve_openai_base_url(self._config, agent_name, kwargs) + default_query = _resolve_openai_query_params(self._config, agent_name, kwargs) logger.debug( # pylint: disable=specify-parameter-names-in-call "[get_openai_client] Creating OpenAI client using Entra ID authentication, base_url = `%s`", # pylint: disable=line-too-long base_url, ) - # Allow caller to override api_key, otherwise use token provider - if "api_key" in kwargs: - api_key = kwargs.pop("api_key") - else: - api_key = get_bearer_token_provider( - self._config.credential, # pylint: disable=protected-access - "https://ai.azure.com/.default", - ) - - if "http_client" in kwargs: - http_client = kwargs.pop("http_client") - elif self._console_logging_enabled: - http_client = httpx.AsyncClient(transport=_OpenAILoggingTransport()) - else: - http_client = None - - default_headers = dict[str, str](kwargs.pop("default_headers", None) or {}) - if agent_name is not None and not _has_header_case_insensitive(default_headers, _FOUNDRY_FEATURES_HEADER_NAME): - default_headers[_FOUNDRY_FEATURES_HEADER_NAME] = _BETA_OPERATION_FEATURE_HEADERS["agents"] + api_key = self._get_openai_api_key(kwargs) + http_client = self._get_openai_http_client(kwargs) + default_headers = _resolve_openai_default_headers(agent_name, kwargs) openai_custom_user_agent = default_headers.get("User-Agent", None) @@ -195,11 +195,7 @@ def _create_openai_client(**kwargs) -> AsyncOpenAI: if openai_custom_user_agent: final_user_agent = openai_custom_user_agent else: - final_user_agent = ( - "-".join(ua for ua in [self._custom_user_agent, 
"AIProjectClient"] if ua) - + " " - + openai_default_user_agent - ) + final_user_agent = _build_openai_user_agent(self._custom_user_agent, openai_default_user_agent) default_headers["User-Agent"] = final_user_agent diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py index 41e22e8fa2e8..699644e8a17d 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py @@ -33,8 +33,9 @@ from azure.core.utils import case_insensitive_dict from ... import models as _models -from ..._utils.model_base import SdkJSONEncoder, _deserialize, _failsafe_deserialize +from ..._utils.model_base import Model as _Model, SdkJSONEncoder, _deserialize, _failsafe_deserialize from ..._utils.serialization import Deserializer, Serializer +from ..._utils.utils import prepare_multipart_form_data from ...operations._operations import ( build_agents_create_version_from_manifest_request, build_agents_create_version_request, @@ -44,24 +45,38 @@ build_agents_get_version_request, build_agents_list_request, build_agents_list_versions_request, + build_beta_agents_create_agent_version_from_code_request, build_beta_agents_create_session_request, build_beta_agents_delete_session_file_request, build_beta_agents_delete_session_request, + build_beta_agents_download_agent_code_request, + build_beta_agents_download_agent_version_code_request, build_beta_agents_download_session_file_request, build_beta_agents_get_session_files_request, build_beta_agents_get_session_log_stream_request, build_beta_agents_get_session_request, build_beta_agents_list_sessions_request, build_beta_agents_patch_agent_details_request, + build_beta_agents_update_agent_from_code_request, build_beta_agents_upload_session_file_request, + build_beta_datasets_cancel_generation_job_request, + 
build_beta_datasets_create_generation_job_request, + build_beta_datasets_delete_generation_job_request, + build_beta_datasets_get_generation_job_request, + build_beta_datasets_list_generation_jobs_request, build_beta_evaluation_taxonomies_create_request, build_beta_evaluation_taxonomies_delete_request, build_beta_evaluation_taxonomies_get_request, build_beta_evaluation_taxonomies_list_request, build_beta_evaluation_taxonomies_update_request, + build_beta_evaluators_cancel_generation_job_request, + build_beta_evaluators_create_generation_job_request, build_beta_evaluators_create_version_request, + build_beta_evaluators_delete_generation_job_request, build_beta_evaluators_delete_version_request, + build_beta_evaluators_get_generation_job_request, build_beta_evaluators_get_version_request, + build_beta_evaluators_list_generation_jobs_request, build_beta_evaluators_list_request, build_beta_evaluators_list_versions_request, build_beta_evaluators_update_version_request, @@ -159,6 +174,7 @@ def __init__(self, *args, **kwargs) -> None: self.schedules = BetaSchedulesOperations(self._client, self._config, self._serialize, self._deserialize) self.toolboxes = BetaToolboxesOperations(self._client, self._config, self._serialize, self._deserialize) self.skills = BetaSkillsOperations(self._client, self._config, self._serialize, self._deserialize) + self.datasets = BetaDatasetsOperations(self._client, self._config, self._serialize, self._deserialize) class AgentsOperations: @@ -3012,13 +3028,157 @@ def __init__(self, *args, **kwargs) -> None: self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + @overload + async def update_agent_from_code( + self, agent_name: str, body: _models.CreateAgentVersionFromCodeContent, *, code_zip_sha256: str, **kwargs: Any + ) -> _models.AgentDetails: + """Updates a code-based agent by uploading new code and creating 
a new version. If the code and + definition are unchanged (matched by x-ms-code-zip-sha256 header), returns the existing + version. The request body is multipart/form-data with a JSON metadata part and a binary code + part (part order is irrelevant). Maximum upload size is 250 MB. + + :param agent_name: The unique name that identifies the agent. Name can be used to + retrieve/update/delete the agent. + + * Must start and end with alphanumeric characters, + * Can contain hyphens in the middle + * Must not exceed 63 characters. Required. + :type agent_name: str + :param body: Required. + :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent + :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change + detection (dedup) and integrity verification. Required. + :paramtype code_zip_sha256: str + :return: AgentDetails. The AgentDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def update_agent_from_code( + self, agent_name: str, body: JSON, *, code_zip_sha256: str, **kwargs: Any + ) -> _models.AgentDetails: + """Updates a code-based agent by uploading new code and creating a new version. If the code and + definition are unchanged (matched by x-ms-code-zip-sha256 header), returns the existing + version. The request body is multipart/form-data with a JSON metadata part and a binary code + part (part order is irrelevant). Maximum upload size is 250 MB. + + :param agent_name: The unique name that identifies the agent. Name can be used to + retrieve/update/delete the agent. + + * Must start and end with alphanumeric characters, + * Can contain hyphens in the middle + * Must not exceed 63 characters. Required. + :type agent_name: str + :param body: Required. + :type body: JSON + :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change + detection (dedup) and integrity verification. 
Required. + :paramtype code_zip_sha256: str + :return: AgentDetails. The AgentDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @distributed_trace_async + async def update_agent_from_code( + self, + agent_name: str, + body: Union[_models.CreateAgentVersionFromCodeContent, JSON], + *, + code_zip_sha256: str, + **kwargs: Any + ) -> _models.AgentDetails: + """Updates a code-based agent by uploading new code and creating a new version. If the code and + definition are unchanged (matched by x-ms-code-zip-sha256 header), returns the existing + version. The request body is multipart/form-data with a JSON metadata part and a binary code + part (part order is irrelevant). Maximum upload size is 250 MB. + + :param agent_name: The unique name that identifies the agent. Name can be used to + retrieve/update/delete the agent. + + * Must start and end with alphanumeric characters, + * Can contain hyphens in the middle + * Must not exceed 63 characters. Required. + :type agent_name: str + :param body: Is either a CreateAgentVersionFromCodeContent type or a JSON type. Required. + :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent or JSON + :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change + detection (dedup) and integrity verification. Required. + :paramtype code_zip_sha256: str + :return: AgentDetails. 
The AgentDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.AgentDetails] = kwargs.pop("cls", None) + + _body = body.as_dict() if isinstance(body, _Model) else body + _file_fields: list[str] = ["code"] + _data_fields: list[str] = ["metadata"] + _files = prepare_multipart_form_data(_body, _file_fields, _data_fields) + + _request = build_beta_agents_update_agent_from_code_request( + agent_name=agent_name, + code_zip_sha256=code_zip_sha256, + api_version=self._config.api_version, + files=_files, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + 
deserialized = _deserialize(_models.AgentDetails, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + @overload async def patch_agent_details( self, agent_name: str, *, content_type: str = "application/merge-patch+json", - agent_endpoint: Optional[_models.AgentEndpoint] = None, + agent_endpoint: Optional[_models.AgentEndpointConfig] = None, agent_card: Optional[_models.AgentCard] = None, **kwargs: Any ) -> _models.AgentDetails: @@ -3030,7 +3190,7 @@ async def patch_agent_details( Default value is "application/merge-patch+json". :paramtype content_type: str :keyword agent_endpoint: The endpoint configuration for the agent. Default value is None. - :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpoint + :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpointConfig :keyword agent_card: Optional agent card for the agent. Default value is None. :paramtype agent_card: ~azure.ai.projects.models.AgentCard :return: AgentDetails. The AgentDetails is compatible with MutableMapping @@ -3080,7 +3240,7 @@ async def patch_agent_details( agent_name: str, body: Union[JSON, IO[bytes]] = _Unset, *, - agent_endpoint: Optional[_models.AgentEndpoint] = None, + agent_endpoint: Optional[_models.AgentEndpointConfig] = None, agent_card: Optional[_models.AgentCard] = None, **kwargs: Any ) -> _models.AgentDetails: @@ -3091,7 +3251,7 @@ async def patch_agent_details( :param body: Is either a JSON type or a IO[bytes] type. Required. :type body: JSON or IO[bytes] :keyword agent_endpoint: The endpoint configuration for the agent. Default value is None. - :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpoint + :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpointConfig :keyword agent_card: Optional agent card for the agent. Default value is None. :paramtype agent_card: ~azure.ai.projects.models.AgentCard :return: AgentDetails. 
The AgentDetails is compatible with MutableMapping @@ -3167,119 +3327,76 @@ async def patch_agent_details( return deserialized # type: ignore @overload - async def create_session( - self, - agent_name: str, - *, - isolation_key: str, - version_indicator: _models.VersionIndicator, - content_type: str = "application/json", - agent_session_id: Optional[str] = None, - **kwargs: Any - ) -> _models.AgentSessionResource: - """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version - from ``version_indicator`` and enforces session ownership using the provided isolation key for - session-mutating operations. - - :param agent_name: The name of the agent to create a session for. Required. - :type agent_name: str - :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership - for session-mutating operations. Required. - :paramtype isolation_key: str - :keyword version_indicator: Determines which agent version backs the session. Required. - :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :keyword agent_session_id: Optional caller-provided session ID. If specified, it must be unique - within the agent endpoint. Auto-generated if omitted. Default value is None. - :paramtype agent_session_id: str - :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.AgentSessionResource - :raises ~azure.core.exceptions.HttpResponseError: - """ + async def create_agent_version_from_code( + self, agent_name: str, body: _models.CreateAgentVersionFromCodeContent, *, code_zip_sha256: str, **kwargs: Any + ) -> _models.AgentVersionDetails: + """create_agent_version_from_code. 
- @overload - async def create_session( - self, agent_name: str, body: JSON, *, isolation_key: str, content_type: str = "application/json", **kwargs: Any - ) -> _models.AgentSessionResource: - """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version - from ``version_indicator`` and enforces session ownership using the provided isolation key for - session-mutating operations. + :param agent_name: The unique name that identifies the agent. Name can be used to + retrieve/update/delete the agent. - :param agent_name: The name of the agent to create a session for. Required. + * Must start and end with alphanumeric characters, + * Can contain hyphens in the middle + * Must not exceed 63 characters. Required. :type agent_name: str :param body: Required. - :type body: JSON - :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership - for session-mutating operations. Required. - :paramtype isolation_key: str - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.AgentSessionResource + :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent + :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change + detection (dedup) and integrity verification. Required. + :paramtype code_zip_sha256: str + :return: AgentVersionDetails. 
The AgentVersionDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentVersionDetails :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def create_session( - self, - agent_name: str, - body: IO[bytes], - *, - isolation_key: str, - content_type: str = "application/json", - **kwargs: Any - ) -> _models.AgentSessionResource: - """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version - from ``version_indicator`` and enforces session ownership using the provided isolation key for - session-mutating operations. + async def create_agent_version_from_code( + self, agent_name: str, body: JSON, *, code_zip_sha256: str, **kwargs: Any + ) -> _models.AgentVersionDetails: + """create_agent_version_from_code. - :param agent_name: The name of the agent to create a session for. Required. + :param agent_name: The unique name that identifies the agent. Name can be used to + retrieve/update/delete the agent. + + * Must start and end with alphanumeric characters, + * Can contain hyphens in the middle + * Must not exceed 63 characters. Required. :type agent_name: str :param body: Required. - :type body: IO[bytes] - :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership - for session-mutating operations. Required. - :paramtype isolation_key: str - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.AgentSessionResource + :type body: JSON + :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change + detection (dedup) and integrity verification. Required. + :paramtype code_zip_sha256: str + :return: AgentVersionDetails. 
The AgentVersionDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentVersionDetails :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace_async - async def create_session( + async def create_agent_version_from_code( self, agent_name: str, - body: Union[JSON, IO[bytes]] = _Unset, + body: Union[_models.CreateAgentVersionFromCodeContent, JSON], *, - isolation_key: str, - version_indicator: _models.VersionIndicator = _Unset, - agent_session_id: Optional[str] = None, + code_zip_sha256: str, **kwargs: Any - ) -> _models.AgentSessionResource: - """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version - from ``version_indicator`` and enforces session ownership using the provided isolation key for - session-mutating operations. + ) -> _models.AgentVersionDetails: + """create_agent_version_from_code. - :param agent_name: The name of the agent to create a session for. Required. + :param agent_name: The unique name that identifies the agent. Name can be used to + retrieve/update/delete the agent. + + * Must start and end with alphanumeric characters, + * Can contain hyphens in the middle + * Must not exceed 63 characters. Required. :type agent_name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership - for session-mutating operations. Required. - :paramtype isolation_key: str - :keyword version_indicator: Determines which agent version backs the session. Required. - :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator - :keyword agent_session_id: Optional caller-provided session ID. If specified, it must be unique - within the agent endpoint. Auto-generated if omitted. Default value is None. - :paramtype agent_session_id: str - :return: AgentSessionResource. 
The AgentSessionResource is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.AgentSessionResource + :param body: Is either a CreateAgentVersionFromCodeContent type or a JSON type. Required. + :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent or JSON + :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change + detection (dedup) and integrity verification. Required. + :paramtype code_zip_sha256: str + :return: AgentVersionDetails. The AgentVersionDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentVersionDetails :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3290,30 +3407,21 @@ async def create_session( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.AgentSessionResource] = kwargs.pop("cls", None) + cls: ClsType[_models.AgentVersionDetails] = kwargs.pop("cls", None) - if body is _Unset: - if version_indicator is _Unset: - raise TypeError("missing required argument: version_indicator") - body = {"agent_session_id": agent_session_id, "version_indicator": version_indicator} - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _body = body.as_dict() if isinstance(body, _Model) else body + _file_fields: list[str] = ["code"] + _data_fields: list[str] = ["metadata"] + _files = prepare_multipart_form_data(_body, _file_fields, _data_fields) - _request = build_beta_agents_create_session_request( + _request = 
build_beta_agents_create_agent_version_from_code_request( agent_name=agent_name, - isolation_key=isolation_key, - content_type=content_type, + code_zip_sha256=code_zip_sha256, api_version=self._config.api_version, - content=_content, + files=_files, headers=_headers, params=_params, ) @@ -3330,7 +3438,7 @@ async def create_session( response = pipeline_response.http_response - if response.status_code not in [201]: + if response.status_code not in [200]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -3346,7 +3454,7 @@ async def create_session( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.AgentSessionResource, response.json()) + deserialized = _deserialize(_models.AgentVersionDetails, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -3354,15 +3462,19 @@ async def create_session( return deserialized # type: ignore @distributed_trace_async - async def get_session(self, agent_name: str, session_id: str, **kwargs: Any) -> _models.AgentSessionResource: - """Retrieves a session by ID. + async def download_agent_version_code( + self, agent_name: str, agent_version: str, **kwargs: Any + ) -> AsyncIterator[bytes]: + """Download the code zip for a specific version of a code-based hosted agent. Returns the + previously-uploaded zip (``application/zip``). The SHA-256 digest of the returned bytes matches + the ``content_hash`` on the agent version's ``code_configuration``. :param agent_name: The name of the agent. Required. :type agent_name: str - :param session_id: The session identifier. Required. - :type session_id: str - :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.AgentSessionResource + :param agent_version: The version of the agent whose code zip should be downloaded. Required. 
+ :type agent_version: str + :return: AsyncIterator[bytes] + :rtype: AsyncIterator[bytes] :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3376,11 +3488,11 @@ async def get_session(self, agent_name: str, session_id: str, **kwargs: Any) -> _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.AgentSessionResource] = kwargs.pop("cls", None) + cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) - _request = build_beta_agents_get_session_request( + _request = build_beta_agents_download_agent_version_code_request( agent_name=agent_name, - session_id=session_id, + agent_version=agent_version, api_version=self._config.api_version, headers=_headers, params=_params, @@ -3391,7 +3503,7 @@ async def get_session(self, agent_name: str, session_id: str, **kwargs: Any) -> _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) + _stream = kwargs.pop("stream", True) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) @@ -3411,30 +3523,27 @@ async def get_session(self, agent_name: str, session_id: str, **kwargs: Any) -> ) raise HttpResponseError(response=response, model=error) - if _stream: - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - else: - deserialized = _deserialize(_models.AgentSessionResource, response.json()) + response_headers = {} + response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + + if cls: + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace_async - async def 
delete_session(self, agent_name: str, session_id: str, *, isolation_key: str, **kwargs: Any) -> None: - """Deletes a session synchronously. Returns 204 No Content when the session is deleted or does not - exist. + async def download_agent_code(self, agent_name: str, **kwargs: Any) -> AsyncIterator[bytes]: + """Download the code zip for the latest version of a code-based hosted agent. Returns the + previously-uploaded zip (``application/zip``). The SHA-256 digest of the returned bytes matches + the ``content_hash`` on the latest version's ``code_configuration``. - :param agent_name: The name of the agent. Required. + :param agent_name: The name of the agent whose latest-version code zip should be downloaded. + Required. :type agent_name: str - :param session_id: The session identifier. Required. - :type session_id: str - :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership - for session-mutating operations. Required. - :paramtype isolation_key: str - :return: None - :rtype: None + :return: AsyncIterator[bytes] + :rtype: AsyncIterator[bytes] :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3448,12 +3557,10 @@ async def delete_session(self, agent_name: str, session_id: str, *, isolation_ke _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[None] = kwargs.pop("cls", None) + cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) - _request = build_beta_agents_delete_session_request( + _request = build_beta_agents_download_agent_code_request( agent_name=agent_name, - session_id=session_id, - isolation_key=isolation_key, api_version=self._config.api_version, headers=_headers, params=_params, @@ -3463,14 +3570,20 @@ async def delete_session(self, agent_name: str, session_id: str, *, isolation_ke } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _stream = False + _decompress = kwargs.pop("decompress", True) + 
_stream = kwargs.pop("stream", True) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [204]: + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -3478,142 +3591,110 @@ async def delete_session(self, agent_name: str, session_id: str, *, isolation_ke ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) + + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + if cls: - return cls(pipeline_response, None, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore - @distributed_trace - def list_sessions( + return deserialized # type: ignore + + @overload + async def create_session( self, agent_name: str, *, - limit: Optional[int] = None, - order: Optional[Union[str, _models.PageOrder]] = None, - before: Optional[str] = None, + version_indicator: _models.VersionIndicator, + content_type: str = "application/json", + agent_session_id: Optional[str] = None, **kwargs: Any - ) -> AsyncItemPaged["_models.AgentSessionResource"]: - """Returns a list of sessions for the specified agent. + ) -> _models.AgentSessionResource: + """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version + from ``version_indicator`` and enforces session ownership using the provided isolation key for + session-mutating operations. - :param agent_name: The name of the agent. Required. 
+ :param agent_name: The name of the agent to create a session for. Required. :type agent_name: str - :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and - 100, and the - default is 20. Default value is None. - :paramtype limit: int - :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for - ascending order and``desc`` - for descending order. Known values are: "asc" and "desc". Default value is None. - :paramtype order: str or ~azure.ai.projects.models.PageOrder - :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your - place in the list. - For instance, if you make a list request and receive 100 objects, ending with obj_foo, your - subsequent call can include before=obj_foo in order to fetch the previous page of the list. - Default value is None. - :paramtype before: str - :return: An iterator like instance of AgentSessionResource - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.AgentSessionResource] + :keyword version_indicator: Determines which agent version backs the session. Required. + :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword agent_session_id: Optional caller-provided session ID. If specified, it must be unique + within the agent endpoint. Auto-generated if omitted. Default value is None. + :paramtype agent_session_id: str + :return: AgentSessionResource. 
The AgentSessionResource is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentSessionResource :raises ~azure.core.exceptions.HttpResponseError: """ - _headers = kwargs.pop("headers", {}) or {} - _params = kwargs.pop("params", {}) or {} - - cls: ClsType[List[_models.AgentSessionResource]] = kwargs.pop("cls", None) - - error_map: MutableMapping = { - 401: ClientAuthenticationError, - 404: ResourceNotFoundError, - 409: ResourceExistsError, - 304: ResourceNotModifiedError, - } - error_map.update(kwargs.pop("error_map", {}) or {}) - - def prepare_request(_continuation_token=None): - _request = build_beta_agents_list_sessions_request( - agent_name=agent_name, - limit=limit, - order=order, - after=_continuation_token, - before=before, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - return _request - - async def extract_data(pipeline_response): - deserialized = pipeline_response.http_response.json() - list_of_elem = _deserialize( - List[_models.AgentSessionResource], - deserialized.get("data", []), - ) - if cls: - list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("last_id") or None, AsyncList(list_of_elem) - - async def get_next(_continuation_token=None): - _request = prepare_request(_continuation_token) - - _stream = False - pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) - response = pipeline_response.http_response + @overload + async def create_session( + self, agent_name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.AgentSessionResource: + """Creates a new session for an agent endpoint. 
The endpoint resolves the backing agent version + from ``version_indicator`` and enforces session ownership using the provided isolation key for + session-mutating operations. - if response.status_code not in [200]: - map_error(status_code=response.status_code, response=response, error_map=error_map) - error = _failsafe_deserialize( - _models.ApiErrorResponse, - response, - ) - raise HttpResponseError(response=response, model=error) + :param agent_name: The name of the agent to create a session for. Required. + :type agent_name: str + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentSessionResource + :raises ~azure.core.exceptions.HttpResponseError: + """ - return pipeline_response + @overload + async def create_session( + self, agent_name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.AgentSessionResource: + """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version + from ``version_indicator`` and enforces session ownership using the provided isolation key for + session-mutating operations. - return AsyncItemPaged(get_next, extract_data) + :param agent_name: The name of the agent to create a session for. Required. + :type agent_name: str + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: AgentSessionResource. 
The AgentSessionResource is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentSessionResource + :raises ~azure.core.exceptions.HttpResponseError: + """ @distributed_trace_async - async def get_session_log_stream( - self, agent_name: str, agent_version: str, session_id: str, **kwargs: Any - ) -> _models.SessionLogEvent: - """Streams console logs (stdout / stderr) for a specific hosted agent session - as a Server-Sent Events (SSE) stream. - - Each SSE frame contains: - - * `event`: always `"log"` - * `data`: a plain-text log line (currently JSON-formatted, but the schema is not contractual and may include additional keys or change format over time; clients should treat it as an opaque string) - - Example SSE frames: - - .. code-block:: - - event: log - data: {"timestamp":"2026-03-10T09:33:17.121Z","stream":"stdout","message":"Starting FoundryCBAgent server on port 8088"} - - event: log - data: {"timestamp":"2026-03-10T09:33:17.130Z","stream":"stderr","message":"INFO: Application startup complete."} - - event: log - data: {"timestamp":"2026-03-10T09:34:52.714Z","stream":"status","message":"Successfully connected to container"} - - event: log - data: {"timestamp":"2026-03-10T09:35:52.714Z","stream":"status","message":"No logs since last 60 seconds"} - - The stream remains open until the client disconnects or the server - terminates the connection. Clients should handle reconnection as needed. + async def create_session( + self, + agent_name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + version_indicator: _models.VersionIndicator = _Unset, + agent_session_id: Optional[str] = None, + **kwargs: Any + ) -> _models.AgentSessionResource: + """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version + from ``version_indicator`` and enforces session ownership using the provided isolation key for + session-mutating operations. - :param agent_name: The name of the hosted agent. Required. 
+ :param agent_name: The name of the agent to create a session for. Required. :type agent_name: str - :param agent_version: The version of the agent. Required. - :type agent_version: str - :param session_id: The session ID (maps to an ADC sandbox). Required. - :type session_id: str - :return: SessionLogEvent. The SessionLogEvent is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SessionLogEvent + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword version_indicator: Determines which agent version backs the session. Required. + :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator + :keyword agent_session_id: Optional caller-provided session ID. If specified, it must be unique + within the agent endpoint. Auto-generated if omitted. Default value is None. + :paramtype agent_session_id: str + :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentSessionResource :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3624,16 +3705,29 @@ async def get_session_log_stream( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.SessionLogEvent] = kwargs.pop("cls", None) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.AgentSessionResource] = kwargs.pop("cls", None) - _request = build_beta_agents_get_session_log_stream_request( + if body is _Unset: + if version_indicator is _Unset: + raise TypeError("missing required argument: version_indicator") + body = {"agent_session_id": agent_session_id, "version_indicator": version_indicator} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or 
"application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_agents_create_session_request( agent_name=agent_name, - agent_version=agent_version, - session_id=session_id, + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -3643,14 +3737,14 @@ async def get_session_log_stream( _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = True + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -3663,38 +3757,26 @@ async def get_session_log_stream( ) raise HttpResponseError(response=response, model=error) - response_headers = {} - response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) - if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.SessionLogEvent, response.text()) + deserialized = _deserialize(_models.AgentSessionResource, response.json()) if cls: - return cls(pipeline_response, deserialized, response_headers) # type: ignore + return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore @distributed_trace_async - async def _upload_session_file( - self, agent_name: str, agent_session_id: str, content: bytes, *, path: str, **kwargs: Any - ) -> _models.SessionFileWriteResponse: - """Upload a file to the session sandbox via binary stream. Maximum file size is 50 MB. 
Uploads - exceeding this limit return 413 Payload Too Large. + async def get_session(self, agent_name: str, session_id: str, **kwargs: Any) -> _models.AgentSessionResource: + """Retrieves a session by ID. :param agent_name: The name of the agent. Required. :type agent_name: str - :param agent_session_id: The session ID. Required. - :type agent_session_id: str - :param content: Required. - :type content: bytes - :keyword path: The destination file path within the sandbox, relative to the session home - directory. Required. - :paramtype path: str - :return: SessionFileWriteResponse. The SessionFileWriteResponse is compatible with - MutableMapping - :rtype: ~azure.ai.projects.models.SessionFileWriteResponse + :param session_id: The session identifier. Required. + :type session_id: str + :return: AgentSessionResource. The AgentSessionResource is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.AgentSessionResource :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3705,21 +3787,15 @@ async def _upload_session_file( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: str = kwargs.pop("content_type", _headers.pop("Content-Type", "application/octet-stream")) - cls: ClsType[_models.SessionFileWriteResponse] = kwargs.pop("cls", None) - - _content = content + cls: ClsType[_models.AgentSessionResource] = kwargs.pop("cls", None) - _request = build_beta_agents_upload_session_file_request( + _request = build_beta_agents_get_session_request( agent_name=agent_name, - agent_session_id=agent_session_id, - path=path, - content_type=content_type, + session_id=session_id, api_version=self._config.api_version, - content=_content, headers=_headers, params=_params, ) @@ -3736,7 +3812,7 @@ async def _upload_session_file( response = pipeline_response.http_response - 
if response.status_code not in [201]: + if response.status_code not in [200]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -3752,7 +3828,7 @@ async def _upload_session_file( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.SessionFileWriteResponse, response.json()) + deserialized = _deserialize(_models.AgentSessionResource, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -3760,20 +3836,16 @@ async def _upload_session_file( return deserialized # type: ignore @distributed_trace_async - async def download_session_file( - self, agent_name: str, agent_session_id: str, *, path: str, **kwargs: Any - ) -> AsyncIterator[bytes]: - """Download a file from the session sandbox as a binary stream. + async def delete_session(self, agent_name: str, session_id: str, **kwargs: Any) -> None: + """Deletes a session synchronously. Returns 204 No Content when the session is deleted or does not + exist. :param agent_name: The name of the agent. Required. :type agent_name: str - :param agent_session_id: The session ID. Required. - :type agent_session_id: str - :keyword path: The file path to download from the sandbox, relative to the session home - directory. Required. - :paramtype path: str - :return: AsyncIterator[bytes] - :rtype: AsyncIterator[bytes] + :param session_id: The session identifier. Required. 
+ :type session_id: str + :return: None + :rtype: None :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3787,12 +3859,11 @@ async def download_session_file( _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) + cls: ClsType[None] = kwargs.pop("cls", None) - _request = build_beta_agents_download_session_file_request( + _request = build_beta_agents_delete_session_request( agent_name=agent_name, - agent_session_id=agent_session_id, - path=path, + session_id=session_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -3802,20 +3873,14 @@ async def download_session_file( } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", True) + _stream = False pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [200]: - if _stream: - try: - await response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass + if response.status_code not in [204]: map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -3823,29 +3888,142 @@ async def download_session_file( ) raise HttpResponseError(response=response, model=error) - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, None, {}) # type: ignore - return deserialized # type: ignore + @distributed_trace + def list_sessions( + self, + agent_name: str, + *, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = 
None, + before: Optional[str] = None, + **kwargs: Any + ) -> AsyncItemPaged["_models.AgentSessionResource"]: + """Returns a list of sessions for the specified agent. + + :param agent_name: The name of the agent. Required. + :type agent_name: str + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None.
+ :paramtype before: str + :return: An iterator like instance of AgentSessionResource + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.AgentSessionResource] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[List[_models.AgentSessionResource]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(_continuation_token=None): + + _request = build_beta_agents_list_sessions_request( + agent_name=agent_name, + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + return _request + + async def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.AgentSessionResource], + deserialized.get("data", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("last_id") or None, AsyncList(list_of_elem) + + async def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) + + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response + + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + 
response, + ) + raise HttpResponseError(response=response, model=error) + + return pipeline_response + + return AsyncItemPaged(get_next, extract_data) @distributed_trace_async - async def get_session_files( - self, agent_name: str, agent_session_id: str, *, path: str, **kwargs: Any - ) -> _models.SessionDirectoryListResponse: - """List files and directories at a given path in the session sandbox. Returns only the immediate - children of the specified directory (non-recursive). + async def get_session_log_stream( + self, agent_name: str, agent_version: str, session_id: str, **kwargs: Any + ) -> _models.SessionLogEvent: + """Streams console logs (stdout / stderr) for a specific hosted agent session + as a Server-Sent Events (SSE) stream. - :param agent_name: The name of the agent. Required. + Each SSE frame contains: + + * `event`: always `"log"` + * `data`: a plain-text log line (currently JSON-formatted, but the schema is not contractual and may include additional keys or change format over time; clients should treat it as an opaque string) + + Example SSE frames: + + .. code-block:: + + event: log + data: {"timestamp":"2026-03-10T09:33:17.121Z","stream":"stdout","message":"Starting FoundryCBAgent server on port 8088"} + + event: log + data: {"timestamp":"2026-03-10T09:33:17.130Z","stream":"stderr","message":"INFO: Application startup complete."} + + event: log + data: {"timestamp":"2026-03-10T09:34:52.714Z","stream":"status","message":"Successfully connected to container"} + + event: log + data: {"timestamp":"2026-03-10T09:35:52.714Z","stream":"status","message":"No logs since last 60 seconds"} + + The stream remains open until the client disconnects or the server + terminates the connection. Clients should handle reconnection as needed. + + :param agent_name: The name of the hosted agent. Required. :type agent_name: str - :param agent_session_id: The session ID. Required. 
- :type agent_session_id: str - :keyword path: The directory path to list, relative to the session home directory. Required. - :paramtype path: str - :return: SessionDirectoryListResponse. The SessionDirectoryListResponse is compatible with - MutableMapping - :rtype: ~azure.ai.projects.models.SessionDirectoryListResponse + :param agent_version: The version of the agent. Required. + :type agent_version: str + :param session_id: The session ID (maps to an ADC sandbox). Required. + :type session_id: str + :return: SessionLogEvent. The SessionLogEvent is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SessionLogEvent :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -3859,12 +4037,12 @@ async def get_session_files( _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.SessionDirectoryListResponse] = kwargs.pop("cls", None) + cls: ClsType[_models.SessionLogEvent] = kwargs.pop("cls", None) - _request = build_beta_agents_get_session_files_request( + _request = build_beta_agents_get_session_log_stream_request( agent_name=agent_name, - agent_session_id=agent_session_id, - path=path, + agent_version=agent_version, + session_id=session_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -3875,7 +4053,7 @@ async def get_session_files( _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) + _stream = True pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) @@ -3895,25 +4073,257 @@ async def get_session_files( ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) + if _stream: deserialized = response.iter_bytes() if 
_decompress else response.iter_raw() else: - deserialized = _deserialize(_models.SessionDirectoryListResponse, response.json()) + deserialized = _deserialize(_models.SessionLogEvent, response.text()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace_async - async def delete_session_file( - self, agent_name: str, agent_session_id: str, *, path: str, recursive: Optional[bool] = None, **kwargs: Any - ) -> None: - """Delete a file or directory from the session sandbox. If ``recursive`` is false (default) and - the target is a non-empty directory, the API returns 409 Conflict. - - :param agent_name: The name of the agent. Required. - :type agent_name: str + async def _upload_session_file( + self, agent_name: str, agent_session_id: str, content: bytes, *, path: str, **kwargs: Any + ) -> _models.SessionFileWriteResponse: + """Upload a file to the session sandbox via binary stream. Maximum file size is 50 MB. Uploads + exceeding this limit return 413 Payload Too Large. + + :param agent_name: The name of the agent. Required. + :type agent_name: str + :param agent_session_id: The session ID. Required. + :type agent_session_id: str + :param content: Required. + :type content: bytes + :keyword path: The destination file path within the sandbox, relative to the session home + directory. Required. + :paramtype path: str + :return: SessionFileWriteResponse. 
The SessionFileWriteResponse is compatible with + MutableMapping + :rtype: ~azure.ai.projects.models.SessionFileWriteResponse + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: str = kwargs.pop("content_type", _headers.pop("Content-Type", "application/octet-stream")) + cls: ClsType[_models.SessionFileWriteResponse] = kwargs.pop("cls", None) + + _content = content + + _request = build_beta_agents_upload_session_file_request( + agent_name=agent_name, + agent_session_id=agent_session_id, + path=path, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [201]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() 
+ else: + deserialized = _deserialize(_models.SessionFileWriteResponse, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace_async + async def download_session_file( + self, agent_name: str, agent_session_id: str, *, path: str, **kwargs: Any + ) -> AsyncIterator[bytes]: + """Download a file from the session sandbox as a binary stream. + + :param agent_name: The name of the agent. Required. + :type agent_name: str + :param agent_session_id: The session ID. Required. + :type agent_session_id: str + :keyword path: The file path to download from the sandbox, relative to the session home + directory. Required. + :paramtype path: str + :return: AsyncIterator[bytes] + :rtype: AsyncIterator[bytes] + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) + + _request = build_beta_agents_download_session_file_request( + agent_name=agent_name, + agent_session_id=agent_session_id, + path=path, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", True) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + 
if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace_async + async def get_session_files( + self, agent_name: str, agent_session_id: str, *, path: str, **kwargs: Any + ) -> _models.SessionDirectoryListResponse: + """List files and directories at a given path in the session sandbox. Returns only the immediate + children of the specified directory (non-recursive). + + :param agent_name: The name of the agent. Required. + :type agent_name: str + :param agent_session_id: The session ID. Required. + :type agent_session_id: str + :keyword path: The directory path to list, relative to the session home directory. Required. + :paramtype path: str + :return: SessionDirectoryListResponse. 
The SessionDirectoryListResponse is compatible with + MutableMapping + :rtype: ~azure.ai.projects.models.SessionDirectoryListResponse + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.SessionDirectoryListResponse] = kwargs.pop("cls", None) + + _request = build_beta_agents_get_session_files_request( + agent_name=agent_name, + agent_session_id=agent_session_id, + path=path, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.SessionDirectoryListResponse, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return 
deserialized # type: ignore + + @distributed_trace_async + async def delete_session_file( + self, agent_name: str, agent_session_id: str, *, path: str, recursive: Optional[bool] = None, **kwargs: Any + ) -> None: + """Delete a file or directory from the session sandbox. If ``recursive`` is false (default) and + the target is a non-empty directory, the API returns 409 Conflict. + + :param agent_name: The name of the agent. Required. + :type agent_name: str :param agent_session_id: The session ID. Required. :type agent_session_id: str :keyword path: The file or directory path to delete, relative to the session home directory. @@ -4196,14 +4606,14 @@ async def delete(self, name: str, **kwargs: Any) -> None: @overload async def create( - self, name: str, body: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any + self, name: str, taxonomy: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any ) -> _models.EvaluationTaxonomy: """Create an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Required. - :type body: ~azure.ai.projects.models.EvaluationTaxonomy + :param taxonomy: The evaluation taxonomy. Required. + :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -4214,14 +4624,14 @@ async def create( @overload async def create( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + self, name: str, taxonomy: JSON, *, content_type: str = "application/json", **kwargs: Any ) -> _models.EvaluationTaxonomy: """Create an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Required. - :type body: JSON + :param taxonomy: The evaluation taxonomy. 
Required. + :type taxonomy: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -4232,14 +4642,14 @@ async def create( @overload async def create( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + self, name: str, taxonomy: IO[bytes], *, content_type: str = "application/json", **kwargs: Any ) -> _models.EvaluationTaxonomy: """Create an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Required. - :type body: IO[bytes] + :param taxonomy: The evaluation taxonomy. Required. + :type taxonomy: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str @@ -4250,15 +4660,15 @@ async def create( @distributed_trace_async async def create( - self, name: str, body: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any + self, name: str, taxonomy: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any ) -> _models.EvaluationTaxonomy: """Create an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, JSON, - IO[bytes] Required. - :type body: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes] + :param taxonomy: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, + JSON, IO[bytes] Required. + :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes] :return: EvaluationTaxonomy. 
The EvaluationTaxonomy is compatible with MutableMapping :rtype: ~azure.ai.projects.models.EvaluationTaxonomy :raises ~azure.core.exceptions.HttpResponseError: @@ -4279,10 +4689,10 @@ async def create( content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(taxonomy, (IOBase, bytes)): + _content = taxonomy else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(taxonomy, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore _request = build_beta_evaluation_taxonomies_create_request( name=name, @@ -4326,14 +4736,14 @@ async def create( @overload async def update( - self, name: str, body: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any + self, name: str, taxonomy: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any ) -> _models.EvaluationTaxonomy: """Update an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Required. - :type body: ~azure.ai.projects.models.EvaluationTaxonomy + :param taxonomy: The evaluation taxonomy. Required. + :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -4344,14 +4754,14 @@ async def update( @overload async def update( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + self, name: str, taxonomy: JSON, *, content_type: str = "application/json", **kwargs: Any ) -> _models.EvaluationTaxonomy: """Update an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Required. - :type body: JSON + :param taxonomy: The evaluation taxonomy. Required. 
+ :type taxonomy: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -4362,14 +4772,14 @@ async def update( @overload async def update( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + self, name: str, taxonomy: IO[bytes], *, content_type: str = "application/json", **kwargs: Any ) -> _models.EvaluationTaxonomy: """Update an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Required. - :type body: IO[bytes] + :param taxonomy: The evaluation taxonomy. Required. + :type taxonomy: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str @@ -4380,15 +4790,15 @@ async def update( @distributed_trace_async async def update( - self, name: str, body: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any + self, name: str, taxonomy: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any ) -> _models.EvaluationTaxonomy: """Update an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, JSON, - IO[bytes] Required. - :type body: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes] + :param taxonomy: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, + JSON, IO[bytes] Required. + :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes] :return: EvaluationTaxonomy. 
The EvaluationTaxonomy is compatible with MutableMapping :rtype: ~azure.ai.projects.models.EvaluationTaxonomy :raises ~azure.core.exceptions.HttpResponseError: @@ -4409,10 +4819,10 @@ async def update( content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(taxonomy, (IOBase, bytes)): + _content = taxonomy else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(taxonomy, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore _request = build_beta_evaluation_taxonomies_update_request( name=name, @@ -5089,84 +5499,103 @@ async def update_version( return deserialized # type: ignore - -class BetaInsightsOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.aio.AIProjectClient`'s - :attr:`insights` attribute. - """ - - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") - @overload - async def generate( - self, insight: _models.Insight, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Insight: - """Generate Insights. + async def create_generation_job( + self, + job: _models.EvaluatorGenerationJob, + *, + operation_id: Optional[str] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. 
- :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Required. - :type insight: ~azure.ai.projects.models.Insight + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Required. + :type job: ~azure.ai.projects.models.EvaluatorGenerationJob + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def generate( - self, insight: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Insight: - """Generate Insights. - - :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Required. - :type insight: JSON + async def create_generation_job( + self, job: JSON, *, operation_id: Optional[str] = None, content_type: str = "application/json", **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. + + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Required. + :type job: JSON + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. 
+ :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def generate( - self, insight: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Insight: - """Generate Insights. + async def create_generation_job( + self, + job: IO[bytes], + *, + operation_id: Optional[str] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. - :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Required. - :type insight: IO[bytes] + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Required. + :type job: IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + :return: EvaluatorGenerationJob. 
The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace_async - async def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwargs: Any) -> _models.Insight: - """Generate Insights. + async def create_generation_job( + self, + job: Union[_models.EvaluatorGenerationJob, JSON, IO[bytes]], + *, + operation_id: Optional[str] = None, + **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. - :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Is one of the following types: Insight, JSON, IO[bytes] Required. - :type insight: ~azure.ai.projects.models.Insight or JSON or IO[bytes] - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Is one of the following types: EvaluatorGenerationJob, JSON, + IO[bytes] Required. + :type job: ~azure.ai.projects.models.EvaluatorGenerationJob or JSON or IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str + :return: EvaluatorGenerationJob. 
The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -5181,16 +5610,17 @@ async def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwa _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.Insight] = kwargs.pop("cls", None) + cls: ClsType[_models.EvaluatorGenerationJob] = kwargs.pop("cls", None) content_type = content_type or "application/json" _content = None - if isinstance(insight, (IOBase, bytes)): - _content = insight + if isinstance(job, (IOBase, bytes)): + _content = job else: - _content = json.dumps(insight, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(job, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_insights_generate_request( + _request = build_beta_evaluators_create_generation_job_request( + operation_id=operation_id, content_type=content_type, api_version=self._config.api_version, content=_content, @@ -5223,29 +5653,30 @@ async def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwa ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) + response_headers["Location"] = self._deserialize("str", response.headers.get("Location")) + if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.Insight, response.json()) + deserialized = _deserialize(_models.EvaluatorGenerationJob, response.json()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore 
@distributed_trace_async - async def get( - self, insight_id: str, *, include_coordinates: Optional[bool] = None, **kwargs: Any - ) -> _models.Insight: - """Get a specific insight by Id. + async def get_generation_job(self, job_id: str, **kwargs: Any) -> _models.EvaluatorGenerationJob: + """Get info about an evaluator generation job. - :param insight_id: The unique identifier for the insights report. Required. - :type insight_id: str - :keyword include_coordinates: Whether to include coordinates for visualization in the response. - Defaults to false. Default value is None. - :paramtype include_coordinates: bool - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + Gets the details of an evaluator generation job by its ID. + + :param job_id: The ID of the job. Required. + :type job_id: str + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -5259,11 +5690,10 @@ async def get( _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.Insight] = kwargs.pop("cls", None) + cls: ClsType[_models.EvaluatorGenerationJob] = kwargs.pop("cls", None) - _request = build_beta_insights_get_request( - insight_id=insight_id, - include_coordinates=include_coordinates, + _request = build_beta_evaluators_get_generation_job_request( + job_id=job_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -5294,49 +5724,59 @@ async def get( ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Retry-After"] = self._deserialize("int", response.headers.get("Retry-After")) + if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.Insight, response.json()) + 
deserialized = _deserialize(_models.EvaluatorGenerationJob, response.json()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace - def list( + def list_generation_jobs( self, *, - type: Optional[Union[str, _models.InsightType]] = None, - eval_id: Optional[str] = None, - run_id: Optional[str] = None, - agent_name: Optional[str] = None, - include_coordinates: Optional[bool] = None, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, + category: Optional[Union[str, _models.EvaluatorCategory]] = None, **kwargs: Any - ) -> AsyncItemPaged["_models.Insight"]: - """List all insights in reverse chronological order (newest first). + ) -> AsyncItemPaged["_models.EvaluatorGenerationJob"]: + """Returns a list of evaluator generation jobs. - :keyword type: Filter by the type of analysis. Known values are: "EvaluationRunClusterInsight", - "AgentClusterInsight", and "EvaluationComparison". Default value is None. - :paramtype type: str or ~azure.ai.projects.models.InsightType - :keyword eval_id: Filter by the evaluation ID. Default value is None. - :paramtype eval_id: str - :keyword run_id: Filter by the evaluation run ID. Default value is None. - :paramtype run_id: str - :keyword agent_name: Filter by the agent name. Default value is None. - :paramtype agent_name: str - :keyword include_coordinates: Whether to include coordinates for visualization in the response. - Defaults to false. Default value is None. - :paramtype include_coordinates: bool - :return: An iterator like instance of Insight - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.Insight] + Returns a list of evaluator generation jobs. + + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. 
+ Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None. + :paramtype before: str + :keyword category: Filter evaluator generation jobs by category. Known values are: "quality", + "safety", and "agents". Default value is None. + :paramtype category: str or ~azure.ai.projects.models.EvaluatorCategory + :return: An iterator like instance of EvaluatorGenerationJob + :rtype: + ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.EvaluatorGenerationJob] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.Insight]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.EvaluatorGenerationJob]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -5346,63 +5786,36 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_insights_list_request( - type=type, - eval_id=eval_id, - run_id=run_id, - agent_name=agent_name, - include_coordinates=include_coordinates, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url,
**path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) + def prepare_request(_continuation_token=None): + _request = build_beta_evaluators_list_generation_jobs_request( + limit=limit, + order=order, + after=_continuation_token, + before=before, + category=category, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) return _request async def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.Insight], - deserialized.get("value", []), + List[_models.EvaluatorGenerationJob], + deserialized.get("data", []), ) if cls: list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, AsyncList(list_of_elem) + return deserialized.get("last_id") or None, AsyncList(list_of_elem) - async def get_next(next_link=None): - _request = prepare_request(next_link) + async def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) _stream = False pipeline_response: PipelineResponse = await 
self._client._pipeline.run( # pylint: disable=protected-access @@ -5422,112 +5835,82 @@ async def get_next(next_link=None): return AsyncItemPaged(get_next, extract_data) + @distributed_trace_async + async def cancel_generation_job(self, job_id: str, **kwargs: Any) -> _models.EvaluatorGenerationJob: + """Cancels an evaluator generation job. -class BetaMemoryStoresOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.aio.AIProjectClient`'s - :attr:`memory_stores` attribute. - """ - - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") - - @overload - async def create( - self, - *, - name: str, - definition: _models.MemoryStoreDefinition, - content_type: str = "application/json", - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. + Cancels an evaluator generation job by its ID. - :keyword name: The name of the memory store. Required. - :paramtype name: str - :keyword definition: The memory store definition. Required. - :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :keyword description: A human-readable description of the memory store. Default value is None. 
- :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param job_id: The ID of the job to cancel. Required. + :type job_id: str + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) - @overload - async def create( - self, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} - :param body: Required. - :type body: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails - :raises ~azure.core.exceptions.HttpResponseError: - """ + cls: ClsType[_models.EvaluatorGenerationJob] = kwargs.pop("cls", None) - @overload - async def create( - self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. 
+ _request = build_beta_evaluators_cancel_generation_job_request( + job_id=job_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - :param body: Required. - :type body: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails - :raises ~azure.core.exceptions.HttpResponseError: - """ + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.EvaluatorGenerationJob, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore @distributed_trace_async - async def create( - self, - body: Union[JSON, IO[bytes]] = _Unset, - *, - name: str = _Unset, - definition: _models.MemoryStoreDefinition = _Unset, - description: Optional[str] = 
None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. + async def delete_generation_job(self, job_id: str, **kwargs: Any) -> None: + """Deletes an evaluator generation job by its ID. Deletes the job record only; the generated + evaluator (if any) is preserved. - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword name: The name of the memory store. Required. - :paramtype name: str - :keyword definition: The memory store definition. Required. - :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition - :keyword description: A human-readable description of the memory store. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param job_id: The ID of the job to delete. Required. 
+ :type job_id: str + :return: None + :rtype: None :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -5538,30 +5921,14 @@ async def create( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) - - if body is _Unset: - if name is _Unset: - raise TypeError("missing required argument: name") - if definition is _Unset: - raise TypeError("missing required argument: definition") - body = {"definition": definition, "description": description, "metadata": metadata, "name": name} - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + cls: ClsType[None] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_create_request( - content_type=content_type, + _request = build_beta_evaluators_delete_generation_job_request( + job_id=job_id, api_version=self._config.api_version, - content=_content, headers=_headers, params=_params, ) @@ -5570,20 +5937,14 @@ async def create( } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) + _stream = False pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [200]: - if _stream: - try: - await response.read() # Load the body in memory and close the socket - except (StreamConsumedError, 
StreamClosedError): - pass + if response.status_code not in [204]: map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -5591,102 +5952,87 @@ async def create( ) raise HttpResponseError(response=response, model=error) - if _stream: - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - else: - deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, None, {}) # type: ignore - return deserialized # type: ignore + +class BetaInsightsOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.aio.AIProjectClient`'s + :attr:`insights` attribute. + """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") @overload - async def update( - self, - name: str, - *, - content_type: str = "application/json", - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + async def generate( + self, insight: _models.Insight, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Required. 
+ :type insight: ~azure.ai.projects.models.Insight :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :keyword description: A human-readable description of the memory store. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :return: Insight. The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def update( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + async def generate( + self, insight: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Required. - :type body: JSON + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Required. + :type insight: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :return: Insight. 
The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def update( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + async def generate( + self, insight: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Required. - :type body: IO[bytes] + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Required. + :type insight: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :return: Insight. The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace_async - async def update( - self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, - *, - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + async def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwargs: Any) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword description: A human-readable description of the memory store. Default value is None. 
- :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Is one of the following types: Insight, JSON, IO[bytes] Required. + :type insight: ~azure.ai.projects.models.Insight or JSON or IO[bytes] + :return: Insight. The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -5701,20 +6047,16 @@ async def update( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) + cls: ClsType[_models.Insight] = kwargs.pop("cls", None) - if body is _Unset: - body = {"description": description, "metadata": metadata} - body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(insight, (IOBase, bytes)): + _content = insight else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(insight, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_memory_stores_update_request( - name=name, + _request = build_beta_insights_generate_request( content_type=content_type, api_version=self._config.api_version, content=_content, @@ -5734,7 +6076,7 @@ async def update( response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: await response.read() # Load the body 
in memory and close the socket @@ -5750,7 +6092,7 @@ async def update( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) + deserialized = _deserialize(_models.Insight, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -5758,13 +6100,18 @@ async def update( return deserialized # type: ignore @distributed_trace_async - async def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: - """Retrieve a memory store. + async def get( + self, insight_id: str, *, include_coordinates: Optional[bool] = None, **kwargs: Any + ) -> _models.Insight: + """Get a specific insight by Id. - :param name: The name of the memory store to retrieve. Required. - :type name: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param insight_id: The unique identifier for the insights report. Required. + :type insight_id: str + :keyword include_coordinates: Whether to include coordinates for visualization in the response. + Defaults to false. Default value is None. + :paramtype include_coordinates: bool + :return: Insight. 
The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -5778,10 +6125,11 @@ async def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) + cls: ClsType[_models.Insight] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_get_request( - name=name, + _request = build_beta_insights_get_request( + insight_id=insight_id, + include_coordinates=include_coordinates, api_version=self._config.api_version, headers=_headers, params=_params, @@ -5815,7 +6163,7 @@ async def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) + deserialized = _deserialize(_models.Insight, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -5826,35 +6174,35 @@ async def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: def list( self, *, - limit: Optional[int] = None, - order: Optional[Union[str, _models.PageOrder]] = None, - before: Optional[str] = None, + type: Optional[Union[str, _models.InsightType]] = None, + eval_id: Optional[str] = None, + run_id: Optional[str] = None, + agent_name: Optional[str] = None, + include_coordinates: Optional[bool] = None, **kwargs: Any - ) -> AsyncItemPaged["_models.MemoryStoreDetails"]: - """List all memory stores. + ) -> AsyncItemPaged["_models.Insight"]: + """List all insights in reverse chronological order (newest first). - :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and - 100, and the - default is 20. Default value is None. 
- :paramtype limit: int - :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for - ascending order and``desc`` - for descending order. Known values are: "asc" and "desc". Default value is None. - :paramtype order: str or ~azure.ai.projects.models.PageOrder - :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your - place in the list. - For instance, if you make a list request and receive 100 objects, ending with obj_foo, your - subsequent call can include before=obj_foo in order to fetch the previous page of the list. - Default value is None. - :paramtype before: str - :return: An iterator like instance of MemoryStoreDetails - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.MemoryStoreDetails] + :keyword type: Filter by the type of analysis. Known values are: "EvaluationRunClusterInsight", + "AgentClusterInsight", and "EvaluationComparison". Default value is None. + :paramtype type: str or ~azure.ai.projects.models.InsightType + :keyword eval_id: Filter by the evaluation ID. Default value is None. + :paramtype eval_id: str + :keyword run_id: Filter by the evaluation run ID. Default value is None. + :paramtype run_id: str + :keyword agent_name: Filter by the agent name. Default value is None. + :paramtype agent_name: str + :keyword include_coordinates: Whether to include coordinates for visualization in the response. + Defaults to false. Default value is None. 
+ :paramtype include_coordinates: bool + :return: An iterator like instance of Insight + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.Insight] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.MemoryStoreDetails]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.Insight]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -5864,35 +6212,63 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(_continuation_token=None): + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_insights_list_request( + type=type, + eval_id=eval_id, + run_id=run_id, + agent_name=agent_name, + include_coordinates=include_coordinates, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - _request = build_beta_memory_stores_list_request( - limit=limit, - 
order=order, - after=_continuation_token, - before=before, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) return _request async def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.MemoryStoreDetails], - deserialized.get("data", []), + List[_models.Insight], + deserialized.get("value", []), ) if cls: list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("last_id") or None, AsyncList(list_of_elem) + return deserialized.get("nextLink") or None, AsyncList(list_of_elem) - async def get_next(_continuation_token=None): - _request = prepare_request(_continuation_token) + async def get_next(next_link=None): + _request = prepare_request(next_link) _stream = False pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access @@ -5912,14 +6288,112 @@ async def get_next(_continuation_token=None): return AsyncItemPaged(get_next, extract_data) + +class BetaMemoryStoresOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.aio.AIProjectClient`'s + :attr:`memory_stores` attribute. 
+ """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + + @overload + async def create( + self, + *, + name: str, + definition: _models.MemoryStoreDefinition, + content_type: str = "application/json", + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. + + :keyword name: The name of the memory store. Required. + :paramtype name: str + :keyword definition: The memory store definition. Required. + :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the memory store. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def create( + self, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. + + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". 
+ :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def create( + self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. + + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + @distributed_trace_async - async def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult: - """Delete a memory store. + async def create( + self, + body: Union[JSON, IO[bytes]] = _Unset, + *, + name: str = _Unset, + definition: _models.MemoryStoreDefinition = _Unset, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. - :param name: The name of the memory store to delete. Required. - :type name: str - :return: DeleteMemoryStoreResult. The DeleteMemoryStoreResult is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.DeleteMemoryStoreResult + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword name: The name of the memory store. Required. + :paramtype name: str + :keyword definition: The memory store definition. Required. + :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition + :keyword description: A human-readable description of the memory store. Default value is None. 
+ :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -5930,14 +6404,30 @@ async def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreRes } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.DeleteMemoryStoreResult] = kwargs.pop("cls", None) - - _request = build_beta_memory_stores_delete_request( - name=name, + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) + + if body is _Unset: + if name is _Unset: + raise TypeError("missing required argument: name") + if definition is _Unset: + raise TypeError("missing required argument: definition") + body = {"definition": definition, "description": description, "metadata": metadata, "name": name} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_memory_stores_create_request( + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -5970,7 +6460,7 @@ async def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreRes if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = 
_deserialize(_models.DeleteMemoryStoreResult, response.json()) + deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -5978,56 +6468,91 @@ async def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreRes return deserialized # type: ignore @overload - async def _search_memories( + async def update( self, name: str, *, - scope: str, content_type: str = "application/json", - items: Optional[List[dict[str, Any]]] = None, - previous_search_id: Optional[str] = None, - options: Optional[_models.MemorySearchOptions] = None, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> _models.MemoryStoreSearchResult: ... + ) -> _models.MemoryStoreDetails: + """Update a memory store. + + :param name: The name of the memory store to update. Required. + :type name: str + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the memory store. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + @overload - async def _search_memories( + async def update( self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreSearchResult: ... + ) -> _models.MemoryStoreDetails: + """Update a memory store. + + :param name: The name of the memory store to update. Required. + :type name: str + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. 
Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + @overload - async def _search_memories( + async def update( self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreSearchResult: ... + ) -> _models.MemoryStoreDetails: + """Update a memory store. + + :param name: The name of the memory store to update. Required. + :type name: str + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ @distributed_trace_async - async def _search_memories( + async def update( self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, - scope: str = _Unset, - items: Optional[List[dict[str, Any]]] = None, - previous_search_id: Optional[str] = None, - options: Optional[_models.MemorySearchOptions] = None, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> _models.MemoryStoreSearchResult: - """Search for relevant memories from a memory store based on conversation context. + ) -> _models.MemoryStoreDetails: + """Update a memory store. - :param name: The name of the memory store to search. Required. + :param name: The name of the memory store to update. Required. :type name: str :param body: Is either a JSON type or a IO[bytes] type. Required. :type body: JSON or IO[bytes] - :keyword scope: The namespace that logically groups and isolates memories, such as a user ID. 
- Required. - :paramtype scope: str - :keyword items: Items for which to search for relevant memories. Default value is None. - :paramtype items: list[dict[str, any]] - :keyword previous_search_id: The unique ID of the previous search request, enabling incremental - memory search from where the last operation left off. Default value is None. - :paramtype previous_search_id: str - :keyword options: Memory search options. Default value is None. - :paramtype options: ~azure.ai.projects.models.MemorySearchOptions - :return: MemoryStoreSearchResult. The MemoryStoreSearchResult is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreSearchResult + :keyword description: A human-readable description of the memory store. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -6042,17 +6567,10 @@ async def _search_memories( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreSearchResult] = kwargs.pop("cls", None) + cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) if body is _Unset: - if scope is _Unset: - raise TypeError("missing required argument: scope") - body = { - "items": items, - "options": options, - "previous_search_id": previous_search_id, - "scope": scope, - } + body = {"description": description, "metadata": metadata} body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None @@ -6061,7 +6579,7 @@ async def _search_memories( else: _content = json.dumps(body, 
cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_memory_stores_search_memories_request( + _request = build_beta_memory_stores_update_request( name=name, content_type=content_type, api_version=self._config.api_version, @@ -6098,24 +6616,23 @@ async def _search_memories( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreSearchResult, response.json()) + deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - async def _update_memories_initial( - self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, - *, - scope: str = _Unset, - items: Optional[List[dict[str, Any]]] = None, - previous_update_id: Optional[str] = None, - update_delay: Optional[int] = None, - **kwargs: Any - ) -> AsyncIterator[bytes]: + @distributed_trace_async + async def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: + """Retrieve a memory store. + + :param name: The name of the memory store to retrieve. Required. + :type name: str + :return: MemoryStoreDetails. 
The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ error_map: MutableMapping = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, @@ -6124,34 +6641,14 @@ async def _update_memories_initial( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) - - if body is _Unset: - if scope is _Unset: - raise TypeError("missing required argument: scope") - body = { - "items": items, - "previous_update_id": previous_update_id, - "scope": scope, - "update_delay": update_delay, - } - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_update_memories_request( + _request = build_beta_memory_stores_get_request( name=name, - content_type=content_type, api_version=self._config.api_version, - content=_content, headers=_headers, params=_params, ) @@ -6161,18 +6658,19 @@ async def _update_memories_initial( _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = True + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [202]: - try: - await 
response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -6180,210 +6678,114 @@ async def _update_memories_initial( ) raise HttpResponseError(response=response, model=error) - response_headers = {} - response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) - - deserialized = response.iter_bytes() if _decompress else response.iter_raw() + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) if cls: - return cls(pipeline_response, deserialized, response_headers) # type: ignore + return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - @overload - async def _begin_update_memories( - self, - name: str, - *, - scope: str, - content_type: str = "application/json", - items: Optional[List[dict[str, Any]]] = None, - previous_update_id: Optional[str] = None, - update_delay: Optional[int] = None, - **kwargs: Any - ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: ... - @overload - async def _begin_update_memories( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: ... - @overload - async def _begin_update_memories( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: ... 
- - @distributed_trace_async - async def _begin_update_memories( + @distributed_trace + def list( self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, *, - scope: str = _Unset, - items: Optional[List[dict[str, Any]]] = None, - previous_update_id: Optional[str] = None, - update_delay: Optional[int] = None, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, **kwargs: Any - ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: - """Update memory store with conversation memories. + ) -> AsyncItemPaged["_models.MemoryStoreDetails"]: + """List all memory stores. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword scope: The namespace that logically groups and isolates memories, such as a user ID. - Required. - :paramtype scope: str - :keyword items: Conversation items to be stored in memory. Default value is None. - :paramtype items: list[dict[str, any]] - :keyword previous_update_id: The unique ID of the previous update request, enabling incremental - memory updates from where the last operation left off. Default value is None. - :paramtype previous_update_id: str - :keyword update_delay: Timeout period before processing the memory update in seconds. - If a new update request is received during this period, it will cancel the current request and - reset the timeout. - Set to 0 to immediately trigger the update without delay. - Defaults to 300 (5 minutes). Default value is None. - :paramtype update_delay: int - :return: An instance of AsyncLROPoller that returns MemoryStoreUpdateCompletedResult. The - MemoryStoreUpdateCompletedResult is compatible with MutableMapping - :rtype: - ~azure.core.polling.AsyncLROPoller[~azure.ai.projects.models.MemoryStoreUpdateCompletedResult] + :keyword limit: A limit on the number of objects to be returned. 
Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None. + :paramtype before: str + :return: An iterator like instance of MemoryStoreDetails + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.MemoryStoreDetails] :raises ~azure.core.exceptions.HttpResponseError: """ - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreUpdateCompletedResult] = kwargs.pop("cls", None) - polling: Union[bool, AsyncPollingMethod] = kwargs.pop("polling", True) - lro_delay = kwargs.pop("polling_interval", self._config.polling_interval) - cont_token: Optional[str] = kwargs.pop("continuation_token", None) - if cont_token is None: - raw_result = await self._update_memories_initial( - name=name, - body=body, - scope=scope, - items=items, - previous_update_id=previous_update_id, - update_delay=update_delay, - content_type=content_type, - cls=lambda x, y, z: x, + cls: ClsType[List[_models.MemoryStoreDetails]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + 
         error_map.update(kwargs.pop("error_map", {}) or {})
+
+        def prepare_request(_continuation_token=None):
+
+            _request = build_beta_memory_stores_list_request(
+                limit=limit,
+                order=order,
+                after=_continuation_token,
+                before=before,
+                api_version=self._config.api_version,
                 headers=_headers,
                 params=_params,
-                **kwargs
             )
-        await raw_result.http_response.read()  # type: ignore
-        kwargs.pop("error_map", None)
+            path_format_arguments = {
+                "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+            }
+            _request.url = self._client.format_url(_request.url, **path_format_arguments)
+            return _request
 
-        def get_long_running_output(pipeline_response):
-            response_headers = {}
-            response = pipeline_response.http_response
-            response_headers["Operation-Location"] = self._deserialize(
-                "str", response.headers.get("Operation-Location")
+        async def extract_data(pipeline_response):
+            deserialized = pipeline_response.http_response.json()
+            list_of_elem = _deserialize(
+                List[_models.MemoryStoreDetails],
+                deserialized.get("data", []),
             )
-
-            deserialized = _deserialize(_models.MemoryStoreUpdateCompletedResult, response.json().get("result", {}))
             if cls:
-                return cls(pipeline_response, deserialized, response_headers)  # type: ignore
-            return deserialized
+                list_of_elem = cls(list_of_elem)  # type: ignore
+            return deserialized.get("last_id") or None, AsyncList(list_of_elem)
 
-        path_format_arguments = {
-            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
-        }
+        async def get_next(_continuation_token=None):
+            _request = prepare_request(_continuation_token)
 
-        if polling is True:
-            polling_method: AsyncPollingMethod = cast(
-                AsyncPollingMethod,
-                AsyncLROBasePolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs),
-            )
-        elif polling is False:
-            polling_method = cast(AsyncPollingMethod, AsyncNoPolling())
-        else:
-            polling_method = polling
-        if cont_token:
-            return AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult].from_continuation_token(
-                polling_method=polling_method,
-                continuation_token=cont_token,
-                client=self._client,
-                deserialization_callback=get_long_running_output,
+            _stream = False
+            pipeline_response: PipelineResponse = await self._client._pipeline.run(  # pylint: disable=protected-access
+                _request, stream=_stream, **kwargs
             )
-        return AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult](
-            self._client, raw_result, get_long_running_output, polling_method  # type: ignore
-        )
-
-    @overload
-    async def delete_scope(
-        self, name: str, *, scope: str, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.MemoryStoreDeleteScopeResult:
-        """Delete all memories associated with a specific scope from a memory store.
-
-        :param name: The name of the memory store. Required.
-        :type name: str
-        :keyword scope: The namespace that logically groups and isolates memories to delete, such as a
-         user ID. Required.
-        :paramtype scope: str
-        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
-         MutableMapping
-        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-    @overload
-    async def delete_scope(
-        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.MemoryStoreDeleteScopeResult:
-        """Delete all memories associated with a specific scope from a memory store.
+            response = pipeline_response.http_response
 
-        :param name: The name of the memory store. Required.
-        :type name: str
-        :param body: Required.
-        :type body: JSON
-        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
-         MutableMapping
-        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
+            if response.status_code not in [200]:
+                map_error(status_code=response.status_code, response=response, error_map=error_map)
+                error = _failsafe_deserialize(
+                    _models.ApiErrorResponse,
+                    response,
+                )
+                raise HttpResponseError(response=response, model=error)
 
-    @overload
-    async def delete_scope(
-        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.MemoryStoreDeleteScopeResult:
-        """Delete all memories associated with a specific scope from a memory store.
+            return pipeline_response
 
-        :param name: The name of the memory store. Required.
-        :type name: str
-        :param body: Required.
-        :type body: IO[bytes]
-        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
-         MutableMapping
-        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
+        return AsyncItemPaged(get_next, extract_data)
 
     @distributed_trace_async
-    async def delete_scope(
-        self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, scope: str = _Unset, **kwargs: Any
-    ) -> _models.MemoryStoreDeleteScopeResult:
-        """Delete all memories associated with a specific scope from a memory store.
+    async def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult:
+        """Delete a memory store.
 
-        :param name: The name of the memory store. Required.
+        :param name: The name of the memory store to delete. Required.
         :type name: str
-        :param body: Is either a JSON type or a IO[bytes] type. Required.
-        :type body: JSON or IO[bytes]
-        :keyword scope: The namespace that logically groups and isolates memories to delete, such as a
-         user ID. Required.
-        :paramtype scope: str
-        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
-         MutableMapping
-        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
+        :return: DeleteMemoryStoreResult. The DeleteMemoryStoreResult is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.DeleteMemoryStoreResult
         :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -6394,29 +6796,14 @@ async def delete_scope(
         }
         error_map.update(kwargs.pop("error_map", {}) or {})
 
-        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _headers = kwargs.pop("headers", {}) or {}
         _params = kwargs.pop("params", {}) or {}
 
-        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
-        cls: ClsType[_models.MemoryStoreDeleteScopeResult] = kwargs.pop("cls", None)
-
-        if body is _Unset:
-            if scope is _Unset:
-                raise TypeError("missing required argument: scope")
-            body = {"scope": scope}
-            body = {k: v for k, v in body.items() if v is not None}
-        content_type = content_type or "application/json"
-        _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
-        else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+        cls: ClsType[_models.DeleteMemoryStoreResult] = kwargs.pop("cls", None)
 
-        _request = build_beta_memory_stores_delete_scope_request(
+        _request = build_beta_memory_stores_delete_request(
             name=name,
-            content_type=content_type,
             api_version=self._config.api_version,
-            content=_content,
             headers=_headers,
             params=_params,
         )
@@ -6449,39 +6836,64 @@ async def delete_scope(
         if _stream:
             deserialized = response.iter_bytes() if _decompress else response.iter_raw()
         else:
-            deserialized = _deserialize(_models.MemoryStoreDeleteScopeResult, response.json())
+            deserialized = _deserialize(_models.DeleteMemoryStoreResult, response.json())
 
         if cls:
             return cls(pipeline_response, deserialized, {})  # type: ignore
 
         return deserialized  # type: ignore
 
-
-class BetaRedTeamsOperations:
-    """
-    .. warning::
-        **DO NOT** instantiate this class directly.
-
-        Instead, you should access the following operations through
-        :class:`~azure.ai.projects.aio.AIProjectClient`'s
-        :attr:`red_teams` attribute.
-    """
-
-    def __init__(self, *args, **kwargs) -> None:
-        input_args = list(args)
-        self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client")
-        self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config")
-        self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
-        self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
+    @overload
+    async def _search_memories(
+        self,
+        name: str,
+        *,
+        scope: str,
+        content_type: str = "application/json",
+        items: Optional[List[dict[str, Any]]] = None,
+        previous_search_id: Optional[str] = None,
+        options: Optional[_models.MemorySearchOptions] = None,
+        **kwargs: Any
+    ) -> _models.MemoryStoreSearchResult: ...
+    @overload
+    async def _search_memories(
+        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.MemoryStoreSearchResult: ...
+    @overload
+    async def _search_memories(
+        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.MemoryStoreSearchResult: ...
 
     @distributed_trace_async
-    async def get(self, name: str, **kwargs: Any) -> _models.RedTeam:
-        """Get a redteam by name.
-
-        :param name: Identifier of the red team run. Required.
+    async def _search_memories(
+        self,
+        name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        scope: str = _Unset,
+        items: Optional[List[dict[str, Any]]] = None,
+        previous_search_id: Optional[str] = None,
+        options: Optional[_models.MemorySearchOptions] = None,
+        **kwargs: Any
+    ) -> _models.MemoryStoreSearchResult:
+        """Search for relevant memories from a memory store based on conversation context.
+
+        :param name: The name of the memory store to search. Required.
         :type name: str
-        :return: RedTeam. The RedTeam is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.RedTeam
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword scope: The namespace that logically groups and isolates memories, such as a user ID.
+         Required.
+        :paramtype scope: str
+        :keyword items: Items for which to search for relevant memories. Default value is None.
+        :paramtype items: list[dict[str, any]]
+        :keyword previous_search_id: The unique ID of the previous search request, enabling incremental
+         memory search from where the last operation left off. Default value is None.
+        :paramtype previous_search_id: str
+        :keyword options: Memory search options. Default value is None.
+        :paramtype options: ~azure.ai.projects.models.MemorySearchOptions
+        :return: MemoryStoreSearchResult. The MemoryStoreSearchResult is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.MemoryStoreSearchResult
        :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -6492,14 +6904,34 @@ async def get(self, name: str, **kwargs: Any) -> _models.RedTeam:
         }
         error_map.update(kwargs.pop("error_map", {}) or {})
 
-        _headers = kwargs.pop("headers", {}) or {}
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
         _params = kwargs.pop("params", {}) or {}
 
-        cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None)
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.MemoryStoreSearchResult] = kwargs.pop("cls", None)
 
-        _request = build_beta_red_teams_get_request(
+        if body is _Unset:
+            if scope is _Unset:
+                raise TypeError("missing required argument: scope")
+            body = {
+                "items": items,
+                "options": options,
+                "previous_search_id": previous_search_id,
+                "scope": scope,
+            }
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_beta_memory_stores_search_memories_request(
             name=name,
+            content_type=content_type,
             api_version=self._config.api_version,
+            content=_content,
             headers=_headers,
             params=_params,
         )
@@ -6523,163 +6955,33 @@
                 except (StreamConsumedError, StreamClosedError):
                     pass
             map_error(status_code=response.status_code, response=response, error_map=error_map)
-            raise HttpResponseError(response=response)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
+            )
+            raise HttpResponseError(response=response, model=error)
 
         if _stream:
             deserialized = response.iter_bytes() if _decompress else response.iter_raw()
         else:
-            deserialized = _deserialize(_models.RedTeam, response.json())
+            deserialized = _deserialize(_models.MemoryStoreSearchResult, response.json())
 
         if cls:
             return cls(pipeline_response, deserialized, {})  # type: ignore
 
         return deserialized  # type: ignore
 
-    @distributed_trace
-    def list(self, **kwargs: Any) -> AsyncItemPaged["_models.RedTeam"]:
-        """List a redteam by name.
-
-        :return: An iterator like instance of RedTeam
-        :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.RedTeam]
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-        _headers = kwargs.pop("headers", {}) or {}
-        _params = kwargs.pop("params", {}) or {}
-
-        cls: ClsType[List[_models.RedTeam]] = kwargs.pop("cls", None)
-
-        error_map: MutableMapping = {
-            401: ClientAuthenticationError,
-            404: ResourceNotFoundError,
-            409: ResourceExistsError,
-            304: ResourceNotModifiedError,
-        }
-        error_map.update(kwargs.pop("error_map", {}) or {})
-
-        def prepare_request(next_link=None):
-            if not next_link:
-
-                _request = build_beta_red_teams_list_request(
-                    api_version=self._config.api_version,
-                    headers=_headers,
-                    params=_params,
-                )
-                path_format_arguments = {
-                    "endpoint": self._serialize.url(
-                        "self._config.endpoint", self._config.endpoint, "str", skip_quote=True
-                    ),
-                }
-                _request.url = self._client.format_url(_request.url, **path_format_arguments)
-
-            else:
-                # make call to next link with the client's api-version
-                _parsed_next_link = urllib.parse.urlparse(next_link)
-                _next_request_params = case_insensitive_dict(
-                    {
-                        key: [urllib.parse.quote(v) for v in value]
-                        for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items()
-                    }
-                )
-                _next_request_params["api-version"] = self._config.api_version
-                _request = HttpRequest(
-                    "GET",
-                    urllib.parse.urljoin(next_link, _parsed_next_link.path),
-                    params=_next_request_params,
-                    headers=_headers,
-                )
-                path_format_arguments = {
-                    "endpoint": self._serialize.url(
-                        "self._config.endpoint", self._config.endpoint, "str", skip_quote=True
-                    ),
-                }
-                _request.url = self._client.format_url(_request.url, **path_format_arguments)
-
-            return _request
-
-        async def extract_data(pipeline_response):
-            deserialized = pipeline_response.http_response.json()
-            list_of_elem = _deserialize(
-                List[_models.RedTeam],
-                deserialized.get("value", []),
-            )
-            if cls:
-                list_of_elem = cls(list_of_elem)  # type: ignore
-            return deserialized.get("nextLink") or None, AsyncList(list_of_elem)
-
-        async def get_next(next_link=None):
-            _request = prepare_request(next_link)
-
-            _stream = False
-            pipeline_response: PipelineResponse = await self._client._pipeline.run(  # pylint: disable=protected-access
-                _request, stream=_stream, **kwargs
-            )
-            response = pipeline_response.http_response
-
-            if response.status_code not in [200]:
-                map_error(status_code=response.status_code, response=response, error_map=error_map)
-                raise HttpResponseError(response=response)
-
-            return pipeline_response
-
-        return AsyncItemPaged(get_next, extract_data)
-
-    @overload
-    async def create(
-        self, red_team: _models.RedTeam, *, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.RedTeam:
-        """Creates a redteam run.
-
-        :param red_team: Redteam to be run. Required.
-        :type red_team: ~azure.ai.projects.models.RedTeam
-        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: RedTeam. The RedTeam is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.RedTeam
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-    @overload
-    async def create(self, red_team: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.RedTeam:
-        """Creates a redteam run.
-
-        :param red_team: Redteam to be run. Required.
-        :type red_team: JSON
-        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: RedTeam. The RedTeam is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.RedTeam
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-    @overload
-    async def create(
-        self, red_team: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.RedTeam:
-        """Creates a redteam run.
-
-        :param red_team: Redteam to be run. Required.
-        :type red_team: IO[bytes]
-        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: RedTeam. The RedTeam is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.RedTeam
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-    @distributed_trace_async
-    async def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: Any) -> _models.RedTeam:
-        """Creates a redteam run.
-
-        :param red_team: Redteam to be run. Is one of the following types: RedTeam, JSON, IO[bytes]
-         Required.
-        :type red_team: ~azure.ai.projects.models.RedTeam or JSON or IO[bytes]
-        :return: RedTeam. The RedTeam is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.RedTeam
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
+    async def _update_memories_initial(
+        self,
+        name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        scope: str = _Unset,
+        items: Optional[List[dict[str, Any]]] = None,
+        previous_update_id: Optional[str] = None,
+        update_delay: Optional[int] = None,
+        **kwargs: Any
+    ) -> AsyncIterator[bytes]:
         error_map: MutableMapping = {
             401: ClientAuthenticationError,
             404: ResourceNotFoundError,
@@ -6692,16 +6994,27 @@ async def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwar
         _params = kwargs.pop("params", {}) or {}
 
         content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
-        cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None)
+        cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None)
 
+        if body is _Unset:
+            if scope is _Unset:
+                raise TypeError("missing required argument: scope")
+            body = {
+                "items": items,
+                "previous_update_id": previous_update_id,
+                "scope": scope,
+                "update_delay": update_delay,
+            }
+            body = {k: v for k, v in body.items() if v is not None}
         content_type = content_type or "application/json"
         _content = None
-        if isinstance(red_team, (IOBase, bytes)):
-            _content = red_team
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
         else:
-            _content = json.dumps(red_team, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
 
-        _request = build_beta_red_teams_create_request(
+        _request = build_beta_memory_stores_update_memories_request(
+            name=name,
             content_type=content_type,
             api_version=self._config.api_version,
             content=_content,
@@ -6714,19 +7027,18 @@ async def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwar
         _request.url = self._client.format_url(_request.url, **path_format_arguments)
 
         _decompress = kwargs.pop("decompress", True)
-        _stream = kwargs.pop("stream", False)
+        _stream = True
         pipeline_response: PipelineResponse = await self._client._pipeline.run(  # pylint: disable=protected-access
             _request, stream=_stream, **kwargs
         )
 
         response = pipeline_response.http_response
 
-        if response.status_code not in [201]:
-            if _stream:
-                try:
-                    await response.read()  # Load the body in memory and close the socket
-                except (StreamConsumedError, StreamClosedError):
-                    pass
+        if response.status_code not in [202]:
+            try:
+                await response.read()  # Load the body in memory and close the socket
+            except (StreamConsumedError, StreamClosedError):
+                pass
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             error = _failsafe_deserialize(
                 _models.ApiErrorResponse,
@@ -6734,25 +7046,1110 @@ async def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwar
             )
             raise HttpResponseError(response=response, model=error)
 
-        if _stream:
-            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+        response_headers = {}
+        response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location"))
+
+        deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+
+        if cls:
+            return cls(pipeline_response, deserialized, response_headers)  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @overload
+    async def _begin_update_memories(
+        self,
+        name: str,
+        *,
+        scope: str,
+        content_type: str = "application/json",
+        items: Optional[List[dict[str, Any]]] = None,
+        previous_update_id: Optional[str] = None,
+        update_delay: Optional[int] = None,
+        **kwargs: Any
+    ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: ...
+    @overload
+    async def _begin_update_memories(
+        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+    ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: ...
+    @overload
+    async def _begin_update_memories(
+        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+    ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]: ...
+
+    @distributed_trace_async
+    async def _begin_update_memories(
+        self,
+        name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        scope: str = _Unset,
+        items: Optional[List[dict[str, Any]]] = None,
+        previous_update_id: Optional[str] = None,
+        update_delay: Optional[int] = None,
+        **kwargs: Any
+    ) -> AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult]:
+        """Update memory store with conversation memories.
+
+        :param name: The name of the memory store to update. Required.
+        :type name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword scope: The namespace that logically groups and isolates memories, such as a user ID.
+         Required.
+        :paramtype scope: str
+        :keyword items: Conversation items to be stored in memory. Default value is None.
+        :paramtype items: list[dict[str, any]]
+        :keyword previous_update_id: The unique ID of the previous update request, enabling incremental
+         memory updates from where the last operation left off. Default value is None.
+        :paramtype previous_update_id: str
+        :keyword update_delay: Timeout period before processing the memory update in seconds.
+         If a new update request is received during this period, it will cancel the current request and
+         reset the timeout.
+         Set to 0 to immediately trigger the update without delay.
+         Defaults to 300 (5 minutes). Default value is None.
+        :paramtype update_delay: int
+        :return: An instance of AsyncLROPoller that returns MemoryStoreUpdateCompletedResult. The
+         MemoryStoreUpdateCompletedResult is compatible with MutableMapping
+        :rtype:
+         ~azure.core.polling.AsyncLROPoller[~azure.ai.projects.models.MemoryStoreUpdateCompletedResult]
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.MemoryStoreUpdateCompletedResult] = kwargs.pop("cls", None)
+        polling: Union[bool, AsyncPollingMethod] = kwargs.pop("polling", True)
+        lro_delay = kwargs.pop("polling_interval", self._config.polling_interval)
+        cont_token: Optional[str] = kwargs.pop("continuation_token", None)
+        if cont_token is None:
+            raw_result = await self._update_memories_initial(
+                name=name,
+                body=body,
+                scope=scope,
+                items=items,
+                previous_update_id=previous_update_id,
+                update_delay=update_delay,
+                content_type=content_type,
+                cls=lambda x, y, z: x,
+                headers=_headers,
+                params=_params,
+                **kwargs
+            )
+            await raw_result.http_response.read()  # type: ignore
+        kwargs.pop("error_map", None)
+
+        def get_long_running_output(pipeline_response):
+            response_headers = {}
+            response = pipeline_response.http_response
+            response_headers["Operation-Location"] = self._deserialize(
+                "str", response.headers.get("Operation-Location")
+            )
+
+            deserialized = _deserialize(_models.MemoryStoreUpdateCompletedResult, response.json().get("result", {}))
+            if cls:
+                return cls(pipeline_response, deserialized, response_headers)  # type: ignore
+            return deserialized
+
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+
+        if polling is True:
+            polling_method: AsyncPollingMethod = cast(
+                AsyncPollingMethod,
+                AsyncLROBasePolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs),
+            )
+        elif polling is False:
+            polling_method = cast(AsyncPollingMethod, AsyncNoPolling())
+        else:
+            polling_method = polling
+        if cont_token:
+            return AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult].from_continuation_token(
+                polling_method=polling_method,
+                continuation_token=cont_token,
+                client=self._client,
+                deserialization_callback=get_long_running_output,
+            )
+        return AsyncLROPoller[_models.MemoryStoreUpdateCompletedResult](
+            self._client, raw_result, get_long_running_output, polling_method  # type: ignore
+        )
+
+    @overload
+    async def delete_scope(
+        self, name: str, *, scope: str, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.MemoryStoreDeleteScopeResult:
+        """Delete all memories associated with a specific scope from a memory store.
+
+        :param name: The name of the memory store. Required.
+        :type name: str
+        :keyword scope: The namespace that logically groups and isolates memories to delete, such as a
+         user ID. Required.
+        :paramtype scope: str
+        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
+         MutableMapping
+        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @overload
+    async def delete_scope(
+        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.MemoryStoreDeleteScopeResult:
+        """Delete all memories associated with a specific scope from a memory store.
+
+        :param name: The name of the memory store. Required.
+        :type name: str
+        :param body: Required.
+        :type body: JSON
+        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
+         MutableMapping
+        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @overload
+    async def delete_scope(
+        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.MemoryStoreDeleteScopeResult:
+        """Delete all memories associated with a specific scope from a memory store.
+
+        :param name: The name of the memory store. Required.
+        :type name: str
+        :param body: Required.
+        :type body: IO[bytes]
+        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
+         MutableMapping
+        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @distributed_trace_async
+    async def delete_scope(
+        self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, scope: str = _Unset, **kwargs: Any
+    ) -> _models.MemoryStoreDeleteScopeResult:
+        """Delete all memories associated with a specific scope from a memory store.
+
+        :param name: The name of the memory store. Required.
+        :type name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword scope: The namespace that logically groups and isolates memories to delete, such as a
+         user ID. Required.
+        :paramtype scope: str
+        :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with
+         MutableMapping
+        :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.MemoryStoreDeleteScopeResult] = kwargs.pop("cls", None)
+
+        if body is _Unset:
+            if scope is _Unset:
+                raise TypeError("missing required argument: scope")
+            body = {"scope": scope}
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_beta_memory_stores_delete_scope_request(
+            name=name,
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = await self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    await response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
+            )
+            raise HttpResponseError(response=response, model=error)
+
+        if _stream:
+            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+        else:
+            deserialized = _deserialize(_models.MemoryStoreDeleteScopeResult, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+
+class BetaRedTeamsOperations:
+    """
+    .. warning::
+        **DO NOT** instantiate this class directly.
+
+        Instead, you should access the following operations through
+        :class:`~azure.ai.projects.aio.AIProjectClient`'s
+        :attr:`red_teams` attribute.
+    """
+
+    def __init__(self, *args, **kwargs) -> None:
+        input_args = list(args)
+        self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client")
+        self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config")
+        self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
+        self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
+
+    @distributed_trace_async
+    async def get(self, name: str, **kwargs: Any) -> _models.RedTeam:
+        """Get a redteam by name.
+
+        :param name: Identifier of the red team run. Required.
+        :type name: str
+        :return: RedTeam. The RedTeam is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.RedTeam
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None)
+
+        _request = build_beta_red_teams_get_request(
+            name=name,
+            api_version=self._config.api_version,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = await self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    await response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            raise HttpResponseError(response=response)
+
+        if _stream:
+            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+        else:
+            deserialized = _deserialize(_models.RedTeam, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @distributed_trace
+    def list(self, **kwargs: Any) -> AsyncItemPaged["_models.RedTeam"]:
+        """List a redteam by name.
+
+        :return: An iterator like instance of RedTeam
+        :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.RedTeam]
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[List[_models.RedTeam]] = kwargs.pop("cls", None)
+
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        def prepare_request(next_link=None):
+            if not next_link:
+
+                _request = build_beta_red_teams_list_request(
+                    api_version=self._config.api_version,
+                    headers=_headers,
+                    params=_params,
+                )
+                path_format_arguments = {
+                    "endpoint": self._serialize.url(
+                        "self._config.endpoint", self._config.endpoint, "str", skip_quote=True
+                    ),
+                }
+                _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+            else:
+                # make call to next link with the client's api-version
+                _parsed_next_link = urllib.parse.urlparse(next_link)
+                _next_request_params = case_insensitive_dict(
+                    {
+                        key: [urllib.parse.quote(v) for v in value]
+                        for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items()
+                    }
+                )
+                _next_request_params["api-version"] = self._config.api_version
+                _request = HttpRequest(
+                    "GET",
+                    urllib.parse.urljoin(next_link, _parsed_next_link.path),
+                    params=_next_request_params,
+                    headers=_headers,
+                )
+                path_format_arguments = {
+                    "endpoint": self._serialize.url(
+                        "self._config.endpoint", self._config.endpoint, "str", skip_quote=True
+                    ),
+                }
+                _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+            return _request
+
+        async def extract_data(pipeline_response):
+            deserialized = pipeline_response.http_response.json()
+            list_of_elem = _deserialize(
+                List[_models.RedTeam],
+                deserialized.get("value", []),
+            )
+            if cls:
+                list_of_elem = cls(list_of_elem)  # type: ignore
+            return deserialized.get("nextLink") or None, AsyncList(list_of_elem)
+
+        async def get_next(next_link=None):
+            _request = prepare_request(next_link)
+
+            _stream = False
+            pipeline_response: PipelineResponse = await self._client._pipeline.run(  # pylint: disable=protected-access
+                _request, stream=_stream, **kwargs
+            )
+            response = pipeline_response.http_response
+
+            if response.status_code not in [200]:
+                map_error(status_code=response.status_code, response=response, error_map=error_map)
+                raise HttpResponseError(response=response)
+
+            return pipeline_response
+
+        return AsyncItemPaged(get_next, extract_data)
+
+    @overload
+    async def create(
+        self, red_team: _models.RedTeam, *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.RedTeam:
+        """Creates a redteam run.
+
+        :param red_team: Redteam to be run. Required.
+        :type red_team: ~azure.ai.projects.models.RedTeam
+        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: RedTeam. The RedTeam is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.RedTeam
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @overload
+    async def create(self, red_team: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.RedTeam:
+        """Creates a redteam run.
+
+        :param red_team: Redteam to be run. Required.
+        :type red_team: JSON
+        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: RedTeam. The RedTeam is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.RedTeam
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @overload
+    async def create(
+        self, red_team: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.RedTeam:
+        """Creates a redteam run.
+
+        :param red_team: Redteam to be run. Required.
+        :type red_team: IO[bytes]
+        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: RedTeam. The RedTeam is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.RedTeam
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @distributed_trace_async
+    async def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: Any) -> _models.RedTeam:
+        """Creates a redteam run.
+
+        :param red_team: Redteam to be run. Is one of the following types: RedTeam, JSON, IO[bytes]
+         Required.
+        :type red_team: ~azure.ai.projects.models.RedTeam or JSON or IO[bytes]
+        :return: RedTeam. The RedTeam is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.RedTeam
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+        _params = kwargs.pop("params", {}) or {}
+
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None)
+
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(red_team, (IOBase, bytes)):
+            _content = red_team
+        else:
+            _content = json.dumps(red_team, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_beta_red_teams_create_request(
+            content_type=content_type,
+            api_version=self._config.api_version,
+            content=_content,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url =
self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [201]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.RedTeam, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + +class BetaSchedulesOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.aio.AIProjectClient`'s + :attr:`schedules` attribute. + """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + + @distributed_trace_async + async def delete(self, schedule_id: str, **kwargs: Any) -> None: + """Delete a schedule. + + :param schedule_id: Identifier of the schedule. Required. 
+ :type schedule_id: str + :return: None + :rtype: None + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[None] = kwargs.pop("cls", None) + + _request = build_beta_schedules_delete_request( + schedule_id=schedule_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [204]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if cls: + return cls(pipeline_response, None, {}) # type: ignore + + @distributed_trace_async + async def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: + """Get a schedule by id. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :return: Schedule. 
The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) + + _request = build_beta_schedules_get_request( + schedule_id=schedule_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.Schedule, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace + def list( + self, + *, + type: Optional[Union[str, _models.ScheduleTaskType]] = None, + enabled: Optional[bool] = None, + **kwargs: Any + ) -> AsyncItemPaged["_models.Schedule"]: + """List 
all schedules. + + :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". + Default value is None. + :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType + :keyword enabled: Filter by the enabled status. Default value is None. + :paramtype enabled: bool + :return: An iterator like instance of Schedule + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.Schedule] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[List[_models.Schedule]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_schedules_list_request( + type=type, + enabled=enabled, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = 
self._client.format_url(_request.url, **path_format_arguments) + + return _request + + async def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.Schedule], + deserialized.get("value", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("nextLink") or None, AsyncList(list_of_elem) + + async def get_next(next_link=None): + _request = prepare_request(next_link) + + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response + + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + return pipeline_response + + return AsyncItemPaged(get_next, extract_data) + + @overload + async def create_or_update( + self, schedule_id: str, schedule: _models.Schedule, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :param schedule: The resource instance. Required. + :type schedule: ~azure.ai.projects.models.Schedule + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: Schedule. The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def create_or_update( + self, schedule_id: str, schedule: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. 
+ :type schedule_id: str + :param schedule: The resource instance. Required. + :type schedule: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: Schedule. The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def create_or_update( + self, schedule_id: str, schedule: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :param schedule: The resource instance. Required. + :type schedule: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: Schedule. The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @distributed_trace_async + async def create_or_update( + self, schedule_id: str, schedule: Union[_models.Schedule, JSON, IO[bytes]], **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :param schedule: The resource instance. Is one of the following types: Schedule, JSON, + IO[bytes] Required. + :type schedule: ~azure.ai.projects.models.Schedule or JSON or IO[bytes] + :return: Schedule. 
The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) + + content_type = content_type or "application/json" + _content = None + if isinstance(schedule, (IOBase, bytes)): + _content = schedule else: - deserialized = _deserialize(_models.RedTeam, response.json()) + _content = json.dumps(schedule, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_schedules_create_or_update_request( + schedule_id=schedule_id, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200, 201]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if 
_stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.Schedule, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace_async + async def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models.ScheduleRun: + """Get a schedule run by id. + + :param schedule_id: The unique identifier of the schedule. Required. + :type schedule_id: str + :param run_id: The unique identifier of the schedule run. Required. + :type run_id: str + :return: ScheduleRun. The ScheduleRun is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ScheduleRun + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.ScheduleRun] = kwargs.pop("cls", None) + + _request = build_beta_schedules_get_run_request( + schedule_id=schedule_id, + run_id=run_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except 
(StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.ScheduleRun, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace + def list_runs( + self, + schedule_id: str, + *, + type: Optional[Union[str, _models.ScheduleTaskType]] = None, + enabled: Optional[bool] = None, + **kwargs: Any + ) -> AsyncItemPaged["_models.ScheduleRun"]: + """List all schedule runs. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". + Default value is None. + :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType + :keyword enabled: Filter by the enabled status. Default value is None. 
+ :paramtype enabled: bool + :return: An iterator like instance of ScheduleRun + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.ScheduleRun] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[List[_models.ScheduleRun]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_schedules_list_runs_request( + schedule_id=schedule_id, + type=type, + enabled=enabled, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - return deserialized # type: ignore + return _request + + async def extract_data(pipeline_response): 
+ deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.ScheduleRun], + deserialized.get("value", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("nextLink") or None, AsyncList(list_of_elem) + async def get_next(next_link=None): + _request = prepare_request(next_link) -class BetaSchedulesOperations: + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response + + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + return pipeline_response + + return AsyncItemPaged(get_next, extract_data) + + +class BetaToolboxesOperations: """ .. warning:: **DO NOT** instantiate this class directly. Instead, you should access the following operations through :class:`~azure.ai.projects.aio.AIProjectClient`'s - :attr:`schedules` attribute. + :attr:`toolboxes` attribute. """ def __init__(self, *args, **kwargs) -> None: @@ -6762,14 +8159,108 @@ def __init__(self, *args, **kwargs) -> None: self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + @overload + async def create_version( + self, + name: str, + *, + tools: List[_models.Tool], + content_type: str = "application/json", + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + policies: Optional[_models.ToolboxPolicies] = None, + **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. 
+ :type name: str + :keyword tools: The list of tools to include in this version. Required. + :paramtype tools: list[~azure.ai.projects.models.Tool] + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the toolbox. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is + None. + :paramtype metadata: dict[str, str] + :keyword policies: Policy configuration for this toolbox version. Default value is None. + :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies + :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def create_version( + self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. + :type name: str + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + async def create_version( + self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. 
+ + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. + :type name: str + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :raises ~azure.core.exceptions.HttpResponseError: + """ + @distributed_trace_async - async def delete(self, schedule_id: str, **kwargs: Any) -> None: - """Delete a schedule. + async def create_version( + self, + name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + tools: List[_models.Tool] = _Unset, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + policies: Optional[_models.ToolboxPolicies] = None, + **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :return: None - :rtype: None + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. + :type name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword tools: The list of tools to include in this version. Required. + :paramtype tools: list[~azure.ai.projects.models.Tool] + :keyword description: A human-readable description of the toolbox. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is + None. + :paramtype metadata: dict[str, str] + :keyword policies: Policy configuration for this toolbox version. Default value is None. + :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies + :return: ToolboxVersionObject. 
The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -6780,14 +8271,29 @@ async def delete(self, schedule_id: str, **kwargs: Any) -> None: } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[None] = kwargs.pop("cls", None) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) - _request = build_beta_schedules_delete_request( - schedule_id=schedule_id, + if body is _Unset: + if tools is _Unset: + raise TypeError("missing required argument: tools") + body = {"description": description, "metadata": metadata, "policies": policies, "tools": tools} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_toolboxes_create_version_request( + name=name, + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -6796,28 +8302,45 @@ async def delete(self, schedule_id: str, **kwargs: Any) -> None: } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _stream = False + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [204]: + if response.status_code not in 
[200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.ToolboxVersionObject, response.json()) if cls: - return cls(pipeline_response, None, {}) # type: ignore + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore @distributed_trace_async - async def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: - """Get a schedule by id. + async def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: + """Retrieve a toolbox. - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule + :param name: The name of the toolbox to retrieve. Required. + :type name: str + :return: ToolboxObject. 
The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -6831,10 +8354,10 @@ async def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) + cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None) - _request = build_beta_schedules_get_request( - schedule_id=schedule_id, + _request = build_beta_toolboxes_get_request( + name=name, api_version=self._config.api_version, headers=_headers, params=_params, @@ -6859,12 +8382,16 @@ async def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: except (StreamConsumedError, StreamClosedError): pass map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.Schedule, response.json()) + deserialized = _deserialize(_models.ToolboxObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -6875,25 +8402,35 @@ async def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: def list( self, *, - type: Optional[Union[str, _models.ScheduleTaskType]] = None, - enabled: Optional[bool] = None, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, **kwargs: Any - ) -> AsyncItemPaged["_models.Schedule"]: - """List all schedules. + ) -> AsyncItemPaged["_models.ToolboxObject"]: + """List all toolboxes. - :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". 
+ :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. Default value is None. - :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType - :keyword enabled: Filter by the enabled status. Default value is None. - :paramtype enabled: bool - :return: An iterator like instance of Schedule - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.Schedule] + :paramtype before: str + :return: An iterator like instance of ToolboxObject + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.ToolboxObject] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.Schedule]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.ToolboxObject]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -6903,60 +8440,35 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_schedules_list_request( - type=type, - enabled=enabled, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint",
self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) + def prepare_request(_continuation_token=None): + _request = build_beta_toolboxes_list_request( + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) return _request async def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.Schedule], - deserialized.get("value", []), + List[_models.ToolboxObject], + deserialized.get("data", []), ) if cls: list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, AsyncList(list_of_elem) + return deserialized.get("last_id") or None, AsyncList(list_of_elem) - async def get_next(next_link=None): - _request = prepare_request(next_link) + async def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) _stream = 
False pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access @@ -6966,81 +8478,53 @@ async def get_next(next_link=None): if response.status_code not in [200]: map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) return pipeline_response return AsyncItemPaged(get_next, extract_data) - @overload - async def create_or_update( - self, schedule_id: str, schedule: _models.Schedule, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. - - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Required. - :type schedule: ~azure.ai.projects.models.Schedule - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - async def create_or_update( - self, schedule_id: str, schedule: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. - - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Required. - :type schedule: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: Schedule. 
The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - async def create_or_update( - self, schedule_id: str, schedule: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. + @distributed_trace + def list_versions( + self, + name: str, + *, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, + **kwargs: Any + ) -> AsyncItemPaged["_models.ToolboxVersionObject"]: + """List all versions of a toolbox. - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Required. - :type schedule: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule + :param name: The name of the toolbox to list versions for. Required. + :type name: str + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None.
+ :paramtype before: str + :return: An iterator like instance of ToolboxVersionObject + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.ToolboxVersionObject] :raises ~azure.core.exceptions.HttpResponseError: """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} - @distributed_trace_async - async def create_or_update( - self, schedule_id: str, schedule: Union[_models.Schedule, JSON, IO[bytes]], **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. + cls: ClsType[List[_models.ToolboxVersionObject]] = kwargs.pop("cls", None) - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Is one of the following types: Schedule, JSON, - IO[bytes] Required. - :type schedule: ~azure.ai.projects.models.Schedule or JSON or IO[bytes] - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule - :raises ~azure.core.exceptions.HttpResponseError: - """ error_map: MutableMapping = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, @@ -7049,69 +8533,65 @@ async def create_or_update( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) - _params = kwargs.pop("params", {}) or {} - - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) - - content_type = content_type or "application/json" - _content = None - if isinstance(schedule, (IOBase, bytes)): - _content = schedule - else: - _content = json.dumps(schedule, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + def prepare_request(_continuation_token=None): - _request = build_beta_schedules_create_or_update_request( - schedule_id=schedule_id, - content_type=content_type, - api_version=self._config.api_version, - content=_content, - 
headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) + _request = build_beta_toolboxes_list_versions_request( + name=name, + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + return _request - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) - pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) + async def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.ToolboxVersionObject], + deserialized.get("data", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("last_id") or None, AsyncList(list_of_elem) - response = pipeline_response.http_response + async def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) - if response.status_code not in [200, 201]: - if _stream: - try: - await response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass - map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response - if _stream: - 
deserialized = response.iter_bytes() if _decompress else response.iter_raw() - else: - deserialized = _deserialize(_models.Schedule, response.json()) + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return pipeline_response - return deserialized # type: ignore + return AsyncItemPaged(get_next, extract_data) @distributed_trace_async - async def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models.ScheduleRun: - """Get a schedule run by id. + async def get_version(self, name: str, version: str, **kwargs: Any) -> _models.ToolboxVersionObject: + """Retrieve a specific version of a toolbox. - :param schedule_id: The unique identifier of the schedule. Required. - :type schedule_id: str - :param run_id: The unique identifier of the schedule run. Required. - :type run_id: str - :return: ScheduleRun. The ScheduleRun is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ScheduleRun + :param name: The name of the toolbox. Required. + :type name: str + :param version: The version identifier to retrieve. Required. + :type version: str + :return: ToolboxVersionObject. 
The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7125,11 +8605,11 @@ async def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.ScheduleRun] = kwargs.pop("cls", None) + cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) - _request = build_beta_schedules_get_run_request( - schedule_id=schedule_id, - run_id=run_id, + _request = build_beta_toolboxes_get_version_request( + name=name, + version=version, api_version=self._config.api_version, headers=_headers, params=_params, @@ -7163,238 +8643,83 @@ async def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.ScheduleRun, response.json()) + deserialized = _deserialize(_models.ToolboxVersionObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - @distributed_trace - def list_runs( - self, - schedule_id: str, - *, - type: Optional[Union[str, _models.ScheduleTaskType]] = None, - enabled: Optional[bool] = None, - **kwargs: Any - ) -> AsyncItemPaged["_models.ScheduleRun"]: - """List all schedule runs. - - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". - Default value is None. - :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType - :keyword enabled: Filter by the enabled status. Default value is None. 
- :paramtype enabled: bool - :return: An iterator like instance of ScheduleRun - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.ScheduleRun] - :raises ~azure.core.exceptions.HttpResponseError: - """ - _headers = kwargs.pop("headers", {}) or {} - _params = kwargs.pop("params", {}) or {} - - cls: ClsType[List[_models.ScheduleRun]] = kwargs.pop("cls", None) - - error_map: MutableMapping = { - 401: ClientAuthenticationError, - 404: ResourceNotFoundError, - 409: ResourceExistsError, - 304: ResourceNotModifiedError, - } - error_map.update(kwargs.pop("error_map", {}) or {}) - - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_schedules_list_runs_request( - schedule_id=schedule_id, - type=type, - enabled=enabled, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - return _request - - async def extract_data(pipeline_response): - deserialized = pipeline_response.http_response.json() - list_of_elem = _deserialize( - 
List[_models.ScheduleRun], - deserialized.get("value", []), - ) - if cls: - list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, AsyncList(list_of_elem) - - async def get_next(next_link=None): - _request = prepare_request(next_link) - - _stream = False - pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) - response = pipeline_response.http_response - - if response.status_code not in [200]: - map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) - - return pipeline_response - - return AsyncItemPaged(get_next, extract_data) - - -class BetaToolboxesOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.aio.AIProjectClient`'s - :attr:`toolboxes` attribute. - """ - - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") - @overload - async def create_version( - self, - name: str, - *, - tools: List[_models.Tool], - content_type: str = "application/json", - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - policies: Optional[_models.ToolboxPolicies] = None, - **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. 
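The `list`/`list_versions` rewrites above replace the old `nextLink`-based paging with a `last_id` cursor: each page body carries a `data` array plus a `last_id` cursor, and the generated `get_next` feeds that cursor back as the `after` query parameter until no cursor is returned. A minimal local sketch of that contract, where `fetch_page` is a hypothetical stand-in for one service round-trip (not part of the SDK):

```python
def fetch_page(after=None, limit=2):
    """Hypothetical stand-in for one service call; mimics a {"data": [...], "last_id": ...} page."""
    items = [{"id": f"toolbox_{i}"} for i in range(5)]
    start = 0
    if after is not None:
        # Resume just past the object named by the cursor, as an "after" parameter implies.
        start = next(i for i, it in enumerate(items) if it["id"] == after) + 1
    page = items[start : start + limit]
    return {
        "data": page,
        # Mirrors extract_data: last_id is the cursor, absent once the list is exhausted.
        "last_id": page[-1]["id"] if start + limit < len(items) else None,
    }


def iterate_all():
    """Drain every page the way the generated get_next/extract_data loop does."""
    token, results = None, []
    while True:
        body = fetch_page(after=token)
        results.extend(body["data"])
        token = body.get("last_id") or None  # same truthiness check as extract_data
        if token is None:
            return results


print([item["id"] for item in iterate_all()])  # all five toolboxes, in order
```

In the real client the same loop is hidden behind `AsyncItemPaged`, so callers simply use `async for toolbox in client.toolboxes.list(limit=...)`.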
+ async def update( + self, name: str, *, default_version: str, content_type: str = "application/json", **kwargs: Any + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. - :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str - :keyword tools: The list of tools to include in this version. Required. - :paramtype tools: list[~azure.ai.projects.models.Tool] + :keyword default_version: The version identifier that the toolbox should point to. When set, + the toolbox's default version will resolve to this version instead of the latest. Required. + :paramtype default_version: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :keyword description: A human-readable description of the toolbox. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is - None. - :paramtype metadata: dict[str, str] - :keyword policies: Policy configuration for this toolbox version. Default value is None. - :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def create_version( + async def update( self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. 
- :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str :param body: Required. :type body: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def create_version( + async def update( self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. - :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str :param body: Required. :type body: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :return: ToolboxObject. 
The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace_async - async def create_version( - self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, - *, - tools: List[_models.Tool] = _Unset, - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - policies: Optional[_models.ToolboxPolicies] = None, - **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + async def update( + self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, default_version: str = _Unset, **kwargs: Any + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. - :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str :param body: Is either a JSON type or a IO[bytes] type. Required. :type body: JSON or IO[bytes] - :keyword tools: The list of tools to include in this version. Required. - :paramtype tools: list[~azure.ai.projects.models.Tool] - :keyword description: A human-readable description of the toolbox. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is - None. - :paramtype metadata: dict[str, str] - :keyword policies: Policy configuration for this toolbox version. Default value is None. - :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :keyword default_version: The version identifier that the toolbox should point to. When set, + the toolbox's default version will resolve to this version instead of the latest. Required. 
+ :paramtype default_version: str + :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7409,12 +8734,12 @@ async def create_version( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) + cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None) if body is _Unset: - if tools is _Unset: - raise TypeError("missing required argument: tools") - body = {"description": description, "metadata": metadata, "policies": policies, "tools": tools} + if default_version is _Unset: + raise TypeError("missing required argument: default_version") + body = {"default_version": default_version} body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None @@ -7423,7 +8748,7 @@ async def create_version( else: _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_toolboxes_create_version_request( + _request = build_beta_toolboxes_update_request( name=name, content_type=content_type, api_version=self._config.api_version, @@ -7460,7 +8785,7 @@ async def create_version( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.ToolboxVersionObject, response.json()) + deserialized = _deserialize(_models.ToolboxObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -7468,13 +8793,13 @@ async def create_version( return deserialized # type: ignore @distributed_trace_async - async def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: - """Retrieve a toolbox. 
+ async def delete(self, name: str, **kwargs: Any) -> None: + """Delete a toolbox and all its versions. - :param name: The name of the toolbox to retrieve. Required. + :param name: The name of the toolbox to delete. Required. :type name: str - :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxObject + :return: None + :rtype: None :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7488,9 +8813,9 @@ async def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None) + cls: ClsType[None] = kwargs.pop("cls", None) - _request = build_beta_toolboxes_get_request( + _request = build_beta_toolboxes_delete_request( name=name, api_version=self._config.api_version, headers=_headers, @@ -7501,20 +8826,14 @@ async def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) + _stream = False pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [200]: - if _stream: - try: - await response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass + if response.status_code not in [204]: map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -7522,50 +8841,21 @@ async def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: ) raise HttpResponseError(response=response, model=error) - if _stream: - deserialized = response.iter_bytes() if _decompress 
else response.iter_raw() - else: - deserialized = _deserialize(_models.ToolboxObject, response.json()) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore - - return deserialized # type: ignore + return cls(pipeline_response, None, {}) # type: ignore - @distributed_trace - def list( - self, - *, - limit: Optional[int] = None, - order: Optional[Union[str, _models.PageOrder]] = None, - before: Optional[str] = None, - **kwargs: Any - ) -> AsyncItemPaged["_models.ToolboxObject"]: - """List all toolboxes. + @distributed_trace_async + async def delete_version(self, name: str, version: str, **kwargs: Any) -> None: + """Delete a specific version of a toolbox. - :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and - 100, and the - default is 20. Default value is None. - :paramtype limit: int - :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for - ascending order and``desc`` - for descending order. Known values are: "asc" and "desc". Default value is None. - :paramtype order: str or ~azure.ai.projects.models.PageOrder - :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your - place in the list. - For instance, if you make a list request and receive 100 objects, ending with obj_foo, your - subsequent call can include before=obj_foo in order to fetch the previous page of the list. - Default value is None. - :paramtype before: str - :return: An iterator like instance of ToolboxObject - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.ToolboxObject] + :param name: The name of the toolbox. Required. + :type name: str + :param version: The version identifier to delete. Required. 
+ :type version: str + :return: None + :rtype: None :raises ~azure.core.exceptions.HttpResponseError: """ - _headers = kwargs.pop("headers", {}) or {} - _params = kwargs.pop("params", {}) or {} - - cls: ClsType[List[_models.ToolboxObject]] = kwargs.pop("cls", None) - error_map: MutableMapping = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, @@ -7574,158 +8864,155 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(_continuation_token=None): + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} - _request = build_beta_toolboxes_list_request( - limit=limit, - order=order, - after=_continuation_token, - before=before, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - return _request + cls: ClsType[None] = kwargs.pop("cls", None) - async def extract_data(pipeline_response): - deserialized = pipeline_response.http_response.json() - list_of_elem = _deserialize( - List[_models.ToolboxObject], - deserialized.get("data", []), - ) - if cls: - list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("last_id") or None, AsyncList(list_of_elem) + _request = build_beta_toolboxes_delete_version_request( + name=name, + version=version, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - async def get_next(_continuation_token=None): - _request = prepare_request(_continuation_token) + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: 
disable=protected-access + _request, stream=_stream, **kwargs + ) - _stream = False - pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs + response = pipeline_response.http_response + + if response.status_code not in [204]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, ) - response = pipeline_response.http_response + raise HttpResponseError(response=response, model=error) - if response.status_code not in [200]: - map_error(status_code=response.status_code, response=response, error_map=error_map) - error = _failsafe_deserialize( - _models.ApiErrorResponse, - response, - ) - raise HttpResponseError(response=response, model=error) + if cls: + return cls(pipeline_response, None, {}) # type: ignore - return pipeline_response - return AsyncItemPaged(get_next, extract_data) +class BetaSkillsOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. - @distributed_trace - def list_versions( + Instead, you should access the following operations through + :class:`~azure.ai.projects.aio.AIProjectClient`'s + :attr:`skills` attribute. 
+ """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + + @overload + async def create( self, - name: str, *, - limit: Optional[int] = None, - order: Optional[Union[str, _models.PageOrder]] = None, - before: Optional[str] = None, + name: str, + content_type: str = "application/json", + description: Optional[str] = None, + instructions: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> AsyncItemPaged["_models.ToolboxVersionObject"]: - """List all versions of a toolbox. - - :param name: The name of the toolbox to list versions for. Required. - :type name: str - :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and - 100, and the - default is 20. Default value is None. - :paramtype limit: int - :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for - ascending order and``desc`` - for descending order. Known values are: "asc" and "desc". Default value is None. - :paramtype order: str or ~azure.ai.projects.models.PageOrder - :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your - place in the list. - For instance, if you make a list request and receive 100 objects, ending with obj_foo, your - subsequent call can include before=obj_foo in order to fetch the previous page of the list. - Default value is None. 
- :paramtype before: str - :return: An iterator like instance of ToolboxVersionObject - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.ToolboxVersionObject] - :raises ~azure.core.exceptions.HttpResponseError: - """ - _headers = kwargs.pop("headers", {}) or {} - _params = kwargs.pop("params", {}) or {} - - cls: ClsType[List[_models.ToolboxVersionObject]] = kwargs.pop("cls", None) - - error_map: MutableMapping = { - 401: ClientAuthenticationError, - 404: ResourceNotFoundError, - 409: ResourceExistsError, - 304: ResourceNotModifiedError, - } - error_map.update(kwargs.pop("error_map", {}) or {}) - - def prepare_request(_continuation_token=None): - - _request = build_beta_toolboxes_list_versions_request( - name=name, - limit=limit, - order=order, - after=_continuation_token, - before=before, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - return _request + ) -> _models.SkillObject: + """Creates a skill. - async def extract_data(pipeline_response): - deserialized = pipeline_response.http_response.json() - list_of_elem = _deserialize( - List[_models.ToolboxVersionObject], - deserialized.get("data", []), - ) - if cls: - list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("last_id") or None, AsyncList(list_of_elem) + :keyword name: The unique name of the skill. Required. + :paramtype name: str + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the skill. Default value is None. + :paramtype description: str + :keyword instructions: Instructions that define the behavior of the skill. Default value is + None. 
+ :paramtype instructions: str + :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be + useful for storing additional information about the object in a structured + format, and querying for objects via API or the dashboard. - async def get_next(_continuation_token=None): - _request = prepare_request(_continuation_token) + Keys are strings with a maximum length of 64 characters. Values are strings + with a maximum length of 512 characters. Default value is None. + :paramtype metadata: dict[str, str] + :return: SkillObject. The SkillObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SkillObject + :raises ~azure.core.exceptions.HttpResponseError: + """ - _stream = False - pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) - response = pipeline_response.http_response + @overload + async def create(self, body: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.SkillObject: + """Creates a skill. - if response.status_code not in [200]: - map_error(status_code=response.status_code, response=response, error_map=error_map) - error = _failsafe_deserialize( - _models.ApiErrorResponse, - response, - ) - raise HttpResponseError(response=response, model=error) + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: SkillObject. The SkillObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SkillObject + :raises ~azure.core.exceptions.HttpResponseError: + """ - return pipeline_response + @overload + async def create( + self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.SkillObject: + """Creates a skill. - return AsyncItemPaged(get_next, extract_data) + :param body: Required. 
+ :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: SkillObject. The SkillObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SkillObject + :raises ~azure.core.exceptions.HttpResponseError: + """ @distributed_trace_async - async def get_version(self, name: str, version: str, **kwargs: Any) -> _models.ToolboxVersionObject: - """Retrieve a specific version of a toolbox. + async def create( + self, + body: Union[JSON, IO[bytes]] = _Unset, + *, + name: str = _Unset, + description: Optional[str] = None, + instructions: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + **kwargs: Any + ) -> _models.SkillObject: + """Creates a skill. - :param name: The name of the toolbox. Required. - :type name: str - :param version: The version identifier to retrieve. Required. - :type version: str - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword name: The unique name of the skill. Required. + :paramtype name: str + :keyword description: A human-readable description of the skill. Default value is None. + :paramtype description: str + :keyword instructions: Instructions that define the behavior of the skill. Default value is + None. + :paramtype instructions: str + :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be + useful for storing additional information about the object in a structured + format, and querying for objects via API or the dashboard. + + Keys are strings with a maximum length of 64 characters. Values are strings + with a maximum length of 512 characters. Default value is None. + :paramtype metadata: dict[str, str] + :return: SkillObject. 
The SkillObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SkillObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7736,15 +9023,28 @@ async def get_version(self, name: str, version: str, **kwargs: Any) -> _models.T } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) - _request = build_beta_toolboxes_get_version_request( - name=name, - version=version, + if body is _Unset: + if name is _Unset: + raise TypeError("missing required argument: name") + body = {"description": description, "instructions": instructions, "metadata": metadata, "name": name} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_skills_create_request( + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -7761,7 +9061,7 @@ async def get_version(self, name: str, version: str, **kwargs: Any) -> _models.T response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -7777,83 +9077,21 @@ async def get_version(self, name: str, version: str, **kwargs: Any) -> _models.T if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = 
_deserialize(_models.ToolboxVersionObject, response.json()) + deserialized = _deserialize(_models.SkillObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - @overload - async def update( - self, name: str, *, default_version: str, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxObject: - """Update a toolbox to point to a specific version. - - :param name: The name of the toolbox to update. Required. - :type name: str - :keyword default_version: The version identifier that the toolbox should point to. When set, - the toolbox's default version will resolve to this version instead of the latest. Required. - :paramtype default_version: str - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxObject - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - async def update( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxObject: - """Update a toolbox to point to a specific version. - - :param name: The name of the toolbox to update. Required. - :type name: str - :param body: Required. - :type body: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxObject - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - async def update( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxObject: - """Update a toolbox to point to a specific version. 
- - :param name: The name of the toolbox to update. Required. - :type name: str - :param body: Required. - :type body: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxObject - :raises ~azure.core.exceptions.HttpResponseError: - """ - @distributed_trace_async - async def update( - self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, default_version: str = _Unset, **kwargs: Any - ) -> _models.ToolboxObject: - """Update a toolbox to point to a specific version. + async def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject: + """Creates a skill from a zip package. - :param name: The name of the toolbox to update. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword default_version: The version identifier that the toolbox should point to. When set, - the toolbox's default version will resolve to this version instead of the latest. Required. - :paramtype default_version: str - :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxObject + :param body: The zip package used to create the skill. Required. + :type body: bytes + :return: SkillObject. 
The SkillObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SkillObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7867,23 +9105,12 @@ async def update( _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None) + content_type: str = kwargs.pop("content_type", _headers.pop("Content-Type", "application/zip")) + cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) - if body is _Unset: - if default_version is _Unset: - raise TypeError("missing required argument: default_version") - body = {"default_version": default_version} - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = body - _request = build_beta_toolboxes_update_request( - name=name, + _request = build_beta_skills_create_from_package_request( content_type=content_type, api_version=self._config.api_version, content=_content, @@ -7903,7 +9130,7 @@ async def update( response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -7919,7 +9146,7 @@ async def update( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.ToolboxObject, response.json()) + deserialized = _deserialize(_models.SkillObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -7927,13 +9154,13 @@ async def update( return deserialized # type: ignore 
@distributed_trace_async - async def delete(self, name: str, **kwargs: Any) -> None: - """Delete a toolbox and all its versions. + async def get(self, name: str, **kwargs: Any) -> _models.SkillObject: + """Retrieves a skill. - :param name: The name of the toolbox to delete. Required. + :param name: The unique name of the skill. Required. :type name: str - :return: None - :rtype: None + :return: SkillObject. The SkillObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.SkillObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7947,9 +9174,9 @@ async def delete(self, name: str, **kwargs: Any) -> None: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[None] = kwargs.pop("cls", None) + cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) - _request = build_beta_toolboxes_delete_request( + _request = build_beta_skills_get_request( name=name, api_version=self._config.api_version, headers=_headers, @@ -7960,14 +9187,20 @@ async def delete(self, name: str, **kwargs: Any) -> None: } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _stream = False + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [204]: + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -7975,19 +9208,24 @@ async def delete(self, name: str, **kwargs: Any) -> None: ) raise HttpResponseError(response=response, model=error) + if 
_stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.SkillObject, response.json()) + if cls: - return cls(pipeline_response, None, {}) # type: ignore + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore @distributed_trace_async - async def delete_version(self, name: str, version: str, **kwargs: Any) -> None: - """Delete a specific version of a toolbox. + async def download(self, name: str, **kwargs: Any) -> AsyncIterator[bytes]: + """Downloads a skill package. - :param name: The name of the toolbox. Required. - :type name: str - :param version: The version identifier to delete. Required. - :type version: str - :return: None - :rtype: None + :param name: The unique name of the skill. Required. + :type name: str + :return: AsyncIterator[bytes] + :rtype: AsyncIterator[bytes] :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8001,11 +9239,10 @@ async def delete_version(self, name: str, version: str, **kwargs: Any) -> None: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[None] = kwargs.pop("cls", None) + cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) - _request = build_beta_toolboxes_delete_version_request( + _request = build_beta_skills_download_request( name=name, - version=version, api_version=self._config.api_version, headers=_headers, params=_params, @@ -8015,14 +9252,20 @@ async def delete_version(self, name: str, version: str, **kwargs: Any) -> None: } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _stream = False + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", True) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if 
response.status_code not in [204]: + if response.status_code not in [200]: + if _stream: + try: + await response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -8030,42 +9273,121 @@ async def delete_version(self, name: str, version: str, **kwargs: Any) -> None: ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) + + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + if cls: - return cls(pipeline_response, None, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore + return deserialized # type: ignore -class BetaSkillsOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. + @distributed_trace + def list( + self, + *, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, + **kwargs: Any + ) -> AsyncItemPaged["_models.SkillObject"]: + """Returns the list of all skills. - Instead, you should access the following operations through - :class:`~azure.ai.projects.aio.AIProjectClient`'s - :attr:`skills` attribute. - """ + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. 
+ For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None. + :paramtype before: str + :return: An iterator like instance of SkillObject + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.SkillObject] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + cls: ClsType[List[_models.SkillObject]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(_continuation_token=None): + + _request = build_beta_skills_list_request( + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + return _request + + async def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.SkillObject], + deserialized.get("data", []), + ) + if cls: + list_of_elem = 
cls(list_of_elem) # type: ignore + return deserialized.get("last_id") or None, AsyncList(list_of_elem) + + async def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) + + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response + + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + return pipeline_response + + return AsyncItemPaged(get_next, extract_data) @overload - async def create( + async def update( self, - *, name: str, + *, content_type: str = "application/json", description: Optional[str] = None, instructions: Optional[str] = None, metadata: Optional[dict[str, str]] = None, **kwargs: Any ) -> _models.SkillObject: - """Creates a skill. + """Updates an existing skill. - :keyword name: The unique name of the skill. Required. - :paramtype name: str + :param name: The unique name of the skill. Required. + :type name: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -8087,9 +9409,13 @@ async def create( """ @overload - async def create(self, body: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.SkillObject: - """Creates a skill. + async def update( + self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.SkillObject: + """Updates an existing skill. + :param name: The unique name of the skill. Required. + :type name: str :param body: Required. :type body: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. 
@@ -8101,11 +9427,13 @@ async def create(self, body: JSON, *, content_type: str = "application/json", ** """ @overload - async def create( - self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + async def update( + self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any ) -> _models.SkillObject: - """Creates a skill. + """Updates an existing skill. + :param name: The unique name of the skill. Required. + :type name: str :param body: Required. :type body: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. @@ -8117,113 +9445,34 @@ async def create( """ @distributed_trace_async - async def create( - self, - body: Union[JSON, IO[bytes]] = _Unset, - *, - name: str = _Unset, - description: Optional[str] = None, - instructions: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.SkillObject: - """Creates a skill. - - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword name: The unique name of the skill. Required. - :paramtype name: str - :keyword description: A human-readable description of the skill. Default value is None. - :paramtype description: str - :keyword instructions: Instructions that define the behavior of the skill. Default value is - None. - :paramtype instructions: str - :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be - useful for storing additional information about the object in a structured - format, and querying for objects via API or the dashboard. - - Keys are strings with a maximum length of 64 characters. Values are strings - with a maximum length of 512 characters. Default value is None. - :paramtype metadata: dict[str, str] - :return: SkillObject. 
The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject - :raises ~azure.core.exceptions.HttpResponseError: - """ - error_map: MutableMapping = { - 401: ClientAuthenticationError, - 404: ResourceNotFoundError, - 409: ResourceExistsError, - 304: ResourceNotModifiedError, - } - error_map.update(kwargs.pop("error_map", {}) or {}) - - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) - _params = kwargs.pop("params", {}) or {} - - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) - - if body is _Unset: - if name is _Unset: - raise TypeError("missing required argument: name") - body = {"description": description, "instructions": instructions, "metadata": metadata, "name": name} - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - - _request = build_beta_skills_create_request( - content_type=content_type, - api_version=self._config.api_version, - content=_content, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) - pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) - - response = pipeline_response.http_response - - if response.status_code not in [201]: - if _stream: - try: - await response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - 
pass - map_error(status_code=response.status_code, response=response, error_map=error_map) - error = _failsafe_deserialize( - _models.ApiErrorResponse, - response, - ) - raise HttpResponseError(response=response, model=error) - - if _stream: - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - else: - deserialized = _deserialize(_models.SkillObject, response.json()) - - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore - - return deserialized # type: ignore - - @distributed_trace_async - async def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject: - """Creates a skill from a zip package. + async def update( + self, + name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + description: Optional[str] = None, + instructions: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + **kwargs: Any + ) -> _models.SkillObject: + """Updates an existing skill. - :param body: The zip package used to create the skill. Required. - :type body: bytes + :param name: The unique name of the skill. Required. + :type name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword description: A human-readable description of the skill. Default value is None. + :paramtype description: str + :keyword instructions: Instructions that define the behavior of the skill. Default value is + None. + :paramtype instructions: str + :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be + useful for storing additional information about the object in a structured + format, and querying for objects via API or the dashboard. + + Keys are strings with a maximum length of 64 characters. Values are strings + with a maximum length of 512 characters. Default value is None. + :paramtype metadata: dict[str, str] :return: SkillObject. 
The SkillObject is compatible with MutableMapping :rtype: ~azure.ai.projects.models.SkillObject :raises ~azure.core.exceptions.HttpResponseError: @@ -8239,12 +9488,21 @@ async def create_from_package(self, body: bytes, **kwargs: Any) -> _models.Skill _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - content_type: str = kwargs.pop("content_type", _headers.pop("Content-Type", "application/zip")) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) - _content = body + if body is _Unset: + body = {"description": description, "instructions": instructions, "metadata": metadata} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_skills_create_from_package_request( + _request = build_beta_skills_update_request( + name=name, content_type=content_type, api_version=self._config.api_version, content=_content, @@ -8264,7 +9522,7 @@ async def create_from_package(self, body: bytes, **kwargs: Any) -> _models.Skill response = pipeline_response.http_response - if response.status_code not in [201]: + if response.status_code not in [200]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -8288,13 +9546,13 @@ async def create_from_package(self, body: bytes, **kwargs: Any) -> _models.Skill return deserialized # type: ignore @distributed_trace_async - async def get(self, name: str, **kwargs: Any) -> _models.SkillObject: - """Retrieves a skill. + async def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: + """Deletes a skill. :param name: The unique name of the skill. Required. :type name: str - :return: SkillObject. 
The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DeleteSkillResponse. The DeleteSkillResponse is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DeleteSkillResponse :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8308,9 +9566,9 @@ async def get(self, name: str, **kwargs: Any) -> _models.SkillObject: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) + cls: ClsType[_models.DeleteSkillResponse] = kwargs.pop("cls", None) - _request = build_beta_skills_get_request( + _request = build_beta_skills_delete_request( name=name, api_version=self._config.api_version, headers=_headers, @@ -8345,21 +9603,41 @@ async def get(self, name: str, **kwargs: Any) -> _models.SkillObject: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.SkillObject, response.json()) + deserialized = _deserialize(_models.DeleteSkillResponse, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore + +class BetaDatasetsOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.aio.AIProjectClient`'s + :attr:`datasets` attribute. 
+ """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: AsyncPipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + @distributed_trace_async - async def download(self, name: str, **kwargs: Any) -> AsyncIterator[bytes]: - """Downloads a skill package. + async def get_generation_job(self, job_id: str, **kwargs: Any) -> _models.DataGenerationJob: + """Get info about a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :return: AsyncIterator[bytes] - :rtype: AsyncIterator[bytes] + Gets the details of a data generation job by its ID. + + :param job_id: The ID of the job. Required. + :type job_id: str + :return: DataGenerationJob. 
The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8373,10 +9651,10 @@ async def download(self, name: str, **kwargs: Any) -> AsyncIterator[bytes]: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[AsyncIterator[bytes]] = kwargs.pop("cls", None) + cls: ClsType[_models.DataGenerationJob] = kwargs.pop("cls", None) - _request = build_beta_skills_download_request( - name=name, + _request = build_beta_datasets_get_generation_job_request( + job_id=job_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -8387,7 +9665,7 @@ async def download(self, name: str, **kwargs: Any) -> AsyncIterator[bytes]: _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", True) + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) @@ -8408,9 +9686,12 @@ async def download(self, name: str, **kwargs: Any) -> AsyncIterator[bytes]: raise HttpResponseError(response=response, model=error) response_headers = {} - response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) + response_headers["Retry-After"] = self._deserialize("int", response.headers.get("Retry-After")) - deserialized = response.iter_bytes() if _decompress else response.iter_raw() + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.DataGenerationJob, response.json()) if cls: return cls(pipeline_response, deserialized, response_headers) # type: ignore @@ -8418,15 +9699,19 @@ async def download(self, name: str, **kwargs: Any) -> AsyncIterator[bytes]: return deserialized 
# type: ignore @distributed_trace - def list( + def list_generation_jobs( self, *, limit: Optional[int] = None, order: Optional[Union[str, _models.PageOrder]] = None, before: Optional[str] = None, + scenario: Optional[Union[str, _models.DataGenerationJobScenario]] = None, + type: Optional[List[Union[str, _models.DataGenerationJobType]]] = None, **kwargs: Any - ) -> AsyncItemPaged["_models.SkillObject"]: - """Returns the list of all skills. + ) -> AsyncItemPaged["_models.DataGenerationJob"]: + """Returns a list of data generation jobs. + + Returns a list of data generation jobs. :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the @@ -8442,14 +9727,19 @@ def list( subsequent call can include before=obj_foo in order to fetch the previous page of the list. Default value is None. :paramtype before: str - :return: An iterator like instance of SkillObject - :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.SkillObject] + :keyword scenario: Filter data generation jobs by their scenario. Known values are: + "supervised_finetuning", "reinforcement_finetuning", and "evaluation". Default value is None. + :paramtype scenario: str or ~azure.ai.projects.models.DataGenerationJobScenario + :keyword type: Filter data generation jobs by their type. Default value is None. 
+ :paramtype type: list[str or ~azure.ai.projects.models.DataGenerationJobType] + :return: An iterator like instance of DataGenerationJob + :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.ai.projects.models.DataGenerationJob] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.SkillObject]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.DataGenerationJob]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -8461,11 +9751,13 @@ def list( def prepare_request(_continuation_token=None): - _request = build_beta_skills_list_request( + _request = build_beta_datasets_list_generation_jobs_request( limit=limit, order=order, after=_continuation_token, before=before, + scenario=scenario, + type=type, api_version=self._config.api_version, headers=_headers, params=_params, @@ -8479,7 +9771,7 @@ def prepare_request(_continuation_token=None): async def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.SkillObject], + List[_models.DataGenerationJob], deserialized.get("data", []), ) if cls: @@ -8508,107 +9800,98 @@ async def get_next(_continuation_token=None): return AsyncItemPaged(get_next, extract_data) @overload - async def update( + async def create_generation_job( self, - name: str, + job: _models.DataGenerationJob, *, + operation_id: Optional[str] = None, content_type: str = "application/json", - description: Optional[str] = None, - instructions: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. + ) -> _models.DataGenerationJob: + """Creates a data generation job. - :param name: The unique name of the skill. Required. - :type name: str + Creates a data generation job. + + :param job: The job to create. Required. 
+ :type job: ~azure.ai.projects.models.DataGenerationJob + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :keyword description: A human-readable description of the skill. Default value is None. - :paramtype description: str - :keyword instructions: Instructions that define the behavior of the skill. Default value is - None. - :paramtype instructions: str - :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be - useful for storing additional information about the object in a structured - format, and querying for objects via API or the dashboard. - - Keys are strings with a maximum length of 64 characters. Values are strings - with a maximum length of 512 characters. Default value is None. - :paramtype metadata: dict[str, str] - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def update( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. - - :param name: The unique name of the skill. Required. - :type name: str - :param body: Required. - :type body: JSON + async def create_generation_job( + self, job: JSON, *, operation_id: Optional[str] = None, content_type: str = "application/json", **kwargs: Any + ) -> _models.DataGenerationJob: + """Creates a data generation job. + + Creates a data generation job. + + :param job: The job to create. Required. 
+ :type job: JSON + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - async def update( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. + async def create_generation_job( + self, + job: IO[bytes], + *, + operation_id: Optional[str] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.DataGenerationJob: + """Creates a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :param body: Required. - :type body: IO[bytes] + Creates a data generation job. + + :param job: The job to create. Required. + :type job: IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DataGenerationJob. 
The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace_async - async def update( + async def create_generation_job( self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, + job: Union[_models.DataGenerationJob, JSON, IO[bytes]], *, - description: Optional[str] = None, - instructions: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, + operation_id: Optional[str] = None, **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. + ) -> _models.DataGenerationJob: + """Creates a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword description: A human-readable description of the skill. Default value is None. - :paramtype description: str - :keyword instructions: Instructions that define the behavior of the skill. Default value is - None. - :paramtype instructions: str - :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be - useful for storing additional information about the object in a structured - format, and querying for objects via API or the dashboard. + Creates a data generation job. - Keys are strings with a maximum length of 64 characters. Values are strings - with a maximum length of 512 characters. Default value is None. - :paramtype metadata: dict[str, str] - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :param job: The job to create. Is one of the following types: DataGenerationJob, JSON, + IO[bytes] Required. + :type job: ~azure.ai.projects.models.DataGenerationJob or JSON or IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. 
Default value is None. + :paramtype operation_id: str + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8623,20 +9906,17 @@ async def update( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) + cls: ClsType[_models.DataGenerationJob] = kwargs.pop("cls", None) - if body is _Unset: - body = {"description": description, "instructions": instructions, "metadata": metadata} - body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(job, (IOBase, bytes)): + _content = job else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(job, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_skills_update_request( - name=name, + _request = build_beta_datasets_create_generation_job_request( + operation_id=operation_id, content_type=content_type, api_version=self._config.api_version, content=_content, @@ -8656,7 +9936,7 @@ async def update( response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: await response.read() # Load the body in memory and close the socket @@ -8669,24 +9949,30 @@ async def update( ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) + response_headers["Location"] = self._deserialize("str", response.headers.get("Location")) + if _stream: deserialized = response.iter_bytes() if _decompress else 
response.iter_raw() else: - deserialized = _deserialize(_models.SkillObject, response.json()) + deserialized = _deserialize(_models.DataGenerationJob, response.json()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace_async - async def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: - """Deletes a skill. + async def cancel_generation_job(self, job_id: str, **kwargs: Any) -> _models.DataGenerationJob: + """Cancels a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :return: DeleteSkillResponse. The DeleteSkillResponse is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.DeleteSkillResponse + Cancels a data generation job by its ID. + + :param job_id: The ID of the job to cancel. Required. + :type job_id: str + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8700,10 +9986,10 @@ async def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.DeleteSkillResponse] = kwargs.pop("cls", None) + cls: ClsType[_models.DataGenerationJob] = kwargs.pop("cls", None) - _request = build_beta_skills_delete_request( - name=name, + _request = build_beta_datasets_cancel_generation_job_request( + job_id=job_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -8737,9 +10023,63 @@ async def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.DeleteSkillResponse, response.json()) 
+ deserialized = _deserialize(_models.DataGenerationJob, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore + + @distributed_trace_async + async def delete_generation_job(self, job_id: str, **kwargs: Any) -> None: + """Deletes a data generation job. + + Deletes a data generation job by its ID. + + :param job_id: The ID of the job to delete. Required. + :type job_id: str + :return: None + :rtype: None + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[None] = kwargs.pop("cls", None) + + _request = build_beta_datasets_delete_generation_job_request( + job_id=job_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = False + pipeline_response: PipelineResponse = await self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [204]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if cls: + return cls(pipeline_response, None, {}) # type: ignore diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch.py index 
56dea6851b0d..3c3e527771b4 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch.py @@ -18,6 +18,7 @@ from ._patch_sessions_async import BetaAgentsOperations from ...operations._patch import _BETA_OPERATION_FEATURE_HEADERS, _OperationMethodHeaderProxy from ._operations import ( + BetaDatasetsOperations, BetaEvaluationTaxonomiesOperations, BetaEvaluatorsOperations, BetaInsightsOperations, @@ -57,6 +58,8 @@ class BetaOperations(GeneratedBetaOperations): """:class:`~azure.ai.projects.aio.operations.BetaToolboxesOperations` operations""" skills: BetaSkillsOperations """:class:`~azure.ai.projects.aio.operations.BetaSkillsOperations` operations""" + datasets: BetaDatasetsOperations + """:class:`~azure.ai.projects.aio.operations.BetaDatasetsOperations` operations""" def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) @@ -78,6 +81,7 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: __all__: List[str] = [ "AgentsOperations", "BetaAgentsOperations", + "BetaDatasetsOperations", "BetaEvaluationTaxonomiesOperations", "BetaEvaluatorsOperations", "BetaInsightsOperations", diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/models/__init__.py b/sdk/ai/azure-ai-projects/azure/ai/projects/models/__init__.py index 8dfca8b30a03..e2766099b650 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/models/__init__.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/models/__init__.py @@ -21,10 +21,12 @@ AgentCardSkill, AgentClusterInsightRequest, AgentClusterInsightResult, + AgentDataGenerationJobSource, AgentDefinition, AgentDetails, - AgentEndpoint, AgentEndpointAuthorizationScheme, + AgentEndpointConfig, + AgentEvaluatorGenerationJobSource, AgentIdentity, AgentObjectVersions, AgentSessionResource, @@ -83,6 +85,8 @@ ContainerSkill, ContinuousEvaluationRuleAction, CosmosDBIndex, + CreateAgentVersionFromCodeContent, + 
CreateAgentVersionFromCodeRequest, CronTrigger, CustomCredential, CustomGrammarFormatParam, @@ -90,7 +94,19 @@ CustomToolParam, CustomToolParamFormat, DailyRecurrenceSchedule, + DataGenerationJob, + DataGenerationJobInputs, + DataGenerationJobOptions, + DataGenerationJobOutput, + DataGenerationJobResult, + DataGenerationJobSource, + DataGenerationModelOptions, + DataGenerationTokenUsage, DatasetCredential, + DatasetDataGenerationJobOutput, + DatasetDataGenerationJobSource, + DatasetEvaluatorGenerationJobSource, + DatasetReference, DatasetVersion, DeleteAgentResponse, DeleteAgentVersionResponse, @@ -100,7 +116,6 @@ EmbeddingConfiguration, EntraAuthorizationScheme, EntraIDCredentials, - EntraIsolationKeySource, EvalResult, EvalRunResultCompareItem, EvalRunResultComparison, @@ -117,10 +132,19 @@ EvaluationTaxonomy, EvaluationTaxonomyInput, EvaluatorDefinition, + EvaluatorGenerationArtifacts, + EvaluatorGenerationInputs, + EvaluatorGenerationJob, + EvaluatorGenerationJobSource, + EvaluatorGenerationTokenUsage, EvaluatorMetric, EvaluatorVersion, FabricDataAgentToolParameters, + FabricIQPreviewTool, + FabricIQPreviewToolParameters, FieldMapping, + FileDataGenerationJobOutput, + FileDataGenerationJobSource, FileDatasetVersion, FileSearchTool, FixedRatioVersionSelectionRule, @@ -130,7 +154,6 @@ FunctionShellToolParamEnvironmentContainerReferenceParam, FunctionShellToolParamEnvironmentLocalEnvironmentParam, FunctionTool, - HeaderIsolationKeySource, HeaderTelemetryEndpointAuth, HostedAgentDefinition, HourlyRecurrenceSchedule, @@ -150,7 +173,6 @@ InsightScheduleTask, InsightSummary, InsightsMetadata, - IsolationKeySource, LocalShellToolParam, LocalSkillParam, MCPTool, @@ -194,6 +216,8 @@ PromptAgentDefinition, PromptAgentDefinitionTextOptions, PromptBasedEvaluatorDefinition, + PromptDataGenerationJobSource, + PromptEvaluatorGenerationJobSource, ProtocolVersionRecord, RaiConfig, RankingOptions, @@ -203,6 +227,8 @@ RedTeam, ResponseUsageInputTokensDetails, 
ResponseUsageOutputTokensDetails, + RubricBasedEvaluatorDefinition, + RubricCriterion, SASCredentials, Schedule, ScheduleRun, @@ -213,6 +239,7 @@ SessionLogEvent, SharepointGroundingToolParameters, SharepointPreviewTool, + SimpleQnADataGenerationJobOptions, SkillObject, SkillReferenceParam, SpecificApplyPatchParam, @@ -244,9 +271,14 @@ ToolChoiceWebSearchPreview20250311, ToolDescription, ToolProjectConnection, + ToolUseFineTuningDataGenerationJobOptions, ToolboxObject, ToolboxPolicies, + ToolboxSearchPreviewTool, ToolboxVersionObject, + TracesDataGenerationJobOptions, + TracesDataGenerationJobSource, + TracesEvaluatorGenerationJobSource, Trigger, UpdateToolboxRequest, UserProfileMemoryItem, @@ -276,6 +308,7 @@ AgentVersionStatus, AttackStrategy, AzureAISearchQueryType, + CodeDependencyResolution, ComputerEnvironment, ConnectionType, ContainerMemoryLimit, @@ -283,6 +316,10 @@ ContainerSkillType, CredentialType, CustomToolParamFormatType, + DataGenerationJobOutputType, + DataGenerationJobScenario, + DataGenerationJobSourceType, + DataGenerationJobType, DatasetType, DayOfWeek, DeploymentType, @@ -291,6 +328,7 @@ EvaluationTaxonomyInputType, EvaluatorCategory, EvaluatorDefinitionType, + EvaluatorGenerationJobSourceType, EvaluatorMetricDirection, EvaluatorMetricType, EvaluatorType, @@ -300,7 +338,7 @@ IndexType, InputFidelity, InsightType, - IsolationKeySourceKind, + JobStatus, MemoryItemKind, MemoryOperationKind, MemoryStoreKind, @@ -318,6 +356,7 @@ ScheduleTaskType, SearchContextSize, SessionLogEventType, + SimpleQnAFineTuningQuestionType, TelemetryDataKind, TelemetryEndpointAuthType, TelemetryEndpointKind, @@ -342,10 +381,12 @@ "AgentCardSkill", "AgentClusterInsightRequest", "AgentClusterInsightResult", + "AgentDataGenerationJobSource", "AgentDefinition", "AgentDetails", - "AgentEndpoint", "AgentEndpointAuthorizationScheme", + "AgentEndpointConfig", + "AgentEvaluatorGenerationJobSource", "AgentIdentity", "AgentObjectVersions", "AgentSessionResource", @@ -404,6 +445,8 
@@ "ContainerSkill", "ContinuousEvaluationRuleAction", "CosmosDBIndex", + "CreateAgentVersionFromCodeContent", + "CreateAgentVersionFromCodeRequest", "CronTrigger", "CustomCredential", "CustomGrammarFormatParam", @@ -411,7 +454,19 @@ "CustomToolParam", "CustomToolParamFormat", "DailyRecurrenceSchedule", + "DataGenerationJob", + "DataGenerationJobInputs", + "DataGenerationJobOptions", + "DataGenerationJobOutput", + "DataGenerationJobResult", + "DataGenerationJobSource", + "DataGenerationModelOptions", + "DataGenerationTokenUsage", "DatasetCredential", + "DatasetDataGenerationJobOutput", + "DatasetDataGenerationJobSource", + "DatasetEvaluatorGenerationJobSource", + "DatasetReference", "DatasetVersion", "DeleteAgentResponse", "DeleteAgentVersionResponse", @@ -421,7 +476,6 @@ "EmbeddingConfiguration", "EntraAuthorizationScheme", "EntraIDCredentials", - "EntraIsolationKeySource", "EvalResult", "EvalRunResultCompareItem", "EvalRunResultComparison", @@ -438,10 +492,19 @@ "EvaluationTaxonomy", "EvaluationTaxonomyInput", "EvaluatorDefinition", + "EvaluatorGenerationArtifacts", + "EvaluatorGenerationInputs", + "EvaluatorGenerationJob", + "EvaluatorGenerationJobSource", + "EvaluatorGenerationTokenUsage", "EvaluatorMetric", "EvaluatorVersion", "FabricDataAgentToolParameters", + "FabricIQPreviewTool", + "FabricIQPreviewToolParameters", "FieldMapping", + "FileDataGenerationJobOutput", + "FileDataGenerationJobSource", "FileDatasetVersion", "FileSearchTool", "FixedRatioVersionSelectionRule", @@ -451,7 +514,6 @@ "FunctionShellToolParamEnvironmentContainerReferenceParam", "FunctionShellToolParamEnvironmentLocalEnvironmentParam", "FunctionTool", - "HeaderIsolationKeySource", "HeaderTelemetryEndpointAuth", "HostedAgentDefinition", "HourlyRecurrenceSchedule", @@ -471,7 +533,6 @@ "InsightScheduleTask", "InsightSummary", "InsightsMetadata", - "IsolationKeySource", "LocalShellToolParam", "LocalSkillParam", "MCPTool", @@ -515,6 +576,8 @@ "PromptAgentDefinition", 
"PromptAgentDefinitionTextOptions", "PromptBasedEvaluatorDefinition", + "PromptDataGenerationJobSource", + "PromptEvaluatorGenerationJobSource", "ProtocolVersionRecord", "RaiConfig", "RankingOptions", @@ -524,6 +587,8 @@ "RedTeam", "ResponseUsageInputTokensDetails", "ResponseUsageOutputTokensDetails", + "RubricBasedEvaluatorDefinition", + "RubricCriterion", "SASCredentials", "Schedule", "ScheduleRun", @@ -534,6 +599,7 @@ "SessionLogEvent", "SharepointGroundingToolParameters", "SharepointPreviewTool", + "SimpleQnADataGenerationJobOptions", "SkillObject", "SkillReferenceParam", "SpecificApplyPatchParam", @@ -565,9 +631,14 @@ "ToolChoiceWebSearchPreview20250311", "ToolDescription", "ToolProjectConnection", + "ToolUseFineTuningDataGenerationJobOptions", "ToolboxObject", "ToolboxPolicies", + "ToolboxSearchPreviewTool", "ToolboxVersionObject", + "TracesDataGenerationJobOptions", + "TracesDataGenerationJobSource", + "TracesEvaluatorGenerationJobSource", "Trigger", "UpdateToolboxRequest", "UserProfileMemoryItem", @@ -594,6 +665,7 @@ "AgentVersionStatus", "AttackStrategy", "AzureAISearchQueryType", + "CodeDependencyResolution", "ComputerEnvironment", "ConnectionType", "ContainerMemoryLimit", @@ -601,6 +673,10 @@ "ContainerSkillType", "CredentialType", "CustomToolParamFormatType", + "DataGenerationJobOutputType", + "DataGenerationJobScenario", + "DataGenerationJobSourceType", + "DataGenerationJobType", "DatasetType", "DayOfWeek", "DeploymentType", @@ -609,6 +685,7 @@ "EvaluationTaxonomyInputType", "EvaluatorCategory", "EvaluatorDefinitionType", + "EvaluatorGenerationJobSourceType", "EvaluatorMetricDirection", "EvaluatorMetricType", "EvaluatorType", @@ -618,7 +695,7 @@ "IndexType", "InputFidelity", "InsightType", - "IsolationKeySourceKind", + "JobStatus", "MemoryItemKind", "MemoryOperationKind", "MemoryStoreKind", @@ -636,6 +713,7 @@ "ScheduleTaskType", "SearchContextSize", "SessionLogEventType", + "SimpleQnAFineTuningQuestionType", "TelemetryDataKind", 
"TelemetryEndpointAuthType", "TelemetryEndpointKind", diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/models/_enums.py b/sdk/ai/azure-ai-projects/azure/ai/projects/models/_enums.py index 588927c00421..40648951bbda 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/models/_enums.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/models/_enums.py @@ -21,6 +21,8 @@ class _AgentDefinitionOptInKeys(str, Enum, metaclass=CaseInsensitiveEnumMeta): """CONTAINER_AGENTS_V1_PREVIEW.""" AGENT_ENDPOINT_V1_PREVIEW = "AgentEndpoints=V1Preview" """AGENT_ENDPOINT_V1_PREVIEW.""" + CODE_AGENTS_V1_PREVIEW = "CodeAgents=V1Preview" + """CODE_AGENTS_V1_PREVIEW.""" class _FoundryFeaturesOptInKeys(str, Enum, metaclass=CaseInsensitiveEnumMeta): @@ -40,6 +42,8 @@ class _FoundryFeaturesOptInKeys(str, Enum, metaclass=CaseInsensitiveEnumMeta): """MEMORY_STORES_V1_PREVIEW.""" TOOLBOXES_V1_PREVIEW = "Toolboxes=V1Preview" """TOOLBOXES_V1_PREVIEW.""" + DATA_GENERATION_JOBS_V1_PREVIEW = "DataGenerationJobs=V1Preview" + """DATA_GENERATION_JOBS_V1_PREVIEW.""" class AgentBlueprintReferenceType(str, Enum, metaclass=CaseInsensitiveEnumMeta): @@ -232,6 +236,16 @@ class AzureAISearchQueryType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """Query type ``vector_semantic_hybrid``.""" +class CodeDependencyResolution(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """How package dependencies are resolved at deployment time for a code-based hosted agent.""" + + BUNDLED = "bundled" + """The caller has bundled all dependencies into the uploaded zip; the service performs no remote + build.""" + REMOTE_BUILD = "remote_build" + """The service builds dependencies remotely from the manifest included in the uploaded zip.""" + + class ComputerEnvironment(str, Enum, metaclass=CaseInsensitiveEnumMeta): """Type of ComputerEnvironment.""" @@ -329,6 +343,52 @@ class CustomToolParamFormatType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """GRAMMAR.""" +class DataGenerationJobOutputType(str, Enum, 
metaclass=CaseInsensitiveEnumMeta): + """The supported output file types for a data generation job.""" + + FILE = "file" + """The generated data is an Azure OpenAI File.""" + DATASET = "dataset" + """The generated data is a Dataset.""" + + +class DataGenerationJobScenario(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """The supported scenarios for a data generation job.""" + + SUPERVISED_FINETUNING = "supervised_finetuning" + """Supervised Fine-tuning scenario.""" + REINFORCEMENT_FINETUNING = "reinforcement_finetuning" + """Reinforcement Fine-tuning scenario.""" + EVALUATION = "evaluation" + """Evaluation scenario.""" + + +class DataGenerationJobSourceType(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """The supported source types for data generation jobs.""" + + PROMPT = "prompt" + """Prompt source — inline text provided by the user.""" + AGENT = "agent" + """Agent source — references an agent.""" + TRACES = "traces" + """Traces source — conversation traces from Application Insights.""" + DATASET = "dataset" + """Dataset source — reference to a dataset.""" + FILE = "file" + """File source — Azure OpenAI file.""" + + +class DataGenerationJobType(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """The supported data generation job types.""" + + SIMPLE_QNA = "simple_qna" + """Simple question and answers between user and agent.""" + TRACES = "traces" + """Single turn query and response from agent traces.""" + TOOL_USE = "tool_use" + """Tool calling conversation between user and agent.""" + + class DatasetType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """Enum to determine the type of data.""" @@ -415,6 +475,22 @@ class EvaluatorDefinitionType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """Service-based evaluator.""" OPENAI_GRADERS = "openai_graders" """OpenAI graders.""" + RUBRICS = "rubrics" + """Rubric-based evaluator definition. Stores rubric criteria for both quality and safety + evaluators. 
Can be created via the generate API or manually via createVersion.""" + + +class EvaluatorGenerationJobSourceType(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """The supported source types for evaluator generation jobs.""" + + PROMPT = "prompt" + """Prompt source — inline text provided by the user.""" + AGENT = "agent" + """Agent source — references an agent to fetch instructions and metadata from.""" + TRACES = "traces" + """Traces source — conversation traces from Application Insights.""" + DATASET = "dataset" + """Dataset source — reference to a dataset.""" class EvaluatorMetricDirection(str, Enum, metaclass=CaseInsensitiveEnumMeta): @@ -516,13 +592,19 @@ class InsightType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """Evaluation Comparison.""" -class IsolationKeySourceKind(str, Enum, metaclass=CaseInsensitiveEnumMeta): - """Type of IsolationKeySourceKind.""" +class JobStatus(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """Extensible status values shared by Foundry jobs.""" - ENTRA = "Entra" - """ENTRA.""" - HEADER = "Header" - """HEADER.""" + QUEUED = "queued" + """Job is waiting to start.""" + IN_PROGRESS = "in_progress" + """Job is actively processing.""" + SUCCEEDED = "succeeded" + """Job completed successfully.""" + FAILED = "failed" + """Job failed.""" + CANCELLED = "cancelled" + """Job was cancelled by the caller.""" class MemoryItemKind(str, Enum, metaclass=CaseInsensitiveEnumMeta): @@ -725,6 +807,15 @@ class SessionLogEventType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """A log line from the agent session container.""" +class SimpleQnAFineTuningQuestionType(str, Enum, metaclass=CaseInsensitiveEnumMeta): + """The supported question types for SimpleQnA data generation jobs used for fine-tuning scenarios.""" + + SHORT_ANSWER = "short_answer" + """Short answer question type.""" + LONG_ANSWER = "long_answer" + """Long answer question type.""" + + class TelemetryDataKind(str, Enum, metaclass=CaseInsensitiveEnumMeta): """The type of telemetry 
data to export.""" @@ -840,6 +931,10 @@ class ToolType(str, Enum, metaclass=CaseInsensitiveEnumMeta): """MEMORY_SEARCH_PREVIEW.""" WORK_IQ_PREVIEW = "work_iq_preview" """WORK_IQ_PREVIEW.""" + FABRIC_IQ_PREVIEW = "fabric_iq_preview" + """FABRIC_IQ_PREVIEW.""" + TOOLBOX_SEARCH_PREVIEW = "toolbox_search_preview" + """TOOLBOX_SEARCH_PREVIEW.""" AZURE_AI_SEARCH = "azure_ai_search" """AZURE_AI_SEARCH.""" AZURE_FUNCTION = "azure_function" diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/models/_models.py b/sdk/ai/azure-ai-projects/azure/ai/projects/models/_models.py index f2da2ee9bdb2..ee95d39675aa 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/models/_models.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/models/_models.py @@ -12,6 +12,7 @@ from typing import Any, Literal, Mapping, Optional, TYPE_CHECKING, Union, overload from .._utils.model_base import Model as _Model, rest_discriminator, rest_field +from .._utils.utils import FileType from ._enums import ( AgentBlueprintReferenceType, AgentEndpointAuthorizationSchemeType, @@ -21,15 +22,18 @@ ContainerSkillType, CredentialType, CustomToolParamFormatType, + DataGenerationJobOutputType, + DataGenerationJobSourceType, + DataGenerationJobType, DatasetType, DeploymentType, EvaluationRuleActionType, EvaluationTaxonomyInputType, EvaluatorDefinitionType, + EvaluatorGenerationJobSourceType, FunctionShellToolParamEnvironmentType, IndexType, InsightType, - IsolationKeySourceKind, MemoryItemKind, MemoryStoreKind, MemoryStoreObjectType, @@ -59,16 +63,18 @@ class Tool(_Model): A2APreviewTool, ApplyPatchToolParam, AzureAISearchTool, AzureFunctionTool, BingCustomSearchPreviewTool, BingGroundingTool, BrowserAutomationPreviewTool, CaptureStructuredOutputsTool, CodeInterpreterTool, ComputerUsePreviewTool, CustomToolParam, - MicrosoftFabricPreviewTool, FileSearchTool, FunctionTool, ImageGenTool, LocalShellToolParam, - MCPTool, MemorySearchPreviewTool, OpenApiTool, SharepointPreviewTool, FunctionShellToolParam, - 
WebSearchTool, WebSearchPreviewTool, WorkIQPreviewTool + MicrosoftFabricPreviewTool, FabricIQPreviewTool, FileSearchTool, FunctionTool, ImageGenTool, + LocalShellToolParam, MCPTool, MemorySearchPreviewTool, OpenApiTool, SharepointPreviewTool, + FunctionShellToolParam, ToolboxSearchPreviewTool, WebSearchTool, WebSearchPreviewTool, + WorkIQPreviewTool :ivar type: Required. Known values are: "function", "file_search", "computer_use_preview", "web_search", "mcp", "code_interpreter", "image_generation", "local_shell", "shell", "custom", "web_search_preview", "apply_patch", "a2a_preview", "bing_custom_search_preview", "browser_automation_preview", "fabric_dataagent_preview", "sharepoint_grounding_preview", - "memory_search_preview", "work_iq_preview", "azure_ai_search", "azure_function", - "bing_grounding", "capture_structured_outputs", and "openapi". + "memory_search_preview", "work_iq_preview", "fabric_iq_preview", "toolbox_search_preview", + "azure_ai_search", "azure_function", "bing_grounding", "capture_structured_outputs", and + "openapi". :vartype type: str or ~azure.ai.projects.models.ToolType """ @@ -79,8 +85,8 @@ class Tool(_Model): \"shell\", \"custom\", \"web_search_preview\", \"apply_patch\", \"a2a_preview\", \"bing_custom_search_preview\", \"browser_automation_preview\", \"fabric_dataagent_preview\", \"sharepoint_grounding_preview\", \"memory_search_preview\", \"work_iq_preview\", - \"azure_ai_search\", \"azure_function\", \"bing_grounding\", \"capture_structured_outputs\", - and \"openapi\".""" + \"fabric_iq_preview\", \"toolbox_search_preview\", \"azure_ai_search\", \"azure_function\", + \"bing_grounding\", \"capture_structured_outputs\", and \"openapi\".""" @overload def __init__( @@ -105,6 +111,10 @@ class A2APreviewTool(Tool, discriminator="a2a_preview"): :ivar type: The type of the tool. Always ``"a2a_preview``. Required. A2A_PREVIEW. 
:vartype type: str or ~azure.ai.projects.models.A2A_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar base_url: Base URL of the agent. :vartype base_url: str :ivar agent_card_path: The path to the agent card relative to the ``base_url``. If not @@ -118,6 +128,10 @@ class A2APreviewTool(Tool, discriminator="a2a_preview"): type: Literal[ToolType.A2A_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The type of the tool. Always ``\"a2a_preview``. Required. A2A_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" base_url: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) """Base URL of the agent.""" agent_card_path: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) @@ -131,6 +145,8 @@ class A2APreviewTool(Tool, discriminator="a2a_preview"): def __init__( self, *, + name: Optional[str] = None, + description: Optional[str] = None, base_url: Optional[str] = None, agent_card_path: Optional[str] = None, project_connection_id: Optional[str] = None, @@ -410,6 +426,94 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: self.type = InsightType.AGENT_CLUSTER_INSIGHT # type: ignore +class DataGenerationJobSource(_Model): + """The base source model for data generation jobs. + + You probably want to use the sub-classes and not this class directly. 
Known sub-classes are: + AgentDataGenerationJobSource, DatasetDataGenerationJobSource, FileDataGenerationJobSource, + PromptDataGenerationJobSource, TracesDataGenerationJobSource + + :ivar type: The type of source. Required. Known values are: "prompt", "agent", "traces", + "dataset", and "file". + :vartype type: str or ~azure.ai.projects.models.DataGenerationJobSourceType + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + """ + + __mapping__: dict[str, _Model] = {} + type: str = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) + """The type of source. Required. Known values are: \"prompt\", \"agent\", \"traces\", \"dataset\", + and \"file\".""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional description of what this source represents — helps the pipeline interpret its content + (e.g., 'Company refund policy document' or 'Describes the agent's core capabilities').""" + + @overload + def __init__( + self, + *, + type: str, + description: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class AgentDataGenerationJobSource(DataGenerationJobSource, discriminator="agent"): + """Agent source for data generation jobs — references an agent to fetch instructions and metadata + from. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). 
+ :vartype description: str + :ivar type: The source type for this source, which is Agent. Required. Agent source — + references an agent. + :vartype type: str or ~azure.ai.projects.models.AGENT + :ivar agent_name: The agent name to fetch instructions from. Required. + :vartype agent_name: str + :ivar agent_version: The agent version. If not specified, the latest version is used. + :vartype agent_version: str + """ + + type: Literal[DataGenerationJobSourceType.AGENT] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Agent. Required. Agent source — references an agent.""" + agent_name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent name to fetch instructions from. Required.""" + agent_version: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent version. If not specified, the latest version is used.""" + + @overload + def __init__( + self, + *, + agent_name: str, + description: Optional[str] = None, + agent_version: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobSourceType.AGENT # type: ignore + + class AgentDefinition(_Model): """AgentDefinition. @@ -459,7 +563,7 @@ class AgentDetails(_Model): :ivar versions: The latest version of the agent. Required. :vartype versions: ~azure.ai.projects.models.AgentObjectVersions :ivar agent_endpoint: The endpoint configuration for the agent. - :vartype agent_endpoint: ~azure.ai.projects.models.AgentEndpoint + :vartype agent_endpoint: ~azure.ai.projects.models.AgentEndpointConfig :ivar instance_identity: The instance identity of the agent. 
:vartype instance_identity: ~azure.ai.projects.models.AgentIdentity :ivar blueprint: The blueprint for the agent. @@ -478,7 +582,7 @@ class AgentDetails(_Model): """The name of the agent. Required.""" versions: "_models.AgentObjectVersions" = rest_field(visibility=["read", "create", "update", "delete", "query"]) """The latest version of the agent. Required.""" - agent_endpoint: Optional["_models.AgentEndpoint"] = rest_field( + agent_endpoint: Optional["_models.AgentEndpointConfig"] = rest_field( visibility=["read", "create", "update", "delete", "query"] ) """The endpoint configuration for the agent.""" @@ -498,7 +602,7 @@ def __init__( id: str, # pylint: disable=redefined-builtin name: str, versions: "_models.AgentObjectVersions", - agent_endpoint: Optional["_models.AgentEndpoint"] = None, + agent_endpoint: Optional["_models.AgentEndpointConfig"] = None, agent_card: Optional["_models.AgentCard"] = None, ) -> None: ... @@ -513,8 +617,40 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) -class AgentEndpoint(_Model): - """AgentEndpoint. +class AgentEndpointAuthorizationScheme(_Model): + """AgentEndpointAuthorizationScheme. + + You probably want to use the sub-classes and not this class directly. Known sub-classes are: + BotServiceAuthorizationScheme, BotServiceRbacAuthorizationScheme, EntraAuthorizationScheme + + :ivar type: Required. Known values are: "Entra", "BotService", and "BotServiceRbac". + :vartype type: str or ~azure.ai.projects.models.AgentEndpointAuthorizationSchemeType + """ + + __mapping__: dict[str, _Model] = {} + type: str = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) + """Required. Known values are: \"Entra\", \"BotService\", and \"BotServiceRbac\".""" + + @overload + def __init__( + self, + *, + type: str, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class AgentEndpointConfig(_Model): + """AgentEndpointConfig. :ivar version_selector: The version selector of the agent endpoint determines how traffic is routed to different versions of the agent. @@ -560,19 +696,22 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) -class AgentEndpointAuthorizationScheme(_Model): - """AgentEndpointAuthorizationScheme. +class EvaluatorGenerationJobSource(_Model): + """The base source model for evaluator generation jobs. Polymorphic over ``type``. You probably want to use the sub-classes and not this class directly. Known sub-classes are: - BotServiceAuthorizationScheme, BotServiceRbacAuthorizationScheme, EntraAuthorizationScheme + AgentEvaluatorGenerationJobSource, DatasetEvaluatorGenerationJobSource, + PromptEvaluatorGenerationJobSource, TracesEvaluatorGenerationJobSource - :ivar type: Required. Known values are: "Entra", "BotService", and "BotServiceRbac". - :vartype type: str or ~azure.ai.projects.models.AgentEndpointAuthorizationSchemeType + :ivar type: The type of source. Required. Known values are: "prompt", "agent", "traces", and + "dataset". + :vartype type: str or ~azure.ai.projects.models.EvaluatorGenerationJobSourceType """ __mapping__: dict[str, _Model] = {} type: str = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) - """Required. Known values are: \"Entra\", \"BotService\", and \"BotServiceRbac\".""" + """The type of source. Required. 
Known values are: \"prompt\", \"agent\", \"traces\", and + \"dataset\".""" @overload def __init__( @@ -592,6 +731,55 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) +class AgentEvaluatorGenerationJobSource(EvaluatorGenerationJobSource, discriminator="agent"): + """Agent source for evaluator generation jobs — references an agent to fetch instructions and + metadata from. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Agent. Required. Agent source — + references an agent to fetch instructions and metadata from. + :vartype type: str or ~azure.ai.projects.models.AGENT + :ivar agent_name: The agent name to fetch instructions from. Required. + :vartype agent_name: str + :ivar agent_version: The agent version. If not specified, the latest version is used. + :vartype agent_version: str + """ + + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional description of what this source represents — helps the pipeline interpret its content + (e.g., 'Company refund policy document' or 'Describes the agent's core capabilities').""" + type: Literal[EvaluatorGenerationJobSourceType.AGENT] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Agent. Required. Agent source — references an agent + to fetch instructions and metadata from.""" + agent_name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent name to fetch instructions from. Required.""" + agent_version: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent version. 
If not specified, the latest version is used.""" + + @overload + def __init__( + self, + *, + agent_name: str, + description: Optional[str] = None, + agent_version: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = EvaluatorGenerationJobSourceType.AGENT # type: ignore + + class BaseCredentials(_Model): """A base class for connection credentials. @@ -1487,12 +1675,20 @@ class AzureAISearchTool(Tool, discriminator="azure_ai_search"): :ivar type: The object type, which is always 'azure_ai_search'. Required. AZURE_AI_SEARCH. :vartype type: str or ~azure.ai.projects.models.AZURE_AI_SEARCH + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar azure_ai_search: The azure ai search index resource. Required. :vartype azure_ai_search: ~azure.ai.projects.models.AzureAISearchToolResource """ type: Literal[ToolType.AZURE_AI_SEARCH] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The object type, which is always 'azure_ai_search'. Required. 
AZURE_AI_SEARCH.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" azure_ai_search: "_models.AzureAISearchToolResource" = rest_field( visibility=["read", "create", "update", "delete", "query"] ) @@ -1503,6 +1699,8 @@ def __init__( self, *, azure_ai_search: "_models.AzureAISearchToolResource", + name: Optional[str] = None, + description: Optional[str] = None, ) -> None: ... @overload @@ -1876,6 +2074,10 @@ class BingCustomSearchPreviewTool(Tool, discriminator="bing_custom_search_previe :ivar type: The object type, which is always 'bing_custom_search_preview'. Required. BING_CUSTOM_SEARCH_PREVIEW. :vartype type: str or ~azure.ai.projects.models.BING_CUSTOM_SEARCH_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar bing_custom_search_preview: The bing custom search tool parameters. Required. :vartype bing_custom_search_preview: ~azure.ai.projects.models.BingCustomSearchToolParameters """ @@ -1883,6 +2085,10 @@ class BingCustomSearchPreviewTool(Tool, discriminator="bing_custom_search_previe type: Literal[ToolType.BING_CUSTOM_SEARCH_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The object type, which is always 'bing_custom_search_preview'. Required. 
BING_CUSTOM_SEARCH_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" bing_custom_search_preview: "_models.BingCustomSearchToolParameters" = rest_field( visibility=["read", "create", "update", "delete", "query"] ) @@ -1893,6 +2099,8 @@ def __init__( self, *, bing_custom_search_preview: "_models.BingCustomSearchToolParameters", + name: Optional[str] = None, + description: Optional[str] = None, ) -> None: ... @overload @@ -2028,12 +2236,20 @@ class BingGroundingTool(Tool, discriminator="bing_grounding"): :ivar type: The object type, which is always 'bing_grounding'. Required. BING_GROUNDING. :vartype type: str or ~azure.ai.projects.models.BING_GROUNDING + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar bing_grounding: The bing grounding search tool parameters. Required. :vartype bing_grounding: ~azure.ai.projects.models.BingGroundingSearchToolParameters """ type: Literal[ToolType.BING_GROUNDING] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The object type, which is always 'bing_grounding'. Required. 
BING_GROUNDING.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" bing_grounding: "_models.BingGroundingSearchToolParameters" = rest_field( visibility=["read", "create", "update", "delete", "query"] ) @@ -2044,6 +2260,8 @@ def __init__( self, *, bing_grounding: "_models.BingGroundingSearchToolParameters", + name: Optional[str] = None, + description: Optional[str] = None, ) -> None: ... @overload @@ -2181,6 +2399,10 @@ class BrowserAutomationPreviewTool(Tool, discriminator="browser_automation_previ :ivar type: The object type, which is always 'browser_automation_preview'. Required. BROWSER_AUTOMATION_PREVIEW. :vartype type: str or ~azure.ai.projects.models.BROWSER_AUTOMATION_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar browser_automation_preview: The Browser Automation Tool parameters. Required. :vartype browser_automation_preview: ~azure.ai.projects.models.BrowserAutomationToolParameters """ @@ -2188,6 +2410,10 @@ class BrowserAutomationPreviewTool(Tool, discriminator="browser_automation_previ type: Literal[ToolType.BROWSER_AUTOMATION_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The object type, which is always 'browser_automation_preview'. Required. 
BROWSER_AUTOMATION_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" browser_automation_preview: "_models.BrowserAutomationToolParameters" = rest_field( visibility=["read", "create", "update", "delete", "query"] ) @@ -2198,6 +2424,8 @@ def __init__( self, *, browser_automation_preview: "_models.BrowserAutomationToolParameters", + name: Optional[str] = None, + description: Optional[str] = None, ) -> None: ... @overload @@ -2579,10 +2807,10 @@ class EvaluatorDefinition(_Model): """Base evaluator configuration with discriminator. You probably want to use the sub-classes and not this class directly. Known sub-classes are: - CodeBasedEvaluatorDefinition, PromptBasedEvaluatorDefinition + CodeBasedEvaluatorDefinition, PromptBasedEvaluatorDefinition, RubricBasedEvaluatorDefinition :ivar type: The type of evaluator definition. Required. Known values are: "prompt", "code", - "prompt_and_code", "service", and "openai_graders". + "prompt_and_code", "service", "openai_graders", and "rubrics". :vartype type: str or ~azure.ai.projects.models.EvaluatorDefinitionType :ivar init_parameters: The JSON schema (Draft 2020-12) for the evaluator's input parameters. This includes parameters like type, properties, required. @@ -2597,7 +2825,7 @@ class EvaluatorDefinition(_Model): __mapping__: dict[str, _Model] = {} type: str = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) """The type of evaluator definition. Required. 
Known values are: \"prompt\", \"code\", - \"prompt_and_code\", \"service\", and \"openai_graders\".""" + \"prompt_and_code\", \"service\", \"openai_graders\", and \"rubrics\".""" init_parameters: Optional[dict[str, Any]] = rest_field(visibility=["read", "create", "update", "delete", "query"]) """The JSON schema (Draft 2020-12) for the evaluator's input parameters. This includes parameters like type, properties, required.""" @@ -2699,6 +2927,16 @@ class CodeConfiguration(_Model): :vartype runtime: str :ivar entry_point: The entry point command and arguments for the code execution. Required. :vartype entry_point: list[str] + :ivar dependency_resolution: How package dependencies are resolved at deployment time. Defaults + to ``bundled``, where the caller bundles all dependencies into the uploaded zip and the service + performs no remote build. ``remote_build`` instructs the service to build dependencies remotely + from the manifest included in the uploaded zip. Required. Known values are: "bundled" and + "remote_build". + :vartype dependency_resolution: str or ~azure.ai.projects.models.CodeDependencyResolution + :ivar content_hash: The SHA-256 hex digest of the uploaded code zip. Set by the service from + the ``x-ms-code-zip-sha256`` request header; read-only in responses and never accepted in + request payloads. + :vartype content_hash: str """ runtime: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) @@ -2706,6 +2944,17 @@ class CodeConfiguration(_Model): Required.""" entry_point: list[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) """The entry point command and arguments for the code execution. Required.""" + dependency_resolution: Union[str, "_models.CodeDependencyResolution"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """How package dependencies are resolved at deployment time. 
Defaults to ``bundled``, where the + caller bundles all dependencies into the uploaded zip and the service performs no remote build. + ``remote_build`` instructs the service to build dependencies remotely from the manifest + included in the uploaded zip. Required. Known values are: \"bundled\" and \"remote_build\".""" + content_hash: Optional[str] = rest_field(visibility=["read"]) + """The SHA-256 hex digest of the uploaded code zip. Set by the service from the + ``x-ms-code-zip-sha256`` request header; read-only in responses and never accepted in request + payloads.""" @overload def __init__( @@ -2713,6 +2962,7 @@ def __init__( *, runtime: str, entry_point: list[str], + dependency_resolution: Union[str, "_models.CodeDependencyResolution"], ) -> None: ... @overload @@ -3398,6 +3648,97 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: self.type = IndexType.COSMOS_DB # type: ignore +class CreateAgentVersionFromCodeContent(_Model): + """Multipart request body for updating or versioning a code-based agent (POST /agents/{name} and + POST /agents/{name}/versions). + + :ivar metadata: JSON metadata including description and hosted definition. Required. + :vartype metadata: ~azure.ai.projects.models.CreateAgentVersionFromCodeRequest + :ivar code: The code zip file (max 250 MB). Required. + :vartype code: ~azure.ai.projects._utils.utils.FileType + """ + + metadata: "_models.CreateAgentVersionFromCodeRequest" = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """JSON metadata including description and hosted definition. Required.""" + code: FileType = rest_field( + visibility=["read", "create", "update", "delete", "query"], is_multipart_file_input=True + ) + """The code zip file (max 250 MB). Required.""" + + @overload + def __init__( + self, + *, + metadata: "_models.CreateAgentVersionFromCodeRequest", + code: FileType, + ) -> None: ... 
+ + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class CreateAgentVersionFromCodeRequest(_Model): + """JSON metadata for code-based agent operations (create, update, create version). The agent name + comes from the URL path parameter or the ``x-ms-agent-name`` header, so it is not included in + this model. The content hash (SHA-256 of the zip) is carried in the ``x-ms-code-zip-sha256`` + header. + + :ivar description: A human-readable description of the agent. + :vartype description: str + :ivar metadata: Set of 16 key-value pairs that can be attached to an object. This can be + useful for storing additional information about the object in a structured + format, and querying for objects via API or the dashboard. + + Keys are strings with a maximum length of 64 characters. Values are strings + with a maximum length of 512 characters. + :vartype metadata: dict[str, str] + :ivar definition: The hosted agent definition including code_configuration (runtime, + entry_point), cpu, memory, and protocol_versions. Required. + :vartype definition: ~azure.ai.projects.models.HostedAgentDefinition + """ + + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """A human-readable description of the agent.""" + metadata: Optional[dict[str, str]] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Set of 16 key-value pairs that can be attached to an object. This can be + useful for storing additional information about the object in a structured + format, and querying for objects via API or the dashboard. + + Keys are strings with a maximum length of 64 characters. 
Values are strings + with a maximum length of 512 characters.""" + definition: "_models.HostedAgentDefinition" = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The hosted agent definition including code_configuration (runtime, entry_point), cpu, memory, + and protocol_versions. Required.""" + + @overload + def __init__( + self, + *, + definition: "_models.HostedAgentDefinition", + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + class Trigger(_Model): """Base model for Trigger of the schedule. @@ -3724,23 +4065,53 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: self.type = RecurrenceType.DAILY # type: ignore -class DatasetCredential(_Model): - """Represents a reference to a blob for consumption. +class DataGenerationJob(_Model): + """Data Generation Job resource. - :ivar blob_reference: Credential info to access the storage account. Required. - :vartype blob_reference: ~azure.ai.projects.models.BlobReference + :ivar id: Server-assigned unique identifier. Required. + :vartype id: str + :ivar inputs: Caller-supplied inputs. + :vartype inputs: ~azure.ai.projects.models.DataGenerationJobInputs + :ivar result: Result produced on success. + :vartype result: ~azure.ai.projects.models.DataGenerationJobResult + :ivar status: Current lifecycle status. Required. Known values are: "queued", "in_progress", + "succeeded", "failed", and "cancelled". + :vartype status: str or ~azure.ai.projects.models.JobStatus + :ivar error: Error details — populated only on failure. 
+ :vartype error: ~azure.ai.projects.models.ApiError + :ivar created_at: The timestamp when the job was created, represented in Unix time (seconds + since January 1, 1970). Required. + :vartype created_at: ~datetime.datetime + :ivar finished_at: The timestamp when the job was finished, represented in Unix time (seconds + since January 1, 1970). + :vartype finished_at: ~datetime.datetime """ - blob_reference: "_models.BlobReference" = rest_field( - name="blobReference", visibility=["read", "create", "update", "delete", "query"] + id: str = rest_field(visibility=["read"]) + """Server-assigned unique identifier. Required.""" + inputs: Optional["_models.DataGenerationJobInputs"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] ) - """Credential info to access the storage account. Required.""" + """Caller-supplied inputs.""" + result: Optional["_models.DataGenerationJobResult"] = rest_field(visibility=["read"]) + """Result produced on success.""" + status: Union[str, "_models.JobStatus"] = rest_field(visibility=["read"]) + """Current lifecycle status. Required. Known values are: \"queued\", \"in_progress\", + \"succeeded\", \"failed\", and \"cancelled\".""" + error: Optional["_models.ApiError"] = rest_field(visibility=["read"]) + """Error details — populated only on failure.""" + created_at: datetime.datetime = rest_field(visibility=["read"], format="unix-timestamp") + """The timestamp when the job was created, represented in Unix time (seconds since January 1, + 1970). Required.""" + finished_at: Optional[datetime.datetime] = rest_field(visibility=["read"], format="unix-timestamp") + """The timestamp when the job was finished, represented in Unix time (seconds since January 1, + 1970).""" @overload def __init__( self, *, - blob_reference: "_models.BlobReference", + inputs: Optional["_models.DataGenerationJobInputs"] = None, ) -> None: ... 
@overload @@ -3754,11 +4125,446 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) -class DatasetVersion(_Model): - """DatasetVersion Definition. +class DataGenerationJobInputs(_Model): + """Caller-supplied inputs for a data generation job. + + :ivar name: The display name of the data generation job. Required. + :vartype name: str + :ivar sources: The sources used for the data generation job. Required. + :vartype sources: list[~azure.ai.projects.models.DataGenerationJobSource] + :ivar options: The options for the data generation job. Required. + :vartype options: ~azure.ai.projects.models.DataGenerationJobOptions + :ivar scenario: The scenario of the data generation job. Either for fine-tuning or evaluation. + Required. Known values are: "supervised_finetuning", "reinforcement_finetuning", and + "evaluation". + :vartype scenario: str or ~azure.ai.projects.models.DataGenerationJobScenario + """ + + name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The display name of the data generation job. Required.""" + sources: list["_models.DataGenerationJobSource"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The sources used for the data generation job. Required.""" + options: "_models.DataGenerationJobOptions" = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The options for the data generation job. Required.""" + scenario: Union[str, "_models.DataGenerationJobScenario"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The scenario of the data generation job. Either for fine-tuning or evaluation. Required. 
Known + values are: \"supervised_finetuning\", \"reinforcement_finetuning\", and \"evaluation\".""" + + @overload + def __init__( + self, + *, + name: str, + sources: list["_models.DataGenerationJobSource"], + options: "_models.DataGenerationJobOptions", + scenario: Union[str, "_models.DataGenerationJobScenario"], + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class DataGenerationJobOptions(_Model): + """Options for managing data generation jobs. + + You probably want to use the sub-classes and not this class directly. Known sub-classes are: + SimpleQnADataGenerationJobOptions, ToolUseFineTuningDataGenerationJobOptions, + TracesDataGenerationJobOptions + + :ivar type: The data generation job type. Required. Known values are: "simple_qna", "traces", + and "tool_use". + :vartype type: str or ~azure.ai.projects.models.DataGenerationJobType + :ivar max_samples: Maximum number of samples to generate. Required. + :vartype max_samples: int + :ivar train_split: The proportion of the generated data to be used for training when the data + is used for fine-tuning. The rest will be used for validation. Value should be between 0 and 1. + :vartype train_split: float + :ivar model_options: The LLM model options. + :vartype model_options: ~azure.ai.projects.models.DataGenerationModelOptions + """ + + __mapping__: dict[str, _Model] = {} + type: str = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) + """The data generation job type. Required. Known values are: \"simple_qna\", \"traces\", and + \"tool_use\".""" + max_samples: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Maximum number of samples to generate. 
Required.""" + train_split: Optional[float] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The proportion of the generated data to be used for training when the data is used for + fine-tuning. The rest will be used for validation. Value should be between 0 and 1.""" + model_options: Optional["_models.DataGenerationModelOptions"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The LLM model options.""" + + @overload + def __init__( + self, + *, + type: str, + max_samples: int, + train_split: Optional[float] = None, + model_options: Optional["_models.DataGenerationModelOptions"] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class DataGenerationJobOutput(_Model): + """Output information for a data generation job. + + You probably want to use the sub-classes and not this class directly. Known sub-classes are: + DatasetDataGenerationJobOutput, FileDataGenerationJobOutput + + :ivar type: The type of the output. Required. Known values are: "file" and "dataset". + :vartype type: str or ~azure.ai.projects.models.DataGenerationJobOutputType + """ + + __mapping__: dict[str, _Model] = {} + type: str = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) + """The type of the output. Required. Known values are: \"file\" and \"dataset\".""" + + @overload + def __init__( + self, + *, + type: str, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class DataGenerationJobResult(_Model): + """Result produced by a successful data generation job. + + :ivar outputs: The final job outputs: Azure OpenAI files for fine-tuning, or datasets for + evaluation. + :vartype outputs: list[~azure.ai.projects.models.DataGenerationJobOutput] + :ivar generated_samples: The number of samples actually generated. Required. + :vartype generated_samples: int + :ivar token_usage: The token usage information for the data generation job. + :vartype token_usage: ~azure.ai.projects.models.DataGenerationTokenUsage + """ + + outputs: Optional[list["_models.DataGenerationJobOutput"]] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The final job outputs: Azure OpenAI files for fine-tuning, or datasets for evaluation.""" + generated_samples: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The number of samples actually generated. Required.""" + token_usage: Optional["_models.DataGenerationTokenUsage"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The token usage information for the data generation job.""" + + @overload + def __init__( + self, + *, + generated_samples: int, + outputs: Optional[list["_models.DataGenerationJobOutput"]] = None, + token_usage: Optional["_models.DataGenerationTokenUsage"] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class DataGenerationModelOptions(_Model): + """LLM model options for data generation jobs. + + :ivar model: Base model name used to generate data. Required. 
+ :vartype model: str + """ + + model: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Base model name used to generate data. Required.""" + + @overload + def __init__( + self, + *, + model: str, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class DataGenerationTokenUsage(_Model): + """Token usage information for a data generation job. + + :ivar prompt_tokens: The number of prompt tokens used. Required. + :vartype prompt_tokens: int + :ivar completion_tokens: The number of completion tokens generated. Required. + :vartype completion_tokens: int + :ivar total_tokens: Total number of tokens used. Required. + :vartype total_tokens: int + """ + + prompt_tokens: int = rest_field(visibility=["read"]) + """The number of prompt tokens used. Required.""" + completion_tokens: int = rest_field(visibility=["read"]) + """The number of completion tokens generated. Required.""" + total_tokens: int = rest_field(visibility=["read"]) + """Total number of tokens used. Required.""" + + +class DatasetCredential(_Model): + """Represents a reference to a blob for consumption. + + :ivar blob_reference: Credential info to access the storage account. Required. + :vartype blob_reference: ~azure.ai.projects.models.BlobReference + """ + + blob_reference: "_models.BlobReference" = rest_field( + name="blobReference", visibility=["read", "create", "update", "delete", "query"] + ) + """Credential info to access the storage account. Required.""" + + @overload + def __init__( + self, + *, + blob_reference: "_models.BlobReference", + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class DatasetDataGenerationJobOutput(DataGenerationJobOutput, discriminator="dataset"): + """Dataset output for a data generation job. + + :ivar type: Dataset output. Required. The generated data is a Dataset. + :vartype type: str or ~azure.ai.projects.models.DATASET + :ivar id: The id of the output dataset created. + :vartype id: str + :ivar name: The name of the output dataset and can be optionally set during job creation time. + :vartype name: str + :ivar version: The version of the output dataset. + :vartype version: str + :ivar description: Description of the output dataset and can be optionally set during job + creation time. + :vartype description: str + :ivar tags: Tag dictionary of the output dataset and can be optionally set during job creation + time. + :vartype tags: dict[str, str] + """ + + type: Literal[DataGenerationJobOutputType.DATASET] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """Dataset output. Required. 
The generated data is a Dataset.""" + id: Optional[str] = rest_field(visibility=["read"]) + """The id of the output dataset created.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The name of the output dataset and can be optionally set during job creation time.""" + version: Optional[str] = rest_field(visibility=["read"]) + """The version of the output dataset.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Description of the output dataset and can be optionally set during job creation time.""" + tags: Optional[dict[str, str]] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Tag dictionary of the output dataset and can be optionally set during job creation time.""" + + @overload + def __init__( + self, + *, + name: Optional[str] = None, + description: Optional[str] = None, + tags: Optional[dict[str, str]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobOutputType.DATASET # type: ignore + + +class DatasetDataGenerationJobSource(DataGenerationJobSource, discriminator="dataset"): + """Dataset source for data generation jobs — reference to a dataset. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Dataset. Required. Dataset source — + reference to a dataset. + :vartype type: str or ~azure.ai.projects.models.DATASET + :ivar name: The name of the dataset. Required. 
+ :vartype name: str + :ivar version: The version of the dataset. If not specified, the latest version is used. + :vartype version: str + """ + + type: Literal[DataGenerationJobSourceType.DATASET] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Dataset. Required. Dataset source — reference to a + dataset.""" + name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The name of the dataset. Required.""" + version: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The version of the dataset. If not specified, the latest version is used.""" + + @overload + def __init__( + self, + *, + name: str, + description: Optional[str] = None, + version: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobSourceType.DATASET # type: ignore + + +class DatasetEvaluatorGenerationJobSource(EvaluatorGenerationJobSource, discriminator="dataset"): + """Dataset source for evaluator generation jobs — reference to a dataset. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Dataset. Required. Dataset source — + reference to a dataset. + :vartype type: str or ~azure.ai.projects.models.DATASET + :ivar name: The name of the dataset. Required. + :vartype name: str + :ivar version: The version of the dataset. If not specified, the latest version is used. 
+ :vartype version: str + """ + + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional description of what this source represents — helps the pipeline interpret its content + (e.g., 'Company refund policy document' or 'Describes the agent's core capabilities').""" + type: Literal[EvaluatorGenerationJobSourceType.DATASET] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Dataset. Required. Dataset source — reference to a + dataset.""" + name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The name of the dataset. Required.""" + version: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The version of the dataset. If not specified, the latest version is used.""" + + @overload + def __init__( + self, + *, + name: str, + description: Optional[str] = None, + version: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = EvaluatorGenerationJobSourceType.DATASET # type: ignore + + +class DatasetReference(_Model): + """Reference to a versioned Foundry Dataset. + + :ivar name: Dataset name. Required. + :vartype name: str + :ivar version: Dataset version. Required. + :vartype version: str + """ + + name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Dataset name. Required.""" + version: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Dataset version. Required.""" + + @overload + def __init__( + self, + *, + name: str, + version: str, + ) -> None: ... 
+ + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) - You probably want to use the sub-classes and not this class directly. Known sub-classes are: - FileDatasetVersion, FolderDatasetVersion + +class DatasetVersion(_Model): + """DatasetVersion Definition. + + You probably want to use the sub-classes and not this class directly. Known sub-classes are: + FileDatasetVersion, FolderDatasetVersion :ivar data_uri: URI of the data (`example `_). Required. @@ -4062,22 +4868,14 @@ class EntraAuthorizationScheme(AgentEndpointAuthorizationScheme, discriminator=" :ivar type: Required. ENTRA. :vartype type: str or ~azure.ai.projects.models.ENTRA - :ivar isolation_key_source: Required. - :vartype isolation_key_source: ~azure.ai.projects.models.IsolationKeySource """ type: Literal[AgentEndpointAuthorizationSchemeType.ENTRA] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """Required. ENTRA.""" - isolation_key_source: "_models.IsolationKeySource" = rest_field( - visibility=["read", "create", "update", "delete", "query"] - ) - """Required.""" @overload def __init__( self, - *, - isolation_key_source: "_models.IsolationKeySource", ) -> None: ... @overload @@ -4119,65 +4917,6 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: self.type = CredentialType.ENTRA_ID # type: ignore -class IsolationKeySource(_Model): - """IsolationKeySource. - - You probably want to use the sub-classes and not this class directly. Known sub-classes are: - EntraIsolationKeySource, HeaderIsolationKeySource - - :ivar kind: Required. Known values are: "Entra" and "Header". 
- :vartype kind: str or ~azure.ai.projects.models.IsolationKeySourceKind - """ - - __mapping__: dict[str, _Model] = {} - kind: str = rest_discriminator(name="kind", visibility=["read", "create", "update", "delete", "query"]) - """Required. Known values are: \"Entra\" and \"Header\".""" - - @overload - def __init__( - self, - *, - kind: str, - ) -> None: ... - - @overload - def __init__(self, mapping: Mapping[str, Any]) -> None: - """ - :param mapping: raw JSON to initialize the model. - :type mapping: Mapping[str, Any] - """ - - def __init__(self, *args: Any, **kwargs: Any) -> None: - super().__init__(*args, **kwargs) - - -class EntraIsolationKeySource(IsolationKeySource, discriminator="Entra"): - """EntraIsolationKeySource. - - :ivar kind: Required. ENTRA. - :vartype kind: str or ~azure.ai.projects.models.ENTRA - """ - - kind: Literal[IsolationKeySourceKind.ENTRA] = rest_discriminator(name="kind", visibility=["read", "create", "update", "delete", "query"]) # type: ignore - """Required. ENTRA.""" - - @overload - def __init__( - self, - ) -> None: ... - - @overload - def __init__(self, mapping: Mapping[str, Any]) -> None: - """ - :param mapping: raw JSON to initialize the model. - :type mapping: Mapping[str, Any] - """ - - def __init__(self, *args: Any, **kwargs: Any) -> None: - super().__init__(*args, **kwargs) - self.kind = IsolationKeySourceKind.ENTRA # type: ignore - - class EvalResult(_Model): """Result of the evaluation. @@ -4792,9 +5531,255 @@ class EvaluationScheduleTask(ScheduleTask, discriminator="Evaluation"): def __init__( self, *, - eval_id: str, - eval_run: Any, - configuration: Optional[dict[str, str]] = None, + eval_id: str, + eval_run: Any, + configuration: Optional[dict[str, str]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = ScheduleTaskType.EVALUATION # type: ignore + + +class EvaluationTaxonomy(_Model): + """Evaluation Taxonomy Definition. + + :ivar id: Asset ID, a unique identifier for the asset. + :vartype id: str + :ivar name: The name of the resource. Required. + :vartype name: str + :ivar version: The version of the resource. Required. + :vartype version: str + :ivar description: The asset description text. + :vartype description: str + :ivar tags: Tag dictionary. Tags can be added, removed, and updated. + :vartype tags: dict[str, str] + :ivar taxonomy_input: Input configuration for the evaluation taxonomy. Required. + :vartype taxonomy_input: ~azure.ai.projects.models.EvaluationTaxonomyInput + :ivar taxonomy_categories: List of taxonomy categories. + :vartype taxonomy_categories: list[~azure.ai.projects.models.TaxonomyCategory] + :ivar properties: Additional properties for the evaluation taxonomy. + :vartype properties: dict[str, str] + """ + + id: Optional[str] = rest_field(visibility=["read"]) + """Asset ID, a unique identifier for the asset.""" + name: str = rest_field(visibility=["read"]) + """The name of the resource. Required.""" + version: str = rest_field(visibility=["read"]) + """The version of the resource. Required.""" + description: Optional[str] = rest_field(visibility=["create", "update"]) + """The asset description text.""" + tags: Optional[dict[str, str]] = rest_field(visibility=["create", "update"]) + """Tag dictionary. Tags can be added, removed, and updated.""" + taxonomy_input: "_models.EvaluationTaxonomyInput" = rest_field( + name="taxonomyInput", visibility=["read", "create", "update", "delete", "query"] + ) + """Input configuration for the evaluation taxonomy. 
Required.""" + taxonomy_categories: Optional[list["_models.TaxonomyCategory"]] = rest_field( + name="taxonomyCategories", visibility=["read", "create", "update", "delete", "query"] + ) + """List of taxonomy categories.""" + properties: Optional[dict[str, str]] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Additional properties for the evaluation taxonomy.""" + + @overload + def __init__( + self, + *, + taxonomy_input: "_models.EvaluationTaxonomyInput", + description: Optional[str] = None, + tags: Optional[dict[str, str]] = None, + taxonomy_categories: Optional[list["_models.TaxonomyCategory"]] = None, + properties: Optional[dict[str, str]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class EvaluatorGenerationArtifacts(_Model): + """Service-managed provenance artifacts produced by an evaluator generation job. Present only on + EvaluatorVersion resources created via the generation pipeline. The combined-JSONL Foundry + Dataset is read-only and resolves to a versioned dataset in a service-reserved namespace. + + :ivar dataset: Reference to the single Foundry Dataset (one combined JSONL file, + version-aligned to ``EvaluatorVersion.version``) holding all artifacts produced by the + generation pipeline. Each row in the JSONL carries a ``kind`` field discriminating its content + (e.g. ``spec``, ``tools``, ``context``). Required. + :vartype dataset: ~azure.ai.projects.models.DatasetReference + :ivar kinds: The kinds of rows present in ``dataset``. Always contains ``"spec"`` (the + generated evaluation specification, a Markdown document describing what the evaluator + measures). 
May additionally contain ``"tools"`` (when the generation pipeline produced or + inferred OpenAI tool schemas) and/or ``"context"`` (when supplementary materials such as file + uploads or trace samples were used during generation). Required. + :vartype kinds: list[str] + """ + + dataset: "_models.DatasetReference" = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Reference to the single Foundry Dataset (one combined JSONL file, version-aligned to + ``EvaluatorVersion.version``) holding all artifacts produced by the generation pipeline. Each + row in the JSONL carries a ``kind`` field discriminating its content (e.g. ``spec``, ``tools``, + ``context``). Required.""" + kinds: list[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The kinds of rows present in ``dataset``. Always contains ``\"spec\"`` (the generated + evaluation specification, a Markdown document describing what the evaluator measures). May + additionally contain ``\"tools\"`` (when the generation pipeline produced or inferred OpenAI + tool schemas) and/or ``\"context\"`` (when supplementary materials such as file uploads or + trace samples were used during generation). Required.""" + + @overload + def __init__( + self, + *, + dataset: "_models.DatasetReference", + kinds: list[str], + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class EvaluatorGenerationInputs(_Model): + """Caller-supplied inputs for an evaluator generation job. + + :ivar name: Display name for this generation job. Required. + :vartype name: str + :ivar sources: Source materials for generation — agent descriptions, prompts, traces, or + datasets. Each entry is an ``EvaluatorGenerationJobSource`` variant discriminated by ``type``. 
+ Required. + :vartype sources: list[~azure.ai.projects.models.EvaluatorGenerationJobSource] + :ivar category: Category determines the rubric generation focus: 'quality' (default) produces + quality-focused rubric criteria, 'safety' produces policy-derived safety rubric criteria. Both + use the same rubric structure. Singular because quality and safety generation are mutually + exclusive pipelines — the output EvaluatorVersion.categories is an array (e.g., ['agents', + 'quality']). Known values are: "quality", "safety", and "agents". + :vartype category: str or ~azure.ai.projects.models.EvaluatorCategory + :ivar model: The LLM model to use for rubric generation (e.g., 'gpt-4o'). Required — users must + provide their own model rather than relying on service-owned capacity. Required. + :vartype model: str + :ivar evaluator_name: The evaluator name to create or update. If an evaluator with this name + already exists, the service retrieves the latest version's criteria as context for improvement. + Required. + :vartype evaluator_name: str + """ + + name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Display name for this generation job. Required.""" + sources: list["_models.EvaluatorGenerationJobSource"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """Source materials for generation — agent descriptions, prompts, traces, or datasets. Each entry + is an ``EvaluatorGenerationJobSource`` variant discriminated by ``type``. Required.""" + category: Optional[Union[str, "_models.EvaluatorCategory"]] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """Category determines the rubric generation focus: 'quality' (default) produces quality-focused + rubric criteria, 'safety' produces policy-derived safety rubric criteria. Both use the same + rubric structure. 
Singular because quality and safety generation are mutually exclusive + pipelines — the output EvaluatorVersion.categories is an array (e.g., ['agents', 'quality']). + Known values are: \"quality\", \"safety\", and \"agents\".""" + model: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The LLM model to use for rubric generation (e.g., 'gpt-4o'). Required — users must provide + their own model rather than relying on service-owned capacity. Required.""" + evaluator_name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The evaluator name to create or update. If an evaluator with this name already exists, the + service retrieves the latest version's criteria as context for improvement. Required.""" + + @overload + def __init__( + self, + *, + name: str, + sources: list["_models.EvaluatorGenerationJobSource"], + model: str, + evaluator_name: str, + category: Optional[Union[str, "_models.EvaluatorCategory"]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + +class EvaluatorGenerationJob(_Model): + """Evaluator Generation Job resource — a long-running job that generates rubric-based evaluator + definitions from source materials. On success, the result is the persisted EvaluatorVersion. + + :ivar id: Server-assigned unique identifier. Required. + :vartype id: str + :ivar inputs: Caller-supplied inputs. + :vartype inputs: ~azure.ai.projects.models.EvaluatorGenerationInputs + :ivar result: Result produced on success. + :vartype result: ~azure.ai.projects.models.EvaluatorVersion + :ivar status: Current lifecycle status. Required. Known values are: "queued", "in_progress", + "succeeded", "failed", and "cancelled". 
+ :vartype status: str or ~azure.ai.projects.models.JobStatus + :ivar error: Error details — populated only on failure. + :vartype error: ~azure.ai.projects.models.ApiError + :ivar created_at: The timestamp when the job was created, represented in Unix time (seconds + since January 1, 1970). Required. + :vartype created_at: ~datetime.datetime + :ivar finished_at: The timestamp when the job finished, represented in Unix time (seconds since + January 1, 1970). + :vartype finished_at: ~datetime.datetime + :ivar usage: Token consumption summary. Populated when the job reaches a terminal state. + :vartype usage: ~azure.ai.projects.models.EvaluatorGenerationTokenUsage + """ + + id: str = rest_field(visibility=["read"]) + """Server-assigned unique identifier. Required.""" + inputs: Optional["_models.EvaluatorGenerationInputs"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """Caller-supplied inputs.""" + result: Optional["_models.EvaluatorVersion"] = rest_field(visibility=["read"]) + """Result produced on success.""" + status: Union[str, "_models.JobStatus"] = rest_field(visibility=["read"]) + """Current lifecycle status. Required. Known values are: \"queued\", \"in_progress\", + \"succeeded\", \"failed\", and \"cancelled\".""" + error: Optional["_models.ApiError"] = rest_field(visibility=["read"]) + """Error details — populated only on failure.""" + created_at: datetime.datetime = rest_field(visibility=["read"], format="unix-timestamp") + """The timestamp when the job was created, represented in Unix time (seconds since January 1, + 1970). Required.""" + finished_at: Optional[datetime.datetime] = rest_field(visibility=["read"], format="unix-timestamp") + """The timestamp when the job finished, represented in Unix time (seconds since January 1, 1970).""" + usage: Optional["_models.EvaluatorGenerationTokenUsage"] = rest_field(visibility=["read"]) + """Token consumption summary. 
Populated when the job reaches a terminal state.""" + + @overload + def __init__( + self, + *, + inputs: Optional["_models.EvaluatorGenerationInputs"] = None, ) -> None: ... @overload @@ -4806,60 +5791,34 @@ def __init__(self, mapping: Mapping[str, Any]) -> None: def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) - self.type = ScheduleTaskType.EVALUATION # type: ignore -class EvaluationTaxonomy(_Model): - """Evaluation Taxonomy Definition. +class EvaluatorGenerationTokenUsage(_Model): + """Token consumption summary for an evaluator generation job. Populated when the job reaches a + terminal state. - :ivar id: Asset ID, a unique identifier for the asset. - :vartype id: str - :ivar name: The name of the resource. Required. - :vartype name: str - :ivar version: The version of the resource. Required. - :vartype version: str - :ivar description: The asset description text. - :vartype description: str - :ivar tags: Tag dictionary. Tags can be added, removed, and updated. - :vartype tags: dict[str, str] - :ivar taxonomy_input: Input configuration for the evaluation taxonomy. Required. - :vartype taxonomy_input: ~azure.ai.projects.models.EvaluationTaxonomyInput - :ivar taxonomy_categories: List of taxonomy categories. - :vartype taxonomy_categories: list[~azure.ai.projects.models.TaxonomyCategory] - :ivar properties: Additional properties for the evaluation taxonomy. - :vartype properties: dict[str, str] + :ivar input_tokens: Number of input (prompt) tokens consumed. Required. + :vartype input_tokens: int + :ivar output_tokens: Number of output (completion) tokens generated. Required. + :vartype output_tokens: int + :ivar total_tokens: Total tokens consumed (input + output). Required. + :vartype total_tokens: int """ - id: Optional[str] = rest_field(visibility=["read"]) - """Asset ID, a unique identifier for the asset.""" - name: str = rest_field(visibility=["read"]) - """The name of the resource. 
Required.""" - version: str = rest_field(visibility=["read"]) - """The version of the resource. Required.""" - description: Optional[str] = rest_field(visibility=["create", "update"]) - """The asset description text.""" - tags: Optional[dict[str, str]] = rest_field(visibility=["create", "update"]) - """Tag dictionary. Tags can be added, removed, and updated.""" - taxonomy_input: "_models.EvaluationTaxonomyInput" = rest_field( - name="taxonomyInput", visibility=["read", "create", "update", "delete", "query"] - ) - """Input configuration for the evaluation taxonomy. Required.""" - taxonomy_categories: Optional[list["_models.TaxonomyCategory"]] = rest_field( - name="taxonomyCategories", visibility=["read", "create", "update", "delete", "query"] - ) - """List of taxonomy categories.""" - properties: Optional[dict[str, str]] = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """Additional properties for the evaluation taxonomy.""" + input_tokens: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Number of input (prompt) tokens consumed. Required.""" + output_tokens: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Number of output (completion) tokens generated. Required.""" + total_tokens: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Total tokens consumed (input + output). Required.""" @overload def __init__( self, *, - taxonomy_input: "_models.EvaluationTaxonomyInput", - description: Optional[str] = None, - tags: Optional[dict[str, str]] = None, - taxonomy_categories: Optional[list["_models.TaxonomyCategory"]] = None, - properties: Optional[dict[str, str]] = None, + input_tokens: int, + output_tokens: int, + total_tokens: int, ) -> None: ... @overload @@ -4947,6 +5906,10 @@ class EvaluatorVersion(_Model): :vartype categories: list[str or ~azure.ai.projects.models.EvaluatorCategory] :ivar definition: Definition of the evaluator. Required. 
:vartype definition: ~azure.ai.projects.models.EvaluatorDefinition + :ivar generation_artifacts: Provenance artifacts from the generation pipeline. Read-only; + present only on evaluator versions created via an EvaluatorGenerationJob. Each artifact + resolves to a versioned Foundry Dataset. + :vartype generation_artifacts: ~azure.ai.projects.models.EvaluatorGenerationArtifacts :ivar created_by: Creator of the evaluator. Required. :vartype created_by: str :ivar created_at: Creation date/time of the evaluator. Required. @@ -4978,6 +5941,10 @@ class EvaluatorVersion(_Model): """The categories of the evaluator. Required.""" definition: "_models.EvaluatorDefinition" = rest_field(visibility=["read", "create"]) """Definition of the evaluator. Required.""" + generation_artifacts: Optional["_models.EvaluatorGenerationArtifacts"] = rest_field(visibility=["read"]) + """Provenance artifacts from the generation pipeline. Read-only; present only on evaluator + versions created via an EvaluatorGenerationJob. Each artifact resolves to a versioned Foundry + Dataset.""" created_by: str = rest_field(visibility=["read"]) """Creator of the evaluator. Required.""" created_at: datetime.datetime = rest_field(visibility=["read"], format="rfc3339") @@ -5051,6 +6018,90 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) +class FabricIQPreviewTool(Tool, discriminator="fabric_iq_preview"): + """FabricIQPreviewTool. + + :ivar type: The object type, which is always 'fabric_iq_preview'. Required. FABRIC_IQ_PREVIEW. + :vartype type: str or ~azure.ai.projects.models.FABRIC_IQ_PREVIEW + :ivar fabric_iq_preview: The FabricIQ tool parameters. Required. + :vartype fabric_iq_preview: ~azure.ai.projects.models.FabricIQPreviewToolParameters + """ + + type: Literal[ToolType.FABRIC_IQ_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The object type, which is always 'fabric_iq_preview'. Required. 
FABRIC_IQ_PREVIEW.""" + fabric_iq_preview: "_models.FabricIQPreviewToolParameters" = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The FabricIQ tool parameters. Required.""" + + @overload + def __init__( + self, + *, + fabric_iq_preview: "_models.FabricIQPreviewToolParameters", + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = ToolType.FABRIC_IQ_PREVIEW # type: ignore + + +class FabricIQPreviewToolParameters(_Model): + """FabricIQPreviewToolParameters. + + :ivar project_connection_id: The ID of the FabricIQ project connection. Required. + :vartype project_connection_id: str + :ivar server_label: (Optional) The label of the FabricIQ MCP server to connect to. + :vartype server_label: str + :ivar server_url: (Optional) The URL of the FabricIQ MCP server. If not provided, the URL from + the project connection will be used. + :vartype server_url: str + :ivar require_approval: (Optional) Whether the agent requires approval before executing + actions. Default is always. Is either a MCPToolRequireApproval type or a str type. + :vartype require_approval: ~azure.ai.projects.models.MCPToolRequireApproval or str + """ + + project_connection_id: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The ID of the FabricIQ project connection. Required.""" + server_label: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """(Optional) The label of the FabricIQ MCP server to connect to.""" + server_url: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """(Optional) The URL of the FabricIQ MCP server. 
If not provided, the URL from the project + connection will be used.""" + require_approval: Optional[Union["_models.MCPToolRequireApproval", str]] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """(Optional) Whether the agent requires approval before executing actions. Default is always. Is + either a MCPToolRequireApproval type or a str type.""" + + @overload + def __init__( + self, + *, + project_connection_id: str, + server_label: Optional[str] = None, + server_url: Optional[str] = None, + require_approval: Optional[Union["_models.MCPToolRequireApproval", str]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + class FieldMapping(_Model): """Field mapping configuration class. @@ -5104,6 +6155,80 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) +class FileDataGenerationJobOutput(DataGenerationJobOutput, discriminator="file"): + """Azure OpenAI file output for a data generation job. + + :ivar type: Azure OpenAI file output. Required. The generated data is an Azure OpenAI File. + :vartype type: str or ~azure.ai.projects.models.FILE + :ivar id: The id of the output Azure OpenAI file. Required. + :vartype id: str + :ivar filename: The filename of the output Azure OpenAI file. Required. + :vartype filename: str + """ + + type: Literal[DataGenerationJobOutputType.FILE] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """Azure OpenAI file output. Required. The generated data is an Azure OpenAI File.""" + id: str = rest_field(visibility=["read"]) + """The id of the output Azure OpenAI file. 
Required.""" + filename: str = rest_field(visibility=["read"]) + """The filename of the output Azure OpenAI file. Required.""" + + @overload + def __init__( + self, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobOutputType.FILE # type: ignore + + +class FileDataGenerationJobSource(DataGenerationJobSource, discriminator="file"): + """File source for data generation jobs — Azure OpenAI file input. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this job, which is File. Required. File source — Azure OpenAI + file. + :vartype type: str or ~azure.ai.projects.models.FILE + :ivar id: Input Azure Open AI file id used for data generation. Required. + :vartype id: str + """ + + type: Literal[DataGenerationJobSourceType.FILE] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this job, which is File. Required. File source — Azure OpenAI file.""" + id: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Input Azure Open AI file id used for data generation. Required.""" + + @overload + def __init__( + self, + *, + id: str, # pylint: disable=redefined-builtin + description: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobSourceType.FILE # type: ignore + + class FileDatasetVersion(DatasetVersion, discriminator="uri_file"): """FileDatasetVersion Definition. @@ -5508,44 +6633,6 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: self.type = ToolType.FUNCTION # type: ignore -class HeaderIsolationKeySource(IsolationKeySource, discriminator="Header"): - """HeaderIsolationKeySource. - - :ivar kind: Required. HEADER. - :vartype kind: str or ~azure.ai.projects.models.HEADER - :ivar user_isolation_key: The user isolation key header value. Required. - :vartype user_isolation_key: str - :ivar chat_isolation_key: The chat isolation key header value. Required. - :vartype chat_isolation_key: str - """ - - kind: Literal[IsolationKeySourceKind.HEADER] = rest_discriminator(name="kind", visibility=["read", "create", "update", "delete", "query"]) # type: ignore - """Required. HEADER.""" - user_isolation_key: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The user isolation key header value. Required.""" - chat_isolation_key: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The chat isolation key header value. Required.""" - - @overload - def __init__( - self, - *, - user_isolation_key: str, - chat_isolation_key: str, - ) -> None: ... - - @overload - def __init__(self, mapping: Mapping[str, Any]) -> None: - """ - :param mapping: raw JSON to initialize the model. - :type mapping: Mapping[str, Any] - """ - - def __init__(self, *args: Any, **kwargs: Any) -> None: - super().__init__(*args, **kwargs) - self.kind = IsolationKeySourceKind.HEADER # type: ignore - - class TelemetryEndpointAuth(_Model): """Authentication configuration for a telemetry endpoint. 
@@ -6814,6 +7901,10 @@ class MemorySearchPreviewTool(Tool, discriminator="memory_search_preview"): :ivar type: The type of the tool. Always ``memory_search_preview``. Required. MEMORY_SEARCH_PREVIEW. :vartype type: str or ~azure.ai.projects.models.MEMORY_SEARCH_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar memory_store_name: The name of the memory store to use. Required. :vartype memory_store_name: str :ivar scope: The namespace used to group and isolate memories, such as a user ID. Limits which @@ -6829,6 +7920,10 @@ class MemorySearchPreviewTool(Tool, discriminator="memory_search_preview"): type: Literal[ToolType.MEMORY_SEARCH_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The type of the tool. Always ``memory_search_preview``. Required. MEMORY_SEARCH_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" memory_store_name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) """The name of the memory store to use. Required.""" scope: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) @@ -6848,6 +7943,8 @@ def __init__( *, memory_store_name: str, scope: str, + name: Optional[str] = None, + description: Optional[str] = None, search_options: Optional["_models.MemorySearchOptions"] = None, update_delay: Optional[int] = None, ) -> None: ... 
@@ -7296,6 +8393,10 @@ class MicrosoftFabricPreviewTool(Tool, discriminator="fabric_dataagent_preview") :ivar type: The object type, which is always 'fabric_dataagent_preview'. Required. FABRIC_DATAAGENT_PREVIEW. :vartype type: str or ~azure.ai.projects.models.FABRIC_DATAAGENT_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar fabric_dataagent_preview: The fabric data agent tool parameters. Required. :vartype fabric_dataagent_preview: ~azure.ai.projects.models.FabricDataAgentToolParameters """ @@ -7303,6 +8404,10 @@ class MicrosoftFabricPreviewTool(Tool, discriminator="fabric_dataagent_preview") type: Literal[ToolType.FABRIC_DATAAGENT_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The object type, which is always 'fabric_dataagent_preview'. Required. FABRIC_DATAAGENT_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" fabric_dataagent_preview: "_models.FabricDataAgentToolParameters" = rest_field( visibility=["read", "create", "update", "delete", "query"] ) @@ -7313,6 +8418,8 @@ def __init__( self, *, fabric_dataagent_preview: "_models.FabricDataAgentToolParameters", + name: Optional[str] = None, + description: Optional[str] = None, ) -> None: ... @overload @@ -7432,33 +8539,33 @@ class ModelSamplingParams(_Model): """Represents a set of parameters used to control the sampling behavior of a language model during text generation. - :ivar temperature: The temperature parameter for sampling. Required. 
+ :ivar temperature: The temperature parameter for sampling. Defaults to 1.0. :vartype temperature: float - :ivar top_p: The top-p parameter for nucleus sampling. Required. + :ivar top_p: The top-p parameter for nucleus sampling. Defaults to 1.0. :vartype top_p: float - :ivar seed: The random seed for reproducibility. Required. + :ivar seed: The random seed for reproducibility. Defaults to 42. :vartype seed: int - :ivar max_completion_tokens: The maximum number of tokens allowed in the completion. Required. + :ivar max_completion_tokens: The maximum number of tokens allowed in the completion. :vartype max_completion_tokens: int """ - temperature: float = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The temperature parameter for sampling. Required.""" - top_p: float = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The top-p parameter for nucleus sampling. Required.""" - seed: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The random seed for reproducibility. Required.""" - max_completion_tokens: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The maximum number of tokens allowed in the completion. Required.""" + temperature: Optional[float] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The temperature parameter for sampling. Defaults to 1.0.""" + top_p: Optional[float] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The top-p parameter for nucleus sampling. Defaults to 1.0.""" + seed: Optional[int] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The random seed for reproducibility. 
Defaults to 42.""" + max_completion_tokens: Optional[int] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The maximum number of tokens allowed in the completion.""" @overload def __init__( self, *, - temperature: float, - top_p: float, - seed: int, - max_completion_tokens: int, + temperature: Optional[float] = None, + top_p: Optional[float] = None, + seed: Optional[int] = None, + max_completion_tokens: Optional[int] = None, ) -> None: ... @overload @@ -8235,23 +9342,108 @@ class PromptBasedEvaluatorDefinition(EvaluatorDefinition, discriminator="prompt" :vartype metrics: dict[str, ~azure.ai.projects.models.EvaluatorMetric] :ivar type: Required. Prompt-based definition. :vartype type: str or ~azure.ai.projects.models.PROMPT - :ivar prompt_text: The prompt text used for evaluation. Required. - :vartype prompt_text: str + :ivar prompt_text: The prompt text used for evaluation. Required. + :vartype prompt_text: str + """ + + type: Literal[EvaluatorDefinitionType.PROMPT] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """Required. Prompt-based definition.""" + prompt_text: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The prompt text used for evaluation. Required.""" + + @overload + def __init__( + self, + *, + prompt_text: str, + init_parameters: Optional[dict[str, Any]] = None, + data_schema: Optional[dict[str, Any]] = None, + metrics: Optional[dict[str, "_models.EvaluatorMetric"]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = EvaluatorDefinitionType.PROMPT # type: ignore + + +class PromptDataGenerationJobSource(DataGenerationJobSource, discriminator="prompt"): + """Prompt source for data generation jobs — inline text provided by the user. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Prompt. Required. Prompt source — inline + text provided by the user. + :vartype type: str or ~azure.ai.projects.models.PROMPT + :ivar prompt: Inline prompt text (e.g., agent description, policy text, supplementary context). + Required. + :vartype prompt: str + """ + + type: Literal[DataGenerationJobSourceType.PROMPT] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Prompt. Required. Prompt source — inline text + provided by the user.""" + prompt: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Inline prompt text (e.g., agent description, policy text, supplementary context). Required.""" + + @overload + def __init__( + self, + *, + prompt: str, + description: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. 
+ :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobSourceType.PROMPT # type: ignore + + +class PromptEvaluatorGenerationJobSource(EvaluatorGenerationJobSource, discriminator="prompt"): + """Prompt source for evaluator generation jobs — inline text provided by the user. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Prompt. Required. Prompt source — inline + text provided by the user. + :vartype type: str or ~azure.ai.projects.models.PROMPT + :ivar prompt: Inline prompt text (e.g., agent description, policy text, supplementary context). + Required. + :vartype prompt: str """ - type: Literal[EvaluatorDefinitionType.PROMPT] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore - """Required. Prompt-based definition.""" - prompt_text: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) - """The prompt text used for evaluation. Required.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional description of what this source represents — helps the pipeline interpret its content + (e.g., 'Company refund policy document' or 'Describes the agent's core capabilities').""" + type: Literal[EvaluatorGenerationJobSourceType.PROMPT] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Prompt. Required. 
Prompt source — inline text + provided by the user.""" + prompt: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Inline prompt text (e.g., agent description, policy text, supplementary context). Required.""" @overload def __init__( self, *, - prompt_text: str, - init_parameters: Optional[dict[str, Any]] = None, - data_schema: Optional[dict[str, Any]] = None, - metrics: Optional[dict[str, "_models.EvaluatorMetric"]] = None, + prompt: str, + description: Optional[str] = None, ) -> None: ... @overload @@ -8263,7 +9455,7 @@ def __init__(self, mapping: Mapping[str, Any]) -> None: def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) - self.type = EvaluatorDefinitionType.PROMPT # type: ignore + self.type = EvaluatorGenerationJobSourceType.PROMPT # type: ignore class ProtocolVersionRecord(_Model): @@ -8631,6 +9823,122 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) +class RubricBasedEvaluatorDefinition(EvaluatorDefinition, discriminator="rubrics"): + """Rubric-based evaluator definition — stores rubric criteria produced by the generate API. Used + for both quality and safety evaluators. + + :ivar init_parameters: The JSON schema (Draft 2020-12) for the evaluator's input parameters. + This includes parameters like type, properties, required. + :vartype init_parameters: dict[str, any] + :ivar data_schema: The JSON schema (Draft 2020-12) for the evaluator's input data. This + includes parameters like type, properties, required. + :vartype data_schema: dict[str, any] + :ivar metrics: List of output metrics produced by this evaluator. + :vartype metrics: dict[str, ~azure.ai.projects.models.EvaluatorMetric] + :ivar type: Required. Rubric-based evaluator definition. Stores rubric criteria for both + quality and safety evaluators. Can be created via the generate API or manually via + createVersion. 
+ :vartype type: str or ~azure.ai.projects.models.RUBRICS + :ivar rubric_criteria: Rubric criteria — the scoring blueprint used by the LLM judge. Quality + evaluators include a non-editable residual criterion with rubric_id 'general_quality' + (always_applicable: true); safety evaluators include 'general_policy_compliance'. Both use the + same rubric structure. Required. + :vartype rubric_criteria: list[~azure.ai.projects.models.RubricCriterion] + """ + + type: Literal[EvaluatorDefinitionType.RUBRICS] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """Required. Rubric-based evaluator definition. Stores rubric criteria for both quality and safety + evaluators. Can be created via the generate API or manually via createVersion.""" + rubric_criteria: list["_models.RubricCriterion"] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """Rubric criteria — the scoring blueprint used by the LLM judge. Quality evaluators include a + non-editable residual criterion with rubric_id 'general_quality' (always_applicable: true); + safety evaluators include 'general_policy_compliance'. Both use the same rubric structure. + Required.""" + + @overload + def __init__( + self, + *, + rubric_criteria: list["_models.RubricCriterion"], + init_parameters: Optional[dict[str, Any]] = None, + data_schema: Optional[dict[str, Any]] = None, + metrics: Optional[dict[str, "_models.EvaluatorMetric"]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = EvaluatorDefinitionType.RUBRICS # type: ignore + + +class RubricCriterion(_Model): + """A single rubric criterion — one measurable quality dimension in an evaluator's scoring + blueprint. 
+ + :ivar rubric_id: Stable identifier for this rubric criterion (snake_case, e.g., + ``correct_resolution``). Required. Provided by the user when manually creating a rubric + evaluator or during human-in-the-loop review of a generated catalog; the generation pipeline + produces an initial value the user can edit. Editable when saving new versions. Required. + :vartype rubric_id: str + :ivar description: What this criterion measures (e.g., 'Correctly identifies the user's + reservation intent and pursues the appropriate workflow'). Required. + :vartype description: str + :ivar weight: Relative weight of this criterion (1-10). The generation pipeline assigns exactly + one criterion weight 8-10; all others use 1-6. User edits are not constrained by this + heuristic. Required. + :vartype weight: int + :ivar always_applicable: When true, the LLM judge always scores this criterion regardless of + relevance (skips applicability assessment). The service-generated general quality/policy + criterion has this set to true and is non-editable. Users may set this on their own custom + criteria. + :vartype always_applicable: bool + """ + + rubric_id: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Stable identifier for this rubric criterion (snake_case, e.g., ``correct_resolution``). + Required. Provided by the user when manually creating a rubric evaluator or during + human-in-the-loop review of a generated catalog; the generation pipeline produces an initial + value the user can edit. Editable when saving new versions. Required.""" + description: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """What this criterion measures (e.g., 'Correctly identifies the user's reservation intent and + pursues the appropriate workflow'). Required.""" + weight: int = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Relative weight of this criterion (1-10). 
The generation pipeline assigns exactly one criterion + weight 8-10; all others use 1-6. User edits are not constrained by this heuristic. Required.""" + always_applicable: Optional[bool] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """When true, the LLM judge always scores this criterion regardless of relevance (skips + applicability assessment). The service-generated general quality/policy criterion has this set + to true and is non-editable. Users may set this on their own custom criteria.""" + + @overload + def __init__( + self, + *, + rubric_id: str, + description: str, + weight: int, + always_applicable: Optional[bool] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + + class SASCredentials(BaseCredentials, discriminator="SAS"): """Shared Access Signature (SAS) credential definition. @@ -9001,6 +10309,10 @@ class SharepointPreviewTool(Tool, discriminator="sharepoint_grounding_preview"): :ivar type: The object type, which is always 'sharepoint_grounding_preview'. Required. SHAREPOINT_GROUNDING_PREVIEW. :vartype type: str or ~azure.ai.projects.models.SHAREPOINT_GROUNDING_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str :ivar sharepoint_grounding_preview: The sharepoint grounding tool parameters. Required. 
:vartype sharepoint_grounding_preview: ~azure.ai.projects.models.SharepointGroundingToolParameters @@ -9009,6 +10321,10 @@ class SharepointPreviewTool(Tool, discriminator="sharepoint_grounding_preview"): type: Literal[ToolType.SHAREPOINT_GROUNDING_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore """The object type, which is always 'sharepoint_grounding_preview'. Required. SHAREPOINT_GROUNDING_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" sharepoint_grounding_preview: "_models.SharepointGroundingToolParameters" = rest_field( visibility=["read", "create", "update", "delete", "query"] ) @@ -9019,6 +10335,8 @@ def __init__( self, *, sharepoint_grounding_preview: "_models.SharepointGroundingToolParameters", + name: Optional[str] = None, + description: Optional[str] = None, ) -> None: ... @overload @@ -9033,6 +10351,53 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: self.type = ToolType.SHAREPOINT_GROUNDING_PREVIEW # type: ignore +class SimpleQnADataGenerationJobOptions(DataGenerationJobOptions, discriminator="simple_qna"): + """The options for a data generation job with SimpleQnA type. + + :ivar max_samples: Maximum number of samples to generate. Required. + :vartype max_samples: int + :ivar train_split: The proportion of the generated data to be used for training when the data + is used for fine-tuning. The rest will be used for validation. Value should be between 0 and 1. + :vartype train_split: float + :ivar model_options: The LLM model options. 
+ :vartype model_options: ~azure.ai.projects.models.DataGenerationModelOptions + :ivar type: The data generation job type, which is SimpleQnA for this model. Required. Simple + question and answers between user and agent. + :vartype type: str or ~azure.ai.projects.models.SIMPLE_QNA + :ivar question_types: The question types to generate. Used only for fine-tuning scenarios. + :vartype question_types: list[str or ~azure.ai.projects.models.SimpleQnAFineTuningQuestionType] + """ + + type: Literal[DataGenerationJobType.SIMPLE_QNA] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The data generation job type, which is SimpleQnA for this model. Required. Simple question and + answers between user and agent.""" + question_types: Optional[list[Union[str, "_models.SimpleQnAFineTuningQuestionType"]]] = rest_field( + visibility=["read", "create", "update", "delete", "query"] + ) + """The question types to generate. Used only for fine-tuning scenarios.""" + + @overload + def __init__( + self, + *, + max_samples: int, + train_split: Optional[float] = None, + model_options: Optional["_models.DataGenerationModelOptions"] = None, + question_types: Optional[list[Union[str, "_models.SimpleQnAFineTuningQuestionType"]]] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobType.SIMPLE_QNA # type: ignore + + class SkillObject(_Model): """A skill object. @@ -9666,6 +11031,46 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) +class ToolboxSearchPreviewTool(Tool, discriminator="toolbox_search_preview"): + """A tool for searching over the agent's toolbox. 
When present, deferred tools are hidden from + ``tools/list`` and only discoverable via ``search_tools`` queries at runtime. + + :ivar type: The type of the tool. Always ``toolbox_search_preview``. Required. + TOOLBOX_SEARCH_PREVIEW. + :vartype type: str or ~azure.ai.projects.models.TOOLBOX_SEARCH_PREVIEW + :ivar name: Optional user-defined name for this tool or configuration. + :vartype name: str + :ivar description: Optional user-defined description for this tool or configuration. + :vartype description: str + """ + + type: Literal[ToolType.TOOLBOX_SEARCH_PREVIEW] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The type of the tool. Always ``toolbox_search_preview``. Required. TOOLBOX_SEARCH_PREVIEW.""" + name: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined name for this tool or configuration.""" + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional user-defined description for this tool or configuration.""" + + @overload + def __init__( + self, + *, + name: Optional[str] = None, + description: Optional[str] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = ToolType.TOOLBOX_SEARCH_PREVIEW # type: ignore + + class ToolboxVersionObject(_Model): """A specific version of a toolbox. @@ -10140,6 +11545,237 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) +class ToolUseFineTuningDataGenerationJobOptions( + DataGenerationJobOptions, discriminator="tool_use" +): # pylint: disable=name-too-long + """The options for a data generation job with ToolUse type. 
Used only for fine-tuning scenarios. + + :ivar max_samples: Maximum number of samples to generate. Required. + :vartype max_samples: int + :ivar train_split: The proportion of the generated data to be used for training when the data + is used for fine-tuning. The rest will be used for validation. Value should be between 0 and 1. + :vartype train_split: float + :ivar model_options: The LLM model options. + :vartype model_options: ~azure.ai.projects.models.DataGenerationModelOptions + :ivar type: The data generation job type, which is ToolUse for this model. Required. Tool + calling conversation between user and agent. + :vartype type: str or ~azure.ai.projects.models.TOOL_USE + """ + + type: Literal[DataGenerationJobType.TOOL_USE] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The data generation job type, which is ToolUse for this model. Required. Tool calling + conversation between user and agent.""" + + @overload + def __init__( + self, + *, + max_samples: int, + train_split: Optional[float] = None, + model_options: Optional["_models.DataGenerationModelOptions"] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobType.TOOL_USE # type: ignore + + +class TracesDataGenerationJobOptions(DataGenerationJobOptions, discriminator="traces"): + """The options for a data generation job with Traces type. + + :ivar max_samples: Maximum number of samples to generate. Required. + :vartype max_samples: int + :ivar train_split: The proportion of the generated data to be used for training when the data + is used for fine-tuning. The rest will be used for validation. Value should be between 0 and 1. 
+ :vartype train_split: float + :ivar model_options: The LLM model options. + :vartype model_options: ~azure.ai.projects.models.DataGenerationModelOptions + :ivar type: The data generation job type, which is Traces for this model. Required. Single turn + query and response from agent traces. + :vartype type: str or ~azure.ai.projects.models.TRACES + """ + + type: Literal[DataGenerationJobType.TRACES] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The data generation job type, which is Traces for this model. Required. Single turn query and + response from agent traces.""" + + @overload + def __init__( + self, + *, + max_samples: int, + train_split: Optional[float] = None, + model_options: Optional["_models.DataGenerationModelOptions"] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobType.TRACES # type: ignore + + +class TracesDataGenerationJobSource(DataGenerationJobSource, discriminator="traces"): + """Traces source for data generation jobs — conversation traces from Application Insights. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Traces. Required. Traces source — + conversation traces from Application Insights. + :vartype type: str or ~azure.ai.projects.models.TRACES + :ivar agent_id: The unique agent ID used to filter traces. Optional — when omitted, traces are + filtered by ``agent_name`` (and ``agent_version`` if specified). 
+ :vartype agent_id: str + :ivar agent_name: The agent name to fetch traces for. Required. + :vartype agent_name: str + :ivar agent_version: The agent version. If not specified, traces for ALL versions of the agent + are included within the time window. + :vartype agent_version: str + :ivar start_time: Start of the time window (Unix timestamp in seconds) for fetching traces. + :vartype start_time: ~datetime.datetime + :ivar end_time: End of the time window (Unix timestamp in seconds). Defaults to current time. + :vartype end_time: ~datetime.datetime + :ivar max_traces: Maximum number of traces to retrieve. + :vartype max_traces: int + """ + + type: Literal[DataGenerationJobSourceType.TRACES] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Traces. Required. Traces source — conversation traces + from Application Insights.""" + agent_id: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The unique agent ID used to filter traces. Optional — when omitted, traces are filtered by + ``agent_name`` (and ``agent_version`` if specified).""" + agent_name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent name to fetch traces for. Required.""" + agent_version: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent version. If not specified, traces for ALL versions of the agent are included within + the time window.""" + start_time: Optional[datetime.datetime] = rest_field( + visibility=["read", "create", "update", "delete", "query"], format="unix-timestamp" + ) + """Start of the time window (Unix timestamp in seconds) for fetching traces.""" + end_time: Optional[datetime.datetime] = rest_field( + visibility=["read", "create", "update", "delete", "query"], format="unix-timestamp" + ) + """End of the time window (Unix timestamp in seconds). 
Defaults to current time.""" + max_traces: Optional[int] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Maximum number of traces to retrieve.""" + + @overload + def __init__( + self, + *, + agent_name: str, + description: Optional[str] = None, + agent_id: Optional[str] = None, + agent_version: Optional[str] = None, + start_time: Optional[datetime.datetime] = None, + end_time: Optional[datetime.datetime] = None, + max_traces: Optional[int] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = DataGenerationJobSourceType.TRACES # type: ignore + + +class TracesEvaluatorGenerationJobSource(EvaluatorGenerationJobSource, discriminator="traces"): + """Traces source for evaluator generation jobs — conversation traces from Application Insights. + + :ivar description: Optional description of what this source represents — helps the pipeline + interpret its content (e.g., 'Company refund policy document' or 'Describes the agent's core + capabilities'). + :vartype description: str + :ivar type: The source type for this source, which is Traces. Required. Traces source — + conversation traces from Application Insights. + :vartype type: str or ~azure.ai.projects.models.TRACES + :ivar agent_id: The unique agent ID used to filter traces. Optional — when omitted, traces are + filtered by ``agent_name`` (and ``agent_version`` if specified). + :vartype agent_id: str + :ivar agent_name: The agent name to fetch traces for. Required. + :vartype agent_name: str + :ivar agent_version: The agent version. If not specified, traces for ALL versions of the agent + are included within the time window. 
+ :vartype agent_version: str + :ivar start_time: Start of the time window (Unix timestamp in seconds) for fetching traces. + :vartype start_time: ~datetime.datetime + :ivar end_time: End of the time window (Unix timestamp in seconds). Defaults to current time. + :vartype end_time: ~datetime.datetime + :ivar max_traces: Maximum number of traces to retrieve. + :vartype max_traces: int + """ + + description: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Optional description of what this source represents — helps the pipeline interpret its content + (e.g., 'Company refund policy document' or 'Describes the agent's core capabilities').""" + type: Literal[EvaluatorGenerationJobSourceType.TRACES] = rest_discriminator(name="type", visibility=["read", "create", "update", "delete", "query"]) # type: ignore + """The source type for this source, which is Traces. Required. Traces source — conversation traces + from Application Insights.""" + agent_id: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The unique agent ID used to filter traces. Optional — when omitted, traces are filtered by + ``agent_name`` (and ``agent_version`` if specified).""" + agent_name: str = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent name to fetch traces for. Required.""" + agent_version: Optional[str] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """The agent version. 
If not specified, traces for ALL versions of the agent are included within + the time window.""" + start_time: Optional[datetime.datetime] = rest_field( + visibility=["read", "create", "update", "delete", "query"], format="unix-timestamp" + ) + """Start of the time window (Unix timestamp in seconds) for fetching traces.""" + end_time: Optional[datetime.datetime] = rest_field( + visibility=["read", "create", "update", "delete", "query"], format="unix-timestamp" + ) + """End of the time window (Unix timestamp in seconds). Defaults to current time.""" + max_traces: Optional[int] = rest_field(visibility=["read", "create", "update", "delete", "query"]) + """Maximum number of traces to retrieve.""" + + @overload + def __init__( + self, + *, + agent_name: str, + description: Optional[str] = None, + agent_id: Optional[str] = None, + agent_version: Optional[str] = None, + start_time: Optional[datetime.datetime] = None, + end_time: Optional[datetime.datetime] = None, + max_traces: Optional[int] = None, + ) -> None: ... + + @overload + def __init__(self, mapping: Mapping[str, Any]) -> None: + """ + :param mapping: raw JSON to initialize the model. + :type mapping: Mapping[str, Any] + """ + + def __init__(self, *args: Any, **kwargs: Any) -> None: + super().__init__(*args, **kwargs) + self.type = EvaluatorGenerationJobSourceType.TRACES # type: ignore + + class UpdateToolboxRequest(_Model): """UpdateToolboxRequest. 
diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/models/_patch.py b/sdk/ai/azure-ai-projects/azure/ai/projects/models/_patch.py index a8b53026f7f3..03b2e81ab05d 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/models/_patch.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/models/_patch.py @@ -48,10 +48,12 @@ "schedules": _FoundryFeaturesOptInKeys.SCHEDULES_V1_PREVIEW.value, "toolboxes": _FoundryFeaturesOptInKeys.TOOLBOXES_V1_PREVIEW.value, "skills": _FoundryFeaturesOptInKeys.SKILLS_V1_PREVIEW.value, + "datasets": _FoundryFeaturesOptInKeys.DATA_GENERATION_JOBS_V1_PREVIEW.value, "agents": ",".join( [ _AgentDefinitionOptInKeys.HOSTED_AGENTS_V1_PREVIEW.value, _AgentDefinitionOptInKeys.AGENT_ENDPOINT_V1_PREVIEW.value, + _AgentDefinitionOptInKeys.CODE_AGENTS_V1_PREVIEW.value, ] ), } diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py index 52f7a21b1821..a0c8c986ba56 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py @@ -35,8 +35,9 @@ from .. 
import models as _models from .._configuration import AIProjectClientConfiguration -from .._utils.model_base import SdkJSONEncoder, _deserialize, _failsafe_deserialize +from .._utils.model_base import Model as _Model, SdkJSONEncoder, _deserialize, _failsafe_deserialize from .._utils.serialization import Deserializer, Serializer +from .._utils.utils import prepare_multipart_form_data JSON = MutableMapping[str, Any] _Unset: Any = object() @@ -797,6 +798,33 @@ def build_indexes_create_or_update_request(name: str, version: str, **kwargs: An return HttpRequest(method="PATCH", url=_url, params=_params, headers=_headers, **kwargs) +def build_beta_agents_update_agent_from_code_request( # pylint: disable=name-too-long + agent_name: str, *, code_zip_sha256: str, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/agents/{agent_name}" + path_format_arguments = { + "agent_name": _SERIALIZER.url("agent_name", agent_name, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["x-ms-code-zip-sha256"] = _SERIALIZER.header("code_zip_sha256", code_zip_sha256, "str") + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs) + + def build_beta_agents_patch_agent_details_request( # pylint: disable=name-too-long agent_name: str, **kwargs: Any ) -> HttpRequest: @@ -826,7 +854,87 @@ def build_beta_agents_patch_agent_details_request( # pylint: disable=name-too-l return HttpRequest(method="PATCH", url=_url, params=_params, headers=_headers, **kwargs) 
-def build_beta_agents_create_session_request(agent_name: str, *, isolation_key: str, **kwargs: Any) -> HttpRequest: +def build_beta_agents_create_agent_version_from_code_request( # pylint: disable=name-too-long + agent_name: str, *, code_zip_sha256: str, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/agents/{agent_name}/versions" + path_format_arguments = { + "agent_name": _SERIALIZER.url("agent_name", agent_name, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["x-ms-code-zip-sha256"] = _SERIALIZER.header("code_zip_sha256", code_zip_sha256, "str") + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_agents_download_agent_version_code_request( # pylint: disable=name-too-long + agent_name: str, agent_version: str, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/zip") + + # Construct URL + _url = "/agents/{agent_name}/versions/{agent_version}/code:download" + path_format_arguments = { + "agent_name": _SERIALIZER.url("agent_name", agent_name, "str"), + "agent_version": _SERIALIZER.url("agent_version", agent_version, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = 
_SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_agents_download_agent_code_request( # pylint: disable=name-too-long + agent_name: str, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/zip") + + # Construct URL + _url = "/agents/{agent_name}/code:download" + path_format_arguments = { + "agent_name": _SERIALIZER.url("agent_name", agent_name, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_agents_create_session_request(agent_name: str, **kwargs: Any) -> HttpRequest: _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) @@ -846,7 +954,6 @@ def build_beta_agents_create_session_request(agent_name: str, *, isolation_key: _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") # Construct headers - _headers["x-session-isolation-key"] = _SERIALIZER.header("isolation_key", isolation_key, "str") if content_type is not None: _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str") _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") @@ -879,10 +986,7 @@ def build_beta_agents_get_session_request(agent_name: str, session_id: str, **kw return 
HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs) -def build_beta_agents_delete_session_request( - agent_name: str, session_id: str, *, isolation_key: str, **kwargs: Any -) -> HttpRequest: - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) +def build_beta_agents_delete_session_request(agent_name: str, session_id: str, **kwargs: Any) -> HttpRequest: _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) @@ -898,10 +1002,7 @@ def build_beta_agents_delete_session_request( # Construct parameters _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") - # Construct headers - _headers["x-session-isolation-key"] = _SERIALIZER.header("isolation_key", isolation_key, "str") - - return HttpRequest(method="DELETE", url=_url, params=_params, headers=_headers, **kwargs) + return HttpRequest(method="DELETE", url=_url, params=_params, **kwargs) def build_beta_agents_list_sessions_request( @@ -1380,6 +1481,141 @@ def build_beta_evaluators_update_version_request( # pylint: disable=name-too-lo return HttpRequest(method="PATCH", url=_url, params=_params, headers=_headers, **kwargs) +def build_beta_evaluators_create_generation_job_request( # pylint: disable=name-too-long + *, operation_id: Optional[str] = None, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/evaluator_generation_jobs" + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + if operation_id is not None: + 
_headers["Operation-Id"] = _SERIALIZER.header("operation_id", operation_id, "str") + if content_type is not None: + _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str") + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_evaluators_get_generation_job_request( # pylint: disable=name-too-long + job_id: str, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/evaluator_generation_jobs/{jobId}" + path_format_arguments = { + "jobId": _SERIALIZER.url("job_id", job_id, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_evaluators_list_generation_jobs_request( # pylint: disable=name-too-long + *, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + after: Optional[str] = None, + before: Optional[str] = None, + category: Optional[Union[str, _models.EvaluatorCategory]] = None, + **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/evaluator_generation_jobs" + + # Construct parameters + if limit is 
not None: + _params["limit"] = _SERIALIZER.query("limit", limit, "int") + if order is not None: + _params["order"] = _SERIALIZER.query("order", order, "str") + if after is not None: + _params["after"] = _SERIALIZER.query("after", after, "str") + if before is not None: + _params["before"] = _SERIALIZER.query("before", before, "str") + if category is not None: + _params["category"] = _SERIALIZER.query("category", category, "str") + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_evaluators_cancel_generation_job_request( # pylint: disable=name-too-long + job_id: str, **kwargs: Any +) -> HttpRequest: + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + accept = _headers.pop("Accept", "application/json") + + # Construct URL + _url = "/evaluator_generation_jobs/{jobId}:cancel" + path_format_arguments = { + "jobId": _SERIALIZER.url("job_id", job_id, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + # Construct headers + _headers["Accept"] = _SERIALIZER.header("accept", accept, "str") + + return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs) + + +def build_beta_evaluators_delete_generation_job_request( # pylint: disable=name-too-long + job_id: str, **kwargs: Any +) -> HttpRequest: + _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) + + api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1")) + # Construct URL + _url = "/evaluator_generation_jobs/{jobId}" + 
path_format_arguments = { + "jobId": _SERIALIZER.url("job_id", job_id, "str"), + } + + _url: str = _url.format(**path_format_arguments) # type: ignore + + # Construct parameters + _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str") + + return HttpRequest(method="DELETE", url=_url, params=_params, **kwargs) + + def build_beta_insights_generate_request(**kwargs: Any) -> HttpRequest: _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = case_insensitive_dict(kwargs.pop("params", {}) or {}) @@ -2305,40 +2541,179 @@ def build_beta_skills_delete_request(name: str, **kwargs: Any) -> HttpRequest: return HttpRequest(method="DELETE", url=_url, params=_params, headers=_headers, **kwargs) -class BetaOperations: # pylint: disable=too-many-instance-attributes - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.AIProjectClient`'s - :attr:`beta` attribute. 
-    """
+def build_beta_datasets_get_generation_job_request(  # pylint: disable=name-too-long
+    job_id: str, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
-    def __init__(self, *args, **kwargs) -> None:
-        input_args = list(args)
-        self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client")
-        self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config")
-        self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
-        self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1"))
+    accept = _headers.pop("Accept", "application/json")
-        self.agents = BetaAgentsOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.evaluation_taxonomies = BetaEvaluationTaxonomiesOperations(
-            self._client, self._config, self._serialize, self._deserialize
-        )
-        self.evaluators = BetaEvaluatorsOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.insights = BetaInsightsOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.memory_stores = BetaMemoryStoresOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.red_teams = BetaRedTeamsOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.schedules = BetaSchedulesOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.toolboxes = BetaToolboxesOperations(self._client, self._config, self._serialize, self._deserialize)
-        self.skills = BetaSkillsOperations(self._client, self._config, self._serialize, self._deserialize)
+    # Construct URL
+    _url = "/data_generation_jobs/{jobId}"
+    path_format_arguments = {
+        "jobId": _SERIALIZER.url("job_id", job_id, "str"),
+    }
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
-class AgentsOperations:
-    """
-    .. warning::
-        **DO NOT** instantiate this class directly.
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_beta_datasets_list_generation_jobs_request(  # pylint: disable=name-too-long
+    *,
+    limit: Optional[int] = None,
+    order: Optional[Union[str, _models.PageOrder]] = None,
+    after: Optional[str] = None,
+    before: Optional[str] = None,
+    scenario: Optional[Union[str, _models.DataGenerationJobScenario]] = None,
+    type: Optional[List[Union[str, _models.DataGenerationJobType]]] = None,
+    **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/data_generation_jobs"
+
+    # Construct parameters
+    if limit is not None:
+        _params["limit"] = _SERIALIZER.query("limit", limit, "int")
+    if order is not None:
+        _params["order"] = _SERIALIZER.query("order", order, "str")
+    if after is not None:
+        _params["after"] = _SERIALIZER.query("after", after, "str")
+    if before is not None:
+        _params["before"] = _SERIALIZER.query("before", before, "str")
+    if scenario is not None:
+        _params["scenario"] = _SERIALIZER.query("scenario", scenario, "str")
+    if type is not None:
+        _params["type"] = _SERIALIZER.query("type", type, "[str]", div=",")
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="GET", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_beta_datasets_create_generation_job_request(  # pylint: disable=name-too-long
+    *, operation_id: Optional[str] = None, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/data_generation_jobs"
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    if operation_id is not None:
+        _headers["Operation-Id"] = _SERIALIZER.header("operation_id", operation_id, "str")
+    if content_type is not None:
+        _headers["Content-Type"] = _SERIALIZER.header("content_type", content_type, "str")
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_beta_datasets_cancel_generation_job_request(  # pylint: disable=name-too-long
+    job_id: str, **kwargs: Any
+) -> HttpRequest:
+    _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1"))
+    accept = _headers.pop("Accept", "application/json")
+
+    # Construct URL
+    _url = "/data_generation_jobs/{jobId}:cancel"
+    path_format_arguments = {
+        "jobId": _SERIALIZER.url("job_id", job_id, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    # Construct headers
+    _headers["Accept"] = _SERIALIZER.header("accept", accept, "str")
+
+    return HttpRequest(method="POST", url=_url, params=_params, headers=_headers, **kwargs)
+
+
+def build_beta_datasets_delete_generation_job_request(  # pylint: disable=name-too-long
+    job_id: str, **kwargs: Any
+) -> HttpRequest:
+    _params = case_insensitive_dict(kwargs.pop("params", {}) or {})
+
+    api_version: str = kwargs.pop("api_version", _params.pop("api-version", "v1"))
+    # Construct URL
+    _url = "/data_generation_jobs/{jobId}"
+    path_format_arguments = {
+        "jobId": _SERIALIZER.url("job_id", job_id, "str"),
+    }
+
+    _url: str = _url.format(**path_format_arguments)  # type: ignore
+
+    # Construct parameters
+    _params["api-version"] = _SERIALIZER.query("api_version", api_version, "str")
+
+    return HttpRequest(method="DELETE", url=_url, params=_params, **kwargs)
+
+
+class BetaOperations:  # pylint: disable=too-many-instance-attributes
+    """
+    .. warning::
+        **DO NOT** instantiate this class directly.
+
+    Instead, you should access the following operations through
+    :class:`~azure.ai.projects.AIProjectClient`'s
+    :attr:`beta` attribute.
+    """
+
+    def __init__(self, *args, **kwargs) -> None:
+        input_args = list(args)
+        self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client")
+        self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config")
+        self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
+        self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
+
+        self.agents = BetaAgentsOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.evaluation_taxonomies = BetaEvaluationTaxonomiesOperations(
+            self._client, self._config, self._serialize, self._deserialize
+        )
+        self.evaluators = BetaEvaluatorsOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.insights = BetaInsightsOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.memory_stores = BetaMemoryStoresOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.red_teams = BetaRedTeamsOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.schedules = BetaSchedulesOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.toolboxes = BetaToolboxesOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.skills = BetaSkillsOperations(self._client, self._config, self._serialize, self._deserialize)
+        self.datasets = BetaDatasetsOperations(self._client, self._config, self._serialize, self._deserialize)
+
+
+class AgentsOperations:
+    """
+    .. warning::
+        **DO NOT** instantiate this class directly.
     Instead, you should access the following operations through
     :class:`~azure.ai.projects.AIProjectClient`'s
@@ -5184,13 +5559,157 @@ def __init__(self, *args, **kwargs) -> None:
         self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
         self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
 
+    @overload
+    def update_agent_from_code(
+        self, agent_name: str, body: _models.CreateAgentVersionFromCodeContent, *, code_zip_sha256: str, **kwargs: Any
+    ) -> _models.AgentDetails:
+        """Updates a code-based agent by uploading new code and creating a new version. If the code and
+        definition are unchanged (matched by x-ms-code-zip-sha256 header), returns the existing
+        version. The request body is multipart/form-data with a JSON metadata part and a binary code
+        part (part order is irrelevant). Maximum upload size is 250 MB.
+
+        :param agent_name: The unique name that identifies the agent. Name can be used to
+         retrieve/update/delete the agent.
+
+         * Must start and end with alphanumeric characters,
+         * Can contain hyphens in the middle
+         * Must not exceed 63 characters. Required.
+        :type agent_name: str
+        :param body: Required.
+        :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent
+        :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change
+         detection (dedup) and integrity verification. Required.
+        :paramtype code_zip_sha256: str
+        :return: AgentDetails. The AgentDetails is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.AgentDetails
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @overload
+    def update_agent_from_code(
+        self, agent_name: str, body: JSON, *, code_zip_sha256: str, **kwargs: Any
+    ) -> _models.AgentDetails:
+        """Updates a code-based agent by uploading new code and creating a new version. If the code and
+        definition are unchanged (matched by x-ms-code-zip-sha256 header), returns the existing
+        version. The request body is multipart/form-data with a JSON metadata part and a binary code
+        part (part order is irrelevant). Maximum upload size is 250 MB.
+
+        :param agent_name: The unique name that identifies the agent. Name can be used to
+         retrieve/update/delete the agent.
+
+         * Must start and end with alphanumeric characters,
+         * Can contain hyphens in the middle
+         * Must not exceed 63 characters. Required.
+        :type agent_name: str
+        :param body: Required.
+        :type body: JSON
+        :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change
+         detection (dedup) and integrity verification. Required.
+        :paramtype code_zip_sha256: str
+        :return: AgentDetails. The AgentDetails is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.AgentDetails
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @distributed_trace
+    def update_agent_from_code(
+        self,
+        agent_name: str,
+        body: Union[_models.CreateAgentVersionFromCodeContent, JSON],
+        *,
+        code_zip_sha256: str,
+        **kwargs: Any
+    ) -> _models.AgentDetails:
+        """Updates a code-based agent by uploading new code and creating a new version. If the code and
+        definition are unchanged (matched by x-ms-code-zip-sha256 header), returns the existing
+        version. The request body is multipart/form-data with a JSON metadata part and a binary code
+        part (part order is irrelevant). Maximum upload size is 250 MB.
+
+        :param agent_name: The unique name that identifies the agent. Name can be used to
+         retrieve/update/delete the agent.
+
+         * Must start and end with alphanumeric characters,
+         * Can contain hyphens in the middle
+         * Must not exceed 63 characters. Required.
+        :type agent_name: str
+        :param body: Is either a CreateAgentVersionFromCodeContent type or a JSON type. Required.
+        :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent or JSON
+        :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change
+         detection (dedup) and integrity verification. Required.
+        :paramtype code_zip_sha256: str
+        :return: AgentDetails. The AgentDetails is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.AgentDetails
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[_models.AgentDetails] = kwargs.pop("cls", None)
+
+        _body = body.as_dict() if isinstance(body, _Model) else body
+        _file_fields: list[str] = ["code"]
+        _data_fields: list[str] = ["metadata"]
+        _files = prepare_multipart_form_data(_body, _file_fields, _data_fields)
+
+        _request = build_beta_agents_update_agent_from_code_request(
+            agent_name=agent_name,
+            code_zip_sha256=code_zip_sha256,
+            api_version=self._config.api_version,
+            files=_files,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
+            )
+            raise HttpResponseError(response=response, model=error)
+
+        if _stream:
+            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+        else:
+            deserialized = _deserialize(_models.AgentDetails, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
     @overload
     def patch_agent_details(
         self,
         agent_name: str,
         *,
         content_type: str = "application/merge-patch+json",
-        agent_endpoint: Optional[_models.AgentEndpoint] = None,
+        agent_endpoint: Optional[_models.AgentEndpointConfig] = None,
         agent_card: Optional[_models.AgentCard] = None,
         **kwargs: Any
     ) -> _models.AgentDetails:
@@ -5202,7 +5721,7 @@ def patch_agent_details(
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
         :keyword agent_endpoint: The endpoint configuration for the agent. Default value is None.
-        :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpoint
+        :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpointConfig
         :keyword agent_card: Optional agent card for the agent. Default value is None.
         :paramtype agent_card: ~azure.ai.projects.models.AgentCard
         :return: AgentDetails. The AgentDetails is compatible with MutableMapping
@@ -5252,7 +5771,7 @@ def patch_agent_details(
         agent_name: str,
         body: Union[JSON, IO[bytes]] = _Unset,
         *,
-        agent_endpoint: Optional[_models.AgentEndpoint] = None,
+        agent_endpoint: Optional[_models.AgentEndpointConfig] = None,
         agent_card: Optional[_models.AgentCard] = None,
         **kwargs: Any
     ) -> _models.AgentDetails:
@@ -5263,7 +5782,7 @@ def patch_agent_details(
         :param body: Is either a JSON type or a IO[bytes] type. Required.
         :type body: JSON or IO[bytes]
         :keyword agent_endpoint: The endpoint configuration for the agent. Default value is None.
-        :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpoint
+        :paramtype agent_endpoint: ~azure.ai.projects.models.AgentEndpointConfig
         :keyword agent_card: Optional agent card for the agent. Default value is None.
         :paramtype agent_card: ~azure.ai.projects.models.AgentCard
         :return: AgentDetails. The AgentDetails is compatible with MutableMapping
@@ -5339,27 +5858,296 @@ def patch_agent_details(
         return deserialized  # type: ignore
 
     @overload
-    def create_session(
-        self,
-        agent_name: str,
-        *,
-        isolation_key: str,
-        version_indicator: _models.VersionIndicator,
-        content_type: str = "application/json",
-        agent_session_id: Optional[str] = None,
-        **kwargs: Any
-    ) -> _models.AgentSessionResource:
-        """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version
-        from ``version_indicator`` and enforces session ownership using the provided isolation key for
-        session-mutating operations.
+    def create_agent_version_from_code(
+        self, agent_name: str, body: _models.CreateAgentVersionFromCodeContent, *, code_zip_sha256: str, **kwargs: Any
+    ) -> _models.AgentVersionDetails:
+        """create_agent_version_from_code.
 
-        :param agent_name: The name of the agent to create a session for. Required.
-        :type agent_name: str
-        :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership
-         for session-mutating operations. Required.
-        :paramtype isolation_key: str
-        :keyword version_indicator: Determines which agent version backs the session. Required.
-        :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator
+        :param agent_name: The unique name that identifies the agent. Name can be used to
+         retrieve/update/delete the agent.
+
+         * Must start and end with alphanumeric characters,
+         * Can contain hyphens in the middle
+         * Must not exceed 63 characters. Required.
+        :type agent_name: str
+        :param body: Required.
+        :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent
+        :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change
+         detection (dedup) and integrity verification. Required.
+        :paramtype code_zip_sha256: str
+        :return: AgentVersionDetails. The AgentVersionDetails is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.AgentVersionDetails
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @overload
+    def create_agent_version_from_code(
+        self, agent_name: str, body: JSON, *, code_zip_sha256: str, **kwargs: Any
+    ) -> _models.AgentVersionDetails:
+        """create_agent_version_from_code.
+
+        :param agent_name: The unique name that identifies the agent. Name can be used to
+         retrieve/update/delete the agent.
+
+         * Must start and end with alphanumeric characters,
+         * Can contain hyphens in the middle
+         * Must not exceed 63 characters. Required.
+        :type agent_name: str
+        :param body: Required.
+        :type body: JSON
+        :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change
+         detection (dedup) and integrity verification. Required.
+        :paramtype code_zip_sha256: str
+        :return: AgentVersionDetails. The AgentVersionDetails is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.AgentVersionDetails
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+
+    @distributed_trace
+    def create_agent_version_from_code(
+        self,
+        agent_name: str,
+        body: Union[_models.CreateAgentVersionFromCodeContent, JSON],
+        *,
+        code_zip_sha256: str,
+        **kwargs: Any
+    ) -> _models.AgentVersionDetails:
+        """create_agent_version_from_code.
+
+        :param agent_name: The unique name that identifies the agent. Name can be used to
+         retrieve/update/delete the agent.
+
+         * Must start and end with alphanumeric characters,
+         * Can contain hyphens in the middle
+         * Must not exceed 63 characters. Required.
+        :type agent_name: str
+        :param body: Is either a CreateAgentVersionFromCodeContent type or a JSON type. Required.
+        :type body: ~azure.ai.projects.models.CreateAgentVersionFromCodeContent or JSON
+        :keyword code_zip_sha256: SHA-256 hex digest of the uploaded code zip. Used for change
+         detection (dedup) and integrity verification. Required.
+        :paramtype code_zip_sha256: str
+        :return: AgentVersionDetails. The AgentVersionDetails is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.AgentVersionDetails
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[_models.AgentVersionDetails] = kwargs.pop("cls", None)
+
+        _body = body.as_dict() if isinstance(body, _Model) else body
+        _file_fields: list[str] = ["code"]
+        _data_fields: list[str] = ["metadata"]
+        _files = prepare_multipart_form_data(_body, _file_fields, _data_fields)
+
+        _request = build_beta_agents_create_agent_version_from_code_request(
+            agent_name=agent_name,
+            code_zip_sha256=code_zip_sha256,
+            api_version=self._config.api_version,
+            files=_files,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", False)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
+            )
+            raise HttpResponseError(response=response, model=error)
+
+        if _stream:
+            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+        else:
+            deserialized = _deserialize(_models.AgentVersionDetails, response.json())
+
+        if cls:
+            return cls(pipeline_response, deserialized, {})  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @distributed_trace
+    def download_agent_version_code(self, agent_name: str, agent_version: str, **kwargs: Any) -> Iterator[bytes]:
+        """Download the code zip for a specific version of a code-based hosted agent. Returns the
+        previously-uploaded zip (``application/zip``). The SHA-256 digest of the returned bytes matches
+        the ``content_hash`` on the agent version's ``code_configuration``.
+
+        :param agent_name: The name of the agent. Required.
+        :type agent_name: str
+        :param agent_version: The version of the agent whose code zip should be downloaded. Required.
+        :type agent_version: str
+        :return: Iterator[bytes]
+        :rtype: Iterator[bytes]
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[Iterator[bytes]] = kwargs.pop("cls", None)
+
+        _request = build_beta_agents_download_agent_version_code_request(
+            agent_name=agent_name,
+            agent_version=agent_version,
+            api_version=self._config.api_version,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", True)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
+            )
+            raise HttpResponseError(response=response, model=error)
+
+        response_headers = {}
+        response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type"))
+
+        deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+
+        if cls:
+            return cls(pipeline_response, deserialized, response_headers)  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @distributed_trace
+    def download_agent_code(self, agent_name: str, **kwargs: Any) -> Iterator[bytes]:
+        """Download the code zip for the latest version of a code-based hosted agent. Returns the
+        previously-uploaded zip (``application/zip``). The SHA-256 digest of the returned bytes matches
+        the ``content_hash`` on the latest version's ``code_configuration``.
+
+        :param agent_name: The name of the agent whose latest-version code zip should be downloaded.
+         Required.
+        :type agent_name: str
+        :return: Iterator[bytes]
+        :rtype: Iterator[bytes]
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}
+
+        cls: ClsType[Iterator[bytes]] = kwargs.pop("cls", None)
+
+        _request = build_beta_agents_download_agent_code_request(
+            agent_name=agent_name,
+            api_version=self._config.api_version,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)
+
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", True)
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )
+
+        response = pipeline_response.http_response
+
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
+            )
+            raise HttpResponseError(response=response, model=error)
+
+        response_headers = {}
+        response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type"))
+
+        deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+
+        if cls:
+            return cls(pipeline_response, deserialized, response_headers)  # type: ignore
+
+        return deserialized  # type: ignore
+
+    @overload
+    def create_session(
+        self,
+        agent_name: str,
+        *,
+        version_indicator: _models.VersionIndicator,
+        content_type: str = "application/json",
+        agent_session_id: Optional[str] = None,
+        **kwargs: Any
+    ) -> _models.AgentSessionResource:
+        """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version
+        from ``version_indicator`` and enforces session ownership using the provided isolation key for
+        session-mutating operations.
+
+        :param agent_name: The name of the agent to create a session for. Required.
+        :type agent_name: str
+        :keyword version_indicator: Determines which agent version backs the session. Required.
+        :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
          Default value is "application/json".
         :paramtype content_type: str
@@ -5373,7 +6161,7 @@ def create_session(
 
     @overload
     def create_session(
-        self, agent_name: str, body: JSON, *, isolation_key: str, content_type: str = "application/json", **kwargs: Any
+        self, agent_name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.AgentSessionResource:
         """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version
         from ``version_indicator`` and enforces session ownership using the provided isolation key for
@@ -5383,9 +6171,6 @@ def create_session(
         :type agent_name: str
         :param body: Required.
        :type body: JSON
-        :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership
-         for session-mutating operations. Required.
-        :paramtype isolation_key: str
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -5396,13 +6181,7 @@
 
     @overload
     def create_session(
-        self,
-        agent_name: str,
-        body: IO[bytes],
-        *,
-        isolation_key: str,
-        content_type: str = "application/json",
-        **kwargs: Any
+        self, agent_name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.AgentSessionResource:
         """Creates a new session for an agent endpoint. The endpoint resolves the backing agent version
         from ``version_indicator`` and enforces session ownership using the provided isolation key for
@@ -5412,9 +6191,6 @@ def create_session(
         :type agent_name: str
         :param body: Required.
         :type body: IO[bytes]
-        :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership
-         for session-mutating operations. Required.
-        :paramtype isolation_key: str
         :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -5429,7 +6205,6 @@ def create_session(
         agent_name: str,
         body: Union[JSON, IO[bytes]] = _Unset,
         *,
-        isolation_key: str,
        version_indicator: _models.VersionIndicator = _Unset,
        agent_session_id: Optional[str] = None,
        **kwargs: Any
@@ -5442,9 +6217,6 @@ def create_session(
        :type agent_name: str
        :param body: Is either a JSON type or a IO[bytes] type. Required.
        :type body: JSON or IO[bytes]
-        :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership
-         for session-mutating operations. Required.
-        :paramtype isolation_key: str
        :keyword version_indicator: Determines which agent version backs the session. Required.
        :paramtype version_indicator: ~azure.ai.projects.models.VersionIndicator
        :keyword agent_session_id: Optional caller-provided session ID. If specified, it must be unique
@@ -5482,7 +6254,6 @@ def create_session(
         _request = build_beta_agents_create_session_request(
             agent_name=agent_name,
-            isolation_key=isolation_key,
             content_type=content_type,
             api_version=self._config.api_version,
             content=_content,
@@ -5595,7 +6366,7 @@ def get_session(self, agent_name: str, session_id: str, **kwargs: Any) -> _model
 
     @distributed_trace
     def delete_session(  # pylint: disable=inconsistent-return-statements
-        self, agent_name: str, session_id: str, *, isolation_key: str, **kwargs: Any
+        self, agent_name: str, session_id: str, **kwargs: Any
     ) -> None:
         """Deletes a session synchronously. Returns 204 No Content when the session is deleted or does
         not exist.
@@ -5604,9 +6375,6 @@ def delete_session(  # pylint: disable=inconsistent-return-statements
         :type agent_name: str
         :param session_id: The session identifier. Required.
         :type session_id: str
-        :keyword isolation_key: Isolation key used by the agent endpoint to enforce session ownership
-         for session-mutating operations. Required.
-        :paramtype isolation_key: str
         :return: None
         :rtype: None
         :raises ~azure.core.exceptions.HttpResponseError:
@@ -5627,7 +6395,6 @@ def delete_session(  # pylint: disable=inconsistent-return-statements
         _request = build_beta_agents_delete_session_request(
             agent_name=agent_name,
             session_id=session_id,
-            isolation_key=isolation_key,
             api_version=self._config.api_version,
             headers=_headers,
             params=_params,
@@ -6370,14 +7137,14 @@ def delete(self, name: str, **kwargs: Any) -> None:  # pylint: disable=inconsist
 
     @overload
     def create(
-        self, name: str, body: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any
+        self, name: str, taxonomy: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.EvaluationTaxonomy:
         """Create an evaluation taxonomy.
 
         :param name: The name of the evaluation taxonomy. Required.
         :type name: str
-        :param body: The evaluation taxonomy. Required.
-        :type body: ~azure.ai.projects.models.EvaluationTaxonomy
+        :param taxonomy: The evaluation taxonomy. Required.
+        :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -6388,14 +7155,14 @@ def create(
 
     @overload
     def create(
-        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+        self, name: str, taxonomy: JSON, *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.EvaluationTaxonomy:
         """Create an evaluation taxonomy.
 
         :param name: The name of the evaluation taxonomy. Required.
         :type name: str
-        :param body: The evaluation taxonomy. Required.
-        :type body: JSON
+        :param taxonomy: The evaluation taxonomy. Required.
+        :type taxonomy: JSON
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -6406,14 +7173,14 @@ def create(
 
     @overload
     def create(
-        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+        self, name: str, taxonomy: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.EvaluationTaxonomy:
         """Create an evaluation taxonomy.
 
         :param name: The name of the evaluation taxonomy. Required.
         :type name: str
-        :param body: The evaluation taxonomy. Required.
-        :type body: IO[bytes]
+        :param taxonomy: The evaluation taxonomy. Required.
+        :type taxonomy: IO[bytes]
         :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -6424,15 +7191,15 @@ def create(
 
     @distributed_trace
     def create(
-        self, name: str, body: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any
+        self, name: str, taxonomy: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any
     ) -> _models.EvaluationTaxonomy:
         """Create an evaluation taxonomy.
 
         :param name: The name of the evaluation taxonomy. Required.
         :type name: str
-        :param body: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, JSON,
-         IO[bytes] Required.
-        :type body: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes]
+        :param taxonomy: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy,
+         JSON, IO[bytes] Required.
+        :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes]
         :return: EvaluationTaxonomy. The EvaluationTaxonomy is compatible with MutableMapping
         :rtype: ~azure.ai.projects.models.EvaluationTaxonomy
        :raises ~azure.core.exceptions.HttpResponseError:
@@ -6453,10 +7220,10 @@ def create(
         content_type = content_type or "application/json"
         _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
+        if isinstance(taxonomy, (IOBase, bytes)):
+            _content = taxonomy
         else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+            _content = json.dumps(taxonomy, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
 
         _request = build_beta_evaluation_taxonomies_create_request(
             name=name,
@@ -6500,14 +7267,14 @@ def create(
 
     @overload
     def update(
-        self, name: str, body: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any
+        self, name: str, taxonomy: _models.EvaluationTaxonomy, *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.EvaluationTaxonomy:
         """Update an evaluation taxonomy.
 
         :param name: The name of the evaluation taxonomy. Required.
         :type name: str
-        :param body: The evaluation taxonomy. Required.
-        :type body: ~azure.ai.projects.models.EvaluationTaxonomy
+        :param taxonomy: The evaluation taxonomy. Required.
+        :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -6518,14 +7285,14 @@ def update(
 
     @overload
     def update(
-        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+        self, name: str, taxonomy: JSON, *, content_type: str = "application/json", **kwargs: Any
     ) -> _models.EvaluationTaxonomy:
         """Update an evaluation taxonomy.
 
         :param name: The name of the evaluation taxonomy. Required.
         :type name: str
-        :param body: The evaluation taxonomy. Required.
-        :type body: JSON
+        :param taxonomy: The evaluation taxonomy. Required.
+        :type taxonomy: JSON
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -6536,14 +7303,14 @@ def update(
 
     @overload
     def update(
-        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+        self, name: str, taxonomy: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
    ) -> _models.EvaluationTaxonomy:
        """Update an evaluation taxonomy.
 
        :param name: The name of the evaluation taxonomy. Required.
        :type name: str
-        :param body: The evaluation taxonomy. Required.
-        :type body: IO[bytes]
+        :param taxonomy: The evaluation taxonomy. Required.
+        :type taxonomy: IO[bytes]
        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
        Default value is "application/json".
:paramtype content_type: str @@ -6554,15 +7321,15 @@ def update( @distributed_trace def update( - self, name: str, body: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any + self, name: str, taxonomy: Union[_models.EvaluationTaxonomy, JSON, IO[bytes]], **kwargs: Any ) -> _models.EvaluationTaxonomy: """Update an evaluation taxonomy. :param name: The name of the evaluation taxonomy. Required. :type name: str - :param body: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, JSON, - IO[bytes] Required. - :type body: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes] + :param taxonomy: The evaluation taxonomy. Is one of the following types: EvaluationTaxonomy, + JSON, IO[bytes] Required. + :type taxonomy: ~azure.ai.projects.models.EvaluationTaxonomy or JSON or IO[bytes] :return: EvaluationTaxonomy. The EvaluationTaxonomy is compatible with MutableMapping :rtype: ~azure.ai.projects.models.EvaluationTaxonomy :raises ~azure.core.exceptions.HttpResponseError: @@ -6583,10 +7350,10 @@ def update( content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(taxonomy, (IOBase, bytes)): + _content = taxonomy else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(taxonomy, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore _request = build_beta_evaluation_taxonomies_update_request( name=name, @@ -7265,80 +8032,103 @@ def update_version( return deserialized # type: ignore - -class BetaInsightsOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.AIProjectClient`'s - :attr:`insights` attribute. 
- """ - - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") - @overload - def generate( - self, insight: _models.Insight, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Insight: - """Generate Insights. + def create_generation_job( + self, + job: _models.EvaluatorGenerationJob, + *, + operation_id: Optional[str] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. - :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Required. - :type insight: ~azure.ai.projects.models.Insight + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Required. + :type job: ~azure.ai.projects.models.EvaluatorGenerationJob + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + :return: EvaluatorGenerationJob. 
The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def generate(self, insight: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.Insight: - """Generate Insights. - - :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Required. - :type insight: JSON + def create_generation_job( + self, job: JSON, *, operation_id: Optional[str] = None, content_type: str = "application/json", **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. + + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Required. + :type job: JSON + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def generate(self, insight: IO[bytes], *, content_type: str = "application/json", **kwargs: Any) -> _models.Insight: - """Generate Insights. + def create_generation_job( + self, + job: IO[bytes], + *, + operation_id: Optional[str] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. 
- :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Required. - :type insight: IO[bytes] + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. Required. + :type job: IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace - def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwargs: Any) -> _models.Insight: - """Generate Insights. + def create_generation_job( + self, + job: Union[_models.EvaluatorGenerationJob, JSON, IO[bytes]], + *, + operation_id: Optional[str] = None, + **kwargs: Any + ) -> _models.EvaluatorGenerationJob: + """Creates an evaluator generation job. - :param insight: Complete evaluation configuration including data source, evaluators, and result - settings. Is one of the following types: Insight, JSON, IO[bytes] Required. - :type insight: ~azure.ai.projects.models.Insight or JSON or IO[bytes] - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + Creates an evaluator generation job. The service generates rubric-based evaluator definitions + from the provided source materials asynchronously. + + :param job: The job to create. 
Is one of the following types: EvaluatorGenerationJob, JSON, + IO[bytes] Required. + :type job: ~azure.ai.projects.models.EvaluatorGenerationJob or JSON or IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str + :return: EvaluatorGenerationJob. The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7353,16 +8143,17 @@ def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwargs: A _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.Insight] = kwargs.pop("cls", None) + cls: ClsType[_models.EvaluatorGenerationJob] = kwargs.pop("cls", None) content_type = content_type or "application/json" _content = None - if isinstance(insight, (IOBase, bytes)): - _content = insight + if isinstance(job, (IOBase, bytes)): + _content = job else: - _content = json.dumps(insight, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(job, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_insights_generate_request( + _request = build_beta_evaluators_create_generation_job_request( + operation_id=operation_id, content_type=content_type, api_version=self._config.api_version, content=_content, @@ -7395,27 +8186,30 @@ def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwargs: A ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) + response_headers["Location"] = self._deserialize("str", response.headers.get("Location")) + if _stream: deserialized = 
response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.Insight, response.json()) + deserialized = _deserialize(_models.EvaluatorGenerationJob, response.json()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace - def get(self, insight_id: str, *, include_coordinates: Optional[bool] = None, **kwargs: Any) -> _models.Insight: - """Get a specific insight by Id. + def get_generation_job(self, job_id: str, **kwargs: Any) -> _models.EvaluatorGenerationJob: + """Get info about an evaluator generation job. - :param insight_id: The unique identifier for the insights report. Required. - :type insight_id: str - :keyword include_coordinates: Whether to include coordinates for visualization in the response. - Defaults to false. Default value is None. - :paramtype include_coordinates: bool - :return: Insight. The Insight is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Insight + Gets the details of an evaluator generation job by its ID. + + :param job_id: The ID of the job. Required. + :type job_id: str + :return: EvaluatorGenerationJob. 
The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7429,11 +8223,10 @@ def get(self, insight_id: str, *, include_coordinates: Optional[bool] = None, ** _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.Insight] = kwargs.pop("cls", None) + cls: ClsType[_models.EvaluatorGenerationJob] = kwargs.pop("cls", None) - _request = build_beta_insights_get_request( - insight_id=insight_id, - include_coordinates=include_coordinates, + _request = build_beta_evaluators_get_generation_job_request( + job_id=job_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -7464,49 +8257,58 @@ def get(self, insight_id: str, *, include_coordinates: Optional[bool] = None, ** ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Retry-After"] = self._deserialize("int", response.headers.get("Retry-After")) + if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.Insight, response.json()) + deserialized = _deserialize(_models.EvaluatorGenerationJob, response.json()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace - def list( + def list_generation_jobs( self, *, - type: Optional[Union[str, _models.InsightType]] = None, - eval_id: Optional[str] = None, - run_id: Optional[str] = None, - agent_name: Optional[str] = None, - include_coordinates: Optional[bool] = None, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, + category: Optional[Union[str, _models.EvaluatorCategory]] = None, **kwargs: Any - ) -> 
ItemPaged["_models.Insight"]: - """List all insights in reverse chronological order (newest first). + ) -> ItemPaged["_models.EvaluatorGenerationJob"]: + """Returns a list of evaluator generation jobs. - :keyword type: Filter by the type of analysis. Known values are: "EvaluationRunClusterInsight", - "AgentClusterInsight", and "EvaluationComparison". Default value is None. - :paramtype type: str or ~azure.ai.projects.models.InsightType - :keyword eval_id: Filter by the evaluation ID. Default value is None. - :paramtype eval_id: str - :keyword run_id: Filter by the evaluation run ID. Default value is None. - :paramtype run_id: str - :keyword agent_name: Filter by the agent name. Default value is None. - :paramtype agent_name: str - :keyword include_coordinates: Whether to include coordinates for visualization in the response. - Defaults to false. Default value is None. - :paramtype include_coordinates: bool - :return: An iterator like instance of Insight - :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.Insight] + Returns a list of evaluator generation jobs. + + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None. + :paramtype before: str + :keyword category: Filter evaluator generation jobs by category. 
Known values are: "quality", + "safety", and "agents". Default value is None. + :paramtype category: str or ~azure.ai.projects.models.EvaluatorCategory + :return: An iterator like instance of EvaluatorGenerationJob + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.EvaluatorGenerationJob] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.Insight]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.EvaluatorGenerationJob]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -7516,63 +8318,36 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_insights_list_request( - type=type, - eval_id=eval_id, - run_id=run_id, - agent_name=agent_name, - include_coordinates=include_coordinates, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, 
**path_format_arguments) + def prepare_request(_continuation_token=None): + _request = build_beta_evaluators_list_generation_jobs_request( + limit=limit, + order=order, + after=_continuation_token, + before=before, + category=category, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) return _request def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.Insight], - deserialized.get("value", []), + List[_models.EvaluatorGenerationJob], + deserialized.get("data", []), ) if cls: list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, iter(list_of_elem) + return deserialized.get("last_id") or None, iter(list_of_elem) - def get_next(next_link=None): - _request = prepare_request(next_link) + def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) _stream = False pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access @@ -7592,112 +8367,84 @@ def get_next(next_link=None): return ItemPaged(get_next, extract_data) + @distributed_trace + def cancel_generation_job(self, job_id: str, **kwargs: Any) -> _models.EvaluatorGenerationJob: + """Cancels an evaluator generation job. -class BetaMemoryStoresOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. + Cancels an evaluator generation job by its ID. - Instead, you should access the following operations through - :class:`~azure.ai.projects.AIProjectClient`'s - :attr:`memory_stores` attribute. - """ + :param job_id: The ID of the job to cancel. Required. + :type job_id: str + :return: EvaluatorGenerationJob. 
The EvaluatorGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.EvaluatorGenerationJob + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} - @overload - def create( - self, - *, - name: str, - definition: _models.MemoryStoreDefinition, - content_type: str = "application/json", - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. + cls: ClsType[_models.EvaluatorGenerationJob] = kwargs.pop("cls", None) - :keyword name: The name of the memory store. Required. - :paramtype name: str - :keyword definition: The memory store definition. Required. - :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :keyword description: A human-readable description of the memory store. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. 
- :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails - :raises ~azure.core.exceptions.HttpResponseError: - """ + _request = build_beta_evaluators_cancel_generation_job_request( + job_id=job_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - @overload - def create( - self, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) - :param body: Required. - :type body: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails - :raises ~azure.core.exceptions.HttpResponseError: - """ + response = pipeline_response.http_response - @overload - def create( - self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. 
+ if response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) - :param body: Required. - :type body: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails - :raises ~azure.core.exceptions.HttpResponseError: - """ + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.EvaluatorGenerationJob, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore @distributed_trace - def create( - self, - body: Union[JSON, IO[bytes]] = _Unset, - *, - name: str = _Unset, - definition: _models.MemoryStoreDefinition = _Unset, - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Create a memory store. + def delete_generation_job( # pylint: disable=inconsistent-return-statements + self, job_id: str, **kwargs: Any + ) -> None: + """Deletes an evaluator generation job by its ID. Deletes the job record only; the generated + evaluator (if any) is preserved. - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword name: The name of the memory store. Required. - :paramtype name: str - :keyword definition: The memory store definition. Required. 
- :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition - :keyword description: A human-readable description of the memory store. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param job_id: The ID of the job to delete. Required. + :type job_id: str + :return: None + :rtype: None :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7708,30 +8455,14 @@ def create( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) - - if body is _Unset: - if name is _Unset: - raise TypeError("missing required argument: name") - if definition is _Unset: - raise TypeError("missing required argument: definition") - body = {"definition": definition, "description": description, "metadata": metadata, "name": name} - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + cls: ClsType[None] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_create_request( - content_type=content_type, + _request = build_beta_evaluators_delete_generation_job_request( + job_id=job_id, api_version=self._config.api_version, - content=_content, headers=_headers, params=_params, ) @@ -7740,20 +8471,14 @@ def create( 
} _request.url = self._client.format_url(_request.url, **path_format_arguments) - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) + _stream = False pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [200]: - if _stream: - try: - response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass + if response.status_code not in [204]: map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -7761,102 +8486,83 @@ def create( ) raise HttpResponseError(response=response, model=error) - if _stream: - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - else: - deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, None, {}) # type: ignore - return deserialized # type: ignore + +class BetaInsightsOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.AIProjectClient`'s + :attr:`insights` attribute. 
+ """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") @overload - def update( - self, - name: str, - *, - content_type: str = "application/json", - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + def generate( + self, insight: _models.Insight, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Required. + :type insight: ~azure.ai.projects.models.Insight :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :keyword description: A human-readable description of the memory store. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :return: Insight. 
The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def update( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + def generate(self, insight: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Required. - :type body: JSON + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Required. + :type insight: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :return: Insight. The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def update( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + def generate(self, insight: IO[bytes], *, content_type: str = "application/json", **kwargs: Any) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Required. - :type body: IO[bytes] + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Required. + :type insight: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: MemoryStoreDetails. 
The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :return: Insight. The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace - def update( - self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, - *, - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - **kwargs: Any - ) -> _models.MemoryStoreDetails: - """Update a memory store. + def generate(self, insight: Union[_models.Insight, JSON, IO[bytes]], **kwargs: Any) -> _models.Insight: + """Generate Insights. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword description: A human-readable description of the memory store. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default - value is None. - :paramtype metadata: dict[str, str] - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param insight: Complete evaluation configuration including data source, evaluators, and result + settings. Is one of the following types: Insight, JSON, IO[bytes] Required. + :type insight: ~azure.ai.projects.models.Insight or JSON or IO[bytes] + :return: Insight. 
The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7871,20 +8577,16 @@ def update( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) + cls: ClsType[_models.Insight] = kwargs.pop("cls", None) - if body is _Unset: - body = {"description": description, "metadata": metadata} - body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(insight, (IOBase, bytes)): + _content = insight else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(insight, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_memory_stores_update_request( - name=name, + _request = build_beta_insights_generate_request( content_type=content_type, api_version=self._config.api_version, content=_content, @@ -7904,7 +8606,7 @@ def update( response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: response.read() # Load the body in memory and close the socket @@ -7920,7 +8622,7 @@ def update( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) + deserialized = _deserialize(_models.Insight, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -7928,13 +8630,16 @@ def update( return deserialized # type: ignore @distributed_trace - def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: - """Retrieve a memory store. 
+ def get(self, insight_id: str, *, include_coordinates: Optional[bool] = None, **kwargs: Any) -> _models.Insight: + """Get a specific insight by Id. - :param name: The name of the memory store to retrieve. Required. - :type name: str - :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :param insight_id: The unique identifier for the insights report. Required. + :type insight_id: str + :keyword include_coordinates: Whether to include coordinates for visualization in the response. + Defaults to false. Default value is None. + :paramtype include_coordinates: bool + :return: Insight. The Insight is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Insight :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -7948,10 +8653,11 @@ def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) + cls: ClsType[_models.Insight] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_get_request( - name=name, + _request = build_beta_insights_get_request( + insight_id=insight_id, + include_coordinates=include_coordinates, api_version=self._config.api_version, headers=_headers, params=_params, @@ -7985,7 +8691,7 @@ def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) + deserialized = _deserialize(_models.Insight, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -7996,35 +8702,35 @@ def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: def list( self, *, - limit: Optional[int] = None, - order: Optional[Union[str, 
_models.PageOrder]] = None, - before: Optional[str] = None, + type: Optional[Union[str, _models.InsightType]] = None, + eval_id: Optional[str] = None, + run_id: Optional[str] = None, + agent_name: Optional[str] = None, + include_coordinates: Optional[bool] = None, **kwargs: Any - ) -> ItemPaged["_models.MemoryStoreDetails"]: - """List all memory stores. + ) -> ItemPaged["_models.Insight"]: + """List all insights in reverse chronological order (newest first). - :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and - 100, and the - default is 20. Default value is None. - :paramtype limit: int - :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for - ascending order and``desc`` - for descending order. Known values are: "asc" and "desc". Default value is None. - :paramtype order: str or ~azure.ai.projects.models.PageOrder - :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your - place in the list. - For instance, if you make a list request and receive 100 objects, ending with obj_foo, your - subsequent call can include before=obj_foo in order to fetch the previous page of the list. - Default value is None. - :paramtype before: str - :return: An iterator like instance of MemoryStoreDetails - :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.MemoryStoreDetails] + :keyword type: Filter by the type of analysis. Known values are: "EvaluationRunClusterInsight", + "AgentClusterInsight", and "EvaluationComparison". Default value is None. + :paramtype type: str or ~azure.ai.projects.models.InsightType + :keyword eval_id: Filter by the evaluation ID. Default value is None. + :paramtype eval_id: str + :keyword run_id: Filter by the evaluation run ID. Default value is None. + :paramtype run_id: str + :keyword agent_name: Filter by the agent name. Default value is None. 
+ :paramtype agent_name: str + :keyword include_coordinates: Whether to include coordinates for visualization in the response. + Defaults to false. Default value is None. + :paramtype include_coordinates: bool + :return: An iterator like instance of Insight + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.Insight] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.MemoryStoreDetails]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.Insight]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -8034,35 +8740,63 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(_continuation_token=None): + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_insights_list_request( + type=type, + eval_id=eval_id, + run_id=run_id, + agent_name=agent_name, + include_coordinates=include_coordinates, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + 
} + _request.url = self._client.format_url(_request.url, **path_format_arguments) - _request = build_beta_memory_stores_list_request( - limit=limit, - order=order, - after=_continuation_token, - before=before, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) return _request def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.MemoryStoreDetails], - deserialized.get("data", []), + List[_models.Insight], + deserialized.get("value", []), ) if cls: list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("last_id") or None, iter(list_of_elem) + return deserialized.get("nextLink") or None, iter(list_of_elem) - def get_next(_continuation_token=None): - _request = prepare_request(_continuation_token) + def get_next(next_link=None): + _request = prepare_request(next_link) _stream = False pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access @@ -8082,14 +8816,112 @@ def get_next(_continuation_token=None): return ItemPaged(get_next, extract_data) + +class BetaMemoryStoresOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.AIProjectClient`'s + :attr:`memory_stores` attribute. 
+ """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + + @overload + def create( + self, + *, + name: str, + definition: _models.MemoryStoreDefinition, + content_type: str = "application/json", + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. + + :keyword name: The name of the memory store. Required. + :paramtype name: str + :keyword definition: The memory store definition. Required. + :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the memory store. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create( + self, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. + + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". 
+ :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create( + self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. + + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + @distributed_trace - def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult: - """Delete a memory store. + def create( + self, + body: Union[JSON, IO[bytes]] = _Unset, + *, + name: str = _Unset, + definition: _models.MemoryStoreDefinition = _Unset, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + **kwargs: Any + ) -> _models.MemoryStoreDetails: + """Create a memory store. - :param name: The name of the memory store to delete. Required. - :type name: str - :return: DeleteMemoryStoreResult. The DeleteMemoryStoreResult is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.DeleteMemoryStoreResult + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword name: The name of the memory store. Required. + :paramtype name: str + :keyword definition: The memory store definition. Required. + :paramtype definition: ~azure.ai.projects.models.MemoryStoreDefinition + :keyword description: A human-readable description of the memory store. Default value is None. 
+ :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8100,14 +8932,30 @@ def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult: } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.DeleteMemoryStoreResult] = kwargs.pop("cls", None) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_delete_request( - name=name, + if body is _Unset: + if name is _Unset: + raise TypeError("missing required argument: name") + if definition is _Unset: + raise TypeError("missing required argument: definition") + body = {"definition": definition, "description": description, "metadata": metadata, "name": name} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_memory_stores_create_request( + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -8140,7 +8988,7 @@ def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = 
_deserialize(_models.DeleteMemoryStoreResult, response.json()) + deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -8148,56 +8996,91 @@ def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult: return deserialized # type: ignore @overload - def _search_memories( + def update( self, name: str, *, - scope: str, content_type: str = "application/json", - items: Optional[List[dict[str, Any]]] = None, - previous_search_id: Optional[str] = None, - options: Optional[_models.MemorySearchOptions] = None, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> _models.MemoryStoreSearchResult: ... + ) -> _models.MemoryStoreDetails: + """Update a memory store. + + :param name: The name of the memory store to update. Required. + :type name: str + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the memory store. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + @overload - def _search_memories( + def update( self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreSearchResult: ... + ) -> _models.MemoryStoreDetails: + """Update a memory store. + + :param name: The name of the memory store to update. Required. + :type name: str + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. 
Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ + @overload - def _search_memories( + def update( self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreSearchResult: ... + ) -> _models.MemoryStoreDetails: + """Update a memory store. + + :param name: The name of the memory store to update. Required. + :type name: str + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ @distributed_trace - def _search_memories( + def update( self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, - scope: str = _Unset, - items: Optional[List[dict[str, Any]]] = None, - previous_search_id: Optional[str] = None, - options: Optional[_models.MemorySearchOptions] = None, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> _models.MemoryStoreSearchResult: - """Search for relevant memories from a memory store based on conversation context. + ) -> _models.MemoryStoreDetails: + """Update a memory store. - :param name: The name of the memory store to search. Required. + :param name: The name of the memory store to update. Required. :type name: str :param body: Is either a JSON type or a IO[bytes] type. Required. :type body: JSON or IO[bytes] - :keyword scope: The namespace that logically groups and isolates memories, such as a user ID. - Required. 
- :paramtype scope: str - :keyword items: Items for which to search for relevant memories. Default value is None. - :paramtype items: list[dict[str, any]] - :keyword previous_search_id: The unique ID of the previous search request, enabling incremental - memory search from where the last operation left off. Default value is None. - :paramtype previous_search_id: str - :keyword options: Memory search options. Default value is None. - :paramtype options: ~azure.ai.projects.models.MemorySearchOptions - :return: MemoryStoreSearchResult. The MemoryStoreSearchResult is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreSearchResult + :keyword description: A human-readable description of the memory store. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the memory store. Default + value is None. + :paramtype metadata: dict[str, str] + :return: MemoryStoreDetails. The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8212,17 +9095,10 @@ def _search_memories( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreSearchResult] = kwargs.pop("cls", None) + cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) if body is _Unset: - if scope is _Unset: - raise TypeError("missing required argument: scope") - body = { - "items": items, - "options": options, - "previous_search_id": previous_search_id, - "scope": scope, - } + body = {"description": description, "metadata": metadata} body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None @@ -8231,7 +9107,7 @@ def _search_memories( else: _content = json.dumps(body, cls=SdkJSONEncoder, 
exclude_readonly=True) # type: ignore - _request = build_beta_memory_stores_search_memories_request( + _request = build_beta_memory_stores_update_request( name=name, content_type=content_type, api_version=self._config.api_version, @@ -8268,24 +9144,23 @@ def _search_memories( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreSearchResult, response.json()) + deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - def _update_memories_initial( - self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, - *, - scope: str = _Unset, - items: Optional[List[dict[str, Any]]] = None, - previous_update_id: Optional[str] = None, - update_delay: Optional[int] = None, - **kwargs: Any - ) -> Iterator[bytes]: + @distributed_trace + def get(self, name: str, **kwargs: Any) -> _models.MemoryStoreDetails: + """Retrieve a memory store. + + :param name: The name of the memory store to retrieve. Required. + :type name: str + :return: MemoryStoreDetails. 
The MemoryStoreDetails is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDetails + :raises ~azure.core.exceptions.HttpResponseError: + """ error_map: MutableMapping = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, @@ -8294,34 +9169,14 @@ def _update_memories_initial( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[Iterator[bytes]] = kwargs.pop("cls", None) - - if body is _Unset: - if scope is _Unset: - raise TypeError("missing required argument: scope") - body = { - "items": items, - "previous_update_id": previous_update_id, - "scope": scope, - "update_delay": update_delay, - } - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + cls: ClsType[_models.MemoryStoreDetails] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_update_memories_request( + _request = build_beta_memory_stores_get_request( name=name, - content_type=content_type, api_version=self._config.api_version, - content=_content, headers=_headers, params=_params, ) @@ -8331,18 +9186,19 @@ def _update_memories_initial( _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = True + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [202]: - try: - response.read() # Load the body in 
memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass + if response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -8350,209 +9206,114 @@ def _update_memories_initial( ) raise HttpResponseError(response=response, model=error) - response_headers = {} - response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) - - deserialized = response.iter_bytes() if _decompress else response.iter_raw() + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.MemoryStoreDetails, response.json()) if cls: - return cls(pipeline_response, deserialized, response_headers) # type: ignore + return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - @overload - def _begin_update_memories( - self, - name: str, - *, - scope: str, - content_type: str = "application/json", - items: Optional[List[dict[str, Any]]] = None, - previous_update_id: Optional[str] = None, - update_delay: Optional[int] = None, - **kwargs: Any - ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: ... - @overload - def _begin_update_memories( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: ... - @overload - def _begin_update_memories( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: ... 
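The `_begin_update_memories` path removed above polled the URL returned in the `Operation-Location` response header (via `azure.core.polling.LROBasePolling`) and read the terminal payload from the `result` field of the final status response. A minimal pure-Python sketch of that polling shape, using a hypothetical in-memory `FakeService` rather than the real HTTP pipeline:

```python
# Sketch of Operation-Location style polling, as used by the removed
# _begin_update_memories LRO path. FakeService is hypothetical; the real SDK
# delegates this loop to azure.core.polling.LROBasePolling.

class FakeService:
    """Simulates an async operation that completes after a few status checks."""

    def __init__(self, polls_until_done: int, result: dict):
        self._remaining = polls_until_done
        self._result = result

    def get_status(self, operation_location: str) -> dict:
        # Each GET on the Operation-Location URL returns the current status;
        # on completion the body also carries the terminal payload under "result".
        if self._remaining > 0:
            self._remaining -= 1
            return {"status": "running"}
        return {"status": "succeeded", "result": self._result}


def poll_until_done(service: FakeService, operation_location: str) -> dict:
    """Poll the status endpoint until a terminal state, then return the result."""
    while True:
        status = service.get_status(operation_location)
        if status["status"] in ("succeeded", "failed", "canceled"):
            if status["status"] != "succeeded":
                raise RuntimeError(f"operation ended in state {status['status']}")
            return status["result"]
        # A real poller sleeps for the polling interval / Retry-After here.


service = FakeService(polls_until_done=2, result={"updated": 3})
outcome = poll_until_done(service, "https://example.invalid/operations/123")
print(outcome)  # {'updated': 3}
```

The real poller adds continuation tokens and error mapping on top of this loop; the sketch only shows the status/result contract visible in the diff.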
- @distributed_trace - def _begin_update_memories( + def list( self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, *, - scope: str = _Unset, - items: Optional[List[dict[str, Any]]] = None, - previous_update_id: Optional[str] = None, - update_delay: Optional[int] = None, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, **kwargs: Any - ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: - """Update memory store with conversation memories. + ) -> ItemPaged["_models.MemoryStoreDetails"]: + """List all memory stores. - :param name: The name of the memory store to update. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword scope: The namespace that logically groups and isolates memories, such as a user ID. - Required. - :paramtype scope: str - :keyword items: Conversation items to be stored in memory. Default value is None. - :paramtype items: list[dict[str, any]] - :keyword previous_update_id: The unique ID of the previous update request, enabling incremental - memory updates from where the last operation left off. Default value is None. - :paramtype previous_update_id: str - :keyword update_delay: Timeout period before processing the memory update in seconds. - If a new update request is received during this period, it will cancel the current request and - reset the timeout. - Set to 0 to immediately trigger the update without delay. - Defaults to 300 (5 minutes). Default value is None. - :paramtype update_delay: int - :return: An instance of LROPoller that returns MemoryStoreUpdateCompletedResult. The - MemoryStoreUpdateCompletedResult is compatible with MutableMapping - :rtype: - ~azure.core.polling.LROPoller[~azure.ai.projects.models.MemoryStoreUpdateCompletedResult] + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. 
Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None. + :paramtype before: str + :return: An iterator like instance of MemoryStoreDetails + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.MemoryStoreDetails] :raises ~azure.core.exceptions.HttpResponseError: """ - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreUpdateCompletedResult] = kwargs.pop("cls", None) - polling: Union[bool, PollingMethod] = kwargs.pop("polling", True) - lro_delay = kwargs.pop("polling_interval", self._config.polling_interval) - cont_token: Optional[str] = kwargs.pop("continuation_token", None) - if cont_token is None: - raw_result = self._update_memories_initial( - name=name, - body=body, - scope=scope, - items=items, - previous_update_id=previous_update_id, - update_delay=update_delay, - content_type=content_type, - cls=lambda x, y, z: x, + cls: ClsType[List[_models.MemoryStoreDetails]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def
prepare_request(_continuation_token=None): + + _request = build_beta_memory_stores_list_request( + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, headers=_headers, params=_params, - **kwargs ) - raw_result.http_response.read() # type: ignore - kwargs.pop("error_map", None) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + return _request - def get_long_running_output(pipeline_response): - response_headers = {} - response = pipeline_response.http_response - response_headers["Operation-Location"] = self._deserialize( - "str", response.headers.get("Operation-Location") + def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.MemoryStoreDetails], + deserialized.get("data", []), ) - - deserialized = _deserialize(_models.MemoryStoreUpdateCompletedResult, response.json().get("result", {})) if cls: - return cls(pipeline_response, deserialized, response_headers) # type: ignore - return deserialized + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("last_id") or None, iter(list_of_elem) - path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } + def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) - if polling is True: - polling_method: PollingMethod = cast( - PollingMethod, LROBasePolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs) - ) - elif polling is False: - polling_method = cast(PollingMethod, NoPolling()) - else: - polling_method = polling - if cont_token: - return LROPoller[_models.MemoryStoreUpdateCompletedResult].from_continuation_token( - polling_method=polling_method, - 
continuation_token=cont_token, - client=self._client, - deserialization_callback=get_long_running_output, + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs ) - return LROPoller[_models.MemoryStoreUpdateCompletedResult]( - self._client, raw_result, get_long_running_output, polling_method # type: ignore - ) - - @overload - def delete_scope( - self, name: str, *, scope: str, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDeleteScopeResult: - """Delete all memories associated with a specific scope from a memory store. - - :param name: The name of the memory store. Required. - :type name: str - :keyword scope: The namespace that logically groups and isolates memories to delete, such as a - user ID. Required. - :paramtype scope: str - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with - MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - def delete_scope( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDeleteScopeResult: - """Delete all memories associated with a specific scope from a memory store. + response = pipeline_response.http_response - :param name: The name of the memory store. Required. - :type name: str - :param body: Required. - :type body: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDeleteScopeResult. 
The MemoryStoreDeleteScopeResult is compatible with - MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult - :raises ~azure.core.exceptions.HttpResponseError: - """ + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) - @overload - def delete_scope( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.MemoryStoreDeleteScopeResult: - """Delete all memories associated with a specific scope from a memory store. + return pipeline_response - :param name: The name of the memory store. Required. - :type name: str - :param body: Required. - :type body: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with - MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult - :raises ~azure.core.exceptions.HttpResponseError: - """ + return ItemPaged(get_next, extract_data) @distributed_trace - def delete_scope( - self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, scope: str = _Unset, **kwargs: Any - ) -> _models.MemoryStoreDeleteScopeResult: - """Delete all memories associated with a specific scope from a memory store. + def delete(self, name: str, **kwargs: Any) -> _models.DeleteMemoryStoreResult: + """Delete a memory store. - :param name: The name of the memory store. Required. + :param name: The name of the memory store to delete. Required. :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. 
- :type body: JSON or IO[bytes] - :keyword scope: The namespace that logically groups and isolates memories to delete, such as a - user ID. Required. - :paramtype scope: str - :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with - MutableMapping - :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult + :return: DeleteMemoryStoreResult. The DeleteMemoryStoreResult is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DeleteMemoryStoreResult :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8563,29 +9324,14 @@ def delete_scope( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.MemoryStoreDeleteScopeResult] = kwargs.pop("cls", None) - - if body is _Unset: - if scope is _Unset: - raise TypeError("missing required argument: scope") - body = {"scope": scope} - body = {k: v for k, v in body.items() if v is not None} - content_type = content_type or "application/json" - _content = None - if isinstance(body, (IOBase, bytes)): - _content = body - else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + cls: ClsType[_models.DeleteMemoryStoreResult] = kwargs.pop("cls", None) - _request = build_beta_memory_stores_delete_scope_request( + _request = build_beta_memory_stores_delete_request( name=name, - content_type=content_type, api_version=self._config.api_version, - content=_content, headers=_headers, params=_params, ) @@ -8618,39 +9364,64 @@ def delete_scope( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.MemoryStoreDeleteScopeResult, response.json()) + deserialized = 
_deserialize(_models.DeleteMemoryStoreResult, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - -class BetaRedTeamsOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.AIProjectClient`'s - :attr:`red_teams` attribute. - """ - - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + @overload + def _search_memories( + self, + name: str, + *, + scope: str, + content_type: str = "application/json", + items: Optional[List[dict[str, Any]]] = None, + previous_search_id: Optional[str] = None, + options: Optional[_models.MemorySearchOptions] = None, + **kwargs: Any + ) -> _models.MemoryStoreSearchResult: ... + @overload + def _search_memories( + self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreSearchResult: ... + @overload + def _search_memories( + self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreSearchResult: ... @distributed_trace - def get(self, name: str, **kwargs: Any) -> _models.RedTeam: - """Get a redteam by name. - - :param name: Identifier of the red team run. Required. - :type name: str - :return: RedTeam. 
The RedTeam is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.RedTeam + def _search_memories( + self, + name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + scope: str = _Unset, + items: Optional[List[dict[str, Any]]] = None, + previous_search_id: Optional[str] = None, + options: Optional[_models.MemorySearchOptions] = None, + **kwargs: Any + ) -> _models.MemoryStoreSearchResult: + """Search for relevant memories from a memory store based on conversation context. + + :param name: The name of the memory store to search. Required. + :type name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword scope: The namespace that logically groups and isolates memories, such as a user ID. + Required. + :paramtype scope: str + :keyword items: Items for which to search for relevant memories. Default value is None. + :paramtype items: list[dict[str, any]] + :keyword previous_search_id: The unique ID of the previous search request, enabling incremental + memory search from where the last operation left off. Default value is None. + :paramtype previous_search_id: str + :keyword options: Memory search options. Default value is None. + :paramtype options: ~azure.ai.projects.models.MemorySearchOptions + :return: MemoryStoreSearchResult. 
The MemoryStoreSearchResult is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreSearchResult :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8661,14 +9432,34 @@ def get(self, name: str, **kwargs: Any) -> _models.RedTeam: } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.MemoryStoreSearchResult] = kwargs.pop("cls", None) - _request = build_beta_red_teams_get_request( + if body is _Unset: + if scope is _Unset: + raise TypeError("missing required argument: scope") + body = { + "items": items, + "options": options, + "previous_search_id": previous_search_id, + "scope": scope, + } + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_memory_stores_search_memories_request( name=name, + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -8692,161 +9483,33 @@ def get(self, name: str, **kwargs: Any) -> _models.RedTeam: except (StreamConsumedError, StreamClosedError): pass map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = 
_deserialize(_models.RedTeam, response.json()) + deserialized = _deserialize(_models.MemoryStoreSearchResult, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - @distributed_trace - def list(self, **kwargs: Any) -> ItemPaged["_models.RedTeam"]: - """List a redteam by name. - - :return: An iterator like instance of RedTeam - :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.RedTeam] - :raises ~azure.core.exceptions.HttpResponseError: - """ - _headers = kwargs.pop("headers", {}) or {} - _params = kwargs.pop("params", {}) or {} - - cls: ClsType[List[_models.RedTeam]] = kwargs.pop("cls", None) - - error_map: MutableMapping = { - 401: ClientAuthenticationError, - 404: ResourceNotFoundError, - 409: ResourceExistsError, - 304: ResourceNotModifiedError, - } - error_map.update(kwargs.pop("error_map", {}) or {}) - - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_red_teams_list_request( - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = 
self._client.format_url(_request.url, **path_format_arguments) - - return _request - - def extract_data(pipeline_response): - deserialized = pipeline_response.http_response.json() - list_of_elem = _deserialize( - List[_models.RedTeam], - deserialized.get("value", []), - ) - if cls: - list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, iter(list_of_elem) - - def get_next(next_link=None): - _request = prepare_request(next_link) - - _stream = False - pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) - response = pipeline_response.http_response - - if response.status_code not in [200]: - map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) - - return pipeline_response - - return ItemPaged(get_next, extract_data) - - @overload - def create( - self, red_team: _models.RedTeam, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.RedTeam: - """Creates a redteam run. - - :param red_team: Redteam to be run. Required. - :type red_team: ~azure.ai.projects.models.RedTeam - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: RedTeam. The RedTeam is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.RedTeam - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - def create(self, red_team: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.RedTeam: - """Creates a redteam run. - - :param red_team: Redteam to be run. Required. - :type red_team: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: RedTeam. 
The RedTeam is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.RedTeam - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - def create(self, red_team: IO[bytes], *, content_type: str = "application/json", **kwargs: Any) -> _models.RedTeam: - """Creates a redteam run. - - :param red_team: Redteam to be run. Required. - :type red_team: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: RedTeam. The RedTeam is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.RedTeam - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @distributed_trace - def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: Any) -> _models.RedTeam: - """Creates a redteam run. - - :param red_team: Redteam to be run. Is one of the following types: RedTeam, JSON, IO[bytes] - Required. - :type red_team: ~azure.ai.projects.models.RedTeam or JSON or IO[bytes] - :return: RedTeam. 
The RedTeam is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.RedTeam - :raises ~azure.core.exceptions.HttpResponseError: - """ + def _update_memories_initial( + self, + name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + scope: str = _Unset, + items: Optional[List[dict[str, Any]]] = None, + previous_update_id: Optional[str] = None, + update_delay: Optional[int] = None, + **kwargs: Any + ) -> Iterator[bytes]: error_map: MutableMapping = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, @@ -8859,16 +9522,27 @@ def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: An _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None) + cls: ClsType[Iterator[bytes]] = kwargs.pop("cls", None) + if body is _Unset: + if scope is _Unset: + raise TypeError("missing required argument: scope") + body = { + "items": items, + "previous_update_id": previous_update_id, + "scope": scope, + "update_delay": update_delay, + } + body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None - if isinstance(red_team, (IOBase, bytes)): - _content = red_team + if isinstance(body, (IOBase, bytes)): + _content = body else: - _content = json.dumps(red_team, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_red_teams_create_request( + _request = build_beta_memory_stores_update_memories_request( + name=name, content_type=content_type, api_version=self._config.api_version, content=_content, @@ -8881,19 +9555,18 @@ def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: An _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = 
kwargs.pop("stream", False) + _stream = True pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [201]: - if _stream: - try: - response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass + if response.status_code not in [202]: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) error = _failsafe_deserialize( _models.ApiErrorResponse, @@ -8901,25 +9574,1107 @@ def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: An ) raise HttpResponseError(response=response, model=error) - if _stream: - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - else: - deserialized = _deserialize(_models.RedTeam, response.json()) + response_headers = {} + response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) + + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + + if cls: + return cls(pipeline_response, deserialized, response_headers) # type: ignore + + return deserialized # type: ignore + + @overload + def _begin_update_memories( + self, + name: str, + *, + scope: str, + content_type: str = "application/json", + items: Optional[List[dict[str, Any]]] = None, + previous_update_id: Optional[str] = None, + update_delay: Optional[int] = None, + **kwargs: Any + ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: ... + @overload + def _begin_update_memories( + self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: ... 
+ @overload + def _begin_update_memories( + self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: ... + + @distributed_trace + def _begin_update_memories( + self, + name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + scope: str = _Unset, + items: Optional[List[dict[str, Any]]] = None, + previous_update_id: Optional[str] = None, + update_delay: Optional[int] = None, + **kwargs: Any + ) -> LROPoller[_models.MemoryStoreUpdateCompletedResult]: + """Update memory store with conversation memories. + + :param name: The name of the memory store to update. Required. + :type name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword scope: The namespace that logically groups and isolates memories, such as a user ID. + Required. + :paramtype scope: str + :keyword items: Conversation items to be stored in memory. Default value is None. + :paramtype items: list[dict[str, any]] + :keyword previous_update_id: The unique ID of the previous update request, enabling incremental + memory updates from where the last operation left off. Default value is None. + :paramtype previous_update_id: str + :keyword update_delay: Timeout period before processing the memory update in seconds. + If a new update request is received during this period, it will cancel the current request and + reset the timeout. + Set to 0 to immediately trigger the update without delay. + Defaults to 300 (5 minutes). Default value is None. + :paramtype update_delay: int + :return: An instance of LROPoller that returns MemoryStoreUpdateCompletedResult. 
The + MemoryStoreUpdateCompletedResult is compatible with MutableMapping + :rtype: + ~azure.core.polling.LROPoller[~azure.ai.projects.models.MemoryStoreUpdateCompletedResult] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.MemoryStoreUpdateCompletedResult] = kwargs.pop("cls", None) + polling: Union[bool, PollingMethod] = kwargs.pop("polling", True) + lro_delay = kwargs.pop("polling_interval", self._config.polling_interval) + cont_token: Optional[str] = kwargs.pop("continuation_token", None) + if cont_token is None: + raw_result = self._update_memories_initial( + name=name, + body=body, + scope=scope, + items=items, + previous_update_id=previous_update_id, + update_delay=update_delay, + content_type=content_type, + cls=lambda x, y, z: x, + headers=_headers, + params=_params, + **kwargs + ) + raw_result.http_response.read() # type: ignore + kwargs.pop("error_map", None) + + def get_long_running_output(pipeline_response): + response_headers = {} + response = pipeline_response.http_response + response_headers["Operation-Location"] = self._deserialize( + "str", response.headers.get("Operation-Location") + ) + + deserialized = _deserialize(_models.MemoryStoreUpdateCompletedResult, response.json().get("result", {})) + if cls: + return cls(pipeline_response, deserialized, response_headers) # type: ignore + return deserialized + + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + + if polling is True: + polling_method: PollingMethod = cast( + PollingMethod, LROBasePolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs) + ) + elif polling is False: + polling_method = cast(PollingMethod, NoPolling()) + else: + polling_method = polling 
+ if cont_token: + return LROPoller[_models.MemoryStoreUpdateCompletedResult].from_continuation_token( + polling_method=polling_method, + continuation_token=cont_token, + client=self._client, + deserialization_callback=get_long_running_output, + ) + return LROPoller[_models.MemoryStoreUpdateCompletedResult]( + self._client, raw_result, get_long_running_output, polling_method # type: ignore + ) + + @overload + def delete_scope( + self, name: str, *, scope: str, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDeleteScopeResult: + """Delete all memories associated with a specific scope from a memory store. + + :param name: The name of the memory store. Required. + :type name: str + :keyword scope: The namespace that logically groups and isolates memories to delete, such as a + user ID. Required. + :paramtype scope: str + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with + MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def delete_scope( + self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDeleteScopeResult: + """Delete all memories associated with a specific scope from a memory store. + + :param name: The name of the memory store. Required. + :type name: str + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDeleteScopeResult. 
The MemoryStoreDeleteScopeResult is compatible with + MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def delete_scope( + self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.MemoryStoreDeleteScopeResult: + """Delete all memories associated with a specific scope from a memory store. + + :param name: The name of the memory store. Required. + :type name: str + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: MemoryStoreDeleteScopeResult. The MemoryStoreDeleteScopeResult is compatible with + MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @distributed_trace + def delete_scope( + self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, scope: str = _Unset, **kwargs: Any + ) -> _models.MemoryStoreDeleteScopeResult: + """Delete all memories associated with a specific scope from a memory store. + + :param name: The name of the memory store. Required. + :type name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword scope: The namespace that logically groups and isolates memories to delete, such as a + user ID. Required. + :paramtype scope: str + :return: MemoryStoreDeleteScopeResult. 
The MemoryStoreDeleteScopeResult is compatible with + MutableMapping + :rtype: ~azure.ai.projects.models.MemoryStoreDeleteScopeResult + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.MemoryStoreDeleteScopeResult] = kwargs.pop("cls", None) + + if body is _Unset: + if scope is _Unset: + raise TypeError("missing required argument: scope") + body = {"scope": scope} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_memory_stores_delete_scope_request( + name=name, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + 
map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.MemoryStoreDeleteScopeResult, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + +class BetaRedTeamsOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.AIProjectClient`'s + :attr:`red_teams` attribute. + """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + + @distributed_trace + def get(self, name: str, **kwargs: Any) -> _models.RedTeam: + """Get a redteam by name. + + :param name: Identifier of the red team run. Required. + :type name: str + :return: RedTeam. 
The RedTeam is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.RedTeam + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None) + + _request = build_beta_red_teams_get_request( + name=name, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.RedTeam, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace + def list(self, **kwargs: Any) -> ItemPaged["_models.RedTeam"]: + """List all red teams. 
+ + :return: An iterator like instance of RedTeam + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.RedTeam] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[List[_models.RedTeam]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_red_teams_list_request( + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + return _request + + def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.RedTeam], + deserialized.get("value", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return 
deserialized.get("nextLink") or None, iter(list_of_elem) + + def get_next(next_link=None): + _request = prepare_request(next_link) + + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response + + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + return pipeline_response + + return ItemPaged(get_next, extract_data) + + @overload + def create( + self, red_team: _models.RedTeam, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.RedTeam: + """Creates a redteam run. + + :param red_team: Redteam to be run. Required. + :type red_team: ~azure.ai.projects.models.RedTeam + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: RedTeam. The RedTeam is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.RedTeam + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create(self, red_team: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.RedTeam: + """Creates a redteam run. + + :param red_team: Redteam to be run. Required. + :type red_team: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: RedTeam. The RedTeam is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.RedTeam + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create(self, red_team: IO[bytes], *, content_type: str = "application/json", **kwargs: Any) -> _models.RedTeam: + """Creates a redteam run. + + :param red_team: Redteam to be run. Required. 
+ :type red_team: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: RedTeam. The RedTeam is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.RedTeam + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @distributed_trace + def create(self, red_team: Union[_models.RedTeam, JSON, IO[bytes]], **kwargs: Any) -> _models.RedTeam: + """Creates a redteam run. + + :param red_team: Redteam to be run. Is one of the following types: RedTeam, JSON, IO[bytes] + Required. + :type red_team: ~azure.ai.projects.models.RedTeam or JSON or IO[bytes] + :return: RedTeam. The RedTeam is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.RedTeam + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.RedTeam] = kwargs.pop("cls", None) + + content_type = content_type or "application/json" + _content = None + if isinstance(red_team, (IOBase, bytes)): + _content = red_team + else: + _content = json.dumps(red_team, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_red_teams_create_request( + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + 
_decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [201]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.RedTeam, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + +class BetaSchedulesOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.AIProjectClient`'s + :attr:`schedules` attribute. + """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + + @distributed_trace + def delete(self, schedule_id: str, **kwargs: Any) -> None: # pylint: disable=inconsistent-return-statements + """Delete a schedule. + + :param schedule_id: Identifier of the schedule. Required. 
+ :type schedule_id: str + :return: None + :rtype: None + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[None] = kwargs.pop("cls", None) + + _request = build_beta_schedules_delete_request( + schedule_id=schedule_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [204]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if cls: + return cls(pipeline_response, None, {}) # type: ignore + + @distributed_trace + def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: + """Get a schedule by id. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :return: Schedule. 
The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) + + _request = build_beta_schedules_get_request( + schedule_id=schedule_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.Schedule, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace + def list( + self, + *, + type: Optional[Union[str, _models.ScheduleTaskType]] = None, + enabled: Optional[bool] = None, + **kwargs: Any + ) -> ItemPaged["_models.Schedule"]: + """List all schedules. 
+ + :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". + Default value is None. + :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType + :keyword enabled: Filter by the enabled status. Default value is None. + :paramtype enabled: bool + :return: An iterator like instance of Schedule + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.Schedule] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[List[_models.Schedule]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_schedules_list_request( + type=type, + enabled=enabled, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = 
self._client.format_url(_request.url, **path_format_arguments) + + return _request + + def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.Schedule], + deserialized.get("value", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("nextLink") or None, iter(list_of_elem) + + def get_next(next_link=None): + _request = prepare_request(next_link) + + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response + + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + return pipeline_response + + return ItemPaged(get_next, extract_data) + + @overload + def create_or_update( + self, schedule_id: str, schedule: _models.Schedule, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :param schedule: The resource instance. Required. + :type schedule: ~azure.ai.projects.models.Schedule + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: Schedule. The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create_or_update( + self, schedule_id: str, schedule: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. 
+ :type schedule_id: str + :param schedule: The resource instance. Required. + :type schedule: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: Schedule. The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create_or_update( + self, schedule_id: str, schedule: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :param schedule: The resource instance. Required. + :type schedule: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: Schedule. The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @distributed_trace + def create_or_update( + self, schedule_id: str, schedule: Union[_models.Schedule, JSON, IO[bytes]], **kwargs: Any + ) -> _models.Schedule: + """Create or update operation template. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :param schedule: The resource instance. Is one of the following types: Schedule, JSON, + IO[bytes] Required. + :type schedule: ~azure.ai.projects.models.Schedule or JSON or IO[bytes] + :return: Schedule. 
The Schedule is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.Schedule + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) + _params = kwargs.pop("params", {}) or {} + + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) + + content_type = content_type or "application/json" + _content = None + if isinstance(schedule, (IOBase, bytes)): + _content = schedule + else: + _content = json.dumps(schedule, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_schedules_create_or_update_request( + schedule_id=schedule_id, + content_type=content_type, + api_version=self._config.api_version, + content=_content, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200, 201]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + if _stream: + deserialized = response.iter_bytes() if _decompress else 
response.iter_raw() + else: + deserialized = _deserialize(_models.Schedule, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace + def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models.ScheduleRun: + """Get a schedule run by id. + + :param schedule_id: The unique identifier of the schedule. Required. + :type schedule_id: str + :param run_id: The unique identifier of the schedule run. Required. + :type run_id: str + :return: ScheduleRun. The ScheduleRun is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ScheduleRun + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[_models.ScheduleRun] = kwargs.pop("cls", None) + + _request = build_beta_schedules_get_run_request( + schedule_id=schedule_id, + run_id=run_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass + map_error(status_code=response.status_code, 
response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.ScheduleRun, response.json()) + + if cls: + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore + + @distributed_trace + def list_runs( + self, + schedule_id: str, + *, + type: Optional[Union[str, _models.ScheduleTaskType]] = None, + enabled: Optional[bool] = None, + **kwargs: Any + ) -> ItemPaged["_models.ScheduleRun"]: + """List all schedule runs. + + :param schedule_id: Identifier of the schedule. Required. + :type schedule_id: str + :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". + Default value is None. + :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType + :keyword enabled: Filter by the enabled status. Default value is None. 
+ :paramtype enabled: bool + :return: An iterator like instance of ScheduleRun + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.ScheduleRun] + :raises ~azure.core.exceptions.HttpResponseError: + """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[List[_models.ScheduleRun]] = kwargs.pop("cls", None) + + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + def prepare_request(next_link=None): + if not next_link: + + _request = build_beta_schedules_list_runs_request( + schedule_id=schedule_id, + type=type, + enabled=enabled, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + else: + # make call to next link with the client's api-version + _parsed_next_link = urllib.parse.urlparse(next_link) + _next_request_params = case_insensitive_dict( + { + key: [urllib.parse.quote(v) for v in value] + for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() + } + ) + _next_request_params["api-version"] = self._config.api_version + _request = HttpRequest( + "GET", + urllib.parse.urljoin(next_link, _parsed_next_link.path), + params=_next_request_params, + headers=_headers, + ) + path_format_arguments = { + "endpoint": self._serialize.url( + "self._config.endpoint", self._config.endpoint, "str", skip_quote=True + ), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) - return deserialized # type: ignore + return _request + + def extract_data(pipeline_response): + deserialized = 
pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.ScheduleRun], + deserialized.get("value", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("nextLink") or None, iter(list_of_elem) + + def get_next(next_link=None): + _request = prepare_request(next_link) + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response -class BetaSchedulesOperations: + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + raise HttpResponseError(response=response) + + return pipeline_response + + return ItemPaged(get_next, extract_data) + + +class BetaToolboxesOperations: """ .. warning:: **DO NOT** instantiate this class directly. Instead, you should access the following operations through :class:`~azure.ai.projects.AIProjectClient`'s - :attr:`schedules` attribute. + :attr:`toolboxes` attribute. """ def __init__(self, *args, **kwargs) -> None: @@ -8929,14 +10684,108 @@ def __init__(self, *args, **kwargs) -> None: self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + @overload + def create_version( + self, + name: str, + *, + tools: List[_models.Tool], + content_type: str = "application/json", + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + policies: Optional[_models.ToolboxPolicies] = None, + **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. + :type name: str + :keyword tools: The list of tools to include in this version. 
Required. + :paramtype tools: list[~azure.ai.projects.models.Tool] + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :keyword description: A human-readable description of the toolbox. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is + None. + :paramtype metadata: dict[str, str] + :keyword policies: Policy configuration for this toolbox version. Default value is None. + :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies + :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create_version( + self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. + :type name: str + :param body: Required. + :type body: JSON + :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. + Default value is "application/json". + :paramtype content_type: str + :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :raises ~azure.core.exceptions.HttpResponseError: + """ + + @overload + def create_version( + self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. 
+ Required. + :type name: str + :param body: Required. + :type body: IO[bytes] + :keyword content_type: Body Parameter content-type. Content type parameter for binary body. + Default value is "application/json". + :paramtype content_type: str + :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :raises ~azure.core.exceptions.HttpResponseError: + """ + @distributed_trace - def delete(self, schedule_id: str, **kwargs: Any) -> None: # pylint: disable=inconsistent-return-statements - """Delete a schedule. + def create_version( + self, + name: str, + body: Union[JSON, IO[bytes]] = _Unset, + *, + tools: List[_models.Tool] = _Unset, + description: Optional[str] = None, + metadata: Optional[dict[str, str]] = None, + policies: Optional[_models.ToolboxPolicies] = None, + **kwargs: Any + ) -> _models.ToolboxVersionObject: + """Create a new version of a toolbox. If the toolbox does not exist, it will be created. - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :return: None - :rtype: None + :param name: The name of the toolbox. If the toolbox does not exist, it will be created. + Required. + :type name: str + :param body: Is either a JSON type or a IO[bytes] type. Required. + :type body: JSON or IO[bytes] + :keyword tools: The list of tools to include in this version. Required. + :paramtype tools: list[~azure.ai.projects.models.Tool] + :keyword description: A human-readable description of the toolbox. Default value is None. + :paramtype description: str + :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is + None. + :paramtype metadata: dict[str, str] + :keyword policies: Policy configuration for this toolbox version. Default value is None. + :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies + :return: ToolboxVersionObject. 
The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8947,14 +10796,29 @@ def delete(self, schedule_id: str, **kwargs: Any) -> None: # pylint: disable=in } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = kwargs.pop("headers", {}) or {} + _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) _params = kwargs.pop("params", {}) or {} - cls: ClsType[None] = kwargs.pop("cls", None) + content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) + cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) - _request = build_beta_schedules_delete_request( - schedule_id=schedule_id, + if body is _Unset: + if tools is _Unset: + raise TypeError("missing required argument: tools") + body = {"description": description, "metadata": metadata, "policies": policies, "tools": tools} + body = {k: v for k, v in body.items() if v is not None} + content_type = content_type or "application/json" + _content = None + if isinstance(body, (IOBase, bytes)): + _content = body + else: + _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + + _request = build_beta_toolboxes_create_version_request( + name=name, + content_type=content_type, api_version=self._config.api_version, + content=_content, headers=_headers, params=_params, ) @@ -8963,28 +10827,45 @@ def delete(self, schedule_id: str, **kwargs: Any) -> None: # pylint: disable=in } _request.url = self._client.format_url(_request.url, **path_format_arguments) - _stream = False + _decompress = kwargs.pop("decompress", True) + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) response = pipeline_response.http_response - if response.status_code not in [204]: + if 
response.status_code not in [200]: + if _stream: + try: + response.read() # Load the body in memory and close the socket + except (StreamConsumedError, StreamClosedError): + pass map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.ToolboxVersionObject, response.json()) if cls: - return cls(pipeline_response, None, {}) # type: ignore + return cls(pipeline_response, deserialized, {}) # type: ignore + + return deserialized # type: ignore @distributed_trace - def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: - """Get a schedule by id. + def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: + """Retrieve a toolbox. - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule + :param name: The name of the toolbox to retrieve. Required. + :type name: str + :return: ToolboxObject. 
The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -8998,10 +10879,10 @@ def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) + cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None) - _request = build_beta_schedules_get_request( - schedule_id=schedule_id, + _request = build_beta_toolboxes_get_request( + name=name, api_version=self._config.api_version, headers=_headers, params=_params, @@ -9026,12 +10907,16 @@ def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: except (StreamConsumedError, StreamClosedError): pass map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.Schedule, response.json()) + deserialized = _deserialize(_models.ToolboxObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -9042,25 +10927,35 @@ def get(self, schedule_id: str, **kwargs: Any) -> _models.Schedule: def list( self, *, - type: Optional[Union[str, _models.ScheduleTaskType]] = None, - enabled: Optional[bool] = None, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, **kwargs: Any - ) -> ItemPaged["_models.Schedule"]: - """List all schedules. + ) -> ItemPaged["_models.ToolboxObject"]: + """List all toolboxes. - :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". 
+ :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. Default value is None. - :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType - :keyword enabled: Filter by the enabled status. Default value is None. - :paramtype enabled: bool - :return: An iterator like instance of Schedule - :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.Schedule] + :paramtype before: str + :return: An iterator like instance of ToolboxObject + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.ToolboxObject] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.Schedule]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.ToolboxObject]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -9070,60 +10965,35 @@ def list( } error_map.update(kwargs.pop("error_map", {}) or {}) - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_schedules_list_request( - type=type, - enabled=enabled, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", 
skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) + def prepare_request(_continuation_token=None): + _request = build_beta_toolboxes_list_request( + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) return _request def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.Schedule], - deserialized.get("value", []), + List[_models.ToolboxObject], + deserialized.get("data", []), ) if cls: list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, iter(list_of_elem) + return deserialized.get("last_id") or None, iter(list_of_elem) - def get_next(next_link=None): - _request = prepare_request(next_link) + def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) _stream = False pipeline_response: PipelineResponse = 
self._client._pipeline.run( # pylint: disable=protected-access @@ -9133,81 +11003,53 @@ def get_next(next_link=None): if response.status_code not in [200]: map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) return pipeline_response return ItemPaged(get_next, extract_data) - @overload - def create_or_update( - self, schedule_id: str, schedule: _models.Schedule, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. - - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Required. - :type schedule: ~azure.ai.projects.models.Schedule - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - def create_or_update( - self, schedule_id: str, schedule: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. - - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Required. - :type schedule: JSON - :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. - Default value is "application/json". - :paramtype content_type: str - :return: Schedule. 
The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule - :raises ~azure.core.exceptions.HttpResponseError: - """ - - @overload - def create_or_update( - self, schedule_id: str, schedule: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. + @distributed_trace + def list_versions( + self, + name: str, + *, + limit: Optional[int] = None, + order: Optional[Union[str, _models.PageOrder]] = None, + before: Optional[str] = None, + **kwargs: Any + ) -> ItemPaged["_models.ToolboxVersionObject"]: + """List all versions of a toolbox. - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Required. - :type schedule: IO[bytes] - :keyword content_type: Body Parameter content-type. Content type parameter for binary body. - Default value is "application/json". - :paramtype content_type: str - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule + :param name: The name of the toolbox to list versions for. Required. + :type name: str + :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and + 100, and the + default is 20. Default value is None. + :paramtype limit: int + :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for + ascending order and ``desc`` + for descending order. Known values are: "asc" and "desc". Default value is None. + :paramtype order: str or ~azure.ai.projects.models.PageOrder + :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your + place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + Default value is None. 
+ :paramtype before: str + :return: An iterator like instance of ToolboxVersionObject + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.ToolboxVersionObject] :raises ~azure.core.exceptions.HttpResponseError: """ + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} - @distributed_trace - def create_or_update( - self, schedule_id: str, schedule: Union[_models.Schedule, JSON, IO[bytes]], **kwargs: Any - ) -> _models.Schedule: - """Create or update operation template. + cls: ClsType[List[_models.ToolboxVersionObject]] = kwargs.pop("cls", None) - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :param schedule: The resource instance. Is one of the following types: Schedule, JSON, - IO[bytes] Required. - :type schedule: ~azure.ai.projects.models.Schedule or JSON or IO[bytes] - :return: Schedule. The Schedule is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.Schedule - :raises ~azure.core.exceptions.HttpResponseError: - """ error_map: MutableMapping = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, @@ -9216,69 +11058,65 @@ def create_or_update( } error_map.update(kwargs.pop("error_map", {}) or {}) - _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {}) - _params = kwargs.pop("params", {}) or {} - - content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.Schedule] = kwargs.pop("cls", None) - - content_type = content_type or "application/json" - _content = None - if isinstance(schedule, (IOBase, bytes)): - _content = schedule - else: - _content = json.dumps(schedule, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + def prepare_request(_continuation_token=None): - _request = build_beta_schedules_create_or_update_request( - schedule_id=schedule_id, - content_type=content_type, - api_version=self._config.api_version, - content=_content, - headers=_headers, - params=_params, - ) 
- path_format_arguments = { - "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) + _request = build_beta_toolboxes_list_versions_request( + name=name, + limit=limit, + order=order, + after=_continuation_token, + before=before, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + return _request - _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", False) - pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) + def extract_data(pipeline_response): + deserialized = pipeline_response.http_response.json() + list_of_elem = _deserialize( + List[_models.ToolboxVersionObject], + deserialized.get("data", []), + ) + if cls: + list_of_elem = cls(list_of_elem) # type: ignore + return deserialized.get("last_id") or None, iter(list_of_elem) - response = pipeline_response.http_response + def get_next(_continuation_token=None): + _request = prepare_request(_continuation_token) - if response.status_code not in [200, 201]: - if _stream: - try: - response.read() # Load the body in memory and close the socket - except (StreamConsumedError, StreamClosedError): - pass - map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + response = pipeline_response.http_response - if _stream: - deserialized = response.iter_bytes() if _decompress else response.iter_raw() - 
else: - deserialized = _deserialize(_models.Schedule, response.json()) + if response.status_code not in [200]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) - if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return pipeline_response - return deserialized # type: ignore + return ItemPaged(get_next, extract_data) @distributed_trace - def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models.ScheduleRun: - """Get a schedule run by id. + def get_version(self, name: str, version: str, **kwargs: Any) -> _models.ToolboxVersionObject: + """Retrieve a specific version of a toolbox. - :param schedule_id: The unique identifier of the schedule. Required. - :type schedule_id: str - :param run_id: The unique identifier of the schedule run. Required. - :type run_id: str - :return: ScheduleRun. The ScheduleRun is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ScheduleRun + :param name: The name of the toolbox. Required. + :type name: str + :param version: The version identifier to retrieve. Required. + :type version: str + :return: ToolboxVersionObject. 
The ToolboxVersionObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxVersionObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -9292,11 +11130,11 @@ def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models.Sched _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.ScheduleRun] = kwargs.pop("cls", None) + cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) - _request = build_beta_schedules_get_run_request( - schedule_id=schedule_id, - run_id=run_id, + _request = build_beta_toolboxes_get_version_request( + name=name, + version=version, api_version=self._config.api_version, headers=_headers, params=_params, @@ -9330,238 +11168,83 @@ def get_run(self, schedule_id: str, run_id: str, **kwargs: Any) -> _models.Sched if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.ScheduleRun, response.json()) + deserialized = _deserialize(_models.ToolboxVersionObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore - @distributed_trace - def list_runs( - self, - schedule_id: str, - *, - type: Optional[Union[str, _models.ScheduleTaskType]] = None, - enabled: Optional[bool] = None, - **kwargs: Any - ) -> ItemPaged["_models.ScheduleRun"]: - """List all schedule runs. - - :param schedule_id: Identifier of the schedule. Required. - :type schedule_id: str - :keyword type: Filter by the type of schedule. Known values are: "Evaluation" and "Insight". - Default value is None. - :paramtype type: str or ~azure.ai.projects.models.ScheduleTaskType - :keyword enabled: Filter by the enabled status. Default value is None. 
- :paramtype enabled: bool - :return: An iterator like instance of ScheduleRun - :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.ScheduleRun] - :raises ~azure.core.exceptions.HttpResponseError: - """ - _headers = kwargs.pop("headers", {}) or {} - _params = kwargs.pop("params", {}) or {} - - cls: ClsType[List[_models.ScheduleRun]] = kwargs.pop("cls", None) - - error_map: MutableMapping = { - 401: ClientAuthenticationError, - 404: ResourceNotFoundError, - 409: ResourceExistsError, - 304: ResourceNotModifiedError, - } - error_map.update(kwargs.pop("error_map", {}) or {}) - - def prepare_request(next_link=None): - if not next_link: - - _request = build_beta_schedules_list_runs_request( - schedule_id=schedule_id, - type=type, - enabled=enabled, - api_version=self._config.api_version, - headers=_headers, - params=_params, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - else: - # make call to next link with the client's api-version - _parsed_next_link = urllib.parse.urlparse(next_link) - _next_request_params = case_insensitive_dict( - { - key: [urllib.parse.quote(v) for v in value] - for key, value in urllib.parse.parse_qs(_parsed_next_link.query).items() - } - ) - _next_request_params["api-version"] = self._config.api_version - _request = HttpRequest( - "GET", - urllib.parse.urljoin(next_link, _parsed_next_link.path), - params=_next_request_params, - headers=_headers, - ) - path_format_arguments = { - "endpoint": self._serialize.url( - "self._config.endpoint", self._config.endpoint, "str", skip_quote=True - ), - } - _request.url = self._client.format_url(_request.url, **path_format_arguments) - - return _request - - def extract_data(pipeline_response): - deserialized = pipeline_response.http_response.json() - list_of_elem = _deserialize( - List[_models.ScheduleRun], - 
deserialized.get("value", []), - ) - if cls: - list_of_elem = cls(list_of_elem) # type: ignore - return deserialized.get("nextLink") or None, iter(list_of_elem) - - def get_next(next_link=None): - _request = prepare_request(next_link) - - _stream = False - pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access - _request, stream=_stream, **kwargs - ) - response = pipeline_response.http_response - - if response.status_code not in [200]: - map_error(status_code=response.status_code, response=response, error_map=error_map) - raise HttpResponseError(response=response) - - return pipeline_response - - return ItemPaged(get_next, extract_data) - - -class BetaToolboxesOperations: - """ - .. warning:: - **DO NOT** instantiate this class directly. - - Instead, you should access the following operations through - :class:`~azure.ai.projects.AIProjectClient`'s - :attr:`toolboxes` attribute. - """ - - def __init__(self, *args, **kwargs) -> None: - input_args = list(args) - self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") - self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") - self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") - self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") - @overload - def create_version( - self, - name: str, - *, - tools: List[_models.Tool], - content_type: str = "application/json", - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - policies: Optional[_models.ToolboxPolicies] = None, - **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. 
+ def update( + self, name: str, *, default_version: str, content_type: str = "application/json", **kwargs: Any + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. - :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str - :keyword tools: The list of tools to include in this version. Required. - :paramtype tools: list[~azure.ai.projects.models.Tool] + :keyword default_version: The version identifier that the toolbox should point to. When set, + the toolbox's default version will resolve to this version instead of the latest. Required. + :paramtype default_version: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :keyword description: A human-readable description of the toolbox. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is - None. - :paramtype metadata: dict[str, str] - :keyword policies: Policy configuration for this toolbox version. Default value is None. - :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def create_version( + def update( self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. 
- :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str :param body: Required. :type body: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def create_version( + def update( self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. - :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str :param body: Required. :type body: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :return: ToolboxObject. 
The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace - def create_version( - self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, - *, - tools: List[_models.Tool] = _Unset, - description: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, - policies: Optional[_models.ToolboxPolicies] = None, - **kwargs: Any - ) -> _models.ToolboxVersionObject: - """Create a new version of a toolbox. If the toolbox does not exist, it will be created. + def update( + self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, default_version: str = _Unset, **kwargs: Any + ) -> _models.ToolboxObject: + """Update a toolbox to point to a specific version. - :param name: The name of the toolbox. If the toolbox does not exist, it will be created. - Required. + :param name: The name of the toolbox to update. Required. :type name: str :param body: Is either a JSON type or a IO[bytes] type. Required. :type body: JSON or IO[bytes] - :keyword tools: The list of tools to include in this version. Required. - :paramtype tools: list[~azure.ai.projects.models.Tool] - :keyword description: A human-readable description of the toolbox. Default value is None. - :paramtype description: str - :keyword metadata: Arbitrary key-value metadata to associate with the toolbox. Default value is - None. - :paramtype metadata: dict[str, str] - :keyword policies: Policy configuration for this toolbox version. Default value is None. - :paramtype policies: ~azure.ai.projects.models.ToolboxPolicies - :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.ToolboxVersionObject + :keyword default_version: The version identifier that the toolbox should point to. When set, + the toolbox's default version will resolve to this version instead of the latest. Required. 
+ :paramtype default_version: str + :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.ToolboxObject :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -9576,12 +11259,12 @@ def create_version( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None) + cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None) if body is _Unset: - if tools is _Unset: - raise TypeError("missing required argument: tools") - body = {"description": description, "metadata": metadata, "policies": policies, "tools": tools} + if default_version is _Unset: + raise TypeError("missing required argument: default_version") + body = {"default_version": default_version} body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None @@ -9590,7 +11273,7 @@ def create_version( else: _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_toolboxes_create_version_request( + _request = build_beta_toolboxes_update_request( name=name, content_type=content_type, api_version=self._config.api_version, @@ -9627,7 +11310,7 @@ def create_version( if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.ToolboxVersionObject, response.json()) + deserialized = _deserialize(_models.ToolboxObject, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore @@ -9635,13 +11318,13 @@ def create_version( return deserialized # type: ignore @distributed_trace - def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject: - """Retrieve a toolbox. 
+    def delete(self, name: str, **kwargs: Any) -> None:  # pylint: disable=inconsistent-return-statements
+        """Delete a toolbox and all its versions.

-        :param name: The name of the toolbox to retrieve. Required.
+        :param name: The name of the toolbox to delete. Required.
         :type name: str
-        :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.ToolboxObject
+        :return: None
+        :rtype: None
         :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -9655,9 +11338,9 @@ def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject:
         _headers = kwargs.pop("headers", {}) or {}
         _params = kwargs.pop("params", {}) or {}

-        cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None)
+        cls: ClsType[None] = kwargs.pop("cls", None)

-        _request = build_beta_toolboxes_get_request(
+        _request = build_beta_toolboxes_delete_request(
             name=name,
             api_version=self._config.api_version,
             headers=_headers,
@@ -9668,20 +11351,14 @@ def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject:
         }
         _request.url = self._client.format_url(_request.url, **path_format_arguments)

-        _decompress = kwargs.pop("decompress", True)
-        _stream = kwargs.pop("stream", False)
+        _stream = False
         pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
             _request, stream=_stream, **kwargs
         )

         response = pipeline_response.http_response

-        if response.status_code not in [200]:
-            if _stream:
-                try:
-                    response.read()  # Load the body in memory and close the socket
-                except (StreamConsumedError, StreamClosedError):
-                    pass
+        if response.status_code not in [204]:
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             error = _failsafe_deserialize(
                 _models.ApiErrorResponse,
@@ -9689,50 +11366,23 @@ def get(self, name: str, **kwargs: Any) -> _models.ToolboxObject:
             )
             raise HttpResponseError(response=response, model=error)

-        if _stream:
-            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
-        else:
-            deserialized = _deserialize(_models.ToolboxObject, response.json())
-
         if cls:
-            return cls(pipeline_response, deserialized, {})  # type: ignore
-
-        return deserialized  # type: ignore
+            return cls(pipeline_response, None, {})  # type: ignore

     @distributed_trace
-    def list(
-        self,
-        *,
-        limit: Optional[int] = None,
-        order: Optional[Union[str, _models.PageOrder]] = None,
-        before: Optional[str] = None,
-        **kwargs: Any
-    ) -> ItemPaged["_models.ToolboxObject"]:
-        """List all toolboxes.
+    def delete_version(  # pylint: disable=inconsistent-return-statements
+        self, name: str, version: str, **kwargs: Any
+    ) -> None:
+        """Delete a specific version of a toolbox.

-        :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and
-         100, and the
-         default is 20. Default value is None.
-        :paramtype limit: int
-        :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for
-         ascending order and``desc``
-         for descending order. Known values are: "asc" and "desc". Default value is None.
-        :paramtype order: str or ~azure.ai.projects.models.PageOrder
-        :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your
-         place in the list.
-         For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
-         subsequent call can include before=obj_foo in order to fetch the previous page of the list.
-         Default value is None.
-        :paramtype before: str
-        :return: An iterator like instance of ToolboxObject
-        :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.ToolboxObject]
+        :param name: The name of the toolbox. Required.
+        :type name: str
+        :param version: The version identifier to delete. Required.
+        :type version: str
+        :return: None
+        :rtype: None
         :raises ~azure.core.exceptions.HttpResponseError:
         """
-        _headers = kwargs.pop("headers", {}) or {}
-        _params = kwargs.pop("params", {}) or {}
-
-        cls: ClsType[List[_models.ToolboxObject]] = kwargs.pop("cls", None)
-
         error_map: MutableMapping = {
             401: ClientAuthenticationError,
             404: ResourceNotFoundError,
             409: ResourceExistsError,
@@ -9741,158 +11391,153 @@ def list(
         }
         error_map.update(kwargs.pop("error_map", {}) or {})

-        def prepare_request(_continuation_token=None):
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}

-            _request = build_beta_toolboxes_list_request(
-                limit=limit,
-                order=order,
-                after=_continuation_token,
-                before=before,
-                api_version=self._config.api_version,
-                headers=_headers,
-                params=_params,
-            )
-            path_format_arguments = {
-                "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
-            }
-            _request.url = self._client.format_url(_request.url, **path_format_arguments)
-            return _request
+        cls: ClsType[None] = kwargs.pop("cls", None)

-        def extract_data(pipeline_response):
-            deserialized = pipeline_response.http_response.json()
-            list_of_elem = _deserialize(
-                List[_models.ToolboxObject],
-                deserialized.get("data", []),
-            )
-            if cls:
-                list_of_elem = cls(list_of_elem)  # type: ignore
-            return deserialized.get("last_id") or None, iter(list_of_elem)
+        _request = build_beta_toolboxes_delete_version_request(
+            name=name,
+            version=version,
+            api_version=self._config.api_version,
+            headers=_headers,
+            params=_params,
+        )
+        path_format_arguments = {
+            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+        }
+        _request.url = self._client.format_url(_request.url, **path_format_arguments)

-        def get_next(_continuation_token=None):
-            _request = prepare_request(_continuation_token)
+        _stream = False
+        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+            _request, stream=_stream, **kwargs
+        )

-            _stream = False
-            pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
-                _request, stream=_stream, **kwargs
+        response = pipeline_response.http_response
+
+        if response.status_code not in [204]:
+            map_error(status_code=response.status_code, response=response, error_map=error_map)
+            error = _failsafe_deserialize(
+                _models.ApiErrorResponse,
+                response,
             )
-            response = pipeline_response.http_response
+            raise HttpResponseError(response=response, model=error)

-            if response.status_code not in [200]:
-                map_error(status_code=response.status_code, response=response, error_map=error_map)
-                error = _failsafe_deserialize(
-                    _models.ApiErrorResponse,
-                    response,
-                )
-                raise HttpResponseError(response=response, model=error)
+        if cls:
+            return cls(pipeline_response, None, {})  # type: ignore

-            return pipeline_response

-        return ItemPaged(get_next, extract_data)
+
+class BetaSkillsOperations:
+    """
+    .. warning::
+        **DO NOT** instantiate this class directly.

-    @distributed_trace
-    def list_versions(
+    Instead, you should access the following operations through
+    :class:`~azure.ai.projects.AIProjectClient`'s
+    :attr:`skills` attribute.
+    """
+
+    def __init__(self, *args, **kwargs) -> None:
+        input_args = list(args)
+        self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client")
+        self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config")
+        self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
+        self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
+
+    @overload
+    def create(
         self,
-        name: str,
         *,
-        limit: Optional[int] = None,
-        order: Optional[Union[str, _models.PageOrder]] = None,
-        before: Optional[str] = None,
+        name: str,
+        content_type: str = "application/json",
+        description: Optional[str] = None,
+        instructions: Optional[str] = None,
+        metadata: Optional[dict[str, str]] = None,
         **kwargs: Any
-    ) -> ItemPaged["_models.ToolboxVersionObject"]:
-        """List all versions of a toolbox.
-
-        :param name: The name of the toolbox to list versions for. Required.
-        :type name: str
-        :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and
-         100, and the
-         default is 20. Default value is None.
-        :paramtype limit: int
-        :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for
-         ascending order and``desc``
-         for descending order. Known values are: "asc" and "desc". Default value is None.
-        :paramtype order: str or ~azure.ai.projects.models.PageOrder
-        :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your
-         place in the list.
-         For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
-         subsequent call can include before=obj_foo in order to fetch the previous page of the list.
-         Default value is None.
-        :paramtype before: str
-        :return: An iterator like instance of ToolboxVersionObject
-        :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.ToolboxVersionObject]
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-        _headers = kwargs.pop("headers", {}) or {}
-        _params = kwargs.pop("params", {}) or {}
-
-        cls: ClsType[List[_models.ToolboxVersionObject]] = kwargs.pop("cls", None)
-
-        error_map: MutableMapping = {
-            401: ClientAuthenticationError,
-            404: ResourceNotFoundError,
-            409: ResourceExistsError,
-            304: ResourceNotModifiedError,
-        }
-        error_map.update(kwargs.pop("error_map", {}) or {})
-
-        def prepare_request(_continuation_token=None):
-
-            _request = build_beta_toolboxes_list_versions_request(
-                name=name,
-                limit=limit,
-                order=order,
-                after=_continuation_token,
-                before=before,
-                api_version=self._config.api_version,
-                headers=_headers,
-                params=_params,
-            )
-            path_format_arguments = {
-                "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
-            }
-            _request.url = self._client.format_url(_request.url, **path_format_arguments)
-            return _request
+    ) -> _models.SkillObject:
+        """Creates a skill.

-        def extract_data(pipeline_response):
-            deserialized = pipeline_response.http_response.json()
-            list_of_elem = _deserialize(
-                List[_models.ToolboxVersionObject],
-                deserialized.get("data", []),
-            )
-            if cls:
-                list_of_elem = cls(list_of_elem)  # type: ignore
-            return deserialized.get("last_id") or None, iter(list_of_elem)
+        :keyword name: The unique name of the skill. Required.
+        :paramtype name: str
+        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :keyword description: A human-readable description of the skill. Default value is None.
+        :paramtype description: str
+        :keyword instructions: Instructions that define the behavior of the skill. Default value is
+         None.
+        :paramtype instructions: str
+        :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be
+         useful for storing additional information about the object in a structured
+         format, and querying for objects via API or the dashboard.

-        def get_next(_continuation_token=None):
-            _request = prepare_request(_continuation_token)
+         Keys are strings with a maximum length of 64 characters. Values are strings
+         with a maximum length of 512 characters. Default value is None.
+        :paramtype metadata: dict[str, str]
+        :return: SkillObject. The SkillObject is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.SkillObject
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """

-            _stream = False
-            pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
-                _request, stream=_stream, **kwargs
-            )
-            response = pipeline_response.http_response
+    @overload
+    def create(self, body: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.SkillObject:
+        """Creates a skill.

-            if response.status_code not in [200]:
-                map_error(status_code=response.status_code, response=response, error_map=error_map)
-                error = _failsafe_deserialize(
-                    _models.ApiErrorResponse,
-                    response,
-                )
-                raise HttpResponseError(response=response, model=error)
+        :param body: Required.
+        :type body: JSON
+        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: SkillObject. The SkillObject is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.SkillObject
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """

-            return pipeline_response
+    @overload
+    def create(self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any) -> _models.SkillObject:
+        """Creates a skill.

-        return ItemPaged(get_next, extract_data)
+        :param body: Required.
+        :type body: IO[bytes]
+        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
+         Default value is "application/json".
+        :paramtype content_type: str
+        :return: SkillObject. The SkillObject is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.SkillObject
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """

     @distributed_trace
-    def get_version(self, name: str, version: str, **kwargs: Any) -> _models.ToolboxVersionObject:
-        """Retrieve a specific version of a toolbox.
+    def create(
+        self,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        name: str = _Unset,
+        description: Optional[str] = None,
+        instructions: Optional[str] = None,
+        metadata: Optional[dict[str, str]] = None,
+        **kwargs: Any
+    ) -> _models.SkillObject:
+        """Creates a skill.

-        :param name: The name of the toolbox. Required.
-        :type name: str
-        :param version: The version identifier to retrieve. Required.
-        :type version: str
-        :return: ToolboxVersionObject. The ToolboxVersionObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.ToolboxVersionObject
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword name: The unique name of the skill. Required.
+        :paramtype name: str
+        :keyword description: A human-readable description of the skill. Default value is None.
+        :paramtype description: str
+        :keyword instructions: Instructions that define the behavior of the skill. Default value is
+         None.
+        :paramtype instructions: str
+        :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be
+         useful for storing additional information about the object in a structured
+         format, and querying for objects via API or the dashboard.
+
+         Keys are strings with a maximum length of 64 characters. Values are strings
+         with a maximum length of 512 characters. Default value is None.
+        :paramtype metadata: dict[str, str]
+        :return: SkillObject. The SkillObject is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.SkillObject
         :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -9903,15 +11548,28 @@ def get_version(self, name: str, version: str, **kwargs: Any) -> _models.Toolbox
         }
         error_map.update(kwargs.pop("error_map", {}) or {})

-        _headers = kwargs.pop("headers", {}) or {}
+        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
         _params = kwargs.pop("params", {}) or {}

-        cls: ClsType[_models.ToolboxVersionObject] = kwargs.pop("cls", None)
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
+        cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None)

-        _request = build_beta_toolboxes_get_version_request(
-            name=name,
-            version=version,
+        if body is _Unset:
+            if name is _Unset:
+                raise TypeError("missing required argument: name")
+            body = {"description": description, "instructions": instructions, "metadata": metadata, "name": name}
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+
+        _request = build_beta_skills_create_request(
+            content_type=content_type,
             api_version=self._config.api_version,
+            content=_content,
             headers=_headers,
             params=_params,
         )
@@ -9928,7 +11586,7 @@ def get_version(self, name: str, version: str, **kwargs: Any) -> _models.Toolbox

         response = pipeline_response.http_response

-        if response.status_code not in [200]:
+        if response.status_code not in [201]:
             if _stream:
                 try:
                     response.read()  # Load the body in memory and close the socket
@@ -9944,83 +11602,21 @@ def get_version(self, name: str, version: str, **kwargs: Any) -> _models.Toolbox
         if _stream:
             deserialized = response.iter_bytes() if _decompress else response.iter_raw()
         else:
-            deserialized = _deserialize(_models.ToolboxVersionObject, response.json())
+            deserialized = _deserialize(_models.SkillObject, response.json())

         if cls:
             return cls(pipeline_response, deserialized, {})  # type: ignore

         return deserialized  # type: ignore

-    @overload
-    def update(
-        self, name: str, *, default_version: str, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.ToolboxObject:
-        """Update a toolbox to point to a specific version.
-
-        :param name: The name of the toolbox to update. Required.
-        :type name: str
-        :keyword default_version: The version identifier that the toolbox should point to. When set,
-         the toolbox's default version will resolve to this version instead of the latest. Required.
-        :paramtype default_version: str
-        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.ToolboxObject
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-    @overload
-    def update(
-        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.ToolboxObject:
-        """Update a toolbox to point to a specific version.
-
-        :param name: The name of the toolbox to update. Required.
-        :type name: str
-        :param body: Required.
-        :type body: JSON
-        :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.ToolboxObject
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-    @overload
-    def update(
-        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
-    ) -> _models.ToolboxObject:
-        """Update a toolbox to point to a specific version.
-
-        :param name: The name of the toolbox to update. Required.
-        :type name: str
-        :param body: Required.
-        :type body: IO[bytes]
-        :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
-         Default value is "application/json".
-        :paramtype content_type: str
-        :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.ToolboxObject
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
     @distributed_trace
-    def update(
-        self, name: str, body: Union[JSON, IO[bytes]] = _Unset, *, default_version: str = _Unset, **kwargs: Any
-    ) -> _models.ToolboxObject:
-        """Update a toolbox to point to a specific version.
+    def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject:
+        """Creates a skill from a zip package.

-        :param name: The name of the toolbox to update. Required.
-        :type name: str
-        :param body: Is either a JSON type or a IO[bytes] type. Required.
-        :type body: JSON or IO[bytes]
-        :keyword default_version: The version identifier that the toolbox should point to. When set,
-         the toolbox's default version will resolve to this version instead of the latest. Required.
-        :paramtype default_version: str
-        :return: ToolboxObject. The ToolboxObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.ToolboxObject
+        :param body: The zip package used to create the skill. Required.
+        :type body: bytes
+        :return: SkillObject. The SkillObject is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.SkillObject
         :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -10034,23 +11630,12 @@ def update(
         _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
         _params = kwargs.pop("params", {}) or {}

-        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
-        cls: ClsType[_models.ToolboxObject] = kwargs.pop("cls", None)
+        content_type: str = kwargs.pop("content_type", _headers.pop("Content-Type", "application/zip"))
+        cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None)

-        if body is _Unset:
-            if default_version is _Unset:
-                raise TypeError("missing required argument: default_version")
-            body = {"default_version": default_version}
-            body = {k: v for k, v in body.items() if v is not None}
-        content_type = content_type or "application/json"
-        _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
-        else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+        _content = body

-        _request = build_beta_toolboxes_update_request(
-            name=name,
+        _request = build_beta_skills_create_from_package_request(
             content_type=content_type,
             api_version=self._config.api_version,
             content=_content,
@@ -10070,7 +11655,7 @@ def update(

         response = pipeline_response.http_response

-        if response.status_code not in [200]:
+        if response.status_code not in [201]:
             if _stream:
                 try:
                     response.read()  # Load the body in memory and close the socket
@@ -10086,7 +11671,7 @@ def update(
         if _stream:
             deserialized = response.iter_bytes() if _decompress else response.iter_raw()
         else:
-            deserialized = _deserialize(_models.ToolboxObject, response.json())
+            deserialized = _deserialize(_models.SkillObject, response.json())

         if cls:
             return cls(pipeline_response, deserialized, {})  # type: ignore
@@ -10094,13 +11679,13 @@ def update(
         return deserialized  # type: ignore

     @distributed_trace
-    def delete(self, name: str, **kwargs: Any) -> None:  # pylint: disable=inconsistent-return-statements
-        """Delete a toolbox and all its versions.
+    def get(self, name: str, **kwargs: Any) -> _models.SkillObject:
+        """Retrieves a skill.

-        :param name: The name of the toolbox to delete. Required.
+        :param name: The unique name of the skill. Required.
         :type name: str
-        :return: None
-        :rtype: None
+        :return: SkillObject. The SkillObject is compatible with MutableMapping
+        :rtype: ~azure.ai.projects.models.SkillObject
         :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -10114,9 +11699,9 @@ def delete(self, name: str, **kwargs: Any) -> None:  # pylint: disable=inconsist
         _headers = kwargs.pop("headers", {}) or {}
         _params = kwargs.pop("params", {}) or {}

-        cls: ClsType[None] = kwargs.pop("cls", None)
+        cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None)

-        _request = build_beta_toolboxes_delete_request(
+        _request = build_beta_skills_get_request(
             name=name,
             api_version=self._config.api_version,
             headers=_headers,
@@ -10127,14 +11712,20 @@ def delete(self, name: str, **kwargs: Any) -> None:  # pylint: disable=inconsist
         }
         _request.url = self._client.format_url(_request.url, **path_format_arguments)

-        _stream = False
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", False)
         pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
             _request, stream=_stream, **kwargs
         )

         response = pipeline_response.http_response

-        if response.status_code not in [204]:
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             error = _failsafe_deserialize(
                 _models.ApiErrorResponse,
@@ -10142,21 +11733,24 @@ def delete(self, name: str, **kwargs: Any) -> None:  # pylint: disable=inconsist
             )
             raise HttpResponseError(response=response, model=error)

+        if _stream:
+            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+        else:
+            deserialized = _deserialize(_models.SkillObject, response.json())
+
         if cls:
-            return cls(pipeline_response, None, {})  # type: ignore
+            return cls(pipeline_response, deserialized, {})  # type: ignore

-    @distributed_trace
-    def delete_version(  # pylint: disable=inconsistent-return-statements
-        self, name: str, version: str, **kwargs: Any
-    ) -> None:
-        """Delete a specific version of a toolbox.
+        return deserialized  # type: ignore

-        :param name: The name of the toolbox. Required.
-        :type name: str
-        :param version: The version identifier to delete. Required.
-        :type version: str
-        :return: None
-        :rtype: None
+    @distributed_trace
+    def download(self, name: str, **kwargs: Any) -> Iterator[bytes]:
+        """Downloads a skill package.
+
+        :param name: The unique name of the skill. Required.
+        :type name: str
+        :return: Iterator[bytes]
+        :rtype: Iterator[bytes]
         :raises ~azure.core.exceptions.HttpResponseError:
         """
         error_map: MutableMapping = {
@@ -10170,11 +11764,10 @@ def delete_version(  # pylint: disable=inconsistent-return-statements
         _headers = kwargs.pop("headers", {}) or {}
         _params = kwargs.pop("params", {}) or {}

-        cls: ClsType[None] = kwargs.pop("cls", None)
+        cls: ClsType[Iterator[bytes]] = kwargs.pop("cls", None)

-        _request = build_beta_toolboxes_delete_version_request(
+        _request = build_beta_skills_download_request(
             name=name,
-            version=version,
             api_version=self._config.api_version,
             headers=_headers,
             params=_params,
@@ -10184,14 +11777,20 @@ def delete_version(  # pylint: disable=inconsistent-return-statements
         }
         _request.url = self._client.format_url(_request.url, **path_format_arguments)

-        _stream = False
+        _decompress = kwargs.pop("decompress", True)
+        _stream = kwargs.pop("stream", True)
         pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
             _request, stream=_stream, **kwargs
         )

         response = pipeline_response.http_response

-        if response.status_code not in [204]:
+        if response.status_code not in [200]:
+            if _stream:
+                try:
+                    response.read()  # Load the body in memory and close the socket
+                except (StreamConsumedError, StreamClosedError):
+                    pass
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             error = _failsafe_deserialize(
                 _models.ApiErrorResponse,
@@ -10199,42 +11798,121 @@ def delete_version(  # pylint: disable=inconsistent-return-statements
             )
             raise HttpResponseError(response=response, model=error)

+        response_headers = {}
+        response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type"))
+
+        deserialized = response.iter_bytes() if _decompress else response.iter_raw()
+
         if cls:
-            return cls(pipeline_response, None, {})  # type: ignore
+            return cls(pipeline_response, deserialized, response_headers)  # type: ignore

+        return deserialized  # type: ignore

-class BetaSkillsOperations:
-    """
-    .. warning::
-        **DO NOT** instantiate this class directly.
+    @distributed_trace
+    def list(
+        self,
+        *,
+        limit: Optional[int] = None,
+        order: Optional[Union[str, _models.PageOrder]] = None,
+        before: Optional[str] = None,
+        **kwargs: Any
+    ) -> ItemPaged["_models.SkillObject"]:
+        """Returns the list of all skills.

-    Instead, you should access the following operations through
-    :class:`~azure.ai.projects.AIProjectClient`'s
-    :attr:`skills` attribute.
-    """
+        :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and
+         100, and the
+         default is 20. Default value is None.
+        :paramtype limit: int
+        :keyword order: Sort order by the ``created_at`` timestamp of the objects. ``asc`` for
+         ascending order and``desc``
+         for descending order. Known values are: "asc" and "desc". Default value is None.
+        :paramtype order: str or ~azure.ai.projects.models.PageOrder
+        :keyword before: A cursor for use in pagination. ``before`` is an object ID that defines your
+         place in the list.
+         For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+         subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+         Default value is None.
+        :paramtype before: str
+        :return: An iterator like instance of SkillObject
+        :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.SkillObject]
+        :raises ~azure.core.exceptions.HttpResponseError:
+        """
+        _headers = kwargs.pop("headers", {}) or {}
+        _params = kwargs.pop("params", {}) or {}

-    def __init__(self, *args, **kwargs) -> None:
-        input_args = list(args)
-        self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client")
-        self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config")
-        self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer")
-        self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")
+        cls: ClsType[List[_models.SkillObject]] = kwargs.pop("cls", None)
+
+        error_map: MutableMapping = {
+            401: ClientAuthenticationError,
+            404: ResourceNotFoundError,
+            409: ResourceExistsError,
+            304: ResourceNotModifiedError,
+        }
+        error_map.update(kwargs.pop("error_map", {}) or {})
+
+        def prepare_request(_continuation_token=None):
+
+            _request = build_beta_skills_list_request(
+                limit=limit,
+                order=order,
+                after=_continuation_token,
+                before=before,
+                api_version=self._config.api_version,
+                headers=_headers,
+                params=_params,
+            )
+            path_format_arguments = {
+                "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
+            }
+            _request.url = self._client.format_url(_request.url, **path_format_arguments)
+            return _request
+
+        def extract_data(pipeline_response):
+            deserialized = pipeline_response.http_response.json()
+            list_of_elem = _deserialize(
+                List[_models.SkillObject],
+                deserialized.get("data", []),
+            )
+            if cls:
+                list_of_elem = cls(list_of_elem)  # type: ignore
+            return deserialized.get("last_id") or None, iter(list_of_elem)
+
+        def get_next(_continuation_token=None):
+            _request = prepare_request(_continuation_token)
+
+            _stream = False
+            pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
+                _request, stream=_stream, **kwargs
+            )
+            response = pipeline_response.http_response
+
+            if response.status_code not in [200]:
+                map_error(status_code=response.status_code, response=response, error_map=error_map)
+                error = _failsafe_deserialize(
+                    _models.ApiErrorResponse,
+                    response,
+                )
+                raise HttpResponseError(response=response, model=error)
+
+            return pipeline_response
+
+        return ItemPaged(get_next, extract_data)

     @overload
-    def create(
+    def update(
         self,
-        *,
         name: str,
+        *,
         content_type: str = "application/json",
         description: Optional[str] = None,
         instructions: Optional[str] = None,
         metadata: Optional[dict[str, str]] = None,
         **kwargs: Any
     ) -> _models.SkillObject:
-        """Creates a skill.
+        """Updates an existing skill.

-        :keyword name: The unique name of the skill. Required.
-        :paramtype name: str
+        :param name: The unique name of the skill. Required.
+        :type name: str
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -10256,9 +11934,13 @@ def create(
         """

     @overload
-    def create(self, body: JSON, *, content_type: str = "application/json", **kwargs: Any) -> _models.SkillObject:
-        """Creates a skill.
+    def update(
+        self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.SkillObject:
+        """Updates an existing skill.

+        :param name: The unique name of the skill. Required.
+        :type name: str
         :param body: Required.
         :type body: JSON
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
@@ -10270,9 +11952,13 @@ def create(self, body: JSON, *, content_type: str = "application/json", **kwargs
         """

     @overload
-    def create(self, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any) -> _models.SkillObject:
-        """Creates a skill.
+    def update(
+        self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+    ) -> _models.SkillObject:
+        """Updates an existing skill.

+        :param name: The unique name of the skill. Required.
+        :type name: str
         :param body: Required.
         :type body: IO[bytes]
         :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
@@ -10284,113 +11970,34 @@ def create(self, body: IO[bytes], *, content_type: str = "application/json", **k
         """

     @distributed_trace
-    def create(
-        self,
-        body: Union[JSON, IO[bytes]] = _Unset,
-        *,
-        name: str = _Unset,
-        description: Optional[str] = None,
-        instructions: Optional[str] = None,
-        metadata: Optional[dict[str, str]] = None,
-        **kwargs: Any
-    ) -> _models.SkillObject:
-        """Creates a skill.
-
-        :param body: Is either a JSON type or a IO[bytes] type. Required.
-        :type body: JSON or IO[bytes]
-        :keyword name: The unique name of the skill. Required.
-        :paramtype name: str
-        :keyword description: A human-readable description of the skill. Default value is None.
-        :paramtype description: str
-        :keyword instructions: Instructions that define the behavior of the skill. Default value is
-         None.
-        :paramtype instructions: str
-        :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be
-         useful for storing additional information about the object in a structured
-         format, and querying for objects via API or the dashboard.
-
-         Keys are strings with a maximum length of 64 characters. Values are strings
-         with a maximum length of 512 characters. Default value is None.
-        :paramtype metadata: dict[str, str]
-        :return: SkillObject. The SkillObject is compatible with MutableMapping
-        :rtype: ~azure.ai.projects.models.SkillObject
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-        error_map: MutableMapping = {
-            401: ClientAuthenticationError,
-            404: ResourceNotFoundError,
-            409: ResourceExistsError,
-            304: ResourceNotModifiedError,
-        }
-        error_map.update(kwargs.pop("error_map", {}) or {})
-
-        _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
-        _params = kwargs.pop("params", {}) or {}
-
-        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
-        cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None)
-
-        if body is _Unset:
-            if name is _Unset:
-                raise TypeError("missing required argument: name")
-            body = {"description": description, "instructions": instructions, "metadata": metadata, "name": name}
-            body = {k: v for k, v in body.items() if v is not None}
-        content_type = content_type or "application/json"
-        _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
-        else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
-
-        _request = build_beta_skills_create_request(
-            content_type=content_type,
-            api_version=self._config.api_version,
-            content=_content,
-            headers=_headers,
-            params=_params,
-        )
-        path_format_arguments = {
-            "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True),
-        }
-        _request.url = self._client.format_url(_request.url, **path_format_arguments)
-
-        _decompress = kwargs.pop("decompress", True)
-        _stream = kwargs.pop("stream", False)
-        pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
-            _request, stream=_stream, **kwargs
-        )
-
-        response = pipeline_response.http_response
-
-        if response.status_code not in [201]:
-            if _stream:
-                try:
-                    response.read()  # Load the body in memory and close the socket
-                except (StreamConsumedError, StreamClosedError):
-                    pass
-            map_error(status_code=response.status_code, response=response, error_map=error_map)
-            error = _failsafe_deserialize(
-                _models.ApiErrorResponse,
-                response,
-            )
-            raise HttpResponseError(response=response, model=error)
-
-        if _stream:
-            deserialized = response.iter_bytes() if _decompress else response.iter_raw()
-        else:
-            deserialized = _deserialize(_models.SkillObject, response.json())
-
-        if cls:
-            return cls(pipeline_response, deserialized, {})  # type: ignore
-
-        return deserialized  # type: ignore
+    def update(
+        self,
+        name: str,
+        body: Union[JSON, IO[bytes]] = _Unset,
+        *,
+        description: Optional[str] = None,
+        instructions: Optional[str] = None,
+        metadata: Optional[dict[str, str]] = None,
+        **kwargs: Any
+    ) -> _models.SkillObject:
+        """Updates an existing skill.

-    @distributed_trace
-    def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject:
-        """Creates a skill from a zip package.
+        :param name: The unique name of the skill. Required.
+        :type name: str
+        :param body: Is either a JSON type or a IO[bytes] type. Required.
+        :type body: JSON or IO[bytes]
+        :keyword description: A human-readable description of the skill. Default value is None.
+        :paramtype description: str
+        :keyword instructions: Instructions that define the behavior of the skill. Default value is
+         None.
+        :paramtype instructions: str
+        :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be
+         useful for storing additional information about the object in a structured
+         format, and querying for objects via API or the dashboard.

-        :param body: The zip package used to create the skill. Required.
-        :type body: bytes
+         Keys are strings with a maximum length of 64 characters. Values are strings
+         with a maximum length of 512 characters. Default value is None.
+        :paramtype metadata: dict[str, str]
         :return: SkillObject. The SkillObject is compatible with MutableMapping
         :rtype: ~azure.ai.projects.models.SkillObject
         :raises ~azure.core.exceptions.HttpResponseError:
@@ -10406,12 +12013,21 @@ def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject
         _headers = case_insensitive_dict(kwargs.pop("headers", {}) or {})
         _params = kwargs.pop("params", {}) or {}

-        content_type: str = kwargs.pop("content_type", _headers.pop("Content-Type", "application/zip"))
+        content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None))
         cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None)

-        _content = body
+        if body is _Unset:
+            body = {"description": description, "instructions": instructions, "metadata": metadata}
+            body = {k: v for k, v in body.items() if v is not None}
+        content_type = content_type or "application/json"
+        _content = None
+        if isinstance(body, (IOBase, bytes)):
+            _content = body
+        else:
+            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore

-        _request = build_beta_skills_create_from_package_request(
+        _request = build_beta_skills_update_request(
+            name=name,
             content_type=content_type,
             api_version=self._config.api_version,
             content=_content,
@@ -10431,7 +12047,7 @@ def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject

         response = pipeline_response.http_response

-        if response.status_code not in [201]:
+        if response.status_code not in [200]:
             if _stream:
                 try:
                     response.read()  # Load the body in memory and close the socket
@@ -10455,13 +12071,13 @@ def create_from_package(self, body: bytes, **kwargs: Any) -> _models.SkillObject
         return deserialized  # type: ignore

     @distributed_trace
-    def get(self, name: str, **kwargs: Any) -> _models.SkillObject:
-        """Retrieves a skill.
+    def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse:
+        """Deletes a skill.

         :param name: The unique name of the skill. Required.
         :type name: str
-        :return: SkillObject.
The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DeleteSkillResponse. The DeleteSkillResponse is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DeleteSkillResponse :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -10475,9 +12091,9 @@ def get(self, name: str, **kwargs: Any) -> _models.SkillObject: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) + cls: ClsType[_models.DeleteSkillResponse] = kwargs.pop("cls", None) - _request = build_beta_skills_get_request( + _request = build_beta_skills_delete_request( name=name, api_version=self._config.api_version, headers=_headers, @@ -10512,21 +12128,41 @@ def get(self, name: str, **kwargs: Any) -> _models.SkillObject: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.SkillObject, response.json()) + deserialized = _deserialize(_models.DeleteSkillResponse, response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore + +class BetaDatasetsOperations: + """ + .. warning:: + **DO NOT** instantiate this class directly. + + Instead, you should access the following operations through + :class:`~azure.ai.projects.AIProjectClient`'s + :attr:`datasets` attribute. 
+ """ + + def __init__(self, *args, **kwargs) -> None: + input_args = list(args) + self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") + self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") + self._serialize: Serializer = input_args.pop(0) if input_args else kwargs.pop("serializer") + self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") + @distributed_trace - def download(self, name: str, **kwargs: Any) -> Iterator[bytes]: - """Downloads a skill package. + def get_generation_job(self, job_id: str, **kwargs: Any) -> _models.DataGenerationJob: + """Get info about a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :return: Iterator[bytes] - :rtype: Iterator[bytes] + Gets the details of a data generation job by its ID. + + :param job_id: The ID of the job. Required. + :type job_id: str + :return: DataGenerationJob. 
The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -10540,10 +12176,10 @@ def download(self, name: str, **kwargs: Any) -> Iterator[bytes]: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[Iterator[bytes]] = kwargs.pop("cls", None) + cls: ClsType[_models.DataGenerationJob] = kwargs.pop("cls", None) - _request = build_beta_skills_download_request( - name=name, + _request = build_beta_datasets_get_generation_job_request( + job_id=job_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -10554,7 +12190,7 @@ def download(self, name: str, **kwargs: Any) -> Iterator[bytes]: _request.url = self._client.format_url(_request.url, **path_format_arguments) _decompress = kwargs.pop("decompress", True) - _stream = kwargs.pop("stream", True) + _stream = kwargs.pop("stream", False) pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access _request, stream=_stream, **kwargs ) @@ -10575,9 +12211,12 @@ def download(self, name: str, **kwargs: Any) -> Iterator[bytes]: raise HttpResponseError(response=response, model=error) response_headers = {} - response_headers["Content-Type"] = self._deserialize("str", response.headers.get("Content-Type")) + response_headers["Retry-After"] = self._deserialize("int", response.headers.get("Retry-After")) - deserialized = response.iter_bytes() if _decompress else response.iter_raw() + if _stream: + deserialized = response.iter_bytes() if _decompress else response.iter_raw() + else: + deserialized = _deserialize(_models.DataGenerationJob, response.json()) if cls: return cls(pipeline_response, deserialized, response_headers) # type: ignore @@ -10585,15 +12224,19 @@ def download(self, name: str, **kwargs: Any) -> Iterator[bytes]: return deserialized # type: ignore @distributed_trace - def list( + 
def list_generation_jobs( self, *, limit: Optional[int] = None, order: Optional[Union[str, _models.PageOrder]] = None, before: Optional[str] = None, + scenario: Optional[Union[str, _models.DataGenerationJobScenario]] = None, + type: Optional[List[Union[str, _models.DataGenerationJobType]]] = None, **kwargs: Any - ) -> ItemPaged["_models.SkillObject"]: - """Returns the list of all skills. + ) -> ItemPaged["_models.DataGenerationJob"]: + """Returns a list of data generation jobs. + + Returns a list of data generation jobs. :keyword limit: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the @@ -10609,14 +12252,19 @@ def list( subsequent call can include before=obj_foo in order to fetch the previous page of the list. Default value is None. :paramtype before: str - :return: An iterator like instance of SkillObject - :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.SkillObject] + :keyword scenario: Filter data generation jobs by their scenario. Known values are: + "supervised_finetuning", "reinforcement_finetuning", and "evaluation". Default value is None. + :paramtype scenario: str or ~azure.ai.projects.models.DataGenerationJobScenario + :keyword type: Filter data generation jobs by their type. Default value is None. 
+ :paramtype type: list[str or ~azure.ai.projects.models.DataGenerationJobType] + :return: An iterator like instance of DataGenerationJob + :rtype: ~azure.core.paging.ItemPaged[~azure.ai.projects.models.DataGenerationJob] :raises ~azure.core.exceptions.HttpResponseError: """ _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[List[_models.SkillObject]] = kwargs.pop("cls", None) + cls: ClsType[List[_models.DataGenerationJob]] = kwargs.pop("cls", None) error_map: MutableMapping = { 401: ClientAuthenticationError, @@ -10628,11 +12276,13 @@ def list( def prepare_request(_continuation_token=None): - _request = build_beta_skills_list_request( + _request = build_beta_datasets_list_generation_jobs_request( limit=limit, order=order, after=_continuation_token, before=before, + scenario=scenario, + type=type, api_version=self._config.api_version, headers=_headers, params=_params, @@ -10646,7 +12296,7 @@ def prepare_request(_continuation_token=None): def extract_data(pipeline_response): deserialized = pipeline_response.http_response.json() list_of_elem = _deserialize( - List[_models.SkillObject], + List[_models.DataGenerationJob], deserialized.get("data", []), ) if cls: @@ -10675,107 +12325,98 @@ def get_next(_continuation_token=None): return ItemPaged(get_next, extract_data) @overload - def update( + def create_generation_job( self, - name: str, + job: _models.DataGenerationJob, *, + operation_id: Optional[str] = None, content_type: str = "application/json", - description: Optional[str] = None, - instructions: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. + ) -> _models.DataGenerationJob: + """Creates a data generation job. - :param name: The unique name of the skill. Required. - :type name: str + Creates a data generation job. + + :param job: The job to create. Required. 
+ :type job: ~azure.ai.projects.models.DataGenerationJob + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :keyword description: A human-readable description of the skill. Default value is None. - :paramtype description: str - :keyword instructions: Instructions that define the behavior of the skill. Default value is - None. - :paramtype instructions: str - :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be - useful for storing additional information about the object in a structured - format, and querying for objects via API or the dashboard. - - Keys are strings with a maximum length of 64 characters. Values are strings - with a maximum length of 512 characters. Default value is None. - :paramtype metadata: dict[str, str] - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def update( - self, name: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. - - :param name: The unique name of the skill. Required. - :type name: str - :param body: Required. - :type body: JSON + def create_generation_job( + self, job: JSON, *, operation_id: Optional[str] = None, content_type: str = "application/json", **kwargs: Any + ) -> _models.DataGenerationJob: + """Creates a data generation job. + + Creates a data generation job. + + :param job: The job to create. Required. 
+ :type job: JSON + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @overload - def update( - self, name: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. + def create_generation_job( + self, + job: IO[bytes], + *, + operation_id: Optional[str] = None, + content_type: str = "application/json", + **kwargs: Any + ) -> _models.DataGenerationJob: + """Creates a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :param body: Required. - :type body: IO[bytes] + Creates a data generation job. + + :param job: The job to create. Required. + :type job: IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. + :paramtype operation_id: str :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :return: DataGenerationJob. 
The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ @distributed_trace - def update( + def create_generation_job( self, - name: str, - body: Union[JSON, IO[bytes]] = _Unset, + job: Union[_models.DataGenerationJob, JSON, IO[bytes]], *, - description: Optional[str] = None, - instructions: Optional[str] = None, - metadata: Optional[dict[str, str]] = None, + operation_id: Optional[str] = None, **kwargs: Any - ) -> _models.SkillObject: - """Updates an existing skill. + ) -> _models.DataGenerationJob: + """Creates a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :param body: Is either a JSON type or a IO[bytes] type. Required. - :type body: JSON or IO[bytes] - :keyword description: A human-readable description of the skill. Default value is None. - :paramtype description: str - :keyword instructions: Instructions that define the behavior of the skill. Default value is - None. - :paramtype instructions: str - :keyword metadata: Set of 16 key-value pairs that can be attached to an object. This can be - useful for storing additional information about the object in a structured - format, and querying for objects via API or the dashboard. + Creates a data generation job. - Keys are strings with a maximum length of 64 characters. Values are strings - with a maximum length of 512 characters. Default value is None. - :paramtype metadata: dict[str, str] - :return: SkillObject. The SkillObject is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.SkillObject + :param job: The job to create. Is one of the following types: DataGenerationJob, JSON, + IO[bytes] Required. + :type job: ~azure.ai.projects.models.DataGenerationJob or JSON or IO[bytes] + :keyword operation_id: Client-generated unique ID for idempotent retries. When absent, the + server creates the job unconditionally. Default value is None. 
+ :paramtype operation_id: str + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -10790,20 +12431,17 @@ def update( _params = kwargs.pop("params", {}) or {} content_type: Optional[str] = kwargs.pop("content_type", _headers.pop("Content-Type", None)) - cls: ClsType[_models.SkillObject] = kwargs.pop("cls", None) + cls: ClsType[_models.DataGenerationJob] = kwargs.pop("cls", None) - if body is _Unset: - body = {"description": description, "instructions": instructions, "metadata": metadata} - body = {k: v for k, v in body.items() if v is not None} content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(job, (IOBase, bytes)): + _content = job else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(job, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore - _request = build_beta_skills_update_request( - name=name, + _request = build_beta_datasets_create_generation_job_request( + operation_id=operation_id, content_type=content_type, api_version=self._config.api_version, content=_content, @@ -10823,7 +12461,7 @@ def update( response = pipeline_response.http_response - if response.status_code not in [200]: + if response.status_code not in [201]: if _stream: try: response.read() # Load the body in memory and close the socket @@ -10836,24 +12474,30 @@ def update( ) raise HttpResponseError(response=response, model=error) + response_headers = {} + response_headers["Operation-Location"] = self._deserialize("str", response.headers.get("Operation-Location")) + response_headers["Location"] = self._deserialize("str", response.headers.get("Location")) + if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = 
_deserialize(_models.SkillObject, response.json()) + deserialized = _deserialize(_models.DataGenerationJob, response.json()) if cls: - return cls(pipeline_response, deserialized, {}) # type: ignore + return cls(pipeline_response, deserialized, response_headers) # type: ignore return deserialized # type: ignore @distributed_trace - def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: - """Deletes a skill. + def cancel_generation_job(self, job_id: str, **kwargs: Any) -> _models.DataGenerationJob: + """Cancels a data generation job. - :param name: The unique name of the skill. Required. - :type name: str - :return: DeleteSkillResponse. The DeleteSkillResponse is compatible with MutableMapping - :rtype: ~azure.ai.projects.models.DeleteSkillResponse + Cancels a data generation job by its ID. + + :param job_id: The ID of the job to cancel. Required. + :type job_id: str + :return: DataGenerationJob. The DataGenerationJob is compatible with MutableMapping + :rtype: ~azure.ai.projects.models.DataGenerationJob :raises ~azure.core.exceptions.HttpResponseError: """ error_map: MutableMapping = { @@ -10867,10 +12511,10 @@ def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: _headers = kwargs.pop("headers", {}) or {} _params = kwargs.pop("params", {}) or {} - cls: ClsType[_models.DeleteSkillResponse] = kwargs.pop("cls", None) + cls: ClsType[_models.DataGenerationJob] = kwargs.pop("cls", None) - _request = build_beta_skills_delete_request( - name=name, + _request = build_beta_datasets_cancel_generation_job_request( + job_id=job_id, api_version=self._config.api_version, headers=_headers, params=_params, @@ -10904,9 +12548,65 @@ def delete(self, name: str, **kwargs: Any) -> _models.DeleteSkillResponse: if _stream: deserialized = response.iter_bytes() if _decompress else response.iter_raw() else: - deserialized = _deserialize(_models.DeleteSkillResponse, response.json()) + deserialized = _deserialize(_models.DataGenerationJob, 
response.json()) if cls: return cls(pipeline_response, deserialized, {}) # type: ignore return deserialized # type: ignore + + @distributed_trace + def delete_generation_job( # pylint: disable=inconsistent-return-statements + self, job_id: str, **kwargs: Any + ) -> None: + """Deletes a data generation job. + + Deletes a data generation job by its ID. + + :param job_id: The ID of the job to delete. Required. + :type job_id: str + :return: None + :rtype: None + :raises ~azure.core.exceptions.HttpResponseError: + """ + error_map: MutableMapping = { + 401: ClientAuthenticationError, + 404: ResourceNotFoundError, + 409: ResourceExistsError, + 304: ResourceNotModifiedError, + } + error_map.update(kwargs.pop("error_map", {}) or {}) + + _headers = kwargs.pop("headers", {}) or {} + _params = kwargs.pop("params", {}) or {} + + cls: ClsType[None] = kwargs.pop("cls", None) + + _request = build_beta_datasets_delete_generation_job_request( + job_id=job_id, + api_version=self._config.api_version, + headers=_headers, + params=_params, + ) + path_format_arguments = { + "endpoint": self._serialize.url("self._config.endpoint", self._config.endpoint, "str", skip_quote=True), + } + _request.url = self._client.format_url(_request.url, **path_format_arguments) + + _stream = False + pipeline_response: PipelineResponse = self._client._pipeline.run( # pylint: disable=protected-access + _request, stream=_stream, **kwargs + ) + + response = pipeline_response.http_response + + if response.status_code not in [204]: + map_error(status_code=response.status_code, response=response, error_map=error_map) + error = _failsafe_deserialize( + _models.ApiErrorResponse, + response, + ) + raise HttpResponseError(response=response, model=error) + + if cls: + return cls(pipeline_response, None, {}) # type: ignore diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch.py b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch.py index c7c234ee5494..784a8ba6b182 100644 --- 
a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch.py @@ -20,6 +20,7 @@ from ._patch_memories import BetaMemoryStoresOperations from ._patch_sessions import BetaAgentsOperations from ._operations import ( + BetaDatasetsOperations, BetaEvaluationTaxonomiesOperations, BetaEvaluatorsOperations, BetaInsightsOperations, @@ -112,6 +113,8 @@ class BetaOperations(GeneratedBetaOperations): """:class:`~azure.ai.projects.operations.BetaToolboxesOperations` operations""" skills: BetaSkillsOperations """:class:`~azure.ai.projects.operations.BetaSkillsOperations` operations""" + datasets: BetaDatasetsOperations + """:class:`~azure.ai.projects.operations.BetaDatasetsOperations` operations""" def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) @@ -133,6 +136,7 @@ def __init__(self, *args: Any, **kwargs: Any) -> None: __all__: List[str] = [ "AgentsOperations", "BetaAgentsOperations", + "BetaDatasetsOperations", "BetaEvaluationTaxonomiesOperations", "BetaEvaluatorsOperations", "BetaInsightsOperations", diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_ai_project_instrumentor.py b/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_ai_project_instrumentor.py index 1a22ca314704..c23a994b795e 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_ai_project_instrumentor.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_ai_project_instrumentor.py @@ -645,7 +645,7 @@ def _add_message_event( # pylint: disable=too-many-branches,too-many-statements attribute_name = GEN_AI_INPUT_MESSAGES # Set the attribute on the span - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): span.add_attribute(attribute_name, message_json) def _get_field(self, obj: Any, field: str) -> Any: @@ -722,7 +722,7 @@ def _add_instructions_event( # Use attributes for instructions tracing # System instructions format: 
array of content objects without role/parts wrapper message_json = json.dumps(content_array, ensure_ascii=False) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): span.add_attribute(GEN_AI_SYSTEM_MESSAGE, message_json) def _status_to_string(self, status: Any) -> str: @@ -782,7 +782,7 @@ def start_create_agent_span( # pylint: disable=too-many-locals reasoning_summary=reasoning_summary, structured_inputs=(str(structured_inputs) if structured_inputs is not None else None), ) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): span.add_attribute(GEN_AI_OPERATION_NAME, OperationName.CREATE_AGENT.value) if name: span.add_attribute(GEN_AI_AGENT_NAME, name) @@ -842,7 +842,7 @@ def start_create_thread_span( # _tool_resources: Optional["ToolResources"] = None, ) -> "Optional[AbstractSpan]": span = start_span(OperationName.CREATE_THREAD, server_address=server_address, port=port) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): for message in messages or []: self.add_thread_message_event(span, message) diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_responses_instrumentor.py b/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_responses_instrumentor.py index 95cb28183b35..fdb3cc456214 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_responses_instrumentor.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_responses_instrumentor.py @@ -523,7 +523,7 @@ def _set_attributes(self, span: "AbstractSpan", *attrs: Tuple[str, Any]) -> None def _set_span_attribute_safe(self, span: "AbstractSpan", key: str, value: Any) -> None: """Safely set a span attribute only if the value is meaningful.""" - if not span or not span.span_instance.is_recording: + if not span or not span.span_instance.is_recording(): return # Only set attribute if value exists and is meaningful @@ -846,7 +846,7 @@ def 
_add_workflow_action_events( conversation_id: Optional[str] = None, ) -> None: """Add workflow action events to the span for workflow agents.""" - if not span or not span.span_instance.is_recording: + if not span or not span.span_instance.is_recording(): return # Check if response has output items @@ -1149,7 +1149,7 @@ def _add_tool_call_events( # pylint: disable=too-many-branches conversation_id: Optional[str] = None, ) -> None: """Add tool call events to the span from response output.""" - if not span or not span.span_instance.is_recording: + if not span or not span.span_instance.is_recording(): return # Extract function calls and tool calls from response output @@ -1638,7 +1638,7 @@ def start_responses_span( gen_ai_provider=RESPONSES_PROVIDER, ) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): # Set operation name attribute (start_span doesn't set this automatically) self._set_attributes( span, @@ -2614,7 +2614,7 @@ def cleanup(self): # Join all accumulated output content complete_content = "".join(self.accumulated_output) - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): # Add tool call events if we detected any output items (tool calls, etc.) 
if self.has_output_items: # Create mock response with output items for event generation @@ -2721,7 +2721,7 @@ def __init__( ) # End span with proper status - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): self.span.span_instance.set_status( # pyright: ignore [reportPossiblyUnboundVariable] StatusCode.OK @@ -2764,7 +2764,7 @@ def __next__(self): span_attributes=span_attributes, error_type=str(type(e).__name__), ) - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): self.span.span_instance.set_status( # pyright: ignore [reportPossiblyUnboundVariable] StatusCode.ERROR, @@ -2791,7 +2791,7 @@ def _finalize_span(self): span_attributes=span_attributes, ) - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): # Note: For streaming responses, response metadata like tokens, finish_reasons # are typically not available in individual chunks, so we focus on content. @@ -3092,7 +3092,7 @@ def cleanup(self): # Join all accumulated output content complete_content = "".join(self.accumulated_output) - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): # Add tool call events if we detected any output items (tool calls, etc.) 
if self.has_output_items: # Create mock response with output items for event generation @@ -3199,7 +3199,7 @@ def __init__( ) # End span with proper status - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): self.span.span_instance.set_status( # pyright: ignore [reportPossiblyUnboundVariable] StatusCode.OK @@ -3241,7 +3241,7 @@ async def __anext__(self): span_attributes=span_attributes, error_type=str(type(e).__name__), ) - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): self.span.span_instance.set_status( # pyright: ignore [reportPossiblyUnboundVariable] StatusCode.ERROR, @@ -3268,7 +3268,7 @@ def _finalize_span(self): span_attributes=span_attributes, ) - if self.span.span_instance.is_recording: + if self.span.span_instance.is_recording(): # Note: For streaming responses, response metadata like tokens, finish_reasons # are typically not available in individual chunks, so we focus on content. @@ -3407,7 +3407,7 @@ def start_create_conversation_span( gen_ai_provider=RESPONSES_PROVIDER, ) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): self._set_span_attribute_safe(span, GEN_AI_OPERATION_NAME, OperationName.CREATE_CONVERSATION.value) return span @@ -3605,7 +3605,7 @@ def start_list_conversation_items_span( gen_ai_provider=RESPONSES_PROVIDER, ) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): # Set operation name attribute (start_span doesn't set this automatically) self._set_attributes( span, @@ -3624,7 +3624,7 @@ def _add_conversation_item_event( # pylint: disable=too-many-branches,too-many- item: Any, ) -> None: """Add a conversation item event to the span.""" - if not span or not span.span_instance.is_recording: + if not span or not span.span_instance.is_recording(): return # Extract basic item information diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_utils.py 
b/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_utils.py index 931c3d2abf7b..47047e2720c3 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_utils.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/_utils.py @@ -223,7 +223,7 @@ def start_span( schema_version=GEN_AI_SEMANTIC_CONVENTIONS_SCHEMA_VERSION, ) - if span and span.span_instance.is_recording: + if span and span.span_instance.is_recording(): span.add_attribute(AZ_NAMESPACE, AZ_NAMESPACE_VALUE) span.add_attribute(GEN_AI_PROVIDER_NAME, AGENTS_PROVIDER) diff --git a/sdk/ai/azure-ai-projects/cspell.json b/sdk/ai/azure-ai-projects/cspell.json index b3584c59e861..7decf206d14a 100644 --- a/sdk/ai/azure-ai-projects/cspell.json +++ b/sdk/ai/azure-ai-projects/cspell.json @@ -12,6 +12,7 @@ "closefd", "cogsvc", "CSDL", + "dargilco", "dedup", "evals", "FineTuning", @@ -31,8 +32,8 @@ "Tadmaq", "Udbk", "UPIA", - "xhigh", - "Vnext" + "Vnext", + "xhigh" ], "ignorePaths": [ "*.csv", diff --git a/sdk/ai/azure-ai-projects/dev_requirements.txt b/sdk/ai/azure-ai-projects/dev_requirements.txt index 3a0781d99156..6641c1e8f14a 100644 --- a/sdk/ai/azure-ai-projects/dev_requirements.txt +++ b/sdk/ai/azure-ai-projects/dev_requirements.txt @@ -8,11 +8,13 @@ azure-core-tracing-opentelemetry azure-mgmt-authorization azure-mgmt-cognitiveservices azure-mgmt-resource +azure-mgmt-subscription azure-monitor-opentelemetry azure-monitor-query jsonref opentelemetry-sdk python-dotenv +black # Can't include those, because they are not supported in Python 3.9. Samples that use these package # cannot be run as pytest, because the pipeline will fail on Python 3.9 jobs. 
# pillow diff --git a/sdk/ai/azure-ai-projects/docs/subclients.md b/sdk/ai/azure-ai-projects/docs/subclients.md new file mode 100644 index 000000000000..e87bdabc8fd4 --- /dev/null +++ b/sdk/ai/azure-ai-projects/docs/subclients.md @@ -0,0 +1,90 @@ +# AIProjectClient Subclients + +This document lists all subclients available on `AIProjectClient` and their public method counts. + +## Top-level Subclients + +| Subclient | Class Name | Public Methods | +|-----------|------------|----------------| +| `agents` | AgentsOperations | 8 | +| `evaluation_rules` | EvaluationRulesOperations | 4 | +| `connections` | ConnectionsOperations | 3 | +| `datasets` | DatasetsOperations | 9 | +| `deployments` | DeploymentsOperations | 2 | +| `indexes` | IndexesOperations | 5 | +| `telemetry` | TelemetryOperations | 1 | +| `beta` | BetaOperations | 0 (container only) | + +## Nested Subclients on `.beta` + +| Subclient | Class Name | Public Methods | +|-----------|------------|----------------| +| `beta.agents` | BetaAgentsOperations | 10 | +| `beta.evaluation_taxonomies` | BetaEvaluationTaxonomiesOperations | 5 | +| `beta.evaluators` | BetaEvaluatorsOperations | 11 | +| `beta.insights` | BetaInsightsOperations | 3 | +| `beta.memory_stores` | BetaMemoryStoresOperations | 8 | +| `beta.red_teams` | BetaRedTeamsOperations | 3 | +| `beta.schedules` | BetaSchedulesOperations | 6 | +| `beta.toolboxes` | BetaToolboxesOperations | 8 | +| `beta.skills` | BetaSkillsOperations | 7 | +| `beta.datasets` | BetaDatasetsOperations | 5 | + +## Summary + +**Total: 98 unique public methods across all subclients** + +--- + +### Method Details + +#### AgentsOperations (8) +`get`, `delete`, `list`, `create_version`, `create_version_from_manifest`, `get_version`, `delete_version`, `list_versions` + +#### EvaluationRulesOperations (4) +`get`, `delete`, `create_or_update`, `list` + +#### ConnectionsOperations (3) +`list`, `get`, `get_default` + +#### DatasetsOperations (9) +`list_versions`, `list`, `get`, `delete`, 
`create_or_update`, `pending_upload`, `get_credentials`, `upload_file`, `upload_folder` + +#### DeploymentsOperations (2) +`get`, `list` + +#### IndexesOperations (5) +`list_versions`, `list`, `get`, `delete`, `create_or_update` + +#### TelemetryOperations (1) +`get_application_insights_connection_string` + +#### BetaAgentsOperations (10) +`patch_agent_details`, `create_session`, `get_session`, `delete_session`, `list_sessions`, `get_session_log_stream`, `download_session_file`, `get_session_files`, `delete_session_file`, `upload_session_file` + +#### BetaEvaluationTaxonomiesOperations (5) +`get`, `list`, `delete`, `create`, `update` + +#### BetaEvaluatorsOperations (11) +`list_versions`, `list`, `get_version`, `delete_version`, `create_version`, `update_version`, `create_generation_job`, `get_generation_job`, `list_generation_jobs`, `cancel_generation_job`, `delete_generation_job` + +#### BetaInsightsOperations (3) +`generate`, `get`, `list` + +#### BetaMemoryStoresOperations (8) +`create`, `update`, `get`, `list`, `delete`, `delete_scope`, `search_memories`, `begin_update_memories` + +#### BetaRedTeamsOperations (3) +`get`, `list`, `create` + +#### BetaSchedulesOperations (6) +`delete`, `get`, `list`, `create_or_update`, `get_run`, `list_runs` + +#### BetaToolboxesOperations (8) +`create_version`, `get`, `list`, `list_versions`, `get_version`, `update`, `delete`, `delete_version` + +#### BetaSkillsOperations (7) +`create`, `create_from_package`, `get`, `download`, `list`, `update`, `delete` + +#### BetaDatasetsOperations (5) +`get_generation_job`, `list_generation_jobs`, `create_generation_job`, `cancel_generation_job`, `delete_generation_job` diff --git a/sdk/ai/azure-ai-projects/post-emitter-fixes.cmd b/sdk/ai/azure-ai-projects/post-emitter-fixes.cmd index ca22422a244f..3f55d67e708d 100644 --- a/sdk/ai/azure-ai-projects/post-emitter-fixes.cmd +++ b/sdk/ai/azure-ai-projects/post-emitter-fixes.cmd @@ -63,6 +63,7 @@ REM The emitter wraps bullet/code-block lines 
with insufficient indentation. powershell -Command "$files='azure\ai\projects\operations\_operations.py','azure\ai\projects\aio\operations\_operations.py'; foreach ($f in $files) { $c=Get-Content $f -Raw; $c=$c -replace 'schema\r?\n\s+is not contractual and may include additional keys or change format\r?\n\s+over time [^\r\n]*clients should treat it as an opaque string\)', 'schema is not contractual and may include additional keys or change format over time; clients should treat it as an opaque string)'; $c=$c -replace '(message\":\"Starting)\r?\n\s+(FoundryCBAgent server on port 8088\"})', '$1 $2'; $c=$c -replace '(message\":\"INFO: Application)\r?\n\s+(startup complete\.\"})', '$1 $2'; $c=$c -replace '(message\":\"Successfully)\r?\n\s+(connected to container\"})', '$1 $2'; $c=$c -replace '(message\":\"No logs since)\r?\n\s+(last 60 seconds\"})', '$1 $2'; Set-Content $f $c -NoNewline }" REM Finishing by running 'black' tool to format code. +pip install black black --config ../../../eng/black-pyproject.toml . || echo black not found, skipping formatting. 
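The repeated `span_instance.is_recording` → `span_instance.is_recording()` changes in the telemetry hunks above fix a classic Python pitfall: referencing a bound method without calling it yields a method object, which is always truthy, so the original guards never actually skipped non-recording spans. A minimal self-contained sketch of the bug (using a hypothetical stand-in `Span` class, not the SDK's wrapper):

```python
class Span:
    """Hypothetical stand-in for an OpenTelemetry-style span, for illustration only."""

    def is_recording(self) -> bool:
        # A real span reports whether it is collecting/exporting data;
        # pretend this one is not recording.
        return False


span = Span()

# Bug: the bound method object itself is always truthy, so a guard like
# `if span.is_recording:` can never skip a non-recording span.
print(bool(span.is_recording))  # True

# Fix: call the method to obtain the actual recording state.
print(span.is_recording())  # False
```

This is why the PR adds the parentheses everywhere the guard appears: only the call returns the boolean the guard was meant to test.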
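The `post-emitter-fixes.cmd` change above installs `black` and then runs it with an `|| echo ... skipping formatting.` fallback so a missing or failing formatter never aborts the script. A rough Python equivalent of that install-or-skip guard (the config path is taken from the script; `--check` is added here purely so this sketch never rewrites files) might look like:

```python
import shutil
import subprocess

# Path assumed from post-emitter-fixes.cmd.
BLACK_CONFIG = "../../../eng/black-pyproject.toml"

if shutil.which("black") is None:
    # Formatter not on PATH: skip, mirroring the cmd script's echo fallback.
    print("black not found, skipping formatting.")
else:
    # check=False keeps formatting problems non-fatal, like the cmd's "||".
    result = subprocess.run(
        ["black", "--check", "--config", BLACK_CONFIG, "."],
        check=False,
    )
    if result.returncode != 0:
        print("black reported issues or failed; continuing anyway.")
```

The design choice in both versions is the same: post-emitter formatting is best-effort, so the fix-up pipeline keeps going even when the tool is absent or unhappy.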
diff --git a/sdk/ai/azure-ai-projects/pyrightconfig.json b/sdk/ai/azure-ai-projects/pyrightconfig.json index b8b684eb3af1..c4714f25326c 100644 --- a/sdk/ai/azure-ai-projects/pyrightconfig.json +++ b/sdk/ai/azure-ai-projects/pyrightconfig.json @@ -3,9 +3,13 @@ "reportMissingImports": false, "pythonVersion": "3.11", "extraPaths": [ + "./../../authorization/azure-mgmt-authorization", + "./../../cognitiveservices/azure-mgmt-cognitiveservices", "./../../core/azure-core", "./../../evaluation/azure-ai-evaluation", "./../../identity/azure-identity", + "./../../resources/azure-mgmt-resource", + "./../../resources/azure-mgmt-subscription", "./../../monitor/azure-monitor-opentelemetry" ] } \ No newline at end of file diff --git a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_builtin_with_traces.py b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_builtin_with_traces.py index dd2271d4d6b7..a5e88ab53301 100644 --- a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_builtin_with_traces.py +++ b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_builtin_with_traces.py @@ -7,12 +7,21 @@ """ DESCRIPTION: Given an AIProjectClient, this sample demonstrates how to run Azure AI Evaluations - against agent traces collected in Azure Application Insights. The sample fetches - trace IDs for a given agent and time range, creates an evaluation group configured - for trace analysis, and monitors the evaluation run until it completes. + against agent traces collected in Azure Application Insights. + + Supports three modes: + - Default mode (no flags): Queries Application Insights client-side for trace IDs + using the AGENT_ID environment variable, then passes them to the eval service. + - Agent ID mode (--agent-id): Passes the agent ID directly to the eval service, + which resolves traces server-side from Application Insights. + - Trace ID mode (--trace-ids): Passes explicit trace IDs to the eval service. 
USAGE: python sample_evaluations_builtin_with_traces.py + python sample_evaluations_builtin_with_traces.py --agent-id "my-agent:1" + python sample_evaluations_builtin_with_traces.py --trace-ids abc123 def456 + python sample_evaluations_builtin_with_traces.py --agent-id "my-agent:1" --lookback-hours 48 --max-traces 20 + python sample_evaluations_builtin_with_traces.py --no-cleanup Before running the sample: @@ -21,7 +30,8 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - Required. The Azure AI Project endpoint, as found in the overview page of your Microsoft Foundry project. It has the form: https://.services.ai.azure.com/api/projects/. - 2) APPINSIGHTS_RESOURCE_ID - Required. The Azure Application Insights resource ID that stores agent traces. + 2) APPINSIGHTS_RESOURCE_ID - Required (for default mode). The Azure Application Insights resource ID that stores + agent traces. Not needed when using --agent-id or --trace-ids. It has the form: /subscriptions//resourceGroups//providers/Microsoft.Insights/components/. 3) AGENT_ID - Required. The agent identifier emitted by the Azure tracing integration, used to filter traces. 4) FOUNDRY_MODEL_NAME - Required. The Azure OpenAI deployment name to use with the built-in evaluators. @@ -29,19 +39,18 @@ Defaults to 1. 
""" +import argparse import os import time from datetime import datetime, timedelta, timezone from pprint import pprint -from typing import Any, Dict, List +from typing import Any, Dict, List, Optional from dotenv import load_dotenv from azure.identity import DefaultAzureCredential from azure.monitor.query import LogsQueryClient, LogsQueryStatus from azure.ai.projects import AIProjectClient from azure.ai.projects.models import ( - AzureAIDataSourceConfig, TestingCriterionAzureAIEvaluator, - TracesPreviewEvalRunDataSource, ) load_dotenv() @@ -53,7 +62,7 @@ ] # Sample : /subscriptions//resourceGroups//providers/Microsoft.Insights/components/ agent_id = os.environ["AGENT_ID"] model_deployment_name = os.environ["FOUNDRY_MODEL_NAME"] -trace_query_hours = int(os.environ.get("TRACE_LOOKBACK_HOURS", "1")) +default_lookback_hours = int(os.environ.get("TRACE_LOOKBACK_HOURS", "1")) def _build_evaluator_config(name: str, evaluator_name: str) -> TestingCriterionAzureAIEvaluator: @@ -63,9 +72,9 @@ def _build_evaluator_config(name: str, evaluator_name: str) -> TestingCriterionA name=name, evaluator_name=evaluator_name, data_mapping={ - "query": "{{query}}", - "response": "{{response}}", - "tool_definitions": "{{tool_definitions}}", + "query": "{{sample.query}}", + "response": "{{sample.response}}", + "tool_definitions": "{{sample.tool_definitions}}", }, initialization_parameters={ "deployment_name": model_deployment_name, @@ -121,92 +130,133 @@ def get_trace_ids( return [] -def main() -> None: - end_time = datetime.now(tz=timezone.utc) - start_time = end_time - timedelta(hours=trace_query_hours) +def main() -> None: # pylint: disable=too-many-statements + parser = argparse.ArgumentParser(description="Run Azure AI trace evaluations against agent traces.") + mode = parser.add_mutually_exclusive_group() + mode.add_argument("--agent-id", default=None, help="Agent ID for server-side trace resolution") + mode.add_argument("--trace-ids", nargs="+", default=None, help="Explicit trace IDs 
to evaluate") + parser.add_argument("--lookback-hours", type=int, default=None, help="Lookback window in hours") + parser.add_argument("--max-traces", type=int, default=50, help="Max traces in agent-id mode (default: 50)") + parser.add_argument("--no-cleanup", action="store_true", help="Keep eval group after run") + args = parser.parse_args() + + lookback_hours = args.lookback_hours or default_lookback_hours + trace_ids: Optional[List[str]] = None + agent_id_for_server: Optional[str] = None + metadata: Dict[str, str] = {} + + if args.agent_id: + agent_id_for_server = args.agent_id + print("Mode: Server-side agent ID resolution") + print(f"Agent ID: {args.agent_id}") + print(f"Lookback: {lookback_hours}h, Max traces: {args.max_traces}") + metadata["agent_id"] = args.agent_id + + elif args.trace_ids: + trace_ids = list(args.trace_ids) + print(f"Mode: Explicit trace IDs ({len(trace_ids)} provided)") + + else: + end_time = datetime.now(tz=timezone.utc) + start_time = end_time - timedelta(hours=lookback_hours) - print("Querying Application Insights for trace identifiers...") - print(f"Agent ID: {agent_id}") - print(f"Time range: {start_time.isoformat()} to {end_time.isoformat()}") + print("Querying Application Insights for trace identifiers...") + print(f"Agent ID: {agent_id}") + print(f"Time range: {start_time.isoformat()} to {end_time.isoformat()}") - trace_ids = get_trace_ids(appinsights_resource_id, agent_id, start_time, end_time) + trace_ids = get_trace_ids(appinsights_resource_id, agent_id, start_time, end_time) - if not trace_ids: - print("No trace IDs found for the provided agent and time window.") - return + if not trace_ids: + print("No trace IDs found for the provided agent and time window.") + return - print(f"\nFound {len(trace_ids)} trace IDs:") - for trace_id in trace_ids: - print(f" - {trace_id}") + print(f"\nFound {len(trace_ids)} trace IDs:") + for tid in trace_ids: + print(f" - {tid}") + + metadata["agent_id"] = agent_id + metadata["start_time"] = 
start_time.isoformat() + metadata["end_time"] = end_time.isoformat() with DefaultAzureCredential() as credential: with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: client = project_client.get_openai_client() - data_source_config = AzureAIDataSourceConfig( - type="azure_ai_source", - scenario="traces", - ) - testing_criteria = [ - _build_evaluator_config( - name="intent_resolution", - evaluator_name="builtin.intent_resolution", - ), - _build_evaluator_config( - name="task_adherence", - evaluator_name="builtin.task_adherence", - ), - ] - - print("\nCreating evaluation") - eval_object = client.evals.create( - name="agent_trace_eval_group", - data_source_config=data_source_config, - testing_criteria=testing_criteria, - ) - print(f"Evaluation created (id: {eval_object.id}, name: {eval_object.name})") - - print("\nGet Evaluation by Id") - eval_object_response = client.evals.retrieve(eval_object.id) - print("Evaluation Response:") - pprint(eval_object_response) - - print("\nCreating Eval Run with trace IDs") - run_name = f"agent_trace_eval_{datetime.now().strftime('%Y%m%d_%H%M%S')}" - data_source = TracesPreviewEvalRunDataSource( - type="azure_ai_traces_preview", - trace_ids=trace_ids, - lookback_hours=trace_query_hours, - ) - eval_run_object = client.evals.runs.create( - eval_id=eval_object.id, - name=run_name, - metadata={ - "agent_id": agent_id, - "start_time": start_time.isoformat(), - "end_time": end_time.isoformat(), - }, - data_source=data_source, - ) - print("Eval Run created") - pprint(eval_run_object) - - print("\nMonitoring Eval Run status...") - while True: - run = client.evals.runs.retrieve(run_id=eval_run_object.id, eval_id=eval_object.id) - print(f"Status: {run.status}") - - if run.status in {"completed", "failed", "canceled"}: - print("\nEval Run finished!") - print("Final Eval Run Response:") - pprint(run) - break - - time.sleep(5) - print("Waiting for eval run to complete...") - - 
client.evals.delete(eval_id=eval_object.id) - print("Evaluation deleted") + data_source_config = { + "type": "azure_ai_source", + "scenario": "traces", + } + + testing_criteria = [ + _build_evaluator_config( + name="intent_resolution", + evaluator_name="builtin.intent_resolution", + ), + _build_evaluator_config( + name="task_adherence", + evaluator_name="builtin.task_adherence", + ), + ] + + print("\nCreating evaluation") + eval_object = client.evals.create( + name="agent_trace_eval_group", + data_source_config=data_source_config, # type: ignore + testing_criteria=testing_criteria, # type: ignore + ) + print(f"Evaluation created (id: {eval_object.id}, name: {eval_object.name})") + + print("\nGet Evaluation by Id") + eval_object_response = client.evals.retrieve(eval_object.id) + print("Evaluation Response:") + pprint(eval_object_response) + + # Build data source based on mode + if agent_id_for_server: + data_source: Dict[str, Any] = { + "type": "azure_ai_traces", + "agent_id": agent_id_for_server, + "lookback_hours": lookback_hours, + "max_traces": args.max_traces, + } + else: + assert trace_ids is not None + data_source = { + "type": "azure_ai_traces", + "trace_ids": trace_ids, + "lookback_hours": lookback_hours, + } + + print("\nCreating Eval Run") + run_name = f"agent_trace_eval_{datetime.now().strftime('%Y%m%d_%H%M%S')}" + eval_run_object = client.evals.runs.create( + eval_id=eval_object.id, + name=run_name, + metadata=metadata if metadata else None, + data_source=data_source, # type: ignore + ) + print("Eval Run created") + pprint(eval_run_object) + + print("\nMonitoring Eval Run status...") + while True: + run = client.evals.runs.retrieve(run_id=eval_run_object.id, eval_id=eval_object.id) + print(f"Status: {run.status}") + + if run.status in {"completed", "failed", "canceled"}: + print("\nEval Run finished!") + print("Final Eval Run Response:") + pprint(run) + break + + time.sleep(5) + print("Waiting for eval run to complete...") + + if not args.no_cleanup: + 
client.evals.delete(eval_id=eval_object.id) + print("Evaluation deleted") + else: + print(f"Skipping cleanup (--no-cleanup). Eval ID: {eval_object.id}") if __name__ == "__main__": diff --git a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio.py b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio.py index 7583f1e8636e..165bd0f43929 100644 --- a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio.py +++ b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio.py @@ -26,13 +26,15 @@ 3) AZURE_AI_MODEL_DEPLOYMENT_NAME_FOR_AUDIO - Required. The name of the model deployment for audio to use for evaluation, recommend to use "gpt-4o-audio-preview" """ -import os import base64 - -from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient +import os import time from pprint import pprint + +from dotenv import load_dotenv +from azure.ai.projects import AIProjectClient +from azure.identity import DefaultAzureCredential +from openai.types.eval_create_params import DataSourceConfigCustom from openai.types.evals.create_eval_completions_run_data_source_param import ( CreateEvalCompletionsRunDataSourceParam, SourceFileContent, @@ -41,8 +43,6 @@ InputMessagesTemplateTemplateEvalItem, ) from openai.types.responses import EasyInputMessageParam, ResponseInputAudioParam -from openai.types.eval_create_params import DataSourceConfigCustom -from dotenv import load_dotenv load_dotenv() file_path = os.path.abspath(__file__) @@ -54,7 +54,13 @@ def audio_to_base64(audio_path: str) -> str: - """Read an audio file and return its base64-encoded content.""" + """Read an audio file and return its base64-encoded content. + + :param audio_path: Absolute path to the input audio file. + :type audio_path: str + :return: Base64-encoded audio data. 
+ :rtype: str + """ with open(audio_path, "rb") as f: return base64.b64encode(f.read()).decode("utf-8") @@ -183,7 +189,7 @@ def audio_to_base64(audio_path: str) -> str: while True: run = client.evals.runs.retrieve(run_id=eval_run_response.id, eval_id=eval_object.id) - if run.status == "completed" or run.status == "failed": + if run.status in {"completed", "failed"}: output_items = list(client.evals.runs.output_items.list(run_id=run.id, eval_id=eval_object.id)) pprint(output_items) print(f"Eval Run Report URL: {run.report_url}") diff --git a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio_model_target.py b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio_model_target.py index 5c3c4ae51f78..75eb7b97f566 100644 --- a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio_model_target.py +++ b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_audio_model_target.py @@ -26,13 +26,15 @@ 3) AZURE_AI_MODEL_DEPLOYMENT_NAME_FOR_AUDIO - Required. 
The name of the model deployment for audio to use for evaluation, recommend to use "gpt-4o-audio-preview" """ -import os import base64 - -from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient +import os import time from pprint import pprint + +from dotenv import load_dotenv +from azure.ai.projects import AIProjectClient +from azure.identity import DefaultAzureCredential +from openai.types.eval_create_params import DataSourceConfigCustom from openai.types.evals.create_eval_completions_run_data_source_param import ( SourceFileContent, SourceFileContentContent, @@ -40,8 +42,6 @@ InputMessagesTemplateTemplateEvalItem, ) from openai.types.responses import EasyInputMessageParam, ResponseInputAudioParam -from openai.types.eval_create_params import DataSourceConfigCustom -from dotenv import load_dotenv load_dotenv() file_path = os.path.abspath(__file__) @@ -53,7 +53,13 @@ def audio_to_base64(audio_path: str) -> str: - """Read an audio file and return its base64-encoded content.""" + """Read an audio file and return its base64-encoded content. + + :param audio_path: Absolute path to the input audio file. + :type audio_path: str + :return: Base64-encoded audio data. 
+ :rtype: str + """ with open(audio_path, "rb") as f: return base64.b64encode(f.read()).decode("utf-8") @@ -188,7 +194,7 @@ def audio_to_base64(audio_path: str) -> str: while True: run = client.evals.runs.retrieve(run_id=eval_run_response.id, eval_id=eval_object.id) - if run.status == "completed" or run.status == "failed": + if run.status in {"completed", "failed"}: output_items = list(client.evals.runs.output_items.list(run_id=run.id, eval_id=eval_object.id)) pprint(output_items) print(f"Eval Run Report URL: {run.report_url}") diff --git a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_image_model_target.py b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_image_model_target.py index 88920549113d..01a22ccfe588 100644 --- a/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_image_model_target.py +++ b/sdk/ai/azure-ai-projects/samples/evaluations/sample_evaluations_score_model_grader_with_image_model_target.py @@ -23,15 +23,17 @@ 2) AZURE_AI_MODEL_DEPLOYMENT_NAME - Required. The name of the model deployment to use for evaluation. 
""" -import os import base64 -from PIL import Image +import os +import time from io import BytesIO +from pprint import pprint -from azure.identity import DefaultAzureCredential +from PIL import Image +from dotenv import load_dotenv from azure.ai.projects import AIProjectClient -import time -from pprint import pprint +from azure.identity import DefaultAzureCredential +from openai.types.eval_create_params import DataSourceConfigCustom from openai.types.evals.create_eval_completions_run_data_source_param import ( SourceFileContent, SourceFileContentContent, @@ -40,8 +42,6 @@ InputMessagesTemplateTemplateEvalItemContentInputImage, ) from openai.types.responses import EasyInputMessageParam -from openai.types.eval_create_params import DataSourceConfigCustom -from dotenv import load_dotenv load_dotenv() file_path = os.path.abspath(__file__) @@ -51,8 +51,8 @@ model_deployment_name = os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"] -def image_to_data_uri(image_path: str) -> str: - with Image.open(image_path) as img: +def image_to_data_uri(image_file_path: str) -> str: + with Image.open(image_file_path) as img: buffered = BytesIO() img.save(buffered, format=img.format or "PNG") img_str = base64.b64encode(buffered.getvalue()).decode() @@ -186,7 +186,7 @@ def image_to_data_uri(image_path: str) -> str: while True: run = client.evals.runs.retrieve(run_id=eval_run_response.id, eval_id=eval_object.id) - if run.status == "completed" or run.status == "failed": + if run.status in {"completed", "failed"}: output_items = list(client.evals.runs.output_items.list(run_id=run.id, eval_id=eval_object.id)) pprint(output_items) print(f"Eval Run Report URL: {run.report_url}") diff --git a/sdk/ai/azure-ai-projects/samples/evaluations/sample_redteam_evaluations.py b/sdk/ai/azure-ai-projects/samples/evaluations/sample_redteam_evaluations.py index 82cd63dfe664..a88d71125b12 100644 --- a/sdk/ai/azure-ai-projects/samples/evaluations/sample_redteam_evaluations.py +++ 
b/sdk/ai/azure-ai-projects/samples/evaluations/sample_redteam_evaluations.py @@ -95,7 +95,7 @@ def main() -> None: # pylint: disable=too-many-statements description="Taxonomy for red teaming evaluation", taxonomy_input=agent_taxonomy_input ) - taxonomy = project_client.beta.evaluation_taxonomies.create(name=agent_name, body=eval_taxonomy_input) + taxonomy = project_client.beta.evaluation_taxonomies.create(name=agent_name, taxonomy=eval_taxonomy_input) taxonomy_path = os.path.join(tempfile.gettempdir(), f"taxonomy_{agent_name}.json") with open(taxonomy_path, "w", encoding="utf-8") as f: f.write(json.dumps(_to_json_primitive(taxonomy), indent=2)) diff --git a/sdk/ai/azure-ai-projects/samples/evaluations/sample_scheduled_evaluations.py b/sdk/ai/azure-ai-projects/samples/evaluations/sample_scheduled_evaluations.py index 85c89d83abad..5b66ce656b07 100644 --- a/sdk/ai/azure-ai-projects/samples/evaluations/sample_scheduled_evaluations.py +++ b/sdk/ai/azure-ai-projects/samples/evaluations/sample_scheduled_evaluations.py @@ -380,7 +380,7 @@ def schedule_redteam_evaluation() -> None: # pylint: disable=too-many-locals description="Taxonomy for red teaming evaluation", taxonomy_input=agent_taxonomy_input ) - taxonomy = project_client.beta.evaluation_taxonomies.create(name=agent_name, body=eval_taxonomy_input) + taxonomy = project_client.beta.evaluation_taxonomies.create(name=agent_name, taxonomy=eval_taxonomy_input) taxonomy_path = os.path.join(data_folder, f"taxonomy_{agent_name}.json") # Create the data folder if it doesn't exist os.makedirs(data_folder, exist_ok=True) diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/hosted_agents_util.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/hosted_agents_util.py index 9641a6fb2616..7705776d20fd 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/hosted_agents_util.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/hosted_agents_util.py @@ -1,48 +1,12 @@ import asyncio import logging -import time -from 
contextlib import asynccontextmanager, contextmanager -from typing import AsyncGenerator, Optional +from typing import Optional from azure.ai.projects import AIProjectClient from azure.ai.projects.aio import AIProjectClient as AsyncAIProjectClient -from azure.ai.projects.models import HostedAgentDefinition, ProtocolVersionRecord, VersionRefIndicator - - -def wait_for_agent_version_active( - project_client: AIProjectClient, - agent_name: str, - agent_version: str, - *, - logger: Optional[logging.Logger] = None, - max_attempts: int = 60, - poll_interval_seconds: int = 10, -) -> None: - if logger: - logger.info("Waiting for agent version to become active...") - - for attempt in range(max_attempts): - time.sleep(poll_interval_seconds) - version_details = project_client.agents.get_version(agent_name=agent_name, agent_version=agent_version) - status = version_details.status - - if logger: - logger.debug(f"Agent version status: {status} (attempt {attempt + 1}/{max_attempts})") - print(f"Agent version status: {status} (attempt {attempt + 1})") - - if status == "active": - if logger: - logger.info("Agent version is now active") - return - - if status == "failed": - if logger: - logger.error(f"Agent version provisioning failed: {dict(version_details)}") - raise RuntimeError(f"Agent version provisioning failed: {dict(version_details)}") - - if logger: - logger.error("Timed out waiting for agent version to become active") - raise RuntimeError("Timed out waiting for agent version to become active") +from azure.ai.projects.models import ( + AgentVersionDetails, +) async def wait_for_agent_version_active_async( @@ -81,97 +45,29 @@ async def wait_for_agent_version_active_async( raise RuntimeError("Timed out waiting for agent version to become active") -@contextmanager -def create_agent_and_session( +def get_latest_active_agent_version( project_client: AIProjectClient, agent_name: str, - image: str, - isolation_key: str = "sample-isolation-key", -): - agent = 
project_client.agents.create_version( - agent_name=agent_name, - definition=HostedAgentDefinition( - cpu="0.5", - memory="1Gi", - image=image, - container_protocol_versions=[ - ProtocolVersionRecord(protocol="responses", version="1.0.0"), - ], - ), - metadata={"enableVnextExperience": "true"}, - ) - print(f"Agent created (name: {agent.name}, version: {agent.version})") - - wait_for_agent_version_active( - project_client=project_client, - agent_name=agent_name, - agent_version=agent.version, - ) - - session = project_client.beta.agents.create_session( - agent_name=agent_name, - isolation_key=isolation_key, - version_indicator=VersionRefIndicator(agent_version=agent.version), +) -> AgentVersionDetails: + for version in project_client.agents.list_versions(agent_name=agent_name, order="desc"): + if version.status == "active": + return version + + raise RuntimeError( + f"No active version found for hosted agent '{agent_name}'. " + "Create or activate a version before running this sample." ) - print(f"Session created (id: {session.agent_session_id}, status: {session.status})") - try: - yield agent, session - finally: - project_client.beta.agents.delete_session( - agent_name=agent_name, - session_id=session.agent_session_id, - isolation_key=isolation_key, - ) - print(f"Session with id: {session.agent_session_id} deleted.") - project_client.agents.delete_version(agent_name=agent_name, agent_version=agent.version) - print(f"Agent version {agent.version} deleted.") - - -@asynccontextmanager -async def create_agent_and_session_async( +async def get_latest_active_agent_version_async( project_client: AsyncAIProjectClient, agent_name: str, - image: str, - isolation_key: str = "sample-isolation-key", -) -> AsyncGenerator[tuple[str, str], None]: - agent = await project_client.agents.create_version( - agent_name=agent_name, - definition=HostedAgentDefinition( - cpu="0.5", - memory="1Gi", - image=image, - container_protocol_versions=[ - ProtocolVersionRecord(protocol="responses", 
version="1.0.0"), - ], - ), - metadata={"enableVnextExperience": "true"}, +) -> AgentVersionDetails: + async for version in project_client.agents.list_versions(agent_name=agent_name, order="desc"): + if version.status == "active": + return version + + raise RuntimeError( + f"No active version found for hosted agent '{agent_name}'. " + "Create or activate a version before running this sample." ) - print(f"Agent created (name: {agent.name}, version: {agent.version})") - - await wait_for_agent_version_active_async( - project_client=project_client, - agent_name=agent_name, - agent_version=agent.version, - ) - - session = await project_client.beta.agents.create_session( - agent_name=agent_name, - isolation_key=isolation_key, - version_indicator=VersionRefIndicator(agent_version=agent.version), - ) - print(f"Session created (id: {session.agent_session_id}, status: {session.status})") - - try: - yield agent.version, session.agent_session_id - finally: - await project_client.beta.agents.delete_session( - agent_name=agent_name, - session_id=session.agent_session_id, - isolation_key=isolation_key, - ) - print(f"Session with id: {session.agent_session_id} deleted.") - - await project_client.agents.delete_version(agent_name=agent_name, agent_version=agent.version) - print(f"Agent version {agent.version} deleted.") diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/rbac_util.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/rbac_util.py new file mode 100644 index 000000000000..ce12b18f23d7 --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/rbac_util.py @@ -0,0 +1,132 @@ +import uuid +from typing import Any, cast +from urllib.parse import urlparse + +from azure.core.credentials import TokenCredential +from azure.core.exceptions import HttpResponseError, ResourceExistsError, ResourceNotFoundError +from azure.mgmt.authorization import AuthorizationManagementClient, models as authorization_models +from azure.mgmt.resource import ResourceManagementClient + 
+from azure.ai.projects.models import AgentVersionDetails + +AZURE_AI_USER_ROLE_DEFINITION_GUID = "53ca6127-db72-4b80-b1b0-d745d6d5456d" + + +def _extract_resource_group_name(resource_id: str) -> str: + parts = resource_id.strip("/").split("/") + for index, part in enumerate(parts): + if part.lower() == "resourcegroups" and index + 1 < len(parts): + return parts[index + 1] + return "" + + +def _resolve_ai_account_resource_id( + credential: TokenCredential, + account_name: str, + project_name: str, + subscription_id: str, +) -> str: + resource_client = ResourceManagementClient(credential, subscription_id) + project_resources = resource_client.resources.list( + filter="resourceType eq 'Microsoft.CognitiveServices/accounts/projects'" + ) + + project_id_segment = f"/accounts/{account_name}/projects/{project_name}".lower() + matching_projects = [ + resource for resource in project_resources if resource.id and project_id_segment in resource.id.lower() + ] + if not matching_projects: + raise RuntimeError(f"Could not locate Foundry project '{project_name}' in subscription '{subscription_id}'.") + + if not matching_projects[0].id: + raise RuntimeError("Foundry project resource ID is empty.") + resource_group_name = _extract_resource_group_name(matching_projects[0].id) + account_resources = resource_client.resources.list_by_resource_group( + resource_group_name=resource_group_name, + filter="resourceType eq 'Microsoft.CognitiveServices/accounts'", + ) + + account_matches = [resource.id for resource in account_resources if resource.name == account_name and resource.id] + if not account_matches: + raise RuntimeError( + f"Could not locate Azure AI account '{account_name}' in resource group '{resource_group_name}'." 
+ ) + return account_matches[0] + + +def _ensure_agent_identity_rbac_with_role_id( + credential: TokenCredential, principal_id: str, scope_resource_id: str, subscription_id: str, role_id: str +) -> tuple[bool, str]: + authorization_client = AuthorizationManagementClient(credential, subscription_id) + role_definition_id = f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization/roleDefinitions/{role_id}" + role_assignment_name = str( + uuid.uuid5( + uuid.NAMESPACE_URL, + f"{scope_resource_id}|{principal_id}|{role_definition_id}", + ) + ) + + try: + authorization_client.role_assignments.get(scope_resource_id, role_assignment_name) + print(f"Azure AI User role already assigned to principal {principal_id}.") + return False, role_assignment_name + except ResourceNotFoundError: + pass + + create_parameters_kwargs = cast( + dict[str, Any], + { + "role_definition_id": role_definition_id, + "principal_id": principal_id, + "principal_type": authorization_models.PrincipalType.SERVICE_PRINCIPAL, + }, + ) + parameters = authorization_models.RoleAssignmentCreateParameters(**create_parameters_kwargs) + + authorization_client.role_assignments.create(scope_resource_id, role_assignment_name, parameters) + print(f"Assigned Azure AI User role to principal {principal_id} at scope {scope_resource_id}.") + return True, role_assignment_name + + +def ensure_agent_identity_rbac( + agent: AgentVersionDetails, + credential: TokenCredential, + subscription_id: str, + foundry_project_endpoint: str, +) -> None: + """Ensure the hosted agent identity has Azure AI User role on the Azure AI account. + + This resolves the Azure AI account resource ID from the Foundry project endpoint, + reads the hosted agent managed identity principal ID from ``agent``, and + creates a deterministic role assignment for the Azure AI User role if one does not + already exist. + + :param agent: Agent version details containing ``instance_identity``. 
+ :type agent: ~azure.ai.projects.models.AgentVersionDetails + :param credential: Credential used for Azure Resource Manager authorization calls. + :type credential: ~azure.core.credentials.TokenCredential + :param subscription_id: Azure subscription ID containing the Foundry project/account. + :type subscription_id: str + :param foundry_project_endpoint: Foundry project endpoint in the format + ``https://<account-name>.services.ai.azure.com/api/projects/<project-name>``. + :type foundry_project_endpoint: str + :raises RuntimeError: If the agent identity principal ID is unavailable, or if the + account/project resources cannot be resolved. + :raises ~azure.core.exceptions.HttpResponseError: If role assignment creation fails + for reasons other than an existing assignment. + """ + if not agent.instance_identity or not agent.instance_identity.principal_id: + raise RuntimeError("Agent instance_identity or principal_id is not available.") + principal_id = agent.instance_identity.principal_id + + account_name = urlparse(foundry_project_endpoint).hostname.split(".")[0] # type: ignore[union-attr] + project_name = foundry_project_endpoint.rstrip("/").split("/api/projects/")[1].split("/")[0] + scope_resource_id = _resolve_ai_account_resource_id(credential, account_name, project_name, subscription_id) + + _ensure_agent_identity_rbac_with_role_id( + credential=credential, + principal_id=principal_id, + scope_resource_id=scope_resource_id, + subscription_id=subscription_id, + role_id=AZURE_AI_USER_ROLE_DEFINITION_GUID, + ) diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint.py index 0833548e857e..ab873a78060b 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint.py @@ -6,7 +6,7 @@ """ DESCRIPTION: - This sample demonstrates how to create a Hosted Agent and Session, + This sample demonstrates how to 
use an existing Hosted Agent and create a Session, configure an Agent endpoint for Responses protocol, and invoke the OpenAI Responses API against that agent endpoint using the synchronous AIProjectClient. @@ -27,11 +27,10 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. """ import os @@ -42,18 +41,18 @@ from azure.ai.projects import AIProjectClient from azure.ai.projects.models import ( - AgentEndpoint, + AgentEndpointConfig, AgentEndpointProtocol, FixedRatioVersionSelectionRule, VersionSelector, ) -from hosted_agents_util import create_agent_and_session +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version load_dotenv() endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] -agent_name = "MySessionHostedAgent" +agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] with ( DefaultAzureCredential() as credential, @@ -62,33 +61,47 @@ credential=credential, allow_preview=True, ) as project_client, - create_agent_and_session(project_client, agent_name, image) as (agent, session), ): - # Configure endpoint routing so this agent name serves the created version. - # 100% of traffic is routed to the single created version. 
- endpoint_config = AgentEndpoint( - version_selector=VersionSelector( - version_selection_rules=[ - FixedRatioVersionSelectionRule(agent_version=agent.version, traffic_percentage=100), - ] - ), - protocols=[AgentEndpointProtocol.RESPONSES], - ) - patched_agent = project_client.beta.agents.patch_agent_details( - agent_name=agent_name, - agent_endpoint=endpoint_config, - ) - print(f"Agent endpoint configured for agent: {patched_agent.name}") + agent = get_latest_active_agent_version(project_client, agent_name) - # Create an OpenAI client bound to the agent endpoint. - openai_client = project_client.get_openai_client(agent_name=agent_name) - - # Call Responses API and bind the request to the created agent session. - response = openai_client.responses.create( - input="What is the size of France in square miles?", - extra_body={ - "agent_session_id": session.agent_session_id, - }, + session = project_client.beta.agents.create_session( + agent_name=agent_name, + version_indicator=VersionRefIndicator(agent_version=agent.version), ) - print(f"Response output: {response.output_text}") + print(f"Session created (id: {session.agent_session_id}, status: {session.status})") + try: + # Configure endpoint routing so this agent name serves the created version. + # 100% of traffic is routed to the single created version. + endpoint_config = AgentEndpointConfig( + version_selector=VersionSelector( + version_selection_rules=[ + FixedRatioVersionSelectionRule(agent_version=agent.version, traffic_percentage=100), + ] + ), + protocols=[AgentEndpointProtocol.RESPONSES], + ) + + patched_agent = project_client.beta.agents.patch_agent_details( + agent_name=agent_name, + agent_endpoint=endpoint_config, + ) + print(f"Agent endpoint configured for agent: {patched_agent.name}") + + # Create an OpenAI client bound to the agent endpoint. + openai_client = project_client.get_openai_client(agent_name=agent_name) + + # Call Responses API and bind the request to the created agent session. 
+ response = openai_client.responses.create( + input="What is the size of France in square miles?", + extra_body={ + "agent_session_id": session.agent_session_id, + }, + ) + print(f"Response output: {response.output_text}") + finally: + project_client.beta.agents.delete_session( + agent_name=agent_name, + session_id=session.agent_session_id, + ) + print(f"Session deleted (id: {session.agent_session_id})") diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint_async.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint_async.py index 91a9d8ac2437..e2f4ef37d14b 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint_async.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_agent_endpoint_async.py @@ -6,7 +6,7 @@ """ DESCRIPTION: - This sample demonstrates how to create a Hosted Agent and Session, + This sample demonstrates how to use an existing Hosted Agent and create a Session, configure an Agent endpoint for Responses protocol, and invoke the OpenAI Responses API against that agent endpoint using the asynchronous AIProjectClient. @@ -27,11 +27,10 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. 
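The endpoint samples above first resolve the newest active version of an existing Hosted Agent via `get_latest_active_agent_version` / `get_latest_active_agent_version_async` from `hosted_agents_util`. The selection logic is a one-pass scan over versions returned newest-first; a minimal sketch over plain dicts (the function name and dict shape here are illustrative, not the SDK's `AgentVersionDetails` model):

```python
from typing import Iterable, Mapping, Optional


def pick_latest_active_version(versions: Iterable[Mapping[str, str]]) -> Optional[Mapping[str, str]]:
    """Return the first version whose status is 'active', or None.

    Assumes the iterable is already sorted newest-first, as when the
    service is queried with order="desc" in the helpers above.
    """
    for version in versions:
        if version["status"] == "active":
            return version
    return None
```

Because the listing is descending, the first match is the newest active version; a failed or still-provisioning newer version is skipped rather than returned.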
""" import asyncio @@ -43,21 +42,21 @@ from azure.ai.projects.aio import AIProjectClient from azure.ai.projects.models import ( - AgentEndpoint, + AgentEndpointConfig, AgentEndpointProtocol, FixedRatioVersionSelectionRule, VersionSelector, ) -from hosted_agents_util import create_agent_and_session_async +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version_async load_dotenv() -endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] -agent_name = "MySessionHostedAgent" +async def main(): + endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] + agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] -async def main() -> None: async with ( DefaultAzureCredential() as credential, AIProjectClient( @@ -65,35 +64,50 @@ async def main() -> None: credential=credential, allow_preview=True, ) as project_client, - create_agent_and_session_async(project_client, agent_name, image) as (agent_version, session_id), ): - # Configure endpoint routing so this agent name serves the created version. - # 100% of traffic is routed to the single created version. - endpoint_config = AgentEndpoint( - version_selector=VersionSelector( - version_selection_rules=[ - FixedRatioVersionSelectionRule(agent_version=agent_version, traffic_percentage=100), - ] - ), - protocols=[AgentEndpointProtocol.RESPONSES], - ) - patched_agent = await project_client.beta.agents.patch_agent_details( + agent = await get_latest_active_agent_version_async(project_client, agent_name) + + session = await project_client.beta.agents.create_session( agent_name=agent_name, - agent_endpoint=endpoint_config, + version_indicator=VersionRefIndicator(agent_version=agent.version), ) - print(f"Agent endpoint configured for agent: {patched_agent.name}") + print(f"Session created (id: {session.agent_session_id}, status: {session.status})") + try: + # Configure endpoint routing so this agent name serves the created version. 
+ # 100% of traffic is routed to the single created version. + endpoint_config = AgentEndpointConfig( + version_selector=VersionSelector( + version_selection_rules=[ + FixedRatioVersionSelectionRule(agent_version=agent.version, traffic_percentage=100), + ] + ), + protocols=[AgentEndpointProtocol.RESPONSES], + ) + + patched_agent = await project_client.beta.agents.patch_agent_details( + agent_name=agent_name, + agent_endpoint=endpoint_config, + ) + print(f"Agent endpoint configured for agent: {patched_agent.name}") + + # Create an OpenAI client bound to the agent endpoint. + openai_client = project_client.get_openai_client(agent_name=agent_name) - # Create an OpenAI client bound to the agent endpoint. - async with project_client.get_openai_client(agent_name=agent_name) as openai_client: # Call Responses API and bind the request to the created agent session. response = await openai_client.responses.create( input="What is the size of France in square miles?", extra_body={ - "agent_session_id": session_id, + "agent_session_id": session.agent_session_id, }, ) print(f"Response output: {response.output_text}") + finally: + await project_client.beta.agents.delete_session( + agent_name=agent_name, + session_id=session.agent_session_id, + ) + print(f"Session deleted (id: {session.agent_session_id})") if __name__ == "__main__": diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_hosted_agent_create.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_hosted_agent_create.py new file mode 100644 index 000000000000..6239a79bf05b --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_hosted_agent_create.py @@ -0,0 +1,110 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. 
+# ------------------------------------ + +""" +DESCRIPTION: + This sample demonstrates CRUD operations for Hosted Agent versions + using the synchronous AIProjectClient. + + This is the only hosted_agents sample that sets up agent identity RBAC + via `ensure_agent_identity_rbac`. + +USAGE: + python sample_hosted_agent_create.py + + Before running the sample: + + pip install "azure-ai-projects>=2.1.0" azure-mgmt-authorization azure-mgmt-resource python-dotenv + + Set these environment variables with your own values: + 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview + page of your Microsoft Foundry portal. + 2) FOUNDRY_HOSTED_AGENT_NAME - The Hosted Agent name. + 3) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format + '<registry>/<image-name>[:<tag>|@<digest>]'. + 4) AZURE_SUBSCRIPTION_ID - Azure subscription ID where the + Azure AI account and project are deployed. +""" + +import os +import time + +from dotenv import load_dotenv + +from azure.identity import DefaultAzureCredential + +from azure.ai.projects import AIProjectClient +from azure.ai.projects.models import HostedAgentDefinition, ProtocolVersionRecord +from rbac_util import ensure_agent_identity_rbac + +load_dotenv() + +endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] +agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] +image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] +subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"] + + +def wait_for_agent_version_active( + project_client: AIProjectClient, + agent_name: str, + agent_version: str, + max_attempts: int = 60, + poll_interval_seconds: int = 10, +) -> None: + for attempt in range(max_attempts): + time.sleep(poll_interval_seconds) + version_details = project_client.agents.get_version(agent_name=agent_name, agent_version=agent_version) + status = version_details.status + + print(f"Agent version status: {status} (attempt {attempt + 1})") + + if status == "active": + return + + if status == "failed": + raise 
RuntimeError(f"Agent version provisioning failed: {dict(version_details)}") + + raise RuntimeError("Timed out waiting for agent version to become active") + + +with ( + DefaultAzureCredential() as credential, + AIProjectClient( + endpoint=endpoint, + credential=credential, + allow_preview=True, + ) as project_client, +): + created = project_client.agents.create_version( + agent_name=agent_name, + definition=HostedAgentDefinition( + cpu="0.5", + memory="1Gi", + image=image, + container_protocol_versions=[ + ProtocolVersionRecord(protocol="responses", version="1.0.0"), + ], + ), + metadata={"enableVnextExperience": "true"}, + ) + print(f"Created hosted agent version: {created.version}") + + wait_for_agent_version_active( + project_client=project_client, + agent_name=agent_name, + agent_version=created.version, + ) + + ensure_agent_identity_rbac( + agent=created, + credential=credential, + subscription_id=subscription_id, + foundry_project_endpoint=endpoint, + ) + + fetched = project_client.agents.get_version(agent_name=agent_name, agent_version=created.version) + print(f"Fetched hosted agent version: {fetched.version}, status: {fetched.status}") diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream.py index e2658265cbb7..4c495f5e5a22 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream.py @@ -6,31 +6,34 @@ """ DESCRIPTION: - This sample demonstrates how to stream hosted agent session logs - using `project_client.beta.agents.get_session_log_stream` with the - synchronous AIProjectClient. + This sample demonstrates how to stream hosted agent session logs + using `project_client.beta.agents.get_session_log_stream` with the + synchronous AIProjectClient. - Sessions only work with Hosted Agents. + Sessions only work with Hosted Agents. 
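`wait_for_agent_version_active` in the create sample above is a simple sleep-then-check poll loop. A testable sketch of the same loop shape, with the status fetch and sleep injected so the control flow can be exercised without an Azure project (the names here are illustrative, not part of the SDK):

```python
import time
from typing import Callable


def wait_until_active(
    fetch_status: Callable[[], str],
    max_attempts: int = 60,
    poll_interval_seconds: float = 10.0,
    sleep: Callable[[float], None] = time.sleep,
) -> int:
    """Poll fetch_status until 'active'; raise on 'failed' or timeout.

    Returns the number of attempts taken. Mirrors the loop in
    wait_for_agent_version_active: sleep first, then check status.
    """
    for attempt in range(1, max_attempts + 1):
        sleep(poll_interval_seconds)
        status = fetch_status()
        if status == "active":
            return attempt
        if status == "failed":
            raise RuntimeError("Agent version provisioning failed")
    raise RuntimeError("Timed out waiting for agent version to become active")
```

Injecting `sleep` keeps the default behavior (`time.sleep`) in production while letting tests substitute a no-op, so the timeout and failure branches can be verified instantly.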
- Session and log stream operations are currently preview features. - In the Python SDK, you access these operations via - `project_client.beta.agents`. + Session and log stream operations are currently preview features. + In the Python SDK, you access these operations via + `project_client.beta.agents`. USAGE: - python sample_session_log_stream.py + python sample_session_log_stream.py - Before running the sample: + Before running the sample: - pip install "azure-ai-projects>=2.1.0" python-dotenv + pip install "azure-ai-projects>=2.1.0" python-dotenv - Set these environment variables with your own values: - 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview - page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + Set these environment variables with your own values: + 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview + page of your Microsoft Foundry portal. + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. + + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. + + NOTE: This sample assumes the Foundry project and Azure AI account are in the + same resource group. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. 
""" import os @@ -41,18 +44,18 @@ from azure.ai.projects import AIProjectClient from azure.ai.projects.models import ( - AgentEndpoint, + AgentEndpointConfig, AgentEndpointProtocol, FixedRatioVersionSelectionRule, VersionSelector, ) -from hosted_agents_util import create_agent_and_session +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version load_dotenv() endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] -agent_name = "MySessionHostedAgent" +agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] def _iter_sse_frames(stream, max_log_events: int): @@ -91,25 +94,32 @@ def _iter_sse_frames(stream, max_log_events: int): credential=credential, allow_preview=True, ) as project_client, - create_agent_and_session(project_client, agent_name, image) as (agent, session), + project_client.get_openai_client(agent_name=agent_name) as openai_client, ): - endpoint_config = AgentEndpoint( - version_selector=VersionSelector( - version_selection_rules=[ - FixedRatioVersionSelectionRule(agent_version=agent.version, traffic_percentage=100), - ] - ), - protocols=[AgentEndpointProtocol.RESPONSES], - ) - - project_client.beta.agents.patch_agent_details( + agent = get_latest_active_agent_version(project_client, agent_name) + session = project_client.beta.agents.create_session( agent_name=agent_name, - agent_endpoint=endpoint_config, + version_indicator=VersionRefIndicator(agent_version=agent.version), ) - print(f"Agent endpoint configured for agent: {agent_name}") - input_text = "Say hello in one short sentence." 
+ print(f"Session created (id: {session.agent_session_id}, status: {session.status})") + try: + endpoint_config = AgentEndpointConfig( + version_selector=VersionSelector( + version_selection_rules=[ + FixedRatioVersionSelectionRule(agent_version=agent.version, traffic_percentage=100), + ] + ), + protocols=[AgentEndpointProtocol.RESPONSES], + ) + + project_client.beta.agents.patch_agent_details( + agent_name=agent_name, + agent_endpoint=endpoint_config, + ) + + print(f"Agent endpoint configured for agent: {agent_name}") + input_text = "Say hello in one short sentence." - with project_client.get_openai_client(agent_name=agent_name) as openai_client: response = openai_client.responses.create( input=input_text, extra_body={ @@ -118,12 +128,18 @@ def _iter_sse_frames(stream, max_log_events: int): ) print(f"Response output: {response.output_text}") - print("Streaming session logs...") - raw_stream = project_client.beta.agents.get_session_log_stream( - agent_name=agent_name, - agent_version=agent.version, - session_id=session.agent_session_id, - ) - for frame in _iter_sse_frames(raw_stream, max_log_events=30): - print(f"SSE event: {frame.get('event')}") - print(f"SSE data: {frame.get('data')}\n") + print("Streaming session logs...") + raw_stream = project_client.beta.agents.get_session_log_stream( + agent_name=agent_name, + agent_version=agent.version, + session_id=session.agent_session_id, + ) + for frame in _iter_sse_frames(raw_stream, max_log_events=30): + print(f"SSE event: {frame.get('event')}") + print(f"SSE data: {frame.get('data')}\n") + finally: + project_client.beta.agents.delete_session( + agent_name=agent_name, + session_id=session.agent_session_id, + ) + print(f"Session deleted (id: {session.agent_session_id})") diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream_async.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream_async.py index aeafaf39b7ae..22808cfa27ee 100644 --- 
a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream_async.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_session_log_stream_async.py @@ -26,11 +26,14 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. + + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. + + NOTE: This sample assumes the Foundry project and Azure AI account are in the + same resource group. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. """ import asyncio @@ -42,21 +45,47 @@ from azure.ai.projects.aio import AIProjectClient from azure.ai.projects.models import ( - AgentEndpoint, + AgentEndpointConfig, AgentEndpointProtocol, FixedRatioVersionSelectionRule, VersionSelector, ) -from hosted_agents_util import create_agent_and_session_async +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version_async load_dotenv() -endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] -agent_name = "MySessionHostedAgent" + +def _iter_sse_frames(stream, max_log_events: int): + event_count = 0 + buffer = "" + + for chunk in stream: + buffer += chunk.decode("utf-8", errors="replace") + + while "\n\n" in buffer: + frame, buffer = buffer.split("\n\n", 1) + event_name = None + data_lines = [] + + for line in frame.splitlines(): + if line.startswith("event: "): + event_name = line[7:] + elif line.startswith("data: "): + data_lines.append(line[6:]) + + if data_lines or event_name: + event_count += 1 + yield { + 
"event": event_name, + "data": "\n".join(data_lines), + } + + if event_count >= max_log_events: + return -async def _iter_sse_frames(stream, max_log_events: int): +async def _iter_sse_frames_async(stream, max_log_events: int): event_count = 0 buffer = "" @@ -85,7 +114,10 @@ async def _iter_sse_frames(stream, max_log_events: int): return -async def main() -> None: +async def main(): + endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] + agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] + async with ( DefaultAzureCredential() as credential, AIProjectClient( @@ -93,43 +125,55 @@ async def main() -> None: credential=credential, allow_preview=True, ) as project_client, - create_agent_and_session_async(project_client, agent_name, image) as (agent_version, session_id), ): - endpoint_config = AgentEndpoint( - version_selector=VersionSelector( - version_selection_rules=[ - FixedRatioVersionSelectionRule(agent_version=agent_version, traffic_percentage=100), - ] - ), - protocols=[AgentEndpointProtocol.RESPONSES], - ) - - await project_client.beta.agents.patch_agent_details( + agent = await get_latest_active_agent_version_async(project_client, agent_name) + session = await project_client.beta.agents.create_session( agent_name=agent_name, - agent_endpoint=endpoint_config, + version_indicator=VersionRefIndicator(agent_version=agent.version), ) - print(f"Agent endpoint configured for agent: {agent_name}") - input_text = "Say hello in one short sentence." 
+ print(f"Session created (id: {session.agent_session_id}, status: {session.status})") + try: + endpoint_config = AgentEndpointConfig( + version_selector=VersionSelector( + version_selection_rules=[ + FixedRatioVersionSelectionRule(agent_version=agent.version, traffic_percentage=100), + ] + ), + protocols=[AgentEndpointProtocol.RESPONSES], + ) + + await project_client.beta.agents.patch_agent_details( + agent_name=agent_name, + agent_endpoint=endpoint_config, + ) + + print(f"Agent endpoint configured for agent: {agent_name}") + input_text = "Say hello in one short sentence." - async with project_client.get_openai_client(agent_name=agent_name) as openai_client: + openai_client = project_client.get_openai_client(agent_name=agent_name) response = await openai_client.responses.create( input=input_text, extra_body={ - "agent_session_id": session_id, + "agent_session_id": session.agent_session_id, }, ) print(f"Response output: {response.output_text}") - print("Streaming session logs...") - raw_stream = await project_client.beta.agents.get_session_log_stream( - agent_name=agent_name, - agent_version=agent_version, - session_id=session_id, - ) - - async for frame in _iter_sse_frames(raw_stream, max_log_events=30): - print(f"SSE event: {frame.get('event')}") - print(f"SSE data: {frame.get('data')}\n") + print("Streaming session logs...") + raw_stream = await project_client.beta.agents.get_session_log_stream( + agent_name=agent_name, + agent_version=agent.version, + session_id=session.agent_session_id, + ) + async for frame in _iter_sse_frames_async(raw_stream, max_log_events=30): + print(f"SSE event: {frame.get('event')}") + print(f"SSE data: {frame.get('data')}\n") + finally: + await project_client.beta.agents.delete_session( + agent_name=agent_name, + session_id=session.agent_session_id, + ) + print(f"Session deleted (id: {session.agent_session_id})") if __name__ == "__main__": diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud.py 
b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud.py index 5a840d7d0df1..310af03e2cdd 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud.py @@ -24,11 +24,17 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. + +SDK FUNCTIONS: + - project_client.agents.list_versions: resolves the active version for the existing hosted agent. + - project_client.beta.agents.create_session: creates a session for the agent. + - project_client.beta.agents.get_session: retrieves a session by ID. + - project_client.beta.agents.list_sessions: lists sessions for an agent. + - project_client.beta.agents.delete_session: deletes a session by ID. 
""" import os @@ -38,13 +44,13 @@ from azure.identity import DefaultAzureCredential from azure.ai.projects import AIProjectClient -from azure.ai.projects.models import HostedAgentDefinition, VersionRefIndicator, ProtocolVersionRecord -from hosted_agents_util import wait_for_agent_version_active +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version load_dotenv() endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] +agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] with ( DefaultAzureCredential() as credential, @@ -54,39 +60,12 @@ allow_preview=True, ) as project_client, ): - agent_name = "MySessionHostedAgent" - - # Create an agent version to back the session - agent = project_client.agents.create_version( - agent_name=agent_name, - definition=HostedAgentDefinition( - cpu="0.5", - memory="1Gi", - image=image, - container_protocol_versions=[ - ProtocolVersionRecord(protocol="responses", version="1.0.0"), - ], - ), - metadata={"enableVnextExperience": "true"}, - ) - print(f"Agent created (name: {agent.name}, version: {agent.version})") - - wait_for_agent_version_active( - project_client=project_client, - agent_name=agent_name, - agent_version=agent.version, - ) - - isolation_key = "sample-isolation-key" - - # Create a session for the agent - print(f"Creating {3} sessions for the agent...") + agent = get_latest_active_agent_version(project_client, agent_name) session = project_client.beta.agents.create_session( agent_name=agent_name, - isolation_key=isolation_key, version_indicator=VersionRefIndicator(agent_version=agent.version), ) - print(f"Session created (id: {session.agent_session_id}, status: {session.status})") + print(f"Created session (id: {session.agent_session_id}, status: {session.status})") # Retrieve the session by its ID fetched = project_client.beta.agents.get_session( @@ -102,11 +81,8 @@ for item in sessions: print(f" - 
{item.agent_session_id} (status: {item.status})") - # Delete the session - print(f"Deleting session with id: {session.agent_session_id}...") project_client.beta.agents.delete_session( agent_name=agent_name, session_id=session.agent_session_id, - isolation_key=isolation_key, ) - print(f"Session with id: {session.agent_session_id} deleted.") + print(f"Deleted session (id: {session.agent_session_id})") diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud_async.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud_async.py index 49169702195b..038c2fe68daa 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud_async.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_crud_async.py @@ -24,11 +24,17 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. + +SDK FUNCTIONS: + - project_client.agents.list_versions: resolves the active version for the existing hosted agent. + - project_client.beta.agents.create_session: creates a session for the agent. + - project_client.beta.agents.get_session: retrieves a session by ID. + - project_client.beta.agents.list_sessions: lists sessions for an agent. + - project_client.beta.agents.delete_session: deletes a session by ID. 
""" import asyncio @@ -39,16 +45,16 @@ from azure.identity.aio import DefaultAzureCredential from azure.ai.projects.aio import AIProjectClient -from azure.ai.projects.models import HostedAgentDefinition, ProtocolVersionRecord, VersionRefIndicator -from hosted_agents_util import wait_for_agent_version_active_async +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version_async load_dotenv() -endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] +async def main(): + endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] + agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] -async def main() -> None: async with ( DefaultAzureCredential() as credential, AIProjectClient( @@ -57,62 +63,32 @@ async def main() -> None: allow_preview=True, ) as project_client, ): - agent_name = "MySessionHostedAgent" - - # Create an agent version to back the session. - agent = await project_client.agents.create_version( + agent = await get_latest_active_agent_version_async(project_client, agent_name) + session = await project_client.beta.agents.create_session( agent_name=agent_name, - definition=HostedAgentDefinition( - cpu="0.5", - memory="1Gi", - image=image, - container_protocol_versions=[ - ProtocolVersionRecord(protocol="responses", version="v1"), - ], - ), - metadata={"enableVnextExperience": "true"}, + version_indicator=VersionRefIndicator(agent_version=agent.version), ) - print(f"Agent created (name: {agent.name}, version: {agent.version})") + print(f"Created session (id: {session.agent_session_id}, status: {session.status})") - await wait_for_agent_version_active_async( - project_client=project_client, + # Retrieve the session by its ID + fetched = await project_client.beta.agents.get_session( agent_name=agent_name, - agent_version=agent.version, + session_id=session.agent_session_id, ) + print(f"Retrieved session (id: {fetched.agent_session_id}, status: {fetched.status})") 
- isolation_key = "sample-isolation-key" - session = await project_client.beta.agents.create_session( + # List sessions for the agent + print("Listing sessions for the agent...") + sessions = project_client.beta.agents.list_sessions(agent_name=agent_name) + print("Sessions:") + async for item in sessions: + print(f" - {item.agent_session_id} (status: {item.status})") + + await project_client.beta.agents.delete_session( agent_name=agent_name, - isolation_key=isolation_key, - version_indicator=VersionRefIndicator(agent_version=agent.version), + session_id=session.agent_session_id, ) - print(f"Session created (id: {session.agent_session_id}, status: {session.status})") - - try: - fetched = await project_client.beta.agents.get_session( - agent_name=agent_name, - session_id=session.agent_session_id, - ) - print(f"Retrieved session (id: {fetched.agent_session_id}, status: {fetched.status})") - - print("Listing sessions for the agent...") - sessions = [] - async for item in project_client.beta.agents.list_sessions(agent_name=agent_name): - sessions.append(item) - print("Sessions:") - for item in sessions: - print(f" - {item.agent_session_id} (status: {item.status})") - finally: - print(f"Deleting session with id: {session.agent_session_id}...") - await project_client.beta.agents.delete_session( - agent_name=agent_name, - session_id=session.agent_session_id, - isolation_key=isolation_key, - ) - print(f"Session with id: {session.agent_session_id} deleted.") - - await project_client.agents.delete_version(agent_name=agent_name, agent_version=agent.version) - print(f"Agent version {agent.version} deleted.") + print(f"Deleted session (id: {session.agent_session_id})") if __name__ == "__main__": diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download.py index fa423e02c28c..eda9647b431c 100644 --- 
a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download.py @@ -24,11 +24,10 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite. """ import os @@ -38,12 +37,13 @@ from azure.identity import DefaultAzureCredential from azure.ai.projects import AIProjectClient -from hosted_agents_util import create_agent_and_session +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version load_dotenv() endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] +agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] # Construct the paths to the data folder and data file used in this sample script_dir = os.path.dirname(os.path.abspath(__file__)) @@ -62,10 +62,13 @@ allow_preview=True, ) as project_client, ): - agent_name = "MySessionHostedAgent" - - with create_agent_and_session(project_client, agent_name, image) as (_, session): - + agent = get_latest_active_agent_version(project_client, agent_name) + session = project_client.beta.agents.create_session( + agent_name=agent_name, + version_indicator=VersionRefIndicator(agent_version=agent.version), + ) + print(f"Session created (id: {session.agent_session_id}, status: {session.status})") + try: # Upload and list session files 
project_client.beta.agents.upload_session_file( agent_name=agent_name, @@ -115,3 +118,9 @@ agent_session_id=session.agent_session_id, path=remote_file_path2, ) + finally: + project_client.beta.agents.delete_session( + agent_name=agent_name, + session_id=session.agent_session_id, + ) + print(f"Session deleted (id: {session.agent_session_id})") diff --git a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download_async.py b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download_async.py index 0f37627826d1..bf1bc228e4b9 100644 --- a/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download_async.py +++ b/sdk/ai/azure-ai-projects/samples/hosted_agents/sample_sessions_files_upload_download_async.py @@ -6,8 +6,8 @@ """ DESCRIPTION: - This sample demonstrates how to perform session file upload, list, download, - and delete operations using the asynchronous AIProjectClient. + This sample demonstrates how to perform session file upload, list, download, + and delete operations on a Hosted Agent session using the asynchronous AIProjectClient. Sessions only work with Hosted Agents. @@ -24,11 +24,10 @@ Set these environment variables with your own values: 1) FOUNDRY_PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the Overview page of your Microsoft Foundry portal. - 2) FOUNDRY_AGENT_CONTAINER_IMAGE - The Hosted Agent container image in the format '/[:|@]' + 2) FOUNDRY_HOSTED_AGENT_NAME - The name of an existing Hosted Agent. - You can build and push an example image from - `samples/hosted_agents/assets/responses-echo-agent` and use that image value - for `FOUNDRY_AGENT_CONTAINER_IMAGE`. + If you don't have a Hosted Agent, run `sample_hosted_agent_create.py` first + to create one as a prerequisite.
""" import asyncio @@ -39,23 +38,24 @@ from azure.identity.aio import DefaultAzureCredential from azure.ai.projects.aio import AIProjectClient -from hosted_agents_util import create_agent_and_session_async +from azure.ai.projects.models import VersionRefIndicator +from hosted_agents_util import get_latest_active_agent_version_async load_dotenv() -endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] -image = os.environ["FOUNDRY_AGENT_CONTAINER_IMAGE"] -# Construct the paths to the data folder and data file used in this sample -script_dir = os.path.dirname(os.path.abspath(__file__)) -data_folder = os.path.join(script_dir, "assets") -data_file1 = os.path.join(data_folder, "data_file1.txt") -data_file2 = os.path.join(data_folder, "data_file2.txt") -remote_file_path1 = "/remote/data_file1.txt" -remote_file_path2 = "/remote/data_file2.txt" +async def main(): + endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"] + agent_name = os.environ["FOUNDRY_HOSTED_AGENT_NAME"] + # Construct the paths to the data folder and data file used in this sample + script_dir = os.path.dirname(os.path.abspath(__file__)) + data_folder = os.path.join(script_dir, "assets") + data_file1 = os.path.join(data_folder, "data_file1.txt") + data_file2 = os.path.join(data_folder, "data_file2.txt") + remote_file_path1 = "/remote/data_file1.txt" + remote_file_path2 = "/remote/data_file2.txt" -async def main() -> None: async with ( DefaultAzureCredential() as credential, AIProjectClient( @@ -64,12 +64,17 @@ async def main() -> None: allow_preview=True, ) as project_client, ): - agent_name = "MySessionHostedAgent" - - async with create_agent_and_session_async(project_client, agent_name, image) as (_, session_id): + agent = await get_latest_active_agent_version_async(project_client, agent_name) + session = await project_client.beta.agents.create_session( + agent_name=agent_name, + version_indicator=VersionRefIndicator(agent_version=agent.version), + ) + print(f"Session created (id: {session.agent_session_id}, 
status: {session.status})") + try: + # Upload and list session files await project_client.beta.agents.upload_session_file( agent_name=agent_name, - session_id=session_id, + session_id=session.agent_session_id, content_or_file_path=data_file1, path=remote_file_path1, ) @@ -77,7 +82,7 @@ async def main() -> None: print(f"Uploading session file: {data_file2} -> {remote_file_path2}") await project_client.beta.agents.upload_session_file( agent_name=agent_name, - session_id=session_id, + session_id=session.agent_session_id, content_or_file_path=data_file2, path=remote_file_path2, ) @@ -85,35 +90,42 @@ async def main() -> None: print("Listing session files for the session at path '.'...") files = await project_client.beta.agents.get_session_files( agent_name=agent_name, - agent_session_id=session_id, + agent_session_id=session.agent_session_id, path="/remote", ) for entry in files.entries: print(f" - name={entry.name}, size={entry.size}, is_directory={entry.is_directory}") print(f"Downloading and printing content from '{remote_file_path1}'") - download_stream = await project_client.beta.agents.download_session_file( + content_bytes = b"" + async for chunk in await project_client.beta.agents.download_session_file( agent_name=agent_name, - agent_session_id=session_id, + agent_session_id=session.agent_session_id, path=remote_file_path1, - ) - content_bytes = b"".join([chunk async for chunk in download_stream]) + ): + content_bytes += chunk file_content = content_bytes.decode("utf-8", errors="replace") print(f"Session file content ({remote_file_path1}):\n{file_content}") print(f"Deleting session file at path: {remote_file_path1}...") await project_client.beta.agents.delete_session_file( agent_name=agent_name, - agent_session_id=session_id, + agent_session_id=session.agent_session_id, path=remote_file_path1, ) print(f"Deleting session file at path: {remote_file_path2}...") await project_client.beta.agents.delete_session_file( agent_name=agent_name, - 
agent_session_id=session_id, + agent_session_id=session.agent_session_id, path=remote_file_path2, ) + finally: + await project_client.beta.agents.delete_session( + agent_name=agent_name, + session_id=session.agent_session_id, + ) + print(f"Session deleted (id: {session.agent_session_id})") if __name__ == "__main__": diff --git a/sdk/ai/azure-ai-projects/tests/agents/telemetry/test_non_recording_span.py b/sdk/ai/azure-ai-projects/tests/agents/telemetry/test_non_recording_span.py new file mode 100644 index 000000000000..c64b0288bdb1 --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/agents/telemetry/test_non_recording_span.py @@ -0,0 +1,202 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +""" +Tests verifying that instrumentors correctly skip non-recording spans. + +When a span is not recording, the instrumentor must not attempt to write +attributes or events to it. These tests use a mock span whose +``is_recording()`` returns False and whose mutation methods raise +``AssertionError`` if called, ensuring the guards work correctly. +""" + +from unittest.mock import MagicMock + +from azure.ai.projects.telemetry._ai_project_instrumentor import ( + _AIAgentsInstrumentorPreview, +) +from azure.ai.projects.telemetry._responses_instrumentor import ( + _ResponsesInstrumentorPreview, +) + + +def _make_non_recording_span(): + """Return a mock AbstractSpan wrapping a non-recording OTel span. + + * ``span_instance.is_recording()`` returns ``False``, so the guards + should skip all writes. + * Any call to ``add_event``, ``set_status``, ``record_exception`` or + ``add_attribute`` raises ``AssertionError``, catching any code path + that fails to check ``is_recording()`` first.
+ """ + span_instance = MagicMock() + span_instance.is_recording = MagicMock(return_value=False) + span_instance.add_event = MagicMock(side_effect=AssertionError("add_event called on non-recording span")) + span_instance.set_status = MagicMock(side_effect=AssertionError("set_status called on non-recording span")) + span_instance.record_exception = MagicMock( + side_effect=AssertionError("record_exception called on non-recording span") + ) + + span = MagicMock() + span.span_instance = span_instance + span.add_attribute = MagicMock(side_effect=AssertionError("add_attribute called on non-recording span")) + return span + + +class TestNonRecordingSpanProjectInstrumentor: + """Verify _AIAgentsInstrumentorPreview skips non-recording spans.""" + + def test_add_message_event_skips_non_recording_span(self): + """_add_message_event should not write to a non-recording span.""" + instrumentor = _AIAgentsInstrumentorPreview() + span = _make_non_recording_span() + + # This must not raise; the guard should return early. 
+ instrumentor._add_message_event(span, role="user", content="hello") + + def test_add_instructions_event_skips_non_recording_span(self): + """_add_instructions_event should not write to a non-recording span.""" + instrumentor = _AIAgentsInstrumentorPreview() + span = _make_non_recording_span() + + instrumentor._add_instructions_event(span, instructions="Be helpful", additional_instructions=None) + + def test_start_create_agent_span_skips_non_recording_span(self): + """start_create_agent_span should not write attributes to a non-recording span.""" + instrumentor = _AIAgentsInstrumentorPreview() + + # We need to patch start_span to return our non-recording span + from unittest.mock import patch + + non_recording_span = _make_non_recording_span() + + with patch( + "azure.ai.projects.telemetry._ai_project_instrumentor.start_span", + return_value=non_recording_span, + ): + result = instrumentor.start_create_agent_span( + server_address="test.openai.azure.com", + port=443, + model="gpt-4", + name="test-agent", + instructions="Be helpful", + ) + + # Should return the span but not have written any attributes/events to it + assert result is non_recording_span + non_recording_span.add_attribute.assert_not_called() + non_recording_span.span_instance.add_event.assert_not_called() + + +class TestNonRecordingSpanResponsesInstrumentor: + """Verify _ResponsesInstrumentorPreview skips non-recording spans.""" + + def test_set_span_attribute_safe_skips_non_recording_span(self): + """_set_span_attribute_safe should not write to a non-recording span.""" + instrumentor = _ResponsesInstrumentorPreview() + span = _make_non_recording_span() + + # This must not raise; the guard should return early. 
+ instrumentor._set_span_attribute_safe(span, "test.key", "test_value") + + def test_start_responses_span_skips_non_recording_span(self): + """start_responses_span should not write attributes to a non-recording span.""" + instrumentor = _ResponsesInstrumentorPreview() + + from unittest.mock import patch + + non_recording_span = _make_non_recording_span() + + with patch( + "azure.ai.projects.telemetry._responses_instrumentor.start_span", + return_value=non_recording_span, + ): + result = instrumentor.start_responses_span( + server_address="test.openai.azure.com", + port=443, + model="gpt-4", + assistant_name="test-agent", + conversation_id="conv-123", + input_text="Hello", + ) + + assert result is non_recording_span + non_recording_span.add_attribute.assert_not_called() + non_recording_span.span_instance.add_event.assert_not_called() + + def test_start_create_conversation_span_skips_non_recording_span(self): + """start_create_conversation_span should not write to a non-recording span.""" + instrumentor = _ResponsesInstrumentorPreview() + + from unittest.mock import patch + + non_recording_span = _make_non_recording_span() + + with patch( + "azure.ai.projects.telemetry._responses_instrumentor.start_span", + return_value=non_recording_span, + ): + result = instrumentor.start_create_conversation_span( + server_address="test.openai.azure.com", + port=443, + ) + + assert result is non_recording_span + non_recording_span.add_attribute.assert_not_called() + non_recording_span.span_instance.add_event.assert_not_called() + + def test_start_list_conversation_items_span_skips_non_recording_span(self): + """start_list_conversation_items_span should not write to a non-recording span.""" + instrumentor = _ResponsesInstrumentorPreview() + + from unittest.mock import patch + + non_recording_span = _make_non_recording_span() + + with patch( + "azure.ai.projects.telemetry._responses_instrumentor.start_span", + return_value=non_recording_span, + ): + result = 
instrumentor.start_list_conversation_items_span( + server_address="test.openai.azure.com", + port=443, + conversation_id="conv-123", + ) + + assert result is non_recording_span + non_recording_span.add_attribute.assert_not_called() + non_recording_span.span_instance.add_event.assert_not_called() + + def test_add_tool_call_events_skips_non_recording_span(self): + """_add_tool_call_events should not write to a non-recording span.""" + instrumentor = _ResponsesInstrumentorPreview() + span = _make_non_recording_span() + + # Create a mock response with function call output + mock_response = MagicMock() + mock_output_item = MagicMock() + mock_output_item.type = "function_call" + mock_output_item.name = "get_weather" + mock_output_item.call_id = "call_123" + mock_output_item.arguments = '{"city": "Seattle"}' + mock_response.output = [mock_output_item] + + instrumentor._add_tool_call_events(span, mock_response) + + def test_add_conversation_item_event_skips_non_recording_span(self): + """_add_conversation_item_event should not write to a non-recording span.""" + instrumentor = _ResponsesInstrumentorPreview() + span = _make_non_recording_span() + + mock_item = MagicMock() + mock_item.id = "item_123" + mock_item.type = "message" + mock_item.role = "user" + mock_item.content = [] + + instrumentor._add_conversation_item_event(span, mock_item) diff --git a/sdk/ai/azure-ai-projects/tests/foundry_features_header/foundry_features_header_test_base.py b/sdk/ai/azure-ai-projects/tests/foundry_features_header/foundry_features_header_test_base.py index 2c80a40f9902..c0c242ee5723 100644 --- a/sdk/ai/azure-ai-projects/tests/foundry_features_header/foundry_features_header_test_base.py +++ b/sdk/ai/azure-ai-projects/tests/foundry_features_header/foundry_features_header_test_base.py @@ -43,7 +43,8 @@ "schedules": "Schedules=V1Preview", "toolboxes": "Toolboxes=V1Preview", "skills": "Skills=V1Preview", - "agents": "HostedAgents=V1Preview,AgentEndpoints=V1Preview", + "datasets": 
"DataGenerationJobs=V1Preview", + "agents": "HostedAgents=V1Preview,AgentEndpoints=V1Preview,CodeAgents=V1Preview", } # Shared test cases for non-beta methods that optionally send the Foundry-Features header. diff --git a/sdk/ai/azure-ai-projects/tests/responses/openai_test_helpers.py b/sdk/ai/azure-ai-projects/tests/responses/openai_test_helpers.py new file mode 100644 index 000000000000..fbf3637291ea --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/responses/openai_test_helpers.py @@ -0,0 +1,69 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +"""Shared helpers for unit-testing AIProjectClient.get_openai_client (sync and async). + +These helpers build lightweight client stubs that bypass the real ``__init__`` so unit +tests can target individual branches of ``get_openai_client`` without making any +network calls. +""" + +from typing import Optional +from unittest.mock import MagicMock + +from azure.ai.projects import AIProjectClient +from azure.ai.projects.aio import AIProjectClient as AsyncAIProjectClient + +ENDPOINT = "https://myaccount.services.ai.azure.com/api/projects/myproject" +API_VERSION = "2025-01-01" + +# Patch targets used by tests to swap in mocked OpenAI/AsyncOpenAI constructors +# and bearer-token providers. 
+SYNC_OPENAI_PATCH = "azure.ai.projects._patch.OpenAI" +ASYNC_OPENAI_PATCH = "azure.ai.projects.aio._patch.AsyncOpenAI" +SYNC_TOKEN_PROVIDER_PATCH = "azure.ai.projects._patch.get_bearer_token_provider" +ASYNC_TOKEN_PROVIDER_PATCH = "azure.ai.projects.aio._patch.get_bearer_token_provider" + + +def make_sync_client( + allow_preview: bool = True, + console_logging: bool = False, + custom_user_agent: Optional[str] = None, +) -> AIProjectClient: + """Return a minimal sync AIProjectClient stub suitable for unit-testing get_openai_client.""" + client = AIProjectClient.__new__(AIProjectClient) + client._config = MagicMock() + client._config.endpoint = ENDPOINT + client._config.allow_preview = allow_preview + client._config.api_version = API_VERSION + client._config.credential = MagicMock() + client._console_logging_enabled = console_logging + client._custom_user_agent = custom_user_agent + return client + + +def make_async_client( + allow_preview: bool = True, + console_logging: bool = False, + custom_user_agent: Optional[str] = None, +) -> AsyncAIProjectClient: + """Return a minimal async AIProjectClient stub suitable for unit-testing get_openai_client.""" + client = AsyncAIProjectClient.__new__(AsyncAIProjectClient) + client._config = MagicMock() + client._config.endpoint = ENDPOINT + client._config.allow_preview = allow_preview + client._config.api_version = API_VERSION + client._config.credential = MagicMock() + client._console_logging_enabled = console_logging + client._custom_user_agent = custom_user_agent + return client + + +def mock_openai(user_agent: str = "openai/1.0"): + """Return ``(mock_class, mock_instance)`` where ``mock_class`` acts as the OpenAI constructor.""" + instance = MagicMock() + instance.user_agent = user_agent + mock_cls = MagicMock(return_value=instance) + return mock_cls, instance diff --git a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint.py b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint.py index 
87aecf0a94fa..be5d6f429040 100644 --- a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint.py +++ b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint.py @@ -4,13 +4,26 @@ # Licensed under the MIT License. # ------------------------------------ """ -Unit tests for verifying the base_url of the OpenAI client returned by AIProjectClient.get_openai_client(). +Unit tests for verifying the base_url, default_query (api-version) and Foundry feature +header behavior of the OpenAI client returned by AIProjectClient.get_openai_client(). No network calls are made. """ +from unittest.mock import patch + import pytest from azure.core.credentials import AccessToken from azure.ai.projects import AIProjectClient +from azure.ai.projects.models._patch import _FOUNDRY_FEATURES_HEADER_NAME + +from openai_test_helpers import ( + ENDPOINT, + API_VERSION, + SYNC_OPENAI_PATCH, + SYNC_TOKEN_PROVIDER_PATCH, + make_sync_client, + mock_openai, +) FAKE_ENDPOINT = "https://fake-account.services.ai.azure.com/api/projects/fake-project" AGENT_NAME = "fake-agent-name" @@ -59,3 +72,106 @@ def test_get_openai_client_with_agent_name_and_allow_preview(self): expected_base_url = FAKE_ENDPOINT.rstrip("/") + f"/agents/{AGENT_NAME}/endpoint/protocols/openai" assert str(openai_client.base_url).rstrip("/") == expected_base_url + + def test_trailing_slash_on_endpoint_is_stripped(self): + """Trailing slash on endpoint must not produce a double slash in the base URL.""" + client = make_sync_client() + client._config.endpoint = ENDPOINT + "/" + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + for c in mock_cls.call_args_list: + host = c.kwargs["base_url"].replace("https://", "") + assert "//" not in host + + +# =========================================================================== +# default_query / api-version injection branches +# 
=========================================================================== + + +class TestDefaultQueryBranches: + def test_no_agent_no_api_version_injected(self): + """Branch: no agent_name -> api-version is NOT injected into default_query.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + for c in mock_cls.call_args_list: + assert "api-version" not in c.kwargs.get("default_query", {}) + + def test_agent_injects_api_version(self): + """Branch: agent_name + no caller api-version -> inject SDK api_version.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(agent_name="my-agent") + for c in mock_cls.call_args_list: + assert c.kwargs["default_query"]["api-version"] == API_VERSION + + def test_agent_does_not_override_caller_api_version(self): + """Branch: agent_name + caller-provided api-version -> keep caller value.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(agent_name="my-agent", default_query={"api-version": "caller-v"}) + for c in mock_cls.call_args_list: + assert c.kwargs["default_query"]["api-version"] == "caller-v" + + def test_caller_default_query_values_preserved(self): + """Caller-supplied default_query keys other than api-version are preserved.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(default_query={"foo": "bar"}) + for c in mock_cls.call_args_list: + assert c.kwargs["default_query"]["foo"] == "bar" + + +# =========================================================================== +# Foundry feature 
header injection branches +# =========================================================================== + + +class TestFoundryFeatureHeaderBranches: + def test_no_agent_no_feature_header_injected(self): + """Branch: no agent_name -> Foundry feature header is NOT injected.""" + + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + real_call = mock_cls.call_args_list[-1] + assert _FOUNDRY_FEATURES_HEADER_NAME not in real_call.kwargs.get("default_headers", {}) + + def test_agent_injects_foundry_feature_header(self): + """Branch: agent_name + header not present -> inject the Foundry feature header.""" + + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(agent_name="my-agent") + real_call = mock_cls.call_args_list[-1] + assert _FOUNDRY_FEATURES_HEADER_NAME in real_call.kwargs.get("default_headers", {}) + + def test_agent_does_not_override_existing_feature_header(self): + """Branch: agent_name + header already in caller headers -> keep caller value.""" + + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client( + agent_name="my-agent", + default_headers={_FOUNDRY_FEATURES_HEADER_NAME: "caller-value"}, + ) + real_call = mock_cls.call_args_list[-1] + assert real_call.kwargs["default_headers"][_FOUNDRY_FEATURES_HEADER_NAME] == "caller-value" + + def test_caller_headers_other_keys_preserved(self): + """Caller-supplied non-feature headers are passed through to the OpenAI client.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + 
client.get_openai_client(default_headers={"X-My-Header": "hello"}) + real_call = mock_cls.call_args_list[-1] + assert real_call.kwargs["default_headers"]["X-My-Header"] == "hello" diff --git a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint_async.py b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint_async.py index f489a4f982b6..8335531787af 100644 --- a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint_async.py +++ b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_endpoint_async.py @@ -4,14 +4,27 @@ # Licensed under the MIT License. # ------------------------------------ """ -Unit tests for verifying the base_url of the AsyncOpenAI client returned by AIProjectClient.get_openai_client(). +Unit tests for verifying the base_url, default_query (api-version) and Foundry feature +header behavior of the AsyncOpenAI client returned by AIProjectClient.get_openai_client(). No network calls are made. """ -import pytest from typing import Any +from unittest.mock import patch + +import pytest from azure.core.credentials_async import AsyncTokenCredential from azure.ai.projects.aio import AIProjectClient +from azure.ai.projects.models._patch import _FOUNDRY_FEATURES_HEADER_NAME + +from openai_test_helpers import ( + ENDPOINT, + API_VERSION, + ASYNC_OPENAI_PATCH, + ASYNC_TOKEN_PROVIDER_PATCH, + make_async_client, + mock_openai, +) FAKE_ENDPOINT = "https://fake-account.services.ai.azure.com/api/projects/fake-project" AGENT_NAME = "fake-agent-name" @@ -68,3 +81,95 @@ async def test_get_openai_client_with_agent_name_and_allow_preview_async(self): expected_base_url = FAKE_ENDPOINT.rstrip("/") + f"/agents/{AGENT_NAME}/endpoint/protocols/openai" assert str(openai_client.base_url).rstrip("/") == expected_base_url + + @pytest.mark.asyncio + async def test_trailing_slash_on_endpoint_is_stripped_async(self): + """Trailing slash on endpoint must not produce a double slash in the base URL.""" + client = 
make_async_client() + client._config.endpoint = ENDPOINT + "/" + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + for c in mock_cls.call_args_list: + host = c.kwargs["base_url"].replace("https://", "") + assert "//" not in host + + +# =========================================================================== +# default_query / api-version injection branches (async) +# =========================================================================== + + +class TestDefaultQueryBranchesAsync: + @pytest.mark.asyncio + async def test_no_agent_no_api_version_injected(self): + """Branch: no agent_name -> api-version is NOT injected into default_query.""" + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + for c in mock_cls.call_args_list: + assert "api-version" not in c.kwargs.get("default_query", {}) + + @pytest.mark.asyncio + async def test_agent_injects_api_version(self): + """Branch: agent_name + no caller api-version -> inject SDK api_version.""" + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(agent_name="my-agent") + for c in mock_cls.call_args_list: + assert c.kwargs["default_query"]["api-version"] == API_VERSION + + @pytest.mark.asyncio + async def test_agent_does_not_override_caller_api_version(self): + """Branch: agent_name + caller-provided api-version -> keep caller value.""" + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(agent_name="my-agent", default_query={"api-version": "caller-v"}) + for c in mock_cls.call_args_list: + assert 
c.kwargs["default_query"]["api-version"] == "caller-v" + + +# =========================================================================== +# Foundry feature header injection branches (async) +# =========================================================================== + + +class TestFoundryFeatureHeaderBranchesAsync: + @pytest.mark.asyncio + async def test_no_agent_no_feature_header_injected(self): + """Branch: no agent_name -> Foundry feature header is NOT injected.""" + + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + real_call = mock_cls.call_args_list[-1] + assert _FOUNDRY_FEATURES_HEADER_NAME not in real_call.kwargs.get("default_headers", {}) + + @pytest.mark.asyncio + async def test_agent_injects_foundry_feature_header(self): + """Branch: agent_name + header not present -> inject the Foundry feature header.""" + + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client(agent_name="my-agent") + real_call = mock_cls.call_args_list[-1] + assert _FOUNDRY_FEATURES_HEADER_NAME in real_call.kwargs.get("default_headers", {}) + + @pytest.mark.asyncio + async def test_agent_does_not_override_existing_feature_header(self): + """Branch: agent_name + header already in caller headers -> keep caller value.""" + + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client( + agent_name="my-agent", + default_headers={_FOUNDRY_FEATURES_HEADER_NAME: "caller-value"}, + ) + real_call = mock_cls.call_args_list[-1] + assert real_call.kwargs["default_headers"][_FOUNDRY_FEATURES_HEADER_NAME] == "caller-value" diff --git a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides.py 
b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides.py index 782c5024f43a..e0895ae552ca 100644 --- a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides.py +++ b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides.py @@ -4,17 +4,27 @@ # Licensed under the MIT License. # ------------------------------------ """ -Tests to verify that a custom http_client can be passed to get_openai_client() -and that the returned OpenAI client uses it instead of the default one. +Tests covering caller-side overrides (http_client, api_key, base_url, default_headers) +and the user-agent / token-provider / logging-transport branches of +AIProjectClient.get_openai_client() (sync). """ import os from typing import Any +from unittest.mock import patch + import pytest import httpx from azure.core.credentials import TokenCredential from azure.ai.projects import AIProjectClient +from openai_test_helpers import ( + SYNC_OPENAI_PATCH, + SYNC_TOKEN_PROVIDER_PATCH, + make_sync_client, + mock_openai, +) + class DummyTokenCredential(TokenCredential): """A dummy credential that returns None for testing purposes.""" @@ -134,3 +144,60 @@ def test_api_key_and_base_url_overrides(self): assert ( str(openai_client.base_url) == custom_base_url + "/" ), f"Expected base_url '{custom_base_url}/', got '{openai_client.base_url}'" + + +# =========================================================================== +# api_key resolution branches +# =========================================================================== + + +class TestApiKeyBranches: + def test_token_provider_used_when_no_api_key(self): + """Branch: no 'api_key' kwarg -> get_bearer_token_provider() is invoked.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="provider") as mock_tp: + client.get_openai_client() + mock_tp.assert_called_once_with(client._config.credential, 
"https://ai.azure.com/.default") + for c in mock_cls.call_args_list: + assert c.kwargs["api_key"] == "provider" + + def test_caller_api_key_skips_token_provider(self): + """Branch: 'api_key' in kwargs -> token provider is NOT called.""" + client = make_sync_client() + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH) as mock_tp: + client.get_openai_client(api_key="my-secret-key") + mock_tp.assert_not_called() + for c in mock_cls.call_args_list: + assert c.kwargs["api_key"] == "my-secret-key" + + +# =========================================================================== +# http_client resolution branches +# =========================================================================== + + +class TestHttpClientBranches: + def test_http_client_is_none_by_default(self): + """Branch: no override + console logging off -> http_client is None.""" + client = make_sync_client(console_logging=False) + mock_cls, _ = mock_openai() + with patch(SYNC_OPENAI_PATCH, mock_cls), patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + for c in mock_cls.call_args_list: + assert c.kwargs["http_client"] is None + + def test_console_logging_creates_logging_transport(self): + """Branch: no override + _console_logging_enabled=True -> httpx.Client with logging transport.""" + client = make_sync_client(console_logging=True) + mock_cls, _ = mock_openai() + with ( + patch(SYNC_OPENAI_PATCH, mock_cls), + patch(SYNC_TOKEN_PROVIDER_PATCH, return_value="tok"), + patch("azure.ai.projects._patch.httpx") as mock_httpx, + patch("azure.ai.projects._patch._OpenAILoggingTransport"), + ): + mock_httpx.Client.return_value = object() + client.get_openai_client() + mock_httpx.Client.assert_called_once() diff --git a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides_async.py b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides_async.py index de8c484fd9f6..220dc961d22a 100644 --- 
a/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides_async.py +++ b/sdk/ai/azure-ai-projects/tests/responses/test_openai_client_overrides_async.py @@ -4,17 +4,27 @@ # Licensed under the MIT License. # ------------------------------------ """ -Tests to verify that a custom http_client can be passed to get_openai_client() -and that the returned AsyncOpenAI client uses it instead of the default one. +Tests covering caller-side overrides (http_client, api_key, base_url, default_headers) +and the user-agent / token-provider / logging-transport branches of +AIProjectClient.get_openai_client() (async). """ import os from typing import Any +from unittest.mock import patch + import pytest import httpx from azure.core.credentials_async import AsyncTokenCredential from azure.ai.projects.aio import AIProjectClient +from openai_test_helpers import ( + ASYNC_OPENAI_PATCH, + ASYNC_TOKEN_PROVIDER_PATCH, + make_async_client, + mock_openai, +) + class DummyAsyncTokenCredential(AsyncTokenCredential): """A dummy async credential that returns None for testing purposes.""" @@ -138,3 +148,67 @@ async def test_api_key_and_base_url_overrides_async(self): assert ( str(openai_client.base_url) == custom_base_url + "/" ), f"Expected base_url '{custom_base_url}/', got '{openai_client.base_url}'" + + +# =========================================================================== +# api_key resolution branches (async) +# =========================================================================== + + +class TestApiKeyBranchesAsync: + @pytest.mark.asyncio + async def test_token_provider_used_when_no_api_key(self): + """Branch: no 'api_key' kwarg -> get_bearer_token_provider() is invoked.""" + client = make_async_client() + mock_cls, _ = mock_openai() + with ( + patch(ASYNC_OPENAI_PATCH, mock_cls), + patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="async-provider") as mock_tp, + ): + client.get_openai_client() + mock_tp.assert_called_once_with(client._config.credential, 
"https://ai.azure.com/.default") + for c in mock_cls.call_args_list: + assert c.kwargs["api_key"] == "async-provider" + + @pytest.mark.asyncio + async def test_caller_api_key_skips_token_provider(self): + """Branch: 'api_key' in kwargs -> token provider is NOT called.""" + client = make_async_client() + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH) as mock_tp: + client.get_openai_client(api_key="async-secret") + mock_tp.assert_not_called() + for c in mock_cls.call_args_list: + assert c.kwargs["api_key"] == "async-secret" + + +# =========================================================================== +# http_client resolution branches (async) +# =========================================================================== + + +class TestHttpClientBranchesAsync: + @pytest.mark.asyncio + async def test_http_client_is_none_by_default(self): + """Branch: no override + console logging off -> http_client is None.""" + client = make_async_client(console_logging=False) + mock_cls, _ = mock_openai() + with patch(ASYNC_OPENAI_PATCH, mock_cls), patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"): + client.get_openai_client() + for c in mock_cls.call_args_list: + assert c.kwargs["http_client"] is None + + @pytest.mark.asyncio + async def test_console_logging_creates_async_logging_transport(self): + """Branch: no override + _console_logging_enabled=True -> httpx.AsyncClient with logging transport.""" + client = make_async_client(console_logging=True) + mock_cls, _ = mock_openai() + with ( + patch(ASYNC_OPENAI_PATCH, mock_cls), + patch(ASYNC_TOKEN_PROVIDER_PATCH, return_value="tok"), + patch("azure.ai.projects.aio._patch.httpx") as mock_httpx, + patch("azure.ai.projects.aio._patch._OpenAILoggingTransport"), + ): + mock_httpx.AsyncClient.return_value = object() + client.get_openai_client() + mock_httpx.AsyncClient.assert_called_once() diff --git a/sdk/ai/azure-ai-projects/tests/samples/llm_instructions.py 
b/sdk/ai/azure-ai-projects/tests/samples/llm_instructions.py index af98d794a98d..0e0f776441cf 100644 --- a/sdk/ai/azure-ai-projects/tests/samples/llm_instructions.py +++ b/sdk/ai/azure-ai-projects/tests/samples/llm_instructions.py @@ -18,7 +18,8 @@ from typing import Final -agent_tools_instructions: Final[str] = """ +agent_tools_instructions: Final[str] = ( + """ We just ran Python code and captured print/log output in an attached log file (TXT). Validate whether sample execution/output is correct for a tool-driven assistant workflow. @@ -43,9 +44,11 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) -memories_instructions: Final[str] = """ +memories_instructions: Final[str] = ( + """ We just ran Python code and captured print/log output in an attached log file (TXT). Validate whether sample execution/output is correct for a memories workflow. @@ -70,9 +73,11 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) -agents_instructions: Final[str] = """ +agents_instructions: Final[str] = ( + """ We just ran Python code and captured print/log output in an attached log file (TXT). Validate whether sample execution/output is correct. @@ -103,9 +108,11 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) -chat_completions_instructions: Final[str] = """ +chat_completions_instructions: Final[str] = ( + """ We just ran Python code and captured print/log output in an attached log file (TXT). Validate whether sample execution/output is correct for Chat Completions scenarios. @@ -124,9 +131,11 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) -resource_management_instructions: Final[str] = """ +resource_management_instructions: Final[str] = ( + """ We just ran Python code and captured print/log output in an attached log file (TXT). 
Validate whether sample execution/output is correct for resource-management samples (for example connections, files, and deployments). @@ -152,9 +161,11 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) -fine_tuning_instructions: Final[str] = """ +fine_tuning_instructions: Final[str] = ( + """ We just ran Python code and captured print/log output in an attached log file (TXT). Validate whether sample execution/output is correct for a fine-tuning workflow. @@ -178,9 +189,11 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) -evaluations_instructions: Final[str] = """ +evaluations_instructions: Final[str] = ( + """ We just ran Python code for an evaluation sample and captured print/log output in an attached log file (TXT). Your job: determine if the sample code executed to completion WITHOUT throwing an unhandled exception. @@ -202,9 +215,11 @@ Always respond with `reason` indicating the reason for the response. """.strip() +) -hosted_agents_instructions: Final[str] = """ +hosted_agents_instructions: Final[str] = ( + """ We just ran Python code for a hosted-agent sample and captured print/log output in an attached log file (TXT). Validate whether the sample executed correctly. @@ -226,6 +241,7 @@ Always include `reason` with a concise explanation tied to the observed print output. """.strip() +) # Folder (under samples/) -> instructions. 
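The `llm_instructions.py` hunks above all apply the same mechanical change: each module-level `Final[str]` triple-quoted constant is wrapped in parentheses so the trailing `.strip()` stays visually attached to the literal it modifies. A minimal standalone illustration of the pattern (the constant name and text here are hypothetical, not taken from the module):

```python
from typing import Final

# Wrapping the triple-quoted literal in parentheses keeps the trailing
# .strip() call attached to the string it modifies and gives formatters
# like Black a consistent shape for the assignment.
example_instructions: Final[str] = (
    """
We just ran Python code and captured print/log output in an attached log file (TXT).
Always include `reason` with a concise explanation tied to the observed print output.
""".strip()
)
```

The `.strip()` removes the leading and trailing newlines that the triple-quote layout introduces, so the constant begins and ends with content rather than whitespace.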
diff --git a/sdk/ai/azure-ai-projects/tests/samples/test_samples.py b/sdk/ai/azure-ai-projects/tests/samples/test_samples.py index 4ae4f4010925..92a952615b36 100644 --- a/sdk/ai/azure-ai-projects/tests/samples/test_samples.py +++ b/sdk/ai/azure-ai-projects/tests/samples/test_samples.py @@ -192,6 +192,8 @@ def test_chat_completions_samples(self, sample_path: str, **kwargs) -> None: @SamplePathPasser() @recorded_by_proxy(RecordedTransport.AZURE_CORE, RecordedTransport.HTTPX) def test_hosted_agents_samples(self, sample_path: str, **kwargs) -> None: + if os.path.basename(sample_path) == "sample_hosted_agent_create.py" and not self.is_live: + pytest.skip("sample_hosted_agent_create.py is skipped in replay mode due to RBAC complications.") env_vars = get_sample_env_vars(kwargs) executor = SyncSampleExecutor(self, sample_path, env_vars=env_vars, **kwargs) executor.execute() diff --git a/sdk/ai/azure-ai-projects/tests/test_base.py b/sdk/ai/azure-ai-projects/tests/test_base.py index 0979f5a177b6..1fb8abbc1c3c 100644 --- a/sdk/ai/azure-ai-projects/tests/test_base.py +++ b/sdk/ai/azure-ai-projects/tests/test_base.py @@ -71,6 +71,7 @@ memory_store_chat_model_deployment_name="sanitized-model-deployment-name", memory_store_embedding_model_deployment_name="text-embedding-ada-002", foundry_agent_container_image="sanitizedregistry.azurecr.io/sanitized/sessions-agent:latest", + foundry_hosted_agent_name="sanitized-hosted-agent-name", ) fineTuningServicePreparer = functools.partial( diff --git a/sdk/ai/azure-ai-projects/tsp-location-omitted.yaml b/sdk/ai/azure-ai-projects/tsp-location.yaml similarity index 69% rename from sdk/ai/azure-ai-projects/tsp-location-omitted.yaml rename to sdk/ai/azure-ai-projects/tsp-location.yaml index 70b8d1cf4350..d6d08d8ea372 100644 --- a/sdk/ai/azure-ai-projects/tsp-location-omitted.yaml +++ b/sdk/ai/azure-ai-projects/tsp-location.yaml @@ -1,4 +1,4 @@ directory: specification/ai-foundry/data-plane/Foundry -commit: 
9f045138c73bf7aa13ee59b805cd9143dbb77835 +commit: 4de5d57b5542c40135a5c4b9246cc7915d01f117 repo: Azure/azure-rest-api-specs additionalDirectories: