
[Release] Release v0.0.36-sync.0#9

Merged
CaymanWilliams merged 64 commits into main from update-sync-sdk
Oct 23, 2024

Conversation

@CaymanWilliams

## Changes

## Tests

- `make test` run locally
- `make fmt` applied
- relevant integration tests applied

parthban-db and others added 30 commits July 3, 2024 08:29
## Changes
<!-- Summary of your changes that are easy to understand -->
Replaced `pathlib.Path` with `pathlib.PurePosixPath` in
`/databricks/sdk/mixins/files.py`, which always uses Linux path separators
regardless of the OS it is running on. Fixes (databricks#660)
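The difference is visible with the standard library alone: a pure POSIX path always joins with forward slashes on any host OS, whereas a Windows-flavored path uses backslashes (a minimal illustration, not the SDK's actual code):

```python
from pathlib import PurePosixPath, PureWindowsPath

# PurePosixPath always joins with "/", regardless of the host OS.
remote = PurePosixPath("/Volumes") / "catalog" / "schema"
assert str(remote) == "/Volumes/catalog/schema"

# A Windows-flavored path joins with "\", which is what path handling
# based on the native Path class could produce when run on Windows.
local = PureWindowsPath("C:\\Users") / "me"
assert str(local) == "C:\\Users\\me"
```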

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Added a check for a trailing slash in the host URL. Fixes (databricks#661)
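Such a check can be sketched as a one-line normalization (the function name here is illustrative; the actual validation lives in the SDK's config handling):

```python
def normalize_host(host: str) -> str:
    """Strip trailing slashes so request URLs aren't built with '//'."""
    return host.rstrip("/")

assert normalize_host("https://adb-123.azuredatabricks.net/") == \
    "https://adb-123.azuredatabricks.net"
assert normalize_host("https://adb-123.azuredatabricks.net") == \
    "https://adb-123.azuredatabricks.net"
```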

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

---------

Signed-off-by: Parth Bansal <parth.bansal@databricks.com>
## Changes
<!-- Summary of your changes that are easy to understand -->
Changed the workflow so that tests also run on Windows.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Remove duplicate ubuntu tests

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
Ports databricks/databricks-sdk-go#925 to the
Python SDK.

Partners of Databricks need a mechanism to register themselves in
libraries or applications that they write. In this way, requests made by
users of those libraries will include sufficient information to link
those requests to the original users.

This PR adds a new `useragent` module with functions to manipulate the
user agent.
* `product()`: returns the globally configured product & version.
* `with_product(product: str, product_version: str)`: configure the
global product & version.
* `extra()`: returns the globally configured extra user agent metadata.
* `with_extra(key: str, value: str)`: add an extra entry to the global
extra user agent metadata.
* `with_partner(partner: str)`: add a partner to the global extra user
agent metadata.
* `to_string(product_override: Optional[Tuple[str, str]] = None,
other_info: Optional[List[Tuple[str, str]]] = None) -> str`: return the
User-Agent header as a string.


One new function here is `with_partner`, which can be used by a partner
to add partner information to the User-Agent header for requests made by
the SDK. The new header will have the form `partner/<partner id>`. The
partner identifier is opaque to the SDK, but it must be alphanumeric.

This PR also removes the requirement that a user agent entry contain
only a single copy of each key. This allows multiple partners to
register in the same library or application.

In this PR, I've also refactored the user agent library to be more
static, aligning it with the Go and Java SDKs. This makes it easier to
maintain and ensures similar behavior between all 3 SDKs. Note that this
SDK has extra functionality that doesn't exist in the Go and Java SDKs,
namely config-level user agent info; that is preserved here.
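The semantics described above can be captured in a minimal self-contained sketch (not the SDK's actual implementation; the real `databricks.sdk.useragent` output also carries SDK, Python, and OS segments):

```python
from typing import List, Optional, Tuple

# Module-level state, mirroring the "static" design described above.
_product: Tuple[str, str] = ("unknown", "0.0.0")
_extra: List[Tuple[str, str]] = []

def with_product(product: str, product_version: str) -> None:
    """Configure the global product and version."""
    global _product
    _product = (product, product_version)

def with_extra(key: str, value: str) -> None:
    """Add an extra entry; duplicate keys are allowed, so multiple
    partners can register in the same application."""
    _extra.append((key, value))

def with_partner(partner: str) -> None:
    """Register a partner as a partner/<partner id> entry."""
    with_extra("partner", partner)

def to_string(product_override: Optional[Tuple[str, str]] = None,
              other_info: Optional[List[Tuple[str, str]]] = None) -> str:
    """Render the User-Agent header as space-separated key/value pairs."""
    pairs = [product_override or _product] + _extra + (other_info or [])
    return " ".join(f"{k}/{v}" for k, v in pairs)

with_product("my-app", "1.2.3")
with_partner("acme")
with_partner("globex")
assert to_string() == "my-app/1.2.3 partner/acme partner/globex"
```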

## Tests
Unit tests were added to verify that the user agent contains all
expected parts and supports multiple partners.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix auth tests for Windows.

- Added a PowerShell script, since the bash script doesn't run on Windows
- Changed the 'COMSPEC' environment variable to run commands in PowerShell
- Used 'USERPROFILE' instead of 'HOME', as it is the Windows equivalent of
'HOME'
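The home-directory fallback can be sketched as a pure function over the environment (illustrative only; `os.path.expanduser` already handles much of this on modern Python):

```python
def home_dir(env: dict) -> str:
    """Prefer HOME, falling back to USERPROFILE, its Windows counterpart."""
    return env.get("HOME") or env.get("USERPROFILE", "")

assert home_dir({"HOME": "/home/me"}) == "/home/me"
assert home_dir({"USERPROFILE": "C:\\Users\\me"}) == "C:\\Users\\me"
```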

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix `tests/integration/test_files.py::test_local_io` for Windows. This
PR is part of fixing the tests for Windows.
## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix for workflows that are cancelled due to a failed workflow

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix `test_core.py` for Windows.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
Improve the changelog by grouping changes and enforcing tags in PRs

## Tests
- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied
- [X] Recreate old changelog

```

## 0.30.0

### Other Changes

 * Add Windows WorkFlow ([databricks#692](databricks#692)).
 * Check trailing slash in host url ([databricks#681](databricks#681)).
 * Fix auth tests for windows. ([databricks#697](databricks#697)).
 * Remove duplicate ubuntu tests ([databricks#693](databricks#693)).
 * Support partners in SDK ([databricks#648](databricks#648)).
 * fix windows path ([databricks#660](databricks#660)) ([databricks#673](databricks#673)).


### API Changes:

 * Added [w.serving_endpoints_data_plane](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints_data_plane.html) workspace-level service.
 * Added `deploy()` and `start()` methods for [w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html) workspace-level service.
 * Added `batch_get()` method for [w.consumer_listings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/consumer_listings.html) workspace-level service.
 * Added `batch_get()` method for [w.consumer_providers](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/consumer_providers.html) workspace-level service.
 * Added `create_schedule()`, `create_subscription()`, `delete_schedule()`, `delete_subscription()`, `get_schedule()`, `get_subscription()`, `list()`, `list_schedules()`, `list_subscriptions()` and `update_schedule()` methods for [w.lakeview](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakeview.html) workspace-level service.
 * Added `query_next_page()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
 * Added `databricks.sdk.service.serving.AppDeploymentMode`, `databricks.sdk.service.serving.ModelDataPlaneInfo` and `databricks.sdk.service.serving.StartAppRequest` dataclasses.
 * Added `databricks.sdk.service.catalog.CatalogIsolationMode` and `databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` dataclasses.
 * Added `databricks.sdk.service.dashboards.CreateScheduleRequest`, `databricks.sdk.service.dashboards.CreateSubscriptionRequest`, `databricks.sdk.service.dashboards.CronSchedule`, `databricks.sdk.service.dashboards.DashboardView`, `databricks.sdk.service.dashboards.DeleteScheduleRequest`, `any`, `databricks.sdk.service.dashboards.DeleteSubscriptionRequest`, `any`, `databricks.sdk.service.dashboards.GetScheduleRequest`, `databricks.sdk.service.dashboards.GetSubscriptionRequest`, `databricks.sdk.service.dashboards.ListDashboardsRequest`, `databricks.sdk.service.dashboards.ListDashboardsResponse`, `databricks.sdk.service.dashboards.ListSchedulesRequest`, `databricks.sdk.service.dashboards.ListSchedulesResponse`, `databricks.sdk.service.dashboards.ListSubscriptionsRequest`, `databricks.sdk.service.dashboards.ListSubscriptionsResponse`, `databricks.sdk.service.dashboards.Schedule`, `databricks.sdk.service.dashboards.SchedulePauseStatus`, `databricks.sdk.service.dashboards.Subscriber`, `databricks.sdk.service.dashboards.Subscription`, `databricks.sdk.service.dashboards.SubscriptionSubscriberDestination`, `databricks.sdk.service.dashboards.SubscriptionSubscriberUser` and `databricks.sdk.service.dashboards.UpdateScheduleRequest` dataclasses.
 * Added `databricks.sdk.service.jobs.PeriodicTriggerConfiguration` and `databricks.sdk.service.jobs.PeriodicTriggerConfigurationTimeUnit` dataclasses.
 * Added `databricks.sdk.service.marketplace.BatchGetListingsRequest`, `databricks.sdk.service.marketplace.BatchGetListingsResponse`, `databricks.sdk.service.marketplace.BatchGetProvidersRequest`, `databricks.sdk.service.marketplace.BatchGetProvidersResponse`, `databricks.sdk.service.marketplace.ProviderIconFile`, `databricks.sdk.service.marketplace.ProviderIconType` and `databricks.sdk.service.marketplace.ProviderListingSummaryInfo` dataclasses.
 * Added `databricks.sdk.service.oauth2.DataPlaneInfo` dataclass.
 * Added `databricks.sdk.service.vectorsearch.QueryVectorIndexNextPageRequest` dataclass.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.ExternalLocationInfo`.
 * Added `max_results` and `page_token` fields for `databricks.sdk.service.catalog.ListCatalogsRequest`.
 * Added `next_page_token` field for `databricks.sdk.service.catalog.ListCatalogsResponse`.
 * Added `table_serving_url` field for `databricks.sdk.service.catalog.OnlineTable`.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.StorageCredentialInfo`.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.UpdateExternalLocation`.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.UpdateStorageCredential`.
 * Added `termination_category` field for `databricks.sdk.service.jobs.ForEachTaskErrorMessageStats`.
 * Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`.
 * Added `environment_key` field for `databricks.sdk.service.jobs.RunTask`.
 * Added `environments` field for `databricks.sdk.service.jobs.SubmitRun`.
 * Added `dbt_task` and `environment_key` fields for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.TaskEmailNotifications`.
 * Added `periodic` field for `databricks.sdk.service.jobs.TriggerSettings`.
 * Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.WebhookNotifications`.
 * Added `provider_summary` field for `databricks.sdk.service.marketplace.Listing`.
 * Added `service_principal_id` and `service_principal_name` fields for `databricks.sdk.service.serving.App`.
 * Added `mode` field for `databricks.sdk.service.serving.AppDeployment`.
 * Added `mode` field for `databricks.sdk.service.serving.CreateAppDeploymentRequest`.
 * Added `data_plane_info` field for `databricks.sdk.service.serving.ServingEndpointDetailed`.
 * Added `query_type` field for `databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`.
 * Added `next_page_token` field for `databricks.sdk.service.vectorsearch.QueryVectorIndexResponse`.
 * Changed `list()` method for [a.account_storage_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_storage_credentials.html) account-level service to return `databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` dataclass.
 * Changed `isolation_mode` field for `databricks.sdk.service.catalog.CatalogInfo` to `databricks.sdk.service.catalog.CatalogIsolationMode` dataclass.
 * Changed `isolation_mode` field for `databricks.sdk.service.catalog.UpdateCatalog` to `databricks.sdk.service.catalog.CatalogIsolationMode` dataclass.
 * Removed `create_deployment()` method for [w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html) workspace-level service.
 * Removed `condition_task`, `dbt_task`, `notebook_task`, `pipeline_task`, `python_wheel_task`, `run_job_task`, `spark_jar_task`, `spark_python_task`, `spark_submit_task` and `sql_task` fields for `databricks.sdk.service.jobs.SubmitRun`.

OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-24
```
## Changes
Add Release tag

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
…cks#707)

## Changes
Move PR message validation to a separate workflow

## Tests
Updated title for this PR
## Changes
Add DataPlane support

## Tests
- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied
- [X] Manual test against staging workspace (prod workspaces don't
support DataPlane APIs)
…ks#709)

## Changes
Trigger the validate workflow in the merge queue
## Changes
Port of databricks/databricks-sdk-go#910 to the
Python SDK.

In order to use Azure U2M or M2M authentication with the Databricks SDK,
users must request a token from the Entra ID instance that the
underlying workspace or account belongs to, as Databricks rejects
requests to workspaces with a token from a different Entra ID tenant.
However, with Azure CLI auth, it is possible that a user is logged into
multiple tenants at the same time. Currently, the SDK uses the
subscription ID from the configured Azure Resource ID for the workspace
when issuing the `az account get-access-token` command. However, when
users don't specify the resource ID, the SDK simply fetches a token for
the active subscription for the user. If the active subscription is in a
different tenant than the workspace, users will see an error such as:

```
io.jsonwebtoken.IncorrectClaimException: Expected iss claim to be: https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/, but was: https://sts.windows.net/e3fe3f22-4b98-4c04-82cc-d8817d1b17da/
```

This PR modifies Azure CLI and Azure SP credential providers to attempt
to load the tenant ID of the workspace if not provided before
authenticating. Currently, there are no unauthenticated endpoints that
the tenant ID can be directly fetched from. However, the tenant ID is
indirectly exposed via the redirect URL used when logging into a
workspace. In this PR, we fetch the tenant ID from this endpoint and
configure it if not already set.

Here, we lazily fetch the tenant ID only in the auth methods that need
it. This prevents us from making any unnecessary requests if these Azure
credential providers are not needed.
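Once the redirect URL is in hand, extracting the tenant ID is pure URL parsing: the workspace's login endpoint redirects to `https://login.microsoftonline.com/<tenant-id>/...`, and the first path segment is the tenant ID. A hedged sketch under that assumption (not necessarily the SDK's exact code):

```python
from urllib.parse import urlparse

def tenant_id_from_redirect(location: str) -> str:
    """Extract the Entra ID tenant ID from a login redirect URL."""
    parsed = urlparse(location)
    if parsed.hostname != "login.microsoftonline.com":
        raise ValueError(f"unexpected redirect host: {parsed.hostname}")
    # The tenant ID is the first path segment, e.g.
    # https://login.microsoftonline.com/<tenant-id>/oauth2/authorize?...
    segments = [s for s in parsed.path.split("/") if s]
    if not segments:
        raise ValueError("no tenant ID in redirect URL")
    return segments[0]

loc = ("https://login.microsoftonline.com/"
       "72f988bf-86f1-41af-91ab-2d7cd011db47/oauth2/authorize?client_id=x")
assert tenant_id_from_redirect(loc) == "72f988bf-86f1-41af-91ab-2d7cd011db47"
```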

## Tests
Unit tests check that the tenant ID is fetched automatically if not
specified for an Azure workspace when authenticating with client
ID/secret or with the CLI.

- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Update OpenAPI spec

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [X] `make test` run locally
- [X] `make fmt` applied
- [x] relevant integration tests applied
…#714)

## Changes
<!-- Summary of your changes that are easy to understand -->
Added tests to make sure regeneration is not going to break API version
pinning: databricks/databricks-sdk-go#993

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
…atabricks#719)

## Changes
This PR fixes the current failing integration tests for the Python SDK,
unblocking their release.

There are two issues:
1. get_workspace_client fails in our integration tests because we call
it with a workspace that is not UC-enabled. Because tests are
authenticated as service principals, and it isn't possible to add
account-level service principals to non-UC workspaces, this call fails.
I address this by running this test against a UC-enabled workspace.
2. test_runtime_auth_from_jobs fails because a new LTS DBR version was
released (15.4) that doesn't support DBFS library installations. To
address this, I have created two tests:
test_runtime_auth_from_jobs_dbfs, which tests native auth using the SDK
installed from DBFS up to LTS 14.3, and
test_runtime_auth_from_jobs_volumes, which does the same with the SDK
installed from a volume.

## Tests
All integration tests passed (retriggered the GCP integration test
locally after adding single user data security mode).

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
…s#721)

## Changes
The current integration test for recursive workspace listing is very
slow because it lists all resources in a very large directory (the
integration test user's home folder). To decrease the time this test
takes, we can simply create a directory with a file and a subdirectory
with another file. This means the test requires only two API calls to
complete.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
To enable the release of the Apps package, we need to manually add it to
our doc generation.

Going forward, this should be added to the internal API specification.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] Codegen tool runs successfully on commit
88571b688969bc4509fb520d86d161eb20c3d662 of the API specification from
this PR.
### New Features and Improvements

* Add DataPlane support
([databricks#700](databricks#700)).
* Support partners in SDK
([databricks#648](databricks#648)).


### Bug Fixes

* Check trailing slash in host url
([databricks#681](databricks#681)).
* Decrease runtime of recursive workspace listing test
([databricks#721](databricks#721)).
* Fix test_get_workspace_client and test_runtime_auth_from_jobs
([databricks#719](databricks#719)).
* Infer Azure tenant ID if not set
([databricks#638](databricks#638)).


### Internal Changes

* Add Release tag and Workflow fix
([databricks#704](databricks#704)).
* Add apps package in docgen
([databricks#722](databricks#722)).
* Fix processing of `quoted` titles
([databricks#712](databricks#712)).
* Improve Changelog by grouping changes
([databricks#703](databricks#703)).
* Move PR message validation to a separate workflow
([databricks#707](databricks#707)).
* Test that Jobs API endpoints are pinned to 2.1
([databricks#714](databricks#714)).
* Trigger the validate workflow in the merge queue
([databricks#709](databricks#709)).
* Update OpenAPI spec
([databricks#715](databricks#715)).


### Other Changes

* Add Windows WorkFlow
([databricks#692](databricks#692)).
* Fix auth tests for windows.
([databricks#697](databricks#697)).
* Fix for cancelled workflow
([databricks#701](databricks#701)).
* Fix test_core for windows
([databricks#702](databricks#702)).
* Fix test_local_io for windows
([databricks#695](databricks#695)).
* Remove duplicate ubuntu tests
([databricks#693](databricks#693)).
* fix windows path
([databricks#660](databricks#660))
([databricks#673](databricks#673)).


### API Changes:

 * Added `databricks.sdk.service.apps` package.
* Added
[a.usage_dashboards](https://databricks-sdk-py.readthedocs.io/en/latest/account/usage_dashboards.html)
account-level service.
* Added
[w.alerts_legacy](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts_legacy.html)
workspace-level service,
[w.queries_legacy](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries_legacy.html)
workspace-level service and
[w.query_visualizations_legacy](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations_legacy.html)
workspace-level service.
* Added
[w.genie](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/genie.html)
workspace-level service.
* Added
[w.notification_destinations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/notification_destinations.html)
workspace-level service.
* Added `update()` method for
[w.clusters](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clusters.html)
workspace-level service.
* Added `list_visualizations()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service.
* Added `databricks.sdk.service.catalog.GetBindingsSecurableType` and
`databricks.sdk.service.catalog.UpdateBindingsSecurableType`
dataclasses.
* Added `databricks.sdk.service.billing.ActionConfiguration`,
`databricks.sdk.service.billing.ActionConfigurationType`,
`databricks.sdk.service.billing.AlertConfiguration`,
`databricks.sdk.service.billing.AlertConfigurationQuantityType`,
`databricks.sdk.service.billing.AlertConfigurationTimePeriod`,
`databricks.sdk.service.billing.AlertConfigurationTriggerType`,
`databricks.sdk.service.billing.BudgetConfiguration`,
`databricks.sdk.service.billing.BudgetConfigurationFilter`,
`databricks.sdk.service.billing.BudgetConfigurationFilterClause`,
`databricks.sdk.service.billing.BudgetConfigurationFilterOperator`,
`databricks.sdk.service.billing.BudgetConfigurationFilterTagClause`,
`databricks.sdk.service.billing.BudgetConfigurationFilterWorkspaceIdClause`,
`databricks.sdk.service.billing.CreateBillingUsageDashboardRequest`,
`databricks.sdk.service.billing.CreateBillingUsageDashboardResponse`,
`databricks.sdk.service.billing.CreateBudgetConfigurationBudget`,
`databricks.sdk.service.billing.CreateBudgetConfigurationBudgetActionConfigurations`,
`databricks.sdk.service.billing.CreateBudgetConfigurationBudgetAlertConfigurations`,
`databricks.sdk.service.billing.CreateBudgetConfigurationRequest`,
`databricks.sdk.service.billing.CreateBudgetConfigurationResponse`,
`databricks.sdk.service.billing.DeleteBudgetConfigurationRequest`,
`any`, `databricks.sdk.service.billing.GetBillingUsageDashboardRequest`,
`databricks.sdk.service.billing.GetBillingUsageDashboardResponse`,
`databricks.sdk.service.billing.GetBudgetConfigurationRequest`,
`databricks.sdk.service.billing.GetBudgetConfigurationResponse`,
`databricks.sdk.service.billing.ListBudgetConfigurationsRequest`,
`databricks.sdk.service.billing.ListBudgetConfigurationsResponse`,
`databricks.sdk.service.billing.UpdateBudgetConfigurationBudget`,
`databricks.sdk.service.billing.UpdateBudgetConfigurationRequest`,
`databricks.sdk.service.billing.UpdateBudgetConfigurationResponse` and
`databricks.sdk.service.billing.UsageDashboardType` dataclasses.
* Added `databricks.sdk.service.compute.ListClustersFilterBy`,
`databricks.sdk.service.compute.ListClustersSortBy`,
`databricks.sdk.service.compute.ListClustersSortByDirection`,
`databricks.sdk.service.compute.ListClustersSortByField`,
`databricks.sdk.service.compute.UpdateCluster`,
`databricks.sdk.service.compute.UpdateClusterResource` and `any`
dataclasses.
* Added `databricks.sdk.service.dashboards.ExecuteMessageQueryRequest`,
`databricks.sdk.service.dashboards.GenieAttachment`,
`databricks.sdk.service.dashboards.GenieConversation`,
`databricks.sdk.service.dashboards.GenieCreateConversationMessageRequest`,
`databricks.sdk.service.dashboards.GenieGetConversationMessageRequest`,
`databricks.sdk.service.dashboards.GenieGetMessageQueryResultRequest`,
`databricks.sdk.service.dashboards.GenieGetMessageQueryResultResponse`,
`databricks.sdk.service.dashboards.GenieMessage`,
`databricks.sdk.service.dashboards.GenieStartConversationMessageRequest`,
`databricks.sdk.service.dashboards.GenieStartConversationResponse`,
`databricks.sdk.service.dashboards.MessageError`,
`databricks.sdk.service.dashboards.MessageErrorType`,
`databricks.sdk.service.dashboards.MessageStatus`,
`databricks.sdk.service.dashboards.QueryAttachment`,
`databricks.sdk.service.dashboards.Result` and
`databricks.sdk.service.dashboards.TextAttachment` dataclasses.
* Added `any`, `databricks.sdk.service.iam.MigratePermissionsRequest`
and `databricks.sdk.service.iam.MigratePermissionsResponse` dataclasses.
* Added `databricks.sdk.service.oauth2.ListCustomAppIntegrationsRequest`
and `databricks.sdk.service.oauth2.ListPublishedAppIntegrationsRequest`
dataclasses.
* Added `databricks.sdk.service.pipelines.IngestionPipelineDefinition`
and `databricks.sdk.service.pipelines.PipelineStateInfoHealth`
dataclasses.
* Added `databricks.sdk.service.serving.GoogleCloudVertexAiConfig`
dataclass.
* Added `databricks.sdk.service.settings.Config`,
`databricks.sdk.service.settings.CreateNotificationDestinationRequest`,
`databricks.sdk.service.settings.DeleteNotificationDestinationRequest`,
`databricks.sdk.service.settings.DestinationType`,
`databricks.sdk.service.settings.EmailConfig`, `any`,
`databricks.sdk.service.settings.GenericWebhookConfig`,
`databricks.sdk.service.settings.GetNotificationDestinationRequest`,
`databricks.sdk.service.settings.ListNotificationDestinationsRequest`,
`databricks.sdk.service.settings.ListNotificationDestinationsResponse`,
`databricks.sdk.service.settings.ListNotificationDestinationsResult`,
`databricks.sdk.service.settings.MicrosoftTeamsConfig`,
`databricks.sdk.service.settings.NotificationDestination`,
`databricks.sdk.service.settings.PagerdutyConfig`,
`databricks.sdk.service.settings.SlackConfig` and
`databricks.sdk.service.settings.UpdateNotificationDestinationRequest`
dataclasses.
* Added `databricks.sdk.service.sql.AlertCondition`,
`databricks.sdk.service.sql.AlertConditionOperand`,
`databricks.sdk.service.sql.AlertConditionThreshold`,
`databricks.sdk.service.sql.AlertOperandColumn`,
`databricks.sdk.service.sql.AlertOperandValue`,
`databricks.sdk.service.sql.AlertOperator`,
`databricks.sdk.service.sql.ClientCallContext`,
`databricks.sdk.service.sql.ContextFilter`,
`databricks.sdk.service.sql.CreateAlertRequest`,
`databricks.sdk.service.sql.CreateAlertRequestAlert`,
`databricks.sdk.service.sql.CreateQueryRequest`,
`databricks.sdk.service.sql.CreateQueryRequestQuery`,
`databricks.sdk.service.sql.CreateQueryVisualizationsLegacyRequest`,
`databricks.sdk.service.sql.CreateVisualizationRequest`,
`databricks.sdk.service.sql.CreateVisualizationRequestVisualization`,
`databricks.sdk.service.sql.DatePrecision`,
`databricks.sdk.service.sql.DateRange`,
`databricks.sdk.service.sql.DateRangeValue`,
`databricks.sdk.service.sql.DateRangeValueDynamicDateRange`,
`databricks.sdk.service.sql.DateValue`,
`databricks.sdk.service.sql.DateValueDynamicDate`,
`databricks.sdk.service.sql.DeleteAlertsLegacyRequest`,
`databricks.sdk.service.sql.DeleteQueriesLegacyRequest`,
`databricks.sdk.service.sql.DeleteQueryVisualizationsLegacyRequest`,
`databricks.sdk.service.sql.DeleteVisualizationRequest`, `any`,
`databricks.sdk.service.sql.EncodedText`,
`databricks.sdk.service.sql.EncodedTextEncoding`,
`databricks.sdk.service.sql.EnumValue`,
`databricks.sdk.service.sql.GetAlertsLegacyRequest`,
`databricks.sdk.service.sql.GetQueriesLegacyRequest`,
`databricks.sdk.service.sql.LegacyAlert`,
`databricks.sdk.service.sql.LegacyAlertState`,
`databricks.sdk.service.sql.LegacyQuery`,
`databricks.sdk.service.sql.LegacyVisualization`,
`databricks.sdk.service.sql.LifecycleState`,
`databricks.sdk.service.sql.ListAlertsRequest`,
`databricks.sdk.service.sql.ListAlertsResponse`,
`databricks.sdk.service.sql.ListAlertsResponseAlert`,
`databricks.sdk.service.sql.ListQueriesLegacyRequest`,
`databricks.sdk.service.sql.ListQueryObjectsResponse`,
`databricks.sdk.service.sql.ListQueryObjectsResponseQuery`,
`databricks.sdk.service.sql.ListVisualizationsForQueryRequest`,
`databricks.sdk.service.sql.ListVisualizationsForQueryResponse`,
`databricks.sdk.service.sql.NumericValue`,
`databricks.sdk.service.sql.QueryBackedValue`,
`databricks.sdk.service.sql.QueryParameter`,
`databricks.sdk.service.sql.QuerySource`,
`databricks.sdk.service.sql.QuerySourceDriverInfo`,
`databricks.sdk.service.sql.QuerySourceEntryPoint`,
`databricks.sdk.service.sql.QuerySourceJobManager`,
`databricks.sdk.service.sql.QuerySourceTrigger`,
`databricks.sdk.service.sql.RestoreQueriesLegacyRequest`,
`databricks.sdk.service.sql.RunAsMode`,
`databricks.sdk.service.sql.ServerlessChannelInfo`,
`databricks.sdk.service.sql.StatementResponse`,
`databricks.sdk.service.sql.TextValue`,
`databricks.sdk.service.sql.TrashAlertRequest`,
`databricks.sdk.service.sql.TrashQueryRequest`,
`databricks.sdk.service.sql.UpdateAlertRequest`,
`databricks.sdk.service.sql.UpdateAlertRequestAlert`,
`databricks.sdk.service.sql.UpdateQueryRequest`,
`databricks.sdk.service.sql.UpdateQueryRequestQuery`,
`databricks.sdk.service.sql.UpdateVisualizationRequest` and
`databricks.sdk.service.sql.UpdateVisualizationRequestVisualization`
dataclasses.
* Added `force` field for
`databricks.sdk.service.catalog.DeleteSchemaRequest`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.catalog.GetBindingsRequest`.
* Added `include_aliases` field for
`databricks.sdk.service.catalog.GetByAliasRequest`.
* Added `include_aliases` field for
`databricks.sdk.service.catalog.GetModelVersionRequest`.
* Added `include_aliases` field for
`databricks.sdk.service.catalog.GetRegisteredModelRequest`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.catalog.ListSystemSchemasRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.catalog.ListSystemSchemasResponse`.
* Added `aliases` field for
`databricks.sdk.service.catalog.ModelVersionInfo`.
* Added `next_page_token` field for
`databricks.sdk.service.catalog.WorkspaceBindingsResponse`.
* Added `version` field for
`databricks.sdk.service.compute.GetPolicyFamilyRequest`.
* Added `filter_by`, `page_size`, `page_token` and `sort_by` fields for
`databricks.sdk.service.compute.ListClustersRequest`.
* Added `next_page_token` and `prev_page_token` fields for
`databricks.sdk.service.compute.ListClustersResponse`.
* Added `page_token` field for
`databricks.sdk.service.jobs.GetRunRequest`.
* Added `iterations`, `next_page_token` and `prev_page_token` fields for
`databricks.sdk.service.jobs.Run`.
* Added `create_time`, `created_by`, `creator_username` and `scopes`
fields for
`databricks.sdk.service.oauth2.GetCustomAppIntegrationOutput`.
* Added `next_page_token` field for
`databricks.sdk.service.oauth2.GetCustomAppIntegrationsOutput`.
* Added `create_time` and `created_by` fields for
`databricks.sdk.service.oauth2.GetPublishedAppIntegrationOutput`.
* Added `next_page_token` field for
`databricks.sdk.service.oauth2.GetPublishedAppIntegrationsOutput`.
* Added `enable_local_disk_encryption` field for
`databricks.sdk.service.pipelines.PipelineCluster`.
* Added `whl` field for
`databricks.sdk.service.pipelines.PipelineLibrary`.
* Added `health` field for
`databricks.sdk.service.pipelines.PipelineStateInfo`.
* Added `ai21labs_api_key_plaintext` field for
`databricks.sdk.service.serving.Ai21LabsConfig`.
* Added `aws_access_key_id_plaintext` and
`aws_secret_access_key_plaintext` fields for
`databricks.sdk.service.serving.AmazonBedrockConfig`.
* Added `anthropic_api_key_plaintext` field for
`databricks.sdk.service.serving.AnthropicConfig`.
* Added `cohere_api_base` and `cohere_api_key_plaintext` fields for
`databricks.sdk.service.serving.CohereConfig`.
* Added `databricks_api_token_plaintext` field for
`databricks.sdk.service.serving.DatabricksModelServingConfig`.
* Added `google_cloud_vertex_ai_config` field for
`databricks.sdk.service.serving.ExternalModel`.
* Added `microsoft_entra_client_secret_plaintext` and
`openai_api_key_plaintext` fields for
`databricks.sdk.service.serving.OpenAiConfig`.
* Added `palm_api_key_plaintext` field for
`databricks.sdk.service.serving.PaLmConfig`.
* Added `expiration_time` field for
`databricks.sdk.service.sharing.CreateRecipient`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.GetRecipientSharePermissionsResponse`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListProviderSharesResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.ListProvidersRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListProvidersResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.ListRecipientsRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListRecipientsResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.ListSharesRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListSharesResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.SharePermissionsRequest`.
* Added `expiration_time` field for
`databricks.sdk.service.sharing.UpdateRecipient`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.UpdateSharePermissions`.
* Added `condition`, `create_time`, `custom_body`, `custom_subject`,
`display_name`, `lifecycle_state`, `owner_user_name`, `parent_path`,
`query_id`, `seconds_to_retrigger`, `trigger_time` and `update_time`
fields for `databricks.sdk.service.sql.Alert`.
* Added `id` field for `databricks.sdk.service.sql.GetAlertRequest`.
* Added `id` field for `databricks.sdk.service.sql.GetQueryRequest`.
* Added `page_token` field for
`databricks.sdk.service.sql.ListQueriesRequest`.
* Added `apply_auto_limit`, `catalog`, `create_time`, `display_name`,
`last_modifier_user_name`, `lifecycle_state`, `owner_user_name`,
`parameters`, `parent_path`, `query_text`, `run_as_mode`, `schema`,
`update_time` and `warehouse_id` fields for
`databricks.sdk.service.sql.Query`.
* Added `context_filter` field for
`databricks.sdk.service.sql.QueryFilter`.
* Added `query_source` field for `databricks.sdk.service.sql.QueryInfo`.
* Added `create_time`, `display_name`, `query_id`, `serialized_options`,
`serialized_query_plan` and `update_time` fields for
`databricks.sdk.service.sql.Visualization`.
* Changed `create()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.CreateBudgetConfigurationResponse`
dataclass.
* Changed `create()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.CreateBudgetConfigurationRequest`
dataclass.
* Changed `delete()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.DeleteBudgetConfigurationRequest`
dataclass.
* Changed `delete()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return `any` dataclass.
* Changed `get()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.GetBudgetConfigurationRequest`
dataclass.
* Changed `get()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.GetBudgetConfigurationResponse`
dataclass.
* Changed `list()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.ListBudgetConfigurationsResponse`
dataclass.
* Changed `list()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to require request of
`databricks.sdk.service.billing.ListBudgetConfigurationsRequest`
dataclass.
* Changed `update()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.UpdateBudgetConfigurationResponse`
dataclass.
* Changed `update()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.UpdateBudgetConfigurationRequest`
dataclass.
* Changed `create()` method for
[a.custom_app_integration](https://databricks-sdk-py.readthedocs.io/en/latest/account/custom_app_integration.html)
account-level service with new required argument order.
* Changed `list()` method for
[a.custom_app_integration](https://databricks-sdk-py.readthedocs.io/en/latest/account/custom_app_integration.html)
account-level service to require request of
`databricks.sdk.service.oauth2.ListCustomAppIntegrationsRequest`
dataclass.
* Changed `list()` method for
[a.published_app_integration](https://databricks-sdk-py.readthedocs.io/en/latest/account/published_app_integration.html)
account-level service to require request of
`databricks.sdk.service.oauth2.ListPublishedAppIntegrationsRequest`
dataclass.
* Changed `delete()` method for
[a.workspace_assignment](https://databricks-sdk-py.readthedocs.io/en/latest/account/workspace_assignment.html)
account-level service to return `any` dataclass.
* Changed `update()` method for
[a.workspace_assignment](https://databricks-sdk-py.readthedocs.io/en/latest/account/workspace_assignment.html)
account-level service with new required argument order.
* Changed `create()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.CreateAlertRequest` dataclass.
* Changed `delete()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to return `any` dataclass.
* Changed `delete()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.TrashAlertRequest` dataclass.
* Changed `get()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service with new required argument order.
* Changed `list()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to return
`databricks.sdk.service.sql.ListAlertsResponse` dataclass.
* Changed `list()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to require request of
`databricks.sdk.service.sql.ListAlertsRequest` dataclass.
* Changed `update()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to return `databricks.sdk.service.sql.Alert`
dataclass.
* Changed `update()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.UpdateAlertRequest` dataclass.
* Changed `create()` and `edit()` methods for
[w.cluster_policies](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/cluster_policies.html)
workspace-level service with new required argument order.
* Changed `get()` method for
[w.model_versions](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/model_versions.html)
workspace-level service to return
`databricks.sdk.service.catalog.ModelVersionInfo` dataclass.
* Changed `migrate_permissions()` method for
[w.permission_migration](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/permission_migration.html)
workspace-level service. New request type is
`databricks.sdk.service.iam.MigratePermissionsRequest` dataclass.
* Changed `migrate_permissions()` method for
[w.permission_migration](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/permission_migration.html)
workspace-level service to return
`databricks.sdk.service.iam.MigratePermissionsResponse` dataclass.
* Changed `create()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.CreateQueryRequest` dataclass.
* Changed `delete()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service to return `any` dataclass.
* Changed `delete()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.TrashQueryRequest` dataclass.
* Changed `get()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service with new required argument order.
* Changed `list()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service to return
`databricks.sdk.service.sql.ListQueryObjectsResponse` dataclass.
* Changed `update()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.UpdateQueryRequest` dataclass.
* Changed `create()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.CreateVisualizationRequest` dataclass.
* Changed `delete()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service to return `any` dataclass.
* Changed `delete()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.DeleteVisualizationRequest` dataclass.
* Changed `update()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.UpdateVisualizationRequest` dataclass.
* Changed `list()` method for
[w.shares](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/shares.html)
workspace-level service to require request of
`databricks.sdk.service.sharing.ListSharesRequest` dataclass.
* Changed `execute_statement()` and `get_statement()` methods for
[w.statement_execution](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/statement_execution.html)
workspace-level service to return
`databricks.sdk.service.sql.StatementResponse` dataclass.
* Changed `securable_type` field for
`databricks.sdk.service.catalog.GetBindingsRequest` to
`databricks.sdk.service.catalog.GetBindingsSecurableType` dataclass.
* Changed `securable_type` field for
`databricks.sdk.service.catalog.UpdateWorkspaceBindingsParameters` to
`databricks.sdk.service.catalog.UpdateBindingsSecurableType` dataclass.
* Changed `name` field for `databricks.sdk.service.compute.CreatePolicy`
to no longer be required.
* Changed `name` field for `databricks.sdk.service.compute.EditPolicy`
to no longer be required.
* Changed `policy_family_id` field for
`databricks.sdk.service.compute.GetPolicyFamilyRequest` to `str`
dataclass.
* Changed `policy_families` field for
`databricks.sdk.service.compute.ListPolicyFamiliesResponse` to no longer
be required.
* Changed `definition`, `description`, `name` and `policy_family_id`
fields for `databricks.sdk.service.compute.PolicyFamily` to no longer be
required.
* Changed `permissions` field for
`databricks.sdk.service.iam.UpdateWorkspaceAssignments` to no longer be
required.
* Changed `access_control_list` field for
`databricks.sdk.service.jobs.CreateJob` to
`databricks.sdk.service.jobs.JobAccessControlRequestList` dataclass.
* Changed `access_control_list` field for
`databricks.sdk.service.jobs.SubmitRun` to
`databricks.sdk.service.jobs.JobAccessControlRequestList` dataclass.
* Changed `name` and `redirect_urls` fields for
`databricks.sdk.service.oauth2.CreateCustomAppIntegration` to no longer
be required.
* Changed `ingestion_definition` field for
`databricks.sdk.service.pipelines.CreatePipeline` to
`databricks.sdk.service.pipelines.IngestionPipelineDefinition`
dataclass.
* Changed `ingestion_definition` field for
`databricks.sdk.service.pipelines.EditPipeline` to
`databricks.sdk.service.pipelines.IngestionPipelineDefinition`
dataclass.
* Changed `ingestion_definition` field for
`databricks.sdk.service.pipelines.PipelineSpec` to
`databricks.sdk.service.pipelines.IngestionPipelineDefinition`
dataclass.
* Changed `ai21labs_api_key` field for
`databricks.sdk.service.serving.Ai21LabsConfig` to no longer be
required.
* Changed `aws_access_key_id` and `aws_secret_access_key` fields for
`databricks.sdk.service.serving.AmazonBedrockConfig` to no longer be
required.
* Changed `anthropic_api_key` field for
`databricks.sdk.service.serving.AnthropicConfig` to no longer be
required.
* Changed `cohere_api_key` field for
`databricks.sdk.service.serving.CohereConfig` to no longer be required.
* Changed `databricks_api_token` field for
`databricks.sdk.service.serving.DatabricksModelServingConfig` to no
longer be required.
* Changed `palm_api_key` field for
`databricks.sdk.service.serving.PaLmConfig` to no longer be required.
* Changed `tags` field for `databricks.sdk.service.sql.Query` to
`databricks.sdk.service.sql.List` dataclass.
* Changed `user_ids` and `warehouse_ids` fields for
`databricks.sdk.service.sql.QueryFilter` to
`databricks.sdk.service.sql.List` dataclass.
* Changed `results` field for `databricks.sdk.service.sql.QueryList` to
`databricks.sdk.service.sql.LegacyQueryList` dataclass.
* Changed `visualization` field for `databricks.sdk.service.sql.Widget`
to `databricks.sdk.service.sql.LegacyVisualization` dataclass.
* Removed
[w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html)
workspace-level service.
* Removed `restore()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service.
* Removed `databricks.sdk.service.marketplace.FilterType`,
`databricks.sdk.service.marketplace.ProviderIconFile`,
`databricks.sdk.service.marketplace.ProviderIconType`,
`databricks.sdk.service.marketplace.ProviderListingSummaryInfo`,
`databricks.sdk.service.marketplace.SortBy` and
`databricks.sdk.service.marketplace.VisibilityFilter` dataclasses.
* Removed `databricks.sdk.service.billing.Budget`,
`databricks.sdk.service.billing.BudgetAlert`,
`databricks.sdk.service.billing.BudgetList`,
`databricks.sdk.service.billing.BudgetWithStatus`,
`databricks.sdk.service.billing.BudgetWithStatusStatusDailyItem`,
`databricks.sdk.service.billing.DeleteBudgetRequest`, `any`,
`databricks.sdk.service.billing.GetBudgetRequest`, `any`,
`databricks.sdk.service.billing.WrappedBudget` and
`databricks.sdk.service.billing.WrappedBudgetWithStatus` dataclasses.
* Removed `any`, `databricks.sdk.service.iam.PermissionMigrationRequest`
and `databricks.sdk.service.iam.PermissionMigrationResponse`
dataclasses.
* Removed
`databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`
dataclass.
* Removed `databricks.sdk.service.serving.App`,
`databricks.sdk.service.serving.AppDeployment`,
`databricks.sdk.service.serving.AppDeploymentArtifacts`,
`databricks.sdk.service.serving.AppDeploymentMode`,
`databricks.sdk.service.serving.AppDeploymentState`,
`databricks.sdk.service.serving.AppDeploymentStatus`,
`databricks.sdk.service.serving.AppEnvironment`,
`databricks.sdk.service.serving.AppState`,
`databricks.sdk.service.serving.AppStatus`,
`databricks.sdk.service.serving.CreateAppDeploymentRequest`,
`databricks.sdk.service.serving.CreateAppRequest`,
`databricks.sdk.service.serving.DeleteAppRequest`,
`databricks.sdk.service.serving.EnvVariable`,
`databricks.sdk.service.serving.GetAppDeploymentRequest`,
`databricks.sdk.service.serving.GetAppEnvironmentRequest`,
`databricks.sdk.service.serving.GetAppRequest`,
`databricks.sdk.service.serving.ListAppDeploymentsRequest`,
`databricks.sdk.service.serving.ListAppDeploymentsResponse`,
`databricks.sdk.service.serving.ListAppsRequest`,
`databricks.sdk.service.serving.ListAppsResponse`,
`databricks.sdk.service.serving.StartAppRequest`,
`databricks.sdk.service.serving.StopAppRequest`, `any` and
`databricks.sdk.service.serving.UpdateAppRequest` dataclasses.
* Removed `databricks.sdk.service.sql.CreateQueryVisualizationRequest`,
`databricks.sdk.service.sql.DeleteAlertRequest`,
`databricks.sdk.service.sql.DeleteQueryRequest`,
`databricks.sdk.service.sql.DeleteQueryVisualizationRequest`,
`databricks.sdk.service.sql.ExecuteStatementResponse`,
`databricks.sdk.service.sql.GetStatementResponse`,
`databricks.sdk.service.sql.RestoreQueryRequest`,
`databricks.sdk.service.sql.StatementId`,
`databricks.sdk.service.sql.UserId` and
`databricks.sdk.service.sql.WarehouseId` dataclasses.
* Removed `databricks.sdk.service.compute.PolicyFamilyId` dataclass.
* Removed `can_use_client` field for
`databricks.sdk.service.compute.ListClustersRequest`.
* Removed `is_ascending` and `sort_by` fields for
`databricks.sdk.service.marketplace.ListListingsRequest`.
* Removed `provider_summary` field for
`databricks.sdk.service.marketplace.Listing`.
* Removed `filters` field for
`databricks.sdk.service.marketplace.ListingSetting`.
* Removed `metastore_id` field for
`databricks.sdk.service.marketplace.ListingSummary`.
* Removed `is_ascending` and `sort_by` fields for
`databricks.sdk.service.marketplace.SearchListingsRequest`.
* Removed `created_at`, `last_triggered_at`, `name`, `options`,
`parent`, `query`, `rearm`, `updated_at` and `user` fields for
`databricks.sdk.service.sql.Alert`.
* Removed `alert_id` field for
`databricks.sdk.service.sql.GetAlertRequest`.
* Removed `query_id` field for
`databricks.sdk.service.sql.GetQueryRequest`.
* Removed `order`, `page` and `q` fields for
`databricks.sdk.service.sql.ListQueriesRequest`.
* Removed `include_metrics` field for
`databricks.sdk.service.sql.ListQueryHistoryRequest`.
* Removed `can_edit`, `created_at`, `data_source_id`, `is_archived`,
`is_draft`, `is_favorite`, `is_safe`, `last_modified_by`,
`last_modified_by_id`, `latest_query_data_id`, `name`, `options`,
`parent`, `permission_tier`, `query`, `query_hash`, `run_as_role`,
`updated_at`, `user`, `user_id` and `visualizations` fields for
`databricks.sdk.service.sql.Query`.
* Removed `statement_ids` field for
`databricks.sdk.service.sql.QueryFilter`.
* Removed `can_subscribe_to_live_query` field for
`databricks.sdk.service.sql.QueryInfo`.
* Removed `metadata_time_ms`, `planning_time_ms` and
`query_execution_time_ms` fields for
`databricks.sdk.service.sql.QueryMetrics`.
* Removed `created_at`, `description`, `name`, `options`, `query` and
`updated_at` fields for `databricks.sdk.service.sql.Visualization`.

OpenAPI SHA: f98c07f9c71f579de65d2587bb0292f83d10e55d, Date: 2024-08-12
## Changes

This PR makes sure that single quotes are properly escaped when passing
regex patterns used to match errors.
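For illustration only — the real escaping lives in the SDK's code generator, and this helper name is made up — the idea is to escape backslashes before single quotes when embedding a pattern in a single-quoted generated string literal:

```python
def quote_pattern(pattern: str) -> str:
    # Escape backslashes first, then single quotes, so the pattern can
    # be embedded safely inside a single-quoted string literal in
    # generated source code.
    escaped = pattern.replace("\\", "\\\\").replace("'", "\\'")
    return f"'{escaped}'"

print(quote_pattern("Can't find cluster .*"))
```

Escaping in the other order would corrupt patterns that already contain a backslash followed by a quote.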

## Tests

Verified that SDK can properly be generated when the pattern contains
single quotes.

Note that `downstreams / compatibility (ucx, databrickslabs)` was
already failing and that this PR should not affect downstream consumers.

- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied
…alid semantic version: 0.33.1+420240816190912` (databricks#729)

## Changes

This PR fixes the SemVer regex to follow the official recommendation and
capture more patterns. It also ensures that accepted versions are both
SemVer and PEP440 compliant.
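As a sketch of the fix — a simplified pattern based on the semver.org recommendation, not the exact regex the SDK ships — a SemVer regex that accepts build metadata such as `+420240816190912` looks like:

```python
import re

# Simplified SemVer pattern with optional pre-release and build
# metadata. Illustrative only; the SDK's actual regex differs.
SEMVER = re.compile(
    r"^(?P<major>0|[1-9]\d*)"
    r"\.(?P<minor>0|[1-9]\d*)"
    r"\.(?P<patch>0|[1-9]\d*)"
    r"(?:-(?P<prerelease>[0-9A-Za-z.-]+))?"
    r"(?:\+(?P<build>[0-9A-Za-z.-]+))?$"
)

def parse(version: str) -> dict:
    m = SEMVER.match(version)
    if not m:
        raise ValueError(f"Invalid semantic version: {version}")
    return m.groupdict()

info = parse("0.33.1+420240816190912")
print(info["major"], info["minor"], info["patch"], info["build"])
```

PEP440 uses `+` for local version labels in much the same way, which is why the same string can be valid under both schemes.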

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
### Bug Fixes

* Fixed regression introduced in v0.30.0 causing `ValueError: Invalid
semantic version: 0.33.1+420240816190912`
([databricks#729](databricks#729)).


### Internal Changes

* Escape single quotes in regex matchers
([databricks#727](databricks#727)).


### API Changes:

* Added
[w.policy_compliance_for_clusters](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/policy_compliance_for_clusters.html)
workspace-level service.
* Added
[w.policy_compliance_for_jobs](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/policy_compliance_for_jobs.html)
workspace-level service.
* Added
[w.resource_quotas](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/resource_quotas.html)
workspace-level service.
* Added `databricks.sdk.service.catalog.GetQuotaRequest`,
`databricks.sdk.service.catalog.GetQuotaResponse`,
`databricks.sdk.service.catalog.ListQuotasRequest`,
`databricks.sdk.service.catalog.ListQuotasResponse` and
`databricks.sdk.service.catalog.QuotaInfo` dataclasses.
* Added `databricks.sdk.service.compute.ClusterCompliance`,
`databricks.sdk.service.compute.ClusterSettingsChange`,
`databricks.sdk.service.compute.EnforceClusterComplianceRequest`,
`databricks.sdk.service.compute.EnforceClusterComplianceResponse`,
`databricks.sdk.service.compute.GetClusterComplianceRequest`,
`databricks.sdk.service.compute.GetClusterComplianceResponse`,
`databricks.sdk.service.compute.ListClusterCompliancesRequest` and
`databricks.sdk.service.compute.ListClusterCompliancesResponse`
dataclasses.
* Added
`databricks.sdk.service.jobs.EnforcePolicyComplianceForJobResponseJobClusterSettingsChange`,
`databricks.sdk.service.jobs.EnforcePolicyComplianceRequest`,
`databricks.sdk.service.jobs.EnforcePolicyComplianceResponse`,
`databricks.sdk.service.jobs.GetPolicyComplianceRequest`,
`databricks.sdk.service.jobs.GetPolicyComplianceResponse`,
`databricks.sdk.service.jobs.JobCompliance`,
`databricks.sdk.service.jobs.ListJobComplianceForPolicyResponse` and
`databricks.sdk.service.jobs.ListJobComplianceRequest` dataclasses.
* Added `fallback` field for
`databricks.sdk.service.catalog.CreateExternalLocation`.
* Added `fallback` field for
`databricks.sdk.service.catalog.ExternalLocationInfo`.
* Added `fallback` field for
`databricks.sdk.service.catalog.UpdateExternalLocation`.
* Added `job_run_id` field for `databricks.sdk.service.jobs.BaseRun`.
* Added `job_run_id` field for `databricks.sdk.service.jobs.Run`.
* Added `include_metrics` field for
`databricks.sdk.service.sql.ListQueryHistoryRequest`.
* Added `statement_ids` field for
`databricks.sdk.service.sql.QueryFilter`.
* Removed `databricks.sdk.service.sql.ContextFilter` dataclass.
* Removed `context_filter` field for
`databricks.sdk.service.sql.QueryFilter`.
* Removed `pipeline_id` and `pipeline_update_id` fields for
`databricks.sdk.service.sql.QuerySource`.

OpenAPI SHA: 3eae49b444cac5a0118a3503e5b7ecef7f96527a, Date: 2024-08-21
…bricks#723)

## Changes
`DatabricksCliTokenSource().token()` itself cannot be copied, so a deep
copy of `Config` cannot be performed. This PR adds a wrapper function
that can be copied, so that `Config` can be deep-copied.
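A minimal sketch of why the wrapper helps (all names here are illustrative, not the SDK's actual classes): `copy.deepcopy` deep-copies a bound method's `__self__`, while it treats plain functions as atomic, so storing a closure instead of the bound method keeps the config copyable:

```python
import copy
from dataclasses import dataclass
from typing import Callable, Optional

class CliTokenSource:
    """Stand-in for a token source that cannot be deep-copied."""
    def __deepcopy__(self, memo):
        raise TypeError("cannot deep-copy a live token source")
    def token(self) -> str:
        return "dummy-token"

@dataclass
class Config:
    host: str = ""
    token_source: Optional[Callable[[], str]] = None

def make_token_callable(source: CliTokenSource) -> Callable[[], str]:
    # copy.deepcopy treats plain functions as atomic (shared, not
    # copied); a bound method such as `source.token` would instead
    # force a deep copy of `source` and raise.
    def inner() -> str:
        return source.token()
    return inner

cfg = Config(host="https://example.com",
             token_source=make_token_callable(CliTokenSource()))
cfg2 = copy.deepcopy(cfg)  # succeeds; would fail with source.token
print(cfg2.token_source())
```

The copied config shares the same underlying token source, which is the desired behavior for a live credential.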
## Tests

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
… does actually work through integration tests (databricks#736)

Signed-off-by: Serge Smertin <259697+nfx@users.noreply.github.com>
…tabricks#738)

## Changes
The current get_workspace_client test fails because the SP used by the
test does not have access to the first workspace listed. In the
[Go](https://github.com/databricks/databricks-sdk-go/blob/main/internal/account_client_test.go#L12)
&
[Java](https://github.com/databricks/databricks-sdk-java/blob/1b90e2318f8221ac0a6e4b56c9b0e4c286e38c9f/databricks-sdk-java/src/test/java/com/databricks/sdk/integration/AccountClientIT.java#L17)
SDKs, the corresponding test respects the `TEST_WORKSPACE_ID`
environment variable to decide which workspace to attempt to log in to.
This PR changes the test to use that environment variable as well.
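A rough sketch of the selection logic such a test can use (the function and field names are made up for illustration):

```python
import os
from typing import Dict, List, Optional

def pick_workspace(workspaces: List[Dict[str, int]]) -> Dict[str, int]:
    # Prefer the workspace named by TEST_WORKSPACE_ID; otherwise fall
    # back to the first workspace listed (the old, flaky behavior).
    target: Optional[str] = os.environ.get("TEST_WORKSPACE_ID")
    if target is None:
        return workspaces[0]
    for ws in workspaces:
        if str(ws["workspace_id"]) == target:
            return ws
    raise ValueError(f"workspace {target} not found in this account")

workspaces = [{"workspace_id": 111}, {"workspace_id": 222}]
os.environ["TEST_WORKSPACE_ID"] = "222"
print(pick_workspace(workspaces)["workspace_id"])
```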

## Tests

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
### Bug Fixes

* Fix `DatabricksConfig.copy` when authenticated with OAuth
([databricks#723](databricks#723)).


### Internal Changes

* Fix get_workspace_client test to match Go SDK behavior
([databricks#738](databricks#738)).
* Verify that `WorkspaceClient` created from `AccountClient` does
actually work through integration tests
([databricks#736](databricks#736)).
## Changes
Add Data Plane access documentation

## Tests

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
mgyucht and others added 25 commits September 16, 2024 12:52
…ess token from the CLI (databricks#748)

## Changes
Ports databricks/databricks-sdk-go#1021 to the
Python SDK.

The Azure CLI's `az account get-access-token` command does not allow
specifying the `--tenant` flag when the CLI is authenticated as a
managed identity.

Fixes databricks#742.
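The gating logic can be sketched as follows (the function and parameter names are hypothetical, not the SDK's actual API):

```python
from typing import List, Optional

def token_command(tenant_id: Optional[str],
                  uses_managed_identity: bool) -> List[str]:
    # `az account get-access-token` rejects --tenant when the CLI is
    # logged in as a managed identity, so the flag must be omitted.
    cmd = ["az", "account", "get-access-token", "--output", "json"]
    if tenant_id and not uses_managed_identity:
        cmd += ["--tenant", tenant_id]
    return cmd

print(token_command("my-tenant", uses_managed_identity=True))
```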

## Tests
Unit tests ensure that all expected cases are treated as managed
identities.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
### New Features and Improvements

* Support Models in `dbutils.fs` operations
([databricks#750](databricks#750)).


### Bug Fixes

* Do not specify --tenant flag when fetching managed identity access
token from the CLI
([databricks#748](databricks#748)).
* Fix deserialization of 401/403 errors
([databricks#758](databricks#758)).
* Use correct optional typing in `WorkspaceClient` for `mypy`
([databricks#760](databricks#760)).
## Changes
Add DataPlane docs to the index

## Tests


- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
…atabricks#761)

## Changes
This PR introduces a new model serving auth method to the Databricks SDK:
- If the correct environment variables are set, identifying a model
serving environment,
- check whether there is an OAuth token file written by the serving
environment, and
- if this file exists, use the token in it for authentication.
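The flow above can be sketched as follows (the environment-variable name, file location, and JSON key are illustrative assumptions, not necessarily what the SDK uses):

```python
import json
import os
import tempfile
from typing import Optional

def model_serving_token(oauth_file: str) -> Optional[str]:
    # 1. Detect a model serving environment via an environment
    #    variable (the variable name here is an assumption).
    if os.environ.get("IS_IN_DB_MODEL_SERVING_ENV") != "true":
        return None
    # 2. Check whether the serving runtime wrote an OAuth token file.
    if not os.path.exists(oauth_file):
        return None
    # 3. Use the token from that file for authentication.
    with open(oauth_file) as f:
        return json.load(f).get("OAUTH_TOKEN")

# Demo: simulate a serving environment with a freshly written file.
os.environ["IS_IN_DB_MODEL_SERVING_ENV"] = "true"
with tempfile.NamedTemporaryFile("w", suffix=".json",
                                 delete=False) as f:
    json.dump({"OAUTH_TOKEN": "example-token"}, f)
    path = f.name
print(model_serving_token(path))
```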

## Tests
Added Unit tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied

---------

Signed-off-by: aravind-segu <aravind.segu@databricks.com>
### New Features and Improvements

* Integrate Databricks SDK with Model Serving Auth Provider
([databricks#761](databricks#761)).


### Bug Fixes

* Add DataPlane docs to the index
([databricks#764](databricks#764)).
* `mypy` error: Skipping analyzing "google": module is installed, but
missing library stubs or py.typed marker
([databricks#769](databricks#769)).
## Changes

This PR updates the contributing guidelines to include the DCO
(Developer Certificate of Origin) that external contributors must sign
off on in order to contribute.

## Tests

N/A
## Changes
Updating SDK to latest OpenAPI spec + fix generation (need to import
`Optional`)

Note: `test_github_oidc_flow_works_with_azure` fails on genkit
generate-sdk py but passes when run separately right after generation
without any change. This seems to be a non-blocker, so going ahead with
SDK generation.

## Tests
Unit tests. Nightly tests will run over release PR.
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
### Internal Changes

* Add DCO guidelines
([databricks#773](databricks#773)).
* Update SDK to latest OpenAPI spec
([databricks#766](databricks#766)).


### API Changes:

* Added
[w.disable_legacy_access](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/disable_legacy_access.html)
workspace-level service and
[a.disable_legacy_features](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings/disable_legacy_features.html)
account-level service.
* Added
[w.temporary_table_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/temporary_table_credentials.html)
workspace-level service.
* Added `put_ai_gateway()` method for
[w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html)
workspace-level service.
* Added `databricks.sdk.service.apps.ApplicationState`,
`databricks.sdk.service.apps.ApplicationStatus`,
`databricks.sdk.service.apps.ComputeState` and
`databricks.sdk.service.apps.ComputeStatus` dataclasses.
* Added `databricks.sdk.service.catalog.AwsCredentials`,
`databricks.sdk.service.catalog.AzureUserDelegationSas`,
`databricks.sdk.service.catalog.GcpOauthToken`,
`databricks.sdk.service.catalog.GenerateTemporaryTableCredentialRequest`,
`databricks.sdk.service.catalog.GenerateTemporaryTableCredentialResponse`,
`databricks.sdk.service.catalog.R2Credentials` and
`databricks.sdk.service.catalog.TableOperation` dataclasses.
* Added `databricks.sdk.service.serving.AiGatewayConfig`,
`databricks.sdk.service.serving.AiGatewayGuardrailParameters`,
`databricks.sdk.service.serving.AiGatewayGuardrailPiiBehavior`,
`databricks.sdk.service.serving.AiGatewayGuardrailPiiBehaviorBehavior`,
`databricks.sdk.service.serving.AiGatewayGuardrails`,
`databricks.sdk.service.serving.AiGatewayInferenceTableConfig`,
`databricks.sdk.service.serving.AiGatewayRateLimit`,
`databricks.sdk.service.serving.AiGatewayRateLimitKey`,
`databricks.sdk.service.serving.AiGatewayRateLimitRenewalPeriod`,
`databricks.sdk.service.serving.AiGatewayUsageTrackingConfig`,
`databricks.sdk.service.serving.PutAiGatewayRequest` and
`databricks.sdk.service.serving.PutAiGatewayResponse` dataclasses.
* Added `databricks.sdk.service.settings.BooleanMessage`,
`databricks.sdk.service.settings.DeleteDisableLegacyAccessRequest`,
`databricks.sdk.service.settings.DeleteDisableLegacyAccessResponse`,
`databricks.sdk.service.settings.DeleteDisableLegacyFeaturesRequest`,
`databricks.sdk.service.settings.DeleteDisableLegacyFeaturesResponse`,
`databricks.sdk.service.settings.DisableLegacyAccess`,
`databricks.sdk.service.settings.DisableLegacyFeatures`,
`databricks.sdk.service.settings.GetDisableLegacyAccessRequest`,
`databricks.sdk.service.settings.GetDisableLegacyFeaturesRequest`,
`databricks.sdk.service.settings.UpdateDisableLegacyAccessRequest` and
`databricks.sdk.service.settings.UpdateDisableLegacyFeaturesRequest`
dataclasses.
* Added `databricks.sdk.service.workspace.CreateCredentialsRequest`,
`databricks.sdk.service.workspace.CreateRepoRequest`,
`databricks.sdk.service.workspace.CreateRepoResponse`,
`databricks.sdk.service.workspace.DeleteCredentialsRequest`, `any`,
`any`, `databricks.sdk.service.workspace.GetCredentialsRequest`,
`databricks.sdk.service.workspace.GetRepoResponse`,
`databricks.sdk.service.workspace.ListCredentialsResponse`,
`databricks.sdk.service.workspace.UpdateCredentialsRequest`, `any`,
`databricks.sdk.service.workspace.UpdateRepoRequest` and `any`
dataclasses.
* Added `app_status` and `compute_status` fields for
`databricks.sdk.service.apps.App`.
* Added `deployment_id` field for
`databricks.sdk.service.apps.CreateAppDeploymentRequest`.
* Added `external_access_enabled` field for
`databricks.sdk.service.catalog.GetMetastoreSummaryResponse`.
* Added `include_manifest_capabilities` field for
`databricks.sdk.service.catalog.GetTableRequest`.
* Added `include_manifest_capabilities` field for
`databricks.sdk.service.catalog.ListSummariesRequest`.
* Added `include_manifest_capabilities` field for
`databricks.sdk.service.catalog.ListTablesRequest`.
* Added `external_access_enabled` field for
`databricks.sdk.service.catalog.MetastoreInfo`.
* Added `budget_policy_id` and `schema` fields for
`databricks.sdk.service.pipelines.CreatePipeline`.
* Added `budget_policy_id` and `schema` fields for
`databricks.sdk.service.pipelines.EditPipeline`.
* Added `effective_budget_policy_id` field for
`databricks.sdk.service.pipelines.GetPipelineResponse`.
* Added `budget_policy_id` and `schema` fields for
`databricks.sdk.service.pipelines.PipelineSpec`.
* Added `ai_gateway` field for
`databricks.sdk.service.serving.CreateServingEndpoint`.
* Added `ai_gateway` field for
`databricks.sdk.service.serving.ServingEndpoint`.
* Added `ai_gateway` field for
`databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `workspace_id` field for
`databricks.sdk.service.settings.TokenInfo`.
* Added `credential_id`, `git_provider` and `git_username` fields for
`databricks.sdk.service.workspace.GetCredentialsResponse`.
* Changed `delete()`, `start()` and `stop()` methods for
[w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html)
workspace-level service to return `databricks.sdk.service.apps.App`
dataclass.
* Changed `deploy()` method for
[w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html)
workspace-level service with new required argument order.
* Changed `create()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service. New request type is
`databricks.sdk.service.workspace.CreateCredentialsRequest` dataclass.
* Changed `delete()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service. New request type is
`databricks.sdk.service.workspace.DeleteCredentialsRequest` dataclass.
* Changed `delete()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service to return `any` dataclass.
* Changed `get()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service. New request type is
`databricks.sdk.service.workspace.GetCredentialsRequest` dataclass.
* Changed `get()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service to return
`databricks.sdk.service.workspace.GetCredentialsResponse` dataclass.
* Changed `list()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service to return
`databricks.sdk.service.workspace.ListCredentialsResponse` dataclass.
* Changed `update()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service. New request type is
`databricks.sdk.service.workspace.UpdateCredentialsRequest` dataclass.
* Changed `update()` method for
[w.git_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/git_credentials.html)
workspace-level service to return `any` dataclass.
* Changed `create()` method for
[w.repos](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/repos.html)
workspace-level service to return
`databricks.sdk.service.workspace.CreateRepoResponse` dataclass.
* Changed `create()` method for
[w.repos](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/repos.html)
workspace-level service. New request type is
`databricks.sdk.service.workspace.CreateRepoRequest` dataclass.
* Changed `delete()` method for
[w.repos](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/repos.html)
workspace-level service to return `any` dataclass.
* Changed `get()` method for
[w.repos](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/repos.html)
workspace-level service to return
`databricks.sdk.service.workspace.GetRepoResponse` dataclass.
* Changed `update()` method for
[w.repos](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/repos.html)
workspace-level service to return `any` dataclass.
* Changed `update()` method for
[w.repos](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/repos.html)
workspace-level service. New request type is
`databricks.sdk.service.workspace.UpdateRepoRequest` dataclass.
* Changed `source_code_path` field for
`databricks.sdk.service.apps.AppDeployment` to no longer be required.
* Changed `source_code_path` field for
`databricks.sdk.service.apps.CreateAppDeploymentRequest` to no longer be
required.
* Changed `return_params` and `routine_dependencies` fields for
`databricks.sdk.service.catalog.CreateFunction` to no longer be
required.
* Changed `credential_id` and `git_provider` fields for
`databricks.sdk.service.workspace.CreateCredentialsResponse` to be
required.
* Changed `credential_id` field for
`databricks.sdk.service.workspace.CredentialInfo` to be required.
* Changed `patterns` field for
`databricks.sdk.service.workspace.SparseCheckout` to
`databricks.sdk.service.workspace.List` dataclass.
* Changed `patterns` field for
`databricks.sdk.service.workspace.SparseCheckoutUpdate` to
`databricks.sdk.service.workspace.List` dataclass.
* Removed `databricks.sdk.service.apps.AppState`,
`databricks.sdk.service.apps.AppStatus`, `any` and `any` dataclasses.
* Removed `databricks.sdk.service.sql.ClientCallContext`,
`databricks.sdk.service.sql.EncodedText`,
`databricks.sdk.service.sql.EncodedTextEncoding`,
`databricks.sdk.service.sql.QuerySource`,
`databricks.sdk.service.sql.QuerySourceDriverInfo`,
`databricks.sdk.service.sql.QuerySourceEntryPoint`,
`databricks.sdk.service.sql.QuerySourceJobManager`,
`databricks.sdk.service.sql.QuerySourceTrigger` and
`databricks.sdk.service.sql.ServerlessChannelInfo` dataclasses.
* Removed `databricks.sdk.service.workspace.CreateCredentials`,
`databricks.sdk.service.workspace.CreateRepo`,
`databricks.sdk.service.workspace.DeleteGitCredentialRequest`,
`databricks.sdk.service.workspace.GetGitCredentialRequest`,
`databricks.sdk.service.workspace.SparseCheckoutPattern`,
`databricks.sdk.service.workspace.UpdateCredentials`,
`databricks.sdk.service.workspace.UpdateRepo` and `any` dataclasses.
* Removed `status` field for `databricks.sdk.service.apps.App`.
* Removed `query_source` field for
`databricks.sdk.service.sql.QueryInfo`.
* Removed `credentials` field for
`databricks.sdk.service.workspace.GetCredentialsResponse`.

OpenAPI SHA: 248f4ad9668661da9d0bf4a7b0119a2d44fd1e75, Date: 2024-09-25
…ks#750) (databricks#778)

This reverts commit 3162545.
Verified that /Models downloads still work correctly.

## Changes
<!-- Summary of your changes that are easy to understand -->

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
Fix Model Serving Tests.
- Added a preferred auth_type so that the tests do not try other auth
types first.
- Patched the function that reads `.databrickscfg`.
- Unset pre-existing environment variables.
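The environment-variable isolation described above can be sketched with stdlib tooling alone; the helper name here is illustrative, not the SDK's actual test utility:

```python
import os
from unittest import mock

def isolated_env(overrides=None):
    """Context manager: an environment with every DATABRICKS_* variable
    removed, plus explicit overrides such as a pinned auth type."""
    env = {k: v for k, v in os.environ.items()
           if not k.startswith("DATABRICKS_")}
    env.update(overrides or {})
    # clear=True swaps out the whole environment for the duration of the block
    return mock.patch.dict(os.environ, env, clear=True)

os.environ["DATABRICKS_HOST"] = "https://stale-host"  # simulate leaked config
with isolated_env({"DATABRICKS_AUTH_TYPE": "pat"}):
    assert "DATABRICKS_HOST" not in os.environ        # leakage is gone
    assert os.environ["DATABRICKS_AUTH_TYPE"] == "pat"
assert os.environ["DATABRICKS_HOST"] == "https://stale-host"  # restored on exit
```

`mock.patch.dict` restores the original environment when the block exits, so tests cannot contaminate one another.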

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
…abricks#785)

## Changes
`ApiClient` is also coupled to the `Config` object, which means that it
can't be used in situations where there is no config. For example, when
fetching OIDC endpoints, the user may not have a complete `Config`
instance yet. However, failures when requesting from those endpoints
should still be retried according to the SDK's retry policy.

To address this, I've split the ApiClient into `_BaseClient` and
`ApiClient`. `_BaseClient` is the core implementation of the client
without any dependency on the `Config`. This is similar to what @rauchy
did in the Java SDK to cut the dependency between the `ApiClient` and
`DatabricksConfig`. The `_BaseClient` can then be used when fetching
OIDC endpoint information.

This will be used in
databricks#784 to support
retrying OAuth OIDC endpoint fetches.

## Tests

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
…onses (databricks#786)

## Changes
databricks#683 is caused by a small bug in the template used to generate the
Python SDK. When referring to a class defined in a separate API package,
only the module is imported, not the exact class, so the generated code
needs to use the qualified name of the structure.

Resolved databricks#683.
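The underlying Python rule can be shown with a stdlib module: when only a module is imported, its members are reachable solely through the qualified name, which is why generated code referring to a class from a sibling API package must spell out `package.ClassName`.

```python
# When a module (not its members) is imported, only qualified names resolve.
import collections

Point = collections.namedtuple("Point", ["x", "y"])  # qualified name: works

try:
    namedtuple  # bare name was never bound in this namespace
except NameError:
    bare_reference_ok = False
else:
    bare_reference_ok = True

assert bare_reference_ok is False
assert Point(1, 2).x == 1
```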

## Tests

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
Update to latest OpenAPI spec

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
### Bug Fixes

* Fix Model Serving Test
([databricks#781](databricks#781)).
* Include package name for external types when deserializing responses
([databricks#786](databricks#786)).


### Internal Changes

* Refactor ApiClient into `_BaseClient` and `ApiClient`
([databricks#785](databricks#785)).
* Update to latest OpenAPI spec
([databricks#787](databricks#787)).
* Revert Support Models in `dbutils.fs` operations
([databricks#750](databricks#750))
([databricks#778](databricks#778)).


### API Changes:

* Added
[w.disable_legacy_dbfs](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/disable_legacy_dbfs.html)
workspace-level service.
* Added `default_source_code_path` and `resources` fields for
`databricks.sdk.service.apps.App`.
* Added `resources` field for
`databricks.sdk.service.apps.CreateAppRequest`.
* Added `resources` field for
`databricks.sdk.service.apps.UpdateAppRequest`.

OpenAPI SHA: bc17b474818138f19b78a7bea0675707dead2b87, Date: 2024-10-07
## Changes
Add an OpenAI client mixin to the Serving Endpoints API. The OpenAI
client requires a token for authentication, so we are moving the
creation of the OpenAI client into the Databricks SDK so that users can
easily use it in both the notebook and model serving environments
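The idea can be sketched as follows; the class and method names here are illustrative stand-ins for the mixin, not the SDK's actual code (which returns a real `openai.OpenAI` client):

```python
class ServingEndpointsExt:
    """Hypothetical sketch of a serving-endpoints mixin that builds an
    OpenAI-compatible client from the workspace host and SDK token."""

    def __init__(self, host, token_provider):
        self._host = host
        # a callable lets the same code work in notebooks and model serving,
        # where tokens are resolved differently
        self._token_provider = token_provider

    def get_open_ai_client(self):
        # The real mixin constructs an openai.OpenAI instance; here we just
        # return the arguments it would be built with.
        return {
            "base_url": f"{self._host}/serving-endpoints",
            "api_key": self._token_provider(),
        }

api = ServingEndpointsExt("https://my-workspace.cloud.databricks.com",
                          lambda: "dapi-example-token")
client = api.get_open_ai_client()
```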

## Tests
Dogfood Test:
https://e2-dogfood.staging.cloud.databricks.com/editor/notebooks/2337940012762945?o=6051921418418893

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

---------

Signed-off-by: aravind-segu <aravind.segu@databricks.com>
…ic (databricks#792)

## Changes
Update Serving Endpoint mixin template and docs generation logic

## Tests

- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied

---------

Co-authored-by: Omer Lachish <rauchy@users.noreply.github.com>
### New Features and Improvements

* OpenAI Client Mixin
([databricks#779](databricks#779)).


### Bug Fixes

* Update Serving Endpoint mixin template and docs generation logic
([databricks#792](databricks#792)).


### API Changes:

* Added `databricks.sdk.service.pipelines.ReportSpec` dataclass.
* Added `unity_catalog_provisioning_state` field for
`databricks.sdk.service.catalog.OnlineTable`.
* Added `is_truncated` field for
`databricks.sdk.service.dashboards.Result`.
* Added `effective_budget_policy_id` field for
`databricks.sdk.service.jobs.BaseJob`.
* Added `budget_policy_id` field for
`databricks.sdk.service.jobs.CreateJob`.
* Added `effective_budget_policy_id` field for
`databricks.sdk.service.jobs.Job`.
* Added `budget_policy_id` field for
`databricks.sdk.service.jobs.JobSettings`.
* Added `budget_policy_id` field for
`databricks.sdk.service.jobs.SubmitRun`.
* Added `report` field for
`databricks.sdk.service.pipelines.IngestionConfig`.
* Added `sequence_by` field for
`databricks.sdk.service.pipelines.TableSpecificConfig`.
* Added `notify_on_ok` field for `databricks.sdk.service.sql.Alert`.
* Added `notify_on_ok` field for
`databricks.sdk.service.sql.CreateAlertRequestAlert`.
* Added `notify_on_ok` field for
`databricks.sdk.service.sql.ListAlertsResponseAlert`.
* Added `notify_on_ok` field for
`databricks.sdk.service.sql.UpdateAlertRequestAlert`.

OpenAPI SHA: cf9c61453990df0f9453670f2fe68e1b128647a2, Date: 2024-10-14

Co-authored-by: Omer Lachish <rauchy@users.noreply.github.com>
## Changes
### OAuth Refactoring
Currently, OAuthClient uses Config internally to resolve the OIDC
endpoints by passing the client ID and host to an internal Config
instance and calling its `oidc_endpoints` method. This has a few
drawbacks:
1. There is a near-cyclic dependency: `Config` depends on methods in
`oauth.py`, and `OAuthClient` depends on `Config`. This currently
doesn't break because the `Config` import is done at runtime in the
`OAuthClient` constructor.
2. Databricks supports both in-house OAuth and Azure Entra ID OAuth.
Currently, the choice between these options depends on whether a user
specifies the azure_client_id or client_id parameter in the Config.
Because Config is used within OAuthClient, this means that OAuthClient
needs to expose a parameter to configure either client_id or
azure_client_id.

Rather than having these classes deeply coupled to one another, we can
allow users to fetch the OIDC endpoints for a given account/workspace as
a top-level functionality and provide this to `OAuthClient`. This breaks
the cyclic dependency and doesn't require `OAuthClient` to expose any
unnecessary parameters.

Further, I've also tried to remove the coupling of the other classes in
`oauth.py` to `OAuthClient`. Currently, `OAuthClient` serves both as the
mechanism to initialize OAuth and as a kind of configuration object,
capturing OAuth endpoint URLs, client ID/secret, redirect URL, and
scopes. Now, the parameters for each of these classes are explicit,
removing all unnecessary coupling between them. One nice advantage is
that a `Consent` can now be serialized and deserialized without any
reference to the `OAuthClient`.

There is definitely more work to be done to simplify and clean up the
OAuth implementation, but this should at least unblock users who need to
use Azure Entra ID U2M OAuth in the SDK.
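The shape of the decoupling might look like this (a sketch with illustrative names, not the SDK's exact API): the OIDC endpoints become a plain value object produced by a standalone fetch function, which is then passed explicitly to the OAuth client.

```python
from dataclasses import dataclass

@dataclass
class OidcEndpoints:
    authorization_endpoint: str
    token_endpoint: str

def get_workspace_endpoints(host):
    # A real implementation would GET
    # {host}/oidc/.well-known/oauth-authorization-server, with retries.
    host = host.rstrip("/")
    return OidcEndpoints(
        authorization_endpoint=f"{host}/oidc/v1/authorize",
        token_endpoint=f"{host}/oidc/v1/token",
    )

class OAuthClient:
    """Takes endpoints explicitly instead of deriving them from a Config."""
    def __init__(self, oidc, client_id, redirect_url, scopes=None):
        self.oidc = oidc
        self.client_id = client_id
        self.redirect_url = redirect_url
        self.scopes = scopes or ["all-apis"]

oidc = get_workspace_endpoints("https://my-workspace.cloud.databricks.com/")
oauth = OAuthClient(oidc, client_id="databricks-cli",
                    redirect_url="http://localhost:8020")
```

Because the endpoint lookup is a free function, the same mechanism serves both in-house OAuth and Azure Entra ID without `OAuthClient` having to know which one is in play.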

## Tests
The new OIDC endpoint methods are tested, and those tests also verify
that those endpoints are retried in case of rate limiting.

I ran the flask app example against an AWS workspace, and I ran the
external-browser demo example against AWS, Azure and GCP workspaces with
the default client ID and with a newly created OAuth app with and
without credentials.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
### Breaking Changes
* `external_browser` now uses the `databricks-cli` app instead of the
third-party "6128a518-99a9-425b-8333-4cc94f04cacd" application when
performing the U2M login flow for Azure workspaces when a client ID is
not otherwise specified. This matches the AWS behavior.
* The signatures of several OAuth-related constructors have changed to
support U2M OAuth with Azure Entra ID application registrations. See
https://github.com/databricks/databricks-sdk-py/blob/main/examples/flask_app_with_oauth.py
for examples of how to use these classes.
  * `OAuthClient()`: renamed to `OAuthClient.from_host()`
* `SessionCredentials()` and `SessionCredentials.from_dict()`: now
accepts `token_endpoint`, `client_id`, `client_secret`, and
`refresh_url` as parameters, rather than accepting the `OAuthClient`.
* `TokenCache()`: now accepts `host`, `token_endpoint`, `client_id`,
`client_secret`, and `refresh_url` as parameters, rather than accepting
the `OAuthClient`.

### Bug Fixes

* Decouple OAuth functionality from `Config`
([databricks#784](databricks#784)).


### Release

* Release v0.35.0
([databricks#793](databricks#793)).

Co-authored-by: Omer Lachish <rauchy@users.noreply.github.com>
@romainissynced romainissynced left a comment

awesome!

@CaymanWilliams CaymanWilliams merged commit 92f6919 into main Oct 23, 2024
CaymanWilliams added a commit that referenced this pull request Feb 15, 2025