
Databricks SDK for Go

Results 181 databricks-sdk-go issues
Sorted by recently updated

**Description** While building a PoC of an Agent, I encountered this error. Model: Mixtral **Reproduction** # Load the Agent agent = AgentWithRetriever() # Example for testing multiple turns of conversation # 1st...

**Description** We are using M2M OAuth with a service principal to call our model serving endpoints using the Go SDK. We see authentication failing with ``` databricks_integration [455.873984ms]: response...
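A minimal sketch of the M2M OAuth setup this issue describes, using the Go SDK's `databricks.NewWorkspaceClient` with an explicit `Config`. The host, client ID, and secret below are placeholders, and the `CurrentUser.Me` call is just one convenient way to force a token exchange; it is not the reporter's actual code.

```go
package main

import (
	"context"
	"log"

	"github.com/databricks/databricks-sdk-go"
)

func main() {
	ctx := context.Background()

	// M2M OAuth: authenticate as a service principal using its
	// OAuth client ID and secret (placeholder values).
	w, err := databricks.NewWorkspaceClient(&databricks.Config{
		Host:         "https://my-workspace.cloud.databricks.com",
		ClientID:     "service-principal-application-id",
		ClientSecret: "service-principal-oauth-secret",
	})
	if err != nil {
		log.Fatal(err)
	}

	// Any authenticated call will trigger the OAuth token exchange;
	// if M2M auth is misconfigured, the failure surfaces here.
	me, err := w.CurrentUser.Me(ctx)
	if err != nil {
		log.Fatalf("authentication failed: %v", err)
	}
	log.Printf("authenticated as %s", me.UserName)
}
```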

It would be nice to add a function similar to [forEachRemaining](https://docs.oracle.com/en/java/javase/21/docs/api/java.base/java/util/Iterator.html#forEachRemaining(java.util.function.Consumer)) from the `java.util.Iterator` interface. Use case: apply a function to each element without collecting them into a list...
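A sketch of what such a helper could look like, assuming the SDK's `listing.Iterator[T]` interface with `HasNext`/`Next`; `ForEach` itself is the proposed addition and does not exist in the SDK today.

```go
package listingext

import (
	"context"

	"github.com/databricks/databricks-sdk-go/listing"
)

// ForEach applies fn to every remaining element of the iterator without
// materializing a slice, mirroring Java's Iterator.forEachRemaining.
// Iteration stops at the first error from the iterator or from fn.
func ForEach[T any](ctx context.Context, it listing.Iterator[T], fn func(T) error) error {
	for it.HasNext(ctx) {
		item, err := it.Next(ctx)
		if err != nil {
			return err
		}
		if err := fn(item); err != nil {
			return err
		}
	}
	return nil
}
```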

**Problem Statement** We have multiple issues reported against the Terraform provider (e.g., https://github.com/databricks/terraform-provider-databricks/issues/4113, just as one example) where the `default auth: cannot configure default credentials` error is reported because `host` isn't...
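A rough sketch of the kind of pre-flight check this problem statement seems to point at, assuming the SDK's `config.Config` type and its `Host` field. The `checkHost` helper and its error text are hypothetical illustrations, not the SDK's actual behavior or the proposed fix.

```go
package authcheck

import (
	"fmt"

	"github.com/databricks/databricks-sdk-go/config"
)

// checkHost surfaces a clearer error when no workspace host is configured,
// instead of the generic "default auth: cannot configure default credentials".
func checkHost(cfg *config.Config) error {
	if cfg.Host == "" {
		return fmt.Errorf("no Databricks host configured: set DATABRICKS_HOST, " +
			"the host field in ~/.databrickscfg, or Config.Host explicitly")
	}
	return nil
}
```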

**Problem Statement** In the Terraform provider, we may need validation logic that checks that a value is in the list of all possible constants. **Proposed Solution** For all...
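A hypothetical illustration of the proposal: a generic validator that checks a value against the full set of allowed constants for a generated enum. The `ValidateEnum` helper is not part of the SDK; the allowed-values slice would presumably be emitted by the code generator next to the existing enum constants.

```go
package validation

import "fmt"

// ValidateEnum checks that value is one of the allowed constants for a
// generated string-based enum type (e.g., for use by the Terraform provider).
func ValidateEnum[T ~string](value T, allowed []T) error {
	for _, v := range allowed {
		if value == v {
			return nil
		}
	}
	return fmt.Errorf("invalid value %q, must be one of %v", string(value), allowed)
}
```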

## What changes are proposed in this pull request? This is a fix for https://github.com/databricks/terraform-provider-databricks/issues/5218. The issue is caused by the fact that the `Scala` selector had a default value of `2.12`...

This PR updates the SDK to the latest API changes.

zzz:automated-update

## 🥞 Stacked PR Use this [link](https://github.com/databricks/databricks-sdk-go/pull/1328/files) to review incremental changes. - [**stack/go-none-timeout**](https://github.com/databricks/databricks-sdk-go/pull/1328) [[Files changed](https://github.com/databricks/databricks-sdk-go/pull/1328/files)] --------- ## What changes are proposed in this pull request? Provide the readers and reviewers...

**Description** Databricks endpoints that accept a `max_results` parameter, such as [listing catalogs](https://docs.databricks.com/api/workspace/catalogs/list), have the following advice: > when set to 0, the page length is set to a server configured...
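A brief sketch of the behavior the issue describes, using listing catalogs as the example endpoint. It assumes the generated `catalog.ListCatalogsRequest` exposes a `MaxResults` field and that `Catalogs.ListAll` is available; treat the field name and the zero-value semantics as illustrative of the quoted API advice rather than confirmed SDK behavior.

```go
package main

import (
	"context"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/catalog"
)

func main() {
	ctx := context.Background()

	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		log.Fatal(err)
	}

	// MaxResults: 0 defers the page length to the server-configured
	// default, per the API documentation quoted above.
	catalogs, err := w.Catalogs.ListAll(ctx, catalog.ListCatalogsRequest{
		MaxResults: 0,
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range catalogs {
		log.Println(c.Name)
	}
}
```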