[PP-2196] Update API doc for OAuthM2M Support #73

Merged 2 commits on Oct 16, 2024
8 changes: 3 additions & 5 deletions ApiSpecifications.md
@@ -68,7 +68,7 @@ The Connect API is used to sign-in or sign-up a user with a partner with Databricks
The order of events when connecting Databricks to a partner is as follows:

1. The user clicks the Partner tile.
2. The user confirms the Databricks resources that will be provisioned for the connection (e.g. the Service Principal, the PAT, the SQL Warehouse).
2. The user confirms the Databricks resources that will be provisioned for the connection (e.g. the Service Principal, the PAT or the service principal OAuth secret, the SQL Warehouse).
3. The user clicks Connect.
1. Databricks calls the partner's **Connect API** with all of the Databricks data that the partner needs.
2. The partner provisions any accounts and resources needed. (e.g. persisting the Databricks workspace\_id, provisioning a Databricks output node).
@@ -138,9 +138,6 @@ POST <partners/databricks/v1/connect>: [example, can be customized]
"is_connection_established" : true|false
"auth": { [Only present if is_connection_established is false]
"personal_access_token": "dapi..."
// or
"oauth_token": ..., [optional, reserved for future use]
"oauth_scope": ... [optional, reserved for future use]
}
}
"hostname": "organization.cloud.databricks.com",
@@ -162,7 +159,8 @@ POST <partners/databricks/v1/connect>: [example, can be customized]
"is_sql_endpoint" : true|false, [optional: same value as is_sql_warehouse]
"is_sql_warehouse": true|false, [optional: set if cluster_id is set. Determines whether cluster_id refers to Interactive Cluster or SQL Warehouse]
"data_source_connector": "Oracle", [optional, unused and reserved for future use: for data connector tools, the name of the data source that the user should be referred to in their tool]
"service_principal_id": "a2a25a05-3d59-4515-a73b-b8bc5ab79e31" [optional, the UUID (username) of the service principal identity]
"service_principal_id": "a2a25a05-3d59-4515-a73b-b8bc5ab79e31", [optional, the UUID (username) of the service principal identity]
"service_principal_oauth_secret": "dose..." [optional, the OAuth secret of the service principal identity, it will be passed only when partner config includes OAuth M2M auth option]
}
```
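
To make the two credential fields above concrete, here is a hypothetical partner-side sketch (not part of this PR; the `DatabricksCredential` types are made up for illustration): prefer the service principal OAuth secret when Databricks sends it, and fall back to the PAT in the `auth` block otherwise.

```scala
import spray.json._

// Hypothetical helper, not part of this PR: pick how to authenticate back to Databricks.
// `body` is the parsed /connect payload; `auth` is its nested auth block from the example
// above, when present.
sealed trait DatabricksCredential
final case class OAuthM2M(clientId: String, clientSecret: String) extends DatabricksCredential
final case class Pat(token: String) extends DatabricksCredential

def credentialFrom(body: JsObject, auth: Option[JsObject]): Option[DatabricksCredential] = {
  def str(obj: JsObject, name: String): Option[String] =
    obj.fields.get(name).collect { case JsString(s) => s }

  // Prefer OAuth M2M: the service principal UUID acts as client_id,
  // the new service_principal_oauth_secret as client_secret.
  val oauthM2M = for {
    id     <- str(body, "service_principal_id")
    secret <- str(body, "service_principal_oauth_secret")
  } yield OAuthM2M(id, secret)

  // Otherwise fall back to the PAT, which is only sent when AUTH_PAT is an allowed
  // (or the default) auth option in the partner's config.
  val pat = auth.flatMap(str(_, "personal_access_token")).map(Pat(_))

  oauthM2M.orElse(pat)
}
```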

13 changes: 8 additions & 5 deletions OnboardingDoc.md
@@ -15,7 +15,7 @@ Partner Connect is a destination inside a Databricks workspace that allows Databricks

We made Partner Connect for 2 reasons:

1. We want to give our customers access to the value provided by the best data products in the market. Partner Connect removes the complexity from connecting products to Databricks by automatically configuring resources such as SQL Warehouses, clusters, PAT tokens, service principals, and connection files. It can also initiate a free trial of partner products.
1. We want to give our customers access to the value provided by the best data products in the market. Partner Connect removes the complexity from connecting products to Databricks by automatically configuring resources such as SQL Warehouses, clusters, PAT tokens, service principals, OAuth secrets, and connection files. It can also initiate a free trial of partner products.
2. We want to help our partners build their businesses and incentivize them to create the best possible product experience for Databricks customers. For more on this topic, see [this blog post](https://databricks.com/blog/2021/11/18/build-your-business-on-databricks-with-partner-connect.html).

### Sample marketing materials and user experience demo
@@ -31,6 +31,9 @@ The following phrases will help you understand the Databricks product and this documentation
- **Persona Switcher:** The component on the upper left of the UI that allows the user to choose the active Databricks product. This controls which features are available in the UI, and not all users have access to all 3 options. Partner Connect is available to all 3 personas.
- **Personal Access Token (PAT):** A token that a partner product can use to authenticate with Databricks
- **Service Principal:** An account that a partner product can use when calling Databricks APIs. Service Principals have access controls associated with them.
- **OAuth M2M:** Authentication to Databricks using service principals, also known as 2-legged OAuth or the OAuth client credentials flow. A partner product uses the service principal UUID (client_id) and OAuth secret (client_secret) to authenticate with Databricks (see the sketch below).
- **Service Principal OAuth Secret:** The service principal's secret that a partner product uses, together with the service principal UUID, to authenticate with Databricks.
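
As a rough illustration of the flow these two terms describe (not part of this repo's sample client): the partner product exchanges the service principal UUID and OAuth secret for a workspace access token using the standard OAuth client-credentials grant. The `/oidc/v1/token` endpoint and `all-apis` scope follow the public Databricks OAuth M2M documentation; treat the details as an assumption and adjust to your workspace.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.nio.charset.StandardCharsets
import java.util.Base64

// Illustrative only: exchange a service principal's UUID (client_id) and OAuth secret
// (client_secret) for a short-lived workspace access token via the client-credentials grant.
// Returns the raw JSON response; the caller reads the access_token field out of it.
def requestOAuthM2MToken(workspaceUrl: String, clientId: String, clientSecret: String): String = {
  val basic = Base64.getEncoder.encodeToString(
    s"$clientId:$clientSecret".getBytes(StandardCharsets.UTF_8)
  )
  val request = HttpRequest.newBuilder(URI.create(s"$workspaceUrl/oidc/v1/token"))
    .header("Authorization", s"Basic $basic")
    .header("Content-Type", "application/x-www-form-urlencoded")
    .POST(HttpRequest.BodyPublishers.ofString("grant_type=client_credentials&scope=all-apis"))
    .build()

  // The access_token in the response is then used as a Bearer token on later Databricks API calls.
  HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).body()
}
```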


![](img/persona.png)

@@ -83,10 +86,10 @@ While there's some customization available, most partners have one of the following

| Integration | Description |
|------------- | -------------|
| Read Partner | This is used by partners that purely need to read (select) data from the Lakehouse. In Partner Connect, the user selects which data to grant access to your product. Databricks provides the partner a SQL Warehouse and PAT with permissions to query that data. This is often used by **Business Intelligence and Data Quality partners**.
| Write Partner | This is used by partners that purely need to write (ingest) data into the Lakehouse. In Partner Connect, the user selects which catalog to grant write access to your product. Databricks provides the partner a SQL Warehouse and PAT with permissions to create schemas and tables in that catalog. This is often used by **Ingestion partners**.
| Read-Write Partner | This is used by partners that both need to read from and write to the Lakehouse. In Partner Connect, the user selects which catalog to grant write access and which schemas to grant read access for your product. Databricks provides the partner a SQL Warehouse and PAT with permissions to create schemas and tables in that catalog, as well as query the selected data. This is often used by **Data Preparation partners**.
| Notebook Partner | This is used by partners that want to demonstrate their integration with Databricks using a Databricks Notebook. Databricks provides the partner an Interactive Cluster and PAT with permissions. The partner can use the PAT to publish a Databricks Notebook and configure the Interactive Cluster.
| Read Partner | This is used by partners that purely need to read (select) data from the Lakehouse. In Partner Connect, the user selects which data to grant access to your product. Databricks provides the partner a SQL Warehouse and either a PAT or a service principal OAuth secret with permissions to query that data. This is often used by **Business Intelligence and Data Quality partners**.
| Write Partner | This is used by partners that purely need to write (ingest) data into the Lakehouse. In Partner Connect, the user selects which catalog to grant write access to your product. Databricks provides the partner a SQL Warehouse and either a PAT or a service principal OAuth secret with permissions to create schemas and tables in that catalog. This is often used by **Ingestion partners**.
| Read-Write Partner | This is used by partners that both need to read from and write to the Lakehouse. In Partner Connect, the user selects which catalog to grant write access and which schemas to grant read access for your product. Databricks provides the partner a SQL Warehouse and either a PAT or a service principal OAuth secret with permissions to create schemas and tables in that catalog, as well as query the selected data. This is often used by **Data Preparation partners**.
| Notebook Partner | This is used by partners that want to demonstrate their integration with Databricks using a Databricks Notebook. Databricks provides the partner an Interactive Cluster and either a PAT or a service principal OAuth secret with permissions. The partner can use the PAT or the service principal secret to publish a Databricks Notebook and configure the Interactive Cluster.
| Desktop application Partner | This is used by partners that have a Desktop application (as opposed to a SaaS offering). In this integration, the user selects either an Interactive Cluster or SQL Warehouse and downloads a connection file to the partner product. This is often used by **Partners with Desktop applications**.<br /><br />N.B. For this type of integration, there is no need for the partner to implement the SaaS APIs mentioned elsewhere throughout this documentation.

## Changelog
2 changes: 1 addition & 1 deletion api-doc/Models/Auth.md
@@ -3,7 +3,7 @@

| Name | Type | Description | Notes |
|------------ | ------------- | ------------- | -------------|
| **personal\_access\_token** | **String** | Personal Access Token created for the Service Principal or the user | [default to null] |
| **personal\_access\_token** | **String** | Personal Access Token created for the Service Principal or the user. Note: this will be null if the auth_options in PartnerConfig is not null and does not contain the value AUTH_PAT. | [default to null] |
| **oauth\_token** | **String** | Oauth token. For future use. | [optional] [default to null] |
| **oauth\_scope** | **String** | Oauth scope. For future use. | [optional] [default to null] |

1 change: 1 addition & 0 deletions api-doc/Models/ConnectRequest.md
@@ -24,6 +24,7 @@
| **is\_sql\_warehouse** | **Boolean** | Determines whether cluster_id refers to Interactive Cluster or SQL warehouse. | [optional] [default to null] |
| **data\_source\_connector** | **String** | For data connector tools, the name of the data source that the user should be referred to in their tool. Unused today. | [optional] [default to null] |
| **service\_principal\_id** | **String** | The UUID (username) of the service principal identity that a partner product can use to call Databricks APIs. Note the format is different from the databricks_user_id field in user_info. If empty, no service principal was created | [optional] [default to null] |
| **service\_principal\_oauth\_secret** | **String** | The OAuth secret of the service principal identity that a partner product can use to call Databricks APIs (see [OAuth M2M](https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html)). It will be set only when the `auth_options` in the [PartnerConfig](PartnerConfig.md) contains the value `AUTH_OAUTH_M2M`. | [optional] [default to null] |
| **connection\_scope** | **String** | The scope of users that can use this connection. Workspace means all users in the same workspace. User means only the user creating it. | [optional] [default to null] |

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
1 change: 1 addition & 0 deletions api-doc/Models/PartnerConfig.md
@@ -23,6 +23,7 @@
| **require\_manual\_signup** | **Boolean** | True if the partner requires a manual signup after connect api is called. When set to true, connect api call with is_connection_established (sign in) is expected to return 404 account_not_found or connection_not_found until the user completes the manual signup step. | [optional] [default to null] |
| **trial\_type** | **String** | Enum describing the type of trials the partner supports. Partners can choose to support trial account expiration at the individual user or account level. If trial level is user, expiring one user connection should not expire another user in the same account. | [optional] [default to null] |
| **supports\_demo** | **Boolean** | True if partner supports the demo flag in the connect api call. | [optional] [default to null] |
| **auth\_options** | **List** | The available authentication methods that a partner can use to authenticate with Databricks. If not specified, `AUTH_PAT` is used. The allowed options are <ul><li><b>AUTH_PAT</b></li><li><b>AUTH_OAUTH_M2M</b></li></ul> (see the sketch below). | [optional] [default to null] |
| **test\_workspace\_detail** | [**PartnerConfig_test_workspace_detail**](PartnerConfig_test_workspace_detail.md) | | [optional] [default to null] |

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
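
For illustration only (not from the generated doc): a minimal sketch of how the new field might appear in a partner's config, assuming the wire name matches the model field above; all other PartnerConfig fields are omitted.

```json
{
  "auth_options": ["AUTH_PAT", "AUTH_OAUTH_M2M"]
}
```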
10 changes: 9 additions & 1 deletion openapi/partner-connect-2.0.yaml
@@ -333,7 +333,9 @@ components:
properties:
personal_access_token:
type: string
description: Personal Access Token created for the Service Principal or the user
description: |
Personal Access Token created for the Service Principal or the user.
It will be null if the auth_options in PartnerConfig is not null and does not contain the value AUTH_PAT.
example: "token"
oauth_token:
type: string
@@ -480,6 +482,12 @@ components:
type: string
description: The UUID (username) of the service principal identity that a partner product can use to call Databricks APIs. Note the format is different from the databricks_user_id field in user_info. If empty, no service principal was created
example: "a2a25a05-3d59-4515-a73b-b8bc5ab79e31"
service_principal_oauth_secret:
type: string
description: |
The secret of the service principal identity that a partner product can use to call Databricks APIs.
It will be set only when the auth_options in PartnerConfig contains the value AUTH_OAUTH_M2M.
example: "secret"
connection_scope:
type: string
description: The scope of users that can use this connection. Workspace means all users in the same workspace. User means only the user creating it.
@@ -20,7 +20,7 @@ import org.openapitools.client.model.PartnerConfigEnums.{
import org.openapitools.client.model.ResourceToProvisionEnums.ResourceType
import org.openapitools.client.model.TestResultEnums.Status
import org.openapitools.client.model.{Connector, _}
import spray.json.{DefaultJsonProtocol, RootJsonFormat}
import spray.json._

object JsonFormatters extends DefaultJsonProtocol {
// Order of declaration matters. Enums need to be defined first otherwise ProductFormats.scala throws NPE.
@@ -75,12 +75,129 @@ object JsonFormatters extends DefaultJsonProtocol {
implicit val errorResponse: RootJsonFormat[ErrorResponse] = jsonFormat3(
ErrorResponse
)
implicit val connectRequest: RootJsonFormat[ConnectRequest] = jsonFormat22(
ConnectRequest
)
implicit val connectionInfo: RootJsonFormat[ConnectionInfo] = jsonFormat1(
ConnectionInfo
)

// spray-json's jsonFormatN helpers only go up to 22 fields; ConnectRequest now has 23, so a hand-written format is needed.
implicit object ConnectionRequestJsonFormat
extends RootJsonFormat[ConnectRequest] {
private def OptionJsString(value: Option[String]) =
value.map(JsString(_)).getOrElse(JsNull)
private def OptionJsBoolean(value: Option[Boolean]) =
value.map(JsBoolean(_)).getOrElse(JsNull)

private def getString(fields: Map[String, JsValue], name: String): String =
fields.get(name) match {
case Some(JsString(value)) => value
case _ => throw DeserializationException(s"$name should be string")
}

private def getNumber(
fields: Map[String, JsValue],
name: String
): BigDecimal =
fields.get(name) match {
case Some(JsNumber(value)) => value
case _ => throw DeserializationException(s"$name should be number")
}

private def getBool(fields: Map[String, JsValue], name: String): Boolean =
fields.get(name) match {
case Some(JsBoolean(value)) => value
case _ => throw DeserializationException(s"$name should be boolean")
}

private def getOptionString(
fields: Map[String, JsValue],
name: String
): Option[String] =
fields.get(name) match {
case Some(JsString(value)) => Some(value)
case Some(JsNull) | None => None
case _ => throw DeserializationException(s"$name should be string")
}

private def getOptionBoolean(
fields: Map[String, JsValue],
name: String
): Option[Boolean] =
fields.get(name) match {
case Some(JsBoolean(value)) => Some(value)
case Some(JsNull) | None => None
case _ => throw DeserializationException(s"$name should be boolean")
}

def write(request: ConnectRequest): JsValue = JsObject(
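// Optional fields are serialized as explicit JsNull rather than omitted, keeping the payload shape stable.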
"user_info" -> request.user_info.toJson,
"connection_id" -> OptionJsString(request.connection_id),
"hostname" -> JsString(request.hostname),
"port" -> JsNumber(request.port),
"workspace_url" -> JsString(request.workspace_url),
"http_path" -> OptionJsString(request.http_path),
"jdbc_url" -> OptionJsString(request.jdbc_url),
"databricks_jdbc_url" -> OptionJsString(request.databricks_jdbc_url),
"workspace_id" -> JsNumber(request.workspace_id),
"demo" -> JsBoolean(request.demo),
"cloud_provider" -> request.cloud_provider.toJson,
"cloud_provider_region" -> OptionJsString(request.cloud_provider_region),
"is_free_trial" -> JsBoolean(request.is_free_trial),
"destination_location" -> OptionJsString(request.destination_location),
"catalog_name" -> OptionJsString(request.catalog_name),
"database_name" -> OptionJsString(request.database_name),
"cluster_id" -> OptionJsString(request.cluster_id),
"is_sql_endpoint" -> OptionJsBoolean(request.is_sql_endpoint),
"is_sql_warehouse" -> OptionJsBoolean(request.is_sql_warehouse),
"data_source_connector" -> OptionJsString(request.data_source_connector),
"service_principal_id" -> OptionJsString(request.service_principal_id),
"service_principal_oauth_secret" -> OptionJsString(
request.service_principal_oauth_secret
),
"connection_scope" -> request.connection_scope
.map(_.toJson)
.getOrElse(JsNull)
)

implicit val connectRequest: RootJsonFormat[ConnectRequest] =
ConnectionRequestJsonFormat

def read(value: JsValue): ConnectRequest = {
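// Required fields (hostname, port, workspace_id, ...) throw DeserializationException when missing
// or mistyped; optional fields, including the new service_principal_oauth_secret, become None
// when absent or null.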
val fields = value.asJsObject.fields
val scopeOpt = fields.get("connection_scope") match {
case Some(JsNull) => None
case Some(v) => Some(v.convertTo[ConnectRequestEnums.ConnectionScope])
case None => None
}

ConnectRequest(
user_info = fields("user_info").convertTo[UserInfo],
connection_id = getOptionString(fields, "connection_id"),
hostname = getString(fields, "hostname"),
port = getNumber(fields, "port").toInt,
workspace_url = getString(fields, "workspace_url"),
http_path = getOptionString(fields, "http_path"),
jdbc_url = getOptionString(fields, "jdbc_url"),
databricks_jdbc_url = getOptionString(fields, "databricks_jdbc_url"),
workspace_id = getNumber(fields, "workspace_id").toLong,
demo = getBool(fields, "demo"),
cloud_provider =
fields("cloud_provider").convertTo[ConnectRequestEnums.CloudProvider],
cloud_provider_region =
getOptionString(fields, "cloud_provider_region"),
is_free_trial = getBool(fields, "is_free_trial"),
destination_location = getOptionString(fields, "destination_location"),
catalog_name = getOptionString(fields, "catalog_name"),
database_name = getOptionString(fields, "database_name"),
cluster_id = getOptionString(fields, "cluster_id"),
is_sql_endpoint = getOptionBoolean(fields, "is_sql_endpoint"),
is_sql_warehouse = getOptionBoolean(fields, "is_sql_warehouse"),
data_source_connector =
getOptionString(fields, "data_source_connector"),
service_principal_id = getOptionString(fields, "service_principal_id"),
service_principal_oauth_secret =
getOptionString(fields, "service_principal_oauth_secret"),
connection_scope = scopeOpt
)
}
}

implicit val deleteConnectionRequest
: RootJsonFormat[DeleteConnectionRequest] = jsonFormat3(
DeleteConnectionRequest
)
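
Usage of the hand-written format is unchanged from the old jsonFormat22-based one; a minimal sketch, assuming the formatters object is importable under the name used below and the request body arrives as a raw JSON string:

```scala
import org.openapitools.client.model.ConnectRequest
import spray.json._
import JsonFormatters._ // adjust to the formatters object's actual package

object ConnectRequestParsingExample {
  // Placeholder for the raw JSON body of an incoming /connect call.
  val rawBody: String = ???

  // The hand-written ConnectionRequestJsonFormat above is resolved implicitly here,
  // just as the generated jsonFormat22-based format used to be.
  val request: ConnectRequest = rawBody.parseJson.convertTo[ConnectRequest]
  val oauthSecret: Option[String] = request.service_principal_oauth_secret
}
```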