Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
user_info | UserInfo |  | [default to null]
connection_id | String | The partner connection ID returned on the initial connection. Set only when is_connection_established is true. | [optional] [default to null]
hostname | String | Hostname for the Databricks connection. | [default to null]
port | Integer | Port for the Databricks connection. | [default to null]
workspace_url | String | Workspace URL to access the Databricks connection. | [default to null]
http_path | String | HTTP path for the Databricks connection. Only present if is_sql_endpoint is true. | [optional] [default to null]
jdbc_url | String | Legacy Spark JDBC connection URL. | [optional] [default to null]
databricks_jdbc_url | String | New Databricks JDBC driver connection URL. | [optional] [default to null]
workspace_id | Long | Workspace ID for the Databricks connection. Same as user_info.databricks_organization_id. | [default to null]
demo | Boolean | True if this is a demo experience. | [default to null]
cloud_provider | String | The cloud provider for the Databricks workspace. | [default to null]
cloud_provider_region | String | The cloud provider region for the Databricks workspace. | [default to null]
is_free_trial | Boolean | Flag indicating whether this is a Databricks free trial. | [default to null]
destination_location | String | Optional destination location URI. | [optional] [default to null]
catalog_name | String | Optional catalog name: a custom name if using Unity Catalog, or "hive_metastore" if not. Note that Databricks APIs often require identifiers like this to be escaped with backticks if they contain special characters. | [optional] [default to null]
database_name | String | Optional database name. Currently unused. | [optional] [default to null]
cluster_id | String | SQL endpoint ID if is_sql_endpoint is true; otherwise, the cluster ID. | [optional] [default to null]
is_sql_endpoint | Boolean | Uses the legacy name: a SQL endpoint is now called a SQL warehouse. This field should match is_sql_warehouse. | [optional] [default to null]
is_sql_warehouse | Boolean | Determines whether cluster_id refers to an interactive cluster or a SQL warehouse. | [optional] [default to null]
data_source_connector | String | For data connector tools, the name of the data source to refer the user to in their tool. Currently unused. | [optional] [default to null]
service_principal_id | String | The UUID (username) of the service principal identity that a partner product can use to call Databricks APIs. Note that the format differs from the databricks_user_id field in user_info. If empty, no service principal was created. | [optional] [default to null]
service_principal_oauth_secret | String | The OAuth secret of the service principal identity that a partner product can use to call Databricks APIs (see OAuth M2M). Set only when auth_options in the PartnerConfig contains the value AUTH_OAUTH_M2M. | [optional] [default to null]
connection_scope | String | The scope of users that can use this connection. Workspace means all users in the same workspace; User means only the user who created it. | [optional] [default to null]
oauth_u2m_app_id | String | The client ID of the OAuth U2M app connection (created by Partner Connect) that a partner product can use to initiate the Databricks OAuth U2M flow. Set only when auth_options in the PartnerConfig contains the value AUTH_OAUTH_U2M. | [optional] [default to null]
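As a rough illustration of how some of these fields fit together, the sketch below assembles a Databricks JDBC connection string from hostname, port, and http_path, and backtick-escapes a catalog_name as the catalog_name note suggests. The field values and both helper functions are hypothetical; the URL layout follows the general Databricks JDBC driver format, so consult the driver documentation for the authoritative parameters.

```python
# Sketch only: illustrative values, hypothetical helpers. The URL layout
# follows the general Databricks JDBC driver format (verify against the
# driver docs before relying on it).

def build_jdbc_url(hostname: str, port: int, http_path: str) -> str:
    """Assemble a Databricks JDBC URL from the connection fields."""
    return (
        f"jdbc:databricks://{hostname}:{port}/default;"
        f"transportMode=http;ssl=1;httpPath={http_path}"
    )

def quote_identifier(name: str) -> str:
    """Backtick-escape an identifier; embedded backticks are doubled."""
    return "`" + name.replace("`", "``") + "`"

url = build_jdbc_url(
    "dbc-example.cloud.databricks.com", 443, "/sql/1.0/warehouses/abc123"
)
print(url)
print(quote_identifier("my-catalog"))  # prints `my-catalog`
```

Escaping matters whenever catalog_name contains characters such as hyphens, which are not valid in unquoted SQL identifiers.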
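The service_principal_id and service_principal_oauth_secret fields are intended for the OAuth M2M (client-credentials) flow. A minimal sketch of preparing such a token request is below, assuming the workspace-level token endpoint path `/oidc/v1/token` and the standard OAuth 2.0 client-credentials form fields; the function name is hypothetical, and actually sending the request is left to the caller.

```python
import base64
import urllib.parse

def m2m_token_request(workspace_url: str, client_id: str, client_secret: str):
    """Build the URL, headers, and body for an OAuth M2M token request.

    Assumes the workspace token endpoint is at /oidc/v1/token; the form
    fields follow the standard OAuth 2.0 client-credentials grant.
    """
    token_url = workspace_url.rstrip("/") + "/oidc/v1/token"
    body = urllib.parse.urlencode(
        {"grant_type": "client_credentials", "scope": "all-apis"}
    )
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {creds}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    return token_url, headers, body

# Hypothetical values: in practice, pass workspace_url plus the
# service_principal_id / service_principal_oauth_secret fields above.
url, headers, body = m2m_token_request(
    "https://dbc-example.cloud.databricks.com", "sp-uuid", "sp-secret"
)
```

The access token returned by the endpoint would then be sent as a Bearer token on subsequent Databricks API calls.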