Feature: Union schema compatibility #46

Closed
wants to merge 2 commits
4 changes: 3 additions & 1 deletion .buildkite/hooks/pre-command
@@ -22,4 +22,6 @@ export CI_SNOWFLAKE_DBT_WAREHOUSE=$(gcloud secrets versions access latest --secr
export CI_DATABRICKS_DBT_HOST=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_HOST" --project="dbt-package-testing-363917")
export CI_DATABRICKS_DBT_HTTP_PATH=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_HTTP_PATH" --project="dbt-package-testing-363917")
export CI_DATABRICKS_DBT_TOKEN=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_TOKEN" --project="dbt-package-testing-363917")
export CI_DATABRICKS_DBT_CATALOG=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_CATALOG" --project="dbt-package-testing-363917")
export CI_DATABRICKS_DBT_CATALOG=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_CATALOG" --project="dbt-package-testing-363917")
export CI_DATABRICKS_SQL_DBT_HTTP_PATH=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_SQL_DBT_HTTP_PATH" --project="dbt-package-testing-363917")
export CI_DATABRICKS_SQL_DBT_TOKEN=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_SQL_DBT_TOKEN" --project="dbt-package-testing-363917")
17 changes: 16 additions & 1 deletion .buildkite/pipeline.yml
@@ -71,4 +71,19 @@ steps:
- "CI_DATABRICKS_DBT_TOKEN"
- "CI_DATABRICKS_DBT_CATALOG"
commands: |
bash .buildkite/scripts/run_models.sh databricks
bash .buildkite/scripts/run_models.sh databricks

- label: ":databricks: :database: Run Tests - Databricks SQL Warehouse"
key: "run_dbt_databricks_sql"
plugins:
- docker#v3.13.0:
image: "python:3.8"
shell: [ "/bin/bash", "-e", "-c" ]
environment:
- "BASH_ENV=/tmp/.bashrc"
- "CI_DATABRICKS_DBT_HOST"
- "CI_DATABRICKS_SQL_DBT_HTTP_PATH"
- "CI_DATABRICKS_SQL_DBT_TOKEN"
- "CI_DATABRICKS_DBT_CATALOG"
commands: |
bash .buildkite/scripts/run_models.sh databricks-sql
18 changes: 18 additions & 0 deletions .buildkite/scripts/run_models.sh
@@ -16,9 +16,27 @@ db=$1
echo `pwd`
cd integration_tests
dbt deps
if [ "$db" = "databricks-sql" ]; then
dbt seed --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db" --full-refresh
dbt compile --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db"
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db" --full-refresh
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db"
dbt test --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db"
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests, fivetran_platform__usage_pricing: true}' --target "$db" --full-refresh
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests, fivetran_platform__usage_pricing: true}' --target "$db"
dbt test --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db"
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests, fivetran_platform__credits_pricing: false, fivetran_platform__usage_pricing: true}' --target "$db" --full-refresh
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests, fivetran_platform__credits_pricing: false, fivetran_platform__usage_pricing: true}' --target "$db"
dbt test --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db"
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests, fivetran_platform__usage_pricing: false, fivetran_platform_using_destination_membership: false, fivetran_platform_using_user: false}' --target "$db" --full-refresh
dbt run --vars '{mixpanel_schema: mixpanel_sqlw_tests, fivetran_platform__usage_pricing: false, fivetran_platform_using_destination_membership: false, fivetran_platform_using_user: false}' --target "$db"
dbt test --vars '{mixpanel_schema: mixpanel_sqlw_tests}' --target "$db"
else
dbt seed --target "$db" --full-refresh
dbt run --target "$db" --full-refresh
dbt test --target "$db"
dbt run --target "$db"
dbt test --target "$db"
dbt run-operation fivetran_utils.drop_schemas_automation --target "$db"

fi
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,15 @@
# dbt_mixpanel v0.10.0

[PR #46](https://github.com/fivetran/dbt_mixpanel/pull/46) includes the following updates:

## 🚨 Breaking Changes 🚨
> ⚠️ Since the following changes alter the table format, we recommend running a `--full-refresh` after upgrading to this version to avoid possible incremental failures.

## Under the Hood
- The `is_databricks_sql_warehouse` macro has been renamed to `is_incremental_compatible` and modified to return `true` if the Databricks runtime in use is an all-purpose cluster (previously the macro only checked whether a SQL warehouse runtime was used) **or** if any other supported non-Databricks destination is being used.
  - This update was applied because other Databricks runtimes have been discovered (i.e., an endpoint and external runtime) which do not support the `insert_overwrite` incremental strategy used in the `mixpanel__` model.
  - In addition, for Databricks users the `mixpanel__` model will now leverage the incremental strategy only if the Databricks runtime is an all-purpose cluster; all other Databricks runtimes will not leverage an incremental strategy (see the illustrative sketch below).
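
For context, the sketch below illustrates how an end model's config *could* gate its materialization on the renamed macro. This is an illustrative sketch only, not the package's exact configuration; the no-argument macro call, the strategy mapping, and the `unique_event_id` key are assumptions.

```sql
-- Illustrative sketch only; not the package's actual model config.
-- Assumes is_incremental_compatible() takes no arguments and that the strategy
-- mapping mirrors the README's description of each warehouse's default.
{{ config(
    materialized='incremental' if mixpanel.is_incremental_compatible() else 'table',
    incremental_strategy='insert_overwrite' if target.type in ('bigquery', 'databricks') else 'delete+insert',
    unique_key='unique_event_id'  -- hypothetical unique key, for demonstration only
) }}
```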

# dbt_mixpanel v0.9.0
[PR #41](https://github.com/fivetran/dbt_mixpanel/pull/41) includes the following updates:

58 changes: 29 additions & 29 deletions README.md
@@ -14,7 +14,7 @@
</p>

# Mixpanel dbt Package ([Docs](https://fivetran.github.io/dbt_mixpanel/))
# 📣 What does this dbt package do?
## 📣 What does this dbt package do?

- Produces modeled tables that leverage Mixpanel data from [Fivetran's connector](https://fivetran.com/docs/applications/mixpanel). It uses the Mixpanel `event` table in the format described by [this ERD](https://fivetran.com/docs/applications/mixpanel#schemainformation).

@@ -39,23 +39,23 @@ The following table provides a detailed list of all models materialized within t

<!--section-end-->

# 🎯 How do I use the dbt package?
## 🎯 How do I use the dbt package?

## Step 1: Prerequisites
### Step 1: Prerequisites
To use this dbt package, you must have the following:

- At least one Fivetran Mixpanel connector syncing data into your destination.
- A **BigQuery**, **Snowflake**, **Redshift**, **PostgreSQL**, or **Databricks** destination.

### Databricks dispatch configuration
#### Databricks dispatch configuration
If you are using a Databricks destination with this package, you must add the following (or a variation of the following) dispatch configuration within your `dbt_project.yml`. This is required for the package to accurately search for macros within the `dbt-labs/spark_utils` package and then the `dbt-labs/dbt_utils` package, respectively.
```yml
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
```

### Database Incremental Strategies
#### Database Incremental Strategies
Many of the end models in this package are materialized incrementally, so we have configured our models to work with the different strategies available to each supported warehouse.

For **BigQuery** and **Databricks** destinations, we have chosen `insert_overwrite` as the default strategy, which benefits from the partitioning capability.
@@ -64,7 +64,7 @@ For **Snowflake**, **Redshift**, and **Postgres** databases, we have chosen `del

> Regardless of strategy, we recommend that users periodically run a `--full-refresh` to ensure a high level of data quality.
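
If you ever need to experiment with these settings, standard dbt project-level overrides apply. Below is a minimal, optional sketch (nothing in it is required by the package) showing how you could temporarily override the materialization of the package's models from your own `dbt_project.yml`:

```yml
models:
  mixpanel:
    # Hypothetical, temporary override: build full tables while troubleshooting
    # incremental behavior, then remove this line and run `dbt run --full-refresh`.
    +materialized: table
```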

## Step 2: Install the package
### Step 2: Install the package
Include the following mixpanel package version in your `packages.yml` file:
> TIP: Check [dbt Hub](https://hub.getdbt.com/) for the latest installation instructions or [read the dbt docs](https://docs.getdbt.com/docs/package-management) for more information on installing packages.

@@ -74,7 +74,7 @@ packages:
version: [">=0.9.0", "<0.10.0"] # we recommend using ranges to capture non-breaking changes automatically
```

## Step 3: Define database and schema variables
### Step 3: Define database and schema variables
By default, this package runs using your destination and the `mixpanel` schema. If this is not where your Mixpanel data is (for example, if your Mixpanel schema is named `mixpanel_fivetran`), add the following configuration to your root `dbt_project.yml` file:

```yml
@@ -83,10 +83,10 @@ vars:
mixpanel_schema: your_schema_name
```

## (Optional) Step 4: Additional configurations
### (Optional) Step 4: Additional configurations

## Macros
### analyze_funnel [(source)](https://github.com/fivetran/dbt_mixpanel/blob/master/macros/analyze_funnel.sql)
### Macros
#### analyze_funnel [(source)](https://github.com/fivetran/dbt_mixpanel/blob/master/macros/analyze_funnel.sql)
You can use the `analyze_funnel(event_funnel, group_by_column, conversion_criteria)` macro to produce a funnel between a given list of event types.

It returns the following:
@@ -103,7 +103,7 @@ The macro takes the following as arguments:
- `conversion_criteria`: (Optional) A `WHERE` clause that will be applied when selecting from `mixpanel__event`.
  - Example: To limit all events in the funnel to the United States, you'd provide `conversion_criteria = 'country_code = "US"'`. To apply the US restriction only to song-play events (leaving other event types unfiltered), you'd input `conversion_criteria = 'country_code = "US" or event_type != "play_song"'`.
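
As an illustrative sketch, a model that calls this macro might look like the following. The model name and event types are hypothetical, and the call assumes the macro is referenced through the package's `mixpanel` namespace:

```sql
-- models/song_funnel.sql (hypothetical model; event names must match your event_type values)
{{ mixpanel.analyze_funnel(
    event_funnel=['sign_up', 'create_playlist', 'play_song'],
    group_by_column='country_code',
    conversion_criteria='country_code = "US"'
) }}
```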

### Pivoting Out Event Properties
#### Pivoting Out Event Properties
By default, this package selects the [default columns collected by Mixpanel](https://help.mixpanel.com/hc/en-us/articles/115004613766-What-properties-do-Mixpanel-s-libraries-store-by-default-). However, you likely have custom properties or columns that you'd like to include in the `mixpanel__event` model.

If there are properties in the `mixpanel.event.properties` JSON blob that you'd like to pivot out into columns, add the following variable to your `dbt_project.yml` file:
@@ -114,7 +114,7 @@ vars:
event_properties_to_pivot: ['the', 'list', 'of', 'property', 'fields'] # Note: this is case-SENSITIVE and must match the casing of the property as it appears in the JSON
```

### Passthrough Columns
#### Passthrough Columns
Additionally, this package includes all standard source `EVENT` columns defined in the `staging_columns` macro. You can add more columns using our passthrough column variables. These variables allow the passthrough fields to be aliased (`alias`) and cast (`transform_sql`) if desired, although neither is required. Data type casting is configured via a SQL snippet within the `transform_sql` key. You may add the desired SQL snippet while omitting the `as field_name` portion of the casting statement (the alias attribute handles this), and your custom passthrough fields will be cast accordingly.

Use the following format for declaring the respective passthrough variables:
@@ -129,14 +129,14 @@ vars:
- name: "this_other_field"
transform_sql: "cast(this_other_field as string)"
```
### Sessions Event Frequency Limit
#### Sessions Event Frequency Limit
The `event_frequencies` field within the `mixpanel__sessions` model reports all event types and the frequency of those events as a JSON blob built via string aggregation. Some users have thousands of distinct event types, and Redshift and Postgres warehouses currently impose a limit on string aggregations (up to 65,535). As a result, so that Redshift and Postgres users can still leverage the `event_frequencies` field, an artificial limit of 1,000 events is applied to this field. If you would like to adjust this limit, you may do so by modifying the below variable in your project configuration.
```yml
vars:
  mixpanel:
    mixpanel__event_frequency_limit: 500 ## Default is 1000
```
### Event Date Range
#### Event Date Range
Because of the typical volume of event data, you may want to limit this package's models to work with a recent date range of your Mixpanel data (however, note that all final models are materialized as [incremental](https://docs.getdbt.com/docs/building-a-dbt-project/building-models/materializations#incremental) tables).

By default, the package looks at all events since January 1, 2010. To change this start date, add the following variable to your `dbt_project.yml` file:
@@ -149,7 +149,7 @@ vars:

**Note:** This date range will not affect the `number_of_new_users` column in the `mixpanel__daily_events` or `mixpanel__monthly_events` models. This metric will be *true* new users.

### Global Event Filters
#### Global Event Filters
In addition to limiting the date range, you may want to employ other filters to remove noise from your event data.

To apply a global filter to events (and therefore **all** models in this package), add the following variable to your `dbt_project.yml` file. It will be applied as a `WHERE` clause when selecting from the source table, `mixpanel.event`.
@@ -161,8 +161,8 @@ vars:
global_event_filter: 'distinct_id != "1234abcd"'
```

### Session Configurations
#### Session Inactivity Timeout
#### Session Configurations
##### Session Inactivity Timeout
This package sessionizes events based on the periods of inactivity between a user's events on a device. By default, the package will denote a new session once the period between events surpasses **30 minutes**.

To change this timeout value, add the following variable to your `dbt_project.yml` file:
@@ -173,7 +173,7 @@ vars:
sessionization_inactivity: number_of_minutes # ex: 60
```

#### Session Pass-Through Columns
##### Session Pass-Through Columns
By default, the `mixpanel__sessions` model will contain the following columns from `mixpanel__event`:
- `people_id`: The ID of the user
- `device_id`: The ID of the device they used in this session
@@ -187,7 +187,7 @@ vars:
session_passthrough_columns: ['the', 'list', 'of', 'column', 'names']
```

#### Session Event Criteria
##### Session Event Criteria
In addition to any global event filters, you may want to exclude certain events, or place filters on them, to control which events qualify for sessionization.

To apply any filters to the events in the sessions model, add the following variable to your `dbt_project.yml` file. It will be applied as a `WHERE` clause when selecting from `mixpanel__event`.
@@ -200,7 +200,7 @@ vars:
session_event_criteria: 'event_type in ("play_song", "stop_song", "create_playlist")'
```

#### Lookback Window
##### Lookback Window
Events can sometimes arrive late. For example, events triggered on a mobile device that is offline will be sent to Mixpanel once the device reconnects to wifi or a cell network. Since many of the models in this package are incremental, by default we look back 7 days to ensure late arrivals are captured while avoiding the need for a full refresh. To change the default lookback window, add the following variable to your `dbt_project.yml` file:

```yml
@@ -209,7 +209,7 @@ vars:
lookback_window: number_of_days # default is 7
```

### Changing the Build Schema
#### Changing the Build Schema
By default, this package will build the Mixpanel staging models within a schema titled (<target_schema> + `_stg_mixpanel`) and Mixpanel final models within a schema titled (<target_schema> + `_mixpanel`) in your target database. If this is not where you would like your modeled Mixpanel data to be written, add the following configuration to your `dbt_project.yml` file:

```yml
@@ -220,7 +220,7 @@ models:
+schema: my_new_schema_name # leave blank for just the target_schema
```

### Change the source table references
#### Change the source table references
If an individual source table has a different name than the package expects, add the table name as it appears in your destination to the respective variable:

> IMPORTANT: See this project's [`dbt_project.yml`](https://github.com/fivetran/dbt_mixpanel/blob/main/dbt_project.yml) variable declarations to see the expected names.
@@ -230,7 +230,7 @@ vars:
mixpanel_<default_source_table_name>_identifier: your_table_name
```

## Event De-Duplication Logic
### Event De-Duplication Logic

Events are considered duplicates and consolidated by the package if they contain the same:
* `insert_id` (used for de-duplication internally by Mixpanel)
@@ -240,14 +240,14 @@ Events are considered duplicates and consolidated by the package if they contain

This is performed in line with Mixpanel's internal de-duplication process, in which events are de-duped at the end of each day. This means that if an event was triggered during an offline session at 11:59 PM and _resent_ when the user came online at 12:01 AM, these records would _not_ be de-duplicated. This is the case in both Mixpanel and the Mixpanel dbt package.
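
For intuition, the following is a hedged sketch of what day-bounded de-duplication can look like in SQL. It is not the package's actual code; the `occurred_at` column name and the source reference are assumptions for demonstration only.

```sql
-- Illustrative sketch; keeps one row per duplicate key per calendar day.
select *
from (
    select
        events.*,
        row_number() over (
            -- partition on insert_id (plus the other duplicate keys listed above),
            -- bounded to the event's calendar day
            partition by insert_id, cast(occurred_at as date)
            order by occurred_at
        ) as row_num
    from {{ source('mixpanel', 'event') }} as events  -- assumed source reference
) as ranked
where row_num = 1
```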

## (Optional) Step 5: Orchestrate your models with Fivetran Transformations for dbt Core™
### (Optional) Step 5: Orchestrate your models with Fivetran Transformations for dbt Core™
<details><summary>Expand for details</summary>
<br>

Fivetran offers the ability for you to orchestrate your dbt project through [Fivetran Transformations for dbt Core™](https://fivetran.com/docs/transformations/dbt). Learn how to set up your project for orchestration through Fivetran in our [Transformations for dbt Core setup guides](https://fivetran.com/docs/transformations/dbt#setupguide).
</details>

# 🔍 Does this package have dependencies?
## 🔍 Does this package have dependencies?
This dbt package is dependent on the following dbt packages. Please be aware that these dependencies are installed by default within this package. For more information on the following packages, refer to the [dbt hub](https://hub.getdbt.com/) site.
> IMPORTANT: If you have any of these dependent packages in your own `packages.yml` file, we highly recommend that you remove them from your root `packages.yml` to avoid package version conflicts.

@@ -259,16 +259,16 @@ packages:
- package: dbt-labs/dbt_utils
version: [">=1.0.0", "<2.0.0"]
```
# 🙌 How is this package maintained and can I contribute?
## Package Maintenance
## 🙌 How is this package maintained and can I contribute?
### Package Maintenance
The Fivetran team maintaining this package _only_ maintains the latest version of the package. We highly recommend you stay consistent with the [latest version](https://hub.getdbt.com/fivetran/mixpanel/latest/) of the package and refer to the [CHANGELOG](https://github.com/fivetran/dbt_mixpanel/blob/main/CHANGELOG.md) and release notes for more information on changes across versions.

## Contributions
### Contributions
A small team of analytics engineers at Fivetran develops these dbt packages. However, the packages are made better by community contributions!

We highly encourage and welcome contributions to this package. Check out [this dbt Discourse article](https://discourse.getdbt.com/t/contributing-to-a-dbt-package/657) on the best workflow for contributing to a package!

# 🏪 Are there any resources available?
## 🏪 Are there any resources available?
- If you have questions or want to reach out for help, please refer to the [GitHub Issue](https://github.com/fivetran/dbt_mixpanel/issues/new/choose) section to find the right avenue of support for you.
- If you would like to provide feedback to the dbt package team at Fivetran or would like to request a new dbt package, fill out our [Feedback Form](https://www.surveymonkey.com/r/DQ7K7WW).
- Have questions or want to just say hi? Book a time during our office hours [on Calendly](https://calendly.com/fivetran-solutions-team/fivetran-solutions-team-office-hours) or email us at [email protected].
2 changes: 1 addition & 1 deletion dbt_project.yml
@@ -1,6 +1,6 @@
config-version: 2
name: 'mixpanel'
version: '0.9.0'
version: '0.10.0'
require-dbt-version: [">=1.3.0", "<2.0.0"]
models:
mixpanel:
6 changes: 5 additions & 1 deletion integration_tests/dbt_project.yml
@@ -1,5 +1,5 @@
name: 'mixpanel_integration_tests'
version: '0.9.0'
version: '0.10.0'
config-version: 2
profile: 'integration_tests'
vars:
@@ -19,3 +19,7 @@ seeds:
dispatch:
- macro_namespace: dbt_utils
search_order: ['spark_utils', 'dbt_utils']

models:
  mixpanel:
    +schema: "{{ 'mixpanel_sqlw_tests' if target.name == 'databricks-sql' else 'mixpanel' }}"