Additional Steps for BQ_Source #1472
Open · Praveena2607 wants to merge 11 commits into data-integrations:develop from Praveena2607:GoogleBigQSource
Commits
2f6ad0f  Additional Steps for BQ. (Praveena2607)
60fb70f  Updated CdfPluginPropertyLocator and associated files for BigQuery Pl… (Praveena2607)
7e71bc2  Fixed Checkstyle issues and ensured all scenarios are running success… (Praveena2607)
0870576  Refactor CdfPluginPropertyLocator alignment and update build dependen… (Praveena2607)
06ac469  All the changes are made. Please check. (Praveena2607)
840465f  All the changes are made. Please check and let me know. (Praveena2607)
6aeccd4  I have now removed the unnecessary imports that are not in use. (Praveena2607)
06505be  Merge branch 'data-integrations:develop' into GoogleBigQSource (Praveena2607)
645c24c  All the review comments have been addressed, and the required changes… (Praveena2607)
8f29e31  All the review comments have been addressed, and the required changes… (Praveena2607)
be1e5b6  Merge branch 'data-integrations:develop' into GoogleBigQSource (Praveena2607)
@@ -69,3 +69,153 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data trans
    Then Verify the pipeline status is "Succeeded"
    Then Verify data is transferred to target GCS bucket
    Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled

  @CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
  Scenario: Validate successful records transfer from BigQuery to GCS with macro arguments for partition start date and partition end date
    Given Open Datafusion Project to configure pipeline
    When Source is BigQuery
    When Sink is GCS
    Then Open BigQuery source properties
    Then Enter BigQuery property reference name
    Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
    Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
    Then Enter BigQuery property "partitionFrom" as macro argument "bqStartDate"
    Then Enter BigQuery property "partitionTo" as macro argument "bqEndDate"
    Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType"
    Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount"
    Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount"
    Then Enter BigQuery property "dataset" as macro argument "bqDataset"
    Then Enter BigQuery property "table" as macro argument "bqSourceTable"
    Then Validate "BigQuery" plugin properties
    Then Close the BigQuery properties
    Then Open GCS sink properties
    Then Enter GCS property reference name
    Then Enter GCS property "projectId" as macro argument "gcsProjectId"
    Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
    Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount"
    Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount"
    Then Enter GCS property "path" as macro argument "gcsSinkPath"
    Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix"
    Then Enter GCS property "format" as macro argument "gcsFormat"
    Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled
    Then Validate "GCS" plugin properties
    Then Close the GCS properties
    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Enter runtime argument value "projectId" for key "bqProjectId"
    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
    Then Enter runtime argument value "partitionFrom" for key "bqStartDate"
    Then Enter runtime argument value "partitionTo" for key "bqEndDate"
    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
    Then Enter runtime argument value "dataset" for key "bqDataset"
    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
    Then Enter runtime argument value "projectId" for key "gcsProjectId"
    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
    Then Run the preview of pipeline with runtime arguments
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Click on preview data for GCS sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "projectId" for key "bqProjectId"
    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
    Then Enter runtime argument value "partitionFrom" for key "bqStartDate"
    Then Enter runtime argument value "partitionTo" for key "bqEndDate"
    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
    Then Enter runtime argument value "dataset" for key "bqDataset"
    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
    Then Enter runtime argument value "projectId" for key "gcsProjectId"
    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Verify data is transferred to target GCS bucket
    Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
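For context on the steps above: each "as macro argument" step sets a plugin property to a ${key} placeholder, and the later "Enter runtime argument value" steps supply concrete values for those keys before the preview and deployed runs. The sketch below only illustrates that substitution pattern; the class name, property names, and sample values are assumptions for illustration and are not part of the framework.

    import java.util.Map;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Illustration only: resolves ${key} placeholders the way the scenario's
    // runtime arguments fill the macro-enabled plugin properties.
    public class MacroResolutionSketch {
      private static final Pattern MACRO = Pattern.compile("\\$\\{([^}]+)}");

      static String resolve(String value, Map<String, String> runtimeArgs) {
        Matcher m = MACRO.matcher(value);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
          // Fall back to the raw placeholder if no runtime argument was provided.
          String replacement = runtimeArgs.getOrDefault(m.group(1), m.group(0));
          m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
      }

      public static void main(String[] args) {
        // Hypothetical values; in the tests these come from the property files.
        Map<String, String> runtimeArgs = Map.of(
            "bqStartDate", "2024-01-01",
            "bqEndDate", "2024-01-31",
            "bqDataset", "test_dataset");
        System.out.println(resolve("${bqStartDate}", runtimeArgs)); // 2024-01-01
        System.out.println(resolve("${bqDataset}.table", runtimeArgs)); // test_dataset.table
      }
    }
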
  @CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
  Scenario: Validate successful records transfer from BigQuery to GCS with macro arguments for filter and Output Schema
    Given Open Datafusion Project to configure pipeline
    When Source is BigQuery
    When Sink is GCS
    [Review comment on this line: Use the latest steps from framework.]
    Then Open BigQuery source properties
    Then Enter BigQuery property reference name
    Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
    Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
    Then Enter BigQuery property "filter" as macro argument "bqFilter"
    Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType"
    Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount"
    Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount"
    Then Enter BigQuery property "dataset" as macro argument "bqDataset"
    Then Enter BigQuery property "table" as macro argument "bqSourceTable"
    Then Select Macro action of output schema property: "Output Schema-macro-input" and set the value to "bqOutputSchema"
    Then Validate "BigQuery" plugin properties
    Then Close the BigQuery properties
    Then Open GCS sink properties
    Then Enter GCS property reference name
    Then Enter GCS property "projectId" as macro argument "gcsProjectId"
    Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
    Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount"
    Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount"
    Then Enter GCS property "path" as macro argument "gcsSinkPath"
    Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix"
    Then Enter GCS property "format" as macro argument "gcsFormat"
    Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled
    Then Validate "GCS" plugin properties
    Then Close the GCS properties
    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Enter runtime argument value "projectId" for key "bqProjectId"
    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
    Then Enter runtime argument value "filter" for key "bqFilter"
    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
    Then Enter runtime argument value "dataset" for key "bqDataset"
    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
    Then Enter runtime argument value "OutputSchema" for key "bqOutputSchema"
    Then Enter runtime argument value "projectId" for key "gcsProjectId"
    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
    Then Run the preview of pipeline with runtime arguments
    Then Wait till pipeline preview is in running state
    Then Open and capture pipeline preview logs
    Then Verify the preview run status of pipeline in the logs is "succeeded"
    Then Close the pipeline logs
    Then Click on preview data for GCS sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "projectId" for key "bqProjectId"
    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
    Then Enter runtime argument value "filter" for key "bqFilter"
    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
    Then Enter runtime argument value "dataset" for key "bqDataset"
    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
    Then Enter runtime argument value "OutputSchema" for key "bqOutputSchema"
    Then Enter runtime argument value "projectId" for key "gcsProjectId"
    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Verify data is transferred to target GCS bucket
    Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
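The commit messages in this PR mention updating CdfPluginPropertyLocator so that the new "partitionFrom", "partitionTo", and "filter" steps can locate their plugin properties. The actual enum lives in the e2e framework and is not shown in this diff; the following is only a hedged sketch of the kind of name-to-property mapping such an update implies, with the class and constant names chosen here for illustration.

    // Sketch only: maps logical names used by feature steps to plugin property keys.
    // The real CdfPluginPropertyLocator may differ in naming, shape, and members.
    public enum PluginPropertyLocatorSketch {
      PARTITION_START_DATE("partitionFrom"),
      PARTITION_END_DATE("partitionTo"),
      FILTER("filter");

      private final String pluginProperty;

      PluginPropertyLocatorSketch(String pluginProperty) {
        this.pluginProperty = pluginProperty;
      }

      public String getPluginProperty() {
        return pluginProperty;
      }
    }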
Review comment: this is still not reverted to older version
Reply: It is now reverted, please review.