An external customer provides you with a daily dump of data from their database. The data flows into Google Cloud Storage (GCS) as comma-separated values (CSV) files. You want to analyze this data in Google BigQuery, but the data could contain rows that are formatted incorrectly or corrupted. How should you build this pipeline?
A) Use federated data sources, and check data in the SQL query.
B) Enable BigQuery monitoring in Google Stackdriver and create an alert.
C) Import the data into BigQuery using the gcloud CLI and set max_bad_records to 0.
D) Run a Google Cloud Dataflow batch pipeline to import the data into BigQuery, and push errors to another dead-letter table for analysis.
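To illustrate the dead-letter pattern described in option D, here is a minimal sketch of the routing logic in plain Python. This is not actual Cloud Dataflow (Apache Beam) code; the schema, column count, and function names are illustrative assumptions. In a real Dataflow pipeline this logic would live in a DoFn that emits valid rows to the main BigQuery table and failures, with their error details, to a separate dead-letter table.

```python
import csv
import io

# Illustrative schema assumption: each row must have exactly 3 fields,
# and the second field must be an integer. The real schema would come
# from the customer's database dump.
EXPECTED_COLUMNS = 3

def route_rows(csv_text):
    """Split raw CSV lines into (good_rows, dead_letter_rows).

    Valid rows go to the main output; malformed rows are captured
    with their line number and error message for later analysis,
    mirroring a dead-letter table in BigQuery.
    """
    good, dead = [], []
    for line_no, line in enumerate(csv_text.splitlines(), start=1):
        try:
            row = next(csv.reader(io.StringIO(line)))
            if len(row) != EXPECTED_COLUMNS:
                raise ValueError(
                    f"expected {EXPECTED_COLUMNS} fields, got {len(row)}"
                )
            int(row[1])  # example type check on the second column
            good.append(row)
        except (ValueError, StopIteration) as err:
            dead.append({"line": line_no, "raw": line, "error": str(err)})
    return good, dead

# One well-formed row, one with a bad type, one with too few fields.
good, dead = route_rows("a,1,x\nb,oops,y\nc,2")
```

The key property of this approach, and the reason option D is attractive for untrusted input, is that corrupted rows never abort the load: they are preserved alongside the reason they failed, so they can be inspected and reprocessed later.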
Correct Answer: Verified