You are building a new application and need to collect data from it in a scalable way. Data arrives continuously throughout the day, and you expect to generate approximately 150 GB of JSON data per day by the end of the year. Your requirements are:
- Decouple the producer from the consumer
- Space- and cost-efficient storage of the raw ingested data, which is to be retained indefinitely
- Near real-time SQL queries
- Maintain at least 2 years of historical data, which will be queried with SQL
Which pipeline should you use to meet these requirements?
A) Create an application that provides an API. Write a tool to poll the API and write data to Cloud Storage as gzipped JSON files.
B) Create an application that writes to a Cloud SQL database to store the data. Set up periodic exports of the database to write to Cloud Storage and load into BigQuery.
C) Create an application that publishes events to Cloud Pub/Sub, and create Spark jobs on Cloud Dataproc to convert the JSON data to Avro format, stored on HDFS on Persistent Disk.
D) Create an application that publishes events to Cloud Pub/Sub, and create a Cloud Dataflow pipeline that transforms the JSON event payloads to Avro, writing the data to Cloud Storage and BigQuery.
Correct Answer: D
Cloud Pub/Sub decouples the producer from downstream consumers and scales with the event volume. A Cloud Dataflow pipeline converts the JSON payloads to compact Avro, which gives space- and cost-efficient indefinite storage in Cloud Storage, while the parallel write to BigQuery supports near real-time SQL queries over the historical data. Option A's polling is not decoupled or scalable; option B makes Cloud SQL a bottleneck for this volume and adds batch-export latency; option C's HDFS on Persistent Disk is neither space- nor cost-efficient for indefinite retention.
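The heart of option D is a per-event transform from a raw JSON payload to a structured record that can be written as Avro to Cloud Storage and streamed into BigQuery. A minimal sketch of that transform, using only the standard library (the field names and schema here are hypothetical, not given in the question; in a real Cloud Dataflow pipeline this logic would sit in an Apache Beam DoFn applied to each Pub/Sub message):

```python
import json

# Hypothetical target schema; the real fields depend on the application's events.
SCHEMA_FIELDS = ("event_id", "user_id", "event_type", "timestamp")

def to_record(raw_json: str) -> dict:
    """Parse one JSON event payload and project it onto the target schema.

    Fields missing from the payload default to None; extra fields are
    dropped, so every output record matches the Avro/BigQuery schema.
    """
    event = json.loads(raw_json)
    return {field: event.get(field) for field in SCHEMA_FIELDS}

payload = '{"event_id": "e1", "user_id": "u9", "event_type": "bid", "timestamp": 1700000000, "debug": true}'
record = to_record(payload)
# record == {"event_id": "e1", "user_id": "u9", "event_type": "bid", "timestamp": 1700000000}
```

Enforcing a fixed schema at this stage is what makes the dual sink work: Avro files in Cloud Storage stay compact and self-describing for indefinite retention, while the same records stream into BigQuery for near real-time SQL.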