You have developed three data processing jobs. One executes a Cloud Dataflow pipeline that transforms data uploaded to Cloud Storage and writes results to BigQuery. The second ingests data from on-premises servers and uploads it to Cloud Storage. The third is a Cloud Dataflow pipeline that gets information from third-party data providers and uploads the information to Cloud Storage. You need to be able to schedule and monitor the execution of these three workflows and manually execute them when needed. What should you do?
A) Create a Directed Acyclic Graph (DAG) in Cloud Composer to schedule and monitor the jobs.
B) Use Stackdriver Monitoring and set up an alert with a Webhook notification to trigger the jobs.
C) Develop an App Engine application to schedule and request the status of the jobs using GCP API calls.
D) Set up cron jobs in a Compute Engine instance to schedule and monitor the pipelines using GCP API calls.
Correct Answer: A
Cloud Composer (managed Apache Airflow) is purpose-built for authoring, scheduling, and monitoring multi-step workflows defined as DAGs, and it also supports manual (on-demand) triggering of a DAG run, which covers all three requirements. The other options require building and maintaining custom scheduling and monitoring logic (Webhook alerts, an App Engine app, or cron jobs on Compute Engine) that Composer provides out of the box.
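As a rough illustration of answer A, the three jobs could be expressed as tasks in a single Airflow DAG deployed to Cloud Composer. This is a minimal sketch: the DAG id, schedule, task ids, and the `BashOperator` placeholder commands are all illustrative assumptions, not details from the question; in practice the Dataflow tasks would use the Dataflow operators from the Google provider package.

```python
# Hypothetical Cloud Composer (Airflow 2) DAG sketching answer A.
# All names and commands below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="three_ingestion_workflows",       # assumed DAG name
    schedule_interval="@daily",               # scheduled runs; manual runs via "Trigger DAG"
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Job 2: ingest on-premises data and upload it to Cloud Storage
    onprem_to_gcs = BashOperator(
        task_id="onprem_to_gcs",
        bash_command="echo 'upload on-prem data to Cloud Storage'",
    )
    # Job 3: Dataflow pipeline pulling third-party data into Cloud Storage
    thirdparty_to_gcs = BashOperator(
        task_id="dataflow_thirdparty_to_gcs",
        bash_command="echo 'launch third-party Dataflow pipeline'",
    )
    # Job 1: Dataflow pipeline transforming Cloud Storage data into BigQuery
    transform_to_bq = BashOperator(
        task_id="dataflow_gcs_to_bigquery",
        bash_command="echo 'launch Dataflow GCS-to-BigQuery pipeline'",
    )

    # Both uploads land in Cloud Storage before the transform runs
    [onprem_to_gcs, thirdparty_to_gcs] >> transform_to_bq
```

The Airflow UI bundled with Composer then provides the monitoring (task status, retries, logs) and the manual-execution button the question asks for.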