
Which Java SDK class can you use to run your Dataflow programs locally?

A.

LocalRunner

B.

DirectPipelineRunner

C.

MachineRunner

D.

LocalPipelineRunner
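
For context: DirectPipelineRunner executes a pipeline in-process on your own machine. A minimal sketch, assuming the Dataflow Java SDK 1.x package layout (com.google.cloud.dataflow.sdk.*):

    import com.google.cloud.dataflow.sdk.Pipeline;
    import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
    import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
    import com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner;

    public class LocalRunExample {
      public static void main(String[] args) {
        // Parse command-line options and select the local, in-process runner.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DirectPipelineRunner.class);

        // The pipeline now runs on the local machine instead of the Dataflow service.
        Pipeline p = Pipeline.create(options);
        // ... apply transforms here ...
        p.run();
      }
    }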

Which of the following is NOT one of the three main types of triggers that Dataflow supports?

A.

Trigger based on element size in bytes

B.

Trigger that is a combination of other triggers

C.

Trigger based on element count

D.

Trigger based on time
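
The three supported trigger families are time-based triggers, data-driven triggers (element count), and composite triggers built from other triggers; element size in bytes is not one of them. A hedged sketch showing all three, assuming the windowing classes of the Dataflow/Beam Java SDK (com.google.cloud.dataflow.sdk.transforms.windowing.* plus org.joda.time.Duration) and a hypothetical input PCollection<String> named input:

    // Composite trigger (Repeatedly/AfterFirst) combining a count-based and a time-based trigger.
    PCollection<String> triggered = input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
            .triggering(Repeatedly.forever(
                AfterFirst.of(
                    AfterPane.elementCountAtLeast(100),            // data-driven: element count
                    AfterProcessingTime.pastFirstElementInPane()
                        .plusDelayOf(Duration.standardSeconds(30)) // time-based
                )))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());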

All Google Cloud Bigtable client requests go through a front-end server ______ they are sent to a Cloud Bigtable node.

A.

before

B.

after

C.

only if

D.

once

You are planning to use Google's Dataflow SDK to analyze customer data such as the records displayed below. Your project requirement is to extract only the customer name from the data source and write it to an output PCollection.

Tom,555 X street

Tim,553 Y street

Sam, 111 Z street

Which operation is best suited for the above data processing requirement?

A.

ParDo

B.

Sink API

C.

Source API

D.

Data extraction
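
ParDo is the element-wise transform suited to this kind of extraction. A minimal sketch, assuming the classic Dataflow Java SDK DoFn style (ParDo, DoFn, and PCollection from com.google.cloud.dataflow.sdk.*; in Apache Beam the method carries an @ProcessElement annotation instead) and a hypothetical input PCollection<String> named lines:

    // Each input element looks like "Tom,555 X street"; keep only the name before the first comma.
    PCollection<String> names = lines.apply(ParDo.of(new DoFn<String, String>() {
      @Override
      public void processElement(ProcessContext c) {
        String[] fields = c.element().split(",", 2);
        c.output(fields[0].trim());
      }
    }));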

Which of the following statements is NOT true regarding Bigtable access roles?

A.

Using IAM roles, you cannot give a user access to only one table in a project rather than to all tables in the project.

B.

To give a user access to only one table in a project, grant the user the Bigtable Editor role for that table.

C.

You can configure access control only at the project level.

D.

To give a user access to only one table in a project, you must configure access through your application.

Which of the following statements about Legacy SQL and Standard SQL is not true?

A.

Standard SQL is the preferred query language for BigQuery.

B.

If you write a query in Legacy SQL, it might generate an error if you try to run it with Standard SQL.

C.

One difference between the two query languages is how you specify fully-qualified table names (i.e., table names that include their associated project name).

D.

You need to set a query language for each dataset and the default is Standard SQL.
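
To illustrate the table-naming difference mentioned in option C: Legacy SQL brackets a fully-qualified name as [project:dataset.table], while Standard SQL wraps project.dataset.table in backticks. A hedged sketch using the google-cloud-bigquery Java client; the project, dataset, and table names are hypothetical:

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.QueryJobConfiguration;

    public class DialectExample {
      public static void main(String[] args) throws InterruptedException {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Legacy SQL: fully-qualified table names use [project:dataset.table].
        QueryJobConfiguration legacy = QueryJobConfiguration
            .newBuilder("SELECT name FROM [myproject:mydataset.customers]")
            .setUseLegacySql(true)
            .build();

        // Standard SQL (the preferred dialect): `project.dataset.table` in backticks.
        QueryJobConfiguration standard = QueryJobConfiguration
            .newBuilder("SELECT name FROM `myproject.mydataset.customers`")
            .setUseLegacySql(false)
            .build();

        bigquery.query(legacy);
        bigquery.query(standard);
      }
    }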

Which of these is not a supported method of putting data into a partitioned table?

A.

If you have existing data in a separate file for each day, then create a partitioned table and upload each file into the appropriate partition.

B.

Run a query to get the records for a specific day from an existing table and for the destination table, specify a partitioned table ending with the day in the format "$YYYYMMDD".

C.

Create a partitioned table and stream new records to it every day.

D.

Use ORDER BY to put a table's rows into chronological order and then change the table's type to "Partitioned".
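
Option B refers to the partition decorator syntax: appending $YYYYMMDD to the destination table name writes the query results into that day's partition. A hedged sketch using the google-cloud-bigquery Java client; the dataset, table, and column names are hypothetical:

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.QueryJobConfiguration;
    import com.google.cloud.bigquery.TableId;

    public class PartitionWriteExample {
      public static void main(String[] args) throws InterruptedException {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Pull one day's records from an existing table and write them into the
        // matching daily partition via the $YYYYMMDD decorator on the destination table.
        QueryJobConfiguration config = QueryJobConfiguration
            .newBuilder("SELECT * FROM `myproject.mydataset.events` WHERE DATE(ts) = '2018-03-01'")
            .setDestinationTable(TableId.of("mydataset", "events_by_day$20180301"))
            .build();

        bigquery.query(config);
      }
    }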

When running a pipeline that has a BigQuery source, on your local machine, you continue to get permission denied errors. What could be the reason for that?

A.

Your gcloud does not have access to the BigQuery resources

B.

BigQuery cannot be accessed from local machines

C.

You are missing gcloud on your machine

D.

Pipelines cannot be run locally

Which of the following is NOT a valid use case to select HDD (hard disk drives) as the storage for Google Cloud Bigtable?

A.

You expect to store at least 10 TB of data.

B.

You will mostly run batch workloads with scans and writes, rather than frequently executing random reads of a small number of rows.

C.

You need to integrate with Google BigQuery.

D.

You will not use the data to back a user-facing or latency-sensitive application.

What are all of the BigQuery operations that Google charges for?

A.

Storage, queries, and streaming inserts

B.

Storage, queries, and loading data from a file

C.

Storage, queries, and exporting data

D.

Queries and streaming inserts