
To give a user read permission for only the first three columns of a table, which access control method would you use?

A. Primitive role
B. Predefined role
C. Authorized view
D. It's not possible to give access to only the first three columns of a table.
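
For context, an authorized view is the mechanism that exposes a subset of a table's columns without granting users access to the underlying table. A minimal Python sketch using the google-cloud-bigquery client, with hypothetical project, dataset, and column names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical identifiers for illustration.
source_dataset_id = "my-project.source_data"
view_id = "my-project.shared_views.first_three_columns"

# Create a view that selects only the first three columns.
view = bigquery.Table(view_id)
view.view_query = """
    SELECT col_a, col_b, col_c
    FROM `my-project.source_data.wide_table`
"""
view = client.create_table(view)

# Authorize the view against the source dataset so that users who can
# query the view never need permissions on the underlying table.
source_dataset = client.get_dataset(source_dataset_id)
entries = list(source_dataset.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))
source_dataset.access_entries = entries
client.update_dataset(source_dataset, ["access_entries"])
```

Users then only need read access on the dataset that contains the view, not on the source dataset.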

Which of these statements about BigQuery caching is true?

A. By default, a query's results are not cached.
B. BigQuery caches query results for 48 hours.
C. Query results are cached even if you specify a destination table.
D. There is no charge for a query that retrieves its results from cache.
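
For context, BigQuery caches query results per user for roughly 24 hours by default, skips the cache when a destination table is specified, and does not bill for queries served from cache. A minimal sketch that checks for a cache hit, using a public dataset:

```python
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT name
    FROM `bigquery-public-data.usa_names.usa_1910_current`
    LIMIT 10
"""

# The first run populates the cache (caching is on by default).
client.query(sql).result()

# An identical second run can be served from cache at no charge.
job = client.query(sql)
job.result()
print("served from cache:", job.cache_hit)

# Caching can also be disabled per query.
no_cache = bigquery.QueryJobConfig(use_query_cache=False)
client.query(sql, job_config=no_cache).result()
```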

What are two of the characteristics of using online prediction rather than batch prediction?

A. It is optimized to handle a high volume of data instances in a job and to run more complex models.
B. Predictions are returned in the response message.
C. Predictions are written to output files in a Cloud Storage location that you specify.
D. It is optimized to minimize the latency of serving predictions.
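
For context, online prediction is optimized for low-latency serving and returns predictions in the response message, while batch prediction writes results to output files in Cloud Storage. A minimal sketch of an online request against the AI Platform (v1) REST API, assuming a hypothetical project, model, and instance format:

```python
from googleapiclient import discovery

# Hypothetical model path and input instance.
name = "projects/my-project/models/my_model"
instances = [{"feature_a": 1.0, "feature_b": 0.5}]

service = discovery.build("ml", "v1")
response = (
    service.projects()
    .predict(name=name, body={"instances": instances})
    .execute()
)

# Predictions come back synchronously in the response message,
# not as output files in Cloud Storage.
print(response["predictions"])
```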

Which of these operations can you perform from the BigQuery Web UI?

A. Upload a file in SQL format.
B. Load data with nested and repeated fields.
C. Upload a 20 MB file.
D. Upload multiple files using a wildcard.
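
For context, nested and repeated fields cannot be represented in CSV, but they load from newline-delimited JSON, in the web UI or programmatically. A minimal sketch using the google-cloud-bigquery client, assuming hypothetical bucket and table names:

```python
from google.cloud import bigquery

client = bigquery.Client()

# A REPEATED RECORD column holds the nested, repeated data.
schema = [
    bigquery.SchemaField("name", "STRING"),
    bigquery.SchemaField(
        "addresses",
        "RECORD",
        mode="REPEATED",
        fields=[
            bigquery.SchemaField("city", "STRING"),
            bigquery.SchemaField("zip", "STRING"),
        ],
    ),
]

job_config = bigquery.LoadJobConfig(
    schema=schema,
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
)
client.load_table_from_uri(
    "gs://my-bucket/people.json",  # hypothetical source file
    "my-project.my_dataset.people",
    job_config=job_config,
).result()
```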

Which of the following statements about the Wide & Deep Learning model are true? (Select 2 answers.)

A. The wide model is used for memorization, while the deep model is used for generalization.
B. A good use for the wide and deep model is a recommender system.
C. The wide model is used for generalization, while the deep model is used for memorization.
D. A good use for the wide and deep model is a small-scale linear regression problem.
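
For context, in the Wide & Deep architecture the wide (linear) component memorizes sparse feature crosses while the deep (DNN) component generalizes through embeddings, which is why recommender systems are the canonical use case. A minimal sketch using TensorFlow's estimator API, with hypothetical feature names (training input functions omitted):

```python
import tensorflow as tf

# Sparse categorical inputs (hypothetical recommender features).
user = tf.feature_column.categorical_column_with_hash_bucket("user_id", 1000)
item = tf.feature_column.categorical_column_with_hash_bucket("item_id", 1000)

# Wide part: a crossed column lets the linear model memorize
# specific user-item co-occurrences.
wide_columns = [tf.feature_column.crossed_column([user, item], 10000)]

# Deep part: dense embeddings let the DNN generalize to unseen pairs.
deep_columns = [
    tf.feature_column.embedding_column(user, dimension=8),
    tf.feature_column.embedding_column(item, dimension=8),
]

model = tf.estimator.DNNLinearCombinedClassifier(
    linear_feature_columns=wide_columns,
    dnn_feature_columns=deep_columns,
    dnn_hidden_units=[64, 32],
)
```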

You are creating the CI/CD pipeline for the code of the directed acyclic graphs (DAGs) running in Cloud Composer. Your team has two Cloud Composer instances: one instance for development and another instance for production. Your team uses a Git repository to maintain and develop the code of the DAGs. You want to deploy the DAGs automatically to Cloud Composer when a certain tag is pushed to the Git repository. What should you do?

A. 1. Use Cloud Build to build a container and the KubernetesPodOperator to deploy the code of the DAG to the Google Kubernetes Engine (GKE) cluster of the development instance for testing. 2. If the tests pass, copy the code to the Cloud Storage bucket of the production instance.
B. 1. Use Cloud Build to copy the code of the DAG to the Cloud Storage bucket of the development instance for DAG testing. 2. If the tests pass, use Cloud Build to build a container with the code of the DAG and the KubernetesPodOperator to deploy the container to the Google Kubernetes Engine (GKE) cluster of the production instance.
C. 1. Use Cloud Build to build a container with the code of the DAG and the KubernetesPodOperator to deploy the code to the Google Kubernetes Engine (GKE) cluster of the development instance for testing. 2. If the tests pass, use the KubernetesPodOperator to deploy the container to the GKE cluster of the production instance.
D. 1. Use Cloud Build to copy the code of the DAG to the Cloud Storage bucket of the development instance for DAG testing. 2. If the tests pass, use Cloud Build to copy the code to the bucket of the production instance.
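
For context, deploying a DAG to Cloud Composer means copying the file into the dags/ folder of the environment's Cloud Storage bucket, which is what a Cloud Build step triggered by the tag would run. A minimal sketch, assuming hypothetical bucket names:

```python
from google.cloud import storage

# Hypothetical bucket names; each Composer environment watches the
# dags/ folder of its own Cloud Storage bucket.
DEV_BUCKET = "us-central1-dev-composer-bucket"
PROD_BUCKET = "us-central1-prod-composer-bucket"

def deploy_dag(bucket_name: str, local_path: str) -> None:
    """Upload one DAG file into the environment's dags/ folder."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(
        "dags/" + local_path.rsplit("/", 1)[-1]
    )
    blob.upload_from_filename(local_path)

# A tag-triggered build could deploy to dev, run the DAG tests,
# and promote the same file to prod only if they pass.
deploy_dag(DEV_BUCKET, "dags/etl_pipeline.py")
```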

You are deploying a batch pipeline in Dataflow. This pipeline reads data from Cloud Storage, transforms the data, and then writes the data into BigQuery. The security team has enabled an organization policy constraint in Google Cloud requiring all Compute Engine instances to use only internal IP addresses and no external IP addresses. What should you do?

A. Ensure that the firewall rules allow access to Cloud Storage and BigQuery. Use Dataflow with only internal IP addresses.
B. Ensure that your workers have network tags to access Cloud Storage and BigQuery. Use Dataflow with only internal IP addresses.
C. Create a VPC Service Controls perimeter that contains the VPC network and add Dataflow, Cloud Storage, and BigQuery as allowed services in the perimeter. Use Dataflow with only internal IP addresses.
D. Ensure that Private Google Access is enabled in the subnetwork. Use Dataflow with only internal IP addresses.
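
For context, when Private Google Access is enabled on the subnetwork, Dataflow workers with only internal IP addresses can still reach Google APIs such as Cloud Storage and BigQuery. A minimal Apache Beam sketch, assuming hypothetical project, subnetwork, bucket, and table names, and that the destination table already exists with a single STRING column named raw:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    "--runner=DataflowRunner",
    "--project=my-project",
    "--region=us-central1",
    "--temp_location=gs://my-bucket/tmp",
    # Workers get internal IPs only; Private Google Access on the
    # subnetwork lets them reach Cloud Storage and BigQuery.
    "--subnetwork=regions/us-central1/subnetworks/my-subnet",
    "--no_use_public_ips",
])

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "ToRow" >> beam.Map(lambda line: {"raw": line.strip()})
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.my_table"  # assumed existing table
        )
    )
```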