
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to design their solution for the future in order to take advantage of cloud and technology improvements as they become available. Which two steps should they take? (Choose two.)

A.

Store as much analytics and game activity data as financially feasible today so it can be used to train machine learning models to predict user behavior in the future.

B.

Begin packaging their game backend artifacts in container images and running them on Kubernetes Engine to improve the ability to scale up or down based on game activity.

C.

Set up a CI/CD pipeline using Jenkins and Spinnaker to automate canary deployments and improve development velocity.

D.

Adopt a schema versioning tool to reduce downtime when adding new game features that require storing additional player data in the database.

E.

Implement a weekly rolling maintenance process for the Linux virtual machines so they can apply critical kernel patches and package updates and reduce the risk of 0-day vulnerabilities.

For this question, refer to the TerramEarth case study.

You start to build a new application that uses a few Cloud Functions for the backend. One use case requires a Cloud Function func_display to invoke another Cloud Function func_query. You want func_query only to accept invocations from func_display. You also want to follow Google's recommended best practices. What should you do?

A.

Create a token and pass it in as an environment variable to func_display. When invoking func_query, include the token in the request. Pass the same token to func_query, and reject the invocation if the tokens are different.

B.

Make func_query 'Require authentication.' Create a unique service account and associate it with func_display. Grant the service account the invoker role for func_query. Create an ID token in func_display and include the token in the request when invoking func_query.

C.

Make func_query 'Require authentication' and only accept internal traffic. Create those two functions in the same VPC. Create an ingress firewall rule for func_query to only allow traffic from func_display.

D.

Create those two functions in the same project and VPC. Make func_query only accept internal traffic. Create an ingress firewall rule for func_query to only allow traffic from func_display. Also, make sure both functions use the same service account.
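Option B's function-to-function authentication can be sketched in Python. This is a minimal sketch, not Google's reference implementation: the function URL and token value are placeholders, and in a deployed func_display the ID token would typically be minted for the target URL's audience (for example via `google.oauth2.id_token.fetch_id_token`) rather than passed in directly.

```python
import urllib.request


def build_authenticated_request(func_query_url: str,
                                id_token: str) -> urllib.request.Request:
    """Build a request that an authenticated Cloud Function will accept.

    When func_query has 'Require authentication' enabled, Cloud Functions
    checks the Authorization: Bearer <ID token> header and verifies that
    the caller's service account holds the invoker role.
    """
    req = urllib.request.Request(func_query_url)
    # Attach the caller's identity token; the platform validates it
    # against IAM before the function body ever runs.
    req.add_header("Authorization", f"Bearer {id_token}")
    return req
```

The key detail the correct option relies on is that the token is an identity (ID) token scoped to func_query's URL as the audience, not a shared secret, so revoking the invoker role immediately blocks calls without redeploying either function.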

For this question, refer to the Cymbal Retail case study. Cymbal has a centralized project that supports large video files for Vertex AI model training. Standard storage costs have suddenly increased this month, and you need to determine why. What should you do?

A.

Investigate if the project owner disabled a soft-delete policy on the bucket holding the video files.

B.

Investigate if the project owner moved from dual-region storage to region storage.

C.

Investigate if the project owner enabled a soft-delete policy on the bucket holding the video files.

D.

Investigate if the project owner moved from multi-region storage to region storage.

For this question, refer to the Cymbal Retail case study. Cymbal wants you to connect their on-premises systems to Google Cloud while maintaining secure communication between their on-premises and cloud environments. You want to follow Google's recommended approach to ensure the most secure and manageable solution. What should you do?

A.

Use a bastion host to provide secure access to Google Cloud resources from Cymbal's on-premises systems.

B.

Configure a static VPN connection using SSH tunnels to connect the on-premises systems to Google Cloud.

C.

Configure a Cloud VPN gateway and establish a VPN tunnel. Configure firewall rules to restrict access to specific resources and services based on IP addresses and ports.

D.

Use Google Cloud's VPC peering to connect Cymbal's on-premises network to Google Cloud.

For this question, refer to the Cymbal Retail case study. Cymbal wants to migrate its diverse database environment to Google Cloud while ensuring high availability and performance for online customers. The company also wants to efficiently store and access large product images. These images typically stay in the catalog for more than 90 days and are accessed less and less frequently. You need to select the appropriate Google Cloud services for each database. You also need to design a storage solution for the product images that optimizes cost and performance. What should you do?

A.

Migrate all databases to Spanner for consistency, and use Cloud Storage Standard for image storage.

B.

Migrate all databases to self-managed instances on Compute Engine, and use a persistent disk for image storage.

C.

Migrate MySQL and SQL Server to Spanner, Redis to Memorystore, and MongoDB to Firestore. Use Cloud Storage Standard for image storage, and move images to Cloud Storage Nearline storage when products become less popular.

D.

Migrate MySQL to Cloud SQL, SQL Server to Cloud SQL, Redis to Memorystore, and MongoDB to Firestore. Use Cloud Storage Standard for image storage, and move images to Cloud Storage Coldline storage when products become less popular.
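The image tiering described in options C and D maps directly to a Cloud Storage object lifecycle configuration. A minimal sketch of option D's Coldline transition, assuming a 90-day age threshold (the figure the scenario gives for catalog residence); the rule applies to whichever bucket holds the product images:

```json
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
        "condition": {"age": 90, "matchesStorageClass": ["STANDARD"]}
      }
    ]
  }
}
```

Saved as `lifecycle.json`, this can be applied with `gcloud storage buckets update gs://BUCKET --lifecycle-file=lifecycle.json`; swapping `"COLDLINE"` for `"NEARLINE"` gives the option C variant.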

For this question, refer to the Cymbal Retail case study. Cymbal wants to migrate their product catalog management processes to Google Cloud. You need to ensure a smooth migration with proper change management to minimize disruption and risks to the business. You want to follow Google-recommended practices to automate product catalog enrichment, improve product discoverability, increase customer engagement, and minimize costs. What should you do?

A.

Design a migration plan to move all of Cymbal's data to Cloud Storage, and use Compute Engine for all business logic.

B.

Design a migration plan to move all of Cymbal's data to Cloud Storage, and use Cloud Run functions for all business logic.

C.

Design a migration plan, starting with a pilot project focusing on a specific product category, and gradually expand to other categories.

D.

Design a migration plan with a scheduled window to move all components at once. Perform extensive testing to ensure a successful migration.

For this question, refer to the Cymbal Retail case study. Cymbal's generative AI models require high-performance storage for temporary files generated during model training and inference. These files are ephemeral and frequently accessed and modified. You need to select a storage solution that minimizes latency and cost and maximizes performance for generative AI workloads. What should you do?

A.

Use a Cloud Storage bucket in the same region as your virtual machines. Configure lifecycle policies to delete files after processing.

B.

Use Filestore to store temporary files.

C.

Use performance persistent disks.

D.

Use Local SSDs attached to the VMs running the generative AI models.

For this question, refer to the Cymbal Retail case study. Cymbal plans to migrate their existing on-premises systems to Google Cloud and implement AI-powered virtual agents to handle customer interactions. You need to provision the compute resources that can scale for the AI-powered virtual agents. What should you do?

A.

Use Cloud SQL to store the customer data and product catalog.

B.

Configure Cloud Build to call AI Applications (formerly Vertex AI Agent Builder).

C.

Deploy a Google Kubernetes Engine (GKE) cluster with autoscaling enabled.

D.

Create a single, large Compute Engine VM instance with a high CPU allocation.
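Option C's "autoscaling enabled" usually means the GKE cluster autoscaler plus a Horizontal Pod Autoscaler on the workload. A minimal HPA manifest sketch, where the Deployment name `virtual-agent` is hypothetical (the scenario names no workload):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: virtual-agent-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: virtual-agent   # hypothetical Deployment serving the agents
  minReplicas: 2          # keep baseline capacity for customer traffic
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 60   # scale out when average CPU exceeds 60%
```

The HPA adds or removes Pods as interaction volume changes, and the cluster autoscaler adds or removes nodes when Pods no longer fit, which is what distinguishes option C from the fixed-size VM in option D.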

For this question, refer to the Cymbal Retail case study. Cymbal wants you to design a cloud-first data storage infrastructure for the product catalog modernization project. You want to ensure efficient data access and high availability for Cymbal's web application and virtual agents while minimizing operational costs. What should you do?

A.

Use AlloyDB for structured product data, and Cloud Storage for product images.

B.

Use Spanner for the structured product data, and Bigtable for product images.

C.

Use Filestore for the structured product data, and Cloud Storage for product images.

D.

Use Cloud Storage for structured product data, and BigQuery for product images.

For this question, refer to the EHR Healthcare case study. You need to upgrade the EHR connection to comply with their requirements. The new connection design must support business-critical needs and meet the same network and security policy requirements. What should you do?

A.

Add a new Dedicated Interconnect connection.

B.

Upgrade the bandwidth on the Dedicated Interconnect connection to 100 G.

C.

Add three new Cloud VPN connections.

D.

Add a new Carrier Peering connection.