

Question # 4

You are building an application that stores relational data from users. Users across the globe will use this application. Your CTO is concerned about the scaling requirements because the size of the user base is unknown. You need to implement a database solution that can scale with your user growth with minimum configuration changes. Which storage solution should you use?

A.

Cloud SQL

B.

Cloud Spanner

C.

Cloud Firestore

D.

Cloud Datastore

Question # 5

You have an application that is currently processing transactions by using a group of managed VM instances. You need to migrate the application so that it is serverless and scalable. You want to implement an asynchronous transaction processing system, while minimizing management overhead. What should you do?

A.

Install Kafka on VM instances to acknowledge incoming transactions. Use Cloud Run to process transactions.

B.

Install Kafka on VM Instances to acknowledge incoming transactions. Use VM Instances to process transactions.

C.

Use Pub/Sub to acknowledge incoming transactions. Use VM instances to process transactions.

D.

Use Pub/Sub to acknowledge incoming transactions. Use Cloud Run to process transactions.

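For reference, the Pub/Sub and Cloud Run combination mentioned in the options can be wired together roughly as follows; the topic, subscription, endpoint URL, and service account below are placeholders, not values from the question.

# Topic that acknowledges incoming transactions (placeholder names).
gcloud pubsub topics create incoming-transactions

# Push subscription that delivers each message to the Cloud Run service,
# authenticating as a dedicated invoker service account.
gcloud pubsub subscriptions create incoming-transactions-sub \
  --topic=incoming-transactions \
  --push-endpoint=https://transaction-processor-abc123-uc.a.run.app/ \
  --push-auth-service-account=pubsub-invoker@my-project.iam.gserviceaccount.com
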
Question # 6

You manage an App Engine Service that aggregates and visualizes data from BigQuery. The application is deployed with the default App Engine Service account. The data that needs to be visualized resides in a different project managed by another team. You do not have access to this project, but you want your application to be able to read data from the BigQuery dataset. What should you do?

A.

Ask the other team to grant your default App Engine Service account the role of BigQuery Job User.

B.

Ask the other team to grant your default App Engine Service account the role of BigQuery Data Viewer.

C.

In Cloud IAM of your project, ensure that the default App Engine service account has the role of BigQuery Data Viewer.

D.

In Cloud IAM of your project, grant a newly created service account from the other team the role of BigQuery Job User in your project.

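For reference, granting read access to a default App Engine service account on another team's project is a single IAM binding that the owning team would run; the project IDs below are placeholders.

# Run by the team that owns the BigQuery data (placeholder project IDs).
gcloud projects add-iam-policy-binding other-team-project \
  --member="serviceAccount:my-project@appspot.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
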
Question # 7

Your company set up a complex organizational structure on Google Cloud Platform. The structure includes hundreds of folders and projects. Only a few team members should be able to view the hierarchical structure. You need to assign minimum permissions to these team members, and you want to follow Google-recommended practices. What should you do?

A.

Add the users to roles/browser role.

B.

Add the users to roles/iam.roleViewer role.

C.

Add the users to a group, and add this group to roles/browser role.

D.

Add the users to a group, and add this group to roles/iam.roleViewer role.

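As a sketch of the group-based approach in the options, roles/browser can be granted to a Google group at the organization level; the organization ID and group address are placeholders.

gcloud organizations add-iam-policy-binding 123456789012 \
  --member="group:hierarchy-viewers@example.com" \
  --role="roles/browser"
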
Question # 8

During a recent audit of your existing Google Cloud resources, you discovered several users with email addresses outside of your Google Workspace domain.

You want to ensure that your resources are only shared with users whose email addresses match your domain. You need to remove any mismatched users, and you want to avoid having to audit your resources to identify mismatched users. What should you do?

A.

Create a Cloud Scheduler task to regularly scan your projects and delete mismatched users.

B.

Create a Cloud Scheduler task to regularly scan your resources and delete mismatched users.

C.

Set an organizational policy constraint to limit identities by domain to automatically remove mismatched users.

D.

Set an organizational policy constraint to limit identities by domain, and then retroactively remove the existing mismatched users.

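The domain restriction discussed in the options corresponds to the iam.allowedPolicyMemberDomains organization policy constraint. One possible way to set it from the CLI is sketched below, assuming the resource-manager org-policies commands and placeholder organization and Workspace customer IDs; the constraint does not retroactively remove members that were added before it was set.

# Allow only identities from your Google Workspace customer ID (placeholder values).
gcloud resource-manager org-policies allow \
  constraints/iam.allowedPolicyMemberDomains C01234567 \
  --organization=123456789012
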
Question # 9

You are configuring service accounts for an application that spans multiple projects. Virtual machines (VMs) running in the web-applications project need access to BigQuery datasets in crm-databases-proj. You want to follow Google-recommended practices to give access to the service account in the web-applications project. What should you do?

A.

Give “project owner” for web-applications appropriate roles to crm-databases-proj.

B.

Give “project owner” role to crm-databases-proj and the web-applications project.

C.

Give “project owner” role to crm-databases-proj and bigquery.dataViewer role to web-applications.

D.

Give bigquery.dataViewer role to crm-databases-proj and appropriate roles to web-applications.

Question # 10

Your organization is a financial company that needs to store audit log files for 3 years. Your organization has hundreds of Google Cloud projects. You need to implement a cost-effective approach for log file retention. What should you do?

A.

Create an export to the sink that saves logs from Cloud Audit to BigQuery.

B.

Create an export to the sink that saves logs from Cloud Audit to a Coldline Storage bucket.

C.

Write a custom script that uses logging API to copy the logs from Stackdriver logs to BigQuery.

D.

Export these logs to Cloud Pub/Sub and write a Cloud Dataflow pipeline to store logs to Cloud SQL.

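To illustrate the Coldline export option, an aggregated sink at the organization level can route Cloud Audit Logs from all projects into a Coldline bucket; the bucket name, organization ID, and filter below are placeholders, and the sink's writer identity still needs write access on the bucket afterwards.

# Coldline bucket for long-term log archival (placeholder names).
gsutil mb -c coldline -l us gs://example-org-audit-log-archive

# Aggregated sink that exports audit logs from every project in the organization.
gcloud logging sinks create org-audit-archive \
  storage.googleapis.com/example-org-audit-log-archive \
  --organization=123456789012 --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'
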
Question # 11

Your company’s infrastructure is on-premises, but all machines are running at maximum capacity. You want to burst to Google Cloud. The workloads on Google Cloud must be able to directly communicate to the workloads on-premises using a private IP range. What should you do?

A.

In Google Cloud, configure the VPC as a host for Shared VPC.

B.

In Google Cloud, configure the VPC for VPC Network Peering.

C.

Create bastion hosts both in your on-premises environment and on Google Cloud. Configure both as proxy servers using their public IP addresses.

D.

Set up Cloud VPN between the infrastructure on-premises and Google Cloud.

Question # 12

You have 32 GB of data in a single file that you need to upload to a Nearline Storage bucket. The WAN connection you are using is rated at 1 Gbps, and you are the only one on the connection. You want to use as much of the rated 1 Gbps as possible to transfer the file rapidly. How should you upload the file?

A.

Use the GCP Console to transfer the file instead of gsutil.

B.

Enable parallel composite uploads using gsutil on the file transfer.

C.

Decrease the TCP window size on the machine initiating the transfer.

D.

Change the storage class of the bucket from Nearline to Multi-Regional.

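For context on the parallel composite upload option, gsutil can split a large object and upload the parts in parallel through a one-off -o override; the file and bucket names are placeholders, and composite objects come with caveats (for example, clients downloading them need crcmod for integrity checking).

gsutil -o GSUtil:parallel_composite_upload_threshold=150M \
  cp large-file.bin gs://example-nearline-bucket/
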
Question # 13

(You have an application running inside a Compute Engine instance. You want to provide the application with secure access to a BigQuery dataset. You must ensure that credentials are only valid for a short period of time, and your application will only have access to the intended BigQuery dataset. You want to follow Google-recommended practices and minimize your operational costs. What should you do?)

A.

Attach a custom service account to the instance, and grant the service account the BigQuery Data Viewer IAM role on the project.

B.

Attach a new service account to the instance every hour, and grant the service account the BigQuery Data Viewer IAM role on the dataset.

C.

Attach a custom service account to the instance, and grant the service account the BigQuery Data Viewer IAM role on the dataset.

D.

Attach a new service account to the instance every hour, and grant the service account the BigQuery Data Viewer IAM role on the project.

Question # 14

(You are managing an application deployed on Cloud Run. The development team has released a new version of the application. You want to deploy and redirect traffic to this new version of the application. To ensure traffic to the new version of the application is served with no startup time, you want to ensure that there are two idle instances available for incoming traffic before adjusting the traffic flow. You also want to minimize administrative overhead. What should you do?)

A.

Ensure the checkbox "Serve this revision immediately" is unchecked when deploying the new revision. Before changing the traffic rules, use a traffic simulation tool to send load to the new revision.

B.

Configure service autoscaling and set the minimum number of instances to 2.

C.

Configure revision autoscaling for the new revision and set the minimum number of instances to 2.

D.

Configure revision autoscaling for the existing revision and set the minimum number of instances to 2.

Question # 15

You are hosting an application on bare-metal servers in your own data center. The application needs access to Cloud Storage. However, security policies prevent the servers hosting the application from having public IP addresses or access to the internet. You want to follow Google-recommended practices to provide the application with access to Cloud Storage. What should you do?

A.

1. Use nslookup to get the IP address for storage.googleapis.com. 2. Negotiate with the security team to be able to give a public IP address to the servers. 3. Only allow egress traffic from those servers to the IP addresses for storage.googleapis.com.

B.

1. Using Cloud VPN, create a VPN tunnel to a Virtual Private Cloud (VPC) in Google Cloud Platform (GCP). 2. In this VPC, create a Compute Engine instance and install the Squid proxy server on this instance. 3. Configure your servers to use that instance as a proxy to access Cloud Storage.

C.

1. Use Migrate for Compute Engine (formerly known as Velostrata) to migrate those servers to Compute Engine. 2. Create an internal load balancer (ILB) that uses storage.googleapis.com as backend. 3. Configure your new instances to use this ILB as proxy.

D.

1. Using Cloud VPN or Interconnect, create a tunnel to a VPC in GCP. 2. Use Cloud Router to create a custom route advertisement for 199.36.153.4/30. Announce that network to your on-premises network through the VPN tunnel. 3. In your on-premises network, configure your DNS server to resolve *.googleapis.com as a CNAME to restricted.googleapis.com.

Question # 16

Your company implemented BigQuery as an enterprise data warehouse. Users from multiple business units run queries on this data warehouse. However, you notice that query costs for BigQuery are very high, and you need to control costs. Which two methods should you use? (Choose two.)

A.

Split the users from business units to multiple projects.

B.

Apply a user- or project-level custom query quota for BigQuery data warehouse.

C.

Create separate copies of your BigQuery data warehouse for each business unit.

D.

Split your BigQuery data warehouse into multiple data warehouses for each business unit.

E.

Change your BigQuery query model from on-demand to flat rate. Apply the appropriate number of slots to each Project.

Question # 17

You recently discovered that your developers are using many service account keys during their development process. While you work on a long term improvement, you need to quickly implement a process to enforce short-lived service account credentials in your company. You have the following requirements:

• All service accounts that require a key should be created in a centralized project called pj-sa.

• Service account keys should only be valid for one day.

You need a Google-recommended solution that minimizes cost. What should you do?

A.

Implement a Cloud Run job to rotate all service account keys periodically in pj-sa. Enforce an org policy to deny service account key creation with an exception to pj-sa.

B.

Implement a Kubernetes CronJob to rotate all service account keys periodically. Disable attachment of service accounts to resources in all projects with an exception to pj-sa.

C.

Enforce an org policy constraint allowing the lifetime of service account keys to be 24 hours. Enforce an org policy constraint denying service account key creation with an exception on pj-sa.

D.

Enforce a DENY org policy constraint over the lifetime of service account keys for 24 hours. Disable attachment of service accounts to resources in all projects with an exception to pj-sa.

Question # 18

All development (dev) teams in your organization are located in the United States. Each dev team has its own Google Cloud project. You want to restrict access so that each dev team can only create cloud resources in the United States (US). What should you do?

A.

Create a folder to contain all the dev projects. Create an organization policy to limit resources in US locations.

B.

Create an organization to contain all the dev projects. Create an Identity and Access Management (IAM) policy to limit the resources in US regions.

C.

Create an Identity and Access Management

D.

Create an Identity and Access Management (IAM) policy to restrict the resource locations in all dev projects. Apply the policy to all dev roles.

Question # 19

You are using Deployment Manager to create a Google Kubernetes Engine cluster. Using the same Deployment Manager deployment, you also want to create a DaemonSet in the kube-system namespace of the cluster. You want a solution that uses the fewest possible services. What should you do?

A.

Add the cluster’s API as a new Type Provider in Deployment Manager, and use the new type to create the DaemonSet.

B.

Use the Deployment Manager Runtime Configurator to create a new Config resource that contains the DaemonSet definition.

C.

With Deployment Manager, create a Compute Engine instance with a startup script that uses kubectl to create the DaemonSet.

D.

In the cluster’s definition in Deployment Manager, add a metadata that has kube-system as key and the DaemonSet manifest as value.

Question # 20

You deployed a new application inside your Google Kubernetes Engine cluster using the YAML file specified below.

You check the status of the deployed pods and notice that one of them is still in PENDING status:

You want to find out why the pod is stuck in pending status. What should you do?

A.

Review details of the myapp-service Service object and check for error messages.

B.

Review details of the myapp-deployment Deployment object and check for error messages.

C.

Review details of myapp-deployment-58ddbbb995-lp86m Pod and check for warning messages.

D.

View logs of the container in myapp-deployment-58ddbbb995-lp86m pod and check for warning messages.

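When a Pod is stuck in Pending, the scheduler normally records the reason as events on the Pod itself; one quick way to surface them is sketched below, using the Pod name from the question.

# Show the Pod's conditions and recent events (for example, insufficient CPU or memory).
kubectl describe pod myapp-deployment-58ddbbb995-lp86m

# Alternatively, list only the events attached to that Pod.
kubectl get events \
  --field-selector involvedObject.name=myapp-deployment-58ddbbb995-lp86m
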
Question # 21

Your company has a Google Cloud Platform project that uses BigQuery for data warehousing. Your data science team changes frequently and has few members. You need to allow members of this team to perform queries. You want to follow Google-recommended practices. What should you do?

A.

1. Create an IAM entry for each data scientist's user account. 2. Assign the BigQuery jobUser role to the group.

B.

1. Create an IAM entry for each data scientist's user account. 2. Assign the BigQuery dataViewer user role to the group.

C.

1. Create a dedicated Google group in Cloud Identity. 2. Add each data scientist's user account to the group. 3. Assign the BigQuery jobUser role to the group.

D.

1. Create a dedicated Google group in Cloud Identity. 2. Add each data scientist's user account to the group. 3. Assign the BigQuery dataViewer user role to the group.

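A minimal sketch of the group-based binding described in the options: the group is created in Cloud Identity, data scientists are added to it there, and a single IAM binding is applied; the project ID and group address are placeholders.

gcloud projects add-iam-policy-binding my-analytics-project \
  --member="group:data-scientists@example.com" \
  --role="roles/bigquery.jobUser"
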
Question # 22

(You manage a VPC network in Google Cloud with a subnet that is rapidly approaching its private IP address capacity. You expect the number of Compute Engine VM instances in the same region to double within a week. You need to implement a Google-recommended solution that minimizes operational costs and does not require downtime. What should you do?)

A.

Create a second VPC with the same subnet IP range, and connect this VPC to the existing VPC by using VPC Network Peering.

B.

Delete the existing subnet, and create a new subnet with double the IP range available.

C.

Use the Google Cloud CLI tool to expand the primary IP range of your subnet.

D.

Permit additional traffic from the expected range of private IP addresses to reach your VMs by configuring firewall rules.

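For the subnet expansion option, the primary range of an existing subnet can be widened in place without downtime; the subnet name, region, and new prefix length below are placeholders.

# Expand the subnet's primary range, for example from /24 to /23 (placeholder values).
gcloud compute networks subnets expand-ip-range my-subnet \
  --region=us-central1 \
  --prefix-length=23
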
Question # 23

You need to track and verify modifications to a set of Google Compute Engine instances in your Google Cloud project. In particular, you want to verify OS system patching events on your virtual machines (VMs). What should you do?

A.

Review the Compute Engine activity logs. Select and review the Admin Event logs.

B.

Review the Compute Engine activity logs. Select and review the System Event logs.

C.

Install the Cloud Logging Agent. In Cloud Logging, review the Compute Engine syslog logs.

D.

Install the Cloud Logging Agent. In Cloud Logging, review the Compute Engine operation logs.

Question # 24

You created several resources in multiple Google Cloud projects. All projects are linked to different billing accounts. To better estimate future charges, you want to have a single visual representation of all costs incurred. You want to include new cost data as soon as possible. What should you do?

A.

Configure Billing Data Export to BigQuery and visualize the data in Data Studio.

B.

Visit the Cost Table page to get a CSV export and visualize it using Data Studio.

C.

Fill all resources in the Pricing Calculator to get an estimate of the monthly cost.

D.

Use the Reports view in the Cloud Billing Console to view the desired cost information.

Question # 25

(Your company’s developers use an automation that you recently built to provision Linux VMs in Compute Engine within a Google Cloud project to perform various tasks. You need to manage the Linux account lifecycle and access for these users. You want to follow Google-recommended practices to simplify access management while minimizing operational costs. What should you do?)

A.

Enable OS Login for all VMs. Use IAM roles to grant user permissions.

B.

Enable OS Login for all VMs. Write custom startup scripts to update user permissions.

C.

Require your developers to create public SSH keys. Make the owner of the public key the root user.

D.

Require your developers to create public SSH keys. Write custom startup scripts to update user permissions.

Question # 26

(You are deploying a web application using Compute Engine. You created a managed instance group (MIG) to host the application. You want to follow Google-recommended practices to implement a secure and highly available solution. What should you do?)

A.

Use a proxy Network Load Balancer for the MIG and an A record in your DNS private zone with the load balancer's IP address.

B.

Use a proxy Network Load Balancer for the MIG and a CNAME record in your DNS public zone with the load balancer's IP address.

C.

Use an Application Load Balancer for the MIG and a CNAME record in your DNS private zone with the load balancer's IP address.

D.

Use an Application Load Balancer for the MIG and an A record in your DNS public zone with the load balancer's IP address.

Question # 27

Your preview application, deployed on a single-zone Google Kubernetes Engine (GKE) cluster in us-central1, has gained popularity. You are now ready to make the application generally available. You need to deploy the application to production while ensuring high availability and resilience. You also want to follow Google-recommended practices. What should you do?

A.

Use the gcloud container clusters create command with the options --enable-multi-networking and --enable-autoscaling to create an autoscaling zonal cluster and deploy the application to it.

B.

Use the gcloud container clusters create-auto command to create an autopilot cluster and deploy the application to it.

C.

Use the gcloud container clusters update command with the option --region us-central1 to update the cluster and deploy the application to it.

D.

Use the gcloud container clusters update command with the option --node-locations us-central1-a,us-central1-b to update the cluster and deploy the application to the nodes.

Question # 28

You need to grant access for three users so that they can view and edit table data on a Cloud Spanner instance. What should you do?

A.

Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to the role.

B.

Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to a new group. Add the group to the role.

C.

Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to the role.

D.

Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to a new group. Add the group to the role.

Question # 29

You want to run a single caching HTTP reverse proxy on GCP for a latency-sensitive website. This specific reverse proxy consumes almost no CPU. You want to have a 30-GB in-memory cache, and need an additional 2 GB of memory for the rest of the processes. You want to minimize cost. How should you run this reverse proxy?

A.

Create a Cloud Memorystore for Redis instance with 32-GB capacity.

B.

Run it on Compute Engine, and choose a custom instance type with 6 vCPUs and 32 GB of memory.

C.

Package it in a container image, and run it on Kubernetes Engine, using n1-standard-32 instances as nodes.

D.

Run it on Compute Engine, choose the instance type n1-standard-1, and add an SSD persistent disk of 32 GB.

Question # 30

An external member of your team needs list access to compute images and disks in one of your projects. You want to follow Google-recommended practices when you grant the required permissions to this user. What should you do?

A.

Create a custom role, and add all the required compute.disks.list and compute.images.list permissions as includedPermissions. Grant the custom role to the user at the project level.

B.

Create a custom role based on the Compute Image User role. Add compute.disks.list to the includedPermissions field. Grant the custom role to the user at the project level.

C.

Grant the Compute Storage Admin role at the project level.

D.

Create a custom role based on the Compute Storage Admin role. Exclude unnecessary permissions from the custom role. Grant the custom role to the user at the project level.

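To illustrate the custom-role option, a role containing only the two list permissions can be created and then granted at the project level; the role ID, project, and user below are placeholders.

# Create a narrowly scoped custom role (placeholder identifiers).
gcloud iam roles create imageAndDiskLister \
  --project=my-project \
  --title="Image and Disk Lister" \
  --permissions=compute.images.list,compute.disks.list \
  --stage=GA

# Grant the custom role to the external user at the project level.
gcloud projects add-iam-policy-binding my-project \
  --member="user:external.member@example.com" \
  --role="projects/my-project/roles/imageAndDiskLister"
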
Question # 31

You are migrating a production-critical on-premises application that requires 96 vCPUs to perform its task. You want to make sure the application runs in a similar environment on GCP. What should you do?

A.

When creating the VM, use machine type n1-standard-96.

B.

When creating the VM, use Intel Skylake as the CPU platform.

C.

Create the VM using Compute Engine default settings. Use gcloud to modify the running instance to have 96 vCPUs.

D.

Start the VM using Compute Engine default settings, and adjust as you go based on Rightsizing Recommendations.

Question # 32

Your company uses Pub/Sub for event-driven workloads. You have a subscription named email-updates attached to the new-orders topic. You need to fetch and acknowledge waiting messages from this subscription. What should you do?

A.

Use the gcloud pubsub subscriptions seek email-updates command.

B.

Use the gcloud pubsub topics describe new-orders command.

C.

Use the gcloud pubsub subscriptions pull email-updates --auto-ack command.

D.

Use the gcloud pubsub topics list-subscriptions new-orders --filter="email-updates" command.

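For reference, the pull-and-acknowledge command from the options looks like this in practice; the --limit flag is optional and shown only as an example.

gcloud pubsub subscriptions pull email-updates --auto-ack --limit=10
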
Question # 33

You are the organization and billing administrator for your company. The engineering team has the Project Creator role on the organization. You do not want the engineering team to be able to link projects to the billing account. Only the finance team should be able to link a project to a billing account, but they should not be able to make any other changes to projects. What should you do?

A.

Assign the finance team only the Billing Account User role on the billing account.

B.

Assign the engineering team only the Billing Account User role on the billing account.

C.

Assign the finance team the Billing Account User role on the billing account and the Project Billing Manager role on the organization.

D.

Assign the engineering team the Billing Account User role on the billing account and the Project Billing Manager role on the organization.

Question # 34

(Your company uses a multi-cloud strategy that includes Google Cloud. You want to centralize application logs in a third-party software-as-a-service (SaaS) tool from all environments. You need to integrate logs originating from Cloud Logging, and you want to ensure the export occurs with the least amount of delay possible. What should you do?)

A.

Use a Cloud Scheduler cron job to trigger a Cloud Function that queries Cloud Logging and sends the logs to the SaaS tool.

B.

Create a Cloud Logging sink and configure Pub/Sub as the destination. Configure the SaaS tool to subscribe to the Pub/Sub topic to retrieve the logs.

C.

Create a Cloud Logging sink and configure Cloud Storage as the destination. Configure the SaaS tool to read the Cloud Storage bucket to retrieve the logs.

D.

Create a Cloud Logging sink and configure BigQuery as the destination. Configure the SaaS tool to query BigQuery to retrieve the logs.

Question # 35

You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler feature enabled. You need to make sure that each node of the cluster will run a monitoring pod that sends container metrics to a third-party monitoring solution. What should you do?

A.

Deploy the monitoring pod in a StatefulSet object.

B.

Deploy the monitoring pod in a DaemonSet object.

C.

Reference the monitoring pod in a Deployment object.

D.

Reference the monitoring pod in a cluster initializer at the GKE cluster creation time.

Question # 36

You are deploying a web application using Compute Engine. You created a managed instance group (MIG) to host the application. You want to follow Google-recommended practices to implement a secure and highly available solution. What should you do?

A.

Use SSL proxy load balancing for the MIG and an A record in your DNS private zone with the load balancer's IP address.

B.

Use SSL proxy load balancing for the MIG and a CNAME record in your DNS public zone with the load balancer's IP address.

C.

Use HTTP(S) load balancing for the MIG and a CNAME record in your DNS private zone with the load balancer's IP address.

D.

Use HTTP(S) load balancing for the MIG and an A record in your DNS public zone with the load balancer's IP address.

Question # 37

Your web application has been running successfully on Cloud Run for Anthos. You want to evaluate an updated version of the application with a specific percentage of your production users (canary deployment). What should you do?

A.

Create a new service with the new version of the application. Split traffic between this version and the version that is currently running.

B.

Create a new revision with the new version of the application. Split traffic between this version and the version that is currently running.

C.

Create a new service with the new version of the application. Add an HTTP Load Balancer in front of both services.

D.

Create a new revision with the new version of the application. Add an HTTP Load Balancer in front of both revisions.

Question # 38

You need to configure optimal data storage for files stored in Cloud Storage for minimal cost. The files are used in a mission-critical analytics pipeline that is used continually. The users are in Boston, MA (United States). What should you do?

A.

Configure regional storage for the region closest to the users. Configure a Nearline storage class.

B.

Configure regional storage for the region closest to the users. Configure a Standard storage class.

C.

Configure dual-regional storage for the dual region closest to the users. Configure a Nearline storage class.

D.

Configure dual-regional storage for the dual region closest to the users. Configure a Standard storage class.

Question # 39

You received a JSON file that contained a private key of a Service Account in order to get access to several resources in a Google Cloud project. You downloaded and installed the Cloud SDK and want to use this private key for authentication and authorization when performing gcloud commands. What should you do?

A.

Use the command gcloud auth login and point it to the private key

B.

Use the command gcloud auth activate-service-account and point it to the private key

C.

Place the private key file in the installation directory of the Cloud SDK and rename it to "credentials.json".

D.

Place the private key file in your home directory and rename it to "GOOGLE_APPLICATION_CREDENTIALS".

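As a sketch of the activate-service-account flow referenced above, the downloaded JSON key is passed with --key-file, after which gcloud commands run as that service account; the key path and project ID are placeholders.

gcloud auth activate-service-account \
  --key-file=/path/to/service-account-key.json
gcloud config set project my-project
gcloud auth list   # confirm the service account is now the active credential
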
Question # 40

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?

A.

Navigate to Stackdriver Logging and select resource.labels.project_id="*"

B.

Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.

C.

Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.

D.

Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.

Question # 41

You manage three Google Cloud projects with the Cloud Monitoring API enabled. You want to follow Google-recommended practices to visualize CPU and network metrics for all three projects together. What should you do?

A.

1. Create a Cloud Monitoring Dashboard. 2. Collect metrics and publish them into the Pub/Sub topics. 3. Add CPU and network charts for each of the three projects.

B.

1. Create a Cloud Monitoring Dashboard. 2. Select the CPU and Network metrics from the three projects. 3. Add CPU and network charts for each of the three projects.

C.

1. Create a Service Account and apply roles/viewer on the three projects. 2. Collect metrics and publish them to the Cloud Monitoring API. 3. Add CPU and network charts for each of the three projects.

D.

1. Create a fourth Google Cloud project. 2. Create a Cloud Workspace from the fourth project and add the other three projects.

Question # 42

You are planning to migrate the following on-premises data management solutions to Google Cloud:

• One MySQL cluster for your main database

• Apache Kafka for your event streaming platform

• One Cloud SQL for PostgreSQL database for your analytical and reporting needs

You want to implement Google-recommended solutions for the migration. You need to ensure that the new solutions provide global scalability and require minimal operational and infrastructure management. What should you do?

A.

Migrate from MySQL to Cloud SQL, from Kafka to Memorystore, and from Cloud SQL for PostgreSQL to Cloud SQL

B.

Migrate from MySQL to Cloud Spanner, from Kafka to Memorystore, and from Cloud SQL for PostgreSQL to Cloud SQL.

C.

Migrate from MySQL to Cloud SQL, from Kafka to Pub/Sub, and from Cloud SQL for PostgreSQL to BigQuery.

D.

Migrate from MySQL to Cloud Spanner, from Kafka to Pub/Sub, and from Cloud SQL for PostgreSQL to BigQuery.

Question # 43

You have just created a new project which will be used to deploy a globally distributed application. You will use Cloud Spanner for data storage. You want to create a Cloud Spanner instance. You want to perform the first step in preparation of creating the instance. What should you do?

A.

Grant yourself the IAM role of Cloud Spanner Admin

B.

Create a new VPC network with subnetworks in all desired regions

C.

Configure your Cloud Spanner instance to be multi-regional

D.

Enable the Cloud Spanner API

Question # 44

You have an application running in Google Kubernetes Engine (GKE) with cluster autoscaling enabled. The application exposes a TCP endpoint. There are several replicas of this application. You have a Compute Engine instance in the same region, but in another Virtual Private Cloud (VPC), called gce-network, that has no overlapping IP ranges with the first VPC. This instance needs to connect to the application on GKE. You want to minimize effort. What should you do?

A.

1. In GKE, create a Service of type LoadBalancer that uses the application's Pods as backend. 2. Set the service's externalTrafficPolicy to Cluster. 3. Configure the Compute Engine instance to use the address of the load balancer that has been created.

B.

1. In GKE, create a Service of type NodePort that uses the application's Pods as backend. 2. Create a Compute Engine instance called proxy with 2 network interfaces, one in each VPC. 3. Use iptables on this instance to forward traffic from gce-network to the GKE nodes. 4. Configure the Compute Engine instance to use the address of proxy in gce-network as endpoint.

C.

1. In GKE, create a Service of type LoadBalancer that uses the application's Pods as backend. 2. Add an annotation to this service: cloud.google.com/load-balancer-type: Internal. 3. Peer the two VPCs together. 4. Configure the Compute Engine instance to use the address of the load balancer that has been created.

D.

1. In GKE, create a Service of type LoadBalancer that uses the application's Pods as backend. 2. Add a Cloud Armor Security Policy to the load balancer that whitelists the internal IPs of the MIG's instances. 3. Configure the Compute Engine instance to use the address of the load balancer that has been created.

Question # 45

Users of your application are complaining of slowness when loading the application. You realize the slowness is because the App Engine deployment serving the application is deployed in us-central whereas all users of this application are closest to europe-west3. You want to change the region of the App Engine application to europe-west3 to minimize latency. What’s the best way to change the App Engine region?

A.

Create a new project and create an App Engine instance in europe-west3

B.

Use the gcloud app region set command and supply the name of the new region.

C.

From the console, under the App Engine page, click edit, and change the region drop-down.

D.

Contact Google Cloud Support and request the change.

Question # 46

You are building a backend service for an ecommerce platform that will persist transaction data from mobile and web clients. After the platform is launched, you expect a large volume of global transactions. Your business team wants to run SQL queries to analyze the data. You need to build a highly available and scalable data store for the platform. What should you do?

A.

Create a multi-region Cloud Spanner instance with an optimized schema.

B.

Create a multi-region Firestore database with aggregation query enabled.

C.

Create a multi-region Cloud SQL for PostgreSQL database with optimized indexes.

D.

Create a multi-region BigQuery dataset with optimized tables.

Question # 47

(Your company was recently impacted by a service disruption that caused multiple Dataflow jobs to get stuck, resulting in significant downtime in downstream applications and revenue loss. You were able to resolve the issue by identifying and fixing an error you found in the code. You need to design a solution with minimal management effort to identify when jobs are stuck in the future to ensure that this issue does not occur again. What should you do?)

A.

Set up Error Reporting to identify stack traces that indicate slowdowns in Dataflow jobs. Set up alerts based on these log entries.

B.

Use the Personalized Service Health dashboard to identify issues with Dataflow jobs across regions.

C.

Update the Dataflow job configurations to send messages to a Pub/Sub topic when there are delays. Configure a backup Dataflow job to process jobs that are delayed. Use Cloud Tasks to trigger an alert when messages are pushed to the Pub/Sub topic.

D.

Set up Cloud Monitoring alerts on the data freshness metric for the Dataflow jobs to receive a notification when a certain threshold is reached.

Question # 48

Your projects incurred more costs than you expected last month. Your research reveals that a development GKE container emitted a huge number of logs, which resulted in higher costs. You want to disable the logs quickly using the minimum number of steps. What should you do?

A.

1. Go to the Logs ingestion window in Stackdriver Logging, and disable the log source for the GKE container resource.

B.

1. Go to the Logs ingestion window in Stackdriver Logging, and disable the log source for the GKE Cluster Operations resource.

C.

1. Go to the GKE console, and delete existing clusters. 2. Recreate a new cluster. 3. Clear the option to enable legacy Stackdriver Logging.

D.

1. Go to the GKE console, and delete existing clusters. 2. Recreate a new cluster. 3. Clear the option to enable legacy Stackdriver Monitoring.

Question # 49

You have two Google Cloud projects: project-a with VPC vpc-a (10.0.0.0/16) and project-b with VPC vpc-b (10.8.0.0/16). Your frontend application resides in vpc-a and the backend API services are deployed in vpc-b. You need to efficiently and cost-effectively enable communication between these Google Cloud projects. You also want to follow Google-recommended practices. What should you do?

A.

Configure a Cloud Router in vpc-a and another Cloud Router in vpc-b.

B.

Configure a Cloud Interconnect connection between vpc-a and vpc-b.

C.

Create VPC Network Peering between vpc-a and vpc-b.

D.

Create an OpenVPN connection between vpc-a and vpc-b.

Question # 50

You have been asked to set up the billing configuration for a new Google Cloud customer. Your customer wants to group resources that share common IAM policies. What should you do?

A.

Use labels to group resources that share common IAM policies

B.

Use folders to group resources that share common IAM policies

C.

Set up a proper billing account structure to group IAM policies

D.

Set up a proper project naming structure to group IAM policies

Question # 51

Your company wants to standardize the creation and management of multiple Google Cloud resources using Infrastructure as Code. You want to minimize the amount of repetitive code needed to manage the environment. What should you do?

A.

Create a bash script that contains all required steps as gcloud commands.

B.

Develop templates for the environment using Cloud Deployment Manager

C.

Use curl in a terminal to send a REST request to the relevant Google API for each individual resource.

D.

Use the Cloud Console interface to provision and manage all related resources

Question # 52

Your application development team has created Docker images for an application that will be deployed on Google Cloud. Your team does not want to manage the infrastructure associated with this application. You need to ensure that the application can scale automatically as it gains popularity. What should you do?

A.

Create an Instance template with the container image, and deploy a Managed Instance Group with Autoscaling.

B.

Upload Docker images to Artifact Registry, and deploy the application on Google Kubernetes Engine using Standard mode.

C.

Upload Docker images to Cloud Storage, and deploy the application on Google Kubernetes Engine using Standard mode.

D.

Upload Docker images to Artifact Registry, and deploy the application on Cloud Run.

Question # 53

You are performing a monthly security check of your Google Cloud environment and want to know who has access to view data stored in your Google Cloud project. What should you do?

A.

Enable Audit Logs for all APIs that are related to data storage.

B.

Review the IAM permissions for any role that allows for data access.

C.

Review the Identity-Aware Proxy settings for each resource.

D.

Create a Data Loss Prevention job.

Question # 54

You are deploying an application to Cloud Run. Your application requires the use of an API that runs on Google Kubernetes Engine (GKE). You need to ensure that your Cloud Run service can privately reach the API on GKE, and you want to follow Google-recommended practices. What should you do?

A.

Deploy an ingress resource on the GKE cluster to expose the API to the internet. Use Cloud Armor to filter for IP addresses that can connect to the API. On the Cloud Run service, configure the application to fetch its public IP address and update the Cloud Armor policy on startup to allow this IP address to call the API on ports 80 and 443.

B.

Create an egress firewall rule on the VPC to allow connections to 0.0.0.0/0 on ports 80 and 443.

C.

Create an ingress firewall rule on the VPC to allow connections from 0.0.0.0/0 on ports 80 and 443.

D.

Deploy an internal Application Load Balancer to expose the API on GKE to the VPC. Configure Cloud DNS with the IP address of the internal Application Load Balancer. Deploy a Serverless VPC Access connector to allow the Cloud Run service to call the API through the FQDN on Cloud DNS.

Question # 55

You want to enable your development team to deploy new features to an existing Cloud Run service in production. To minimize the risk associated with a new revision, you want to reduce the number of customers who might be affected by an outage without introducing any development or operational costs to your customers. You want to follow Google-recommended practices for managing revisions to a service. What should you do?

A.

Deploy your application to a second Cloud Run service, and ask your customers to use the second Cloud Run service.

B.

Ask your customers to retry access to your service with exponential backoff to mitigate any potential problems after the new revision is deployed.

C.

Gradually roll out the new revision and split customer traffic between the revisions to allow rollback in case a problem occurs.

D.

Send all customer traffic to the new revision, and roll back to a previous revision if you witness any problems in production.

Question # 56

You are in charge of provisioning access for all Google Cloud users in your organization. Your company recently acquired a startup company that has their own Google Cloud organization. You need to ensure that your Site Reliability Engineers (SREs) have the same project permissions in the startup company's organization as in your own organization. What should you do?

A.

In the Google Cloud console for your organization, select Create role from selection, and choose destination as the startup company's organization

B.

In the Google Cloud console for the startup company, select Create role from selection and choose source as the startup company's Google Cloud organization.

C.

Use the gcloud iam roles copy command, and provide the Organization ID of the startup company's Google Cloud organization as the destination.

D.

Use the gcloud iam roles copy command, and provide the project IDs of all projects in the startup company's organization as the destination.

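The gcloud iam roles copy command referenced in the options takes a source role and a destination organization or project; the role name and organization IDs below are placeholders.

gcloud iam roles copy \
  --source="organizations/123456789012/roles/sreProjectRole" \
  --destination=sreProjectRole \
  --dest-organization=210987654321
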
Question # 57

You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment. What should you do?

A.

Create a single budget for all projects and configure budget alerts on this budget.

B.

Create a separate billing account per sandbox project and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per billing account.

C.

Create a budget per project and configure budget alerts on all of these budgets.

D.

Create a single billing account for all sandbox projects and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per project.

Question # 58

You are assisting a new Google Cloud user who just installed the Google Cloud SDK on their VM. The server needs access to Cloud Storage. The user wants your help to create a new storage bucket. You need to make this change in multiple environments. What should you do?

A.

Use a Deployment Manager script to automate creating storage buckets in an appropriate region

B.

Use a local SSD to improve performance of the VM for the targeted workload

C.

Use the gsutil command to create a storage bucket in the same region as the VM.

D.

Use a Persistent Disk SSD in the same zone as the VM to improve performance of the VM

Question # 59

You have a large 5-TB AVRO file stored in a Cloud Storage bucket. Your analysts are proficient only in SQL and need access to the data stored in this file. You want to find a cost-effective way to complete their request as soon as possible. What should you do?

A.

Load data in Cloud Datastore and run a SQL query against it.

B.

Create a BigQuery table and load data in BigQuery. Run a SQL query on this table and drop this table after you complete your request.

C.

Create external tables in BigQuery that point to Cloud Storage buckets and run a SQL query on these external tables to complete your request.

D.

Create a Hadoop cluster and copy the AVRO file to HDFS by compressing it. Load the file in a Hive table and provide access to your analysts so that they can run SQL queries.

Question # 60

You are deploying an application to a Compute Engine VM in a managed instance group. The application must be running at all times, but only a single instance of the VM should run per GCP project. How should you configure the instance group?

A.

Set autoscaling to On, set the minimum number of instances to 1, and then set the maximum number of instances to 1.

B.

Set autoscaling to Off, set the minimum number of instances to 1, and then set the maximum number of instances to 1.

C.

Set autoscaling to On, set the minimum number of instances to 1, and then set the maximum number of instances to 2.

D.

Set autoscaling to Off, set the minimum number of instances to 1, and then set the maximum number of instances to 2.

Question # 61

You have experimented with Google Cloud using your own credit card and expensed the costs to your company. Your company wants to streamline the billing process and charge the costs of your projects to their monthly invoice. What should you do?

A.

Grant the financial team the IAM role of "Billing Account User" on the billing account linked to your credit card.

B.

Set up BigQuery billing export and grant your financial department IAM access to query the data.

C.

Create a ticket with Google Billing Support to ask them to send the invoice to your company.

D.

Change the billing account of your projects to the billing account of your company.

Question # 62

You want to deploy a new containerized application into Google Cloud by using a Kubernetes manifest. You want to have full control over the Kubernetes deployment, and at the same time, you want to minimize configuring infrastructure. What should you do?

A.

Deploy the application on GKE Autopilot.

B.

Deploy the application on GKE Standard.

C.

Deploy the application on Cloud Functions.

D.

Deploy the application on Cloud Run.

Question # 63

You are hosting an application from Compute Engine virtual machines (VMs) in us-central1-a. You want to adjust your design to support the failure of a single Compute Engine zone, eliminate downtime, and minimize cost. What should you do?

A.

– Create Compute Engine resources in us-central1-b. – Balance the load across both us-central1-a and us-central1-b.

B.

– Create a Managed Instance Group and specify us-central1-a as the zone. – Configure the Health Check with a short Health Interval.

C.

– Create an HTTP(S) Load Balancer. – Create one or more global forwarding rules to direct traffic to your VMs.

D.

– Perform regular backups of your application. – Create a Cloud Monitoring Alert and be notified if your application becomes unavailable. – Restore from backups when notified.

Question # 64

Your management has asked an external auditor to review all the resources in a specific project. The security team has enabled the Organization Policy called Domain Restricted Sharing on the organization node by specifying only your Cloud Identity domain. You want the auditor to only be able to view, but not modify, the resources in that project. What should you do?

A.

Ask the auditor for their Google account, and give them the Viewer role on the project.

B.

Ask the auditor for their Google account, and give them the Security Reviewer role on the project.

C.

Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.

D.

Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.

Question # 65

You are working with a user to set up an application in a new VPC behind a firewall. The user is concerned about data egress. You want to configure the fewest open egress ports. What should you do?

A.

Set up a low-priority (65534) rule that blocks all egress and a high-priority rule (1000) that allows only the appropriate ports.

B.

Set up a high-priority (1000) rule that pairs both ingress and egress ports.

C.

Set up a high-priority (1000) rule that blocks all egress and a low-priority (65534) rule that allows only the appropriate ports.

D.

Set up a high-priority (1000) rule to allow the appropriate ports.

Question # 66

You are working for a hospital that stores its medical images in an on-premises data room. The hospital wants to use Cloud Storage for archival storage of these images. The hospital wants an automated process to upload any new medical images to Cloud Storage. You need to design and implement a solution. What should you do?

A.

Deploy a Dataflow job from the batch template "Datastore to Cloud Storage". Schedule the batch job on the desired interval.

B.

In the Cloud Console, go to Cloud Storage. Upload the relevant images to the appropriate bucket.

C.

Create a script that uses the gsutil command line interface to synchronize the on-premises storage with Cloud Storage. Schedule the script as a cron job.

D.

Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub topic. Create an application that sends all medical images to the Pub/Sub topic.

Question # 67

You have a web application deployed as a managed instance group. You have a new version of the application to gradually deploy. Your web application is currently receiving live web traffic. You want to ensure that the available capacity does not decrease during the deployment. What should you do?

A.

Perform a rolling-action start-update with maxSurge set to 0 and maxUnavailable set to 1.

B.

Perform a rolling-action start-update with maxSurge set to 1 and maxUnavailable set to 0.

C.

Create a new managed instance group with an updated instance template. Add the group to the backend service for the load balancer. When all instances in the new managed instance group are healthy, delete the old managed instance group.

D.

Create a new instance template with the new application version. Update the existing managed instance group with the new instance template. Delete the instances in the managed instance group to allow the managed instance group to recreate the instance using the new instance template.

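For the rolling-update options above, the relevant flags on rolling-action start-update are --max-surge and --max-unavailable; the instance group, template, and zone names below are placeholders.

gcloud compute instance-groups managed rolling-action start-update my-web-mig \
  --version=template=my-web-template-v2 \
  --max-surge=1 \
  --max-unavailable=0 \
  --zone=us-central1-a
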
Question # 68

You need to manage multiple Google Cloud Platform (GCP) projects in the fewest steps possible. You want to configure the Google Cloud SDK command line interface (CLI) so that you can easily manage multiple GCP projects. What should you do?

A.

1. Create a configuration for each project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.

B.

1. Create a configuration for each project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.

C.

1. Use the default configuration for one project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.

D.

1. Use the default configuration for one project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.

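A minimal sketch of named gcloud configurations, one per project, with placeholder names; creating a configuration also activates it, so the project and account below are set on the new configuration.

# Create and populate a configuration for one project.
gcloud config configurations create project-a
gcloud config set project project-a-id
gcloud config set account engineer@example.com

# Later, switch between projects by activating the matching configuration.
gcloud config configurations activate project-a
gcloud config configurations list
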
Question # 69

(You need to migrate multiple PostgreSQL databases from your on-premises data center to Google Cloud. You want to significantly improve the performance of your databases while minimizing changes to your data schema and application code. You expect to exceed 150 TB of data per geographical region. You want to follow Google-recommended practices and minimize your operational costs. What should you do?)

A.

Migrate your data to AlloyDB.

B.

Migrate your data to Spanner.

C.

Migrate your data to Firebase.

D.

Migrate your data to Bigtable.

Question # 70

Your company runs a variety of applications and workloads on Google Cloud, and you are responsible for managing cloud costs. You need to identify a solution that enables you to perform detailed cost analysis. You also must be able to visualize the cost data in multiple ways on the same dashboard. What should you do?

A.

Use the cost breakdown report with the available filters from Cloud Billing to visualize the data

B.

Enable the Cloud Billing export to BigQuery, and use Looker Studio to visualize the data.

C.

Run queries in Cloud Monitoring. Create dashboards to visualize the billing metrics.

D.

Enable Cloud Monitoring metrics export to BigQuery and use Looker to visualize the data

Question # 71

You are using Data Studio to visualize a table from your data warehouse that is built on top of BigQuery. Data is appended to the data warehouse during the day. At night, the daily summary is recalculated by overwriting the table. You just noticed that the charts in Data Studio are broken, and you want to analyze the problem. What should you do?

A.

Use the BigQuery interface to review the nightly Job and look for any errors

B.

Review the Error Reporting page in the Cloud Console to find any errors.

C.

In Cloud Logging create a filter for your Data Studio report

D.

Use the open source CLI tool, Snapshot Debugger, to find out why the data was not refreshed correctly.

Question # 72

You need to select and configure compute resources for a set of batch processing jobs. These jobs take around 2 hours to complete and are run nightly. You want to minimize service costs. What should you do?

A.

Select Google Kubernetes Engine. Use a single-node cluster with a small instance type.

B.

Select Google Kubernetes Engine. Use a three-node cluster with micro instance types.

C.

Select Compute Engine. Use preemptible VM instances of the appropriate standard machine type.

D.

Select Compute Engine. Use VM instance types that support micro bursting.

Question # 73

You are designing an application that lets users upload and share photos. You expect your application to grow really fast and you are targeting a worldwide audience. You want to delete uploaded photos after 30 days. You want to minimize costs while ensuring your application is highly available. Which GCP storage solution should you choose?

A.

Persistent SSD on VM instances.

B.

Cloud Filestore.

C.

Multiregional Cloud Storage bucket.

D.

Cloud Datastore database.

Question # 74

You recently deployed a new version of an application to App Engine and then discovered a bug in the release. You need to immediately revert to the prior version of the application. What should you do?

A.

Run gcloud app restore.

B.

On the App Engine page of the GCP Console, select the application that needs to be reverted and click Revert.

C.

On the App Engine Versions page of the GCP Console, route 100% of the traffic to the previous version.

D.

Deploy the original version as a separate application. Then go to App Engine settings and split traffic between applications so that the original version serves 100% of the requests.

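To illustrate routing traffic back to a prior App Engine version from the CLI (the console Versions page achieves the same result), with placeholder service and version IDs:

# List versions to find the previous, known-good one.
gcloud app versions list --service=default

# Send 100% of traffic to that version.
gcloud app services set-traffic default --splits=20240101t120000=1
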
Question # 75

Your company requires all developers to have the same permissions, regardless of the Google Cloud project they are working on. Your company's security policy also restricts developer permissions to Compute Engine, Cloud Functions, and Cloud SQL. You want to implement the security policy with minimal effort. What should you do?

A.

• Create a custom role with Compute Engine, Cloud Functions, and Cloud SQL permissions in one project within the Google Cloud organization. • Copy the role across all projects created within the organization with the gcloud iam roles copy command. • Assign the role to developers in those projects.

B.

• Add all developers to a Google group in Google Groups for Workspace.
• Assign the predefined role of Compute Admin to the Google group at the Google Cloud organization level.

C.

• Add all developers to a Google group in Cloud Identity.
• Assign predefined roles for Compute Engine, Cloud Functions, and Cloud SQL permissions to the Google group for each project in the Google Cloud organization.

D.

• Add all developers to a Google group in Cloud Identity.
• Create a custom role with Compute Engine, Cloud Functions, and Cloud SQL permissions at the Google Cloud organization level.
• Assign the custom role to the Google group.
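
A minimal sketch of the organization-level approach in option D. The organization ID, group address, and the (abbreviated) permissions list are hypothetical placeholders.

# Create the custom role at the organization level
gcloud iam roles create developerRole \
  --organization=123456789012 \
  --title="Developer" \
  --permissions=compute.instances.list,cloudfunctions.functions.list,cloudsql.instances.list

# Bind the custom role to the developers group once, at the organization level
gcloud organizations add-iam-policy-binding 123456789012 \
  --member="group:developers@example.com" \
  --role="organizations/123456789012/roles/developerRole"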

Full Access
Question # 76

You create a new Google Kubernetes Engine (GKE) cluster and want to make sure that it always runs a supported and stable version of Kubernetes. What should you do?

A.

Enable the Node Auto-Repair feature for your GKE cluster.

B.

Enable the Node Auto-Upgrades feature for your GKE cluster.

C.

Select the latest available cluster version for your GKE cluster.

D.

Select “Container-Optimized OS (cos)” as a node image for your GKE cluster.
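
A minimal sketch of enabling node auto-upgrades (option B) on an existing node pool; the cluster, pool, and zone names are hypothetical.

gcloud container node-pools update default-pool \
  --cluster=my-cluster \
  --zone=us-central1-a \
  --enable-autoupgrade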

Full Access
Question # 77

You are developing an application that will be deployed on Google Cloud. The application will use a service account to retrieve data from BigQuery. Before you deploy your application, you want to test the permissions of this service account from your local machine to ensure there will be no authentication issues. You want to ensure that you use the most secure method while following Google-recommended practices. What should you do?

A.

Configure the gcloud CLI with Application Default Credentials using your user account. Issue a relevant BigQuery request through the gcloud CLI to test the access.

B.

Grant the service account the BigQuery Administrator IAM role to ensure the service account has all required access.

C.

Generate a service account key, and configure the gcloud CLI to use this key. Issue a relevant BigQuery request through the gcloud CLI to test the access.

D.

Configure the gcloud CLI to use service account impersonation. Issue a relevant BigQuery request through the gcloud CLI to test the access.
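
A minimal sketch of service account impersonation (option D): an access token is obtained as the service account and used for a BigQuery API request. The service account email and project ID are hypothetical.

SA="app-sa@my-project.iam.gserviceaccount.com"
# Mint a short-lived token as the service account (no key file is downloaded)
TOKEN=$(gcloud auth print-access-token --impersonate-service-account="$SA")
# List BigQuery datasets with that token to verify access
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://bigquery.googleapis.com/bigquery/v2/projects/my-project/datasets"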

Full Access
Question # 78

You are working in a team that has developed a new application that needs to be deployed on Kubernetes. The production application is business critical and should be optimized for reliability. You need to provision a Kubernetes cluster and want to follow Google-recommended practices. What should you do?

A.

Create a GKE Autopilot cluster. Enroll the cluster in the rapid release channel.

B.

Create a GKE Autopilot cluster. Enroll the cluster in the stable release channel.

C.

Create a zonal GKE standard cluster. Enroll the cluster in the stable release channel.

D.

Create a regional GKE standard cluster. Enroll the cluster in the rapid release channel.
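
A minimal sketch of creating a GKE Autopilot cluster enrolled in the stable release channel (option B), with hypothetical cluster and region names:

gcloud container clusters create-auto prod-cluster \
  --region=us-central1 \
  --release-channel=stable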

Full Access
Question # 79

Your development team needs a new Jenkins server for their project. You need to deploy the server using the fewest steps possible. What should you do?

A.

Download and deploy the Jenkins Java WAR to App Engine Standard.

B.

Create a new Compute Engine instance and install Jenkins through the command line interface.

C.

Create a Kubernetes cluster on Compute Engine and create a deployment with the Jenkins Docker image.

D.

Use GCP Marketplace to launch the Jenkins solution.

Full Access
Question # 80

Your customer wants you to create a secure website with autoscaling based on the compute instance CPU load. You want to enhance performance by storing static content in Cloud Storage. Which resources are needed to distribute the user traffic?

A.

An internal HTTP(S) load balancer together with Identity-Aware Proxy to allow only HTTPS traffic.

B.

An external HTTP(S) load balancer to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend. Install the HTTPS certificates on the instance.

C.

An external HTTP(S) load balancer with a managed SSL certificate to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend.

D.

An external network load balancer pointing to the backend instances to distribute the load evenly. The web servers will forward the request to the Cloud Storage as needed.
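
A minimal sketch of attaching a Cloud Storage backend to an external HTTP(S) load balancer's URL map, as described in option C. All resource names, the bucket, and the host are hypothetical.

# Expose the bucket holding static content as a backend bucket
gcloud compute backend-buckets create static-assets \
  --gcs-bucket-name=my-static-bucket

# Route /static/* requests on the URL map to the backend bucket
gcloud compute url-maps add-path-matcher web-map \
  --path-matcher-name=static-matcher \
  --new-hosts="www.example.com" \
  --default-service=web-backend-service \
  --backend-bucket-path-rules="/static/*=static-assets"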

Full Access
Question # 81

You recently received a new Google Cloud project with an attached billing account where you will work. You need to create instances, set firewalls, and store data in Cloud Storage. You want to follow Google-recommended practices. What should you do?

A.

Use the gcloud CLI services enable cloudresourcemanager.googleapis.com command to enable all resources.

B.

Use the gcloud services enable compute.googleapis.com command to enable Compute Engine and the gcloud services enable storage-api.googleapis.com command to enable the Cloud Storage APIs.

C.

Open the Google Cloud console and enable all Google Cloud APIs from the API dashboard.

D.

Open the Google Cloud console and run gcloud init --project in a Cloud Shell.
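
A minimal sketch of enabling only the APIs this work requires, as in option B; the project ID is a placeholder.

gcloud services enable compute.googleapis.com storage-api.googleapis.com \
  --project=my-project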

Full Access
Question # 82

You have files in a Cloud Storage bucket that you need to share with your suppliers. You want to restrict the time that the files are available to your suppliers to 1 hour. You want to follow Google recommended practices. What should you do?

A.

Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -m 1h gs:///*.

B.

Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -d 1h gs:///.

C.

Create a service account with just the permissions to access files in the bucket. Create a JSON key for the service account. Execute the command gsutil signurl -p 60m gs:///.

D.

Create a JSON key for the Default Compute Engine Service Account. Execute the command gsutil signurl -t 60m gs:///*
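
A minimal sketch of generating a one-hour signed URL with the duration flag, as in option B; the key file, bucket, and object names are hypothetical.

gsutil signurl -d 1h sa-key.json gs://example-supplier-bucket/report.pdf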

Full Access
Question # 83

You are migrating a business critical application from your local data center into Google Cloud. As part of your high-availability strategy, you want to ensure that any data used by the application will be immediately available if a zonal failure occurs. What should you do?

A.

Store the application data on a zonal persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.

B.

Store the application data on a zonal persistent disk. If an outage occurs, create an instance in another zone with this disk attached.

C.

Store the application data on a regional persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.

D.

Store the application data on a regional persistent disk. If an outage occurs, create an instance in another zone with this disk attached.
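
A minimal sketch of the regional persistent disk approach in option D, with hypothetical names. The disk is synchronously replicated across two zones and can be attached to a VM in the surviving zone after a zonal outage.

gcloud compute disks create app-data-disk \
  --region=us-central1 \
  --replica-zones=us-central1-a,us-central1-b \
  --size=200GB --type=pd-ssd

# After a failure in us-central1-a, attach the disk to a VM in the other zone
gcloud compute instances attach-disk recovery-vm \
  --disk=app-data-disk --disk-scope=regional --zone=us-central1-b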

Full Access
Question # 84

Your company stores data from multiple sources that have different data storage requirements. These data include:

1. Customer data that is structured and read with complex queries

2. Historical log data that is large in volume and accessed infrequently

3. Real-time sensor data with high-velocity writes, which needs to be available for analysis but can tolerate some data loss

You need to design the most cost-effective storage solution that fulfills all data storage requirements. What should you do?

A.

Use Spanner for all data.

B.

Use Cloud SQL for customer data, Cloud Storage (Coldline) for historical logs, and BigQuery for sensor data.

C.

Use Cloud SQL for customer data, Cloud Storage (Archive) for historical logs, and Bigtable for sensor data.

D.

Use Firestore for customer data, Cloud Storage (Nearline) for historical logs, and Bigtable for sensor data.

Full Access
Question # 85

You just installed the Google Cloud CLI on your new corporate laptop. You need to list the existing instances of your company on Google Cloud. What must you do before you run the gcloud compute instances list command?

Choose 2 answers

A.

Run gcloud auth login, enter your login credentials in the dialog window, and paste the received login token to gcloud CLI.

B.

Create a Google Cloud service account, and download the service account key. Place the key file in a folder on your machine where gcloud CLI can find it.

C.

Download your Cloud Identity user account key. Place the key file in a folder on your machine where gcloud CLI can find it.

D.

Run gcloud config set compute/zone $my_zone to set the default zone for gcloud CLI.

E.

Run gcloud config set project $my_project to set the default project for gcloud CLI.
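
A minimal sketch combining the authentication and default-project steps from options A and E; the project ID is a placeholder.

gcloud auth login                      # authenticate with your corporate user account
gcloud config set project my-project   # set the default project for the CLI
gcloud compute instances list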

Full Access
Question # 86

Your team is using Linux instances on Google Cloud. You need to ensure that your team logs in to these instances in the most secure and cost efficient way. What should you do?

A.

Attach a public IP to the instances and allow incoming connections from the internet on port 22 for SSH.

B.

Use a third party tool to provide remote access to the instances.

C.

Use the gcloud compute ssh command with the --tunnel-through-iap flag. Allow ingress traffic from the IP range 35.235.240.0/20 on port 22.

D.

Create a bastion host with public internet access. Create the SSH tunnel to the instance through the bastion host.
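
A minimal sketch of the IAP approach in option C. The firewall rule, network, instance, and zone names are hypothetical; 35.235.240.0/20 is the IAP TCP forwarding range quoted in the option.

# Allow IAP's forwarding range to reach port 22 on the instances
gcloud compute firewall-rules create allow-iap-ssh \
  --network=default --direction=INGRESS --action=ALLOW \
  --rules=tcp:22 --source-ranges=35.235.240.0/20

# SSH through the IAP tunnel without exposing a public IP
gcloud compute ssh my-instance --zone=us-central1-a --tunnel-through-iap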

Full Access
Question # 87

You are setting up a Windows VM on Compute Engine and want to make sure you can log in to the VM via RDP. What should you do?

A.

After the VM has been created, use your Google Account credentials to log in into the VM.

B.

After the VM has been created, use gcloud compute reset-windows-password to retrieve the login credentials for the VM.

C.

When creating the VM, add metadata to the instance using ‘windows-password’ as the key and a password as the value.

D.

After the VM has been created, download the JSON private key for the default Compute Engine service account. Use the credentials in the JSON file to log in to the VM.
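
A minimal sketch of option B, with hypothetical instance, zone, and user names:

gcloud compute reset-windows-password windows-vm \
  --zone=us-central1-a --user=ops-admin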

Full Access
Question # 88

(Your company has a rapidly growing social media platform and a user base primarily located in North America. Due to increasing demand, your current on-premises PostgreSQL database, hosted in your United States headquarters data center, no longer meets your needs. You need to identify a cloud-based database solution that offers automatic scaling, multi-region support for future expansion, and maintains low latency.)

A.

Use Bigtable.

B.

Use BigQuery.

C.

Use Spanner.

D.

Use Cloud SQL for PostgreSQL.

Full Access
Question # 89

You need to create a copy of a custom Compute Engine virtual machine (VM) to facilitate an expected increase in application traffic due to a business acquisition. What should you do?

A.

Create a Compute Engine snapshot of your base VM. Create your images from that snapshot.

B.

Create a Compute Engine snapshot of your base VM. Create your instances from that snapshot.

C.

Create a custom Compute Engine image from a snapshot. Create your images from that image.

D.

Create a custom Compute Engine image from a snapshot. Create your instances from that image.
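
A minimal sketch of option D, with hypothetical snapshot, image, and instance names:

# Create a reusable image from the snapshot of the base VM
gcloud compute images create base-app-image --source-snapshot=base-vm-snapshot
# Create additional instances from that image
gcloud compute instances create app-vm-2 \
  --image=base-app-image --zone=us-central1-a --machine-type=e2-standard-4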

Full Access
Question # 90

You are running an application on multiple virtual machines within a managed instance group and have autoscaling enabled. The autoscaling policy is configured so that additional instances are added to the group if the CPU utilization of instances goes above 80%. VMs are added until the instance group reaches its maximum limit of five VMs or until CPU utilization of instances lowers to 80%. The initial delay for HTTP health checks against the instances is set to 30 seconds. The virtual machine instances take around three minutes to become available for users. You observe that when the instance group autoscales, it adds more instances than necessary to support the levels of end-user traffic. You want to properly maintain instance group sizes when autoscaling. What should you do?

A.

Set the maximum number of instances to 1.

B.

Decrease the maximum number of instances to 3.

C.

Use a TCP health check instead of an HTTP health check.

D.

Increase the initial delay of the HTTP health check to 200 seconds.
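
A minimal sketch of raising the autohealing initial delay on an existing managed instance group, as in option D; the group, zone, and health check names are hypothetical.

gcloud compute instance-groups managed update app-mig \
  --zone=us-central1-a \
  --health-check=app-http-health-check \
  --initial-delay=200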

Full Access
Question # 91

You’ve deployed a microservice called myapp1 to a Google Kubernetes Engine cluster using the YAML file specified below:

You need to refactor this configuration so that the database password is not stored in plain text. You want to follow Google-recommended practices. What should you do?

A.

Store the database password inside the Docker image of the container, not in the YAML file.

B.

Store the database password inside a Secret object. Modify the YAML file to populate the DB_PASSWORD environment variable from the Secret.

C.

Store the database password inside a ConfigMap object. Modify the YAML file to populate the DB_PASSWORD environment variable from the ConfigMap.

D.

Store the database password in a file inside a Kubernetes persistent volume, and use a persistent volume claim to mount the volume to the container.
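
A minimal sketch of option B, with hypothetical Secret, key, and password values; the corresponding Deployment change is shown as a comment.

kubectl create secret generic myapp1-db --from-literal=password='S3cr3t!'

# In the Deployment spec, replace the plain-text value with a Secret reference:
#   env:
#   - name: DB_PASSWORD
#     valueFrom:
#       secretKeyRef:
#         name: myapp1-db
#         key: password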

Full Access
Question # 92

You are deploying an application on Google Cloud that requires a relational database for storage. To satisfy your company's security policies, your application must connect to your database through an encrypted and authenticated connection that requires minimal management and integrates with Identity and Access Management (IAM). What should you do?

A.

Deploy a Cloud SQL database with the SSL mode set to encrypted only, configure SSL/TLS client certificates, and configure a database user and password.

B.

Deploy a Cloud SQL database and configure IAM database authentication. Access the database through the Cloud SQL Auth Proxy.

C.

Deploy a Cloud SQL database with the SSL mode set to encrypted only, configure SSL/TLS client certificates, and configure IAM database authentication.

D.

Deploy a Cloud SQL database and configure a database user and password. Access the database through the Cloud SQL Auth Proxy.
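
A minimal sketch of option B using the Cloud SQL Auth Proxy (v2) with automatic IAM authentication; the instance connection name is hypothetical.

./cloud-sql-proxy --auto-iam-authn my-project:us-central1:app-db
# The application then connects to 127.0.0.1, and the proxy provides the
# encrypted, IAM-authenticated connection to the Cloud SQL instance.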

Full Access
Question # 93

You are configuring service accounts for an application that spans multiple projects. Virtual machines (VMs) running in the web-applications project need access to BigQuery datasets in the crm-databases project. You want to follow Google-recommended practices to grant access to the service account in the web-applications project. What should you do?

A.

Grant "project owner" for web-applications appropriate roles to crm-databases.

B.

Grant "project owner" role to crm-databases and the web-applications project.

C.

Grant "project owner" role to crm-databases and roles/bigquery.dataViewer role to web-applications.

D.

Grant roles/bigquery.dataViewer role to crm-databases and appropriate roles to web-applications.
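
A minimal sketch of option D, granting the web-applications service account (hypothetical email) read access on the crm-databases project:

gcloud projects add-iam-policy-binding crm-databases \
  --member="serviceAccount:web-app-sa@web-applications.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"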

Full Access
Question # 94

Several employees at your company have been creating projects with Cloud Platform and paying for it with their personal credit cards, which the company reimburses. The company wants to centralize all these projects under a single, new billing account. What should you do?

A.

Contact cloud-billing@google.com with your bank account details and request a corporate billing account for your company.

B.

Create a ticket with Google Support and wait for their call to share your credit card details over the phone.

C.

In the Google Cloud Platform Console, go to the Resource Manager and move all projects to the root Organization.

D.

In the Google Cloud Platform Console, create a new billing account and set up a payment method.

Full Access
Question # 95

Your company has developed a new application that consists of multiple microservices. You want to deploy the application to Google Kubernetes Engine (GKE), and you want to ensure that the cluster can scale as more applications are deployed in the future. You want to avoid manual intervention when each new application is deployed. What should you do?

A.

Deploy the application on GKE, and add a HorizontalPodAutoscaler to the deployment.

B.

Deploy the application on GKE, and add a VerticalPodAutoscaler to the deployment.

C.

Create a GKE cluster with autoscaling enabled on the node pool. Set a minimum and maximum for the size of the node pool.

D.

Create a separate node pool for each application, and deploy each application to its dedicated node pool.
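
A minimal sketch of option C, creating a cluster with node pool autoscaling; the cluster name, zone, and node limits are hypothetical.

gcloud container clusters create app-cluster \
  --zone=us-central1-a \
  --num-nodes=3 \
  --enable-autoscaling --min-nodes=1 --max-nodes=10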

Full Access
Question # 96

(You are migrating your on-premises workload to Google Cloud. Your company is implementing its Cloud Billing configuration and requires access to a granular breakdown of its Google Cloud costs. You need to ensure that the Cloud Billing datasets are available in BigQuery so you can conduct a detailed analysis of costs. What should you do?)

A.

Enable the BigQuery API and ensure that the BigQuery User IAM role is selected. Change the BigQuery dataset to select a data location.

B.

Create a Cloud Billing account. Enable the BigQuery Data Transfer Service API to export pricing data.

C.

Enable Cloud Billing data export to BigQuery when you create a Cloud Billing account.

D.

Enable Cloud Billing on the project and link a Cloud Billing account. Then view the billing data table in the BigQuery dataset.

Full Access
Question # 97

Your company runs its Linux workloads on Compute Engine instances. Your company will be working with a new operations partner that does not use Google Accounts. You need to grant access to the instances to your operations partner so they can maintain the installed tooling. What should you do?

A.

Enable Cloud IAP for the Compute Engine instances, and add the operations partner as a Cloud IAP Tunnel User.

B.

Tag all the instances with the same network tag. Create a firewall rule in the VPC to grant TCP access on port 22 for traffic from the operations partner to instances with the network tag.

C.

Set up Cloud VPN between your Google Cloud VPC and the internal network of the operations partner.

D.

Ask the operations partner to generate SSH key pairs, and add the public keys to the VM instances.
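
A minimal sketch of option D, with hypothetical usernames and file names; note that setting instance-level ssh-keys metadata replaces any existing value for that key.

# partner_key.pub is the public key supplied by the operations partner
echo "opspartner:$(cat partner_key.pub)" > ssh_keys.txt
gcloud compute instances add-metadata tooling-vm \
  --zone=us-central1-a \
  --metadata-from-file=ssh-keys=ssh_keys.txt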

Full Access