

Question # 4

What are the purposes of creating a storage integration? (Choose three.)

A.

Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

B.

Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

C.

Support multiple external stages using one single Snowflake object.

D.

Avoid supplying credentials when creating a stage or when loading or unloading data.

E.

Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

F.

Manage credentials from multiple cloud providers in one single Snowflake object.
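
For context, a storage integration stores a single cloud-provider IAM entity that multiple external stages can reuse, so no credentials are supplied when creating a stage or loading/unloading data. A minimal sketch, with hypothetical object names and a hypothetical role ARN:

CREATE STORAGE INTEGRATION my_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/load/');

-- Any number of stages can reference the same integration; no credentials appear here.
CREATE STAGE my_ext_stage
  URL = 's3://my-bucket/load/files/'
  STORAGE_INTEGRATION = my_s3_int;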

Question # 5

The following table exists in the production database:

A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data, while making sure the masking is still applied when views are created on the table or the table is cloned?

A.

Use a masking policy on the username column using an entitlement table with valid dates.

B.

Use a row level policy on the user_events table using an entitlement table with valid dates.

C.

Use a masking policy on the username column with event_timestamp as a conditional column.

D.

Use a secure view on the user_events table using a case statement on the username column.
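
For reference, a masking policy can accept additional conditional columns so the masking decision depends on another column's value, such as an event timestamp. A minimal sketch, assuming the table is named user_events and using a hypothetical policy name:

CREATE MASKING POLICY mask_old_usernames AS
  (username STRING, event_timestamp TIMESTAMP_NTZ) RETURNS STRING ->
  CASE
    WHEN event_timestamp < DATEADD(month, -6, CURRENT_DATE()) THEN '*****'
    ELSE username
  END;

-- Attach the policy to the column, passing event_timestamp as the conditional column.
ALTER TABLE user_events MODIFY COLUMN username
  SET MASKING POLICY mask_old_usernames USING (username, event_timestamp);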

Question # 6

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.

Use PURGE = TRUE in the COPY INTO command.

C.

Use PURGE = FALSE in the COPY INTO command.

D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.
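
For reference, ON_ERROR and PURGE are copy options set on the COPY INTO statement, whether it is run directly or wrapped in a pipe definition. A minimal sketch with hypothetical object names:

COPY INTO my_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'CSV')
  ON_ERROR = SKIP_FILE   -- skip any file containing errors instead of aborting the load
  PURGE = TRUE;          -- remove successfully loaded files from the stage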

Question # 7

A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: product_category, and product_details. Both tables can be joined by the product_id column. Data access should be governed, and only the partner should have access to the records.

The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

A.

Use Secure Data Sharing with an S3 bucket as a destination.

B.

Publish product_category and product_details data sets on the Snowflake Marketplace.

C.

Create a database user for the partner and give them access to the required data sets.

D.

Create a reader account for the partner and share the data sets as secure views.
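
For context, a reader account is provisioned and paid for by the provider so that a non-Snowflake customer can consume a share, and secure views limit what the consumer can see. A minimal sketch, with hypothetical account, database, and column names:

-- Provision a reader account for the partner (run as ACCOUNTADMIN).
CREATE MANAGED ACCOUNT partner_reader
  ADMIN_NAME = partner_admin, ADMIN_PASSWORD = 'ChangeMe123!', TYPE = READER;

-- Expose only the governed join through a secure view.
CREATE SECURE VIEW catalog_db.public.product_catalog_v AS
  SELECT c.product_id, c.category_name, d.product_name
  FROM catalog_db.public.product_category c
  JOIN catalog_db.public.product_details d ON c.product_id = d.product_id;

CREATE SHARE product_share;
GRANT USAGE ON DATABASE catalog_db TO SHARE product_share;
GRANT USAGE ON SCHEMA catalog_db.public TO SHARE product_share;
GRANT SELECT ON VIEW catalog_db.public.product_catalog_v TO SHARE product_share;
ALTER SHARE product_share ADD ACCOUNTS = partner_reader;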

Question # 8

What is a valid object hierarchy when building a Snowflake environment?

A.

Account --> Database --> Schema --> Warehouse

B.

Organization --> Account --> Database --> Schema --> Stage

C.

Account --> Schema --> Table --> Stage

D.

Organization --> Account --> Stage --> Table --> View

Question # 9

A group of Data Analysts has been granted the role ANALYST_ROLE. They need a Snowflake database where they can create and modify tables, views, and other objects and load them with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the OWNERSHIP ON FUTURE [object type]s IN DATABASE privilege to SYSADMIN.
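
For context, ownership of the database lets a role create and modify objects in it, while the ability to pass access on to others depends on MANAGE GRANTS and on whether schemas use managed access. A minimal sketch, with hypothetical names:

-- Give the analysts control of their own database...
GRANT OWNERSHIP ON DATABASE analyst_db TO ROLE ANALYST_ROLE;

-- ...or keep SYSADMIN as owner and use a managed access schema, in which only the
-- schema owner (or a role with MANAGE GRANTS) can grant privileges on contained objects.
CREATE SCHEMA analyst_db.workspace WITH MANAGED ACCESS;
GRANT CREATE TABLE, CREATE VIEW ON SCHEMA analyst_db.workspace TO ROLE ANALYST_ROLE;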

Question # 10

An Architect needs to design a solution for building environments for development, test, and pre-production, all located in a single Snowflake account. The environments should be based on production data.

Which solution would be MOST cost-effective and performant?

A.

Use zero-copy cloning into transient tables.

B.

Use zero-copy cloning into permanent tables.

C.

Use CREATE TABLE ... AS SELECT (CTAS) statements.

D.

Use a Snowflake task to trigger a stored procedure to copy data.
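
For reference, zero-copy cloning builds an environment from production metadata without copying micro-partitions, and transient tables avoid Fail-safe storage costs. A minimal sketch with hypothetical names:

-- Clone a production table into a transient development table; no data is physically copied.
CREATE TRANSIENT TABLE dev_db.public.orders CLONE prod_db.public.orders;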

Question # 11

A company has several sites in different regions from which the company wants to ingest data.

Which of the following will enable this type of data ingestion?

A.

The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

B.

The company must replicate data between Snowflake accounts.

C.

The company should provision a reader account to each site and ingest the data through the reader accounts.

D.

The company should use a storage integration for the external stage.

Question # 12

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

A.

1. Create a share and add the database privileges to the share

2. Create a new listing on the Snowflake Marketplace

3. Alter the listing and add the share

4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace

B.

1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Alter the share and add the customer's Snowflake account to the share

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia)

2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared

3. Create a share and add the database privileges to the share

4. Alter the share and add the customer's Snowflake account to the share

D.

1. Create a reader account in Azure East US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Add the reader account to the share

4. Share the reader account's URL and credentials with the customer
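
For context, Secure Data Sharing works within a single region and cloud platform, so sharing across clouds requires replicating the database to a provider-owned account in the consumer's region first. A minimal sketch of that replication-then-share sequence, with hypothetical organization, account, and object names:

-- On the primary account (AWS us-west-2): enable replication to the new Azure account.
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_va_account;

-- On the Azure East US 2 account: create and refresh the secondary database.
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_or_account.sales_db;
ALTER DATABASE sales_db REFRESH;

-- Still on the Azure account: create the share and add the customer's account.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = customer_account;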

Question # 13

What is a characteristic of loading data into Snowflake using the Snowflake Connector for Kafka?

A.

The Connector only works in Snowflake regions that use AWS infrastructure.

B.

The Connector works with all file formats, including text, JSON, Avro, ORC, Parquet, and XML.

C.

The Connector creates and manages its own stage, file format, and pipe objects.

D.

Loads using the Connector will have lower latency than Snowpipe and will ingest data in real time.

Question # 14

At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?

A.

Global

B.

Database

C.

Schema

D.

Table
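
For reference, these APPLY privileges are global (account-level) privileges; a minimal sketch with a hypothetical role name:

GRANT APPLY MASKING POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY SESSION POLICY ON ACCOUNT TO ROLE governance_admin;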

Question # 15

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question # 16

A company is using Snowflake in Azure in the Netherlands. The company's analyst team also has data in JSON format, stored in an Amazon S3 bucket in the AWS Singapore region, that the team wants to analyze.

The Architect has been given the following requirements:

1. Provide access to frequently changing data

2. Keep egress costs to a minimum

3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

A.

Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

B.

Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

C.

Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

D.

Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.
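
For context, an external table leaves the JSON files in the remote S3 bucket, while a materialized view over it keeps the frequently queried results local to the Snowflake region, limiting egress and latency. A minimal sketch with hypothetical names (the stage is assumed to point at the Singapore bucket):

CREATE EXTERNAL TABLE sg_events_ext
  WITH LOCATION = @sg_s3_stage/events/
  FILE_FORMAT = (TYPE = 'JSON');

-- Refresh the file-level metadata when new files arrive (this can also be automated via event notifications).
ALTER EXTERNAL TABLE sg_events_ext REFRESH;

-- The materialized view stores the extracted attributes locally and is maintained automatically.
CREATE MATERIALIZED VIEW sg_events_mv AS
  SELECT value:customer_id::STRING AS customer_id,
         value:event_type::STRING  AS event_type
  FROM sg_events_ext;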

Question # 17

The following DDL command was used to create a task based on a stream:

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

A.

The warehouse MY_WH will be made active every five minutes to check the stream.

B.

The warehouse MY_WH will only be active when there are results in the stream.

C.

The warehouse MY_WH will never suspend.

D.

The warehouse MY_WH will automatically resize to accommodate the size of the stream.
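
The DDL itself is not reproduced above. For context, a task that consumes a stream is typically defined with a schedule and a SYSTEM$STREAM_HAS_DATA condition; a hypothetical sketch (not the original command):

CREATE TASK process_stream_task
  WAREHOUSE = MY_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('MY_STREAM')
AS
  INSERT INTO target_table SELECT * FROM MY_STREAM;

-- The WHEN condition is evaluated by cloud services, so MY_WH resumes only when the task body actually runs.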

Question # 18

Which Snowflake objects can be used in a data share? (Select TWO).

A.

Standard view

B.

Secure view

C.

Stored procedure

D.

External table

E.

Stream

Question # 19

A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

What will happen to the consumer account if a new table (table_6) is added to the provider schema?

A.

The consumer role will automatically see the new table and no additional grants are needed.

B.

The consumer role will see the table only after this grant is given on the consumer side:

GRANT IMPORTED PRIVILEGES ON DATABASE PSHARE_EDW_4TEST_DB TO DEV_ROLE;

C.

The consumer role will see the table only after this grant is given on the provider side:

USE ROLE ACCOUNTADMIN;

GRANT SELECT ON TABLE EDW.ACCOUNTING.Table_6 TO SHARE PSHARE_EDW_4TEST;

D.

The consumer role will see the table only after this grant is given on the provider side:

USE ROLE ACCOUNTADMIN;

GRANT USAGE ON DATABASE EDW TO SHARE PSHARE_EDW_4TEST;

GRANT USAGE ON SCHEMA EDW.ACCOUNTING TO SHARE PSHARE_EDW_4TEST;

GRANT SELECT ON TABLE EDW.ACCOUNTING.Table_6 TO DATABASE PSHARE_EDW_4TEST_DB;

Question # 20

The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization's systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

A.

Call the LOGIN_HISTORY Information Schema table function.

B.

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.

View the Users section in the Account tab in the Snowflake UI and review the last login column.
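
For reference, recent and ongoing login attempts can be examined either through the Information Schema table function (low latency, last 7 days) or the ACCOUNT_USAGE view (365 days of history, with some ingestion latency). A minimal sketch:

-- Information Schema table function: near-real-time, limited history.
SELECT *
FROM TABLE(INFORMATION_SCHEMA.LOGIN_HISTORY(
       TIME_RANGE_START => DATEADD('hours', -1, CURRENT_TIMESTAMP())))
ORDER BY event_timestamp;

-- ACCOUNT_USAGE view: longer retention, useful for spotting patterns across the attack window.
SELECT event_timestamp, user_name, client_ip, is_success, error_message
FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY
WHERE event_timestamp > DATEADD('day', -1, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;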

Question # 21

Assuming all Snowflake accounts are using Enterprise edition or higher, in which development and testing scenarios would copying of data be required and zero-copy cloning not be suitable? (Select TWO).

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Question # 22

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

A.

Use Snowpipe with auto-ingest.

B.

Use a COPY command with a task.

C.

Use a materialized view on an external table.

D.

Use the COPY INTO command.

E.

Use a combination of a task and a stream.
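
For context, Snowpipe with auto-ingest loads each file as soon as the cloud storage event notification arrives, with no scheduling or task code to maintain. A minimal sketch with hypothetical names:

CREATE PIPE dashboard_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO dashboard_raw
  FROM @landing_stage
  FILE_FORMAT = (TYPE = 'CSV');

-- The pipe's notification_channel (visible in SHOW PIPES) is then wired to the bucket's event notifications.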

Question # 23

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

A.

External table

B.

Materialized view

C.

Search optimization

D.

Result cache
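
For reference, a materialized view can declare its own clustering key, giving an alternate physical ordering of the same rows as the base table. A minimal sketch with hypothetical names:

-- Base table is clustered on order_date; the materialized view clusters by customer_id instead.
CREATE MATERIALIZED VIEW orders_by_customer
  CLUSTER BY (customer_id)
AS
  SELECT customer_id, order_id, order_date, amount
  FROM orders;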

Question # 24

Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

A.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

B.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

C.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

D.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

Question # 25

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

A.

The MERGE command

B.

The UPSERT command

C.

The CHANGES clause

D.

A STREAM object

E.

The CHANGE_DATA_CAPTURE command
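
For reference, both streams and the CHANGES clause read a table's change tracking metadata. A minimal sketch with hypothetical names:

ALTER TABLE orders SET CHANGE_TRACKING = TRUE;

-- A stream records inserts, updates, and deletes since it was created or last consumed.
CREATE STREAM orders_stream ON TABLE orders;

-- The CHANGES clause queries the change tracking metadata directly over a Time Travel window.
SELECT *
FROM orders CHANGES (INFORMATION => DEFAULT)
AT (OFFSET => -60*5);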

Question # 26

A user has the appropriate privilege to see unmasked data in a column.

If the user loads this column data into another column that does not have a masking policy, what will occur?

A.

Unmasked data will be loaded in the new column.

B.

Masked data will be loaded into the new column.

C.

Unmasked data will be loaded into the new column but only users with the appropriate privileges will be able to see the unmasked data.

D.

Unmasked data will be loaded into the new column and no users will be able to see the unmasked data.

Question # 27

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

A.

Shared databases are read-only.

B.

Shared databases must be refreshed in order for new data to be visible.

C.

Shared databases cannot be cloned.

D.

Shared databases are not supported by Time Travel.

E.

Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

F.

Shared databases can also be created as transient databases.

Question # 28

Consider the following COPY command, which loads CSV-formatted data into a Snowflake table from an internal stage through a data transformation query.

This command results in the following error:

SQL compilation error: invalid parameter 'validation_mode'

Assuming the syntax is correct, what is the cause of this error?

A.

The VALIDATION_MODE parameter supports COPY statements that load data from external stages only.

B.

The VALIDATION_MODE parameter does not support COPY statements with CSV file formats.

C.

The VALIDATION_MODE parameter does not support COPY statements that transform data during a load.

D.

The value return_all_errors of the option VALIDATION_MODE is causing a compilation error.
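
For reference, VALIDATION_MODE validates staged files without loading them, but it is not supported when the COPY statement transforms data during the load. A minimal sketch of a supported use, with hypothetical names:

-- Supported: VALIDATION_MODE with a plain (non-transforming) COPY.
COPY INTO my_table
FROM @my_int_stage
FILE_FORMAT = (TYPE = 'CSV')
VALIDATION_MODE = RETURN_ALL_ERRORS;

-- Not supported: combining VALIDATION_MODE with a transformation query such as
-- COPY INTO my_table FROM (SELECT $1, UPPER($2) FROM @my_int_stage) ...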
