
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Questions 4

A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

What step can be taken to improve the pruning of the reporting tables?

Options:

A.

Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

B.

Increase the size of the virtual warehouse to a size 5X-Large.

C.

Use an ORDER BY command to load the reporting tables.

D.

Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.
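
For illustration, a minimal sketch of option C's approach, with hypothetical staging and reporting table names:

-- Reload the reporting table sorted on a commonly filtered column so that each
-- micro-partition holds a narrow value range and prunes well (names assumed).
INSERT OVERWRITE INTO reporting.sales_fact
SELECT * FROM staging.sales_fact
ORDER BY sale_date;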

Questions 5

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.

1. Create a share.

2. Add objects to the share.

3. Add a consumer account to the share for the vendor to access.

B.

1. Create a share.

2. Create a reader account for the vendor to use.

3. Add the reader account to the share.

C.

1. Create a new role called db_share.

2. Grant the db_share role privileges to read data from the company database and schema.

3. Create a user for the vendor.

4. Grant the db_share role to the vendor's users.

D.

1. Promote an existing database in the company's local account to primary.

2. Replicate the database to Snowflake on Azure in the West-Europe region.

3. Create a share and add objects to the share.

4. Add a consumer account to the share for the vendor to access.
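
A condensed sketch of the replicate-then-share flow that option D describes (organization, account, and object names are assumptions):

-- In the company's AWS eu-west-2 account: promote the database to primary.
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_west_europe;
-- In the company's Azure West Europe account: create and refresh the replica.
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_london.sales_db;
ALTER DATABASE sales_db REFRESH;
-- Share the replicated data with the vendor's Azure account.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = vendor_org.vendor_account;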

Questions 6

An Architect has been asked to clone schema STAGING as it looked one week ago, Tuesday June 1st at 8:00 AM, to recover some objects.

The STAGING schema has 50 days of retention.

The Architect runs the following statement:

CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-06-01 08:00:00');

The Architect receives the following error: Time travel data is not available for schema STAGING. The requested time is either beyond the allowed time travel period or before the object creation time.

The Architect then checks the schema history and sees the following:

CREATED_ON|NAME|DROPPED_ON

2021-06-02 23:00:00 | STAGING | NULL

2021-05-01 10:00:00 | STAGING | 2021-06-02 23:00:00

How can cloning the STAGING schema be achieved?

Options:

A.

Undrop the STAGING schema and then rerun the CLONE statement.

B.

Modify the statement: CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-05-01 10:00:00');

C.

Rename the STAGING schema and perform an UNDROP to retrieve the previous STAGING schema version, then run the CLONE statement.

D.

Cloning cannot be accomplished because the STAGING schema version was not active during the proposed Time Travel time period.
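
As a sketch, the rename-and-undrop sequence in option C would look like this (the rename target name is an assumption):

-- Move the current STAGING schema (created June 2nd) out of the way.
ALTER SCHEMA STAGING RENAME TO STAGING_CURRENT;
-- Restore the dropped schema version that was active on June 1st.
UNDROP SCHEMA STAGING;
-- The requested timestamp now falls within the restored schema's lifetime.
CREATE SCHEMA STAGING_CLONE CLONE STAGING AT (TIMESTAMP => '2021-06-01 08:00:00'::TIMESTAMP);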

Questions 7

Which Snowflake objects can be used in a data share? (Select TWO).

Options:

A.

Standard view

B.

Secure view

C.

Stored procedure

D.

External table

E.

Stream
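
For context, a minimal sketch of how objects are added to a share (all names are assumptions):

CREATE SHARE vendor_share;
GRANT USAGE ON DATABASE db1 TO SHARE vendor_share;
GRANT USAGE ON SCHEMA db1.public TO SHARE vendor_share;
-- Secure views and external tables are among the object types a share can include.
GRANT SELECT ON VIEW db1.public.secure_sales_v TO SHARE vendor_share;
GRANT SELECT ON EXTERNAL TABLE db1.public.ext_events TO SHARE vendor_share;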

Questions 8

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
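
Whichever role is used, the enabling statement itself is a single ALTER, sketched here:

-- Requires OWNERSHIP on the table and ADD SEARCH OPTIMIZATION on its schema.
ALTER TABLE SECURITY_LOGS.VPN_ACCESS_LOGS ADD SEARCH OPTIMIZATION;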

Questions 9

An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.

Why is this occurring?

Options:

A.

Tasks cannot be cloned.

B.

The objects that the tasks reference are not fully qualified.

C.

Cloned tasks are suspended by default and must be manually resumed.

D.

The Architect has insufficient privileges to alter tasks on the cloned database.
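
A brief sketch of restarting a task after cloning (database and task names are assumptions):

-- Tasks in a cloned database are suspended by default and must be resumed.
ALTER TASK cloned_db.my_schema.my_task RESUME;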

Questions 10

A company is using Snowflake in Azure in the Netherlands. The company's analyst team also has data in JSON format that is stored in an Amazon S3 bucket in the AWS Singapore region that the team wants to analyze.

The Architect has been given the following requirements:

1. Provide access to frequently changing data

2. Keep egress costs to a minimum

3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

Options:

A.

Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

B.

Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

C.

Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

D.

Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.
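
For illustration, a sketch of the external-table-plus-materialized-view pattern in option A (integration, stage, and column names are assumptions):

CREATE STAGE sg_events_stage
  URL = 's3://analyst-bucket/events/'
  STORAGE_INTEGRATION = s3_sg_int;

CREATE EXTERNAL TABLE ext_events
  LOCATION = @sg_events_stage
  FILE_FORMAT = (TYPE = JSON)
  AUTO_REFRESH = TRUE;

-- The materialized view stores results in the local region, so repeated queries
-- avoid repeated S3 egress while auto-refresh tracks the changing source data.
CREATE MATERIALIZED VIEW mv_events AS
  SELECT value:event_id::NUMBER AS event_id, value AS payload
  FROM ext_events;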

Questions 11

Which of the following commands will use warehouse credits?

Options:

A.

SHOW TABLES LIKE 'SNOWFL%';

B.

SELECT MAX(FLAKE_ID) FROM SNOWFLAKE;

C.

SELECT COUNT(*) FROM SNOWFLAKE;

D.

SELECT COUNT(FLAKE_ID) FROM SNOWFLAKE GROUP BY FLAKE_ID;

Questions 12

Database DB1 has schema S1, which has one table, T1.

DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

The user runs the following command:

DROP DATABASE DB1;

What will the Time Travel retention period be for T1?

Options:

A.

10 days

B.

20 days

C.

30 days

D.

37 days
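
The setup described above corresponds to this sketch:

ALTER DATABASE DB1 SET DATA_RETENTION_TIME_IN_DAYS = 10;
ALTER SCHEMA DB1.S1 SET DATA_RETENTION_TIME_IN_DAYS = 20;
ALTER TABLE DB1.S1.T1 SET DATA_RETENTION_TIME_IN_DAYS = 30;
-- Dropping a container drops its children with it; the children are then
-- retained for the container's retention period, not their own.
DROP DATABASE DB1;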

Questions 13

Which Snowflake architecture recommendation needs multiple Snowflake accounts for implementation?

Options:

A.

Enable a disaster recovery strategy across multiple cloud providers.

B.

Create external stages pointing to cloud providers and regions other than the region hosting the Snowflake account.

C.

Enable zero-copy cloning among the development, test, and production environments.

D.

Enable separation of the development, test, and production environments.

Questions 14

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

Options:

A.

There needs to be fewer objects per tenant.

B.

Security and Role-Based Access Control (RBAC) policies must be simple to configure.

C.

Compute costs must be optimized.

D.

Tenant data shape may be unique per tenant.

E.

Storage costs must be optimized.

Questions 15

A group of Data Analysts have been granted the ANALYST_ROLE role. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

Options:

A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the CREATE SCHEMA privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the OWNERSHIP ON FUTURE [object type]S IN DATABASE privilege to SYSADMIN.
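
As an illustration of the managed access pattern in option C (database and schema names are assumptions):

-- In a managed access schema, object owners cannot grant privileges on their
-- objects; only the schema owner or a role with MANAGE GRANTS can.
CREATE SCHEMA analytics.workspace WITH MANAGED ACCESS;
GRANT CREATE TABLE, CREATE VIEW ON SCHEMA analytics.workspace TO ROLE ANALYST_ROLE;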

Questions 16

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

Options:

A.

Choose columns that are frequently used in join predicates.

B.

Choose lower cardinality columns to support clustering keys and cost effectiveness.

C.

Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

D.

Choose cluster columns that are most actively used in selective filters.

E.

Choose cluster columns that are actively used in the GROUP BY clauses.
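
For reference, a clustering key is declared with ALTER TABLE, as in this sketch (table and column names are assumptions):

-- Prefer columns used in selective filters and join predicates.
ALTER TABLE sales CLUSTER BY (sale_date, store_id);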

Questions 17

A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.

The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.

According to Snowflake recommended best practice, how should these requirements be met?

Options:

A.

Migrate the European accounts to the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.

B.

Deploy a Private Data Exchange in combination with data shares for the European accounts.

C.

Deploy to the Snowflake Marketplace, making sure that invoker_share() is used in all secure views.

D.

Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.

Questions 18

What is the MOST efficient way to design an environment where data retention is not considered critical, and customization needs are to be kept to a minimum?

Options:

A.

Use a transient database.

B.

Use a transient schema.

C.

Use a transient table.

D.

Use a temporary table.
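
A minimal sketch of the transient approach (the database name is an assumption):

-- Transient objects have no Fail-safe and at most 1 day of Time Travel,
-- trading recoverability for lower storage cost.
CREATE TRANSIENT DATABASE dev_sandbox;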

Questions 19

A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.

After setting up the DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.

What would cause this to occur? (Choose two.)

Options:

A.

The staging schema has not been set up for MANAGED ACCESS.

B.

The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.

C.

The tables exceed the 1 TB limit for data recovery.

D.

The staging tables are of the TRANSIENT type.

E.

The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.

Questions 20

Which of the following are characteristics of Snowflake’s parameter hierarchy?

Options:

A.

Session parameters override virtual warehouse parameters.

B.

Virtual warehouse parameters override user parameters.

C.

Table parameters override virtual warehouse parameters.

D.

Schema parameters override account parameters.
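
As a concrete illustration of the hierarchy for a session-level parameter such as TIMEZONE (user name and values are arbitrary):

ALTER ACCOUNT SET TIMEZONE = 'UTC';
-- A user-level value overrides the account default...
ALTER USER analyst1 SET TIMEZONE = 'America/New_York';
-- ...and a session-level value overrides both for that session.
ALTER SESSION SET TIMEZONE = 'Europe/London';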

Questions 21

An Architect runs the following SQL query:

How can this query be interpreted?

Options:

A.

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Questions 22

Which of the following objects can be cloned in Snowflake?

Options:

A.

Permanent table

B.

Transient table

C.

Temporary table

D.

External tables

E.

Internal stages

Questions 23

An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.

Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?

Options:

A.

Utilize a higher buffer.flush.time in the connector configuration.

B.

Utilize a higher buffer.size.bytes in the connector configuration.

C.

Utilize a lower buffer.size.bytes in the connector configuration.

D.

Utilize a lower buffer.count.records in the connector configuration.
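
For orientation, the three buffer thresholds appear in the connector configuration roughly as follows; the values shown are illustrative assumptions, not recommendations:

# Snowflake Kafka connector buffering; a flush occurs when any threshold is hit.
buffer.flush.time=120
buffer.size.bytes=5000000
buffer.count.records=10000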

Questions 24

An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

Options:

A.

Use the Snowflake Connector for Python, connect to remote storage and download the file.

B.

Use the get command in SnowSQL to retrieve the file.

C.

Use the get command in Snowsight to retrieve the file.

D.

Use the Snowflake API endpoint and download the file.
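
The SnowSQL retrieval in option B is a single command, sketched here with an assumed stage and local path:

-- Download the failed file from the internal named stage to a local directory.
GET @ingest_stage/file5.csv file:///tmp/recovered/;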

Questions 25

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

B.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

C.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

D.

All rows loaded using a specific COPY statement will have the same timestamp value.
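
A sketch of such a table definition (table and column names are assumptions):

CREATE TABLE load_audit (
  payload VARCHAR,
  -- The default is evaluated once per COPY statement, so every row loaded by
  -- that statement gets the same timestamp.
  load_ts TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
);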

Questions 26

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

Options:

A.

External table

B.

Materialized view

C.

Search optimization

D.

Result cache

Questions 27

A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.

An Architect created two data domains to produce two data products. The Architect needs a third data domain that will use both of the data products to create an aggregate data product. The read access to the data products will be granted through a separate role.

Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?

Options:

A.

Use secondary roles for all users.

B.

Create a hierarchy between the two read roles.

C.

Request a technical ETL user with the sysadmin role.

D.

Request that the two data domains share data using the Data Exchange.
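
Option A relies on secondary roles, which are enabled per session, as in this sketch:

-- Allow queries in this session to use the privileges of all roles granted to
-- the user, not just the current primary role.
USE SECONDARY ROLES ALL;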

Questions 28

Company A has recently acquired company B. The Snowflake deployment for company B is located in the Azure West Europe region.

As part of the integration process, an Architect has been asked to consolidate company B's sales data into company A's Snowflake account, which is located in the AWS us-east-1 region.

How can this requirement be met?

Options:

A.

Replicate the sales data from company B's Snowflake account into company A's Snowflake account using cross-region data replication within Snowflake. Configure a direct share from company B's account to company A's account.

B.

Export the sales data from company B's Snowflake account as CSV files, and transfer the files to company A's Snowflake account. Import the data using Snowflake's data loading capabilities.

C.

Migrate company B's Snowflake deployment to the same region as company A's Snowflake deployment, ensuring data locality. Then perform a direct database-to-database merge of the sales data.

D.

Build a custom data pipeline using Azure Data Factory or a similar tool to extract the sales data from company B's Snowflake account. Transform the data, then load it into company A's Snowflake account.

Questions 29

Company A would like to share data in Snowflake with Company B. Company B is not on the same cloud platform as Company A.

What is required to allow data sharing between these two companies?

Options:

A.

Create a pipeline to write shared data to a cloud storage location in the target cloud provider.

B.

Ensure that all views are persisted, as views cannot be shared across cloud platforms.

C.

Set up data replication to the region and cloud platform where the consumer resides.

D.

Company A and Company B must agree to use a single cloud platform: Data sharing is only possible if the companies share the same cloud provider.

Questions 30

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

Options:

A.

Database

B.

Schema

C.

Table

D.

Stage

E.

Role

F.

Warehouse

Questions 31

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

Options:

A.

Shared databases are read-only.

B.

Shared databases must be refreshed in order for new data to be visible.

C.

Shared databases cannot be cloned.

D.

Shared databases are not supported by Time Travel.

E.

Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

F.

Shared databases can also be created as transient databases.

Questions 32

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

Options:

A.

The MERGE command

B.

The UPSERT command

C.

The CHANGES clause

D.

A STREAM object

E.

The CHANGE_DATA_CAPTURE command
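
For illustration, both a stream and the CHANGES clause read a table's change tracking metadata (table name and offset are assumptions):

-- Change tracking must be enabled on the table.
ALTER TABLE orders SET CHANGE_TRACKING = TRUE;
-- Query the change rows from the last 10 minutes without creating a stream.
SELECT * FROM orders CHANGES (INFORMATION => DEFAULT) AT (OFFSET => -60*10);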

Questions 33

An Architect uses COPY INTO with the ON_ERROR=SKIP_FILE option to bulk load CSV files into a table called TABLEA, using its table stage. One file named file5.csv fails to load. The Architect fixes the file and re-loads it to the stage with the exact same file name it had previously.

Which commands should the Architect use to load only the file5.csv file from the stage? (Choose two.)

Options:

A.

COPY INTO tablea FROM @%tablea RETURN_FAILED_ONLY = TRUE;

B.

COPY INTO tablea FROM @%tablea;

C.

COPY INTO tablea FROM @%tablea FILES = ('file5.csv');

D.

COPY INTO tablea FROM @%tablea FORCE = TRUE;

E.

COPY INTO tablea FROM @%tablea NEW_FILES_ONLY = TRUE;

F.

COPY INTO tablea FROM @%tablea MERGE = TRUE;

Questions 34

A company wants to integrate its main enterprise identity provider with Snowflake using federated authentication.

The authentication integration has been configured and roles have been created in Snowflake. However, users are not automatically appearing in Snowflake when created, and their group membership is not reflected in their assigned roles.

How can the missing functionality be enabled with the LEAST amount of operational overhead?

Options:

A.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users and roles.

B.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users, and the resource server must be configured with the right mapping of role assignment.

C.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, their groups will get created as group accounts in Snowflake and the proper roles can be granted.

D.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, users will automatically get created and their group membership will be reflected as roles in Snowflake.
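
A sketch of the SCIM setup described in option D, using Azure AD as an example identity provider (the role name follows Snowflake's documented convention):

-- Role the identity provider uses to provision users and groups.
CREATE ROLE IF NOT EXISTS AAD_PROVISIONER;
GRANT CREATE USER, CREATE ROLE ON ACCOUNT TO ROLE AAD_PROVISIONER;
CREATE SECURITY INTEGRATION aad_scim
  TYPE = SCIM
  SCIM_CLIENT = 'AZURE'
  RUN_AS_ROLE = 'AAD_PROVISIONER';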

Questions 35

A new user user_01 is created within Snowflake. The following two commands are executed:

Command 1 -> SHOW GRANTS TO USER user_01;

Command 2 -> SHOW GRANTS ON USER user_01;

What inferences can be made about these commands?

Options:

A.

Command 1 defines which user owns user_01

Command 2 defines all the grants which have been given to user_01

B.

Command 1 defines all the grants which are given to user_01

Command 2 defines which user owns user_01

C.

Command 1 defines which role owns user_01

Command 2 defines all the grants which have been given to user_01

D.

Command 1 defines all the grants which are given to user_01

Command 2 defines which role owns user_01

Questions 36

The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:

Step 1: Data files are loaded in a stage.

Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe by way of Amazon Simple Queue Service (SQS) that files are ready to load. Snowpipe copies the files into a queue.

Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.

If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?

Options:

A.

The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.

B.

The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.

C.

The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.

D.

The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.

Questions 37

The following DDL command was used to create a task based on a stream:

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

Options:

A.

The warehouse MY_WH will be made active every five minutes to check the stream.

B.

The warehouse MY_WH will only be active when there are results in the stream.

C.

The warehouse MY_WH will never suspend.

D.

The warehouse MY_WH will automatically resize to accommodate the size of the stream.

Questions 38

Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Select TWO).

(The architecture diagram and the five candidate statements A through E are shown as images.)

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

E.

Option E

Questions 39

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Questions 40

A new table and streams are created with the following commands:

CREATE OR REPLACE TABLE LETTERS (ID INT, LETTER STRING);

CREATE OR REPLACE STREAM STREAM_1 ON TABLE LETTERS;

CREATE OR REPLACE STREAM STREAM_2 ON TABLE LETTERS APPEND_ONLY = TRUE;

The following operations are processed on the newly created table:

INSERT INTO LETTERS VALUES (1, 'A');

INSERT INTO LETTERS VALUES (2, 'B');

INSERT INTO LETTERS VALUES (3, 'C');

TRUNCATE TABLE LETTERS;

INSERT INTO LETTERS VALUES (4, 'D');

INSERT INTO LETTERS VALUES (5, 'E');

INSERT INTO LETTERS VALUES (6, 'F');

DELETE FROM LETTERS WHERE ID = 6;

What would be the output of the following SQL commands, in order?

SELECT COUNT(*) FROM STREAM_1;

SELECT COUNT(*) FROM STREAM_2;

Options:

A.

2 & 6

B.

2 & 3

C.

4 & 3

D.

4 & 6
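
To reason about the two counts, the stream contents can be inspected directly, as in this sketch:

-- STREAM_1 (standard) shows the net change set, including delete records;
-- STREAM_2 (append-only) shows inserted rows only.
SELECT METADATA$ACTION, METADATA$ISUPDATE, ID, LETTER FROM STREAM_1;
SELECT METADATA$ACTION, METADATA$ISUPDATE, ID, LETTER FROM STREAM_2;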

Questions 41

A company has a table named Data that contains corrupted data. The company wants to recover the data as it was 5 minutes ago using cloning and Time Travel.

What command will accomplish this?

Options:

A.

CREATE CLONE TABLE Recover_Data FROM Data AT(OFFSET => -60*5);

B.

CREATE CLONE Recover_Data FROM Data AT(OFFSET => -60*5);

C.

CREATE TABLE Recover_Data CLONE Data AT(OFFSET => -60*5);

D.

CREATE TABLE Recover_Data CLONE Data AT(TIME => -60*5);

Questions 42

A user named USER_01 needs access to create a materialized view on the schema EDW.STG_SCHEMA. How can this access be provided?

Options:

A.

GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO USER USER_01;

B.

GRANT CREATE MATERIALIZED VIEW ON DATABASE EDW TO USER USER_01;

C.

GRANT ROLE NEW_ROLE TO USER USER_01;

GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO NEW_ROLE;

D.

GRANT ROLE NEW_ROLE TO USER_01;

GRANT CREATE MATERIALIZED VIEW ON EDW.STG_SCHEMA TO NEW_ROLE;

Questions 43

What actions are permitted when using the Snowflake SQL REST API? (Select TWO).

Options:

A.

The use of a GET command

B.

The use of a PUT command

C.

The use of a ROLLBACK command

D.

The use of a CALL command to a stored procedure which returns a table

E.

Submitting multiple SQL statements in a single call

Questions 44

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.

Call loadHistoryScan every minute for the maximum time range.

C.

Call insertReport every 8 minutes for a 10-minute time range.

D.

Call loadHistoryScan every 10 minutes for a 15-minute range.

Questions 45

An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

What should be considered when sharing the unstructured data within Snowflake?

Options:

A.

A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

B.

A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

C.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

D.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.
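
A sketch of serving scoped URLs through a secure view (stage and view names are assumptions):

-- Scoped URLs are encoded, expire 24 hours after generation, and are suited
-- to exposing staged files through secure views.
CREATE SECURE VIEW shared_docs AS
  SELECT relative_path,
         BUILD_SCOPED_FILE_URL(@docs_stage, relative_path) AS scoped_url
  FROM DIRECTORY(@docs_stage);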

Questions 46

Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

Options:

A.

Changing the name of the organization

B.

Creating an account

C.

Viewing a list of organization accounts

D.

Changing the name of an account

E.

Deleting an account

F.

Enabling the replication of a database
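
For reference, typical ORGADMIN statements look like this sketch (identifiers and values are assumptions):

USE ROLE ORGADMIN;
CREATE ACCOUNT analytics_eu
  ADMIN_NAME = admin_user
  ADMIN_PASSWORD = 'Str0ngPassw0rd!'
  EMAIL = 'admin@example.com'
  EDITION = ENTERPRISE;
SHOW ORGANIZATION ACCOUNTS;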

Questions 47

The following table exists in the production database:

A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data, while making sure it is applied when views are created using the table or when the table is cloned?

Options:

A.

Use a masking policy on the username column using an entitlement table with valid dates.

B.

Use a row level policy on the user_events table using an entitlement table with valid dates.

C.

Use a masking policy on the username column with event_timestamp as a conditional column.

D.

Use a secure view on the user_events table using a case statement on the username column.
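
A sketch of the conditional masking policy in option C (policy, table, and column names are assumptions):

CREATE MASKING POLICY mask_old_usernames
  AS (username STRING, event_timestamp TIMESTAMP_NTZ)
  RETURNS STRING ->
    CASE
      WHEN event_timestamp < DATEADD(month, -6, CURRENT_DATE()) THEN '*****'
      ELSE username
    END;
-- Policies attached to a column travel with clones of the table and apply
-- through any views built on it, with no data duplication.
ALTER TABLE user_events MODIFY COLUMN username
  SET MASKING POLICY mask_old_usernames USING (username, event_timestamp);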

Questions 48

When using the COPY INTO <table> command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.

The parameter will be ignored.

C.

The command will return an error.

D.

The command will return a warning stating that the file has unmatched columns.

The COPY INTO <table> command is used to load data from staged files into an existing table in Snowflake. The command supports various file formats, such as CSV, JSON, Avro, ORC, Parquet, and XML [1].

The MATCH_BY_COLUMN_NAME parameter is a copy option that enables loading semi-structured data into separate columns in the target table that match corresponding columns represented in the source data. The parameter can have one of the values CASE_SENSITIVE, CASE_INSENSITIVE, or NONE [2].

The parameter only applies to semi-structured data, such as JSON, Avro, ORC, Parquet, and XML; it does not apply to CSV data, which is considered structured data [2]. When the COPY INTO <table> command is used with the CSV file format, the MATCH_BY_COLUMN_NAME parameter is therefore ignored [2].

References:

[1]: COPY INTO <table> | Snowflake Documentation
[2]: MATCH_BY_COLUMN_NAME | Snowflake Documentation