Vendor: Snowflake
Certifications: Snowflake Certification
Exam Code: ARA-C01
Exam Name: SnowPro Advanced: Architect Certification Exam
Updated: Nov 09, 2024
Q&As: 65
Note: Product instant download. Please sign in and click My account to download your product.
The ARA-C01 Questions & Answers covers all the knowledge points of the real exam. We update our product frequently so our customers always have the latest version of the study material. We provide excellent 24/7 customer service, and our professional expert team stands behind the quality of our products. If you are still undecided about purchasing, please try our free demo.
Which Snowflake data modeling approach is designed for BI queries?
A. 3NF
B. Star schema
C. Data Vault
D. Snowflake schema
Correct Answer: B
Explanation: A star schema is the data modeling approach designed for BI queries. A star schema is a type of dimensional model that organizes data into fact tables and dimension tables. A fact table contains the measures or metrics of the business process, such as sales amount, order quantity, or profit margin. A dimension table contains the attributes or descriptors of the business process, such as product name, customer name, or order date. The schema is called a star schema because it resembles a star, with one fact table in the center and multiple dimension tables radiating from it. A star schema improves the performance and simplicity of BI queries by reducing the number of joins, providing fast access to aggregated data, and enabling intuitive query syntax. It also supports common types of analysis such as trend analysis, slice and dice, drill down, and roll up. References: Snowflake Documentation: Dimensional Modeling; Snowflake Documentation: Star Schema
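As an illustration, a minimal star schema might look like the following SQL sketch (all table and column names here are hypothetical, not from the exam):

```sql
-- Hypothetical star schema: one central fact table, two dimension tables.
CREATE TABLE dim_product (
    product_id   INTEGER PRIMARY KEY,
    product_name VARCHAR
);

CREATE TABLE dim_customer (
    customer_id   INTEGER PRIMARY KEY,
    customer_name VARCHAR
);

-- The fact table stores the measures plus foreign keys to the dimensions.
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    product_id   INTEGER REFERENCES dim_product (product_id),
    customer_id  INTEGER REFERENCES dim_customer (customer_id),
    order_date   DATE,
    sales_amount NUMBER(12, 2)
);

-- A typical BI query needs only single-hop joins from the fact table
-- out to each dimension, then aggregates the measures.
SELECT d.product_name,
       SUM(f.sales_amount) AS total_sales
FROM fact_sales f
JOIN dim_product d ON f.product_id = d.product_id
GROUP BY d.product_name;
```

Note the single-hop joins: this is what distinguishes a star schema from a snowflake schema (answer D), where dimensions are further normalized into sub-dimension tables and queries need additional joins.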
There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because that database is maintained by human resources personnel.
An Architect needs to create a read-only role for certain employees working in the human resources department.
Which permission sets must be granted to this role?
A. USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db
B. USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db
C. MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db
D. USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db
Correct Answer: A
To create a read-only role on the hr_db database, the role needs USAGE on the database itself, USAGE on all of its schemas, and SELECT on all of its tables. Option A is correct because it grants exactly this minimum permission set. Option B is incorrect because SELECT is not a valid privilege on schemas; schemas support privileges such as USAGE and the various CREATE privileges. Option C is incorrect because MODIFY grants the ability to alter the database, which exceeds read-only access, and USAGE is not a table privilege, so it is not sufficient for querying data; tables support SELECT, INSERT, UPDATE, DELETE, TRUNCATE, REFERENCES, and OWNERSHIP. Option D is incorrect because REFERENCES does not allow querying data; it only allows the role to create foreign key constraints that reference the tables. References: https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#database-privileges https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#schema-privileges https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#table-privileges
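The grants from option A can be expressed in Snowflake SQL along these lines (the role name hr_readonly is a hypothetical choice, not from the exam):

```sql
-- Hypothetical read-only role for the hr_db database.
CREATE ROLE hr_readonly;

-- Option A's minimum permission set.
GRANT USAGE  ON DATABASE hr_db                 TO ROLE hr_readonly;
GRANT USAGE  ON ALL SCHEMAS IN DATABASE hr_db  TO ROLE hr_readonly;
GRANT SELECT ON ALL TABLES  IN DATABASE hr_db  TO ROLE hr_readonly;

-- Optional: also cover schemas and tables created later,
-- so the role stays read-only as hr_db grows.
GRANT USAGE  ON FUTURE SCHEMAS IN DATABASE hr_db TO ROLE hr_readonly;
GRANT SELECT ON FUTURE TABLES  IN DATABASE hr_db TO ROLE hr_readonly;
```

ON ALL ... IN DATABASE applies the grant in bulk to existing objects, while ON FUTURE ... covers objects created afterward; without the FUTURE grants, new tables would be invisible to the role until re-granted.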
A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.
What is the MOST cost-effective way to bring this data into a Snowflake table?
A. An external table
B. A pipe
C. A stream
D. A copy command at regular intervals
Correct Answer: B
A pipe is a Snowflake object that continuously loads data from files in a stage (internal or external) into a table. A pipe can be configured to use auto-ingest, which means that Snowflake automatically detects new files in the stage and loads them into the table without any manual intervention. A pipe is the most cost-effective way to bring large numbers of small JSON files into a Snowflake table: Snowpipe uses Snowflake-managed serverless compute that is billed only for the actual load work, and it batches queued files into combined micro-batch loads, which reduces per-load overhead compared with issuing a separate COPY command for each batch of files.
An external table is a Snowflake object that references data files stored in an external location, such as Amazon S3, Google Cloud Storage, or Microsoft Azure Blob Storage. An external table does not store the data in Snowflake, but only provides a view of the data for querying. It is not a cost-effective way to bring this data into a Snowflake table, because every query must scan the external files, which requires additional network bandwidth and compute resources, and performance degrades with huge numbers of tiny files.
A stream is a Snowflake object that records the history of changes (inserts, updates, and deletes) made to a table. A stream can be used to consume the changes from a table and apply them to another table or a task. A stream is not a way to bring data into a Snowflake table, but a way to process the data after it is loaded into a table.
A COPY command loads data from files in a stage into a table, and can be executed manually or scheduled using a task. Running COPY at regular intervals is not cost-effective here because it requires a user-managed virtual warehouse that is billed each time it resumes, which is inefficient for a continuous trickle of very small files.
References: Pipes; Loading Data Using Snowpipe; External Tables; Streams; COPY INTO
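A minimal auto-ingest pipe for this scenario might look like the following sketch (the stage, table, pipe names, and the S3 URL are hypothetical):

```sql
-- Hypothetical external stage pointing at the cloud storage location
-- where the IoT devices deliver their JSON files.
CREATE STAGE iot_stage
  URL = 's3://iot-bucket/events/'
  FILE_FORMAT = (TYPE = 'JSON');

-- Target table: a single VARIANT column holds the raw JSON documents.
CREATE TABLE iot_events (payload VARIANT);

-- Auto-ingest pipe: Snowflake loads new files as cloud event
-- notifications arrive, with no manually scheduled COPY commands.
CREATE PIPE iot_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO iot_events
     FROM @iot_stage
     FILE_FORMAT = (TYPE = 'JSON');
```

On AWS, auto-ingest additionally requires configuring the bucket's event notifications to deliver to the pipe's notification channel (an SQS queue shown by DESC PIPE), so that Snowpipe learns about each newly arrived file.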