Welcome to Pass4Success


Snowflake Exam DEA-C01 Topic 5 Question 34 Discussion

Actual exam question for Snowflake's DEA-C01 exam
Question #: 34
Topic #: 5
[All DEA-C01 Questions]

A CSV file around 1 TB in size is generated daily on an on-premises server. A corresponding table, internal stage, and file format have already been created in Snowflake to facilitate the data loading process.

How can the process of bringing the CSV file into Snowflake be automated using the LEAST amount of operational overhead?

Suggested Answer: C

This option is the best way to automate bringing the CSV file into Snowflake with the least operational overhead. SnowSQL is a command-line client that can execute SQL statements and scripts against Snowflake. By scheduling a SQL file that executes a PUT command, the CSV file can be pushed from the on-premises server to the internal stage in Snowflake. Then, by creating a pipe that runs a COPY INTO statement referencing the internal stage, Snowpipe can load the file from the stage into the table when it is notified of a new file. This way, there is no need to manually start or monitor a virtual warehouse or task.
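As a rough sketch of the two pieces described above (file paths, stage, table, pipe, and file format names here are all hypothetical placeholders, not from the question itself):

```sql
-- load.sql, run on a schedule from the on-premises server,
-- e.g. via cron:  snowsql -f load.sql
-- Push the daily CSV to the internal stage (compressed in transit).
PUT file:///data/daily_export.csv @my_internal_stage AUTO_COMPRESS=TRUE;

-- One-time setup in Snowflake: a pipe whose COPY INTO statement
-- references the internal stage and the pre-created file format.
CREATE PIPE my_pipe AS
  COPY INTO my_table
  FROM @my_internal_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```

One caveat worth noting: Snowpipe's event-driven AUTO_INGEST option applies to external stages; for an internal stage, the pipe is typically told about new files by calling Snowpipe's REST API (the insertFiles endpoint) after the PUT completes, which can be scripted into the same scheduled job.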


Contribute your Thoughts:

Ellsworth
7 days ago
Option D looks interesting, but I'm not sure if it would be the 'least amount of operational overhead' as the question asks. Bypassing the internal stage might introduce some complexity.
upvoted 0 times
...
Lavonda
20 days ago
I see both points, but I think option B could also work well. Scheduling a SQL file to run using SnowSQL and then creating a task in Snowflake seems like a good approach too.
upvoted 0 times
...
Howard
21 days ago
I'd go with option B. Scheduling a SQL file to push the file to the internal stage and then running a copy into statement in a Snowflake task seems like a straightforward approach.
upvoted 0 times
Selma
11 days ago
Definitely, setting up the task in Snowflake to run after the file is pushed to the internal stage is a smart move.
upvoted 0 times
...
Tenesha
15 days ago
I agree, it seems like the most efficient way to automate the process with the least amount of operational overhead.
upvoted 0 times
...
Janine
16 days ago
Option B sounds like a good choice. It's a simple process to schedule the SQL file and run the copy into statement.
upvoted 0 times
...
...
Suzan
27 days ago
I disagree, I believe option C is the better choice. Using Snowpipe to automatically load the file seems more efficient and requires less manual intervention.
upvoted 0 times
...
Chaya
1 month ago
I think option A is the best choice. It seems like the most straightforward way to automate the process with the least amount of operational overhead.
upvoted 0 times
...
Tonette
1 month ago
Option C seems the most efficient way to automate the process. Snowpipe will take care of the loading when the file lands in the internal stage, reducing operational overhead.
upvoted 0 times
Kimbery
19 days ago
I agree, Snowpipe definitely simplifies the process and reduces the manual steps needed for loading the data into Snowflake.
upvoted 0 times
...
...
