
Salesforce Exam Data Architect Topic 2 Question 53 Discussion

Actual exam question for Salesforce's Data Architect exam
Question #: 53
Topic #: 2
[All Data Architect Questions]

Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:

1. Remove outdated information not required on a day-to-day basis.

2. Improve Salesforce performance.

Which solution should be used to meet these requirements?

Suggested Answer: A

Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements: outdated information is moved out of the org, and removing it improves Salesforce performance. The referenced article walks through a use case that combines Heroku Connect, Postgres, and Salesforce Connect to archive old data, free up storage in the org, and retain the option to unarchive the data if needed. It also explains how this approach improves Salesforce performance and supports data retention policies.
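The nightly migrate-and-purge pattern described above can be sketched in plain Python as a minimal, in-memory illustration. All names here (record shape, retention window, `archive_and_purge`) are illustrative assumptions; in a real org this logic would live in an Apex batch class or a Heroku Connect / ETL pipeline writing to external storage.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 365  # assumed retention policy; set per NTO's requirements

def archive_and_purge(records, archive_store, now=None):
    """Move records older than the retention window into archive_store
    (the 'identified location') and return only the live records,
    simulating the nightly purge of aged data from the org."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    live, aged = [], []
    for rec in records:
        (aged if rec["last_modified"] < cutoff else live).append(rec)
    archive_store.extend(aged)  # migrate aged data to the archive location
    return live                 # org retains only current records

# Example nightly run with two records, one stale and one current
org_data = [
    {"id": "001A", "last_modified": datetime(2020, 1, 1)},
    {"id": "001B", "last_modified": datetime.utcnow()},
]
archive = []
org_data = archive_and_purge(org_data, archive)
```

Scheduling this to run nightly (e.g. via `Schedulable` Apex or a cron-driven job) keeps the live dataset small, which is what drives the performance benefit.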


Contribute your Thoughts:

Sheridan
12 days ago
Option D? Really? A full copy sandbox just to archive data? That's overkill, not to mention the cost implications. A simple data archiving solution is the way to go.
upvoted 0 times
Margot
14 days ago
Haha, option C is a bit too much work. Who wants to create a formula field just to export a report? That's like solving a puzzle to archive some data.
upvoted 0 times
Callie
16 days ago
I agree with Adelina. Option A is the best choice here. Maintaining a separate location for the archived data and automating the process is key to improving Salesforce performance.
upvoted 0 times
Xochitl
3 days ago
I think using scheduled batch jobs to migrate and purge aged data is the most efficient solution.
upvoted 0 times
Davida
11 days ago
Option A is definitely the way to go. It's important to have a separate location for archived data.
upvoted 0 times
...
...
Adelina
27 days ago
Option A seems like the most efficient way to manage the data archiving process. Scheduled batch jobs to migrate and purge aged data is a straightforward solution.
upvoted 0 times
Nobuko
7 days ago
I agree, using scheduled batch jobs will definitely help improve Salesforce performance by removing outdated information.
upvoted 0 times
Chauncey
9 days ago
A) Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
upvoted 0 times
Margo
15 days ago
Option A seems like the most efficient way to manage the data archiving process. Scheduled batch jobs to migrate and purge aged data is a straightforward solution.
upvoted 0 times
...
...
Viva
1 month ago
I see your point, Corazon, but I still think option A is the most practical choice for Northern Trail Outfitters.
upvoted 0 times
Corazon
1 month ago
I disagree, I believe option C is more efficient as it uses a formula field to identify aged data and export it into SharePoint.
upvoted 0 times
Andra
2 months ago
I think option A is the best solution because it involves scheduled batch jobs to migrate and purge aged data.
upvoted 0 times
