
Salesforce Exam Data Architect Topic 1 Question 33 Discussion

Actual exam question for Salesforce's Data Architect exam
Question #: 33
Topic #: 1

Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:

1. Remove outdated information not required on a day-to-day basis.

2. Improve Salesforce performance.

Which solution should be used to meet these requirements?

A. Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.

B. Identify a location to store archived data, and move data to the location using a time-based workflow.

C. Use a formula field that identifies records reaching a defined age, and use a report export to move the data to SharePoint.

D. Use a full copy sandbox as a source of archived data.

Suggested Answer: A

Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements: aged records are removed from day-to-day use, and shrinking the data volume improves Salesforce performance. A well-documented version of this pattern uses Heroku Connect, Postgres, and Salesforce Connect to archive old data, free up space in the org, and still retain the option to unarchive the data if needed, all while satisfying data retention policies.

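The suggested answer's migrate-and-purge pattern is easy to sketch. Below is a minimal illustration using the simple-salesforce Python library: it copies aged, closed Cases out of the org and then bulk-deletes them, the same copy-first-then-purge shape a nightly batch job would follow. The object (Case), the two-year cutoff, the JSON-lines archive file (a stand-in for a Postgres or Heroku data store), and the placeholder credentials are all assumptions for illustration, not part of the question.

```python
"""Nightly migrate-and-purge sketch (the option A pattern).

Assumes the simple-salesforce library; the object, cutoff, storage target,
and credentials below are illustrative placeholders.
"""
import json
from datetime import datetime, timedelta, timezone

from simple_salesforce import Salesforce

ARCHIVE_FILE = "case_archive.jsonl"  # stand-in for Postgres/Heroku storage
CUTOFF_DAYS = 730                    # records older than ~2 years are "aged"


def run_nightly_archive(sf: Salesforce) -> int:
    cutoff = datetime.now(timezone.utc) - timedelta(days=CUTOFF_DAYS)
    cutoff_literal = cutoff.strftime("%Y-%m-%dT%H:%M:%SZ")

    # 1. Identify aged records that are not needed day to day.
    soql = (
        "SELECT Id, CaseNumber, Subject, Status, ClosedDate "
        f"FROM Case WHERE IsClosed = true AND ClosedDate < {cutoff_literal}"
    )
    records = sf.query_all(soql)["records"]
    if not records:
        return 0

    # 2. Migrate: append the records to the archive store *before* deleting,
    #    so nothing is purged until it has been safely copied.
    with open(ARCHIVE_FILE, "a", encoding="utf-8") as fh:
        for rec in records:
            rec.pop("attributes", None)  # drop Salesforce response metadata
            fh.write(json.dumps(rec) + "\n")

    # 3. Purge: bulk-delete the archived records to free org storage.
    sf.bulk.Case.delete([{"Id": rec["Id"]} for rec in records])
    return len(records)


if __name__ == "__main__":
    # Placeholder credentials -- replace with real values or an auth helper.
    sf = Salesforce(username="user@example.com", password="...",
                    security_token="...")
    print(f"Archived and purged {run_nightly_archive(sf)} records")
```

Run nightly (cron, Heroku Scheduler, or an Apex Database.Batchable job inside the org), this copy-first-then-purge loop is what keeps the org lean while the archived data stays reachable.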

Contribute your Thoughts:

Margo
6 months ago
Option B seems pretty straightforward - identify an archive location and use time-based workflows to move the data. That could work, but Option A with the scheduled batch jobs sounds a bit more robust to me.
upvoted 0 times
Claudia
6 months ago
Hah, a full copy sandbox as an archive? Option D is a bit overkill, don't you think? Might as well just print out all the data and store it in the janitor's closet.
upvoted 0 times
Diane
5 months ago
A) Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
upvoted 0 times
Estrella
6 months ago
I personally think C) is the best solution. Using a formula field to identify records reaching a defined age can help with effective data management.
upvoted 0 times
Gerald
6 months ago
I'm leaning towards answer A too. Using scheduled batch jobs seems like a more efficient way to manage archived data.
upvoted 0 times
Chara
6 months ago
I'm not sure about Option C. Relying on a formula field and exporting to SharePoint? Sounds a bit convoluted. Why not just use a dedicated archiving solution like Option A or B?
upvoted 0 times
Paris
5 months ago
Yeah, I think having a dedicated archiving solution like Option A or B would be more efficient for Northern Trail Outfitters.
upvoted 0 times
Sophia
6 months ago
Option C does seem a bit complicated. I agree that Option A or B might be more straightforward.
upvoted 0 times
Bulah
6 months ago
I agree, Option A sounds like the best choice for implementing an archive solution for Salesforce data.
upvoted 0 times
Emerson
6 months ago
Yeah, using a dedicated archiving solution like Option A or B would probably be the best way to go.
upvoted 0 times
Narcisa
6 months ago
Option C might be more complicated than necessary. Option A seems like a more straightforward solution.
upvoted 0 times
Margurite
6 months ago
I think having scheduled batch jobs to migrate and purge data nightly, like in Option A, would be more efficient.
upvoted 0 times
Lavera
7 months ago
Option A seems like the most comprehensive solution. Scheduled batch jobs to migrate and purge aged data - that's exactly what NTO needs to keep their Salesforce tidy and running smoothly.
upvoted 0 times
Lonna
6 months ago
I agree, Option A sounds like the best way to handle archiving data in Salesforce.
upvoted 0 times
Valentin
7 months ago
I disagree, I believe the correct answer is B) Identify a location to store archived data, and move data to the location using a time-based workflow.
upvoted 0 times
Pamella
7 months ago
I think the answer is A) Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
upvoted 0 times
Lauran
8 months ago
Agreed, I think option A is the way to go here. It's a straightforward, well-established approach to archiving data, and it should help us meet the key requirements.
upvoted 0 times
Buddy
8 months ago
Option D is the one that's got me scratching my head. Using a full copy sandbox as a source for archived data? That doesn't sound like the most efficient or practical solution to me.
upvoted 0 times
Marguerita
8 months ago
Option C is an interesting idea, but I'm not sure a formula field and a report export is the best way to go about this. It seems a bit of a workaround, and we'd have to maintain that formula field and report on an ongoing basis.
upvoted 0 times
Gertude
8 months ago
I'm not too sure about option B. Moving data to a location using a time-based workflow seems a bit more complex than the batch job approach. Plus, we'd still need to figure out where to store the archived data.
upvoted 0 times
Tu
7 months ago
Option C might be simpler, but I feel like the batch job approach is more stable for archiving.
upvoted 0 times
Margery
7 months ago
But wouldn't option C be simpler with just a formula field and report export?
upvoted 0 times
Josephine
7 months ago
I agree, using batch jobs for migration and purge seems efficient.
upvoted 0 times
Tu
7 months ago
I think option A sounds like a better choice. It's more straightforward with scheduled batch jobs.
upvoted 0 times
Danica
8 months ago
Okay, let's go through the options. Option A seems like a good fit - we can identify a location to store the archived data and use scheduled batch jobs to migrate and purge the aged data. That way, we're keeping the data but removing it from the main Salesforce instance.
upvoted 0 times
Nelida
8 months ago
Hmm, this question seems pretty straightforward. We need to implement an archive solution for Salesforce data, and the key requirements are to remove outdated information and improve Salesforce performance. Let's see what the options are.
upvoted 0 times