Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:
1. Remove outdated information not required on a day-to-day basis.
2. Improve Salesforce performance.
Which solution should be used to meet these requirements?
Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements. A common pattern uses Heroku Connect, Postgres, and Salesforce Connect: aged records are copied to Postgres, purged from the org to free storage and improve Salesforce performance, and can still be surfaced through Salesforce Connect or unarchived back into the org if needed. This approach also helps satisfy data retention policies.
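As a minimal sketch of the nightly migrate-and-purge step, the following assumes the Python simple_salesforce and psycopg2 libraries; the Case object, the 24-month cutoff, and the Postgres table and column names are all illustrative, not part of the question.

```python
from datetime import date, timedelta

import psycopg2
from simple_salesforce import Salesforce

# Hypothetical connection details -- replace with real credentials and endpoints.
sf = Salesforce(username="user@nto.example", password="***", security_token="***")
pg = psycopg2.connect("dbname=archive user=archiver")

cutoff = (date.today() - timedelta(days=730)).isoformat()  # illustrative ~24-month retention

# 1. Pull aged records with an asynchronous bulk query, suited to large volumes.
aged = sf.bulk.Case.query(
    f"SELECT Id, Subject, Status, ClosedDate FROM Case "
    f"WHERE IsClosed = TRUE AND ClosedDate < {cutoff}T00:00:00Z"
)

# 2. Copy them into the external archive store (Postgres in this example).
with pg, pg.cursor() as cur:
    cur.executemany(
        "INSERT INTO case_archive (sf_id, subject, status, closed_date) "
        "VALUES (%s, %s, %s, %s)",
        [(r["Id"], r["Subject"], r["Status"], r["ClosedDate"]) for r in aged],
    )

# 3. Purge the archived records from Salesforce to free storage.
sf.bulk.Case.delete([{"Id": r["Id"]} for r in aged])
```

In practice this logic would run as a scheduled job (for example, a nightly cron task or a scheduled Apex batch inside the org), and the archived rows could be exposed back into Salesforce as external objects via Salesforce Connect.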
A national nonprofit organization is using Salesforce to recruit members. The recruitment process requires a member to be matched with a volunteer opportunity. Given the following:
1. A record is created in Project__c and used to track the project through completion.
2. The member may then start volunteering and is required to track their volunteer hours, which are stored in the VTOTime__c object related to the project.
3. The ability to view or edit the VTOTime__c object needs to be the same as the Project__c record.
4. Managers must see total hours volunteered while viewing the Project__c record.
Which data relationship should the data architect use to support this requirement when creating the custom VTOTime__c object?
A master-detail field on VTOTime__c pointing to Project__c is the relationship the data architect should use when creating the custom VTOTime__c object. A master-detail relationship creates a parent-child link in which the master record controls key behaviors of the detail record, such as sharing and visibility, ownership, deletion, and roll-up summary fields. With this relationship, access to a VTOTime__c record automatically mirrors access to its parent Project__c record, and managers can see the total hours volunteered on the Project__c record through a roll-up summary field.
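For illustration, the same total that a roll-up summary field would maintain declaratively on Project__c can be checked with an aggregate SOQL query. This is a minimal sketch assuming the simple_salesforce library and a hypothetical numeric Hours__c field on VTOTime__c.

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="user@nonprofit.example", password="***", security_token="***")

# Aggregate query: sums the hypothetical Hours__c field per parent project.
# A roll-up summary field on Project__c keeps the same figure up to date declaratively.
result = sf.query(
    "SELECT Project__c, SUM(Hours__c) totalHours "
    "FROM VTOTime__c GROUP BY Project__c"
)

for row in result["records"]:
    print(row["Project__c"], row["totalHours"])
```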
Universal Containers has been a customer of Salesforce for 10 years. Currently they have 2 million accounts in the system. Due to an erroneous integration built 3 years ago, it is estimated there are 500,000 duplicates in the system.
Which solution should a data architect recommend to remediate the duplication issue?
Implementing duplicate rules (option D) is the best way to remediate the duplication issue: duplicate rules let the data architect identify and merge duplicate accounts using native Salesforce features and tools, and they prevent new duplicates from being created going forward. Developing an ETL process that uses the merge API (option A) requires additional coding and testing effort and does nothing to stop new duplicates. Using a data warehouse as the system of truth (option B) adds complexity and cost without addressing the duplicates already in Salesforce. Extracting the data with Data Loader and merging records in Excel (option C) is time-consuming and error-prone, and it likewise does not prevent future duplicates.
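Before enabling duplicate and matching rules it can help to gauge the scale of the problem. The sketch below only counts candidate duplicate groups and merges nothing; the actual cleanup would still go through Salesforce's native duplicate management. It assumes the simple_salesforce library, and the exact-match key on Name plus BillingCity is purely illustrative.

```python
from collections import defaultdict

from simple_salesforce import Salesforce

sf = Salesforce(username="user@uc.example", password="***", security_token="***")

# Bulk-export the account fields used as an illustrative matching key.
accounts = sf.bulk.Account.query("SELECT Id, Name, BillingCity FROM Account")

groups = defaultdict(list)
for acc in accounts:
    key = (
        (acc.get("Name") or "").strip().lower(),
        (acc.get("BillingCity") or "").strip().lower(),
    )
    groups[key].append(acc["Id"])

dupe_groups = {k: ids for k, ids in groups.items() if len(ids) > 1}
print(f"{len(dupe_groups)} candidate duplicate groups, "
      f"{sum(len(ids) - 1 for ids in dupe_groups.values())} surplus records")
```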
Which API should a data architect use to export 1 million records from Salesforce?
Using Bulk API to export 1 million records from Salesforce is the best option. Bulk API is a RESTful API that allows you to perform asynchronous operations on large sets of data. You can use Bulk API to create, update, delete, or query millions of records in batches. Bulk API is optimized for performance and scalability, and it can handle complex data loading scenarios.
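As a minimal sketch of such an export, assuming the simple_salesforce Python library (whose bulk handler wraps the Bulk API's asynchronous job and batch lifecycle) and writing the result to a local CSV file:

```python
import csv

from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

# The bulk handler submits an asynchronous query job and collects the batches,
# which is the appropriate path for exports on the order of a million records.
records = sf.bulk.Account.query("SELECT Id, Name, CreatedDate FROM Account")

with open("accounts_export.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["Id", "Name", "CreatedDate"])
    writer.writeheader()
    for rec in records:
        writer.writerow({k: rec.get(k) for k in ("Id", "Name", "CreatedDate")})
```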