A client receives multiple feeds from third parties on the same SFTP location:
* Product prices (sftp: prod/prices)
* Stores information (sftp: prod/stores)
* Product information (sftp: prod/catalog)
* Categories information (sftp: prod/marketing)
* Content (sftp: prod/marketing)
Some of the feeds are placed on the SFTP location multiple times a day, as the information is updated in the source system.
The Architect decides to have only two jobs:
* One that checks and downloads available feeds every hour
* One that imports the files from WebDAV once a day, before the data replication, using the standard steps available in the Job Framework
Which design is correct for the import Job, taking the scope of the steps into consideration?
This design maximizes efficiency and concurrency. Running the steps that import products, stores, prices, and content in parallel lets the job process multiple independent feeds simultaneously, reducing total processing time. Executing the categories import and then the reindex sequentially afterwards ensures that all new and updated information is properly indexed and available to the site only after the more frequently updated data has finished importing. This order respects the dependencies between steps and follows best practices for handling complex data workflows in B2C Commerce.
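To make the ordering concrete, below is a minimal TypeScript sketch of the intended flow. The step names and the `runImportJob` orchestration are illustrative assumptions, not actual B2C Commerce Job Framework APIs; in Business Manager the same structure would be expressed as parallel flows within the import job followed by sequential steps.

```typescript
// Conceptual sketch only: step names and runImportJob are illustrative
// assumptions, not real B2C Commerce Job Framework APIs. Each step stands
// in for a standard import step configured in the job flow.

type JobStep = () => Promise<void>;

// Placeholders; in the real job these are standard import steps reading
// the files previously downloaded from SFTP to WebDAV.
const importProducts: JobStep = async () => { /* product catalog import */ };
const importStores: JobStep = async () => { /* stores import */ };
const importPrices: JobStep = async () => { /* price books import */ };
const importContent: JobStep = async () => { /* content library import */ };
const importCategories: JobStep = async () => { /* categories import */ };
const reindex: JobStep = async () => { /* search index rebuild */ };

async function runImportJob(): Promise<void> {
  // Parallel flows: independent feeds are imported concurrently.
  await Promise.all([
    importProducts(),
    importStores(),
    importPrices(),
    importContent(),
  ]);

  // Sequential tail: categories depend on the imported catalog, and the
  // reindex must run only after every import has finished.
  await importCategories();
  await reindex();
}

runImportJob().then(() => console.log("Import job finished"));
```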