Welcome to Pass4Success


Veeam Exam VMCE_v12 Topic 5 Question 25 Discussion

Actual exam question for Veeam's VMCE_v12 exam
Question #: 25
Topic #: 5
[All VMCE_v12 Questions]

An engineer is using Veeam Backup & Replication v12.

The only backup repository is a Microsoft Windows server with a direct-attached Fibre Channel storage array.

The engineer realizes that none of their backups are immutable. A second copy of the backups, on different media at a different site, is required.

Which option should be used to provide immutable backups on a secondary site with different media?

Suggested Answer: B

To provide immutable backups on a secondary site with different media, the best option given the context is B: create a Scale-Out Backup Repository (SOBR) with the existing Microsoft Windows server as the performance tier and an AWS S3 bucket with immutability enabled as the capacity tier.

This approach leverages the existing backup infrastructure (the Microsoft Windows server with direct-attached storage) as the performance tier of the SOBR, where the most recent backups are kept for fast access. For immutability and longer-term retention, backups are offloaded to an AWS S3 bucket configured with Object Lock. Object Lock makes the offloaded backup data immutable: it cannot be deleted or modified for the specified retention period, which protects it against accidental deletion, ransomware, and other malicious activity.

By implementing this configuration, the engineer achieves the required level of data protection and immutability, using cloud storage as a secure and scalable secondary backup location on media distinct from the primary on-premises storage. Note that option C is not viable as written: Veeam's hardened repository feature requires a Linux server, so it cannot be created on a new Microsoft Windows server.
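As an illustration of the capacity-tier prerequisite described above (the bucket name and region below are hypothetical placeholders, not part of the question), an S3 bucket intended for use as an immutable Veeam capacity tier must be created with Object Lock enabled from the start, since Object Lock cannot be switched on for an existing bucket:

```shell
# Create the bucket with Object Lock enabled. Enabling Object Lock
# automatically turns on S3 versioning as well. Bucket name and
# region are example values only.
aws s3api create-bucket \
  --bucket veeam-capacity-tier-example \
  --region us-east-1 \
  --object-lock-enabled-for-bucket

# Verify that Object Lock is active on the new bucket.
aws s3api get-object-lock-configuration \
  --bucket veeam-capacity-tier-example
```

When this bucket is then added to Veeam as an object storage repository, selecting the "Make recent backups immutable" option lets Veeam apply a Compliance-mode retain-until date to each offloaded object itself, so no default retention rule needs to be set on the bucket.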


Contribute your Thoughts:

Fernanda
2 months ago
Haha, these backup repository options sound like they're straight out of a tech-savvy superhero's playbook! I just hope the engineer isn't trying to back up their secret identity.
upvoted 0 times
Cyndy
1 month ago
D) Create a Scale Out Backup Repository with the existing Microsoft Windows Server as the performance tier and Google Cloud Object Storage with immutability enabled as the capacity tier.
upvoted 0 times
William
1 month ago
C) Create a new hardened repository on a new Microsoft Windows Server, mark it as immutable and create a backup copy job on it.
upvoted 0 times
Adelaide
2 months ago
B) Create a Scale Out Backup Repository with the existing Microsoft Windows Server as the performance tier and AWS S3 bucket with immutability enabled as the capacity tier.
upvoted 0 times
Amira
2 months ago
A) Create a Scale Out Backup Repository with the existing Microsoft Windows Server as the performance tier and an HPE StoreOnce Catalyst share with immutability enabled as the capacity tier.
upvoted 0 times
Kindra
2 months ago
Hmm, Option D is interesting with the Google Cloud Object Storage, but I'm not sure if that's as widely adopted as AWS S3. I'd probably stick with the more common cloud provider.
upvoted 0 times
Thaddeus
2 months ago
I'd go with Option C. Creating a new hardened repository and marking it as immutable is a straightforward way to get the required secondary copy with immutability.
upvoted 0 times
Florinda
2 months ago
Yeah, Option C seems like the most direct way to meet the requirement for a secondary copy with immutability.
upvoted 0 times
Lashawnda
2 months ago
I agree, creating a new repository and marking it as immutable seems like a simple solution.
upvoted 0 times
Celestina
2 months ago
Option C sounds like the best choice for ensuring immutable backups on a secondary site.
upvoted 0 times
Celeste
3 months ago
I'm not sure, but option C also seems like a viable option. Creating a new hardened repository on a new server with immutability enabled could work too.
upvoted 0 times
Tori
3 months ago
I agree with Franklyn. Using an HPE StoreOnce Catalyst share with immutability enabled as the capacity tier sounds like a secure solution.
upvoted 0 times
Lezlie
3 months ago
Option B seems like the best choice here. Using AWS S3 with immutability enabled provides a secondary site and different media, which is exactly what the question is asking for.
upvoted 0 times
Luisa
2 months ago
That's a good point, AWS S3 does have a strong reputation for data protection.
upvoted 0 times
Breana
2 months ago
I think option B is better because AWS S3 is known for its durability and immutability features.
upvoted 0 times
Maryann
2 months ago
I agree, option B with AWS S3 bucket seems like the most suitable choice.
upvoted 0 times
Lazaro
2 months ago
But what about option A with the HPE StoreOnce Catalyst share for immutability?
upvoted 0 times
Franklyn
3 months ago
I think option A is the best choice because it provides immutability on a secondary site with a different media.
upvoted 0 times
