Hi, I have the same error (the maximum-reference error, from the last reply of suprnova) during a Copy job merge, after 2 months of backup chain (7 days, 4 weeks, 3 months, of 5 VMs in one backup job). This is the second time; the first time I opened a case, but with no solution:

1. Remove the configuration of the copy job and import it again, creating a new backup repository in a new location. Impossible in my case: I have only one volume. I tried within the same volume and the error remained.
2. A volume defrag, but the defrag activity hung after a few minutes with no disk or CPU activity.
3. Refer to Windows error code 0xC000048C (STATUS_BLOCK_TOO_MANY_REFERENCES), but there is no documentation.

I closed the ticket because I read (I don't remember where, sorry) to use ReFS with Storage Spaces as the backup repository for Veeam. I tried it for some days, but it's too slow compared with hardware (HP Smart Array) management of my backup storage (JBOD). Now I have the same problem again. What do you suggest? I'm sure it's a ReFS-related problem, because it appeared with this change in my infrastructure. Before going back to NTFS, do you know any trick to solve this problem?

2x ReFS repositories, 10TB and 20TB, both formatted with a 64K cluster, added to a virtual Veeam B&R 6.5 Update 2 server as a proxy. I have been battling this issue for a couple of weeks, and it took me this long to narrow down the cause. I even reimaged my physical server, thinking Windows was corrupt. I have one job with all my VMs, where most are very small. I ordered the VMs to back up smallest first, and they all complete successfully. Performance is great, but once I get to the large VM, the RAM is slowly consumed until full. Then the physical server locks up due to a lack of resources. The job fails and the RAM is left consumed. I can sometimes remote in and reboot the server to reclaim the RAM. Windows recently released a fix for ReFS memory usage. Unfortunately, when I attempt to install this fix, it says it is not applicable to my computer. I am leaning more towards formatting my repositories with NTFS and calling it a day.

Sitruk wrote: I am leaning more towards formatting my repositories with NTFS and calling it a day.

I've had an MS case open about the issue for months now. I got a bunch of manually initiated (when the server locks up due to resource consumption) memory dumps sent to them, and then after that point I had to reload with something that I could rely on, since this wasn't a test server. It's been about a month, I think, since I got the memory dumps sent to MS. I got a reply after a few weeks that they had analyzed them, saying that they thought they knew what was going on and that they wanted to do testing on our server. I had to point out that this is a production server and that Microsoft needs to do their testing internally. I asked what the cause was, though, because we have other ReFS + 2016 + Veeam servers (though those have more RAM and have been more stable), but I haven't gotten a reply after asking a few times. So, if you can deal with the backup space inflation of NTFS, then yes, I think that's probably wise. They may fix the issue "soon", but I personally wouldn't want to bet on it.

Graham8 wrote: Well, if you look over at veeam-backup-replication-f2/refs-4k-hor.40629.html you'll find that plenty have had this issue with the default of 4K. Apparently, from comments, 64K has been an issue as well - just much less frequently.

What you're describing is definitely what I was seeing with 4K. YMMV, but in our case, even with all the options enabled at their strictest settings, the problem still occurred. My largest job is almost 12TB, but the RAM issue occurs on a 1TB backup. I think that update isn't applicable because it was superseded by another one. However, the registry keys mentioned in the KB are not present on my server. Were you able to try changing those REG keys as per the KB? Also, do you know if the ReFS storage needs to be configured within Storage Spaces for it to function properly?
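The "REG keys as per the KB" exchange above most likely refers to Microsoft's ReFS memory-tuning values for Windows Server 2016. As a hedged sketch only — the thread never names the KB, so the value names and data below are assumptions drawn from that general guidance — checking and creating them would look roughly like this in an elevated Command Prompt (reboot afterwards):

```shell
:: ASSUMED value names, taken from Microsoft's Server 2016 ReFS memory-tuning
:: guidance; they are absent by default, which matches "the registry keys
:: mentioned in the KB are not present on my server".
reg query "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v RefsEnableLargeWorkingSetTrim

:: Enable aggressive trimming of the ReFS metadata working set (DWORD 1),
:: and raise how many chunks are trimmed per pass (8 is an example value).
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v RefsEnableLargeWorkingSetTrim /t REG_DWORD /d 1 /f
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v RefsNumberOfChunksToTrim /t REG_DWORD /d 8 /f
```

Note the thread's own caveat: posters report the problem persisted even with these options at their strictest settings, so verify the names and values against current Microsoft documentation before relying on them.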
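Since the discussion keeps circling cluster size (the 4K default versus 64K) and a fallback to NTFS, here is a minimal sketch of checking a repository volume's allocation unit and reformatting it. `E:` is a placeholder drive letter, not from the thread, and `format` destroys everything on the volume:

```shell
:: Show the file system and allocation unit size of volume E:
:: (AllocationUnitSize is reported in bytes, e.g. 65536 for 64K).
powershell -Command "Get-Volume -DriveLetter E | Format-List FileSystem, AllocationUnitSize"

:: Reformat as NTFS with a 64K allocation unit, the fallback discussed above.
:: WARNING: this erases all data on E:.
format E: /FS:NTFS /A:64K /Q
```

The trade-off the thread names is "backup space inflation" on NTFS: without ReFS block cloning, synthetic full backups consume real space instead of referencing existing blocks.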