Tuesday, December 10, 2013

Formulas to memorize

BDP (Bandwidth Delay Product):
  • Bandwidth (in bytes/second) * latency (in seconds)
  • Latency is the same as RTT (Round Trip Time)
  • EX:  Bandwidth = 45 Mb/sec (DS3), RTT = 42ms
  • 45,000,000 bits/1 second * 1 byte/8 bits * .042 seconds = 236,250 bytes / 1024 ≈ 230 KB
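A quick sketch of that arithmetic in Python (the function and variable names are mine, just for illustration):

    # Bandwidth Delay Product: (bandwidth in bits/sec ÷ 8) × RTT in seconds
    def bdp_bytes(bandwidth_bps, rtt_seconds):
        return (bandwidth_bps / 8) * rtt_seconds

    ds3 = bdp_bytes(45_000_000, 0.042)                 # DS3 at 45 Mb/sec, 42 ms RTT
    print(f"{ds3:,.0f} bytes = {ds3 / 1024:.1f} KB")   # 236,250 bytes = 230.7 KB
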
f_cache.dat Sizing:
  • #_million_files * 44 MB; f_cache.dat should be no more than 1/8 of RAM
  • EX:  1 million files * 44 MB = 44 MB f_cache.dat; minimum RAM = 44 * 8 = 352 MB
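A minimal sketch of the same rule of thumb (names are illustrative):

    # f_cache.dat: ~44 MB per million files; should be no more than 1/8 of RAM
    def f_cache_sizing_mb(million_files):
        f_cache_mb = million_files * 44
        return f_cache_mb, f_cache_mb * 8       # (cache size, minimum RAM)

    print(f_cache_sizing_mb(1))                 # (44, 352): 44 MB cache, 352 MB RAM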

p_cache.dat Sizing:
  • 1 MB per 1 GB of database file; p_cache.dat should be no more than 1/16 of RAM
  • EX:  250 GB of DB data means the hash cache must be 250 MB; minimum RAM = 250 * 16 = 4,000 MB (~4 GB)
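And the matching sketch for p_cache.dat (again, the function name is only illustrative):

    # p_cache.dat: 1 MB per GB of database file; should be no more than 1/16 of RAM
    def p_cache_sizing_mb(db_size_gb):
        p_cache_mb = db_size_gb * 1
        return p_cache_mb, p_cache_mb * 16      # (cache size, minimum RAM)

    print(p_cache_sizing_mb(250))               # (250, 4000): 250 MB cache, ~4 GB RAM
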
NetWorker Catalog Sizing:
  • Catalog Size = (n + (i*d))*c*160*1.5
    • n = number of files to back up
    • d = days between full backups (i.e., the number of incrementals per cycle)
    • i = incremental data change rate
      • i = n*(% data change, expressed as a decimal)
    • c = number of backup cycles retained
    • 160 = average estimated size of each catalog entry, in bytes
    • 1.5 = multiplier for growth
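Here is the formula as a small Python sketch; the example numbers below are made up for illustration, not from these notes:

    # NetWorker catalog sizing: (n + i*d) * c * 160 * 1.5
    def catalog_size_bytes(n, d, change_rate, c):
        i = n * change_rate                     # files changed per incremental
        return (n + i * d) * c * 160 * 1.5

    # e.g. 1M files, 6 incrementals between fulls, 10% change rate, 4 cycles
    size = catalog_size_bytes(1_000_000, 6, 0.10, 4)
    print(f"{size / 1024**3:.1f} GiB")          # ≈ 1.4 GiB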
     

Data Deduplication Sizing:

  • The EBSS gathers the total backup size. Round down to the nearest TB.
  • For the Data Domain sizing in the EBSS, select Avamar as the backup provider, then size the same way as with NetWorker.
  • Non-Exchange environments: Total Backup Environment Size divided by 10,000.
  • (Leave the dedupe values at 0 and set retention to the same length determined in the DD sizing.)
  • Exchange environments: Total Environment Size multiplied by 1.33%.
  • (Leave the dedupe values at 0 and retention the same as in the DD sizing.)
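A hedged sketch of those two rules of thumb (my own function name; the result is in whatever units you feed in):

    # Dedupe sizing rules of thumb from the notes above
    def dedupe_sizing(total_backup_size, exchange=False):
        if exchange:
            return total_backup_size * 0.0133     # Exchange: multiply by 1.33%
        return total_backup_size / 10_000         # non-Exchange: divide by 10,000

    print(dedupe_sizing(100_000))                 # 100,000 GB -> 10.0
    print(dedupe_sizing(100_000, exchange=True))  # -> ~1,330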


Avamar - Data Domain Compression rates
50% = 2x
80% = 5x
90% = 10x
95% = 20x
96% = 25x
98% = 50x
99% = 100x
99.7% = 333x
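Those pairs all follow factor = 1 / (1 - reduction), which is easy to verify:

    # Convert a % data-reduction figure to an effective compression factor
    for pct in (50, 80, 90, 95, 96, 98, 99, 99.7):
        print(f"{pct}% -> {1 / (1 - pct / 100):.0f}x")   # 2x, 5x, 10x, ... 333x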

Burn Rate:

  • ((incrementals + full) * retention)
  • Example:  a 1 TB backup that, after compression, consumes 200 GB of physical disk; based on the daily change rate we determine that subsequent incremental backups will be about 100 GB each and consume 20 GB on disk. Our burn rate after a week would be 200 + (7*20) = 340 GB. If our retention period is 4 weeks, the total burn rate is 4 * 340 = 1,360 GB.
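The same worked example in code (assuming one full plus seven incrementals per weekly cycle, as in the text):

    # Burn rate: ((incrementals + full) * retention), all sizes on physical disk
    def burn_rate_gb(full_gb, incr_gb, incrementals_per_cycle, retention_cycles):
        per_cycle = full_gb + incr_gb * incrementals_per_cycle
        return per_cycle * retention_cycles

    print(burn_rate_gb(200, 20, 7, 4))          # (200 + 7*20) * 4 = 1360 GB
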
Throughput:
  •  Largest backup/Backup time window = Throughput required
  • Ex:  6500GB/10 hrs = 650GB/hour
Performance Buffer:
  •  Max throughput/Max capacity = % of max capacity (Performance buffer)
  • Ex:  1.2 TB per hour / 2.3 TB per hour = 52% of maximum, comfortably below the desired 75-85% ceiling
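Both checks in one short sketch (numbers copied from the examples above):

    # Throughput required and performance buffer
    largest_backup_gb, window_hours = 6500, 10
    required_gb_per_hr = largest_backup_gb / window_hours      # 650.0 GB/hour

    max_throughput_tb_hr, max_capacity_tb_hr = 1.2, 2.3
    buffer_pct = max_throughput_tb_hr / max_capacity_tb_hr * 100
    print(required_gb_per_hr, round(buffer_pct))                # 650.0 52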

4 comments:

  1. Many thanks for this note, Greg. It helped me a lot to prepare for the E20-329 exam. Can you please share the .pdf study guide for this exam? I don't have access to VILT training :(. Please e-mail asazami611@gmail.com

  2. Thanks for the kind words. Unfortunately I don't have any resources I'm able to share for the exam that would not be in violation of copyright or EULA. Best of luck to you, and let me know how your exam goes.

    --GR

  3. Hi Greg, I've just passed the E20-329 :P. Many thanks for your blog posts; they helped me a lot on this exam. Now I'm studying for the last one, E20-891. Do you have any experience with this exam? :D

  4. Congratulations! That is quite an accomplishment! I have not taken the E20-891 yet. Let me know how it goes for you, though.
