So take even a modest 12 sticks of RAM running under high load. You are talking about the difference between 36 watts and 240 freaking watts, JUST FOR RAM.
The simple fact is that some people don't realize that when they spin up a DDR2-era server, even if they don't use it much, they are adding at minimum $250 to $500 or more per year to their electric bill if they leave it on most of the time.
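Quick back-of-the-envelope version of that, taking the ~204 W difference from the RAM numbers above, 24/7 uptime, and an assumed residential rate somewhere between $0.14 and $0.28 per kWh (the rate is my assumption, your bill will vary):

```python
# Yearly cost of the extra RAM power draw if the box runs 24/7.
# The 240 W vs 36 W figures are from the post; the $/kWh range is assumed.
extra_watts = 240 - 36                  # extra draw for 12 DDR2 sticks vs DDR3
hours_per_year = 24 * 365               # left on all the time
kwh_per_year = extra_watts * hours_per_year / 1000   # ~1,787 kWh

for rate in (0.14, 0.28):               # assumed $/kWh, low and high end
    print(f"${kwh_per_year * rate:,.0f} per year at ${rate}/kWh")
# -> roughly $250/year at $0.14/kWh, roughly $500/year at $0.28/kWh
```

That is just the RAM; the rest of an old server isn't exactly sipping power either.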
With my first R710, I only spun it up when I wanted to use it. Do you know what happened? I got sick of waiting 10+ minutes for it to boot via IPMI, so I left it on all the time.
I guess there are two types of people in the world: those who would rather pay more up front, and those who would rather take on an extra monthly cost.
Even if you bump the first number up to $400 for the initial cost of a DDR3 server, which is going to be 3x the server a DDR2 box is and last twice as long, you have made that difference back in power savings within the first year.
I would MUCH rather pay the $170 difference up front, when I've had a chance to save up for it, than be stuck with an extra $15 or more per month as a "mortgage" payment.
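Same math on the break-even, reusing the $170 up-front difference and the assumed $250 to $500 per year of power savings from above (just a sketch, not an exact quote from anyone's bill):

```python
# How long the power savings take to cover the extra up-front cost.
upfront_difference = 170                # DDR3 server premium in dollars

for yearly_savings in (250, 500):       # assumed yearly power savings range
    months_to_break_even = upfront_difference / (yearly_savings / 12)
    print(f"Break even in about {months_to_break_even:.0f} months "
          f"at ${yearly_savings}/year savings")
# -> roughly 8 months at $250/year, roughly 4 months at $500/year
```

Either way you come out ahead well inside the first year.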
I hate creating extra monthly costs for myself. I would rather pay more up front.
Now obviously if you aren't going to leave it on all the time or you don't pay for power, then go for it. I would still go for the DDR3 server just for performance.