Data Storage IT

Software-Defined Data Centers Might Cost Companies More Than They Save

storagedude writes "As more and more companies move to virtualized, or software-defined, data centers, cost savings might not be one of the benefits. Sure, utilization rates might go up as resources are pooled, but if the end result is that IT resources become easier for end users to access and provision, they might end up using more resources, not less. That's the view of Peder Ulander of Citrix, who cites the Jevons Paradox, a 150-year-old economic theory that arose from an observation about the relationship between coal efficiency and consumption. Making a resource easier to use leads to greater consumption, not less, says Ulander. As users can do more for themselves and don't have to wait for IT, they do more, so more gets used. The real gain, then, might be that more gets accomplished as IT becomes less of a bottleneck. It won't mean cost savings, but it could mean higher revenues."
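For readers who haven't run into the Jevons Paradox before, here is a minimal sketch of the arithmetic behind it, applied to IT provisioning. The constant-elasticity demand curve and the numbers are illustrative assumptions, not figures from the story: when demand for a resource is price-elastic, making it cheaper and easier to consume raises total consumption and total spend.

    # Illustrative sketch of the Jevons Paradox for IT provisioning.
    # The demand curve and all numbers are assumptions for illustration only.
    def demand(price, base_demand=100.0, base_price=1.0, elasticity=1.5):
        """Constant-elasticity demand: quantity rises as the effective price falls."""
        return base_demand * (price / base_price) ** (-elasticity)

    for price in (1.00, 0.50, 0.25):  # provisioning gets 2x, then 4x cheaper/easier
        qty = demand(price)
        print(f"price={price:.2f}  units used={qty:7.1f}  total spend={price * qty:7.1f}")

    # price=1.00  units used=  100.0  total spend=  100.0
    # price=0.50  units used=  282.8  total spend=  141.4
    # price=0.25  units used=  800.0  total spend=  200.0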
  • by Anonymous Coward on Monday July 29, 2013 @12:35AM (#44409973)

    So, we are back to the "you'll never need more than 640K" or "the world only needs 5 computers" or "a telephone per town will be good enough" logic?

    Now, in the real world, we invent new things and then make them as cheaply as possible, increasing use/ownership of the widget and hopefully finding as many uses for the widget as possible. There are even films from the mid-20th century laying out this concept of capitalism and consumerism working at its most effective. But guess what? There are ALWAYS entrenched interests in the older and crappier ways of doing anything who will pay 'experts' to proclaim that the proposed new ways are stupid/dangerous/expensive or whatever.

    Let me guess. Next up on Slashdot is a reference to Amdahl's law that 'proves' no general computer ever benefits from having more than 3 cores? Those who quote such 'laws' are ALWAYS too thick to comprehend the extremely limited set of circumstances to which the 'law' applies. Those who understand such special circumstances will never quote such 'laws'.
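    For what it's worth, Amdahl's law itself is just speedup = 1 / ((1 - p) + p / n) for a parallel fraction p on n cores. A minimal sketch (with an assumed p = 0.9, chosen only for illustration) shows it doesn't cap useful cores at 3; it only says the serial fraction sets the eventual ceiling.

        # Amdahl's law: speedup = 1 / ((1 - p) + p / n)
        # p = parallelizable fraction (0.9 here is an assumption for illustration),
        # n = number of cores. More cores keep helping well past 3, with
        # diminishing returns toward the 1 / (1 - p) ceiling (10x for p = 0.9).
        def amdahl_speedup(p, n):
            return 1.0 / ((1.0 - p) + p / n)

        for n in (1, 2, 3, 4, 8, 16, 64):
            print(f"{n:3d} cores -> {amdahl_speedup(0.9, n):5.2f}x speedup")

        # 1 -> 1.00x, 3 -> 2.50x, 8 -> 4.71x, 64 -> 8.77x (asymptote 10.00x)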

  • by hawguy ( 1600213 ) on Monday July 29, 2013 @01:21AM (#44410117)

    A car analogy:

    This is the same type of BS that the city I live in uses so they don't have to build or expand any roads: "Well, traffic will just be as bad even after we build the highway, so why bother?"

    It's true. Once they build a new highway into town, people will build houses farther away at the other end of it, so the new highway just fuels more suburban sprawl, and that means more congestion inside the city and for drivers closer in, where it may not be possible to build more roads at all. Or, in areas without a major population center, it can encourage job centers to spring up along the highway, which are difficult to serve with cost-effective transit as people are forced to commute farther and farther to get to their jobs.

    By not building the roads, they implicitly encourage more high density, transit friendly development closer to the city.

    In general, a new highway is a temporary (and expensive) solution to a traffic problem.

  • by Anonymous Coward on Monday July 29, 2013 @02:36AM (#44410273)

    Here's why IT doesn't use those $100 1TB hard drives: http://serverfault.com/questions/263694/why-is-enterprise-storage-so-expensive/263695#263695 [serverfault.com]

  • by uncqual ( 836337 ) on Monday July 29, 2013 @02:57AM (#44410315)

    But keeping three copies of the data on cheap hardware, one of them hundreds of miles away, plus a couple of other data centers to which the data migrates in seconds or minutes, is within the scope of a cloud provider -- just business as usual. (The exact number of data centers and copies is beside the point; it depends on this year's failure stats for the low-cost hardware -- it's all statistics. See the rough sketch after this comment.)

    A business whose business isn't maintaining tens of data centers and managing them for redundancy may not be willing (nor, probably, should it be) to pay for that level of redundancy just for its own ten terabytes of important data. Its business is making innovative widgets efficiently, not managing geographically distributed data centers, each with connections to at least two independent power sources plus backup generators.

    If a midsized business making drywall needed another car to transport a salesperson, would it build an auto plant to build that car? No, it would lease the car from a business whose business is leasing cars (and providing replacement cars and maintenance), which in turn buys them from a specialist in designing and making cars (Toyota, for example).
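    A rough back-of-the-envelope version of that statistics, with an assumed annual loss probability per cheap copy (the 5% figure is made up for illustration, and real providers also model repair time and correlated failures):

        # Why "three copies on cheap hardware" works: with independent copies,
        # the chance of losing every replica in a year falls geometrically.
        # The 5% per-copy annual loss probability is an assumption, not a real figure.
        annual_loss_prob = 0.05

        for copies in (1, 2, 3):
            p_lose_all = annual_loss_prob ** copies
            print(f"{copies} copies -> P(all lost in a year) ~= {p_lose_all:.6f}")

        # 1 copy   -> 0.050000
        # 2 copies -> 0.002500
        # 3 copies -> 0.000125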

  • by Anonymous Coward on Monday July 29, 2013 @03:23AM (#44410367)

    I can give you a counterexample. We had a Cloud Ops team whose job was exactly to prevent the kind of thing you describe.
    Great. Except they didn't even know how to set up an EC2 instance with EBS, and they couldn't provide the EC2 instance types that were needed.
    So in the end, we just worked around them. Instead of spending *days* explaining to them what we needed, we had our EC2 instance running in five minutes, exactly how we needed it.
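    For context on the "five minutes" claim, a self-service launch really is only a few API calls. Here is a minimal sketch using boto3; the region, AMI ID, instance type, and volume size are placeholders, not values from the comment:

        # Minimal sketch: launch an EC2 instance with an extra EBS data volume.
        # All identifiers below are placeholders/assumptions.
        import boto3

        ec2 = boto3.resource("ec2", region_name="us-east-1")  # assumed region

        instances = ec2.create_instances(
            ImageId="ami-12345678",    # placeholder AMI ID
            InstanceType="m3.large",   # whichever instance type is actually needed
            MinCount=1,
            MaxCount=1,
            BlockDeviceMappings=[
                {
                    "DeviceName": "/dev/sdf",
                    "Ebs": {           # attach a 100 GB EBS volume at launch
                        "VolumeSize": 100,
                        "VolumeType": "gp2",
                        "DeleteOnTermination": True,
                    },
                }
            ],
        )

        instances[0].wait_until_running()
        print("running:", instances[0].id)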
