What's After Big Data?

gthuang88 writes: As the marketing hype around "big data" subsides, a recent wave of startups is solving a new class of data-related problems and showing where the field is headed. Niche analytics companies like RStudio, Vast, and FarmLink are trying to provide insights for specific industries such as finance, real estate, and agriculture. Data-wrangling software from startups like Tamr and Trifacta is targeting enterprises looking to find and prep corporate data. And heavily funded startups such as Actifio and DataGravity are trying to make data-storage systems smarter. Together, these efforts highlight where emerging data technologies might actually be used in the business world.
  • Re:A futile effort (Score:4, Informative)

    by lgw ( 121541 ) on Friday August 22, 2014 @02:02PM (#47731405) Journal

    It's like trainspotting, but for advertising memes.

    Gartner is the king/pusher, of course. But I think they were actually insightful about this 5 or so years ago. They predicted about 3 years of all-hype, no-product "cloud", another 3 years of practical, useful cloud infrastructure with nothing really taking advantage of it, and only after that would we see startups (and VC investment opportunities) using the cloud to build actual products. I think we're almost there.

    Even for hobby programming, the cloud is becoming quite appealing. For example, take a look at this remarkable Mandelbrot zoom to 10^275. It required 6 core-years to render (6 months wall clock). If you have the patience, the machines sitting idle (perhaps discarded bitcoin rigs), and no fear of power bills, then sure, run 3 old high-CPU towers for 6 months. But if you're good at massively parallel coding (and Mandelbrot rendering is great for learning that!), you can usually get AWS Spot instances for under a penny per core-hour. That means you can get those 6 core-years of CPU for about the price of a midrange geek PC, and you can get thousands of cores in parallel and be done rendering in a day.
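    To see why Mandelbrot rendering parallelizes so well, here is a minimal sketch (not the renderer behind the zoom above, just an illustration): every pixel's escape count is independent of every other pixel's, so rows can be farmed out to worker processes, or at larger scale, to cloud instances.

```python
# Minimal sketch: Mandelbrot escape-time rendering is embarrassingly
# parallel -- each pixel (and hence each row) is computed independently,
# so rows can be distributed across processes or machines.
from multiprocessing import Pool

WIDTH, HEIGHT, MAX_ITER = 80, 40, 100

def escape_count(c):
    """Iterations before z = z^2 + c escapes |z| > 2 (0 if it never does)."""
    z = 0j
    for n in range(1, MAX_ITER + 1):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return 0

def render_row(y):
    """One unit of parallel work: escape counts for an entire pixel row."""
    im = 1.25 - 2.5 * y / (HEIGHT - 1)
    return [escape_count(complex(-2.0 + 3.0 * x / (WIDTH - 1), im))
            for x in range(WIDTH)]

if __name__ == "__main__":
    with Pool() as pool:                      # one worker per CPU core
        rows = pool.map(render_row, range(HEIGHT))
    # Crude ASCII preview: '*' marks points that never escaped (in the set)
    for row in rows:
        print("".join("*" if n == 0 else " " for n in row))
```

    The same row-splitting structure scales up: instead of `Pool` workers on one box, each spot instance takes a slice of rows, which is why cheap interruptible machines suit this workload.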

    For a hobby project it might be hard to justify spending $hundreds this way, but for a start-up it makes perfect sense. So there's something to the "cloud", IMO, if you're trying to do supercomputer parallelism on a shoestring budget, something that's really only become possible in the past couple of years. I'm not sure 10,000 core-hours for $100 is all that cheap, really, but 10,000 cores in parallel for an hour for $100 is something wonderful.
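    The back-of-envelope arithmetic above works out as follows, assuming the quoted rate of roughly $0.01 per core-hour (actual spot prices vary by instance type, region, and demand):

```python
# Spot-pricing arithmetic for the 6-core-year render, at an assumed
# rate of ~$0.01 per core-hour (real rates fluctuate).
RATE = 0.01                              # dollars per core-hour (assumed)
core_years = 6
core_hours = core_years * 365 * 24       # 52,560 core-hours
cost = core_hours * RATE                 # ~$526: "a midrange geek PC"
wall_hours = core_hours / 10_000         # on 10,000 spot cores: ~5.3 hours
```

    That is the trade the comment is pointing at: the total bill is the same either way, but the cloud lets you exchange 6 months of wall-clock time for an afternoon.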
