Data Storage | Government | Hardware | IT | News | Science | Technology

Obama Administration Places $200 Million Bet On Big Data

wiredmikey writes "As the Federal Government aims to make use of the massive volume of digital data being generated on a daily basis, the Obama Administration today announced a 'Big Data Research and Development Initiative' backed by more than $200 million in commitments to start. Through the new Big Data initiative and the associated monetary investments, the Obama Administration promises to greatly improve the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data. Interestingly, as part of a number of government announcements on big data today, the National Institutes of Health announced that the world's largest set of data on human genetic variation, produced by the international 1000 Genomes Project (200 terabytes so far), is now freely available on the Amazon Web Services (AWS) cloud. Additionally, the Department of Defense (DoD) said it would invest approximately $250 million annually across the Military Departments in a series of programs. 'We also want to challenge industry, research universities, and non-profits to join with the Administration to make the most of the opportunities created by Big Data,' Tom Kalil, Deputy Director for Policy at OSTP, noted in a blog post. 'Clearly, the government can't do this on its own. We need what the President calls an "all hands on deck" effort.'"
This discussion has been archived. No new comments can be posted.

  • Great QOTD (Score:5, Insightful)

    by smwny ( 874786 ) on Thursday March 29, 2012 @05:38PM (#39516493)
    The QOTD at the bottom of the page so perfectly matched this story.

    All the taxes paid over a lifetime by the average American are spent by the government in less than a second. -- Jim Fiebig

    • Thank goodness government is a non-profit enterprise these days.

      Harken back to the times of kings!

      • by DarkOx ( 621550 )

        Yeah, but as a percentage, the kings took less wealth from a serf than the US taxpayer is asked to fork over today, if you look at those who pay any income tax.

    • That's kind of a silly statistic to use, since it's sensitive to population size. There are about 310 million Americans, and a year contains about 31 million seconds. You'd expect (roughly) 10 years of an average taxpayer's taxes to be spent every second if revenues == expenditures, everything is inflation-adjusted, and everyone pays taxes. If the country instead had only 31 million people, you'd expect 1 year per second. The implied point--that government spending is out of control and/or hugely wasteful--...
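
      A quick back-of-the-envelope version of that arithmetic in Python, using the rough figures from the comment above and assuming a balanced budget:

        population = 310e6          # approximate US population
        seconds_per_year = 31e6     # a year is roughly 31.5 million seconds

        # With revenues == expenditures, the total budget cancels out of the ratio:
        #   (total / seconds_per_year) / (total / population) = population / seconds_per_year
        taxpayer_years_per_second = population / seconds_per_year

        print(f"~{taxpayer_years_per_second:.0f} average-taxpayer-years spent per second")
        # -> ~10, matching the parent's estimate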

  • Privacy? (Score:5, Insightful)

    by GeneralTurgidson ( 2464452 ) on Thursday March 29, 2012 @05:40PM (#39516511)
    When it comes to big data, there's going to be little privacy.
    • I agree with you, good sir. When I hear about things like this... I feel like selling everything I own and walking off into the mountains.
    • Re: (Score:3, Interesting)

      by Sarten-X ( 1102295 )

      That's an absolutely unfounded concern.

      I worked at a Big Data company. About 90% of my job was improving privacy while maintaining the integrity of medical data. The patient's zip code was reduced to 3 digits. Any references to states were removed and forgotten (because some zip codes cross state lines). Any names were removed, as were any user-entered comments (doctor's notes, etc.) that might possibly contain personal information. Any personal information that is necessary for the system...
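
      A minimal Python sketch of that kind of de-identification pass; the field names and record layout here are hypothetical, not the actual system's, but the rules mirror the ones described above:

        # Illustrative de-identification of a patient record, loosely following
        # the rules in the parent comment. Field names are made up.
        def deidentify(record: dict) -> dict:
            clean = dict(record)
            # Reduce the zip code to its first 3 digits.
            if "zip" in clean:
                clean["zip"] = str(clean["zip"])[:3]
            # Drop state references and any free-text fields that could
            # contain personal information.
            for field in ("name", "state", "doctor_notes", "comments"):
                clean.pop(field, None)
            return clean

        record = {"name": "Jane Doe", "zip": "14850", "state": "NY",
                  "diagnosis": "J45.909", "doctor_notes": "lives on Elm St."}
        print(deidentify(record))  # -> {'zip': '148', 'diagnosis': 'J45.909'}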

      • by Bravoc ( 771258 )
        Yeah, but... we're not talking about a company that could become the target of civil litigation. We're talking about the US Federal Govurn-munt. Need I say more? I don't feel good about this at all.
      • That's pretty cool. I'm hopefully having my "exome" sequenced soon, as part of a clinical trial. I don't think my data will make it into a public database, but I would be for it, so long as my name, etc., were removed.

        Just some dumb thoughts on TFA: The 1000 Genomes Project is hosting 200TB of data?!? Haven't these guys ever heard of compression? It seems that an individual's entire genome can be compressed to about 4MB, so the ~2000 genomes produced by the 1000 Genomes Project should easily fit on my micro...
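
        The ~4MB figure presumably works by storing only where an individual differs from a shared reference genome, rather than all ~3 billion bases. A toy Python illustration of that delta idea (the sequences here are made up):

          reference  = "ACGTACGTACGTACGT"
          individual = "ACGTACCTACGTACGA"

          # Encode the individual as (position, base) differences from the reference.
          variants = [(i, b) for i, (a, b) in enumerate(zip(reference, individual)) if a != b]
          print(variants)  # -> [(6, 'C'), (15, 'A')]

          # Decode by applying the variants back onto the reference.
          decoded = list(reference)
          for pos, base in variants:
              decoded[pos] = base
          assert "".join(decoded) == individual
          # A real genome differs from the reference at only a few million sites,
          # which is why the delta fits in a few megabytes.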

    • by Anonymous Coward

      When Obama said his administration would be the most transparent, it is now apparent he meant that the government would make all of our private details transparent to it, while making what the government does totally opaque to us.

  • by Anonymous Coward on Thursday March 29, 2012 @05:42PM (#39516537)
    Clearly, the government can't do this on its own. We need what the President calls an 'all hands on deck' effort

    So Obama wants to pick and choose how this will be handled, but he wants everyone else to do it? Whatever happened to representation?
  • spies are ecstatic over the goodies that Uncle Sugar is about to drop in their laps.
  • With, of course, the certainty that the data mining capabilities will never be used for evil such as monitoring American citizens for the purposes of identifying nonviolent (but loud) political dissidents.

    The difference between this and Google is that you can haul Google into court when they do evil.

  • by Anonymous Coward

    That is what it looks like to me.

    I think Obama gave Solyndra $500 million about one week before the company declared bankruptcy. The execs refused to tell anybody where the money went and acted offended that anybody would ask.

    Big political contributions are probably about the best investment you can make; I figure about a $10 return for every $1 invested.

  • by macwhizkid ( 864124 ) on Thursday March 29, 2012 @06:39PM (#39517079)

    I'm a hard science/computer science guy whose livelihood is working on various NIH/NSF projects. A common thread in talking to other scientists over the past few years has been that the tools for data analysis have not kept pace with the tools for data acquisition. Companies like National Instruments sell sub-$1000 USB DAQ boards with resolution and bandwidth that would make a scientist from the early 1990s weep for joy. But most data analysis is done the same way it's been done since that same era: with a desktop application working on discrete files, and maybe some ad-hoc scripts. (Only now the scripts are Python instead of C...)

    The funny thing is, most researchers haven't yet wrapped their brains around the notion of offloading data onto cloud computing solutions like Amazon AWS. I was at an AWS presentation a couple of months ago, and the university's office of research gave an intro talking about their new supercomputer that has 2000 cores, only to get upstaged 10 minutes later when the Amazon guys introduced their 17,000-core virtual supercomputer (#42 on the Top 500 list, IIRC). There's a lot of untapped potential right now for using that infrastructure to crunch big data.

    • by Ruie ( 30480 )

      I was at an AWS presentation a couple of months ago, and the university's office of research gave an intro talking about their new supercomputer that has 2000 cores, only to get upstaged 10 minutes later when the Amazon guys introduced their 17,000-core virtual supercomputer (#42 on the Top 500 list, IIRC). There's a lot of untapped potential right now for using that infrastructure to crunch big data.

      Big Data is about I/O, not cores... How many GB/sec from disk can that cloud support?

      • If you need GB/sec for downloading data, you are asking the wrong questions. AWS can not only store but also compute.

        BGI is producing so much data that they are going back to shipping hard drives (a flashback to 2000 for me).

    • I was at an AWS presentation a couple of months ago, and the university's office of research gave an intro talking about their new supercomputer that has 2000 cores, only to get upstaged 10 minutes later when the Amazon guys introduced their 17,000-core virtual supercomputer (#42 on the Top 500 list, IIRC). There's a lot of untapped potential right now for using that infrastructure to crunch big data.

      While very cool, some problems require more communication between threads and might not scale well on more distributed VMs. Still, very cool.

    • Amazon is using the idle time of their huge cloud when it's not being used for Christmas shopping... so the cost of CPU is relatively cheap. Bandwidth and storage are *not*, with most cloud services.

      So, say I need to calibrate a year's worth of SDO/AIA data... that'd mean pushing somewhere in the range of 500TB of data to them, and then pulling it back again. They've changed their pricing so that transfer in is now free... but if I'm doing the math right, that'd cost somewhere on the order of $30k for the...
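
      For what it's worth, the math roughly checks out in Python; the per-GB egress rate below is an assumption standing in for AWS's actual tiered pricing of the time:

        data_tb = 500              # a year of SDO/AIA data, per the estimate above
        gb_per_tb = 1000
        egress_usd_per_gb = 0.06   # assumed blended rate; real pricing was tiered

        # Transfer in is free, so only the pull back out costs money.
        cost = data_tb * gb_per_tb * egress_usd_per_gb
        print(f"~${cost:,.0f} to transfer {data_tb} TB back out")  # -> ~$30,000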

    • by tomhath ( 637240 )
      This doesn't seem like a hardware problem. The government has been talking about "data fusion" for decades. They collect way more data than they know how to use. At a very high level this initiative sounds reasonable, but with no specific goals I wouldn't expect anything tangible to come out of it.
    • The sequencing data produced today is of inferior quality compared to 10 years ago. Somebody might be weeping with joy, but not the assemblers downstream in this data flow.

      Ironically, quality did not matter much before, when mapping genes was good enough. Nowadays, when we are talking SNPs, reads argue with each other, and MiSeq assemblies are in disarray against Ion Torrent's. Every sequence variation in an alignment is screaming "I am Spartacus".

      The problem of opened Pandora's boxes is solved by opening a hundred more of them. ...

  • Given this administration's record on "investments" and "betting on the future," that's just another $200,000,000 into the pockets of Big Democrats.

  • The Oracle/Cisco/IBM Full Employment Act
  • Sure, it's great that we can start to analyze lots more data, but does anyone else think we should start using the data we've already got? The next time I hear a politician propose a policy that directly contradicts current research on the subject in order to pander to their constituents or to Jesus, I think my brain will explode.

  • It's all spent in military departments, according to TFA. I really doubt the community will see much back from this investment in the form of better open-source tooling, since a lot of it will no doubt be used to deal with secret military stuff.

    Also, given the amount of black-ops money spent there on top of the "regular budget," this is nothing. The F-22 project alone has budget overruns that make this look like pocket change.
  • Have no fear, I am sure it will be run as well as other government ventures, such as your local Registry of Motor Vehicles...
  • Thanks for gambling my $$$.
