You recently got the chance to ask a group of MIT researchers questions about fusion power, and they've now finished writing some incredibly detailed answers. They discuss the things we've learned about fusion in the past decade, how long it's likely to take for fusion to power your home, the biggest problems fusion researchers are working to solve, and why it's important to continue funding fusion projects. They also delve into the specifics of tokamak operation, like dealing with disruption events and the limitations on reactor size, and provide some insight into fusion as a career. Hit the link below for a wealth of information about fusion.
An anonymous reader writes "Today, a typical chip might have six or eight cores, all communicating with each other over a single bundle of wires, called a bus. With a bus, only one pair of cores can talk at a time, which would be a serious limitation in chips with hundreds or even thousands of cores. Researchers at MIT say cores should instead communicate the same way computers hooked to the Internet do: by bundling the information they transmit into 'packets.' Each core would have its own router, which could send a packet down any of several paths, depending on the condition of the network as a whole."
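The idea can be sketched in a few lines: give each core its own router and pick the next hop adaptively, based on how congested the neighboring routers are. This is a toy illustration, not MIT's actual design; the 2D mesh layout, the queue-depth congestion metric, and all the names are assumptions.

```python
class Router:
    """One router per core in a hypothetical 2D mesh network-on-chip."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.queue_depth = 0  # stand-in for local congestion

def route_step(routers, src, dst):
    """Pick the next hop for a packet: move toward dst along x or y,
    preferring the neighbor with the shorter queue (adaptive routing)."""
    x, y = src
    dx = (1 if dst[0] > x else -1) if dst[0] != x else 0
    dy = (1 if dst[1] > y else -1) if dst[1] != y else 0
    candidates = []
    if dx:
        candidates.append((x + dx, y))
    if dy:
        candidates.append((x, y + dy))
    if not candidates:
        return src  # already at destination
    return min(candidates, key=lambda p: routers[p].queue_depth)

# A 4x4 mesh of cores, each with a router.
routers = {(x, y): Router(x, y) for x in range(4) for y in range(4)}

# Walk a packet from core (0,0) to core (3,2), one hop at a time.
pos, dst = (0, 0), (3, 2)
hops = [pos]
while pos != dst:
    pos = route_step(routers, pos, dst)
    hops.append(pos)
print(hops)
```

With an uncongested mesh the packet simply takes a shortest path; the point is that when `queue_depth` values differ, different packets between the same pair of cores can take different routes, which a shared bus cannot do.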
samazon writes "North Carolina State University researcher Jag Kasichainula has developed a 'heat spreader' to cool electronics more efficiently using a copper-graphene composite, which is attached using an indium-graphene interface film. According to Kasichainula, the technique will cool 25% faster than pure copper and will cost less to produce than the copper plate heat spreaders currently used by most electronics (abstract). Better performance at a lower cost? Let's hope so."
An anonymous reader writes "The DARPA Robotics Challenge is offering tens of millions of dollars in funding to teams from anywhere in the world to build robots capable of performing complex mobility and manipulation tasks such as walking over rubble and operating power tools. It will all culminate in an audacious competition with robots driving trucks, breaking through walls, and attempting to perform repairs in a simulated industrial-disaster setting. The winner takes all: a $2 million cash prize."
An anonymous reader writes "New York City is planning an upgrade to its aging pay-phone infrastructure. A pilot program will next month install 32-inch touchscreens in 250 phone booths throughout the city. The screens will display 'local neighborhood information, including lists of nearby restaurants, store sales in the area, traffic updates, landmark information and safety alerts — in multiple languages.' They will facilitate the 311 service, and also allow people to file complaints or request city information. The good news is that these screens won't cost the taxpayers anything. The bad news is that they will be supported by advertising. The plan is to eventually support Skype calls and email, and to integrate Wi-Fi hotspots."
tsu doh nimh writes "A series of hacks perpetrated against so-called 'smart meter' installations over the past several years may have cost a single U.S. electric utility hundreds of millions of dollars annually, the FBI said in a cyber intelligence bulletin first revealed today. The law enforcement agency said this is the first known report of criminals compromising the hi-tech meters, and that it expects this type of fraud to spread across the country as more utilities deploy smart grid technology."
MojoKid writes "In preparation for the arrival of their 3rd Generation Core processor products based on their Ivy Bridge microarchitecture, Intel has readied a new chipset dubbed the Z77 Express. New socket 1155 Ivy Bridge processors offer 16 lanes of PCI Express 2.0 or 3.0 connectivity on-die and they feature integrated dual-channel, DDR3 memory controllers with maximum officially supported speeds of up to 1600MHz. The processors are linked to the Z77 chipset via Intel's FDI (Flexible Display Interface) and 20Gb/s DMI 2.0 interfaces. The chipset itself is outfitted with 8 more PCIe 2.0 lanes, six SATA ports (II and III), an integrated Gigabit MAC, and digital display outputs for up to three displays. Also making its debut in an Intel chipset is native USB 3.0 support, with four USB 3.0 and ten USB 2.0 ports built in."
retroworks writes "Digitimes reports that 'Intel is set to push a tablet PC product codenamed StudyBook to target emerging markets. ... The StudyBook tablet PC will feature a 10-inch panel with Intel's Medfield platform and adopt dual-operating systems and will target the emerging markets such as China and Brazil. ... The StudyBook tablet PC will be released in the second half of 2012. ... Intel also hopes to push the product into regular retail channels priced below US$299.' Will this be another 'OLPC' disappointment, or is it starting to look very tough for the traditional school book industry?"
An anonymous reader writes "The U.S. Navy is paying a company six figures to hack into used video game consoles and extract sensitive information. The tasks to be completed are for both offline and online data. The organization says it will only use the technology on consoles belonging to nations overseas, because the law doesn't allow it to be used on any 'U.S. persons.'" Should be a doddle.
An anonymous reader writes "Amazon doesn't show off prototypes unless it is pretty confident about the tech, so you may be surprised to find the next Kindle is probably going to have a front-lit display. The lighting tech comes from a company they purchased back in 2010 called Oy Modilis. It specialized in such lighting and has patents related to whatever Amazon decided to use. The display is meant to be lit in a blue-white glow, and if it's anything like Flex lighting, it probably won't impact battery life too much. The question is, does anyone really want or need a light for their Kindle?"
An anonymous reader writes "I was looking at multimedia players from brands such as SumVision, Noontec and Western Digital. They all seem to be some device which accepts a USB hard-drive and commands from an IR remote control, and throws the result over HDMI. I have my own idea of what a hardware multimedia player should do (e.g. a personalized library screen for episodes, movies and documentaries; resume play; loudness control; etc.). I also think it will be a good programming adventure because I will have to make the player compatible with more than a few popular codecs. Is this an FPGA arena? Or a mini-Linux TV box? Any advice, books or starting point to suggest?" There certainly have been a lot of products and projects in this domain over the years, but what's the best place to start in the year 2012?
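For the library and resume-play features the submitter describes, a small persistent store is most of the work; the codec handling is usually delegated to an existing playback engine. Here's a minimal sketch using SQLite to remember playback positions per file. The schema and function names are invented for illustration and not tied to any existing player.

```python
import sqlite3

def open_library(path=":memory:"):
    """Create (or open) the media library database."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS library (
        path TEXT PRIMARY KEY,
        kind TEXT,                      -- 'episode', 'movie', 'documentary'
        resume_seconds REAL DEFAULT 0   -- last playback position
    )""")
    return db

def save_position(db, path, kind, seconds):
    """Record where playback stopped for a given file."""
    db.execute("INSERT OR REPLACE INTO library (path, kind, resume_seconds) "
               "VALUES (?, ?, ?)", (path, kind, seconds))
    db.commit()

def resume_position(db, path):
    """Return the saved position for a file, or 0 if it's new."""
    row = db.execute("SELECT resume_seconds FROM library WHERE path = ?",
                     (path,)).fetchone()
    return row[0] if row else 0

db = open_library()
save_position(db, "/media/shows/s01e01.mkv", "episode", 0)
save_position(db, "/media/shows/s01e01.mkv", "episode", 1312.5)
print(resume_position(db, "/media/shows/s01e01.mkv"))
```

On a small Linux-based box this kind of metadata layer sits naturally on top of a player library, with the `kind` column driving the personalized library screens the submitter wants.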
CowboyRobot writes "Stanford's CPU DB project (cpudb.stanford.edu) is like an open IMDB for microprocessors. Processors have come a long way from the Intel 4004 in 1971, with a clock speed of 740 kHz, and CPU DB shows the details of where and when the gains have occurred. More importantly, by looking at hundreds of processors over decades, researchers are able to separate the effect of technology scaling from improvements in, say, software. The public is encouraged to contribute to the project."
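The kind of separation CPU DB enables can be illustrated with a back-of-the-envelope calculation: under classical (Dennard) scaling, gate delay shrinks roughly in proportion to feature size, so clock frequency should grow about as 1/L, and any gain beyond that ratio is credited to design rather than process. This is a simplified sketch; the 4004 numbers come from the summary above, while the comparison part's feature size and clock are hypothetical.

```python
def scaling_breakdown(f_old_mhz, l_old_um, f_new_mhz, l_new_um):
    """Split a frequency gain into a process-scaling part and a
    design part, assuming frequency scales as 1/feature-size."""
    total_gain = f_new_mhz / f_old_mhz
    scaling_gain = l_old_um / l_new_um       # expected from process alone
    design_gain = total_gain / scaling_gain  # residual credited to design
    return total_gain, scaling_gain, design_gain

# Intel 4004 (1971): 10 um process, 0.74 MHz clock.
# Hypothetical 2012-era part: 32 nm (0.032 um), 3400 MHz.
total, scaling, design = scaling_breakdown(0.74, 10.0, 3400.0, 0.032)
print(f"total ~{total:.0f}x, process ~{scaling:.0f}x, design ~{design:.1f}x")
```

With these illustrative numbers, most of the roughly 4600x frequency gain comes from the 300x process shrink, leaving an order-of-magnitude residual for circuit and microarchitecture improvements; CPU DB lets researchers do this with real data across hundreds of parts.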
CowboyRobot writes "The National Weather Service has begun testing the way it labels natural disasters. It's hoping that the new warnings, which include words like 'catastrophic,' 'complete devastation likely,' and 'unsurvivable,' will make people more likely to take action to save their lives. But what about their digital lives? Recommendations include: Keep all electronics out of basements and off the floor; Unplug your hardware; Buy a surge protector; Enclose anything valuable in plastic. If the National Weather Service issued a 'complete devastation' warning today, would your data be ready?"
A week ago, we posted news of the delay that the Raspberry Pi Foundation faced because of a requirement that their boards be tested to comply with EU regulations. Now, the word is in, and the Raspberry Pi passed those tests without needing any modifications. From their post describing the ordeal: "The Raspberry Pi had to pass radiated and conducted emissions and immunity tests in a variety of configurations (a single run can take hours), and was subjected to electrostatic discharge (ESD) testing to establish its robustness to being rubbed on a cat. It’s a long process, involving a scary padded room full of blue cones, turntables that rise and fall on demand, and a thing that looks a lot like a television aerial crossed with Cthulhu."
HizookRobotics writes "The official announcement should be out very soon, but for now here's the unofficial, preliminary details based on notes from Dr. Gill Pratt's talk at DTRA Industry Day: The new Grand Challenge is for a humanoid robot (with a bias toward bipedal designs) that can be used in rough terrain and for industrial disasters. The robot will be required to maneuver into and drive an open-frame vehicle (e.g., a tractor), proceed to a building and dismount, ingress through a locked door using a key, traverse a 100 meter rubble-strewn hallway, climb a ladder, locate a leaking pipe and seal it by closing off a nearby valve, and then replace a faulty pump to resume normal operations — all semi-autonomously with just 'supervisory teleoperation.' It looks like there will be six hardware teams developing new robots, and twelve software teams using a common platform."