MrSeb writes with news about our coming cybernetic overlords. From the article: "After more than four years of research, DARPA has created a system that successfully combines soldiers, EEG brainwave scanners, 120-megapixel cameras, and multiple computers running cognitive visual processing algorithms into a cybernetic hivemind. Called the Cognitive Technology Threat Warning System (CT2WS), it will be used in a combat setting to significantly improve the U.S. Army's threat detection capabilities. There are two discrete parts to the system: The 120-megapixel camera, which is tripod-mounted and looks over the battlefield; and the computer system, where a soldier sits in front of a computer monitor with an EEG strapped to his head, looking at images captured by the camera, weeding out false threats. In testing, the 120-megapixel camera, combined with the computer vision algorithms, generated 810 false alarms per hour; with a human operator strapped into the EEG, that drops down to just five false alarms per hour. The human brain is surprisingly fast, too: According to DARPA, CT2WS displays 10 images per second to the human operator — and yet that doesn't seem to affect accuracy."
An anonymous reader writes "Yesterday, XDA Developers forum users kinfaus and pokey9000 were discussing how the latest devices from Amazon (the second-generation 7" Kindle Fire and the 7" Kindle Fire HD) come with more sophisticated protection than their predecessors, including locked bootloaders and 'high security' features offered by their OMAP processors. Today, the devices have been rooted." Using a known busybox bug dating back to April, even.
An anonymous reader writes "An achievement that would have extraordinary energy and defense implications might be near at Sandia National Laboratories. The lab is testing a concept called MagLIF (Magnetized Liner Inertial Fusion), which uses magnetic fields and laser pre-heating in the quest for energetic fusion. A paper by Sandia researchers that was accepted for publication states that the Z-pinch driven MagLIF fusion could reach 'high-gain' fusion conditions, where the fusion energy released greatly exceeds (by more than 1,000 times) the energy supplied to the fuel."
CowboyRobot writes "Dr. Dobb's reviews an alternative to Google Glass and goes through the steps of coding your own Android-based Heads-Up Display. 'By tucking their 428x240 pixel WQVGA heads-up display in the lower right corner of ski goggles, Recon has effectively created an unobtrusive HUD with a decent 600 MHz ARM Cortex A8 processor running Android 2.3.3 (Gingerbread). Network connections can be made via a Bluetooth-paired Android smartphone.'"
An anonymous reader writes "I am tasked with developing a service project to teach students in a Bangladeshi village how to type. The school has about 500 students, 12 computers donated to them in 2006, and a limited electricity supply. The students will be given job placement opportunities at a local firm in the city once they reach a certain proficiency. Therefore, we are trying to teach as many of them typing skills as possible. The problem: limited electricity, limited computers, many kids. I have some additional funding collected through donations. Instead of buying more computers, I am looking for a cost-effective option that does not need a steady flow of electricity. I realize that to teach typing, I do not need a computer. I could achieve the same using a keyboard connected to a display. A solar-powered calculator is a perfect example of a cheap device with a numpad for input and an LCD for display. But so far I have not come across a device that has a QWERTY keyboard and an LCD to display what's typed. I know there are some gaming keyboards with built-in LCDs, but they are quite expensive. I am aiming for a device that costs below USD 50. I considered typewriters, but they are in limited supply on the market. I also considered OLPC, but it is double my anticipated budget. Do you have other suggestions?" Considering that (at least in China) sub-$50 Android tablets with capacitive screens are already here, I wish the AlphaSmart line was cheaper, but apparently it currently starts at $169.
MrSeb writes "If, like me, you thought Microsoft would price Windows RT competitively, you were wrong: A leaked slide from Asus says that its Vivo Tab RT, due to be released alongside Windows RT at the end of October, will start at $600. Unbelievably, this is $100 more than the iPad 3, and a full $200 more than the iPad 2 or Galaxy Tab 2 10.1. For $600, you would expect some sensational hardware specs — but alas, that's not the case. The Vivo Tab RT has a low-res 10.1-inch 1366×768 IPS display, quad-core Tegra 3 SoC, 2GB of RAM, NFC, 8-megapixel camera and that's about it. Like its Androidesque cousin, the Transformer, the Vivo Tab RT can be plugged into a keyboard/battery dock — but it'll cost you another $200 for the pleasure. (Curiously, the Transformer's docking station only costs $150 — go figure.)"
First time accepted submitter moon_unit2 writes "Technology Review has the scoop on a new industrial robot created by famed robotics researcher Rodney Brooks. The robot, Baxter, is completely safe, extremely adaptable, and ridiculously easy to program. By providing a way to automate simple manufacturing work, it could help U.S. manufacturers compete with Chinese companies that rely on low-cost human labor. You can see the robot in action in a related video, along with Brooks discussing its potential." $22,000 and shipping next month, goes the story.
MojoKid writes "Intel's next-generation CPU architecture, codenamed Haswell, puts heavy emphasis on reducing power consumption. Pushing Haswell down to a 10W TDP is an achievement, but hitting these targets requires collaboration. Haswell will offer finer-grained control over areas of logic that were previously either on or off, up to and including specific execution units. These optimizations are impressive, particularly the fact that idle CPU power is approaching tablet levels, but they're only part of the story. Operating system changes matter as well, and Intel has teamed up with Microsoft to ensure that Windows 8 takes advantage of current and future hardware. Haswell's 10W target will allow the chip to squeeze into many of the convertible laptop/tablet form factors on display at IDF, while Bay Trail, the 22nm, out-of-order successor to Clover Trail, arrives in 2013 as well. Not to mention the company's demonstration of the first integrated digital WiFi radio. Folks have been trading blows over whether Intel could compete with ARM's core power consumption. Meanwhile, Santa Clara has been busy designing many other aspects of the full system solution for low power consumption and saving a lot of wattage in the process." It's mildly amusing that Windows 8 is the first version to gain dynamic ticks, something Linux has had working since around 2007.
First time accepted submitter ze_jua writes "In this article, Jay Goldberg, a financial analyst who travels to Shenzhen several times a year, analyzes what the very low hardware costs he found there could mean for the consumer electronics industry worldwide. He wrote the piece after finding a very nice $45 Android 4 tablet. Are we really that close to give-away tablets?"
An anonymous reader writes "Presenting at the IEEE High Performance Extreme Computing conference, a researcher from the University of Tennessee presented evidence that the iPad 2 is as fast as the original Cray-2 supercomputer. Performance improvements were made to the iPad 2 LINPACK software by writing Python scripts to generate and test various assembly routines. The researcher also found that the ARM Cortex-A9 easily beats the NVIDIA/AMD GPUs and latest Intel/AMD workstation CPUs in performance-per-watt efficiency."
1sockchuck writes "Data center operators often tout their diesel backup generators as a symbol of their reliability. So why does Microsoft want to get rid of them? Microsoft says diesel generators are 'inefficient and costly' and is looking at alternatives to supply emergency backup power for its server farms, including fuel cells powered by natural gas. One possible option is the 'Bloom box,' which both Apple and eBay are using in their data centers (albeit with biogas as the primary fuel). Bloom is positioning its fuel cells as a way to forgo expensive UPS units and generators, using the Bloom box for primary power and the utility grid for backup. It's a pitch that benefits from the current low price of natural gas." (Microsoft would like to stop using so much water, too.)
SkinnyGuy writes "Roomba, the world’s first multi-million unit-selling home-helper robot, turns 10 today. iRobot has cooked up a self-congratulatory infographic filled with a collection of interesting and occasionally bizarre facts to mark the occasion. Did you know that dogs, cats and babies have ridden iRobot's iconic home cleaning robot since it was introduced exactly a decade ago?"
Lucas123 writes "The price of 2.5-in solid state drives has dropped threefold in three years, making many of the most popular models less than $1 per gigabyte, or about 74 cents per gig. Hybrid drives, which pair a small amount of NAND flash cache with spinning disk, have in contrast reached near price parity with hard drives, which hover around 23 cents per gig. While HDDs cannot compare to SSDs in terms of IOPS when used in a storage array or server, it's debatable whether SSDs offer performance increases in a laptop significant enough to justify paying three times as much as for a high-end hard drive or a hybrid drive. For example, an Intel 520 Series SSD has a max sequential read speed of 456MB/sec compared to a WD Black's 122MB/sec. The SSD boots up in 9 seconds, compared to the HDD's 21 seconds and the hybrid drive's 12 seconds. So the question becomes: should you pay three times as much for an SSD that delivers roughly twice the real-world performance of a hard drive, and nearly the same as a hybrid drive?"
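A quick back-of-the-envelope pass over the figures quoted above (the article's examples, not live market data) shows where the "three times the price for about twice the performance" framing comes from:

```python
# Arithmetic on the summary's figures: price premium vs. speedup.
ssd_per_gb, hdd_per_gb = 0.74, 0.23           # dollars per gigabyte
ssd_read, hdd_read = 456, 122                 # MB/s sequential (Intel 520 vs WD Black)
ssd_boot, hdd_boot, hybrid_boot = 9, 21, 12   # boot times in seconds

print(f"Price premium:   {ssd_per_gb / hdd_per_gb:.1f}x")  # ~3.2x
print(f"Sequential read: {ssd_read / hdd_read:.1f}x")      # ~3.7x
print(f"Boot vs. HDD:    {hdd_boot / ssd_boot:.1f}x")      # ~2.3x
print(f"Boot vs. hybrid: {hybrid_boot / ssd_boot:.1f}x")   # ~1.3x
```

So by these numbers the SSD costs about 3.2x as much per gigabyte while booting about 2.3x faster than the hard drive, and only about 1.3x faster than the hybrid.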
An anonymous reader writes "AllThingsD columnist Arik Hesseldahl noticed another milestone marking the passing of the personal computer era: for the first time since the early '80s, the share of worldwide sales of DRAM chips consumed by PCs (desktop and laptop computers, but not tablets) has dropped below fifty percent. Perhaps a more important milestone was reached last year, when more smartphones were shipped (not sold) worldwide than the combined total of PCs and tablets (also noticed by Microsoft watcher Joe Wilcox). While this is certainly of tremendous marketing and business importance to the likes of Apple, Microsoft, Google, Adobe, and PC OEMs, others may reflect on the impending closing of the history books on the era that started in Silicon Valley a little over 35 years ago."
Bruce Perens writes "Clover Trail, Intel's newly announced 'Linux proof' processor, is already a dead end for technical and business reasons. Clover Trail is said to include power-management that will make the Atom run longer under Windows. It had better, since Atom currently provides about 1/4 of the power efficiency of the ARM processors that run iOS and Android devices. The details of Clover Trail's power management won't be disclosed to Linux developers. Power management isn't magic, though — there is no great secret about shutting down hardware that isn't being used. Other CPU manufacturers, and Intel itself, will provide similar power management to Linux on later chips. Why has Atom lagged so far behind ARM? Simply because ARM requires fewer transistors to do the same job. Atom and most of Intel's line are based on the ia32 architecture. ia32 dates back to the 1970s and is the last bastion of CISC, Complex Instruction Set Computing. ARM and all later architectures are based on RISC, Reduced Instruction Set Computing, which provides very simple instructions that run fast. RISC chips allow the language compilers to perform complex tasks by combining instructions, rather than by selecting a single complex instruction that's 'perfect' for the task. As it happens, compilers are more likely to get optimal performance with a number of RISC instructions than with a few big instructions that are over-generalized or don't do exactly what the compiler requires. RISC instructions are much more likely to run in a single processor cycle than complex ones. So, ARM ends up being several times more efficient than Intel."
Lucas123 writes "Intel for the first time demonstrated the Wireless Gigabit (WiGig) docking specification using an Ultrabook, which was able to achieve 7Gbps performance, more than ten times the speed of the fastest Wi-Fi networks based on the IEEE 802.11n standard. The WiGig medium access control (MAC) and physical (PHY) control specification operates in the unlicensed 60GHz frequency band, which has more spectrum available than the 2.4GHz and 5GHz bands used by existing Wi-Fi products. According to Ali Sadri, chairman of the WiGig Alliance, the specification also supports wireless implementations of HDMI and DisplayPort interfaces, as well as the High-Bandwidth Digital Content Protection (HDCP) scheme used to protect digital content transmitted over those interfaces. It scales to allow transmission of both compressed and uncompressed video."
theodp writes "When it comes to Google's futuristic Glass goggles, people seem to fall into two camps. On the one hand, you have people like NY Times Arts critic Mike Hale, who goes gaga over how fashion designer Diane von Furstenberg put Google glasses on models who walked in her recent Fashion Week show, enabling them to capture video from their point of view as they walked the runway. 'For a preview of how we all may be making movies in a few years,' Hale breathlessly writes, 'take a look at DVF Through Glass.' On the other hand, you have folks like NY Times commenter JokerDanny, who says he's seen this Google Glass movie before. 'David Letterman used to call this Monkey-Cam,' quips JD, referring to the mid-1980s Late Night bits in which Letterman mounted a camera on Zippy the Chimp, enabling the monkey to capture video from his point of view as he roamed the studio. Thanks to the magic of YouTube Doubler, here's a head-to-head comparison of POV video shot by Zippy in 1986 — the year Larry Page and Sergey Brin celebrated their 13th birthdays — to that taken by a DVF model in 2012."
necro81 writes "IEEE Spectrum magazine has a feature article describing DARPA-funded work towards developing a solar cell that's 50% efficient, for a finished module that's 40% efficient — suitable for charging a soldier's gadgets in the field. Conventional silicon and thin-film PV tech can hit cell efficiencies of upwards of 20%, with finished modules hovering in the teens. Triple-junction cells can top 40%, but are expensive to produce and not practical in most applications. Current work by the Very High Efficiency Solar Cell program uses optics (dichroic films) to concentrate incoming sunlight by 20-200x, and split it into constituent spectra, which fall on many small solar cells of different chemistries, each tuned to maximize the conversion of different wavelengths."
mdsolar writes "Reuters reports that the Japanese government said it 'intends to stop using nuclear power by the 2030s, marking a major shift from policy goals set before last year's Fukushima disaster that sought to increase the share of atomic energy to more than half of electricity supply. Japan joins countries such as Germany and Switzerland in turning away from nuclear power ... Japan was the third-biggest user of atomic energy before the disaster. In abandoning atomic power, Japan aims to triple the share of renewable power to 30 percent of its energy mix, but will remain a top importer of oil, coal and gas for the foreseeable future. Prime Minister Yoshihiko Noda's unpopular government, which could face an election this year, had faced intense lobbying from industries to maintain atomic energy and also concerns from its major ally, the United States, which supplied it with nuclear technology in the 1950s.' Meanwhile, the U.S. nuclear renaissance appears to be unraveling."