MojoKid writes "Intel's next-generation CPU architecture, codenamed Haswell, puts heavy emphasis on reducing power consumption. Pushing Haswell down to a 10W TDP is an achievement, but hitting these targets requires collaboration. Haswell will offer finer-grained control over areas of logic that were previously either on or off, up to and including specific execution units. These optimizations are impressive, particularly the fact that idle CPU power is approaching tablet levels, but they're only part of the story. Operating system changes matter as well, and Intel has teamed up with Microsoft to ensure that Windows 8 takes advantage of current and future hardware. Haswell's 10W target will allow the chip to squeeze into many of the convertible laptop/tablet form factors on display at IDF, while Bay Trail, the 22nm, out-of-order successor to Clover Trail, arrives in 2013 as well. Not to mention the company's demonstration of the first integrated digital WiFi radio. Folks have been trading blows over whether Intel's cores could match ARM's on power consumption. Meanwhile, Santa Clara has been busy designing many other aspects of the full system solution for low power consumption, and saving a lot of wattage in the process." It's mildly amusing that Windows 8 is the first version to gain dynamic ticks, something Linux has had working since around 2007.
First time accepted submitter ze_jua writes "In this article, Jay Goldberg, a financial analyst who travels to Shenzhen several times a year, analyses the potential consequences for the worldwide consumer electronics industry of the rock-bottom hardware prices he found there. He wrote the piece after coming across a very nice $45 Android 4 tablet. Are we really that close to give-away tablets?"
An anonymous reader writes "At the IEEE High Performance Extreme Computing conference, a researcher from the University of Tennessee presented evidence that the iPad 2 is as fast as the original Cray-2 supercomputer. The iPad 2's LINPACK performance was improved by using Python to generate and test various assembly routines. The researcher also found that the ARM Cortex-A9 easily beats NVIDIA/AMD GPUs and the latest Intel/AMD workstation CPUs in performance-per-watt efficiency."
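The approach described above — scripting the generation and benchmarking of many candidate routines and keeping the fastest — is a classic autotuning loop. A minimal sketch of that loop in Python follows; the two "kernels" are hypothetical pure-Python stand-ins for the generated assembly variants, not the researcher's actual code:

```python
import time

# Hypothetical stand-ins for generated assembly variants of a dot product.
def dot_naive(a, b):
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def dot_unrolled(a, b):
    # Unrolled by 2, mimicking the loop unrolling an autotuner might emit.
    total = 0.0
    n = len(a) - len(a) % 2
    for i in range(0, n, 2):
        total += a[i] * b[i] + a[i + 1] * b[i + 1]
    for i in range(n, len(a)):
        total += a[i] * b[i]
    return total

def autotune(candidates, a, b, repeats=5):
    """Time each candidate and return (best_name, best_seconds)."""
    timings = {}
    for name, fn in candidates.items():
        start = time.perf_counter()
        for _ in range(repeats):
            fn(a, b)
        timings[name] = (time.perf_counter() - start) / repeats
    best = min(timings, key=timings.get)
    return best, timings[best]

a = [1.0] * 10_000
b = [2.0] * 10_000
best, secs = autotune({"naive": dot_naive, "unrolled": dot_unrolled}, a, b)
```

A real LINPACK autotuner would emit and assemble NEON routines and time them on-device, but the search structure is the same: generate variants, measure, keep the winner.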
1sockchuck writes "Data center operators often tout their diesel backup generators as a symbol of their reliability. So why does Microsoft want to get rid of them? Microsoft says diesel generators are 'inefficient and costly' and is looking at alternatives to supply emergency backup power for its server farms, including fuel cells powered by natural gas. One possible option is the 'Bloom box,' which both Apple and eBay are using in their data centers (albeit with biogas as the primary fuel). Bloom is positioning its fuel cells as a way to forego expensive UPS units and generators, using the Bloom box for primary power and the utility grid for backup. It's a pitch that benefits from the current low price of natural gas." (Microsoft would like to stop using so much water, too.)
SkinnyGuy writes "Roomba, the world’s first multi-million unit-selling home-helper robot, turns 10 today. iRobot has cooked up a self-congratulatory infographic filled with a collection of interesting and occasionally bizarre facts to mark the occasion. Did you know that dogs, cats and babies have ridden iRobot's iconic home cleaning robot since it was introduced exactly a decade ago?"
Lucas123 writes "The price of 2.5-in solid state drives has dropped to about a third of what it was three years ago, bringing many of the most popular models below $1 per gigabyte, to about 74 cents per gig. Hybrid drives, which pair a small amount of NAND flash cache with spinning disk, have meanwhile reached near price parity with hard drives, which hover around 23 cents per gig. While HDDs cannot compare to SSDs in terms of IOPS when used in a storage array or server, it's debatable whether SSDs offer performance gains in a laptop significant enough to justify paying three times as much as for a high-end hard drive or a hybrid drive. For example, an Intel 520 Series SSD has a max sequential read speed of 456MB/sec compared to a WD Black's 122MB/sec. The SSD boots in 9 seconds, compared with 21 seconds for the HDD and 12 seconds for the hybrid drive. So the question becomes: should you pay three times as much for an SSD that offers roughly twice the boot speed of a hard drive, but nearly the same as a hybrid drive?"
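The price-versus-performance trade-off in the summary reduces to a little arithmetic. The sketch below checks it; the drive prices are illustrative values chosen to match the quoted per-gigabyte figures, not actual street prices:

```python
def dollars_per_gb(price_usd, capacity_gb):
    """Cost per gigabyte of a drive."""
    return price_usd / capacity_gb

# Illustrative prices matching the quoted per-gig figures.
ssd = dollars_per_gb(177.60, 240)    # ~$0.74/GB for a 240 GB SSD
hdd = dollars_per_gb(230.00, 1000)   # ~$0.23/GB for a 1 TB HDD

premium = ssd / hdd        # ~3.2x price premium for the SSD
seq_speedup = 456 / 122    # ~3.7x sequential read advantage (Intel 520 vs WD Black)
boot_speedup = 21 / 9      # ~2.3x faster boot than the HDD (9 s vs 21 s)
```

The numbers bear out the article's framing: you pay a roughly 3x per-gigabyte premium for a little over a 2x boot-time improvement over a plain hard drive.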
An anonymous reader writes "AllThingsD columnist Arik Hesseldahl noticed another milestone marking the passing of the personal computer era: for the first time since the early '80s, the share of worldwide sales of DRAM chips consumed by PCs (desktop and laptop computers, but not tablets) has dropped below fifty percent. Perhaps a more important milestone was reached last year, when more smartphones were shipped (not sold) worldwide than the combined total of PCs and tablets (also noticed by Microsoft watcher Joe Wilcox). While this is certainly of tremendous marketing and business importance to the likes of Apple, Microsoft, Google, Adobe, and PC OEMs, others may reflect on the impending closing of the history books on the era that started in Silicon Valley a little over 35 years ago."
Bruce Perens writes "Clover Trail, Intel's newly announced 'Linux proof' processor, is already a dead end for technical and business reasons. Clover Trail is said to include power-management that will make the Atom run longer under Windows. It had better, since Atom currently provides about 1/4 of the power efficiency of the ARM processors that run iOS and Android devices. The details of Clover Trail's power management won't be disclosed to Linux developers. Power management isn't magic, though — there is no great secret about shutting down hardware that isn't being used. Other CPU manufacturers, and Intel itself, will provide similar power management to Linux on later chips. Why has Atom lagged so far behind ARM? Simply because ARM requires fewer transistors to do the same job. Atom and most of Intel's line are based on the ia32 architecture. ia32 dates back to the 1970s and is the last bastion of CISC, Complex Instruction Set Computing. ARM and all later architectures are based on RISC, Reduced Instruction Set Computing, which provides very simple instructions that run fast. RISC chips allow the language compilers to perform complex tasks by combining instructions, rather than by selecting a single complex instruction that's 'perfect' for the task. As it happens, compilers are more likely to get optimal performance with a number of RISC instructions than with a few big instructions that are over-generalized or don't do exactly what the compiler requires. RISC instructions are much more likely to run in a single processor cycle than complex ones. So, ARM ends up being several times more efficient than Intel."
Lucas123 writes "Intel for the first time demonstrated the Wireless Gigabit (WiGig) docking specification using an Ultrabook, which was able to achieve 7Gbps throughput, ten times the speed of the fastest Wi-Fi networks based on the IEEE 802.11n standard. The WiGig medium access control (MAC) and physical (PHY) control specification operates in the unlicensed 60GHz frequency band, which has more spectrum available than the 2.4GHz and 5GHz bands used by existing Wi-Fi products. According to Ali Sadri, chairman of the WiGig Alliance, the specification also supports wireless implementations of HDMI and DisplayPort interfaces, as well as the High-Bandwidth Digital Content Protection (HDCP) scheme used to protect digital content transmitted over those interfaces. It scales to allow transmission of both compressed and uncompressed video."
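A quick back-of-the-envelope calculation shows what 7 Gbps means in practice against an 802.11n link at a tenth of that rate. The 25 GB payload size is an illustrative assumption (a Blu-ray-sized file), and protocol overhead is ignored:

```python
def transfer_seconds(size_gigabytes, link_gbps):
    """Seconds to move a payload over a link, ignoring protocol overhead."""
    size_gigabits = size_gigabytes * 8
    return size_gigabits / link_gbps

movie_gb = 25  # illustrative Blu-ray-sized payload

wigig = transfer_seconds(movie_gb, 7.0)    # ~29 seconds at 7 Gbps
wifi_n = transfer_seconds(movie_gb, 0.7)   # ~4.8 minutes at a tenth the rate
```

At 7 Gbps the wireless dock is fast enough that bulk transfers feel like a wired connection, which is the point of pitching WiGig for docking and display links.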
theodp writes "When it comes to Google's futuristic Glass goggles, people seem to fall into two camps. On the one hand, you have people like NY Times Arts critic Mike Hale, who goes gaga over how fashion designer Diane von Furstenberg put Google glasses on models who walked in her recent Fashion Week show, enabling them to capture video from their point of view as they walked the runway. 'For a preview of how we all may be making movies in a few years,' Hale breathlessly writes, 'take a look at DVF Through Glass.' On the other hand, you have folks like NY Times commenter JokerDanny, who says he's seen this Google Glass movie before. 'David Letterman used to call this Monkey-Cam,' quips JD, referring to the mid-1980s Late Night bits in which Letterman mounted a camera on Zippy the Chimp, enabling the monkey to capture video from his point of view as he roamed the studio. Thanks to the magic of YouTube Doubler, here's a head-to-head comparison of POV video shot by Zippy in 1986 — the year Larry Page and Sergey Brin celebrated their 13th birthdays — with that taken by a DVF model in 2012."
necro81 writes "IEEE Spectrum magazine has a feature article describing DARPA-funded work towards developing a solar cell that's 50% efficient, for a finished module that's 40% efficient — suitable for charging a soldier's gadgets in the field. Conventional silicon and thin-film PV tech can hit cell efficiencies of upwards of 20%, with finished modules hovering in the teens. Triple-junction cells can top 40%, but are expensive to produce and not practical in most applications. Current work by the Very High Efficiency Solar Cell program uses optics (dichroic films) to concentrate incoming sunlight by 20-200x, and split it into constituent spectra, which fall on many small solar cells of different chemistries, each tuned to maximize the conversion of different wavelengths."
mdsolar writes "Reuters reports that the Japanese government said it 'intends to stop using nuclear power by the 2030s, marking a major shift from policy goals set before last year's Fukushima disaster that sought to increase the share of atomic energy to more than half of electricity supply. Japan joins countries such as Germany and Switzerland in turning away from nuclear power ... Japan was the third-biggest user of atomic energy before the disaster. In abandoning atomic power, Japan aims to triple the share of renewable power to 30 percent of its energy mix, but will remain a top importer of oil, coal and gas for the foreseeable future. Prime Minister Yoshihiko Noda's unpopular government, which could face an election this year, had faced intense lobbying from industries to maintain atomic energy and also concerns from its major ally, the United States, which supplied it with nuclear technology in the 1950s.' Meanwhile, the U.S. nuclear renaissance appears to be unraveling."
New submitter notscientific writes "Renewable energy sources are popular, but they have yet to live up to the hype. A new study in Nature Climate Change shows, however, that there is more than enough power to be harvested from the wind to sustain Earth's entire population 200 times over. To capture it, though, we may need to set up wind farms at altitudes of 200-20,000 metres. To be fair, the study is purely theoretical and does not examine the feasibility of such wind farms. Regardless, the paper provides a major boost to backers of wind-generated energy. Science has confirmed that the sky's the limit."
jfruh writes "Every time a company rolls out a new version of a product, it extols how much better it is than the previous version. Thus, Apple spent a part of its iPhone 5 rollout touting the staying power of the latest version of its battery. But have iPhone batteries really seen improvement since the original came out in '07? Kevin Purdy crunches the numbers and concludes that, while the 5's battery beats the 4S's, we still haven't returned to the capabilities of the original phone."
MrSeb writes "Intel often uses the Intel Developer Forum (IDF) as a platform to discuss its long-term vision for computing as well as more practical business initiatives. This year, the company discussed the shrinking energy cost of computation, predicting that by 2020 the energy required for 'meaningful computing' will approach zero, making such computing ubiquitous. The idea that we could push the energy cost of computing down to nearly immeasurable levels is exciting. It's the type of innovation that's needed to drive products like Google Glass or VR headsets like the Oculus Rift. Unfortunately, Intel's slide neatly sidesteps the greatest problem facing such innovations: computation already accounts for less than half the total energy expenditure of a smartphone or other handheld device. Yes, meaningful compute might approach zero energy, but touchscreens, displays, radios, speakers, cameras, audio processors, and other parts of the equation are all a long way from being as advanced as Intel's semiconductor processes."