MrSeb writes "A team of researchers from MIT, Caltech, Harvard, and several European universities has devised a way of boosting the performance of wireless networks by up to 10 times — without increasing transmission power, adding more base stations, or using more wireless spectrum. The researchers' creation, coded TCP, is a novel way of transmitting data so that lost packets don't result in higher latency or re-sent data. With coded TCP, blocks of packets are clumped together and then transformed into algebraic equations (PDF) that describe the packets. If part of the message is lost, the receiver can solve the equations to derive the missing data. The process of solving the equations is simple and linear, meaning it doesn't require much processing on the part of the router/smartphone/laptop. In testing, coded TCP produced some dramatic improvements: MIT found that campus WiFi (2% packet loss) jumped from 1Mbps to 16Mbps, and on a fast-moving train (5% packet loss), the connection speed jumped from 0.5Mbps to 13.5Mbps. Moving forward, coded TCP is expected to have huge repercussions for the performance of LTE and WiFi networks — and the technology has already been commercially licensed to several hardware makers."
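The core idea (sending linear combinations of packets so the receiver can solve for whatever was lost) can be sketched in a few lines. This is a toy illustration, not the researchers' actual scheme: real coded TCP uses random linear network coding, while the sketch below uses deterministic Vandermonde-style coefficients over the prime field GF(257) so that any k of the coded packets are guaranteed to decode.

```python
P = 257  # prime field; comfortably holds byte values 0..255

def encode(packets, n_coded):
    """Emit n_coded linear combinations of the k source packets.
    Coded packet i carries coefficients (i^0, i^1, ..., i^(k-1)) mod P;
    distinct i values form a Vandermonde matrix, so ANY k coded
    packets are linearly independent and therefore decodable."""
    k = len(packets)
    coded = []
    for i in range(1, n_coded + 1):
        coeffs = [pow(i, j, P) for j in range(k)]
        payload = [sum(c * pkt[b] for c, pkt in zip(coeffs, packets)) % P
                   for b in range(len(packets[0]))]
        coded.append((coeffs, payload))
    return coded

def decode(received, k):
    """Recover the k source packets from any k coded packets by
    Gauss-Jordan elimination over GF(P)."""
    rows = [[list(c), list(p)] for c, p in received[:k]]
    for col in range(k):
        pivot = next(r for r in range(col, k) if rows[r][0][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = pow(rows[col][0][col], P - 2, P)  # Fermat modular inverse
        rows[col] = [[x * inv % P for x in half] for half in rows[col]]
        for r in range(k):
            f = rows[r][0][col]
            if r != col and f:
                rows[r] = [[(a - f * b) % P for a, b in zip(half, ref)]
                           for half, ref in zip(rows[r], rows[col])]
    return [payload for _, payload in rows]

packets = [[104, 105], [33, 0], [42, 7]]    # three tiny 2-byte "packets"
coded = encode(packets, 5)                   # transmit 5 coded packets
survivors = [coded[1], coded[2], coded[4]]   # two packets lost in transit
assert decode(survivors, 3) == packets       # the originals are recovered
```

Note that the receiver never requests a retransmission: as long as any three of the five coded packets arrive, the missing data falls out of the linear algebra.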
An anonymous reader writes "AMD just officially took the wraps off Vishera, its next generation of FX processors. Vishera is Piledriver-based like the recently-released Trinity APUs, and the successor to last year's Bulldozer CPU architecture. The octo-core flagship FX-8350 runs at 4.0 GHz and is listed for just $195. The 8350 is followed by the 3.5 GHz FX-8320 at $169. Hexa-core and quad-core parts are also launching, at $132 and $122, respectively. So how does Vishera stack up to Intel's lineup? The answer to that isn't so simple. The FX-8350 can't even beat Intel's previous-generation Core i5-2550K in single-threaded applications, yet it comes very close to matching the much more expensive ($330), current-gen Core i7-3770K in multi-threaded workloads. Vishera's weak point, however, is in power efficiency. On average, the FX-8350 uses about 50 W more than the i7-3770K. Intel aside, the Piledriver-based FX-8350 is a whole lot better than last year's Bulldozer-based FX-8150, which debuted at $235. While some of this has to do with performance improvements, the fact that AMD is asking $40 less this time around certainly doesn't hurt either. At under $200, AMD finally gives the enthusiast builder something to think about, albeit on the low-end." Reviews are available at plenty of other hardware sites, too. Pick your favorite: PC Perspective, Tech Report, Extreme Tech, Hot Hardware, AnandTech, and [H]ard|OCP.
An anonymous reader writes "Technology Review has an update on Microsoft's effort to push Kinect gesture control technology beyond the Xbox console and make it a standard Windows computer accessory. Microsoft has sold Kinect for Windows hardware to developers since February and now products based on it are appearing, such as GestSure's system for surgeons in the operating room. Microsoft won't say when it will begin selling Kinect for Windows hardware directly to consumers, but seems poised to do so once enough developers have readied applications."
Nerval's Lobster writes "In a YouTube interview released by Microsoft, co-founder Bill Gates offered a few hints of where Microsoft plans on taking Windows in coming years. 'It's evolving literally to be a single platform,' he said, referring to how Windows 8 and Windows Phone 8 share a kernel, file system, graphics support, and other elements. At least in theory, that will allow developers to port apps from the desktop/tablet OS to the smartphone OS with relatively little work. The two operating systems already share the same design aesthetic, with Start screens composed of colorful tiles linked to applications. Gates also praised natural user interfaces — which include touch and voice — while taking a subtle dig at Apple's iPad and other tablets on the market. 'People want to consume their mail, reading, video anywhere, and they want it to be awfully simple,' he said. 'But you want to incorporate touch without giving up the kind of mouse, keyboard capability that's just so natural in most settings.'"
An anonymous reader writes "Due to low electricity prices in the Midwest, and an inability to find a buyer for the power station, Dominion will be shutting down and decommissioning Kewaunee Nuclear Power Station. One of two operating nuclear power stations in Wisconsin, Kewaunee's license from the NRC was not due to expire until the end of 2033."
First time accepted submitter icepick3000 writes "There are probably many unused digital photo frames lying around these days. Mine is from the first generation, meaning you can only insert a CompactFlash card and display photos. Newer models can display weather, news, and stocks. Does anyone have good ideas for giving these old frames a second life? I have been thinking about CompactFlash cards that support WiFi... maybe someone has a better idea?"
Hugh Pickens writes "Jean-Louis Gassée says Apple and Samsung are engaged in a knives-out smartphone war. But when it comes to chips, the two companies must pretend to be civil because Samsung is the sole supplier of ARM-based processors for the iPhone. So why hasn't Intel jumped at the chance to become Apple's ARM source? 'The first explanation is architectural disdain,' writes Gassée. 'Intel sees "no future for ARM," it's a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors.' Next is pride. Intel would have to accept Apple's design and 'pour' it into silicon — it would become a lowly 'merchant foundry.' Intel knows how to design and manufacture standard parts, but it has little experience manufacturing other people's custom designs or pricing them. But the most likely answer to the Why-Not-Intel question is money. Intel meticulously tunes the price points for its processors to generate the revenue that will fund development. Intel's published prices range from a 'low' $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Compare this to iSuppli's estimate for the cost of the A6 processor: $17.50. Even if more A6 chips could be produced per wafer — an unproven assumption — Intel's revenue per A6 wafer start would be much lower than with their x86 microprocessors. In Intel's perception of reality, this would destroy the business model. 'For all of Intel's semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they're inherently more complicated than legacy-free ARM devices; they require more transistors, more silicon. Intel will argue, rightly, that they'll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?'"
An anonymous reader sends this quote from Wired: "Affordable 3-D printers and CNC mills are popping up everywhere, opening up new worlds of production to wide ranges of designers. However, one major tool still hasn’t received a DIY overhaul: the laser cutter. Maybe people are sensitive because Goldfinger tried to cut James Bond in half with one, but all that changes now with Patrick Hood-Daniel’s new Kickstarter, 'Build Your Own Laser Cutter.' ... A 40-watt laser tube and power supply means it can cut a variety of materials: wood, plastic, fabric, and paper. ... There is one major red flag, however. The machine’s frame is built from Medium Density Overlay (MDO) — a type of plywood. Hood-Daniel says this is a feature, making the blackTooth less sensitive to thermal distortion and inaccuracy than a metal frame, but it also creates a serious, fire-breathing concern. ... When asked for comment, Hood-Daniel says 'Initially, I had the same thoughts as to the precarious use of wood for the structure, but even with long burns to the structure which were made on accident when starting a run, there was no ignition.'"
An anonymous reader writes "Samsung has decided to terminate an ongoing contract with Apple to supply LCD panels for use in its growing range of devices. That means, come next year, there will be no Samsung panels used across the iPad, iPod, iPhone, and Mac range of devices. The reason seems to be two-fold. On the one hand, Apple has been working hard to secure supplies from other manufacturers and therefore decrease its reliance on Samsung. On the other, Apple is well-known for demanding and pushing lower pricing, meaning it just doesn't make business sense anymore for Samsung to keep supplying Apple with displays."
paroneayea writes "MediaGoblin and LulzBot have teamed up to bring 3-D model support to MediaGoblin! The announcement shows off a live demo of the new feature... it uses Blender on the backend to render stills and thingiview.js to show realtime WebGL previews. This means MediaGoblin is becoming more useful for 3-D artists and people interested in 3-D printing, especially those looking for a free-as-in-freedom alternative to Thingiverse."
Nerval's Lobster writes "If the Apple rumor mill proves correct, the unveiling of the iPad Mini this week could mean sayonara for the iPad 2. At least, that's the prediction of Evercore Partners analyst Rob Cihra, who wrote in a recent note to investors that he believes Apple will remove the iPad 2 from its lineup to make room for a smaller tablet. AppleInsider excerpted parts of Cihra's note on Oct. 19. Of course, that's just one analyst speculating about the future plans of a company known for playing things close to the proverbial vest: Apple's Oct. 23 event in California could feature all sorts of surprises. So what do we know about the iPad Mini? First, that it might not be called the iPad Mini — that's a moniker dreamed up by the press. Second, a cheaper and smaller iPad could impact the market for e-readers and 'price-sensitive users,' according to J.P. Morgan analyst Mark Moskowitz, which in turn could mean a challenging future for Amazon, Google, and other IT vendors marketing cheaper tablets. Third, the media—driven by unnamed sources and blurry spy photos—seems to have collectively settled on a 7.85-inch screen without a high-resolution Retina Display."
alphadogg writes "Motorola Solutions has unveiled a head-mounted, voice-controlled computer that's targeted at the military and other industries where workers need hands-free access to information. Called the HC1, the device runs on an ARM processor and has an optional camera to send back real-time video over a wireless network. Unlike Google Glass, though, the HC1 is aimed at the enterprise market, with a price tag of $4,000-$5,000 per unit. Areas the company has been experimenting with include 'high-end repair markets,' such as aircraft engines, said Paul Steinberg, CTO of Motorola Solutions (which is the part of Motorola Google did not acquire). 'Emergency medical personnel at trauma centers might be looking at this too.' The HC1 will augment what users see by providing additional data, he said. Multiple units could be networked together and share information. Video here.
Google's new ARM-powered Chromebook isn't a lot of things: it isn't a full-fledged laptop; it's not a tablet (it doesn't even have a touch screen); and by design it's not very good as a stand-alone device. Eric Lai at ZDNet, though, thinks Chromebooks are (with the price drop that accompanies the newest version) a good fit for business customers, at least "for white-collar employees and other workers who rarely stray away from their corporate campus and its Wi-Fi network." Lai lists some interesting large-scale rollouts with Chromebooks, including 19,000 of them in a South Carolina school district. Schools probably especially like the control that ChromeOS means for the laptops they administer. For those who'd like to have a more conventional but still lightweight ARM laptop, I wonder how quickly the ARM variant of Ubuntu will land on the new version. (Looks like I'm not the only one to leap to that thought.)
acer123 writes "Lately I have replaced several home wireless routers because the signal strength has been found to be degraded. These devices, when new (2+ years ago), would cover an entire house. Over the years, the strength seems to decrease to a point where it might only cover one or two rooms. Of the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that over time, the signal strength decreases. I know that routers are cheap and easy to replace, but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or no signal. I am not an electrical engineer and I can't find the answer online, so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"
mikejuk writes "After six years in the making, the Arduino Due is finally becoming available and, with a price tag of $49, is bound to give a boost to the platform. The Due, which means 2 in Italian and is pronounced 'doo-eh', replaces the 8-bit, 16MHz Uno with a 32-bit, 84MHz processor board that also has a range of new features — more memory, a USB port that allows it to emulate, say, a mouse or a keyboard, 54 I/O pins, and so on — but what lets you do more with it is its speed and power. The heart of the new Arduino Due is the Atmel SAM3X8E, an ARM Cortex-M3-based processor, which gives it a huge boost in ADC performance, opening up possibilities for designers. The theoretical sampling rate has gone from the 15 ksps (kilosamples per second) of the existing boards, the Arduino Uno, Leonardo, and Mega 2560, to a whopping 1,000 ksps. What this all means is that the Due can be used for much more sophisticated applications. It can even play back WAV files without any help. Look out for the Due in projects that once would have needed something more like a desktop machine."
colinneagle writes "It's a darned shame, but the writing is on the wall for AMD. The ATI graphics business is the only thing keeping it afloat right now as sales shrivel up and the company faces yet another round of staffing cuts. You can only cut so many times before there's no one left to innovate you out of the mess you're in. Qualcomm, on the other hand, dominates this space, and it has the chips to back it up. The Snapdragon line of ARM-based processors alone is found in a ridiculous number of prominent devices, including Samsung Galaxy S II and S III, Nokia Lumia 900 and 920, Asus Transformer Pad Infinity and the Samsung Galaxy Note. Mind you, Samsung is also in the ARM processor business, yet it is licensing Qualcomm's parts. That's quite a statement."
1sockchuck writes "As Google showed the world its data centers this week, it disclosed one of its best-kept secrets: how it cools its custom servers in high-density racks. All the magic happens in enclosed hot aisles, including supercomputer-style steel tubing that transports water — sometimes within inches of the servers. How many of those servers are there? Google has deployed at least 1 million servers, according to Wired, which got a look inside the company's North Carolina data center. The disclosures accompany a gallery of striking photos by architecture photographer Connie Zhou, who discusses the experience and her approach to the unique assignment."
An anonymous reader writes "ACM Queue interviews Cambridge researcher (and FreeBSD developer) Robert Watson on why processor designs need to change in order to better support security features like Capsicum — and how they change all the time (RISC, GPUs, etc). He also talks about the challenge of building a research team at Cambridge that could actually work with all levels of the stack: CPU design, operating systems, compilers, applications, and formal methods. The DARPA-sponsored SRI and Cambridge CTSRD project is building a new open source processor that can support orders of magnitude greater sandboxing than current designs."
another random user writes "Stanford Ovshinsky, a self-taught American physicist who designed the battery now used in hybrid cars, has died of prostate cancer, aged 89. The electronics field of ovonics was named after Mr Ovshinsky, who held over 200 patents and has been described as a '[Thomas] Edison of our age.' He introduced the idea of 'glass transistors' in 1968, which paved the way for modern flat-screen monitors."
SchrodingerZ writes "The Society of Automotive Engineers (SAE), an international standards organization, has unveiled what is to become the standard for electric car charging. In today's market there are hundreds of different methods and plugs for charging a variety of different cars; now a single multi-use plug has been announced as the world standard. Called the J1772, it 'has two charging plugs incorporated into a single design and is said to reduce charging times from as long as eight hours to as little as 20 minutes.' The cumulative work of over 190 'global experts,' the plug can cater to both AC and DC currents for charging. The plug also sets a new standard on safety regulations, including 'its ability to be safely used in all weather conditions, and the fact that its connections are never live unless commanded by the car during charging.' The J1772 beat out its Japanese competitor, the CHAdeMO, used as an option on the Nissan Leaf."