Nerval's Lobster writes "If the Apple rumor mill proves correct, the unveiling of the iPad Mini this week could mean sayonara for the iPad 2. At least, that's the prediction of Evercore Partners analyst Rob Cihra, who wrote in a recent note to investors that he believes Apple will remove the iPad 2 from its lineup to make room for a smaller tablet. AppleInsider excerpted parts of Cihra's note Oct. 19. Of course, that's just one analyst speculating about the future plans of a company known for playing things close to the proverbial vest: Apple's Oct. 23 event in California could feature all sorts of surprises. So what do we know about the iPad Mini? First, that it might not be called the iPad Mini — that's a moniker dreamed up by the press. Second, a cheaper and smaller iPad could impact the market for e-readers and 'price-sensitive users,' according to J.P. Morgan analyst Mark Moskowitz, which in turn could mean a challenging future for Amazon, Google, and other IT vendors marketing cheaper tablets. Third, the media—driven by unnamed sources and blurry spy photos—seems to have collectively settled on a 7.85-inch screen without a high-resolution Retina Display."
alphadogg writes "Motorola Solutions has unveiled a head-mounted, voice-controlled computer that's targeted at the military and other industries where workers need hands-free access to information. Called the HC1, the device runs on an ARM processor and has an optional camera to send back real-time video over a wireless network. Unlike Google's Project Glass, though, the HC1 is aimed at the enterprise market with a price tag of $4,000-$5,000 per unit. Areas the company has been experimenting with include 'high-end repair markets,' such as aircraft engines, said Paul Steinberg, CTO of Motorola Solutions (which is the part of Motorola Google did not acquire). 'Emergency medical personnel at trauma centers might be looking at this too.' The HC1 will augment what users see by providing additional data, he said. Multiple units could be networked together and share information. Video here."
Google's new ARM-powered Chromebook isn't a lot of things: it isn't a full-fledged laptop, it's not a tablet (it doesn't even have a touch screen), and by design it's not very good as a stand-alone device. Eric Lai at ZDNet, though, thinks Chromebooks are (with the price drop that accompanies the newest version) a good fit for business customers, at least "for white-collar employees and other workers who rarely stray away from their corporate campus and its Wi-Fi network." Lai lists some interesting large-scale rollouts with Chromebooks, including 19,000 of them in a South Carolina school district. Schools probably especially like the control that ChromeOS means for the laptops they administer. For those who'd like to have a more conventional but still lightweight ARM laptop, I wonder how quickly the ARM variant of Ubuntu will land on the new version. (Looks like I'm not the only one to leap to that thought.)
acer123 writes "Lately I have replaced several home wireless routers because their signal strength had degraded. These devices, when new (2+ years ago), would cover an entire house. Over the years, the strength seems to decrease to the point where it might only cover one or two rooms. Among the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that, over time, the signal strength decreases. I know that routers are cheap and easy to replace, but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or no signal. I am not an electrical engineer and I can't find the answer online, so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"
mikejuk writes "After six years in the making, the Arduino Due is finally becoming available and, with a price tag of $49, is bound to give the platform a boost. The Due, which means 'two' in Italian and is pronounced 'doo-eh', replaces the 8-bit, 16MHz Uno with a 32-bit, 84MHz processor board that also brings a range of new features — more memory, a USB port that lets it act as, say, a mouse or a keyboard, 54 I/O pins, and so on — but what lets you do more with it is its speed and power. The heart of the new Arduino Due is the Atmel SAM3X8E, an ARM Cortex-M3-based processor, which gives it a huge boost in ADC performance, opening up possibilities for designers. The theoretical sampling rate has gone from the 15 ksps (kilosamples per second) of the existing boards — the Arduino Uno, Leonardo, and Mega 2560 — to a whopping 1,000 ksps. What this all means is that the Due can be used for much more sophisticated applications. It can even play back WAV files without any help. Look out for the Due in projects that once would have needed something more like a desktop machine."
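As a back-of-the-envelope illustration (in Python; only the ksps figures come from the story), the Nyquist limit shows why that sampling-rate jump matters for audio work:

```python
# Rough Nyquist-rate comparison using the sampling figures quoted above:
# 15 ksps for the Uno-class boards, 1,000 ksps for the Due. Per Nyquist,
# the highest frequency an ADC can faithfully capture is half its rate.

def max_signal_hz(samples_per_second: float) -> float:
    """Nyquist limit: highest representable frequency for a given sample rate."""
    return samples_per_second / 2.0

print(max_signal_hz(15_000))      # 7500.0 Hz -- below even telephone-quality audio headroom
print(max_signal_hz(1_000_000))   # 500000.0 Hz -- far beyond the ~22 kHz needed for CD-quality WAV
```

This is only about capture; playback on the Due goes through its DACs, but the same order-of-magnitude argument applies.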
colinneagle writes "It's a darned shame, but the writing is on the wall for AMD. The ATI graphics business is the only thing keeping it afloat right now as sales shrivel up and the company faces yet another round of staffing cuts. You can only cut so many times before there's no one left to innovate you out of the mess you're in. Qualcomm, on the other hand, dominates this space, and it has the chips to back it up. The Snapdragon line of ARM-based processors alone is found in a ridiculous number of prominent devices, including Samsung Galaxy S II and S III, Nokia Lumia 900 and 920, Asus Transformer Pad Infinity and the Samsung Galaxy Note. Mind you, Samsung is also in the ARM processor business, yet it is licensing Qualcomm's parts. That's quite a statement."
1sockchuck writes "As Google showed the world its data centers this week, it disclosed one of its best-kept secrets: how it cools its custom servers in high-density racks. All the magic happens in enclosed hot aisles, including supercomputer-style steel tubing that transports water — sometimes within inches of the servers. How many of those servers are there? Google has deployed at least 1 million servers, according to Wired, which got a look inside the company's North Carolina data center. The disclosures accompany a gallery of striking photos by architecture photographer Connie Zhou, who discusses the experience and her approach to the unique assignment."
An anonymous reader writes "ACM Queue interviews Cambridge researcher (and FreeBSD developer) Robert Watson on why processor designs need to change in order to better support security features like Capsicum — and how they change all the time (RISC, GPUs, etc.). He also talks about the challenge of building a research team at Cambridge that could actually work with all levels of the stack: CPU design, operating systems, compilers, applications, and formal methods. The DARPA-sponsored SRI and Cambridge CTSRD project is building a new open source processor that can support orders of magnitude greater sandboxing than current designs."
another random user writes "Stanford Ovshinsky, a self-taught American physicist who designed the battery now used in hybrid cars, has died of prostate cancer at age 89. The electronics field of ovonics was named after Mr. Ovshinsky, who held over 200 patents and has been described as a '[Thomas] Edison of our age.' He introduced the idea of 'glass transistors' in 1968, which paved the way for modern flat-screen monitors."
SchrodingerZ writes "The Society of Automotive Engineers (SAE), an international standards organization, has unveiled what is to become the standard for electric car charging. In today's market there are hundreds of different methods and plugs for charging a variety of different cars; now a single multi-use plug has been announced as the world standard. Called the J1772, it 'has two charging plugs incorporated into a single design and is said to reduce charging times from as long as eight hours to as little as 20 minutes.' The cumulative work of over 190 'global experts,' the plug can handle both AC and DC current for charging. The plug also sets a new standard for safety, including 'its ability to be safely used in all weather conditions, and the fact that its connections are never live unless commanded by the car during charging.' The J1772 beat out its Japanese competitor, the CHAdeMO, used as an option on the Nissan Leaf."
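A quick worked example of what cutting an eight-hour charge to 20 minutes implies for charging power (the 24 kWh pack capacity below is an assumed, Leaf-typical figure, not one from the announcement):

```python
# Energy delivered = power x time, so the same pack charged 24x faster
# needs roughly 24x the power. Pack size is an illustrative assumption.

PACK_KWH = 24.0            # assumed battery capacity (typical of a Nissan Leaf of this era)

slow_hours = 8.0           # "as long as eight hours" (AC charging)
fast_minutes = 20.0        # "as little as 20 minutes" (DC fast charging)

slow_kw = PACK_KWH / slow_hours
fast_kw = PACK_KWH * 60.0 / fast_minutes

print(slow_kw)   # 3.0 kW -- household-outlet territory
print(fast_kw)   # 72.0 kW -- needs dedicated DC fast-charging hardware
```

That 24x power jump is why the DC side of the connector matters: no home circuit delivers 72 kW.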
An anonymous reader writes "Cyberdyne announced today an improved version of the HAL (Hybrid Assistive Limb) robotic exoskeleton at the Japan Robot Show. From the article: 'The latest version of the HAL remains brain-controlled but has evolved into a full-body robot suit that protects against heavy radiation without the wearer feeling the suit's weight. Eventually it could be used by workers dismantling the crippled Fukushima nuclear plant.'"
Nerval's Lobster writes "Google is whipping the proverbial curtain back from its new Chromebook, which will retail for $249 and up. The Samsung-built device weighs 2.5 pounds and features an 11.6-inch screen (with 1366 x 768 resolution), backed by a 1.75GHz Samsung Exynos 5 Dual processor. Google claims it will boot up in under 10 seconds and, depending on usage, last for 6.5 hours on one battery charge. From a product perspective, Chrome OS and its associated hardware have found themselves fighting a two-front battle: the first against Windows PCs and Macs, both of which could claim more robust hardware for a similar cost to the old Chromebooks (which started at $449), and the second against tablets, which offered the same degree of flexibility and connectivity for a cheaper sticker price. By setting the cost of the new Chromebook at $249, Google continues that pricing skirmish on more favorable terms." CNET got a bit of hands-on time with the new kid, and gives it a lukewarm but positive reception.
Zothecula writes "Video game developers are always looking for new ways to give players a more immersive experience. But with several motion-controlled systems widely available and a viable virtual reality headset in the works, what else could be done to make games seem more realistic? Sony may have an unexpected answer with a recent patent that describes a controller that changes temperature between hot and cold to match in-game actions. With the controller giving 'temperature feedback,' the idea is that players would be able to more closely feel what their character feels, from getting hit with a fireball to traveling through a blizzard."
An anonymous reader writes "Boxee has announced the game-changing Boxee TV, offering live streaming TV via two on-board tuners and an industry-first 'No Limit' DVR service that allows users to record as much TV content as they want and access it from virtually anywhere. The problem is that the unit, which records directly to the cloud, does not allow recording to a local drive, meaning users are stuck paying Boxee for as long as they want access to their stored content — potentially hundreds or thousands of hours — to the tune of $14.99 per month, until Boxee ups the ante. CEPro.com suggests, 'I suspect Boxee is offering unlimited storage to make users especially beholden to them. The more content you have, the less likely you are to drop the service.'"
cylonlover writes "A trip on public transport or to the local coffee shop might give the impression that touchscreens are everywhere, but scientists at Autodesk Research, the University of Alberta, and the University of Toronto are looking to take the ubiquity of touch interfaces to the next level. They are developing a 'Magic Finger' that allows any surface to detect touch input by shifting the touch technology from the surface to the wearer's finger. The proof-of-concept prototype is made up of a little Velcro ring that straps to the wearer's fingertip, with a trail of wires leading to a box of electronics. On the ring is a pair of optical sensors: one a low-resolution, high-speed sensor for tracking movement, the other a high-resolution camera able to detect 32 different surface textures with 98 percent accuracy."
First time accepted submitter kfsone writes "I've experienced, first-hand, some of the ways in which spindle disks die, but either I've yet to see an SSD die or I'm not looking in the right places. Most of my admin-type friends have theories on how an SSD dies, but admit none of them has actually seen commercial-grade drives die or deteriorate. In particular, the failure process seems like it should be more clinical than with spindle drives. If you have X of the same SSD model, none of them suffering manufacturing defects, and you repeat the same series of operations on them, they should all die around the same time. If that's correct, then what happens to SSDs in RAID? Either all your drives will start to fail together, or at some point your drives will become out of sync in terms of volume sizing. So, have you had to deliberately EOL corporate-grade SSDs? Do they die with dignity or go out with a bang?"
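The submitter's worry can be sketched as a toy wear model (the program/erase budget and write rates below are illustrative assumptions, not specs for any real drive):

```python
# Toy model of identical SSDs fed identical writes: they exhaust the same
# program/erase budget in lockstep, so a mirrored pair could lose both
# members at once. All numbers here are illustrative assumptions.

ERASE_BUDGET = 3_000   # assumed P/E cycles per cell (an MLC-era ballpark)

def days_until_worn(daily_cycles: float, budget: int = ERASE_BUDGET) -> float:
    """Predicted days until the cycle budget is exhausted at a steady write rate."""
    return budget / daily_cycles

# Mirrored pair receiving identical writes: identical predicted lifetimes.
drive_a = days_until_worn(2.0)
drive_b = days_until_worn(2.0)
print(drive_a == drive_b)   # True -- both predicted to wear out the same day

# One commonly suggested mitigation: stagger deployment (or rotate spares)
# so members accumulate wear at different points in time.
staggered = days_until_worn(2.0) - 90   # a member deployed 90 days earlier
print(drive_a - staggered)              # 90.0 -- predicted failures now spread apart
```

In practice, per-drive variation in spare area and wear-leveling makes real failures less synchronized than this, but the lockstep concern is why some shops mix batches or vendors within an array.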
An anonymous reader writes "Reiser4 still hasn't been merged into the mainline Linux kernel, but a small group of developers has continued working on it since Hans Reiser was convicted of murdering his wife. Reiser4 was updated in September on SourceForge to work with the Linux 3.5 kernel and has been benchmarked against EXT4, Btrfs, XFS, and ReiserFS. Reiser4 loses out in most of the Linux file-system performance tests, carries much stigma due to Hans Reiser, and Btrfs is surpassing it feature-wise, so does it have any future in Linux?"
mikejuk writes "Long before the current crop of MOOCs (Massive Open Online Courses), there was a course that taught you all you needed to know about computers by starting from the NAND gate and working its way up through the logic circuits needed for a computer, on to an assembler, a compiler, an operating system, and finally Tetris. Recently one of the creators of the course, Shimon Schocken, gave a TED talk explaining how it all happened and why it is still relevant today. Once you have seen what is on offer at http://www.nand2tetris.org/, you will probably decide that it is not only still relevant but the only way to really understand what computers are all about."
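The course's starting point, the functional completeness of NAND, is easy to sketch in Python (the course itself uses its own HDL; this is just an illustration of the idea):

```python
# NAND is functionally complete: every other logic gate can be built
# from it alone, which is why the course can start there and build up.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor_(a, b):
    # The classic four-NAND XOR construction.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Truth-table check for XOR over all four input pairs:
print([xor_(a, b) for a in (0, 1) for b in (0, 1)])   # [0, 1, 1, 0]
```

From gates like these the course builds adders, an ALU, registers, and eventually a full CPU.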
Nerval's Lobster writes "A team of researchers from Microsoft and Cornell University has concluded that, in some cases, a totally wireless data center makes logistical sense. In a new paper, the team concludes that a data-center operator could replace hundreds of feet of cable with 60-GHz wireless connections—assuming that the servers themselves are redesigned in cylindrical racks, shaped like prisms, with blade servers handling both intra- and inter-rack connections. The so-called 'Cayley' data centers, named because their network-connectivity subgraphs are modeled using Cayley graphs, could be cheaper than traditional wired data centers if the cost of a 60-GHz transceiver drops under $90 apiece, and would likely consume about one-tenth to one-twelfth the power of a wired data center."
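The Cayley-graph idea behind the name can be sketched in a few lines of Python. The cyclic group and generator set here are illustrative stand-ins, not the construction from the paper:

```python
# A Cayley graph takes a group as its vertex set and connects each element
# v to v*g for every generator g. Here the group is the integers mod n
# (think of them as positions around a ring of racks), so the generator
# set directly controls which "racks" get a direct wireless link.

def cayley_graph(n: int, generators: set) -> dict:
    """Cayley graph of the cyclic group Z_n with a symmetrized generator set."""
    gens = generators | {(-g) % n for g in generators}   # add inverses: undirected edges
    return {v: {(v + g) % n for g in gens} for v in range(n)}

ring = cayley_graph(8, {1})        # generators {1, 7}: a plain 8-cycle
chordal = cayley_graph(8, {1, 3})  # extra generators add chords, shortening hop counts

print(sorted(ring[0]))      # [1, 7]
print(sorted(chordal[0]))   # [1, 3, 5, 7]
```

The appeal for a data center is that Cayley graphs are vertex-transitive: every node sees the same local topology, so one radio layout works for every rack position.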
sfcrazy writes "Good (and bad) news for Raspberry Pi lovers: the Model B has been upgraded from 256MB to 512MB of RAM. The bad news is for those who already received their Model B shipments, because everyone with an outstanding order at either distributor will get the *upgraded* version of the device, with 512MB of RAM instead of 256MB. The upgraded devices should be arriving to customers from today onwards. The Raspberry Pi team will be pushing a firmware upgrade soon so these new devices can detect and use the additional RAM."