Samsung dropped 3D support in 2016, and now LG and Sony -- the last two major TV makers to offer the 3D feature in their TVs -- will stop doing so in 2017. None of their 2017 TVs, including the high-end OLED models, will be able to show 3D movies and TV shows. As a result, 3D TV is dead. The question is no longer when (or even why) 3D TV will become obsolete; it's whether 3D TV will ever rise again. CNET reports: The 3D feature has been offered on select televisions since 2010, when the theatrical success of "Avatar" in 3D helped encourage renewed interest in the technology. In addition to a 3D-capable TV, it requires specialized glasses for each viewer and the 3D version of a TV show or movie -- although some TVs also offer a simulated 3D effect mode. Despite enthusiasm at the box office and years of 3D TVs being available at affordable prices, the technology never really caught on at home. DirecTV canceled its 24/7 3D channel in 2012, and ESPN followed suit a year later. Plenty of 3D Blu-ray discs are still being released, such as "Star Wars: The Force Awakens," but if you want to watch them at home you'll need a TV from 2016 or earlier -- or a home theater projector. The market trends are clear: sales of 3D home video gear have declined every year since 2012. According to data from the NPD Group, 3D TV represented just 8 percent of total TV sales dollars for the full year of 2016, down from 16 percent in 2015 and 23 percent in 2012. Native 3D-capable Blu-ray players fell to just 11 percent of the market in 2016, compared to 25 percent in 2015 and 40 percent in 2012. As for whether 3D TVs will ever become popular again, CNET's David Katzmaier writes, based on his own "anecdotal experience as a TV reviewer": Over the years, the one thing most people told me about the 3D feature on their televisions was that they never used it.
Sure, some people occasionally enjoyed a 3D movie on Blu-ray, but the majority of people I talked to tried it once or twice, maybe, then never picked up the glasses again. I don't think most viewers will miss 3D. I have never awarded points in my reviews for the feature, and 3D performance (which I stopped testing in 2016) has never figured into my ratings. I've had a 3D TV at home since 2011 and I've only used the feature a couple of times, mainly in brief demos to friends and family. Over the 2016 holiday break I offered my family the choice to watch "The Force Awakens" in 2D or 3D, and (after I reminded everyone they had to wear the glasses) 2D was the unanimous choice. But some viewers will be sad to see the feature go. There's even a change.org petition for LG to bring back the feature, which currently stands at 3,981 supporters. Of course 3D TV could come back to life, but I'd be surprised if it happened before TV makers perfect a way to watch it without glasses.
mspohr writes: The Economist has an interesting story about two neuroscientists/engineers -- Eric Jonas of the University of California, Berkeley, and Konrad Kording of Northwestern University, in Chicago -- who decided to test the methods of neuroscience on a 6502 processor. Their results are published in the journal PLOS Computational Biology. Neuroscientists explore how the brain works by studying damaged brains and monitoring inputs and outputs to try to infer the intermediate processing. Jonas and Kording did the same with the 6502, the processor used in early Atari, Apple and Commodore computers. What they discovered was that these methods were sorely lacking: they often pointed in the wrong direction and missed important processing steps.
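The core method the study borrowed from neuroscience is the "lesion" experiment: disable one element at a time (in the paper, individual transistors while the chip ran games) and record which behaviors break. A toy sketch of that procedure, using a made-up three-stage pipeline in place of the chip's circuitry, is below. Note the paper's caution: a stage whose removal breaks the output is not necessarily "the stage that computes" that output.

```python
def run_system(lesioned=None):
    """A made-up three-stage pipeline standing in for the chip's circuitry."""
    stages = {"a": lambda x: x + 1, "b": lambda x: x * 2, "c": lambda x: x - 3}
    x = 5
    for name in ("a", "b", "c"):
        if name != lesioned:   # skip the "lesioned" element
            x = stages[name](x)
    return x

baseline = run_system()
# Knock out one stage at a time and see which lesions change the output.
effects = {part: run_system(lesioned=part) != baseline for part in "abc"}
```

Here every lesion changes the output, which tells you each stage matters but says little about what each one actually does -- the gap the authors found in neuroscience's toolkit.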
An anonymous reader quotes a report from BBC: The Scottish government has outlined a new target of reducing greenhouse gas emissions by 66% by 2032. Climate Change Secretary Roseanna Cunningham set out the government's draft climate change plan for the next 15 years at Holyrood. She also targeted a fully decarbonized electricity sector and 80% of domestic heat coming from low-carbon sources. Ministers committed last year to cut harmful CO2 emissions by 80% by 2050, with a new interim target of 50% by 2020. The previous interim target of 42% was met in 2014 -- six years early. However, the independent Committee on Climate Change said the decrease was largely down to a warmer-than-average winter reducing the demand for heating. Ms Cunningham said the new targets demonstrated "a new level of ambition" to build a low-carbon economy and a healthier Scotland. Goals to be achieved by 2032 include: cutting greenhouse gas emissions by 66%; a fully decarbonized electricity sector; 80% of domestic heat coming from low-carbon heat technologies; the proportion of ultra-low-emission new cars and vans registered in Scotland annually reaching 40%; 250,000 hectares of degraded peatland restored; and an annual woodland creation target of at least 15,000 hectares per year. The 172-page document sets out a road map for decarbonizing Scotland. The aim -- although not new -- is to reduce greenhouse gas emissions by two thirds by 2032. Among the policies are making half of Scotland's buses low-carbon, fully decarbonizing the electricity sector and heating 80% of homes with low-carbon technologies.
In an effort to improve air quality, the Chinese government has canceled more than 100 coal-fired power plants in 11 provinces -- totaling a combined installed capacity of more than 100 gigawatts. Reuters reports: In a document issued on Jan. 14, financial media group Caixin reported, the National Energy Administration (NEA) suspended the coal projects, some of which were already under construction. The projects, worth some 430 billion yuan ($62 billion), were to have been spread across provinces and autonomous regions including Xinjiang, Inner Mongolia, Shanxi, Gansu, Ningxia, Qinghai, Shaanxi and other northwestern areas. Putting the power projects on hold is a major step in the government's effort to produce power from renewable sources such as solar and wind and to wean the country off coal, which accounts for the majority of the nation's power supply. To put it in perspective, some 130 GW of additional solar and wind power will be installed by 2020, equal to France's total renewable power generation capacity, said Frank Yu, principal consultant at Wood Mackenzie. "This shows the government is keeping its promise in curbing supplies of coal power," Yu said. Some of the projects will still go ahead, but not until 2025, and they will likely replace outdated technology, he said.
randomErr writes: About 5.7 million adults in the United States have heart failure, with about 41 million affected worldwide. Currently, treatment involves surgically implanting a mechanical pump, called a ventricular assist device (VAD), into the heart. The VAD helps maintain the heart's function, but patients with VADs are at high risk of developing blood clots and having a stroke. Researchers at Harvard University and Boston Children's Hospital have created a soft robotic sleeve that doesn't have to be implanted. The robotic sleeve slips around the outside of the heart, squeezing it in sync with its natural rhythm. "This work represents an exciting proof of concept result for this soft robot, demonstrating that it can safely interact with soft tissue and lead to improvements in cardiac function," Conor Walsh said in a press statement. Seeker reports: "The sleeve they developed is made from thin silicone and attaches to the outside of the heart with a combination of suction devices and sutures. It relies on soft, air-powered actuators that twist and compress in a way that's similar to the outer layer of muscle of a human heart. A gel coating reduces any friction between the sleeve and the organ. Because the sleeve is soft and flexible, it can be customized not just to fit the size and shape of individual hearts, but also to augment the organ's weaknesses. For example, if a patient's heart is weaker on the left side than the right, the sleeve can be tuned to squeeze with more authority on the left side. As the organ gains strength, the device can be adjusted." The study has been published in the journal Science Translational Medicine.
wiredmikey writes: Security researchers have uncovered Mac OS-based espionage malware they have named "Quimitchin." The malware is what they consider to be "the first Mac malware of 2017," and it appears to be a classic espionage tool. While it uses some old code and appears to have existed undetected for some time, it works. It was discovered when an IT admin noticed unusual traffic coming from a particular Mac, and it has been seen infecting Macs at biomedical facilities. From SecurityWeek.com: "Quimitchin comprises just two files: a .plist file that simply keeps the .client running at all times, and the .client file containing the payload. The latter is a 'minified and obfuscated' perl script that is more novel in design. It combines three components, Thomas Reed, director of Mac offerings at Malwarebytes and author of the blog post, told SecurityWeek: 'a Mac binary, another perl script and a Java class tacked on at the end in the __DATA__ section of the main perl script. The script extracts these, writes them to /tmp/ and executes them.' Its primary purpose seems to be screen captures and webcam access, making it a classic espionage tool. Somewhat surprisingly, the code uses antique system calls. 'These are some truly ancient functions, as far as the tech world is concerned, dating back to pre-OS X days,' he wrote in the blog post. 'In addition, the binary also includes the open source libjpeg code, which was last updated in 1998.' The script also contains Linux shell commands. Running the malware on a Linux machine, Malwarebytes 'found that -- with the exception of the Mach-O binary -- everything ran just fine.' It is possible that there is a specific Linux variant of the malware in existence, but the researchers have not been able to find one. They did find two Windows executable files, courtesy of VirusTotal, that communicated with the same C&C server.
One of them even used the same libjpeg library, which hasn't been updated since 1998, as that used by Quimitchin."
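The dropper layout described above -- a script carrying extra components appended after its __DATA__ marker, which it splits out, writes to a temp directory and executes -- is a generic self-extracting trick (perl treats everything after __DATA__ as raw data). A hypothetical Python sketch of the extraction step is below; the delimiter, file names and layout are invented for illustration and are not Quimitchin's actual format, and the execution step is deliberately omitted.

```python
# Sketch of a __DATA__-style self-extracting carrier (illustrative only).
import os
import tempfile

MARKER = b"__DATA__\n"

def extract_embedded(script_path):
    """Split everything after MARKER into separate files in a temp dir."""
    with open(script_path, "rb") as f:
        blob = f.read()
    _, _, payload = blob.partition(MARKER)   # bytes after the marker
    tmpdir = tempfile.mkdtemp()
    out_paths = []
    # Components are assumed to be separated by a delimiter line (invented).
    for i, chunk in enumerate(payload.split(b"\n---8<---\n")):
        path = os.path.join(tmpdir, "component_%d" % i)
        with open(path, "wb") as out:
            out.write(chunk)
        out_paths.append(path)
    return out_paths
```

The accompanying .plist plays the persistence role: a launchd job description whose only task is to restart the .client whenever it exits.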
The latest numbers released by analysts suggest that the Sony PlayStation 4 has sold roughly twice as many units worldwide as the Xbox One since both systems launched in late 2013. The data comes from a new SuperData report on the Nintendo Switch, and it is backed up by Niko Partners analyst Daniel Ahmad. SuperData mentions an installed base of 26 million Xbox One units and 55 million PS4 units. Ars Technica reports: Ahmad's chart suggests that Microsoft may have sold slightly more than half of the 53.4 million PS4 units that Sony recently announced it had sold through January 1. Specific numbers aside, though, it's clear Microsoft has done little to close its console sales gap with Sony over the past year -- and may have actually lost ground in that time. The last time we did our own estimate of worldwide console sales, through the end of 2015, we showed the Xbox One with about 57 percent as many systems sold as the PS4 (21.49 million vs. 37.7 million). That lines up broadly with numbers leaked by EA at the time, which suggested the Xbox One had sold about 52.9 percent as many units as the PS4 (19 million vs. 35.9 million). One year later, that ratio has dipped to just above or even a bit below 50 percent, according to these reports. The relative sales performance of the Xbox One and PS4 doesn't say anything direct about the health or quality of those platforms, of course. Microsoft doesn't seem to be in any danger of abandoning the Xbox One platform any time soon and has, in fact, recently committed to upgrading it via Project Scorpio later this year. The gap between PS4 and Xbox One sales becomes important only if it grows so big that publishers start to treat the Xbox One market as a minor afterthought that can be safely ignored for everything but niche games.
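The percentages quoted above follow directly from the unit figures; a quick check of the arithmetic:

```python
def ratio(xbox_millions, ps4_millions):
    """Xbox One installed base as a percentage of the PS4's."""
    return round(100 * xbox_millions / ps4_millions, 1)

ratio(21.49, 37.7)  # Ars estimate through end of 2015 -> 57.0
ratio(19.0, 35.9)   # EA's leaked figures -> 52.9
ratio(26.0, 55.0)   # SuperData installed-base numbers -> 47.3
```

The SuperData figures put the ratio at about 47 percent, matching the "a bit below 50 percent" reading above.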
Just weeks after the massive Gigafactory started producing batteries, Tesla has announced plans to hire more workers and use the facility to make the motor and gearbox for its upcoming Model 3 electric sedan. CNBC reports: Tesla will invest $350 million in the project and hire an additional 550 people, according to the governor's comments. That will be over and above the company's existing commitment to hiring 6,500 people at the Gigafactory, according to comments made by Steve Hill, the director of the governor's Office of Economic Development, to Nevada newspaper the Nevada Appeal. Tesla CEO Elon Musk has made manufacturing efficiency a high priority for the company, but Tesla will require a lot of factory floor to meet its goal of pumping out 500,000 cars by the end of 2018, and then one million cars by 2020. Meanwhile, the city of Fremont recently approved Tesla's application for an additional 4.6 million square feet of space there.
Thousands of Verizon customers continue to use the Galaxy Note 7 smartphone, the carrier said. This despite the widely publicized recalls spurred by battery fire concerns and a software upgrade designed to kill the phone by preventing it from recharging. From a report: "In spite of our best efforts, there are still customers using the recalled phones who have not returned or exchanged their Note 7 to the point of purchase," a Verizon spokeswoman said. "The recalled Note 7s pose a safety risk to our customers and those around them." So now Verizon is fighting fire with fire, so to speak. The carrier plans to reroute all non-911 outgoing calls to its customer service line, and it might bill the holdouts for the full retail cost of the phone.
Google is rolling out an update for its Android app that makes it easier to search the web on an inconsistent internet connection. Users can enter searches while offline and the Google app will store them, delivering the results later (with an optional notification) once the device gets a signal again. From a report: As Google product manager Shekhar Sharad writes in a blog post: "So the next time you lose service, feel free to queue up your searches, put your phone away and carry on with your day. The Google app will work behind-the-scenes to detect when a connection is available again and deliver your search results once completed."
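The behavior described -- store the query while offline, watch for connectivity, replay the searches, then notify -- is a plain queue-and-replay pattern. A minimal sketch is below; the connectivity check and search function are stand-in callables, not Google's actual app code.

```python
from collections import deque

class OfflineSearchQueue:
    """Queue searches made offline; replay them when a connection returns."""

    def __init__(self, is_online, run_search):
        self.is_online = is_online      # callable: is there a connection?
        self.run_search = run_search    # callable: execute one search
        self.pending = deque()

    def search(self, query):
        """Run immediately if online; otherwise store the query for later."""
        if self.is_online():
            return self.run_search(query)
        self.pending.append(query)
        return None

    def flush(self):
        """Call when connectivity is detected again: replay queued searches."""
        results = []
        while self.pending and self.is_online():
            results.append(self.run_search(self.pending.popleft()))
        return results
```

In the real app the flush would be triggered by a background connectivity listener, with the optional notification sent once results arrive.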
An anonymous reader quotes a report from Motherboard: NASA wants humans and robots to work together as teams. To that end, the space agency's intelligent robotics group is currently developing new technology to improve how humans explore the solar system, and how robots can help. When NASA began working with remotely operated robots several years ago, said Terry Fong, director of the group at NASA's Ames Research Center, its scientists needed a piece of software that would let them look at terrain and sensor data coming from autonomous robots. That led to the creation of VERVE, a "3D robot user interface" that allows scientists to see and grasp the three-dimensional world of remotely operated robots. VERVE has been used with NASA's K10 planetary rovers (prototype mobile robots that can travel over bumpy terrain), with its K-Rex planetary rovers (robots that measure soil moisture), with SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites) on the International Space Station (ISS), and with the new robot Astrobee (a robot that can fly around the ISS). In 2013, NASA carried out a series of tests with astronauts on the ISS, during which astronauts flying 200 miles above Earth remotely operated the K10 planetary rover in California. Because of the time delay, astronauts can't just "joystick a robot," said Maria Bualat, deputy lead of the intelligent robotics group at the NASA Ames Research Center. "You need a robot that can operate on its own, complete tasks on its own," she said. "On the other hand, you still want the human in the loop, because the human brings a lot of experience and very powerful cognitive ability that can deal with issues that the autonomy's not quite ready to handle." That's why, according to NASA, human capabilities and robotic capabilities make a powerful combination.
Qualcomm shares plunged after the U.S. Federal Trade Commission filed a lawsuit against the company on Tuesday, accusing it of using "anticompetitive" tactics to maintain its monopoly on a key semiconductor used in mobile phones. Reuters reports: The FTC, which works with the Justice Department to enforce antitrust law, said that San Diego-based Qualcomm used its dominant position as a supplier of certain phone chips to impose "onerous" supply and licensing terms on cellphone manufacturers and to weaken competitors. Qualcomm said in a statement that it would "vigorously contest" the complaint and denied FTC allegations that it threatened to withhold chips in order to collect unreasonable licensing fees. In its complaint, the FTC said the patents that Qualcomm sought to license are standard essential patents, which means that the industry uses them widely and they are supposed to be licensed on fair, reasonable and non-discriminatory terms. The FTC complaint also accused Qualcomm of refusing to license some standard essential patents to rival chipmakers, and of entering into an exclusive deal with Apple Inc. The FTC asked the U.S. District Court for the Northern District of California in San Jose to order Qualcomm to end these practices.
Mickeycaskill quotes a report from Silicon.co.uk: Accenture research has found that blockchain technology has the potential to reduce infrastructure costs by an average of 30 percent for eight of the world's ten biggest banks. That equates to annual cost savings of $8-12 billion. The findings of the "Banking on Blockchain: A Value Analysis for Investment Banks" report are based on an analysis of granular cost data from the eight banks to identify exactly where value could be achieved. A vast amount of cost for today's investment banks comes from complex data reconciliation and confirmation processes with their clients and counterparties, as banks maintain independent databases of transactions and customer information. Blockchain, however, would enable banks to move to a shared, distributed database that spans multiple organizations. It has become increasingly obvious in recent months that blockchain will be key to the future of the banking industry, with the majority of banks expected to adopt the technology within the next three years.
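The reconciliation cost described above exists because each bank keeps its own transaction database and must cross-check it against everyone else's. A shared, append-only ledger removes that step: each record commits to the previous one via a hash, so any party holding the same chain can verify that all copies agree. The toy sketch below illustrates that idea only; it is not any bank's or vendor's actual system, and it omits consensus and distribution entirely.

```python
# Toy hash-chained ledger: tampering with any record breaks verification.
import hashlib
import json

def add_block(chain, transaction):
    """Append a record that commits to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"tx": transaction, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash and link; any altered record fails the check."""
    prev = "0" * 64
    for block in chain:
        body = {"tx": block["tx"], "prev": block["prev"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True
```

Because every participant can run `verify` independently against the shared chain, the pairwise reconciliation the report costs out becomes unnecessary -- which is where Accenture's projected savings come from.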
AppleInsider has obtained a note to investors from KGI analyst Ming-Chi Kuo that says Apple's 2017 laptop line will focus on internal component updates, including the platform-wide adoption of Intel's Kaby Lake architecture. What's more, Apple is expected to manufacture a 15-inch MacBook Pro with up to 32GB of RAM in the fourth quarter of 2017. AppleInsider reports: Apple took flak for releasing its latest MacBook Pro with Touch Bar models with a hard memory cap of 16GB, a minimal allotment viewed as a negative for imaging and video professionals. Responding to customer criticism, Apple said the move was made in a bid to maximize battery life. Essentially, the Intel Skylake CPUs used in Apple's MacBook Pro only support up to 16GB of LPDDR3 RAM at 2133MHz. Though Intel does make processors capable of addressing more than 16GB of memory, those particular chipsets rely on less efficient DDR4 RAM and are usually deployed in desktops with access to mains power. In order to achieve higher memory allotments and keep unplugged battery life on par with existing MacBook Pro models, Apple will need to move to an emerging memory technology like LPDDR4 or DDR4L. Such hardware is on track for release later this year. As for the 12-inch MacBook, Kuo believes next-generation versions of the thin-and-light will enter mass production in the second quarter with the same basic design aesthetic introduced in 2015. New for 2017 is a 16GB memory option, which will make an appearance thanks to Intel's new processor class.
An anonymous reader quotes a report from Ars Technica: ZeniMax Media, the parent company of both Bethesda Softworks and Id Software, says it will prove at trial that John Carmack and others at Oculus stole trade secrets to "misappropriate" virtual reality technology that was first developed while Carmack was working at Id Software. What's more, ZeniMax is now accusing Oculus of "intentional destruction of evidence to cover up their wrongdoing." Mark Zuckerberg, CEO of Oculus parent company Facebook, is scheduled to respond to those accusations in testimony starting tomorrow, according to a report by Business Insider. ZeniMax's statement comes after Carmack testified at trial last week, saying the case was "ridiculous and absurd." His testimony echoed Oculus' initial reaction when ZeniMax's accusations first surfaced in 2014. In court filings leading up to the trial, ZeniMax detailed its case that Carmack, while still an employee at Id Software, "designed the specifications and functionality embodied in the Rift SDK and directed its development." Carmack's technology and guidance allegedly "literally transformed" Oculus founder Palmer Luckey's early Rift prototype from a "primitive virtual reality headset" that was "little more than a display panel." Carmack allegedly used "copyrighted computer code, trade secret information, and technical know-how" from his time at ZeniMax after he moved to Oculus as CTO in 2013. As the trial began last week (as reported by a Law360 summary, registration required), Carmack told the court of his development of a virtual reality demo for Doom 3 in 2012 and his search for a VR headset that would be suitable to run it. That's when he says he got in touch with Luckey, leading to the now legendary E3 2012 demo that introduced Oculus to the public. ZeniMax is seeking $2 billion in damages, which matches the price Facebook paid for Oculus in 2014. The trial is expected to last three weeks.