Space.com adds: Kepler spots alien worlds by noticing the tiny brightness dips they cause when they cross the face of their host star from the spacecraft's perspective. Kepler is the most accomplished planet hunter in history. It has found more than 2,500 confirmed alien worlds -- about 70 percent of all known exoplanets -- along with a roughly equal number of "candidates" that await confirmation by follow-up observations or analyses. The vast majority of these discoveries have come via observations that Kepler made during its original mission, which ran from 2009 to 2013. Study of these data sets is ongoing; over the past few years, researchers have used improved analysis techniques to spot many exoplanets in data that Kepler gathered a half-decade ago or more.
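Those "tiny brightness dips" really are tiny: to first order, the fractional dip in starlight equals the squared ratio of the planet's radius to the star's radius. A quick illustration (the function and the reference radii are ours, using standard values, not figures from the article):

```python
# Transit photometry, as used by Kepler: when a planet crosses its star's
# disk, the star dims by roughly (planet radius / star radius)^2.
R_SUN_KM = 695_700.0    # nominal solar radius
R_EARTH_KM = 6_371.0    # mean Earth radius

def transit_depth(planet_radius_km: float, star_radius_km: float = R_SUN_KM) -> float:
    """Fractional drop in observed brightness during a transit."""
    return (planet_radius_km / star_radius_km) ** 2

depth = transit_depth(R_EARTH_KM)
print(f"{depth:.2e}")  # an Earth analog dims a Sun-like star by only ~0.008%
```

A dip of roughly one part in twelve thousand is why Kepler needed space-based photometric precision, and why smaller dwarf stars (like the one in Thursday's announcement) make Earth-sized planets easier to detect: the same planet blocks a larger fraction of a smaller stellar disk.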
Space.com describes Thursday's announcement as an exoplanet discovery. (Earlier they reported on the discovery of "a possibly habitable alien world" about 2.2 times the size of Earth orbiting a dwarf star "within the range of distances where liquid water could exist on a world's surface".)
Slashdot reader schwit1 points out that other less-credible sites speculate NASA's announcement will be "a major discovery about life beyond earth."
Perens sees the move as "obviously inspired" by the kernel team's earlier announcement, and believes it's directed against one man who made 50 copyright infringement claims involving the Linux kernel "with intent to collect income rather than simply obtain compliance with the GPL license."
Unfortunately, "as far as I can tell, it's Patrick McHardy's legal right to bring such claims regarding the copyrights which he owns, even if it doesn't fit Community Principles which nobody is actually compelled to follow."
Together with a community of like-minded developers, companies and researchers, we have applied sophisticated machine learning techniques and a variety of innovations to build a speech-to-text engine that has a word error rate of just 6.5% on LibriSpeech's test-clean dataset. In our initial release today, we have included pre-built packages for Python, NodeJS and a command-line binary that developers can use right away to experiment with speech recognition.
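For context on that 6.5% figure: word error rate is the standard speech-recognition metric, the word-level edit distance (substitutions, insertions, deletions) between the engine's transcript and a reference transcript, divided by the number of reference words. A minimal sketch of the computation (the function name and sample sentences are ours, not Mozilla's):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# One wrong word out of six -> WER of 1/6, or about 16.7%
print(word_error_rate("the cat sat on the mat", "the cat sat on a mat"))
```

A 6.5% WER means roughly one word in fifteen is wrong on LibriSpeech's clean test recordings, which at the time put the open-source engine in range of commercial systems.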
The announcement also touts the release of nearly 400,000 recordings -- downloadable by anyone -- as the first offering from Project Common Voice, "the world's second largest publicly available voice dataset." It launched in July "to make it easy for people to donate their voices to a publicly available database, and in doing so build a voice dataset that everyone can use to train new voice-enabled applications." And while they've started with English-language recordings, "we are working hard to ensure that Common Voice will support voice donations in multiple languages beginning in the first half of 2018."
"We at Mozilla believe technology should be open and accessible to all, and that includes voice... As the web expands beyond the 2D page, into the myriad ways where we connect to the Internet through new means like VR, AR, Speech, and languages, we'll continue our mission to ensure the Internet is a global public resource, open and accessible to all."
"It's been very disappointing," he said... "My feeling is that one of the top executives at the Bureau of Land Management called Needles, California, saying... 'What's going on? Who permitted this?'" Hughes said. Plus, as he and his team were preparing to leave Wednesday, the motorhome/rocket launcher broke down in his driveway, he said... His plan is to try again next week.
FTC Commissioner Terrell McSweeny criticized the move. She said, "So many things wrong here, like even if FCC does this FTC still won't have jurisdiction. But even if we did, most discriminatory conduct by ISPs will be perfectly legal. This won't hurt tech titans with deep pockets. They can afford to pay all the trolls under the bridge. But the entrepreneurs and innovators who truly make the Internet great won't be so lucky. It will be harder for them to compete. The FCC is upending the Internet as we know it, not saving it."
This is what the internet looks like when there is no net neutrality. Earlier today, the news outlet Motherboard suggested we should build our own internet if we want to safeguard the essence of an open internet.
But Facebook also recognized that it could only take this so far internally; by working with partners, other network operators, and hardware manufacturers, it could extend the capabilities of this tool. Facebook is in fact working with other companies in this endeavor, including Juniper and Arista Networks, but open sourcing the software lets developers do things with it that Facebook might not have considered -- a prospect its engineering team finds both exciting and valuable.
"Most protocols were initially designed based on constrained hardware and software environment assumptions from decades ago," Facebook said in its announcement. "To continue delivering rich, real-time, and highly engaging user experiences over networks, it's important to accelerate innovation in the routing domain."
Microsoft and GitHub are also working to bring similar capabilities to other platforms, with macOS coming first, and later Linux. The obvious way to do this on both systems is to use FUSE, an infrastructure for building file systems that run in user mode rather than kernel mode (desirable because user-mode development is easier and safer than kernel mode). However, the companies have discovered that FUSE isn't fast enough for this -- a lesson Dropbox also learned when developing a similar capability, Project Infinite. Currently, the companies believe that tapping into a macOS extensibility mechanism called Kauth (or KAuth) will be the best way forward.