Space

Milky Way May Escape Fated Collision With Andromeda Galaxy (science.org) 33

sciencehabit shares a report from Science.org: For years, astronomers thought it was the Milky Way's destiny to collide with its near neighbor the Andromeda galaxy a few billion years from now. But a new simulation finds a 50% chance the impending crunch will end up a near-miss, at least for the next 10 billion years. Astronomers have known since 1912 that Andromeda is heading toward our home Galaxy, moving pretty much straight at the Milky Way at a speed of 110 kilometers per second. Such galaxy mergers, which can be seen in progress elsewhere in the universe, are spectacularly messy affairs. Although most stars survive unscathed, the galaxies' spiral structures are obliterated, sending streams of stars spinning off into space. After billions of years, the merged galaxies typically settle into a single elliptical galaxy: a giant featureless blob of stars. A study from 2008 suggested a Milky Way-Andromeda merger was inevitable within the next 5 billion years, and that in the process the Sun and Earth would get gravitationally grabbed by Andromeda for a time before ending up in the distant outer suburbs of the resulting elliptical, which the researchers dubbed "Milkomeda."

In the new simulation, researchers made use of the most recent and best estimates of the motion and mass of the four largest galaxies in the Local Group. They then plugged those into simulations developed by the Institute for Computational Cosmology at Durham University. First, they ran the simulation including just the Milky Way and Andromeda and found that they merged in slightly less than half of the cases -- lower odds than other recent estimates. When they included the effect of the Triangulum galaxy, the Local Group's third largest, the merger probability increased to about two-thirds. But with the inclusion of the Large Magellanic Cloud, a satellite galaxy of the Milky Way that is the fourth largest in the Local Group, those chances dropped back down to a coin flip. And if the cosmic smashup does happen, it won't be for about 8 billion years. "As it stands, proclamations of the impending demise of our Galaxy appear greatly exaggerated," the researchers write. Meanwhile, if the accelerating expansion of the universe continues unabated, all other galaxies will disappear beyond our cosmic event horizon, leaving Milkomeda as the sole occupant of the visible universe.
The study is available as a preprint on arXiv.
Earth

Excess Memes and 'Reply All' Emails Are Bad For Climate, Researcher Warns (theguardian.com) 120

An anonymous reader quotes a report from The Guardian: When "I can has cheezburger?" became one of the first internet memes to blow our minds, it's unlikely that anyone worried about how much energy it would use up. But research has now found that the vast majority of data stored in the cloud is "dark data", meaning it is used once then never visited again. That means that all the memes and jokes and films that we love to share with friends and family -- from "All your base are belong to us", through Ryan Gosling saying "Hey Girl", to Tim Walz with a piglet -- are out there somewhere, sitting in a datacenter, using up energy. By 2030, the National Grid anticipates that datacenters will account for just under 6% of the UK's total electricity consumption, so tackling junk data is an important part of tackling the climate crisis.

Ian Hodgkinson, a professor of strategy at Loughborough University, has been studying the climate impact of dark data and how it can be reduced. "I really started a couple of years ago, it was about trying to understand the negative environmental impact that digital data might have," he said. "And at the top of it might be quite an easy question to answer, but it turns out actually, it's a whole lot more complex. But absolutely, data does have a negative environmental impact." He discovered that 68% of data used by companies is never used again, and estimates that personal data tells the same story. [...] One funny meme isn't going to destroy the planet, of course, but the millions stored, unused, in people's camera rolls do have an impact, he explained: "The one picture isn't going to make a drastic impact. But of course, if you maybe go into your own phone and you look at all the legacy pictures that you have, cumulatively, that creates quite a big impression in terms of energy consumption."
Since we're paying to store data in the cloud, cloud operators and tech companies have a financial incentive to keep people from deleting junk data, says Hodgkinson. He recommends people send fewer pointless emails and avoid the "dreaded 'reply all' button."

"One [figure] that often does the rounds is that for every standard email, that equates to about 4g of carbon. If we then think about the amount of what we mainly call 'legacy data' that we hold, so if we think about all the digital photos that we have, for instance, there will be a cumulative impact."
United States

US Colleges Slash Majors in Effort To Cut Costs (cbsnews.com) 110

St. Cloud State University announced plans to eliminate its music department and cut 42 degree programs and 50 minors, as part of a broader trend of U.S. colleges slashing offerings amid financial pressures. The Minnesota school's decision, driven by a $32 million budget shortfall over two years, reflects challenges facing higher education institutions nationwide. Similar program cuts have been announced at universities across the country, including in North Carolina, Arkansas, and New York. Some smaller institutions have closed entirely, unable to weather the financial storm, reports CBS News.

Federal COVID relief funds have dried up, operational costs are rising, and fewer high school graduates are pursuing college degrees. St. Cloud State's enrollment plummeted from 18,300 students in fall 2020 to about 10,000 in fall 2023, mirroring national trends. The National Student Clearinghouse Research Center reports a decline in four-year college enrollment, despite a slight rebound in community college numbers.
Cloud

Cloud Growth Puts Hyperscalers On Track To 60% of Data Capacity By 2029 (theregister.com) 6

Dan Robinson writes via The Register: Hyperscalers are forecast to account for more than 60 percent of datacenter space by 2029, a stark reversal from just seven years ago when the majority of capacity was made up of on-premises facilities. This trend is the result of demand for cloud services and consumer-oriented digital services such as social networking, e-commerce and online gaming pushing growth in hyperscale bit barns, those operated by megacorps including Amazon, Microsoft and Meta. The figures were published by Synergy Research Group, which says they are drawn from several detailed quarterly tracking research services to build an analysis of datacenter volume and trends.

As of last year's data, those hyperscale companies accounted for 41 percent of the entire global data dormitory capacity, but their share is growing fast. Just over half of the hyperscaler capacity consists of own-build facilities, with the rest made up of leased server farms operated by providers such as Digital Realty or Equinix. On-premises datacenters run by enterprises themselves now account for 37 percent of the total, a drop from when they made up 60 percent a few years ago. The remainder (22 percent) is accounted for by non-hyperscale colocation datacenters.

What the figures appear to show is that hyperscale volume is growing faster than colocation or on-prem capacity -- by an average of 22 percent each year. Hence Synergy believes that while colocation's share of the total will slowly decrease over time, actual colo capacity will continue to rise steadily. Likewise, the proportion of overall bit barn space represented by on-premises facilities is forecast by Synergy to decline by almost three percentage points each year, although the analyst thinks the actual total capacity represented by on-premises datacenters is set to remain relatively stable. It's a case of on-prem essentially standing still in an expanding market.

United Kingdom

UK Regulator To Examine $4 Billion Amazon Investment In AI Startup Anthropic (theguardian.com) 2

An anonymous reader quotes a report from The Guardian: Amazon's $4 billion investment into US artificial intelligence startup Anthropic is to be examined in the latest investigation into technology tie-ups by the UK's competition watchdog. The Competition and Markets Authority (CMA) said on Thursday that it was launching a preliminary investigation into the deal, before deciding whether to refer it for an in-depth review. The deal, announced in March, included a $4 billion investment in Anthropic from Amazon, and a commitment from Anthropic to use Amazon Web Services "as its primary cloud provider for mission critical workloads, including safety research and future foundation model development." The regulator said it was "considering whether it is or may be the case that Amazon's partnership with Anthropic has resulted in the creation of a relevant merger situation." "We are an independent company. Our strategic partnerships and investor relationships do not diminish our corporate governance independence or our freedom to partner with others," an Anthropic spokesperson said in a statement. "Amazon does not have a seat on Anthropic's board, nor does it have any board observer rights. We intend to cooperate with the CMA and provide them with a comprehensive understanding of Amazon's investment and our commercial collaboration."
Hardware

NVMe 2.1 Specifications Published With New Capabilities (phoronix.com) 22

At the Flash Memory Summit 2024 this week, NVM Express published the NVMe 2.1 specifications, which aim to enhance storage unification across AI, cloud, client, and enterprise. Phoronix's Michael Larabel writes: New NVMe capabilities with the revised specifications include:

- Enabling live migration of PCIe NVMe controllers between NVM subsystems.
- New host-directed data placement for SSDs that simplifies ecosystem integration and is backwards compatible with previous NVMe specifications.
- Support for offloading some host processing to NVMe storage devices.
- A network boot mechanism for NVMe over Fabrics (NVMe-oF).
- Support for NVMe over Fabrics zoning.
- Ability to provide host management of encryption keys and highly granular encryption with Key Per I/O.
- Security enhancements such as support for TLS 1.3, a centralized authentication verification entity for DH-HMAC-CHAP, and post sanitization media verification.
- Management enhancements including support for high availability out-of-band management, management over I3C, out-of-band management asynchronous events and dynamic creation of exported NVM subsystems from underlying NVM subsystem physical resources.
You can learn more about these updates at NVMExpress.org.
AI

Mainframes Find New Life in AI Era (msn.com) 56

Mainframe computers, stalwarts of high-speed data processing, are finding new relevance in the age of AI. Banks, insurers, and airlines continue to rely on these industrial-strength machines for mission-critical operations, with some now exploring AI applications directly on the hardware, WSJ reported in a feature story. IBM, commanding over 96% of the mainframe market, reported 6% growth in its mainframe business last quarter. The company's latest zSystem can process up to 30,000 transactions per second and hold 40 terabytes of data. WSJ adds: Globally, the mainframe market was valued at $3.05 billion in 2023, but new mainframe sales are expected to decline through 2028, IDC said. However, 54% of enterprise leaders in a 2023 Forrester survey said they would increase their usage of existing mainframes over the next two years.

Mainframes do have limitations. They are constrained by the computing power within their boxes, unlike the cloud, which can scale up by drawing on computing power distributed across many locations and servers. They are also unwieldy -- with years of old code tacked on -- and don't integrate well with new applications. That makes them costly to manage and difficult to use as a platform for developing new applications.

Space

Are There Diamonds on Mercury? (cnn.com) 29

The planet Mercury could have "a layer of diamonds," reports CNN, citing new research suggesting that about 310 miles (500 kilometers) below the surface... could be a layer of diamonds 11 miles (18 kilometers) thick.

And the study's co-author believes lava might carry some of those diamonds up to the surface: The diamonds might have formed soon after Mercury itself coalesced into a planet about 4.5 billion years ago from a swirling cloud of dust and gas, in the crucible of a high-pressure, high-temperature environment. At this time, the fledgling planet is believed to have had a crust of graphite, floating over a deep magma ocean.

A team of researchers recreated that searing environment in an experiment, with a machine called an anvil press that's normally used to study how materials behave under extreme pressure but also for the production of synthetic diamonds. "It's a huge press, which enables us to subject tiny samples to the same high pressure and high temperature that we would expect deep inside the mantle of Mercury, at the boundary between the mantle and the core," said Bernard Charlier, head of the department of geology at the University of Liège in Belgium and a coauthor of a study reporting the findings.

The team inserted a synthetic mixture of elements — including silicon, titanium, magnesium and aluminum — inside a graphite capsule, mimicking the theorized composition of Mercury's interior in its early days. The researchers then subjected the capsule to pressures almost 70,000 times greater than those found on Earth's surface and temperatures up to 2,000 degrees Celsius (3,630 degrees Fahrenheit), replicating the conditions likely found near Mercury's core billions of years ago.

After the sample melted, the scientists looked at changes in the chemistry and minerals under an electron microscope and noted that the graphite had turned into diamond crystals.

The researchers believe this mechanism "can not only give us more insight into the secrets hidden below Mercury's surface, but on planetary evolution and the internal structure of exoplanets with similar characteristics."
Government

US Progressives Push For Nvidia Antitrust Investigation (reuters.com) 42

Progressive groups and Senator Elizabeth Warren are urging the Department of Justice to investigate Nvidia for potential antitrust violations due to its dominant position in the AI chip market. The groups criticize Nvidia's bundling of software and hardware, claiming it stifles innovation and locks in customers. Reuters reports: Demand Progress and nine other groups wrote a letter (PDF) this week urging Department of Justice antitrust chief Jonathan Kanter to probe business practices at Nvidia, whose market value hit $3 trillion this summer on demand for chips able to run the complex models behind generative AI. The groups, which oppose monopolies and promote government oversight of tech companies, among other issues, took aim at Nvidia's bundling of software and hardware, a practice that French antitrust enforcers have flagged as they prepare to bring charges.

"This aggressively proprietary approach, which is strongly contrary to industry norms about collaboration and interoperability, acts to lock in customers and stifles innovation," the groups wrote. Nvidia has roughly 80% of the AI chip market, including the custom AI processors made by cloud computing companies like Google, Microsoft and Amazon.com. The chips made by the cloud giants are not available for sale themselves but typically rented through each platform.
A spokesperson for Nvidia said: "Regulators need not be concerned, as we scrupulously adhere to all laws and ensure that NVIDIA is openly available in every cloud and on-prem for every enterprise. We'll continue to support aspiring innovators in every industry and market and are happy to provide any information regulators need."
Microsoft

Microsoft Now Lists OpenAI as a Competitor in AI and Search (techcrunch.com) 11

An anonymous reader shares a report: Microsoft has a long and tangled history with OpenAI, having invested a reported $13 billion in the ChatGPT maker as part of a long-term partnership. As part of the deal, Microsoft runs OpenAI's models across its enterprise and consumer products, and is OpenAI's exclusive cloud provider. However, the tech giant called the startup a "competitor" for the first time in an SEC filing on Tuesday.

In Microsoft's annual 10-K, OpenAI joined a long list of competitors in AI, alongside Anthropic, Amazon, and Meta. OpenAI was also listed alongside Google as a competitor to Microsoft in search, thanks to OpenAI's new SearchGPT feature announced last week. It's possible Microsoft is trying to change the narrative on its relationship with OpenAI in light of antitrust concerns -- the FTC is currently looking into the relationship, alongside similar cloud provider investments into AI startups.

Businesses

FOSSA is Buying StackShare, a Site Used By 1.5 Million Developers (techcrunch.com) 4

Open-source compliance and security platform FOSSA has acquired developer community platform StackShare, the company confirmed to TechCrunch. From a report: StackShare is one of the more popular platforms for developers to discuss, track, and share the tools they use to build applications. This encompasses everything from which front-end JavaScript framework to use to which cloud provider to use for specific tasks.
Programming

AWS Quietly Scales Back Some DevOps Services (devclass.com) 50

AWS has quietly halted new customer onboarding for several of its services, including the once-touted CodeCommit source code repository and Cloud9 cloud IDE, signaling a potential retreat from its comprehensive DevOps offering.

The stealth deprecation, discovered by users encountering unexpected errors, has sent ripples through the AWS community, with many expressing frustration over the lack of formal announcements and the continued presence of outdated documentation. AWS VP Jeff Barr belatedly confirmed the decision on social media, listing affected services such as S3 Select, CloudSearch, SimpleDB, Forecast, and Data Pipeline.
The Internet

French Internet Lines Cut In Latest Attack During Olympics (msn.com) 69

An anonymous reader quotes a report from Bloomberg: A number of fiber optic cables carrying broadband service across France were cut overnight in the latest attack on the country's infrastructure during the Olympic Games. Connections serving Paris, which is hosting the Olympic Games this week, and the games themselves weren't affected, a spokesman for Olympics telecom partner, Orange SA, said. Still, this is the second sabotage of French infrastructure in the past few days as the world converges on the capital. Coordinated fires on French rail lines disrupted trains ahead of the opening ceremony on Friday.

The fiber cables were cut in nine departments overall including: Ardeche, Aude, Bouches-du-Rhone, Drome, Herault, Vaucluse, Marne, Meuse and Oise, the French Telecom Federation said. SFR's network was vandalized between 1 a.m. and 3 a.m. Paris time, and teams are working on repairs, a spokesman for the French phone company said. The carrier is using alternative routes to serve customers, though redirecting the traffic might lead to slower speeds. Other carriers, including Iliad SA's Free and Netalis, also said they were impacted in social media posts. Netalis Chief Executive Officer Nicolas Guillaume said that the telecom company had successfully moved traffic to backup networks early on Monday. French cloud provider OVHcloud is also working to reroute traffic after the incident, which had caused slower performance on connections between Europe and Asia Pacific, a spokesman said.
"We advocate for France reinforcing criminal sanctions for vandalism on telecom infrastructure, which should be put at the same level as vandalism on energy infrastructure," said Romain Bonenfant, head of the French Telecom Federation industry group, in an interview. "Telecom infrastructure, like the railways, covers kilometers across the whole territory -- you can't put surveillance on every part of it."
Microsoft

Microsoft Adds Intrusive OneDrive Ad in Windows 11 (windowslatest.com) 84

Microsoft has intensified its push for OneDrive adoption in Windows 11, introducing a full-screen pop-up that prompts users to back up their files to the cloud service, according to a report from Windows Latest. The new promotional message, which appears after a recent Windows update, mirrors the out-of-box experience typically seen during initial system setup and highlights OneDrive's features, including file protection, collaboration capabilities, and automatic syncing.
GNU is Not Unix

After Crowdstrike Outage, FSF Argues There's a Better Way Forward (fsf.org) 139

"As free software activists, we ought to take the opportunity to look at the situation and see how things could have gone differently," writes FSF campaigns manager Greg Farough: Let's be clear: in principle, there is nothing ethically wrong with automatic updates so long as the user has made an informed choice to receive them... Although we can understand how the situation developed, one wonders how wise it is for so many critical services around the world to hedge their bets on a single distribution of a single operating system made by a single stupefyingly predatory monopoly in Redmond, Washington. Instead, we can imagine a more horizontal structure, where this airline and this public library are using different versions of GNU/Linux, each with their own security teams and on different versions of the Linux(-libre) kernel...

As of our writing, we've been unable to ascertain just how much access to the Windows kernel source code Microsoft granted to CrowdStrike engineers. (For another thing, the root cause of the problem appears to have been an error in a configuration file.) But this being the free software movement, we could guarantee that all security engineers and all stakeholders could have equal access to the source code, proving the old adage that "with enough eyes, all bugs are shallow." There is no good reason to withhold code from the public, especially code so integral to the daily functioning of so many public institutions and businesses. In a cunning PR spin, it appears that Microsoft has started blaming the incident on third-party firms' access to kernel source and documentation. Translated out of Redmond-ese, the point they are trying to make amounts to "if only we'd been allowed to be more secretive, this wouldn't have happened...!"

We also need to see that calling for a diversity of providers of nonfree software that are mere front ends for "cloud" software doesn't solve the problem. Correcting it fully requires switching to free software that runs on the user's own computer. The Free Software Foundation is often accused of being utopian, but we are well aware that moving airlines, libraries, and every other institution affected by the CrowdStrike outage to free software is a tremendous undertaking. Given free software's distinct ethical advantage, not to mention the embarrassing damage control underway from both Microsoft and CrowdStrike, we think the move is a necessary one. The more public an institution, the more vitally it needs to be running free software.

For what it's worth, it's also vital to check the syntax of your configuration files. CrowdStrike engineers would do well to remember that one, next time.
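The FSF's closing jab about configuration syntax is easy to act on in general, whatever format a vendor uses. Here is a minimal, hypothetical sketch in Python of the kind of pre-deployment check it is arguing for; it assumes a plain JSON file with made-up required fields, not CrowdStrike's actual proprietary channel-file format.

    # Hypothetical pre-deployment syntax check for a JSON config file.
    # The required fields below are illustrative, not any vendor's real schema.
    import json
    import sys

    REQUIRED_FIELDS = ("version", "rules")

    def validate(path: str) -> None:
        try:
            with open(path) as f:
                cfg = json.load(f)              # fails loudly on malformed syntax
        except (OSError, json.JSONDecodeError) as err:
            sys.exit(f"{path}: rejected before deployment: {err}")
        missing = [key for key in REQUIRED_FIELDS if key not in cfg]
        if missing:
            sys.exit(f"{path}: missing required fields: {', '.join(missing)}")
        print(f"{path}: OK")

    if __name__ == "__main__":
        validate(sys.argv[1])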

Networking

Is Modern Software Development Mostly 'Junky Overhead'? (tailscale.com) 117

Longtime Slashdot reader theodp says this "provocative" blog post by former Google engineer Avery Pennarun — now the CEO/founder of Tailscale — is "a call to take back the Internet from its centralized rent-collecting cloud computing gatekeepers."

Pennarun writes: I read a post recently where someone bragged about using Kubernetes to scale all the way up to 500,000 page views per month. But that's 0.2 requests per second. I could serve that from my phone, on battery power, and it would spend most of its time asleep. In modern computing, we tolerate long builds, and then Docker builds, and uploading to container stores, and multi-minute deploy times before the program runs, and even longer times before the log output gets uploaded to somewhere you can see it, all because we've been tricked into this idea that everything has to scale. People get excited about deploying to the latest upstart container hosting service because it only takes tens of seconds to roll out, instead of minutes. But on my slow computer in the 1990s, I could run a perl or python program that started in milliseconds and served way more than 0.2 requests per second, and printed logs to stderr right away so I could edit-run-debug over and over again, multiple times per minute.

How did we get here?

We got here because sometimes, someone really does need to write a program that has to scale to thousands or millions of backends, so it needs all that stuff. And wishful thinking makes people imagine even the lowliest dashboard could be that popular one day. The truth is, most things don't scale, and never need to. We made Tailscale for those things, so you can spend your time scaling the things that really need it. The long tail of jobs that are 90% of what every developer spends their time on. Even developers at companies that make stuff that scales to billions of users, spend most of their time on stuff that doesn't, like dashboards and meme generators.

As an industry, we've spent all our time making the hard things possible, and none of our time making the easy things easy. Programmers are all stuck in the mud. Just listen to any professional developer, and ask what percentage of their time is spent actually solving the problem they set out to work on, and how much is spent on junky overhead.
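For anyone who wants to sanity-check the arithmetic in the excerpt above, here is a minimal sketch using only the Python standard library; it illustrates the kind of tiny single-file server Pennarun alludes to and is not code from his post.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # 500,000 page views per month works out to roughly 0.2 requests per second.
    views_per_month = 500_000
    seconds_per_month = 30 * 24 * 3600
    print(f"{views_per_month / seconds_per_month:.2f} req/s")  # ~0.19

    # A standard-library HTTP server comfortably exceeds that rate on modest
    # hardware; illustrative only.
    class Hello(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"hello, world\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), Hello).serve_forever()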

Tailscale offers a "zero-config" mesh VPN — built on top of WireGuard — for a secure network that's software-defined (and infrastructure-agnostic). "The problem is developers keep scaling things they don't need to scale," Pennarun writes, "and their lives suck as a result...."

"The tech industry has evolved into an absolute mess..." Pennarun adds at one point. "Our tower of complexity is now so tall that we seriously consider slathering LLMs on top to write the incomprehensible code in the incomprehensible frameworks so we don't have to."

Their conclusion? "Modern software development is mostly junky overhead."
The Almighty Buck

Adobe Exec: Early Termination Fees Are 'Like Heroin' (theverge.com) 24

Longtime Slashdot reader sandbagger shares a report from The Verge: Early termination fees are "a bit like heroin for Adobe," according to an Adobe executive quoted in the FTC's newly unredacted complaint against the company for allegedly hiding fees and making it too hard to cancel Creative Cloud. "There is absolutely no way to kill off ETF or talk about it more obviously" in the order flow without "taking a big business hit," this executive said. That's the big reveal in the unredacted complaint, which also contains previously unseen allegations that Adobe was internally aware of studies showing its order and cancellation flows were too complicated and customers were unhappy with surprise early termination fees. In response to the quote, Adobe's general counsel and chief trust officer, Dana Rao, said that he was "disappointed in the way they're continuing to take comments out of context from non-executive employees from years ago to make their case."

Rao added that the person quoted was not on the leadership team that reports to CEO Shantanu Narayen and that whether to charge early termination fees would "not be their decision." The early termination fees in the FTC case represent "less than half a percent of our annual revenue," Rao told The Verge. "It doesn't drive our business, it doesn't drive our business decisions."
AI

FTC's Khan Backs Open AI Models in Bid to Avoid Monopolies (yahoo.com) 8

Open AI models that allow developers to customize them with few restrictions are more likely to promote competition, FTC Chair Lina Khan said, weighing in on a key debate within the industry. From a report: "There's tremendous potential for open-weight models to promote competition," Khan said Thursday in San Francisco at startup incubator Y Combinator. "Open-weight models can liberate startups from the arbitrary whims of closed developers and cloud gatekeepers."

"Open-weight" models disclose what an AI model picked up and was tweaked on during its training process. That allows developers to better customize them and makes them more accessible to smaller companies and researchers. But critics have warned that open models carry an increased risk of abuse and could potentially allow companies from geopolitical rivals like China to piggyback off the technology. Khan's comments come as the Biden administration is considering guidance on the use and safety of open-weight models.

Microsoft

Microsoft: Our Licensing Terms Do Not Meaningfully Raise Cloud Rivals' Costs 21

In a response to the UK's Competition and Markets Authority's investigation into cloud services and licensing, Microsoft has defended its practices, asserting that its terms "do not meaningfully raise cloud rivals' costs." The Windows-maker emphasized Amazon's continued dominance in the UK hyperscale market and noted Google's quarter-on-quarter growth, while also highlighting the declining share of Windows Server relative to Linux in cloud operating systems and SQL Server's second-place position behind Oracle.

[...] The CMA's inquiry primarily focuses on the pricing disparity between using Microsoft products on Azure versus rival cloud platforms, with most surveyed customers perceiving Azure as the more cost-effective option for Microsoft software deployment. The Register adds: Microsoft's bullish take on this is that AWS and Google should be grateful that they even get to run its software. In its response, the company said: "This dispute on pricing terms only arises because Microsoft grants all rivals IP licenses in the first place to its software that is of most popularity for use in the cloud. It does this not because there is any legal obligation to share IP with closest rivals in cloud, but for commercial reasons."
