
After Disrupting Businesses, Google Drive's Secret File Cap is Dead for Now

Google is backtracking on its decision to put a file creation cap on Google Drive. From a report: Around two months ago, the company decided to cap all Google Drive users to 5 million files, even if they were paying for extra storage. The company did this in the worst way possible, rolling out the limit as a complete surprise and with no prior communication. Some users logged in to find they were suddenly millions of files over the new limit and unable to upload new files until they deleted enough to get under the limit. Some of these users were businesses that had the sudden file cap bring down their systems, and because Google never communicated that the change was coming, many people initially thought the limitation was a bug.

Apparently, sunshine really is the best disinfectant. The story made the tech news rounds on Friday, and Ars got Google on the record saying that the file cap was not a bug and was actually "a safeguard to prevent misuse of our system in a way that might impact the stability and safety of the system." After the weekend reaction to "Google Drive's Secret File Cap!" Google announced on Twitter Monday night that it was rolling back the limit. [...] Google told us it initially rolled the limitation out to stop what it called "misuse" of Drive, and with the tweet saying Google wants to "explore alternate approaches to ensure a great experience for all," it sounds like we might see more kinds of Drive limitations in the future.
  • by Cpt_Kirks ( 37296 ) on Tuesday April 04, 2023 @02:10PM (#63425574)

    Somebody just made the top of the next layoff list.

    • I mean no, I guarantee you most of the people caught by this were probably people abusing the un-enforced caps on workspaces for 'unlimited' backups.

      • by DarkRookie2 ( 5551422 ) on Tuesday April 04, 2023 @02:16PM (#63425596)
        It's still a dick move to not announce a feature like this and just apply it.
        • by mysidia ( 191772 ) on Tuesday April 04, 2023 @02:31PM (#63425642)

          That's the problem... Google Drive's purpose is end users saving documents they are working with as part of their work. There is realistically no way anybody ends up with 5 million files on Drive unless they are creating massive numbers of files through an automated process.

          Like... shooting 1,000 photos a day and uploading all the raw image files would get you there after about 13 years; possibly less if you were then processing the files and saving working versions.

          But that's a high-end "extreme pro" use, and it's perfectly reasonable for that to cost extra. Certainly the cloud storage solution for end users to save their Word documents doesn't have to be priced the same way as a cloud solution designed to accommodate high-volume media production.

          Five million files is a 100% reasonable limit for a standard "office user" tier, the expected use of Google Drive, and those who require the ability to store a massive number of files should be expected to pay for that privilege to cover the infrastructure requirements. But such a limit should never have been rolled out with so little notice.
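
          A minimal sketch of that arithmetic, using the rates assumed above (both numbers are the poster's hypotheticals, not Google's):

              FILE_CAP = 5_000_000     # the (now rescinded) Drive file cap
              PHOTOS_PER_DAY = 1_000   # assumed shooting/upload rate

              days = FILE_CAP / PHOTOS_PER_DAY   # 5,000 days
              print(f"{days / 365:.1f} years")   # ~13.7 years, i.e. "about 13 years"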

          • by EldoranDark ( 10182303 ) on Tuesday April 04, 2023 @02:50PM (#63425700)
            What is even the problem with 5M files if people fit into the volume envelope? As a photographer, though, I don't think the rate you proposed is unrealistic. I hit 1k to 2k at company events, no problem. A sports photographer might set their camera to a rapid-sequence spray-and-pray mode to capture as much of the action as possible, because why not? That's just the RAW files. I'm not deleting my RAW files, so the processed files are on top, not instead. And the XMP metadata files. Also it makes sense to export the JPEGs with several quality presets: high quality for publishing and low quality with watermarks for preview. Oh yeah, and when I was working on the files, I generated preview proxy files so I could work with them faster than editing the RAW directly. Plus I've been taking pictures for a few years now. Didn't start last week.
            • by jabuzz ( 182671 ) on Tuesday April 04, 2023 @03:12PM (#63425762) Homepage

              If you don't understand what the problem with five million files is then you just don't understand storage in the slightest. My day job is looking after storage for an HPC facility so I understand the problem of file numbers acutely.

              The problem is that *EVERY* file you create has an overhead for storing and processing the metadata, to the point where one million 1 KB files are way more of a problem than a thousand 100 GB files. It means I have to spend large sums of money to get the metadata onto SSD so the file system performs reasonably. It means my backups take much longer. It means any disaster recovery takes longer.
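
              A rough sketch of that overhead, assuming a 4 KiB block size and ~512 bytes of per-file metadata (both figures vary by filesystem):

                  BLOCK = 4096   # assumed filesystem block size
                  META = 512     # assumed per-file metadata footprint (inode, etc.)

                  def footprint(n_files, file_size):
                      # Every file is rounded up to whole blocks and adds one
                      # metadata record that must be stored, scanned, and backed up.
                      data = n_files * -(-file_size // BLOCK) * BLOCK
                      meta = n_files * META
                      return data, meta

                  print(footprint(1_000_000, 1024))        # ~4 GB data, ~512 MB metadata
                  print(footprint(1_000, 100 * 1024**3))   # ~100 TB data, ~0.5 MB metadata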

              We quota our users not only on the amount of space but also on the number of files they may have. Out of the box you get a one million file soft quota and a two million file hard quota. If you want more, it is going to cost $$$. If you don't do that, you end up with lazy users who don't clear out their temp files, and you get individual users with over 17 million files that drop to ~150k when they clean up. Ever since we introduced quotas on file numbers, things have been much better. Several users have hit their quotas in the last six years; not one has been willing to pay to store more files...

              If you need to keep large numbers of files for "archive" purposes put them in a zip file or similar so the number of files is reduced.
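
              For the zip-it-up approach, a minimal Python sketch (the directory name is made up for illustration):

                  import zipfile
                  from pathlib import Path

                  # Bundle a tree of many small files into a single archive, so the
                  # storage system sees one file instead of millions of them.
                  src = Path("raw_photos")   # hypothetical directory to archive
                  with zipfile.ZipFile("raw_photos.zip", "w",
                                       compression=zipfile.ZIP_DEFLATED) as zf:
                      for p in src.rglob("*"):
                          if p.is_file():
                              zf.write(p, p.relative_to(src))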

              • My day job is looking after storage for an HPC facility so I understand the problem of file numbers acutely.

                The problem is that *EVERY* file you create has an overhead for storing and processing the metadata, to the point where one million 1 KB files are way more of a problem than a thousand 100 GB files.

                Not my problem. The service is marketed and sold as a capacity limit, not a file limit.

              • I am genuinely curious, what metadata and how much metadata per file? Is this your system keeping track of if/when the 'user' files got backed up/accessed and maybe checksums and stuff? As an ordinary random idiot, I would think that the metadata must be much smaller than the file it is referring to.

              • by Anonymous Coward

                I have similar problems on my personal servers. I have tens of millions of source code files and other small files. It's painful as hell to manage these.

                One trick I use is to put most of my source code into a loopback filesystem on top of the actual filesystem, so at the low level it's just one big file for everything. I also use a lot of iSCSI drives and virtual machine drives, which accomplish the same thing.
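
                A minimal sketch of that loopback setup on Linux (paths and size are illustrative, and it needs root):

                    import subprocess

                    def run(*cmd):
                        subprocess.run(cmd, check=True)

                    # One big backing file holds the whole small-file tree, so the
                    # underlying filesystem only ever sees a single large file.
                    run("truncate", "-s", "10G", "/srv/source.img")
                    run("mkfs.ext4", "-F", "-q", "/srv/source.img")
                    run("mount", "-o", "loop", "/srv/source.img", "/mnt/source")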


          • by stooo ( 2202012 )

            >> That's the problem... Google Drive's purpose is end users saving documents they are working with as part of their work.
            That is a lie.
            The purpose is to gobble up any kind of data.

          • To me it's unreasonable because they changed it after the fact; if they had said from the start that 5 million files is the limit, then that is fine. However, if people rely on it for whatever reason, especially if they are paying for the service, it is unreasonable to say "you know what, go and find some other place for your millions of files." It is a lot of time and effort to readjust.

            That is a problem with cloud services: you are at their mercy. They can change their terms and conditions at any time and increase their prices however they see fit.

          • A quick check of a basic single-page Angular2 app (well, to be fair, some shared code that 4 or 5 single-page apps use, but no major imports beyond the basics and some PrimeNG stuff) has just over 200 thousand files but is "only" 1.8 GB in size. Could probably clean the project etc. and reduce it if needed. 25 such projects would hit the 5 million file count limit but would take under 50 GB of disk space...

            And yes, the "correct" way to do it would be to zip or otherwise package it up into a single, possibly
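
            If you want to reproduce that kind of check on your own project tree, a quick sketch (the directory name is hypothetical):

                import os

                # Tally file count and total size under a project directory.
                n_files = n_bytes = 0
                for root, _dirs, files in os.walk("my-angular-app"):
                    for name in files:
                        n_files += 1
                        n_bytes += os.path.getsize(os.path.join(root, name))
                print(n_files, round(n_bytes / 1024**3, 1))   # e.g. ~200k files, ~1.8 GB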

          • They push Google Drive Backup, which will automatically back up certain folders from your machine to your drive. If you're a Mac user and drop applications in there, you can get a great many files given how app bundles work. If you're a web dev, your node project could have a great number of files as well. Now add in that some businesses will transfer files from an employee to their manager when they leave the company... and you've got VPs who are collecting files like crazy. It would surprise me for 5m fi
          • by dgatwood ( 11270 )

            That's the problem... Google Drive's purpose is end users saving documents they are working with as part of their work. There is realistically no way anybody ends up with 5 million files on Drive unless they are creating massive numbers of files through an automated process.

            Like... shooting 1,000 photos a day and uploading all the raw image files would get you there after about 13 years; possibly less if you were then processing the files and saving working versions.

            A thousand photos a day is chump change these days. If you shoot continuously on a typical DSLR, you can burn through 1,000 photos in four minutes. :-D
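
            For the record, that works out to 1,000 photos / 240 seconds ≈ 4.2 frames per second, well within the burst rate of a typical modern DSLR.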

            • by mysidia ( 191772 )

              A thousand photos a day is chump change these days. If you shoot continuously on a typical DSLR, you can burn through 1,000 photos in four minutes.

              Yeah, as I was saying "creating massive numbers of files through an automated process"

              Not that there is anything wrong with that. I'm just saying... recording a large number of file objects involves a lot of system directory and/or indexing resources,
              and Google Drive has been marketed as a low-cost consumer product, so someone's chosen the wrong product, made

      • You could just read the Hacker News thread?

        IIRC the outage started six weeks ago. Google has gotten rid of the expensive customers by now.

        A real storage business would behave in exactly the opposite way. Imagine writing software that relies on a Google product!

        • by Dutch Gun ( 899105 ) on Tuesday April 04, 2023 @02:39PM (#63425670)

          No sane business would ever rely on Google's services. There's a reason people prefer to use Amazon or Microsoft cloud services, despite Google being literally a cloud-only company.

          No one who is familiar with Google should be shocked or even all that surprised by something like this. Despite trying to dip a toe in these waters, it feels like Google fundamentally doesn't really understand how to offer paid products or services (see: Stadia). They view everything through the lens of their freeware + advertising empire. Otherwise, something this short-sighted could never have happened in the first place.

      • I mean no, I guarantee you most of the people caught by this were probably people abusing the un-enforced caps on workspaces for 'unlimited' backups.

        Not really. The people caught here were the ones with business accounts. We're not talking about your personal drive here. Looking at just the documents folder on my work laptop, I have close to 500k files, thanks to some apps creating a shitton of small files. Any business with 10 employees like me will hit the account cap already, and that's without backing up anything beyond documents.

        • The people caught here were the ones with business accounts.

          Yes. Workspaces are supposed to be 1 TB per user, but Google isn't enforcing this cap, so people buy a business workspace, pay for only one seat, and use more than 1 TB.

          • by Anonymous Coward

            Yes. Workspaces are supposed to be 1 TB per user, but Google isn't enforcing this cap, so people buy a business workspace, pay for only one seat, and use more than 1 TB.

            That's been incorrect for about two years now.

            Business Standard adds 2 TB for each licensed seat, and Business Plus adds 5 TB for each licensed seat.
            By "adds" I mean it raises your domain's total storage limit by that amount.

            Five Standard-licensed users will give your domain 10 TB of storage, shared across all users.
            That can be 2 TB per user, or 10 TB for one user and zero for the rest, or any other combination.

            There is NO per-user storage limit in Workspace enforced by Google.
            It is up to the workspace a

      • I mean no, I guarantee you most of the people caught by this were probably people abusing the un-enforced caps on workspaces for 'unlimited' backups.

        Who cares? If I paid for 1 TB of storage, why does it matter if it is 1 file or 10 million files that make up my 1 TB???

        • by flink ( 18449 )

          The first byte of any file is the most expensive to store. No matter the file size, you have to allocate one full block of storage, so it will consume 2 kB or whatever of physical storage. It will also consume a slot in the file allocation table for the media, which is often a finite resource, depending on the file system. For a cloud storage system, there is also metadata associated with the file, such as ownership, permissions, modification timestamp, backup policy, etc. 10M 1-byte files cost a lot more to store than 1 10MB file.
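
          You can observe that block rounding directly; a small sketch, assuming a Unix-like system and a hypothetical 1-byte file:

              import os

              # st_size is the logical length; st_blocks counts allocated
              # 512-byte units, so even a 1-byte file claims a full block.
              st = os.stat("tiny.txt")
              print(st.st_size)          # 1
              print(st.st_blocks * 512)  # e.g. 4096 on a filesystem with 4 KiB blocks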

          • The first byte of any file is the most expensive to store. No matter the file size, you have to allocate one full block of storage, so it will consume 2 kB or whatever of physical storage. It will also consume a slot in the file allocation table for the media, which is often a finite resource, depending on the file system. For a cloud storage system, there is also metadata associated with the file, such as ownership, permissions, modification timestamp, backup policy, etc. 10M 1-byte files cost a lot more to store than 1 10MB file.

            I am well aware of how FATs and permissions work. Again, Google advertises a storage limit, not a file limit.

          • by ShanghaiBill ( 739463 ) on Tuesday April 04, 2023 @09:03PM (#63426480)

            No matter the file size, you have to allocate one full block of storage

            That is not true for all filesystems.

            For instance, ZFS can store tiny files in the metadata, using no data blocks. For slightly bigger files, multiple files can be in a single block.

            I believe that BTRFS can do the same.

      • And yet you may remember the days of worrying about a lot of small files on the drive: while having free megabytes/gigabytes of space, you'd have no more unused clusters (IIRC, whatever FAT16/32 used; the larger the partition, the larger the cluster unless you used special formatting options) or, in the case of a *nix filesystem, no more inodes (one inode per 4k of disk space or whatever). Trying to write a file would get a "disk full" message, but file managers, disk use utils, etc. would report lots of un-used spa

      • You never had the ability to do unlimited backups. There was always a very explicit limit as to the total storage space.

        This was a hidden limit on the number of small files. There's all sorts of reasons why you might have it. For instance, on one of the jobs I worked on, we'd generate millions of images coming out of optical equipment at a rate of around 30 images a second, which were then thrown into a cluster and processed. You can see how this pretty quickly adds up. That's just short of a billion imag

    • by gweihir ( 88907 )

      Probably not. Google does not care about its customers. This is just one more piece of evidence that they will screw them over when they think they can get away with it. On the other hand, Google seems to have a pretty high level of corporate stupidity as well these days, because finding out whether anybody would have been affected, and how badly, should have been really easy. Apparently nobody thought to check, or the results and their implications got ignored.

      ProTip: When stealthily trying to make your service w

  • ... have you heard?
  • coming soon: need to buy the top-level plan to use that number of files.

  • They say that the Internet never forgets. I suspect at some point the Internet does forget. Storing all of the Earth's information in perpetuity is not sustainable. At some point the minutiae is not important anymore.

    • They just keep copying it to new media over and over. As long as they keep doing the backups, it doesn't go away. But eventually, in the fullness of time, there will be a day when that happens to any given piece of data. May not be for a few years yet. People training AIs nowadays have extra incentive to keep training data around, just in case.
  • Kind of feels like a no-announcement file limit cap was the result of someone's 20% project.

    • by gweihir ( 88907 )

      Yep. And then that person got fired before they got to the "impact on customers" part of the work.

  • "10 years later" [/fake Jacques Cousteau voice]

    "Not like THAT!!!"

  • Sadly, the days of "don't be evil" are long gone.
  • I'd really like to know. I haven't. Pixel phones, various Google offerings that I paid for got sunsetted without any real option to migrate to something else. As soon as I see Google is the provider of a service, I look for an alternative. As soon as the Nest infrastructure got bought by Google, I was sure I'd never own one. If migrating off Gmail weren't such a pain in the ass, I'd never do business with them again.
    • My wife and one of my daughters absolutely swear by their Pixel3XL phones, they have loved them from day 1. Neither of them are interested in upgrading.

  • a UV lamp is the best disinfectant?
