
Ask Slashdot: How Many Files Are on Your Computer? (digitalcitizen.life)

With some time on their hands, long-time Slashdot reader shanen began exploring the question: How many files does my Windows 10 computer have?

But then they realized "It would also be interesting to compare the weirdness on other OSes..." Here are the two data points in front of me:

(1) Using right-click > Properties on all of the top-level folders on the drive (including the so-called hidden folders), I quickly determined that there are a few hundred thousand files in those folders (and a few hundred thousand subfolders). That's already ridiculous, but the expected par these days. The largest project I have on the machine only has about 3,000 files, and that one goes back many years... (My largest database only has about 5,000 records, but it's just a few files.)

(2) However, I also decided to take a look with Microsoft's malicious software removal tool and got a completely different answer. For twisted grins, I had invoked the full scan. It's still running a day later and has already passed 10 million files. Really? The progress bar indicates about 80% finished? WTF?

Obviously there is some kind of disagreement about the nature of "file" here. I could only think of one crazy explanation, but my router answered "No, the computer is not checking all of the files on the Internet." So I've already asked the specific question in three-letter form, but the broader question is about the explosive, perhaps even cancerous, "population growth" of files these days.

Maybe we can all solve this mystery together. So use the comments to share your own answers and insights.

How many files are on your computer?
Comments:
  • by Espectr0 ( 577637 ) on Saturday December 25, 2021 @09:38PM (#62115281) Journal

    a lot of programs use zip files so antivirus decompress them to scan

    • Which raises the question: when you have compressed data of multiple files, should you count them as different files? Many modern games use this technique. If we look at the physical files in the hierarchy we are talking hundreds, even if the game has thousands of textures.

      If a compressed hierarchy is "one file" then we can easily talk about operating systems that are only a few hundred files...
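
      A quick way to see the mismatch for yourself, assuming Info-ZIP's zipinfo is installed ("archive.zip" is just a stand-in name): the filesystem sees one file, the scanner sees thousands.

      ls archive.zip | wc -l           # one file on disk
      zipinfo -1 archive.zip | wc -l   # entries hidden inside it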

      • by tlhIngan ( 30335 )

        What about hard links? Hard links are basically a directory entry that points to the same data block as another directory entry.

        On UNIX systems you see this with "..", which is a hard link to the parent directory. This makes sense logically, but it is also what makes preventing parent-directory traversal hard, since you basically have to scan for it manually.

        But are hard links one file? Multiple files? Which brings up the question of what makes up a file - the data block, or the metadata about that data block?
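
        A minimal sketch on any Unix box, assuming GNU coreutils ("original" and "alias" are made-up names). Tools that walk directory entries, like find, will count both names, even though there is only one inode and one set of data blocks:

        touch original
        ln original alias        # a second directory entry for the same inode
        ls -li original alias    # both names show the same inode number
        stat -c %h original      # hard-link count: 2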

    • by Kisai ( 213879 )

      Only Java does that.

      The problem is what the user is selecting.

      Windows is:
      C:\Windows
      C:\Program Files\Microsoft Windows
      C:\Program Files (x86)\Microsoft Windows
      C:\ProgramData
      C:\Users\*\AppData
      etc
      These correspond to Linux/BSD's /root /bin /sbin /usr/bin /usr/sbin etc

      If you install only your OS to the C: drive, and then tell absolutely everything to install only to another drive, you'll be sadly disappointed: the Microsoft Windows "Store" installs everything into c:\Users\, not c:\program files\.
      More to the point though, …

      • by pjt33 ( 739471 )

        Windows Defender most certainly counts files inside zips, msis, exe resources, etc. It's easy to verify if you have a Windows machine: ask it to scan one of those files.

        • by Briareos ( 21163 )

          Windows Defender most certainly counts files inside zips, msis, exe resources, etc. It's easy to verify if you have a Windows machine: ask it to scan one of those files.

          Well, duh - non-Windows machines won't have Windows Defender... :P

    • by AmiMoJo ( 196126 )

      In this case though it's down to the WinSxS folder.

      It lives inside the Windows directory and is used to store multiple versions of system files, to avoid DLL Hell and allow updates to be rolled back. Most of the files in it are actually symlinks, but Windows Defender checks them like normal files.

  • by magnetar513 ( 1384317 ) on Saturday December 25, 2021 @09:48PM (#62115301)
    one, two, three, oh, wait here's another one...
  • "Programmers" these days don't write software so much as sloppily glue libraries together. This leads to simple projects ballooning up in size. What would have fit neatly on a floppy, now requires multiple gigabytes of storage and uses an absurd amount of memory.

    "Look how easy it is" I watched someone demonstrate the other day "Just add one line here and it will automatically download this package and all of its dependencies!" It made me sick.

    the broader question is about the explosive, perhaps even cancerous, "population growth" of files these days

    ... is easily explained.

    • Re: (Score:2, Offtopic)

      by Aighearach ( 97333 )

      If they'd just choose libraries with few prerequisites they'd still do all right. The biggest part of the problem, in my view, is that they don't care how many libraries their libraries use. If they'd just add one line to the "cons" of a library for each other library that it requires, they'd be able to glue libraries together without causing a lot of bloat.

      It isn't like good programmers don't use libraries, or like non-bloated software meant doing everything by hand. Usually they just chose lighter …

    • by boxie ( 199960 )

      As a developer, it comes down to $$$

      Storage is cheap. RAM is cheap. Compute is Cheap. Developer time is Expensive.

      For compiled languages, the number of dependencies can be compiled away

      For Javascript/web, webpack et al rewrite things down to a few files

      The cost of deep dependency trees is usually hidden from devs at decision time

      • Re: (Score:2, Troll)

        Storage is cheap. RAM is cheap. Compute is Cheap. Developer time is Expensive.

        Except that you're only paying for one developer, not five billion users' hardware. But it's OK, the company has saved nearly $4.50 in development costs so it's a win for them to socialise the losses as much as possible.

      • Storage is cheap. RAM is cheap. Compute is Cheap. Developer time is Expensive.

        Also: Storage is fast. RAM is fast. Compute is fast. Developer is slow.
        Which means it's usually more efficient (note: I didn't say "better") to throw hardware resources at a piece of software, rather than spend even 20 hours optimizing it.

    • by kmoser ( 1469707 )
      It's bad enough that they pull in dependencies left and right like a kid in a candy shop. But what's worse, these dependencies often contain security vulnerabilities.
    • by noodler ( 724788 )

      [q]"Look how easy it is" I watched someone demonstrate the other day "Just add one line here and it will automatically download this package and all of its dependencies!" It made me sick.[/q]
      Aah, so you're the type of programmer that writes all their drivers for all their devices...

      • by narcc ( 412956 )

        Ah, so you're the type of programmer who builds software by gluing giant libraries together haphazardly and copy/pasting crap from stackoverflow. Simple CRUD apps don't need to be 5+ gigabytes in size and should start instantly, you know.

        I've seen idiots import gigantic do-all libraries to make use of a single function. I've also seen libraries imported to make use of functionality already present in other dependencies. The difference between you and me is that I find that incredibly stupid.

        20 years or …

        • by noodler ( 724788 )

          I've seen idiots import gigantic do-all libraries to make use of a single function. I've also seen libraries imported to make use of functionality already present in other dependencies.

          Sure, and that's bad. But that doesn't make libraries evil. As with anything in programming, you have to know what you're doing. Without knowing the context, I could agree with the person including a library by 'just adding one line'.

          I've seen idiots import gigantic do-all libraries to make use of a single function.

          Idiots are going to idiot. There are a lot of idiot traps in lean code/coding too. In fact, the whole of programming is a minefield of fuck-up opportunities. It's nothing new.

          You're the kind of person who would have burned the rest of the day looking, then recommended outsourcing the development, aren't you? Or are you the kind of person who would use the first thing you found, then spend the next week trying to modify it with "plugins" until it did something similar to what you wanted?

          That sort of question is very context sensitive. I wouldn't program my own web server, for instance. It rea…

  • by stinkydog ( 191778 ) <sd@stran g e d o g . net> on Saturday December 25, 2021 @09:52PM (#62115309) Homepage

    Including or excluding the "Homework" folder?

  • by jonnythan ( 79727 ) on Saturday December 25, 2021 @10:04PM (#62115319)

    WinDirStat is a handy utility for counting files and looking at disk space allocation. Be sure to run as admin.

    Mine says I have just north of 650,000 files and 190,000 directories on my Win 10 PC with 2 users.

    My Linux Plex server has 405,000 files and my TrueNAS file server has about 100,000 (if you exclude the files being served; an extra 250,000 if you don't).

    • 82,000 media files on my platter drive and 551,000 on my SSD with Win 10 and various applications.

      How about oldest files? Mine would be an mp3 file of David Bowie dated 6/23/1996.
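
      If you want to hunt down your own oldest file, here is one way, assuming GNU find: print each file's modification time as epoch seconds, sort numerically, and take the head.

      find ~ -type f -printf '%T@ %p\n' 2>/dev/null | sort -n | head -5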

    • It is blind to a lot of stuff. Last time I used it, it would not detect anything that was WPA (Microsoft Store format).
    • Oh great, now we have software packages for counting files.

    • I've got about 830K files on my C: drive, a 200GB SSD, according to Windows Explorer. Lots of C++ projects with all the build cruft that entails, plus a whole bunch of SDKs and applications, and 12 years worth of miscellaneous crap in my user folder.

      There are 192K files on my D: drive, a large HDD, with all my Steam/GoG games and other stuff that won't fit on my SSD.

      • You obviously never installed Android Studio.

        Android Studio will easily double that.

        • If you want to see absolutely breathtaking numbers of files, try ESP-IDF (for the Espressif ESP32, which bundles MinGW) + PlatformIO (which bundles Python) + IDE (which almost inevitably bundles OpenJDK). Or Unity, which installs its own private copy of the Android SDK (which isn't necessarily a bad thing, given the amount of ongoing manual labor it used to take to keep Android Studio updates from breaking Unity's build system every few days) and more (at least, if whatever you're doing involves Android). Some…

          • Last time I uninstalled Android Studio I ended up deleting about a million files manually after the uninstaller refused to touch them.

            (spread out over five or six folders on the disk)

            • Oh, building AOSP is even worse. A full mirror of the AOSP source code is 769 gigabytes. And AOSP isn't even the biggest mirror out there. By Linux-distro standards, it's actually around the middle of the pack size-wise.

              For some truly eye-popping statistics, check out https://mirrors.tuna.tsinghua.... [tsinghua.edu.cn] to see just how big the source-code repos for popular projects are. Qt weighs in at 1.64TB. Anaconda and PyPI are 6 and 9 terabytes. By comparison, Ubuntu and Debian are only around 1.6TB. Eclipse is 881GB, A…

  • FWIW, be aware that in a default Windows install there are some filesystem junctions in the user profile folders that point back at their own parent directory. If a "dumb" program blindly follows the directory structure without taking that into consideration, it may end up in an infinite loop following circular references, counting the same files/folders over and over again, progressively getting slower and slower once it's a few hundred levels deep.
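
    The Unix analogue is a symlink that points back up the tree. A small sketch of the hazard, assuming GNU find ("trap" is a made-up name): by default find never follows symlinks and returns immediately, while -L follows them but detects the cycle and warns rather than looping forever. A naive scanner that blindly follows links is the one that hangs.

    mkdir -p trap && ln -s .. trap/up
    find trap      # doesn't follow symlinks; finishes instantly
    find -L trap   # follows them; GNU find reports a file system loop
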
    • When I read the original post this is what I thought surely must be occurring. Hilarious, if not unexpected, that it's the MSRT that's hung in that loop.
      • The anti malware scanner is scanning (and counting) files inside archives.
        • by shanen ( 462549 )

          Though the discussion has wandered widely, I'm inclined to believe the archived files (mostly within system libraries) are the most likely explanation. Mea culpa, but I forgot an important detail (again). If MSRT is looping within the directory structures, that's a pretty major bug. Even MS should have caught it by now...

          By the way, it's still running and the progress bar hasn't moved much. That could be evidence for looping?

          As regards the wandering of the discussion, I have two reactions. One is that I pro…

    • by splutty ( 43475 )

      This also creates a huge issue with most backup/restore programs when you're trying to 'restore' the home directory of another system.

      Even if you tell it to restore into say L:\OtherSystem, it'll create shit like Appdata/Appdata/Appdata/Appdata/Appdata

  • Hmmm (Score:4, Informative)

    by JustAnotherOldGuy ( 4145623 ) on Saturday December 25, 2021 @10:12PM (#62115329) Journal

    I'm using Linux Mint 20.2 Cinnamon. I ran this from the top of the filesystem (/) as the root user:

    find / -type f | wc -l

    And it returned this:

    find: ‘/run/user/1000/doc’: Permission denied
    find: ‘/run/user/1000/gvfs’: Permission denied
    2078631

    So there were a couple of places it couldn't delve into; I don't know how much difference it would make overall.

    But basically there were 2,078,631 files it could "see".

    • Thanks for the inspiration. Were you running that command as a user or as root? As a user there will be files you are not allowed to read; as root I expect you should catch all of them. I'm off to try.

      • Sorry, I looked back and you were running it as a user. I tried and got 2711238 on an Ubuntu MATE 20.04 installed soon after April 2020, carrying the files forward from previous systems. Some of it porn, but not what the other fellow was looking for.

        • Crap, you were running it as root. I just need a few more days' sleep and then I'll be right as rain.

          • Lol, no worries. Thanks for running it and reporting your results.

            I do have some files carried over from previous installs but not too many. I usually copy my home directory off, wipe the disk, install the newest version, and then copy my home directory back.

            I'd guess that my Mozilla/Firefox directory contain a *bunch* of files that wouldn't be there in a 100% fresh install, and I'm guessing it could be a lot.

            The next time I do a wipe/reinstall I'll run this same command again just out of curiosity but hope…

            • True. Firefox builds a huge cache of files so it doesn't have to re-download things if you visit the same site over again.

              On the other front, a freshly installed system: I have a freshly installed Ubuntu MATE 21.10 that is waiting for me to turn it into a DLNA media server and it counts 433617 files. This had a fresh install, software updates, and just enough browsing with Firefox to find a wallpaper of Orpheus, which will become the machine's name when in service. I haven't installed a DLNA server yet as…

    • You should have booted from a live USB, so you don't count the "everything is a file" entries in virtual places, and so you get a non-fluctuating number of objects on the actual disk. What counts as a file is a question clearly answered at the filesystem level, not on a live system with virtual files.
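
      For what it's worth, you can get a stable count of on-disk files without rebooting by pruning the virtual filesystems; a sketch assuming GNU find, with -xdev as a blunter alternative that just stays on the root filesystem:

      sudo find / \( -path /proc -o -path /sys -o -path /dev -o -path /run \) -prune \
          -o -type f -print | wc -l
      sudo find / -xdev -type f | wc -l   # don't descend into other mounts at all
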
      • You should have booted from a live USB

        Ain't nobody got time for that shit.

        Feel free to interrupt all your work, save all your files, close all your tabs and windows, etc etc if you want to.

        Obviously YOU didn't, so don't bother telling me what I should do to satisfy your requirements if you aren't willing to do the same.

        I provided the command that I used so others could use it as a benchmark of sorts for comparison purposes. I'm sure I could whip up 20 different command lines that would return 20 different counts, but for what?

        Run the command, r…

    • I ran sudo locate -bc '' and got 2341834 files. This is more than I got running without sudo. I think I may be excluding some directories, but this is close.

      • I tried on macOS and it said -b isn't a valid option, and it complained about the locate database not having been generated.
        This is one of those times when macOS feels less unixy than it pretends to be.
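
        macOS ships BSD locate rather than GNU locate, so the flags differ. If memory serves, the database is built by a launch daemon you have to enable first, and -S then prints database statistics, including the number of filenames:

        sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.locate.plist
        locate -S    # database statistics, filename count included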

        • I have a network mirrored drive with all my working files I can't lose (code, circuit designs, book source etc).
          $ find . -type f | wc -l
                40908

          I ran it over the whole disk and it's still running several minutes later. It'll probably land in the 1-3 million range like everyone else. At least for the 40908 files I keep in backups, I know what each one is for.

          • It finished - 8,200,104.

            That's on a macOS laptop, after three generations of macOS laptops transferring the files forward.
            Given that 40,908 of them are my files, 8,159,196 other files seems a lot for OS and application cruft.

    • Ubuntu - followed the same procedure; 1,217,669

    • by cobbaut ( 232092 )

      Debian 11

      # find / -type f | wc -l
      find: '/run/user/1000/doc': Permission denied
      find: '/run/user/1000/gvfs': Permission denied
      743428

      I've been using this laptop for three years.

    • Indeed - I have been transporting project trees from one computer to another over the years, and a quite similar search yielded 5409258 files for me. It is interesting to prepend "time" to the find call and see that it takes less than a minute to get this result on an SSD, compared to the night-long Windows scans.

    • by jvkjvk ( 102057 )

      Manjaro latest installed last week

      669836:

      ~82000 in ~ (personal stuff)
      ~12100 in Music

      ~575000 system files

  • chkdsk says there are 1,340,033 files.

    I can't think of a more reliable way to find out how many actual files there are than to run chkdsk.

    Using find -type f on my dockstar gives a mere 44,642 files.

  • Quite a few (Score:4, Informative)

    by kerashi ( 917149 ) on Saturday December 25, 2021 @10:15PM (#62115337)

    Somewhere between 2 and 5 crap tons.

    I'm a packrat and it's taking too long to get a more accurate measure.

  • by znrt ( 2424692 ) on Saturday December 25, 2021 @10:20PM (#62115341)

    ... just don't deserve an answer. Good luck in your future endeavors.

  • QDirStat just finished counting and I have about 5.5 million files. The source code for DD-WRT (1.3M), some GCC cross-toolchains, and a backup of the directory they were in (on a different drive) make up the lion's share of the files on my computer.

    You asked.

  • It's searching inside zip files. And installers, things like installer EXEs. And it's counting the files within those zip files. So 1 zip file could = 15,000 "files" (to be scanned), say.

    Most antivirus programs will even say so, that they are searching inside those files.

    Not trying to be mean here, but did you start using computers like yesterday?

  • by 3vi1 ( 544505 ) on Saturday December 25, 2021 @10:43PM (#62115395) Homepage Journal

    I don't think you can get any sense of what is "normal" unless you restrict a survey to the OS files themselves.

    On Linux, I have over 1.2M files in just my source code directories... and another 1M files in just my steamapps subdirectory (haven't run out of space, so why ever uninstall). So it's extremely easy for systems to deviate by millions of files depending on the user's use-case.

  • Using a simple "find . -type f | wc -l" tells you a lot.

    On a Raspberry Pi running some embedded stuff and a lot of python libraries:

    272706

    On a MacBook Pro with a home directory transported from machine to machine since about 2002:

    760360 (in my home directory)
    4611064 (all files)

  • And got a surprisingly low 188,170

    Using"sudo find / -type f | wc -l "

  • >" How many files are on your computer?"

    Linux. Home: 1,969,779
    Linux. Work: 3,651,947

  • There are some directories which can't be traversed, even as root - but my count turned out to be

    1407888

    However that's ignoring the Linux Mint partition.

  • The title says it all. I have 2,995,171 files.
  • My RISC OS machine has 50237 files, 3350 of which are in the Boot (system) directory.

  • Linux Mint 20.2 xfce.
    $ sudo su -
    # cd /
    # ls -laR | wc -l
    12,685,769
    #
    There is overhead in that command; I am guessing about 11.5M files.
    Computer is a
    Mintbox3
    Intel(R) Core(TM) i9-9900K CPU @ 3.60GHz
    Nvidia 1660
    RAM: 32 GB
    All power savings turned OFF. My money! But runs pretty cool anyway.
    This is a Windows-Free site.
    Got three of these running my Trucking Business.
    All business software written "in house" by me.
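
    For anyone wondering about the overhead in that ls -laR count: every directory contributes a header line, a "total" line, and "." and ".." entries to the output, so the line count runs well past the number of actual files. A tiny demo with made-up names:

    mkdir -p demo/sub && touch demo/a demo/sub/b
    ls -laR demo | wc -l       # a dozen or so lines for just 2 files
    find demo -type f | wc -l  # 2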

  • How is it possible that no reply here has yet mentioned df -i?
    You can then use Samba to do likewise by mounting the Windows filesystem from Unix.
    • Damn, I wish I had mod points right now. The other option is to stat the mount point. I am sure there are similar options on Windows.
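
      For instance, a sketch assuming GNU coreutils: df -i shows inode usage per filesystem, and stat -f gives similar numbers for a single mount point. Bear in mind that inodes count directories and symlinks too, and hard-linked names share one inode, so this is yet another definition of "file".

      df -i /      # inode totals and usage for the root filesystem
      stat -f /    # filesystem status, including total and free inodes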

  • Ubuntu work/home laptop:

    > sudo find / -type f | wc -l
    find: ‘/run/user/1000/doc’: Permission denied
    find: ‘/run/user/1000/gvfs’: Permission denied
    1279563

    > find ~/.config/vivaldi -type f | wc -l
    8609

    Does it really need that many files to run an online document viewer? Vivaldi is maybe a bit better than most. They are all massive memory hogs.

  • If you have Previous Versions turned on, you have even more files. Actually, they're blocks of changes, not really files, but do they count as one?
  • I ran the following on macOS 12:
    sudo find / -type f 2> /dev/null | wc -l
    Aside from it being prevented from looking in a number of directories due to SIP, it told me there are 3,914,709 files on my computer.

    I fondly remember the days of Mac OS 9 (and prior) when the total number of files installed by the OS was countable in double-digits, and I could look in the System folder and tell you at a glance exactly what each and every one of the files in there was for.

    Now, nearly 4M files on my computer (of whic…

  • ...how many files my computer has but I think some of them are haunted.
  • sudo find / -type f | wc -l
    reports
    4667775

    on an Arch system with some storage disks

  • In Windows, right-click the top-level directories on each drive and select Properties. The resulting dialog box will tell you how many files are in those directories (altogether, not just directly underneath). You will need to add them up across all top-level directories, but most (sane) installations don't have too many top-level folders per drive. You will also need to do this as an administrator with elevated privileges, or you won't have the rights to view/read some of the files.
  • This reminds me of a lunchtime back-of-the-envelope discussion from about 1994. My co-worker calculated that it was not worth deleting any file smaller than 10 KB, because the time it would take you to inspect the file and decide to delete it cost more than the disk space used to store it.

    I suspect if you redid that calculation, you would be up to a 1 MB file as "not worth the time to delete."

    And that raises a related question: of those millions of files on your hard drive, how many are actually needed?
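
    Redoing that envelope math with rough 2021 numbers, assuming ~$20/TB disk and valuing your attention at a modest $20/hour: a 1 MB file occupies about $0.00002 of storage, while five seconds spent inspecting it costs about $0.03 of your time, enough to buy over a gigabyte of disk. By that measure the "not worth deleting" threshold is now closer to 1 GB than 1 MB.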

  • You probably won't ever see all of the files. I run into instances where the folder and file permissions are set in a way that, even running things as admin, I can't see or access them. Most likely they're set for the system to read only. The Windows directory and ProgramData folders are areas where I see this a lot when fighting things on users' computers. Then, as others mentioned, there are compressed items like zips and cabs.
  • A late-2013 Retina MacBook Pro that's been used mostly, although not exclusively, for web / email / office-y stuff says 2,684,313.

  • by DidgetMaster ( 2739009 ) on Sunday December 26, 2021 @10:40AM (#62116501) Homepage
    Given the cheap nature of mass storage (about $20 per TB for a typical HDD), packrats like me store everything. The more relevant question is 'How long does it take to find that file you misplaced?' The more files you have, the longer it takes. Say you're looking for a photo you know you took about a year ago and transferred from your phone to your laptop, but you forgot where you put it. Finding it on a really big drive that is full of other stuff can take a long time, especially if you can't remember its exact name or what format (JPEG, PNG, GIF, etc.) it used.

    This is one of the biggest problems with antiquated file systems (most of them were first released when a 1GB drive was huge!): they don't help you find anything. Attaching metadata tags to files is also a pain, and searching based on them is slow. I am building a new object store that I hope will replace file systems. I have a container where I created 50M files, and I can find all the photos (over a million of them) in under a second.
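
    Short of a new object store, GNU find can brute-force the "photo from about a year ago" search by type and date; the extensions and the date window here are only examples to adapt:

    find ~ -type f \( -iname '*.jpg' -o -iname '*.png' -o -iname '*.heic' \) \
        -newermt '2020-12-01' ! -newermt '2021-03-01' 2>/dev/null
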
  • ... archives and RPMs: 12,900,585

    It finished while I was getting a cup 'o' tea.

  • Windows 7 Ultimate SP1 x64

    254,696 files
    38,718 folders

    Much (most?) of that represents artifacts of Windows, not anything I created.

  • C drive:
    Total Files Listed:
    754206 File(s) 312,351,076,201 bytes
    343736 Dir(s) 198,795,538,432 bytes free

    D drive:
    Total Files Listed:
    907764 File(s) 3,413,834,573,332 bytes
    123525 Dir(s) 2,583,813,844,99…

  • My very basic Dell laptop has just under 310000 files and 131000 folders in c:\windows, consuming 23.4 GB (about 10%) of my tiny SSD.

  • "There are no stupid questions?"

    "Are you sure? The question was basically answered in the FP."

    Anyway, I feel obliged to say thanks for the thoughtful responses, but I still wish there had been more Funny comments in there.

    Not sure if it counts as a punchline, but MSRT eventually finished; apparently, though, there's no way to check the results. I wanted to see the final number. And the status report should say it's counting "executables", not "files".

    And still, the total of more than 12 million seems ridiculous.

  • I read it as flies. The answer being RAID?
  • Not as many as there are flies. I suppose that's the price of making a case out of bacon and frosting.
  • 2.08 million files on my work drive in 86 thousand folders, which includes complete copies of my boot and work drives going back to ... DESQview on a 286, if not a little earlier, but the copies of older systems are insignificant in size.

  • On my DOS-only 286, 723 files.
    On my Win95 box, just over 2 million.
    On the current conglomeration of several networked PCs and external drives... somewhere around 20 million, at a guess.

    [looks on linux box]
    Filelight reports about 15 million files on this box alone.

  • Only a few thousand files of my own, but over a million more that result from backing up about half a dozen Windows, Android, and Linux computers elsewhere on our network.

    Many of them unique to and/or part of various pieces of poorly-written proprietary crapware.

    People still complain about the Windows registry, and not without reason. But it's a true model of sanity compared to how many Windows (and other) programs create tens of thousands of tiny little files, completely oblivious to how long it take…
