Data Storage / Games

Warzone Dev Says Game Is Losing Players Over 'Insane' Download Sizes (arstechnica.com) 92

An anonymous reader quotes a report from Ars Technica: For years, players have complained that ballooning game download sizes are clogging up hard drives and Internet bandwidth. In a recent interview with streamer TeeP, Call of Duty: Warzone Live Operations Lead Josh Bridge admitted that the game's massive file size is also impacting the team's ability to release new maps. Asked about the possibility of adding the original Verdansk map to cycle alongside the game's current Caldera map, Bridge said, "We want that. We all want that," before addressing the "technical problem" that makes it difficult: "The install and re-install sizes are fucking insane, right? If we pulled out Caldera and say we're gonna drop in Verdansk, this could be essentially re-downloading, like, the size of Warzone," he said. "And every time we've done that, we lose players," Bridge continued. "Because you're kind of like, 'I don't want to re-download that,' [so you] uninstall. I think you can't fit anything else but Warzone on a base PS4."

Bridge is exaggerating, but only a little. Activision says you need a whopping 175GB of hard drive space on PC for a Warzone install. On Xbox, the base download is listed at nearly 92GB, similar to the size on PlayStation systems. Adding Modern Warfare onto the Warzone package increases the total size to about 250GB on PC and 150GB on consoles. About a year ago, Activision announced that the "larger than usual" Warzone "Season 2 Reloaded" patch would reduce the game's "overall footprint" on hard drives by 10-15GB (and 30-35GB when combined with Modern Warfare, depending on the platform). The "data optimization and streamlining" in that update would also ensure that "future patch sizes for Modern Warfare and Warzone [would] be smaller" than the 57GB update being offered at that point.

The results over the ensuing year have been mixed. A February Season 2 patch required only about 11GB of file downloads, for instance, while a December 7 update that introduced new maps required a 41-45 GB download on consoles. [...] In any case, Bridge was remarkably frank about Warzone's file size issues and said that "looking to the future, we're putting a lot more effort into how we sort that out on a technical level so that we can have that [map] rotation. We've been really looking at it, so we'll have more to talk about that, but that is ultimately a goal to ensure that there's a freshness and a variety of experiences."

This discussion has been archived. No new comments can be posted.

  • With the file sizes being large enough to be a full SSD, they should just provide you with an SSD with it preinstalled. It could be wrapped in some sort of 'cartridge' to make it easier to attach/detach from your console.
    Genius!
    • by bjoast ( 1310293 )
      Man, I hate having to blow into the M.2 slot every time a game doesn't start properly.
      • Man, I hate having to blow into the M.2 slot every time a game doesn't start properly.

        OK, that is funny!

    • That would mean shipping a finished product on time.

    • We are intrigued by your idea and wish to subscribe to your newsletter.

      -- the major video game studios

    • I remember when Square did literally that. The PS2 version of Final Fantasy XI came with a PS2 hard drive (not an SSD, but an HDD) with FFXI pre-installed on it. That's why you could only play PS2 FFXI on a "fat" PS2, because it was the only one that had a bay for installing the hard drive. Alas, FFXI on the PS2 was discontinued some years ago.

      • by Megane ( 129182 )

        Before they discontinued it, they had at least one incident where an update filled up the partition, and you had to re-install from the optical drive. That was when I found out that my PS2 optical drive had gone out of alignment again, and I wasn't going to spend two or three hours tweaking that stupid white gear when I already had the PC version working.

        Also, the limitations of the PS2 had effects on the game that are still being felt, not the least of which was that creating new content required a PS2 de

    • by Kisai ( 213879 )

      You jest, but this is the same suggestion that was offered back in the days of CD-ROMs and floppy disks, when games and programs came on a dozen disks.

      What's the best solution though? Well the developers need to do some self-reflection. Clearly how they package and partition their game is a problem. These consoles only have 8GB of memory, so how do you justify a 100GB game unless there's like 12 full reloads of data? These massive patches are obviously because they are making changes to the game tha

      • by laird ( 2705 )

        Even cheap SD cards are spec'd to 3,000 repeated writes to the same sector, and 100,000 isn't unusual. And these days the SD card's controller maps out failed blocks, so the only real impact would be that the SD card's capacity might eventually decrease. For wholesale game updates, where huge amounts of data are written linearly, fairly rarely, SD cards wearing out isn't a factor. Where it comes in more commonly is when doing many small incremental writes, such as when apps (e.g. on a Raspberry Pi) write to

      • by raynet ( 51803 )

        They should use compressed assets and data should be optimized by size. Nowadays everything is kept in a format that is easy to stream into the GPU, thus it takes much more space than some years ago. So they should have local optimizer/unpacker, so the patch would be highly compressed and then expanded and patched into the streamable local files that take hundred+ gigs.
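
        A minimal sketch of that local optimizer/unpacker idea, in Python with only the standard library. The file layout, offset, and function names are invented for illustration; real engines have their own container formats, and zlib here just stands in for whatever codec they would actually use.

        ```python
        # Server side: ship the replacement asset compressed as hard as possible.
        # Client side: expand it and splice it into the local, GPU-streamable
        # (uncompressed) asset file, so only the small compressed form ever
        # crosses the network.
        import zlib

        def build_patch(new_asset_bytes: bytes) -> bytes:
            return zlib.compress(new_asset_bytes, level=9)

        def apply_patch(patch_bytes: bytes, asset_path: str, offset: int) -> None:
            expanded = zlib.decompress(patch_bytes)
            with open(asset_path, "r+b") as f:   # existing local asset container
                f.seek(offset)
                f.write(expanded)
        ```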

        • Or use algorithmically-generated content, then an entire update would just be a single 32-bit seed. For example here's the seed value for World of Spacecraft for an entire new universe that you can generate yourself and play in either single-player or online:

          0x3DAF38FB.

          Enjoy!
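
          Joke aside, this is how seeded procedural generation actually works: the same seed reproduces the same content on every machine, so the "download" is a handful of bytes. A toy sketch in Python; the star-field layout (and "World of Spacecraft") is of course made up.

          ```python
          import random

          def generate_sector(seed: int, stars: int = 5):
              rng = random.Random(seed)      # same seed -> same sector everywhere
              return [
                  {
                      "x": rng.uniform(-1000.0, 1000.0),
                      "y": rng.uniform(-1000.0, 1000.0),
                      "class": rng.choice("OBAFGKM"),
                  }
                  for _ in range(stars)
              ]

          print(generate_sector(0x3DAF38FB))  # every player regenerates identical data
          ```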

      • Personally I liked the physical install disks. For the longest time it was faster to pull a game off a CD-ROM than to download it. Now, with almost everyone being exclusively download-only and with massively huge sizes even for relatively modest games, it's still hours for downloads. At least it isn't days anymore for me, but there are people for whom it would be days. The problem is even bigger if you have bandwidth caps, extra fees if you exceed the cap, family members trying to watch TV or get work done

      • by tlhIngan ( 30335 )

        There is an argument for putting games on SD-cards as a distribution mechanism,

        Yeah, if you want $200 games.

        A reasonably sized SD card for something like this, which consumes a whopping 100GB, would cost perhaps $10 using the cheapest of the cheap crap. But writing 100GB of data to such an SD card is easily an operation that will take 10-20 minutes or more. This means mass manufacturing will take time to produce; sure, there are robots and such, and writing can probably take place 64 cards at a time, but you'

      • by Reziac ( 43301 ) *

        Well, they wanted realism in their gaming environment... seems like it's done got out of hand.

        I'd be quite "never mind" myself, given my top download speed of 2GB/hour.

    • I like the idea. Something you attach through USB 3. Part ROM, part flash. Should be easy to make cheap ROM with densities similar to flash. Also complicates piracy.
      Would make the game expensive though. On an SSD you can still erase and install another game.
  • by reanjr ( 588767 )

    I hope your game crashes and burns and creates great levels of hatred for your company who has spent all their money on hi-res textures to re-skin game engines built in the 90s, instead of hiring a fucking release engineer to tell you your engineering team is filled with fucking morons who don't understand what a patch is or how to design a basic data format.

    I fucking hate modern video game companies. They're complete trash.

    • I would also be willing to bet that a lot of these textures are uninteresting enough that they could be procedurally generated rather than stored as images. All you would need is an algorithm and seed parameters for a huge variety.

      I don't play this sort of game, but I imagine there has to be multiple square miles worth of content to be this huge.

      What I really don't understand is why patches are huge. Are they sending GB of content changes with each patch, or is it mostly code? Some sort of terrible de-du

      • Pretty sure it's mostly the latter. The game is a giant binary blob of data vomit which gets regenerated into another - different - giant binary blob of data vomit with no respect to change management.

    • by DarkOx ( 621550 )

      Games have ALWAYS been about trade-offs. It's just a question of which trades you make.

      Storage and network capacities have just EXPLODED over the past decades compared to what they were in the 90s. That went mostly unnoticed as far as the video game world went, because at first memory was the bottleneck, then CPU, and later GPU (often still is). While that was the case we first got big high-rpm hard disks, and high-capacity optical media, DVD, Blu-ray. It made sense to unpack ass

      • The pendulum will need to swing back to procedural generation

        It needs to do nothing of the sort. The reality is that most games are simply not optimised for compartmentalised delivery. I can't remember which game it was recently, but I recall one patch that was effectively a re-download of the game; the end result was a 40GB reduction in game size, and subsequent content patches went from the mid double-digit GBs to the low double- / single-digit GBs.

        The problem is how the game is delivered relies on a "patch" resending shit that the users alr

      • I can't wait until RPGs can actually deliver the amount of content we used to see in the early 90s. Nvidia's face generation tech, coupled with deepfake voice generation from actor seeds could lead to games with depth like we've never seen before.

        Or they can just continue to re-release Call of Duty and contribute nothing to entertainment.

    • by lsllll ( 830002 )
      Bah! These developers don't have anything on the developers of Ark. My "ark" folder on my HD (yeah, I had to move it off my NVME because it takes so much space) is a whopping 231GB. That's without any mods and addons.
  • Finally finding an hour to play and being faced with a mandatory system update and/or a game update has been a huge problem since the PS3 era.

    While I appreciate that modern games get content updates after shipping, often for free, I also dislike the practice of rushing games out unpolished to meet release dates.

    That's why now, I almost exclusively play on the Switch. Updates, if any, are small and usually happen while the system is asleep.

    When that rare hour shows up, I don't want to spend the bulk of it watching a
    • Nintendo is one of the few companies that still cares about user experience.

    • THIS to everything that you don't use every day!

      When I camp, I take a tablet with a couple of games and Netflix for offline viewing. The tablet is basically for this purpose, and I camp about every 2 weeks.

      2 weeks is enough that every freaking game needs large updates, the sort of thing I won't do while tethering. So I have to do this as part of trip planning, I can't just "bring" the tablet (Netflix will auto-download things for offline, it's actually really useful).

      And these are games designed for offli

      • by Rhipf ( 525263 )

        I you are taking your tablet and Netflix with you are you really camping though? 8^)

        • by Rhipf ( 525263 )

          Grrrr.
          Hit submit too soon. Obviously that should have been "If you are ..."

        • Solo camping, and lots of it. Camping is my mental time away from everything.

          When solo, it's my time. Watch a show, shoot the BB gun, hike, stare at fire. Whatever. It's interesting to confront a day with no obligations and only the distractions I bring or create (and possibly not seeing another person the entire time).

          And I live in the city, but grew up on a farm. I enjoy both urban and rural-to-complete solitude.

    • That's why now, I almost exclusively play on the Switch.

      That's also pretty much why I didn't order a Steam Deck. I already own a console that is ostensibly portable (a Switch) and it lives in its dock. Generally, I only have free time to game when I'm at home anyway, so I may as well use a full-fat PC rather than Steam's half-baked portable device.

      The Switch is the only console that still feels like what console gaming used to be, where you can just pop in a game and immediately start playing.

    • by antdude ( 79039 )

      Yeah, same here; I also just don't have the energy to play them, especially the grinding types. I just play quick games when I can.

  • Everything nowadays is so over-bloated, it's insane. How can developers generate and rely on excessively bloated code and libraries? Does no one learn how to create efficient and compact code anymore? Apparently decades of Windows have shown that we will take it up the rear for bloat in the code. Super-size it? Sure...

    Not everything requires the kitchen sink.

    • Geez, the code is probably 1% of the size of the whole thing. The rest is artwork. High-resolution models, textures, etc. But if you don't like eye candy feel free to enjoy Nethack in all its ASCII-art glory.
      • by Tom ( 822 )

        Nethack being a better game than most of the current DLC-advertisement systems and microtransaction delivery platforms that have a thin layer of gameplay on top, that's actually pretty good advice.

      • That still doesn't explain why every single file has to be downloaded when changes are made to some art assets and/or code. Rather than just pushing the new files, or only the files that changed, the most common practice I see is to push the entire thing.
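
        For what it's worth, the file-level version of this isn't complicated. A rough sketch, assuming a hypothetical manifest of per-file hashes published alongside each build; only files whose hashes changed would need to be fetched.

        ```python
        import hashlib
        from pathlib import Path

        def manifest(root: str) -> dict[str, str]:
            """Map each relative file path under root to a SHA-256 of its contents."""
            return {
                str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
                for p in Path(root).rglob("*") if p.is_file()
            }

        def files_to_download(local: dict[str, str], remote: dict[str, str]) -> list[str]:
            """Only files that are new, or whose hashes differ, need to be fetched."""
            return [path for path, digest in remote.items() if local.get(path) != digest]
        ```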

    • Part of the problem is that current developers prefer to attach a 10MB library to do an operation that they could do themselves with a few lines of code (if they knew what they were doing). And the new generation of developers are lazy (and extremely arrogant, which prevents them from learning from the mistakes and successes of their predecessors). Multiply this by every operation the application has to do and you quickly exceed 100MB to do a "Hello World".
      • Hmmm, I may be swearing in the church here, but every time someone in my team decided to write something from scratch when there was a qualified library available, I had to refrain from doing a Will Smith. "Yeah, but it is just a few lines of code!" "I am a good programmer, I don't make mistakes." "I'll test it thoroughly."
        These lines tend to cause problems later on. Not because the programmer is incompetent. Just because there are lines in the available library implementation written in blood.
        I
        • To be more specific, it depends. I have often seen someone attach a complete library that does a lot of things (and is proportionally large, with dependencies of its own) just because he needed only one or two functions from that library, neither of them complex, things he could have done himself.

          It is as the parent post described: they even put the kitchen sink in the project without actually needing it, and they don't care about the cost that this entails in storage/memory usage and additio
          • Had a call over a weekend once from the boss who said the bootloader was broken because it was too big. There was a hard 1MB limit (this was a relatively huge bootloader, it had some graphics to display on screen and such). Turns out he just wanted to compare two strings. These were C strings. What he did was convert both to C++ std::string, then did "==". That brought in over 100K of bloated C++ STL string libraries, character traits, etc. I asked why he didn't just use strcmp(), which is standard in

          • by vux984 ( 928602 )

            Sure, but many things that are 'simple' are NOT simple. People create horrifically bad mathematical functions all the time because the common mathematical definitions aren't numerically stable (are subject to overflow / underflow / loss of precision) when simply programmed directly on a computer using the common form.

            The Pythagorean hypotenuse length h = sqrt(a^2 + b^2) is a classic example.

            People also fuck up rounding functions all the time. And I shouldn't even need to mention how hard getting date/time o
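
            The hypotenuse case is easy to demonstrate. A quick Python sketch of the naive formula against a scaled version (roughly what library hypot routines do); the specific numbers are just chosen to force the overflow.

            ```python
            import math

            def hypot_naive(a: float, b: float) -> float:
                return math.sqrt(a * a + b * b)            # a*a overflows to inf for large a

            def hypot_stable(a: float, b: float) -> float:
                a, b = abs(a), abs(b)
                if a < b:
                    a, b = b, a
                if a == 0.0:
                    return 0.0
                return a * math.sqrt(1.0 + (b / a) ** 2)   # the ratio is <= 1, so no overflow

            print(hypot_naive(3e200, 4e200))    # inf -- the intermediate squares overflowed
            print(hypot_stable(3e200, 4e200))   # 5e+200, the correct answer
            print(math.hypot(3e200, 4e200))     # 5e+200, what the library gives you
            ```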

        • Hey, we've got four different SSL libraries in our product... Ugh, so many startups I've seen written by people with no formal computer science or software engineering training and it clearly shows.

          In embedded systems where I work, there aren't often libraries. Or you get the source code to the library and now you have to be the one to maintain and fix it. And you don't have a BIOS or kernel. And even if there is a good library, you don't have space for it because you've got limited memory, or it doesn't me

      • The 1980s computers basically only knew English and the Latin script.
        If you wanted special almost-Latin characters in text (like the German ü), you had to use country-specific code pages.
        Later, Windows introduced fonts with those symbols.
        Now, we have fonts with all the special letters (all hail Unicode), and we can print them left-to-right (Greek and Latin style), right-to-left (Arabic style), top to bottom (Chinese), ...
        And the font sizes and the program sizes grew by some factors of magnitu

    • by slazzy ( 864185 )
      I think developers own shares in hard drive manufacturers...
  • The "data optimization and streamlining" in that update would also ensure that "future patch sizes for Modern Warfare and Warzone [would] be smaller" than the 57GB update being offered at that point.

    Streaming updates during gameplay doesn't make patch sizes smaller. They just spread the download out over a longer time, and consume bandwidth and increase latency while playing. "Data optimization" doesn't mean much on its own, but sure, they might be optimizing for space. That game is goddamned huge, what a

    • I don't understand how a "patch" can be 57GB. Are they remaking all the textures and media files for the purpose of the patch or something? If a bug needs to be addressed, one would think only binaries need to be patched...

      • I would assume :) that the bulk of the content is assets that are stored in large files, and that changes are being made to these large files. And rather than deliver tools that can manipulate the files' contents, and delivering only the changes, they're just re-delivering the whole file. Ever since high speed internet access became generally available this trend has been increasing.

        Another plausible explanation might be that they're changing formats for some assets, and for some reason which might or might

        • One of these game studios should just hire an online backup company to help design their patch system. De-duplication is a solved problem. Even with data that is encrypted in transit. Or is it Sony that is forcing these methods and the game code is not involved in the patching process?
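
          A toy sketch of what block-level de-duplication looks like, assuming fixed-size blocks for simplicity (real tools use content-defined chunking so one insertion doesn't shift every boundary): hash the blocks of the old and new builds and only ship the blocks the client doesn't already have.

          ```python
          import hashlib

          BLOCK = 1 << 20  # 1 MiB blocks, an arbitrary size for the sketch

          def block_hashes(data: bytes) -> list[str]:
              return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
                      for i in range(0, len(data), BLOCK)]

          def blocks_to_send(old_build: bytes, new_build: bytes) -> list[int]:
              """Indices of blocks in the new build that the client cannot reuse."""
              have = set(block_hashes(old_build))
              return [i for i, h in enumerate(block_hashes(new_build)) if h not in have]
          ```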

      • by Kokuyo ( 549451 )

        I think, and this is merely guesswork, that they sort the files into compressed archives, and for some reason need to replace the whole archive.

      • by EvilSS ( 557649 )
        New map means a bunch of new textures. And with gamers demanding 4K* or even higher, that means big-ass textures. Add in new audio (some games ship dialog in every language for some reason, instead of letting you pick, or using the system language to know what to download, which can add up when you are shipping 10 or 20 or more localizations) and it adds up fast.

        *Just because you aren't asking for 4K doesn't mean others aren't. Current-gen consoles are sold on being able to do 4K.
        • by djinn6 ( 1868030 )

          How about delivering HD textures to people who actually play at 4K? If I'm playing on a 1080p screen, I will not need textures meant for 4K screens and they should not be downloaded by default. And even if I wanted HD textures, I could play with regular textures while those are downloaded in the background. There is no reason to block me from playing the game for hours.
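
          The mechanism for this is basically a manifest with optional packs gated on display capability. A sketch with an invented manifest layout; nothing here reflects how any particular storefront actually does it.

          ```python
          PACKS = [
              {"name": "textures_base", "min_height": 0,    "size_gb": 30},
              {"name": "textures_4k",   "min_height": 2160, "size_gb": 60},  # optional tier
          ]

          def packs_for_display(display_height: int) -> list[str]:
              """Skip any pack whose minimum resolution exceeds the player's display."""
              return [p["name"] for p in PACKS if display_height >= p["min_height"]]

          print(packs_for_display(1080))  # ['textures_base'] -- the 4K pack never downloads
          print(packs_for_display(2160))  # ['textures_base', 'textures_4k']
          ```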

          • by EvilSS ( 557649 )
            Some games used to do this. I recall downloading high res texture packs in the past. I feel like consoles going 4K might be part of the reason it has gone away for the most part.
      • by Megane ( 129182 )
        It might also matter how this "patch" is distributed. I play a game that has 1-2GB of updates every month, but it sometimes gets smaller updates in between. I don't know if it's because of being based on Unity, but most of the data is in dozens of archive files, each with hundreds of chunks of data that might not be in the same order after the patch. For their "native" updater, the patch files contain just the assets that need to be changed, but it is also available on Steam. Steam's updater only wants to diff
    • I maintain numerous code bases that are bloated because of things like crappy dependency management, improperly normalized data, copypasta, and just plain sloppiness. I try to fix that when/where I can, but the moment other developers work on it, they insist on a 30MB library to write 5 lines of JSON, or on porting an already-slow data layer to Entity Framework, instead of fixing the actual problem, because clearly it wasn't slow enough already.

      I don't know whether that is the case here, but 30 years in the

      • Any time I've had to output basic, simple XML or JSON I can't think of ever needing or wanting a library to do it. Maybe parsing unpredictable external input, but never output.

        But this is a world where Microsoft Teams is a JavaScript mess.

        • But this is a world where Microsoft Teams is a JavaScript mess.

          Teams in general is a mess. No ability to choose where you want to save a file, not being able to easily save a picture from within a chat, clicking the back button which doesn't take you back one "screen" but instead takes you all the way back to the beginning, the list goes on. Here's a fun thing. If you missed a call you hover over the entry and click Call. You then get a window asking if you want to call the number you just selected.
      • by EvilSS ( 557649 )
        I feel like optimization is a lost art for most developers. No one wants to spend the time (aka money) on optimization when they can just tell the end user to throw more hardware at it. Hardware being cheap and plentiful (well, usually) has spoiled a generation of programmers.
        • Agreed. But perhaps that will gradually change, since for the foreseeable future, hardware is not guaranteed to be cheap, nor even to be available. It will thus become increasingly important to manage resources wisely, not limited to hardware, but certainly including hardware.
    • It's largely an extension of the child-casino marketing model. There aren't 10 textures for each in-game weapon, armor, or player character; there are 100. These publishers will harm the experience of all players to sell the complete collection of art assets to a few "gotta have them all" players, while everyone who doesn't care has to install everything regardless of how much storage and/or throughput they'd want to dedicate to a game.

      Unfortunately, the previous excess monetization method didn't work much b

    • by EvilSS ( 557649 )

      The "data optimization and streamlining " in that update would also ensure that "future patch sizes for Modern Warfare and Warzone [would] be smaller" than the 57GB update being offered at that point.

      "Streaming updates during gameplay doesn't make patch sizes smaller. They just spread the download out over a longer time, and consume bandwidth and increase latency while playing.

      Did somebody confuse "streamlining" with "streaming" in that quote?

      • Yes, someone did, because it didn't make any sense for them to have used both of those words, and someone was busy boggling at the very idea that a single game should be this big.

  • We thought Doom on 4 floppies was outrageous. Not sure what even is in Warzone that makes it that large: a green and a brown texture, some tree/house/weapon models, and a level map. If it doesn't fit in the GPU memory, it's probably not optimized well enough, just a ton of lazy programmers.

    • I remember playing "Lemmings" on a Z80 clone. We used to load the main program from tape, and every level after that had to be loaded from tape. It might have been an entire audio cassette for a single game.
      Playing and rewinding the cassette every time a level didn't load was not fun.

  • Even Elden Ring, which would undoubtedly be considered a "AAA" title, has a total install size of 45GB on PS4 and up to 60GB on PC, and it's not exactly a slouch in terms of world size or graphic fidelity.

  • At my download rate (about 900 KBps), 250 GB would take about 77 hours. I can easily be patient about such a thing completing, but I would be killed and my name erased from the history books if I significantly degraded the house internet connection for that long. Clearly this is a plot to make people set up QoS on their routers. "Oh, your router doesn't do QoS well? Then buy a new router!"
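
    The arithmetic checks out, for what it's worth:

    ```python
    size_bytes = 250e9            # 250 GB, decimal gigabytes
    rate_bytes_per_s = 900e3      # ~900 KB/s
    print(size_bytes / rate_bytes_per_s / 3600)   # ~77.2 hours
    ```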

    • At <1 Mbps, is the modern Internet even usable for you?
      • At <1 Mbps, is the modern Internet even usable for you?

        900KBps is about 9Mbps give or take, 'B' being Bytes and 'b' being bits. I am rolling at 14Mbps, and with patience, ublock and PiHole, the modern net is usable.

        • At <1 Mbps, is the modern Internet even usable for you?

          900KBps is about 9Mbps give or take, 'B' being Bytes and 'b' being bits. I am rolling at 14Mbps, and with patience, ublock and PiHole, the modern net is usable.

          Ah, that silly distinction. It's useful in some contexts, but when talking about network connection data rates, you should measure in bits, not bytes.

    • At my download rate (about 900 KBps), 250 GB would take about 77 hours. I can easily be patient about such a thing completing, but I would be killed and my name erased from the history books if I significantly degraded the house internet connection for that long. Clearly this is a plot to make people set up QoS on their routers. "Oh, your router doesn't do QoS well? Then buy a new router!"

      Most storefronts have download rate limit capabilities built in.

      Also, most bittorrent clients too ;-)

  • The game is actually smaller than it once was, back when it was more popular. It's losing popularity because it's not that fun any more and is littered with bugs. Just like Peloton, it achieved unexpected popularity thanks to the pandemic. It's now a victim of its own success. Cheaters arrived en masse with no solution given. The way 3 studios were forced to maintain it on a silly release cycle made bugs that had appeared and been fixed reappear, with apparently no real communication between studios to help in fixing th

  • Perhaps their coders should go ask on Stack Overflow on how to properly patch their game, instead of sending out 57GB updates for a 5 line JSON commit.

  • That makes a lot of downloading much quicker. The vendor might help their servers by using NORM or FLUTE to push updates to local distribution sites, which could then handle the traffic.

    With download speeds in the gigabit range for a rapidly increasing number of people, it may be possible to reduce the amount of data locally stored.

    • That makes a lot of downloading much quicker. The vendor might help their servers by using NORM or FLUTE to push updates to local distribution sites, which could then handle the traffic.

      I don't think that's really a solution in this case. The problem is that, more often than not, game patches are this size because the patches aren't being optimized. As others in the thread have pointed out, 30MB libraries being sent over to support 5 lines of JSON is not a problem that's solved with faster internet.

      This is, in part, because GttH isn't available everywhere, and it isn't affordable everywhere it is available. I'm sure there are plenty of individuals who, for example, received a hand-me-down PS4 or receive

      • by jd ( 1658 )

        Optimisation is a problem, yes. Large Library Syndrome is, in part, a function of moving away from the UNIX philosophy and it'll be tough to change that in coders (I've a hard time calling some of them programmers).

        The BBC game Elite was as compact as games got, with a multitude of data files that the disk version could switch between. (It might even have had two distinct game engines, I'd have to examine the source.)

        But it is this game, possibly THE masterpiece in the 8 bit era, that made me wonder how muc

  • Activision's developers are bottom tier garbage. Call of Duty games have been terrible for over ten years.

  • I've been having trouble downloading FL Studio due to its size. I tried special download utilities because I suspected the default browser download tools were the problem, but still had issues.

    It does help to do it during low-traffic hours.

  • It's the damned bitmaps of everything.

    Look into procedural generation, and you can shrink that by a factor of literally 1000 to 10,000.
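
    As a toy illustration of the size difference: a texture described by a seed and a couple of parameters is a few bytes on the wire, while the equivalent stored bitmap is the full pixel payload. The noise function below is deliberately crude and not anything a real engine would ship.

    ```python
    import random

    def procedural_texture(seed: int, size: int = 256) -> list[list[int]]:
        """Regenerate a size x size grayscale texture (0-255) from a seed."""
        rng = random.Random(seed)
        base = rng.randrange(256)
        return [
            [(base + rng.randrange(-16, 17)) % 256 for _ in range(size)]
            for _ in range(size)
        ]

    tex = procedural_texture(0xC0FFEE)   # the "download" is the seed + parameters;
                                         # the 65,536 pixels are rebuilt locally
    ```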

    • by amchugh ( 116330 )

      Alternatively, Sony, Microsoft, Valve, etc... get together and develop a royalty free standardized content library to use across all games. Developers would still have proprietary content, but could at least use enough from a common library to cut down the total file sizes to something reasonable.

  • ... switch to Pong [slashdot.org]

  • The problem I see with this is in data utilization and caps. The player base for the free-to-play game is constrained by the total data cap they're tiered at. That means it's free to enjoy, but every month they use up their total available bandwidth because they can't afford "unlimited".
  • Are they saying that Call of Duty: Warzone Live is almost 300 gigabytes for a SINGLE MAP GAME?
  • Friend of mine wanted me to buy this game to play with him. The first install was a 140GB download; I think it took me almost a week. A fortnight later he calls me to play again and it wants to do a 90GB update. I never touched it again. It's just ridiculous.
  • Ladies and gentlemen I give you C++! A language that should only be used by experts, and then only rarely. It should never be used for video games.
