
Ask Slashdot: Is Computing As Cool and Fun As It Once Was?

dryriver writes: I got together with old computer nerd friends the other day. All of us have been at it since the 8-bit days of the 1980s -- Amstrad, Atari, Commodore 64-type home computers. Everybody at the meeting agreed on one thing -- computing is just not as cool and as much fun as it once was. One person lamented that computer games nowadays are tied to internet DRM like Steam, that some crucial DCC software (e.g. Photoshop) is now available only as a rental, and that many "basic freedoms" of the old-school computer nerd are increasingly disappearing. Another said that Windows 10's spyware aspects made him give up on his beloved PC platform; from now on he will use only Linux and Android devices, and game on consoles instead of a PC. A third complained about zero privacy online, internet advertising, viruses, ransomware, hacking and crapware. I lamented that the hardware industry still hasn't given us anything resembling photorealistic realtime 3D graphics, and that the current VR trend arrived a full decade later than it should have. A point of general agreement was that big tech companies in particular no longer treat computer users with enough respect. What do Slashdotters think? Is computing still as cool and fun as it once was, or has something "become irreversibly lost" as computing evolved into a multi-billion-dollar global business?
  • No. (Score:2, Interesting)

    by Anonymous Coward

    There's no variety anymore -- no range of systems and operating systems, no competing API sets each trying to prove it is better at one task than another. Now it's basically 2.5 platforms and that's it.

    Sadly, I don't think the best or even most interesting platforms won.

    Definitely no.. much more boring now than 30 years ago.

    • Re:No. (Score:5, Interesting)

      by ShanghaiBill ( 739463 ) on Wednesday December 28, 2016 @06:32PM (#53568201)

      Definitely no.. much more boring now than 30 years ago.

      That is called "growing old". Everything is more fun when you are young.

      Today's younglings likely enjoy using WebGL to make 4K 3D webpages more than I enjoyed writing UIs with curses [wikipedia.org] on a VT100 30 years ago.

      • Re:No. (Score:5, Insightful)

        by 110010001000 ( 697113 ) on Wednesday December 28, 2016 @07:04PM (#53568341) Homepage Journal
        It is doubtful young people are writing that kind of stuff. It is much harder to write software than it ever was. It used to be fun writing "Hello World" but that won't keep the interest of young people today because they are used to much more sophisticated software.
        • Re: No. (Score:5, Funny)

          by Miamicanes ( 730264 ) on Wednesday December 28, 2016 @07:40PM (#53568513)

          ~35 years ago, I got a Vic-20 for Christmas. It took me an hour to write my first program, including the custom character design. Today, if you got a new Dell laptop for Christmas, you'd be lucky if Windows Update finally allowed you to even *do* anything within the first hour or two. Fuck, even a brand new Xbox One or Wii-U (and probably a PS4) will make you wait at least 30-60 minutes for mandatory updates before it'll allow you to play your first game.
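          For a sense of scale, a first-hour VIC-20 program of the kind described might have looked something like the sketch below. This is reconstructed from memory of period tutorials, not a tested listing; the register address 36869 and the POKE 52/56 memory-top trick are recalled details and may be slightly off on a real machine.

          ```basic
          10 REM VIC-20: A CUSTOM CHARACTER IN A FEW LINES
          20 POKE 52,28 : POKE 56,28 : REM LOWER TOP OF BASIC BELOW 7168
          30 POKE 36869,255          : REM POINT CHAR GENERATOR AT RAM 7168
          40 FOR I=0 TO 7            : REM 8 BYTES = ONE 8X8 CHARACTER (THE @ SIGN)
          50 READ B : POKE 7168+I,B
          60 NEXT I
          70 PRINT "@"
          80 DATA 24,60,126,255,255,126,60,24
          ```

          The point stands either way: one evening, one manual, and a kid could put something of their own on the screen.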

          • Re: No. (Score:5, Interesting)

            by ememisya ( 1548255 ) on Wednesday December 28, 2016 @09:05PM (#53569027) Homepage
            Absolutely worse. I refer you to the moment when Blizzard made Internet connectivity mandatory for Diablo 3; it just went downhill from there. We have no privacy with our phones, computers, or TVs -- hell, not even our cars anymore, which are sold prebuilt with microphones and GPS, ready to collect data for "security" and partners. The world today is designed for a single purpose: to rule over the minds of the economically less privileged. I kind of liked it more when my cable service thought everyone nationwide loved Push Pops, and my Internet didn't offer me my favorite pizza. Ruling over people physically is fine, that's called a governing system, but for fuck's sake, did we have to lead people's minds too, based on fake relevance, further reducing the ability to empathize in the mass-media age? I'm not sure; I'll have to ask uncle Google and read about it in my personal news feed from aunt Facebook. A big happy family.
        • Re: No. (Score:4, Insightful)

          by ninthbit ( 623926 ) on Wednesday December 28, 2016 @07:59PM (#53568655)

          What this guy said.... The barrier to entry for nerdy children is much higher now than it was for me. The sole saving grace for them is the open source community and the vast availability of examples and information. But even then, you still need multiple skillsets: graphic design, code, data, story/purpose. Microcontrollers and SBCs like the Arduino and Pi, making IoT devices, are the best way to amaze now.

          • Re: No. (Score:5, Informative)

            by ShanghaiBill ( 739463 ) on Wednesday December 28, 2016 @08:11PM (#53568711)

            The barrier to entry for nerdy children is much higher now than it was for me.

            This is what my 8 year old daughter did:
            1. Go to https://scratch.mit.edu/ [mit.edu]
            2. Start coding

            Total time to surmount barriers: 10 seconds.

            • Re: No. (Score:5, Insightful)

              by postbigbang ( 761081 ) on Wednesday December 28, 2016 @09:40PM (#53569227)

              Yeah. This.

              There is so much fun around. Yeah, there's also the mundane and the boring stuff, too. People have bills to pay, and sometimes being a meaningless dweeb is how the lights stay on.

              But there's never been more real fun. If you don't like code projects for Big Corp, you can get into the mad crazy fun of Arduino, Pi, FPGAs, robotics, SoCs, SDRs, and a myriad of other interesting projects.

              I've been around since doing 6502s and Z80s in assembler. It's necessary to peel off the layers of cruft and mold that get into one's system when you sit still too long. Writing secure, optimized code has become increasingly crazy, but if you know your platforms, go for it.

              I watch kids doing fascinating things -- truly sophisticated toys that twenty years ago were impossible at any price. My only fear: letting people get controlled by the advertisers and the government, each of which is power-hungry and relentless. Otherwise, if you're bored, break out of your box.

        • Re:No. (Score:5, Insightful)

          by hey! ( 33014 ) on Thursday December 29, 2016 @09:26AM (#53571215) Homepage Journal

          Back in the day we were writing more challenging programs than "Hello, World!". I personally wrote custom parsers and real-time control software (on a PDP-11 running RSTS/E), and solved numerous problems requiring serious algorithm design. A lot of what I did would be easy to do today because of a combination of computing power, rich libraries, and scripting languages, but doing it all yourself in C with nothing but the (then much smaller) standard library made it pretty interesting.

          The big difference is how much closer you felt to the bare iron back in the day. Today we work largely in the context of other people's frameworks and libraries. If I had to draw an analogy, it'd be the difference between voting in a town meeting in a small frontier town and voting as a citizen in a republic with a hundred million citizens. In which case do you have the most power? It's not a straightforward question. In a small town you can shape policy in a way you can't in a large republic, but you're limited by the limitations of that town itself. You can vote to put a man on the Moon, but it's not going to happen.

          The important thing to realize is that as you get older, you just don't have as much fun, pretty much across the board. You have to cultivate playfulness because it doesn't come as naturally as it once did. When I hear people middle aged or older (like myself) pining for a lost past, it's often clear to me that what they're mourning is the loss of their youth.

      • Re:No. (Score:4, Interesting)

        by Anonymous Coward on Wednesday December 28, 2016 @07:16PM (#53568373)

        It's less-fun today, even for youngsters.

        Just look at the technologies you listed.

        WebGL is just a shitty subset of OpenGL, which actually isn't all that different from SGI's old IRIS GL library from the 1980s.

        And they're using WebGL from JavaScript, which was a shitty language back in 1995 when it was first released, and has only very recently seen positive improvements. In many ways JavaScript is still a step back from C89, and even from K&R C!

        A lot of what people are doing with WebGL today was done a couple of decades ago when VRML was the big thing, except VRML offered more practical higher-level abstractions.

        4K is just a bigger version of 640x480.

        They don't even have the benefit of doing this within the context of a real operating system, like IRIX or SunOS. They're constrained to the quite limited and often idiotic web browser ecosystem!

        As odd as it may sound, the youth of today are doing the same stuff the rest of us had done 25 or 30 years ago. But for whatever reason they're using worse libraries/frameworks/APIs (WebGL), from worse programming languages (JavaScript), in a worse environment (web browser)! Just going back to what we were using in 1990 would be a big leap forward.

        • by jbolden ( 176878 )

          I worked on both IRIX and SunOS. Those machines often ran $20k. There were pretty much no kids involved; they were mostly reserved for graduate students, and even college kids used worse systems. They were also used professionally.

          As for JavaScript being a step back from C: they aren't remotely related to one another. JavaScript is an interpreted applications language that operates safely cross-platform on large computer networks. In the time of IRIX or SunOS, interpreted languages were things like BASIC.

        • Re:No. (Score:4, Funny)

          by Jon Peterson ( 1443 ) <jonNO@SPAMsnowdrift.org> on Thursday December 29, 2016 @05:23AM (#53570517) Homepage

          640x480?! That's just a bigger version of 320x280, and I started out with a lot less than 320x280, I can tell you. Bloody kids, next thing they'll be wanting more than 4 bits of colour information in each pixel.

          As for VRML, I often use it as an example of why 'open standards' are far from a panacea. It's a truly dreadful standard, created in academia before there were either competing implementations or even much of a problem to solve, and it actively held back VR and web 3D stuff generally for years. Also a useful example of "worse is better".

      • by shanen ( 462549 ) on Wednesday December 28, 2016 @07:29PM (#53568449) Homepage Journal

        Replying here partly in agreement but mostly in wonder about the OP's AC status. If your ideas or opinions are so bad that you don't want to associate your name (or even a handle) with them, then why bother to post at all? I'd make an exception for cases where you are saying something with possible repercussions, but I'm not seeing it in the OP of this thread. (In a sense, it's moot, since my settings render the ACs nearly invisible. It was the quote in the visible reply that exposed this AC.) Incidentally, it doesn't matter in terms of protecting privacy. Slashdot knows who you are, and surely you can't trust the sanctity of your personal information as stored on Slashdot.

        Now what's the agreeing part? In the days of yore, computers were within the scope of understanding of a single person. The systems were still small enough that it was at least theoretically possible to understand all of how they worked. I thought that was really fun and cool; even if I never got there, I enjoyed the chase. I caught just the tail end of that period.

        Not sure when the transition happened, but at this time there is clearly no hope of understanding everything about any "normal" machine. Both the hardware and software have passed the human scope of understanding or control. No one has time to look at billions and billions of transistors or millions and millions of lines of code. We have to abstract, and picking your level of abstraction is not the same as understanding the entire thing.

        There's also a level of threat and paranoia that cuts into the fun. Maybe part of that is a result of getting old, but I think it is mostly just a matter of experience and understanding my own limitations. I really don't want to be pwned, but all it would take is one juicy vulnerability, and I'm sure the serious black-hat hackers can find one if'n they want to. If a serious hacker is coming for me, I might as well save both of us the trouble and just turn over my passwords now, eh? The best defense is having nothing worth hacking for?

      • There is nothing called "computing" anymore, at least not if a human is involved.

        I'm all for bringing back modems, BBSes, Terminate and high memory. Man, I miss the days when Windows shipped without TCP/IP!
      • Today's younglings likely enjoy using WebGL to make 4K 3D webpages

        I don't know. I'd like to hope that this would be the case, but I watch my 13-year-old son quickly lose interest in complex computing platforms because it just takes so long to get to the point where you produce anything that looks like what you're used to. When I was his age, I could realistically put my C64 into graphics mode and code up something that sort of approximated what professional games looked like at the time. Nowadays, the best he can realistically hope for is approximating what games looked like a generation ago.

    • FreeBSD, NetBSD. FPGA, ARM, x86. Raspberry Pi, tablets, laptops (with USB 3.0 dedicated-graphics enclosures). VMs, cloud, WiFi and Gigabit Ethernet. Want me to start listing programming languages? How about toolkits for each language?

      There's an F-Bomb of variety out there. You don't even have to look that hard.
  • by Anonymous Coward on Wednesday December 28, 2016 @05:42PM (#53567879)

    Because of Windows spying?

    LMAO.

    • Because of Windows spying?

      LMAO.

      He was just sad that he wasn't bad-ass enough to have the NSA spying on him, just Microsoft. Wait. Now I'm sad too.

  • Betteridge's law (Score:5, Informative)

    by ClickOnThis ( 137803 ) on Wednesday December 28, 2016 @05:46PM (#53567903) Journal

    That is all.

  • by Anonymous Coward on Wednesday December 28, 2016 @05:47PM (#53567911)

    Back in the day, someone dedicated could learn everything there was to know about a system, from the CPU, registers, RAM, I/O, video, etc. It was relatively simple.

    The only way to get that same "cool and fun" feeling is to dive into the 8-bit microcontrollers such as the ATmega328P. Even the latest Arduinos have become too complex with their ARM SoCs.

    Look on hackaday.com; there are often fun projects based on those basic, entry-level, sub-100MHz 8-bit uCs.

    • > Back in the day, someone dedicated could learn everything there was to know about a system, from the CPU, registers, RAM, I/O, video, etc. It was relatively simple.

      And fully documented, in a real manual.

    • by shanen ( 462549 )

      Basically I think you said what I wanted to say better than my earlier comment, but I am going to disagree with you on the grounds that the understandable systems have become toys. I used the qualifier "normal" in my comment to refer to non-toy machines, but I could also say that your approach is to limit the level of abstraction.

      You already got the "insightful" mod you deserve, so in this case my lack of mod points doesn't even bother me... (Now to see if there are any actually funny comments...)

    • by yayoubetcha ( 893774 ) on Wednesday December 28, 2016 @07:56PM (#53568629)

      In 1975 I opened the "memory drawer" on the Wang 3300 and I saw this array of "donuts" with wires running through them.

      I stopped trusting computers when I could no longer see the bits (both as magnetic memory, and paper tape).

  • by iamacat ( 583406 ) on Wednesday December 28, 2016 @05:48PM (#53567917)

    Cool is about pushing the boundary and enjoying experiences which are decades away from mass production. A desktop is not going to be super cool in 2016. Arduino controllers to operate hand-wired power windows in your home might be.

    You can get very open and hackable Linux / Chromebook+crouton desktops and laptops, but you may be hard pressed to get them to do anything which is not already widely available.

  • Hell Yes! (Score:5, Insightful)

    by Prof G ( 578341 ) on Wednesday December 28, 2016 @05:48PM (#53567921)
    Switch to Linux and the cool factor becomes very much alive.
  • by banbeans ( 122547 ) on Wednesday December 28, 2016 @05:49PM (#53567925)

    It never was that great.

    • You must have been one of the Apple II kids.
    • What do you mean? Nothing is ever going to be cooler than Fortran on a teletype, or having to argue with the mainframe gods.
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Wednesday December 28, 2016 @05:51PM (#53567939) Homepage Journal

    If you're a gamer, you are going to be forever at the mercy of the game companies, who are going to exploit their customers to some extent to maximize profit.

    If you are a hacker, you have your own hacker-produced computing platforms and tools and a wide-open vista of hardware and physical objects that can now be designed and manufactured by the individual.

    If you depend on some company to make everything you use, you've set yourself up to be their "client". Don't do that.

    • by AmiMoJo ( 196126 ) <mojo&world3,net> on Wednesday December 28, 2016 @06:09PM (#53568059) Homepage Journal

      While all true, people seem to forget how hard it was to get software before the internet, especially if you were a kid with no money. These days you can download vast amounts of high-quality software, and its source code to tinker with. In some respects we are a lot better off now than when you had to rely on friends, clubs and magazine cover disks/tapes.

      On the other hand, we are definitely a lot further removed from the inner workings of computers now. There is a massive amount of abstraction, which is kind of good for a lot of purposes but also very much encourages people not to look too far beyond really high-level library functions. The lack of hacking-friendly ports on the hardware side is a big issue too.

      But then again you can get a pretty good oscilloscope for peanuts now, so in some ways hardware hacking is a lot easier than it used to be to get into. We don't have those great kits you could buy from magazines any more though, and while people like Adafruit do offer some interesting stuff it's more Arduino level plugging modules together than figuring out why your transistor biasing isn't working.

      Personally I like the older stuff. Emulators are great for it actually - back in the day I used to reboot my computer about 900 times a day as I was trying to debug assembler (didn't have a single step debugger and of course no memory protection) and figure out what the hardware was doing, and emulators make it much easier.

      • people seem to forget how hard it was to get software before the internet, especially if you were a kid with no money

        I agree, and this goes double for hardware. I grew up with minimal access to computers at home or school. When I stayed with relatives over holidays, I'd spend every possible minute on their computer, but then most of the year I had no access to any computer, let alone any manuals or software. I contented myself with books from the local library, but in 1984 (when I was 13) there wasn't much available. I learned 8080 architecture and machine language, and ANSI C, by reading about them in books, but I didn't have a machine to try any of it on.

    • If you're a gamer, you are going to be forever at the mercy of the game companies, who are going to exploit their customers to some extent to maximize profit.

      What about all the Open Source and even Free Software video games out there? Sure, they are grossly outnumbered by their commercial counterparts, but some of them are actually very high-quality. There are enough of them that one could reasonably spend all their time never playing anything else.

    • and I've got a 2TB hard drive and writable Blu-ray. So I'm at nobody's mercy. Now, if you're into multi-player games exclusively you might have a point. Especially the ones that connect to servers. Then again, you can still play Phantasy Star Universe on private servers if you're so inclined.
  • yes it's fun - Arduinos, RPis, FPGAs with fricken ARMs embedded...
      things we only dreamed of 30 years ago...
        don't like an 'app'? Build it...

      don't like a platform? Move.

    -- get off my lawn sonny --

  • by raymorris ( 2726007 ) on Wednesday December 28, 2016 @05:54PM (#53567979) Journal

    > many "basic freedoms" of the old-school computer nerd are increasingly disappearing

    There is an organization devoted to computer freedom called the Free Software Foundation, closely allied with GNU. GNU makes most of the operating system we call Linux.

    > Software is available to rent only now (e.g. Photoshop)

    There are several alternatives to Photoshop which use free licenses, meaning licenses that respect freedom. None of them does everything Photoshop does in the exact same way Photoshop does it, but for any *particular* Photoshop user, there's probably a free software package that fits their particular needs well.

    > Windows 10's spyware aspects made him give up on his beloved PC platform and that he will use Linux

    Linux is certainly one way to avoid Windows built-in spyware.

    > viruses, ransomware, hacking, crapware

    That's 99% Windows too. Linux desktop users see viruses and malware very, very rarely - maybe once every 15 years.

    Linux isn't perfect. It does however address most of the concerns mentioned.

    • Yes, Linux distributions suck less than Windows. However, there is increasingly less one can do in *nix. Besides, I use bash on Windows 10 pretty much daily.

      If I were a kernel dev or even just a website admin I might be able to get by. However, people just like to use commercial software. People laugh at me when they see me use GIMP. Could I use Photoshop? Sure. However, there's zero alternative to Acrobat. Yes, I could cobble together Evince, CUPS, and Inkscape, but they just don't do the trick well.
      • However, there's zero alternative to Acrobat.

        I take it, then, that you've never looked at Scribus [scribus.net], a cross-platform, FOSS page-layout program that's being used by professionals to create newsletters, periodicals and books. And, if you're having trouble with it, there's an active and helpful mailing list full of people ready to advise you. Check it out; you might just be surprised by how good it is.
    • None of those complaints have anything to do with either "cool" or "fun". Most of those complaints are something that the common user doesn't give a shit about and thus has no impact on "cool" or "fun".

      • Slashdot - News for nerds. I take it Mr. Garbz isn't a computer nerd. What type of nerd are you, anyway?

        Also it occurs to me that some of the hacking "cool" flavor that the OP mentions may now be found around the Raspberry Pi, Arduino, and other hobbyist platforms.

  • by turkeydance ( 1266624 ) on Wednesday December 28, 2016 @05:56PM (#53567983)
    computing, music, whatever was better back in the day.
    • by Kjella ( 173770 ) on Wednesday December 28, 2016 @08:14PM (#53568729) Homepage

      computing, music, whatever was better back in the day.

      No, it wasn't better. It was much, much worse. It was so rudimentary you could actually start at:

      10 PRINT "Hello World"
      20 GOTO 10

      There's no doubt that a chainsaw is far superior to a hand saw. But if I was interested in saw-making and how saws work, it'd be an awful lot easier to build a hand saw from scratch, all the way down to forging the blade, fitting the handle and giving it teeth. In fact there's often an inverse relationship between how hard something is to make and how hard it is to use; an automatic gearbox is more complex than a manual gearbox. As progress means we build more and more advanced and complex solutions, more of it moves out of reach for the hobbyist. I could almost make something similar to commercial games on the C64, because many of them were actually written by one man in a garage. Today you look at $100 million titles and realize that even if you did this professionally you'd be one little cog in a very big wheel.

      It's in the nature of advanced civilization: we're all doing a very small part. I depend on other people to produce the food I eat, the clothes I wear, the hot and cold running water, the electricity, the car and the roads etc., and all I really do is program computers and trade for everything else. That means I know a lot about that and very little about the rest. Or I could train for the post-apocalyptic society where I have to survive using whatever crude means I can pull off on my own, but life is short. I think I'll just take my chances, and if shit hits the fan, contribute to the rapid de-population back to an agrarian society.

  • Back in the day (Score:5, Insightful)

    by rijrunner ( 263757 ) on Wednesday December 28, 2016 @05:56PM (#53567985)

    I have this conversation periodically, except it is usually addressed to music, art, tv, sports, or any of a number of topics. It's like those guys who see a high school girl now and say "Man, they did not look like that back in my day".. yes, they did. It's just that when you saw them then, you didn't see a cute blond, you saw the B***h from social studies.

    There are many exciting things going on now. I am looking at how quickly and massively Raspberry Pis have been moving into areas where their creators never thought they would be used. I see Arduinos and the maker movement and think "Wow". Just a look at Adafruit or any of a hundred other sites shows that the amount of very affordable tech is staggering. We could stop all tech development now and it would be centuries before we explore all the possibilities of what is sitting on the desk in front of us.

    I met someone at a coffee shop awhile back, and there was a bunch of teenagers acting like teenagers. My friend is now in their mid-30s; I am in my 50s. I had first met them when they were a teenager at a coffee shop. My friend commented that teenagers were not like that back then, and I pointed out that I was my friend's current age when we first met, and yes, my friend was just as dumb and teenagery back then.

    Excitement is never external. You can look at any family pic taken at Disneyland and see the scowling goth kid who is totally not having fun. OK, you have given up Windows as the programming platform and gone to Linux and Android... So? You did not start programming on Windows. You started on other platforms and moved with the times.

    But, that is not what you are complaining about..

    What catches my attention is that *none* of your computing complaints are really computing complaints. They are consumer complaints. You should now be doing this comparison back to their early-'80s equivalents: televisions with 3 channels. Radio. Vinyl records. Newspapers. Magazines. Computing is more than fine right now. It completely rocks. Consumer products are far greater than what they were.

  • Fun (Score:5, Insightful)

    by bjb_admin ( 1204494 ) on Wednesday December 28, 2016 @05:58PM (#53568005)

    I have thought the same thing.

    Of course there are a few fundamental differences between then and now from my point of view:

    1. I was a young teen and had tons of time (and energy) on my hands to play with these things.
    2. Everything you learned you figured out on your own, or shared within a group of close friends, supplemented with a few manuals and magazines.
    3. The hardware was finite enough that you could basically learn everything, from low-level access to the hardware to all the software features (BASIC or machine language). You could literally learn what every location in I/O or memory did (53281, anyone?).
    4. In a few days, or at most weeks, with even modest skill you could put together something that would "wow" your friends and perhaps even non-computer family members.
    5. Atari / TI / Commodore computer overnight parties, where a bunch of us got together to compete to show off the best games etc. in an attempt to prove we had the best platform.
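
    For anyone who missed the 53281 reference: that's the VIC-II background-colour register on the C64 (53280 is the border). The instant "wow" being described was often just a couple of POKEs; a minimal from-memory sketch:

    ```basic
    10 REM C64: 53280 = BORDER, 53281 = BACKGROUND (VIC-II COLOUR REGISTERS)
    20 POKE 53280,2 : REM RED BORDER
    30 POKE 53281,0 : REM BLACK BACKGROUND
    40 PRINT "WOW"
    ```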

    Today we have a lot more learning resources out there, and the hardware is much more powerful but in my mind it just isn't as fun. There is certainly no way to whip up something that would "wow" anyone. It's more a tool now than a fun hobby.

    • I agree. Assembly code, trap vectors, hardware interrupts, blitters, custom chips, etc. Exploring and figuring out how to use computers and their internals was difficult, time-consuming, and ultimately very rewarding once you got it to work. Today's easy access to information on anything changed everything and IMHO killed some of the joys of exploration.


    • Exactly that.
      I was going to liken it to doing levels for games like Doom.

      Back in the day, with QERadiant and an illegitimate copy of Photoshop, you could whip up a sweet little PVP Quake level, even with some custom textures, in half a day. And it looked fine, was fun, everyone was happy.

      Now, with the AAA level standard we're all used to, anything you do in half a day is going to look like complete shit, for textures you practically need an art team, and you're going to spend weeks building the sorts of detail players now expect.

    • Spot on!

  • by phizi0n ( 1237812 )

    "and that the current VR trend arrived a full decade later than it should have"

    No, VR can die in hell with Betamax, MiniDisc, and 3D TV.

    • by guises ( 2423402 )
      Seriously. The current VR fad arrived a decade later than it should have? Bullshit. There was a VR fad twenty years ago and that wasn't even the first. The current fad is the third that I can think of, and no more compelling than the previous two.

      Head-mounted displays just aren't a good idea. They seem like a good idea, they seem like the first step towards a holodeck, which is the thing that everyone really wants, but they're awkward to use and any immersiveness that they may impart is fairly meaningless.
  • by MSG ( 12810 ) on Wednesday December 28, 2016 @05:59PM (#53568013)

    This weekend I spent some time improving my personal installation of SOGo groupware, so that my wife and I can better share email, calendars, and contacts on a system that we personally own.

    Certainly, big companies don't respect users, but it's still possible to provide all of the services that I need using only Free Software, so I do. Pretty much the only exception is navigation, for which I use Google Maps. Everything else we do with Free Software and the more I move my wife to our own services, the happier she is. Personally, I find that immensely gratifying. As long as that continues, I'll find computing as cool and fun as ever.

  • 'Fun'? Not so much. (Score:5, Interesting)

    by Rick Schumann ( 4662797 ) on Wednesday December 28, 2016 @06:00PM (#53568019) Journal
    When I started in computing, you needed a soldering iron, a particular skill-set, and if you were good, some programming skills, to supplement very minimalistic disk operating systems. It was also kind of fun to witness the reactions of people seeing and hearing a 14-inch hard drive powering up, or in some cases the look of recognition on their faces when they realize that the IMSAI 8080 they saw in the movie War Games was a real thing, not just some Hollywood prop. I even built a speech synthesizer, and got to see my friends' eyes go wide when I made it say "Shall we play a game?" I even designed and built some of my own IEEE696 cards to plug into the backplane that did things you couldn't get kits or pre-made boards for. Before the IMSAI, and the Morrow Designs stuff, I had an 8-bit CDP1802-based computer built on perfboard, complete with an integer BASIC interpreter. Fun, fun, fun. Also great experience for later in life; all the skills and experience I gained from all that has kept me employed all this time.

    These days? You might, if you wanted to take the time, effort, and expense to do it, design and build PCIe cards for special functions, but largely there's no point; almost anything you'd want the hardware to do, you can just go out and buy. 'Building a computer' now takes a screwdriver, not a soldering iron, and just about any teenage kid with half a brain can get the parts and cobble a box together. Sure, there's microcontroller stuff of all kinds out there, but there's little to do between those and full-blown desktop systems anymore. Likewise, writing software yourself is almost pointless; you can download just about anything you want, too. Even general electronics as a hobby isn't very accessible or fun anymore, because so much is surface-mount only and little is through-hole, so the really interesting devices mean you're more or less required to spin a PCB for whatever it is, which makes it so much more expensive and so much less accessible.

    I guess if you're into computer gaming (I lost interest years ago) or just using a computer as an appliance (which they more or less are anymore) then I guess it's 'fun' for you, but from the background I'm coming from, it really isn't so much anymore.
  • Too much shit. (Score:4, Informative)

    by MouseTheLuckyDog ( 2752443 ) on Wednesday December 28, 2016 @06:04PM (#53568035)

    Dealing with crap like systemd.

    Learning a new language, you don't just learn the language: you learn the build system, some complicated IDE plugins, some decent libraries (but most are hacked-together messes), etc.

    One example illustrates it all: Javascript.

  • And bothering to post this question is almost as bad.

    Is computing as cool? No, of course not, to the self-described nerds who helped build it to where it is now. It's a lot more accessible and exponentially more powerful.

    So instead of lamenting the past, appreciate what's been built and work to making the experience as pleasant as you found it back when you were younger.
  • The only platform where you can still get the hood open now is Linux. I personally prefer Arch Linux or OpenWRT, depending on the hardware and expected use for a project.

    But even with Linux you need to choose carefully, as vendors work to close even the many products built upon Linux. Just buying hardware with Linux doesn't mean it's open enough to be useful. For example: Android as generally sold; AOSP is the exception.

    If you want to intro someone to "old skool" look at the Raspberry Pi platform or OpenWRT. NO

  • by Jjeff1 ( 636051 ) on Wednesday December 28, 2016 @06:09PM (#53568061)
    I was just talking about this the other day with some co-workers. It used to be that you could manage your work network, even a decently large network, and know everything about it in your head. Reading a manual and being a smart guy (or gal) was enough to have a working environment.

    No more. People expect remote access and that everything should be working 24x7; the added complexity of building out those environments and the merging of multiple technologies mean that every change becomes a much more complex endeavor. Encryption requirements make everything more difficult to implement and troubleshoot. There are caveats with virtually everything, and I just don't have the time to be an expert on everything around me.

    Example from this year: my IP phone system, which integrates with Exchange using custom nonsense for playback in Outlook, using the LLDP-enabled voice VLAN on my switches, with servers running on my VMware hosts, each of which has multiple redundant connections, with handsets connected to a switch using 802.1x authentication. That's complex enough as it is, but then buried deep in the release notes was a bullet point that Exchange 2013 wasn't supported, 18 months after Exchange 2013 was released. That's a lot of stuff to be an expert in; a far cry from 10/100 hubs with a single management IP address and a standalone server that sent voicemail over encrypted SMTP.
  • by Sarusa ( 104047 ) on Wednesday December 28, 2016 @06:11PM (#53568079)

    Developing GUIs for databases on Windows 10 is not going to be fun and cool. But that existed back in the 80s; it was COBOL on mainframes.

    If you want it to be fun then you have to pick something fun, which usually involves one of the small boards like Arduino, Raspberry Pi, adding some motor control... This is what I do. These small systems are all quite digestible and have stuff built in we would have killed for, and you can make actual things which do things, be they useful or just playful.

    Or you could develop games for a classic system - there are still people doing homebrew games for all the old systems like Megadrive, Speccy, Apple ][, C/64, Lynx, etc etc. Or there's RPGMaker.

    There is so much awesome stuff going on right now from Arduino-like Maker stuff to drones to GPU power to deep learning to VR - I just got excited about a cheap tiny little camera component (neeeerd).

    So when you say 'computing isn't as fun and cool as it used to be' you mean YOU aren't as fun and cool as you used to be - and who is, besides Betty White? Not me. But that's what really happened, don't blame it on computing. You let your skills decay, didn't keep up to date, don't get excited by new stuff, and are too lazy to even keep up with what you knew how to do. The C64 is still thriving if you thought it was more interesting than watching sports or, oh hey, Westworld is on, I'll start tomorrow.

  • So, I read this on my Windows 10 Surface Pro 3 (running the latest fast ring build) while having my TRS-80 model one, my Apple IIe, and my Mac SE/30 on my side table.

    Yes, the mainstream of computing is different. I no longer have to worry about dip switches or whether I can address memory above 4 MB of RAM.

    However, my teenage son is now designing 3D printed objects on his homebrew PC (dual-boot Win10 and Mint) which he then sends to his MonoPrice 3D printer, which he built from a kit then - not liking the p
  • by FireballX301 ( 766274 ) on Wednesday December 28, 2016 @06:18PM (#53568123) Journal
    Computers have evolved into an indispensable part of day-to-day life, so it's very obvious that they would stop being 'cool'. The automobile was conceptually a very cool thing at the turn of the 20th century, but they're just cars now. I think the comment here highlights some of the jackassery inherent in the question:

    I lamented that the hardware industry still hasn't given us anything resembling photorealistic realtime 3D graphics, and that the current VR trend arrived a full decade later than it should have.

    This is the sort of complaining that has no place on a 'news for nerds' site - if you want it, build it. If you can't build it, don't bitch that others haven't done it as quickly as you wanted. I don't think OP submitter was the one working on the VR judder problem or the high density screen refresh problem or any of that. This sounds like a bunch of dipshit 'enthusiast' friends from the 80s who only ever dipped a toe in the industry and didn't actually end up building anything they wanted over the thirty years of their careers.

  • Remember when space was the coolest? [youtube.com]. For a significant portion of Slashdot's demographic, the answer is "no" because they're not young any more. Younger people are probably dabbling in Maker stuff and might be wondering why this question is being asked.

  • I started coding back on a 386 with Windows 3.1. I miss it: if I wanted to access a variable, I just did it; there wasn't a dozen hoops and a 300-line refactoring to navigate the permissions hierarchy to properly modify it. Or there were just pointers, not smart/shared/scoped/unique/weak/etc., along with very opinionated people who have mutually exclusive ideas of which ones should be used where.

    Building software without the technical bureaucracy was a lot of fun.
  • by RyoShin ( 610051 )

    Another said that Windows 10's spyware aspects made him give up on his beloved PC platform and that he will use Linux and Android devices only from now on

    Sounds like someone is in for a rude awakening about Android [stackexchange.com]. (I think Win10 is worse than stock Android re:data collection, but if your primary concern is privacy...)

    As to the question itself: It absolutely is, for varying definitions of "cool" and "fun". I'm a 90s kid (so many things I have to remember) so I didn't cut my teeth on a C64, but as a youngli

  • by hey! ( 33014 ) on Wednesday December 28, 2016 @06:32PM (#53568199) Homepage Journal

    Declining SAT scores were a big topic of discussion in the 80s and 90s, but what most people never really took into account was that in the 50s most jobs only required a high school diploma; by the 80s more people felt they needed to have a college degree. The decline in scores didn't reflect a decline in ability of graduating high schoolers, it reflected more of the lower-performing graduates taking the test.

    I've been in the computing field for a long time. When I went into it back in the early 80s most people had never seen a computer. There were a very small number of people who worked with computers, and I'd say about half of them were doing at least moderately interesting stuff. Today there are many many more people doing interesting stuff, it's just that the growth in interesting work has been swamped by a rising ocean of mindless, bureaucratic IT drone work.

  • I kinda/sorta know what the poster is getting at. I think the magic of computers back in the 80s/90s was the promise of what they would one day deliver. That was the true excitement. In the 2010s, much of what computers aspired to has been realized. It's easy to forget where you've been and how far we've come. Fresh young minds today likely feel the same way we did in the 80s about today's computers. I recently built a beautiful 486 computer from "new, old stock" parts. Was a lot of fun, brough
  • by golodh ( 893453 ) on Wednesday December 28, 2016 @06:55PM (#53568293)
    Let's face it: computing has grown up.

    Take application development. Pioneering has been replaced by engineering. Great for making complicated and reliable products, not so great for empowerment of the individual. Software engineering tends to be teamwork. Depending on how "standard" the required end product is, you can parcel out the interface design, the overall application design, the data structures, the core algorithms, data management, and housekeeping. Could be 3-50 software engineers in a team. Used to be 1 programmer doing all of that.

    Take high-performance programming. It used to be an art. Found e.g. in DoD stuff, scientific software, and games. Often in assembler, for speed. Nowadays that's mostly out. Certainly for scientific software. You use compilers or even scripting languages that call libraries to do the heavy lifting. You're quite unlikely to do better than the library builders. If you're writing some really new algorithm, you'll code it in C/C++. If absolutely necessary, you can make that code tunable (array stride, blocksize, etc.) and write an algorithm to optimise those parameters for your specific hardware (as e.g. tuned BLAS implementations do). If it's too slow, buy better hardware. If it's still too slow, get access to a Hadoop cluster and parallelise your algorithm.
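
    The "tunable (array stride, blocksize, etc.)" remark can be made concrete. Here is a toy cache-blocked matrix multiply in pure Python -- a sketch of the blocking idea only, nowhere near a real BLAS, where the block size `bs` is exactly the kind of hardware-dependent knob an auto-tuner would search over:

```python
# Toy illustration of the "tunable blocksize" idea: a blocked matrix
# multiply in pure Python. Real BLAS libraries do this (plus much more)
# in tuned C/assembly; the point is just that `bs` is a hardware knob.
import random

def matmul_blocked(a, b, bs=32):
    n = len(a)
    c = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, bs):
        for kk in range(0, n, bs):
            for jj in range(0, n, bs):
                for i in range(ii, min(ii + bs, n)):
                    for k in range(kk, min(kk + bs, n)):
                        aik, row_b, row_c = a[i][k], b[k], c[i]
                        for j in range(jj, min(jj + bs, n)):
                            row_c[j] += aik * row_b[j]
    return c

def matmul_naive(a, b):
    # Textbook triple loop, used here only as a correctness reference.
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 16
a = [[random.random() for _ in range(n)] for _ in range(n)]
b = [[random.random() for _ in range(n)] for _ in range(n)]
ca, cb = matmul_naive(a, b), matmul_blocked(a, b, bs=4)
print(all(abs(ca[i][j] - cb[i][j]) < 1e-9 for i in range(n) for j in range(n)))
```

    On real hardware you would sweep `bs` and keep whichever value best fits the cache hierarchy; that sweep is the "algorithm to optimise those parameters".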

    Take datacommunication. In the early days datacommunication meant controlling some UART and sending squiggles down a wire. Now it's calling a packaged protocol stack and talking to the appropriate protocol layers. More often than not that's the connection or session layer or higher ... unless you are a specialised networking engineer.
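
    A minimal sketch of that shift, using only the Python standard library: a loopback TCP echo in which "controlling some UART" has become a couple of calls into the packaged protocol stack.

```python
# "Talking to the appropriate protocol layer" now means a few stdlib
# calls; the squiggles on the wire are many layers down.
import socket
import threading

def echo_once(server):
    # Accept one connection and echo back whatever arrives.
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))

server = socket.create_server(("127.0.0.1", 0))  # OS picks a free port
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

with socket.create_connection(server.getsockname()) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)

server.close()
print(reply)  # -> b'hello'
```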

    As for computer users as clients: the nerdy types are dying out. What today's consumer wants is things like smartphones and tablets. And what do they want them for? To surf the web (shopping, news, amusement (e.g. video torrents, YouTube)), and to waste time on Facebook, Twitter, Instagram and various chats. If they somehow want a desktop computer, they'll only know it for the OS it runs. That would be "Windows" or "Apple" (meaning macOS, but Apple users typically don't know that). And that's what the industry is giving them. Want "Basic Freedoms"? Bugger off and run Linux, you freak.

    So, yes. Computing as a product has become commoditised and geared towards the mass market. It's not easy to turn a buck by catering for nerds: the real money is in serving customers. And it shows. Consumer-grade users get a consumer-grade experience plus consumer-grade treatment (read: DRM, spyware, bloatware).

    Those who want to play around with a computer however never had it better. For less than $50 you can get a complete Raspberry Pi system (or a lookalike) that's more powerful than a clunky old PC. For $500 you can get performance you used to have only on workstations, and for $1,500 you can get the same power you used to need a supercomputer for.

    The only thing stopping you is know-how, time and interest. But that's not the industry's fault,

  • computer games nowadays are tied to internet DRM like Steam

    So? Steam in particular is incredibly unintrusive unless you're actually trying to pirate the game, in which case it depends entirely on your definition of "fun" -- if you include the challenge of breaking DRM as "fun," then Steam and friends are far more interesting than "draw a black line on your CD."

    some crucial DCC software is available to rent only now (e.g. Photoshop)

    Again, so? Admittedly it's annoying having to keep re-paying for something, but that doesn't intrinsically lower the functionality of the software. That's like saying your house is crap because you had to m

    • by Altrag ( 195300 )

      the 360 doesn't already have Win10-like "features,"

      Whoops, it's the Xbox One these days, isn't it? Can you guess which console I prefer? Point still stands though.

  • I think tech and computing have changed for the worse across the board and often focus on trivial "look at me!" products and services. For a long time now, it has been the case that a computer wasn't much without network capability, but I'll confess that I am so tired of what has come along (think DDoS, breaches, invasion of privacy, tracking, ransomware) that I am about to just hang up my computer and spend time with paper books from the library.

    20 years ago I would have said it was cool and fun,

  • ... customizing muscle cars in the late 50s.

    The bad news is that cars are boring now.

    The good news is that cars no longer require tinkering to get them to go.

    I've changed out clutches, installed a/c, gapped plugs and points.

    For modern cars, I don't know bullshit from wild honey about fixing them.

    I'm a retired IT guy and cut my teeth on a TRS-80 I bought in Feb, 1978.

    I helped bring in the first network for Mobil Oil.

    I programmed Access, Lotus 123 (and later Excel) macros, and crap like that.

    I do not miss tho

  • by orlanz ( 882574 ) on Wednesday December 28, 2016 @07:01PM (#53568327)

    I am assuming you really meant "computing". Not just desktop programming and gaming like the examples implied.

    When I was just a lad, the adults had programming careers that were very fun. They solved complex puzzles and problems. It was very frustrating but very rewarding. Even growing up, I enjoyed programming, which was very much a "figuring things out" topic, minus the grease and back pain of former generations.

    But today, with more than a decade into adulthood, that topic has become mostly a commodity. Windows, Linux, embedded, or otherwise. Lots of people "program" and most problems have already been solved. It's more a test of google-fu than puzzle solving. As a career it is very boring, trivial, and narrow in the results. There are still positions like before but they are outnumbered 1000 to 1.

    So computing in that aspect is no longer fun. Same with hardware, it's all the same. It's all commodity. The gains in the permutations are so minor that cost easily overrides the performance benefits in most cases. This is primarily because hardware has outclassed software. I think software is probably a decade behind hardware now.

    But if we switch to microcomputers, sensors, and networks beyond just WiFi: the glory days of the past still exist. Smart homes, smart gardens, etc. are just a few tinkering days away. The common geek has access to fabricate their own custom hardware solutions. Writing the software is still mostly trivial due to the internet, but the ideas and solutions custom to a geek's unique physical world or situation are well within reach. In this space we are still only limited by our imaginations in defining the problems to solve.

    It is still very much FUN!

  • Is computing as cool and fun as it once was? Hell yes.

    Does "cool and fun" have anything to do with your friends' privacy, social justice, or borderline tin-foil-hat opinions of their OS and their license agreements? No.

    Maybe you need some actual cool and fun friends, or you need to change the question. Computing is more cool and fun than it's ever been. Some mythical issue with your software vendors does not change that. *

    *Posted on a Windows 10 computer using a browser that scrapes my personal d

  • by drinkypoo ( 153816 ) <martin.espinoza@gmail.com> on Wednesday December 28, 2016 @07:40PM (#53568511) Homepage Journal

    Computing is a lot more fun now than it was even ten years ago, let alone twenty, let alone longer. You can still do all the same stuff people did back then if you want; people are still wire wrapping their own computers from scratch, for example. And you can still play the games of yesteryear, through emulation. But now there's a whole bunch of things to do which didn't even exist back then, and furthermore, it's vastly easier to get access to a leg up so that you don't have to do a whole job yourself. For example, it's been reasonably possible to build quadcopters since about the 1990s, when cheap MEMS accelerometers began sampling from Analog Devices, and before they appeared in the Wiimote and people began to reappropriate them. But today you can buy a flight controller or build one out of components or you can buy a MCU board and an IMU board. You can write your own flight control software or you can just download code and write the binary or you can download and compile and optionally customize. You can buy the ESCs off the shelf or you can build your own or you can buy cheap ones and reflash them with superior open-source firmware which you can customize. And this is computing, obviously, since each of these things is a little flying computing cluster. And that's before even getting into making them autonomous. We didn't used to have multi-core computers with multiple GB of RAM and an onboard multi-core vector processor which would fit into a ~5W power envelope to do stuff like that with.
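
    To make the "write your own flight control software" point concrete, here is a hedged sketch of the core of many hobby flight controllers: a complementary filter that blends a fast-but-drifting gyro with a noisy-but-absolute accelerometer to estimate tilt. The sample data and function names are hypothetical, not any real firmware's API.

```python
# Complementary filter sketch: blend gyro integration with the
# accelerometer's absolute angle so drift stays bounded.
import math

def complementary_filter(samples, dt=0.01, alpha=0.98):
    """samples: iterable of (gyro_rate_deg_s, accel_x_g, accel_z_g)."""
    angle = 0.0
    for gyro_rate, ax, az in samples:
        accel_angle = math.degrees(math.atan2(ax, az))
        angle = alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
    return angle

# Hovering level with a slightly biased gyro: the estimate stays small
# because the accelerometer term keeps pulling the drift back.
level = [(0.5, 0.0, 1.0)] * 1000  # 0.5 deg/s gyro bias, accel says "level"
print(complementary_filter(level))
```

    Real firmware adds calibration, a third axis, and a control loop on top, but this is the digestible kernel of the thing.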

    Computing is also a lot more cool than it used to be, which ought to be painfully obvious. It's cool to carry a fancy, needlessly expensive computer around in your pocket! We used to get laughed at just for owning a computer, let alone one you could keep in your pocket. Now you get laughed at if your pocket computer is too old!

  • If you don't need a scientific calculator to help troubleshoot logic and/or circuitry, it ain't "cool".

  • How could you find computing anything but more fun than ever? If there is something you liked a lot more "back in the day" there is nothing stopping you from running things the way they did back in the day.

    Games? Emulation, VPNs, old hardware for dirt cheap, etc. It's all there. How is Steam worse than the stupid copy protection of old? You really miss having to thumb through manuals for access codes? Use GOG. Buy the physical media. There are plenty of options.

    BBSes or the thrill of war dialing? People s
  • by DidgetMaster ( 2739009 ) on Wednesday December 28, 2016 @07:56PM (#53568619) Homepage
    The computing environment certainly has changed. The powers that be are busily taking a lot of the fun out of it in their efforts to create 'walled gardens' where THEY (not YOU) control everything. While I am glad that I don't have to get out a soldering iron just to save a few bytes to permanent storage, I get enraged every time some program or system tries to hide my data from me or make it nearly impossible to do what I want with it. I get ticked off when all my settings change because the system decided it was going to 'upgrade' whether I wanted it to or not, and the company that wrote the software wants to make me view some new ads. I pull my hair out when I can't even find where my app decided to store that file I just created. I certainly miss the days when the 'install program' was copy *.* and the 'uninstall program' was del *.*, and you didn't have to worry about a dozen registry settings or DLLs left behind to wreak havoc on your system. I am currently working on a new system that will bring a lot of that control back where it belongs...with the user.
  • The older cool point is about simplicity, in my opinion. Computers were simpler (I refer to the device itself, not its usability), and it was much easier to hack cool stuff on them.
  • of the Commodore VIC-20, C-64 and the all awesome Amiga!
    The days of the Amiga were the days of real hardware hacking, building your own SCSI controllers, your own cables. Hacking the Amiga hardware - cutting solder traces and soldering wires to the motherboard to add a toggle switch to toggle between 1 MB chip RAM and 1/2 MB chip and 1/2 MB fast RAM. Soldering wires to jumpers to switch between NTSC and PAL video modes. Multi-boot ROM boards. Burning the whole Amiga 2.1 OS into EPROM and having an Amiga 2000 b

  • ... Yep, not fun and no time and energy like the old days like from the 1980s to early 2000s. :(

  • The Atari, Commodore 64, Apple 2... were hobbyist machines generally designed to interest children in computers. Windows 10 machines are generally work machines generally designed for adults who have some objective not related to entertainment that requires computer assistance to accomplish it. The systems you are describing are the ones that killed the DIY kits from the 1970s, they were part of migration away from hobbyist culture in that they allowed kids (and middle class families) and not adult hobby

  • We have amazing systems and networks today.

    Home 24x7 broadband Internet connections with 100x the bandwidth of the LANs I grew up hacking together. Computers thousands of times more capable in every way.

    Even Grandma's now ancient desktop has an operating system with memory protection, preemptive scheduling, multi-processing and a capable IP stack.

    Pocket-sized computers now sport capabilities I wouldn't have believed had someone from the future come back and handed one to me.

    There have been amazing advances in nifty

  • It's not as roll-your-own as it used to be, but I still enjoy working with computers. The big trend I see causing long term issues is consumerization -- everyone is demanding services that work 100% of the time on their phones, so everything is geared towards that. My big thing is scripting and automation -- making something idiot proof so I can send it out to idiots. ;-) I don't have much time for gaming anymore as I have 2 little kids, but when they get old enough I'm sure I'll get back into it.

    One thing

  • There are a couple of basic problems the submitter and his circle of friends have here that make it appear that it's not as cool or fun anymore. The first is that they're old enough to start seeing things that are different from how they were in their childhoods as not as good. The second is that they're looking at the designed-to-be-idiot-proof mass market and expecting to see DIY where you can get your hands dirty messing with the inner workings of things.

    Games being tied to DRM is an issue, but it can be

  • No. Mainstream computing is dull and boring and often frustrating.

    Yes. Old time computing still exists, it just isn't mainstream anymore, it is fringe. My first computer required being soldered together from a kit. All personal computers required being soldered together from a kit. I think of the C64 as the third or fourth generation of hobby computers. But guess what: I'm still soldering together my own computers. And it's still fun, and it's still cool. It's just that now, what I do is so far from

  • by LionKimbro ( 200000 ) on Wednesday December 28, 2016 @11:04PM (#53569581) Homepage

    Forget compiled languages. That's not fun.

    We want command-based (imperative) languages that can be run in a REPL for fun. BASIC basically fits this.

    Take Python as a contemporary example. Now look at how many basic features of interactivity are NOT enabled in an easy way in Python by default: LOCATE, INKEY, SOUND, PLAY, SCREEN, PSET, LINE, CIRCLE, PGET.

    Just these. You can't do ANY of these things in Python with a basic install. "Yes," if you have tkinter in your install, you kind of can. But it's hairy and complex. It's not anywhere near as simple or accessible as BASIC. Pygame makes some of these things possible, but those are further steps of installation away, and the interactivity feels further away.

    Line numbers are incredibly simple (read: understandable) as a flow control model. "Why Johnny Can't Code" outlined the problem with mandatory complex abstract control structures.
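
    To be fair to line numbers, the whole flow-control model fits in a screenful. Here is a toy, hypothetical line-numbered interpreter in Python (statements as tuples rather than parsed text), running the equivalent of 10 LET X=1 / 20 PRINT X / 30 LET X=X+1 / 40 IF X<=3 GOTO 20 / 50 END:

```python
# A toy line-numbered interpreter: the BASIC flow-control model in
# a few lines. Statements are (op, args) tuples, not parsed source.

def run(program):
    """program: dict mapping line numbers to statements; returns PRINT output."""
    lines = sorted(program)
    env, out = {}, []
    pc = 0  # index into the sorted line-number list
    while pc < len(lines):
        op, *args = program[lines[pc]]
        if op == "LET":
            name, value = args
            env[name] = value(env) if callable(value) else value
        elif op == "PRINT":
            out.append(str(args[0](env) if callable(args[0]) else args[0]))
        elif op == "GOTO":
            pc = lines.index(args[0])
            continue
        elif op == "IF":  # IF cond THEN GOTO target
            cond, target = args
            if cond(env):
                pc = lines.index(target)
                continue
        elif op == "END":
            break
        pc += 1
    return out

demo = {
    10: ("LET", "X", 1),
    20: ("PRINT", lambda e: e["X"]),
    30: ("LET", "X", lambda e: e["X"] + 1),
    40: ("IF", lambda e: e["X"] <= 3, 20),
    50: ("END",),
}
print(run(demo))  # -> ['1', '2', '3']
```

    The entire control-flow story is "go to the next line, unless told otherwise" - which is exactly the understandability argument.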

    I think there are basic fundamental missing pieces in the contemporary programming environment, and that the industry is worse for it.

I haven't lost my mind -- it's backed up on tape somewhere.
