
Ask Slashdot: When Should You Call Hardware a 'SoC'? (wikipedia.org) 140

Slashdot reader Prahjister knows what a system on a chip is. But that's part of the problem: I recently started hearing the term SoC at work when referring to digital signage hardware. This has really triggered me.... It is like when I heard people refer to a PC as a CPU.

I tried to speak to my colleagues and dissuade them from using this term in this manner, with no luck. Am I wrong in trying to dissuade them from this?

Maybe another question would be: Are there technical malapropisms that drive you crazy? Share your own thoughts and experiences in the comments.

And when should you call hardware a 'SoC'?
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Pet peeves on /. now?
    • by Potor ( 658520 ) <farker1@gmai l . com> on Sunday March 19, 2023 @01:40PM (#63382965) Journal
      pet peeve: "where" for "we're"
      • Yeah, swipe on a phone when I wasn't micro-scrutinizing. It seems like it's getting worse and worse every year, probably "auto learning" from people who don't know the difference.
        • Slashdot is an old site, and most of us are getting older.

          And part of getting older is frankly we get a bit dumber. We might *know* more, and we might be a bit more optimized in how we deploy and use that knowledge, but by the mid-40s there are already a few signs of ageing in the old brain socket.

          I recently did an IQ test and I've gone from the high 130s to about 130. Did it twice, same score. My spatial reasoning's good, my verbal reasoning is good, but my short-term RAM is just shot. I was so disturbed by this I

          • I'm turning 48 far too soon. Thankfully, my memory is still measurably improving, but I've intentionally worked on it since my teens. All other areas of learning are also accelerating. I have suffered greatly from presbyopia. It has drastically reduced my rate of ingest.

            I can't say much about IQ, last time I was tested, it was a 4 day proctored exam and I was simply told "It's good enough".

            I don't know how it works for everyone else, but I believe most degradation is related to lack of exercise. When I real
            • Wow. You and I led very different lives. I've never even heard of a 4 day proctored IQ exam, and if I subjected myself to one in no way would I accept, "It's good enough". I know it's good enough. That's why I do what I do.

              Also, if your memory is measurably improving at 48, either you've lived a commendably(?) pristine life, or you bottomed out earlier from excess living. I know I'm in the latter category. I left a few IQ points in the bin on my way through my early 30s.

          • And part of getting older is frankly we get a bit dumber.

            You misspelled "crankier". And get off my lawn, I have some clouds to shout at.

      • Pet peeve: not using a capital letter at the beginning and not adding a period at the end of a sentence.

  • by Joce640k ( 829181 ) on Sunday March 19, 2023 @01:39PM (#63382961) Homepage

    The Wikipedia page linked in the first line of the summary has a good definition. I'll include it here for the people who didn't read that far:

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • The Wikipedia page linked in the first line of the summary has a good definition.

      I'll agree with their definition.

      I note that it's also somewhat dated and misses a bunch of stuff. Two items struck me:

      - It seems to completely miss the IoT system-on-a-chip families that underlie all those Bluetooth Low Energy devices. Processor(s), flash, RAM, host of peripherals, including several networking variants and at least one radio, bunch of GPIO, A/D, etc. for controlling and monitoring, under a buck in qua

      • So what's the difference between a microcontroller, like the ESP32, and a SoC used in an IoT device, like an ESP32?

        • by Entrope ( 68843 )

          You can have a system-on-chip microcontroller. These are distinguished from other microcontrollers by having lots of relatively sophisticated peripherals on the same die as the processor core. Think of all the stuff that usually lives on the "south bridge" in a traditional or high-end PC.

        • So what's the difference between a microcontroller, like the ESP32, and a SoC used in an IoT device, like an ESP32?

          I have no clue about Tensilica's stuff, other than that (at least part of) the ESP32 series uses their proprietary LX6 processor cores.

          I was thinking of things like Nordic's nRF series, which use ARM cores and which I'd definitely consider SoCs. But several other families I've worked with also IMHO would qualify.

          Many IoT SoCs work fine as microcontrollers, so I suspect the difference in some c

        • SoC: System on a Chip
          Microcontroller: a processor - no system, no ram, no nothing.

          But I'm sure you knew that already ...

          Or in other words: the SoC is a full-fledged computer, for whatever purpose, and except for needing some software to load from somewhere, it only needs power, nothing else.

          • Re: Simple (Score:4, Insightful)

            by zmollusc ( 763634 ) on Monday March 20, 2023 @05:12AM (#63384327)

            Surely you meant to say Microprocessor: a processor - no system, no ram, no nothing.

            Put all the parts of a processor on one chip - Microprocessor.

            Put all the parts of a processor along with its supporting elements of ROM, RAM, I/O etc on one chip - Microcontroller.

            Put all that plus some other elements which we now consider intrinsic to a computer system, such as GPU, RF etc on one chip - SoC.
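            The three-tier taxonomy above can be sketched as a toy classifier. The component names and category sets below are illustrative assumptions chosen for this sketch, not an industry standard:

```python
# Toy illustration of the taxonomy above: classify a part by which
# blocks it integrates on one die. The block names ("cpu", "gpu", ...)
# are hypothetical labels for this sketch, not a formal standard.
def classify(on_die_blocks):
    blocks = set(on_die_blocks)
    if "cpu" not in blocks:
        return "not a processor"
    # System-level blocks (GPU, RF, ...) push a part into SoC territory.
    if blocks & {"gpu", "rf"}:
        return "SoC"
    # Processor plus ROM/RAM/I-O on one chip is a microcontroller.
    if blocks & {"rom", "ram", "gpio"}:
        return "microcontroller"
    # Just the processor core on one chip.
    return "microprocessor"

print(classify({"cpu"}))                               # microprocessor
print(classify({"cpu", "rom", "ram", "gpio"}))         # microcontroller
print(classify({"cpu", "rom", "ram", "gpio", "gpu"}))  # SoC
```

            As the rest of the thread shows, real parts blur these lines (SoCs without on-die RAM, microcontrollers with radios), so any such rule is a rough heuristic at best.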

            • by _merlin ( 160982 )

              But doesn't a typical SoC lack RAM/ROM? They often use that "piggyback" arrangement for the RAM chip. Microcontrollers tend to have the RAM and ROM integrated, although there have been variants of the 8048, 8051, H8, etc. without internal ROM.

        • by Gabest ( 852807 )

          Linux. If it cannot run Linux, it's just a microcontroller.

        • So what's the difference between a microcontroller, like the ESP32, and a SoC used in an IoT device, like an ESP32?

          The difference is the ESP32, obviously!

  • I'd even call the newer Intel/AMD CPUs SoCs because they have integrated GPUs inside the processor itself.

    If it's a pure CPU with nothing additional besides a cache, it would not be a SoC.

    • by crow ( 16139 )

      No, because they don't have all the core functionality for a system, such as USB, on the chip. Correct me if I'm wrong, but don't they still require the southbridge chip to make a real system? They have generally integrated the northbridge chip into the CPU, so that's a big step.

      But in general, when I hear SoC, I think Raspberry Pi, cell phone, or equivalent embedded system.

  • by cuda13579 ( 1060440 ) on Sunday March 19, 2023 @01:45PM (#63382987)

    Those have both become such "marketing terms" that they too have lost meaning. The threshold for what is considered "artificial intelligence" has sunk to a point that you could probably sell a four-function calculator as using "AI" to do basic arithmetic.

    • DevOps (Score:4, Informative)

      by sodul ( 833177 ) on Sunday March 19, 2023 @01:55PM (#63383019) Homepage

      DevOps ... DevOps teams and DevOps roles are oxymorons if you go by the original definition. If a company creates a DevOps team as a new silo so the Devs are more isolated from the Operational side of the product, they are 100% missing the point of DevOps.

      In case you are wondering, DevOps is a culture or a mindset where the Dev team collaborates with the Ops team. This leads to software that is more stable and scales better. You can, and should, add a team that helps bring the tooling and processes to make DevOps a reality, but that would be a DevOps Enablement team.

      • I've seen teams being labeled DevOps when they are neither Devs nor Ops. It was baffling when I ran across this, but the director said it was because he wanted the team to feel like they were more in control or such. Buzzwords: gotta use them before they fall out of fashion.

      • where the Dev team collaborates with the Ops team.

        Tell my company that... They seem to be under the impression that DevOps means the dev team is now responsible for all the ops as well. Not a collaboration, handle it all, so we don't have to pay any ops people.

        • Note: this is at a 60 billion dollar company... Not some little startup.
        • by sodul ( 833177 )

          Unfortunately the bulk of the industry completely misunderstands trends and buzzwords, leading to poor implementations. A former employer (did not stay too long, fortunately) decided to stop with QA since Devs should test their own code, and the QA Engineers became 'DevOps' engineers since they know how to automate things ... that was a disaster. At some point an engineer, with a long tenure in the company, asked me to run the production code with the line profiler enabled so they could show high line coverage metrics to the higher ups.

          • by narcc ( 412956 )

            asked me to run the production code with the line profiler enabled so they could show high line coverage metrics to the higher ups

            That's both depressing and really funny, all at the same time

    • SoC - System on a Chip. These have traditionally been CPU+RAM+flash, but _also_ with modules for I/O handling (DMA, UART, USB, etc). One chip, solder it on a board with the appropriate resistors, capacitors, inductors, and some application components like sensors, and you've got a whole system.

      Now CPUs for PCs are odd things. They're no longer just CPUs, they're a sandwich of multiple boards each with many chips, sometimes miniaturized as multiple dies in the same plastic/epoxy package. More than the C

      • by tlhIngan ( 30335 )

        These days, SoCs generally are everything but memory - so they have your processor core, memory interface(s), peripherals like timers, interrupts and others, plus external I/O which generally include I/O devices like GPIO, SD/MMC controllers, PCIe, USB, and other interfaces all in one package.

        Memory, both RAM and flash are typically external to the chip because they are often sized to needs.

        SoCs generally refer to higher performance devices like application processors that run Android and other things, but

        • There's a big range though. The SoCs I've used had RAM and flash. Thus an entire product was SoC plus RF chip plus discrete components. Others I see have internal RAM and flash but allow external as well for a larger amount (sometimes 2MB is too small, go figure). Also the size varies hugely, from 16-bit chips up to the equivalent of last millennium's supercomputers.

          I've seen the SoM with a slightly different term, and worked on one, though we didn't have a term other than Module. There are also the sta

    • Ooooh, get mr fancypants with his four arithmetic functions! Back in my day we used to shift the bits to the left by hand.

    • by AmiMoJo ( 196126 )

      I checked TFA and the SoC it mentions is actually a real SoC.

      https://dn.odroid.com/S922X/OD... [odroid.com]

      As well as the ARM CPU, it has a GPU with video decoding, crypto engine for DVB, HDMI interface with CEC and encryption support, camera interface, CVBS encoder, audio codec and stereo DAC, NAND flash controller, DRAM controller, SD card controller, gigabit ethernet MAC and PHY, smart card controller (for satellite TV), DTV demux, USB controller with OTG, PLL clock generators, and some kind of power management (swit

  • A microprocessor which doesn't need a motherboard chipset to fanout connections to memory/peripherals is a SoC, to me. Language is as always fuzzy though, even within technical fields rigor is more exception than rule.

    • Language is as always fuzzy though, even within technical fields rigor is more exception than rule.

      May I take exception to "always"? Language is usually fuzzy, which is usually a useful feature. But in a few - mostly technical - fields, absolute accuracy, rigour, and consistency are vital.

      One of the biggest problems we have is that people are often unsure about the boundary between fuzzy language and rigorous language.

  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Sunday March 19, 2023 @01:48PM (#63382995) Homepage

    Image manipulation that talks about "turning left or right" when they mean "rotate anti-clockwise or clockwise". I see this in real life where people talk about turning door handles left/right. When I complain people either look blankly or explain that "clockwise" is not understood any more, all clocks are digital - in spite of clock emojis [emojipedia.org] all being of round/analogue clocks.

    • by rossdee ( 243626 ) on Sunday March 19, 2023 @01:55PM (#63383017)

      Maybe we should use the terms deosil and widdershins instead

    • Right-hand screw rule vs left-hand screw rule FTW. Damn kids shortcutting things...

    • by 3247 ( 161794 )

      Image manipulation that talks about "turning left or right" when they mean "rotate anti-clockwise or clockwise".

      It should be called widdershins and deosil.

    • Clockwise and anti-clockwise are very well understood.
      However, in my family no one ever used such terms, except for a clock with hands.

      It is more odd - astonishing actually - that an intelligent person like you does not understand "turn the valve left to open the water and right to close it".

      • by jwdb ( 526327 )

        Please define for me, unambiguously, what a "left" rotation is to you, because I agree with GP.

        If you turn something clockwise the upper half moves right and the lower half moves left. Is that therefore a "left" rotation or a "right" rotation? Can't just use the direction my hand moves in either, as that depends entirely on how I'm holding what's rotating.

        I've memorized the open/close rotation as muscle memory, but the words left and right won't stick because my mind rejects them as not meaningful.

  • I've heard people refer to the desktop/tower portion of a PC as a "hard drive" more than a few times. Working in the HVAC industry, laymen tend to refer to every refrigerant as "Freon", even though it is actually a brand name, not a specific refrigerant.

    Lately, with the vinyl record resurgence, there are folks calling records "vinyls", and butchering the names of the various parts of a turntable in all sorts of hideous ways. That really grinds some people's gears.

    Pretty much just have to figure that to some

    • Working in the HVAC industry, laymen tend to refer to every refrigerant as "Freon", even though it is actually a brand name, not a specific refrigerant.

      That sort of thing is pretty common. For example, people say "Xerox" instead of "photocopy". Companies usually discourage this to help protect their trademarks from becoming common use terms and un-protectable.

    • I've heard people refer to the desktop/tower portion of a PC as a "hard drive" more than a few times.

      That would be a big F-ing hard drive, apparently.

      Well, as long as they get the "cup holder" disambiguated from the "optical disk drive", they'll probably be okay.

  • by godrik ( 1287354 ) on Sunday March 19, 2023 @02:02PM (#63383041)

    There is a current trend where "cyber" is used as a noun to refer to cyber security, to the point that anything containing the word cyber is assumed to be about security. Drives me crazy.

    • by tepples ( 727027 )

      Particularly because when I grew up, "wanna cyber?" referred specifically to erotic chat, or "cybersex" as it was once known.

  • I can't believe this "question" made the frontpage. :/
    • I can't believe this "question" made the frontpage. :/

      Didn't you know? The whole point of becoming knowledgeable in a field is so that you can perform gatekeeping when other people botch the nomenclature. It's basically the whole reason an English major exists.

    • The announcer butchers it by saying "forward slash forward slash THAT slashdot motherf***r"

  • You can call it that when it's a System on a Chip. Come back for more insightful advice. I did not even have to run ChatGPT for this one.
  • If I had a coworker who got annoyed that I didn't use perfect terminology around them I'd stop talking to them. If I had to interact with this person all the time I'd find a new job.

    The answer is "don't worry about it". Ketchup, Catsup, who cares.
  • In my lifetime I've gathered a large catalog of misused (IMHO) technical terms that could bother me, but I've let it go. You see, languages mutate over time, going through a never ending evolution that is affected by a myriad of factors, correctness not being high on the list. Pro tip: No one likes a scold.
  • If they're really referring to the whole unit (including the panel) as an "SoC", that does seem weird. But there's a lot we don't know...

    - Are they *really* referring to the whole unit as an "SoC", or some part of it?
    - Are these technical people (I'm guessing not)?
    - Does it actually matter that much to you?

    My first response is "just let it slide - it doesn't matter". But then I remember how much it bothered me how, for several years, our faculty were referring to any new iteration they'd made on some existing

  • And I think one of teaching's big requirements is clarity of thought. Another is clarity when explaining to someone else.

    If the word is used too broadly (like something being 'woke') then lots of people can think they know what it means, but cannot succinctly define it. Non-technical people get annoyed by technical language because it feels obtuse. Confusing. Overly precise. Technical people get annoyed when non-technical people misuse technical language because the details are clear to them, and matter.

    • If the word is used too broadly (like something being 'woke') then lots of people can think they know what it means, but cannot succinctly define it.

      It means to be aware of systematic injustices present in society. The people who misuse it to mean "I don't like this TV show because it has too many gay or brown people in it" just don't want to admit to that being their definition.

      It's misused mostly by people who don't realize how good they actually had it. You know, things like not having to stay closeted throughout all of high school for fear of getting beaten to death.

  • I'm generally okay with misappropriating terms from a completely different field, but cooling water systems have existed for a long time, and open-loop vs. closed-loop cooling is defined by the containment of the heat-transfer fluid, not by whether you get to build it yourself with off-the-shelf parts. 100% of "open loop" watercooling systems in PCs are actually closed-loop cooling systems. /petpeeve that hasn't been mentioned here yet.

  • This used to mean a thing.
    And now (for the last 20y actually) it is used for something totally different.

    And whenever you mention the true meaning, you feel pedantic.

  • Why do we need to be so pedantic about squishy terms like 'SoC' and 'microservice'?

    • Why do we need to be so pedantic about squishy terms like 'SoC' and 'microservice'?

      Engineers like to know what they are talking about. Laymen do not care.

      • Maybe one should not engineer based on marketing terms alone. Because 'SoC' is about as exact as 'bridge' or 'building' or something else you might engineer that you describe in a single word.

  • Again, the definition has been bent just to label something as raytracing for marketing purposes. Back in the early '90s, I saw a PC game that claimed to do raytracing, but it looked like fake Phong shading.
  • ... malapropisms that drive you crazy?

    Every news report and TV drama using "hacking" to describe cyber-security intrusion (cracking).

  • by dohzer ( 867770 )

    In the FPGA world, the 'SoC' term is generally used when a hard processor (e.g. ARM Cortex-A53) and FPGA fabric are contained in a single device, with communications occurring via a shared on-board bus.
    The alternative is to use a non-SoC FPGA, with either a separate processor chip or a soft-core processor (MicroBlaze, NIOS, etc) implemented in FPGA fabric.

  • by acroyear ( 5882 ) <jws-slashdot@javaclientcookbook.net> on Sunday March 19, 2023 @07:42PM (#63383729) Homepage Journal

    I'm calling my hardware an SOB.

    Some of the software, too.

  • Having read Steven Levy's book, I always admired the original hackers because they were explorers of a sort. They taught themselves how to do things and figured out how systems worked through experimentation rather than being taught by someone else who could and would limit the amount of knowledge they shared aka gatekeepers. Gatekeepers get off on quashing other people's passion for learning and doing.

    Like so many other human endeavors, hacking and the meaning of the word got corrupted by sociopaths.

  • Cheapest and fastest way to get to market. Probably runs Linux and violates the GPL too. That's how this industry works. Is some three-generations-out-of-date ARM920T with built-in LVDS controller and video decoder a system-on-a-chip? Well, it has USB host, built-in ethernet MAC and PHY, and maybe even package-on-package DRAM. More than likely these days, signage that supports 4K and some reasonably relevant video codecs is also a fairly decent ARM (quad-core A53 maybe). Hard to support HDMI 4K@60 or better with HDMI

    • by AmiMoJo ( 196126 )

      I know Panasonic were using RPi compute boards for their digital signage at one point, not sure if they still are.

      • A few SoM vendors picked up on using the same connector as used on the RPI-CM3. So theoretically they can swap in a different board if supply of the Broadcom parts becomes difficult. Getting the equivalent software working on a different ARM SoC is theoretically easy, but in practice there are so many roadblocks that this strategy of alternative vendors usually isn't cheap or worthwhile for low margin stuff where you'd drop in such a module.

  • This applies to SoC and all the terms used in this rapidly evolving technology.

    People form mental images of technical terms, but these images often remain the same and become obsolete over time.

    People of my generation have a mental image of a CPU, that is, a single integrated circuit. People of the previous generation may have mental images of a full cabinet called CPU (Central Processing Unit) with internal circuit boards dedicated to IDU (Instruction Decode Unit), AMU (Address Multiplexer Unit) and ALU (Arithmetic Logic Unit).

  • Sentences that misuse "malapropisms" drive me crazy. For the record, a malapropism refers to using an incorrect wordle in place of one with a similar sound.

  • drive me crazy. While it's not that far off, no Johnny Simpleton, you don't "download" data from your SD card. It feels nearly as stupid as saying CPU for a computer. Another pet peeve: an image with text below is not a "meme" - it's an image macro as long as it has not acquired meme status and, otoh, there are also other kinds of memes.
  • The number of options for as-a-Service is really getting ridiculous. That's my peeve. But only because it's tangential to my job, so I just hate other people's TLAs.
  • I think that is less common nowadays along with calling them hard discs.

    I just tell people "call it a computer if you're not sure". Increasingly, this is accurate whether you are talking about printers, mobile phones, cars or other devices you connect TO PCs.

    I am reminded of someone who didn't call his PC anything but used it as a footrest. I tried to dissuade him but his boss did it more effectively!

    Calling inanimate devices the wrong name does not make me cross. I am more upset by people calling other pe

    • There is a distinct difference between "immigrants" and "illegal immigrants". Pretending like that doesn't exist is... like calling a computer a CPU.
  • I used to be bothered when people called a PC a CPU a long time ago, but found it generally just added to user confusion when trying to educate them on all the components. I found it about as useful as a mechanic explaining the inner workings of a car to someone who doesn't care and just wants it to work.

    Now, most people seem to call a PC a "hard drive" more than CPU. 99% of the time I'm worried about the data, so hard drive works fine for me.

  • Employee: That's my CPU ( Points at Computer)
    Me: it's not called a CPU
    Employee: Oh, yes, if you're being technical, it's a Hard Drive.
