AI Hardware

Why 'Ambient Computing' Is Just A Marketing Buzzword -- For Now (computerworld.com)

An anonymous reader quotes Computerworld columnist Mike Elgan: Ambient computing is real. It's the next megatrend in computing.... To interact in an "ambient computing" context means to not care and not even necessarily know where exactly the devices are that you're interacting with. When IoT devices and sensors are all around us, and artificial intelligence can understand human contexts for what's happening and act accordingly and in our interests, then ambient computing will have arrived...

As with many technology revolutions, including augmented reality and AI, the buzzword ambient will precede the actual technology by many years. In fact, the marketing buzzword is suddenly here in full force. The actual technologies? Not so much. Instead, we're on the brink of a revolution in what you might call "semi-ambient computing...."

Rumors are circulating that Google's next smartphones, the Pixel 4 line, may come with Soli built in. I told you in January about Google's Project Soli, which may be called the "Aware" sensor or feature in the Pixel 4 -- again, according to unconfirmed rumors. Soli or Aware capability means the Pixel 4 may accept in-the-air hand gestures, such as "skip" and "silence" during music playback. The new Google "wave" is a hand gesture. The ability to wave away music with a hand gesture brings the smartphone into the semi-ambient computing era. It basically adds natural hand gestures to natural-language processing.... Google also briefly talked last year about a healthcare assistant called Dr. Liz, which was described by former Google CEO Eric Schmidt as an ambient computing virtual assistant for doctors. We'll see if Google ever ships a Dr. Liz product...
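To make the gesture-to-action idea concrete, here is a minimal sketch of how recognized gestures might be dispatched to playback actions. Google has published no Soli or Aware API, so the GestureEvent type, the gesture labels, and the action table below are all hypothetical:

```python
# Hypothetical sketch of gesture-driven playback control. Google has not
# published a Soli/"Aware" API; GestureEvent, the gesture labels, and the
# action table below are invented for illustration only.

from dataclasses import dataclass

@dataclass
class GestureEvent:
    name: str          # e.g. "swipe_left", "wave"
    confidence: float  # classifier confidence in [0, 1]

# Assumed mapping, loosely matching the rumored "skip" and "silence" controls.
ACTIONS = {
    "swipe_left": "skip_track",
    "wave": "silence_playback",
}

def handle_gesture(event: GestureEvent, min_confidence: float = 0.8):
    """Return the playback action for a confidently recognized gesture."""
    if event.confidence < min_confidence:
        return None  # drop low-confidence detections to limit misfires
    return ACTIONS.get(event.name)

print(handle_gesture(GestureEvent("wave", 0.93)))  # -> silence_playback
print(handle_gesture(GestureEvent("wave", 0.40)))  # -> None
```

The hard engineering lives upstream of this dispatch table, in the radar-based classifier that produces the gesture events in the first place, which is exactly the part Google has said the least about.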

Yes, ambient computing is real, and the Next Big Thing, showing up first in business, enterprises and healthcare. But for now, the term ambient computing will be misapplied. It's a buzzword that will be stapled to every semi-ambient computing product and service that comes out over the next few years.

The article predicts we'll eventually see ambient computing arriving in cars, grocery stores, smart glasses -- and notes a Microsoft job listing for its "Ambient Computing & Robotics team" describing "the era where computer vision, AI-based cognition, and autonomous electro-mechanicals pervade the workplace."

Computerworld adds that Microsoft "was mocked for its 'Clippy' assistant, which the company released in 1996 as a way to provide friendly help for people using Microsoft Office. In the future, Microsoft may release what will essentially be a Clippy that works, because it will understand human context through AI."

Comments:
  • by Anonymous Coward on Sunday June 16, 2019 @12:40PM (#58771814)

    means to not care and not even necessarily know where exactly the devices are that you're interacting with.

    But I want to know where exactly the devices are that I am interacting with.

    I want to control where my data resides and who gets to see it.

    I want control over my own computing environment. I do not want to cede all power and control to monolithic meganationals that want to monetize everything about me.

    Can we ever just... have personal computers again? Ones that are not designed from the very beginning to spy on us and "take our privacy seriously" as it were?

    • Yeah that was the 1950s to about the 2000s. I knew what was going on, within reason, with my Commodore 64 when I was in high school.

      Now? I just accept that even the keyboard controller in my laptop is probably something I could spend months trying to understand.

      • by Anonymous Coward

        There's a difference between "don't know what all the transistors in the keyboard controller of my laptop are for" on the one hand, and "am 100% certain my data is being scraped and collected" on the other.

        Could the keyboard controller send the file I just typed into vim to Google? In theory yes, but (1) I blocked Google's IP ranges and (2) I'd notice that network connection in logs. I can be fairly confident the file I created locally has remained local. Not 100% confident, that is true, but close enough for me.
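        A rough sketch of that kind of check, assuming a blocklist of CIDR ranges and destinations scraped from connection logs (the ranges below are reserved documentation prefixes, not Google's real address space):

```python
# Sketch of the approach described above: flag outbound connections whose
# destination falls inside a blocklist of CIDR ranges. The ranges and log
# entries are placeholders, not Google's actual address space.

import ipaddress

BLOCKED_RANGES = [ipaddress.ip_network(cidr) for cidr in (
    "198.51.100.0/24",  # placeholder range (TEST-NET-2)
    "203.0.113.0/24",   # placeholder range (TEST-NET-3)
)]

def is_blocked(dest_ip: str) -> bool:
    """True if dest_ip falls inside any blocked CIDR range."""
    addr = ipaddress.ip_address(dest_ip)
    return any(addr in net for net in BLOCKED_RANGES)

# Destinations as they might be pulled from firewall or netflow logs.
for dest in ("203.0.113.7", "192.0.2.10"):
    if is_blocked(dest):
        print(f"ALERT: outbound connection to blocked range: {dest}")
```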

    • > Can we ever just... have personal computers again?

      You CAN. But that is NOT where the industry and the world are going.
      I'd suggest Neal Stephenson's new book, Fall; or, Dodge in Hell. There is a great exploration in that book of this trend and the world it may lead us to.
  • Humans will just be another computer in an abstract computer environment.
  • Basically they want to build Iron Man's lab. And life. No wonder it's vaporware.
  • Asimov (Score:2, Insightful)

    by AmiMoJo ( 196126 )

    In Asimov's robot books the robots would respond to tiny hand signals and glances. It seemed like a cool idea, compared to the somewhat overly wordy interface for the Star Trek TNG computer. I'd rather wave a hand than say "Computer, tea, Earl Grey, hot".

    Of course, the danger is you blink twice to serve the biscuits to the guests, but it triggers the sexbot to emerge from its cupboard instead.

  • by Anonymous Coward

    I know some people like mouse gestures, but they're the first thing I disable on a new system; it's too easy for me to accidentally trigger them. They want hand waves to control music? Here's what happens: the music makes me want to get up and dance. Dammit. I just turned the volume up to 11 and then replaced the music with a security scan because it misinterpreted my moves. (See the debounce sketch after this thread.)

    • Call me obsolete but I was never able to interact with a device through voice or mouse gestures. My computer interaction was defined and matured during those times when these methods did not exist. You had a keyboard and a monitor, and that was all there was. Mice (and the interface allowing them to be useful) became mainstream after a while and I got used to them very quickly, because they greatly enhanced computer-human interaction. Voice commands always made me feel like talking to an idiot, because normal...
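    The accidental-trigger complaint above has a standard mitigation: accept a gesture only when the classifier is confident and the same gesture hasn't fired within a cooldown window. A minimal sketch, with invented names and thresholds (no real gesture API is being quoted here):

```python
# Illustrative mitigation for accidental gesture triggers: accept a gesture
# only when the classifier is confident AND the same gesture hasn't fired
# within a cooldown window. Names and thresholds are invented for this sketch.

import time

class GestureDebouncer:
    def __init__(self, min_confidence: float = 0.9, cooldown_s: float = 2.0):
        self.min_confidence = min_confidence
        self.cooldown_s = cooldown_s
        self._last_fired: dict[str, float] = {}  # gesture name -> timestamp

    def accept(self, name: str, confidence: float) -> bool:
        """Return True only for confident gestures outside the cooldown."""
        now = time.monotonic()
        if confidence < self.min_confidence:
            return False
        if now - self._last_fired.get(name, float("-inf")) < self.cooldown_s:
            return False
        self._last_fired[name] = now
        return True

deb = GestureDebouncer()
print(deb.accept("wave", 0.95))  # True: confident, no recent fire
print(deb.accept("wave", 0.95))  # False: inside the 2-second cooldown
```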

  • I was looking at an internet the other day and there was an ad for a citizen developer event. I didn't dare click it.

  • It's a buzzword... (Score:5, Insightful)

    by QuietLagoon ( 813062 ) on Sunday June 16, 2019 @01:29PM (#58771992)
    ... that hides the real intent: 24/7 spying on you so marketing companies can peer into your private life.
    • "24/7 spying on you so the gestapo can peer into your private life."

      FTFY

    • It's a buzzword that hides the real intent: 24/7 spying on you so marketing companies can peer into your private life.

      Worse, they will bill you for the privilege. "Subscribe to our convenient service! Which can auto-subscribe to more services! It's in the Terms of Service!" They expect to suck up all your data and have you pay for it.

  • by Dutch Gun ( 899105 ) on Sunday June 16, 2019 @01:58PM (#58772092)

    In the future, Microsoft may release what will essentially be a Clippy that works, because it will understand human context through AI.

    In the future? What do you think Cortana is? Or rather, what it's *attempting* to be?

    Regarding hand gestures... they're just too ambiguous, and they actually take more physical and cognitive energy than a simple button press. I predict the smartphone crowd will futz around with them for a year or two and then quietly abandon them, just like the videogame industry has (mostly). These days, waving your hands or a controller around is largely relegated to the occasional in-game trick or gimmick game. For general computing, ubiquitous voice command is slightly more clunky, but it does have the advantage of being a hands-free interface, and one which just about every person can intuitively understand.

    • For general computing, ubiquitous voice command is slightly more clunky, but it does have the advantage of being a hands-free interface

      You know when you're in an office and it's really annoying and hard to concentrate when people are talking? They're going to be talking all the time.

      • Well, no, not for office work, especially with open office designs. Voice-based automation / dictation has been possible for a very long time, and it's not really taken off in the workplace [dilbert.com]. But I think it makes more sense for home automation, or in blue collar workplaces for occasional computer interaction, where noise might not be such an issue, and where hands are often full or dirty from more physical work.

  • by Gravis Zero ( 934156 ) on Sunday June 16, 2019 @02:07PM (#58772138)

    If you think people are lacking in privacy now, just wait until everything in your house is spying on you. I can tell you already that this is a no-go for a lot of businesses because it's simply too easy to commit industrial espionage. Unlike people, business leaders know you can't trust other businesses because they already know what they would do with that kind of access. Governments are sure to outright ban these super invasive machines from their buildings.

    It's a neat concept but the reality is that it's a fucking security nightmare.

  • by Anonymous Coward

    Back in the early 2000s we used to call that "ubiquitous computing" but maybe that word sounds too spooky now...

    I for one welcome our ambient overlords vs. I for one welcome our ubiquitous overlords. Which one do we prefer?

    It's like videoprotection vs. videosurveillance.

    The question is, does ambient computing threaten our ambient privacy?

  • That there actually is no tiny Siri living inside my iPhone?

  • by mrwireless ( 1056688 ) on Sunday June 16, 2019 @03:03PM (#58772408)

    When Mark Weiser, the "father of the internet of things", laid out his 'Ubiquitous computing' vision in 1991, he said that his vision of embedding cheap, small computers in the things around us was a mix of two existing visions:
    - Mobile computing. The computing device moves around but is bulky (laptops)
    - Ambient computing. The computing is integrated in the space around us, but is fixed to one space for which it is highly optimised (like the 'cave' virtual reality rooms of the time).

    So it's a little strange to use, as the name for "Internet of Things 2.0", a term that was already the name for Internet of Things 0.5. But I guess we were due a new buzzword to rally around.

    Weiser's Ubicomp vision was a response to the then-popular idea of Virtual Reality: books like Neuromancer and movies like Johnny Mnemonic. We shouldn't move into the computer world, Weiser said; the computer should move into ours. It seems we oscillate between these ideas every couple of years.

    As Paul Dourish has argued, I think mobile computing has trumped both ideas. Ambient interfaces are notoriously bad at communicating their 'affordances': what you can do and how you can do it. In the end, the mobile phone as a universal remote with a predictable uniform-ish control scheme solved that issue.

    All in all, "the internet of things but even more integrated" is a pretty uncreative vision of what comes after the current crop of 'smart' devices. To me it's more likely that "Internet of Things 2.0" will be all about moving from cloud-first solutions to edge computing.

    _Sources:_
    Mark Weiser's text: https://duckduckgo.com/?q=mark... [duckduckgo.com]
    Paul Dourish's text: https://www.ics.uci.edu/~jpd/u... [uci.edu]
    Also, check out 'Situated Actions' by Lucy Suchman if you want to know why, already in 1990, people understood that gesture interfaces and situation prediction were always going to run into a wall of affordance miscommunication and edge cases.

    • "Ambient interfaces are notoriously bad at communicating their 'affordances': what you can do and how you can do it. In the end, the mobile phone as a universal remote with a predictable uniform-ish control scheme solved that issue." I was just thinking that mobile phones were well on the way towards eliminating affordances. Windows 8 was blasted for that kind of thing, but if you ask me, Android and iOS are also lacking in the affordances department. (All right, you didn't ask me, but no hard feelings;

  • by Psyko ( 69453 ) on Sunday June 16, 2019 @03:46PM (#58772556)

    Figure 90% of the populace that uses a smartphone or any other kind of device has no idea how it works, what's local vs. what's remote, or who owns what or where their data is anyway; the other 10% are pretty much like the people here: technology types who understand how these things work.

    This isn't a trend, and it doesn't need a fancy new name so someone can say "I COINED THIS!" It happens with any type of technology.

    How many people knew how a phone actually worked 20-30 years ago?
    Ask your parents if they knew what actually caused a phone to ring when someone called, or what the difference was between SxS, crossbar, or ESS in terms of how a call was connected.

    What about cars, people use some form of mechanized transportation almost every day. Go ask someone to name the 4 cycles of an internal combustion engine and what each of those does and how that causes your wheels to turn. (we'll leave pure electrics out of this).

    As technology advances and subsets of it become commonplace, people don't really care how things work, as long as they do. It's the engineers, developers, admins and support people (like the people reading this) that are the ones in charge of getting it there, for better or worse.

    We spend our time making these things accessible so that the average guy on the street doesn't need to know how something works to be able to use it effectively. Probably so they can focus on coming up with cool sounding names for things like "Ambient Computing".

    • What, four cycles? My lawnmower didn't have that! Two more cycles, and maybe it could have made its wheels turn, instead of making me push it.

  • Do these techbros realize that every plan they state they have for the future sounds like a dystopian nightmare to everyone else?
