Sentient Data Access

CowboyRobot writes "From Queue comes a piece subtitled Why doesn't your data know more about you? From the article: 'It has been more than ten years since such information appliances as ATMs and grocery store UPC checkout counters were introduced. ... A common language for these devices has not been standardized, nor have current database solutions sufficiently captured the complexities involved in correctly expressing multifaceted data. ... As computing devices expand from the status-quo keyboard and desktop to a variety of form factors and scales, we can imagine workplaces configured to have a society of devices, each designed for a very specific task. As a whole, the collection of devices may act much like a workshop in the physical world, where the data moves among the specialized digital stations. For our society of devices to operate seamlessly, a mechanism will be required to (a) transport data between devices and (b) have it appear at each workstation, or tool, in the appropriate representation.'"
  • by migstradamus ( 472166 ) * on Saturday December 20, 2003 @08:27AM (#7772663) Homepage
    I dunno. I'm not sure I want my cell phone to know where my browser has been.
    • It's your imaginary girlfriend you should be worried about... sorry, I meant your 'virtual' girlfriend... ;-s
    • That's exactly what you want...

      Your cellphone knows about what you've been looking at online, then when you're walking by stores it checks their websites (using bluetooth, and their bluetooth-AP) to see if there's anything there that you might be interested in.

      The cellphone doesn't hand the information out to a server to do the thinking, so there are no security risks (of that kind), and you can always slap a Bayesian filter on the whole thing to make sure that it learns what you're looking at online but
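
      A minimal sketch of the kind of on-device filter being described - every name is invented, and a crude word-frequency scorer stands in for the Bayesian filter - just to show that the matching can stay entirely on the phone:

          from collections import Counter

          class InterestFilter:
              # Runs entirely on the phone: browsing history never leaves the device.
              def __init__(self):
                  self.seen = Counter()   # word -> how often it appeared in pages you viewed

              def learn(self, page_text):
                  for word in page_text.lower().split():
                      self.seen[word] += 1

              def score(self, offer_text):
                  # Higher score = the offer overlaps more with what you've been reading.
                  words = offer_text.lower().split()
                  return sum(self.seen[w] for w in words) / max(len(words), 1)

          # The phone fetches a nearby store's offers over Bluetooth and scores them locally.
          phone = InterestFilter()
          phone.learn("reviews of digital cameras and memory cards")
          offers = ["20% off digital cameras this week", "buy one sofa get one free"]
          print(max(offers, key=phone.score))   # -> "20% off digital cameras this week"
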
      • That's exactly what you want...

        I want nothing of the sort.

        Your cellphone knows about what you've been looking at online, then when you're walking by stores it checks their websites (using bluetooth, and their bluetooth-AP) to see if there's anything there that you might be interested in.

        Your cellphone maybe, but for sure as hell not mine.

        In fact, I refuse to own one of the infernal things for multiple reasons, amongst which is the very thing you seem to desire.

        I'm quite happy with a world wide web. A wo

        • >> Your cellphone knows about what you've been looking
          >> at online, then when you're walking by stores it checks
          >> their websites (using bluetooth, and their bluetooth-AP)
          >> to see if there's anything there that you might be
          >> interested in.

          > Your cellphone maybe, but for sure as hell not mine.

          > In fact, I refuse to own one of the infernal things for
          > multiple reasons, amongst which is the very thing you
          > seem to desire.

          > I'm quite happy with a world wide web.
          • Sure, badly implemented, this thing could give out more information than you want it to, but you can't say that something that doesn't even exist yet has a design flaw.

            Most of us here have the good sense to realize that such a thing will be badly implemented because we live in a free country where commercial interests are the driving force behind most of the products we get on the market. Paid placement, tie-ins, cross-promotions, etc. will enable merchants to gather more and more information about us, whic
            • Most of us here have the good sense to realize that such a thing will be badly implemented because we live in a free country where commercial interests are the driving force behind most of the products we get on the market. Paid placement, tie-ins, cross-promotions, etc. will enable merchants to gather more and more information about us, which a large number of us feel that they are not entitled to, regardless of intent.

              "The good sense"??? Would that be "the good sense" as in the good sense of knowing tha

              • Sorry to burst your little bubble of optimism, but as it stands right now, we've given the henhouse to the fox. Between the credit card companies and the supermarket price-break/loyalty programs, most of our data is out there, with our names on it. We've gotten used to it - so when someone comes along with a great idea, I cynically assume that it will be co-opted by corporate America in an attempt to further collect reams of marketing data.

                You seem to assume that anonymity is a given, I assume things are going
          • what makes you think that a device finding and alerting you about good deals is evil?

            It will also be fully capable of alerting (kindest thanks for the word, it's a good one) people and instrumentalities other than myself. This is a Bad Thing, and no amount of persuasion by the folks who design, build, and sell such devices will ever convince me that there will never ever be any kind of back door (nevermind wide open FRONT doors) that will permit people I don't much like to acquire and use the informa

      • I was thinking of bluetooth too. If it were incorporated (or RFID tags were used) in a supermarket, you could do away with the need for checkout assistants. All you'd need to do is put your shopping on a conveyor belt, pack it at the other end and insert your credit/debit card - or there could be a machine you insert cash in that gives you change.
    • Why should one organization bother making their devices talk to others, when a backlash from privacy advocates could make the effort costly, disgraceful, and futile?

      If you're a privacy advocate, the current somewhat insular world of independent devices is best. Our refrigerators do not talk to our phones. We have some control over our lives.

      However, there are some times when I wish some of those devices were more 'open'. My grocery store UPC checkout counter knows a lot about me. Since I don't eat out an
  • XML (Score:4, Funny)

    by skinfitz ( 564041 ) on Saturday December 20, 2003 @08:33AM (#7772672) Journal
    Isn't this what XML is for? Communication of any data types?
    • Re:XML (Score:3, Insightful)

      by AndroidCat ( 229562 )
      That's to start with. Then you have to figure out how to have your data follow you around like Joe Mitzblitzfligx's bad luck cloud. (Bad spelling, 'Lil Abner ref.)

      Let's see.. You have an appointment in the building and get lost. When you walk up to a wall display, it (without asking) shows you a map and path to get to where you want to go.

      Think about what would be required to make that trick work -- then worry about the security problems.

    • by SandHawk ( 15347 ) on Saturday December 20, 2003 @09:15AM (#7772757) Homepage
      XML only solves the first problems of data merging, like making the formats easy to parse correctly and using the right international character set.

      RDF/XML solves a bit more of the problem, making the structure of the information clear, in terms of assertional statements. An RDF/XML file is a knowledge base, full of statements saying this has some particular relationship to that. It lets the machines get at more of the information in a uniform, universal way.

      But still, the problem of ontology/schema/vocabulary mapping remains: if one system is talking about patients and another is talking about clients, they might or might not really be talking about the same thing. A single person maybe never counts as two patients but sometimes counts as two clients, etc. At least with the data in RDF, most of this mapping can be done in software once a person figures it out and expresses it in a suitable logic language.

      The emerging design of the Semantic Web hopes to make that reasonable, but also to support convergence on common vocabularies by having everything on the web -- if it's trivial to see what vocabularies are already being used, people will mostly only make new ones when the old ones really are different.

      Other hard problems remain, of course, like figuring out which data sources to trust. Fun fun.
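
      A toy example of what those assertional statements look like in practice, using Python's rdflib (the clinic vocabulary and URIs here are made up for illustration):

          from rdflib import Graph, Literal, Namespace, URIRef

          # A made-up vocabulary; a real deployment would converge on a shared one.
          CLINIC = Namespace("http://example.org/clinic#")

          g = Graph()
          alice = URIRef("http://example.org/people/alice")
          g.add((alice, CLINIC.hasName, Literal("Alice Example")))
          g.add((alice, CLINIC.treatedBy, URIRef("http://example.org/people/dr-bob")))

          # Serialize the same statements as RDF/XML: a small knowledge base,
          # full of statements saying this has some relationship to that.
          print(g.serialize(format="xml"))
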
  • Duh (Score:5, Funny)

    by arvindn ( 542080 ) on Saturday December 20, 2003 @08:34AM (#7772674) Homepage Journal
    "Why doesn't your data know more about you?"

    That sounds like a bad Soviet Russia joke ;^)

    • Re:Duh (Score:1, Funny)

      by Anonymous Coward
      All your beowulf clusters of Natalie Portman, naked and petrified, with hot grits down her pants, are belong to us !! - Your Data
    • Just put versions of Windows CE in all those devices. See? The problem HAS already been solved.

  • Hmmm (Score:2, Insightful)

    by ziggyboy ( 232080 )
    With so many standards running around and devices intentionally not complying with them, I doubt this would kick off in the near future.
  • Verge of Future? (Score:3, Interesting)

    by landrocker ( 560567 ) on Saturday December 20, 2003 @08:42AM (#7772691) Homepage
    So, today are we getting excited about tech converging (eg. your phone [sonyericsson.com]+camera [nikonusa.com]+pim+kitchen-sink [gnu.org]) or are we getting excited about the tech diverging into hundreds of specialised interconnected devices?

    With all the 'innovation' these days it's getting hard to keep track ;)

    Landrocker
    • I, for one, am excited about both. Computers are becoming more capable all the time at specific tasks that tend to be difficult to implement. At the same time, easier tasks are continually being refactored and converging on more general solutions. I find both useful.
  • Get on the boat (Score:2, Informative)

    by ObviousGuy ( 578567 )
    It's called Universal Plug and Play and despite appearances it is gaining popularity.
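
    For the curious, the discovery step of UPnP (SSDP) is just a multicast "who's out there?" over UDP; a minimal sketch, assuming you're on a LAN with UPnP devices that answer:

        import socket

        # SSDP discovery: the multicast search that starts UPnP's plug-and-play.
        MSEARCH = "\r\n".join([
            "M-SEARCH * HTTP/1.1",
            "HOST: 239.255.255.250:1900",
            'MAN: "ssdp:discover"',
            "MX: 2",            # devices may wait up to 2 seconds before replying
            "ST: ssdp:all",     # ask every UPnP device on the LAN to respond
            "", "",
        ])

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(3)
        sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))

        try:
            while True:
                data, addr = sock.recvfrom(8192)
                print(addr, data.decode("ascii", "replace").splitlines()[0])
        except socket.timeout:
            pass
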
  • by AndroidCat ( 229562 ) on Saturday December 20, 2003 @08:46AM (#7772698) Homepage
    Granted that they were writing to a narrow audience, but the style is pretty opaque unless you spend time boiling it down like maple sap to get the meaning.

    I guess you can't start out a scientific paper with "Hey, wouldn't it be cool if..", but their paper really needs it.

  • by joethebastard ( 262758 ) on Saturday December 20, 2003 @08:53AM (#7772707)
    i'm not an expert on the subject.... but, at least in the case of devices like ATMs, which have fairly simple tasks, how would we benefit from a standardized language? i put my PIN in, money comes out, my bank account balance goes down. the elegance of the code behind it doesn't concern me.

    i know that "security through obscurity" is a cheesy solution, but i can't help thinking that if every ATM in the country had the same architecture, the system as a whole would be more prone to hacks and abuse. what do you think?
  • Middlemen (Score:5, Interesting)

    by foniksonik ( 573572 ) on Saturday December 20, 2003 @08:58AM (#7772716) Homepage Journal
    What you have described is modern day bartering... everyone has their own unit of measurement and everyone is willing to negotiate.

    Until the marketplace demands a standard, businesses will continue this behavior because it is more profitable in the near term... individuals almost always pay more than conglomerates, which is the nature of a trading company that can use its 'purchasing power' to lower the price for goods or services. So as long as the companies are dealing 'direct' with you the consumer, they can ask for whatever service charge you can bear as an individual... compared to credit unions, who get much much better deals as an organization.

    So basically it's in all companies' best interest to avoid organized clientele or employees as long as possible in order to maximize profits from low overhead and high margins. Information technology doesn't change this strategy; it just adds new levels of complexity.
  • Misconception (Score:5, Insightful)

    by Tom ( 822 ) on Saturday December 20, 2003 @09:00AM (#7772723) Homepage Journal
    From Queue comes a piece ...that shows clearly just how little the author knows about computing.

    The entire point of computers is that they are general purpose devices. The "workshop" idea surely sounds cool to someone who doesn't know about computers, because it resembles the world before "general purpose" was a graspable concept.

    Would I rather want my workplace to be a collection of specialized devices, or a single device that can be configured to be any of the others, plus whatever else or new is necessary? Now that's a difficult question, right?

    • Re:Misconception (Score:1, Interesting)

      by Etiol ( 672326 )
      You mean, do I want Word or texttools?
    • Re:Misconception (Score:3, Interesting)

      by Jeff DeMaagd ( 2015 )
      It takes a lot for general purpose to do better than specialized devices.

      Take TiVo for example. Sure, there are a lot of free and non-free software products to make a general purpose computer behave somewhat similarly. For one, I don't know if any of this software can set the computer to wake up at a particular time slot. There is simply no API to communicate to the BIOS or any other hardware to do this. Being able to sleep and wake up on a timer is important to energy efficiency; I doubt a TiVo has a
      • It takes a lot for general purpose to do better than specialized devices.

        Only in edge areas. In mainstream, there is enough incentive to improve the general purpose device.

        Your virtual VCR is a great example. All the software I've seen, be it MythTV or Freevo or others, far surpasses any physical VCR I have ever seen when it comes to functionality.

    • Re:Misconception (Score:2, Insightful)

      by dollar70 ( 598384 )
      The entire point of computers is that they are general purpose devices.

      Yes, but when people go to "Best Buy" or "Dell" and ask for a "computer" these days, what they're really asking for is an interactive television. They don't understand the concept of operating systems or programming, they just identify those things as buzz-words that make you look smart if you use them correctly in a sentence.

      Would I rather want my workplace to be a collection of specialized devices, or a single device that can be c

    • I can assure you that given the choice I'd rather have a workshop of specialized devices than one device that is mediocre at doing everything.
    • The two are not mutually exclusive. Fortunately, nobody is forced to choose. Embedded processors are everywhere. Toasters, cars, cell phones. I sure wouldn't like my ABS to fail because I ran out of disk space while downloading MP3s. It is also nice to have a machine that plays MP3s, lets me edit documents, write programs and play chess with me.

      But even on a general purpose device, the problem of information sharing across applications remains. That's a difficult problem

    • It isn't so much that computers are "general purpose" devices - they are, but they are so, so much more.

      Most people, and that includes a fair number of geeks as well - don't understand what a computer truly is. It is a machine which runs software. But that isn't quite it, either. Computers are software made physical - and this is the crux of it all.

      You see, software is nothing more than ideas and algorithms expressed as a special series of ones and zeros, interpreted sequentially. This sequential nature i

  • RISKS Hell? (Score:5, Insightful)

    by localroger ( 258128 ) on Saturday December 20, 2003 @09:06AM (#7772733) Homepage
    This is a nightmare scenario for anyone who is familiar with how data systems fail. I once had a credit agency pick up a very old P.O. box I hadn't used for years and suddenly decide it was my current address, so all my mail from them went into a black hole; this bad address propagated through the credit world for nearly a year, during which I had to call regularly and request copies of bills and get the address changed back.

    In the system described here, once bad data gets into your microwave oven there's virtually no way to chase down all the instances of it that will be floating around the universe. Didn't Sandra Bullock star in a movie about this once?

    • And then there was the sad case of Mr. Buttle in Brazil.
    • Re:RISKS Hell? (Score:4, Insightful)

      by Bronster ( 13157 ) <slashdot@brong.net> on Saturday December 20, 2003 @09:29AM (#7772788) Homepage
      In the system described here, once bad data gets into your microwave oven there's virtually no way to chase down all the instances of it that will be floating around the universe. Didn't Sandra Bullock star in a movie about this once?

      In a better designed version of the same thing, where everything contained a link to the canonical version of the information and possibly cached it for a sane length of time, this wouldn't happen - you would update your current address, and suddenly _everyone_ who had a copy of the canonical location would have the new value.

      Add a little strong crypto - unguessable URIs for data, and possibly even encrypt the value of the field for each entity that's supposed to have a copy, in such a way that they can't leak the URI without you knowing who sold your information.
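
      A toy sketch of that design (every name here is invented): one authoritative record, plus a distinct unguessable, HMAC-derived URI handed to each entity that's allowed a copy, so an update propagates on the next read and a leaked link identifies who leaked it:

          import hmac, hashlib, secrets

          class CanonicalRecord:
              # One authoritative copy; everyone else holds only a link to it.
              def __init__(self, value):
                  self.value = value
                  self._key = secrets.token_bytes(32)   # known only to the record's owner
                  self._grants = {}                     # token -> entity it was issued to

              def update(self, value):
                  # Every holder of a link sees the new value on their next read.
                  self.value = value

              def grant(self, entity):
                  # Unguessable, per-entity URI component: leaks are attributable.
                  token = hmac.new(self._key, entity.encode(), hashlib.sha256).hexdigest()
                  self._grants[token] = entity
                  return "https://data.example/record/" + token

              def read(self, uri):
                  token = uri.rsplit("/", 1)[-1]
                  if token not in self._grants:
                      raise PermissionError("unknown or revoked link")
                  return self.value

          address = CanonicalRecord("12 Old Street")
          bank_uri = address.grant("bank")     # the bank gets its own link
          address.update("7 New Road")         # change your address once
          print(address.read(bank_uri))        # -> "7 New Road"
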
      • Re:RISKS Hell? (Score:1, Interesting)

        by GMontag ( 42283 )
        In this case the data is not the problem; it is the humans who make the rules for what data "counts" and who to believe.

        I have been having a similar problem with one credit bureau. From numerous conversations I have concluded that the subject of the report is the last person to be believed.

        Similar to the earlier post, the CBs had my address wrong, along with many other things. I called them to correct the obvious typo, they wouldn't unless I sent them a utility bill. So I sent the same utility bill that
    • Re:RISKS Hell? (Score:3, Insightful)

      by svanstrom ( 734343 )
      That's why you need to be more involved in what data they have about you, and currently there are no gadgets good enough to do that, although the cellphone's got potential to become one.

      Direct contact between a gadget of yours and the company that needs the information... a portable database that keeps track of who's got what information about you, and what information they are allowed to get from you.

      All that's basically needed is the cellphone, an open XML-based standard and a way to sign the data; not
      • untrusted companies (like Microsoft).

        Nonsense, Verisign certifies that Microsoft can be trusted. :^P

        • untrusted companies (like Microsoft).


          Nonsense, Verisign certifies that Microsoft can be trusted. :^P


          Not that we can trust Microsoft itself, just their IIS-servers... hmmm... I can't figure out what's worse, trusting M$ or trusting IIS (or trusting Verisign)...
  • by rcastro0 ( 241450 ) on Saturday December 20, 2003 @09:34AM (#7772798) Homepage
    Did anybody try to read the article? Holy, that is the type of logic that drove me away from social sciences. And the authors seem to be computer science guys!

    Let's see what this is all about:

    1) FIND AN OBVIOUS TREND
    We think microprocessors are spreading everywhere, and see/predict them doing a lot of things, including communication

    2) GIVE IT A SOPHISTICATED SOUNDING NAME
    I think I will call it... UbiComp (ubiquitous computing)!

    3) ELABORATE ON WHAT NAMED TREND WILL IMPLY
    Computers will be everywhere. People will talk to them. They will talk to people... they will talk with each other! (claps)

    4) WRITE ABOUT WHY IMPLICATIONS DIDN'T HAPPEN
    "New forms of interaction must be developed for this environment (...)"

    5) PEPPER IT ALL WITH UNBEARABLY OBSCURE PHRASES
    "Thoughts exchanged by one another are not the same in one room as in another. This includes "thoughts" exchanged between people and/or machines, and implies that behavior is sensitive to location, and as a consequence of mobility, must adapt to changes in physical and social location." Make sure you make references to lots of other authors and experts.

    6) RELEASE TEXT TO A "WANT TO LOOK INTELLECTUAL" AUDIENCE
    Which will pretend this is the smartest piece of writing ever, and that the uninitiated are simply not smart enough to understand it.

    No thanks, I think I can do without concepts like UbiComp.
    • At first, I thought they were talking about computer chips in our underwear, but I guess that would be SubUbiComp.

      They missed out on using orthogonal paradigms, but at least they didn't call it ClippyWear.

    • Don't forget...

      7) Insert complicated-looking but essentially meaningless 'diagrams'

      Fig 1 is fucking hilarious.
      • Fig 1 is fucking hilarious.

        Apparently he's upset that so much is being spent on monitor-sized displays? I'd really like to know where he got his "figures" though, and at what point research into display technologies was limited to a specific size and couldn't be applied to any others.

    • Moderators, mod parent UP!

      Oh, already "5 Interesting".

      Thanks for showing that Slashdot can be a force for sanity. This kind of bullshit pseudoscience drives me up the wall.

      But look a little deeper at the article and you will see that the initial superficiality in fact hides a deeper and much more stunning superficiality.

      Some of the background inventions are truly stunning. The "Removable Media Metaphor" lets you carry data from one place to another on, wait for it, physical objects. This is incredibl
    • There are idiots everywhere. Just because the article was a piece of trash, don't rag on Ubiquitous Computing. It's a good idea -- force computers to become smart enough that they (and their interfaces) either disappear from our perception and/or become at least somewhat self-explanatory, then make them portable enough that we can use them wherever we are. Ubiquitous Computing, at its roots, strives to make computers reliable and portable so that getting data into or out of them is no longer a cognitiv
    • I agree wholeheartedly with your observations. Welcome to the world of academic publishing. Since having articles published is seen as being very prestigious in higher learning institutions, I suppose there is great incentive for "talking up" even the most known, mundane, and predictable trends.
    • The late Mark Weiser invented the term "Ubiquitous Computing" to describe what he calls: 'the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing.'

      Ubiquitous Computing argues that technology should not be a juggernaut, the center of attention, the giant blinking home-theatrical shrine set up in the middle of the living room that the whole family sits around and worships while they tan from its glowing radiation. It shoul

  • :: For our society of devices to operate seamlessly, a mechanism will be required to (a) transport data between devices and (b) have it appear at each workstation, or tool, in the appropriate representation.'" :: ... And that's when SkyNET turns itself on and the TerminAshcrofts start tracking you down. Anyone using cash will be labeled suspicious... I don't think I trust anybody with this much information.
  • it's big brother man. Watch out!
  • ...have been around for almost 30 years.
  • by Featureless ( 599963 ) on Saturday December 20, 2003 @09:49AM (#7772822) Journal
    It's simple: There are not many good reasons for it to happen, to any greater extent than it already is.

    Frankly, this just stinks of that old chestnut about interconnected toasters and refrigerators and power drills sharing data seamlessly on a home network. I was never quite able to get thrilled about this kind of thinking when I first heard it, and it rings more hollow every time I've heard it since, which is about a million times, over decades.

    I think the big problem here is that there isn't much of a problem to solve. I'm sure that, when we have even more portable and ubiquitous computers and communication all around us, we'll just be deluged with new applications for it that we just can't quite think of right now - but we don't have them yet. And you can't chalk it all up to "technology isn't ready yet." No, I think it's related to demand, more than supply.

    As far as I can tell, there aren't many killer apps that fall well under this umbrella, and those few that there are can't begin to justify the expense of the hardware and software involved, now, or probably for another decade or two.

    One thing that always gets me about this line of thinking is that even the examples they lead with tend to be uninspiring and ridiculous: ATMs and grocery store checkouts sharing programming languages and databases? Complicating the "workplace" with converged, general-purpose computing solutions by littering it with specialized information tools? Come on, guys, this is freakin stupid. Does standardizing on a single end-all-be-all computer language, OS, or database sound like a good idea to anyone? Or particularly original? What about "un-converging" to any greater extent than we already are? Or is there some new information tool that will change everything?

    I'm sure as soon as someone actually has a real idea that's plausible enough for science fiction, we'll all get excited about being the first to make it happen.

    The article does hint at a few more interesting things; that hierarchical filesystems may be overrated and due for reexamination as the bedrock of computing (although truthfully this is already well in progress - PalmOS? Newton?), that we might see more kiosk or application-specific computers... more "specialized devices" solving problems out in the world... now selling tickets, now portable computerized maps giving directions, perhaps more active displays "everywhere," primarily driven by advertising, but perhaps justified by various underlying civic duties, and location-based computing is undoubtedly going to be more important, as it finally becomes cheap enough to be a factor...

    But these are all just hints. Barely that.

    But overall I find this to be just another valueless futurist rant, devoid of real ideas, coasting on buzzwords and hype, and basically irrelevant to anyone seriously thinking about the future... or at least, nothing you haven't heard before a million times.
    • The idea reminds me a bit of the Java-based software 'agents' we briefly studied in an undergrad distributed computing course I did .. hmm .. probably eight years ago. In fact the whole idea sounds a lot like much of the rationale behind Java in the first place. Each device runs a 'common language' (Java), and the network allows special-purpose software tools (agents) to travel through the network and run on the general-purpose tool/device (or agent clients using some sort of RPC to the server module throug

      • Pretty astute, except for the notion of Java being killed or dead. You might mean Java Applets.

        Java itself is more successful and widely deployed than .Net by orders of magnitude - in addition to the fact that it targets many markets .Net does not, or not meaningfully (embedded, phone, etc).

        Java has in many cases replaced C++ as the language taught in universities around the world. At that milestone, it is unlikely to be "killed" in our lifetimes.
  • More than 10 years? (Score:3, Informative)

    by dcw3 ( 649211 ) on Saturday December 20, 2003 @09:50AM (#7772823) Journal
    "It has been more than ten years since such information appliances as ATMs and grocery store UPC checkout counters were introduced"

    Try 30+ years for UPC. They came about back around 72 at Krogers in Ohio. And ATMs...at least 20+ years, but that Google is left as an exercise to the reader.
  • by SmallFurryCreature ( 593017 ) on Saturday December 20, 2003 @09:51AM (#7772825) Journal
    Let me explain. Since we've all seen ATM machines with Windows error screens, I think it is fair to presume that the computer inside is not exactly small. It is not like, say, a pocket calculator where every bit counts.

    So why oh why does it remind me to take the receipt when I told it no receipt? It is not printing one out so somewhere in its memory a flag is set. So why can't the last message be adjusted to reflect it? It is a very simple thing to do. I think you learn this kind of thing in the second chapter of any programming book.

    I think until such simple things are realised (I've seen people waiting for a receipt that is not going to appear, making the throughput of the machine slower, which is not good on a busy shopping day) we can forget such machines ever becoming even the slightest bit intelligent and, say, using your name to greet you. Let alone being able to give you, say, access to your bank statements for that month to see if the rent has already been deducted.

    Oh, and this behaviour is spotted on ATMs in Holland in several different models belonging to different banks (our cards work with all banks).

    • I think until such simple things are realised we can forget such machines ever becoming even the slightest bit intelligent and, say, using your name to greet you.

      I've seen several ATMs greet me by name.

      The funniest thing is, I saw this once or twice after I changed my name with the bank, before I got my new card... so I know that they weren't getting my name off of the bank records, but off of the raised plastic letters on the card (or possibly the magnetic stripe, but those usually carry little else than an
  • by mumblestheclown ( 569987 ) on Saturday December 20, 2003 @09:53AM (#7772829)
    Slow day on slashdot? Why else has somebody posted a mediocre business-school masters "IT management" "research" quality humdrum thesis topic as an interesting article?
  • Applications like this exist, but they are not accessible to the general public. It's easy to imagine that with the long-time existence now of "plus cards" at grocery stores, their users have been tracked down to the individual M&M bag, in terms of buying habits. The store can tell you what you're most likely to buy next, and print out coupons for those items.
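
    A toy version of that trick - the names and data are invented - counting what each loyalty card buys and couponing the most frequent item the shopper hasn't bought lately:

        from collections import Counter

        # Purchase history keyed by loyalty card; in reality this lives in the store's database.
        history = {"card-123": ["milk", "m&ms", "milk", "salsa", "milk", "m&ms"]}

        def next_coupon(card, recent):
            counts = Counter(history[card])
            candidates = [(n, item) for item, n in counts.items() if item not in recent]
            return max(candidates)[1] if candidates else None

        print(next_coupon("card-123", recent={"milk"}))   # -> "m&ms"
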
    • The store can tell you what you're most likely to buy next, and print out coupons for those items.

      I dunno about that. The coupons they usually give me are for products that vaguely compete in the same arena as what I buy, but are inferior quality. Like, I'll buy "Healthy Choice" canned soup, and get a coupon for "Chef Boyardee" canned pasta.

      Plenty of times I've handed the coupon back to the checker in disgust.
  • by AndroidCat ( 229562 ) on Saturday December 20, 2003 @10:10AM (#7772859) Homepage
    In spite of lofty ideals, we know what the application would look like: Imagine the worst qualities of Clippy, Talky-Toaster, and Genuine People Personalities, stir in some 1984 and Brazil.
  • by rwa2 ( 4391 ) * on Saturday December 20, 2003 @10:18AM (#7772869) Homepage Journal
    It's under development under a couple of different names.

    Unfortunately, this kind of thing still starts in the military world. The DoD has been developing requirements for Network Centric Warfare [defenselink.mil] (NCW). Basically turning warfare interfaces into an RTS game like StarCraft, C&C, complete with fog-of-war, semi-autonomous units, comm & data sharing, etc. On the technical side, this is manifesting itself as Command, Control, Communications, Computer, Intelligence, Surveillance and Reconnaissance [osd.mil] (C4ISR) architecture. One of the first actual implementations is being worked on in the form of Future Combat Systems (FCS).

    These are complex systems, so the DoD has been maturing development of modeling & simulation interoperability by making contractors adhere to High Level Architecture [dmso.mil] (HLA) so they can properly analyze these systems before deploying them. HLA basically provides a lot of the same data object registration, distribution, and interfaces that older tech like CORBA does, with extra simulation concepts.

    These technologies are being commercialized under the buzzwords "Network Centric Operations" (NCO) and "Network Enabled Operations" (NEO). Advocates usually point to well networked operations like Wal-mart, UPS, et al. as poster children for what could be done (automatic restocking, package tracking, load balancing & route optimization, etc.) with enough NEO infrastructure. A lot of the interchange standards (including C4ISR) are getting established through bodies like the OMG [omg.org]. Other than the interchange standards, there's not all that much new tech involved... maybe RFIDs and various other networking tech (grid/mesh networks, strong encryption/authentication, mobile IP, etc.). Most of it just involves looking at technology that already exists and figuring out how to piece it together to actually do something worthwhile.

    Disclaimer: I work for one of the gov't contractors throwing all these buzzwords around.
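
    Not HLA itself, but a toy publish/subscribe bus gives the flavor of the "data object registration, distribution, and interfaces" these middlewares provide (all names invented):

        from collections import defaultdict

        class Bus:
            # Toy publish/subscribe in the spirit of HLA/CORBA-style object distribution.
            def __init__(self):
                self.subscribers = defaultdict(list)   # object class -> callbacks

            def subscribe(self, object_class, callback):
                self.subscribers[object_class].append(callback)

            def publish(self, object_class, attributes):
                for callback in self.subscribers[object_class]:
                    callback(attributes)

        bus = Bus()
        # A display "federate" declares interest in vehicle tracks...
        bus.subscribe("VehicleTrack", lambda attrs: print("display sees", attrs))
        # ...and a sensor publishes updates without knowing who is listening.
        bus.publish("VehicleTrack", {"id": 42, "lat": 51.5, "lon": -0.1})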

    • Re: Wal-Mart (Score:3, Interesting)

      Advocates usually point to well networked operations like Wal-mart, UPS, et al. as poster children for what could be done

      And that, ladies & gentlemen, is the reason that Wal-Mart will *own* the entire retail market in the US within the decade. They already get $1000/yr from every man, woman, and child in the US.

      I did some consulting for a niche retailer last year. After assessing their current technology, I unilaterally recommended that they copy Wal-Mart in every one of their IT decisions. I even

    • No, you're wrong about HLA. Firstly, HLA provides NO SIMULATION CONCEPTS at all, as it is designed in a generic way that contains no concepts of simulations at all. Secondly, HLA does a helluva lot LESS than CORBA, HLA does not even provide a system for distributed computing or calling remote procedures. All HLA does is provide a standardized way of describing the data framework for networked applications - in other words, a standardized way of describing what the content of network packets will be. There i

      • Well, there's still quite a gap between what HLA was intended to do, and what it actually does. And I suppose I really meant HLA+RTI in order to cover the finer details of simulation time synchronization and such. Here's a good pitch (ppt, unfortunately) comparing HLA vs. CORBA [unm.edu]

        The original HLA spec (up to 1.3) as defined by the DoD was kind of nebulous, so much so that the industry groups had to create their own HLA spec that was actually practical/implementable/useful (IEEE 1516). This page provides

  • Project Oxygen @ MIT (Score:4, Informative)

    by G4from128k ( 686170 ) on Saturday December 20, 2003 @10:26AM (#7772880)
    Project Oxygen [mit.edu] is much closer to achieving the article's goals, at least in terms of cutesy demos (see the video clips at the preceding link). The Oxygen project's goals are a bit different from the article authors' goals. Oxygen is more concerned with consumer/business office environments than with the article's emphasis on an automotive designer's needs.
  • Middleware. (Score:4, Insightful)

    by Moderation abuser ( 184013 ) on Saturday December 20, 2003 @10:33AM (#7772898)
    Choose your poison:

    Email, nntp, IM, xmlBlaster, Jabber, MQSeries, SonicMQ, SwiftMQ, Softwired iBus, Jiiva RapidMQ, ICM etc etc etc etc...

    What we need to do is write *more* message systems. In fact, let's have *everyone* do one.

    The real problem is standardisation. The situation is a bit like networking protocols before TCP/IP became all-pervasive. Each vendor has their own system and is happy to charge you an arm and a leg to connect it up to everything. You then have the same problem with information definitions and formatting, but XML and schemes like RosettaNet are gradually solving that one.
  • I have thought about how to implement this and found that Escher's Print Gallery brought me to my knees ... why? Here is the story in brief ...

    a workflow that involves a myriad of data types, including:

    • two-dimensional concept sketches;
    • computer-rendered images;
    • animations and movies of cars in various environments;
    • 3-D clay and computer models at various scales;
    • interior textures and fabrics;
    • and engineering data.

    The basic problem is to be able to show the data in 1D, 2D and 3D. Then, there are pseudo dimensions that give rise to 1.5D, 2.5D, and finally the element of Time T has to be taken into each of these spaces. The crux of the problem is to maintain continuity of "something" that flows between each of these spaces - often in an iterative and recursive fashion. This something can be abstracted as an object (which I call the Bubble, hence my domain name BubbleUI!) and the authors say

    ... environment can be conceptualized as running an object-oriented simulator in which each computational element is abstracted into an object. Objects dynamically enter and leave the environment .... We envision a usage scenario that involves coordinated use of all these terminals. While they are all interconnected at the systems level, from the user's perspective, a seamless mechanism for transporting work from one device to another is highly desirable.

    To be able to visualize this, the best I can do is suggest that you look at the Print Gallery by M.C. Escher [artchive.com], and here [uvm.edu]. Just imagine that the paintings in the Gallery are not static paintings, but are actually windows looking into the Real World. As the Real World is dynamic, when you revisit a given window, it is possible that things might have changed. Then you will have a good idea of what your mind has to get a handle on, before a user can have "sentient data access."

    The concept of visualizing the Prints in the Print Gallery as Windows is not too off-base, because the Article describes that there is a desire to integrate the physical with the visual ....

    An advantage to using bar codes is that we can also integrate physical assets into our system.

    And the article also says that there are more than just Static Screens that have to be incorporated

    The different tasks in this workflow are typically performed

    • by different people,
    • at different locations,
    • and often using very different and specialized hardware and software.

    So to accommodate the above requirement, imagine now that there is not a single Spectator in the Gallery, but there are many people looking at many "Windows" at the same time. And like in real life, these Spectators interact with each other inside the Print Gallery (FIGURE), just as the Real World visible from the Windows is interacting in the background (GROUND).

    After all is said and done, the conclusion that I came up with in the 1st draft of my doctoral thesis (which was rejected; I then approached the subject differently, and that version was accepted) was that the Glue to bind it all is the Cognition of the User - i.e. PortfolioBrowser==User

    The glue that binds our diverse collection of terminals, containers, and identifiers is a software infrastructure we call PortfolioBrowser. ... the design of our PortfolioBrowser embraces our fundamental goal of minimizing transaction costs at all times, throughout the entire system.

    The User, in my conception, is the PortfolioBrowser. And because of this choice at the center o

  • The way they're talking, these databases/whatever are aware of their surroundings.... not self aware.

    People throw around "AI" and "Sentient" too much when describing software when in fact the software is nowhere close to that.
    • People throw around "AI" and "Sentient" too much when describing software when in fact the software is nowhere close to that.

      The best description of "AI" that I've ever heard came from my undergrad AI prof. He said that AI is any system that you don't understand. You can craft the spiffiest neural-net-genetic-alg-self-modifying-rule-based-whateverthehellyouwant system, and as soon as you explain the algorithm to someone, they'll say "Oh, I get it. So that can't be AI."

      Of course, by that logic, ATM
  • Why doesn't your data know more about you?

    Why else would buying CAT food at PetSmart get me an email to watch the Thanksgiving Purina DOG show on TV?
  • Neat (Score:3, Interesting)

    by Zebra_X ( 13249 ) on Saturday December 20, 2003 @12:34PM (#7773329)
    I think "Sentient Data Access" is a bit of a misleading title. What we are talking about here is real world workflow with a set of devices/platforms that are capable of supporting realworld work flow. XML is not the answer as so many have suggested - but a schema much like the BPML and BPEL that *uses* XML and schemas would be required to support this sort functionality.

    The article speaks specifically to coordinating the transfer of specific data and instructions between devices in a real world environment. Though in most cases, the instruction could be context sensitive; i.e., if you walk into a Vision Dome with a particular bar code scanned, it could be surmised that you want to view that bar code/layout/car within the dome.

    Even though the article chastises the world for not having accomplished this yet, the reality is that this sort of thing could be implemented today with current technologies. Also, the platform could easily use future technologies if designed correctly.

    To build such a thing would require an extensible way of defining a process, much like VoiceXML, BPML, BPEL. It would also require the ability to define the exchange of data and, more importantly, the device/location/communication channel that the data will be coming from. And finally, it would require a way of easily defining the execution of a process. The last component is really the challenge. Every software package that would participate in this type of environment would need to "listen" for requests and messages that are coming from devices/other systems. Indeed, this sort of plumbing is not hardware, but software. As such it would also need to be supported by every operating system, handheld device, and embedded system to be properly integrated into the world at large.

    So I say build the language then build the engine.
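
    A toy of what "build the language then build the engine" might mean - the process format and every name below are invented for illustration, not BPML/BPEL:

        import xml.etree.ElementTree as ET

        # An invented, minimal process language: which device event triggers which action.
        PROCESS = """
        <process name="design-review">
          <step on="barcode-scanned" device="vision-dome" action="load-model"/>
          <step on="model-loaded"    device="vision-dome" action="notify-team"/>
        </process>
        """

        class Engine:
            # Listens for device messages and runs whichever step they trigger.
            def __init__(self, xml_text):
                self.steps = ET.fromstring(xml_text).findall("step")

            def on_message(self, event, device):
                for step in self.steps:
                    if step.get("on") == event and step.get("device") == device:
                        print(device + ": running " + step.get("action"))

        engine = Engine(PROCESS)
        engine.on_message("barcode-scanned", "vision-dome")   # -> vision-dome: running load-model
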
  • My ATM, from LaSalle Bank (owned by a big Netherlands co.), has a lack of 'knowledge' that bugs me every time I use it.

    The first question it asks me is whether I want to work with it in Spanish or English... couldn't it remember that from the card? I'm not likely to suddenly forget English. (I did run through the whole thing once in Spanish, just for kicks).

    It should know which account I tend to take cash out of, and how much, and highlight those options.
  • by HisMother ( 413313 ) on Saturday December 20, 2003 @01:32PM (#7773661)
    Here's something that drives me insane about most ATM machines. When you put your card in, the first thing it does is ask you what language you want it to speak. It's nice, I suppose, that the machine will accommodate speakers of other languages. Why, though, does it ask me this question every single freaking time? Is a French speaker going to feel like using the ATM in Spanish on some days? Is an English speaker going to suddenly forget English and revert to Vietnamese? Why in tarnation doesn't the machine remember this one little bit of information about me and not bother me with that same stupid question again? I speak English, dammit -- don't ask me about Urdu!

    It's almost an anti-security device, too. If a French-speaker has their card stolen by an English-speaker, and the ATM only prompted in French, that would be at least a little bit of a deterrent to illicit use, wouldn't it?

    It's crazy to talk about a universally connected web of smart data when the individual machines are, even after years of evolution, so profoundly stupid.
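
    The fix being asked for is a one-line lookup; a toy sketch (the card IDs and the preference store are invented, and a real ATM would keep this server-side):

        # Remember each card's language choice instead of asking every single time.
        preferences = {}        # card id -> language code

        def ask_language():
            return "en"         # stand-in for the on-screen prompt

        def greet(card_id):
            language = preferences.get(card_id)
            if language is None:
                language = ask_language()      # asked once, on first use
                preferences[card_id] = language
            return {"en": "Welcome back", "fr": "Bienvenue", "es": "Bienvenido"}[language]

        print(greet("4111-xxxx"))   # first use: asks, then remembers
        print(greet("4111-xxxx"))   # later uses: straight to "Welcome back"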

  • As usual... (Score:3, Informative)

    by voodoo1man ( 594237 ) on Saturday December 20, 2003 @01:42PM (#7773698)
    John McCarthy came up with the same idea (and invented a better XML before anyone had even heard of SGML) [stanford.edu] 30 years ago.
  • First of all I am no technophobe. I embrace technology as part of my life. Where I see a problem is giving ourselves over completely to technology. Who is in control? What are the safeguards, checks, and balances to a system that knows everything about you? If I purchase a tube of Preparation H for a family member will I be subjected to a system that insists that I have hemorrhoids? Will there be a donut cushion waiting for me on the next flight I take?
  • Doesn't such a complex arrangement of input devices and this "society of devices" seem just a bit much? It seems to me that it would be much easier to try to streamline our current system. There are already too many kinds of retrieval and input devices; why should you make more? Such a system, while perhaps being logical, would not be the most efficient. You need single devices that can easily multitask, and input devices that can be multifunctioned, not a messy slew of devices. So where is this multifunctioned
  • This idea keeps coming around. "Smart buildings". "Smart dust".

    A worthwhile project would be a "smart lecture hall". Just provide all the usual gear, but interconnect it so it works reasonably. Sense the approximate number of people in the room and crank airflow up and down accordingly. (That, all by itself, is a viable product concept.) Interconnect the lighting, screen, and projectors so that when the screen is lit, it's not illuminated by room lighting. (Use big, illuminated buttons on the contro

  • My microapps teacher told me a story of his grandfather refusing to use the phone. He said, "Why would I ever talk into a piece of plastic. If I want to talk to somebody, I'll go down the street and talk to them." While some people today may be concerned about big brother, new generations are more open to technology. I think that someday, tomorrow's kids will not have a problem with technology controlling their lives because they will be brought up in a world where technology is already doing so.
  • There's a project which is 'kinda' doing stuff like this. It was started by one of the GNOME/Ximian heavies, Nat Friedman. It's called Dashboard [nat.org] and development is currently going on at a frantic pace.
