Sentient Data Access 134
CowboyRobot writes "From Queue comes a piece subtitled Why doesn't your data know more about you? From the article: 'It has been more than ten years since such information appliances as ATMs and grocery store UPC checkout counters were introduced. ... A common language for these devices has not been standardized, nor have current database solutions sufficiently captured the complexities involved in correctly expressing multifaceted data. ... As computing devices expand from the status-quo keyboard and desktop to a variety of form factors and scales, we can imagine workplaces configured to have a society of devices, each designed for a very specific task. As a whole, the collection of devices may act much like a workshop in the physical world, where the data moves among the specialized digital stations. For our society of devices to operate seamlessly, a mechanism will be required to (a) transport data between devices and (b) have it appear at each workstation, or tool, in the appropriate representation.'"
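As a rough sketch of requirement (b) - the same datum showing up in a device-appropriate representation - one could imagine each device class registering a renderer for a shared data type. All names below are invented for illustration, not taken from the article:

    # Minimal sketch: one datum, many device-specific representations.
    # All class/function names here are illustrative, not from the article.
    from dataclasses import dataclass

    @dataclass
    class Contact:
        name: str
        phone: str

    RENDERERS = {
        "wall_display": lambda c: f"{c.name}\n{c.phone}",            # large, glanceable
        "phone":        lambda c: f"{c.name} <{c.phone}>",           # compact one-liner
        "printer":      lambda c: f"Name: {c.name}\nTel: {c.phone}", # form layout
    }

    def present(datum: Contact, device: str) -> str:
        """Return the representation appropriate to the receiving device."""
        return RENDERERS[device](datum)

    print(present(Contact("Ada", "555-0100"), "phone"))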
The implications... (Score:5, Funny)
Re:The implications... (Score:2)
Re:The implications... (Score:3, Interesting)
Your cellphone knows what you've been looking at online; then, when you're walking by stores, it checks their websites (using Bluetooth and their Bluetooth AP) to see if there's anything there that you might be interested in.
The cellphone doesn't hand out the information to let the server do the thinking, so there are no security risks (of that kind), and you can always slap a Bayesian filter on the whole thing to make sure that it learns what you're looking at online but
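A minimal sketch of that Bayesian-filter idea - scoring a store's page against words from pages the user has browsed - might look something like this toy naive Bayes (all data and names invented):

    # Toy naive Bayes "interest" scorer: trains on pages the user liked
    # vs. ignored, then scores a nearby store's page. Illustrative only.
    import math
    from collections import Counter

    def train(docs):
        words = Counter(w for d in docs for w in d.lower().split())
        return words, sum(words.values())

    def log_likelihood(page, model, vocab_size):
        words, total = model
        # Laplace-smoothed word probabilities, summed in log space.
        return sum(math.log((words[w] + 1) / (total + vocab_size))
                   for w in page.lower().split())

    liked   = train(["vintage synthesizers for sale", "analog synth repair"])
    ignored = train(["lawn furniture clearance", "garden hose deals"])
    vocab = len(liked[0] | ignored[0])

    store_page = "rare analog synthesizers in stock"
    interested = (log_likelihood(store_page, liked, vocab) >
                  log_likelihood(store_page, ignored, vocab))
    print("ping the user" if interested else "stay quiet")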
Re:The implications... (Score:2)
I want nothing of the sort.
Your cellphone knows what you've been looking at online; then, when you're walking by stores, it checks their websites (using Bluetooth and their Bluetooth AP) to see if there's anything there that you might be interested in.
Your cellphone maybe, but for sure as hell not mine.
In fact, I refuse to own one of the infernal things for multiple reasons, amongst which is the very thing you seem to desire.
I'm quite happy with a world wide web. A wo
Re:The implications... (Score:1)
>> at online; then, when you're walking by stores, it checks
>> their websites (using Bluetooth and their Bluetooth AP)
>> to see if there's anything there that you might be
>> interested in.
> Your cellphone maybe, but for sure as hell not mine.
> In fact, I refuse to own one of the infernal things for
> multiple reasons, amongst which is the very thing you
> seem to desire.
> I'm quite happy with a world wide web.
Re:The implications... (Score:2)
Most of us here have the good sense to realize that such a thing will be badly implemented because we live in a free country where commercial interests are the driving force behind most of the products we get on the market. Paid placement, tie-ins, cross-promotions, etc. will enable merchants to gather more and more information about us, whic
Re:The implications... (Score:2, Insightful)
"The good sense"??? Would that be "the good sense" as in the good sense of knowing tha
Re:The implications... (Score:2)
You seem to assume that anonymity is a given; I assume things are going
Re:The implications... (Score:2)
It will also be fully capable of alerting (kindest thanks for the word, it's a good one) people and instrumentalities other than myself. This is a Bad Thing, and no amount of persuasion by the folks who design, build, and sell such devices will ever convince me that there will never ever be any kind of back door (never mind wide-open FRONT doors) that will permit people I don't much like to acquire and use the informa
Re:The implications... (Score:2)
Crackers are the least of my worries, unless they get hold of my identity/credit card number and take a little ride with that info. Of greater concern are the things that corporations and governments (please notice the 's' on the end of government there) might choose to do wit
AFDB (Score:2)
That book is FUNNIER THAN HELL!!!
Among other things, I'm an occasional book reviewer for Paladin Press, the folks who published Lyle's hilarious (and exceptionally well-illustrated) book.
A month or two ago I submitted my review of that book to Slashdot, but it was rejected for whatever reason (I average about one in six accepted story submissions to the edito
Re:The implications... (Score:2)
Re:The implications... (Score:1)
Re:The implications... (Score:2)
Re:The implications... (Score:2, Insightful)
If you're a privacy advocate, the current somewhat insular world of independent devices is best. Our refrigerators do not talk to our phones. We have some control over our lives.
However, there are some times when I wish some of those devices were more 'open'. My grocery store UPC checkout counter knows a lot about me. Since I don't eat out an
XML (Score:4, Funny)
Re:XML (Score:3, Insightful)
Let's see.. You have an appointment in the building and get lost. When you walk up to a wall display, it (without asking) shows you a map and path to get to where you want to go.
Think about what would be required to make that trick work -- then worry about the security problems.
Re:XML (Score:2)
And nowadays it seems to be "work that out then worry about privacy."
Re:XML (Score:1)
Re:XML (Score:1)
Semantic Web, Ontology Mapping, Trust (Score:5, Insightful)
RDF/XML solves a bit more of the problem, making the structure of the information clear, in terms of assertional statements. An RDF/XML file is a knowledge base, full of statements saying this has some particular relationship to that. It lets the machines get at more of the information in a uniform, universal way.
But still, the problem of ontology/schema/vocabulary mapping remains: if one system is talking about patients and another is talking about clients, they might or might not really be talking about the same thing. A single person may never count as two patients but sometimes counts as two clients, etc. At least with the data in RDF, most of this mapping can be done in software once a person figures it out and expresses it in a suitable logic language.
The emerging design of the Semantic Web hopes to make that reasonable, but also to support convergence on common vocabularies by having everything on the web -- if it's trivial to see what vocabularies are already being used, people will mostly only make new ones when the old ones really are different.
Other hard problems remain, of course, like figuring out which data sources to trust. Fun fun.
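A small illustration of that human-authored mapping step, using the rdflib library; the patient/client vocabularies and URIs are made up for the example:

    # Sketch: two vocabularies assert "patient" vs. "client" facts; a
    # one-line mapping rule, written once by a person, lets software
    # translate between them. Requires: pip install rdflib
    from rdflib import Graph, Namespace

    HOSP = Namespace("http://example.org/hospital#")
    FIRM = Namespace("http://example.org/lawfirm#")

    g = Graph()
    g.parse(data="""
        @prefix hosp: <http://example.org/hospital#> .
        hosp:alice hosp:isPatientOf hosp:clinicA .
    """, format="turtle")

    # The human-authored mapping rule: hosp:isPatientOf implies firm:isClientOf.
    for person, _, org in list(g.triples((None, HOSP.isPatientOf, None))):
        g.add((person, FIRM.isClientOf, org))

    for s, p, o in g.triples((None, FIRM.isClientOf, None)):
        print(s, p, o)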
Duh (Score:5, Funny)
That sounds like a bad Soviet Russia joke ;^)
Re:Duh (Score:1, Funny)
The problem HAS been solved (Score:2)
Hmmm (Score:2, Insightful)
Verge of Future? (Score:3, Interesting)
With all the 'innovation' these days it's getting hard to keep track
Landrocker
Re:Verge of Future? (Score:1)
Get on the boat (Score:2, Informative)
Obsequious computing (Score:5, Insightful)
I guess you're not supposed to start out a scientific paper with "Hey, wouldn't it be cool if...", but their paper really needs it.
perhaps a good thing? (Score:5, Interesting)
I know that "security through obscurity" is a cheesy solution, but I can't help thinking that if every ATM in the country had the same architecture, the system as a whole would be more prone to hacks and abuse. What do you think?
Re:perhaps a good thing? (Score:1, Informative)
A lot of them are standardizing on Windows.
Have A Nice Day!
Re:perhaps a good thing? (Score:1)
Re:perhaps a good thing? (Score:3, Insightful)
the system as a whole would be more prone to hacks and abuse.
I think that's what they're talking about. Just because you don't have a use for all your personal data doesn't mean nobody does.
Think "cookies for real life".
Middlemen (Score:5, Interesting)
Until the marketplace demands a standard, businesses will continue this behavior because it is more profitable in the near term... individuals almost always pay more than conglomerates, which is the nature of a trading company that can, with 'purchasing power', lower the price for goods or services. So as long as the companies are dealing 'direct' with you, the consumer, they can ask for whatever service charge you can bear as an individual... compared to credit unions, who get much, much better deals as an organization.
So basically it's in all companies' best interest to avoid organized clientele or employees as long as possible in order to maximize profits from low overhead and high margins. Information technology doesn't change this strategy; it just adds new levels of complexity.
Misconception (Score:5, Insightful)
The entire point of computers is that they are general purpose devices. The "workshop" idea surely sounds cool to someone who doesn't know about computers, because it resembles the world before "general purpose" was a graspable concept.
Would I rather want my workplace to be a collection of specialized devices, or a single device that can be configured to be any of the others, plus whatever else or new is necessary? Now that's a difficult question, right?
Re:Misconception (Score:1, Interesting)
Re:Misconception (Score:3, Interesting)
Take TiVo for example. Sure, there are a lot of free and non-free software products to make a general purpose computer behave somewhat similarly. For one, I don't know if any of this software can set the computer to wake up at a particular time slot. There is simply no API to communicate to the BIOS or any other hardware to do this. Being able to sleep and wake up on a timer is important to energy efficiency; I doubt a TiVo has a
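For what it's worth, Linux does expose a hardware wake timer through the RTC's wakealarm interface, whether or not the PVR software of the day used it. A hedged sketch (needs root; assumes the usual /sys/class/rtc/rtc0 path exists):

    # Sketch: arm the real-time clock to wake a suspended Linux box
    # via the kernel's RTC wakealarm interface. Run as root.
    import time

    WAKEALARM = "/sys/class/rtc/rtc0/wakealarm"

    def set_wake(seconds_from_now: int) -> None:
        with open(WAKEALARM, "w") as f:
            f.write("0")  # clear any previously armed alarm
        with open(WAKEALARM, "w") as f:
            f.write(str(int(time.time()) + seconds_from_now))

    set_wake(30 * 60)  # wake in half an hour, e.g. to record a show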
Re:Misconception (Score:2)
Only in edge areas. In mainstream, there is enough incentive to improve the general purpose device.
Your virtual VCR is a great example. All the software I've seen, be it MythTV or Freevo or others, far surpasses any physical VCR I have ever seen when it comes to functionality.
Re:Misconception (Score:2, Insightful)
Yes, but when people go to "Best Buy" or "Dell" and ask for a "computer" these days, what they're really asking for is an interactive television. They don't understand the concept of operating systems or programming, they just identify those things as buzz-words that make you look smart if you use them correctly in a sentence.
Would I rather want my workplace to be a collection of specialized devices, or a single device that can be c
Not at all difficult - specalized the way to go (Score:2)
I want both (Score:2)
But even on a general purpose device, the problem of information sharing across applications remains. That's a difficult problem
You almost have it... (Score:2)
Most people - and that includes a fair number of geeks as well - don't understand what a computer truly is. It is a machine which runs software. But that isn't quite it, either. Computers are software made physical - and this is the crux of it all.
You see, software is nothing more than ideas and algorithms expressed as a special series of ones and zeros, interpreted sequentially. This sequential nature i
RISKS Hell? (Score:5, Insightful)
In the system described here, once bad data gets into your microwave oven there's virtually no way to chase down all the instances of it that will be floating around the universe. Didn't Sandra Bullock star in a movie about this once?
Re:RISKS Hell? (Score:1)
Re:RISKS Hell? (Score:4, Insightful)
Whereas in a better-designed version of the same thing - where everything contained a link to the canonical version of the information, and possibly cached it for a sane length of time - this wouldn't happen: you would update your current address, and suddenly _everyone_ who had a copy of the canonical location would have the new value.
Add a little strong crypto - unguessable URIs for data and possibly even encrypt the value of the field to each entity who's supposed to have a copy, in such a way that they can't leak the URI without you knowing who sold your information.
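A sketch of that unguessable, per-recipient URI idea, using an HMAC over the recipient's name so a leaked URI identifies the leaker (the URI scheme and all names are invented):

    # Sketch: derive a distinct, unguessable data URI for each entity
    # that's allowed a copy; a leaked URI then pinpoints the leaker.
    import hmac, hashlib

    SECRET = b"owner-only-key"  # held by the data's owner, never shared

    def uri_for(recipient: str) -> str:
        tag = hmac.new(SECRET, recipient.encode(), hashlib.sha256).hexdigest()
        return f"https://data.example/address/{tag}"

    grants = {uri_for(r): r for r in ("bank", "grocer", "insurer")}

    leaked = uri_for("grocer")           # URI later found in the wild
    print("leaked by:", grants[leaked])  # -> grocer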
Re:RISKS Hell? (Score:1, Interesting)
I have been having a similar problem with one credit bureau. From numerous conversations I have concluded that the subject of the report is the last person to be believed.
Similar to the earlier post, the CBs had my address wrong, along with many other things. I called them to correct the obvious typo; they wouldn't unless I sent them a utility bill. So I sent the same utility bill that
Re:RISKS Hell? (Score:3, Insightful)
Direct contact between a gadget of yours and the company that needs the information... a portable database that keeps track of who's got what information about you, and what information they are allowed to get from you.
All that's basically needed is the cellphone, an open XML-based standard and a way to sign the data; not
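A toy version of that portable who-knows-what database, with a signature over each disclosure record; the record format is invented, and a real scheme would want public-key signatures rather than this shared-key stand-in:

    # Toy disclosure ledger: the phone records what was handed to whom
    # and signs each record so tampering is detectable. Illustrative only.
    import hmac, hashlib, json

    KEY = b"phone-private-key"  # stand-in; real use wants asymmetric keys

    ledger = []

    def disclose(company: str, fields: list) -> None:
        record = json.dumps({"to": company, "fields": fields}, sort_keys=True)
        sig = hmac.new(KEY, record.encode(), hashlib.sha256).hexdigest()
        ledger.append((record, sig))

    disclose("grocer", ["name", "postcode"])
    disclose("bank", ["name", "address", "salary"])

    for record, sig in ledger:
        ok = hmac.compare_digest(
            sig, hmac.new(KEY, record.encode(), hashlib.sha256).hexdigest())
        print(ok, record)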
Re:RISKS Hell? (Score:3, Funny)
Nonsense, Verisign certifies that Microsoft can be trusted. :^P
Re:RISKS Hell? (Score:1)
Not that we can trust Microsoft itself, just their IIS-servers... hmmm... I can't figure out what's worse, trusting M$ or trusting IIS (or trusting Verisign)...
What a poor pretentious article (Score:5, Interesting)
Let's see what this is all about:
1) FIND AN OBVIOUS TREND
We think microprocessors are spreading everywhere, and see/predict them doing a lot of things, including communication
2) GIVE IT A SOPHISTICATED SOUNDING NAME
I think I will call it... UbiComp (ubiquitous computing)!
3) ELABORATE ON WHAT NAMED TREND WILL IMPLY
Computers will be everywhere. People will talk to them. They will talk to people... they will talk with each other! (claps)
4) WRITE ABOUT WHY IMPLICATIONS DIDN'T HAPPEN
"New forms of interaction must be developed for this environment (...)"
5) PEPPER IT ALL WITH UNBEARABLY OBSCURE PHRASES
"Thoughts exchanged by one another are not the same in one room as in another. This includes "thoughts" exchanged between people and/or machines, and implies that behavior is sensitive to location, and as a consequence of mobility, must adapt to changes in physical and social location." Make sure you make references to lots of other authors and experts.
6) RELEASE TEXT TO A "WANT TO LOOK INTELLECTUAL" AUDIENCE
Which will pretend this is the smartest piece of writing ever, and that the uninitiated simply are not smart enough to understand it.
No thanks, I think I can do without concepts like UbiComp.
Re:What a poor pretentious article (Score:2, Insightful)
They missed out on using orthogonal paradigms, but at least they didn't call it ClippyWear.
Re:What a poor pretentious article (Score:2)
7) Insert complicated-looking but essentially meaningless 'diagrams'
Fig 1 is fucking hilarious.
Re:What a poor pretentious article (Score:2)
Apparently he's upset that so much is being spent on monitor-sized displays? I'd really like to know where he got his "figures" though, and at what point research into display technologies was limited to a specific size and couldn't be applied to any others.
Re:What a poor pretentious article (Score:3, Informative)
Oh, already "5 Interesting".
Thanks for showing that Slashdot can be a force for sanity. This kind of bullshit pseudoscience drives me up the wall.
But look a little deeper at the article and you will see that the initial superficiality in fact hides a deeper and much more stunning superficiality.
Some of the background inventions are truly stunning. The "Removable Media Metaphor" lets you carry data from one place to another on, wait for it, physical objects. This is incredibl
Re:What a poor, pretentious retort! (Score:2)
Re:What a poor pretentious article (Score:1)
Read Mark Weiser's original work on UbiComp (Score:2)
Ubiquitous Computing argues that technology should not be a juggernaut, the center of attention, the giant blinking home-theatrical shrine set up in the middle of the living room that the whole family sits around and worships while they tan from its glowing radiation. It shoul
Re:Read Mark Weiser's original work on UbiComp (Score:2)
You rescued the concept for me.
Re:Read Mark Weiser's original work on UbiComp (Score:2)
The "pervasive computing" people have co-opted Ubiquituous Computing, and perverted and hyped the ideas to sell it to the military (who eagerly respond to the pervasive penis-oriented penetration metaphore).
-Don
And that's when SkyNET turns itself on and... (Score:1)
We do have this... (Score:1)
Grocery store UPC checkout counters... (Score:1)
I have a theory about why this is not happening (Score:5, Insightful)
Frankly, this just stinks of that old chestnut about interconnected toasters and refrigerators and power drills sharing data seamlessly on a home network. I was never quite able to get thrilled about this kind of thinking when I first heard it, and it rings more hollow every time I've heard it since, which is about a million times, over decades.
I think the big problem here is that there isn't much of a problem to solve. I'm sure that, when we have even more portable and ubiquitous computers and communication all around us, we'll be deluged with new applications that we just can't quite think of right now - but we don't have them yet. And you can't chalk it all up to "the technology isn't ready yet." No, I think it's related to demand more than supply.
As far as I can tell, there aren't many killer apps that fall well under this umbrella, and those few that there are can't begin to justify the expense of the hardware and software involved, now, or probably for another decade or two.
One thing that always gets me about this line of thinking is that even the examples they lead with tend to be uninspiring and ridiculous: ATMs and grocery store checkouts sharing programming languages and databases? Complicating the "workplace" with converged, general-purpose computing solutions by littering it with specialized information tools? Come on, guys, this is freakin' stupid. Does standardizing on a single end-all-be-all computer language, OS, or database sound like a good idea to anyone? Or particularly original? What about "un-converging" to any greater extent than we already are? Or is there some new information tool that will change everything?
I'm sure as soon as someone actually has a real idea that's plausible enough for science fiction, we'll all get excited about being the first to make it happen.
The article does hint at a few more interesting things: that hierarchical filesystems may be overrated and due for reexamination as the bedrock of computing (although truthfully this is already well in progress - PalmOS? Newton?); that we might see more kiosk or application-specific computers - more "specialized devices" solving problems out in the world, now selling tickets, now portable computerized maps giving directions; perhaps more active displays "everywhere," primarily driven by advertising but perhaps justified by various underlying civic duties; and that location-based computing is undoubtedly going to be more important, as it finally becomes cheap enough to be a factor...
But these are all just hints. Barely that.
But overall I find this to be just another valueless futurist rant, devoid of real ideas, coasting on buzzwords and hype, and basically irrelevant to anyone seriously thinking about the future... or at least, nothing you haven't heard before a million times.
Re:I have a theory about why this is not happening (Score:3, Insightful)
The idea reminds me a bit of the Java-based software 'agents' we briefly studied in an undergrad distributed computing course I did .. hmm .. probably eight years ago. In fact the whole idea sounds a lot like much of the rationale behind Java in the first place. Each device runs a 'common language' (Java), and the network allows special-purpose software tools (agents) to travel through the network and run on the general-purpose tool/device (or agent clients using some sort of RPC to the server module throug
Re:I have a theory about why this is not happening (Score:2)
Java itself is more successful and widely deployed than
Java has in many cases replaced C++ as the language taught in universities around the world. At that milestone, it is unlikely to be "killed" in our lifetimes.
More than 10 years? (Score:3, Informative)
Try 30+ years for UPC. They came about back around '72 at Kroger in Ohio. And ATMs... at least 20+ years, but that Google search is left as an exercise for the reader.
Laziness or sloppy coding? (Score:4, Insightful)
So why oh why does it remind me to take the receipt when I told it no receipt? It is not printing one out, so somewhere in its memory a flag is set. So why can't the last message be adjusted to reflect it? It is a very simple thing to do. I think you learn this kind of thing in the second chapter of any programming book.
I think until such simple things are realised (I've seen people waiting for a receipt that is not going to appear, making the throughput of the machine slower, which is not good on a busy shopping day), we can forget such machines ever becoming even the slightest bit intelligent and, say, using your name to greet you. Let alone being able to give you, say, access to your bank statements for the month to see if the rent has already been deducted.
Oh, and this behaviour is spotted on ATMs in Holland, in several different models belonging to different banks (our cards work with all banks).
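The fix being asked for really is chapter-two material - a sketch, with the ATM logic obviously invented:

    # Sketch: the "don't tell me to take a receipt I declined" fix.
    def farewell(wants_receipt: bool) -> str:
        if wants_receipt:
            return "Please take your card and receipt."
        return "Please take your card."  # no phantom receipt prompt

    print(farewell(False))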
Re:Laziness or sloppy coding? (Score:2)
I've seen several ATMs greet me by name.
The funniest thing is, I saw this once or twice after I changed my name with the bank, before I got my new card... so I know that they weren't getting my name off of the bank records, but off of the raised plastic letters on the card (or possibly the magnetic stripe, but those usually carry little else than an
Slow day on slashdot? (Score:4, Insightful)
There are applications like this... (Score:3, Interesting)
Re:There are applications like this... (Score:2)
I dunno about that. The coupons they usually give me are for products that vaguely compete in the same arena as what I buy, but are of inferior quality. Like, I'll buy "Healthy Choice" canned soup, and get a coupon for "Chef Boyardee" canned pasta.
Plenty of times I've handed the coupon back to the checker in disgust.
It's ClippyWear not UbiWear! (Score:4, Insightful)
Umbrella terms for this type of tech (Score:5, Informative)
Unfortunately, this kind of thing still starts in the military world. The DoD has been developing requirements for Network Centric Warfare [defenselink.mil] (NCW). Basically turning warfare interfaces into an RTS game like StarCraft, C&C, complete with fog-of-war, semi-autonomous units, comm & data sharing, etc. On the technical side, this is manifesting itself as Command, Control, Communications, Computer, Intelligence, Surveillance and Reconnaissance [osd.mil] (C4ISR) architecture. One of the first actual implementations is being worked on in the form of Future Combat Systems (FCS).
These are complex systems, so the DoD has been maturing development of modeling & simulation interoperability by making contractors adhere to High Level Architecture [dmso.mil] (HLA) so they can properly analyze these systems before deploying them. HLA basically provides a lot of the same data object registration, distribution, and interfaces that older tech like CORBA does, with extra simulation concepts.
These technologies are being commercialized under the buzzwords "Network Centric Operations" (NCO) and "Network Enabled Operations" (NEO). Advocates usually point to well-networked operations like Wal-Mart, UPS, et al. as poster children for what could be done (automatic restocking, package tracking, load balancing & route optimization, etc.) with enough NEO infrastructure. A lot of the interchange standards (including C4ISR) are getting established through bodies like the OMG [omg.org]. Other than the interchange standards, there's not all that much new tech involved... maybe RFIDs and various other networking tech (grid/mesh networks, strong encryption/authentication, mobile IP, etc.). Most of it just involves looking at technology that already exists and figuring out how to piece it together to actually do something worthwhile.
Disclaimer: I work for one of the gov't contractors throwing all these buzzwords around.
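For the curious, the "data object registration and distribution" such middleware provides boils down to publish/subscribe on declared object classes. A toy version showing the shape of the idea - this is not the real HLA or CORBA API:

    # Toy publish/subscribe bus: producers publish typed objects,
    # consumers subscribe by declared class name. Shape only; real
    # HLA/CORBA middleware adds discovery, time management, etc.
    from collections import defaultdict

    class Bus:
        def __init__(self):
            self.subs = defaultdict(list)

        def subscribe(self, object_class: str, callback) -> None:
            self.subs[object_class].append(callback)

        def publish(self, object_class: str, attributes: dict) -> None:
            for cb in self.subs[object_class]:
                cb(attributes)

    bus = Bus()
    bus.subscribe("Vehicle", lambda a: print("tracker sees:", a))
    bus.publish("Vehicle", {"id": 7, "lat": 51.5, "lon": -0.1})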
Re: Wal-Mart (Score:3, Interesting)
And that, ladies & gentlemen, is the reason that Wal-Mart will *own* the entire retail market in the US within the decade. They already get $1000/yr from every man, woman, and child in the US.
I did some consulting for a niche retailer last year. After assessing their current technology, I unilaterally recommended that they copy Wal-Mart in every one of their IT decisions. I even
Re:Umbrella terms for this type of tech (Score:2, Informative)
No, you're wrong about HLA. Firstly, HLA provides NO SIMULATION CONCEPTS, as it is designed in a generic way that contains no notion of simulation at all. Secondly, HLA does a helluva lot LESS than CORBA; HLA does not even provide a system for distributed computing or calling remote procedures. All HLA does is provide a standardized way of describing the data framework for networked applications - in other words, a standardized way of describing what the content of network packets will be. There i
HLA vs. CORBA (Score:2)
The original HLA spec (up to 1.3) as defined by the DoD was kind of nebulous, so much so that the industry groups had to create their own HLA spec that was actually practical/implementable/useful (IEEE 1516). This page provides
Project Oxygen @ MIT (Score:4, Informative)
Middleware. (Score:4, Insightful)
Email, NNTP, IM, xmlBlaster, Jabber, MQSeries, SonicMQ, SwiftMQ, Softwired iBus, Jiiva RapidMQ, ICM, etc etc etc etc...
What we need to do is write *more* message systems. In fact, let's *everyone* do one.
The real problem is standardisation. The situation is a bit like networking protocols before TCP/IP became all-pervasive. Each vendor has their own system and is happy to charge you an arm and a leg to connect it up to everything. You then have the same problem with information definitions and formatting, but XML and schemes like RosettaNet are gradually solving that one.
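To make the standardisation point concrete, here is the kind of thing a shared envelope buys: any system can at least parse sender, type, and body without vendor-specific code. The envelope format below is invented:

    # Sketch: a vendor-neutral XML message envelope, parsed with the
    # standard library. The envelope schema here is made up.
    import xml.etree.ElementTree as ET

    msg = """<message>
      <from>inventory-system</from>
      <type>stock-level</type>
      <body><sku>1234</sku><count>42</count></body>
    </message>"""

    root = ET.fromstring(msg)
    print(root.findtext("from"), root.findtext("type"),
          root.find("body/sku").text, root.find("body/count").text)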
Escher's Print Gallery may provide some insight .. (Score:3, Interesting)
I have thought about how to implement this and found that Escher's Print Gallery brought me to my knees... Why? Here is the story in brief...
The basic problem is to be able to show the data in 1D, 2D, and 3D. Then there are pseudo-dimensions that give rise to 1.5D and 2.5D, and finally the element of time T has to be taken into each of these spaces. The crux of the problem is to maintain continuity of "something" that flows between each of these spaces - often in an iterative and recursive fashion. This something can be abstracted as an object (which I call the Bubble, hence my domain name BubbleUI!) and the authors say
To be able to visualize this, the best I can do is suggest that you look at the Print Gallery by M.C. Escher [artchive.com], and here [uvm.edu]. Just imagine that the paintings in the gallery are not static paintings, but are actually windows looking into the real world. As the real world is dynamic, when you revisit a given window, it is possible that things might have changed. Then you will have a good idea of what your mind has to get a handle on before a user can have "sentient data access."
The concept of visualizing the prints in the Print Gallery as windows is not too off-base, because the article describes a desire to integrate the physical with the visual...
And the article also says that there are more than just static screens that have to be incorporated.
So, to accommodate the above requirement, imagine now that there is not a single spectator in the gallery, but many people looking at many "windows" at the same time. And, like in real life, these spectators interact with each other inside the Print Gallery (FIGURE), just as the real world visible from the windows is interacting in the background (GROUND).
After all is said and done, the conclusion that I came up with in the 1st draft of my doctoral thesis (which was rejected; I then approached this subject differently, and it was then accepted) was that the glue to bind it all is the cognition of the user - i.e. PortfolioBrowser==User
The User, in my conception, is the PortfolioBrowser. And because of this choice at the center o
Re:Escher's Print Gallery may provide some insight (Score:1)
Mm.
Sentient? (Score:2)
People throw around "AI" and "Sentient" too much when describing software when in fact the software is nowhere close to that.
Re:Sentient? (Score:2)
The best description of "AI" that I've ever heard came from my undergrad AI prof. He said that AI is any system that you don't understand. You can craft the spiffiest neural-net-genetic-alg-self-modifying-rule-based-whateverthehellyouwant system, and as soon as you explain the algorithm to someone, they'll say "Oh, I get it. So that can't be AI."
Of course, by that logic, ATM
Perhaps my data drinks too much? (Score:1)
Why else would buying CAT food at PetSmart get me an email to watch the Thanksgiving Purina DOG show on TV?
Neat (Score:3, Interesting)
The article speaks specifically to coordinating the transfer of specific data and instructions between devices in a real-world environment. Though in most cases, the instruction could be context-sensitive; i.e., if you walk into a Vision Dome with a particular bar code scanned, it could be surmised that you want to view that bar code/layout/car within the dome.
Even though the article chastises the world for not having accomplished this yet, the reality is that this sort of thing could be implemented today with current technologies. Also, the platform could easily use future technologies if designed correctly.
To build such a thing would require an extensible way of defining a process, much like VoiceXML, BPML, or BPEL. It would also require the ability to define the exchange of data - more importantly, the device/location/communication channel that the data will be coming from. And finally, it would require a way of easily defining the execution of a process. The last component is really the challenge. Every software package that would participate in this type of environment would need to "listen" for requests and messages coming from devices/other systems. Indeed, this sort of plumbing is not hardware, but software. As such, it would also need to be supported by every operating system, handheld device, and embedded system to be properly integrated into the world at large.
So I say build the language then build the engine.
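One way to picture "build the language, then build the engine": a declarative process definition plus a dispatcher that listens for device events and runs the matching step. Everything here is invented for illustration:

    # Toy process engine: a declarative definition maps device events
    # to actions; the engine "listens" and dispatches. Names invented.
    PROCESS = {
        "barcode_scanned":  lambda e: print(f"dome: show model {e['code']}"),
        "badge_at_display": lambda e: print(f"display: map for {e['user']}"),
    }

    def dispatch(event_type: str, event: dict) -> None:
        handler = PROCESS.get(event_type)
        if handler:
            handler(event)
        else:
            print("no step defined for", event_type)

    dispatch("barcode_scanned", {"code": "044000012345"})
    dispatch("badge_at_display", {"user": "visitor-17"})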
Talk to each other? Heck, just remember me! (Score:2, Insightful)
The first question it asks me is whether I want to work with it in Spanish or English... couldn't it remember that from the card? I'm not likely to suddenly forget English. (I did run through the whole thing once in Spanish, just for kicks).
It should know which account I tend to take cash out of, and how much, and highlight those options.
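The remembering being asked for amounts to a few fields on the card or account record - a sketch, field names invented:

    # Sketch: per-card preferences the ATM could read and honor.
    prefs = {"card:1234": {"language": "en", "usual_account": "checking",
                           "usual_amount": 40, "wants_receipt": False}}

    def greet(card_id: str) -> str:
        p = prefs[card_id]
        return (f"[{p['language']}] Withdraw ${p['usual_amount']} "
                f"from {p['usual_account']}? (highlighted)")

    print(greet("card:1234"))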
Sub-sentient ATM pet peeve (Score:4, Insightful)
It's almost an anti-security device, too. If a French speaker has their card stolen by an English speaker, and the ATM only prompted in French, that would be at least a little bit of a deterrent to illicit use, wouldn't it?
It's crazy to talk about a universally connected web of smart data when the individual machines are, even after years of evolution, so profoundly stupid.
As usual... (Score:3, Informative)
Do we really want this? (Score:1)
Too much? (Score:1)
Smart everything, again (Score:2)
A worthwhile project would be a "smart lecture hall". Just provide all the usual gear, but interconnect it so it works reasonably. Sense the approximate number of people in the room and crank airflow up and down accordingly. (That, all by itself, is a viable product concept.) Interconnect the lighting, screen, and projectors so that when the screen is lit, it's not illuminated by room lighting. (Use big, illuminated buttons on the contro
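A sketch of the interlock logic being proposed; the sensor and actuator names are invented, and real gear would sit behind a building-automation protocol:

    # Sketch: smart-lecture-hall interlocks - airflow tracks occupancy,
    # and room lights dim automatically while the projector screen is lit.
    def control(occupants: int, screen_lit: bool) -> dict:
        return {
            "airflow_pct": min(100, 20 + 2 * occupants),  # baseline + per-person
            "lights_pct": 10 if screen_lit else 80,       # don't wash out screen
        }

    print(control(occupants=35, screen_lit=True))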
Big brother and tomorrows kids (Score:2, Insightful)
A project for Linux doing this (Score:2)
Re:Slashot Personal Ads! (Score:3, Interesting)
How long is it before ATMs / "grocery stores" (supermarkets here) are linked into dating sites and your email?
They know you are looking for a date, and perhaps the ATM gives you messages that the supermarket will give you a good deal on aftershave, and that rather than buying beer for consumption at home, you would meet more women if you left the house once in a while.
Re:Slashot Personal Ads! (Score:4, Interesting)
And I hate to think about spam that follows you around. Every damned ATM or wall display just has to publicly tell you about those magic bean^w^w blue pills that you opted in to receive messages about.
Re:Slashot Personal Ads! (Score:4, Funny)
As a privacy advocate, I guess this means I'll be buying hand lotion and "reading material" in separate trips to different supermarkets!
Re:Slashot Personal Ads! (Score:2)
Are you referring to Bertrand Russell's story "A Metaphysician's Nightmare" [luminary.us]? That'd be funny, viewing a troll as the devil himself.
Re:Slashot Personal Ads! (Score:2, Insightful)
Re:Slashot Personal Ads! (Score:2, Insightful)
Dude, if the grocery store tattled on my buying habits and my dating website realized how many twinkies [twinkiesproject.com] and pints of Ben & Jerry's [benjerry.com] I buy (not for myself, I assure you!) the dating site may assume (incorrectly, I assure you) that my picture is 5+ years out of date and not representative of my current date-ability and good-lookie drool factor vis a vis the ladies. Sounds suckily Orwellian to
Re:Slashot Personal Ads! (Score:2)
Furthermore I suspect the health services would be interested in this data so they can prepare themselves for likely trends in poor health for years to come.
Next your (virtual?) doctor will be sending you emails advising you to buy less junk food and beer, while your bank will be advising you to save for future health expenses, along with the obvious contact from health insurers
I honestly don't see how this is offtopic... (Score:2, Insightful)