Why Companies Knowingly Ship Insecure Devices

wiredmikey writes "A recent survey of 800 engineers and developers who work on embedded devices revealed that 24% of respondents knew of security problems in their company's products that had not been disclosed to the public before the devices shipped. But just what that means about attitudes toward security may be more complex than it seems. Additionally, just 41% said their company has 'allocated sufficient time and money to secure' its device products against hacks and attacks. Despite this, 64 percent felt that when engineers call attention to potential security problems, 'those problems are addressed before the device is released.' So what exactly does this illustrate about the state of security in the development process? The answer, some say, is a jumbled collage of business pressures, bug prioritization and varying attention to security."
This discussion has been archived. No new comments can be posted.

  • Not important enough (Score:5, Informative)

    by Anrego ( 830717 ) * on Friday August 12, 2011 @11:38AM (#37068978)

    Security isn’t important enough or visible enough to the end user, and insecurity doesn’t cost companies enough money.

    If company A spends $100,020 extra on securing their product, whereas company B spends $1,020 extra .. and neither product “gets hacked” .. there is no perceived value increase. If company A has to sell their product at a higher price .. most consumers will go with company B’s product.. _even if_ company A can somehow demonstrate that their product is more secure (and aside from a clean track record, this is hard).

    If Company B’s product gets hacked, 99% of users don’t know or don’t care.. and company A gets exactly 3 new customers (always 3.. regardless of scale) who are concerned with company B’s security track record and assume company A makes a more secure product.

    More importantly, if legislation went through saying that companies were liable for insecurity and the damage that is caused, everything would triple in cost and the masses will piss soup in rage.

    • Done in 1. (I don't count the troll above you)

      Start fining the hell out of companies for knowingly exposing their customers to risk (any risk, whether security or e-coli) and companies will clean up their acts.

      Yes, regulating companies sometimes makes the end product cost more. That was true when airlines were regulated. We also didn't have incidents like ValuJet when airlines were regulated. Safety/security costs more up front, but costs less in the long term.

      • by 0123456 ( 636235 )

        Start fining the hell out of companies for knowingly exposing their customers to risk (any risk, whether security or e-coli) and companies will clean up their acts.

        No, they'll stop making stuff because unlimited liability for 'any risk' is simply insane. If they can't get insurance then there'd be no point being in business if you could be bankrupted at any time (e.g. Joe Loser sues Dell for selling a PC with Windows installed, which clearly exposes them to serious risks).

        • No, they'll stop making stuff because unlimited liability for 'any risk' is simply insane.

          This. I wish I had mod points today.

          Everything anyone does has risk. The only secure computer is one that is turned off. The only secure cell phone is one that has the battery removed. The only secure ... well, you get the idea.

          Absolute security is an impossible dream, an unreachable goal, and a continuous drain on money and time. At some point, we all have to weigh the cost/benefit ratios of what we are doing and get on with our lives. E.g., the value of getting to work greatly outweighs the risks involv

        • by cdrguru ( 88047 )

          How many companies make vaccines today? What companies make the chemicals used for executions?

          The risk became too great and just about everyone got out of the business. The last round of vaccine production for flu required the government to provide immunity to the manufacturer before they would do it.

          • Companies have stopped making and/or selling the chemicals for executions because of investor pressure.

            But these kinds of calculations by corporations have been going on for a long time, in many more ways than common people would think of as morally bankrupt. For example, for auto manufacturing design flaws, auto manufacturers regularly price out the cost of fixing the problem versus the cost of settlements to families of people who will be injured or killed. See Ford Pinto gas tank, GM truck gas tank.

        • by Bengie ( 1121981 )

          Well, the idea is to fine companies who don't try "enough". There will always be security problems, but companies that don't even keep up with the industry minimum should be heavily fined.

          How we determine this, I don't know.

      • The engineer who focuses too much on security is going to get fired for being behind schedule and making the company lose money.

      • Then expect no new devices to be released. And put the world into a worse recession..

        If you fine them too much then they will calculate that it isn't a profitable sector to be in... Then they won't be in the sector.
        There is a limit to how much prices can rise for a personal device. Airline travel can sustain a high-priced variant because the value of getting there faster is very high. However, for your personal device, getting the newest and greatest will not add any value for the customer if it is too expensive.

        • Open source is distributed for free, as-is, with no warranty, and plenty of disclaimers that the product may not be suitable for your purposes, or any other purposes.

          Unlike the other side of the road, where the code is a closely held secret, you pay for the privilege of using it, and there are generally at least implied warranties that the product is fit for consumer use.

          In short - if the company is willing to rape the consumer for huge profits, while supplying shoddy products, then they DESERVE to be sued

          • by tepples ( 727027 )

            Open source is distributed for free, as-is, with no warranty

            Regulation of the industry would likely make such disclaimers null and void.

              If the regulation is so wide-ranging that such a disclaimer is void, it would kill any sort of hobbyist activity (or, if you don't care about that, any sort of research in the industry) until a full solution (with ALL risks calculated) can be developed. Last I checked, no industry can claim such mastery, far less a particular member of an industry.

              The great news about that world is that we wouldn't need patents because no one would release their product till they knew they could dominate the market.

              • That is why you need to worry about regulations. If you are a small minority group you can get hit by a regulation that sees you as a fringe player.

          • by Anonymous Coward

            Open source is distributed for free, as-is, with no warranty, and plenty of disclaimers that the product may not be suitable for your purposes, or any other purposes.

            Unlike the other side of the road, where the code is a closely held secret, you pay for the privilege of using it, and there are generally at least implied warranties that the product is fit for consumer use.

            Umm... You might want to *read* the EULA for one of those 'other side of the road' products. They disclaim all liability that they're legally allowed to, and then additionally limit their liability to the price you paid for the product. You're in, *at best*, exactly the same situation with 'the other side of the road' as you are with Open Source.

            • No, you seem to have missed the point completely. The other side of the road CHARGES you for using their stuff. "I've got some great software, but you have to pay me if you want to use it, and you can never look at how it works!"

              On my side of the road, it's more like, "I've got some stuff that works sometimes, for what I want to do. You can use it, or you can improve on it, or whatever you like. But, be warned, it's just a hack to make things work the way I like."

              Do you still fail to see the difference?

              • by aix tom ( 902140 )

                It *could* be a small first step to just force companies to give a refund when they ship a faulty product.

                Like it is the case with any *REAL* product, which has to have a warranty.
                The manufacturer or seller can try a few times to fix a problem, but when they can't, the customer can demand his money back if it doesn't work as advertised. That should be applied to software sales in exactly the same way.

      • by dubl-u ( 51156 ) *

        Safety/security costs more up front, but costs less in the long term.

        Not necessarily true. If you blindly make producers liable for all risk, and pile on top of that a substantial regulatory framework, you could create costs well above benefits.

        I have a friend that makes jam. It's good jam. If she were to sell it at the farmer's market, people would happily buy it. And the sorts of people who buy jam at the farmer's market know what they're getting into. If by some fluke one of the jars doesn't seal properly, they'll deal with it. But in your world, she'd be exposing herself

        • There's a big difference between selling jelly at the farmers market and knowingly releasing devices that facilitate identity theft, or knowingly selling meat that was contaminated with feces when the guy working in the factory cut too deep.

          Note the key word "knowingly."

          • by dubl-u ( 51156 ) *

            There's definitely a big difference, which is why I think the "any risk" standard you suggest is too extreme.

            "Knowingly" is a good start. But there's a problem with that, too; it discourages knowing, or activities that lead to knowing, like investigation or research. A lot of the corporate criminals who caused the economic crash we're suffering from got away with it because they had plausible deniability. They just didn't know! And we happily ignored that they could have known, and probably should have know

          • There's a big difference between selling jelly at the farmers market and knowingly releasing devices that facilitate identity theft,

            Xerox, Canon, Ricoh, and several other companies knowingly manufacture devices that facilitate not only identity theft but copyright violation and child pornography. They're called "copy machines". Several companies knowingly manufacture devices that facilitate copyright violations, namely "DVD recorders".

            Many many companies knowingly distribute devices that knowingly allow the violation of many different laws. I can buy radios from Kenwood, Motorola, Tait, EF Johnson, and a host of other companies that a

            • Xerox, Canon, Ricoh, and several other companies knowingly manufacture devices that facilitate not only identity theft but copyright violation and child pornography. They're called "copy machines". Several companies knowingly manufacture devices that facilitate copyright violations, namely "DVD recorders".

              Oh come on. That's bullshit. You know as well as I that the intended use standard applies. Copy machines are not intended to be used for kiddie porn or counterfeiting. Conversely, meat is meant to be eaten

              • Oh come on. That's bullshit. You know as well as I that the intended use standard applies.

                 Whatever "standard use" policy you think applies does not change the fact that those companies knowingly make and knowingly distribute devices that facilitate illegal activities. That "standard use" policy may eventually mean they are not liable for producing those devices, but produce them they do. And you might note that "standard use" did not protect Napster.

                Conversely, meat is meant to be eaten,

                In your rush to call me a liar, did you even bother to note that I clearly differentiated between actions that are themselves a danger to others and act

                • In your rush to call me a liar, did you even bother to note that I clearly differentiated between actions that are themselves a danger to others and actions that are indirectly a danger? Like producing tainted meat is a direct danger, while producing a copy machine, or any other "device" with "security issues", is not.

                  Nope, I didn't notice that. And having re-read your post, I still didn't notice that, because you didn't differentiate anything.

                  But, given that you *meant* to differentiate, if you're acknowle

      • Agreed, except for "any risk." Sooner or later, companies will just stop trying to produce anything. The small private airplane market was a perfect example of this: The government assigned essentially indefinite liability to the manufacturer of an airplane, and after a while Cessna et al just quit making small planes.

      • by cdrguru ( 88047 )

        ValuJet got a bunch of oxygen generators loaded on a plane in spite of a strict regulatory environment. They partly adhered to the regulations and partly did not. There were no inspectors on site to verify compliance, and they took some shortcuts. No amount of regulation would have changed that unless they had on-site inspectors. The cargo handlers had a box to move and they put it on a plane to move it. They were not supposed to, they knew they were not supposed to but did it anyway.

        Alaska Air did sho

    • More importantly, if legislation went through saying that companies were liable for insecurity and the damage that is caused, everything would triple in cost and the masses will piss soup in rage

      No, it would simply force the hand of developers to release all security-related code under a GNU license to avoid the liabilities of being the maintainers of the software. That, or (very brave) specialised hardware/software security companies would start providing middleware for that purpose.

      • Brave? You needn't be brave. Just start a subsidiary company that does the security baloney, cash in, transfer the money and when the shit hits the fan, the subsidiary goes bankrupt.

      • Consultants, they will hire consultants to do the work. Then point their finger back at them when there is a problem and go BAD BAD consultants, then hire them for the next job. That is what the government does. If they need to do something that is politically risky they get a consultant to do it; if it succeeds the person gets the credit, if it fails they blame the consultant, which privately the consultant happily takes because he knows that he will probably get the next job as well because why would

        • by Dog-Cow ( 21281 )

          skate-goat

          Ruminants on wheels?

        • I was thinking of this just this morning. It seems we hear more and more about damn *stupid* security breaches. SQL injection, etc... heck, didn't the CitiBank credit card cracker simply modify the URL to scrape thousands of card numbers? Given what we know about outsourcing (not necessarily offshoring, but simply farming out the latest "Web 2.0!!!" design to companies like Accenture), it's not hard to believe that a lot of these faulty web sites were designed by one of a few companies.

          It left me wondering, "

          • What you forgot is that during the process the consulting company may really try to say you should do it this way, it is more secure, but normally it will not go through because consultants are not to be trusted.

            • It's been my experience (working for a subsidiary of an international bank) that the opposite is true. "Oh we should do what the consultants say, they do this all the time."

      • So the solution is to use the GNU license to avoid liability.

        You must work in the industry. Maybe as a CEO?

    • Remember that sales people typically make percentages based on sales. You don't get that percentage until you ship. So you get a lot of pressure to deliver quickly. And you can't do security in a rush. Typically your engineering head will do a security assessment and sales will go over it (usually in a series of small hops and jumps) and then ship anyways, because that's how they get paid. They'll have engineering bang out patches later on. If anyone complains.

      Bottom line is that engineers don't get

    • by mfh ( 56 )

      When Playstation Network was hacked I laughed, because I wasn't stupid enough to give them my personal info or a password used in multiple other places. I gave them a distinct password, and they never saw a dime from me over a credit card.

      When it comes down to it, what other people call paranoia, I call standard practice.

      • by Anrego ( 830717 ) *

        When it comes down to it, what other people call paranoia, I call standard practice.

        In a world where a huge company like Sony can fuck up on such an epic scale and get little more than a wrist slap.. and will probably keep right on doing business the way they've been doing... yup!

        Unfortunately it's hard to "not participate". Everyone wants all your personal info for everything. There are ways around this (temporary credit card numbers) but it's pretty hard to avoid giving someone enough data to do damage while still living a relatively enjoyable life.

        Also.. two digit UID.. jebus!

        • What bothers me most, from a pure security point of view, is that this pretty much turned the PCI-DSS into a weak joke and a laughing stock of the IT-Security community. Sony pretty much had to be compliant, i.e. get the cert. They stored credit card info, they are most likely even a level 1 (highest possible level, more than 6 million transactions annually (or already had a breach, i.e. if they were not, they are now), highest possible security risk) merchant, in other words, they pretty much had to get au

          • by Anrego ( 830717 ) *

            but I guess it starts to become visible outside the business now...

            Problem is it really doesn't.

            Sure, people think about it a bit when it's in the news.. and maybe down the road someone will be looking into something and this incident will be used as a case study... but for the most part... people forget this shit as soon as it's out of the headlines.

          • by maxume ( 22995 )

            If you start from the premise that the credit card companies are the ones that could go ahead and implement secure authentication (with card readers or token generators or whatever), the security of the whole industry is a joke.

            Of course, they are more worried about costs than security so it isn't a big surprise.

            • Forget security tokens or other security features that the customer would have to use. The customer doesn't give half a shit, if you "force" a security token on him, he'll use a different CC provider that doesn't. Especially since, hey, if someone abuses my card, the CC company will cover it, so why bother?

              That the merchant he bought at will most likely discontinue business with him (because he, eventually, gets to foot the bill) is another matter. And I guess a lot of people would be pissed if Amazon, EBay

              • by maxume ( 22995 )

                Chip cards seem to work for the Netherlands (but they are relatively small and the banking industry chose to work together on it).

                If American Express offered a secure payment system that meant I was authorizing single transactions to a single vendor, I'd use it in a heartbeat.

              • by maxume ( 22995 )

                And I guess the more sarcastic response is something like "Yeah, that's the part that is a joke."

                Or whatever. The general point is that the activities they classify as 'security' are largely tilting at windmills, at least when compared to what is technically possible.

      • So clearly you must own a Blackberry if you are concerned about security, since all other smartphones can be eavesdropped on. You must also have tinfoiled your house.
      • A service you paid for and had a right to have went down for a month or so and this made you laugh?

    • by danhaas ( 891773 )

      That "full responsibility" approach led the american health system to its present state.

      Sometimes you just have to learn to live with the risk, and try to manage it instead of eliminating it.

    • by Hadlock ( 143607 )

      Yep. Your job as a product manager is to

      1. Ship the product
      2. Ship the product on time and
      3. Do it under budget

      Pick any two. #1 is not optional. As long as conditions 1 and 2 or 3 are met you get to keep your job, and possibly a project completion bonus (if you're lucky). As long as security flaws aren't getting in the way of two of those three objectives, you can ignore them and patch them in a later firmware/software update.

      Complaining to your manager that you need to delay the p

    • by jafac ( 1449 )

      I don't understand this comment. What's wrong with the masses pissing soup? They could sell the soup and make money!

  • Nah, the author and submitter made a valiant attempt but the real reason is that we are "satisfied" to just release stuff and let the general public be un/underpaid debug labor.

    If all that debug was properly full-costed these companies would lose years of profits.

    • by Anrego ( 830717 ) *

      That and customers aren't willing to pay the costs of doing it properly. Especially when your competitor is not doing it properly and as such can offer their product cheaper than yours.

      Consumers are as cheap and greedy as the companies who make the products. Can't sell what people don't care about and aren't willing to pay for..

  • Engineers are saying their products are being rushed to market, and that they're not being given enough time to come out with a perfect product?

    What's the world coming to?

    Next thing you'll be seeing teachers complain about being underpaid and under-appreciated and the president saying that partisan bickering is preventing him from getting anything accomplished.

    Just because it's true doesn't make it news.

  • GM engineers discovered a safety problem in a vehicle they were designing, and designed an extra part to fix it. But management decided to save $5 per vehicle and skip it. GM ended up getting their cabooses sued off for that decision after the legal "discovery" process found out about the intentional shortcut. The jury handed them their ass.

    Perhaps a similar situation has to happen with software in gizmos before companies "care".

    • by kbonin ( 58917 )

      This is the real reason why most large companies now have email retention policies and auto-delete everything after 30..90 days.

      It is a cheaper "fix".

      • by dubl-u ( 51156 ) *

        This is the real reason why most large companies now have email retention policies and auto-delete everything after 30..90 days.

        It is a cheaper "fix".

        That is an incredibly important point. You could fix the email problem, but you can't fix people refusing to know. Almost everybody responsible for crashing our economy escaped accountability, and many of them claimed that they were blameless because they didn't know what was going on, after setting up companies in such a way that they were guaranteed to not know what was going on.

        It's an endemic problem in corporate America, and we need to find a way to fix it.

    • by 0123456 ( 636235 )

      GM engineers discovered a safety problem in a vehicle they were designing, and designed an extra part to fix it. But management decided to save $5 per vehicle and skip it.

      [citation needed]

      I remember some similar stories (the Pinto gas tank?) of poor engineering design in American cars that management wouldn't change until they had to, but I'm pretty sure the story as you tell it is an urban legend.

      • by Anguirel ( 58085 )

        Yes, the Pinto was the one that would be the origin of that sort of story. The Exploding Gas Tank could have been fixed by a $1 plastic bit, and they knew that before they went to manufacturing.

        http://motherjones.com/politics/1977/09/pinto-madness

  • there is a huge fucking difference between "oops we left the programming interface exposed so some hacker can rewrite the firmware in his xbox controller" and "oops we just gave all your personal data to the Chinese, don't enter any credit cards"

    And please drop this magic cloud of "embedded devices", just for the sake of clarity? Cause for fucks sake that could mean anything from the intelligent disk controllers in a C-64 to an iPad to an army rifle

    • by 0123456 ( 636235 )

      Take my webcam for example. Telnet to port 50000 and you get a root shell with no password required; took two minutes to discover that with nmap after I connected it to my home LAN.

      Or you did, as the first firmware upgrade removed that feature.
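
      A quick sanity check for this sort of open door is a plain TCP connect probe against your own device. A minimal sketch in Python (the address is a hypothetical LAN example, not any particular camera):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds before the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical LAN address; substitute your own device's IP.
if port_open("192.168.1.50", 50000, timeout=0.5):
    print("port 50000 is open -- investigate before trusting the device")
```

      A full nmap scan covers all 65535 ports at once, but even a one-off check like this catches the "root shell on a well-known port" case described above.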

    • by Andy Dodd ( 701 )

      Yup. It's interesting, some of the things done in the name of "security" actually piss off a vocal minority of technically-oriented users. This vocal minority is often trusted by less-technical friends to make recommendations on what to buy.

      As a result, a device that's locked down from tinkerers is going to get fewer recommendations from "trusted friends". A device that's open to tinkerers might have those tinkerers rave about their device to their less-techie friends.

      The problem is that a lot of routes u

  • Quite frankly, and in a nutshell: Why should a company spend time and money on securing a device if the customer does not honor it?

    Take two companies, A and B. A spends engineering time on working out and ironing out all the security bugs and flaws, ending up with a more expensive product than company B who doesn't. Net result? Customer goes and buys the insecure product from company B.

    Then there's that part where insecurity actually works in the customer's benefit. For reference, see DRM and how it gets ci

    • The insecurity that favors the customer is where companies are more inclined to spend their time and money.

      And "customers don't care" is not the same as "customers don't understand" or "customers don't know about it." Customers, when informed of a security issue, almost always care. I refer you to the classic slashdot car-analogy and ask yourself if you were informed, before purchase, of a serious vulnerability in your car, would you buy it? And if you bought it without knowing and were later informed, w

      • Try telling customers to develop unique passwords with special characters for every website they have an account with. They might care about security, but they care more about remembering their passwords so that they can log in.

        • I'll respond in the form of a comic:

          http://xkcd.com/936/ [xkcd.com]

          To make a password strong, 8 characters with punctuation, numbers and mixed case is not as great an idea as you might think.

          On the other hand, if you tell people to pick four words of varying length that normally don't have anything to do with one another, you get a pretty good password. It would invariably be longer than 8 characters and WAY harder for traditional cracking methods.
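
          To put rough numbers on the comic's argument, here is a back-of-the-envelope entropy sketch (the 2048-word list size and the 95-character printable set are illustrative assumptions; the math only holds if the picks are genuinely random):

```python
import math

def entropy_bits(choices: int, length: int) -> float:
    """Entropy in bits of `length` independent uniform picks from `choices` options."""
    return length * math.log2(choices)

# Four words drawn at random from a hypothetical 2048-word list:
passphrase = entropy_bits(2048, 4)   # 44.0 bits

# Eight fully random characters from ~95 printable ASCII characters:
random8 = entropy_bits(95, 8)        # about 52.6 bits

# xkcd's point: humans don't pick characters at random. A "word plus
# substitutions" password carries far fewer bits (~28 in the comic),
# while the four-word phrase is both stronger than that and memorable.
print(f"passphrase: {passphrase:.1f} bits, random 8 chars: {random8:.1f} bits")
```

          The caveat is that the words must come from a random process (dice, a script), not from a human free-associating, or the entropy estimate no longer applies.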

      • "Caring" is a meaningless word, unless proven with action. The question is how much resources, in both time and money, are the consumers willing to invest in order to be more secure.

        "Informing" the consumer is problematic, because once we get past some rock bottom basics about passwords and credit card numbers and phishing, the average consumer cannot understand the specific issues involved without enormous, tedious research and education which they just are not going to do. Informing sounds nice, but if

        • Actually, informing the consumer is the responsibility of the manufacturer and in many states in the US, failure to disclose such knowledge is a serious violation of law. We are talking about products shipping where the producer is already aware of problems and vulnerabilities aren't we?

          As for "...it is a lot of noise that makes the product look bad..." goes, that argument doesn't stop them from pushing EULAs in peoples faces and then expecting the user to abide by them.

          There are standards... or there were

            On your first point, it is an interesting question. When it comes to risk of life and limb, the law may be clear cut. Does Microsoft send us notification for every new known theoretical vulnerability? Did the manufacturer of the old wireless firewall/router sitting in the corner of your home office send you notification about every new hack that could compromise its security? I think the answer here is no, but maybe someone has useful information on that question.

            On your second point, EULAs protect the ma

      • And "customers don't care" is not the same as "customers don't understand" or "customers don't know about it." Customers, when informed of a security issue, almost always care.

        Oh yes, I can see the PSN being virtually deserted now.

        And oh yes, the people were mighty upset about it. I can still see the laments on many, many message boards what an outrage it is. They shut up quickly as soon as PSN went back online and they could play again.

  • Because its cheaper

  • Comment removed based on user account deletion
    • the common board game 'operation' was unquestionably fed from a 120 volt AC source

      I'm pretty sure that "Operation" has always been a battery-powered game.

      • by cdrguru ( 88047 )

        Operation was introduced in 1965 well after the time when things were "unquestioningly fed from a 120 volt AC source". There is no question it was always battery powered. Heck, I remember wanting one when it first came out when I was like 10 or something.

    • harkening back to the days of manufacturing before the CPSC, Americans basked in the glory of such products as stainless steel lawn darts and carcinogenic drink additives.

      Lawn darts weren't banned until 1988, the CPSC having been founded in 1972. Carcinogenic drink additives were not regulated by the CPSC, but rather the FDA. The two additives you are most likely referring to were actually both eventually ruled non-carcinogenic; one was never banned in the first place.

  • Most people here on Slashdot understand very well the "engineering" perspective of product development. We tend to believe that a better product will sell better and that, conversely, products that sell better are presumed to be better products.

    MBAs know better. What they know is that marketing, public relations and public image/perception is far more critical to "success" than quality.

    So is it any wonder that quality takes a back seat to marketing and releasing a product?

    • by 0123456 ( 636235 )

      MBAs know better. What they know is that marketing, public relations and public image/perception is far more critical to "success" than quality.

      No, that's what they believe... and in the short term they're correct. In the long term, however, it's hard to keep selling people crap when they've had too many bad experiences with your earlier products.

      Look at Sony, for example. My first two Sony camcorders lasted a decade each; in fact, I'm still using the DV camcorder I bought in 1996 because of the design flaw in the HD camera I bought in 2004 where if you remove the battery before the hardware has completely shut down it fries the logic board and cos

      • Good for you, Mr. Engineer. You display logic and wisdom that few people display.

        For example, people continue to vote for Democrats and Republicans and completely exclude alternatives despite the fact that the two leading "brand names" continually fail them. And Sony's continued success despite their quality issues is an important indicator that you are an anomaly and not a mainstream consumer. Mainstream consumers keep buying Sony because they believe Sony is cool technology.

      • I don't buy from Sony either, for the same reasons. However, I've seen more than enough people walking home with a Sony product under their arm to realize most people really don't care enough to do their research before buying. Heck, look at all the PS3s being sold, and the rabid fanboy community that exists around it.

        So now I just sit back and laugh when someone gets all indignant that their Sony product either failed or somehow abused the purchaser.

    • Security is only one element of a quality product. Adding a new feature or improving ease of use can increase a product's quality at the expense of security.

      • You are presuming they are always mutually exclusive. While it is often the case, it is not ALWAYS the case.

        But you are right in that people tend to favor convenience at the expense of security for consumer products. However, that only works when coupled with consumer ignorance, because once they discover there is something about their product that makes them or their information vulnerable to attack, they won't care that it was so they could have a more convenient user experience. They will just be pissed off...

  • At my company, we code in Java. Memory leaks never happen either.
  • And doesn't necessarily increase revenue. Besides that, in my history anyway, managers do not want to spend another $5k because a product is "More Secure". They would much rather put the $5k into a product with a dead-simple API than put it into some hypothetical circumstance which they have no direct experience with.

    Security is one of those things you can only truly understand by getting burned by it.

  • Do the devices have a low self-esteem?

    Or do you mean UNSECURED?

  • The more secure you make an embedded device or appliance against information leakage and harvesting-type vulnerabilities, the more likely it is to end up getting returned to stores by frustrated consumers who can't get it to work.

    Just look at WPA-2 -- it's unquestionably more secure than WEP. It's also rarely used in public settings because statistically, it never fucking works. You can take any access point, and any device that supposedly supports WPA-2, and know beyond doubt that there's about a 50-50 cha

  • If you ask me if a product I've worked on is 'secure', my immediate thought is 'what is your criteria?'. There are 'degrees' of secure and the line where someone says 'it's secure' shifts according to who's making the call. Some may say they 'secured' their unattended installer data because they base64 encoded the administrator password (looking at you, microsoft). They would argue they did enough to protect from over the shoulder (visual exposure only, with no opportunity to transcribe it to paper). Th
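
    The base64 point is easy to demonstrate; here's a minimal Python sketch (the password value is made up for illustration). Base64 is an encoding, not encryption, so anyone who can read the file can reverse it in one call:

    ```python
    import base64

    # A "secured" unattended-install file might store the admin
    # password base64-encoded (hypothetical value):
    encoded = base64.b64encode(b"Sup3rSecret!").decode()

    # An attacker who reads the file recovers it trivially:
    recovered = base64.b64decode(encoded).decode()
    print(recovered)  # → Sup3rSecret!
    ```

    It defeats shoulder-surfing of the raw file, and nothing more.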

  • I have worked long and hard in my profession to get devs to fix security bugs. The reaction mostly falls in one of these categories:
    1. I do not understand the issue (read, I am just copying code off the interwebs and have no clue about my job).
    2. I understand the issue but we are under the gun to release the product.
    3. I understand the issue but the vulnerability is theoretical (read, I don't understand anything about large scale production infrastructure)

    Bottom-line: Unless a security bug breaks funct
  • So what if my cell phone can access voicemail without a 15 character minimum password.

    So what if my Wii or Xbox can let people chat with me.

    So what if my GPS could theoretically be told to trick me into turning into a lake.

    For 99% of devices I buy, security "features" are an annoyance end users don't want.

    When I buy a post card - it's OK someone who theoretically intercepts the mail can read it. I understand that and won't write my credit card number on the back of it. The last thing I want is some

  • They surveyed engineers. Engineers *never* think they have enough time or resources for a project.
