Lessons From the Cyberattack On India's Largest Nuclear Power Plant (thebulletin.org)
Dan Drollette shares an article from the Bulletin of the Atomic Scientists, written by two staffers at the Center for Global Security Research at Lawrence Livermore National Laboratory.
"Indian officials acknowledged on October 30th that a cyberattack occurred at the country's Kudankulam nuclear power plant," they write, adding that "According to last Monday's Washington Post, Kudankulam is India's biggest nuclear power plant, 'equipped with two Russian-designed and supplied VVER pressurized water reactors with a capacity of 1,000 megawatts each.'"
So what did we learn? While reactor operations at Kudankulam were reportedly unaffected, this incident should serve as yet another wake-up call that the nuclear power industry needs to take cybersecurity more seriously. There are worrying indications that it currently does not: a 2015 report by the British think tank Chatham House found pervasive shortcomings in the nuclear power industry's approach to cybersecurity, from regulation to training to user behavior. In general, nuclear power plant operators have failed to broaden their cultures of safety and security to include an awareness of cyberthreats. (By cultures of safety and security, those in the field -- such as the Fissile Materials Working Group -- mean a broad, all-embracing approach to nuclear security, one that takes into account the human factor and encompasses programs on personnel reliability and training, illicit trafficking interception, customs and border security, export control, and IT security, to name just a few items. The Hague Communique of 2014 listed nuclear security culture as the first of its three pillars of nuclear security, the other two being physical protection and materials accounting.)
This laxness might be understandable if last week's incident were the first of its kind. Instead, there have been over 20 known cyber incidents at nuclear facilities since 1990. This number includes relatively minor items such as accidents from software bugs and inadequately tested updates along with deliberate intrusions, but it demonstrates that the nuclear sector is not somehow immune to cyber-related threats. Furthermore, as the digitalization of nuclear reactor instrumentation and control systems increases, so does the potential for malicious and accidental cyber incidents alike to cause harm.
This record should also disprove the old myth, unfortunately repeated in Kudankulam officials' remarks, that so-called air-gapping effectively secures operational networks at plants. Air-gapping refers to separating the plant's internet-connected business networks from the operational networks that control plant processes; doing so is intended to prevent malware on the more easily infected business networks from reaching industrial control systems. The intrusion at Kudankulam so far seems limited to the plant's business networks, but air gaps have failed at the Davis-Besse nuclear power plant in Ohio in 2003 and even at classified U.S. military systems in 2008. The same Chatham House report found ample sector-wide evidence of employee behavior that would circumvent air gaps, like charging personal phones via reactor control room USB slots and installing remote access tools for contractors... [R]evealing the culprits and motives associated with the Kudankulam attack matters less for the nuclear power industry than fixing the systemic lapses that enabled it in the first place.
"The good news is that solutions abound..." the article concludes, noting guidance, cybersecurity courses, technical exchanges, and information through various security-minded public-private partnerships. "The challenge now is integrating this knowledge into the workforce and maintaining it over time...
"But last week's example of a well-established nuclear power program responding to a breach with denial, obfuscation, and shopworn talk of so-called 'air-gaps' demonstrates how dangerously little progress the industry has made to date."
"Indian officials acknowledged on October 30th that a cyberattack occurred at the country's Kudankulam nuclear power plant," they write, adding that "According to last Monday's Washington Post, Kudankulam is India's biggest nuclear power plant, 'equipped with two Russian-designed and supplied VVER pressurized water reactors with a capacity of 1,000 megawatts each.'"
So what did we learn? While reactor operations at Kudankulam were reportedly unaffected, this incident should serve as yet another wake-up call that the nuclear power industry needs to take cybersecurity more seriously. There are worrying indications that it currently does not: A 2015 report by the British think tank Chatham House found pervasive shortcomings in the nuclear power industry's approach to cybersecurity, from regulation to training to user behavior. In general, nuclear power plant operators have failed to broaden their cultures of safety and security to include an awareness of cyberthreats. (And by cultures of safety and security, those in the field -- such as the Fissile Materials Working Group -- refer to a broad, all-embracing approach towards nuclear security, that takes into account the human factor and encompasses programs on personnel reliability and training, illicit trafficking interception, customs and border security, export control, and IT security, to name just a few items. The Hague Communique of 2014 listed nuclear security culture as the first of its three pillars of nuclear security, the other two being physical protection and materials accounting.)
This laxness might be understandable if last week's incident were the first of its kind. Instead, there have been over 20 known cyber incidents at nuclear facilities since 1990. This number includes relatively minor items such as accidents from software bugs and inadequately tested updates along with deliberate intrusions, but it demonstrates that the nuclear sector is not somehow immune to cyber-related threats. Furthermore, as the digitalization of nuclear reactor instrumentation and control systems increases, so does the potential for malicious and accidental cyber incidents alike to cause harm.
This record should also disprove the old myth, unfortunately repeated in Kudankulam officials' remarks, that so-called air-gapping effectively secures operational networks at plants. Air-gapping refers to separating the plant's internet-connected business networks from the operational networks that control plant processes; doing so is intended to prevent malware from more easily infected business networks from affecting industrial control systems. The intrusion at Kudankulam so far seems limited to the plant's business networks, but air gaps have failed at the Davis-Besse nuclear power plant in Ohio in 2003 and even classified U.S. military systems in 2008. The same report from Chatham House found ample sector-wide evidence of employee behavior that would circumvent air gaps, like charging personal phones via reactor control room USB slots and installing remote access tools for contractors... [R]evealing the culprits and motives associated with the Kudankulam attack matters less for the nuclear power industry than fixing the systemic lapses that enabled it in the first place.
"The good news is that solutions abound..." the article concludes, noting guidance, cybersecurity courses, technical exchanges, and information through various security-minded public-private partnerships. "The challenge now is integrating this knowledge into the workforce and maintaining it over time...
"But last week's example of a well-established nuclear power program responding to a breach with denial, obfuscation, and shopworn talk of so-called 'air-gaps' demonstrates how dangerously little progress the industry has made to date."
Wait, What? (Score:5, Insightful)
employee behavior that would circumvent air gaps, like charging personal phones via reactor control room USB slots
Reactor control rooms have USB slots? WTF?
Re: (Score:2)
You might argue that the need to install upgrades is a serious problem, and you would be right, but the industry doesn't care.
Re: (Score:2)
They need to export data, and of course, install upgrades.
You might argue that the need to install upgrades is a serious problem, and you would be right, but the industry doesn't care.
A proprietary socket interface would solve this problem.
What else can we solve here today?
Re: (Score:2)
Do both.
Re: (Score:2)
What does any of that have to do with these so-called "USB Slots", whatever they are? And how does "USB Slots" in the control room have anything to do with "export data" or "install upgrades"? Clearly you have never heard of this 1970s technology called a network ...
Re: (Score:2)
The machines in question are air-gapped. Only carefully vetted data is supposed to be carried back and forth, with no permanent connection. But if the way this happens is supposed to be through trusted USB devices plugged into USB ports, it's bad if untrusted devices with network connections are plugged into those USB ports.
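As a concrete illustration of what "carefully vetted data" can mean in practice, one common pattern is to accept a file carried across the gap only if it matches a pre-approved hash manifest prepared on the trusted side. This is a minimal sketch; the manifest entry, file name, and placeholder digest are illustrative assumptions, not details from the incident or the thread.

```python
import hashlib
import sys
from pathlib import Path

# Hypothetical manifest: file name -> expected SHA-256 hex digest,
# prepared and approved on the trusted side before the media is carried over.
APPROVED = {
    "patch_2019_10.bin": "0" * 64,  # placeholder digest, not a real value
}

def is_approved(path: Path) -> bool:
    """Accept a file only if its name and SHA-256 digest match the manifest."""
    expected = APPROVED.get(path.name)
    if expected is None:
        return False
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest == expected

if __name__ == "__main__":
    for arg in sys.argv[1:]:
        p = Path(arg)
        print(f"{p.name}: {'ACCEPT' if is_approved(p) else 'REJECT'}")
```

The point is that the decision about what is allowed across is made before the media ever reaches the machine, not by whoever happens to plug it in.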
Re: (Score:2)
And why would this be occurring in the control room? And who would be doing it do you suppose? The Operators? Hahahehehehhohoho.
Operators operate the plant. They are neither charged with nor responsible for "exporting data" or "installing upgrades." Someone has no concept of what a Process Operator (as in the Console operator) actually does or is paid to do.
Re: (Score:2)
I think you misunderstand the term "air-gapped". The consoles cannot be "air-gapped" -- they need to communicate with the control systems and instrumentation. The entire process control environment may be "air-gapped" from the "entertainment" network (alternatively called the "business" network) where nothing much of consequence occurs, but the console stations cannot be isolated from the control system -- otherwise there would be no point in having them at all, would there?
Re: (Score:2)
Now I know you're just trolling. :P They're not air-gapped from power, either...
Re: (Score:2)
equipped with two Russian-designed and supplied VVER pressurized water reactors
Well at least we know the attackers weren't Russians this time.
Re: (Score:2)
> They need to export data, and of course, install upgrades.
eSATA might be a better option for those who want a smaller attack surface.
Re: (Score:2)
The cheap computers that run them do.
Re: (Score:2)
Oh, so a "USB Slot" means a "USB Port", and in particular an accessible USB Port on a console computer. This seems rather far-fetched to me. However I suppose it is possible that there are idiots who do that sort of thing.
Re: (Score:2)
Oh, so a "USB Slot" means a "USB Port", and in particular an accessible USB Port on a console computer. This seems rather far-fetched to me. However I suppose it is possible that there are idiots who do that sort of thing.
You mean people who get hung up on obvious things that have nothing to do with the conversation? Or get enraged when they see a thesaurus?
Most of us instantly figure out that USB slots are USB Ports. Or USB connectors, or USB plugs, or USB sockets. I'll use USB* so as not to trigger ya.
USB* has long been known as an attack vector. I've seen it in action. It is pretty easy to gain access to a USB* input. One example is: go to a trade show and get a freebie USB Flash Drive, AKA Thumb drive, AKA Geek Stick...
Re: (Score:2)
The attack vector is the "bus" hanging outside the computer and the Operating System which will execute things willy nilly without being told to do so and without the capability of telling it not to do so. Specifically, it is a Windows problem. And there is no problem with USB "slots". You can go to the corner store and buy power adapters with a standard plug on one end and a USB "slot" on the other. What is the problem?
The problem is exactly what it usually is: uneducated people with no knowledge of what they speak blurting out their ignorance and leaving out facts, thus causing other (perhaps well-meaning but also ignorant) people to make assumptions of fact and go off the deep end for no reason, causing lots of panic and noise about nothing much at all.
This is how snake-oil salesmen sell snake-oil.
Actually, it is people that try to take the conversation into ridiculous directions by being so pedantic that they turn others off.
For better or worse, Windows is what people use because they use Windows because they use Windows because they use Windows. So one must make the poor operating system as safe as they can. You aren't going to turn most people into computer experts. You especially aren't when you walk around and ridicule some poor schmedlock because they didn't use the term you demand is correct...
Re: (Score:2)
No, I am simply saying that complaining about "USB Slots" is completely ridiculous because lots of stuff has "USB Slots" and without providing useful information simply claiming that "someone plugged a phone into a USB slot in the control room" is like saying "someone plugged a desk lamp into an electrical outlet". It conveys zero useful information and leads to people jumping to asinine conclusions.
Re: (Score:2)
Oh, the irony of you calling anyone an idiot.
Re: (Score:2)
And why would you say that? I guess you believe that the USB "slots" on HMI computers should be available and functional for random passers-by to plug things into? Perhaps you should be joining the league of idiots, as only idiots hold such beliefs.
Re: (Score:2)
There are no quotes around slots you fucking moron. You plug USB devices into the slot of the USB port. Learn to shut the fuck up when you've made yourself look like an idiot if you aren't man enough to own up to your stupidity.
Re: (Score:2)
This is the hill you want to die on? USB Slot vs USB Port?
Re: (Score:2)
There's nothing cheap about the computers provided as operator stations. They are full of completely pointless hardware. Oh and they don't run reactors. They only provide input to the very expensive "not at all anything like a computer" control system that runs reactors.
Re: (Score:3)
Reactor control rooms have USB slots? WTF?
The article did talk about this being flaws in the mind-set. But yes, they do. How would you upgrade the firmware on such a thing otherwise, or remove log data? It is air-gapped, remember? However, these USB slots should be behind two-lock covers that can only be opened with special procedures and absolute prohibitions against connecting anything but the intended devices.
Side note: I once came to know that some electrician had connected his phone to a USB port on a secure server to charge it in a server room where...
Re: (Score:2)
The Console Stations are not air-gapped from the Control Network. The Control Network is air-gapped from the entertainment (business) network. There is no need for USB Access to a console station when it is being used as an Operations station.
Re: (Score:2)
Talk about stating the obvious and missing the point. If there are USB ports, then they are either needed or somebody screwed up massively by not closing them up, nicely illustrating the point of the article.
Re: (Score:3)
Nuclear plant cyber security (I am speaking of the plant systems, NOT the admin business networks) is way ahead of most other industries. They have thought of everything discussed on this thread and ten times more. Unfortunately, we'll always have these Dunning-Kruger level posts based on misinformed articles.
Re: (Score:2)
From the story: " like charging personal phones via reactor control room USB slots"....
Re: (Score:2)
So? Go to the corner store and buy a USB power adapter and have the System/Instrumentation Engineer plug it into an outlet for you. You now have a USB port in the Control Room that can be used for "like charging personal phones via reactor control room USB slots" ...
The article (which was a reference to a study by some bozos and not related to the incident that is the subject of the article) merely specified that there was a "USB slot" located in the control room and that someone used it to charge up their phone...
Re: (Score:2)
You have no idea what you are talking about. In an IT security analysis, a USB port of concern is most certainly one connected to a system under evaluation. It is a standard evaluation item treated in a standard way. Obviously you have never seen such an evaluation report.
Re: (Score:2)
Updating the firmware on your nuclear plant should be a major, carefully controlled and monitored operation. Replacing a HDD seems like a better way to do it, or even replacing the whole control PC. After all you want to stage it and test it out first before putting it in a production system.
For logs I'd suggest having data spat out over a unidirectional RS232 link onto a logging machine. Even if it gets p0wned it can't do anything.
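A minimal sketch of the receive-only end of that idea, assuming the pyserial library and a cable wired so the logging machine has no transmit path back toward the plant side; the port name and baud rate are illustrative assumptions.

```python
import serial  # pyserial, assumed to be installed on the logging machine

PORT = "/dev/ttyS0"  # hypothetical serial device
BAUD = 9600

def run_logger(logfile: str = "plant.log") -> None:
    """Append every line received on the one-way serial link to a log file."""
    with serial.Serial(PORT, BAUD, timeout=5) as ser, open(logfile, "ab") as out:
        while True:
            line = ser.readline()  # returns b"" on timeout
            if line:
                out.write(line)
                out.flush()

if __name__ == "__main__":
    run_logger()
```

Because the transmit line is physically absent, even a fully compromised logging box has no channel back into the control side.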
Re: (Score:3)
Updating the firmware on your nuclear plant should be a major, carefully controlled and monitored operation.
General statements like this demonstrate ignorance of nuclear control systems. There are many control systems at a nuclear plant. There are highly secured, segmented, and isolated safety control systems (not just one, but multiple ones), there are more elaborate turbine control systems that are integrated with other production systems, and there are various monitoring systems that are much less critical and provide data throughout the plant.
Critical safety systems are actually based on quite simple logic
Re: (Score:2)
I do this kind of thing (not nuclear, but life critical safety systems) for a living. I write the firmware and sometimes manage deployment.
You may think that the system is segmented and isolated and you can safely update one part from a USB flash drive at little risk to anything, but you are wrong. All those systems interact in ways that can be difficult to predict. Even if something seems trivial it may end up being critical, e.g. a particular display is referenced in the emergency operations and if it happens to be not working when the operator gets to that step you have a major problem.
Re: (Score:3)
I do this kind of thing (not nuclear, but life critical safety systems) for a living. I write the firmware and sometimes manage deployment.
You may think that the system is segmented and isolated and you can safely update one part from a USB flash drive at little risk to anything, but you are wrong. All those systems interact in ways that can be difficult to predict. Even if something seems trivial it may end up being critical, e.g. a particular display is referenced in the emergency operations and if it happens to be not working when the operator gets to that step you have a major problem.
Don't tell me it's all automated. If you are 100% reliant on automated systems with no manual backup you are screwed. That was part of the problem at Fukushima and Chernobyl - loss of instrumentation and automated safety systems.
Even if safety logic shouldn't change, that doesn't mean there won't be a need to update the firmware. Over and over again we have seen that systems thought to be safe were not and needed to be altered as experience revealed flaws. Simple locks and access controls are also inadequate; you should be using signed binaries at the very least.
I really hope you are not in charge of nuclear safety.
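For what the "signed binaries" suggestion quoted above amounts to in practice, here is a minimal sketch of verifying a detached Ed25519 signature on a firmware image before it is accepted. The cryptography library, the placeholder public key, and the file layout are assumptions for illustration, not details from the thread.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Hypothetical vendor public key, provisioned on the device at build time.
VENDOR_PUBKEY_HEX = "00" * 32  # placeholder, not a real key

def firmware_is_authentic(image_path: str, sig_path: str) -> bool:
    """Return True only if the detached signature over the image verifies."""
    pubkey = Ed25519PublicKey.from_public_bytes(bytes.fromhex(VENDOR_PUBKEY_HEX))
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        pubkey.verify(signature, image)  # raises InvalidSignature on any mismatch
        return True
    except InvalidSignature:
        return False
```

The update tooling would refuse to flash anything for which this check fails, regardless of how the file arrived.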
Once again, you demonstrate your ignorance of nuclear systems. Your assumption that they are similar to what you are doing is based on ignorance.
Display errors are not 'major problems' because there are redundant methods to validate operational parameters. Demonstrating your ignorance again.
Fukushima's problem had nothing to do with cybersecurity and everything to do with a tsunami deluge hitting a plant that was not designed to withstand it. More demonstrable ignorance on your part.
Nuclear systems are
Re: (Score:2)
At least read my posts properly before responding.
Fukushima's meltdowns could have been avoided if it had not been for the general confusion in the immediate aftermath brought on by the failure of monitoring systems and lack of manual backups.
Re: (Score:2)
"You may think that the system is segmented and and isolated and you can safely update one part from a USB flash drive at little risk to anything, but you are wrong. All those systems interact in ways that can be difficult to predict. Even if something seems trivial it may end up being critical, e.g. a particular display is referenced in the emergency operations and if it happens to be not working when the operator gets to that step you have a major problem."
If the data is displayed on an operator display (
Re: (Score:2)
An example of a safety system with a display is a dosimeter. Famously one of the issues at both Chernobyl and Fukushima was lack of suitable dosimeters for the staff, making it harder for them to assess the situation and prevent it from getting worse.
Re: (Score:2)
Critical safety systems are actually based on quite simple logic. They should almost never need to be updated as the safety logic doesn't change. There could be system upgrades, they are planned. The systems are in limited access rooms, inside locked cabinets. Work orders are carefully planned. Multiple checks, one person cannot perform any such task independently, every step is double checked. And if any potential cyber asset is involved, a cybersecurity expert reviews the work and puts the necessary controls in place.
In a Russian-designed power plant installed in India?
Russian attitudes towards such elaborate procedures are legendary, and Indian attitudes are, if anything, a step down from there.
Tell us again about your extensive experience in the reactor business in the Western world that is laughably divorced from what happens in the Near East. Oh, I see you did. Carry on.
Re: (Score:2)
employee behavior that would circumvent air gaps, like charging personal phones via reactor control room USB slots
Reactor control rooms have USB slots? WTF?
There are business networks in plants, completely separate from the control systems. A work laptop might have a usb port. But it doesn't connect to the plant. You certainly can't plug a usb stick into a safety control system. The article is written by an ignorant author.
Re: (Score:2)
You certainly can't plug a usb stick into a safety control system.
I guess you've never looked at the shape of the port on the bottom of the main processor modules of Schneider's Nuclear 1E certified safety systems. Or looked at the very normal computers that are connected to the TSAA network that controls it. Or for that matter any part of any modern control system.
USB is everywhere. On the operator consoles. On the engineering workstations. On the support systems. And even on the control systems and safety systems directly.
The article is written by an ignorant author.
Nowhere near as bad as your post.
Re: (Score:2)
Reactor control rooms have USB slots? WTF?
You may be surprised that nuclear reactors are controlled by these things called "computers". A "computer" is a device not unlike the thing you are using to read this right now, they have mice (USB), keyboards (USB), and require things like software patches (often delivered via USB).
Re: (Score:2)
No, the control systems are proprietary and designed specifically for purpose. The Supervisory systems often run on commodity computers running commodity Operating Systems, but these devices do not actually "control" anything.
Re: (Score:2)
Stupid pedantry makes every conversation better. /s
Re: (Score:2)
If people cannot be uselessly and mindlessly pedantic, what would they contribute? Many people have no actual clue from where they could derive a meaningful comment.
Also google("USB slot"): About 5’580’000 results
Re: (Score:2)
You're not impressing anyone.
Re: (Score:2)
They just use off-the-shelf PC hardware running Windows and put a note in the employee handbook about not plugging flash drives in.
This is absolutely false. Critical plant safety systems are in no way run on Windows PCs. That is a 100% fabrication. You can't even describe a nuclear critical control system, its elements, or how it is managed.
If you want to make assumptions, fine, just start your post with admission that you have no clue what you are talking about.
Re: (Score:2)
So enlighten us, what do they run?
Because when I've actually been to nuclear power stations and taken a look at their systems I see Windows and Siemens SCADA software, for example.
In fact the fire alarm panel and smoke extract system were both running Windows too. The original system was using DOS, installed back in 1995. Later updated to Windows XP around 2005. Last year upgraded to Windows 10. Industrial PC in a fire alarm panel. I wrote most of the firmware for the nodes.
Re: (Score:2)
So enlighten us, what do they run?
Even a rookie industrial engineer knows these are PLC type systems, usually triple redundant. NOT windows. You probably don't know what you are looking at when at 'power stations', and your assumption that nuclear plant safety systems are the same as any plant you've been in is an uninformed one.
Microsoft Defender updates (Score:1)
Re: (Score:2)
The best practice is to never install updates.
Thus as something works today, so it will work tomorrow. For every instance of today and tomorrow until the heat death of the Multiverse.
The "entertainment" systems (what you would call the Business Network, the one run by the IT 1 D 10 T folks) because it does not really matter whether that stuff works or not since it has zero impact on the real world. They can afford multi-week loss of view and loss of control since they have nothing of import to view and no
Re: (Score:2)
Did they make sure to install the latest updates? This is a critical part of the security posture at all nuclear power plants.
Malware was found on the business network; it has nothing to do with the plant control network. This entire article is based on ignorance of what actually occurred.
Re: (Score:2)
Indeed! And the Business Network is an "Inherently Malicious External Network" operated by Clowns from the perspective of the Control Network operators.
Re: (Score:2)
Omg both of you stop! I was kidding! It was a joke! A tiny joke only intended to elicit a brief smile in passing but a joke nonetheless. Oh the agony!!!
Unfortunately, there is so much ignorance posted here that jokes are hard to spot.
Air gap works (Score:3)
Secondly, these kinds of scare stories are driving some kind of agenda. I don't know what that agenda is, but the nuclear power plant wasn't breached, according to the article.
Re: (Score:2)
The agenda is likely to scare people straight. I was teaching a seminar a couple years ago, and one of the attendees was sharing how proud he was and the commendations he received for his “innovative” use of a raspberry pi to avoid a costly PLC replacement in a critical environment.
Explaining a dozen or so issues with the approach took the next few hours. People don’t inherently “get” security— it really needs to be taught.
Re: (Score:2)
use of a raspberry pi to avoid a costly PLC replacement in a critical environment.
What's wrong with that? Because of the Wifi, bluetooth, and USB?
Re: (Score:2)
All of those can be turned off, just like on any other computer.
Re: (Score:3)
Because the primary goal of the Pi was to be built as cheaply as possible. It was meant as a tool for students, and before the Pi you couldn't get a single-board computer for under $100. I don't know about you, but I wouldn't be replacing mission-critical controllers with something that changes parts between runs to keep costs down.
Re: (Score:2)
The primary goal of most equipment (including what is referred to as a PLC) is to build it as cheaply as possible and sell it for the highest price the market is willing to pay. All "Mission Critical Controllers" (however you want to define those) change parts between runs to keep costs down.
Re: (Score:2)
Indeed it is. But cheaply as possible in the control systems / PLC world still involves complying with a world of testing and certification requirements, including QA on coding and design. "Mission Critical Controllers" don't change parts on a whim to keep costs down. They change parts every few years on a cost review after a shitton of testing and design verification. Their parts are high cost to begin with precisely because of the reliability requirements placed on them by customers. The lack of this quick...
Re: (Score:2)
Using a Pi or any other "general purpose" solution for something that needs to run for years without intervention is an obvious problem with the risk management and hazop processes in that it permitted such a device to be used in such service. Unless of course that *was* taken into account and deemed irrelevant to the particular use to which the device was being put.
You are equating things which are not equal.
Re: (Score:2)
The Raspberry Pi Compute Module is suitable for that kind of use. They keep the BOM consistent and they are used in various industrial applications. They are nice modules, fairly low cost and much better supported than most SoM offerings. Plus you can use the full size Pi as a development platform which is very handy.
Re: (Score:3)
We used them to replace one expensive PC with three cheap Pis at one company I worked for; we used a LOT of them, bought them in bulk, at one point our entire country's stock in fact. An SD would fail at least once a week, a Pi once a month across ALL the branches in the company. The software was changed to cater for it, each branch had spares to replace where needed, suspect...
Re:Air gap works (Score:5, Insightful)
These things are _not_ reliable. And they have complex software on them that can behave in unexpected ways. PLCs come with extensively tested and assured reliability stats and reliability assurances far beyond "it does not break". A Raspberry Pi hobbyist device comes with "it will work for a few years if you are lucky and it may randomly make errors". A Raspberry Pi does not even have ECC memory or a reliable MCU on it and its function is certainly not fully tested. It is "cheapest possible".
Re: (Score:2)
So? How do you know that these risks were not assessed as part of the commissioning process? Mitigation of the "it will only last a couple of years" is simple. Buy ten of them and pre-load them with the appropriate software and put them on the shelf. It is still cheaper than buying a PLC, especially for non-control use. Even I would have objections to using a Pi based device "on-control", but "critical environment" is in the eye of the risk assessor and not an external observer who is likely not completely...
Re:Air gap works (Score:4, Insightful)
If you fully test a Raspberry Pi, you end up at a price-tag higher than a PLC. You have to create the whole testing process, the equipment, etc. Basically you need to design a PLC based on it. Sure, if there is no secondary damage when it starts to behave in an arbitrary way (which a PLC will not do), you can do it. It is still probably more expensive overall though. And "buy 10"? Have you overlooked that you also need to archive the whole process, all software and system images, and that there are components on a RPi that do have limited shelf-life?
I do get that PLCs have inflated prices. But replacing it with a hobbyist component is exactly the mind-set that later on causes catastrophes.
Re: (Score:2)
You are wrong. Improper and incomplete risk assessment and hazop procedures "later on causes catastrophes". Deploying something using a Pi where proper risk assessments and hazops have been performed does not "later on causes catastrophes".
Re: (Score:2)
And the other points I have made, you just gloss over? You are a hack.
The point is that if you follow proper procedures, there is no place where an RPi will give you an advantage over a PLC, except in functions that basically do not matter and a PLC should probably not have been used in the first place.
Re: (Score:2)
The fact you use the word HAZOP shows you don't actually know what you're doing. HAZOPs are for process. CHAZOP and FMEA are for control systems. The process of conducting a detailed CHAZOP and FMEA is more expensive than a small PLC, not to mention that these two processes will straight away find the Raspberry Pi not suitable for anything mission critical at all.
You're saving pennies in the most dangerous of places.
Re: (Score:2)
Thanks. It seems it becomes pretty clear why ICS security is such a mess: Incompetence of actors in that space.
Re: (Score:2)
Mitigation of the "it will only last a couple of years" is simple.
If you propose going through the process of installing a mission critical system which will only last a few years as part of the design you will be laughed off whatever project you are on. In any case it's clear you don't actually work in this field. If you did you'd realise control systems don't cost much at all. Not compared to the engineering hours put into design and verification by the purchaser.
Re: (Score:2)
Indeed. The main cost in such things is engineering hours. That is if they are done right. Even a simple risk assessment for an RPi will already be more expensive than a PLC where you can just look at the datasheet to find what assurances you actually have. Also, that PLC will have long-term availability and after it goes out of availability, there will usually be a drop-in replacement. That is worth a lot.
Re: (Score:2)
There you go, assuming "mission critical".
Re: (Score:2)
And what pray tell are those issues?
A Raspberry Pi can certainly be on the same reliability scale as a dedicated PLC, and can certainly be packaged to meet whatever environmental requirements are required. It is more versatile and programmable than most PLCs and can be more trivially and completely made safe and secure.
Re: (Score:2)
No. Seriously not. Even at the very low end of reliability, a PLC is in a whole different league.
Re: (Score:2)
gweihir has gone through a number of the concerns I had, but the focus at the time was primarily that they connected it to the secure network without validating or auditing software, without disabling wireless, without a patch management plan, without documentation, with insufficient functional testing, and without disabling included software.
While we have recommended and used low cost single board computers in a pinch, using it as a direct PLC replacement for essentially ladder logic PIDs is a mistake. (Th...
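As one small example of the kind of audit being described, a machine headed for a supposedly isolated network can at least be checked for wireless interfaces before it is connected. A minimal Linux sketch; the sysfs layout is standard, the rest is an illustrative assumption.

```python
from pathlib import Path

def wireless_interfaces() -> list[str]:
    """List network interfaces that expose a wireless/ entry in sysfs."""
    return [p.parent.name for p in Path("/sys/class/net").glob("*/wireless")]

if __name__ == "__main__":
    found = wireless_interfaces()
    if found:
        print("WARNING: wireless interfaces present:", ", ".join(found))
    else:
        print("No wireless interfaces detected.")
```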
Re: (Score:2)
A Raspberry Pi can certainly be on the same reliability scale as a dedicated PLC
You're an idiot who has never looked at a PLC, and I really mean looked. As in simply from the outside, not even having to check part numbers or design of equipment; just looking from a distance alone will show you why one will fail in a couple of years, and the other is designed to last 15+.
Hint for the ignorant: Conformal coating.
After you learn what that is, maybe you should start looking at the hardware from closer than 1m away and you'll learn a whole lot of new reasons why your comment is incredibly stupid...
Re: (Score:2)
An utterly unreliable Raspberry Crap as replacement for a critical component? The mind boggles.
Re: (Score:2)
There was no claim that the Pi was replacing a PLC in a "critical-component", merely that it was in a "critical-environment". These are two entirely different things. For example, an ambient temperature sensor on the scaffold around a tower may be a "critical-environment", however it is not a "critical-component" nor a "critical function". Using a Pi to relay leakage information for a leak detection system as an L1 Alarm is a "critical-component" of a "critical-safety-system" in a "critical environment".
Re: (Score:2)
You do get that this is dumbed down for public reporting, right?
Re: (Score:2)
The agenda is to sell snake-oil. Lots of snake-oil.
Re: (Score:2)
Air gap works if you implement it correctly.
And that is the kicker: you need people with a clue. These tend to be more expensive and have less tolerance for abusive working conditions. Hence the industry standard is to use the cheapest possible, or cheaper. Why do you think there is so incredibly much bad software out there?
Also, there is nothing wrong with using USB for data transfer. Using, say, serial connections would not make things any better. Even punch cards or punch tape would be subject to the same attacks. The attacks so far have basically all...
Re: (Score:2)
Air gap works if you implement it correctly. If you implement it poorly, then it's still better than any other security measure implemented poorly. To begin with, using USB to transfer files is a mistake. There are so many other options that work better, and one of them should be used. In fact, no air-gap exploit would have succeeded to date, if it weren't for USB.
Sealing the USB slots with epoxy and disabling DVD drives helps maintain an air gap. Of course, nothing is foolproof as fools can be very ingenious.
Re: (Score:2)
Air gap works if you implement it correctly.
I would argue air gaps make people complacent. In general I see more companies take security seriously when they *don't* have air-gapped networks. Mind you, give someone rope and they will use it to hang themselves. I've also seen companies fuck up security completely.
Charging phones? Child's play. I know someone who plugged a 3G modem into their operator station on nightshift back in the day (not a nuclear reactor, but a major hazard facility nonetheless) and used an engineer's password to fire up a browser...
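Short of epoxy, one partial mitigation for stories like this is simply making unexpected hardware visible: watch the operator stations for hot-plugged USB devices and raise an alarm. A minimal sketch using the Linux pyudev library; the library choice and the print-as-alert stand-in are assumptions for illustration.

```python
import pyudev  # Linux-only wrapper around libudev device events

def watch_usb_hotplug() -> None:
    """Report every USB device added to this machine."""
    context = pyudev.Context()
    monitor = pyudev.Monitor.from_netlink(context)
    monitor.filter_by(subsystem="usb")

    for device in iter(monitor.poll, None):
        if device.action == "add":
            vendor = device.get("ID_VENDOR_ID", "unknown")
            model = device.get("ID_MODEL_ID", "unknown")
            # A real deployment would feed an alarm system instead of printing.
            print(f"USB device added: vendor={vendor} model={model} "
                  f"path={device.device_path}")

if __name__ == "__main__":
    watch_usb_hotplug()
```

It will not stop anyone from charging a phone or plugging in a modem, but the event is no longer silent.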
Re: (Score:2)
You're full of shit. People have demonstrated using hard drive LED activity and even inaudible sounds from internal speakers to leak data.
This is not about leaking data. This is about getting data in there. You do not understand the security model here.
Re: (Score:2)
Almost all continuous control data in almost all control systems is unclassified. Very little of it has any value to anyone at all outside of the Process Operator. In most cases you are entirely welcome to the data -- if you think you could handle it -- there is a LOT of it. I doubt that your Internet connection could handle the volume.
So...what did we learn? (Score:2)
So what did we learn?
While reactor operations at Kudankulam were reportedly unaffected, this incident should serve as yet another wake-up call...
TFS does not tell us what we learned, other than that the security was fine. Air gapping worked, and only the less secure business network was impacted.
The entire rest of the summary is just FUD.
Georgia Tech is producing electrical plant experts (Score:2)
Georgia Tech is one of the top three schools for cybersecurity*. They've recently started a master's degree program in cybersecurity for power plants and the electric grid. Pretty soon they'll be graduating 100-200 people with master's degrees in plant security every semester. It will be interesting to see what happens when all of those people go out to get jobs in the sector.
* Aka information security. The government calls it cybersecurity, sorry if you don't like the term.
Re: (Score:2)
It will be an unmitigated disaster of checklist idiots playing checklist while having no understanding of the underlying concepts. They will "believe" whatever they are told and thus will do stupid things that will lead to disaster. In all likelihood only 2 of the 100-200 people with Masters Degrees will understand that everything is a lie, especially when it comes from the lips of someone who wants you to buy/use their product.
Re: (Score:2)
Clearly you've never completed graduate level courses at a top university. There's nothing checklist idiot about this work.
The people making checklists will hopefully look at the work we're doing at OWASP, ISC2, and other organizations. Maybe they'll even cite our recommendations. The work at OWASP and ISC2 cites the research we're doing at Carnegie Mellon and Georgia Tech.
I've been working full time in the infosec field for 20 years, programming infosec systems and teaching security to programmers. Graduate...
One example course (Score:2)
Just to give you a feel for it, I just completed a course which has these projects as requirements:
Break Diffie-Hellman (TLS/ssl) in two different ways.
My exploit would allow me to listen to your VPN traffic, for example, on many VPN endpoints.
Bypass the typical protections against cross site scripting, cross site request forgery, and SQL injection in order to exploit a site in three different ways - even though the programmer included protections against these attacks. My exploits would allow me to wire
Re: (Score:2)
And these are all completely useless in the safety and security of a Control System.
Security theater (Score:2)
Here is an example: boarding an international flight - especially Air India from an airport like Bombay... you have to clear security three times by three different agencies *after* you get your boarding pass. What they are trying to do is mysterious.
Indian bureaucracy is Kafka on steroids. At a nuclear plant they will have gun toting soldiers guarding every entrance and mu
Re: (Score:2)
This is a valid 3G defense posture (3G means Guards, Guns, and Gates) to enforce a physical defense perimeter.
That those operating the entertainment network use GMail/Hotmail/Office232/WhatsApp is why those people are not permitted to touch actual Control System networks -- in the grand scheme of things it does not really matter if the entertainment systems go down, it is merely a mild inconvenience that, like a cold or diarrhea, will usually pass in a few days or weeks or months, and nothing of import will
Don't Worry (Score:2)
It will get fixed one way or the other, have faith in human nature.
Re: (Score:2)
Pumped hydro and hydrogen storage in gas fields have risks involved too. A season's worth of stored grid energy to weather a winter/dunkelflaute can make a big splash/boom.
Re: (Score:2)
Pumped hydro and hydrogen storage in gas fields have risks involved too. A season's worth of stored grid energy to weather a winter/dunkelflaute can make a big splash/boom.
It can. But one several orders of magnitude smaller in effect and in the cleanup effort that follows. How long are they currently thinking Chernobyl and Fuckupshima will take to clean up? More than 100 years?
Re: (Score:2)
This has nothing to do with the Nuclear industry. Basically the secretarial network (where they run Word and Powerpoint) was infected by a virus. In other words, it is a wake-up call for the IT folks that run the entertainment systems.