Matthew Rowley Posted October 12, 2016
I am working on a project that requires open source software to be vetted for malicious code. Has anyone ever dealt with this requirement, and does anyone have an answer for the OpenG toolkit (which I use extensively) as to whether it has been inspected for malicious code and certified "clean"? Thanks in advance.
Matt R.
Project Lead
MarkCG Posted October 12, 2016
Who would certify it in any meaningful legal sense? If instead they mean that someone, an employee or consultant, looks over it and says on paper "I don't see any obvious malicious code in here", that might be feasible: you could open each of the VIs in the packages you will use and inspect each one. I think the license for LabVIEW itself has some kind of verbiage in it that it's all "at your own risk" anyway. That sounds like a pretty onerous requirement.
smithd Posted October 13, 2016 (edited)
The OpenG libraries are "Compatible with LabVIEW", which means they meet these standards: https://decibel.ni.com/content/docs/DOC-8981 -- but otherwise, no clue. Definitely a weird requirement.

Depending on the specific requirements, you might be able to narrow the restriction by only using code built from native LabVIEW functions (i.e. no CLFN/DLL calls). Then you can figure out what types of malicious things LabVIEW could do: for example, file I/O functions are probably out, as would be any code which uses VI Server (it could call other code dynamically). Both of these are pretty easy to verify. From the product page, this leaves you:
- Array manipulation
- String manipulation
- Application control
- File handling
- Zip files
- Timing tools
- MD5 digest implementation
- Error handling
- Variant and flattened data manipulation

Still a pretty good set of tools, and I can't think of any way this could affect other machines in a malicious way. Of course, if the definition is "you must inspect every function"... well, have fun.
ShaunR Posted October 13, 2016
Don't use open source with this project (and that is not a glib comment). Their procedures may let you get away with verifying a PGP key against a distribution, but the legal ramifications are not worth it unless you have a dedicated legal team. Do you have a dedicated legal team? If not, then bite the bullet and refactor the open-source components out. (Then apply to become one of their "preferred suppliers".)
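For reference, the mechanical part of that PGP check is simple; it's everything around it that needs the legal team. A minimal sketch, assuming GnuPG is installed and the supplier actually publishes a detached signature alongside the distribution (the file names here are hypothetical, and as far as I know OpenG publishes no such signatures):

```python
import subprocess

# Hypothetical file names -- purely to illustrate the mechanics.
SIGNATURE = "openg_package.ogp.sig"
DISTRIBUTION = "openg_package.ogp"

# Assumes the supplier's public key has already been imported
# (e.g. gpg --import supplier_pubkey.asc) and its fingerprint
# verified out of band against a trusted source.
result = subprocess.run(
    ["gpg", "--verify", SIGNATURE, DISTRIBUTION],
    capture_output=True,
    text=True,
)

if result.returncode == 0:
    print("Signature is good:")
    print(result.stderr)  # gpg writes verification details to stderr
else:
    print("VERIFICATION FAILED -- do not use this distribution:")
    print(result.stderr)
```

And the check is only as good as the key: if the fingerprint wasn't verified out of band, a tampered distribution could simply ship with a matching tampered signature.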
hooovahh Posted October 13, 2016
Oh, the Compatible with LabVIEW route is a good one to go with. These may be third-party tools, but NI is claiming that to be "Compatible with LabVIEW" a product "does not include any malicious software". And OpenG is listed on NI's site as just that. You can ask for an independent investigation, but the thing is, code review is not a valid form of software verification. You can open the code and look at it, but you can't say what it will do just by looking at the source, especially in a graphical world, where I could just place a picture of a block diagram on top of my actual code, among other malicious techniques. This is where tools like VI Analyzer and Unit Test Framework come in, to help detect things that a reviewer might miss. If you can't just take NI's word for it, then I'd say it isn't going to be worth the hassle of independently verifying it.
ShaunR Posted October 13, 2016
22 minutes ago, hooovahh said:
NI is claiming that to be "Compatible with LabVIEW" a product "does not include any malicious software".
Citation?
hooovahh Posted October 13, 2016
It's literally in the only link provided in this thread: https://decibel.ni.com/content/docs/DOC-8981. Although if I'm going to apply a bit more scrutiny, I guess NI is just saying these are the guidelines to be followed, not necessarily things that NI verifies to be true.
ShaunR Posted October 13, 2016
11 minutes ago, hooovahh said:
It's literally in the only link provided in this thread: https://decibel.ni.com/content/docs/DOC-8981. Although if I'm going to apply a bit more scrutiny, I guess NI is just saying these are the guidelines to be followed, not necessarily things that NI verifies to be true.
Indeed. The Compatible with LabVIEW program is multi-tier, intended to standardise the out-of-the-box experience of add-ons and herd the cats. The lowest level basically states that you have followed the style and submission guides, whereas the silver and gold tiers simply attest that you have had positive feedback from real customers as to levels of support (responses within 48 hrs, etc.). Once a product is on the Tools Network and approved, the same rigor of checks required to gain acceptance is not applied for every release, so I would be very wary of a supplier offering this as proof in isolation.
austinman Posted October 13, 2016
I have worked with a couple of organizations where developers had similar constraints on using open source libraries/code (in any development language). They had processes in place to verify that open source code was 'safe', but these were so time-consuming that it was easier for a developer to write their own.
Michael Aivaliotis Posted October 15, 2016
Don't use open-source libraries at all in your project if this is one of the project requirements.
drjdpowell Posted October 15, 2016
I wonder how large the risk of malicious code is relative to the risk of serious bugs in code implemented from scratch.
ShaunR Posted October 16, 2016 (edited)
On 10/15/2016 at 8:45 AM, drjdpowell said:
I wonder how large the risk of malicious code is relative to the risk of serious bugs in code implemented from scratch.
I've thought long and hard about your question. While it is intuitively obvious, I'm unable to quantify it outside of the formal risk assessment process, which itself can be fairly subjective. (You do risk assessments, right?) When I say "obvious", I mean in the same sense that the prospect of being beaten senseless on a Friday night is intuitively worse and riskier than tripping over and possibly hurting yourself, even though I don't know the probabilities involved. I'm aware of formal processes for mitigation of malicious code, which tend to be fluid and dependent on the particular adversary. However, I'm not sure how you would apply those processes to incompetence.
drjdpowell Posted October 16, 2016
10 hours ago, ShaunR said:
When I say "obvious", I mean in the same sense that the prospect of being beaten senseless on a Friday night is intuitively worse and riskier than tripping over and possibly hurting yourself, even though I don't know the probabilities involved.
I'm not sure intuition is that reliable in such high-importance-but-low-likelihood matters. People die from mundane things like tripping over.
ShaunR Posted October 17, 2016
3 hours ago, drjdpowell said:
I'm not sure intuition is that reliable in such high-importance-but-low-likelihood matters. People die from mundane things like tripping over.
I said it was intuitively obvious that one was a higher risk than the other. Someone trying to kill you poses a higher risk that you will die than someone who isn't, and I don't need to spend 3 weeks investigating to come to that conclusion unless I want a few decimal points on that binary result. If you actually look at security assessments you will see things like:
"mitigating controls exist that make exploitation of the security vulnerability unlikely or very difficult"
"likelihood of targeting by an adversary using the exploit" (my emphasis)
So even the formal appraisals require intuitive judgement calls.
drjdpowell Posted October 17, 2016
But which is the bigger risk in this example:
A) Someone will sell me a car that has been modified to crash and kill me.
B) The car that I build from scratch will crash and kill me.
Mitigating (A) by accepting (B) is not necessarily reducing your chance of death.
hooovahh Posted October 17, 2016
Are we really talking about FMEAs on the risk of open source bugs and/or malicious code, versus source code developed in house? An interesting concept, but in my experience most management would rather just have a blanket statement like "no open source code" than go through a process of qualifying it. Michael is probably right: if it is a firm requirement then you're probably hosed.
ShaunR Posted October 17, 2016 (edited)
1 hour ago, drjdpowell said:
But which is the bigger risk in this example:
A) Someone will sell me a car that has been modified to crash and kill me.
B) The car that I build from scratch will crash and kill me.
Mitigating (A) by accepting (B) is not necessarily reducing your chance of death.
A) will definitely kill you, whereas B) probably won't, since it is a design requirement not to kill you. If that isn't obvious then I despair. Or, looking at it another way: the goal of writing software is to succeed and realise the requirements. Which of A) or B) would you prefer to be successful, and which would you want the contingencies employed to ensure success driving towards?
ShaunR Posted October 17, 2016 (edited)
1 hour ago, hooovahh said:
Are we really talking about FMEAs on the risk of open source bugs and/or malicious code, versus source code developed in house? An interesting concept, but in my experience most management would rather just have a blanket statement like "no open source code" than go through a process of qualifying it. Michael is probably right: if it is a firm requirement then you're probably hosed.
It's a discipline in its own right, found mainly in the defence industries. This quite often happens when offloading responsibilities onto a supplier. The choice is really to employ a security engineer or decline to quote. It's similar to declaring your software fit for medical use or for use in explosive environments. People are too used to the "app crap" of iOS and Android, where bugs and vulnerabilities are considered an ordinary and expected part of the development cycle rather than a testament to incompetence and/or lack of discipline, so they expect every programmer to have in-depth knowledge of dangerous domains because... it's only software, right?
drjdpowell Posted October 17, 2016
1 hour ago, ShaunR said:
A) will definitely kill you, whereas B) probably won't, since it is a design requirement not to kill you. If that isn't obvious then I despair. Or, looking at it another way: the goal of writing software is to succeed and realise the requirements. Which of A) or B) would you prefer to be successful, and which would you want the contingencies employed to ensure success driving towards?
Nobody makes their own car, because no one is trying to sabotage their car, and driving a car built by yourself is extremely dangerous. If you think someone might try to kill you, you still don't build your own car. Instead, you verify that no one has tampered with your car. I was just wondering how that kind of analysis goes with software, and whether management, in its insistence on no open source or on onerous verification requirements, is actually making the correct choice as far as minimizing risk.
ShaunR Posted October 17, 2016 (edited)
22 minutes ago, drjdpowell said:
Nobody makes their own car, because no one is trying to sabotage their car, and driving a car built by yourself is extremely dangerous.
This is demonstrably incorrect. We used to make petrol go-karts as kids out of lawn mowers, and I built my own [hard-tail] motorbike when I was 17. A quick perusal of YouTube will show home-made drag racers, off-roaders, dune buggies, monster trucks and even tanks, boats and aeroplanes. The only requirement to drive a car of any description on a road in the UK is that it passes an MOT, which is a government test that says "it probably won't kill you or other people [today]". So yes, I would drive my own car, and when (not if) it passed the MOT I, along with the police, would be pretty happy that it was safe to do so.
drjdpowell Posted October 17, 2016
You didn't make them from scratch; you used available components (such as lawnmowers, brakes, engines, etc.), all of which you accepted as not having been sabotaged by someone who wanted to kill you. An MOT is a non-onerous test involving a reasonable set of requirements, so that sounds like a good idea. It would be a LOT of work to pass an MOT without using components manufactured by other people.
ShaunR Posted October 17, 2016 (edited)
41 minutes ago, drjdpowell said:
You didn't make them from scratch; you used available components (such as lawnmowers, brakes, engines, etc.), all of which you accepted as not having been sabotaged by someone who wanted to kill you. An MOT is a non-onerous test involving a reasonable set of requirements, so that sounds like a good idea. It would be a LOT of work to pass an MOT without using components manufactured by other people.
I didn't create screws from bars of metal with a lathe, but components were chosen, stripped, cleaned, inspected and modified to suit, which ensured that tampering would have been spotted and rectified. A car designed to kill me would try to hide its purpose, or prevent me from taking safety measures or circumventing its goal of killing me. From a design point of view, the risk is that I discover the purpose, as opposed to the risk of overlooking a rare defect. These are intuitive to me; I guess they are not to you.

With those levels of paranoia, the thing that should be striking fear into you is that LabVIEW is a closed-source compiler and IDE. How do you prove to your customer that the language you have chosen to solve their problem is not itself malicious, and that the code generated from your diagram is only doing what you have coded?

I suppose the point is that you have to convince yourself that something is safe or non-malicious before you try to convince others. Trust isn't transferable, so if you trust a toolkit, you have to accept that others may not, unless you vouch for it and accept the implications that arise from that. The process and facts discovered while convincing yourself will be the evidence to the customer, and that is much easier with your own code, because the design goals, implementation specifics and desires/intent are implicitly known. The question is: does the customer trust you?
hooovahh Posted October 17, 2016
Better yet, did you smelt the metal? If not, there could be imperfections in it that could cause a catastrophic accident going down the road. It's hard to have these types of conversations, because at some point someone will say "Well that's just ridiculous, I should be able to trust the QA of the screw manufacturers." Just like how I've had some managers not trust the precision of resistors and want each one tested before and after placement in the DUT (despite ICT likely finding these issues, bringing up another FMEA). But what is ridiculous to some isn't to others. And at the end of the day my job can be made much more difficult than it needs to be, depending on how ridiculous important people want to be. Oh, and the operating system and drivers we are using are closed source as well. Is it ridiculous to make a new operating system, kernel, drivers, file system, etc. from scratch? Yes. Would some managers think they are doing the company a favor by calling into question the integrity of the operating system? Yes.
ShaunR Posted October 17, 2016 (edited)
1 hour ago, hooovahh said:
Better yet, did you smelt the metal? If not, there could be imperfections in it that could cause a catastrophic accident going down the road. It's hard to have these types of conversations, because at some point someone will say "Well that's just ridiculous, I should be able to trust the QA of the screw manufacturers."
So, Boss. You question the supplier's QA? I'll need you to sign this request so that Goods In switch to 100% inspection sampling, and I'll hand it personally to the Goods In manager along with your extension number. When will you be at your desk?
hooovahh Posted October 17, 2016
Just now, ShaunR said:
So, Boss. You question the supplier's QA? I'll need you to sign this request so that Goods In switch to 100% inspection sampling, and I'll hand it personally to the Goods In manager along with your extension number. When will you be at your desk?
"Do you mean we aren't doing 100% inspection already?! I'm going to shake this place up to make this company better (or so other people will hear me talk and think I'm even more important than I already am)", or "You're the test engineer, isn't this your job?!", or "I don't mind making the hard decisions if it means fewer warranty returns!", or "Our customers demand perfection!" Too many blanket statements without knowing or thinking about how things actually work. Not saying improvements can't be made, but I tend to roll my eyes when people talk a big game but clearly don't know enough to be making decisions like these. Give me a manager with test and engineering experience over one with an MBA any day. (BTW, sorry about going off topic.)