Everything posted by ShaunR

  1. Sweet. I'll test it out on an sbRIO at the weekend, once I can remember where I "stored" it.
  2. Ah. Here Rolf and I are in total agreement: CINs are to CLFNs as MP3s are to gramophone records. I know we do. And my view that Linux is a community version of Windows 95 (a GUI bolted onto a CMD prompt) probably won't sit well either. I wasn't actually referring to lvZIP. I was thinking more about SQLite - but as you bring it up, it kind of proves my point. Zlib supports 64 bit, so the only reason 64 bit isn't supported in the OpenG tools is the wrapper. In fact, Transport.lvlib supports it. People could have just replaced the DLL if it wasn't for the wrapper. From the Zlib website: I think I even gave you a 64 bit version of zlib at one point (with minizip). I'm about to make you eat those words. Slightly off topic, but does VxWorks come with libssl?
  3. This is one of the few points where I disagree with Rolf. Maintaining a wrapper seems like a good idea to start with, but you end up replicating all the functions of the original in the wrapper or reducing the function set available to LabVIEW (see the sketch after this list). LabVIEW is cross-platform already and, although it takes a bit more up front, it is much easier to encapsulate the original, extend and maintain. Once you have good LabVIEW coverage, you can just replace the DLLs. This is great if they are 3rd-party supplied (and run the LabVIEW test harnesses). It also means that end users can update them without you having to do anything. If they add a function, you only need to create the LabVIEW part and it's job done. There are a few projects that suddenly became unusable because they stopped maintaining the wrapper (not Rolf, of course, he maintains them forever). Have I ever said how much I hate OpenSSL? (Just a short gripe because it's doing my head in for similar reasons.)
  4. I think you're going to have problems with a camera in this setup. You have a bar for mounting the camera, so I assume the camera will be facing the sheet and drum? You are trying to measure the width of a reflective material (steel) against a reflective (although grubby) background. What tends to happen here is that increasing light intensity also causes flaring on the background. The background is moving, so the reflection is not constant and liable to change with position and dirt. So it's not just a case of saturating it with light: you have to be really pedantic with the light levels, and the discolouration on the drum itself will have you pulling your hair out (it's a line scanner, so you have no context). Once you do get it set up, you will walk in the next day and it won't work properly anymore because someone cleaned/changed the drum and the lighting requires re-adjustment. Your measurement system will also deteriorate over time as the discolouration on the drum changes, since you will have to effectively "tune" it for the particular environmental conditions on the day. You are much better off mounting it in the gap between rollers; then the light will work well for you (no discoloured, moving or reflective background) and will not be at all sensitive to variation, as the focus of the lens acts as a light filter. Stuff further away will be blurred and darker and will require much larger changes in your source illumination to have any effect on your image background at all, and you can crank the contrast up to find the edges. You can also choose a lens that has a very short focal depth and you will basically be checking against a pitch-black background. You wouldn't even need a homogeneous light source; it would work and be robust with ambient light. That's the "ideal". If you cannot do that, the next best thing is to point the camera from the bar to the gap. The issue here, though, is that with your current intentions you will be a lot further away, will have overhead lights back-lighting your image, and will have perspective distortion. To mitigate the back-lighting you can mount it similarly to how you have it currently, but near the top roller, pointing down. Since you are using a line-scan camera, you can fairly easily compensate for the perspective distortion with very simple (and not CPU-intensive) calibration (see the sketch after this list), but depending on what accuracy you want to achieve, you could also get away without it.
  5. Another thing which wasn't touched on in the video: sometimes you have to interact with hardware with a technician/engineer present, especially when going through wiring, schematics and general troubleshooting, which is invariably in an industrial situation, in situ. Describing what they can see is very difficult, especially with a language barrier, and for them to carry a laptop around and use it to send images/video of exactly what they are seeing isn't very practical. A head-mounted Bluetooth video camera is fantastic for this and very convenient for them. They just need to pair it with their phone and you can see what they can. If you have ever had to remotely troubleshoot wiring, it makes it sooooo much easier discussing wire idents and spotting that clue that tips you off to the solution.
  6. People are still using Skype? Obviously industrial espionage is not a consideration. I use Jitsi for video conferencing. The UI looks pretty skanky (Java, after all) but it works great (screen remote control included). The next release claims it will support up to 100 remote video conferencing streams - we'll see. You can get remote power switches for hard-booting. I've used the telephone-operated one (RPS II) with great success, and they can be removed after development and reused on other projects. Oh yes, and Bluetooth stereo headsets that can connect to multiple sources (like your customer via PC and phone).
  7. Memory leaks are caused by not closing registrations (event registration refnums) rather than user event refnums. If nothing is registered, you can generate your events to your heart's content and you won't leak memory. That's why you can have plugins that come in and out of memory and attach to existing events. Besides, when I exit, I want all memory cleaned up, including any leaks, and LabVIEW takes care of that. When the app is running, I generally want all events running too, and let whatever needs them register and unregister.
  8. The only time I ever need to do this kind of thing is when the app is exiting. Then I just don't bother and let LabVIEW clean up my mess.
  9. It depends on whether they are Authenticode-signed (apparently).
  10. The NSA have to use other methods to tell whether you are connected. Good catch and thanks for sharing.
  11. Good luck. Be sure to poke your nose in from time to time. I find that I'm having to resort to DLLs almost all of the time nowadays, because the functionality just isn't there, and when that happens you might as well write it in a text language (they usually have working examples, and those don't always translate well). Apart from getting the UI advantages of modern visual languages, interfacing to DLLs in LabVIEW is a works-or-crashes-the-IDE kind of deal, which makes you just give up and write it in something else. You still cannot beat LabVIEW for DAQ, control or prototyping, but outside of that it has severe limitations for commercial applications. I understand the move, but I am lucky in that it's not an either/or decision for me. I can use whatever I think is appropriate.
  12. A second thought: why not just do it as a normal contractor rather than as a fixed-price project? This is much safer as there is no deliverable as such.
  13. Perhaps Jason can confirm the algorithm, but I don't think the graph controls just decimate in the strictest sense. I think they do something like bi-linear filtering, since small artifacts are not lost, as they would be with pure decimation. Long-term data logging is better served by a DB, IMHO. Then it's just a one-line query to decimate (see the sketch after this list) and you can have a history as big as your disk allows with no memory impact.
  14. The issue is this: the risk is very high that they will try to renege on the agreement - they have all but admitted that. Even if you do put a watertight clause into the contract, it is highly likely they will contest it anyway, so you will need deep pockets to defend it in court to get the money you are owed. If it is a large corporation, they have departments dedicated to finding holes in contracts and arguing the toss over every penny. They will use it to get more concessions out of you - by nit-picking at best, by threatening at worst. The sort of corporation you want to do business with is one that only sends contracts to their law dept as a last resort, not a first resort. Which do you think they are? Your first defence, should you choose to work with them, is of course the clause in the contract (choose any from the open source contracts; they all disclaim liability). This is really a management bargaining tool, however, so you can point to it and say "that's not what we agreed". If it goes further than that, then you incur huge expense, so you really need a company that is prepared to take that risk in the first place and not go further. Your last defence is Limited Liability Insurance or Professional Indemnity Insurance to stop them taking your house, car and dog if they win. That's the risk of all consultancy work. It's just better for your health, wallet and integrity to politely decline any companies that have a history of serial disputes with consultants. Get a good lawyer. DISCLAIMER: Not legal advice, not a lawyer, not even a particularly good programmer - make of it what you will.
  15. I suggest you avoid them like the plague.
  16. I too have never experienced it. But I remember Daklu writing an example (on lavag) to prove me wrong and show that it does really happen. lol. Maybe the Lavag historian can find it - I cannot currently. I vaguely remember that the reason I never see it is because my workflow is to have all the VIs in memory. IIRC, when you change a typedef, the VIs not in memory are not updated (of course), and so the change is not propagated. When they are next loaded, LabVIEW isn't able to resolve the discrepancy and resets them to the zero value. LabVIEW is able to resolve the difference if the new value is at the end, but not in the middle. So I think I suggested having the old-style VI Tree just so that you could force LabVIEW to load all the VIs. Since my workflow is to produce examples that act as regression test harnesses, the examples keep the whole VI hierarchy in memory, so I never see it.
  17. You haven't told the Configure Serial Port VI which resource to use. Right-click on the top-left corner terminal and create a constant, then select your serial port from the list. Put the serial port initialisation outside the loop (and don't forget to close it when the loop stops). You should get it working with the serial port examples first; then you will see how to use the VIs correctly.
  18. Just to expand on lvb's points: if you build source plugins, i.e. plugins with diagrams so they can be recompiled on the target system when invoked from an executable, then you must turn off compiled code separation for that VI, otherwise you will get Error 59: "The source file's compiled code has been separated, and this version of LabVIEW does not have access to separated compiled code." And just to reiterate a point that some people often forget: the global compiled code option only applies to "New Files", so checking it won't change all the existing files to use this method. You have to go and change each VI's setting individually and recompile. DISCLAIMER: The above are not "issues" as such, just additional things that you have to bear in mind when using the option, and they may fit into the granny's eggs category.
  19. Yes. I should do the rest of my posts (unfortunately I have only started about 3 topics ..... ever). It just saves people reading the entire thread, as it has run its course, and the link contains the solution if they want to get the code.
  20. After just over a year, someone has claimed the first prize. The original proposer of the competition is in a bit of a quandary. When the competition was set, 5 BTC were worth about $75; now they are worth about $3,000. The deadline was set for when 5 working solutions had been submitted; however, there has been only one and the OP has now closed the contest. Kudos to the guy, who had to write native LabVIEW code to handle big numbers and elliptic curve multiplication, which is no mean feat.
  21. Of course - Rolf wrote it. Zlib uses a "running" Adler32 (see the sketch after this list). See here for a good explanation of the difference.
  22. Exactly. If we are going to put type information in the JSON stream purely for readability and type checking, shouldn't we also put in type information about EXT, DBL, SGL, etc.? This just looks to me like a solution looking for a problem.
  23. You know that "refnum" is short for "reference number", right? A file refnum is just a number; in fact, it is the file reference returned by the OS. Same with, for example, TCPIP. It just strikes me that this is a bit like adding the TypeDef information into the JSON stream (e.g. Control, TypeDef, StrictTypeDef), which is interesting, but not useful. Let's assume that having specific LabVIEW reference types defined as objects is desirable. What does the LVType:ByteStream give you that the cluster wired as the "template" doesn't? How will you be able to tell that a DAQ Task ID is a UserDefinedRefnumTag rather than a physical channel? What will the cluster template for the variant decoding look like? DAQ Task IDs are not refnums. In fact, they are the same as for all I/O (like VISA): a typed string, like "path". If you plonk a DAQ Property Node onto a diagram you can quite happily connect a string to the "ref" input (which stands for reference, not refnum). You'll see a conversion dot as it gets converted to the Task ID type - the same goes for VISA references. Are these "pre-thoughts" to object serialisation? Here There Be Monsters.
  24. They don't need to, as the equivalents of refnums in other languages are (generally) pointers, and therefore integers (see the sketch after this list). It is only because LabVIEW is strictly typed that they get their own type.
  25. Refnums are numerical types in all languages (even LabVIEW). However, whilst it may be "convenient", I have reservations about injecting LabVIEW-specific formatting into a language-agnostic serialisation. What if a string contains "=>"?
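
A minimal C sketch of the wrapper problem from post 3, assuming a hypothetical "thin" wrapper DLL around zlib: every library function the LabVIEW side needs must be mirrored by a forwarding shim, so the wrapper's export list is forever a lagging copy of the library's. The LV_* names are made up for illustration; the zlib calls and types are the real ones.

```c
/* wrapper.c - hypothetical "thin" wrapper DLL around zlib.
 * Every zlib function the LabVIEW code needs must be mirrored here,
 * so the wrapper's function set is always a (lagging) copy of zlib's.
 * Calling the zlib DLL directly from a CLFN removes this middle layer. */
#include <zlib.h>

/* Forwarding shim for compress2() - adds nothing, just re-exports. */
int LV_Compress(Bytef *dest, uLongf *destLen,
                const Bytef *src, uLong srcLen, int level)
{
    return compress2(dest, destLen, src, srcLen, level);
}

/* Forwarding shim for uncompress() - again, pure duplication. */
int LV_Uncompress(Bytef *dest, uLongf *destLen,
                  const Bytef *src, uLong srcLen)
{
    return uncompress(dest, destLen, src, srcLen);
}

/* ...and so on for every other export the LabVIEW layer needs. Any function
 * zlib adds later stays invisible until the wrapper is updated and rebuilt
 * for every platform. */
```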
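
A rough C sketch of the "very simple (and not CPU intensive) calibration" mentioned in post 4, assuming a two-point linear pixel-to-millimetre mapping for the line-scan camera; a real setup might need more reference points to handle the perspective properly. All names and numbers are illustrative only.

```c
/* calibrate.c - hypothetical two-point calibration for a line-scan camera.
 * Two reference targets at known physical positions are imaged at known
 * pixel indices; everything in between is interpolated linearly, which is
 * cheap enough to run on every scan line. */
#include <stdio.h>

typedef struct {
    double px_a, mm_a;   /* first reference: pixel index and position (mm) */
    double px_b, mm_b;   /* second reference */
} Calibration;

/* Convert a pixel index to a physical position in millimetres. */
static double pixel_to_mm(const Calibration *c, double px)
{
    double scale = (c->mm_b - c->mm_a) / (c->px_b - c->px_a); /* mm/pixel */
    return c->mm_a + (px - c->px_a) * scale;
}

int main(void)
{
    /* Example values only - real numbers come from imaging the targets. */
    Calibration cal = { 120.0, 0.0, 1880.0, 500.0 };

    /* Edge positions as they might come from the edge-detection step. */
    double left_edge_px = 402.0, right_edge_px = 1533.0;
    double width_mm = pixel_to_mm(&cal, right_edge_px)
                    - pixel_to_mm(&cal, left_edge_px);

    printf("Sheet width: %.2f mm\n", width_mm);
    return 0;
}
```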
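
An illustrative C/SQLite sketch of the "one-line query to decimate" from post 13. The database file, table and column names (history.db, log, t, value) are assumptions for the example; the point is that the database thins the data before it ever reaches memory, so the on-disk history can be as large as the disk allows.

```c
/* decimate.c - let the database do the decimation before the data
 * reaches the graph. Keeps memory flat regardless of history size. */
#include <stdio.h>
#include <sqlite3.h>

/* Print each decimated row as SQLite hands it back. */
static int print_row(void *unused, int ncols, char **vals, char **names)
{
    (void)unused; (void)ncols; (void)names;
    printf("%s\t%s\n", vals[0] ? vals[0] : "", vals[1] ? vals[1] : "");
    return 0;
}

int main(void)
{
    sqlite3 *db;
    char *err = NULL;

    if (sqlite3_open("history.db", &db) != SQLITE_OK)
        return 1;

    /* Keep every 1000th sample - the full history stays on disk,
     * only the decimated set is pulled into memory for display. */
    const char *sql = "SELECT t, value FROM log WHERE rowid % 1000 = 0;";

    if (sqlite3_exec(db, sql, print_row, NULL, &err) != SQLITE_OK) {
        fprintf(stderr, "query failed: %s\n", err);
        sqlite3_free(err);
    }

    sqlite3_close(db);
    return 0;
}
```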
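
A small C example of the "running" Adler-32 from post 21, using zlib's own adler32(): because the previous value is fed back in, the checksum can be accumulated chunk by chunk and still match a one-shot calculation over the whole buffer.

```c
/* adler_running.c - incremental ("running") Adler-32 with zlib. */
#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void)
{
    const char *chunk1 = "Hello, ";
    const char *chunk2 = "world!";

    /* Seed with the initial value (the Adler-32 of an empty buffer). */
    uLong running = adler32(0L, Z_NULL, 0);

    /* Update incrementally - this is what makes it a "running" checksum. */
    running = adler32(running, (const Bytef *)chunk1, (uInt)strlen(chunk1));
    running = adler32(running, (const Bytef *)chunk2, (uInt)strlen(chunk2));

    /* One-shot checksum over the concatenated data gives the same result. */
    const char *whole = "Hello, world!";
    uLong oneshot = adler32(adler32(0L, Z_NULL, 0),
                            (const Bytef *)whole, (uInt)strlen(whole));

    printf("running: %08lx, one-shot: %08lx\n", running, oneshot);
    return 0;
}
```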
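
A short (POSIX) C illustration of post 24: outside LabVIEW, the equivalents of refnums really are plain integers and pointers, with nothing but programmer discipline keeping the different "types" of handle apart.

```c
/* handles.c - file "refnums" in C are just an int and a pointer. */
#include <stdio.h>
#include <fcntl.h>    /* open()  - POSIX */
#include <unistd.h>   /* close() */

int main(void)
{
    /* A file handle from the OS is a plain int... */
    int fd = open("/etc/hosts", O_RDONLY);

    /* ...and a C runtime file handle is just a pointer. */
    FILE *fp = fopen("/etc/hosts", "r");

    /* Nothing stops you treating either as an ordinary number. */
    printf("fd = %d, fp = %p\n", fd, (void *)fp);

    if (fp) fclose(fp);
    if (fd >= 0) close(fd);
    return 0;
}
```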