
Rolf Kalbermatter

Members · 3,786 posts · 245 days won

Everything posted by Rolf Kalbermatter

  1. USB really isn't something you should rely on when you need a system to run for weeks uninterrupted. While the idea of USB is nice, there is a myriad of things that can and often do go wrong. The first and most obvious problem is the devices themselves. There are many USB chip solutions out there that have more or less grave bugs in the silicon. These sometimes get fixed in later revisions of the chip, if the manufacturer actually cares enough to fix its silicon instead of just releasing yet another design with new bugs in it. Then there are the device drivers: quite often they are taken over more or less entirely from the silicon manufacturer's development kit, despite notices all over the place that the software is only provided as an example of how things work and shouldn't be used in production-quality designs. Another quite common problem are the USB host controllers in the PCs themselves and their drivers. Nowadays PCs are designed in a matter of months, and often the newest chips are used even though they sometimes have bugs too and their drivers are not yet mature. Maybe the problems can be patched later with a driver update, and maybe the bug is not really fixable in software, which with most PC manufacturers nowadays means bad luck for the end user. Basically, if you want to use USB for uninterrupted long-term data acquisition, you will have to evaluate both the data acquisition hardware and your PC platform very carefully before doing so. USB for life-supporting devices is definitely something you should never attempt unless you fully control the entire chain, from controller software and hardware to device software and hardware. Rolf Kalbermatter
  2. Maybe some guys came out of a long winter sleep and still think things are done the way they were in the old DOS days, not wanting to understand that there are nowadays cheap, ready-to-run hardware solutions that cost less than the material you would need to build your own (not to mention the time you would need to build it). Rolf Kalbermatter
  3. Not easily. Linux, just like Windows, does not allow applications to access hardware directly. In Windows you have to write a device driver that runs in the kernel context to do that for you, and the same goes for Linux. For Windows several solutions exist, with corresponding device drivers, to access the I/O address range from application level. For Linux there are also several possibilities, but such a kernel driver has never made it into the official kernel sources and definitely never will, since it is a potential security risk. The only way to get this done is to look for one of the various hacks for Linux port I/O access on the internet and put it into your kernel yourself. This will obviously require you to at least compile something like a kernel module, but more likely to compile your own kernel. Basically, while direct port I/O sometimes seems necessary, it is mostly a bad idea and a potential security risk for sure. Rolf Kalbermatter
  4. It is quite common for WinAPI functions to require the size of structure parameters passed to them. Most often this is done in the first member variable of the structure itself, but as can be seen there are exceptions such as this one, where you pass it as a separate parameter. Why would you do that? Simple: for version compatibility. Newer implementations of such a function can add extra elements at the end of the structure. If an application that was compiled with an earlier header file calls this function, the function can check the size, possibly recognize the version this application expects, but most importantly make sure it only returns the information the structure can hold, avoiding writing past the end of the memory area the caller reserved for the structure. The string in the structure is NOT fixed size; only the area reserved for the string in the cluster is. This is common practice, to avoid having to reference any parameters after a string at variable offsets depending on the size of the embedded string. In fact it is the only way to declare such a structure at compile time: if you use variable-sized strings in a structure, you can't declare that structure at compile time but instead need to parse it dynamically at runtime. The function will fill in a NULL-terminated string in that area, up to the reserved length minus the NULL termination character. So the NULL termination character is still needed for the caller to find out how long the actual string is. This is the point of my first post: you do not want to return all 32 characters that are reserved in the structure, but instead go through those 32 characters and stop at the first occurrence of a NULL character, since that is the end of the string. It is also good practice not to assume that the string will be NULL terminated, so you go into a while loop and stop looping when a NULL character is found OR when the length of the array has been consumed.
Not strictly necessary, but a good idea, and this is really what is called defensive programming. Rolf Kalbermatter
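The bounded scan described above can be sketched in Python (the helper name is my own, not from the post):

```python
def c_string_from_buffer(buf: bytes) -> bytes:
    """Extract a C string from a fixed-size buffer defensively:
    stop at the first NUL byte, or at the end of the buffer if no
    terminator is present, never reading past the reservation."""
    n = 0
    while n < len(buf) and buf[n] != 0:
        n += 1
    return buf[:n]

# A properly terminated name inside a 32-byte reservation;
# the trailing bytes stand in for uninitialized memory:
raw = b"MIDI Mapper\x00" + b"\xcd" * 20
print(c_string_from_buffer(raw))        # b'MIDI Mapper'

# A buffer with no terminator at all still stays in bounds:
print(c_string_from_buffer(b"A" * 32))  # all 32 bytes returned
```

The same two-condition loop (NUL found OR reserved length consumed) is exactly what the post recommends wiring up in LabVIEW.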
  5. 1MB for a single VI is IMO a little on the large side, but it can happen for more complex VIs. Such a VI is probably a good candidate to check for common routines that can be delegated to subVIs. The number of cases is not that important, but there can be a "too much". I once inherited a program that implemented a robot sequencer. It was written as a single VI with one huge loop containing a sequence structure with one case structure in each frame, all operating on the same global state variable, and one or two case structures in there containing all the possible sequence steps and substeps of the process logic. Of course no shift registers were used at all, just globals all over the place. This program was in LabVIEW 6.1; the main VI weighed in at around 8MB, the two huge case structures had more than 200 cases each, and every single edit operation on the VI took several seconds on a medium-speed PC for LabVIEW to verify its internal graphs for syntax checks. Breaking the VI luckily made editing faster. Completely rewriting the application was no real option, since the whole sequence logic was documented nowhere other than in the diagram. I finally managed to modify the VI in such a way that the main sequence logic was broken into the UI handler logic in the main VI and three logical processes (which the sequencer could only process exclusively anyhow) put into their own subVIs, replacing much of the globals with shift registers and intelligent functional globals, with some optimizations in the sequence steps themselves. The result was around 4 VIs of 1MB each, some extra helper subVIs, and no case structure with more than 70 or so cases: an application that could again be edited without any noticeable delay, worked noticeably smoother with less CPU load, and had a few bugs fixed along the way.
Besides, having to select a specific case during development from the popup list when there are 100 or more cases is a major pain in the a** (and in earlier LabVIEW versions you could not even scroll that list if it did not fit on the screen ;-) Rolf Kalbermatter
  6. Or maybe subpanels? Rolf Kalbermatter
  7. In fact 34 is too long. The MS SDK uses a fixed length of 32 chars here, so there is no way to return a string longer than 31 characters through this function. Looking at the VI I do see a few more problems. First, the device number is supposed to be a value between 0 and "number of devices - 1", so the -1 node in the wire from the iteration terminal to the CLN seems wrong. OK, it seems that MIDI_MAPPER (= -1) is also a valid value for this function, so the whole idea of increasing the number of devices by one and decreasing the index inside the function by one, to also get the MIDI_MAPPER device, is valid, although I would have put a comment about this in the diagram for the future user who sees this code. Second, the string embedded in the structure should only be 32 characters long (not 35 as it is now), which is the value of MAXPNAMELEN. This is only not a problem here because the function does not try to interpret the values after the string; if it did, it would read wrong data, since everything is shifted by three bytes. Last but not least, the third parameter to the CLN is wrong. This is the length of the structure in bytes, which is definitely not 38 but more something like 52. 38 is in fact not even long enough for the function to return the entire string, as the structure up to and including the entire string placeholder is already 40 bytes long. Rolf Kalbermatter
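The sizes quoted here can be checked against the Win32 layout. Assuming the call in question is midiOutGetDevCaps with a MIDIOUTCAPSA structure (WORD wMid, wPid; DWORD vDriverVersion; CHAR szPname[MAXPNAMELEN]; WORD wTechnology, wVoices, wNotes, wChannelMask; DWORD dwSupport), Python's struct module reproduces both numbers:

```python
import struct

# MIDIOUTCAPSA, little-endian and byte-packed as LabVIEW clusters are,
# with MAXPNAMELEN == 32:
MIDIOUTCAPSA = "<HHL32sHHHHL"

print(struct.calcsize(MIDIOUTCAPSA))  # 52: the byte count for the size parameter
print(struct.calcsize("<HHL32s"))     # 40: offset just past the end of szPname
```

This is why 38 cannot even hold the whole name field: the string placeholder alone already ends at byte 40.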
  8. Just like for the polar plot, your best bet would be to use the Picture Control. Alternatively, if you have an ActiveX control somewhere that provides this interface, you could go with that, but then that will be a Windows-only solution and, depending on your LabVIEW version, more or less stable. Rolf Kalbermatter
  9. Traditional objects based on refnums do not have any meaning outside of the process that created them, so queues or notifiers and such won't work, as they couldn't connect to each other. Your options are as follows: 1) Use LabVIEW 8 shared variables 2) Use DataSocket in earlier LabVIEW versions 3) Use VI Server 4) Use TCP/IP to write your own client/server communication 5) Use external files. I'm not sure about 1), but 2) is something I wouldn't recommend for a number of reasons: its performance is quite limited, it can be hard to configure right, and it uses an undocumented protocol. 1) and 3) have the same issue of being undocumented, but at least 3) works very well and all the low-level stuff is done for you. 4) is the hardest to implement and requires quite some experience to get a good working system that will also be able to deal with extensions in the future. But it is the most flexible and also the most performant solution. 5) is really only a last resort. I wouldn't recommend it at all, as you get all sorts of problems with synchronizing access to the files between the two or more applications. Rolf Kalbermatter
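Option 4, a hand-rolled TCP client/server, can be sketched in a few lines. This is my own minimal illustration, not code from the post; the usual trick that makes such a protocol extensible is length-prefixed framing, so each side always knows how many bytes belong to one message:

```python
import socket
import struct
import threading

def send_msg(sock, payload: bytes):
    # Framing: a 4-byte big-endian length, then the payload itself.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock) -> bytes:
    def recv_exact(n):
        data = b""
        while len(data) < n:
            chunk = sock.recv(n - len(data))
            if not chunk:
                raise ConnectionError("peer closed connection")
            data += chunk
        return data
    (length,) = struct.unpack(">I", recv_exact(4))
    return recv_exact(length)

# Tiny loopback demo: a server thread that echoes one message back.
server = socket.socket()
server.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
server.listen(1)

def serve():
    conn, _ = server.accept()
    send_msg(conn, recv_msg(conn))  # echo the message back
    conn.close()

threading.Thread(target=serve).start()
client = socket.socket()
client.connect(server.getsockname())
send_msg(client, b"measurement block 1")
print(recv_msg(client))             # b'measurement block 1'
client.close()
```

In LabVIEW the same framing maps directly onto TCP Write and two TCP Reads (4 bytes for the length, then the payload), which is one reason this option copes well with future extensions.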
  10. First, an int in modern Windows is always 32-bit, not 16 as you have assumed for the Bits parameter of the first function. This will probably bite you in the third parameter as well, since you need to provide enough memory for the function to write into, and if you assumed 16-bit values in there, your buffer is likely calculated too small. Second, while this time you have properly documented the type of AES_KEY, you still failed to provide an important piece of information: without knowing the value of AES_MAXNR, one can NOT calculate the size of the buffer you need to allocate in LabVIEW for this parameter. Basically the needed size is 4 * (AES_MAXNR + 1) 32-bit integers plus a trailing 4-byte int. A single byte too small can crash your system immediately, after some time, or when you try to close your LabVIEW app, but most importantly it could also corrupt data that is vital to LabVIEW or your VI and, when saved to disk, might corrupt your VI to the point where you have to start over. My advice is still to go to www.visecurity.com and get the Crypto-G toolkit, which has a ready-to-use AES encryption and decryption routine out of the box. Rolf Kalbermatter
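For a concrete number: assuming the DLL follows OpenSSL's aes.h, where AES_MAXNR is 14 and aes_key_st consists of an rd_key array of 4 * (AES_MAXNR + 1) 32-bit words followed by an int rounds field, the buffer works out as:

```python
AES_MAXNR = 14                      # OpenSSL's value: max rounds (AES-256)

# struct aes_key_st { unsigned int rd_key[4 * (AES_MAXNR + 1)]; int rounds; };
rd_key_words = 4 * (AES_MAXNR + 1)  # 60 key-schedule words of 4 bytes each
size_bytes = 4 * rd_key_words + 4   # plus the trailing 'int rounds'
print(size_bytes)                   # 244 bytes to allocate in LabVIEW
```

If the DLL was built against a different header, AES_MAXNR could differ, which is exactly why the value has to come from the DLL's documentation rather than a guess.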
  11. You do not want to just drop the unreadable characters. Instead, go into a while loop and abort at the first occurrence of a NULL character, then resize the byte array to that length and put it through a Byte Array To String node. The string in there is a zero-terminated string, as all C strings are, and once you encounter \00 it is over; the rest is only garbage that happened to be in memory before the function filled in the information. Rolf Kalbermatter
  12. I'm not 100% sure, but I thought the earliest version that had Save with Options->Save for previous version was 5.1, so going back to 4 with this feature would be a problem. On the other hand, the only reason not to upgrade to a newer version would be that your driver is in compiled form without diagram. In that case, and I'm sorry to say it, you are in deep shit. Rolf Kalbermatter
  13. If the data is in binary form already, there is no need for any conversion. Just use a Typecast function to change the incoming string into an array of integers. Hopefully you are using network byte order, because that is what the Typecast function will assume on the incoming stream side. Of course you could write a C function to receive the data and get it directly into LabVIEW as an integer array, and if the data is in little-endian format this would get rid of the inherent byte and word swap in the Typecast function. But if receiving the data already is the bottleneck, that C function most likely won't be much faster than what you have now when receiving the data as a string. After all, 50MB is not peanuts for a PC system and will typically require several tens of seconds to be transmitted over a 100Base-TX connection, and only slightly less over gigabit Ethernet, loading the system CPU considerably during this time.
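The byte-order point is easy to demonstrate outside LabVIEW. Typecast assumes network (big-endian) order on the incoming side; Python's struct module shows what the same bytes turn into under each interpretation:

```python
import struct

# A stream of 32-bit integers in network (big-endian) byte order,
# which is what LabVIEW's Typecast expects on the incoming side:
stream = b"\x00\x00\x00\x01\x00\x00\x00\x02\xde\xad\xbe\xef"

count = len(stream) // 4
values = struct.unpack(f">{count}I", stream)  # ">" = big-endian
print(values)                                 # (1, 2, 3735928559)

# The same bytes read as little-endian give entirely different numbers,
# which is the byte-and-word swap the post refers to:
print(struct.unpack(f"<{count}I", stream))    # (16777216, 33554432, 4022250974)
```

If the instrument sends little-endian data, the swap has to happen somewhere; doing it in a C receive function merely moves the cost, it does not remove it.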
  14. In Firefox you simply click on the picture to toggle between the scaled version and the unscaled one. Rolf Kalbermatter
  15. It depends on the LabVIEW version. Before version 7 you had to close every VI Server reference to avoid leaking memory. Since version 7 you only have to close VI Server references that you explicitly open with an Open function. But LabVIEW is forgiving if you try to close VI references you retrieved from property nodes, for instance: it recognizes that these are owned references and does nothing to them. So as a matter of fact I usually still use the Close Reference function on all VI Server references, independent of whether they were explicitly or implicitly opened. Rolf Kalbermatter
  16. I haven't tried this, but if it works at all then of course only with Rosetta. Still, there is a myriad of things that might go wrong. Probably forget about any DAQ or other hardware I/O other than standard OS channels such as file I/O and hopefully TCP/IP and serial. Maybe I'll try to find one of those unofficial MacOSX86 development installations and see if it would run on my Sony notebook. Rolf Kalbermatter
  17. The easiest possibility would be to get the Crypto-G Toolkit from http://www.jyestudio.com/visecurity/cryptg.shtml. Its pricing is VERY reasonable and definitely cheaper than spending even one or two hours of your own time, and it contains functions for AES encryption and decryption. Since AES is a standard, it should not be a problem that the encryption is done by a different library than the decryption. If you are still here and want to pursue the DLL path, reading the online reference manual about external code in LabVIEW is the absolute minimum to do to be successful. And unless you have a sound understanding of C, you really will have to go to www.ni.com and search on external code to get some good references about what to watch out for. Most of that information is not even LabVIEW specific, but without knowing quite a bit about pointers, datatypes etc. you simply can't hope to get something reliable using the Call Library Node or Code Interface Node. One obvious indication that you may be missing some basic understanding of C is that you left out some important information here: without knowing the type definition of AES_KEY, nobody can possibly give you a ready-made example of how to implement this in LabVIEW. On the other hand, if you knew that, the solution would most probably have presented itself to you already without any extra help. You definitely don't want to use a CIN here, believe me, since that will absolutely and unavoidably require you to write C code, so look at the Call Library Node (CLN)! While your information is not really enough for a step-by-step plan, you will have to create two VIs, each with a CLN in it. Another bit of information that is absolutely unclear is the calling convention, which might or might not be defined by the EXPORT keyword. It is probably stdcall, but cdecl would be quite likely too if the DLL you try to call is designed for multiplatform use.
Trying this out, however, is quite easy: if you get it wrong you will immediately crash LabVIEW, so restart and try again, switching the calling convention before running the VI. What remains are the "(const) unsigned char *" types and the "AES_KEY *". The first is easy, as it is just a C string pointer, so you configure the corresponding CLN parameters as a String type, passed as C string pointer. For the input strings you pass in the string as required by the function, but for the output string you have to allocate the string, since this is a standard convention in C: the caller of a function has to allocate the memory for any parameter the function is supposed to write into. Allocating memory in LabVIEW is easiest done by creating an array of unsigned 8-bit integers with the Initialize Array function, and since we really need a string here, we then convert it to a string with the Byte Array To String function. How much memory you have to allocate for the function not to crash is, however, function dependent and should be documented by the developer of that function (if it is not documented in any way, you should kick the developer in his ###### if you get a chance). For the AES_KEY * I can only make wild guesses. It is likely a pointer to a structure; what size this structure has is unknown. As long as that structure does not contain variable-sized elements (strings and arrays), you can create a corresponding LabVIEW cluster and configure the CLN to use Adapt to Type for this parameter. There is a possibility that the structure uses a certain byte alignment. This may be visible in the C header file for your DLL through some #pragma pack() statements, but it might just as well be something only known to the developer. LabVIEW always uses byte-packed structures, so you might have to experiment here. But considering all this, I really very much recommend you look at getting Crypto-G.
It will give you a ready-made solution and definitely cost you less, unless you value your time at absolutely nothing. Rolf Kalbermatter
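The caller-allocates convention described above can be demonstrated with Python's ctypes, which is also a handy way to prototype a DLL call before wiring up a CLN. Here ctypes.memmove merely stands in for the DLL function writing into the caller's buffer:

```python
import ctypes

# The C convention: the *caller* reserves the memory the function writes
# into. In LabVIEW that is an Initialize Array of U8 converted with
# Byte Array To String; in ctypes the equivalent is create_string_buffer.
out_buf = ctypes.create_string_buffer(32)   # 32 zeroed bytes, caller-owned

# Stand-in for the DLL call: the callee fills the buffer up to its
# reserved size (ctypes.memmove plays the role of the DLL function here).
ctypes.memmove(out_buf, b"ciphertext\x00", 11)

print(out_buf.value)     # b'ciphertext': .value stops at the first NUL
print(len(out_buf.raw))  # 32: the reservation itself stays fixed-size
```

Undersizing the buffer here has the same consequence as in LabVIEW: the callee writes past the reservation and corrupts whatever lies beyond it.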
  18. The problem with that VI is that it prints the panel as documentation, that is, with a header. To create reports, for instance, it is usually not desirable to have any other information on the printout than what is on the actual panel to be printed. This printout of panels without any additional information works very well if you set the VI properties to "print when completed". But you have to create your own custom print dialog, because the one discussed earlier will not indicate a user cancel. Rolf Kalbermatter
  19. The data could be binary as you say, and in that case there is really no way to get something intelligent out of it without a good manual describing the actual data format. Another possibility, as already mentioned, is that it is actually ASCII but you are not using the right baud rate, data bits, stop bits, and/or parity settings. Rolf Kalbermatter
  20. I was considering this approach as well. But the problem is that it only allows selecting the printer itself, and not any of the print properties for the particular printer, such as paper selection etc. I'll have to look into the possibilities. Rolf Kalbermatter Well, at least until now, not so here. Rolf Kalbermatter
  21. This can happen with ActiveX controls. LabVIEW's control over them is very limited, and in combination with tabs there has been a bug in LabVIEW that made the ActiveX control unaware that it shouldn't paint at all. On the other hand, support for partial covering of ActiveX controls is very limited, which means that an ActiveX control in LabVIEW is either fully visible or not at all. Overlapping will ALWAYS result in the ActiveX control drawing over anything else, independent of the Z order. Rolf Kalbermatter
  22. Hmm, I never had my hands on the Lego Robolab development system, but I believe it is a special version of LabVIEW in which the VIs are somehow signed to only work in the Robolab software. On the other hand, normal VIs will also not run in the Robolab software, much like the limitations of VIs in earlier evaluation versions of LabVIEW. Probably not something you couldn't overcome with some digging, but well... Rolf Kalbermatter
  23. And you don't need heating anymore ;-). But boy, will that suck next summer with temperatures above 30 degrees Celsius outside. Rolf Kalbermatter
  24. Your comment seems a little harsh, but I have to admit that I'm not really understanding what the OP describes. Must be my poor English I guess. Rolf Kalbermatter
  25. My functional globals almost always turn out to be intelligent globals to some degree ;-). In fact they are not really globals anymore, but intelligent data storage containers. Still, I think "functional globals" pretty much covers all of these aspects. Rolf Kalbermatter
