Rolf Kalbermatter

Members
  • Posts: 3,872
  • Joined
  • Last visited
  • Days Won: 262

Everything posted by Rolf Kalbermatter

  1. Your last suggestion was the first thing that came to my mind when reading your post on Info-LabVIEW. It would seem to me the perfect use case for sub panels. Rolf Kalbermatter
  2. If you build your executable and it stops, it will quit, and then you start it again. The correct way to handle this is to write your configuration into a configuration file (I use INI files for this) and load it on startup. Then inside the application you have a button or menu item somewhere that opens a configuration dialog allowing those values to be changed, and on quitting the dialog you save everything back into the configuration file. To disable front panel elements you use the Property Node with the Disabled property. If the Run-Time Engine is already installed you do not need to install it again. But creating an installer that does include the Run-Time Engine is not a bad idea: the installer will detect an already installed Run-Time Engine and skip that part. Rolf Kalbermatter
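     A minimal sketch of what such an INI file could look like (the section and key names here are made up for illustration; LabVIEW's Configuration File VIs read and write this format):

        [UI]
        Disable Start Button = FALSE
        Update Interval ms = 500

        [Paths]
        Data Directory = C:\Data

     On startup the application reads these keys and applies them (for example wiring the first one to the Disabled property mentioned above); the configuration dialog writes them back before closing.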
  3. This last suggestion I would take with a grain of salt. Of course it is possible, and there were people starting to implement the MySQL TCP protocol in LabVIEW already, but considering the OP's level of expertise in LabVIEW, I think going with one of the existing solutions (the NI Database Toolkit, idx's ADO Toolkit) is already hard enough. Rolf Kalbermatter
  4. Besides the fact that some of our larger corporate customers only changed to XP in the last year or so, I do not see why I should change for myself just yet. My notebook will work for at least another year or two, driver support will be limited for some time, and Windows without a service pack hasn't been running on any of my computers since Microsoft introduced that phenomenon somewhere around Windows NT 3.5 (when I actually did most of my daily administrative work on a Mac IIci). So I guess I will be using Vista when I get a new computer, but not before that. Rolf Kalbermatter
  5. Well, you don't have to worry, of course, but if you choose a license that is not compatible with most mainstream open source licenses, it prevents people who want to use such a license for their own software from making use of your VIs. Of course that is your decision and your call too, but thinking about these issues is not entirely superfluous AFAICS. Rolf Kalbermatter
  6. I don't see the problem. Put that code in a separate LLB or something similar without removed diagrams and you are set. The OpenG Builder even has support for this, automatically separating out all the VIs that come from a specific directory hierarchy into a diagram-enabled LLB. That he has to use the same LabVIEW version that was used to create the app for editing, and can't change the interface (connector pane, control types), are limitations that neither you nor anybody else can help with, but so be it. The latter is actually a limitation that applies to every library implementation. And if he breaks the functionality of the library somehow, that is also really his problem. Rolf Kalbermatter
  7. Ouch! LabVIEW DSC is a bit more than OPC access (which in the days of BridgeVIEW was quite limited) and alarm management. Things like integrated event and real-time logging, a complete tag-based data engine, and completely networked and very fast data exchange between engines are quite a bit more than just some simple cosmetics. I do know how hard these things are, as I implemented in the past a system similar to what BridgeVIEW had been, although not exactly as complete.

     That said, LabVIEW DSC has some issues with things not always working perfectly, partly due to the rather complex system it has grown into, partly due to various technology changes in the past: from an almost entirely in-LabVIEW-written SCADA package (BridgeVIEW 1.0), to a system where quite some parts moved out of LabVIEW and parts of the Lookout SCADA engine were integrated instead (LabVIEW DSC 6.0), to a system where virtually everything important is done by various components outside of LabVIEW, quite a few of them inheriting from the Lookout technology (LabVIEW DSC 7 and 8). This frequent change of architecture introduced its own problems: with each new major version complete subsystems got replaced, and with them came new bugs that needed their time to be ironed out, only to be followed by other new bugs with the next major architecture change. Still, you can build quite powerful and interesting SCADA applications with LabVIEW DSC. I have used BridgeVIEW 1.0 and 2.0 and LabVIEW DSC 5 through 7 for some projects and they always did what had to be done, although there were occasionally problems to debug and resolve in more or less close cooperation with some developers in Austin.

     Lookout in itself therefore has all the features LabVIEW DSC has, and a few more, except the highly programmable LabVIEW environment. It seems NI is not pushing Lookout anymore but has no intentions of abandoning it yet. The problem with Lookout is not that it is a bad system (it absolutely is not). When you create a Lookout program you are not exactly programming but rather configuring your application, yet this configuration is very powerful in the way it is done. The real reason why it never became a big success is twofold: NI had LabVIEW, which is sexy and powerful, and the NI sales force had little knowledge about Lookout and little interest in learning it, since selling LabVIEW was easier and had a lot more sex appeal. Without LabVIEW, and with the will from NI to go after the SCADA market, Lookout could easily have become the killer app in the NI SCADA product line. One of the reasons why I didn't push Lookout more for our projects was the much more involved licensing for distributed apps in comparison to the LabVIEW applications we could make. But to be honest, for classical SCADA applications that certainly wasn't a problem; Lookout was quite cheap in comparison to most other SCADA packages out there back in the late '90s. Rolf Kalbermatter
  8. I think you misunderstand the LGPL a bit. In my opinion, for most LabVIEW code it is more appropriate than the GPL. The GPL requires you to make your whole application GPL. The LGPL does (at least in my opinion) allow you to link the library into your project in a dynamic way, letting you choose another license for your application, as long as you give your users access to the LGPL part in some way (such as a dynamically called VI in an LLB external to your application). This guarantees that modifications to the LGPL library itself are still accessible to the end user (and to a lesser degree to the entire open source community). This was the main reason the LGPL was invented at all: the GPL had too strong a viral effect for some developers who created libraries rather than applications. The LGPL is in itself fully compatible with any GPL project, and it can even be used in closed source applications as long as you provide some means of access to the source code of the LGPL part. Rolf Kalbermatter
  9. Hmm, I know them, as I wrote that library :-) But the status they return is completely independent of TCP/IP or any other network protocol. They return the link status of the network connection (in fact, the detection of the carrier on the network cable when it is connected). However, be aware that this may not always work, although today it may not be as bad anymore. It can depend both on the network card used and its drivers, as well as on the remote side connected. For instance, an attached hub will always show link status connected, even though that hub may not be connected to anything else but the power line. In the past, some hubs with auto-detection/negotiation of the speed and/or crossover connection had trouble properly detecting certain network cards, resulting in a carrier on the network link but no real connection being possible. So don't just blindly expect this library to give you everything. This status only tells you that there is a powered network interface attached to your interface and nothing more. Whether network traffic is possible or not can be, and often is, a completely different issue. Rolf Kalbermatter
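     As a rough illustration of what "link status" means at the OS level, here is a minimal sketch using the Windows IP Helper API (my own illustration of the concept, not necessarily the mechanism the library above uses):

        /* Lists each network adapter and whether the OS reports its link as up.
           Build against iphlpapi.lib. */
        #include <winsock2.h>
        #include <iphlpapi.h>
        #include <stdio.h>
        #include <stdlib.h>

        int main(void)
        {
            ULONG size = 0;
            IP_ADAPTER_ADDRESSES *addrs, *a;

            /* First call only determines the required buffer size. */
            GetAdaptersAddresses(AF_UNSPEC, 0, NULL, NULL, &size);
            addrs = (IP_ADAPTER_ADDRESSES *)malloc(size);
            if (addrs && GetAdaptersAddresses(AF_UNSPEC, 0, NULL, addrs, &size) == NO_ERROR) {
                for (a = addrs; a != NULL; a = a->Next)
                    printf("%ls: %s\n", a->FriendlyName,
                           a->OperStatus == IfOperStatusUp ? "link up" : "link down");
            }
            free(addrs);
            return 0;
        }

     Exactly as described above, this status only reflects the carrier as seen by the driver; a powered hub on the other end of the cable will still report "link up".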
  10. That is not how TCP/IP works. TCP is a stateful protocol, and in order for an error 66 (connection closed by peer) to be reported, the TCP stack must go through the FIN state, which is initiated with a FIN/FIN-ACK handshake. Since the connection simply went away, nothing like this happened. For the local stack the connection is still in an active state, although all packet sends and requests time out, and that is what you get: error 56, timeout. You will have to rethink your approach: TCP in itself does not guarantee detection of line breaks, only detection and reporting of successful transmission. I think there is some configurable timeout clearing for the TCP/IP stack, where the connection is put into the FIN state automatically after a certain number of packet requests time out continuously. Rolf Kalbermatter
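     The "configurable timeout clearing" mentioned above is essentially TCP keepalive. A minimal sketch, assuming BSD sockets on Linux (the TCP_KEEPIDLE/TCP_KEEPINTVL/TCP_KEEPCNT options are Linux-specific; other stacks expose similar settings differently):

        #include <sys/socket.h>
        #include <netinet/in.h>
        #include <netinet/tcp.h>

        /* Ask the stack to probe an idle connection so a vanished peer
           eventually produces an error instead of an eternally "open" link. */
        int enable_keepalive(int sock)
        {
            int on = 1, idle = 30, intvl = 5, cnt = 3;

            if (setsockopt(sock, SOL_SOCKET, SO_KEEPALIVE, &on, sizeof on) < 0)
                return -1;
            /* Probe after 30 s of silence, 3 probes 5 s apart, then declare the peer dead. */
            setsockopt(sock, IPPROTO_TCP, TCP_KEEPIDLE, &idle, sizeof idle);
            setsockopt(sock, IPPROTO_TCP, TCP_KEEPINTVL, &intvl, sizeof intvl);
            setsockopt(sock, IPPROTO_TCP, TCP_KEEPCNT, &cnt, sizeof cnt);
            return 0;
        }

     Once all probes fail, pending operations on the socket return a connection error rather than a plain timeout.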
  11. Sorry Aitor. I remembered from my limited investigations into the 8.0 license manager (yes, that kind of thing is legal here, although using the knowledge to circumvent such protection is not) that there were two explicit areas that seemed to require a license in order to run. One was the scripting feature that we all know about, and the other was something like XNode development. Knowing scripting, I investigated a bit further into it, but not being familiar with XNodes, I never went further on that. Maybe they changed the XNode protection in 8.20, or there are two different aspects of XNodes that are protected differently. I do not know, and won't have time to investigate in the near future. Rolf Kalbermatter
  12. I also have to warn that my opinion is not completely unbiased, as I started my LabVIEW career as an application engineer at NI and then went on to work for an Alliance member. When I started at NI I was shown this software and some manuals (admittedly less complete and much smaller in content than nowadays, but they were printed paper), and then I got the chance to attend a LabVIEW course or two. And those courses really helped a lot. However, I have to say that I had previous programming practice in Pascal and a little C, so programming in itself wasn't a strange matter to me. My electrical engineering background was very delighted when seeing the LabVIEW diagrams that so much resembled the electrical schematics I had learned to think in earlier, so I adopted it quite fast, but nevertheless felt that the course really gave me an advantage. It wasn't so much about programming in itself but about discovering all the little features, editor shortcuts, and tips and tricks, and also the interaction with the teacher and other students during the course.

     Later I taught LabVIEW courses myself, as an application engineer and also as an Alliance member, and I have to say that I still learned a bit during each of those courses. My experience during these courses was that there were two types of people: the ones who already knew programming usually profited a lot more from the course than the ones who had to be taught the basic principles of programming first. Three days is simply not enough to teach someone a whole bunch of programming constructs and something about datatypes, and at the same time have them get familiar with a new software environment such as LabVIEW. But I think that is the same with any software course. I doubt there is a Matlab course that would be useful to anyone who has to be taught the basic principles of mathematics first, for instance.

     The only problem I always felt was that NI likes to market LabVIEW as the tool for non-programmers. In my view that is not entirely correct. Without some basic understanding of loops, conditionals, arrays, and scalars you simply can't create a good working computer application. The advantage of LabVIEW is that these things are easier to understand and use for most people, since people tend to be more visually oriented than text oriented.

     Oh yes, I took the courses in Austin and on a Macintosh, since LabVIEW for Windows didn't exist then, and there were a few people (not NI people) in the same course who obviously had it even easier than me. They usually had the examples finished before the instructor even asked us to start on them. They were attending the class to learn LabVIEW, not programming, something I haven't seen too often over here in Europe later when teaching courses. Rolf Kalbermatter
  13. The idea about FPGA might be interesting here. Earlier versions of LabVIEW did not support (or should I say use?) fixed-size arrays, although the typedescriptor has had this feature explicitly documented for as long as the document about typedescriptors has existed. FPGA was, to my knowledge, the first environment really needing fixed-size arrays, so that pushed the feature. The particular problem you see also looks a bit like growing pains of the early attempts to get the constant folding optimization into LabVIEW. I'm not sure if the FPGA toolkit adds this feature to the LabVIEW environment, but I rather think it is there independent of the existence of the FPGA toolkit (I can't currently check, as I have the FPGA toolkit installed as well). Rolf Kalbermatter
  14. Well, I didn't say to turn off all optimizations, certainly not the ones that are already working fine, and in the particular case of 6.0.1 it was not about inplaceness as such. It was about a more aggressive inplaceness optimization that would completely optimize away bundle/unbundle constructs when combined with certain shift register constructions. The same code had worked fine for several years in previous LabVIEW versions without so much as a hint of performance problems, and suddenly blew up in my face. The Queue port was also not such a nice thing, but I got off easy there, since I didn't use queues much; I had gotten used to creating my own intelligent USR (uninitialized shift register) global buffer VIs for virtually anything that needed queue-like functionality.

     But I think there is a big difference between bugs introduced through things like constant folding and bugs introduced in new functionality. I can avoid using queues or whatever quite easily, but I can hardly avoid using shift registers, loops, and basic data structures such as arrays or clusters, since they are the fundamental building blocks of working in LabVIEW. So if something in that basic functionality suddenly breaks, that LabVIEW version is simply not usable for me. The same goes for fundamental editor functionality. Just imagine that dropping any function node on the diagram suddenly crashed on every fourth installed computer somehow. Other bugs can be very annoying, but you can still keep working in that LabVIEW version and write impressive applications with it if you need to. While we would all like bug-free software, I think almost everyone has accepted that this is something that will never really happen before LabVIEW 77, with its 5th-generation AI and environment interfaces with causality influencer. But the basic functionality of LabVIEW 2 should not suddenly break. Rolf Kalbermatter
  15. Well, nothing against German :beer: but the Belgians really brew a few of the best ones that I know of. And no, I don't say that because he is a colleague. Rolf Kalbermatter
  16. Ah, I see. Well, I myself still have to do my first RT/FPGA project in 8.x. 7.1.1, while having some quirks, actually still works great for that. Rolf Kalbermatter
  17. Most likely you are doing something wrong when calling your DLL. There is no way LabVIEW should be able to access memory controlled by your DLL outside of calls to your DLL. The most likely cause of such crashes is actually that you pass a too small buffer to a DLL function that tries to write to that buffer. In C the caller (i.e. you, as LabVIEW programmer) has to allocate the buffer. LabVIEW cannot even guess how big such a buffer should be, so you have to tell it. You do that by creating the LabVIEW array or string with functions such as Initialize Array, with the necessary size, before passing it to the Call Library Node. Most people think that an empty array constant as input is enough, since that is how it would work in LabVIEW. But the C function cannot dynamically resize the array to the size it requires, so it just assumes that the caller has done that already. LabVIEW, however, cannot resize it automatically before passing it to the C function, since it has no idea if that array should be 10 bytes or maybe 100 MB.

     Passing an empty array will basically cause LabVIEW to pass a pointer to a zero-length buffer. Now your C function writes into that zero-size buffer and overwrites data it should not even look at. If you are lucky, you get an illegal access exception when that memory has not yet been allocated to the process at all. More likely, however, the memory area following that pointer is already used by LabVIEW for other purposes, including its own diagrams, front panels, management structures and whatnot. If you are still a bit lucky, you destroy very important information that will soon cause LabVIEW to crash. In the unluckiest case you just overwrite memory that is actually part of your VI definition in memory. Then you do a save, et voilà, you have a corrupted VI on disk that might not load into memory anymore!!!!! In your case, data gets overwritten that seems not very important but renders some pointers invalid. At the end, when LabVIEW tries to properly deallocate all it has allocated before, it stumbles over these invalid pointers and crashes. Killing your app only avoids the symptom but doesn't cure the cause.

     So if you get strange crashes, check if you use DLLs anywhere. If you do, and those DLLs did not come with LabVIEW itself, you should get very cautious. Stress test them as much as you can with the setup used in your application. You may have a time bomb in your application!! It may seem harmless now, but seemingly small changes to the app might move the corruption into much more sensitive areas. Then your app crashes consistently somewhere seemingly unrelated, just because you added a single button to that nice user interface, and you post here that LabVIEW crashed because of adding a simple standard button to your VI. Rolf Kalbermatter
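     A minimal sketch of the caller-allocates contract described above (the function name and returned string are made up for illustration):

        #include <string.h>

        /* The DLL writes at most bufLen bytes into buf; it can never resize it.
           The LabVIEW caller must preallocate buf, e.g. with Initialize Array
           wired to the Call Library Node parameter. */
        __declspec(dllexport) int GetDeviceName(char *buf, int bufLen)
        {
            const char *name = "SomeDevice";   /* hypothetical result */

            if (buf == NULL || bufLen <= (int)strlen(name))
                return -1;                     /* buffer too small: report, don't write */
            strcpy(buf, name);
            return 0;
        }

     Wiring an empty array constant to the Call Library Node is exactly the bufLen == 0 case; a well-behaved DLL returns an error, a badly behaved one scribbles over LabVIEW's memory.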
  18. class is a C++-only concept and therefore will never work directly with the Call Library Node. With "wrapper" I meant writing a standard C function for each method you want to call in your class. Probably something like the following, but my C++ is very rusty and not really good:

        #ifdef __cplusplus
        extern "C" {
        #endif

        int FirstMethod(int arg1, int arg2);
        /* ...... */

        #ifdef __cplusplus
        }
        #endif

        static My_Class mc;

        int FirstMethod(int arg1, int arg2)
        {
            return mc.FirstMethod(arg1, arg2);
        }

        /* etc. */

     You can do the same for dynamic classes, but then you will have to pass the object pointer as an extra parameter in your wrapper functions, and you also need to create extra functions to create and dispose of the object pointer; see the sketch below. Rolf Kalbermatter
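     A sketch of that dynamic variant (names are hypothetical; compile as C++, and the void pointer travels through the LabVIEW diagram as an opaque pointer-sized integer):

        /* extern "C" keeps the names callable from the Call Library Node. */
        extern "C" {

        void *MyClass_Create(void)
        {
            return new My_Class();
        }

        /* Every method wrapper takes the object pointer as an extra parameter. */
        int MyClass_FirstMethod(void *obj, int arg1, int arg2)
        {
            return static_cast<My_Class *>(obj)->FirstMethod(arg1, arg2);
        }

        /* Must be called exactly once when done, or the object leaks. */
        void MyClass_Dispose(void *obj)
        {
            delete static_cast<My_Class *>(obj);
        }

        }   /* extern "C" */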
  19. There is a little problem with this optimization. As long as it only works sometimes but NEVER creates wrong results, I don't care. But if I create a VI that does something specific and logical and the result that comes out is simply completely off track, I get very pissed. This happened with certain optimizations in shift register handling in the obnoxious 6.0.1 version and in other versions before and after, and this whole constant folding business has again caused quite a bit of trouble. The difficulty simply is: you do not expect LabVIEW to calculate 1 + 1 = 3, and when you get such a result you sometimes search for hours, questioning your sanity, before you throw in the towel and decide that it really is a stupid LabVIEW bug. I can live with LabVIEW editor bugs or new features that don't always work correctly, but I certainly don't accept LabVIEW creating completely wrong code from diagrams that have worked for several versions before. As such, I do not want constant folding unless I can rely on it not to cause the compiler to create wrong results. If I need optimization I can think about the algorithm myself and find a variant that is quite likely just as fast or even better than what LabVIEW could possibly derive from a different, suboptimal algorithm. My stance here has been and always will be: I'd rather have suboptimal and possibly even slow generated code that produces correct calculations than hyper-fast code that calculates into the mist. The only exception might be if the miscalculation were to my advantage on my bank account.

     But in this case the bad programmer is not the one USING LabVIEW. I know how hard optimization is, but still, I would rather have a choice in this than having to start doubting LabVIEW itself every time a result does not match my expectations. And to be honest, I do have this choice: I still use LabVIEW 7.1.1 for basically all of my real work. Rolf Kalbermatter
  20. It is definitely a bug. Since the flattened data output string is not the same, this will cause problems. There are many cases where Flatten/Unflatten is just used to get the data into a stream format and back, and the context alone is enough to determine what data has been flattened, so parsing the typedescriptor (which until a few LabVIEW versions ago was documented, although no official VIs for this parsing were available) was absolutely unnecessary. And someone at NI obviously thought the typedescriptor was superfluous too (I definitely don't agree, but have no influence on that), otherwise they wouldn't have removed it in LabVIEW 8, would they? Rolf Kalbermatter
  21. While I agree that this is not really the way to deal with passwords, I wonder if there is a real problem. Are you going to distribute the lvproj file with your VIs too? Not sure why you would want to. Rolf Kalbermatter
  22. My solution was to install them in one specific LabVIEW version and then copy all the files from that version to all other LabVIEW installations manually. That is a bit of work, but much less than this stupid install, rename, uninstall carousel. Rolf Kalbermatter
  23. Actually, not necessarily. This is behaviour that also occurs in C, at least with the compilers I know of, and it has its uses when you read in data from a stream in a certain format but later want to reinterpret some of the data. I know for sure of a few cases where I have relied on this fact, and changing it now would certainly break lots of people's VIs. Rolf Kalbermatter
  24. Well, it's not the use people here are interested in ;-). It's how you made them! As far as I know there are only two possibilities:
     - You got a license from NI somehow (and have signed an NDA, or someone has on your behalf), and then posting this here could get you in trouble.
     - You hacked LabVIEW to not do a license check anymore, or something like that, and then you are also in trouble, at least in certain countries on this globe that think even thinking about circumventing copy protection is a major crime.
     Rolf Kalbermatter