Everything posted by Rolf Kalbermatter

  1. QUOTE(tharrenos @ Sep 16 2007, 11:18 AM) As far as LabVIEW is concerned there won't be any problem. WiFi integrates cleanly into the OS networking stack and as such just looks like any other Ethernet hardware interface to LabVIEW, meaning LabVIEW has absolutely no idea whether the packets go over twisted pair, fiber optics, a dial-up modem, or a WiFi network. The central heating part might be a bit more troublesome. You will need some hardware that can control the heating and at the same time has a WiFi interface too. As an embedded device, which would only be interesting for high production volumes, that would require some engineering. If you just put a normal (small form factor) PC beside it and run a LabVIEW executable on that, the necessary engineering would be quite limited. Rolf Kalbermatter
  2. QUOTE(Louis Manfredi @ Sep 13 2007, 01:21 PM) Yes, upgrade notes would not have worked. They have both been in LabVIEW for as long as I can remember, which is about version 2.5, or maybe the parse enum wasn't in there yet but was at least in 3.0. Possibly 2.2, Mac only, had it already too. Rolf Kalbermatter
  3. QUOTE(pjsaczek @ Sep 12 2007, 04:47 AM) As indicated already, if you talk to a serial device, always explicitly append that device's end-of-message character to the command. Alternatively, if you want to depend on a specific VISA feature, set the "Serial Settings->End Mode for Writes" property for that VISA session to "End Char" and make sure you also set the "Serial Settings->End Mode for Reads" property and the "Message Based Settings->Termination Character" property accordingly, as well as the "Message Based Settings->Termination Character Enable" property to True. As already said, this could have problems with (really old) VISA versions, and I find it better to explicitly append the correct termination character to each message myself. Also note that since about VISA 3.0 or so, the default for serial VISA sessions is that "Serial Settings->End Mode for Reads" is already "End Char", the "Message Based Settings->Termination Character" property is set to the <CR> ASCII character, and "Message Based Settings->Termination Character Enable" is set to True. However, the "Serial Settings->End Mode for Writes" property is set to "None" for quite good reasons, as it modifies what you send out, and that can be very bad in certain situations. LabVIEW as a general-purpose programming environment shouldn't do that for you automatically, since there are many somewhat more esoteric devices out there that use a different termination character or mode than appending a <CR> character or a <LF><CR> character sequence. Rolf Kalbermatter
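     Since LabVIEW code is graphical and can't be shown in text, here is a minimal Python sketch of the "append the terminator yourself" approach described above. The helper name `with_term` and the `<CR>` default are my own assumptions; the real terminator is whatever the device's manual specifies:

     ```python
     TERM = "\r"  # <CR>; an assumption for this sketch, use whatever your device expects

     def with_term(cmd: str, term: str = TERM) -> str:
         """Append the end-of-message character only if it is not already there."""
         return cmd if cmd.endswith(term) else cmd + term

     # Every command goes out with exactly one terminator, however it was built.
     print(repr(with_term("*IDN?")))    # '*IDN?\r'
     print(repr(with_term("*IDN?\r")))  # '*IDN?\r'
     ```

     Keeping the terminator in one helper, instead of relying on a driver-level write mode, means the code behaves the same regardless of the installed VISA version.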
  4. QUOTE(pjsaczek @ Sep 10 2007, 10:52 AM) No, LabVIEW is not adding anything to the strings on its own, and I would be really mad if it did. However, if you use a reasonably recent VISA version (let's say less than 5 years old or so), you can configure it to do that automatically for a serial session. But personally I don't find that such a good idea. I prefer to code each command in such a way that the correct end-of-string indication is appended explicitly. Rolf Kalbermatter
  5. QUOTE(jed @ Sep 11 2007, 04:26 PM) The Windows message queue example is not really meant to hook directly into the queue but only to monitor it. In order to hook into that event you would have to modify the C source code of that example to do so specifically. Not too difficult, but without some good C knowledge not advisable. Another way might be that newer LabVIEW versions send a filter event, "Application Instance Close?", in the event handling structure, which you can use to disallow shutting down the app. Not sure if it will disallow shutting down the session directly, though. But it should be enough to detect that there might be a shutdown in progress and allow you to execute the command Adam mentioned to abort it. Rolf Kalbermatter
  6. QUOTE(dsaunders @ Sep 11 2007, 03:09 PM) The private property is somewhat of a pain to use, as there are really several pieces of information you would need to check. First, the connector pane layout itself, which is just a magic number for one of the patterns you can choose. Then the array of connections, with an arbitrary number identifying the position on the connector pane each one is connected to. And last but not least, the data type of each connection, which is a binary type descriptor and can't just be compared byte for byte, but needs to be verified on a logical level, since different binary representations do not necessarily mean different data types. Not sure I would want to spend that much time on this! Another possibility, and LabVIEW uses this a lot for its plugins, is to use the Set Control Value method and then simply run the plugin VI. That makes the call completely independent of the connector pane. You just need correctly named controls on the front panel. Using an occurrence (or notifier, etc.) you can wait in the caller for the VI to signal its termination, or simply poll its status until it is not running anymore. Rolf Kalbermatter
  7. QUOTE(chrisdavis @ Sep 11 2007, 08:45 PM) My best bet is .Net, although I think the WinAPI will also give you this information, at a somewhat lower level and probably not exactly trivially accessed. Possibly LabVIEW has some private hooks itself. It would make sense for LabVIEW to have this info somehow, to make it platform independent. Rolf Kalbermatter
  8. QUOTE(Techie @ Sep 9 2007, 10:04 PM) All comments so far have been valid and right. One other thing: creating controls, while in principle possible, requires the use of so-called scripting, a feature that has not been released by NI and has been made difficult to access too. Also, you cannot do it on a VI that is running, so for most user applications it is meaningless, as you usually want to do such stuff from the VI itself. LabVIEW cannot make any modifications to a running VI if those changes would modify the diagram in any way, and dropping a control on a front panel does modify the diagram. That is also a limitation of the compiling nature of LabVIEW. Rolf Kalbermatter
  9. QUOTE(brianafischer @ Sep 8 2007, 01:50 PM) What version of LabVIEW are you using? LabVIEW since at least version 7 is fully multi-monitor aware and moves windows so that some part of each window is always visible, although it does not center them on the primary screen, which I would find too intrusive anyhow. Making an application always open its windows on the secondary screen, independent of their saved location, is definitely a very esoteric requirement. Personally I find it rather disturbing to have to look right or left of the main screen depending on the setup. It's already hairy enough to switch between single-monitor and multi-monitor use. But I know many applications that will always open their window at the last saved monitor location regardless of whether that monitor is present, and I really find that annoying behaviour, especially if they also replace the default system menu, so that you can't even move the window from the task bar with the cursor keys. And upgrading to LabVIEW 8.5 could help you a lot. There you can configure, per front panel, on which screen it should be displayed, and LabVIEW will honor that if that monitor is present. But I think this is mostly a runtime feature. Not sure it will have the desired effect on windows in edit mode, and definitely not for the diagram windows. Rolf Kalbermatter
  10. QUOTE(yen @ Sep 8 2007, 02:22 PM) In order to be able to connect to a LabVIEW application over ActiveX, it must also have been built with that option enabled. I personally always make sure that option is disabled, for various reasons including security. Rolf Kalbermatter
  11. QUOTE(yen @ Sep 5 2007, 01:36 PM) Just because there is a Basic development environment for this target does not mean that there couldn't be a (GNU-based) C toolchain for it too. Rolf Kalbermatter
  12. QUOTE(tcplomp @ Sep 7 2007, 02:38 PM) And just to show what the pre-LabVIEW 8.5 version of this code would look like: http://forums.lavag.org/index.php?act=attach&type=post&id=6885 Forget about the unwired loop termination! ;-) That is not the point here. And if anyone wonders why one would write a VI in such a way: it is called pipelined execution and has advantages on multi-core or multi-CPU machines, as LabVIEW will simply distribute the different blocks onto different CPUs/cores where possible. On single-core systems it has no real disadvantage in terms of execution speed, but this construct of course takes a memory hit, because the shift registers store double the data between iterations compared to what a linear execution would need. Rolf Kalbermatter
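     The effect of those shift registers can be mimicked in plain text-language terms. Here is a Python sketch (the two stage functions are made up for illustration): stage 2 always consumes what stage 1 produced in the previous iteration, so the two stages never depend on the same element and could run on different cores, at the cost of one extra element held over between iterations:

     ```python
     def stage1(x):          # first processing block (hypothetical)
         return x * 2

     def stage2(x):          # second processing block (hypothetical)
         return x + 1

     data = [1, 2, 3, 4]

     # Linear execution: both stages run on the same element each iteration.
     linear = [stage2(stage1(x)) for x in data]

     # Pipelined execution: stage2 works on the previous iteration's stage1
     # output, held over in a "shift register". One extra iteration drains
     # the pipeline at the end.
     pipelined, reg = [], None
     for x in data + [None]:
         if reg is not None:
             pipelined.append(stage2(reg))
         reg = stage1(x) if x is not None else None

     print(pipelined == linear)  # True: same results, different scheduling
     ```

     The `reg` variable plays the role of the shift register: it is exactly the duplicated data between iterations that causes the memory hit mentioned above.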
  13. QUOTE(Jim Kring @ Sep 7 2007, 10:22 AM) Just learned that yesterday at the local LabVIEW day here in the Netherlands, presented by Jeff Washington. His example had a loop, but it was about pipelined execution, and boy, I can tell you that although I'm excited about this feature, it does need getting used to. Basically, with this node you sort of have to forget a few things about data flow and wire dependency. And yes, Jeff mentioned that the original Feedback Node was implemented by an intern; they had thought he had chosen to implement it simply as a folded shift register, but that seems not to have been the case, and that is why it was much slower than a shift register. In 8.5, however, Jeff claimed that the Feedback Node should, in all aspects we as users could possibly measure, behave exactly like a shift register. Probably there is also already an NI patent pending for it :-) Rolf Kalbermatter
  14. QUOTE(wbiker @ Sep 7 2007, 04:02 AM) Most likely you either call some external code incorrectly when this event occurs, or, probably also possible, you do not properly close some functionality contained in external code, which makes it hang. Another possibility is that you have multiple loops in your application, and one of them, instead of exiting too, starts to run freely without any asynchronous delay function in it anymore. But without seeing some of your code, and preferably a seriously scaled-down application that still exhibits this behaviour, it is not possible to give you more specific advice. Rolf Kalbermatter
  15. QUOTE(TobyD @ Sep 6 2007, 10:32 AM) Or before there was VI server, like this: http://forums.lavag.org/index.php?act=attach&type=post&id=6875 Rolf Kalbermatter
  16. QUOTE(wbiker @ Sep 7 2007, 02:28 AM) By clicking the stop button on the toolbar you are not stopping the VI but really aborting it. If the program execution was at that point inside an external code part, you can end up with a locking situation, since the external code may wait on some event message processing which will never happen, because the whole application was basically taken down brutally. Most NI drivers, especially in newer LabVIEW versions, are written in such a way that LabVIEW informs them about aborts so that they can cancel any waiting, but third-party drivers usually don't have that, also because the means of being informed by LabVIEW about aborts are not really documented. But even NI drivers can still sometimes get stuck in such a way. The toolbar stop button is really just a last-resort measure to abort a program, or for quick and dirty testing, but should not be used as the normal way of starting and especially stopping a LabVIEW program. The LabVIEW program should have some sort of event processing with an explicit quit button. In that case the event processing loop (and any other parallel loop that might be running) gets properly terminated, after which you can clean up any DAQ, instrument, IO bus, etc. operation properly by closing those resources, and once the last item on the diagram has executed, the program stops too, but this time cleanly. Rolf Kalbermatter
  17. QUOTE(Karissap @ Aug 31 2007, 01:38 AM) Only if you are quite good at C and at integrating that through a DLL into LabVIEW. The System Tray API in Windows uses callbacks, and integrating those with LabVIEW is not trivial. ActiveX should be easier, but I don't think there is a standard Windows ActiveX control that does the system tray. QUOTE(MikaelH @ Aug 30 2007, 08:17 PM) You can then create the "Icon tray" object, set its properties/icons, and then when a user interacts with the system tray icon, the .Net part fires callbacks to LabVIEW. The .Net system tray API seems to have a problem somehow. I've seen several applications using it, and they all seem not to remove the icon from the system tray when the application closes. Apparently there is no way to properly make such an icon disappear, especially when the application closed unexpectedly. Rolf Kalbermatter
  18. QUOTE(LV Punk @ Aug 29 2007, 06:26 AM) Since you plan on throwing away all but the last data anyhow, a notifier may be better than a queue in this particular situation. Rolf Kalbermatter
  19. QUOTE(jlokanis @ Aug 29 2007, 12:43 PM) Well, if you have a LabVIEW version before 8.0 somewhere, you can copy the stuff from there. There was no lvlib before 8.0. Alternatively, I wrote a library long ago that works similarly to the NI Eval Formula function but is implemented a bit differently, dividing the evaluation cleanly into a parser that creates RPN (stack-based) intermediate code and an evaluator that operates on that intermediate code. I needed that for speedy calculations where user-entered formulas were needed but didn't change all the time, so parsing a formula once and then evaluating it over and over again had some serious performance advantages. The parsing is not trivial, but it is a lot more streamlined than the NI code, and the entire thing is also faster. It hasn't been updated for several years but worked fine for what I needed it for. LabVIEW 5.0 ExprEval.zip Rolf Kalbermatter
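     The parse-once/evaluate-many split can be sketched in Python. This is not the code in the attached ExprEval library, just an illustration of the idea; the operator table and function names are my own. The infix formula is converted once to RPN with the shunting-yard algorithm, and the cheap RPN evaluation is then repeated for each new variable value:

     ```python
     import operator

     # (precedence, function); only the four basic operators for this sketch
     OPS = {"+": (1, operator.add), "-": (1, operator.sub),
            "*": (2, operator.mul), "/": (2, operator.truediv)}

     def parse(expr):
         """Shunting-yard: convert an infix formula once into an RPN token list."""
         out, stack = [], []
         for tok in expr.replace("(", " ( ").replace(")", " ) ").split():
             if tok in OPS:
                 # pop operators of equal or higher precedence (left-associative)
                 while stack and stack[-1] in OPS and OPS[stack[-1]][0] >= OPS[tok][0]:
                     out.append(stack.pop())
                 stack.append(tok)
             elif tok == "(":
                 stack.append(tok)
             elif tok == ")":
                 while stack[-1] != "(":
                     out.append(stack.pop())
                 stack.pop()  # discard the "("
             else:
                 out.append(tok)  # a number or a variable name
         return out + stack[::-1]

     def evaluate(rpn, variables):
         """Evaluate pre-parsed RPN code; cheap enough to call in a tight loop."""
         st = []
         for tok in rpn:
             if tok in OPS:
                 b, a = st.pop(), st.pop()
                 st.append(OPS[tok][1](a, b))
             else:
                 st.append(float(variables.get(tok, tok)))
         return st[0]

     rpn = parse("2 * ( x + 3 )")                           # parse once...
     results = [evaluate(rpn, {"x": x}) for x in range(3)]  # ...evaluate many times
     print(results)  # [6.0, 8.0, 10.0]
     ```

     Only `evaluate` runs per iteration, which is exactly where the performance advantage over re-parsing the formula text every time comes from.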
  20. QUOTE(adriaanrijllart @ Aug 29 2007, 11:41 AM) Hi also. I do remember having met you, and yes, it has been quite some time. In the meantime I've been living in the Netherlands for 11 years already ;-) Greetings, Rolf Kalbermatter
  21. QUOTE(adriaanrijllart @ Aug 14 2007, 12:48 PM) As far as I know it doesn't exist, and the chance that it will is small. HTTPS requires a serious amount of encryption technology, and writing encryption code in LabVIEW is not the most efficient thing to do. But the most critical aspect is the fact that writing encryption routines is a tricky business that not many people know well, and unless you are an absolute pro in that area, trying to do it is likely not to work, or even worse, to pose serious security risks when that code is used. Most people using LabVIEW are simply not professional encryption/security specialists. Using existing C code has the problem that the actual HTTPS encryption sits at the lowest level, just above the network protocol. So there would be only two possible ways: 1) Inject an HTTPS encryption layer into the TCP/IP stack so that LabVIEW does not deal with the encryption at all. This is very difficult to do and would require a state-aware encryption layer at a low point of the entire protocol stack that depends on the protocol state of a higher layer. Not really a good idea. A better idea in that context would be to use something like PuTTY and just create a secure tunnel through which the normal HTTP protocol goes. This would require that you can influence the server side too, as you would have to set up a matching VPN or similar connection. 2) Using the C libraries, implement the HTTPS protocol on top of the LabVIEW TCP/IP primitives. Technically the right way, but so much work that I would not even consider doing it unless I could find a government-sponsored project to pay for it :-). In short, forget it, as it would be quite expensive already! Well, cheaper and more secure than trying to implement the HTTPS security infrastructure entirely in LabVIEW, but still not practical. Rolf Kalbermatter
  22. QUOTE(paracha3 @ Aug 23 2007, 10:05 PM) Tomi has been right with all his recommendations. No need to use a Visual SourceSafe compatible interface, unless you want to invoke the source code control actions directly from within LabVIEW. In my experience, using SVN with TortoiseSVN actually works better. Of course you need some discipline, but that is in the nature of source code control anyhow. One extra note: if you really want to access your source code control system directly from within LabVIEW, you do need a Visual SourceSafe compatible interface plugin. LabVIEW simply interfaces to that API and accepts any compatible source code provider that has been registered in the system. If you can't find a commercial interface for your SCC (but if it is popular by any means, you probably can), you would have to write it yourself. From what I understand it is not too difficult to do, but the API is considered proprietary information by MS, and you only get it by signing an NDA, or at least that was the situation last time I checked. Rolf Kalbermatter
  23. QUOTE(RiverdaleVIEW @ Aug 21 2007, 07:58 AM) Well, a 2D array in C is VERY ambiguous. Without knowing more exactly what the programmer intended, you generally cannot tell for sure from the prototype alone how it is implemented. You can implement a 2D array as one single chunk of memory with all rows (or, for kicks, columns) placed one after another. This is the way LabVIEW handles 2D arrays. As far as the memory layout is concerned, it is just a single one-dimensional array with all rows streamed (serialized). This is pretty efficient and cool, with one single drawback: since you only know the number of rows and columns for such an array, each row has to have the same length, really resulting in an array of rows * columns elements. Another possibility is to create an array of pointers to 1D arrays. This results in the int **a syntax and is a bigger load on the memory manager. Such an array can NOT be created or passed by LabVIEW directly to a DLL, nor exported from a LabVIEW DLL. One of the reasons is that there is no way for LabVIEW (or any C compiler, actually) to know whether this is just a doubly referenced pointer to a single variable or an array of pointers. int *a[] would be a bit clearer in that sense, but traditionally C compilers make no difference between int *a[] and int **a in a parameter list. So if you have an int *a[] or int **a parameter anywhere in your DLL functions, you are going to have to create a C wrapper function, in this or a separate DLL, to translate between LabVIEW and your DLL. Rolf Kalbermatter
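     The two layouts can be demonstrated from Python with the standard ctypes module (a sketch, not tied to any particular DLL; the variable names are my own). The first block is the LabVIEW-style contiguous layout, where element [r][c] lives at flat index r*cols + c; the second is the int ** style array of row pointers that would need a C wrapper:

     ```python
     import ctypes

     rows, cols = 2, 3

     # Layout 1: one contiguous block of rows*cols elements
     # (how LabVIEW stores a 2D array in memory).
     flat = (ctypes.c_int * (rows * cols))(0, 1, 2, 3, 4, 5)
     r, c = 1, 2
     print(flat[r * cols + c])  # element [1][2] via index arithmetic -> 5

     # Layout 2: an int** style array of pointers, each pointing
     # at a separately allocated row buffer.
     Row = ctypes.c_int * cols
     row_bufs = [Row(0, 1, 2), Row(3, 4, 5)]  # keep references alive
     ptrs = (ctypes.POINTER(ctypes.c_int) * rows)(
         *(ctypes.cast(buf, ctypes.POINTER(ctypes.c_int)) for buf in row_bufs))
     print(ptrs[1][2])  # same logical element via double indirection -> 5
     ```

     The first layout is one allocation and pure index arithmetic; the second needs rows + 1 allocations and a pointer dereference per row, which is exactly the extra memory manager load mentioned above.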
  24. QUOTE(Sarfaraz @ Aug 10 2007, 05:52 AM) This message is normal. LabVIEW has been compiled without debugging information for several reasons. One of them is the size it would take; another is the fact that debugging information in the executable can expose sensitive information and makes disassembling it very easy. You should be able to ignore that message and just continue. I have no experience with Visual C 2005, but in Visual C 6 it is just like that. As long as your DLL was compiled with debug information, the Visual C debugger should give you source code debugging for your DLL. It could be that you get prompted for the project file at first launch, so Visual C can locate the source files. Of course, if you start to single-step from your code into LabVIEW code, you will only see assembly there. Rolf Kalbermatter
  25. QUOTE(Aristos Queue @ Jul 23 2007, 05:29 PM) This sentiment! Is that official NI policy now? Maybe the time has come for me to consider whether LabVIEW is still the tool I love to work with. Rolf Kalbermatter