Donald (Members · 71 posts)

Everything posted by Donald

  1. Hello, I noticed that the Flatten To String function in the attached LV7.1.1 VI does not stay in the "Convert 7.x data" format mode. Other VIs in my project (also with a variant data type input) stay in the LV7 format when loaded with LabVIEW 8.20. I guess this is a minor bug that has not yet been reported. If nobody can explain why this VI behaves differently, I will send a bug report to NI. Best regards, Donald Download File: post-2015-1168597339.vi
  2. I use the TDMS format for a SCADA-like LV8.2 application. Each tag is stored at a configurable interval with its status and timestamp. Properties can be added to each group or channel. I understand that we now have two file formats: TDM = an XML-encoded header file holding the channel-group configuration and all properties, plus additional binary files holding the channel data (it is also the native DIAdem storage format, so I do not think it will disappear); TDMS = a binary header file plus additional binary files holding the channel data. TDMS has a very clean interface and is available on more LabVIEW platforms than TDM. >>> DOES SOMEBODY KNOW IF IT IS AVAILABLE UNDER LV8.2 FOR LINUX? There is a LabVIEW TDMS-to-TDM file converter available with LV8.2. For TDMS there is an update for the 'viewer' VI on the NI website. There is also a TDM plugin available for Excel which works very well (at least with my test files, which do not have a very complex structure); the TDMS Excel plugin is still under construction by NI. I use DIAdem to visualise the TDMS files (an import plugin is needed for DIAdem 10). They work very well together, but of course DIAdem is not widely used yet and not free. Possible problem: your runtime (in case you build an executable) will get (much?) bigger when you include the USI library in the installer script (Universal Storage Interface, I guess). My target is an embedded controller where each MB counts; otherwise it doesn't matter so much. Conclusion: I'm very pleased with TDMS as it is very flexible and fast, and the number of library VIs is very limited. What I would like to test is performance for big chunks of data (2 GB+ files); if these tests are OK I'm going to use TDMS for an EEG/EMG recording application that records about 32 x 2 GB files a day (EDF format). TDM was too slow for this application.
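As an aside for readers outside LabVIEW, the group/channel/property hierarchy described above can also be poked at from a text language. This is a minimal sketch assuming the third-party Python package npTDMS is installed; the file name, group/channel names and property values are made up for illustration:

```python
import numpy as np
from nptdms import TdmsWriter, TdmsFile, RootObject, GroupObject, ChannelObject

# Write one group with one channel; properties can hang off the file, the group or the channel.
root = RootObject(properties={"application": "SCADA demo"})
group = GroupObject("Tags", properties={"scan_interval_s": 1.0})
channel = ChannelObject("Tags", "Pressure", np.array([1.0, 1.1, 1.2]),
                        properties={"unit": "bar", "status": "good"})

with TdmsWriter("demo.tdms") as writer:
    writer.write_segment([root, group, channel])

# Read it back: the structure is file -> groups -> channels, each with its own properties.
tdms = TdmsFile.read("demo.tdms")
for grp in tdms.groups():
    for ch in grp.channels():
        print(grp.name, ch.name, ch.properties, ch[:])
```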
  3. Hello LabVIEW developers, LabVIEW remote debugging is working again. Adding port 3580 as 'LabVIEW Debugger' to the firewall's exception list solved the problem (disabling the firewall was NOT an option as this system is connected to the web). THX for the info
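If you want to verify from another machine that the debug port actually made it through the firewall, a generic TCP connectivity check like the following is enough; this is just a sketch (the host name is a placeholder), not an NI tool:

```python
import socket

HOST = "target-pc"   # placeholder: name or IP of the machine running the executable
PORT = 3580          # port opened for 'LabVIEW Debugger' in the firewall exception list

# Try to open a plain TCP connection; success means the firewall lets the port through.
try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"Port {PORT} on {HOST} is reachable")
except OSError as err:
    print(f"Port {PORT} on {HOST} is NOT reachable: {err}")
```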
  4. THX for the very detailed info, I'm going to test your recommendations on Monday.
  5. I would rather see the option of creating enums where the string values do not have to be sequential. Is this not why ring constants are used instead of enums? I fear nasty bugs in existing projects if you change the behaviour of a ring constant. Execution speed could also suffer from adding extra features. The string info is not part of the data type; please leave it like that. Please note that the combo box has a similar behaviour.
  6. Since LabVIEW 8 it is possible to remotely debug a LabVIEW executable or DLL on a client computer over TCP/IP. It seems to be a fantastic feature that is not used by many LV developers (nearly no Google results). Does anyone have an idea whether you need to configure DCOM or firewall ports for this (cf. OPC server)? I have a headless PC and I lost the remote debugging access (no application enumerated on the host)... I suspect a DCOM problem created by Windows Update. Installing LV8.2 is not an option on a flash disk.
  7. THX Stephen, your reply justifies our green light for LV8.20 development without OOP. I was getting worried because we have at least 10 LV8.20 projects going on (one will run in a nuclear power plant!).
  8. THX for the info, it must be the OOP then, as our company has fewer problems with 8.20 and we do not use OOP (we hate the pass-by-value implementation). I understand your decision to move away from LV. I also fear that you are not the only one. As a die-hard fan of LabVIEW for more than 10 years, I am becoming more and more jealous of the progress of the Visual Studio .NET IDE and tools. To get an idea of how fast .NET/Visual Studio is improving, please read the very good blog about the last TechEd in Barcelona from my colleague Bruno: http://msmvps.com/blogs/vanDooren The progress of LabVIEW seems to be more in embedded design than in modern software patterns and development practices. Where .NET technology tries very hard to avoid hard-coded constants and obscured interfaces, LabVIEW seems to be moving the other way. I cannot sell any LV8 licence to a customer who uses LV7.1! Why should he upgrade, except if he is forced to because of missing toolkits/drivers (e.g. cRIO)??? Yes, I'm frustrated by this :-) because I'm forced to stick with LV7.1.1. >>> Please do not read this as ".NET is better than LabVIEW"; it depends so much on the project and environment. But for large-scale projects at least, .NET seems to be a very good choice for the next 5 years. My main fears about switching to .NET are the increased development time and the more advanced programming skills required. For toolkits the development time is less crucial, and the .NET market is much bigger. To what development environment would you like to switch, and do you have an idea what will happen with development time?
  9. Sounds familiar... one more reason to stay away from x.0 versions, but an x.2 version should not crash during development. I thought 8.20 was OK, and much better than the rush-rush 8.0 version, except for the OOP part, which is most often not needed. My experience with 8.20 is very limited and only with rather small projects (<300 VIs). For mission-critical work I use 7.1.1. At my company we got a green light for 8.20 development ?! I'm very interested in the scope of your problems. Are your problems related to OOP or to the project manager? How to debug? You can't... this is why LabVIEW is NOT used in a lot of mission-critical applications. It gets even worse when you use external interfaces like ActiveX or .NET. I find it unacceptable that a lot of LV bugs are not even in the knowledge base or in the release notes. I always have to call NI Belgium to check whether the bug has already been found, so making a bug report is a waste of my (very expensive) time. Here are some tricks however:
     1) Problems with insane objects can be tracked down by using the LabVIEW debugger. Searching through the LabVIEW info archives, and Brian Renken's website for .ini settings, will teach you this arcane art (I do not understand why this debugger is hidden anyway). I have not tried this for LV8+.
     2) Loading problems can 'sometimes' be solved with a mass compile (at least for data.cpp problems).
     3) Cross-platform application loading problems can be caused by the use of non-standard fonts.
     4) Check for duplicate files.
     5) Check for llbs, VIs and ctls with abnormal size (very small or very large).
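For tricks 4 and 5, a small script can save a lot of clicking. This is a rough sketch, not an NI tool; the project path and size thresholds are made-up values you would adjust. It walks a project folder and flags duplicate VI names and files with abnormal sizes:

```python
import os
from collections import defaultdict

PROJECT_DIR = r"C:\MyProject"       # placeholder: root folder of the LabVIEW project
EXTENSIONS = (".vi", ".ctl", ".llb")
MIN_SIZE = 1_000                    # bytes; thresholds are arbitrary examples
MAX_SIZE = 5_000_000

names = defaultdict(list)
for root, _dirs, files in os.walk(PROJECT_DIR):
    for name in files:
        if name.lower().endswith(EXTENSIONS):
            path = os.path.join(root, name)
            names[name.lower()].append(path)
            size = os.path.getsize(path)
            if size < MIN_SIZE or size > MAX_SIZE:
                print(f"Abnormal size ({size} bytes): {path}")

# Duplicate file names in different folders are a classic source of cross-linking trouble.
for name, paths in names.items():
    if len(paths) > 1:
        print(f"Duplicate name '{name}':")
        for p in paths:
            print(f"  {p}")
```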
  10. If the controls/indicators follow the typedef, the constants should too. I was not aware that typedef'd ring controls/indicators follow their typedef now. One of the differences from enums is that the data range of a ring is not bound to its array of strings and values... The 'allow undefined values at runtime' option is more to prevent user input. The typedef locks the user-interface part of the control but not all of its data behaviour. See my example (LV7.1.1). I guess this is why the constants do not follow typedef changes. It is a pity, because ring values do not have to be sequential and would make nice holders for error-number constants, for example. I think NI should rather unlock the interface for creating sparse enums, because the behaviour of enums is much more consistent. I have seen such enums in LabVIEW, so it is possible (with CVI??). Download File: post-2015-1163495288.llb
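For readers unfamiliar with the term, a 'sparse enum' is simply an enum whose values are not forced to be 0, 1, 2, ... The text-language equivalent looks like this; a Python sketch with made-up error codes, just to illustrate the idea that the value set is part of the type:

```python
from enum import IntEnum

class DriverError(IntEnum):
    """A 'sparse' enum: names map to non-sequential values (made-up codes)."""
    OK = 0
    TIMEOUT = 5001
    CHECKSUM = 5005
    OVERFLOW = 6002

# Unlike a ring, only the listed values are valid instances of the type:
print(DriverError(5001).name)   # -> TIMEOUT
```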
  11. I can confirm that one of the answers on the NI LabVIEW certification exam is "Functional Global"... LV Punk, thx also for the shortcut to the mails of Stephen Mercer. Very good explanation of why they changed the Obtain Queue primitive.
  12. VISA serial read terminates when one of the following conditions is true: there is a timeout; the end-of-message character is found (by default this is a linefeed); or the requested number of bytes is available. I guess you should turn off the end-of-message character detection, or redefine the character. You should do this when you initialise the serial port (VISA Configure Serial Port.vi) or before the read action with a property node of the Serial Comm class. Good luck.
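The same three termination conditions exist in any VISA binding, so here is a sketch of the idea in Python using the PyVISA package (the resource name and termination character are example values): either redefine the end-of-message character or disable it and read a fixed number of bytes.

```python
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")   # example resource name for COM1

inst.timeout = 2000                # ms: the read returns with a timeout error after this
inst.read_termination = "\r"       # redefine the end-of-message character (default is "\n")
# ...or disable termination-character handling and read a fixed byte count instead:
# inst.read_termination = None
# data = inst.read_bytes(64)

reply = inst.read()                # returns when the termination character arrives
print(reply)
```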
  13. Hello, you should disconnect the problem control from its typedef. How?
     1) Open the llb and click "Ignore" when the LabVIEW loader asks for the missing control.
     2) When the program is loaded, press CTRL+L to get the error/warning list window.
     3) In the Error List window, double-click on the typedef error message.
     4) You should see the missing control greyed out.
     5) Right-click on the missing control and select 'disconnect typedef'.
     The VIs are now in a runnable state. Good luck.
  14. Good to hear that 'functional' doesn't come from functional programming, which was not obvious to me. Crelf, I expected your comment about intelligent design, and you are right, but I also guess that the fact that the programmer of the good/bad code used LabVIEW is also a sign of intelligence :-) ? I definitely do NOT like USR, since that limits this style of globals to shift-register-based patterns: 1) For a non-LabVIEW person this is impossible to understand; most of my managers (who pay for the projects) do understand 'intelligent' or 'functional' global but have no idea what a shift register is. 2) And what about queue-based patterns? The queue-based designs are very hard to beat (by LabVIEW-only code) when you need to store/access a large amount of data (more than 10^6 samples) for an unknown number of objects at runtime. If you want more details on queue-based globals, please check the NI application note about handling large data objects in LabVIEW. The biggest problem with the queue-based design is that the queues become invalid after the creating VI stops running... so programs become harder to debug (as far as I know there is no labview.ini setting for this behaviour). I've never seen any LabVIEW programmer/application using this design, so please share your experiences with this design pattern. Maybe we need a new thread for this one? I believe the variant design is not applicable for very large data sets, or NI must have done miracles in LV8.20 regarding variant performance, but it could be useful for a clean OO approach. I guess it is even harder to debug when things go wrong.
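For readers who have never met the pattern, the 'functional global' being discussed is essentially a single routine that keeps private state between calls and exposes it only through a fixed set of actions. A rough text-language analogue (a Python sketch, not LabVIEW's implementation; the names and actions are illustrative) looks like this:

```python
def tag_store(action, name=None, value=None, _state={}):
    """Functional-global analogue: the mutable default dict plays the role of
    the uninitialised shift register that persists between calls."""
    if action == "init":
        _state.clear()
    elif action == "set":
        _state[name] = value
    elif action == "get":
        return _state.get(name)
    elif action == "get all":
        return dict(_state)
    return None

# Usage: every caller sees the same state, much like parallel VIs calling the
# same non-reentrant functional global.
tag_store("init")
tag_store("set", "Pressure", 1.2)
print(tag_store("get", "Pressure"))   # -> 1.2
```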
  15. In Belgium and the Netherlands we (CIT Engineering) most often use the acronym GLI - Intelligent (designed) Global Variable. I believe that the acronym originates from the Philips Natlab R&D centre in Eindhoven (NL) and was introduced by the LabVIEW support group of Albert Geven. I like the acronym because it is short and easy to understand, also for non-hardcore (= occasional) developers and students. Everybody understands that an intelligent global has to offer something extra over a standard global variable. I wonder what the 'functional' in Functional Global stands for. Does it refer to functional programming? That would be strange, because functional programming tries to avoid mutable objects. Does anybody know this? Wikipedia tells me: "Functional programming is a programming paradigm that conceives computation as the evaluation of mathematical functions and avoids state and mutable data. Functional programming emphasizes the application of functions, in contrast with imperative programming, which emphasizes changes in state and the execution of sequential commands." Sources: http://en.wikipedia.org/wiki/Functional_programming http://en.wikipedia.org/wiki/Immutable_object
  16. Hello LabVIEW developers, I also have this problem; it is not your PC or Windows configuration.
  17. Does anybody know if these VIs also work for LV7+ on Linux these days? I tried the property nodes used above under LV7/SuSE 7.2, and the only thing returned as a printer was *???
  18. Status Update: Fixed in 8.0.1 (see also the 8.0.1 release notes).
  19. When a waveform graph is copied/pasted and the waveform has one or more cursors with their names visible, the clone loses the cursor names. This seems to be a bug in the property list of the waveform graph. Not all properties are visible to us; I guess the label position is not correct. Workaround? Yes: empty the cursor list to 'reset' all the properties, including the 'secret' ones. I tested this bug with LabVIEW 7.1 and 7.1.1 for Windows. The bug is fixed in LabVIEW 8, and 6.1 also works fine. Download File: post-2015-1138650380.vi