wallyabcd

Members
  • Posts

    25
  • Joined

  • Last visited

LabVIEW Information

  • Version
    LabVIEW 2023
  • Since
    2017


  1. While Rolf and the other responders are entirely correct, I have in the past attempted similar exercises with some measure of success by limiting myself to certain types of interfaces at a time, such as RS-232 and RS-422 or Ethernet-like interfaces, and using this pattern: "Msg -> Msg + Envelope -> Interface". In fact, I last created such a thing a few months ago for an instrument simulator that supports ASTM LIS2-A2 over RS-232 as well as Ethernet. In general, I structure the program as follows.

An initial configuration program opens and configures as many interfaces as needed, waits for communications, and closes them on command. This program is generic, taking all the usual parameters of RS-232 or of Ethernet, depending on the interface. It can be standalone or part of the state machine described below; the ASTM formatting is done in that state machine.

The next step is a state machine that detects an incoming or outgoing connection request, then opens and initializes the appropriate ports or channels. This state machine should have at least six states: incoming connection request, outgoing connection request, subsequent incoming messages from a device, outgoing communications for a device, communication termination request, etc.

Messaging then proceeds as follows:

Incoming: Interface -> Source -> Msg Reception -> Envelope Decoding -> Msg Decoding -> Response/Handling
Outgoing: Destination -> Message -> Envelope Encoding -> Interface -> Msg Transmission -> Status

Envelope decoding strips the communicated message from its envelope (comm protocol + ASTM format). Envelope encoding takes the message to be transmitted and formats it for transmission. The envelope(s) contain the destination, the communications configuration, the ASTM LIS2-A2 overlay formatting, etc.
The Msg Decoding is simply a Case structure with all the expected commands or message types, plus one case reserved for exceptions, and the logic for how to decode and handle them. The Msg Encoding is a Case structure for all the different types of commands or messages that can be transmitted and how to format them. From here you can see that the case names can be loaded dynamically from a file based on configuration, or replicated for each type of interface, or a new case corresponding to a new interface or instrument type can be added when needed.

In general, this "Msg -> Msg + Envelope -> Interface" pattern is very flexible and powerful. I originally used it about 12 years ago to enable real-time, priority-based communications on an implantable surgical instrument with distributed monitoring (15 clients with 1 master controlling the instrument). It can greatly simplify the creation of a simulator, as you can switch the envelopes, interfaces, or messages, even dynamically. Sorry, no example, as I would have to recreate it; maybe in the future if there is enough interest and time... The important element is the pattern and understanding it; most LabVIEW programmers can then create something adequate. I hope this gets the juices flowing for more productive discussions.
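The envelope encode/decode steps above can be sketched in C. This is a minimal illustration only: the framing below (STX + payload + ETX + a two-digit hex checksum) is a simplified stand-in for the real ASTM low-level protocol, and the function names are my own, not from any library.

```c
/* Minimal sketch of the envelope layer in "Msg -> Msg + Envelope -> Interface".
   The framing is a simplified stand-in for ASTM-style framing, not a
   compliant implementation. */
#include <stdio.h>
#include <string.h>

#define STX 0x02
#define ETX 0x03

/* Envelope encoding: wrap a message for transmission. */
int envelope_encode(const char *msg, char *out, size_t outsz) {
    unsigned sum = 0;
    size_t len = strlen(msg);
    if (len + 6 > outsz) return -1;        /* STX + msg + ETX + 2 hex + NUL */
    for (size_t i = 0; i < len; i++) sum += (unsigned char)msg[i];
    sprintf(out, "%c%s%c%02X", STX, msg, ETX, sum & 0xFF);
    return 0;
}

/* Envelope decoding: strip the framing, verify the checksum,
   and recover the communicated message. */
int envelope_decode(const char *frame, char *msg, size_t msgsz) {
    size_t flen = strlen(frame);
    if (flen < 4 || frame[0] != STX || frame[flen - 3] != ETX) return -1;
    size_t len = flen - 4;
    if (len + 1 > msgsz) return -1;
    unsigned sum = 0, want;
    for (size_t i = 0; i < len; i++) sum += (unsigned char)frame[1 + i];
    sscanf(frame + flen - 2, "%02X", &want);
    if ((sum & 0xFF) != want) return -1;   /* corrupted envelope */
    memcpy(msg, frame + 1, len);
    msg[len] = '\0';
    return 0;
}
```

The Msg Decoding stage would then dispatch on the recovered message, which in LabVIEW is the Case structure described in the post.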
  2. Specialist in software development & engineering, electronics instrumentation, medical devices, dev/project management, and V&V; working with start-ups and established companies to rapidly develop innovative products. LabVIEW, CVI, C/C++, full dev lifecycle and more. Available in Switzerland for employment or contract work. Wally, Geneva / Vaud / Switzerland
  3. Thanks, Asbo! This is great and may make my day tomorrow... I hope it can work on USB mass storage for my LabVIEW dongle! I will let you guys know later.
  4. Hi gentlemen! Who can do me a big favour and convert the VI to LabVIEW 8.5, or give me the parameters to call it from LabVIEW 8.5 without the wrapper? This could solve my pesky little problem. Thanks, Wally

[quoted reply] Well, your drive also seems resistant to WMI as well. The previous DLL used 3 methods, the final one being querying the WMI database. Hmmm. Digging deeper........
  5. Hello Bruce; sorry to hear of your problem... I suspect it may be similar to mine. You didn't mention, though: is your target an RT system? If it is, then read on...

I installed LabVIEW 8.5.1 and opened a project that worked fine under LabVIEW 8.0.1. After resolving some issues concerning the FPGA, I was able to compile my project. After updating my real-time chassis, I copied all the files to the usual places. Immediate crash, with this message on a connected screen: "MKL FATAL ERROR: Cannot load neither mkl_p3.dll nor mkl_def.dll".

I decided to replace all my dependencies with the older versions, and this worked: I was at least able to boot my application. Then, one by one, I started exchanging them for the new ones until the RT system failed, at which point I knew it was "lvanlys.dll" — yes, the Advanced Analysis library. To make a long story short, the App Builder in 8.5.1 for some reason ships an upgraded version of this file which no longer appears to work under the RT system. Upon searching the NI installation directory on my PC, I found an RT version of this file named lvanlys_RT.dll. Replacing the old lvanlys.dll with this file also worked, but putting both of them on the RT at the same time fails. It seems NI looks for the old file first and then, if it doesn't find it, the second one. Trying to run my program in the dev environment makes it behave just like yours. So you can try replacing the above file like I did. (in reply to bmoyer, Mar 27 2008, 05:44 PM)
  6. Hi Alexandro; true, Lua can be adapted... However, I was looking for an off-the-shelf solution that would allow me to run it on LabVIEW RT. Testing the DLLs that come with it with the RT DLL compatibility program fails... http://digital.ni.com/public.nsf/allkb/0BF...6256EDB00015230 This means rebuilding Lua as a DLL and removing all the Windows dependencies. Just to see if it would work, I attempted to call some of the DLLs included with Lua from the development environment targeted to an RT system, and it didn't work. I then tried to deploy, and it wouldn't...

My main aim in embedding Lua was to make it easy to sequence a complex instrument control, and to allow the control programs to be modified without touching the base code. I cannot justify the effort needed to read the relevant documentation in order to modify it, so I have done a very primitive and limited LabVIEW implementation for now. It's not fully satisfactory, but it will have to do; I know that long term we need something more, and I am trying to avoid reinventing the wheel here. Lua's website even mentions the need to support RT in a future version...

My needs are essentially a DLL or two with exported functions that can be called from LabVIEW to either interpret a line of script or a block and return the results to LabVIEW. UnderC came the closest, but just won't embed because of the dependencies on the Windows memory manager. Unfortunately, calexpress doesn't quite do it on RT either. Thanks, Walters (in reply to Ale914, Jul 31 2007, 03:14 PM)
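The DLL interface described above ("exported functions that can be called from LabVIEW to interpret a line of script and return the results") might look like the hypothetical sketch below. The function name, export macro, and toy grammar (number, optional + or -, number) are all my own placeholders, just to make the calling convention concrete.

```c
/* Hypothetical shape of a script-interpreter DLL callable from LabVIEW:
   evaluate one line, return the result by pointer. The "language" here
   only handles <number> [+|-] <number>. */
#include <stdlib.h>

#ifdef _WIN32
#define SCRIPT_API __declspec(dllexport)
#else
#define SCRIPT_API
#endif

/* Returns 0 on success, -1 on a parse error; *result receives the value. */
SCRIPT_API int script_eval_line(const char *line, double *result) {
    char *end;
    double a = strtod(line, &end);
    if (end == line) return -1;            /* no leading number */
    while (*end == ' ') end++;
    if (*end == '\0') { *result = a; return 0; }
    char op = *end++;
    double b = strtod(end, &end);          /* strtod skips leading spaces */
    if (op == '+') { *result = a + b; return 0; }
    if (op == '-') { *result = a - b; return 0; }
    return -1;                             /* unknown operator */
}
```

From a Call Library Function node, `line` would be a C string and `result` a pointer to a double; a real interpreter would replace the toy parser while keeping this boundary.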
  7. Hi; is anyone aware of any script interpreter (C/C++...) or processor that can run under LabVIEW RT? The problem with the current interpreters like Ch, Lua, Python, and UnderC is their dependencies on other libraries like kernel32.dll and so forth... as such, they fail to work on RT. I got UnderC to work under LabVIEW, but it failed under RT because of kernel32.dll... Even adding the file to the project, it still fails; in any case, I wouldn't want to load another memory manager on the RT. I am looking for something that comes as a DLL with header files...

I have a control application that loads and runs very large text scripts for control... I also have a very simple language (interpreter) that understands the scripts. It was initially kept very simple, but now we would like to extend the language: add loops, local variables in the scripts, and inline evaluation of operations. For example:

m1=move_stage(position=100)
m2=read_stage_position()
m3=m1-m2
if(m3>5)
  log_msg(position error)
  blah blah blah...
else
  log_msg(position correct)
  blah2 blah2 blah2...
end

Of course, what is not obvious from the above example is that only the if statement is processed by the external interpreter, while the rest of the lines are processed by our built-in interpreter. The idea is to farm out the loops, branching, and mathematical processing to the interpreter so that I can remake the application by changing the script. My LabVIEW functions are already scriptable... Needless to say, this is a bit more complicated than it appears, because I have to cache the results from each call in a stack... Walters
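The result cache mentioned at the end — each scripted call's return value (m1, m2, ...) kept so later lines can refer to it — could be sketched as below. The fixed-size table and function names are my own assumptions, standing in for whatever structure the real application uses.

```c
/* Sketch of a named-result cache for the script example above: each call's
   return value is stored under its name (m1, m2, ...) so expressions like
   m3 = m1 - m2 can be resolved later. */
#include <string.h>

#define MAX_RESULTS 64

static char   names[MAX_RESULTS][16];
static double values[MAX_RESULTS];
static int    count;

/* Store (or overwrite) a named result. Returns 0, or -1 if the table is full. */
int cache_put(const char *name, double value) {
    for (int i = 0; i < count; i++)
        if (strcmp(names[i], name) == 0) { values[i] = value; return 0; }
    if (count == MAX_RESULTS) return -1;
    strncpy(names[count], name, sizeof names[count] - 1);
    values[count++] = value;
    return 0;
}

/* Look up a named result. Returns 0, or -1 if the name is unknown. */
int cache_get(const char *name, double *value) {
    for (int i = 0; i < count; i++)
        if (strcmp(names[i], name) == 0) { *value = values[i]; return 0; }
    return -1;
}
```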
  8. Hi; simply setting the FTP transfer to passive mode solves this slowness problem... My installation upload time went from 16 minutes to 1 minute 40 seconds! Walters (in reply to wallyabcd, May 22 2007, 11:59 AM)
  9. Hi Jean; you can install either the LabVIEW ETS or the RTX module. The ETS target can be any old desktop PC, with a few limitations on the choice of LAN card supported. Install it on a spare PC and you're ready to go; it will behave almost the same as a real-time platform. A few functions are not supported, or behave differently, on each one... Additional hardware is not so obvious and depends on its exact nature; see the NI website for how-to... Walters (in reply to jlau, Jul 10 2007, 03:31 PM)
  10. Hi; I presume you are using binary mode to log the data; otherwise you would not have this problem! Logging the data in ASCII solves it, though with some overhead. I have been doing this for the last year and, with proper thread management, I get good performance on the RT using a highly distributed program with heavy data logging.

If you want to go down the binary road, things are more difficult: in embedded operation, your instrument needs to recover automatically from a power failure or an unprepared power-down. If you don't close the file properly by flushing the stream, you'll get an error opening the file under Windows... In that case, as one of the users mentioned, you have to flush after every write to force the system to commit the data to the drive. This has some small overhead too. In general I find that, unless you are doing data acquisition, the ASCII format is worth the price in most cases. I was very hesitant about this at the beginning, but I appreciate being able to look at the log files with any simple viewer. I regularly produce text files of 30 megabytes without problems and without overly slowing down the process. Walters Spinx (in reply to JustinThomas, May 11 2007, 06:55 AM)
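The flush-after-every-write idea above, sketched in C (the record format is a placeholder): each ASCII record is pushed out of the stdio buffer before the next one, so a power failure loses at most the line in flight.

```c
/* Sketch of flush-after-every-write ASCII logging: each record is pushed
   out of the stdio buffer immediately, limiting loss on power failure.
   The timestamp/tab/message record format is just an example. */
#include <stdio.h>

/* Append one ASCII record and flush the stream. Returns 0 on success. */
int log_record(FILE *f, double t, const char *msg) {
    if (fprintf(f, "%.3f\t%s\n", t, msg) < 0) return -1;
    /* fflush empties the stdio buffer; a full commit to the physical disk
       may additionally need an OS-level sync (e.g. fsync/_commit). */
    return fflush(f);
}
```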
  11. Hi Ben; the code is configured to use the "other 1" thread in LabVIEW RT, not the UI thread, and the function calls are marked as reentrant. Since we call them through a sequencer, there is no risk of multiple simultaneous calls... Therefore the icon is colored yellow... (in reply to Ben, Jun 28 2007, 03:07 PM)
  12. Hi; I am calling a computation DLL (written in LabWindows) from LabVIEW RT. The DLL is marked as reentrant and given its own thread. When certain fitting functions are called from the DLL library, they seem to block the response from the rest of the threads momentarily. I was under the impression that this wouldn't happen, as the DLL has its own separate thread and, under all circumstances, LabVIEW RT will allocate some time to all six base threads. This doesn't appear to be so in the case of a DLL call. Does anyone know how to force LabVIEW RT (Phar Lap) to do this? Thanks. Walters Spinx
  13. Hi; I have a similar problem to yours... It's not that slow, but still slower than I want. Using the Internet Toolkit, HTTP file transfers are blazingly fast: transfers of hundreds of kilobytes appear instantaneous. Now the bad news: FTP from LabVIEW RT is slow. In my case I have decided to use it anyway, as it's only for my application installer, though it is quite slow. Although I have an HTTP facility built in that I could use, I want the installer to be very generic, so that other FTP tools can even be used to install the application.

Now the good news. One thing that may speed things up a bit is not to use binary mode transfer for everything. A second method is to transfer many files simultaneously; many commercial tools do this. The third method is to use the transfer tool built into Windows (the DOS command line); you can call it from LabVIEW RT. I have had some problems with this approach, though, as sometimes my large transfers (even in binary mode) were truncated. Good luck, Walters Spinx (in reply to lraynal, Mar 6 2007, 02:57 PM)
  14. Hi; without some sort of architectural diagram, it's a bit hard to follow what exactly you're doing, so I will just give you some advice based on a similar thing that I am doing... I am running an RT-based system with FPGA, motion, and vision. Essentially, UDP and TCP are used both for communications and for logging of status. Every command received or processed by the instrument sequencer, and every internal command or piece of feedback, is logged to a file and also transmitted via UDP. This works so fast that transferring small files to the instrument from the desktop appears instantaneous. The only difference here is that I don't use shared variables — be very careful with shared variables. This system generates quite a bit of data and has no problem with the communications eating lots of memory... Make sure you're not unintentionally reading or writing synchronous controls or indicators anywhere in your program.

What I would suggest:
  • Put your communications loop into a separate thread (real easy in LV).
  • In your communication thread, put the sender and receiver in separate loops.
  • Use a bigger queue.
  • Set the loop rate to about 40 ms.
  • Give the thread normal priority.
  • Replace the UDP read with one using a 1000 ms timeout.
  • Make your communications module almost like an independent, self-regulating state machine.

In essence, try to have your code multitask. You can make a quick test to see where the problem may be by lowering the priority of your communications loop and seeing if anything changes. Post the code for more... Good luck, Walters Spinx (in reply to ruelvt, Mar 11 2007, 03:32 AM)
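The "bigger queue" between the receiver loop and the rest of the program can be sketched as a fixed-size ring buffer, as below. In LabVIEW this would simply be a sized Queue; the C version here is an illustration of the decoupling, with the capacity and message type chosen arbitrarily.

```c
/* Sketch of the queue decoupling the receiver loop from the handler loop.
   QCAP and MSG_LEN are placeholder sizes; a real system would tune them. */
#include <string.h>

#define QCAP 256
#define MSG_LEN 64

typedef struct {
    char items[QCAP][MSG_LEN];
    int head, tail, count;
} MsgQueue;

/* Receiver side: enqueue a message. Returns -1 when full (drop or retry). */
int q_push(MsgQueue *q, const char *msg) {
    if (q->count == QCAP) return -1;
    strncpy(q->items[q->tail], msg, MSG_LEN - 1);
    q->items[q->tail][MSG_LEN - 1] = '\0';
    q->tail = (q->tail + 1) % QCAP;
    q->count++;
    return 0;
}

/* Handler side: dequeue the oldest message. Returns -1 when empty. */
int q_pop(MsgQueue *q, char *out) {
    if (q->count == 0) return -1;
    memcpy(out, q->items[q->head], MSG_LEN);
    q->head = (q->head + 1) % QCAP;
    q->count--;
    return 0;
}
```

With the sender and receiver in separate loops, each loop only ever touches its own end of the queue, which is what keeps the communications module self-regulating.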
  15. Hello; the simple answer to your question is yes, you can return an array from LabVIEW, or you can pass a pointer to an array and LabVIEW will modify it. By default, LabVIEW passes values using pointers. Indeed, the previous poster is correct in that you are limited to certain function definitions; but if you are the one writing the code, you can usually find a way to recast things so they work. The first thing to decide is who you want to allocate the memory: LabVIEW or your C++ program? Let's assume you want your C++ code to handle memory; then you simply configure the Call Library Function node to pass the variables from LabVIEW by value. You can also do it the other way around and leave all the memory management to LabVIEW. Good luck. Walt
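A common shape for the pointer-to-array case above is a C function that fills a caller-allocated buffer; the fill logic and names below are placeholders of my own.

```c
/* Sketch of the caller-allocates pattern: LabVIEW (or any caller) owns the
   array and passes a pointer plus its length; the C side only fills it.
   The ramp computation is just a stand-in for real work. */
#include <stddef.h>

/* Fill `out` with `n` ramp samples; returns the number of samples written. */
int fill_ramp(double *out, int n, double start, double step) {
    if (out == NULL || n <= 0) return 0;
    for (int i = 0; i < n; i++)
        out[i] = start + step * i;
    return n;
}
```

From a Call Library Function node, `out` would typically be configured as a numeric array passed as an Array Data Pointer, with `n` as an int32 carrying the array length LabVIEW allocated.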