Everything posted by Rolf Kalbermatter
-
Why is LV beeping at me when I try to edit a VI icon?
Rolf Kalbermatter replied to Michael Aivaliotis's topic in LabVIEW Bugs
Up to 8.6 I do not get any beep when doing that. I'm using the old-style icon editor though. Rolf Kalbermatter -
Somehow I can't follow you. First you say that the property node is terribly slow, although the reasoning for that is only partly right. Yes, it operates in synchronous mode, so it will force an update of the control on each iteration before going to the next one, but there is also additional overhead because the property node (any VI server property node, not just the Value one) has to run in the UI thread, so there are always two context switches involved as well. And then you say one should always use the property node for a tab control. I always use a local for that and have not had any problems so far. I think we had established that the problem here was the "parallel" execution of the string property node (and only for an element contained on the tab control) when updating the tab control through a local variable. So replacing the string property node with a local variable seems to me the better solution. Rolf Kalbermatter
-
DirectX is a technology from Microsoft. It can do a zillion things and one. What is it you want to do? Most likely you are talking about the DirectX interface to access webcams or other image acquisition devices. In that case there are virtually hundreds of threads in this forum about that. Doing a search here and reading through them all, while a very time-consuming process, will likely give you more information than you bargained for, and still be more effective than any crash course we can come up with. Rolf Kalbermatter
-
How to improve Security of Vi code?
Rolf Kalbermatter replied to MViControl's topic in Development Environment (IDE)
Well, you can save a VI with the diagram removed. But this has the aforementioned drawbacks. It contains only the compiled code for the LabVIEW version that created the VI, and only for that platform. Some of your users are likely not to have the same version of LabVIEW, or will want to use it on the Mac or Linux instead, which requires LabVIEW to recompile the diagram. But wait, there is no diagram, so the VI is broken and the user has no way to fix it. So what does that mean? You will have to support whatever LabVIEW version your users have, or state that this VI will only run in LabVIEW x.y on platform Z. Such a limit is likely to drive acceptance of your library so low that you can just as well stop distributing it at all, since nobody will bother with it. It is already hard to get people to bother with VIs that are unprotected and free of charge if they do not come from NI, so any extra hurdle, even password protection alone, only makes that harder. Also, since you have no way to go from the "protected" VI back to the unprotected VI, you do need to maintain backup copies of the unprotected ones. Accidentally overwriting your unprotected VI with the "protected" one happens so easily; believe me, I can guarantee you that this will happen to you! Rolf Kalbermatter -
Look, a LabVIEW byte array (string) is like this:

typedef struct {
    int32 len;
    uInt8 elm[];
} **LStrHandle;

So you have a number of problems to pass this as a C string pointer: 1) The actual character array starts at an offset into the real memory area. 2) The LabVIEW string is a handle (pointer to pointer). 3) Last but not least, LabVIEW data can be "hyperdynamic". This last one is very important! LabVIEW reserves the right to manage its memory in any way that makes sense to it. So if you create a string containing a set of characters, then run some magic like memcpy() (I prefer the LabVIEW manager call MoveBlock() for this) through a Call Library Node to get at the pointer that points to the actual string, and try to pass that pointer to another Call Library Node to your API, LabVIEW might already have reused, resized, or deallocated the original string by the time your API is called. In the best case this means a simple access violation or crash; if you are unlucky, a much less noticeable effect such as strange interactions with other parts of your code that operate on the reused memory, where your API now tries to write something into it (or the other function reusing that memory writes something in and your API suddenly reads gibberish).

There are tricks, such as making sure the original string wire runs through the diagram, without any branching, to some place that due to dataflow dependency will execute after the call to your CLN API, to hopefully make LabVIEW retain that string memory until after your API executes. However, while this has worked in older versions, there is always the chance that new and improved optimization strategies in newer LabVIEW versions will make it suddenly fail. The reason it works so far is that LabVIEW does not usually do diagram optimizations across structure boundaries, so if you run your string wire to the border of a sequence structure that also depends on your CLN being called and finished, you are safe nowadays. But if you just wire that string wire to the border without using it, there is a good chance that a newer LabVIEW version with an improved optimization algorithm might realize that this string is never used after your memcpy() hocus-pocus call, and suddenly the string is gone anyhow by the time your API call tries to access the pointer that you retrieved with so much effort from the LabVIEW string handle. Rolf Kalbermatter
-
You cannot pass LabVIEW clusters containing variable-sized elements such as strings or arrays to a C function. LabVIEW stores its arrays (and strings, which are in fact just byte arrays) very differently from what a C API function expects. So the best approach is usually to write a wrapper DLL with a function that takes the elements in the cluster as individual parameters and constructs the C structure to pass to the real API. There is in principle also a way to do it all in LabVIEW, but to be able to do that you need to know a lot more about C programming and its datatypes than you would need to write a wrapper DLL in C. Rolf Kalbermatter
-
What does this have to do with VI scripting? This is a very simple starter question, and the answer is to look through the zillions of examples that come with LabVIEW and the various drivers/toolkits. The Example Finder in the Help menu is a great tool for that! Rolf Kalbermatter
-
How to improve Security of Vi code?
Rolf Kalbermatter replied to MViControl's topic in Development Environment (IDE)
What is wrong with password protection? It's by far the best method, unless you want to hide your code from NI, who theoretically have the ability to look at that code with ease anyhow. The alternative has only a lot of drawbacks. It basically consists of removing the diagram code entirely from the VI that you give to the customer. The drawbacks are:
- The VI will not work in any other version or platform of LabVIEW, since LabVIEW can't recompile it.
- Maintaining two VI versions, one without diagram and one with, is more complicated.
Rolf Kalbermatter -
I would actually rather have changed the string Value property to a local variable instead. It is a noble attempt to avoid locals as much as possible, but using the Value property instead is in fact driving out the devil with Beelzebub. Rolf Kalbermatter
-
Why is LV beeping at me when I try to edit a VI icon?
Rolf Kalbermatter replied to Michael Aivaliotis's topic in LabVIEW Bugs
Rolf Kalbermatter -
I believe IPv6 has some inherent capability for such load balancing (it is somehow done by assigning addresses from a specifically reserved address range to the interfaces, although I'm not sure how that would have to be set up), but with IPv4 this is a lot of hassle. You basically have to program those three links as independent communication channels and do the load balancing yourself somehow. So: three listen sockets, each bound to the correct interface card address, acting as individual servers on one side, and three client connections on the other. The client has to load the data into memory, split it up, and add some sequencing information in a header, and the server has to accept that data and re-sequence the packets into a continuous stream. A bit like reimplementing TCP/IP on top of TCP/IP. Rolf Kalbermatter
-
A good historical database can easily top those 900 data samples per second, and querying the data is also quite a bit easier, since indexing over the sensor ID is not necessary. With a historical database, where each channel is in fact its own table with a timestamp, value, and status column (no need for relational humbug and such, so this can be highly optimized too), the organization is much more straightforward and the logging of data is very fast. The only difficulty is querying combined data, as you need something similar to a join statement when querying multiple channels at the same time, but that is a task that remains quite manageable since the structure of the tables is very simple. With your approach, querying only gets more complicated as far as the database engine is concerned, although as a user you will not see much of a difference. Adding a new channel in a historical database is simply adding a new table, and that has no influence whatsoever on the already logged data. For the end user it won't make a big difference, other than that a historical database can be much more optimized for the task at hand than solving this with a generic super-duper relational, object-oriented, and whatever-else database engine. And yes, the SELECT statement will look a little different . Rolf Kalbermatter
-
constant value changes on changing representation
Rolf Kalbermatter replied to Dirk J.'s topic in Database and File IO
Why should it do that? The decision about what display precision to use is made at the moment the constant changes from non-float to float, but it is not changed when you switch between float representations. Should it be? That is IMHO highly debatable. I would not like it to change display precision when switching to a different float type after I have set it to a specific display precision. So LabVIEW would have to store an additional attribute alongside the display precision, indicating whether that precision was changed by the user in some way or is an automatic default value. Possible to do, but most likely not worth the trouble, as this would not just be a simple line of code to add but would cut deep into the entire datatype handling of LabVIEW. Rolf Kalbermatter -
Naming Conventions and Project Organization
Rolf Kalbermatter replied to Daryl's topic in Application Design & Architecture
As others have said, you can do that for yourself and your team (if you have any decision power there), but trying to get other people to follow your ideas is very surely doomed. This is a problem in any programming environment, and one with no real solution other than restricting your desire to standardize to the group of people you have the power to direct. In standard programming languages this starts for instance at indentation, bracket styles, UpPeR/LoWeRcAsE, naming conventions, etc., and goes further into what types of elements are allowed or required. Most open source projects have a more or less strongly recommended style but seldom enforce it very strictly, except in those projects where commit access is restricted to one or two "benevolent" dictators. So in short, discussion is fine, but accept that everybody who has been working in LabVIEW for a while has their own style and is very unlikely to be convinced that someone else's style is better. Rolf Kalbermatter -
That is likely to work, but probably does not have the intended effect, since the tab is now changed all the time independent of the event that occurred (or you have to make sure to wire out the correct (current) tab control value in all other event cases, for a real application!). Using a Value property for the tab control (or preferably a local for the string) seems a more logical solution here, and since that executes in the UI thread, it involves a context switch and is sure to avoid the optimization race condition that otherwise seems to be in play here. Rolf Kalbermatter
-
Most likely the patch is really limited to fixing just that single issue and nothing else. So if your computer is not using WMPs, you can probably ignore that patch. More fixes would make testing of the fix more involved, and I'm sure they are saving their developer and tester time for the LabVIEW 2009 SP1 release that is expected somewhere at the beginning of next year. Rolf Kalbermatter
-
The exclusive access to serial resources in VISA from different applications is not a VISA "feature" but a Windows OS feature. Once an application has a serial port driver open, Windows will under normal conditions not allow another application to open that port again. As to why there are Lock and Unlock: you can have applications that communicate from different locations through the same communication port. VISA supports that, but it obviously cannot know about the communication protocol, so if location A writes a command and then location B writes another command (possibly to a different subdevice on that communication link), it is random who will read the response to the A command and who will read the response to the B command. Resource locking solves that problem reliably: A locks the resource, writes the command, and receives the response before unlocking. Rolf Kalbermatter
-
constant value changes on changing representation
Rolf Kalbermatter replied to Dirk J.'s topic in Database and File IO
This is a regular topic in any computer forum, and the answer is very simple: computers are binary and discrete, while real numbers are continuous, with infinitesimally small increments (maybe something Planck related ). To store your number 1.3 exactly, the computer would need an infinite amount of memory, and that is still hard to get nowadays. So floating point formats limit the storage of a number to a limited number of digits, and that is what you see. Single has about 7 digits of accuracy and double about 15 digits. Rolf Kalbermatter -
I don't think a classical database design with one single table for all sensors would be useful. Since you have so much variation in sample speed, this would be an immense waste of space and performance. A historical database would be much better suited, where every channel (sensor) is its own table with a timestamp, value, and possibly status column. That allows you to easily add and remove channels from the setup: adding a new channel is simply creating a new table for it, and removing a channel is simply not using it anymore. LabVIEW has the DSC add-on toolkit, which contains its own Citadel database (based on the Logos engine that comes from their Lookout package). This is a highly optimized historical database, and there also exists an ODBC driver interface to it, so you can query the database from non-LabVIEW applications as well. You could also devise your own historical database design, but I would bet it is hard to beat the Citadel database at this. Rolf Kalbermatter
-
import of "waveform"-dll fails
Rolf Kalbermatter replied to mstoeger's topic in Calling External Code
Most likely you have the classical problem of C runtime library support. You are developing your DLL in some recent Visual C version and then distributing the DLL and your LabVIEW code to a computer other than the one where you created it. Since Visual C defaults to linking with the multithreaded DLL runtime library, your DLL will fail to load on a computer that does not have the corresponding Visual C runtime library installed. You can either use an older Visual C compiler whose runtime libraries are guaranteed to be available on all modern computers (for this reason I for instance still often use VC 6, which creates DLLs that would even run on Win95 with any IE version installed). Another solution is to install the VC runtime redistributables that come with your VC installation on all computers where you want to run this DLL, and the last solution is to change the compile-time settings to use the static (non-DLL) version of the runtime libraries. This last one will blow up the size of your DLL a bit but will avoid exactly this problem once and for all.

Not sure about your waveform issue. You say there is no problem if you do not use waveforms, but I would guess this has more to do with using specific runtime functions in the case where you use waveforms. Also, the LVCluster datatype you show in your code example is not really a LabVIEW Waveform but simply a data structure. The native LabVIEW Waveform data type is, as far as I know, not documented, since it is a special form of LabVIEW variant, and those are nowhere documented in terms of their C interface. Rolf Kalbermatter -
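For reference, the runtime-library choice discussed above corresponds to MSVC's /MD vs. /MT switches; the file names in this sketch are examples only:

```shell
# /MD links the DLL against the shared C runtime, which must then be
# present (installed via the VC redistributable) on the target machine.
cl /MD /LD mydll.c

# /MT links the runtime statically: a larger DLL, but no dependency on
# any msvcr*.dll being installed on the target machine.
cl /MT /LD mydll.c
```

In the Visual Studio GUI the same setting lives under C/C++ &gt; Code Generation &gt; Runtime Library.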
I think the Full Development system was enough for that. Can't really check myself though since I only have the PDS. Rolf Kalbermatter
-
This bug should only appear in 8.6.1. I didn't notice it in 8.6. But the 8.6.1f1 Fix is the right solution for it. Rolf Kalbermatter
-
How to keep the plot left edge fixed?
Rolf Kalbermatter replied to george seifert's topic in User Interface
That is not what this right-click pop-up option is about. Scale Fit, if I'm not mistaken, is the autoscaling. Auto Adjust Scales refers to the fact that the scale area (and, implied by that, the plot area) is resized when the scale needs more space to be displayed. This is on by default but can be switched off with the right-click pop-up menu. Up to 8.6 there is apparently no property to switch this on or off programmatically. Rolf Kalbermatter -
A very nice catch! I'll have to look into this more closely. It would seem to me an almost perfect solution, provided the schema is not too complicated. Rolf Kalbermatter