Everything posted by Rolf Kalbermatter
-
Open Source alternative to IMAQ Vision Toolkit
Rolf Kalbermatter replied to Chris Davis's topic in Machine Vision and Imaging
Importing from the DLLs (OpenCV consists of several of them) is really nitty-gritty and time-consuming work, but still by far not the largest part of such a toolkit. Many data types in OpenCV are specifically C oriented and not suited for direct consumption in LabVIEW, so you will have to do some intermediate DLL programming such as what has been done in IVision, or suffer terrible performance. Then you need to make the automatically created VIs use more appropriate controls and add documentation, icons and help information to them. Otherwise nobody is going to use it since it is way too hard to understand. And there is nothing worse than an Open Source library that nobody uses. Rolf Kalbermatter
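To give an idea of the intermediate DLL programming mentioned above, here is a minimal sketch of such a wrapper, assuming the old OpenCV C API (IplImage); the function name, error codes and header paths are purely illustrative and not from any existing toolkit:

    /* Hypothetical wrapper: flattens an image into a caller-allocated byte
       buffer so LabVIEW can receive it as a plain U8 array through a Call
       Library Function node, without ever seeing the IplImage structure.
       Exported with cdecl so the same prototype works on all platforms. */
    #include <string.h>
    #include <opencv/cv.h>        /* header locations vary per OpenCV install */
    #include <opencv/highgui.h>

    int wrap_LoadImageGray(const char *path, unsigned char *buf,
                           int bufSize, int *width, int *height)
    {
        IplImage *img = cvLoadImage(path, CV_LOAD_IMAGE_GRAYSCALE);
        int row;
        if (!img)
            return -1;                       /* file not found / not an image */
        if (img->width * img->height > bufSize) {
            cvReleaseImage(&img);
            return -2;                       /* caller buffer too small */
        }
        /* copy row by row because widthStep may include padding bytes */
        for (row = 0; row < img->height; row++)
            memcpy(buf + row * img->width,
                   img->imageData + row * img->widthStep, img->width);
        *width  = img->width;
        *height = img->height;
        cvReleaseImage(&img);
        return 0;
    }

On the LabVIEW side this maps to a Call Library Function node with a preallocated U8 array, which is exactly the kind of glue layer that keeps the C-specific data types out of the diagram.
-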
That's right, and the fact that CRLF is used as message terminator but can also appear in the message itself makes it a bit complicated. However, there has to be some specific protocol specification. Either you read in one line and from there know what has to follow, if anything (so your receive loop already needs to know about the specific protocol), or you have an empty line added to the end of every multiline message, which indicates the end of the message. Rolf Kalbermatter
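As a rough sketch of the second variant (an empty line terminating a multi-line message), assuming a line-oriented stream; with an instrument you would read up to CRLF from the port instead of using fgets:

    /* Collect lines into 'msg' until a bare CRLF (empty line) or EOF is seen.
       Returns the number of lines collected. */
    #include <stdio.h>
    #include <string.h>

    int read_message(FILE *in, char *msg, size_t msgSize)
    {
        char line[256];
        int  lines = 0;
        msg[0] = '\0';
        while (fgets(line, sizeof line, in) != NULL) {
            line[strcspn(line, "\r\n")] = '\0';   /* strip the CR LF terminator */
            if (line[0] == '\0')                  /* empty line: end of message */
                break;
            if (strlen(msg) + strlen(line) + 2 < msgSize) {
                strcat(msg, line);
                strcat(msg, "\n");
            }
            lines++;
        }
        return lines;
    }

    int main(void)
    {
        char message[4096];
        int n = read_message(stdin, message, sizeof message);
        printf("got %d line(s):\n%s", n, message);
        return 0;
    }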
-
Nope, DS handles are NOT automatically disposed other than at application exit. The same applies to AZ handles. If they were disposed at the end of your VI's execution you would have real trouble returning them to a caller. The difference between DS and AZ really is only that AZ handles are relocatable between calls, which means that you need to lock them explicitly if you want to dereference them and unlock them afterwards. LabVIEW maintains all its data that gets exposed to the diagram, except path handles, in DS handles. However, on all modern platforms there is really no difference between AZ handles and DS handles, since locked handles can still be relocated in the lower level memory manager of the system without causing general protection errors, thanks to the advanced memory management support in today's CPUs. I believe the actual AZ and DS distinction was only really necessary to support the old classic Mac OS. According to some postings on Info-LabVIEW from people who should know, no flattening is actually done. Rolf Kalbermatter
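For illustration, a minimal sketch using only documented DS manager calls from extcode.h, as you would use them behind a Call Library Function node; note that nothing here gets disposed automatically when the VI finishes, the handle simply belongs to whoever owns it:

    /* Resize a LabVIEW string handle passed in from the diagram and fill it.
       The caller (LabVIEW) owns the handle; we only resize and write it. */
    #include <string.h>
    #include "extcode.h"

    MgErr FillString(LStrHandle strHandle, const char *text)
    {
        int32 len = (int32)strlen(text);
        /* grow or shrink the DS handle to hold the new contents */
        MgErr err = DSSetHandleSize((UHandle)strHandle, sizeof(int32) + len);
        if (err != mgNoErr)
            return err;
        LStrLen(*strHandle) = len;
        MoveBlock(text, LStrBuf(*strHandle), len);
        return mgNoErr;
    }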
-
Lucky you! You seem never to have worked with older Visual Studio releases :-) Rolf Kalbermatter There are better ways to commit suicide Rolf Kalbermatter
-
"convert EOL" in "read from text file" VI
Rolf Kalbermatter replied to Guenther's topic in Database and File IO
Since it is a text file, I think modification of the data is not that bad, especially since the LabVIEW internal end of line indicator is LF too. If you want binary consistency then accessing the file as a binary file might be more appropriate. Also, the file itself won't change before you save it back. That said, I do hate the right click popup options of all those new nodes, and that definitely has to go away. It hides fundamental functionality from anybody looking at the diagram, and "that is a bad thing" TM. Rolf Kalbermatter
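For what it's worth, a small sketch of what the convert EOL option effectively does on read, assuming it normalizes CRLF and lone CR to the LF that LabVIEW uses internally:

    #include <stdio.h>
    #include <string.h>

    /* Normalize CR LF and lone CR to LF, in place; returns the new length. */
    size_t convert_eol(char *text, size_t len)
    {
        size_t r, w = 0;
        for (r = 0; r < len; r++) {
            if (text[r] == '\r') {
                text[w++] = '\n';
                if (r + 1 < len && text[r + 1] == '\n')
                    r++;                      /* swallow the LF of a CR LF pair */
            } else {
                text[w++] = text[r];
            }
        }
        return w;
    }

    int main(void)
    {
        char buf[] = "line one\r\nline two\rline three\n";
        size_t n = convert_eol(buf, strlen(buf));
        fwrite(buf, 1, n, stdout);
        return 0;
    }
-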
This does work, but probably not as you would expect. LabVIEW prepends a size header to variable sized binary data such as arrays, which tells it how many elements follow. So if you read back your array you would now have to read back two arrays. The first read returns the original array written to the file and the next read returns the appended array. This is logical and the only way to make this work, since binary means really just that, and not some meta format that would handle such concatenations and other hairy operations transparently. Rolf Kalbermatter
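A little sketch of the resulting file layout and how an external reader would have to walk it, assuming DBL elements and the default big-endian byte order (the file name is just an example):

    /* Each array written with "prepend array or string size" enabled is an
       int32 element count followed by the elements; appending a second array
       simply appends another (count, elements) block. */
    #include <stdio.h>
    #include <stdint.h>

    static uint32_t be32(uint32_t v)   /* big-endian file -> little-endian host */
    {
        return (v >> 24) | ((v >> 8) & 0xFF00) | ((v & 0xFF00) << 8) | (v << 24);
    }

    int main(void)
    {
        FILE *f = fopen("data.bin", "rb");
        uint32_t count;
        int block = 0;
        if (!f) return 1;
        /* every append produced its own length-prefixed block, so keep reading */
        while (fread(&count, sizeof count, 1, f) == 1) {
            count = be32(count);
            printf("block %d: %u elements\n", ++block, count);
            fseek(f, (long)(count * sizeof(double)), SEEK_CUR);  /* skip DBL data */
        }
        fclose(f);
        return 0;
    }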
-
Import shared library(dll) strike 2
Rolf Kalbermatter replied to Jacemdom's topic in Calling External Code
#defines are preprocessor keywords and not part of the actual C compiler. Having a fully compliant preprocessor and a complete C compiler together in one single tool is a major undertaking, getting you almost to the point of writing a complete C compiler (well, at least the complete parser for it). This is highly recursive and hard to implement, so I guess the developers of the import shared library tool made a few decisions to restrict the parsing complexity. The OF() macro used here is from the zlib header file and is there to work around prototype issues with some very old, pre-ANSI C compilers. There are not many compilers that need that anymore, but the developers of zlib want to be compatible with as many compilers as possible. Rolf Kalbermatter
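A standalone demonstration of the idiom, roughly what zlib's zconf.h does (zlib tests its own STDC define rather than __STDC__ directly); this is exactly the kind of construct a simple header parser trips over unless it runs a real preprocessor first:

    #include <stdio.h>

    #ifdef __STDC__
    #  define OF(args)  args        /* ANSI compiler: keep the full prototype */
    #else
    #  define OF(args)  ()          /* K&R compiler: no parameter information */
    #endif

    int add OF((int a, int b));     /* what the import wizard actually sees */

    int add(int a, int b)
    {
        return a + b;
    }

    int main(void)
    {
        printf("%d\n", add(2, 3));
        return 0;
    }
-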
Header file for LabVIEW shared library
Rolf Kalbermatter replied to klessm1's topic in Development Environment (IDE)
No, all the other functions are private and undocumented, unless you happen to be a member of the LabVIEW developer team. LabVIEW 3.0 shipped with a much larger extcode.h file where you could see the prototypes of all the functions exported at that time. BUT: still no documentation about the functions and, more importantly, the parameters. I think LabVIEW exported maybe 500 functions or so back then; some of them are gone completely, others have changed considerably in prototype and/or behaviour. Unless you have a year or so of leisure time to dig into this, I would recommend forgetting about these functions. Rolf Kalbermatter -
Network bottleneck
Rolf Kalbermatter replied to aart-jan's topic in Remote Control, Monitoring and the Internet
At least if ping hasn't been disabled for that computer. Nowadays, with security concerns all over the place, that is not an uncommon thing to happen. Rolf Kalbermatter -
Open Standard for Graphical Programming Language?
Rolf Kalbermatter replied to LAVA 1.0 Content's topic in LAVA Lounge
I don't think compiler building is the way to make money in the future. I'm focusing here on C/C++ since that is what I know best and it is also by far the most widely used development system for developing applications: The biggest compiler system (GNU) nowadays is open source and not something to make money with by itself. Metrowerks didn't fail because it was bought by Motorola; I rather think they might have failed earlier without that. They got quite some business out of supporting just about every Motorola CPU, and their main business, supporting Macintosh development, would probably have shrunk quite a lot by now anyway, as Apple decided to use GNU C instead of anything else for OS X, and considering they used an Open Source kernel that would seem like a smart move to me. Borland has a track record of making bad decisions at the wrong moment about great products. Symantec has, aside from the starting years, never been a company doing products for the user but was only about making money. Watcom was quite successful but sat in a specialized business niche, enabling technologies that got easier to do each year on just plain MS Windows, and most of what Watcom was is now Open Source too. That really only leaves MS Visual C and Intel C as currently still active commercial compilers: MS Visual C because it is from the makers of MS Windows, and Intel C because they know how to tweak the last nanosecond out of their CPUs. Basically I think GNU C has over the years made the idea of selling C compilers rather unattractive. I do not see how it could be in the financial interest of NI to promote a GNU graphical dataflow language (note: G is already taken as a programming language name, and not by LabVIEW, so I will refrain from using it here). Dataflow IS LabVIEW. Taking it out creates something that is anything but LabVIEW, so I think this settles this. The biggest power of LabVIEW is dataflow, but that creates limitations at the same time, such as the difficulty or near impossibility of creating a really object oriented system with inheritance and all that, or generic references to speed up access to large data. On the other hand, dataflow does allow seamless parallelisation of code execution for users that do not understand anything about multi-threading. This same advantage makes implementation of references in a thread safe manner almost impossible without thwarting the performance improvements these references are supposed to bring. Exactly, and in doing so it would become mostly useless for what we are doing with it. I think there is not much of a problem in trying to get your own graphical programming environment started and putting it out as an Open Source project ;-) if you abstain from using dataflow and a few specific graphical constructs. I know NI has patented many things, but some of them are so general that there is certainly proof of prior art, or they are so logical to do that alternatives are not even imaginable. A system to configure hardware resources in a certain graphical representation borrowed from MS Explorer is not something I would consider patentable ;-) Others could be interpreted to patent the idea of an API. Rolf Kalbermatter -
Hide Front Panel of a Startup VI
Rolf Kalbermatter replied to Dave Graybeal's topic in User Interface
Because it is asynchronous. Executing the property node only posts an event to the UI thread to refresh that VI front panel the next time it gets around to being scheduled by the OS. The diagram then continues to execute and changes the other properties too. At some point the OS decides that the current thread has had enough of its time slice, interrupts it and eventually passes control to the UI thread, which then proceeds to do whatever it needs to do according to the current properties for that window. This can mean that you still might see flicker on a very slow machine. Nothing to do about this. Rolf Kalbermatter -
Hi Didier Congratulations too, from a fellow countryman living in the "low lands" of Europe. It will be an interesting addition to your family and will make sure you and your wife keep busy Rolf Kalbermatter
-
I haven't checked into this issue specifically, but as yen says it's most probably related to how LabVIEW accounts for daylight saving time correction of timestamps. Traditionally LabVIEW has always used the current time's daylight saving time to convert from its internal UTC timestamp to local time and vice versa. In LabVIEW 7.0 they introduced the ability to account for the DST status of the actual timestamp. For certain reasons that have not been confirmed to me by an insider, they decided however to keep the old behaviour for timestamps before January 1, 1970. There are two possible reasons I can think of (and they may actually both be true): 1) The OS support for determining if a certain timestamp is DST or not is not available (on all LabVIEW platforms?) for timestamps prior to that date. In order to keep LabVIEW consistent across platforms this behaviour was chosen. 2) For backwards compatibility! Some toolkits (the Internet Toolkit for instance) use the conversion of 24 * 3600 seconds into a date to determine the current timezone offset including DST status, even though that timestamp is of course January 2, 1904 and therefore never could have DST. Rolf Kalbermatter
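A small sketch of the epoch arithmetic involved, using the commonly quoted offset of 2082844800 seconds between the LabVIEW epoch (1904-01-01 00:00 UTC) and the Unix epoch (1970-01-01 00:00 UTC); timestamps below that offset are exactly the pre-1970 dates discussed above:

    #include <stdio.h>
    #include <time.h>

    #define LV_UNIX_EPOCH_DIFF 2082844800UL   /* seconds from 1904 to 1970 */

    int main(void)
    {
        double lvTime = 24.0 * 3600.0;        /* the "January 2, 1904" trick */
        if (lvTime < LV_UNIX_EPOCH_DIFF) {
            /* pre-1970: per the behaviour described above, the DST state of
               the *current* date gets applied to the conversion */
            printf("pre-1970 LabVIEW timestamp: %.0f s after the 1904 epoch\n",
                   lvTime);
        } else {
            time_t unixTime = (time_t)(lvTime - LV_UNIX_EPOCH_DIFF);
            struct tm *local = localtime(&unixTime);
            printf("local time, tm_isdst = %d\n", local->tm_isdst);
        }
        return 0;
    }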
-
I admit it is a sentiment, but it also has practical reasons. Copy protection has gotten in my way a few times when using legit software. If the copy protection decides to have its bad day when you are close to a deadline, you really are close to destroying company property by seeing how long it takes to fly from the second floor until it hits the pavement Rolf Kalbermatter
-
Share your favorite tips and shortcuts
Rolf Kalbermatter replied to m3nth's topic in Development Environment (IDE)
Damn, I have completely missed this label! Makes my OS style user interfaces finally look perfect. Rolf Kalbermatter -
Also, they need to use the cdecl calling convention on both/all platforms. This is the only one that is supported on all platforms LabVIEW is available on. Sadly, most 3rd party DLLs on Windows use Microsoft's stdcall, and then you will have to maintain two VI libraries anyhow. If I write DLLs I always make sure they use cdecl. This has worked great for the OpenG LabVIEW ZIP Tools library, which has one VI library and three shared libraries for the three OSes: Windows, Linux and Mac OS X. Rolf Kalbermatter
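A sketch of how such a shared library source can pin this down, with illustrative macro names; on Windows the calling convention is spelled out explicitly so project settings can't silently switch it to stdcall, while on Linux and Mac OS X cdecl is the default anyway:

    #ifdef _WIN32
    #  define MYLIB_API __declspec(dllexport)   /* export from the DLL */
    #  define MYLIB_CC  __cdecl                 /* force cdecl, not stdcall */
    #else
    #  define MYLIB_API                         /* gcc exports by default */
    #  define MYLIB_CC                          /* cdecl is the default ABI */
    #endif

    /* the prototype the Call Library Function node is configured against */
    MYLIB_API int MYLIB_CC MyLib_GetVersion(void);

    MYLIB_API int MYLIB_CC MyLib_GetVersion(void)
    {
        return 0x0100;                          /* version 1.0, as an example */
    }

With this one header/prototype the same VI library can call the Windows DLL, the Linux .so and the Mac OS X framework without per-platform changes to the Call Library node configuration.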
-
You are not very specific. With Intel based, do you mean some Pentium CPU board or an Intel ARM CPU? In the first case you would just install normal LabVIEW. I have looked into the Embedded Development Module but haven't found time to really do anything with it yet. It definitely is not just one day of work to support a new target. Even if you can use an existing target and adapt it to your new one (that depends of course both on the CPU and on the actual OS you will use), you can expect to spend at least a week or more before you get your first VIs properly downloaded and running. For an entirely new target I would guess this can easily take months, depending on your familiarity with the actual tool chain you need to use for that target. Rolf Kalbermatter
-
Hmm, I haven't seen free software that uses copy protection. :-) Not sure that means anything though! If I have the choice between software that uses copy protection and software that doesn't, I will almost always choose the one without, independent of whether it is free or not. The only thing I hate more than copy protected software is software that has spyware or other similar things in it. Rolf Kalbermatter
-
So you want to send a keystroke to another application by placing it in the keyboard queue, and honestly believe that you have any chance to switch to the other application after having started the VI? Because keystrokes are only sent to the active application! I would recommend trying this on a mechanical computer, but on any modern computer your VI has long terminated before you can even think about switching to another application, let's forget about moving your hand to the mouse and activating that other application. Rolf Kalbermatter
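If you really need this, here is a Win32 sketch of what has to happen programmatically instead (the Notepad window class is just an example target); the point is that the target must be made the foreground application before the keystroke is queued:

    #include <windows.h>

    int main(void)
    {
        INPUT key[2];
        HWND hwnd = FindWindowA("Notepad", NULL);
        if (hwnd == NULL)
            return 1;                         /* target application not running */
        SetForegroundWindow(hwnd);            /* make it the active application */
        Sleep(100);                           /* give it time to take the focus */

        ZeroMemory(key, sizeof key);
        key[0].type = INPUT_KEYBOARD;         /* key down ... */
        key[0].ki.wVk = 'A';
        key[1].type = INPUT_KEYBOARD;         /* ... and key up */
        key[1].ki.wVk = 'A';
        key[1].ki.dwFlags = KEYEVENTF_KEYUP;
        SendInput(2, key, sizeof(INPUT));
        return 0;
    }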
-
Apart from some more or less well written VIs to access specific USB and similar keys, I'm not aware of much in this direction. The problem as I see it is: how much do you want to spend to protect your software? Usually LabVIEW is not used for high volume applications (its runtime licensing, while quite liberal nowadays, does not always lend itself well to this) and therefore a license protection that costs days and weeks of development time is not likely to recover its own costs. And license management is a trade of its own, with most of what is on the market nowadays being more of a solution to fend off the casual copier than a way to really thwart the determined hacker. And let's be honest, the only foolproof copy protection is to lock up your software in a safe, destroy any backup copies and throw away the key to the safe So what is it you want to prevent and how much is it worth to you? If it is about not allowing your software to be run by people that would never buy it anyhow, then honestly every single dollar spent on copy protection is simply lost money. If it is about the fun of having copy protection built into your application, it's the same. Only if you can make a valid case that software will be bought thanks to copy protection can you start to look into spending money on this, if you want to think commercially. As for me, I'm much more likely to buy a LabVIEW add-on toolkit with an honest price that uses no copy protection than to use an overpriced toolkit illegally at all. And copy protected software always gets 10 minus points in my opinion if I have to evaluate such software. I may still buy it, but then it needs to be a LOT better than its competition. Rolf Kalbermatter
-
Multicore processor and LV
Rolf Kalbermatter replied to bsvingen's topic in Application Design & Architecture
Data acquisition is not just hardware. The data needs to be transferred by a kernel driver into some intermediate buffer and from there into the application buffer. At least the last operation will be executed in the context of an application thread and can contain scaling (DAQmx channels) and other things too. So there is certainly something a core could be doing, even though it is for a large part IO/memory related and therefore not the best candidate to be parallelized across multiple cores or CPUs, unless you had separate memory buses too. LabVIEW will not control the cores directly but instead uses OS threads. How the OS distributes threads onto multiple cores is almost completely out of LabVIEW's control and can actually be tweaked by a power user. So while multiple independent loops will allow LabVIEW to distribute each loop to a different thread, it is usually very counterproductive to start distributing related code to multiple threads, especially if data flow dictates a certain execution order. Instead of just continuing execution of a logical data flow, LabVIEW has to suspend a thread and wait for the correct thread to be activated by the OS to continue the logical data flow. In addition to the costly execution context switch you are forcing onto a logical data flow, you incur additional delays since the OS might decide to activate a different thread first than the one that would suit the data flow best. However, if you have a single loop with subVIs you could assign different execution systems to these subVIs and LabVIEW would be forced that way to use a different thread. But please note that this will only really have any positive effect if you happen to have two different subVIs in the same loop that both do a computationally expensive operation. Without subVIs, parallelism in LabVIEW is limited. All built-in nodes that do not have some kind of wait or timeout function are executed synchronously and in the same thread as the diagram they are contained in, since their typical execution time is quite small in comparison to the context switch overhead of letting the OS switch between multiple threads. Most LabVIEW applications I write have anything from 2 to 6 or more parallel loops in the main diagram, although sometimes some or all of the loops are located in specific subVIs that are called by the main VI once and all terminate automatically when the application wants to shut down. This has never given me bad surprises (provided you do a proper design before you start with the actual coding) but results in applications that do DAQ, analysis, logging, test sequence execution, image acquisition and instrument control all in parallel and still have an utterly responsive user interface. Rolf Kalbermatter -
Insane Errors - how to get rid of
Rolf Kalbermatter replied to Lori's topic in Development Environment (IDE)
But it is already part of the save routine! That is why you get those error dialogs when saving VIs. Sometimes recompiling the VI (ctrl+shift clicking the run button) helps, but often deleting the offending object is the only course. I haven't seen it often since about LabVIEW 6.1, but I can't really comment too much about LabVIEW 8.0.x. It's possible that LabVIEW 8 introduced new ways to create insane objects. Rolf Kalbermatter -
All these things actually point in the same direction: somewhere in your COMPLEX UI VI you have some code that resets these properties. It might be a property node that you added at some point, then resized the event structure or some case so that it fell off into the invisible area. Or some execution logic in your state machine that causes this at initialisation. I've been there and done that too, and almost pulled my hair out over it before I realized my own fault. For invisible elements, try opening the Error list window under Window->Show Error List and check the "show warnings" checkbox. Then go to the VI in question and hit the warning triangle beside the run button. If it's about the execution logic of your VI, you probably will need to single step, or at least work through your initialisation logic with breakpoints, until you see where exactly the property gets reset. Rolf Kalbermatter
-
Fract/Exp String to Number precision woos
Rolf Kalbermatter replied to Vyresince's topic in LabVIEW General
As a reference (it's actually mentioned somewhere in the online manuals too), a single precision number can only be accurate to about 7 significant digits while a double precision number can be accurate to about 15 significant digits. Rolf Kalbermatter
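A tiny C demonstration of those limits; the same 17-digit decimal literal survives in a double but gets rounded away in a float:

    #include <stdio.h>
    #include <float.h>

    int main(void)
    {
        float  f = 0.12345678901234567f;
        double d = 0.12345678901234567;
        /* FLT_DIG and DBL_DIG are the guaranteed decimal digit counts */
        printf("float  (%d digits): %.17f\n", FLT_DIG, f);
        printf("double (%d digits): %.17f\n", DBL_DIG, d);
        return 0;
    }
-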
How to get the mouse's status ?
Rolf Kalbermatter replied to Cool-LV's topic in Calling External Code
Then read the entire MSDN article pointed to in the previous post! It mentions that GetCursor is only for the application's own cursor, while there is a GetCursorInfo or similar function that will give you the global cursor (at least if you are not using an archeological Windows version). Rolf Kalbermatter
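A minimal Win32 sketch of the global cursor query, assuming Windows 98/2000 or later where GetCursorInfo is available:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        CURSORINFO ci;
        ci.cbSize = sizeof(CURSORINFO);       /* must be set before the call */
        if (GetCursorInfo(&ci)) {
            printf("cursor %s, position (%ld, %ld)\n",
                   (ci.flags & CURSOR_SHOWING) ? "visible" : "hidden",
                   ci.ptScreenPos.x, ci.ptScreenPos.y);
        }
        return 0;
    }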