Everything posted by Rolf Kalbermatter
-
Open Source alternative to IMAQ Vision Toolkit
Rolf Kalbermatter replied to Chris Davis's topic in Machine Vision and Imaging
Robolab is AFAIK commercial software actually meant to be used with Lego Mindstorms. The fact that you can download it from some servers does not mean that it is free and Open Source. And as far as the OpenVisionToolkit is concerned, why not step up yourself and show everyone that you want to do some real work ;-) Rolf Kalbermatter -
send email and sms...help
Rolf Kalbermatter replied to eerc's topic in Remote Control, Monitoring and the Internet
Does your server require authentication (a password) to connect too? That would be the most obvious reason: the server initially accepts the connection, but when it sees the client trying to send mail without authenticating first, it simply drops the connection. The SMTP VIs do not support authentication of any kind, so they will not work with email servers that require it. And since you do seem to get a connection initially, I'm pretty sure you already have the server name right. Rolf Kalbermatter
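For reference, this is roughly what an SMTP AUTH LOGIN exchange looks like on the wire, the step the shipped SMTP VIs never perform (host names and credentials are placeholders; the two 334 strings are base64 for "Username:" and "Password:"):

C: EHLO client.example.com
S: 250-mail.example.com Hello
S: 250 AUTH LOGIN PLAIN
C: AUTH LOGIN
S: 334 VXNlcm5hbWU6
C: dXNlcg==                (base64 of the account name)
S: 334 UGFzc3dvcmQ6
C: c2VjcmV0                (base64 of the password)
S: 235 Authentication successful
-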
Error message: Entry Point Not Found
Rolf Kalbermatter replied to aledain's topic in Development Environment (IDE)
LabVIEW 7.1 should not need a patch to do inport and outport on NT-based machines of any color (NT4, 2000, XP, 2003). But you do of course need the Port I/O support installed on that machine. In the Application Builder, in the Installer Settings->Advanced dialog, make sure you also select the Port I/O Support under Additional components to install, and rerun the resulting installer on your target machine. Rolf Kalbermatter -
multiple CIN call in a single VI
Rolf Kalbermatter replied to AngelHunter's topic in Calling External Code
If you get this kind of error when calling DLLs, then you are doing something wrong in the Call Library Node configuration. Show us the prototype(s) of your function(s) and the corresponding VIs and we might be able to help you point out possible problems. Rolf Kalbermatter -
BOOL in Win32 API nomenclature is really a 32 bit integer, while a LabVIEW boolean is an 8 bit integer. In this case it might be a non-issue, but if you pass it as a function parameter this can be rather important. LabVIEW, assuming an 8 bit integer, might decide to only initialize the least significant byte of the 32 bit value pushed on the stack, leaving the other 24 bits at random. If the DLL expects a Windows BOOL instead and does the standard boolean expression evaluation of only testing for non-zero, you will likely never see the DLL receive FALSE no matter what you do in LabVIEW (see the sketch below). Whether you use a signed or unsigned integer for a Boolean does not matter at all: since a Boolean by definition is either 0 = FALSE or non-0 = TRUE, the signed evaluation makes no difference. Rolf Kalbermatter

And how does the prototype of the function you are trying to call look? Rolf Kalbermatter
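To make the BOOL point above concrete, here is a minimal sketch of a hypothetical DLL export (the function name and its behaviour are invented purely for illustration). The safe way to call something like this from LabVIEW is to configure the Call Library Node parameter as a 32 bit integer and feed it an explicit 0/1 value, e.g. via Boolean To (0,1) plus a numeric conversion, so that all 32 bits are defined:

/* Hedged sketch: a hypothetical DLL function taking a Win32 BOOL. */
#include <windows.h>

__declspec(dllexport) BOOL __stdcall SetDeviceEnabled(BOOL enable)
{
    /* Standard C evaluation: any non-zero value counts as TRUE. If the caller
       only initialized the lowest byte of this 32 bit parameter, the upper
       24 bits may contain garbage and this test can report TRUE even though
       the caller meant FALSE. */
    if (enable)
        return TRUE;    /* device switched on */
    return FALSE;       /* device switched off */
}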
-
LabVIEW does not support this because there is no possibility to do something like this from the Win32 API or any of the other OS user application APIs LabVIEW runs on, AFAIK. You would have to go directly to the filesystem driver API, and that is messy and prone to errors, not to mention that many filesystems couldn't possibly support such a mode at all. Rolf Kalbermatter
-
Well, by predicting the future you basically changed the timeline that could lead to such an event and prevented it from happening. That is what you get if you try to be smarter than what nowadays physics is willing to allow us mere mortals to be. But in principle you are right. It's not worth doing because LabVIEW is a compiler. So the only reason to do that would be performance, and in order to really make a difference that would require a highly optimized C compiler, and that is a business only a few crazy people at Intel and some more in the Gnu CC project are willing to tackle.

Just as trivia here: if you have the Application Builder you already have a C compiler on your system. It is the LabWindows CVI compiler engine packed into an ActiveX Automation Server. This is used to compile the C stub wrappers and link the final DLL executable image for DLL targets. The C stub wrappers are the actual functions exported from the DLL; they take the stack parameters and translate them into LabVIEW datatypes, then invoke the actual VI through the LabVIEW runtime engine and reverse the data translation for any return values before returning to the caller. But LabWindows CVI never has been, and most probably never will be, a highly optimized C compiler. And it doesn't have to be, because its strength is in the Test & Measurement integration, and shaving off a few nanoseconds of runtime performance is almost never critical to any application that might ever get created in LabWindows CVI. And if you ever happen to come across a function library that needs this high a level of optimization, then you will have to get the Intel C compiler and create an object library that you can link into your project, not really that high an investment actually. Rolf Kalbermatter

Basically almost every LabVIEW application I have written so far spends probably about 99% of its CPU time waiting for user input events, and in doing so causes a real CPU load of only a few % even on lower end machines. A project that I have been working on and off for the last 7 years is about controlling a 5 axis motion system over a DLL, some 20 or so serial devices, communication with 3 more devices over TCP/IP, doing image acquisition for position recognition and barcode decoding, writing data files to disk, and when running at full speed controlling all these devices more or less in parallel (each device is controlled by its own daemon consisting of a dynamically launched VI, communicating its device states through dynamically instantiated LV2 style globals to the main system). CPU load never peaks higher than about 30%, even on a 7 year old Pentium machine running Windows NT4. And there are similar other applications. I have really seldom seen any of these applications peak even once close to 100% CPU, other than when switching from a different application to that application on older low end machines, when all the user interfaces need to be completely redrawn. Rolf Kalbermatter
-
jpg or bmp from LabVIEW to Excel
Rolf Kalbermatter replied to LAVA 1.0 Content's topic in Calling External Code
This seems like overkill. What you are trying to do, while not impossible, is not how the Excel ActiveX interface was meant to be used. Rather than trying to figure out how this would have to be done and then, probably the more complicated part, trying to explain it, I would rather suggest sending the data to Excel instead and then starting a macro in the Excel spreadsheet that takes that data and creates an Excel graph from it. While this may sound work intensive, I'm sure trying to get a LabVIEW graph properly into Excel as a bitmap will be at least as hard, and that would assume you know how to build a device independent bitmap in memory and transfer its handle across process boundaries. Rolf Kalbermatter -
Getting the name of the active window
Rolf Kalbermatter replied to Mark Balla's topic in Application Design & Architecture
Go to msdn.microsoft.com and search for the function you want to know about. Rolf Kalbermatter -
As Michael said, you can't really get a higher resolution than 1 ms under Windows or any other desktop OS. Under Windows and other desktop OSes even specifying 1 ms will seldom get you exactly 1 ms of delay; it can vary by several ms, and at least under Windows with a network card installed and one or more other applications running it can sometimes exceed 100 ms. You might be able to get 1 ms resolution quite reliably with a timed loop, but for anything better you will without any doubt have to go real time. Rolf Kalbermatter
-
Are you sure the coef array never contains more than 21 elements? You index it before you error out if it should contain more! You can't just set the dimSize of an array without making sure it is big enough to hold that many elements. Before setting the size you should at least use the LabVIEW memory manager function NumericArrayResize or something like that. Rolf Kalbermatter
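As a rough sketch of that pattern (the type and variable names here are invented; check extcode.h in your LabVIEW cintools directory for the exact prototype in your version), the resize has to happen before dimSize is written:

#include "extcode.h"

/* Hypothetical 1D float64 array handle, laid out the way LabVIEW passes
   numeric arrays to a CIN or Call Library function. */
typedef struct {
    int32   dimSize;
    float64 elt[1];
} Float64Array, **Float64ArrayHdl;

/* Resize the coef array to newCount elements, then update dimSize. */
static MgErr ResizeCoefArray(Float64ArrayHdl coefs, int32 newCount)
{
    MgErr err = NumericArrayResize(fD, 1, (UHandle *)&coefs, newCount);
    if (!err)
        (*coefs)->dimSize = newCount;
    return err;
}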
-
Maybe if you (re)install NI-DAQ they might suddenly be there? Rolf Kalbermatter
-
I'm not sure how C#, GDI+ and .Net apply to PDA development. I have a feeling that at least some of these parts are not available at all on a PDA target. Rolf Kalbermatter
-
You forget one thing here! People who answer your question here are NOT paid to do so, and they take the time from something else they may need to pay just as much attention to. By making it unnecessarily work intensive for us to give you a good answer, you deprive yourself of many possibly good answers. If you think that this is alright, then don't scream when your request hasn't been answered at all after 24 hours! By flagging it as urgent you most probably annoy most people (it definitely annoyed me) and just get even further from your goal of getting a helpful answer. And that talk about ardour man and such: well, I think you have no right to talk in such a way to someone like Chris. He has proven to give very useful answers to many posts here on LAVA, on other LabVIEW related channels, and God knows where else, and I very much sympathize with his statements about your post and your defensive answers. If you don't want to understand this, that is your cookie, but don't ever ask me to eat it. Rolf Kalbermatter
-
LV8 uses Tag Engine from LV7.1
Rolf Kalbermatter replied to Dietmar's topic in Application Design & Architecture
Unless you happen to have linked explicitly to a LabVIEW 7.1 VI to load the SCF file or even simply to start the tag engine. I could imagine this happening if your development project is on a different volume than your LabVIEW installation and, with some carelessness when opening the project in LabVIEW 8, you end up in that situation. Rolf Kalbermatter -
And prompted by http://forums.ni.com/ni/board/message?boar...ssage.id=184494 I dug up one of my VI libraries, IPtools.llb, which happens to also give you some of this information, and improved it a bit. Incidentally, it does not use any .Net and therefore will run even under Win98 and NT4, at least if you use some older LabVIEW version, of course. The lib itself is in LabVIEW 6.1, so almost anybody should be able to use it. Rolf Kalbermatter Download File:post-349-1147472635.llb
-
There is a good chance that the dual boot would work, and some virtualization solutions (but definitely not all of them) might work too. However, don't expect anything other than TCP/IP (and maybe serial, but that is doubtful) to work for IO. NI-DAQ (and probably any other NI-XXXX hardware driver) is simply out of the question. It plugs deep into the Windows kernel to access the hardware, and that part simply can't be virtualized properly enough to allow this to work. Rolf Kalbermatter
-
In that case you should have taken the time to really explain what you are trying to do. If it is so important and your boss is chasing you, it is even more important to invest some time on your behalf. Your original request could have meant anything and nothing, and while I would probably have known an answer to what you were looking for, I was simply flabbergasted at what it might be that you would want to achieve, and having only limited time on my hands I decided to move on rather than try to second-guess your intended operation and write a lengthy response covering the roughly 5 different possible meanings of your request I came up with while reading your post. Usually it is much more productive to invest some of your own time upfront if you want other people to spend some of their time on you, rather than reposting and having to explain afterwards in numerous reposts what you really wanted to achieve. And with a good explanation of what you wanted to do, you could have gotten a useful response here, as well as on NI Developer Exchange, probably in a lot less than 24 hours! Rolf Kalbermatter
-
This is something that is simply not possible in general. As long as the device is on the same subnet you can use ARP to actually get at the MAC address, but MAC address resolution is not done across subnet borders. Some devices may support a specific TCP/IP protocol to get that information from them, but as far as I know the TCP/IP network is not designed to give you this information outside of your own subnet. Any router in the network able to pass the packet towards the destination address, or at least towards another router that should be able to reach the destination directly or indirectly, will simply answer the ARP request with its own MAC address and, when the packet arrives, route it further according to its own route tables, replacing its own MAC address with the next one in the chain. Rolf Kalbermatter
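For the same-subnet case, a minimal Win32 sketch of such an ARP lookup could look like the following (the target address is a placeholder; SendARP lives in iphlpapi.dll, so it could also be called from LabVIEW through a Call Library Node):

#include <winsock2.h>
#include <iphlpapi.h>
#include <stdio.h>
/* link with iphlpapi.lib and ws2_32.lib */

int main(void)
{
    IPAddr dest = inet_addr("192.168.1.20");   /* placeholder address on the local subnet */
    ULONG  mac[2];                             /* buffer large enough for a 6 byte MAC    */
    ULONG  macLen = sizeof(mac);

    if (SendARP(dest, 0, mac, &macLen) == NO_ERROR && macLen == 6) {
        unsigned char *b = (unsigned char *)mac;
        printf("%02X-%02X-%02X-%02X-%02X-%02X\n", b[0], b[1], b[2], b[3], b[4], b[5]);
    } else {
        printf("ARP lookup failed (host not on the local subnet?)\n");
    }
    return 0;
}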
-
This sort of device is usually CNC controlled. While they nowadays have and use Ethernet ports to communicate with each other and with the master control software, the protocols used are seldom public and most often proprietary, without any documentation whatsoever. I have no experience with your machines, but I used to communicate with Heidenhain CNC systems in the past for one project. Their protocol is called LSV2, and on the Heidenhain site I did find a document buried deep in the document hierarchy that usually does not show up in their normal search. I managed to develop a driver to get the file hierarchy and information on a machine over that protocol, but even that document itself wasn't sufficient. It was not until I used Ethereal to sniff the actual messages sent between the official Heidenhain tool and those machines that I was able to write VIs that communicated reliably. Without any document explaining at least the basic format of the protocol, trying to develop such a VI driver is however almost impossible. So your first action should be to contact the manufacturer or distributor of those devices and ask whether protocol documentation exists, and if that fails, try to find out whether some protocol documentation is available somewhere on the internet, in the form of a document, some example code in C or another programming language, or possibly some more or less obscure scripting environment. In the past I have had to learn a bit of Perl to actually get at protocol documentation of some sort. Rolf Kalbermatter
-
LabVIEW RT calling external code
Rolf Kalbermatter replied to Eric Griwicki's topic in Calling External Code
wsock32.dll is the 32 bit implementation of Winsock, and that is the major provider for all network related protocols other than NetBIOS. As such, last time I checked, it is just a normal DLL without any ActiveX involvement. In fact Winsock used to be a Microsoft port of the Berkeley network socket library, but according to Microsoft sources they have in the meantime developed their own code, since they had an issue or two about not mentioning that they had used the Berkeley library in the first place, which is just about the only requirement you have to follow if you use that source code in your own product. The Pharlap OS should come with its own implementation of wsock32.dll, since this is THE interface to use under the 32 bit Windows environment for any TCP/IP or related network access, and LabVIEW will link to that library too for its TCP/IP and UDP functionality. Rolf Kalbermatter -
Not sure what the problem would be. I could write a small demo VI that does the equivalent of rm -r / (obviously for Unix ;-) and post it, and anyone not smart enough to check the VI diagram before executing it would be deleting his entire harddrive. So what? Of course this would be different if I hid the diagram code behind a password, but then it may be a good idea to know from whom you get the code before even attempting to run it! Rolf Kalbermatter
-
Since putty is a command line tool, you will want to control it through its standard IO. This is best done by redirecting its standard IO to pipes and communicating through them. LabVIEW for Unix platforms comes with a pipe library that allows you to do this. I have attempted to develop a similar solution for Windows and made it available on OpenG; it is still in beta stage and probably has some issues, but it works for simpler use cases (see the sketch below for the kind of plumbing involved). It is not yet released as an OpenG package, so you will have to get the DLL and VIs from the sourceforge CVS server or its web interface at http://cvs.sourceforge.net/viewcvs.py/open...it/pipe/source/ Rolf Kalbermatter
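Purely as an illustration of what "redirecting standard IO to pipes" means on Windows (the command line is a placeholder, and the OpenG pipe DLL wraps roughly this kind of Win32 plumbing, though not necessarily in this exact form):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Inheritable pipe: the child process writes to outWrite, we read from outRead. */
    SECURITY_ATTRIBUTES sa = { sizeof(sa), NULL, TRUE };
    HANDLE outRead = NULL, outWrite = NULL;
    STARTUPINFOA si;
    PROCESS_INFORMATION pi;
    char cmd[] = "plink.exe -V";          /* placeholder command line */
    char buf[256];
    DWORD n;

    CreatePipe(&outRead, &outWrite, &sa, 0);
    SetHandleInformation(outRead, HANDLE_FLAG_INHERIT, 0);  /* keep the read end private */

    ZeroMemory(&si, sizeof(si));
    ZeroMemory(&pi, sizeof(pi));
    si.cb = sizeof(si);
    si.dwFlags = STARTF_USESTDHANDLES;
    si.hStdInput  = GetStdHandle(STD_INPUT_HANDLE);
    si.hStdOutput = outWrite;
    si.hStdError  = outWrite;

    if (CreateProcessA(NULL, cmd, NULL, NULL, TRUE, 0, NULL, NULL, &si, &pi)) {
        CloseHandle(outWrite);            /* only the child holds the write end now */
        while (ReadFile(outRead, buf, sizeof(buf) - 1, &n, NULL) && n > 0) {
            buf[n] = '\0';
            printf("%s", buf);            /* echo whatever the child printed */
        }
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    CloseHandle(outRead);
    return 0;
}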
-
By declaring "const array3* v;" you only declare a const pointer to the struct but do not allocate any storage for that structure at all. So when you try to dereference that pointer, it points into nirvana and crashes. What you want to do is more along the lines of:

typedef struct { double x, y, z; } array3;
const array3 v;
double my_value;
my_value = v.x;

Which of course is still not very useful, since v.x will contain some random data (usually 0, but not all compilers will initialize const data to 0). Rolf Kalbermatter
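A complete little example of the difference, for what it's worth (purely illustrative, with v explicitly initialized so the value is not random):

#include <stdio.h>

typedef struct { double x, y, z; } array3;

int main(void)
{
    const array3 *p = NULL;               /* a pointer only: no array3 storage behind it */
    /* my_value = p->x;                      dereferencing p here would crash             */

    const array3 v = { 1.0, 2.0, 3.0 };   /* actual storage, explicitly initialized      */
    double my_value = v.x;

    printf("%f\n", my_value);             /* prints 1.000000 */
    return 0;
}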
-
The simplest solution is to simply add the R, G, and B parts together and divide by 3. This however likely will not give exactly the result you would expect, since the human eye's sensitivity is different for the different colors. So you would have to weight the different colors accordingly before adding them (see the example below). I think you will end up with weighting factors somewhere around 0.8 .. 1.3 for the different colors, but I don't have the exact information ready here. Looking on the internet for color to grayscale conversion should give you the details easily, and you may find that there are actually different weighting factors depending on your intended application. Rolf Kalbermatter
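For reference, one very common set of weights (the ITU-R BT.601 luma coefficients, not taken from the post above) looks like this in C:

/* Weighted RGB to grayscale conversion using ITU-R BT.601 luma coefficients. */
unsigned char rgb_to_gray(unsigned char r, unsigned char g, unsigned char b)
{
    return (unsigned char)(0.299 * r + 0.587 * g + 0.114 * b + 0.5);  /* +0.5 rounds */
}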