Rolf Kalbermatter

Members · Posts: 3,931 · Days Won: 272

Everything posted by Rolf Kalbermatter

  1. As a former AE (in a long ago life) I can really relate to his feelings ;-) I was considering checking out labviewforum.de once, but your comment makes this sound like an utterly painful experience. Rolf Kalbermatter I think the problem with your project duplicator is more one of many advanced LabVIEW users not really having made the jump to 8.0 yet. I for one still do most of my work in 7.1 and sometimes even 7.0, as there are projects that need simple modifications that do not justify the hassle of upgrading (and verifying every aspect of the application to still work perfectly). I'm just starting to work in 8.0 for real for the first time on a project, and in fact only because the client has a .Net library he developed himself which uses .Net events. Otherwise I would most probably have used 7.1 for this project too. The whole project management in 8.0 is a really big change in working procedure and might be fine if you start out with LabVIEW, but it can be a little intimidating if you have managed to work with LabVIEW since version 2.2.1 without this goodie. Rolf Kalbermatter
  2. The 3D control is ActiveX, but the 3D Picture Control is an extension to the well known Picture Control implementing some 3D object drawing, using the OpenGL extension LabVIEW has been linking to for several versions now to implement its new (and IMO ugly) 3D control style. The extension to the Picture Control has actually been available since 7.0, but NI only released a beta test Toolkit to make use of it with LabVIEW 7.1, and because it is marked as a Toolkit, LabVIEW 7.1 refuses to save it back to 7.0. As to 3D programming in LabVIEW, I'm afraid I won't benefit much from such a feature. My left eye is basically lame, so all those 3D hypes in the past only left me with some curiosity as to what 3D visibility would really look like. Rolf Kalbermatter
  3. Does your server handle multiple incoming connection requests simultaneously? If not, it may not be able to get into the listen queue before the timeout occurs. Or one of the sockets, either on the client or server side, is not disposing of closed IP ports properly, so that the socket library runs out of port numbers after a while. A 4 second interval would result in about 1000 connections per hour, and therefore 1000 used-up port numbers, so this might be a possible explanation. You could try to log the used port number of your client (and server), along the lines of the sketch below, and if it keeps increasing instead of reusing the same few port numbers then you might have a problem of not properly closed TCP/IP ports. Could it be that your TCP Close function gets wired with an error cluster indicating an error, which might prevent the close from completely closing the port? I know Close functions are supposed to close independent of the error status, but I know that some functions in the past didn't always do that. Another issue I had sometimes with similar symptoms was when using DHCP. There I have to say that what I did was actually keep connections open for a longer period of time, and if the IP address of one of the sides changed during that time, the connection just went into nirvana. Writing to it neither produced an error nor did the data arrive at the other side, which also still believed it had a connection open. Closing/reopening connections solved that problem more or less, but changing to static IPs was what made it really reliable. Rolf Kalbermatter
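     A minimal sketch of how such port logging could look on the C side, assuming a standard Winsock/BSD socket that is already connected; the function name log_local_port is purely illustrative and not part of any LabVIEW or OpenG library:

     #include <stdio.h>
     #ifdef _WIN32
     #include <winsock2.h>
     typedef int addrlen_t;
     #else
     #include <sys/socket.h>
     #include <netinet/in.h>
     #include <arpa/inet.h>
     typedef socklen_t addrlen_t;
     #endif

     /* Print the local (ephemeral) port of a connected socket. If this number
        keeps climbing from call to call, ports are not being reused/released. */
     static void log_local_port(int sock)
     {
         struct sockaddr_in addr;
         addrlen_t len = sizeof(addr);

         if (getsockname(sock, (struct sockaddr *)&addr, &len) == 0)
             printf("local port in use: %u\n", (unsigned)ntohs(addr.sin_port));
     }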
  4. LabVIEW (and Windows 95 and higher) being a 32-bit application, you can only use 32-bit compiled DLLs. That means that int is really a 32-bit integer. Configuring the Call Library Node to use int16 instead will create some serious problems. For the scalars you won't see a crash, just some strange and truncated values. For the arrays this will certainly cause serious trouble, including invalid pointer accesses and crashes, since the memory area you are telling LabVIEW to pass in will only be half as large as the DLL thinks it should fill in. Also, I hope you make sure that the two arrays are preallocated in LabVIEW to the necessary size. When calling DLLs, the caller is ALWAYS supposed to allocate the memory for the DLL functions to fill in their result. For your case this would mean that you have to allocate two arrays of 10 * 10 elements to pass in. Another issue is the use of two-dimensional C arrays. This is an area in C where the specification is not always very clear. A C compiler usually will create an array of n * x elements and internally treat it as a one-dimensional array, which happens to be the same way LabVIEW looks at multidimensional arrays (see the sketch below). But it can also sometimes mean that you have an array of pointers, each pointing to a different array of elements (which is not very simple to create in LabVIEW at all). Of course, doing the matrix multiplication entirely in LabVIEW is another solution; done right it is not really slower than doing it in C either. And LabVIEW 8 has additional support for matrix operations. Rolf Kalbermatter
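     As an illustration only (this is not the poster's actual DLL; the function name and the fixed 10 x 10 size are assumptions), such an exported function with proper 32-bit integers and flat, row-major arrays could look like this, with the caller responsible for preallocating the result array:

     #include <stdint.h>

     #define DIM 10

     /* All parameters are flat blocks of DIM*DIM int32 values, laid out row by
        row, which is exactly how LabVIEW stores a 2D array it passes to a CLN.
        The caller (LabVIEW) must preallocate 'result' with DIM*DIM elements. */
     __declspec(dllexport) void MatrixMultiply(const int32_t a[DIM * DIM],
                                               const int32_t b[DIM * DIM],
                                               int32_t result[DIM * DIM])
     {
         for (int row = 0; row < DIM; row++)
         {
             for (int col = 0; col < DIM; col++)
             {
                 int32_t sum = 0;
                 for (int k = 0; k < DIM; k++)
                     sum += a[row * DIM + k] * b[k * DIM + col];
                 result[row * DIM + col] = sum;
             }
         }
     }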
  5. The description you are mentioning seems to be for Visual C 6.0. For Visual C 2003 or similar you will have to replace some of the keywords in $(keyword) with something else that has changed between Visual C versions. Sorry, I can't help you here as I'm still using Visual C 6, and for that matter, when creating a CIN (which by the way is considered legacy technology and therefore not recommended for new designs) I'm using the nmake command on the command line instead. That whole Visual C IDE business for creating CINs seems simply overkill and too much hassle to me. If you are trying to develop something new, I would recommend you abandon the CIN route altogether and go with DLLs/shared libraries instead, linking them into LabVIEW through the Call Library Node (a minimal sketch follows below). CIN support is probably not going away soon on existing LabVIEW platforms, but support for new and upcoming LabVIEW platforms (Win 64-bit, MacOS Intel, etc.) is likely to be non-existent. Rolf Kalbermatter
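     A minimal sketch of that DLL route (the function name and its trivial functionality are just placeholders): export a plain C function and configure a Call Library Node with the matching prototype, here two int32 inputs, an int32 return value and C calling convention:

     #include <stdint.h>

     #ifdef _WIN32
     #define EXPORT __declspec(dllexport)
     #else
     #define EXPORT
     #endif

     /* Build this into a DLL/shared library and call it through a
        Call Library Node instead of wrapping it in a CIN. */
     EXPORT int32_t AddTwoNumbers(int32_t a, int32_t b)
     {
         return a + b;
     }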
  6. But it is the case! The Windows extended styles for a window determine whether the window will have a taskbar button or not (see the sketch below). And I do not see why you couldn't make this test yourself. It is really simple, and asking for a ready-made example sounds a little like a HH (homework hustler). Rolf Kalbermatter
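     A hedged sketch of the Windows API calls involved; the window handle would typically be obtained with FindWindow() on the VI window title, and hiding the window while changing the style is usually needed for the change to take effect:

     #include <windows.h>

     /* Remove the taskbar button of a top-level window by swapping the
        extended styles: WS_EX_APPWINDOW forces a button, WS_EX_TOOLWINDOW
        suppresses it. */
     void HideTaskbarButton(HWND hwnd)
     {
         LONG exStyle = GetWindowLong(hwnd, GWL_EXSTYLE);

         ShowWindow(hwnd, SW_HIDE);
         exStyle &= ~WS_EX_APPWINDOW;
         exStyle |= WS_EX_TOOLWINDOW;
         SetWindowLong(hwnd, GWL_EXSTYLE, exStyle);
         ShowWindow(hwnd, SW_SHOW);
     }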
  7. Just try "LabVIEW" as the library name. It might work in the built application but there is no guarantee. The special LabVIEW library name was introduced probably around 5.0. I'm not sure if LabVIEW was already using it as a placeholder for the current instance then, or if that was only added later on. There is probably quite a lot more than a single Windows API call behind SetDefaultPrinter (for the bare API call itself, see the sketch below). Printing in LabVIEW is somewhat involved, and the original idea was to hide as much as possible of the printer setup and configuration from the user. This makes it now a little difficult to provide full print control. And the reason this name was changed in later versions is probably because it had a name clash on one of the platforms LabVIEW is compiled for. Rolf Kalbermatter
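     For reference, the bare Windows API call itself is simple (the printer name below is just an example; link against Winspool.lib); it is everything around it in LabVIEW's printing path that makes this involved:

     #include <windows.h>
     #include <winspool.h>
     #include <stdio.h>

     int main(void)
     {
         /* Make the named printer the system default. */
         if (!SetDefaultPrinterA("HP LaserJet 4"))
             printf("SetDefaultPrinter failed with error %lu\n", GetLastError());
         return 0;
     }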
  8. As the original author of the OpenG Pipe source code I can only say that the code as it is now is still an alpha work in progress. I haven't found much time to go deeper into it; I just got it to work somehow under Windows, although more testing needs to be done and the Linux and MacOS parts would also be nice to have. What might be the problem with LabVIEW 8 I really have no idea. Pipes under Windows at least are a somewhat fragile technology with lots of possible problems (the basic mechanism is sketched below), and it is very well possible that LabVIEW 8.0 does something with the command line console for its own internal dirty work that might conflict with trying to redirect standard IO to pipes. I currently don't have any time to dig deeper into this, so anyone with some C programming knowledge, feel free to investigate what might be the problem. The code comes with the source for a simple command line executable that can act as a loopback device, so it is definitely a good idea to try to make the DLL work with that example under LabVIEW 8 first. If this works, but not with Perl, then it is likely that Perl does some weird stuff too that somehow clashes with LabVIEW 8. Rolf Kalbermatter
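     For anyone who wants to dig in, this is roughly the Win32 mechanism such pipe code builds on; a condensed sketch only, with loopback.exe standing in for the loopback example shipped with the source:

     #include <windows.h>

     int LaunchWithPipes(void)
     {
         SECURITY_ATTRIBUTES sa = { sizeof(sa), NULL, TRUE };  /* inheritable */
         HANDLE childStdinRd, childStdinWr, childStdoutRd, childStdoutWr;
         STARTUPINFOA si = { sizeof(si) };
         PROCESS_INFORMATION pi;
         char cmd[] = "loopback.exe";

         /* One anonymous pipe for the child's stdin, one for its stdout. */
         if (!CreatePipe(&childStdinRd, &childStdinWr, &sa, 0) ||
             !CreatePipe(&childStdoutRd, &childStdoutWr, &sa, 0))
             return -1;

         /* The ends the parent keeps must not be inherited by the child. */
         SetHandleInformation(childStdinWr, HANDLE_FLAG_INHERIT, 0);
         SetHandleInformation(childStdoutRd, HANDLE_FLAG_INHERIT, 0);

         si.dwFlags = STARTF_USESTDHANDLES;
         si.hStdInput = childStdinRd;
         si.hStdOutput = childStdoutWr;
         si.hStdError = childStdoutWr;

         if (!CreateProcessA(NULL, cmd, NULL, NULL, TRUE,
                             CREATE_NO_WINDOW, NULL, NULL, &si, &pi))
             return -1;

         /* The parent now writes to childStdinWr and reads from childStdoutRd;
            the child's ends should be closed here once no longer needed. */
         return 0;
     }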
  9. Could be. Or if they are VIs from the LabVIEW DSC system or similar toolkits, it could be the system flag for those VIs, which also excludes them from being visible in the VI hierarchy and in general from any search of any kind. Rolf Kalbermatter
  10. Controls are entirely different objects from Nodes. I don't think you have any more distinguishing properties for controls than the Class Name itself. Rolf Kalbermatter
  11. But it's the truth. The big performance problem here is not in thread switching (thread switching, while costly, isn't usually where things go completely bad) but in copying the data. In order for the dataflow paradigm to work without race conditions, LabVIEW needs to make a copy of the data whenever a global or local is read. This is no problem for scalars, but some people like to store x MB arrays in a global, read it every second to append a new subset, and write it back. This is about the worst scenario you can think of for using globals. Thread switching, however, does occur when using the Value property, since properties are executed in the UI context. So here, in addition to the possible performance loss because of the data copy, the two additional context switches will further increase the performance problem. Rolf Kalbermatter
  12. Unfortunately Lithium batteries have the characteristic that they age independently of their usage. While it probably does make a difference if you use your computer often, unlike the well known problem of older battery technologies where not discharging a battery completely would considerably deteriorate its lifetime, Lithium batteries suffer from a general aging process that is mostly independent of their use. This means that such a battery stored for 2 years somewhere won't be much better than one that has been used for 2 years on a daily basis. This is a general rule which has been described in several places. Of course there are differences. Some batteries will be absolutely useless after two years while others seem to be able to handle aging better. My former Dell first had a battery that was completely gone after less than a year (and I got a replacement under warranty for it). The second battery, which was used just as much, did last 2 years and was still fairly good. My current Sony VAIO is about one year old and still going strong with somewhere around 3 hours of operation away from a power outlet. But from what I have read so far, it seems safe to expect that Lithium batteries usually won't last longer than about 3 years even if not used. I believe new technologies have recently been introduced that should make the manufacturing of Lithium batteries cheaper and increase their life cycle too. Rolf Kalbermatter
  13. The picture as you have shown it should actually work. Since the wire goes through the loop AND is not modified in any way on its way, LabVIEW will maintain the memory buffer until the while loop is finished, and it will 99.99999999% for sure not bother to reallocate or otherwise interfere with the buffer. As to your second question: this would be more proper and not rely on a specific wiring and/or LabVIEW-internal mechanisms that "might" change with a future version. If you have a pointer, you can actually configure a Call Library Node to call the LabVIEW manager function MoveBlock to copy data from and into this pointer from a LabVIEW array or string. Basically MoveBlock has the following syntax: void MoveBlock(void *src, void *dst, int32 numBytes); Configure the CLN as follows:
     Library: LabVIEW
     Function: MoveBlock
     Calling convention: C
     Return value: void
     1st parameter: depends (see below)
     2nd parameter: depends (see below)
     3rd parameter: int32, passed by value
     The 1st and 2nd parameters depend on whether you want to copy to or from the pointer. Let's assume you have a pointer that you want to get data out of; then you would for instance configure the first parameter as uInt32 passed by value and wire the pointer you got from the allocation function to it (which of course must be a uInt32 too). The second parameter would be for instance an array of uInt8 passed as a C array pointer. Make sure you preallocate that array to the needed size and then pass that same size as the third parameter (see the sketch below). This of course has the slight drawback of requiring an additional data copy (the MoveBlock operation) but will work irrespective of some not so intuitive wiring and possible pitfalls with future LabVIEW versions. Rolf Kalbermatter
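     To make explicit what that CLN configuration ends up doing, here is the manager function prototype as used above together with its memcpy-like behaviour; this is only an illustration of the semantics, not LabVIEW's actual source:

     #include <string.h>
     #include <stdint.h>

     /* Exported by the LabVIEW runtime ("LabVIEW" library name in the CLN). */
     void MoveBlock(void *src, void *dst, int32_t numBytes);

     /* Copying 'size' bytes from an externally allocated pointer into a
        preallocated LabVIEW uInt8 array is conceptually just a memcpy: */
     void CopyFromPointer(uintptr_t externalPtr, uint8_t *lvArray, int32_t size)
     {
         memcpy(lvArray, (const void *)externalPtr, (size_t)size);
         /* ...which is what MoveBlock((void *)externalPtr, lvArray, size)
            does when called through the Call Library Node. */
     }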
  14. The right way to deal with this is as JP Drolet has explained. Use the Format Into String function with the "%.;" prefix in the format string (e.g. "%.;%f") anywhere you need to send decimal-point strings. This way you do not rely on an ini file setting that can be hard to maintain for distributed applications. Where user interface things are concerned (or file output that is to be interpreted by other applications such as Excel), just use the format without the prefix so that the user gets to see the format he expects. For entry fields you should try to use the native datatype controls as much as possible. A numeric control has the logic to deal with platform-specific user input, and so does a date/time control. Trying to deal with this on your own is an exercise that needs LOTS of time. Rolf Kalbermatter
  15. At least for LabVIEW 7.1 and earlier, where this happens, there is a simple fix. Press the Shift key and click on the Run arrow. This will force a recompile of the VI and cause LabVIEW to reevaluate the status of the VI. I have mostly seen this error with ActiveX, some .Net nodes, and with Call Library Nodes. Sometimes the path in the CLN, for instance, doesn't match, so LabVIEW can't load the DLL, but other VIs using the same DLL have a good path and load it later on anyhow. LabVIEW then sometimes seems to have a problem going back and updating the status of those VIs that had already been loaded with a missing DLL link. Pressing the broken run arrow alone will only show the error dialog, which correctly does not show any errors for the VI, but the broken arrow will only go away after a recompile. Rolf Kalbermatter
  16. I don't think there are any editors for the PICC format outside of NI. And it wouldn't be very useful in most cases. What you can do is copy decorations and subparts from other controls in customize mode and paste them into parts of your control. This usually works quite well, although there is a crash every now and then. The control editor is a great piece of work, introduced way back in the earliest versions of LabVIEW, but with certain limitations and crash potential, and unfortunately it has never been substantially improved since. Rolf Kalbermatter
  17. PICT is an old vector graphics format used on the Mac. LabVIEW supports PICT directly and can import and use such images, and I do believe PICC is a similar but slightly modified format for LabVIEW's native internal vector graphics. Rolf Kalbermatter
  18. NI has an OCR add-on for IMAQ Vision. That usually works quite well. But even with the best OCR software, the biggest problem is often getting an image that has enough information to allow a software program to analyze it. Things like the camera used, and even more importantly the lighting conditions, are often the part where you will spend more time experimenting than on getting the program to analyze the data and do something useful with it. Rolf Kalbermatter
  19. IMAQ Vision comes to my mind here! But to display the image it has to be decompressed, and it will for sure use a lot more memory than the 1 MB it takes on disk. Rolf Kalbermatter
  20. As has already been mentioned in some ways, installed Toolkits (especially those that add a nice icon to the startup splash screen such as DSC, RT/FPGA, or IMAQ) will cause LabVIEW to load additional VIs in the background, which takes up time too. So a clean LabVIEW is quite a bit faster in starting up than one that is loaded with every Toolkit there is. And a dirty Windows system partition certainly makes a huge difference. LabVIEW depends directly or indirectly on a lot of system libraries, and they all have to be loaded and mapped into the process space of LabVIEW before LabVIEW can even start to initialize itself. A clean Windows installation will always start up processes significantly faster than one that has been loaded with hundreds of small little shareware programs, games and other little gadgets, not to mention all those little programs that show up in the system tray in some form or other. A new installation of Windows is in general necessary about once a year, unless you use your computer exclusively for one specific task and are not installing and uninstalling all kinds of software frequently. Rolf Kalbermatter
  21. Why? Just make sure you add those dynamically called VIs as such to your build script or project. Or alternatively, if you do this for plugin VIs that should remain outside of the application, make sure you compute the right path when trying to load them. Of course there is the possible problem of plugin VIs using subVIs that get out of sync with the same-named VIs used in the main application. This can be managed a little more easily by using an "All Project VIs.vi" that contains all the dynamically called VIs as well as the main top level VI. For editing the project you load this All VIs VI, but you do not include it in the application build at all. Rolf Kalbermatter
  22. No, it won't be possible. ActiveX is entirely based on COM/DCOM, and that is one huge subsystem that has not yet been implemented by anyone outside of Microsoft. Wine as a project is trying to do that, but they are still quite a stretch away from a nearly full implementation. The only implementation of COM/DCOM I know of that has existed, at least in the past, outside of MS was a DCOM implementation for Unix systems from Software AG. They had licensed the relevant code from MS, ported it to Unix, and sold it for quite big bucks as a solution for big integrators doing banking applications and such. I'm not sure this is still maintained or sold. .Net is yet another rather huge subsystem that is probably built in parts on technology from ActiveX too. Trying to integrate COM/DCOM and/or .Net into an RT system is IMO a rather useless exercise. Many things would interfere with the RT nature of the underlying OS and more or less reduce it to a "normal" OS. Also, the footprint of the OS itself would get extremely bloated and would increase the cost for additional system resources. I think that if you do need COM/DCOM and/or .Net for something, an RT system simply is not the right choice for you. Rolf Kalbermatter
  23. And what would be the problem with that? Move works both for files and directories. Move is simply the logical combination of the old Move and Rename into the same function. If a move happens on the same volume, the file data is not moved at all; just the directory entry is changed. If it is across different volumes, a rename would simply fail, while move will correctly move the file or directory (see the sketch below). This is how modern OSes do it at least, and the old DOS rename and move functionality are just remainders of a time when disk management was a lot more complicated and still had to be done partly by the application using it. The corresponding commands in the command line shell under Windows are simply built on the OS functions and written to emulate the old behaviour, even though the OS functions underneath would be much more versatile. Rolf Kalbermatter
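     On Windows the underlying OS call illustrates this nicely; a small sketch (the paths are of course just examples): MoveFileEx() renames within a volume by only touching the directory entry, and with MOVEFILE_COPY_ALLOWED it falls back to copy-and-delete for files that move across volumes:

     #include <windows.h>
     #include <stdio.h>

     int main(void)
     {
         /* Same-volume moves only update the directory entry; across volumes
            the flag allows a copy-and-delete fallback for files. */
         if (!MoveFileExA("C:\\data\\old name.txt",
                          "D:\\archive\\new name.txt",
                          MOVEFILE_COPY_ALLOWED))
             printf("MoveFileEx failed with error %lu\n", GetLastError());
         return 0;
     }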