
Rolf Kalbermatter

Members · Posts: 3,871 · Days Won: 262
Everything posted by Rolf Kalbermatter

  1. QUOTE (neB @ Jun 6 2008, 07:53 AM) You are of course right. Creation of any refnum-based objects in LabVIEW has to be done in the context of a VI hierarchy that does not go idle for the lifetime of the refnum. In this case the best thing would be to have a daemon running in the background that gets instantiated if necessary by the remote queue VIs and takes care of queue creation on behalf of those remote queue VIs. Rolf Kalbermatter
  2. QUOTE (crelf @ Jun 6 2008, 07:26 AM) Actually I think the story behind it is a little different. IMAQ Vision was indeed bought from Graftek and was at that time called Concept V.I. However, I'm pretty sure NI acquired all rights to Concept V.I. and consequently does not pay royalties to Graftek. They also later bought ViTA from them and added it to IMAQ Vision, and also another software package that was the base for IMAQ Vision Builder. NI-IMAQ, the traditional driver for NI image acquisition boards, is free to use (and only makes sense with NI boards). IMAQdx, the new-style API to control NI image acquisition boards, seems to require a license activation which may or may not come with the NI image acquisition board. Not sure about this last part. Rolf Kalbermatter
  3. QUOTE (ragglefrock @ Jun 1 2008, 06:18 PM) Actually I think the Call By Reference call will handle arbitration nicely so if you can provide VIs with atomic access to a particular queue you should be fine. Rolf Kalbermatter
  4. QUOTE (Neville D @ Jun 5 2008, 12:32 PM) It's how Vision software generally works. And after all it takes some serious engineering to do Vision software, so I can understand that the vendors want to get some form of return on investment. You could argue that LabVIEW is probably just as expensive to develop and its licensing is quite a bit simpler since it does not require separate runtime licenses. But it's not really the same industry, with LabVIEW competing with traditional development environments like Visual Studio, Borland, etc., that usually do not require runtime licenses, and we've already been through that in the past when NI tried to get runtime licenses for built LabVIEW executables. Rolf Kalbermatter
  5. QUOTE (jcryan @ Jun 5 2008, 02:08 PM) I don't have such a camera and couldn't do it. Maybe the original poster did something, but my experience with this is that most do not bother with the complications this causes. Rolf Kalbermatter
  6. QUOTE (jebus @ Jun 5 2008, 03:26 AM) Jim answered about gmail. As for your ISP, I'm not sure. For some reason the server considers your attempt to send data as junk mail. Why that would be I have no idea. Technically there seems no reason to believe that the communication did not proceed correctly. Maybe you should use a FROM: address in your attempt? Maybe cableonline requires a specific procedure? I'm afraid you will have to ask there about this, as painful as it might be to get a support person from an ISP who even understands what you are talking about. Rolf Kalbermatter
  7. QUOTE (jdunham @ Jun 4 2008, 07:57 PM) You are so right! Sorry QUOTE It would be great if you could set a string control (especially if input is limited to a single line) to scale with the font the way the numerics do. Thought about that at some point too, but was too lazy to make a product suggestion. Rolf Kalbermatter
  8. QUOTE (jebus @ Jun 3 2008, 09:58 PM) The SMTP email VIs do not implement authentication. As far as I know you can't use gmail (and just about any other email provider accessible over the public internet) without authentication. You could however use your ISP's SMTP server instead (if you are on a broadband connection) or your company's email server (if you are on a company-internal network). Those servers usually allow unauthenticated SMTP access since you access them from the internal network and are therefore considered trusted. Rolf Kalbermatter
  9. QUOTE (BrokenArrow @ Jun 3 2008, 09:11 AM) Actually it probably depends on the chosen font. Unless you use a TrueType (TT) font, Windows will not scale fonts smoothly in one-step increments. Those non-TT fonts are not defined by glyphs but by bitmaps instead, and they only exist in a discrete set of sizes. Windows does not attempt to scale bitmap fonts in one-step increments because the result would look VERY bad. Real TT fonts allow almost seamless scaling to just about any size. LabVIEW does not have anything to say about that. It specifies the font name, size and attributes, and Windows does whatever it thinks it can do. LabVIEW has virtually no further control over that other than querying the size of the resulting font to adapt its numeric controls to it. QUOTE (jdunham @ Jun 3 2008, 03:47 PM) I looked at your image. I think the fonts are exactly the same, but they are rendered to a different pixel size in Vista than they were in XP. Remember that integers can only be one line, so labview will always resize the numeric for the specific font size. For strings, the control itself does not resize automatically. As a test, select all of those objects and change the font size to 8 or 9 or something. The numeric array will get a lot smaller, but the string arrays won't change, even though their fonts do. No! The problem is that numeric controls adapt their height to the font applied to the number inside, whereas strings do not. This is in fact a copy of Windows control behaviour which NI would better have left out IMHO. You can also see that you cannot resize numerics in height but only in length, whereas strings can be sized to any height independent of the font they display in. Rolf Kalbermatter
  10. QUOTE (crelf @ Jun 2 2008, 03:59 PM) Nope, that window does not have any title. Rolf Kalbermatter
  11. QUOTE (normandinf @ Jun 2 2008, 04:46 PM) Works nicely! But only as long as there is a single ActiveX/.Net control on the panel. Rolf Kalbermatter
  12. QUOTE (Tomi Maila @ May 29 2008, 07:27 AM) First, saying that LabVIEW does not have a memory manager is a bit of a stretch. It's not a garbage-collecting memory manager like Java has, and consequently requires the application to be careful about memory allocation/deallocation to avoid memory leaks during operation, but there is nevertheless a layer between LabVIEW and the CRT memory allocation routines that I would consider a sort of memory manager. It used to be a lot smarter in the old days, with the help of a memory allocator called Great Circle, to compensate for the inadequacies of what Windows 3.1 could provide for memory management. The behaviour you see is quite likely a feature. I come to this conclusion for two reasons. First, its behaviour is similar to how LabVIEW uses memory for data buffers when calling subVIs. This memory is also recycled and often not really deallocated. Also, the fact that Request Deallocation cleans it up would definitely speak against a leak. Leaks are memory whose reference the application has lost for some reason. That seems not to be the case here. The Variant most likely keeps the array handle and, on a negative resizing, simply adjusts the dimSize without invoking the memory manager layer to resize that handle. An interesting test would be to see what happens if the small variant does not contain 0 elements but a few instead. Because I could imagine that on an incoming 0-size (or maybe very small) array the existing internal buffer is reused (with copying of the incoming data to the internal buffer for small sizes), but on larger-sized arrays the incoming handle is used instead and the internal handle gets really deallocated. Rolf Kalbermatter
  13. QUOTE (BrokenArrow @ May 27 2008, 11:46 AM) Sorry, haven't used it, but MAX definitely won't see those boards at all. MAX is an NI application and supports NI hardware only (apart from external devices such as GPIB and PXI boards in an NI PXI rack). Rolf Kalbermatter
  14. QUOTE (Neville D @ May 23 2008, 11:18 AM) That is the VISA passport for HPIB boards if I'm not mistaken. Rolf Kalbermatter
  15. QUOTE (crelf @ May 22 2008, 07:38 AM) Well, there might be all kinds of ramifications. For instance, NI acquired a source code license to FlexLM for certain OSes (most probably Windows only) and certain uses (most probably allowing them to protect their software with it, but not allowing them to create a licensing toolkit for use by people outside of NI). Macrovision (it seems they are now Acresso) has been a well-known company in the copy protection business and I'm sure they employ quite capable lawyers too, so if I were NI I wouldn't try to go beyond what the acquired license allows, which I believe certainly has its price even as a non-royalty-free license that is not free to use for everything. Rolf Kalbermatter
  16. QUOTE (Michael_Aivaliotis @ May 21 2008, 10:54 PM) Hmm, why does clicking on that image give me a download box for a file accessmacro.zip? The link showing in the status bar definitely points to a jpg image. Rolf Kalbermatter
  17. QUOTE (crelf @ May 20 2008, 02:42 PM) Let's put it like this: the FlexLM core in LabVIEW and other NI products is an extensible system. However, the way it is built into LabVIEW, it assumes a specific secret key to sign licenses. So in order to generate your own licenses you would need to know that key, and NI certainly will not publish it since it is secret. And even if you knew it, using it would likely be against one or more laws protecting copyright, such as the DMCA and probably some others. Rolf Kalbermatter
  18. QUOTE (Yen @ May 20 2008, 01:12 PM) Well, mine is 1280x800. Put it down to my age and the fact that my eyes aren't as sharp anymore. The screen resolution problem I usually solve by making my panels scalable anyhow. Not the LabVIEW autoscaling, mind you! Just some custom scaling so that specific parts on the screen scale while others stay put relative to the rest. Rolf Kalbermatter
  19. QUOTE (eaolson @ May 19 2008, 02:19 PM) I'm not sure where you read about atheism being a religion in the post you replied to. I read about it being a belief, and I have to agree with that. Nobody can prove there is a deity, nor that there isn't, so even atheists believe in something. Rolf Kalbermatter
  20. QUOTE (Yen @ May 19 2008, 01:12 PM) Indeed. LabVIEW here only checks the OS error after doing socket calls and translates that error into its own error number. And it was decided that more granular error reporting is a good thing to do, and I agree with that. Another angle might be that the OP is not really interested in the actual connection at all, but rather in the information of whether the network node is reachable at all. This is solved easily with a network ping, as described in this link: http://forums.ni.com/ni/board/message?board.id=170&thread.id=70801&view=by_date_ascending&page=1. I include a copy of the fixed version of that library. There is however one principal problem with LabVIEW's multithreading and the error reporting as done with sockets: the WSA error is maintained in a thread-specific variable, and for the blocking call when waiting on an answer from the remote node, and in case of parallel execution of this utility in general, this causes a small issue. For this select call, the reported error in case of a failure will never be the real error, since the select call happens in one of the threads inside a multi-threaded execution system while the error retrieval occurs in the UI thread. Also, if you run multiple ping calls in parallel, the error from one call may get overwritten by the execution of another call before the first VI had a chance to retrieve its associated error. The only real fix for this would be to write a wrapper DLL that does the actual socket call and error retrieval together in one function per different socket call (since the actual call to an external code routine is guaranteed by LabVIEW to be atomic in terms of the actual calling thread inside LabVIEW). I would like to add that the VIs are LabVIEW 7.0 and the original code is from m3nth on the NI forums. Rolf Kalbermatter
  21. I've got a Dell Latitude D830 with a 15.4" display and generally like it. In the office I use a docking station and an external 22" widescreen display too. I don't have to do a lot of work on the road, but I do use laptop-only mode at home and find it quite workable with this 15 inch screen. But I purposely chose a low 100 dpi instead of the maximum available 148 dpi, because I hate those super tiny fonts and even more the Windows Large Fonts hack to work around them, and it also better matches the roughly 90 dpi resolution of normal desktop LCD monitors. Rolf Kalbermatter
  22. QUOTE (gmart @ May 16 2008, 04:44 PM) Well, it's not about Visual Studio per se, but about whether that IDE uses the SCC API. That API was specifically designed around the strict check-in/check-out philosophy and accordingly uses, enforces and even requires it. And that does not work well with SVN. Visual Studio is certainly not a very good IDE to use with SVN, since it does rely on the SCC API for its source code control integration. As others have said, there are other IDEs that are a lot more flexible in how they interface to SCC systems and that work a lot better with SVN than Visual Studio. Rolf Kalbermatter
  23. QUOTE (mkaravitis @ May 16 2008, 04:56 PM) Callback functionality is not supported by the Call Library Node. The simplest solution is to write a wrapper DLL in C that provides that callback function, translating the callback event into a LabVIEW occurrence or LabVIEW user event, plus a LabVIEW-callable function to install that callback function. All in all, not something you are likely to solve without some good C programming knowledge. Rolf Kalbermatter
  24. QUOTE (Michael_Aivaliotis @ May 16 2008, 01:10 PM) My thinking exactly. However, quite a few of those manufacturers compete with NI mainly on one basis, and that is being cheaper than NI, so I guess that leaves not much room to spend money on real software development, especially since NI hardware has been getting more competitive in price too in recent years. Rolf Kalbermatter
  25. QUOTE (Tomi Maila @ May 16 2008, 12:37 PM) I'm still not sure I can see the need for millisecond and frame accuracy on playback. However, what we did so far was synchronization and combined storage of video and data acquisition (Synchronized Video & Data Acquisition Device, http://www.citengineering.com/pagesEN/products/sdvd.aspx) in order to have time-accurate live measurements such as blood pressure, ECG and similar together with the actual video recording of the operation, so that these things can later be exactly localized in relation to the actual action taken at that moment. However, playback of this video together with the data is usually not really in real time, and definitely not in strict real time, as the researcher normally wants to go through the interesting sections in real slow motion. Rolf Kalbermatter