Everything posted by Rolf Kalbermatter

  1. Very nice post! I feel almost bad for not using a Mac currently! Too bad Urs isn't around here on LAVA much! Yes, I used Macs for most of the work I did until leaving NI in 1996 and have looked at buying one several times, but never fully took the leap. There is also an eMac in the office, since a coworker needed a machine to support an app. But that box is so damn slow that switching between applications makes you want to fetch a coffee every time, and that is not so good for my blood pressure. But the Core Duo notebooks did make me seriously consider it again :thumbup: , even though their price is a bit higher than other systems. Rolf Kalbermatter
  2. Yes! The icon resource format used by LabVIEW originates from the original Macintosh OS icons. They have no explicit color table but instead use the application color table that an application initializes at startup; LabVIEW simply adopted that idea and ported it to all platforms. Changing that now would be a bit of a difficulty, although I guess they could add a new million-color icon format to the internal resource table. But that would be a lot of work, since quite a few low-level routines would have to be updated and retested (and I think the LabVIEW developers would rather not use that resource format anymore if they could), the icon editor would have to be seriously overhauled, quite a bit of bitmap handling would have to be adapted to support the new icon format, and last but not least the color picker tool itself would need to be taken care of too. And I have a feeling that there are quite a lot of other projects on the to-do list that have a higher priority, would require less work, and carry a lot less risk of breaking existing code. I do think that the original color picker from older LabVIEW versions, which had the actual colors in the 6*6*6 matrix, was better suited to the icon format, but they had to update it to support the color box, which can take millions of colors. Rolf Kalbermatter
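     For illustration, the 6*6*6 matrix mentioned above corresponds to the classic 216-color cube. A minimal Python sketch of how such a palette can be generated (the exact channel values, and the extra gray/system entries that fill out a 256-entry table, are assumptions):

```python
# Build the classic 6x6x6 color cube (216 colors), the kind of palette
# the older LabVIEW color picker presented. The remaining entries of a
# 256-color table (grays, system colors) are omitted here.
levels = [0x00, 0x33, 0x66, 0x99, 0xCC, 0xFF]  # six evenly spaced channel values

palette = [(r, g, b) for r in levels for g in levels for b in levels]

assert len(palette) == 6 * 6 * 6  # 216 entries
print(palette[:3])  # (0, 0, 0), (0, 0, 51), (0, 0, 102)
```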
  3. Argh! I forgot about that. They changed that with, I think, LabVIEW 7. Before that, only the BW icon was used to compute the transparency mask. Rolf Kalbermatter
  4. But your use of "Or if this is all you need..." implies a certain simplicity to me that I really can't see. Rolf Kalbermatter
  5. Well, that or it is indeed a group of hobby freaks wanting to do that in their closed group. Whatever; with such an attitude it is quite likely doomed to fail, even if some commercial interests and agenda are involved. I would estimate some 10 man-years of development before one can present a system with an architecture that is stable enough for future enhancements and that does somewhat more than what LabVIEW 1.0 could do. If I were to set up such a project I would go for a common UI widget toolkit like Qt, with an architecture for adding special plugin widgets for new types of controls such as waveforms etc. That would require C++ (I don't think any other programming language except C is up to the task), but since my C++ is bad I would not be able to participate too much. One difficult part is the compiler in LabVIEW, though it might be interesting to go for a JIT implementation instead, although that technology in itself is not really easier than a normal one-pass compiler. All in all, LabVIEW itself probably has some 1,000 man-years of engineering invested in it, and doing it all over from scratch you might get away with 100 to get to a level that can remotely do what LabVIEW can do nowadays. But that would not include LabVIEW Real-Time, FPGA, or Embedded. Rolf Kalbermatter
  6. No, not exactly. The 256-color palette LabVIEW uses actually comes from the Macintosh, like a lot of other things that have been in LabVIEW since version 3 :-). That this color palette closely correlates with the web standard basically just shows that Apple, as so often, had a bit of foresight when writing their user interface guidelines. That, or they had a secret time machine and just stole all the things that later became standards :-) Yes, icons themselves have no transparency. The transparency is done by a mask, and in LabVIEW this mask is automatically computed from the BW icon. With a resource editor that understands LabVIEW's file format (in older LabVIEW versions you could simply use a Macintosh resource editor on the Mac version of LabVIEW) you could manipulate that mask to use something different than the automatically computed one. Any bitmap in LabVIEW can have 24 bits. And for bitmaps that are only imported to be placed on the front panel, for instance, LabVIEW even allows transparency (GIF, PNG). Last time I looked (LabVIEW 7), the Picture Control itself did not support an alpha channel (32-bit bitmaps). Rolf Kalbermatter
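     A sketch of how such a mask can be derived from the BW icon (Python; the 32x32 size and the convention that every set pixel becomes opaque are assumptions for illustration):

```python
# Hypothetical sketch: compute a transparency mask from a 1-bit (BW)
# icon as described above -- every pixel that is set in the BW icon
# becomes opaque, everything else transparent.
def mask_from_bw_icon(bw_icon):
    """bw_icon: 32 rows of 32 ints, each 0 (white) or 1 (black)."""
    return [[255 if pixel else 0 for pixel in row] for row in bw_icon]

# A fully black icon yields a fully opaque mask.
icon = [[1] * 32 for _ in range(32)]
assert all(v == 255 for row in mask_from_bw_icon(icon) for v in row)
```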
  7. You are all taking the front panel too literally. You seem to assume that this is the front panel of your main screen, but in LabVIEW the front panel is the second window, besides the diagram window, that EVERY VI has. So the course information says you need to have this on the (any? I really get a hunch that this quote was taken out of context and was actually in the chapter about creating an application, where the inclusion of an About screen was explained!!) front panel, and the license agreement specifies that it has to be in the About screen. Seems very simple to me: which legal verbiage do you trust more? Yes, the license agreement! Is the course information wrong? No, not really, just not very clear. However, I don't think that an invisible or unreadable copyright notice would stand up in court if it ever got there. Rolf Kalbermatter
  8. I don't think LabVIEW in itself supports marshalling of arbitrary COM interfaces through its ActiveX interface. It would take a whole bunch of extra code to make that generic enough to support all possible datatypes. Rolf Kalbermatter
  9. Your last suggestion was the first thing that came to my mind when reading your post on Info-LabVIEW. It would seem to me the perfect use case for sub panels. Rolf Kalbermatter
  10. If you build your executable and the top-level VI stops, the application will quit, and you then have to start it again. The way to do this correctly is rather to write your configuration into a configuration file (I use INI files for this) and load it at startup. Then inside the application you have a button or menu somewhere that opens a configuration dialog window that allows changing those values, and on quitting the dialog you save everything into the configuration file again. To disable front panel elements you use the Property Node with the Disabled property. If the Runtime Engine is already installed, you do not need to install it again. But creating an installer that does include the Runtime Engine is not a bad idea; the installer will detect an already installed Runtime Engine and skip that part. Rolf Kalbermatter
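      The same load/edit/save pattern in a minimal Python sketch, with the standard configparser standing in for LabVIEW's INI-file VIs (file name and keys are made up for the example):

```python
# Minimal sketch of the configuration pattern described above:
# load settings at startup, let a dialog change them, save on exit.
import configparser
import os

CONFIG_FILE = "app_settings.ini"  # hypothetical file name

def load_config():
    config = configparser.ConfigParser()
    if os.path.exists(CONFIG_FILE):
        config.read(CONFIG_FILE)  # load saved settings at startup
    if "ui" not in config:
        config["ui"] = {"start_button_disabled": "false"}  # defaults
    return config

def save_config(config):
    with open(CONFIG_FILE, "w") as f:  # persist on leaving the dialog
        config.write(f)

config = load_config()
config["ui"]["start_button_disabled"] = "true"  # user changed a setting
save_config(config)
```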
  11. This last suggestion I would take with a grain of salt. Of course it is possible, and there were people who started to implement the MySQL TCP protocol in LabVIEW already, but considering the OP's level of expertise in LabVIEW, I think going with one of the existing solutions (the NI Database Toolkit, idx's ADO Toolkit) is already hard enough. Rolf Kalbermatter
  12. Besides the fact that some of our larger corporate customers only changed to XP in the last year or so, I do not see why I should change for myself just yet. My notebook will work for at least another year or two, driver support will be limited for some time, and Windows without a service pack hasn't been running on any of my computers since Microsoft introduced that phenomenon somewhere around Windows NT 3.5 (when I actually did most of my daily administrative work on a Mac IIci). So I guess I will be using Vista when I get a new computer, but not before that. Rolf Kalbermatter
  13. Well, you don't have to worry, of course, but if you choose a license that is not compatible with most mainstream open-source licenses, it prevents people who want to use such a license for their own software from making use of your VIs. Of course that is your decision and your call too, but thinking about these issues is not entirely superfluous AFAICS. Rolf Kalbermatter
  14. I don't see the problem. Put that code in a separate LLB or something without removed diagrams and you are set. The OpenG Builder even has support for this, automatically separating out all the VIs that come from a specific directory hierarchy into a diagram-enabled LLB. That he has to use the same LabVIEW version that was used to create the app for editing, and can't change the interface (connector pane, control types), are limitations that neither you nor anybody else can help with, but so be it. The latter is actually a limitation that applies to every library implementation. And if he breaks the functionality of the library somehow, that is also really his problem. Rolf Kalbermatter
  15. Ouch! LabVIEW DSC is a bit more than OPC access (which in the days of BridgeVIEW was quite limited) and alarm management. Things like integrated event and real-time logging, a complete tag-based data engine, and fully networked and very fast data exchange between engines are quite a bit more than just some simple cosmetics. I do know how hard these things are, as I implemented in the past a system similar to BridgeVIEW, although not exactly as complete.

      That said, LabVIEW DSC has some issues with things not always working perfectly, partly due to the rather complex system it has grown into, partly due to various technology changes in the past: from an almost entirely in-LabVIEW-written SCADA package (BridgeVIEW 1.0), to a system where quite some parts moved out of LabVIEW and parts of the Lookout SCADA engine were integrated instead (LabVIEW DSC 6.0), to a system where virtually everything important is done by various components outside of LabVIEW, quite a few of them inheriting from the Lookout technology (LabVIEW DSC 7 and 8). This frequent change of architecture introduced its own problems: with each new major version complete subsystems got replaced, and with them came new bugs that needed their time to be ironed out, only to be followed by other new bugs with the next major architecture change. You can still build quite powerful and interesting SCADA applications with LabVIEW DSC; I have used BridgeVIEW 1.0, 2.0, and LabVIEW DSC 5 - 7 for some projects and they always did what had to be done, although there were occasionally problems to debug and resolve in more or less close cooperation with some developers in Austin.

      Lookout therefore has all the features LabVIEW DSC has, and a few more, except for the highly programmable LabVIEW environment. It seems NI is not pushing Lookout anymore but has no intention of abandoning it yet. The problem with Lookout is not that it is a bad system (it absolutely is not). But when you create a Lookout program you are not exactly programming but rather configuring your application, although this configuration is very powerful the way it is done. The real reason why it never became a big success is twofold: NI had LabVIEW, which is sexy and powerful, and the NI sales force had little knowledge about Lookout and little interest in learning it, since selling LabVIEW was easier and had a lot more sex appeal. Without LabVIEW, and with the will from NI to go after the SCADA market, Lookout could easily have become the killer app in the NI SCADA product line. One of the reasons why I didn't push Lookout myself more for any of our projects was the much more involved licensing for distributed apps in comparison to the LabVIEW applications we could make. But to be honest, for classical SCADA applications that certainly wasn't a problem; Lookout was quite cheap in comparison to most other SCADA packages out there back in the late '90s. Rolf Kalbermatter
  16. I think you misunderstand the LGPL a bit. In my opinion, for most LabVIEW code it is more appropriate than the GPL. The GPL requires you to make your entire application GPL. The LGPL does (at least IMO) allow you to link the library into your project in a dynamic way, letting you choose another license for your application, as long as you give your users access to the LGPL part in some way (such as a dynamically called VI in an LLB external to your application). This guarantees that modifications to the LGPL library itself remain accessible to the end user (and, to a lesser degree, to the entire open-source community). That was the main reason the LGPL was invented at all, as the GPL had too strong a viral effect for some developers who created libraries rather than applications. The LGPL is in itself fully compatible with any GPL project, and remains usable even for closed-source applications as long as you provide some means of access to the source code of the LGPL part. Rolf Kalbermatter
  17. Hmm, I know them, as I wrote that library :-) But the status they return is completely independent of TCP/IP or any other network protocol. They return the link status of the network connection (in fact, the detection of the carrier on the network cable when it is connected). However, be aware that this may not always work, although today it may not be as bad anymore. It can depend both on the network card used and its drivers, as well as on the remote side connected. For instance, having a hub attached will always show the link status as connected, even though that hub may not be connected to anything else but the power line. In the past, some hubs with auto-detection/negotiation of speed and/or crossover had trouble properly detecting certain network cards, resulting in a carrier on the network link but no real connection being possible. So don't just blindly expect this library to give you everything. This status only tells you that there is a powered network interface attached to your interface and nothing more. Whether network traffic is actually possible can be, and often is, a completely different issue. Rolf Kalbermatter
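      The same distinction, illustrated in a Linux-only Python sketch (this is not the library discussed above, and the interface name is an assumption):

```python
# Read the kernel's carrier flag for a network interface. As described
# above, this only reports that a powered device is attached to the
# cable; it says nothing about whether traffic is actually possible.
def link_is_up(interface="eth0"):  # "eth0" is a made-up example name
    try:
        with open(f"/sys/class/net/{interface}/carrier") as f:
            return f.read().strip() == "1"
    except OSError:
        return False  # interface down or nonexistent; carrier unreadable

print("carrier detected" if link_is_up() else "no carrier")
```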
  18. That is not how TCP/IP works. TCP/IP is a state-controlled protocol, and in order for an error 66 to be reported, the TCP/IP stack must go through the FIN state, which is initiated with a FIN/FIN-ACK handshake. Since the connection simply went away, no such handshake ever happens. For the local stack the connection is still in an active state, although all packet sends and requests time out, and that is what you get: error 56, timeout. You will have to rethink your approach; TCP/IP in itself does not guarantee detection of line breaks, only detection and reporting of successful transmission. I think there is some configurable timeout clearing for the TCP/IP stack, where the connection is put into the FIN state automatically after a certain number of packet requests time out continuously. Rolf Kalbermatter
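      At the raw socket level, the usual workaround is TCP keepalive or explicit application-level timeouts. A Python sketch (the tuning options are Linux-specific and the values are illustrative assumptions, not recommendations):

```python
# Enable TCP keepalive so the local stack eventually notices a silently
# broken connection instead of keeping it "active" forever, as described
# above.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

# Linux-specific tuning: first probe after 10 s idle, then every 5 s,
# give up after 3 failed probes -- after that, blocking calls fail
# instead of hanging on a half-open connection.
if hasattr(socket, "TCP_KEEPIDLE"):
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 10)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 5)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 3)

sock.settimeout(2.0)  # explicit timeout, comparable to LabVIEW's error 56
```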
  19. Sorry, Aitor. I remembered from my limited investigations into the 8.0 license manager (yes, that kind of thing is legal here, although using the knowledge to circumvent such protection is not) that there were two explicit areas that seemed to require a license in order to run. One was the scripting feature that we all know about, and the other was something like XNode development. Knowing scripting, I investigated a bit further into it, but not being familiar with XNodes I never went further on that. Maybe they changed the XNode protection in 8.20, or there are two different aspects of XNodes that are protected differently. I do not know and won't have time to investigate in the near future. Rolf Kalbermatter
  20. I also have to warn that my opinion is not completely unbiased, as I started my LabVIEW career as an application engineer at NI and then went on to be an alliance member. When I started at NI I was shown this software and some manuals (admittedly less complete and much smaller in content than nowadays, but they were printed paper), and then I got a chance to attend a LabVIEW course or two. And those courses really helped a lot. However, I have to say that I had previous programming practice in Pascal and a little C, so programming in itself wasn't a strange matter to me. With my electrical engineering background I was delighted to see LabVIEW diagrams that so closely resembled the electrical schematics I had learned to think in earlier, so I adopted it quite fast, but nevertheless felt that the course really gave me an advantage. It wasn't so much about the programming itself but about discovering all the little features, editor shortcuts, and tips and tricks that this course gave, and also the interaction with the teacher and other students during the course.

      Later I taught LabVIEW courses myself, as an application engineer and also as an alliance member, and I have to say that I still learned a bit during each of those courses. My experience during these courses was that there were two types of people. The ones who already knew programming usually profited a lot more from the course than the ones who had to be taught the basic principles of programming first. Three days is simply not enough to teach someone a whole bunch of programming constructs and something about datatypes, and at the same time also have them get familiar with a new software environment such as LabVIEW. But I think that is the same with any software course. I doubt there is a Matlab course that would be useful to anyone who has to be taught the basic principles of mathematics first, for instance.

      The only problem I always felt was that NI likes to market LabVIEW as the tool for non-programmers. In my view that is not entirely correct. Without some basic understanding of loops, conditionals, arrays, and scalars you simply can't create a good working computer application. The advantage of LabVIEW is that these things are easier to understand and use in LabVIEW for most people, since people tend to be more visually oriented than text oriented. Oh yes, I took the courses in Austin and on a Macintosh, since LabVIEW for Windows didn't exist then, and there were a few people (not NI people) in the same course who obviously had it even easier than me. They usually had the examples finished before the instructor even asked us to start with them. They were attending the class to learn LabVIEW, not programming, something which I haven't seen too often over here in Europe later when teaching courses. Rolf Kalbermatter
  21. The idea about FPGA might be interesting here. Earlier versions of LabVIEW did not support (or should I say use) fixed-size arrays, although the type descriptor has explicitly documented this feature for as long as the document about type descriptors has existed. FPGA was, to my knowledge, the first environment really needing fixed-size arrays, so they pushed that. The particular problem you see also seems a bit like the pains of the early attempts to get the constant-folding optimization into LabVIEW. I'm not sure if the FPGA Toolkit adds this feature to the LabVIEW environment, but I rather think that it is there independent of the existence of the FPGA Toolkit (I can't currently check, as I have the FPGA Toolkit installed too). Rolf Kalbermatter
  22. Well, I didn't say to turn off all optimizations, certainly not the ones that are already working fine, and in the particular case of 6.0.1 it was not about inplaceness as such. It was about a more aggressive inplaceness optimization that would completely optimize away bundle/unbundle constructs when combined with certain shift register constructions. The same code had worked fine for several years in previous LabVIEW versions without so much as a hint of performance problems and suddenly blew up in my face. The Queue port was also not such a nice thing, but I got off easy there, since I didn't use queues much, having gotten used to creating my intelligent USR global buffer VIs for virtually anything that needed queue-like functionality. But I think there is a big difference between bugs introduced through things like constant folding and bugs introduced in new functionality. I can avoid using queues or whatever quite easily, but I can hardly avoid using shift registers, loops, and basic data structures such as arrays or clusters, since they are the fundamental building blocks of working in LabVIEW. So if something in that basic functionality suddenly breaks, that LabVIEW version is simply not usable for me. The same goes for fundamental editor functionality: just imagine that dropping any function node on the diagram suddenly crashed on every fourth installed computer somehow. Other bugs can be very annoying, but you can still keep working in that LabVIEW version and write impressive applications with it if you need to. While we all would like bug-free software, I think almost everyone has accepted that this is something that will never really happen before LabVIEW 77 with its 5th-generation AI and environment interfaces with causality influencer. But the basic functionality of LabVIEW 2 should not suddenly break. Rolf Kalbermatter
  23. Well, nothing against German :beer: but the Belgians really have a few of the best ones that I know of. And no, I don't say that because he is a colleague. Rolf Kalbermatter
  24. Ah, I see. Well, I myself still have to do my first RT/FPGA project in 8.x. 7.1.1, while having some quirks, actually still works great for that. Rolf Kalbermatter