Rolf Kalbermatter

Everything posted by Rolf Kalbermatter

  1. Well, it is possible. But instead of an array, place 8 uint8 elements at the end that correspond to the 8 bytes of your data array. Being fixed size, this array is not really treated as an array in C anymore but corresponds more to a LabVIEW cluster containing exactly that number of elements. And byte alignment is not a problem for this structure. See other posts for all about byte alignment, but basically LabVIEW always uses 1-byte alignment while most C libraries use some other alignment such as 4, 8 or 16 bytes. The alignment only says that the start address of a data element in a structure is always set to a multiple of the lower of two values: the size of the element itself or the byte alignment. So LabVIEW really uses so-called packed memory structures, while most DLLs would place an int32 on a multiple of 4, for instance, adding 0 to 3 filler bytes if the previous element did not end on such a memory address. The alignment of data elements in structures can be set by the DLL developer at compile time, but for normal 32-bit DLLs under Windows nowadays you can assume 8 bytes if the library documentation or header file (look for #pragma pack(x) lines in there) doesn't say otherwise. Rolf Kalbermatter
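The difference between LabVIEW's packed (1-byte-aligned) cluster layout and a C compiler's natural struct alignment can be demonstrated with Python's struct module, where '=' requests a packed layout and '@' the platform's native alignment; the exact aligned size is platform-dependent, the values below are just what most mainstream platforms produce:

```python
import struct

# A structure of { int8; int32 }:

# Packed layout ('='), like a LabVIEW cluster: 1 + 4 = 5 bytes,
# no filler bytes between the elements.
packed_size = struct.calcsize('=bi')

# Native alignment ('@'), like a typical C compiler: the int32 is
# moved to the next multiple of 4, inserting 3 filler bytes after
# the int8, so the total is usually 8 bytes.
aligned_size = struct.calcsize('@bi')

print(packed_size, aligned_size)
```

This is exactly why a cluster passed unadjusted to a DLL expecting a naturally aligned struct reads the wrong bytes for every element after the first misaligned one.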
  2. Well, Option 3 is not exactly impossible but pretty useless. All Office formats except the latest use the so-called "structured object storage" format, which MS has actually partly documented in the past. It is basically a streaming of OLE objects into a binary stream of data. The difficulty is figuring out the actual object elements in this stream for the different Office applications, and reverse engineering is definitely possible; OpenOffice has done that. But it is basically useless, because Microsoft has changed the object hierarchy with every new Office version in the past, and with the newest Office has even moved away more or less entirely by using XML instead. Dealing with these version differences is a nightmare, and reverse engineering such a format without an entire OpenOffice developer community is doomed to fail, as you will just be starting to figure out the actual workings of a format at about the point where the next version is due. Rolf Kalbermatter
  3. That would make sense! "Tax the bastards that cannot appreciate our fully bundled lock-into-one-provider packages". You also need to consider the extra testing necessary for customized CDs for the European market. Or wait, it is a way of driving European businesses off to use Linux more; since they are going to do that anyway, get from them what you can as long as you can! Rolf Kalbermatter
  4. An image is not a control! An image is an image is an image, and simply nothing more. As such it is just a decoration. Using the control editor in LabVIEW you can modify existing LabVIEW controls to use images from the image navigator as subparts of that control. Each LabVIEW control consists of at least one subpart (a simple boolean button) or more (for instance the housing, slider, increment/decrement arrows and filler of a slider). The LabVIEW control editor is accessed by right-clicking a control and then selecting Advanced->Customize Control. Beware: the control editor is some sort of graphics editor, but with limited customizing (you can only change the face and size of parts, not more), and it is sometimes a little jerky to use; but if you invest some time you can really get very nice controls done. Rolf Kalbermatter
  5. You could also have asked how to use a computer to create a program. Basically, by learning the tools you have available. LabVIEW DSC is specifically an add-on to solve most of those problems a bit more easily, although it is by far not a requirement. But giving you a rundown of what to do here would be mostly a copy of the Getting Started with LabVIEW DSC manual, and I would assume you have that at hand. Rolf Kalbermatter
  6. You misunderstand VI Server a bit here. VI Server is the actual core functionality directly operating on the LabVIEW object hierarchy, and LabVIEW objects have absolutely nothing to do with COM at all. LabVIEW-to-LabVIEW communication on the VI Server level happens through a private LabVIEW TCP/IP protocol and should support the same features, except the ones that are limited on purpose for security reasons. The LabVIEW ActiveX server interface is just a wrapper around the VI Server interface, and I'm sure some of the COM concepts map better to the VI Server object model than others. But considering your NI icon, I would say you should definitely contact Lucyangeek aka Brian Tyler, who is very proficient with .Net and ActiveX functionality and how it is implemented and used in LabVIEW. Rolf Kalbermatter
  7. This last comment depends a bit, I think. Yes, I'm a bit of a seasoned C programmer (but learned most of what I know about C after I learned LabVIEW). Having learned programming in Pascal might also have fostered my tendency to opt for more functions rather than fewer. While applications tend to follow your screen hierarchy metrics somehow, that is not always the case with function libraries. I would agree that sorting every little snippet out into its own subVI is not very helpful, both for maintenance and for understanding of the code (and creating the extra icon and connector pane also takes some time), but I would say it is better to have one subVI too many than one too few. I really can't get happy when seeing someone's code where: 1) a diagram covers more than a screen and is stuffed with all kinds of code, making it a major task to try to understand what the VI does, or 2) an application has similar or identical code constructs implemented over and over again all over the place. That said, programs where various VIs are plopped into the diagram and the wiring and connector pane make it obvious that the VI was created with "Create VI from selection" without further edits to that subVI can make me even more sick. Or another bad design: many subVIs that all do more or less the same thing with slight differences, sprinkled through the entire application hierarchy. Rolf Kalbermatter
  8. Well, as far as C) is concerned: LabVIEW has been smarter than that since, I think, at least 3.0. Since Array Size is a function that never (re)uses the array buffer, LabVIEW always attempts to schedule this call before any other functions that use the branched array, in order to avoid unnecessary buffer copies. And here, apparently, is where the difference happens. Because Array Subset usually does modify the buffer passed in, it creates a copy in D, since the source wire has a branch. Since D is now a new buffer, it can be passed directly to the subVI. In the second case, Array Subset does reuse the buffer but flags it as not clean, since it usually would have resized the array somehow, though in this case without reallocating the actual memory area. Since the buffer is not clean, LabVIEW now creates a clean copy to pass to the subVI. All this shows that analyzing for optimization is a difficult thing to do. In this case the first solution will be more performant, since the additional buffer is only allocated once, outside of the loop. Array Subset being a function that modifies the buffer is of course a nasty thing to analyze, and as you can see, optimizing the Array Subset here can actually have negative effects, since LabVIEW has to consider many possible corner cases even though in this case the extra copy would not strictly be necessary. I'm also sure the entire story would look different again if you added a shift register to the loop and passed the array into it, feeding it inside the loop from the left to the right shift register terminal. While this code may look useless, it almost always helps LabVIEW to optimize the code in the best possible way, as the LabVIEW optimizer has some very smart shift register optimization rules. Rolf Kalbermatter
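The "reused but not clean" buffer versus "defensive copy" distinction is specific to LabVIEW's dataflow compiler, but the underlying memory behavior can be illustrated with a plain Python analogy (this is an illustration of shared versus copied buffers, not LabVIEW's actual mechanism):

```python
# A byte buffer standing in for a LabVIEW array.
buf = bytearray(range(8))

# Analogous to Array Subset reusing the buffer: a memoryview slice
# shares memory with the original, so it is NOT a clean independent
# copy -- writes through it are visible in the original.
shared = memoryview(buf)[2:5]
shared[0] = 99
print(buf[2])   # 99: the write leaked back into the original buffer

# Analogous to the defensive copy LabVIEW inserts for a branched
# wire: an explicit copy owns its own buffer, so the original stays
# untouched no matter what the consumer does with it.
clean = bytearray(buf[3:6])
clean[0] = 77
print(buf[3])   # still 3: the copy is independent
```

The copy costs an allocation, which is exactly why placement matters: allocated once outside a loop it is cheap, allocated every iteration it is not.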
  9. Very nice post! I feel almost bad not using a Mac currently! Too bad Urs isn't around here on Lava much! Yes, I used Macs for most of the work I did until leaving NI in 1996 and have several times looked at buying one, but never fully took the leap. There is also an eMac in the office, since a coworker needed a machine to support an app. But that box is so damn slow that switching between applications makes you want to fetch a coffee every time, and that is not so good for my blood pressure. But the Core Duo notebooks did make me seriously consider it again :thumbup: , even though their price is a bit higher than other systems. Rolf Kalbermatter
  10. Yes! The icon resource format used by LabVIEW originates from the original Macintosh OS icons. Those have no explicit color table but instead use the application color table initialized by an application at startup, and LabVIEW simply adopted that idea and ported it to all platforms. Changing that now would be a bit of a difficulty, although I guess they could add a new millions-of-colors icon format to the internal resource table. But that would be a lot of work, since quite some low-level routines would have to be updated and retested (and I think the LabVIEW developers would rather not use that resource format anymore if they could), the icon editor would have to be seriously overhauled, quite a bit of bitmap handling would have to be adapted to support the new icon format, and last but not least the color picker tool itself would need to be taken care of too. And I have a feeling that there are quite a lot of other projects on the to-do list that have a higher priority, would require less work, and carry a lot less risk of breaking existing code. I do think the original color picker from older LabVIEW versions, which had the actual colors in the 6*6*6 matrix, was better suited for the icon format, but they had to update it to support the color box, which can take millions of colors. Rolf Kalbermatter
  11. Argh! I forgot about that. They changed that with, I think, LabVIEW 7. Before that, only the BW icon was used to compute the transparency mask. Rolf Kalbermatter
  12. But your use of "Or if this is all you need..." implies a certain simplicity to me that I really can't see. Rolf Kalbermatter
  13. Well, that, or it is indeed a group of hobby freaks wanting to do that in their closed group. Whatever; with such an attitude it is quite likely doomed to fail, even if some commercial interests and agenda are involved. I would estimate some 10 man-years of development before one could present a system with an architecture stable enough for future enhancements that does somewhat more than what LabVIEW 1.0 could do. If I were to set up such a project, I would go for a common UI widget toolkit like Qt, with an architecture to add special plugin widgets for new types of controls such as waveforms etc. That would require C++ (I don't think any other programming language except C is up to the task), but since my C++ is bad I would not be able to participate much. One difficult part is the compiler in LabVIEW, though it might be interesting to go for a JIT implementation instead, although that technology in itself is not really easier than a normal one-pass compiler. All in all, LabVIEW itself probably has some 1000 man-years of engineering invested in it, and doing it all over from scratch you might get away with 100 to reach a level that can remotely do what LabVIEW can do nowadays. But that would not include LabVIEW Real-Time, FPGA or embedded. Rolf Kalbermatter
  14. No, not exactly. The 256-color palette LabVIEW uses actually comes from the Macintosh, like a lot of other things that have been in LabVIEW since version 3 :-). That this color palette closely correlates with the web standard basically just shows that Apple, as so often, had a bit of foresight when doing their user interface guidelines. That, or they had a secret time machine and just stole all the things that later became standards :-) Yes, icons in themselves have no transparency. The transparency is done by a mask, and in LabVIEW this mask is automatically computed from the BW icon. With a resource editor that understands LabVIEW's file format (in older LabVIEW versions you could simply use a Macintosh resource editor on the Mac version of LabVIEW) you could manipulate that mask to use something different than the automatically computed one. Any bitmap in LabVIEW can have 24 bits. And for bitmaps that are only imported to be placed on the front panel, for instance, LabVIEW even allows transparency (GIF, PNG). Last time I looked (LabVIEW 7), the Picture Control itself did not support an alpha channel (32-bit bitmaps). Rolf Kalbermatter
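Computing a transparency mask from a 1-bit black-and-white icon, as described above, can be sketched in a few lines. The pixel packing here (8 pixels per byte, most significant bit first, a set bit meaning a black and therefore opaque pixel) is an assumption borrowed from the classic Macintosh 1-bit icon layout, purely for illustration:

```python
# Sketch: derive a transparency mask from a 1-bit B/W icon.
# Assumption: each row is a bytes object packing 8 pixels per byte,
# MSB first; a set bit is a black pixel, which we mark opaque (1).
def mask_from_bw(rows, width):
    mask = []
    for row in rows:
        bits = []
        for x in range(width):
            byte = row[x // 8]
            bit = (byte >> (7 - x % 8)) & 1
            bits.append(bit)   # 1 = opaque, 0 = transparent
        mask.append(bits)
    return mask

# A 4-row, 8-pixel-wide icon: solid row, alternating row, two empty rows.
icon = [b'\xff', b'\xaa', b'\x00', b'\x00']
print(mask_from_bw(icon, 8)[1])   # [1, 0, 1, 0, 1, 0, 1, 0]
```

A hand-edited mask, as the post notes, could mark pixels opaque that the B/W icon leaves white; this automatic derivation is simply the default.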
  15. You all take the front panel too literally. You seem to assume that this means the front panel of your main screen, but in LabVIEW the front panel is the second window, besides the diagram window, that EVERY VI has. So the course information says you need to have this on a (any? I really get a hunch that this quote was taken out of context and was actually in the chapter about creating an application, where the inclusion of an About screen was explained!!) front panel, and the license agreement specifies that it has to be in the About screen. Seems very simple to me: which legal verbiage do you trust more? Yes, the license agreement! Is the course information wrong? No, not really, just not very clear. However, I don't think that an invisible or unreadable copyright notice would stand up in court if it ever got there. Rolf Kalbermatter
  16. I don't think LabVIEW in itself supports marshalling of arbitrary COM interfaces through its ActiveX interface. That would be a whole bunch of extra code to make it generic enough to support all possible datatypes. Rolf Kalbermatter
  17. Your last suggestion was the first thing that came to my mind when reading your post on Info-LabVIEW. It would seem to me the perfect use case for sub panels. Rolf Kalbermatter
  18. If you build your executable and it stops, it will quit, and then you start it again. The correct way to do this is rather to write your configuration into a configuration file (I use INI files for this) and load it on startup. Then inside the application you have somewhere a button or menu to open a configuration dialog window that allows changing those values, and on quitting the dialog you save everything into the configuration file again. To disable front panel elements you use the Property Node with the Disabled property. If the Runtime Engine is already installed you do not need to install it again. But creating an installer that does include the runtime engine is not a bad idea; the installer will detect an already installed runtime engine and skip that part. Rolf Kalbermatter
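The load-at-startup / save-on-dialog-close pattern described above can be sketched with Python's configparser, which reads and writes the same INI format; the file name, section and key names here are made up for the example:

```python
import configparser

# Hypothetical file and settings, for illustration only.
CONFIG_PATH = 'app.ini'
DEFAULTS = {'interval_ms': '1000', 'log_enabled': 'true'}

def load_config(path=CONFIG_PATH):
    """Read settings at application startup, falling back to defaults."""
    cfg = configparser.ConfigParser()
    cfg['settings'] = dict(DEFAULTS)
    cfg.read(path)   # a missing file is silently ignored: defaults remain
    return cfg

def save_config(cfg, path=CONFIG_PATH):
    """Write settings back, e.g. when the configuration dialog closes."""
    with open(path, 'w') as f:
        cfg.write(f)

cfg = load_config()
cfg['settings']['interval_ms'] = '500'   # the user's edit in the dialog
save_config(cfg)                         # persisted for the next run
```

The next start of the application then picks up the edited values instead of the defaults, which is exactly the behavior the post recommends for an executable that cannot remember front panel values on its own.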
  19. This last suggestion I would take with a grain of salt. Of course it is possible, and there were already people starting to implement the MySQL TCP protocol in LabVIEW, but considering the OP's level of expertise in LabVIEW, I think getting one of the existing solutions (NI Database Toolkit, idx's ADO Toolkit) to work is already hard enough. Rolf Kalbermatter
  20. Besides the fact that some of our larger corporate customers only changed to XP in the last year or so, I do not see why I should change for myself just yet. My notebook will do for at least another year or two, driver support will be limited for some time, and Windows without a service pack hasn't been running on any of my computers since Microsoft introduced that phenomenon somewhere around Windows NT 3.5 (when I actually did most of my daily administrative work on a Mac IIci). So I guess I will be using Vista when I get a new computer, but not before that. Rolf Kalbermatter
  21. Well, you don't have to worry, of course, but if you choose a license that is not compatible with most mainstream open source licenses, it prevents people who want to use such a license for their own software from making use of your VIs. Of course that is your decision and your call too, but thinking about these issues is not entirely superfluous AFAICS. Rolf Kalbermatter
  22. I don't see the problem. Put that code in a separate LLB or something without removed diagrams and you are set. The OpenG Builder even has support for this, automatically separating out all the VIs that come from a specific directory hierarchy into a diagram-enabled LLB. That he has to use the same LabVIEW version that was used to create the app for editing, and can't change the interface (connector pane, control types), are limitations that neither you nor anybody else can help with, but so be it. The latter is actually a limitation that applies to every library implementation. And if he breaks the functionality of the library somehow, that is also really his problem. Rolf Kalbermatter
  23. Ouch! LabVIEW DSC is a bit more than OPC access (which in the days of BridgeVIEW was quite limited) and alarm management. Things like integrated event and real-time logging, a complete tag-based data engine, and completely networked and very fast data exchange between engines are quite a bit more than just some simple cosmetics. I do know how hard these things are, as I implemented in the past a system similar to what BridgeVIEW had, although not exactly as complete. That said, LabVIEW DSC has some issues with things not always working perfectly, partly due to the rather complex system it has grown into, partly due to various technology changes in the past: from an almost entirely in-LabVIEW-written SCADA package (BridgeVIEW 1.0), to a system where quite some parts moved out of LabVIEW and parts of the Lookout SCADA engine were integrated instead (LabVIEW DSC 6.0), to a system where virtually everything important is done by various components outside of LabVIEW, with quite a few of them inheriting from the Lookout technology (LabVIEW DSC 7 and 8). This frequent change of architecture introduced its own problems: with each new major version complete subsystems got replaced, and with them came new bugs that needed their time to be ironed out, only to be followed by other new bugs with the next major architecture change. You can build quite powerful and interesting SCADA applications with LabVIEW DSC; I have used BridgeVIEW 1.0, 2.0, and LabVIEW DSC 5 - 7 for some projects and they always did what had to be done, although there were occasionally some problems to debug and resolve in more or less close cooperation with some developers in Austin. Lookout in itself has all the features LabVIEW DSC has, and a few more, except for the highly programmable LabVIEW environment. It seems NI is not pushing Lookout anymore but has no intention of abandoning it yet. The problem with Lookout is not that it is a bad system (it absolutely is not).
When you create a Lookout program you are not exactly programming but rather configuring your application, yet this configuration is very powerful in the way it is done. The real reason why it never became a big success is twofold. NI had LabVIEW, which is sexy and powerful, and the NI sales force had little knowledge about Lookout and little interest in learning it, since selling LabVIEW was easier and had a lot more sex appeal. Without LabVIEW in the way, and with the will from NI to go after the SCADA market, Lookout could easily have become the killer app in the NI SCADA product line. One of the reasons why I didn't push Lookout myself more for any of our projects was the much more involved licensing for distributed apps in comparison to the LabVIEW applications that we could make. But to be honest, for classical SCADA applications that certainly wasn't a problem; Lookout was quite cheap in comparison to most other SCADA packages out there back in the late '90s. Rolf Kalbermatter
  24. I think you misunderstand the LGPL a bit. In my opinion, for most LabVIEW code it is more appropriate than the GPL. The GPL requires you to make your entire application GPL. The LGPL does (at least IMO) allow you to link the library into your project in a dynamic way, letting you choose another license for your application, as long as you give your user access to the LGPL part in some way (such as a dynamically called VI in an LLB external to your application). This guarantees that modifications to the LGPL library itself remain accessible to the end user (and to a lesser degree to the entire open source community). This was the main reason why the LGPL was invented at all, as the GPL had too strong a viral effect for some developers who created libraries rather than applications. The LGPL is in itself fully compatible with any GPL project, as long as you provide some means of access to the source code of the LGPL part, even for closed source applications. Rolf Kalbermatter