Everything posted by Rolf Kalbermatter

  1. I don't think compiler building is the way to make money in the future. I'm focusing here on C/C++ since that is what I know best and it is also by far the most widely used development system for building applications. The biggest compiler system (GNU) is nowadays open source and not something to make money with by itself. Metrowerks didn't fail because it was bought by Motorola; I rather think they might have failed earlier without that. They got quite some business out of supporting just about every Motorola CPU, and their main business, supporting Macintosh development, would probably have been dwarfed quite a lot by now since Apple decided to use GNU C instead of anything else for OS X; considering they used an open source kernel, that seems like a smart move to me. Borland has a track record of making bad decisions about great products at the wrong moment. Symantec has, aside from its starting years, never been a company making products for the user but was only about making money. Watcom was quite successful but in a specialized business niche, enabling technologies that got easier to do over the years on just plain MS Windows, and most of what Watcom was is now open source too. That leaves really only MS Visual C and Intel C as currently still active commercial compilers: MS Visual C because it is from the makers of MS Windows, and Intel C because they know how to tweak the last ns out of their CPUs. Basically I think GNU C has over the years made the idea of selling C compilers rather unattractive, and I do not see how it could be in the financial interest of NI to promote an open source graphical dataflow language (note: "G" is already taken as a programming language name, and not by LabVIEW, so I will refrain from using it here).

     Dataflow IS LabVIEW. Taking it out creates something that is anything but LabVIEW, so I think this settles it. The biggest power of LabVIEW is dataflow, but that creates limitations at the same time, such as the difficulty or near impossibility of creating a really object oriented system with inheritance and all that, or generic references to speed up access to large data. On the other hand, dataflow does allow seamless parallelization of code execution for users who do not understand anything about multi-threading. This same advantage makes implementing references in a thread safe manner almost impossible without thwarting the performance improvements those references are supposed to bring.

     Exactly, and in doing so it would become mostly useless for what we are doing with it. I think there is not much of a problem in getting your own graphical programming environment started and putting it out as an open source project ;-) if you abstain from using dataflow and a few specific graphical constructs. I know NI has patented many things, but some of them are so general that there is certainly proof of prior art, or they are so obvious that alternatives are not even imaginable. A system to configure hardware resources in a graphical representation borrowed from MS Explorer is not something I would consider patentable ;-) Others could be interpreted as patenting the idea of an API. Rolf Kalbermatter
  2. Because it is asynchronous. The execution of the property node only posts an event to the UI thread to refresh that VI front panel the next time the UI thread gets scheduled by the OS. The diagram then continues to execute and changes the other property nodes too. At some point the OS decides that the current thread has had enough of its time slice, interrupts it, and eventually passes control to the UI thread, which then proceeds to do whatever it needs to do according to the current properties for that window. This means that you still might see flicker on a very slow machine; nothing to do about that. Rolf Kalbermatter
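     A minimal C sketch of the same pattern on Windows (the window handle and the message constant are illustrative, not LabVIEW's actual internals):

        #include <windows.h>

        #define WM_APP_REFRESH (WM_APP + 1)  /* our own illustrative message */

        /* Posting is asynchronous: the call returns immediately and the window
           procedure handles the message whenever the UI thread is next
           scheduled, which is why several property writes can end up coalesced
           into a single visible update. */
        void RequestRefresh(HWND hwnd)
        {
            PostMessage(hwnd, WM_APP_REFRESH, 0, 0);   /* returns at once */
            /* SendMessage(hwnd, WM_APP_REFRESH, 0, 0); would instead block
               until the UI thread had processed the message. */
        }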
  3. Hi Didier. Congratulations too, from a fellow countryman living in the "low lands" of Europe. It will be an interesting addition to your family and will make sure you and your wife keep busy Rolf Kalbermatter
  4. I haven't checked into this issue specifically, but as yen says it's most probably related to how LabVIEW accounts for daylight saving time (DST) correction of timestamps. Traditionally LabVIEW has always used the current time's DST status to convert from its internal UTC timestamp to local time and vice versa. In LabVIEW 7.0 they introduced the ability to account for the DST status of the actual timestamp. For reasons that no insider has confirmed to me, they decided however to keep the old behaviour for timestamps before January 1, 1970. There are two possible reasons I can think of (and they may actually both be true): 1) The OS support for determining whether a certain timestamp falls in DST is not available (on all LabVIEW platforms?) for timestamps prior to that date, and this behaviour was chosen to keep LabVIEW consistent across platforms. 2) Backwards compatibility: some toolkits (the Internet Toolkit for instance) convert 24 * 3600 seconds into a date to determine the current timezone offset including DST status, even though that timestamp is of course January 2, 1904 and therefore never could have DST. Rolf Kalbermatter
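     The OS limitation in reason 1 can be illustrated in C: localtime() reports the DST status of a given timestamp in its tm_isdst field, but timestamps before the Unix epoch are not portably supported (the timestamp values below are illustrative):

        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            /* Two timestamps in seconds since January 1, 1970 UTC: one in
               summer, one in winter, to compare the reported DST flags. */
            time_t summer = 1150000000;   /* June 2006 */
            time_t winter = 1136073600;   /* January 1, 2006 */
            printf("summer: tm_isdst = %d\n", localtime(&summer)->tm_isdst);
            printf("winter: tm_isdst = %d\n", localtime(&winter)->tm_isdst);

            /* Pre-1970 timestamps are negative; some C runtimes simply
               return NULL here, matching the limitation described above. */
            time_t old = (time_t)-86400;  /* December 31, 1969 */
            printf("pre-1970: %s\n", localtime(&old) ? "supported" : "not supported");
            return 0;
        }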
  5. I admit it is a sentiment, but it also has practical reasons. Copy protection has gotten in my way a few times when using legit software. If the copy protection decides to have its bad day when you are close to a deadline, you really are close to destroying company property by seeing how long the machine takes to fly from the second floor until it hits the pavement Rolf Kalbermatter
  6. Damn, I had completely missed this label! It finally makes my OS style user interfaces look perfect. Rolf Kalbermatter
  7. Also they need to use the C (cdecl) calling convention on both/all platforms. This is the only one that is supported on all platforms LabVIEW is available on. Sadly most third-party DLLs on Windows use Microsoft's stdcall, and then you will have to maintain two VI libraries anyhow. If I write DLLs I always make sure they use cdecl. This has worked great for the OpenG LabVIEW ZIP Tools library, which has one VI library and three shared libraries for the three OSes: Windows, Linux and Mac OS X. Rolf Kalbermatter
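     A minimal sketch of such a header (macro and function names are illustrative). Declared this way, the function uses cdecl on every platform, and a single VI library with the C calling convention selected in the Call Library Function Node works everywhere:

        /* mylib.h -- pin the calling convention to cdecl on all platforms */
        #ifdef _WIN32
          #define MYLIB_EXPORT __declspec(dllexport)
          #define MYLIB_CC     __cdecl   /* explicit, even if the project default is stdcall */
        #else
          #define MYLIB_EXPORT           /* cdecl is the default convention elsewhere */
          #define MYLIB_CC
        #endif

        MYLIB_EXPORT int MYLIB_CC mylib_add(int a, int b);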
  8. You are not very specific. By Intel based, do you mean some Pentium CPU board or an Intel ARM CPU? In the first case you would just install normal LabVIEW. I have looked into the Embedded Development Module but haven't found time to really do anything with it yet. Supporting a new target is definitely not just one day of work. Even if you can use an existing target and adapt it to your new one (which depends of course both on the CPU and on the actual OS you will use), you can expect to spend at least a week or more before you get your first VIs properly downloaded and running. For an entirely new target I would guess this can easily take months, depending on your familiarity with the actual tool chain you need to use for that target. Rolf Kalbermatter
  9. Hmm, I haven't seen free software that uses copy protection. :-) Not sure that means anything, though! If I have the choice between software that uses copy protection and software that doesn't, I will almost always choose the one without, independent of whether it is free or not. The only thing I hate more than copy protected software is software that has spyware or similar things in it. Rolf Kalbermatter
  10. So you want to send a keystroke to another application by placing it in the keyboard queue, and you honestly believe you have any chance to switch to the other application after having started the VI? Keystrokes are only sent to the active application! I would recommend trying this on a mechanical computer; on any modern computer your VI has long terminated before you can even think about switching to another application, let alone moving your hand to the mouse to activate that other application. Rolf Kalbermatter
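     For illustration, a minimal C sketch of the timing problem using the Win32 keybd_event() function; the deliberate delay is exactly the kind of pause you would need so that a human can activate the target application first:

        #include <windows.h>

        int main(void)
        {
            Sleep(5000);  /* five seconds to click on the target application;
                             without this, the keystroke lands in whatever
                             window currently has the focus */
            keybd_event('A', 0, 0, 0);                /* 'A' key down */
            keybd_event('A', 0, KEYEVENTF_KEYUP, 0);  /* 'A' key up   */
            return 0;
        }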
  11. Apart from some more or less well written VIs to access specific USB and similar hardware keys, I'm not aware of much in this direction. The problem as I see it is: how much do you want to spend to protect your software? LabVIEW is usually not used for high volume applications (its runtime licensing, while quite liberal nowadays, does not always lend itself well to this), and therefore a license protection that costs days and weeks of development time is not likely to recover its own costs. And license management is a trade of its own, with most of what is on the market nowadays being more of a solution to fend off the casual copier than a way to really thwart the determined hacker. Let's be honest: the only foolproof copy protection is to lock up your software in a safe, destroy any backup copies and throw away the key to the safe

     So what is it you want to prevent, and how much is it worth to you? If it is about preventing people from running your software who would never buy it anyway, then honestly every single dollar spent on copy protection is simply lost money. If it is about the fun of having copy protection built into your application, it's the same. Only if you can make a valid case that software will be bought thanks to copy protection can you start to look into spending money on this, if you want to think commercially.

     As for me, I'm much more likely to buy a LabVIEW add-on toolkit with an honest price that uses no copy protection than to use an overpriced toolkit illegally at all. And copy protected software always gets 10 minus points when I have to evaluate such software. I may still buy it, but then it needs to be a LOT better than its competition. Rolf Kalbermatter
  12. Data acquisition is not just hardware. The data needs to be transferred by a kernel driver into some intermediate buffer and from there into the application buffer. At least the last operation is executed in the context of an application thread and can involve scaling (DAQmx channels) and other things too. So there is certainly something a core could be doing, even though it is for a large part IO/memory bound and therefore not the best candidate to be parallelized over multiple cores or CPUs unless you had separate memory busses too.

     LabVIEW does not control the cores directly but instead uses OS threads. How the OS distributes threads onto multiple cores is almost completely out of LabVIEW's control and can actually be tweaked by a power user. So while multiple independent loops allow LabVIEW to distribute each loop to a different thread, it is usually very counterproductive to distribute related code over multiple threads, especially if dataflow dictates a certain execution order. Instead of just continuing execution of a logical dataflow, LabVIEW has to suspend a thread and wait for the correct thread to be activated by the OS to continue that dataflow. In addition to the costly execution context switch you are forcing onto a logical dataflow, you incur additional delays since the OS might decide to activate a different thread first than the one that would suit the dataflow best.

     However, if you have a single loop with subVIs you can assign different execution systems to these subVIs, and LabVIEW is forced that way to use a different thread. But please note that this will only really have a positive effect if you happen to have two different subVIs in the same loop that both do a computationally expensive operation. Without subVIs, parallelism in LabVIEW is limited. All built-in nodes that do not have some kind of wait or timeout function are executed synchronously and in the same thread as the diagram they are contained in, since their typical execution time is quite small in comparison to the context switch overhead of letting the OS switch between multiple threads.

     Most LabVIEW applications I write have anything from 2 to 6 or more parallel loops in the main diagram, although sometimes some or all of the loops are located in specific subVIs that are called by the main VI once and all terminate automatically when the application wants to shut down. This has never given me bad surprises (provided you do a proper design before you start the actual coding) and results in applications that do DAQ, analysis, logging, test sequence execution, image acquisition and instrument control all in parallel and still have an utterly responsive user interface. Rolf Kalbermatter
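     LabVIEW diagrams cannot be shown in plain text, but the "independent parallel loops" pattern maps roughly onto OS threads as in this C sketch (thread names and workloads are illustrative):

        #include <windows.h>

        static DWORD WINAPI AcquireLoop(LPVOID arg)   /* pretend DAQ loop */
        {
            int i;
            for (i = 0; i < 5; i++) Sleep(100);
            return 0;
        }

        static DWORD WINAPI AnalysisLoop(LPVOID arg)  /* pretend analysis loop */
        {
            int i;
            for (i = 0; i < 5; i++) Sleep(100);
            return 0;
        }

        int main(void)
        {
            /* Two unrelated loops run well on separate cores precisely because
               no data flows between them. If they shared data, every hand-over
               would force the kind of context switch described above. */
            HANDLE t[2];
            t[0] = CreateThread(NULL, 0, AcquireLoop, NULL, 0, NULL);
            t[1] = CreateThread(NULL, 0, AnalysisLoop, NULL, 0, NULL);
            WaitForMultipleObjects(2, t, TRUE, INFINITE);
            CloseHandle(t[0]);
            CloseHandle(t[1]);
            return 0;
        }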
  13. But it is already part of the save routine! That is why you get those error dialogs when saving VIs. Sometimes recompiling the VI (shift-run tool) helps, but often deleting the offending object is the only option. I haven't seen it often since about LabVIEW 6.1, but I can't really comment much on LabVIEW 8.0.x. It's possible that LabVIEW 8 introduced new ways to create insane objects. Rolf Kalbermatter
  14. All these things actually point in the same direction: somewhere in your COMPLEX UI VI there is code that resets these properties. It might be a property node that you added at some point and that fell off into an invisible area when you resized the event structure or some case, or some execution logic in your state machine that causes this at initialization. I've been there and done that too, and almost pulled my hair out over it before I realized my own fault. For invisible elements, open the error window under Window->Show Error List and check the Show Warnings checkbox, then go to the VI in question and click the warning triangle besides the run button. If it's about the execution logic of your VI, you will probably need to single step, or at least work through your initialization logic with breakpoints, until you see where exactly the property gets reset. Rolf Kalbermatter
  15. As a reference (it's actually mentioned somewhere in the online manuals too), a single precision number is only accurate to about 7 significant digits while a double precision number is accurate to about 15 significant digits. Rolf Kalbermatter
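     A small C check of those limits (FLT_DIG is formally 6 guaranteed decimal digits, which is where the "about 7" comes from):

        #include <stdio.h>
        #include <float.h>

        int main(void)
        {
            /* Guaranteed decimal digits of precision per the C standard. */
            printf("float:  %d significant digits\n", FLT_DIG);  /* typically 6  */
            printf("double: %d significant digits\n", DBL_DIG);  /* typically 15 */

            /* Beyond those digits the stored value is no longer reliable. */
            float f = 1.2345678f;
            printf("1.2345678 stored as float: %.9f\n", f);
            return 0;
        }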
  16. Then read the entire MSDN article pointed to in the previous post! It mentions that GetCursor only returns the application's cursor, while GetCursorInfo will give you the global cursor (at least if you are not using an archaeological Windows version). Rolf Kalbermatter
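     A minimal C sketch of that call (GetCursorInfo requires the cbSize member to be filled in first and needs Windows 98/2000 or later):

        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            CURSORINFO ci;
            ci.cbSize = sizeof(CURSORINFO);  /* mandatory before the call */
            if (GetCursorInfo(&ci))          /* global cursor, not just ours */
            {
                printf("visible: %s, position: (%ld, %ld)\n",
                       (ci.flags & CURSOR_SHOWING) ? "yes" : "no",
                       ci.ptScreenPos.x, ci.ptScreenPos.y);
            }
            return 0;
        }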
  17. I think there are such VIs on the NI site; try to locate them there. I haven't used them, but a colleague got them in the past. They were limited in the data types supported, but adding new data types didn't seem too difficult with a little digging. Rolf Kalbermatter
  18. I don't think LabVIEW will add support for modifying its toolbar anytime soon. But with the new splitter panes you can create your own very nice toolbars in LabVIEW 8 quite easily. Rolf Kalbermatter
  19. Wow, LabVIEW actually contains a reverse compiler. Now everyone who saves his VIs with a password or without a diagram at all is simply buggered ;-) Hmm, not sure how that differs from the first really. It still needs a reverse compiler, and I just so much like this new extraSuperSecretTool. Naaa! This last one is too simple. It can't possibly be this! ;-) Rolf Kalbermatter
  20. The latest news I heard is the same. NI seems to have dropped the idea of commercializing the Cubix platform, so it's a nice idea but will probably never be sold in that form. Maybe an opportunity for a third party: with the LabVIEW Embedded Development platform this would be very feasible, although I wonder about its commercial benefit if even NI doesn't see a chance to make it profitable. Rolf Kalbermatter
  21. This is only really true for modern hubs. Older dual-speed hubs frequently had trouble auto-negotiating the correct speed and/or handshaking too. Rolf Kalbermatter
  22. Changing the global cursor is not such a simple exercise. Basically Windows maintains a cursor state for each application and changes it accordingly whenever the cursor moves into a screen area owned by a different application. And in the case of LabVIEW it gets rather nasty even for the application cursor: LabVIEW has the habit of frequently resetting the cursor to what it thinks it should be. If you use LabVIEW 8, there are already functions in the palette to change the cursor state for at least LabVIEW itself. To change the global cursor consistently, I'm afraid you would end up having to write some hook function again. Rolf Kalbermatter
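     One documented way to change the global cursor without a hook is SetSystemCursor(), which replaces one of the system cursors until it is restored; a minimal sketch, assuming the goal is to show the hourglass everywhere:

        #define OEMRESOURCE        /* needed for the OCR_ constants */
        #include <windows.h>

        void SetGlobalWaitCursor(void)
        {
            /* SetSystemCursor destroys the cursor it is given, so pass a copy. */
            HCURSOR wait = CopyCursor(LoadCursor(NULL, IDC_WAIT));
            SetSystemCursor(wait, OCR_NORMAL);  /* replace the global arrow */
        }

        void RestoreGlobalCursor(void)
        {
            /* Reload the user's original cursors from the registry. */
            SystemParametersInfo(SPI_SETCURSORS, 0, NULL, 0);
        }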
  23. Maybe that would be a chance to learn C? It's still THE engineering programming language and won't go away anytime soon. Having some knowledge of C programming will always be an added bonus in almost any engineering profession where you even remotely need to get your hands on a computer. I'm not volunteering to write this thing for you, sorry. It's going to be a time consuming exercise in any case, and getting it to work exactly as you envision it will probably be a hard exercise for anyone other than yourself. Rolf Kalbermatter
  24. Maybe not. Most of the people at the NI marketing department were female back when I was in Austin some 14 years ago, and they were very nice folk to be around. Not sure if that has significantly changed. Rolf Kalbermatter
  25. The file IO APIs the Large File package uses are standard Windows 32-bit APIs. The only reason this might not work is that the underlying RT OS and its Win32 API implementation do not support files bigger than 2 GB at all. If that's the case, there is absolutely nothing you can do in the application itself, and maybe that is why HDF5 is not supported. Rolf Kalbermatter
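     The crucial part of those APIs is the 64-bit file offset; a minimal C sketch (the file name is illustrative) that fails cleanly on an OS that cannot seek past 2 GB:

        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            HANDLE h = CreateFileA("big.dat", GENERIC_READ | GENERIC_WRITE, 0,
                                   NULL, OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
            if (h == INVALID_HANDLE_VALUE)
                return 1;

            /* SetFilePointerEx takes a LARGE_INTEGER, so offsets beyond the
               2 GB limit work if, and only if, the OS supports them. */
            LARGE_INTEGER pos;
            pos.QuadPart = 3LL * 1024 * 1024 * 1024;  /* 3 GB */
            if (!SetFilePointerEx(h, pos, NULL, FILE_BEGIN))
                printf("64-bit seek failed: error %lu\n", GetLastError());

            CloseHandle(h);
            return 0;
        }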