
Rolf Kalbermatter

Members
  • Posts

    3,778
  • Joined

  • Last visited

  • Days Won

    243

Everything posted by Rolf Kalbermatter

  1. QUOTE(NateO @ Jan 10 2008, 04:34 PM) While VxWorks is used for the NI-RT targets such as the newer FieldPoint modules or cRIO controllers, I do not think you can just install any VxWorks OS and then use it from LabVIEW RT. The RT OS on those targets has definitely been customized to allow LabVIEW RT to work with them directly. The option that is more likely to work is the Pharlap OS. NI sells the LabVIEW Real-Time Deployment License for Standard PCs (ETS RTOS), which is based on this OS. With that you have official support to install a LabVIEW RT supported RTOS on your own computer hardware, although there are some specific requirements, for instance for the network interface. The only other solution I see, if you want to use LabVIEW, is LabVIEW Embedded. But there you have to customize the environment to interface to the tool chain for your target system if you don't happen to use a target that is already supported, and I'm not aware of an existing PC/104 target. Rolf Kalbermatter
  2. QUOTE(guruthilak@yahoo.com @ Jan 14 2008, 01:29 AM) You can, but it is not a magic wand that makes your programs go faster or work better just like that. In fact it is likely to make them much, much slower. Request Deallocation tells LabVIEW to deallocate all memory that is not currently in use. That will deallocate memory that an idle VI may need again 1 ms later to do its work, so it has to allocate it again instead of being able to reuse it. Request Deallocation only makes sense when you have a VI that uses lots and lots of memory and only runs once, or maybe once every two weeks, in the lifetime of your application. Doing a Request Deallocation right after execution of that VI may (or may not) improve the memory consumption of your application. Otherwise this function causes more trouble than it solves, which is why it isn't made available very prominently. Rolf Kalbermatter
  3. QUOTE(guruthilak@yahoo.com @ Jan 7 2008, 11:16 PM) The only people who really know about this are in the LabVIEW development team at NI corporate. And no, that license is not for sale! You need a very strong case and good connections before anyone at NI will even acknowledge that such a possibility exists. I would expect maybe three or so Alliance members worldwide are allowed to use that feature. And no, we are not one of them; I use the other methods mentioned elsewhere on this forum for the few times I happen to have time to dig deeper into this. Rolf Kalbermatter
  4. QUOTE(BrokenArrow @ Jan 7 2008, 07:49 PM) VI Analyzer is a few LabVIEW versions old now, and its analysis might have been correct when it was implemented. LabVIEW does make optimization improvements with every new version. That said, Build Array is not a complete no-no. Sometimes there is no way around it, or it sits in a loop that executes only a few times. But it should always be checked and double checked if you do not want an application that gives you way too much coffee time when it is run (a rough C sketch at the end of this list illustrates why growing an array in a loop gets expensive). Rolf Kalbermatter
  5. QUOTE(vituning @ Jan 4 2008, 11:45 AM) You can't do that in LabVIEW like that. A byte array (and a string) in LabVIEW is something very different from a C pointer. When you pass such an element to the Call Library Node and configure the parameter as a C pointer, LabVIEW will do the translation for you, but inside a cluster is something LabVIEW won't do (and in fact can't do without a configuration dialog that would intimidate anyone but the most diehard C freaks). You would need to treat it as a uInt32 and then do the pointer-to-array-data conversion and back on the LabVIEW diagram, using additional Call Library Nodes that call into external functions to copy the data in and out (a rough C sketch at the end of this list shows one way to wrap such a call). As to this function, it is not clear whether the caller is supposed to allocate the necessary pointers with the required size or whether the function will allocate them. In the first case you would need to call even more external functions to be able to do that; in the second case the driver would need to provide a function to deallocate those pointers again. All in all a real pain, and this particular API is very unfriendly to callers in general! It's even hard to call in C and simply a pain in any other environment. Rolf Kalbermatter
  6. QUOTE(crelf @ Jan 2 2008, 02:35 AM) Very nice Tomi! Congratulations on your insightful and interesting posts here on LAVA. Looking forward to more of them! Rolf Kalbermatter
  7. QUOTE(Kyuubi™ @ Jan 2 2008, 01:46 AM) How about searching through the examples under Help->Find Examples? That gives you a simple example using the splitter bar! Rolf Kalbermatter
  8. QUOTE(aart-jan @ Jan 1 2008, 07:57 PM) If you just want to encode the data after it has been acquired entirely, the command line tool approach should work. For longer sound acquisitions this may not be desirable, and in that case I would simply create an interface to one of the existing open source compression libraries, such as LAME, through the Call Library Node (a rough C sketch at the end of this list shows what such a wrapper could look like). But there is one issue with MP3 that Ogg probably wouldn't have, and that is the fact that some companies try to squeeze money out of every use of MP3 encoding software. So your application, while working fine technically, may have legal issues. Probably not an issue for a home grown hobby application, but certainly a problem for a professional application. I'm not sure about the "run until completion" = true problem, though. I haven't seen this yet, and I do make use of System Exec regularly. Rolf Kalbermatter
  9. QUOTE(tcplomp @ Dec 29 2007, 01:09 PM) It's complete. ASCII doesn't define more than that. The rest are extended characters that depend on the currently set locale, the font used, etc. Rolf Kalbermatter
  10. QUOTE(psi @ Dec 29 2007, 05:47 PM) Hmm, you will have to buy a runtime license per system, and that is about 500 Euro in single quantities. Interesting, yes; not expensive only as long as you are not planning to make an embedded device, although I'm sure you can negotiate volume license prices. QUOTE LabVIEW Real-Time supports not only NI hardware devices; LabVIEW Real-Time can support third-party devices. Can I Use a Third-Party Device with LabVIEW Real-Time (RT)? Third-Party Compatibility Configuring the NI Real-Time Environment and NI-VISA to Recognize a Third Party Device Porting a Windows Device Driver to the NI Real-Time Platform Yes, for desktop PCs, but not for arbitrary embedded development systems. For some reason I was assuming that embedded was the route the OP wanted to go. Rolf Kalbermatter
  11. QUOTE(psi @ Dec 28 2007, 05:51 PM) But LabVIEW Real-Time only supports NI hardware targets (some of them use the Pharlap OS and the newer ones with a PPC CPU use VxWorks) and somewhat specific x86 hardware with Pharlap ETS. If the OP wants to use NI hardware targets or can live with a more or less standard PC hardware platform, then LabVIEW Real-Time would of course be best. Rolf Kalbermatter
  12. QUOTE(Dr. Dmitrij Volkov @ Dec 28 2007, 09:07 AM) The equivalent of DLLs in VxWorks are .out files. They are basically shared libraries created with either the VxWorks development system or the free GCC compiler. I've only created one of them so far, using the freely downloadable GCC version for VxWorks, to create a shared library for NI's newer real-time controllers (a minimal C example at the end of this list shows roughly what that looks like). But I'm not sure what you are trying to do here. LabVIEW for VxWorks is only available for the NI real-time target systems. For your own VxWorks based systems you would have to go the LabVIEW Embedded route. Not very cheap, but maybe you are doing that already. Rolf Kalbermatter
  13. QUOTE(Yen @ Dec 21 2007, 07:59 AM) Ah well, Heinlein :thumbup: . Certainly an interesting author. Although I wouldn't share all of his visions, and he seemed to have some militant ideas at times, he also had a very impressive way of describing environments and social interactions that were strange and bewildering, while at the same time still somehow believable. My first contact with him was by stumbling more or less by accident over "Stranger in a Strange Land", and it took me quite some time, and writing an essay about this book for school, to see its finer points. And it changed the way I looked at many things remarkably. Rolf Kalbermatter
  14. QUOTE(Yen @ Dec 19 2007, 03:15 PM) Well, Yen, I think you might be right that the built-in nodes may not really have a string name in the palette menu, but only a resource identifier that points back into LabVIEW (or one of the .rsc files in the resource directory). It's a long time since I looked into this. I'll see what I can do in terms of getting the resource information into LabVIEW in a more user friendly way. But I need to search for that stuff and most probably clean it up a bit. Rolf Kalbermatter
  15. QUOTE(thomas2609 @ Dec 19 2007, 09:10 AM) That should give you exactly the same time that is displayed in a LabVIEW timestamp set not to use UTC!! So I'm not really sure what the benefit of that function would be. Rolf Kalbermatter
  16. QUOTE(tcplomp @ Dec 19 2007, 07:44 AM) Indeed, I concentrated on the technical ability to do what TiT asked for, but in hindsight I would be interested to know what he wants to fix. LabVIEW 8 gives you good control over whether you want to show UTC or local time, and since LabVIEW 7 the date/time conversion to local time considers the time zone that was active at the time the timestamp represents, rather than the current one. There are potential issues with the recent dynamic daylight saving time periods, caused by massive adjustments of those periods all over the world. In Windows, at least, only Vista allows for such dynamic daylight saving time periods. Older Windows versions will always use the current daylight saving period to calculate the local time for any timestamp, even for times when another period was valid. I would expect these things to be even a bit less perfect on RT targets, but there you usually shouldn't need to display local time anywhere. Instead, just save the numeric timestamp, which has always been UTC for as long as LabVIEW has been multiplatform, and then interpret it in the desktop application accordingly. Rolf Kalbermatter
  17. QUOTE(Yen @ Nov 8 2007, 01:06 PM) If only I had more time! I hacked into these files once, and once you understand the Macintosh-like resource format basics (the same format used for VIs, LLBs and most other pre-XML LabVIEW file formats) it's not that difficult to get at the info. The problem is that my hacks never really got beyond the proof-of-concept state, with the exception of the VI library shell extension, but that was written entirely in C. It reminds me, though, that LabVIEW itself exports a resource manager API, and it should not be too difficult to target that one with a small VI library. I think that is in fact what the palette API does too. From there, figuring out the structure of the palette resources won't be too difficult anymore. Rolf Kalbermatter
  18. QUOTE(Aristos Queue @ Dec 18 2007, 06:52 PM) I do pin them down occasionally and intend to keep them there. But I also have a habit of closing as many windows as possible so automatically that, whenever I need that damn pinned palette again, it is already gone. Rolf Kalbermatter
  19. QUOTE(TiT @ Dec 19 2007, 05:49 AM) Windows API: Get/SetTimeZoneInformation http://msdn2.microsoft.com/en-us/library/ms724944(VS.85).aspx But it's not a trivial API, and you potentially have to do some extra work, such as getting a SYSTEMTIME structure filled in correctly, etc. The one thing you can't avoid is that you need to adjust the application privileges before you can change the system time information. Because of that, it would be best to write an external code DLL that does all of this in a neat and proper way and import it as a single function into LabVIEW (a rough C sketch at the end of this list shows the idea). The description of the TIME_ZONE_INFORMATION structure states that to disable daylight saving time you need to set the month value in both date structures to 0, among a few other conditions. Rolf Kalbermatter
  20. QUOTE(tcplomp @ Dec 15 2007, 02:30 AM) Well, not entirely true. You could download the LabVIEW evaluation version at that time and install it over the old one, preserving your license file, and everything was fine. The only difference between the upgrade and a full install was in fact the preservation of already configured things such as the license file. Installing that version and activating it with the LabVIEW 8.2 serial number worked too, since they were both 8.2 as far as the license manager is concerned. Unfortunately they pulled the 8.2.1 evaluation download and replaced it with the most recent one for 8.5, and there a non-active SSP will render your 8.2 serial number invalid for this installation. Also, 8.5 is in quite a few respects more of a regression from 8.2.1 than anything else. But 8.5.1 is coming out sometime next year, so if you upgrade now with SSP you get that upgrade automatically. Rolf Kalbermatter
  21. QUOTE(Cool-LV @ Dec 13 2007, 08:29 PM) Don't do that! This source code is protected by copyright laws and NDAs. Either you got it as an illegal copy from the internet, and then you are not allowed to use it or look at it, or you got it under an NDA, which you would just have breached by making it available for download. Rolf Kalbermatter
  22. QUOTE(M.M.K @ Dec 13 2007, 05:49 AM) Simple office network with DHCP. Obviously, with a crossover cable DHCP won't work, but you must have found that out already, since otherwise ping couldn't have worked either. Rolf Kalbermatter
  23. QUOTE(LV Punk @ Dec 12 2007, 10:37 AM) Yes, but probably for analyzing their C source code, not for including that technology in a LabVIEW analyzer tool! That would be nice, but I think the complexity of such a task would be very high. Coverity has been getting some publicity this year by offering their services for free to some open source projects. But they did have some problems in getting those code scans performed regularly; apparently they did not have enough staff to do regular code base scans for all the projects they had offered that service to. Rolf Kalbermatter
  24. QUOTE(jbrohan @ Aug 1 2005, 07:43 PM) Your DST calculation might be a bit flawed. Before LabVIEW 7.0, LabVIEW always used the current time to evaluate whether a timestamp had to be adjusted for DST. So if you run a VI using that in the summer, it will return a different DST status and time zone offset for a specific timestamp (other than the current one) than when you execute it in the winter. In LabVIEW 7.0 I think they remedied that by taking the DST status that was in effect at the time the timestamp itself represents, except for timestamps before 1970 or something like that. For those it still uses the current time, independent of the DST status that was in effect at the time represented by the timestamp itself (a small C example at the end of this list shows the difference between the two approaches). Rolf Kalbermatter
  25. QUOTE(Jim Kring @ Feb 23 2007, 04:49 PM) I think you wanted to say absolute times here. QUOTE(JFM @ Feb 27 2007, 09:18 AM) Thanks for all your comments. I'm not arguing that it should be possible to add two absolute times; what I'm trying to say is that I would like the timestamp data type to be able to contain relative values as well. Then the addition/subtraction nodes would make sense. Why should the TS be limited to absolute times, when internally it is in fact only a relative time (but displayed as absolute time)? Trying to force your way here? Since you want the addition to work on timestamps, they should be made to support relative times? But the timestamp was invented precisely to be able to properly distinguish between absolute and relative times. Before the timestamp data type was available, an absolute time was simply a floating point number too (well, very early in LabVIEW it was in fact an unsigned 32-bit integer, but that got slowly remedied in later LabVIEW versions). This gave problems, as it was not really clear whether a value was supposed to be an absolute time or not. So the timestamp was added for exactly that reason: to have a unique data type that could represent absolute time, so that LabVIEW could make smart decisions about how to operate on it. The disallowance of the addition operator on timestamps is one of those smart decisions. QUOTE Another point is that the TimeStamp (TS) data type is not a floating point value (as far as I understand it), but rather a more accurate representation. It is in fact a fixed point representation: 64 bits for the integer part and 64 bits for the fractional part. As such it is obviously at least as accurate as a double precision floating point number for integer values, but it can't represent the same range as a floating point number. While a floating point number can represent numbers from ~ -10^308 to 10^308 seconds, the timestamp can only go from ~ -10^19 to 10^19 seconds (about +-3*10^11 years relative to 1904). The difference, however, is that the integer part of the timestamp still has a resolution of 1 second when representing such a number, while the floating point number has a resolution of ~2*10^292 seconds when representing its maximum value. In fact, a 64-bit floating point number loses sub-second resolution as soon as it goes over ~ +-4.5*10^15 seconds (only ~1.4*10^8 years). The additional 64 bits of a timestamp for the fractional part allow for a total resolution of 2^-64 seconds, which amounts to ~5*10^-20 seconds. Not sure if quantum theory allows for such a small time interval at all, but I think LabVIEW 8 only used 32 of those 64 fractional bits anyhow. Still, all this number theory is really not that interesting, and if it were only about range and resolution, a double precision floating point number would have worked for a few more thousand years for sure, as long as you don't require sub-millisecond resolution (the figures are spelled out in a small C example at the end of this list). But the additional benefit of a distinct data type really made the timestamp extra interesting. The only complaint I have is that they didn't think about extra features, such as allowing time zone information to be encoded in the timestamp too. I guess reserving 16 of the 64 fractional bits for this wouldn't have made the timestamp less useful. After all, a resolution of 3.5*10^-15 seconds for a timestamp (around 3 femtoseconds) still seems well above any possible need, even for hyper physics. Rolf Kalbermatter
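
Regarding the Build Array remark in point 4: here is a rough C illustration, not LabVIEW code, of why growing an array element by element inside a loop can get expensive. Growing the buffer one element at a time may reallocate and copy on every iteration, while allocating once up front keeps the loop cheap. It is only an analogy for what Build Array in a loop can cost; error handling is omitted for brevity.

```c
#include <stdlib.h>

/* Grow the array on every iteration: each realloc may move and copy the
   whole buffer, which is roughly what Build Array in a loop amounts to. */
double *build_by_growing(int n)
{
    double *arr = NULL;
    for (int i = 0; i < n; i++)
    {
        arr = realloc(arr, (size_t)(i + 1) * sizeof(double));
        arr[i] = (double)i;
    }
    return arr;
}

/* Allocate once and fill in place: the equivalent of initializing the
   array before the loop and replacing elements inside it. */
double *build_preallocated(int n)
{
    double *arr = malloc((size_t)n * sizeof(double));
    for (int i = 0; i < n; i++)
        arr[i] = (double)i;
    return arr;
}
```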
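
For the driver structure question in point 5: rather than juggling raw pointers on the diagram, one alternative is a thin wrapper DLL that takes Call Library Node friendly parameters and keeps the pointer bookkeeping in C. The structure and function names below are hypothetical placeholders, since the real driver API from the original question is not reproduced here; this is a sketch of the approach, not the actual driver interface.

```c
#include <stddef.h>

/* Hypothetical driver structure containing an embedded buffer pointer. */
typedef struct
{
    unsigned char *data;    /* buffer pointer the driver expects        */
    int            length;  /* number of bytes available in that buffer */
} DrvBuffer;

/* Placeholder declaration for the real driver function. */
extern int driver_read(DrvBuffer *buf);

/* Exported wrapper: LabVIEW passes a preallocated byte array and its size
   through the Call Library Node; the struct and its pointer stay in C.  */
__declspec(dllexport) int WrapDriverRead(unsigned char *data, int length)
{
    DrvBuffer buf;
    buf.data   = data;     /* array data pointer as handed over by LabVIEW */
    buf.length = length;
    return driver_read(&buf);
}
```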
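
For the MP3 encoding idea in point 8: a wrapper around the open source LAME library might look roughly like the sketch below. The function names follow the public LAME API (lame.h) as I recall it; treat the exact calls, settings and error handling as assumptions to verify against the LAME version you actually build against. A real wrapper would also keep the encoder alive between calls instead of recreating it per block.

```c
#include <lame/lame.h>

/* Encode one block of interleaved 16-bit stereo PCM into mp3buf and return
   the number of MP3 bytes written (negative on error). */
__declspec(dllexport) int EncodeBlock(short *pcm, int samplesPerChannel,
                                      int sampleRate, unsigned char *mp3buf,
                                      int mp3bufSize)
{
    lame_global_flags *gf = lame_init();
    int written;

    if (!gf)
        return -1;
    lame_set_num_channels(gf, 2);
    lame_set_in_samplerate(gf, sampleRate);
    lame_init_params(gf);

    written = lame_encode_buffer_interleaved(gf, pcm, samplesPerChannel,
                                             mp3buf, mp3bufSize);
    if (written >= 0)
        written += lame_encode_flush(gf, mp3buf + written,
                                     mp3bufSize - written);
    lame_close(gf);
    return written;
}
```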
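
For the VxWorks .out question in point 12: the C source of such a loadable module is nothing special, the work is in the cross compilation. The compiler name and flags below are assumptions based on the PowerPC GCC toolchain used for the NI RT controllers and depend on your installation.

```c
/* minimal.c - a trivial function exported from a VxWorks loadable module.
   Cross compile it into a relocatable .out file, for example with the
   PowerPC GNU toolchain (names and flags depend on your setup):

       ccppc -mlongcall -c minimal.c -o minimal.out

   The resulting minimal.out can then be deployed to the RT target and its
   symbols called through the Call Library Node. */
int AddTwoNumbers(int a, int b)
{
    return a + b;
}
```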
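
For the time zone question in point 19: a minimal sketch of such an external code DLL could look like the following. It follows the approach described in that post (enable the required privilege first, then clear the DST transition dates); the privilege name and the exact fields to clear should be double checked against the MSDN page linked there.

```c
#include <windows.h>

/* Enable a named privilege in the current process token. */
static BOOL EnablePrivilege(LPCTSTR name)
{
    HANDLE token;
    TOKEN_PRIVILEGES tp;
    BOOL ok;

    if (!OpenProcessToken(GetCurrentProcess(),
                          TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, &token))
        return FALSE;
    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    if (!LookupPrivilegeValue(NULL, name, &tp.Privileges[0].Luid))
    {
        CloseHandle(token);
        return FALSE;
    }
    ok = AdjustTokenPrivileges(token, FALSE, &tp, 0, NULL, NULL) &&
         GetLastError() == ERROR_SUCCESS;
    CloseHandle(token);
    return ok;
}

/* Read the current time zone settings, clear the DST transition dates
   (month = 0) and write them back. Exported so LabVIEW can import it as
   a single Call Library Node function. */
__declspec(dllexport) int DisableDaylightSaving(void)
{
    TIME_ZONE_INFORMATION tzi;

    if (GetTimeZoneInformation(&tzi) == TIME_ZONE_ID_INVALID)
        return -1;
    ZeroMemory(&tzi.StandardDate, sizeof(tzi.StandardDate));
    ZeroMemory(&tzi.DaylightDate, sizeof(tzi.DaylightDate));
    tzi.DaylightBias = 0;
    /* SE_SYSTEMTIME_NAME applies to the Windows versions current at the
       time of this post; Vista introduced a separate time zone privilege. */
    if (!EnablePrivilege(SE_SYSTEMTIME_NAME))
        return -2;
    return SetTimeZoneInformation(&tzi) ? 0 : -3;
}
```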
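
For the DST discussion in point 24: here is a plain C illustration, not LabVIEW code, of the difference between asking for the DST status at the moment a timestamp represents versus at the moment the code runs. Note that on the older Windows versions discussed above, the C runtime itself only knows the current DST rule, so historic dates can still come out wrong there.

```c
#include <stdio.h>
#include <time.h>

/* Return nonzero if DST applies at the moment 'when' represents,
   as determined by the C library's local time conversion. */
static int DstActiveAt(time_t when)
{
    struct tm local = *localtime(&when);
    return local.tm_isdst > 0;
}

int main(void)
{
    time_t now = time(NULL);
    time_t halfYearAgo = now - 182L * 24 * 3600;

    /* The pre-7.0 behaviour corresponds to always using DstActiveAt(now),
       even when formatting 'halfYearAgo'. */
    printf("DST now:             %d\n", DstActiveAt(now));
    printf("DST half a year ago: %d\n", DstActiveAt(halfYearAgo));
    return 0;
}
```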
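
And for the number crunching in point 25: this little C program just recomputes the figures quoted there, namely the range and resolution of a 64.64 bit fixed point timestamp and the point where a 64-bit double stops resolving fractions of a second.

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    double secondsPerYear = 365.25 * 86400.0;
    double tsRange        = ldexp(1.0, 63);   /* +-2^63 s ~ 9.2e18 s  */
    double tsResolution   = ldexp(1.0, -64);  /* 2^-64 s  ~ 5.4e-20 s */
    double dblSubSecLimit = ldexp(1.0, 52);   /* beyond 2^52 s a double
                                                 (53-bit mantissa) can no
                                                 longer resolve sub-second
                                                 steps */

    printf("timestamp range:         +-%.3g s (~%.3g years)\n",
           tsRange, tsRange / secondsPerYear);
    printf("timestamp resolution:      %.3g s\n", tsResolution);
    printf("double sub-second limit:   %.3g s (~%.3g years)\n",
           dblSubSecLimit, dblSubSecLimit / secondsPerYear);
    return 0;
}
```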