Rolf Kalbermatter

Everything posted by Rolf Kalbermatter

  1. I think the main reason SVN is used so often is that it is easy to set up and to use. With many LabVIEW programmers working in small teams or even single-user scenarios, and having to install the version control tool themselves, that is simply a huge incentive. It's not because SVN is superior to those other version control systems (it's definitely not) but simply because it is easy to use (and easy to use wrong, too). It's definitely better than no version control at all, and for the little effort it takes to get working, an ideal tool for many. It has limitations, such as not really supporting branching and merging, and I have run many times into the situation of copying a controlled module to a different computer and making some modifications there (you usually can't install arbitrary software on a customer computer, and you also can't debug system hardware, which needs some specific interface hardware, from your own development computer), and afterwards trying to get this back into the controlled module on my system. Hg seems to support that much better, and while git on a command line is a nightmare to use, it certainly has very powerful features.
  2. Well, from the pic there is no obvious reason why it shouldn't work. But there is only so much a pic can say. It tells us nothing about the actual Call Library Node configuration, and even less about what the function expects to do with the buffer. This is, however, crucial information. The problem you pose to us is a bit like presenting us with a picture of a car in seemingly perfect condition and asking us why the engine doesn't work. Is there gas in the tank? Is the engine even in there? You see?
  3. Well, changing only the offsets into 64-bit integers won't work. You also need to change the calls to use the 64-bit versions of the functions. If they had simply changed the offsets of the existing functions to 64 bits, backwards compatibility would have been broken. A 64-bit integer is passed to a function in two 32-bit slots on the stack, so if you change all 32-bit offsets into 64-bit ones without also changing the functions, it would seem strange that you could even run an entire chain of VIs, as this mismatch should mess up the stack alignment. Although, thinking about it, the functions are all cdecl, which means the caller cleans up the stack afterwards, so it may work OK anyhow if there are no other parameters after the offset in the parameter list. Like all OpenG libraries, it's all on SourceForge.
  4. Actually, 64-bit support and support for >2GB ZIP archives are two distinct issues. I have in the current HEAD integrated a new version of zlib which supports 64-bit offsets internally, which is what is required to support archives containing components of more than 2GB. The same version also SHOULD be compilable as 64-bit code. While these two things happen to be incorporated in a working way in the current zlib library, they are not equivalent and not even really related. I have come across a snag in compiling the LVZIP DLL for 64 bit and haven't really spent much more time on this until now. Well, I can compile the code, but it crashes, and since I don't have a 64-bit system at hand with a compatible source-level debugger and LabVIEW installation (they both also need to be 64-bit), I can't debug it at the moment. The code as it is in the repository would, apart from that crash, be fully 64-bit capable and should also be able to handle >2GB components, although that is nothing I have ever tested nor thought to test so far. So the OpenG library as it is in the repository SHOULD be able to handle >2GB files, but that will also need refactoring of some VIs to allow for 64-bit integers wherever offsets are passed to and from the DLL functions (those 64-bit capable functions have a special 64 postfix).
  5. One possible problem might be that the DLL is a 32-bit DLL and you were using it in the 32-bit version of LabVIEW and now have upgraded to LabVIEW 2010 64-bit. LabVIEW can only call DLLs compiled for the same environment as itself. If this is the case you have two options: 1) Install LabVIEW 2010 32-bit, which will work fine on a 64-bit Windows version. 2) Recompile your DLL for Windows 64-bit (and review it for any bitness issues). This review would encompass things such as making sure the Call Library Node is configured right: Windows 64-bit obviously uses different pointer sizes, which could be a problem if you have configured pointer parameters to be treated as integers; using the pointer-sized integer type available since LabVIEW 8.5 would then be the right thing instead of explicit 32-bit or 64-bit integers. Also, there is a difference in the size of long: Windows always treats long as 32 bit, while everyone else treats long as 32 bit on 32-bit platforms and 64 bit on 64-bit platforms. Using long long instead in the C code avoids that problem, as it is always at least 64 bit, but it is a C99 feature, so it may cause problems with older compilers, and also with C++ compilers that don't support the upcoming C++0x standard and therefore don't recognize it.
  6. I fully agree! And for me VB has always been a nightmare to deal with. Apart from the syntax, which I always found fuzzy and unclear (coming from Pascal and Modula, which we learned in school), I HATED the separation of the code into separate methods for every event and whatnot. If I wanted to get an overview of a program, I had to either print it out or click through many methods to get even an idea of what the code does and how. I'm pretty sure the more modern versions of VB at least allow a different, fuller view of the code, but this, together with the syntax that always seemed rather unstructured and ad hoc from a design viewpoint, made me avoid VB like the devil. C# is a bit better in that respect, but after having dabbled in Java a bit I prefer Java over C#. The only thing I don't like about Java is the fact that it doesn't support unsigned integers and other "low-level" stuff. It makes interfacing to binary protocols, for instance, a bit more of a challenge, since you often have to deal with proper sign extension and that sort of thing.
  7. You will need to learn some C AND LabVIEW for sure. The example in the link from vugie provides all that is necessary, with only one difference, and that is the prototype of your function itself. If your knowledge is so limited that this is too much to do yourself, you will run into much more trouble along the way and should consider either trying to find a non-callback solution or hiring someone who does this part for you.
  8. Sorry, I had been working on another post and for some reason the link to the clipboard library must have stayed in my clipboard (how's that for the clipboard causing a problem with a link to a post about the clipboard?) despite my different intentions. It should be fixed now.
  9. No! Because ActiveX controls have traditionally not allowed overlapping at all. Maybe it would be theoretically possible nowadays, but most ActiveX controls simply assume that they are the one and only control drawing to the screen area they have been constrained to, and won't even bother otherwise, simply because at least in older ActiveX standards there was no good way to do overlapping. Maybe later versions added support for that, but since ActiveX is already an obsolete technology again, now that MS has come out with the next übercool technology called .Net, probably nobody bothered to look into that very much anyhow. The ActiveX container in LabVIEW certainly won't support overlapping controls, so even if ActiveX has provisions for that in its latest released version (which I actually still doubt) and you could find a control that supports it, it would make no difference when that control is used in LabVIEW. I don't know your VI sensor mapping example, but if overlapping controls are used there, then they most likely make use of the Picture Control with the 3D OpenGL-based drawing functionality. Look in your LabVIEW palette under Graphics & Sound->3D Picture Control. It's not high-performance 3D scene creation and rendering, and not at all a 3D wireframe graph, but it may be able to do what you want.
  10. Well, the OpenG pipe library has a System Exec equivalent that, besides the pipes for stdIO and optionally stdErr, also returns the resulting task ID, and then another function which can be used to kill that task. Works like a charm for me for starting up command-line tools that have no built-in remote control to quit them. The OpenG Pipe library has not been released as a package but can be found here as a built OGP package.
  11. I have to agree with Shaun. While the text may seem to indicate that the driver is MOSTLY thread safe, it isn't completely. The three mentioned calls need to be made from the same thread, and the only way to easily guarantee that in LabVIEW is by calling them in the UI thread. And incidentally, those are probably also the only functions where you could possibly gain some performance if you could call them multithreaded. So I would not bother to change the VI library at all.
  12. If the VI already crashes when it is opened, then something has been corrupted. It could be one of the subVIs it uses or the actual VI itself. Corruptions, while not very common, do happen, and the only way to deal with them is by backing up your work regularly. If the VI crashes as soon as you start it, then it may be calling some invalid method, or calling into a driver which has gotten corrupted. In the driver case, reinstalling the driver software might help.
  13. Mostly C and a little C++. Did some Python and Lua programming in the past. Having gotten an Android mobile device now I'm looking into some Java programming too :-)
  14. LabVIEW uses for many version-specific things a 4-byte integer which encodes the version in nibbles. And newer LabVIEW versions compress almost everything in the VI to save space and load time, but this also makes any internal information very hard to read.
  15. Yep, those are the nodes (note: not VIs). I'm aware that they won't help with UI display but only with reading and writing UTF8 files or any other UTF8 data stream in or out of LabVIEW. Display is quite a different beast, and I'm sure there are some people in the LabVIEW development department biting their nails and ripping out their hair trying to get that working well.
  16. Not sure why you say that. Just because it is not all obfuscated or even binarised doesn't mean it is a joke. The OGP format was the first format devised; Jim, I, and I think someone else whose name I can't recall right now came up with it after I had looked at zlib and the according ZIP library and decided to create a LabVIEW library to deal with ZIP formats. The idea for the spec file format I did in fact derive from the old application builder configuration format, and it proved flexible, yet simple enough to deal with the task. Jim came up with the idea to put the spec file inside the archive under a specific name, similar to how some Linux package managers did it back then. The rest was mostly plumbing and making it work, and it is still in principle the same format as in the beginning. The VIPC is just another similar format, to track down packages more easily. JKI goes to great lengths to obfuscate the software in VIPM, but they are to be applauded for not having gone down the path of obfuscating the actual file formats.
  17. While the nodes I spoke about probably make calls to the Windows API functions under Windows, they are native nodes (light yellow) and supposedly call, on other platforms, the according platform API for dealing with Unicode (UTF8 I believe) to ANSI conversion and vice versa. The only platforms where I'm pretty sure they either won't even load, or if they do will likely be NOPs, are some of the RT and embedded platforms. Possible fun can arise out of the situation that the Unicode tables used on Windows are not exactly the same as on other platforms, since Windows has slightly diverged from the current Unicode tables. This is mostly apparent in collation, which influences things like the sort order of characters, but it might not be a problem in the pure conversion. This, however, makes one more difficulty with full LabVIEW Unicode support visible. It's not just about displaying and storing Unicode strings, UTF8 or otherwise, but also about many internal functions such as sort, search etc., which would have to have proper Unicode support too. Because of the differences in Unicode tables, those would either end up having slightly different behavior on different platforms, or NI would need to incorporate a full-blown Unicode library such as ICU into LabVIEW to make sure all LabVIEW versions behave the same; but then they would behave differently from the native libraries on some systems.
  18. Yes, you can't avoid a buffer copy, but I wasn't sure if your buffer copy VIs did just a single buffer copy or if your first VI does some copying too. The main reason to do it this way is simply to keep as much of the low-level stuff in the DLL as possible, especially since you have that DLL already anyhow. And to be honest, I believe your original solution could have a bug. You pass in a structure that is on the stack. The user event, however, is processed asynchronously, so at the time the event is read, the stack location may long have been invalidated and possibly overwritten with other stuff. Bad, bad! Especially since both pieces of information relate to a memory buffer. And declaring the variable static is also not a solution, since a new event could be generated before the previous one is processed. Note: it may work if PostLVUserEvent makes a copy of the data, leaving ownership of the data object to its caller. It could do that, since the event refnum internally stores the typecode of the data object associated with it, but I'm not 100% sure it does so.
  19. If you changed the user event to be a string instead of the cluster, you could change the code in the callback like this (note that the handle must be passed by reference to NumericArrayResize, and its address passed to PostLVUserEvent):

      void iCubeCallback(char *pBuffer, int iBufferSize, LVUserEventRef *eventRef)
      {
          LStrHandle handle = NULL;
          /* allocate a LabVIEW string handle of the right size */
          MgErr err = NumericArrayResize(uB, 1, (UHandle*)&handle, iBufferSize);
          if (noErr == err)
          {
              MoveBlock(pBuffer, LStrBuf(*handle), iBufferSize);
              LStrLen(*handle) = iBufferSize;
              PostLVUserEvent(*eventRef, (void *)&handle);
          }
      }

  This way you receive the data string directly in the event structure and don't need to invoke buffer copy functions there. No, not really. Supposedly there would probably be some functions in the latest LabVIEW versions, but without at least some form of header file available it's simply too much trial and crash.
  20. No, you need LabVIEW 2010 for this! But I wouldn't expect too much of it. Creating a DLL in LabVIEW and using it from the C environment is certainly going to be an easier solution.
  21. While this would likely work, I consider it a rather complicated solution which makes the whole deployment of the library more complicated. Writing a wrapper interface for the DLL (to add the extra error cluster parameter to all functions) is indeed not a trivial exercise (mainly in terms of time, not so much in terms of complication). Especially with open source, it also poses the question of whether you want to modify the original sources or add a separate wrapper. However, this is work that has to be done once, at creation time, and won't bother the end user with complicated setups and non-intuitive usage limitations. If it is only a solution for a build-once-and-forget application, then your approach is feasible; otherwise I would definitely try to invest a little more time upfront in order to make it more usable in the long term.
  22. I did test it but didn't make extensive tests. It seemed to work for any combination of OpenG compress -> ALZIP / 7-ZIP uncompress and vice versa, as well as compress and uncompress within OpenG. This was enough for me to provide the parameter on the VI interface. I can't exclude the possibility that the generated password might not be compatible with certain other ZIP utilities, and I also stopped using ALZIP a year or so ago, since they were getting worse and worse with advertisements and eventually tried to push on users an upgrade to the once-free version that was only a trial and had to be purchased to continue using it.
  23. Maybe a restart was necessary. The LabVIEW menu palettes are a bit tricky, and while VIPM should attempt to synchronize those palettes after an install, something may have prevented the synchronization from working. A restart of LabVIEW will solve that.
  24. This won't work! While the Ledato runs Debian, which is a Linux variant, it also uses an ARM processor (which is the main reason it can be so cheap). LabVIEW for Linux is exclusively x86 (32-bit). So the only way to get LabVIEW working on there is to buy the LabVIEW Embedded for ARM Toolkit and create the according interface to integrate the Ledato toolchain. An interesting project, but not a trivial one, and certainly not cheap when you look at the price of the LabVIEW Embedded Toolkit.
  25. There used to be a library somewhere on the dark side that contained them. It was very much like my unicode.llb that I posted years ago, which called the Windows WideCharToMultiByte and friends APIs to do the conversion, but it also had extra VIs that were using those nodes. And for some reason there was no password, even though they usually protect such undocumented functions strictly. I'll try to see if I can find something either on the fora or somewhere on my HD. Otherwise, using scripting, possibly together with one of the secret INI keys, allows one to create LabVIEW nodes too, and these two nodes show up in the list of nodes as well.