Everything posted by Rolf Kalbermatter

  1. From your post for the DCG in the code repository I take it that you are talking here about the THThreadCreate() and friends C API. I'm afraid there is not really a way to make the IDE aware of such a thread in any way. These functions are only thin wrappers around the platform thread management functions (CreateThread on Windows, pthread_create on Linux, etc.). As such they are used by the LabVIEW kernel to manage threads in a way that makes the actual kernel almost completely independent of the underlying platform API, but they are on such a low level that the IDE is not aware of them unless it created them itself. Basically, calling any of these LabVIEW manager functions (memory, file, thread, etc.) is more or less equal to calling the underlying system API directly, but with the advantage that your C code doesn't have to worry about different APIs when you try to compile it for another LabVIEW target like Linux or Mac OS X. If you only want to work on Windows, calling CreateThread() directly is actually the more direct and simpler way of doing this. What is the actual reason you want the IDE to be aware of your created threads?
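Just to illustrate that last point, a minimal sketch of the direct Win32 approach (Windows only; ThreadProc and StartWorker are simply placeholder names for this example):

    #include <windows.h>

    /* Placeholder worker routine; the real background work goes here. */
    static DWORD WINAPI ThreadProc(LPVOID param)
    {
        /* ... do the background work ... */
        return 0;
    }

    int StartWorker(void)
    {
        HANDLE hThread = CreateThread(NULL, 0, ThreadProc, NULL, 0, NULL);
        if (hThread == NULL)
            return -1;            /* thread creation failed */
        CloseHandle(hThread);     /* the thread keeps running without the handle */
        return 0;
    }

A thread created like this is of course just as invisible to the IDE as one created through THThreadCreate(); the direct Win32 call only buys you simplicity, not IDE awareness.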
  2. Wow, that graph certainly doesn't look very logical. There is absolutely nothing in the LabVIEW world which would explain the huge activity increase in 2011 and 2013. And your suspicion of spam activity is probably well founded. It seems like somewhere in 2012 someone started to do some tests to launch a huge spam attack (most likely on many other forums too) in 2013 and then towards the end of the year ramped it up in one last huge effort to try to make it a success, definitely killing the "business model" with that. The interesting data is likely more in the baseline, where you can see that a somewhat steady number dropped to virtually 0 after the first spam attack and is nowadays just barely above 0, which would indeed be significantly lower than at the start of the graph in 2010.
  3. It's a general tendency I have noticed on LabVIEW forums, although maybe less so on the NI forums. I also signed up at the German LabVIEW forum, and while 5 or 6 years ago you had several new topics every day, varying from the newbie question of how to do a simple file IO operation to the advanced topic of system architecture and interfacing to external code, nowadays it is maybe a tenth of that, with most topics falling in the more trivial category. The German forum had and has a somewhat broader target range since it was equally meant for beginners and advanced users, while LAVA started by the nature of its name as a forum for somewhat more advanced topics, although I would like to think that we never really gave a newbie a bad feeling when posting here, as long as that person didn't violate just about every possible rule of netiquette there is. The German forum may have one additional effect that may contribute to it getting less traffic, and that is that English has become the de facto standard in at least engineering in Germany too. But without having any numbers to really compare, I would say that the German forum and LAVA have seen a similar decline in the number of new posts and answers in general.
And yes, I have been wondering about that. Where did those people go? I feel that some went to the NI forums as those got more accessible over time, but I do think that a more important aspect is that LabVIEW has become in many places just one of many tools to tackle software problems, whereas in the past it was more often THE tool in the toolbox of many developers. That is probably a somewhat jaded view from personal experience, but I certainly see it in other places too when I get there during my job. And Shaun definitely addresses another point when he mentions that LabVIEW innovation has slowly been getting to the point of stagnation in the last 5 to 10 years. That would hurt specialized forums like LAVA, or a local forum like the German one, most likely a lot more than the NI forum, which catches most of the more trivial user questions of how to get something done or about perceived or real bugs. I'm not sure to what extent the NI forum has been seeing a similar slowdown. Personally I feel it might have become a little less active overall in comparison, but what is more apparent is the fact that there too the general level of advanced topics has been declining, which would be in accordance with there being little to no innovation in LabVIEW. The interesting things have been discussed and brainstormed about, and very few new things got added that would spur new discussions. What is posted nowadays is mostly about problems when using LabVIEW or some hardware, or how to get a simple job done quickly!
Is LabVIEW dead? I wouldn't say so, as I still see it used in many places, but the excitement about using LabVIEW has been somewhat lost by many. It's a specialized tool to do certain things and in a way the only one in town doing it in this way, but by far not the only one you can use to do these things. In fact there have been many new and not so new alternatives for doing it (I regularly see situations where people have decided to use Python for everything, even the UI, which is not something I would easily consider) and the general target has been shifting too. If you want to do internet related stuff, then LabVIEW is definitely not the easiest solution and also not the most powerful one.
Engineering simply has gotten a lot broader in the last 10 years, and while measurement and testing where you directly tinker with hardware and sensors are still quite important, another big market has developed that has very little to do with this and where the qualities of LabVIEW don't shine as brightly or even show nasty oxidation spots. Maybe the fact that LabVIEW always was designed as a closed environment with very limited possibilities to be extended directly by 3rd parties has hurt it to the point of being forced into the niche market it always tried to protect for itself. It will be interesting to see how NI is going to attempt to fix that. The stagnation in LabVIEW development is something which supposedly happened because they focused their energy on a fundamental redesign of the LabVIEW platform, which has dragged on longer than hoped for and claimed more resources than was good for the existing LabVIEW platform.
  4. I got that too in Chrome. I thought I'd wait and check again later. It was still an issue, so I decided to clear my cache and voilà!
  5. Even as a Microsoft Certified Partner you do (did?) not have an unlimited license allowance. After new versions of software are out, you are supposed to upgrade to the newest version within one year. The licenses for older versions then become invalid, and with that any VM image using them, even if it is just a backup.
  6. Hmmm, an interesting change. Looks good and pleasant to the eyes. Congrats!
  7. Yes, I'm using a somewhat enhanced OpenG Packager version for my own work. It doesn't support many things I would like, but I haven't really needed them so far. But it allows simple support for selective installs. It doesn't support relocating, relinking and password protecting VIs, as that is really part of what the OpenG Application Builder was about, only that part is very much out of date since it doesn't support 8.0 and newer file types.
  8. Having been one of the initial co-developers of the ogp file format and spec file, I wasn't really looking at RPM closely enough to say that it was based on it, but we did take inspiration from the general RPM idea and tried to come up with something that worked for LabVIEW. The format itself was in some ways more based on the old application builder configuration file than on any specific package manager format. As to creating an alternative to VIPM, I would think this to be almost an effort in vain. The OpenG Commander was a good thing back in the pre-LabVIEW 8 days and worked fairly well for that situation, but the new project format, LabVIEW libraries and classes and the various other file formats introduced with LabVIEW 8.0 and later really are a very different breed in many ways. Also, VIPM really is a combination of the OpenG Commander, OpenG Package Builder and OpenG Application Builder, but then severely enhanced to handle the new file types too, which is a very complicated process and requires quite a few undocumented VI server methods, some of which changed between versions, so it's hard to support more than two or three LabVIEW versions at all.
As to saying that NI doesn't provide a good source code control solution, that is kind of going in the wrong direction. I haven't seen many software developer tools coming with source code control from the same manufacturer that actually works well in any way. Microsoft couldn't pull that trick off, and there is nothing that makes me believe that NI has even nearly as many resources available as MS. There are ways to use source code control with LabVIEW. Not ideal ones, but there aren't any ideal solutions as far as I'm concerned, even outside of LabVIEW. LabVIEW has a few extra pain points, as some of its files are big nasty binary blobs that none of the source code control tools can handle efficiently, though all of the modern ones can at least handle them. The more modern XML based files, while being text based, have the problem that just about every source code control system will fail to handle them consistently, as simple text based merging is not enough. And context based merging is still in its infancy and doesn't even work well for other XML based files with a fully known schema. But to turn around and require the LabVIEW files to be in a format that can be easily merged automatically is also not really realistic.
  9. I'm sorry, but I'm not likely to work on this anytime soon. We use MS Office in the office, so OpenOffice is not a logical choice for me to work on, and our projects would usually require working with MS Office too, if they require any office automation integration.
  10. Well, then your NetBEUI to IP address resolution is not working properly. As to your previous comment about going directly to the network path, I was actually referring more to entirely alternative protocols such as WebDAV or (S)FTP.
  11. Except going directly to the network path? Not really! There might be some obscure registry setting which influences the timeout, but in my experience the default timeouts for accessing remote file paths through the Windows file API are rather too long than too short. I don't like to wait for 1 minute to finally get an error that a file path is not currently accessible because the network or remote server is down. Identifying that registry setting is going to be difficult and made extra complicated by the fact that it may vary depending on your current setup and logged-in credentials. That file API is layered on top of the kernel API, which is layered on top of the NetBEUI protocol, which then makes use of Winsock, which then calls into the socket driver and from there into the network card driver. NetBEUI is/was a nice invention back in the days of DOS and got carried over since, but it did not have a rich control interface for things like timeouts, etc. It simply goes and sits for whatever time the developer found reasonable. The Windows file API only supports timeouts on its own level through asynchronous API calls, which is cumbersome and in most cases serves no extra purpose other than freeing up the current thread to do something else while the file API waits for the underlying resource to become available. LabVIEW does not use the asynchronous file API since it doesn't give LabVIEW anything extra, and even if it did, this is entirely separate from possible timeouts in lower level file system drivers like NetBEUI. Generally I would say your Windows domain setup is somewhat flaky, so that the NetBEUI driver can't always verify the access rights of the current user for the desired network resource in time. This setup used to be fairly straightforward in Win NT 4 times but has been tightened up a lot since, so that it is very easy to make errors when configuring domain users and access rights, and these errors often only occur occasionally, depending on network load, actual login mode and various other temporary conditions.
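As a rough illustration of what I mean by a timeout through the asynchronous API, here is a minimal C sketch of my own (not something LabVIEW does): open the file for overlapped I/O, start the read, and wait for completion with your own timeout. Note that this only covers the read itself; the initial open of a dead remote path can still block for as long as the redirector pleases.

    #include <windows.h>

    /* Read up to len bytes from path, giving up after timeoutMs milliseconds. */
    BOOL ReadWithTimeout(const char *path, void *buf, DWORD len, DWORD timeoutMs)
    {
        HANDLE h = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL);
        if (h == INVALID_HANDLE_VALUE)
            return FALSE;                      /* the open itself has no timeout control */

        OVERLAPPED ov = {0};
        ov.hEvent = CreateEvent(NULL, TRUE, FALSE, NULL);

        BOOL ok = ReadFile(h, buf, len, NULL, &ov);
        if (!ok && GetLastError() == ERROR_IO_PENDING)
            ok = (WaitForSingleObject(ov.hEvent, timeoutMs) == WAIT_OBJECT_0);
        if (ok)
        {
            DWORD transferred;
            ok = GetOverlappedResult(h, &ov, &transferred, FALSE);
        }
        else
        {
            /* Abandon the request and wait for the cancellation to complete,
               so the kernel no longer touches buf or ov after we return. */
            CancelIo(h);
            DWORD transferred;
            GetOverlappedResult(h, &ov, &transferred, TRUE);
        }

        CloseHandle(ov.hEvent);
        CloseHandle(h);
        return ok;
    }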
  12. I believe that it should be possible, but it will likely require cross-compiling the entire NI Linux RT kernel sources in order to configure them for a standard PC rather than the NI specific hardware. Other than that it should be less critical about needing specific hardware (virtualization) than the Pharlap system, since the Linux kernel was and still largely is originally developed on and for the PC. It's been more than 15 years since I compiled my own Linux kernels from scratch and installed them onto a second boot partition, so it will certainly take some time to figure out all the details. If I ever happen to get some spare time, I'm probably going to try that, but don't hold your breath for it.
  13. What about the LVCompare.exe tool that can be used by source code control clients to handle LabVIEW files in a more friendly way? http://zone.ni.com/reference/en-XX/help/371361H-01/lvhowto/configlvcomp_thirdparty/ That should do most of the things required here and has various options to ignore certain classes of changes in the comparison.
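From memory, the third-party configuration boils down to a command line roughly like the one below; the exact option names, and which ones your LabVIEW version supports, are documented on the linked help page, so double check them there:

    LVCompare.exe <base VI> <mine VI> -nobdcosm -nofppos -noattr

As far as I remember, the options after the two VI paths tell the comparison to ignore cosmetic block diagram changes, front panel position/size changes and VI attribute changes respectively.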
  14. It's a first step towards 3.0 compatibility, but by far not enough. 3.0 changed a lot of things. As to running LabPython 4.0 with Python 2.7, I guess that thanks to the runtime dynamic linking it may work fine as long as you don't hit code paths that depend on the symbols that changed in 2.7. And that is in fact something that makes me wonder if the dynamic linking is a good idea. Different Visual Studio versions might certainly have some influence. I'm not sure anymore if there are any resources that are shared across the Python-LabPython boundary, but that would be totally problematic for sure if the Visual Studio and therefore the C runtime library versions differ.
  15. Native Excel files use a proprietary format, either binary (pre Office 2007) or compressed XML. Neither format is something that is easily read in LabVIEW, although there is a toolkit which can read and write XLS files directly. Another approach is to use ActiveX to interface to Excel and access the files that way. There are several libraries out there which wrap the ActiveX nodes to make it easier, since the ActiveX object hierarchy of MS Office software is rather involved and not trivial to understand. But in order for this to work you need to have Excel installed on the computer where you use such a library. For the rest it really would help to give more info, as has been asked multiple times in this thread already. It's rather frustrating to see multiple requests for extra info and then see that you keep providing further information in piecemeal fashion; if you keep doing that, the readiness to help you further will quickly diminish, also for future requests you may have. If the files are however in tab or comma separated text format, then the Read Spreadsheet File function will work perfectly.
  16. You should probably not attempt to set the absolute path to the Python27.dll when it was installed in System32 (or more likely SysWow64, as in my case when installing the Python 2.7.11 version from Python.org). In that case just specifying Python27.dll alone should be safer. Also note that the LabPython 4.0.something package that gets installed from VIPM is not compatible with Python 2.7. They changed various things in 2.7 in preparation for 3.x, including removing some symbols from the DLL, so the LabPython initialization fails because it can't link to the expected symbols. I have prepared some fixes to the current LabPython code that conditionally link at initialization time to the correct symbols, but I haven't found time to properly build a new VIPM package for this.
  17. The problem is that when Python is started it determines its home path from a number of possible values and then derives the sys.path value from that. This is then used to find modules that you "import" in a script. It checks, among other things, values like the PYTHONPATH and PYTHONHOME environment variables, registry entries and finally, if all that doesn't lead anywhere, the current executable's directory. From there it tries to find the Lib directory inside the determined home path. However, I just installed Python 2.7.11 from the Python site on my machine and I don't see any environment variable nor registry entry for Python. That forces the Python DLL to rely on the two remaining parameters, neither of which helps either. First the location of the Python DLL, which has been moved from the Python directory to the Windows system directory in the past, so no way for Python to find out where it got installed, and then as last resort the executable directory, which is the LabVIEW installation directory; no joy finding the Python Lib directory from there either. For the Python executable this last one is what saves the day when running from within the Python console, but it leaves any application that embeds the Python DLL out in the cold. Possible workarounds:
1) Append the relevant Lib path to the sys.path variable before your import statement (something like sys.path.append("C:\Python27\Lib")).
2) Define a PYTHONPATH=C:\Python27\Lib;<other possible module directories> or PYTHONHOME=C:\Python27 environment variable.
3) Create a registry key HKLM\Software\Python\PythonCore\<version>\PythonPath\<name> with the value of the Lib path.
4) NOT RECOMMENDED: the default sys.path Python uses when everything else fails is a list of relative paths which resolve against the so called Current Working Directory. This can be set with a call to the SetCurrentDirectory() Win32 API. Setting it to the Python home path would theoretically work, except that this path is also modified by various other Windows functions, most noticeably the File Dialog box, so using it as a runtime reference is a very brittle solution.
It seems like there is actually infrastructure in the Python DLL to handle this, but the standard installer forgets to set any of the possible values that would help this infrastructure do its work. There is no good way to remedy that from within LabPython itself, since it shouldn't inherently know about Python installation specific details.
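For completeness, this is roughly what an embedding host could do at the C level if it did know the installation path up front, using the documented Python 2.x embedding API (the paths are just examples for a default C:\Python27 installation):

    #include <Python.h>

    static void init_embedded_python(void)
    {
        /* Tell the interpreter where its home (and thus Lib) directory is
           before initializing it, so sys.path gets derived correctly. */
        Py_SetPythonHome("C:\\Python27");
        Py_Initialize();

        /* Alternatively, override sys.path explicitly after initialization. */
        PySys_SetPath("C:\\Python27\\Lib;C:\\Python27\\Lib\\site-packages");
    }

LabPython itself can't hardcode something like this, because it has no reliable way of knowing where your Python installation lives, which is exactly the problem described above.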
  18. As you can see in the discussion thread for this download, the author didn't get very far with implementing the ooCalc support and seems not to have found time to work on it further.
  19. AFAIK VIPM pulls the actual packages from the SourceForge servers (and the NI Tools Network server). So besides moving all the content to another site, you would either have to leave the built packages on SourceForge or find a way to easily let VIPM reference the new server instead. I'm not even sure how binary packages on GitHub work and if they are easily referenced by external package managers like VIPM. I don't think an authoritative list like that exists. It would be possible to create a fairly accurate list with some effort, albeit with no guarantee that it would be complete, but so far people have only asked for this; nobody ever went to the effort of actually starting to create something like it. Even if you had that list, several people on it have moved on, are not actively involved in LabVIEW work anymore and may not even follow this microuniverse anymore.
  20. Actually, there is one point which comes to mind. VIPM, including the latest version, pulls all the OpenG libraries from SourceForge. Moving that to another server would render all existing VIPM installations non-functional for downloading OpenG libraries. I'm sure there is a way to change a configuration file or something in VIPM that could let it use another server, and the Pro version supports arbitrary server locations, but that may all have its own trouble and complications.
  21. The merging capability isn't exactly working well, definitely not something to use for automated merging during Git pull requests. As to the observation that the development of LabVIEW seems to have stagnated, I can't deny that. There have been two quite large architectural changes in the last 7 to 8 years. One is the support for 64-bit targets, which started in version 2009 and was finished with the 64-bit versions for Linux and Mac in version 2014. Especially the Mac version must have been a major piece of work, as more or less everything had to be changed from the Carbon framework to the Cocoa framework. That is a major change in the code base, most likely with the addition of Objective-C code specifically for this move (many Cocoa APIs are not really accessible from C or C++ code). The other big change was the introduction of the LLVM compiler backend around 2011. As it is something that works in the background, not much of it should have been felt by the end user. Technologically however it is a major change to the code base. The fact that it has been so largely invisible to the users is in fact a big achievement, as it had huge chances of breaking lots of things in many places. So it's not like there hasn't been much development going on in LabVIEW, but a lot of the last big changes are almost invisible to the end user. There has been a lot of work going on for several years to modernize LabVIEW. Quite some of it was in the underlying infrastructure, such as multiplatform support (not just multi-CPU and -OS as originally developed), the LLVM compiler backend and many more things. The UI side of LabVIEW is however burdened with lots of legacy liabilities that can't easily be changed without breaking lots of things and user expectations. Keep an eye out for new developments there. It won't be as invisible as the other changes mentioned, and not everybody is going to like them. Breaking old habits is one of the most difficult things to do.
  22. I'm not sure that is desirable! Bigger companies have failed very badly at providing a working source code control system (anyone ever tried to use Visual SourceSafe?), so I don't have high hopes that an SCC offering from NI would be better than what we can get now with openly available open source tools. That smoother and more effective merge tools for LabVIEW would be a good thing is unquestionable. But the task of comparing vectors and directed graphs with each other is an order of magnitude or two more complex than simple text code comparison, and automating that would seem a really complicated task that whole generations of computer science programmers could spend years on. Given the closed source nature of LabVIEW, that is however not going to happen.
  23. Shaun wasn't talking about zipping up code for source code control purposes. Just that the additional benefits of Git compared to SVN, which is currently used for OpenG, basically diminish to nothing, since merging is a highly manual process. Seeing how many Git clones get abandoned quickly even for "normal" languages, where automatic merging is at least a possibility, albeit even there never without thorough manual review of any merge, the collaborative advantages of Git simply get lost when it comes to LabVIEW code. SVN works for that just as well if not better, thanks to very intuitive and easy to use clients like TortoiseSVN. Yes, there is a TortoiseGit client too, and I tried it, but it regularly shows its Unix origins even when used as a Windows shell extension. Not the type of seamless integration that makes these things easy to use for people who want to program rather than read manuals. That SourceForge is having issues is undeniable, and that might indeed be a reason to consider changing the provider for the source code control repository for OpenG, but Git has definitely no serious advantage over SVN when it comes to LabVIEW code. Also, I really doubt there will suddenly be any inrush of new people reviving the OpenG initiative because of such a change.
LabVIEW is a niche product, used mostly in industry applications, and most users have paid jobs, which makes collaborating on OpenG-like initiatives kind of hard. And the others have since moved on to Python and the like. The popularity of LabVIEW won't change because of moving OpenG to things like GitHub. The popularity of LabVIEW is where it is because of decisions made by NI about how to market this language. They are or have been fairly aggressive about getting it into educational institutions to get future engineers acquainted with it before they move into the industry, but they were and are very reluctant to loosen the control on LabVIEW in any way. As such it has been and still is a proprietary, single-source application development program rather than a general purpose programming language, despite the fact that it is technically a fully functional programming language. From a commercial point of view, history has shown that their decision was in fact a very successful and profitable one. For the academic purpose of graphical programming languages it was less favorable, but you can't fault a publicly traded company for choosing the profitable approach.
  24. Well, I see a GZipStream in there, but without knowing the actual language you use, it is hard to judge what specifics that may imply. Are you sure you did the proper preparation of the byte stream, namely first removing the "Zipped:" string from the stream and then doing the correct Base64 decoding, before trying to pass the resulting byte stream to the GZIP function?
  25. Well, I'm sure the Git system was developed exactly for what you see as the original idea. And it works amazingly well for certain projects with central maintenance like the Linux kernel or the Wine project. However, lacking such centralized maintenance, it tends to end up in the cloning mess you allude to, because most developers just want to get this new awesome feature into the software and not worry about integrating it into the main branch. I still follow the Wine project a bit and it is the single biggest problem there: some contributor has a great new idea and drops a patch, but then, when faced with the trouble of integrating it into the whole, complying with common styles and formatting, following proper error handling and making sure the modified code passes all unit tests, the majority just starts to complain about the stringent rules and eventually abandons it. Even in text programming, merging a software branch back into the trunk is often a very tedious and work-intensive process that even advanced code merging algorithms will never be able to fully automate, since it is not always enough to just look at the factual differences in code; the whole context often has an influence too. And with even basic automatic LabVIEW code merging still being a pipe dream, this makes the distributed development model of Git more of a liability for LabVIEW source code control than an advantage.