Everything posted by Barrie

  1. The book

    I'm trying to connect this to Jim Kring's "An unfiltered stream of data flow consciousness" but I'm not having much success.
  2. Nice job! I haven't seen this one in ages. The beauty of this puzzle is that it requires no guessing or permuting; it is 100% logic. Now, can anyone write a program to solve it (pure G, of course) without brute-force permutation?
  3. I've used it once to see what it was, and that's it! I have a pretty strict rule regarding diagram size: going outside the screen size is a pretty solid indication that I'm either taking the wrong approach or I'm being lazy. Thankfully, I have never encountered nightmares like Jack Hamilton has. (I still think he's making them up) :laugh: Cheers! Barrie
  4. Generally, in terms of optics and such, Laser Focus World is an excellent source for manufacturers and suppliers: www.laserfocusworld.com. Their Laser Focus Buyers Guide has just about everything you can imagine. Good luck. Barrie
  5. This feature would promote good wiring practices AND allow a developer with a large screen to be reminded that other developers or the end user may not have the same size screen.
  6. Indeed! Sounds like a good wish list item: if not a hard limit, then at least a warning, perhaps in the VI Analyzer. How about a diagram size limit that is password protected and you can't get the password until you pass your ACLD?
  7. Ok, you can't fool me. You're making these up. Right? You took a photo of a large-scale IC and converted it to LabVIEW, right? Please tell me I'm right!! Barrie
  8. This question intrigued me. I use the PCT quite a bit, and every function except one just passes data and then executes. The only one that returns data is Get Text Rect, and it is password protected. You can flatten the picture datatype to a string and parse it, but as far as I can tell, it just mirrors the last command. This was not an exhaustive test. It looks to me like the PCT is actually a wrapper around a DLL; the schema is very similar. If you can schmooze someone at NI to give you a full list of the opcodes and the parameters passed, there might be a "get pen position" function. (A rough parsing skeleton follows this post.) If I learn more, I'll reply further. I'd like to hear if you make any progress. Cheers! Barrie
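    As an illustration of the flatten-and-parse idea, here is a minimal Python sketch. The record layout (big-endian 16-bit opcode followed by a 32-bit payload length) is an assumption for demonstration only; the real PCT opcode stream is undocumented, so treat this purely as a skeleton to adapt once the actual format is known.

        import struct

        def parse_ops(flat: bytes):
            """Yield (opcode, payload) pairs from a flattened picture string.

            ASSUMED layout: big-endian i16 opcode, i32 payload length, payload.
            """
            pos = 0
            while pos + 6 <= len(flat):
                opcode, length = struct.unpack_from(">hi", flat, pos)
                pos += 6
                payload = flat[pos:pos + length]
                pos += length
                yield opcode, payload

        # Dump the stream; diffing two dumps shows which instruction
        # a given drawing operation appended.
        with open("picture.bin", "rb") as f:
            for op, data in parse_ops(f.read()):
                print(f"opcode {op:5d}  {len(data):4d} bytes")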
  9. Now that I've learned about <ctrl> when wiring to flip 2-input functions, I use it a lot. :thumbup: Just about everything, including quotient and remainder, recognizes this, but the Data Selector has been missed. I think it could benefit from it. Thanks, Barrie
  10. And what about that one too? (NI Week looks tame after watching this...) Now really, can anyone imagine Dr. T doing the NI Dance? Barrie
  11. Hear! Hear! I would rather be despised for my opinions than for not having any. Barrie
  12. Thanks for the clarification. As for reading too much or assuming too much, I must say your original conclusion seemed pretty definitive to me:

    "...Potentially, LabVIEW could fit, but that depends as much on the culture of your organization as it does on your skill sets."

    LabVIEW is an excellent tool for rapid prototyping. That is not to say the final product is all chewing gum and duct tape; as the product has become more sophisticated, it has become very worthy of large applications. It also enables you to produce a large amount of eye candy, which leads people to assume that the app is 90% complete even though the underlying code may be a long way off. I mention this because some management types see what they want to see, no matter how many times you underscore the total work involved, and it can bite you bad. :headbang:

    As a development environment, the graphical approach works very well for me. I find it very intuitive, but I knew how to read a schematic long before I knew how to code, or even run a computer. Similarly, debugging is also very intuitive, but that may be due to my experience with a 'scope and a multimeter. Curiously, I have also found that if a VI draws well, it will probably run well. If my VI devolves into a rat's nest, that's a huge warning that I haven't fully thought through my approach and need to backtrack before I get myself in too deep.

    I intend to run a poll here regarding people's backgrounds, i.e. hardware/software, but I am waiting for the new version of Invision. Some people transition well between graphical and text programming; others definitely do not. Some programmers dismiss LabVIEW as "cute". I have seen otherwise intelligent people tell me that "real" programmers only use C, but that's their problem.

    Besides the obvious interface differences, LabVIEW uses the dataflow paradigm, which is inherently multithreaded (a rough text-language analogy follows this post). Hyperthreading and multicore/multiprocessors extend this capability naturally. Priorities, scheduling and threads are easily implemented and modified. In other words, it's more than just a pretty face.

    The sub-VI function encourages modularity, structure and reusability, and yet graphically shows that function as in-line code for the big picture. The sub-VI intrinsically provides all the interface for testing the code stand-alone, without the need to add, then delete or comment out, test code and printf statements. Compiling and linking is totally transparent, but this may now be the case for text-based environments as well.

    All that said, undisciplined or inexperienced wireworkers can produce spaghetti code in the truest sense of the word, as has been discussed on this forum. Like any other tool, you need to know how to use it to reap the benefits. As you said very well, it all comes down to compromises and fits. I hope this helps, Barrie
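    As an analogy only (G is graphical, so this is not LabVIEW syntax): two nodes with no wire between them can execute concurrently with no extra effort, whereas a text language makes you request the parallelism explicitly. A minimal Python sketch of two independent "branches", with made-up acquisition functions:

        import time
        from concurrent.futures import ThreadPoolExecutor

        def acquire_temperature():
            time.sleep(0.1)        # stand-in for hardware I/O
            return 21.5

        def acquire_pressure():
            time.sleep(0.1)        # stand-in for hardware I/O
            return 101.3

        # The two submits share no data, so they overlap in time,
        # just as two unwired nodes on a block diagram would.
        with ThreadPoolExecutor() as pool:
            t = pool.submit(acquire_temperature)
            p = pool.submit(acquire_pressure)
            print(t.result(), p.result())   # results join, like wires meeting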
  13. While all of the above can be true and does provide some good advice, it is irrelevant at best and misleading at worst. As I understand it, this thread addresses LabVIEW vs. LabWindows, and the above statements apply to every software project ever attempted, regardless of the language or methodology.

    While I would not be foolish enough to suggest that LabVIEW is self-documenting, there are inherent procedures and structures that are an intrinsic part of the language. This cannot be said for "C". With "C" you have more of an opportunity for a "free-for-all", because "C" is not much more than an assembler with delusions of grandeur. The quality of the code is almost entirely dependent on the skill, experience, organization and discipline of the programmer. It also forces the programmer to address issues that, frankly, I don't believe belong in a language in the 21st century.

    Now, before the flame wars start: assembler and low-level languages are fine, or even ideal, for many applications. I have probably spent as many hours coding assembler as I have LabVIEW. That said, the difference in productivity is astronomical.

    Comments in this thread imply that the choice between the two products is whether to "C" or not to "C", but I see it as a decision of how much "C" you wish to use. There is nothing to stop you from using "C" inside LabVIEW, and you also have access to many high-level functions that allow you to put together a rich application in minimal time. I should also mention that the final execution speeds are comparable.

    The summation, "The only difference is one elegantly works and the other is like a stuffed pig," made me smile because it is a gross oversimplification that shows no real understanding of the differences, but it does afford you the opportunity to justify your own preconceptions and biases. As I stated above, to conclude that something that relies on "C" is inherently elegant is nothing short of ludicrous. After all, the quality and productivity of the final code is the goal, isn't it?

    LabVIEW, LabWindows, "C", Java et al. are all just tools that could be in your toolbox. I try to keep a good selection for different jobs, but if all you have is a hammer, everything starts to look like a nail.
  14. Don't apologize for the length; most posts contain too little information.

    Human readable (ASCII) vs. machine readable (binary): I started with microprocessors where an entire program could reside comfortably in 4K. I spent hours squeezing every byte out of the program and packed my data as tight as it could be. For the time, the effort was appropriate. (Yes, I'm an old fart.) I now manage an airborne data acquisition system that records a lot of data, and I record everything as generic CSV files (tab, space, comma, whatever). I think that the benefit of human-readable data far outweighs any space saving achieved by using a binary format. My thinking is as follows:

    - Most, if not all, LV applications are very low quantity installations, therefore the CPU hardware cost is small relative to the development cost.
    - I can buy a 250 GB SATA drive for less than 2 hours of my time, and that price will continue to drop. It won't be long before consumers have terabyte+ machines to store bad home videos and make sure they don't miss their favorite irrelevant soap operas.
    - That recorded data can be read by a human, Excel, LV, and just about every other data-related program. I only have to write the code once, and I don't have to worry about other apps interpreting my binary format. (A small sketch of the tradeoff follows this post.)
    - If my requirements are "on the edge" of what is available when I start the project, the requirements will be more than manageable before the product is finished.
    - I'm all for elegance and efficiency, but within what parameters?

    Even if there was a possibility that an app I was developing would become a high-volume item, I would still use LabVIEW first, as it is the best rapid prototyping system that I have found so far. I would establish the proof of concept first and then, if and when the numbers become real, I would look at optimization. Does it bug me that a stand-alone system now has more storage than 500 multi-user systems had 30 years ago? Yes, a little, but I'm getting over it. I hope my rant helps a little. Regards, Barrie
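    To make the tradeoff concrete, here is a minimal Python sketch (file names and field names are made up) writing the same records both ways. The CSV opens in a text editor, Excel or LV; the binary is opaque without the exact format spec:

        import csv
        import struct

        rows = [(0.00, 21.5, 101.3), (0.10, 21.6, 101.2)]  # time, temp, pressure

        # Human readable: self-describing header, readable anywhere.
        with open("flight.csv", "w", newline="") as f:
            w = csv.writer(f)
            w.writerow(["time_s", "temp_C", "press_kPa"])
            w.writerows(rows)

        # Machine readable: compact and fast, but nothing in the file
        # says it is little-endian, 3 x float64 per record -- lose that
        # fact and the data is effectively gone.
        with open("flight.bin", "wb") as f:
            for r in rows:
                f.write(struct.pack("<3d", *r))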
  15. I think I have to side with Louis here. I still use OR and AND where appropriate. As a graphical language, I want everything to be as obvious as possible, for that time down the road when you have to re-examine and modify code. The compound arithmetic is not as visually intuitive. I do agree that the system should be smarter: replacing a regular gate should include inheriting the characteristics of the original gate. :2cents: Barrie
  16. Hmmm... Unless I'm missing something, that would imply that the RTE and/or the .exe must need some machine-specific information, during the build, about what sort of machine you want to deploy the app to. If you wrap up the .exe and RTE together, how does the app builder know what the destination machine is? Further, if one makes a single built app for distribution to a number of different machines, that would imply that some of the installs will not work, or not work as well as they could. Can anyone clarify this further? Regards, Barrie
  17. I think the problem is with the resizing. Resizing a digital image is not a trivial problem, and some methods are more sophisticated than others. Even equally sophisticated methods produce different results depending on the image content. For example, highly geometric images (such as a VI block diagram) can end up looking terrible, depending on how the image is resampled. The LabVIEW zoom (or resizing) function provides no control over what resampling method is used, so you are stuck with what you have.

    I noticed in your VI that the zoom factor you calculate is just about guaranteed to be non-integer. For example, even though you may be able to fit an image on the screen that is 100%, the VI will likely resize it to something like 99.2% or 101.5%. Resizing an image to a "nice" number (such as 1, 2, 4, 0.5 or 0.25) can improve the situation significantly, again depending on the resampling algorithm. It's a question of trial and error with different values, and it is dependent on the image content. Some things you might try:

    - Make a test VI where you can easily play with the zoom factor, using a slider, and observe the results. This will give you a feel for how the resampling affects the image at different factors.

    Then, in your VI:

    - After you calculate the ideal zoom factor, coerce the factor to something less granular. For example, coerce 105.2% down to 100% and coerce 49.1% to 50%. (A small sketch of this coercion follows this post.)
    - If you must be able to see 100% of the image, change the size of the picture control to match the zoom factor, not the other way around.
    - If picture quality is very important, you may be forced to use a third-party program to resample the image, but that can be messy and time-consuming.

    I suspect that LabVIEW will be improving the graphics capabilities in the not-so-distant future, but for now you have to work with what you have. Hope this helps, Barrie
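    A minimal sketch of the coercion step in Python (the candidate list is an assumption; tune it to your images):

        # Snap an ideal zoom factor to the nearest "nice" value so the
        # resampler sees integer or power-of-two ratios.
        NICE = [0.25, 0.5, 1.0, 2.0, 4.0]

        def coerce_zoom(ideal: float) -> float:
            """Nearest nice factor: 1.052 -> 1.0, 0.491 -> 0.5."""
            return min(NICE, key=lambda z: abs(z - ideal))

        print(coerce_zoom(1.052))  # 1.0  (105.2% -> 100%)
        print(coerce_zoom(0.491))  # 0.5  (49.1% -> 50%)

        # If the image must still fit on screen, filter out candidates
        # above the ideal factor before taking the nearest one.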
  18. I use the picture control quite a bit and I haven't noticed any problems. Could you be more specific when you say the picture loses quality? Some thoughts are:

    - Somehow you are changing colour depth, or
    - You have a zoom factor that is not 1 (100%)

    It might help if you include a screen shot which shows the picture in both Word and LabVIEW, and perhaps attach the VI itself. Good luck. Barrie
  19. I can't decide whether to laugh or cry. Thanks for the diversion! Barrie
  20. I'll chip in! In the same category, for the icon editor:

    - Double-clicking the "Selection Box" selects the entire icon (+ Del key = a fast way to delete the default icon).
    - Double-clicking the "Rectangle Box" puts a rectangle around the icon perimeter.
    - Double-clicking the "Filled Rectangle Box" puts a rectangle around the icon perimeter and fills the inside.

    BTW, it wasn't until some months with 7.0 that I discovered the font trick myself, and only after I fought with the itty-bitty default font that came with 7.0. Barrie
  21. Just enter it the same way LabVIEW displays it in a numeric. If I understand what you're asking, it's easily demonstrated by wiring a string control to the Fract/Exp String To Number function and looking at the output: 1.234e2 (or 1.234E2) results in a numeric value of 123.4. Barrie
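    The same check is easy to reproduce in a text language; for instance, Python's float() accepts the identical notation (an analogy, not LabVIEW itself):

        # Both "e" and "E" denote the exponent, as in Fract/Exp String To Number.
        for s in ("1.234e2", "1.234E2"):
            print(s, "->", float(s))   # both print 123.4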
  22. I'm even later to the party, but I want to thank Jason for taking the time not just to write about the "bug" but to provide some very good information about the background and the philosophy behind it. :thumbup: I can think of a software company or two that could learn from this kind of responsiveness. Thanks! Barrie
  23. Brent: This little VI may help to further clarify things. http://forums.lavausergroup.org/index.php?...ype=post&id=913 Cheers! Barrie
  24. This is more of a comment than an insightful reply, but if my memory serves me, cell-based operations can be sped up significantly by the following (sketched below):

    - Set the Visible attribute of the table to false
    - Make all your changes
    - Set the Visible attribute of the table to true

    This may not apply to later versions, but I do remember that this particular object is very graphically dependent. I don't think it is a bug per se, but the code is probably in need of optimization. Sigh... Poor NI, so much code, so little time. Cheers to all, Barrie
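    The pattern in a Python-flavoured sketch ("table" and its members are hypothetical stand-ins for the LabVIEW property node; the point is batching the redraw):

        def batch_update(table, new_values):
            table.visible = False                    # stop per-cell redraws
            try:
                for (row, col), text in new_values.items():
                    table.set_cell(row, col, text)   # hypothetical setter
            finally:
                table.visible = True                 # one repaint for the batch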