Everything posted by ShaunR
-
Yes. Absolutely. All the VIs you have downloaded are based around uploading the firmware to the PIC. If you have this already, then once you have programmed the PIC with the USB commands that you want it to respond to, you can use the LabVIEW serial comms VIs (VISA) to communicate as if it were a normal serial port (see the simple serial example shipped with LabVIEW). See here about using PICs with virtual serial ports.
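Since the shipping example is graphical and won't paste here, a minimal text-based sketch of the same pattern, in Python via PyVISA, looks like this. The resource name (COM3), baud rate and the "LED:ON" command are placeholders; substitute whatever your firmware actually responds to.

```python
# A minimal sketch, assuming the PIC enumerates as a virtual serial port on
# COM3 and the firmware speaks a simple line-based protocol.
import pyvisa

rm = pyvisa.ResourceManager()
pic = rm.open_resource("ASRL3::INSTR")   # COM3 as a VISA serial resource
pic.baud_rate = 9600                     # match the PIC's UART settings
pic.write_termination = "\r\n"
pic.read_termination = "\r\n"

pic.write("LED:ON")                      # hypothetical firmware command
print(pic.read())                        # read the firmware's reply
pic.close()
```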
-
I'll reiterate: DLLs are for the Windows platform. So no, it won't work. There are two aspects to using PICs: 1. Uploading the firmware (bootloader and your program, using either a PIC programmer or specific software). 2. Communicating with the firmware once you achieve no. 1 (usually as a serial port). To achieve no. 1 you have to use the toolchains supplied by the manufacturer (I use HiTech as it happens) or some open-source equivalent. The VIs you are downloading are merely wrappers around the manufacturer's API for communicating with their PICs, which are "blank" and so have no USB support as you would consider it (LabVIEW is not a PIC programmer). Lots of people have Windows, and Microchip supply the DLL for Windows, and some people have written wrappers to use this DLL for uploading the bootloaders and firmware rather than using one of the myriad other tools people have written. For the Mac, they (Microchip) would need to supply a "framework" or dylib which would enable you to do the same on a Mac with LabVIEW. Maybe this page on PICs with the Mac and Linux will help.
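To make the "wrapper around the manufacturer's DLL" point concrete, here is roughly what those VIs are doing, sketched in Python with ctypes. mpusbapi.dll is the DLL that ships with Microchip's MCHPFSUSB framework; the export name below is my assumption, for illustration only.

```python
# Illustration of calling the vendor DLL directly. ctypes.WinDLL exists
# only on Windows, which is exactly why none of this can work on a Mac
# unless Microchip supplies a dylib/framework.
import ctypes

mpusb = ctypes.WinDLL("mpusbapi.dll")
version = mpusb.MPUSBGetDLLVersion()     # assumed export, illustration only
print(hex(version & 0xFFFFFFFF))
```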
-
Create Different Instances of a Class
ShaunR replied to Suneel's topic in Object-Oriented Programming
Good video. This is basically how all my software works.
-
LabVIEW Class Hierarchy window
ShaunR replied to Olivier Jourdan's topic in Object-Oriented Programming
I've never experienced this problem, so this is a wild guess, but spurious issues in "ini" files can be a symptom of excessive path lengths. Try deleting the lines for "RecentFiles.pathList", "NewDlgRecentMainTemplates.pathList" and the QuickDrop ones.
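If you want to script it, something like this quick-and-dirty Python sketch would prune those lines. The install path is an example and "QuickDrop" is my guess at the prefix of the QuickDrop keys; back up labview.ini before running anything like this.

```python
# Prune the suspect keys from labview.ini. The path and the "QuickDrop"
# prefix are assumptions; make a backup copy of the file first.
from pathlib import Path

ini = Path(r"C:\Program Files\National Instruments\LabVIEW 2010\labview.ini")
doomed = ("RecentFiles.pathList", "NewDlgRecentMainTemplates.pathList", "QuickDrop")

lines = ini.read_text().splitlines()
kept = [line for line in lines if not line.startswith(doomed)]
ini.write_text("\n".join(kept) + "\n")
```
-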
DLLs are for the Windows platform. If you are using the MCHPFSUSB, I've never seen Microchip support for the Mac, but you never know.
-
Ok. Just for you: I mean a Multicolumn List Box (which is a table to a simpleton like me) rather than a Table Control. So yes, you are right, a "Table Control" does work. Not all table-type controls do, though.
-
Application licensing with dongles
ShaunR replied to gyc's topic in Application Builder, Installers and code distribution
Wow. For once LabVIEW didn't crash when saving to a previous version. I flirted with it a while back (it's on here somewhere). The issue is Admin rights. I never got around to revisiting it.
-
Perhaps we should design a BS Bingo game for LabVIEW.
-
I don't know if it is there or not. I don't visit the NI sites much any more since it kept asking me to verify my details, so I don't bother now. But we have over a year to argue about it before it is likely to be considered. Well, we are about to argue about semantics. I didn't say what it should do, just that I find it irritating that it doesn't, and pointed out that if the OP uses it, it probably won't do what they (and I) expect for those controls.
-
Note that this invoke node will not clear tables or combo-boxes, which I've always found infuriating.
-
Well said. I personally think people get hung up on the terminology. I view the NI certifications as expanding levels of remit that closely follow a typical career path, as you take greater responsibility over more of the project life-cycle. You start off with general knowledge about LV (CLAD: demonstrate you know what it is), then progress to detailed implementation (CLD: demonstrate you can write software), and for the CLA it expands into requirements decomposition (that's where I see the grey area between technical and project management beginning). If you demonstrate competency in all these areas then you could also call yourself a software systems engineer, senior software engineer, chief developer, software technical authority et al. "Architect" has always seemed a bit pretentious to me. At this level, titles don't really mean much (I notice you don't call yourself an architect). I'm pretty sure that next time round, Daklu will probably get one of the highest scores ever in the tests now that it is clear what is expected. For most of us, examinations are rare occurrences, and we may perform better under pressure in a management meeting (which we deal with often) than in a timed test.
-
Well, I'm coming in a bit late, but I'm a bit surprised that there's no discussion of tagged-token/static dataflow, or even actors and graphs, yet this whole thread is about dataflow. I don't know specifically, because I have never given it much thought and only have a cursory understanding since it's not my remit. But if I were to categorise LabVIEW, I would say it is either a tagged-token dataflow model or a variant thereof (hybrid?). Asking whether it is "pure" or not is a bit like asking if the observer design pattern is a "pure" design pattern.
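For anyone wondering what "tagged-token" means in practice, here's a toy sketch in Python (an analogy only; no claim about LabVIEW's actual internals): each token carries a tag, and a node fires only when it holds a token on every input with a matching tag, which is what lets several "iterations" be in flight at once.

```python
# Toy tagged-token node: fires when tokens with the same tag have arrived
# on both inputs. Purely illustrative.
from collections import defaultdict

class AddNode:
    def __init__(self):
        self.pending = defaultdict(dict)              # tag -> {port: value}

    def receive(self, tag, port, value):
        self.pending[tag][port] = value
        if {"x", "y"} <= self.pending[tag].keys():    # all inputs present?
            tokens = self.pending.pop(tag)
            print(f"tag {tag}: {tokens['x']} + {tokens['y']} = {tokens['x'] + tokens['y']}")

node = AddNode()
node.receive(tag=1, port="x", value=2)
node.receive(tag=2, port="x", value=10)   # a second iteration in flight
node.receive(tag=1, port="y", value=3)    # tag 1 fires: 5
node.receive(tag=2, port="y", value=20)   # tag 2 fires: 30
```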
-
Not a consolation, I know, but you're not the first, and definitely won't be the last. Of all the NI exams, the CLA is the one where exam technique weighs most heavily in the results. You can nail the requirements and documentation by spending half an hour writing a well-rehearsed scripting VI and cutting and pasting the requirements into a text file. As Crelf said: "Give the examiner what they want" (or in this case, the Requirements Gateway).
-
Hmmm. Not sure I necessarily agree with this. M$ has been touting CE as "hard real-time performance" since version 3.0, and the term "real-time" on its own tends to mean many different things to different people. But here's a report into its real-time capabilities and a rather excellent web-cast on its real-time features, so you can decide by your criteria. When it boils down to it, there really aren't that many "real-time" applications that are required of an OS in LabVIEW development. The trend is to offload real-time onto dedicated hardware, and NI have many fantastic PCI or PCIe solutions for this. That's why we have seen the proliferation of FPGA. This means that Embedded Windows is an excellent platform for real-time (or near real-time) applications. I've actually been using Windows XP Embedded Standard for a couple of years now (not exactly the same as CE). It's far cheaper than NI platform products, and you can use NI's cards as well as other manufacturers', so it is far more versatile. Once you get the right initial "image", LabVIEW runs on it too. We have gone down the route of having small embedded machines for specific tasks (e.g. one for image acquisition, one for motion control, one for acquisition, etc.) and as replacements (or alternatives) for cRIO. So far, we have not found one instance where it hasn't been as capable, but many instances where it is far more versatile and at a fraction of the cost. Looking forward to evaluating Embedded 7. But XP has worked so well (with LabVIEW) that it'll probably only happen when we really have to.
-
http://youtu.be/Vh78T--ZUxY
-
Yup. Well worth a ROFL point. I have seen quite a few of those VIs.
-
Sweet. I'm pleased it is worthy of consideration.
-
You made that up. What you meant was LTWAC (Lazy To Write A Class).
-
Extremely Long Load Time with LVOOP, SubPanels, and VI Templates
ShaunR replied to lvb's topic in Object-Oriented Programming
Red: I think this is where we are at odds. I think it matters very, very much. The loading of the VIs in a class is orders of magnitude slower than instantiating an object. One is a disk operation and the other is just allocating memory. The big caveat with classes is that a class doesn't just load what it needs (as it does in classic LabVIEW); it has to load everything in the class, needed or not. Pre-loading these method VIs is the only practical solution: you trade the 1 or 2 seconds (say) of disk activity for a few microseconds/milliseconds of execution and repeatability/consistency when calling those methods. If, however, it is loaded just before it is required, it may be that the 1 or 2 seconds for LabVIEW to load the class from disk falls in the middle of a 100 ms timeout, or a timed measurement. Then it is a real problem.

Blue: Exactly. But the way the DD table works is that it is a list of VI references in the order of the hierarchy (parent at the head, children appended to the end). If we want to prevent the second scenario described above from happening, then it will have to load the parent AND the children from disk before the call. If it is an overridden parent, then it will have to load the parent, child and all the siblings. Yes. Absolutely. It returns with a reference. To have that reference it must have loaded that VI. As it cannot load a single VI from a class, it must have loaded the entire class. If it didn't know beforehand what class it would need, it must have loaded all of them, unless it lazy-loaded the class when the call was made.

This is one way that I think about what the "mechanics" might be (in the design environment in this case). There are a few ways that the blue box area could operate (they could be sequential, depth-first et al.), but it's basically how I see the partitioning between loading and calling in reference to the document. From this I think you can see that I believe (I'm only guessing, after all) that all classes are loaded before any calls: loaded, not instantiated. I think instantiation is just semantics to keep POOPers happy, since I could say that by placing a common-or-garden VI on a diagram I am loading and at the same time instantiating that VI. In this case it's just a parent and child. I think that if you override, then siblings would also be loaded. In the image, your "dynamic dispatcher" is the index array element primitive.
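If it helps, here's how I'd caricature that DD table in Python. It's purely an analogy with invented names, but it shows why every class has to be "loaded" up front just to populate the reference list:

```python
# Caricature of the dynamic-dispatch table: an ordered map of method
# references, parent first, child appended. An analogy, not LabVIEW's
# actual mechanism.

class Parent:
    def measure(self):
        return "parent implementation"

class Child(Parent):
    def measure(self):
        return "child implementation"

# Building the table requires Parent AND Child to be loaded before any
# call is made: the pre-loading behaviour described above.
dd_table = {cls: cls.measure for cls in (Parent, Child)}

def dynamic_dispatch(obj):
    # The object on the wire selects which entry executes.
    return dd_table[type(obj)](obj)

print(dynamic_dispatch(Child()))   # -> "child implementation"
```
-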
I've seen it a few times too. Fairly random "feature". Just need to up the PHP memory limit a bit (I bet it's set to 32MB).
-
Did they give you a CAR number for it, or was there already one registered?
-
I do sometimes use them to post to LAVA because creating a snippet is less hassle than uploading a VI. But as I use Google Chrome, they are just not worth the bother to use generally (open the image in a new window, resize the browser, drag to desktop, find a diagram, drag into the VI from the desktop, restore the browser). If a snippet is posted on LAVA it has to be something pretty special for me to try and get it into a VI. I normally just pass them by.
-
Extremely Long Load Time with LVOOP, SubPanels, and VI Templates
ShaunR replied to lvb's topic in Object-Oriented Programming
Indeed. But to call a child it must be loaded at some point, either prior to the actual call (pre-loaded) or when the call is executed (lazy-loaded). Like you said, we don't have visibility of the mechanics. But from the quoted piece of text, it seems that the purpose of the dynamic dispatch table is to maintain a VI reference list, and the object passed determines from which VI reference the method is executed. I am speculating (in response to the OP's observations) that to get those references it must load the VIs. From that, I am suggesting that, as it doesn't know which classes "will" be called until it actually happens, the algorithm pre-loads all classes that "can" be called to populate the table. The alternative (I cannot see any others, but there may be one) is that you place an object on the wire that indexes past the end of the list, triggering some loading voodoo when the call takes place. As you say, you cannot load a part of a class, so the latter could be a long wait as it loads every method for the new class (and dependencies), right in the middle of your code executing. The former would explain the OP's observation. The latter would be a very good reason not to use DD, especially on real-time systems.
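To make the pre-load versus lazy-load trade-off concrete, here's a rough Python analogy (module import standing in for class loading; all names invented). The first lazily-loaded call pays the whole load cost right in the middle of execution, which is exactly the hazard for real-time systems; pre-loading pays it all up front where the timing doesn't matter.

```python
# Analogy only: lazy loading pays the disk cost on the first call;
# pre-loading pays it before execution starts.
import importlib
import time

_cache = {}

def lazy_call(module_name, func_name, *args):
    if module_name not in _cache:             # first call: load "from disk"
        t0 = time.perf_counter()
        _cache[module_name] = importlib.import_module(module_name)
        print(f"loaded {module_name} in {time.perf_counter() - t0:.3f} s")
    return getattr(_cache[module_name], func_name)(*args)

def preload(*module_names):                   # pay the cost up front instead
    for name in module_names:
        _cache[name] = importlib.import_module(name)

preload("json")                               # deterministic timing later on
print(lazy_call("json", "dumps", {"loaded": True}))
```
-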
Web UI Builder - feedback wanted
ShaunR replied to Mr Mike's topic in Remote Control, Monitoring and the Internet
Yup. I think also that the explosion of Ajax, enabling "desktop-style" dynamic web pages that run in the browser's native engine, has put the nail in the coffin of external run-times like Silverlight. Things like Flash, Java etc. are really now only used as presentation tools rather than for dynamic page content. Try to find a site now that doesn't have jQuery, for example. Sure, it's not graphical, but JavaScript developers are the new C developers: ten-a-penny and cheap. There are many data-streaming sites now using JS push servers for things like real-time stock quotes. In terms of graphical presentation (like graphs etc.), I'm looking at Google Maps. Not so much using it as-is, but the technology: JavaScript-based with bi-directional feedback allowing zooming, panning, and overlays. All that a web page requires is the DIV tag and a link to the JavaScript, and the browser does the rest. Now, if there were a tool (like the Web UI Builder) that created JavaScript (like the embedded toolkits create C), I would be getting very excited. I could then use it to create front ends for my LabVIEW websockets. It would have to be free, of course.
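By way of illustration, the server side of that kind of push feed might look like this in Python, using the third-party "websockets" package (pip install websockets). The port and the fake quote are placeholders; a page then needs only a DIV and a few lines of JavaScript to render whatever arrives.

```python
# Minimal push-server sketch: streams a fake "stock quote" once a second
# to any connected browser. Port and payload are placeholders.
import asyncio
import json
import random

import websockets

async def quotes(ws):
    while True:
        quote = {"symbol": "NI", "price": round(random.uniform(20, 30), 2)}
        await ws.send(json.dumps(quote))
        await asyncio.sleep(1)

async def main():
    async with websockets.serve(quotes, "localhost", 8765):
        await asyncio.Future()   # serve forever

asyncio.run(main())
```
-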
I'm pretty much in line with Cat here. I think "new" business is driven by the latest features, gadgets and spin, but that's due to competition from other products. I would tentatively wager that the vast majority of LabVIEW income is from SSP which, as its name suggests, is for value-added Service and Support: a recurring income stream with almost no additional overheads or cost. I think most "real-world" LabVIEW programmers (if they have any sense) only upgrade to a later version if there is a new feature that achieves common requirements that were impossible, or at least very difficult, in the older versions (very rare, but I would put the event structure in this category), or a customer requires it. Version changes are a very high-risk proposition and I think most opt for "better the devil you know". But that doesn't stop us paying our SSP for the exemplary service NI provides and to be able to pick up the phone and talk to real engineers who have real product knowledge. I, for one, pay my SSP for this, not to get the new versions.

If I were to rate the importance (to me) of new features and reliability on a scale of 1-10, new features would be 2-3 and reliability would be 8-10. New features don't cost me time, money or reputation. Unreliable software does. Nor do I see them as mutually exclusive. Get the software reliable, then add features. Adding new features to unreliable software only makes unreliable new features.

In times gone by, I used to be more cavalier about upgrades, because a new version would come out with very few known issues of any consequence, and if something was found, a patch or new version would be very quick to follow, certainly within a project time-frame. Now, however, the cycle is very slow, and I've yet to see an SP2 fixing my 2009 version before 2010 was released (which probably hasn't fixed the bugs I'm concerned with but, as sure as eggs are eggs, will introduce new ones).