Everything posted by ensegre

  1. Is some restriction of the format specifier syntax acceptable? Namely, do you need to support $ (positional) ordering? Do you need to output some argument multiple times? If the correspondence parameter -> formatted form is 1:1, a loop on an array of variants containing the arguments, plus one array of format strings, could perhaps do? Or, one array of clusters (index of parameter, format string), to account for reshuffling and repetition, and then concatenate as suggested above?
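For what it's worth, here is a rough textual-language sketch of that last idea (Python standing in for the LabVIEW loop over an array of clusters; the function name and the example formats are invented):

```python
def format_args(args, specs):
    """args: list of parameter values; specs: list of (arg_index, format_string) pairs."""
    pieces = []
    for idx, fmt in specs:
        pieces.append(fmt % (args[idx],))   # format one parameter at a time
    return "".join(pieces)                  # then concatenate, as suggested above

# Example: argument 1 appears twice, argument 0 once, each with its own format.
print(format_args([3.14159, 42],
                  [(1, "count=%d, "), (1, "count(hex)=%x, "), (0, "pi=%.2f")]))
```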
  2. As for text formatting utilities, gnu has fmt. Additionally, there is par. Perhaps you may consider wrapping them. AFAIR, they simply don't cope with multibyte characters (at best they treat them as independent bytes), but alas, neither does LV, really. But right, they count characters; they have no notion of font metrics, which is what one would really want in a UI.
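A minimal sketch of what wrapping fmt could look like (shown in Python; in LabVIEW the equivalent would be a call through System Exec.vi, and the width is still counted in characters, not pixels):

```python
import subprocess

def wrap_text(text, width=72):
    """Re-flow text to the given width (in characters) with GNU fmt, assumed on the PATH."""
    result = subprocess.run(["fmt", "-w", str(width)],
                            input=text, capture_output=True, text=True, check=True)
    return result.stdout

print(wrap_text("a rather long paragraph that should be re-flowed to a 40-character width for display", 40))
```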
  3. (Doesn't 3 imply 1b?) I think the answer is in part subjective (how much extra time would you spend on a diligent implementation?), in part dictated by constraints you don't specify. Like: does setting an unnecessary parameter cause an additional delay? Does it use up some communication bandwidth? How are the capabilities of the sensors queried, or otherwise known? Does that take time or bandwidth? Is it safer to always query, because a sensor may have been changed without the caller knowing it? Are there risks connected with trying to set an unsupported parameter? In 1b, is the error propagated elsewhere as an error, or is it just the sensor answering "sorry, I can't"? Does squelching the 1b errors incur the risk of neglecting other real errors?
  4. Replace Array Subset is probably your friend. However, note that you cannot format one column of the output as numeric and the other as a timestamp: properties of array elements must be uniform. You may want to consider a 1D array of clusters instead.
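In textual-language terms, the "1D array of clusters" amounts to a 1D array of records, so each field keeps its own type and display format (a Python sketch of the same idea; the field names are invented):

```python
from datetime import datetime

# Each "row" is a record (the analogue of a LabVIEW cluster), so the timestamp
# and the numeric value each keep their own type and formatting.
rows = [
    {"t": datetime(2016, 5, 1, 12, 0, 0), "value": 1.23},
    {"t": datetime(2016, 5, 1, 12, 0, 1), "value": 4.56},
]

# Replacing a subset of the 1D array is then just slice assignment:
rows[0:1] = [{"t": datetime(2016, 5, 1, 12, 0, 0), "value": 9.99}]

for r in rows:
    print(r["t"].isoformat(), "%.3f" % r["value"])
```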
  5. I understand even less what you want to do. Maybe you'd be better off posting your code in the thread, so someone else can fill in and help review it. Or use a bottom-up approach: first submit a minimal example which clearly addresses the behavior you're missing, and only that, for the group to help with; add the rest later on in your private application. Also, please don't send VIs with the FP or BD saved maximized; it can be annoying to some.
  6. My snippet above dynamically loads four clones of the same [reentrant] VI (the high resolution timer just as an example), opens them in each of the four subpanels, and closes them when you press the stop button, as I understood you asked originally. Here is a slightly more elegant variation of it, btw: What do you want to do instead? A complete event-driven application with buttons for closing or reopening the clones in each of the subpanels? Left as an exercise, but the mechanics would be an event loop with Insert VI and Remove VI invoke nodes for each button pressed, I guess.
  7. ? http://zone.ni.com/reference/en-XX/help/371361M-01/lvprop/subpanel_removevi/
  8. For reference, an example of this kind is in "Grab and Attributes Setup.vi" among the IMAQdx examples in the Example Finder. Specifically, this VI used to have some glitches in older releases of LV (I don't know whether they were solved in later versions), but anyway it should give you the idea of what Yair means. If you happen to have IMAQ and a webcam you can see it in action.
  9. Like this for example with a reentrant VI?
  10. I have no experience with python classes, but see here: https://sourceforge.net/p/labpython/mailman/message/27990305/ (comment 1). This seems to be a problem others have complained about too, as you will find if you search the fora and the labpython list. My ugly workaround: globalize everything that is lamented as undefined.
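To illustrate the kind of change "globalize" means here (a minimal, made-up example; which names the interpreter actually complains about will depend on your script):

```python
# Before the workaround, names bound inside a function or class body may be
# reported as undefined when the script runs under LabPython; promoting them
# to module-level globals (and declaring them global where assigned) avoids it.

threshold = 0.5            # was previously hidden inside a class/function

def update_threshold(new_value):
    global threshold       # make the assignment target the module-level name
    threshold = new_value

def is_high(x):
    return x > threshold   # reads the same global

update_threshold(0.8)
print(is_high(0.9))
```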
  11. Easiest: convert the image to png, Edit/Import Picture to Clipboard, paste it onto the FP, scale to need, send to back (Ctrl+J), make the path control background transparent, slide the image below the control. A bit more flexible and reusable: customize the control. PathBackground2.vi chemin.ctl
  12. Yes, this way seems to be reasonably simple and to have potential. It looks like the symbols of an open project can even be changed on the fly, so the symbol-setting VI can be part of the same project; the only nuisance is that the VIs and the project will appear as changed and unsaved. Here is my first attempt at it (LV15); I still have to figure out what could be the leanest way for the end user. TestDependencyChecker.zip
  13. So here is my situation: I have this software I've been providing for years, which depends, for a nice side functionality, on a toolbox which the end user may or may not have cared to install (show of hands: some static copy of LuaVIEW, in a local or in a system directory, or through VIPM). I used to distribute this software as source, just a zip of VIs and an llb, because after all it's for internal use, and it may happen to be debugged on target. I never even cared to create a proper project for it once and for all; it just grew on like that.
Now it occurs to me that it would be sensible to check at runtime for the presence of LuaVIEW, and to disable the additional features if it is not found. "Not found", presently, just means that some subVIs on the main BD are missing or broken. I figure that it would be rather easy to check dynamically for a Bad status of some VI wrapping a LuaVIEW subVI, and use that as a flag. The best approximation to what I need that I could think of would be to wrap those broken parts in a conditional disable structure. But stock CD structures in the IDE can be conditioned only by bitness/target/RTE, not by some runtime value (and it makes sense that they can't; it's about compilation). Another option coming to mind would be to create a project for good, with conditional disable symbols, and two different builds, dependent/independent, but then should I distribute only the builds? Other options coming to mind look more cumbersome to me, e.g., I don't know, calling all the relevant LuaVIEW VIs dynamically, thus keeping them off the main BD. Or, I suppose, "plugin library" may be the keyword, but then would I have to turn LuaVIEW into a plugin, which is above my head and not my call? Any elegant suggestion?
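For comparison, the runtime-check pattern I'm after, written out in a textual language (a Python analogy only, not LabVIEW; in LabVIEW the probe would be reading the execution state of a wrapper VI through VI Server, and the module name below is made up):

```python
import importlib

# Probe for the optional toolbox once at startup and gate the extra features on
# a flag, rather than maintaining two conditionally-compiled builds.
# "luaview_wrapper" is a hypothetical module name standing in for the optional dependency.
try:
    lua_toolbox = importlib.import_module("luaview_wrapper")
    HAVE_LUA = True
except ImportError:
    lua_toolbox = None
    HAVE_LUA = False

def run_lua_script(script):
    if not HAVE_LUA:
        return None                  # side feature silently disabled when the toolbox is absent
    return lua_toolbox.run(script)   # hypothetical call into the toolbox
```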
  14. As I find myself maintaining and supporting a legacy software of mine, conceived 12 years ago and still alive and kicking for many users, which at some point along the way picked up a LuaVIEW functionality and dependence, let me see if I got it right:
     - LuaVIEW 1.2.2 supports the three platforms linux, mac, windows, but only at 32 bit.
     - LuaVIEW 1.2.2 was (it isn't anymore, but I have my saved copy) only available for download at esi-cit. It is not available through VIPM.
     - LuaVIEW 2.0.x is available through VIPM, but it's windows-only at the moment (just checked, VIPM for linux doesn't even list it).
     - LuaVIEW 2.0.x is the only LuaVIEW working on 64 bit (windows).
     I still have to check thoroughly that my software, for the very little lua it uses, is compatible with 2.0. Anyway, what is the minimal version of LV that LuaVIEW 2.0 will run on? Note that I'm not asking for any special backport, just thinking about how to move on as for the requirements of my software.
  15. You might get somewhere with some simpler method, for instance:
     1. compute the marginal pixel sum along verticals;
     2. divide the image into horizontal stripes, cutting where 1. is zero -> this isolates the pentagrams;
     3. for each of these stripes, compute the horizontal sum;
     4. a candidate staff is defined in terms of thresholds: the marginal sum of black pixels must be high enough and wide enough (staffs are thicker than note stems);
     5. check in each of these locations that [a slight morphological dilation of] the black pixels is exactly five lines high, and pass/fail.
     Which of course assumes that the score has been properly oriented in preprocessing. And still, imperfections in the scanned image may fool a simple detection approach; for example, the semibiscroma D at bar 3 might produce a false staff positive. All together I think complex pattern detection is an art. If there is any attempt at OMR out there, it really has to be smarter than any simple scheme like I could think of.
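A rough sketch of steps 1-5 in numpy (the thresholds are invented, and it assumes a binarized image with black = 1, already deskewed; this is an analogy, not IMAQ code):

```python
import numpy as np

def candidate_staves(img, line_frac=0.6):
    """img: 2D binary array, 1 = black pixel, already deskewed.
    Returns (y0, y1) bands that contain exactly five staff-line groups."""
    row_sum = img.sum(axis=1)                 # 1. marginal sum of black pixels per row
    blank = row_sum == 0                      # 2. cut into horizontal stripes at blank rows
    stripes, start = [], None
    for y, b in enumerate(blank):
        if not b and start is None:
            start = y
        elif b and start is not None:
            stripes.append((start, y))
            start = None
    if start is not None:
        stripes.append((start, len(blank)))

    staves = []
    for y0, y1 in stripes:                    # 3./4. threshold the per-row sum inside each stripe
        lines = row_sum[y0:y1] > line_frac * img.shape[1]
        runs = np.count_nonzero(np.diff(lines.astype(int)) == 1) + int(lines[0])
        if runs == 5:                         # 5. exactly five line groups -> candidate staff
            staves.append((y0, y1))
    return staves
```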
  16. As for a labview wrapper to opencv, I'm only aware of this one (once downloaded but never tried). It's all: commercial, closed-source, windows-only (and probably x86 only), bound to an old opencv version. Given the complexity of opencv, though, I think any full-scale interface to it would be a major project. I'm not familiar with OMR, but I have done quite some OCR of historical books. For that I actually relied on an existing OS package, tesseract, which is not even the best-performing around, but I would never have dreamed of implementing OCR from scratch in labview [yes, there may be some IMAQ "OCR"; but seriously]. Well, maybe I could have used labview for just some routine image rectification and preprocessing task, but it turns out there are better ready-made tools around, e.g. scantailor. I don't know where you stand in this respect, but if there is some decent OMR around I would stick to it and at best call it from labview if I really had to. Out of curiosity, because I don't know OMR's workings: does removing the staffs in preprocessing really help the recognition, rather than complicate it? Well ok, the rhythm of your music is weird, I don't see any two bars with the same duration...
  17. One could argue that if the value comes from a global like in your example, that global could well be buried inside the VI. Anyway, if the workflow is such that the opened sub-UI has to be made aware of a value change at a later time, I agree with Tim that a better message-passing channel can be set up. Another UI possibility, I don't know if relevant for your case, would be to make your subVI's behavior modal (on entry; revert to standard on exit to avoid development trouble). That way your user would be prevented from doing anything like choosing a second time from a menu in the main UI.
  18. This is what I use. Would it do? RiseAndRunPanelVI.vi
  19. It wouldn't be the first time I've run into a camera/spectrometer/framegrabber/you-name-it SDK which comes with a more or less maintained driver/dll/sample cpp program set for both bitnesses, whereas the labview layer part of it was less tested, and ended up with some calling convention mistake (here one example). Small companies may not have enough resources to test extensively every possible software and hardware combination they cater for, and you can't blame them too much, especially if at least their postsale support is friendly and helpful. The dreadful case is when SDK development is outsourced and arguing is limited to the two-week window the subcontracted programmer has the device on the desk. First things first: do you have a statement from Stellarnet about 64bit LV support? I don't know if that is what you're referring to, but at http://www.stellarnet.us/software/#SPECTRAWIZLABVIEW I read "The software was entirely coded in LabVIEW 8.2 and interacts with the spectrometers via swdll.dll". Besides xp64 being itself quite unstable as a 64 bit OS, as I recall, LV8.2 was around fall 2006. Yes, about the same time as xp64.
  20. FWIW, this VI is saved as being part of a library, Instruments.lvlib, which is missing; hence the broken arrow.
  21. CAR 570134. http://forums.ni.com/t5/LabVIEW/bugs-in-Digital-Waveform-Graph/m-p/3245026#M945202
  22. That would be: http://forums.ni.com/t5/LabVIEW/bugs-in-Digital-Waveform-Graph/td-p/3244498
  23. The attached shows what I think is the bug, for me. And another one, btw, related to the always-visible plot legend in either classic or tree form. And a third: occasionally the Y scale gets wrong, but it is not clear to me when. I suspect race conditions while redrawing, as I've been able to reproduce the faulty naming only when using the event structure; labelling was always correct if I omitted the outer while and event frame, i.e. if I ran the inner code again and again from the ready-to-run VI state. DigitalPlotNames.vi ETA: tried twice to submit it as a service request and got "An error occurred. We are unable to create your Service Request at this time. Please try again later."
  24. Go ahead, please, I'm kinda busy these days. I sort of remember having read on some other thread that the digital graph was a nest of bugs, but if we don't report them one by one there is little chance of getting them worked on. I also had some occasional mess-ups of the plotted part, but not systematically enough to pinpoint when, and likely the trick of forcing a re-autoscale masks them.
  25. I've recently run into this bug, and this workaround seems to do the trick, "most" of the time. LV2015.