Everything posted by ensegre
-
So here is my situation: I have this software I've been providing for years, which depends, for a nice side functionality, on a toolbox which the end user may or may not have installed (show of hands: some static copy of LuaVIEW, in a local or a system directory, or through VIPM). I used to distribute this software as source, just a zip of VIs and LLBs, because after all it's for internal use, and it may need to be debugged on target. I never even bothered to create a proper project for it once and for all; it just grew like that. Now it occurs to me that it would be sensible to check at runtime for the presence of LuaVIEW, and to disable the additional features if it is not found. "Not found", presently, just means that some subVIs on the main BD are missing or broken. I figure it would be rather easy to check dynamically for a Bad status of some VI wrapping a LuaVIEW subVI, and use that as a flag. The best approximation to what I need that I could think of would be to wrap those broken parts in a conditional disable structure. But stock CD structures in the IDE can be conditioned only on bitness/target/RTE, not on some runtime value (and it makes sense that they can't; it's about compilation). Another option coming to mind would be to create a project for good, with conditional disable symbols, and two different builds, dependent/independent; but then should I distribute only the builds? Other options coming to mind look more cumbersome to me, e.g., calling all the relevant LuaVIEW VIs dynamically, thus having them not explicitly on the main BD. Or, I suppose "plugin library" may be the keyword, but then would I have to turn LuaVIEW into a plugin, which is above my head and not my call? Any elegant suggestion?
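For illustration only, since a VI snippet can't be pasted as text: a minimal sketch, in Python rather than LabVIEW, of the runtime feature-flag pattern described above (the LabVIEW analogue would be an Open VI Reference on a wrapper VI, using the returned error as the flag). The module name is hypothetical.

```python
# Hypothetical sketch of the runtime feature-flag pattern: probe the optional
# dependency once at startup, and gate the extra features on the result.
try:
    import luaview_bindings  # hypothetical optional toolbox
    HAVE_LUAVIEW = True
except ImportError:
    HAVE_LUAVIEW = False

def run_lua_task(script):
    """Extra functionality, available only when the toolbox is installed."""
    if not HAVE_LUAVIEW:
        raise RuntimeError("Lua features disabled: LuaVIEW not found")
    return luaview_bindings.run(script)
```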
-
As I find myself maintaining and supporting a legacy software of mine, conceived 12 years ago and still alive and kicking for many users, which at some point along the way picked up a LuaVIEW functionality and dependence, let me see if I got it right:
- LuaVIEW 1.2.2 supports the three platforms Linux, Mac, and Windows, but only at 32 bit.
- LuaVIEW 1.2.2 was only available for download at esi-cit (it isn't anymore, but I have my saved copy). It is not available through VIPM.
- LuaVIEW 2.0.x is available through VIPM, but it's Windows-only at the moment (just checked, VIPM for Linux doesn't even list it).
- LuaVIEW 2.0.x is the only LuaVIEW working on 64 bit (Windows).
I still have to check thoroughly that my software, for the very little Lua it uses, is compatible with 2.0. Anyway, what is the minimal version of LV that LuaVIEW 2.0 will run on? Note that I'm not asking for any special backport, just thinking about how to move on with the requirements of my software.
-
You might get somewhere with some simpler method, for instance:
1. compute the marginal pixel sum along verticals
2. divide the image in horizontal stripes, cutting where 1. is zero -> this isolates the pentagrams
3. for each of these stripes, compute the horizontal sum
4. a candidate staff is defined in terms of thresholds: the marginal sum of black pixels must be high enough and wide enough (staff lines are thicker than note stems)
5. check in each of these locations that [a slight morphological dilation of] the black pixels is exactly five lines high, and pass/fail
which of course assumes that the score has been properly oriented in preprocessing. And still, imperfections in the scanned image may fool a simple detection approach; for example the semibiscroma (sixty-fourth note) D at bar 3 might produce a false staff positive. All together I think complex pattern detection is an art. If there is any attempt at OMR out there, it really has to be smarter than any simple scheme like the one I could think of.
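A minimal sketch (Python/NumPy rather than LabVIEW, just to make the steps above concrete) under assumed conventions: img is a 2-D binarized page with black = 1, already deskewed; the threshold is a placeholder to be tuned, and the optional dilation step is omitted.

```python
import numpy as np

def runs(mask):
    """(start, stop) index pairs of consecutive True runs in a 1-D boolean mask."""
    d = np.diff(np.r_[0, mask.astype(int), 0])
    return list(zip(np.flatnonzero(d == 1), np.flatnonzero(d == -1)))

def find_staffs(img, line_thresh=0.5):
    row_sum = img.sum(axis=1)                         # 1. marginal sum per row
    staffs = []
    for top, bottom in runs(row_sum > 0):             # 2. cut into stripes at zeros
        profile = row_sum[top:bottom]                 # 3. sums inside the stripe
        dark = profile > line_thresh * profile.max()  # 4. high-enough threshold
        if len(runs(dark)) == 5:                      # 5. exactly five lines
            staffs.append((top, bottom))
    return staffs
```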
-
As for a LabVIEW wrapper to OpenCV, I'm only aware of this one (once downloaded but never tried). It's all: commercial, closed-source, Windows-only (and probably only x86), bound to an old OpenCV version. Given the complexity of OpenCV, though, I think any full-scale interface to it would be a major project. I'm not familiar with OMR, but I have done quite some OCR of historical books. For that I actually relied on an existing OS package, tesseract, which is not even the best performing around, but I would never have dreamed of implementing OCR from scratch using LabVIEW [yes, there may be some IMAQ "OCR"; but seriously]. Well, maybe I could have used LabVIEW for just some routine image rectification and preprocessing task, but it turns out there are better ready-made tools around, e.g. ScanTailor. I don't know where you stand in this respect, but if there is some decent OMR around I would stick to it, and at best call it from LabVIEW if I really had to. Out of curiosity, because I don't know the workings of OMR: does removing the staffs in preprocessing really help the recognition, rather than complicate it? Well, ok, the rhythm of your music is weird, I don't see any two bars with the same duration...
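Just to illustrate the "call an existing package instead of reimplementing" point: a sketch of driving tesseract from Python through the pytesseract wrapper (both the engine and the wrapper must be installed; the file name is a placeholder). The equivalent CLI call could also be fired from LabVIEW with System Exec.

```python
from PIL import Image
import pytesseract

# OCR a scanned page by delegating to the tesseract engine
text = pytesseract.image_to_string(Image.open("scanned_page.png"), lang="eng")
print(text)
```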
-
Calling an subVI asynchronously
ensegre replied to Doug Harper's topic in Application Design & Architecture
One could argue that if the value comes from a global, like in your example, that global could well be buried inside the VI. Anyway, if the workflow is such that the opened sub-UI has to be made aware of a value change at a later time, I agree with Tim that a better message-passing channel can be set up. Another UI possibility, I don't know if relevant to your case, would be to make your subVI behavior modal (on entry; revert to standard on exit, to avoid development trouble). That way your user would be prevented from doing anything like choosing a second time from a menu in the main UI.
-
Wouldn't be the first time I run into a camera/spectrometer/framegrabber/younameit SDK which comes with a more or less maintained driver/dll/sample cpp program set for both bitnesses, whereas the LabVIEW layer of it was less tested, and ended up with some calling convention mistake (here's one example). Small companies may not have enough resources to test extensively every possible software and hardware combination they cater for; you can't blame them too much, especially if at least their post-sale support is friendly and helpful. The dreadful case is when SDK development is outsourced and arguing is limited to the two-week window during which the subcontracted programmer has the device on his desk. First things first: do you have a statement from Stellarnet about 64-bit LV support? I don't know if that is what you're referring to, but at http://www.stellarnet.us/software/#SPECTRAWIZLABVIEW I read "The software was entirely coded in LabVIEW 8.2 and interacts with the spectrometers via swdll.dll". Besides xp64 being itself quite unstable as a 64-bit OS, as I recall, LV8.2 was around fall 2006. Yes, about the same time as xp64.
-
[CR] List all VISA session opened
ensegre replied to Benoit's topic in Code Repository (Uncertified)
FWIW, this VI is saved as part of a library, Instruments.lvlib, which is missing; hence the broken arrow.
-
Active plot order does not match actual plot order
ensegre replied to codcoder's topic in LabVIEW General
CAR 570134. http://forums.ni.com/t5/LabVIEW/bugs-in-Digital-Waveform-Graph/m-p/3245026#M945202
-
Active plot order does not match actual plot order
ensegre replied to codcoder's topic in LabVIEW General
That would be: http://forums.ni.com/t5/LabVIEW/bugs-in-Digital-Waveform-Graph/td-p/3244498
-
Active plot order does not match actual plot order
ensegre replied to codcoder's topic in LabVIEW General
The attached shows what I think is the bug, as it appears for me. And another one, btw, related to the always-visible plot legend, in either classic or tree form. And a third: occasionally the Y scale comes out wrong, but it is not clear to me when. I suspect race conditions while redrawing, as I've been able to reproduce the faulty naming only using the event structure; labelling was always correct if I omitted the outer while loop and event frame, i.e. if I ran the inner code again and again from the ready-to-run VI state. DigitalPlotNames.vi
ETA: tried twice to submit it as a service request and got "An error occurred. We are unable to create your Service Request at this time. Please try again later."
-
Active plot order does not match actual plot order
ensegre replied to codcoder's topic in LabVIEW General
Go ahead, please, I'm kinda busy these days. I sort of remember having read on some other thread that the digital graph was a nest of bugs, but if we don't report them one by one there is little chance of getting them fixed. I also had some occasional mess-ups of the plotted part, but not systematic enough to pinpoint when, and likely the trick of forcing re-autoscale masks them.
-
Active plot order does not match actual plot order
ensegre replied to codcoder's topic in LabVIEW General
I've recently run into this bug, and this workaround seems to do the trick, "most" of the time. LV2015.
-
Thresholding and flagging blobs larger than a given pixel size should do it, probably without even needing to subtract a background or filter the image in any way. Check the examples; e.g. "Particle Analysis Report" or "Binary Morphological Reconstruction" should give you a head start.
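In case it helps to see the idea spelled out: a rough sketch of the same threshold-and-size-filter approach in Python/OpenCV (not the IMAQ VIs themselves; the file name and the threshold/area values are placeholders to be tuned).

```python
import cv2

img = cv2.imread("bubbles.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)
# label connected blobs and read their pixel areas (label 0 is the background)
n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
bubbles = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 50]
print(len(bubbles), "blobs larger than 50 px")
```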
-
Build Number
ensegre replied to Neil Pate's topic in Application Builder, Installers and code distribution
IC?
-
Build Number
ensegre replied to Neil Pate's topic in Application Builder, Installers and code distribution
Not the point (IIUC?). I'm not interested in the local version of git installed; I'm creating a semver of my code (not of the git package) to be used by my pre- and post-build actions, like creating VIs supplying the version label (albeit fragile), versioning the build, renaming installer packages, etc., from my commit graph.
-
Build Number
ensegre replied to Neil Pate's topic in Application Builder, Installers and code distribution
It occurs to me that git describe sort of does that, returning the distance from the nearest tagged ancestor. So for my work as a single developer, and with enough discipline in giving tags in semver format, something like that should do for me. The nuisance is perhaps in the per-machine gymnastics I have to do on Windows, depending on which flavor of git is installed/bundled, and whether git is made known to the shell or not (I've seen git-bundle installers which offer not to do so, in order not to create command conflicts). In my snippet I already figured out ways for git-gui, not yet for SourceTree.
-
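Not my actual snippet (that one is a VI), but a minimal sketch of the idea, assuming tags are given in semver format and git is reachable from the shell; a pre-/post-build action could run something like this and parse the result.

```python
import subprocess

# "git describe" returns <nearest-tag>-<commits-since-tag>-g<short-hash>
desc = subprocess.run(
    ["git", "describe", "--tags", "--long"],
    capture_output=True, text=True, check=True,
).stdout.strip()
tag, distance, sha = desc.rsplit("-", 2)   # e.g. "v1.4.2", "7", "gabc1234"
print("version", tag.lstrip("v"), "build", distance, sha)
```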
Right, now that I turn to it, I see it. Harder, but perhaps not insurmountable: if I get it correctly, the showstopper is that there is no way to recast the saved class data back into classes by merit of some automatic dynamic mechanism; the hard way would be to include enough class-discriminating information in the JSON, and to supplement a lot of ad hoc, fragile code to place it back where needed. Anyway, tedious enough to wonder if it is really preferable to my original XML dump. And me too.
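For what the class-discriminating-information idea would look like in a language with built-in reflection (Python here, purely as an illustration; class and field names are hypothetical): store the class name next to the data on flatten, and look it up in a registry on unflatten.

```python
import json

REGISTRY = {}

def serializable(cls):
    """Register a class so it can be rebuilt from its JSON type tag."""
    REGISTRY[cls.__name__] = cls
    return cls

@serializable
class Heater:
    def __init__(self, setpoint=0.0):
        self.setpoint = setpoint

def flatten(obj):
    return json.dumps({"class": type(obj).__name__, "data": vars(obj)})

def unflatten(s):
    d = json.loads(s)
    obj = REGISTRY[d["class"]].__new__(REGISTRY[d["class"]])
    obj.__dict__.update(d["data"])  # tolerant of added/removed parameters
    return obj

h = unflatten(flatten(Heater(42.0)))
print(type(h).__name__, h.setpoint)  # -> Heater 42.0
```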
-
This is the way I'm currently exploring for flattening, and while coding it, it seems to work: make my top device class a child of JSON Object (not Value), and for each descendant class create an override Flatten.vi like the one shown. The first subVI in the chain reads the relevant configuration parameters of my Heater class and bundles them in a cluster; I just leverage previous VIs I coded for the configuration UI of the application. For a more complex class, the extension of the exercise seems ok as well (broken wires in the snippets are only due to the missing class context). Do you see drawbacks?
-
Coming to this thread with my monstrous configuration, which is a blob of about everything, including arrays of classes with children, which so far I dumped as 50KB+ XML, and starting to take cursory looks at the Lineator too, I'd like to re-ask the same question of https://lavag.org/topic/16217-cr-json-labview/page-6#entry109240 (2nd part, serialize classes): what exactly do I have to do in order to dump my monster as JSON and vice versa? Create a serializable.lvclass parent of all my other classes, perhaps, with two methods doing exactly what? Is there an example? Also, suppose the monster still mutates, parameters being added or moved: what recommendations would you give in order to maximise the information which can still be retrieved from an out-of-sync configuration dump? TIA, Enrico
-
Subpanels: what are the rules for order of operations?
ensegre replied to Aristos Queue's topic in LabVIEW General
So it is. The lifeline keeps blinking. It stops only when I press the stop button, or if I reopen the FP of the blinker and abort it (maybe obvious, but foolproofing).
-
Subpanels: what are the rules for order of operations?
ensegre replied to Aristos Queue's topic in LabVIEW General
Like you say, LV2014SP1 Linux. The lifeline of Run VI dies whether Auto Dispose is on or off.
Not the right references, IMHO. In the first, light from the bottom reaches the camera at 90° only because the bubbles act as scatterers. Bubbles cause motion and volume uncertainty, though the perpendicular position eases perspective corrections. If the OP cannot even tolerate the density change due to the addition of a small quantity of dye (which, given the image provided, somehow surprises me), I doubt he can put up with bubbles (or possibly other neutrally buoyant seeds). OTOH, I wonder what precision he is aiming at with his measurements, and whether his aim is achievable with a generic camera even after a careful geometric image calibration. The second one doesn't even look like a fluid column, but rather a LED matrix display with a "bubbles" animation mode. Not to mention shooting in the far IR or UV, where water is not transparent... but then he'd need a specialized sensor, and usually those don't come at such high resolutions. I'm taking this homework (?) at the face value stated:
- (untampered) water: then density can change with temperature or (extreme) ambient pressure. Or it is not really water, but some aqueous solution which has to be left alone.
- density: then this water must have been weighed.
- must use a camera from an oddball perspective: hence hardly 1% precision in distance measurements.
- meniscus curvature effects: not mentioned.
- if volume changes are small, a setup with a big reservoir plus a narrow riser tube may help (the level rise in the tube is the volume change divided by the tube cross-section, so a narrow tube amplifies a small change).
-
As a matter of fact, I have reorganized the sources a bit, packaged them as a project, and put them under RCS here: https://gitlab.com/enricosegre/LVVideo4Linux
-
Dark background, reflection off the interface either from above or from below (total internal reflection), perspective reconstruction? Maybe you can get some ideas from this: Tracing the boundaries of materials in transparent vessels using computer vision