Everything posted by drjdpowell

  1. Ah, non-uniform spacing, I see. Greg, my first thought would have been to truncate the weighting calculation and fitting to only a region around the point where the weights are non-negligible. Currently, the algorithm uses the entire data set in the calculation of each point, even though most of the data has near-zero weighting. For very large datasets this cost is very significant. — James BTW> Sometimes it can be worth using interpolation to produce a uniform spacing of non-uniform data, in order to be able to use more powerful analysis tools. Savitzky-Golay, for example, can be used to determine the first and higher-order derivatives of the data, for use in things like peak identification.
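     In text form (LabVIEW itself being graphical), a rough Python sketch of both ideas, with hypothetical data and numpy/scipy assumed:

        import numpy as np
        from scipy.signal import savgol_filter

        # Hypothetical non-uniform samples
        x = np.sort(np.random.uniform(0.0, 10.0, 2000))
        y = np.sin(x) + 0.1 * np.random.randn(x.size)

        def smooth_point(x0, bandwidth=0.5):
            # Truncate: fit only the region where the tricube weights
            # are non-negligible, not the entire data set
            lo, hi = np.searchsorted(x, (x0 - bandwidth, x0 + bandwidth))
            xs, ys = x[lo:hi], y[lo:hi]
            w = (1.0 - (np.abs(xs - x0) / bandwidth) ** 3) ** 3
            return np.polyval(np.polyfit(xs, ys, 1, w=np.sqrt(w)), x0)

        # Or: interpolate to uniform spacing, then Savitzky-Golay,
        # which also yields derivatives (deriv=1, 2, ...)
        xu = np.linspace(x.min(), x.max(), x.size)
        yu = np.interp(xu, x, y)
        dy = savgol_filter(yu, window_length=31, polyorder=3,
                           deriv=1, delta=xu[1] - xu[0])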
  2. I would not be able to help you for a few weeks as I’m off on vacation. I can see why your smoothing function is so slow, and I’m sure someone could easily improve its performance by orders of magnitude on large data sets. However, are you sure you would not be better served by using one of the many “Filter” VIs in LabVIEW? I tend to use the Savitzky-Golay filter, but there are many others that can be used for smoothing. They’ll be much, much faster.
  3. I don’t mean utility VIs for the User of the API; rather, I mean “utility” for writing the package internally. Conversion to/from valid JSON string format will be required in multiple places. I tend to call subVIs that are needed by the class methods, but not themselves using those classes in any way, “Utility” subVIs. There are a good 30+ VIs in dependencies. Copying all that to support my variant-to-JSON stuff is excessive. Compare it with just changing the one “remove whitespace” subVI to make the rest of the package independent. But as I said, it should be easy to make the variant stuff an optional add-on, for those who don’t mind adding a couple of OpenG packages. Is it OK to put unfinished stuff in the CR, even uncertified? I’m afraid I’m about to go on two weeks of vacation, but I could put what we have to this point in the CR and commit some free time to finishing it when I get back. Don’t whip up a thousand and one pretty polymorphic instances until we get the core stuff finished. At some point I switched from not using OpenG if possible to considering it “standard LabVIEW”. VIPM making it so easy probably contributed to this shift.
  4. The package needs a pair of utility VIs that convert strings to/from the JSON-valid form (in quotes, backslash-escaped control characters, possible Unicode encoding). The variant-to-JSON stuff could be kept separate as an optional feature that requires OpenG (it would be a lot of work to rewrite that without OpenG). Otherwise, I think I just used the faster version of “Trim Whitespace”, easily replaced.
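     As a Python analogue of that utility pair, the standard json module already does the escaping in both directions:

        import json

        def to_json_string(s):
            # Adds quotes, backslash-escapes control characters, and
            # \uXXXX-encodes non-ASCII (ensure_ascii defaults to True)
            return json.dumps(s)

        def from_json_string(js):
            return json.loads(js)

        assert to_json_string('say "hi"\n') == '"say \\"hi\\"\\n"'
        assert from_json_string('"say \\"hi\\"\\n"') == 'say "hi"\n'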
  5. Nicely done! Though I think you didn’t need “To String”, as “Flatten” does the exact same thing. I never thought of using the JSON string form internally to make the outer polymorphic API easier. Great idea. Not sure how many are still reading. Don’t like the OpenG stuff? I love the Variant DataTools. — James
  6. Breaks parser: backslash-escaped quotes \" in strings (e.g. "And so I said, \"Hello.\""). Sort of breaks: U64 and Extended-precision numbers, since you convert numbers to DBL internally. Note that in both my and Shaun’s prototypes, we keep the numbers in string form until the User specifies the format required. Possible issue: NaN, Inf and -Inf are valid numeric values that aren’t in the JSON standard. Might be an idea to add them as possible JSON values, or otherwise decide what to do with them when you write code to turn LabVIEW numerics into JSON (e.g. NaN would become “null”). — James
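     A Python sketch of two of those points: the parser can keep numeric tokens as text until the caller picks a type, and NaN/Inf need an explicit policy since they aren’t valid JSON (the null mapping below is one assumed choice):

        import json, math

        # Keep numbers in string form until the user specifies the format;
        # converting to DBL internally silently rounds this U64:
        doc = json.loads('{"big": 18446744073709551615}',
                         parse_int=str, parse_float=str)
        assert int(doc["big"]) == 18446744073709551615      # exact
        assert float(doc["big"]) == 1.8446744073709552e+19  # rounded

        def scalar_to_json(x):
            # Assumed policy: NaN, Inf and -Inf all map to null
            if isinstance(x, float) and not math.isfinite(x):
                return "null"
            return json.dumps(x)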
  7. A little free time this morning: Used arrays, but you could use some parsable string format like “->”. The polymorphic VI currently has only one instance of the many, many it would need. The lower part shows selection of a subset of the JSON that can be passed generically to lower code layers. — James JSON drjdpowell V4.zip
  8. Rephrase it as with respect to the programmer, then: the programmer shouldn’t have to understand the entire application and data structure from high-level to low-level at the same time. Sorry, I meant JSON “Objects”, not application-specific LVOOP objects. No custom code needed. One could certainly write a multi-level lookup API on top of what I have already. It should be quite easy (though tedious, with all the polymorphic instances). Wasted too many hours on this today, though. I don’t have any projects that actually need JSON. — James
  9. What I mean by “abstraction layers” is that no level of code should be handling that many levels of JSON. In your example, the same code that knows what a “glossary” is also knows how “GlossSeeAlso” is stored, five levels down. For example, imagine an “experiment setup” JSON object that contains a list of “instrument setup” objects corresponding to the different pieces of equipment. The code to set up the experiment could iterate over this list and pass the “instrument setup” objects to the corresponding instrument code. The full JSON object could be very complex, with many levels, but to the higher-level code it looks simple: just an array of generic things. And each piece of lower-level code is only looking at a subset of the full JSON object. No individual part of the code should be dealing with everything. BTW> I see there is another recent JSON attempt here. They use Variants.
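     A hypothetical Python sketch of that layering: the top level sees only an array of generic setups and hands each one to the code that understands it:

        import json

        experiment = json.loads("""
        {
          "name": "run 42",
          "instruments": [
            {"type": "scope",  "channel": 1, "range_v": 5.0},
            {"type": "supply", "voltage_v": 3.3}
          ]
        }
        """)

        # Hypothetical handlers; each knows only its own subtree
        handlers = {
            "scope":  lambda cfg: print("scope, channel", cfg["channel"]),
            "supply": lambda cfg: print("supply,", cfg["voltage_v"], "V"),
        }

        # To this layer, the whole document is just an array of things
        for setup in experiment["instruments"]:
            handlers[setup["type"]](setup)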
  10. If one does a lot of digging things out multiple object levels deep, then one could build something on top of this base that, say, uses some formatting to specify the levels (e.g. "Nest>>NestArray” as the name). But if one is using abstraction layers in one’s code, one won’t be doing that very often, as to each layer of code the corresponding JSON should appear quite simple. And I think it is more important to build in the inherent recursion of JSON in at the base, rather than a great multi-level lookup ability. Here, for example is another extension: a VI to convert any (OK, many) LabVIEW types into corresponding JSON. It leverages OpenG variant tools. It was very easy to make it work on nested clusters, because it just recursively walks along the cluster hierarchy and builds a corresponding JSON Object hierarchy. —James JSON drjdpowell V3.zip
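     A sketch of that multi-level lookup, with “>>” as the (hypothetical) level separator, in Python:

        def get_by_path(value, path, sep=">>"):
            # e.g. get_by_path(doc, "Nest>>NestArray>>2")
            for key in path.split(sep):
                value = value[int(key)] if isinstance(value, list) else value[key]
            return value

        doc = {"Nest": {"NestArray": [10, 20, 30]}}
        assert get_by_path(doc, "Nest>>NestArray>>2") == 30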
  11. I must code pretty slow. This took me 2-3 whole hours: JSON drjdpowell.zip Reads in or writes out JSON of any type, with nesting. One would still need to write methods to get/set the values or otherwise do what you want with it. And add code to check for invalid JSON input. — James Added later with methods written to allow an example of getting an array of doubles extracted from a JSON Object: JSON drjdpowell V2.zip Rather verbose. But one can wrap it in a “Get Array of DBL by name” method of JSON Object if you want.
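     The “Get Array of DBL by name” wrapper might look like this in Python (the name and coercion policy are assumptions):

        import json

        def get_array_of_dbl(obj, name):
            # Coerce each element of the named array to a double
            return [float(v) for v in obj[name]]

        doc = json.loads('{"trace": [1.5, 2, "3.5"]}')
        assert get_array_of_dbl(doc, "trace") == [1.5, 2.0, 3.5]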
  12. An advantage of Joe’s Variants, or the LVOOP design, is that the nesting is pretty trivial (just recursion). I think the LVOOP design would be the simplest. Not that I have any time to prove it. — James
  13. So’s Joe’s design, now that I look at it. Though yours seems more like his “flattened variant”; how are you going to do the nesting?
  14. Thoughts: If I were approaching this problem, I would create a LabVIEW datatype that matched the recursive structure of JSON. Using LVOOP, I would have the following classes:
      Parent: “JSON Value”: the parent of the three other classes (no data items)
      Child 1: “JSON Scalar”: holds a “scalar” -> string, number, true, false, null (in string form; no need to convert yet)
      Child 2: “JSON Array”: array of JSON Values
      Child 3: “JSON Object”: a set of name/JSON Value pairs (could be a Variant Attribute lookup table or some such)
      If I’m not missing something, this structure matches the JSON format one-to-one, and JSON Value could have methods to convert to or from JSON text format, plus methods to add, set, delete, or query its Values. Like Shaun, I would have the user specify the LabVIEW type they want explicitly and never deal in Variants. — James
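     A rough Python transcription of that class design (LVOOP is graphical, so the method names here are assumptions):

        class JSONValue:
            """Parent: no data items; children implement to_text."""
            def to_text(self) -> str:
                raise NotImplementedError

        class JSONScalar(JSONValue):
            def __init__(self, text: str):
                self.text = text    # kept in string form; no conversion yet

            def to_text(self) -> str:
                return self.text

        class JSONArray(JSONValue):
            def __init__(self, values=()):
                self.values = list(values)   # array of JSON Values

            def to_text(self) -> str:
                return "[" + ",".join(v.to_text() for v in self.values) + "]"

        class JSONObject(JSONValue):
            def __init__(self, pairs=()):
                self.pairs = dict(pairs)     # name -> JSON Value lookup table

            def to_text(self) -> str:
                body = ",".join('"%s":%s' % (k, v.to_text())
                                for k, v in self.pairs.items())
                return "{" + body + "}"

        obj = JSONObject({"a": JSONArray([JSONScalar("1"), JSONScalar("true")])})
        assert obj.to_text() == '{"a":[1,true]}'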
  15. Cameras showed up in MAX no problem, but both MAX names would, if selected, lead to images from only one of the cameras. It was a driver issue, at a lower level than MAX.
  16. I think I’ve seen it with NI-IMAQdx. It’s only with some USB cameras, such as webcams. And it is only when using identical models; one can use multiple cameras of different models, because they go into the Registry under their model names.
  17. It’s adapting to the type of the control it’s connected to, which I did not know it could do.
  18. Had that problem. I believe it is because the Windows software was never made to work with multiple cameras at once. Each identical camera is listed in the Windows Registry under identical names. I believe you can modify the registry, but that is not a satisfactory solution.
  19. Actually, I’ve seen this same effect with an Xcontrol of mine. The same effect occurs if you set the control to “synchronous” and hit the terminal. Note, though, that you aren’t directly firing events here; you’re triggering a property, and whatever code is behind the scenes is firing “Data Changed” events into the Xcontrol. And the associated data isn’t packaged with the event; it’s provided via the “Data In” input terminal. It isn’t clear that the problem is in the event system itself.
  20. From personal experience, whenever there is a problem somehow involving Xcontrols, my suspicion falls first on the Xcontrol.
  21. You can use User Events “many-to-one” too. Create an Event, register for it yourself, then pass the Event to other processes so they can send you messages.
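     In queue terms (a Python analogue, not the LabVIEW API), many-to-one is just several senders sharing the one queue you created and registered for:

        from queue import Queue
        from threading import Thread

        inbox = Queue()   # create it and "register for it yourself"

        def process(name, out):
            out.put(f"message from {name}")   # others send you messages

        for n in ("A", "B", "C"):
            Thread(target=process, args=(n, inbox)).start()

        for _ in range(3):
            print(inbox.get())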
  22. Can you provide a reference? I’ve only seen mention of that problem when the two events are in different event queues in the same event structure, such as a statically-registered event and a dynamically-registered one.
  23. Hi jg, Really like the tool, except, in addition to renaming, it also moves all the control labels: left centered for controls; right centered for indicators. I know this is a style preferred by many (but not me) and I wondered if this could be turned off? Or better yet, make it a separate tool. — James
  24. Random thoughts on User Events versus Queues:
      — The event registration refnum is a queue, and it is more instructive to compare this queue to regular Queues than to compare User Events to Queues. They have different extra features, but in the basics they are the same: they serve up the enqueued items, in order, to one receiver. And one should not have more than one receiver per queue.
      — The event structure itself can receive from multiple queues. In particular, the statically registered events are in a separate queue from any event reg. refnum attached dynamically. The order in which items are served up from different queues is determined by timestamp, so near-simultaneous events placed in different queues cannot be relied upon to be received in order. Items from the same queue, however, will always arrive in order, which means one can use a User Event to reliably carry messages to a process.
      — A User Event can be thought of as an array of queues: firing the Event is the same as enqueuing to all the queues, and the Event Registration Node serves to add its queue to the array. When created, of course, the User Event is an empty array (see the sketch below).
      Random random thoughts:
      — a “name” is a reference, the same as a refnum.
      — “subpaneling” is a silly term.
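     That array-of-queues mental model is easy to state in code. A minimal Python sketch (class and method names are mine, not LabVIEW’s):

        from queue import Queue

        class UserEvent:
            """A User Event modeled as an array of queues; empty when created."""
            def __init__(self):
                self.queues = []

            def register(self):
                # The Event Registration Node: add one queue to the array
                q = Queue()
                self.queues.append(q)
                return q        # one receiver should own this queue

            def fire(self, data):
                # Firing the Event == enqueuing to all registered queues
                for q in self.queues:
                    q.put(data)

        ev = UserEvent()
        reg = ev.register()
        ev.fire("stop")
        assert reg.get() == "stop"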
  25. From me. I find them a bit difficult. Just a bit.