Posts posted by drjdpowell
-
Nearly. Flatten adds things like quotes and brackets. For conversion, these need to be removed.
The package needs a pair of utility VIs that convert strings to/from the JSON valid form (in quotes, backslash control characters, possible unicode encoding).
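Since LabVIEW code can't be shown in text, here is a rough Python sketch of the pair of utilities described above (function names are mine, not the package's): converting a raw string to and from its JSON-quoted form, with backslash escapes for quotes and control characters and `\uXXXX` for other control codes.

```python
def to_json_string(s: str) -> str:
    """Wrap in quotes, escaping backslash, quote, and control characters."""
    escapes = {'"': '\\"', '\\': '\\\\', '\b': '\\b', '\f': '\\f',
               '\n': '\\n', '\r': '\\r', '\t': '\\t'}
    out = []
    for ch in s:
        if ch in escapes:
            out.append(escapes[ch])
        elif ord(ch) < 0x20:
            out.append('\\u%04x' % ord(ch))   # remaining control characters
        else:
            out.append(ch)
    return '"' + ''.join(out) + '"'

def from_json_string(j: str) -> str:
    """Inverse: strip the quotes and resolve backslash escapes."""
    assert j.startswith('"') and j.endswith('"')
    body, out, i = j[1:-1], [], 0
    unescapes = {'"': '"', '\\': '\\', '/': '/', 'b': '\b', 'f': '\f',
                 'n': '\n', 'r': '\r', 't': '\t'}
    while i < len(body):
        ch = body[i]
        if ch == '\\':
            nxt = body[i + 1]
            if nxt == 'u':                    # \uXXXX unicode escape
                out.append(chr(int(body[i + 2:i + 6], 16)))
                i += 6
            else:
                out.append(unescapes[nxt])
                i += 2
        else:
            out.append(ch)
            i += 1
    return ''.join(out)
```

These two are exact inverses, which is the property the package's internal utilities would need.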
It's not a case of liking. There's some great stuff in there. It's a case that not everyone can use OpenG stuff. It's also not really appropriate to expect someone to install a shedload of 3rd party stuff that isn't required just to use a small API (I had to install OpenG especially just to look at your code and uninstall it afterwards)
The variant-to-JSON stuff could be kept separate as an optional feature that requires OpenG (a lot of work to rewrite that without OpenG). Otherwise, I think I just used the faster version of "Trim Whitespace", easily replaced.
-
I made a slight change to your lookup by adding a "To String" in each of the classes to be overridden. This means that the polymorphic VIs become very simple (not to mention that I could just replace my lookup with yours, change terminals and, hey presto, all the polys I've already created, with icons, slot straight in).
Nicely done!
Though I think you didn’t need “To String”, as “Flatten” does the exact same thing. I never thought of using the JSON string form internally to make the outer polymorphic API easier. Great idea.
I've back-saved it to 2009 so others can play.
Not sure how many are still reading.
Next on my list is to get rid of the OpenG stuff.
Don't like the OpenG stuff? I love the Variant DataTools.
— James
-
Because JSON is a pretty variable format, I'm looking for JSON (valid or not) that breaks the parser. If anyone wants to give it a try, please feel free to contact me here, or at joe@underflowsoftware.com.
Breaks parser:
Backslash quotes \" in strings (e.g. "And so I said, \"Hello.\"")
Sort of breaks:
U64 and Extended precision numbers, since you convert numbers to DBL internally. Note that in both my and Shaun’s prototypes, we keep the numbers in string form until the User specifies the format required.
Possible issue?:
NaN, Inf and -Inf: valid numeric values that aren't in the JSON standard. Might be an idea to add them as possible JSON values. Or otherwise decide what to do with them when you write code to turn LabVIEW numerics into JSON (e.g. NaN could become null).
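To illustrate both points in text form (a Python sketch of my own, not the package's API): a scanner can store the numeric token verbatim and convert only when the user asks for a concrete type, so U64 and extended-precision values are never forced through DBL; and the writer needs an explicit policy for NaN and the infinities, since the JSON grammar has no representation for them.

```python
import math

class JSONNumber:
    """Keep the numeric token in string form until the caller picks a type."""
    def __init__(self, token: str):
        self.token = token          # e.g. "18446744073709551615" stays exact

    def as_float(self) -> float:
        return float(self.token)

    def as_int(self) -> int:        # full-range U64 values survive this path
        return int(self.token)

def number_to_json(x: float) -> str:
    """One possible policy for values JSON cannot represent."""
    if math.isnan(x):
        return "null"               # e.g. map NaN to null
    if math.isinf(x):
        return '"Inf"' if x > 0 else '"-Inf"'   # or raise an error instead
    return repr(x)
```

The largest U64, 18446744073709551615, is not representable as a DBL, so `as_int` on the stored token returns the exact value where a DBL round-trip would not.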
— James
-
I think you just need a better lookup and you'll be there! (with bells on)
A little free time this morning:
Used arrays, but you could use some parsable string format like “->”. The polymorphic VI currently has only one instance of the many, many it would need. The lower part shows selection of a subset of the JSON that can be passed generically to lower code layers.
— James
-
The code knows nothing. It doesn't know what a glossary IS, only that it is a field name it should look up for the programmer; it just gets what the programmer asks for. If the JSON structure changes, no changes to the API are needed. It doesn't care what the structure of the JSON object is; it's just an accessor to the fields within the JSON object - any JSON object.
Rephrase that as being with respect to the programmer, then: the programmer shouldn't have to understand the entire application and data structure, from high level to low level, at the same time.
There is nothing stopping you doing this, but this isn't the responsibility of a parser. There is nothing to stop you creating an "object" output polymorphic case for your "experiment setup" (or indeed a whole bunch of them); you just need to tell it what fields it consists of and add the terminal. However, that polymorphic case will be fixed and specific to your application, and not reusable on other projects (as it is with direct conversion to variant clusters).
Sorry, I meant JSON "Objects", not application-specific LVOOP objects. No custom code needed.
I think you just need a better lookup and you'll be there! (with bells on)
One could certainly write a multi-level lookup API on top of what I have already. Should be quite easy (though tedious with all the polymorphic instances). Wasted too many hours on this today, though. I don't have any projects that actually need JSON.
— James
-
What I mean by "abstraction layers" is that no level of code should be handling that many levels of JSON. In your example, the same code that knows what a "glossary" is also knows how "GlossSeeAlso" is stored, five levels deep.
For example, imagine an “experiment setup” JSON object that contains a list of “instrument setup” objects corresponding to the different pieces of equipment. The code to setup the experiment could increment over this list and pass the "equipment setup” objects to the corresponding instrument code. The full JSON object could be very complex with many levels, but to the higher-level code it looks simple; just an array of generic things. And each piece of lower-level code is only looking at a subset of the full JSON object. No individual part of the code should be dealing with everything.
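The layering argument can be illustrated with a toy sketch (the structure and field names here are invented for the example, using Python dicts to stand in for JSON objects): the experiment-level code only sees a list of opaque sub-objects, and each instrument handler only sees its own sub-object.

```python
experiment_setup = {
    "name": "Run 42",
    "instruments": [
        {"type": "DMM",   "range": 10.0},
        {"type": "Scope", "timebase": 1e-3},
    ],
}

def setup_instrument(setup: dict) -> str:
    # Lower-level code: knows its own fields, nothing about "experiment".
    return "configured " + setup["type"]

def setup_experiment(setup: dict) -> list:
    # Higher-level code: just iterates over an array of generic things,
    # passing each sub-object down without inspecting its internals.
    return [setup_instrument(inst) for inst in setup["instruments"]]
```

Neither layer ever handles the full depth of the document at once, which is why a deep multi-level lookup is rarely needed.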
BTW, I see there is another recent JSON attempt here. They use Variants.
-
How about a slightly modified JSON of one of your examples? (Get the "NestArray" Values)
{"T1":456.789 , "T2":"test2", "Nest":{"ZZ":123,"NestArray":[1.3,2.4,1.6] }}
I don't think it is sufficient to simply have a look-up as you have here, but it is close.
If one does a lot of digging things out multiple object levels deep, then one could build something on top of this base that, say, uses some formatting to specify the levels (e.g. "Nest>>NestArray” as the name). But if one is using abstraction layers in one’s code, one won’t be doing that very often, as to each layer of code the corresponding JSON should appear quite simple. And I think it is more important to build in the inherent recursion of JSON in at the base, rather than a great multi-level lookup ability.
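In a text language, the ">>"-delimited lookup suggested above is a few lines (the delimiter is from the post; the implementation is my own Python illustration, with dicts standing in for JSON Objects):

```python
import json

def lookup(obj, path: str):
    """Walk nested objects using a '>>'-delimited path of field names."""
    for key in path.split(">>"):
        obj = obj[key]
    return obj

# The modified example JSON from the post:
doc = json.loads(
    '{"T1":456.789, "T2":"test2", "Nest":{"ZZ":123,"NestArray":[1.3,2.4,1.6]}}'
)
```

So `lookup(doc, "Nest>>NestArray")` digs out the nested array in one call, while single-level lookups still work unchanged.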
Here, for example is another extension: a VI to convert any (OK, many) LabVIEW types into corresponding JSON. It leverages OpenG variant tools. It was very easy to make it work on nested clusters, because it just recursively walks along the cluster hierarchy and builds a corresponding JSON Object hierarchy.
—James
-
Yup. Keep drinking the Kool-Aid. It probably took me the same amount of time to write the concept as it did for you to read my posts... lol
I must code pretty slow. This took me 2-3 whole hours:
Reads in or writes out JSON of any type, with nesting. One would still need to write methods to get/set the values or otherwise do what you want with it. And add code to check for invalid JSON input.
— James
Added later, with methods written to allow an example of extracting an array of doubles from a JSON Object:
Rather verbose. But one can wrap it in a “Get Array of DBL by name” method of JSON Object if you want.
-
An advantage of Joe's Variants, or the LVOOP design, is that the nesting is pretty trivial (just recursion). I think the LVOOP design would be the simplest. Not that I have any time to prove it.
— James
-
This is exactly what my example is (analogously - Classic LV to LVOOP).
So's Joe's design, now that I look at it. Though yours seems more like his "flattened variant"; how are you going to do the nesting?
-
Thoughts:
If I were approaching this problem, I would create a LabVIEW datatype that matched the recursive structure of JSON. Using LVOOP, I would have the following classes:
Parent: "JSON Value”: the parent of three other classes (no data items)
Child 1: "JSON Scalar": holds a "scalar" -> string, number, true, false, null (in string form; no need to convert yet)
Child 2: “JSON Array”: array of JSON Values
Child 3: “JSON Object”: set name/JSON Value pairs (could be a Variant Attribute lookup table or some such)
If I’m not missing something, this structure one-to-one matches the JSON format, and JSON Value could have methods to convert to or from JSON text format. Plus methods to add, set, delete, or query its Values. Like Shaun, I would have the user specify the LabVIEW type they want explicitly and never deal in Variants.
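As a text-language sketch of that class design (Python standing in for LVOOP; class and method names follow the post, the code itself is my illustration): one parent with three children mirroring the recursive JSON grammar, scalars kept in string form, and a `to_json` method that recurses naturally.

```python
class JSONValue:
    """Parent class: no data items, just the common interface."""
    def to_json(self) -> str:
        raise NotImplementedError

class JSONScalar(JSONValue):          # string, number, true, false, null
    def __init__(self, text: str):
        self.text = text              # kept as text; no conversion yet

    def to_json(self) -> str:
        return self.text

class JSONArray(JSONValue):           # array of JSON Values
    def __init__(self, items):
        self.items = list(items)

    def to_json(self) -> str:
        return "[" + ",".join(v.to_json() for v in self.items) + "]"

class JSONObject(JSONValue):          # set of name/JSON Value pairs
    def __init__(self, pairs: dict):
        self.pairs = dict(pairs)      # a Variant-attribute table in LabVIEW

    def to_json(self) -> str:
        return "{" + ",".join('"%s":%s' % (k, v.to_json())
                              for k, v in self.pairs.items()) + "}"
```

Because each child's `to_json` calls its children's `to_json`, arbitrary nesting falls out of the structure for free, which is the point of matching the format one-to-one.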
— James
-
Cameras showed up in MAX no problem, but both MAX names would, if selected, lead to images from only one of the cameras. It was a driver issue, at a lower level than MAX.
-
Really? I've got it working just fine on my PC.
I wonder: are you using NI-IMAQdx? Or something different?
I think I’ve seen it with NI-IMAQdx. It’s only with some USB cameras, such as webcams. And it is only when using identical models; one can use multiple cameras of different models, because they go into the Registry under their model names.
-
It’s adapting to the type of the control it’s connected to, which I did not know it could do.
-
Had that problem. I believe it is because the Windows software was never made to work with multiple cameras at once. Each identical camera is listed in the Windows Registry under identical names. I believe you can modify the registry, but that is not a satisfactory solution.
-
I had the problem with an XControl which I was peppering with events to show that XControl value updates often lag behind the calling VI. Doing this I saw that when queuing up many events fast enough, the order became mixed up.
Actually, I've seen this same effect with an XControl of mine. The same effect occurs if you set the control to "synchronous" and hit the terminal. Note, though, that you aren't directly firing events here; you're triggering a property, and whatever code is behind the scenes is firing "Data Changed" events into the XControl. And the associated data isn't packaged with the event; it's provided via the "Data In" input terminal. It isn't clear that the problem is in the event system itself.
-
From personal experience, whenever there is a problem somehow involving XControls, my suspicion falls first on the XControl.
-
Whilst the mechanics may be thought of in that way and indeed, both may be coerced to emulate the properties of the other, they are actually different topologies. Queues are "many-to-one" and events are "one-to-many".
You can use User Events "many-to-one" too: create an Event, register for it yourself, then pass the Event to other processes so they can send you messages.
-
In rare cases the event queue can get muddled if the events are sent more quickly than the timestamp resolution can take care of. This leads to the events possibly being received IN THE WRONG ORDER. This is within the same event queue, and can be provoked even with the same event.
Can you provide a reference? I’ve only seen mention of that problem when the two events are in different event queues in the same event structure, such as a statically-registered event and a dynamically-registered one.
-
Hi jg,
Really like the tool, except that, in addition to renaming, it also moves all the control labels: left-centered for controls; right-centered for indicators. I know this is a style preferred by many (but not me), and I wonder if it could be turned off? Or better yet, made a separate tool.
— James
-
Random thoughts on User Events versus Queues:
— The event registration refnum is a queue, and it is more instructive to compare this queue to regular Queues rather than compare User Events to Queues. They have different extra features, but in basics they are the same; they serve up the enqueued items to one receiver in order. And one should not have more than one receiver per queue.
— The event structure itself can receive from multiple queues. In particular, the statically registered events are in a separate queue from any event reg. refnum attached dynamically. The order that items are served up from different queues is determined by timestamp. This means that near simultaneous events placed in different queues cannot be relied upon to be received in order. However, items from the same queue will always be in order, which means one can use a User Event to reliably carry messages to a process.
— A User Event can be thought of as an array of queues; firing the Event is the same as enqueuing to all the queues, and the Event Registration Node serves to add its queue to the array. When created, of course, the User Event is an empty array.
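That "array of queues" picture can be modeled in a few lines of Python (purely a conceptual model of my own, not how LabVIEW actually implements User Events):

```python
from collections import deque

class UserEvent:
    def __init__(self):
        self.queues = []              # starts as an empty array of queues

    def register(self) -> deque:
        q = deque()                   # an Event Registration Node adds its queue
        self.queues.append(q)
        return q

    def fire(self, data):
        for q in self.queues:         # firing enqueues to every registered queue
            q.append(data)
```

The model also captures two consequences: a registration only receives events fired after it was created, and items within any one registration's queue stay in order.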
Random random thoughts:
— a “name” is a reference, the same as a refnum.
— “subpaneling” is a silly term
-
-
I'm not sure where the claim comes from that understanding unsigned numbers is difficult.
From me. I find them a bit difficult. Just a bit.
-
-
The timers output unsigned U32 numbers, and the subtraction will produce the same U32. This cannot be less than zero, so that code never executes. That is not a problem, since the subtraction already handles the rollover satisfactorily; for example: 1 − 4294967295 = 2.
Understanding unsigned numbers and rollover is tricky; I usually have to experiment to remind myself how it works every time I have to do it.
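The rollover arithmetic can be checked in a couple of lines (emulated here in Python by masking to 32 bits, since Python integers don't wrap on their own):

```python
U32_MASK = 0xFFFFFFFF

def u32_sub(a: int, b: int) -> int:
    """Subtraction with U32 wraparound, as the LabVIEW U32 subtract behaves."""
    return (a - b) & U32_MASK

# A timer that wrapped: the earlier tick was at the top of the range,
# the later tick just past zero. The subtraction still gives the true
# elapsed count: 1 - 4294967295 rolls over to 2.
earlier = 4294967295        # 2**32 - 1
later = 1
elapsed = u32_sub(later, earlier)
```

This is exactly why the "less than zero" branch is dead code: the wrapped result is already the correct elapsed time.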
— James
-
Welcome to the weird and wonderful frustrating world of XControls.
Presumably they don’t bother to update the display when the FP is closed, but I would have thought that some event should be triggered when you open the FP so you could set the display state. But I couldn’t get your control to work.
Reading JSON into LabVIEW
in Code In-Development
Posted
I don’t mean utility VIs for the User of the API, rather, I mean “utility” for writing the package internally. Conversion to/from valid JSON string format will be required in multiple places. I tend to call subVIs, needed by the class methods, but not themselves using those classes in any way, “Utility” subVIs.
There are a good 30+ VIs in dependencies. Copying all that to support my variant-to-JSON stuff is excessive. Compare it with just changing the one "remove whitespace" subVI to make the rest of the package independent. But as I said, it should be easy to make the variant stuff an optional add-on, for those who don't mind adding a couple of OpenG packages.
Is it OK to put unfinished stuff in the CR, even uncertified? I'm afraid I'm about to go on two weeks' vacation, but I could put what we have to this point in the CR and commit some free time to finishing it when I get back. Don't whip up a thousand and one pretty polymorphic instances until we get the core stuff finished.
At some point I switched from not using OpenG if possible, to considering it “standard LabVIEW”. VIPM making it so easy probably contributed to this shift.