LabVIEW 2013 JSON Vis


Very pleased to see the new JSON Encode and Decode VIs in the palettes of LabVIEW 2013. I've looked at using them instead of the various libraries out there and, now that I've had a chance to play, I'm in two minds about whether to convert my current apps and use them in future, or stick with those 3rd party libraries.

 

Let's start off by saying they work great :worshippy: They are orders of magnitude faster than the 3rd party ones and they adhere vehemently to the JSON standard. It's the last bit I'm in two minds about.

 

JSON is a subset of JavaScript (ECMAScript). JavaScript is dynamically typed, which means that any variable can hold any type; although a string may have quotes around it, that does not preclude inserting it into, or operating on it as, a numeric type. While the JSON spec does specify that string types be encased in quotes, JavaScript (and PHP, for that matter) programmers don't really care, and their code doesn't break whether the quotes are present or not. It is therefore very common to see quotes around numerics, and even quotes left off strings, and most parsers will cope with this.
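To illustrate with a minimal sketch (Python standing in here for any dynamically typed consumer; the field names are invented): a quoted numeric is perfectly legal JSON, it just decodes as a string, and tolerant code simply coerces it wherever a number is needed.

```python
import json

# A quoted numeric is legal JSON -- it just decodes as a string.
doc = json.loads('{"price": "102.95507", "amount": 0.3333}')

# A dynamically typed consumer coerces wherever a number is needed:
total = float(doc["price"]) * doc["amount"]
```

The quoting is invisible to the consumer; the coercion happens at the point of use.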

 

LabVIEW is strictly typed, and when the JSON Decode VI encounters quotes around a field you have defined as, say, a double, it will error and then refuse to process any further fields. This is a right royal pain! It also misses a trick that would make our lives so much easier and our code much simpler.

 

Take, for example, the following real JSON stream from MtGox.

 

 

{
    "channel": "dbf1dee9-4f2e-4a08-8cb7-748919a71b21",
    "channel_name": "trade.BTC",
    "op": "private",
    "origin": "broadcast",
    "private": "trade",
    "trade": {
        "type": "trade",
        "date": 1376196221,
        "amount": 0.3333,
        "price": 102.95507,
        "tid": "1376196221462784",
        "amount_int": "33330000",
        "price_int": "10295507",
        "item": "BTC",
        "price_currency": "USD",
        "trade_type": "ask",
        "primary": "Y",
        "properties": "limit"
    }
}

 

The "price_int" and "amount_int" fields are encased in quotes when quite clearly they are integers and, more importantly, we need to manipulate them as integers. This forces the use of cluster elements that are strings, followed by converting those fields to the appropriate type. It is compounded further because the structure is nested, which means we have to unbundle all of the elements and then re-bundle to our desired types, as we cannot use a single cluster definition. Additionally, the "date" is a numeric of the correct type, but that is not very useful in this scenario since it will need to be converted to a string. So defining that field in the decoding cluster as a string would have been a bonus.
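As a rough sketch of the dance this forces (in Python rather than G, with the message trimmed to the relevant fields): every quoted integer decodes as a string and has to be converted field by field.

```python
import json

# Trimmed version of the MtGox trade message above.
raw = json.loads(
    '{"trade": {"date": 1376196221, "amount": 0.3333,'
    ' "price_int": "10295507", "amount_int": "33330000"}}'
)
trade = raw["trade"]

# The quoted integers decode as strings, so each one needs its own
# explicit conversion -- the textual equivalent of unbundling string
# cluster elements and re-bundling them as the desired types.
price_int = int(trade["price_int"])
amount_int = int(trade["amount_int"])
date_str = str(trade["date"])  # and the numeric "date" goes the other way
```

Multiply that by every nested field and the decode VI ends up surrounded by scaffolding.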

 

 

This is the conversion using the native JSON decode.vi.

 

JSON1.png

 

 

This is using the JSON API available in the CR (Code Repository).

 

JSON2.png

 

The JSON API in the CR is much more forgiving in that the cluster alone decides the type, so type conversion can be done transparently by defining the cluster, regardless of whether a value is quoted or not. This yields a much simpler, easier-to-maintain VI and, should the server generating the JSON decide to strictly adhere to removing quotes from integers, it will not break our code as it would with the native VIs.
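In pseudo-Python, that behaviour amounts to something like this (a hypothetical decode_to_cluster helper of my own invention, not the actual API): the target type map plays the role of the cluster, and quoted and unquoted values decode identically.

```python
import json

def decode_to_cluster(text, cluster):
    """Coerce each field to the type the 'cluster' demands,
    regardless of whether the JSON value was quoted or not."""
    raw = json.loads(text)
    return {field: typ(raw[field]) for field, typ in cluster.items()}

cluster = {"date": str, "price_int": int, "amount_int": int}

quoted = decode_to_cluster(
    '{"date": 1376196221, "price_int": "10295507", "amount_int": "33330000"}',
    cluster)
unquoted = decode_to_cluster(
    '{"date": "1376196221", "price_int": 10295507, "amount_int": 33330000}',
    cluster)
# Same result either way, so a server tidying up its quoting
# cannot break the consumer.
```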

 

The native JSON Decode has a "strict validation" boolean whose help states:

 

strict validation determines whether LabVIEW returns an error when the JSON object contains items not defined in the input cluster. If strict validation is FALSE, JSON objects may contain items not defined in the cluster.

 

 

It would be useful if this boolean also disabled type checking of quoted strings. It would also be useful if it didn't stop at the first field it couldn't interpret and instead tried harder to continue. I could live without the latter, but I'm not sure I can without the former - hence my ambivalence.
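For what it's worth, the "try harder to continue" behaviour is cheap to express. A sketch in Python (lenient_decode is my own invented name, not anything in LabVIEW or the CR API): collect per-field errors instead of aborting at the first one.

```python
import json

def lenient_decode(text, cluster):
    """Decode what we can; report problem fields instead of
    stopping dead at the first one that fails to convert."""
    raw = json.loads(text)
    out, errors = {}, []
    for field, typ in cluster.items():
        try:
            out[field] = typ(raw[field])
        except (KeyError, ValueError, TypeError):
            errors.append(field)  # note it and keep going
    return out, errors

out, errors = lenient_decode(
    '{"price_int": "10295507", "amount_int": "not a number"}',
    {"price_int": int, "amount_int": int, "item": str})
# out carries the one good field; errors names the two bad ones.
```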

 

Did I mention how fast the native VIs are?  :worshippy:

Edited by ShaunR

The mutability of types in ECMA/JavaScript always made me wonder how reliably one can interact with strictly typed languages. I suppose that to use the built-in version you need to enforce a stricter version of the JSON schema you'll be reading in, one that includes types. My guess is this limitation is also a good part of why it's so fast.

 

When converting between dynamically and strictly typed languages, I suppose it's expected that there needs to be an extra burden somewhere to enforce types. It's unfortunate, though, that LabVIEW leaves this burden on the writer's side. Of course, what happens if you don't have control over the source of the JSON? Do you honestly think MtGox has a specification that price_int and amount_int are to be serialized as strings? Who's to say, though it seems rather unlikely. What happens if one day price_int suddenly comes in as an integer but amount_int remains a string? You'd need another workaround.

 

I'd argue that unless you have direct control over the source of the JSON, or unless there is a schema or documentation defining types (and obviously structure) within that JSON, then the native API shouldn't be used. Shame, because it is really nice. Really, it's nothing new-- serialized data is only useful if you know how to read it.

I believe this issue was raised in the beta, and the NI response was something along the lines of "Meh..."

 

Pretty much my response when I found out it behaved like this.

 

The mutability of types in ECMA/JavaScript always made me wonder how reliably one can interact with strictly typed languages. I suppose that to use the built-in version you need to enforce a stricter version of the JSON schema you'll be reading in, one that includes types. My guess is this limitation is also a good part of why it's so fast.

 

When converting between dynamically and strictly typed languages, I suppose it's expected that there needs to be an extra burden somewhere to enforce types. It's unfortunate, though, that LabVIEW leaves this burden on the writer's side. Of course, what happens if you don't have control over the source of the JSON? Do you honestly think MtGox has a specification that price_int and amount_int are to be serialized as strings? Who's to say, though it seems rather unlikely. What happens if one day price_int suddenly comes in as an integer but amount_int remains a string? You'd need another workaround.

 

I'd argue that unless you have direct control over the source of the JSON, or unless there is a schema or documentation defining types (and obviously structure) within that JSON, then the native API shouldn't be used. Shame, because it is really nice. Really, it's nothing new-- serialized data is only useful if you know how to read it.

 

Indeed. Although I don't think the decision has anything to do with speed; that comes from the fact it is compiled rather than LabVIEW code. It's just an unnecessarily literal interpretation of "string" that hobbles the API. You see this kind of thing quite often when something is designed purely from the spec without use cases. There are no advantages to this behaviour apart from specmanship, but there are serious disadvantages IMHO.

 

The frustrating thing for me (as I think I said to AQ when he was talking about his serializer) is that it is probably the one feature in LV2013 that would make me consider switching from LV2009. They've fixed a few other things I wasn't happy with from 2010 onwards that made me completely resistant to upgrading, and the JSON VIs were just the sort of feature upgrade I have been waiting for all these years. So, as excited as I was to see these in the palettes, I think I will stay with 2009 and use the 3rd party libraries, as that is a lot cleaner, more compact and vastly safer.


There are many use cases that the new JSON primitives miss. There are a few that they hit, and those few are home runs. Essentially, if the same code base that does the serialization is the same code base that does the deserialization -- i.e., they are chunks of code that rev in lockstep and where there's an agreed schema between the parties -- the prims are great. Step outside that box and you may find that making multiple "attempt to parse this" calls into the JSON prims is more expensive than getting some VIs somewhere and doing the parsing yourself. That was my experience, anyway.

 

For those who are familiar with my ongoing character lineator project (which released a new version on Monday of NIWeek), the new prims are useless for me. I still have to do my own parsing of the JSON in order to handle all the data mutation cases that serialization requires.

There are many use cases that the new JSON primitives miss. There are a few that they hit, and those few are home runs. Essentially, if the same code base that does the serialization is the same code base that does the deserialization -- i.e., they are chunks of code that rev in lockstep and where there's an agreed schema between the parties -- the prims are great. Step outside that box and you may find that making multiple "attempt to parse this" calls into the JSON prims is more expensive than getting some VIs somewhere and doing the parsing yourself. That was my experience, anyway.

 

For those who are familiar with my ongoing character lineator project (which released a new version on Monday of NIWeek), the new prims are useless for me. I still have to do my own parsing of the JSON in order to handle all the data mutation cases that serialization requires.

 

I wouldn't say "many"; probably just two - the "string" strictness and the fact that you can't wire an object to it. The former is trivial to resolve and based on a whim; if it did not behave in this fashion, it would be great for 99% of cases. The latter (the other 1%)? I expect that is extremely difficult and was probably flagged as "too much hassle", but it would have had far-reaching positive implications for networked transfers and possibly dynamic instantiation of objects.

