Manudelavega

Limiting width of numeric string

Recommended Posts

As far as I know, there are two ways to define how a numeric is formatted into a string using the floating-point representation (%f):

 

- Setting the number of significant digits (%_2f)

- Setting the number of digits of precision (%.2f)

 

However, I often find myself needing a way to limit the width of the string. For example, I always want 3 digits maximum. Using significant digits won't work for a number like 0.0012, since it only has 2 significant digits: the string will be 0.0012 even though I would like to see 0.00 (so just 0 if we hide trailing zeros).

 

On the other hand, using digits of precision won't work for a number like 123.45, since it has 2 digits of precision: the string will be 123.45 even though I would like to see 123.

 

Now the obvious brute-force way would be to take a string subset and keep only the first 3 digits (I guarantee that my number is never bigger than 999). But I really hope there is a more elegant way. I can't imagine that nobody has run into this requirement before. How did you tackle it?
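[Editor's note: in C/printf terms (what an Arduino at the other end of the link would use), the two modes map roughly to %g (significant digits) and %.Nf (fixed precision), and the mismatch described above is easy to demonstrate. A quick sketch; the function name is just for illustration:]

```c
#include <stdio.h>

// Prints how each mode renders the two troublesome values from the post.
void demo_formats(void) {
    printf("%.3g\n", 0.0012);   // significant digits: "0.0012" -- width grows
    printf("%.3g\n", 123.45);   // "123"    -- fine
    printf("%.2f\n", 0.0012);   // fixed precision: "0.00" -- fine
    printf("%.2f\n", 123.45);   // "123.45" -- width grows
}
```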


Since you're facing a "cosmetic" issue, string manipulation seems OK. If you need rounding, you'll have to get a bit clever: maybe count how many characters sit to the left of the decimal point and use Number To Fractional String.


It's not just a cosmetic issue: the string is sent to a serial device through VISA at quite a high rate (a new value as often as every 10 ms), so I want to find the most optimized solution...


Compare the magnitude of the number and then choose between %f and %_f. At three digits, any value between -0.001 and 0.001 would get %f (fixed precision); everything else gets %_f (significant digits). I may be off by a zero, but the premise stands.
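[Editor's note: that compare-and-choose is only a few lines on the C side, where %g is the closest analogue to LabVIEW's significant-digit mode for values under 1000. A sketch with a hypothetical format3 helper; the 0.1 threshold is a choice made so that 0.0012 collapses to "0.00" while 0.123 keeps its three digits, per the examples in this thread:]

```c
#include <math.h>
#include <stdio.h>

// Hypothetical helper implementing the compare-and-choose above.
// Assumes |v| <= 999, per the original post.
void format3(double v, char *out, size_t n) {
    if (fabs(v) < 0.1)
        snprintf(out, n, "%.2f", v);  // small values: fixed precision, 0.0012 -> "0.00"
    else
        snprintf(out, n, "%.3g", v);  // otherwise: 3 significant digits, 123.45 -> "123"
}
```

With this choice of threshold, format3 gives "0.123" for 0.123 and "12.3" for 12.34, so small values keep more decimals, as requested later in the thread.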


Even at a modest 9600 baud (10 bits per character on the wire, so about 960 characters per second), you can send roughly 10 characters in 10 ms.

 

Your requirement implies 4 characters if we include the conditional decimal point. Add in a terminator and maybe a U16 as a message index and you should still be fine.

 

The question is, what does the other end accept? Is it another LabVIEW app, a SCPI instrument or something like an Arduino with limited resources?

 

LabVIEW to LabVIEW could use SI notation (%_3p) which would always be exactly 4 characters for the data.

 

SCPI devices normally support scientific notation (%_3e), which would always be 7 characters in length.

 

If you have a custom system, you will need to understand the parsing routine or capabilities in order to make the best choice.

 

If you are writing the remote device code yourself, simply multiply your value by 100 in LabVIEW, send it as 00000 to 99999, and divide by 100 on the remote side.
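[Editor's note: if you control both ends, the fixed-point scheme above is a couple of lines on each side. A sketch in C; the ×100 scale and 5-character width are the example values from the post, and the helper names are hypothetical:]

```c
#include <stdio.h>

// Sender side: scale to a fixed-point integer, zero-padded to 5 characters.
void encode_fixed_point(double v, char *out, size_t n) {
    snprintf(out, n, "%05d", (int)(v * 100.0 + 0.5));  // 1.23 -> "00123"
}

// Receiver side (e.g. the Arduino sketch): parse and scale back.
double decode_fixed_point(const char *msg) {
    int raw = 0;
    sscanf(msg, "%d", &raw);
    return raw / 100.0;  // "00123" -> 1.23
}
```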

 


Thank you all for your answers. The device I'm talking to is an Arduino with strict parsing, and the firmware accepts only floating-point numbers. I can't always multiply/divide by 100, since we want more precision when the number is small (0.123) and less precision when the number is big (12.3). Like you said Phillip, the decimal point is also a character...

 

I'll probably go with Sparc's method: format differently based on the magnitude of the value, even though I wanted to avoid extra checks... Too bad there isn't a third choice in the formatting options:

 

1) Number of significant digits (%_3f)

2) Number of digits of precision (%.3f)

3) Maximum number of digits (maybe %*3f or whatever else)

Edited by Manudelavega


