jdsommer Posted February 18, 2006

I'd like to be able to scan a physical quantity from a string that contains a number and a unit string. Basically, the user should be able to input "2.36 mtorr" or "20 Pa" or whatever, and my VI should parse that and convert it into the appropriate physical quantity datatype. The base units can be specified at compile time. I thought this was going to be relatively easy, but I'm walking down ever more complex roads.

First off, Scan From String does not work, since the unit specifier is only valid for Format Into String. My other attempt was to print the data into a control, first local, and then one in an external VI, using the NumText.Text and UnitLabel.Text properties. The basic idea was to write the data in text form in, and then read the PQ out with the Value property. This doesn't work because one is not allowed to change the base units of a control via these properties, apparently even if the VI containing the control is not running.

So, at the moment I'm contemplating writing my own unit parser (or finding one already written), but this seems redundant to what is obviously included in LabVIEW, somewhere. Am I missing something really basic? If not, does anyone have any brilliant ideas? I'm using LabVIEW 7.1, so if there's anything cool in LV8 that would help out, I'm unaware of it.

Thanks, Jason
Jim Kring Posted February 18, 2006

First write the units string to the Numeric's UnitLabel.Text property. Then write the value (as a string) to the Numeric's NumericText.Text property. Make sure to write the properties in exactly that order; otherwise, you might write the value in the "previous" units and have it converted to different units. This technique is very nice because it will generate an "incompatible units" error if you try to write units that are unknown or of a different base unit.

Download File: post-17-1140297695.vi
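A toy model of why the write order matters, sketched in Python (the class, its method names, and the factor table are hypothetical illustrations, not LabVIEW's actual control internals): the control interprets numeric text in its *current* units, so the unit label must be written before the value.

```python
# Hypothetical stand-in for a LabVIEW numeric control with units.
# Writing the value before the unit would interpret the number in
# the old units -- the ordering pitfall Jim describes.
class NumericControl:
    def __init__(self):
        self.unit = "Pa"
        self.value_pa = 0.0  # value stored internally in base units

    def set_unit(self, unit):
        """Analogous to writing UnitLabel.Text."""
        if unit not in {"Pa", "mtorr"}:
            raise ValueError("incompatible units")  # LabVIEW raises an error here too
        self.unit = unit

    def set_text(self, text):
        """Analogous to writing NumericText.Text: parsed in current units."""
        factor = {"Pa": 1.0, "mtorr": 0.133322}[self.unit]
        self.value_pa = float(text) * factor

c = NumericControl()
c.set_unit("mtorr")   # units first...
c.set_text("2.36")    # ...then the value, interpreted as mtorr
print(c.value_pa)     # value in base units (Pa)
```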
crelf Posted February 18, 2006

To parse your input string, why not split the string at the space and handle the two parts separately? Personally, I'd definitely go with Jim's method - the "unit" of data is a very underused feature of LabVIEW, and (as Jim says) as long as you keep your units consistent, it takes care of itself (e.g., wiring a control in meters and a control in seconds to a Divide primitive will yield a result in m/s - wiring that to an indicator with units other than m/s will give you a broken arrow).
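The split-at-the-space idea can be sketched in a few lines of Python. This is a minimal illustration only - the conversion table is a hypothetical hand-rolled example, not LabVIEW's internal unit database.

```python
# Split "2.36 mtorr" into number and unit, then convert to a base unit (Pa).
# The factor table is an assumed example; extend it with whatever units
# your application needs.
PA_PER_UNIT = {
    "Pa": 1.0,
    "torr": 133.322,
    "mtorr": 0.133322,
    "bar": 1e5,
}

def parse_pressure(text):
    """Parse e.g. '2.36 mtorr' into a value in pascals."""
    number, unit = text.split()
    try:
        factor = PA_PER_UNIT[unit]
    except KeyError:
        raise ValueError(f"unknown or incompatible unit: {unit}")
    return float(number) * factor

print(parse_pressure("20 Pa"))       # 20.0
print(parse_pressure("2.36 mtorr"))  # value in Pa
```

The nice part of Jim's property-node method is that LabVIEW itself validates unit compatibility; a hand-rolled table like this has to enumerate every unit you care about.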
Jim Kring Posted February 18, 2006

crelf wrote: To parse your input string, why not split the string at the space and handle the two parts separately? Personally, I'd definitely go with Jim's method - the "unit" of data is a very underused feature of LabVIEW, and (as Jim says) as long as you keep your units consistent, it takes care of itself (e.g., wiring a control in meters and a control in seconds to a Divide primitive will yield a result in m/s - wiring that to an indicator with units other than m/s will give you a broken arrow).

And, if you want a pure number (without physical units) in your application, just drop a Convert Units node downstream of the conversion VI that I posted.
jpdrolet Posted February 18, 2006

Here is a trick that, as far as I know, has not been published yet - a LAVA exclusive. You can convert to/from any compatible units. Oddly enough, it generates an error when converting to the same unit...

EDIT: As usual when converting units, I got it the wrong way and inverted the conversion coefficient. :headbang: So if you have downloaded the VI, try this corrected version.

Download File: post-447-1140315289.vi
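The underlying idea - deriving a conversion coefficient between two compatible units from their values in a common base unit - can be sketched in Python. The table of factors is a hypothetical example; note that the ratio is source-to-base over target-to-base, which is exactly the direction that is easy to invert by accident, as the EDIT above shows.

```python
# Convert between compatible length units via a common base unit (meters).
# factor = (source unit in meters) / (target unit in meters); flipping
# this ratio is the classic inverted-coefficient mistake.
METERS_PER_UNIT = {"m": 1.0, "km": 1000.0, "mi": 1609.344, "ft": 0.3048}

def convert(value, src, dst):
    """Convert `value` from unit `src` to unit `dst`."""
    return value * METERS_PER_UNIT[src] / METERS_PER_UNIT[dst]

print(convert(1.0, "km", "mi"))   # ~0.6214
print(convert(5280.0, "ft", "mi"))  # 1 mile is 5280 feet
```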
Jim Kring Posted November 4, 2006

jpdrolet wrote: Here is a trick that, as far as I know, has not been published yet - a LAVA exclusive. You can convert to/from any compatible units. Oddly enough, it generates an error when converting to the same unit... EDIT: As usual when converting units, I got it the wrong way and inverted the conversion coefficient. So if you have downloaded the VI, try this corrected version.

Jean-Pierre,

This technique does not seem to handle temperature conversion, because there is an offset in addition to a scaling coefficient (e.g., 0 degF != 0 degC != 0 K). Is there a simple solution that will account for this? I can't see anything obvious that will work around this issue. Also, for some reason this code generates an error for degF and degC, but seems to work OK for Fdeg and Cdeg - strange.

Thanks,
David Boyd Posted November 4, 2006

Jim Kring wrote: This technique does not seem to handle temperature conversion, because there is an offset in addition to a scaling coefficient (e.g., 0 degF != 0 degC != 0 K). Is there a simple solution that will account for this? Also, for some reason this code generates an error for degF and degC, but seems to work OK for Fdeg and Cdeg - strange.

I'm another one of those believers in the units feature of LV; I think it's often overlooked and a really clever concept. My understanding of the way LabVIEW handles units for temperature is that degC and degF imply temperatures on their respective scales (with their respective offsets), while Cdeg and Fdeg represent a difference in degrees on the specified scale. Kelvin, of course, is the same either way, since it is an absolute scale.

So, for example, a temperature gradient could be described as a PQ with units of Cdeg/m, or K/m, but you wouldn't want to express it as degC/m. Or consider the case of subtracting two values in degF - the answer could be properly labeled in Fdeg, but not degF.

Regrettably, since the value on the wire is really just a value in kelvins, if you create an indicator or constant from the wire, LV has no way to know whether you want to display a temperature difference or a point on a scale. I find it mildly unnerving that when I create a constant on the BD from a control/indicator with temperature units, the constant always shows up as Cdeg - essentially the same as the base unit of K. This caused me no end of confusion when I first started using units for temperatures - I didn't get the difference between Cdeg and degC, and assumed that LabVIEW's behavior was somehow broken. And it looks like even in LV 8.20, this bug hasn't been fixed.

Dave
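Dave's degC/Cdeg distinction is easy to see in plain code. A minimal Python sketch (not LabVIEW's implementation): converting a point on a temperature scale is an affine transform (scale *and* offset), while converting a temperature *difference* uses the scale factor only.

```python
# Absolute temperatures (degF -> degC) need both a scale and an offset;
# temperature differences (Fdeg -> Cdeg) need only the scale.
def degf_to_degc(t_f):
    """Point on the Fahrenheit scale -> point on the Celsius scale."""
    return (t_f - 32.0) * 5.0 / 9.0

def fdeg_to_cdeg(dt_f):
    """Fahrenheit-sized interval -> Celsius-sized interval: no offset."""
    return dt_f * 5.0 / 9.0

print(degf_to_degc(32.0))   # 0.0 -- the offset maps freezing point to 0 degC
print(fdeg_to_cdeg(9.0))    # 5.0 -- a 9 Fdeg difference is a 5 Cdeg difference
```

This is why a trick based on a single multiplicative coefficient works for Fdeg/Cdeg but fails for degF/degC: the offset has nowhere to live in a pure ratio.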
jpdrolet Posted November 5, 2006

Dave is correct: the issue is the offset on the degF and degC scales. Replace "any compatible units" with "most compatible units". Also, the VI doesn't work as-is for composed units, like converting from km/s to mi/hr. To work, the "unitless" string should be set to km s^-1 mi^-1 hr, but since this trick is a hack (units are not supposed to be used that way), that doesn't work. Even then, LabVIEW (at least 7.1) is still fragile when modifying the unit string, either manually or programmatically - I just made it crash again while testing some conversions.

While I agree with Dave that units are an overlooked and clever concept in LabVIEW, the feature is underdeveloped and there are still many bugs, because it is seldom used for anything other than simple passive display. After all, it is not that long since units could be changed dynamically.
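Outside the unit-string hack, composed units such as km/s to mi/hr are straightforward once each component carries its own factor to a common base. A hedged Python sketch (the factor table and function are illustrative assumptions, not part of any LabVIEW API):

```python
# Convert a speed by combining per-component factors: length factors
# multiply, time factors divide (they sit in the denominator).
TO_BASE = {"km": 1000.0, "mi": 1609.344,   # meters per length unit
           "s": 1.0, "hr": 3600.0}          # seconds per time unit

def convert_speed(value, src_len, src_time, dst_len, dst_time):
    """Convert value in src_len/src_time to dst_len/dst_time."""
    factor = (TO_BASE[src_len] / TO_BASE[dst_len]) * (TO_BASE[dst_time] / TO_BASE[src_time])
    return value * factor

print(convert_speed(1.0, "km", "s", "mi", "hr"))  # ~2236.9 mi/hr
```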
linnx Posted December 15, 2006

Hmm, I have a slightly different problem and was wondering if anyone could help. I have a string being read, and I'm using a Scan From String function on it, which should convert a string like 245m, for example, to 0.245, or simply keep the milli specifier. The current formatting mask I'm using for that function is %#.6p, and that simply cuts off the m at the end and returns 245 - and when I'm doing voltage control, that creates quite a dangerous situation. Could anyone please help!
Ton Plomp Posted December 15, 2006

linnx wrote: I have a string being read, and I'm using a Scan From String function on it, which should convert a string like 245m, for example, to 0.245, or simply keep the milli specifier. The current formatting mask I'm using for that function is %#.6p, and that simply cuts off the m at the end and returns 245.

Crosspost from NI (AKA the dark side), where it is already answered.

Ton
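For reference, the prefix-aware scan that linnx wants - parsing "245m" as 0.245 instead of dropping the m - can be sketched in Python. The prefix table (using "u" for micro) and the regex are illustrative assumptions, not LabVIEW behavior.

```python
import re

# Parse a number with an optional trailing SI prefix, e.g. '245m' -> 0.245.
SI_PREFIX = {"p": 1e-12, "n": 1e-9, "u": 1e-6, "m": 1e-3,
             "": 1.0, "k": 1e3, "M": 1e6, "G": 1e9}

def scan_si(text):
    """Return the value of e.g. '245m', '3.3k', or '12' as a float."""
    match = re.fullmatch(r"([-+]?[0-9]*\.?[0-9]+)\s*([pnumkMG]?)", text.strip())
    if match is None:
        raise ValueError(f"cannot parse: {text!r}")
    return float(match.group(1)) * SI_PREFIX[match.group(2)]

print(scan_si("245m"))  # ~0.245
print(scan_si("3.3k"))  # 3300.0
```

Applying the prefix explicitly, rather than silently discarding it, avoids the 1000x error that makes the voltage-control case dangerous.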
crelf Posted December 15, 2006

(AKA the dark side) I don't see the NI forums as evil: I prefer to call it the low-SNR side.