
TouchScreen Programming for LabVIEW?



QUOTE (menghuihantang @ Mar 24 2009, 08:41 AM)

LabVIEW is a graphical programming language that basically avoids typing text.

I wonder if touchscreen programming for LabVIEW is available. It would make the best use of a graphical language and be a lot of fun, like in science fiction. That would be so cool.

In theory yes, but in practice "it ain't ready for prime time".

How do you distinguish between a left and a right click?

Dragging on a touch screen is hard if your finger starts bouncing as you drag...

I tried it and reached for my trackball in very short order.

Ben


QUOTE (neBulus @ Mar 24 2009, 08:51 AM)

In theory yes, but in practice "it ain't ready for prime time".

How do you distinguish between a left and a right click?

Dragging on a touch screen is hard if your finger starts bouncing as you drag...

I tried it and reached for my trackball in very short order.

Ben

Well, I am not saying we should eliminate the keyboard and mouse; they can still be there if needed. Or we could even integrate a mouse into the touchscreen.

I agree with you that a lot of this is not ready yet, but I think it is possible to start pursuing it now. It would be a good push to spread LabVIEW, don't you think?


QUOTE (menghuihantang @ Mar 24 2009, 05:41 AM)

LabVIEW is a graphical programming language that basically avoids typing text.

I wonder if touchscreen programming for LabVIEW is available. It would make the best use of a graphical language and be a lot of fun, like in science fiction. That would be so cool.

Well, typing is still pretty important in LabVIEW. If your code doesn't have labels and comments, then it's not any good. But I guess you could use a speech-to-text driver to fix that.

LabVIEW programming also requires a fair amount of fine motor control (in this case "motor" refers to your hand-eye coordination and your ability to make small, controlled movements with your fingers and wrist). I don't think a touchscreen really adds anything to the ability to transfer those fine motions into a computer. Maybe a 3D mouse (gyration.com) would be better, but I've never tried it.


QUOTE (jdunham @ Mar 24 2009, 10:31 AM)

Well, typing is still pretty important in LabVIEW. If your code doesn't have labels and comments, then it's not any good. But I guess you could use a speech-to-text driver to fix that.

LabVIEW programming also requires a fair amount of fine motor control (in this case "motor" refers to your hand-eye coordination and your ability to make small, controlled movements with your fingers and wrist). I don't think a touchscreen really adds anything to the ability to transfer those fine motions into a computer. Maybe a 3D mouse (gyration.com) would be better, but I've never tried it.

I don't see why it's impossible to make fine motions on a touchscreen. Don't think of the touchscreen as purely a finger pad. You could have, for instance, a button that functions as a fine motion control: by tapping it multiple times, you get pixel-by-pixel motion if you want.
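As a rough sketch of that idea (a Python mock-up of my own, purely hypothetical; nothing like this exists in LabVIEW), each tap of the on-screen button would nudge the cursor by exactly one pixel:

```python
# Hypothetical tap-to-nudge control: every tap moves the cursor one
# pixel in the chosen direction, so precise positioning never depends
# on precise finger placement. All names here are illustrative.

def nudge(cursor, direction, step=1):
    """Return a new (x, y) cursor position moved `step` pixels."""
    dx, dy = {"left": (-1, 0), "right": (1, 0),
              "up": (0, -1), "down": (0, 1)}[direction]
    return (cursor[0] + dx * step, cursor[1] + dy * step)

pos = (100, 100)
for _ in range(3):            # three taps on the 'right' button...
    pos = nudge(pos, "right")
print(pos)                    # ...land the cursor at (103, 100)
```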

You can also have a keyboard on the touchscreen just like a real physical one.

How about that?

You will find that programming this way is much, much faster and more fun.


QUOTE (jdunham @ Mar 24 2009, 11:27 AM)

It might not be faster; it depends on individual styles.

But it can probably reduce the wrist pain caused by using a mouse, right? I don't know about you guys, but my wrist hurts, and that's why I brought up this interesting topic.

QUOTE (neBulus @ Mar 24 2009, 09:08 AM)

As [the linked demo] shows, what will really make a difference is an "eye tracker". Maybe the next generation after me will go for the brain-plug version.

Ben

That's a great demo. It's better than what I thought; I was only thinking about 2-D. 3-D is definitely better, but it will probably take more time to become practical.


QUOTE (menghuihantang @ Mar 24 2009, 09:13 AM)

But it can probably reduce the wrist pain caused by using a mouse, right? I don't know about you guys, but my wrist hurts, and that's why I brought up this interesting topic.

Well, those are certainly worth discussing, and I didn't mean to go on the attack. I am lucky that my wrists have never gotten injured after many years of LabVIEW and computers. It is my opinion that mice are overwhelmingly popular partly because they are the most effective input devices. The mouse seems to work the muscles which have the most outstanding fine motor control capability: side to side with the wrist and forward and backward with the fingers.

I think it's great to discuss alternatives, but the mouse is a hard act to follow. Similarly with the keyboard. Voice activation sounds cool, but if you can type 60 words per minute or more, with good accuracy, then voice is probably not going to be an improvement.


I worked for 5 years in a group designing and building consumer mice and keyboards. Furthermore, I just recently finished a 2.5 year stint for a company where I did extensive work with capacitive touch technology, including touch screens. I don't claim to be an expert in the field, but I've interacted with usability studies enough to learn a few things.

If you want to move LabVIEW programming into a new paradigm, such as touch screen programming, you have to offer the user a tangible benefit. Once the coolness factor wears off, what advantage does touch give the user? In terms of speed and accuracy for most users, nothing beats a mouse. Part of that is simply because that's what users are used to. Part of it is because the mouse works extremely well as part of a complete, closed-loop cursor positioning system. Your brain decides where to move the cursor and feeds inputs to your hand. As your hand moves the mouse, your eye tracks the cursor and your brain sends small error correction commands to your hand, allowing you to get the cursor to the target very quickly.
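To make that loop concrete, here is a toy simulation (my own simplification, in Python, not from any usability study): the eye measures the remaining error, the hand corrects most of it, and the cursor converges on the target within a handful of iterations:

```python
# Toy model of closed-loop cursor positioning: each iteration the eye
# measures the remaining distance to the target and the hand corrects
# most of it. The gain and tolerance values are purely illustrative.

def move_to_target(start, target, gain=0.6, tolerance=1.0):
    """Simulate corrective movements until the cursor is on target."""
    position, steps = start, 0
    while abs(target - position) > tolerance:
        error = target - position     # eye: how far off are we?
        position += gain * error      # hand: correct most of the error
        steps += 1
    return position, steps

print(move_to_target(0.0, 500.0))     # converges in ~8 corrections
```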

Capacitive touch as a technology cannot compete with the speed and accuracy of a mouse. There is simply too much noise in the system. You typically don't see the noise at the user level because of the extensive filtering taking place under the hood. Of course, that filtering comes at a cost--reduced response time. Everybody's favorite capacitive touch screen, the iPhone, was lagging roughly 100 ms last time I checked. (The effect is most easily observed if you have an application that displays a dot at the location the sensor is reporting and move your finger around the screen quickly.) 100 ms isn't much in absolute terms, but that much of a delay can sure throw off the user's cursor positioning system.
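A minimal sketch (Python, with made-up numbers; real touch controllers use far more sophisticated filtering) of why that filtering necessarily adds lag:

```python
# Exponential smoothing of noisy touch samples: smaller alpha rejects
# more sensor noise, but every output then blends in more of the past,
# so the reported position trails the finger. Values are illustrative.

def smooth(samples, alpha=0.1):
    """Exponentially smooth a stream of 1-D touch coordinates."""
    estimate = samples[0]
    out = []
    for x in samples:
        estimate = alpha * x + (1 - alpha) * estimate
        out.append(round(estimate, 1))
    return out

# Finger jumps instantly from x=0 to x=100; the filtered position
# takes many samples to catch up, which the user perceives as lag.
print(smooth([0.0] * 3 + [100.0] * 10))
```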

Touch screens suffer from an additional problem that has nothing to do with the technology. Quite simply, users are not accurate with their fingers. There are several reasons for this. The main reason is that the finger blocks the target, making it impossible for the user to make those fine-tuning adjustments that put the cursor exactly where they want it. Another contributing factor is that a user's finger 'footprint' will be different depending on the finger's angle when it makes contact with the touchscreen and on how much pressure the user applies. It is very difficult for users to repeatedly hit small targets when using their finger on a capacitive touchscreen. As a general rule, at best you can expect users to be accurate with their touches to within ~5 mm of their target position. Well-designed touch screen interfaces won't have any UI elements smaller than ~10 mm. Imagine trying to hook up a front panel terminal or select a wire on a block diagram using a touch screen.
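As a back-of-the-envelope illustration of that guideline (my own numbers; 96 DPI is a typical desktop monitor, 163 DPI the original iPhone screen):

```python
# Convert the ~10 mm minimum touch-target guideline into pixels for a
# given screen density. Illustrative only; real designs also account
# for spacing between targets.

MM_PER_INCH = 25.4

def min_target_px(target_mm=10.0, dpi=96.0):
    """Minimum touch-target edge length in pixels at a given DPI."""
    return target_mm / MM_PER_INCH * dpi

print(round(min_target_px(dpi=96)))    # ~38 px on a desktop monitor
print(round(min_target_px(dpi=163)))   # ~64 px on an iPhone-class screen
```

A LabVIEW wire, by contrast, is only a couple of pixels wide.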

There are ergonomic issues too. Programming LabVIEW via a touchscreen requires large arm motions. It is much less efficient in terms of both time and energy than using a mouse. "LabVIEW shoulder" would be the new repetitive stress injury. (This is also why I think the user interface from Minority Report is misguided. Too many grand motions are required to get the work done.) IMO, Ben is right; the next big breakthrough in cursor positioning is going to come from eye tracking. It's the only thing currently on the horizon that offers advantages over mice. There are huge technical and usability hurdles to overcome, but whoever develops and patents a good consumer-level eye tracking system is going to make a bundle of money. (I actually tried, unsuccessfully, to generate interest in researching eye-tracking systems during my time developing mice.)

QUOTE

I don't see why it's impossible to make fine motions on a touchscreen. Don't think of the touchscreen as purely a finger pad. You could have, for instance, a button that functions as a fine motion control: by tapping it multiple times, you get pixel-by-pixel motion if you want.

Essentially you're talking about the arrow keys on the keyboard. Personally, I use them occasionally for positioning selected items on the screen, but if I'm going to have to use them just to get enough accuracy to select an item, I'm going to get real irritated real fast.

QUOTE

You can also have a keyboard on the touchscreen just like a real physical one.

Actually, you can't. You can make the layout the same, but that's about it. Touchscreens, being flat, don't have the same ergonomics or tactile feedback that real keyboards do. The tactile feedback is extremely important for touch typists. The little bumps on the 'F' and 'J' keys help me make sure my hands are positioned correctly without having to look down every time I move them. The curvature of each key provides the subtle hints that keep each finger on the correct key. If I go to hit the 'O' key and my finger is off a little bit, I subconsciously note the different feeling and my brain corrects for it the next time. Touchscreen keyboards don't provide either of these, which is why people can type faster on real keyboards than on touchscreen keyboards.

QUOTE

You will find that programming this way is much, much faster and more fun.

Two things in particular make programming fun for me:

  1. Learning how to do new things (e.g. LVOOP, XControls, etc.)
  2. Developing software that makes mechanical things move. (Because that never loses the coolness factor. :) ) (More generally, this could be considered developing software that makes me or someone else more productive.)

Fighting with the development environment's user interface is NOT something that falls into my "fun" category. (Ask me how much fun I'm having the next time LabVIEW crashes.) A proposal that makes either of those two items easier is good; if it makes them harder it is bad. Programming LabVIEW via a touchscreen, IMO, falls squarely on the "bad" side of the equation.


QUOTE (Daklu @ Mar 24 2009, 01:38 PM)


Minority Report is just a movie; people like to put their imagination into movies. So I would say that some day you will only find the mouse in museums. I don't know when, though; the mouse is an amazing device, like you said.

I don't know much about eye tracking, but are you sure it is less challenging than touchscreens? Yes, fingers are less accurate than a little mouse, but nobody ever said we can only use fingers. How about laser beams?

I am no expert at all, and this is just wild imagination. But the coolness factor has always been one of the best motivations. Think about the life around you: all we try to do is build something cool and make that coolness last forever. OK, maybe it's not time yet, but that's not a good reason to stop pursuing it.


QUOTE (menghuihantang @ Mar 25 2009, 05:58 AM)

I don't know much about eye tracking, but are you sure it is less challenging than touchscreens? Yes, fingers are less accurate than a little mouse, but nobody ever said we can only use fingers. How about laser beams?

I am no expert at all, and this is just wild imagination. But the coolness factor has always been one of the best motivations. Think about the life around you: all we try to do is build something cool and make that coolness last forever. OK, maybe it's not time yet, but that's not a good reason to stop pursuing it.

Laser beams are cool, but as you may have noticed, a person with a laser pointer has a hard time keeping it from shaking.

It is definitely worthwhile to think about cool and useful stuff, but sometimes new things add on rather than replace. In fact, given that touchscreens have been around for 20 years or so, I suspect they've already been used for all the things they are good at and have been rejected for things like computer programming, where they don't add much value.

But as another example, I suspect the keyboard is never going away. It's very fast and flexible. Even if computers can understand speech, it's still easier to type "LVOOP" than to try to say it. Maybe if we all had direct neural implants we could bypass keyboards, but then you might have a hard time keeping your more private thoughts private while you are trying to dump out your regular thoughts over the neural interface.

Jason


QUOTE (menghuihantang @ Mar 25 2009, 06:58 AM)

I fully agree. The question we are discussing is 'what is going to replace it?'

QUOTE (menghuihantang @ Mar 25 2009, 06:58 AM)

...the mouse is an amazing device, like you said.

Believe it or not, I'm not all that fond of mice. Or more specifically, I'm not all that fond of the way many desktop applications require constant switching between the mouse and keyboard. It wastes time.

QUOTE (menghuihantang @ Mar 25 2009, 06:58 AM)

No, eye tracking is more challenging than touchscreens. The point is that touchscreens have a practical limit on how good they can be. That limit is defined by the way users interact with them. Even if a touchscreen were infinitely accurate and infinitely fast, you would still have the fundamental problems that users can't touch accurately and that moving your entire arm takes more energy than moving your fingers.

QUOTE (menghuihantang @ Mar 25 2009, 06:58 AM)

Yes, fingers are less accurate than a little mouse, but nobody ever said we can only use fingers. How about laser beams?

The original question and my response referred to touchscreens. There are countless alternative navigation methods that could be devised. The trick is to find one that offers real advantages over the mouse. How does laser navigation make me more efficient? How does it help me get my job done faster? What are the human limitations? (Try this: take a laser pointer and hold it about 2 feet away from the computer screen. Now target different UI elements on the screen and see how long it takes before you can consistently hold the beam on the element. Can you hold it on the menus? Can you hold it on toolbar buttons? Can you hold it between the 'll' in the word 'alligator'?)

QUOTE (menghuihantang @ Mar 25 2009, 06:58 AM)

Coolness is an important factor for certain types of products. The Apple iPod Touch is a perfect example. As an MP3 player it's overpriced and underpowered, yet people buy it because it is cool. Contrast that with a screwdriver. There are all sorts of things that could be added to make a screwdriver cooler (zebra stripes, neon lights, biometric security, ...), yet you don't see these things in screwdrivers. Why? Because the screwdriver is a tool. Nobody uses a screwdriver just for the sake of using a screwdriver; it's used to accomplish something else. Those added coolness features don't help me screw together two pieces of wood any faster or make it easier to pry the lid off of a paint can. They don't offer any benefit to the user.

The mouse is also a tool. It's used to interact with the computer. How often do you sit at your desk with the computer off and move the mouse around, just for the experience? LabVIEW is a tool used to solve other problems. People use it because they can solve those problems more easily and faster with LabVIEW than with other programming languages. Computers, for the most part, are used as tools. (Whether or not computers are being used as tools while gaming is debatable.) When people use tools they reach for the one that helps them solve their problem quickly and easily. If you want to replace the mouse as a navigation device, coolness alone isn't going to cut it.

QUOTE (menghuihantang @ Mar 25 2009, 06:58 AM)

Ok, maybe it's not time yet, but that's not a good reason to stop pursuing.

I'm not saying stop trying to make better navigation systems. I'm saying be smart about where you invest your energy. Touchscreens, IMO, are a dead end if you are hoping to replace mice for general computer use.

  • 1 month later...

I am not a professional LabVIEW programmer, but I do work on computers quite a bit, and when developing I find GUI design to be the most annoying thing to do with a mouse. When I'm writing a LabVIEW program I feel like 90% of the work is GUI design. Between the block diagram, the wiring, keeping the block diagram clean, and then, of course, the front panel itself, there is so much GUI work in there that I feel more like I'm doing CAD work than actually programming.

I tried out an HP tx2 12.1" tablet/laptop convertible over the weekend and although I did not do any serious developing on it, I did feel that using the tablet mode for LabVIEW programming was absolutely amazing.

Some of the points in this thread that I'd like to respond to are:

*finger accuracy

I agree that using a finger and "touch" mode would not be suitable for programming LabVIEW, but using a stylus is completely different. The speed at which I could select items from the toolbars, drag them, precisely position them, select groups, and move them around blew me away. I wired a small VI and then cleaned it up in just seconds, where the fine mouse movements would otherwise have taken me considerably longer.

*grand movements & keyboard entry

The screen size is only 12.1 inches. The shoulder movements are hardly more than those required to switch between the keyboard and mouse. I could use the handwriting tool that Vista ships with to complete text entry rather quickly. Granted, if I had to enter more than 5-10 characters at a time it might be beneficial to switch to keyboard mode momentarily, but if I'm just entering a number or a one-word label, the handwriting tool is quite effective.

Again, I haven't used this for a serious project yet so I am not saying the person who has years of touch development is wrong, just that when I wrote a small program on this I felt I could do so quite quickly. I will need to compare the speeds at which I can write such a program with a mouse and on the tablet.

I found it difficult to find a comfortable working position. Using this device on my lap in a commuting or recreational setting was not comfortable; the angle was too low. Using a laptop stand with it was too bulky. I felt the best solution would be to use it on a desk as if it were a sheet of paper I was writing on. The problem here is that the screen on the HP tx2 does not offer a wide viewing angle.

This was my first AMD machine, first HP, and first computer I purchased from a big-box store. It is both inexpensive and cheap (from a quality perspective). Unless the processor frequency is cut to 25%, it runs at over 70 °C most of the time. One of the keyboard keys was broken out of the box. For these reasons (the broken key being the most significant factor) I am planning to return it.

I haven't tried a Wacom tablet yet, but that's my next stop. I would like to see a Wacom tablet with a small LCD screen on it to help with general positioning on the larger screen, but I'm not sure how practical that would be. Edit: Wacom's 12" Cintiq functions exactly like this. I'd like to try it, but I don't want to shell out $1000 just for a pointing device/screen.

