Leaderboard

Popular Content

Showing content with the highest reputation on 11/07/2011 in all areas

  1. Regarding this code fragment, wouldn't a high-performance, flexible, and clearly readable solution be to add the functionality to the case structure itself, by implementing these ideas: Wire Class To Case Selector and Allow vi server reference type as case selector? It would look something like this (the "default" case would return the parent "Message" wire type): -- James
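     Since the graphical snippet referenced above doesn't survive in text, here is a minimal sketch (in Python, with hypothetical message classes; LabVIEW's actual syntax is graphical) of the dispatch-on-class semantics the proposal describes, including a default case that falls back to the parent Message type:

         class Message: ...              # parent type; handled by the default case
         class StartMsg(Message): ...    # hypothetical child classes
         class StopMsg(Message): ...

         def handle(msg: Message) -> None:
             # Each case selects on the runtime class of the value, the way
             # the proposed case structure would select a frame from the
             # class wired to the selector terminal.
             match msg:
                 case StartMsg():
                     print("starting")
                 case StopMsg():
                     print("stopping")
                 case Message():         # default: the parent "Message" type
                     print("unhandled:", type(msg).__name__)

         handle(StartMsg())   # -> starting
         handle(Message())    # -> unhandled: Message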
    2 points
  2. Before someone times this and realizes it is not true: although we have some internal support that could someday allow Reverse String to be constant time, it is not. Reverse String will actually swap all the characters in the string. Reverse Array, on the other hand, is a constant-time operation.
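     The same distinction exists in textual languages: an array can be "reversed" in constant time by returning a negatively-strided view over the same memory, whereas reversing a string means actually copying or swapping every character. A small illustrative sketch in Python/NumPy (showing the mechanism, not LabVIEW's internals):

         import numpy as np

         a = np.arange(1_000_000)
         rev_view = a[::-1]       # O(1): a view with a negative stride;
                                  # no elements are moved or copied
         assert rev_view[0] == a[-1]

         s = "x" * 1_000_000
         rev_str = s[::-1]        # O(n): every character is copied into a
                                  # new string, as Reverse String does
         assert rev_str[0] == s[-1]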
    1 point
  3. According to the post here, TestStand 2010 SP1 (released August 2011) supports dynamic dispatching. Is there a KB entry regarding this? I could not find anything. It would be useful to summarize the LVOOP features and limitations by version of TestStand.
    1 point
  4. I worked for 5 years in a group designing and building consumer mice and keyboards, and I recently finished a 2.5-year stint at a company where I did extensive work with capacitive touch technology, including touch screens. I don't claim to be an expert in the field, but I've interacted with usability studies enough to learn a few things.

     If you want to move LabVIEW programming into a new paradigm, such as touch screen programming, you have to offer the user a tangible benefit. Once the coolness factor wears off, what advantage does touch give the user? In terms of speed and accuracy, for most users nothing beats a mouse. Part of that is simply because the mouse is what users are used to. Part of it is because the mouse works extremely well as part of a complete, closed-loop cursor positioning system: your brain decides where to move the cursor and feeds inputs to your hand; as your hand moves the mouse, your eye tracks the cursor and your brain sends small error-correction commands to your hand, allowing you to get the cursor to the target very quickly.

     Capacitive touch as a technology cannot compete with the speed and accuracy of a mouse. There is simply too much noise in the system. You typically don't see the noise at the user level because of the extensive filtering taking place under the hood. Of course, that filtering comes at a cost: reduced response time (a minimal sketch of this trade-off appears at the end of this post). Everybody's favorite capacitive touch screen, the iPhone, was lagging roughly 100 ms last time I checked. (The effect is most easily observed with an application that displays a dot at the location the sensor is reporting while you move your finger around the screen quickly.) 100 ms isn't much in absolute terms, but that much delay can sure throw off the user's cursor positioning system.

     Touch screens suffer from an additional problem that has nothing to do with the technology: quite simply, users are not accurate with their fingers. There are several reasons for this. The main one is that the finger blocks the target, making it impossible for the user to make the fine-tuning adjustments that put the cursor exactly where they want it. Another contributing factor is that a user's finger 'footprint' differs depending on the angle at which the finger contacts the touchscreen and on how much pressure the user applies. It is very difficult for users to repeatedly hit small targets with a finger on a capacitive touchscreen. As a general rule, at best you can expect users' touches to land within ~5 mm of their target position, so well-designed touch screen interfaces won't have any UI elements smaller than ~10 mm. Imagine trying to hook up a front panel terminal or select a wire on a block diagram using a touch screen.

     There are ergonomic issues too. Programming LabVIEW via a touchscreen requires large arm motions, which is far less efficient in both time and energy than using a mouse. "LabVIEW shoulder" would be the new repetitive stress injury. (This is also why I think the user interface from Minority Report is misguided: too many grand motions are required to get the work done.)

     IMO, Ben is right; the next big breakthrough in cursor positioning is going to come from eye tracking. It's the only thing currently on the horizon that offers advantages over mice. There are huge technical and usability hurdles to overcome, but whoever develops and patents a good consumer-level eye tracking system is going to make a bundle of money.
     (I actually tried, unsuccessfully, to generate interest in researching eye-tracking systems during my time developing mice.)

     QUOTE: Essentially you're talking about the arrow keys on the keyboard. Personally, I use them occasionally for positioning selected items on the screen, but if I'm going to have to use them just to get enough accuracy to select an item, I'm going to get real irritated real fast.

     QUOTE: "You can also have a keyboard on the touchscreen just like a real physical one."

     Actually, you can't. You can make the layout the same, but that's about it. Touchscreens, being flat, don't have the same ergonomics or tactile feedback that real keyboards do. Tactile feedback is extremely important for touch typists: the little bumps on the 'F' and 'J' keys let me confirm my hands are positioned correctly without looking down every time I move them, and the curvature of each key provides the subtle hints that keep each finger on the correct key. If I go to hit the 'O' key and my finger is off a little, I subconsciously note the different feeling and my brain corrects for it the next time. Touchscreen keyboards provide neither, which is why people can type faster on real keyboards than on touchscreen keyboards.

     QUOTE: "You will find out your programming speed is much much faster and more fun."

     Two things in particular make programming fun for me: learning how to do new things (i.e. LVOOP, XControls, etc.), and developing software that makes mechanical things move, because that never loses the coolness factor. (More generally, the second could be considered developing software that makes me or someone else more productive.) Fighting with the development environment's user interface is NOT something that falls into my "fun" category. (Ask me how much fun I'm having the next time LabVIEW crashes.) A proposal that makes either of those two items easier is good; if it makes them harder, it is bad. Programming LabVIEW via a touchscreen, IMO, falls squarely on the "bad" side of the equation.
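     On the noise-versus-latency point above: a minimal sketch (hypothetical parameters, not any real touch controller's algorithm) of how even simple exponential smoothing trades noise rejection for lag, in Python:

         def smooth(samples, alpha=0.1):
             """Exponential moving average: a smaller alpha rejects more
             noise, but the output trails the finger further (more lag)."""
             y = samples[0]
             out = []
             for x in samples:
                 y = alpha * x + (1 - alpha) * y
                 out.append(y)
             return out

         # The finger jumps from position 0 to 100 (a fast swipe):
         step = [0.0] * 5 + [100.0] * 20
         print(smooth(step)[:12])   # the output creeps toward 100 over many
                                    # samples; that creep is the lag users feel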
    1 point
  5. QUOTE (neBulus @ Mar 24 2009, 08:51 AM) Well, I am not saying we should eliminate the keyboard and mouse; they can still be there if needed. We could even integrate a mouse into the touchscreen. I agree with you that a lot of this is not ready yet, but I think it is possible to start pursuing it now. This would be a good push to spread LabVIEW, don't you think?
    1 point