
Leaderboard

Popular Content

Showing content with the highest reputation on 09/30/2011 in all areas

  1. I was looking at a function to format a binary string as hex and tried to improve its performance. I was shocked by the following results. What I've concluded is that the Integer to Hex-string function for arrays is not working with U8s but with something else (U32s?). (A rough sketch of the kind of comparison is below.) Ton
    2 points
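    Since LabVIEW diagrams can't be reproduced in text here, the snippet below is only a rough Python analogue of the comparison described in the post above: formatting a byte array as a hex string either with one call over the whole array or with an explicit per-element loop. The data size, iteration count, and function names are illustrative choices, not part of the original benchmark.

    # Rough Python analogue (not the original LabVIEW benchmark) of the two
    # ways to turn a U8 array / binary string into a hex string.
    import os
    import timeit

    data = os.urandom(100_000)  # stand-in for the binary string / U8 array

    def whole_array():
        # one call over the whole array, analogous to wiring the array
        # straight into the hex-string primitive
        return data.hex().upper()

    def per_element():
        # explicit per-element loop, analogous to a For loop around the primitive
        return "".join("%02X" % b for b in data)

    if __name__ == "__main__":
        print("whole array :", timeit.timeit(whole_array, number=100))
        print("per element :", timeit.timeit(per_element, number=100))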
  2. I seem to remember that floats are coerced to 64-bit integers, so I am guessing that the default input type for arrays is U64. It seems silly not to have an explicit [u8] input for the primitive instead of blowing [u8] up to [u64]. I am curious whether you could check the performance with different integer array types and find one with a smaller time differential between the two methods. A very useful rule of thumb for optimization is that the more code you push into a primitive, the faster it will be. This bug(?) breaks that rule, so it is very annoying on many levels.
    1 point
  3. If you XOR the old and new values, all the changed booleans will be True (see the sketch below). Ton
    1 point
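    A minimal sketch of the XOR trick from the reply above, written in Python rather than LabVIEW; the example values are made up and only show that the result is True exactly where old and new differ.

    # XOR of old and new values: True exactly where a boolean changed.
    old = [True, False, True, False]
    new = [True, True,  True, False]

    changed = [o ^ n for o, n in zip(old, new)]
    print(changed)  # [False, True, False, False] -> only the second value changed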
  4. Make Queues use the value of the data type input as default output value in error conditions
    1 point
  5. There are a couple of companies that make micro Linux boards where the biggest component is the Ethernet port, so they'd definitely meet your size requirement. However, I don't know about the power consumption. Sorry, I don't have any links off-hand.
    1 point
  6. Two suggestions: use the icon of the polymorphic instance instead of the icon of the polymorphic VI, and add a to-lower-case step in the Hexadecimal VI. LabVIEW formats hex in uppercase, while we've seen that the official standard is lowercase (see the note below). Ton
    1 point
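    A tiny illustration of the case point raised above, again in Python rather than LabVIEW: an uppercase format code gives the UPPERCASE style, and a lower-casing pass afterwards gives the conventional lowercase spelling. The example bytes are arbitrary.

    # Uppercase hex formatting versus the conventional lowercase form.
    digest = bytes([0xd4, 0x1d, 0x8c, 0xd9])       # arbitrary example bytes
    upper = "".join("%02X" % b for b in digest)    # 'D41D8CD9' via uppercase codes
    lower = upper.lower()                          # 'd41d8cd9', same value, lowercase
    print(upper, lower)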
  7. The OpenG Board has decided on the following icon and connector pane: Additionally (as suggested in this thread), a Hexadecimal String output has been added to MD5 Message Digest. This could have been implemented in two ways: (1) deprecate the old VI, create a new VI, and add the additional output, or (2) convert to a polymorphic API. The polymorphic API was chosen as it allows for future support of additional outputs (if ever needed). Note: it will not break existing code. The Hexadecimal String is just a thin wrapper around the original VI (a rough analogue is sketched below): I will close this review in the next few days. Thanks for everyone's input!
    1 point
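    The OpenG VIs themselves are graphical, so as a rough textual analogue only (not the OpenG implementation), the sketch below shows what a thin wrapper that adds a hexadecimal-string output looks like in Python, using hashlib's MD5 and reusing the existing digest routine unchanged.

    # Rough textual analogue (not the OpenG code) of adding a hexadecimal-string
    # output as a thin wrapper around an existing MD5 routine.
    import hashlib

    def md5_digest(message: bytes) -> bytes:
        """Stand-in for the original VI: the raw 16-byte MD5 digest."""
        return hashlib.md5(message).digest()

    def md5_hex_string(message: bytes) -> str:
        """The wrapper: calls the original routine and only reformats its output."""
        return md5_digest(message).hex()  # lowercase, matching the convention noted above

    if __name__ == "__main__":
        print(md5_hex_string(b"abc"))  # 900150983cd24fb0d6963f7d28e17f72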
  8. Not sure if this fits the bill for you, but a MikroTik RouterBoard might work. These things are fully programmable, so you can make them do just about anything you want. I don't know if they fit your size requirement, but most seem to fit your power requirement. I will give you one warning: there is a steep learning curve to configuring these things.
    1 point
  9. Allow the Diagram Disable Structure and Conditional Disable Structure to be replaced with a Case Structure: add the ability to replace a disable structure with a case structure.
    1 point
  10. First up, it's not that I don't like Front Panels! Rather, the idea comes from the recognition that, as evidenced by a fair number of ideas in the Idea Exchange, FPs often add unnecessary work to developing VIs. I was trying to pull a number of different ideas together into a solution, sparked by reading this idea which suggests adding yet another layer to a VI - and questioning whether we can reduce complexity instead of increasing it. There's been some discussion on how to limit the FP to show only what's needed for the user interface (e.g. here). Having unneeded (from the interface point of view) controls and indicators on the FP means we need to work around issues such as edit-vs-run-time displays, setting up tab orders, rescaling controls (but only some of them) as panels change size, disabled code, and so on. The basis of my suggestion is not that we shouldn't have FPs, but that FPs should only contain what needs to be seen by the user.

      Secondly, the data that needs to be seen during development or debugging is often very different again. When we use a Probe, we essentially create a virtual indicator. Take the Probe Window slightly further in its development, and we now have a second FP, which contains another set of controls/indicators for showing, or setting, the data during execution.

      Also, for many VIs, the variables that are passed through the connector-pane (conpane) interface are not required to be shown in the user interface - error clusters, for example. It makes sense to be able to link the conpane directly to BD controls, and therefore there is no need for the controls/indicators to appear on the FP at all - many subVIs without user interaction wouldn't need an FP defined at all. To me, the conpane has a much stronger link to the BD than to the FP, as it is there to provide the programming interface to the code. The downside would be that it is perhaps more difficult to find terminals on a crowded BD, but that could be enhanced (e.g. mouse over the conpane and everything fades except the terminals). Perhaps there would even be the opportunity to see the FP as a virtual cluster.

      The LV paradigm presumably arises from the concept of Virtual Instruments, where the controls/indicators are what you define your instrument by - and much early code was all contained within one VI! This idea suggests that LV has perhaps grown past that stage, with far more emphasis now on subVIs, classes, libraries, etc., which are maybe more like patch panels or breadboards (still thinking as an engineer!) - things that are hidden from the end user. I don't see it as such a radical change - more a response to the way that LabVIEW is being used anyway - to make that process easier and more transparent.
    1 point