
Looking for a starting point in VISA driver development



Hi everyone,

 

So we have a device for measurement & signal generation and want to make it accessible through VISA, say over Ethernet / USB / serial. Other than the VISA driver template and the SCPI parser that must be implemented on the device, can anyone suggest a good starting point on how to make a device talk VISA, and maybe even get the driver certified by NI?
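For context, here is roughly what I mean by "talking VISA" from the host side: the device just has to answer SCPI strings over one of the supported transports, and only the VISA resource string changes between Ethernet, USB, and serial. A minimal PyVISA sketch (the resource strings and commands below are placeholders, not our actual instrument):

# Minimal host-side sketch using PyVISA (pip install pyvisa).
# Resource strings and SCPI commands are placeholders for illustration only.
import pyvisa

rm = pyvisa.ResourceManager()

# The same SCPI session works over any transport; only the resource string changes:
#   Ethernet: "TCPIP0::192.168.1.20::INSTR"
#   USBTMC:   "USB0::0x1234::0x5678::SERIAL123::INSTR"
#   Serial:   "ASRL3::INSTR"
inst = rm.open_resource("TCPIP0::192.168.1.20::INSTR")
inst.timeout = 5000  # ms

print(inst.query("*IDN?"))           # mandatory IEEE 488.2 identification query
inst.write("SOUR:FREQ 1.0E3")        # hypothetical signal-generation command
print(inst.query("MEAS:VOLT:DC?"))   # hypothetical measurement query

inst.close()
rm.close()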

 

Thanks in advance

Well, the SCPI parser is beyond any resources that NI would be able to help with. But if it is about the instrument driver itself, you should probably contact the instrument driver developer group at NI here. They can give you more information on the requirements to get your driver certified and added to the instrument driver library, as well as resources on recommended practices for such a driver that ease the certification process.
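As for the device side, the core of a SCPI parser is essentially a dispatcher from command headers to handlers, plus the IEEE 488.2 mandatory commands (*IDN?, *RST, *CLS, ...), the status/error model, and proper handling of the command tree and short forms. A deliberately minimal, hypothetical sketch of the dispatch idea (no particular firmware framework implied; the instrument state and commands are made up):

# Hypothetical minimal SCPI dispatcher sketch (illustration only; a real parser
# also needs the full command tree, short forms, numeric suffixes, and the
# IEEE 488.2 status/error model).

_state = {"freq_hz": 1000.0}   # stand-in for real instrument state
_error_queue = []              # stand-in for the SCPI error queue

def _idn(arg):      return "ACME,SigGen-100,0,1.0.0"
def _set_freq(arg): _state.update(freq_hz=float(arg))
def _get_freq(arg): return "{:.6E}".format(_state["freq_hz"])

COMMANDS = {
    "*IDN?":      _idn,
    "SOUR:FREQ":  _set_freq,
    "SOUR:FREQ?": _get_freq,
}

def handle_line(line):
    """Parse one terminated SCPI line; return a response string or None."""
    header, _, arg = line.strip().partition(" ")
    handler = COMMANDS.get(header.upper())
    if handler is None:
        _error_queue.append((-113, "Undefined header"))
        return None
    result = handler(arg.strip())
    return result if header.endswith("?") else None

# Example exchange:
print(handle_line("*IDN?"))        # ACME,SigGen-100,0,1.0.0
handle_line("SOUR:FREQ 2.5E3")
print(handle_line("SOUR:FREQ?"))   # 2.500000E+03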

  • 1 month later...

So I heard back from instrument.drivers@ni.com, and I think it should be OK to post their reply here in the forum for everyone to read. Some of the suggestions may even be up for discussion, such as the recommendation to use the 2014 revision of the templates (what about backwards compatibility?).

 

So again, thanks for the suggestion to contact them.

 

==============================================================

 

Yes, the best starting point would be the Instrument Driver Guidelines and the Templates, but use the 2014 revision. If you do not have 2014, we have saved the templates back to LabVIEW 8.2.
You might try using the Instrument Driver Development Studio for your development: http://sine.ni.com/nips/cds/view/p/lang/en/nid/211922. Again, make sure you have the latest templates above.

Make sure to review the LabVIEW settings to use before working on the API VIs/driver:
Front Panel: Should all be the same color for consistency
Front Panel: Select Modern 3D style for controls and indicators
Block Diagram: Enable Use transparent name labels.
Block Diagram: Disable automatic error handling in new VIs.
Block Diagram: Disable Place front panel terminals as icons.
Do not Maximize Front Panels or Block Diagrams

The items below usually come up in reviews when either the templates or the Guidelines are not followed:
- Avoid using one command per VI; check the template for how to combine commands. Minimize redundant parameters.
- Avoid combining Configure, Action-Status, and Data subVIs in high-level API VIs; these should only be combined in examples (see the sketch after this list).
- All VIs, controls, and indicators should include documentation, and it is recommended to include at least one comment on the block diagram.
- Use meaningful names (no symbols; capitalize the first letter) and avoid abbreviations unless they are known by users worldwide.
- Document what the VI/control is expected to do and always start the sentence describing what the VI will do with a verb.
- Document any restrictions for using the VI/control, such as whether a certain mode prohibits using the VI/control or if the VI/control cannot be used with a particular instrument model.
- Avoid using Boolean controls if the values are not opposite states; the command usually has 0 or 1 or True or False as part of the command. (Use the vertical slide switch for Boolean controls.)
- Avoid using strings as controls (generally numeric and/or text ring controls are used in an API); you can use a file path control or a timestamp control instead of a string control.
- Use Text Ring for values that do not have two clear states
- Include "%.;" as part of the format string to convert floating point numbers when using functions like Format Into String or Array to Spreadsheet String.
- Wire a False Constant from the Boolean palette to the Use System Decimal Point terminal of the Number To Exponential String and Number To Fractional String functions.
- Use a Select function, not a Case Structure, to select between two wire options.
- Use Scan From String and Format Into String as often as possible for string manipulation. Other string handling functions, such as Pick Line and Append True/False String, are good string manipulation functions to use.
- Use the Concatenate Strings function sparingly and only if you cannot find a more appropriate string function.
- Design the block diagram so it follows the left-to-right, top-to-bottom model of data flow. Use consistent writing style in the VI and throughout the instrument driver.
- Do not use save options that remove or password-protect the block diagram (if you do we cannot certify the driver).
- Use the 4-2-2-4 or the 5-3-3-5 connector pane terminal pattern.
- Avoid all-text icons; icons should be meaningful and not randomly selected.
- Make the VISA resource name a Required input for ALL VIs.
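To illustrate the layering the templates expect (Initialize / Configure / Action-Status / Data / Close), here is a rough, hypothetical sketch of the same structure in Python with PyVISA, purely for illustration; the driver you submit is of course a set of LabVIEW VIs, and the instrument, commands, and class name below are made up:

# Hypothetical sketch of the driver API layering (Initialize / Configure /
# Action-Status / Data / Close); instrument, commands, and names are made up.
import pyvisa

class AcmeSigGen:
    def __init__(self, resource_name):                 # Initialize: open session, ID query
        self._io = pyvisa.ResourceManager().open_resource(resource_name)
        self._io.timeout = 5000
        self._io.query("*IDN?")

    # Configure subVIs: set instrument state, return no measurement data
    def configure_output(self, frequency_hz, amplitude_v):
        self._io.write("SOUR:FREQ {:.6E}".format(frequency_hz))
        self._io.write("SOUR:VOLT {:.6E}".format(amplitude_v))

    # Action-Status subVIs: initiate or poll, no configuration
    def initiate(self):
        self._io.write("INIT")

    # Data subVIs: fetch or read measurements
    def read_dc_volts(self):
        return float(self._io.query("MEAS:VOLT:DC?"))

    # Close: end the VISA session
    def close(self):
        self._io.close()

# Configure, Action-Status, and Data calls are combined like this only in
# examples, never inside a single high-level API call:
# sg = AcmeSigGen("TCPIP0::192.168.1.20::INSTR")
# sg.configure_output(1.0e3, 0.5)
# sg.initiate()
# print(sg.read_dc_volts())
# sg.close()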

Testing

- Make sure all VIs and examples (at least two) are tested and documented with meaningful names.

- Test all combinations of options on the front panel.

- Test with all supported interfaces implemented in the driver (see the sketch after these notes).

- If there are any issues or other notes helpful to the user, they should be documented in the Readme.
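Testing against every supported interface usually amounts to running the same sequence over each VISA resource string the driver claims to support. A hypothetical smoke-test sketch (resource strings and commands are placeholders):

# Hypothetical smoke test: run the same sequence over every interface the
# driver claims to support. Resource strings and commands are placeholders.
import pyvisa

RESOURCES = [
    "TCPIP0::192.168.1.20::INSTR",             # Ethernet
    "USB0::0x1234::0x5678::SERIAL123::INSTR",  # USBTMC
    "ASRL3::INSTR",                            # Serial
]

def smoke_test(resource_name):
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource(resource_name)
    inst.timeout = 5000
    try:
        assert "," in inst.query("*IDN?")      # basic identification check
        inst.write("SOUR:FREQ 1.0E3")
        error = inst.query("SYST:ERR?")        # expect an empty error queue
        assert error.split(",")[0].strip() in ("0", "+0"), error
    finally:
        inst.close()
        rm.close()

for res in RESOURCES:
    print("Testing", res)
    smoke_test(res)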

For certification, a programmer's or user manual that contains the commands should be submitted along with the driver so that the certification review can be performed.

Hope that helps.

Best Regards,
National Instruments - Instrument Drivers/IVI Group
instrument.drivers@ni.com


I remember using a VI Analyzer plug-in years ago to validate my driver. I'm not sure if that is now included with VI Analyzer or maybe it's in the Instrument Driver Development Studio.

 

The first thing an NI employee would likely do after receiving a driver for validation would be to test it against the requirements above using VI Analyzer.

 

No sense submitting a driver for certification if it can't pass the automated test...
