Posts posted by Bryan

  1. Another SVN user here (With TortoiseSVN as standalone and also with an SVN server).  Was introduced to it years ago and have had no reason yet to move to any others.

    I like it for its (relative) simplicity when compared with others.  

    For LabVIEW, you can configure it to use the LVCompare and merge tools, (there are some blurbs online you can search for that tell you how to configure it). 

    A drawback I've found (at my current employer) is lack of organization in the repository (poor pre-planning).  They use one repository for EVERYTHING; it's gotten huge, and it wasn't planned for the best use of the branching/tagging functionality. 

    Another one is searching this HUGE repository I have to work with.  I've found no easy way to search for files/folders/etc.  My workaround for this is to use the TortoiseSVN command line and dump an SVN list to a file - then search the file to get a relative location for what I'm looking for.
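    The dump-and-search workaround above can be sketched in a few lines.  A minimal Python example, assuming the listing was already dumped with something like `svn list --recursive <repo-url> > svn_listing.txt` (the URL, file name, and sample paths are all hypothetical):

```python
def find_in_listing(listing_lines, keyword):
    """Return every repository path whose text contains keyword (case-insensitive)."""
    keyword = keyword.lower()
    return [line.strip() for line in listing_lines if keyword in line.lower()]

# Tiny fake listing standing in for the dumped file's contents:
listing = [
    "trunk/Testers/WidgetTester/Main.vi",
    "trunk/Common/DatabaseLogger.vi",
    "branches/experimental/DatabaseLogger.vi",
]
print(find_in_listing(listing, "databaselogger"))
# → ['trunk/Common/DatabaseLogger.vi', 'branches/experimental/DatabaseLogger.vi']
```

    In practice you'd read the lines from the dumped listing file instead of a literal list.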

  2. You could also create your own VI that establishes a connection to the database where the credentials are stored as constants, and remove the block diagram.  To do this, you have to create a project, then build a source distribution with that VI (always included).  Then, in the settings for that VI in the distribution, you select the option to remove the block diagram.  This is currently the most secure method of hiding LabVIEW source that I'm aware of.

    The only problem is that if the credentials ever need to be changed, you have to change them in the source VI, rebuild the source distribution, and then send the new VI to your students.

  3. On 1/6/2021 at 12:47 PM, ShaunR said:

    That's not very American. Where's the guns?

    Sorry, I missed that detail.  The LabVIEW program automates the targeting and actuation of firearms. 

    (To the FBI/NSA agent monitoring this post - this is what is known as a joke).

  4. On 12/27/2020 at 7:58 AM, Mefistotelis said:

    I actually don't care enough for LV to get triggered by the first sentence. But the last one... 😡

    I agree.  I've garnered great disdain for Winblows over the years: updates mandated by IT departments that negatively impact our testers, obsolescence, and the pain of installing unsigned drivers, just to name a few.  I would hate to see NI stop supporting Linux, as it has been growing in popularity and getting more user-friendly.

    Linux is a great and stable platform, though not for the faint of heart.  It takes more effort and time to build the same thing you could do more quickly with Windows.  However, if LabVIEW were open source and free, you could theoretically build systems for just the cost of time and hardware.  I've been wishing for years that they would support LabVIEW on Debian-based systems as well.

    I've created two Linux/LabVIEW-based setups in the past and never had the issues I've run into with Windows.  Yes, it took more time and effort, but as far as I know, the one(s) I created (circa 2004-5) have been working reliably without requiring any troubleshooting or support since their release.  One is running Mandrake and the other an early version of Fedora.

  5. I just found out this morning that "Runstate.Execution.TSDatabaseLoggingDatalink{GUID}" isn't a valid location for TestStand 2010 (and I'm not sure what other versions).  For 2010 at least, it is "Runstate.Execution.TSDatabaseLoggingDatalink{Database Schema Name}".

    After a lot of futzing around yesterday, I simply added a statement expression step after "Log to Database" in the "Log to Database" callback sequence.  Note: I didn't want to create a new local variable in the sequences of each tester, so I used Step.Result.Status to temporarily hold my value (Probably not a good practice, haha):

    Expression that works for TestStand 2010 (not sure about other versions), using Parameters.DatabaseOptions.DatabaseSchema.Name and replacing all spaces with "_":

    // Assign the schema name to our step status as a temporary local variable (Evaluate won't work unless this is done, for some reason)
    Step.Result.Status = Parameters.DatabaseOptions.DatabaseSchema.Name,
    
    // Replace all spaces with "_" in the schema name and build the path to the datalink object... then set its value to Nothing (destroys the DB reference, forcing TestStand to re-open it on the next UUT run)
    Evaluate("RunState.Execution.TSDatabaseLoggingDatalink" + SearchAndReplace(Step.Result.Status, " ", "_") + " = Nothing"),
    
    // Update step status to what it would normally be if we hadn't had to use it as a temporary local variable
    Step.Result.Status = "Done"

     

    Expression that works for at LEAST TestStand 2013 and 2016 (Using Parameters.ModelPlugin.Base.GUID and replacing all "-" with "_"):

    // Assign the GUID value to our step status as a temporary local variable (Evaluate won't work unless this is done, for some reason)
    Step.Result.Status = Parameters.ModelPlugin.Base.GUID,
    
    // Replace all "-" with "_" in the GUID and build the path to the datalink object... then set its value to Nothing (destroys the DB reference, forcing TestStand to re-open it on the next UUT run)
    Evaluate("RunState.Execution.TSDatabaseLoggingDatalink" + SearchAndReplace(Step.Result.Status, "-", "_") + " = Nothing"),
    
    // Update step status to what it would normally be if we hadn't had to use it as a temporary local variable
    Step.Result.Status = "Done"

     

    Edit: I found today that one of our TS2016 installations used the same method as I had posted in the TS2010 version above.  The DatabaseSchema.Name property had a combination of periods, spaces and dashes, all of which I had to SearchAndReplace with underscores.  I'm not currently sure what's causing the inconsistencies between versions (using the GUID vs. the schema name), but so far I've had to tweak each one on a case-by-case basis. 
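    For illustration, the name normalization those expressions perform (spaces, periods, or dashes becoming underscores before the datalink property lookup) can be sketched outside TestStand.  This is a hedged Python mock-up, not TestStand API code, and the schema name used is made up:

```python
import re

def datalink_property(schema_or_guid):
    """Build the datalink property name, mimicking the SearchAndReplace calls."""
    suffix = re.sub(r"[ .\-]", "_", schema_or_guid)  # spaces, periods, dashes -> "_"
    return "RunState.Execution.TSDatabaseLoggingDatalink" + suffix

# "My Schema-v1.0" is a made-up schema name:
print(datalink_property("My Schema-v1.0"))
# → RunState.Execution.TSDatabaseLoggingDatalinkMy_Schema_v1_0
```

    In TestStand itself you'd chain SearchAndReplace calls per character, since its expression language has no regex substitution.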

  6. I figured I would post a resolution to my own problem for anyone who may run into the same thing. 

    While searching for something unrelated today, I stumbled across my answer.  TestStand creates a connection to the configured database when the first UUT is tested and maintains that connection while UUTs are continuously tested.  The database connection is only released when the sequence is stopped.  I was able to come up with my solution by essentially compiling what I found in my searches.

    In my situation - "hiccups" in network connectivity or with the SQL server cause that connection to become invalid.  This is why we get our database logging errors and lose logged data.  I had read about the offline processing utility, but I wanted something simpler such as a LabVIEW test step that I could drop into a sequence that would handle things gracefully instead of setting up offline processing configurations on all of the testers. 

    After the first time the "Log to Database" callback is run, the datalink object becomes available at: "Runstate.Execution.TSDatabaseLoggingDatalink{GUID}".  (GUID will vary).  

    My solution (in my testbed so far) is a VI that locates that object and writes a null value (an empty variant constant) to it.  This appears to force TestStand to re-establish the database connection when each subsequent UUT is tested.  That way, when the test sequence sits in limbo for hours or days (plenty of time for DB/network "hiccups"), the data will still be logged the next time a UUT is run - assuming the DB and network are up when called.  I have yet to put this method to the test on one of the actual testers to validate it, but it seems promising in my "simulated environment" (we all know how that sometimes goes).  I should also mention that this should go at the end of the "Log To Database" sequence callback.

    I wish I could share the VI that I created, but I'm not sure of my company's policy on doing so; my description above should be enough to get people on the right path.

  7. TestStand Version(s) Used: 2010 thru 2016
    Windows (7 & 10)
    Database: MS SQL Server (v?)

    Note: The database connection I'm referring to is what's configured in "Configure > Result Processing", (or equivalent location in older versions).

    Based on some issues we've been having with nearly all of our TestStand-based production testers, I'm assuming that TestStand opens the configured database connection when the sequence is run, and maintains that same connection for all subsequent UUTs tested until the sequence is stopped/terminated/aborted.  However, I'm not sure of this and would like someone to either confirm or correct this assumption. 

    The problem we're having is that nearly all of our TestStand-based production testers have intermittently been losing their database connections - returning an error (usually after the PASS/FAIL banner).  I'm not sure if it's a TestStand issue or an issue with the database itself.  The operator, seeing and only caring about whether it passed or failed, often ignores the error message and soldiers on, mostly ignoring every error message that pops up.  Testers at the next higher assembly that look for a passed record of the subassembly's serial number in the database will then fail their test because they can't find one. 

    We've tried communicating with the operators to either let us know when the error occurs, re-test the UUT, or restart TestStand (usually the option that works), but it's often forgotten or ignored.

    The operators do not stop the test sequence when they go home for the evening/weekend/etc., so TestStand is normally left waiting for them to enter the next serial number of the device to test.  I'm assuming that their connection to the database is still open during this time.  If so, it's almost as though MS SQL has been configured to terminate idle connections, or - if something happens with the SQL Server - the connection isn't being properly closed or re-established. 

    Our LabVIEW-based testers don't appear to have this problem unless there really is an issue with the database server.  I believe the majority of these testers open > write > close their database connections at the completion of a unit test. 

    I'm currently looking into writing my own routine to be called in the "Log to Database" callback which will open > write > close the database connection.  But, I wanted to check if anyone more knowledgeable had any insight before I spend time doing something that may have an easy fix.
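    For what it's worth, the open > write > close pattern described above can be sketched as follows; a minimal Python example using sqlite3 as a stand-in for the MS SQL connection (the table and column names are invented for illustration):

```python
import sqlite3

def log_result(db_path, serial, status):
    """Open a fresh connection, write one result row, and close immediately."""
    conn = sqlite3.connect(db_path)  # open per UUT instead of holding one connection
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS uut_results (serial TEXT, status TEXT)")
        conn.execute("INSERT INTO uut_results VALUES (?, ?)", (serial, status))
        conn.commit()
    finally:
        conn.close()  # nothing left open for network/DB "hiccups" to invalidate
```

    The trade-off is a little connection overhead per UUT in exchange for never holding an idle connection that the server or network can silently kill.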

    Thanks all!

  8. On 5/6/2013 at 4:59 PM, hooovahh said:

    I've seen people add LabVIEW to their resume with less experience than that.

    Well to be fair, some companies have done the same thing to their employees who have never claimed any experience with a particular item/software/etc..  I've worked a couple of places where it has been: "Oh, you double-clicked the icon for "X" application?  You're now the resident EXPERT!". 

    Not too long ago, we had a "Kaizen" event where we had moved a piece of equipment in a production cell.  I knew nothing about said piece of equipment but it was Windows-Based, so I figured I would go ahead and reconnect the keyboard, mouse, monitor and power it up just to make sure that it still booted after the move.  A couple of people had seen me do this and now every time there's an issue with it, I'm called to look into it. 

    (Sorry to go OT)

  9. @pawhan11 That sounds exactly how SVN was set up where I currently work - Including having LabVIEW ISOs/Installers stored within. (I wonder if we work at the same company.)

    One of the drawbacks you've mentioned, and that I've found with SVN, is the ability to search for directories/files/etc.  I believe there are tools out there to do this but, since I'm not an administrator, I can't implement anything myself.  Something I've done is use the SVN commands to generate a text file listing of ALL of the SVN contents... then, if I need to search for something, I use Notepad or Notepad++ to search for keywords in the file.  Once found, I then have the mapping to find it in the SVN repository. 

    Something I would want to do if it were me - would be to set up a new repository (or reconfigure the current - if able) to use the proper multi-project structure that SVN recommends. 

  10. Our logging routines are home grown.  Of course, we're not logging EVERYTHING, mostly just overall test results and not each bit of data collected from a UUT.  Our testers log production pass/fail data to CSV files (older testers) or to the company's MS SQL database.  

    I've wanted to use SQLite for local logging, but the pushback I got was that it adds a dependency - having to create or install a viewer to access the logged data.  Most of the other engineers who would be accessing that data want something that's ASCII-based, easy to read, and easy to import into other formats without requiring anything other than common applications (e.g. Excel, Notepad, etc.).  They like the KISS method.
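    A minimal sketch of the kind of KISS-style CSV logger described above (the file layout and column choices here are hypothetical, not our actual format):

```python
import csv
from datetime import datetime

def log_pass_fail(path, serial, passed):
    """Append one timestamped pass/fail row to a plain CSV file."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(timespec="seconds"),
             serial,
             "PASS" if passed else "FAIL"])
```

    The resulting file opens directly in Excel or Notepad, which is exactly the appeal.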

  11. I haven't used many VCS systems aside from the very few variants that have existed at my current and previous employers where I had access for LabVIEW development.  Since some of those places didn't view LabVIEW as a programming language, I wasn't often granted permissions to have access to their VCS systems.  So, my experience is limited.

    I spent a lot of time in Visual SourceSafe, but that's not going to help you here.  However, I was able to grasp SVN + TortoiseSVN pretty quickly.  I've also (for a very short period of time) had a little experience with Trac, though not enough to be a resource for knowledge on it. 

    Additionally, I recall that VisualSVN Server was easy to install and manage an SVN server.

    I've tried to grasp Git, but having used TortoiseSVN + SVN for so long, I struggled to pick it up quickly, so I abandoned it.

  12. I was able to find a Declaration of Conformity dated some time in 2011, and a Google search shows that it's a 12-bit digitizer, so it shouldn't be that old.  You may have to get in contact with NI to find out more about this card. 

    Unless the card was such a flop for NI that they chose to erase all evidence of its existence - never to be spoken of again. 

  13. 3 hours ago, Grv Chy said:

    Thank you Bryan for your suggestion.  Do you know any example of sharing data with an external PC connected via LAN cable?  I am not able to figure out how to share data and save my .csv file directly on an external PC (containing no LabVIEW).  I would be very thankful.

    The Internet is loaded with examples of file sharing, so I don't really want to reinvent the wheel by writing up a tutorial.  A Google search will yield tons of tutorials and examples and you can even search based on your computers' operating systems.  

    For W10, Microsoft has the following article: https://support.microsoft.com/en-us/help/4092694/windows-10-file-sharing-over-a-network

  14. Are you meaning a *.txt file generated from a LabVIEW program saved onto a remote PC?  You should be able to set up a shared folder with appropriate permissions on your remote PC and have LabVIEW save the file to the share.  I won't get into the details since there's gobs of info on the internet on how to set up shares. 

    Once it's set up, you can just have your LabVIEW program point to the path of the shared folder.  For example: If your remote PC name is "REMOTEHOST", you name the shared directory as "LabVIEWDATA", and your text file name is going to be "TestData.txt", the file path for your text file would be (on a Windows machine): "\\REMOTEHOST\LabVIEWDATA\TestData.txt".  You can optionally replace the "REMOTEHOST" with the IP address of the remote machine, but if it's a DHCP assigned address, it could change on you in the future. 
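    The path-building from the example above, sketched in Python (the host, share, and file names are the hypothetical ones from the example):

```python
# Build the UNC path described above.  In Python source, "\\\\" is two
# literal backslashes and "\\" is one.
host = "REMOTEHOST"      # could be an IP address instead, but beware DHCP changes
share = "LabVIEWDATA"
file_name = "TestData.txt"

unc_path = "\\\\" + host + "\\" + share + "\\" + file_name
print(unc_path)  # \\REMOTEHOST\LabVIEWDATA\TestData.txt
```

    In LabVIEW you'd wire the equivalent path string (or a path constant) to your file write function.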

  15. I'm not sure that you're going to be able to (easily) do what you want without some advanced programming to automate/manipulate Microsoft Excel (for instance, using ActiveX or .NET).  Even then, I'm not sure it's possible to do what you're trying to do, as it involves Microshaft software, over which LabVIEW will have limited control.

    I agree with 'crossrulz'.  Whoever is pushing this requirement may need to reevaluate the need for this functionality.

  16. There is a "Flush File" function in the "File I/O >> Adv File Funcs" menu that you can have execute after every write.  

    However, if you already have Excel open to view the CSV file, I don't believe Excel will automatically update your spreadsheet contents every time the file changes, if that's what you're trying to do.  I haven't tried that myself. 

    If the above doesn't work, it may be that Excel doesn't constantly monitor the file and automatically update.  You could try using the "Refresh All" button in Excel under the "Data" menu, but you'd have to do this manually.  Additionally, you'd have to open your reference to the file with permissions configured in such a way that it doesn't block other applications from using the file if another already has it open.

    Someone more experienced in Excel and actively monitoring file contents may be able to provide more info.
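    A hedged sketch of the flush-after-every-write idea (the LabVIEW "Flush File" function plays the same role; this is just a Python illustration, not the LabVIEW implementation):

```python
import os

def append_and_flush(path, line):
    """Append one line, then flush both Python's buffer and the OS file cache."""
    with open(path, "a") as f:
        f.write(line + "\n")
        f.flush()              # push Python's internal buffer to the OS
        os.fsync(f.fileno())   # ask the OS to commit its cache to disk
```

    Flushing makes the data visible to other applications promptly, but it doesn't make Excel re-read the file; that side still needs a manual refresh.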

  17. 52 minutes ago, drjdpowell said:

    Interesting quote from the ex-NI person posting on Reddit:

    We all know that they wouldn't be the first company to do that, and certainly not the last.  It's happened in places I've worked before, where we were forced to use obscure, non-user-friendly software applications that were not the industry norm, for reasons unknown.  If I were to speculate, it was because someone in the upper echelon had been dazzled, or was/knew someone who could gain financially by way of it.  These applications were, as was said, enterprise-level agreements/suites/etc. and likely cost the company a lot of money, both in purchasing/agreements and in time wasted on learning curves and inefficient usage.

    Before I joined my current company, "Thou shalt use LabVIEW" was the command from above.  Not all were pleased with the mandate at the time and some still aren't, but it's one of the top reasons that helped me to get a job there.
