dannyt

Members
  • Content Count

    414
  • Joined

  • Last visited

  • Days Won

    12

dannyt last won the day on February 12 2016

dannyt had the most liked content!

Community Reputation

65

About dannyt

  • Rank
    Extremely Active
  • Birthday 08/08/1961

Profile Information

  • Gender
    Male
  • Location
    Devon UK
  • Interests
    Family Time
    Power Kiting and Buggying (3.5 and 5 meter Beamers)
    Reading (Iain Banks, Grisham, LotR type stuff, The Time Traveller's Wife, David Gemmell, Robin Hobb)
    Playing down the beach or on Dartmoor
    TV Films (24 & Battlestar Galactica)

LabVIEW Information

  • Version
    LabVIEW 2014

Recent Profile Visitors

3,219 profile views
  1. I place all the VIPM packages I have a dependency upon into a Mercurial repository created specifically for that purpose, and if I upgrade to a new version I ensure both versions are kept in the repository. In an ideal world I would upgrade from the Community edition of VIPM to the Pro version and create VIPC files instead
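A minimal sketch of that workflow, assuming Mercurial is installed; the package file names here are hypothetical examples, not the actual dependencies:

```
# Create a repository dedicated to third-party VIPM dependencies
hg init vipm-dependencies
cd vipm-dependencies

# Copy in a .vip package the project depends on (file name is an example)
cp ~/Downloads/oglib_array-4.1.0.79.vip .
hg add oglib_array-4.1.0.79.vip
hg commit -m "Add OpenG Array 4.1.0.79"

# When upgrading, keep the new version alongside the old one
cp ~/Downloads/oglib_array-4.2.0.21.vip .
hg add oglib_array-4.2.0.21.vip
hg commit -m "Add OpenG Array 4.2.0.21 (4.1.0.79 kept for older branches)"
```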
  2. dannyt

    Data repository

    This might be way more than you are looking for in terms of functionality, but I am using a full cloud-based Test Data Management system called http://www.skywats.com/. It is a cloud-based test results database with Yield & Trend Analysis, UUT reports, Root Cause analysis and much, much more. They provide a number of different ways of getting test results into the system: TestStand integration, a LabVIEW toolkit, and also a free trial.
  3. Again, to stress one thing I mentioned in my post: I have really found it helps to have all NICs always in an ACTIVE state, with the nice green link light lit, and that is why for £10-£50 I stick a cheap powered switch on the NIC, something like a Netgear GS105UK 5-port. It seemed to me that if you just hung something like a UUT straight off the NIC and it was not always powered on, then powering it on and making the NIC go from inactive to active made Windows do "things" that sometimes meant things did not work as expected, or just took a damn long time; Windows XP seemed better than the newer versions. Install and play with Wireshark, it can help tell you what is going where
  4. I know it makes me a slower coder, but when coding in LabVIEW I always use the autotool, and if I want to switch it off I actually use the tools palette rather than tabbing. About the only thing I use on the keyboard is Ctrl-E; it is partly also the reason I do not use the quick-drop option. I did have major problems with my mouse hand a few years ago and found that a left-handed Evoluent mouse really worked for me. Plus, much to my surprise as I am very left-handed, I found I could use a Logitech roller mouse with my right hand for things like web browsing or general screen use. So now I have two mice connected to my PC: a left-handed Evoluent one for LabVIEW programming or anything where real precision is needed, and a rollerball on the right-hand side for anything else, so I am switching which hand does what throughout the day. As I said, it does make me slower and less efficient, but on the other hand it allows my brain to keep up with what my hands do. EDIT: I just noticed that even when doing LabVIEW I use both mice; I use the rollerball to move FP or BD windows around my screen or from one screen to the other, and automatically switch to my left hand when I start putting down wires. Wow, I had not realised I was doing that.
  5. As has been said, you want to play around with the routing table on the Windows machine. Read up about the route command https://technet.microsoft.com/en-gb/library/bb490991.aspx and Windows routing in general; you will need http://serverfault.com/questions/510487/windows-get-the-interface-number-of-a-nic to get the interface number for the route add. I have PCs fitted with two NICs: I ping out of NIC 1 to an IP wireless device, across the air into another IP wireless device, and then back into NIC 2 on the same PC. With this setup I can also run iperf across the air and telnet out of each NIC as desired. Without setting up the routing table correctly, I would in truth just ping and iperf across the backplane of the PC from one NIC to the other, but the routing ensures I go out one NIC. I do all this with LabVIEW and route commands issued as System Exec calls, and it works well. I tend to delete all routes and then set up my own. If you do this, remember to also add in a route to your network for the main PC LAN or you cannot get to the internet or local servers :-) Also note that if any NICs are disabled or enabled, Windows will try to be clever, sort out the routing as it sees fit, and break your rules; to help overcome this I nearly always have a simple powered dumb Ethernet switch hanging on the NIC between my test device and the PC, so if I power off my test device the NIC still sees a good active connection to something.
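A minimal sketch of the kind of route setup described above, run from an elevated Windows command prompt; the subnets, gateways, and interface numbers here are hypothetical placeholders (get the real interface numbers from `route print`):

```
:: Show the current routing table, including interface numbers
route print

:: Remove any existing route to the test subnet so Windows cannot pick its own path
route delete 192.168.10.0

:: Force traffic for the test subnet out of NIC 1 (interface 12 is a placeholder)
route add 192.168.10.0 mask 255.255.255.0 192.168.10.1 if 12

:: Keep a route for the main office LAN so internet and local servers stay reachable
route add 10.0.0.0 mask 255.0.0.0 10.0.0.1 if 11
```

In LabVIEW these lines would each be issued through a System Exec call, as the post describes.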
  6. Been running it for a week now on my development PC and it seems OK with LabVIEW. The update from Win 7 to Win 10 was problem-free for me, but did take a long time. My only real issue so far is with a Cisco VPN client, which no longer works at all, so I still need to use a Windows 7 PC to get access to my factory-located test systems.
  7. Yes, a big thanks from me as well; it gives me something to watch in the evenings
  8. Amazing news, has anybody else seen this? http://sine.ni.com/nips/cds/view/p/lang/en/nid/213095 LabVIEW Home Bundle for Windows. Great news from NI
  9. You could add the information to the LabVIEW Wiki at http://labviewwiki.org/Home; I am never quite sure if that site is related to LAVA or not.
  10. I never did a full list of the bitset mappings; I found what I wanted and left it at that. I did find this on the web http://sthmac.magnet.fsu.edu/LV/ILVDigests/2003/12/06/Info-LabVIEW_Digest_2003-12-06_003.html that might help you, but it may well be different for different LabVIEW versions. I cannot get to grips with your idea of putting the property within the VIs you are interested in; the idea of putting in code that does not really have anything to do with the function you want your VI to perform does not sit well with me, but maybe that is just me. You are doing something harder than my recompile check. In my case, the one or two VIs that actually needed to change would be edited, saved and closed. These VIs would then be committed to the source control system with a comment about the change just made. Then by running the tool VI I could get a list of all the VIs that the original edit had caused to recompile, and these could be saved and committed into the source control system with a comment saying they were recompiles. The nice thing in my case was that all the VIs were closed when I ran my tool. In your case I think you want to get a list of files in memory and then look at them all one at a time
  11. I am sure there was something for this. At my previous company, before the separate source and compiled code option existed, I used a scripting property to check the modification bits to look for VIs that only had "recompiled" as their change. Cannot remember what I did though. EDIT: Just found it. I would list all the files in a directory, then open one VI at a time in edit mode, and there were property values of the VI showing "Modifications -> Block Diagram Mods Bitset", "Modifications -> Front Panel Mods Bitset" and "Modifications -> VI Panel Mods Bitset". I had to play around a little to find out what the values meant, but I did get it working
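As an illustration of the kind of check described above, here is a minimal Python sketch of testing a modification bitset against a mask. The flag constants are hypothetical placeholders: the real bit meanings of the LabVIEW Mods Bitset properties vary by version and are not documented here.

```python
# Hypothetical flag values -- the real bitset meanings differ between
# LabVIEW versions, so these constants are placeholders for illustration.
RECOMPILED = 0x0001   # VI was recompiled, no real edit
BD_EDITED = 0x0002    # block diagram changed
FP_EDITED = 0x0004    # front panel changed

def only_recompiled(mods_bitset: int) -> bool:
    """True if the only modification recorded is a recompile."""
    return mods_bitset != 0 and (mods_bitset & ~RECOMPILED) == 0

print(only_recompiled(0x0001))  # recompile only -> True
print(only_recompiled(0x0003))  # recompile plus block-diagram edit -> False
```

A tool VI doing the same job would read the three Mods Bitset properties for each VI and apply a mask check like this to decide whether the VI needs a "recompile only" commit.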
  12. I am very nearly that old; I certainly used punch cards when I started. My experience covers that of a stand-alone Release Manager and of being part of a Release Management Team at two major financial institutions. In one we released around every 1-2 weeks between 6pm and 6am while the trading markets were down, and in the other we did a release approximately every six months over a weekend, again while trading markets were down. In both cases the role covered doing the formal builds and release of software to both the test environment and live production systems; this was done mainly by automated scripts written by myself and the release team. We were the first port of call for the test teams when there were problems with the builds or environment, as we knew how the system went together, what changes had just been added, and which programmers to go to if required. We coordinated and controlled the software changes by the programmers together with the required database changes by the Oracle DB admin team and/or any Unix system changes required. Finally we had to ensure we were happy that, in the case of a problem with a live production release, we could roll the release back to the previous working system. The key thing, though, was to act as the glue between the test teams, the various principal engineers responsible for different aspects of the system, and the project managers. We were a team where everybody had a part to play to make a successful release occur. I did not look at the test manager's test results, but I did check he was happy that the testing cycle was complete and passed; I did not look at the programmers' code, as the principal engineers made sure coding standards etc. were adhered to; I was not a Unix admin or an Oracle DBA, but I still needed to ensure they were in place and ready if needed.
At the end of it all I actually did the physical releases, so I knew what went wrong and needed closer supervision next time, who cut corners (always with the best of intentions) and needed to be encouraged not to, who could be relied on, and who would always forget to provide an SQL roll-back script or forget to tell the Unix admins of some dependency package needed as part of the deployment. Another thankless task I have done is that of SCC Manager (ClearCase); I am old enough to have had the arguments with programmers who flatly refused to keep their code in the SCC system: "why do we need to bother with SCC, I'll just copy and rename my folders". Finally, I am sorry that I seem to have gone off on a bit of a rant with this subject, and possibly bored people; this is not something I usually do. I just feel there are many roles involved in getting good-quality software released to customers; some are obvious in their impact and glory, others less so, and I do not like to see parts of the team that make it happen dismissed.
  13. As I had already tried to point out before your comment, the hot coals remark was in jest, and I did go on to say "justify" the change; in my understanding the word justify implies evidence and rationalisation, not emotive argument, but maybe it is not seen to mean that by others. I think there is a very narrow view or experience of a Release Manager's role being used here: somebody who lives outside of the development team and development process and has no technical ability or validity. That has not been my experience; I can remember a time when that was the case, but that is going back several decades.
