Everything posted by PA-Paul

  1. As a developer using open source code, I want to be able to inspect and run the unit tests/verification steps etc that were used to prove the code works as intended before release.
  2. So, does this sound like a sensible way to evaluate which option (expressing the X values for my interpolation in Hz or MHz) provides the "best" interpolation: Generate two arrays containing e.g. 3 cycles of a sine wave. One data set is generated with N samples, the other with 200xN samples. Generate two arrays of X data, N samples long, one running from 0 to N and the other from 0 to N E6. Generate two arrays of Xi data, 200xN samples long, again with one running from 0 to N and the other from 0 to N E6. Perform the interpolation once for each data set using the N samp
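A rough sketch of that comparison in Python/NumPy, with SciPy's PCHIP standing in for the cubic Hermite step (the array sizes and sine parameters here are illustrative only, not the actual test values):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

N = 64        # coarse sample count (illustrative)
cycles = 3    # cycles of sine in the record

y = np.sin(2 * np.pi * cycles * np.arange(N) / N)   # coarse data, N samples

for scale, label in [(1.0, "X in 0..N (MHz-like)"), (1e6, "X in 0..N*1e6 (Hz-like)")]:
    x = np.arange(N) * scale                          # coarse X grid
    xi = np.linspace(0.0, (N - 1) * scale, 200 * N)   # fine Xi grid, 200x denser
    yi = PchipInterpolator(x, y)(xi)
    ref = np.sin(2 * np.pi * cycles * xi / (N * scale))  # exact sine at the fine points
    print(label, "max abs error:", np.max(np.abs(yi - ref)))
```

If the two scalings really are equivalent, the two printed errors should agree to within floating point noise; any systematic difference points at the Hz-valued X data losing precision.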
  3. Thought I posted this the other day, but apparently didn't. I agree it's likely a precision/floating point thing. The issue I have is how it tracks through everything. So the interpolation is used to make the frequency interval in our frequency response data (for our receiver) match the frequency interval in the spectrum (FFT) of the time domain waveform acquired with the receiver. We do that so we can deconvolve the measured signal for the response of the device. We're writing a new, improved (and tested) version of our original algorithm and wanted to compare the outputs of each. In the new one,
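For context, the deconvolution step described here is conceptually just a division in the frequency domain. A minimal sketch of that idea in Python (variable names are hypothetical, and no regularisation or windowing is shown, which a real implementation would need):

```python
import numpy as np

def deconvolve(measured_td, response_freq):
    """Correct a measured waveform for the receiver's frequency response.

    measured_td   : time-domain waveform acquired with the receiver
    response_freq : complex receiver response, already interpolated onto the
                    same frequency grid as np.fft.rfft(measured_td)
    """
    spectrum = np.fft.rfft(measured_td)
    corrected = spectrum / response_freq      # naive deconvolution, no regularisation
    return np.fft.irfft(corrected, n=len(measured_td))
```

The interpolation discussed above is what puts response_freq onto the FFT's frequency grid, which is why small interpolation differences propagate into the deconvolved waveform.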
  4. Hi All, This may be a maths question, or a computer science question... or both. We have a device frequency response data set, which is measured at discrete frequency intervals, typically 1 MHz. For one particular application, we need to interpolate that down to smaller discrete intervals, e.g. 50 kHz. We've found that cubic Hermite interpolation works pretty well for us in this application. Whilst doing some testing of our application, I came across an issue which is arguably negligible, but I'd like to understand the origins and the ideal solution if possible. So
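A minimal sketch of that interpolation step (Python, with SciPy's PCHIP as one cubic Hermite variant; the response curve and frequency span below are made up for illustration):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

f_meas = np.arange(1e6, 21e6, 1e6)                 # measured grid: 1 MHz steps
resp_meas = 1.0 / (1.0 + (f_meas / 10e6) ** 2)     # made-up device response

f_fine = np.linspace(1e6, 20e6, 381)               # target grid: 50 kHz steps
resp_fine = PchipInterpolator(f_meas, resp_meas)(f_fine)
```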
  5. Unfortunately, even with debug enabled you can't probe the class wire (well, actually you can, but all you get is the class name, not the private data). At least that is the case on VIs within the PPL (so if you try to probe the class wire on the BD of a VI within the packed library, all you get is the name of the class). I'm pretty sure the same is true in the calling code once the class wire has "been through" a VI from the PPL. Paul
  6. Not sure where this question best fits; it could have gone in application design and architecture (as I'm using the PPL based plugin architecture), or possibly OOP as I'm using OOP... Anyway, I have made an interface class for my plugin architecture, put it in a library (.lvlib) and then packed that library (to PPL - .lvlibp) for use in my application. My question is relatively simple: why, when I look at the class within the packed library, is there no sign of the "class.ctl" (i.e. the class private data control)? Related - is the lack of class private data control within the PPL
  7. Thanks Tim, What I find strange is that on a fresh Windows 7 install, the installer built with LV tells me I need first to install .NET 4.6.1 and then does it. Why can't it do something similar in Windows 10 and alert me to the fact that .NET installs are needed and do those for me! Ho hum, another reason (aside from DAQmx 16 not being compatible with LV2012) to think about migrating up to a newer version! Thanks again Paul
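As a point of reference, a check of that kind is usually done against the registry. A small Python sketch, assuming the documented "Release" value mapping for .NET Framework 4.6.1 (the threshold numbers are quoted from memory and worth verifying against Microsoft's documentation):

```python
import winreg

def dotnet_461_installed():
    """Return True if .NET Framework 4.6.1 or later appears to be installed."""
    key_path = r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            release, _ = winreg.QueryValueEx(key, "Release")
    except OSError:
        return False
    # 394254 ~ 4.6.1 on Windows 10 November Update; 394271 ~ 4.6.1 on other versions
    return release >= 394254

print(dotnet_461_installed())
```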
  8. Hi All, I have an application built in LabVIEW 2012 which we have been distributing fine with an installer built in LabVIEW 2012 for some time. Recently, a colleague tried to install the application onto a Windows 10 machine and although the installation process seemed to go smoothly, for some reason one of the two exes distributed by the installer gives me the ever so helpful "This VI is not executable. Full development version is required." message, repeated over and over.
  9. Background: Precision Acoustics Ltd have been at the forefront of the design, development and manufacture of ultrasonic measurement equipment for over 25 years. The company specialises in the research and development of ultrasonic test equipment used extensively in the QA of medical devices, to provide industry with ultrasonic Non-Destructive Examination (NDE), and within academia and national measurement institutes throughout the world. Brief: Together with Bournemouth University we are looking for a Software Engineer to work on a joint project focused on the developmen
  10. As a company we've been using LabVIEW for a while to support various products, and now we're going to be trying to improve our work by implementing more of a collaborative approach to roll out some new versions. But we're not traditionally a "software company", so we are starting to look at how we can do this from a "process" point of view. I was just wondering if anyone here has any particular hints/tips or advice for where team members may be working on the same bits of code, how to avoid conflicts etc. If there's any useful literature or resources that may give some insight into this are
  11. New job opening for a graduate computer scientist for a software architect role on a "Knowledge Transfer Partnership" between Precision Acoustics Ltd (www.acoustics.co.uk) and Bournemouth University in the UK. Looking for someone with LabVIEW experience. Job advert and details here: http://www.bournemouth.ac.uk/jobs/vacancies/technical/advert/fst102.html Salary will be £26,000.00 per annum Precision Acoustics require a dedicated IT Software Architect to join their team on a specialised project. Precision Acoustics manufactures ultrasonic measurement products for medical and NDT indu
  12. Thanks for the answers. Had a suggestion from someone else which I'll try - he has a small Dell XPS 13, but sets the resolution down to 1600x900 and then runs at 100% scaling instead of going for full resolution and scaling up... I don't think it sounds the ideal solution, but I'll give it a try. Thanks Paul
  13. Hi All, Wasn't overly sure where to ask this... I'm having issues related to screen resolution, but mostly to do with block diagram behaviour rather than front panel object sizing, which comes up regularly! Anyway, I have some code which I wrote on my PC (Windows 7.1) at the office - screen resolution is 1680x1050. When I look at the BD on that PC, things line up nicely and look generally OK. However, when I open the same code on my MS Surface Pro 3 (Windows 8.1, screen res 2160x1440 - and with scaling set to 125% in the control panel display options) it seems that different things on t
  14. Does anyone know if there's a way to get the value of a position along a slide control based on mouse co-ordinates? For example, the waveform graph has an invoke node which will "Map Coords to XY". I'm after similar functionality but for a slide control, so that I can either prevent unwanted clicks in a certain region of the slider (i.e. I have a slide control with two sliders and want to either filter out any clicks that fall outside of the region between the sliders, or switch the active slider to be the one closest to the click before the click is processed). I put a slightly more detailed (comp
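The mapping itself is just a linear rescale of the mouse coordinate onto the slider's data range. A rough sketch of the arithmetic (Python; the geometry numbers would have to come from the slide's scale/housing properties, and are placeholders here):

```python
def coords_to_slider_value(mouse_x, track_left, track_width, v_min, v_max):
    """Map a horizontal mouse coordinate onto a slider's data range."""
    frac = (mouse_x - track_left) / float(track_width)
    frac = min(max(frac, 0.0), 1.0)        # clamp clicks that land off the track
    return v_min + frac * (v_max - v_min)

# e.g. deciding which of the two sliders a click is closest to
value = coords_to_slider_value(mouse_x=312, track_left=100, track_width=400,
                               v_min=0.0, v_max=10.0)
```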
  15. Thanks for the link; it feels a bit kludgy to me, so I think I will just aim to not allow the VIs to be stopped from within the panel! Is it considered good practice to remove a VI from a subpanel before putting something else in? It seems you don't need to, but it would be good to know the best way! Thanks Paul
  16. Thanks for that - I'm not sure at this point if I will need to allow the child VI to stop itself in the end application. In all honesty, I think probably not, except that I may want to be able to run the children as standalone VIs as well, in which case I'll need to be able to close them cleanly. Is there a way to find out whether a VI has been called dynamically? e.g. is there a property I could query inside a VI to see if it was launched dynamically by another VI? I guess a simple way would be to have a connector pane boolean for static/dynamic and wire the relevant constant in
  17. Hmmm.... I missed that one to be honest - Thanks! That said, I just tried it, and strangely it doesn't seem to work... even when I press the "stop" button to stop the dynamic VIs, the state still returns "running" - what am I missing?! Attached is an example of how I'm trying to do things, including the new idea of checking the execution state. Comments more than welcome! The other strange thing is that I was trying something similar the other day using a Not A Refnum comparison. In that example, I was thinking more about the underlying hardware control "engines" for this applic
  18. Hi All, I'm designing the architecture for a new application. I'm looking at keeping things modular and breaking down the functionality of the system into modules (that can ultimately be re-used). For the UI, I was planning to use a subpanel control and load the modules into that when needed. I haven't used subpanels much in the past (we've always ended up going with a tab control, but that makes the interface less reusable and less modular, as all the user events for each "module" are in the same diagram). Anyway, I'm having a little play and running into a small problem - how can I t
  19. Hi All, Posted this yesterday on ni.com (http://forums.ni.com/t5/LabVIEW/Problems-calling-a-dll-with-the-net-constructor-node-from-a-vi/td-p/2282712) as I couldn't get in here for some reason. Not had a response yet so thought I'd throw it here... I also note my formatting got messed up on ni.com... For info, I'm using LabVIEW 2012 f3 and Windows 7. I have a hardware control application which we wrote to be nominally device independent. We did that by writing our own device drivers (in LabVIEW) which are distributed as .llb files, with the top level VI within the llb being the main
  20. Liang - that didn't seem to work. The OSK came up fine, but when I ran the OSK kill VI, I saw a cmd window pop up and then disappear - but the OSK stayed on the screen... I'll try it with a couple of other things to see if it will work. Thanks Paul Just tried taskkill on the OSK within a cmd window and it doesn't like it for some reason (access is denied!) but it does work on other applications, so it should work for what I want to do. Thanks Paul
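For reference, the same kill attempt scripted rather than typed into cmd might look like the sketch below (Python). The "access is denied" result is likely because osk.exe runs with elevated/UIAccess rights, so the command generally has to be issued from an elevated process:

```python
import subprocess

# Force-kill the on-screen keyboard by image name; typically needs elevation.
result = subprocess.run(["taskkill", "/IM", "osk.exe", "/F"],
                        capture_output=True, text=True)
print(result.returncode, result.stdout.strip(), result.stderr.strip())
```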
  21. Hi All, I have a bit of an annoying problem. I've written an application which acts as a remote server and controls some external kit. A client PC can connect to the server and then remotely operate said external kit. In general, it works fine. But it appears that there's an instability of some kind with the dll supplied by the external kit manufacturer which causes my application to intermittently fall over and die (with a nice Windows "This program is no longer working" error message). When this happens the client can no longer communicate with the server, since it's obviously fallen over
  22. I'll have to confess to having not measured it. I was very pushed for time so just made sure it looked quick and smooth! I don't have access to the hardware to actually do any characterisation now either. When I next get it back I'll have a look at it. Sorry! Paul
  23. Hi All, So I got the Nagle algorithm switched off using the VIs in asbo's link above. I also got rid of a couple of short command/response items which were being sent to get waveforms, and finally I stopped chopping the data up into chunks, and everything now seems to work very smoothly. No stalling or juddering at all. So, that looks to be the best way for my application (for now at least) - thanks all for your input! Cheers Paul
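For anyone following along outside LabVIEW, switching the Nagle algorithm off corresponds to the standard TCP_NODELAY socket option. A minimal sketch of the same idea in Python (address, port and command bytes are placeholders):

```python
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Disable Nagle so small command/response messages go out immediately
# instead of being coalesced while waiting for ACKs from the peer.
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
sock.connect(("192.168.0.10", 5025))
sock.sendall(b"GET_WAVEFORM\n")
data = sock.recv(65536)
sock.close()
```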
  24. Thanks for all of the replies. I'm going to make a couple of tweaks to the code now and see what happens. @ShaunR - you're right, I am polling - and I know this will cause a reduction in the possible transfer rate, but what I was seeing (prior to disabling the Nagle algorithm) was a plenty fast enough transfer rate, but it would get say 10-20 waveforms fast (>30 waveforms a second) then sporadically stall to 2-3 per second (which ties in with the time delay of the ACK etc). The situation was manageable when the two PCs were connected only with a crossover ethernet cable, but when I then put
  25. Phillip, I just found the "TCP NoDelay" property in the "Instr" property node, but you have to wire a VISA name into that property node. I'm using the TCP VIs and only have a TCP refnum. How do I get from one to the other?! Thanks in advance for any info! Paul