
PA-Paul

Members
  • Content Count: 135
  • Joined
  • Last visited

Community Reputation: 3

About PA-Paul
  • Rank: Very Active

Profile Information
  • Gender: Not Telling
  • Location: Dorchester, Dorset, UK

LabVIEW Information
  • Version: LabVIEW 2016
  • Since: 2004


  1. As a developer using open source code, I want to be able to inspect and run the unit tests/verification steps, etc., that were used to prove the code works as intended before release.
  2. So, does this sound like a sensible way to evaluate which option (expressing the X values for my interpolation in Hz or MHz) provides the "best" interpolation?
     • Generate two arrays containing, e.g., 3 cycles of a sine wave: one data set with N samples, the other with 200xN samples.
     • Generate two arrays of X data, N samples long: one running from 0 to N and the other from 0 to N*1e6.
     • Generate two arrays of Xi data, 200xN samples long, again with one running from 0 to N and the other from 0 to N*1e6.
     • Perform the interpolation once for each X/Xi pair, using the N-sample sine wave for the Y data in each case.
     • Calculate the average difference between each interpolated data set and the 200xN-sample sine wave.
     If one option is better than the other, it should have a lower average (absolute) difference, right? I'll code it up and post it shortly (a rough sketch of the idea is below), but thought I'd see if anyone thinks that's a valid approach to evaluating this! Thanks Paul
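     A minimal Python sketch of that evaluation, using SciPy's PchipInterpolator as a stand-in for LabVIEW's cubic Hermite interpolation VI (the array sizes and scale factors are illustrative assumptions, not values from the post):

        import numpy as np
        from scipy.interpolate import PchipInterpolator

        N, CYCLES, UP = 1000, 3, 200

        t = np.arange(N)                              # coarse sample index, 0..N-1
        ti = np.linspace(0, N - 1, UP * N)            # fine grid over the same span
        y = np.sin(2 * np.pi * CYCLES * t / N)        # N-sample sine wave
        y_true = np.sin(2 * np.pi * CYCLES * ti / N)  # "truth" on the fine grid

        def mean_abs_error(scale):
            # Same interpolation problem, with the X axis expressed in different units.
            yi = PchipInterpolator(t * scale, y)(ti * scale)
            return np.mean(np.abs(yi - y_true))

        print("X spanning 0..N     :", mean_abs_error(1.0))  # 'MHz'-style axis
        print("X spanning 0..N*1e6 :", mean_abs_error(1e6))  # 'Hz'-style axis

     Whichever scaling gives the lower mean absolute error would, by this test, be the "better" unit choice.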
  3. Thought I posted this the other day, but apparently didn't. I agree it's likely a precision/floating-point thing. The issue I have is how it tracks through everything. The interpolation is used to make the frequency interval in our frequency response data (for our receiver) match the frequency interval in the spectrum (FFT) of the time-domain waveform acquired with the receiver. We do that so we can deconvolve the measured signal for the response of the device (a rough sketch of that step is below). We're writing a new, improved (and tested) version of our original algorithm and wanted to compare the outputs of each. In the new one we keep frequency in Hz; in the old, MHz. When you run the deconvolution from each version on the same waveform and frequency response, you get the data below: the top graph is the deconvolved frequency response from the new code, the middle from the old code, and the bottom is the difference between the two. It's the structure in the difference data that concerns me most - it's not huge, but it's not small, and it appears to grow with increasing frequency. Took me a while to track down the source, but it is the interpolation. If we convert our frequency (X data) to MHz in the new version of the code, the structure vanishes and the average difference between the two is orders of magnitude smaller. And that's where I'd like to know: which is the more correct spectrum, the old or the new? Any thoughts? Paul
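     For reference, a hedged Python sketch of the kind of pipeline described here (assuming simple spectral division, with the device response interpolated onto the waveform's FFT grid; the function and parameter names are invented for illustration):

        import numpy as np
        from scipy.interpolate import PchipInterpolator

        def deconvolved_spectrum(waveform, dt, resp_freq_hz, resp):
            # Spectrum of the acquired time-domain waveform.
            spectrum = np.fft.rfft(waveform)
            f = np.fft.rfftfreq(len(waveform), dt)  # FFT bin frequencies, in Hz
            # Interpolate the measured device response onto the FFT grid
            # (assumes the response data covers the full FFT frequency band).
            resp_i = PchipInterpolator(resp_freq_hz, resp)(f)
            # Divide out the device response to recover the underlying signal.
            return spectrum / resp_i

     The unit sensitivity enters at the PchipInterpolator call: feeding it resp_freq_hz versus resp_freq_hz/1e6 (with f scaled to match) yields very slightly different interpolated responses.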
  4. Hi All, This may be a maths question, or a computer science question... or both. We have a device frequency response data set, which is measured at discrete frequency intervals, typically 1 MHz. For one particular application, we need to interpolate that down to smaller discrete intervals, e.g. 50 kHz. We've found that cubic Hermite interpolation works pretty well for us in this application. Whilst doing some testing of our application, I came across an issue which is arguably negligible, but I'd like to understand its origins and the ideal solution if possible. So - if I interpolate my data set with my X values in MHz and create my "xi" data set in MHz with a spacing of 0.05, I get a different result from the interpolation VI than I do if I scale my X data to Hz and create my xi array with a spacing of 50,000. The difference is small (very small), but why is it there in the first place? I assume it comes from some kind of floating-point precision issue in the interpolation algorithm... but is there a way to identify which of the two options is "better" (i.e. should I keep my X data in Hz and just scale to MHz for display purposes when needed, or should I keep it in MHz)? In principle there should be no difference - in both cases I'm asking the interpolation algorithm to interpolate "by the same amount" (can't think of the right terminology there to say we're going to 1/20th of the original increment in both cases). Attached is a representative example of the issue (in LV 2016); a minimal text-language equivalent is sketched below. Thanks in advance for any thoughts or comments on this! Paul Interpolation with Hz and MHz demo.vi
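     A minimal Python analogue of the attached demo VI, with SciPy's PchipInterpolator assumed as the equivalent of LabVIEW's cubic Hermite interpolation (the response values are random stand-in data, not measurements):

        import numpy as np
        from scipy.interpolate import PchipInterpolator

        f_mhz = np.arange(0.0, 100.0, 1.0)   # response measured every 1 MHz
        resp = np.random.default_rng(0).normal(size=f_mhz.size)  # stand-in data

        fi_mhz = np.arange(0.0, 99.0, 0.05)  # interpolate down to 50 kHz steps
        yi_mhz = PchipInterpolator(f_mhz, resp)(fi_mhz)              # X in MHz
        yi_hz = PchipInterpolator(f_mhz * 1e6, resp)(fi_mhz * 1e6)   # X in Hz

        # Mathematically identical problems; in floating point they can differ.
        print("max |difference|:", np.max(np.abs(yi_mhz - yi_hz)))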
  5. Unfortunately, even with debug enabled you can't probe the class wire (well, actually you can, but all you get is the class name, not the private data). At least that is the case on VIs within the PPL (so if you try to probe the class wire on the BD of a VI within the packed library, all you get is the name of the class). I'm pretty sure the same is true in the calling code once the class wire has "been through" a VI from the PPL. Paul
  6. Not sure where this question best fits - it could have gone in application design and architecture (as I'm using the PPL-based plugin architecture), or possibly OOP (as I'm using OOP)... Anyway, I have made an interface class for my plugin architecture, put it in a library (.lvlib) and then packed that library (to a PPL - .lvlibp) for use in my application. My question is relatively simple: why, when I look at the class within the packed library, is there no sign of the "class.ctl" (i.e. the class private data control)? Related - is the lack of the class private data control within the PPL the reason I can't probe my class wire in my application built against the PPL? (I can probe it and see the private data values if I code against the original unpacked class.) Any thoughts or insights would be gratefully received! Paul
  7. Thanks Tim, What I find strange is that on a fresh Windows 7 install, the installer built with LV tells me I need to install .NET 4.6.1 first, and then does it. Why can't it do something similar in Windows 10 - alert me to the fact that .NET installs are needed, and do those for me? Ho hum, another reason (aside from DAQmx 16 not being compatible with LV 2012) to think about migrating up to a newer version! Thanks again Paul
  8. Hi All, I have an application built in LabVIEW 2012 which we have been distributing fine, with an installer built in LabVIEW 2012, for some time. Recently, a colleague tried to install the application onto a Windows 10 machine, and although the installation process seemed to go smoothly, one of the two exes distributed by the installer gives the ever-so-helpful "This VI is not executable. Full development version is required. This VI is not executable. Full development version is required. ..." error, repeated over and over (you get the picture!). As an experiment, I set up two clean virtual machines, one running Windows 7 SP1 and one running Windows 10, and ran the same installer on both. On Windows 7, I got a prompt saying that .NET Framework 4.6.1 needed to be installed first; that happened automatically, and afterwards the installer continued and everything worked fine (including the exe in question). On Windows 10 I got no such warning - the installer ran through and the exe failed with the error message described above. On the Win10 VM, I then went into "Turn Windows features on or off", and although there's a tick against ".NET Framework 4.6 advanced features", there's no specific sign of 4.6.1. As a test, I added a tick to ".NET Framework 3.5 (includes .NET 2.0 and 3.0)" and installed that. That made my exe run fine... So... is there any way to force the LV installer to prompt for/install the necessary .NET "stuff" when running on Windows 10? Anything else I can do to automate things so that our customers don't need to manually add .NET support for the program to run if they're using Windows 10? (A sketch of one possible prerequisite check is below.) Thanks in advance! Paul
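     One possible workaround - a hedged sketch, not built-in LabVIEW installer functionality: have the application (or a small launcher) check the .NET registry value at startup and direct the user to the .NET installer if it's missing. Microsoft documents the "Release" DWORD per framework version; values of 394254 and above indicate 4.6.1 or later:

        import winreg

        def dotnet_461_installed():
            key_path = r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"
            try:
                with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
                    release, _ = winreg.QueryValueEx(key, "Release")
                    return release >= 394254  # 4.6.1 or later
            except OSError:
                return False  # v4 "Full" key absent: no .NET 4.x installed

        if __name__ == "__main__":
            print(".NET 4.6.1 or later present:", dotnet_461_installed())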
  9. Background: Precision Acoustics Ltd have been at the forefront of the design, development and manufacture of ultrasonic measurement equipment for over 25 years. The company specialises in the research and development of ultrasonic test equipment, used extensively in the QA of medical devices, to provide industry with ultrasonic Non-Destructive Examination (NDE), and within academia and national measurement institutes throughout the world. Brief: Together with Bournemouth University, we are looking for a Software Engineer to work on a joint project focused on the development of a new software platform (and associated outputs) to control acoustic and ultrasound measurement instrumentation. For the successful candidate, this is an exciting opportunity to develop the skills required to plan, manage and execute a significant development project, working alongside and learning from industry experts in LabVIEW development and Agile software development. There is also a significant personal development budget available, along with the opportunity to undertake a project-focused Masters in Research (MRes) at the University. Whilst the majority of KTP candidates remain with the industrial partner after the project, the skills acquired would be widely applicable and highly desirable in future employment opportunities. The project will seek to develop a platform and agile development process that has the hallmarks of a well-engineered software system, including all documentation and verification outputs. The solution will ideally integrate existing software developed internally by the company, balancing the need for re-use against the need for a best-practice architecture. This must be undertaken in a way that supports the short-, medium- and long-term business operations of the company. For more information and the application process, visit the Bournemouth University website: https://www1.bournemouth.ac.uk/software-engineer-ktp-associate-fixed-term
  10. As a company we've been using LabVIEW for a while to support various products, and now we're going to try to improve our work by taking a more collaborative approach as we roll out some new versions. But we're not traditionally a "software company", so we're starting to look at how we can do this from a "process" point of view. I was just wondering if anyone here has any particular hints/tips or advice for situations where team members may be working on the same bits of code - how to avoid conflicts, etc. Any useful literature or resources that might give some insight into this area would also be gratefully received! For info, we're planning to use Mercurial for source code control, but we haven't previously gone as far as integrating SCC into LabVIEW (we've previously used SVN, and managed commits and checkouts through TortoiseSVN, but never used locking or similar, as in general people weren't working on the same bits of code). Not sure whether we would this time either, but we'll consider it if it is genuinely useful. (A starting-point ignore file is sketched below.) Thanks in advance. Paul
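     Not authoritative, but a common starting point: because VIs are binary and don't merge, much of the pain comes from files that change without meaningful edits. A minimal .hgignore along these lines (the "builds" directory name is an assumption about the project layout), combined with enabling LabVIEW's "Separate compiled code from source file" option so VIs don't change on every recompile, tends to reduce conflicts considerably:

        syntax: regexp
        # Per-user / per-machine files LabVIEW generates next to the project:
        \.aliases$
        \.lvlps$
        # Build outputs (assuming a top-level "builds" directory):
        ^builds/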
  11. New job opening for a graduate computer scientist, for a software architect role on a "Knowledge Transfer Partnership" between Precision Acoustics Ltd (www.acoustics.co.uk) and Bournemouth University in the UK. Looking for someone with LabVIEW experience. Job advert and details here: http://www.bournemouth.ac.uk/jobs/vacancies/technical/advert/fst102.html Salary will be £26,000.00 per annum. Precision Acoustics require a dedicated IT Software Architect to join their team on a specialised project. Precision Acoustics manufactures ultrasonic measurement products for the medical and NDT industries. Based in the south of England in Dorchester, Precision Acoustics is owned by its Managing Director and two of the Research Scientists. The company was established in its present form in 1997 and is well established as a major supplier of equipment for the MHz ultrasound markets on a worldwide basis. Your role will be to drive forward the deployment of the new software architecture, to enable the company to transition to a new and improved way of working. In order to do that, you will need to influence effectively, communicate through a variety of media, and persuade and motivate staff to adopt the new software and associated processes. It is essential that you are experienced in working with LabVIEW. You will bring or develop capability for best-practice requirements engineering, software engineering, software architecture/design (using National Instruments LabVIEW), and software performance. You will also work with the Software Systems Research Centre at Bournemouth University to disseminate research challenges to the academic staff community. This may include authorship of conference and/or journal papers. Your role comes with a large development budget and includes the opportunity to undertake a project-focused, fully-funded Masters in Research (MRes) degree at Bournemouth University. This is an 18-month fixed-term appointment and could lead to the offer of permanent employment.
  12. Thanks for the answers. Had a suggestion from someone else which I'll try - he has a small Dell XPS 13, but sets the resolution down to 1600x900 and then runs at 100% scaling, instead of going for full resolution and scaling up... I don't think it sounds like the ideal solution, but I'll give it a try. Thanks Paul
  13. Hi All, Wasn't overly sure where to ask this... I'm having issues related to screen resolution, but mostly to do with block diagram behaviour rather than the front panel object sizing that comes up regularly! Anyway, I have some code which I wrote on my PC (Windows 7) at the office - screen resolution is 1680x1050. When I look at the BD on that PC, things line up nicely and look generally OK. However, when I open the same code on my MS Surface Pro 3 (Windows 8.1, screen res 2160x1440, with scaling set to 125% in the Control Panel display options), it seems that different things on the block diagram are scaled differently, so things don't line up any more. Particularly painful are things like unbundle by name - the font and the unbundle structure seem to have scaled slightly differently, so bundles/unbundles that were aligned with other BD objects are no longer aligned and things look really messy. Has anyone else come across these types of issue? I'm not sure if it's a Windows 7 vs 8 thing, or specifically down to the Windows scaling being set to cope with the small high-res display on the Surface... I'm looking for any suggestions on things I can try, or settings hidden away somewhere, that might help make moving between the two systems actually possible. (At the moment I'm just not coding on the Surface, because I can't cope with the unreadability and don't want to waste time aligning stuff all over the place!) I should have prepped an example with pics to better show what I mean - I'll try and do something with that tomorrow when I have access to both machines next to each other! Thanks in advance! Paul
  14. Does anyone know if there's a way to get the value of a position along a slide control based on mouse coordinates? For example, the waveform graph has an invoke node which will "Map Coords to XY"; I'm after similar functionality but for a slide control, so that I can prevent unwanted clicks in a certain region of the slider (i.e. I have a two-slider control and want to either filter out any clicks that fall outside of the region between the sliders, or switch the active slider to be the one closest to the click before the click is processed). I put a slightly more detailed (complicated?) description over on NI.com (http://bit.ly/1l1iJr6) but haven't had a response yet; just wondered if anyone on here might have some ideas? I have managed a bit of a workaround by combining the mouse coordinates with some bounding-box info on the scale and housing of the control, plus prior knowledge of the scale range (sketched below) - but since, at least under the hood, LabVIEW is doing what I want, is there a way I can access it?! Thanks Paul
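     For reference, the workaround amounts to a linear map from pixel coordinates to scale values - a rough sketch (all names invented; assumes a linear vertical slider with its maximum at the top of the scale):

        def coord_to_value(mouse_y_px, scale_top_px, scale_bottom_px, v_min, v_max):
            # Fraction of the way up the scale: 0 at the bottom, 1 at the top.
            frac = (scale_bottom_px - mouse_y_px) / (scale_bottom_px - scale_top_px)
            frac = min(max(frac, 0.0), 1.0)  # clamp clicks outside the scale
            return v_min + frac * (v_max - v_min)

        # A click at y=150 px, on a scale drawn from y=100 (top) to y=300 (bottom)
        # representing values 0..10, maps to 7.5:
        print(coord_to_value(150, 100, 300, 0.0, 10.0))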
  15. Thanks for the link - it feels a bit kludgy to me, so I think I will just aim to not allow the VIs to be stopped from within the panel! Is it considered good practice to remove a VI from a subpanel before putting something else in? It seems you don't need to, but it would be good to know the best way! Thanks Paul