PA-Paul

Members

  • Posts: 139
  • Days Won: 1

PA-Paul last won the day on September 29, 2023 and had the most liked content!

Profile Information

  • Gender: Not Telling
  • Location: Dorchester, Dorset, UK

LabVIEW Information

  • Version: LabVIEW 2016
  • Since: 2004



Reputation: 5

  1. Thanks Rolf, glad to hear it's on your radar already. It's not holding me up in the end, as I'm not actively developing anything using the toolkit, and fortunately it did work in the place I need it most (the build server!). But I'll keep my eyes open for the update so I can get it on the dev machines too! Thanks, Paul
  2. Hi, I'm trying to install the OpenG Zip package via VIPM, but when it gets to the LVZIP readme page, the "Yes, install this library" button remains greyed out no matter how many times I scroll to the top or bottom of the dialog. It worked fine on a different computer, but on mine I cannot get past this (and it's a tad frustrating!). I'm on LabVIEW 2021.0.1f2 and VIPM 2023.1. I've tried uninstalling the old version and making sure it's not installed in any other LabVIEW version, but I get the same result on every version of LabVIEW. Hoping someone can help! Thanks, Paul
  3. Thanks for the feedback, that's helpful. Related question (convolution!): where we have a Git repo for each "module", suppose we then need to consider a repo of "data type libraries" (there could be more than one, to allow for groupings of different data types). It all just feels unnecessarily complex somehow, but I guess that's the problem with dependency management: to keep the dependencies between modules "simple" and reduce coupling, it seems you need relatively complex strategies for managing the dependencies! Anyone have a feel for whether that's a general thing among other languages, or is it particularly challenging in LabVIEW? Thanks again!
  4. Hi All, I'm writing (what I hope is nice) modular code, trying to modularise functions and UIs etc. across the applications we're developing to keep things nice and flexible. But I'm struggling to keep individual modules uncoupled, or rather not all interdependent on each other.

Example: I have a module for communicating with an oscilloscope to capture waveforms. I create a typedef in that module for passing those waveforms and related information out to another module, which sends them on to a storage module and a waveform processing module (where I do various things to the raw waveform), and then finally out to a waveform display module. So my question becomes "who should own the waveform data typedef?" The easiest thing to do is make sure the input side of any module's API accepts the output of the API producing the data. So if the typedef sits in the acquisition module library at the "source" of the data, then, assuming I don't keep translating to different data types, all of my other modules end up dependent back on it too; e.g. my waveform processing and display modules now depend on the acquisition module. But I might want to write a post-processing and display application that needs to know how to process waveforms, and I don't want to have to include the waveform acquisition module with that, as it won't be acquiring anything.

I'm sure there are lots of options, and this post is really to try and find some I haven't thought of yet! I know I could create the typedef in its own library, or even outside of any library; then the users of that typedef are not dependent on each other (I think that's basically a form of dependency inversion?). But that gets hard to manage, and leads to the idea of a "common data types" library, which can quickly grow to hold lots of other things and, in a way, more types of coupling. I could translate the typedef as it passes through a chain of modules, so that each module defines the way it expects to receive the data, and a tree-type module hierarchy then limits the coupling; but that always feels somewhat inefficient, and I want one waveform data type, not four depending on where I am in the application.

My situation is marginally complicated as well since we're using PPLs, and some of my data types are now classes, so they need to be inside a PPL somewhere and then used from there (and it's a pain we can't build a packed class without first wrapping it in a project library...). So yes: how do you all manage your typedefs and classes that get used across module boundaries to minimise those dependency issues? Thanks in advance, and sorry for the slightly rambly post!
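A minimal text-language sketch of the "typedef in its own library" option mentioned above (Python standing in for LabVIEW, since G can't be quoted inline; the module split and all names are hypothetical). The point is the direction of the arrows: acquisition, processing and display all point at the small types module, and never at each other:

```python
# waveform_types.py -- tiny standalone "data types" library; depends on nothing else.
from dataclasses import dataclass, field

@dataclass
class Waveform:
    t0: float                                  # time of first sample (s)
    dt: float                                  # sample interval (s)
    samples: list = field(default_factory=list)

# acquisition.py -- depends only on waveform_types.
def acquire_waveform() -> Waveform:
    # ... talk to the scope here ...
    return Waveform(t0=0.0, dt=1e-9, samples=[0.0, 0.1, 0.2])

# processing.py -- also depends only on waveform_types, NOT on acquisition,
# so a post-processing/display app can use it without the scope driver.
def scale_waveform(w: Waveform, gain: float) -> Waveform:
    return Waveform(w.t0, w.dt, [gain * s for s in w.samples])
```

In PPL terms the analogue would be a small "types" .lvlib/.lvlibp that every other packed library links against, so the type is defined exactly once and the acquisition PPL never appears in a display-only build.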
  5. As a developer using open-source code, I want to be able to inspect and run the unit tests/verification steps etc. that were used to prove the code works as intended before release.
  6. So, does this sound like a sensible way to evaluate which option (expressing the X values for my interpolation in Hz or MHz) provides the "best" interpolation?

  • Generate two arrays each containing e.g. 3 cycles of a sine wave: one data set with N samples, the other with 200×N samples.
  • Generate two arrays of X data, N samples long: one running from 0 to N, the other from 0 to N×1E6.
  • Generate two arrays of Xi data, 200×N samples long: again, one running from 0 to N and the other from 0 to N×1E6.
  • Perform the interpolation once for each data set, using the N-sample sine wave data for the Y data in each case, with the corresponding X and Xi data sets.
  • Calculate the average difference between each interpolated data set and the 200×N-sample sine wave.

If one option is better than the other, it should have a lower average (absolute) difference, right? I'll code it up and post it shortly, but thought I'd see if anyone thinks that's a valid approach to evaluating this! Thanks, Paul
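For what it's worth, a quick sketch of that experiment in text form (Python/NumPy/SciPy standing in for the LabVIEW VIs; PchipInterpolator is used here as a stand-in for LabVIEW's cubic Hermite interpolation, and N = 1000 is an arbitrary choice):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

N, M, cycles = 1000, 200, 3
xi_unit = np.linspace(0.0, 1.0, M * N)          # dense grid, 0..1 (200xN points)
truth   = np.sin(2 * np.pi * cycles * xi_unit)  # the 200xN-sample reference sine

for scale in (1.0, 1e6):                        # "MHz-like" vs "Hz-like" X axis
    x  = np.linspace(0.0, 1.0, N) * scale       # N-sample X data
    y  = np.sin(2 * np.pi * cycles * np.linspace(0.0, 1.0, N))  # N-sample sine
    yi = PchipInterpolator(x, y)(xi_unit * scale)               # onto dense Xi
    print(f"scale {scale:g}: mean |error| = {np.abs(yi - truth).mean():.3e}")
```

Both runs ask for mathematically identical interpolations; any difference between the two printed errors comes purely from how the floating-point arithmetic rounds at the two scales.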
  7. Thought I posted this the other day, but apparently didn't. I agree it's likely a precision/floating-point thing. The issue I have is how it tracks through everything. The interpolation is used to make the frequency interval in our frequency response data (for our receiver) match the frequency interval in the spectrum (FFT) of the time-domain waveform acquired with the receiver. We do that so we can deconvolve the measured signal for the response of the device. We're writing a new, improved (and tested) version of our original algorithm and wanted to compare the outputs of each. In the new one we keep frequency in Hz; in the old, MHz. When you run the deconvolution from each version on the same waveform and frequency response, you get the data below: the top graph is the deconvolved frequency response from the new code, the middle from the old code, and the bottom is the difference between the two. It's the structure in the difference data that concerns me most: it's not huge, but it's not small, and it appears to grow with increasing frequency. It took me a while to track down the source, but it is the interpolation. If we convert our frequency (X data) to MHz in the new version of the code, the structure vanishes and the average difference between the two is orders of magnitude smaller. And it's there that I'd like to know which is the more correct spectrum: the old or the new? Any thoughts? Paul
  8. Hi All, This may be a maths question, or a computer science question... or both. We have a device frequency response data set, which is measured at discrete frequency intervals, typically 1 MHz. For one particular application, we need to interpolate that down to smaller discrete intervals, e.g. 50 kHz. We've found that cubic Hermite interpolation works pretty well for us in this application.

Whilst doing some testing of our application, I came across an issue which is arguably negligible, but I'd like to understand its origins and the ideal solution if possible. If I interpolate my data set with my X values in MHz, and create my Xi data set in MHz with a spacing of 0.05, I get a different result from the interpolation VI than I do if I scale my X data to Hz and create my Xi array with a spacing of 50,000. The difference is small (very small), but why is it there in the first place? I assume it comes from some kind of floating-point precision issue in the interpolation algorithm... but is there a way to identify which of the two options is "better" (i.e. should I keep my X data in Hz and just scale to MHz for display purposes when needed, or should I keep it in MHz)? Ideologically there should be no difference: in both cases I'm asking the interpolation algorithm to interpolate "by the same amount" (can't think of the right terminology there to say that we're going to 1/20th of the original increment in both cases).

Attached is a representative example of the issue (in LV 2016). Thanks in advance for any thoughts or comments on this! Paul Interpolation with Hz and MHz demo.vi
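Since the attached VI is LabVIEW-only, here is a rough text-language illustration of the same effect (Python/SciPy; pchip stands in for the cubic Hermite VI, and the 0-100 MHz response data is made up):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

rng   = np.random.default_rng(0)
f_mhz = np.arange(0.0, 101.0, 1.0)              # response measured at 1 MHz steps
resp  = rng.normal(size=f_mhz.size)             # placeholder frequency response

fi_mhz = np.linspace(0.0, 100.0, 100 * 20 + 1)  # 50 kHz steps

y_mhz = PchipInterpolator(f_mhz, resp)(fi_mhz)              # X in MHz
y_hz  = PchipInterpolator(f_mhz * 1e6, resp)(fi_mhz * 1e6)  # X in Hz

print("max |difference|:", np.abs(y_mhz - y_hz).max())      # tiny, but non-zero
```

The two calls are mathematically equivalent, but the abscissae round differently at the two scales (0.05 is not exactly representable in binary, and scaling by 1e6 changes where the rounding lands), so the spline's internal divided differences, and hence the outputs, differ by a few ULPs.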
  9. Unfortunately, even with debug enabled you can't probe the class wire (well, actually you can, but all you get is the class name, not the private data). At least that is the case on VIs within the PPL (so if you try to probe the class wire on the BD of a VI within the packed library, all you get is the name of the class). I'm pretty sure the same is true in the calling code once the class wire has "been through" a VI from the PPL. Paul
  10. Not sure where this question best fits; it could have gone in Application Design and Architecture (as I'm using the PPL-based plugin architecture), or possibly OOP, as I'm using OOP... Anyway, I have made an interface class for my plugin architecture, put it in a library (.lvlib) and then packed that library (to a PPL, .lvlibp) for use in my application. My question is relatively simple: why, when I look at the class within the packed library, is there no sign of the class.ctl (i.e. the class private data control)? Related: is the lack of the class private data control within the PPL the reason I can't probe my class wire in my application built against the PPL? (I can probe it and see the private data values if I code against the original unpacked class.) Any thoughts or insights would be gratefully received! Paul
  11. Thanks Tim. What I find strange is that on a fresh Windows 7 install, the installer built with LV tells me I need to install .NET 4.6.1 first, and then does it. Why can't it do something similar in Windows 10, alert me to the fact that .NET installs are needed, and do those for me? Ho hum, another reason (aside from DAQmx 16 not being compatible with LV2012) to think about migrating up to a newer version! Thanks again, Paul
  12. Hi All, I have an application built in LabVIEW 2012 which we have been distributing fine for some time with an installer also built in LabVIEW 2012. Recently, a colleague tried to install the application onto a Windows 10 machine, and although the installation process seemed to go smoothly, one of the two EXEs distributed by the installer gives the ever-so-helpful "This VI is not executable. Full development version is required. This VI is not executable. Full development version is required. ..." error, repeated over and over (you get the picture!).

As an experiment, I set up two clean virtual machines, one running Windows 7 SP1 and one running Windows 10, and ran the same installer on both. On Windows 7, I got a prompt saying that .NET Framework 4.6.1 needed to be installed first; that happened automatically, and after that the installer continued and everything worked fine (including the EXE in question). On Windows 10, I got no such warning: the installer ran through and the EXE failed with the error message described above. On the Win10 VM, I then went into "Turn Windows features on/off", and although there's a tick in ".NET Framework 4.6 advanced features", there's no specific sign of 4.6.1. As a test, I added a tick to ".NET Framework 3.5 (includes .NET 2.0 and 3.0)" and installed that. That made my EXE run fine...

So... is there any way to force the LV installer to prompt for/install the necessary .NET "stuff" when running on Windows 10? Anything else I can do to automate things so that our customers don't need to manually add .NET support for the program to run if they're using Windows 10? Thanks in advance! Paul
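One way to at least diagnose which machines need attention: the installed .NET Framework 4.x version is recorded in the registry under the documented "Release" value, where 394254 or higher indicates 4.6.1. A small hypothetical check (Python purely for illustration; an installer script or LabVIEW registry VIs could read the same value):

```python
# Check whether .NET Framework 4.6.1 or later is present (Windows only).
import winreg

KEY = r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
    release, _ = winreg.QueryValueEx(k, "Release")

# Microsoft documents 394254 as the minimum "Release" value for 4.6.1 on Win10.
print(".NET 4.6.1 or later installed:", release >= 394254)
```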
  13. Background: Precision Acoustics Ltd has been at the forefront of the design, development and manufacture of ultrasonic measurement equipment for over 25 years. The company specialises in the research and development of ultrasonic test equipment used extensively in the QA of medical devices, in providing industry with ultrasonic Non-Destructive Examination (NDE), and within academia and national measurement institutes throughout the world.

Brief: Together with Bournemouth University, we are looking for a Software Engineer to work on a joint project focused on the development of a new software platform (and associated outputs) to control acoustic and ultrasound measurement instrumentation. For the successful candidate, this is an exciting opportunity to develop the skills required to plan, manage and execute a significant development project, working alongside and learning from industry experts in LabVIEW development and Agile software development. There is also a significant personal development budget available, along with the opportunity to undertake a project-focused Masters in Research (MRes) at the university. Whilst the majority of KTP candidates remain with the industrial partner after the project, the skills acquired would be widely applicable and highly desirable in future employment opportunities.

The project will seek to develop a platform and agile development process that has the hallmarks of a well-engineered software system, including all documentation and verification outputs. The solution will ideally integrate existing software developed by the company internally, balancing the need for re-use against the need for a best-practice architecture. This must be undertaken in a way that supports the short-, medium- and long-term business operations of the company.

For more information and the application process, visit the Bournemouth University website: https://www1.bournemouth.ac.uk/software-engineer-ktp-associate-fixed-term
  14. As a company we've been using LabVIEW for a while to support various products, and now we're going to try to improve our work by implementing more of a collaborative approach to roll out some new versions. But we're not traditionally a "software company", so we're starting to look at how we can do this from a "process" point of view. I was just wondering if anyone here has any particular hints, tips or advice for where team members may be working on the same bits of code, how to avoid conflicts, etc. Any useful literature or resources that might give some insight into this area would also be gratefully received! For info, we're planning to use Mercurial for source code control, but we haven't previously gone as far as integrating SCC into LabVIEW (we've previously used SVN, and managed commits and checkouts through TortoiseSVN, but never used locking or similar, as in general people weren't working on the same bits of code). Not sure whether we will this time either, but we'll consider it if it's genuinely useful. Thanks in advance. Paul
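Since LabVIEW files are binary and merge poorly, much of the pain in a setup like this is avoided by keeping generated and per-machine files out of the repository entirely. A hypothetical starting point for a Mercurial ignore file (the folder name and file types are common LabVIEW conventions, not requirements):

```
# .hgignore -- a sketch for a LabVIEW project
syntax: glob

# Build outputs (EXEs, installers) are regenerated, never merged
builds/

# Per-machine / per-user LabVIEW files
*.aliases
*.lvlps
```

Beyond that, the usual advice for binary sources applies: keep VIs small and single-purpose so two people rarely need to touch the same file, and enable "Separate compiled code from source file" (available since LabVIEW 2010) so VIs don't show as modified merely because they were recompiled.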
  15. New job opening for a graduate computer scientist: a software architect role on a "Knowledge Transfer Partnership" between Precision Acoustics Ltd (www.acoustics.co.uk) and Bournemouth University in the UK. We're looking for someone with LabVIEW experience. Job advert and details here: http://www.bournemouth.ac.uk/jobs/vacancies/technical/advert/fst102.html Salary will be £26,000 per annum.

Precision Acoustics requires a dedicated IT Software Architect to join its team on a specialised project. Precision Acoustics manufactures ultrasonic measurement products for the medical and NDT industries. Based in the south of England in Dorchester, Precision Acoustics is owned by its Managing Director and two of its Research Scientists. The company was established in its present form in 1997 and is well established as a major supplier of equipment for the MHz ultrasound markets on a worldwide basis.

Your role will be to drive forward the deployment of the new software architecture, to enable the company to transition to a new and improved way of working. In order to do that, you will need to influence effectively, communicate through a variety of media, and persuade and motivate staff to adopt the new software and associated processes. It is essential that you are experienced in working with LabVIEW. You will bring or develop capability for best-practice requirements engineering, software engineering, software architecture/design (using National Instruments LabVIEW), and software performance. You will also work with the Software Systems Research Centre at Bournemouth University to disseminate research challenges to the academic staff community. This may include authorship of conference and/or journal papers. Your role comes with a large development budget and includes the opportunity to undertake a project-focused, fully-funded Masters in Research (MRes) degree at Bournemouth University. This is an 18-month fixed-term appointment and could lead to the offer of permanent employment.