
Omar Mussa

Everything posted by Omar Mussa

  1. The reason why nobody has the ultimate 'SQL' library is that SQL is not completely standardized across different databases. The syntax for SQL on an Oracle database is different from the syntax on a MySQL or SQL Server database. Each vendor has a slightly different version based on its own optimizations, legacy code, etc. This makes it really difficult to create more than an interface to the database, which is what the Database Connectivity Toolkit is. Once you've connected to the database (which thankfully has been standardized across vendors through interfaces such as ADO), you are on your own to create SQL commands that get you useful info. The other thing that makes it difficult to create a 'universal' database library is that each database implementation is unique. Different constraints, different table structures, custom stored procedures, etc. make it pretty difficult to build a really flexible toolkit. Basically, this is where design patterns come into play. If you can get YOUR organization to adopt design patterns, you can start building reusable SQL queries that will be flexible for YOUR solutions. It's no easy task and often takes many iterations, but it can be useful in the long run. Good luck!
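To make the dialect problem concrete, here is a minimal sketch (in Python, not LabVIEW; the function name and table are illustrative) showing how even something as basic as "give me the first N rows" has to be phrased differently per vendor:

```python
# Illustrative sketch: the same "first N rows" query must be phrased
# differently for each database vendor, which is why a truly universal
# SQL library is so hard to build.

def top_n_query(dialect, table, n):
    """Return a SELECT statement limited to n rows for a given SQL dialect."""
    if dialect == "mysql":
        return f"SELECT * FROM {table} LIMIT {n}"            # MySQL (and PostgreSQL) style
    elif dialect == "sqlserver":
        return f"SELECT TOP {n} * FROM {table}"              # SQL Server style
    elif dialect == "oracle":
        return f"SELECT * FROM {table} WHERE ROWNUM <= {n}"  # classic Oracle style
    raise ValueError(f"unknown dialect: {dialect}")

print(top_n_query("mysql", "measurements", 10))
```

Three vendors, three incompatible spellings of the same idea; the connection layer can be standardized, but the queries themselves can't.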
  2. As you are running from the development environment, you may want to ask yourself: is the end user also going to run from the development environment, or are they going to be running from an EXE? If the answer is EXE, you shouldn't really have to care, because the dialogs will not appear. If you are going to be running from the development environment, an easy way to prevent the dialogs is to make all of your VIs read-only AND check the LabVIEW options "Treat Read Only VIs as Locked" and "Do not save automatic changes".
  3. I tried to get this to work and I also failed. Luckily for me, my app was able to down-convert to 7.1 where the trick still works.
  4. One of the easiest ways to do this would be to establish a TCP connection between the C# app and the LabVIEW app. Then, the LabVIEW app could listen for requests from the C# app for commands like 'GET: POWER METER VALUE' (typically, you'd wait for requests in an idle frame or in a loop that is reading the TCP buffer). The good thing about this method is that it's easy to scale up with more commands if the user decides they need other data from the LabVIEW app in the C# app. The downside is that there can be a lag between when the C# app requests the value and when the meter actually produced it, so if your power meter value is changing really fast and your system has timing constraints, this method may not work out too well.
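The request/reply pattern above can be sketched in a few lines. This is Python rather than LabVIEW/C#, and the command string, reply format, and meter value are illustrative stand-ins, but the shape (listener dispatches named commands, client sends one and reads the reply) is the same:

```python
import socket
import threading

# Stand-in for a live instrument reading; the command name comes from the post.
COMMANDS = {"GET: POWER METER VALUE": lambda: "12.7"}

def serve_one_request(srv):
    """'LabVIEW side': accept one connection, dispatch the command, reply."""
    conn, _ = srv.accept()
    with conn:
        request = conn.recv(1024).decode().strip()
        handler = COMMANDS.get(request)
        reply = handler() if handler else "ERROR: UNKNOWN COMMAND"
        conn.sendall(reply.encode())

def send_command(port, command):
    """'C# side' (sketched in Python): send a command, read the reply."""
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(command.encode())
        return cli.recv(1024).decode()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
listener = threading.Thread(target=serve_one_request, args=(srv,))
listener.start()
value = send_command(port, "GET: POWER METER VALUE")
listener.join()
srv.close()
print(value)  # → 12.7
```

In a real system the listener would sit in a loop servicing many requests, and the lag mentioned above shows up between the instrument read and the moment the client's request arrives.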
  5. Basically, from your example code and questions, you need to learn more about data flow programming. I'd recommend reading LabVIEW for Everyone (in the interest of full disclosure, one of the authors is my boss ). It is a good book and will get you beyond your initial LabVIEW hang-ups. First, you have to understand one of the major points of data flow programming... data goes in, data goes out. Let me explain the problems you're having in (sub_save_02.vi)... You aren't able to read the data in your subVI because you are writing the value to a control (Initialized Array), but LabVIEW does not, by default, read values back out of a control. Instead, you should create an indicator and write data to the indicator (call it Data Out or something so you can differentiate it from the data you write (Data In)) and wire it to the connector pane of the subVI. Also, you probably want the File Path to be an input to your subVI so that you don't have to overwrite the same file every time... right now it's a constant. I would also change the inputs to the subVIs from booleans to a single enum value (numeric palette) and have a case for 'Read' and 'Write'. The do-nothing case should be managed from your main VI (save_02A). Second, your main VI (save_02A.vi) is a bit of a wreck as well (no offense). Rather than writing a long spiel about how to clean it up, or fixing it on your behalf, I would refer you to the LabVIEW Examples section - take a look at the Read/Write Binary Data examples. Also, look at state machine examples. If you don't find them in 7.1, you can definitely find them on the LAVA and NI websites. It will help you to understand why, for example, your loop doesn't run more than one iteration. Lastly, as a data flow language, LabVIEW doesn't support 'pointers' directly. If you want to make LabVIEW simulate using pointers, you need to do other things, and I wouldn't recommend any of them based on your example code.
Instead, just relax and remember that data is passed along the wires. It goes into the subVI, is modified (or not), and comes out of the subVI and back into your 'main' application. LabVIEW uses explicit declaration of inputs and outputs based on the wiring of your front panel controls (inputs) and indicators (outputs) to the connector pane. Unlike other programming languages, where a function argument can be both an input and an output, in LabVIEW this is not the case (there are ways to make it behave like this, but they are not something you should do normally). Good luck!
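The "data goes in, data goes out" rule can be illustrated with a text-language analogy. This is Python, not LabVIEW, and the names (sub_save, Data In/Data Out, the Read/Write enum) just mirror the advice above; a subVI behaves like a function that returns a new value rather than mutating its argument:

```python
# Analogy for a subVI: inputs arrive on wires, a (possibly modified) copy
# leaves on output wires. Nothing is ever read back out of an input.

def sub_save(data_in, mode):
    """Stand-in for a subVI with a 'Read'/'Write' enum input and a Data Out indicator."""
    if mode == "Write":
        data_out = data_in + [0.0]   # build a new list; the caller's wire is untouched
    else:                            # "Read"
        data_out = list(data_in)     # pass a copy of the data straight through
    return data_out                  # wired to the connector pane as Data Out

wire = [1.0, 2.0]
result = sub_save(wire, "Write")
print(wire)    # original data on the input wire is unchanged
print(result)  # the modified copy came out of the output
```

The point of the analogy: there is no way for `sub_save` to reach back and change `wire`, just as a subVI can't push data back out through an input terminal.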
  6. I don't know if clearing the error is the use case described. What you really want to do is log errors to file... so if you clear the error, you've lost the info you need to log. Instead of using the code as it is presented here (an error filter), I would simply add some logic to write the error code, source, and time to a log file (i.e., build an error log). You may or may not want to log all errors, so you may or may not need to input the error code. If you want to log all errors, you can place your error log function in the Error case of a case structure. After writing the error to file, you can clear the error by using the Clear Errors.vi on the Time & Dialog palette, and your loop can keep on truckin'.
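The log-then-clear pattern looks roughly like this (a Python sketch, since LabVIEW is graphical; the error-cluster tuple and field layout are illustrative stand-ins for LabVIEW's status/code/source cluster):

```python
import os
import tempfile
import time

def log_and_clear(error, path):
    """Append (time, code, source) to a log file, then return a cleared error.

    Mirrors the post's pattern: the logging happens only in the Error case,
    and the return value plays the role of Clear Errors.vi so the loop keeps going.
    """
    status, code, source = error
    if status:  # the "Error" case of the case structure
        with open(path, "a") as f:
            f.write(f"{time.ctime()}\t{code}\t{source}\n")
        return (False, 0, "")  # cleared error, like Clear Errors.vi
    return error               # "No Error" case: pass the cluster through

# Hypothetical error cluster: (status, code, source)
path = os.path.join(tempfile.gettempdir(), "error.log")
cleared = log_and_clear((True, 5005, "TCP Read in main.vi"), path)
print(cleared)  # → (False, 0, '')
```

The key ordering is the same as in the post: write the code, source, and time to the file first, and only then clear the error.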
  7. A 1-bit number simply means its value can be 0 or 1. And luckily enough, a rose is a rose... If you use any of the integer data types and send a 0 or 1, your code should work. Reads and writes to that value shouldn't require any 'bit banging' because the value is just 0 or 1 (there is nothing to manipulate).
  8. There are off-the-shelf instruments that do BER testing. This is helpful in that their methodologies are published and spec'd, and you can interface with them using TCP, GPIB, etc. Just Google BER test instruments and you should get more info that can get you started.
  9. Here's one that I wouldn't even consider a trick until somebody I was recently debugging some code with was wow'd by it... While a VI is executing: if you right-click on the array index display of a control, indicator, or probe, you can select 'Show Last Element' to jump to the last element of the array. In development mode: if you want to look at the last element of an array control or indicator, right-click on the index display and select Advanced-->Show Last Element (there is no 'Advanced' menu for probes, so for probes it always works by just right-clicking).
  10. Here's a way to get an annoying LV 8.20 bug to show up... Create a chart with multiple plots and turn autoscaling on, then set Plot Visible to False on the first plot. The autoscaling does not kick in for the remaining plots, so you end up with what looks like a blank graph (in my example... behavior could vary if you have a different scenario than mine). I've attached a VI that demonstrates the bug. Run the VI, set Plot0 Visible = Off, and watch the scale on the left (if you manually change the scale, you'll see the data is still being plotted for Plot1 for a brief moment (set the Max Value = 15), but the data will not stay visible because the autoscale bug will ruin your day). If you run the VI in LV 8.0, you'll get the expected behavior. I've attached the VI in 8.0 format so that you don't have to Save For Previous. NI's suggested workaround: "I believe that the easiest work around is to disable autoscaling all together. If this isn't an option, then you'll have to write an autoscaling routine that detects the range of data in the history and then sets the yscale.range property node accordingly." CAR# 42UF33LG Download File:post-5746-1164072769.vi
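NI's suggested manual-autoscale workaround amounts to scanning the visible plots' history for the min/max and writing that to the YScale.Range property. A sketch of the range computation (Python, not LabVIEW; the function name, padding factor, and fallback range are my own choices):

```python
def autoscale_range(plot_histories, visible, pad=0.05):
    """Compute a (min, max) pair over the visible plots' history data,
    with a little padding, suitable for writing to YScale.Range.

    plot_histories: one list of samples per plot
    visible: one bool per plot (the Plot Visible property)
    """
    values = [v for hist, vis in zip(plot_histories, visible) if vis
              for v in hist]
    if not values:
        return (0.0, 1.0)                # nothing visible: fall back to a default range
    lo, hi = min(values), max(values)
    margin = (hi - lo) * pad or 1.0      # avoid a zero-height range when data is flat
    return (lo - margin, hi + margin)

# Plot0 hidden, Plot1 visible: only Plot1's data drives the scale.
print(autoscale_range([[0, 5], [2, 10]], [False, True]))
```

In LabVIEW you would run this computation each time a plot's visibility or history changes and write the result to the chart's YScale.Range property node.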
  11. Hi Everyone, I've been using LabVIEW since 1998 and really am having more fun with it now than at any other time. I consider myself a power LabVIEW programmer, and am working at JKI (www.jameskring.com, www.jkisoft.com) with some of the best LabVIEW programmers you could hope to work with! I've been meaning to get involved with LAVA for several years and ended up just not going for it. I actually met a lot of LAVA-ers (-ians, -ns, -ites,... members) at NI Week this year, managed to get all pumped up, and then got sidetracked from posting. So, as my first New Year's resolution (ok, I'm a bit early), I hope to add to this great community of LAVA-iathans! Cheers!