ShaunR

Members
  • Posts

    4,881
  • Joined

  • Days Won

    296

Everything posted by ShaunR

  1. Definitely the former. If I pre-allocate the array and replace elements rather than auto-index, then they perform exactly the same. But what I'm confused by is why there should be a difference between x32 and x64. After all, it should be the same amount of memory being (re)allocated, and it is an internal LV implementation.
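The pre-allocate-and-replace versus grow-as-you-go trade-off isn't LabVIEW-specific; as a rough cross-language analogy (a sketch only — Python's list growth strategy differs from LabVIEW's auto-indexing tunnel, and the sizes here are hypothetical), the same two patterns look like this:

```python
import time

N = 200_000  # hypothetical size; the post tested up to 1,000,000 elements

# Pattern 1: grow the list element by element, forcing the runtime to
# reallocate the backing store as it grows (Python amortises this cost;
# LabVIEW's auto-indexing tunnel has its own strategy).
start = time.perf_counter()
grown = []
for i in range(N):
    grown.append(i)
grow_s = time.perf_counter() - start

# Pattern 2: pre-allocate once, then replace elements in place,
# so no reallocation ever happens inside the loop.
start = time.perf_counter()
prealloc = [0] * N
for i in range(N):
    prealloc[i] = i
replace_s = time.perf_counter() - start

print(f"grow: {grow_s*1000:.1f} ms, replace-in-place: {replace_s*1000:.1f} ms")
```

Both loops produce identical data; only the allocation behaviour differs, which is why the two LabVIEW variants converge once the array is pre-allocated.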
  2. I'm guessing you were a C++ programmer in an earlier life (I also suspect you haven't been using LV since 1998 as your profile suggests). LabVIEW passes data. Not pointers, objects or lemons (unless you specifically tell it to, and even then it's just smoke and mirrors). All functions in LV are designed to operate on data. It is a data-centric, data-flow paradigm. Moreover, it is a "strictly typed", data-centric, data-flow language. When you connect two VIs together, you are passing a value, not an object. When you use a reference, you are also passing a value; however, the property nodes know (by the type) that they need to "look up" and de-reference it in order to obtain the data. References are, if you like, a "special" case of data rather than the "norm", and require special functions (property and method nodes) to operate on. If you were to inspect the value of a reference, you would find a pointer-sized int. However, if you looked at that memory location you would not find your control or indicator, or even its data.
  3. Sure. Here are a couple I have used.
  4. OK. I'm fairly happy with the performance of the API (there are still a couple of minor tweaks to come, but nothing drastic), so I started to look at SQLite's performance. In particular, I was interested in how SQLite copes with various numbers of records and whether performance deteriorates as the record count increases. Wish I hadn't.

     Below is a graph of insert and select queries for 1 to 1,000,000 records. The test machine is a Core 2 Duo running Win 7 x64 using LabVIEW 2009 SP1 x32. Each data point is an average over 10 bulk inserts using the "Speed Example.vi". The database file was also deleted before each insert to ensure fragmentation and/or tree searching were not affecting the results. I think you can see that both insert and select times are fairly linear in relation to the number of records, and (IMHO) 5 seconds to read or write a million records (consisting of 2 columns) is pretty nippy.

     Now the same machine (exactly the same test harness) but using LV2009 SP1 x64. Hmmm. It's interesting to note that up until about 100,000 records, x64 performs similarly to x32. However, above 200,000+ records the memory usage reported by the Windows task manager shows x64 starting to climb further. Typically, by the end of the test x32 has consumed about 450 MB whilst x64 is about 850 MB when viewed in the Windows task manager. Checking SQLite's internal memory allocation using the "Memory.vi" yields identical usage between both tests; however, LV x64 seems to be using twice the Windows memory. I'm tempted to hypothesise that memory allocation in LV x64 is the cause. Can anyone else reproduce this result? A single check at (say) 500,000 records should be sufficient.
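For anyone without the LabVIEW test harness, the shape of the benchmark is easy to reproduce with SQLite's own engine. This is a minimal sketch, not the "Speed Example.vi" itself — an in-memory database and a scaled-down row count are assumptions made so it runs quickly — but it follows the same pattern: one transaction wrapped around a bulk insert of a 2-column table, then a full select back.

```python
import sqlite3
import time

N = 100_000  # scaled down from the post's 1,000,000 for a quick check

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE speed (idx INTEGER, val TEXT)")

# One transaction around the whole bulk insert; committing per-row
# would dominate the timing and hide SQLite's real insert rate.
start = time.perf_counter()
with conn:
    conn.executemany(
        "INSERT INTO speed VALUES (?, ?)",
        ((i, f"row {i}") for i in range(N)),
    )
insert_s = time.perf_counter() - start

# Read every row back, analogous to the bulk select in the graph.
start = time.perf_counter()
rows = conn.execute("SELECT * FROM speed").fetchall()
select_s = time.perf_counter() - start

print(f"insert: {insert_s:.3f}s, select: {select_s:.3f}s, rows: {len(rows)}")
```

Timing this at a couple of row counts is enough to see whether insert and select scale linearly on your machine, independent of any LabVIEW memory-manager effects.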
  5. Reproduced also in LV2009 x32 & x64 (PDS), as well as LV2009 x64 & x32 (PDS). I also noticed that if you run it before saving (i.e. with modified VIs in memory, because I switched from LV x32 to x64), you get the same results. If you then "Save All", it runs OK. However, after invoking the Icon Editor, "Save All" or a re-compile has no effect.
  6. I don't think you can write service applications in LV, but sc.exe serves the same purpose as srvany.exe. Go to a command prompt and type: sc.exe create myservice binPath= "pathtomyservice" DisplayName= "mydisplayname" (of course, change the myxxxxx placeholders to the appropriate names and paths). Note that sc.exe requires a space after each "option=" before the value.
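As a sketch of the full round trip (the service name and executable path here are placeholders, and these commands must run from an elevated Windows prompt):

```batch
:: Create the service (note the mandatory space after each option=)
sc.exe create MyLVService binPath= "C:\MyApp\MyApp.exe" DisplayName= "My LabVIEW Service"

:: Start it, check its state, and remove it when done
sc.exe start MyLVService
sc.exe query MyLVService
sc.exe delete MyLVService
```

One caveat worth knowing: unlike srvany.exe, sc.exe only registers the service — the executable itself must still respond to service control messages, or the Service Control Manager will report it as failing to start.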
  7. You need to pass the "reference" of the control to the sub-VI. By wiring the graph control, you are only passing the "data" that the graph contains.
  8. http://www.metacafe.com/watch/1154898/how_to_make_a_tinfoil_hat/ I wonder what colour the sky is on her planet
  9. That's fantastic. Just goes to show: "There are no bad programs, only bad programmers". I think I'll set that up as my wallpaper... move over, Grace Park.
  10. Take a look at the speed example.
  11. You do not need to explicitly open or create a file with any of the high-level API VIs (the exception being "Query by Ref"), as they will open or create the file if one doesn't exist. Just specify the file name. You cannot write directly to the file using the standard file write functions; a SQLite file has a complex structure.
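Python's built-in `sqlite3` module shows the same two behaviours outside LabVIEW (the temp-directory path below is an assumption for the demo): connecting creates the file on demand, and the file that results starts with SQLite's binary header rather than anything a plain file-write could produce.

```python
import os
import sqlite3
import tempfile

# A path to a database that does not exist yet (hypothetical location).
path = os.path.join(tempfile.mkdtemp(), "demo.db")
assert not os.path.exists(path)

# connect() creates the file if it is missing, just as the
# high-level API VIs do when given a file name.
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (x INTEGER)")
conn.commit()
conn.close()

# The file's first 16 bytes are the fixed SQLite header string,
# which is why writing to it with ordinary file functions corrupts it.
with open(path, "rb") as f:
    header = f.read(16)
print(header)  # b'SQLite format 3\x00'
</antml>```

Every valid SQLite database begins with that exact 16-byte header, so it also makes a cheap sanity check that a file really is a SQLite database before handing it to the API.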
  12. Be careful whose toes you step on today because they might be connected to the foot that kicks your ass tomorrow!

    1. Davidson D

       Nice thought to note in the subconscious mind.

  13. In the "Dialog & User Interface" palette there is a "Colorbox constant". You can wire that to it and choose a colour by clicking on it, which will show the colour chooser dialogue.
  14. There's probably a better mathematical solution, but this works as a practical approximation for this sort of thing (certainly for repeatability at least). As Yair said, take loads of readings. Then calculate the mean and variance of your data. Plug those values into the probability density function for a truncated normal distribution and solve for X (note that b in this case is infinity). It is flawed in that it assumes your data is normally distributed (which it isn't; it's half of a normal distribution), but it will give a much better approximation.
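The mean-and-variance recipe above can be sketched in a few lines. This is an illustration only — the readings are made-up numbers, and it uses a plain normal fit (the same simplification the post admits to) to solve for the value X below which 95% of readings are expected to fall:

```python
from statistics import NormalDist

# Hypothetical repeatability readings (e.g. positional deviations in mm).
readings = [0.12, 0.08, 0.15, 0.11, 0.09, 0.14, 0.10, 0.13, 0.07, 0.16]

# Fit a normal distribution: this computes the sample mean and
# (sample) standard deviation in one step.
dist = NormalDist.from_samples(readings)

# Solve for X at the 95th percentile via the inverse CDF,
# i.e. X = mean + z(0.95) * stdev.
bound_95 = dist.inv_cdf(0.95)
print(f"mean={dist.mean:.4f}, stdev={dist.stdev:.4f}, 95% bound={bound_95:.4f}")
```

As the post notes, a half-normal model would fit one-sided deviation data better; this normal-fit version just overstates the spread slightly, so the bound it gives errs on the conservative side.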
  15. You could have just used the "Query.vi" and put in "SELECT DISTINCT Col1,Col2,Col3 FROM TableName;" then you wouldn't need to filter the results.
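To see why DISTINCT removes the need for post-query filtering, here is the same query run through Python's `sqlite3` (the table name and columns match the post's example; the sample rows are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableName (Col1 TEXT, Col2 TEXT, Col3 TEXT)")

# Insert a duplicate row on purpose.
rows = [("a", "b", "c"), ("a", "b", "c"), ("x", "y", "z")]
conn.executemany("INSERT INTO TableName VALUES (?, ?, ?)", rows)

# DISTINCT de-duplicates inside the database engine, so the result
# set arrives already filtered.
distinct = conn.execute(
    "SELECT DISTINCT Col1, Col2, Col3 FROM TableName;"
).fetchall()
print(distinct)
```

Pushing the de-duplication into the engine also scales better than filtering in the diagram, since duplicate rows never cross the database boundary at all.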
  16. Yup, I had a similar experience. At my old firm, IT insisted that they had to install everything. I gave them a list of LabVIEW and all the packages I needed and went and played with some instruments. After an hour they called me back to say it was complete. Great. Where are the toolkits/packages? "Umm, how do you install them?" they said. So I showed them. After a few hours of sitting watching the install and drinking copious amounts of coffee, the IT guy said "OK, I'm off home". I said "When you come in tomorrow you need to phone NI and get all the activation codes (the license was in IT's name). See you tomorrow so we can activate them all. Oh, and by the way, we need to do this all again for 8.2, 7.1 and 6.0. Can we do that the day after?" Next morning I had local admin rights, and they transferred the license into my name.
  17. SQLite API Version 1.2 released.
  18. Indeed. You will still have plenty of time to learn Russian so you can read it.
  19. I hope your machine is networked to the internet. Mine isn't, and after an install I have to hand-type in 23 activation codes. It's all got a bit ridiculous.
  20. SQLite shows a download size of 4.8 MB, but the latest upload is only 1.8 MB. It looks like it's the sum of all the versions. Is that right? The download page is geared towards showing information about a particular version (file name, version number, page title etc.). Shouldn't it only show the size and download speed of the latest version?
  21. If you are using version 1.1, then you can use the "Transaction Query.vi" as JCarmody referenced. If you are using version 1.2 (just uploaded, so unlikely), then you can use the "Insert Table.vi" or "Transaction Query.vi". Well, I cannot replicate your test because you haven't released the VIs. But I ran the speed test inserting 1,000,000 rows with version 1.2 and reset the PC. After 12 resets the DB was fine, although LabVIEW forgot all its palette settings on the 3rd reset. At that point I got bored, since it takes about 1 minute for my PC to boot.