Posts posted by hooovahh
-
Man, I remember how excited I was when I got a Nintendo GameCube for Christmas. Many hours were spent playing All-Star Baseball 2002 and Star Wars Rogue Squadron II.
If someone around the office said something like that I would be concerned. It was bad enough when I was working with people born after 1990. Something about working with someone from another decade messes with my head. In 5 short years it is possible I could work with someone born after 2000.
-
It's cheap, but a little larger than ideal.
Wow, a little large? I was thinking this could be a benchtop DMM, and the case I had with an LCD was even a little large for that at 9.5'' x 8'' x 3''. Your box is 17 x 12 x 10 (I assume inches). Still, repurposing hardware in this way to make a dedicated machine is a neat idea. Oh, and it might look cool if you add front banana jacks for connecting to the DMM.
-
This will work just fine as a standalone DMM, if you don't mind the extra cost of a motherboard, processor, hard drive, RAM, power supply, case, and software.
Depending on how slick you want it to look, I would recommend a PCI riser of some kind: something that gives a flexible ribbon cable, or possibly just a 90-degree bend, so the DMM sits parallel to the motherboard to save space. Then you'll probably need a custom case to house it.
I've never used any of these cases, but here are a few that might work:
http://www.circotech.com/istar-s-21-mini-itx-box-computer-case.html?gclid=CPq7jqur17gCFUXNOgodTDYAPw
http://www.ebay.com/itm/Morex-2766-Mini-ITX-case-w-Dual-PCI-Bays-60W-PS-/130747941410#vi-content
If it were PCI Express you could fit it in a very tiny case, with a thin motherboard and external power, using something like this.
http://www.mini-box.com/I-O-shield-and-riser-card-for-DN2800MT
EDIT: I changed my mind; this would look awesome with the LCD readout if you can control it.
-
I do see this issue in Windows 7 x64, but this reminded me of LVMark.
https://code.google.com/p/lvmark/
http://lavag.org/topic/14700-lvmark-format-markup-for-string-controls/
And oddly enough I couldn't make its indicators break like yours.
-
Just wanted to post to let you know you are not alone. I have seen this and was confused, so I ended up taking a finite amount of data (100 points or so) and averaging it.
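The workaround above is simple enough that a small sketch can show it. This is a hedged illustration in Python, not LabVIEW; the acquisition itself is left out, and only the "grab a finite chunk, then average" step is shown.

```python
# Sketch of the workaround described above: instead of trusting one noisy
# reading, take a finite chunk of samples and report their average.

def chunk_average(samples):
    """Return the mean of a finite chunk of samples."""
    if not samples:
        raise ValueError("need at least one sample")
    return sum(samples) / len(samples)

# Example: 100 noisy points alternating around 5.0
readings = [5.0 + (-1) ** i * 0.1 for i in range(100)]
print(chunk_average(readings))  # the noise averages out to ~5.0
```

In LabVIEW terms this is just a finite acquisition wired into a Mean function, repeated as often as you need a fresh value.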
-
There is a "Minimum Panel Size" on the VI that is in the subpanel, but there is also a "Minimum Pane Size" on the pane that the subpanel is in. This way you can prevent the window from getting too small by keeping the pane from getting too small.
You will want to do some testing, however. I remember a time when I set the Minimum Pane Size so my right pane wouldn't get too small; as I made the window smaller I hit the limit of the right pane but not the left, so the left pane started to get smaller, and that was the pane I didn't want to change size. Just experiment a while and all these Minimum controls will keep it from getting too small.
-
If you are missing the lvapp.rsc from LabVIEW you likely need to perform a repair on the install of the development environment, or run-time engine (or both).
-
Is there a region issue? Was the EXE built on an OS that was not in English, then run on one that is? If it is built and run in the same environment, are there any odd characters in the menus? By that I mean things you don't find on a standard QWERTY keyboard? Just guessing.
-
Neato. What's a little interesting is that "Place VI Contents" is not an option in the control palette editor, but I guess it just knows to merge it.
-
Well, you can start by not quadruple posting the same question. Seriously, this is bad internet etiquette on any forum.
That being said, I do have a solution for your question which uses the .NET picture box. The reason I used that instead of the native 2D picture control is that it has the Stretch or Zoom feature, which resizes images very nicely. LabVIEW's picture control does have the Zoom Factor, but it makes the image look like crap.
I made the picture box control fit to the panel, and then set the panel to resize objects as the window resizes. This gives a more or less fluid resizing of the image.
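For anyone curious what the .NET Zoom mode is doing that LabVIEW's Zoom Factor isn't: it scales the image to fit the control while preserving aspect ratio. The math can be sketched like this (a hedged illustration of the idea, not the actual .NET implementation):

```python
def zoom_to_fit(img_w, img_h, box_w, box_h):
    """Return the scaled (w, h) that fits inside the box, keeping aspect ratio."""
    # Use the smaller of the two scale factors so neither dimension overflows
    scale = min(box_w / img_w, box_h / img_h)
    return round(img_w * scale), round(img_h * scale)

# An 800x600 image shown in a 400x400 picture box scales to 400x300
print(zoom_to_fit(800, 600, 400, 400))  # (400, 300)
```

The picture box recomputes this every time the control resizes, which is why the image stays smooth as the window is dragged.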
-
Well, that is interesting. I've never had a need to make a decoration that can be reused in other places and be available on the palette, but I'm surprised there isn't some way of doing this.
I guess the easiest way is to make a control that is transparent except for the decoration, then save it. So make a boolean transparent, then do something like replace the decal image (or the false or true image) with your decoration. The obvious side effect is that you will have a whole bunch of booleans on your block diagram that don't go anywhere because they are just for looks. You could do similar stuff with the 2D picture control, but I'm guessing it won't scale the way you like when stretched.
-
I think the price of the zxing lib is much better...
You wouldn't say that if you worked at Onbarcode.com
-
I ran across a bug in LabVIEW 2012 the other day. This does not exist in LV 2011 or prior. The CAR is 416470 if you want to track it.
...
The workaround is to either use a constant for the default value, or put an Always Copy primitive between the Get Waveform Attribute and the indicator.
Why does that not surprise me. If I could make a polymorphic VI for all types, I would make a subVI that just calls the Always Copy, but have the icon of the subVI be a band-aid.
-
But it gives an error, how to convert it from cluster to ROI.
An ROI is a cluster. Post your code and the error.
-
Yes it does. All you are looking at is a constant which defines your region. For a test you can right click it and choose Change to Control. Now on your front panel you will get the values that your ROI can be. This can be changed as the program is running.
If you want to base it off of a calculation, you can use a Bundle by Name (insert it on the wire in your picture) and replace the values with other programmatically found values.
-
And you are charging money for your application that cracks LabVIEW passwords? Funny...
/Steen
That's what I always think when I see these services pop up. That's like going to the PirateBay and offering discounted iTunes gift cards.
-
I have no LabVIEW + Linux experience, so keep that in mind. But your question seems too open ended. What if I asked:
What are the minimum system requirements for a program to run in Windows?
The problem with this question is that it depends on what the program does. If I write a hello world program, that should run (if it has compatibility) on any Windows version. If I wrote some new high-end first person shooter, then that program may have much higher system requirements.
Similarly, you can ask what the system requirements are to run a hello world program, and I would say any Pentium 1 or newer processor with 32MB of RAM (maybe less). But if you are writing a program with more processing needs, the requirements will be higher. No one can say what the requirements are until the program is complete.
-
(*an Abe is a $5 bill)
"Give me five bees for a quarter." (sorry it just reminded me of this)
-
I'm not familiar with a Hough Transform but NI's forums seem to have some information on it. Looks like it uses IMAQ.
http://forums.ni.com/t5/LabVIEW/Imaq-Hough-trasform/td-p/293094
EDIT: It looks like someone has developed them on this site http://live.ece.utexas.edu/class/siva/siva_dip/siva_dip.htm (number 9) but you need to request the password to the zip.
-
Yeah, don't get me wrong: in almost all of my LabVIEW experience, an installer is made up of one EXE and its dependencies, from an application built in the same project as the installer. Such a thing would be nice, but like I said, I think you will only get this information from a custom post-install VI call.
-
I have never used the 6008, and I can't simulate it for some reason (it's not in the list in MAX), but the spec says it supports a 10kS/s sampling rate on a single channel. I assumed that meant hardware timing, since it would be cruel to claim that but require a tightly timed loop in Windows. I just went for the cheapest AI USB DAQ, not knowing that NI sold any that don't support hardware timing. Please correct me if I'm wrong.
-
Perform a DAQ read using any normal NI hardware in continuous mode. Then you can have the hardware read a bunch of values (say 10,000 per second), read all of them once a second, perform a Maximum function on them in LabVIEW, then read another 10,000 samples and take the maximum of those plus the previous maximum. The cheapest solution I'd recommend (not knowing the full application) is the USB-6008. Then use any continuous analog input sampling example.
The trick here is that it is a software solution, but with hardware timing, so you won't miss any values; it's continuously reading, and you just have to ask for a chunk of values at a time.
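The chunk-then-fold pattern above can be sketched in a few lines. This is a hedged illustration in Python, with the DAQ read replaced by an in-memory list of chunks; real code would pull each chunk from a hardware-timed continuous acquisition (e.g. a DAQmx Read of ~10,000 samples).

```python
# Sketch of the chunked running-maximum technique described above.
# Each "chunk" stands in for one second's worth of hardware-timed samples.

def running_max_over_chunks(chunks):
    """Fold a stream of sample chunks into one overall maximum."""
    overall = float("-inf")
    for chunk in chunks:  # in real code: one DAQ read per iteration
        overall = max(overall, max(chunk))
    return overall

chunks = [[0.1, 0.5, 0.3], [0.9, 0.2], [0.4]]
print(running_max_over_chunks(chunks))  # 0.9
```

Because the hardware buffers every sample between reads, no peak is missed even though the comparison itself runs in software.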
-
I don't think such a thing will be an easy thing to get. Keep in mind an installer may have multiple EXEs inside it, so there won't be a tag that is the version of software for an EXE. But you can probably perform the steps you want using a Post Install VI call. This will run a VI after the installer is complete. Using this you can get the version of the EXE you want (because you know where it was installed to) then you can write the registry however you like.
-
So generally you can get VIs out of an EXE, but usually you don't want to. VIs in an EXE by default have no block diagrams, and no front panels unless they are used in the application. So if you were able to get the VIs out you would be able to call them and get outputs from them, but you couldn't see the source. Another big issue is that the VIs are compiled only for the version of LabVIEW the EXE was built with. So if you made the EXE in 8.2 you can only call them from the 8.2 run-time engine. I have some tools I've found on NI's forums to help get VIs out of EXEs, but every time I try, I realize the outcome isn't very helpful.
LabVIEW 2013 Favorite features and improvements
in LabVIEW General
Posted
This almost worries me. You see, for years the Tree control, multicolumn listbox, and table have been dog slow at doing individual cell updates when there are many of them. To get around this I have several coding techniques that improve performance: only updating the visible cells, only showing a subset of the data and loading the rest as you scroll, and deferring front panel updates.
So my worry is that the performance of my older applications will actually be slower than the native implementation, and now I'm going to end up removing all that hard work I did to get around LabVIEW's issues.
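The "only update the visible cells" workaround is essentially UI virtualization: keep the full dataset in memory and push only the rows the user can currently see into the slow control. A hedged sketch of the idea in Python (the function and names are illustrative, not any real LabVIEW or NI API):

```python
# Sketch of table virtualization: given the scroll position, compute the
# small slice of a large dataset that actually needs to go into the UI.

def visible_rows(data, top_index, rows_shown):
    """Return just the slice of data the control needs to display."""
    # Clamp so scrolling past either end still yields a valid window
    top_index = max(0, min(top_index, max(0, len(data) - rows_shown)))
    return data[top_index:top_index + rows_shown]

table = [f"row {i}" for i in range(100_000)]
print(visible_rows(table, 500, 3))  # ['row 500', 'row 501', 'row 502']
```

On a scroll event you recompute the slice and write only those few cells, so update cost depends on the window size, not the dataset size.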