Posts posted by COsiecki

  1. Going off Hooovahh's code, since you say the inputs are sorted, you could use the found index as the initial search location for the next iteration. You lose the parallel-for-loop benefits, but the decreasing search pool for each subsequently found element may make up for that. You'd need to catch the case where the B element isn't found so you don't reset your starting point.
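
    A rough sketch of the idea in Python, in case it helps (the array names and the linear-search helper are just illustrative, not anyone's actual code):

        def find_all(a_sorted, b_sorted):
            """For each element of b_sorted, find its index in a_sorted.

            Both inputs are assumed sorted ascending. Each search starts where
            the previous match was found, so the search pool shrinks with every
            match. Missing elements return -1.
            """
            results = []
            start = 0
            for b in b_sorted:
                found = -1
                for i in range(start, len(a_sorted)):
                    if a_sorted[i] == b:
                        found = i
                        break
                    if a_sorted[i] > b:   # inputs are sorted, so we can stop early
                        break
                results.append(found)
                if found != -1:           # only advance when a match was found,
                    start = found         # otherwise keep the old starting point
            return results

        print(find_all([1, 3, 5, 7, 9], [3, 4, 7]))   # [1, -1, 3]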

  2. First, you have to be running the Pi edition of Minecraft on a Raspberry Pi on the same subnet as your LabVIEW code. This doesn't work with the full version of Minecraft.

    You can download Minecraft for the Raspberry Pi at http://pi.minecraft.net.

    Once it's installed, start a game and load a world. This will start a server in the background.

    In the Minecraft-Pi library, you create a Minecraft-Pi object using Init.vi in the Connection Methods folder. You call it with the IP address of the Raspberry Pi running Minecraft.

    Once you have created the object, use Connect.vi to connect to the server. From there, if it didn't error, you are connected until you call Close.vi. You can use any of the other methods to set or read the available values. I included the "mcpi-protocol-spec" text file to explain how the functions work.

    Beyond that, make sure you can ping your Raspberry Pi and that your networking isn't blocking the ports. The Minecraft-Pi server uses port 4711.
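
    For what it's worth, the underlying protocol is just newline-terminated ASCII commands over TCP on port 4711. A rough Python sketch of what the VIs are doing under the hood (the IP address is made up, and the full command list is in the mcpi-protocol-spec file, so treat the strings here as illustrative):

        import socket

        PI_ADDRESS = "192.168.1.50"   # hypothetical IP of the Pi running Minecraft
        PI_PORT = 4711                # port the Minecraft-Pi server listens on

        # Connect to the server started in the background by the running game.
        conn = socket.create_connection((PI_ADDRESS, PI_PORT), timeout=5.0)

        # Commands are plain text terminated by a newline, e.g. posting to chat.
        conn.sendall(b"chat.post(Hello from LabVIEW)\n")

        # Queries return a newline-terminated reply, e.g. the player position.
        conn.sendall(b"player.getPos()\n")
        reply = conn.makefile().readline()
        print("player position:", reply.strip())

        conn.close()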

  3. I have written a LabVIEW version of the API for Minecraft on the Raspberry Pi. I used the API spec sheet and the sample Python code provided by Mojang to make my version. I have uploaded it to the NI example code section. There are enough people here who like to use LabVIEW in unique ways, so I thought someone might like it. Any feedback is also welcome.

     

    Example code link: https://decibel.ni.com/content/docs/DOC-26662

  4. Our method is similar to the ones described here. When we start recording, we pass references to all the channels we want to record to a process that extracts header info from the channels. It then creates a TDMS file and a common receive queue. Each channel then gets its own "data grabber" spawned. The data grabbers dump data into the common queue when their channel updates (this action is event based). The original recording process pulls the data chunks from the common queue and writes them to the appropriate channel in the TDMS file. When the recording stops, an event is fired that kills all the data grabbers. We use a parent data class with type-specific children as our queue elements so we can easily combine different data types in our file. We can also have multiple files grabbing the same data if we need to.
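
    Not our actual code, but a rough Python sketch of the shape of that architecture (the chunk types, the channel-read placeholder, and the TDMS-write placeholder are all hypothetical):

        import queue
        import threading

        class DataChunk:                      # parent queue element
            def __init__(self, channel):
                self.channel = channel

        class AnalogChunk(DataChunk):         # type-specific child
            def __init__(self, channel, samples):
                super().__init__(channel)
                self.samples = samples

        class DigitalChunk(DataChunk):        # another type-specific child
            def __init__(self, channel, states):
                super().__init__(channel)
                self.states = states

        def data_grabber(channel, read_update, rx_queue, stop):
            """One grabber per channel: waits on its channel and dumps chunks
            into the common receive queue until the stop event fires."""
            while not stop.is_set():
                chunk = read_update(channel)   # placeholder for the event-based channel read
                if chunk is not None:
                    rx_queue.put(chunk)

        def recorder(rx_queue, stop, write_chunk):
            """The recording process: pulls chunks off the common queue and
            writes each one to the matching channel in the TDMS file."""
            while not (stop.is_set() and rx_queue.empty()):
                try:
                    chunk = rx_queue.get(timeout=0.1)
                except queue.Empty:
                    continue
                write_chunk(chunk)             # placeholder for the TDMS write

        stop_recording = threading.Event()     # fired when recording stops
        common_queue = queue.Queue()           # shared by all the grabbers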

  5. I do it all the time! The only thing I would warn you about is that front panels and block diagrams open where they were saved. If they were saved on your second monitor and then opened on a system with only one screen, they will open off the edge of the screen. This can be a little annoying for you, but if it happens to a user, you'll certainly hear about it. Other than that, it's great! I have dual monitors set up at work and at home. I was fortunate enough on one project to get a triple monitor setup. That was like heaven.

  6. Something I have done in the past is to create a set number of subpanels and then have a library of functions to handle sizing, positioning, loading, and unloading of VIs through an interface. You are limited to the number of subpanels you create initially, but as you place them on the front panel, the regular scrollbars allow you to move around the panel. As a bonus, it becomes very easy to save the template so the user can pull the exact same layout up again. The unused subpanels are shrunk and moved off screen. My system used 30 subpanels and I never had any performance issues, even on some 2005-era single-core computers.

  7. TDMS is a fantastic format. Because I wrote almost all of the DAQ software we have, I have replaced the custom formats we used with TDMS. The ease of use through LabVIEW makes it almost as simple as adding an indicator to your front panel. It's also very powerful for adding metadata. You don't have to figure out where to put some new property in your file so that it doesn't mess with your offsets, or whether you can add another channel without needing to rewrite your reader.

    We have had some issues using TDMS in MATLAB. As far as I know there is no official support for TDMS from MathWorks (they probably see NI as a competitor). One way to do it is to use the TDMS DLL and do library function calls, but those can be finicky and can cause MATLAB to crash to desktop. The other problem we had with that method is that, at least when we were using it, the 64-bit version of MATLAB had no built-in way to use a DLL and you had to link it to an external compiler. It was not an ideal solution. In the end, we wound up using the TDMS format white paper to write a MATLAB-native reader, but there are some data types that aren't supported and we are lacking a number of nice features. It works for the way we tend to use the files we create, though.

  8. I like to map buttons 4 and 5 to undo and redo. It's the same behavior as forward and back in a browser. I also sometimes map Align and Distribute to the wheel rocking. Unfortunately, I don't think you can map individual Align or Distribute functions; all you can do is repeat the last operation you selected. Ideally, I'd love to have Align Left and Align Right as my left and right wheel rock respectively. I'm kinda OCD about my code, and I do a lot of alignment. With wire alignment in 2011, it's worse... thanks NI :P

  9. The only way I have ever gotten a background in a 3D scene was to create a large cube or sphere around my scene and then apply the background texture to that. You need to make it quite large, on the order of 100 to 1000 on a side for a cube. I once took the 3D demo with the Sun, Earth, and Moon and added a starry background for fun. I don't know if that is what you meant, but I hope it helps.

    Hello.

    I tried to import a VRML file to LabVIEW and everything is working, except the background.

    I now found that according to

    http://digital.ni.co...62575CA005747E0

    the background feature is unsupported.

    Is there a way to change the background in a 3D scene?

    I found some background color properties, but I found the help a little confusing (I am a novice).

    Any help is much appreciated.

    Kind regards,

    Greg

  10. Some things I noticed:

    1. The only flow control in that loop is waiting for the read. You could use an event structure and a DAQmx event to potentially improve your response. A good one to use is EveryNSamplesAcqIntoBuffer (see the sketch below this list).

    2. You are doing some unknown amount of processing in that state. You might try using a producer/consumer architecture so that the processing and database access don't potentially slow your loop down.

    3. What is that DataQ and what is putting data in there? That state won't run if the queue is empty.

    4. You have the task set up for 100 samples per channel, yet your read is for 1 sample per channel.

    Try checking some of those things. If you narrow down the delay to the actual DAQ read, then I think a call to NI would be a good bet. There are cases where the hardware acts weird, but you have to eliminate the software first.
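
    If it helps to see the Every N Samples idea outside of LabVIEW, here is a rough sketch using the nidaqmx Python package (device, channels, and rates are made up):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        SAMPLES_PER_EVENT = 100

        task = nidaqmx.Task()
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")   # hypothetical channels
        task.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)

        def on_every_n_samples(task_handle, event_type, num_samples, callback_data):
            # Fires when N new samples have landed in the buffer, so the loop
            # rate is driven by the hardware instead of a blocking read.
            data = task.read(number_of_samples_per_channel=num_samples)
            # ...hand data off to a consumer loop here rather than processing in place...
            return 0                                          # required by the driver

        task.register_every_n_samples_acquired_into_buffer_event(
            SAMPLES_PER_EVENT, on_every_n_samples)
        task.start()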

    I'm running into a synchronization issue with an SCXI-1112 in an SCXI-1000 frame. Irregularly (every 10 to 15 reads), the read takes roughly 10x longer than the majority of the reads. This slows the rest of that case down, which drops data. I've tried various sample sizes, changing my loop delay, and adding "DAQ Control Task" in my initialization to set the task into "Commit". What else might I try to polish the edges so it doesn't stick any more? I've attached pictures of the relevant code.

  11. I don't know how your program is set up, but I would use a DAQ event for timing and then read the mouse position when the event fires. See my attached VI; there is also a rough sketch of the idea below the quoted post.

    I have an application that requires me to record mouse events (xy location vs time) and synchronize that with DAQ data that is acquired in parallel.

    I get the mouse events with the system tick counter as a time base and the DAQ data with the system timestamp as a time base.

    To synchronize the two I am calling the GetTickCount and the GetDateTime functions in a sequence frame and use those values as the reference points for my relative time base.

    Does that make sense? Is there an easier way?

    We are seeing a 500ms delay between DAQ and mouse data and have no idea where it comes from.

    Read Mouse.vi
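
    The attached VI is LabVIEW, but here is the same idea sketched in Python (Windows-only, using the nidaqmx package and the Win32 cursor call; the channel name and sample counts are made up):

        import ctypes
        import ctypes.wintypes

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        def cursor_pos():
            # Read the current mouse position via the Win32 API.
            pt = ctypes.wintypes.POINT()
            ctypes.windll.user32.GetCursorPos(ctypes.byref(pt))
            return pt.x, pt.y

        task = nidaqmx.Task()
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")      # hypothetical channel
        task.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)

        daq_data = []
        mouse_track = []

        def on_samples(task_handle, event_type, num_samples, callback_data):
            # Both reads happen in the same callback, so the mouse position is
            # tagged to the same hardware-timed event as the DAQ data.
            daq_data.append(task.read(number_of_samples_per_channel=num_samples))
            mouse_track.append(cursor_pos())
            return 0

        task.register_every_n_samples_acquired_into_buffer_event(100, on_samples)
        task.start()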

  12. One option is to create the mesh manually by using Create Mesh.vi and then adding the array of 3D coordinates using a property node. The other option is to use something like this:

    http://decibel.ni.co...t/docs/DOC-5242

    I like the second option for complex things, though it can take a bit of getting used to for scaling and positioning. All of the tools are free, which is nice. For the manual route, there is a sketch of generating the coordinate array below the quoted post.

    I'm just getting started with using the 3D picture control and need some help with drawing objects. There are only a few built-in functions for drawing objects such as the cone, cylinder, and box. What do you do when you need to draw a different shape, maybe a pyramid? For example, I would like to draw something similar to the box, but I want the rectangles that define the top and bottom to be different sizes. I'd also like to draw something like the cone, but I want to be able to specify the radius of the bottom as well as the top. Anyone know how this can be done? Thanks in advance.
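
    For the manual Create Mesh.vi route, the hard part is just generating the coordinate array. A rough numpy sketch for a truncated cone with separately specified bottom and top radii (you would still need to build the matching triangle index array):

        import numpy as np

        def frustum_points(r_bottom, r_top, height, n_sides=24):
            """Return vertices for a truncated cone as an (N, 3) array:
            one ring of points around the bottom, one around the top."""
            theta = np.linspace(0.0, 2.0 * np.pi, n_sides, endpoint=False)
            bottom = np.column_stack((r_bottom * np.cos(theta),
                                      r_bottom * np.sin(theta),
                                      np.zeros(n_sides)))
            top = np.column_stack((r_top * np.cos(theta),
                                   r_top * np.sin(theta),
                                   np.full(n_sides, height)))
            return np.vstack((bottom, top))

        pts = frustum_points(r_bottom=2.0, r_top=0.5, height=3.0)
        print(pts.shape)   # (48, 3): 24 bottom vertices followed by 24 top vertices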

  13. I have a similar project, and I wanted to have it run faster as well. I don't know if it will work for your system, but the system we have lets you create movement programs that you can run without having to send individual commands. To deal with the timing issue, I just run the data acquisition continuously and watch for "flat" spots in my data, since those correspond to where the motor pauses between moves. I can put very small delays in between my moves and pull that out of the data (see the sketch below). It still takes a while to copy the programs to the controller, but once a program is sent, I can rerun it as many times as I want. This may help speed things up.

    If your particular task doesn't have data that would easily let you determine where the motor stopped, you could use the status bytes from the motor controllers to read when they change state from moving to stopped. If you aren't having to send individual commands, polling status should be fairly quick. Some controllers even have digital outputs that indicate when the motor is moving. You could use this as a trigger for your acquisition.
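
    A rough numpy sketch of the flat-spot idea (the thresholds are made up; what counts as "flat" depends on your noise level):

        import numpy as np

        def flat_spots(position, min_len=50, tol=1e-3):
            """Return (start, stop) index pairs where the position signal is
            essentially constant, i.e. where the motor paused between moves."""
            moving = np.abs(np.diff(position)) > tol       # True where the motor is moving
            spots = []
            start = None
            for i, m in enumerate(moving):
                if not m and start is None:
                    start = i                              # entering a flat region
                elif m and start is not None:
                    if i - start >= min_len:               # long enough to be a real pause
                        spots.append((start, i))
                    start = None
            if start is not None and len(moving) - start >= min_len:
                spots.append((start, len(moving)))
            return spots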

  14. I am trying to map hardware that is currently in use. Is there a DAQmx property that will let you know if a device or physical channel is reserved? I know that if you try to run two tasks on the same hardware you get error 50103 and you could capture that, but I'd rather not have to create and run a task on every channel in my system. Another option would be to assume that nothing else is running (probably a fair assumption on our systems) and then flag channels as "in use" in my software.

    Has anyone come up with any other methods for this type of thing if there isn't something in DAQmx?

  15. I use DAQmx quite a lot in my day-to-day work. The only way you can get to the 16-bit data from the acquisition device is to use the Analog>Unscaled option on the DAQmx Read VI. When you read the waveform datatype, the unscaled data is converted to DBL values by the DAQmx driver using a polynomial evaluation. You can get the polynomial coefficients for converting to voltage from the driver by using a DAQmx Property Node: DAQmx Channel>>AI.DevScalingCoeff. What you would want to do, then, is set up your read as unscaled, use the unscaled data for your TDMS file, and convert to volts, or whatever scaling, at that point. Stay away from the Raw read options: Unscaled still returns your data as a 2D array of [channels x samples], while Raw returns a 1D array that you have to parse manually.

    One other note: don't assume the nominal voltage ranges on your device are acceptable for rescaling. Just because you have a 16-bit ADC with a ±10 V range doesn't mean you can use 20 / 2^16 as your dV. The actual output from the device probably goes from -10.214 to +10.173, and assuming the nominal range will make your data very messy. Always use the device scaling coefficients from the driver.

    I use a polynomial composition to combine my unscaled-to-voltage conversion and my voltage-to-engineering-units conversion. This reduces my CPU load quite a bit. If you are using a custom scale in MAX, I don't know how that works, so you may not see a difference.
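
    A small numpy sketch of that composition for the common case where the engineering-unit conversion is linear (the coefficient values here are made up; the real ones come from AI.DevScalingCoeff):

        import numpy as np
        from numpy.polynomial import polynomial as P

        # Device scaling coefficients from the driver, lowest order first:
        # volts = c0 + c1*raw + c2*raw**2 + c3*raw**3
        dev_coeff = np.array([-1.2e-3, 3.05e-4, 0.0, 1.1e-14])   # made-up values

        # Linear engineering-unit scale: eu = gain * volts + offset
        gain, offset = 25.0, -3.0                                # e.g. psi per volt

        # Compose the two so a single polynomial evaluation goes straight
        # from unscaled I16 counts to engineering units.
        combined = gain * dev_coeff
        combined[0] += offset

        raw = np.array([-32768, 0, 12345, 32767], dtype=np.int16)
        eu = P.polyval(raw.astype(np.float64), combined)
        print(eu)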

    Another thing I do is build my tasks programmatically from config files (you could use XML). I have found that for situations where people are changing acquisition settings frequently, MAX can be a pain. Also, you can't (as far as I know) dynamically retarget a MAX task to new hardware so that you can run multiple instances of it. Another good thing about not using MAX is that you can control which configuration settings your users do and don't have access to. You don't have to give them AC/DC coupling options if they are only reading DC levels, for example.
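
    A rough sketch of the config-driven idea with the nidaqmx Python package (the file name, config layout, and channel names are all just illustrative):

        import json

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        # Example config the user edits instead of opening MAX:
        # {"device": "Dev1", "rate": 1000.0,
        #  "channels": [{"physical": "ai0", "name": "Pressure",
        #                "min": -10.0, "max": 10.0}]}
        with open("daq_config.json") as f:     # hypothetical file name
            cfg = json.load(f)

        task = nidaqmx.Task()
        for ch in cfg["channels"]:
            task.ai_channels.add_ai_voltage_chan(
                "{}/{}".format(cfg["device"], ch["physical"]),
                name_to_assign_to_channel=ch["name"],
                min_val=ch["min"], max_val=ch["max"])
        task.timing.cfg_samp_clk_timing(cfg["rate"],
                                        sample_mode=AcquisitionType.CONTINUOUS)
        task.start()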

    One nice thing about MAX is, well, um, oh, it already has custom scaling options. Though they can be a little cumbersome. I usually avoid MAX as much as I can since its portability is practically nonexistent and the interface is straight out of the early 90s.
