
Leaderboard

Popular Content

Showing content with the highest reputation on 11/05/2015 in all areas

  1. I had a LabVIEW user group meeting last week about embedded platforms, and when I got home I saw the Intel Edison collecting dust on my desk. Since it is x86, I thought I would try to deploy a Linux LabVIEW executable to the Edison and run it in the Run-Time Engine (RTE). Here is what I did:

1) Install Ubilinux, which is based on Debian Linux, following: https://learn.sparkfun.com/tutorials/loading-debian-ubilinux-on-the-edison

2) Install Xvfb (a virtual framebuffer X server, since this device has no video output), the Fluxbox window manager, x11vnc, and xterm:

   apt-get install xvfb fluxbox x11vnc xterm

3) Install the prerequisites for the LabVIEW RTE (these were already pulled in by the step above, but it is good to check):

   apt-get install libxinerama1 libgl1-mesa-glx

4) Download the RTE and extract it. I put it in /home, as /root gets pretty full:

   wget http://ftp.ni.com/support/softlib/labview/labview_runtime/2014/Linux/LabVIEW2014RTE_Linux.tgz
   tar zxvf LabVIEW2014RTE_Linux.tgz

5) The RTE ships as an RPM and Debian needs a DEB, so install alien:

   apt-get install alien

6) Convert the RPM to a DEB (this takes a while):

   alien -k -c labview-2014-rte-14.0.0-1.i386.rpm

7) Install the RTE:

   dpkg -i labview-2014-rte_14.0.0_1.i386.deb

8) Install some missing fonts:

   apt-get install xfonts-100dpi xfonts-75dpi xfonts-scalable xfonts-cyrillic ttf-mscorefonts-installer

At this point, I compiled the "UDP Sender" example using LabVIEW 2014 for Linux, with my desktop hardcoded as the destination IP address. I copied the executable over to the Edison and ran it:

   xvfb-run ./send

A few seconds later, my desktop was receiving UDP packets from my Edison, sent from a LabVIEW application! (A minimal listener for checking the receiving side is sketched below.) I was pretty excited by this.

Next, I wanted to see if I could get a GUI running. I started up a virtual display and launched Fluxbox:

   Xvfb -screen 0 800x600x16 -ac &
   DISPLAY=:0 fluxbox &

Then, from a Linux VM, I created an SSH tunnel and started a VNC connection:

   ssh -l root -L 5900:localhost:5900 192.168.222.90 'x11vnc -localhost -display :0'
   vncviewer localhost

I was greeted with a desktop, so I launched a terminal and executed the send program, and there was the VI running on the remote desktop. So there you have it: LabVIEW executables running on a headless Intel Edison with remote desktop support. Oh, and once the VI is running from the remote desktop, you are free to disconnect and reconnect without terminating the running VI.
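For anyone who wants to sanity-check the receiving side without writing another VI, here is a minimal sketch of a UDP listener in C. The port number (61557) is purely an assumption for illustration; use whatever destination port the compiled VI actually sends to.

   /* Minimal UDP listener: prints every datagram it receives.
      Build with: gcc -o udplisten udplisten.c
      The port below is an assumption -- match it to the VI's setting. */
   #include <stdio.h>
   #include <string.h>
   #include <sys/types.h>
   #include <sys/socket.h>
   #include <netinet/in.h>
   #include <arpa/inet.h>

   int main(void)
   {
       int sock = socket(AF_INET, SOCK_DGRAM, 0);
       if (sock < 0) { perror("socket"); return 1; }

       struct sockaddr_in addr;
       memset(&addr, 0, sizeof addr);
       addr.sin_family      = AF_INET;
       addr.sin_addr.s_addr = htonl(INADDR_ANY);
       addr.sin_port        = htons(61557);   /* assumed port */

       if (bind(sock, (struct sockaddr *)&addr, sizeof addr) < 0) {
           perror("bind");
           return 1;
       }

       char buf[1024];
       struct sockaddr_in src;
       socklen_t srclen = sizeof src;
       for (;;) {
           /* Block until a datagram arrives, then print sender and payload. */
           ssize_t n = recvfrom(sock, buf, sizeof buf - 1, 0,
                                (struct sockaddr *)&src, &srclen);
           if (n < 0) { perror("recvfrom"); return 1; }
           buf[n] = '\0';
           printf("%s:%u -> %s\n", inet_ntoa(src.sin_addr),
                  (unsigned)ntohs(src.sin_port), buf);
       }
   }

Running this on the desktop while the Edison's send executable is up should print one line per packet; tcpdump would work just as well if you prefer not to compile anything.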
As for resource usage, here is what I see while running everything:

   root@ubilinux:~# free -h
                        total    used    free  shared  buffers  cached
   Mem:                  960M    174M    786M      0B     9.0M    100M
   -/+ buffers/cache:             64M    896M
   Swap:                   0B      0B      0B

And htop:

   1  [##**                9.6%]   Tasks: 35, 12 thr; 1 running
   2  [##***              12.7%]   Load average: 0.15 0.20 0.11
   Mem[|||#****         65/960MB]   Uptime: 00:07:25
   Swp[                    0/0MB]

     PID USER  PRI  NI   VIRT    RES    SHR  S  CPU%  MEM%  TIME+    Command
    2018 root   20   0  16132   5428   2772  S   5.0   0.6  0:19.94  x11vnc -localhost
    2047 root   20   0   4380   1580   1204  R   5.0   0.2  0:00.83  htop
    2033 root   20   0   120M  39620  27756  S   5.0   4.0  0:16.89  ./send
    1987 root   20   0  52136  14608   5016  S   3.0   1.5  0:17.00  Xvfb -screen 0 80
    2016 root   20   0  12540   6300   2436  S   0.0   0.6  0:02.85  sshd: root@notty
       1 root   20   0   2200    716    616  S   0.0   0.1  0:00.81  init [2]
     202 root   20   0   2540   1108    716  S   0.0   0.1  0:00.28  udevd --daemon
     360 root   20   0   2536    816    428  S   0.0   0.1  0:00.01  udevd --daemon
     366 root   20   0   2536    820    428  S   0.0   0.1  0:00.00  udevd --daemon
    1552 root   20   0   5760   1424   1040  S   0.0   0.1  0:00.14  /sbin/wpa_supplic
    1752 root   20   0  27916   1680   1116  S   0.0   0.2  0:00.17  /usr/sbin/rsyslog
    1754 root   20   0  27916   1680   1116  S   0.0   0.2  0:00.01  /usr/sbin/rsyslog
    1755 root   20   0  27916   1680   1116  S   0.0   0.2  0:00.01  /usr/sbin/rsyslog
    1747 root   20   0  27916   1680   1116  S   0.0   0.2  0:00.23  /usr/sbin/rsyslog
    1801 root   20   0   2424    608    520  S   0.0   0.1  0:00.00  /usr/sbin/udhcpd
    1837 ntp    20   0   5304   2020   1572  S   0.0   0.2  0:00.11  /usr/sbin/ntpd -p

The VI is using about 5% CPU, with another 5% for the VNC server and 3% for the virtual framebuffer. Here is the disk usage:

   root@ubilinux:~# df -h
   Filesystem       Size  Used  Avail  Use%  Mounted on
   rootfs           1.4G  1.3G    78M   95%  /
   /dev/root        1.4G  1.3G    78M   95%  /
   devtmpfs         480M     0   480M    0%  /dev
   tmpfs             97M  300K    96M    1%  /run
   tmpfs            5.0M     0   5.0M    0%  /run/lock
   tmpfs            193M     0   193M    0%  /run/shm
   tmpfs            481M   12K   481M    1%  /tmp
   /dev/mmcblk0p7    32M  5.3M    27M   17%  /boot
   /dev/mmcblk0p10  1.3G  125M   1.2G   10%  /home

It's pretty tight; you will definitely run out of space if you do not do the RTE installation from /home. As for I/O, I have not played much with the I/O ports yet; that's next on my list. I wanted to share my progress so far and get people's feedback. Thanks, Jon
    1 point
  2. Your code was not attached, but I often use variants, dynamic registration of arrays of control references, and control labels that encode the message that needs to be sent, as illustrated in this old post. I have also done similar things connecting multiple controls to Camera Attributes or DAQmx channels.
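The LabVIEW wiring for this lives in the linked post, but the core idea (the control's label doubles as the message, so one generic handler can service any set of dynamically registered controls) is ordinary table-driven dispatch. Here is a rough sketch of that pattern in C; every name here (Control, send_message, the "CMD:" labels) is invented for illustration and is not LabVIEW API:

   #include <stdio.h>

   /* Each "control" carries a label that encodes the message to send,
      so one generic handler can service every registered control. */
   typedef struct {
       const char *label;   /* doubles as the message, e.g. "CMD:Gain" */
       double      value;   /* the control's current value */
   } Control;

   /* Hypothetical stand-in for "send this message to the device/queue". */
   static void send_message(const char *msg, double value)
   {
       printf("sending %s = %g\n", msg, value);
   }

   /* One handler for the whole array of registered controls: look up
      the control that fired and forward its label as the message. */
   static void on_value_change(const Control *controls, int count, int which)
   {
       if (which >= 0 && which < count)
           send_message(controls[which].label, controls[which].value);
   }

   int main(void)
   {
       Control controls[] = {
           { "CMD:ExposureTime", 0.02 },
           { "CMD:Gain",         4.0  },
           { "CMD:StartAcq",     1.0  },
       };
       controls[1].value = 8.0;         /* simulate a user edit... */
       on_value_change(controls, 3, 1); /* ...and the event firing */
       return 0;
   }

Adding a control then means adding one table entry, not another handler, which is the same payoff the dynamic-registration approach gives you in LabVIEW.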
    1 point
  3. The Transpose can be a free operation, but it doesn't have to stay free throughout the diagram. LabVIEW maintains flags for arrays that indicate, for instance, the element order (forward or backward) and whether the array is transposed. The Transpose function simply sets the corresponding flag (just as Reverse 1D Array sets its corresponding flag). Any function consuming the array then either has to support that flag and process the array accordingly, or has to first call a function that normalizes the array. So while Transpose may be free in itself, that doesn't mean that processing a transposed array will never incur the additional work of physically transposing the array. I believe it is safe to assume that all native LabVIEW nodes know how to handle such "subarrays", as will probably autoindexing and similar mechanisms. However, when such an array is passed to a Call Library Node, for instance, LabVIEW will ALWAYS normalize the array prior to calling the external code function. The same applies to other array operations such as Array Subset, which doesn't always physically create a new array and copy data into it, but can instead create a subarray that merely records an offset and length into the original array. Of course, many of these optimizations are invalidated as soon as your diagram starts to have wire branches, which often require separate copies of the array data in order to stay consistent.
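To make the Call Library Node point concrete: external code never sees the transpose/order flags, because LabVIEW normalizes the wire first. Here is a sketch of what the C side might look like, assuming the CLN is configured to pass the 2D array as an array data pointer plus its two dimension sizes; the function name and configuration are illustrative, not from the post above:

   #include <stddef.h>
   #include <stdint.h>
   #include <stdio.h>

   /* Hypothetical CLN callee. By the time this runs, LabVIEW has already
      normalized the array: `data` is a dense, row-major buffer of
      rows*cols elements, no matter how many Transpose / Reverse /
      Array Subset nodes fed the wire upstream. */
   int32_t sum_matrix(const double *data, int32_t rows, int32_t cols,
                      double *sum)
   {
       double s = 0.0;
       for (int32_t r = 0; r < rows; r++)
           for (int32_t c = 0; c < cols; c++)
               s += data[(size_t)r * cols + c];  /* plain dense indexing */
       *sum = s;
       return 0;   /* status back to the diagram */
   }

   int main(void)
   {
       /* Stand-in for the buffer LabVIEW would hand us after normalizing. */
       double m[2][3] = { {1, 2, 3}, {4, 5, 6} };
       double s;
       sum_matrix(&m[0][0], 2, 3, &s);
       printf("sum = %g\n", s);   /* prints: sum = 21 */
       return 0;
   }

The convenience has a cost: if the wire carried a lazily transposed subarray, the normalization before the call is exactly the physical transpose you thought you had avoided.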
    1 point
  4. This article should provide a helpful perspective: http://www.ni.com/newsletter/51675/en/

C was my first programming language. Then I learnt assembly for two microprocessors, then C++, then LabVIEW, then TestStand. Learning C helped me to understand memory management, which gave me insight into how LabVIEW does buffer allocations behind the scenes. TestStand has a very C-like structure, so knowing C made it easier for me to learn TestStand. Learning assembly (together with digital logic) helped me to understand how software translates into hardware steps, which gave me insight into what the LabVIEW compiler needs to do behind the scenes, and it made it easier for me to learn the FPGA module in LabVIEW. Learning C++ helped me to understand OOP concepts, which made it easier for me to learn LVOOP.

Note: This is my personal journey to my current state. Learning C (and the other languages I wrote about) definitely gave me a better "intuition" for how to write LabVIEW, but it is not strictly necessary, and I don't think it's the most efficient pathway if your main goal is to improve LabVIEW skills. Other people have probably developed the same "intuitions" via other pathways. Note also that I no longer use C or assembly these days. Although they were a good learning experience, I find them too unwieldy for my daily use cases. LabVIEW and C++ are now my preferred tools.

As for C programmers who struggle with LabVIEW: I'm guessing that's because the move from C to LabVIEW is a big paradigm shift, so they're simply struggling with a language that's "alien" to their "mother tongue". In fact, I experienced the same when going from C to C++! (They might have similar core syntaxes, but they're very different languages.) You might experience a similar struggle when you start learning C, but I highly doubt that it will degrade your LabVIEW skills. After all, LabVIEW is your "mother tongue" ;-)
    1 point