flintstone Posted October 4, 2012

Hi everyone,

I have a design on a 7851 that is working in the lab, and now there is one big issue. The system consists of this FPGA card in a PXIe crate with an 8133 controller which, on startup, configures the FPGA using the "Open FPGA VI Reference" VI, parameterizes the FPGA, and receives data from it. The "Open FPGA VI Reference" points to the bitstream and was configured to generate a typedef from the interface. The application on the controller uses LVOOP; the class holds an instance of this generated typedef in its private data (it is actually a bit more complex for other reasons: the private data only holds a reference to the cluster that contains all the data, including the FPGA VI reference, but I don't think this makes a difference). Using this instance, the FPGA Read/Write Control nodes and a DMA FIFO, the host application interfaces with the FPGA. The front-panel interface to the FPGA also contains a cluster holding some numeric and boolean values; let's call it "param_cluster".

The problem: I developed the design in the lab, where it was tested and works. I committed it to SVN (which is used to manage the whole project and is an essential building block of everything). When I check it out at the test stand to deploy it, the FPGA Read/Write nodes accessing "param_cluster" are broken, with the error message "Error: Master Copy for Type Definition could not be found". It is exactly the same error as in this thread: http://forums.ni.com...nd/td-p/1496866 . Unfortunately there were never clear answers to it.

A temporary workaround is to rebuild the bitstream at the test stand and point the "Open FPGA VI Reference" VI at it again, but this cannot be done in the final system; the bitstream used there must be the one tested and verified in the lab.

So, does anyone have an idea what is happening here?

Cheers,
flintstone
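[Editor's note: for readers less familiar with the LabVIEW host side, the sequence flintstone describes (open a reference to the bitfile, write parameter controls, read a DMA FIFO) has a rough text-based parallel in NI's FPGA Interface C API. The sketch below is only illustrative and is not the LabVIEW code discussed in this thread; the bitfile name, signature constant, control and FIFO identifiers are hypothetical placeholders of the kind the C API generator would produce for a real bitfile.]

```c
/* Illustrative sketch only: host-side access to an FPGA bitfile through the
 * NI FPGA Interface C API. All NiFpga_MyTarget_* names are placeholders. */
#include "NiFpga.h"
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    NiFpga_Status status = NiFpga_Initialize();
    NiFpga_Session session;

    /* Rough equivalent of "Open FPGA VI Reference": open and run the bitfile. */
    NiFpga_MergeStatus(&status, NiFpga_Open("NiFpga_MyTarget.lvbitx",
                                            NiFpga_MyTarget_Signature,
                                            "RIO0",
                                            NiFpga_OpenAttribute_NoRun,
                                            &session));
    NiFpga_MergeStatus(&status, NiFpga_Run(session, 0));

    /* Rough equivalent of an FPGA Read/Write Control node: write one parameter. */
    NiFpga_MergeStatus(&status, NiFpga_WriteU16(session,
                                                NiFpga_MyTarget_ControlU16_Threshold,
                                                1234));

    /* Rough equivalent of reading the target-to-host DMA FIFO on the host. */
    uint64_t data[512];
    size_t remaining;
    NiFpga_MergeStatus(&status, NiFpga_ReadFifoU64(session,
                                                   NiFpga_MyTarget_TargetToHostFifoU64_Data,
                                                   data, 512, 5000, &remaining));

    NiFpga_MergeStatus(&status, NiFpga_Close(session, 0));
    NiFpga_MergeStatus(&status, NiFpga_Finalize());
    printf("final status: %d\n", (int)status);
    return (int)status;
}
```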
JamesMc86 Posted February 13, 2013

Hi flintstone,

The cause does appear to be a bug; what version of LabVIEW are you on? I found a CAR (128114) that looks like the problem: the bitfile stores an absolute path to the type definition instead of a relative path. Unfortunately that would mean either upgrading to LabVIEW 2012 or keeping the disk hierarchy in the same location on both machines. I guess a third option would be to disconnect the type definitions for the final build.

Cheers,
James
flintstone Posted February 13, 2013

Hi James,

many thanks for giving a convincing explanation of the problem :) . I will investigate my options, but I'm pretty sure we will find a way now.

Cheers,
flintstone
flintstone Posted April 4, 2013

I'm now sure that this really is the problem. I wanted to run local tests on a different machine with a pre-synthesized bitstream and saw the problem again. I reproduced the build environment's path structure on a USB key and it worked. The people who manage the automated build system really don't like this, but at least I can point them to this thread so they can be angry at the manufacturer, who does not provide a bugfix for this, rather than at me.

Regarding the "Disconnect Type Definitions" option: do you mean to do that for the host application or for the FPGA (where I don't find the option in the build settings)? For now it does not matter anyway; I want to run my test VI from the LabVIEW IDE until I have debugged my FPGA implementation.

Cheers,
flintstone
flintstone Posted May 13, 2013

The final solution will be to use only basic types on the front panel instead of a user-defined type.
todd Posted November 19, 2013

If I understand what was described above, this may be a useful data point. In LabVIEW 2013:

- fpga.vi writes data to a typedef cluster indicator
- rt.vi reads the FPGA's typedef cluster indicator and writes a PSP variable (network-published shared variable) of the same type
- pc.vi reads the PSP variable

If the RT build spec has "Disconnect type definitions" selected, everything works. If typedefs are not disconnected, the RT's PSP write does not put data on the network.