Everything posted by JKSH

  1. The bar edges look a bit blurry. Try sharpening your image and increasing the contrast. Do these help?
  2. Git + SourceTree, for the same reasons as @LogMAN. I'm not getting JIRA for LabVIEW work, but I'm part of an OSS community that does use JIRA.
  3. I'm guessing it has something to do with the pricing of Application Builder -- NI is selling these licenses for $1500. It's not clear to me: does purchasing an Application Builder license for one OS entitle you to it on the other OSes...? Also, does the Developer Suite entitle you to LabVIEW on all desktop OSes?
  4. This year, NI Week has a "Breakfast with the Execs" session, which lets attendees sit down with a high-ranking NI official and talk about specific topics. Omid Sojoodi, the Vice President of R&D for Application and Embedded Software, was taking questions about LabVIEW and SystemLink. I asked him about the Windows-centricity of LabVIEW NXG, and whether NI has considered cross-platform GUI toolkits. This is what he said:
       • NI is currently concentrating their resources on maturing NXG on Windows, because it is by far their largest market.
       • As of today, they haven't picked a solution for macOS and Linux, but they are considering using web technologies (HTML + CSS + JS).
       • During the early stages of NXG planning, the R&D team evaluated 7-8 cross-platform GUI toolkits. However, none of them were satisfactory.
       • NI does use Qt internally, so they are familiar with it. It is one of the toolkits they evaluated. According to Omid, it didn't look native enough, so they went with a native toolkit (WPF).
     Personally, I'm surprised by this; I've used Qt a lot, and it produces very native-looking GUIs to me. In any case, I would've thought the cost of minor deviations from full nativeness (if any) would be dwarfed by the cost of maintaining separate code bases for different platforms. I also think trying to tack on cross-platform support after the maturation of the Windows version will be a very steep uphill battle, compared to doing it from the get-go. However, Omid did also say (in a different discussion) that LabVIEW NXG is very modular, and the front-end is quite separate from the business logic. Perhaps this means NI is confident in their ability to swap front-ends easily?
  5. Every endpoint needs a name. Network Streams are one-to-one: one writer sends data to one reader. Between this pair, only one endpoint needs to specify the URL. I presume that your cRIO's IP address is 10.1.17.201? This is what I recommend:
       • You don't need any context names.
       • On your cRIO, set the reader name to "myreader".
       • On your cRIO, do not wire anything into writer url.
       • On your PC, set your writer name to "mywriter".
       • On your PC, set your reader url to "//10.1.17.201/myreader".
  6. This is not the official NI forum. Very few NI staff (who can actually fix your issue) come here. You're trying to catch fish by shooting a rifle towards the sky...
  7. Kudoed. But anyway, have you seen VIMs?
  8. According to http://digital.ni.com/public.nsf/allkb/79CB44F7E228AED88625756E00445151?OpenDocument, "Compact RIO systems have a FPGA in the backplane which prevents DAQmx from being used". But just to be sure, I'd ask NI via official channels to get a definitive answer.
  9. I'm curious: What's the rationale for supporting NI clients only? I can confirm that I've used the API to host an OPC UA server on a Linux RT CompactRIO, and this server is accessible to 3rd-party clients. My test client was UA Expert (by Unified Automation). I don't know the exact model of my customer's client, but I believe it was by Matrikon OPC.
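     For anyone who wants to reproduce that kind of third-party test without UA Expert, below is a minimal sketch using the python-opcua package; the endpoint address and node ID are placeholders and depend entirely on how the LabVIEW-hosted server is configured:

       from opcua import Client

       # Placeholder address/port for the cRIO-hosted server, and a placeholder node ID.
       client = Client("opc.tcp://10.1.17.201:4840")
       client.connect()
       try:
           node = client.get_node("ns=2;s=MyFolder.MyItem")
           print("Value:", node.get_value())
       finally:
           client.disconnect()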
  10. Yes, apparently. According to https://opcfoundation.org/forum/opc-ua-standard/clarification-for-index-range-handling-of-readwrite/ the ability to read an array subset (including a single element) is required by the standard. The ability to write a subset is optional, however. I believe it's called "Index Ranges". My previous link shows an example. Here's another: http://forum.unified-automation.com/topic1386.html
     I don't know of one, sorry.
     This sounds like a recipe for hard-to-detect bugs... Is there any chance of getting NI R&D to change this behaviour (or at the very least provide a way to enable stronger checks)?
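     To show what an index-range read looks like from a third-party client, here is a rough sketch with the python-opcua package (I haven't run this against a LabVIEW-hosted server; the server address, array node ID, and range are placeholders):

       from opcua import Client, ua

       client = Client("opc.tcp://10.1.17.201:4840")
       client.connect()
       try:
           rv = ua.ReadValueId()
           rv.NodeId = ua.NodeId.from_string("ns=2;s=MyArray")  # placeholder array node
           rv.AttributeId = ua.AttributeIds.Value
           rv.IndexRange = "2:4"                                # elements 2 through 4
           params = ua.ReadParameters()
           params.NodesToRead = [rv]
           results = client.uaclient.read(params)
           print(results[0].Value.Value)                        # only the requested subset
       finally:
           client.disconnect()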
  11. May I ask why? Off topic: One reason I still use DataSocket in LabVIEW is because that's the only way (that I know of) to programmatically read/write Modbus I/O server data without having to create bound variables: http://forums.ni.com/t5/LabVIEW/Can-I-write-to-Modbus-I-O-Server-addresses-without-creating/td-p/2848048 I'd love to know if there are any alternatives.
  12. Hi, I wanted to clarify: How do you get the string, "x23y45z3"? Does the user type it into a text box? How do you give the values back to the user? Why?
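     If the goal is simply to pull the numbers out of a string like that, one option is a regular expression; here is a rough sketch in Python (LabVIEW's Match Regular Expression primitive accepts a similar pattern), assuming the letter/number pairing shown in your example:

       import re

       def parse_axes(command: str) -> dict:
           # "x23y45z3" -> {'x': 23, 'y': 45, 'z': 3}
           return {axis: int(value) for axis, value in re.findall(r"([a-z])(\d+)", command)}

       print(parse_axes("x23y45z3"))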
  13. The way I see it, XControls are for cases where you want to create a custom interactive "widget", and you plan to embed multiple copies of this widget in other front panels. (Unfortunately, it doesn't always work well, but that's a different topic.) So, do you want multiple copies of your custom box? If you don't, then you definitely don't need an XControl. If you do, then I'd consider it. Also, does it have to be an overlaid box? Would you consider a separate pop-up dialog altogether?
  14. This is just a guess, but I think the Control and Simulation (C&S) loop treats your code as a continuous-time model, and does some conversion behind the scenes to produce a discrete-time model for simulation. A feedback node breaks time continuity, so it's quite possible that the feedback nodes interfere with the C&S loop's ability to solve your equations. I don't have the toolkit installed so I can't play with it, and I'm not sure whether the C&S Loop is capable of solving simultaneous differential equations as complex as yours. I would still do some calculus and algebra to make your equations more simulation-friendly first, before feeding them into LabVIEW. For example, it's pretty easy to get an expression for x. See:
       • Finding dx/dt: http://www.wolframalpha.com/input/?i=Integrate+A-B*(d^2y%2Fdt^2*cos(y)+-+(dy%2Fdt)^2*sin(y))+with+respect+to+t
       • Finding x: http://www.wolframalpha.com/input/?i=Integrate+A*t-B*(dy%2Fdt*cos(y))+%2B+C+with+respect+to+t
     ...where y = φ, A = F/(M+m), B = m*l/(M+m), and C = constant of integration (related to initial conditions). If you still want to try to get the C&S loop to process your equations unchanged, try asking at the official NI forums and see if people there know any tricks: http://forums.ni.com/
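     For reference, here is roughly what those two integrations give, using the same substitutions (C and D are constants of integration fixed by the initial conditions):

       \frac{dx}{dt} = A t - B \frac{dy}{dt} \cos(y) + C
       x = \frac{A t^2}{2} - B \sin(y) + C t + D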
  15. That's a simple but valid approach, assuming that both sensors are very close together and face the same direction at all times. The technical term for alignment is "image registration" or "image fusion". The Advanced Signal Processing toolkit contains an example for doing image fusion -- do you have it installed?
  16. The video shows that the nodes that get highlighted are those that take references as inputs. So, I'm guessing that your reference(s) became invalid somehow. I'm not sure why no error messages pop up, though -- did you disable those? Wire the error outputs of those highlighted nodes into a Simple Error Handler subVI. What do you see? It's hard to say without seeing your code in detail. Anyway, may I ask why you use the abort button? That can destabilize LabVIEW. It's safer to add a proper "Stop" button to your code.
  17. That's quite cool. Note that your system is non-linear (due to the sine and cosine functions). Do you know if those built-in ODE solvers cope well with non-linear systems? It's best to get it working for φ alone first (meaning you need to make sure all your values are non-weird), before you even consider solving for both x and φ at the same time. One quick and dirty technique is to take your "solved" φ variable, pass it through differentiator blocks, use your first equation to calculate (d^2 x / d t^2), and then pass that through integrator blocks to find x. This will avoid the "member of a cycle" problem, but might cause errors to accumulate. See http://zone.ni.com/reference/en-XX/help/371894H-01/lvsim/sim_configparams/
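     Here is a rough numerical sketch of that quick-and-dirty technique (in Python rather than LabVIEW, just to show the data flow; the time grid, the "solved" φ, and the constants A and B are all placeholders):

       import numpy as np
       from scipy.integrate import cumulative_trapezoid

       dt = 1e-3
       t = np.arange(0.0, 10.0, dt)
       phi = 0.1 * np.sin(2.0 * t)          # stand-in for the solved angle phi(t)
       A, B = 1.0, 0.5                      # stand-ins for F/(M+m) and m*l/(M+m)

       phi_dot = np.gradient(phi, dt)       # differentiate the solved angle...
       phi_ddot = np.gradient(phi_dot, dt)  # ...twice

       # Your first equation: d2x/dt2 = A - B*(phi_ddot*cos(phi) - phi_dot^2*sin(phi))
       x_ddot = A - B * (phi_ddot * np.cos(phi) - phi_dot**2 * np.sin(phi))

       # Integrate twice (zero initial velocity and position assumed here).
       x_dot = cumulative_trapezoid(x_ddot, t, initial=0.0)
       x = cumulative_trapezoid(x_dot, t, initial=0.0)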
  18. That's fine in mathematics. Writing equations like this provides a concise yet complete way of describing a system. However, this format doesn't lend itself nicely to programmers' code. The easiest answer for you is one that you already understand. So tell us: what techniques have you learnt in your engineering course for solving differential equations in LabVIEW (or in any other software environment, like MATLAB or Mathematica)? I haven't used the Control & Simulation loop before, so I don't know what features it has for solving differential equations. However, the first thing I'd try is to substitute the top equation into the bottom equation, and see if that allows you to solve for φ. If not, then my analysis is below.
     --------
     Those are Differential Equations, which describe "instantaneous", continuous-time relationships. You can't wire them up in LabVIEW as-is to solve for x and φ. I would first convert them into Difference Equations, which describe discrete-time relationships. Difference Equations make it very clear where to insert Feedback Nodes or Shift Registers (as mentioned by Tim_S), which are required to resolve your "member of a cycle" problem. You'll also need to choose two things before you can simulate/solve for x and φ:
       • Your discrete time step, Δt: how much time should pass between each iteration/step of your simulation?
       • Your initial conditions: at the start of your simulation, how fast and in which directions are your components moving?
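     As a rough sketch of what the difference-equation form looks like, assuming a simple forward-Euler discretisation (just one of several possible schemes), every "step k" value on the right-hand side is exactly what you would hold in a Feedback Node or Shift Register:

       \dot{\varphi}_{k+1} = \dot{\varphi}_k + \Delta t \, \ddot{\varphi}_k, \qquad \varphi_{k+1} = \varphi_k + \Delta t \, \dot{\varphi}_k
       \dot{x}_{k+1} = \dot{x}_k + \Delta t \, \ddot{x}_k, \qquad x_{k+1} = x_k + \Delta t \, \dot{x}_k

     ...where \ddot{\varphi}_k and \ddot{x}_k come from evaluating your two original equations at step k.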
  19. This isn't related to EULAs or LabVIEW FPGA, but you might like this site: http://tosdr.org/
  20. I've experienced this a few times in recent weeks. I then become unable to post on the original page, so I have to copy + paste my comments into a new tab. I'm running Google Chrome 55.0.2883.87 m, and I often take a long time to post, too.
  21. PC 3 has 2 cores, right? 50% CPU usage means that one of your cores is running at 100%. Jordan is probably correct: you have at least one while loop running wild. Run Resource Monitor on PC 2 too; I suspect you'll find your application using 25% CPU there (1 of 4 cores).
     Look at your code. For each loop, tell us: how frequently does the code inside the loop run? (Is it... once every second? Once every 10 ms? Once every time a queued item is received?) Anyway, here's a small note about loop timing:
     You haven't answered the questions asked by ShaunR and hooovahh: are your Ethernet ports 100 Mbps or 1000 Mbps? Since you are using GigE cameras, you need to use Gigabit (1000 Mbps) Ethernet ports.
     This should be fine, if you don't run anything else on the PC.
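     To illustrate the "loop running wild" point with a toy example (in Python rather than LabVIEW, since the idea is the same): a loop with no wait in it pegs one core at 100%, while even a short wait per iteration drops CPU usage to almost nothing.

       import time

       def busy_loop(iterations: int) -> None:
           # No wait: each iteration starts immediately, so one core sits at 100%.
           for _ in range(iterations):
               pass  # placeholder for the real per-iteration work

       def timed_loop(iterations: int, period_s: float = 0.010) -> None:
           # A 10 ms wait per iteration (like dropping a "Wait (ms)" primitive
           # into a LabVIEW while loop) keeps CPU usage near zero.
           for _ in range(iterations):
               time.sleep(period_s)  # placeholder work would go here too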
  22. Run your application on PC 2 and PC 3. Launch Resource Monitor. Are you hitting the limit of any resource(s)?
  23. One thing's not clear to me: does that statement mean, "You should not inherit from a concrete class, ever", or does it mean, "You should not make a concrete class inherit from a concrete class, but you can let an abstract class inherit from a concrete class"? Also, before we get too deep, could you explain, in your own words, the meaning of "concrete"?
     Agreed, I'd call it a "rule of thumb" instead of saying "never". An example where multi-layered inheritance makes sense, and that should be familiar to most LabVIEW programmers, is LabVIEW's GObject inheritance tree. Here's a small subset:
       GObject
         Constant
         Control
           Boolean
           Numeric
             NamedNumeric
               Enum
               Ring
             NumericWithScale
               Dial
               Knob
           String
             ComboBox
         Wire
     In the example above, green represents abstract classes, and orange represents concrete classes. Notice that concrete classes are inherited from by both abstract and concrete classes. Would you ("you" == "anyone reading this") organize this tree differently?
     What's the rationale behind this restriction?
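     For readers outside LabVIEW, here is a minimal Python sketch of the distinction being debated (the class names are made up): an abstract base, a concrete class inheriting from it, and a further concrete class inheriting from the concrete one -- exactly the pattern the "never inherit from a concrete class" rule of thumb argues against.

       from abc import ABC, abstractmethod

       class Sensor(ABC):                       # abstract class
           @abstractmethod
           def read(self) -> float: ...

       class TemperatureSensor(Sensor):         # concrete, inherits from abstract
           def read(self) -> float:
               return 25.0                      # stand-in for a real measurement

       class CalibratedTemperatureSensor(TemperatureSensor):
           # Concrete inheriting from concrete -- the pattern under debate.
           def __init__(self, offset: float):
               self.offset = offset

           def read(self) -> float:
               return super().read() + self.offset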
  24. The reduced contrast is intentional on NI's part. There is an Idea Exchange post calling for its reversal: https://forums.ni.com/t5/LabVIEW-Idea-Exchange/Restore-High-Contrast-Icons/idi-p/3363355