Everything posted by Mark Yedinak

  1. It would seem that you have several options. One would be for your applications to poll the configuration file to see if it has been modified, and update themselves if it has. They could use the modification date of the file for this; it would also mean that your computers' clocks should be synchronized. A second option would be to have your applications update themselves at some regular interval. The third, and most complex, option is to have a central server application that your applications would register with. If any instance of your application updated the configuration, it would notify the update server, which in turn would send a message to all registered applications that the update has occurred. A slight variation of this approach would be for each application to have a process that listens for broadcast messages. You could use UDP for this. When an application updated the configuration settings, it would broadcast a message indicating that the update has occurred, and all of the other applications could update their settings accordingly. This option is probably the most flexible. It would require that your IT staff allow UDP broadcast packets on your network and may require them to open a specific port. You could create a simple protocol for this where your broadcast messages are a message type and a variant. Based on the message type you could send additional information in the variant data, and the message type would dictate how to interpret the data itself.
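Since a LabVIEW diagram can't be pasted here as text, below is a minimal sketch of the UDP broadcast idea in Python. The port number and the "CONFIG_UPDATED" message format are invented for illustration.

```python
import socket

BROADCAST_PORT = 5005  # hypothetical port; pick one your IT staff approves

def announce_config_update(config_path):
    """Broadcast a 'config updated' message to every listener on the subnet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    # Simple protocol: a message type, then a variant-style payload.
    message = b"CONFIG_UPDATED|" + config_path.encode()
    sock.sendto(message, ("255.255.255.255", BROADCAST_PORT))
    sock.close()

def listen_for_updates():
    """Each application instance runs this in a background loop."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", BROADCAST_PORT))
    while True:
        data, sender = sock.recvfrom(1024)
        msg_type, _, payload = data.partition(b"|")
        if msg_type == b"CONFIG_UPDATED":
            print(f"Reloading settings from {payload.decode()} (announced by {sender[0]})")
```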
  2. I can't answer your question as to who grades the tests, however I can give you my opinion on the difference between the two certifications. The CLD is intended to certify that you have a good grasp of the language itself and that you follow good coding style. It lets someone know that you are capable of creating readable and modular code. Also, I am reasonably sure that a significant portion of the grading for the CLD is the result of the VI Analyzer's report on your exam code. It really focuses more on style than on the overall design and architecture of the application. The CLA, on the other hand, is intended to certify that you are also knowledgeable and capable of creating a solid architecture for a large application in a multi-developer environment. It certifies that you are using good software engineering techniques for designing and implementing your system. The CLA goes deeper into the design of the application rather than the style of the code itself. In addition, it covers software engineering practices that have nothing to do with the code itself, such as source code control, documentation practices, and other engineering practices that are completely independent of the programming language being used.
  3. If I were you I would consider using network queues. Here is a thread that discusses them. Attached is the version we have been using in our applications.
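The attached LabVIEW implementation isn't reproduced here, but the underlying concept is simply a queue served over the network. A rough Python sketch of the same idea (the port and authkey are arbitrary placeholders, and this is not the attached version):

```python
from multiprocessing.managers import BaseManager
import queue

# The queue lives in one process; remote clients reach it over TCP.
job_queue = queue.Queue()

class QueueManager(BaseManager):
    pass

# Register once; the server serves the real queue, clients just use the name.
QueueManager.register("get_queue", callable=lambda: job_queue)

def serve():
    """Run this in the process that owns the queue."""
    mgr = QueueManager(address=("", 50000), authkey=b"demo")
    mgr.get_server().serve_forever()

def remote_queue(host="localhost"):
    """Run this in any other process or machine to get a proxy to the queue."""
    mgr = QueueManager(address=(host, 50000), authkey=b"demo")
    mgr.connect()
    return mgr.get_queue()
```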
  4. QUOTE (ooth @ Mar 5 2009, 07:54 AM) I think the point that folks are missing is that I am not suggesting something that can't be done today. There are always ways around this, and using the error cluster is one way of doing it. However, if you are working with subVIs that don't have an error cluster, you either need to wrap them and add nothing but the error cluster, or some other arbitrary wire, or use a flat sequence frame. I know that there are ways to accomplish this and nothing new is required. I am making this suggestion because I believe it is an improvement over the sequence frame in that it produces cleaner-looking and less cluttered code. Also, I don't like the idea of wrapping VIs simply to create an arbitrary data dependency. The lack of this new feature would not be the end of the world. However, I do think its inclusion would produce cleaner-looking code in the long run. And in the end, isn't that what we are trying to accomplish?
  5. As Ton mentioned, this is a risk of event-driven applications. However, I would have to ask why you need to display the changes that frequently. I would question the design in this case and look at your architecture. A person will not be able to perceive updates at that rate, so there is no benefit to refreshing the display that quickly.
  6. What are you trying to accomplish? Are you asking if you can change the enum at run time, or are you asking if it is possible to replace it programmatically during your development process? If you are asking about doing it while your application is running, then the answer is no, you can't change it dynamically. If this is what you would like to do, then you should use a ring control. If you are asking about doing it during your development, then I believe the answer is that it is possible. I haven't done this myself, so I can't give you the specifics. If you are doing this during development, I would have to ask whether you are using typedefs or not. Typedefs do propagate the control's changes to all code that uses it.
  7. You can use this method. This is a direct call method using the winsock library. It works well and you can put it into a loop. However, BE CAREFUL if you do use this in a loop. If you don't introduce any delay you can easily flood the receiving device with ping requests. I do wish that NI would add a native ping to LabVIEW.
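As a rough stand-in sketch (this is not the winsock method itself), here is a Python loop that shells out to the OS ping utility. Note the deliberate delay so the target isn't flooded; the target address is a placeholder, and ping flags vary slightly between operating systems.

```python
import platform
import subprocess
import time

def ping(host, timeout_s=1):
    """Return True if the host answers one ping, using the OS ping utility."""
    if platform.system() == "Windows":
        cmd = ["ping", "-n", "1", "-w", str(timeout_s * 1000), host]
    else:
        cmd = ["ping", "-c", "1", "-W", str(timeout_s), host]
    return subprocess.run(cmd, stdout=subprocess.DEVNULL).returncode == 0

while True:
    print("alive" if ping("192.168.1.10") else "no reply")  # placeholder address
    time.sleep(1.0)  # the delay matters: without it you flood the device
```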
  8. The "Mouse up" event is not triggering when the mouse is on the scroll bar. The "Mouse down" event triggers though. It you click the mouse in the string display itself you will see that you get the desired behavior. Have you informed NI about this since this does appear to be a bug. At a minimum it is inconsistent behavior.
  9. What appears to be missing in the above picture is whatever controls the completion of your read. The code that you have now will only read the number of bytes the VISA connection reports as available at that one specific point in time. You say that your device will be sending lots of data back, so what is probably happening is that the VISA connection is reporting some number lower than the total that should be returned. In essence, you haven't read all of the data that you want. As jdunham mentioned, your device may not be capable of taking another command until all of the data from the previous one has been read. You need to know when and how to finish reading data before going on to the next step.
  10. You just described it. Your situation is basically a command/response system. However, your code is not written that way. Your code (the original picture you posted) has a loop continually reading at a very high frequency and a parallel loop that is trying to write data. What you need to do is sequence your actions such that you send a command and then read the response. After the response is read you send the next command, read its response, and so on.

      There are different ways to terminate your read; it really depends on the data you are getting back. If the data has some unique "end of data" marker, you can read until you see this marker. Or your responses may always be a specific length, in which case you read that specific number of bytes before continuing. Another way that might work is if you know your device will always send the complete response within a certain amount of time. If this is the case, and the time is short enough, you could attempt to read a large number of bytes, a number larger than your biggest response, and set a timeout value slightly longer than the maximum response time. Your worst case is when you can receive a variable amount of data with no termination value in an unspecified or long response time. In this case you read a single character with a timeout value; once you read a single character you begin reading as much data as you can, and stop reading once some specified period passes with no data. This timeout is relatively short compared to the overall data timeout. It also assumes that once the device responds, it will send the entire response with no significant delays between the characters. (A sketch of the first two strategies follows this post.)

      Parallel processing is a great thing, but just because you can do it doesn't mean that you should. Based on your last post I don't think your architecture is the right one for your task. You can still use the producer/consumer model for handling events, but the basic operations of your task could occur in a single state machine. This state machine would correctly sequence your application's writing and reading from the devices. Currently you have no flow control between your writing and reading of the serial port, and I believe this is what is causing your problems.

      Not to sound nasty or anything, but based on many of the questions you have posted I would highly recommend that you take some formal LabVIEW training classes. For example, I think you could benefit greatly from taking the Basics I and II classes.
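A minimal sketch of that command/response sequencing, using the pyserial package as a stand-in for the VISA calls. The port name, baud rate, commands, and terminator are all placeholders.

```python
import serial  # pyserial, assumed installed: pip install pyserial

ser = serial.Serial("COM3", 9600, timeout=2.0)  # timeout bounds every read

def query(command, terminator=b"\r\n"):
    """Send one command, then read its complete response before returning."""
    ser.reset_input_buffer()           # discard any stale data first
    ser.write(command + b"\r\n")
    # Strategy 1: read until the end-of-data marker (or until timeout).
    return ser.read_until(terminator)

def query_fixed(command, length):
    """Strategy 2: the response is always a known number of bytes."""
    ser.write(command + b"\r\n")
    return ser.read(length)            # blocks until `length` bytes or timeout

# Commands run strictly in sequence: write, read fully, then the next write.
print(query(b"*IDN?"))
print(query_fixed(b"MEAS?", 16))
```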
  11. What is controlling the flow or sequencing between your writing to the serial port and reading from it? From the looks of your diagram you are continually reading from the serial port. This will prevent you from writing to it at the same time. You should give your application a chance to write data. You could put logic in your read loop that only reads data when data is available. Also, do you really need to read your data every 10 ms? I suspect that loop is starving the rest of your application and preventing you from writing data when you want.
  12. QUOTE (Aristos Queue @ Feb 26 2009, 09:39 PM) I don't think you understand what I am saying, though. The Null wire that I am suggesting replaces the sequence frames and allows you to impart data flow where none exists. Here is a picture of the concept. If you look at the image you can see that the diagram is very readable and the sequence is very clear because it appears as standard data flow. The Null wire connectors are attached to a node, and it is not part of the connector pane itself. Personally, I find the Null wire suggestion cleaner looking and more consistent with the data flow aspect of LabVIEW.
  13. QUOTE (Adam Kemp @ Feb 26 2009, 01:32 PM) Replacing elements, adding a row, and searching or splitting them are a few operations that come to mind. I have encountered some performance issues when using tables (2-D string arrays) with a fair amount of data in them. We use tables to update test results or to display test data. For test results we color the cell representing the result green for a pass, red for a fail, or orange for an error. Even restricting the tables to a few hundred lines, updating the color of the cells is a time-consuming task. The color attribute for a table cell does not actually follow the data: if you delete a row from the table, the corresponding cell color is not deleted. Therefore we end up having to track the cell colors ourselves. This results in quite a bit of array processing. These are mostly simple operations, but in some tests we end up doing this quite a bit. If I get a chance I will try to put together an example if you think this would be helpful.
  14. You may also want to consider things like processing time for operations on, and manipulation of, large arrays.
  15. All of this talk about error wires and data flow has made me think about a different type of wire and connector that could be useful. There are times in code when certain tasks that don't really share data need to be sequenced. One way of doing this is to pass the error cluster through, even though the subVIs in question don't generate errors and would do no harm (other than wasting some CPU cycles) by running if an error were present. Another way is to use sequence frames. However, in general sequence frames are not very desirable, for many reasons.

      So I was thinking that a new wire and connector type could be useful for these types of situations. The wire type would be the Null wire (or some other name, such as sequence wire) that doesn't actually pass data but rather only imparts a sequence flow on a set of VIs. This wire would be a semitransparent gray wire, which would clearly indicate that there is no data present.

      A good example would be where you want to time a given task. The "Get Time/Date In Seconds" VI only has a single output, the time. So in order to time a task we generally have to place a flat sequence frame down with the Get Time in the first and third frames and our task in the second one. This generally takes up quite a bit of diagram space. However, if we could attach an arbitrary connector to any subVI we choose, we could impart the sequencing required to accomplish our task where no true data flow exists. The diagram would be far less cluttered than the one with the bulky sequence frame as well. This would be useful in cases where no data dependencies exist and the value of creating a wrapper subVI, an enclosing sequence frame (which just looks ugly), or a separate state in a state machine just isn't warranted. This sequence wire would not require an actual connector on the connector pane, as it could be arbitrarily attached to a subVI. Effectively this is the same as the flat sequence frame, but much less obtrusive on the block diagram.

      I would be interested in hearing what others think about this concept.
  16. QUOTE (Bjarne Joergensen @ Feb 26 2009, 11:06 AM) I would definitely talk to your R&D about this. The RFCs are there for a reason, and it still baffles me why you find so many home-grown networking implementations whose authors appear never to have even looked at the RFC. My guess is that your R&D folks simply took the command list and implemented its functionality.
  17. Just for clarity's sake, 255.255.255.255 is the IP broadcast address. It is not unique or special to UDP; it is the broadcast address for IP itself, and UDP is just one of the many protocols carried over IP.
  18. QUOTE (Phillip Brooks @ Feb 24 2009, 01:36 PM) I suppose I could, but to me this is a problem with NI and LabVIEW. I shouldn't have to jump through hoops to be able to open the properties dialog box quickly. I am on a corporate network, and my default printer is a printer that I actually use. I shouldn't have to reconfigure my system all the time, or create a dummy printer as my default and then manually select my real printer each time I need to print something. Why not set up the properties dialog box to only communicate with the printer when you actually want to print, not every time the dialog box is opened? Yes, there are some workarounds for this issue, but the point is I shouldn't need them. If NI has known about this since release 3.0, why hasn't something been done to fix it? This is one of the issues that falls into the "very annoying" category. These are the types of problems I try to resolve for my users; the last thing you want is an annoyed user.
  19. QUOTE (rolfk @ Dec 27 2008, 10:51 AM) OK, this explains quite a bit. I have been running into this issue with several versions of LabVIEW, and it is a REALLY annoying behavior. Is there some workaround besides deleting the printers? I have multiple printers on my computer, and they are not always available, yet they must remain installed; I test network connectivity on printers. So I am constantly waiting for the properties dialog box to come up. Can I disable this check? Removing and re-adding printers constantly is not an option I am willing to live with, and I also don't like waiting forever for a properties dialog box to come up. It is interesting that I have mentioned this several times to NI support people and not a single one knew about this.
  20. You will want to ask questions related to specific experiences. Stay away from questions with hypothetical or opinion-based answers. Here are some example questions you could ask.

      Development skills:
      1. Describe the architecture of the last/largest/most successful automated system you designed. In hindsight, what features were advantageous and what features could have been improved?
      2. When designing a system, do you have a general architecture you use? Can you describe it?
      3. Describe a particularly difficult technical challenge you had to overcome. How did you overcome it?
      4. Ask specific questions related to your field if necessary.
      5. Do you have any experience working with designs relying on data flow for control? (This would help determine how easily they could transition to programming in LabVIEW.)

      Management skills:
      1. What was the largest number of people you managed?
      2. What types of things do you do to encourage team building on the teams you manage?
      3. Can you give an example of a management problem you had to solve and how you solved it?
      4. Can you provide an example of how you mentor team members to develop their skill sets?

      Those are just a few of the types of questions I would ask. You can tailor them to your specific environment or job as needed.
  21. QUOTE (LVBeginner @ Feb 23 2009, 02:06 PM) In this case you will have to roll your own XML code in LabVIEW for creating/formatting the XML you want to send. It really isn't that difficult, since XML is fairly straightforward. With respect to the communication itself, the receiver of your data does not need to be written in LabVIEW. You do need to know the port your server will be listening on and any application-specific data or messaging that occurs between the server and its clients. If you do go this route you will be working with the TCP or UDP LabVIEW primitives, not the native LabVIEW data sockets used by shared variables.
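A minimal sketch of rolling your own XML and pushing it over TCP, with Python standing in for the LabVIEW TCP primitives. The host, port, and XML tag names are all made up for illustration.

```python
import socket
import xml.etree.ElementTree as ET

def build_message(name, value):
    """Format the data as XML by hand; this schema is invented."""
    root = ET.Element("measurement")
    ET.SubElement(root, "name").text = name
    ET.SubElement(root, "value").text = str(value)
    return ET.tostring(root, encoding="utf-8")

def send_xml(payload, host="192.168.0.5", port=6340):
    # You only need the server's address and port; the receiver
    # does not have to be written in LabVIEW.
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)

send_xml(build_message("temperature", 23.4))
```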
  22. QUOTE (Kubo @ Feb 23 2009, 08:57 AM) Again, I will reiterate that you should look at the producer/consumer model for your design. Move all event processing into its own loop and run it in parallel with the state machine. The state machine can keep track of what state you are in and what processing should or should not occur. The event handler simply catches the events and posts a message to the state machine queue. You could use dynamic events, which would allow you to specify when you want to listen for specific events. Your state machine can keep track of where it is in its processing and throw away events it no longer cares about. I don't believe there is a way to flush events that are already queued for the event structure.
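A text sketch of that producer/consumer shape, with Python threads standing in for the two LabVIEW loops. The event names and states are invented.

```python
import queue
import threading

events = queue.Queue()

def event_loop():
    """Producer: catches UI events and just posts them; no processing here."""
    for evt in ["start", "reading", "stale_click", "stop"]:  # simulated events
        events.put(evt)
    events.put(None)  # shutdown sentinel

def state_machine():
    """Consumer: tracks state and discards events it no longer cares about."""
    state = "idle"
    while True:
        evt = events.get()
        if evt is None:
            break
        if state == "idle" and evt == "start":
            state = "running"
        elif state == "running" and evt == "stop":
            state = "idle"
        # any other event is irrelevant in the current state: throw it away
        print(f"event={evt!r} -> state={state}")

threading.Thread(target=event_loop).start()
state_machine()
```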
  23. We still have a bunch of dqGOOP stuff but we use it mainly because these objects are used so prevalently in our code base. I would prefer to rewrite them in LVGOOP but it seems we never have the time. The old adage of "if it isn't broke, don't mess with it" applies. Anything new that we do is done in LVGOOP.
  24. QUOTE (Kubo @ Feb 20 2009, 03:20 PM) From what you describe, your event structures and state machine states are intertwined. Generally this is not a very good way to design your system. You are better off using a producer/consumer model with a single event structure that is always processing your events in one loop. A parallel loop would contain your state machine. When your events are triggered, you would queue actions to your state machine for processing. If you need to, you can also disable your front panel controls to avoid multiple events being triggered; this depends on how you want to handle your user interface. Try searching for an example of the producer/consumer design pattern. I believe one ships with LabVIEW. This should help with your application. BTW, the picture you posted is fairly useless, since the code is not visible and you are asking for help with your code architecture, not your front panel design.
  25. QUOTE (austin316 @ Feb 20 2009, 10:03 AM) I would look to see if there are some .NET or winsock DLL calls that you can make. I am not aware of any way to do it using LabVIEW primitives.