Everything posted by Kas

  1. Kas

    yEnc Decoding

    Well, this decoding section is already part of a bigger project (producer/consumer style). It belongs to a program that talks to an NNTP server from the client side over TCP/IP. The majority of the incoming data arrives in 15 MB chunks, but this varies and can go up to 300 MB per part. The yEnc decoding needs to happen once the whole part is downloaded, mainly because a yEnc-encoded part has a header and a footer. The downloaded part is written to disk at certain intervals in order to free up memory (particularly when a single part is large). I'm not sure it's a good idea to read 128 bytes at a time from disk until the whole 300 MB (the worst case, perhaps) is processed. So I thought I'd read the whole lot in one go, but then have a clever yEnc decoder that goes through the whole part as fast as possible.
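    Roughly, as a sketch in Python rather than LabView (not the actual code; the path and the decoder argument are placeholders), the "read the whole part in one go, then decode it in a single pass" option would look like this:

    ```python
    def decode_part(path: str, decoder) -> bytes:
        """Read the whole saved part from disk, then run one decode pass over it."""
        with open(path, "rb") as f:
            encoded = f.read()      # the whole 15-300 MB part in a single read
        return decoder(encoded)     # e.g. a yEnc decode pass, applied once
    ```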
  2. Kas

    yEnc Decoding

    Well, I added the ".." functionality and the decoder itself seems to be working now. As a result, I ended up implementing loops within loops, which costs me dearly in speed and probably memory as well. A 250 KB file takes about 1 second; I dare not think how long a 200-300 MB file would take. Attached is the latest 2011 version. Any improvements would go a long way. I left comments on each section (it's a pretty simple operation). Thanks, Kas. P.S. The "NOT WORKING" example now actually works as well. This can be confirmed by checking the header of the binary file in the string indicator: the size displayed in the indicator matches the decoded size of the array at the end.
  3. Kas

    yEnc Decoding

    Well, I've cleaned it up a little. The incoming stream of data is a binary string, but I can save the values just as they are without converting them to a string again. As for the definition of a "line" in this case, it is based on how the yEnc encoder works. Basically, it adds 42 to every byte, apart from what the spec considers "critical" characters: "=", "CR" and "LF". If one of these characters comes up during encoding, an escape character "=" is put in front and both 42 and 64 are added to that byte. When finished, the encoded binary consists of multiple lines of 128 bytes/characters each, apart from the last line of course. If you examine the encoded file attached, you'll see that when you go through each line in the binary string, its size is 128 bytes. As for the ".." feature, it's not part of the yEnc encode/decode mechanism. It's something that only happens when sending and retrieving data from NNTP servers, and in POP3 as well. This seems to be the way NNTP servers are set up. Apparently, when sending a message to the server over TCP/IP, the message has to be terminated by a line containing just a "." followed by CR LF. So, in order for the server to understand that a line beginning with "." does not represent the end of the message, the ".." comes in. To clarify, this ".." rule only applies if the ".." appears at the beginning of a line; it's perfectly OK (and must be left alone) if it appears anywhere else within the 128-byte line. Sorry to make this long, I just wanted to explain the situation a little. As I mentioned, attached is the cleaned-up version, but it still doesn't have the ".." handling included. Your solution, however, just looks for ANY ".." and removes one; trouble is, I need to look for ".." ONLY at the beginning of a line. One quick solution would be to cut the whole byte stream into chunks of 128 bytes, then index the first two bytes of each line and search for the ".." characters (where "." is decimal 46). I guess I'm just hoping there is a cleaner/better way of doing this that reduces RAM and CPU use and increases speed at the same time (maybe wishful thinking here) when dealing with large 200-300 MB files. Thanks Kas
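    As a sketch of the decode pass described above, in Python rather than LabView (my own illustration, not the attached VI): it assumes the "=ybegin"/"=yend" header and footer lines have already been stripped and that lines end in CR LF, and it removes a leading "." only at the start of a line.

    ```python
    def yenc_decode(encoded: bytes) -> bytes:
        """Undo yEnc on a body whose =ybegin/=yend lines have been removed."""
        out = bytearray()
        for line in encoded.split(b"\r\n"):
            if line.startswith(b".."):        # NNTP dot-stuffing: only at line start,
                line = line[1:]               # drop exactly one "."
            i = 0
            while i < len(line):
                b = line[i]
                if b == 0x3D:                 # "=" escape: 64 was added on top of the 42
                    i += 1
                    b = (line[i] - 64) & 0xFF
                out.append((b - 42) & 0xFF)   # undo the +42 from the encoder
                i += 1
        return bytes(out)
    ```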
  4. Kas

    yEnc Decoding

    Sorry, asbo. Attached is the 2011 version. (I changed the profile too.)
  5. Kas

    yEnc Decoding

    Hah... I think I just figured out the problem. Apparently, if a line starts with a "." (i.e. any of the 128-byte lines in the encoded binary), then the NNTP server, and apparently POP3 too, adds an extra "." to it. In other words, if a line starts with ".$£%^&", then when downloaded it becomes "..$£%^&". In that case I just delete one of the "." characters. This introduces a new wrinkle in the code. I don't really want to place each line (using a pick-line function) into an array and examine the first two characters for "..", mainly because I'll be dealing with large files. Is there a way of optimising the above code with this extra functionality (this may turn into a trade-off between processor speed and memory)? Thanks Kas
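    As a sketch of that rule in Python rather than LabView (my own illustration, not the attached code, and assuming CR LF line endings), the un-stuffing only ever touches a ".." at the very start of a line:

    ```python
    def unstuff_dots(raw: bytes) -> bytes:
        """Undo NNTP/POP3 dot-stuffing: collapse a leading ".." on each line."""
        lines = raw.split(b"\r\n")
        return b"\r\n".join(l[1:] if l.startswith(b"..") else l for l in lines)
    ```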
  6. I'm hoping to get a better answer than what the NI forum gave at: http://forums.ni.com/t5/LabVIEW/yEnc/td-p/2187886. ---- I'm trying to implement a yEnc decoder in LabView. The attached solution seems to work most of the time with small .txt files, but when I try to decode larger files, i.e. ones with a .rar extension etc. from usenet groups, the decoder doesn't quite work. It seems to add random extra bytes, and I don't know where they are coming from. I'm not 100% sure I'm decoding the binary file correctly, but yEnc encoding is pretty simple from what I've seen. I've been at this for 3-4 days now, but I still can't solve it. Attached are the yEnc decoder VIs and two files, one where it works and one where it doesn't. Any help appreciated. Thanks Kas ---
  7. Thanks Daklu. In the meantime, I'll continue to update the software in general and see how I can incorporate a friendlier multi-loop interface, namely the one you proposed, and there is another one that comes with the LabView 2012 version. I'll also try to minimize the cases inside the QSM with SubVIs as you've suggested (as carefully as I can). Regards Kas
  8. Which is also why in situations like these I have to add sequences/macros at the front of the queue rather than at the back. Kas
  9. Apologies for not replying earlier. I was a little tied up with something else. As for the question, the interrupts are not from the UI. They are instrument (Orbit FR) interrupts. I.e. if I set a scan from 30 degrees to 90 degrees, and I also set the interrupt to 1 degree, then the instrument will send an interrupt through the GPIB every 1 degree as the motor moves from 30 through to 90 degrees. The format of these interrupts is a number made of digits only. Out of all the replies I get from the instrument/controller, the interrupt is the only one I get as digits only; the others are letters and numbers combined, just letters, or binary (status responses). Which is why, in "Orbit: Send & Recieve", I check whether the reply contains only digits, and if so, I process the interrupt, and I also know that I should read the VNA results as well. This interrupt acts like a control to let me know when I should read from the VNA, and it is sent by the controller itself. During motion, I continuously interrogate the controller for position, speed, status, etc. If, however, the controller's increment was triggered (i.e. the interrupt) and the reply happened to be in digit-only format, I then read the VNA. However, after I process the interrupt, the reply that I SHOULD HAVE received instead of the interrupt is still waiting to be read from the controller's buffer, which is why, in "Orbit Param: Proc. incremet", I go back to "Orbit: Send & Recieve" to make sure that I read whatever else is left (i.e. position, velocity, etc.). The graph is also amended (attached). Basically, every time I read an interrupt, the sequence takes a detour (i.e. reads the VNA, updates the graph, etc.) and then comes back to doing what it was supposed to do before the interrupt was detected (i.e. it returns to the original sequence). As for the UI, the user no longer needs to do anything during the measurements. The measurements themselves can take hours, maybe even days for accurate readings, and only in emergency situations would the user have to press the stop button to halt operations. So, for normal operation, once the user presses the "start" button and gives a measurement name, the rest is pretty much automatic. As for the template that you've shown, I have a concern: even after the user has pressed the emergency stop, the sequences in the left "QSM functions" loop will still continue to be executed, since the QSM Functions queue and the emergency stop queue are both independent of each other. With JKI in the past, this was the way I implemented it. However, I now need to be able to do control, data acquisition and other UI responses at the same time, where one does not interfere with the other. Basically, the abort button is not the only thing I have to implement; however, in the attached project it is the only thing implemented so far. The analysis section that I have to implement will take a bit of time, and I wanted to make sure I had some sort of a template in place before I continue with the rest. Effectively, the user can go and do other things with the software WHILE the measurements are still going on. Hence the multi-loop QSMs idea. Flow Chart - Basic Idea.zip
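    As a rough sketch of that dispatch in Python rather than the actual LabView QSM (the controller/vna objects and their methods are made-up stand-ins for the GPIB sessions): a digit-only reply is treated as the position interrupt, the VNA is read, and then the pending "real" reply is read from the controller's buffer.

    ```python
    def handle_reply(controller, vna, results):
        reply = controller.read()              # reply to whatever was last sent
        if reply.strip().isdigit():            # digits only => position interrupt
            results.append(vna.read_trace())   # take the VNA reading at this angle
            reply = controller.read()          # the reply we "should have" received
        return reply                           # position, velocity, status, etc.
    ```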
  10. Daklu, if you decide not to go ahead with this, believe me, I understand. I'm putting myself in your shoes, and I'm in two minds myself as to whether I would have helped at this level of detail (especially after I looked at the flow chart). The flow chart is just "hairy". I've attached the "yEd" flow chart because as an image it's too big. The only thing I haven't implemented is the "STOP" UI button, which would just flush the queue and send a "Stand By" command to the Orbit instrument, no matter where the execution process is in the third loop. This acts as a "soft emergency" procedure if the user sees that the motion is becoming dangerous (i.e. cables stuck on the rotation table, etc.). This project could be implemented using the JKI template, but the "STOP" button would have to be monitored regularly, since there's no way of coming out of a macro sequence even if a User Event is triggered. Furthermore, this software will have other capabilities (not yet implemented), i.e. plot, compare, calculate and manipulate previously measured data from the database (I call it a database, but it's only going to be a common folder on the hard drive), and this is independent of whatever the measurement loop (3rd loop) is doing. Which is why I opted for a producer/consumer QSM-style template as the backbone for this project. Seeing how easily this can turn into a disaster (if it already isn't one), this type of template might not really be the best way forward. Attached is also the 2nd flow chart, where I tried to simply show the steps that the software should ACTUALLY go through. Since all of the communication protocols/settings and any other required processes are already done, it might be best to simply "abandon ship" and use a different programming style that could be extrapolated from the 2nd flow chart (using it as a guide). Every time I think of a different programming structure, I come back to QSMs and macros. Since I've used them for nearly everything until now, I just cannot think of any other way. It's as though it has a hold on me and I cannot let go. In terms of knowledge, I pretty much understand what most of the functions in LabView are, i.e. semaphores, calling VIs dynamically, the queue system, triggering events dynamically or via signalling (i.e. the Value Change (signal) property), sub panels, etc.; effectively the basic day-to-day operations that LabView has. I believe that so long as the code doesn't end up in OOP style, I should be able to follow. Thanks Kas P.S. Every time I load the "Main - GUI.vi" it keeps asking for "dotNET ChartXControl.xctl" (it used to be a dotNET graph), but I no longer use that control, and I cannot find it on the FP to delete. The program will still work if you simply ignore this.
  11. I'm hoping it's OK to bring this discussion back to basics for a bit. Based on the feedback provided in this thread, I have amended the initial template to use 3 loops instead. Furthermore, I've eliminated one set of enqueue/dequeue SubVIs by making the other set reentrant (JKI, I believe, does this for theirs). I am, however, finding it difficult to implement the other rules: 1. Don't execute macros within macros, i.e. where one state decides what set of macros to execute depending on the instrument reply or other states. As an example, for this project I send commands and receive replies from the instrument through one set of SubVIs located in "Orbit: Send & Recieve". Upon receive, however, I continuously check whether the instrument (Orbit FR controller) has triggered an interrupt; if that's the case, I then go and read the VNA reply, and if not, the motion/operation continues without reading anything from the VNA. Since there are various motion modes that the Orbit can perform, only if you set the controller to a certain motion mode, i.e. a sector scan, would I expect an interrupt to be triggered. If, however, the motion is just "Move to Position", i.e. send the motors to the zero position, then I know there is no interrupt from the Orbit. In the end, I use this "interrupt" behaviour of the Orbit controller to read from the VNA. Furthermore, I check whether the scan/motion has finished, and if so, I send a set of macros that saves the result or goes to idle at the end, etc. 2. Don't execute macros from within the same loop, but instead let the main producer loop decide what to do. Again, because of the example stated above, I'm kind of forced to break this rule. There are way too many macros going back and forth, and forcing the main producer loop to be the message sender becomes impractical. My understanding of the solution to the above: use SubVIs instead for a set of direct executions, i.e. instead of a set of macros where the execution pattern never changes. While this certainly eliminates/reduces a lot of the macros being executed, I cannot see that it would completely eliminate the problem stated in 1, since there are 2 or 3 critical points of the program that are user-independent and, based on the reply, decide what the program should do next, i.e. interrupt, scan finished, etc. Furthermore, there is some low-level code, i.e. "Orbit Mode: Stand By", that I execute through macros from various points in the program, i.e. before I prepare the Orbit to receive a set of "Load" commands, before a measurement starts, etc. I now understand the reasons and the pros/cons of QSMs, as well as the solutions and the ways that QSMs should be used in general terms. My problem is applying these guidelines to a more specific ordering in the QSMs. From what I can see, I cannot simply use a QSM in a safe manner without using SubVIs (with built-in queues that refer back to the main QSM when finished) for tasks such as "Start measurement", which in itself has a set of SubVIs that refer back to the main QSM to check whether the measurement is finished. Is there a safe way of implementing problem 1 above using QSMs? If so, would it be possible to see an example of how you would accomplish this? I see that Daklu refers to private and public messages. Can you explain a little what private and public messages are? Thank you Kas
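    As a rough sketch of rule 2 (the "let the producer decide" idea) in Python rather than LabView, with made-up queues and a made-up event-to-macro mapping: the consumer only reports what happened, and the producer maps that report to the next commands.

    ```python
    import queue

    to_consumer, to_producer = queue.Queue(), queue.Queue()

    def producer_step():
        event = to_producer.get()              # e.g. "interrupt", "scan finished"
        next_cmds = {"interrupt":     ["Read VNA", "Orbit: Send & Recieve"],
                     "scan finished": ["Save Result", "Idle"]}.get(event, [])
        for cmd in next_cmds:                  # the producer, not the consumer, picks the macros
            to_consumer.put(cmd)
    ```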
  12. Thanks. P.S. Your "Windows API" and "Transport" are very useful.
  13. ShaunR: In your latest example, you have a case structure named "Destroy" in your "Queue.vi". In there you flush the queue and then destroy it. Is this normal when working with queues? It's just that the majority of the examples I've seen with queues simply release the queue at the end. Kas
  14. Argghhh... I just confirmed that you can't use the DSC security module like I wanted. Apparently you have to go through "Domain Account Manager" for adding and/or editing user accounts. Hooo.. back through the long road I guess. Thanks Kas
  15. Since this might have a short answer, I thought it might be best to just continue here rather than open a new thread. Based on one of my previous posts about the user login: http://lavag.org/top...dpost__p__98442 Do you think it might be best to use the power of the DSC module for implementing a user-login system, or should I start looking into creating my own login system? One thing I can see while reading about the DSC module is that I cannot add/modify user details at run time through user VIs without going through the "Domain Account Manager" (Tools>>Security). I thought the DSC module could help greatly in reducing the workload of creating a new login system from scratch. If, however, what I've just mentioned is true, has anyone managed to find a workaround? I haven't personally spent too much time on the DSC module. I'm just trying to determine if this is something I can start using; if so, I can concentrate on learning more about the DSC and building the system around it. If not, I'll have to spend my effort creating a new login system that would probably end up doing the same thing as the DSC security module, but with more flexibility. Regards Kas
  16. Hah... I had no idea. It never even came to mind that such things could cause race conditions. Since this is the case, it looks like my whole QSM template structure is a ticking time bomb. From this aspect, the amount of macros involved is nothing less than spaghetti code. Amazing: the more I learn, the dumber I feel. Thank you, Daklu, for taking the time; you've certainly put things into perspective. By the way, what software did you use for that flow diagram? It looks different from typical flow diagrams. As for the projects themselves, they're not all that important; they've just mildly asked me if I can do something to bring things into the modern age. The original HP Basic code they have still works for them. This was just me trying to mix things up a little by stepping out of my comfort zone a bit (the JKI template). Now I see what you guys meant. Daklu hit the nail on the head with his last post. I guess before I start LabView, I need to learn the lingo first. Hehe... it's what you get when dealing with a beginner. It will take me some time to TRULY digest everything you have all mentioned here, but this thread has given me an amazing start. Finally, to Daklu, ShaunR, and everyone participating in this thread: :worshippy: :worshippy: :worshippy: :worshippy: In words: a BIG thank you for your exemplary responses and patience in dealing with us (newbies, of course). Regards Kas
  17. If the above example is taken strictly as you explained it, then yes, you are correct. However, if, let's say, the user changes the target setpoint 2 or 3 times in a row, then "Dwell.vi" would be executed repeatedly as well. By placing the functions of the three VIs (or the VIs themselves) in separate cases, using the "QSM THINGY", I would have thought you have more control over what is executed when. In this case you have control over the "Dwell" case, where you wouldn't execute it until the user is finished with the target setpoint (this may not be the best example of what I'm thinking). My experience with LabView comes from hardware control and operations. I tend to find it easier if I leave the basic operations at the top level and refer to them through macros along the way (as shown in my previous attachment). By comparison, I'm in no way good at programming; slowly, however, I'm trying to get there. Heh... nice.
  18. Hi Daklu. In your latest example, while your message handler implementation is easier to understand and cleaner, I somehow feel its use is limited. If, for instance, more than one mechanism/object/device needs to read the reactor temperature ("ReadReactorTemp"), with macros you can just fire each device's or mechanism's own initial sequence of executions, where one of those sequences would include "ReadReactorTemp", and then go on to do other things. This cannot be done as easily using your "atomic message handler", since by using SubVIs you're mixing more functions into a single case. Whereas if each case has a single function, you can then call them, or mix and match them however you want, with macros. In my latest attachment you can see that I'm polling the device status quite frequently. If I were to put that in a SubVI, I would have to place that SubVI all over the place. With macros, however, I just call "Orbit Status: Get" whenever needed and the macro takes care of everything else. Sorry, maybe this is not what you guys were discussing, but this is what sprang to my mind when I first compared the examples. Regards Kas
  19. Further to my latest post, attached is the ACTUAL project that I have just done using the latest template. In the ZIP file you should find "Track History" and "User ID" folders that I haven't yet implemented in the code. The "Track History" was initially made by a forum member called WMassey in the post below: http://lavag.org/top...__pm341__st__40 (zero-tolerance was me back then). This was quite far back, while I was a student just learning LabView; among others, he was the first one who guided me through the initial stages of learning. Now, if I implement the "Track History" (which is designed to record anything that is sent to or read from VISA), I fear that the application is going to accumulate all the information in memory and slow the whole application down. Is there a way of implementing this without impeding the run speed of the main GUI? I thought of saving the info every, let's say, 100 lines to a temp .txt file and clearing the buffer, but again, I'm not sure this is the best way. Also, I wanted to see if anyone knows of example code (or a VI) that would monitor the main GUI in a similar way to how "Track History" does. Basically, once the software is deployed and the users start using it, if any errors are generated, I can easily see which case structure/function generated the error, what the main GUI was doing beforehand (i.e. a record of the previously executed states), etc. This is probably where proper error handling and logging comes into play. I know it's not very nice to ask for ready-made stuff, but I thought this might save me some time in creating one, and that way I can concentrate on the other projects I'm developing. There is quite a bit of machinery in our antenna department whose control software they want to update; we're still using HP Basic. As for the "User ID", it currently just serves as a filtering mechanism, making sure that certain people/groups have access only to certain control software. The admin, of course, is the main lab manager (not me, by the way). Since I'm doing all this, I thought it's time to update the "User ID" section as well and make it more advanced than it already is. Again, it's not about absolute secrecy or hiding the password from hackers/crackers, etc. For this, I was thinking of starting to use the "Database VIs" for recording and retrieving information rather than a simple "*.txt" file. Currently, scalability is an issue. For now, there are only two groups, i.e. Admin and Users. I want to make this more dynamic, where an admin can add/remove groups in the USER ID (i.e. the admin can further add temporary users whose login runs out in, let's say, 1 week, add department groups where any user registered to that department would have access, etc.). Anyway, this is my end game, but any ideas towards it would be great. I am a PhD research assistant in physics and not really a full-time LabView user (even though I love it), which is why, even though I'm involved in these freebie projects for the university, my main time is spent on research. Finally, for me, this thread is "Christmas come early"; having high-calibre programmers discussing their issues and voicing their thoughts on something that I'm directly working on is amazing.
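    As a sketch of the buffering idea in Python rather than LabView (the file name and class are made up for illustration): keep the last N log lines in memory, append them to the temp file once the buffer fills, then clear the buffer so memory use stays flat.

    ```python
    class TrackHistory:
        def __init__(self, path="track_history.txt", flush_every=100):
            self.path, self.flush_every = path, flush_every
            self.buffer = []

        def log(self, line: str):
            self.buffer.append(line)            # record one VISA read/write
            if len(self.buffer) >= self.flush_every:
                self.flush()

        def flush(self):
            with open(self.path, "a") as f:     # append the batch to the temp file
                f.write("\n".join(self.buffer) + "\n")
            self.buffer.clear()                 # release the memory held by the log
    ```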
  20. I feel bad breaking the momentum here, but I thought I'd use this opportunity to learn as much as I can. Attached is the revised template, but I couldn't use just one set of enqueue and dequeue SubVIs as Daklu suggested; it kept throwing an error, so I went back to using two sets. Furthermore, from what I can see, if a queued producer/consumer structure is used, a separate queue needs to be implemented for every consumer loop. If only one queue is used, how would you control which consumer loop should dequeue the message that the producer sent? This is, of course, provided that more than two consumer/message-handler loops are used. Thanks Kas QSM Template.zip
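    As a sketch of the "one queue per consumer loop" arrangement in Python rather than LabView (the queue and message names are made up): the producer routes a message simply by choosing which queue to enqueue it on, so each consumer only ever dequeues its own work.

    ```python
    import queue, threading

    ui_q, hw_q = queue.Queue(), queue.Queue()    # one queue per consumer loop

    def consumer(q, name):
        while True:
            msg = q.get()                        # each loop dequeues only its own queue
            if msg == "exit":
                break
            print(f"{name} handling: {msg}")

    threading.Thread(target=consumer, args=(ui_q, "UI loop")).start()
    threading.Thread(target=consumer, args=(hw_q, "HW loop")).start()

    ui_q.put("Update Graph")                     # the producer routes by picking a queue
    hw_q.put("Orbit Status: Get")
    ui_q.put("exit"); hw_q.put("exit")
    ```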
  21. I can attest to that. A 3-machine control project ended up being so big that it took me a week to decipher the flow of execution (i.e. which case is going where). It can look much cleaner, however, if the embedded "separator no-op" frames are used properly. The JKI SM does tend to be a pretty generalised solution for machine control and data acquisition applications (for small to "maybe" medium-sized projects), and that's as far as it goes, I think. On another note, Daklu, I see in your second image that you've wired the 1 ms timeout to the event structure. Wouldn't that mean the event structure executes every 1 ms? From my experience, the whole point of using an event structure in an SM, or even in multi-loop programs, is to keep the PC idle (so as not to overwhelm it) and act purely as the main controller of the other loops. If there is nothing to be done, the event structure halts and waits for an event (i.e. a button press). Kas
  22. Haha... I had no idea people were taking sides on this. I feel like part of a rebellion. Seriously though, thanks a bunch. My gratitude comes from the UK. Regards Kas
  23. Daklu, this is exactly what I was looking for, and then some. As you suggested, I'm going to wait and go through some projects with multi-loop programming before I dive into LVOOP (I keep getting scared of it; people on the forum make it sound a little hard, todd's posts included, but it's nevertheless a necessary step). One thing I'm not sure of: when you said "Put a timeout terminal on your dequeue vi", would you be able to post an image of how you would do this? Again, to everyone taking their time on this, "DAKLU" especially, a BIG thank you. P.S. "Learning LabFu Daily", nice.
  24. Yikes, this seems to be an LVOOP design. I'm not into that yet. That should probably be my next bridge to cross.
  25. Thanks to both of you for your replies. 1) For passing data between loops, I found it difficult to implement the normal way (a cluster of a string (for the case command/message) and a variant (for the data)), because I have multiline strings that turn into queued commands, i.e. each line then represents a queued message (shown in the "Macro:initialise" case), so I thought I'd use functional globals instead. 2) Both loops have separate queue names, and I wire both queues to both loops when I want to refer to both of them, i.e. from producer to consumer on a button press, or from consumer to producer based on hardware replies (depending on the hardware reply, I might have to enable/disable blocks of clusters or even a whole wizard, which I implemented on the FP). Even if both case structures/message handlers might have cases with the same names, because I refer to them through separate queues, I thought it would work. 3) I have separate sets of enqueue and dequeue SubVIs because I was worried about race conditions between the two loops. There is a slight difference between the two dequeues, however. In the first loop I have an event structure, and the dequeue for that loop never blocks (i.e. never waits for an element to be present); as soon as the queue is empty, I feed in an empty string, forcing the software to sit in the "Idle" case (i.e. where the event structure is). The second dequeue, however, doesn't have this, so that loop blocks on the dequeue. 4) I didn't think it would be a bad idea to have 2 event processors in the first loop. I thought I'd use the first loop to handle the FP buttons, menus, etc., and use the second loop as the main workhorse for machine processing in parallel. I thought that introducing a third loop for FP activity might be more work for the PC. The above answers show my current state of thinking about various programming structures. I guess my lack of knowledge of "good and bad" programming techniques is starting to show. So far, I've only used state machines (JKI mainly), but as with everything, the time has come for me to outgrow this (mainly because I have no choice) and move on to something better. drjdpowell, you mentioned "Daklu's Lapdog package"; do you have a link that I can have a look at? Maybe I can incorporate that here. Thanks to both of you; for me, this is the easiest way to learn things. Kas
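    As a sketch of point 1 in Python rather than LabView (the macro text is only an example): a multi-line "macro" string is split into lines, and each line is enqueued as a (command, data) pair, which is roughly what the string + variant cluster carries.

    ```python
    import queue

    q = queue.Queue()
    macro = "Orbit Mode: Stand By\nOrbit Status: Get\nIdle"   # example macro text

    for line in macro.splitlines():
        q.put((line, None))              # (command, data); no data needed here

    while not q.empty():
        command, data = q.get()
        print("dispatching:", command)
    ```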