Posts posted by robijn

  1. QUOTE(jpdrolet @ May 8 2007, 06:45 PM)

    The problem is not with the decimal specifier but with the time format itself. The format string %<%H:%M:%S%3u>T expects the timestring "15:55:29000". The correct format string should be %<%H%M%S.%3u>T with the decimal separator in the time format.

    I don't think so, because the dot is already formatted/scanned by the %u thingy.

    I think the internal time format/scan functions don't understand/use the %.; that was placed in the format string. So it's a bug if you ask me.

    Joris
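
    PS: purely for comparison (Python, not LabVIEW, so it says nothing about how the %<>T codes should behave): in a text language the fractional-seconds code only consumes digits, so the separator has to be written into the format string explicitly.

        from datetime import datetime

        # In Python's strptime the %f code consumes only digits, so the decimal
        # separator has to appear literally in the format string.
        t1 = datetime.strptime("15:55:29.123", "%H:%M:%S.%f")
        t2 = datetime.strptime("155529.123", "%H%M%S.%f")
        print(t1.microsecond, t2.microsecond)   # 123000 123000
        # "%H%M%S%f" on "155529.123" raises ValueError: %f cannot match the ".".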

  2. Hi,

    Very interesting.

    I miss one important goal of object orientation: the match between the real world and the information.

    OO does not only facilitate this by being able to make a hierarchy of characteristics ("classes"), but also by guaranteeing some things about the life of an object (constructor, destructor, consistency of data). The latter is missing in your article.

    I think it is important that the relation between the real world and its abstraction in the form of objects is correct and protected, to prevent getting "parallel universes" as we have discussed here earlier on LAVA. This means an object X needs to have defined actions performed on it in a given sequence, so that it always has a consistent state that also matches the state of the real world. The programmer of the class should be in full control of everything that happens with objects instantiated from the class. The only way to achieve that is by having some kind of referencing system. (I haven't heard of any other solution.)

    Seeing your list of references, you must have considered this, so my question is: why do you not consider this a problem?

    Joris
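
    PS: to make the lifecycle point concrete, a minimal sketch (in Python rather than LabVIEW, with made-up names): the constructor establishes a valid state, every change goes through a method of the class, and there is a defined end of life. Callers only ever hold a reference, so the author of the class stays in control.

        class Valve:
            """Hypothetical wrapper around a real-world valve."""

            def __init__(self, channel):
                # Constructor: object and (imaginary) hardware start in a known state.
                self._channel = channel
                self._open = False
                self._alive = True

            def open(self):
                if not self._alive:
                    raise RuntimeError("valve object has already been destroyed")
                # ... drive the hardware here, then record the new state ...
                self._open = True

            def destroy(self):
                # Defined end of life: hardware left safe, object unusable afterwards.
                self._open = False
                self._alive = False

        v = Valve("dev1/port0")   # callers share a reference to this one object
        v.open()
        v.destroy()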

  3. QUOTE(Michael_Aivaliotis @ May 4 2007, 12:13 PM)

    That's right, I think I overreacted a bit here.

    QUOTE(Michael_Aivaliotis @ May 4 2007, 12:13 PM)

    What do you consider noise? Questions like: "I'm getting error code 15 when running this DLL"? People come to the forums to ask questions and get problems solved. Sometimes, there are exploratory posts that lead into interesting discussions but they can't all be like that.

    That's a bit of a problem with a moderator system. There's no democracy about what's noise or not, so because of this absoluteness it is only useful for very clear abusive cases. I would like some automatic way to sort posts by how interesting they are expected to be. Some proposals have been made already. I think it's worth another topic.

    QUOTE(Michael_Aivaliotis @ May 4 2007, 12:13 PM)

    . This will definitely clean up the forums.

    Hehe, I've got no CLA (yet). But then I have to raise the question of whether my boss should be willing to pay for LAVA...

    QUOTE(Michael_Aivaliotis @ May 4 2007, 12:13 PM)

    I think it's always good to ask yourself if you are at the right place...

    QUOTE(Michael_Aivaliotis @ May 4 2007, 12:13 PM)

    The only practical solution at the moment is to let the natural order of things dictate as it always has. We all need to roll up our sleeves, come out of the shadows, and put in the effort to get involved in the discussions. The more "advanced" noise we create the better. :thumbup:

    Sounds perfect to me!

    Joris

  4. QUOTE(Michael_Aivaliotis @ May 3 2007, 01:37 AM)

    You can't be serious here. This is exactly what caused their response.

    QUOTE(Michael_Aivaliotis @ May 3 2007, 01:37 AM)

    Other than that, ALL other questions are welcome whether they be simple or complex. I am the first to post stupid questions here on LAVA. If we don't then NI will never fix the problems that plague new users of a feature.

    We all ask a stupid question sometimes; it is easily recognized and that's not the point. If you acknowledge that NI people are reading the forum, you should also understand that they will do so less if there is less interesting information for them. LAVA is more interesting than NI's forum because of its level and its high information density.

    QUOTE(Michael_Aivaliotis @ May 3 2007, 01:37 AM)

    Then why is the forum called LAVA? Who is at the wrong location and could have known? Me, as a LabVIEW Advanced (V.A.) user?

    QUOTE(Michael_Aivaliotis @ May 3 2007, 01:37 AM)

    In order to put a closure to this discussion, I am implementing the following policies/procedures:

    Of course you're right about the options for informing a moderator of an abuse, but I think the tone of your response is not very fair towards the "good" users. The points are all aimed at the "good" users. Please acknowledge that there have not been any excesses yet. These good users apparently were able to contain the problems. Why do we need the absolute moderator superpower system when the normal users can apparently still handle it? LAVA is still small enough that we don't need three military ranking layers with their bureaucracy. The setup "for real abuses there's always the moderator" gives me a much safer feeling.

    I hope LAVA will continue to serve the purpose that it was set up for. If too much noise is present, it will no longer do so for me and I will use it less. If more users do this, there will be a gradual process of decay over the coming years.

    Joris

  5. QUOTE(tcplomp @ Apr 30 2007, 08:40 PM)

    I've used this advice (and that is all it is) with the accompanying message that the user base of LAVA is much smaller than on NI forums (look at the number of people online). That NI forums has people paid to answer and most likely someone of their school has asked the Q a year/trimester before them.

    I agree with you, Ton. I think LAVA is meant to be smaller, given the topics. We should not try to serve every individual with a LV question; in that case we might as well have merged this forum with NI's. It sounds really nice to help everyone, but we'd be biting our own tail.

    I have not seen a single incident where we were "unfairly evil", so I don't think we have to worry about forum decency yet. I think you will sometimes need to explain again that "things don't work like that", because some people just don't (want to) understand.

    Joris

  6. QUOTE(Guedin @ Apr 26 2007, 04:53 PM)

    The company wants me to create a program in LabVIEW which will be launched by pressing a pedal. This pedal is connected to a PC game port. So from the game port, I only use the button inputs. The pedal will be recognized as button 1 of my joystick. I don't care about the analog part.

    We are not able to give you a LabVIEW training course, but we can give you hints.

    In this case the hint should be to go for the LabVIEW joystick functions. On the Functions palette, click Search and type "joystick"; you will then find the joystick functions. That's a much better way to access the joystick. It will also work for, for example, USB joysticks (the port I/O will not).

    Joris
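
    PS: the LabVIEW joystick VIs are graphical, so as a rough analogue in a text language (Python with the pygame library, only an illustration, not part of the LabVIEW advice) the idea of "initialize the joystick, then poll button 1" looks like this:

        import pygame

        pygame.init()
        pygame.joystick.init()

        if pygame.joystick.get_count() == 0:
            raise SystemExit("no joystick/game port device found")

        pedal = pygame.joystick.Joystick(0)   # the pedal shows up as the first joystick
        pedal.init()

        while True:
            pygame.event.pump()               # let the driver update the button states
            if pedal.get_button(0):           # button 1 has index 0
                print("pedal pressed, starting the program...")
                break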

  7. QUOTE(Jim Kring @ Apr 13 2007, 08:37 PM)

    Well, these kinds of statements have been made before about other things. Some of them turned out to be true. For example, the mobile phone changed the way we manage our time a lot. We don't need to find a phone somewhere (and maybe pay actual $) to be able to change an appointment or whatever. With the Web 3.0 idea, in which your apps are located on a server and you only have the user interface locally, a lot of things will become different. Your word processor would not run on your PC anymore. You would not need to install it and you would not need to update it. You could pay for CPU use. These kinds of services will be updated without the user noticing.

    But when it comes to data acquisition and specialized software, we will need the software with the hardware. Also, we want full control over the software version that is used. Unlike with the centralized services, there are not many users of the program, but only one or a few. Apart from that, it will take ages before LabVIEW can be run somewhere else on the Net while your data is being acquired locally, and only at that point could you speak of an actual service. I would not worry too much about LV being 'virtualized'.

    I hope NI does not make the mistake of thinking that users want automatic updates in LabVIEW, because as programmers we know how difficult it is to fix a bug without creating a new one. LV should be stable. We need reliable scientific data, and that can only be generated with a reliable tool, not an unverifiable virtual "something".

    Joris

  8. Hi,

    Looks very promising.

    Remember that some people cannot see colour, or cannot differentiate between red and green. It's best not to rely on colour alone.

    Polymorphic idea: A shape a bit like a jigsaw puzzle piece with different contours on each side: flat, circular, sawtooth...

    Type/Class: A circle with a big dot in the center.

    Attribute(s): A label connected to a type symbol.

    Joris

  9. QUOTE(AnalogKid2DigitalMan @ Mar 19 2007, 04:22 PM)

    Hey I missed that one... The daily NASA image is my starting page at work nowadays...

    It's a pity that in the Lost in Space movie they had to remove the colour dimension to be able to view the 5th dimension on our 2D (+time) screen. Hmm, the beer-men were also black and white... What's the pattern here?

    Joris

  10. Lately I was talking with a colleague about placing controls in a table. He showed me an application written in .NET or so which had a very nice table with all kinds of controls in it. The customer had asked him to create a similar application in LabVIEW.

    The usual trick with these kinds of things is to take a classic array, "paint" it transparent, add a classic cluster and "paint" that one transparent as well. When controls are placed in the invisible cluster, resizing the array results in a nice list of these controls, without any array or cluster borders visible. In LV8 the scrollbar on the array makes things even nicer!

    Unfortunately, a gap of some 15 pixels is always present between the array elements. That is an awful lot of space. The front panel starts to look as if someone did not have the time to complete his job, and up to half the screen space can be wasted. So my colleague was not very happy that he could not make it as nice as the .NET app, not even close.

    Now if only we could resize the borders of the array and cluster to ZERO thickness, then we would not even have to make them transparent...

    Joris

  11. QUOTE(Tomi Maila @ Mar 1 2007, 06:50 PM)

    From Jim's answer I guess that the only possibility is to force some of the VI front panels to stay open long enough so that the garbage collector has done its job. Keeping a front panel open keeps the VI in memory. But it's not nice to have a front panel open for functionality that should be invisible to the user, especially in a built application. If anybody comes up with a better idea, I'd really appreciate it.

    Open a VI front panel from your library and set its state to Hidden. It will stay there although it is invisible, and I think you can do with that what you want. It's a very good idea, I have to say. I think I know what you're aiming at.

    I don't really like garbage collection as in scavenging. I like "right now" garbage collection better, as it fits real-time programming better. So when you don't use data or an object anymore, it is destroyed RIGHT NOW. Then you know where extra time may be needed (depending on the size of the object/stored objects/entire tree) and you can make arrangements in your program for that extra time.

    Joris
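
    PS: a sketch of the difference (Python, not LabVIEW; the class is made up): with deterministic cleanup the teardown cost lands exactly where you put it, instead of whenever a scavenging collector gets around to it.

        class Measurement:
            """Hypothetical resource with an expensive teardown."""

            def __enter__(self):
                # ... open a hardware session, allocate buffers ...
                return self

            def __exit__(self, exc_type, exc, tb):
                # Deterministic: teardown runs RIGHT NOW, when the block ends,
                # at a point the programmer chose and can budget time for.
                # ... flush buffers, close the session ...
                return False

        with Measurement() as m:
            pass   # use m; cleanup happens immediately after this block
        # A scavenging collector would release the same resources "eventually",
        # at a moment the runtime picks.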

  12. QUOTE(Eugen Graf @ Feb 22 2007, 05:30 PM)

    I don't understand you. I was not talking about LV8-specific behaviour; the platform-independent flattening has been there since at least version 5.

    QUOTE(Eugen Graf @ Feb 22 2007, 05:30 PM)

    The problem is, I have a lot of big clusters with different data, so I don't want to handle every element separately. I want only one VI that converts every data type.

    Well, you can connect the various datatypes to the flatten function, but I don't understand why you want to do flattening AND big/little endian reordering. Just flatten and the resulting string can be unflattened on any platform.

    You talked about a subVI, maybe you can create one that works with any datatype by using a variant as input.

    Joris

  13. QUOTE(Eugen Graf @ Feb 22 2007, 03:28 PM)

    Hello, my question is:

    how can I realize Flatten To String with little-endian byte order for 7.x LabVIEW versions? I tried the Swap Bytes and Type Cast VIs, but that only fits integer data types and not floating point numbers.

    Eugen

    Hi,

    Flatten To String and friends are independent of machine type, so you can use the same data on both Macs and PCs. Flatten/Unflatten does the big/little-endian rearranging for you.

    Joris
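
    PS: to illustrate the point outside LabVIEW (a Python sketch using the standard struct module, only as an analogue): the byte order is chosen at the moment the float is serialized, which is why a flat string written in one fixed byte order can be read back on any platform.

        import struct

        value = 3.14159

        big = struct.pack(">d", value)      # big-endian, the order flattened LabVIEW data uses
        little = struct.pack("<d", value)   # little-endian

        print(big.hex(), little.hex())
        print(little == big[::-1])              # True: for a single scalar it is a byte reversal
        print(struct.unpack("<d", little)[0])   # 3.14159 again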

  14. It's a very telling story, AQ. I am still always very glad that I can so clearly see (and test) what I am doing with LabVIEW.

    QUOTE(Aristos Queue @ Feb 15 2007, 08:18 AM)

    I have never met the software engineers that you're working with. But my guess is that they're no more or less "practical solution oriented" than hardware engineers. But they may well be judging "practical" on a whole different scale -- is it practical to have an ad hoc system that keeps growing over time and has no master architecture plan?

    If you need to split up a big hardware project, you have to create clear boundaries. You make an "agreement" with the groups on where the boundaries between the parts should be and how they should be connected. That also goes for software. It often happens that these boundaries cost a lot of time, because the agreement was not understood in the same way on both sides. However, these problems are even greater if you need to cross the boundary between disciplines.

    In my experience, when a system is multi-discipline, the time required to bridge these distances can (at least) be tripled. The more boundaries (software and electronics and mechanics and some physical process), the harder it becomes and the longer it takes. I think the reason for this is simply that the engineers in the separate fields have trouble understanding the problems of the other field, so interaction, and therefore also progress, goes at a very slow rate.

    Joris

  15. One more addition. As the proposed modern error cluster could hold multiple errors simultaneously, the catch nodes need to be able to catch multiple errors if there is more than one error of the same type. Perhaps the catch function could simply return an array of objects of a user-specified type instead of a single object.

    Hmm, I'm not a fan of having yet another structure. That's extending the basics of the language, and that should not be done without really deep thought. Before you know it, the core of the LV language is obfuscated by all kinds of extras that could have been implemented using already existing mechanisms. An example of this is the waveform type. The waveform has been created as a new type of object, while functionally it could internally have been a cluster. The fancy waveform nodes could still have been there, but NI could have used clusters internally. Using existing things (e.g. datatypes) can prevent a lot of the trouble you can get when introducing new things.

    I've been working on collecting multiple errors in an XML way, including extracting them again into an array of error clusters (a rough sketch of the idea is in the PS at the end of this post). That's no problem with the current system. I would appreciate it if NI would do more of these things for us.

    I've written my own error catcher. It is attached here.

    It is perhaps good for analysis to realize that there are two kinds of errors:

    1. errors from an underlaying system, like a library or system call;

    2. errors that you create in the application itself (excluding self-written libraries).

    An error system should be able to facilitate and transport both types of errors.

    For the latter you usually don't need to catch errors; functions that are called usually either work or fail, and the distinction is usually not important for the program. Textual info is needed for the user. For the former you need some way to distinguish errors in order to be able to do different things when they occur. Right now that can perfectly well be done with the error code.

    I don't mean to say the error system should not be extended, but it should stay as transparent as it is now. Maybe the main problem with an error object right now is that LV objects are not very transparent (they cannot be shown on the FP, for example). I think that should be improved first.

    BTW, error stuff should not be in the Time and Dialog palette, but in the Application Control palette! Because that's what it does: determine program flow.

    Joris

    Download File:post-1555-1171018865.vi
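
    PS: the attached VI is the actual implementation; as a rough sketch of the same idea (Python, with made-up function names), collecting errors into XML and extracting them again as an array of simple (status, code, source) clusters could look like this:

        import xml.etree.ElementTree as ET

        def add_error(collected, code, source):
            """Append one error record to an XML collection string."""
            root = ET.fromstring(collected) if collected else ET.Element("errors")
            err = ET.SubElement(root, "error")
            ET.SubElement(err, "code").text = str(code)
            ET.SubElement(err, "source").text = source
            return ET.tostring(root, encoding="unicode")

        def extract_errors(collected):
            """Return the collection as an array of (status, code, source) clusters."""
            if not collected:
                return []
            root = ET.fromstring(collected)
            return [(True, int(e.findtext("code")), e.findtext("source") or "")
                    for e in root.findall("error")]

        collected = add_error("", 7, "Open File in main.vi")
        collected = add_error(collected, 15, "Call Library Function Node")
        print(extract_errors(collected))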

  16. Now that's a coding challenge right there! Create a program that reads a subVI, understands what it does, and then detect other VIs which do similar functions! whew... :blink:

    Sounds good! We could give a collection of VIs to several people and let them judge the style quality factor. Then we feed the same VIs to the candidate programs and let them determine their style quality factor. The best approximation wins.

    I'm a bit afraid of the sentence "understands what it does", though. I don't have good experiences with computers saying they understand me.

    Joris
