Tomi Maila Posted December 4, 2013

When I originally selected the name for my blog ExpressionFlow many years ago, what I had in mind was functional dataflow programming and the ability to pass any subdiagram (expression) around as a first-class citizen of the language. Functional programming languages (Haskell, Clojure, Lisp) have a lot in common with the dataflow world. In principle, calling a function in a functional language is identical to calling a node in a visual language. Passing data between functions happens in an immutable way in functional languages, similar to the concept of data flowing between nodes in a flow-based programming language. Monads determine the execution order of functions with side effects, the same way flow diagrams determine the execution order of nodes in LabVIEW.

Functional programming is often seen as an alternative to object-oriented programming. One can solve similar problems of dynamic dispatching either by passing functions around or by passing objects with methods around. There are of course multi-paradigm languages supporting both objects and functions, like Clojure and Scala.

So what would LabVIEW look like if it supported the concepts of the functional programming paradigm? I posted an article on this on ExpressionFlow and intend to study the concept further there in future posts. Would you use functional LabVIEW if such a language existed?

My post on how functional programming could appear in a visual dataflow language: Synchronous Functional Dataflow Programming & Closures of Named Functions

P.S. See Joel's quick intro to functional programming if you are unfamiliar with the concept of functional programming.
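To make the analogy concrete, here is a rough Haskell sketch of both ideas (purely illustrative; the function names are invented and this is not taken from the linked article): a function passed around as an ordinary value, and the IO monad imposing an execution order on side effects much as dataflow dependencies order nodes in LabVIEW.

```haskell
import Data.Char (toUpper)

-- A function is an ordinary value: 'applyNode' receives a
-- "subdiagram" (a String -> String transformation) and wires
-- its input through it.
applyNode :: (String -> String) -> String -> String
applyNode node input = node input

main :: IO ()
main = do
  -- The IO monad sequences these side-effecting calls, much as
  -- dataflow dependencies order node execution in LabVIEW.
  putStrLn (applyNode (map toUpper) "hello")
  putStrLn (applyNode reverse "hello")
```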
viSci Posted December 4, 2013

Interesting... The closest I have come to functional programming is with Lua. I am not even sure it qualifies, but I have taken advantage of Lua's ability to assert a 'function string' that can be composed and executed at runtime. BTW, I really enjoyed the Spolsky article and am looking forward to your next post!
Aristos Queue Posted January 3, 2014

Tomi: The link is returning an error: "Error establishing a database connection".
RnDMonkey Posted January 16, 2014

Website is still down for me, as Stephen indicated.
ned Posted October 13, 2014

Reviving an old topic here because I've been playing with F# a bit at work, and it's made me wonder: what if LabVIEW had a type-inference system similar to F#'s (or ML's, or some other functional language's)? Meaning that the compiler can deduce data types at edit/compile time, but it deduces a type only to whatever level of specificity is actually required. For example, in a sort function, as long as there is some way to compare two items of the same type, the actual type doesn't matter. This lets you write generic functions and still get the benefits of strict typing and compile-time type correctness. It might be an alternate approach to OOP interfaces for LabVIEW.

Of course it would change the look of the language: wires could no longer be colored based on type, since many functions would accept multiple types. I realize it's unlikely to happen in LabVIEW any time soon. Anyone else played with this aspect of functional languages? Comments or thoughts as to whether it could be done in LabVIEW?
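A sketch of that sort example in Haskell, whose ML-style inference is close to F#'s (the names here are invented, not from any of the languages discussed):

```haskell
import Data.List (sortBy)

-- No type signature is given: the compiler infers
--   mySort :: Ord a => [a] -> [a]
-- i.e. "any list whose elements can be compared", and nothing
-- more specific than that.
mySort xs = sortBy compare xs

main :: IO ()
main = do
  print (mySort [3, 1, 2 :: Int])           -- [1,2,3]
  print (mySort ["pear", "apple", "fig"])   -- ["apple","fig","pear"]
```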
ShaunR Posted October 14, 2014

I hate strict typing. It prevents so much generic code. I like the SQLite method of dynamic typing, where there is an "affinity" but you can read and write as anything. I also like PHP's dynamic typing, though that is a bit looser than SQLite's and therefore a bit more prone to type-cast issues, which are still few and far between. That is why you sometimes see things like value+0.0, to make sure the value is stored in the variable as a double, say.

Generally, though, I have spent a lot of time writing code to get around strict typing. A lot of my code would be cleaner and more generic if it didn't have to be riddled with Variant to Data with hard-coded clusters and conversion to every datatype under the sun. You wouldn't need a polymorphic VI for every datatype, or a humongous case statement for the same. That's why I choose strings, which are the next best thing, with variants coming in third.

<rant> I mean, we have a primitive for string to int, one for string to double, another for exponential (and again, all the same in reverse). Really? Why can't we connect a string straight to an integer indicator and override the auto-guessed type if we need to? </rant>

But yes, I think it can be done in LabVIEW. They could do it with variants. Not as good as dynamic typing, but it'd be closer. A variant knows what type the data is and you can connect anything to one, but they failed to do the other end, when you recover the value (I think it was a conscious decision). That is why I call variants "the feature that never was": they crippled them. I think recovery of data is a bit of a blind spot with NI. Classes suffer the same problem. It's always easy getting stuff into a format/type/class, but getting it out again is a bugger.
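For comparison, Haskell's Data.Dynamic (from the standard base library) behaves much like the variant Shaun describes: anything goes in, but recovery means naming the type you expect. A minimal sketch:

```haskell
import Data.Dynamic (Dynamic, toDyn, fromDynamic)

main :: IO ()
main = do
  -- Anything Typeable can go in, as with a LabVIEW variant...
  let boxed = toDyn (42 :: Int)
  -- ...but on the way out you must name the type you expect,
  -- and a mismatch yields Nothing rather than a conversion.
  print (fromDynamic boxed :: Maybe Int)     -- Just 42
  print (fromDynamic boxed :: Maybe Double)  -- Nothing
```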
infinitenothing Posted November 12, 2014

This seems pretty close to the original diagram.
ned Posted November 12, 2014

Forgot about this thread for a month. Shaun, you might have misunderstood how strict typing works in F# (I probably explained it poorly). When you write a function in F#, for the most part you don't specify the actual types of the parameters. Instead, the compiler determines what operations need to be possible on those parameters, and then only allows values to be passed as those parameters when they support the required operations. As a simple example, a sort function might accept an array of an unspecified type, and the only operation that needs to be supported is comparison. The compiler will then let you pass an array of any type that supports comparison to that function.

So you get the benefits of both generic functions (because functions don't specify parameter types) and strict type checking (because the compiler can determine at compile time whether the parameters are valid types). That's something I'd love to see in LabVIEW, too.
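In Haskell terms (again only standing in for F#, with invented names), the compiler collects exactly the operations a function's body uses and turns them into constraints:

```haskell
-- The body uses (+) and show, so the inferred type is exactly
--   describeSum :: (Num a, Show a) => a -> a -> String
-- Any type supporting those two operations is accepted; anything
-- else is rejected at compile time.
describeSum x y = "sum = " ++ show (x + y)

main :: IO ()
main = do
  putStrLn (describeSum (1 :: Int) 2)         -- "sum = 3"
  putStrLn (describeSum (1.5 :: Double) 2.5)  -- "sum = 4.0"
```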