OK, so I've been learning a new computer language recently, and I'm finding it intriguing. I'm not yet at the point where I can use it "in anger", but I'm close to choosing it for some of my everyday scripting tasks. That would take longer than using my usual tools, but it would help me learn the language.

So ...

Would you learn a new language? What would induce you to do so?

@ColinTheMathmo I have been thinking a lot recently about how highly parallel paradigms (mainly GPUs, but also multicore CPUs, etc.) might transform the way we (should) approach or teach numerical algorithms.

It would be very interesting to find a programming language that lets one do this in a natural way, preferably at a higher level of abstraction. For example, are there dataflow languages suitable for high-performance numerical work?

@mm You've made me think ... I'm pretty sure the implementation of fexl isn't deliberately parallel, but I can see that it might be possible to make it so.

Hmm.

I've made a note of that. The implementation is open, but I don't understand it. Maybe I should learn enough to do so.

@ColinTheMathmo Regarding fexl: this is a prime example of what I mean! The λ-calculus seems to deliberately avoid multivariable functions in favor of repeated use of functions of one variable ("abstraction" in the λ-calculus, "substitution" in fexl), i.e. currying. This has the same computational power as λ-expressions with multiple variables: indeed, "λ x y . z" is equivalent to "λ x . λ y . z". But the latter is essentially sequential, whereas the former is inherently parallel! Is it time to revisit the λ-calculus?
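
To make that concrete, here's a minimal sketch in Haskell (not fexl!), since Haskell wears its λ-calculus heritage on its sleeve. It assumes GHC and the standard "parallel" package, and the function names are just for illustration: the curried form consumes its arguments one at a time, while the tupled ("multivariable") form receives both at once, so their evaluation can be sparked in parallel.

    import Control.Parallel (par, pseq)

    -- Curried: λ x . λ y . x + y -- the arguments arrive one at a time.
    addCurried :: Integer -> Integer -> Integer
    addCurried x y = x + y

    -- "Multivariable": λ (x, y) . x + y -- both arguments arrive together,
    -- so we can spark the evaluation of x in parallel while working on y.
    addTupled :: (Integer, Integer) -> Integer
    addTupled (x, y) = x `par` (y `pseq` (x + y))

    main :: IO ()
    main = do
      print (addCurried 1 2)
      print (addTupled (sum [1 .. 10000000], sum [1 .. 20000000]))

    -- Build with: ghc -O2 -threaded; run with: ./Main +RTS -N2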

@mm Thinking out loud ... these things were worked on at a time when implementing them wasn't even contemplated, so it was (I'm guessing) assumed, if anyone thought about it at all, that these things "happen" instantaneously.

Analogy: consider proof by induction. People talk of the proposition "becoming true" for larger and larger n, of "dominoes falling", and so on. But that's misleading at best, and generally just nonsense.

So with the λ-calculus. We now have to think of things as happening, not simply as being true.

@ColinTheMathmo Exactly! When the λ-calculus was invented, efficiency mainly meant shortcutting unnecessary computations, without a particular computational model in mind. Nowadays it makes sense to consider the order in which computations should be performed. In fact, simply being aware of CPU caches can lead to nontrivial solutions with impressive performance gains (cf. springer.com/gp/book/978354000)
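
A minimal sketch of the cache point, in Haskell again (assuming the "vector" package; the size is made up for illustration): both traversals compute the same sum over a flat row-major n×n matrix, but the row-order walk reads consecutive memory while the column-order walk jumps n doubles per step, so it misses the cache on almost every access.

    import qualified Data.Vector.Unboxed as V
    import Data.List (foldl')

    n :: Int
    n = 4096

    -- A flat n*n matrix of doubles, stored in row-major order.
    mat :: V.Vector Double
    mat = V.generate (n * n) fromIntegral

    -- Sum all entries, visiting them in the given (i, j) order.
    sumInOrder :: [(Int, Int)] -> Double
    sumInOrder = foldl' (\acc (i, j) -> acc + mat V.! (i * n + j)) 0

    main :: IO ()
    main = do
      -- Row order: consecutive addresses, cache friendly.
      print (sumInOrder [ (i, j) | i <- [0 .. n - 1], j <- [0 .. n - 1] ])
      -- Column order: strided access, same result, noticeably slower.
      print (sumInOrder [ (i, j) | j <- [0 .. n - 1], i <- [0 .. n - 1] ])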

For fexl and the λ-calculus: maybe something like Regent would work? See: elliottslaughter.com/2015/12/r


@ColinTheMathmo

Your chart is ready, and can be found here:

solipsys.co.uk/Chartodon/10677

Things may have changed since I started compiling that, and some things may have been inaccessible.

The chart will eventually be deleted, so if you'd like to keep it, make sure you download a copy.
