
OK, so I've been learning a new computer language recently, and I'm finding it intriguing. I'm not yet at the point where I can use it "in anger", but I'm close to choosing it for some of my everyday scripting tasks. That would take longer than sticking with what I know, but it would help me learn it.

So ...

Would you learn a new language? What would induce you to do so?


@ColinTheMathmo I would want to be able to write it about as fast as I can write Python, but have better performance, especially for multithreaded applications

@gappleto97 I'm getting to the level of writing some of my code in both Python and the new language so I can start doing some timing comparisons (a rough harness is sketched just after this post). Not sure it has multithreading support; I'll have to check.

I'm still just a beginner, though. It's a mind-bending paradigm (for me) so it's not just a "Write Python in X" exercise.

That's why it's (potentially) valuable.
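
A minimal way to make such timing comparisons fair is to run the same workload under the same harness in each language and compare best-of-N wall-clock times. Here is a hedged Python sketch; the workload function and the repetition counts are illustrative assumptions, not anything specified in the thread:

```python
# Minimal timing harness: time a workload in Python, take the best of
# several repetitions to damp scheduling noise, and compare against the
# other language's time measured the same way.
import timeit

def workload():
    # Illustrative stand-in; replace with the real task being compared.
    return sum(i * i for i in range(100_000))

# best-of-5, 10 calls per repetition
best = min(timeit.repeat(workload, number=10, repeat=5))
print(f"Python: {best:.4f} s for 10 calls (best of 5)")
```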

@gappleto97 I'm learning "fexl".

The website doesn't really do it justice; my initial investigations suggest that it's actually quite usable ... but I'm still exploring it.

I'm finding it quite interesting ... I've been surprised.

@ColinTheMathmo I always think of programming languages as media of expression, so one rough criterion I have is that I want a language that lets me say more with less. That is, I'd have to be convinced it's expressive yet concise for a particular application.

@tpfto That's interesting. One of the arguments for Lisp (which I am *not* advocating) is that it helps expand how you think about computation, and that then improves your ability to program in other languages. The claim is that your ability to program in general is improved, and then you are more effective in whatever language you use subsequently.

Not sure I believe it, but it's an interesting argument. Certainly the language I'm learning now might do something similar.

@ColinTheMathmo @tpfto I think this is true for maybe ALL programming paradigms/families - I know for myself that after learning Haskell I had a new perspective on how to deal with some kinds of problems. Same with Bash and C (well, C only kinda, in that C was basically the first language in which I became proficient-ish).

I haven't found this to be the case for Lisp, but probably I just haven't done enough of it yet.

@reed I'm learning fexl, which takes a different approach: making the lambda calculus an effective working language. I'm wondering if it will enhance my abilities/perceptions across other languages.

CC: @tpfto

@ColinTheMathmo I've been thinking a lot recently about how highly parallel paradigms (mainly GPU, but also multicore, etc.) might transform the way we (should) approach or teach numerical algorithms.

It would be very interesting to find a programming language that lets one do this in a natural way, preferably at a higher level of abstraction. For example, are there dataflow languages that are suitable for high-performance numerical work?
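
As one hedged illustration of what "a natural way" might look like, here is a Python sketch of a numerical kernel expressed as a pure map over independent chunks followed by a reduction; the kernel and chunk count are illustrative assumptions, not a proposal from the thread. The same map-reduce shape transfers to multicore (as here) or GPU execution:

```python
# Data-parallel style: a pure map over independent chunks, then a
# reduction. The structure, not the kernel, is the point.
from concurrent.futures import ProcessPoolExecutor

def chunk_sum_of_squares(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, chunks=8):
    step = n // chunks
    ranges = [(k * step, (k + 1) * step) for k in range(chunks)]
    ranges[-1] = (ranges[-1][0], n)  # absorb any remainder
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(chunk_sum_of_squares, ranges))

if __name__ == "__main__":
    print(parallel_sum_of_squares(1_000_000))
```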

@mm You've made me think ... I'm pretty sure the implementation of fexl isn't deliberately parallel, but I can see that it might be possible to do that.

Hmm.

I've made a note of that. The implementation is open, but I don't understand it. Maybe I should learn enough to do so.

@ColinTheMathmo Regarding fexl: this is a prime example of what I mean! The λ-calculus seems to deliberately avoid multivariable functions in favor of repeated evaluation ("abstraction", or "substitution" in fexl) of functions of one variable. This has the same computational power as λ-expressions with multiple variables. Indeed, "λ x y . z" is equivalent to "λ x . λ y . z". But the latter is essentially sequential, whereas the former is inherently parallel! Is it time to revisit λ-calculus?
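
To make the equivalence concrete, here is a hedged sketch with Python standing in for λ-notation (fexl's own syntax isn't shown in the thread, so this is an analogy, not fexl code). The curried and uncurried forms compute the same function; only the shape of application differs:

```python
# Currying: "λ x y . x + y" versus "λ x . λ y . x + y".

uncurried = lambda x, y: x + y         # λ x y . x + y
curried = lambda x: lambda y: x + y    # λ x . λ y . x + y

assert uncurried(2, 3) == curried(2)(3) == 5

# In the λ-calculus reading, the two arguments of uncurried(f(), g())
# are independent subexpressions that could be reduced in parallel;
# curried(f())(g()) nests the applications, suggesting a sequential
# order of reduction. (Python itself fixes left-to-right evaluation;
# the parallelism claim is about the calculus, not this interpreter.)
```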

@mm Thinking out loud ... these things were worked on at a time when implementing them wasn't even contemplated, so (I'm guessing) the reductions were thought of, if at all, as "happening" instantaneously.

Analogy: Consider proof by induction. People talk of the proposition "becoming true" for larger n, of "dominoes falling", and so on. But that's misleading at best, and generally just nonsense.

So with the λ-calculus. We now have to think of things as happening, not simply as being true.

@mm PS: Mostly not available today ... I'll try to think more on this.

@ColinTheMathmo Exactly! When λ-calculus was invented, efficiency mainly meant shortcutting unnecessary computations, without a particular computational model at hand. Nowadays it makes sense to consider in which order computations should be performed. In fact, being aware of CPU caches, for example, can lead to nontrivial solutions with impressive performance gains (cf. springer.com/gp/book/978354000)

For fexl and the λ-calculus: Maybe something like Regent would work? See: elliottslaughter.com/2015/12/r
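
As a hedged illustration of the cache remark above (my example, not one from the cited book): in Python with NumPy, the same sum runs noticeably faster when the traversal order matches the memory layout, purely because of cache behaviour:

```python
# Cache effects: NumPy arrays are row-major by default, so summing row
# by row touches memory sequentially, while summing column by column
# strides through memory and wastes cache lines.
import numpy as np
import timeit

a = np.random.rand(4000, 4000)  # C-order: rows are contiguous

row_major = timeit.timeit(lambda: sum(row.sum() for row in a), number=3)
col_major = timeit.timeit(lambda: sum(col.sum() for col in a.T), number=3)

print(f"traverse rows:    {row_major:.3f} s")
print(f"traverse columns: {col_major:.3f} s  (strided access)")
```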

@ColinTheMathmo

Your chart is ready, and can be found here:

solipsys.co.uk/Chartodon/10677

Things may have changed since I started compiling that, and some things may have been inaccessible.

The chart will eventually be deleted, so if you'd like to keep it, make sure you download a copy.

@mm @ColinTheMathmo According to my buddy, snowbol would be a good choice, because it's a GPU-based language with only about 400 or so symbolic registers, so things move very quickly.

@ringo @ColinTheMathmo Thanks, but not sure I can find it. This is not the same as SNOBOL / SPITBOL, is it? Do you know where to find more info or an implementation?

In the meantime, it seems that Futhark and Chapel might be two languages that come close to what I have in mind - but I need to look into this more...
