This week I read like five language-related machine learning papers from the 2000s or earlier, and it's really striking how much the field has changed over the last 20 years.

@jt Even rereading "Attention is All You Need" feels like a trip through time now...

@dlzv That one is on my follow-up list haha. I was focusing on Bayesian models because they seem easy to understand and implement without industrial amounts of data.

@jt I'm not very familiar with Bayesian models for NLP, do you have some pointers? For what it's worth, there are pretrained models now that can be used with very little data, if you're interested. Of course, understanding them deeply is another issue...

@dlzv Yeah, I kinda dismissed pretrained models because everything is in English and I work in French... besides, our data is kinda specific. But I may try at some point in the future.

I think "Two decades of statistical language modeling: Where do we go from here?" gives a nice panorama, along with some of the references there, although my favorite is "Latent Dirichlet Allocation" (which came later). Those are two of the ones I read lol.
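
[Editor's aside, not part of the thread: a minimal sketch of LDA using scikit-learn's implementation; the toy corpus and the choice of two topics here are made-up assumptions, just to show the shape of the API.]

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Made-up toy corpus; any list of document strings works.
docs = [
    "the cat sat on the mat",
    "dogs and cats are friendly pets",
    "the stock market fell sharply today",
    "investors fear another market crash",
]

# Bag-of-words counts, then fit a 2-topic LDA model.
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)  # per-document topic mixtures
print(doc_topics.round(2))
```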

@jt There are plenty of pretrained models for French! I think it's probably one of the most well-studied languages: huggingface.co/models?language

Thanks for the refs!
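
[Editor's aside: a minimal sketch of trying one such French pretrained model via the Hugging Face `transformers` library; camembert-base is one real model from the hub, and the fill-mask task and example sentence are just illustrative.]

```python
from transformers import pipeline

# camembert-base is a French pretrained model hosted on the Hugging Face hub.
fill_mask = pipeline("fill-mask", model="camembert-base")

# CamemBERT uses <mask> as its mask token (RoBERTa-style).
print(fill_mask("Le camembert est <mask> !"))
```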

@dlzv Thanks, I'm going to give them a try :)
