"Palm oil may be the world's most hated product and it’s in everything. But if economics is about anything, it’s about tradeoffs...
the rapid expansion of palm oil exports since 2000 led to 2.7 percentage point faster poverty reduction and 4% faster consumption growth, at the cost of more rapid forest loss and more fire.
A back-of-the-envelope calculation finds that 2.6 million of the 10 million Indonesians lifted out of poverty this century were lifted because of palm oil"
Fiction by writers best known as physicists
(So not Brin or Benford or Snow)
Anyone else noticed the difference between Captcha annoyance on Firefox and Chrome?
I'm using a VPN to be fair, but Chrome waves me through with one round while Firefox takes 10. I smell corporate bullshit.
Why care about this? Besides mere trust in one's hardware, or a mere preference not to be watched, it's to do with the increasing tail risks of being in principle vulnerable to one oddball with a vendetta. These will increase for two reasons: increase in the online population, and in ML fuzzing and intrusion methods.
Only half of humanity are online at the moment; a single script-kiddie troll can do quite a lot; the internet is about to get bigger, louder, and stranger.
Maybe-adequate self-defence on the cheap
I know a decent amount about
Data science careers
Python, SciPy, Spark
Nietzsche, Frege, Wittgenstein
History of computing
History of philosophy
The replication crisis and social science
Biochemistry (esp. veganism & nootropics)
Analytic philosophy (esp. science, ethics)
Punk, jazz, ambient music
and I like answering questions, AMA.
Some East Asian vocabulary for enjoying Ada Palmer's 'Terra Ignota' series even more:
* Xiaoheiwang, 小黑王 (Mandarin): Little Dark Lord.
* Sanling, 三菱 (Mandarin): Three Chestnuts, Three Diamonds. (The Mandarin reading of the characters in "Mitsubishi".)
* Tai-kun, 太君 (Japanese): Supreme Child OR Supreme Leader (Shogun, tycoon). A good rendition for both meanings would be "great upstart".
Things I have seen marking machine learning take-homes:

- Excellent performance in the notebook, which disappears once I rerun it (with the same seed). This was probably cheeky .ipynb file hacking.
- Most people forget to normalise the data for linear regression, making everything else in the analysis nonsense.
- No code reuse ever.
- Blithe recommendation of multiple comparisons (cheating).
- Editing the response data, instead of binning or rescaling.
- No test split.
- Test split but forgot to use it.
- An exquisite description of the mathematical and practical motivation behind PCA, followed by them taking as many components as there were original columns.
- Tree depths of 50 on data with 1000 rows (this is saturated, with one row per perverse leaf, after 10 splits).
- Absolutely everyone uses classifiers on ordinal/interval responses. I seriously want to know why.
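For the curious, a minimal sketch of two of the fixes above (an actually-used held-out test split, and PCA kept to far fewer components than the original columns), on synthetic data with assumed sizes:

```python
# Sketch: hold out a test set, reduce with PCA, report the held-out score.
# Sizes (1000 rows, 20 features, 5 components) are illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=1000, n_features=20, noise=0.5, random_state=0)

# Hold out a test set and never fit on it.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Scale, then keep 5 components rather than all 20 columns.
model = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
model.fit(X_train, y_train)

# Report the score on the held-out split, not the training data.
test_r2 = model.score(X_test, y_test)
print(f"held-out R^2: {test_r2:.3f}")
```

The pipeline also guards against the normalisation slip: the scaler and PCA are fit on the training split only, then merely applied to the test split.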
"what just happened presents a serious indictment of academic science. There are dozens of academic groups, with researchers likely numbering in the (low) hundreds, working on protein structure prediction. We have been working on this problem for decades, with vast expertise built up, and not insignificant computational resources...
For DeepMind’s ~10 researchers to so thoroughly rout everyone surely demonstrates the structural inefficiency of academic science..."
A Mastodon instance for maths people. The kind of people who make \(\pi z^2 \times a\) jokes.
Use \( \) for inline LaTeX, and
\[ \] for display mode.
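A quick illustration, using the pizza joke itself (the volume of a cylindrical pizza of radius \(z\) and depth \(a\)):

```latex
% Inline, flowing with the surrounding text:
The volume works out to \( \pi z^2 \times a \), i.e. "pi z z a".

% Display mode, set on its own centred line:
\[ V = \pi z^2 \times a \]
```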