• Four Steps to Better Predictions

    Following his work on geopolitical forecasting, Philip Tetlock co-founded The Good Judgment Project along with decision scientists Barbara Mellers and Don Moore.

    Their further research identified four key steps to improving forecast accuracy, shared on the (now archived) ‘Science’ section of the project’s site. The steps:

    1. Talent spotting: Identifying the naturally better forecasters. They’re curious, highly analytical, numerically savvy, rational, open-minded, and quick to revise their views when presented with new evidence. In their thinking, they also structure and disaggregate problems, take an ‘outside view’, and systematically look for base rates.
      Nowadays, the team uses Good Judgment Open to identify some of their new ‘superforecasters’.
    2. Training: Forecasting is a skill that can be learned and improved. Training focuses on techniques to reduce cognitive biases and apply structured thinking. Key strategies include breaking down problems, considering alternative outcomes, and updating predictions as new information emerges.
      The team’s early “cognitive-debiasing” training, known as CHAMPS KNOW, lasted only an hour but improved forecasting accuracy by 11% over an extended period.
    3. Teaming: Diverse teams make better predictions than individuals. By grouping forecasters with different perspectives and encouraging collaboration, there’s a “surge of accuracy that goes way beyond what you’d expect”, said the researchers in an interview with Knowledge at Wharton.
    4. Aggregation: Finally, individual predictions are aggregated into a single forecast, but not just through (weighted) averages. A method called log-odds extremising aggregation combines the group’s predictions, then adjusts the consolidated forecast to be more extreme wherever there is consensus: the more forecasters agree, the further the model pushes the probability towards certainty (0% or 100%). The idea is that a confident consensus is often more reliable than a simple average would suggest.
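
    The extremising step can be sketched in a few lines. This is my own illustrative Python, not the project’s actual model; the exponent `a` (and its value of 2.5) is an assumed tuning parameter for demonstration only — in practice it would be fitted to historical forecasts.

```python
import math

def extremise(probs, a=2.5):
    """Aggregate individual probability forecasts by averaging them in
    log-odds space, then scaling by a > 1 so a confident consensus gets
    pushed closer to 0% or 100%. With a = 1 this reduces to a plain
    log-odds average; a = 2.5 is purely illustrative."""
    # Clamp away from 0 and 1 to keep the log-odds finite.
    clamped = [min(max(p, 1e-6), 1 - 1e-6) for p in probs]
    # Mean log-odds of the group's forecasts.
    mean_log_odds = sum(math.log(p / (1 - p)) for p in clamped) / len(clamped)
    # Extremise, then convert back to a probability.
    return 1 / (1 + math.exp(-a * mean_log_odds))

# A group leaning the same way gets pushed well past its simple mean:
print(round(extremise([0.7, 0.75, 0.8]), 2))  # → 0.94, vs. a 0.75 average
```

    Note how agreement, not just the average level, drives the adjustment: a lone 75% forecast stays near 75%, while three forecasters clustered around it become a much stronger claim.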
  • Hedgehogs, Foxes, and Prediction

    In reviewing Philip Tetlock’s Expert Political Judgment upon its release in 2005, The New Yorker of course discussed the main finding that expert judgements are not much better than those of lay forecasters.

    Another key focus of the review was Isaiah Berlin’s The Hedgehog and the Fox, a metaphor drawn from Archilochus sometime in the 7th century BC. Tetlock uses it to categorise the two approaches people take to forecasting, finding:

    Low scorers look like hedgehogs: thinkers who “know one big thing,” aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who “do not get it,” and express considerable confidence that they are already pretty proficient forecasters, at least in the long term. High scorers look like foxes: thinkers who know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible “ad hocery” that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess.

    An interesting aside is what Tetlock saw when he explored how the hedgehog and fox mindsets correlated with political leanings:

    [There was no] significant correlation between how experts think and what their politics are. His hedgehogs were liberal as well as conservative, and the same with his foxes. (Hedgehogs were, of course, more likely to be extreme politically, whether rightist or leftist.)

    It seems natural, then, that we would all want to be foxes in our thinking, right? Well, maybe not if you want to make a name for yourself:

    The upside of being a hedgehog, though, is that when you’re right you can be really and spectacularly right. Great scientists, for example, are often hedgehogs. They value parsimony, the simpler solution over the more complex.

    So, whether it’s the (overly) confident single-mindedness of the hedgehog or the flexible pragmatism of the fox, being aware of both approaches can help us better navigate the complexities of an uncertain future, especially when combined with Tetlock’s other recommendations: being comfortable with complexity and uncertainty, and keeping our confidence in check.

  • The Two Fundamental Truths of Prediction

    There are two fundamental truths of prediction: (i) there exists a “prediction horizon” where predictions beyond this point are inherently inaccurate; and (ii) experts are generally just bad at predicting.

    On that first point:

    our ability to predict is limited by the nature of complex systems. Weather forecasts, for example, are quite accurate a day or two out. Three or four days out, they are less accurate. Beyond a week, we might as well flip a coin. As scientists learn more about weather, and computing power and sophistication grow, this forecasting horizon may be pushed out somewhat. But there will always be a point beyond which meteorologists cannot see, even in theory.

    Prediction horizons vary, but the general idea is the same whether experts are trying to forecast the weather, economies, elections or social unrest: No matter how brilliant the analysts may be, no matter how abundant the resources at their disposal, their vision can only go so far.

    Yet, despite these inherent limitations, there is hope.

    In Expert Political Judgment, Philip Tetlock’s seminal work on political forecasting, he shows that many experts would do better if they simply guessed randomly. However, his research also offers guidance on how to improve our forecasting accuracy:

    what separated those with modest but significant predictive ability from the utterly hopeless was their style of thinking. Experts […] were handily beaten by those who used diverse information and analytical models, were comfortable with complexity and uncertainty and kept their confidence in check.

    What this and much other research suggests is that the right training, tools and organization can make people better forecasters.

    Based on these findings, Tetlock received IARPA funding to explore this further (as the ACE Program). His research culminated in The Good Judgment Project (a crowd forecasting organisation) and the excellent Superforecasting. I’ll dive more into this topic in the days to come.

  • Tsundoku and Eco’s Antilibrary

    Today I learnt of tsundoku: 19th-century Japanese slang for “the phenomenon of acquiring reading materials but letting them pile up in one’s home without reading them”.

    Yep, that’s me—with physical books, sure, but especially with ebooks.

    I love books as much as the next person, but I’m no bibliomaniac. Instead, I’m reminded of the more aspirational antilibrary.

    The antilibrary is a personal collection of unread books—a tribute to everything you don’t yet know. The more it grows, the more it reflects your curiosity, not your knowledge. It’s less about ownership than a constant invitation or reminder to learn.

    This is a great way to think about one’s personal “library”—and it handily removes some of the guilt of not having read all my books while acquiring more and more.

    The concept originates with Umberto Eco (coined, I believe, in On Literature), but was popularised by Nassim Nicholas Taleb in The Black Swan, as a great introduction to that eponymous concept:

    People don’t walk around with anti-résumés telling you what they have not studied or experienced, […] but it would be nice if they did. Just as we need to stand library logic on its head, we will work on standing knowledge itself on its head. Note that the Black Swan comes from our misunderstanding of the likelihood of surprises, those unread books, because we take what we know a little too seriously.

    Let us call an antischolar—someone who focuses on the unread books, and makes an attempt not to treat his knowledge as a treasure, or even a possession, or even a self-esteem enhancement device—a skeptical empiricist.


    As a postscript, while researching for this post, I often saw the below quote attributed to Eco, which neatly encapsulates his antilibrary idea. However, try as I might, I could not verify the source, and so I have my doubts about its authenticity (personally, I also think it doesn’t read like Eco’s writing). No doubt, though, that it evolved from his ideas. Perhaps just not from the man himself.

    It is foolish to think that you have to read all the books you buy, as it is foolish to criticize those who buy more books than they will ever be able to read. It would be like saying that you should use all the cutlery or glasses or screwdrivers or drill bits you bought before buying new ones.

    There are things in life that we need to always have plenty of supplies, even if we will only use a small portion.

    If, for example, we consider books as medicine, we understand that it is good to have many at home rather than a few: when you want to feel better, then you go to the ‘medicine closet’ and choose a book. Not a random one, but the right book for that moment. That’s why you should always have a nutrition choice!

    Those who buy only one book, read only that one and then get rid of it. They simply apply the consumer mentality to books, that is, they consider them a consumer product, a good. Those who love books know that a book is anything but a commodity.

  • The Capital City Effect and Britain’s “Mississippi Question”

    Comparing Britain’s per capita GDP with that of the various US states seems to be of recurring interest to some in the British economic media—likely because the UK ranks surprisingly low. This has given rise to “the Mississippi question”: which is more economically productive per capita, Britain or the most impoverished US state (which also has the lowest life expectancy)?*

    Last year, the Financial Times took the Mississippi question a step further, looking at GDP figures for various countries with and without their capital cities (archived). Without London, the UK slips just below Mississippi.

    This brings us to the capital city effect, tracked by the German Economic Institute since 2011. It measures the economic contribution of capital cities and the impact of hypothetically removing them (translated).

    Spoiler: the results vary hugely.

    As of 2015, for example, Greece’s per capita GDP without Athens would drop by nearly 20%. Germany, however, was an outlier: removing Berlin had a net positive effect, increasing per capita GDP by 0.2% (though by 2023 the effect had turned negative, at -0.1%).
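
    The arithmetic behind such an exercise is simple: subtract the capital’s GDP and population from the national totals and recompute the per capita figure. A minimal sketch, using made-up illustrative numbers rather than the institute’s actual data:

```python
def capital_city_effect(gdp_total, pop_total, gdp_capital, pop_capital):
    """Percentage change in a country's per capita GDP when its capital is
    hypothetically removed. Negative means the capital pulls the national
    figure up (the usual case, e.g. Athens); positive means it drags the
    figure down (Berlin, as of 2015)."""
    before = gdp_total / pop_total
    after = (gdp_total - gdp_capital) / (pop_total - pop_capital)
    return (after - before) / before * 100

# Illustrative only: a capital producing 30% of national GDP with just
# 20% of the population lifts the country's per capita figure, so
# removing it makes the rest look poorer:
print(round(capital_city_effect(100, 10, 30, 2), 1))  # → -12.5
```

    Removing an above-average city makes the remainder look poorer; removing a below-average one makes it look richer, which is how Berlin’s removal could briefly raise Germany’s per capita GDP.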

    * As of today, Britain seems to be between Alabama and South Carolina (fourth and fifth ranked, respectively).

  • FAST not SMART for Goals

    The conventional SMART approach for setting goals undermines higher-level (team and/or organisational) objectives by promoting an individualistic and isolated approach to work. The best approach for creating effective goals, according to researchers at MIT Sloan, is to go FAST: Frequently discussed; Ambitious; Specific; and Transparent.

    According to their meta-analysis and additional field research across companies such as Google, Intel, Netflix, Burger King and Kraft Heinz, these “four core principles underpin effective goal systems”. However, it was the last principle — transparency — that stood out as the least popular, yet most impactful:

    Making goals public can boost performance by introducing peer pressure, showing employees what level of performance is possible, and helping them locate colleagues in similar situations who can provide advice on how they can do better.

    I’m always curious how it might be possible to apply such findings to personal development. While I use spaced repetition and deliberate practice, I don’t often make goals. Maybe that’ll change.

  • Some Favourite Daily Mini Games

    I feel like the concept of daily mini games really came to prominence in those pandemic-hazed days of 2021, thanks to Wordle.

    Since then, my daily game habits have changed as I hunt for fresh, challenging puzzles. I enjoy finding games that are quick to play but mentally engaging enough to add a little spark and focus to the day.

    Sometimes (basically, if a tab is closed!) I forget to return to a game. So, here’s a list of my current favourites—both to share and as a personal reminder:

  • Prioritising the Search for Good Books

    A favourite hobby of mine is research. Structured or unstructured, informal or scholarly. Deep diving on a topic, old or new, is my jam.

    For that reason, I spend a lot of time reading about the thing, rather than actually doing the thing. The meta-activities.

    There are clear negatives to this approach (limited impulsivity, slower decisions), but also significant upsides (conscious information consumption, higher quality decisions).

    This approach also applies to my reading habits, which is why I tend to only read books that I end up rating highly and why I connected with this comment from Paul Graham about a lesson he taught his twelve-year-old:

    There’s a second component of reading that many people don’t realize exists: searching for the good books. There are a huge number of books and only a small percentage of them are really good, so reading means searching.

    Someone who tries to read but doesn’t understand about the need to search will end up reading bad books, and will wonder why people who read a lot like to do something so boring.

    You’d think that figuring out which books are the best would be a solved problem by now, but it isn’t. I’m almost 60 and have been reading a lot my whole life, and I’m still constantly searching for the good books.

    Algorithmic recommendations, ‘best book’ awards, and ‘book of the year’ lists abound, but are not a replacement for the hard research. My favourite approach is shortlisting finalists from awards I respect, finding voracious readers who share detailed reviews, and actively talking with friends about books they read, before reading lots of 3- or 4-star reviews to help me make an informed decision.

  • Alt Codes and symbol.wtf

    In my work and in my writing here, I’m constantly searching for “ellipsis”, “euro symbol”, and “section symbol”, among many, many other symbols. For me, trying to remember even more alt codes is a futile endeavour.

    Obviously, when Sam Rose wrote this, I was excited: “Made a dumb website so I wouldn’t ever have to Google ‘tm symbol’ again.”

    And so started symbol.wtf, a site I now have bookmarked and visit at least once a week, simply clicking the easy-to-find symbols to copy them.

  • The Nerd Urban Dictionary, or: The Overcomplication Compilation

    Seemingly frustrated at how ‘nerds’ throw around technical terms in order to sound smart, Chris Anderson (writer of The Long Tail, etc.) put together The Nerd Urban Dictionary to compile the most common and worst offenders.

    With terms coming from disciplines ranging from statistics to chemistry, finance to the military, here’s a small sample (mostly of things I’ve heard from the mouths of consultants):

    • “Priors” instead of assumptions (if you want to get really nerdy, you can say “posteriors” instead of conclusions). (“All political discussion on Twitter is just people confirming their priors”)
    • “Causal structure” instead of underlying reason (“There must be some causal structure behind why this happened”)
    • “Non-trivial” instead of hard (“Shipping the code by the end of the day is non-trivial, boss”)
    • “Binary choice”: a choice with only two options
    • “Tautological” instead of obvious

    Important notes here are that Anderson agrees these terms are all uncontroversial when used in the right context, and that his “instead of…” definitions refer to the (often incorrect) common usage, not the precise/correct definition.


    As an aside, I do take exception to the use of ‘nerd’ here. I’ve observed this linguistic signalling more prominently within the technolibertarian ‘tech bro’ and ‘finance bro’ circles, rather than the typically more wholesome ‘pure nerd’ community.

    As such, I have decided to call this The Overcomplication Compilation: a collection of words and phrases used to signal one’s in-group intelligence. Because, you know, regular words are for regular folks!