Showing posts from October, 2015

Shortest possible ambiguity

I'm procrastinating cleaning up an essay, so instead it seemed worthwhile making a short observational blog post about one ambiguity in English that has come up a few times for me, is often missed, and hinges on a single one-letter word: 'a'. An example of the ambiguity is the following question: "When rolling a normal die (faces 1, 2, 3, 4, 5, 6), what is the probability that a number is the result?" Unless you're trying to find corner cases ("it might get stuck on an edge!" - seriously, ignore these, the point still stands), your answer will be either "1/6" or "100%", and both are right. The two interpretations can be phrased as: "...what is the probability that a particular single number (e.g. '5') is the result?", in which case it's 1/6; and "...what is the probability that the result will be in the set of things we classify as 'numbers'?", in which case it's 100%.
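A minimal sketch of the two readings, making the assumption that "a particular number" means a fixed face such as 5, while the second reading asks whether the outcome belongs to the set of numbers at all:

```python
from fractions import Fraction

faces = [1, 2, 3, 4, 5, 6]

# Reading 1: "a number" = one particular number, here assumed to be 5.
p_particular = Fraction(sum(1 for f in faces if f == 5), len(faces))

# Reading 2: "a number" = any member of the set of things we call numbers.
p_any = Fraction(sum(1 for f in faces if isinstance(f, int)), len(faces))

print(p_particular)  # 1/6
print(p_any)         # 1
```

Both computations are over the same sample space; only the event being asked about differs, which is exactly where the one-letter word does its damage.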

I predict a riot

Since starting my Cognitive Systems degree, a number of people have asked what exactly that entails. Beyond the more standard answer (psychology & computer science), I figured it worth exploring one idea I've been contemplating that may help summarize it: predictive ability.

1. Macro prediction

It's no surprise that it's useful for people to predict things: daily weather forecasts and the structural integrity of buildings come to mind, but really everything uses prediction in some way. Science as a whole appears to be built around this idea: your theory's utility is strongly tied to its predictive power (i.e. it makes measurable, replicable predictions). What's more, gaining predictive power is hugely incentivized financially: the stock market is the most obvious example, but a large amount of machine learning effort also goes into predicting the behaviour of consumers, populations, the environment, etc. This leads to some interesting questions: is our ab