Here is a link to a recent paper discussing the negative correlation between African literacy rates in the colonial and post-colonial times and the "slave export intensity" during the pre-colonial era.
Cherokee Gothic rightly points out that this is an example of economic path-dependency, a concept familiar to mathematicians: for certain quantities, it isn't where you end up, but how you got there.
Example: suppose someone pegs their net worth to the price of AAPL stock and, through some sorcery, predicts every upturn and downturn, liquidating everything at each local peak and converting all cash back to stock at each local minimum. After a year, that person would have considerably more money than the person who simply held the stock, and far more than the person who made the opposite choices at every turn. All three face the same start and end price; the path taken between those endpoints is what causes the discrepancy in wealth after a year.
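The timing example can be sketched numerically. The prices below are made up purely for illustration (not real AAPL data): the perfect timer holds stock only during up moves, the worst timer only during down moves, and the buy-and-hold investor rides every move.

```python
# Synthetic daily closes -- illustrative only, not real AAPL prices.
prices = [100, 120, 90, 130, 95, 140, 110, 150]

# Per-step gross returns: tomorrow's price divided by today's.
steps = [b / a for a, b in zip(prices, prices[1:])]

hold = prices[-1] / prices[0]  # buy-and-hold rides every move

perfect, worst = 1.0, 1.0
for r in steps:
    if r > 1:
        perfect *= r  # perfect timer is in stock only for up moves
    else:
        worst *= r    # worst timer is in stock only for down moves

print(f"hold: {hold:.2f}x  perfect: {perfect:.2f}x  worst: {worst:.2f}x")
# Same start price, same end price -- only the paths differ. Note that
# perfect * worst == hold, since each step's return goes to exactly one of them.
```

Same endpoints, three very different outcomes: path dependency in one loop.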
Path dependency is a familiar trope in political theater and is, depending on one's political and philosophical bent, the reason for gender and race gaps in education, poverty, and incarceration. It can be summarized (thanks to Scott E. Page) in the "old Bostonian jump-roping rhyme":
I eat my peas with honey. I’ve done it all my life. It makes ’em taste quite funny, but it keeps them on the knife.
And so this recent paper attempts to gauge the degree to which the slave trade has biased literacy rates in African countries over the following centuries. The answer? According to the abstract: "a negative and significant relationship between slave export intensity before the colonial era and literacy rates during the colonial era."
Here's the data used to support that claim, buried in a plot in the supplementary material. This image plots literacy rate against some normalized quantity representing pre-colonial slave exports as a percentage of the extant population. Where's the trend?
To me (and this is just me), this appears to be a classic case of oversimplification. If the author wrote this paper with the exact opposite conclusion, I would be equally swayed. What causes the four outlier groups in slave export to be there? Why is there seemingly no trend? Why did the author connect two clusters with a line and call it a trend?
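The clusters-masquerading-as-a-trend problem is easy to demonstrate. With numbers fabricated purely for illustration (not taken from the paper), an ordinary least-squares fit over two blobs can report a steep slope even when neither blob shows any trend internally:

```python
def ols_slope(xs, ys):
    """Slope of the least-squares line through the points (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Two fabricated clusters, each internally trendless (slope exactly 0).
x_lo, y_lo = [0, 1, 2, 3, 4], [50, 60, 55, 60, 50]       # low "export intensity"
x_hi, y_hi = [10, 11, 12, 13, 14], [20, 30, 25, 30, 20]  # high "export intensity"

within_lo = ols_slope(x_lo, y_lo)             # 0.0: no trend inside the cluster
within_hi = ols_slope(x_hi, y_hi)             # 0.0: no trend inside the cluster
pooled = ols_slope(x_lo + x_hi, y_lo + y_hi)  # strongly negative

print(within_lo, within_hi, round(pooled, 2))
```

The pooled line simply connects the two cluster means; calling it a trend tells you nothing about what happens within either cluster.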
Bad science, even in Path Dependency
Tyler Cowen has a blog series called "Markets in Everything" where he links to examples of bizarre areas of specialization for which there is a market: the market for coats made out of chest hair, topless paintings of Bea Arthur, and so on.
I'd like to start my own: Bad Science in Everything. Just like market economies exist for even the strangest of goods, bad scientific research permeates all corners of human existence.
Today's example: a study flagged by Outside Online as STUDY: LONGER RUNS ARE EASIER. Twenty-five runners attempted a 200-mile race. They were tested three times along the race course for neuromuscular fatigue and other physiological indicators. The results were compared to those of runners tested in 100-mile and shorter races, and lo and behold, the 200-mile runners performed better. The suggested culprit: sleep deprivation, perhaps. This makes for great copy, and has done so all over the web, from CBS News to Deadspin.
It is also some bad, bad science.
Of the 25 runners, just 9 completed the race, and they were only tested 3 times. Let me repeat: 25 runners, 9 finishers. Over 60% of the runners who thought they could finish a 200-mile race could not. This self-selected subset was then compared against runners running 100-mile races, 50-mile races, and marathons.
So all 9 of these runners' bodies were capable of running 200 miles, which makes them extraordinarily rare, both among human beings generally and among ultra-marathoners. Comparing them to runners covering considerably shorter distances implies that anyone capable of running a marathon, or even a 100-mile race, is capable of running a 200-mile race, which is clearly not the case, even for those who train for it.
So here we have 9 highly specialized data points which are probably not even useful for comparison to other long-distance runners. Yet they are also being used to make an inference about the other 7 billion human beings.
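The selection effect can be sketched with a toy simulation (all numbers invented): give every runner a hypothetical "fatigue score" after a fixed effort, let only the most fatigue-resistant handful finish the long race, and the finishers' average fatigue looks low even though nothing about running farther made anyone fresher.

```python
import random

random.seed(0)

# Hypothetical fatigue scores after a fixed effort (lower = more resistant).
population = [random.gauss(50, 10) for _ in range(10_000)]

# Suppose only runners below some fatigue threshold can finish 200 miles.
finishers = [f for f in population if f < 35]

pop_mean = sum(population) / len(population)
fin_mean = sum(finishers) / len(finishers)

print(f"{len(finishers)} of {len(population)} finish; "
      f"population mean {pop_mean:.1f}, finisher mean {fin_mean:.1f}")
# The finishers look less fatigued by construction:
# the race selected them; the distance did not refresh them.
```

Conditioning on finishing guarantees the "longer runs are easier" pattern, whatever the underlying physiology.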
I think I could probably run a marathon after a few months of training, but I would be utterly wrecked. To say that if I ran 200 miles (and I couldn't) I'd be feeling better? That's some bad science.
You might argue that this isn't the implication of the study, and that the point is that ultra-hyper-marathoners have special fatigue properties in their neuromuscular systems. This is probably true. Yet the broader claim is certainly what the press coverage implies.
Bad science in everything.
There are three kinds of lies: lies, damned lies, and statistics. - Mark Twain
People are more likely to answer factual questions incorrectly when the facts do not conform to their political biases, even when presented with news stories disconfirming their preferred belief. With money on the line, however, they answer correctly far more often. The consequences for climate policy are rather broad.
Wonkblog writes up a recent paper by Larry Bartels at Princeton, who showed that Democrats were:
Much less likely than Republicans to correctly answer questions about whether inflation went down under President Ronald Reagan (it did) and whether unemployment also fell (it did):
A second group of researchers found:
Republicans presented with news articles pointing out that there were no WMDs in Iraq were more likely to say that such weapons were found than Republicans who didn’t read those articles.
The implication here is that people with strong political beliefs are willing to register their political identity in a survey even at the expense of contradicting an actual fact, which recalls the famous quote popularized by Twain (who attributed it to Benjamin Disraeli).
Yet when respondents were told that incorrect answers would be penalized with a monetary fine, and were offered "I Don't Know" as a response (a sort of conscientious objection), the partisan gap (remember, this is the spread in answers to a factual question based on political affiliation) dropped by 80%.
The two main takeaways here?
This calls into question reports on the acceptance of climate change in the United States, including the results of this Gallup poll.
Gallup shows a strong conservative tendency toward the belief that climate change is exaggerated, and a liberal tendency of similar magnitude (relative to the mean) toward the belief that it is not. Given that the "controversy" over climate change has become such an entrenched partisan issue, and that there is no magic information transmitted to Democrats that Republicans can't access, could it be that people are simply registering their political or religious identity in a survey, rather than their ignorance?
Oceanographer, Mathemagician, and Interested Party