Daniel Kahneman and my unexpected stupid

About half-way through the Masters degree I’ve just finished I began to get interested in how and when I had got so stupid, and not just in a trivial sense that might be explicable by diminished memory function or a fight-or-flight response to the alarmingly patterned trousers that suddenly surrounded me as I went among The Young again. My self-image was that of a reasonably clever person who found studying easy. But writing my dissertation was like repeatedly pressing a switch and not understanding why nothing was happening. Somehow, the appropriate facts were not crashing into each other in the right way, in the way I was pretty sure they had last time I had attempted to do something like this. At one point a tutor told me I was good at spotting the flaws in other people’s arguments. This formed a poignant counterpoint in my violently over-inducting brain to what another tutor said to me about ten years ago, which was that I was good at seeing what really mattered. Interesting category difference there, I think.

So naturally as a fan of bullshit pop psych I first wondered about the whole 10,000 hours thing. Most people in postgrad study are building on the subject they chose when they were 18, which gives them an advantage in both data-set and mindset familiarity. Was it just too much of a stretch to acquire basic mastery of Near Eastern prehistory sufficient to enable me to write meaningfully about it? When I did my first postgraduate work I’d been studying for the previous sixteen years, and the particular subject of my postgrad work for the previous three. That has to make a difference.

And maybe there are other consequences to getting older that are more about changing your mental landscape than depleting it. Maybe I am epistemologically harder on myself these days. I know more in general, and I have a higher standard of what it means to have a sound understanding of something than I used to. Probably the late teens and early twenties are the optimal time for learning big difficult stuff because you don’t yet comprehend the extent of your own ignorance and would have the crap quite terrified out of you if you did.

But I don’t think any of that fully explains what was going on, and nor did any of the chirpy “Seventy-two reasons why the internet is turning you into a hopeless moron”-type posts I turned up on, uh, the internet in search of the answer (although I did come across a link to a finding that men get stupider just by being in a woman’s presence, which has the worrying implication that roughly 50% of the people I am using as a reference point for my own stupidity are actually even smarter than they appear to me).

But never fear. I still have a bullshit pop psych explanation, just a slightly more complicated and respectable one. One of the things I am finally getting around to reading is Daniel Kahneman’s Thinking, Fast and Slow. Kahneman’s system 1 and system 2 concepts are shorthand for, respectively, fast, intuitive, impressionistic thinking and slow, effortful, “rational” thinking. “Slow” system 2 thinking is hard and energy-expensive, which is why people have a natural resistance to it and are prone to over-rely on “fast” system 1 thinking (despite believing a lot of the time that they are using system 2, i.e. making rational judgements and decisions).

System 1 serves important purposes – impressionistic judgements enable accurate forecasting in many scenarios – but it is not good at handling certain types of problem, especially those with statistical and logical components. It is subject to various biases which can cause its conclusions and forecasts to be inaccurate, of which I think my favourite is attribute substitution (answering a different question from the one actually posed, as if it were an answer to the question posed) because it explains about 80% of political commentary. Attribute substitution is built into the way people construct their political views – onlookers as well as politicians. Political problems are vast and complex, information is hard to come by and analyse, and yet people in public life and public house alike are culturally expected to take views on things they could not possibly carry out full system 2 analysis on. Attribute substitution is probably the single most useful mental tool a person commenting on a political problem has access to, if we define “useful” as “helps me avoid admitting that I do not have a solution to this problem and thereby losing status among my peers.”

However, Kahneman also refers to situations where system 1 thinking does provide reasonably accurate forecasts even where the material is logically or statistically complex, simply because some people in some situations have the system 2 knowledge database to support intuitive leaps. His illustrations are chess masters having an instant grasp of all possible future moves on a board without having to reason it through, and a physician making an instant diagnosis – both specialists recognise familiar cues in the situation and are able to make leaps to judgement which are reasonably accurate. This is where I think the relevance to academia comes in. In these terms, by the time I got to postgrad level in medieval history, I had done all the slow, logical, effortful system 2 thinking required to fix the basic rules in my head. This meant I was able to do a whole lot of informed system 1 thinking – that is, the frequent employment of low-effort intuitive thinking to make leaps and solve problems. This in turn freed up my capacity to do dogged system 2 thinking that was genuinely meaningful.

This is basically like being on drugs – the satisfying animal hit of “stands to reason” system 1 thinking plus the rational knowledge that the system 2 slogging you’re doing is actually important – which is why I did the Masters in the first place and I guess why anyone sticks with academia at all. Finding thinking uniformly hard was something I had forgotten. I associated academic success with system 1 thinking, because that was the last state in which I had experienced it. I kept waiting for system 1 to kick in, and it didn’t; I didn’t have the database for it; I was grasping for the intuitive before I had done the basic crunchy bit that makes the intuition work. My back-then tutor was identifying my success in system 1 thinking – my now-tutor was describing a process he could observe in my tentative system 2 thinking (picking holes in other people’s arguments is a great way of kicking off the system 2 crunchy bit).

For me the only implication here is “If you do a PhD, be crunchier about it”, but I also think it has interesting implications for academic careers in general. If you’ve done all your basic system 2 thinking in the first years of your career, you are able to take effective shortcuts in problem-solving, and more of your expensive system 2 capacity is freed up for the boundary-pushing work which will move you forward as a researcher. But it has its downsides too, from the point of view of both research and pedagogy. You’re no longer well-equipped to describe to your students – mired as they are in system 2 – what it is you’re really doing. Your shortcuts, once laid down, are less likely to get truly re-examined, which may mean the perpetuation of specialized versions of heuristic biases in your work.

Perhaps this provides another perspective on why intellectual revolutions are as fraught as they are. It is not mere social and professional defensiveness at work when new paradigms are rejected – we can usually detect these kinds of bias. At a higher level of abstraction, a call to embrace a new paradigm is a call to put down the lovely, easy, satisfying system 1 toys and start again from scratch with the unpromising lego bricks of system 2, which is what I have just had to do.

Uncle Jack’s tattoos

The Guardian is, of course, wrong about everything all the time, and this piece on David Dimbleby’s tattoo is no exception:

In 2013 it’s a look we still associate with youth. It may be ubiquitous but it still carries just the faintest hint of rebellion (though one that will soon, surely, be extinguished: it’s hard to regard as rebellious an act that’s been performed by the man who, after the Queen, is the face and voice of the British establishment).

In the future, the spider on the neck or the angel wings on the back will be associated with grandparents.

That is trivially true as far as it goes, but there is a much wider and more interesting context. It’s the largely uninked middle-class people – or those aspiring to be so – born between the 1920s and the 1950s who are the aberration in the British popular cultural record of self-decoration in the modern era. Tattoos have only ever been associated with yoof and rebellion by people with no sense of history (and it’s surprising, isn’t it, how many older small-c conservative people don’t have any sense of history). In fact, the context in which Dimbleby acquired his elegant little scorpion was a programme about the history of the navy.

More than anything it made me think of somebody I never met who died long ago in a country far away, known to me as Uncle Jack. He was my great-grandmother’s brother, and was probably remembered by my grandfather as a tall grown-up shape in the same way that my grandfather (d. 1982) was a tall grown-up shape to me. When I finally plucked him out of the records as John Wells, the son of a carpenter, born Dulwich 1884, he proved to have a quite splendid set of tattoos. This is him:

Jack in kilt

This is from the early years of the twentieth century. He even looks like a man with ink all over him, doesn’t he? One hundred and thirteen-odd years ago, on 29 November 1900 – actually 16 but claiming to be 18 – he signed his attestation papers in London for the Fifth Middlesex. He lived at the Wells family home at 20 Marcellus Road, Finsbury Park, and worked in a baker’s shop on the Hornsey Road. At this time his distinguishing marks seem to have been limited to the pugnacious:

john wells detail

I think that says “Vax 2 left. Scars forehead; upper lip. D__? forearms.” But if you can make anything better of it, let me know. I’m only inferring the first bit because those papers always note vaccination marks, and his later papers also record two vaccination marks.

He almost certainly has another set of British military papers floating around somewhere because I think the photograph above shows him in a uniform of the Cameronians (Scottish Rifles), which is how he must have met Percival Mortimer who became my great-grandfather; their regimental numbers are a few digits apart. Later records show he served with the Cameronians for nine years, which I imagine includes some years on the reserve.

In the spring of 1911 he and his wife Evie (Alberta Evelyn Mercy Cooper, and I think I would wrestle “Evie” out of that as well) emigrated to Canada, and it was as a member of the Canadian Ordnance Corps that he signed a further set of attestation papers in April 1919. He then claimed to be 33 (actually 35). He’d put on 37lbs-odd since 1900 (well, haven’t we all, darling) and grown two inches, and the complexion previously described as “fair” was now “florid”. But best of all, he now has these (click to embiggen):

tattoo

Lottie. Oops! This makes me think that tattoo predates 1908, which was when he married Evie, in which case maybe he got it when serving in the Scottish Rifles alongside my great-grandfather, as the highlander would also suggest.

The drum major is an interesting one, because that’s the position the photograph above might depict. But I need to investigate this further – the usual uniform of the Scottish Rifles wasn’t the kilt, but some rather fetching trews in the Douglas tartan of the kind now regularly to be seen in Hoxton. So perhaps that tattoo was a celebration of being chosen as a Drum Major?

Britannia between two flags sounds like it was probably standard fare among the tattoo parlours of early twentieth-century London (or India or South Africa, where various bits of the regiment were posted in the relevant years), and it could belong to any time. The horse and gun carriage I’m not sure about. It’s not from any insignia I’ve been able to find related to the Scottish Rifles. It could be just another standard militaristic trope like Britannia, but given the drum major and the commemoration of “Lottie” I wonder if it also records some kind of historical event, if only a personal one. I’d love to know whether early tattoo parlours had books like they do now, what kind of consultation went on and how accurate they were at producing things like military insignia. And whether any cautions were offered to young men gabbling the name of a girl.

Jack and wife

Jack and Evie. Probably taken around 1918 when they visited family in North London for the last time (my granddad would have been about 8). Note his Canadian uniform. And the chair – there are half a dozen other family photographs featuring this chair, so either they all used the same photographer’s studio or the photographs were all taken in the same house in Finsbury Park or perhaps Crouch End.

I don’t know when Uncle Jack died. We have pictures of him capering around Canada in his 50s and 60s – one of the inky old granddads that the Guardian writer now finds so counter-intuitive.

And then at some point, the postcards and pictures stopped coming, or maybe we stopped writing back.

I always find that crossover in photography very strange – those same Kiplingesque, sepia-washed faces who looked so strained and young in pictures from the 1900s, suddenly sitting in a garden chair grinning in the 1950s like they can’t believe their luck (and maybe they couldn’t). Way too much to hope, of course, that any of the pictures show him with his sleeves rolled up. Being covered in tattoos may have been quite routine for working class boys out of nineteenth-century London, but there were standards.

Jack 1941

Postcard from Uncle Jack in Canada to my great-grandmother in Norf London, 1941. The back reads “Here is Jack come to see you – he forgot to put his medals on.”

Open access – the nuclear option

Universities do, of course, have one option in the open access war that the publishers may not be considering: they can go it alone. As a lapsed sub-editor, a line I bring out at dinner parties only slightly more frequently than “as a former tax assistant”*, I have mixed feelings about this passage:

Yet top universities could organize their own conventional peer review processes economically and effectively, much as they do for PhD examining in the UK, using a system of mutual service and support. All the rest of the piece – getting articles publicized by twitter and blogs, providing a well-edited product, delivering the article to any PC, phablet or colour printer in the world – can be done easily and cheaply by universities themselves. Online communities are already doing the work of developing more and more research, so for universities to directly organize and publish their own peer reviewed journals, monographs and books is a natural next step. In my view only a dramatic fall in journal OA prices can prevent this transition in the next ten years.

As far as peer review goes, this is absolutely true, and indeed why the hell not? But as regards publicising, copy-editing, sub-editing and technical delivery, it depends what you mean by easily and cheaply. Anyone who has ever produced their own e-book, never mind worked in publishing, knows that we are some way off all these processes being entirely human-free, or error-free where they are automated, and it would be a mistake to reject this consideration wholesale just because it’s one of the arguments employed – with bells on – by the publishers.

I do understand the tendency among those who’ve sweated away at thinking big important thoughts and writing them down to take a somewhat reductionist view of the process that follows. But I’ve made use of self-archived material that was missing some pretty critical graphs (despite claiming to be complete), because successful palaeo-archaeological scholarship does not necessarily select for the ability to detect an enormous gap on a page. Slips like that should not happen when a competent sub is on the case. The sub is like the homemaker; you think the whole show is running smoothly by itself until they aren’t there any more.

So “easily and cheaply” at the cost of a couple of non-academic posts, yes. Is that cheap? Maybe, depending on the volume of the material the institution wishes to produce, how much they fork out in subscriptions now, and how many more non-academic posts they are willing to countenance as a matter of principle in lean times.

* I don’t go to many dinner parties. You can probably tell.

CaSE all-party debate at the Royal Society #1

The call of pie prevented me from watching the whole of tonight’s all-party debating panel (David Willetts, Julian Huppert, Liam Byrne) at the Royal Society organized by CaSE, so for the moment these are inadequate and heavily pie-fed gleanings, both political and scientifical, which may be followed by a Proper Post at some point, like with correct grammar and paragraphs and all:

1. The panel like each other, more or less. Huppert thinks Willetts is one of the good guys on, e.g., immigration. Byrne was the most overtly political, at least at first. Natural, I suppose, because the other two have cause to know each other in a working relationship sense. Willetts seemed genuinely quite thrilled that Byrne had actually read DBIS’s Innovation and Research Strategy for Growth Report 2011.

2. The panel generally are strong on their brief, and strong in their advocacy of it. It’s too early to tell how Byrne will really perform in his newly acquired shadow brief and perhaps he had the benefit of being somewhat carried by the other two, but he held his own, and if you are utterly depressed at the state of government and the intellectual paucity of the people in it you could do worse than look to these three.

3. I did a little oven-side dance when Julian Huppert flew the flag for older people retraining in the sciences (the context being the skills shortage generally and a pertinent question from the audience about how to address this problem now, rather than through the medium of improved primary school science teaching which has a 20 year lead time). Not because I have the slightest intention of retraining in the sciences, mind you, but I am broadly retrained, if only from one social science to another, and much as it has benefited me personally I do think the culture is against it. Basically, if I had my way I would carry on learning things forever, and I think it’s a crying shame people are supposed to get more confident and complete as they get older, rather than more wide-eyed. I think a bit more wide-eyed would be good.

4. There was a moment of tremendous excitement, which had the Huppmeister jumping up and down in his chair, when it looked as if we might have cross-party consensus on ring-fenced science funding (in real terms?) 15 years into the future. Certainly Willetts was holding out some sweeteners about funding in the few years following 2015, and Byrne responded warmly. Not sure Willetts was quite as successful in humanising “George” [Osborne] to the audience though.

5. I need to blog about R&D tax credits and possibly change my header. You lucky people. They are, as Julian Huppert said, very generous indeed. I know them of old from working in the tax accountancy sector, and I never did get to do a tax calculation for an R&D tax credit, the number of groundbreaking research companies at large in North London being, at any one time, fairly small.

6. Several further points emerged in the questions which are relevant to all of academia, not just the sciences. The first of these was the importance of blue-sky thinking. Is the current/emergent funding system conducive to it? I’ve read plenty of sound arguments that it isn’t. The incentives of the REF, certainly as they are known in the humanities, basically favour small annexations of new ground directly adjacent to old ground, and exploratory work that may not result in publication is riskier. The same is apparently true in the sciences. The panel offered no firm assurances that I could detect that this would be addressed, although all made the right noises. Huppert did concede that there was some risk inherent in successfully selling the short-term gains of science to the Treasury, in that it downplays the significance of long-term, tentative, exploratory research.

7. There was a brief segue into funding from the student end. Huppert feels the culture changed before his eyes as a student and then a Director of Studies when fees came in. I personally saw nothing of this, having been an undergraduate from 1997 (the last year fees were not paid) and a postgrad from 2002, but I’ll take his word for it.

8. Another question with broader relevance to academia as a whole was the diversity problem. Some quite shocking stats came out here, including (if I heard correctly) the fact that 50% of state schools currently don’t have a single girl studying physics. Willetts had some even more telling stats about the proportion of girls getting high marks in science GCSEs who went on to study sciences at A-level, as against the (much higher) proportion of boys, which sadly I didn’t catch in detail; I will update when I find them. You can’t help wondering whether Two-Brains won’t eventually be led further down the feminist path than his earlier history would have predicted, the more he considers stats like these and wonders why they are as they are.

And that was when the chicken and leek pie was ready. I will watch the second half of the debate though, because one thing came over to me loud and clear as I was watching and chatting to other watchers: this whole debate is far more advanced (and has, frankly, better people involved) in the sciences than it is in the humanities.

Brain dump #2 – diversity in academia

A little experiment with a call for contributors by the LSE Review of Books that yielded some interesting results. It’s an old post and I can’t remember who put it my way (sorry, thanks) but I thought the point about choice of words was a good one – how to encourage under-represented groups without signalling to over-represented groups that they are not welcome at all (which is only going to perpetuate a sense of division). I tend to agree with the writer’s conclusion that “particularly interested in hearing from…” is probably a phrase you want to avoid, because it can be taken to mean something other than what it is intended to mean.

Words are tricky because they carry so much baggage. Perhaps anyone writing a call for contributions has an impossible balance to achieve because of the associations people make with certain phrases. Action and example, where it’s a possibility, is probably easier. Whenever I’ve been in a position to pick hands out of an audience to ask questions, I’ve tried to go for a woman first. I was reminded of this the weekend before last at the Battle of Ideas, where the session chairs tended to tap a good half-dozen men in the Q&As before any women. It’s quite possible – likely, actually – that this was simply representative of the sea of hands presented to them. But taking the extra half-second to scan the room for a member of a group whose contribution you wish to encourage has the great advantage of not being an explicit discouragement of anyone. Everyone in those situations accepts that somebody has to be picked first, and also that they may not get to ask their question at all.

And it’s always interesting to see just how many more women’s hands go up after you’ve picked a woman once. Presumably the unconscious logic runs, oh right, this is for me. Before anyone has asked a question, the range of possibilities for the room is limitless, but your decision about who to pick first nudges the room along a certain path. Obviously this is not limited to the gender division – people in ethnic minorities, people from different socio-economic groups, people who happen to be sitting at the back under some blown lightbulbs, might all take their cue from the early decisions of the chair.

It’s a long way from being scientific though. I guess a sort of control would be picking a man the first time and seeing if more women’s hands start to go up anyway, in which case a plausible explanation might be that women don’t, as a rule, want to ask the first question. I didn’t put my own hand up in the Battle of Ideas sessions until I was sure I had a question that would contribute (or rather until I’d had the opportunity to assess prevailing levels of Dunning-Kruger bias among the other questioners). It’s even possible that the mere act of updating the call for contributors got the LSE Review of Books a bumper crop of women respondents the second time round, because it signalled that the editors’ needs had not been fulfilled by the initial response, and that the world is not, in fact, full of people obviously cleverer and better qualified than oneself, something certain groups are disproportionately prone to assuming.

The Fingerprints of the Lone Maverick Researcher

I still don’t know whether Graham Robb’s The Ancient Paths is tosh or not. If you dig into the book a very short way, you will see that he happily concedes he isn’t the first person to have come up with this idea. He’s also not setting himself up in defiance of French prehistoric archaeology – indeed, this material is his starting point.

If it isn’t complete tosh (excluding everything about the druids), then its breathless, revelatory tone as reported in the reviews is doing the material a disservice. It’s the tone of a man (usually) who has proven – by drawing lines on maps, learning some uncontroversial things about Egyptian astronomy and free-associating with place names a bit – that Atlantis is buried under Washington DC and the Hittites invented the internet.

“Popularly dismissed as superstitious, wizarding hermits, Robb demonstrates how the Druids were perhaps the most intellectually advanced thinkers of their age: scientists and mathematicians who, through an intimate knowledge of solstice lines, organized their towns and cities to mirror the paths of their Sun god, in turn creating the earliest accurate map of the world.”

And it was this chorus of quacking, much more than my sketchy knowledge of pre-Roman Gaul, that made me think “duck”.

Yet a lot of this is about marketing, and no author’s fault. This is what publishers’ marketing departments think historical enquiry looks like so this is how they sell it, pretty much regardless of how erudite the text is. If a book doesn’t have Startling New Evidence it must at least Overturn Existing Stereotypes. Often there is a rescue operation involved – restoring something to its rightful place in history, in defiance of some oppressive force. If the thing restored appeals to modern sensibilities, e.g. a conquered people or a wrongly accused individual, all the better – this will engage the reader’s post-Enlightenment indignation as readily as it fired the researcher’s, and they will steam through the book together.

Sometimes the oppressor is orthodox academia; in this case, the way one review tells it, it is also Julius Caesar, and the forehead-slappingly incredible possibility that he might not have represented the Gauls accurately. This is at least an audacious approach to straw man manufacture. If you’re going to tell a lie, tell a big one, as Caesar would no doubt counsel himself.

Physicists apparently suffer from a related problem. The media narrative has it that breakthroughs are made by lone genii, so anyone identifying as such automatically gets media attention. In fact – and just as I can guarantee no scholar of Roman Gaul is currently writing a to-do list that goes (1) believe Caesar (2) get coffee (3) seek to exclude maverick researchers by uniting with colleagues in defence of Teh Orthodoxy – this isn’t how most physicists work at all:

Physics is, these days, an immensely collaborative field. There are a lot of conferences. There are institutes and workshops and collaboration visits and endless seminars and dissections of research papers. Newly built physics institutes tend to have hallways lined with blackboards or dry-erase-glass cubicles to get people out of their offices to collaborate. We talk to each other, not because we are inherently very social (though a lot of us are), but because it’s a really productive way to proceed.

In physics as in archaeology, the motive of the lone maverick is a fine one. The human drive, expressed without cynicism, to discover alternative paradigms is very, very admirable and nobody should be ashamed of owning it. We should want to have our minds blown and our perspective altered. That’s what history and archaeology are about. The problem is that “orthodoxy” here is a straw man. Everybody who researches history or archaeology started doing it because they liked having their minds blown too. Go and read something very orthodox and classic and even textbookish on the Neolithic, like this or this or this. If you’re unfamiliar with the material, your mind will probably be blown. You won’t believe no-one has ever told you this stuff, that it’s just sitting around, and you won’t see yourself, or history, or the species, in quite the same way again.

It’s not that historical and archaeological investigation isn’t revelatory – it is. But these are drugs available on prescription. The idea that they have to be sought out furtively by mavericks making extraordinary bicycle journeys and meditating on hilltops in defiance of orthodoxy is a fantasy nurtured by publishers. It’s a shame that scholarly writing conventions tend to conceal the fact that everyone is really in it for the kicks.

Maybe I have it the wrong way round, and it is scholars who should be learning from publishers. Perhaps scholars who wish to extend their reach should own their inner maverick researcher, talk about their uncertainty and their delight, and do more bicycling.

How academic historians can make money on the side and enhance their impact. (Maybe.)

I am not a techie, but I live with one and seem to know, or know about, a lot of them, and one of the activities they seem to undertake *field notebook out* is selling specialist knowledge to each other. Here’s just one example. This guy wrote two e-books on popular technical subjects; they were successful, so he wrote an e-book about how to write e-books. Actually, you don’t even need to buy a book like that to get a grasp of the topic; the internet is stiff with blogposts telling you how to make money out of self-publishing, how to make money out of online educating, how to sell your skills – and not all of them are selling shovels to prospectors; some of them have done it.

How many of these tech knowledge brokers are making a living out of selling specialist knowledge, as opposed to treating it as supplementary income, or even a loss-making activity designed to give them a leg-up into existing institutional structures such as employment, is hard to quantify. Perhaps it’s something that will shake out over the next ten years or so.

But let’s leave the tedious lucre questions to one side and reaffirm the principle: technical people are for historical reasons accustomed to the idea that there are people out there who are experts in the stuff that they need to know, that they can pay them (in micro amounts, generally) to access that knowledge in non-traditional forms, and that the internet is the appropriate forum for all this. Moreover, the knowledge on offer gets much more specialized than “how to design web apps” (although by extension it may well make less money).

By analogy with previous social developments on the internet, where technical people go, the rest of us follow in descending order of socio-economic grouping and (although this is changing) ascending order of age. This is why sites like Kickstarter and Indiegogo exist. This is why I watch Khan Academy videos and am playing around on Code Academy.

All this will seem a very quick and clumsy summary to people who are closer to these developments than I am. I’m attempting it because a classicist whose blog I enjoy is contemplating a colleague who has quit her post for the bright lights of “entrepreneurial activity in history”, and is wondering – not too seriously, perhaps – about his own Plan B outside academia. You will notice that the post, and his comments, tend towards assuming that traditional publishing and other established media are the obvious choices for an academic historian seeking to make the transition to popular or public historian.

Obviously, this is not necessarily the case. Is there any reason why academic historians shouldn’t self-publish?

Let’s have a look at what Nathan Barry, referenced above, says about the problems and pitfalls of writing e-books from the perspective of most people who might try to do it:

“Writing a book is hard.” Not for an academic historian. To be sure, there are issues of style to address in the transition to a non-academic audience, but the act of writing a paperback’s worth of words itself is a familiar hurdle. In fact, a paperback’s worth is pushing it. Tyler Cowen’s The Great Stagnation is the length of a long article, which is as long as he needs to make a single acute point with far-reaching implications. Do you have to compromise on the complexity of your content? I’ll come to that in another post, perhaps, but my instinct is, if you’re happy to treat this as supplementary income from a niche audience, not as much as you’d think.

“I’m not an expert… I don’t have a PhD in marketing or design and I don’t travel the world giving lectures – all things you would typically associate with experts.” Clearly not in point here.

“Building a following for your blog takes time.” Yes, but various academics including Professor Morley don’t seem to have a problem doing that. Anyway, part of the reason it’s hard for ordinary civilians to build a following for a blog which professes to be authoritative on a particular subject is that they don’t have an immediate and obvious authority platform to speak from. See “I’m not an expert”.

Let me be clear, I am not recommending this particular “how to sell e-books” e-book. I haven’t bought it. I probably won’t. It just happens to be an example known to me, and I’m using it to point out that the things most people find hard about making money through self-publishing and other knowledge brokering activities are not typically going to be things that academics find hard.

So, conversely, why should academics self-publish? Are there really enough people out there willing to pay for, say, a spiky little pamphlet on Roman economics to make writing it worth an academic’s while, what with all the other calls on their time? What are the benefits of such “entrepreneurial” approaches, relative to the clout of traditional publishing and other mass media?

At the moment, and until I can read more about the audience I have in mind, my answer is It Depends. I doubt I would have gone into a bookstore and bought the kind of book Tyler Cowen might have ended up writing if he had taken The Great Stagnation to a traditional publisher. But mainly I’m just observing that the commodification of specialist knowledge via non-traditional formats works very well indeed in some communities for some people. I can’t see any systemic reason why it shouldn’t work for academic historians, classicists and archaeologists.