Category Archives: sociology

Dear British Sociological Association

Dear British Sociological Association,

It’s your annual conference this week. Hope you’re having fun. I’ve only been once (2007). Honestly, the whole thing frustrated me a bit and, also because it’s so expensive, I’ve not been back. My main memory was a giant, embarrassing “eh” silence after Latour’s keynote, then sitting by the river with some other PhD students wondering why no one was writing a book about “chavs” and new demonisations of the working class. But I probably shouldn’t judge on one event, and I hear you’ve changed loads recently. This year’s theme is “engaging”. I probably should be there. If I had the budget.

Before I go any further, I want to say that I really, really like words. I especially like new words. I think words help us think. I think new words open new ways of seeing the world.

Moving on. Yesterday, I read that climate scientist James Hansen was retiring from NASA, planning to focus on political advocacy in his retirement. I wrote a piece about it for the Guardian which, countering some of the flak Hansen gets for his political activism, suggested the following dystopian thought experiment:

If I wanted an army of scientific workers to do my bidding I’d train them to think politics smells bad and demonise people like Hansen. I might let them play with some light “philosophy”, certainly no sociology. A bit of the less juicy bits of Popper on a Thursday lunchtime maybe, with some workshops in business, PR and lobbying as a further “enrichment” programme. I’d probably try to enforce really hierarchical and competitive structures within science too, so the workers only talk up and down, rather than organise together. And I’d divide the curriculum up as much as possible, abstracting it all and dismantling anything that invited anyone to consider the work of scientists in any broader context.

Jon Butterworth (physics Prof at UCL and another Guardian writer) very wisely replied that in such a vision he’d also ensure sociologists spoke only to each other, in obscure jargon.

He’s totally right.

Moreover, I think this slightly mischievous vision might be one the sociological profession, of all areas of the academy, would both appreciate and worry about. So think on that when you next wring your hands over “dumbing down”: Who exactly are you serving with your current modes of communication?

None of that means you can’t offer us new words and the new ways of seeing that go with them. I honestly think that’s one of the best things you can give. But do it well. And do it with an eye on talking with (not just at) the public. I also think most of you know this already. But you do need to be a lot better. Because even people who know and love you aren’t getting it. And one of you should have already written that book about chavs back in 2007.

Best wishes,

Maybe even see you next year in person,

Dr Alice R Bell

The Production of Nonknowledge


UCL’s Science and Society reading group discussed an interesting paper on the production of non-knowledge, what science decides not to look at, why and how. It’s interesting because the growing literature on the sociology of ignorance – e.g. agnotology – often sees it as a problem, but as this paper points out, it’s a routine part of science. I thought I’d share my notes. You can download the paper here (pdf).

  • Kempner, J, Merz, JF, Bosk, CL (2011) “Forbidden Knowledge: Public Controversy and the Production of Nonknowledge”, Sociological Forum, Vol. 26, No. 3: 475-500.

Historians and sociologists of science have long studied the ways in which the social world shapes the production of knowledge. But there’s a limited amount of work on what we don’t know. We might argue it’s hard to get an empirical grip on the absence of something, except that scientists are active producers of non-knowledge; they routinely make decisions about what not to do, and sociologists can track that. In many respects, a process of selection is endemic to scientific work. To quote Robert Merton, “scientific knowledge is specified ignorance” (Kempner et al, 2011: 477).

This paper is particularly focused on scientists’ understandings of what they call “forbidden knowledge” (too sensitive, dangerous, or taboo in some way) and the ways particular topics might come to be seen as such. They see such forbidden knowledge as a rather dynamic category, shaped by culture, political climate, and the interests of researchers, and therefore decide it’s best to focus on how knowledge comes to be forbidden rather than consider one specific aspect of forbidden knowledge or another (Kempner et al, 2011: 479-80).

What they did: 41 interviews collected in 2002-3, drawn from six subject areas picked because such work might involve forms of forbidden knowledge (microbiology, neuroscience, sociology, computer science, industrial/organizational psychology and multidisciplinary drug and alcohol studies). Kempner et al identified the 10 top-ranked universities in each discipline and, from lists of faculty, randomly chose names plus backups in case people did not respond or simply refused. Interviews lasted 30-45 minutes, mainly by phone but three in person. Each interview consisted of four sections: they asked researchers to name a prominent example of forbidden knowledge in their field; respondents were then probed on these responses and asked to comment on their own experiences as well as the experiences of their colleagues; thirdly, they asked a series of closed, specific questions about practices and experiences; and finally, the interview ended with attitudinal questions about what the respondent felt about scientific freedom and the social and professional constraints they worked under.

In terms of the results, Kempner et al found researchers perceived science as responsible and “safe”, with scientists committed to openness and generally moral. The pursuit of knowledge was seen as worthwhile in and of itself, and perhaps as providing the social function of challenging social norms: “Truth and knowledge is always the most liberating thing, even though it’s often unpleasant and not what people want to hear”; “Our job is not to defend the status quo, our job is to explore truth” (Kempner et al, 2011: 484). Research subjects disagreed (70%) or disagreed somewhat (20%) that: “A journal editor should reject a paper if peer review concludes that the results would undermine or clash with societal norms” (Kempner et al, 2011: 486).

When pushed, most of them agreed there are legitimate constraints on the production of knowledge; restrictions which provide valuable and necessary protections to society. However, they were largely dismissive of constraints on science which they understood to be motivated by electoral politics, or, as one respondent remarked, are “just typical of American Yahoo politics”. Stem-cell research stood out as an example of overly restrictive limits, and the fact that this research was conducted in Bush Jnr-era America may well have been a factor. Many respondents expressed a preference that scientists themselves should determine the no-go areas for research, not policymakers or some abstract notion of “publics” (Kempner et al, 2011: 486).

Most of the researchers were also able to outline entire areas of research that they felt could not, or should not, be conducted in their fields. Perhaps most tellingly, in terms of areas which were already seen as no-go, they tended to do so by reporting stories about people who had broken norms, through a series of “cautionary tales” which amounted to a sort of gradually constructed collective memory of “what not to do” (Kempner et al, 2011: 486). It seemed that researchers often stumbled into forbidden territory through no intention of their own, only learning they’d hit upon some area of forbidden knowledge when legislators, news agencies, activist groups or institutional review boards raised unanticipated objections.

All that said, according to this data, working scientists do not shy away from controversy. If anything, they are driven by it, as long as the controversy is within the community of working scientists, not those pesky policymakers or publics (Kempner et al, 2011: 487). Drug and alcohol researchers in particular framed the division between these two worlds using an “us versus them” narrative (Kempner et al, 2011: 488). Still, there were internal forces of constraint mentioned too. More than a third of the respondents reported that they or one of their colleagues had chosen not to pursue or publish research because they knew it contravened accepted dogmas of their discipline in some way.

Overall, they concluded that “forbidden knowledge” was omnipresent in the research process, almost routine, as scientific research often “requires that all working scientists learn to accept the bit so that they can properly march their paces” (Kempner et al, 2011: 494). Accepting forms of what might be called censorship is central to the everyday work of making knowledge. Kempner et al go as far as to say they found it “puzzling that scientists could maintain an adherence to normative principles of free inquiry while prodigiously avoiding the production of forbidden knowledge” (Kempner et al, 2011: 496), which I found a bit unfair; or at least it’s not “puzzling”, even if it’s worth pointing out and discussing. On a more normative level, as non-knowledge is such a routine part of science, we should acknowledge this more. We should also perhaps try to heal that “them” vs “us” divide to build productive debate about how science chooses what not to research (something I’m sure would actually liberate a lot of scientists, far from the image depicted in this paper’s interviews).

Social construction of science

“The Knowledge Construction Union”, the IoE take to the streets, en masse.

Saying science is a social construction does not amount to saying science is make believe. It puzzles me that this even needs saying, and yet it does, again and again and again.

Just because something is socially constructed doesn’t mean it isn’t also real.

St Paul’s Cathedral was made by more people than Sir Christopher Wren; he relied upon a social network. And yet there it still stands, in all its socially constructed reality. I saw it from the Southbank when I walked down there last week. I’ve sat on its steps, been inside it, climbed it, taken photos there, got drunk outside, argued about it, been dazzled by it. The thing is real. I do not doubt that. I admit I only perceive it limited by my human capacities. I’m quite short-sighted, I get distracted by other things and my view of the place is coloured by what other people have said to me about it. But even in my more annoying “hey, what do we ever really know, really, really” philosophical moments, I’m pretty sure it exists.

Indeed, we could argue St Paul’s is only real – as opposed to a figment of Chris Wren’s imagination – because it was socially constructed. In order to get it built, he relied upon the labour, ideas, expertise, money, political will and other resources of whole networks of other people. If it hadn’t been for this network, I doubt it would have been constructed at all.

We could say the same for any number of scientific buildings or institutions too. CERN’s a good example. It employs nearly 4,000 staff, hosting a further 10,000 visiting scientists and engineers, representing 113 nationalities drawn from more than 600 universities and research facilities. That’s without getting into the large, long and complex networks of broader financial, physical and intellectual resources they rely upon to do their work. Arguably, it’s because we socially construct science that CERN can exist.

We can also apply this point to scientific ideas, the construction of which is also social, as individuals rely on others to check, adapt, support and inspire them. It’s also worth adding that just because people came up with an idea doesn’t mean it doesn’t match reality; it just means people worked together to find the best idea about the world they can. Science isn’t nature, even if in places it might seem to have described the world so closely that we use it as a shorthand. To say science is made by humans isn’t to say the world around them is (although there is a “social construction of reality” strand to the sociology of science, this is only a strand, and it’s a nuanced philosophical debate which, if you want to engage with it, is worth taking time over).

None of this is to say individuals don’t play a role, just that they rely on others. The fact that we can, at least on occasion, collect together to make stuff like the discovery of the Higgs boson is one of the things that makes me happy about humanity.

Sociology of science simply wants to take a moment to notice science as something that is made by groups of people. I really don’t get why people see it as somehow bent on undermining science. You could equally see it as a celebration. If anything, the scientific community should embrace such detailed study of the intricacies of their make-up; it helps make the case for more rigorous thinking about funding and immigration policies.

Some of these points are echoed in a short piece I wrote for the Guardian at the weekend. If you want to read more, I suggest you try some of the original Strong Programme, as well as Latour on networks and Merton on communalism. Or recent books by Sergio Sismondo and Massimiano Bucchi offer slightly more digestible introductions. I can also recommend Spencer Weart’s Discovery of Global Warming as a good case study in the social structure of science; it’s a slightly more engrossing read than abstract theory. Or there was a nice piece about sociologists at CERN in Nature a while back.

Social scientists and trust in science

I’ll be talking at a Social Science Week event next Monday which asks how social scientists and government might work together to strengthen public trust in scientific evidence.

Times Higher Magazine are partnering, and asked me to write a short piece on the topic for this week’s edition. I briefly ask why social scientists would want to be “used” in such a way, as well as what exactly they might provide:

It is clear that scientists simply saying that they know best is not enough for social or political action – take vaccines, climate change, nutrition, drugs policy (pharmaceutical or otherwise), energy, badger culling, or even – to be retro for a moment – mad cow disease, for example. To have impact, the public must believe the science, not just have it delivered to them. Belief is a social process, and this is where experts on the social can have a powerful role to play.

I wanted to stress that some of us have been at this a while (see this short bibliography, and this overview/introduction might also be useful). Moreover, the role of social scientists doing such work isn’t just to work out the most efficient method by which science might be passed on to the public. Social researchers working in this area will take a good long hard look at science as well as this thing we call ‘the public’, and sometimes they will deliver up uncomfortable messages. They are not PR officers, and this is precisely what makes them so powerful (not that there is necessarily anything wrong with science PR…). If science wants public trust, it will have to earn it, and it may have to change itself a bit in the process, or at least be willing to listen to what others have to say (it might also learn and improve from this too…).

Still, a form of this argument could be applied back to social scientists too, who are perhaps not always as trusted as they could be themselves. Perhaps the social researchers should take a leaf out of the natural scientists’ book and try to improve their image slightly. As I conclude:

So my message to social scientists is: ask not just what you can do for science, but also what scientists might do for you. I’d invite any natural scientists listening in to see social critique as a useful part of scientific work too. Everyone in the academy should challenge themselves to consider how the many threads of scholarship can best work together to serve the public good.

I believe the event on Monday is fully booked, but apparently it’ll be recorded in some way (EDIT 10/11: here it is, on YouTube). I’ll probably say something similar to the Times Higher piece, but if you’d like to help me refine my views do please comment here – I’m very open to changing my mind on this.

Book Review: Free Radicals

With his new book, Free Radicals, Michael Brooks has done something which surprised me: he’s produced a popular science version of Against Method.

Against Method, if you don’t know it, is a philosophy of science book by Paul Feyerabend, published in 1975. It argued against the idea that science progressed through the application of a strict universal method, and caused quite the fuss at the time (it continues to, in places). Brooks is keen to distance himself from the more extreme ends of Feyerabend’s version of this view, but agrees with a central sense that, when it comes to doing “good” science, “anything goes”.

Subtitled “the secret anarchy of science”, Brooks’ book argues that throughout the 20th century, scientists have colluded in a cover-up of their own inherent humanity, building a brand of science as logical, responsible, gentlemanly, objective and rational when in reality it’s a much more disorganised, emotional, creative and radical endeavour. This, Brooks argues, is not only inaccurate but dangerous; education and public policy would be much more successful if science were only more open about its inherent humanity.

This picture of the anarchy of science is done with affection and a clear strength of belief in science. I’m sure some would be tempted to dub it Against Method Lite, but Against Method, With Love might be more accurate. The message seems to be that scientists are people who do amazing things, even though (and sometimes because) they take drugs, lie, cheat, are reckless, work on stuff other than what they’re supposed to, are horrible to their wives, fudge their results, are motivated by money or are simply a bit of a dick. In places, Brooks also emphasises the religious beliefs of many great scientists, and the way in which religion could sit easily alongside, even inspire, their research.

Personally, I’m unconvinced anarchy is the right word here. Messy and human is perhaps better. Or, as a colleague put it at a conference earlier this month: “just people doing people things, in people ways” (I appreciate this doesn’t make for such a sellable book though). Still, the result is a warm, engaging and neatly plotted trundle through aspects of the history of science which the more cheerleading heroic histories tend to avoid. In some respects, the book’s approach of short historical tale after short historical tale is reminiscent of Bill Bryson’s A Short History of Nearly Everything. There is a key difference though. Bryson’s book, when it came out in 2003, bugged me. Bryson is famous for his travel books, in particular a chatty style which talks about the people he encounters with a fair bit of pisstaking (affectionate, and often respectful pisstaking, but pisstaking nonetheless). But, witty as A Short History was, Bryson seemed to have left his ability to take the piss at the door of the Royal Society. It wasn’t a warts-and-all view from the world-weary, observant traveller; more the cleaned-up, polished pictures you save for the tourist brochure. The scientific community welcomed Bryson’s book with open arms. I was left thoroughly bored by its reverence. Brooks, on the other hand, perhaps because he has a scientific background himself, doesn’t seem to be nearly so star-struck (and isn’t, I’d say, nearly so boring).

Again, let me stress Brooks’ approach is not “anti-science” in any way. But that’s not to say such an Against Method, With Love approach is without problems. I suspect many of my colleagues in the social studies of science would worry about this somewhat celebratory twist on the idea of anarchic science. They’d want more critique, more probing (because, I should also stress, they see such critique as a way to better science; they generally do this with love too). I also suspect Brooks’ focus on the big names of science – Nobelists and the like – would jar with those who eschew great men stories in favour of uncovering the less obvious, more detailed and often anonymous networked texture of science. Brooks might have produced an anti-hero popular history of science, but it’s still one with a focus on great men. Indeed, there is a way in which these stories of slightly crazy scientists simply construct a whole new mythical image of the scientist, one that adds new and different forms of barriers between science and society. I’m not convinced science is necessarily a “bad boy” any more than I believe in the mythical branding Brooks aims to puncture.

(An anti-hero history of science isn’t a new one, nor are critiques of it. Rosalind Haynes touches on it in her history of the fictional representation of scientists; work that was neatly reapplied to non-fiction contexts by Elizabeth Leane. There’s a section of my PhD on the rhetoric of an anarchic image of science presented in some kids’ books too)

I’m really not the intended audience for this book though. I’d love to know what a more general reader from outside the scientific community makes of it. I’d also like to know what professional scientists think of the book’s image of their work, and how other scholars in the history, philosophy and sociology of science feel about this refashioning of their ideas. I did enjoy reading it though, and I think the concluding points about the political worth of accepting the human side of science are, at the very least, worth more public debate.

David Kirby’s ‘Lab Coats in Hollywood’

Dinosaur model from the 19th century, still on display in a South London park.

Verisimilitude. Good word, isn’t it? It’s one of my favourites.

It means ‘the appearance of being true or real’. It’s not just a term for people who study semiotics: philosophers of science use it too (or at least Popper does), as a way of comparing how close theories’ claims come to the truth. It’s more ‘truthlikeness’ than truthiness, but has a range of uses and applications, many of which get somewhat intermingled when it comes to actually putting science to work in society at large.

Top tip: After much swearing at my laptop while writing up my PhD thesis, I discovered typing verysimilartude into Word gets you the correct spelling prompt.

This is a slightly abstract way of introducing a great new book I’ve just finished reading: David Kirby’s Lab Coats in Hollywood. The book is the product of several years of Kirby’s sociological research uncovering the backstage role some scientists play in the film industry, as consultants on the depiction of scientists and scientific ideas on screen. Kirby also seems to love the word verisimilitude, and the occasional messiness of its uses. It’s even on the dust-jacket. But this isn’t an esoteric tome of jargon-filled social science. It’s a neat little book for a generally interested reader; direct, clear, thoughtful and communicated with a genuine interest in the people it studies.

Although the bulk of his examples are films of the last decade or so, in some respects, there is a long history to this sort of work. Kirby refers to my favourite example here: the Crystal Palace dinosaurs (pictured). In particular, the way Richard Owen, back in the 1850s, jumped at the chance to be the scientific advisor, so these models would match his ideas of what they looked like, not those of his rival, Gideon Mantell (Kirby, 2011: 15-16). As Kirby stresses, the construction of a movie is a very complex business, one which involves a huge number of specialists and has some rather unequal power structures. Arguably, Owen had more clout over the Crystal Palace dinosaur models than the scientists involved in the Jurassic Park films did. A scientific consultant may well be listened to at times, and in places within the making of a film, and then later ignored. Indeed, in some respects it’s an odd fluke that any films have scientific consultants at all, and there is no standardised method for integrating them into the film-making process (Kirby, 2011: 42-3).

It’d be wrong to think of film-makers as dismissive of a scientist’s point of view though. They wouldn’t invite them on set in the first place if so. Indeed, one of the key points Kirby makes is how important a scientist’s version of verisimilitude is to the film industry. The book has loads of examples of this (seriously, the number of films that have used advisors might surprise you) but my favourite example is Finding Nemo‘s missing kelp. As Kirby tells it, marine biologist Mike Graham was asked by the animators: if there was one thing in the film that might disturb him, what would it be? Graham replied that he’d hate to see kelp in a coral reef (it only grows in cold waters). There was an uncomfortable shuffling in the audience. But go check your DVD: there is no kelp in Finding Nemo. Each frond was carefully removed, at a considerable cost (Kirby, 2011: 102-3). Even films which sell themselves on fantasy (e.g. talking fish) rely on a certain sense of reality too: they need to be credible even in their love of the incredible, and science can help them do this. There’s a lot film-makers can find inspiring in scientific research too; a lot of visual beauty and novel ideas, a lot to make people go ‘wow’. That’s all good material for movie-making. Kirby has a lovely example of a visual used in the 2009 Star Trek movie, inspired by input from astronomer Carolyn Porco (Kirby, 2011: 12).

Kirby also stresses how important the verisimilitude of films is to scientists, something you can see very well from the fact that remuneration is not simply financial, and often relates to their work. Some do get paid for their work. Some feel this is inappropriate and so take alternative payment, like tickets to premieres; some ask for funding for research programmes. Some see it as part of their responsibility to the public understanding of science, some want to promote their ideas, or see them realised with movie technology, some find it simply fun (Kirby, 2011: 56-63). The National Academy of Sciences has a project to connect scientists and engineers with professionals in the entertainment industry ‘to create a synergy between accurate science and engaging storylines in both film and TV’. Personally, I’m not entirely sure if this is a constructive approach to the perceived ‘problem’ of science in fiction or a giant red herring compared to less showy education and public engagement work (? genuine question mark here, I don’t know. Kirby refers to audience research, but conclusions and comparisons are very hard to draw here), though it may well make professional scientists feel a bit happier; a chance to let off a bit of steam.

Kirby has some constructive advice for anyone who does want to try promoting science through Hollywood: worry less about how you might make the science in entertainment products more accurate, and more about showing filmmakers that accurate science could actually make their film better (Kirby, 2011: 10). Other advice for scientists includes getting involved early on, and respecting the filmmakers’ expertise too. Kirby further invites the reader to think about what scientific accuracy might mean within the necessary shortcuts and sometimes fantastical contexts of the film business. Yeah, there’s Finding Nemo‘s coral, but there’s also Brian Cox’s role in Sunshine, where he was brought in as a scientific consultant to talk to actors about a scientist’s psychological motivations as much as scientific ideas (Kirby, 2011: 71, 73). Those wanting to have an impact on the public discourse about science through movies would do well to think beyond a narrow sense of ‘scientific literacy’. As Kirby stresses in his conclusion, based on what we know from the fossil record, the representation of Dilophosaurus in Jurassic Park is completely inaccurate, but the film had much greater public impact (for good or bad) in terms of its depiction of scientists as heroes, of palaeontology as exciting, and of genetic engineering as potentially dangerous (Kirby, 2011: 230).

I’ve been recommending Kirby’s research to students for years (links on his site), and I’m glad I can now recommend a whole book to a much wider audience too. If you are interested in the politics of science fiction, some of the oddities of the film industry, scientific accuracy in popular science or simply an interesting mix of cultures, it’s worth a read.

Social scientists and public accountability

This was originally published on social science space. I’m happy to admit it is a piece of rhetoric, designed to make a point. If you are interested in debating this more, I’m on the panel for a Guardian Higher Education livechat on research communication later today (or use the comments below, as ever).

Every now and again I see someone argue that the models for public engagement and impact built for natural sciences are all very well, but can’t possibly apply to us in the social sciences or humanities.

Whilst I have some sympathy, some of this amounts to sticking slightly snobby scholarly fingers in pairs of already too-deaf ears and going ‘yada yada yada, I can’t hear you’ at political realities knocking on the doors of lovingly constructed ivory towers.

Ideas of public engagement and impact are, in themselves, not a bad thing. I’m all for cynicism about particular definitions of these terms sometimes offered to us (cough – Big Society fuss – cough). But we should take these offerings fairly too, and accept that organizations like the National Coordinating Centre for Public Engagement are there to help us be better academics.

The scientific community woke up to the need to demonstrate their worth around the mid-1980s. They then started a long and painful process of realising that they can’t simply shout ‘BUT YOU SHOULD LISTEN TO ME I AM VERY CLEVER’ (and ‘while you’re at it, leave me alone to get on with my work in peace’), gradually realising they needed to get imaginative about explaining not only their work, but themselves. Moreover, they realised that they needed to listen to the public too. This doesn’t necessarily amount to being told what to study, it just means listening (this might be useful as background). It may seem like an imposition, but those who bother are reaping the benefits.

Of course, you could study whatever you want to, in exactly the way you choose, and only bother to have the cosiest of chats about it. You can do that in your spare time. Want independence? Go, join the hobbyists. Me, I’m a public sector professional, and as such, I take pride in the ways in which I may cultivate an independent voice, but do so within a network of constraints provided by public service. Listening to outside voices is not a threat to my professionalism; it’s an expression of it.

I don’t want to sound entirely unsympathetic, and I admit I’m being deliberately provocative. I know many people in the social sciences aren’t nearly so blinkered. I also know from personal experience that communicating our scholarship can be bloody difficult. Yeah, everyone loves a nice historical story – a little ‘factette’ about Newton inventing the catflap, for example – but what about the more complex offerings from professional historical research, the less convenient ideas, the less appealing detail?

A scientist friend puts a fart joke in his explanation of methane and gets congratulated for being so down to earth. I seethe with envy. Part of my research involves unravelling the cultures and politics of fart jokes in popular science. When I try to explain this work, I sound like a spoilsport.

Most people feel uncomfortable talking about the abstract entities of science. Traditionally, scientists have seen this as their great challenge, but in some ways they have it easy. Everyone’s got an opinion on the research objects of the social sciences and humanities, and this is precisely what makes sharing our expertise so hard. But we shouldn’t lose sense of how it can be an advantage. We should listen to all these opinions, and then work out how to challenge them, how we can offer something more. We have these opinions for a living; we have taken time to have a proper look and a good, deep think about them. What new stuff have we dug up? Moreover, as people who worry about these issues for a living, surely we want to have our ideas and evidence extended, our assumptions poked at, our ideas used?

We are paid to do our research. Teaching a small set of kids privileged enough to go to university, or publishing in esoteric journals only a couple of people will read, does not cut it. Moreover, it doesn’t challenge our ideas enough to make the sort of high-quality work we should be producing. Earn public trust by showing off your worth. You may well learn something in the process too.

You don’t have to do what you are told; what’d be the point of you if you did? But for goodness sake take those bloody fingers out of your ears. Me, I’m a professional scholar, not a hobbyist. That’s why I try to stretch my work outside of the academy, and why I think you should too.