A CV page of paper references with the journal names blacked out. Omitting journal names in your CV could help to prioritize the quality of your science over the prestige of the publication. Credit: Nature

Can you name the journal in which microbiologist Alexander Fleming first reported on the antibacterial properties of penicillin? Or where engineer John O’Sullivan and his colleagues presented the image-sharpening techniques that led to Wi-Fi?

Most of you can easily name the benefits of these breakthroughs, but I expect only a few would know where they were published. Unfortunately, in modern scientific culture, there is too much focus on the journal — and not enough on the science itself. Researchers strive to publish in journals with high impact factors, which can lead to personal benefits such as job opportunities and funding.

But the obsession with where to publish is shaping what we publish. For example, ‘negative’ studies might not be written up — or if they are, they’re spun into a positive by highlighting favourable results or leaving out ‘messy’ findings, to ensure publication in a ‘prestigious’ journal.

To shift this focus in my own practice, I have removed all the journal names from my CV. Anyone interested in my track record will now see only my papers’ titles, which better illustrate what I’ve achieved. If they want to read more, they can click on each paper title, which is hyperlinked to the published article.

I’m not alone in thinking this way. The idea of removing journal names was discussed at a June meeting in Canberra on designing an Australian Roadmap for Open Research. A newsletter published by the University of Edinburgh, UK, no longer includes journal titles when sharing researchers’ new publications, to help change the culture around research assessment. Celebrating the ‘what’ rather than the ‘where’ is a great idea, and this simple change could be extended to many types of research assessment.

Quality over journal titles

It is disorienting at first to see a reference that does not contain a journal title, because this bucks a deeply ingrained practice. But journal names are too often used as a proxy for research excellence or quality. I want people reading my CV to consider what I wrote, not where it was published, which I know is sometimes attributable to luck as much as substance.

Of course, anyone who really wants to judge me by where I’ve published will simply be able to google my articles: I haven’t anonymized the journals everywhere. But removing the names in my CV discourages simplistic scans, such as counting papers in particular journals. It’s a nudge intervention: a reminder that work should be judged by its content first, journal second.

Because I’m a professor on a permanent contract, it’s easier for me to make this change. Some might think that it would be a huge mistake for an early-career researcher to do the same. But there is no stage in our scientific careers at which decisions about hiring and promotion should be based on the ‘where’ over the ‘what’. It would be easier for early-career scientists to make this change if it became normalized and championed by their senior colleagues.

A potential criticism of removing journal names is that there is nothing to stop unscrupulous academics from publishing shoddy papers in predatory journals to create a competitive-looking CV, which could put candidates with genuine papers at a disadvantage. Promotion and hiring committees need to be made aware of the growing problem of faked and poor-quality research and receive training on how to spot flawed science.

However, when a job gets 30 or more applicants, there can be a need for short-cuts to thin the field. I suggest that reading the titles of each applicant’s ten most recent papers would work better than any heuristic based on paper counts or journal names, for only a slight increase in workload.

Imagine a hiring or fellowship committee that receives plain or preprint versions of every applicant’s five best papers. Committee members who previously relied on simplistic metrics would have to change their practice. Some might simply revert to Google, but others might welcome the challenge of judging the applicants’ work on its merits.

Judging researchers is much more difficult than counting impact factors or citations, because science is rarely simple. Simplistic promotion and hiring criteria ignore this wonderful complexity. Changing typical academic CV formats could bring some of it back.

This is an article from the Nature Careers Community, a place for Nature readers to share their professional experiences and advice. Guest posts are encouraged.

Competing Interests

A.B. is a member of the Australian National Health and Medical Research Council (NHMRC) Research Quality Steering Committee, which provides national guidance on good research practice. A.B. is paid for his time to attend committee meetings. A.B. was on the organizing committee for the Policy Roundtable: An Australian Roadmap for Open Research meeting, which is mentioned in the article, and received paid accommodation to attend the meeting.
