Fighting the impact factor, one CV at a time



The impact factor is probably one of the most loved and hated bibliometric indicators in the scientific community. It is a simple number, really: the average number of citations received in a given year by the papers a journal published in the previous two years. Eugene Garfield devised it to rank journals and give librarians a way to decide which journals to subscribe to, and it has been published yearly in the Journal Citation Reports since 1975. It is an indicator of the relative importance and readership of journals. What it does not do is assess the relative quality of the work of an individual scientist. If anything, the impact factor of the journal in which you publish reflects the past research published in that journal, not your own.

The main problem with the impact factor (and there are several) is that it is often used to compare scientists. That is why a lot of people hate it. When skimming through a list of publications in a CV, we tend to skip over the titles and look first at where each paper was published. And in our academic brains, journal names are tied to their impact factors. So when reading a list of publications, what we are really doing (and most likely everybody does it, consciously or not) is compiling a list of impact factors. Obviously, this is not ideal.

Excellent papers are published in specialist journals whose impact factors are modest only because the topics they cover are not read by a broad audience.

One of the most recent examples is Peter Ratcliffe, co-recipient of the 2019 Nobel Prize in Physiology or Medicine, whose Nobel-winning paper was rejected by Nature. The journal’s response was that his research “would be better placed in a more specialized journal”.

The impact factor was never meant to be used like that.  

So what can we do about that? Initiatives such as the San Francisco Declaration on Research Assessment (DORA) encourage institutions not to use the impact factor in their evaluation procedures.

At the level of the individual researcher, there is something simple each one of us can do: remove journal names from the list of publications on your CV. No journal names, no (implicit or explicit) link to the corresponding impact factor. People reading your CV will be much more likely to read your paper titles. 

But would removing journal names from our CVs, by itself, really improve how our work is evaluated when we apply for jobs and grants? Probably not: with nothing left but titles, reviewers might simply count papers, which could drive us to publish as many as possible, even in predatory journals, just to bulk up the publication list. So that alone would not work either. You want reviewers to judge the quality, not the quantity, of your work. Ideally, reviewers should read your papers to know whether your research is actually good! Unfortunately, nobody has that sort of time when evaluating dozens of grant proposals in a short time span (one might argue, with reason, that this is a more fundamental problem with our funding system, but that is a different story). So what can we do to make the life of panel members and reviewers easier?

The Meaningful CV

The good news is that your publication list can contain a lot more information than just the titles of your papers. Many paper-level metrics exist that can be leveraged to enrich your publication list and make it more meaningful, and open web platforms make it easy to gather this information. Here are some examples:

  • The link to the actual paper, using its DOI.
  • The number of citations of the paper, which can be obtained via Crossref.org or Dimensions.ai.
  • The online attention score (or Altmetric score), which can be obtained via Altmetric.com.
  • The Field Citation Ratio (FCR): the number of citations of your paper relative to papers of the same age in the same field of research. It can be obtained on Dimensions.ai.
  • The Relative Citation Ratio (RCR): a similar field-normalized metric, which compares the citations of your paper with those of other papers in its field (defined through its co-citation network). It can also be obtained on Dimensions.ai.

I particularly like the last two metrics. They quickly give an idea of how well your papers were received within your field. If your paper has an FCR above 1, for instance, it has been doing better than the average paper of the same age in your field. Quick and easy.
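If you like to tinker, the citation count and the Altmetric score can be pulled automatically from free public endpoints: the Crossref REST API and Altmetric’s details API. The sketch below (in Python) shows the idea; the paper_metrics helper and the example DOI are my own illustrations, not part of any official tool, and the Dimensions metrics (FCR, RCR) are left out because they require access to the Dimensions API.

```python
# Minimal sketch: query the public Crossref and Altmetric APIs for one DOI.
# Assumptions: the free Crossref REST API (no key needed) and Altmetric's
# free, rate-limited details endpoint.
import requests


def paper_metrics(doi: str) -> dict:
    """Return title, citation count (Crossref) and Altmetric score for a DOI."""
    metrics = {"doi": doi}

    # Crossref: metadata and citation count live under the "message" key.
    r = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    if r.ok:
        msg = r.json()["message"]
        metrics["title"] = (msg.get("title") or [""])[0]
        metrics["citations"] = msg.get("is-referenced-by-count", 0)

    # Altmetric: the free endpoint returns 404 if the paper has no
    # online attention recorded yet, so treat that as a score of 0.
    r = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    metrics["altmetric_score"] = r.json().get("score", 0) if r.ok else 0

    return metrics


print(paper_metrics("10.1038/s41586-020-2649-2"))  # example DOI
```

Note that Crossref asks frequent users to identify themselves with a contact e-mail, and the free Altmetric endpoint is intended for occasional, non-commercial lookups.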

You can even go one step further: you can add quotes from reviewers or from other scientists about your papers. Your research was featured in a news outlet? Add it to the list! Any information regarding the impact of your work should be added to your publication list. Make it as meaningful as possible!

In practice

This all sounds nice, but how do you do this in practice? Here, I would like to share my own experience with the concept. 

I stopped adding journal names to my CV around 2014, when I was a postdoc (I obtained my PhD in 2012). Since then, I have tried to incorporate more and more information in my publication list. I also usually add a brief explanation of the different metrics, for the uninformed reader. 


What about grant proposals and job applications? Well, I try to stick to the same strategy as much as possible. I did not add journal names to my CV when I applied for my current position, and it did not seem to affect my interview. I do not add them when I apply for grants as an individual (rather than as part of a consortium), and it does not seem to negatively affect my funding rate. I do add them when applying with other colleagues or when it is required. In Belgium, for instance, your list of publications (including journal titles) is automatically generated from the institutional repository when you apply for a national grant.

Did I ever get any negative feedback regarding the absence of journal names? None that I know of, but I have heard that it surprises people. Is it risky to do so? Maybe. If the person reading your CV is a strong supporter of journal prestige, not having journal names may not be viewed positively. But again, that is why I try to add as much meaningful information as possible. Would I recommend that Early Career Researchers do the same? Yes, but I would first make sure that they know the potential consequences. Would I recommend that established researchers with a permanent position do the same? Most certainly! They have the least to lose and are the most likely to influence others within the community.

One might wonder whether not adding journal names means I will publish in just any journal. The answer is no. I do care about the quality of a journal (based on the research it publishes overall, not on a single metric), its audience, whether it is Open Access, and whether it is a society journal. Removing journal names from my publication list did not change the way I publish. I still try to find the best, most appropriate venue for my papers. I do value journals (at least some of them) and the services they provide. I just want to be judged on my own research, not on the research of other people (which is, ultimately, what the impact factor measures).

How to start? 

Do you want to join the fight against the impact factor? We developed a tool to make it easier for you to do so. The complicated part of building your own “meaningful CV” is that the information comes from different sources and needs to be updated frequently.

To make this step easier, we built a small web app to help researchers gather their paper-level metrics: https://plantmodelling.shinyapps.io/meaningful_cv/. All you have to do is enter the DOIs of your papers and the app will retrieve the different metrics for each of them. Then, just add them to your CV!
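If you would rather script this step than use the web app, the same idea fits in a few lines. The sketch below (Python, reusing the hypothetical paper_metrics helper from earlier) turns a list of DOIs into ready-to-paste publication-list lines; the layout of the output is just one possible choice.

```python
# Possible DIY alternative to the web app: turn a list of DOIs into
# publication-list lines, reusing the paper_metrics() sketch from above.
dois = [
    "10.1038/s41586-020-2649-2",  # replace with your own DOIs
]

for doi in dois:
    m = paper_metrics(doi)
    print(f"• {m.get('title', doi)}. https://doi.org/{doi} "
          f"[{m.get('citations', 0)} citations, "
          f"Altmetric score {m.get('altmetric_score', 0)}]")
```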

Happy fighting!


About the Author:

Guillaume Lobet is an Assistant Professor at the Forschungszentrum Jülich in Germany and the Université catholique de Louvain in Belgium. Follow him on Twitter @guillaumelobet.



