It’s been a while since I did some navel-gazing about who reads this blog and where they come from. This week, quantixed is close to 25K views and there was a burst of people viewing an old post, which made me look again at the visitor statistics.
I was talking to a speaker visiting our department recently. While discussing his postdoc work from years ago, he told me about the identification of the sperm factor that causes calcium oscillations in the egg at fertilisation. It was an interesting tale because the group who eventually identified the factor – now widely accepted as PLCzeta – had earlier misidentified the factor, naming it oscillin.
My post on the strange data underlying the new impact factor for eLife was read by many people. Thanks for the interest and for the comments and discussion that followed. I thought I should follow up on some of the issues raised in the post. To recap: eLife received a 2013 Impact Factor despite publishing only 27 papers in the last three months of the census window. Other journals, such as Biology Open, did not.
Note: this is not a serious blog post. Neil Hall’s think piece in Genome Biology on the Kardashian index (K-index) caused an online storm recently, spawning hashtags and outrage in not-so-equal measure. Despite all the vitriol that headed Neil’s way, very little of it concerned his use of Microsoft Excel to make his plot of Twitter followers vs total citations!
When it comes to measuring the impact of our science, citations are pretty much all we have. Not only that, but they say just one thing – yeah – with no context. How can we enrich citation data? Much has been written about how and why and whether or not we should use metrics for research assessment.