Since this is my very first post, it seemed apt to begin with primacy. As most of you probably know, we often find it easier to remember things we have just learned (recency items) or things we learned at the very beginning of a session (primacy items). For example, imagine a friend takes you to see a band you don’t know (you have never listened to any of their music)*; they play a set of twenty songs. Which songs do you think you are more likely to be humming to yourself on the way back home that evening? And which will you be remembering the day after in the shower? Chances are you will have the very last song played in your head as soon as you leave the venue – this is the recency effect – but, after some time, the early songs may start to come back to you thanks to the primacy effect. Not coincidentally, bands tend to open and close sets with popular favourites so that you remember the show more fondly.
*A caveat here is that if you already know the band’s material fairly well, then you are likely to remember each song very well (and to already possess multiple memories of each song), so the serial position effect is not likely to apply.
One reason the primacy effect is quite interesting – among other things! – is that loss of a primacy advantage tends to accompany the emergence of Alzheimer’s disease. To put it simply (and simplistically), the fact that our memory gets a bit poorer as we age is fairly normal: we do lose a bit of this and that as we age (although, we do gain other things – but that is for another post). However, if what we lose is memory for primacy items, then that might be a sign of an emerging problem. And I quote myself: “This specific loss of memory could be one of the first signs that memory is declining in a way that would cause concern.” My quote is taken from this link.
The study I am talking about in the video (what video? you ask; you have to click on the last link!) was published earlier this year in the Archives of Clinical Neuropsychology, and was reported on here as well. In essence, we followed just over 200 volunteers over a few years to see which aspects of their initial (first-visit) memory performance were predictive of cognitive decline. For this purpose, we only recruited participants who did not have dementia and were cognitively intact at the time of the first test. Our results show that the best predictor of cognitive decline was a poor primacy effect in delayed memory performance. In other words, we found that those participants who did not show a primacy effect for newly learned words after a time delay (about 20 minutes later) were also more likely to show cognitive decline over the following visits.
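For the more computationally minded, here is a minimal sketch of how a primacy score might be derived from a delayed word-list recall test. Everything here is an illustrative assumption – the word list, the three-item "primacy" and "recency" windows, and the proportion-recalled scoring rule are mine for the example, not the actual protocol or scoring used in the study.

```python
# Hypothetical sketch: scoring primacy vs. recency in delayed recall.
# The list, window sizes, and scoring rule are illustrative assumptions,
# not the study's actual protocol.

word_list = ["apple", "river", "candle", "stone", "piano",
             "garden", "mirror", "cloud", "ladder", "violin"]

# Words a (hypothetical) participant produces ~20 minutes after learning.
delayed_recall = {"apple", "river", "violin", "mirror"}

def region_score(positions):
    """Proportion of words at the given list positions that were recalled."""
    words = [word_list[i] for i in positions]
    return sum(w in delayed_recall for w in words) / len(words)

# First three items vs. last three items of the list.
primacy = region_score(range(0, 3))
recency = region_score(range(len(word_list) - 3, len(word_list)))

print(f"primacy score: {primacy:.2f}")  # 2 of the first 3 items recalled
print(f"recency score: {recency:.2f}")  # 1 of the last 3 items recalled
```

In a scheme like this, a participant whose primacy score sits at or near zero after the delay – while overall recall still looks normal – would be the kind of profile the study flagged as predictive of later decline.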
Going back to our musical example above, we could call this the day-after-shower effect: not humming the first few songs in the shower the day after the show is not just a disservice to the band, but might also suggest some cognitive decline.
Here is a link to the article page on the journal’s website.
To conclude this first post, I should like to say: please, do not take all this too literally and freak out if you forget what you had as an appetiser the night before; that is not how this type of testing works. A diagnosis of dementia is made by collecting many different kinds of evidence and by conducting several tests, over time. The benefit of studies like the one reported here does not come from finding a test that magically unlocks the secrets of Alzheimer’s disease, but from providing new and more detailed information. That way, we can fine-tune our tests, make them more sensitive, and improve our chances of diagnosing the disease early enough for prompt intervention.
To quote myself once again: “Anything that provides us with more information and better predictors to identify dementia is definitely a step forward in the research into the disease as well as the care and treatment of sufferers.”