
January 3, 2026
Why I don’t simply accept the smackdown articles on the Quiet Revival
By Simon Cross
The Quiet Revival report, published in 2025, has been the cause of much discussion for nearly a year now.
Whether you find its findings hard to believe, encouraging, mystifying, or some combination of the three, there’s no doubt that it has generated debate – and that, surely, has to be a good thing.
My sense is that the authors themselves were somewhat unprepared for the response they got – so far as I can see they had no particular axe to grind; rather, they were analysing data which was as surprising to them as it was to the rest of us.
In recent days I’ve read a couple of, to put it crudely, ‘smackdown’ articles that seek to undermine the credibility of the data on which the report is based.
One is written by Dr Conrad Hackett of Pew Research – and let’s be clear, he knows what he’s talking about when it comes to research. His article, which he linked in a comment on my article last week about this event, argues that the survey results ‘may be misleading’.
In another article, which follows on somewhat from Hackett’s piece, Humanists UK argue that British Social Attitudes Survey data undermines the YouGov polling (the BBC’s More or Less programme engaged with this too).
The Humanists UK article states: “churchgoing, including among Gen Z, has continued its long-term decline. The findings are consistent with the Church of England’s and Catholic Church’s own church attendance records.”
I find myself in two minds here – on the one hand, I’m not necessarily a fan of the term ‘revival’ without some very clear definitions. Nor do I welcome the joy with which this data has been seized upon by some on the political and cultural right. But neither am I convinced by these smackdowns.
I am, perhaps, an agnostic when it comes to the Quiet Revival.
One of the things that has concerned me about the smackdown articles is the seizing upon the methodology of the survey as its ‘fatal flaw’. Both Dr Hackett and the Humanists UK writer(s) see the ‘opt-in’ survey data on which it is based as entirely unreliable, compared with the ‘gold standard’ of random sample surveys.
For the uninitiated, the basic difference between these two methodologies is that one relies on people choosing to take part in a survey, and the other relies on researchers selecting the people to be surveyed.
Critics of the ‘opt-in’ (first) approach say that it is unreliable because it is vulnerable to various problems, such as fraudulent responses, the use of bots, and targeted recruitment within particular groups (e.g. youth leaders urging all their young people to take part in the survey, and thereby skewing the result).
There are, of course, legitimate concerns about opt-in surveys, but let’s be clear: not all opt-in surveys are equal.
I should clarify that I am not an expert – I’ve both carried out and taught qualitative and quantitative research, but only to a limited degree. I’m not to be counted among the Conrad Hacketts of this world. However, I’ve had some training and done some teaching, so I know a little.
One thing I feel quite clear about is that binary claims of “random sample = truth” and “opt‑in = unreliable” are much too simplistic. Opt-in surveys can be reliable, or at least useful, if they are suitably controlled. I’ve not seen any evidence, or indeed claims, that the opt‑in surveys used here failed to use techniques that would safeguard the validity of their data.
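To make “suitably controlled” a little more concrete, here is a toy sketch of one standard control: weighting respondents back to known population shares so that an over-recruited group doesn’t dominate the result. The numbers below are entirely invented for illustration – they are not drawn from the Quiet Revival report, nor am I claiming this is YouGov’s exact procedure.

```python
# Toy illustration only: invented numbers, not the report's data or YouGov's method.
# It shows how weighting an opt-in sample back to known population shares
# can correct for one age group being over-recruited.

population_share = {"18-24": 0.10, "25+": 0.90}  # assumed population shares
sample_counts = {"18-24": 400, "25+": 600}       # opt-in sample over-recruits the young
attenders = {"18-24": 120, "25+": 90}            # respondents reporting regular attendance

# Naive estimate: share of attenders in the raw sample.
naive = sum(attenders.values()) / sum(sample_counts.values())

# Weighted estimate: each group's attendance rate, scaled by its population share.
weighted = sum(
    population_share[g] * attenders[g] / sample_counts[g]
    for g in sample_counts
)

print(f"Unweighted estimate: {naive:.1%}")   # 21.0%
print(f"Weighted estimate:   {weighted:.1%}")  # 16.5%
```

The point is not the particular figures, but that well-run opt-in panels apply this kind of correction as a matter of routine – which is why “opt-in, therefore worthless” is too quick a conclusion.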
There are, moreover, positive strengths to opt-in surveys – particularly when it comes to looking for indicative data. Using opt-in methodology can, for example, help to recruit participants from younger age groups who are rarely captured by other means.
At the same time, random sample surveys have well-known weaknesses or limitations, ranging from mode effects (what difference does the way the survey is carried out make?) to social desirability bias (“I’ll say what I think will make me look good to you.”). Random samples do not always manage to properly represent a wide demographic, either.
Basically, what I’m saying is that the assertion that all opt‑in surveys are void because they are equally vulnerable to bogus respondents is very difficult to support.
So the ‘smackdown’ approach that says – “this is bogus because it relies on opt-in surveys” – is unconvincing.
There are, too, other problems with these takedowns. Partly these have to do with the way we think about what it means to be ‘religious’ in contemporary society. (I’m conscious we’re not working with a clear definition of ‘religion’ in this article either, but let’s let that pass for now.)
My long-held view is that we live in a post-secular society, where people can, and do, say, for instance, that they are atheists and that they believe in angels. The old categories are hard to sustain now.
Newspaper reports about the Quiet Revival have tended to argue that a “Christian resurgence” leads directly to clear change in religious identity and, ultimately, to the holy grail of regular church attendance.
I think this is a mistake.
A resurgence in Christianity in post-secular society doesn’t necessarily mean a reversal of the church decline trend. Those two things are not the same.
If you find this area interesting, whatever your perspective – and their names are legion for they are many – you should come to the Quiet Revival Symposium.
Online or in person – it’s going to be a good day. And for anyone wondering if the methodology argument will be put forward, Dr Patricia Tervington, a Pew Research expert, will have the opportunity to do just that, if she so chooses… watch this space.
Simon Cross is a recent MC Forum contributor, the chair of the Progressive Christianity Network and publishes regularly on simonjcross.substack.com, where this article was originally published.
Modern Church will be represented at the Quiet Revival Symposium.




