Sunday 18 July 2010

Looking back on 25 years in science... I wasn't actually doing science

I began actually *doing* science in 1984 (a significant date, perhaps) - or, at least, that's what I thought I was doing.

I have worked in and across a variety of fields: neuroendocrinology in relation to psychiatry, the adrenal gland (especially from 1989), epidemiology (from about 1991), evolutionary psychology (from 1994), systems theory (from about 2001)...

***

In all of these areas and others I found serious problems with the existing scientific literature: errors, inconsistencies, wrong framing of problems. And after all, this is what science is supposedly about, most of the time - providing the negative feedback to correct the wrong stuff. (Providing new lines of work is what we would prefer to do, but most people never do or did achieve this.)

My assumption was that - as the years rolled by - I would have the satisfaction of seeing the wrong things tested and discredited, and replaced with more-correct information. So that overall, and in the long term, science would progress. That is what was supposed to happen.

Well, it hasn't happened.

When I look at these areas of science now, and compare them with how they were when I entered them, I see no progress whatsoever: the problems, errors, misunderstandings and stupidity which were there years ago are still there - in many cases amplified and elaborated. In many cases things are in a much worse state than they were when I first entered the field: error and falsehood have not been suppressed or corrected, but have instead thriven.

There is no evidence of progress.

***

So, I must conclude that although I believed I was participating in something called science, something that I believed I understood from the writings of Bronowski, Popper, Hull - I wasn't really doing science at all.

I was 'going through the motions' of doing science, but the machinery was broken, and the work I was trying to do, and the work of those whom I respected, was a free-spinning-cog: maybe it was published, maybe it was cited, maybe it was funded, maybe people made careers from doing it - but in the end it was just a private hobby. It did not make any difference. We were deluded.

We could perhaps console ourselves by hoping that we have left a residue of analysis or data in 'the scientific literature'; but with 25,000-plus journals, it really is a needle in a haystack. And isolated fragments of analysis and data cannot really be understood if they do happen to be rediscovered; cut off from the discourse which generated them, bits and pieces of science don't make sense to an outsider.

Now it might be argued that although science was not really going-on in the bits I knew, it *was* going on elsewhere; and that is true. To some extent, probably a very small, shockingly small, extent. But it really is scary to contemplate how whole areas of science (and I have known several) can chug-along for decades, with their participants busily doing... *stuff*; all of them apparently thinking, believing that they are doing science, when they are in fact doing no such thing.

***

What *are* they doing? - What were we doing, in those branches of science in which I participated? Glass Bead Games spring to mind (from the novel by Hermann Hesse) - in the sense of intellectual exercises wholly detached from reality; but really that is far too elevated and elite a concept for the industrial-scale drudgery of Big Science. It is reasonable to consider something like top-level string theory, or analytic philosophy, or even postmodern literary theory as true Glass Bead Games – but not the millions of drones in medical research or physics.

Mainstream Big Science is most reminiscent of a Soviet Union era organization – such as the grossly unprofitable glass factory I saw on TV being inspected by Sir John Harvey-Jones in his TV show Troubleshooter. It was producing vast amounts of defective drinking glasses which nobody wanted to buy or to use – and was simply piling them up in gigantic stacks around the building – wasting everybody's time and taking up all the useful space.

When Harvey-Jones was asked what to do, how to make the business profitable, he said the first essential step was to STOP MAKING THE GLASSES. Now: this minute. Switch-off the production line, send everybody home (on continued full pay, if necessary), and *then* begin sorting it out.

But so long as the workers were coming in-and-out and beavering away; the paperwork was being completed (in triplicate); and the masses of defective glasses were being churned-out, piling-up and blocking the aisles and preventing anything useful being done – there was no hope.

***

This is the problem of science today – it has been bloated by decades of exponential growth into a bureaucratically-dominated heavy industry: a Soviet-style factory characterized by vastly inefficient mass production of shoddy goods. And it is trundling along, hour by hour, day by day; masses of people going to work, doing things, saying things, writing things…

Science is hopelessly and utterly un-reformable while it continues to be so big, continues to grow-and-grow, and continues uselessly to churn out ever-more of its sub-standard and unwanted goods.

Switch it off: stop making the defective glasses: now...

Saturday 17 July 2010

Solzhenitsyn on humanism

“As long as we wake up every morning under a peaceful sun, we have to lead an everyday life. There is a disaster, however, which has already been under way for quite some time. I am referring to the calamity of a despiritualized and irreligious humanistic consciousness.

"To such consciousness, man is the touchstone in judging and evaluating everything on earth.

“Imperfect man, who is never free of pride, self-interest, envy, vanity, and dozens of other defects. (…)

“If humanism were right in declaring that man is born to be happy, he would not be born to die.

“Since his body is doomed to die, his task on earth evidently must be of a more spiritual nature. It cannot be unrestrained enjoyment of everyday life. It cannot be the search for the best ways to obtain material goods and then cheerfully get the most out of them. It has to be the fulfillment of a permanent, earnest duty so that one's life journey may become an experience of moral growth, so that one may leave life a better human being than one started it. (…)

“Only voluntary, inspired self-restraint can raise man above the world stream of materialism. (…)

“If the world has not come to its end, it has approached a major turn in history, equal in importance to the turn from the Middle Ages to the Renaissance. It will exact from us a spiritual upsurge, we shall have to rise to a new height of vision, to a new level of life where our physical nature will not be cursed as in the Middle Ages, but, even more importantly, our spiritual being will not be trampled upon as in the Modern era.

“This ascension will be similar to climbing onto the next anthropologic stage. No one on earth has any other way left but -- upward.


Alexander Solzhenitsyn – Harvard Commencement Address 1978

http://www.columbia.edu/cu/augustine/arch/solzhenitsyn/harvard1978.html

The deep satisfactions in life - Charles Murray, again!

"First, the problem with the European model, namely: It drains too much of the life from life. (...)

"I start from this premise: A human life can have transcendent meaning, with transcendence defined either by one of the world’s great religions or one of the world’s great secular philosophies. If transcendence is too big a word, let me put it another way: I suspect that almost all of you agree that the phrase “a life well-lived” has meaning. That’s the phrase I’ll use from now on.

"And since happiness is a word that gets thrown around too casually, the phrase I’ll use from now on is “deep satisfactions.” I’m talking about the kinds of things that we look back upon when we reach old age and let us decide that we can be proud of who we have been and what we have done. Or not.

"To become a source of deep satisfaction, a human activity has to meet some stringent requirements. It has to have been important (we don’t get deep satisfaction from trivial things). You have to have put a lot of effort into it (hence the cliché “nothing worth having comes easily”). And you have to have been responsible for the consequences.

"There aren’t many activities in life that can satisfy those three requirements. Having been a good parent? That qualifies. A good marriage? That qualifies. Having been a good neighbor and good friend to those whose lives intersected with yours? That qualifies. And having been really good at something—good at something that drew the most from your abilities? That qualifies.

"Let me put it formally: If we ask what are the institutions through which human beings achieve deep satisfactions in life, the answer is that there are just four: family, community, vocation, and faith. Two clarifications: “Community” can embrace people who are scattered geographically. “Vocation” can include avocations or causes.

"It is not necessary for any individual to make use of all four institutions, nor do I array them in a hierarchy. I merely assert that these four are all there are. The stuff of life—the elemental events surrounding birth, death, raising children, fulfilling one’s personal potential, dealing with adversity, intimate relationships—coping with life as it exists around us in all its richness—occurs within those four institutions.

"Seen in this light, the goal of social policy is to ensure that those institutions are robust and vital. And that’s what’s wrong with the European model. It doesn’t do that. It enfeebles every single one of them."


Charles Murray. The Europe Syndrome and the Challenge to American Exceptionalism

http://www.american.com/archive/2009/march-2009/the-europe-syndrome-and-the-challenge-to-american-exceptionalism


***

Comment:

Murray is, as usual, very insightful here.

His basic argument about modernity subverting the deep satisfactions of life is surely correct. People nowadays have difficulty in ‘doing good’ since the state has taken over all good-doing, and made ‘good’ into procedural bureaucracy, empty of meaning.

His argument about the need for transcendence for life to have meaning is also right. As Kurt Vonnegut memorably demonstrated in Breakfast of Champions, if you really try to live by the belief that humans are a collection of chemicals, you will be driven crazy by contradictions to which there is no solution but only distraction.

Yet Murray is trying to make a *secular* argument for the need for transcendental meaning; and the points made, even when true, tend to subvert belief in transcendence by making it expedient rather than true.

I have done the same myself, on numerous occasions. In trying to build bridges between the secular and religious, we (inadvertently) frame belief in transcendence as a means to achieving secular ends.

Perhaps this is an unavoidable hazard – but it is necessary to keep remembering the hazard, and counteracting it.

Chargaff on a scientific career

“How is science done in our days?

“Here I must immediately make a distinction between science as a profession and science as the expression of some of the faculties of the human mind. The two are not necessarily connected.

“When someone tells me 'I am a professional scientist', it does not automatically mean that he is a scientist.

“The distinction I am suggesting here has nothing to do with the question of talent. There have always been more or less gifted scientists, and there were even a few, very few, scientific geniuses. But what I want to bring out is that as a profession science is one of the more recent ones. It barely existed when I began my studies. (...)

“One entered a career in science, just as in history or philosophy, by trying to become a teacher at a college or even a high school. There were very few jobs, and almost none that paid enough to live on, except for the position of the professor himself. And there was usually only one professor for a discipline.

“Hence the old students' saying that there were only two ways to make a university career: per anum or per vaginum. You tried to become the professor's darling or you married his daughter. Obviously, this limited the choice: some professors were very nasty, some daughters were very ugly. Girl students were altogether out of luck, but there were only a few of them.

“You may conclude - and you are right – that this was a most unpleasant system. But it had one advantage: it acted as a sieve, letting through the few who could not do otherwise. By requiring what amounted to a pledge of poverty, it kept out all those who, to use a nasty term, were not ‘highly motivated’. It produced a slightly smaller number, but probably a much higher density, of good scientists than does the present system.

From Heraclitean Fire (1978) by Erwin Chargaff (1905-2002)

***

Comment: Chargaff was the best writer among scientists (of any that I have encountered). He was also exceedingly wise. And he was pessimistic.

Because of this combination, Chargaff's writing must be approached with considerable caution.

Friday 16 July 2010

The submissive flaccidity of modern secular hedonism

I was always puzzled by the submissive flaccidity of modern Western societies: the way that - although they live to maximize gratification and minimize suffering - they will in practice do nothing to protect their future happiness nor to defend against future suffering.

But the reason is encapsulated by "Charlton's Law": "Things must always get worse before they can get better; because otherwise they already would be better."

When a beneficial policy is a win-win option, then it gets done automatically, and we don't need to think about it - probably we don't even notice it.

But most beneficial policies have a down-side. Typically, long-term benefit can be attained only at the cost of short-term disadvantage or suffering of some kind, to some people.

So the hedonic secular goal of making life *overall* as pleasant as possible in the *long-term* is continually being subverted by *short-term* and *specific* gratifications.

The hedonic ideal has reached such an extremity among the ruling elites that they pursue policies which will in the long term lead to lifestyles that they regard as miserable and abhorrent, because effectively to prevent these outcomes makes them feel bad now.

In other words, secular hedonism cannot take tough decisions.

***

A tough decision is precisely a decision in which the correct decision leads to short term harm.

I first recognized this dilemma in medicine, where it is often the case that in order to make a person (probably) feel better overall in the long term, they must suffer immediate and certain short-term misery: for example, surgery. Surgeons live with this on a daily basis, and consequently to be a good surgeon requires a 'tough' attitude.

Of course surgery requires many other things too, and most tough decisions are bad - but the point is that someone who was psychologically unable to make tough decisions, but always sought to maximize the immediate comfort and well-being of patients and to take minimum risk, would be a bad surgeon.

Modern society is *soft* in precisely this fashion - its rulers have lost the ability to take tough decisions: to seek long-term benefits when these come at the price of short-term costs to themselves.

The ultimate reason is, I believe, that humans can only make tough decisions when these are supported by *transcendental aims*, in the sense that humans do not want to forgo short term gratification in this world unless life is about something *more* than gratification – and where non-worldly realities (God, heaven, truth, beauty etc.) are seen as more real and more enduring than immediate gratification - and therefore more important.

***

If human life is (as secular modernity asserts) ultimately about gratification (about maximizing happiness and minimizing suffering) then it will always seem tempting to take the short-term choice leading to immediate and certain happiness and avoid immediate and certain suffering; and to ignore the long-term consequences of these choices on the basis that the future cannot be known with certainty, and we might be dead anyway before the future arrives.

The resulting mentality is characteristic of the modern secular elite, but has spread to encompass much of contemporary life. Charles Murray has encapsulated this modern ‘sophisticated’ attitude very well: “Human beings are a collection of chemicals that activate and, after a period of time, deactivate. The purpose of life is to while away the intervening time as pleasantly as possible.”

My point is that a society which regards the purpose of life as being to while away the time between birth and death as pleasantly as possible is a society which cannot make tough decisions. It is a society which will always take the easy way out, will pursue short-termist and certain benefits, and which will therefore always submit to its enemies - because to resist enemies makes life less pleasant than to appease them.

Even to recognize the reality of threats and enemies is unpleasant, distressing, generative of negative emotions such as fear and anger – better if we can pretend that threats and enemies are harmless or benign, really; and the only truly nasty people are those who make us feel bad about ourselves, here and now…

***

So a society that values nothing higher than a pleasant life and which will seek the pleasant life wherever and whenever possible will be morally flaccid in face of opposition, will appease rather than resist, will submit rather than fight, and will therefore end-up being ruled by its most relentless and long-termist enemies - and by having an extremely un-pleasant life.

This is why secular modernity cannot survive: because it enshrines the worldly enlightened self-interest of submissive flaccidity as its ultimate form of rational, sensitive moral behavior.

Thursday 15 July 2010

Charlton's Law: Things must always get worse before they can get better...

Charlton's Law:

Things must always get worse before they can get better;

Because otherwise they already would be better.

Comments enabled, tentatively

I have tentatively enabled moderated comments, as an experiment.

However, I will only allow comments that I judge suitable for general public consumption.

Otherwise readers with something to say are welcome to e-mail me at the address provided.

Wednesday 14 July 2010

Proportion of private school kids applying to college is about 18%, not 7%.

Dishonest or statistically-incompetent state education propagandists in the UK mass media and 'research' (actually leftist propaganda) outfits such as the egregious Sutton Trust, have relentlessly propagated the factoid that 'only 7 percent' of UK school kids attend private schools.

e.g. http://www.suttontrust.com/news.asp

From this statistic they argue that, if the admissions system were fair, only 7% of those admitted to university, to professions such as medicine or law, or to postgraduate degrees should be privately educated.

And they state that any higher percentages of privately educated people found in a desirable institution implies that the system is unfair and biased in favour of the privately-educated (i.e. this statistic is used as ammunition in typical class warfare rhetoric, designed to create anger, envy and resentment).

However, seven percent is wrong; a deliberately-misleading statistic in relation to university applications.

University applicants come from people aged about 17 and over, so the relevant statistic is the proportion of UK children in the 'sixth form' i.e. those aged approximately 16-19.

The number of little children in private primary schools is irrelevant to university applications.

But the proportion of privately-educated children of university application age is actually 18 percent, not seven percent.

***

From: http://www.isc.co.uk/FactsFigures_PupilNumbers.htm

"Pupil Numbers

"Pupils in ISC schools account for around 80% of the total number of pupils in independent schools in the UK. The UK independent sector as a whole educates around 628,000 children in around 2,600 schools. The independent sector educates around 6.5% of the total number of schoolchildren in the UK (and over 7% of the total number of schoolchildren in England) with the figure rising to more than 18% of pupils over the age of 16.

"There are now 511,886 children in ISC schools in the UK and Republic of Ireland. Of these:

* 67,856 are boarders and 444,030 are day pupils
* 261,051 are boys and 250,835 are girls
* 44,792 are nursery age (0-4)
* 158,631 are primary age (5-10)
* 223,178 are senior age (11-15)
* 85,285 are sixth form age (16-19)

***

So the proportion of privately-educated applicants to universities is about 18 percent, not 7 percent, and this is more than double the media-quoted proportion.

The reason is simple enough - parents are more likely to send their children to private school the older the children get. Parents are less concerned with private education at primary level than at secondary level - the older the child, the more likely they are to be transferred to a private school - while parents with children at private school will rarely transfer their kids to the state sector except for financial reasons.

Therefore, *if* private school kids were of equal ability to state school kids, an unbiased selection system would admit approximately 18 percent of them to top universities.

But because private schools are selective (to a greater or lesser degree) they have on average better-performing pupils. This means that although about 18 percent of university applicants come from private schools, a higher proportion than 18 percent would be expected to gain admission to universities; and the proportion would become higher as universities became more selective.

This explains how, on a fair system, private school pupils making up about one in five applicants to the most selective UK universities can lead to around one in two or three of the pupils admitted being privately educated. The same applies to highly selective professions such as medicine and law.
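This selection arithmetic can be sketched numerically. The figures below are hypothetical illustrations, not real data: the only point being made is that when roughly 18 percent of applicants are drawn from a higher-scoring group, a selective cut-off admits far more than 18 percent from that group.

```python
import random

random.seed(42)

# Hypothetical illustration: 18% of applicants are privately educated and,
# by assumption, score somewhat higher on average (private schools being
# academically selective). The means and spreads are invented numbers.
N = 100_000
applicants = []
for _ in range(N):
    private = random.random() < 0.18
    mean = 65 if private else 55          # assumed score distributions
    applicants.append((random.gauss(mean, 10), private))

# A highly selective university admits the top 10% of all applicants.
applicants.sort(reverse=True)
admitted = applicants[: N // 10]

share = sum(1 for _, p in admitted if p) / len(admitted)
print(f"private share among admitted: {share:.0%}")
```

With these assumed distributions the privately-educated share of admissions comes out at roughly half - far above their 18 percent share of applicants - even though the selection itself is entirely blind to school type.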

Of course, the left does *not* want a *fair* university admissions system, but wants one that is government controlled and which is biased in favour of left voters - they want a system of preferences for their supporters.

So they lie about the numbers.

Remember: the proportion of private school educated kids aged 16 plus is about 18%, not 7%.

Byzantine Theocracy - in brief (Steven Runciman)

"[The Byzantine Empire's constitution] was based on a clear religious conviction: that it was the earthly copy of the Kingdom of Heaven. (…)

"It saw itself as the universal Empire. Ideally it should embrace all the peoples of the earth, who, ideally, should all be members of the one true Christian Church, its own Orthodox Church.

"Just as man was made in God’s image, so man’s kingdom on earth was made in the image of the Kingdom of Heaven. Just as God ruled in Heaven, so an Emperor, made in his image, should rule on earth and carry out his commandments.

"Evil had made its way into God’s creation, and man was stained with sin. But if the copy – the Greek word was mimesis, ‘imitation’ – could be achieved, with the Emperor and his ministers and counselors imitating God with His archangels and angels and saints, then life on earth could become a proper preparation for the truer reality of life in Heaven. (…)

"Constantine was lucky in having as his biographer and panegyrist Eusebius of Caeserea (….).

"According to Eusebius the triumph of history had now come, when the Roman Emperor had accepted the Christian message. He was now the wise king who was the imitation of God, ruling a realm which could now become the imitation of Heaven. (…)

"The king is not God among men but the Viceroy of God. He is not the logos incarnate but is in a special relation with the logos. He has been specially appointed and is continually inspired by God, the friend of God, the interpreter of the Word of God. His eyes look upward, to receive the messages of God. He must be surrounded with the reverence and glory that befits God’s earthly copy; and he will ‘frame his earthly government according to the pattern of the divine original, finding strength in its conformity with the monarchy of God’.

"…by and large, the Eusebian constitution survived in Byzantium down the centuries. It was never a legal constitution, so it could be adapted to suit the needs of the time. Roman traditions lasted on to temper it and remind the Emperor that while he represented God before the people, it was his duty also to represent the people before God.

"It never took root in the West, where it faded out when the practical power of the Empire declined. Western thought preferred the rival conception of Saint Augustine’s City of God.

"But to Byzantium it gave a sense of unity, of self-respect and of divine purpose that sustained the Empire to the last. (…)

"No form of government can survive for very long without the general approval of the public. (…) The ordinary man and woman in Byzantium believed their Empire to be God’s holy empire on earth, with the holy Emperor as representative of God before the people and the representative of the people before God.

"For eleven centuries (…) the theocratic constitution of the Christian Roman Empire was essentially unchanged.

"No other constitution in all the history of the Christian era has endured for so long."


From: Steven Runciman. The Byzantine Theocracy. Cambridge University Press, 1977.

***

Comment.

The Byzantine Empire was the most sustainedly-devout Christian society of history so far, and it also had the most enduring political constitution.

This combination gives Byzantium a unique status.

Byzantium is therefore deserving of particular study and reflection; and, for Christians, the essence of Byzantium might legitimately serve as an ideal aim for worldly human society.

Tuesday 13 July 2010

"Why not?"

"Why not?" is a phrase and attitude that is used to justify destruction of existing social institutions.

"Why not?" combines with a short attention span. When a wholly-conclusive answer to "Why not?" cannot instantly be provided in a single self-evidential sentence; then "Why not?" wins and carries the day.

"Why not?" is asked with an implicit assumption that anyone who disagrees thereby approves and supports all that has ever been wrong with whatever is being challenged.

By contrast, the future world opened-up by the application of "Why not?" can be painted wholly positively, with no significant disadvantages.

"Why not?" is an attack on moral comparison and policy realism.

"Why not?" feels itself to embody the free, fun-loving spirit of the bohemian counter culture.

"Why not?" includes everyone in its hopes.

Yet "Why not?" sees itself as a hard-nosed realism - those who try to explain exactly why not, are seen as engaged in fine-spun logic-chopping, or speculative prediction of the future. Anything less that complete and utter instant collapse is seen as inadequate to refute the immediate application of "Why not?".

"Why not?" represents infinite hope versus uncertain prediction.

The power of "Why not?" comes from the widespread social assumption in the mainstream media and among intellectual elites that the onus of proof lies upon those who assert "Not!"

"Why not?" is only applied, however, to selected and left-approved targets. Those who try to apply "Why not?" to leftist principles and practices are instantly identified as moral monsters.

When non-leftists try to use "Why not?" against the left, the most subjective, sensationalist, loose predictions of possible – probable - vast, ramifying harms and humiliations to approved groups are allowed to refute it.

"Why not?" in practice supports left-approved freedom, and crushes all other freedoms.

Monday 12 July 2010

Bronowski's habit of truth (now lost)

Jacob Bronowski (1908-1974) invented the term 'the habit of truth' to describe the fundamental and distinctive ethic of science: the main foundation upon which was built the success of science in providing the means (knowledge) for mankind to shape the natural world.

Bronowski emphasized this, since it was (and is) often imagined that science is a morally neutral activity. But although scientific knowledge is indeed morally neutral (and can be used for good or evil) the practice of science (including being a scientist) is indeed a moral activity - based on the habit of truth.

He argued that for science to be truthful as a whole it is not sufficient to aim at truth as an ultimate outcome, scientists must also be habitually truthful in the ‘minute particulars’ of their scientific lives. The end does not justify the means, instead the means are indivisible from the end: scientific work is ‘of a piece, in the large and in detail; so that if we silence one scruple about our means, we infect ourselves and our ends together’.

***

The idea is that, to be successful in terms of the test of shaping the natural world, scientists - in their scientific communications - must speak the truth as it is understood. Indeed, I think it likely that the social structure of science is at root a group of people who seek truth and speak truth habitually (and if or when they cannot be truthful, they are silent).

Bronowski perceived that societies which abandoned, indeed persecuted, the habit of truth – such as, in his time, the USSR and Nazi Germany – paid the price in terms of losing their ability to perceive or generate the underlying knowledge of reality which forms the basis of shaping the natural world. (Note – these were societies which had the habit of truth in science, and then lost it.)

This declining ability to shape the natural world was concealed with propaganda, but such concealment could only be temporary since the cause of the decline was strengthened by every attempt to deny it.

***

Having grown up under the influence of Bronowski (for good and for ill) and also this distinctive morality of science, I have witnessed at first hand the rapid loss of the habit of truth from science: at first an encapsulated loss, whereby scientists continued to be truthful with each other (that is, truthful in the sense of speaking the truth as they see it) while lying to outsiders (especially in order to get grants, promote their research, and to deflect criticism); the situation then degenerating swiftly to the final surrender, whereby scientists are no longer truthful even with themselves.

At the same time I have seen hype (i.e. propaganda) expand from being merely a superficial sheen added to real science in order to make it more interesting to the general public, to the present situation where hype defines reality for scientists (as well as everyone else) – where propaganda is so pervasive that nobody can know what – if anything – lies beneath it (there is, indeed, no ‘beneath’ since by now hype goes all the way through science from top to bottom).

At the end of his life, Bronowski saw this coming, in its early stages, and wrote an essay entitled The Disestablishment of Science about the need for science to be separated from the state. This was necessary, he argued, because the morality of government and the morality of science were so different.

***

As I understand it, Bronowski’s major distinction is between government’s focus on ‘expediency’ or direct short term capability – which is substantially power over human behavior by propaganda and coercion plus already-available and useable technology; and science’s indirect generation of long term capability – which is substantially the result of greater knowledge leading to greater efficiency (more power per person).

“the hidden spring of power is knowledge; and more than this, power over our environment grows from discovery.”

Bronowski assumed that enlightened self-interest (i.e. long-termism) would be a strong force to maintain the independence and honesty of science against its erosion by short-termist government expediency.

This assumption was indeed crucial to Bronowski’s philosophy – which was atheist and utilitarian. He needed to believe that humanity needed to be and *would be* rational, sensible and far-sighted in its self-management; that humanity sought capability as a primary aim (not as an accidental by-product); and he also needed to believe in the ‘democracy of intellect’: that humanity was intrinsically unified in terms of motivation and capability, so that science was basically comprehensible to all (or the mass of) humankind, and that the primacy of the habit of truth was also a universal aspiration.

The decades since have convinced me that Bronowski was factually wrong in several of his key assumptions, and this explains why the kind of rational ‘humanism’ Bronowski espoused has proven powerless to arrest the decline in the habit of truth – and has indeed been a major collaborator in the erosion (the apparatus of hype and propaganda is staffed mostly by rational humanists, and justifies itself and its activities using rational humanist reasoning).

***

At root, as I understand it, Bronowski’s validation of science was power: the increased power it gave humanity, which was undeniable in terms of the vast and cumulative reshaping of the world which could be seen from the industrial revolution onwards.

Bronowski hoped that this power would be disciplined and moralized by the discipline and morality which itself generated the power: that is, by science. So his vision was of a society based on science becoming organized according to the morality of science, and thereby sustaining that science upon which it depended.

For Bronowski, science was therefore validated by the power it created, and power as an aim was validated by (long term) domination (i.e. in the long term the most scientific society would also be the strongest).

As an auxiliary justification of this seeking after power, Bronowski brought in an ethic that mankind’s deepest desire and ultimate destiny was the perpetual expansion of power; he claimed to see this in the shape of history (the ‘ascent of man’).

This was indeed a moral principle for Bronowski – but in order to avoid the obvious problems of tyranny and aristocracy, he also needed to believe that the conditions for generating science (and power) were intrinsically ‘democratic’ – that in the long term the diffusion of power, the perpetuation of freedom, were two sides of the same coin of society becoming scientific in its mass.

***

From Science as a Humanistic Discipline:

“…science as a system of knowledge could not have grown up if men did not set a value on truth, on trust, on human equality, and respect, on personal freedom to think and to challenge, [… these are the] prerequisites to the evolution of knowledge.”

My perspective is that ‘men’ did not value these things, but scientists did – but that they are indeed prerequisites.

Mass scientific competence and the dispersion of political power among citizenry were assumed to be linked phenomena – and mass education in science (including the morality of science) was therefore the basis of both power and freedom.

It now seems to me that Bronowski was wrong about the wellsprings of human motivation, and was engaging in wishful thinking concerning the basis of viable human societies. He grossly underestimated the intrinsically human-oriented, short-termist, selfish, nepotistic character of human nature; and failed to see the rarity of mathematical and scientific modes of thinking.

Far from being universal, the scientific way of thinking and the habit of truth is so rare in human history and across human societies as to look like a local and perhaps temporary ‘sport’ rather than a fundamental property of mankind.

***

Bronowski was also wrong about the hoped-for tendency of the desire for power intrinsically to regulate itself in a long-termist fashion, and I regard his installation of power-seeking as a primary virtue as an instance of Nietzschean moral inversion – rather than an insight.

After all, the secular scientist (or humanist), for all his virtues, is very often a prideful egotist with an insatiable lust for status; and when he subscribes to an ethic of power he will often tend to justify himself as an instrument for the betterment of the human condition.

But the past decades have certainly confirmed that Bronowski was correct about the consequences of abandoning the habit of truth. Bronowski would have been utterly appalled at the pervasive, daily, habitual dishonesty of researchers (especially the leading researchers) in the largest and most dominant world science: medical science.

And as for the Lysenkoism of Climate Science… he might have been darkly amused at the defense of pervasive, deliberate, fundamental collusion and lying on the grounds (perfectly accurate!) that this was statistically *normal* behavior in modern science.

But the world did not heed Bronowski’s warnings in The Disestablishment of Science, and the outcome of science becoming dependent on government funding has been wholly in line with his direct predictions.

As he wrote in Science as a Humanistic Discipline: “… science is cut off from its roots, and becomes a bag of tricks for the service of governments.”

“A bag of tricks for the service of governments” – what a perfect description of a major, mainstream modern ‘science’!

Sunday 11 July 2010

Reflections on Charles Murray's Human Accomplishment and genius

Charles Murray's book-length quantitative analysis, Human Accomplishment, made a big impact on me. I 'brooded' over it for quite a while, especially over the summaries and speculations concerning the possible cultural causes of 'genius'.

I have always been interested in 'genius', and have read many biographies of geniuses in a range of endeavors.

And on the whole, I subscribe to the Great Man theory, by which specific individuals shape the course of history - some of these individuals do exceptional damage, others are exceptionally creative, while of course some do both.

And therefore I regard the ability of a society to produce potential Great Men and embody the conditions they need to make a difference, as a major influence on it - and this ability has been very unevenly distributed between societies across space and time.

For example, the modern world - the kind of society characterized by growth in scientific knowledge, technological capability and economic production, which took off in Great Britain and became apparent in the late seventeenth and early eighteenth centuries - and spread from there; this kind of society I believe was probably driven by the work of numerous Great Men (or 'geniuses') who produced qualitative advances (or 'breakthroughs') across a wide range of human activities.

I believe that these numerous breakthroughs required Great Men (i.e. an adequate supply of GM was necessary but not sufficient), but once made they could be exploited by ordinary men.

Most histories of society took the form of a series of mini-biographies. For example, Jacob Bronowski's Ascent of Man TV series was, for most of its length, focused on a series of specific individuals. And the implication was that this was not just a convenience for the purposes of teaching and entertainment, but an account of how things really, necessarily happened.

Up to the 1950s it was natural for Britons to focus on Great Men, since they were living among us, and each new generation brought forth a fresh supply - so many, indeed, that only a sample could become household names.

Then, from the mid-1960s onwards, people began to notice that the supply of GM seemed to be drying up. This went along with various fashions for denying the importance of GM in human history and attributing change to other forces (such as class); and with human affairs increasingly being organized bureaucratically, in ways that implicitly denied the need for GM, and indeed sought to replace human creativity and genius with explicit and predictable procedure.

By the mid 1980s I noticed that the last real English poets were dying and that there was nobody to replace them. For the first time in several centuries, there was not one single living poet of real stature.

Looking around, the same situation was looming in science - and by now there are just a few elderly remnants of previous generations who might be regarded as geniuses. Medical breakthroughs also began drying up at about this time (although there have not been many major medical geniuses, according to Murray’s lists).

So apparently the age of genius is over for Britain, which probably means the age of progress via multiple breakthroughs is over; and the same situation seems to prevail everywhere else - so far as I can tell. If genius was the driver of the modern world, this means that the modern world is also over (unless you believe that genius has now effectively been replaced by bureaucracy – Ha!).

Whatever it was that created the supply of geniuses and the conditions in which they could make breakthroughs has changed. I do not know whether there are still *potential* geniuses being born, but the whole motivational structure of society is hostile to genius and it is likely that individuals who would have grown to be potential Great Men in an earlier phase of society, nowadays have their motivation and sense of self poisoned. Instead of trying to achieve great things, such people would now probably pursue a great career, or would simply find themselves fish out of water.

I find myself ambivalent about this. Of course I vastly prefer a society conducive to genius to one being destroyed by bureaucracy. And if human history is conceptualized – as Bronowski conceptualized it – in terms of a story of progressively increasing power to shape nature (by increasing understanding of its underlying structure), then the prospect of a massive decline in human power is dismaying. It is also dismaying from the perspective of mass human happiness – the prospect of mass violence, displacement, starvation, disease etc.

Yet, realistically, modernity was not planned, and neither were the modernizing societies (such as late Medieval and early Modern England) in any sense designed to nurture, or provide opportunities for, genius. The whole thing was an unplanned by-product, and the age of genius was accidental.

Indeed, it was transitional, never stable, containing the seeds of its own destruction – like so many things. The geniuses were usually transitional figures who – over the course of their own lives – rejected the religious and traditional societies into which they were born. In their own lives they sometimes combined the strengths of the traditional society of the past and the progressive society which was emerging (as a result, partly, of their own work).

Yet of course the transitional phase is necessarily temporary, evanescent, cannot be sustained – and the generations of fully modern people, who are born into the world created by the geniuses, are one-eyed, feeble, and lack the sources of strength of traditionalism. They (we) are post-modern hedonists, for the most part – consumers, not creators.

I now tend to regard modernity as a temporary aberration from the course of human history. It arose from an accidental conjunction of genetics and society; and the effect of genius was to destroy the very genetics and society which had produced it. Whether it would have been good to sustain modernity (and reform it – because its vices are intolerable and have grown exponentially) I do not know for sure.

But I do know that we have not even tried to do this: modernity has not even tried to sustain itself, but has instead parasitically exploited the heritage of genius; so the question is now empirically unanswerable.

I now see human choice, or at least our destiny, as lying among traditional societies – a choice between the kinds of human societies which existed roughly between AD 500 and AD 1500.

Saturday 10 July 2010

Moral inversion in secular modern society

The most striking aspect of modern secular society, which would have amazed and horrified our ancestors, is the moral inversion by which we have redefined bad as good, sin as virtue.

This has happened as part of the modern rejection of Christianity, and as a solution to the fundamental paradox of the human condition – the conflict between spontaneous human desire and spontaneous human morality.

It is, at root, this moral inversion which is causing secular modern societies to commit suicide by a combination of denial of danger and by deliberate policy.

***

It seems that our literate ancestors (such as the ancient Jews) all spontaneously recognized that for a person to live according to their spontaneous desires - living primarily for seeking gratification and avoiding (or minimizing) suffering - was morally wrong.

This was so obvious that it needed (and indeed needs) no argument - it is the natural moral law for humans that a life aiming at selfish hedonism is intrinsically wicked: that is, wicked as a basic stance, not merely in terms of its consequences (which vary according to specific circumstances).

Yet it was also recognized that at some level, for humans as they are now, it is also natural and spontaneous to be selfish and hedonic.

So there was a conflict between the way that humans were 'set-up' to be self-gratifying and the moral sense by which we knew that this was wrong.

***

This was the basic situation, the human condition, as perceived by pretty much all humans throughout history - that of conflict.

And therefore the situation was bleak in the extreme, since there seemed to be no solution.

Of course, all humans also believed (in some sense) in the soul, and its potential persistence after death (in some form: perhaps as a ghost, perhaps in Hades - not the same as hell - perhaps returning to be recycled or reincarnated).

The ancient Jews' attempted solution was The Law, which prescribed morality in terms (essentially) of behaviour. If a man could live according to the law, his life (in this world) would be good - although the end was the same for all - good and bad - the ghostly and depersonalized realms of Sheol/ Hades.

However, actually men could *not* live by the law. It was impossible, because of their nature. They could never achieve that to which they aspired. The human condition was tragic.

***

This was the basic human situation, about which humans could do nothing, and from which humanity needed to be rescued.

The Christian solution, the Good News, was that God's Grace had provided a solution, since the incarnation meant that God had taken-up humanity; and if a person proclaimed in his heart Jesus as Lord, and repented of his (inevitably) sinful nature, then there would be forgiveness, and the soul would (instead of losing humanity in Hades) be granted eternal life with God.

Man's soul after death would become God-like instead of a gibbering, depersonalized ghostly form of persistence.

So, the Christian message is that belief and repentance in this life can lead to a solution of the fundamental paradox, but only in the next world, after death (however this life is temporary, while life after death is eternal).

***

For whatever reason, the Western elite ruling class became increasingly atheist from the advent of modernity (?c 1700). Since the elite ruling class disbelieved in the soul, they were this-worldly; and since they were this-worldly they wanted as much satisfaction from life as possible (there being nothing else).

But spontaneous natural morality gets in the way of worldly self-gratification – so spontaneous natural morality must (somehow or another) be rejected.

Yet since morality really is spontaneous, it cannot be rejected.

***

So emerged moral inversion: the morality that (contrary to the instinct of spontaneous morality) this-worldly self-gratification is the proper primary aim in life.

From this derives the many specific new ‘Laws’ of modernity; which state that what we used to think was necessary is actually un-necessary, that what we thought was bad was actually good, and what we used to think was good was actually bad.

For secular moderns the only *real* sin is to believe in the reality of sin.

***

This is the current situation, this the secular modern ‘solution’ to the fundamental paradox of the human condition – that life in this-world can be, should be, harmonious - *if only* we recognize that our spontaneous self-gratification is actually morally necessary and should be the primary explicit goal of human endeavor.

And because this solution actually solves nothing, and is merely a statement, a wish-that-this-was-so; it has necessarily been embodied in the *coercive* beliefs, practices and laws of atheist totalitarian states (notably the USSR, National Socialism and Communist China) and this process is now advanced in all Western societies (by enforcement of what is termed ‘Political Correctness’).

So the citizens in modern secular societies are not merely *encouraged* to flout the natural morality which they cannot help but feel, they are increasingly *forced* to flout natural morality. They are compelled to live (and to think and to believe) as if hedonic gratification was the primary value in a life which ends with death and extinction.

And there is no hope of resolution in this world, nor the next world (the existence of which is denied).

***

Since there is no hope of resolution, the only alternative is distraction – to lose oneself in hedonic gratification: such that intense, continuous self-gratification *obliterates* our awareness of the fundamental paradox.

Despair, distraction, denial, self-indulgence… and if these do not work, then some kind of suicide of awareness.

By this analysis, the Decline of the West is a willed societal suicide driven by the mass psychological consequences of top-down, enforced moral inversion.

A common atheist misunderstanding

Atheists commonly misunderstand it when Christians are critical of a primarily this-worldly and hedonic life - i.e. a life dedicated to maximizing gratification and minimizing suffering.

Atheists commonly misunderstand Christians to be arguing that a worldly and hedonic life *predisposes* a person to commit sins (where 'sins' are understood by the atheist as a list of behaviours that transgress the moral law).

But this is a misunderstanding of the Christian belief about the nature of sin.

The Christian perspective is that to live a this-worldly and hedonic life - a life orientated primarily toward personal gratification - precisely *is* the state of sin.

And if the atheist does not understand what the above means, then he does not understand the Christian message, and is arguing against a Straw Man.

Friday 9 July 2010

Is scientific progress a result of genius, elite, or mass effect?

Scientific progress is talked about in three main ways, depending on the numbers/ proportion of the population involved in generating this progress:

1. Genius - 10s to 100s of people per generation – a fraction of 1 percent of the population.

Science is the product of a relatively small number of geniuses - without whom there would be no significant progress.

Therefore an age of scientific progress can be boiled down to the activity of tens or hundreds of geniuses; and the history of science is a list of great men.

2. Elite - 1000s to 10,000s of people per generation – a few percent of the population

Science is the product of an elite of highly educated and trained people, usually found in a relatively small number of elite and research-orientated institutions, linked in an intensely intercommunicating network. Without this elite, and these elite institutions, there would be no significant progress.

The history of science is a history of institutions.

3. Mass - 100,000s to millions of people per generation – a large percentage of the population, ideally most of it.

Science is the product of a 'critical mass' of scientifically orientated and educated people spread across a nation or culture; and whose attitudes and various skills add or synergize to generate scientific progress. If society is not sufficiently 'scientific' in this sense, then there will not be significant progress.

The history of science is a history of gradual transformation of populations - mainly by educational reform.

***

A (common) twist on this is the idea that humans have vast untapped potential - and that this potential might somehow be activated - e.g. by the right kind of education; leading to an elite of geniuses, or a mass-elite, or something...

Perhaps the mainstream idea nowadays is a mushy kind of belief/ aspiration that science is essentially elite but that the elite can be expanded indefinitely by education and increased professionalization.

Another variant is that scientific progress began as based on genius, then became elite-driven, and nowadays is a mass ('democratic') movement: however, this is merely a description of what has actually happened (more or less), not an explanation - underpinned by the assumption that scientific progress has indeed been maintained.

But I do not accept that assumption of continued progress (given the vastly increased level and pervasiveness of hype and dishonesty in science).

Certainly there seem to be historical examples of scientific progress without need for a prior scientific mass of the population, or even a pre-existing elite gathered in elite institutions.

***


Of course, nowadays there are no geniuses in science, so admitting that genius is necessary to significant scientific progress entails admitting that we are not making progress.

Nonetheless, my reading of the history of science is that a sufficient supply of genius is necessary to significant scientific progress (although history has not always recorded the identities of the presumed geniuses) – at any rate, science has often made significant progress without elites in the modern sense, and elites often fail to make progress.

Thursday 8 July 2010

The SSRI story - corruption of medical research goes back to the 1960s

The story of the development of the SSRI (selective serotonin-reuptake inhibitor) drugs – the ‘Prozac’ group of ‘antidepressants’ – has been investigated thoroughly by David Healy (http://en.wikipedia.org/wiki/David_Healy_%28psychiatrist%29) in several books and papers, such as The Antidepressant Era (1998) and Let Them Eat Prozac (2004) - http://www.healyprozac.com/.

***

In the late 1960s Arvid Carlsson (later a Nobel prizewinner) realized that there was something different about the tricyclic antidepressant Clomipramine – which was used in treating obsessive compulsive disorder and various types of unusual or resistant ‘depression’. He discovered that this was probably because it had a greater effect on blocking the reuptake of serotonin (5-HT) than noradrenaline, and measured this difference in several drugs. Carlsson published a paper in 1969 which identified the antihistamines chlorpheniramine (especially) and diphenhydramine as very likely to be valuable drugs of a type similar to Clomipramine (but with different side effects, less cardio-toxic and safer in overdose) (http://www.medical-hypotheses.com/article/S0306-9877%2805%2900647-X/abstract).

Chlorpheniramine is sold in many formulations, including the well-known Piriton, used for hay fever; while diphenhydramine was often used as a nocturnal cough suppressant (e.g. in one of the Benylin formulations) and as a sleeping medication (e.g. Nytol).

So, here were antihistamine drugs which were already used by millions and considered safe enough to be available without prescription; and with a profile suggesting that they might make a new category of psychotropic drug with similar uses to clomipramine. In effect Carlsson discovered the SSRIs in 1969 or thereabouts.

But the pharmaceutical companies would not do trials on these agents, since their patents had expired, and this knowledge was not disseminated – indeed it is barely known even today. Instead, the pharmaceutical companies ‘concealed’ this knowledge for a decade and a half until they had developed patent-protected compounds – first zimelidine (which was too toxic), then later fluoxetine, (Prozac) and the other drugs later marketed as ‘SSRIs’.

***

Clearly Big Pharma, and the university scientists and academic/ research-oriented psychiatrists, were not even *trying* to discover useful new treatments - if anything they were concealing them. Neither were clinicians sufficiently interested (or knowledgeable) to read and understand the scientific literature which implied that already existing antihistamines would have a valuable role in psychiatry – so psychiatrists were not trying to discover drugs to help their patients – they were only interested when these were ‘new’, glamorous and prescription-only drugs – not old-fashioned drugs available without prescription at the local drugstore or chemist.

Then, when the SSRIs reached the point of pre-marketing trials around the early 1980s, the pharmaceutical companies were not really interested in trying to find out what these drugs really did, how they might best be used, or their harms and dangers. The obvious use was in the treatment of anxiety – but David Healy (in The Antidepressant Era, 1998) has documented how anti-anxiety drugs were at that point regarded as intrinsically addictive, due to emerging concerns about the benzodiazepines (the Valium group of drugs), and there was no interest in trying to launch new anti-anxiety agents into a market where they would be regarded as addictive. So the focus was on developing SSRIs as ‘antidepressants’.

Irving Kirsch (in The Emperor’s New Drugs) has documented that by objective and rigorous criteria applied to the randomized trial evidence, the SSRIs are not effective as antidepressants. Yet, by selective and distorted reporting of the trials, the SSRIs were nonetheless licensed and marketed as antidepressants.

So the pharmaceutical corporations were not – as of the late 1970s and early 1980s – interested in telling the truth about what they did know, and were prepared to distort and to conceal even thirty years ago – this kind of behavior is not a recent phenomenon.

Another distortion and concealment relates to SSRIs and suicide. Thanks mainly to the work of David Healy, it is now acknowledged ‘officially’ that SSRIs do indeed have a rare side-effect of inducing suicidal behaviour – for this reason they were labelled with a ‘black box’ warning by the FDA (the US Food and Drug Administration).

Having found a raised rate of suicide and suicide attempts in the early placebo-controlled trials of SSRIs, Healy gave SSRIs to some normal control subjects, and a couple reported unfamiliar violent impulses. Indeed this kind of feeling (akathisia) and behaviour is found with the neuroleptic/ antipsychotic drugs, that are chemically related to the SSRIs (also being chemically modified from antihistamines). 

The behaviour is somewhat paradoxical, given that both SSRIs and antipsychotics usually tend to reduce or flatten emotions in most people – making them unemotional. Nonetheless in some people at some times both classes of drugs seem to produce aggressive impulses.

So, it is clear that, as of the 1980s at least, pharmaceutical companies were actively concealing the harmfulness of harmful drugs.

***

I do not believe that SSRIs are ineffective drugs, but I do agree that they are ineffective ‘antidepressants’ when depression is conceived in the classic way as endogenous depression or melancholia (a state of despairing emotional un-reactivity, reduced thought and movement, reduced food intake etc). SSRIs are sometimes effective in treating people with emotional instability, and in reducing anxiety – and that is where they seem to have found their niche, in the treatment of anxiety, panic, phobias, post-traumatic stress, obsessive compulsive disorder etc.

But getting to this point of understanding the value of SSRIs took a long time, much longer than it should have done – and drug company marketing and the medical research ‘evidence’ hindered rather than helped the process. Presumably many millions of people have been ineffectively or harmfully treated with SSRIs, while others who would perhaps have benefited were not tried on the drugs because they were not ‘depressed’.

Taking the SSRI story in overview, it is deeply worrying in terms of what it tells us about the motivations of pharmaceutical companies, and clinicians, over the past several decades – and its implications for interpreting the medical research literature.


This is because science is hard to do: if you are not even trying to discover useful treatments, you certainly will not succeed. And if you are not even trying to be truthful, you will certainly not generate truth by accident.

***

In sum, what this story tells us is that:

1. Pharmaceutical companies were concealing information of clinical value as long ago as the late 1960s. This is not a recent development – although it has gotten worse.

2. SSRIs were not a new class of drugs. Pharmaceutical companies were not primarily trying to discover useful new classes of drugs, but to make slight chemical modifications of old drugs to produce patentable agents, which were then hyped as entirely new classes of wonder drugs. No new class of useful drugs has been discovered in psychiatry since the 1950s. Shocking.

3. There was a delay of about 15 years between discovery of the concept of SSRIs and the marketing of patented SSRIs – and although for those 15 years it was known that SSRI-type drugs were available for use, they were never used. This exhibits complete disregard for the needs of patients.

4. Some old drugs (e.g. chlorpheniramine, diphenhydramine) of the same class as new and expensive prescription-only drugs (Prozac, Paxil) are cheaply available without prescription, ‘over the counter’. While they are more sedative than modern ‘SSRIs’, these OTC drugs are likely to be similarly effective, but safer due to greater experience in their usage.

5. When SSRIs were being investigated the investigation was focused on developing them as antidepressants, because the market for antidepressants was more promising than the market for anti-anxiety drugs – or any other type of drug. So, the SSRIs were never investigated for what they actually did, they were investigated in relation to what the pharmaceutical companies hoped they would do.

6. When the trials were conducted, it was discovered that the SSRIs were all-but-ineffective as antidepressants (i.e. ineffective at treating endogenous depression/ melancholia, ineffective at treating hospital inpatients with depression) – i.e. SSRIs were ineffective at doing what the pharmaceutical companies hoped they would do. But instead of acknowledging this fact, the strategy was to distort and misrepresent the trials, and also to redefine ‘depression’ and expand its diagnosis - to pretend that the drugs were effective antidepressants.

7. The rare but extremely serious problem of increased rates of suicide attempts and actual accomplished suicides among SSRI-takers compared with controls (an effect which is biologically understandable and plausible given the chemical structure and ancestry of the SSRIs) was concealed and denied – for decades!

8. Overall, the official SSRI research literature is pervasively unsound and untrustworthy. I cannot see any way of correcting for such an extreme and bottom-up degree of selection and bias, and I believe therefore that the official medical research literature on SSRIs should be ignored by serious scientists and physicians of integrity.

9. The second implication is that (unless the case of the SSRIs is unique – which seems highly unlikely, given that they were such a big-selling and profitable example of modern drug marketing) the whole official medical research literature, going back at least three decades, is pervasively unsound and untrustworthy, and therefore must be ignored.

10. In such a situation as prevails now, it seems that there is no reliable or discernible relationship between the official medical research literature and the actualities of science; and also no relationship between the clinical literature and the reality of clinical experience. Since there is no effective mechanism to maintain the quality of the medical research literature, and no motivation to do this, and no honesty; it is quite possible that overall medicine is getting worse, rather than better.

11. But – such is the pervasiveness of corruption and dishonesty in relation to medical research and treatment – apparently nobody with any influence is interested in any of this.

12. In other words, nobody with influence is nowadays interested in really discovering, or even significantly improving, medical treatments. Even worse, nobody with influence is even trying to tell the truth about medical treatments.

***

So how are things in the real world of science and medicine, underneath the hype and deception?

The answer is that I do not know; indeed, nobody knows. You cannot get ‘underneath’ the hype and deception, because the hype and deception go all-the-way-down.

Wednesday 7 July 2010

Growth and the expectation of growth in scientific knowledge

We have become used to growth in scientific knowledge, and expect growth in scientific knowledge. This expectation at first shaped reality, then became reality, and eventually displaced reality. The link between expectation and actuality was broken and the world of assumptions took over.

***

The expectation that scientific knowledge will grow almost inevitably (given adequate 'inputs' of personnel and funding) is epitomized by the professionalization of scientific research (making scientific research a job) and the expectation of regular and frequent scientific publication – the expectation of regular and frequent publication would only make sense if it was assumed that scientific knowledge was accumulating in a predictable fashion.

We nowadays expect a growth in the number of scientific publications over time, and a growth in the totality of citations – these are fuelled by increases in the numbers of professional scientists and of journals for publishing science. We assume that there is an infinite amount of useful and new science waiting to be discovered, and an infinite pool of people capable of making discoveries.

The economist Paul Romer – among many others – has built this into theories of the modern economy: the argument is that continued growth in science and technology fuels continual improvement in productivity (economic output per person) and therefore growth in the economy. And this is kept going by increasing the investment in science and technology. The idea is that we are continually getting better at scientific discovery by investing in scientific discovery, and therefore modern society can continue to grow. (Yes, I know it doesn’t make sense, but...)

***

But how would we really know whether science was growing? I mean, who is really in a position to know this?

Who could evaluate whether change in science – increased amounts of self-styled scientific *stuff* – actually corresponded to more and better science?

When – as now – scientific growth is expected, and when society acts-upon the expectation, we have an overwhelming *assumption* of growth in science, an assumption that science *is* growing – but that says nothing about whether there really is growth.

Because when people assume science is growing and when they think they perceive that science is growing, this creates vast possibilities for dishonesty, hype and spin. Because people expect science to grow, for there to be regular breakthroughs, they will believe it when regular breakthroughs are claimed (whether or not breakthroughs have actually happened).

***

But how if there is really no growth in scientific knowledge? Or how if the real growth is less than the assumed growth? How if there is actual decline in real scientific knowledge – how would we know?

Science – as a social system – resembles the economy. In the credit crunch of 2008 it was realized that the economy had not really been growing, but what we were seeing was actually some mixture of increasing inflation, increasing borrowing, and rampant dishonesty from many directions. (It is the rampant dishonesty that has prevented this from being understood – and this tactical dishonesty is itself no accident.)

So we discovered that we were not really getting richer, but we were living off ever more credit, and the value of money was less than we thought; and we (or at least I) discovered that we could not trust anybody to tell us anything about what was going on or why it had happened. They were not even trying to discover the truth, they were trying to build their careers (politicians, economists, journalists – all careerists). (To be fair, most of them are explicitly nihilists who do not believe in the truth – so why should we expect them to tell it?)

Truth about the credit crunch was something we amateurs needed to work out for ourselves, as best we could.

***

I believe that science is in the same bad state as the economy, but probably even worse.

In science, what masquerades as growth in knowledge (to an extent which is unknown, and indeed unknowable except in retrospect) is not growth in knowledge but merely an expansion of *stuff*, changes in the methods of counting, and so on.

Almost nobody in science is trying to discover the truth; indeed, most scientists are embarrassed even to talk about the subject. Not surprising that they are embarrassed!

For instance, virtually every influential scientific article is now hyped to a variable but vast extent (the honest ones are buried and ignored).

Multiple counting is rife: progress is claimed when a grant is applied for, again when the grant is awarded, and even while the work is still happening – since scientific progress is assumed to be predictable, a mere function of resources, capital and manpower. Credit for a scientific publication is counted for all of its (many) authors, for all the many forms in which the same stuff is published and republished, for the department and also for the university where it was done, for the granting agency which provided the funds, and for the journal where it was published – everyone grabs a slice of the ‘glory’.

Credit is given for the mere act of a ‘peer reviewed’ publication regardless of whether the stuff is true and useful – or false and harmful.

Thus the signal of real science is swamped utterly by the noise of hype.

***

Let us suppose that doing science is actually much *harder* than people assume; much harder and much less predictable.

Suppose that most competent and hardworking real scientists actually make no indispensable contribution to science – but merely *incremental* improvements or refinements in methods, in the precision of measurements, and in the expression of theories. Had they personally not done this work, progress would have been slightly slowed but not prevented – or somebody else would have done it.

If science is really *hard*, then this fact is incompatible with the professionalization of science – with the idea of scientific research as a career. Since real scientific discovery is irregular and infrequent, science could only be done in an amateur way; maybe as a sideline from some other profession like teaching, practicing medicine, or being a priest.

Professional science would then be intrinsically phony, and the phoniness would increase as professionalization of science increased and became more precisely measured, and as the profession of science expanded – until it reached a situation where the visible products of science – the *stuff* – bore no relationship to the reality of science.

Professional scientists would produce stuff (like scientific publications) regularly and frequently, but this stuff would have nothing to do with real science.

Or, more exactly, the growing amount of stuff produced by the growing numbers of professional science careerists, whose use of hype would also be growing, would be so much greater than the amount of real science that the real science would be obscured utterly.

***

This is precisely what we have.

The observation of growth in scientific knowledge became an expectation of growth in science and finally an assumption of growth in science.

And when it was assumed that science was growing, it did not really need to grow, because the assumption framed the reality.

***

But if science is as hard as I believe it is; then scientific progress cannot be taken for granted, cannot be expected or assumed.

Our society depends on scientific progress – when scientific progress stops, our society will collapse. Yet so great is our societal arrogance that we do not regard science as something real. Instead science is the subject of wishful thinking and propaganda.

Science is a way of getting at certain kinds of truth, but the way that science works is dependent on honesty and integrity. Our societal arrogance is such that we believe that we can have the advantages of real science but at the same time subvert the honesty and integrity of science whenever that happens to be expedient.

Our societal arrogance is that we are in control of this dishonesty – that the amount of hype and spin we apply is under our control and can be reversed at will, or we can separate the signal from the noise, and re-calculate the reality of science. This is equivalent to the Weimar Republic assuming that inflation was under control when prices and wages were rising unpredictably by the hour.

But we cannot do this for the economy and we cannot do it for science. In fact we have no idea of the real situation in either science or the economy, except that in a universe tending towards entropy we must assume that the noise will tend to grow and swamp the signal. The Western economy was apparently growing but in reality it was increased inflation and borrowing and deception; science has appeared to be growing but the reality is increasing hype, spin and dishonesty. The link between stuff and substance has disappeared.

***

When the signals of economics and science (money and ‘publications’ and other communications) lose their meaning, when the meaning is detached from underlying reality, then there is no limit to the mismatch.

The economy was collapsing while the economic indicators improved; and science can be collapsing while professional science is booming.

But if science is very difficult and unpredictable, and if the amount of science cannot be indefinitely expanded by increasing the input of personnel and funding, then perhaps the amount of real science has not increased *at all* and the vast expansion of scientific-stuff is not science.

If so, then the amount of real science (intermittent, infrequent, unpredictable) has surely not even stayed constant, but will actually have declined due to the hostile environment. At the very least, real science will be de facto unfindable, since the signal is drowned by ever-increasing levels of noise.

So, the economy was a bubble, and science is a bubble; and bubbles always burst. The longer the burst is delayed, the bigger the bubble will become (the more air, the less substance), and the bigger will be the collapse.

***

When the economic bubble burst, the economy was much smaller than previously realized - but of course the economy was still enormous. In effect, the economy was set back several years.

But when the scientific bubble bursts, what will be left over after the explosion? Maybe only the old science - from an era when most scientists were at least honest and trying to discover the truth about the natural world.

And, in an era of mindless technical specialization, will there be enough scientists even to understand what is left over?

At the very least, science would be set back by several decades and not just by a few years. But it could be even worse than that.

Tuesday 6 July 2010

Fr Seraphim Rose on the spiritual poverty of our times

"I must tell you first of all that, to the best of our knowledge, there are no startsi today—that is, truly God-bearing elders (in the spirit of the Optina elders) who could guide you not by their own wisdom and understanding of the Holy Fathers, but by the enlightenment of the Holy Spirit. This kind of guidance is not given to our times—and frankly, we in our weakness and corruption and sins do not deserve it.

"To our times is given a more humble kind of spiritual life, which Bishop Ignatius Brianchaninov in his excellent book The Arena (do you have it?) calls life by counsel—that is, life according to the commandments of God as learned in the Holy Scriptures and Holy Fathers and helped by those who are elder and more experienced. A starets can give commands; but a counsellor gives advice, which you must test in experience.

"We do not know of anyone in particular who would be especially able to counsel you in the English language. If this is really needful for you, God will send it to you in His time, according to your faith and need, and without your making too deliberate a search for it."

From a letter excerpted at: http://www.orthodoxinfo.com/praxis/frseraphimspeaks.aspx

Comment: The modern lack of guidance from an 'enlightened' elder, in direct lineal discipleship with the greats of the past, is a feature of many aspects of contemporary life including science and medicine; and something which is likely to get worse.

Then the aspirant must remember and accept that "To our times is given a more humble kind of spiritual life"; also a more humble scientific life, or life as a physician or surgeon.

The modern heroism is then to accept humility, and not to strive for an impressive status which can only be fake.

Monday 5 July 2010

My favourite television - Cricket Writers on TV

My favourite television show - from the perspective of purely self-indulgent entertainment - is Cricket Writers on TV.

Indeed, CWoTV is the only thing I regularly watch.

This is a discussion that is currently broadcast on a Sunday morning at about 09.00 hours (Sky Sports 1) and which lasts for one and a half hours.

That is to say, I am provided with one and a half hours (minus advertisements) of four cricket journalists sitting around a table (apparently provided with some coffee and croissants) chatting about the week's news in cricket in great detail (c. 70 minutes worth of detail!).

Yesterday's episode was particularly fine, since it featured the genial regular host Paul Allott with my favourite modern cricket writer: Scyld Berry, who is the Sunday Telegraph correspondent and the editor of Wisden. In addition there was the personable and opinionated Stephen Brenkley, and also Vic Marks, who provided a contrasting view to my own on the perennial issue of four versus five bowlers for test matches.

(In fact my own cricketing views are closely modelled on/ copied from those of Scyld Berry!).

I realize that this programme would be regarded as a form of torture by everybody other than myself - therefore, although I provide a link to the audio podcasts below, I do not exactly _recommend_ it; because Cricket Writers on TV is not really intended for you, it is for me - http://www.skysports.com/story/0,19528,12172_6174957,00.html.

I find it heartening, fascinating, and almost magical that Sky Sports would go to the trouble and expense of assembling four journos each week (just the right number for conversation) and producing this show uniquely for little old *me* - and, despite the temptations of commercialism, to have continued broadcasting it for at least the past three seasons.

Well, I am very grateful; and will continue to enjoy it while it lasts.

Remember the name: Cricket Writers on TV.

But you need not bother watching it because it is *my* show.

Sunday 4 July 2010

What kind of pathology causes depression?

For several decades, most psychiatrists have propagated the idea that 'depression' (Major Depressive Disorder, Endogenous Depression, Melancholia) is a brain disease.

For example it is often said that depression is caused by some kind of chemical abnormality involving serotonin (5-HT) or its receptors, or involving norepinephrine or dopamine or something.

But if depression really is a brain disease, then it would have to be a disease of the brain substance, and this pathological process would necessarily need to be detectable using normal medical science.

There are only a limited number of pathologies recognized by medical science, and 'neurotransmitter abnormalities' is not one of them. Abnormalities of messenger chemicals are a consequence, not a cause.

When I was a medical student we were taught a diagnostic scheme called the 'surgical sieve', since it was often used in creating a list of possible ('differential') diagnoses for someone admitted for surgical assessment with an unknown problem. Another name is the pathologic(al) classification.

What happens is that each of the potential pathological causes is considered in turn. There are many versions, and many use acronyms to make them memorable - for use in actual clinical practice.

Here is the 'surgical sieve' as described in Wikipedia:

http://en.wikipedia.org/wiki/Surgical_sieve

1. There are two basic categories of disease - congenital (inborn - for example genetic) versus acquired.

Among acquired pathologies there are:

2. Neoplastic (eg cancer)
3. Metabolic (chemical)
4. Infective (germs)
5. Traumatic (accidents)
6. Autoimmune
7. Vascular (caused by blood vessels)
8. Inflammatory
9. Degenerative (due to age)
10. Iatrogenic (caused by medical treatment - side effects)


Classic melancholia or endogenous depression is rare - probably less than 1 percent of the population would have an episode during their lifetime and it is devastating; usually it requires hospitalization or equivalent, and lasts for several months at least - but not more than a year or two (unless the person commits suicide, or dies of dehydration or starvation). It is perhaps the most intense suffering which a human can experience, survive and recover-from.

So, in melancholia which of the above might be a causal brain disease?

The point is that the pathological cause of melancholia needs to be relatively long lasting (months, a year) but also fully reversible.

What pathological processes fit that pattern?

The short answer is that none of them has been detected in the brain as a cause - not even a metabolic abnormality (which is not a primary cause anyway; or, when metabolism is regarded as a primary cause, it is usually congenital, usually long lasting, and not spontaneously reversible or self-curing).

Whatever is the primary cause must be of a type that leads to a disease which lasts some months but potentially completely goes away - so the cause is *not* going to be neoplasia, for instance.

On this basis the clinical pattern of classic depression best fits with either infective or autoimmune disease.
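The sieve-plus-filter reasoning above can be sketched schematically. This is a purely illustrative sketch: the True/False labels below simply encode the argument made in this post (which pathologies are acquired, and which can plausibly last months and then fully reverse); they are assumptions for illustration, not clinical data.

```python
# Illustrative sketch: encode the 'surgical sieve' categories with the
# time-course properties argued for in the text, then filter for a cause
# compatible with melancholia: acquired, lasting months, fully reversible.
# The labels are assumptions taken from the post's reasoning, not clinical facts.

SIEVE = {
    # category: (acquired?, plausibly lasts months then fully reverses?)
    "congenital":   (False, False),
    "neoplastic":   (True,  False),  # progressive, not self-curing
    "metabolic":    (True,  False),  # when primary, usually congenital and persistent
    "infective":    (True,  True),   # untreated infections may resolve over months
    "traumatic":    (True,  False),
    "autoimmune":   (True,  True),   # may wax and wane over months
    "vascular":     (True,  False),
    "inflammatory": (True,  False),  # treated here as secondary to the causes above
    "degenerative": (True,  False),
    "iatrogenic":   (True,  False),  # reverses on stopping the drug, not over months
}

def candidate_causes(sieve):
    """Return the categories compatible with the melancholia pattern."""
    return sorted(name for name, (acquired, reversible) in sieve.items()
                  if acquired and reversible)

print(candidate_causes(SIEVE))  # -> ['autoimmune', 'infective']
```

Running the filter leaves exactly the two categories the text argues for: infective and autoimmune.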

For example an untreated infection will get worse for a while, then will eventually be defeated by the body - in pre-antibiotic days it might take months to recover from some infections.

Autoimmune disease exacerbations (e.g. rheumatoid arthritis) are not well understood, but the diseases may also wax and wane over a timescale of months.

However, it seems unlikely that there is a primary infective cause in the brain, because an infection of the brain itself is an encephalitis, and that produces different symptoms from melancholia (especially delirium).

However, depression might well be caused by infective causes elsewhere in the body since immune chemicals travel in the blood; and indeed it is well known that many infections can produce depressive symptoms (as a consequence of immune activation).

Perhaps some people with melancholia also suffer an autoimmune disease - but could this primarily be a disease of the brain itself? It seems implausible, since autoimmune changes would produce inflammation, which would probably produce encephalitis-type symptoms such as delirium.

So as with infection it seems likely that melancholia may be a consequence of autoimmune disease elsewhere in the body, but not the brain itself.

In conclusion - it is not plausible that depression (classic melancholia) is a primary brain disease since the pattern of disease does not plausibly fit any of the possible causes; and it is more likely to be a consequence of pathological processes (one or more pathological processes such as infective or autoimmune disease) elsewhere in the body.

On these grounds, I would not be surprised if some episodes of serious endogenous depression/ melancholia turned out often (not always) to have an infective cause - or rather, a variety of infective causes.

This may even apply to autoimmunity, since autoimmunity can be caused by an infection.

So a fruitful investigative strategy might be 'search for infection'; and then treat it, if possible.

Or maybe even try treating presumed infection blind, by trying-out several different antimicrobial agents that cover a range of presumed infective causes?

Humility versus submission

For Christians, repentance is a first step on a short path to forgiveness; but modern secular leftist guilt has no answer.

Repentance and forgiveness versus perpetual guilt - a big difference.

While a Christian aims for humility before God, modern secular leftists can only settle for submission to those people who make them feel guilty.

Humility before God versus submission to Man - a big difference.