Thursday 17 June 2010

Society and sin

One society may be more sinful than another, but not because it breaks more of the commandments, laws or rules.

That kind of statement would invite quantitative statistical investigation, adding up the number of transgressions, which is surely not the right way of thinking about it.

The focus on laws also stirs-up unresolvable arguments about which specific breaches are most important, and whether - or to what extent - obedience in one domain (e.g. kindness) cancels-out disobedience in another domain (e.g. sexuality, courage).

As I understand it, the core of sin is orientation: an orientation towards this world and a focus on optimizing pleasure and minimizing suffering.

The opposite of sin is an orientation towards the other world and a focus on salvation.

Modern Western Society is therefore sinful not because of the specific things which people do or fail to do, nor because society encourages or enforces these dos and don'ts; but because of the underlying fact that modern society (and modern people) cannot even understand the idea of being orientated towards the Kingdom of God and primarily concerned with salvation.

Religiously-motivated activity is therefore always explained-away and instead attributed to economic, political or other causes.

Insofar as a religious perspective is recognized it is regarded either as dumb and despicable or crazy and dangerous.

Sinfulness has therefore gone qualitatively beyond denial and disbelief of the religious perspective; modern societies - modern individuals - now display utter, blank incomprehension.

Wednesday 16 June 2010

The role of asceticism

Perhaps the importance of asceticism is that a person may learn to stop seeking this-worldly pleasure and stop avoiding this-worldly suffering as the *primary* aim of their life. This enables the person to seek other-worldly values.

Asceticism is a disengagement from the relentless focus on this world: necessary, permissive of further steps, but not in itself generative of the Kingdom of God.

Asceticism should therefore be voluntary; and any voluntary act of turning from pleasure, or acceptance of discomfort, in any mode, can perhaps be asceticism - or the beginning of asceticism. Imposed hardship, even of the most extreme kind, is not asceticism at all unless it is voluntarily accepted and consecrated.

The point is that withdrawal from the Kingdom of Man is necessary but must be accompanied by a turning toward the Kingdom of God: a turning from the concerns of psychology in this world to a perspective of eternal concerns.

Tuesday 15 June 2010

How the ideal of neutrality/ impartiality actually serves a radical agenda

Neutrality is a lynchpin of elite political thought. Much of modern quasi-scientific social research is dedicated to demonstrating that some modern social system (law, education, the military) is not behaving neutrally. All that is required of such research is to show that people of different sex, ethnicity or whatever are treated differently: points are then scored, the system is discredited, and it is shown to be ripe for radical reform.

(Actually, it is worse than this, because any research which fails to find differences between sexes or whatever is suppressed or ignored - while even clearly erroneous or made-up research showing differences – e.g. radically-motivated research which is actually based on pre-selected anecdotes, or which fails to control for major confounders like age – may be given tremendous publicity.)

However, if it is impossible for an individual, an organization or a culture to be neutral - then this debate takes on a different complexion altogether; because if impartiality is unattainable, then the debate would not *really* be about failure to attain the ideal of neutrality, but *actually* a debate over *who* should be favoured.

The ideal of impartiality in social systems probably derived from the ideal of Roman law, in which (as I understand it) the same system is applied to everyone - everyone goes through the same basic process.

The same idea applies to bureaucracies, as described by Max Weber, in which administrators are required to devise and apply procedures impartially, treating sufficiently-similar cases as operationally-identical.

But in the real world there are major differences in the application of the law and the application of bureaucratic procedures - differences such as: who gets investigated, who gets prosecuted, the type of sentence they receive, who has regulations enforced on them - and so on.

***

One classic political scenario nowadays involves someone (a radical) attacking a procedural system (such as the legal process, employment practice or educational evaluations) as being biased, while another person (a libertarian or conservative) defends the system as being impartial-enough for the purposes.

The radical pretends to argue that impartiality is attainable but requires change, while actually seeking privileges for a particular group. The libertarian/ conservative always gives ground in the direction the radical is pushing, because any actually existing system is indeed partial – if you look hard enough.

Hence the evaluation system is overturned. That group which used to be privileged is now suppressed, and vice versa. This can most clearly be seen in employment policy relating to gender.

A reactionary perspective, by contrast, would accept the radical’s assertion that one group or another must in reality be privileged, and would challenge the radical on the grounds of which group ought to be privileged. The focus of debate changes.

For example, if it is accepted that neutrality is impossible, then employment policy must favour either men or women – the proper question then becomes which is it best for employment policy to favour?

For example, the organization of the military or care for young children will inevitably favour either men or women – the proper question to ask is: which is the most functionally-appropriate favoured group in each specific case? (Clue: the answer is different for each of these two examples…)

***

One big advantage of acknowledging the inevitability of partiality is that this is what most people believe anyway and always have believed – in fact it is only a minority of the intellectual elite (libertarians and conservatives) who really believe in impartiality as a desirable and attainable goal of social systems.

But radicals, socialists, liberals and corrupt politicians are simply exploiting the failure to attain impartiality as a justification for imposing a revolutionary inversion of values.

Hence a belief in the ideal of neutrality unwittingly serves a radical and nihilistically-destructive agenda, since it actually leads to partiality in the opposite direction from that which is socially functional.

Monday 14 June 2010

The impossibility of neutrality

Supposing that it really is impossible for a society to be neutral with respect to anything important - that it must tend either to support or to suppress it - then this explains why things can move so swiftly from being forbidden to being compulsory.

If neutrality really is impossible, then to argue that something should not be subject to stigma is - in the long run - precisely equivalent to saying that it is desirable.

If neutrality really is impossible, to argue that 'x' is not evil, is the same as arguing that 'x' is good.

If neutrality really is impossible, then to argue that people should no longer be punished or suffer for doing 'y' is de facto to argue that they should be rewarded and feel good about doing 'y'.

If neutrality really is impossible, then when society ceases to persecute a group, it will always begin to privilege that group.

*

Of course, one might argue that it is not necessarily true that neutrality is impossible; one might argue that theoretically it is possible and desirable that society might maintain an attitude of impartiality with respect to important matters.

But looking back over the past fifty years, what does it look like to you?

To me it seems blazingly obvious that when society ceases to sanction a thing it always, always, always starts to honour that thing.

Liberal democracy intrinsically hostile to Christianity

From Nihilism by Eugene (Fr Seraphim) Rose (from http://www.columbia.edu/cu/augustine/arch/nihilism.html):

"In the Christian order politics too was founded upon absolute truth. We have already seen, in the preceding chapter, that the principal providential form government took in union with Christian Truth was the Orthodox Christian Empire, wherein sovereignty was vested in a Monarch, and authority proceeded from him downwards through a hierarchical social structure.

"We shall see in the next chapter, on the other hand, how a politics that rejects Christian Truth must acknowledge "the people" as sovereign and understand authority as proceeding from below upwards, in a formally "egalitarian" society. It is clear that one is the perfect inversion of the other; for they are opposed in their conceptions both of the source and of the end of government. Orthodox Christian Monarchy is government divinely established, and directed, ultimately, to the other world, government with the teaching of Christian Truth and the salvation of souls as its profoundest purpose; Nihilist rule--whose most fitting name, as we shall see, is Anarchy---is government established by men, and directed solely to this world, government which has no higher aim than earthly happiness.

"The Liberal view of government, as one might suspect, is an attempt at compromise between these two irreconcilable ideas. In the 19th century this compromise took the form of "constitutional monarchies," an attempt--again--to wed an old form to a new content; today the chief representatives of the Liberal idea are the "republics" and "democracies" of Western Europe and America, most of which preserve a rather precarious balance between the forces of authority and Revolution, while professing to believe in both.

"It is of course impossible to believe in both with equal sincerity and fervor, and in fact no one has ever done so. Constitutional monarchs like Louis Philippe thought to do so by professing to rule "by the Grace of God and the will of the people"--a formula whose two terms annul each other, a fact as equally evident to the Anarchist as to the Monarchist.

"Now a government is secure insofar as it has God for its foundation and His Will for its guide; but this, surely, is not a description of Liberal government. It is, in the Liberal view, the people who rule, and not God; God Himself is a "constitutional monarch" Whose authority has been totally delegated to the people, and Whose function is entirely ceremonial. The Liberal believes in God with the same rhetorical fervor with which he believes in Heaven. The government erected upon such a faith is very little different, in principle, from a government erected upon total disbelief, and whatever its present residue of stability, it is clearly pointed in the direction of Anarchy.

"A government must rule by the Grace of God or by the will of the people, it must believe in authority or in the Revolution; on these issues compromise is possible only in semblance, and only for a time. The Revolution, like the disbelief which has always accompanied it, cannot be stopped halfway; it is a force that, once awakened, will not rest until it ends in a totalitarian Kingdom of this world. The history of the last two centuries has proved nothing if not this."

[end of excerpt]

This analysis points to the fundamental weakness of all existing Western Societies.

It seems to imply that over the long term some kind of unified single-hierarchy theocratic monarchy is the only coherent form of a religious society, and will (in the long term) prevail over societies divided between Church and State.

Another point made by Rose elsewhere in this book is that – whether desirable or not - impartiality is impossible. We can only be for or against something (and our actions will tell us which – even if our minds are confused or self-deceptive on the matter).

The impossibility of impartiality entails - inter alia - that a person, a society, a state, will either support or suppress Christianity; and therefore that once a society has ceased explicitly to embody, to support and promote Christianity it will de facto begin suppressing it.

Putting together the first and second points: suppression of Christianity is an inevitable long-term consequence of democracy, an intrinsic property of democracy.

Solitary science?

For anyone wanting to do science when the social structure of science is so corrupt, the obvious question is whether they can 'go it alone' - whether it makes any kind of sense to do science solo.

At the extreme this would simply mean that a person studied a problem but did not communicate their findings - either study for its own intrinsic value, or perhaps implementing the findings in their own life - for example a doctor doing research and using the findings in their own clinical practice.

Implementing findings in personal practice is, at some level, universal - it is simply termed learning from experience.

But what about doing science for its intrinsic value? This is termed philosophy - or perhaps natural philosophy.

I don't believe there is any line dividing philosophy from real science - although the activities differ considerably at their extremes. Nowadays both philosophy and science are essentially corrupt - or perhaps one could say that the names philosophy and science have been stolen and applied to generic, large scale bureaucratic activities.

However, if philosophy is seen in its essential role - aside from being a career - then that is exactly what a solo scientist would be doing; as indeed was the case for someone like Aristotle who has been rated as both the greatest (i.e. most influential) philosopher and scientist.

But of course Aristotle was a professional, not an amateur, and also he applied the fruits of his scholarship in practice. Indeed, it is hard for humans not to want to communicate their work - not least because of the motivation to get status for one's scholarship.

So, while it is not impossible, I do find it hard to imagine a satisfying life as a solo scientist; and I think that being part of a similarly-motivated group of people is probably a pre-requisite. However, such a group might be relatively small and local - as was the case in the 18th century in England, when science was carried forward by the Lunar Society in Birmingham and similar Literary and Philosophical Societies in other cities.

Sunday 13 June 2010

The malignancy of radical doubt

Like nearly all modern scientists, indeed nearly all of the modern intellectual elite, I find it difficult to believe in the reality of the immortal soul - isn't that strange?

It is natural and spontaneous for humans to believe in a soul which in some way persists after death. And apparently everyone in the world believed this until a few hundred years ago (including, for what it is worth, the greatest intellectuals in the history of humankind - Socrates, Plato and Aristotle). Indeed, on a planetary scale, nearly everyone alive still does believe in the immortal soul - but hardly any of the ruling elite of the Western nations.

Why don't they believe in the soul now?

It was, obviously, not due to any kind of *discovery* of science or logic. It was instead due to a change in metaphysics - a change in assumptions. Specifically, the systematic application of 'radical doubt' - or what I think of as the 'subtractive method'.

(Apparently this metaphysical novelty came from Descartes, ultimately - but why it came to dominate the West is a mystery.)

The subtractive method works on the basis that you try denying the reality of something, and see whether this elimination causes instant and complete collapse – if it does not then it is concluded that the subtracted thing was not real but merely a subjective delusion.

So, intellectuals deny the reality of the soul; and since this denial does not lead to the immediate and complete destruction of the denying individual or group, this is taken to mean that the soul does not really exist, that it is subjective, that it had been a delusion which gripped the world for millennia but from which we are now blissfully free.

In practice (which we see around us on a daily, hourly, basis) the subtractive method of radical doubt involves doubting one piece of knowledge (e.g. the reality of the soul, of beauty, of an objective morality, or the factuality of any empirical claim) while *not* doubting other pieces of knowledge – such as the validity of human reason, or the validity of various pieces of science, economics, or whatever.

At another time, however, radical doubt may be turned against the pieces of knowledge which have previously been used to doubt _other_ pieces of knowledge – so that logic might be used to deny the reality of the common sense soul, then later the validity of logic might be doubted using historical, multicultural anthropological ‘evidence’ (e.g. assertions that some cultures or individuals do not use logic, or that the use of logic has changed).

So all of knowledge can be, *is being*, systematically ‘doubted’ piecemeal, a bit at a time, in rotation – as it were.

Yet all specific doubts are relative to other knowledge which – for the time being – is exempted from doubt.

(Total skepticism of all things simultaneously is never seen – presumably because it would be mute, inert and self-extinguishing. If it did exist we would not know about it.)
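
The structure of this rotation can be made explicit. Below is a minimal sketch - in Python, with an invented list of beliefs, purely for illustration - of doubting each item in turn while holding the rest temporarily exempt. It shows that no single step in the rotation ever amounts to total simultaneous doubt, even though cumulatively nothing is spared:

```python
# A toy model of the 'rotation of doubt' described above.
# The list of beliefs is invented for illustration only.
beliefs = ["the soul", "objective beauty", "objective morality", "logic"]

for i, target in enumerate(beliefs):
    # Doubt one item, treating all the others as (temporarily) exempt.
    exempt = [b for j, b in enumerate(beliefs) if j != i]
    print(f"Doubt {target!r}, holding fixed: {exempt}")

# Note: the case where nothing is held fixed (total simultaneous doubt)
# never occurs in the rotation - yet every belief is doubted at some point.
```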

It is blazingly obvious that radical doubt is irrational – but somehow the irrationality makes no difference, and the process has cumulated over the past few centuries.

I am not trying to caricature here. The subtractive method of radical doubt really is an extremely crude doctrine, utterly irrational, and (nonetheless, or because of this?) totally dominant in Western intellectual circles.

Since radical doubt spread from a few individuals to encompass whole classes, whole societies, we have seen huge social changes, which show no signs of stopping but rather seem to be accelerating. Yet no matter what happens to individuals or societies that employ radical doubt, it is never taken as evidence that the soul-denying metaphysic is mistaken.

Because it is a metaphysical assumption, the subtractive method is taken for granted, such that whatever problems result from radical doubt will necessarily be attributed to other causes.

Radical doubt is an intellectual malignancy, that is clear; but the puzzle is why Western elites are so vulnerable to its spread.

NB: The proper question about the soul is not whether it is real - *of course* it is real – but what happens to the soul after death, in broad terms. Here there has been uncertainty and disagreement. But evidence comes from common sense (natural law), metaphysical and logical argument, and from revelation.

Saturday 12 June 2010

How to become an amateur scientist - some ideas

The basic and best method is apprenticeship - attach yourself (somehow) to a Master: someone who can do it already. Help them with their work (without pay), and in return they may teach you, advise you, or you may pick up an understanding of how to do what they do.

Read into the subject. Talk or write about what you read and try to get some feedback. Valuable feedback from a competent 'Master' is very, very rare however - it may come seldom and in little scraps, and the apprentice must be alert so as not to miss it.

Don't be too impatient to find a specific problem to work on - allow the problem to find you. Francis Crick proposed the 'gossip test': that which you gossip about spontaneously probably contains a possible problem to work on.

When you are interested in a *problem*, you can usually find some aspect to work-on which you personally can do with your resources of time and effort, and without lavish material resources or manpower.

Publication is a matter of informing people who are genuinely interested in the same problem. This might be done by letter, as in the 17th Century. The internet has solved the problem of making work accessible to those who are interested.

If you are honest/ can earn trust, produce useful work or provide some valuable function, you will be admitted to the 'invisible college' of self-selected people working on a problem.

If you are not trustworthy, lack competence, or are unproductive, then you will not be allowed into the invisible college - because an invisible college is a synergistic group sustained by mutual benefit. If you don't provide benefits to the group, and show no prospect of providing any in the future, then you are merely a parasite and need to be excluded.

The respect of an invisible college is the currency of science - it is the invisible college which evaluates work, and develops and sustains understanding through time.

Friday 11 June 2010

Motivation in science - understanding reality

A scientist needs to want to understand reality - this entails believing in reality, and that one ought to be truthful about it.

The belief in reality is a necessary metaphysical belief, which cannot be denied without contradiction - nonetheless, in modern elite culture it is frequently denied (this is called nihilism), which is why modern elite culture is irrational, self-contradictory (and self-destroying).

But obviously, a real scientist cannot be a nihilist - whatever cynical or trendy things he might say or do in public, in his heart he must have a transcendental belief in reality.

Science also involves a metaphysical belief (i.e. a necessary assumption, not itself part of science) in the understandability of nature and the human capacity to understand. Without this belief, science becomes an absurd and impossible attempt to find the one truth among an infinite number of erroneous possibilities.

Nonetheless, in modern elite culture, a belief in the understandability of nature and human capacity is routinely denied - another aspect of nihilism. Among many other consequences, this denial destroys the science which makes possible modern elite culture.

Explaining reality is a second step which may follow understanding, but explaining needs to be preceded by the desire to understand - again because there are an infinite number of possible explanations, none of which can be decisively refuted.

Modern science is undercut by many things - one is the difficulty for modern scientists of living by the proper motivations and beliefs of a real scientist. The transcendental beliefs are difficult to hold in isolation; it is difficult to refrain from asking *why* it is that humans should have these beliefs and motivations - difficult to avoid the idea that they are arbitrary or delusional beliefs.

Committed scientists in recent decades have often justified themselves by emphasizing that science is enormous 'fun' - but this is a foolish and desperate line of defense. Many things are 'fun' for the people who happen to like them, but science was supposed to be about reality.

Hitler and Stalin seemingly enjoyed being dictators, perhaps found it ‘fun’ – but does that justify them?

Of course the ‘science is fun’ line is mostly trying to avoid the ‘science is useful’ trap. Because the usefulness of science is something intrinsically accidental and unpredictable. And of course science might well turn out to be harmful – fatal; so usefulness cannot be guaranteed. If you try to get usefulness directly, you won’t get science – aims such as usefulness need to be set aside when a scientist is actually trying to understand reality.

Likewise explanations, predictions and so on – these are second order, contingent aspects of scientific discovery. Understanding must come first.

There never will be many people who are genuinely motivated by a desire to understand, and successful science also requires ability and luck.

Not just ability and luck: faith. Doing real science is an act of faith that if the scientist approaches his problem in the proper way and with sufficient effort, he will be rewarded by understanding.

(Rewarded not necessarily with the understanding he expected, but something just as good, or better.)

This is a religious kind of concept; a concept of just reward for proper devotion.

So real science is, at heart, a spiritual vocation – although this may be expressed in a variety of languages, with different levels of insight, and often indirectly.

Obviously it would be best if scientists did *not* talk about their spiritual vocations too much, especially in public. However, if they *are* going to speak honestly about the motivations of real science, then this is the kind of language they would need to use. This is the kind of language they did, in fact, use until about 50 years ago.

But when, as now, the language of spiritual vocation is ruled-out from public discourse (as foolish or fanatical) then scientists will inevitably be dishonest and misleading in public on the subject of science – blathering-on about usefulness when asking politicians and bureaucrats for money, and emphasizing fun when entertaining undergraduates.

In the end, by excluding all mentions of transcendentals or metaphysics, scientists end-up being untruthful with themselves – which is of course fatal to science. Bad motivations will yield bad consequences. The just reward of understanding reality, of understanding the truth, is not given to those whose devotions are dishonest.

Thursday 10 June 2010

More on 'testing' scientific theories

"Unfortunately, we have no way to determine whether a theory survives because it is true or because of our own inability to devise the appropriate tests."

From "Pure" by Mark Anderson

...Or because we can't be bothered to test it, or because it is inexpedient to test;

...or because it *has* been tested and the theory failed to pass the test but we ignore the result, or prefer to pick holes in the test's limitations.

No test of a theory is ever perfect, therefore each test of a favoured theory may be methodologically isolated and demolished on grounds of strictest rigour. This process can be continued without limit.

When a theory is favoured it can never be empirically refuted - neither by experience nor by formal testing.

When a theory is favoured for whatever reason (political, financial, moral) it will survive all assaults.

Testability neither demarcates nor defines science. Indeed nothing defines science; there is no specific methodology - it is (merely) a sub-specialty of philosophy, which is love of wisdom (truth seeking, truth speaking) - and philosophy is itself an emphasis on one specific transcendental 'good'. Push too hard and the whole thing crumbles in your hands.

If not methodology, what then accounts for the spectacular success of science (up until the past few decades)?

Perhaps two things: the emergence of groups of honest, motivated and competent people working to solve problems, and the development of a multi-generational tradition so that these groups can hand-on their accumulated experience.

i.e. working together across generations rather than working alone during a single lifespan. That's all.

In other words, science was a fortuitous and fragile state of affairs; now long past.

Wednesday 9 June 2010

Careers advice for the real scientist

Supposing you were an honest, highly motivated young person; and you wanted to be a real scientist - what would be the best careers advice, given that a career as a professional scientist is obviously out of the question?

The best approach would be to accept that you will be an amateur scientist, and to think about how best to fund your work.

In other words, in future real scientists will need to regard their work rather as a serious poet or classical music composer does - as a vocation - and to forget about 'making a living' from the vocation.

The traps for a real scientist are nowadays the same as the traps for a poet. There are quite a lot of professional 'poets' who are paid to *be* a poet (writers in residence) - but actually none of them are real poets. Instead, in order to get the jobs, they have had to write what passes for poetry among the people who dish out the writers in residence jobs, which isn't actually poetry.

Sometimes real poets can get jobs pretending to teach poetry to people who want to become writers in residence; but no real poet would want to do these jobs - which are usually poorly paid anyway.

In the fairly recent past, some real poets have been school teachers and librarians - although the nature of these jobs has changed and perhaps become more hostile to poetry. I know of dedicated amateur musicians of a high standard who do all kinds of jobs - so long as these jobs leave evenings and weekends free.

So the careers advice would be to use one's talents and choose a job that is paid highly enough per hour that it can be done part-time - leaving enough time and energy in which to do real science. Such jobs usually require _some_ training, and the training itself costs time, money and motivation - so careful calculation and prediction are needed, and prolonged, expensive training programs with uncertain job prospects (e.g. a PhD in an arts subject) should be avoided.
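
As a rough illustration of the kind of calculation involved - all figures invented, purely a sketch - the required hourly rate is simply the needed annual income divided by the annual hours one is willing to sell:

```python
# A toy calculation of the hourly rate needed to fund a part-time vocation.
# All figures are invented for illustration.
required_income = 24000   # needed per year, in local currency
hours_per_week = 20       # paid work, leaving the rest free for science
working_weeks = 46        # allowing for holidays

hourly_rate = required_income / (hours_per_week * working_weeks)
print(f"needed hourly rate: {hourly_rate:.2f}")  # about 26 per hour
```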

There is some shrewd careers advice around: e.g. http://www.martynemko.com/ - there are so many jobs nowadays that there might well be something suitable that you have never even heard of.

It is also useful to know something about the economics of employment: e.g. "Why men earn more" by Warren Farrell explains that in the private sector you are usually paid more either for doing a job which has to be done but which few people can do (like anything involving numbers or computers), or for doing what most people do not want to do - working in an unpleasant or dangerous environment such as a prison or outdoors in winter.

Or the aspiring scientist could try to find a sinecure, i.e. "a position or office that requires little or no work but provides a salary" - in other words, something in the public sector. High status sinecures are hotly competed for (as are all 'cool' jobs) and they may be paid little or nothing (because so many people want to do them) - but low status sinecures may be available, doing 'joke jobs', the kind whose title provokes a snigger.

The real scientist will not care too much about the nature of their job or what other people think about it, so long as it provides a reasonably secure income without involving them in activities that interfere with their science, because their vocation is not in the job but in science.

Tuesday 8 June 2010

This is your brain... THIS is your brain on political correctness

*

THIS IS YOUR BRAIN...

[image]

This is your brain on political correctness...

[image]

Science is about coherence, not testing 'predictions'

Until recently I usually described science as being mostly a matter of devising theories which had implications, and of testing these implications by observation or experiment.

In other words, science was about making and testing predictions.

Of course there is more which needs to be said: the predictions must derive from theory, and the predicted state should be sufficiently complex, so as to be unlikely to happen by chance.

But it is now clear that this sequence doesn’t happen much nowadays, if it ever did. And that there are weaknesses about the conceptualization of science as mostly a matter of testing predictions.

The main problem is that when science becomes big, as now, the social processes of science (i.e. peer review) come to control all aspects of science, including defining what counts as a test of a prediction.

This is most obvious in medical research involving drugs. A loosely-defined multi-symptom syndrome is created and a drug or other intervention is tested. The prediction is that the drug/ intervention ‘works’ or works better than another rival, and the test of prediction involves multiple measures of symptoms and signs. Within a couple of years the loosely defined syndrome is being diagnosed everywhere.

Yet the problem is not at the level of testing, since really there is nothing to test – most ‘diagnoses’ are such loose bundles that their definition makes no strong predictions. The problem is at the level of coherence.

Science is a daughter of philosophy, and like philosophy, the basic ‘test’ of science is coherence. Statements in science ought to cohere with other statements in science, and this ought to be checked. Testing ‘predictions’ by observation and experiment is actually merely one type of checking for coherence, since ‘predictions’ are (properly) not to do with time but with logic.

Testing in science ought *not* to focus on predictions such as ‘I predict now that x will happen under y circumstances in the future’ – but instead the focus should be – much more simply – on checking that the statements of science cohere in a logical fashion.

It is an axiom that all true scientific statements are consistent with all other true scientific statements. True statements should not contradict one another; they should cohere.

When there is no coherence between two scientific propositions (theories, 'facts' or whatever), and the reasoning is sound, then one or both propositions are wrong.

Scientific progress is the process of making and checking propositions. A new proposition that is not coherent with a bunch of existing propositions may be true, and all or some of the existing propositions may be false - indeed, that is the meaning of a paradigm shift or revolutionary science: when new, incoherent propositions succeed in overturning a bunch of old propositions, and establishing a new network of coherent propositions.

(This is always a work in progress, and at any moment there is considerable incoherence in science which is being sorted-out. The fatal flaw in modern science is that there is no sorting-out. Incoherence is ignored, propositions are merely piled loosely together; or incoherence is avoided rather than sorted-out, and leads to micro-specialization and the creation of isolated little worlds in which there is no incoherence.)
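
To make this concrete, here is a minimal sketch - in Python, with invented propositions and quantities, purely for illustration - of what checking a network of statements for coherence amounts to: each proposition asserts values for named quantities, and every pair is scanned for contradiction. The point is that coherence is a property of the whole network, not of one 'prediction' tested in isolation:

```python
from itertools import combinations

# Toy propositions: each asserts values for named quantities.
# The names and values are invented for illustration only.
propositions = {
    "A": {"nerve_conduction_m_per_s": 100},
    "B": {"nerve_conduction_m_per_s": 100, "synapse_is_chemical": True},
    "C": {"synapse_is_chemical": False},
}

def contradictions(p, q):
    """Quantities on which two propositions assign conflicting values."""
    return {k for k in p.keys() & q.keys() if p[k] != q[k]}

# Check every pair of propositions for coherence.
for (n1, p), (n2, q) in combinations(propositions.items(), 2):
    clash = contradictions(p, q)
    if clash:
        # One or both of an incoherent pair must be wrong (given sound reasoning).
        print(f"{n1} and {n2} are incoherent on: {clash}")
```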

***

Using this very basic requirement, it is obvious that much of modern science is incoherent, in the sense that there is no coherence between the specialties of science – specialties of science are not checked against each other. Indeed, there is a big literature in the philosophy of science which purports to prove that different types of science are incommensurable, incomparable, and independent.

If this were true, then science as a whole would not add-up – and all the different micro-specialties would not be contributing to anything greater than themselves.

Of course this is true of modern medical science and biology. For example ‘neuroscience’ does not add up to anything like ‘understanding’ – it is merely a collection of hundreds of autonomous micro-specialties about nervous tissue. This, then that, then something else.

These micro-specialties were not checked for consistency with each other and as a consequence they are not consistent with each other. Neuroscience was not conducted with an aim of creating a coherent body of knowledge, and as a result it is not a coherent body of knowledge.

‘Neuroscience’, as a concept (although it is not even a concept) is merely an excuse for irrelevance.

It is not a matter of whether the micro-specialties in modern science are correct observations (in fact they are nowadays quite likely to be dishonest); the point is that isolated observations – even if honest - are worthless. Isolated specialties are worthless.

It is only when observations and specialties are linked with others (using theories) that consistency can be checked, that understanding might arise - and then ‘predictions’ can potentially emerge.

Checking science for its coherence includes testing predictions, and maximizes both the usefulness and testability of science; but a science based purely on testing predictions (and ignoring coherence) will become both incoherent and trivial. 

Monday 7 June 2010

The bureaucratization of pain

Analgesia - pain-relief, especially in the broadest sense of relief of suffering - was for most of history the primary interventional benefit of the physician (as contrasted with the surgeon) in medicine.

Among the primary benefits of medicine, prognosis is perhaps the greatest - that is, the ability to predict the future; because prognosis entails diagnosis and an understanding of the natural history (natural progression) of disease.

Without knowledge of the likely natural history of a patient's disease, the physician would have no idea whether to do anything, or what to do.

However, through most of history, physicians were probably unable to influence the outcome of disease - at least in most instances they would diagnose, make a prognosis then try to keep the patient comfortable as events unfolded.

Keeping the patient comfortable. Relief of suffering. In other words: analgesia.

Much of medicine remains essentially analgesic (in this broad sense), even now.

But relief of actual pain is the most vital analgesic function: because at a certain level of severity and duration, pain trumps everything else.

So, perhaps the most precious of all medical interventions are those which relieve pain - not just the general pain-killers (of which the opiates are the most powerful) but the effective treatments of specific forms of pain - as when radiotherapy treats the pain of cancer, or when GTN treats the pain of angina, or steroids prevent relentless itching from eczema and so on.

The *irony* of modern medicine is that while it has unprecedented knowledge of analgesia, of the relief of pain and suffering - these are (in general) available only via prescription.

So, someone who is suffering pain and seeks relief - even when effective analgesia is in principle available - must *first* convince a physician of the necessity to provide them with relief.

If a physician does not believe the pain, or does not care about the pain, or has some other agenda - then the patient must continue to suffer. They do not have direct access to pain relief - only indirect access via the permission of a physician.

Pain and suffering are subjective, and it is much easier to bear another person's pain and suffering than it is actually to bear pain and suffering oneself.

Yet we have in place a system which means that everyone who suffers pain must first convince a professional before they can obtain relief from that pain.

This situation was bearable so long as there was a choice of independent physicians. If one physician denied analgesia for pain, perhaps another would agree?

The inestimable benefits of analgesia have been professionalized, and that means they have nowadays been bureaucratized, since professionals now operate within increasingly rigid, pervasive and intrusive bureaucracies.

So the inestimable benefits of analgesia are *now* available to those in pain only if they fulfill whatever bureaucratic requisites happen to be in place.

If the bureaucracy chooses (for whatever reason - saving money, punishing the 'undeserving', whatever) that a person does not fulfill the requirements for receiving analgesia, then they will not get pain relief.

That is the situation, at the present moment.

Why do we tolerate this situation? Why do we not demand direct access to analgesia? Why do we risk being denied analgesia by managerial diktat?

Because bureaucracy does not even need to acknowledge pain - it can legislate pain and suffering out of existence. It creates guidelines which define what counts as significant pain, what or who gets relief, and what or who gets left to suffer.

It is so easy to deny or to bear *other people's* pain and suffering, to advise patience, to devise long-drawn out consultations, evaluations and procedures.

Bearing pain ourselves is another matter altogether. Pain of one's own is an altogether more *urgent* business. But by the time we find ourselves in that situation, it is too late for wrangling over prescriptions, guidelines, and procedures.

Sunday 6 June 2010

Ketoconazole shampoo - a totally effective anti-dandruff treatment

The one thing that modern medicine hates and suppresses above all else, is a cheap and effective solution to a common problem.

There are scores, indeed hundreds or maybe thousands, of expensive, heavily advertized and *ineffective* 'anti- dandruff' shampoos on sale in supermarkets and pharmacists.

They are expensive non-solutions to the common problem of dandruff - and they are Big Business.

But in my experience ketoconazole shampoo is *totally* effective at stopping dandruff, and an application every week or two will keep it away.

This is because dandruff (and seborrhoeic dermatitis - which is severe dandruff) is caused by a fungus - the Pityrosporum yeast. The ‘cradle cap’ of babies is the same thing too, and is also cured by ketoconazole.

The cause and cure were discovered by one of my teachers at medical school - Sam Shuster. (e.g. Shuster S. The aetiology of dandruff and the mode of action of therapeutic agents. Br J Dermatol 1984; 111: 235-242; Ford GP, Farr PM, Ive FA, Shuster S. The response of seborrhoeic dermatitis to ketoconazole. Br J Dermatol 1984; 111: 603-607.)

In other words, the cause and cure of dandruff has been known for 25 years.

SO - here we have what seems to be a completely effective solution to a problem which affects most adults at some point in their lives - yet the effective treatment is all but secret; presumably because if it were better known then the shelves would be cleared of the scores of ineffective, expensive and heavily advertized rival products.

My point?

In modern medicine, in modern life, it is possible for there to be completely effective and cheap and widely 'available' solutions to common problems, and for these to be virtually unknown.

And it is also notable that discovering the cause and cure of a common disease is not given much credit in medicine nowadays – it made the discoverer neither rich nor famous.

But at the same time there are thousands of rich and famous ‘medical researchers’ who have discovered nothing and cured nothing. Essentially they are rich and famous for ‘doing research’ – especially when that research involves spending large amounts of money.

When ‘medical researchers’ are rewarded for spending large amounts of money, and ignored for discovering the causes and cures of disease, what you end up with is ‘medical research’ that spends large amounts of money but does not discover anything.

And that is precisely what we have nowadays.

Also we end up with ‘medical researchers’ who do not even *try* to discover the causes and cures of disease.

And that is precisely what we have nowadays.

Saturday 5 June 2010

Driclor - a totally effective anti-perspirant/ deodorant

The one thing that modern culture hates and suppresses above all else, is a cheap and effective solution to a common problem.

There are scores, indeed hundreds or maybe thousands, of expensive, heavily advertized and *ineffective* deodorants and antiperspirants on sale in supermarkets and pharmacists. They neither stop odour, nor stop sweat.

They are expensive non-solutions to the common problem of smelly under-arm sweat - and they are Big Business.

But aluminium chloride solution (which I buy in the brand called Driclor) is *totally* effective at preventing both perspiration and odour, and a single application lasts for three or four days.

The product is very reasonably priced, since a big bottle is about 6-8 US dollars and lasts me for several months.

YET - although it is sold in large pharmacies, it is not usually displayed on the shelves but needs specifically to be asked-for.

SO - here we have what seems to be a completely effective solution to a problem which affects most adults (insofar as most adults use some kind of underarm antiperspirant deodorant) - yet it is not advertized and is all-but hidden.

Presumably because if it were better known then the shelves would be cleared of the scores of ineffective, expensive and heavily advertized rival products. And probably Driclor itself would not survive this process, since the active product (aluminium chloride) is not patent-protected, and the company would no doubt be driven out of business by excess competition.

My point?

In modern medicine, in modern life, it is possible for there to be completely effective and cheap and widely 'available' solutions to common problems, and for these to be virtually unknown.

Friday 4 June 2010

Benzoyl peroxide effective treatment for shaving rash - but it bleaches!

In the spirit of rediscovered self-experimentation trail-blazed by Seth Roberts -

http://www.blog.sethroberts.net/2010/06/04/a-great-change-is-coming-part-1-of-2/

- I thought I would share one of my own discoveries:

i.e. that benzoyl peroxide (BP) cream (which is marketed as a treatment for acne) can treat shaving rash.

http://en.wikipedia.org/wiki/Benzoyl_peroxide

By shaving rash, I mean the unsightly spots which come after shaving, especially on the neck. The spots seem to be due to trauma of the beard hair follicles, and sometimes to in-growing beard hairs.

Anyway, an n=1 on-off crossover self-trial over several weeks established that this kind of rash was treatable, indeed curable, with BP.
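
For anyone wanting to try this kind of self-experiment, here is a minimal sketch - in Python, with invented severity scores, purely for illustration - of how the on-off crossover data might be summarized:

```python
from statistics import mean

# Toy daily log from an n=1 on-off crossover self-trial: each entry is
# (phase, rash severity 0-10). All numbers are invented for illustration.
daily_log = [
    ("on", 3), ("on", 2), ("on", 1), ("on", 1),      # first BP block
    ("off", 4), ("off", 6), ("off", 7), ("off", 7),  # off block
    ("on", 5), ("on", 3), ("on", 2), ("on", 1),      # second BP block
]

on_scores = [s for phase, s in daily_log if phase == "on"]
off_scores = [s for phase, s in daily_log if phase == "off"]

print(f"mean severity on BP:  {mean(on_scores):.1f}")
print(f"mean severity off BP: {mean(off_scores):.1f}")
# What makes the crossover design informative for a single subject is that
# the improvement recurs in *repeated* on-phases, not just once.
```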

Benzoyl peroxide is a peeling agent, so it is not particularly surprising that it works to treat this kind of problem.

HOWEVER, BP is also a bleaching agent (not surprizing for a peroxide!); and after a while I linked its usage to the bleached patches that appeared on towels and shirt collars, ruining them.

So, in the end I could not use it to treat the beard rash.

But it works.

Anti-denialism - The return of Lysenkoism?

Real science was built on a search for truth that was cooperative and competitive at the same time. Popper emphasized the mixture of hypothesis and testing, conjecture and refutation, testing for consistency and predictive ability, and discarding of error.

Bronowski emphasized the need for tolerance of honest error, and that contributors to the scientific process should be respected even after their views have been refuted. Otherwise, scientists would not risk being accused/ convicted of being wrong and so would never challenge consensus; and consensus would never yield to refutation. (Which is precisely the situation in mainstream medical research.)

So we still respect Isaac Newton despite him having been superseded by Einstein; and Newton is not usually mocked or derided for having not been correct for all time.

But this balance has been difficult for many scientists, and even more difficult for those outside of science. Lamarck ranks all-time third in importance as a biologist according to Charles Murray's method in Human Accomplishment - behind Darwin and Aristotle - but Lamarck’s views on evolution are often (it seems to me) treated as a joke.

Of course, ignorant disrespect is part of growing-up. But although it has to be tolerated in teenagers, it is not an admirable trait; being a result of pride and aggression fuelled by insecurity.

Adolescents love to hate, and there are an awful lot of adolescents interested in science and working in and around science, in its journalism and as pundits - many of them adolescents of advanced middle age.

Adolescents also form gangs, and gangs assert their status by seeking and beating-up victims (the victims of course ‘deserve’ this – for being who they are).

There is an awful lot of ignorant disrespect in science nowadays, and an awful lot of gangsterism. Real science used to be all about individuals – it really did! – but now science is all about gangs.

The reason for so much ignorant disrespect in science is mostly that there is so much ignorance, due to the abundance of low quality people and their micro-specialized perspective. Such have no concept of excellence higher than the standard, prevailing technical practices of their micro-discipline; anyone who does not adhere to these prevailing practices is regarded as either incompetent or wicked - hence despicable hence deserving of punishment. They deserve ‘what is coming to them’ – in gang parlance.

There is always disagreement in science, but the basis of real science was that scientific disagreement was conducted by scientific means. What is *not* acceptable to real science is that scientific disputes should be settled by non-scientific methods.

Scientists must be allowed to make mistakes, to be wrong, or science cannot function.

This is necessary because in the first place they may not really have made a mistake and they may be right (or partly right) – but this may not be apparent for a while. Mainstream science may be in error, but this situation may be recoverable if dissent is tolerated.

However, in a system of real science, mistakes are tolerated only when they are *honest* mistakes – lying and deceptions are absolutely forbidden in real science; and will lead to exclusion from the community of real scientists. And incompetent errors are simply a waste of everybody’s time. So dishonesty and incompetence are rightly sanctioned by leading to a scientist’s work being disregarded by others in the field as probably unreliable or unsound.

This is why the dishonest thugs of modern pseudo-science always try to portray dissent and disagreement as a result of incompetence or dishonesty.

The gangsters of pseudo-science cannot acknowledge even the *possibility* of an honest and competent scientist reaching a different conclusion from the one they themselves support. This is because the gangsters are transparently looking for an excuse to attack and to coerce; after all, gangsters need to make public displays of their power, or else they would soon lose it.

Gang-leaders need to beat-up dissenters, and they need people to know that this is happening, and they need these dissenters to be portrayed as deserving victims of attack.

Consequently the whole concept of honest and competent disagreement has been banished from modern bureaucratic pseudo-science.

In the world of bureaucratic pseudo-science there are only two kinds of view – the correct view which is defined and enforced by the peer review cartel; and wrong views which are held by those either too stupid to understand, or those corrupted by evil.

Lysenko was a scientific gangster in the Soviet Union – Stalin’s henchman - http://en.wikipedia.org/wiki/Trofim_Lysenko. His scientific sin was to suppress scientific opposition using non-scientific means; up to and including jail and death for his opponents. The justification for Lysenko’s use of coercion to suppress and silence dissent was that the opponents’ opposition was harmful to people, caused millions of death etc.

Modern science is just a couple of small steps away from full-blown Lysenkoism. Scientific opposition is suppressed using non-scientific means ranging from defunding, through exclusion of publications and other blocks on dissemination, to public humiliation, sacking and legal threats. In many areas of science gangsterism is rife, with intimidation and threats and the administration of media ‘beatings’.

What does it mean? Many would regard the situation as regrettable – but it is much worse than regrettable. It is conclusive evidence of non-science.

A field in which non-scientific modes of argument are rife is simply *not a science*. Not a science at all. It does not work. Gangsterism is incompatible with science.

For example, ‘climate science’ is not a science *at all*; as a field it does not function as a real science, it uses intimidation and coercion as a matter of routine. Therefore nothing in it can be trusted, the good and bad cannot be discriminated.

To clarify - because in general terms climate science does not settle disputes using scientific methods, but by using extra-scientific methods, therefore it is not a real science, but actually is whatever the main influence on its content happens to be: politics, mostly.

The main innovation of ‘climate science’ has been to legitimate the mass use of the hate-term ‘denialism’ to signal who ‘deserves’ a punishment-beating from the gang.

Let us call the phenomenon of labeling and beating up ‘denialists’ by the name of ‘anti-denialism’.

Anti-denialism is no accident, nor is it eradicable without massive reform because anti-denialism is functionally necessary for the cohesion of modern bureaucratic pseudo-science. Without victims to gang-up on, the gangs would fall apart into warring sects. They would fight each other because these gangs are composed of ignorant, egotistical, power-worshipping adolescents. What best holds such people together is pure hatred, and pure hatred needs victims.

With the phenomenon of anti-denialism rife in mainstream discourse, we are just a couple of small steps away from full blown Lysenkoism. We already have systematic intimidation of scientific opposition at every level short of the physical. But I have seen demands from the gangsters of science that the sanctions against denialists be escalated. Destroying livelihoods is no longer enough. Soon, perhaps very soon, unless the tide turns, we may be seeing scientists jailed for their views.

Since honest and competent dissent is not recognized, anyone who disagrees with the peer review cartel is either labeled as too stupid to understand or as competent but wicked. It is the competent dissenters who are most at risk under Lysenkoism, since disagreement with the mainstream coming from a competent scientist marks them out as evil and deserving of punishment.

Anti-denialism needs high profile victims. Lysenkoism needed to punish top scientists like Vavilov, who died following three years in prison http://en.wikipedia.org/wiki/Nikolai_Vavilov.

On present trends we may expect to see prominent denialists and dissenters jailed for being ‘wrong’ (as judged by peer review), jailed for the public good, jailed ‘to save ‘millions of lives’ – but in reality jailed for opposition to the ruling gangsters of bureaucratic pseudo-science, and because anti-denialists absolutely require a continuous supply of victims to beat-up on.

Thursday 3 June 2010

How much formal specialist training does a scientist need?

The answer is - very little.

A highly intelligent and motivated individual can get 'up to speed' in a subject, and begin work in it, in a matter of weeks.

Of course it takes much longer than this to make a significant contribution to a field - often ten years or so of persistence - but this is why it is very important to get started young working on your problem. And starting young means skipping the hyper-extended specialist preliminary 'training' which is the norm nowadays.

This is obvious from the fact that early scientists had never had any formal specialist training because it did not exist.

Further evidence comes from the example of the many physicists and mathematicians who changed field and made major contributions to, for example, biology. They were able to do so because physicists and mathematicians are the most intelligent people (i.e. the group with the highest average general intelligence or IQ) - which means they can learn and remember new material extremely rapidly compared with most of us.

Of course, modern academia insists (usually) on prolonged specialist training. But this is mostly due to careerism and restrictive practices. Major work is continually being done in biology and medicine by people without this training, indeed many of the best ideas come from outside of academia, and often from clever and motivated amateurs (such as investigative journalists). Much of this is published outside the professional literature – in books, not papers.

Intelligence is mostly inborn (i.e. the ability to reason abstractly and systematically cannot be inculcated but is - mostly - either there, or not there); and the extra discipline and baseline knowledge which is provided by education is mostly acquired during development - before college age.

So, whatever formal specialist training a scientist needs before tackling his problem ought to be done at school, in the teenage years - and *not* done at college, in the twenties.

Wednesday 2 June 2010

The culture of analgesia

Analgesia = pain-killing.

This is our culture, now. The culture of analgesia - in which relief of pain and suffering is primary - and not a means to an end.

In medicine, the relief of pain is, or ought to be, of central importance - but on examination it is not primary. The primary goal of medicine is to preserve life and functionality - and this only makes sense where life and functionality themselves have an implicit goal.

(This implicit goal of life is not a part of the concern of medicine as a specialty - but medicine as a human activity only makes sense if it can be assumed that people have something significant to do with their life and functionality. In the past this could be taken for granted - but not any more.)

Medicine is not and never has been a matter of 'first of all, do no harm' - because harm is always a risk in trying to preserve life or functionality, or in relieving pain and suffering.

The primary imperative for modern secular democratic liberalism is the avoidance of suffering. Lacking any rationale or context for this, the definition of suffering has expanded without apparent limit.

In particular, hedonism - pleasure-seeking - is now re-labeled as analgesia; since there is an element of suffering involved in *not* being able to indulge one's desires.

The suffering involved in thwarted desire or the necessity for self-restraint is now inflated to a cosmic injustice, so encompassing as to trump almost anything and everything.

Any of the petty humiliations of everyday existence (being forbidden, sneered at, patronized, made to feel inferior, rejected, ignored) are amplified infinitely, and can indeed become the focus of an entire view of life.

To suffer *offense* is automatically to require, to deserve, analgesia - reparations, compensations, special consideration.

Sensitivity replaces morality.

In the culture of analgesia, life - in both its strategy and its moment-by-moment existence - becomes a serial seeking after pain relief: a search for losing oneself, for forgetting, for tranquillization, or at least for distraction.

Ah! - to live a life of serene analgesia varied by serial pleasurable distractions…

(then to die, to sleep and *not* to dream)

- this is, in a sense, the ultimate goal of modernity.

Tuesday 1 June 2010

Worrying thoughts about specialization and growth

Could it be that the differentiation of Church and State, the development of universities (secular in their essence, even when staffed by religious), the process of specialization itself – that all these are actually first steps on an *inevitable* path to where we are now (i.e. on the verge of a self-inflicted - almost self-willed - collapse of 'the West')?

Universities can by now be seen as vastly inflated institutions, not just parasitic but actively destructive in many ways. On the other hand, Universities used to perform some functions which were essential to those aspects of modernity which we most admire: philosophy in the medieval university, classics in the next period, science (Wissenschaft) in the 19th-century German universities, and so on.

But, in retrospect, all these golden ages of scholarship and research were more like brief transitional periods en route to something much worse.

For instance, the flowering of science (as a specialized, largely autonomous, social system) for the couple of centuries up until the mid twentieth century was a period of constant institutional change until science became - as now - *essentially* a branch of the state bureaucracy.

It seems that useful/efficient specialization (including separation from State and Religion) leads to over-specialization (or micro-specialization) which is increasingly less efficient, then less effective - and all this seems to lead back to re-absorption of science into the State (or into Religion, in principle).

For instance, the London Royal Society became more and more autonomous in its conduct until maybe the mid-twentieth century, then was progressively reabsorbed back into the State - so that the Royal Society now gets about ¾ of its funding directly from the UK parliament, and the organization functions like a department of the UK civil administration.

If we go back and back to find the point at which this *apparently* unstoppable yet self-undermining process began in the West, I think the search leads to the difference between (say) medieval Orthodox Byzantium and Catholic Western Europe.

To scholasticism, perhaps? That was when the divergence first became apparent - when an academic, pre-scientific discipline (i.e. philosophy) became increasingly autonomous from Religion. (In the West, the Religious hierarchy was already separate from the State hierarchy - although sometimes the two cooperated closely. In the East, Church and State formed an intermingled, single hierarchy.)

Indeed, my impression is that Thomistic scholasticism may itself be self-undermining – and that this can perhaps be seen in the history of the Roman Catholic Church and even of some specific scholastic scholars – for example Jacques Maritain or Thomas Merton? (They began as traditionalists and ended as modernizers.)

It seems that institutions can grasp the essence of Thomism, and yet the process of understanding does not at all prevent – indeed it perhaps encourages – the continuation of the process until it has destroyed the system itself. As Peter Abelard found, once the process of sceptical analysis has begun, there is no clear point at which it can be seen necessary to stop – and the only point at which it is known for sure that things have gone too far is when the system which supported the process has fallen to pieces – and by then it is too late.

Something similar may apply to science. The process of science creates a social system which first really reinvents itself due to real discoveries, then later makes pseudo-discoveries in order really to reinvent itself, then finally makes pseudo-discoveries in order to pseudo-reinvent itself. At which point the full circle has been turned, and all that remains is to drop the pretence.

Of course, differentiation of society led initially to greater strength, based (probably) on frequent breakthroughs in science and technology which drove increased economic productivity and military capability. But as differentiation proceeded to the level of micro-specialization and became destructive, the real breakthroughs dried up and were replaced with hype and spin, then later pure lies. Real economic growth was replaced with inflation and borrowing. Progress was replaced with propaganda.

We have already observed the whole story in the atheist and supposedly science-worshipping Soviet Union – and Russia is maybe now returning to the more stable and robust pattern of Eastern Christian (Byzantine) theocracy – while the pattern is merely being repeated in the capitalist and democratic West.

(With the difference that the secular West will probably - in the medium term - return after the collapse of modernity to segmentary, chaotic tribalism, rather than large-scale cohesive theocracy.)

In sum, perhaps the process of social differentiation is unstoppable, hence inevitably self-destroying? The increasing rate of scientific and technological breakthroughs from (say) 1700 to 1900 looked like progress in the conduct of human affairs - until it wasn't. The faster social system growth and differentiation proceeds, the faster it destroys itself.

Rapid growth and differentiation is therefore, in fact, intrinsically parasitic – whether or not we can actually detect the parasitism. At any rate, that's what it looks like to me.