Showing posts with label Science. Show all posts
Showing posts with label Science. Show all posts

Friday, July 5, 2024

So much for globular worming and sea level rise

 

Courtesy of Pascal Fervor, commenting at Liberty's Torch, we find this object lesson in reality.  Click the image for a larger view.



Next time a climate alarmist tries to pull "The seas are rising!" on you, show them that picture, and ask them to explain it.  They won't, of course - because they can't, unless they admit that sea levels on the whole are not rising.

Thank you, ancient Romans!

Peter


Wednesday, July 3, 2024

What if this happened to the Mississippi River?

 

I was interested to read that an ancient course of the Ganges River in India, some 2,500 years old, has been discovered.


Earthquakes, caused by the shifting of Earth’s tectonic plates, have the potential to transform the face of the world. Now, for the first time, scientists have evidence that earthquakes can reroute rivers: It happened to the Ganges River 2,500 years ago.

. . .

In a July 2016 study, Dr. Michael Steckler ... had previously reconstructed the tectonic plate movements — gigantic slowly moving pieces of Earth’s crust and uppermost mantle — that account for earthquakes experienced in the Ganges Delta.

His models showed that the likely source of earthquakes in the region is more than 100 miles (160 kilometers) away from the sand volcanoes that Chamberlain and her colleagues found. Based on the large size of the sand volcanoes, the quake must have been at least a 7 or an 8 magnitude — approaching the size of the Great 1906 San Francisco earthquake.

. . .

About 50 miles (85 kilometers) away from the sand volcanoes, the scientists also found a large river channel that filled with mud at roughly the same time. This finding indicates that 2,500 years ago, the course of the river dramatically changed. The proximity of these events in both time and space suggests that a massive earthquake 2,500 years ago is the cause of this rerouting of the Ganges.


There's more at the link.

The now-demonstrated fact that a major earthquake can change the course of even a huge river like the Ganges, moving it 50 to 100 miles away from its previous course, made me think hard.  I don't know that we've ever seen the like in North America;  most of our rivers have changed course through a combination of erosion and silting (as far as I know, anyway).  However, what might happen if something like the New Madrid Fault let go in a big way?


Earthquakes that occur in the New Madrid Seismic Zone potentially threaten parts of seven American states: Illinois, Missouri, Arkansas, Kentucky, Tennessee, and to a lesser extent Mississippi and Indiana.

The 150-mile (240 km)-long seismic zone, which extends into five states, stretches southward from Cairo, Illinois; through Hayti, Caruthersville, and New Madrid in Missouri; through Blytheville into Marked Tree in Arkansas. It also covers a part of West Tennessee near Reelfoot Lake, extending southeast into Dyersburg. It is southwest of the Wabash Valley Seismic Zone.


Again, more at the link.

What's more, the New Madrid Fault runs slap bang underneath the Mississippi River.  If it really let go, it could easily produce an earthquake with a magnitude of 7 to 8 - it already has in the not too distant past.  If it were big enough, and lasted for long enough, what might that do to the biggest river on our continent?  If a waterway that big were to be displaced by 50 to 100 miles east or west, how much of our economy, our cities and our population would it take with it?  And what would happen to anything in the way?

It's a fascinating subject for speculation.  I wonder if it might make an interesting novel - perhaps set in older times, around the Civil War or Wild West period, as alternate history?  There were powerful earthquakes along the Fault in 1811-12.  What if they were repeated, say, 60 or 70 years later, at even greater intensity?

Hmmm . . .

Peter


Friday, May 31, 2024

Declining intelligence = declining country

 

We've examined the topics of IQ (theoretical and applied), education and ability on several occasions in these pages.  If you'd like to read the earlier articles:


Higher education and IQ

IQ, countries, and coping skills

IQ and potential, both individual and national


Karl Denninger warns that the flood of lower-IQ migrants across our borders threatens to lower our national competence to cope with issues and problems.  (WARNING:  He uses more profanity than usual in his article.)  Here are some excerpts.  Emphasis in original.


You won't like this and I don't care.

You're going to die if you don't take this to heart, or even worse your kids will die.

What am I talking about?

Quite simply its this: You need about a 115 IQ to build and maintain modern civilization.

Examples?  Too many to count.  How about Flint's water system?  It poisoned a bunch of kids, remember?

Why didn't it poison kids for the previous 80 years?  It had been there that long, with the same lead service lines, but didn't poison anyone ... The 115+ IQ people who built and ran the water plant at Flint all those years knew this, and knew how to keep it safe.  They drank out of the same lines and so not only did they know how they had plenty of incentive to not screw up -- and as professionals who were smart, they didn't screw up.

Then Shaqueena, or her analog with a <115 IQ took over.  And changed the water source.  And, at the same time, didn't check and make sure the chemical and pH balance remained correct because the intellectual firepower to do so was simply no longer there.  The result was a bunch of poisoned kids.

. . .

Why do I bring all this up?

Because if we do not stop destroying the incentives for those on the right end of the IQ bell curve to have kids it will not be all that long before you go to flush the toilet and it won't, your stove, heat and A/C won't work either because there's no power or gas and virtually everything we rely on in the modern world will either kill you or simply not function at all.

Its much worse when people who simply don't have the intellectual chops for a given task are passed in school and given credentials they didn't earn.

. . .

Those who built all these things we enjoy today and in fact are the reason we can have several billion people on this planet -- most of them created by white men, and all of them by persons of >115 IQ -- are going to die.  We all die; it is inevitable.  If we do not stop demonstrating to those who are of higher intelligence that the only reason to have kids is their own hedonism, and by doing so for hedonistic purpose they will screw their offspring as those kids will have a s***ty future said people will choose not to breed as they are choosing right now.  The data is clear in this regard -- those who are of higher intelligence are choosing not to have kids and since they are of higher intelligence it is obvious that they are capable of reasoning out the incentives and disincentives, weighing both for themselves and what they perceive as the future for any children they might choose to produce -- and that analysis, once complete, is unfavorable as they see it.

People often claim that as societies advance the people tend to have fewer children.  That's a true statement but did you notice that the "why" is never discussed?  As societies advance inevitably people are led to believe, often by active fraud peddlers, that you can have something for nothing and the more-intelligent discern that it is likely their children will get ****ed by this pattern.  Said persons have no means to stop it peacefully as they're out-represented (by definition > 115 IQ is at least one standard deviation out on the right side and thus they're out-voted roughly 6:1) so they simply choose not to have children at all.  This inevitably results in the average of the curve shifting leftward unless it it is stomped on hard.


There's more at the link.

Like Mr. Denninger, I and many others have warned that "If you import the Third World, you become the Third World".  We're seeing that in action right now.  Despite the progressive left's demonization of IQ as a First World approach that automatically reduces equality and diversity in our workforce, IQ remains the single best indicator of whether or not a nation, or a city, or an organization, can and will prosper.  Higher IQ = better chances of that happening.  Lower IQ = lower chances of that happening.  It's as simple as that.

Peter


Wednesday, May 29, 2024

An effective treatment for bird flu?

 

Bird flu has been in the news a lot lately, with concerns about how America's poultry production (both eggs and meat) may be devastated if it continues to spread.  There's also speculation that if a strain of bird flu, or H5N1 as it's technically known, adapts to humans, the resulting epidemic could kill thousands.

What puzzles me is the insistence by the medical establishment of pushing "cures" such as Tamiflu or Relenza to treat H5N1.  Yes, they offer some hope, and may alleviate minor doses of the flu:  but there's another option that's been widely used in the Third World for years.  That's chloroquine, as well as its derivative hydroxychloroquine.  You'll doubtless remember their being advocated as a treatment for COVID-19 during the pandemic.  Many, including myself, believe it's because of the existing widespread use of hydroxychloroquine (as a prophylactic medication against malaria infection) and ivermectin (as a treatment for river blindness and other illnesses endemic to the continent) that prevented COVID-19 from gaining a foothold in Africa.  So many potential victims were already dosed with an effective treatment that the disease simply couldn't take root.

Unfortunately, the medical establishment is largely ignoring the fact that chloroquine has been claimed by some researchers to be a highly effective treatment against H5N1.


Yan, et al studied H5N1 infection in the laboratory and demonstrated that physiological relevant concentrations of chloroquine inhibited viral entry and damage to human cells. Additionally, when given as treatment and not prophylaxis, chloroquine reduced pulmonary alveolar infiltrates and improved survival in mice after a lethal dose of H5N1 from zero to 70%.


There's more at the link.

Hydroxychloroquine is freely available in the USA, and ivermectin is becoming more so.  (Here's one source of supply;  I think their price is ridiculously high, but there are others if you shop around, often less expensive.)  If you're worried about the possible crossover of the H5N1 influenza virus to the human population, I strongly suggest that you try to obtain some of each, and keep it in your emergency reserve supplies.  (I'm not being compensated in any way for linking to one supplier;  I'm doing so only because I know readers sometimes have difficulty finding a local source of supply.)

I no longer trust the medical profession to speak the truth about epidemics and illnesses - not after they made such a dog's breakfast out of COVID-19.  I'd rather investigate potential threats myself, obtain what information is available, and prepare accordingly.

Peter


Wednesday, May 22, 2024

The dilemma: get more lithium for favored EV's - but at the cost of increased oil fracking

 

I had to laugh at this report.


Almost two centuries after California's gold rush, the United States is on the brink of a lithium rush. As demand for the material skyrockets, government geologists are rushing to figure out where the precious element is hiding.

In September 2023, scientists funded by a mining company reported finding what could be the largest deposit of lithium in an ancient US supervolcano. Now public researchers on the other side of the country have uncovered another untapped reservoir – one that could cover nearly half the nation's lithium demands.

It's hiding in wastewater from Pennsylvania's gas fracking industry.

Lithium is arguably the most important element in the nation's renewable energy transition – the material of choice for electric vehicle batteries. And yet, there is but one large-scale lithium mine in the US, meaning for the moment the country has to import what it needs.

. . .

Expanding America's lithium industry, however, is highly controversial, as mining can destroy natural environments, leach toxic chemicals, and intrude on sacred Indigenous land.

At the same time, however, lithium-ion batteries are considered a crucial technology in the world's transition to renewable energy, storing electricity generated by the wind and the Sun. Finding a source of lithium that doesn't cause more environmental destruction than necessary is key, but a clean solution is complicated.

Pennsylvania sits on a vein of sedimentary rock known as the Marcellus Shale, which is rich in natural gas. The geological foundation was deposited almost 400 million years ago by volcanic activity, and it contains lithium from volcanic ash.

Over vast stretches of time, deep groundwater has dissolved the lithium in these rocks, essentially "mining the subsurface", according to Justin Mackey, a researcher at the National Energy Technology Laboratory in Pennsylvania.

Mackey and his colleagues have now found that when wastewater is dredged up from the deep by fracking activities, it contains an astonishing amount of lithium.


There's more at the link.

Looks like the irresistible force is about to collide headlong with the immovable object, in environmental terms.  The US government and the tree-huggers want to eliminate as much fossil fuel as possible, and are therefore pushing electric vehicles as the solution.  On the other hand, if they want to do that, they have to have lithium for the EV's batteries:  and a major source for lithium now appears to be the fracking technologies they've been trying to ban for years, on the grounds that they're a major source of pollution and other problems.

Which do they want most?  Abundant batteries?  Or abundant gasoline as a derivative of abundant batteries?  Will they do without the latter, even though it means that obtaining the former will be more difficult and much more expensive?  Or will they fuel the vehicles of those of us who reject EV's as being insufficiently developed to be practical, in order to have more EV's to sell to those who want them?

Oh, the irony is delicious . . .



Peter


Tuesday, March 12, 2024

Missile and drone guidance systems: the old is new again?

 

I was interested to read that a new Russian battlefield drone appears to be guided by unspooling several miles of optical fiber cable behind it as it flies.


Russian forces in Ukraine appear to now be using so-called first-person view kamikaze drones controlled via a physical fiber optic line rather than a wireless data link. This configuration offers a control method that is immune to radiofrequency electronic warfare, but that also imposes certain limitations on how the system can be employed.

. . .

This is not a new concept, broadly, either. Wire-guided anti-tank missiles have been in service around the world for decades now and many current-generation designs, such as some of Israel's Spike family, use fiber optic cables. The ubiquitous American-made TOW missile has that feature directly in its name: Tube-launched, Optically-tracked, Wire-guided.

It's also worth noting that many torpedoes also use a similar spooling wire command-link concept.

. . .

Another major advantage to a wired FPV drone is that it would not radiate any energy, nor would the user some distance away, that could be detected. These electronic emissions are key ways drones are detected in the first place and they can also prove deadly for the operator if electronic surveillance systems can triangulate their position. There is no such vulnerability with a wire-guided FPV drone.


There's more at the link.

It's interesting to view this "new" technology through the lens of the history of military technology.  The first anti-tank missiles, such as the French SS.10 (also used by the USA) and ENTAC, the Soviet Snapper, Swatter and Sagger, and some others, were guided by means of instructions sent to the missile in flight over wires it unrolled behind it.  Steering was usually by some kind of joystick, something like those seen on computer game controllers.  On battlefields where many such weapons had been used, such as those of the Yom Kippur War in 1973, observers sometimes found such guidance wires "festooning the landscape", to quote one report.  The system worked, but due to the slow speed of the missiles and their limited range, the operators had to keep their eyes on the weapon to control it.  That meant their position might be identified by the launch plume of the missile, allowing enemy tanks to target it, trying to take out the missile operators before they could steer the missile into the tank.

The US TOW missile of the late 1960's was designed to simplify this system by taking out the need for continuous operator guidance.  Instead of manipulating a joystick, the operator kept his sight focused on the target.  A computer in the launch unit translated the movements of the target into instructions to the missile, which was still guided by wires it unspooled behind it.  Later models allowed the missile tube to be some distance from the control unit, protecting the operator from enemy interference, but still requiring him to keep his sights on the target.

The wire unspooled from the missile was necessarily very light and thin, and could be torn or broken by obstacles on the battlefield.  Accordingly, efforts were made to develop missiles that did not need wires.  For example, Israel developed the MAPATS missile in the 1980's, which was basically a copy of the TOW missile with a laser guidance unit replacing the wire.  The operator simply shone a laser beam onto the target, and the missile slaved itself to the beam and remained aligned with it until it struck.  South Africa copied that concept with its ZT3 version of MAPATS.  Two disadvantages remained:  the operator had to keep the laser beam on the target, meaning he could not duck down behind cover, and the laser beam could be detected by the target, which could then maneuver behind cover to get away from it or fire at the source of the beam.  That's why almost all front-line armored vehicles today have laser sensors, to tell the crew when a laser is being shone at them and from where it's coming.

The latest anti-tank missiles have incorporated the targeting and guidance system into the missile, which is now entirely independent of the operator once it's been launched.  The US Javelin missile is an example of this, as is the Russian Kornet-EM.  The operator merely shows the target to the missile using its built-in sensors, then launches it.  He can take cover or move to a new position while the missile navigates itself to its destination.

Meanwhile, the first battlefield unmanned aerial vehicles (UAV's, or "drones") began in much the same way.  Early models were guided by means of a joystick sending radio commands to the drone.  This system worked, but was prone to interception or jamming.  Many drones today still use it, meaning they can be countered by stronger signals sent by an enemy to "take over" the drone, or jamming instructions from its operator.  Satellite guidance is a lot more difficult to jam, but it's also a lot more expensive to install, so smaller battlefield drones mostly don't use it.  Other models are autonomous, in that they're programmed with a preset course and then sent to fly that circuit, returning to a preprogrammed point with video or other sensor information about their targets.  Since there's no continuous operator guidance, there are no signals to jam, so the drone is much harder to stop.  On the other hand, an autonomous drone can't be directed to new targets as they're detected, or commanded to strike one of them if necessary.

By going back to the early concept of wire guidance (in this case, optical fiber cable rather than plain copper wire), a drone is once again invulnerable to jamming or other electronic countermeasures.  Fiber optic cable is much thinner and lighter than copper wire, meaning more of it can be carried aboard the drone without too much of a weight penalty;  and it can carry a lot more information than wire, meaning the drone can use its own sensors to send data back to the operator, who can then redirect the UAV as required.

If this proves successful on the Ukraine battlefields, I won't be surprised to see anti-tank missiles using the same technology deployed before long.  Since they won't need a laser beam for guidance, laser sensors on tanks will be rendered useless, giving no warning of the missile being fired or of its imminent arrival.  (Of course, there's always the possibility that there'll be a fusion between drones and anti-tank missiles.  Instead of having two separate systems, drones might carry explosive charges with them as a matter of course, turning them from sensor platforms into weapons at a moment's notice and rendering a missile unnecessary.)  Looks like a lot more battlefields might be festooned with wires in the not too distant future.

Given the still relatively slow speeds of drones, and the problem that anything moving too fast won't be able to deploy wires or cables behind it, I wonder if earlier ultra-high-speed missile programs such as the Vought HVM, the MGM-166 LOSAT, and the Compact Kinetic Energy Missile (CKEM), might not be re-evaluated?  Here's a brief overview of those programs.




None of the ultra-high-speed missiles entered production, but a lot of effort went into developing them, and that knowledge base is still out there.  If they could be engineered to carry their own sensors, not needing external guidance, their high speed might make them a real threat to an enemy expecting only much slower optical-fiber-guided drones and missiles.  In fact, what if the latest-generation fiber-equipped drones could act as remote sensors, staying a safe distance from enemy defenses while feeding targeting information to the high-speed missile launch unit using their unjammable fiber optic cable?  Just a thought . . .

Peter


Saturday, March 9, 2024

Saturday Snippet: It all comes down to corn

 

First published in 2006, Michael Pollan's book "The Omnivore's Dilemma: A Natural History of Four Meals" exposed the hugely artificial, compromised nature of our First World food chain.  It's been a source of enlightenment and controversy ever since.



The blurb reads:


What should we have for dinner? Ten years ago, Michael Pollan confronted us with this seemingly simple question and, with The Omnivore’s Dilemma, his brilliant and eye-opening exploration of our food choices, demonstrated that how we answer it today may determine not only our health but our survival as a species. In the years since, Pollan’s revolutionary examination has changed the way Americans think about food. Bringing wide attention to the little-known but vitally important dimensions of food and agriculture in America, Pollan launched a national conversation about what we eat and the profound consequences that even the simplest everyday food choices have on both ourselves and the natural world. Ten years later, The Omnivore’s Dilemma continues to transform the way Americans think about the politics, perils, and pleasures of eating.


I'm still in the process of reading the book.  I'm finding it fascinating, and learning a lot as I proceed.  I thought I'd start you off with the strange tale of how the humble corn plant has come to dominate so much of our food production and consumption.


Air-conditioned, odorless, illuminated by buzzing fluorescent tubes, the American supermarket doesn’t present itself as having very much to do with Nature. And yet what is this place if not a landscape (man-made, it’s true) teeming with plants and animals?

I’m not just talking about the produce section or the meat counter, either—the supermarket’s flora and fauna. Ecologically speaking, these are this landscape’s most legible zones, the places where it doesn’t take a field guide to identify the resident species. Over there’s your eggplant, onion, potato, and leek; here your apple, banana, and orange. Spritzed with morning dew every few minutes, Produce is the only corner of the supermarket where we’re apt to think “Ah, yes, the bounty of Nature!” Which probably explains why such a garden of fruits and vegetables (sometimes flowers, too) is what usually greets the shopper coming through the automatic doors.

Keep rolling, back to the mirrored rear wall behind which the butchers toil, and you encounter a set of species only slightly harder to identify—there’s chicken and turkey, lamb and cow and pig. Though in Meat the creaturely character of the species on display does seem to be fading, as the cows and pigs increasingly come subdivided into boneless and bloodless geometrical cuts. In recent years some of this supermarket euphemism has seeped into Produce, where you’ll now find formerly soil-encrusted potatoes cubed pristine white, and “baby” carrots machine-lathed into neatly tapered torpedoes. But in general here in flora and fauna you don’t need to be a naturalist, much less a food scientist, to know what species you’re tossing into your cart.

Venture farther, though, and you come to regions of the supermarket where the very notion of species seems increasingly obscure: the canyons of breakfast cereals and condiments; the freezer cases stacked with “home meal replacements” and bagged platonic peas; the broad expanses of soft drinks and towering cliffs of snacks; the unclassifiable Pop-Tarts and Lunchables; the frankly synthetic coffee whiteners and the Linnaeus-defying Twinkie. Plants? Animals?! Though it might not always seem that way, even the deathless Twinkie is constructed out of…well, precisely what I don’t know offhand, but ultimately some sort of formerly living creature, i.e., a species. We haven’t yet begun to synthesize our foods from petroleum, at least not directly.

If you do manage to regard the supermarket through the eyes of a naturalist, your first impression is apt to be of its astounding biodiversity. Look how many different plants and animals (and fungi) are represented on this single acre of land! What forest or prairie could hope to match it? There must be a hundred different species in the produce section alone, a handful more in the meat counter. And this diversity appears only to be increasing: When I was a kid, you never saw radicchio in the produce section, or a half dozen different kinds of mushrooms, or kiwis and passion fruit and durians and mangoes. Indeed, in the last few years a whole catalog of exotic species from the tropics has colonized, and considerably enlivened, the produce department. Over in fauna, on a good day you’re apt to find—beyond beef—ostrich and quail and even bison, while in Fish you can catch not just salmon and shrimp but catfish and tilapia, too. Naturalists regard biodiversity as a measure of a landscape’s health, and the modern supermarket’s devotion to variety and choice would seem to reflect, perhaps even promote, precisely that sort of ecological vigor.

Except for the salt and a handful of synthetic food additives, every edible item in the supermarket is a link in a food chain that begins with a particular plant growing in a specific patch of soil (or, more seldom, stretch of sea) somewhere on earth. Sometimes, as in the produce section, that chain is fairly short and easy to follow: As the netted bag says, this potato was grown in Idaho, that onion came from a farm in Texas. Move over to Meat, though, and the chain grows longer and less comprehensible: The label doesn’t mention that that rib-eye steak came from a steer born in South Dakota and fattened in a Kansas feedlot on grain grown in Iowa. Once you get into the processed foods you have to be a fairly determined ecological detective to follow the intricate and increasingly obscure lines of connection linking the Twinkie, or the nondairy creamer, to a plant growing in the earth someplace, but it can be done.

So what exactly would an ecological detective set loose in an American supermarket discover, were he to trace the items in his shopping cart all the way back to the soil? The notion began to occupy me a few years ago, after I realized that the straightforward question “What should I eat?” could no longer be answered without first addressing two other even more straightforward questions: “What am I eating? And where in the world did it come from?” Not very long ago an eater didn’t need a journalist to answer these questions. The fact that today one so often does suggests a pretty good start on a working definition of industrial food: Any food whose provenance is so complex or obscure that it requires expert help to ascertain.

When I started trying to follow the industrial food chain—the one that now feeds most of us most of the time and typically culminates either in a supermarket or fast-food meal—I expected that my investigations would lead me to a wide variety of places. And though my journeys did take me to a great many states, and covered a great many miles, at the very end of these food chains (which is to say, at the very beginning), I invariably found myself in almost exactly the same place: a farm field in the American Corn Belt. The great edifice of variety and choice that is an American supermarket turns out to rest on a remarkably narrow biological foundation comprised of a tiny group of plants that is dominated by a single species: Zea mays, the giant tropical grass most Americans know as corn.

Corn is what feeds the steer that becomes the steak. Corn feeds the chicken and the pig, the turkey and the lamb, the catfish and the tilapia and, increasingly, even the salmon, a carnivore by nature that the fish farmers are reengineering to tolerate corn. The eggs are made of corn. The milk and cheese and yogurt, which once came from dairy cows that grazed on grass, now typically come from Holsteins that spend their working lives indoors tethered to machines, eating corn.

Head over to the processed foods and you find ever more intricate manifestations of corn. A chicken nugget, for example, piles corn upon corn: what chicken it contains consists of corn, of course, but so do most of a nugget’s other constituents, including the modified corn starch that glues the thing together, the corn flour in the batter that coats it, and the corn oil in which it gets fried. Much less obviously, the leavenings and lecithin, the mono-, di-, and triglycerides, the attractive golden coloring, and even the citric acid that keeps the nugget “fresh” can all be derived from corn.

To wash down your chicken nuggets with virtually any soft drink in the supermarket is to have some corn with your corn. Since the 1980s virtually all the sodas and most of the fruit drinks sold in the supermarket have been sweetened with high-fructose corn syrup (HFCS)—after water, corn sweetener is their principal ingredient. Grab a beer for your beverage instead and you’d still be drinking corn, in the form of alcohol fermented from glucose refined from corn. Read the ingredients on the label of any processed food and, provided you know the chemical names it travels under, corn is what you will find. For modified or unmodified starch, for glucose syrup and maltodextrin, for crystalline fructose and ascorbic acid, for lecithin and dextrose, lactic acid and lysine, for maltose and HFCS, for MSG and polyols, for the caramel color and xanthan gum, read: corn. Corn is in the coffee whitener and Cheez Whiz, the frozen yogurt and TV dinner, the canned fruit and ketchup and candies, the soups and snacks and cake mixes, the frosting and gravy and frozen waffles, the syrups and hot sauces, the mayonnaise and mustard, the hot dogs and the bologna, the margarine and shortening, the salad dressings and the relishes and even the vitamins. (Yes, it’s in the Twinkie, too.) There are some forty-five thousand items in the average American supermarket and more than a quarter of them now contain corn. This goes for the nonfood items as well—everything from the toothpaste and cosmetics to the disposable diapers, trash bags, cleansers, charcoal briquettes, matches, and batteries, right down to the shine on the cover of the magazine that catches your eye by the checkout: corn. Even in Produce on a day when there’s ostensibly no corn for sale you’ll nevertheless find plenty of corn: in the vegetable wax that gives the cucumbers their sheen, in the pesticide responsible for the produce’s perfection, even in the coating on the cardboard it was shipped in. Indeed, the supermarket itself—the wallboard and joint compound, the linoleum and fiberglass and adhesives out of which the building itself has been built—is in no small measure a manifestation of corn.

And us?

. . .

Americans eat much more wheat than corn—114 pounds of wheat flour per person per year, compared to 11 pounds of corn flour. The Europeans who colonized America regarded themselves as wheat people, in contrast to the native corn people they encountered; wheat in the West has always been considered the most refined, or civilized, grain. If asked to choose, most of us would probably still consider ourselves wheat people (except perhaps the proud corn-fed Midwesterners, and they don’t know the half of it), though by now the whole idea of identifying with a plant at all strikes us as a little old-fashioned. Beef people sounds more like it, though nowadays chicken people, which sounds not nearly so good, is probably closer to the truth of the matter. But carbon 13 doesn’t lie, and researchers who have compared the isotopes in the flesh or hair of North Americans to those in the same tissues of Mexicans report that it is now we in the North who are the true people of corn. “When you look at the isotope ratios,” Todd Dawson, a Berkeley biologist who’s done this sort of research, told me, “we North Americans look like corn chips with legs.” Compared to us, Mexicans today consume a far more varied carbon diet: the animals they eat still eat grass (until recently, Mexicans regarded feeding corn to livestock as a sacrilege); much of their protein comes from legumes; and they still sweeten their beverages with cane sugar.

So that’s us: processed corn, walking.

. . .

Early in the twentieth century American corn breeders figured out how to bring corn reproduction under firm control and to protect the seed from copiers. The breeders discovered that when they crossed two corn plants that had come from inbred lines—from ancestors that had themselves been exclusively self-pollinated for several generations—the hybrid offspring displayed some highly unusual characteristics. First, all the seeds in that first generation (F-1, in the plant breeder’s vocabulary) produced genetically identical plants—a trait that, among other things, facilitates mechanization. Second, those plants exhibited heterosis, or hybrid vigor—better yields than either of their parents. But most important of all, they found that the seeds produced by these seeds did not “come true”—the plants in the second (F-2) generation bore little resemblance to the plants in the first. Specifically, their yields plummeted by as much as a third, making their seeds virtually worthless.

Hybrid corn now offered its breeders what no other plant at that time could: the biological equivalent of a patent. Farmers now had to buy new seeds every spring; instead of depending upon their plants to reproduce themselves, they now depended on a corporation. The corporation, assured for the first time of a return on its investment in breeding, showered corn with attention—R&D, promotion, advertising—and the plant responded, multiplying its fruitfulness year after year. With the advent of the F-1 hybrid, a technology with the power to remake nature in the image of capitalism, Zea mays entered the industrial age and, in time, it brought the whole American food chain with it.

. . .

Naylor has no idea how many bushels of corn per acre his grandfather could produce, but the average back in 1920 was about twenty bushels per acre—roughly the same yields historically realized by Native Americans. Corn then was planted in widely spaced bunches in a checkerboard pattern so farmers could easily cultivate between the stands in either direction. Hybrid seed came on the market in the late 1930s, when his father was farming. “You heard stories,” George shouted over the din of the tractor. “How they talked him into raising an acre or two of the new hybrid, and by god when the old corn fell over, the hybrid stood straight up. Doubled Dad’s yields, till he was getting seventy to eighty an acre in the fifties.” George has doubled that yet again, some years getting as much as two hundred bushels of corn per acre. The only other domesticated species ever to have multiplied its productivity by such a factor is the Holstein cow.

“High yield” is a fairly abstract concept, and I wondered what it meant at the level of the plant: more cobs per stalk? more kernels per cob? Neither of the above, Naylor explained. The higher yield of modern hybrids stems mainly from the fact that they can be planted so close together, thirty thousand to the acre instead of eight thousand in his father’s day. Planting the old open-pollinated (nonhybrid) varieties so densely would result in stalks grown spindly as they jostled each other for sunlight; eventually the plants would topple in the wind. Hybrids have been bred for thicker stalks and stronger root systems, the better to stand upright in a crowd and withstand mechanical harvesting. Basically, modern hybrids can tolerate the corn equivalent of city life, growing amid the multitudes without succumbing to urban stress.

You would think that competition among individuals would threaten the tranquility of such a crowded metropolis, yet the modern field of corn forms a most orderly mob. This is because every plant in it, being an F-1 hybrid, is genetically identical to every other. Since no individual plant has inherited any competitive edge over any other, precious resources like sunlight, water, and soil nutrients are shared equitably. There are no alpha corn plants to hog the light or fertilizer. The true socialist utopia turns out to be a field of F-1 hybrid plants.

Iowa begins to look a little different when you think of its sprawling fields as cities of corn, the land, in its own way, settled as densely as Manhattan for the very same purpose: to maximize real estate values. There may be little pavement out here, but this is no middle landscape. Though by any reasonable definition Iowa is a rural state, it is more thoroughly developed than many cities: A mere 2 percent of the state’s land remains what it used to be (tall-grass prairie), every square foot of the rest having been completely remade by man. The only thing missing from this man-made landscape is…man.

. . .

There are many reasons for the depopulation of the American Farm Belt, but the triumph of corn deserves a large share of the blame—or the credit, depending on your point of view.

When George Naylor’s grandfather was farming, the typical Iowa farm was home to whole families of different plant and animal species, corn being only the fourth most common. Horses were the first, because every farm needed working animals (there were only 225 tractors in all of America in 1920), followed by cattle, chickens, and then corn. After corn came hogs, apples, hay, oats, potatoes, and cherries; many Iowa farms also grew wheat, plums, grapes, and pears. This diversity allowed the farm not only to substantially feed itself—and by that I don’t mean feed only the farmers, but also the soil and the livestock—but to withstand a collapse in the market for any one of those crops. It also produced a completely different landscape than the Iowa of today.

“You had fences everywhere,” George recalled, “and of course pastures. Everyone had livestock, so large parts of the farm would be green most of the year. The ground never used to be this bare this long.” For much of the year, from the October harvest to the emergence of the corn in mid-May, Greene County is black now, a great tarmac only slightly more hospitable to wildlife than asphalt. Even in May the only green you see are the moats of lawn surrounding the houses, the narrow strips of grass dividing one farm from another, and the roadside ditches. The fences were pulled up when the animals left, beginning in the fifties and sixties, or when they moved indoors, as Iowa’s hogs have more recently done; hogs now spend their lives in aluminum sheds perched atop manure pits. Greene County in the spring has become a monotonous landscape, vast plowed fields relieved only by a dwindling number of farmsteads, increasingly lonesome islands of white wood and green grass marooned in a sea of black. Without the fences and hedgerows to slow it down, Naylor says, the winds blow more fiercely in Iowa today than they once did.

Corn isn’t solely responsible for remaking this landscape: It was the tractor, after all, that put the horses out of work, and with the horses went the fields of oats and some of the pasture. But corn was the crop that put cash in the farmer’s pocket, so as corn yields began to soar at midcentury, the temptation was to give the miracle crop more and more land. Of course, every other farmer in America was thinking the same way (having been encouraged to do so by government policies), with the inevitable result that the price of corn declined. One might think falling corn prices would lead farmers to plant less of it, but the economics and psychology of agriculture are such that exactly the opposite happened.

Beginning in the fifties and sixties, the flood tide of cheap corn made it profitable to fatten cattle on feedlots instead of on grass, and to raise chickens in giant factories rather than in farmyards. Iowa livestock farmers couldn’t compete with the factory-farmed animals their own cheap corn had helped spawn, so the chickens and cattle disappeared from the farm, and with them the pastures and hay fields and fences. In their place the farmers planted more of the one crop they could grow more of than anything else: corn. And whenever the price of corn slipped they planted a little more of it, to cover expenses and stay even. By the 1980s the diversified family farm was history in Iowa, and corn was king.

(Planting corn on the same ground year after year brought down the predictable plagues of insects and disease, so beginning in the 1970s Iowa farmers started alternating corn with soybeans, a legume. Recently, though, bean prices having fallen and bean diseases having risen, some farmers are going back to a risky rotation of “corn on corn.”)

With the help of its human and botanical allies (i.e., farm policy and soybeans), corn had pushed the animals and their feed crops off the land, and steadily expanded into their paddocks and pastures and fields. Now it proceeded to push out the people. For the radically simplified farm of corn and soybeans doesn’t require nearly as much human labor as the old diversified farm, especially when the farmer can call on sixteen-row planters and chemical weed killers. One man can handle a lot more acreage by himself when it’s planted in monoculture, and without animals to care for he can take the weekend off, and even think about spending the winter in Florida.

. . .

The great turning point in the modern history of corn, which in turn marks a key turning point in the industrialization of our food, can be dated with some precision to the day in 1947 when the huge munitions plant at Muscle Shoals, Alabama, switched over to making chemical fertilizer. After the war the government had found itself with a tremendous surplus of ammonium nitrate, the principal ingredient in the making of explosives. Ammonium nitrate also happens to be an excellent source of nitrogen for plants. Serious thought was given to spraying America’s forests with the surplus chemical, to help out the timber industry. But agronomists in the Department of Agriculture had a better idea: Spread the ammonium nitrate on farmland as fertilizer. The chemical fertilizer industry (along with that of pesticides, which are based on poison gases developed for the war) is the product of the government’s effort to convert its war machine to peacetime purposes. As the Indian farmer activist Vandana Shiva says in her speeches, “We’re still eating the leftovers of World War II.”

Hybrid corn turned out to be the greatest beneficiary of this conversion. Hybrid corn is the greediest of plants, consuming more fertilizer than any other crop. For though the new hybrids had the genes to survive in teeming cities of corn, the richest acre of Iowa soil could never have fed thirty thousand hungry corn plants without promptly bankrupting its fertility. To keep their land from getting “corn sick” farmers in Naylor’s father’s day would carefully rotate their crops with legumes (which add nitrogen to the soil), never growing corn more than twice in the same field every five years; they would also recycle nutrients by spreading their cornfields with manure from their livestock. Before synthetic fertilizers the amount of nitrogen in the soil strictly limited the amount of corn an acre of land could support. Though hybrids were introduced in the thirties, it wasn’t until they made the acquaintance of chemical fertilizers in the 1950s that corn yields exploded.

. . .

On the day in the 1950s that George Naylor’s father spread his first load of ammonium nitrate fertilizer, the ecology of his farm underwent a quiet revolution. What had been a local, sun-driven cycle of fertility, in which the legumes fed the corn which fed the livestock which in turn (with their manure) fed the corn, was now broken. Now he could plant corn every year and on as much of his acreage as he chose, since he had no need for the legumes or the animal manure. He could buy fertility in a bag, fertility that had originally been produced a billion years ago halfway around the world.

Liberated from the old biological constraints, the farm could now be managed on industrial principles, as a factory transforming inputs of raw material—chemical fertilizer—into outputs of corn. Since the farm no longer needs to generate and conserve its own fertility by maintaining a diversity of species, synthetic fertilizer opens the way to monoculture, allowing the farmer to bring the factory’s economies of scale and mechanical efficiency to nature. If, as has sometimes been said, the discovery of agriculture represented the first fall of man from the state of nature, then the discovery of synthetic fertility is surely a second precipitous fall. Fixing nitrogen allowed the food chain to turn from the logic of biology and embrace the logic of industry. Instead of eating exclusively from the sun, humanity now began to sip petroleum.

Corn adapted brilliantly to the new industrial regime, consuming prodigious quantities of fossil fuel energy and turning out ever more prodigious quantities of food energy. More than half of all the synthetic nitrogen made today is applied to corn, whose hybrid strains can make better use of it than any other plant. Growing corn, which from a biological perspective had always been a process of capturing sunlight to turn it into food, has in no small measure become a process of converting fossil fuels into food. This shift explains the color of the land: The reason Greene County is no longer green for half the year is because the farmer who can buy synthetic fertility no longer needs cover crops to capture a whole year’s worth of sunlight; he has plugged himself into a new source of energy. When you add together the natural gas in the fertilizer to the fossil fuels it takes to make the pesticides, drive the tractors, and harvest, dry, and transport the corn, you find that every bushel of industrial corn requires the equivalent of between a quarter and a third of a gallon of oil to grow it—or around fifty gallons of oil per acre of corn. (Some estimates are much higher.) Put another way, it takes more than a calorie of fossil fuel energy to produce a calorie of food; before the advent of chemical fertilizer the Naylor farm produced more than two calories of food energy for every calorie of energy invested. From the standpoint of industrial efficiency, it’s too bad we can’t simply drink the petroleum directly.

Ecologically this is a fabulously expensive way to produce food—but “ecologically” is no longer the operative standard. As long as fossil fuel energy is so cheap and available, it makes good economic sense to produce corn this way. The old way of growing corn—using fertility drawn from the sun—may have been the biological equivalent of a free lunch, but the service was much slower and the portions were much skimpier. In the factory time is money, and yield is everything.


There's a whole lot more in the book to explore and think about.  It's certainly opened my eyes to the fundamentally unnatural way in which we feed ourselves today - unnatural in the sense that if our food production were suddenly to be left to nature alone, without scientific and technological assistance, most of us would starve to death within a matter of weeks.

It also exposes the dangerous fallacy that if society goes to hell in a handbasket, preppers and survivalists will be able to grow their own food on secluded farms to keep themselves alive.  That sort of mixed-production family farm no longer exists in most cases, and where it does, it usually has to be subsidized by some sort of outside income.  It's simply no longer economical to grow or raise everything one needs out of one's own resources.  It can be done, but it takes so much effort to do so without the aid of technology (which won't be available in a long-drawn-out crisis or emergency) that it's effectively impossible for all except experienced farmers - of whom we have very few these days.  Those who succeed in doing so will almost certainly not be growing or raising everything they need, anyway, and will have to trade for things they can't produce themselves - but who will be producing those things in such a situation?

Food for thought indeed.

Peter


Tuesday, March 5, 2024

This does not increase my confidence in the health care system...

 

Considering the consequences of hasty deployment of the COVID-19 vaccines, this report doesn't fill me with confidence.


Investigative reporter Jefferey Jaxen brought to light an alarming reality on The Highwire Thursday: the ability to “vaccinate” the entire world without injecting large amounts of the population is upon us.

What we’re referring to is self-spreading vaccines, a technology that was almost ready to be deployed for the coronavirus pandemic — but ultimately passed on for mRNA injections instead.

. . .

The only thing stopping the mass use of this technology is this pesky thing known as informed consent. But the reality is, as evidenced by the COVID era, informed consent is not what it used to be.

The whole idea behind self-spreading vaccines is largely grounded in circumventing informed consent from individuals, or what scientists like to refer to as “behavioral barriers” or “vaccine delay.”

. . .

The U.S. military and DARPA have also been researching self-spreading vaccines, with DARPA exploring antivirals to “evolve” in real-time against new viral strains.

However, if a self-spreading vaccine mutates in an unforeseen way, it could potentially pose grave risks for the entire population.

Attorney Aaron Siri issued a statement on the matter.

“With this product, the whole idea is they release it to basically one person, and it spreads to every single person on the globe. So if they mess it up one time, just once, just once,” he emphasized, “they can mess up the entire world.”

“What might even be the biggest victim ... if they ever release this thing, it’s going to be civil, individual rights... Here, they’re going to release a product where you’re going to have no choice effectively but to take it. That is the ultimate crushing of individual and civil rights.”


There's more at the link.

A quick Internet search for "self spreading vaccines" reveals that there's been a lot of effort put into this since 2022, building on more limited research for a couple of decades prior to that.  Therefore, although the subject is almost ignored by the mainstream media, I'm forced to conclude that the report is probably accurate.

Unfortunately, given the enormous increase in vaccine-related complications, injuries and health issues that have arisen after vaccination for COVID-19, I have no confidence at all that these self spreading vaccines will be any better tested and controlled, and will inflict any fewer negative consequences on most or all of us.  After all, if they don't bother to tell us that the vaccine is spreading itself, how will we know what to look out for?  And how will we know what caused them?  It's like a "get out of jail free" card for vaccine developers and manufacturers - and for the health care system(s) that will take advantage of this technology, whether their customers want it or not.

Welcome to the Brave New World, where your medical masters decide on your behalf what medications, vaccines and other substances you need, and administer them without so much as a "by your leave".  How does it feel to be just another digit in the health care system?




Peter


Thursday, February 22, 2024

A forgery, but... why???

 

I'm puzzled by this report.


A 280 million-year-old fossil thought to be a well-preserved specimen of an ancient reptile is largely a forgery, according to new research.

The fossil, initially discovered in the Italian Alps in 1931, has the scientific name Tridentinosaurus antiquus. Scientists thought the dark, deep outline of the lizardlike body encased in rock was skin and soft tissue, and they considered the fossil to be a puzzle piece for understanding early reptile evolution.

. . .

A new, detailed analysis has revealed that the dark color of the fossil isn’t preserved genetic material ... researchers determined that the body outline was carved in the rock and painted with “animal charcoal,” a commercial pigment used about 100 years ago that was made by burning animal bones. The carving also explained why the specimen appeared to retain such a lifelike shape, rather than appearing flatter like a genuine fossil.

. . .

Intriguingly, there are actual bones within the fossil. The hind limbs, although in poor condition, are real, and there are also traces of osteoderms, or scalelike structures. Now, the researchers are trying to determine the exact age of the bones and what animal they belonged to.

. . .

Rossi and her team can’t be entirely sure that the forgery was done on purpose.

“We believe that, since some of the bones are visible, someone tried to expose more of the skeleton, by excavating more or less where someone would expect to find the rest of the animal,” Rossi said. “The lack of proper tools for preparing the hard rock did not help and the application of the paint in the end was perhaps a way to embellish the final work. Unfortunately, whether all of this was intentional or not, it did mislead many experts in interpreting this fossil as exceptionally preserved.”


There's more at the link.

To my mind, this discovery raises even more questions than the original discovery.

  • Who did it, and why?  It obviously wasn't an attempt to gain publicity for an individual, because there was no fuss at the time of the "discovery" naming any individual as having found it.  If it wasn't for publicity, why did the researcher(s) responsible not simply document what they'd done, or simply discard the sample along with other debris of no scientific value?  I don't think anyone would have complained, given that they didn't destroy anything worthwhile in the process.
  • Why did nobody in the nearly a century since the "discovery" ask more questions about it?  Why was it left until 2021 to begin an investigation?  Clearly, it wasn't considered an important enough issue by previous generations of researchers.  What drew their attention to it so long after the fact?
  • Where did the actual bones discovered during the investigation come from?  Was there an Italian Kentucky Fried Chicken equivalent way back then, and did the originators of the "fossil" simply discard their dinner bones along with the ruined research material?  Did the investigation discover and analyze "eleven herbs and spices" on the fossilized remains?

I doubt we'll ever find answers . . . but it's an intriguing discovery.

Perhaps we should ask the same investigative team to take a long, hard look at the present inhabitants of the White House.  Are there, perhaps, fake fossils to be found there too?



Peter


Thursday, February 1, 2024

What else did they expect?

 

I had to laugh at this report.


Poisoned AI went rogue during training and couldn't be taught to behave again in 'legitimately scary' study

AI researchers found that widely used safety training techniques failed to remove malicious behavior from large language models — and one technique even backfired, teaching the AI to recognize its triggers and better hide its bad behavior from the researchers.


The details are at the link.

Had those researchers never heard the term, "Like father, like son"?  Had they never considered that any artificial intelligence that human intelligence can create or develop is likely to resemble, and emulate, the intelligence that inspired it?

Human beings are flawed creatures - each and every one of us.  We have good points, bad points, indifferent points.  Criminals, psychopaths and their ilk are as human as the rest of us, and in some cases good folks exhibit many of the traits of bad folks.  (Consider leadership.  It's amazing how many leaders, in business, politics or anywhere else, exhibit many of the traits of a psychopath.  It's almost like it goes with the territory.)

Any artificial intelligence designed to work with, and sometimes take the place of, human intelligence is going to have to be the same way.  If it isn't, it won't play well with others, because it won't understand - instinctively, empathically or intellectually - how their minds work, and won't know how to interact with them.  I'm sure the developers of artificial intelligence aren't explicitly trying to make their products psychopathic, or as weird as humans can be;  but those traits are part of human intelligence.  I'll be very surprised if any artificial intelligence designed to mimic and/or duplicate the latter doesn't turn out the same way.  I don't see how it can be otherwise.

Peter


Tuesday, January 23, 2024

The COVID-19 scam is now plain to see - but the authorities worldwide continue to exploit it

 

Two of the leading lights in uncovering the lies about COVID-19 have just laid out the latest evidence that it was a deliberately manufactured plague, and that vaccines against it increase our risk of death.


The Smoking Gun - With a confession note

Emily Kopp from The US Right to Know (USRTK) has obtained additional detail about the DEFUSE proposal that is far more than a smoking gun but, in fact, is more analogous to finding the gun, fingerprint and confession note in one place.

Emily found evidence that this proposal also listed the very restriction enzyme (BsmBI) that Bruttel et al. claimed could build the virus. Not only did Bruttel et al. notice that BsmBI sites were conveniently evenly spaced throughout the viral genome and this spacing was not only NOT observed in other CVs but that this approach made complete sense as a logical path to manually assembling the genome.

Lo and behold the DEFUSE proposal actually contains NEB R0580S part numbers to order these very enzymes to construct the virus in the manner Bruttel et al. predicted.

This is a case closed event!

There is no more debate. C19 was made in a lab. Which lab and when is still a hot topic but it didn’t come from a pangolin courting a bat.


The New Zealand data leaked by whistleblower Barry Young is unassailable proof the COVID vaccines increase your risk of death

None of the leadership team at Health New Zealand is interested in the fact that their database shows they are killing New Zealanders with these vaccines. They will not let me speak to any of their epidemiologists and they won’t show me the time series analysis done by their epidemiologists for some reason. Why not? That’s the best way to silence me: just show me how I got it wrong.

. . .

When Barry has his day in court, he gets to do something none of us have been able to do: force these people to answer the questions we’ve always wanted to know the answer to but they always refused to answer.

These are questions such as:

1.  What reports did the epidemiologists create based on the NZ data showing the vaccine reduces all-cause mortality in New Zealand?

2.  What investigations were made by Health New Zealand after Barry Young informed management there was a safety problem with the COVID vaccine?

3.  How did the New Zealand epidemiologists at Health New Zealand explain the time-series analysis of the leaked data? The time-series analysis shows the vaccines increased the risk of death. If the vaccine didn’t cause this, then why were recently vaccinated people dying at a progressively higher rate than the rest of New Zealand (those of the same age). What did all these people have in common that accelerated their death if it wasn’t the vaccine?

I can’t wait.


There's more at both links.

And yet our government is still urging us to get booster shots, and those who refuse to contaminate themselves with a COVID-19 vaccine are still facing discrimination in many cities and states.

Doubling down on the manipulation of COVID-19 for government purposes, the Director-General of the World Health Organization has just warned of "Disease X", and stressed the need for even more government control of the population to avoid this so-far-non-existent disease.


Ghebreyesus, speaking in front of an audience at the World Economic Forum in Davos on Wednesday, said that he hoped countries would reach a pandemic agreement by May to address this “common enemy.”

Disease X is a hypothetical “placeholder” virus that has not yet been formed, but scientists say it could be 20 times deadlier than COVID-19.

It was added to the WHO’s short list of pathogens for research in 2017 that could cause a “serious international epidemic,” according to a 2022 WHO press release.

Ghebreyesus said that COVID-19 was the first Disease X, but it’s important to prepare for another pandemic.

. . .

“The pandemic agreement can bring all the experience, all the challenges that we have faced and all the solutions into one,” Ghebreyesus said. “That agreement can help us to prepare for the future in a better way.”

“This is a common global interest, and very narrow national interests should not come into the way.”


Again, more at the link.

This is the same man who spouted the "party line" about COVID-19 for years, and exonerated China from any involvement in developing the virus (completely contrary to the evidence that has since emerged), and is pushing for a new global treaty to allow the WHO to mandate - order - national governments to take whatever action it sees fit if a new "Disease X" should emerge.  National sovereignty will be overridden by this treaty.  A global one-size-fits-all solution will be imposed on us - even if it kills us.

Well, after the disastrous display of government and bureaucratic incompetence, bungling and dishonesty over COVID-19, I'm looking forward to someone trying to lecture us about a new disease threat, and telling us what to do and how to respond and how to live, and ordering us to accept a new vaccine - or else!  I suspect many of us will have things to say - not to mention do - to anyone trying to force us to obey orders like good little proles.  That's perhaps the only good thing to come out of COVID-19;  it's taught many of us to think for ourselves, and to distrust Big Brother.

Peter


Thursday, January 18, 2024

And who, precisely, thought this was a good idea?

 

As if we don't have enough to worry about, we learn:


In a Wuhan-esque study, Chinese scientists are experimenting with a mutant COVID-19 strain that is 100% lethal to “humanized” mice.

. . .

The deadly virus is a mutated version of GX/2017, a coronavirus cousin that was reportedly discovered in Malaysian pangolins in 2017 — three years before the pandemic. Pangolins, also called scaly anteaters, are mammals found in warm areas of the planet.

All the mice that were infected with the virus died within just eight days, which researchers noted was a “surprisingly” rapid death rate.

. . .

Although terrifying, the study is the first of its kind to report a 100% mortality rate in mice infected by the COVID-19-related virus — far surpassing previously reported results from another study, the researchers wrote.

. . .

Francois Balloux, an epidemiology expert at University College London’s Genetics Institute, slammed the research as “terrible” and “scientifically totally pointless.”

“I can see nothing of vague interest that could be learned from force-infecting a weird breed of humanized mice with a random virus. Conversely, I could see how much stuff might go wrong,” the professor wrote on X.


There's more at the link.

I can see three possibilities here:

  1. Researchers thought it would be a good idea for whatever reason, and went ahead and did it without even thinking about asking permission, ethical questions, etc.
  2. This research was actually authorized, for some unknown reason, and the news of its "success" was released so as to reflect credit on those who authorized it.
  3. This is intended to scare people, and establish China in the world's eyes as the foremost developer of biological weapons.
If you can see any other reason for doing this, let us know in Comments.

All I can say is, unless there's a solid medical reason for doing such research, it should never be conducted at all.  In this case, I can't see any such reason.  Taking a pangolin virus, that can't affect humans, and adapting it to mice (and, through them, humans), seems utterly illogical on the face of it.  What possible purpose could this serve?  And why try to synthesize what appears to be a more concentrated, more lethal COVID-19 virus when we've already got enough problems with the first one?

Conspiracy theorists have been claiming for years that COVID-19 was actually a plan to reduce the human race's numbers by billions of people for environmental and other reasons.  This sort of "research" can do nothing to comfort them - in fact, it'll make them even more paranoid than they already are.


*Sigh*


Peter


Friday, November 3, 2023

Bigotry? Heck, no - it's common sense!

 

Divemedic shows us this . . . meme?  Poster?  Graphic?  I dunno.



I refuse to be guilt-tripped by science-deniers who ignore medical, scientific and biological factThey're the problem.  And no, the rest of us don't have to change just because they say so.  Those who wish to date trans women are welcome to do so.  Those who don't, are also free to choose.  Relationships can't be compelled, and those who try to do so are just as bigoted and sectarian as anyone else who wants to force their views on those who don't agree with them.

Hatred's got nothing to do with it at all.



Peter


Thursday, October 26, 2023

Exploring an ancient way of life

 

In the midst of all the nastiness, trouble and violence in the world, I was intrigued by a BBC report about two "experimental archaeologists" who paddled down the Thames River in England in leather canoes, replicas of Stone Age watercraft.


Nine days into their quest to paddle the full length of the River Thames, Theresa Emmerich Kamper and Sarah Day watched as a slate grey stormfront swallowed the blue sky. The onrush of rain was moving so fast there was no time to paddle ashore and unload their gear. Their buckskin dresses weren't waterproof. And if their reindeer fur bedding got soaked, it would never dry. So, they draped their leather tent over themselves and huddled inside their cowhide canoe as they were hammered with hail, singing silly songs and bailing water with a wooden cup.

A man moored nearby poked his head out of his houseboat. "If you're gonna do it like the Flintstones," he called out, "you're gonna get wet!"

Over the howling wind, they shouted, "It's for science!"

Facing the elements while surrounding themselves with leather is, in fact, an important part of Day and Emmerich Kamper's work. As experimental archaeologists, they research and recreate ancient technologies to gain insight into how our ancestors lived. They teach ancestral skills, such making clothing, pouches, preserved meat and bone tools from animals.

. . .

If they took animal skin boats on a multi-day journey, they reasoned, they could learn more about how Paleolithic peoples might have traded along rivers and even migrated to islands around Scotland and the Mediterranean.

The idea was born: they would paddle 255km [about 158 miles] of the Thames with handcrafted canoes, equipment and food that mimicked – as closely as they could practically and legally manage it – those made by Stone Age peoples.


There's more at the link, describing their journey and what they learned from their experiences.  I enjoyed it as a total change of pace from the frenetic reality around us.  Recommended reading.

Peter


Friday, August 25, 2023

Our skin may actually cause ageing

 

The BBC reports:


The latest research suggests that our skin is not just a mirror for our lifestyles – reflecting the effects of years of smoking, drinking, sun and stress – and hinting at our inner health. No, in this new upside-down-world, the body's largest organ is an active participant in our physical wellbeing. This is a strange new reality where wrinkles, dry skin and sunspots cause ageing, instead of the other way around.

In 1958 ... [a] major project was quietly conceived. The Baltimore Longitudinal Study was to be a scientific investigation of ageing with a daring and rather unorthodox premise ... The research followed thousands of adult men (and later, women) for decades, to see how their health developed – and how this was affected by their genes and the environment.

Just two decades in, scientists had already made some intriguing breakthroughs, from the discovery that less emotionally stable men were more likely to be diagnosed with heart disease to the revelation that our problem-solving abilities decline only slightly with age.

But one of the most striking findings confirmed what people had long suspected: how youthful you look is an impressively accurate expression of your inner health. By 1982, those men who had been assessed as looking particularly old for their age at the beginning of the study, 20 years earlier, were more likely to be dead.  This is backed up by more recent research, which found that, of patients who were judged to look at least 10 years older than they should, 99% had health problems.

It turns out skin health can be used to predict a number of seemingly unconnected factors, from your bone density to your risk of developing neurodegenerative diseases or dying from cardiovascular disease. However, as the evidence has begun to add up, the story has taken a surprise twist.

. . .

As the largest organ in the body, the skin can have a profound impact. The chemicals released by diseased and dysfunctional skin soon enter the bloodstream, where they wash around, damaging other tissues. Amid the ensuing systemic inflammation, chemicals from the skin can reach and harm organs that seem entirely unrelated, including your heart and brain.

The result is accelerated ageing, and a higher risk of developing the majority of – or possibly even all – related disorders. So far, aged or diseased skin has been linked to the onset of cardiovascular disease, type 2 diabetes and cognitive impairment, as well as Alzheimer's and Parkinson's disease.

. . .

... there is direct evidence that [using sunscreen and moisturizing the skin] does reduce inflammation – and that it may help to prevent dementia ... adding moisture back is not particularly complicated, whatever cosmetics adverts seem to suggest. And in the field of ageing, this simple intervention is showing remarkable results.


There's more at the link.

I'm intrigued by this research because, like many others, I was exposed to particularly harsh conditions for my skin during my military service.  When you're deployed, nobody's going to provide sunscreen or moisturizer for your skin - at least, no military organization of which I've ever heard has done so.  You provide your own, or get sunburned and wizened like a dried-up prune.  (Yes, there are other similes.  No, I'm not going to mention them in a family-friendly blog like this!)

When I look at my catalog of health problems in later life (it's a depressing list), and read this article, I find that many of the illnesses and conditions it identifies are among my issues.  I wonder if there's a correlation between years of one's skin being baked and fried and rained on and frozen in the field, and health in later life?  It sounds as if there may be.  Might veterans be able to use this evidence to get more medical assistance for such issues as they get older?  How would one prove the connection?

Curiouser and curiouser, as Alice would say . . .

Peter


Friday, May 26, 2023

An interesting comparison

 

Red Side has produced a fascinating video comparison of the fastest man-made objects (so far).  It's quite entertaining.




I was amused by its reference to an ultra-high-speed manhole cover, launched during Operation Plumbbob in 1957.  It wasn't actually a manhole cover, but a 2,000 pound steel cover plate over a 500 foot borehole used for a nuclear test explosion.  It's presumed to have been vaporized during its passage through the Earth's atmosphere after the big bang.

Peter


Tuesday, April 25, 2023

Unmasking the transgender movement and those behind it

 

In the aftermath of the Nashville Christian school shooting, a great deal of attention has been paid to the possible motivation of the transgender perpetrator.  The FBI and/or other authorities aren't helping by refusing to publish the shooter's manifesto, which is allegedly "astronomically dangerous" and "a blueprint on total destruction".  In the absence of the facts, we can only speculate - and such speculation is, I suggest, more dangerous than the truth, because there's no way to check or verify it.

However, in the aftermath of the Nashville tragedy, a number of "deep background" articles have sought to analyze the entire transgender movement.  They've produced some very useful information, and I think deserve our attention.  I'm not going to quote at length from any of them, because there's far too much detail to summarize in a brief blog article.  Nevertheless, I highly recommend that you read these articles for yourself.  They're worth your time.


1.  Queer theory indoctrination is directly responsible for Nashville tragedy

"Queer theory — and the transgender ideology that is part of it — is essentially just one segment of critical theory, aimed at deconstructing those truths by assaulting the objective definitions of man, woman, child, marriage, sex, gender, etc. Though it may sound hyperbolic to some, queer theory’s ultimate — albeit camouflaged — purpose is to spark a neo-Marxist revolution by destroying our prevailing notions and institutions; by destroying the American family, and by turning transgender and “nonbinary” youth into violent revolutionaries."


2.  Billionaire perverts behind the trans agenda

"... transgenderism is a top down ‘ideology’ with close links to the transhumanist movement and megamoney-backed initiatives which can be traced back to early 2000s Silicon Valley scientists. Investigative journalist and feminist Jennifer Bilek has has located its genesis in two leading American transhumanists – William Bainbridge and the fabulously wealthy lawyer and bio-tech entrepreneur, Martine Rothblatt, a transwoman who had sex reassignment surgery in 1994."


"The tranny/janny comparison has been made many times before. But aside from the truly excellent wordplay at its heart, most people only compare trannissaries to janissaries because both are “shock troops” for a formidable regime. For instance, here’s the excellent American Greatness:

If it wasn’t already clear, the shooter in Nashville was a Janissary, a demented footsoldier of an evil, totalitarian ideology that wishes to remake the world in its demonic image.

But in fact, this is far more than some simple joke."


4.  Welcome To The Great Spiritual War Of Our Time

"The side of evil should now be obvious to everyone. The perverts, pedophiles, abortion fanatics, and Chinese communist party officials are all on the same grotesque side. It’s the side of the woke and the queer theorists ... The Rainbow Jihad is not an accidental coalition of political groups and cultural forces. There’s a common unity between these people — and the unity is their shared hatred of God. That’s why this evil movement seems to be communist and sexually perverted and Satanic all at the same time."


The fourth article is of particular interest to me, because - as a Christian and a pastor - I firmly believe that the whole transgender debacle is yet another front in the eternal, ongoing spiritual warfare between good and evil.  However, even if you don't believe in God or the spiritual world, the first three articles are enough to make the point.  We're fighting a fundamentally evil philosophy here, one that denies facts and truth and imposes falsehood and lies.  At its heart, that's the core of transgenderism.

I have no problem working with transgender people to help them deal with the psychological and/or psychiatric problems that have led them into this abyss.  I also understand and accept that there are a few people - vanishingly few:  far less than 1% of the population - who are what's described as "intersex".  They can genuinely describe themselves as "neither fish nor fowl", and have immense problems dealing with their gender, sexuality, etc.  For them I have nothing but sympathy.  However, for 99% of those who claim that they're transgendered, I'm afraid that we do them no service by pretending to go along with their delusions.  They need help with their psychological/psychiatric issues, but not help to alter their bodies to conform to their delusions.

As for those who want to divorce "gender" from "sex":  nonsense!  In all of recorded history, we have seen no such dichotomy in medicine, philosophy, theology or any other discipline.  It's a wholly recent phenomenon, driven by political correctness (see the articles linked above for details).  When it comes to one's sex, the chromosomes have it:  XX or XY (with, as noted, a very, very few "intersex" exceptions to that rule).  That reality trumps any desire to be something different.  We are what our chromosomes say we are, and our gender follows our sex.  We may feel differently, but the facts don't correspond to our feelings.  To pretend otherwise is to live a lie;  to accommodate those living a lie is to give countenance to that lie, and abandon the truth.  Unless and until medical science - not medical wishful thinking, not political correctness, but hard, measurable, verifiable scientific fact - can demonstrate otherwise, that's where the matter rests.

Peter