Tuesday, December 30, 2014

In the thrall of a two-faced god

kw: book reviews, nonfiction, science, science and politics, history of science, radiation, short biographies

This post's title is taken from the last chapter of The Age of Radiance: The Epic Rise and Dramatic Fall of the Atomic Era by Craig Nelson. Nelson chronicles the discovery of radiation and radioisotopes, and the development of various radioactive products, beginning in the late 1800's. He carries through to the present day, in which more than 400 nuclear power stations produce about one-seventh of the world's electricity, hundreds of radioactive isotopes are known and dozens are used for various medical and industrial purposes, yet several major power plant failures and the problem of accumulating waste from nuclear power plants have led to overweening public fear of anything related to the word "radiation".

As I read, I came to see that the "epic rise" and "dramatic fall" refer to public perception. Prior to 1945, radiation was extremely popular. Lying in a pool of radioactive water was supposed to be therapeutic. Even in 1960, when I was given a half-ounce of "yellowcake" (mostly U3O8) powder in a gelatin capsule on a field trip to a Uranium processing plant, it was considered rather benign, and such "pills" were suitable gifts to a troop of Boy Scouts. Public hysteria had yet to set in. Trips to Las Vegas to see A-bomb tests were still popular, and would remain so until 1963. Even then, the end of air-blast testing was a result of a treaty with the USSR, not of public protest in America.

Nuclear waste became an issue primarily in America, because of a set of rather odd laws that prohibited reprocessing spent fuel from nuclear power plants. This is done as a matter of routine in Europe.

A side note for those who need it: Induced fission of Uranium or Plutonium results in "fission products". When a large nucleus is split because it has absorbed a neutron, it leaves behind two fragments (sometimes three) whose mass totals the original mass, minus the mass of two or more neutrons released during the fission event. It is kind of like a drop of water splitting into two smaller drops plus a few tiny droplets. These fission fragments are usually radioactive isotopes, typically with several excess neutrons, so they tend to decay quickly by beta decay, which converts neutrons to protons and balances the nucleus better. Several such decays will result in a stable nucleus. The trouble comes because some of the "quick" decays actually occur over months or years. These longer-lived isotopes accumulate in spent fuel from reactors, and as a result, it stays "hot" for thousands of years. Chemical processing can easily separate out these waste products, leaving purified Uranium or Plutonium, whichever "fuel" was first used. Uranium from which the power-making isotope has been greatly reduced or removed is called "depleted Uranium"; being about two-thirds again as dense as lead, it makes formidable armor-piercing projectiles. Reprocessed Plutonium can be returned to the reactor as fresh fuel. Also note that reactors that use enriched Uranium are designed quite differently from those using Plutonium.
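
To make the bookkeeping concrete, here is a tiny sketch of my own in Python (not from the book) that checks one commonly cited fission channel of U-235; the particular fragment pair is just one of many possible outcomes:

  # Check nucleon (A) and proton (Z) balance for: U-235 + n -> Ba-141 + Kr-92 + 3 n
  # Each entry is a pair (mass number A, proton number Z).
  U235, n = (235, 92), (1, 0)
  Ba141, Kr92 = (141, 56), (92, 36)

  inputs = [U235, n]
  outputs = [Ba141, Kr92, n, n, n]        # two fragments plus three free neutrons

  A_in = sum(a for a, z in inputs)        # 236 nucleons in
  A_out = sum(a for a, z in outputs)      # 236 nucleons out
  Z_in = sum(z for a, z in inputs)        # 92 protons in
  Z_out = sum(z for a, z in outputs)      # 92 protons out
  print(A_in == A_out and Z_in == Z_out)  # True: nucleon count and charge both balance

The nucleon count balances exactly; the tiny deficit in actual mass is what comes out as energy. Both Ba-141 and Kr-92 are neutron-heavy, and each beta-decays through several steps toward a stable nucleus, which is just the behavior described above.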

In the late 1970's there was a great debate going on about the safety of storing nuclear power plant waste. I was at a public event where a nuclear power industry representative described the materials. He said that the spent fuel from a certain kind of plant would be in a canister that looks a lot like a 50-gallon oil drum, but that it would initially be producing 10,000 watts of heat from the decay of fission fragments. This would decline to 5,000 watts over several hundred years, then be quite steady for thousands of years thereafter. I stood and asked, "May I obtain one or two to heat my crawl space in winter?"

Back to the book. The historical sketches and mini-biographies are invaluable. Dr. Roentgen, the Curies, Fermi, Oppenheimer and so many others are brought to life as rounded personalities in a way I have not read elsewhere. The glacially slow tragedies that prematurely ended the lives of nearly all early students of radioactivity are heartbreaking. It took much, much too long for scientists to realize that the energetic particles released by these isotopes were displacing electrons or atoms from their places throughout any material they passed through, including their own bodies. Such displacement did damage that frequently resulted in cancer or, at higher doses, radiation toxicity, and, at the highest, rather rapid death. A further note on isotopes:

An isotope is a form of an element characterized by a specific number of neutrons in the nucleus. Thus, all atoms of Oxygen have 8 protons in the nucleus, but the number of neutrons ranges from 4 to 18, giving them mass numbers of 12 to 26. The "usual" isotope of Oxygen, O-16, has 8 neutrons. Very small amounts of O-17 and O-18, with 9 and 10 neutrons, exist naturally. All other Oxygen isotopes are short-lived and only exist because of reactions in a nuclear reactor or particle accelerator, and usually only when a scientist's purpose is to create them. The most stable of these artificial isotopes of Oxygen is O-15, with a half-life of about 2 minutes.
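
The arithmetic is simple enough to spell out; a tiny sketch of my own (not from the book):

  # Neutron count = mass number (A) minus proton number (Z); for Oxygen, Z = 8.
  Z = 8
  for A in (15, 16, 17, 18):
      print(f"O-{A}: {A - Z} neutrons")   # O-15: 7, O-16: 8, O-17: 9, O-18: 10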

The book's 17 chapters are in 4 sections. The first covers the early years up to 1938 or '39, and the second, the development of induced fission and the Manhattan Project that led to both Uranium and Plutonium bombs. One of each was dropped on Japan in 1945, and I can't help wondering if this was as much for experimental reasons as military. The third section covers the cold war, and the fourth, the early spread and more recent fallback of nuclear power generation and the power plant disasters that led to its fall from grace.

I was surprised to find out (I should not have been) that the meltdown at Chernobyl was only one of at least 10 or 12, and became known because it was close enough to international borders that its fallout plume was easily detected in other countries. The others had been successfully kept secret, even though one or two may have exceeded Chernobyl in total radioactive materials released and environmental damage.

The most difficult chapter to read through was the one on Fukushima ("Blessed Island" in Japanese). It is the best documented of the "big three" of the world's imagination, the other two being Chernobyl and Three Mile Island. The narrative exposes the monumental stupidity of the "designers" and "engineers" (they do not truly deserve those titles), who chose a reactor design known to be flawed; who chose to put it on a tsunami-prone coastline, against clear warnings by geologists, in one of the more earthquake-prone parts of Japan, itself more earthquake-prone than almost any other country; who chose to place the backup generators for the cooling system in basements at or below sea level; and who then made a host of errors in operational safety during the weeks and months prior to the disaster. Actually, this was a series of linked disasters that played out over half a year's time, and in some measure it is still playing out.

But just as I was sure the author would inveigh against continuing use of nuclear power, he produced a string of facts such as:

  • The total death toll from nuclear power plant meltdowns, so far as is known, is 33,000 or less. This compares with 15,000 deaths over 30 years in the coal mining industry worldwide, and 20,000 in the petroleum industry.
  • The atomic bombs dropped on Japan killed roughly 200,000-250,000 within the first half year (half of those in the first minutes). An equal number were killed by the tsunami of 2004 in the Indian Ocean. Roughly twice this many die yearly in America alone from smoking-related cancer and heart disease.
  • A dam failure in China in 1975 killed 171,000.
  • On a per-megawatt-hour basis, fossil fuels are 18 times as deadly as nuclear fuels.

This is why Nelson calls "Radiance", the totality of industries and products of radioactive elements and isotopes, the two-faced god, like Janus. To moderns, the apt analogy is a two-edged sword. One daren't touch it anywhere but the handle! Yet public opinion is so strong, and the ignorance of scientific principles so profound in both public and political spheres, that atomic energy is effectively dead in America and a number of other "developed" countries, at least for the next generation or two.

Here is some final food for thought I came across as I considered this post:


This illustration went around and around the Web after it was published late in 2011. It shows the excess radiation exposure people are expected to receive by living in the various Japanese prefectures. The red-toned one is Fukushima Prefecture, and the exposure is 0.25-0.50 µSv/h. That unit needs explaining:
Unit: µSv/h - micro-Sieverts per hour. Radiation toxicity begins to make itself evident above a total dose of about 500 mSv (500,000 µSv), or half a Sievert, and more severe effects appear after 1 Sievert. About one person in 18 is expected to develop cancer if exposed to 1 Sv total over an extended period—for example, 115 µSv/h for one year.
It is hardly risky to spend much time even in Fukushima Prefecture. Lifetime exposure under an excess dose of 0.50 µSv/h would come to about a third of a Sievert over 80 years. However, there are a few focal areas near the destroyed power plant in which you'd be very sick after spending no more than a few hours. If you want the thrill of visiting a nuclear wilderness, though, nothing beats taking a "nuclear tourism" tour of the Chernobyl area, where one can look into lands said to be too radioactive for people to live in, where wildlife nonetheless flourishes without human interference, and where one can even take a short walk over ground that cannot, by law, be built upon for at least 10,000 years.
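
For anyone who wants to check that arithmetic, here is my own back-of-the-envelope calculation, assuming an 80-year lifetime at the top of the Fukushima Prefecture band:

  # Convert a steady excess dose rate (micro-Sieverts per hour) into a lifetime total.
  rate_uSv_per_hour = 0.50        # upper end of the range shown for Fukushima Prefecture
  hours_per_year = 24 * 365.25
  lifetime_years = 80             # assumed lifetime

  lifetime_Sv = rate_uSv_per_hour * hours_per_year * lifetime_years / 1_000_000
  print(round(lifetime_Sv, 2))    # about 0.35 Sv, a third of a Sievert

At 115 µSv/h, by contrast, a single year is enough to accumulate a full Sievert, which is where the one-in-18 figure above comes from.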

Sunday, December 21, 2014

How do we restore appropriate doctoring?

kw: book reviews, nonfiction, medicine, ethics, doctors, memoirs

In all the various stories I have gathered of troubles I have had with or about medical doctors over the years, the problem has always been competence, not ethics. If the experiences Dr. Sandeep Jauhar has described are truly typical, it seems I've been quite lucky. His new book is Doctored: The Disillusionment of an American Physician. The main title is a reflexive jest, because he was the one "doctored", or taught, through his experiences. He got an attitude adjustment, and not one that I would applaud.

General Norman Schwarzkopf said (I paraphrase), "Hardly anyone goes to work daily expecting to do a bad job." In the same way, very few begin a medical career intending to do harm or to get rich off the poor. In his earlier book Intern, Dr. Jauhar described his trials after completing medical school and entering residency. There, he was the abused one, and he harks back to those days a few times in Doctored, when he meets residents and interns who callously take advantage of new ways of doing things, going home at the end of a shift regardless of what is happening, seemingly without caring a whit for the patient being handed off to the next shift's physician. He never saw one stand up in the midst of CPR because "it was time to go home", but it almost came to that.

To become a family doctor these days is the fastest way to get into practicing on one's own, but it still takes a good while: four years of medical school and at least two years of residency, and perhaps a year or two of fellowship. At the earliest, a newly minted family physician can begin practice, whether privately or with an organization, by age 28 or 29. Dr. Jauhar is a cardiologist. The residency was longer—I am not sure whether he did one or two—and he performed a few years of fellowships before being hired as the attending cardiologist at Long Island Jewish Medical Center at age 36. This accords with the experience of a family friend who also got free of his "education" at age 36 and is now in practice as an orthopedic surgeon.

When you are pushing 40 and have a quarter-million in education loans to pay off, it's hard to make ends meet, even if your pay is well above the national median of $52,000. The book covers a period of about eight years, just the right span of age for most of us to get around to having a midlife crisis. Dr. Jauhar didn't really have time for a midlife crisis. He had a career to jump-start, and soon found the jumper cables were badly frayed. We read a lot about his wife's increasing distress as their savings dwindle after one child is born, and then when another is on the way.

Urged, berated, and nearly bludgeoned by his wife and by circumstance, he began to work part time for another doctor in private practice. He soon learned that it is all about business. He didn't have the heart, or the right way of thinking, to do well in business. Ask a doctor why there are so many tests ordered these days, and why nearly everyone gets the same tests regardless of how sick they are. The standard answer you'll get is likely to be something about "defensive medicine", the need to "cover all the bases" to avoid litigation. The answer you won't hear is that the insurance companies pay better for some tests than others, and it is the high-dollar ones that are the most overused. Dr. Jauhar found himself "doing scut work for peanuts", to use a phrase he doesn't use, but that I've heard from others. Though he could now make ends meet, he felt he was beginning to lose his soul, helping a doctor game the system and get rich at the expense of the American public.

Make no mistake about it, we all pay for unethical medicine. Most of Medicare is paid for by a payroll tax, and its losses are covered out of the general Federal budget, from taxes we all pay. Insurance companies are not in business to subsidize health care, and must indeed make a profit, so premiums increase and increase to cover the actual costs they incur. Don't pay any mind to a few blind guides who boast that American medicine is the best in the world. Yes, there are a few areas in which treatments in America are the most effective, but in general we pay far more per capita than any other developed nation for the thirtieth or fortieth best medical system.

The book is in three parts, titled "Ambition", "Asperity", and "Adjustment". In the end, he adjusted. That, I find rather sad. He got doctored all right. I remember once remarking that a good subtitle to the musical Grease would be "The corrupting of Goody Two-Shoes." Here, I cannot say Dr. Jauhar has been corrupted, not quite, but he has to admit there is a stain on his soul. The incentives built into modern American medicine, which will be only partly relieved and otherwise exacerbated by the Affordable Care Act (AKA "Obamacare"), practically force a doctor to defraud the system to make a living, and yield incredible riches to those most adept at doing so.

I recall the "traditional" insurance plans called "Major Medical." Patients were expected to pay out of pocket for all the ordinary stuff: doctor visits or office visits (in a day when the doctor visited you a third of the time), and most pills or shots. If you needed something less ordinary such as setting a bone, or sutures, or an operation, the Plan paid 80%, and rates were such that most middle-class Americans could afford their 20%, maybe with a little short-term loan. Now that insurance plans purport to "cover everything" (It's not true, but that's what they advertise), where is the incentive for anyone to economize? When everyone pays thousands and thousands yearly for their medical plan, they feel entitled to go to the doctor for every little thing, and they're OK with the doctor ordering dozens of tests of all sorts, because "the Plan will pay for it". The next year, premiums go up, and the few who are wise realize that once "the Plan" has paid, it has to get the money back, and premiums are its only source of income.

Our system isn't just broken, it is devastated. Dr. Jauhar doesn't have much in the way of solutions to offer. I do. Vote with your feet. Get your "ordinary doctoring" from a local physician. If you need something major, assuming you're capable of travel, go to India or England or somewhere else with one of those 30-40 medical systems that outperform ours, where you'll pay less at full price, including your travel costs, than you would for the "Co-pay" demanded by the hospital here. Medical tourism is on the increase, for good reason. If enough of us do so, the American medical system will respond to the only force mighty enough to change it: Competition.

Tuesday, December 16, 2014

More on the immense difference between religion and faith

kw: book reviews, nonfiction, religion, christology, catholic theology

There is a lovely video I saw earlier today in a FaceBook post: an elderly Jewish woman named Fell singing hymns ("Jesus Loves Me" for example) to a severely demented woman, and really, really connecting with her. How many Christians know a single Jewish song of the faith, and could connect with someone of a different faith so deeply and genuinely? Mrs. Fell truly embodies something Paul wrote to the Corinthians, "I have become all things to all people."

Many years ago I read The Man Nobody Knows by Bruce Barton, first published in 1924. As a young person searching for an identity, I found some insight in it, about the humanity of Jesus. But I was ultimately unsatisfied, and when I later found faith in Jesus Christ, I realized how shallow the presentation was, leaving the deity of Jesus almost unmentioned. One thing of value stayed with me: the understanding that Jesus was a Jew, and lived and worked almost entirely as a faithful Jew. His message, as he told a Lebanese (Syrophoenician) woman on a rare visit to Lebanon, was, "I was sent only to the lost sheep of Israel." Yet, when she answered wisely, he granted her request (healing for her daughter), indicating that what those "sheep" discard could be obtained by others.

The passage just mentioned, found in Matthew 15, is one of several showing that Jesus knew His rejection by most Jews would be followed by a more successful spreading among non-Jews. This is largely played out in Acts of the Apostles. Even before the Jewish-Christian testimony in Jerusalem and Judea was practically exterminated by the Romans after 68 AD, along with hundreds of thousands of Jews, the Jerusalem-centered branch of the church was a distinct minority.

I approached Christ Actually: The Son of God for the Secular Age by James Carroll with only moderate expectation. I did not expect it to be a work of profound faith, and that expectation was confirmed. It didn't take long to recognize that the viewpoint is Roman Catholic. Later it comes out that Mr. Carroll is a former priest, now married and a university professor. That is OK with me; I haven't found works of Catholic theology to be accessible to any but extremely over-educated readers, so I was glad to find a more readable presentation.

I am not sure the author's theological stance is so clearly within that of the Catholic Church. Where he goes to great lengths to explain away the so-called miracles in the New Testament (and he'd probably do so with Old Testament stories as well, were they in the purview of his writing), I am pretty sure the Church's more official position is that Jesus did indeed work miracles. He writes again and again of exploring and examining the divinity of Jesus; in the end, he takes the position, if I understand him aright, that Jesus is counted divine only in retrospect, and neither thought of himself as God nor said so to others. I wonder what he makes of John 8:24, where Jesus, after being asked where his Father is, finishes a long reply by saying, "I told you that you would die in your sins; if you do not believe that I am he, you will indeed die in your sins." Or that, when he was speaking to the disciples before going to the Garden of Gethsemane (John 14:9), and Philip asked him to show them the Father: "Don’t you know me, Philip, even after I have been among you such a long time? Anyone who has seen me has seen the Father." (All quotes are from NIV)

One mystery of Christology is to understand when Jesus obtained what is called his "pre-Incarnation knowledge". In Mr. Carroll's view, there never was any. Instead, he speculates quite wildly about Jesus as a disaffected and unemployed young man of Galilee chafing under the economic strictures caused by Roman occupation and taxation, becoming a disciple of his cousin John (the baptist) and remaining so for perhaps a decade. Jesus eventually reacts against the asceticism of John and embraces a more public life, preaching to the dispossessed. I find that harder to believe than the goofy story "Bel and the Dragon", found in Catholic Bibles, but not Protestant ones. I guess if the Apocrypha have found their way into your world view, your imagination is rather unfettered. (For the uninitiated, the Dragon story is about Daniel in Babylon, defeating a fire-breathing dragon by throwing a helmet full of water into its mouth and down its throat, causing a steam explosion.)

While reading, I marked a couple dozen places on which I thought I ought to comment. But I have little taste for detailed debate. I will instead take up two important items.

Firstly, Mr. Carroll's fundamental premise is that the four Gospels are "wartime literature", with the three Synoptics (Matthew, Mark and Luke) written in the 70-80 AD time frame, and John written by 90 AD. I believe they were indeed produced during a period of growing warmaking and warmongering, but a decade or so earlier. He writes several times that "all scholars" agree on these late dates, but he overstates his case rather dramatically. Maybe all Catholic scholars late-date the Gospels, but I doubt it. Some non-Catholic scholars do so as well, but primarily those who follow the "higher critics", whose fundamental premise is that the Scriptures are purely human products, to be treated as literature and literature only. Such "historical criticism" of Biblical texts was initially condemned by the Vatican (Leo XIII) in 1893, but later somewhat welcomed (Pius XII in 1943).

Those who don't have a hidden agenda to deny and denigrate the Bible's inspiration understand that the Synoptic Gospels were produced between about 55 and 65 AD; perhaps as late as 66 or 67. Certainly within the lifetimes of those we consider their authors. The late-dating historical critics simply cannot believe that Jesus foretold the fall of the Temple 35-40 years before the fact. Instead they posit that all the Gospel authors put these words in Jesus' mouth, even as they were writing about a Jerusalem that they saw being destroyed around them (or "just over there") in 70 AD. Historical critics go to great lengths to deny God's existence, and particularly Jesus's deity. I say "deity" rather than "divinity". Divinity is a quality; deity is the being of the Person.

So the fundamental issue is whether God inspired the writings we call the Bible, and how detailed His inspiration was. Paul wrote to the Corinthians (1 Cor 2:13), "This is what we speak, not in words taught us by human wisdom but in words taught by the Spirit, explaining spiritual realities with Spirit-taught words." He claims verbal inspiration for what he spoke and wrote. Later in the same book, he discusses marriage and differentiates between God's commands and his own opinion, then discusses the choice of lifelong chastity, beginning, "I give a judgment as one who by the Lord's mercy is trustworthy" (7:25), yet ending, "I think that I also have the Spirit of God." (7:40)

We know that there are all sorts of things in the Bible that we can't count as God's words, not directly at any rate. Certain words of Satan are recorded, such as his accusation of Job. One chapter of Daniel was the testimony of Nebuchadnezzar, and the entire book of Ecclesiastes is written from a despairing, depressed, fully human viewpoint. Yet Ecclesiastes is followed by Song of Songs. How amazing, an aging Solomon could write both "vanity of vanities" and "the song of songs". What happened in between? He got back into contact with God! Note: the Shulammite is Solomon writing in a female voice of his ecstatic experience of God as his lover. Guys out there, you think you are male? Maybe a real he-man?!? Wait until you meet the Source of Maleness, dude! ...and God is also the Source of Femaleness, as the title El-Shaddai attests ("shad" is Hebrew for breast). I state my understanding of inspiration thus: The Bible tells us what God wants it to tell us. How He accomplished its production is up to Him.

And, has the text of the Bible been edited? Boy, and how! So what? Cannot God inspire an editor just as effectively as an author? For example, where did the author of Genesis get his material? While I believe it really was Moses, he must have used source writings to compose the Torah. The beasts of burden used by Abraham and Jacob and their servants are called camels in Genesis, yet historically we know that camels were introduced much later, perhaps even after the time of Moses. Abraham and Jacob would have used donkeys. Clearly, either Moses or a later editor updated the text to use a desert beast that had become more familiar. Maybe it was Samuel.

Historical critics late-date the Old Testament books by centuries, not just a decade or two as they do with the Gospels and some Epistles. For a long time it was not known how to counter their arguments. No texts of Old Testament books were known older than Tenth Century AD. After 1947, the Dead Sea Scrolls pushed the dates back to around 160 BC. Then, more recently, fragments dating back to the 700s and 800s BC have been found, mainly in ancient mezuzahs. Naturally, it would be helpful if larger texts were found dating to the times of David and before. If such materials exist, God is still keeping them hidden.

The book most enthusiastically late-dated is Daniel. The young Daniel is supposed to have been taken captive to Babylon in about 605 BC, and to have lived until the year before Cyrus allowed Jews to return to Jerusalem 70 years later. Many critics pooh-pooh this, and presume it was written during the time of the Maccabees, in about 160 BC. No matter when it was written, it contains a prophecy that is detailed enough to check. This was done by Sir Robert Anderson, who wrote The Coming Prince more than a century ago, in which he demonstrates that the period called "69 weeks" in Daniel began when a certain decree was issued in 445 BC and ended on Palm Sunday, the only day before the crucifixion that Jesus was proclaimed the Messiah. The length of that period comes to 69x7x360 days, or 173,880 days, exactly. If the book of Daniel predicted that period with such exactitude, then it is much more likely that it was written during and near the end of Daniel's life, in Babylon. Those who cannot believe God's word contains genuine predictions would have to late-date Daniel to some time after the year 32 AD to make their case convincing. They'd be laughed out of seminary!

Are the Gospels equally inspired? Faith says Yes. Mr. Carroll states at one point that what is most important is not faith but faithfulness. This is clearly in accord with Catholic teaching going back to the Fifth Century, that we are saved only by our own works. Jesus the Redeemer is never mentioned in Christ Actually. The title comes from something written by Dietrich Bonhoeffer. While Bonhoeffer's beliefs stretch the faith of Jesus a little, those of Carroll stretch it beyond recognition. Aquinas wrote that we must imitate Christ, and this subject informs the last full chapter of the book. This is not Biblical faith. We are told repeatedly by the apostles, particularly Paul, that believers are indwelt by Christ. In his resurrected condition, this is possible. Carroll will admit no bodily resurrection, neither of Jesus nor of anyone else.

In spite of my deep disaffection, I find certain value in the book. We need to be reminded of the Jewishness of Jesus. He did not hate the Jewish leaders, but wept over their intransigence. Yet Mr. Carroll goes too far, proposing that the Gospels are anti-Jewish screeds written later in the Roman-Jewish wars. Anti-Jew they are not, but anti-false-religion they most certainly are, because Jesus was anti-false-religion.

The Gospel of Matthew has been said to have the subject, "Christ versus Religion", and that of John, "Religion versus Christ." Yet, these writers made clear repeatedly that the religion being promulgated by the Pharisees and scribes and other leaders of First-Century Israel was far, far from the religious practice taught by Moses and Samuel and Ezra. And those who are sometimes called "Jews" in Acts, who were following Paul around and trying to undo his work, were a rival faction of what I call "Judaizers" among the Christians, probably based in Jerusalem, where James later told Paul, "You see, brother, how many thousands of Jews have believed, and all of them are zealous for the law." So zealous for the law, they had nearly forgotten the freedom from over-interpretation of the law into which Jesus had called them. They occasioned the downfall of Paul, politically speaking. Let us remember, though, that the crowd that had been whipped up into crying, "Crucify him, crucify him!" was the same crowd that received the first gospel preached by Peter on the day of Pentecost, 7 weeks later, and 3,000 of them, now redeemed and forgiven and baptized, formed the nucleus of the church in Jerusalem. God first reached out to those who'd been duped into calling for His demise.

So here is the list, according to this former Catholic priest:
  • No deity, just a kind of after-the-fact divinity conferred by our adoration.
  • No miracles (a repeated statement: "He could not"!), which John called "signs".
  • No resurrection.
Yet, as the writers of the New Testament make clear, without these there is no salvation. Jesus said to those who didn't believe in him, "You will indeed die in your sins."

The Jesus Christ that Mr. Carroll writes of is not the Jesus Christ in whom I believe. Not even close. If he is a Christian, then I'd be ashamed to call myself a Christian. But if I am a child of God, then unless he repents, Mr. Carroll is destined to perish.

Monday, December 08, 2014

It really, really IS who you know!

kw: book reviews, nonfiction, sociology, sociability, relationships

Do you want to live longer and better, be healthier and smarter? For about 3/4 of us, a truly holistic doctor would prescribe, "Join another group or two; spend more time with people you enjoy and love; get out more." Who would such a prescription not help? Those who already have strong, vibrant social networks. The rest of us would be well advised to develop them. Clue: a vibrant social group is not to be found in virtual space or on your computer or phone. Human faces work better than Skype, much better, infinitely better.

Humans really are social animals, though the extent of our sociability varies. For reasons yet to be ferreted out, all the genes that either strengthen or weaken social tendencies seem to be carried in all of us, but are differently expressed in every individual. How else to explain my family: my wife and I are both very introverted, yet our son is powerfully extroverted (or extraverted, as Carl Jung originally spelled the term); my father is an extrovert, my mother was more reserved, but very sociable, and my siblings and I seem to cover the spectrum (I am the most introverted).

It is becoming better known that married men, in particular, live 10-15 years longer than single or divorced men. The effect is not as strong for women, who tend to have better social lives than men even when they are introverts. Also, having a "partner" is not the same as having a married spouse, and confers no extra longevity benefit. It seems far too many married men have such poor social lives that their wives are their only close confidants (because "men don't talk about those things").

While reading The Village Effect: How Face-to-Face Contact Can Make Us Healthier, Happier, and Smarter by Susan Pinker, I suddenly remembered the play Our Town. Nearly all I can recall is when the narrator looks out and says, "…one day you look at the gray-haired woman at your side and realize the two of you have shared 50,000 meals…" Of the many things Ms Pinker repeats throughout the book, one stands out: sharing mealtimes during which you actually converse, rather than grunting over the morning paper or whatever, and particularly starting in infancy, foretells how healthy and successful you are likely to be all your life.

50,000 meals. You know I always have to figure things out. Thornton Wilder must have been thinking mainly of farm families, in which the farmer returns home for mealtimes. Three meals a day works out to nearly 1,100 yearly, so 50,000 meals is a bit over 45 years. Even in 1938 when the play was written, an American couple who'd survived their childhoods, and were starting a life together by age 20 or 22, could expect 45-50 years together.

What of today, when most American couples see each other mainly at dinnertime? There's no way to accumulate 50,000 mealtimes together. For example, my wife and I have been married just 40 years. Nearly all that time, we shared 10-11 meals per week, depending on whether one of us slept through the other eating breakfast on a weekend morning. Throw in a couple weeks of vacation or staycation, with 21 meals together each of those weeks, and it comes to about 565 meals together yearly, or more than 22,600, but way less than Thornton Wilder's calculation. Now in retirement, we average about 18 weekly, and we're happier and more relaxed (not having bosses is also a big help!).
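
Here is the arithmetic spelled out as a little sketch of my own; the per-week figures are the ones from the paragraph above:

  # Wilder's farm-family pace: three shared meals a day, every day.
  farm_meals_per_year = 3 * 365                      # 1,095
  print(round(50_000 / farm_meals_per_year, 1))      # about 45.7 years to reach 50,000 meals

  # Our working-years pace: about 10.5 shared meals in a normal week, 21 in a vacation week.
  meals_per_year = 50 * 10.5 + 2 * 21                # about 567
  print(meals_per_year, round(meals_per_year * 40))  # roughly 22,700 over 40 years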

The book begins with stories of a couple of breast cancer survivors, and the social settings both enjoyed, that helped them cope with the sudden and extended disruption of their lives. They are contrasted with people who have little or no social support, and the studies that have shown they are much, much more likely to die shortly after diagnosis, even with aggressive treatment.

The second chapter probes an area of Sicily in which intense social support is the norm, and in which the number of 100-year-olds is 3-4 times what you'd expect. This is a well-attested matter, unlike earlier reports of extreme longevity in parts of Russia, where it was found that young men in Tsarist times had taken their dead fathers' identities to avoid military service, or in certain parts of Japan, where the dead had been reported as living for decades so their families could keep collecting government support payments (similar to one kind of Social Security fraud). In these villages, everyone truly knows everyone, and they care for one another with rare intensity.

Not everyone can handle the kind of ardent sociality the central Sicilians find normal. I wonder how introverts fare in those villages. But even introverts need at least a few close friends. The quintessential loner of our time, the Unabomber, who took great care to be as unknown as possible, was eventually discovered through his brother. Having "nearly no contact" did not equate to having none at all.

A major theme of the book is that our gadgets are no substitute for friends. Even though we might have tons of online "friends" through FaceBook or something similar, there isn't any health benefit to keeping up with all their Updates or Tweets. Nor are there any intellectual benefits. Rather, quite the opposite. Without going back over all the chapters about the effects of electronic gadgetry on children, I think it is safe to state this conclusion:
Both children and adults learn much, much better from the tutelage of a skilled teacher, than from any combination of laptops, smart phone apps, and other electronic substitutes, including MOOC's. (My conclusion; the author's is lengthier and more specific)
That is why families with the money to do so are putting their children into private schools with demonstrably great teachers. There is much debate recently about the re-segregation of our schools. In a country where blacks are far more likely than whites to be poor, this is a visible matter. Stand outside a private school and count how many kids of each color exit at the end of the day. I have an idea: for every two children in a private school, offer free tuition to a minority child, along with sufficient support from counselors to ensure a realistic chance at success. You'd also need to train the rest of the kids in kindness toward the scholarship kids, or the school will segregate internally.

I suppose it started with the Boob Tube. TV has been around almost exactly as long as I have. Once considered a great "babysitter", the TV set has been exposed for what it is, a kind of "empty calories for the mind" machine. Too many of us are as inactive and "obese" mentally as physically (Think of obesity as a principle: mass without muscle. Apply that to your mind. Not a pretty picture).

It is too bad the word "friend" wasn't trademarked and made unavailable before FaceBook took it over. An "FB Friend" is not usually a friend. The default term ought to be "acquaintance". In the past few years the FB folks have made available some categories, such as "acquaintances", "close friends" (AKA your actual friends you're likely to meet face-to-face), and "relatives". Only an actual, physical friend can take you to the store when your battery's dead (and call AAA for you, because so is your cell phone), or give you a foot massage, meet you for a coffee or soda (I don't drink beer), and care for your cat when you're away for a couple of days. Ms Pinker makes a strong case that those who spend the most time online spend the least time with real people, and are thus the loneliest. And they'll often tell you that.

There is the Dunbar Number, named for Robin Dunbar of Oxford: 150. That is the number of strong relationships humans can effectively manage. Even then, not all will be equally strong. I think of a very social fellow in Bible history, king David. During his vagrant days, on the run from king Saul, he had about 400 men who followed him. Still, there were "the 30" and "the 3", and a second "3 who did not attain to the first 3". "The 3" seem to have each managed around 130 of the men, with the help of about 10 of "the 30", and possibly one each of the other "3" as a lieutenant. Let's compare with typical numbers of "FB Friends".

I have 146 "FB Friends". OF those, 23 are children of church friends (a "church kid" category), 12 are former colleagues from work (with a FB "smart tag" of the company name), 14 are "relatives", and 12 I count as "close friends" in that FB category. My closest friend other than my wife, a man I typically eat with at least weekly, does not use FB, though one of his sons is in the "church kid" group. Of the 146, 133 allow viewing their friends and my "Friends" page lists them, so it was easy to grab the statistics. Here is a bar chart of their "FB Friend" quantities:

By doubling the upper bound of each category to get the next, I gave the chart logarithmic bins, the natural way to look at a roughly lognormal distribution. The result is skewed to the heavy end. Note that Dunbar's Number would be in the first of the three bars of about 30 members. The Median is 317, and the rather great number of folks with 1,000 or more "FB Friends" is startling. They must spend a good part of their day scanning their News Feed!
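
For the curious, here is roughly how the binning works, as a sketch of my own with made-up counts standing in for the real data:

  # Log2 binning: each bin's upper bound is double the previous one, which is what
  # makes a heavily skewed distribution readable on one chart.
  import math
  from collections import Counter

  friend_counts = [45, 120, 317, 650, 1200, 80, 400, 95, 2100]   # placeholder values, not my real data

  bins = Counter()
  for count in friend_counts:
      upper = 2 ** math.ceil(math.log2(count))   # e.g. 317 lands in the bin topping out at 512
      bins[upper] += 1

  for upper in sorted(bins):
      print(f"up to {upper:>4}: {'#' * bins[upper]}")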

It would not be hard for me to double or triple my numbers. But I am selective whom I "friend". My sociable son, not so much. He has over 600.

I also looked at the face tags in Picasa, where I have about 25,000 photos tagged. Of 738 tags, not all are true names. Some are various kinds of "I don't know" designation, such as "unknown female second cousin" or "Bill in the Rock Club"; there are exactly 100 of these at present. I have 14 groups, such as "HS Friend of Son" (maybe 200 or so kids I'll leave him to sort out later if he chooses) or "Relatives of so-and-so" (several faces from old family reunion pix, say, in 1914). There are 51 various kinds of distant acquaintance, such as President Bush, photographed at long distance from a speech venue, or various people whose face and name I know but we've never met and likely never will, possibly because they're dead (Antoine Lavoisier is one of these). And there are 7 such as "baby Tom" or "young Tom" for a few children in the family that I wanted Picasa to be able to better recognize at various ages. That leaves 566 distinct persons that I either know well now or have known well in the recent past. Not bad for the family introvert!

This is not just an entertaining book to read; it is a scholarly work, and the endnotes constitute an extra chapter's worth of fascinating reading material in addition to the many, many references. An example: in a note on page 314 about breastfeeding, the author points out that the claims for various health benefits of breastfeeding overlap in a significant way with the benefits of skin-to-skin contact and face-to-face interaction between mother and baby. (I find it amazing that less than half of American women breastfeed at all, only half of those keep it up for 3 months, and very few last even 6 months. Much of the blame goes to companies with policies that disallow even unpaid leave for child care longer than 9 weeks. I am so glad my wife was able to nurse our son a full year.)

So, feeling a bit lonely? Nobody's going to come to you. Turn off the video game and find a compatible church or hobby club (My atheist brother belongs to a choir, and this season they are of course practicing the Hallelujah Chorus). Then, every chance you get to meet with one of your groups of pals, turn off the cell phone. It won't help you live longer, but they will.

Monday, December 01, 2014

Paradox of the afterlife

kw: book reviews, spiritual speculation, heaven, near-death experiences

A few years ago Eben Alexander, M.D. wrote Proof of Heaven. I haven't read it. Now he follows up with The Map of Heaven: How Science, Religion, and Ordinary People Are Proving the Afterlife. As I was reading, the words of an old slave spiritual often came to me: "Everybody talkin' 'bout heaven ain't a-goin' there." Opinions on the subject cover a full spectrum. Some disbelieve any kind of afterlife. Some believe everyone "goes to heaven" after they die, and even the range of opinion among religious people starts near the "everyone goes" end and runs along to those who would say "few, very few" are heaven-bound.

To clear up one matter at the outset: No map is provided in the book. It is about some of those who claim to have tasted heaven, trying to put all their memories into some kind of map. The nearest thing to a map that the author presents is a metaphor from ancient writings of Persia: A conical hat with levels. Sort of like the pointed hat of a witch or wizard … or a dunce. Though each level is smaller than the one below, this small size represents not the size of the realm at that level, but its similarity to the level on which we all find ourselves in our quotidian lives. The true extent of each higher level is supposed to be many times greater than the one below. There's no telling how many levels there are. Familiar sayings about "seventh heaven" have no bearing on it, and none appear in the book.

The book's point of view is entirely what I would call Natural Religion. God is at best vague and formless, more frequently referred to as "the Divine", and in one section near the end, as an all-encompassing entity so that all are eventually subsumed into "God's body". There is no personal God, no Savior and no need of salvation in Dr. Alexander's system.

The seven chapters of the book are very loosely arranged around seven "Gifts": Knowledge, Meaning, Vision, Strength, Belonging, Joy, and Hope. The Appendix describes a kind of meditation using sound as a way to approach the kinds of experiences reported as Near-Death Experiences (NDE's), without being at risk of death. Perhaps he hopes to discover the "music of the spheres".

The three "big questions" that all children eventually get around to asking, and that are never answered, are
Who are we?
Where did we come from?
Where are we going?
Of course, there is a silly comedy routine in which the reply is
Invaders,
from Europe,
to the New World,
(and answering the implied, "Why?") to take over.

Naturally, the three questions are infinitely bigger than that. Are they answered by answering the question, "Are we really heavenly creatures in some kind of temporary non-heavenly realm?"? In one of the later chapters, the author states his acceptance of reincarnation. Doctrines of the cycling birth and rebirth of souls in human form must cope with the vast increase in human population since the time of Gautama Buddha about 2,500 years ago.

Jokes about "My mother-in-law will probably come back as a mosquito" notwithstanding, it seems curious that, should a soul fail the tests of a human lifetime, it returns as some lower creature—perhaps a cow or a rabbit or a crow—in this same realm, not as a conscious entity in some lower level of the multi-level "hat" the Persians imagined. And how is it that, though there were probably no more than 250 million to 300 million persons on Earth in 500 BCE, the number rose to half a billion by 1500 AD, a billion by 1800 AD, and about 7.2 billions today? The numbers of desperately poor on Earth today exceeds the entire human population in the year 1800. So does the number of those who enjoy at least a "middle class" level of prosperity, with riches that Solomon would have envied (what good are 100 tons of gold if you can't buy an iPhone with it?). Where did all these new souls come from? Did a lot of antelopes and orioles get "promoted"? The number of persons who die every year is roughly half the entire world population of the year 500 BCE. That's a lot of recycled souls going around!

It is nice to imagine we are all going to some heavenly realm, or that we are made for that realm and that it is already all around us if we have our eyes opened to see it. There is a certain element of that latter belief in evangelical Christian teaching. But as a Christian, along with all Bible-believers I must question the implied universalism of the book. Let me just ask you this: would you be comfortable sharing Heaven with Hitler, Stalin and Mao, and a constellation of lesser darknesses such as Jack the Ripper, Vlad the Impaler or Pope Boniface VIII, who "crept in like a fox, ruled like a lion, and died like a dog"? Hey, how 'bout ol' Nero? God has a Hell for a reason!

From cover to cover, both Old and New Testaments hold out a hope, not of "going to heaven", but of resurrection out of death, a permanent leaving-behind of death in all forms. The Old Testament statements in favor of resurrection are comparatively vague, though Daniel was pretty explicit about it. He stated that all the dead would be raised up, some to eternal blessing, some to everlasting contempt.

During the ministry of Jesus he criticized the Sadducees for their disbelief in resurrection, so it is clear it was part of Hebrew theology already. The New Testament culminates with a vision of the "New Jerusalem", a holy city, "coming down out of heaven from God". That is, the perpetual dwelling of the eternal people of God is with God on the Earth in this amazing City. And crucially, the writer states there are "a new heaven and a new earth", onto which this city is lowered. Thus, for a Bible believer, while there is some element of the heavenly surrounding the people of God today, the eternal realm is not here now, but will be brought in to replace this one.

Dr. Alexander tells a pretty story. I am sure it gives comfort to many who otherwise have no hope. But we do not yet know whether NDE's manifest something real, rather than being a kind of standard series of hallucinations conjured up in a dying brain. We are told they all contain the same elements, but a little digging around with Google reveals that NDE's are tuned to cultural expectations. The NDE's of people in some cultures are rather terrifying! It's just that most of those reported in English are the experiences of Westerners with Western (that is, neo-Judeo-Christian) cultural expectations. What we do know is that the fear of death is powerful motivation to grasp at anything that might stave off the darkness. This book taps into that enormous market.

Sunday, November 30, 2014

Faint hope for a better American Constitution

kw: book reviews, nonfiction, constitution, law, amendments

I have just finished reading the Constitution of the United States, and all 27 Amendments. It didn't take long; in the Octavo volume I was reading the main text comprises just over 16 pages and the amendments 13. Less than 30 pages in length, it remains the best Constitution so far devised for any nation. Yet no matter how good it may be, the existence of Amendments shows that as times change, the process of constituting "a more perfect union" is ongoing.

Consider Amendment XII, which provides that Electors shall vote separately for President and Vice-President. Following the original prescription in Article II, when the votes of the electors were counted, the person with the most votes became President, and second place was awarded the Vice-Presidency. This practically ensured that the chief executive and his second-in-command would be bitter political foes. After 1804, the POTUS and the VEEP have at least had some chance of having similar political views. But imagine the outcome of recent elections had the amendment never been proposed or ratified: President Clinton and Vice-President George H.W. Bush, or President George W. Bush and Vice-President Al Gore!

The authors of the original Constitution kicked a few problems into the future, slavery and universal suffrage among them. They were also perhaps a bit idealistic, and didn't foresee how human nature would distort the application of constitutional law. I suspect they never dreamed an "activist court" would arrogate the right of "Judicial Review", to determine what is and what is not "constitutional". One way and another, times continue to change, though people do not, so after the Bill of Rights, a new Amendment has been adopted about every decade or so.

In a new book, Retired Justice John Paul Stevens proposes six. The book is titled Six Amendments: How and Why We Should Change the Constitution. Justice Stevens served nearly 35 years on the Supreme Court, about 15% of the time the Court has existed, and as a Circuit Court judge for some years before that. He believes that time has outpaced a couple of the Amendments, and that distortions in the political process have resulted, necessitating new Amendments beyond the repeal or rewording of those two.

I don't presume to understand everything I have read in the book, so I'll just comment on a few items. Firstly, Gerrymandering. Hardly anyone knows what this is any more except those who practice it, which leads to rather amazing contortions of district maps whenever they are re-drawn, usually following a Census. Take a look at what the Texas legislature wrought following the 1990 Census:

This was Texas District 30 from 1991-96. It gathered a great many Democrats into it, which raised Republicans to a majority in several surrounding Districts. The current district maps of several states also show troubling levels of "non-compactness", which can mean, for example, that a state in which 52% of the electorate votes Democratic ends up with Democrats holding 66% of the seats in its legislature.

The key words in Justice Stevens's proposed amendment are "compact" and "contiguous". The District shown is probably contiguous, but it certainly isn't compact. However, the word "compact" needs defining. I propose the following: The area of a compact district shall be 80% or more of the area of the tightest-fitting convex polygon that wholly encompasses it.

Basically, wrap a string around the shape and measure the area inside the string, then compare it with the area of the proposed district. The district shown would fill only about 30% of its encompassing polygon.
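
The test is easy to automate. Here is a minimal sketch of my own using the shapely geometry library; the district outline is a placeholder, and the 80% threshold is the figure I proposed above:

  # Compactness test: the district's area must be at least 80% of the area of its
  # convex hull (the "string wrapped around the shape").
  from shapely.geometry import Polygon

  # Placeholder L-shaped outline; a real test would load the boundary from a shapefile.
  district = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 4), (0, 4)])

  ratio = district.area / district.convex_hull.area
  print(f"compactness ratio: {ratio:.2f}", "passes" if ratio >= 0.80 else "fails")

This particular L-shape scores about 0.61, so it would fail; the 1991 Texas District 30 would fail far more spectacularly.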

Secondly, he proposes abolishing the Death Penalty via an Amendment that adds five words to Amendment VIII so that it reads
Excessive bail shall not be required, nor excessive fines imposed, nor cruel and unusual punishments such as the death penalty inflicted. (my emphasis)
I have long favored capital punishment as the only certain means of ensuring that certain persons convicted of the most heinous crimes could never repeat their offense, nor any other. The Justice's arguments have convinced me otherwise. Most states now have laws imposing imprisonment without possibility of parole for those crimes. Though a capital offender very rarely escapes, technology is making this less and less likely. Life without parole accomplishes two things:
  1. The incredible cost of the death sentence appeal process would be much reduced (though LWOP appeals might grow to fill the gap), and
  2. As time passes, new evidence or new technology will lead to certain convicts being exonerated and, equally likely, certain others becoming even more clearly guilty. The latter case might also foreclose certain lengthy appeals.
Finally, I am only partly in agreement with Justice Stevens in his proposal to amend Amendment II. His proposal is to add five words, so that it reads:
A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear arms when serving in the Militia shall not be infringed. (his emphasis)
I think he is right that the NRA in particular ignores the first clause of Amendment II. This is why he would add the five words, to tie the two clauses together in the way he believes the Authors understood it. However, I also understand the principle, "If it is a crime to own a gun, only criminals will have them." It's a little hard to get the American firearms toothpaste back in the tube.

It has been said, "The reason for the Second Amendment is in case the government does not keep the First Amendment." Seriously? Tell that to the Branch Davidians, or the folks at Ruby Ridge. No, the "reason" was the expectation of invasion by Britain, which happened in 1812, and the memory of the Revolutionary War, which was still vivid, living memory for those writing the Amendment. There was no standing army, though the Constitution provides for one. There was only the Militia, and all men were expected to be ready to serve at a moment's notice.

From time to time there is a protest by NRA members against proposed legislation or regulation regarding firearms. The ragtag, motley bunch that typically shows up at such events would be laughed out of any real militia. They give a bad name to the NRA. I certainly hope that the majority of loyal, patriotic members of the NRA are as deeply ashamed of those antics as I am. Such dern fool protesters are the kind who say, "You'll have to pry my gun out of my cold, dead fingers." Americans who honor the law silently reply, "That's a challenge we'll accept when needed."

Is there any serious chance for Justice Stevens's six proposals being adopted? I think not. America is no longer the Land of the Free but the Realm of the Rich, and there is too much money to be lost by powerful entities should even one of the six be enacted. Were I king of the country, here are a few Amendments, even less likely to be enacted, that I believe would be equally salutary to the American commonweal and her political health:

  • Congress shall pass no law exempting its Members from liability to obey any statute of the Federal Government or any State.
  • Corporations are not Persons in any political sense. Only persons who can vote have the right to donate to political campaigns either for a candidate or in favor of any ballot issue.
  • No member of the Senate or the House of Representatives shall be entitled to vote upon any measure unless that member has read the document in its entirety and is able to present orally a summary of its salient arguments upon demand by any constituent.
  • [Line Item Veto] Any measure passed by both the Senate and the House of Representatives, presented to the President, shall be written in the form of clauses not to exceed one page in length each. Each clause is to be signed separately, and any clause not so signed is to be deemed Vetoed.
I think the good Justice might be halfway favorable to at least one or two of these. Anyone else?

Wednesday, November 26, 2014

He's strong and good-looking, and above average

kw: book reviews, nonfiction, stories, memoirs, sketches, humor, humorists

Having listened to A Prairie Home Companion whenever I could between the 1970s and early 2000s, I snapped up Garrison Keillor's new book The Keillor Reader the moment I saw it. I've read only a couple of his other 20-odd books, but this one promised much, and it does indeed deliver.

The book collects items of Keillor's fiction, semi-fiction and memoir writing over a 45-year career. He thinks of himself primarily as a writer, one who happens to present much of his own writing—or, a stream-of-consciousness version of it—publicly on APHC and at other performance venues.

The arrangement of items is partly by time, within certain topics. He begins with some of his best-loved APHC monologues. I suppose these are from transcripts of his on-air performances, because he writes in one introduction that he pre-writes the monologues, but doesn't read them; he'll use about half the material on average, and chase rabbits (my term for it) as they seem appropriate. His written semi-script is more of a warehouse of ideas he can pick from during the talk. It is a fine way to prepare a talk, and I used a similar strategy in my Toastmasters International days. But he does it supremely better!

At least a couple of the monologues must be from public performances, not from on-the-air shows. The material veers into areas my mother used to term "daring". For example, in a hilarious bit triggered by events the day his cousin Kate tried out for a talent show wearing nothing under her sweater, he winds up holding her on his lap as they hide from the school nurse in a stall in the Boys' Room. When he asks, "Are you really not wearing a bra?", she pulls his hands underneath to check for himself. Assuming this is mostly autobiographical, I reckon it was a turning point for a 15-year-old boy.

I guess I didn't hear the right monologues to understand the title "Iconic Pajamas" for the second section. The items are short pieces in various genres, published in various venues. He likes to turn well-known stories on their head, writing "Little House on the Desert", for example, as a sideways look at Laura Ingalls Wilder, who perhaps "augmented" her stories; or rewriting "Casey at the Bat" from the perspective of the opposing team. Such send-ups of history continue in the third section, where he lampoons, for example, Earl Grey, Don Giovanni and Zeus.

His humor gets its power from restrained exaggeration. But he can use it quite unrestrainedly when he likes, such as in "My Life in Prison", in which he serves a 512-year sentence for throwing a tomato at his sister. His recounting "My Stroke (I'm Over it)" is straight fact, told in a mildly humorous way, as a fellow, glad to be still alive, might tell his buddies while keeping it light. These are both from the fourth section.

He touches on his faith here and there, with enough emphasis that we realize how profound an effect it has had on his life. Though he was raised in an extremely strict sect of the Brethren, his family was split down the middle between two of its divisions. The various aunts and uncles agreed to be civil anyway, which afforded Keillor and his siblings and cousins more freedom than they'd have had otherwise. He forsook the strict way in his teens, and now takes comfort as an Episcopalian. He is warmer than might be suggested by the "God's Frozen Chosen" moniker some use for Anglicans and Episcopalians.

His closing essay, "Cheerfulness", is most touching. It begins with a lightning-fast survey of synonyms for "happy", noting that being cheerful is a choice, more so than the rest. He claims to be cheerful despite his dour demeanor, and fills the piece with examples. No doubt about it, he spreads good cheer everywhere, so he has a plentiful store of it!

Thursday, November 20, 2014

Decrying a Florida tourists seldom see

kw: book reviews, nonfiction, polemic, newspaper columns, essay collections

Carl Hiaasen is a journalist and columnist for the Miami Herald, and has been for quite some time. A hundred or so of his columns from the past ten years have been gathered by his editor Diane Stevenson into his new book Dance of the Reptiles. You might think the title is about alligators, but a piece of wisdom from my mother came to mind within a few pages of starting the book: Many years ago I was leaving to hike alone up Mt. Lowe, which is accessed through the old Groucho Marx estate in upper Altadena. She expressed worry for my safety, and I said, "I know how to avoid rattlesnakes." She said, "I know, but I'm worried about rattle-people!" The Reptiles of the book are public officials in Florida.

Alice Roosevelt Longworth is said to have carried a cushion embroidered with the words, "If you don't have anything nice to say about anybody, come sit here by me." She'd have been delighted by a visit from Mr. Hiaasen, at least for a while. He is a skilled storyteller, and the writing itself kept me going for quite a while, maybe a third of the book. After that it became a slog. I just don't have an appetite for quite so much mad-dog, polemical journalism.

I understand his frustration. I moved here to the Mid-Atlantic area (I'm kinda south-west of Philadelphia) about 20 years ago. I sure didn't stay here because of the political climate. Within 2 years of our marriage, we moved from California to a Western state, and we lived in the West or Midwest in the years before coming here. A couple of months after our move, our son entered first grade, so I began attending PTA and School Board meetings. What a shock! The PTA was OK, though I was sitting next to a corrupt politician who soon became a senator. My personal take on his voting record is that he has exactly opposite values to mine. I score him a perfect Zero, at least until yesterday, when he actually voted in favor of the XL Pipeline!! (Not that it did any good…)

School Board was another matter. Every member was on the take. The President was big into construction, and it was no coincidence that plans were brought forward time and again, either to demolish building A so a new school could be built somewhere else, or to change the school year in a way that would necessitate extensive (and costly) remodeling of about half the buildings. I got the notion one day that a well-placed bomb at one of their closed-door meetings (the usual kind) would do the human race a whole lot of good. Once I realized I had begun thinking that was a really good idea, I quit attending.

I really don't know what to say about the book. It is ancient wisdom that the pen is mightier than the sword, but I think it would take a dozen more pens of the quality this author shows to make much of a dent in Florida's public service industry. If you hanker for a really comprehensive catalog of the ways politics go bad, and you used to think New Jersey politics were the worst this country has to offer, read this book, or as much of it as you can without waking up with the heebie-jeebies!

Friday, November 14, 2014

Biology gives you a brain - Life turns it into a mind

kw: book reviews, nonfiction, science, predictions, brain, mind

The title is a quote from Jeffrey Eugenides, and succinctly expresses my understanding of the mind. A longer exposition on the mind and its possible futures is found in The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind by Michio Kaku. Dr. Kaku, a physicist whose specialty is string theory, is well known to those who watch the Science and Discovery Network cable channels. He is always willing to provide a series of provocative and quotable sound bites on scientific subjects.

In The Future of the Mind he first explores what the mind is, particularly the conscious mind, and defines consciousness in his own unique way. I like his approach:
Human consciousness … creates a model of the world and then simulates it in time, by evaluating the past to simulate the future. This requires mediating and evaluating many feedback loops in order to make a decision to achieve a goal. (p 46)
I would only add: goals can be both innate (hunger or reproduction) and derived (the engineering steps needed to construct a bridge, even though the bridge is part of a larger, innate goal). The reference to feedback loops harks back to an earlier discussion of levels of consciousness.
  • Level 0: Stationary organisms or mechanisms that react to one or a very few feedback loops in a few parameters. The lowest possible consciousness is that of a thermostat, which he defines as Level 0:1 because it reacts to one parameter, Temperature. Plants react to Light, Gravity, Temperature, Moisture and perhaps a few Mineral Concentrations, and could be characterized as Level 0:n where n is about 10.
  • Level 1: Motile creatures (and perhaps some mechanisms) that can thus react to changes in space and location, particularly animals with a central nervous system such as fishes and reptiles.
  • Level 2: Social animals, particularly those that express a theory of mind and are thus reacting to the possible or probable intentions of their fellows and other animals such as predators or their prey. The number of feedback loops that Dr. Kaku might enumerate here grows into the hundreds or thousands.
  • Level 3: Future consciousness, which may or may not be among the capabilities of some nonhuman animals, but is a characteristic of human consciousness. Planning for the future, particularly with multiple contingencies, and not as an instinctual reaction, is the hallmark of this Level.
It occurs to me that Level 3 is an iffy business. Most people plan only when they have no alternative, and often do so badly. I suspect that we are pretty new at this. It may have been achieved less than 100,000 years ago. Dr. K doesn't mention the "when" of Level 3.

At this point I must note a puzzling item, an apparent error. In his student years, the author experimented with Sodium-22 (Na-22), an isotope that emits positrons. He then mentions, in two places (pp 5, 26), that Na-22 is used for taking PET (positron emission tomography) scans of brain activity. Not really. Wafers containing a tiny amount of either Na-22 or Ge-68 are used as "spot markers", stuck on the outside of the body to provide orientation markers, typically for organs other than the brain, which has such a distinctive shape that markers are usually not used. Brain scanning in particular uses Fluorine-18 in a glucose analog (fluorodeoxyglucose or FDG); glucose concentrates in active areas of the brain, and FDG with it. The positrons detected in the scanner "light up" these active areas on the scans.

F-18 has the virtue of a very short half-life of 110 minutes, and must be generated in a cyclotron at or near the imaging facility shortly before use. Na-22 and Ge-68 have half-lives of 2.6 years and 8.9 months, respectively. Also, neither can be used to produce a glucose analog. Even if they could, to achieve a similar level of positron emission, much larger amounts would have to be used, which would continue to emit at that level for many months or years. Thus F-18 is thousands of times safer in the body than the other two.
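
To put rough numbers on "much larger amounts": for a given activity A, the number of radioactive atoms needed is N = A × t½ / ln 2, so N scales directly with the half-life. Here is a back-of-the-envelope sketch in Python, using round half-life figures; it is only an illustration of the scaling, not anything resembling a dosimetry calculation:

    minutes = 1.0
    days = 24 * 60 * minutes
    years = 365.25 * days

    # Half-lives quoted above, converted to minutes.
    half_life = {
        "F-18":  110 * minutes,
        "Ge-68": 271 * days,     # about 8.9 months
        "Na-22": 2.6 * years,
    }

    # For equal activity, the number of atoms needed scales as the half-life.
    for iso in ("Ge-68", "Na-22"):
        ratio = half_life[iso] / half_life["F-18"]
        print(f"{iso}: about {ratio:,.0f} times as many atoms as F-18 for the same emission rate")

Run as written, that says Ge-68 would need a few thousand times as many atoms and Na-22 over twelve thousand times as many, which is the scale behind "thousands of times safer".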

Onward. Leading up to the multilevel model of consciousness, I find this statement:
Self-awareness is creating a model of the world and simulating the future in which you appear. (p 36)
This leads later to a discussion of whether machine consciousness can become self-aware. A recent article in Wired by Kevin Kelly discusses Artificial Intelligence as an emerging "cloud service", a scalable on-demand service already being used, for example, by face recognition modules in programs such as Picasa and Photo Gallery. Kelly particularly notes that consciousness seems to need an element of chance to make it work. If this is so, conscious intelligence is inherently less than 100% reliable, so that future AI offerings may need to be certified as "Non-Conscious". Thus his view of machine intelligence is as something supplementary to the "natural" consciousness we experience, and is best kept unaware.

Dr. Kaku believes just the opposite, and discusses at length the possibility of machine self-awareness, and the possibility that we will be replaced by machines. The word "robot" is bandied about, with little acknowledgement that the word has two very distinct, very different uses in science fiction versus industry.

Industrial robots are actually better described either as Waldoes—based on "Waldo" by Robert A. Heinlein in 1942—if they are directly human-controlled (this includes drones), or as programmable actuators when they are controlled by a program running in a connected computer. Thus they are a logical extension of NC (numerically controlled) machining.

Autonomous robots as described by Isaac Asimov in I, Robot and all his later "Robot" books and stories, whether subject to his "Three Laws of Robotics" or not, are still decades in the future, if indeed they can be realized as self-contained entities at all. Current state-of-the-art autonomous robotic mechanisms, such as the car from Stanford that finally won the DARPA self-driving competition in 2005, are barely at the threshold of Level 1 consciousness. Their "planning" capabilities are pre-programmed, an analog of animal instinct, and limited to finding a way to specific GPS coordinates.

Moore's Law states that the number of devices on a computer chip tends to double about every 18 months. It is a trend Dr. Gordon Moore observed, but it has become a self-fulfilling prophecy driven by the profit motive. Several related trends include the power requirement for a given amount of processing speed: watts per gigaflop (GFLOP, where FLOP means FLoating-point OPeration; per second is implied) seem to fall by about half every two years. This allows us to make a prediction, based on the assumption that Moore's Law will continue to hold for a long enough period. Today's fastest computer system has processing speed and memory capacity very similar to the human brain, but consumes 9,000,000 watts, including air conditioning. The brain maxes out at 20-25 watts. Nine million divided by 25 is 360,000, or about 2 to the 18.5 power. At one halving every two years, 18.5 halvings imply at least 37 years before human-level AI can be run on 25 watts.
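
For anyone who wants to check the arithmetic, here is a minimal Python sketch using only the round numbers quoted above (the 9-megawatt and 25-watt figures, and the assumed two-year halving of watts per GFLOP):

    import math

    supercomputer_watts = 9_000_000   # brain-equivalent machine today, including cooling
    brain_watts = 25                  # upper end of the brain's power budget
    halving_period_years = 2          # watts per GFLOP assumed to halve every two years

    ratio = supercomputer_watts / brain_watts        # 360,000
    halvings_needed = math.log2(ratio)               # about 18.5
    years = halvings_needed * halving_period_years   # about 37

    print(f"Efficiency gap: {ratio:,.0f}x = {halvings_needed:.1f} halvings, or about {years:.0f} years")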

Moore's Law is already in trouble, however. The fastest computer chips today run at about the same speed as those of about 10 years ago. Greater total power in a "CPU chip" for your PC is achieved by putting multiple processors on the chip. That is why they are now called "multicore" CPU chips. The computer I am using has a 4-core CPU. Commercial chips top out at 16 cores (as of late 2014), and the Watson supercomputer has thousands of these wired together.

I don't hold out much hope for "quantum computers" (qC's). The hype about these devices is beyond incredible. Their operation requires maintaining coherence among some number of quanta, typically electrons or ions held in some kind of magnetic trap, and being able to decohere them in sequence for readout into ordinary, electronic devices. Holding coherence longer than a small fraction of a second is comparable to balancing a pencil on its point. I suspect that the ancillary machinery needed for maintaining coherence and, even worse, manipulating it quantum-by-quantum for readout, will grow exponentially with the length of time coherence is needed, and the number of quanta in use. I don't anticipate a qC to be able to crack AES-256 encryption anytime this century, if ever.

I find the middle of the book most useful. Dr. Kaku discusses mechanically enhancing our smarts. This is actually what we do all the time with the academic technologies, beginning with the emergence of writing a few thousand years ago. While we still ought to teach times tables to our youngsters (gigantic groan from the grandkids), calculators in our phones and watches ensure that we make fewer arithmetical blunders. In 1958 "The Feeling of Power" by Asimov was published, in which mental arithmetic is rediscovered after decades during which all calculation was done using small devices (in 1958 the "desk calculator" was a bit bigger than a portable typewriter). These days we use Google or Bing or DuckDuckGo to find stuff we're not quite sure we remember, or don't know in the first place. Siri and other voice apps on our phones make this process simpler than ever. This enhances our useful smarts.

I am not sure most of us will ever need the invasive devices he describes, such as nanowire hookups to our hippocampus and other areas that mediate memory. The mind is tough to tinker with mechanically. TMS (transcranial magnetic stimulation), using a magnetic coil outside the skull, can briefly inhibit certain functions. It has been used to make a person a temporary psychopath, by zapping the brain area where caring resides, and to briefly release savant capabilities, by shutting down an area of the brain that is inactive in autistic savants. But TMS does not add capabilities; it only releases inhibitions placed upon some functions in ordinary brains. Why would you want to be a psychopath, anyway? Ask Neil Armstrong, who needed totally uncaring, steely resolve to land the Lunar Module in 1969. Not all psychopaths are criminals. Maybe future lunar missions (or even commercial airliners) will include a TMS device to shut down distracting anxiety in a pilot during landing.

Supposing we learn to read out and implant memories, even to create or erase them at will. Sometimes this could be a very good thing. I define neuroses as "out of date defense mechanisms". The person or situation that hurt someone is gone forever, but they still react to certain stimuli in embarrassing or disabling ways. When a neurosis is based on a well defined, focal experience, psychologists call it an Engram, and erasing engrams might be a very useful future use of mind technology. Other than that, leave my memories alone!

But memory is slippery, and specific incidents don't just make a kind of diary record in some spot in the brain. Dr. Kaku describes well how shortcut/thumbnail images go one place, emotional memories another, smells elsewhere and so forth. Recalling a memory means gathering all these bits back together for replay through some part of the frontal lobe (and relevant spots throughout the brain) so you can relive the incident. But we edit our memories, emphasizing certain items at the expense of others that we gradually forget entirely. This makes "truth serums" unreliable, as discussed in a mind control chapter.

Dr. Kaku discusses the possibility that we might merge with our electronic offspring, once it is to our benefit to do so. This simply expands the notion of "prosthesis" to the brain. Certain modern "artificial legs" actually perform better than the original for specific tasks. Just ask the "blade runner" (and it is unfortunate that he is now a felon; I don't think it likely he knowingly killed the girl but he couldn't convince a jury of that). He wasn't nearly such a fast runner before he got springy metal feet. But he'd need differently designed prostheses to play football (soccer in America).

As I have mentioned many times in earlier posts, I made a 40-year career out of writing software that worked with people, taking advantage of what people do well and leaving to the machine the tasks that people do poorly. A mechanical brain excels at detecting differences. There are amusing puzzles such as "find 10 things that are different between these two pictures". Sometimes, one of the pictures is a mirror image, which to me actually makes it easier. Something that takes experienced puzzle solvers 5-10 minutes would be solved by a computer with a webcam in a second or less. It might also highlight several hundred or thousand tiny differences that arise from printing ink interacting with the fibers in the paper, something few humans would be able to notice without using a microscope. A "wetware" brain excels at detecting similarities. That is why we can see camels or fish in a cloudy sky, or recognize someone from seeing only the edge of a face turned mostly away.

Only in the past week, I noticed that Picasa is picking out faces that are in profile, something it couldn't do before. But it is still flagging a percent or so of things that are clearly not a human face. However, its ability to find 90+% of the faces in my photos really speeds up face tagging. If I give it time after loading a new batch of pix, it gathers suggestions for many of the faces from my library of identifications of about 700 friends in multiple images. This is an example of useful AI: it isn't as good as I am, and doesn't need to be. It just needs to do most of the work and leave it to me for refinement. But I would not want to leave it to the Picasa face-recognizer to guide a drone on a kill mission. Not when it mistakes so many other Asian women for my wife!

A minor error seen in passing on p 255: The fastest supercomputer at the time of writing could perform about 20 PFLOPs (P = Peta), which is explained as 20 trillion; it is actually 20 quadrillion. A trillion FLOPS is a TFLOP (T=Tera).

And, oh dear, another: comets in the Oort cloud are described on p 289 as lying "motionless in empty space". Even at distances up to a light year, these comets move at roughly 100 m/s or more in orbit about the Sun. Compared to the Earth zipping along at nearly 30 km/s, or even Pluto, averaging about 5 km/s, that is quite slow, but far from "motionless". Autobahn speeds top out near 90 m/s.
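
For the curious, the speed of a circular orbit is v = √(GM/r); at a distance of one light year from the Sun that works out to a bit over 100 m/s. A quick sketch (my round numbers, not the book's):

    import math

    GM_sun = 1.327e20        # m^3/s^2, the Sun's gravitational parameter
    light_year = 9.46e15     # meters
    r = 1.0 * light_year     # a generously distant Oort-cloud comet

    v = math.sqrt(GM_sun / r)
    print(f"Circular orbital speed at 1 light year: about {v:.0f} m/s")   # roughly 120 m/s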

Dr. Kaku excels in speculation, which means he is frequently mistaken as his expectations are overtaken by actual events. However, only those who have the courage to predict have the chance of sometimes being right. While the single-processor version of Moore's law was played out about the year 2000, multicore chips and continuing experiments with vertical-transistor chips continue a somewhat more modest trend. Will we ever achieve the 20 PFLOP-at-20-watt processor needed to equal a brain in both speed and power required, and also in volume (2L or less)? Moore's law might suggest the 37-year timeline I figured above, but we can't really know until we try.

And I don't think duplicating human consciousness is a worthy goal anyway. Much better is producing machinery with sufficient computing power to enrich everyone's lives at affordable cost. This matches the old Japanese supercomputer project which had as one goal, achieving Cray-1 capability (100 MFLOPs) in a $2,000 PC by 1995. This goal was achieved. The computer I am using now, which I built, is 100 times that fast, and the parts cost $800. I want machines to continue what they do best: complement and supplement our abilities. I think Dr. Kaku would agree, in spite of his excited, blue-sky forecasts.

The Appendix is titled "Quantum Consciousness?". It is likely that "free will" and full consciousness require quantum uncertainty. Here I am in full agreement. There is quite a discussion of the "Cat in a box" proposed by Schrödinger. Based on a carefully set-up radioactive detector that has exactly a 50% chance of triggering the release of poison gas to kill the cat in the next hour, we are asked, at precisely the one hour point, "Do you think the cat is dead or alive?". Much is made of the meaning of the Observer, in the Copenhagen Interpretation favored by Niels Bohr and most physicists, and other interpretations. No mention is made of the fact that the cat is also an observer! In fact, the results of many experiments that are intended to "prove" these things show that photographic emulsions, CCD detectors and other devices are also observers! They record the "collapsed wave function" phenomena, whether or not a human is present. I think nobody suggests that the image on a piece of film, developed in automatic machinery, does not truly appear until a human actually turns on a light and looks at it.

I think I am repeating something I wrote elsewhere to say this: a beam of light passing through a vacuum is affected by everything it passes, at any distance whatever. Of course, if you pass it through a small hole you'll get a diffraction pattern. The edge of the hole is the "observer" that leads to the scattering of the photons into a more divergent beam. But even a 1mm diameter laser beam, if it passes through a 1 meter aperture, will make a different pattern on a distant film than it would if the aperture were 2 m across. It will also differ if the aperture is square vs round. The existence of "things" in the universe provides an infinite number of "observers", contributing to the collapse of the wave function—if indeed that is what actually happens—for every quantum event everywhere.

Thus, the author's conclusion is apt. We must know ourselves better, not only to enhance or even duplicate our abilities, but to develop tools that work with us in better and better ways, in more and more useful realms of experience.

Thursday, November 06, 2014

If you think your pet is crazy, you may be right

kw: book reviews, nonfiction, animals, psychology

Our 4-year-old house cat has never been outside on her own. She leaves the house only when we take her to the veterinarian, in her carry-case. Now, you might think a 1600 square foot house with a full basement and a sun porch would be lots of space. After all, she's a lot smaller than we are. But an "outdoor cat" typically roams an area of a few acres, so her world is small. She certainly has more energy than she can expend while kept inside, so she's bored a lot of the time. The condition of our carpet attests to her need to stretch and scratch, a diversionary activity because she can't roam far. And she does something I haven't seen any of our other cats do (I grew up with cats): sometimes she rests with her chin on the floor. She isn't asleep. Her eyes are open but she doesn't move a muscle. It is usually something dogs do when they're bored. She's bored.

After reading Animal Madness: How Anxious Dogs, Compulsive Parrots, and Elephants in Recovery Help Us Understand Ourselves, by Laurel Braitman, I realized we are pretty lucky. At least our poor kitty is not psychotic. She doesn't pull out her hair, nor circle or pace like a caged tiger, nor upchuck her food and eat it again, nor demonically attack us out of nowhere. That pacing tiger in the zoo? It or its ancestors had a natural territory measured in dozens of square miles. It has energy to match. What else can it do?

Human insight: Energetic people of all ages need an outlet. For some, it is extreme sports, long hikes (One of my cousins likes to take a 2- to 4-hour hike in the desert. Daily), jogging or aerobics classes. For others it may be joyriding stolen cars, dealing drugs, doing drugs, or other "antisocial" activities. My outlet during my teens and early 20's was splitting logs with an ax. There's nothing quite like setting up a 14-inch-diameter cut of Lodgepole pine when it is -10°F, and popping it in half with a single whack. Several easy splits later it is in 6-8 pieces, ready to burn. Half an hour, half a cord, and I'd be ready to sit still and do my homework. We burned a lot of wood those years!

In humans or animals, "misbehavior" has a reason. Of course, the roots of behavior are a mix of personality and pathology. Some people just seem born to be criminal, and I've written before of the psychopathic young person I knew from age 7, who seemed unable to think of anything legal to take up his time. I reckon animal personalities are similarly variable. There's a room we never let our cat enter. In this room, and this room only, she will seem peaceable for a while, but then get a wild look in her eyes and climb the drapes. We have drapes in other rooms that she ignores.

Ms Braitman began her journey of discovery because of her suicidal dog (I consider purebred dogs to be maniacs in the making anyway). She had a Bernese Mountain Dog named Oliver who clawed through a window frame, pushed aside an A/C unit and jumped from a fifth-floor window onto concrete. The poor dog was too tough to die just from that. Many vet bills later, he was home, but not for long. While still on the mend, he chewed up another window frame and swallowed enough wood to thoroughly twist up his intestines. He had to be euthanized. The Bernese Mountain Dog is a remarkably stable dog for a purebred. But Oliver had a poor life before the author and her husband agreed to care for him, and came to love him, in spite of his extreme anxiety. Suicide, human or animal, doesn't just come out of nowhere.

In her long quest to discover how nonhuman animals, mainly mammals and birds, suffer mental illness, the author traveled the world and spoke with many experts of many kinds. She is an opponent of the existence of zoos, declaring that once you know what to look for, you cannot see a single animal in a zoo that has normal psychology. She seems to have spent quite a bit of time in Thailand with elephants and their mahouts. The stories are remarkable, both of the normal ones and the abnormal. If a working elephant (few wild ones are left in Thailand) is well matched with a sympathetic mahout, the two become like loving siblings. One kind of trouble comes if an elephant, always a very social animal, is a bit overly anxious, and the mahout is hoping to marry. Jealousy can cause distress, destruction or murder. Another kind is a personality mismatch. Some "trouble elephants" have done much better when paired with a different man (hardly any mahouts are female).

Although an element of the author's purpose has been to illuminate human mental suffering, in reality the book provides a wide-ranging survey of mental illness in animals and the efforts of owners and veterinarians, sometimes helpful and sometimes tragically comical, to alleviate it. Fun fact: the normal dose of Prozac for a 50-pound dog is enough to make you sleep for a week, if you wake at all.

So at a circus, or the zoo, if you see an elephant in a small space, standing still and swaying a little back and forth, in her mind she's striding down a forest trail, enjoying the sights and smells she is denied in her tiny enclosure; and for an elephant, an acre is tiny. Most captives endure much less.

I find it remarkable that so many animals, in homes, corrals, zoos, nature parks and so forth, do as well as they do. If you were my pet house cat, would you stay sane?

Friday, October 31, 2014

The Stats are out to get you

kw: book reviews, nonfiction, statistics, logical fallacies

I reckon there are a few hundred books with subjects similar to the classic How to Lie With Statistics by Darrell Huff. They are really self-help books, aimed at helping us resist arguments made using flawed, or fraudulent, statistics. Now I find a book aimed at those who might use statistics to make an argument, to help them avoid fooling themselves: Standard Deviations: Flawed Assumptions, Tortured Data, and Other Ways to Lie With Statistics by Gary Smith.

As I began to read, I remember thinking, "He ought to title it Nonstandard Deviations", but I soon realized that proper statistical thinking is so rare, even among scientific writers, that the deviations the book presents are indeed standard practice. It is trouble enough that cynical marketers and politicos are using statistics fraudulently to deceive us; the larger problem is how many different ways proponents can lie to themselves!

The key chapter is #2: "Garbage In, Gospel Out". There are 16 more chapters exposing at least as many errors of statistical logic, plus a great summary titled "When to Be Persuaded and When to Be Skeptical"; together those chapters show all the common ways of using numbers to create nonsense. Several are based on faulty assumptions about trends.

We live in a world with two kinds of time. We are embedded in the cycles of the seasons: days, weeks, months, years, decades and centuries. Every day the sun rises, crosses the sky, and sets (unless you live in the high Arctic or on Antarctica). Every year the seasons come and go in sequence. Our most basic, gut-level experience of time is cyclic. But we also have linear time. Plant a tree and it grows taller every year. Some trees keep that up for a thousand years or more. We see continual population growth in most countries and in the whole world (Germany, Japan and a few other countries have shrinking populations, but we don't think about that much). We have ancestors in the past, going all the way back to Noah or Adam or whatever progenitor we believe in; we also expect to have descendants going pretty much forever into the future, or at least "until Kingdom come".

We are less familiar with linear time, though, and tend to think linear trends can continue without limit. The key to unlocking this quandary is to realize that time itself is linear, but things that happen in time have a beginning and an end, and typically rise and fall in between. An evangelical "young-Earth" Christian believes in a strictly limited span of time, beginning about 6,000 years ago, maybe as much as 10,000 years, and ending within the next hundred or so. A purely agnostic scientist who knows cosmology believes time, or at least the current phase of phenomena in time, began 13.8 billion years ago, but there are a few hundred competing theories about when or whether it will end. Nonetheless, the end of life on Earth is pretty well understood to be a billion years from now, because the Sun is slowly heating up, and the end of the Earth itself will follow 3-4 billion years later, when the planet is crisped and perhaps evaporated by the Sun's red giant phase.

A few billion years is plenty of time for some trends to go along and go along for a long, long time. The human population of Earth has been steadily increasing for at least the last 50,000-70,000 years. The hope of many "zero population growth" advocates is that human population will stabilize within the coming 50-100 years, and even begin to shrink. However, if you want to start a business that requires population growth to continue, and you're satisfied with a run of 20-40 years, go for it. It'll take at least that long for growth to slow to the point where you'd have a hard time keeping the business going. But the usual business cycle is about 6 years. Plan on some kind of downturn in the next few years. If you survive that into the next cycle, you just might keep that business going until your kids are grown.

The author exhorts us, again and again, to think. The motto of IBM used to be "THINK". Statistical reasoning doesn't come naturally, even for statisticians. He uses humorous stories of "experts" who ran afoul of their own wishful thinking. It takes a lot of data to support a statistical inference. A key concept of statistics is "significance". Scientific journals are filled with articles that employ statistical tests and declare that some finding is "significant at the x% level". That "x%" is typically 95%, frequently stated as 0.95. Roughly speaking, it means that if there were really no effect, a result at least this strong would turn up by chance no more than 5% of the time. Put another way, there is always a 5% chance of declaring a "significant" finding when nothing real is going on.

Let's suppose that every scientific experiment resulted in a publication telling the results. Further, let's suppose that only one in ten reported "significant" results. Think a minute: why do scientists use statistics? It is because they don't get a clear-cut result. If using widget A was always lots better than using widget B, statistics would not be needed. The article could be very short: "In 100 trials, widget A always did a better job than widget B". Then you'd question whether the scientist were sane: after about 10 trials, you could have stopped already! (Just how soon you could stop depends on how much better A was.)

More typically, there is overlap. Suppose that some scoring method showed that A is better 64% of the time. If that was 64 out of 100, it is probably a significant result, but if it was 16 out of 25, you could be in trouble with the law of small numbers. This is analogous to flipping a coin 25 times to see if it is a fair coin. You get 16 heads. How likely is that? Many people think there ought to be a nearly exact even split, either 12 or 13 heads. Here is how to analyze it:

  • For 25 coin flips, there are 33,554,432 possible outcomes, from all heads to all tails, but in 33,554,430 out of 33,554,432 cases, it'll be some mix. 
  • An outcome of 12 heads occurs 5,200,300 different ways, as does an outcome of 13 heads. Together they total 31% of all outcomes. That is, intuition is correct less than 1/3 of the time!
  • An outcome of exactly 16 heads occurs 2,042,975 different ways. Thus, the chance you'll get 16 heads is 6.1%. 
  • There is thus a 6.1% chance of getting this outcome even if there is no real difference between the two widgets. The result is not sufficiently "significant".

This analysis was done using Pascal's Triangle, and there is plenty of software out there that can do such an analysis. You just have to know enough to set it up. By the way, if this were the result of 50 trials, with 32 heads, you'd have a different conclusion. Firstly, getting exactly 32 heads in 50 throws occurs 1.6% of the time. You could also say that getting at least 64% occurs 3.2% of the time by chance alone. Thus, the "significance level" is 96.8%, which is better than 95%, so there is support to say that widget A is actually better than widget B.
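
For those who would rather let software do the counting, here is a minimal Python sketch of the same arithmetic, using math.comb in place of Pascal's Triangle (the function names are simply mine):

    from math import comb

    def exact_prob(n, k):
        """Probability of exactly k heads in n fair coin flips."""
        return comb(n, k) / 2**n

    def tail_prob(n, k):
        """Probability of k or more heads in n fair coin flips."""
        return sum(comb(n, j) for j in range(k, n + 1)) / 2**n

    # 25 flips: intuition expects 12 or 13 heads; 16 heads is lopsided but not rare.
    print(f"12 or 13 of 25:   {exact_prob(25, 12) + exact_prob(25, 13):.1%}")   # about 31%
    print(f"exactly 16 of 25: {exact_prob(25, 16):.1%}")                        # about 6.1%

    # 50 flips: the same 64% ratio is now much harder to get by chance.
    print(f"exactly 32 of 50: {exact_prob(50, 32):.1%}")                        # about 1.6%
    print(f"32 or more of 50: {tail_prob(50, 32):.1%}")                         # about 3.2%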

This is not a lock. Remember, I posited a world in which every result is published, whether favorable or unfavorable to the initial conjecture. Do you think negative results are published? Nearly never!! So in a world of "publish everything", if 1/10th report "significant" results, some of those are likely to be due to chance alone. Perhaps one in 20, or 2 of the original 100 articles. But in the real world, the proportion may be quite a bit higher. It is certain to be at least 1 in 20.

OK, that's a long-winded excursion into just one item that struck my fancy. As in most endeavors, there is a very short list of ways to do it right, and a near-infinite number of ways to go wrong. That's why we need to expose our ideas to a great variety of folks with different backgrounds and viewpoints. Many times, though, the proponent(s) of an idea will circulate only among those who think alike.

It is also shown that wanting a certain result is the most powerful enemy of truth. I recall an old story of someone seeking a simple answer, because he didn't know how to figure it for himself. He got a variety of answers from people he knew, until he asked a political lobbyist, who responded, "What do you want it to be?" Well, that joke may be more political than statistical, but it is sobering. No matter how much we may want this or that to be true, the actual case is the actual case, the truth is the truth, and will outlive you and your most heartfelt desire.

Sunday, October 26, 2014

We should ask John Lehr about this

kw: book reviews, nonfiction, dieting, self help, paleo diet theory

I've read and heard snippets about a "Paleo Diet", so I figured it is time to look into it. I read Your Personal Paleo Code: The 3-Step Plan to Lose Weight, Reverse Disease, and Stay Fit and Healthy for Life, by Chris Kresser. The thesis of this diet and self help movement is that we evolved for a million years or so eating a certain way, but in the past 10,000 years or so the agricultural revolution and then the industrial revolution have changed the kinds of foods we eat, and we aren't well fitted to the "modern diet".

Perhaps you've heard of the "no white stuff" diet: no bread or dairy, but eat lots of meats, fowl and fish, and all the fruits and greens you can stand. It seems to be a spinoff of the low/no-carbohydrate Atkins Diet. That is largely where the first section of the book is going.

The author tells us that hunter-gatherer peoples are healthier than we are, and that our ancestors were healthier still. We read that the grain-based diet in all agricultural societies is to blame for chronic illnesses such as heart disease and diabetes. Thus we need to eat more like our pre-agriculture ancestors.

It isn't really that simple, because, he explains, there was no single all-encompassing diet in the paleolithic era, which ended about 12,000 years ago. Those dwelling inland would eat quite different foods than seacoast peoples—who ate much more fish and shellfish—and the Arctic diet was about 90% blubber, as it still is.

He does point out that life expectancy at birth was about 22 years in 10,000 BC, but goes on to say that it fell to about 19 years a few thousand years later, based on archaeological studies primarily in the "Fertile Crescent", or Mesopotamia. I personally attribute that to a great increase in violence as people lived in groups larger than the typical gatherer group of 50-150 souls.

This is a bigger evolutionary adjustment: For millions of years, few members of any species in the genus Homo encountered non-relatives on any frequent basis. If they did, a fight to the death was the ordinary result. This is still true in parts of Papua New Guinea and Amazonia. Once agriculture came along, people began to live in larger and larger groups, and reflexes that were appropriate on the savannas became a problem. We are still learning to get along with strangers, and we're probably evolving more "civil" attributes. In most of the "civilized world", people are able to go about their daily activities without attempting to kill every stranger they see, because that would mean attacking nearly everyone encountered! This is attested by the steadily declining murder rate, documented pretty well for at least 1,000 years. I have written before that in Shakespearean England the murder rate was at least 10 times what it is in modern cities, and 100 times the rate in more rural areas.

Mr. Kresser does write that certain evolutionary changes have occurred as a result of agriculture; things such as tolerance for lactose and gluten. I don't know how many Cro-Magnons would have suffered celiac symptoms from eating wheat (or proto-wheat), but among modern populations of European origin, the rate of gluten intolerance is about 0.75% (1 in 133). Even among Asians, who are famously lactose intolerant, about one in three can drink milk, my wife included.

We really don't know whether any elderly Cro-Magnons had heart attacks, strokes, or cancer. I suspect "old" was closer to 40 than to 70, so they didn't usually live long enough to get "chronic" conditions. As I have also written in earlier posts, human evolution continues at a good clip. Wisdom teeth are on their way out, and another century or two could see a precipitous drop in rates of celiac disease and lactose intolerance, and possibly diabetes as well.

Anyway, for those who'd like to eat Paleo, this book is probably the best resource. The author is quite an enthusiast, but I would not call him a nutcase or fanatic. He is reasonable and persuasive. The second part of the book is advice about learning the kinds of foods you tolerate well, and the third is about building a life around your new/old (very old!) diet. He takes better account of human nature than authors of self-help books typically do, so his advice will be better followed by comparison. He also strongly stresses the need for more motion by all of us who are not professional athletes. I think all that walking has more to do with hunter-gatherer health than more or less meat or starch in their diet.

If I wanted to try the Part 1 diet, I'd find it hard to give up the starches I love: whole wheat bread and pasta, for example (My wife and I can both cook up a mean pot of spaghetti sauce or Stroganoff, though we tend to use ground turkey instead of beef). But I'd probably enjoy adding more steak or roast into my diet, compared to our present diet of chicken and fish, with only occasional beef or pork. Oh, and cheese! A Sunday evening favorite just before, or during, a "couch potato session" beginning with America's Funniest Videos, is a couple slices of bread topped with 6mm of cheese and microwaved; and I put cheese in any meat sandwich. A little tinkering around with his advice about macronutrient balance shows that my best calorie balance is 50% carbohydrate, 20% protein, and 30% fat. That's close to the way I eat now (whew!).
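
Converting that split into grams is simple arithmetic, by the way; here is a tiny sketch assuming a purely hypothetical 2,000-calorie day and the usual 4, 4 and 9 kcal-per-gram figures for carbohydrate, protein and fat:

    daily_kcal = 2000   # hypothetical daily target, not a recommendation
    split = {"carbohydrate": 0.50, "protein": 0.20, "fat": 0.30}
    kcal_per_gram = {"carbohydrate": 4, "protein": 4, "fat": 9}

    for nutrient, share in split.items():
        grams = daily_kcal * share / kcal_per_gram[nutrient]
        print(f"{nutrient}: {daily_kcal * share:.0f} kcal, about {grams:.0f} g")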

I began reading in a skeptical frame of mind, and came away with quite an appreciation for the author's insights into diet and activity (it's a better-received word than "exercise"). It is particularly appropriate that we learn to eat things that make us feel better hours or a day or two later, in preference to what might taste the best at the moment.

P.S. John Lehr? My favorite among the Geico caveman actors.