Wednesday, January 17, 2018

It grabs you where you live

kw: book reviews, nonfiction, addictions, technology

When I saw the book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, by Adam Alter, the verse 1 Corinthians 16:15 came to mind. It speaks of a certain family that "have addicted themselves to the ministry of the saints". At least, that is how the King James Version and two others translate the word εταξαν, a form of τασσω, which in the First Century meant "to set, appoint or ordain", but has lost that meaning in the centuries since. Becoming curious about the English usage of the King James era (early 1600's), I found that "addiction" referred mainly to fascination and devotion. Thus many English versions of the verse use either "set" or "devoted". The term was neither positive nor negative prior to the mid-1800's.

Addiction has a much stronger and more focused meaning today. To be addicted is to be in the grip of a compulsion or obsession that harms one, or may eventually kill. Since the early 1900's or a little earlier, "addiction" has referred to a compulsion to use substances such as cocaine. As author Adam Alter tells us, there was quite a struggle in the later Twentieth Century among psychiatrists and psychologists over whether to recognize "behavioral addictions". But the modern phenomena—from binge-watching TV episodes to online game playing, online gambling, twelve- to twenty-four-hour FaceBook sessions, and even "checking in" so compulsively that people walk into fountains, manholes and lampposts—have convinced nearly all that behavioral addiction is real and can be really, really bad.

Note the phrase above, "…or may eventually kill." I do not mean just the shortening of life due to bad health from being a "couch potato" or "FB zombie". Suicides have resulted, not just from being trolled online, but from despair over falling behind the social media rat race.

In a fascinating busman's tour through history, we find that addictive tendencies are with us for very good reasons: our distant ancestors did not become ancestors by ignoring the siren call of pleasurable experiences. In pre-agricultural days, over most of the Earth, eating everything that tasted good kept you alive, and getting all the sex you had opportunity to obtain gave you a chance at having descendants. Also, our ancestors traveled, and the dopamine-fueled thrill of seeing what is over the next ridge motivated many of them to seek new pastures and far horizons. Those who traveled the farthest may have been subject to extra risks, but the chance to populate a new and empty landscape was a benefit not to be ignored.

Our tendencies to fall prey to obsessions, compulsions, and addictions are a direct result of the tens of thousands, even millions of years, that humans lived with scarcity. Now about half the human race lives with relative abundance. What happens then? We overdo it; we overdo it big time.

The author describes many behavioral hooks that turn a potentially enjoyable experience into a compelling one. Intermittent rewards are a big, big factor. As Pavlov found, once a dog has learned to associate the ringing of a bell with food, it will salivate when the bell rings, whether food is given or not. But if food is given only about every third ring, the dog salivates all the more. Rats given the chance to push a bar to get a food pellet will do so, of course. But if pushing the bar doesn't always yield a pellet, they will push it again and again, gathering pellets far beyond their need to eat them. Uncertainty is a big hook.

The most addicting games are those that you win about 1/3 of the time. If you win every time, you get bored. If you win less than 1/10 of the time, you look for a "better" game. This is just one example. Apparently, the most addicting computer game to come along, at least up to the time the book was written, is World of Warcraft. The second-most is probably League of Legends, which my son plays more than he should…though so far it hasn't affected his work enough to cut into his income. I hope that day doesn't come, but for many others it has come already (Cue a stereotypical video of a jobless Millennial who lives in the parents' basement and plays games all day).

So, can we do anything about this? Friends of ours despaired of even slowing down their daughter's FaceBook addiction. Her grades suffered badly. She almost dropped out of college. Nobody knows quite what happened, but she somehow developed a backbone, and a level of resistance, so that her grades improved, she graduated, and now has a responsible job. I don't know how much she may still read her News Feed on FB but I don't see a lot of posts from her. There are other folks—well, I just shake my head. I wonder how they have time to put one or two or three dozen posts in their News Feed every single day. Maybe we just have to let people outgrow it. Pity those who never do.

At the end of Irresistible the author discusses one "thing" (I can't think of another word) that seems to make positive use of the hooks that draw us in: Gamification. This is adding an element of fun into otherwise mundane, boring or unpleasant tasks. In the modern era, technological hooks can be used to trigger our compulsions, just enough, but breaks or "units" are inserted so we won't binge out. The FitBit is a potential gamification of exercise, but it doesn't have any checks, so some people damage their health trying to achieve ever-increasing goals. It needs some work.

But even without FitBit and its kin, overdoing it is a risk. I used to exercise a lot, including certain body-weight strengthening routines, and began keeping records. As it happens, that might have been a mistake. Or, at least, I ought to have obtained a buddy or coach to help me keep track and not ramp up my routines too fast. One day I did too many dips and pulled a muscle in my chest. It took five months to heal (I was about 40; were I younger it might have taken only a month or two). By then, the cycle was broken, and since then I primarily walk. There was no FitBit involved, nor have I ever owned one.

I am also reminded of Zooniverse, with more than 70 somewhat gamified "citizen science" projects. There aren't even any bells and whistles, just accumulating numbers of tiny projects completed, but that is enough that millions of people (myself included) enjoy sorting galaxies, counting penguins, or transcribing hand-written museum labels. Without a few little hooks in the projects, it is actually deadly dull work!

I consider the matter unfinished. We don't yet know how to cope with behavioral addictions. As the author writes, we are in the foothills of addictive technology. But not everyone is equally prone to addiction, whether to substances or behaviors. Perhaps Darwinism will run its course, and a future generation will consist mostly of people who are largely immune to the allure of the Like button.

Wednesday, January 10, 2018

Is evidence-based medicine dead?

kw: book reviews, nonfiction, medicine, medical research, critiques

Research incentives are messed up, big time. So much so that Sturgeon's Law applies, doubled: when someone protested to Theodore Sturgeon that 90% of the science fiction presented at a convention was crud, he replied, "90% of everything is crud!" When people's careers are on the line, when jobs, promotion, tenure and salary all depend on "Publish or Perish", virtue vanishes. Young, idealistic researchers become jaded, cynical cheaters. One medical author has written that as much as 99% of published medical research is valueless or even damaging. Another wrote,
"One must not underestimate the ingenuity of humans to invent new ways to deceive themselves."
This quote is found on page 192 of Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions by Richard Harris. Author Harris admits that his title is a bit tongue-in-cheek, because rigor mortis literally means the stiffness of a corpse, while "rigor" also means strictness in carrying out a procedure. While it might be more accurate to title the book Mortis Rigoris (the death of rigor) or Mortuus est Rigor (rigor has died), it wouldn't resonate with doctors and others of us who know Latin.

More accurately, however, while experimental rigor is neglected more than adhered to, and may be on the ropes, it isn't quite dead yet. The ten chapters in Rigor Mortis illustrate and document every major aspect of medical research, from experimental design (the "gold standard" of the double-blind trial is nearly always compromised to save expense, and frequently forgone entirely), to animal studies (suppose you were told that a certain medicine had been tested exclusively on pregnant women in their first trimester, aged 22 to 25, all from one ethnic group in Scandinavia; that is the analogy to a typical mouse study), to statistical analysis (the p-test is dramatically misleading, and we'll get into that one anon).

Have you ever heard of the "desk drawer file"? It is a lot like a Roach Motel; experiments with "negative results" check in, and are never checked out. Some of the few honest researchers left in the field are agitating for a requirement that every study funded with tax dollars be published, no matter what the outcome. The good news: transparency. The bad news: a ten- to 100-fold increase in the number of papers published. There is already an overwhelming deluge of publication! Gack!!

We need look no further than this to validate Sturgeon's Law. Consider the much-overused p-test, or p-value. You take a bunch of numbers, grind the formula (found in every statistical software package out there, including Excel), and out pops a number. Is it smaller than 0.05? Publish! That number gets inverted into "95% statistical probability that the result shown is not due to chance." Hmm. But what is it due to? Sunspots? Batch effects (perform run 1, clean equipment, perform run 2; do they differ because of the cleaning?)? Something you would never think of in your wildest dreams (all too frequently, yes)? But just suppose all those "95% chance it's right, 5% it's wrong" papers actually do have the "correct" cause and effect. How many experiments went to the "desk drawer" since the last time you published? 5, 10, 20, 100? The average is (wait for it) about 20! So, ignoring the desk drawer, five out of 100 publications must be reporting a chance association or correlation. Add the desk drawer factor of 20, and now at least half of them are reporting a correlation due to chance. Just by the way, it is amazing that the vast majority of studies that report a p-value have a number just under 0.05: "Dig around until you get a p-value you like, then stop looking."
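The desk-drawer arithmetic is easy to check with a little simulation. Under a true null hypothesis, p-values are uniformly distributed between 0 and 1, so a "lab" that quietly shelves its non-significant runs and publishes the first p below 0.05 can be modeled in a few lines (my own illustrative sketch, not anything from the book):

```python
import random

random.seed(42)  # reproducible illustration

def lab_publishes_false_positive(max_tries=20, alpha=0.05):
    """Model one lab studying an effect that isn't real.

    Under a true null hypothesis, p-values are uniform on [0, 1].
    The lab runs up to `max_tries` experiments, shelves each
    non-significant result in the desk drawer, and publishes the
    first one with p < alpha.
    """
    for _ in range(max_tries):
        if random.random() < alpha:
            return True   # a pure-chance result gets published
    return False          # nothing "significant"; nothing published

trials = 100_000
published = sum(lab_publishes_false_positive() for _ in range(trials))
print(f"Labs publishing a chance 'finding': {published / trials:.1%}")
# Analytically: 1 - 0.95**20, about 64%
```

With a desk-drawer factor of 20 at the usual 0.05 threshold, nearly two-thirds of labs chasing a nonexistent effect will eventually "find" one to publish.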

Add in other factors, all detailed in Rigor Mortis, and there is little chance that more than a tiny fraction of published research results will stand the test of time. And that is a problem. A little time? That is OK. If a lot of further research and even development and marketing are based on a faulty result, and it takes "medical science" 5, 10, 20 years or more to find and correct the mistake, how many people die or suffer needlessly?

Is there a way out of it? Only partially. Transparency is part of the answer. But bureaucrats are lazy, so even a law on the books requiring, for example, that all NIH-funded studies publish all their results is poorly enforced. There are a lot of partial answers out there. Here is my answer: We must live with what we have now; things may be getting better, but today is today. When I must choose a new doctor or specialist, I inspect the waiting room, and later the visitation room. How many drug company trinkets can I find (pens, calendars, note pads, posters, and many more)? The fewer the better. My current doctor's rooms don't have anything with a logo on it. That's a great start; it means the doctor has better-than-usual resistance to high-pressure sales. Such a doctor is more likely to make a decision on medical grounds. Secondly, whom do I actually see? Curiously, I prefer to be seen by a PA or NP, rather than a DO or MD. They haven't had all their good sense educated out of them yet. In my experience they are also a lot more willing to answer questions, and do so more meaningfully. Also, I ask a lot of questions, because a brusque doctor in the office is likely to be impatient in the operating room also. In medicine, patience isn't just a virtue, it is a necessity! There is more, but if you aren't doing these things, start there.

What else can you or I do? Educate yourself. Not from medical journals, but from summary materials on things that are known to work. WebMD and Healthline are just the beginning. Don't limit your reading to a single source. When offered a "new" drug, always ask, "Is there an older one that works well enough, perhaps with fewer side effects?" There are always side effects. Some you can live with, some you can't. Do avoid, desperately, a drug that needs another drug to deal with side effects. My wife takes a statin drug for high cholesterol. She was originally prescribed the strongest one, and even taking a tiny dose, had troubling side effects. Her "undrugged" total cholesterol is 240, but that drug is best used for folks in the 400+ range. She demanded a weaker one, and even then, splits the pill in thirds. She has no noticeable side effects, and her "drugged" total cholesterol is about 160. Good enough!

I've learned to tell a doctor, "I am not a patient. I am a customer. You and I will collaborate. I will never cede my right to make decisions, except during anesthesia that we have agreed upon together." Call it an intelligence test. For the doctor. Occasionally a doctor fails it, and then I get another doctor. When needed, I make a doctor aware how skeptical I am of the "evidence" presented in modern journals.

Rigor Mortis is scary. Is it right? Sadly, yes, it is more right than the average published medical study. But don't let that drive you to the amorphous world of "alternative medicine", at least not wholesale. Allopathic medicine has produced amazing health in most Americans and others in the First World. For a generation or so medical research has gone astray. Will it return? Maybe. Until it does, we must be our own best doctors.

Thursday, January 04, 2018

Sleep, beautiful sleep

kw: book reviews, nonfiction, sleep

What a way to start the year: with a book about sleep. Michael McGirr, a former Jesuit priest and a victim of sleep apnea, writes about sleep and sleeplessness from a few unique perspectives in his book Snooze: The Lost Art of Sleep. I read the book hoping to re-connect with this lost art, but found instead a travelogue, a book of "what" but not "how".

There is no table of contents and I didn't count as I went, but I reckon there are upwards of a dozen chapters. Each is titled by a time and a year (or a few related years). One such chapter is the one in which he writes of his diagnosis of sleep apnea and the invention of the CPAP machine, and about his marriage to Jenny, who loved him anyway (this after he left the Jesuit order). Another, riffing on Jacob son of Isaac, one of history's celebrated sleepers, he of the dream of angels on a ladder, but one who nonetheless complained to his father-in-law,
… by day the heat consumed me, and the cold by night, and my sleep fled from my eyes.
He writes of Edison, who was too busy inventing to sleep; of Florence Nightingale, who slept little but spent some 3/4 of her life directing matters worldwide from her bed; and of coffee and its use to ward off sleep, so much so that Balzac, who fueled his amazing literary output with sixty cups of coffee daily, died of caffeine poisoning at age 51. Balzac might have lived a lot longer on half the coffee, and while writing less daily, his total production might have been even greater.

I am reminded of Alfréd Rényi, who said, "A mathematician is a machine for turning coffee into theorems," a quote usually attributed to Paul Erdős, and tangentially of Leonardo da Vinci, who is said to have kept to a regimen of three hours and 40 minutes of work followed by a 20-minute nap, day in and day out (that comes to two hours in each 24-hour period). I read about one man who tried working on this schedule and did so for a few years, but then gave it up because he ran out of things to keep him busy. I guess to keep Leonardo's schedule you have to have Leonardo's creativity. I wish McGirr had included these also in his travelogue of sleep and its variations, but he did not.

Regardless, his own studies of sleep, restful or not, led him in many directions, including into those antonyms of caffeine, the various sleep-inducing drugs, from Benadryl® to Ambien® and beyond. During a hospital stay, a nurse gave me two Benadryl®, which worked well. My father used a prescription sleep aid that turned out to be a double dose of diphenhydramine in one pill; the exact equivalent of taking two Benadryl®, but a lot more costly. But the more recent drugs induce sleep by messing with the normal sleep cycle, which can put you into a deep sleep without the total sleep paralysis needed to keep you from acting out your dreams. Lots of sleepwalking (and sleep driving, etc.) incidents are known, some with fatal results.

In the last chapter, he writes that reading in bed can help us drowse, but only if we are reading from printed paper. The reflected light from a page with dark ink does not inhibit melatonin production. The light from a computer or phone screen has a different quality, and does interfere with it.

Overstress is a primary enemy of sleep. We need a certain amount of stress to keep life interesting, but overwhelming, chronic stress just burns us out. Some folks respond with depression and may take to their beds, sleeping much or most of the day. Most of us have trouble getting to sleep, wake too early, and feel tired much of the time. While a few chapters of Snooze address chronic insomnia, a broader affliction is that many of us get some sleep each night, but never seem to get enough. Many, many of us have an experience like mine.

During the last ten or so years of employment at DuPont, I seldom slept more than four hours nightly. For some of that time, I was also on one or another medication to address my bipolarity, but they didn't do much, so I learned to cope with it unmedicated. During the medicated periods, I usually napped up to two hours daily, so you could say I had six hours of sleep, though not in one installment. Without a medication having a sleep-promoting side effect, however, four hours was it. No naps. I had work I enjoyed a lot, a congenial boss (the last 8 of the 10 years), and even told my boss I might work until I was 75. But when the company offered a retirement incentive, I retired at age 66.

After retirement, two important things happened. Within a few weeks, I was sleeping 6-7 hours nightly, and over about half that first year I lost 15 pounds. I remember looking back one day, and saying to myself, "I didn't realize the level of stress I was under!" I had also been using a lot of "cold caffeine" (Pepsi Max), up to a liter daily.

Now that four more years have passed and I am over 70, I get 5-7 hours of sleep, and if I wake early I simply get up, read my Bible a while, have breakfast, then have a morning nap for another hour or two. There aren't a lot of conclusions to draw from that. I am thankful that, though I snore some nights (not all), I don't have apnea; I have part time work that keeps some structure in my life, but is incredibly less stressful than any job I had before; I practically eliminated caffeine, using caffeinated cola only for driving alertness on road trips.

You'll have to look elsewhere for advice and information on how to sleep longer and better. For an enjoyable survey of how humans have been sleeping, or not, Snooze is the book for you.

Tuesday, December 26, 2017

Through space and time with a different mind

kw: book reviews, science fiction, multiple genres

I read through most of this book on an airplane from Phoenix to Philadelphia. Sometimes when I fly I work puzzles the whole time in the air, first whatever is in the airline's magazine, then in a puzzle book. I like the books with a great variety of different kinds of puzzles, not just crosswords or Sudoku. This time I began to read right after push-back, and read pretty steadily through most of the flight.

Do you remember Apple's "Think Different" motto of about 20 years ago? They were criticized for not using "Differently", but the word was not intended as an adverb; it was a noun: "Think [things that are] Different". When I first saw it I recalled the century-old NCR/IBM motto "Think". But that one meant "Think [because nobody else is doing it]".

Well, Hugh Howey thinks Different. Though he has published more than 20 novels and novellas, and a passel of short stories, reading the collection Machine Learning was my first exposure to him. I'll make sure it is not the last.

The volume contains short stories and at least one novella made up of short-story-length vignettes, in a few SciFi genres. The author supplied endnotes describing what he was thinking as he wrote each story. He'll think inside a character: What is going through the mind of a truly bug-eyed, tentacled alien in a force bent on attacking Earth? ("Second Suicide"). He takes a riff on his friend Kevin Kelly's statement that when a machine first becomes self-aware, the first thing it will do is hide ("Glitch"). He considers the consequences of love between human and robot (Algorithms of Love and Hate, a 3-story sequence). This last reminded me a little of The Bicentennial Man by Isaac Asimov, but with a very different take on societal reactions. Finally, "Peace in Amber" is the author's memoir of going through 9/11 in the actual shadow of the twin towers (until they fell), interspersed with a truly weird alien zoo story. Based on his endnotes, I think the zoo story was needed to "spread out" the memoir so he could handle the flood of emotions.

The word "gripping" comes to mind. Read the book and see what word it evokes in you.

Sunday, December 24, 2017

Take Tyson's tour

kw: book reviews, nonfiction, science, astrophysics, popular treatments

What's not to like about Neil deGrasse Tyson? He has become the public face of science today. I love his updated Cosmos series. I have privately studied astrophysics and cosmology enough that perhaps I could have passed by his new book, but I couldn't pass by the enjoyable way he treats his subject. Astrophysics for People in a Hurry is well worth anyone's time, whether you know anything about the subject or not...particularly if not!

This is a rather small book, on purpose. Dr. Tyson knows that today's young adults want everything fast, they want it now, and they want it without fuss. If anyone can deliver up a basic survey of astrophysics and cosmology that meets these requirements, he can. He does so in 12 chapters.

When I think of astrophysics, I think mostly of stellar interiors, but there is much more to it than that. Clearly, from the flow of the book, astrophysics includes cosmology in its purview; probably two-thirds of the book's content is cosmological. But he really does cover all the bases, from the reasons for roundness (gravity wins), to the shapes of galaxies (the tug-of-war between gravity and angular momentum), to the reasons modern cosmological theory includes both "dark matter" and "dark energy". Chapters 5 and 6 present these mysteries as well as I have ever seen them presented, and explain why they seem to be required for the universe to work the way we observe it working.

I had the great pleasure to encounter a professional cosmologist on an airplane flight four days ago, and we had the chance to talk a little (he wasn't in my row, so our time was limited by physical endurance of turning heads rather sharply). I asked him a question I'd have asked Tyson if I had the chance, "If a unified quantum theory requires a quantum of gravity, how can a graviton get out of a black hole so as to interact with the rest of the universe? What is the emitting surface for a graviton?" He admitted that he hadn't thought of that before. After we talked a while of other things, then broke off for a while, he nudged me, saying, "Consider this. A black hole has three qualities: gravity, angular momentum, and electric charge, right?" I agreed. He continued, "The electric charge is carried by virtual photons, the bosons of electromagnetic force. Real photons cannot escape a black hole; that is why it is black. But the electric charge remains in effect anyway. Thus, the virtual photons do escape—and return to—the black hole to keep the electric charge in place." I thanked him for providing a marvelous "hole" in my considerations of gravitons and black holes. I suspect this is the same answer Tyson would give. Now, upon further thought, I wonder if the electric charge is held within the black hole, or remains attached somehow to the event horizon. From there (or very slightly above it), even real photons could escape if needed. But if virtual photons can indeed escape a black hole, then virtual gravitons could also.

This matter doesn't enter into the book. What does enter in, is how all the pieces fit together. Tyson gives us plenty of food for thought. One of my favorites is playing a numbers game with molecules and time. Here is my version of "Whose air are we breathing?":

Part 1
  • The air above 1 cm² of Earth weighs about 1 kg.
  • The average molecular weight of air is about 29.
  • Thus each kg of air contains about 34.5 gm-moles.
  • 1 gm-mole contains 6.02x10²³ molecules (or atoms) of any substance.
  • That comes to just over 2x10²⁵ air molecules above each cm².
  • The surface area of Earth is 510 million km², or 5.1x10¹⁸ cm².
  • Thus the atmosphere contains a bit more than 10⁴⁴ molecules.
Part 2
  • Our total lung capacity is around 6 liters (with a rather wide range).
  • Our "tidal" capacity, the amount we usually take in with each breath, is about half a liter.
  • That is about 0.022 gm-moles, or 1.3x10²² molecules.
  • An average person breathes about 23,000 times daily when not exercising much, or about 8.4 million breaths yearly.
  • Napoleon Bonaparte lived 51 years; for round numbers, use a 60-year span.
  • In 60 years, the number of breaths comes to about 500 million.
  • All those breaths add up to 6.6x10³⁰ air molecules.
Part 3
  • All the air breathed in those 60 years amounts to 1/15-trillionth of the atmosphere.
  • 1/15-trillionth of one tidal breath is about 880 million air molecules.
Conclusion: Every breath you take contains nearly one billion of the air molecules once breathed over a lifetime by Napoleon, or by anyone else who has lived some 60 years! Tyson doesn't go into all this gory detail; he names a couple of the figures in a two-sentence riff on the subject. I worked through the numbers myself so I could share them here.
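For anyone who wants to re-run the figures, the whole chain of estimates reduces to a few lines (a sketch using the same round numbers as the list above, including the 60-year span):

```python
# Back-of-envelope check of the "Whose air are we breathing?" figures.
AVOGADRO = 6.02e23

# Part 1: molecules in the atmosphere
moles_per_cm2 = 1000 / 29                        # ~1 kg of air above each cm², MW ~29
molecules_per_cm2 = moles_per_cm2 * AVOGADRO     # just over 2x10^25
earth_area_cm2 = 5.1e18                          # 510 million km²
atmosphere = molecules_per_cm2 * earth_area_cm2  # a bit more than 10^44

# Part 2: molecules breathed over a 60-year span
tidal_molecules = (0.5 / 22.4) * AVOGADRO        # half a liter at ~22.4 L per gm-mole
breaths = 23_000 * 365 * 60                      # about 500 million breaths
breathed = breaths * tidal_molecules             # about 6.8x10^30 molecules

# Part 3: overlap with one later tidal breath
fraction = breathed / atmosphere                 # about 1/15-trillionth
shared = fraction * tidal_molecules              # hundreds of millions
print(f"Molecules shared with each later breath: {shared:.2e}")
```

The script lands within rounding error of the figures above: roughly 10⁴⁴ molecules in the atmosphere, and nearly a billion shared molecules per breath.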

A particular aim of Dr. Tyson in everything he writes, and says in his programs, is to impress us with the power of the scientific method. We don't learn "how the world works" by guessing. We observe, make tentative conclusions based on observations, argue with others about it, eventually turn the conclusions into a hypothesis that we can test, and then repeat as needed. Now, in cosmology, a "test" would take billions of years. This isn't chemistry, for which you can mix a few things in a jar and take a measurement in a matter of seconds or minutes. Neither is it biology; we have no cosmological Gregor Mendel, crossbreeding stars as though they were peas. But we can work out the math and see how it squares with the things we see.

In science, more than in any other endeavor, "No man is an island." No woman either. The popular trope of the loner in a stained lab coat making a major discovery is simply unknown to real science. Even a few centuries ago, when chemistry was emerging from alchemy and astronomy was emerging from astrology, a "lonely genius" was really a highly social being, surrounded by helpers, colleagues, opponents, and many others. The quintessential scientific loner, Isaac Newton, spent much more time discussing his findings and theories with members of the Royal Society, including friends, "frenemies", and enemies, than he did carrying out observations or even thinking out his theories. Without a helpful gadfly-friend to prod him, he'd never have finished writing his Principia. So although Newton was famously anti-social, he still had to interact socially for his science to have usefulness and meaning. But that's the beauty of science. It is our great, collaborative enterprise of looking back at the Universe that birthed us, to see how it was done, and a great many more things of interest also.

This isn't a textbook. It provides not an education in the subject but a vision of what astrophysics is. If you treat it sort of like a textbook, and write down ideas that interest you as you go along, you'll gather fodder for any further studies you might wish to carry out. That's the kind of thing I've done all my life.

Monday, December 18, 2017

Growing up unique

kw: book reviews, science fiction, space opera, child prodigies

Fiction authors frequently write to explore. I first recognized this while reading one of Isaac Asimov's Robot stories, in which he explored the boundaries of the Three Laws of Robotics. He had hinted at them in Robbie and first stated them explicitly in Runaround, collected in I, Robot. Years later I realized he was also exploring the boundaries of neurosis. As I learned of his life, including what he wrote in several memoirs, I understood that he was profoundly neurotic, and he used his characters—the ever-more-perfect and godlike robots in contrast to the all-too-faulty humans—to work through the ramifications of neurosis in himself.

I have read novels by Orson Scott Card for about thirty years, beginning with Ender's Game. I don't know if I have read all the Ender series books. I did read all of the Homecoming books, and it is more than clear that in those Card is exploring the boundaries of morality and altruism. His character Nafai is pathologically altruistic.

When I read Ender's Game I wasn't ready for it. I was a mere 40-year-old. I took it at face value, as a coming-of-age novel in a space opera setting. Speaker for the Dead and other Ender series books also left me bemused. Now, just this year, more than thirty years later, Children of the Fleet adds another layer to the Ender saga, and I think I am beginning to understand.

The children in this novel, including the protagonist, Dabeet Ochoa, resemble those in earlier books in that they think rather consistently at an adult level, and perform certain adult tasks, though with some limitations because they are, after all, mostly pre-teens. None has yet hit the pubertal growth spurt, so they wear child-sized space suits, for example.

I was forcibly struck in this novel (and in retrospect, in Ender's Game) that Ender and Dabeet are victims of profound child abuse. Each is massively distorted from what he might have been in a more usual environment. Ender completed his mission, one supplied by others without his knowledge, by becoming the "Xenocide", the one responsible for annihilating the Formics, an insectile alien species. Dabeet's mission is only partly concealed, and he initially conceals it from others. In carrying it out, he brings life, not death (except indirectly, to a couple of all-too-human evildoers), and he prevents massive death.

Rather than dig further into the novel, I want to riff on the meaning of intelligence. We all think we know what intelligence is, but if asked to describe it, none can do so. For a few generations, tests of IQ (Intelligence Quotient) were thought to measure it, but they really tend to measure a small collection of cognitive and memory feats that are more machinelike than I care for. I wonder how the supercomputer Watson would fare on a Stanford-Binet test.

Further, the meaning of IQ has changed over the years. Originally, an IQ test was used with children ten years old and under, to compare their performance with sixteen-year-olds. I don't know how the test was normed (normalized), but apparently youngsters of ages between six and sixteen were tested to establish the "normal" performance of each year cohort. Then higher or lower performance could be compared with these norms to establish an IQ score: 100 for "normal for one's age". Based on the scatter displayed within each cohort, a Gaussian distribution was fitted and a standard deviation of 16 (later 15) was applied. So, when I was given an IQ test in third grade, at age 7, and my IQ score was 170, that supposedly meant that, in the memory and cognitive skills that were measured, I was performing at the level of a 12-year-old (11.9 to be precise). All I knew at the time was that, having begun to learn to read on my own when I began first grade as a 5-year-old (I turned 6 three months later), as a third grader I was indeed reading books usually seen in the book bags of seventh graders.

But how do you measure the IQ of an adult? When I was 20 did I have the "smarts" of a 34-year-old? Does such a question even have meaning? I think not. Others who considered cognitive psychology their calling thought about this quite deeply, and re-normed the test, making the standard deviation (σ) meaningful as a measure of scarcity. Thus, in any Gaussian distribution, the p statistic for ±2σ is 0.9545, or about 21/22. With σ = 15 and a mean of 100, the range ±2σ is from 70 to 130. So if you have a "normal" group of 44 people, one is likely to have an IQ of 70 or less, and one is likely to have an IQ of 130 or more.
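Both versions of the score reduce to small computations. Here is a sketch of the arithmetic described above (Python; my own illustration, not any official norming procedure):

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Old-style "ratio IQ": 100 * mental age / chronological age.
# A score of 170 at age 7 implies a mental age of:
print(f"{170 / 100 * 7:.1f} years")        # 11.9, as in the text

# Modern "deviation IQ": how rare is a score, given mean 100, sd 15?
within_2_sigma = normal_cdf(2.0) - normal_cdf(-2.0)
print(f"within 2 sigma: {within_2_sigma:.4f}")   # ~0.9545, about 21 of 22

tail = 1.0 - normal_cdf((160 - 100) / 15)  # fraction scoring 160 or higher
print(f"one in {1.0 / tail:,.0f}")         # matches "about one in 31,000"
```

The 21-of-22 figure and the one-in-31,000 figure quoted later in this post both drop straight out of the Gaussian tail probabilities.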

I can tell you from experience, though, that IQ has little relation to street smarts. As an adult, my IQ has settled to 160, or 4σ above "average", a level achieved by one person in about 31,000. As a pre-teen and early teen, I finally realized I was not very likable. I began to work toward fixing that. I felt that if I did not have good social reactions automatically, as my age-mates did, I would have to observe, learn, and calculate those reactions. I did so. I used to look at that 170-to-160 shift as "giving up 10 IQ points for a better SQ" (Sociability Quotient). Thus, this paragraph found near the end of Children of the Fleet hit me with special resonance:
Maybe making and keeping friends will always require me to think through the steps of it … Maybe it will never be natural for me, never reflexive, never easy. So be it. I can't live without it, can't accomplish anything without it, so I will become adequate at forcing myself, against my inclinations, to be a friend to my friends. If I'm good at it, they'll never guess the effort that it requires.
Dabeet's musings match mine at just about the same age. Now I'll tell you what happened after I was 40. No details, just this: I had occasion to learn, through a personality test, that my "calculated person" was pretty good; but also, because a part of the test elicited reactions that had to be too fast for my calculations, I learned that a "natural" personality was truly there, and it was also pretty good! I came away with a proverb, "You cannot build a tree." I had found out, after a few decades of tree construction and maintenance, that a perfectly adequate tree had grown up beneath my notice and could be relied upon to be a "me" that didn't need all the effort. I am happier and calmer as a result.

If Dabeet is a reflection of Card's view of himself, as I suspect, maybe he is in the midst of learning, or will soon learn, the same thing. Let's see where the next of Card's novels takes us, and him.

Thursday, December 14, 2017

Bill Nye the Climate Guy

kw: book reviews, nonfiction, scientific method, climate change, polemics

Bill Nye is one of my all-time favorite people. The fact that I was dismayed by some aspects of his recent book doesn't diminish my admiration for him. He is a top-notch science educator and a writer I enjoy reading.

Bill Nye's new book, Everything All At Once: How to Unleash Your Inner Nerd, Tap into Radical Curiosity, and Solve Any Problem, is ostensibly about that middle phrase: "Unleash your inner nerd." It is primarily an evangelical work, aimed at anyone on the fence between those who "believe" in climate change and the climate-change "deniers". Along the way, though, he offers great examples and advice to help tech-averse readers see how humans are by nature technical beings, and that solving problems is what we do best—or what we can do best, if we go about it right.

I hope a great many people will indeed read this book. It is very well written. The author manages to press his pro-climate change case pretty hard without becoming entirely disagreeable. I will address my concerns in a moment.

Let me first state my background in the matter; it is a subject I have followed for nearly sixty years.

When I was a child I heard about the "Greenhouse Effect". It was already old news, because the term was used by Svante Arrhenius in 1896 to describe his calculations that a doubling of CO2 concentration in the atmosphere would raise average global temperature by about 5°C (that is 9°F to us Americans). At the age of twelve I was able to learn enough math to reproduce Arrhenius's result.

In actuality, "greenhouse effect" is not an entirely accurate metaphor. In a greenhouse, the glass physically traps air warmed by the sun, while also providing spectral emissivity to enhance the effect. A "greenhouse gas" cannot physically trap warm air, but causes extra heating solely via spectral emissivity.

The terms "Global Warming" and "Climate Change" began to be used by some in about 1975, and their use ramped up greatly after 1985. "Greenhouse Effect" also took off about that time, when the atmospheric effects they all refer to became a political football. Then a funny thing happened. Looking at the Google Ngram Viewer, I find that since 1992 "Greenhouse Effect" has rapidly fallen out of favor and "Climate Change" has become the term of choice, with "Global Warming" running a rather distant second.

The problem with all this is that "Greenhouse Effect" denotes a possible cause, while the other two terms refer to effects. So now let us back up and examine the term I threw in earlier, "Spectral Emissivity". For solid materials, this refers to a departure from the spectral behavior of a blackbody or graybody. If we could produce a paint that was perfectly gray—at any level of grayness—throughout the electromagnetic spectrum, we could paint it on a surface and it would cause an amount of heating, when the sun shone upon it, directly correlated to the total emissivity. To be specific, a perfect blackbody surface will heat up to a temperature that depends only on the energy being radiated to it. It has an emissivity of 1. A perfect reflector will not be heated at all. It has an emissivity of 0. A perfect graybody surface with emissivity of 0.5 will heat up to an intermediate temperature, its radiated power being the emissivity times the Stefan-Boltzmann factor σT⁴.
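The balance just described can be checked numerically: at equilibrium, emitted power (emissivity times σT⁴) equals absorbed power. A minimal sketch (my own illustration; the 1361 W/m² solar constant and a flat surface radiating from the same face that receives the light are my assumptions):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def equilibrium_temp(flux_in, emissivity=1.0, absorptivity=1.0):
    """Temperature at which emitted power (eps * SIGMA * T^4) balances
    absorbed power (alpha * flux_in), for a flat one-sided surface."""
    return (absorptivity * flux_in / (emissivity * SIGMA)) ** 0.25

# Full sunlight at the top of the atmosphere, ~1361 W/m^2:
T = equilibrium_temp(1361.0)
print(f"blackbody in full sun: {T:.0f} K ({T - 273.15:.0f} C)")
```

One detail worth noting: when absorptivity and emissivity are equal (as Kirchhoff's law requires for a true graybody), they cancel out of the formula, so the interesting cases are surfaces where they differ by wavelength, which is exactly the "step-spectral" situation discussed next.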

Now, consider a "step-spectral" surface. Suppose it has an emissivity of 1 for visible light, and an emissivity of 0 for infrared light. Let's put the cutoff at 700 nm. A surface with this characteristic, in a vacuum so air will not carry off any heat, and with only visible light shined upon it, would heat up until it was hot enough to radiate away that same amount of radiant energy. In visible light it would appear black. It absorbs light, but if it is cool, emits nearly none. Thus it must heat up. You might know from experience that the heating element in an oven gets to about 600°C before it begins to glow reddish, and at 800°C it is getting orange-red. The great majority of its radiation, however, is at infrared wavelengths longer, much longer, than the 700 nm radiation we call "deep red". If it is prevented by the step-spectral emissivity from radiating at those longer wavelengths, it must, perforce, heat up until it is radiating a lot of visible light, to balance the incoming light. Thus a step-spectral surface tends to get very hot indeed, hotter than an oven element.
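To put a rough number on how hot such a surface would get, one can integrate the Planck spectrum below the 700 nm cutoff and search for the temperature at which that visible-only emission balances the incoming flux. A sketch (my own illustration; full sunlight of ~1361 W/m² fully absorbed, and a flat one-sided surface, are my assumptions):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_band_power(T, lo, hi, steps=2000):
    """Hemispherical emissive power (W/m^2) of a blackbody at T,
    integrated over wavelengths lo..hi (meters), midpoint rule."""
    dl = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        lam = lo + (i + 0.5) * dl
        x = H * C / (lam * K * T)
        if x > 700:          # exp would overflow; contribution negligible
            continue
        total += (2 * math.pi * H * C**2 / lam**5) / math.expm1(x) * dl
    return total

def step_spectral_temp(flux_in, cutoff=700e-9):
    """Equilibrium T for a surface that absorbs flux_in fully but can
    only radiate at wavelengths shorter than cutoff (bisection)."""
    lo_T, hi_T = 300.0, 6000.0
    for _ in range(60):
        mid = 0.5 * (lo_T + hi_T)
        if planck_band_power(mid, 10e-9, cutoff) < flux_in:
            lo_T = mid
        else:
            hi_T = mid
    return lo_T

T = step_spectral_temp(1361.0)
print(f"step-spectral surface in full sun: ~{T:.0f} K ({T - 273.15:.0f} C)")
```

With these numbers the search converges somewhere around 1700 K, far above the roughly 1100 K of an orange-glowing oven element, which is the paragraph's point: forbid a surface from radiating in the infrared and it must get very hot indeed.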

Now we can consider gases. Oxygen and nitrogen hardly absorb any light at any wavelength of interest to us as we consider the heat balance of our atmosphere. There is a common gas, however, that does absorb a lot of light, at a range of wavelengths that make it a strong greenhouse gas. That is water vapor. Surprised? We will look at some spectra in a moment. First, qualitatively, we find that water vapor absorbs a lot of ultraviolet light, but absorbs even more strongly in several ranges throughout the infrared, with narrow absorption bands at about 1.2 and 1.9 microns, a wider band from 2.5-3 microns, and a wide, almost total absorption feature from 5 to 7.5 microns. The result of this is that if Earth had no atmosphere it would be 32°C (about 60°F) cooler than it is. A perpetual ice age without the ice. So water vapor is by far the strongest greenhouse gas, and is responsible for life being able to exist on earth.

"Climate Change" is all about carbon dioxide (CO2). What does this gas do? It also has spectral emissivity, with an absorption band at about 2.7 microns, a stronger one near 4.2 microns, and a third between 12 and 16 microns. This last one is of primary interest. It is perfectly placed to absorb about 10% of the thermal radiation from warm dirt, meaning that the dirt has to get a little warmer to radiate that extra energy at other wavelengths. And that is what is behind Arrhenius's greenhouse effect calculation.

Greenhouse gases operate a little differently from painted surfaces. Dirt and other stuff on Earth's surface has spectral emissivity, of course, but not nearly with the perfection of the step-spectral material discussed earlier. So it reflects a lot of light, absorbs some, and gets warm enough to radiate some infrared. In a vacuum, dirt with sunlight shining on it would have some specific temperature. Now put a layer of greenhouse gas above it, an atmosphere containing water vapor. The incoming sunlight is not affected much. But the outgoing infrared from the warm dirt is partly absorbed by the water vapor, which heats up and radiates also, with half going up and half going down. This causes the dirt to get warmer, until it is able to radiate enough to balance its thermal outflow with the radiative inflow from sunlight and also the re-radiated infrared from the warm air above it. How does CO2 modify this picture? It absorbs a little more infrared radiation, in portions of the spectrum in which water is rather transparent. So CO2 strengthens the greenhouse effect. Now, here are the spectra:

I don't know the original source of this graph. It is found all over the place. It also shows a tiny contribution from oxygen and ozone, but we won't consider those here (in the "ozone layer" the temperature goes up significantly, however).

The blue line is for water vapor. The curve marked 255K shows the thermal radiation from a piece of ice at -18°C or 0°F. "Room temperature" is close to 300K or 27°C (81°F). Its radiation curve would be a little to the left of the one shown.

The point is, water vapor absorbs, and re-radiates back downward, a lot of the radiation from the earth and even from glaciers. Yes, glaciers radiate infrared also. The blue line is for water vapor with a content near 0.3% of the atmosphere, or near saturation (100% relative humidity) at ice temperature. The CO2 curve is for a few hundred ppm; the sources I read didn't state exactly how much. The effect of increasing the amount of CO2 would be to widen its absorption bands, as their "wings" absorbed more and more. This shows what happens when these two gases lead to greenhouse warming.

Now, whether this is actually causing climate change is a separate issue. "Deniers" say it is not; proponents of the idea that CO2 is a "pollutant" say it is. I won't get into that. What we have measured is this: between the time I was a little child, when there was less than 300 ppm of CO2 in the atmosphere, and today, when the amount is 400 ppm, the global average atmospheric temperature has risen just under 1°C.

Is that a lot, one degree C? Let's look at one factor. Water expands when heated. Heating water by 1°C yields an expansion of 0.000214, or 0.0214%. The ocean averages four km in depth. If the entire ocean were warmed by 1°C, it would be 0.000214 × 4,000 m = 0.856 m deeper (33.7 inches). That is enough to force the evacuation of some low-lying areas and certain island nations such as Tuvalu. "Climate evacuation" has already started. But has the whole ocean heated by that much? Not yet. Give it time. The early evacuations were the result of less than one-third of this figure.
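The arithmetic above checks out in a couple of lines. A sketch (assuming, as the text does, that the entire column warms uniformly and the expansion appears wholly as added depth):

```python
# Thermal expansion of a uniformly warmed ocean water column.
expansion_per_degree = 0.000214   # fractional volume change of water per
                                  # degree C, the value quoted in the text
mean_ocean_depth_m = 4000.0       # four km average depth

rise_m = expansion_per_degree * mean_ocean_depth_m
print(f"sea level rise: {rise_m:.3f} m ({rise_m / 0.0254:.1f} inches)")
# -> sea level rise: 0.856 m (33.7 inches)
```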

I'll stop there. These are not easy points to make with a public that largely doesn't care. Thus, Bill Nye's passion. He wants to make everyone care. But as I read I took careful note: will he mention water vapor? He does not, except for a throwaway phrase in a late chapter. We can't ignore water, for another reason. Trapping a little more heat means adding energy to the system. That means more water could evaporate. Whether it will or not is a huge area of controversy in the climate modeling arena. Water is complex. It might be the most complex substance there is. It is possible that the added energy will yield a net drying rather than adding more water. We might see more rain, or less rain, overall, and nobody yet has a good handle on which areas might experience greater or reduced rainfall. Oh, I've seen a few predictions, but none is well supported by robust evidence.

I agree with Bill Nye, though, that we need to be reducing our dependence on "convenient" energy from burning stuff (mainly fossil fuels), and toward solar, wind and other "alternatives". A generation ago the oil companies began calling themselves energy companies. But they are really still oil and coal and gas companies, with only tiny amounts being spent on non-carbon energy production. They could become the heroes of the 22nd Century. But I fear they will more likely be the goats. I just don't know who else has money enough to do the research to make solar and wind as ubiquitous as they need to become. And there, I think the Science Guy might agree. Read the book. Agree with Bill Nye or not, you're in for a fun ride.

Friday, December 08, 2017

The most popular snails

kw: species summaries, natural history, natural science, museums, research, photographs

For the current series of projects at the Delaware Museum of Natural History, I have worked through several families of terrestrial gastropods (land snails and tree snails). Many of these are quite inconspicuous, being small and not colorful, though they are in general a little more various than the little brown "mud snails" (freshwater gastropods) I worked with for most of 2016.

You know, in any group of creatures, most are rather inconspicuous and poorly known. The "typical" mammal is a "little brown furry thing" such as a mouse, vole, shrew or lemming. The "typical" bird is a "little brown feathered thing" such as a wren or sparrow. The "typical" insect is a "little dark beetle" about the size of a grain of rice. The world is full of little brown things and we hardly notice them.

But we really like the colorful "charismatic" ones. Among the land snails, that would be the tree snails of Florida and the Caribbean, of the genus Liguus.

This is part of a drawer of "unidentified" lots of Liguus fasciatus, the poster child for pretty tree snails. Though these have been identified as to species, L. fasciatus has many "forms" or "varieties", which we provisionally catalog as subspecies, but they probably aren't really subspecies. We usually call them color forms. They hybridize freely, but a particular color form is usually physically separated from most others, being endemic to a few "hammocks", as small patches of raised and heavily vegetated ground are known in the area.

These are mostly from an area of the Everglades called Pinecrest, named for a ghost town tucked away in the middle of a couple of hundred hammocks. You can clearly see that most of these lots are in need of splitting into their color forms. Any particular hammock may be inhabited by a few color forms. A collector in a hurry will gather a couple of dozen shells, put them in a box or bag with a label (date, provisional ID, and location, at the least, to be a useful specimen lot), and move on to the next hammock a few minutes' walk away in the dry season, or a short airboat ride away the rest of the year.

Here is the prettiest of the color forms, in my opinion:

On your computer screen this may be a bit larger than life size. The paper label is 3 inches long, so these shells are about 2 inches long, a little bigger than the average for the species. Liguus fasciatus splendidus Frampton, 1932, must have been Henry Frampton's favorite also. These are indeed splendid! This lot was collected by Erwin Winte a few years after Frampton described them, and in the 1980's it wound up at DMNH.

These shells are so sought after that, though they are prolific and widespread, many color forms are getting hard to find. In the southeast U.S. and the Caribbean, a whole subset of shell collectors is known as "Liguus collectors". We are loving them to death!

This only serves to introduce these lovely shells. I hope soon to gather pictures of several color forms, and also to compare L. fasciatus with its sister species in the genus.

Friday, December 01, 2017

Drones fly - monsters die

kw: book reviews, nonfiction, memoirs, soldiers, drones

Brett Velicovich passed by a protest a few years ago. People were wearing or waving mockups of the Predator drone and chanting, "Drones fly, babies die." The colossal ignorance they displayed got through his post-traumatic apathy like nothing had since he returned from five combat tours spanning more than a decade in the elite Delta unit, the one that flies the Predator, Reaper and other military drones in Iraq, Afghanistan and other places where the most vicious terrorists operate. He knew what really happens, who really dies, and more importantly, who really doesn't die (that would be most of us, babies included). He got help from Christopher S. Stewart to write a book about the reality of drone warfare, Drone Warrior: An Elite Soldier's Inside Account of the Hunt for America's Most Dangerous Enemies.

Brett V. was the intelligence specialist on a drone team. He led the work of gathering information and deciding how and when to arrest, or, if needed, kill an enemy. After President Obama was elected, the autonomy of the drone teams was reduced; the President mandated that he must personally authorize each kill, whether by drone or by a raid. I saw a video from late in the Obama presidency in which he discussed the more than 3,000 killed at his say-so. He said, "It turns out I am really good at killing. I never thought that would be an item on my résumé."

There were, and are, several drone teams. It doesn't become clear in the book how many kills and arrests occurred under the author's purview. But certain numbers stand out, and this one is primary: for every kill there were twenty arrests, and most of them led to useful intelligence. So the 3,000 terrorist leaders whose death was authorized by President Obama are accompanied by the arrest and interrogation of about 60,000 others. That is the key to a war against ISIS and similar enemy groups.

No matter what you think about the use of military drones, you have to read this book. The author portrays unblinkingly what was happening to him. This kind of work leads to estrangement from everyone, from all of us who can never know what it is really like. Every returned warrior is changed, and these are still the early days of drone warfare, which changes a person even more than traditional combat does. I hope that can be improved upon.

Mr. Velicovich nearly lost his way after returning to civilian life. He has found something productive to do with his skills. The book ends at the beginning of this new beginning for him. I wish him success in using drone-intel skills for positive things in the civilian sector.

Saturday, November 25, 2017

Media Schmedia

kw: book reviews, nonfiction, media, social media, news, fake news, sharing

Everything has a life cycle. I can't recall what I expected when I saw All Your Friends Like This: How Social Networks Took Over News. It has three authors, Hal Crawford, Andrew Hunter, and Domagoj Filipovic, who were colleagues at ninemsn, an Australian online news website that is a lot like what Google News might be if it were on its own.

Folks in the vaunted Northern hemisphere pay little attention to what goes on "down under", but these fellows appear to have gotten a finger on the pulse of a generation, learned what it means, and run with it. How do you measure the relative effectiveness of a new style of media? There is the obvious metric: newspapers are going broke, broadcast media are scrambling to keep from dropping off the ratings chart, newsroom staffs are shrinking, and even mediocre podcasts are apparently reaching larger audiences than large TV networks.

These guys wanted something more, and hit upon measuring Shares on FaceBook and related vehicles (I used to think the "News Feed" at FB was a bit of a joke, but I've noticed that its news content is growing). They produced a site (or method?) called Share Wars, and Mr. Filipovic developed a software system, Likeable, that scrapes social media news feeds to gather sharing statistics. It was available for public access until mid-2016, but is now in the background of the trends they report.

The book chronicles these aspects of the replacement of "push" media with "personal push" media, driven by the Share buttons we find on every web site purporting to convey newsworthy items. Publishing is now so easy and pervasive, it has of course greatly increased the production and distribution of lies and scams including "fake news" (which isn't news at all: a lie by any other name is still a lie). When one of the authors spoke of his "War of the Worlds" moment, I realized that "fake news" has been around as long as "real news".

I don't know what else to say. It is a very interesting book, but didn't resonate with me the way I'd hoped. Buggy whips are still being manufactured, but as a specialty item for history buffs and collectors of horse-drawn vehicles. The Times (of wherever) will be with us for a long time, but the introduction of Sharing has changed the landscape of all media, forever, or at least until something even more compelling arrives. Maybe Crawford, Hunter and Filipovic can help us see the next big change coming.

Sunday, November 19, 2017

Lamp spectra - first try

kw: analysis, spectroscopy, lighting

In the past few years we have tried several lower-wattage "bug lights" as an alternative to the yellow 40-watt incandescent bulbs we've used before in our porch light fixture. When the one we had 4 years ago burnt out we got a 13 watt, yellow compact fluorescent spiral lamp by Sylvania. Though it was not marketed as a bug light, it worked pretty well, though some insects came to it. The next year I saw a 6 watt LED bug light, marketed as such by Feit, so we got that. It worked about equally well. Then I went looking for something that might be a little bit better, and got a 3 watt amber bug light, also by Feit. It doesn't draw insects, but it is pretty dim.

I decided to find out whether a little blue light is getting out of these lamps, so I made a crude spectroscope from a piece of diffraction grating and a short length of PVC pipe plus some odds and ends. In this photo it is on a tripod aimed at a test lamp. I aim a camera with a telephoto lens at the black aperture at the left, where the spectrum emerges.

I cut the end of the PVC for the grating at an angle so the spectrum would exit at right angles to the grating. It has the added benefit that, for visual use, looking the "back way" yields a spectrum about twice as wide. But the focal plane is strongly tilted, making it a poor choice for photography (though I tried!). The instrument has a number of shortcomings, but I think I know how to produce a better next version. For one thing, I'll use a different exit angle, so the diffraction grating doesn't reflect the camera and photographer! (see below)
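For anyone sizing a similar instrument, the exit angles fall out of the grating equation, d·sin θ = m·λ. A minimal sketch (Python; the 1000 lines/mm pitch is my assumption, since the post doesn't say what grating film was used):

```python
import math

LINES_PER_MM = 1000          # hypothetical grating pitch (an assumption)
d = 1e-3 / LINES_PER_MM      # groove spacing in meters

def diffraction_angle(wavelength_m, order=1):
    """Angle from the grating normal satisfying d sin(theta) = m lambda.
    Returns None when that diffraction order doesn't exist."""
    s = order * wavelength_m / d
    if s > 1.0:
        return None
    return math.degrees(math.asin(s))

for nm in (400, 550, 700):   # violet, green, deep red
    print(f"{nm} nm -> {diffraction_angle(nm * 1e-9):.1f} deg")
```

With that pitch, first-order violet emerges near 23.6° and deep red near 44.4°, a spread of more than 20°, which is why the angle of the exit cut matters so much for keeping the spectrum in the camera's focal plane.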

I photographed the spectrum of nine lamps, the three test bug lights and several others either for spectrum reference or to see the spectral coverage of both incandescent and non-incandescent lamps. Eight of the lamps are shown here, and their spectra are tagged in the next image, followed by some explanation.

These are in the order listed in the spectra image.

The first three spectra are for reference. The 4000K (cool white) CFL shows a combination of spectral lines for mercury (Hg) and for the phosphors used to "whiten" the harsh blue-green light of raw Hg lamps. Mercury has a strong green spectral line at 546 nm, as seen in both this lamp and the 13W yellow CFL (a nearby strong green line is from a phosphor), and a strong blue-violet line at 405 nm, which excites some of the fluorescence, but a stronger near-UV line at 365 nm does most of that. The strong red-orange line at or near 615 nm is from a phosphor, as are the yellow-orange-green and green-blue-violet bands. The 40W incandescent lamp shows the smooth spectrum characteristic of a thermal source. The near-lack of yellow in this spectrum is because a camera's sensor sees colors differently from our eyes, but this is only evident when photographing spectra! The 60W "Reveal" lamp has a filter that cuts out most of the yellow and yellow-orange, making the light appear bluer and closer to daylight.

The next three spectra are for the bug lights. The 13W yellow CFL has the same spectrum as the white CFL from green through red, but with extra yellow and orange, and the green-blue-violet phosphor is left out. Also, a filter removes the blue and violet lines of Hg. The two LED's have nearly identical spectra. The blue-violet light from the fluorescence-exciting blue LED is filtered out, leaving only light from the broad band phosphors. The 3W lamp has a little more red-orange than the 6W lamp, and this is visible when they are lit side-by-side; the 3W lamp's color is amber. In the photo of the lamps above, the filter is inside the 3W lamp's envelope, which is white. For these three spectra, the brownish features seen below the green band are reflections of either me or the camera off the diffraction grating film.

The 8.5W LED is the kind of "warm white" bulb we have begun to use around the house. It has a spectrum very similar to incandescent; it just has a dip in the mid-blue range, and a bright band in the blue-violet range, which is from the fluorescence-exciting LED. The UV CFL is a "black light", very similar to old black light fluorescent tubes used at parties, but in spiral form. Most of the visible light is filtered out. The green and violet lines at 546 and 405 nm are a little visible anyway, and the camera is barely able to record the 365 nm line that does all the work of making fluorescent things glow. I am puzzled by the line in between, at about 385 nm. I don't know what it could be from. However, I know that these lamps use a phosphor that responds to a strong Hg line at 254 nm and converts it to longer-wave UV, to get more "black light". Perhaps it is the source of the 385 nm line and other faint features in that space, but I think it mainly adds more 365 nm light.

Finally, the 40W fluorescent tube is of the kind that has been in use for nearly my whole life (7 decades), now mostly supplanted by CFL's and LED's. The two lines of Hg in blue-violet and green come through, but broad-band phosphors fill out the light making these pretty good for most uses. They actually have better color rendering values than CFL's, at the cost of using nearly twice the power: a 40W "tube" and a 23W CFL both emit about 1,600 lumens, but strongly colored items may look a little odd with the CFL.
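That power comparison is just luminous efficacy, lumens divided by watts. A tiny sketch using the figures quoted above:

```python
# Luminous efficacy (lumens per watt) for the two lamps compared above.
lamps = {
    "40W fluorescent tube": (40.0, 1600.0),   # (watts, lumens), as quoted
    "23W CFL":              (23.0, 1600.0),
}
efficacy = {name: lumens / watts for name, (watts, lumens) in lamps.items()}
for name, e in efficacy.items():
    print(f"{name}: {e:.0f} lm/W")
# -> 40 lm/W for the tube, 70 lm/W for the CFL
```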

As crude as it is, this simple spectroscope helped me understand these lamps better. I think the reason that some insects still come to the three non-incandescent bug lights is that they can see the green light. I don't have an incandescent bug light, but I suspect it to have less green light than the CFL or the LED's. This has been an instructive exercise.

Wednesday, November 15, 2017

Libraries - Don't even try to live without 'em!

kw: book reviews, nonfiction, libraries, librarians

In 2014 Kyle Cassidy was invited to a librarians' conference, where he photographed and interviewed a number of the attendees. A project was born. He went to more conferences and eventually obtained portraits and quotes from more than 300 librarians. This is What a Librarian Looks Like: A Celebration of Libraries, Communities, and Access to Information couples the photos and quotes with ten essays about specific libraries by Mr. Cassidy and a baker's dozen remembrances by authors and others who share how their lives were made or molded by libraries.

Spoiler (I suppose): Librarians look like everybody else. It is how they think that makes them different. Random quotes:
"Without librarians and instructors teaching students how to do research, many students never learn that there is a better way to do and learn things." —Lindsay Davis, University of California, Merced
"I want to nurture curiosity, feed knowledge, lay a foundation for information." —Katie Lewis, Drexel University
"Everything comes down to information. Librarians know how to use it, find it, and share it with the world, and they're ready to help everyone else do the same." —Topher Lawton, Old Dominion University.
"In the morning, I'm a rock star to a room full of preschoolers; midday, I'm a social worker assisting a recently unemployed patron in finding resources; in the afternoon, I'm an educator leading kids through an after-school science workshop. Librarians serve so many purposes and wear so many hats, but all of them change lives." —Sara Coney, San Diego County Library
My favorite quote about libraries is by Jorge Luis Borges: "I have always imagined that paradise will be a kind of library." In case, dear reader, you haven't run across an earlier mention of this: the Polymath at Large blog would not exist without libraries. To date I have written 2,075 posts. 55% of them are book reviews. Other than the ongoing series of presenting The Collected Works of Watchman Nee, I own no more than a dozen of the books I have reviewed. The rest were borrowed from one of a handful of local libraries.

When I was 19 years old, having just moved from Ohio to California, I went to the nearest library and began checking out books. At that time of my life I needed escape and I needed it badly. The library had all the science fiction books in one section, a large shelf section seven feet high. I took the first five books at the upper left and checked them out. A few days later I returned them and checked out the next five. For the next year or more I continued this until I had read the entire section of about 500 sci-fi novels and short story collections. Thereafter I slowed down and branched out. When the library began to mix fantasy in with sci-fi, and then horror (Lovecraft was popular at the time), I backed off the fiction and began reading mostly nonfiction, primarily in science (Dewey Decimal numbers 500-599).

These days, even though I am retired, I am sufficiently busy that I seldom finish more than one book weekly. Looking back at recent blog posts (more than 70% are book reviews for the past four years), I find that I average about five books monthly.

I don't use the library only to check out books, though that is behind 90% of my visits. I have attended lectures and programs; I took a guitar to their Poetry Night several years ago and sang one of my songs, which led to a special program featuring my music; the genealogy club meets there and I have attended from time to time.

During the last ten years of my career at Dupont, I was a kind of librarian. I transferred from IT to IS (information science) and was put in charge of upgrading the software used to index and retrieve technical documents in the Electronic Document Library (EDL). For the final couple of years, we had an upper manager who thought "Google can do anything" and cut way back on the indexing staff. Indexing is the highly specialized craft of determining the major themes of an article or report and devising an appropriate set of key terms to attach to it in the metadata section of its electronic version. Professional indexers (I became one) also determine when a new key term is needed in the controlled vocabulary being used. Human indexing is still the gold standard: no "search engine" can yet extract the right set of key terms from any document substantial enough to warrant storing in an electronic library.

When Dupont was "only" a chemical company, the term "rust" was unambiguous. It referred to an oxidation process that corrodes metals, particularly iron and similar metallic elements. But whoever created the earliest controlled vocabulary for Dupont was wise enough to realize that "rust" could have a wider meaning, and thus an entry in the list is:
rust USE corrosion
Also, two companion entries can be found:
corrosion USE FOR oxidative decay
corrosion USE FOR rust
Sure enough, if you look up "oxidative decay" you will find:
oxidative decay USE corrosion
Wouldn't you know it: several decades ago Dupont began producing crop protection chemicals, and some of its anti-fungal products were aimed at various fungi called "rusts", such as wheat rust. Thus some newer terms referring to fungi were added to the controlled vocabulary.

That is one illustration of a phenomenon that is common in human languages. Words have multiple usages, and their context may be clear to us but not so to software. Even now, no Google search, not even using the Advanced Search page (if you can find it), is able to robustly distinguish articles about rusting of metals from agricultural rusts.
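For the technically inclined, the USE / USE FOR cross-references above amount to a small two-way mapping between non-preferred terms and preferred ones. Here is a minimal sketch in Python; the structure and function names are my own illustration, not Dupont's actual system:

```python
# "USE" entries: each non-preferred term points at the preferred term an
# indexer should use instead.
use = {
    "rust": "corrosion",
    "oxidative decay": "corrosion",
}

def preferred(term):
    """Resolve a term to its preferred form (unchanged if already preferred)."""
    return use.get(term.lower(), term.lower())

def use_for(preferred_term):
    """Invert the mapping: the non-preferred terms a preferred term covers."""
    return sorted(t for t, p in use.items() if p == preferred_term)

print(preferred("Rust"))      # -> corrosion
print(use_for("corrosion"))   # -> ['oxidative decay', 'rust']
```

The inverse lookup is exactly what the USE FOR entries record, which is why a well-built thesaurus always carries both directions.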

A growing problem today goes by the misleading moniker Fake News (If it is Fake, it isn't News; it's just a Lie). Things on the Internet were bad enough when the main issue with material was ignorance on the part of the writers, the "creators of content". I think nearly any random adult knows that advertising is biased. Gather all the ads you can on toothpaste, for example, and it seems that there are at least five brands that are "recommended" by more than half of all dentists. No toothpaste ad will mention that the surveys used to gather such recommendations consisted of questions of this form:

Which of the following brands of dentifrice would you recommend (Check all that apply)?
 Beaver Brite
 DentiGood
 Sani-Kleen
There may be 10 or 20 on the list. So, of course, if you're selling DentiGood and 64% of dentists happened to check it, along with five or eight others, you can claim, "2/3 of dentists recommend DentiGood!", thinking that nobody will mind if you round 64% up to 2/3. Of course, you would never, ever mention that 3/4 or more of those same dentists also "recommend" Sani-Kleen!
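The arithmetic behind the trick is easy to demonstrate. Here is a toy simulation of a check-all-that-apply survey; the extra brand names and all the numbers are invented for illustration:

```python
import random

random.seed(1)  # reproducible toy data
brands = ["Beaver Brite", "DentiGood", "Sani-Kleen", "PearlyWhite", "Mintalot"]
dentists = 100

# Each dentist checks several brands, as "check all that apply" permits.
responses = [random.sample(brands, k=random.randint(3, 5))
             for _ in range(dentists)]

# Fraction of dentists who "recommend" each brand.
shares = {b: sum(b in r for r in responses) / dentists for b in brands}
for brand, share in shares.items():
    print(f"{share:.0%} of dentists recommend {brand}!")
```

Because each respondent checks several brands, the percentages sum to far more than 100%, so nearly every brand on the list can truthfully claim a "majority" of dentists.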

But what do we do when a larger and larger proportion of the "news" is truly a pack of lies? When I was young it was clear that the news media were biased to the left. Now the majority of them are left-leaning with actual malice. So what can we do? I suggest: ask a librarian how to do your own research, how to track down the source of a story. That will take more than just looking it up on a certain well-known fact-checking site (staffed by a very busy couple who are really, really good at research).

One of the most helpful humanities courses I ever took, with a title I no longer remember, taught us how to determine the bias in any publication. We read a very wide variety of journals, from Commonweal and The Wall Street Journal to National Review and The New York Times. We were to find diverse articles about the same recent event and compare them. It was the best course in critical thinking I've encountered.

I'll avoid digging further into the fake news conundrum. We need librarians' expertise and tool set to learn how to know what we know and how to know if what we know is worth knowing. 'Nuff said.

Wednesday, November 08, 2017

A dreadfully posh mystery

kw: book reviews, mysteries, aristocrats

Strangely enough, this book was next to a "Sneaky Pie Brown" mystery that I reached for rather absent-mindedly, but I didn't notice I'd mis-aimed until I got to checkout. I decided to consider it an adventure and see what was in store.

On Her Majesty's Frightfully Secret Service by Rhys Bowen shares nothing with any Ian Fleming novel but its take-off title. The obligatory secret-agent character is decidedly secondary to Lady Georgiana Rannoch, who has all the adventures, solves mysteries, and generally just avoids becoming another victim. Other books by the author similarly riff on familiar titles and tropes (e.g., Her Royal Spyness, The Twelve Clues of Christmas).

Lady Georgie is a poor relation of the royal family, cousin to Queen Mary (grandmother of Elizabeth II). Thus, although she has hardly any family money to go on, she gets pulled into aristocratic intrigues. In this volume, in the spring of 1935, she goes to Italy to care for an ailing friend, but also spends a few days at a house party in a large villa, one attended by the Prince of Wales (her cousin David, who would reign briefly and abdicate as Edward VIII) and his intended, Wallis Simpson, along with a number of Italian and German grandees. She has a special reason for being at the house party: the Queen has sent her to spy on the prince and Mrs. Simpson. Pre-WWII intrigue forms the backdrop.

Initially I found the aristocratic milieu rather tiring, but warmed to it in time. It is a fun sort of lingo to imitate, as fans of Jane Austen well know. I was also a bit taken aback by the characterization of the prince and his intended. If Mrs. Simpson was the spoiled harridan portrayed in the book, one wonders what the prince could possibly have seen in her, though he is portrayed as utterly devoted to her, albeit anxiously. I don't have sufficient knowledge to judge how accurate this may be.

The plot is a classically structured closed-door mystery, solved at the last moment and pretty much by accident by Lady Georgie. A bit of idle fun to read, a break from my usual diet of nonfiction.

"Rhys" is the Welsh version of "Reese" (as in Witherspoon); Ms Bowen is of the British Isles and knowledgeable enough about aristocratic habits of nearly a century ago to pull this off.

Saturday, November 04, 2017

Relating for communicating

kw: book reviews, nonfiction, communication, improvisation

An actor who is any good must become expert at relating to an audience. This usually means inducing people to care about the character. The best actors may not win all the Oscars, but they are the ones people care about the most. This is distinct from the odd quality of being a "celebrity".

If people watching a play or movie empathize with the character, does that mean that the actor portraying that character also has a lot of empathy? Sometimes, maybe most of the time. Of course, some actors are totally faking empathy, having learned to induce sympathetic feelings in a cynical way, even a psychopathic way (psychopaths are frequently very charming, but it is surface only).

Alan Alda learned to act what he feels, and he became the host of Scientific American Frontiers and several other series because of his unparalleled ability to genuinely relate to the people in the episodes and to the audiences. In his book If I Understood You, Would I Have This Look on My Face?: My Adventures in the Art and Science of Relating and Communicating, Alda relates that it was not always so. Even after a successful career in improv, stage, and screen acting, when he first interviewed a scientist he made at least five blunders, blunders he never would have made had he connected how an actor projects a character to an audience with how an interviewer relates both to the subject of the interview and to the audience who will watch it. (At 23 words, the book's title is one of the longest on record, and it has my personal "Bravo!" for projecting clarity at that length!) The book describes many of the tools, borrowed primarily from improvisational theater, and the "games" used by improv coaches, that Alda and his colleagues at his Center for Communicating Science (now at Stony Brook University) use to improve the communication skills of those least likely to have developed any: working scientists.

I was in drama club in high school, and acted in a repertory company my first two years of college, but I never learned improv. I was strictly a "by the script" actor. But as I read I gradually learned how to relate to the stories Alda tells, and the principles they embody.

For most of us, breakdown of communication has one source: FEAR. I once took a "Business Writing" class my company sponsored, and the pre-assignment was to "improve" a badly-written business letter. I turned in two versions. One was a re-write based on principles of business writing that I knew already. The second was much shorter: brief, to the point, and totally forthright; to it I attached a note, "Here is how we would write if we didn't fear one another."

The games Alda describes, and the other methods he uses for breaking down barriers between any two people who want to communicate with one another, all drive out fear in one way or another. For example, one of the first "games", Mirroring, gradually shows the participants that they are not so different. The better the "follower" gets at following the actions of the "leader", even learning to anticipate and thus mirror without delay, the more both learn how similar they are. An advanced version, "leaderless mirroring", drives the point even deeper.

I am such a purist that I had a harder time than most readers will in "getting" what the author is sharing. Finally, though, the message on one significant point became clear to me: most "lecturing" is answering questions that have not been asked, just as most "help" is presented so as to help the helper (or the way the helper imagines needing to be helped). Effective communication instead requires knowing, or learning, enough about the other party that we elicit the right questions, spoken or not; only then is the other ready to receive the "answers". This solidified a realization I had about the "Golden Rule", which grew into several steps of increasing value:

  • The SILVER rule (attributed to Confucius and others): "Do not do to another anything that you don't want done to you."
  • The GOLDEN rule (from the sayings of Jesus in the Bible): "Whatever you wish that others would do to you, do also to them."
  • The PLATINUM rule: "Do unto others as they wish to have done to them."
  • The DIAMOND rule: "Ask first".

Alda writes much about empathy and Theory of Mind, which allow us to, in part, "read" others' minds. If we know how to listen, though, nothing beats a well-crafted question.

Though I feel quite dull, in an emotional sense at least, I got much from this book, so I think practically anyone can gain much.