Friday, October 27, 2017

One Tree, One Year, One Book

kw: book reviews, nonfiction, natural history, trees, forests, phenology

A woman once wrote of a talk with her daughter, who was being affected by the licentious, free-love, promiscuous atmosphere of the late 1970's. The daughter had asked, "How can you be satisfied to be with just one person for a lifetime?" The woman reminded her daughter of her youthful experience living near the sea shore: "Every day that you could, you went to the little cove. You explored it over and over. You never tired of it. If we still lived there, don't you think you would still enjoy it, just that one little cove, so full of things to see, that changed a bit every day, but was always the same?"

These are wise words. They explain how a couple can, with a bit of imagination, remain fascinated with one another for 40 or 50 years or more (my parents were married 58 years; so far for me, 42 years). I thought of this story when I began to read Witness Tree: Seasons of Change with a Century-Old Oak by Lynda V. Mapes. She spent a year living in visitors' quarters at Harvard Forest near Petersham, Massachusetts, devoting much of her attention to a single tree, a red oak about 100 years old and more than 80 feet tall.

The Harvard Forest is home to dozens, perhaps hundreds, of experiments in forestry, botany, climatology, and a number of other disciplines. Many have been going on for decades, though it is unlikely that any has run continuously since the Forest was founded in 1907. The map pin in this image is the approximate location of the Witness Tree.

One of the author's mentors has been walking the same route at least weekly, sometimes twice weekly, recording his observations of selected plants—trees and shrubs, mostly—and has compiled a record of more than 25 years of the phenology of those plants and that bit of the forest.

Phenology! I had to look it up (I'd seen the word before and could guess its meaning, but…):
Phenology is the study of periodic plant and animal life cycle events and how these are influenced by seasonal and interannual variations in climate, as well as habitat factors. (from Wikipedia)
A phenological record for a simple annual plant such as a sunflower might include when the seed first sprouted, the first true leaf emerged, the height and spread of the plant on various dates, when each flower appeared, when the seeds ripened, when goldfinches began to eat the seeds and when they were all finished off, the date of the first killing frost, and when the stem fell over.

Such a record for a perennial plant, particularly a shrub or tree, would include numerous events throughout the year, for year after year, not only of what the plant is doing but what significant weather and other environmental events occurred, and the plant's response to them. Disease or locust invasion or hailstorm? It all goes in the record.

The author makes clear throughout the book her interest in climate change and how this tree and those around it are responding, and have responded over the past half century or so. A core sample taken at the beginning of "her" year showed that for the past decade the tree has been doing very well, adding thicker rings than at any similar period in its past. In a sense, the changing climate has been good for this tree. Also, the tree is doing what it can to absorb carbon dioxide and make wood out of it, which mitigates the rapidity of climate change.

I want to deal with a quibble before going on: In Chapter 9 the author does her best to explain the greenhouse effect, by which carbon dioxide helps warm the earth. A good biologist is not necessarily a good physicist, and the following statement needs amendment:
"As it enters our atmosphere, the radiant shortwave energy of the sun is transformed to long-wave radiation – heat. Molecules of carbon dioxide in the atmosphere absorb this heat and vibrate as they warm, creating even more heat." (emphasis mine)
It takes a moment to understand what she is saying here, because heat-induced warming of any gas does not create more heat. What is happening is that the molecules absorb radiation of medium wavelengths (mid-infrared), which induces vibration in the molecules so that they re-radiate longer wave energy, a broad spectrum of medium-to-far infrared. No "heat" is created. Infrared radiation is not specifically "heat" radiation, because all radiation heats up anything that absorbs it, in equal measure. A beam consisting of one watt of green, blue, ultraviolet, or whatever radiation, will cause just as much heating as a beam consisting of one watt of infrared radiation. So, more accurately, and specifically related to the greenhouse mechanism:
…the shortwave energy of the sun is absorbed by the earth's surface—dirt, plants, pavement, water—which warms them so that they emit long-wavelength infrared. Some gases in the atmosphere, primarily water vapor, carbon dioxide, and methane, absorb a lot of infrared, which warms them so their molecules vibrate and re-radiate infrared. Half of it is directed generally back down, and half generally outward into space. This redirection is a barrier to the infrared being radiated directly outward, so the earth's surface must get a little warmer and radiate more infrared, bringing about a balance between all the light that was originally absorbed and what is radiated back outward.
Nowhere does she mention that the primary greenhouse gas is water vapor. Though she does say that without the greenhouse effect the earth would be 33 degrees C cooler, and thus mostly frozen, nearly all of that warming to temperatures we consider "comfortable" is because of water vapor. The 280 ppm (0.028%) of carbon dioxide in the pre-industrial atmosphere added about 2°C. Now its level is about 400 ppm, and this has added another degree C.

A little over a century ago Svante Arrhenius determined, using calculations so simple that I have done them myself, that if atmospheric carbon dioxide were doubled from 300 ppm to 600 ppm, global average temperature would rise by about 4°C (7°F). However, the conclusion that doubling it again to 1,200 ppm (0.12%) would cause an 8°C rise is false. It is not a linear relationship. Better calculations show that carbon dioxide cannot drive warming beyond a level of about 5.5°C (10°F), even with several percent of the atmosphere being composed of carbon dioxide. At that point we would find our breathing affected! Also, the Arrhenius calculations don't take weather into account. When energy is added to the system, some of it goes into stronger winds and more frequent extreme weather events. These can reduce the extra warming by about half. This is a good-news-bad-news situation: global temperature rise is limited to about 3°C, but insurance companies are going to be paying out more claims related to floods, tornadoes and hurricanes.
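To see the shape of that relationship, here is a minimal sketch (mine, not Arrhenius's or the book's) of the simple logarithmic rule, assuming the 4°C-per-doubling figure quoted above; as noted, I'd argue this naive rule over-predicts at very high concentrations:

    # Simple logarithmic CO2-temperature relationship (Arrhenius-style sketch).
    # Assumes ~4 C of warming per doubling of CO2, as quoted above; real climate
    # sensitivity is debated, and this ignores feedbacks and weather entirely.
    import math

    def warming_from_co2(ppm, baseline_ppm=300.0, degrees_per_doubling=4.0):
        """Temperature rise (deg C) relative to the baseline concentration."""
        return degrees_per_doubling * math.log2(ppm / baseline_ppm)

    for ppm in (300, 400, 600, 1200):
        print(f"{ppm:5d} ppm -> {warming_from_co2(ppm):+.1f} C")
    # The naive rule gives +4 C at 600 ppm and +8 C at 1,200 ppm; better
    # calculations, as argued above, cap the CO2-only warming near 5.5 C.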

Now, back to the wonderful tree in Harvard Forest. You can follow it day by day here. It is currently the last in a list of 13 "phenology cameras" (this image was captured an hour or two before I began writing this post).

Ms Mapes, with the help of several colleagues, measured the tree, studied the animals and plants that lived nearby, in, or on it, and climbed it a few times. Such a tree is no easy climb. The lowest branches begin about 40 feet above the ground. Just throwing a bean bag on a string over a branch to start hauling up a climbing rope is no easy feat. Once she learned to do that, her tree-climbing mentor pulled out a large slingshot that can rather accurately place a bean bag over a limb of choice!

Dozens of insects and other small animals depend on forest trees. A "trail camera" also showed her the various animals, from badgers and skunks to deer and coyotes, that passed by the tree, usually without paying attention to it. Trees are remarkable, having coping mechanisms of many kinds because they don't have the option of going indoors when it snows, or of packing up and moving elsewhere when threatened. They must just sit and take it. Two chapters on the way trees "talk", including the way they send signals to one another about new insect depredations, and how trees that receive such signals change their leaf chemistry to discourage the attackers, show that they are far from passive receivers of whatever nature dishes up.

We have, most of us, a certain affinity for trees. We like them in our yards: A house on a quarter-acre lot that hosts 20-30 trees sells for 20% more than one with a tree-free lawn. Eight of the ten most-visited National Parks are forested (see here). While I love the desert, even there I most enjoy areas with "large vegetation", such as at Joshua Tree in the Mojave or the Saguaro-studded areas around Tucson.

In pre-electronic days, a Witness Tree was a landmark used by surveyors, from which the survey of a neighborhood-sized area was conducted. Even in an age of GPS and ubiquitous cell phone towers (which are sometimes camouflaged as rather odd pine trees), it is no waste of time to observe the changes of a single tree, or a small stand, as they respond to the seasons of the year, and the changes from year to year. When we slow down to not just "smell the roses" but to truly see what is going on, we are all natural-born phenologists.

Sunday, October 22, 2017

Spider, spider on the wwweb

kw: blogging, blogs, spider scanning

I checked the blog stats today. Had I waited a day or two I'd have missed another love note from Russia. All quiet since some time on October 17th, though.


Thursday, October 19, 2017

Seven edges of knowledge

kw: book reviews, nonfiction, science, theories, knowledge

Can we (collectively) know everything, or will some things remain forever beyond our ken? The answer depends on how much there is to know. If knowledge is, quite literally, infinite, then given the universe as we understand it, there is no possibility that everything can be known. But there is another way to look at the question, one taken by Professor Marcus du Sautoy in The Great Unknown: Seven Journeys to the Frontiers of Science: are some things forever unknowable by their very nature? His "Seven Journeys" are studies of the cutting edge of seven scientific and socio-scientific disciplines; they are explorations of what can, and perhaps cannot, be known at each frontier.

The seven disciplines are simply stated: Chaos (in the mathematical sense), Matter, Quantum Physics, The Universe, Time, Consciousness, and Infinity (again, in the mathematical sense). Six of these are related to the "hard" sciences, while Consciousness is considered a "soft" problem by many; in reality it may be the hardest of all!

I knew beforehand of the three great theoretical limits to the hard sciences that were gradually elucidated in the past century or so: Heisenberg Uncertainty, Schrödinger Undecidability, and Gödel Incompleteness. Each can be considered from two angles:

  1. Heisenberg Uncertainty (H.U.) is the principle that the combination of momentum and position can be known to a certain level of precision, but no further. It primarily shows up in the realm of particle physics. Thus, if you know with very great accuracy where a particle is or has been (for example, by letting it pass through a very small hole), you cannot be very certain of its momentum, in a vector sense. In the realm of things on a human scale, diffraction of light expresses this. If you pass a beam of light through a very small hole, it fans out into a beam with a width that is wider the smaller the hole is. This has practical applications for astronomers: the large "hole" represented by the 94-inch (2.4 meter) aperture of the Hubble Space Telescope prevents the "Airy circle" of the image for a distant star from being smaller than about 0.04 arc seconds in visible light, and about 0.1 arc seconds in near-infrared light. The mirror for the James Webb Space Telescope will be 2.7 times larger, and the images will therefore be 2.7 times sharper at any given wavelength. But no telescope can be big enough to produce images of "infinite" sharpness, for the aperture would need to be infinite. (A short worked example of this diffraction limit follows this list.) All that aside, the two interpretations of H.U. are
    1. The presence of the aperture "disturbs" the path of the particle (in the case of astronomy, each photon), which can somehow "feel" it and thus gets a random sideways "kick".
    2. The Copenhagen Interpretation, that the particle is described by a wave equation devised by Schrödinger that has some value everywhere in space, but the particle's actual location is not determined until it is "observed". The definition of "observer" has never been satisfactorily stated.
  2. Schrödinger Undecidability, proposed originally as a joke about a cat that might be both dead and alive at the same moment, is the principle that the outcome of any single quantum process cannot be known until its effect has been observed. The "cat" story places a cat in a box with some poison gas in a flask that has a 50% chance of being broken open in the next hour, triggered by some quantum event such as the radioactive decay of a radium nucleus. Near the end of the hour, you are asked, "Is the cat dead or alive?" You cannot decide. Again that pesky "observer" shows up. But nowhere have I read that the cat is also an observer! Nonetheless, the principle illustrates that, while we can know with a certain accuracy the average number of quantum events of a certain kind that might occur, we have no way to know if "that nucleus over there" will be the next to go. Two ways of interpreting this situation are given, similar to the above: first, that the event sort of "decides itself"; second, also part of the Copenhagen Interpretation, that only when an outcome has been observed can you know anything about the system and what it has done.
  3. Gödel Incompleteness is described in two theorems that together prove, mathematically, that within any given algorithmic system rich enough to do arithmetic, statements can be posed that are true but whose truth cannot be proven inside that system. Most examples you'll find in the literature are self-referential things such as a card that reads on one side, "The statement on the other side of this card is true" and on the other, "The statement on the other side of this card is false." Such bogeys are models of ways of thinking about the Incompleteness theorems, without really getting to their kernel. A great many of them were discussed in gory detail by Doug Hofstadter in his book Gödel, Escher, Bach: An Eternal Golden Braid, without getting to the crux of the matter: Is our own consciousness an algorithmic system? It seems we can always (given time) develop a larger system in which previously uncrackable conundrums are solvable. But then, of course, we find there are "new and improved" conundrums that the tools of the new system cannot handle. An example given in The Great Unknown is the physics of Newton being superseded and subsumed into the two theories of Relativity developed by Einstein. Again, there are two ways this principle is thought of. Firstly, that given time and ingenuity we will always be able to develop another level of "meta system" and solve the old problems. But secondly, we get into the realm of the "hard-soft" problem of consciousness: Is consciousness algorithmic? For if it is, we will one day run out of meta systems and can go no further.
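Since the diffraction limit mentioned under item 1 is easy to compute, here is a minimal sketch using the standard Rayleigh criterion (theta ~ 1.22 λ/D); the exact wavelengths chosen for "visible" and "near-infrared" are my own illustration values:

    # Diffraction (Rayleigh) limit for a circular aperture: theta ~ 1.22 * lambda / D.
    # Apertures: Hubble ~2.4 m, James Webb ~6.5 m (about 2.7x larger).
    import math

    RAD_TO_ARCSEC = 180.0 / math.pi * 3600.0

    def rayleigh_limit_arcsec(wavelength_m, aperture_m):
        """Smallest resolvable angle, in arc seconds."""
        return 1.22 * wavelength_m / aperture_m * RAD_TO_ARCSEC

    for name, aperture in (("Hubble", 2.4), ("Webb", 6.5)):
        for label, wavelength in (("visible 550 nm", 550e-9), ("near-IR 1000 nm", 1000e-9)):
            print(f"{name:6s} {label:15s}: {rayleigh_limit_arcsec(wavelength, aperture):.3f} arcsec")
    # Hubble at 550 nm comes out near 0.06 arcsec; the 0.04 figure above corresponds
    # to the blue end of the visible band. No finite aperture ever reaches zero.
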
Thus the two questions that really need answering are, "What is Consciousness?" and, "Is Space Quantized?"

The only way we know to study consciousness is to study our own and that of a small number of animals that seem to be self-aware. Some would posit that we can create conscious artificial intelligence (AI), but this is questionable because all known methods in the sphere of AI studies are algorithmic, even if the algorithm is "hidden" inside a neural network. Since we do not yet know if natural intelligence (NI) is algorithmic, we cannot compare AI to NI in any meaningful sense!

One consequence of a possibly infinite universe is that everything we see around us might be duplicated an endless number of times, right down to the atomic and subatomic level. Thus there could be infinite numbers of the Polymath at Large typing this sentence, right now, in an infinite number of places, though very widely separated, to be sure (say, by a few trillions or quadrillions of light years, or perhaps much, much more). But, if I understand the proposition correctly, that is only possible if space is quantized. Quantization of space is based on the discovery of the Planck length and the Planck time about a century ago. They are the smallest meaningful units of length and time known. The Planck length is about 1.62x10^-35 m, or about 10^-20 times the size of a proton. If space is quantized, it is most likely quantized on this scale. The Planck time is the time it takes a photon to travel a Planck length, or about 5.4x10^-44 sec.

If space is quantized with the space quantum being a Planck length, that means that positions can be represented by very large integers, and that those positions will be not just very precise, but exact. How large an integer? If we consider only the visible universe, which has a proper radius of about 46 billion light years, or 4.4x10^26 m, you'd need a decimal integer of 35+26+1 = 62 digits, or a binary word (for the computer) containing about 205 bits, or 26 bytes.
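Here is a quick back-of-the-envelope check of that integer size (my sketch, taking the radius and Planck length quoted above as given):

    # How big an integer gives a position, in Planck lengths, across the visible universe?
    import math

    PLANCK_LENGTH_M = 1.62e-35
    UNIVERSE_RADIUS_M = 4.4e26          # ~46 billion light years

    positions = UNIVERSE_RADIUS_M / PLANCK_LENGTH_M
    decimal_digits = math.ceil(math.log10(positions))
    bits = math.ceil(math.log2(positions))

    print(f"{positions:.2e} Planck lengths")      # ~2.7e61
    print(f"{decimal_digits} digits, {bits} bits, {math.ceil(bits / 8)} bytes")
    # About 62 decimal digits, 205 bits, 26 bytes per coordinate.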

The trouble comes when you want to learn positions to this kind of precision/exactitude. To learn a dimension to an accuracy of one micron you need to use light (or another sort of particle such as an electron) with a wavelength of a micron, or smaller, to see it. To see the position of a silicon atom in a crystal, you need x-ray wavelengths smaller than 0.2 nm (200 pm), which comes to 6,200 eV per photon. X-rays of that energy are a little on the mild side. But to "see" a proton, you are getting into the sub-femtometer range, which requires gamma-ray photons of a billion or more eV each. Twenty orders of magnitude smaller yet, to be able to distinguish a Planck length, would require such energetic gamma rays (tens of octillions of eV each, nearly 10^29) that two of them colliding would probably trigger a new Big Bang.
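Those photon energies all come straight from E = hc/λ; a minimal sketch, using the feature sizes from the paragraph above:

    # Photon energy needed to resolve a feature of a given size: E = h*c / wavelength.
    # Using eV-friendly units: h*c ~ 1239.84 eV*nm.
    HC_EV_NM = 1239.84

    probes_nm = {
        "1 micron feature":       1000.0,
        "silicon atom (0.2 nm)":  0.2,
        "proton (~1 fm)":         1e-6,
        "Planck length":          1.62e-26,
    }

    for name, wavelength_nm in probes_nm.items():
        energy_ev = HC_EV_NM / wavelength_nm
        print(f"{name:24s}: {energy_ev:.3g} eV")
    # ~1.2 eV, ~6,200 eV, ~1.2 billion eV, and ~7.7e28 eV (tens of octillions), respectively.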

By the way, photon energies of billions to trillions of eV would be needed to pin down the locations of the quarks inside nucleons, which is what would actually be needed to get a "Star Trek Transporter" to work, at both the scanning and receiving end. Each such photon carries only about the kinetic energy of a flying mosquito, but you would need several per quark, and a human body holds roughly 10^29 quarks, so the total is staggering. Maybe that's why the transporter hasn't been invented yet, and probably never could be…even if Dilithium and Rubindium get discovered one day.

Also, just by the bye, in a quantized universe there would be no irrational numbers, not truly. I am not sure how lengths "off axis" could be calculated, but they would somehow have to be jiggered to the next quantum of space. There goes Cantor's Aleph-1 infinity!

OK, I got so wrapped up in all of this that I hardly reviewed the book. It's a great read, so get it and read it and go do your own rant about the limits of knowledge!

Monday, October 09, 2017

The die of a trillion faces

kw: analysis, radioactivity, quantum physics, chaos

I'm halfway through a book about the edges of scientific knowledge, which I'll review anon. In the meantime, two of the chapters got me thinking: one on mathematical chaos and the other on quantum randomness as it relates to radioactivity.

Mathematical chaos does not refer to utter randomness, but to mathematical processes that are completely deterministic but "highly sensitive to initial conditions." Such systems are typically studied by running computer simulations, which brings out an amusing feature: many such systems are also overly prone to amplifying rounding errors in the calculations. For example, numerically solving a set of stiff differential equations frequently results in the solution "blowing up" after a certain point, because the rounding errors have accumulated and overwhelm the result.
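A classic toy example of a deterministic but initial-condition-sensitive system is the logistic map; this sketch (mine, not from the book) runs two copies that start a billionth apart:

    # Logistic map x -> r*x*(1-x): fully deterministic, yet tiny differences in the
    # starting value are amplified until the two runs have nothing in common.
    def logistic_run(x, r=4.0, steps=50):
        history = []
        for _ in range(steps):
            x = r * x * (1.0 - x)
            history.append(x)
        return history

    a = logistic_run(0.200000000)
    b = logistic_run(0.200000001)   # differs by one part in a billion

    for step in (5, 15, 25, 35, 45):
        print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
    # By step ~35 the two sequences are unrelated; double-precision rounding alone
    # would eventually do the same thing to a single run.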

Natural systems, being analog and not digital, can be described by sets of differential equations. Digital simulations of such systems can proceed only so far before descending into nonsense. The most famous of these is forecasting the weather. Many computer scientists and meteorologists have labored for decades to produce weather models that run longer and longer, farther and farther into the future, before "losing it." So now we have modestly reliable seven-day forecasts (and Accuweather.com has the temerity to show 90-day forecasts); a decade ago or so, no forecast beyond three or four days was any good.

Quantum randomness is a beast of another color, indeed, of a different spectrum of colors! These days the classic illustration is the ultra-low-power two-slit interference pattern. You can produce a visible (and thus moderate-power) pattern with a laser pointer, a pinhole or lens, and a little piece of foil with two narrow slits a short distance apart. The pinhole or lens will spread the beam so you can see it hit both slits. On a screen a few inches behind, a pattern of parallel lines will appear, similar to this image.

The ultra-low-power version is to set this up with the lens/pinhole and the slits and the laser held in stands, and the screen replaced by sensitive photographic film. Then a strong filter is put at the laser's output, calculated to make the beam so weak that no more than one photon will be found in the space between the laser and the film at any one time. Such an arrangement requires an exposure of a few hours to get the beginnings of a record, and several days to get an image like the one above. Whereas this experiment with strong light seems to show the wave nature of light, the ultra-low-power version shows that a photon has a wave nature all by its lonely self!

A "short" exposure of an hour or less will show just a few dots where single photons were captured by the emulsion. They appear entirely random. The longer the exposure, the more a pattern seems to emerge, until a very long exposure will produce a clear pattern. The pattern shows that you can predict with great precision what the ensemble of many photons will do, but you cannot predict where the next photon to pass through the apparatus will strike the film.

Radioactivity also obeys certain quantum regularities (I hesitate to write "laws"). Half-life expresses the activity of a radioactive material in reciprocal terms: a long half-life indicates low activity. In the book I was reading the author wrote of a little pot of uranium 238 (U-238) he bought, which contains just enough of the element to experience 766 alpha decays per minute. My first thought was to work out how much U-238 he had bought. U-238 has a half-life of 4.468 billion years. Working out the math, I determined that he had just over one milligram of uranium. The amount was so close to a round number that I suspect a typo: if he had bought exactly one milligram, the activity would be 746 decays per minute…and that might be the true amount.
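Here is the little calculation, sketched in code (the half-life and decay counts are the figures quoted above; Avogadro's number and the length of a year are standard):

    # From decays per minute back to the mass of U-238.
    # N = activity / lambda, where lambda = ln(2) / half-life.
    import math

    HALF_LIFE_YEARS = 4.468e9
    MINUTES_PER_YEAR = 365.25 * 24 * 60
    AVOGADRO = 6.022e23
    MOLAR_MASS_G = 238.05

    def mass_from_activity(decays_per_minute):
        decay_constant = math.log(2) / (HALF_LIFE_YEARS * MINUTES_PER_YEAR)
        atoms = decays_per_minute / decay_constant
        return atoms / AVOGADRO * MOLAR_MASS_G * 1000.0     # milligrams

    def activity_from_mass(milligrams):
        atoms = milligrams / 1000.0 / MOLAR_MASS_G * AVOGADRO
        return atoms * math.log(2) / (HALF_LIFE_YEARS * MINUTES_PER_YEAR)

    print(f"766 decays/min -> {mass_from_activity(766):.3f} mg")          # ~1.03 mg
    print(f"1.000 mg       -> {activity_from_mass(1.0):.0f} decays/min")  # ~746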

What is happening inside a uranium nucleus that leads a certain one to emit a helium (He-4) nucleus (and thus turn into thorium 234, Th-234)? Scattering experiments carried out decades ago showed that although the atomic nucleus is incredibly tiny, it is mostly empty space! I learned this as a physics student in the late 1960's. I had found it hard enough to wrap my mind around the view of an atom as a stadium with a few gnats buzzing around the periphery, centered on a heavy BB. So the protons and neutrons, while not being effectively "dimensionless" like electrons, are still much tinier than the space they can "run around" in. The propensity of proton-heavy elements such as U-238 to decay by emitting helium nuclei indicates that the protons and neutrons "run around" in subgroups.

The standard explanation is that at some point one of the He-4 nuclei "tunnels" through the "strong force barrier", finds itself outside the effective range of the force, and thus is accelerated away by electromagnetic repulsion to an energy of 4.267 MeV. What determines when it tunnels through?

Back in the chapter on chaos, the author spoke of dice with various numbers of faces, though he illustrated the randomness of a die's fall using a "normal" 6-sided die he got in Las Vegas. I guess they make them more accurate there, where large stakes are wagered on their "fairness". But dice with various numbers of faces are produced for board-based role playing games. This illustration, from aliexpress.com, shows one such set of ten different kinds of dice, ranging from 4 to 20 faces.

Put two thoughts together, and you can get some interesting products. Can the randomness of alpha decay be related to the randomness of a tumbling die? We can set up a model system with a box of cubical, 6-sided dice, perhaps 100. Here are the steps:
  1. Cast the dice on a table top (with raised sides so none fall off, perhaps).
  2. Remove each die that shows a 6.
  3. Return the rest to the box.
  4. Repeat from step 1.
I did this a few times, stopping each run after 16 trials. Here are two results:

100, 81, 69, 58, 49, 41, 35, 30, 24, 21, 18, 14, 11, 9, 8, 6, 5
100, 90, 78, 64, 53, 46, 37, 31, 26, 22, 18, 15, 12, 10, 9, 8, 7

With a removal probability of 1/6 per throw, the calculated half-life of these dice is ln 2 / ln(6/5), or about 3.8 throws (the continuous-decay approximation, 6 x ln 2 ≈ 4.16 throws, is close). As seen above, small-number statistics cause a certain variation, so that after four throws, 49 and 53 are left; after 8 throws, 24 and 26; and so forth. If instead you use 20-sided dice, the half-life would be about 13.5 throws (13.9 by the continuous approximation).
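The whole experiment is easy to simulate rather than throwing real dice; a quick sketch (mine) that also prints the half-life for each kind of die, for comparison with the two hand-thrown runs above:

    # Simulate radioactive-decay dice: on each throw, every remaining die has a
    # 1-in-"faces" chance of being removed. Half-life for removal probability p
    # per throw is ln(2) / -ln(1 - p); for p = 1/6 that is about 3.8 throws.
    import math, random

    def run(dice=100, faces=6, throws=16):
        counts = [dice]
        for _ in range(throws):
            dice = sum(1 for _ in range(dice) if random.randint(1, faces) != faces)
            counts.append(dice)
        return counts

    print(run())
    print(run())
    for faces in (6, 20):
        half_life = math.log(2) / -math.log(1 - 1 / faces)
        print(f"{faces}-sided: half-life ~ {half_life:.1f} throws")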

This led me to think of the He-4 (alpha particle) "cores" bouncing around inside the strong-force boundary around a U-238 nucleus as being governed by a die with an immense number of faces, perhaps a trillion. Rather than numbers from one to a trillion on the faces, the only thing that matters is the "get out of here" face, which we might consider to be green (for "go"), the rest being red. On average, once per trillion "bounces" the die momentarily has its green face at the boundary, and the alpha particle flies free. Since the decay constant for U-238 is ln(2) divided by the 4.468-billion-year half-life, or about one decay per year per 6.45 billion nuclei, a trillion-sided die would imply a "bounce" time of about two days. The actual transit time for an "orbiting" He-4 is closer to 10^-18 sec, which implies a die with a whole lot more than a trillion faces: roughly 2x10^35 of them, some two hundred billion trillion trillion.
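Putting numbers to the analogy (a sketch using the half-life given above and the nominal 10^-18-second transit time; the "bounce" picture itself is just the metaphor of this post):

    # If an alpha particle "rolls the die" once per bounce, how many faces must
    # the die have so that, on average, one decay happens per mean lifetime of U-238?
    import math

    HALF_LIFE_YEARS = 4.468e9
    SECONDS_PER_YEAR = 3.156e7
    mean_lifetime_s = HALF_LIFE_YEARS / math.log(2) * SECONDS_PER_YEAR

    # A trillion-faced die: the implied time per bounce.
    bounce_time_s = mean_lifetime_s / 1e12
    print(f"Trillion faces -> one bounce every {bounce_time_s / 86400:.1f} days")  # ~2.4 days

    # A 1e-18-second transit per bounce: the implied number of faces.
    faces = mean_lifetime_s / 1e-18
    print(f"1e-18 s bounces -> a die with about {faces:.1e} faces")                # ~2e35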

Can it be that quantum randomness and mathematical chaos are related? Could one cause the other … in either direction?!?

That is as far as I have taken these ideas. I don't know (does anyone?) whether the internal, dynamic structure of a large nucleus is dominated by lone nucleons, by clusters such as He-4 and others, or what. For heavy nuclei, the lack of decay products other than alpha particles, except in cases of spontaneous fission, indicates that any clusters within the nucleus don't exceed the He-4 nucleus in size (and beta decay is a subject for another time!).

Sunday, October 01, 2017

Learn all about fat. Get depressed.

kw: book reviews, nonfiction, physiology, fat, weight loss

I have known for a long time that for many of us in affluent countries, weight management is a fierce challenge. The very existence of Weight Watchers, Nutrisystem, Jenny Craig and literally hundreds of other clinics, systems, and plans shows it, as does the $60 to $150 billion that Americans spend on weight loss. If weight management were easy it would be cheap, and we wouldn't need all those clinics and "life coaches" and the rest.

Now we can learn in great detail just what we are up against…if we really want to know. I suspect many folks don't want to! I am not sure if I am happy about knowing, either. Like it or not, I just finished reading The Secret Life of Fat: The Science Behind the Body's Least Understood Organ and What it Means For You, by Sylvia Tara, PhD.

You read right: Dr. Tara calls our fat system an organ. It is the largest and most complex endocrine organ in our body, except perhaps for a very few people who cannot deposit fat and as a consequence must eat tiny meals two to four times hourly to stay alive and comparatively pain-free. Do you think you'd like to be truly fat free? Without a system of depositing fat, which our liver and other organs produce continually, the blood gets milky with circulating lipids that just go 'round and 'round until they are used up by metabolic processes. Heart attack at a young age is the typical fate. People with this affliction who try to eat "more normally" wind up with painful lipid deposits in the skin, rather than normal layers of healthy fat, and in either case they look like walking skeletons, like it or not.

Fat does a lot more than regulate our energy stores. As an endocrine organ, it communicates with the rest of the endocrine system, regulates appetite and metabolism, determines our fertility (or its lack), and stands ready to help us stave off a famine. In babies, the "brown" and "beige" varieties of fat produce extra energy to keep the little body warm. When you have the weight-to-skin area ratio of a house cat, but no fur, you need to produce a lot more energy per pound to keep from freezing to death at "normal" temperatures in the 70's (or the low 20's in Celsius). A strange therapy that turns some "white" fat to "beige" fat is being tested to shift people's energy balance for weight loss. It promises to be even more costly than staying at a Mayo Clinic Weight Loss residence.

You may have heard of ghrelin, leptin and adiponectin. These are just three of the signaling molecules that make us hungry, or not. Leptin turns down our "appestat"; the others turn it up. Several other signals shift our cravings here and there. Others "tell" fat to deposit itself in our subcutaneous layer ("safe" fat) or viscerally ("dangerous" fat). Guess what can shift all of these in a healthy direction? Exercise. Lots of it. Nothing else is as effective.

Also, as we are only recently learning (partly because of a genuine conspiracy carried out 50 years ago), sugar is much more of a culprit than we used to think, more so than dietary fat, in making us fatter and making that fat less healthy. To be clear, trans fat is truly evil (and all of us who grew up eating margarine rather than butter must shed a tear here), and while saturated fat is a little better, and some is actually necessary, it has to be balanced with the mono- and poly-unsaturated varieties or it does cause problems. But excess sugar is the worst, and sugar substitutes, oddly enough, are almost as bad, because the insulin system kicks in when we taste sweetness, regardless of source. An insulin spike causes fat to be deposited.

During my last ten years working, I got in the habit of drinking about a liter of sugar-free cola daily (Pepsi Max had the best taste). Upon retirement I stopped drinking soda almost entirely, and lost 15 pounds. At first I thought the weight loss was because I was under much less stress; chronic stress also causes weight gain. But now I think it is probably at least half due to stopping my soda pop habit.

After nine chapters of the science of fat—and fascinating science it is—the last four chapters are the "how to" section. The author is a woman descended from an ethnic group in India that endured repeated famines for millennia, and both facts work to make her metabolically fitted to gain weight and hold it, waiting for the next famine. She has also done a certain amount of yo-yo dieting. Guess what? If you have never been overweight, you have a metabolism that matches the calculations at sites such as the Basal Metabolic Rate Calculator. There I find that my basal metabolism is about 1,750 Cal/day, with a dietary intake need of between 2,400 and 3,000, depending on how active I am. Were I female, these numbers would be 1,550, 2,050 and 2,650. (Note: I rounded the numbers from the overly-exact calculations. Also, when I write Calorie, I refer to kilocalories; the calorie of physics is 1/1000 of a Calorie.)
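For what it's worth, many online calculators of this kind use the Mifflin-St Jeor equation; whether the site linked above uses it is an assumption on my part, and the height, weight, and age below are placeholders rather than my own numbers:

    # Mifflin-St Jeor resting-metabolism estimate, the formula behind many online
    # BMR calculators (whether the linked site uses it is an assumption).
    def bmr_kcal_per_day(weight_kg, height_cm, age_years, male=True):
        base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_years
        return base + (5.0 if male else -161.0)

    def daily_need(bmr, activity_factor):
        # Typical multipliers: ~1.4 for sedentary, ~1.7 for quite active.
        return bmr * activity_factor

    bmr = bmr_kcal_per_day(weight_kg=90.0, height_cm=178.0, age_years=70)  # placeholder inputs
    print(f"BMR  ~ {bmr:.0f} kcal/day")
    print(f"Need ~ {daily_need(bmr, 1.4):.0f} to {daily_need(bmr, 1.7):.0f} kcal/day")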

What if you have dieted, and regained your weight? Your fat system changes, permanently, so that maintaining the weight now requires fewer Calories, a lot fewer (20-30%). So, you struggled with a 1,200 Calorie per day diet and lost 25 pounds. You used to eat 2,400 Calories daily. If you go back to a 2,400 Cal/day diet, you'll gain it all back, and then some. You'll even gain it back, more slowly, at 2,000 Cal/day! If the BMR Calculator says your dietary need at the weight you want to maintain is 2,250 Calories, you'll actually now barely be able to hold the new weight at 1,700 Cal/day. That is a cruel fact of weight loss-and-regain.

Chapter 12 is titled "Fat Control II: How I Do It". Dr. Tara eats no dinner. Ever (hardly ever!). She chronicled, almost pound-by-pound, how she lost a certain amount of weight over about a year, and how she did it using a "partial fast" of no food intake for 18 of the 24 hours a day, and small high-fiber meals in that 6-hour "eating window". She also boosted her activity level, mightily. She recommends 5 workouts per week of 45 minutes' duration, sufficiently vigorous to make us sweat and have a hard time talking (none of this treadmill-walking while holding a conversation on the phone!).

I decided to check something. I used the short-form Longevity Calculator at Wharton twice, changing one answer between runs. The first time, I put "1-2 workouts per week", the second "5+ workouts per week". My life expectancy in the first instance is 91, and in the second 92. In either case, the tool reports that I have a 75% chance to live beyond age 84. Going back and changing activity to "rarely" returned a life expectancy of 90. Hmm. I am 70 now. If I hold up, and am able to do those vigorous workouts 5x/week, I'd spend an extra three hours weekly working out. That is 156 hours yearly or, in 15 years (until age 85, when I'd probably have to slow down!), 2,340 hours. My waking hours in one year are 6,570 (I sleep 6 hours on average, in spite of trying to stay abed longer).
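The arithmetic behind that trade-off, as a tiny sketch using the figures in this paragraph:

    # Hours invested in extra workouts versus the one year of life expectancy gained.
    extra_hours_per_week = 3          # going from 1-2 workouts to 5 of ~45 minutes each
    years_of_workouts = 15            # age 70 to 85
    waking_hours_per_year = 18 * 365  # sleeping ~6 hours a night

    hours_invested = extra_hours_per_week * 52 * years_of_workouts
    print(f"Hours invested: {hours_invested}")                                            # 2,340
    print(f"Fraction of one waking year: {hours_invested / waking_hours_per_year:.2f}")   # ~0.36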

So I can gain another year of life if I spend about a third of it working out. Would I be healthier? Certainly, as long as I don't tear up my body doing all those workouts. I'd have to get into it gradually. So it is likely that those 15 years would be pretty good ones. On the other hand, it would have to go hand-in-hand with less eating, meaning I'd be living with being chronically hungrier. That is not an easy choice, but this is the kind of cost/benefit analysis we need to do. Unless the FDA approves an economical form of Leptin treatment to help us manage appetite, it's the best hope I have of being svelte again. That's mildly depressing.

Today's spiders are Polish

kw: blogs, blogging, spider scanning

When I logged in a few minutes ago I noted a big spike in traffic had occurred about 4:00 AM my time. Focusing on the past 24 hours showed that 123 of the roughly 160 "hits" in that period were from Poland, as seen here:
It appears that the person or entity in Poland also favors the Safari browser on Linux, which leads me to believe it is a server running an automated script, that is, a spider. Hmm. Now, on to what I logged in to do…