Tuesday, April 29, 2014

Where are the limits of human-animal rights?

How far should we extend consideration for animal rights, in particular with regard to chimpanzees or other 'higher' primates?  Views on this vary from those who argue for the necessity of using non-human primates to understand human disease, behavior, or evolution, to those who argue that as our nearest relatives, chimpanzees should enjoy the same rights that humans enjoy.  In fact, the New York Times had a story on Sunday about the first chimp suing its owner for abuse.  Federal regulations have changed on this over the years, currently erring on the side of more rights rather than fewer, but this hasn't always been so.

Now Svante Paabo, a pioneer of sequencing ancient DNA from fossils and a leader of the first efforts to sequence Neanderthal fossil DNA, has written a thoughtful NY Times piece about the culpably casual rhetoric about 'cloning' a Neanderthal that has followed the sequencing of several Neanderthal genomes.  The widespread headlines by promoters of this sexy idea usually omit important technical details that clearly show the idea is not actually cloning a Neanderthal (Svante briefly itemizes some of these), but that's not our point here.

Neanderthal; BBC website

The Neanderthal is treated in the usual stories, even by prominent investigators who should know better, as some object of study, as an 'it'.  But we know that Neanderthals and other populations that once existed, or at least are identified as populations from some sample fossil finds, could interbreed with the direct ancestors of us modern humans.  Or is that the way to put it?

If 'they' bred with early 'us', then what makes them 'they' rather than others of 'us'?  The answer is that this is a misrepresentation of the population dynamics.  What happened, based on current knowledge and interpretation, is that two groups of our ancestors met and inter-bred.  The only reason one is called 'them' and the other 'us' is that the vast majority of the DNA you and we carry around seems to have come from the ancestors referred to as our own, and only a minority from the Neanderthals, who are often treated as a truly different kind of life--a different species.  This shows some modern arrogance on our part, as if they were some sort of strange interlopers into our noble lineage.  Instead, they were just part of our lineage.

Is old-fashioned racism lurking beneath the jolly stories here?
Of course, in our current scientific system, including news and other attention-seeking media, we quickly see evaluations of what functionally important genes 'they' gave to 'us'.  (Anything we gave to them must have been rather devastating, if the interpretation that they all died out is correct.)  Whether the list is of 'good' or 'bad' genes, is the assignment of such genes as being owned by one or the other ancestral group an unmixed scientific assessment?

Or, by classifying the groups as us and them, and then saying 'they' had this or that genetic variant that they gave to 'us', are we carelessly indulging in just another form of group generalization, another form of value-judgment racism?  Do we blame their inferior nature for the 'bad' genes, for type 2 diabetes, say, that we got by some careless intermixture with 'them'?  If genes they gave us were superior (say, high-IQ genes such as found in certain racial groups today--you can fill in the blanks, we're sure), then why did they die out while we persevered?  If we won but they had the superior genes at the time of the encounters, then why doesn't this make people think twice about ritualized Darwinist assertions of how who adapted to what?

If it was a few of their good genes mixing with our overall-better genomes, then why didn't the offspring, who were 50-50 in terms of ancestry, lend superiority to their part of the interacting groups?  Of course, you can make up any story you want, especially if you believe that the 'disease' or 'superior' variants really are so--we readily know how vulnerable GWAS and other genetic value assessments are proving to be to over-generalization.

So perhaps there's the additional issue not just of animal/human distinctions, but of careless talk reinforcing the racism always latent in the human heart.

But even that is not our point today.

What about the clones?
Let's suppose that the modern Dr Frankensteins can bring the Neanderthal 'It' to life.  How will that be done?  First, with present-day technology, there will have to be a very low-tech participant besides the Neanderthal DNA: some woman to carry the fetus.  It probably can't be a chimp, should one 'volunteer', because Neanderthals were so essentially human that a human uterus,  pelvis, and physiology would be needed to carry the new baby.  Well, there are probably plenty of people who'd volunteer for the job.  So, forgetting that minor obstacle, what will occur once the venerable Dr F has the bundle of joy in his lab?

Will it be 'human'?  Svante likens the cloning to doing that with his deceased grandparent's DNA and points out how rather spooky and wrong that would be.  Here, there would be no personally known relative involved.  But would the cloner, our kindly Dr F, have ownership of the baby?  Would the baby be considered an animal or a person?

So, would the beneficent Dr F have the right to keep little It in a cage?  Or, say, put It on display in his lab or in the (say, naming no names) nearby Boston zoo?  Would It be housed alone, or with some other animals... and if so, which type of animals as Its roommate(s)?

Would scientists have the right to poke, probe, and experiment with Neando?  To draw blood or even snip out a few tissue biopsies when they wanted to (often enough to generate a stream of Nature papers)?   Subject it to CT or fMRI scans?  Why not?  After all, this is science! 

Or, alternatively, would little Neando automatically have civil rights as a human?  Born in some august university's city here, would it have citizenship?  Would it have the right to a real foster home, with human parents?  To be taught language (not as an experiment by some always-well-meaning NIH-funded psychology professors, but by a real adoptive parent)?  Would it have the expectation, indeed the right, to go to school?

When it reached adulthood, assuming it wasn't so immunologically vulnerable that it would quickly die of some sort of infection, or malnutrition by being fed our typical diabetogenic diet, would it have the right to vote?

The real test of ethics
The real test of ethics, indeed the basic core of ethics, is when it restrains you from doing something you'd really like to do.  Otherwise, it's basically a feel-good game.  So, how seriously, no matter how many ethics seminars we may have been certified as having attended, do we take human rights?  On which side of the sadism line do our views lie?  When is it time to stop superficial speculation, or even joking about all of this in major news and journal media, and instead take a stand and draw the line, a line that one simply may not cross?

Monday, April 28, 2014

Cover of Science, and a cover-up?

The 18 April issue of Science has a nice artist's rendering of the planet Kepler-186f, about which we recently blogged.  We pointed out how every time a new 'habitable' planet is found by NASA, its PR machine makes sure the eager news media trumpet the dramatic news.  What could be better than a fake image on the cover--that is, an artist's rendering, totally imaginary, a realistic rather than impressionistic painting?  Could one suggest that that is misleading?

April 18 cover of Science

On page 229, the blurb about the Cover, which repeats the painting, says that the discovery of Kepler-186f 'confirms' that Earth-sized planets exist around other stars.  Of course, we recently had some similar claims or 'confirmations' of something like 750 specifically-identified 'habitable' planets.  So what this actually confirms, a skeptic might allege, is that Science needs cover material.

To milk this even further, there is a Commentary about this major discovery.  So what does it say?  In a very nice figure, it lists three basic attributes that would be needed for this planet to be (as their heading says) a "Place Like Home".  Oh wait!  We left out a word!  The heading actually says "No" place like home!  This planet is of the right size, for those of us dreaming of going to a planet on which the various resort locations are within a Boeing 747's range for holiday package trips.  And it is in the temperature-defined 'habitable' zone, which means there could be water and hence beach resorts to go to.  But, sad to say, it is not orbiting a proper sun-like star!  No tans!

And, sad to say, even the authors of the Commentary raise several scenarios by which even this juicy find might be so spookily misbehaved in various ways as not to allow life as we know it (or, perhaps, even as Bela Lugosi knew it).  So, when you get right down to it... forget it!  But these caveats, well, they're rather buried under the covers of the evocative imagery.

Even the Commentary itself says, rather passively, "Unfortunately, the planet is too far away from Earth for follow-up studies.  However, researchers hope it heralds many similar worlds soon to come."  Well, if NASA gets continued funds for this Hollywood-like exercise.  And is the very phrasing designed to lure gullible readers of Science (or of NASA releases) to think that next time--soon!--we'll find one we can visit?

In our recent post we guesstimated that if they're as common and easy to find as recent reports out of NASA suggest, then there must be billions of such planets in our galaxy alone, and many trillions of them out there in the universe.  A year or so ago we mused about what travel to, or even communication with, a closer (22 light-year) planet would actually imply.
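For what it's worth, the arithmetic behind such a guesstimate is simple.  The inputs below (star and galaxy counts, the fraction of stars hosting a 'habitable' planet) are loose order-of-magnitude assumptions for illustration, not measurements:

```python
# Back-of-the-envelope version of the guesstimate above.
# All inputs are rough order-of-magnitude assumptions, not measured values.
stars_in_milky_way = 2e11      # commonly cited order of magnitude
frac_with_habitable = 0.05     # assumed fraction of stars with a 'habitable' planet
galaxies_in_universe = 2e11    # order-of-magnitude assumption

per_galaxy = stars_in_milky_way * frac_with_habitable
in_universe = per_galaxy * galaxies_in_universe

print(f"{per_galaxy:.0e} habitable planets per galaxy")   # billions (~1e10)
print(f"{in_universe:.0e} in the observable universe")    # far more than trillions
```

With any plausible choice of inputs, the per-galaxy count lands in the billions, which is the point of the guesstimate.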

Worthy science doesn't need to be blared to the public in misleading ways. Because then it's not science, it's advertising or entertainment.

Friday, April 25, 2014

Another view of the academic 'problem'

Many of us are harping on the corrosion or corruption (or just change, depending on your point of view) of universities and their missions.  Tuition costs keep rising, many students find themselves deeply in debt upon graduation, and yet the benefits, in terms of courses taught by actual tenure-track professors, breadth and depth of education, and job opportunities at the end, are shrinking.  In essence, much of what has happened is that universities have grown to become institutions that charge customers (students) but are structured largely to serve their own (the faculty's and administrators') interests.  At least this is the short-run picture, and the trend has grown palpably, at a seemingly accelerating pace, in recent decades.

Here is a good statement of the issues written by an agricultural economist, Robert Taylor, at Auburn.  We generally agree with what he says.  But what if we take a broader view?

Is this just the 'good old days' phenomenon?
As we've asked in some recent posts on this timely topic, are these sentiments accurate, or is this just the old and disgruntled complaining because they're being left behind?  The people in the linked article and its attached video are all apparently successful (based on their titles, at least).  This is clearly the case for the article's author, Bob Taylor, whom we happen to know.  In a recent interview, Nobel prizewinner Sydney Brenner, not just a soured loser, says much the same.  In our more modest way, we ourselves have enjoyed essentially unbroken funding for about 40 years.  We had ample funding from the major organizations, and our resulting papers have been published.  I personally have held a 'distinguished' and then a major endowed professorship here at Penn State.  Which is all to say that my feelings about academia today are not at all due to bitterness after an unsuccessful career.

Well, you might say, even successful old people tend to gripe about change, and perhaps because they were successful they get the idea that others want to hear what they have to say, that it somehow gives them a disinterested view of things.  But they may just be nostalgic grumps anyway.

Those struggling for employment today, a higher fraction than in the past according to surveys, can answer whether the system is doing what it should.  Those younger faculty who can't get a grant might have a view.  Lowly-paid instructors might want to say whether this is just nostalgia or is real.  Is the system broken?  Were they exploited, or are they being left behind in a way that their mentors could have foretold?

On the other hand, as we've recently said, one could ask whether our culture is just evolving at present into a more cruel and inequitable form, not so different from what prevailed for most of human history, in which there was a small elite, a large lower class, and less in the middle than we've recently seen.  Maybe the post-WWII era was an historical anomaly and we are just settling into a different way of doing things, feeling surprised or stunned by it because it doesn't currently jibe with our mythology (that we are all equal, that we live in an effective democracy, that higher education will benefit everyone, that research is all about the public good, etc.).

In a similar vein, perhaps, from the point of view of science itself, we may be seeing a struggle, a building of an elitist research hierarchy with resources coalescing into ever-greater power centers, but one that will eventually generate more ultimate public good than what has led up to it.

Only the future will tell whether we are pouring funds into low-payoff science the way we poured our national treasury onto Viet Nam in the 1960s and '70's (or you can easily identify your favorite government or industrial wasteful aspect of our culture).

But whether or not this might be a good way for long-term societal gains, say in health, it is not going well for many people today.  They are actual flesh and blood, not an abstraction such as our 'culture'.  Similar things can be said about the current widespread response (or lack of it) to societal needs related to climate and agricultural sustainability: like the pyramids long ago, the broader monumental industrial structure is being built by the few on the backs of the nameless many.

Thursday, April 24, 2014

Is it competition vs cooperation; or, cooperation lets competition be?

Since Darwin, emphasis in evolutionary theory has been on competition between individuals and species in the race for optimal fitness -- he or she who passes the most genes to subsequent generations wins.  Darwin saw this through the lens of the rampant cruelty of Nature, and the need for individuals to find food and mates and escape being eaten.

To many, this is a fundamental underpinning of evolution, although in recent years a number of evolutionary biologists have begun to think that perhaps cooperation deserves a larger role.  We have done our part, in our book (titled, in fact, The Mermaid's Tale: Four Billion Years of Cooperation in the Making of Living Things) and here on MT and elsewhere, though, despite our best efforts, competition remains the predominant view of how life works.

In our book we suggest that cooperation is a fundamental principle of life, arguably much more pervasive and important than competition because it happens at all levels all the time, from minuscule intracellular spaces to grander ecosystems, instantaneously as well as over evolutionary time.  It is, we think, hard to argue that competition plays such a central role.

A friend and sometime co-blogger Reed Goodman alerted us to an interesting piece the other day in The Baffler, "What's the Point if You Can't Have Fun?" by David Graeber.  Graeber, currently Professor of Anthropology at the London School of Economics, believes that behavioral scientists have gone over the top in arguing that there must be a rational purpose to every animal behavior.  
I’m simply saying that ethologists have boxed themselves into a world where to be scientific means to offer an explanation of behavior in rational terms—which in turn means describing an animal as if it were a calculating economic actor trying to maximize some sort of self-interest—whatever their theory of animal psychology, or motivation, might be.
Instead, why can't they just be having fun?  

Graeber notes that with the discovery of genes, evolutionary theorists quickly adopted the idea that everything animals did was in the service of passing along their own genes, an idea popularized by Richard Dawkins in his book The Selfish Gene, but also widely accepted within evolutionary biology.  Indeed, evolutionary biologists tend to smell a competitive rat everywhere, nurturing the view that everything animals do must be adaptive, naturally selected, and for the purpose of out-reproducing the competition.

This of course raises problems like altruism, cooperation among non-kin, animals sacrificing their own life for someone else, and so forth, but these have generally been hand-waved away with we would say rather contorted arguments that reframe kindness, cooperation and self-sacrifice as just competition in disguise.  Of course, that implicitly makes a tautology of every such explanation, based on an axiom -- an assumption -- of pervasive selective determinism.

Graeber isn't at all a fan of this strict view of biology and evolution.  His essay is wide-ranging in scope, from inchworms dangling in air for the sheer fun of it, to the historical context in which the idea that the purpose of life is the propagation of DNA could gain purchase (did our genes thus make us invent the PCR machine, for unlimited propagation of DNA?), to the discussion of free will and consciousness.  It is provocative and well worth a read.

But it was Graeber's mention of a 1902 book by Russian naturalist Peter Kropotkin (1842-1921) that most caught my eye.  In Mutual Aid: A Factor of Evolution Kropotkin argues that Darwin was wrong to place so much emphasis on competition, because cooperation -- mutual aid -- is so obviously in evidence all around us.  The idea of the struggle for life as a 'law of nature' was something he just couldn't accept because, as he wrote "...I was persuaded that to admit a pitiless inner war for life within each species, and to see in that war a condition of progress, was to admit something which not only had not yet been proved, but also lacked confirmation from direct observation." 

As a naturalist, Kropotkin spent much time traveling and observing nature. In Mutual Aid he documents  evidence of aid over conflict among animals, in humans and throughout human evolution and history, writing:
As soon as we study animals -- not in laboratories and museums only, but in the forest and prairie, in the steppe and the mountains -- we at once perceive that though there is an immense amount of warfare and extermination going on amidst various species, and especially amidst various classes of animals, there is, at the same time, as much, or perhaps even more, of mutual support, mutual aid, and mutual defense amidst animals belonging to the same species, or at least to the same society.  Sociability is as much a law of nature as mutual struggle.  
But which comes first, evidence or interpretation?
Kropotkin was a prominent figure in 19th century activist politics.  He was, according to the wisdom of the masses, a "geographer, economist, activist, philologist, zoologist, evolutionary theorist, philosopher, writer and prominent anarchist." (Wikipedia.)  He was sympathetic to the plight of the peasant in Russia as a young man, and to socialist ideas, though he eventually settled on anarchism and as a political activist, was imprisoned for subversive activities in 1876.  He escaped from prison before his trial, however, and fled to Europe, only returning to Russia after the revolution in 1917, enthusiastic about the changes he saw happening, though eventually disillusioned by the authoritarian socialism that the revolution became.

Kropotkin disliked capitalism and the idea that life must be a struggle.  As an anarchist, he preferred to believe that humans were capable of mutual aid and cooperation, and that we could effectively run our own societies.  On the other hand, competition was in the cultural air when Darwin was doing his thinking: the British empire dominated much of the world, the industrial age and the rise of capitalism were beginning, and the economics of Thomas Malthus was a strong influence on him.  So it was perhaps natural that Darwin, and Wallace too--and indeed Richard Dawkins in the 1970s--framed their theories of evolution in terms of competition.

One can assert that if Kropotkin was driven by his ideology to see in Nature what his filters allowed him to see, then the same certainly applies to the Darwinians and even to the gentle Charles himself.  If Darwin's view prevailed in the west, the cooperation-based views of Lysenko prevailed in the Soviet Union, with disastrous consequences for science.  But viewed in its context, these polarities are understandable. 

What does this say about which view is right?
I don't know.  Ken and I thought we were writing The Mermaid's Tale about biology.  As we wrote in the book, competition and cooperation are laden words, but we explicitly chose 'cooperation' as an antidote to 'competition' with all its political and cultural meaning.  More neutral and scientifically appropriate terms, like 'successful interaction' and 'differential proliferation', would serve science better and be less a matter of defining everything before it's observed.  However, our intention was to describe cooperation not just as kindness, but as cooperative interactions among genes, organelles, organs, organisms and species.  In that context, we had little to say about culture, except insofar as we would argue that culture generally, and manifestly, trumps genetically driven behaviors.

So, I was surprised (and of course pleased) to see a recent review of our book on Amazon that says, among other things, "Would more anthropologists and policy makers read this…".  It's a favorable review, so presumably the author sees political and cultural meaning where we were explicitly only intending to describe biology.

But that's okay, and as it should be.  Science is always done in cultural or political context.  To a great extent, we see what we believe.  

Wednesday, April 23, 2014

Genomics in microbiomic clothing: The next 'hula hoop'?

So we've just been through 20 years of genomics: GWAS based on genetic markers tested in samples of ever-growing huge numbers of cases compared to hordes of controls.  We know that mapping even in simple organisms, including flies, yeast, and bacteria, shows genomic causal complexity.  We know that whole-population genome sequencing is the next way the marketeers will promise immortality.  We know this teaches us lessons for biology that we don't want to listen to.

We know that even without the exotic technology that makes all this possible, there are relatively strong, simple genetic causes of traits--this goes back in a formal sense to Mendel, and even GWAS, though often thought-free, can find these strong effects and will continue to find them here and there.  This, even though after all the effort of 20 years or more we still don't really understand the traits for which the mirage of an oasis of solutions has failed us (obesity, diabetes, schizophrenia, asthma, autism,.....).

So this is not no-yield science even if it is science of diminishing returns.  And it's our current way of doing business (and, perhaps, 'business' is the right word for it?).  It's our 'anthropology', how our culture works.  Big Ag is doing it, the military is doing it, NASA is doing it.   Big Data and other Big Projects are the order of the day and, to be blunt, the way to secure large long-term funds in a fickle market.

But as at least some are tiring of the over-marketing of all of this, the next genomic hula-hoop fad down this particular technology's line looks like it's the microbiome.  DNA sequencer makers can sell machines for this type of study, if the GWAS market shrinks.

But isn't it basically just the same?

We will find that the community of bugs in this or that part of your body will differ over time, will differ among sampling sites (this or that part of the skin, gut, or wherever).  It is even more complex than genomes because not only will we have the DNA sequences of large numbers of recognized microbes, but the microbial species will vary within their population and we'll have their relative frequencies in our samples.  In other words, from each person we get one genotype, alleles present or absent (or present in one or two copies), but for microbes we'll have their relative frequency in the sample (how many times their sequence was identified in the sample).  So there will be another very major variable added to the mix.
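The data-structure difference described above can be made concrete with a toy sketch (the taxa, SNP names, and counts below are invented purely for illustration): a host genotype is a single discrete allele count per site, whereas a microbiome sample is a vector of read counts, usually normalized to relative frequencies, which adds a continuous variable that genotypes lack.

```python
# Host genotype: one fixed value per person per site -- an allele count of 0, 1, or 2.
# (SNP names and values are hypothetical.)
host_genotypes = {
    "person_A": {"SNP_1": 0, "SNP_2": 2},
    "person_B": {"SNP_1": 1, "SNP_2": 1},
}

# Microbiome sample: read counts per taxon, converted to relative frequencies.
def relative_frequencies(counts):
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

# Made-up read counts for one gut sample.
gut_sample_A = {"Bacteroides": 620, "Firmicutes_sp": 340, "E_coli": 40}
print(relative_frequencies(gut_sample_A))
# {'Bacteroides': 0.62, 'Firmicutes_sp': 0.34, 'E_coli': 0.04}
```

The frequencies also change with time and sampling site for the same person, which is the extra layer of variation the post is pointing to.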

Everyone will differ in these regards, within and between populations, with age, sex, all sorts of environmental variables, and over time.  And while these variables may be enumerated one by one, in fact their dynamics involve interactions among them (often, probably, more than just two-species interactions in complex microbiome ecologies).  And there will be interactions with the host's genome as well.  If interaction among genomic sequence elements has proven largely intractable, wait till you see what the Wizards of Microbiomics turn up and start touting to the media.  This, if anything, promises to make genomic variation look rather simple!

Of course, we will learn a lot in this endeavor.  And, as with genetics, there will be strong and important signals related to human, animal, and plant well-being.  Some of these we know of already (e.g., some nasty strains of infectious bugs, E. coli, and so on).  Many would not require expensive exhaustive enumeration studies to find.  Just as with genetics, there will be successes.  But, we think, just as with genomics, we can already see the over-selling and the faddishness and band-wagoneering of microbiomics (and, similarly, of epigenetics).  Can basically the same knowledge come with fewer, more focused and disciplined microbiomic studies?

Perhaps we are over-reacting here, or perhaps this is just how humans, individually pursuing careers, behave.  To get the minority of brilliant discoveries, perhaps the price to pay is the sea of not-so-brilliant incremental work.  Only history will filter through the din of self-promotion to show what was actually important and in itself actually worth paying for.

If this is just how we are, in science, the arts, and business, then progress is expensive, a fact we have to live with.  Of course, if resources were not captured by this academic welfare system, many other people could actually live, literally, or have better health and living standards.  Priorities on what to do with resources are societal, and we acknowledge that the sentiments we express present a political view.

But in terms of science itself, one could discuss whether there might be a better, more focused and modest, and less costly way to enhance creativity per dollar invested.  We and others have been writing with occasional suggestions about how funding could be less institutionalized, and we recently asked whether all of those ideas are rather irrelevant to the societal processes that will actually make change, when and where it happens.  Repetition and persuasion are perhaps essential parts of the jockeying 'game' of resource distribution.

Meanwhile, we'll be treated to the predictable years of breathless fantastic unprecedented paradigm-shifting discoveries in microbiomics.  Eventually, its dust will settle.

Tuesday, April 22, 2014

Microbiomes and complexity

Obesity -- the more we know, the less we seem to know -- or, at least, the more complicated it gets.  But take heart!  This is turning out to be true of much of biology.  The more we learn about cellular mechanisms, how genes work, gene networks, the effects of medications, the relationship between diet and disease, the effects of environmental exposures on risk, and so much else, the better we understand that reductionist science is not necessarily the best way to explain cause and effect.  Why are some people obese and some aren't, why can't a single genetic variant often explain much, why do some people benefit from a given medication and some not, can we predict who will get which disease, and so forth?  It's complicated.  But absorbing that message can be the first step towards better understanding.

A piece in the April 17 Science, "Microbiome: A Complicated Relationship Status" by Sarah Deweerdt, elucidates this well.  "Nothing is simple about the links between the bacteria living in our guts and obesity," Deweerdt writes.  Studies comparing the gut microbiome of obese people with that of thin people have shown marked differences between them.  Indeed, researchers have shown that "...microbial genes sort the lean from the obese with 90% accuracy, whereas looking at human genes yields the right answer only 58% of the time."
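The 90%-versus-58% comparison is a classification accuracy: how often a rule trained on gene profiles assigns a held-out person to the correct lean/obese group.  The cited work used far richer data and methods; the accounting can nonetheless be sketched with synthetic data and a simple nearest-centroid rule (everything here -- the numbers, the three 'genes', the classifier -- is an illustrative assumption, not the study's method):

```python
import random

random.seed(0)

# Entirely synthetic 'gene abundance' profiles: obese samples scattered around
# one centroid, lean samples around another (made-up, for illustration only).
def sample_group(center, n):
    return [[c + random.gauss(0, 0.5) for c in center] for _ in range(n)]

obese = sample_group([1.0, 0.0, 1.0], 50)
lean = sample_group([0.0, 1.0, 0.0], 50)

# Train on 40 of each group; evaluate on the 10 held out from each.
train = [(x, "obese") for x in obese[:40]] + [(x, "lean") for x in lean[:40]]
held_out = [(x, "obese") for x in obese[40:]] + [(x, "lean") for x in lean[40:]]

def centroid(points):
    return [sum(col) / len(points) for col in zip(*points)]

centroids = {
    label: centroid([x for x, lab in train if lab == label])
    for label in ("obese", "lean")
}

def predict(x):
    # assign each profile to the nearest class centroid (squared Euclidean distance)
    return min(centroids, key=lambda lab: sum((a - b) ** 2 for a, b in zip(x, centroids[lab])))

accuracy = sum(predict(x) == label for x, label in held_out) / len(held_out)
print(f"held-out accuracy: {accuracy:.0%}")
```

The point of the sketch is only what the percentage means: it is a property of a predictor evaluated on people whose status is already known, which is why, as the next paragraph says, it is not by itself predictive of cause.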

Of course, this isn't predictive, it's the microbiota of individuals who are already obese.  Whether obesity is caused by obesity-related gut flora or gut flora are a by-product of obesity isn't yet known, though a number of experiments with mice, including this one ("The gut microbiota as an environmental factor that regulates fat storage", Bäckhed et al., PNAS, 2004), suggest that gut flora might in fact be causal.  A 2014 study of the effects of pre- and probiotics on lipid metabolism and weight suggests the same, as do a number of others in the intervening decade. Of course, even if that's the case, genomic and microbial and other environmental factors interact: none is 'the' cause by itself.

To test the causal relationship, Bäckhed et al. transferred the microbiota of obese mice to the guts of germ-free mice (born by Caesarian section into sterile environments).  Despite eating less than before the transfer, and expending more energy than the germ-free controls, the recipient mice showed a 60% weight gain by two weeks after receiving the microbiota from the obese donors.  However, they never actually became obese themselves.  And we wonder if this is specific to the strain of mice they used: how would results compare if tested comparably on many other laboratory strains?

Bäckhed et al. report direct evidence of metabolic responses to the presence of the new gut flora, including increased hepatic production of triglycerides and increased monosaccharide uptake from the gut, and "increased transactivation of lipogenic enzymes... The liver has at least two ways of responding to this augmented delivery of calories: increasing inefficient metabolism (futile cycles) and exporting these calories in the form of fat for deposition in peripheral tissues."

That is, Bäckhed et al. suggest, resident gut microbes help us efficiently store calories, but in the calorie-rich environment that western grocery stores and other food provisioners create, over-efficiency can lead to obesity.  

The "thrifty genotype" becomes the "thrifty microbiome"
It should not be ignored that this is the same argument that Jim Neel used in 1962 to explain the evolution of genetic predisposition to diabetes ("Diabetes Mellitus: A 'Thrifty' Genotype Rendered Detrimental by 'Progress'").  His idea was that genes and pathways for storing energy in pre-modern times to take people through times of famine become disease risks in our time of plenty.  But this paper has been cited, and 'thrifty' rhetoric used without much restraint, even after Neel basically acknowledged that the idea was oversimplified and didn't apply to the major adult-onset diabetes epidemic.  It's highly likely that the thrifty microbiome idea will prove to be overly simplified as well.  

The microbiome is a hot item these days.  Though, unlike 'the' human genome, no one has ever suggested that there is 'a' single microbiome, which means that the recognition of complexity has been there from the start, as it should have been in the genome project.  Nonetheless, we have to be careful not to bestow too much credit for depth of insight on the microbiome bandwagon: reductionist explanations for what the microbiome can explain are tempting, perhaps especially for the media.  So, it's nice to see Deweerdt giving attention to its complexity.

Indeed, Deweerdt cites researchers who believe that microbiota can only be considered part of a causal chain with respect to obesity.  What we eat influences the bacteria in our gut, and that in turn may influence our weight.  Germ-free mice, for example, didn't gain weight on a sugar-laden diet, suggesting that if sugar is obesogenic, it's because the bacteria in the gut make calories available from carbohydrates that we can't digest ourselves.  And gut bacteria can digest other components of what we eat that we ourselves can't, again increasing the number of calories we metabolize from certain foods.

As far as we know, no one is claiming that if the thrifty microbiome idea is valid, it will be the whole story behind obesity, even in a single individual.  To date, to be sure, the mouse results haven't been replicated in humans, and fecal transplants aren't causing weight loss.  But even if gut flora are to some extent involved in regulating weight gain or loss, some forms of obesity really will turn out to have a fairly simple genetic explanation, even if that varies between people, and some really will be due to energy imbalance (more energy consumed than expended).  And there will be other explanations as well, perhaps including a role for inflammation, which is turning out to be involved in many diseases and disorders, as well as combinations of all of the above, even in single individuals.

And the possible involvement of microbes only pushes the question back a step.  E.g., where do these obesity microbes come from, and are some people more susceptible than others?   

The more we learn, the more complicated it gets.  

Monday, April 21, 2014

Earths galore: we're getting closer...but to what?

Well, NASA's done it again.  They've found another exciting planet lurking in the depths of near space.  This time, the BBC proclaims, we have Kepler's find 186f (illustrated, even!), the best one yet and (maybe) it (could) be watery!  It seems that the news cycle isn't just 24/7, but longer: every time NASA releases the story of some newly found somewhat-earthlike rock, the news outlets pick it up as if it were the first time, and nobody seems to remember that we've seen almost the same thing many times before.  But if they can get their sales with re-runs, we can't be blamed for at least returning to this topic (e.g., we blogged about this when NASA reported the news of an exoplanet circling the star Gliese 581, as well as others), though hopefully with a little more that's different from NASA's releases!

Just like Earth! [in an artist's ebullient imagination]  Credit: NASA Ames/SETI Institute/JPL-Caltech
A planetary plenitude
This discovery is called by the ever-sober news media an 'earth twin' or as the knowledgeable NY Times puts it, 'perhaps a cousin' (whatever that means).  Sssh!  If you keep very quiet, you might be able to hear your Keplerian kin-folk talking!

 Well, we can overlook such verbiage since ours attempts to be a science blog.

Actually, the discovery of a plenitude of possible planets, or 'habitable' ones as they seem often to be referred to, is interesting and continues apace.  They now number in the hundreds and only a trivial fraction of the universe has been scanned, or is even scannable with available technologies.

These truly are interesting findings, though they are, surprisingly, not at all surprising.  After all, space is massively large and filled with a chaos of objects hot and cold, large and small.  If, as seems likely, Newton was right and gravitation is universal, then the small stuff will often be captured by the gravitational attraction of the big stuff. Big hot stuff (stars) can capture smaller wandering rocks and they'll end up in orbit.  Some even smaller rocks are captured by the pull of, and orbit around, bigger rocks (like moons around planets). Lots of other rocks and stars will be in all sorts of relationships as well.  But some of these will be special.

If we care about our sort of life, then we want what is being called a Goldilocks planet: like her porridge, the rock will be not too hot, and not too cold, not too wet and not too dry, but just right!  That is, there will be water and warmth enough to keep it liquid but not turn it all to steam, and other things of that sort.  There is where, we're told, we'll find the ETs.  Some day.

Now this is genuinely thought-provoking, but it needs none of the circus hype of the news media.  That's because it basically tells us what we already knew.  In fact, the actual facts are to us a lot more interesting than the Disneyfication.

We've previously discussed, in general terms, the idea that if there are an infinity of stars, galaxies, planets or universes, there would just as likely be all sorts of life on them.  Here, we can be a tad more specific than that.  If there are hundreds of planet-like things somewhat like 186f just here in our own local galaxy (the Milky Way), and we've really just begun looking and technically can only see some of what might be out there; and if what we know is largely confined to our own galaxy, which holds in the range of 100 billion stars; then thousands and thousands of those stars must have orbiting rocks.  There are around 100 billion other galaxies (give or take a few), and we can assume that each must have thousands upon thousands of the same sorts of rocks orbiting stars, and rocks orbiting around those rocks.

That is, even on the back of the proverbial envelope, one would estimate that there are at least 100 thousand billion habitable planets.  That is 100 trillion planets (100,000,000,000,000), as a minimal estimate.  And once we knew that there were 'habitable' rocks orbiting stars, Earth itself and perhaps one or two more even just around our own sun, it became likely that there are 100 billion or more earth-maybes in the Milky Way alone!  Of course, if you hold to Genesis, our Earth could be God's only watering hole, but once we had clear evidence of other possibles, a reasoning person must accept that these larger numbers become plausible.
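The envelope arithmetic is simple enough to write out.  This is only a sketch, and every input below is one of the rough guesses named above, not a measurement:

```python
# Back-of-envelope habitable-planet count, using the post's own rough numbers.
# Both inputs are loose assumptions, not measurements.

galaxies = 100e9               # ~100 billion galaxies, give or take a few
habitable_per_galaxy = 1_000   # "thousands" of habitable rocks per galaxy, minimally

total = galaxies * habitable_per_galaxy
print(f"{total:.0e} habitable planets, as a minimal estimate")  # prints: 1e+14 ...
```

One thousand habitable rocks per 100 billion stars is deliberately stingy; any more generous guess only inflates the conclusion.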

The point is that even without the Kepler and other telescopes scanning the heavens for these things, the totally convincing plausibility argument would be that the universe is awash in 'habitable' planets.

But, ETs?
Now the fact that there are lots of warm, wet rocks out there is one thing, but it doesn't imply that there is anybody living on them. However, life--even just our sort of life--is clearly possible because we're here living it as proof.  Given that,  even a modest kind of belief in natural science would lead one to believe that if you have 100 trillion tries, there really has to be some sort of life out there, and probably lots of it, even if it's only on a trivially teeny fraction of the habitable planets.

This of course does not address whether it's our sort of life in the 'intelligent' sense, or life based on DNA.  The fact that we are here is not quite so persuasive about that, because the numbers get astronomical (so to speak, but in the other direction--of smallness).  The number of nucleotides in earth-life's genetic history, from primal RNA to global DNA today, likely dwarfs even 100 trillion.  Each has arisen and/or later been changed with its own largely independent probability, each very, very small.  The net result is, in essence, the product of these probabilities (this and this and this...and this--the result--had to happen).  So the probability of going from primal soup to any given form of complex 'intelligence' over 3.5 billion years, that is, to our form of it, would be minuscule even relative to the number of potentially habitable planets.  This could mean that intelligent life arising more than once, even with so many trials, would be very unlikely, and thus that we are lonely in our uniqueness.

But even if others just like us may not happen more than once, there are countlessly many pathways to intelligence: after all, each human has a different genotype and there have been billions upon billions of us.  So it really is impossible to do more than muse about what the net resulting probabilities are.  To a great extent it depends on what we count as intelligent life.  To a greater extent, "Are we alone?" is hardly even a scientific question to ask.

Worse for NASA (and Disney) is that even here on Earth, where we know intelligent life has arisen, we've only been at it for, say, 1,000,000 years, being generous and depending on what 'intelligent' means.  But if it means having language and communicating by electromagnetic radiation (like radio), so we could communicate with ETs, that's only been about 100 years, and probably won't last much longer, either.  So the probability that, at any given time, smart life is present both here and in any other such place is a minuscule fraction of the time that life has been around on any of these lucky 100 trillion planets.

In that sense, large numbers don't nearly guarantee that there are smart anythings anywhere else.  The chance that us-like life is out there now, and that 'now' means we can communicate with it, becomes rather minuscule.
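That fraction can be made concrete with a Drake-style sketch.  All the inputs below are our own illustrative guesses, not anyone's measurements, which is exactly the point: the answer swings wildly with numbers nobody knows.

```python
# How likely is anyone broadcasting *now*? A sketch with illustrative guesses.
# None of these numbers is measured; the post argues they can't really be.

habitable_planets = 1e14        # the 100 trillion from the envelope estimate
p_intelligence = 1e-12          # pure guess: odds a habitable planet ever hosts 'smart' life
radio_window_yr = 100.0         # our own radio era so far: about a century
life_span_yr = 5e9              # rough span over which life could have existed there

expected_on_air = habitable_planets * p_intelligence * (radio_window_yr / life_span_yr)
print(expected_on_air)  # ~2e-06: on these guesses, effectively nobody on the air at once
```

Even granting a hundred intelligent-life planets, the brevity of the radio window (100 years out of billions) is what drives the expected number on the air at any moment toward zero.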

Forget about chatting!
In one of our previous posts, about Gliese 667C, we noted the problems with thinking that we could communicate with, much less actually travel to, such places (assuming we understand physics, like the limiting speed of light, correctly).

Kepler 186f is said to be about 500 light years away.  That means that a signal that we can pick up from there was sent when Da Vinci was painting the Mona Lisa.  If there was intelligent life there, and they're at all like us, they may well have obliterated themselves long ago.  But suppose they're peaceful (having evolved way beyond us), then just to send a friendly radio wave of "Hi!" to them, and get a wave back would take until the year 3014. By then most everything would have changed about human life here, with lots of world wars (though, of course, Republicans would still be trying to keep ordinary people from being able to afford a doctor).  Forget about chatting with the ETs!  Even Google will be out of business by that time.
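The timing arithmetic, for the curious, taking the quoted distance at face value:

```python
# Signal-timing arithmetic for Kepler 186f, assuming the quoted ~500 light years.

distance_ly = 500
this_year = 2014                          # when this post was written

sent = this_year - distance_ly            # a signal arriving now left in...
round_trip = this_year + 2 * distance_ly  # earliest year a reply to our "Hi!" returns

print(sent)        # 1514, about when the Mona Lisa was being painted
print(round_trip)  # 3014
```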

And as we said about Gliese 667C, which is a mere 22 light years away, 20 times closer than 186f, physically getting there would not be half, or any, of the fun.  It would be impossible in any practical sense, and even if we could actually do it, it would take millennia of space travel to reach 186f, and when we got there, there might be nobody still around to drop in on.

So, what is the purpose of the space probe?
Without being too much of a spoilsport, we have to ask about the purpose of this kind of probing, because up to a point this kind of exploration really is interesting and may now and then tell us important things about our universe (not likely to be comforting to dogmatic religion).  In a sense, for reasons we suggest above, the numbers tell us little that we didn't already have almost as strong a reason to know anyway.  It would take something like a Genesis literalist to think that there would be no other planets with life on them, or even that they would be very few.  And either we think these findings suggest the plausibility that forms of life must exist out there, or else the burden of proof should be on a denier to show how, in the face of these overwhelming numbers (and not counting theories about multiple independent universes), there could fail to be some such 'life' on lots and lots of planets.

Of course, this is really just science fiction, almost literally.  The vast majority of any such planets are, were, or will be millions or even billions of light-years away.  That means what we see today isn't there now, but was there eons ago.  Much of that light has been streaming here since before there was life on Earth--or even before there was an Earth!  Indeed, if a typical star's lifetime is around 10 billion years, much of what we see no longer exists as such; likewise, much or most of what actually is out there came into existence too recently (even if millions or billions of years ago) for any evidence of it to have reached us.

So, it takes either a television sci-fi producer, a NASA PR rep, or a real dreamer to think we could ever go there or really communicate with much or even any of what must be out there.  If we really thought anything like that, we should intensely be doing very down-to-earth studies to see if the speed of light and relativity are limiting factors or whether transformative new aspects of space itself remain to be discovered.

At what point is the research cost** not worth the number of people who could be fed by the same funds, and so on?  When does asking such questions make one just a killjoy, and when does it make one concerned for the problems, and the actual unknowns, on the one planet we actually can do something about?



**Or, as we've suggested before, if this really is mainly just entertainment, why not let the video or other entertainment industries pay for it?

Friday, April 18, 2014

Another way to look at the university research budget sand castle problem

Yesterday we noted that a clear awareness of crunch-time for university-based science and graduate training is 'in the air'.  This is the result of an oversupply of doctoral graduates and a shrinking level of research funding, and it's leaving young people, and even some older ones, high and dry.  It's also associated with the loss of support for higher education as a general societal good: legislators are backing away from providing funds to state universities.  One casualty is the loss of tenure-track jobs, which are being replaced by instructional serfdom.

These things reflect a more general turn towards individualism in our society--indeed, if you want to go to college, well, pay for it yourself!  But it's also a reflection of the self-serving college and university 'bubble' by which we have advertised 'education' and our research findings so heavily, to create societal demand, but without matching substance beneath it.

So many articles and blog posts and so on are being written to hand-wring about this.  We mentioned the Albert et al. PNAS commentary yesterday, written by experienced, senior people in science, but there have been many others.  We write in sympathy with the views expressed, and have, as have the authors of these and many other commentaries on this crisis, tried to suggest ways to get through trying times.

However, there is another very different way to look at this.

Social change must occur on its own terms
We, and the authors of bemoaning commentaries that make recommendations for how to face these problems, are generally senior.  What we naturally tend to think of, and to suggest, amounts to ways of returning to how things were done in the past: to how it was when we were young, in the system we came up in, liked, got used to, and would like to see make a come-back.  We did well during our decades and so tend to think we know what's right.  We naturally tend to propose changes meant to restore the status quo we knew.

But maybe that's wrong. Maybe we should not be listened to.  Maybe it's natural and right that we be put out to pasture.  We had what we view as halcyon days and they do seem definitely to have been gentler and easier than what is faced today.  But perhaps the solutions now will have to be different.

Perhaps lifetime tenure is obsolete.  Perhaps dependence on the grant system can't be reinstated, and major shifts in jobs will have to occur, and shouldn't be mourned.  Perhaps academic life will become less desirable, or will come to be something very different from what we elders knew and liked.  This is already happening, of course, not so much by design but because universities as businesses make decisions based on bottom-line considerations more than they used to, rather than what's best for scholarship or research or educational interests.  At least as we elders see those.

Perhaps, even, intensified competition is just the way it'll have to be.  Perhaps the capitalistic view that this is the hard-knocks way for society to thrive, trimming fat, intensifying effort and so on will become the norm.  Perhaps universities will have to shrink, professors losing jobs that don't really matter in the online world.  Perhaps the existence of excess labor pools--instructors who can't get tenure-track jobs and instead work by the hour when and where they can get jobs--is just going to be the way of the world because it is more economically 'efficient' (for society as a whole).  Perhaps this is a return not to the way it was for current elders, but much farther back, to the itinerant scholar days, ones who sing for their supper as individuals.  That's how it was in much of Classic times and the Middle Ages, after all.

In fact, it will just happen, however it happens.  Powers that be will struggle to keep things as they are and newcomers will struggle to change them, all in ways no one can really predict.  But perhaps, in one way or another, a gradual de facto return to some forms of social and intellectual elitism, along with income inequity, is the path of the future, even if we elders don't like it; we may already be seeing it happen.  Perhaps our ideas about 'democracy' are just naive.

Maybe we should just not be the ones invited to write editorials about this 'crisis': maybe it's a crisis only in the mirror on the past.

Perhaps instead, young people will somehow restructure things in a way we elders can't or don't envision, and hence could never recommend.  Maybe they and only they should be writing about this---or, more realistically, maybe this needs to be worked out, by them, through the social media rather than the stodgy outlets we elders tend to use.

Given the number of stressors on the system, however, much of the change and the resolution is likely to be unplanned and will just in some meandering or chaotic way be where universities find themselves when the dust settles.  

Whatever replaces our type of world will become the new status quo, the one the new elders mourn the passing of fifty years from now, as our generation fades into the sunset of the world we have known.

Thursday, April 17, 2014

Playing in sand castles is no game! Funding science

There are rumors that the proposed federal NSF budget will cut some areas (cut, not hold steady) by amounts well into double digits (around 20%).  That's a permanent cut imposed over just one year, we think, on top of the steady-at-best budgets of recent years.  And a new commentary by Bruce Alberts et al. in PNAS bemoans the similarly serious situation in biomedical (NIH-based) research.  These authors make many or most of the same points that we have often been making here (and we have not been alone by any means): these are not sour-grapes rants but widely perceived truths about today's circumstances in science.

The points have to do with the poorly justified if not selfish excess production of PhDs, the hypercompetitive funding and publishing environment that eats up too much time while it stifles creativity, the conservative and cumbersome grant system, administrative creep and so on.

How did we get into this situation?

In a way we got into this situation because the idea of an ever-growing economy ran up against the real world (that, ironically, science is supposed to be about understanding).  We could and should have known this, but nonetheless built a sand-castle research/university welfare system too close to the shore, and now the tide of the inevitable is about to wash into or over it.

Sandcastle in Singapore; Wikimedia

We smugly expanded at exponential rates even though any idiot (even a scientist!) knows that in the real world, as opposed to Disney perhaps, exponential growth must reach limits.  We behaved short-term and totally selfishly, building our programs, training ever more graduate students, asking for ever bigger grants, bloating our administrations, more hooked on overhead and soft-money salaries than a downtown druggie is on meth.

This was a university 'bubble' that we built, and our society bought into it.  Now we're getting our comeuppance.  It's too bad, because the most affected people will be the younger scientists, who are innocent of the greedy behavior we elders indulged in during our careers.  It is we who deserve the slap on the backside, but the bruises are falling on our students.  There are not many university jobs, and in many fields of scholarship, including hard-core and softer science as well as the non-STEM subjects, there is a stark choice: taxi-driving jobs compete with the prospects of a tenure-track job.

Universities are, often cravenly, saving money by denying tenure and hiring nearly unpaid adjunct instructors (but not reducing tuition accordingly, of course), and labs are laying staff off (to go compete for taxi licenses) because even some Dominant Baboon scientists can't get enough grants to feed their mills any more.

Now, we know that nationally, our Wall Street oligarchs treated themselves to a massive recession of which we, not they, were the victims, and they are getting off the hook for their evils.  But even forgetting that, the economy has had its downturn, as economies always do (the cycling tide of exponential growth).  So there is a constriction being laid on top of the overtly exponential-growth behavior of our universities.

In a downturn, there is a legitimate need to sort out priorities, which is less needed when everything is growing like Topsy.  Some areas have to be cut if we are to salvage what's really important. We here have often written critically of the puffed up, incremental rather than creative blowing away of large amounts of funding for various Big Data projects.  We've said that funding cuts might actually be a good thing if they forced people to think about their science rather than just buy more technology.  And both NIH- and NSF-related fields are guilty of devouring logs and spewing out sawdust.

But in a humane society, as ours should be, there should be a phase-out period of areas that are not delivering enough goods.  In our current system, however, there is so much lobbying and jockeying and self-promotion that this is not likely to be a humane process.  This we think is especially so if the cuts are quick, hard, and without much warning.

Either we'll continue with the brutally intense competitive environment, hostile to constructive interaction, in which we are already immersed in many areas of university science, or we'll have to bite some bullets.  We need to train substantially fewer graduate students.  Tenured faculty may need to do more actual teaching (fewer TAs).  We will have to scale back our labs to fewer post-docs and technicians, and may need to do more actual science ourselves.  We may have to be more selective and restrictive in what we do or propose to do.  Administrations will have to make do with fewer administrators, fewer shiny new buildings, lesser office furniture, and less addiction to overhead.  Medical schools may actually have to learn to pay their own employees (rather than relying on NIH to do that).

These changes, even if they occur, won't help those we've already misled into entering these fields, in which the impending crunch was not hard to see even years ago: they are the innocent victims.

We think what is needed, if it were possible, is a frank but non-partisan national discussion of what kinds of science and scholarship are most important and to phase in more funds for those and less for areas that, no matter how legitimate, are just less vital these days or less promising of major new discoveries.  We should consider academic employment practices and things like tenure and job security.  If they have to change, it should be in a phased way and not be punitive the way it is becoming now.

Alberts et al. suggest that we train them, our PhDs, for jobs other than in academe.  That's a great point, but if it's just an excuse for us to keep recruiting the same number of graduate students, it's a selfish ruse to preserve our business as usual, because we'd just quickly flood these other job areas if we did that.

The golden days of science (and scholarship--not all the important things in life are STEM things) may not be over, if we can behave properly and leave our six-guns at the coat-check.  But it does not seem likely to be easy or, worse, free of partisan politics unrelated to science itself.

What are you supposed to think, if you're a new graduate student, or a recent PhD?

Tuesday, April 15, 2014

STEMing the tide, part III: A (new) 'modest proposal'

We have been writing about the push in this country to strengthen the STEM subjects in education, science, technology, engineering and math, because of their financial, career, and material role in society. This is being done explicitly because when money is tight, subjects like the arts, humanities, and social sciences don't pay direct benefits.  This can be seen as inexcusably crass, but in a tight job market and culture increasingly embedded in things technological, with weakening public support for education, it is an understandable trend.

We  happen to be Luddites in this regard, perhaps, because we think that our society should not back away from the more literary, esthetic, and contemplative aspects of life.  This is not snobbery on our part, or at least not only that, thinking that everybody ought to love watching opera or reading the Iliad. The point is a societal one.  Much of our culture, such as pop music, sports, video games, chat sites, and the like are called 'popular' in part because everybody likes them (opera once was 'popular'). But the point here is that you don't need formal education to be exposed to them, indulge in them, or appreciate them for their values.

Appreciation of broader aspects of life, such as the 'finer' literature and arts, history, philosophical and anthropological thought, and the like, is much more complex and often outside the modern vernacular: technical, demanding, even boring.  But exposure to them is greatly enhanced by formal education, just as is the case for STEM subjects.  They may have snob value in our social upper crust, but they have aspects of value and appeal that might benefit and edify the lives of many more.

Here 'education' refers first to K-12. The current way to describe topics is to group the fashionable ones under the rubric STEM and then largely dismiss the others by omission--let them be nameless!  School districts are, we regularly read, shrinking or abandoning their music and arts programs, teaching of classics and the like, because they cost money, while adding pre-college specialty courses such as calculus. In a nutshell, this is based on our cultural obsession with money above all things, because these are the subjects, we are told, that industry wants and that make money for them and thus their employees.

But if being an industrial chemist or mechanical engineer pleases the wallet, we rarely hear that they please the soul.  We have not heard of a single serious-sized school district that has abandoned its sports programs, such as football or basketball, which are quite expensive, to augment the arts.

Universities and perhaps many colleges, are racing onto (or is it 'down' to?) the same money-driven bandwagon. Abandoning part of their mission to 'educate' informed citizens, they are widely shrinking or even sometimes running completely away from the non-STEM areas (but not, of course, football or basketball).

The scientific data on successful, healthy aging
I just returned from a workshop at the National Research Council, underwritten by NIH's National Institute on Aging (NIA), to discuss what we have learned about the basis of longevity and healthy lifespan.  An objective was to provide advice to the NIA on future directions for investing its resources, based on what has been learned from the work supported heretofore.  The results, in a central area related to the question at hand, were in fact major and clear, and should provide equally clear directions for future NIA investment.

Health is a biological phenomenon (even mental health, of course, since the mind is a biological organ). The approach to human lifespan, longevity, health and life-course experience relates to the causes of negative as well as positive experience.  We should use our research technologies to find and identify the causes of either, so we can intervene with the negative and reinforce the positive.

In this case, the working model, in our scientific age that puts technology first, has been that ill health causes social and psychological decline. If you are sick, a biological and in that sense technical state, you cannot hold a job, may be involved in abusive domestic situations, become depressed, then invest badly in food or other resources and the like.  If you are sick, you may be more likely to be overweight, shorter, more likely to drink too much or to smoke.  So we have a plague of people in whom to search for the misguided cells, so we can alter their behavior.

Surprisingly, however, the reported research has shown, rather clearly and in both humans and other animal models (in particular, findings in other primates in the wild were reported at this meeting), that quite the opposite is true: social standing and cultural milieu are major, primary determinants of life-course health and experience, even more so than money itself!  Longevity, and even height, is in a strong sense determined by the degree of satisfaction or control you feel in your life and your social position; even physical resources (income) do not override the social effects.  Excepting, of course, strong harmful genetic effects in a small fraction of people, disease and lifespan are mediated largely by these aspects of the social environment which, in turn, affect your health prospects.  If you're born on the wrong side of the tracks, your fate is largely sealed.

Since similar results were reported in several aspects and respects and even other species, one need not worry about the details, which seem to be generally small relative to the main picture.  The details needn't be studied to death.  Instead--we paid for the research, the research was very carefully and well done, and we got a clear result!  The question has largely been answered, and we now know how best to invest future resources most effectively for life-course improvement.

But the answer will surprise you!

Our 'modest proposal'
In 1729, Jonathan Swift saw a problem of the widespread lives of poverty among the downtrodden in Ireland, and suggested a solution:  they should gain income by selling their excess children (of which there were many), to be cooked in various culinary ways to satisfy the rich.  Many savory recipes were provided.

Carve, saute, and don't forget the sauce.  Drawing by Dore

That essay was a vicious satirical critique of societal inequity in Swift's time, and we (living in more civilized times, we generally suppose) would never think to suggest that kind of solution to the offensive, growing inequity in our society today.  But we do have a modest suggestion for today, based on our National Institutes of Health living up to its word, and using the results of research it sponsors to improve our society's lot.

The non-STEM parts of our educational system address quality-of-life issues: your assessment of the world, your sense of well-being, your ability to integrate an understanding of civil life across different realms of human thinking.  People with a stronger sense of integration and well-being will be better able (as the research shows) to negotiate society, and this will lead to better prospects, better health, and longer life.

Of course, knowledge of the STEM subjects is important in this.  But we are already pouring resources there, clearly with more to come, while we pull the plug on the non-STEM subjects that are associated with giving you a shot at the better side of the tracks, at better and more equitable places in society, and which, we now know thanks to NIA research, lead to longer and healthier lives. This quantitatively and qualitatively trumps the relatively smaller, and consequent rather than causal, effects of the various high-technology, costly things we spend funds on in relation to widespread diseases like heart disease, stroke, obesity-related diseases and so on.

So: what the NIA should do is redirect its funds from these very sexy technological research approaches to life-course issues (like GWAS and so many other fashionable Big Data fields), and urgently pour these resources instead into intervening in the actual major causes of impaired lives.  NIA should underwrite the improvement of K-12 education nationwide, and should endow non-STEM programs in universities, conditional on those areas being retained as serious requirements for graduation.

If we let this recipe cook for a decade or two, we'd have a more sophisticated, knowledgeable, intellectually resourceful, and (more savory) equitable society, with more peace of mind.  And the populace would, as a direct consequence, have more intellectual resources to engage in creative and innovative science and technology, with the economic benefits that go with that. As a result, the rates of our common chronic diseases, including mental deterioration, and their associated misery and costs, would be way down.

The diseases that would be left would be the truly biological or genetic or clear-cut environmentally caused instances of these diseases, on which cases focused research (rather than just big-data collection) might have a reasonable shot at devising cures and prevention.

That is our modest proposal for how we should use the results of the research we pay for (but we dare to suggest that it's not how we're using them now).

Monday, April 14, 2014

Is there a right way to raise children?

Three pieces in the Sunday NYTimes about how to bring up children ring a bell.  The first, "Raising a Moral Child", asks "What does it take to be a good parent?"  The answer -- yes, there's an answer -- is to praise your child's character, not her or his deed.  "You are a kind person," not, "Sharing your toys with your friend was very kind," will produce a caring, generous adult.

Children sharing a milkshake; Wikimedia
The second, "Growing Up At Sea", refutes the widespread criticism of a family with two young children that intended to sail from Mexico to New Zealand, but got into trouble and instead had to be rescued by the US Navy and Coast Guard.  The author, Ania Bartkowiak, herself spent most of the first eleven years of her life sailing the world with her parents and older brother, anchoring at far-flung ports and finishing correspondence courses on deserted tropical beaches.  She describes what sounds like an amazing, rare, and cherished childhood.

The third piece, written by Keith Robinson and Angel Harris, asks no questions, but instead asserts that "Parental Involvement Is Overrated." How do the authors know?  Because
...evidence from our research suggests otherwise. In fact, most forms of parental involvement, like observing a child’s class, contacting a school about a child’s behavior, helping to decide a child’s high school courses, or helping a child with homework, do not improve student achievement. In some cases, they actually hinder it.
So, three pieces about the effects of upbringing on the adults children will become, two with answers, one a cautionary tale about how conventional wisdom can be wrong.  Two reductionist approaches promoting what authors hope become conventional wisdom, one quite the opposite, extolling the virtues of unconventional upbringing.

Take a look at the parenting section of any bookstore, though, or go to Amazon.com and search for parenting books.  No, I'll do that for you … I find a grand total of 97,130 books on parenting.  Almost 100,000 authors believe they've got the answer to how to bring up children.  Wow. Here are just the first few titles: "Discipline Without Shouting or Spanking", "Simplicity Parenting: Using the Extraordinary Power of Less to Raise Calmer, Happier, and More Secure Kids," "Parenting With Love and Logic."  Everybody has an answer.

But here's what really interests me.  If there were in fact an answer, there would be only one book, not 97,130.  Or no books at all, because we'd all do it the way our parents did, because they'd have done it right.

And here's why this rings a bell.
Like social scientists, geneticists and epidemiologists and nutritionists continue to look for single answers to complex questions, and often believe they've found them.  But there are many ways to bring up kind, caring, successful children -- indeed, many ways to define success -- just as there are many pathways to heart disease, or tallness, or hypertension, or doing well in school.  And all of these pathways involve genes and family backgrounds, peers and social pressures, and pretty much none of them can be reduced to a single factor: the right way to praise a child, the proper amount of parental involvement, the single gene or food or vitamin.

Should a parent help with homework?  What should we call 'help'?  Is making a child's favorite meal help?  Or picking her up at school or freeing him from doing the dishes, to allow more time for doing the work?  If you want a caring child, won't simply being caring yourself be a lesson?  Can't a smile, or loving words be praise?

Similarly, we don't only drink red wine or take calcium supplements or consume saturated fat.  These are always consumed in a larger context, including a complex diet, genetic background, childhood exposures, amount of exercise, illnesses and so on, and everyone's unique.

There will always be parenting advice, because there will always be anxious parents.  But the advice will always be embedded in the culture of the moment.  Who reads Dr Spock anymore?  Who even knows who he was?  His parenting advice was followed for perhaps several decades, and it was 'right' because there were far fewer advice books, and it fit the tenor of the times, and people believed it, which by definition made it right.  But times have changed and advice books have moved on.

But, just as with causes of heart disease, where, by the way, there's also an advice market, there is no single answer.  And any actual answers will take context and complexity into account.

Thursday, April 10, 2014

As remarkable as science, and as unremarkable as most science

Last Saturday we went to the opera---well, to the Met's live broadcast to local movie theaters.  It was Puccini's La Boheme, an amazing, remarkable feat, the match of anything in science.  But there was even more.

With something like 4 hours' notice, Kristine Opolais, an up-and-coming Latvian singer who had sung the lead in Puccini's Madama Butterfly the night before and gotten no sleep, received a message early in the morning asking if she'd sing the lead role, Mimi, in La Boheme at that day's matinee, because the scheduled lead was ill with the flu.  For some insane reason, she agreed.  In front of a full Met house and an estimated 300,000 worldwide viewers, under close-up camera scrutiny, she delivered an essentially flawless, gorgeously moving performance.  It was doubly or trebly moving because of the feat of switching roles, remembering all the words and musical cues and notes, and learning the staging with almost no notice.

Ms. Opolais and Vittorio Grigolo in the Metropolitan Opera's broadcast of Puccini's “La Bohème” on Saturday. Credit: Marty Sohl/Metropolitan Opera

Whatever the mitigating circumstances (she had sung Mimi in other houses in recent years), this tour de force reinforced our respect for the non-STEM aspects of human life.  Nothing we have seen in decades of science matched what we saw, for skill, technique, and all the learned aspects of high-level human achievement.  This reflects the reasons we think universities should stop backing away from educating anyone but scientists.

And then….  We are in Washington this week for a science meeting, but had some spare time and went to the National Gallery of Art.  A big ad boasted that a newly acquired Van Gogh painting was on display.  Naturally, we went to see it, rushing past some other magnificent French Impressionist paintings.  And what did we see?  Well, art is subjective, but this was underwhelming.  A blob of typical Van Gogh slap-dash.  We are sure the Gallery paid more for it than most of us earn in a lifetime.

Van Gogh, "Green Wheat Fields," Auvers, 1890

Yes, it's an investment in 'art', and maybe relevant to understanding a major artist's life.  But to us, as we quipped to each other, it was like a famous scientist's paper in a grade-B journal.  Not a masterpiece.  Yet the worship of the Established leads to such purchases, much as too many journals and too many 'science' reporters tout every work of someone with a prominent reputation or a job at a university near a Big City.

We, personally, have the utmost respect, or even awe, for any great human achievement, and for the work and skill that are responsible.  The same is true for an art performance, a novel, a historical analysis, or, yes, even a scientific discovery.  But it is also true that most work in most fields is ordinary, yet we give it bloated treatment if it lets us show that we hob-nob with the famous.

Inspired works of human endeavor are deeply moving, in any field.  Science is among them, but brilliance is not restricted to science, and the experience of brilliance is something that should be open to everyone; the more who are educated to appreciate it, the more whose lives will be edified by the experience.

Wednesday, April 9, 2014

Development through research?

Fifteen years ago, when Chief Khunchai first took a job managing a malaria clinic on a remote stretch of the Thailand-Myanmar (still Burma at that time) border, there were no year-round roads, no electricity, and no telephones, and the endemic guerrilla warfare between the Karen and the Burmese didn’t pay much attention to the international border.  A Karen military base was just over the mountain on the other side of the river.  Sometimes when fights broke out, mortars would fly across the river and land near the clinic.  It wasn’t personal; such things don’t always stop at international borders.

Moei River - the international border between Thailand and Myanmar


In the hot season temperatures regularly exceed 90° Fahrenheit, at midnight.  Without electricity, every degree above 80 is obvious.  There is a constant trickle of sweat behind your ears and down your lower back, and you eventually stop mistaking this feeling for mosquitoes and other insects that want your blood.

In the wet season, everything is permeated by the omnipresent moisture.  Pencils won’t write on paper, which has been collecting moisture from the rain and from your sweat, and pens make thick smudges on anything they touch.  Records are hard to keep.  The landscape is almost fluorescent green during this season.  The dichotomy between inside and outside is a false one.  Even the walls grow green with algae; plants and vines work their way into the cracks and struggle for a nook or cranny to fill and exploit.

In this part of the world, the sun goes down at a consistent time pretty much year-round – 6:30.  But this malaria clinic is surrounded by sharp mountain peaks and karst rock formations, and these geological features hide the sun more quickly.  If work had to be done after 5pm or so, it was done by candle.  Malaria diagnoses would have to wait until the next day, when sunlight could be caught by the small circular mirror that illuminates the slides and lenses of the microscope.

microscope for detecting malaria


Contacting the outside world could be done using military-style radios, through a tall antenna that stretched from the top of the clinic.  This was handy in case people needed to be evacuated because of flooding, fires, or fighting, to relay information about the epidemiological situation, or to request shipments of dwindling medical supplies.  A lightning rod was placed at the other end of the building to keep people from getting barbecued during storms.

a storm over the Moei

Over the years, Chief Khunchai has become a much-respected member of society.  He knows almost everyone in the district in some way or another, and he carries a lot of political weight.  His office has gotten nicer over time.  Still the chief of a malaria clinic, he looks back on those days with little nostalgia.  He shudders a little when he tells me about working by candlelight and having to worry about the fighting.  Yes, today things are quite different.

He now manages a new malaria clinic about 35 miles south of the one he started at.  This malaria clinic has electricity.  At first this meant an electric microscope and lights, and that work could be done at night.  It also meant fans, which make work much more bearable during certain times of the year.  Even more recently, it meant that sealed doors could be installed in the main room and office so that wall AC units could also be installed.  AC isn’t frequently used, but it is very nice to turn the units on when there are special guests (usually political superiors) visiting.  

There are at least three different large malaria projects running in the district, and each of these projects has hired staff who are housed at the clinic.  A room was built onto the back so that they would all have desk and computer space.  A little data entry and a lot of Facebook and YouTube happen in that room.

In the same period of time, malaria cases appear to have decreased, even while the population of the area has increased.  This is especially true of cases in Thai nationals.  Most cases here are in Myanmar nationals or Karen people with no nationality.  In fact, it is entirely possible that today there are more malaria-related personnel in Thailand than there are cases of malaria in Thai people each year.  That is, I think, a very strange thing.



I begin with this story so that I can paint a picture of a kind of situation that I think has occurred in many parts of the world.  Malaria persists in places where “development”, for whatever reason, hasn’t extended.  One (I) could easily argue that such “development” is actually destructive in many ways, but it is hard to argue that life for many people hasn’t become easier.  And malaria cases have gone down at the same time.  

At least some of that development must be a direct result of the research cash that flows in from major malaria research projects and initiatives.  Those data entry people in the back facebooking can now purchase relatively nice motorcycles; some of the managers might even buy cars.  It’s not just the malaria clinic that has changed, there are also new restaurants, roads that are mostly good (or equally bad) year-round rather than only being traversable during the dry season, and more recently, a 7-11.  I joke that next year there may be another 7-11 across from that 7-11, but you may not understand unless you’ve recently visited Bangkok.  All of these things have associated workers who in turn buy stuff from places that also employ people.  In this part of the world, and I think in other parts too, malaria is mostly a “rural” disease.  It exists in places without 7-11s and year-round roads.  As you pave the ground for those roads and build concrete jungles, this particular disease tends to go away.  

And I find in this all a great irony.  

I’ve previously heard jokes that the best way to get rid of a disease is to try to study it.  I think this means I’m not the first to notice what is happening.  

The malaria industry is huge and there is a lot of money in it.  Frustratingly, much of that money winds up getting wasted through corruption and through things that ultimately aren’t necessary for what I think really matters: helping people who are sick with malaria, or even better, getting rid of malaria.  

For that matter, a question I’ve increasingly worried about over the last several years is: should we really be setting up an industry, a vast network of jobs, all geared toward halting a disease?  Will these people really be motivated to stomp out the very thing (in this case, malaria) that keeps their own lives, at least economically, afloat?  Is that why people heatedly argue that we should be trying to control malaria rather than just get rid of it?  Even more so, while I can see the value of having electricity at a malaria clinic for diagnostic purposes, are AC, more space, new desks, etc. really relevant to combating the malaria problem?

But perhaps there is another way to look at this.  That is, perhaps all those research dollars that get pumped into malaria research actually do work.  I think they really do.  I just don’t think they work in the way that any of us intend them to.  They wind up spurring the local economy, they boost people’s economic well-being, and then, in some cases and for some extremely complex reasons, people who move out of deep poverty are no longer faced with the immediate health consequences of that poverty.  For them, malaria is no longer an immediate danger.  They can sit in a nice office, preferably behind a nice fan and in front of a nice computer screen, and check boxes on the computer that correspond to a malaria patient’s age and sex (or to a “like” button on someone’s post).  After half a life’s worth of work in less-than-ideal conditions, maybe it is more than OK that Chief Khunchai no longer has to dodge mortars or work by candlelight.  Hell, maybe he deserves the occasional AC – I certainly convince myself that I do.

Sometimes I’ve gotten quite riled up by the ways I see malaria research dollars being spent, but maybe I’ve completely missed the point.  Maybe all that really matters is that those dollars with the malaria name on them wind up having the effect that (I think) we all ultimately want – even if the functional mechanism behind this cause and effect has basically nothing to do with the one(s) that many of us think matter.



*** I know several "Chief Khunchais" - but this name is of course made up