Translation from English

Sunday, December 4, 2016

Brain Pickings

The dark side of IQ and why we can't measure intelligence, philosopher Jonathan Lear on the paradoxical seedbed of courage and cultural resilience, Sally Mann on the treachery of memory and the elusive locus of the self, and more.

Radical Hope: Philosopher Jonathan Lear on the Paradoxical Seedbed of Courage and Cultural Resilience

“Courage is the measure of our heartfelt participation with life, with another, with a community, a work; a future,” wrote the poet and philosopher David Whyte in contemplating crisis as a testing ground for courage. But the future at which courage must aim its gaze is often one obscured by the blinders of our culture’s current scope of possibility.
In January of 2009, Elizabeth Alexander took the podium at the Washington Mall and welcomed Barack Obama to the presidency with her exquisite poem “Praise Song for the Day,” which made her only the fourth poet in history to read at an American presidential inauguration. Seven years later, facing a radically different and radically dispiriting landscape of possibility, Alexander took a people’s podium and reminded us that our greatest ground for hope is in the once-unimaginable, which the present that was once the future has proven possible. Looking back on that historic moment in 2009, she reflected:
That was a beautiful moment that so many elders never thought they’d live to see. So there are things that we don’t yet know, that we don’t think we’re going to live to see, that are also going to give us power and beauty if we hold up our own.
What Alexander is articulating — the notion that our cultural conditioning limits what we know to hope for, until the unimaginable proves itself possible — is what philosopher Jonathan Lear examines in Radical Hope: Ethics in the Face of Cultural Devastation (public library). 

Illustration from Alice and Martin Provensen’s vintage adaptation of Homer’s Iliad and Odyssey
Simultaneously echoing and complicating Susan Sontag’s notion that the courage of an example is what mobilizes communities, Lear considers the paradoxical seedbed of the kind of courage that propels culture forward:
We rightly think that the virtue of courage requires a certain psychological flexibility. A courageous person must know how to act well in all sorts of circumstances. We recognize that there can be times in life when the stock images of courage will be inappropriate, and the truly courageous person will recognize this extraordinary situation and act in an unusual yet courageous way.
This ability to be courageous beyond the culturally prescribed forms of courage, Lear points out, is therefore an inherently countercultural ability, which reveals the central paradox of cultural resilience. He writes:
If we think of the virtues, or human excellences, as they are actually taught by cultures across history, it is plausible to expect that the virtuous person will be ready to tackle the wide variety of challenges that life might throw his way. It is unclear that there is anything in such training that will prepare him for the breakdown of the form of life itself. We would like our ethics to be grounded in psychological reality. Thus whatever flexibility is required of a virtuous person, it ought to be something that can be inculcated in the education and training of a culture. But a culture does not tend to train the young to endure its own breakdown — and it is fairly easy to see why. A culture embodies a sense of life’s possibilities, and it tries to instill that sense in the young. An outstanding young member of the culture will learn to face these possibilities well.
But things grow more complicated when the situation at hand is the breakdown of this very sense of possibility within a culture. Lear writes:
The inability to conceive of its own devastation will tend to be the blind spot of any culture… A culture tends to propagate itself, and it will do that by instilling its own sense of possibility in the young.
The ability to envision possibilities beyond those handed down by our existing culture, Lear argues, requires what he calls “radical hope.” He explains:
For what may we hope? Kant put this question in the first-person singular along with two others — What can I know? and What ought I to do? — that he thought essentially marked the human condition. With two centuries of philosophical reflection, it seems that these questions are best transposed to the first-person plural. And with that same hindsight: rather than attempt an a priori inquiry, I would like to consider hope as it might arise at one of the limits of human existence… What makes this hope radical is that it is directed toward a future goodness that transcends the current ability to understand what it is. Radical hope anticipates a good for which those who have the hope as yet lack the appropriate concepts with which to understand it. What would it be for such hope to be justified?
The answer to that question is what Lear goes on to explore in the remainder of the elevating and lucidly mobilizing Radical Hope. Complement it with Rebecca Solnit’s indispensable Hope in the Dark — a book that preceded Lear’s and in many ways exceeds it in transcending theoretical discourse to offer a muscular, livable-with model for hope in the pulse of life.

Hold Still: Sally Mann on the Treachery of Memory, the Dark Side of Photography, and the Elusive Locus of the Self

“Memory is never a precise duplicate of the original… it is a continuing act of creation,” pioneering researcher Rosalind Cartwright wrote in distilling the science of the unconscious mind.
Although I lack early childhood memories, I do have one rather eidetic recollection: I remember standing before the barren elephant yard at the Sofia Zoo in Bulgaria, at age three or so, clad in a cotton polka-dot jumper. I remember squinting into a scowl as the malnourished elephant behind me swirls dirt into the air in front of her communism-stamped concrete edifice. I don’t remember the temperature, though I deduce from the memory of my outfit that it must have been summer. I don’t remember the smell of the elephant or the touch of the blown dirt on my skin, though I remember my grimace.
For most of my life, I held onto that memory as the sole surviving mnemonic fragment of my early childhood self. And then, one day in my late twenties, I discovered an old photo album tucked into the back of my grandmother’s cabinet in Bulgaria. It contained dozens of photographs of me, from birth until around age four, including one depicting that very vignette — down to the minutest detail of what I believed was my memory of that moment. There I was, scowling in my polka-dot jumper with the elephant and the cloud of dust behind me. In an instant, I realized that I had been holding onto a prosthetic memory — what I remembered was the photograph from that day, which I must have been shown at some point, and not the day itself, of which I have no other recollection. The question — and what a Borgesian question — remains whether one should prefer having such a prosthetic memory, constructed entirely of photographs stitched together into artificial cohesion, to having no memory at all. 
That confounding parallax of personal history is what photographer Sally Mann explores throughout Hold Still: A Memoir with Photographs (public library) — a lyrical yet unsentimental meditation on art, mortality, and the lacuna between memory and myth, undergirded by what Mann calls her “long preoccupation with the treachery of memory” and “memory’s truth, which is to scientific, objective truth as a pearl is to a piece of sand.”

Sally Mann as a child
In a sentiment that calls to mind Oliver Sacks’s exquisite elucidation of how memory works, Mann writes:
Whatever of my memories hadn’t crumbled into dust must surely by now have been altered by the passage of time. I tend to agree with the theory that if you want to keep a memory pristine, you must not call upon it too often, for each time it is revisited, you alter it irrevocably, remembering not the original impression left by experience but the last time you recalled it. With tiny differences creeping in at each cycle, the exercise of our memory does not bring us closer to the past but draws us farther away.
I had learned over time to meekly accept whatever betrayals memory pulled over on me, allowing my mind to polish its own beautiful lie. In distorting the information it’s supposed to be keeping safe, the brain, to its credit, will often bow to some instinctive aesthetic wisdom, imparting to our life’s events a coherence, logic, and symbolic elegance that’s not present or not so obvious in the improbable, disheveled sloppiness of what we’ve actually been through.

Photograph: Sally Mann
Nearly half a century after Italo Calvino observed that “the life that you live in order to photograph it is already, at the outset, a commemoration of itself,” Mann traces this cultural pathology — now a full epidemic with the rise of the photo-driven social web — to the dawn of the medium itself. Reflecting on the discovery of a box of old photographs in her own family’s attic, she echoes Teju Cole’s assertion that “photography is at the nerve center of our paradoxical memorial impulses” and writes:
As far back as 1901 Émile Zola telegraphed the threat of this relatively new medium, remarking that you cannot claim to have really seen something until you have photographed it. What Zola perhaps also knew or intuited was that once photographed, whatever you had “really seen” would never be seen by the eye of memory again. It would forever be cut from the continuum of being, a mere sliver, a slight, translucent paring from the fat life of time; elegiac, one-dimensional, immediately assuming the amber quality of nostalgia: an instantaneous memento mori. Photography would seem to preserve our past and make it invulnerable to the distortions of repeated memorial superimpositions, but I think that is a fallacy: photographs supplant and corrupt the past, all the while creating their own memories. As I held my childhood pictures in my hands, in the tenderness of my “remembering,” I also knew that with each photograph I was forgetting.

Young Sally Mann on her beloved horse
Mann, whose memoir is strewn with an extraordinary sensitivity to language, anchors into the perfect word the perfect analogy for how the living of life impresses itself upon our memory:
When an animal, a rabbit, say, beds down in a protecting fencerow, the weight and warmth of his curled body leaves a mirroring mark upon the ground. The grasses often appear to have been woven into a birdlike nest, and perhaps were indeed caught and pulled around by the delicate claws as he turned in a circle before subsiding into rest. This soft bowl in the grasses, this body-formed evidence of hare, has a name, an obsolete but beautiful word: meuse. (Enticingly close to Muse, daughter of Memory, and source of inspiration.) Each of us leaves evidence on the earth that in various ways bears our form.
Over and over, Mann returns with palpable unease to the parasitic relationship between photography and memory, culminating in this unadorned indictment:
I believe that photographs actually rob all of us of our memory.
More than that, photographs disquiet our already unnerving relationship with time — a relationship which Borges, the poet laureate of memory’s perplexities, captured with memorable brilliance. Mann writes:
Photographs economize the truth; they are always moments more or less illusorily abducted from time’s continuum.

Photograph: Sally Mann
This dislocation of mnemonic truth into photographs is as rooted in time as it is in space. Mann, whose work is heavily animated by the life of landscape, once again draws on language to explore the nuances of the relationship between photography, memory, and place:
In an immigrant society like this one, we are often divided from our forebears less by distance than by language, generations before us having thought, sung, made love, and argued in dialects unknown to us now. In Wales, for example, Welsh is spoken by barely 20 percent of the population, so we can only hope that the evocative Welsh word hiraeth will somehow be preserved. It means “distance pain,” and I know all about it: a yearning for the lost places of our past, accompanied in extreme cases by tuneful lamentation (mine never got quite that bad). But, and this is important, it always refers to a near-umbilical attachment to a place, not just free-floating nostalgia or a droopy houndlike wistfulness or the longing we associate with romantic love. No, this is a word about the pain of loving a place.
[…]
Contemporary Welsh-speakers have continued that expression, linking memory and landscape most vividly in R. W. Parry’s sonnet in which the longed-for landscape communicates to the human heart, “the echo of an echo… the memory of a memory past.”
With an eye to her own heritage as a displaced Southerner, Mann adds:
Looking through my long photographic and literary relationship with my own native soil I can perceive a definite kinship with those fakelorish bards wailing away about their place-pain.

Photograph: Sally Mann
A generation after Susan Sontag admonished against the “aesthetic consumerism” of photography, Mann invites us to confront these commodifications of memory that we have come to take for granted:
Before the invention of photography, significant moments in the flow of our lives would be like rocks placed in a stream: impediments that demonstrated but didn’t diminish the volume of the flow and around which accrued the debris of memory, rich in sight, smell, taste, and sound. No snapshot can do what the attractive mnemonic impediment can: when we outsource that work to the camera, our ability to remember is diminished and what memories we have are impoverished.
Contemplating mortality, that ultimate end-point of memory, Mann writes in a Whitman-like meditation on the ever-elusive locus of the self:
Where does the self actually go? All the accumulation of memory — the mist rising from the river and the birth of children and the flying tails of the Arabians in the field — and all the arcane formulas, the passwords, the poultice recipes, the Latin names of trees, the location of the safe deposit key, the complex skills to repair and build and grow and harvest — when someone dies, where does it all go?
In a sentiment evocative of Meghan O’Rourke’s beautiful assertion that “the people we most love do become a physical part of us, ingrained in our synapses, in the pathways where memories are created,” Mann adds:
Proust has his answer, and it’s the one I take most comfort in — it ultimately resides in the loving and in the making and in the living of every present day.
Hold Still is an intensely beautiful and layered read in its totality. Complement this particular facet with Virginia Woolf on how memory threads our lives together, neuroscientist Suzanne Corkin on how the famous amnesiac H.M. illuminates the paradoxes of memory and the self, and Susan Sontag on how photography mediates our relationship with life and death.

Genes and the Holy G: Siddhartha Mukherjee on the Dark Cultural History of IQ and Why We Can’t Measure Intelligence

Intelligence, Simone de Beauvoir argued, is not a ready-made quality “but a way of casting oneself into the world and of disclosing being.” Like the rest of De Beauvoir’s socially wakeful ideas, this was a courageously countercultural proposition — she lived in the heyday of the IQ craze, which sought to codify into static and measurable components the complex and dynamic mode of being we call “intelligence.” Even today, as we contemplate the nebulous future of artificial intelligence, we find ourselves stymied by the same core problem — how are we to synthesize and engineer intelligence if we are unable to even define it in its full dimension?
How the emergence of IQ tests contracted rather than expanded our understanding of intelligence and what we can do to transcend their perilous cultural legacy is what practicing physician, research scientist, and Pulitzer-winning author Siddhartha Mukherjee explores throughout The Gene: An Intimate History (public library) — a rigorously researched, beautifully written detective story about the genetic components of what we experience as the self, rooted in Mukherjee’s own painful family history of mental illness and radiating a larger inquiry into how genetics illuminates the future of our species.

Siddhartha Mukherjee (Photograph: Deborah Feingold)
A crucial agent in our limiting definition of intelligence, which has a dark heritage in nineteenth-century biometrics and eugenics, was the British psychologist and statistician Charles Spearman, who became interested in the strong correlation among an individual’s performances on tests assessing very different mental abilities. He surmised that human intelligence is a function not of specific knowledge but of the individual’s ability to manipulate abstract knowledge across a variety of domains. Spearman called this ability “general intelligence,” shorthanded g. Mukherjee chronicles the monumental and rather grim impact of this theory on modern society:
By the early twentieth century, g had caught the imagination of the public. First, it captivated early eugenicists. In 1916, the Stanford psychologist Lewis Terman, an avid supporter of the American eugenics movement, created a standardized test to rapidly and quantitatively assess general intelligence, hoping to use the test to select more intelligent humans for eugenic breeding. Recognizing that this measurement varied with age during childhood development, Terman advocated a new metric to quantify age-specific intelligence. If a subject’s “mental age” was the same as his or her physical age, their “intelligence quotient,” or IQ, was defined as exactly 100. If a subject lagged in mental age compared to physical age, the IQ was less than a hundred; if she was more mentally advanced, she was assigned an IQ above 100.
A numerical measure of intelligence was also particularly suited to the demands of the First and Second World Wars, during which recruits had to be assigned to wartime tasks requiring diverse skills based on rapid, quantitative assessments. When veterans returned to civilian life after the wars, they found their lives dominated by intelligence testing.
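The arithmetic implied in that definition is worth making explicit. Assuming the classic ratio formulation the passage describes (mental age as measured by the test, divided by the subject’s actual age), the quotient works out as

\[
\mathrm{IQ} = 100 \times \frac{\text{mental age}}{\text{chronological age}}
\]

so a child whose mental age matches her chronological age scores exactly 100, while a ten-year-old performing at the level of a typical twelve-year-old scores 120.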

Illustration by Emily Hughes from Wild
Because categories, measurements, and labels help us navigate the world and, in Umberto Eco’s undying words, “make infinity comprehensible,” IQ metrics enchanted the popular imagination with the convenient illusion of neat categorization. Like any fad that offers a shortcut for something difficult to achieve, they spread like wildfire across the societal landscape. Mukherjee writes:
By the early 1940s, such tests had become accepted as an inherent part of American culture. IQ tests were used to rank job applicants, place children in school, and recruit agents for the Secret Service. In the 1950s, Americans commonly listed their IQs on their résumés, submitted the results of a test for a job application, or even chose their spouses based on the test. IQ scores were pinned on the babies who were on display in Better Babies contests (although how IQ was measured in a two-year-old remained mysterious). 
These rhetorical and historical shifts in the concept of intelligence are worth noting, for we will return to them in a few paragraphs. General intelligence (g) originated as a statistical correlation between tests given under particular circumstances to particular individuals. It morphed into the notion of “general intelligence” because of a hypothesis concerning the nature of human knowledge acquisition. And it was codified into “IQ” to serve the particular exigencies of war. In a cultural sense, the definition of g was an exquisitely self-reinforcing phenomenon: those who possessed it, anointed as “intelligent” and given the arbitration of the quality, had every incentive in the world to propagate its definition.
With an eye to evolutionary biologist Richard Dawkins’s culture-shaping coinage of the word “meme” — “Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs,” Dawkins wrote in his 1976 classic The Selfish Gene, “so memes propagate themselves in the meme pool by leaping from brain to brain.” — Mukherjee argues that g became a self-propagating unit worthy of being thought of as “selfish g.” He writes:
It takes counterculture to counter culture — and it was only inevitable, perhaps, that the sweeping political movements that gripped America in the 1960s and 1970s would shake the notions of general intelligence and IQ by their very roots. As the civil rights movement and feminism highlighted chronic political and social inequalities in America, it became evident that biological and psychological features were not just inborn but likely to be deeply influenced by context and environment. The dogma of a single form of intelligence was also challenged by scientific evidence.

Illustration by Vladimir Radunsky for On a Beam of Light: A Story of Albert Einstein by Jennifer Berne
Along came social scientists like Howard Gardner, whose germinal 1983 Theory of Multiple Intelligences set out to upend the tyranny of “selfish g” by demonstrating that human acumen exists along varied dimensions, subtler and more context-specific, not necessarily correlated with one another — those who score high on logical/mathematical intelligence, for instance, may not score high on bodily/kinesthetic intelligence, and vice versa. Mukherjee considers the layered implications for g and its active agents:
Is g heritable? In a certain sense, yes. In the 1950s, a series of reports suggested a strong genetic component. Of these, twin studies were the most definitive. When identical twins who had been reared together — i.e., with shared genes and shared environments — were tested in the early fifties, psychologists had found a striking degree of concordance in their IQs, with a correlation value of 0.86. In the late eighties, when identical twins who were separated at birth and reared separately were tested, the correlation fell to 0.74 — still a striking number. 
But the heritability of a trait, no matter how strong, may be the result of multiple genes, each exerting a relatively minor effect. If that was the case, identical twins would show strong correlations in g, but parents and children would be far less concordant. IQ followed this pattern. The correlation between parents and children living together, for instance, fell to 0.42. With parents and children living apart, the correlation collapsed to 0.22. Whatever the IQ test was measuring, it was a heritable factor, but one also influenced by many genes and possibly strongly modified by environment — part nature and part nurture. 
The most logical conclusion from these facts is that while some combination of genes and environments can strongly influence g, this combination will rarely be passed, intact, from parents to their children. Mendel’s laws virtually guarantee that the particular permutation of genes will scatter apart in every generation. And environmental interactions are so difficult to capture and predict that they cannot be reproduced over time. Intelligence, in short, is heritable (i.e., influenced by genes), but not easily inheritable (i.e., moved down intact from one generation to the next).
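A brief note on the statistic doing the work in those twin figures: the values Mukherjee cites are, assuming the standard usage, Pearson correlation coefficients. For paired scores \(x_i\) and \(y_i\) (one twin’s IQ against the other’s, or a parent’s against a child’s), the coefficient is

\[
r = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}}
\]

where \(\bar{x}\) and \(\bar{y}\) are the group means. A value of 1 means the paired scores move in perfect lockstep and 0 means knowing one tells you nothing about the other, so the slide from 0.86 for twins reared together to 0.22 for parents and children living apart is a slide from near-lockstep to a faint association.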
And yet the quest for the mythic holy grail of general intelligence persisted and took us down paths not only questionable but morally abhorrent by our present standards. In the 1980s, scientists conducted numerous studies demonstrating a discrepancy in IQ across the races, with white children scoring higher than their black peers. While the controversial results initially provided rampant fodder for racists, they also provided incentive for scientists to do what scientists must — question the validity of their own methods. In a testament to trailblazing philosopher Susanne Langer’s assertion that the way we frame our questions shapes our answers, it soon became clear that these IQ tests weren’t measuring the mythic g but, rather, reflected the effects of contextual circumstances like poverty, illness, hunger, and educational opportunity. Mukherjee explains:
It is easy to demonstrate an analogous effect in a lab: If you raise two plant strains — one tall and one short — in undernourished circumstances, then both plants grow short regardless of intrinsic genetic drive. In contrast, when nutrients are no longer limiting, the tall plant grows to its full height. Whether genes or environment — nature or nurture — dominates in influence depends on context. When environments are constraining, they exert a disproportionate influence. When the constraints are removed, genes become ascendant.
[…]
If the history of medical genetics teaches us one lesson, it is to be wary of precisely such slips between biology and culture. Humans, we now know, are largely similar in genetic terms — but with enough variation within us to represent true diversity. Or, perhaps more accurately, we are culturally or biologically inclined to magnify variations, even if they are minor in the larger scheme of the genome. Tests that are explicitly designed to capture variances in abilities will likely capture variances in abilities — and these variations may well track along racial lines. But to call the score in such a test “intelligence,” especially when the score is uniquely sensitive to the configuration of the test, is to insult the very quality it sets out to measure. 
Genes cannot tell us how to categorize or comprehend human diversity; environments can, cultures can, geographies can, histories can. Our language sputters in its attempt to capture this slip. When a genetic variation is statistically the most common, we call it normal — a word that implies not just superior statistical representation but qualitative or even moral superiority… When the variation is rare, it is termed a mutant — a word that implies not just statistical uncommonness, but qualitative inferiority, or even moral repugnance.
And so it goes, interposing linguistic discrimination on genetic variation, mixing biology and desire.

Illustration by Lisbeth Zwerger for a special edition of the fairy tales of the Brothers Grimm
Intelligence, it turns out, is as integrated and indivisible as what we call identity, which the great Lebanese-born French writer Amin Maalouf likened to an intricate pattern drawn on a tightly stretched drumhead. “Touch just one part of it, just one allegiance,” he wrote, “and the whole person will react, the whole drum will sound.” Indeed, it is to identity that Mukherjee points as an object of inquiry far apter than intelligence in understanding personhood. In a passage emblematic of the elegance with which he fuses science, cultural history, and lyrical prose, Mukherjee writes:
Like the English novel, or the face, say, the human genome can be lumped or split in a million different ways. But whether to split or lump, to categorize or synthesize, is a choice. When a distinct, heritable biological feature, such as a genetic illness (e.g., sickle-cell anemia), is the ascendant concern, then examining the genome to identify the locus of that feature makes absolute sense. The narrower the definition of the heritable feature or the trait, the more likely we will find a genetic locus for that trait, and the more likely that the trait will segregate within some human subpopulation (Ashkenazi Jews in the case of Tay-Sachs disease, or Afro-Caribbeans for sickle-cell anemia). There’s a reason that marathon running, for instance, is becoming a genetic sport: runners from Kenya and Ethiopia, a narrow eastern wedge of one continent, dominate the race not just because of talent and training, but also because the marathon is a narrowly defined test for a certain form of extreme fortitude. Genes that enable this fortitude (e.g., particular combinations of gene variants that produce distinct forms of anatomy, physiology, and metabolism) will be naturally selected. 
Conversely, the more we widen the definition of a feature or trait (say, intelligence, or temperament), the less likely that the trait will correlate with single genes — and, by extension, with races, tribes, or subpopulations. Intelligence and temperament are not marathon races: there are no fixed criteria for success, no start or finish lines — and running sideways or backward might secure victory. The narrowness, or breadth, of the definition of a feature is, in fact, a question of identity — i.e., how we define, categorize, and understand humans (ourselves) in a cultural, social, and political sense. The crucial missing element in our blurred conversation on the definition of race, then, is a conversation on the definition of identity.
Complement this particular portion of the wholly fascinating The Gene with young Barack Obama on identity and the search for a coherent self and Mark Twain on intelligence vs. morality, then revisit Schopenhauer on what makes a genius.
