Saturday, February 28, 2009
This vision, the editorial acknowledges, faces massive obstacles: the progressive master narrative can be dismissive of alternative perspectives and countervailing arguments; the visions propounded by foundations scholars may be sorely lacking when it comes to preparing prospective teachers for the realities of the classroom; accrediting standards shortchange foundational perspectives and theorizing; foundations faculty marginalize themselves; and the utilitarian approaches of alternative routes make little, if any, time for deep and careful reflection and critique.
(I of course have quibbles with some of the specifics of the editorial. For me, the social foundations are about much more concrete and varied aspects of the schooling process: the role of schools in a democratic society (philosophical and historical foundations), the relationships between school and social change (multicultural and sociological foundations), and the perspective of school as an organization (anthropological, political, and legal foundations). Moreover, the readings held up as exemplars (Parker Palmer, Sam Intrator, Tom Barone, Mark Edmundson) are, for me, inspirational texts much more than foundational texts. Give me some good classics anytime: David Tyack and Larry Cuban on the grammar of schooling; Philip Jackson on the hidden curriculum; John Ogbu on voluntary and involuntary minorities; Audrey Thompson on whiteness. These texts make my students not just “reflect”; they help them to act in new ways by breaking the cycle of teaching as we were taught and thinking in the default narrative of radical individualism. But these are “insider” debates. Let me stick to the big picture of the editorial.)
In one respect, this editorial is an extremely useful support for and an acknowledgement of foundations faculty and the field as a whole. It suggests to me that the acrimonious and fruitless debates about the relevance of teacher preparation have somewhat cooled off, enough at least to take a step back and see what has occurred in teacher preparation since NCLB. It also indexes current broad debates about the value of a “liberal arts” perspective in all-too-instrumental times.
Yet such support is also saddening. It reminds me of when politicians start speaking in superlatives of their opponents; as the political pundits will tell you, such good words are a sure sign that the other candidate is done for and everyone in the room knows it.
The marginalization of the social foundations in teacher preparation has been long in coming. It has been a combination of self-marginalization and external pushing and prodding. See my past posts on marginalization, CSFE, the role of AESA (here also), and writing (here and here also), as well as those of many others, to see the context for this. Suffice it to say that the social foundations field (as embodied by AESA) has no voice at any major educational policy table that I am aware of. It is no longer a part of NCATE; it is no longer a part of AACTE; the last comprehensive study of the state of the field of social foundations was done in the mid-1980s; CSFE, the policy arm of AESA, has been more or less defunct for half a decade. Though, I know, I know, the history of the field has seen such ups and downs from its very beginnings (see, for example, Mary Rose McCarthy’s wonderful history of the original foundations course at Columbia).
This is not about throwing stones or settling grudges. Rather, my post is about wondering whether this editorial is a call to action or a call for an elegy.
I am not naive enough to think that this one editorial is the tipping point, the fulcrum, the moment in history that will decide whether the foundations field succeeds or fails. But if JTE and its editors put forward this call, shouldn’t someone stand up and answer them?
Tuesday, February 24, 2009
Scientists have discovered that childhood trauma can actually alter your DNA and shape the way your genes work. This confirms in humans earlier findings in rats that maternal care plays a significant role in influencing the genes that control our stress response.
Epigenetics is the study of changes in the function of genes that don’t involve changes in the sequences of DNA. The DNA is inherited from our parents; it remains fixed throughout life and is identical in every part of the body. During gestation and even later in development, however, the genes in our DNA are marked by a chemical coating called DNA methylation. These marks are somewhat sensitive to one’s environment, especially early in life. The epigenetic marks punctuate the DNA and program it to express the right genes at the appropriate time and place.
Monday, February 23, 2009
This is a belated follow-up to Paul Rosenberg and David Sirota’s critiques of Nate Silver’s “Rationalist vs. Radical” progressivism. The dichotomies Silver laid out include:
Rationalist vs. Radical
Empirical vs. Normative
Sees politics as a battle of ideas vs. Sees politics as a battle of wills
Technocratic vs. Populist
Prone to elitism vs. Prone to demagoguery
Prone to co-optation vs. Difficult to organize
Optimistic vs. Pessimistic
Conversational vs. Action Oriented
The ensuing discussion focused mostly on how progressives think, or frame the world. I want to look, instead, at something different and potentially more important: how progressives have historically conceptualized ACTION. In ongoing historical work towards a book I’m calling Social Class, Social Action, and the Failure of Progressive Democracy, I argue that there are actually three distinct forms of progressivism, all drawing from different interrelated aspects of middle-class culture: Administrative, Collaborative, and Personalist progressives. As with any categorization, these have their own problems, but I think they reflect key historical realities.
Not only do Silver’s comparisons miss this three-fold complexity, but he also mixes in working-class models of social action as well.
Below I lay out these three different progressive camps, and then return to Silver’s dichotomy, adding in the working-class influence as well.
Administrative Progressives
This is the model that mostly won the day in the bureaucracies of the world after the turn of the 20th century. This is an expert model—“we know more than you so we should tell you what to do.” At best, the administrative progressives envisioned a paternal process of social change, as those few who know best create a better world for the ignorant masses. At worst they bought into the “scientific management” movement promoted by Taylor, in which workers became “hands” and middle-class managers became the “minds” of industrial work. Even Taylor, however, seemed to believe that this mind/hands model would end up being best for everyone—because it was the most efficient model, everyone would end up getting more for less.
Collaborative Progressives
This group drew from the models of progressive classrooms, professional associations, and the less hierarchical relationships between white-collar workers. They envisioned a society designed around the collaborative method, seeking a flat “democratic” society in which everyone could participate equally in the development of a better world. John Dewey, the most sophisticated proponent, acknowledged that he couldn’t figure out how this would work—in fact he showed pretty conclusively in The Public and Its Problems that it couldn’t work. But he and other collaborative progressives were unwilling to give up on their essentially utopian visions. He kept hoping that even though no one had ever been able to solve the problem of how a local model of collaboration could provide a structure to organize an enormous society, someone might solve it in the future.
Why wouldn’t he and other collaborative progressives give up in the face of overwhelming evidence that their vision was unworkable?
The crucial problem was one of social lag. If they gave in to a vision of the world that assumed unending conflict was an inevitable part of human society, at least for the foreseeable future, as unions and other working-class movements did, they would have to teach people social practices that would ill prepare them to achieve the kind of utopia they wanted. Teaching people in society to "fight" would point them away from the kinds of collaborative practices they valued, and actually make it more difficult (perhaps impossible) to ever achieve their utopia.
Thus, in their classrooms and elsewhere, they were willing to take the risk (for the working class, among others) that not teaching people to fight in solidarity as mass collectives would doom them to long-term oppression.
Personalist Progressives
The personalists emerged out of the romantic stream of thought in America. Like collaborative progressives, they sought to develop egalitarian communities, but they were less interested in joint work and collective action. Instead, they sought to develop social contexts in which each individual engaged authentically with every other, and educational contexts that sought to foster individual expression to the fullest extent possible. The personalists also envisioned a society built on this model, but they didn’t worry too much about the specifics. They hoped that social change would just “happen” if they created the right kind of persons. Where the collaborative progressives focused on the need for people to work together on joint projects, the personalists focused on the importance of allowing people to actualize their individuality within egalitarian communities.
In their education and in their social theory, the personalists focused on a world without charismatic leaders, without leaders at all in the sense that a working-class union or other standard action organization would understand them. Theirs was a view of individual actualization within a "beloved community"—a term used by SNCC in the South, taken up by SDS in the North, and drawn from one of the 1920s personalists, Randolph Bourne. Interestingly, these folks were not professors but independent intellectuals, as were the writers of the 1960s, for whom the key thinker was Paul Goodman. There are some fascinating similarities across these two eras that have not been fully explored.
This romantic vision emerged most powerfully in the 20th century in the 1920s, in the work of the "young intellectuals": Randolph Bourne, Van Wyck Brooks, Lewis Mumford (who was still active in the 60s), and Waldo Frank—now mostly forgotten. It then reemerged in the 1960s in the highly intellectual and anti-leader organizing models of the Student Nonviolent Coordinating Committee (SNCC) and Students for a Democratic Society (SDS), for whom the key intellectual influences were Ella Baker and Paul Goodman.
The evidence of the impact of SNCC in the South is that it did have a somewhat transformative impact on what African Americans in some areas saw as "possible" for them, and did create a strong base for future organizing in some areas, but it did not (and was not supposed to) lead to mass action. In Birmingham, SNCC was actually reduced to begging Martin Luther King (whom they disdained, and referred to as "the Lawd" in reference to their opposition to charismatic leaders) to "lead" people on marches. They didn't have the capacity to do so themselves. Importantly, SNCC's effectiveness in pursuing its "beloved community" model largely resulted from its fairly sophisticated combination of collaborative and personalist visions.
In the North, in working-class white neighborhoods, SDS created the almost completely ineffectual ERAP organizations. These failed in large part because they were much more personalist and less “pragmatic” than SNCC, seeking to impose their leaderless vision on those they worked with. They had less of a focus on the pragmatics of joint action. Some groups could hardly ever get anything done—at one point, according to Miller, they spent two days discussing whether they should take a day off and go to the beach. An iconic photograph shows one of their key "leaders" gazing intently into the lens, with everyone else falling asleep around her.
In fact, it is hard to imagine particularly effective, strictly personalist political movements. The communes of the counter-culture were probably the best examples of the social implications of personalism. It's no accident that personalists tend not to talk very concretely about social change. (At best, thinkers like Goodman embraced a kind of privileged anarchism, mostly evacuated of any socialist vision.)
Back to Silver
From the perspective I'm discussing here, it seems clear that Silver is mixing different kinds of progressivism. For example, it is the personalists, and not the collaborative or the administrative progressives, that are “difficult to organize.” Other aspects of his dichotomy seem to refer to the administrative progressives. All progressives, for example, tend to be optimistic to a fault, although the administratives, of course, have little faith in “the people.” From the way he frames his dichotomy, it seems like Silver is drawing from a particular interpretation of the experience of the 1960s. And his framing not only misunderstands the complexity of progressivism, it mixes in aspects of working-class culture as well. For example, no progressives ever saw politics as “a battle of wills.” Nor did the progressives ever try to “marshal an army” for social change, as he later argues.
To some extent, Silver is mixing up the "personalist" progressives of the SDS and early SNCC era, and the later dogmatic leaders of the Black Power movement and groups like the Weathermen. It is informative to note that the Black Power movement was fundamentally (and explicitly) an urban working-class movement, and that it was the working-class that emerged as increasingly influential in the South (in the form of Deacons for Defense, for example). The early personalists were quite optimistic--they only became cynical later on, and that's when their strategic approach shifted--and many of them simply "dropped out."
This also raises questions about what exactly Silver means by progressivism. In his discussion of his dichotomy, he equates Marxist perspectives with those of the progressives. But the progressives, as I understand them, have never really been Marxist. As fairly comfortable middle-class professionals, they have never had much interest in attacking capitalism directly. Were the Marxist ideologues who emerged late in the 1960s “progressives”? I don’t know the history of that aspect well enough to say, but I doubt it.
In this later post, he argues that he was actually talking about "populists" as his "radical" progressives, but that doesn't really capture the distinctions he laid out either. See Paul's detailed discussions of populism here and elsewhere.
Silver is mixing so much up in his analysis that I’m not really sure what he’s talking about.
If people are interested in a more nuanced discussion of intersections between different kinds of progressivism and working-class visions of action in the Civil Rights Movement, you can see this draft case study chapter from my book. Part of my goal in the case study is to show how these abstractions break down and become intertwined in unexpected ways as they play out in the real world.
While the general population has been in recession for one year, people of color have been in recession for five years. By definition, a long-term recession is a depression.
. . .
Extreme economic inequality (which the U.S. experienced in the 1920s and is again experiencing now) is often a key indicator of recession and/or depression. The Black depression of today may well foreshadow the depth and length of the recession the whole country entered in December 2007. A deep recession would see median family income decline by 4%. Thirty-three percent of Blacks and 41% of Latinos would drop out of the middle class. The overall national rate would be 25%.
Thursday, February 19, 2009
A new brain-imaging study is shedding light on what it means to "get lost" in a good book — suggesting that readers create vivid mental simulations of the sounds, sights, tastes and movements described in a textual narrative while simultaneously activating brain regions used to process similar experiences in real life.
"Psychologists and neuroscientists are increasingly coming to the conclusion that when we read a story and really understand it, we create a mental simulation of the events described by the story," said Jeffrey M. Zacks . . . .
The study, forthcoming in the journal Psychological Science, is one of a series in which Zacks and colleagues use functional magnetic resonance imaging (fMRI) to track real-time brain activity as study participants read and process individual words and short stories.
. . .[F]indings demonstrate that reading is by no means a passive exercise. Rather, readers mentally simulate each new situation encountered in a narrative. Details about actions and sensations are captured from the text and integrated with personal knowledge from past experiences. These data are then run through mental simulations using brain regions that closely mirror those involved when people perform, imagine or observe similar real-world activities. . . .
Changes in characters' locations (e.g., "went through the front door into the kitchen") were associated with increases in regions in the temporal lobes that are selectively activated when people view pictures of spatial scenes.
Tuesday, February 17, 2009
As the economy ails, policymakers, program managers, and service providers will be under extraordinary pressure to get the biggest bang for each buck. Be part of the discussion as experts tackle such questions as:
- What happens to children and families during recessions?
- What must federal, state, and local officials do to speedily implement the recovery package and coordinate programs effectively?
- Are service providers ready?
- How will budget-strained states handle a funding infusion?
- Can new and expanded activities jump start change in early childhood programs and other children's initiatives?
- Will the recovery plan's short-term boost take the pressure off Congress to make permanent investments and reforms?
- How should legislators and laypeople measure success?
Panelists:
- Derek Douglas, director, New York governor's Washington office
- Olivia Golden (moderator), institute fellow, Urban Institute; former assistant secretary for children and families, U.S. Department of Health and Human Services
- Douglas Holtz-Eakin, president, DHE Consulting LLC; director of domestic and economic policy, 2008 John McCain presidential campaign; former director, Congressional Budget Office
- Joan Lombardi, research professor, Public Policy Institute, Georgetown University; first director, Child Care Bureau, Department of Health and Human Services
- Matthew Stagner, executive director, Chapin Hall at the University of Chicago
This leads to beliefs that the world is constructed "for" agents with minds:
So how does the brain conjure up gods? One of the key factors, says Bloom, is the fact that our brains have separate cognitive systems for dealing with living things - things with minds, or at least volition - and inanimate objects.
This separation happens very early in life. Bloom and colleagues have shown that babies as young as five months make a distinction between inanimate objects and people. Shown a box moving in a stop-start way, babies show surprise. But a person moving in the same way elicits no surprise. To babies, objects ought to obey the laws of physics and move in a predictable way. People, on the other hand, have their own intentions and goals, and move however they choose.
Bloom says the two systems are autonomous, leaving us with two viewpoints on the world: one that deals with minds, and one that handles physical aspects of the world. He calls this innate assumption that mind and matter are distinct "common-sense dualism". The body is for physical processes, like eating and moving, while the mind carries our consciousness in a separate - and separable - package. "We very naturally accept you can leave your body in a dream, or in astral projection or some sort of magic," Bloom says. "These are universal views."
There is plenty of evidence that thinking about disembodied minds comes naturally. People readily form relationships with non-existent others: roughly half of all 4-year-olds have had an imaginary friend, and adults often form and maintain relationships with dead relatives, fictional characters and fantasy partners. As Barrett points out, this is an evolutionarily useful skill. Without it we would be unable to maintain large social hierarchies and alliances or anticipate what an unseen enemy might be planning. "Requiring a body around to think about its mind would be a great liability," he says.
Again, experiments on young children reveal this default state of the mind. Children as young as three readily attribute design and purpose to inanimate objects. When Deborah Kelemen of the University of Arizona in Tucson asked 7 and 8-year-old children questions about inanimate objects and animals, she found that most believed they were created for a specific purpose. Pointy rocks are there for animals to scratch themselves on. Birds exist "to make nice music", while rivers exist so boats have something to float on. "It was extraordinary to hear children saying that things like mountains and clouds were 'for' a purpose and appearing highly resistant to any counter-suggestion," says Kelemen.
Wednesday, February 04, 2009
There is apparently little research on this, but Richard Rothstein et al. report on a double-blind study showing that providing vitamin supplements to poor children directly resulted in increased test scores.
But, of course, giving poor kids vitamins in the morning (maybe yummy ones) is not only too difficult to do on a regular basis, it's not really important enough to study very carefully.
Pedagogy. Remember. It's all about pedagogy. That's what we do.
We now return to our regular programming.