Saturday, April 26, 2014

Faith, Reason, and Debating the Existential "Big Questions"

I'm past college now, and with those years has passed the habit of earnestly debating things like religion and the meaning of life. Attending a Catholic university and majoring in a "Great Books" program meant that I fielded my share of challenges from those who believed something different than I did, and one of the most pressing questions that came up at that time was why.

Why do you believe?

There is something fantastic and mythological, certainly, about the story of a God coming to earth in order to offer Himself up as a perfect, spotless sacrifice in order to atone for every human sin, past and future, and reconcile the human race to Himself as God. The particulars of the story are indeed quaint and uncomfortably sentimental: a sweet young woman chosen to miraculously conceive God's child; archetypal authority figures hatching dastardly plots and darkly scheming to stop this bright young hero; a set of bumbling accomplices; an impossibly evil death; and the most mythical and unbelievable thing of all: that he was killed and then came back to life.

To my friends, well-educated and mostly liberal humanists, the tale of Christ bears too many similarities to the quaint myths of many other cultures, and is only the biggest myth in a child-like narrative of the world with a stylized creation story and a lot of horrible barbarities. Compared to the sophisticated promise of modern disciplines like sociology, psychology, and the specialized sciences, a primitive culture's myth seems plainly archaic. How could anyone believe this, much less someone college-educated?

The challenge in answering this question is that it is ideological rather than academic. Those who ask it have a certain perspective, one I don't fully understand, but which seems to preclude the idea of the supernatural. Some profess to be humanists, who believe that continued enlightenment in the sciences will eventually conquer our social and personal afflictions. Others profess to be rationalists, believing only in those things that science has proved or theorized.

Such alternative belief systems are not, in and of themselves, ideological. They fall more truly into the existential category, defining who we are and why we exist. But they seem to come with a lot of ideological baggage these days. After all, elements of our society today are unabashed and even aggressive apologists for faith (professing the Christian doctrine of sola scriptura), and many of them speak in terms of condemnation, specifically of condemning those who disagree with them to hell. They often stand for uncomfortably traditional values as well, like maintaining traditional gender and socio-economic roles. Suddenly we aren't talking about a different moral and existential perspective; we're talking about an ideological opponent. And, to be fair, there are fundamentalist Christians who are offensive and judgmental in proselytizing their beliefs.

But to turn the tables, many so-called rationalists and/or humanists can be just as aggressive, and I am skeptical that their explanations of the world are actually more 'rational' than a faith-based one. It's easy to talk about gravity or astronomical relations and say that we can "prove" real science empirically, but I doubt that many of us have empirically viewed the behavior of a virus, or the release of certain brain hormones causing affection or depression. We accept that viruses and brain hormones work a certain way because we have studied the effect of those things and measured them in actual humans, so we know they exist and they affect, somehow, our health or mental state. We also believe people called "scientists" when those people tell us about viruses and brain hormones (and the behavior of chemical elements, and many other things), because we have faith that their education and certification makes them intrinsically trustworthy on certain issues.

Whether you trust a scientist or a theologian (or a priest) is really the question. An op-ed in the Washington Post recently pointed out very thoroughly that the two sides are not mutually exclusive. I have little to add to the writer's argument because I agree with him -- I believe in the story of the Christ and yet also pursue understanding of scientific matters, because I want to know more about us and this world we inhabit. He ends with a marvelous paragraph worth quoting in full:
The problem comes when materialism, claiming the authority of science, denies the possibility of all other types of knowledge — reducing human beings to a bag of chemicals and all their hopes and loves to the firing of neurons. Or when religion exceeds its bounds and declares the Earth to be 6,000 years old. In both cases, the besetting sin is the same: the arrogant exclusive claim to know reality.
The answer to the question of why I believe the entirety of the Christian story, with its quaint mythological narratives about paradisiacal gardens and apples of the knowledge of good and evil and floods and prophets and whales and the Son of God, is that I find it more plausible than any of the alternatives. It simply makes more sense to me. Not necessarily in the physical particulars ("do you really believe that some prophet actually parted water to create a passage?"), but in the tale it tells of how humanity became prone to doing bad things and how God then came Himself to redeem humanity from its sinful nature.

The Christian tale is plausible to me mostly because of my own experiences of sin and redemption. The vast majority of these experiences are with my own sins and redemptions in my life so far, and a few of them are observations of other peoples' sins and redemptions. On a precious few occasions I recall witnessing a miracle, or experiencing a beatific presence I attribute to the Christian God. These things are open to interpretation in an academic sense, of course. Rationalists might argue that my experiences of good and bad in myself and others are filtered through a strongly inculcated Catholic belief system. They might doubt that I, in fact, saw or experienced so-called "supernatural" things, and point to the demonstrated tendency of humans to manufacture memories that suit their subconscious perspectives. And as far as that goes, they may be right. I can't transmit my experiences to others, so I can't expect anyone else to believe my conclusions. And yet I can no more forget them than an astronaut could forget his view of a round earth from space, or an astronomer could forget the sightings and calculations showing that the earth and nearer bodies revolve around the sun in elliptical trajectories.

My point here is not to convince anyone of my beliefs. I don't think that's possible -- neither a rationalist nor a faith-based belief system can be truly transmitted via dialectic. Any belief system has to be experienced to be believed, personally and deeply experienced. And for a human, that means engaging both the intellect and whatever part of the brain controls belief.

Someone who believes that human emotions like love and depression are a combination of neuron activity and chemical activity in the brain has probably actively engaged the subject: he or she likely wondered why people experience love and other emotions, and pursued the answer until they found an explanation. That's the activity of his or her intellect. He or she also had to exclude other explanations for emotions (presuming they found others), such as activity of a metaphysical soul, or instinctual behavior bred in by evolution, which is primarily a decision of faith. Does he or she trust neurologists who measure neuron activity and brain chemicals? Priests, philosophers, and/or wise men and women, who have reached a supernatural explanation due to their long experience in considering and/or observing human behavior? What about sociologists and/or biologists who study behavioral patterns and instinct activity?

Personally, I don't believe that a scientist is intrinsically a better person than a priest or a philosopher. All three are human, which means they are subject to the same ideological myopia and vices, as well as the same inspiration and virtue, as the rest of us. No single person knows everything, and experience teaches that even if a person did, he or she would forget part of it, or hide part of it, or even use it to his/her advantage. Positing that it's possible to know everything, and use that knowledge correctly, is coming dangerously close to positing God. Whether we follow to that conclusion, or stop short -- and who/what we decide to trust and therefore believe -- well, that's just our obligation as rational beings. We each must individually decide what to believe.

It's natural that each of us would seek like-minded friends in the world, and so it's easy to see how we would gravitate towards those who believe the same things. So begins ideology, or the pursuit of actualizing an ideal, which carried to the extreme ends up forgetting that ideas are not more important than people -- or so I argue as a Christian: that individuals have the highest intrinsic value; ideas may be valuable but they're not worth more than life itself.

I plead that we don't let this social instinct push us into prejudice. I and many people I know believe in the teachings of Christianity and yet also follow the progress of scientific knowledge. Many of these people are scientists or doctors themselves. And likewise, I know people who consider religious faith (Christian or other) irrational and yet do not reduce the human experience to the peculiar behavior of a peculiar animal, enslaved to instinct and evolutionary imperative.

So let's not discuss these existential issues of faith, science, reason, and belief with a desire to win, especially to win by painting other belief systems in pejorative colors. Rather let's do it to better understand ourselves and each other.

Monday, April 21, 2014

Restoring the Meritocracy, or addressing concerns about the US Officer Corps

Recently Mr. William Lind published his latest article, and as usual it was provocative. Titled "An Officer Corps that can't score," it argues that the United States military has lost the competitive edge in combat for the following reasons:
  • An ego problem, the apparent perception of US Officers that they oversee the best military that's ever existed;
  • A personnel problem, that officers are punished for creative thinking and innovation (and the mistakes that invariably accompany such a mindset);
  • A staffing problem, which shortens command tours of duty so everybody on the bench gets a chance to play, if only for a short period of time; and worst of all,
  • A moral problem, in which officers support and perpetuate the status quo to protect their careers--notably a problem the US Military did not have after the Vietnam conflict (according to Mr. Lind).
Certainly these are serious accusations. Mr. Lind's article sparked a great deal of response, too. Several active duty officers penned articles which asserted indignantly that there *is* a great deal of debate in the military regarding staffing, weapons acquisition, force structure, and other 'big picture' issues. What is conspicuously absent from the responses, however, is a critique of the personnel situation--which, as the lynchpin of Mr. Lind's argument, probably deserves the most thoughtful consideration.

Mr. Lind's own history plays a big part in his critique as well. I've never met the man, but if you'll indulge in a little amateur psychology, I would say that Mr. Lind very much has a dog in this fight. He was foremost among what he calls the most recent wave of "reformist innovators," and highly praises his contemporaries Col Boyd (USAF) and Col Wyly (USMC), with whom he generated much of the intellectual foundation of so-called Maneuver Warfare. He also helped introduce and develop the theory of Fourth-Generation Warfare, an extension of Col Boyd's definitive and much-lauded omnibus theory of combat, "Patterns of Conflict." Anyone who is a bit startled (and/or stung) by the opening line of his article, "The most curious thing about our four defeats in Fourth Generation War—Lebanon, Somalia, Iraq, and Afghanistan—is the utter silence in the American officer corps," ought to at least realize that Mr. Lind is aggressively applying the theories of warfare that he developed and championed to a very broad-brush statement about our apparently constant defeats.

The predictable--and justified--knee-jerk reaction by junior officers in the US Military is that Mr. Lind is wrong, and that there is anything BUT silence about the struggles and outcomes of these so-called "Fourth Generation Wars." Indeed, in my own experience there is a lot of debate about technology (drones, bombs, tanks, and their efficacy) and tactics regarding the most recent conflicts in the Middle East. That is all very good. But I think Mr. Lind hits the nail on the head when he criticizes the military--particularly the officer--personnel system. And while there is a lot of debate about that issue as well, it's usually conducted in hushed voices and away from field grade and higher officers.

Complaints about personnel issues usually center around field grade officers focused on achieving the next rank (and running their subordinates into the ground to get it), or general officers trying to maintain their reputation to their civilian masters with an increasing administrative burden of annual training and paperwork accountability. To the uninformed, it just sounds like bitching, but hearing enough of it reveals that both types of anecdotes coalesce around one central issue: today's officer cadre does not have either the time or resources to focus on warfighting.

How has this come to pass? At the danger of theorizing ahead of data, I have some suggestions:
  • First, during the Iraq and Afghanistan conflicts we created a whole sub-combatant-command for each location, complete with Joint Force Commanders, Functional Component Commanders, Service Component Commanders, and associated staffs. This effectively doubled the requirement for staff officers in each of the four major service components. In addition to being top-heavy, it prevented the whole coalition from having any true cohesion as a unit, because new units were revolving in and out under a joint commander who, in addition to directing the whole campaign, also had to administer the vastly increased relief-in-place and transportation requirements of such an ad hoc system. Imagine if Patton had new armored and mechanized units rotating in and out of the 3rd Army throughout 1944 and 1945. Would he have been able to build such a successful and dynamic fighting force?
  • Second, as a corollary to the first, there are career requirements for officers appointed to joint commands. The demand for those officers has forced the services to cut career billeting corners to get enough qualified officers to meet the demand. That is a recipe for "check-the-box" leadership and careerism from start to finish.
  • Third, most services made a decision to shorten deployment times in order to ease the burden on servicemembers' families. This was a social decision, and it may not have been a bad one. However it did create a 'revolving door' in nearly every unit in the military, as whole combat units turned over from year to year and had to be assigned places in the supporting establishment, which in turn was bloated beyond needs and suffered the same 'revolving door' effect. The Army alone experimented with year-long deployments in the hopes that more time in country would allow greater innovation and success in the counterinsurgency fight; I'd be curious to see if there were any positive results.
  • Finally, Congress has micromanaged the benefits of servicemembers to the point of restricting officers from shaping their force. I doubt anyone in the military, including me, would complain about pay increases, money earmarked for better base gyms and housing (including 'in country'), and a reduction in sexual assault and/or suicide. The problem is the way Congress has enacted these changes. Forcing them down the military's throat creates a culture of 'yes-men' who must "support and defend" the Constitution by bowing to each new decree of a prime Constitutional institution, Congress, no matter what that does to already scarce military resources. Sergeant Major Barrett's comments, while tactless and insensitive, demonstrate the frustration of many military leaders that servicemembers need meaningful combat training, expensive as it is, more than they need administrative sexual assault training and fast-food joints on base.
The prevailing sentiment among junior officers is that the military is directionless, or maybe more specifically suffering the pull of too many 'missions' at once. There's Congress, forcing social changes and shutting down government. There's the so-called "War on Terror," which carries real danger but no real reward--neither Congress nor the Services themselves seem to care much about it anymore. There's the Administration, preaching a "pivot to the Pacific" and a drawdown, which ominously promises more tasks for the military to accomplish with fewer people, and there's the innate sense of honor in the services themselves that expect the officer cadre to keep all these masters happy and still field fighting units.

In this context, I will speak heresy to the die-hards and state that there's small wonder junior officers in particular keep their heads down and try not to screw up (i.e. bring all their servicemembers back alive with comparatively little regard for 'the big picture'). It also explains why so many veterans of the recent conflicts look back nostalgically on the simpler world of their combat tours, when they had a single direct mission and a feeling of accomplishment.

So what sort of reform would make Mr. Lind happy? I'm not sure, as he simply bemoans US Officers' lack of creativity and moral fibre, but I have some suggestions on that score as well. But first, I'll point out that some of the best ideas have come from much more creditable sources than me. Go there, and explore.

My ideas are pretty simple. There is a romantic conception floating around that the military is a meritocracy--in other words, that the officers who are best at their jobs should be the ones who get promoted. The shortened command tours, vast administrative requirements, and glut of officers in the services effectively obscure the good officers from the mediocre, lowering morale and motivation. I believe that the best leaders in today's military truly seek a chance to lead and to show their mettle, so I propose the military make a few structural changes to recover a merit-based promotion system.
  • Lengthen command tours, including the tours that are required for command screening, to 3 (or 4) years. This would first of all require existing commanders to put a lot of thought into the junior officers they promote, knowing that the officers they evaluate highly will eventually control a combat unit for three years (instead of 18 months), and would allow existing junior officers a lot more time to develop and lead their troops under the guidance of one Commanding Officer. 
  • Longer tours help mitigate the 'zero-defect mentality,' a colloquialism which refers to the reality that one mistake in an officer's career is enough to prevent him/her from making it to the next step, because he/she will always be compared to other officers with no such mistakes. It's a lazy way to evaluate, because the positive effects of the officer with the mistake may be greater than those of his/her peers, and may indicate greater potential. But at least with a full 3 years of observed time, officers will be able to recover from mistakes--and their seniors will be forced to consider which of their subordinates are best suited for further opportunities, knowing that maybe only one will have the opportunity.
  • Longer command tours also permit greater unit stability, which will increase esprit de corps, has been shown to reduce things like suicide and sexual assault, and will certainly increase combat effectiveness.
  • Increasing tour length will be essentially meaningless if officer staffing remains high, because right now it seems like every officer gets the chance to move on regardless of his/her performance against peers. As part of the draw-down, the military as a whole should reduce officer staffing to the minimum level required for service administration, starting with Generals and working down the rank structure (and this reduction should occur before any enlisted personnel cuts, in accordance with good leadership practices). The military should also eliminate the additional joint force staffs located in Iraq and Afghanistan. This will be an unpopular step, as many generals will be forced into retirement, many more field grade officers will be forced into early retirement, and many junior grade officers will not have the opportunity to continue in the military past their first tour. It would help ensure, however, that only the best officers in each rank will remain--reinforcing the idea of the military as a meritocracy.
Actual, active duty officers have much more specific lists of things which need to change, most of which revolve around their ability to train their servicemembers. And we should listen to them. But we can't force current officers to change their way of thinking--most of them have been shaped by the questionable leadership environment that Mr. Lind notes for the entirety of their careers. We can, however, collectively change the game--we can stop pretending that 'everybody gets a chance' and start giving our officers the space and responsibility to fully lead their men and women. That's why most of them sought a commission in the first place.

These kinds of changes will force leaders at all levels to focus on quality, not qualifications; they will force officers to make tough evaluation decisions after years of watching their subordinates develop. Ultimately, only the top 20-30% will continue their careers each tour, which will ensure that only the most effective officers run our military.

When our nation's security and American lives are at stake, isn't that what we want?


Saturday, September 21, 2013

On Pope Francis and the teachings of the Catholic Church

Pope Francis has really shaken up the Catholic Church this time. He affirmed that people of good will, even if they are atheists, are close to Christ. He condemned the, er, widespread condemnation of homosexuals. He has called for the Church to be more open and welcoming, which is seen by many as a hint that he will relax the mandates of the Catholic Magisterium. His "stance," a ridiculously vague term which implies his worldview, agenda, and perspective, has prompted much spilling of ink on these controversial subjects from news media and Catholic commentators. And yet he has affirmed Church teaching as well. What could be going on?

I think the commentators may be missing the point. Pope Francis wouldn't be the first pope to institute sweeping changes, but until that happens I'm going to assume that he is doing what his predecessors have done, which is teach the faith. And so far, his comments affirm what the Catholic Church has always taught, though perhaps not with so much emphasis: that humans have free will gifted to them by God, with which others should not interfere; that they are called to follow their conscience and be people of good will; and that they must treat others as they would treat themselves.

These teachings mirror, as far as I can tell from my own scriptural study, what Jesus himself taught. Remember the Good Samaritan, the woman charged with adultery, the tax collector at the temple, the centurion, and the Prodigal Son? In each of these stories, Jesus chooses forgiveness over condemnation. He also identifies unlikely protagonists; namely Samaritans, Roman soldiers, tax collectors, and outright, confessed sinners. Most homilies/sermons I've heard on these scriptural passages emphasize that the humble, the lost, those seeking goodness are the ones close to God--and the spiritual authorities (the Pharisees) are outside God's favor.

Notably, the Gospels have little good to say about the Pharisees. They constantly try to trick Jesus and get him to blaspheme against the law, they grumble about him associating with enemies of the Jews, and Jesus himself condemns them pretty stridently, calling them "hypocrites." The reason for this, I think, is because they are overly scrupulous, a word which used to mean "overly concerned with rules." Put simply, to be scrupulous, a person must be dedicated so much to following the letter of laws and dictates that he or she fails to accomplish the good for which those laws and dictates were instituted in the first place.

Pope Francis seems to have evoked this element of Gospel teaching in his recent statements, speeches, and interviews...and it is not surprising that he has caused a furor in doing so. The Gospel's challenge in this regard is a very personal one, and it strikes at the core of each person's unconscious, but deeply held, beliefs and convictions. Left to each of our own devices, I'm sure we would each live a good life according to our own perception and experience. But such is not our world: we are immersed in society, and so we contact nearly infinite other perspectives and experiences. What are we to do when another perception, or someone else's experience, challenges our own?

This question is not a Catholic one; it is a universal one. Who among the people of the earth today has never been shamed out of an opinion by someone else's story, or never had a rival in love, or cause for jealousy, or has violated his or her own values? These essential human conflicts we can resolve in one of three ways: first, we can ignore the conflict by becoming a hermit--either separate from the world, or existing within it yet unwilling to challenge our fellows; second, we can identify one rigid set of values to give our interactions structure, and never change them no matter what additional experience we receive; third, we can interact dynamically with our world, seeking understanding of others, though we are aware that we might hurt them. The third option requires love, humility, compassion, generosity, and forgiveness, and it is the Catholic answer, for the Gospel teaches it.

With much distinguished scholarship on sin, objective evil, and the elements of a good life, the Catholic Church has (probably inadvertently) created the elements of a rigid set of values, for those who choose the ease of such a moral compass. Yet other members of the Church (and our society) seem to withdraw from the difficulty of true compassion and generosity, preferring a simpler course of benevolently accepting all experiences--and, along the way, granting themselves license to ignore the idea of "true values" in favor of no values.

I don't mean to generalize here. I realize that few, if any, people live their lives entirely on one side of the spectrum or another. In truth, I suspect that each of us has beliefs about which we are scrupulous, and others which we choose not to engage. But Pope Francis seems to be guiding Catholics toward the third way, reminding them that they have a responsibility to dynamically engage the world, seeking to love and care for all people--even (or perhaps especially) for those who are most offensive and pharisaical to them. Whatever sins we abhor in our neighbor, remember Jesus calls us foremost to love them anyway.

And should we be tempted to pass judgment when dynamically engaged, I submit that we remember, whether we are progressive or traditional Catholics, that for all we imperfectly know of God's perspective from the Bible and Church teaching, and no matter the reach of our own perspective (necessarily limited by the tiny fraction of humans we happen to know), in the end we surely must admit that we cannot know the heart of God, as the Bible reminds us through Isaiah and Christ himself.

So let us not commit the sin of the Pharisees, and use our religion to condemn and degrade others, either by accusing them of a lack of love or by passing judgement on them for failing to sufficiently respect the rules. Let us not pridefully arrogate God's province of salvation and condemnation to our imperfect human understanding. Let us remember Jesus' warning about the child and the millstone, and remember that when we fail to love and show compassion for someone who appears to us as sinful, and treat them instead with disdain, we may very well become the agent of their stumbling in their relationship with God.

I believe that is the message of Pope Francis. It can be boiled down to love your neighbor, whether he/she is scrupulous or progressive, tends toward latin mass or the vernacular, is gay, or has had an abortion, or is divorced. Loving your neighbor doesn't mean condoning what they do, but there's a catch here. If one perceives a sin in someone else, couching a correction (or condemnation) in loving words is not the same thing as loving them. In fact, it is usually the opposite. And anyway, we are pretty specifically told to focus on our own sins, which--if we presume to pretend that we're better than others morally--may trend toward the deadly.

As far as how to love our neighbor, I suppose our best teacher is Jesus, who wants us to feed, clothe, care for, and visit in prison our neighbors, in metaphorical ways as well as material. Pope Francis, bless him, provides us an excellent example of this, and the crowing or panicked pundits of his papacy only indicate that we certainly need both his reminder to love, and his leadership.

Friday, September 6, 2013

The Injustice of our Over-Sexualized Society

I've heard/read a lot of discussion about sex recently.  More so than usual, in fact.  I think perhaps Ms. Cyrus' glorification of wanton, objectifying lust at the recent Video Music Awards (VMAs) is chiefly responsible for re-kindling this long-running social debate.
Because much of our interaction occurs on the internet, there has been a glut of responses to the VMA performance particularly, and there have been some popular blog posts on the subject of over-sexualized young people generally.  Social media friends of mine have posted among themselves two articles in particular that caught my attention.  The first is an "FYI" (for your information) by one Mrs. Hall to girls who post sexy pictures of themselves on social media; the second is one of many dismayed responses to that "FYI."

I'm not here to criticize (at least, to criticize any more than stating my opinions and thoughts should be construed as criticism).  My heart goes out to Ms. Cyrus because I worry that sometime in the future she will regret her lascivious performance at the VMAs.  She may regret it because she's tired of hearing about it, or because she worries what her parents, future boyfriends, or future husbands will think, or because she doesn't want her children (if she decides to have any) to see her in that degrading sexual way.
Worst of all, she may even have relationship troubles and/or decide not to have children because of that performance (and its backlash).
After all, many have pointed out that it was an excellent attention-getter, and in my experience most people who want attention have some emptiness inside they hope the attention will fill.  If that's true of Ms. Cyrus, then it's likely that she will cling at least partially to whatever ideology of hedonism or reckless youth inspired her to create her drug-celebrating, sex-celebrating, promiscuity-celebrating song--or to, er, 'perform' Mr. Robin Thicke's sex-celebrating song--in the first place, because it got her some attention.  It's also something of an ideology today to forego regret and embrace mistakes, because...well, I don't know.  A sign of maturity is recognizing past mistakes and resolving to do better, which is also a form of atoning for those mistakes, and which implies regret (because if you didn't regret it, then why was it a mistake?).  Unfortunately, refusing to admit mistakes and celebrating promiscuity are not conducive to long-term, loving, trusting relationships.  As I am fortunate enough to be in such a long-term, loving, and trusting relationship, and I realize just what a wonderful thing it is, I hope that Ms. Cyrus (along with everyone else) finds it in her life.
So I hope my concern is unwarranted.  But the idea of celebrating sex, and the idea of freedom from mistakes and regrets, is not limited to Ms. Cyrus or Mr. Thicke.  We've all had those ideas in our lives, because we've all experienced the excitement, desire, and lust for sex, as well as the unpleasant guilt, anger, and depression that come from making a mistake (and for the record, I'm talking about all mistakes here, not just the relationship ones: botched interviews, incomplete projects, hurtful words, everything).  It's a psychological certainty that as we begin to experience sexual desire, we will model our parents' behavior towards the same.  It's also virtually certain that our reckless teenage brains will revolt against our parents' staid views, and that we will in some way, big or small, experiment with sexuality.  And by the time we are full adults, most of us will have had some actual sexual experiences (good, bad, and ugly) and will have reflected a bit on them: sometimes just by ourselves, but certainly when confronted with those we hurt along the way, or those who loved us along the way, or our own pain or heartbreak.

All that reflection certainly does not mean the end of sexual desire or the freedom that we felt as teenagers.  In a good relationship, such 'teenage feelings' ought to be part of the emotional connection.  Certainly a multi-thousand-year-old canon of love stories, love songs, love poetry, and, well, love bears witness to the fact that at our best, we don't grow out of excitement, desire, and carefree recklessness. We might temper them with some restraining virtue as our prefrontal cortex develops in our mid-20s, but we don't get rid of them.  How would our relationships last otherwise?

But our collective experience of these things should teach us not to condemn them in our young people.  I think it's a pity that today's teenagers got to see the VMAs, it's a pity they have such easy access to sometimes very disturbing pornography, and it's a pity that social media makes people so accessible.  But if a teenager, boy or girl, is interested in or desirous of sexuality, that's neither bad nor good--it's totally amoral.  It's a result of their development as a human being.  The only negatives are their exposure to what I would consider unhealthy sexuality (Ms. Cyrus' performance, disturbing pornography), and the fact that the entire world can see their development and their mistakes via social media.

The original FYI subtly condemned some young women--friends of the sons of the author--who posted sexually alluring pictures of themselves on social media websites.  Setting aside the fact that it was cruel and demeaning criticism--with its detailed, contemptuous notes on a missing bra or an arched back--the article explicitly held the young women responsible for the possibility of unhealthy sexual thoughts in Mrs. Hall's sons.  The responses are consequently fun to read: they point out that young men also post pictures of themselves in sexually alluring (in a masculine way) poses; and, more to the point, they note that whatever desire a young man might feel, healthy or not, it is his responsibility and his alone to behave well.

I won't rehash their arguments, but I encourage you to check them out.  If you happen to think that the alluring or sexy behavior of a young woman, however self-demeaning it might be, is a good reason for you to treat her demeaningly yourself, then perhaps you in particular need to read the responses.

I wish, however, to add two considerations.  The first is a study which showed that a male confronted with a picture of a scantily clad, sexually alluring female has increased activity in the area of his brain that relates to tools and other items he can manipulate; the second is the fact that nothing we record and/or post online is ever, ever erased in our frighteningly new technological age.

The first consideration helps explain (but certainly does NOT excuse) why men are disposed to treat women who present themselves sexually worse than they ought to.  It also probably explains why Mrs. Hall, who no doubt wishes for her sons to grow up into respectful and gentlemanly men, directs some misplaced (and justly condemned) anger against the sexy selfies she saw on social media.

The second consideration is really more of a piece of self-preservation advice. Fortunately, most of us who ever made an exhibition of ourselves as Ms. Cyrus did (publicly or to a partner) can safely hide it away in our past, where we won't have to explain ourselves to future partners, spouses, or children.  Social media took off after I had graduated from college, and I shudder to think what my statuses or pictures would have shown had I been able to post them publicly.  Now, sexy selfies, videos of less-than-respectable party behavior (whether silly drunkenness or salacious dares), and other records of typically reckless, experimental teenage behavior are there for the viewing--by everyone.  Literally everyone in the world.  They've been used to bully and demean, too.  Make no mistake: such evidence, which by right may be personal and private but in reality is accessible to everyone, may determine to a large degree one's reputation.

Furthermore, while I personally hold that promiscuous, reckless, or demeaning behavior is equally bad in girls and boys, I don't think that's a very widely held perspective.  Mrs. Hall, along with (no doubt) many mothers of sons, seems predisposed to blame alluring young women for the vicious thoughts of men; the frat-boy locker-room culture of male college students and young professionals seems to have no problem with the essential hypocrisy of demanding easy sex but condemning 'easy women.'  Which is very unfair.  Food for thought.

So if you do find yourself looking at a sexy selfie of a young girl on some social media site (Mrs. Hall and any other parent or teenager), remember that the young person in question is probably in the process of discovering sexuality for the first time, is maybe expressing a totally understandable and fairly universal desire to be attractive to the opposite sex, and is ultimately behaving just like the young men who post shirtless pictures of themselves at the beach.  And if you argue that it's 'boys being boys' for the young men, but find yourself feeling shocked and/or condemnatory about the young women, then perhaps we should re-evaluate the phrase 'boys will be boys' and advise our children, regardless of their sex, to stop over-sexualizing themselves.  Then perhaps the young girls whom Mrs. Hall, along with much of society, picks on might get a break from the unjust blame, contempt, and responsibility.

Because we all recognize the importance of raising virtuous children.  Many have noted that Ms. Cyrus and Mr. Thicke are notably devoid of certain virtues, like respect for self or others, as an explanation for their overtly, almost offensively sexual behavior.  (Which is perhaps unfair to Ms. Cyrus, because she's only in her early 20s and therefore hasn't finished developing the part of her brain that houses virtue, while Mr. Thicke is in his 30s and has no excuse.)

Such judgment may be true, academically.  I certainly wouldn't wish to act like Ms. Cyrus, because her performance at the VMAs seemed degrading and elicited disgust in me.  I hope I could make that judgment without judging her to be a bad person--after all, most people do things because they think they are good things to do, not because they have malice in their hearts.  Yet condemning Ms. Cyrus, or any young woman who uses sex as a form of allure in what she says, or how she dresses, or even (yes) what she does, is shifting the blame. It binds upon her some of our own individual guilt about sex or sexuality, and it is a mean and bullying way to feel good about ourselves.

So let's start by treating everyone, perhaps especially the over-sexed among us, with dignity, and let's teach our children the same.  It's better for us, better for our relationships, better for the people we meet--and it will be better in all those ways for our children, and for those who meet them.

Wednesday, May 22, 2013

On graduation generally, with some commencement advice

There's nothing that makes me feel older than seeing graduation pictures and congratulations posts all over social media. Many of my older friends have children graduating from high school, and many of my contemporaries have siblings, nieces, and nephews graduating. Such bright, exciting moments! And with the privilege of age I look back on my own shining moment, and remember that the aftermath was, well, difficult. The years following my degree were filled with growth and development, much of it painful. This article eloquently captures the sense of loss, the frustrated loneliness, the unexpected difficulty of 'real work' in 'the real world.'

I'll bet most of us remember this transition, and we have all sorts of advice for others on how to deal with it. But that's what all graduates hear, from the rumors of their friends ('I heard so-and-so is a great company') to the advice of parents or teachers to the commencement address itself.

Obviously, I don't remember any of the advice I was given. I doubt anyone else does, either.

I mean, there's just so much of it! And it's not like any graduate is eager to plan their life 'in the real world,' not with celebrating to do. And most college graduates are complacent to some degree--by their graduation year they have usually found a comfortable niche in their institution. By the way, the last semester usually doesn't have much impact on class standing, so I suspect that most college seniors spend rather more time socializing before graduation than studying.

So it isn't like anybody is paying much attention to those wise words spilled extravagantly over commencement ceremonies. Yet if they did (and I grant some do--not me, I am both proud and ashamed to admit, but some do), I doubt they'd get much out of the advice. The sad fact is that a college education does not prepare one for the 'real world,' and most commencement advice tells of how to succeed in the 'real world.'

By 'college education,' I mean the kind of education to be had at the institutions found in U.S. News' annual ranking.

I've written before that colleges and universities are supposed to be places that challenge students and train them, through rigorous study and discussion, to solve the problems of the world. There are technical colleges for training people to solve technical problems, there are law schools for training people to solve legal problems, and at the top of the proverbial academic totem pole there are the liberal arts schools, which train people (theoretically) to solve people problems. Subsets of these liberal arts schools focus on societal problems, or artistic problems (not really great language--the problem that artists solve is that of too little art in our communities), or economic problems. But regardless, the graduates of these 'colleges' are supposed to be molded into movers and shakers, advancers of society, generally good and reliable and virtuous people ready to make a positive difference in their business and social circles.

This still happens, of course. Excellent niche schools like Caltech or MIT maintain sterling reputations based on the observable skills and attributes of their graduates. Well-regarded liberal arts (or "general") schools like Harvard, Stanford, and the rest continue to turn out exceedingly well-educated young men and women. But the 'post-college depression' that plagues my generation (and which, I'd bet, is increasing) argues eloquently that the graduates of 'top-tier' universities are not prepared for the 'real world,' even if they are educated to the max. And I'll go so far as to suggest that today's college graduates may not be all that educated as compared with the graduates of yore.

If colleges and universities are separate institutions dedicated to forming young men and women for success in the real world, then they are also separate from it. That's necessary, but it is also dangerous. Separation breeds unfamiliarity, and unfamiliarity breeds contempt. True to form, a close look at 'college culture' shows that what higher education values in its students, and what it teaches, is a collection of loosely associated ideals that revolve around vague concepts like open-mindedness and experimentation and uniqueness or specialness. In my experience (a decade ago, but more recently via facebook) a great many of today's college students, and therefore graduates, see the point of college as getting an 'experience' (whatever that is--I assume it means mostly social drinking with a little class and sports spectating thrown in). That experience is supposed to be nurturing and consequence-free, so that students can 'find themselves.' It would be easy to launch into a righteous tirade, but those are a dime-a-dozen on the internet. Let's give this idea the benefit of the doubt and say that 'finding oneself' is probably related to being mentally and emotionally healthy, which directly impacts a graduate's contributions to society.

Yet research has shown that 'the college experience,' while it may nurture and produce open-minded graduates who have 'found themselves' (actually, I think it is probably unhealthy in sum), mirrors the 'real world' less and less. Grade inflation, on the rise since the 1960s, means fewer and fewer college graduates have dealt with failure and had a chance to recover. That's not like the 'real world'; in fact, many commencement addresses I've read (and successful people I've heard) talk about inevitable failures and the importance of learning from them and doing better next time. Research shows that failure is an essential part of learning, and its absence can cause excessive fear of risks and (you guessed it) failure. Students at 'top-tier' universities, crafted for admission by 'magnet' or 'prep' high schools, may not learn about failure at all. It's a particularly hard lesson to learn after college, when the consequences are life-changing.

What would make the university years a good separation is a connection to what comes after. Learning through challenge, failure, and perseverance is direct preparation for the 'real world,' and the separation is insulated from breeding contempt by the learning itself, which is external and oriented at the world outside--even if the world available is bounded by university walls. But taking away that external focus and emphasizing reflective learning, or 'finding oneself,' creates a shelter wherein the separation breeds contempt very efficaciously. The contempt is usually expressed in rejecting traditional social metrics of success (a family, real estate, success in one's community), pejoratively labeling them as conformity or settling, or re-imagining them as social slavery, materialism, bourgeois values, or patriarchy (depending upon the ideology at hand). Sadly, such beliefs encourage new graduates to feel that they are better than everybody else because they are somehow independent and free-spirited, non-conformist and appropriately ambitious. And because most people figure out that acting better than others is a great way to make enemies, they simply compromise and disassociate a 'professional life' from what they regard as their 'real life.' I've seen myself and many contemporaries literally wear two faces: a sober, professional face for work and family; and an artificial 'eclectic,' interesting face in selected social groups.

Certainly the perspective that the value of higher education lies mostly in 'the experience,' which students generally interpret as experiencing independence and fun, subtly relegates studies to an inconvenient but necessary part of 'the experience,' effectively teaching students to develop two faces--one for class and impressing professors, another for parties and Greek life. Essentially, the college years' chief role is providing ample 'find oneself' time to painstakingly craft the right persona for each environment. I know I spent many hours on AOL Instant Messenger, pompously posting impressive quotes and poems in my 'away message' and profile, hoping everyone would see what a sensitive intellectual I was. After college, I continued this personal dimorphism by constantly tweaking my Facebook profile. And I was not alone: I noticed the same behavior in college friends and co-workers. I can't help but think that a lot of the post-college depression had to do with all that focus on self and others, inviting self-doubt and constant comparison.

Worst of all, the values taught by the 'college experience' almost certainly have lasting, perhaps lifelong consequences. Although in humans the temporal lobe of the brain--controlling memory, the understanding of higher meaning, and complex problem-solving--develops by 17 years of age, the frontal lobe--controlling judgment, the understanding of consequences of future actions, and values (determining good and bad)--doesn't finish developing until a person is in their early to mid-twenties. So college or university freshmen are intellectually capable of learning undergraduate academics, but they are still in the process of developing their values. Teaching them anti-social behavior and sheltering them from the need to be valuable during the college years sets them up for failure throughout their lives, because those are the prime years when humans establish values. It becomes much harder to change values, and therefore behavior, after the mid-twenties. Believe it or not, there's science behind the phrase, "people don't change."

The dirty little secret about all of this is that graduates are about to enter society, and none of the things that make a person fit for society (whether the society is a military branch or a hippy commune) are truly taught in universities--except maybe through having to live with a roommate. People who think they're better than everybody else, or people who are self-focused (what, after all, is 'finding oneself' if not self-focus?), are by definition anti-social. And guess what, post-college depression? Being anti-social leads to loneliness, insecurity, and depression. Reference "The Roseto Study" (thank you, Outliers by Malcolm Gladwell), which shows scientifically that human happiness (and health) are tied to functional societies, which may be found in families, neighborhoods, and corporations.

So while the purpose of colleges and universities is to make students valuable to the societies they'll enter--valuable in the form of technical knowledge, problem-solving abilities, communication abilities, ethical development--I think that contemporary education falls short. Grade inflation means students don't really have to learn much of anything except perhaps how to charm their teachers into liking them, and their educational inheritance of anti-social qualities hinders their ability to join new companies and social groups after graduation. New graduates may possess a wealth of knowledge, skills, and abilities, but bringing all that goodness to bear in 'the real world' (a job, a relationship, or a new social scene) requires them to be social. In fact, success in the 'real world' usually depends more on one's ability to work well with others--to become a part of a team--than on measurable skills.

I suspect that's the origin of the wry and cynical phrase, "[Success is] not about what you know, it's about who you know," which is almost true. Really, success is not about what you know, but about your ability to know others and communicate.

That was the thing about Roseto: the town had a strong community. Most of the researchers who studied that place traced it to the fact that it was relatively isolated and its inhabitants traced a common ancestry to a small town in Italy. But despite being 'as American as apple pie' (in terms of employment, diet, interests, etc.) at the time of all the studies, Rosetans were empirically far happier and healthier than their fellow Americans. They had a functional society, a community where each of them was better known and therefore better appreciated by their peers, one that both allowed individuals more freedom to be 'who they were' and demanded enough conformity and external focus to keep all members of the community valuable. Whether or not one agrees philosophically with the Rosetan social structure, the facts speak for themselves. Rosetans were happier, more satisfied with their lives, less depressed, and less prone to diseases ranging from heart attacks to cancer. And it's all due to the fact that they were social--something that's hard to accomplish while living by the values encouraged in college and university environments.

I wish I'd known back then--really known (because I'm sure many wise people tried to tell me)--that all the hours spent on myself and my various images, all the anxiety I felt about the inevitable post-college setbacks, were wasted. Nobody read my facebook profile, nobody cared how eclectic my 'likes' and hobbies were, and nobody wanted to hear the complicated details of my love life or the reasons why I flubbed a particular task. Everyone was too busy developing their own self-image, simply wanted me to do my job, or cared about me and wanted me to be happy.

That's what I wish I'd known at my commencement, and I'm sure many people tried to tell me these things. Actually, I can remember some advice to the effect of, "be a good person and work hard." But I didn't listen as well as I should have, and maybe that's the story of college students everywhere. Perhaps if 'top-tier' colleges and universities emphasized team-oriented social skills, hard work, and resiliency more than self-focus and nurture, graduates wouldn't need any commencement advice. They'd be formed for success already.

Sunday, May 12, 2013

A Suggested Scientific Definition of the Meaning of Life

It's funny how thoughts and ideas coalesce. As I worked my way through five or six Michael Crichton novels on one of my periodic binges, I noted in The Lost World an astonishing addition to the standard evolution narrative. And then as I scrolled through my Facebook feed I saw the same idea again in a Psychology Today article, which I read and shared. I didn't quite recognize the relation then. Not until tonight. There is something truly fundamental to humanity about relationships, it seems--and it seems like humanity eschews relationships to a greater degree now than ever.

Many scientists and philosophers have marveled at the evolution of humanity. Since Charles Darwin revolutionized the understanding of biology by showing that some species suffered extinction and others evolved, a great many ideas have surfaced. For example, rationalists doubting the presence of God and suspicious of religion regarded this new information as another nail in the coffin of superstition, as humanity marched towards a greater and more knowledgeable civilization. Religious institutions, for their part, regarded it as an attack and fought back, first by declaring such belief heresy and blasphemy, and second by co-opting evolution as proof that God brought about His likeness on earth through this new biological process.

Interestingly, it seems that when scientists and thinkers began to examine the why of it all, they largely passed over the why as it related to humans. Some claimed that extinction and evolution were a result of "Natural Selection." Species had to adapt (evolve) to cope with changing environments, which included new predators and predation, or they suffered extinction. Later, more randomness was inserted with the claim that evolution was a result of gene mutation; namely, that species were randomly mutating all the time, and beneficial mutations conduced to survival and became species characteristics, while weakening or harmful mutations resulted in extinction.

These ideas became canon and carried wide influence. It's a neat and tidy explanation, and it has sheltered a comfortable delusion about the supremacy of humans. Anthropologists and sociologists drew a similarly tidy line from our forebears, who must have evolved oversized brains as a clever response to the dangers of their world. Because we humans alone developed tools, and imposed our will upon certain animals to domesticate them, and upon certain plants to cultivate them, and formed societies for mutual support, we ascended to "the top of the food chain." We evolved to survive, and natural selection has declared us winners.

But there are problems with this canon. One is that our near relatives, chimpanzees and gorillas, survive just fine without walking upright and without our oversized brains. Another is that we are not the only species who use tools and exhibit complex problem-solving and form societies. Most glaring, however, is the problem that has clouded the issue of evolution since it broke upon the world: we assume that we are superior because of our brains, our tools, and our civilizations.

Furthermore, the evolutionary canon usually ignores an essential element in species, which is behavior. It's clear that human behavior--complex problem-solving, for one, and societies, for another--has enabled our race not only to survive but to flourish. Interestingly, the behavior of certain animal species (such as gorillas, wolves, and dolphins) has conferred similar advantages. Perhaps my own research has been incomplete, but I have not found much historic literature on evolution which addresses this.

The Psychology Today article noted in passing some recent research that suggests cognitive development happens more efficaciously in social environments--literally, at play. The social environment seemingly must include physical interaction, like eye contact, physical cues, and auditory communication. It may have suggested that the human brain evolved to support more advanced social relationships, or it may have been Mr. Crichton. But that strikes me as a very important piece of information, because it suggests that our large brains are not primarily problem-solving apparatuses that (for example) helped us connive ways to bring down a wooly mammoth. More likely our brains have developed to handle relationships.

If true, this has great implications for our way of life. Whence notions of love, honor, duty, and morality? From an evolutionary standpoint, such notions are of dubious value, because often they result in annihilation (I recognize that they also result in creation). But in a social context--seeing society as an evolved characteristic which permits adaptation to an environment--such notions actually conduce to the continued presence of the species. Nature is full of examples where an organism sacrifices itself for the good of its species, and I find it unlikely that humans are that different. Take the common characteristic of mothers to defend their young unto death, if necessary. That is certainly a necessity for the procreation of the species, but it also fits into a description of love and honor and duty, which exist in nearly all human societies and accomplish the same end.

In fact, I've read some evolutionary studies that have remarked upon an accelerated development of species that exhibit complex behavior. Whereas it may have taken a geological epoch for life to move from single-celled organisms to multi-organ mammals, the last million years have seen the ascendance of incredibly rich biodiversity, much of it with complex behavioral components (not least humans and their close relatives). Is this a sort of evolutionary arms race? As competitors evolve to survive, they must become more complex and more social in the process. Larger-brained humans could not be born as mature as other animals, some of which can walk and even defend themselves immediately after birth, because the large head containing that brain wouldn't fit through the mother's pelvis. Therefore they were born increasingly helpless, with brains small enough to be housed in a birthable head--but with increased need of protection as well. So societies had to be formed to protect children until they could survive on their own--a misnomer in itself, because humans don't generally survive alone.

Which brings up another question: why don't humans fend for themselves? Why do we live in societies instead? Do we need cooperation to survive, or did we learn that cooperation (and the intangible values and ethics that support it) simply yielded a more successful species?

Taking religion, humanism, and arrogance out of the dialogue, it would seem that the human organism is constructed for societies. Perhaps that's why research shows that cognitive development happens the most in a social environment. Perhaps that's why the majority of our brain runs our 'unconscious,' which provides intuition, directs and interprets body language, and generates moral reasoning, all of which are of great use in helping us get along with our fellow humans, and of less use than our 'conscious' rationality when it comes to solving problems (like bringing down that scary mammoth).

So if science has an answer to the meaning of life, it would appear to be relationships and societies. That's nothing new, of course--the importance of family has been defended by Aristotle, the Bible, and every eastern religion with which I'm familiar. Concepts like freedom, and equality, and virtue, and the rest appear to exist to support our biologic need for society and its advantages. I'm not diminishing them; such concepts I believe to be essential. Yet they exist to serve our relationships, and not for their own sakes.

I find it a common conception that humanity has marched steadily from primitive beginnings to ever more advanced and meaningful civilization. I disagree with it, however--perhaps our physical quality of life is better now than in the Renaissance, but whether we are happier or better is conjecture, and, I believe, in doubt. Science tells us that the denizens of hunter-gatherer societies are happier and healthier than we civilized internet-users, by nearly every criterion (objective and subjective). In fact, the current generations are shorter, less healthy, and more prone to mental and behavioral disorders than recent generations, such as those which fought in the Second World War and the baby boomers.

While I don't believe that human societies will ever be perfect--society by its nature constrains, because cooperation requires sacrifice--perhaps the exploding diagnoses of mental illness and disorder in our own society can be traced to the breakdown in relationships. I think it no accident that the most destructive humans among us are usually loners, with little in the way of participation in society (there are exceptions to this, of course). It's no secret that our modern world offers a great deal of distraction from the business of relationships, from demanding careers to ubiquitous media to solitary entertainment (like video games). All of which makes it easier to plug away at the office, or watch one's favorite show (or sports team) on the television, or develop a virtual life via social media.

It's important to note that relationships and societies are physical things. Studies show that human senses are more stimulated by other humans--our eyes tend to focus on other people, we react more strongly to the smells of other people, we pick out speech more readily than other sounds. In a sense, our oversized brains are a complex communication device meant to connect us with other people. That does not happen effectively via computer (although I admit that deploying away from my wife made me very grateful for the telephone and video chat).

Apparently, the wisdom of religion and of old wives has been corroborated by science: family (and by extension, society) is everything. It is the key to humanity's success, and probably to its happiness as well.

Friday, May 10, 2013

The Leadership Puzzle

Generations of young men and women have entered the military wondering if they will be able to lead. The thought of ordering any number of cynical, experienced servicemembers to do something dangerous, or menial, or just plain unpleasant can be very uncomfortable. Many of my student lieutenants expressed the same fears I had: Would they have the respect of their platoon sergeant or non-commissioned officer-in-charge? Would they look like a fool? What mistakes were they likely to make? What if their troops didn't follow their orders?

There are many great books on leadership, of course. And there are many more great articles and perspectives (like this one) available online, which provide practical advice on the subject. But I've found that the reality of leadership is that experience does not transfer, and most books on leadership are mainly memoirs. So though they may be candid and clearly written, they don't necessarily help anyone develop leadership, because the experience they record is not transferable. I suppose a new leader could try to act like a leader he or she admires, but that is a little false, and it's common knowledge that false leaders are loathed and ineffective.

I was flattered to receive a telephone call from a young Marine second lieutenant earlier this week, asking if I could meet him and discuss The Basic School. That institution is abbreviated "TBS" (the "T" stands for "The," a fact which is amusing to most). He sought me out because I very recently taught at TBS, and so would be able to tell him how to succeed. We met last night at my house with another lieutenant in the area, grilled some bratwurst, and tried to talk about leadership.

But how? If I tell them anecdotes of my own leadership, I will be offering great sea stories, but since they are unlikely to face the same situations as I did (and even less likely to remember my stories if they do), the anecdotes aren't much help. If I tell them common mistakes, they will remember what I say as forbidden practices, without really comprehending the underlying issues or considerations that made each mistake, well, a mistake.

In the end, I mostly talked to them about the perspective and attitude of a leader, and illustrated with examples from my experience. It was the best that I could come up with, because I reasoned that very few people are 'born leaders' in the sense that they showed up to TBS already knowing how to lead effectively, and that the only way to develop leadership is to just do it, to get out there and keep trying until either you are effective, or you realize that your efforts aren't helping your subordinates. In the latter case, if you care about anyone but yourself, it becomes obvious that you should do what's best for them, and step aside for a new leader who actually is effective.

So to start with, I told them that the reason leaders exist is to accomplish tasks, and to do so correctly. In the military, officers carry the authority of the President of the United States as well as his or her special trust and confidence. Their leadership is burdened by the expectation that they will succeed in their orders; that they will do so in an ethical manner; that they will proceed with as much care as possible for the health and welfare of their subordinates; and that their actions will reflect well on their service, their country, and their president. It's kind of a big deal.

But sir, I read in their eyes, but sir, that's all well and good and we get it. But what do we DO? or better yet, what shouldn't we DO? Because while their commission and their oath of office is impressive, it doesn't really provide much direct input into the day-to-day actions and interactions of a leader. And because they were raised in our current educational system (through no fault of their own), they unconsciously expect every problem and every question to have one specific, correct answer. Unless they can write it down in their notebook, memorize it, and then present it in response to a question, they find it confusing. That is how learning is modeled today, and they have to learn a new way to learn...they must learn to come up with answers on their own, guided by vaguely phrased virtues and values. Faced with the chaos and complexity of leading others, especially when they lead others into discomfort and adversity, they must be able to create order.

So I continued that the expectations they carry as officers are held by everyone around them, from the President himself to the lowest enlisted man or woman. That nobody joins the Marine Corps to be mediocre, to have an easy life, to be ordinary. Within every Marine there is a yearning to do great things. And they expect their officers to make that happen.

Here is the reason why there can be no "do this" or "don't do this" list on leadership. Such things are too restrictive. Leaders must provide everything for their charges--they must train them well, give their lives meaning, teach them core values, and when necessary send them into dangerous situations. That means recognizing their value, knowing their personal lives, creating good training, and training yourself by orchestrating tough, realistic situations where you have to make hard decisions. The key to remember is that most people don't want what's good for them, they want what feels good. They don't want to change their practices, or complete another task, or sit in a class. They don't want to be the one selected to sit in a listening post near an enemy location. But a leader must decide what is right to do in a given situation--based on those commissioning expectations, as well as on the orders of his superiors--and then decide how to actually do it. Those are often hard decisions to make because they will involve sacrifice and toil.

But sir, their posture said, how do I know what is right? It's a fair question, because nearly every young leader is painfully aware of their great lack of both knowledge and experience compared to the men and women they lead. But there are a lot of resources available to a young officer--the platoon sergeant, the squad leaders, fellow lieutenants, their executive officer and commander, and the written experiences of their professional forebears. In fact, almost every leadership text notes (and I agree) that young officers should ask for help discerning what is right from those around them. It's courteous and frankly, they need the help. Yet no matter what, the leader must completely own every decision he or she makes, hard or not. Inexperience or naïveté does not excuse one from knowing right from wrong, and everyone inside the unit and out is counting on the young lieutenant to make the right decision.

So that is my first advice to a young lieutenant, or a new leader: Make decisions. Make them as best as you can.

Because leaders have to provide everything, they inevitably fail and fall short. Every young leader will look like a fool. Every young leader will at times be too hard on their subordinates, and at times too easy. Every young leader will in some way train their charges wrong and have to apologize and re-train. Every young leader will make bad decisions, even though they were trying their best to make good ones. It's inevitable. In fact, it's so inevitable that subordinates will expect those kinds of mistakes. And all of them are forgivable.

Unless it's a mistake because the leader knew the right decision to make and instead made a different one.

That's called a "mistake in character." That's called "violating integrity." That is universally identified as the one thing that is completely unforgivable in leadership. And I agree. That's when a leader shows him or herself unfit for the special trust and confidence, and frankly if the young, inexperienced, unknowing leader can't even do the right thing (even if he or she does it mistakenly), then he or she is useless. Literally useless. Less use to the unit than the newest private.

Now most people will also make some mistakes in character along the way. Little ones. Little lies, or little violations of orders. Here's the thing, however: it's impossible to avoid the scrutiny of the subordinates. They will know. They will see dishonesty in their leader's eyes and they will magically be there when their leader decides to do something in violation of standards or values.

So the only way to recover from a mistake as a leader is to own it. If the decision was wrong, the leader still has to take responsibility for it (and for its effects), and the leader has to make better decisions starting that very minute. Self-pity is not permitted. Giving up is not permitted. Never is it more important to continue trying than after one has made a mistake, and looks like a fool. No matter what just happened, no matter how many terrible or shameful instances led up to a given moment, in that moment the subordinates still need--still deserve--a leader.

They need a leader who does the right thing even when it's hard, and unpopular, and makes others hate him or her. A leader who keeps emotions out of his or her decisions. A leader who models behavior, professionally and personally, for his or her subordinates. A leader who refuses to expect anything less than the absolute best from his or herself, who is dedicated to developing his or her unit into a group of great, extraordinary people who can accomplish great things.

So my second piece of advice: Never give up, and never quit.

The rest is personality. Every leader is different, and to give advice beyond all this is to suggest specific behavior. But the old wisdom of leadership is based in these two truths, and I discussed them with my audience:

- Making right decisions applies to oneself. That is why leaders need to be good and professional. They need to be competent, devoted to their service, and they need to set the example for everyone. Making right decisions for oneself (instead of wrong decisions, easy decisions, selfish decisions) is how one builds character and honor.

- Never giving up means a leader is there for his or her subordinates. There to learn about them, care about them, train them, teach them to do their job well and to take pride in it, and there to give them responsibility. It's more than just telling a subordinate what to do; the reality is that most people don't listen to much of anything spoken. A leader must teach something by demonstrating it, giving others a chance to try, and then correcting them when they do it wrong. A leader must forgive mistakes and reward achievement. A leader must be willing to concern him or herself with a subordinate's personal life. A leader must be totally devoted, totally committed to his or her unit.

- As mentioned earlier, people join the military to do great things. The nation expects the military to do great things. The point of having leaders is that those leaders make sure great things are accomplished. And so leaders must never falter in preparing their subordinates to accomplish great things if called. But they must also never falter in preparing themselves, which means they must practice making decisions. They must put themselves into unexpected situations where they have to make tough decisions. That is unpleasant, especially when it's so much easier to develop tough training for others. While it's great if subordinates are trained excellently, if their leader never developed his or her ability to decide, communicate, and act in difficult situations, then the unit is compromised and the leader is useless.

- The burden of having to make right decisions, and the importance of never giving up the fight to be better, means that leaders become professionally competent. That means memorizing data about equipment. It means reading tactics and techniques for employing their subordinates, or accomplishing their likely tasks. It means seeking out knowledge about upcoming events and challenges to better prepare, seeking out better ways to prepare, and making sure the leader is ready him or herself for anything that might come up, which applies to physicality as well--leaders must be physically capable and healthy. It means embracing a warrior ethos.

- Being a good leader is a discipline. It is a process. Leaders interact with, mold and shape, teach and mentor, discipline and evaluate people. They accomplish their tasks with people. There is no rule for success with people, and every situation will likely be unpredictable and confused. If a leader has made a life and habit of making right decisions and never giving up on his or her development, that leader will be mentally and physically capable of addressing each new situation, whether it's a trivial problem of a subordinate or a mission that threatens annihilation of the unit, without becoming flustered, overwhelmed, or desperate. That is mental strength and physical toughness and when practiced it is courage.

Those are the pieces of the leadership puzzle.

Sunday, May 5, 2013

A critique of modern scientific thought

The theory of Anthropogenic Global Warming (AGW) has caused a great deal of discussion in the past 30 years. There have been books, and award-winning movies, and Nobel-winning personalities, and most significantly millions (if not billions) of dollars dedicated to educating the public about this theory, and to stopping the warming it predicts. I personally have participated in debates about it, mostly with my friends (on Facebook), and I have been surprised at how religiously belief in the theory is held. It is one of the defining issues of our epoch, equal to the subject of the Vietnam conflict in the 1970s and perhaps eclipsing our on-going Middle Eastern conflict today.

AGW is depressingly obscure, I've found. There are those who debate the meaning of the word theory, explaining that it means only an explanation of something rather than a law. A theory is not immutably true, unlike the laws that govern the conservation of energy in physics. Of course, I note that gravity is a theory--the best explanation of why objects interact with this large land mass we call earth (and why other masses in space interact the way they do)--which seems pretty immutable itself. Clouding the issue further is the fact that the theory of gravity has laws that apply within it, such as the law that an object in space within the gravitational field of the earth will fall towards it. So how much trust are we to place in this AGW theory? It appears that the answer lies in one's perception.

Then there are the details of AGW. I generally get the impression from all the rhetoric about "global warming" that pollution causes the world to heat up. That will apparently result in sea levels rising, perhaps several hundred (or thousand) feet. Digging a little deeper, it seems that the warming is supposed to occur because of carbon dioxide, a "greenhouse gas" that traps heat. Where it traps heat is confusing as well: is it in the atmosphere, or on the surface? The most scientific explanations focus on atmospheric warming, proposing that a global warming of the upper air will irrevocably trap heat on the surface, with a host of terrible consequences: mass species extinction, including the oxygen-producing algae in the ocean, and perhaps a catastrophic shut-down of the world's biological equilibrium. I've already noted the possibility of sea levels rising, which (given that most of the world's population lives on the coast) would be a very grave threat indeed. Deserts would spread, making agriculture impossible and engendering massive famine. Terrible stuff indeed.

I hear a lot about climate models, vastly complicated computer programs which seek to extrapolate a set of data into the future. I could misunderstand, of course, but it seems that many of the terrible consequences we can expect to face are themselves extrapolated from a single set of data, the expected temperatures determined by the climate models. And that is what causes me the most concern.
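The worry about extrapolation can be made concrete with a toy sketch. The numbers below are entirely invented for illustration (they are not real measurements, and this is not how actual climate models work): fit a straight line to a short series by ordinary least squares, then project it far past the data. The fit is easy; whether the projection means anything depends entirely on the system continuing to behave linearly.

```python
# Toy sketch with invented numbers: least-squares line fit and a
# long-range projection from a short data series.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

years = list(range(2000, 2011))
temps = [14.2, 14.3, 14.25, 14.4, 14.35, 14.5,   # invented values
         14.45, 14.6, 14.55, 14.7, 14.65]
m, b = fit_line(years, temps)

# The line happily "predicts" any year we like, ninety years out,
# whether or not the underlying system stays linear that long.
print(round(m * 2100 + b, 2))
```

Eleven data points yield a confident-looking number for the year 2100; the sketch says nothing about whether that number deserves confidence.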

A very passing and abstract understanding of Chaos Theory and a slightly more nuanced understanding of human experience in conflict has made me very suspicious of linear thinking, which is what those climate models appear to embody (in essence). Linear thinking establishes direct relationships between things, such as causes and effects. It works very well, too, in what scientists call a "closed system." We use it with great success in everyday life, when we travel places, cook food, conduct our daily work, and the like. After all, most of us know that if we leave our house for a familiar destination, the trip takes a well-known amount of time. For me, it takes 30 minutes, give or take, to get to work. If I set the stove dial to "8" instead of "Hi," my bacon cooks quickly without burning. If my daughter does not have much of a nap, she will have a hard time sleeping at night.

In fact, I would posit that linear thinking is essential to our lives. Almost anything we do which is complicated needs a linear explanation--perhaps in a checklist--that helps us achieve the task. Hunters learn and understand complicated details about their spears, bows, or firearms; parents develop complicated sets of procedures for their children, businesses develop strategic plans (not to mention floor procedures, sales protocols, and marketing campaigns), and individuals come up with life plans that may include college, a specific job, a relationship, and so on.

The success of this mindset, and the almost unconscious way in which we collectively apply it, tends to obscure the fact that such linear thinking is partially inadequate. But instinctively we know it. We know that an unexpected traffic jam, or a suddenly malfunctioning burner, or a child's unexpected whim, or a new product (or service), or the weather, or any number of other things can disrupt a linear procedure. We recognize it so easily that we have birthed uncounted idioms describing it: "that's life," "expect the unexpected," "Murphy's law," and others.

In fact, in my former profession, there was a great deal of debate about whether a battlefield could be treated linearly. That was, of course, the great dream of the American military starting in the 1960s: as weapons became more and more advanced, and more control was possible via computer systems and advanced radios, military thinkers began to wonder if the terrible uncertainty of war could be avoided. They imagined a great army, with all weapon systems and theaters coordinated and controlled from a central location. Armed (literally) with that dream, and with advanced Command and Control (C2) systems developed at ruinous taxpayer expense, all designed by extrapolating past experience into future conflicts, the American military strode confidently into Vietnam, then into the Persian Gulf, then into Afghanistan, and finally again into the Persian Gulf.

Of course, with the possible exception of Desert Storm (1991), history teaches us that our military confidence was misplaced. Vietnam became a bloody, protracted war with confused results, and our forays into Iraq and Afghanistan look little different. And yet how, with the most advanced weapons and control technology that humanity has ever developed, did we end up with such debatable success?

One proposed answer is non-linear thinking. Called Chaos Theory in some disciplines and "Complex System Dynamics" in others, the short story is that our world is inherently unpredictable. It does not behave according to cause and effect, or set rules. It is subject to "emergent factors," which is a verbally precise way to say that new, unexpected things occur. That accident on the way to work, or the new product that destroys a marketing plan, or the new behavior of a child or an entity. Something that is totally unexpected.
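A minimal illustration of this unpredictability, using nothing beyond plain Python, is the logistic map, a textbook chaotic system (my own sketch, not anything from the climate literature): two trajectories whose starting points differ by one part in a million agree at first, then diverge until they bear no resemblance to each other, so no extrapolation from the early steps predicts the later ones.

```python
# The logistic map x -> r*x*(1 - x), a standard example of a chaotic,
# non-linear system. Near r = 3.9 it is extremely sensitive to its
# starting point.

def logistic(x, r=3.9):
    """One step of the logistic map."""
    return r * x * (1.0 - x)

def trajectory(x0, steps):
    """Iterate the map `steps` times, returning the whole history."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

a = trajectory(0.200000, 50)
b = trajectory(0.200001, 50)  # initial difference: one part in a million

print(abs(a[5] - b[5]))    # early difference: still tiny
print(abs(a[50] - b[50]))  # late difference: no longer small in general
```

The rule itself is a one-line cause-and-effect formula, fully deterministic; the unpredictability emerges from iterating it, which is exactly what makes long-range extrapolation of complex systems a guess.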

Let me take three examples. The first is falling in love. A great many people fall in love with someone unexpected, for an unexpected reason. Perhaps they knew the person before, and weren't romantically or sexually interested, then something occurred that changed their perception. Perhaps they were surprised by a new person they met. Either way, the encounter and the complex emotions that followed--joy, care, desire, excitement, need, contentment--were unexpected. They were emergent. Though we could try to explain it as cause and effect ("I was always attracted to blondes," or "It happened when I stopped looking"), those causes are not, in fact, causes. They are woefully inadequate causes. If it was blonde hair, or the fact that a person has stopped "looking for love," then what about all the other blondes, or all the other people one meets when they stop looking? Even trying to articulate it aloud is beyond the capacity of our language, and most people in love finally resort to phrases like, "it was just different," or "I just knew." They are recognizing that in their love, there was something new. New about them, new about the other person, new about their life, perceptions, and perspective, literally new about the world.

This example also tells us a lot about our relationship to non-linear thinking. We humans seek love inescapably, if we are to believe the evidence of adolescent behavior, the enduring institution of marriage (whatever its relevance now), the preponderance of our art and media, and the time-honored tradition of matchmaking (now updated to websites like eHarmony and Match.com). In fact, we don't collectively consider love authentic unless it is non-linear. We are contemptuous of arranged marriages, for example. We expect love to be exciting and unscripted. Spontaneous. There is a deep need for and understanding of dynamic, unpredictable relationships that is at the core of who we are and how we relate.

The second example lies in the twentieth-century conflicts already described. Linear, cause-and-effect thinking taught that a disciplined, advanced military such as our own would protect the Republic of Viet Nam (RVN, or South Vietnam). That proved inadequate because the Soviets armed and trained the North Vietnamese Army (NVA) to a much greater extent. That was an emergent event. So we thought up a linear pretext to accomplish our goal of supporting the allied RVN--we sent our own disciplined, advanced military. Unfortunately, the NVA changed tactics. They allied with the Viet Cong guerrillas and began avoiding open conflict. Even so, they were defeated in every major military engagement, but what Americans did not suspect was that such defeats, which crippled their ability to fight, in fact advanced their cause. They were behaving unexpectedly. They didn't attempt to beat the American military on the battlefield, they attempted to make America as a whole tired and ashamed of the conflict. That was an emergent behavior to which the Americans couldn't adapt, and it dynamically interrelated with other emergent qualities such as the "counterculture" social movement occurring in American universities, the increasing prevalence and social acceptance of drugs, and the increased media access to the world which was provided by Americans themselves, through embedded TV reporters. The true relationship and origin of all these events is (I argue) too complex to comprehend, which is why it is non-linear. But their unexpected, frustrating effect is well-documented in history.

But those first two examples deal with human phenomena. What about "natural" phenomena? The third example of dynamism and emergent behavior is evolution. The theory of evolution has long been lauded as a rigorously scientific perspective. Because it stands at odds with the biblical story of the world's beginning, many rationalists have used it to debunk Christianity (and in a broader context, all religion), despite the fact that many scientists who have contributed to the theory were practicing religious men and women. And there is a nice, apparently linear path from single-celled organisms in vast primordial seas to breathtaking biological complexity in the form of mammals and reptiles (including humans and dinosaurs). Charles Darwin, the scientist who first proposed this theory in On the Origin of Species, explained simply that evolution occurred as a result of "natural selection," positing that organisms best suited for their environment survived, while those more poorly suited were eventually killed off through competition (or by the environment itself).

But "natural selection" is an explanation with many facets. It has been reduced to "survival of the fittest," where evolution occurs to cope with changing environments and the species who are less capable of survival and procreation become extinct. Darwin himself, however, became a household name in the Western world due to his idea of "sexual selection," claiming that sexual desirability was responsible for evolution (a titillating idea, especially in Victorian England). In fact, Darwin's work seems to focus on sexual selection, making me wonder as I read it whether or not he departed a bit from the path of rigorous scientific research and began publishing explanations that continued to draw more attention and publicity. Yet no matter how we choose to define "natural selection," the troubling fact remains that we don't really know how it happened, or why it happened. We can explain that this species became extinct, while that species evolved. But excepting a few instances of evolution or extinction we were collectively fortunate enough to observe (such as antibiotic-resistant bacteria), the natural mechanism of evolution is pure speculation. We cannot explain coherently the cause and effect of it all, we can only guess.

For example, if evolution was driven by the need to survive, why then have traits evolved in species that have no apparent effect on survival whatsoever? Evolution certainly caused humans to have different eye colors, but it's unclear as to how that was "naturally selected." And why is the absence of a tail (when the tailbone is present) more efficacious to survival? Why have some species become extinct, while others survived? It is not satisfactory to say that somehow such traits must have aided survival, because if we can't explain something then we have no right to believe it (else we make science the same as religion). Sharks and crocodiles, organisms that have survived the dinosaurs, the ice age, and untold other environmental changes--not to mention the evolution of creatures that share their environments--make a mockery of evolution as a response to "natural selection." And "sexual selection" makes no more sense, because the mating patterns of bygone creatures are forever a mystery, absent time travel. We observe that sexual behavior tends to "breed out" weakness within a species, but it certainly doesn't explain the extinction or development of various species.

Further reinforcing the non-linear characteristics of evolution are the philosophical implications it has inspired. Evolution is random and follows no single discernible pattern; therefore we humans are an accident (with all our art and science and other achievement as well). And while that is a wonderful overarching expression of the unknowability of this great process of biology, geology, and atmospherics that has been the story of this planet's life, it points inescapably at dynamic, emergent behavior. Literally every evolutionary step has been emergent, something new, whether it was the asteroid that supposedly began the extinction of the dinosaurs or the increasing brain size that characterized the transition from ancient apes to modern humans. Evolution may in fact be the most confidently non-linear perspective in the modern world, and evolutionary biologists have by and large ceased offering conventional cause-and-effect linear explanations for the developments they discover; instead they focus on explaining the apparent facts, which in detail continue to be frustratingly obscure. For example, evidence suggests that Neanderthals may have used speech and tools, and probably interbred with both Cro-Magnons and Anatomically Modern Humans (AMHs). Were they then "bred out," or did they become extinct through some other evolutionary mechanism, such as persecution and genocide at the hands of more advanced evolutionary cousins (which would itself be an emergent event)? It's not even clear whether they were more or less intelligent, since they appear to have had more voluminous brains than AMHs, which is a crude indicator of intelligence in organisms.

For some reason, it appears that science has given humans the illusion that there is a finite amount of information in the world, and once all information is known--once science and research have plumbed the depths of all mysteries and revealed all--then there will be no more surprises. That attitude is most concretely seen in the repeated, futile attempts by militaries in the last half-century to bring all aspects of the battlefield under control. But with people and with the world, experience teaches that emergent behaviors occur, without precedent and unpredictable by any cause-and-effect extrapolation. And any attempt to neatly package emergent behavior with a linear explanation is pure speculation. No one will ever know why the North Vietnamese martyred themselves militarily, or why such martyrdom, if carried on long enough, would result in American war fatigue. Certainly the Americans, who ought to have known best, did not predict it; while it might be fashionable to say that Ho Chi Minh and Giap were smarter than Americans, the fact is that their emergent tactic itself occurred to them as a result of unexpected effects and opportunities. Likewise, no one will ever know what happened to Neanderthals. And a guess, even one made by a scientist, is still a guess.

Eastern thought deals with this reality much better than our contemporary Western thought. Since the Renaissance, Westerners have undergone a half millennium of constant progress and living improvement. We have mastered agriculture, distance travel, flight, and medicine. To a certain extent we have even mastered weather--hurricanes no longer slam against ships and shores with a mere 36 hours' warning; our satellites allow us to evacuate days before landfall. But for all this mastery, we can't predict. Eastern disciplines such as Buddhism or the way of Lao Tzu take what is to Western minds a curiously fatalistic approach to life, but I argue that there is wisdom in recognizing one's inability to control one's surroundings. The Marine Corps General James Mattis recognized how little he could control a battlefield, despite commanding whole divisions, because of the violent and highly dynamic environment. He took the radio handle "Chaos" to illustrate that he did not seek to control the battlefield but rather to thrive in the unpredictable environment. That is a military tenet perhaps first articulated by Sun Tzu, an eastern thinker.

And speaking of weather, our weather "predictions" are merely speculations based on observed data. The path of a hurricane is projected, and large swathes of coastline are put on alert. Why? Because we simply don't know where it's going. Half the time a hurricane deviates by hundreds of miles from its projected path. Other weather developments are guesses at what might happen over, say, Chicago when system A intersects system B--never minding that weather systems, like hurricanes, are projected into the future with poor accuracy. And the results of weather systems which intersect are unpredictable, too. These systems are emergent, dynamic, and probably respond to variables that are as yet uncomprehended, such as land use in cities (which tend to be warmer than the surrounding countryside).

All of this calls the predicted outcomes of global warming into serious question. While empirical data over the last 200 years has clearly shown a warming trend, and glaciers melting, and growing holes in the ozone layer, the effect of such facts is unpredictable. It is essentially dynamic and emergent. The "El Nino" phenomenon was hailed as a manifestation of the consequences of global warming, but evidence suggests that it has been occurring at two to seven year intervals for 300 years, and perhaps even further back. So we can't be sure if the extreme weather caused by El Nino is due to AGW or not. In fact, while NOAA has identified an increase in the number of anomalous temperatures in weather and ocean systems, nobody is sure whether that's a new development or not. And the fact that within the broad warming trend of the last 200 years there existed a 30-year cooling trend from 1940-1970 clouds the issue even further.

Because "the environment" is such a complex system, with emergent, non-linear, dynamic developments, I think that all the trouble and fuss about predicting climate change is a mistake. The simple fact is that we can't predict anything--we can only guess. Perhaps a guess or two will be correct, but that will itself be an emergent effect of the whole. Besides, the use of terrifying predictions to focus attention on AGW strikes me as manipulative, a way for AGW apologists and researchers to increase their support, especially financial support. I have no illusions: scientists, like everyone else, are susceptible to stretching the truth to get their way. Bankers, businessmen, and priests have done it for years. Ultimately, I think our resources are better spent learning to thrive in this emergent environment, which starts by understanding it. Computer models apply linear thought to a non-linear system, which makes them nearly useless. Rigorous research aimed at knowing rather than predicting is much more helpful.

It is perhaps tempting to think that if our world is so dynamic and emergent, then what use is there for linear, scientific thought at all? What can we possibly do to make a difference if we have no way of knowing or predicting the effects of our actions? The fact is that we live in relation to this world, and we always have. Native Americans burned forests and fields to flush game and make the land more congenial to them. We have farmed for thousands of years. We have learned to thrive by adapting our environment to our advantage. This doesn't apply just to the natural world, either--businesses do it in the marketplace, governments do it in political spheres, and we individuals do it in every aspect of our lives. It is a survival mechanism. And if our environment is changing now, I think it's a good bet that we have something to do with it--but simply reversing the processes is unlikely to reverse the effects we've seen to date. The world will continue to evolve, dynamically. That is why I think it so foolish to believe we can control the "environment" to such a degree that...what? What is the desired solution to AGW? Make the world as it was in 1930? 1830? Does anybody really know when the world was healthier? What about the Jurassic period (roughly 200 to 145 million years ago), when there was more carbon dioxide in the air (and, by some estimates, more oxygen as well), not to mention warmer temperatures?

We should "pick up" after ourselves, of course. We should not destroy if we can help it. Demanding greater energy efficiency is virtuous, and it will certainly mitigate the buildup of carbon dioxide in the atmosphere, not to mention smog like the deleterious haze that hung over Los Angeles in the 1970s and 1980s, which we successfully cleaned up, and which exists today in developing cities like Shanghai. Finding better ways to use land than mass deforestation and urban development might slow the warming effect, since scientists point to land use as a major factor in the present warming trend. Of course, that entails a behavioral change, as the population of the world by and large continues to concentrate in cities. And contamination of water and land with industrial by-products--hormonal, radioactive, and corrosive--is still a major threat, and ought to be combated to the maximum extent possible. But whatever steps we take, we should be mindful that they will birth their own emergent effects, and will almost certainly not have the effects we expect (or not entirely).

Keeping that in mind, we should be careful not to impose restrictions on developing societies that struggle daily with poverty and do not have the luxury of guilt over a theory of projected environmental behavior. The science behind AGW does not account for the human cost of change, except where it predicts catastrophic results for humanity. That fact is the most suspicious of all.

To thrive in this world, as we have done so far, we must remember that science does not tell us what to do; rather, it tells us what is. That information may help us discern what to do, but there is no blueprint. The climate is certainly changing, and the reasons for that change are probably much more complex than industrialization and land use. After all, the earth has already passed through three atmospheres and many geologic periods, and it will likely pass through more as its evolution continues. How that evolution will be affected by warming, carbon dioxide, or anything else attributed to us is unknowable.

And our evolutionary business is to remain, as the sharks and the crocodiles have. We must learn to thrive.