Tuesday, October 25, 2011

EVERYONE A LEADER OR NO ONE IS

James R. Fisher, Jr., Ph.D.
© October 25, 2011

The paradoxical dilemma of our times is that we cannot lead and we do not want to follow.  Why is it most people don’t like to be led?  They want to believe they are in charge.  Then, why don’t most people lead?  Because they simply don’t know how to lead.  And so, most American organizations have a management system and call it leadership.

James R. Fisher, Jr., Work Without Managers: A View from the Trenches, 1990, p. 276.



For the past score of years I have been writing about what I call corpocracy.  I called it the American disease in Work Without Managers. 

We had reached, in my view, corporate excess by 1990, when I retired for the second time from the corporate world and decided to write about it. 

Work Without Managers became that book.  I had no idea at the time it would prove so prophetic, as many reviewers saw it as an angry book by someone a little half-cocked. 

Instead of challenging that mindset, I proceeded to write The Worker Alone, Going Against the Grain (1995), Six Silent Killers: Management’s Greatest Challenge (1997), Corporate Sin: Leaderless Leadership and Dissonant Workers (2000), and A Look Back To See Ahead (2007).

My wife, BB, claims I keep beating an already dead horse to death, and I guess she has a point.  I’m an introvert who prefers to write out my angst rather than proselytize a sleepy constituency that for the past twenty years has allowed corpocracy to feudalize its world without complaint.  That is until the bubble burst in 2008 with the real estate meltdown compounded by the realization that Wall Street was playing monopoly with real money.

CORPOCRACY


Corporate society, I determined twenty years ago, was diseased.  In Work I described that disease as a management insensitive to its employees, supportive of company politics at the expense of productivity, secretive as a measure of communication, ritualistic as a gag on productivity, given to continual meetings as an end in themselves, unaware of its internal focus until markets disappear, prone to short-term planning, advocating a rhetoric supportive of initiative until it goes in an unexpected direction, and increasingly isolated – business as usual – with a covert hostility to anyone who threatens its infallible authority, no matter how muddleheaded that authority might be.

Corporate society has gotten away with this for one hundred years, but now people are hurting as they did on the eve of the Great Depression in 1929, when the privileged few were living high on the hog while most others were unable to make ends meet from paycheck to paycheck, if they in fact had a job. 

Corpocracy, it might be said, has created a modern feudalistic state; unfortunately, the state is now a global entity.

This brings me to what is happening on Wall Street with people camping out there in protest with a smorgasbord of complaints, a situation that has spread throughout the United States and to cities around the globe. 

I don’t know what to make of it, but I hear smatterings of comments that sound consistent with what I have attempted to say over the past several years: the arrogance and hubris of corpocracy, the decline of the middle working class, and the idea that we all have to be college graduates, with degrees that often add no value-added skills, making our employability difficult if not impossible. 

I am not opposed to liberal arts, but most of that you can get by simply reading books, going to museums, watching programs on PBS, traveling and having stimulating friends from other cultures. 

The irony and paradox is that the skill base we traditionally associate with workers is not only being outsourced abroad but disappearing in the wake of this new electronic age. 

Leaderless leadership, which I have often written about unsympathetically, might actually be the answer.  Perhaps leaderless leadership is the prescription for the future as we negotiate the difficult terrain going forward.  Perhaps my little phrase, repeatedly expressed in my missives, is finally blooming into something real: Everyone is a leader or no one is.  Perhaps all of those camped out on Wall Street are leaders.

Let us believe that this movement, and it is increasingly looking to be just that, is a peaceful one and not a bloody revolution.  Another irony is that the systemic nature of our problems is so deep and pervasive that even those at the top, the so-called “1 percent,” have no corrective mechanism to abrogate or reverse the situation even if they have a passionate interest in doing so.  I fear revolution is in the air, and I have no idea where it will go.  I do know revolution also hurts those already suffering the most.

*     *     *

Tuesday, October 18, 2011

HOPE IS NOT A METHOD, BELIEF A DESTINATION, BUT LOVE THE PALLIATIVE TO ALL

James R. Fisher, Jr., Ph.D.
© October 18, 2011

Eight months ago my daughter Jeannie was removed from life by a hit-and-run driver.  Yesterday, we found out my daughter Jennifer was hit by the diagnosis that she has grade 3 breast cancer.  She is only 36, her career soaring even in these difficult times, and in a solid supportive relationship, yet the cold words on the report were as impersonal as a weather report.  Glaring on the page as if written in red were the words “carcinoma” and “grade 3,” the most aggressive form of this cancer, buried in multi-syllable jargon.

You wonder why the agents of the body that work so hard in our defense, quietly doing their thing from moment to moment, suddenly become perverse and work against our nature.  The disease and the impersonal words seem an act of collusion.  Anger mixed with repressed tears tells you this is a moment to be strong, no longer a time to hide your weakness.  It is not all about you.  It is about your daughter whom you love more than life itself.

Our Jennifer is stoic, and has faced many challenges in her young life with resolve and will.  It is as if she has been in training to combat one adversary after another, and as Kafka says in The Trial, “she has done nothing wrong.”  But it is not a matter of ethics.

Neither is it a time to get lost in remorse and the palaver of “this is not fair,” nor in hope that it may reverse the situation, when hope is not a method.  Jennifer will continue her MRIs, PET scans and biopsies, doing what a regiment of MDs in this diagnostic clinic are trained to advise her to do, but alas, belief in them is not a destination.  She is on a journey familiar to a multitude at various stages of the same situation, and she is not alone.

What will realign the sentinels in her body like the soldiers they were meant to be, abandoning their mutiny, will not be limited to the possible necessity of surgery, the possible regimen of chemotherapy, the constant visits to the clinic to update the status of her interior lymphatic battlefield, or even her will and resignation to get better and put all this behind her.  Love is the antidote, the ultimate medicine that emanates from all sides, bombarding her cardiovascular and autonomic nervous system, her physical world, with spiritual components as indefinable as the soul. 

It feels as if God has given us a second chance with this hit-and-run driver, this perpetrator as much a mystery as the one that felled Jeannie.  Love is the cure, the medicine that will corral these renegade forces because it is the inexhaustible fuel of the will to live.

*     *     *

Monday, October 17, 2011

IT'S A MATTER OF PRINCIPLE

James R. Fisher, Jr., Ph.D.
© October 17, 2011

It will come as no surprise to many of you that when I write one of my long missives I am soon greeted with an avalanche of comments, criticisms and conversations. 

At one point in my life, all of these were answered, but now, with over 300 people in my everyday email address book, that is not possible.  Then there is the fact that I am getting long in the tooth.  I do respect all the comments and relish reading them. 

It always surprises me when people read these long missives and still have the energy to give feedback.  As I have repeated, I am not looking for converts to my point of view.  My interest is in telling stories that may have some relevance to the times.  Such is the case with "Profiles of the Leader-as-Artist."

Rereading Proving You're Qualified: Strategies for Competent People without College Degrees (1995) and The Rapture of Maturity: A Legacy of Lifelong Learning (2004) by Charles D. Hayes, I thought of these profiles.  Retrieving the missive from my archives, I put it on my website (http://www.theperipateticphilosopher.com/) and blog, then sent it to you.

These books are all about self-university and the autodidact, or the self-learner.  The men profiled in my missive were all students of this philosophy, even if some have been gone for five centuries.  Hayes has a remarkable understanding of the temper of our times.  Give his work a look.  You won't regret it.

Another factor that found me going back to my original 2004 missive was the death of Steven Jobs, the quintessential self-learner.  Then I realized all those profiled fit that mode.

Invitations come to me on a regular basis to submit a paper for peer review and possible inclusion in scholarly journals; specialists, I suspect, are the only ones who read these instruments.  I don’t consider myself a specialist.

In any case, MODERN ECONOMY is one such journal.  It would like to consider my work for the Journal of Service Science and Management or the Journal of Intelligent Learning Systems and Applications.  Perhaps these profiles would fit.  Stay tuned.

After publishing in trade journals and popular periodicals for years, including The Reader's Digest, I decided to limit my missives to emails and my blog.  Some time ago a journal published one of my missives with my byline but skipped the step of asking permission.  This offended me.  It was a matter of principle, which, incidentally, is the reason for this brief missive.

BB and I watched A Man For All Seasons (1966 film) last Saturday, starring Paul Scofield as Sir Thomas More.  I have written about Sir Thomas as well as his best friend, the Rotterdam priest and Catholic scholar, Desiderius Erasmus.  Sir Thomas wrote Utopia (1516), his idea of an ideal society, and Erasmus wrote The Praise of Folly (1511), his take on the vanities of men of his time, and the limitations of convention on human experience, problems we still face today.

The two men illustrate aspects of what I call "a matter of principle." 

Principles with Sir Thomas were palpable, less so with Erasmus.  Erasmus hid his behind clever words, publishing his book surreptitiously and anonymously.  This brings me to playwright Robert Bolt's "A Man For All Seasons."

Sir Thomas More was an English statesman and friend of King Henry VIII, who wanted to divorce his wife and marry Anne Boleyn.  More’s refusal to accept the divorce and the king’s remarriage, or to acknowledge the king as head of the Church of England, cost him his head. 

His allegiance, on principle, was to the Vatican, the Papacy, and the Roman Catholic Church.  He could not forsake his conscience, his God or his church.  Remember this was the time of the Protestant Reformation in which all of Europe was on the brink of a radical paradigm shift, again somewhat like our time.

Once the film was over, I turned to BB and said, "Honey, what is really sad about this film is that Taylor, Ryan and Rachel would have no idea what all the fuss was about."  Ryan, 17, Rachel, 15, and Taylor, 19 are our grandchildren.

She looked at me and nodded.  "No, it wouldn't make any sense at all to them."

Indeed, Sir Thomas More sacrificed life, limb and family for principle, an idea that has lost its energy today.

*     *     *

Sunday, October 16, 2011

PROFILES of the LEADER-as-ARTIST: Three Common Threads to This Leadership


James R. Fisher, Jr., Ph.D.
© October 16, 2011

(This is the complete original manuscript, published and © March 27, 2004)

PROFILES of the LEADER-as-ARTIST:

THREE COMMON THREADS TO THIS LEADERSHIP

ABSTRACT

There are three common threads to leadership, which are particularly in evidence with the Leader-as-Artist: the ability to see the situation clearly (vision) with the courage to glimpse over the horizon; the aptitude to embrace the unknown but knowable (creativity) to guide people to where they need to go, but don’t yet know it; and the capacity to serve the people as they are and understand them better than they do themselves (service).  What follows is the rationale and profiles that illustrate the Leader-as-Artist.

There is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things.

Machiavelli


The highest achievement possible to a man is the full consciousness of his own feelings and thoughts, for this gives him the means of knowing intimately the hearts of others.

Shakespeare


PREAMBLE


Readers of my works know leadership is a central theme, that I see an absence of leadership, or leaderless leadership with a dearth of vision, creativity, and service.  In the absence of leadership, people seldom end up where they expect to go.

Leadership is not about the leader’s preference, but the people’s.  With people lockstepping to convention, lost in collapsing confusion, the times call for the leader as complete follower, or Leader-as-Artist.  Then he is the people and in touch with their essence.  Like the artist who cannot paint the tree until he becomes the tree, until subject and object become one, the leader must first become the people before he can lead them wisely.  This is a function of the artist and the Leader-as-Artist becoming one and the same.

It is folly for a leader to believe he can lead while unfamiliar with the culture, or steer clear of the void without a stiff learning curve.  Leadership is not a game of hubris, but of creative engagement, empathetic understanding, and persistent resilience.

Leaders are not characters out of fictitious dreams, not rock stars, but often hard to know, as admirers treat them as legends while critics treat them as caricatures.  This sets them apart when they are like us, only writ large. 

Leaders are architects of paradigm shifts from the ordinary to the astonishing.  They speak to us through the prisms of our ways, which can color our perceptions of their actions.  Often they are self-assertive, detached, and intense about what they are, finding it difficult to relate to us.  This is the Leader-as-Artist’s greatest challenge. 

Leadership is the self-university of the autodidact, the self-taught listener and learner, as opposed to the knower and teller, the apt student of creative thinking.  Lincoln had little more than a year of formal education, Edison three months, Ford six years, Jobs and Gates dropped out of college, and Ray Kroc didn’t finish high school.

These men could see clearly what was there and imagine what could be.  Different yet alike, they shared eccentric temperaments, were known to be moody, of shifting focus, plunging into daydreams or melancholy, then coming out with new clarity.  They poorly suffered fools who didn’t get it, giving them a Machiavellian luster. 

Dreamers with long stretches of idle thought, they read, brood, assess and scheme to will their minds to some form of action.  The Leader-as-Artist optimizes what is available to change it into what is not, seldom inventing anything new, but harnessing what is there into something that touches the mind of the times. 

Martin Luther changed Christianity from Church- and Rome-centered to God- and man-centered.  Peter the Great pitchforked backward Russia into modernity.  Mahatma Gandhi took on the autocracy of empire to establish democratic India.  Thomas Edison turned electricity into new utilitarian products.  Henry Ford stepped over 500 car companies to produce an affordable automobile, and paid workers enough to afford the purchase.  General Douglas MacArthur guided the defeated Japanese into a democratic republic.  Ray Kroc used the common hamburger to create a fast food international empire.  Steven Jobs essentially stole Xerox’s Alto personal computer in broad daylight to seed a computer revolution.  Bill Gates bought Q-DOS (known as the “quick & dirty operating system”), a struggling disk operating system, for $50,000, turning the package into MSDOS and launching Microsoft into a multi-billion-dollar enterprise.

These Leaders-as-Artists created out of what was already there.

With these leaders, nature is writ simply.  Martin Luther saw salvation through faith, alone; Peter the Great saw a way out of Russian feudalism; Gandhi saw nonviolent civil disobedience as a way to defeat a military behemoth; Edison saw science as the route to new technology; Ford saw mass production as the key to controlling costs; MacArthur saw culture as the route to change; Kroc saw a simple uniform menu as irresistible to busy people; Jobs saw how toys could become tools; and Gates saw a paradigm shift from hardware to software.

Lincoln epitomized the Leader-as-Artist.  The rail-splitter, the honest broker, the shrewd country lawyer, the cracker-barrel philosopher, the humorist, and the statesman saved the Union and ended slavery because he was the consummate leader.  Yet, it would be wrong to see him as modest.  Great artists and leaders are seldom modest.  Intellectual arrogance and unconscious superiority distinguish them from others who would have them follow a misguided course.  Warriors and risk takers, they attack work as if it were war, accepting collateral damage if necessary in pursuit of a vision.

Praise these Leaders-as-Artists if you must, but if they had not stepped into the void surely others would have.  Lincoln confessed that he did not control events, but that events controlled him, as they do all leaders. 

It is when leaders ignore events, forcing them to comply with preternatural notions, that nature becomes perverted and chaos and calamity ensue.  We see this in leaders-as-celebrities, in intellectual, religious and political leaders who would make their obsessions ours.  This is our all too human side, but it is not leadership.

Nor does the Leader-as-Artist search for excellence; he designs it out of the distillate of his vision, creativity and desire to serve.  Critical thinking, or what is already known, is limiting, whereas the Leader-as-Artist is comfortable in creative thinking, or what is not known but can be found out.

Leadership is not management, which is the control of things.  Leadership enables people to express their will.  People are led; things are managed. 

Management uses critical thinking (e.g., MBA case studies) to focus on what has succeeded, whereas leadership uses creative thinking to focus on the achievable.

We know much about management but little about leadership.  So, we perfect management and call it leadership.  Many CEOs in the Fortune 500 have engineering backgrounds with the inclination to process information, not filter it, to value logic over intuition, to settle on cosmetic rather than real change, to trust linear logic over non-linear thinking, to rely on historical precedence at the expense of the new.

Vision is to see beyond the expected; creativity to design a way forward; and service to turn what is available into what is needed. 

In this explanatory age the emphasis is on analysis and results with little attention to chronic disturbances and their causes.  Problems snowball but we can’t seem to break the pattern.  Panic is in the air, as there has been more change in the last fifty years than in the previous three hundred.

People will follow an idea if led.  A leader as the quintessential follower can calibrate readiness and gauge collective energy to move them forward.  He doesn’t force his idea on them to overcome forward inertia, which is like having the foot on the accelerator and brake at once, burning up rubber and going nowhere, but lets natural interior momentum change the status quo. 

A leader’s vision is not a fait accompli, but an interactive system within a designed framework, an exploratory journey into the unknown where doubt and failure keep the leader on task.  He is his vision in action. 

A leader’s creativity is a process of conceptual integrity, but not a perfect design as he thinks and feels his way along.  This can be frustrating to followers who crave the concrete.  The Leader-as-Artist is often reluctant to share his speculations as the best-wrought plans can go awry.  Creativity is like a puzzle that cannot be fully understood until all the pieces are in place, and then the final design might come as a surprise. 

A leader’s capacity to serve is based on his ability to deal with changing needs.  Long before people sense change, it permeates everything.  We see people caught flatfooted, their livelihood swept away, unable to make a living.  This requires the Leader-as-Artist to create a gospel of work to encourage workers to embrace their limitations and work through them by developing new skills, not an easy task.

The Leader-as-Artist is not necessarily more able, better educated, or better connected.  He is the conscience of his time.  Forces collide within that drive him beyond himself, causing him to take note and act, forces that remain dormant in others.  He personifies our history, stepping out of the lethargy into a new landscape with many horizons.  Here is an eclectic representation of some Leaders-as-Artists familiar to you.


Martin Luther (1483 – 1546)

The German religious reformer and founder of the Protestant Reformation was a skeptic who hated skepticism.  He wrestled with the desire for faith, alone, against doubt and fear.  He appeared on the cusp of modernity paying special homage to the Christian struggle between God and man, and thus stumbled into prominence as the conscience of the West.

The goodness of his efforts seemed overmatched by the calamitous effects his works had on religion and society.  His obsession with scriptural text dissolved something of its sacredness.  He also shut the door on his Catholic experience.  Catholicism had been able to hold the simple and the sophisticated, the shallow and the profound, the ignorant and the educated in an enduring communion.  He changed the calculus of the Western world.

An ordinary man from peasant stock, his simple desire was for the orderly universe promised by Aristotle.  In the early sixteenth century, amidst the chaos, he sought escape by joining an Augustinian monastery, where he was ordained a priest in 1507.  By nature moody and taciturn, he preferred scholarship to dealing with people.  Easy to anger, he found teaching Aristotle’s ethics unsettling.  The experience filled him with rage against the philosopher.  He then transferred this contempt for Aristotle to the philosophy of Thomas Aquinas.  His conflict was with reason itself when reason contradicted Christian ethics, especially with Aristotle’s impersonal and abstract God, the God adopted by Aquinas.  He was in a dilemma, a rebel looking for a cause.

Everything changed for the young priest in 1510 when he was assigned to a mission in Rome.  This introduced him to the corrupt papacy of Leo X, Giovanni de Medici.  Pope Leo was a hedonist who loved banquets, gambling and lovely ladies at court.  He spent lavishly on art, transforming the Vatican and the new basilica of St. Peter’s into works of art.  With the Church appallingly in debt and in need of a new revenue source, Pope Leo X issued a decree authorizing the selling of indulgences.

The granting of indulgences gave the sinner full remission of all sins, opening the gates of heaven to the sinner upon death without first going to Purgatory.

Luther’s indignation at this shameless practice became irrepressible.  On his return to Wittenberg in 1512, he began to preach the doctrine of faith, alone, rather than through good works.  Then on October 31, 1517, he drew up a list of 95 theses on indulgences denying the pope’s infallible right to forgive sin, and nailed them to the church door.

Pope Leo took little notice of this renegade priest until the Vatican coffers diminished sharply in gold receipts.  In 1518, he summoned Luther to Rome.  Luther refused to go.  Meanwhile, word was spreading across the land of this belligerent monk gaining converts to his cause.  Violence followed.  The pope was losing control of his flock.

Emperor Charles V was in agreement with the pope, organizing the Diet of Worms in 1521 to consider Luther’s excommunication from the church.  Luther agreed to attend the conference on the assurance of safe passage.  There he was asked to retract his theological teachings.  After a day of meditation, he returned to the Diet and said, “Here I stand.  I cannot do otherwise.  God help me. Amen.” 

Declared a heretic, Luther was banned from the Holy Roman Empire.  On his return from the Diet of Worms, for his own protection, supporters persuaded him to lodge at Wartburg rather than Wittenberg.  There he remained in secret for a year translating the Bible into German, returning to Wittenberg in 1522.

Luther published damning pamphlets rebuking elements for and against his cause, showing equal contempt for giants of the time such as the Dutch scholar and humanist Desiderius Erasmus of Rotterdam, and the King of England Henry VIII.  His passionate drive to save Christianity from bondage to Rome would consume the rest of his life.

Blind luck interspersed with epiphany seems common to Leaders-as-Artists, as if they are the right person at the right time with the right DNA to grasp the chance of the moment.

Luther opened the way to religious experience with zest, spontaneity and individualism.  It was the new face of society, which was preferred to elaborate rules and regulations, corrupt practices and punishing biases.  People were encouraged to look into their hearts for God’s grace, and not through works with church authorities as intercessors.

The Protestant Reformation transformed Europe with many different interpretations of its message including that of Calvin, Henry VIII, Wesley, Fox, Socinus, Williams, Eddy, Smith, Kierkegaard, Nietzsche, Barth, Bultmann, Tillich, Niebuhr, Marcel, and Buber.  The cultural shift had a momentum of its own.

Luther’s act of denial at the Diet of Worms of the efficacy of the church over the German soul, along with his creation of the German Bible, crushed economic feudalism, and set in motion a psychological shift from a God-centered to a man-centered society.  Voltaire declared that Western capitalism was a byproduct of Luther’s Reformation.  This was further enhanced with the theology of Calvin and his doctrine of predestination.  Those destined to enjoy paradise (“the Elect”) could be identified by worldly success.


Peter the Great (1672 – 1725)

The Russian tsar’s reign was controversial, for he had a weakness for drink, an inclination to vulgarity and bombast, and contempt for political and religious ceremony.  His approach, to say the least, was unconventional.

Peter trusted only what he could see for himself.  He was given to wandering in disguise among his people to assess their collective mind, manner and morale.  Their passivity saddened him as they appeared shackled to a primitive system.  He resolved to break these shackles and lead his people to greatness.

Military hegemony appeared the route to international recognition.  For six years, he pursued a quest for a mighty army often wandering among the troops as a common soldier.  When the army moved against the Turks in 1695, he disguised himself as a humble bombardier.  Russia lost that war.

Undaunted, Peter embarked on a European tour (1697 – 1698) traveling partly anonymously to form a grand alliance with Europe against Turkey, while acquiring Western technology necessary to modernize Russia’s armed forces.  He spent a total of sixteen months in Germany, Holland, England and Austria, and even worked as a ship’s carpenter in Holland and England.

In the course of this sojourn, he amassed knowledge of Western ways, technology and manners, but was especially enamored of Germany’s expertise.  To drive Russia into the modern world he hired thousands of European craftsmen, artisans and military experts who returned to Russia with him.  He also launched wide social and domestic reforms.

Known for his callous ways, he characteristically offended the sensibilities of his countrymen by his vigorous espousal of Western superiority, then added insult to injury by insisting that men at court shave off their beards and dress like Germans.  The construction of new homes and public buildings followed Western architectural style. 

In 1703, he set about the construction of his dream port city of St. Petersburg to rival the grandest cities of Europe, and designated it the capital of the Russian Empire.  As remarkable as he was difficult, his reign was a cultural revolution of the first magnitude taking primitive Russia out of isolation and into league with Europe.  His strong focus on the military also established Russia as an international power.  Catherine II “The Great” (1729 – 1796) would build on this foundation to expand Russia’s influence, territory and cultural glory.


Mohandas Gandhi (1869 – 1948)

The Indian political leader studied law in London, England, and was admitted to the bar in 1889.  He gave up a 5,000-pound British salary in Bombay in 1895, abandoned Western dress, and commenced to live as a native Indian on one pound per week in Durban in the Natal Province of South Africa.  In accordance with Hindu asceticism, he dressed in a loincloth and shawl.  What led to this abrupt change in costume was the dramatic shock he experienced on his first visit to South Africa, where Indian indentured workers were treated as less than human.  The slight Indian would start a revolution.

Gandhi was a 24-year-old Indian barrister in 1893 when he made that first visit to South Africa to assist a merchant client, Abdulla Sheth, in a lawsuit.  His experience in the first-class compartment of a train was instrumental in persuading him to remain in South Africa to fight for Indian rights.

Holding a first-class ticket, he was asked to move to the van compartment when the train reached Maritzburg, the capital of Natal.  He refused.  Two officials insisted that “colored” men were not allowed in first class.  Then a constable entered the fray.  He took Gandhi by the hand and pushed him out of the compartment.  Gandhi still refused to go to the other compartment and was forced off the train, watching it steam away. 

It was winter.  Maritzburg at high elevation was quite cold, and he was not dressed for the weather, his overcoat and luggage still on the train.  The dilemma: was he to stay and fight for his rights, or return to India?  The temporary hardship, he decided, was only a symptom of a much deeper disease of color and cultural prejudice.  He decided to stay.

Indians labored in the Natal sugar cane plantations.  Gandhi deplored the conditions under which his countrymen and women worked and lived, only a step above slavery.  For the next twenty-one years, he would live in South Africa opposing its discriminatory practices and legislation against Indians. 

In 1914, he returned to India, taking an interest in the growing Home Rule movement.  During WWI, he had little success in engaging the British colonial government in dialogue on Indian issues.  Finally, in 1920, he initiated a campaign of civil disobedience.  This led to social and economic disruption, and some violence, resulting in his arrest and jailing on the charge of conspiracy.  He was tried, convicted and imprisoned from 1922 to 1924.  Upon his release, he resumed his legal battle in support of Home Rule, and was blocked at every juncture. 

In 1930, he led a 200-mile march to the sea to collect salt in symbolic defiance of the government’s monopoly on that product.  Arrested again, he was released in 1931 to attend the London Round Table Conference on India as the sole representative of the Indian National Congress.

Gandhi called for a free and united India, the revival of home industries, especially textile manufacturing, and the abolition of the caste system.  The conference refused his platform in its entirety, as did the British at large and many of his countrymen.

With no concessions forthcoming, nonviolent civil disobedience was renewed along with his “fast to death,” which ultimately led to the constitutional compromise of 1937.  This allowed limited Home Rule as a temporary truce. 

When WWII broke out in 1939, Gandhi attempted to persuade the British that only a free and independent India could effectively support the war effort.  Great Britain countered that independence must be tabled until the war was won.  Gandhi described that promise as “a postdated check on a crashing bank.”

He was arrested in 1942 for civil disobedience deemed obstructive to the war effort, and remained in prison until late 1944.  The war ended in the summer of 1945.

With the war over, the Allies focused on creating the United Nations.  Gandhi took the opportunity to negotiate with the British Cabinet Mission on a new constitution.  This led to India’s independence in May 1947.  Eight months later, a Hindu extremist assassinated him at a prayer and pacification meeting in Delhi on January 30, 1948.

India today is the world’s largest democracy, with the largest professional workforce in the world (200 million).  Soon India will replace China as the most populous country.

The United States Civil Rights Movement adopted Gandhi’s nonviolent civil disobedience strategy in the 1960s.  Dr. Martin Luther King, Jr. and his associates successfully executed this strategy, which led to civil rights legislation in Congress.


Thomas Edison (1847 – 1931)

The American inventor and self-taught physicist was born in Milan, Ohio.  Edison, the most prolific inventor the world has ever known, had three months of formal education.  He was expelled from school, considered retarded, when his real problem was near-total deafness.

At the age of eight, he became a railroad newsboy on the Grand Trunk Railway.  While still not a teenager, he printed and published his own newspaper, “The Grand Trunk Herald,” and sold it on the train.

During the Civil War (1861 – 1865), he worked as a telegraph operator through his teens, and invented an electric vote-recording machine.  In 1871, he invented the paper tickertape automatic repeater machine for the stock exchange on Wall Street.  On the profits, he set up the first industrial research laboratory in the world in Newark, moving the operation to Menlo Park in 1876, and West Orange, New Jersey in 1887. 

His workshops in Menlo Park and West Orange became the prototype for the modern industrial R&D laboratory.  Teams worked on specific projects rather than as lone inventors.  These facilities became known as “The Edison Laboratories.” 

At the height of his creativity, some of the inventions were the transmitter and receiver for the automatic telegraph, the quadruplex system of transmitting four simultaneous messages, an improved stock-tickertape system, the carbon telephone transmitter (microphone) for Western Union Telegraph Company, a phonograph instrument (gramophone), the carbon granule microphone as an improvement for Bell’s telephone, a megaphone, the electric valve, the kinetoscope, a storage battery, light sockets with the “Edison base,” junction boxes, safety fuses, underground conductors, benzol plants, the first talking motion picture, and the first commercial practical incandescent light bulb.  Edison took out more than 1,300 U.S. and foreign patents in his career.  He also discovered thermionic emission, or “the Edison Effect.”

That said, his crowning achievement was the creation of the Pearl Street Electric Light Power Plant in New York City in 1881, the first in the world.  It was Edison’s vision to make daylight possible around the clock.  This utility changed city life.  Not surprisingly, it met with widespread social, political and religious opposition.  Only thirty-four, he organized political support for his idea, and then guided city engineers in its realization.

Refusing to rest on his laurels, he became a major contributor to the war effort of WWI, developing the manufacturing processes of chemicals previously imported.  Later, he worked on the production of synthetic rubber.  He served as head of the U.S. Navy Consulting Board concerned with ship defenses against torpedoes and mines. 

He was an entrepreneur before the word was invented.  Before anyone could imagine the possibilities of motion pictures, he built the world’s first film studio in 1893, then developed, patented, and acquired the rights to the film industry’s early technology.  Some considered him a ruthless businessman as well as a coldly objective scientist.

He gained a reputation for suing virtually every new company with a version of his cameras and projection equipment, crushing many upstart companies with monstrous legal fees.  When he couldn’t wipe out his competition, he cannily offered, “Pool your patents with mine into a holding company, or trust, and I’ll drop my suits.”  Many did.

In December 1908, the Motion Picture Patents Company was formed with the majority of the shares held by Edison, and a former rival, the Biograph Company.  Anyone who wanted to produce, distribute, or exhibit a motion picture in the United States had to pay a license fee to this combine.

Edison biographers see him using enforcement tactics worthy of the mob, from hitting rebellious firms with stiff legal sanctions to engaging in physical force.  In 1915, the United States government invoked the Sherman Anti-Trust Act.  This dissolved the trust.  Edison’s interest in the film industry faded as it moved to California, described as “the skunk-infested citrus-rich real estate of the west coast far from his laboratory sanctuary in New Jersey.” 


Henry Ford (1863 – 1947)

Henry Ford was a Michigan farm boy with mechanical aptitude.  He left his family’s farm in 1878 at the age of fifteen with a sixth-grade education to work in a machine shop in Detroit.  The work was not satisfying, so he returned to the farm to putter with power-driven vehicles on the side. 

In 1890, he returned to Detroit finding work as a machinist and engineer at the Edison Company.  He continued to work on a petrol-driven motorcar that he produced in 1892.

Convinced he had something, he resigned from the Edison Company, forming the Detroit Automobile Company.  Disagreements with his partners led him in 1903 to cash in his interest and form the Ford Motor Company with the Dodge brothers.  In 1907, he bought them out, and thereafter the Ford family controlled the company.

This self-taught engineer was not above learning from the success of others.  He noted that Ransom Olds was setting world records in 1903 producing 3,000 cars with his small, low-priced Oldsmobile.  Workers assembled parts from nearby machine shops.

With $28,000 backing from a local coal dealer, Ford began an Olds-like operation.  In 1908, he unveiled the Model T Ford, selling the world on the idea of an automobile as affordable transportation.  Between 1908 and 1916, he reduced prices by 58 percent when demand was so great he could have gotten any price he asked.  He didn’t, because it conflicted with his vision of producing motorcars for the multitude.

By the end of the twentieth century, the former farm boy’s early vision of individual car ownership had been realized beyond his wildest dreams.  The automobile changed every aspect of modern life from the way cities look to the role of the oil cartels in international politics to the social-sexual behavior of individuals to the very air we breathe.

The Model T Ford was durable, lightweight, flexible and stood up to rough country roads, creating a market for cars among rural working people, where 90 percent of Americans lived in 1908.  Moreover, at $850, Ford’s car was affordable and no longer a plaything of the rich.  Over the years, the price would fall to $310 with the time to assemble dropping from 12.5 to 1.5 hours.  The Model T Ford was 40 percent cheaper than its nearest competitor. 

Ford declared, “No man making a good salary will be unable to afford my car.”  To ensure this goal he paid his workers $5 per day when the going rate in the industry was $2 for a nine hour workday.  The workday at Ford was reduced from nine to eight hours. 

Not stopping there, Ford instituted a profit-sharing plan to distribute $30 million of company stock among employees.  He knew seasoned workers were in demand, with the industry peaking at some 500 car manufacturers in the US alone.  So he left nothing to chance that might jeopardize his stable workforce or the market for his cars. 

Aside from the economies of scale of mass production, he was the first to tool everything from axles to gearboxes to precise tolerances.  Every Model T was like every other.  This introduced another advantage: interchangeable parts. 

A watershed moment came in 1913 when Ford introduced the moving assembly line.  The first line did not involve the entire car, only the magneto.  It was one of those rare moments that define an era not unlike Gutenberg’s first printing of a book using movable type in the middle of the fifteenth century.  After 1913, mass production touched virtually every aspect of modern life, and marked the dawn of the Second Industrial Revolution.

Ford’s genius was adapting the discoveries of others to his purposes.  Swiss inventor Johann Georg Bodmer built the first mechanical conveyance system in textile factories in England; Gottlieb Daimler, Rudolf Diesel and Karl Benz contributed crucial technologies to the refinement of the car itself.  No automotive component bears Ford’s name.

A visionary, he seized on the idea of the automobile as a popular form of transportation for the common man, whereas his competitors thought of it in terms of the luxury market.  He wasn’t selling cheaper cars but reducing the distances between people.  By the mid-1930s, the assembly-line process was implanted in the popular imagination in a revolution in art, music, literature and architecture.  Critics saw the concept of mass production as hard, cold and impersonal.  Sociologists wrote of alienation and self-estrangement, while Ford saw it as bringing the good life to ordinary workers.

Largely because of his vision, automobiles redefined cities and gave birth to suburbs.  They altered shopping habits and introduced regional shopping centers.  The pursuit of leisure was part of Ford’s vision with the happy union of cars to drives to weekend cottages, campgrounds, and amusement parks.  The automobile reconfigured family life taking recreation out of the home and into the countryside with unprecedented mobility especially for teenagers and women.  The legacy of Henry Ford is less about the automobiles and more about the transformation of our lives and minds.


General Douglas MacArthur (1880 – 1964)

Douglas MacArthur was the son of a general, Arthur MacArthur.  The elder MacArthur became Director of the Philippines after the brutal three-year war waged by the U.S. Army against independent-minded Filipino revolutionaries.  The country was handed over to the United States as spoils of the Spanish-American War.  The war ended in 1902 with the U.S. in possession of the archipelago of 7,100 islands, and a strategic military base.

Young MacArthur graduated from West Point in 1903, finishing first in his class with the highest grade point average in the academy’s history.  In 1905, he went to Tokyo as an aide to his father.

He commanded the 42nd (Rainbow) Division in France in 1917 in WWI, and was decorated 13 times and cited seven times for bravery in the field.  Promoted to brigadier general in August 1918, he became the youngest divisional commander in France.  In 1919, he became the youngest superintendent of West Point.  In 1930, he was made general and chief of staff of the U.S. Army.  This was followed in 1935 by his appointment as head of the U.S. military mission to the Philippines, a position previously held by his father. 

In 1941, with the US on the brink of WWII, he came out of retirement at the age of sixty-one to be commanding general of the U.S. forces in the Far East.  In March 1942, after a skillful but unsuccessful defense of the Bataan peninsula, he ordered an evacuation from the Philippines to Australia, where he set up Headquarters as Supreme Commander of the South West Pacific. 

As WWII developed in the South Pacific, he carried out a brilliant leap-frogging strategy to recapture the Philippine Archipelago from the Japanese.  In 1944, he was appointed General of the Army and completed the liberation of the Philippines in July 1945.  Then in September 1945, as Supreme Commander of the Allied Powers, he graciously accepted the surrender of delegates of the Empire of Japan on the battleship Missouri.

At this point he changed hats from general to diplomat, demonstrating his vision and leadership as the American responsible for dismantling the Empire of Japan into a peacetime democracy.  It was the crowning achievement of his professional career. 

General MacArthur, with his unlimited authority, cut the emperor down to size by dismissing 7,500 employees from the Imperial Household.  Then he persuaded Emperor Hirohito to renounce his claim to divinity but remain the titular head of his people.  This paved the way for an American-style constitution with sweeping political reform.  At the same time, he fought vigorously against forces within the Truman Administration that would have had the Emperor tried as a war criminal. 

In 1946, he orchestrated the establishment of the Bicameral Diet (legislature), a prime minister elected by the Diet from its ranks, an independent judiciary, and the ratification of the constitution. This made the Emperor a figurehead sanctioned by “the will of the people with whom resides sovereign power.”  A 31-article bill of rights was created, a complete demilitarization of Japanese society with no right to wage war, but the Japanese traditional pageantry and ceremony of the Emperor was to continue without change.

MacArthur was equally aggressive on the economic side.  He resisted the demands of Washington to dissolve the massive wealthy monopolies and family-operated trusts that were the cornerstone of Japan’s war economy.  Of the 325 conglomerate businesses slated for breakup, only 18 were dissolved.  The rest became the energetic source of revitalization and the foundation of the nation’s economic recovery.  His remarkable insight into the Japanese mind contributed greatly to Japan’s postwar economic resurgence. 

A reflection of his artistic temperament was his inclination to go against the wishes of his superiors, often forcing them to eat their words when he defiantly triumphed.  When war broke out in Korea in 1950, President Truman gave him orders to support the South Koreans in accordance with the Security Council Resolutions of the United Nations.

That July he was made Commander-in-Chief of the UN Forces.  After initial setbacks, he pressed the war far into North Korea.  Once the Chinese entered the war in November, he demanded powers to blockade the Chinese coast, bomb Manchurian bases, possibly with atomic bombs, and use Chinese nationalist troops from Formosa against the Chinese communist regime. 

His quest was for total victory, not a stalemate, whereas the Truman Administration was committed to containment, fearing WWIII.  In April 1951, President Truman relieved him of his command.  The following year he campaigned for the presidency, but failed to win the nomination in 1952.  Invited to address Congress, he said in his melodious voice, “Old soldiers never die, they just fade away.” 

A leader on the order of a Caesar, he made Japan a rich mosaic of his crafting, composed of the moral sense of the Japanese people cast in the image and likeness of the United States.  For this architecture, Japan is today America’s most powerful ally.

A legend in his own time, unforgiving, irascible with an imperious belief in his capacity to lead and the wisdom of his mission, along with a strong sense of theatre, he never doubted his vision, creative powers or ability to serve.  He knew people of the Far East because he spoke their languages, loved their cultures, and respected their traditions.  He has earned a place among the best the American culture has ever produced.


Ray A. Kroc (1902 – 1984)

Ray Kroc opened his first fast food franchise restaurant in April 1955 in Des Plaines, Illinois, a suburb of Chicago, the same year Steven Jobs and Bill Gates were born.

No one could have guessed that this 53-year-old kitchen equipment salesman would become a visionary and set in motion a revolution in American eating habits, or that he would create a worldwide empire on the hamburger.

Kroc, exclusive distributor of the Multimixer, a malt machine that could mix five milkshakes at once, met Richard and Maurice McDonald the previous year at their San Bernardino, California drive-in restaurant.  Dick and Mac (as Maurice was known) opened their first restaurant in 1940. 

It was not unusual, a barbecue and carhop place.  The brothers came to realize customers wanted their food in a hurry and were not interested in being waited on, just quick service.  So, in December 1948, Dick and Mac moved into the fast food business.  It wasn’t a particularly sophisticated place.  Dick came up with the idea of a couple of arches to represent the “M” in McDonald’s.  Customers could drive in and place their orders at the first window and pick up their food at the second window.  Customers loved it.  A hamburger cost 15 cents, a milkshake 20 cents, and a pack of fries 10 cents. 

The brothers used the now famous assembly-line technique to prepare burgers, fries and shakes.  They turned their roadside McDonald stand into a $200,000 business, and began to sell franchises across California.

Kroc saw an opportunity to sell more Multimixers, and was eager to share in the receipts, persuading the brothers to let him license the restaurant nationally.  In 1954, he sold his blender business and bought the American franchise for McDonald’s for $2.7 million.  In 1961, he bought the world rights.  The chain now boasted more than 200 branches.  The McDonald brothers faded into history.  Kroc took over the fast food world.

A super patriot, he ordered franchise operators to fly the American flag while he built his new business with military uniformity and precision in products and presentations.  His philosophy, “Persistence and determination alone are omnipotent.  If a corporation has two executives who think alike, one is unnecessary.”  He was quite Machiavellian.  “If I saw a competitor drowning I’d put a live fire hose in his mouth.”

By the 1980s, with 10,000-plus units, the McDonald’s Corporation was the largest food-service company in the world, the biggest owner of commercial real estate in the United States, and one of the nation’s major employers.  Stamped across the landscape from Paris to Beijing, the restaurant trademark of golden arches became a supreme symbol of American-style consumerism.  Today, McDonald’s has 25,000 restaurants in 120 countries around the world.  A staggering 40 million people eat at a McDonald’s every single day of the week.

While eminently successful, nothing is particularly original or innovative about the chain.  It does the simple things well with standardized precision.  A McDonald’s in Nairobi, Kenya looks much the same as one in Warsaw, Poland, or Battle Creek, Michigan.

The Leader-as-Artist doesn’t invent, but acts with the existential pragmatism of Horace: “Carpe diem, quam minimum credula postero,” that is, take advantage of today and place no trust in tomorrow.  In modern parlance, “carpe diem” has been reduced to “seize the day.”

Kroc invented the fast food service economy with hamburgers, shakes and fries.  Like Ford, who never invented anything, he seized the day and made it his empire.

The Leader-as-Artist creates out of what is already there.  Luther created religious peace out of spiritual turmoil; Peter the Great, modern comfort out of economic hardship; Gandhi, civil freedom out of cultural bondage; MacArthur, hope (for the Japanese people) out of crushing defeat; and Kroc, eating on the run out of time-consuming and costly restaurant dining.

Now, we turn our attention to two Leaders-as-Artists who never had to grow up, who could remain perennially children, approaching the future with the eyes of the child, turning toys into tools, changing the calculus of change from centuries into single decades.  Over the previous 500 years, parents were idealized, and little changed from generation to generation.  Now, the precocious child has taken the leader’s mantle.  Philosopher Jacques Barzun calls this “the dawn of decadence,” when it might more accurately be seen as the end of hypocrisy for the under thirty crowd finds ample time to wonder.  Leadership has come home.  


Steven Jobs (1955 – 2011)

Steven Jobs and Stephen Wozniak were college dropouts in 1977 when they first introduced the Apple II, the inexpensive, easy-to-use home computer, which helped to launch the personal computer revolution.  Fueled by Wozniak’s mainly self-taught engineering genius and Jobs’ entrepreneurial acumen, Apple transcended its garage-operation origin to become the fastest growing company in American history.

Jobs, with an uncanny artistic flair, started by designing video games for Atari, while Wozniak worked on pocket calculators for Hewlett-Packard.  Wozniak had designed the rudiments of a home computer but could not prevail on his employer to produce it.  Jobs, ever the inveterate salesman, was unable to convince Nolan Bushnell of Atari to commit resources to the project.  So Jobs and Wozniak went off on their own to set up shop in Jobs’ garage. 

They had gained experience in electronic circuit design when they were kids building “blue boxes,” which were illegal as they were used to make free long-distance telephone calls.  Their original intention was to make circuit boards in the garage and sell them to hackers who wanted to build their own computers.  With capital of $1,300 from the sale of Jobs’ Volkswagen and Wozniak’s calculator, production commenced.  They hoped to sell their first customer, The Byte Shop, 100 boards at $50 each when they cost $25 per board to make.  Paul Terrell of the shop said he was interested only in a fully assembled product made from the circuit boards.  So, in one small step, Jobs and Wozniak graduated from circuit-board to computer manufacturers.

Wozniak did the engineering while Jobs sought venture capital, raising $600,000.  When Apple went public in 1980 with its initial public offering (IPO), the company raised $110 million.  They were multimillionaires at only twenty-five.  Four years later, the Macintosh was introduced.  Jobs, now chairman of Apple, held stock valued at well over $100 million, and was one of only three people under thirty on Forbes’s list of the 400 richest Americans; the other two were oil heiresses.  Wozniak had left Apple to go back to teaching.

When Apple was incorporated in 1977, there were only 50,000 personal computers in the world.  By 1987, Apple was building and selling 50,000 personal computers every day.

These two Leaders-as-Artists are credited with creating the first home computer.  True, Apple sold the first home computer, but Xerox had the dubious distinction of building the first personal computer in its laboratories.

In 1974, three years before the birth of the Apple II, Xerox had created the Alto with its graphic screen, overlapping pop-up windows, icons, fonts, and a mouse.  Jobs saw this design while visiting a friend in the Xerox laboratory.  He couldn’t believe his eyes.  “Why aren’t you doing something with this?” he asked, then added, “If you don’t, we will.”  And of course, they did.  They stole the personal computer in broad daylight.

Engineers at Xerox had pleaded with management to produce the personal computer for the general public, but corporate heads felt no market existed for a machine designed to be used by one person.  The rest is history.

Xerox executives were locked into critical thinking, or what was already known, whereas the Leader-as-Artist is into creative thinking, or what is not known but can be found out by thinking outside the box, the natural ways of the child.


William Henry Gates III (1955 –     )

Bill Gates was less the child and more the parent, less the wonderer and more the Machiavellian, although Jobs had some of the Italian thinker in his DNA as well. 

It was 1981 and the computer business had gotten personal when IBM, previously asleep at the switch, suddenly tried to play catch-up by introducing its PC.  Apple had a four-year lead, with its desktop computers flying off the shelves.  IBM, the world giant in the manufacture of data processing machines, moved to assert its competitive advantage.  The company reasoned its size allowed it not only to build a comparable product but to quickly erase its shortsightedness.  In 1981, IBM sold 25,000 units; three years later unit sales soared to 3 million.  IBM gave notice that it was still the hardware king.

At the core of the IBM PC were Santa Clara-based Intel Corporation’s microprocessor and Seattle-based Microsoft’s disk operating system (MSDOS), the program that facilitates the running of other programs, known as “software.”  IBM licensed MSDOS.

In a fateful oversight, IBM did not prevent Intel or Microsoft from selling those products to other manufacturers.  Soon, a horde of IBM clones crowded the marketplace, all based on the Intel chip and MSDOS.

By the mid-1990s, nearly 90 percent of the world’s personal computers were either IBM’s or clones.  An industry once full of eccentric technologies had been completely standardized.  IBM found itself in deep financial trouble.  It was a hardware company in an increasingly software world.  Intel was the world’s largest chip manufacturer, and Bill Gates was the richest man in the world although not yet forty.

To understand the paradigm shift missed by IBM, it should be mentioned at this point that there was a parallel shift from by-the-numbers management to Leaders-as-Artists.

It was a youth movement of unprecedented dimension, as seniority and pyramid climbing in the corporation were superseded by imaginative, curious, and largely self-educated risk-taking entrepreneurs.  They were not locked into a system where the emphasis was on business as usual, the status quo, the corporate pecking order, three-piece suits, corner offices in glass towers, infallible authority, or loyalty to anything other than self-interest.

The computer industry now changes radically from month to month, requiring constant and steep learning curves.  Like children, these Leaders-as-Artists treat work as play, falling down, getting up, laughing, taking their lumps in good spirit, judging progress on results alone, not hours spent, to the alarm of most other industries, which have room only for parents.

To wit, in panic mode, IBM in 1980 rushed to put together a system based on existing components and technology instead of utilizing its state-of-the-art laboratories, as it had in engineering its mammoth System/360, a machine that had delivered much more computing power for the price than the competing machines of Burroughs and UNIVAC.

IBM’s practice was to force widespread price-cutting, following its initial offering with a steady stream of new technology breakthroughs.  IBM exploited technology better than anyone to gain its competitive advantage, yet it was not as Machiavellian as these Leaders-as-Artists, whose drive was to control the technology.

IBM missed the PC boom because it failed to appreciate how the world was shifting to software.  Instead, its greatest fear was undercutting its hugely profitable large-computer hardware business.  So, David downed Goliath with a circuit board.  To be fair, IBM missed the paradigm shift for another reason.  It was war weary from constant antitrust litigation battles with the federal government.  IBM had no desire to join the computer wars.

On the other hand, upstart Microsoft didn’t fail to learn the exploitative ways of IBM.  It used them openly against its competition.  It departed from IBM, however, in that it embraced the computer wars, paying little attention to federal threats over its monopolistic practices.  Instead, with youthful exuberance, it replied with gusto, “Bring it on!”

With hindsight, patterns and processes take on a characteristic consistency and logic, which rules out the element of chance, or blind luck.  It would seem that during these stressful moments intuition kicks in for the Leader-as-Artist.  Consider this.

The best candidate to produce IBM’s operating system was a small husband-and-wife software company called Digital Research.  IBM failed to come to an agreement with Gary Kildall, a computer engineer and the head of Digital.  Kildall missed this golden opportunity, choosing that day to go flying in his private plane and leaving his wife, Dorothy McEwen, to negotiate with IBM without his input.  She was in fact the firm’s business manager and chief negotiator.

After meeting with the IBM representative, who pressed for a quick decision, she rejected the terms of the contract, finding them too one-sided in IBM’s favor.  Her main objection was the nondisclosure agreement IBM required.  This would, as she saw it, allow IBM to track her company’s product development, and then go out and duplicate those products on its own.

IBM next contacted Microsoft.  Bill Gates and his partners Paul Allen and Steve Ballmer were of another mind.  The story now reads like fiction.  It pitted tiny Microsoft’s swift opportunism against giant IBM’s slow-wittedness.  Microsoft played to IBM’s swagger with fawning obeisance, knowing IBM was late to the PC business and off the mark with its own PC.  The Microsoft partners knew IBM was in catch-up mode but that it still saw PCs as ancillary to mainframe computers.  IBM exhibited no discomfort with the soaring growth of Apple, nor did it feel any threat to its core businesses.  Clearly, IBM thought it was negotiating with a strong hand, yet it was surprised when Microsoft so quickly accepted its terms, given the experience with Digital Research.

Frank Cary, IBM’s chairman at the time, was the father of two strategic errors: the decisions to go outside the company for the microprocessor, the heart of the PC, and for the PC’s operating system, when IBM had the most sophisticated engineering staff in the world in its own facilities.  Intel agreed to supply the chips and Microsoft the software.  IBM has been lagging in the computer race ever since.

Big Blue, as IBM is known, is the quintessential management-by-the-numbers megacorporation.  It thought it was saving time and money by outsourcing a non-core activity to small contractors.  After all, IBM was in the computer hardware business, where the real money and power were.  Without an IBM computer box, the reasoning went, the Microsoft brand was valueless.

What IBM couldn’t see, and Bill Gates could, was that the computer business was on the brink of a paradigm shift from hardware to software.  Gates knew IBM would be required to establish a common standard, or platform, for software applications.  That platform would be Q-DOS, to which Microsoft had purchased non-exclusive rights from Seattle Computer for $25,000.  The value of this operating system soon became apparent.  Microsoft quickly paid Seattle Computer another $50,000 for exclusive rights.  In 1986, Microsoft paid Seattle Computer nearly $1 million to settle a dispute over the rights to DOS, now known as MSDOS.

The irony is that other operating systems were superior to MSDOS.  Apple was already providing alternatives on its desktop computers, and the Apple brand was ahead of Microsoft in image and perceived quality.

Gates, an excellent bridge player, understood the deck of cards better than anyone, knowing when to use his bravado, when to go for the jugular, and when to play trump.  Under the initial contract, IBM agreed to fund most of the development costs of MSDOS, but only Microsoft was allowed to license the operating system to third parties.  This was a killer clause, and an act of audacity and brilliance.

It was as if Gates read not only all the cards but the future as well, gambling on instinct that there would be an explosion of new competition in the personal computer business.  When that day arrived, which he believed would be soon, he would see that they all ended up using MSDOS and paying Microsoft for the privilege.

IBM’s mistakes didn’t end with Microsoft.  When it recognized its error, it neither renegotiated the licensing agreement nor attempted to break with Microsoft.  More mystifying, senior management killed an internally developed operating system that could have broken Microsoft’s stranglehold on the PC market.  Consequently, Intel’s microchip and Microsoft’s software continue to dominate.

Author Charles D. Hayes has written a series of books on the autodidact and the self-university.  Leaders-as-Artists don’t limit themselves if they lack formal credentials or specific training; they find what they need through self-initiative and self-discovery.  Jobs, Wozniak and Gates have pursued this curriculum, as have many others.

Leaders-as-Artists don’t follow the company line.  They are the company.  Their muse is their interest.  Lincoln was a self-taught writer of the first rank; Ford, a self-taught engineer; Edison, a self-taught scientist; Kroc, a self-taught entrepreneur; Gandhi, a self-taught revolutionary; Luther, a self-taught reformer; Peter the Great, a self-taught freethinker; and MacArthur, a self-taught social engineer.  With self-university, school is never out, and the mind stays forever young.


Greatest Lessons


As the world shifts from the dominance of the few over the many, Leaders-as-Artists have risen, yet they are indistinguishable from followers.  Authoritarianism fades as counterintuitive ideas take hold.  The world is changing, accelerating, but leadership is constant.

An ordinary cleric humbled the most powerful institution on earth, and seeded the Reformation.

A poorly educated scoundrel had the vision and moxie to lift the Russian people out of the darkness of night into the bright sunshine of day.

A saint in an artist’s body, shy to the point of embarrassment, took on the Empire and led his people to independence.

So deaf he was thought slow-witted, he embraced his handicap to become a scientist and the greatest inventor of all time.

Thought to lack sophistication and an executive temperament, he mastered mechanics, mechanical engineering, assembly-line technology, and business to become the premier car manufacturer in the world.

Arrogant to the point of being unbearable, and a military genius of the first order on the battlefield, he became the empathic model for nation building, turning a defeated nation into a thriving democracy.

A kitchen supply salesman, he launched a fast food business that would stretch around the globe, and change the world’s eating habits.

A college dropout, he turned toys into tools and the personal computer into an object of beauty as well as a form and function of precision.

Another college dropout, he could give the Robber Barons lessons in reading the cards and the faces that hold them, becoming the richest man in the world, then forming a humanistic foundation with a $26 billion endowment for the world’s needy.

*     *     *

Microsoft, now a giant monopoly, was ordered split into two companies by Judge Thomas Penfield Jackson on June 7, 2000.  Four years later, Microsoft was still intact, the breakup order having been overturned on appeal.  The European Union followed with its own action, fining Microsoft $615 million for antitrust violations, mainly involving bundling.  Microsoft appealed; by the time the case is resolved, Microsoft will have moved on to new technologies.

Jobs’ fairytale story at Apple has had fewer traumas, but it has not been without setbacks.  In 1983, he hired Pepsi president John Sculley to run Apple.  Apple under Jobs was at times contentious, disorganized, casual, spontaneous, eclectic, a back-pocket approach to management.  Sculley was to hone Apple with his executive polish.

Unfortunately, he didn’t know the industry or understand the people, and he found the Apple culture unfathomable.  Apple was a personification of Jobs.  A conflict developed between Sculley and Jobs, and Sculley took the opportunity to convince the Apple board that Jobs was the problem.  Jobs was forced to resign in 1985.  Apple later lost its copyright battle in a lawsuit with Microsoft.

Apple’s market share plummeted from 20 percent to just 8 percent.  Michael Spindler replaced Sculley.  He lasted until 1996, by which time market share had fallen to 5 percent.  Apple was hiring CEOs who didn’t know the business, and it was on the brink of collapse.  Gil Amelio was hired next and lasted 500 days, with market share down to 4 percent.

After more than a decade in exile, Jobs was back.  The iconoclast with an attitude, and the company’s co-founder, he gave Apple a second chance.  What Apple had missed was his knowledge of the industry, his artistic acumen, his arrogance and brashness, his resolute passion to create, and his ability to make the marketplace clamor for Apple’s products.

The Leader-as-Artist teaches us to get inside what limits us to see beyond the expected, to abide by Nature’s laws rather than to abuse them, to see ideas not in gender, ethnic or political terms, but as Mother Earth constructs. 

Form does not always follow function in the human sense as it does with things, as the Leader-as-Artist knows.  The function of behavior often follows the specific form of the culture.  Peter the Great understood this.  Steven Jobs learned to appreciate it as well when he came back to Apple.  When we operate from a problem-oriented point of view, we often place the emphasis on the solution while the problem slips away, as in the case of IBM during the early days of the computer revolution.

The Leader-as-Artist gets around limitations, around circular logic, around crisis management, and the absolutes of everything by considering the three common threads to leadership discussed here.


*     *     *