One of the most momentous conversations in history you’ve never heard about took place between Socrates and a man named Callicles, and is recorded in Plato’s Gorgias. It is a surprisingly moving portrayal of a man who tries to describe the world as it is but, upon prompting, reveals how much he yearns for the way it ought to be. Although it took place 2,400 years ago, the conversation is timeless and very modern. I think it describes many of us today.
Callicles argues that, by the law of nature, the strong, the lions, are entitled to take what they can from the weak. The weak, the sheep, don’t like that, of course, so they get together and call what the strong do unjust. By implication, what they themselves do is just. Collectively as a herd, the sheep want to dominate the lions. So whereas nature is on the side of the strong and the lions, convention is on the side of the weak and the sheep.
Influence on Nietzsche
To many of you, this rings a bell. Yes, this is where Nietzsche got his ideas for his Genealogy of Morals. Nietzsche took his metaphors of lions, sheep, herds, slaves and so forth from Callicles, then spun them into his own theory: that the sheep banded together to invert the natural concepts of good and bad, strong and weak, motivated by a festering rage for which Nietzsche used the French word ressentiment.
Relevance to Darwin
Socrates being Socrates, of course, he goes on to needle Callicles about the precise meaning of words in order to poke a hole in his argument. He asks Callicles to clarify the terms “better” and “stronger”. Are they the same?
Callicles has to admit that they are not. And off they go, debating what that means.
Today, of course, we know that Callicles was looking for a better word: not strong or good but fit. Not fit as in ‘toned from the gym’ but as in ‘survival of the fittest’. The fittest, according to Darwin, are not the strongest or the best but the most adapted.
The law of nature that Callicles refers to is therefore evolution. It is the tautological observation that those who are better adapted to the prevailing circumstances will leave more of themselves (ie, their genes) behind than those who are worse adapted.
Gibe at democracy
Callicles and Socrates go on to mock democracy (Athens was an even more direct democracy than America is today). Democracy to them is the inversion of nature, the herd of sheep ruling the lions, the weak dominating the strong, the inferior getting their revenge on the superior.
Yearning for what ought to be
But the dialogue between Callicles and Socrates becomes more moving than anything Nietzsche did with it. That’s because during the conversation it becomes clear that Callicles is a sophisticated and sensitive man who’s trying to describe how the world is while simultaneously being sad about it and yearning for how things ought to be.
He’s confused and bitter, about many things. He’s angry at Socrates for needling him, but also because he already foresees (correctly, of course) that the democratic herd of sheep will condemn the lion Socrates. And he hates himself for having to suck up to the herd, to the Athenians, to make his living.
He also hates seeing the fit succeed whether or not they are also good. In other words, he has the ideal of justice in his head as though it were an archetype. Like most of us, he’s frustrated. That’s all that Plato definitely establishes in this dialogue.
More than three years ago–it seems like three decades–I wrote an eight-chapter Special Report in The Economist in which I tried to envision the future of the media. (It starts here, for those of you with a subscription.)
In it I argued that we (society) were in the midst of a transformation equal in significance to that started by Gutenberg’s printing press during the Renaissance. One media era was ending, another starting:
Old: Media companies produce content & captive, passive audience consumes it.
New: Everybody produces content and shares, consumes, remixes it.
Old: Media companies lecture the audience (one to many).
New: The audience has conversations among itself (many to many).
To show you how long three years can be, consider:
As part of my Special Report, I did our (The Economist’s) very first podcasts–a word that many of the editors in London had not even heard yet. Today our podcasts are among the most popular on iTunes.
During my research for the Report, I heard the word “YouTube” for the first time (the company had just been founded). When I sent the Report to the editor, it contained one single reference to YouTube. Four (!) weeks later, when the Report was published, YouTube had already become the biggest story of that year (2006).
I had never heard of Facebook (not to mention Twitter). And so on.
How I use the media today
All of this sounds quaint today, so I thought I might share with you how my personal media habits have changed since my Report, and then answer some questions:
Does my 2006 thesis hold up?
Would I refine it today?
Is there a media “crisis”?
1) More efficiency in my work life
Back in 2006, I still subscribed to a lot of paper newspapers and magazines, as all journalists used to do, in order to “keep up” with the competition and to be informed. Those things piled up on my floor and made me feel guilty.
Today I have no paper subscription at all! I have precisely two electronic subscriptions on my Kindle, one newspaper (The New York Times) and one magazine (The Atlantic).
I use my Kindle in the morning over my latte to catch up with the global headlines, the mass market “news”. It is almost relaxing. It takes maybe 15 minutes. Later in the day, if I am driving, I will listen to NPR in the car. That represents my entire consumption of “mainstream” media through their traditional distribution channels. I do not own a TV set.
After I put down my Kindle, my work starts. This means that I open my own, personal “newspaper”, which is my RSS reader. In it I mix “feeds” from the “head” and the “long tail”, from the LA Times to small blogs on California politics and obscure research outfits such as the Public Policy Institute of California.
The important thing to note here is that I have
disassembled many disparate publications and information sources, including sources not traditionally considered “news”, and
reassembled them as only I can for my own productivity. I have thus replaced “editors” and will never, ever allow them back into this part of my life.
I probably spend an hour or so reading inside my RSS reader. This is not so relaxing. I consider it work. This is my deep dive into stuff I need to know to cover my beat (ie, the Western states). I don’t worry about printing or filing anything because I tag the items, knowing that I can search for them in future. (And yes, that means that my office is now paperless.) Sometimes I hit “share” and my editor can see what I’m reading.
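(For the technically curious: the mechanics behind such a self-assembled “newspaper” are simple enough to sketch in a few lines of Python. This is a minimal sketch, not my actual setup; the feed URLs below are placeholders, and it assumes the widely used feedparser library.)

```python
# A minimal sketch of the "disassemble and reassemble" idea: merge several
# RSS feeds into one chronological reading list. Feed URLs are placeholders.
import feedparser

FEEDS = [
    "https://example-big-outlet.com/california/rss.xml",  # "head": a big outlet
    "https://example-niche-blog.org/feed",                # "long tail": a niche blog
]

def personal_newspaper(urls):
    """Pull every entry from every feed and sort newest-first."""
    items = []
    for url in urls:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            # published_parsed is a time.struct_time, or None if the feed omits it
            items.append((entry.get("published_parsed"),
                          entry.get("title"),
                          entry.get("link")))
    # Newest first; undated entries (None -> empty tuple) sort to the bottom.
    items.sort(key=lambda item: item[0] or (), reverse=True)
    return items

# My own front page, assembled by me, with no editor in sight.
for published, title, link in personal_newspaper(FEEDS)[:20]:
    print(title, "-", link)
```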
Then I’m done for the day, and I move on a) to do research for my stories and b) to take occasional study breaks for fun with the other media….
2) My intellectual life: Social curation
In my “private” (ie, non-Economist) existence, I now essentially live the vision that I sketched in my Special Report. Which is to say that I am simultaneously the audience for other “amateur” producers of content and an amateur producer myself. This is simply a highfalutin way of saying:
I blog (right here) for motivations that are not remotely commercial, and
I read other blogs for intellectual stimulation, and
I occasionally post to my Facebook news feed, and
I glance into the Facebook updates of people I know.
Through the blog, Facebook and the old-fashioned medium of email, I now have a spontaneous and unplanned but remarkably efficient and bespoke system of social curation for my media content.
I can easily spend an hour or two a day just following the links that you guys, ie my blog readers, provide. Virtually all of you on this blog have never met me in person but you have a keen sense of my intellectual tastes by now, and you provide links that are, for the most part, stunningly relevant. Sometimes you bring to the surface specific research papers or articles in obscure journals that I would never have discovered in the previous media era.
On Facebook, I find that the connections are of the opposite nature: Most of my “friends” I really do know in offline life, but many understand my intellectual tastes less well than my blog readers do. My Facebook friends are nonetheless in my social circle, so their links tend also to be obscure, risqué, ironic, or moving–in short, more interesting and enjoyable than any content the media companies used to dish up for me in the previous era. Ten years ago, for instance, I would probably never have seen a stunning Ukrainian artist perform the Nazi invasion of Ukraine with sand.
The things to note here are:
My social curators also disassemble and reassemble the sources of content. They mix Jon Stewart clips (mainstream media, commercial) with homemade music ensembles (amateur, non-commercial) into one bespoke media flow.
My online and offline friends have thus become what media editors used to be, and they are far better at it than their media-conglomerate predecessors ever were. I will never allow the old editors back into my life.
It goes without saying that I “time-shift” and “place-shift”, which is just a highfalutin way of saying that I “consume” this content wherever and whenever (laptop + iPhone) I happen to be.
3) My intimate media
The final layer is what Paul Saffo in my Special Report called the “personal” media. These are media produced by family members and very intimate friends for defined and tiny audiences.
Example: baby pictures and clips on my private family web site. The site is protected and only grandparents and dear friends have access. The motivation is thus the opposite of the traditional media:
The audience is deliberately kept small (whereas media companies want large audiences).
The intent is to share and preserve personal memories.
Because the capture and sharing of such intimate media is so much easier than it ever was, I spend much, much more of my media time immersed in them. Where do I find this time? Easy. As Clay Shirky has been saying for years: We have a surplus of time, once we get rid of the crap in our lives.
Conclusion
So, to answer my three questions:
Does my 2006 thesis hold up? Yes, I believe it does. We all have the equivalent of many Gutenberg printing presses in our pockets and on our laps, and we use them to tell stories to one another as never before.
Would I refine it today? I would pay more attention to video and audio as opposed to text in the mix.
Is there a media “crisis”? No!
It is that last point that may come as a surprise. I am in an unusual position in that I am both a professional and an amateur writer. So I must be aware that the news industry is dying, right?
I am indeed aware that it is shrinking. But is that a problem? There are indeed two crises:
A money and profits crisis for owners of media capital.
An employment crisis for journalists.
But those are two constituencies that the rest of society need not care about. For society as a whole, I believe there is no crisis, once we stop being hysterical and examine our media habits.
What I have discovered in my own personal media behavior is that I am today better informed than I have ever been before. But much of the information I consume no longer comes from journalists.
Instead, much, much more of it now comes from universities and think tanks in my RSS reader and iTunes University, from scientists and thinkers and other experts at conferences such as TED, and from you, who are a self-selected and thus qualified bunch of editors.
Speaking purely as a consumer of the media and a citizen, I believe that there is no media crisis–indeed, that we are entering a second Renaissance.
One thread on The Hannibal Blog, as regular readers know, is strategy. That’s because strategy (as distinct from tactics, which also matter) is crucial to achieving success. Genius tactics in service of the wrong strategy lead to disaster, as they did for the main character in my forthcoming book, Hannibal of Carthage.
Mark Hurst over at Good Experience has an amusing and insightful post on strategy as opposed to tactics. (Mark, by the way, also runs Gel, an ideas conference and a mini-TED, as it were.)
On one hand, Mark quotes Calvin Coolidge, our 30th (as well as 30th-most-impressive) president:
Nothing in this world can take the place of persistence. Talent will not; nothing is more common than unsuccessful people with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent.
If I may reflect on my guy, Hannibal: he had remarkable persistence–leading an army, elephants and all, over the Alps, defeating the Romans, staying undefeated in Italy for 16 years!
The trouble with the Coolidge take on success is, as Mark points out, that the effectiveness of persistence
depends on having the right direction. Without that one little element, the entire effort is for naught.
So Mark wheels out the Cheshire Cat, a sort of feline Clausewitz. Alice asks which way she should go, and the Cheshire Cat answers:
“That depends a good deal on where you want to get to,” said the Cat.
“I don’t much care where,” said Alice.
“Then it doesn’t matter which way you go,” said the Cat.
To Mark that means that
you have to stop and take time to find the direction. You can’t run while you’re reading the map.
To me it means that Hannibal was a bit like Alice. Yes, he knew that he wanted to defeat Rome (which was like saying “I want to achieve success”–ie, vague). But he did not know where he wanted to go (ie, how to go about defeating Rome).
I admire people like Albert Einstein and Carl Jung (both characters in my book) who were able to feel awe. They retained their ability to be amazed by the world, and derived out of that amazement what Abraham Maslow called “peak experiences.”
I also admire people like Richard Dawkins (and of course Charles Darwin) who are able to use the precision-scalpels of their minds for clear thinking and shocking insight. Eg: Evolution. Eg: No God.
Like Einstein, I don’t really see combat between the one attitude and the other, between the left brain and the right, the yang and the yin. I especially like what happens when the two are well connected.
Richard Dawkins and Karen Armstrong were recently both asked where evolution leaves God. Dawkins, true to take-no-prisoners form, answered:
The kindest thing to say is that it leaves him with nothing to do, and no achievements that might attract our praise, our worship or our fear. Evolution is God’s redundancy notice, his pink slip.
Armstrong responded brilliantly too, by avoiding the embarrassing efforts of certain people to deny the evidence of evolution and instead going a level deeper, into topics dear to The Hannibal Blog: storytelling, mythology, and archetypes:
First Armstrong concedes that
Richard Dawkins has been right all along, of course—at least in one important respect. Evolution has indeed dealt a blow to the idea of a benign creator, literally conceived…. No wonder so many fundamentalist Christians find their faith shaken to the core.
But then she expands the topic:
Most cultures believed that there were two recognized ways of arriving at truth. The Greeks called them mythos and logos. Both were essential and neither was superior to the other; they were not in conflict but complementary, each with its own sphere of competence. Logos (“reason”) was the pragmatic mode of thought that enabled us to function effectively in the world and had, therefore, to correspond accurately to external reality. But it could not assuage human grief or find ultimate meaning in life’s struggle. For that people turned to mythos, stories that made no pretensions to historical accuracy but should rather be seen as an early form of psychology; if translated into ritual or ethical action, a good myth showed you how to cope with mortality, discover an inner source of strength, and endure pain and sorrow with serenity…
(Note 1: Logos is one of those Greek words that can be translated in several different ways. Viktor Frankl, as you recall, translated it as meaning, and named his approach logotherapy after it.)
(Note 2: The complementarity of mythos and logos is the stylistic assumption behind the book I am writing. It is non-fiction (logos) but–or so I hope, and so the editor believes–reads like myth. That’s because I feel that ideas, even logical ones, require stories for their telling.)
I have been researching James Madison for a little project that I am not yet entirely at liberty to disclose. And my research is leaving me extremely impressed–so impressed that he may just be my favorite founding father. He certainly belongs in my pantheon of the world’s greatest thinkers.
Madison, of course, was not only the fourth president but also, and more importantly, the “father” of the US Constitution. He was the one who took the official notes in the sweltering summer heat of Philadelphia in 1787, and the one whose “Virginia Plan” (which was delivered by another Virginian delegate but conceived by Madison) formed the basis of the subsequent compromises that led to our constitution. He was 36 years old at the time, and as physically short as he was intellectually towering. Would that America had a man of his ilk today.
I am about to sketch out his vision of freedom as succinctly as I can, but let me just say that if you have been reading the Hannibal Blog for a while, you won’t be at all surprised that I admire the man. Madison fits perfectly my tastes for:
the republican, as opposed to democratic, vision of liberal government, where both ‘republican’ and ‘democratic’ are properly defined.
Since it is this point that is most likely to be misunderstood, let me drill into that part of Madison’s thinking. Here is how I understand his views on the matter:
Madison originally preferred to use the word republic to describe the new America they were building, as opposed to the word democracy.
“Republic”
Republic comes from the Latin res publica, which means ‘public thing’–in other words a country ‘owned’ by its people rather than by a monarch. Deriving from Latin, the word reminded educated men such as Madison of republican Rome (ie, Rome before its civil wars), which was so remarkably stable and moderate, and which so impressed Polybius.
Being a public thing, a republic implicitly contains the element that we would call democracy, but it is understood that this is a representative democracy, in which the people choose representatives who in turn decide the issues of the day in competition with other branches of the government. Governance, in other words, has a basis in the people but is removed from the mob.
Most importantly for Madison, minorities in this republic are protected from majorities. He recognized that the tyranny of majorities is perhaps the greatest threat to freedom (which liberal thinking is all about, after all).
Put differently and in modern lingo, Madison was the opposite of a ‘populist’. If he were around today, certain ‘real-America’ Alaskans would attack him with demagogic effect for being elitist.
“Democracy”
Democracy, by contrast, comes from the Greek and means ‘rule of the people’. The connotation to educated men such as Madison was therefore ancient Athens, during the Periclean era of the Peloponnesian War, which had a direct democracy as opposed to the balanced representative one.
As part of another project that I’m not totally at liberty to disclose yet, I am also looking into that Athenian democracy right now. And allow me to state clearly that it ended in chaos and failure, in pre-emptive wars (Sicily) that should never have happened and mob-mad injustices such as the trial of Socrates.
Direct democracy is of course alive and well today in western states including California. In a mindlessly populist culture, it is a popular idea. (Stuck in a debate? Just say “let the people decide!”) What that leads to I have described in The Economist.
I’m finding myself intrigued in the extreme by a figure from antiquity as colorful as Hannibal: Alcibiades. He is such a good character, he might be worth another book.
Why? Mostly because he was a (bad) student of Socrates’, and indeed the perfect foil for the great old man:
Socrates: interested in justice. Alcibiades: interested in himself.
Socrates: tried to teach Alcibiades inner values. Alcibiades: tried (and failed) to sleep with Socrates.
Let me give you an abbreviated and simplified biography of this man. (One reason why many people never learn to appreciate history is that many teachers get bogged down in boring detail. So let’s not make that mistake today.)
Alcibiades, his father having died young, was raised in the home of his guardian and kinsman, Pericles, the greatest statesman of Athens, which was in turn the greatest power of Greece. Alcibiades was thus a rough equivalent of, say, a Kennedy heir in the 60s and 70s–a party boy in a powerful family.
On the eve of Alcibiades’ own entry into Athenian politics, Socrates took an interest and, using his customary Socratic irony (in which Socrates pretends to be less than he is), got Alcibiades to talk about what he wanted Athens to do, in the process exposing him as the confused young man that he was.
Alcibiades, being good-looking (and very much the ladies’ man, of which more in a minute) and charming, rose politically. He became a general in the Peloponnesian War, one of three chosen to take a huge invasion army to Sicily in what was to be one of the dumbest pre-emptive strikes in history.
Just after they sailed, however, the Athenians discovered that somebody had, apparently as a prank, broken off all the erect phalluses on the statues of Hermes, which was sacrilege. This was exactly the sort of thing that Alcibiades got up to when he was drunk, so he was presumed guilty. (Then again, he was such an obvious culprit that he may have been framed.) So the Athenians sent another ship after the invasion fleet to arrest their general and bring him home for trial.
Alcibiades did not like that idea and defected to … Sparta! The enemy. Because he was so charming, the Spartans accepted him, and Alcibiades helped them defeat the Athenians. But then it was found out that Alcibiades was sleeping with the wife of one of the Spartan kings, so he made a hasty exit.
Next he went to Persia, Athens’ other enemy. He charmed them, advised them …. (you get the pattern).
Such was his charm and charisma that, after having been a traitor to his native country for so long, he then persuaded the Athenians to take him back! For a while, he became their general again. But then he fell out with them once more and crossed back over the Hellespont to another kingdom.
He was sleeping with a girl there one day when his political enemies (he had amassed a few by then) surrounded the house. Alcibiades grabbed a dagger and, possibly naked, attacked. He died in a hail of arrows.
So Viktor Frankl says in the video above, summarizing his theory of logotherapy, which I’ve read at greater length in his book Man’s Search for Meaning. In other words, if people suffer but see meaning in their life, and even in their suffering, they do not despair, as he himself did not despair when he was in Auschwitz and other concentration camps.
He is therefore, as he also says in this video, the anti-Sartre. Sartre and the other existentialists believed that we have to accept the meaninglessness of our existence. I, in my black-turtleneck and Gauloise phase (everyone has one), used to think that was cool. But Frankl thinks it is nonsense.
Or rather, he thinks that it is unhealthy and unhelpful. Hence logotherapy, which
focuses rather on the future, that is to say, on the meanings to be fulfilled by the patient in his future. (Logotherapy, indeed, is a meaning-centered psychotherapy).
He calls it logotherapy because
Logos is a Greek word which denotes “meaning.” Logotherapy, or, as it has been called by some authors, “The Third Viennese School of Psychotherapy,” focuses on the meaning of human existence as well as on man’s search for such a meaning.
According to logotherapy,
this striving to find a meaning in one’s life is the primary motivational force in man. That is why I speak of a will to meaning in contrast to the pleasure principle (or, as we could also term it, the will to pleasure) on which Freudian psychoanalysis is centered, as well as in contrast to the will to power on which Adlerian psychology, using the term “striving for superiority,” is focused.
Speaking of the will to power, Frankl likes to quote Nietzsche:
“He who has a why to live for can bear almost any how.” I can see in these words a motto which holds true for any psychotherapy. In the Nazi concentration camps, one could have witnessed that those who knew that there was a task waiting for them to fulfill were most apt to survive.
Those people who do see meaning in their lives, says Frankl, are able to
transform a personal tragedy into a triumph, to turn one’s predicament into a human achievement.
I want to agree with Frankl, but the trouble starts when he describes how he applies his approach to actual therapy. To me it sounds like semantic trickery. He meets desperate people and tries to change their attitude, but really he only does some conceptual gymnastics and calls that meaning.
Once, an elderly general practitioner consulted me because of his severe depression. He could not overcome the loss of his wife who had died two years before and whom he had loved above all else. Now, how could I help him? What should I tell him? Well, I refrained from telling him anything but instead confronted him with the question, “What would have happened, Doctor, if you had died first, and your wife would have had to survive you?” “Oh,” he said, “for her this would have been terrible; how she would have suffered!” Whereupon I replied, “You see, Doctor, such a suffering has been spared her, and it was you who have spared her this suffering—to be sure, at the price that now you have to survive and mourn her.” He said no word but shook my hand and calmly left my office. In some way, suffering ceases to be suffering at the moment it finds a meaning, such as the meaning of a sacrifice.
I don’t doubt that the old man, out of love for his wife, preferred to bear the pain of being the survivor so that she did not have to. But his wife was still gone. His job (surviving her) was done. Pointing out that he had saved her pain did not give him meaning for his life from that point forward.
So my critique of logotherapy is really the same as my critique of religion: Sure, it might be helpful to see meaning (= believe in God), but that does not mean that there actually is meaning (=God). Sartre might be right after all.
I’m happy with it. But of course I am again on tenterhooks, and feeling quite vulnerable, as I await the reactions of my editor.
The improvement was dramatic. I am reminded again of what Khaled Hosseini, also a Riverhead author, said about writing and rewriting: the first draft is purely a frame. The book happens in the subsequent drafts.
Still, I can’t help but wonder also about the law of diminishing returns: Would I be able to make such a dramatic improvement again in a third draft? Or is this where the book wants to be? Am I perhaps … done?
For my piece in the current issue of The Economist, I had the pleasure of talking to, among other people, the equivalent of Meryl Streep in the porn industry: Nina Hartley. She is 50 and says she loves sex more than ever, on camera or off. She has been at it (the camera part) since 1984.
A somewhat unfortunate part of my job, as I am increasingly discovering, is that the most interesting parts of my research and my conversations often fall well outside the realm of what can make it into my articles. Yes, of course, readers might care about how the porn industry is doing. But they’re human, so they must also be curious about, well, sex. After all, it’s not every day that you get to talk to somebody who does it for a living.
In my case, I was just a tad shy for the first few moments. It helped that I have never “consumed” Nina’s “content” so I had no visuals to distract me. Still: How would I talk to somebody who views having sex as I view writing?
It turned out that Nina was very easy and very interesting to talk to. The conversation ranged just as it would have ranged with anybody else. Our health care debate drives her “mad.” California can’t govern itself. That sort of thing.
Performing sex on camera, she said, is
a highly paid form of blue-collar work… sort of like farm labor.
Everybody is an independent contractor and there are no benefits. No benefits. It’s important in such a conversation not to reach for the double entendres.
About the porn industry, Nina was somewhat nostalgic and sentimental. In the 80s, when she started, it was apparently a glamorous sort of thing. The product was hard to get and had rarity value; the production took place in a subculture that considered itself revolutionary. There was a frisson, a pioneer spirit, a certain excitement.
Now it’s seedy, cheap, everywhere. She wouldn’t start again today if she were young now.
So why is she still in it?
In large part because she actually likes the sex, she told me. She thinks that women increase their sensitivity in middle age. At least that is happening to her.
My enjoyment of sex has increased, but for most [performers] it goes down, especially the men.
I asked her if she meant that doing it on camera makes people “numb.” Sometimes, she says. Many performers stop having sex in their private lives altogether. The men basically have to, since they could not have private sex one day and still perform on camera the next, and any hint of “having trouble” might kill their career. But they also genuinely lose interest.
Nina sees her role now as “mentor” as well as actress, so she counsels the young performers not to let that happen.
Frankly, it amazes me that it–burning out–hasn’t happened to her. She must be a modern Aphrodite.
This appears to be a hot topic of research. Jag Bhalla (to whom The Hannibal Blog increasingly outsources the more intensive research into matters linguistic and narrative ;)) now points us to new theories by anthropologists who have apparently constructed the equivalent of genealogical family trees for humanity’s oldest stories.
Jamie Tehrani
Dr Jamie Tehrani, a cultural anthropologist at Durham University, studied 35 versions of Little Red Riding Hood from around the world. Whilst the European version tells the story of a little girl who is tricked by a wolf masquerading as her grandmother, in the Chinese version a tiger replaces the wolf. In Iran, where it would be considered odd for a young girl to roam alone, the story features a little boy. Contrary to the view that the tale originated in France shortly before Charles Perrault produced the first written version in the 17th century, Dr Tehrani found that the variants shared a common ancestor dating back more than 2,600 years.
He said: “Over time these folk tales have been subtly changed and have evolved just like a biological organism.” …
Dr Tehrani … identified 70 variables in plot and characters between different versions of Little Red Riding Hood…. The original ancestor is thought to be similar to another tale, The Wolf and the Kids, in which a wolf pretends to be a nanny goat to gain entry to a house full of young goats. Stories in Africa are closely related to this original tale, whilst stories from Japan, Korea, China and Burma form a sister group. Tales told in Iran and Nigeria were the closest relations of the modern European version.
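(To make the method concrete, here is a toy sketch in Python of the general technique: code each variant of a tale as a vector of plot variables, then cluster the variants into a family tree. The four variables and the 0/1 codings below are invented for illustration–Dr Tehrani’s actual analysis used 70 variables and proper phylogenetic methods, of which the hierarchical clustering here is only a crude stand-in.)

```python
# Toy illustration: build a "family tree" of tale variants from plot variables.
# The variables and codings are invented for this sketch, not taken from the study.
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

variants = ["European", "Chinese", "Iranian", "Wolf and the Kids"]

# Columns (invented): victim is a girl, villain is a wolf,
# villain impersonates a relative, victims are goat kids.
features = np.array([
    [1, 1, 1, 0],   # European Little Red Riding Hood
    [1, 0, 1, 0],   # Chinese version (tiger instead of wolf)
    [0, 1, 1, 0],   # Iranian version (a little boy, not a girl)
    [0, 1, 1, 1],   # The Wolf and the Kids
])

# Distance = share of plot variables on which two variants disagree.
distances = pdist(features, metric="hamming")

# Average-linkage clustering stands in for real phylogenetic tree-building.
tree = linkage(distances, method="average")
dendrogram(tree, labels=variants)
plt.title("Toy family tree of tale variants")
plt.show()
```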
(And once again, permit me to add parenthetically but immodestly that I am attempting in my forthcoming book to narrate just such an archetypal storyline about success, failure & reversal in life.)