posted 1 September 2003
COPY-AND-PASTE CITATION: William H. Calvin, A Brief History of the Mind (Oxford University Press 2004), chapter 9. See also http://WilliamCalvin.com/BHM/ch9.htm
William H. Calvin |
Abduction scene, Kolo Cave, Tanzania (From copy at National Museum of Kenya, Nairobi)
As masters of illusion, specialists in the evolving art of social control, [shamans] held exalted positions. It could not be otherwise. As equals, they could never have done what they had to do, indoctrinate people for survival in groups, devise and implant the shared memories that would make for widening allegiances, common causes, communities solid enough to endure generation after generation.... The ceremonial life promoted, not inquiry, but unbending belief and obedience.... To obtain obedience it helps if shamans ... can create a distance between themselves and the rest of the group .... One must appear and remain extraordinary (by no means an easy task when one is a long-time member of a small group), look different with the aid of masks... and sound different, using antique words and phrases, reminders of ancestors and a remote past, and special intonations conveying authority, fervor, inspiration.

9 From Africa to Everywhere

Was the still-full-of-bugs prototype spread around the world?
With the transition to Homo sapiens sapiens, we probably saw ourselves in a new light. The premodern mind might well have lacked our narrative tendencies – and, with them, the tendency to see oneself as the narrator of a life story, always situated at a crossroads between the alternative interpretations of the past and the paths projected into various possible futures. Scientists are uncertain storytellers. Though we may stand atop a stable pyramid of certainties, laboriously established by our predecessors, we are always attuned to the uncertainties of the scientific frontier – and so we are professionally uncertain. We try to bring coherence out of chaos and we sometimes get it wrong. We expect to be telling a somewhat different story, ten years from now.
The campfire storytellers who first attempted to reconstruct human origins probably concentrated on the travels of their grandparents. But at some point, armed with the abstract imagination that characterizes modern humans, they created “origin stories” of the Adam-and-Eve variety, embellished down through the generations, which attempt to account for human beginnings. The Cherokee Indians tell of a creator who baked his human prototypes in an oven after molding them from dough. He fired three identical figures simultaneously. He took the first one out of the oven too early: it was sadly underdone, a pasty pale color. Creators may not do things perfectly the first time but in creation myths their actions are always irrevocable – so the pale human was not simply put back in the oven to cook a little longer. It remained half-baked. But the creator’s timing was perfect on taking the second figure out of the oven: richly browned, it pleased him greatly and he decided that it would become the ancestor of the Indians. He was so absorbed in admiring it that he forgot about the third figure in the oven until he smelled it burning. Throwing open the door, he found it was black. Sad, but nothing to be done about it. And such were the origins of the white, brown, and black races of mankind. The Cherokees are not alone in being ethnocentric. Origin stories sometimes carry an ethical lesson, improved over time, but whatever original facts they contained tend to become less accurate with enough retellings. As the anthropologist and poet Loren Eiseley said, “Man without writing cannot long retain his history in his head. His intelligence permits him to grasp some kind of succession of generations; but without writing, the tale of the past rapidly degenerates into fumbling myth and fable.”
Some religions have learned not to “bet the farm” on the literal truthfulness of a particular creation myth – not to create a situation where truly valuable teachings become as tainted as the tooth fairy when one of the props turns out to have been oversimplified or imagined. Studies of human evolution, using such tools as changes in DNA, are now rapidly recovering some of the basic facts about who migrated where and when. I won’t attempt to explain the mechanics of the DNA dating method except to note that it is a lot like the linguists’ technique of noting similar word roots in related languages. Whereas the English say “fist,” “finger,” and “five,” the Dutch instead say “vuist,” “vinger,” and “vijf.” The Germans use “Faust,” “Finger,” and “fünf.” That’s because all three languages descend from a common Germanic ancestor. Sir William Jones, while serving as a judge in India in 1786, noted that Sanskrit shared a number of features with Greek and Latin, suggesting an ancient language in common among Mediterranean peoples and those of the Indian subcontinent. The family of Indo-European languages, as it is now called, represents the splits and separate evolution of dialects over the centuries. Finnish, Estonian, and Hungarian are not members of that language family, belonging instead to the Uralic family, and testifying to a thousand years of invasions of Europe from the grasslands of Asia. One can play the same game with genes that differ between groups, and even estimate the time at which one ancient tribe split off from another, likely because of some migration that caused them to lose touch with one another. For accuracy, it’s best to focus on genes that are not shuffled with each generation, such as the Y chromosome, which is passed only from father to son. The mitochondria inside each cell have their own genome, and it is passed only through your mother, who got it from her mother, and so on back. As such, mitochondrial genes escape the shuffling as well and are more stable. My mitochondrial genes, for example, are from Sweden, as that’s where my maternal great-grandmother came from. Similarly, my Y chromosome is likely from England, though there’s the possibility of one mutation since then (the average rate of mutations is what allows estimates of the time since two groups split). But, thanks to gene shuffling as sperm and eggs are formed, the rest of my genes are far more of an eclectic mix of hundreds of ancestors in just that two-century time frame. The first migration out of Africa was Homo erectus, about 1.7 million years ago. Other migrations likely followed, and the most recent major emigration from Africa is what populated the modern world with our species, Homo sapiens sapiens. The migrations are inferred from both conventional archaeological dating and the molecular clock.
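To make the molecular-clock idea concrete, here is a minimal sketch of the arithmetic – not the procedure any particular laboratory uses. If mutations accumulate at a roughly steady rate, then the fraction of sites that differ between two lineages, divided by twice the mutation rate (both lineages have been accumulating changes since the split), estimates the time since they lost touch. The function name and the numbers below are illustrative assumptions, not measured values from the studies described here.

```python
# Minimal sketch of molecular-clock arithmetic (illustrative only).
# Assumes a constant substitution rate; real studies calibrate the rate
# and correct for repeated changes at the same site.

def divergence_time(differences, sites_compared, subs_per_site_per_year):
    """Estimate years since two lineages split.

    Changes accumulate along BOTH lineages after the split,
    hence the factor of 2 in the denominator.
    """
    d = differences / sites_compared          # fraction of sites that differ
    return d / (2 * subs_per_site_per_year)

# Illustrative (assumed) numbers: 30 differences over a 1,000-site stretch,
# at 3e-7 substitutions per site per year, gives roughly 50,000 years.
print(divergence_time(30, 1000, 3e-7))
```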
If you haven’t been following the anthropologists’ incessant dating and re-dating of important events, you might still think that the last Out of Africa was about 100,000 years ago, into Israel – and that the behaviorally modern transition was much more recent, say when blade technologies and cave art appeared in Europe 30,000 to 40,000 years ago. This was awkward, as it required the non-European groups to undergo their own behaviorally modern transitions independently – e.g., that the Australian aborigines did it on their own, and somehow at the same time as the new Europeans. But at the turn of the twenty-first century, the best estimates for both dates changed – and they converged on about 50,000 years ago. Not only is Israel recognized as not being very exotic (much of the time, it has an African flora and fauna), but the mitochondrial DNA dating for Out of Africa – originally supporting the 100,000-year emergence – was redone using more appropriate portions of the mitochondrial genome. A period between 60,000 and 40,000 years ago became the more probable time for the immigration of Homo sapiens into the more exotic parts of Eurasia. The Y chromosome gives similar bracketing dates. One estimate is that about 1,000 people – the size of a tribe and perhaps speaking a common language – emigrated into Eurasia and multiplied. Furthermore, various indicators of behavioral modernity have moved back in time (recall that earlier figure). While what we’d like is a marker for syntax or long-range contingent planning, we must often rely on the behavioral B’s: blades, beads, burials, bone toolmaking, and beauty. Barbs on bone tools for fishing are seen in the Congo about 90,000 years ago. In the same period, on the southernmost coast of Africa, come the earliest signs of something “symbolic” in the form of crosshatchings on red ochre. By 52,000 years ago in east Africa, beads can be seen. Clearly they were beginning to think something like the way we do. So it now appears that humans were behaviorally modern before the last great Out of Africa. Many paleoanthropologists are now comfortable talking about most behaviorally modern abilities emerging in Africa between 90,000 and 50,000 years ago – and that resolves the thorny multiregional problem for behavioral modernity, as one can assume that it developed once in Africa and then spread around the world. Certainly better language and planning would have made it easier to scout out new territories. Almost all anthropologists (including the ones favoring the multiregional hypothesis for occasional interbreeding between Neanderthals and moderns) believe that all modern peoples on all continents are overwhelmingly descended from modern Africans. Between 60,000 and 40,000 years ago, behaviorally modern Homo sapiens sapiens spread into Asia, perhaps on both northern grasslands and southern shoreline routes. I won’t cover the many secondary migrations since then except to note that for the northern route into the steppes of central Asia, we have another important data point from Y chromosome studies: All non-African peoples (well, at least males) share genes that were present in central Asia about 40,000 years ago, suggesting a secondary center of spread from there that displaced the Neanderthals in Europe and the remaining Homo erectus in China. So think Out of Africa, followed by Out of the Steppes of Central Asia. First out of the cradle, then out of the nursery.
This history of the Out of Africa migrations makes the old concept of race look very simplistic. There are at least three things that contribute to our everyday concepts of race: teams, adaptations, and appearances. As I mentioned earlier, we will form up teams, quite spontaneously. It’s an important aspect of what makes us human. Stewart Brand wrote about being part of a spontaneous team of passers-by that formed up in the 1989 San Francisco earthquake to rescue people from collapsed buildings. But we often form opposing teams for trivial purposes, just as easily. “Us against them” is often founded on ethnicity or religion, but fans of one sports team may work cooperatively to harass another such ad hoc fan club. There are no major genetic differences between the people who fight one another in the Balkans or Northern Ireland. Even if they didn’t form up teams around a shared religious background, they’d likely form up teams based on some other shared interest. But something that can be heard (a style of speech) or seen (different styles of dress or facial appearance) can easily bias which team you join. Ethnicity, based on such behavioral choices, is a significant part of the older notions of race. But some differences are indeed biological. Appearances sometimes involve specific physiological adaptations, such as keeping the dust out of the eyes in Asia. In northern Europe, the adaptation involved making the most of scarce sunlight for vitamin D production in the unclothed patches of skin. The equatorial Maasai’s long limbs are appropriate to losing heat. The short limbs and long body of the far-northern Saami and Inuit peoples are best for conserving heat. Some regional genetic differences produce more subtle effects: though there is no difference in the rate of having identical twins around the world, Africans give birth to a lot more fraternal twins than do Asians (Europeans are right in the middle). It will not be surprising if some average aspects of mind – say, styles of problem solving – similarly turn out to differ among regional groups, even if individuals in each group span the whole human range of variability. But like having a “Roman nose,” much of race is just being part of a large extended family and sharing the family resemblance. From appearances, it is easy to create African, Asian, and European clusters, but there is nothing sacred about a continent – particularly Eurasia! – and things can obviously be subdivided from regional extended families all the way down to the characteristic appearance sometimes seen in a small village. Experienced travelers claim to recognize hundreds of categories. Some recognizable appearances come from recent mixes, as in the Mexican combination of recent European and ancient Asian genes. European “whites” themselves have had a lot of recent gene contributions from the numerous Out of Asia episodes, from when Attila’s Huns crossed the Rhine in 436 to the great siege of Vienna in 1683. Then there was the African invasion of France in 732. African influences usually came north via the gradual spread of Mediterranean genes but also because of mass movements; after being forced out in 1492, many Spanish Jews emigrated to northern and eastern Europe. So Europeans are quite a mixture; the main thing we have in common biologically is a former need to make our skins unusually pale (a condition we often try to disguise via sun tans).
Present-day African Americans average about one-quarter European genes and, however useless or pernicious that racial identification has become for most purposes, it occasionally is still an important clue to diagnosing an inherited disease such as sickle cell anemia. That said, let us consider the consequences of spreading humans around the globe at a time not long after we first became behaviorally modern.
While structured thinking may be one of the aspects of human uniqueness (and, when combined with ethical judgment ability, certainly a candidate for our crowning glory), I suspect that evolution hasn’t yet tested it very well – that it is clunky and perhaps dangerous at times. Our emotional value judgments are far older and better tested than our intellects. Emotions are handy when decisions must be made quickly but they are also overly broad, lacking precision and nuance. Many problems arise when snap judgments substitute for deeper consideration. Consider some of the consequences of acquiring higher intellectual functions:
Cowboys have a way of tying up a steer or bronco that fixes the brute so that it can neither move nor think, like the proverbial deer frozen in the headlights. This is the hogtie, and it is what our rationality sometimes does to us, freezing us when we ought to keep searching. There are ways around this, as when we teach about the common logical fallacies. Just as medications can fix some problems, it is hoped that education about pitfalls might alleviate others. Teaching “critical thinking” skills in school is one way to combat the pervasive misleading information and logic that bombards us daily. We can learn to routinely ask: Why is this free? What are they really selling? What did they avoid mentioning? Do those statements really contradict one another? Important relative to what? Does that conclusion necessarily follow? Is the definition adequate? Is there ambiguity here? What’s being assumed? Can you rely on the alleged authority? And when everything seems to hang together nicely, be wary of arguing in a circle – you may just be using different synonyms for the start and finish. There is even an organized practice of trying to find people lacking critical thinking skills. The technical term is “trolling for suckers” – locating the fraction of the population that is truly impaired in their judgment and then attempting to part them from their money. “Sucker lists” collect the addresses of gullible people who have responded to something-for-nothing bait that is so improbable that most people would ignore it – but nibbling at it marks you for further attention. As in the New York joke about selling the Brooklyn Bridge, “If you can believe that, then I’ve got a bridge that I’d like to sell you.” Schemes like this require communicating the bait to a large number of people in order to find the unfortunate few. Nearly free email has now enabled mass mailings on a scale never seen before. The everyday lament, “How could anyone believe such stuff?”, has a sad answer. Such ploys can pay off, sometimes not immediately but later on the followup (the free “bait” and then the “hook”). But we are all gullible on some occasions, often early or late in life, or on some subjects at all times. This organized exploitation of intellectual shortcomings could be controlled by ethics and laws. But there are many everyday defects, shared by everyone to some extent, that make you wonder if Homo sapiens sapiens was really ready for prime time when everything expanded 50,000 years ago.
Complex thought presumably underlies the entire suite of higher intellectual functions. We can operate at levels that are not easily translated into words, as when we mull over a puzzle. I’ll have much more to say on the subject of levels of organization in the next-to-last chapter, but first let me mention some flaws in the more fundamental cognitive processes, where it looks as if we are still plagued by the crudeness that usually characterizes anything that is a “first of its kind”:
Categorical perception can put blinders on us, so that we cannot see the nuances. Japanese reared without hearing the English /L/ and /R/ sounds will lump them together and hear a Japanese phoneme that is in between in sound space; “rice” and “lice” sound the same to them. Newborn infants can hear the difference, but soon a category forms up around the sounds most often heard, and variants are conformed to the new standard. (It’s known as the magnet effect or category capture.) Perception can also fill in missing information erroneously, as when blind spots in our visual world are filled in (look at wallpaper and, instead of seeing a featureless spot where your photoreceptors are missing, you see the area filled in by the surrounding patterns). We also do this fill-in over time, as when a light flash in one location is followed by a second light flash nearby – and we report seeing a smooth movement of the light, filling in all the intermediate points. A striking example occurs in viewing cave art with the aid of a flickering oil lamp. Between flashes, one’s eye position drifts a little in the profound darkness and, when the next flicker again illuminates the scene, it looks to us as if the depicted animal smoothly moved! While we share such perceptual inaccuracies with most primates, we have higher-order cognitive versions of category-capture and fill-in as well.
Our memory mechanisms are not very good at avoiding substitutions or keeping things in order. A child taking part in a collaborative project, when later asked who did what, will often think that she performed an action that the videotape shows was performed by another child. I’m not referring here to what Henny Youngman said, “After you’ve heard two eyewitness accounts of an auto accident, you begin to worry about history.” I’m talking about what happens much later, even if you get it right initially. Even when you initially succeed in recalling a sequence of events, you may make a mistake in recalling the event weeks later. If you scramble things once, it may have consequences a month after that, as if you had overwritten the correct memory sequence with your erroneous recall. The memory expert Elizabeth Loftus likes to say that “Memory, like silly putty, is malleable…. The inaccurate memories can sometimes be as compelling and ‘real’ to the individual as an accurate memory.” Keeping things in the right order is often important for structured thinking, and it looks as if evolution didn’t get around to fixing the flaws in memory mechanisms. Changing the name of something is, of course, a standard attempt to manipulate your memories, perhaps to run away from a problematic reputation. (Cynics would note that both my local telephone company and my bank have changed their names twice in recent memory.)
Our structured judgment may not be up to the task even when we structure our thoughts successfully, as in those fallacies of logic. And as merchants know all too well, our decision-making is easily swayed by the last thing we happen to hear. Psychology texts are full of examples about the unwarranted emphasis that is often given to some minor aspect.
Vivid examples can capture our minds and override other considerations. Although we might spend all day carefully considering the documented facts about frequency-of-repair records when shopping for a new car, our judgment is still notoriously easy to sway with just one nonrepresentative example. Someone at a dinner party complaining about repairs to their top-rated car is often sufficient to override our logical consideration of the average repair experience. We ought to treat the new example as just part of the range of variation that led to the average we researched. Instead, captured by the vivid example, we go out the next day and buy the second-choice car. Any narrative provides an attractive framework, when competing with dry facts detached from stories. Ronald Reagan often took advantage of this when he was president of the United States, telling an easily appreciated story of some one person – and letting this carefully selected example serve as a rationale for a favored government policy. Vivid stories can be used to smother inconvenient facts.
Searching for coherence, we sometimes “find” patterns where none exist – imagining voices in what is only the sound of the wind, or trying to force-fit a simple explanation onto a complicated set of relationships. “It all hangs together” is what makes for strong belief systems and allows all sorts of actions to be rationalized.
We offer reasons, often several deep, for an action or a belief. Some considerations, perhaps ethical ones, can override others. Rationalizations are untruthful inventions that are more acceptable to one's ego than the truth. We fall prey to logical fallacies; even snails assume “after this, therefore because of this,” and you’d think that evolution could have kept us from falling for it so often. Reasoning often involves a chain of reasons, a considerable limitation because the reality is usually a more complex web of interacting causes.
There are disconnects between thought and talk, as in that blues lament of Mose Allison, about when “Your mind is on vacation but your mouth is working overtime.”
Conditionals and pretense work surprisingly well, considering how little evidence there is for such abilities in the great apes. We have an ability to entertain propositions without necessarily believing them, distinguishing “John believes there is a Santa Claus” from “There is a Santa Claus.” But we aren’t born that way. The ability to play a role in “doctor” or “tea party” arises later in the preschool years. Do we later remember what was pretense and what was real? Not always. Source monitoring (tagging facts with where you learned them) often fades with time so that “facts” become detached from their supports. The day afterward, you may know it only happened in a dream – or that you only planned to say it but didn’t actually utter the words – but will you lose that pretense tag in another month?
Concreteness is seen in a few modern people who answer very literally to any example of figurative speech, who are unable to rise beyond the most basic interpretation. But most of us are very good at backing off and treating a question more abstractly, looking for the metaphor. When someone starts to lose this ability, physicians suspect damage to the frontal lobe and go looking via diagnostic brain imaging.
Rational argument alone often cannot overcome those who simply and passionately believe. Yet logic is often bent and distorted in the service of those belief systems; it can even override everyday experience. As Fyodor Dostoevsky noted, “But man has such a predilection for systems and abstract deductions that he is ready to distort the truth intentionally, he is ready to deny the evidence of his senses in order to justify his logic.” I doubt that this was a problem before the behaviorally modern transition. There is nothing like logic in the aid of strong beliefs to provide the motivation to override ethics and find hypocritical excuses for committing acts of violence. It is most familiar from extreme political beliefs, but consider two examples of how professedly peaceful religious cults have, once they became wealthy via give-us-everything contributions from their members, turned to using biological and chemical terrorism. In 1984, members of the religious cult of Bhagwan Shree Rajneesh sprayed the salad bars of four restaurants in The Dalles, Oregon, with a solution containing salmonella. The idea was to keep townspeople from voting in a critically contested local election; 751 people became ill. This cult merely obtained salmonella samples by mail order and cultured them. (This is low-tech kitchen stuff.) The second cult, in contrast, recruited technically trained people in considerable numbers and engaged in indiscriminate slaughter. Aum Shinrikyo (“Aum” is a sacred syllable that is chanted in Hindu and Buddhist prayers; “Shinrikyo” means supreme truth) is a wealthy religious cult in Japan (recently renamed Aleph), with many members in Russia. Their recruiters aggressively targeted university communities, attracting disaffected students and experts in science and engineering with promises of spiritual enlightenment. Intimidation and murder of political opponents and their families occurred in 1989 by conventional means, but the group’s knowledge and financial base allowed them to subsequently launch substantial coordinated chemical warfare attacks. In 1994, they used sarin nerve gas to attack the judges of a court in central Japan who were about to hand down an unfavorable real-estate ruling concerning sect property; the attack killed seven people in a residential neighborhood. In 1995, packages containing this nerve gas were placed on five different trains in the Tokyo subway system that converged on an area housing many government ministries, killing 12 and injuring over 5,500 people. During the investigations that followed, it turned out that members of Aum Shinrikyo had planned and executed ten attacks using chemical weapons and made seven attempts using such biological weapons as anthrax. They had produced enough sarin to kill an estimated 4.2 million people. Other chemical agents found in their arsenal had been used against both political enemies and dissident members. While they were also virulently anti-Jewish, apocalyptic scenarios dominated the sect’s doctrine, and the Tokyo attack was said to be an attempt to hasten the Shiva version of Armageddon. Only cult members would survive it, they claimed, thereby purifying the world by ridding it of nonbelievers. (No mention seems to have been made of the enormous cleanup job the true believers would then face.)
The general problem here is the motivation provided by strong belief of all kinds and its narrow logic. As Dostoevsky observed, belief systems serve to distort new information, making it conform to preconceptions. A belief system provides a narrowed focus, within which everything seems to hang together, and consequences follow from its logic. It reminds me of that aphorism: the person with one watch always knows what time it is. The person with two watches is always uncertain. If you approach a problem from multiple directions at once, you learn to live with such uncertainty. Strong belief systems, however, often try to narrow you down to one source so that you cannot escape their logic. Educated people are not immune to getting trapped in such blinkered logic. Most technically trained people, including many physicians, do not have broad educations and they are often activists, trained to make decisions quickly and move on. The masterminds of modern terrorist movements are frequently personable, technically competent people from privileged or middle-class backgrounds, not at all fitting the usual depiction of their foot soldiers as the ignorant downtrodden. And such leaders, in the manner of modern executives everywhere, usually weed out the truly mentally ill as too unreliable. To get people to follow your orders, it helps if they can follow your logic. Martyrdom is often the result of excessive gullibility, of ensnarement by narrowly focused logic. Some suicide bombers turn out to be educated people, trapped in the logic of some scheme where “everything hangs together.” While the false-coherence problem is old, its consequences have escalated. “For the foreseeable future, smaller and smaller groups of intensely motivated people will have the ability to kill larger and larger numbers of people,” Robert Wright writes. “They'll just have to be reasonably intelligent, modestly well-funded, and really pissed off.” Or really trapped by a compelling logic that reframes their existence. Recall the Fermi paradox about extraterrestrial intelligence: “If intelligence is common in the universe, why haven’t we seen them already?” (since some surely evolved technological civilizations before we did). One answer is that prior technological enlightenments elsewhere might have flickered only briefly before self-destructing. Sometimes the strong belief and its logic is of religious origin (“God gave us this land”), sometimes secular (“All power to the people”). But all this has little to do with mental illness (which I will discuss in the last chapter), however great our reflex tendency to label some of the acts as “crazy.” It has everything to do with our half-baked rationality.
Despite its virtues, anything that is “the first of its kind” tends to be awkward at first, rough around the edges. There is a growing suspicion that maybe modern humans are like that. Our intellects are a big step up but they appeared very recently in the ice ages, long after the human brain stopped enlarging. They’re not well tested yet and are still prone to malfunctions. However impressive our average intellect may be when compared with the other apes, remember that biological evolution often produces overblown features with major drawbacks. Peacocks have tails that hamper escape from predators. Elk grow antlers so wide they can no longer run through forests to escape wolves. Overgrown intellects have similar problems. An intellect with great persuasion and planning abilities sometimes produces dramatic results as the leader of a suicide cult. As Desmond Morris once said, we prefer to think of ourselves as fallen angels, not risen apes. At least, we hope, evolution is still improving us. Alas, biological evolution doesn’t perfect things – it just moves on to new “products” with a different set of bugs. (Sound familiar? And how often does your computer still hang or crash? We might be like that, not ready for prime time.) Even when we avoid hanging up from obsessions or crashing from epileptic seizures, we stumble over numerous cognitive pitfalls (usually without noticing). Once you also recognize that we’re recently risen apes, you realize that there simply hasn’t been much time in which to evolve a less buggy version 2.0. Clearly, human cultural innovation is now in charge of getting the bugs out, not biological evolution. And we haven’t made much progress yet.
We’ve arranged a global civilization in which most crucial elements – transportation, communications, and all other industries; agriculture, medicine, education, entertainment, protecting the environment; and even the key democratic institution of voting – profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces. – Carl Sagan, 1996
copyright ©2003 by William H. Calvin