“The only wisdom we can hope to acquire is the wisdom of humility: humility is endless.” —T. S. Eliot
Whence Such Confidence?
I have friends who see themselves as having intellectual problems with the gospel—with some spiritual matter or other. Interestingly, these friends all share the same twofold characteristic: they are confident they know a lot about spiritual topics, and they are confident they know a lot about various intellectual matters.
This always interests me, because my experience is very different. I am quite struck by how much I don’t know about spiritual things and by how much I don’t know about anything else. The overwhelming feeling I get, both from thoroughly examining a scriptural subject (say, faith) and from carefully studying an academic topic (for example, John Bell’s theorem in physics), is the same—a profound recognition of how little I really know, and how significantly, on many topics, scholars who are more knowledgeable than I am disagree among themselves: in other words, I am surprised by how little they really know, too.
All who have developed expertise in a specific intellectual subject are aware of this second point. Familiar with the cutting-edge literature in their field, they recognize how many disagreements exist on various matters even though those disagreements may, for reasons of space, be either lightly treated or omitted altogether in textbooks and other general introductions. As a result, even trained scholars, if they had no special expertise in a particular area, could easily be mistaken about the subject—in ways both numerous and deep—if they developed opinions based primarily on such secondary sources of information. How could they not overestimate the degree to which matters are settled and certain when all they are reading is a general treatment?
Knowing this about intellectual life, and knowing the equal difficulty of fully grasping a scriptural subject even as fundamental as faith, I am at a loss to explain others’ confidence. I certainly do not share it. I have come to believe, after many a false start, that if I am honest and thorough in my approach to the gospel, and if I am honest and thorough in my approach to intellectual disciplines, there resides in each the imperative for a profound sense of humility. I discover in both of them that what we don’t know far outstrips what we do.
Matters of the Spirit
Think first of the gospel. The scriptures require the most exacting study; they simply do not yield to superficial and occasional glances. For generations, for example, it was common to read “narrow neck of land”—the one geographical marker in the Book of Mormon that stands out even on a thin reading—and to draw the natural inference that the phrase referred to the Isthmus of Panama, and that familiar Book of Mormon events thus spread over two continents. Only recently has it become widely known that the Book of Mormon itself decisively disproves this inference, a recognition that required careful, thorough reading. Similarly, some have seen the Ammonites as pacifists and wonder if the Book of Mormon doesn’t therefore—despite its litany of wars—actually contain a pacifist message. But this too is based on an underreading of the account.
Although careful study is crucial in reaching gospel conclusions, the scriptures rarely if ever give full answers, even when read with care. Questions always linger and even multiply. Consider the topic of agency and accountability. The scriptures teach clearly that agency was given to man and that we are held accountable and judged (see, for example, Moses 7:32; 6:56; D&C 29:35; 1 Ne. 10:20; 15:33; and Alma 5:15), but the topic is more complicated than it appears at first glance. Recall, for example, Lehi’s blessing to the children of Laman that “if ye are cursed, behold, I leave my blessing upon you, that the cursing may be taken from you and be answered upon the heads of your parents” (2 Ne. 4:6)—a blessing he extended to the children of Lemuel as well (2 Ne. 4:9). And note Jacob’s reminder to the Nephites that the Lamanites’ filthiness at that time “came because of their fathers” (Jacob 3:9), and also his warning to the Nephites that “ye may, because of your filthiness, bring your children unto destruction, and their sins be heaped upon your heads at the last day” (Jacob 3:10). Also recall the Lord’s pronouncement that, though the people at the time of the flood were the most wicked of all his creations, “their sins shall be upon the heads of their fathers” (Moses 7:36–37), and his declaration in our day that if parents are not diligent in teaching their children, “the sin be upon the heads of the parents” (D&C 68:25).
These passages raise certain questions. For instance, who, then, is actually accountable? Do those whose sins are answered on their parents’ heads have no accountability? And if so, does that mean they have no agency? Or are they still responsible for part of their sinfulness and therefore have partial agency? If so, is it fifty percent? Seventy percent? Ten percent? It seems there must be degrees of agency and accountability, but how exactly are those degrees apportioned? And how many people fall in the category of having their sins answered on their parents in the first place? Surely there are countless parents who have been worse than Laman and Lemuel. Do the children of those parents all fall in that category too? Am I fully accountable? Are my children? Surely it is relevant, for example, that while scripture clearly declares that there is no forgiveness for murder “in this world nor in the world to come” (D&C 42:18), the Ammonites committed acts of murder and yet obtained forgiveness. Isn’t some notion of an attenuated accountability required to explain this? If not, why not? And if so, how must that work?
These are deep questions, and they deserve careful thought. But notice that however we answer them, our conclusions will be based on our own judgment and on our own weighing of various considerations. This means that even though agency and accountability are fundamental concepts of the gospel, much of what we believe about these concepts will necessarily be uncertain. Whatever we conclude, they are our conclusions, not the declarations of the Lord, and we must hold them tentatively.
Lessons in Humility
This uncertainty should not surprise us. Spiritual history is full of lessons in humility—of occasions in which individuals supposed a conception of reality to be the truth and then were startled by what they subsequently learned. In his transcendent transfiguration experience, for example, Moses saw not only God but also “the world and the ends thereof, and all the children of men which are, and which were created.” The record reports that Moses “greatly marveled and wondered” at these experiences and observed to himself, “Now, for this cause I know that man is nothing, which thing I had never supposed” (Moses 1:8, 10).
The brother of Jared’s experience was similar. The account tells us that “the veil was taken from off the eyes of the brother of Jared, and he saw the finger of the Lord; . . . and the brother of Jared fell down before the Lord, for he was struck with fear.” Asked by the Lord why he had fallen, he answered, “I saw the finger of the Lord, and I feared lest he should smite me; for I knew not that the Lord had flesh and blood” (Ether 3:6–8).
Both Moses and the brother of Jared were preeminent spiritual figures prior to these experiences, and both had highly advanced spiritual capacity. Yet both were amazed at what they discovered once the Lord removed the veil from their understanding.
Doctrine and Covenants 76 and 19. The experiences of Moses and the brother of Jared are far from the only such instances. I think the most dramatic example in this dispensation, at least since the First Vision, is the revelation given in Doctrine and Covenants section 76. Recall that by the time this revelation was given the Prophet had seen the Father and the Son; he had been visited multiple times by the resurrected Moroni; he had seen in vision many of the events that transpired in the Book of Mormon (before translating it, incidentally); he had translated the Book of Mormon by revelation; he had been ordained by resurrected beings; he had received dozens of recorded revelations; and he had experienced visions too numerous to list. And after all that, he received the vision recorded in D&C 76.
Keep in mind that up until this time (1832) the combined record of the New Testament and the Book of Mormon had fashioned an image of an afterlife that was neatly divided into two simple classifications, heaven and hell. But in a single stroke, the vision of the degrees of glory changed all that, and did so radically. Everything that had been supposed on this large and central topic, both by the Saints and by the Prophet himself, had been either wrong or at least incomplete.
The same was true of the idea of “eternal” punishment. Until 1830 there was little reason for anyone to question scriptural statements about the everlasting/eternal/unending nature of the punishment that the wicked would suffer in the next life (see, for example, Matt. 25:41, 46, and Mark 3:29). Such terms were naturally understood to be making pronouncements about time, and they effectively defined hell as something that would never end.
Then, in a remarkable scriptural surprise, the Lord informed the Prophet in Doctrine and Covenants section 19 that he did not use the terms endless and eternal in the way that mortals were accustomed to using them. Rather than designating a length of time, the terms designate instead a certain quality: Endless punishment, we learn, is simply God’s punishment and is not defined by length of time (D&C 19:5–12). So again, in a single stroke, the Saints’ understanding was transformed. Practically everything that had been supposed on this topic up to that time had been mistaken.
Abundant surprises. The history of the gospel on the earth is, in one way, the history of just such surprises. Adam learned something completely unknown to him when taught the purpose of baptism (Moses 6:53–63); Enoch was surprised at the Lord’s tears, and gained fresh—and surprising—understanding when told the reason for them (Moses 7:28–41); Christ’s appearance as the promised Messiah violated the expectations harbored by the ancient Jews about that prophetic figure, and, as a result, those who accepted him embraced not only the actual Messiah, but, simultaneously, a new conception of the Messiah (see Matt. 16:13–16; Luke 24:21, 26–27; John 4:25–26); the Lord’s disciples were subsequently shocked at how he could be taken from them in death—and equally shocked at his resurrection three days later (see Mark 16:10–14; Luke 24:10–11; John 20:24–25). And so on. To write the history of God’s revelations is to write the history of man’s surprises. They are profound lessons in humility.
To appreciate this reality even more fully, consider the series of shocks that investigators and new members of the Church encounter, in one sequence or another, in rapid succession, as they study the gospel. The list is quite overwhelming, but a moment’s reflection will persuade us that this level of investigation is just the beginning. We are all investigators, and most of what we eventually learn will be taught to us after this life is finished. Even with all we know, from an eternal perspective we still know virtually nothing. As much as we have learned about degrees of glory in eternity, for example, it is sobering to contemplate the Prophet’s observation that he revealed only a hundredth of what he himself had learned in the vision.
And ponder again the implications of Doctrine and Covenants 19. If the Lord can announce at any moment that he simply doesn’t use certain scriptural words the way we customarily use them, then we might expect that other conclusions we have reached might also undergo revision as the Lord pours out future revelations of his mind and will.
Of course, none of this changes what we do know with certainty. For example, I know that we do indeed have a Father in Heaven, that we have a Savior who is Jesus Christ, that the fullness of the gospel was restored through the Prophet Joseph Smith, that the Book of Mormon is true, and that the Lord directs his Church today through living prophets. These are certain, and I know them.
All I have suggested is that the scriptures actually touch on a lot more topics than these fundamental certainties. On such matters there is very little that I can pretend to comprehend—certainly not in any degree of fullness. On many topics, devoted people can see issues differently and reach a variety of conclusions. Indeed, it would not be wrong to expect that every spiritual concept I currently hold will be enriched, and in many cases thoroughly transformed, by things I learn in the future—particularly following mortality. Given the incomplete nature of the revealed word, that is to be expected—even embraced. Who am I to insist that my understanding of current revelation is anywhere close to the truth in the depth, detail, and expanse in which God knows it and which he will eventually be able to reveal? What right do I have to think that all surprises are behind me and to be defensive, insistent, or smug in any way? The truth is, I have none.
I believe the same must be true for all of us. Because our knowledge is so fragmentary, we will surely encounter an endless train of spiritual surprises once we pass through the veil. At least by then, if not before, we will appreciate just how little of the Lord and his works we have actually comprehended in this life. I believe we are to honor the Lord in every aspect of our lives, and I think that entails recognizing just how little we know of what he knows, and then living, thinking, and studying accordingly. It means pondering the living word diligently and with a bright sense of tentativeness, humility, and wonder.
Matters of the Intellect
It is not only in gospel study that we should tread humbly; humility pertains equally to matters of the intellect. For example, in a very enlightening and engaging Nobel Conference lecture years ago, MIT professor Victor Weisskopf reported that the scientists of the 1930s, despite their brilliance and dedication, nevertheless lived in “a fool’s paradise.” Physicists at the time, he observed, “thought they had found all the elementary particles. . . . Why was it a fool’s paradise? Well, it was a fool’s paradise because it turned out it was not so.”
Regarding science generally, Stanley Jaki remarked, “What I suggest is that even science, to say nothing of the broader cultural outlook, might benefit by a modest measure of caution about the presumed absolute validity of some propositions particularly dear to the scientific and philosophical spirit of the age.”
Similarly, years ago Robert Nozick characterized, in an unforgettable way, the nature of theoretical explanation generally. He said that such activity
feels like pushing and shoving things to fit into some fixed perimeter of specified shape. All those things are lying out there, and they must be fit in. You push and shove the material into the rigid area getting it into the boundary on one side, and it bulges out on another. You run around and press in the protruding bulge, producing yet another in another place. So you push and shove and clip off corners from the things so they’ll fit and you press in until finally almost everything sits unstably more or less in there; what doesn’t gets heaved far away so that it won’t be noticed. . . . Quickly, you find an angle from which it looks like an exact fit and take a snapshot; at a fast shutter speed before something else bulges out too noticeably. Then, back to the darkroom to touch up the rents, rips, and tears in the fabric of the perimeter. All that remains is to publish the photograph as a representation of exactly how things are, and to note how nothing fits properly into any other shape.
So, even intellectual topics are slippery and resist ultimate explanation. Achieving understanding of the world is hard. To further illustrate, let me draw at least cursory attention to three episodes in recent intellectual history.
One of the towering intellectual figures of the twentieth century was the Austrian philosopher Ludwig Wittgenstein (1889–1951). Wittgenstein began his studies in engineering at the University of Manchester in 1908 but, at Gottlob Frege’s suggestion, went to Cambridge in 1911 to study logic under Bertrand Russell. He there became engulfed in the problems of logic and philosophy, and his reputation for intensity became legendary—many thought him at least somewhat mad. Indeed, certain that he needed a more pristine environment in which to work on the intellectual problems that consumed him, Wittgenstein left Cambridge after two years to live in a remote village in Norway. There he stayed and worked until enlisting in the Austrian army at the outbreak of World War I. He continued his philosophical work during the war and actually wrote much of his immensely famous and influential work Tractatus Logico-Philosophicus while in an Italian prisoner-of-war camp.
The Tractatus first appeared in English in 1922. A treatment of the problems that had so long absorbed him, the slight, tightly written volume made important advances in logic, proposed a striking theory of how language is related to the world, and drew conclusions about various philosophical topics, including “the mystical.” So certain was Wittgenstein of his thinking that he said in the preface, “The truth of the thoughts that are here set forth seems to me unassailable and definitive. I therefore believe myself to have found, on all essential points, the final solution of the problems.” Convinced of this, Wittgenstein determined there was nothing more to do in philosophy; he therefore abandoned the field and the academic life altogether and became an elementary school teacher in rural Austria.
Despite his retirement from the intellectual community, Wittgenstein’s reputation soared. When, in 1929, Wittgenstein finally returned to Cambridge with a reawakened interest in philosophy, John Maynard Keynes wrote in a letter: “Well, God has arrived. I met him on the 5.15 train.”
To earn a PhD and secure an academic position at Cambridge, Wittgenstein submitted the Tractatus as his PhD thesis. By this time, his reputation was so immense that Russell and G. E. Moore (also a famous philosopher) conducted Wittgenstein’s oral examination with embarrassment; indeed, Russell called the examination the most absurd thing he had known in his life. Nothing better illustrates the irony of the situation than this: During the exam, Russell raised objections to some parts of the Tractatus, whereupon Wittgenstein brought the meeting to a close by slapping his two examiners on the back and saying, “Don’t worry, I know you’ll never understand it.”
From this point on, however, Wittgenstein’s confidence waned. Although the Tractatus decisively influenced a whole generation of philosophers and other scholars, Wittgenstein himself came to recognize what he called “grave mistakes” in the book and dramatically transformed his approach to the issues it addressed. His later work, captured in a number of sources—but perhaps best in his Philosophical Investigations, published posthumously—is also concerned with the nature of language and the problems of philosophy, but it repudiates altogether the theory of language of the Tractatus and its approach to philosophical issues. In a dramatic display of intellectual impact, this work too became widely considered a work of genius and decisively influenced another generation of philosophers.
The widely influential movement known as “logical positivism” followed a similar trajectory. Although it developed distinctive doctrines of its own, logical positivism was inspired in large part by the early Wittgenstein. Centered in a group of scholars in Vienna, beginning in the mid-1920s, the movement came to be known as the work of the “Vienna Circle.” Leading members of the Circle were physicist Moritz Schlick, philosopher Rudolf Carnap, and sociologist Otto Neurath. The young British philosopher A. J. Ayer was a member of the Circle for a time. He spoke little German and simply listened as members of the Circle debated each other. But he learned quickly and, at age twenty-five, published a very confident and influential book explicating logical positivism and the answers it provided to various intellectual problems, including those of science, ethics, and religion. First published in 1936, Ayer’s work became a classic expression of the Circle’s point of view and made him justly famous.
Speaking generally, the central feature of logical positivism was the “verification principle”—the view that any meaningful statement about the world must be verifiable through experience. Indeed, the meaning of any statement is its method of verification. According to this principle, the claims of science are meaningful because they are empirical statements that can in principle be verified through observation. Statements about the circumference of the earth, the height of Mount Everest, and the effect of heat on gases are examples. The verification principle also allowed a place for statements that are not about the world but that are necessarily true due to the laws of logic or to the meanings of their terms. Examples would include the statement “All bachelors are unmarried” and the mathematical equation 5 + 4 = 9. Statements that do not fall into one of these two categories—empirically verifiable statements about the world or statements that are necessarily true—are simply meaningless. A claim like “the absolute is pure oneness of being” is an example. This statement appears to be about the world, but how would one go about verifying it? Since it eludes verification, it must be understood to make no claim at all; it is neither true nor false, but meaningless. The logical positivists saw the great metaphysical systems of philosophy as riddled with just such nonverifiable and thus meaningless statements, and it was against these intellectual systems that they were largely reacting. All metaphysical claims, including those of religion, were rejected as nonsense.
Despite the early confidence of its many adherents, however, the doctrines of logical positivism ultimately unraveled. Most importantly, the movement could never successfully formulate the verification principle itself: the principle is not verifiable, and yet neither is it a necessary truth. Is it therefore, by its own standards, a meaningless and nonsensical claim? W. V. Quine’s influential assault on the logical positivists’ distinction between analytic and synthetic (or empirical) truths also contributed significantly to the decline of the movement. Indeed, the crumbling of its doctrines was eventually so complete that Ayer himself—when asked years later about the main defects of logical positivism—simply remarked, “Well, I suppose the most important of the defects was that nearly all of it was false.”
This episode is fascinating, not only because logical positivism’s influence was immense, but because that influence continued in other intellectual fields long past the recognition of serious problems by its own central figures. Harvard philosopher of science Hilary Putnam remarks wryly—in reference specifically to the enduring effects of logical positivism—that “scientists tend to know the philosophy of science of fifty years ago” and adds that “it is annoying to a philosopher to encounter a scientist who is sure that he needn’t listen to any philosophy of science and who then produces verbatim [logical positivist] ideas which you can recognize as coming from what was popular in 1928.”
Einstein and Bohr
Another interesting example of ongoing surprise is found in the theoretical debate in physics between Albert Einstein and Niels Bohr. The story begins early in the twentieth century, when developments in physics led to the general field of quantum mechanics—a discipline that developed an impressive but bizarre array of experimental results. Scientists found, for example, that light exhibits both particle-like and wavelike properties, a phenomenon completely unknown in our ordinary world of bowling balls, oceans, and burritos. Some studies are able to display both of these characteristics as they accumulate data over time, but in the early days of quantum mechanics a given experiment would exhibit either one of these properties or the other, but not both. It thus appeared that the type of measurement itself determined what kind of properties would be observed while, at the same time, obscuring observation of its other set of properties.
Related to this was an odd discovery regarding the position and momentum of elementary particles: As experimenters increased the certainty with which they measured one of these properties, they decreased the certainty with which they could measure the other—and this was due not to a limitation in measurement, but to a reality about the particle itself.
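As a brief technical aside (not part of the original account, but standard physics), the trade-off just described is expressed in Heisenberg’s uncertainty relation, which bounds the product of the uncertainties in a particle’s position and momentum:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

Here $\Delta x$ and $\Delta p$ are the spreads in position and momentum, and $\hbar$ is the reduced Planck constant. The point is that shrinking one spread necessarily inflates the other; the bound is a feature of the particle’s quantum state, not of any instrument’s imprecision.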
Such results led in short order to two disparate ways of looking at the world. One of these concluded that the quantum world is radically different from our ordinary world of familiar objects. According to this view, everything at the quantum level exists in something of a cloudy, indeterminate state, possessing only probabilities of being in one particular condition or another; the act of measurement disturbs this state and then (but only then) the state becomes determinate. Thus, a particle at the quantum level does not actually possess any precise physical position or momentum; instead, a more definite location or momentum is created only as part of the act of observation itself.One implication of this is that reality is not just “out there” for us to discover; instead, our attempts at discovery themselves importantly influence what the world is. Contemplating all this, Niels Bohr once remarked that “if someone says that he can think about quantum physics without becoming dizzy, that shows only that he has not understood anything whatever about it.”
A second view was that, at the deepest level of explanation, the quantum world is actually the same as our ordinary world of experience, but that our current methods of measurement are too coarse to discern this. Small particles do possess actual, physical locations and momenta, for example—whatever our difficulty in discerning them—and this means that there truly is a reality “out there” that exists independently of our observations.
These differences in theory led to a friendly ongoing debate in the late 1920s and 1930s between Bohr, who, along with the large majority of scientists, held the first view, and Einstein and a few others, who held the second.
The most important of Einstein’s challenges to Bohr appeared in a paper published in 1935 with collaborators Podolsky and Rosen (the paper is thus widely referred to as EPR). The paper created an ingenious thought experiment that showed how, using indirect means, to measure both the position and the momentum of a particle so that the particle itself is not disturbed or affected in any way in making the measurement.
This development was theoretically groundbreaking. The thought experiment demonstrated that elementary particles must have precise features such as position and momentum after all: if exact position and momentum can be determined, even indirectly, then—contra Bohr—they must exist, and they must exist whether we happen to measure them or not.
In response to Einstein, Bohr altered to some degree his manner of characterizing quantum mechanics,but it was not possible to say that Einstein had actually “won.” He had created a thought experiment that raised questions about the adequacy of the orthodox view, but his view did not make predictions that were any different from Bohr’s. As a result, no one could see how to conduct an experimental test of the dispute, and the issue was largely set aside by practicing scientists. Einstein and Bohr were locked in a theoretical stalemate.
The landscape shifted dramatically in 1964. In what has been praised as “the most important recent advance in physics,” “the most profound discovery of science,” and “one of the profound scientific discoveries of the [twentieth] century,”the Irish physicist John Bell (1928–1990)—referred to in later years as the Oracle of CERN —developed a mathematical theorem that finally showed the way to an experimental test of the Einstein-Bohr divide by showing that the two views do in fact result in competing predictions.
Eventually, Bell’s theorem was used to perform just such experimental tests, the most famous of which was conducted by Alain Aspect and his colleagues in 1982. Their tests yielded results that differed significantly from those required by Einstein’s view, while, in regard to the orthodox interpretation of quantum phenomena, “the agreement,” they report, “is excellent.”In a sophisticated experimental test, Einstein’s view of the quantum world stood refuted (at least on the face of it). Since then, other important studies (generally referred to as EPR experiments) have been performed, all of them supporting Aspect’s results.
I find this long controversy fascinating in two ways. First, it is instructive that the debate about the foundations of physics occupied two of the best minds of the twentieth century, and that decades—not to mention a third great mind—were required to reach anything close to a resolution. This is worth pondering in itself. But second, and even more significantly, it turns out that the theoretical issues have actually not abated. Despite his apparent support from experiments like Aspect’s, Bohr’s view of the quantum world is far from universally accepted among those who specialize in the theoretical foundations of quantum mechanics. Indeed, John Bell himself held to Einstein’s basic position about quantum particles until his death in 1990, despite the results of experiments based on his own theorem. In doing so, Bell surprisingly reinvoked the idea of an ether—long a theoretical heresy in physics—and even entertained the possibility of travel at faster than the speed of light, despite the explicit repudiation of this in special relativity and by the scientific community generally.So Bell himself considered Bohr far from vindicated; he believed only that the experimental results “require a substantial change in the way we look at things.”
Another physicist also reports that, at least in the philosophical explication of fundamental matters, Bohr’s star has fallen and some have questioned whether his philosophy of physics “could be given a coherent interpretation at all.”Indeed, Murray Gell-Mann, well-known Nobel laureate in physics, also resists the Bohr view, remarking once that “the fact that an adequate philosophical presentation has been so long delayed is no doubt caused by the fact that Niels Bohr brainwashed a whole generation of theorists into thinking that the job was done 50 years ago.” On another occasion (in 1994) he said that “physicists are only now approaching a really satisfactory interpretation” of quantum mechanics.
Others think we are not that close at all. One specialist believes that both Einstein and Bohr were focused on the wrong problem all along,and in a recent book, the authors report that there is no longer an established or dominant interpretation of quantum theory at all—which is why, they say, it is important to keep the interpretation debate open by looking back at the history of quantum theory.
In short, it is an understatement to say that on these and many related matters the debate continues.
Lessons in Humility
These incidents from recent intellectual history suggest that significant intellectual matters are often less settled than the current orthodoxy implies, whatever that orthodoxy happens to be and in whatever field. The best experts can always have penetrating and fundamental questions—including of themselves—even if others do not.
Scholars under the spell of the early Wittgenstein, for example, had to be shocked when the great master abruptly reversed field on their cherished doctrines and left them eating intellectual dust. And what of all the steadfast adherents of logical positivism, who had to face Ayer’s own verdict that “nearly all of it was false”? Consider also the decades-long debate between Einstein and Bohr in which neither could be shown to prevail; the apparent vindication, finally, of Bohr in experimental tests; and the more recent questioning of Bohr’s fundamental views despite these experimental results. What are we to make of all that? Further, what does it imply that all high school juniors “know,” for example, that there is no such thing as an ether, just as they “know” that nothing can travel faster than the speed of light—especially in view of the fact that the great John Bell didn’t “know” either of these things?
Again, just as with scriptural topics, it seems to me that the mandate in intellectual matters is inescapable: it is the mandate of humility.
Some Convictions about Academic and Gospel Study
The lesson for me from all of the surprises in both spiritual and intellectual matters is how little I know. Indeed, I have learned to assume that if I find myself thinking I know a lot about some subject, it is only because I am not thinking very deeply about it: in any deep effort to understand either spiritual or intellectual issues, questions outstrip answers quickly, and without end, and at such times nothing is more apparent to me than how little I genuinely know.
Based on these observations, and even though my scholarly attainments are modest, I have developed a number of personal convictions about academic and gospel study. Here are just four of them.
1. Both Are Pursuits of the Truth
At this level of abstraction, gospel and academic study are identical. The first focuses fundamentally on the things of eternity and relies heavily on the role of our spiritual sensibility in responding to the confirmation of the Spirit of God; the second focuses primarily on the things of the world and relies on the physical senses in leading to truth. Despite these differences, in aim they are one: the search for truth. Where these areas of focus overlap they will ultimately yield identical conclusions. If they seem to disagree at present, it is only because we understand too little of one or the other, or, most likely, of both.
2. Pursue Both Secular and Spiritual Learning as a Way of Honoring God
It is instructive that Bach added the phrase Soli Deo Gloria (“To God alone be glory”) to the end of most of his scores. It is little wonder that a biographer could report of Bach’s art that it was “one great hymn of praise to God.” Shouldn’t my intellectual studies, too, be one great hymn of praise to God? Doesn’t that follow from the command that we are to sanctify ourselves so that our minds “become single to God” (D&C 88:68)? I also can’t help thinking of the title Beethoven gave to the third movement of his late string quartet, Opus 132, “Holy Song of Thanksgiving to the Godhead, from One Recovering,” where the beauty and holiness of the music sublimely express that title. Although Beethoven is nothing like Bach in the union of his art with worship, he is still instructive here: Shouldn’t my intellectual studies, too, be a holy song of thanksgiving to the Godhead?
3. Examine Evidence Carefully and Follow Wherever It Leads, Rather Than Jumping on Intellectual Bandwagons
Although the discussion below deals exclusively with intellectual matters, with suitable qualifications the same principles would apply to spiritual matters.
Evidence. It might seem like a truism to say that we must be careful in examining evidence, but determining the “evidence,” even in empirical disciplines, is often not as straightforward as we might think. It is natural to suppose, for example, that the world is populated with straightforward facts and that the scientific process consists simply in observing those facts and then developing the best explanation we can of what we all observe. While this seems sensible on the surface (the logical positivists assumed this view, for example), the matter actually turns out to be more complicated. Norwood Russell Hanson, among others, argued long ago that what we see and how we see it are influenced by some body of information we already hold; our observations, so to speak, are to some degree “theory-laden.” Emphasizing the same point, Thomas Kuhn records numerous historical examples of scientific facts that were observed only after one theory replaced another: the theory paved the way for the observations, rather than the observations for the theory.
Determining the evidence, then, is not always a simple matter; indeed, as just mentioned regarding Kuhn, sometimes our theory precludes us from even seeing certain evidence. So I must always wonder: How much of my view of the evidence is actually formed by a theory I already hold? And how much is this true of others as well, including those I admire? And if my view of the evidence is influenced by a theory I already hold, what are the chances that I could ever see evidence that would disconfirm that theory?
“Tribe” mentality. But related to this, I must also be true to the evidence, at least so far as I understand it, and resist jumping on intellectual, or other, bandwagons for the sake of acceptance, popularity, or praise. This includes refusing to acquiesce—in whatever discipline—to the intellectual consensus of the time just because it is the consensus of the time or, even unwittingly, to adopt a point of view just because it is the view of high-profile academic figures, or even of my colleagues. To behave in any of these ways is to adopt a tribe mentality—an exaggerated sense of academic correctness and a resultant defensiveness about my beliefs. My concern in that case is less with pursuing the truth than with maintaining membership in the tribe and preserving its intellectual purity.
To give just one example, surely something like a tribe mentality must have been at work in what John Bell considers the “disgraceful” treatment Louis de Broglie received at the hands of the physics community in advancing an explanation of quantum phenomena in terms of classical physical laws rather than within Bohr’s widely accepted theoretical framework. Instead of considering de Broglie’s work dispassionately and with scientific inquisitiveness, Bell tells us that it “was laughed out of court” by the intellectual coterie of the time and that “his arguments were not refuted, they were simply trampled on.” Bell’s observation suggests that the concern seems to have been less with pursuing the truth than with maintaining group loyalty and defending an already-agreed-upon point of view. After all, it was a point of view shared by virtually all of the luminaries of the time and that therefore defined what was and was not intellectually respectable, and thus what could and could not be taken seriously. This dismissive treatment occurred even though the dissenting voice came from the intellectually distinguished and formidable Louis de Broglie.
Whither loyalty? If I am to pursue the truth aright, it seems that my loyalty cannot be to a particular intellectual system just because people I admire accept it, or because I want to be associated with such figures on the academic landscape. My loyalty must instead be to the most subtle and critical understanding that I can muster of the evidence itself.
Few attitudes are more risky than supposing that the scholarly icons of the day have reached some final, indubitable intellectual peak on fundamental matters, and that we can shallowly follow them, since all that remains is the adumbration of various details here and there. That was not true in the case of Wittgenstein; it was not true in the case of Carnap and Ayer; and it was not true in the case of either Einstein’s or Bohr’s view of the quantum world. According to John Bell, it was not true even in the case of the scientific community’s universal rejection both of the ether and of the possibility of travel at faster than the speed of light.
Thus, before hitching my wagon to some luminary’s star—or even to some intellectual movement’s star—I had better try to make sure that, among other things:
• my understanding of the evidence is both subtle and comprehensive;
• I recognize the anomalies of evidence that the theory does not explain well;
• I comprehend all the conceptual connections between the observations comprising the evidence and the components of the theory claiming to explain that evidence;
• I have carefully considered ways the evidence itself might be seen differently from alternate theoretical points of view—indeed, the myriad ways, unavoidably, that unexamined assumptions, preconceptions, and presuppositions infiltrate and condition the theory;
• I have thoroughly and subtly examined rival theoretical explanations and have been able to dismiss them on the basis of my own careful examination of the evidence and of the logic of those rival positions; and
• I recognize that even the most robust and productive theories face huge hurdles in claiming anything like “truth,” even if preferable to current theoretical rivals.
In short, while developing loyalty to one intellectual position or another is possible, it is not a decision to be undertaken lightly. Nor, once I have developed a responsible conviction about a theory, can I have loyalty to that point of view just because I hold it. To the degree that my concern is with the truth, I will continue to be aware of the anomalies and intellectual debilities of my favored position, even as I am aware of its virtues. I will continue to have in mind equally well the pros and cons of rival theoretical views. If I do not, this may signify that, at heart, I am less concerned with the truth than with simply defending what I have already concluded, not to mention defending my hard-won membership in whatever tribe I use to define my intellectual identity and status.
The great American philosopher W. V. Quine had something to say about such matters near the end of his autobiography: “I have had neither the aptitude nor the temperament for debate, public or private, when confronted with motives recognizably other than the pursuit of truth. If in discussing with a student I sensed that he was animated rather by some ideological preconception, or by a wish to have been right for the sake of high marks or self-esteem, I made short work of the dialogue.” A vast gulf, Quine goes on to say, separates those who are thinking primarily of themselves in their scholarship and those who are thinking primarily of the truth. He remarks, “The latter, I like to think, will inherit the earth.”
4. Pursue the Truth with Humility
As must be clear by now, I believe that nothing impedes our understanding of the world, or of the gospel, quite as thoroughly as a dogmatic insistence on whatever understanding we think we possess at the moment. On the contrary, in both scientific and gospel scholarship, there is profound reason for a lingering tentativeness about many of the ideas we hold at any one time. Expressing this very humility, Gell-Mann once spoke of the difficulty of making theoretical headway in physics, remarking that “perhaps some now unknown brilliant young scientist will find a new set of questions to ask, the answers to which will clarify today’s problems and make what I have been saying here obsolete.”
This appears to be a perfect statement of the humility that should characterize my explorations in both gospel and academic topics. To the degree I pursue the truth with such humility, it seems I will exhibit a number of characteristics. Here are a few (although stated specifically in terms of intellectual matters, they should also apply with certain qualifications to a broad range of scriptural issues):
• I will not claim to know more than I do. I will appreciate that my intellectual convictions, whatever they are, are beholden to a complex, intricate, and hidden web of assumptions, preconceptions, and predispositions that I probably do not even recognize, much less comprehend.
• I will be neither defensive nor rigid about the conclusions I reach. I will be able to catalog the evidence that weighs against the positions I favor as easily as I can expound the evidence that weighs for them. And I will neither minimize the former nor exaggerate the latter.
• I will live with the explicit recognition that: (1) knowledge of the truth—in any kind of complete form—is not possible; (2) because of this, there is no possibility that I have achieved it; and (3) in favoring one explanation over another I am simply making judgments about what I find to be the most perspicacious explanation, for the time being, of the facts as I imperfectly understand them.
• I will embrace the expectation that, in the end, I will turn out to be wrong on an endless host of matters. This is inevitable, and it is both futile and unwise to imagine otherwise.
Humility writ large. It seems that if I pursue the truth with humility, I will live in welcome anticipation of surprise. If I am honest and thorough—both in gospel study and in my intellectual discipline—I will discover soon enough that what I know is far outweighed by what I don’t. Since that is the case, I might as well start out humble, since, once I face the Lord and begin to glimpse eternity, that is certainly how I will end up.
This is humility writ large, and I see no way around it. I discern in this stance the teaching of the redoubtable Mormon that “none is acceptable before God, save the meek and lowly in heart” (Moro. 7:44) and of Benjamin that “man doth not comprehend all the things which the Lord can comprehend” (Mosiah 4:9).
I hear echoes of these ancient prophetic sentiments in the twentieth century’s T. S. Eliot, and everything I have learned about both the Spirit and the intellect convinces me he was correct. Beyond the certainties of the gospel, the only wisdom we can hope to acquire is the wisdom of humility. Humility is endless.
John Bell’s Revolutionary Response
As mentioned in the text, John Bell did not accept Bohr’s apparent vindication at the hands of experiments based on his theorem. Instead, he considered highly unorthodox ideas in his approach to the results—invoking both the idea of an ether and of communication at faster than the speed of light. To appreciate just how revolutionary Bell’s thinking was, a little history is in order.
While the ancient Greeks originated the idea of an ether, it was thoroughly at home in the physics of the nineteenth century. Before the end of that century, the great James Clerk Maxwell (1831–1879) had determined that light consists of electromagnetic waves, and for this reason (among others), he and others concluded that there must be some physical medium through which such light waves were propagated. This point of view added support to the centuries-old notion of the ether and contributed to the wide acceptance of it in the scientific circles of the time.
In 1887, Albert Michelson and Edward Morley conducted the first experimental test of the ether. The study was designed to detect the motion of the earth through this medium (which, since the earth was moving relative to it, would constitute a sort of ether “wind”).
Fully expecting to detect the ether and to identify at least some of its effects, Michelson and Morley were surprised when their experiment failed to show this result. There seemed to be two possible explanations: Either there was in fact an ether, and the experiment had simply failed to detect it, or there was no ether after all—which meant that there was no medium through which light waves were propagated. Because this outcome violated the best theoretical understanding of the time, Michelson and Morley were reluctant to accept it and repeated the experiment a number of times to account for one variable or another. But the result was always the same.
Although some were glad to see the ether eradicated from scientific discussion because of the strange properties it would have to possess, other important physicists resisted abandoning the idea. Brilliant thinkers such as Hendrik Lorentz, Henri Poincaré, Joseph Larmor, and George Fitzgerald continued to embrace the ether and published important works motivated by the Michelson-Morley experiment. One argument was that the instruments designed to identify the ether were themselves distorted by motion and thus could not be expected to detect the movement of the earth through this medium, even if it did exist. In that case, the failure to detect the ether by Michelson and Morley was not due to its absence, but to the failure of the experimental situation to account for this variable.
The deathblow to the acceptability of the ether, however, was finally dealt by Einstein’s special theory of relativity in 1905. Here is why. Beginning with Galileo, scientists had recognized that the laws of motion in a room traveling at a constant rate of speed (and in a straight line) were the same as those in a room completely at rest. The ways that objects in the moving room would behave were identical to the ways that objects in a stationary room would behave. (Each room constitutes its own frame of reference, so to speak, and since each is in a steady state of motion—the stationary room’s state of motion is simply zero—it is said that they are both “inertial” frames of reference.)
What Einstein did in the special theory of relativity was to postulate that light, too, operates identically in the two rooms. By this time, Maxwell’s famous equations regarding electric and magnetic fields had established the speed of light in a vacuum to be approximately 300,000,000 meters per second (about 186,000 miles per second), a value denoted c. Einstein postulated that, just as the other laws of physics would be unchanged in the two frames of reference, the speed of light would be unchanged as well. Observers inhabiting a room that was traveling even at great speed would not be able to tell that they were doing so, even by measuring the speed of light. Any observer, in any frame, will measure the speed of light to be c.
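This counterintuitive postulate can be illustrated numerically. In special relativity, collinear velocities do not combine by simple addition (u + v) but by the relativistic velocity-addition formula, which guarantees that light keeps speed c in every frame. The short sketch below is my own illustration, not part of the original essay:

```python
# Relativistic velocity addition: u' = (u + v) / (1 + u*v/c^2).
# Unlike Galilean addition (u + v), the result never exceeds c,
# and a light beam measured from any moving frame still travels at c.

C = 299_792_458.0  # speed of light in a vacuum, in meters per second


def add_velocities(u: float, v: float) -> float:
    """Combine two collinear velocities according to special relativity."""
    return (u + v) / (1 + u * v / C**2)


# A light beam (speed c) observed from a frame itself moving at half c:
print(add_velocities(C, 0.5 * C))        # still exactly c
# Two large sub-light speeds combine to something still below c:
print(add_velocities(0.9 * C, 0.9 * C))  # less than c
```

For small everyday speeds the correction term u·v/c² is negligible, which is why Galilean addition works so well in ordinary experience; the difference only becomes dramatic near c.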
An important consequence of this invariance of the speed of light was that the concept of being “at rest,” or stationary, turned out to be meaningless. Even if we inhabited a frame of reference that was at rest, we could not know it, because there is no independent, fixed frame of reference that is at rest and that can therefore tell us whether or not we are. Scientists in the late nineteenth century considered the ether to be stationary in this way, and, if true, the ether’s stationary character would naturally open up the possibility of determining whether or not any other frame was at rest by comparing it to the ether. In this sense the ether constituted a preferred frame of reference. But experiments of the Michelson-Morley type could not detect the presence of an ether, and this obviously precluded any kind of comparison to it. This is where the notion of the ether suffered ultimate rejection: it was precisely this concept—that of a stationary medium that filled space—that had been thought to provide a universal fixed frame relative to which we could define the speed of any object traveling through space. But once the whole notion of an at-rest frame of reference was abandoned in special relativity, the ether—as an expression of such a frame—was rendered superfluous. The idea of an ether subsequently became a theoretical anathema in physics.
Bell’s Departure from Orthodoxy
John Bell’s departure from the scientific consensus on these ideas is demonstrated in the course of an interview conducted in 1986 by fellow physicist Paul Davies. Davies identified the two crucial aspects at issue in the experiments based on Bell’s theorem, both of which were advanced by Einstein: (1) the reality of features of the external quantum world, independent of our observations of them, and (2) the idea that particles at a distance cannot instantly influence each other because there is no such thing as faster-than-light interaction. The interviewer observed that the correlations shown in Aspect’s experiment meant that only one of these ideas could be maintained. If the particles are “real” and possess a fixed state prior to observation, then the experimental correlations must have been achieved by a communication of information between the two particles that traveled faster than the speed of light. According to Einstein’s own special theory of relativity, as we have seen, such communication is not possible. Therefore, if Einstein was right about the “real” or “fixed” nature of quantum particles, then he was wrong about nothing traveling faster than the speed of light. On the other hand, if he was right about nothing traveling faster than the speed of light, then he was wrong about the “real” or “fixed” nature of quantum particles—which meant that such particles had to be more like the orthodox conception (that is, two entities that comprise a single quantum state, so that both are affected by measurement of either one). Davies asked Bell, “Which of the two [the fixed state of particles, or the impossibility of interaction at faster than the speed of light] would you like to hang on to?” Here is Bell’s reply:
For me it’s a dilemma. I think it’s a deep dilemma, and the resolution of it will not be trivial; it will require a substantial change in the way we look at things. But I would say that the cheapest resolution is something like going back to relativity as it was before Einstein, when people like Lorentz and Poincaré thought that there was an ether—a preferred frame of reference—but that our measuring instruments were distorted by motion in such a way that we could not detect motion through the ether.
Bell’s mention of the ether in this context is significant. As we have seen, in 1887, against their own expectations, Michelson and Morley had failed to detect an ether in experimental tests, and Einstein’s special theory of relativity, published in 1905, had on further grounds eliminated any notion of a fixed or preferred frame of reference such as the ether was thought to be. As a result of these developments, the notion of an ether had been practically laughable in physics for the better part of a century. But here Bell reintroduces the idea. He does so, he says, because if there is an ether “you can imagine that there is a preferred frame of reference, and in this preferred frame of reference things do go faster than light.” He adds, “The reason I want to go back to the idea of an ether here is because in these EPR experiments there is the suggestion that behind the scenes something is going faster than light.” In further defense of the ether, Bell added:
What is not sufficiently emphasized in textbooks, in my opinion, is that the pre-Einstein position of Lorentz and Poincaré, Larmor and Fitzgerald was perfectly coherent, and is not inconsistent with relativity theory. The idea that there is an ether, and these Fitzgerald contractions and Larmor dilations occur, and that as a result the instruments do not detect motion through the ether—that is a perfectly coherent point of view.
Davies then remarked, “To sum up then, you would prefer to retain the notion of objective reality and throw away one of the tenets of relativity: that signals cannot travel faster than the speed of light [which an ether would make possible]?” The answer: “Yes. One wants to be able to take a realistic view of the world, to talk about the world as if it is really there, even when it is not being observed.”
This episode demonstrates the inherent interrelatedness of intellectual ideas: Bell was able to hold onto one idea (that quantum particles possess fixed properties prior to measurement) because he was willing to question others (that there is no ether and that signaling cannot occur at faster than the speed of light). And he questioned these ideas despite the century-long contrary consensus of the scientific community.
Whether he turns out in the final analysis to be right or wrong (either about an ether or about travel at faster than the speed of light), at a minimum Bell demonstrates that, as I suggested earlier, the best experts can always have penetrating and fundamental questions, even if others do not.
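For readers curious about the magnitude of the correlations “behind the scenes” that created Bell’s dilemma, here is a small numerical sketch of the CHSH form of Bell’s inequality. It is my own illustration, not drawn from Bell or from the interview, and the particular detector angles are the standard textbook choice:

```python
import math

# CHSH form of Bell's inequality: any local-realist theory requires
# |S| <= 2, where S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Quantum mechanics predicts the correlation E(x, y) = -cos(x - y)
# for spin-singlet pairs; at the angles below this gives |S| = 2*sqrt(2),
# violating the bound -- the kind of result seen in Aspect-style tests.


def E(x: float, y: float) -> float:
    """Quantum-mechanical correlation for detector angles x, y (radians)."""
    return -math.cos(x - y)


a, a2 = 0.0, math.pi / 2               # one observer's two settings
b, b2 = math.pi / 4, 3 * math.pi / 4   # the other observer's two settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # about 2.828, i.e. 2*sqrt(2), exceeding the bound of 2
```

It is precisely this excess over 2, confirmed experimentally, that forces the choice Davies put to Bell: give up fixed pre-measurement properties, or give up the ban on faster-than-light influence.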