Hi, I'm David Feddes, and the aim of this talk is to help you think humbler. Now, for those of you who are grammar police, I know it should say "think more humbly," but "think humbler" is shorter and easier to remember. So the goal by the end of this is to be able to think humbler. I'm going to be drawing again on two books that are very helpful in this regard: Thinking, Fast and Slow, a longer and more in-depth scholarly book by Daniel Kahneman, a professor at Princeton and winner of the Nobel Prize; and The Art of Thinking Clearly, a book by Rolf Dobelli. We need to keep in mind that Kahneman talks about two systems: system one, the quick thinking, and system two, the slow thinking. Our slower and more careful thinking, system two, is supposed to be very evaluative and rational. But often it's kind of a smug slug. It thinks very highly of itself, and it's kind of lazy. System two feels in charge, but it's lazy, and it prefers minimum effort. Some thoughts and actions that system two believes it rationally chose are prompted by system one. You think you've really thought things through, when in fact all of that thinking through was just processing information that came from your almost automatic reflexes and your intuitions and your feelings, and your reason just found excuses to justify what you already received from your intuition. Quick, easy answers feel right. You feel like you're very knowledgeable, you feel like you're very good at things, you feel like most of what you attempt is going to succeed. But is that really so? 84% of French men think they are above-average lovers. Now, if it were true to what they really are, only 50% could be above average, but evidently 84% think they're in that upper 50%. 93% of American students think they are above-average drivers; only 7% are below average, in their own humble opinion. So most people are better-than-average lovers.
Way more than half of people, almost all of us, think we're above-average drivers. We're better-than-average lovers. We're better-than-average drivers. And we are better-than-average thinkers. Most of us assume we know more than we really do. We are more sure than the evidence would warrant. The Bible says, "The way of a fool seems right in his own eyes"; "the simple believes anything"; "Do you see a man wise in his own eyes? There is more hope for a fool than for him"; "Whoever trusts in his own mind is a fool." Well, that's not very complimentary of being overconfident and thinking that you have a tremendous mind that always thinks accurately, logically, and truthfully. You're a lot better off saying, "I'm wrong a lot." And I want to help you by showing some of the ways we tend to go wrong in thinking too highly of our own thinking. One of the most common is confirmation bias. That's the tendency to interpret new evidence to fit our prior beliefs, preferences, and expectations. Confirmation bias filters out what doesn't fit by ignoring it. Or, if we are aware of something that doesn't fit our conception of things, we call it an exception: overall I'm right, but there are exceptions here and there. Confirmation bias figures into our choice of which news channels we're going to pay attention to. Most of the time we pay attention to news channels that tend to agree with the perspective we take, whether more conservative or more liberal, and so on. On social media we like to hang out with friends who think quite a bit the way we do, and we kind of avoid those who are on a different wavelength. We seek out the like-minded, and that in turn feeds confirmation bias, because we're already hearing what we want to hear, and it reinforces even more that that's the way right-thinking people really think. Confirmation bias has an impact in a whole bunch of different areas.
Your own self-image is reinforced again and again by various circumstances. If you already feel bad about yourself, then every bad thing that happens to you just piles on and reinforces that you're a loser. If you feel really great about yourself, then everything that comes your way that makes you feel even greater puffs you up. In the world of astrology, you get horoscopes that tell you things about yourself, and you say, "You know, that sounds about right. It's amazing what those horoscopes can tell a person." Of course, they're often so general that what's already in your own thinking about yourself just latches onto a word here or there in the horoscope and confirms what you already believe about yourself. In the world of business, you want to follow a certain strategy, and if things go well, you say, "Boy, we're doing well because of my excellent strategy." If they aren't going well, you say, "Well, my excellent strategy hasn't quite had time to kick in yet." In the world of medicine, doctors are sometimes very confident of their diagnosis. In one study of people who had died, the physician's diagnosis was compared with what the autopsy actually showed as the cause of death.

The physicians had a very high degree of certainty that they were correct. So the study took the cases in which they had a very high degree of certainty and confidence, and found that they were wrong 40% of the time. A little scary next time you go to the doctor. Science has a tendency to confirm its existing theories and to ignore inconvenient data. If things don't fit Darwinian evolution, well, we'll believe Darwinian evolution anyway. And it's been that way in the past with other scientific views. When the earth was considered to be the center of the universe, it was very hard for any inconvenient data to shake that view; most data could be made to fit if you worked at it hard enough, but there were always a few inconvenient facts that didn't fit. Science looks for ways to confirm its existing theories. And that's not just a flaw of science; it's a feature of human nature that we tend to look for things that confirm the way we already think. In politics, do I need to say it? When people take a political position, it's almost impossible to persuade them otherwise, because everything that happens confirms their political conviction. And in theology too: when you go to the Bible, everything you look at confirms your belief about infant baptism, or confirms your belief that infant baptism is wrong. It confirms your belief that free will decides everything about the matter of salvation, or it confirms your belief that God predestines and controls everything that happens. You read the Bible with an eye toward what agrees with you, and anything that wouldn't be quite so convenient for your particular theological system, you just kind of fudge. So confirmation bias has influence in a lot of different areas. One scholar says to the other, "Did you read my paper on confirmation bias?" "Yes, but it only proved what I already knew." "What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact" — that's what the famed investor Warren Buffett said. If you want to battle confirmation bias — at least a little; you'll never escape it completely — seek evidence against your beliefs.
Look for things that don't fit what you think. Listen to views that are different from yours — really listen — and try to find out if there are things somebody else is seeing that you're not. Closely related is self-serving bias, where we are biased in favor of ourselves. My success is from my skill, my intelligence, my hard work; my failure, that's due to other factors. Spouses feel they've done more than their share: the husband feels he's doing more than his fair share in the marriage and his wife isn't quite doing as much as she ought to, but the wife knows she's the one really doing more than she ought to be doing, and the husband isn't carrying his share of the load. Or in the business environment, this member of the team is certain that he's working harder and making a bigger contribution to the project than those other slackers he has to work with. Rolf Dobelli once asked five roommates in a building where he lived how often they took out the garbage. He asked each one separately, while riding down the elevator with them at different times, just as a kind of little experiment. When he took the answers they gave and added them up, the total came to 320% — collectively, they claimed to be taking out the garbage 320% of the time. So somebody was overestimating their contribution. Now, part of this is related to what psychologists call availability bias: what comes to mind first is what's on our radar, and we notice what we do. We don't notice what other people do nearly so much. So we tend to overestimate our overall contribution — whether we're rooming with people, whether we're part of a marriage, whether we're working as part of a business team, whether we're working as part of a church team — we notice what we're doing, and we're less good at noticing what others do. And we need to think humbler and realize that the self-serving bias is very, very strong.
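Dobelli's elevator experiment boils down to simple arithmetic: the roommates' self-reported shares should sum to 100%, but they summed to 320%. A minimal sketch in Python — the five individual percentages below are hypothetical, since the talk only reports the total:

```python
# Hypothetical self-estimates from five roommates of how often each one
# takes out the garbage (only the 320% total comes from the talk).
claimed = [0.80, 0.70, 0.65, 0.60, 0.45]

total = sum(claimed)
print(f"claimed total: {total:.0%}")  # prints "claimed total: 320%"

# The true shares must sum to 100%, so on average each roommate
# overstated their contribution by this factor:
print(f"average overstatement: {total / 1.0:.1f}x")  # prints "average overstatement: 3.2x"
```

Any time self-reported contributions sum to well over 100%, somebody (usually everybody) is overestimating.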
Another bias is introspection bias: thinking that if only you do more soul-searching, more looking inside, then you'll get reliable knowledge — knowledge of yourself, knowledge of what you ought to do. Certainly it sometimes helps to do some soul-searching or looking inside. But the trouble is, you're always still just looking inside at you, and using your own mind to think about you. Introspection is biased in favor of yourself, and it's biased in favor of your current beliefs and behavior. Introspection ignores input from outside, and it tends to discount critics. So we need to, as the Bible says, be open to rebuke, be open to the wisdom of a multitude of counselors, and not just say, "Wow, I really need to go deep inside and figure out what I need to do next and how I can be more sensible." Introspection bias is best counteracted by seeking input and really listening to the input of others. False consensus bias is assuming that just about everybody, or at least most people, and certainly most right-thinking people, would agree with me. A psychologist ran an experiment. He asked a lot of different students if they would wear a sandwich-board sign reading "Eat at Joe's" for half an hour — just walk around on the sidewalk wearing that sign. They were asked, "Will you wear it?" And then they were also asked, "Do you think others would wear it?" Those who said they would wear it estimated that 62% of others would wear it too — you know, it's just kind of a funny thing to do; other people would do that too. And those who refused to wear the sign, when asked whether others would also refuse, said of course they'd refuse to do something so dumb. So you see that in both cases a strong majority assumed that others would want to do what they themselves chose to do. Surely others agree with you — and if not, there's something a little wrong with them. It's again that kind of self-centered thing where you project your own thinking, your own attitudes, onto others and think, yeah, they're pretty much like me. And being aware of this false consensus bias — assuming that the rest of the group thinks like you do — will help you, preachers. Don't assume that everybody in your group is in agreement with you just because they don't raise their hand or yell, "Preacher, I disagree." If you're a pastor and you're counseling with somebody, and they just look at you kind of politely and nod their head a little, don't assume they're thinking, "Wow, that pastor is giving me brilliant advice; I am going to go do it right away." You can't assume consensus. You kind of have to keep on watching, keep on listening, to see whether people really agree with you, and be open to the possibility that they don't. And sometimes you can send a signal that you really don't want to hear from people who would disagree — you want the assumption that everybody agrees with you.
And then sometimes you'll get what you want — not in the sense that they agree with you, but in the sense that they will never communicate disagreement. They'll let your bias stay firmly in place. Attention bias is another area where we need to learn to think a little more humbly, because sometimes, when we're paying very close attention to one thing — the thing we think matters most — we miss something else entirely. There's a famous experiment in psychology where subjects were asked to watch a video of people passing a basketball back and forth and weaving in and out among each other. Several were wearing white and several wearing black, and the subjects were told to count the number of passes made by the team wearing white. So they watched very carefully, and because the players in white and black were weaving in and out and passing, sometimes doing tricky passes, it was not easy to follow. At the end of the video they had their count, and they were asked, "How many passes were there?" They wrote down the number. Then they were asked kind of a strange question: "Did you see the gorilla?" Gorilla? What gorilla? In the middle of this video — with the team in white and the team in black weaving in and out and passing a basketball — someone wearing a gorilla suit walks in, stands in the middle of the whole thing for nine seconds, pounds their chest, and then leaves. More than half of the people watching the video and counting the passes of the team in white didn't see the gorilla. And they would not believe the researcher when he said there had been a gorilla in the middle of that video until they rewatched it.
Because you see, when your attention is very focused and your mind is following something that's kind of hard to follow, it can shut out an awful lot of other details. We see what we focus on. As Daniel Kahneman says, we can be blind to the obvious, and we can be blind to our blindness. We don't even know what we're blind to. Specialty bias is assuming that my specialty — the thing I'm really good at — reveals the only cause of something, that I know the one solution, and that it happens to lie within my specialty. If your only tool is a hammer, everything looks like a nail; that's the old saying. So we need to avoid the single-cause fallacy: thinking that a set of events has a single, easily identified cause, that it just happens to be caused by the area I'm good at figuring out, and that it can be solved by the one thing I'm good at suggesting. We need to seek multiple causes and reasons why something might be happening, and we need to consider whether there might be a variety of responses that would address the whole situation. Otherwise we get too focused on what we happen to be good at. Often it takes teamwork — people with a variety of skills and specialties, people who are good at a variety of things — to achieve the overall result. So think humbler. Realize that the things you happen to be good at are only a small circle compared to the total truth that's out there, and the total factors in a situation. And if you think humbler, then you're readier to see that other people have good insights, and they might also have good skills that can do things you can't. We're too sure of what we think we know, and we seldom realize the vastness of our ignorance and the uncertainty of our world. The vastness of our ignorance and the uncertainty of our world — we'd rather think that we just know it all. Survivor's bias is another difficulty. Survivor's bias is caused by the way people are brought to our attention, say in the news or in other kinds of coverage. Lottery winners are on the news: they won the Mega Millions, they got so much money, whoa, they won big. Professional athletes are on television and in the news all the time. Famous actors are on the TV screen and in the movie theater, and you see these people who've made it big as actors and are getting paid millions and millions of dollars for every show. You see books by famous authors, and you think, "I could be an author." You hear the famous singers, you listen to their albums, you watch the music videos, and you wonder whether you might be able to be a great singer. You see the tremendous business entrepreneurs, the people who become billionaires, those who started their own businesses, and you say, "I think maybe I ought to start a business." You see successful churches that grow into huge megachurches, and you say, "I bet if I planted a church, it might take off." But you don't think to yourself: most people who play the lottery lose money. Most people who try to be professional athletes don't make it; it's one in a million that I could ever be a professional athlete.
Most people who try to be actors may act in their high school play, and then they get really poor in college while they major in drama, and then they try to get an acting job, but the jobs just aren't out there or they don't pay anything, so they get a second and third job. That's what happens to most people who go into acting. For every author who gets published, there are hundreds who don't. For every famous singer, there are hundreds of wannabes who maybe sang for a little group here or there, or sang a bit of karaoke in a bar one time, but never made it big. And for everybody who launches or tries to launch a company: two out of three business starts fail. Much the same is true of church planting — it's not for the faint of heart. Most church plants don't last very many years, and very few turn into megachurches. That's a bunch of discouraging, downer information. But why don't you hear more about the vastly greater number who fail? Because of survivor's bias: the ones who make it big are the ones who get all the publicity. But when you're thinking rationally, it's wise to pay some attention to those who didn't get the publicity, because there are a lot more of them. If you're blind to the probability of failure, you overestimate your chances of success, and you say, "I could make it big as a singer," or "I could be a great athlete — I made ten free throws in a row one game," and you don't realize the level of skill it takes to be among the super elite in any of these areas. That doesn't mean you should never try anything; we'll get to that a little later. But it does mean that if you try something, don't overestimate your chances of success just because you heard about the successes and haven't paid attention to the failures. Don't be taken in by survivor's bias.
Outcome bias is evaluating a decision by how it turned out — by the results — not by whether it was based on the best information available at the time. Sometimes you get reckless leaders who get lucky: they make a rash decision that had very little chance of succeeding, but they're one of the lucky ones and it turns out. Now they're praised for being bold and for having foresight, and they're put in charge of much greater things — and many times they end up wrecking that enterprise. Sometimes wise leaders who end up with a bad result are blasted as incompetent, even though they made a good decision based on the best information and took the best approach at the time. Things just didn't work out, because there's a lot that's beyond our control. But outcome bias judges how good or bad a decision was based only on how it turned out. We credit presidents with saving the economy, when in fact sometimes it's just great inventions or other things in the business world that took off and had almost nothing to do with the president at the time. But hey — outcome bias — he must be good for the economy, because the economy went well during the past four years. The Bible says not everything is under our control, and outcomes don't always reveal how fast or strong or smart you are: "The race is not to the swift or the battle to the strong, nor does food come to the wise or wealth to the brilliant or favor to the learned; but time and chance happen to them all." In short, stuff happens. Sometimes good stuff happens, and not always because of your brilliance. Sometimes bad stuff happens, and not always because of your stupidity. But outcome bias tells us that if it turned out well, I was brilliant; if it turned out poorly, somebody was dumb. Hindsight bias is the sense that my mind has a firm grasp on what has happened and that my memories are accurate. The fact of the matter, as a lot of research in cognitive psychology demonstrates, is that your memories change to fit how things turned out, or to fit your current opinions. A major project surveyed 3,000 people's views on a variety of issues and asked them how they thought things would turn out in the political realm and in some other areas. When they were interviewed again some years later, it was found that their memories of what they had said back then matched their current views, not what they had actually said years ago. They didn't remember accurately what they had said; their memories of what they said turned out to be exactly what they were thinking now, which was not the same as what they were thinking back then. Even flashbulb memories are often very mistaken. What's a flashbulb memory? It's one of those memories that's just imprinted on your mind — so unforgettable and vivid and real, it's like you're still there. Sometimes it happens in connection with a great tragedy: "I know exactly where I was when the airplanes hit the towers in 2001," or "I know exactly where I was when some other great event happened, and I remember it like it was yesterday." Except you don't. Sorry. Most of the time, even your vivid flashbulb memories do not accurately capture how you really felt at the time.
Again, psychologists have done research on this. For example, after the explosion of the Challenger space shuttle back in the '80s, a professor immediately had a lot of different students write down where they were, how they felt, and what they witnessed at the time. Then he went back to those same students three years later and asked them to record it again. They all knew exactly how they felt back then — those memories were so vivid — except there wasn't a very close match between what they wrote down three years later and what they had written a day after the event. When one student was shown his original account, he said, "I know that's my handwriting, but I couldn't have written that." That's how reliable even our most vivid flashbulb memories are. So again, memory has value, but don't be so sure that your memory is right. Sometimes you get in an argument with somebody: "I remember exactly how it happened." Okay, keep two things in mind. First, you're remembering what happened from your perspective. And second, you know what, you might not even remember accurately what happened, because memory is very, very tricky. So when you're in a sharp disagreement with somebody else, it's not a trump card to say, "But I remember" — because sometimes you don't. Memories change. Our minds have an imperfect ability to reconstruct past states of knowledge or beliefs that have changed. Sometimes you think you believed something all along, but you actually changed your mind about it. Once you adopt a new view of the world, or of a part of it, you lose much of your ability to recall what you used to believe before your mind changed. I know people who, after they grew up, declared themselves to be transsexual, and they remember their childhood. Or do they? Because I remember some of those people back in their childhood, and I remember some of the things they said and did, and it's quite different from their memories.
Now, maybe I'm wrong in what I remember. But maybe when you change your mind about who you are and who you've decided to become, your whole view of your childhood and how it developed has changed too. "The illusion that one has understood the past feeds the further illusion that one can predict and control the future" — that's Daniel Kahneman. That's the forecast illusion. Here's an annual survey that asked chief financial officers of major corporations to estimate the S&P index returns for the next year — how the stocks and other financial markets would do in the coming year. This survey was taken year after year after year of the top chief financial officers of the big corporations. Then the researchers took the 11,600 total forecasts and matched them up with how the S&P actually did a year later. They found that there was zero correlation — in fact, slightly less than zero, a negative correlation between the forecasts and the actual value of the S&P. In other words, the expert forecasts were literally worthless. And the CFOs still didn't know it. They still thought they had a pretty good idea of how the markets were going to do in the year to come — not exactly, but a pretty good idea — when in fact they didn't know any more than anybody else. "There are two kinds of forecasters: those who don't know, and those who don't know they don't know." That's John Kenneth Galbraith. Economists think they can forecast what's going to happen with the economy, but most of the time their forecasts are worth almost nothing. Studies of fund managers who try to beat the returns of the market find, again and again, that a simple market index beats 80% of the fund managers. Climate scientists forecast what's going to happen in 10 years, in 40 years, in 100 years. A quick question: in the past, how well have climate scientists predicted the future? A second question: is climate so simple, with so few factors, that with enough study scientists can figure it out and predict it? I don't think so. When it comes to tech futures — what's going to be invented 10 years from now, 30 years from now, what's going to be the next big thing in 50 years — books are written like that. But let me assure you: if you go back 50 years and read those books, they read like comedy. And then there are those who have the forecast illusion about Jesus' return. They've studied the Bible carefully, they've looked at all the signs of the times, they've computed this, they've calculated the secret meaning of that, and now they know — they know that Jesus is going to return at such and such a time. There was only one verse they missed: Jesus' statement that no one knows the day or the hour, not even the Son of Man. When Jesus was on earth, even he didn't know. But somehow the date-setters can forecast it. There are two kinds of forecasters: those who don't know, and those who don't know they don't know — and they try to forecast.
Now listen, you who say, "Today or tomorrow we will go to this or that city, spend a year there, carry on business and make money" — you do not even know what will happen tomorrow. What is your life? You are a mist that appears for a little while and then vanishes. Instead, you ought to say, "If it is the Lord's will, we will live and do this or that." But it's so easy to forget God's role in it. And the fact is, when you're forecasting, you often forget not only God's appointment and God's direction of events, but also just how much you don't know. Along with the forecast illusion comes our planning, and our planning is often very different from reality. The Scottish Parliament decided it needed a new building, and the estimated cost of building it was 40 million — not dollars, pounds: 40 million pounds. The final cost was 431 million pounds, more than ten times the original estimate. Is that just one project that got way out of hand? Newsflash: most projects get way out of hand, beyond the original estimate. Rail projects worldwide — this isn't just looking at one railroad here or there, but a worldwide study of many rail projects — were found to cost 45% more on average than originally estimated, while estimates of passenger use ran 106% above actual use. So for any given railway project, it's going to cost almost half again as much as you think, and it's going to serve less than half as many passengers as you claim. But the next railway project will come along, and the next state or city or country will hail it as the thing that's going to make a big difference and serve so many people and only cost this much. Plans have to take into account the average of what other similar projects have done, before you say, "And that's the way it's going to be."
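The remedy suggested here — anchoring a plan on the track record of similar projects rather than on your own optimism — can be sketched with the rail averages quoted above. The base estimate and ridership figure below are hypothetical; only the 45% and 106% averages come from the talk:

```python
# Worldwide rail-project averages quoted in the talk:
COST_OVERRUN = 0.45          # costs ran 45% over estimate on average
RIDERSHIP_OVERSTATED = 1.06  # ridership forecasts ran 106% above actual use

# Hypothetical project plan (made-up numbers for illustration):
estimated_cost = 200_000_000  # planner's own estimate, in dollars
forecast_riders = 50_000      # planner's forecast, riders per day

# Adjust the plan toward the reference class of similar projects:
adjusted_cost = estimated_cost * (1 + COST_OVERRUN)
adjusted_riders = forecast_riders / (1 + RIDERSHIP_OVERSTATED)

print(f"budget for about ${adjusted_cost:,.0f}")             # $290,000,000
print(f"plan on about {adjusted_riders:,.0f} riders per day")  # 24,272
```

The point is not the exact numbers but the discipline: start from what similar projects actually cost and actually delivered, then ask whether the project is still worth doing.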
In 2002, Americans remodeling their kitchens had an expected cost of $18,658 and an actual cost of $38,769 — probably higher now, but the point is, it was more than double what they estimated when they first decided to remodel. In planning bias, you expect things to go as planned, and in that expectation you ignore your own track record — how often in your life do things go exactly the way you planned them? — and you also ignore factors completely outside the plan, outside your track record, and outside your control. So it's a lot more humble, and a lot more realistic, to plan by considering other similar projects. If you are a railway planner, don't just look at your own estimate of things, what you think will happen and how many people it will serve; look at what's happened with similar rail projects all over the world, and then adjust accordingly: "It's going to cost more than I think, and it's not going to serve nearly as many people as I think. Now, is it still worth doing?" Maybe so — but realize that planning has got to consider other similar projects. Another suggestion made by people trying to overcome planning bias is this: when you've got a plan, take a moment — take more than a moment — and imagine that a year from now this has turned out to be a disaster. Describe how your plan turned out to be a disaster. If you do that, you start thinking of possibilities that could go wrong, and maybe, having done that, you will have helped to prevent the disaster. At any rate, if you never consider similar projects and never imagine disaster striking your plan, you're guilty of planning bias and you're headed for some trouble. Another related bias is control bias; in fact, planning bias comes partly because we think we can control things. Rolf Dobelli tells the story of a man who goes out into the town square every day wearing a red hat. He takes off that hat and waves it wildly for five minutes, then puts it back on and walks away. He does that day after day after day: goes into the town square, waves his hat for five minutes, then leaves. One day somebody comes up to him and asks, "Why are you doing that?" He says, "I'm keeping the giraffes away." "Well, there aren't any giraffes here." "I must be doing a good job." That's control bias: thinking that your actions are the decisive factor in the situation, thinking we can control things that we really can't. Have you ever heard of placebo buttons? You may not have heard of them, but you might be using them. You come to a traffic light, and you're getting a little antsy, and you want to cross — you want to change that light. And good, there's a button there. You press that button, and once you've pressed it, pretty quick it gives a red light to the oncoming traffic, you get the walk signal, and you walk right across. Aren't you glad you have that button?
Except that in many cases it's a placebo button. It does exactly nothing, except make you feel like you're in control so you don't get quite so antsy and restless while you wait for the traffic. It's deliberately put there for that purpose. Or you're at an elevator with a door-open button and a door-close button. You joyously push the door-open button so it will hurry up and open, and what do you know, it opens. Or you get in and push the door-close button: good, now it's closing, I can get going. Many times those are placebo buttons; they do exactly nothing. The elevator was going to close when it was set to close, no matter what button you pushed, and it was going to open whether you pushed the open button or not. The placebo button is there to keep you from getting too restless, and to reinforce your control bias: you feel like you're in control, so you feel better about the situation. Sorry if, next time you're in the elevator or at that traffic light, you don't have quite so much confidence in the handy little button.

At any rate, control bias is there in doctors assuming they can control more than they can, in economists thinking that by raising or lowering interest rates or holding some policy we will control the economy, and in politicians who think, "There's a situation; we've got to do something, and we've got to do it now." Doctors will say, "The person is sick; we've got to intubate now, we've got to prescribe this, we've got to do that surgery." Sometimes that's needed. Sometimes it would be wiser to let the situation develop a little further and maybe resolve itself. Sometimes politicians do more harm than good by doing something rather than nothing. But control bias makes us feel like we've got to do something.
Doing nothing? We just can't stand that. But many an investor would be wise to take the advice of the famed investor Jack Bogle: "Don't just do something, stand there." The old saying is "Don't just stand there, do something," but if you're an investor, a lot of the time: don't just do something, stand there. Leave your investments alone, because a lot of the time your attempt to take control right after the markets have gone down just means you're going to lose a lot of money. So control bias can literally cost you.

Daniel Kahneman writes that our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. "What a rude thing to say!" Maybe so, but Kahneman is one of the foremost cognitive psychologists in the world, with vast knowledge of the field, of how confident we are in what we think, and of how ignorant we really are. The Bible says, "The man who thinks he knows something does not yet know as he ought to know. But the man who loves God is known by God." We rely so much on what we think we know, but it's actually God's knowledge of us and his work in our lives that makes the bigger difference. The apostle Paul says, "Now I know in part," and only later will I know fully. And he says, "Now that you know God, or rather are known by God..." So sometimes we ought to be very cautious in our claims about what we know, and supremely in our claims of how much we know of God. You claim you know God? Rather, you are known by God, and you rest in that fact, not in the quality of your own knowledge.

So face it: you have many mental blind spots. You're wrong more than you admit. You understand less than you realize. Many of your plans and expectations fail, and much is beyond your control. Does that mean there's no use thinking anymore, because no matter how hard you think you'll be wrong a lot of the time? No use trying anything anymore, because the likelihood of failure is greater than the likelihood of success? Well, maybe so. But if you don't think at all, and don't learn to think harder, then you're going to be wrong even more than you otherwise would be. And if you never take any risks, then you don't just risk failure, you guarantee it. "You cannot understand the work of God, the Maker of all things... Sow your seed in the morning, and at evening let not your hands be idle, for you do not know which will succeed, whether this or that, or whether both will do equally well." You don't know. So if at first you don't succeed, try again. You don't understand everything, but try to understand what you can, and then humbly do your thinking and humbly do your risking.

Let's review a little of what we've learned about thinking fast and slow. System one, your intuition, is fast, involuntary, automatic, and effortless. System two is slow, deliberate, attentive, and effortful. I've been highlighting some of the glitches, the things that go wrong in our thinking, but we can be very grateful for the minds we have been given. Even though they're distorted by our own limits, and sometimes by our own sin, it's great to have both: intuition and the ability to think more carefully and deliberately. They have complementary roles. System one, your automatic reaction, and system two, your attentive thinking, work well together much of the time. System one saves you time and energy; pondering every belief and every decision would be too slow and exhausting. So system one takes care of a lot of that, and then system two can think more carefully and rationally and evaluate as needed. But beware of the biases. System one has built-in biases; it tends toward systematic, frequently repeated errors in certain types of situations, like the ones we've looked at. System two is sometimes blind to the biases of system one, and it accepts their errors rather than finding and fixing them.
And be aware of the tendency of system two to be a smug slug. System two feels like it's in charge, but it's lazy; it prefers the least effort possible. Some thoughts and actions that system two believes it rationally chose were actually prompted by system one. You think you really thought something through, but all along you were being led by your feelings, your intuitions, your likes and dislikes, your instantaneous reactions, and what your mind was really doing was finding excuses to do what your intuition wanted to do anyway. The quick, easy answers that feel right aren't always right. So don't be a slug: think harder. Don't be smug: think humbler. If a belief or decision is minor, go with your intuition; don't think too hard. If a belief or decision could have an important long-term impact, then think harder and think humbler, and all along train your thinking skills so that system two can work well when needed.

That's what logic and critical thinking are about: training your thinking skills to get at truth. Think harder, think humbler, and then strengthen your skills to see the structure of arguments, to see whether an argument is actually sound, whether it's valid, whether the claimed conclusion is actually supported by the premises that are presented. Get better at detecting cognitive biases and fallacies; we're going to look at more fallacies in later talks. Learn to use inductive logic and some of the tools of probability, not just whether you feel like something is likely, but actually looking at the data and then estimating more accurately how probable it really is. And learn to use deductive logic and proofs. These are some of the thinking skills we can learn when we study logic and critical thinking. I hope that you'll be able to think harder, think humbler, and think more accurately, realizing that we're never going to think perfectly, but we can certainly improve our thinking.
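The talk's advice to look at the data rather than trust a feeling, for example the reference-class approach to planning described earlier, can be sketched in a few lines of Python. This is only an illustrative sketch: the kitchen-remodel figures come from the talk itself, but the function names, the sample overrun history, and the $10,000 plan are hypothetical.

```python
# Reference-class planning: instead of trusting your own estimate,
# adjust it by the overruns seen in similar past projects.

def overrun_factor(estimated, actual):
    """How many times the original estimate the real cost turned out to be."""
    return actual / estimated

# The talk's example: expected $18,658, actual $38,769 (2002 kitchen remodels).
kitchen = overrun_factor(18_658, 38_769)
print(f"Kitchen remodel overrun factor: {kitchen:.2f}")  # roughly 2.08

def reference_class_estimate(my_estimate, past_factors):
    """Scale a naive estimate by the average overrun of similar projects."""
    average = sum(past_factors) / len(past_factors)
    return my_estimate * average

# Hypothetical overrun factors gathered from comparable projects.
history = [2.08, 1.6, 1.9]
plan = 10_000  # what I *think* my project will cost
adjusted = reference_class_estimate(plan, history)
print(f"Naive estimate: ${plan:,}; reference-class estimate: ${adjusted:,.0f}")
```

The point mirrors the railway example: your inside view ("what I think it will cost") gets corrected by the outside view (the track record of projects like yours), which is almost always more humble and more accurate.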



Last modified: Wednesday, March 2, 2022, 10:44 AM