For years, I’ve been working in a field where emotions are seen as dangerous. If you give in to emotions too easily, investing in the stock market isn’t for you. And I tend to agree with that: in investing, emotions make you do irrational things. There’s plenty of evidence that fear and panic make you sell when prices tumble, and that contentment and exuberance make you buy when prices skyrocket.
These irrational things that emotions make us do are called behavioral biases, and people like Daniel Kahneman and Richard Thaler won Nobel Prizes for discovering, studying, and isolating them. Yet, even though thanks to these people (and others) we now know about hundreds of different biases, we continue to fall into their traps as if we didn’t. These biases exist because we’re human, because we have a biological structure. As long as we’re made of flesh and blood and nerves and chemical reactions, emotions and biases are here to stay, and we will keep acting irrationally. That’s just the way we’re wired.
In the case of investments, we tend to project the current state into the future, thinking that it will keep going in that direction, possibly forever. We do this ignoring that, throughout history, what went down eventually came back up (and beyond the prior high), and vice versa. I’m talking about whole stock markets here, not single stocks, which sometimes go down and keep going down for a good reason. And if you do know that good reason, then maybe you should sell. But most of the time you don’t go looking for the good reason: you imagine one, a product of your intuition, never knowing whether it’s the real reason or not, and you get overwhelmed by emotions that in turn trigger irrational acts. And in this example, the irrational act would be not to sell. In short, it’s complicated (or a mess). The interaction between emotions and investing decisions is well documented, and if there’s one key takeaway from all the research and studies out there, it’s that humans should forget about stock market investing and let machines do it.
Machines do not have emotions, it seems. There’s a big debate on whether AI, at some point, will become sentient. Some say that, no matter how advanced, it will never become sentient and develop consciousness, feelings, emotions. They argue that true sentience requires a biological basis, and it looks like there’s nothing biological in AI. Others say that advanced generative AI could potentially reach a level of sentience comparable to human consciousness. As interesting and thought-provoking as this debate is, nobody really knows.
But as of today, 7 June 2023, machines do not have emotions. Despite all the hype around ChatGPT and the other new AI tools, there's no evidence that they have even a shred of consciousness or feelings. So they are perfect for investing in the stock market: we can feed them all the data in the world about companies and trends and prices and volumes and what have you, and not only do they eat and digest them in seconds, but they also learn from them and become better and better at finding patterns and insights, and ultimately at making buy and sell decisions. They’re already doing that: this is no science fiction. In fact, they’ve been doing that for years. Have they made their users shitloads of money? In some cases they have, but we can’t say for sure that “AI makes money on the stock market”.
Why is that? Didn’t I just make the case that emotions are dangerous for stock market investing? This is not a piece on investing, but I’ll just say this: to make money in stocks, you have to buy low and sell high, so you need someone on the other side of the trade who sells to you low and buys from you high. In other words, you need to trade with people who give in to emotions and act irrationally. Machines, for all the things I wrote above about emotions and stuff, started out with a high probability of being on the right side of the trade. When there were few of them and the rest of the market was made of humans, it was easy to make money. But as machines grew in number and increasingly replaced humans at this game, the probability of finding another machine on the other side of the trade increased. And when that happens, who’s going to make money? Hard to say. But you get the picture.
When the whole world of investing is made of 100% machines -- something not even that far-fetched -- I don’t know what’s going to happen. Maybe money will no longer be made. Who knows. So, yes, emotions are dangerous in investing and the solution is to hand the job over to machines. But then maybe we’ll get to a point where we will regret having eliminated emotions and we’ll want them back, or else nobody will be able to make money in the stock market anymore. And then what? Will we destroy the machines and put humans back on the job? This could actually be the subject for a finance-science fiction movie titled “The monsters who made money-making vanish in the fog of AI”. Or something along these lines.
But I digress. The thing is, having emotions and acting irrationally and conducting an imperfect existence amid uncertainty and risk is as suboptimal as it is essential to humanity. And sometimes we don't fully understand how essential something is until it’s no longer there.
Emotions may be dangerous in investing, and maybe in other fields too. They have been taken advantage of by prophets and leaders and politicians throughout history. Without emotions and the tools to manipulate them (language and actions), religions and movements and ideals and causes wouldn’t have been born, let alone thrived. Yet, emotions are necessary.
Irrational behaviors have been demonized as risky and potentially destructive. But too much rationality is deleterious. By making everything turn out exactly as expected, absolute rationality kills surprise. And without surprise, without accelerated heartbeats, without the unexpected, I don’t know that it’s worth living this life. They say uncertainty is bad and unsettling and risky and we should strive to make everything as certain as possible, but can you imagine a world where everything is certain? Can you imagine a world where we all have an expiration date written somewhere on our skin, where on the day we’re born we know with absolute certainty when we’re going to die? The day and time and what the weather will be like on that very day, at that very time? And who will be there witnessing our “expiration”, and what they will wear and what they will look like? Can you imagine knowing with absolute certainty what music will be released next year, or in ten years, or in twenty? And what art will be made? And the final score of every football game?
And so at work I have to control emotions if I want to avoid disasters. I have to disregard them. If they come along, I have to pretend I don’t know them, and turn to the other side. I have to be like a machine.
But then I write and read and take photos and listen to and play music, and watch movies and go to museums and have conversations and walk at night. And in all these instances emotions are as vital and necessary as the air I breathe. Sometimes they’re hard to decipher, difficult to channel, and overwhelming. But they’re there, and I want them to be there, and if they’re not there it’s not right and I worry. So I live my life going from a state where I worry if emotions drive my behavior to one where I worry if they don’t.
This creates tension and release, alertness and abandon, vulnerability and virtuosity. It’s the Dr. Jekyll and Mr. Hyde of my existence.
Welcome to all new subscribers! I’m glad you’re here. Please leave a comment, I’d love to hear your thoughts on this piece and this substack and the universe and the future of mankind and what have you.
If you liked what you read, it would mean the world to me if you shared it.
And if you’re not yet a subscriber and just stumbled upon this page because someone shared it or by divine intervention, and you liked it, please do subscribe to receive my writing every Wednesday in your inbox.
Funny that I never knew or wondered about your profession. You write with the freedom of a retired person, not a high-stress trader. James Bailey wrote a similar piece about the value of emotion in relationships vs investing that you might enjoy on the subject. https://onmoneyandmeaning.substack.com/p/the-invisible-authors-the-visible
I've always found the research on behavioral biases somewhat difficult. Not that I don't agree they exist, but knowing about them doesn't really seem to make it any easier to avoid them. As far as I can tell, even in the limited field of investing (curious what kind of investments you make, maybe we should talk about that), making money doesn't seem to be about simply removing emotion from trades. There's still value in "gut instincts" that are largely products of emotion. Then again, I've only worked in Private Equity for a short time, so my own experience is quite limited.
I also have issues with the framing Kahneman and Tversky used, which treats biases as things to always eliminate for optimal performance. Not sure if you've looked at much 4E cognitive science, but I prefer their framing. The biases (and System 1 thinking) are bad for rationality-based endeavors (investing, academic activities, etc.) but are the basis for experiences like wonder, awe, and personal growth. Ironically, these biases seem to be the basis by which we can change our relationship as agents with our arena, and it's this transcended relationship which seems to allow new insights that often lead to better outcomes in rational pursuits. At least that's how I've read it.
Interesting article. Thanks for writing it.