Our Guest Kenneth Cukier Discusses
Go All In on AI: The Economist’s Kenneth Cukier on AI's Experimentation Era
If AI is becoming a playground for experimentation, are today’s organizations bold enough to explore it or are they still too afraid to try?
On this episode of Digital Disruption, we are joined by Kenneth Cukier, Deputy Executive Editor at The Economist. Kenneth is the bestselling author of several books on technology and society, notably including “Framers,” on the power of mental models and the limitations of AI, with Viktor Mayer-Schönberger and Francis de Vericourt, and “Big Data: A Revolution That Transforms How We Live, Work and Think,” also with Mayer-Schönberger, a New York Times bestseller translated into over 20 languages that sold over two million copies worldwide. It won the National Library of China’s Wenjin Book Award and was a finalist for the FT Business Book of the Year. Kenneth also coauthored a follow-on book, “Learning with Big Data: The Future of Education.” He has been a frequent commentator on CBS, CNN, NPR, and the BBC, and was a member of the World Economic Forum’s global council on data-driven development.
Kenneth has spent decades at the intersection of AI, journalism, business strategy, and global policy. In this conversation, he sits down with Geoff to share candid insights on how AI is reshaping organizations, leadership, economics, and the future of work. He breaks down the real state of AI, what’s hype, what’s real, and what it means for workers, leaders, and companies. Kenneth explains how AI is shifting from automating tasks to expanding the frontier of knowledge, why today’s multi-trillion-dollar AI investment wave is both overhyped and underhyped, and how everything from healthcare to management is poised to transform. This episode explores why most companies should treat AI as a playground for experimentation, how The Economist is using generative AI behind the scenes, the human skills needed to stay competitive, and why great leadership now requires enabling curiosity, psychological safety, and responsible innovation. Kenneth also unpacks the growing “AI-lash,” the limits of GDP as a measure of progress, and why the organizations that learn fastest, not the ones that simply know the most, will win the future.
Whether you’re a business leader, technologist, strategist, or simply trying to make sense of the AI revolution, this conversation is packed with clarity, nuance, and practical wisdom.
00;00;00;07 - 00;00;27;23
Speaker 1
Hey everyone! I'm super excited to be sitting down with Kenneth Cukier. Kenneth is deputy executive editor covering AI for The Economist and a New York Times bestselling author whose book Big Data: A Revolution That Transforms How We Live, Work and Think has sold over 2 million copies. What I love about Kenneth is that he sits at the intersection of business, research, and journalism, having worked as a research fellow at Harvard and Oxford, where he led seminars on AI and business.
00;00;27;25 - 00;00;48;08
Speaker 1
He travels the world exploring what's next in AI and is a seasoned speaker for forums like TEDx, Davos, NATO, Google, and the US State Department. I want to ask him where he sees hype versus reality in AI. What can this technology really do? Who has a chance to get ahead? And what does it mean for us as workers, leaders, and people?
00;00;48;10 - 00;00;52;20
Speaker 1
Let's find out.
00;00;52;23 - 00;00;53;13
Geoff
I'm here with
00;00;53;15 - 00;01;13;19
Geoff
Ken Cukier to talk about all things AI today and the future of AI. So, Ken, I mean, this is an area that you've been following, whether it's data, algorithms, or AI, as a journalist for a pretty long time. And, you know, we find ourselves in a position now where we've got a $3 trillion and rising bet on this technology.
00;01;13;27 - 00;01;23;03
Geoff
And so I'm curious from your perspective, what can this technology do? What can it not do, and how does that impact your outlook on how all of this is going to play out?
00;01;23;06 - 00;01;44;29
Kenneth Cukier
You know, some people say, well, is AI hyped or is it under-hyped? It's a little bit of both. In fact, it's overhyped simply because there's so much investment, as you pointed out, going into it. $3 trillion over the course of basically four years is exceptional in the history of technology and innovation.
00;01;45;06 - 00;01;53;25
Kenneth Cukier
We've never really seen that before in corporate R&D, just to put a point on it. So it's unprecedented that you're getting that much investment.
00;01;53;27 - 00;02;07;07
Kenneth Cukier
You could point to the railway mania, but it wasn't as compressed into such a timeframe, and that was the 1800s. However, the case for it being under-hyped is also fairly strong.
00;02;07;13 - 00;02;11;16
Kenneth Cukier
So what is it that AI is and what can it do?
00;02;11;18 - 00;02;34;08
Kenneth Cukier
Let's start with what humans are and how we make decisions. We're cognitive creatures. We learn and we mature, and then we actually find some economic value in making lots of decisions. If we can make repeatable decisions, that's great, but often these are going to be one-off decisions. And we're very stunted by our temporality.
00;02;34;08 - 00;02;56;21
Kenneth Cukier
We are going to, you know, degrade over time and then die. And we're also stunted in the sense that we only have five senses and we have one brain, and we need to sleep at different points. Now, AI can exceed human cognitive capacity; computers were able to do that from the 1940s. So there's nothing really novel about that.
00;02;56;21 - 00;03;12;04
Kenneth Cukier
We're still understanding the immense power of that. But the real gem of AI is that it exceeds our ability not only to understand, but to learn new things, to pierce the frontier of knowledge.
00;03;12;07 - 00;03;22;20
Kenneth Cukier
What I mean by that is the world is very complex. There's a lot of covariance and we tend to dumb it down and simplify it so that we can understand it.
00;03;22;23 - 00;03;48;15
Kenneth Cukier
But if we had a mechanism, a tool, whereby we could exceed what is known: sensors that can detect things that the human eye and ear cannot, but also the intricate and elaborate nature of how reality works, which far exceeds our ability to suss out and even to contemplate. That's going to be a beneficial win. So I'll give you an example of what I mean by that.
00;03;48;21 - 00;03;52;01
Kenneth Cukier
By looking at one piece of research from several years ago,
00;03;52;04 - 00;04;18;15
Kenneth Cukier
Google wanted to identify whether you could look at retina scans, the ordinary sort of scan that you'd get at a shop on a main street for eyeglasses, to determine whether you could identify cardiac events, whether a heart attack or cardiac arrest. And sure enough, the algorithm that they created, by putting in lots of data and then analyzing it, was able to do that. It was also able to identify whether the person was a smoker or not.
00;04;18;22 - 00;04;36;27
Kenneth Cukier
At about 85% accuracy, which is interesting, through the retina alone. They were able to identify the age of the person, plus or minus five years, at about 75%, which is much better than chance, right? Which is great. And those two features do affect whether someone's going to have a heart attack or not, for fairly obvious reasons: age and whether you're a smoker.
00;04;37;00 - 00;05;01;03
Kenneth Cukier
But it was also able to identify the sex of the individual, male or female, to around 97% accuracy. And what was stunning about that is that it was never even known in the field of ophthalmology whether that was possible at all. So AI was able to identify something in the structure of the scans that humans didn't even have a theory for.
00;05;01;03 - 00;05;37;13
Kenneth Cukier
We wouldn't have known to look for it, although the researchers did try to identify it, and it turns out to be incredibly accurate. That one finding speaks for the broader way in which AI is going to go out through society: first, doing the classic machine learning task, which is identifying new items of knowledge. That hasn't happened before. But we were then able to suss out why. For example, when AlphaGo was able to beat the world's best players, those players were able to look at how it made its decisions, and they could reason through and come up with a mental model, a schema, of what was going on.
00;05;37;15 - 00;05;58;14
Kenneth Cukier
But pretty soon, just like with the retina scan, which is just the beginning of something bigger, it's going to exceed our ability to understand what is going on. And we're not even going to be able to reason through how it reached its conclusion. But we will know that it's going to be valid. Now, in the case of 97%, remember
00;05;58;16 - 00;06;19;17
Kenneth Cukier
It's not a slam dunk, right? In most instances, it will be able to identify the sex of the person, but in a blurred area of 3%, it will misidentify it. What will we do in society when that's misidentifying cancer, or the rocket explodes on launch? Those are new questions that we'll have to face as we start using AI
00;06;19;17 - 00;06;21;11
Kenneth Cukier
More and more.
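The accuracy figures Kenneth cites (85%, 75%, 97%) only mean something relative to what you would get by guessing. A minimal sketch of that comparison, using invented toy labels and hypothetical model predictions (none of this data comes from the actual Google study): the model's accuracy is set against the best no-information baseline, always guessing the majority class.

```python
from collections import Counter

def accuracy(preds, labels):
    """Fraction of predictions that match the true labels."""
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def chance_baseline(labels):
    """Best you can do without looking at the scan: always guess the majority class."""
    return Counter(labels).most_common(1)[0][1] / len(labels)

# Toy ground-truth labels and hypothetical model outputs (illustrative only).
labels      = ["M", "F", "M", "F", "M", "F", "F", "M", "M", "F"]
model_preds = ["M", "F", "M", "F", "M", "F", "M", "M", "M", "F"]

print(accuracy(model_preds, labels))   # 0.9
print(chance_baseline(labels))         # 0.5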
00;06;21;14 - 00;06;28;12
Geoff
So, you know, with all of that in mind and, you know, specifically tying it to the societal piece, it feels like
00;06;28;12 - 00;06;45;12
Geoff
people are struggling to get a singular narrative about how this is going to impact their lives and their livelihoods, and for good reason. Right. There's obviously a ton of uncertainty here. And, you know, based on everything you've heard, Ken, I'm curious, and I know you're a journalist
00;06;45;12 - 00;07;06;10
Geoff
So you, you know, try to maintain some level of neutrality, but, you know, on a scale from, you know, full excitement to, you know, complete panic, you know, how do you feel looking out over the horizon at some of these new capabilities and, you know, the potential disruptions they can have on our society? And,
00;07;06;13 - 00;07;19;22
Geoff
is there anything that you think we as consumers, and also the creators of this technology, need to do to make sure that this ends up being a force for good?
00;07;19;24 - 00;07;34;23
Kenneth Cukier
I guess there's two elements to this. The first one is that nobody really knows what's going to happen. Everyone has a thought and a theory. And often, you know, if someone says jobs are going to be, you know, eliminated and others say that, no, that's not possible, one side is going to be correct and the other side's going to be wrong.
00;07;34;26 - 00;07;55;18
Kenneth Cukier
The second element to that is we don't do very well with change, because we tend to extrapolate from what we have today and think that tomorrow is going to just look like today, but with that one variable that's changed. And that never happens, right? You've never isolated down to a single variable. When you have a change, everything else changes as well.
00;07;55;21 - 00;08;26;14
Kenneth Cukier
So let me explain what I mean by that in terms of health care. Imagine that the cost of diagnosing an ailment, say cancer, a biopsy, goes from what today might cost $1,500 and be charged at $7,000 for a simple test, to one penny. Right? Just like the penny post in England in the 1800s for sending letters, you've got a penny scan, because it's image recognition and data collection.
00;08;26;14 - 00;08;49;15
Kenneth Cukier
I'll explain how that works in a minute. So suddenly, when that happens, the classical economists would say all these people are going to be out of work. But then they would look at it from a consumer welfare standpoint, which is to say the benefit, the value of it to the individual. And they would say, well, it's going to be great for the individual, because they're now going to be paying a penny, right?
00;08;49;15 - 00;09;32;26
Kenneth Cukier
Not $1,000. But it's more subtle still; there are a lot of secondary effects. Imagine that the data is now collected every time we flush the toilet, and it's just the natural cost of doing business, which is added to your bill. Now you're identifying that some people are going to get cancer and some people aren't. And you're identifying new trends signaling the progression of disease, being able to spot incidences of cancer far earlier than before, because you're learning new knowledge about how the cellular structure changes and the biomarkers that identify cancers. Suddenly you're able to prevent cancer, rather than having someone knock on the door and feel a
00;09;32;26 - 00;09;59;20
Kenneth Cukier
lump and say, hey, is this a problem? And then it's too late to act. So suddenly, from a consumer welfare standpoint, everyone is better off and healthier. There may be as many jobs or more jobs in health care, but they're earning money in different ways. Okay, let's say there's not more money in the health care system. Let's say, actually, if you go from $1,000 to a penny, that in fact you're paying more in health care for other things, but not in that one domain.
00;09;59;22 - 00;10;39;14
Kenneth Cukier
Keep in mind that GDP, the way that we measure the strength of the economy, the value of the economy, only measures market transactions. So if something that you used to pay for is happening that you no longer pay for, GDP goes down. That was the great phenomenon between 1995 and 2015 in the US economy, when they looked at the value of the internet for the economy. You no longer had to print out a PDF of your blueprints before 4:30 to then race uptown in Manhattan to hand it to FedEx by the 5 p.m.
00;10;39;14 - 00;11;00;28
Kenneth Cukier
pickup, to then ship it to California overnight for them to receive it. All of that was gone. The cost of the printer, the cost of the taxi, the cost of FedEx, and your time: gone. What was the value of the internet to the US economy? In this instance, GDP went down, because it was free.
00;11;01;00 - 00;11;21;12
Kenneth Cukier
Everyone benefited because you could click a button and get it in milliseconds in Santa Barbara. Right. So when we think of AI, things will absolutely change. We don't know exactly how, but we have to remember there's going to be second and third order effects, not simply that first order effect. I would look for these changes to be quite profound.
00;11;21;18 - 00;11;32;01
Kenneth Cukier
I don't think we're going to be out of work, but I think most jobs, or I should say almost all jobs, will definitely change.
00;11;32;04 - 00;11;59;02
Speaker 1
If you work in IT, Infotech research Group is a name you need to know. No matter what your needs are, Infotech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Infotech supports you with the best practice research and a team of analysts standing by ready to help you tackle your toughest challenges. Check it out at the link below and don't forget to like and subscribe!
00;11;59;05 - 00;11;59;15
Geoff
so
00;11;59;15 - 00;12;21;29
Geoff
if I'm, you know, interpreting that correctly, you see a huge benefit on the consumer side, whether it's in your work, whether it's for, you know, your health or your personal life in terms of that, the cost of what you're getting or the quality of what you're getting. And as an aside, it sounds like you're not a super big fan of GDP being a particularly accurate measure of wealth.
00;12;22;01 - 00;12;23;23
Geoff
Is that. I mean, I just came
00;12;23;25 - 00;12;25;01
Kenneth Cukier
All of that's true.
00;12;25;04 - 00;12;25;08
Geoff
was,
00;12;25;15 - 00;12;26;11
Kenneth Cukier
Yeah,
00;12;26;13 - 00;12;26;21
Geoff
Yeah,
00;12;26;26 - 00;12;28;27
Kenneth Cukier
all of that's true. I mean,
00;12;28;29 - 00;12;32;06
Kenneth Cukier
I'm not. But I'm not totally anti-GDP.
00;12;32;09 - 00;12;44;19
Kenneth Cukier
It's useful in certain domains but not in others. You have to recognize its limitations. But I'm definitely pro-consumer for how AI works its way through the economy and makes our lives better.
00;12;44;19 - 00;13;12;02
Kenneth Cukier
It already is, Google search being one example, translation being a second one. But most people don't realize that a typical cell phone is loaded with AI. For one thing, it's actually managing the power of the battery. You're going to have 80 apps open, because you're never closing them, but a model predicts which ones you're not going to use and powers those down, while keeping alive the ones it thinks you will use.
00;13;12;07 - 00;13;22;08
Kenneth Cukier
So they manage the power consumption and get something like 20% better battery life, just from an AI system operating in the background, managing the battery.
00;13;22;10 - 00;13;24;26
Geoff
Right. So, so when I think about these use cases and
00;13;25;01 - 00;13;38;04
Geoff
it's no surprise that in all of these examples the notion of data has come up, needing to have this information about the consumer or about the system. And this is an area that you've been covering for quite a long time now.
00;13;38;04 - 00;13;46;28
Geoff
And so, you know, from your perspective, is this all sort of a continuation of the big data revolution we saw 10 or 15 years ago?
00;13;47;05 - 00;13;52;09
Geoff
You know, are there discrete chapters or is this like a completely, you know, different track that we're on?
00;13;52;12 - 00;14;13;15
Kenneth Cukier
It is absolutely a continuation. In fact, the term big data was always a shorthand, and it was a shorthand for machine learning, and the term machine learning was a shorthand for artificial intelligence. And the reason why they used these other terms is that back in the years 2000 to 2010, you couldn't actually use the term AI, because it was considered laughable.
00;14;13;19 - 00;14;37;23
Kenneth Cukier
The technique didn't work because it was actually a different approach, so they had to come up with a different term. And so big data was just like, hey, nobody talks about 802.11, but that's Wi-Fi; that's the Wi-Fi standard. So big data became the sort of shorthand for what was going on. The canonical text, which was written by Stuart Russell and Peter Norvig of Google, says it all.
00;14;37;29 - 00;14;50;04
Kenneth Cukier
This is the textbook that all the entry-level, undergraduate students learning AI have to read to advance, and it's called "Artificial Intelligence: A Modern Approach."
00;14;50;07 - 00;15;06;13
Kenneth Cukier
Why that subtitle? Why a modern approach? Well, the point is that in the past, the field grew up by having these almost hand-coded instructions, the symbolic mechanism of basically giving it instructions so that it could actually do things.
00;15;06;15 - 00;15;26;28
Kenneth Cukier
The shift that took place, from Hinton, LeCun, and Yoshua Bengio, the three considered the godfathers of AI, was to invert the process and say, no, no, don't instruct the computer what to do. Give it lots of data, and it will infer what the right answer is from its training data, from its training set.
00;15;27;00 - 00;16;06;27
Kenneth Cukier
That technique was machine learning. That then became deep learning, when you had multiple layers. Of course, Hinton gave us backpropagation, so it learned from what it had actually learned; it's a kind of recursive learning that was phenomenal. Then it was off to the races: by 2012 with the GPU, ramping up to 2015 with generative adversarial networks, which can actually create things using the same model, not to predict but to generate; of course, the transformer architecture in 2017; and then 2022 with large language models that have captured everyone's attention.
00;16;07;00 - 00;16;20;14
Kenneth Cukier
But it all comes back to the basic fundamental method, which is: give it data. And the doctrine is, more data, better answer.
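The inversion Kenneth describes, from hand-coding instructions to letting the machine infer the rule from examples, can be shown in miniature. A hedged sketch with invented toy numbers: instead of a programmer writing a rule like `if x >= 6`, the cut point is learned from labeled training data.

```python
def fit_threshold(xs, ys):
    """Learn a 1-D decision boundary from examples instead of hand-coding it."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(xs)):                      # candidate cut points
        preds = [1 if x >= t else 0 for x in xs]   # classify with this cut
        acc = sum(p == y for p, y in zip(preds, ys)) / len(ys)
        if acc > best_acc:                         # keep the cut with fewest errors
            best_t, best_acc = t, acc
    return best_t, best_acc

# Toy training set: feature values and their labels (all invented).
xs = [1, 2, 3, 6, 7, 8]
ys = [0, 0, 0, 1, 1, 1]
t, acc = fit_threshold(xs, ys)
print(t, acc)  # 6 1.0
```

"More data, better answer" falls out directly: with more labeled points near the boundary, the learned cut is pinned down more precisely; with too few, many cuts fit equally well.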
00;16;20;16 - 00;16;30;12
Geoff
Right, right. That's sort of a corollary of the garbage-in, garbage-out thing: the answer is only going to be as good as the data that it gets. And so, you know, the reason I was curious about that is,
00;16;30;19 - 00;16;33;06
Geoff
you know, big data, people have been talking about it for a long time.
00;16;33;06 - 00;16;52;16
Geoff
People have been, and I'm sure you faced this in some capacity, but around that time, there were organizations 15 years ago or more that recognized the power of big data and machine learning, and they were collecting and analyzing reams of data. Not just the Googles of the world, but all sorts of enterprises.
00;16;52;18 - 00;17;15;22
Geoff
And there were organizations that just kind of missed the boat on that one, right? That just, you know, didn't necessarily lean into it. And, and so, you know, it seems like and I'm curious on your perspective, it seems like a lot of the organizations that were well positioned at that time are now like collecting dividends. And they're only becoming, you know, more powerful and getting farther ahead in this era.
00;17;15;25 - 00;17;39;17
Geoff
And so my question is, with some of these new tools, do you predict we're going to continue to see the same winners as all along, and they'll just get farther ahead? Or for organizations that missed the boat the first time, is there an opportunity to kind of leapfrog some of the traditional methods here and start to take advantage of these technologies?
00;17;39;19 - 00;18;02;05
Kenneth Cukier
So it's a big question, and I think we're still in the early days of the data revolution. I'm really struck by that. Every now and then I get a knock on the door to give a talk, or someone says they like the book, from an organization that's just learning about data and AI, as if I'm an artifact from 15 years ago when I was writing about this.
00;18;02;08 - 00;18;22;19
Kenneth Cukier
How are these people still in business? What is going on in the economy? But it takes a long time for these things to seep in. We see this in the stock market, of course. There's the phenomenon of the best versus the rest, where the best companies pull away. This is the Erik Brynjolfsson thesis; there's been lots of research, but Brynjolfsson, formerly of MIT, has some of that, actually.
00;18;22;19 - 00;18;50;02
Kenneth Cukier
Now at Stanford, he has some of the best papers on this, showing that the companies at the top in their field are not like 15% better or 25% better; they're like three times or eight times better in terms of capital allocation and return on investment and stock market performance than the average of their industry, and certainly much more than their laggards.
00;18;50;09 - 00;19;25;01
Kenneth Cukier
And of course, we know with the stock market in 2024 and 2025 that if you take out the AI-related shares and big tech shares and look at the stock market as a whole, it has basically plateaued; it's really flatlined. All of the gains in most of the markets have come from the AI companies or the technology companies, including defense tech, that really pulled out ahead and gave the lift to the whole market.
00;19;25;03 - 00;19;48;14
Kenneth Cukier
So we're already seeing these incredible gains. However, I think there's a sort of gravity to a business. Think about Walmart: it was 20 years of Amazon before Walmart woke up and said, we're stuffed unless we do something. And they're back, and they're doing things. They've got a great brand; they've got a very capillary network.
00;19;48;16 - 00;20;08;24
Kenneth Cukier
The clicks-and-bricks model, where people can click and collect, turned out to be a very good one. People wanted to see it in a showroom, then order it online and collect it, so they could touch it and feel it. Amazon can't do that, but a lot of retailers can. So it turns out that there is more of a constancy and a staying power for companies.
00;20;08;24 - 00;20;28;13
Kenneth Cukier
The other thing about being a real-life company is that if you make the investment, you can get the data in a way that new companies cannot. And if you cared even a little bit about data in the past, you can go back to your troves. You have to clean it, and it's very expensive, but you can actually learn from that data.
00;20;28;19 - 00;20;48;16
Kenneth Cukier
So I'm actually mesmerized by the number of companies that have stayed in the game, even though they were so slow out of the gates, but are now in AI, making some of those bets. And of course, there's a whole ecosystem of service providers that is willing to work with these companies to get them where they need to go.
00;20;48;16 - 00;21;12;12
Kenneth Cukier
These companies on their own cannot hire AI engineers. But it's just like the premise of business consulting: you're not going to get the best strategist on staff, because you can't afford it for a one-and-done project, so you amortize that cost over all of the Fortune 1000 by choosing McKinsey or BCG or Bain, and they host that person, who is then allocated to you on a piecemeal basis.
00;21;12;15 - 00;21;30;19
Geoff
So when you, when these people knock on your door and, you know, kind of say help, you know, I should have done this 15 years ago. But I'm ready to start today. You know, what do you tell them. You know, what's your advice for how to get started, how to think about this and how to get back in the race, you know, so to speak?
00;21;30;19 - 00;21;36;24
Geoff
Is it just like, oh, hire a big consultancy and you're on your way? Or what's the guidance you give them?
00;21;36;27 - 00;21;50;11
Kenneth Cukier
Yeah. So, first thing: as you pointed out, I am a journalist, I'm not a consultant. So this is sort of a big-picture take; I don't want to go into the weeds of what a given company should do, because that's just not where I play.
00;21;50;13 - 00;22;13;14
Kenneth Cukier
But what I would say is, okay, you're behind; you know that. Just be honest with yourself and your team if that's the case. Secondly, this is about a culture. It's about having a data mindset. It's about seeing the world and all things in it through the lens of data, and not what a vendor says.
00;22;13;15 - 00;22;44;26
Kenneth Cukier
This is what we can collect and this is what we can analyze. You bring your humanity to the table, and bring your ground truth and your deep understanding of the customer, the customer's pain points, what you do, and the values that emanate from you that you want to put into this world. And then you're going to need to collect probably a very different data set, and you're probably going to need a more bespoke system, because what was right for an e-commerce merchant in that country, or at that time, or in that sector, is not right for you.
00;22;44;29 - 00;23;03;09
Kenneth Cukier
So, I mean, I know of one content company that wanted to look at its least-performing articles and just snip them off. Its least-performing articles are also the most expensive to do, as you can imagine, because they're going to be niche articles.
00;23;03;16 - 00;23;03;25
Kenneth Cukier
Right?
00;23;04;03 - 00;23;22;17
Kenneth Cukier
If it's in finance, it would be something obscure about, say, sustainable bank performance, or bank living wills related to stress testing according to the Basel banking requirements. Right. Things that are just ridiculously niche. I said, well, wait a minute.
00;23;22;20 - 00;23;47;16
Kenneth Cukier
Why don't you attach a customer lifetime value, an LTV metric, to every single article, and identify on a per-article basis what the value of the article is relative to the subscribers on a lifetime basis? What you might find is this sort of blockbuster effect, where everybody likes the sexy article about Facebook launching a new cryptocurrency.
00;23;47;18 - 00;24;07;23
Kenneth Cukier
Right, because that's just what everyone is sort of interested in; it's in the air. But those people come and go off your rolls; that's not that important. You might find that those articles that get very little traffic, and that are very costly to do, are where you have your most prestigious and most durable subscribers.
00;24;07;29 - 00;24;17;20
Kenneth Cukier
In fact, you don't want to get rid of them; you don't even need more of them. You just want to keep that as it is. The success of a company often
00;24;17;20 - 00;24;25;02
Kenneth Cukier
is as much by luck as by intention. Most serious business people understand that they are inheritors of something.
00;24;25;06 - 00;24;27;16
Kenneth Cukier
And you don't really know what works and what doesn't.
00;24;27;23 - 00;24;55;17
Kenneth Cukier
It's always, you know, every day is a new product-market fit, because the world changes and people's tastes change. So I think you want to be very cautious about what you do. I should say there's a political philosophy behind this, from Edmund Burke in the 1800s. It's called conservatism, where it's not about just preventing progress; you reform by doing the most careful and circumscribed thing.
00;24;55;20 - 00;25;11;21
Kenneth Cukier
And that's what I would argue businesses need to do: apply data, but be very careful, and don't just go with abandon, thinking that you can do absolutely everything. You want to be very judicious in how you apply your data.
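The per-article LTV idea Kenneth describes can be sketched as a simple attribution: split each subscriber's lifetime value evenly across the articles they read, then sum per article. Every name and number below is hypothetical, and real attribution models weight by recency, engagement, and so on; this is just the shape of the calculation.

```python
def article_ltv(reads, subscriber_ltv):
    """Attribute each subscriber's lifetime value evenly across the
    articles they read, then total it up per article (even-split model)."""
    totals = {}
    for sub, articles in reads.items():
        share = subscriber_ltv[sub] / len(articles)  # this subscriber's value per article
        for a in articles:
            totals[a] = totals.get(a, 0.0) + share
    return totals

# Hypothetical data: the niche Basel stress-test piece vs. the viral crypto story.
reads = {
    "durable_sub":  ["basel_stress_tests", "bank_living_wills"],
    "casual_sub_1": ["facebook_crypto"],
    "casual_sub_2": ["facebook_crypto"],
}
ltv = {"durable_sub": 2400.0, "casual_sub_1": 60.0, "casual_sub_2": 60.0}

print(article_ltv(reads, ltv))
# {'basel_stress_tests': 1200.0, 'bank_living_wills': 1200.0, 'facebook_crypto': 120.0}
```

The low-traffic niche articles end up carrying far more lifetime value per article than the viral story, which is exactly the blockbuster-effect reversal described above.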
00;25;11;24 - 00;25;19;23
Geoff
Well, and you know, hearing you say that it feels like the same principle probably applies to, you know, any adoption of new AI technologies as well. And
00;25;19;26 - 00;25;31;24
Geoff
are you finding that organizations have a temptation to just throw AI at everything? And are people being diligent enough, with that conservatism approach of: where are the risks,
00;25;31;24 - 00;25;38;03
Geoff
what are the crown jewels, and how do we make sure you're taking the right approach?
00;25;38;05 - 00;25;44;11
Kenneth Cukier
So, Geoff, that's actually slightly different, quite interestingly. It's a great question, but it's a different question with a different answer.
00;25;44;15 - 00;26;02;29
Kenneth Cukier
The thing about data, the data shtick, if you will: that's essentially a century-long endeavor. It's basically the foundation of statistics in the sciences, which dates from the late 1800s, really in a substantial way, to today.
00;26;03;00 - 00;26;26;11
Kenneth Cukier
So when I'm asking business people to apply data to their business problems, I'm basically saying: adopt this technique that scientists used at scale between 1930 and 1970, when the sciences became very mathematical. You should just do it as well, because now the tools are accessible.
00;26;26;11 - 00;26;29;20
Kenneth Cukier
Excel has all this power you never use, as just one small example.
00;26;29;23 - 00;26;53;26
Kenneth Cukier
For AI, for where we are today, particularly if it's an LLM as the technology, I actually believe: go for it. Like, literally do everything. If you've got 15 ideas, and five of them are good, and five of them are weak or middling, and five of them are pretty laughable, don't cull them; do all 15.
00;26;53;26 - 00;27;14;28
Kenneth Cukier
If it costs too much, do it anyway. Find a way to make it happen. And the reason why is, this is a period of experimentation. Nobody really knows what's going to work or not. The famous study from the fall, from a division related to MIT but not really at MIT, spoke about 95% of corporate pilot projects not working.
00;27;15;01 - 00;27;31;05
Kenneth Cukier
I wasn't surprised by that. I was really surprised that, like, wow, 5% are working. That's great. The reason why is, by the time it's a corporate project, it has the buy-in from the CEO, it's top-down, it has a budget, the General Counsel was involved, IT people are involved. That takes a lot of effort, right?
00;27;31;05 - 00;28;06;08
Kenneth Cukier
That's a real project. That's not where I'd look for the value of AI. What I'd do is ditch the corporate-hierarchy, top-down, "try anything, I'm panicking" approach of management and look at the atomic unit of the individual: not 9 to 5 when they're in the office, but 5 to 9 when they're at home, when an executive has to get something done for the next day and is just stuck, and he or she has to, say, use AI in a creative way, in a magic way they didn't think they could before, and they optimize around their own work needs.
00;28;06;10 - 00;28;23;29
Kenneth Cukier
That is where AI is going to be experimented best. That's where the learnings are then going to scale and go up to the organization. So I'm very bullish on organizations, particularly the atomic unit of the individual, the executive who comes up with an idea from the bottom up as well as the top down and says, let's try it all.
00;28;24;02 - 00;28;46;04
Kenneth Cukier
This is an incredibly febrile moment. And it would be ridiculous for someone to apply the same rules of business, which are very prudent and sober and patient and careful, at a time when there's so much happening, and the innovation cycles can be very quick, and the cost of innovating doesn't have to be that expensive.
00;28;46;06 - 00;29;02;25
Geoff
So you see it more as, you know, you're encouraging, if I'm using the analogy right, it's almost like a petri dish: just try everything. We need to experiment as broadly and as grandly as possible, and, you know, we'll know the winners when we see them. But let's throw it all at the wall for right now.
00;29;03;00 - 00;29;03;17
Geoff
Is that fair?
00;29;03;17 - 00;29;33;03
Kenneth Cukier
Totally. Yeah, totally a playground, right? That might be the better example: just play with it. At The Economist, that's what we have done. And our editor was very, you know, very prudent and responsible and careful and judicious and nervous about all the things that we were doing. And then in one of the meetings, the editor basically blurted out: look, I know that most of these things are not going to pan out, but we'll be ready when we have something that will pan out.
00;29;33;05 - 00;29;51;06
Kenneth Cukier
And I thought, yep, that's exactly it. That is exactly it. Let's go. Let's play. Let's lose money, let's experiment. But the risk of not being primed as an organization, as a culture, when the moment's right: that's really the danger.
00;29;51;06 - 00;29;58;07
Kenneth Cukier
So we're doing lots of things, and some will work, some won't, but none of them are going to reshape our business.
00;29;58;10 - 00;30;06;15
Kenneth Cukier
However, I'm sure that we are going to be the sort of organization that, if something does come across our sphere that could reshape our business, we're ready to go for it.
00;30;06;18 - 00;30;24;28
Geoff
So, you know, can you tell me a little bit more about, you know, the approach of The Economist? Because, you know, on the one hand, to the conservatism argument, you're, you know, holding and caring for this very high-brand-value institution that's been around for, what, around 180 years.
00;30;25;04 - 00;30;44;25
Geoff
And, you know, I'm sure that comes with its share of responsibility. So what's been the posture for some of these experiments? And can you give me a sense of any of the, you know, early, exciting projects where you've said, wow, I didn't know this could do this for us, versus the things that are going to be, you know, clear losers here?
00;30;44;28 - 00;30;52;19
Kenneth Cukier
So, yeah, we never knew what the clear losers would be. I mean, I guess the first thing that had to happen is we needed a different mental model.
00;30;52;22 - 00;31;05;26
Kenneth Cukier
LLMs, generative AI, generate content. Like, you don't need to be a rocket scientist to know that. But once you realize that, you realize that you can create more, right? So whatever you're doing, you can create more of it.
00;31;05;28 - 00;31;28;09
Kenneth Cukier
And so there were a lot of people throughout the organization who had lots of ideas about creating more, and we had to have a very serious, sort of come-to-Jesus moment as an organization to say: hey, every other media company on planet Earth is talking about creating more. So there's nothing differentiated about it.
00;31;28;11 - 00;31;52;09
Kenneth Cukier
What makes The Economist unique, what makes it special, distinctive, and the source of our value, is not that we have more; it's that we have less. So how can we create in a world in which everyone's creating more? How can we do something in which we're actually creating less? So suddenly, instead of having a generative AI system where people could query our information and get another article.
00;31;52;12 - 00;32;14;15
Kenneth Cukier
We were doing article summaries instead, making our articles tighter and shorter so people can actually see just the gist of the piece without having to read the whole thing. Now, was that going to cannibalize our articles? Well, let's experiment. Let's find out. If it sinks the ship, we've got a problem. But maybe it's going to enrich, add more value for the reader.
00;32;14;15 - 00;32;28;18
Kenneth Cukier
And that would be a good thing. And so that's what we started doing. Now, we put a lot of command and control around it. We didn't have the AI just go off on its own; we have human editors curate what it generates. But we're using it in that way. That's a really low-stakes use.
00;32;28;21 - 00;32;48;11
Kenneth Cukier
It seems like it's very low, and I think it is a quite low-stakes thing to do. But the important thing was the reframing that took place, the reversal, right, of saying: this is what it's used for, but actually we're not going to use it for that primary purpose. We're going to use it for a secondary feature of what it can do.
00;32;48;13 - 00;33;08;09
Kenneth Cukier
And that helped us. We have internal tools. We've got this one woman who responds to all of the journalists' queries, whether it's, you know, how do I avoid getting kidnapped when I'm going on this particular assignment, versus, you know, how can I get my housing allowance for this area because it was missed in that payment cycle.
00;33;08;11 - 00;33;26;03
Kenneth Cukier
And her name happens to be Ann. Well, lo and behold, we've got a chatbot called Ask Ann, which has all of that, that and data on what to do in certain circumstances. So we don't actually have to ask the real Ann; we can ask the computerized Ann, the cyborg Ann, and if that doesn't work, then ask the real Ann.
00;33;26;09 - 00;33;28;14
Kenneth Cukier
And so it's a lifesaver for her.
00;33;28;17 - 00;33;48;01
Kenneth Cukier
One story, one element of what we've done with AI, which I love, is we started translating, and did audio translations of some of our smaller news briefs. And they were good, and we loved them, and the consumers liked them. And we looked at it and we thought, you know what?
00;33;48;03 - 00;34;06;20
Kenneth Cukier
We're not going to continue with it. It was just a little bit too much effort for the payback we were getting. The translations were good, but they weren't good enough. The audio version was good, but it could have been better. And we just felt, you know what, in a world of constrained resources, this was one thing that we should pull the plug on.
00;34;06;22 - 00;34;30;00
Kenneth Cukier
Stop, and put our resources into something else. And what I like about that is, it's very easy to start initiatives; it's often hard to discontinue initiatives. And this was one example where there was nothing wrong with it, but we thought, you know what, let's put our effort in somewhere else. And so we stopped doing that. And I think it was great institutionally for us to see that we have the sort of flexibility that we can actually go from one thing to another.
00;34;30;02 - 00;34;49;07
Geoff
No, that's great. That's a great organizational muscle to have. I completely agree with you. And I've worked with a lot of organizations and seen it a thousand times, where things proliferate and they never die, and then it just becomes this bloat for the organization. But that's really exciting, Ken, about some of the projects going on there.
00;34;49;10 - 00;35;14;21
Geoff
Really, really interesting to hear, especially for an Economist reader. Coming back to, you know, the Anns of the world and the human impact: one of the things you've been writing about is basically how humans compete in this world that's increasingly algorithmic, where we're talking about how it's more agentic or augmented by machines.
00;35;14;24 - 00;35;20;20
Geoff
can you share a little bit about, you know, your perspective there and what skills you think are most important to that?
00;35;20;26 - 00;35;25;14
Geoff
The people and the leaders of, you know, the next generation.
00;35;25;16 - 00;35;48;25
Kenneth Cukier
Great question. So, the skills for the doers, as opposed to the leaders. For the doers, it's going to be curiosity. You just have to see the world and seek imaginatively, and just be interested in lots of different things, not for any utilitarian reason, but just for the pure bliss of being curious and learning new things.
00;35;48;25 - 00;36;08;14
Kenneth Cukier
Because where your mind meanders, you'll find gold. What was the famous phrase from "The Hero with a Thousand Faces," Joseph Campbell? "Where you stumble, there your treasure lies." And I think that the only way to stumble is just to forage in the dark and then fall over something. And what you fall over might indeed be a small pot of gold.
00;36;08;17 - 00;36;28;25
Kenneth Cukier
Something valuable and new, particularly if it's something that other people haven't thought of before. You're definitely going to need to be a strong-willed character, because if you are actually stumbling upon the new, the world will not love you for it. Everybody says they want the new thing. Nobody wants the new thing. They gave Socrates the hemlock to drink to kill him.
00;36;28;27 - 00;36;54;04
Kenneth Cukier
They crucified Jesus. They, you know, laughed at Ignaz Semmelweis, the Hungarian doctor at the Vienna General Hospital who had the most preposterous idea of asking all of the doctors who were delivering babies to wash their hands prior to the delivery. And they thought it was absolutely ridiculous.
00;36;54;04 - 00;37;15;18
Kenneth Cukier
But the children and the mothers being delivered by midwives were all surviving, and the children and the mothers being delivered by the doctors were dying. The doctors were performing autopsies; they were never washing their hands. And they all laughed him away. But of course, he was absolutely correct in that. So innovators always look ridiculous.
00;37;15;18 - 00;37;40;23
Kenneth Cukier
So you need to have strong will as well, and some self-confidence, when you're doing this, you know, this sort of cognitive foraging with curiosity to learn new things. For leaders, it's a little bit different. Leaders need to enable those people. Leaders may be the people who have the ideas, but maybe not. And they need to carry that, they need to own that, and sort of take it on the chin.
00;37;40;25 - 00;38;18;00
Kenneth Cukier
By the time you've played the politics to get very high, or by the time you've had to sort of be an institutionalist and a bureaucrat to become a manager, you have other skills that are important to the organization, like constancy, temperament, reasonability, sobriety. It's probably not zaniness: taking risks, going out on a limb, making a case, annoying your peers and your colleagues because you're still going off on this one thing that they think shouldn't be done, and doing something else.
00;38;18;02 - 00;38;44;29
Kenneth Cukier
The gadfly is the person who changes the world, but they are enabled by wise leaders who let those people pursue their curiosity and pursue their bliss. So the leaders need to set the culture that enables it. If they don't do that, if they create a risk-averse culture, or they think that they should be the ones to innovate and not the others, that could be a recipe for problems.
00;38;45;01 - 00;38;57;07
Kenneth Cukier
I don't think it's a recipe for problems when it comes to people like Steve Jobs, but Steve Jobs was one in a hundred million, right? Now, no one's going to be Steve Jobs, and the wise leader probably says, all things considered, I'm probably not Steve Jobs.
00;38;57;13 - 00;39;09;00
Kenneth Cukier
However, there will be 1 or 2. But for most leaders, the best thing they can do is to build teams that have a culture of curiosity.
00;39;09;02 - 00;39;15;21
Geoff
Well, I love that. And I'm just kind of processing what that means throughout an organization because it's,
00;39;15;24 - 00;39;33;07
Geoff
you know, an organization of any size is not one leader and their team; it's, you know, a series of leaders all the way up. And it's funny, you know, one of the things I've reflected on from my career is just, you know, people love to dunk on leaders as, you know, they're useless, or, you know, who cares about them.
00;39;33;07 - 00;39;34;09
Geoff
Why do we need them? But
00;39;34;09 - 00;39;50;20
Geoff
the impact that a bad leader can have anywhere in the organization, I've seen really, really tremendous negative impact there. And so, you know, as you think about, you know, whether it's at The Economist or whether it's with any organizations, you've, you know, you've spoken with or worked with,
00;39;50;22 - 00;39;55;08
Geoff
how do you make sure that culture, you know, does it have to come from the top down?
00;39;55;08 - 00;40;01;20
Geoff
How do you make sure it's kind of proliferating throughout the organization and not just in little pockets?
00;40;01;22 - 00;40;10;06
Kenneth Cukier
So ideally it's going to proliferate throughout the organization. But the reality is it might be only in little pockets, and you just have to accept that,
00;40;10;08 - 00;40;26;16
Kenneth Cukier
Lightning is not always going to strike everywhere in an organization. You might not want your innovators in HR, but you do want them in product; or you might not want them in product, but you do want them in marketing. So that's the first thing.
00;40;26;16 - 00;40;53;00
Kenneth Cukier
So I wouldn't say that the whole organization has to be that way. How else would you do it? I agree with you that leadership is really hard. I mean, I think the best thing that you could do is to train your managers to be better at what they're doing, at working with people, rather than doing that sort of 20th-century model, which always failed, which is to take the person who's good at doing this one thing and then think that they can become the manager.
00;40;53;06 - 00;41;16;03
Kenneth Cukier
And have all those other people work as well as they do. That's usually a recipe for disaster. Take a journalist who's really good at being a journalist, make them an editor, and say, now you have to manage other journalists; often they're not a very good manager. In the case of a newsroom that's particularly problematic, and we see it all the time, because the nature of being a very good journalist is to be tenacious, to be opinionated.
00;41;16;10 - 00;41;41;23
Kenneth Cukier
It's also that most of what you do is actually not social but very insular, very quiet. You're reading, you're writing; those are very much the skills of insularity, which is an awful fit for running a team, where you need to be a bit more extroverted, you need to put on a public face of confidence even if you're wracked with lots of doubt, which you can be as a journalist.
00;41;41;25 - 00;42;10;29
Kenneth Cukier
Yes, journalists are sometimes self-knowledgeable, if only on rare occasions, not often, but it's been known to happen. Whereas the manager, the manager has a particular role to play that's different from the doer underneath them. And that shift doesn't always work very well. So there's two possibilities. One, you choose better: the best manager might not be the best person at the doer level; it might be someone who's not great as a doer.
00;42;10;29 - 00;42;31;15
Kenneth Cukier
That person might indeed be a very, very good manager. Or two, if you do want to take that person who's good at doing and put them into a managerial role, flood the zone with resources: give them good coaching, good mentorship, probably coaching from the outside as well, like a corporate psychologist, just to talk through what they're doing, so they can become a better manager.
00;42;31;15 - 00;42;44;06
Kenneth Cukier
And then at a supervisory level, you just hear from the underlings, you know: how is that person doing? Are you happy here? Can they keep their staff? What do they like best? What should the person be improving?
00;42;44;08 - 00;42;53;29
Kenneth Cukier
That term, psychological safety, is so important. Great teams, you see it on a playing court in sports, where you can actually watch it live.
00;42;53;29 - 00;42;58;25
Kenneth Cukier
And the points tell their own story as they accumulate and the hoops go in.
00;42;58;28 - 00;43;10;02
Kenneth Cukier
When people are working as a team, they can so far exceed their abilities as individuals, because there's a real synergy effect to it. And sorry for the Dilbert-like clichés, but it's true. So,
00;43;10;04 - 00;43;17;01
Kenneth Cukier
the message would be to invest in the managers, to get the most out of the teams.
00;43;17;03 - 00;43;43;01
Geoff
I love that, and I totally agree with you. But there was another point you made there too, which was the power of the team over the individual, right? And, you know, to what degree, Ken, is that a rejection of kind of this cookie-cutter approach of, you know, we hire engineers, we have an engineering scaffolding, and every engineer should come in this one mold?
00;43;43;06 - 00;43;55;17
Geoff
How do you build a great team, and how should you be thinking about the individual players in it? And, like, does that need to scale, or is it just, you know, looking at people as people?
00;43;55;19 - 00;44;21;27
Kenneth Cukier
So it's strange. I mean, I think it's about values, and I think the first and most important value is being courteous. Sounds crazy, sounds like I'm an old-school kind of person, but I'm not. There's a term actually in cultural evolution, a sort of weird form of quantitative social science in which they look at how human civilizations form and develop.
00;44;21;29 - 00;44;54;19
Kenneth Cukier
And the motto in that domain is: it's better to be social than to be smart. And the reason why is, they show that if you have to be smart about the world, whether you're a hunter-gatherer or a Roman centurion, etc., it's all the knowledge that you can possibly accumulate in your own mind and process and then act on. But it'd be so much better if you could learn from other people who've already made certain mistakes, and therefore not make those mistakes yourself, and not have to learn it for the first time yourself.
00;44;54;25 - 00;45;08;06
Kenneth Cukier
So much better to, you know, talk to one guy who says, yeah, don't touch the red coals, and another person who says, oh yeah, the red coals are really bad, but there's also white ones and blue ones. So if anything feels like it's going to be hot, don't touch it, rather than touching it and burning your hand.
00;45;08;12 - 00;45;13;24
Kenneth Cukier
So it's better to be social than to be smart. So how does that play out in a group?
00;45;13;27 - 00;45;35;29
Kenneth Cukier
Where I've seen organizations work really well, it's because people have a profound respect for each other. And even if they are rivals, or even if they don't have that respect, they're courteous. There have been instances where I've had to interact with other entities, not within The Economist but outside of it.
00;45;36;02 - 00;45;41;01
Kenneth Cukier
And I have a sort of no-assholes rule, as the term goes, which is:
00;45;41;06 - 00;45;57;29
Kenneth Cukier
we can see the world differently. We can interpret a contract differently. We can have different objectives, we can butt heads, and we can find ways to work it out, or maybe even leave.
00;45;57;29 - 00;46;11;03
Kenneth Cukier
But if they say something to my staff and it's inappropriate, they've got a big fucking problem on their hands, right? They're going to either have to justify it or make an apology, and it's probably the end of the relationship.
00;46;11;10 - 00;46;22;10
Kenneth Cukier
And I think there's no other way: you defend your staff from people who are discourteous, and you also expect that from your peers, from your boss, from all the people around you.
00;46;22;10 - 00;46;30;11
Kenneth Cukier
And there should never be an instance where you're not actually acknowledging the dignity of other people.
00;46;30;13 - 00;46;34;29
Geoff
I feel like I'm learning a lot about you as a manager and a leader.
00;46;35;01 - 00;46;38;20
Kenneth Cukier
Brass knuckle Tourette syndrome leader. Exactly.
00;46;38;23 - 00;47;01;26
Geoff
No, it's great, it's great. It wasn't necessarily where I thought we would go today, but I think there are really, really valuable, you know, kind of leadership lessons there. But we're also sort of backing into, you know, a question here about the role that AI can and will play in organizations and what it can replace and what it can't replace.
00;47;01;26 - 00;47;21;07
Geoff
And you talked about this kind of dichotomy, or maybe that's too strong a word, between, you know, the social and the knowledge itself. Is it as simple as: it's not going to replace the relationship, so you need to get really good at the social, and if you just know things, you're in trouble? Or how would you sort of frame that out?
00;47;21;09 - 00;47;41;10
Kenneth Cukier
So, you first have to respect the technology and know that it's going to be able to do things that you can't suss out, that you can't do. Secondly, you need to work with the technology alongside other people. In some industries the technology will replace other people, and you're going to have difficult conversations with them. But I think, more commonly, you're going to need people to supervise the technology.
00;47;41;13 - 00;48;03;27
Kenneth Cukier
There's a great paper out by some Princeton computer scientists called "AI as a Normal Technology" that makes that case. And they do a very good job of showing that just as, when we got power tools, we didn't sort of give the machinery and robots a chance to do their own thing; we as human beings set the parameters of what they would do, and then people supervised them and handled the edge cases. And they ask: why would this not be the case with AI as well?
00;48;04;03 - 00;48;25;17
Kenneth Cukier
And we're already seeing that with AI. Really idiotic companies are sort of handing it off to the machine and letting it run amok. And smart companies are saying, we actually want to find the right ways in which we apply it, but we want human beings to supervise it. So I think we're going to need people to be better at being people, bringing their humanity to their work.
00;48;25;23 - 00;48;52;28
Kenneth Cukier
And we're going to need managers who are just better at working with people and acting more like coaches. That, in fact, has been the great trend in the last 40 years in American business, in which the manager has gone from being the supervisor to being the coach and the mentor. You're seeing it everywhere; all you have to do is pick up a management book from the 1970s and then a management book from today, or talk to leaders today, and you absolutely see that.
00;48;52;28 - 00;49;37;03
Kenneth Cukier
And the progress that people have made has been exceptional. One of the reasons why the American corporation has been outperforming so many other corporations is the cultural aspects of leadership: that we actually allow people to be people, to be not only transactional but relational, and to get the most from other people by recognizing that they're bringing their whole selves to work and that they want to do a good job, and that your role as the boss is to be the mentor and the coach and the inspiration, and not just simply the person who, you know, like in Charlie Chaplin's Modern Times, is glaring and
00;49;37;03 - 00;49;41;24
Kenneth Cukier
making sure that the guys at the assembly line are turning the wrench.
00;49;41;27 - 00;49;48;28
Geoff
Yeah, the guy holding the stick, so to speak. Right. So I want to pull on a slightly different thread.
00;49;49;04 - 00;50;05;18
Geoff
You mentioned earlier, you know, this notion that you kind of need to have a strong stomach for a lot of the innovation stuff here. And for, you know, whether it's AI or any type of change, there's going to be a big, you know, backlash here, whether it's, you know, internal to your organization, whether it's with consumers or society.
00;50;05;18 - 00;50;28;08
Geoff
People just, you know, have a limited appetite for change. And I'm curious: you've written previously on, you know, what was called at the time the tech-lash. And I feel like I haven't heard that term lately, but I've experienced it in a lot of ways around AI, even if people haven't used that label. And so, you know, I wanted to ask you a couple of things.
00;50;28;08 - 00;50;29;26
Geoff
Ken, the first one is:
00;50;29;28 - 00;50;54;00
Geoff
what does that look like from your perspective now versus how it looked, you know, 7 or 8 years ago, this, you know, tech-lash against big tech and against these technologies that people are worried about? And then, do you find that that informs you at all as an editor, around the messages you want to put out into the world around technology?
00;50;54;06 - 00;51;05;15
Geoff
Are there things that we have a responsibility as journalists or as technology advocates to share? You know, in a world where there's so much anxiety about this?
00;51;05;17 - 00;51;25;12
Kenneth Cukier
Geoff, those are great questions. So, to the first element, the tech-lash and where it is today: I think we actually do have an AI-lash, right? The tech-lash has still been there; you're just not hearing about it. And I think in part that's because, for people of a certain generation, of a certain age, we've just become inured to it.
00;51;25;12 - 00;51;29;23
Kenneth Cukier
We just accept it as the thing rather than as something different.
00;51;29;27 - 00;51;47;15
Kenneth Cukier
But if you look at the usage rates of a lot of social media platforms, they've plummeted, whether it's Twitter/X or Facebook. LinkedIn is still doing well, but there are a lot of people who are abandoning social media because they are uneasy with it.
00;51;47;17 - 00;52;01;28
Kenneth Cukier
However, by and large, people of a certain age just take it as the status quo. For younger people, and I'm thinking 30 and under, 25 and under, they are very anti-AI. And that's really interesting to me.
00;52;02;01 - 00;52;11;04
Kenneth Cukier
After spending, you know, their teenage years with their phones, you know, surgically implanted in the palm of their hand or, you know, pressed up against their eyes,
00;52;11;11 - 00;52;35;04
Kenneth Cukier
they now are trying to get rid of it, and go for walks, and find other ways in which they can try to find deeper meaning in the world, rather than just simply feel like they're victims of and hostages to machines. It reminds me a little bit of the anti-smoking movement in the 90s and 2000s. The difference there was that there was a cohort of people for whom it wasn't about cancer and it wasn't about the cost.
00;52;35;10 - 00;52;57;07
Kenneth Cukier
They didn't want to enrich Big Tobacco. They just understood that it was just Madison Avenue, and that there was something rotten and awful about these companies, and they just wanted to avoid it. And so, just as we have Big Oil and pollution, we see that we've got Big Tech and a different form of cognitive pollution.
00;52;57;12 - 00;53;08;06
Kenneth Cukier
And so people are rejecting it. And AI is absolutely being rejected by younger people. There was a second part to your question that I've totally forgotten.
00;53;08;08 - 00;53;12;24
Geoff
The second part to the question was, as a journalist,
00;53;13;00 - 00;53;16;17
Geoff
given that there's some of this, you know, AI backlash,
00;53;16;22 - 00;53;20;01
Geoff
is there any sort of journalistic responsibility there as a technology advocate?
00;53;20;04 - 00;53;46;14
Kenneth Cukier
We have a responsibility to honesty, right, and to truth. You can't be objective, but you can strive to be impartial. And so I think that journalism should hold a mirror up to society, but not a mirror that is naive; one that's informed.
00;53;46;17 - 00;54;05;10
Kenneth Cukier
I mean, on one hand it's a mirror; on the other hand, a flashlight. And the flashlight is: where do you put the beam? And so we should be wise enough, as careful custodians of a certain portion of the public sphere, and I don't want to say that we speak for everyone, or all of that.
00;54;05;10 - 00;54;17;25
Kenneth Cukier
But I think we have an audience and they trust us. So we want to live up to that trust: we should shine that light on the areas that are concerning to us and therefore to them, and vice versa,
00;54;18;03 - 00;54;26;11
Kenneth Cukier
which is areas like: what are the pathologies of technology, as well as the benefits of technology?
00;54;26;13 - 00;54;48;21
Kenneth Cukier
In the case of the retina scans that we began with, that's low stakes. The data already exists; it's being used for this purpose, and now it's going to be used for this other one, which is also a beneficial one. But if it's to identify the racial makeup of someone, or other characteristics that could lead to harm to that individual, then we should say, hey, you shouldn't use this.
00;54;48;21 - 00;55;18;09
Kenneth Cukier
And how do you build the proper safeguards against the misuse of that information? That's what good journalism should always be doing. And it's harder to do today, because it's costly to do. You need good readers who respect and honor it. We're in a world in which I think the press has not always lived up to the expectations that it owes to itself.
00;55;18;12 - 00;55;43;25
Geoff
Right. So let me ask you maybe a more specific question about that. I absolutely agree with you, and I love the use of the word impartial there. And I have to imagine that with this beat around AI, there's so much money at play, so many influential voices trying to sell you their version of the future.
00;55;44;02 - 00;56;12;19
Geoff
There's so much marketing at play, so many different voices telling you different things. Is there anything you're consuming or hearing now that you're most skeptical about, or that you think is biased, that you and your team try to make clear in your journalism, that this is boosterism, and readers should be skeptical of these messages?
00;56;12;21 - 00;56;24;15
Kenneth Cukier
So we've always been saying that these things are boosterism and readers should be skeptical. In fact, our whole nature has been to be, I think, very balanced in our coverage,
00;56;24;22 - 00;56;33;22
Kenneth Cukier
going back years. So it's almost a sort of muscle memory for us, sort of a worldview.
00;56;33;22 - 00;56;55;02
Kenneth Cukier
The way that we see the world is not to buy into the hype. If anything, we try to disentangle what is actually happening and what is legitimate from what people say. I'll give you a strange example related to Covid, which I think would be useful, because that's almost the most glaring way in which many people in the media fell down.
00;56;55;05 - 00;57;15;07
Kenneth Cukier
About six months, nine months into the first lockdown, so this is 2020, a group of scientists got together in a place called Great Barrington, in Massachusetts, and they created something called the Great Barrington Declaration. And their idea was that a lockdown of society altogether is ridiculous.
00;57;15;07 - 00;57;40;05
Kenneth Cukier
You don't want a lockdown of society. Just protect the vulnerable people, keep them separate from the rest of society. If you do that, you'll protect the vulnerable and you'll let society function. And if not, the drawbacks of locking down everyone else are going to be manifest in terms of lower income, marital disputes, kids not learning, et cetera.
00;57;40;07 - 00;58;04;28
Kenneth Cukier
And there was a whole dimension of the media, it was interesting to see, that had just a knee-jerk reaction against it, because it didn't fit the narrative that a lot of people had. There were other people who looked at these academics and said, oh yes, well, he's a professor of neurology at Stanford, but he's not an epidemiologist, so he's not the right specialist.
00;58;04;28 - 00;58;26;25
Kenneth Cukier
And they were sort of cutting it down. So The Economist, interestingly, our science team, we've got some remarkable people here, they did a three-page analysis of what they were saying, and they came to the conclusion that it wouldn't work, that it wasn't the right idea. At the time, I had a podcast on science for The Economist, and I remember interviewing one of the people involved in it, actually people on all sides of the spectrum.
00;58;26;27 - 00;58;52;28
Kenneth Cukier
And I was struck by the people involved being so grateful and thankful for our coverage. And I said, yeah, but we said it was bonkers. And they said, yeah, but you took it seriously. You took the time to think it through, analyze it point by point, and come to your conclusion. We're so grateful that you had the integrity and the honesty and the goodwill to actually treat it substantively,
00;58;52;28 - 00;59;13;20
Kenneth Cukier
rather than with a knee-jerk, preconceived notion about it. Just as we did in that instance with Covid, so too with artificial intelligence, and I hope with all things that we do: we come to it with an unassuming form of intelligence that says we have our values, we think certain things, but we should look at it and examine it on its merits.
00;59;13;20 - 00;59;29;07
Kenneth Cukier
And we're not going to be swayed by what some marketing department says or what some other news organization says. We're going to think for ourselves, because our readers expect that, and they want to think for themselves as well. So we want to give them both sides of the argument.
00;59;29;09 - 01;00;07;26
Geoff
Well, first of all, thank you for that, thank you for doing that, as a reader and as a think-for-yourself advocate. I'm sure you face it in some ways, and you've talked about it already in terms of what The Economist will and won't do, turning down a more-is-more approach. Because it feels like, as a consumer, or just a person living at this moment in history, we're being inundated with more lower-value crap that says, don't think about this too hard, or don't think about this for too long.
01;00;07;26 - 01;00;25;08
Geoff
It's hard to believe that's good for us as a society, for our ability to think critically and make the right decisions and, I don't know, be good people, ultimately.
01;00;25;11 - 01;00;47;23
Kenneth Cukier
I think it's a serious problem. I mean, the question that we struggle with is, you know, we're a subscription-based product. So there is a small tribe of people who are our audience, 1.5 million people or thereabouts who pay for us, but it would be interesting to think, well, how can we have an even bigger impact?
01;00;47;25 - 01;01;04;16
Kenneth Cukier
Granted, a lot of those people are themselves journalists or opinion formers in other ways. And so there's a trickle-down effect of the integrity that I think we bring to understanding the world, which then gets disseminated more broadly. But we wish that we had an even bigger impact still.
01;01;04;19 - 01;01;22;04
Geoff
Yeah, no, it's so important. And I'm curious, and I don't know the answer to this question, but do you have kids, Ken? And if so, what's your posture toward their, I guess, media consumption and social media consumption?
01;01;22;06 - 01;01;49;14
Kenneth Cukier
Yeah, I have children. One has no interest in social media, and only a smidgen of media, but that's his thing. For the older one, who's a teenager, it was tough. I mean, social media sort of pulled them away from planet Earth for a while, and now they're rebelling against it, which is really interesting to see.
01;01;49;16 - 01;02;06;29
Kenneth Cukier
They're going for long walks, and they don't want to be sort of victims of the machine. Seeing that was one data point, but then I started seeing and hearing more about it. Now I'm ready to go out and say, hey, it's a thing. The AI backlash has happened, or is in the process of happening.
01;02;07;02 - 01;02;25;19
Geoff
Yeah. And maybe this is a weird thing for the host of a largely AI-driven podcast to say, but I love that. That's very exciting to me, that people are willing to kind of detach from that, look for meaning, and question those things.
01;02;25;22 - 01;02;30;24
Geoff
We've talked about a lot of advice in a lot of different forms here.
01;02;30;24 - 01;02;39;28
Geoff
What's your best advice these days for business leaders just trying to navigate this entire AI and technology landscape?
01;02;40;01 - 01;03;09;14
Kenneth Cukier
Learn. Read about the technology, how it works, just the basic stuff, generally speaking, at a high level: how it works, what the trends are. Have people in your organization that you trust and talk to on a regular basis. Call it lunch-and-learn conversations. You can imagine an hour and a half, you know, if you just get high-end takeaway food into the office.
01;03;09;17 - 01;03;24;00
Kenneth Cukier
So there's a little bit of the specialness of a brown-bag sort of lunch, but with a little bit more polish to it as well, I would say. It involves you as well. People will like it, and then you have conversations where you're actually learning and discussing with your team.
01;03;24;00 - 01;03;41;29
Kenneth Cukier
And when they see that you're doing these things, reading a book, like some of the new books that have come out on AI, reading The Economist, talking about articles, sharing an article with the team and using that as the basis of a conversation, they will start doing that as well: I'm learning, this is a moment.
01;03;41;29 - 01;04;01;13
Kenneth Cukier
It's such a febrile moment in the world, because the technology is so new, and the risks of not doing the right thing with it and being lost are high. Even though we said that there is time to catch up, there's not unlimited time to catch up, right? The internet did destroy Blockbuster and Tower Records. Those are real examples.
01;04;01;20 - 01;04;18;27
Kenneth Cukier
Sears, Roebuck, which was the great mail-order catalog company and one of the strongest companies in corporate America from basically 1910 to 1950, 1960, did go bankrupt as the rise of Walmart and Amazon, you know, just destroyed the mail-order catalog business.
01;04;19;00 - 01;04;24;21
Kenneth Cukier
So bringing people together to have those conversations, I think, is really, really valuable.
01;04;24;23 - 01;04;47;29
Kenneth Cukier
Going back to the boss as coach and mentor: the coach is great because they're choosing great players. So I think that's a useful metaphor, to say bring together teams and encourage them to be the best they can be. That sounds like claptrap, I get it, but it actually is true.
01;04;47;29 - 01;04;55;11
Kenneth Cukier
And great organizations are doing that.
01;04;55;14 - 01;05;01;01
Kenneth Cukier
This was brilliant Geoff. Fabulous questions. Great audience. Thank you.
01;05;01;04 - 01;05;26;17
Speaker 1
If you work in IT, Info-Tech Research Group is a name you need to know. No matter what your needs are, Info-Tech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Info-Tech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe!
The Next Industrial Revolution Is Already Here
Digital Disruption is where leaders and experts share their insights on using technology to build the organizations of the future. As intelligent technologies reshape our lives and our livelihoods, we speak with the thinkers and the doers who will help us predict and harness this disruption.