On IdeaCast, we bring you insights from leading thinkers in business. For more advice on growing your business, you should check out the podcast Masters of Scale. LinkedIn co-founder Reid Hoffman joins top minds in business to explore their unconventional paths to scale. Check out Masters of Scale, wherever you get your podcasts.
Welcome to the HBR IdeaCast from Harvard Business Review. I'm Alison Beard. When any new technology comes along, people, especially those in the tech industry itself, tend to get really excited about all the good it's going to bring. Social media connects people around the world, crypto democratizes finance, generative AI supercharges productivity, and so on. The evangelist crowd is loud and proud. But as we've seen over the past decade, the potential downsides of the latest tech innovations don't always get as much attention. Yes, you'll see some skeptics warning about unintended consequences and negative externalities. But it doesn't seem like industry insiders, the people building and deploying these new tools and the leaders overseeing that work, are thinking all that hard about what challenges they might inadvertently create.
Our guest today is an unabashed techno-optimist. He really does believe in the power of technology to improve our lives. But he also knows how important it is for tech companies to think more carefully and responsibly about the problems they're trying to solve and the products and services they're putting out into the world.
Reid Hoffman is a founding board member of PayPal, a co-founder of LinkedIn, a partner at the venture capital firm Greylock, and a director of several companies, including Microsoft, although he recently stepped down as a board member of OpenAI. He's also a podcaster, hosting Masters of Scale and the new show Possible. Reid, welcome. Great to be here.
Okay, first off, how do you define responsible or ethical technology?
So one of the illusions that is sometimes promulgated is that technology is essentially value neutral. It isn't. That doesn't mean it embodies values in a simple way, like, you know, I believe in democracy or I believe in some other form of human organization, or the various values debates we're having within the US and other countries.
I think the question is, you say, well, how does this affect the human condition? What does it mean for different individuals? Like, are there bias issues, or are there things where it creates some kind of bad social impact?
And you have to ask these questions. Obviously, one of the challenges when you're dealing with things at scale is that it's never all good, like 100% good across everything. What you have to do is make it, on balance, really good, and then try to make sure that you're not disadvantaging groups that don't have power or a voice. So, for example, take cars. Cars are generally speaking very good: they enable transportation, they enable mobility, they enable people to live in different areas. On the other hand, of course, in the US we have 40,000 driving deaths per year, and then of course climate and all the rest. So you have some challenges, and you try to shape it so that on balance it's very good, and you're dynamically improving as you learn and refine.
As someone who has been a leader in the tech industry for a really long time, what is your honest assessment of the job that you all are doing in considering not just the upsides but also the downsides, and then trying to mitigate those risks, whether that's social media a decade ago or generative AI today?
Well, it's a little bit hard to talk about the entire tech industry, because there are some people I think who are doing pretty good jobs and some people who are doing pretty awful jobs. You know, the story of social media, as you said in the intro, is that it opens with blogs and social networks and all the rest. It's like, oh, we're giving voice to the people who didn't otherwise have voices, and people who might be a minority of some sort somewhere in the world, whether it might be, you know, sexual orientation or religious or a racial minority, they can discover their voice and they can connect with other people. Isn't that awesome? And of course, it is and it continues to be.
But now it becomes a place where everyone's there, and then all of the issues that are part of why we have government, why we have regulation, and how we make society work together, those then come into play in full.
And, for example, one of the classic things that I've been debating for as long as I've basically been on television, I think I did a 1996 Firing Line on this, is freedom of speech. It's this idea of, oh, we don't regulate freedom of speech.
And it's like, well, of course we do. We have truth in advertising. We have issues around hate speech or violence. There are all kinds of ways we regulate speech.
Some people say, well, my freedom of speech allows me to run false advertising and to sell drugs that are harmful for lots of money. You're like, well, that we don't allow as a society.
And I think that's what the tech industry is still coming up to speed on: how do we navigate what our definition of truth is in collective discussion?
Now when it gets to AI, which is obviously the thing I've been spending a ton of time on the last number of years, I think the tech industry has learned from the social media side to pay more attention here.
So the questions are around, well, is it biased, or might there be unintended consequences for jobs or misinformation?
The way that ethics starts is by asking the questions and checking as you're building. You're not going to get it perfect. You're not going to launch something at scale and get it perfect. But if you're asking the questions and you're measuring and you're improving, then you'll eventually get to a very good place.
So it seems like you're saying that technologists today and leaders of tech companies have maybe learned from the era of social media, and that the famous move fast and break things era is over.
What I'd say is, okay, you know, I'm the author of Blitzscaling. I'm a definite move-fast person. The question is, what things do you break? You break your servers? Fine, no problem. You break society? No, that's a problem.
And what I'd say is some tech leaders, you know, Satya Nadella, Sam Altman, are well on the learning curve. I think it'd be a fool's statement to say, oh, we've learned, we're good. It's like, no, no.
Part of what you're doing is exploring this new stuff and building these new things, and you can't predict all of it. You're learning as you go and you're fixing.
What factors are they considering when making business decisions?
You know, what does the general public maybe not see or hear about what's going on behind the scenes, both in the VC community and within the companies themselves?
So I'd say that every technology company that I'm a part of, or that my partners at Greylock are part of, is at least asking the questions and doing that as part of how they develop the technology. And the questions can range from: are we being responsible stewards of data and people's trust?
Might there be groups that are being disadvantaged by this technology in a way that is a structural bad thing, like, for example, racial disadvantage?
Whereas if you say, well, we're just disadvantaging criminals, fraudsters, that's okay, that's fine. Are we, as best we can, red-teaming and thinking about blind spots or things that could go wrong?
Are we thinking about what happens when this gets to scale? Do we have a good theory about why this will be net really positive and how we can remediate or diminish harms?
I think all of those questions, in every tech company that I'm part of, are central, and we go out and learn. We hire people and ask, you know, what are the other things we should be thinking about or doing here?
Yeah. And bigger picture, for the industry, and I'm sorry I keep saying the industry as a whole,
but for most of the people you know: are you now focusing on more important problems than perhaps you once did?
You know, there's the famous Peter Thiel quote: we wanted flying cars, and they gave us 140 characters. And now it's more like, we want climate change solutions, but we're getting a chatbot that can write stanzas like Shakespeare.
So, for example, flying cars: I'm on the board of Joby. We are working on flying cars. And that's to redefine how we use space in a way that addresses climate change, helps with gridlock and pollution and a bunch of other things, and is accessible.
On the other hand, the natural pattern of these things is to try to figure out what's the easiest work to do that's most valuable. And so that's why people tend to do a lot of software.
And you know, I tend to think that actually chatbots can be really valuable. They can be valuable for, you know, anything that ranges from, give me some good information to help me solve this problem to any number of things that could play into human life.
But on the other hand, of course, solving hard problems like climate change, ocean deacidification, you know, other kinds of things, is super important. And people are working on those. They're just harder, because they're a lot more expensive and the economic rewards are much more challenging.
One of the things that, you know, I try to give thought and advice to is how we create an incentive system that also goes after the hard problems more. Yeah, absolutely. And you used the word valuable. So let me press on that a little bit.
By valuable, do you mean valuable to society, valuable to investors? You know, where is that purpose-versus-profits trade-off, or balance, falling for your community now?
Well, in an ideal system, you align them, so that the business functions at a high level and the product it's offering its customers is really good for the well-being of the customers, society, and the stakeholders involved. There are, of course, places where that gets misaligned. And it's not only within the tech industry; this is one of the challenges we have with making industries work.
And look, in all of society there's a whole bunch of people who are doing things only for money or only for profits. That's part of how we design the alignment of society; it goes all the way back to Adam Smith. But the question is also that people will say, I want to hold my head up with my friends and my community and say I'm doing a really good thing.
All the people I hang out with are focused on: how is it that we're also making the world and society better with what we're doing? And so, for example, that's one of the questions we ask at Greylock about what we're going to invest in, to make sure that we are positive on those vectors, and that we do so within the context of a strong business.
But you know, if you're asking the question and intentionally trying to do that, then that's at least half the game. And so as a VC at Greylock, you know, what are you looking for, seeking out, both in business ideas and business models, and in founders right now?
Part of the thing that's a delight about venture investing is that while you may have a very active theory of the game, so I've been doing, you know, generative AI for the last few years; I co-founded a company called Inflection with Mustafa Suleyman; we have Adept and Cresta and Snorkel and all these other companies at Greylock; and so we have a very active thesis on artificial intelligence and have had for, you know, five-plus years.
We're also being surprised by the amazing things brought to us. So, you know, just to kind of illustrate what I think the quality of being surprised is: when Brian Chesky and Nate and Joe brought Airbnb to me, I hadn't really been thinking about, you know, a marketplace for space, a question of how you can not just travel to a place to see a monument, but experience local culture, and enable people to transform their own economic outcomes by being able to afford their house or their space. And yet that's just software, and it brings all that together.
So, you know, for me, in addition to AI, I also tend to look at networks that redefine our social and societal space. That's part of the reason I created, you know, LinkedIn with my co-founders, and things that we've done in various other investments at Greylock, including, for example, Roblox, where, you know, you've got developers building entertainment and educational things that generally speaking mostly appeal to kids, but a whole range of experiences. We're looking to be surprised.
And the question we ask is: are the customers really benefiting from this, and is the community and society that they're in broadly also benefiting? And does it have a very strong business that will transform industries? And you know, if we see all that, and we see an entrepreneur who we think is high integrity and who we would be delighted to be in business with our entire lives, then we get really excited and join forces.
Yeah, that high-integrity piece, you know, finding people, founding teams, who are absolutely trying to scale and run with their ideas and make a change, but who will also take that moment to step back and ask the questions about, you know, ethical construction, deployment, etc. How do you evaluate for that?
Well, it's not a simple formula, but one of the things we do pretty rigorously is reference checking. You haven't completed your reference checks until you've found a negative reference; there's a negative reference on everybody in the world.
So, for example, if someone was reference checking me in depth, what they would find is, oh my gosh, he's a really great creative problem solver, but he's not particularly good at making the trains run on time, right?
And obviously, you know, when you're asking the integrity question, you're asking: how much do you actually, in fact, walk the walk, not just talk the talk? When you're in positions of stress, do you make decisions like, no, that would be the easy decision, but it takes risks with other people's well-being.
Let's take the hard decision instead. Do you honor your commitments? And therefore, you know, when you're saying, hey, we're going to have a commitment to make sure that we are tracking how we impact society, we're going to have dashboards on it, and we're going to be improving them year by year: will you be doing that?
Talk a little bit about the role that the tech world, the VC world, an industry that is still very much dominated by rich white men, has to play in increasing inclusivity and also decreasing socioeconomic inequality.
One of the things that I've been saying for maybe a decade-plus now: anytime you look at a problem and you go, that's important to solve, then if you're not part of the solution, you're part of the problem. So you need to be asking, how am I as an individual, and also, of course, as a firm, investing and trying to solve this problem?
How am I putting sweat and blood into trying to make this happen? So, relative to diversity and inclusion, that means making sure that you have a regular workflow and process by which you're trying to recruit, you're trying to meet entrepreneurs.
I mean, we do things at Greylock like having a set of office hours that's only for underrepresented-minority entrepreneurs. Any recruiting we do, we make sure that we are interviewing disproportionately large numbers of underrepresented minorities, including, unfortunately for venture, women, which is like, well, aren't they half the population? Yes. Yes.
And doing everything you can. So, for example, we've helped stand up new venture firms, because when they come to us and say, hey, we think that having a venture firm that's entirely focused on funding women entrepreneurs might be a good way of doing it: great, we'll help you. And so you have to do all that kind of stuff. And I want the progress to be 10x faster than it is going, absolutely.
And if someone figures out a way to make that happen, we'll help. We'll support. On the economic gaps, it's always a little tricky because it's dynamic over time.
For example, one of the things that I do, the same in my philanthropy as in my investing, is you find an amazing entrepreneur. In this case, it's Byron Auguste, who says, look, there's all of this massive growth in tech jobs and the tech industry.
Let's make sure that works for communities of color, works for women, works for other minority groups. Let's go make sure that a whole bunch of these people have pathways into tech jobs and make that happen, so they can at least begin to bring their families along, understand what the tech opportunities are, and have their communities begin to benefit from participating in these industries.
But by the way, when you're growing a new company, the new company makes the executives and the founders the most money, and the next group of people the next-most money, etc. That's how these things work.
So it doesn't necessarily immediately redistribute economics, but you're trying to get everyone participating, and then you're trying to make sure that the next generation of founders has the diversity that we have in society.
If you're leading a business today, one thing's certain: tech is fast-moving and sometimes confusing. TechTrends is a new series from Kaspersky that unpacks tech trends to give you insight to make the right decisions for your business's tech and cybersecurity investments.
An expert will tackle a current tech trend in each episode, like smart energy, AI, and blockchain, and then you'll hear from someone in business who's using the tech, about the decisions they've made and the challenges they've faced. That's Insight Story: TechTrends unpacked for business leaders.
You mentioned other industry leaders that you respect and admire, who you think are modeling good leadership, not only on the "I'm running a great business" front, but also on the "I'm working to improve society" front.
But the poster boys for the tech industry, Elon Musk, Jeff Bezos, Mark Zuckerberg, they definitely aren't perceived that way. No matter how much money they might give to charity or how many rockets they might launch into space.
Do you get the sense that the good guys, as Kara Swisher might call you, are getting as many accolades as the people who do still cling to that move-fast-and-break-things ethos?
Well, I myself have argued with Mark Zuckerberg about freedom of speech issues and other things. But, for example, one of the things I do with him is the CZI Biohub, where he's trying to cure infectious disease for people all around the world and putting a lot of money toward that. And because he's such the poster boy for otherwise intelligent criticisms, he doesn't get as much credit, nor does Priscilla, for all this other amazing stuff he does. So I just kind of feel it's important to make that gesture. Yeah. And there's no question that a lot of the people who make a lot of money then do a lot of good. I guess it's just trying to marry the two that we're talking about.
Yeah. Well, that's important to do too. But, for example, there is a differentiation between people who go, all of my economics is for my own self-glorification, and people who go, look, I'm making a bunch of money and I'm also doing a bunch of things, caring for a bunch of communities, that have nothing to do with my self-glorification. And I say that in part because it's too easy to get on the criticism bandwagon, and I just, you know, think it's important to note.
Now I'd say that the folks who are perhaps not beating these drums as extremely tend to have fewer, I think the word you used was accolades. You know, I think that's because the principal way that you get accolades is by defining something pretty extreme and beating that drum, and then the people who, you know, think that you're the Messiah for beating the drum in that direction come follow you.
If you're kind of measured and saying, you know, the kinds of things I've been saying here, which is, look, a net benefit is the goal. I think you do have to move fast. I think you have to build things quickly. I think you will break things, including things that you don't want to break, in doing it. I think you have to do it with care and attention, but I think if you don't do it with speed, then the people who do it with speed and who don't care about what the impact is, you know, set the rules.
And so I tend to think that, you know, the people who are trying to be thoughtful get less good, call it, media coverage than the people who are being extreme.
Do you think that Silicon Valley still sort of leads the world in terms of what the tech industry is thinking about or do you see sort of different ecosystems developing their own ethos around purpose and profits?
Well, I'd say the two areas in the world that are the most tech leading are both Silicon Valley and a set of cities in China, mostly along the coast.
I try as much as I possibly can to help create other tech innovation centers in other areas of the world. I was just in Italy, France, and the UK, you know, high-principled democracies that have a really good concept of what human rights should be and so forth. I try to help as much as possible in facilitating the creation of entrepreneurial bases and tech industries.
I think Silicon Valley continues to be, along with China, which we learn a whole bunch of stuff from, the kind of driving drumbeat.
And it's one of the reasons why I think it's a very good thing that the discourse is there. Like, I'm at dinner parties in Silicon Valley where, you know, part of the discussion is to say, well, now that tech is continuing to have larger and larger impact, what is the way that we make sure that we're doing the right thing?
Let's talk about China. Are those questions being asked over there also? Well, I'm not a native Chinese speaker and I haven't been there for a few years. But, you know, I would say, with any group of people, you've got a million people, you've got a distribution of smart people, you've got a distribution of ethical people, you've got a whole bunch of different things.
I would say that their environment is more tuned at the moment to, as it were, the rise of China and the success of the business, and somewhat less to, you know, for example, what this means for disadvantaged minorities within society. You know, in China, I don't think you have any discussion in the tech companies of what it means for the Uyghurs, what it means for, you know, other kinds of groups. I think people are people. I'm not saying anything about the quality of the people doing that. I just think it's the environment that they're operating in.
Are there opportunities for more collaboration, interaction, knowledge sharing?
So, for example, one of the things I've been highly focused on, along with the OpenAI and Microsoft folks, is AI safety: making sure that when you build these new, very large, very capable systems, the net impact is very good and there are no really bad impacts. And you say, okay, well, how do we make sure that the work that we're doing, even though we've put in a whole bunch of work and energy and cost and hiring (I think there are hundreds of people at Microsoft who work on AI safety), how do we essentially distribute it for free? I think we've offered it to everybody, including our competitors and so forth in China, in order to try to get to good places, because that's part of being intentional and good people.
So I do want to turn to your new show. It's a very interesting addition, you know, the show Possible alongside your Masters of Scale franchise, because one features the founders, the entrepreneurs, the leaders of companies who made it big, basically, and Possible seems to feature people behind the scenes working on those really difficult problems you mentioned earlier, you know, democratizing higher education through technology, or nuclear fusion to help solve some of our climate issues. Talk about why you wanted to launch the show and focus on those people as opposed to the famous corporate leaders.
So, you know, one of the things that I see a lot in the US, and have seen in some other places in the world, is what is referred to as techlash, which is more negativity and uncertainty about what technology is bringing versus the positive side. And I believe, as a hypothesis, but very strongly, and argue that for whatever scale problem you're trying to solve, whether it's climate change, whether it's economic justice, whether it's criminal justice, other things, 30 to 80% of the solution is technology. What I mean by that is technology changes the scope of what's possible. It changes cost curves; it changes what you might be able to pull off with the resources that we have. We can help solve these really fundamental problems with technology. And it isn't that technology is the only solution; it's part of the solution.
It's also how we organize ourselves as a society, what we value, what we invest in versus other things. But technology is an essential part of making that scale solution work. And so we want to go to essentially the leaders, the innovators, the imaginers of what the world could be in this really good new way, and talk to them and share that sense of: here is where we should row towards. And I think we can, for example, solve these really big problems, like climate change. And oh my gosh, we could build a world that's so much better than the one we have today. Let's get to it.
Does the new generation of founders seem excited about that, even if it means their big paydays might be two decades in the future as opposed to becoming a unicorn within five years? Well, again, I think some are, and more are. It won't be all of them; some people will still be creating things where, I mean, I try not to throw entrepreneurs under the bus, but I go, well, that's not a particularly great thing to create. Delivering liquor to your front door or something along those lines. Whatever the thing might be; I guess the one I most often pick on is Juul. Creating electric cigarettes or vape things, I think, is net not positive.
But go and have the imagination that through entrepreneurship, through technology, through invention, you could solve these things. There's a huge number of very talented people in the world, and we just want more of them working on these problems, and thinking about the fact that they could make a difference by creating a technology, a business, a project that could focus on this and make it work. And that's the dialogue we're hoping to increase: kind of applying our imagination to how we create the future.
Yeah. We haven't yet talked about the role of government in innovation and in regulation. Some of the greatest technologies, GPS for one, stemmed from government investment initially. So, part A of this question: do we need more of that? And part B: where do you stand on regulation for emerging technologies like generative AI? Should there have been more regulation on social media, etc.?
One of the things people ask is, for example, what do I believe that most, or at least a lot of, Silicon Valley people don't believe? It's that government is actually, in fact, absolutely essential. It helps create a lot of things: not just rule of law and society and a healthy functioning economy, but also baseline investment in universities and technologies. And so I'm a big believer in those.
I also think that regulation can be an important part of that. One of the challenges of regulation is that the baseline conception of how most people tend to think of it is ask for permission, not forgiveness: it tells you to continue doing things the way you've done them in the past, you're kind of locked in with very slow change from that, and it tends to be done by people who don't necessarily understand what the innovation clock looks like.
And so the principle that I usually articulate here is: when is bad regulation better than no regulation? And by the way, that's not a rhetorical question, and the answer is not never, because, for example, when you get to the financial system, given the absolute necessity of the financial system continuing to run, you say, well, actually, in fact, bad regulation is better than no regulation to make sure that the banking system doesn't break, and other kinds of things, because it's just too critical otherwise.
Now, when you get to a lot of technology, you say, well, you're enshrining the past. The problem is, well, if the actual solution is technology in the future, then a regulation that particularly slows you down or anchors you to the past will potentially be more damaging to humanity than not regulating, in various ways.
And so saying no regulation at all is, of course, too coarse a thought. So what you do is start by defining what outcomes you're looking for. And can you set those outcomes for essentially the innovators, the companies, the others, and say: more of these outcomes and less of those outcomes? And we'd like to see a dashboard, we'd like to see it tracked by your auditors.
So, for example, say we'd like to see less violence on video. Do you then say, well, I'm going to have a regulation that you must have a five-minute delay between uploading a video and broadcasting it? And you say, okay, well, that may not actually solve your violence-on-video problem, because, you know, terrorists or whoever else might trick or hack the system, and your regulation really just created a whole bunch of process that didn't do anything.
Whereas what you say to companies is: well, okay, I recognize you can't get to zero, because again, these are large-scale systems. So let's say for the first 100 views it's a thousand-dollar fine, for the next thousand views it's a ten-thousand-dollar fine, and for every view after that it's a hundred-thousand-dollar fine. They'll figure out how not to show, you know, murders on video, right? And that's what I mean by defining outcomes and then having the innovation get you there. And that's the kind of pattern that I think we need to apply when it gets to technology.
So what advice do you give to people who are early in their tech careers right now? What are some of the pitfalls to watch out for? And how can they become great, more responsible builders of technology? I think everybody needs to think about their own life path with the tool set of an entrepreneur; that doesn't mean they have to be an entrepreneur.
I think another thing is to realize, going all the way back to the beginning of our discussion, that the creation of technology can itself be a great good if you're asking the right questions. Even, you know, take an area that's fraught with a whole bunch of issues, like genetic modification and genetic engineering. That could be really bad.
But of course it could also be really good, getting rid of genetic diseases that just cause suffering. So if you're asking the right questions, and you're doing it the right way, and you're thinking about how to shape it the right way, you can have a scale impact in the world that leaves humanity much better because of your effort. And I think that's it: you know, ask the right questions and help create the future.
Reid, thank you so much for being on the show. My pleasure. Thank you. That's Reid Hoffman, entrepreneur, investor, and podcaster with the new show Possible. And we have more episodes and more podcasts to help you manage your team, your organization, and your career, including an upcoming IdeaCast bonus series about how artificial intelligence will change work. Find them at hbr.org slash podcasts, or search HBR on Apple Podcasts, Spotify, or wherever you listen.
This episode was produced by Mary Dooe. We get technical help from Rob Eckhardt. Our audio product manager is Ian Fox, and Hannah Bates is our audio production assistant.
Thanks for listening to the HBR IdeaCast. We'll be back with a new episode on Tuesday.
I'm Alison Beard.
Hi, it's Alison. Before you go, I have a question: what do you love about HBR?
I worked at newspapers before I came to HBR, and the thing that has impressed me most is the amount of attention and care that goes into each and every article. We have multiple editors working on each piece. They put their all into translating these ideas, typically from academia or from companies in practice, into advice that will really change people's lives in the workplace.
If you love HBR's work, the best thing you can do to support us is to become a subscriber. You can do that at hbr.org slash subscribe IdeaCast, all one word, no spaces. That's hbr.org slash subscribe IdeaCast.