
The messy human drama behind OpenAI

한국의 메타몽 2023. 12. 20. 01:32

Link: The messy human drama behind OpenAI

 

The messy human drama behind OpenAI : The Indicator from Planet Money (www.npr.org)


SYLVIE DOUGLIS, BYLINE: NPR.

(SOUNDBITE OF DROP ELECTRIC SONG, "WAKING UP TO THE FIRE")

 

DARIAN WOODS, HOST: 

The company behind ChatGPT is a complete mess right now. On Friday, OpenAI's board made a shock announcement that it was ousting its CEO and co-founder, Sam Altman, and the details were scarce. The board released a note saying that Sam Altman hadn't been transparent with the board.

 

+ oust : to drive out (expel / remove from a position)

 

WAILIN WONG, HOST: 

And what happened next was a fiasco. Senior researchers at OpenAI resigned that day. The company went through two changes of interim CEOs. And last night, Microsoft announced that it was hiring Sam Altman to lead AI efforts. And now hundreds of OpenAI staff have threatened to quit in solidarity with Sam unless the OpenAI board resigns.

 

+ fiasco : a complete, humiliating failure

+ interim : intermediate; temporary, provisional

+ solidarity : unity, mutual support

+ threaten : to menace (intimidate); to show signs of (something bad) happening

 

WOODS: It's a lot. And working all weekend talking to sources close to the action was Kate Clark. Kate is a senior reporter for tech publication The Information.

 

KATE CLARK: I have never seen a weekend this drama-filled in technology since I've been writing about startups - ever.

 

WOODS: This is THE INDICATOR FROM PLANET MONEY. I'm Darian Woods.

 

WONG: And I'm Wailin Wong. Today on the show, the human drama behind artificial intelligence. Darian will take it from here, explaining the tensions between AI safety and development that's at the heart of this conflict and how an unusual corporate structure has boiled over into farce. That's all coming up after the break.

(SOUNDBITE OF MUSIC)

 

+ at the heart of : at the core / essence / center of

+ farce : an absurd comedy; a laughingstock

+ boil over into : to erupt and escalate into (a situation)

 

WOODS: Kate Clark, deputy bureau chief for the tech publication The Information, thank you so much for joining THE INDICATOR.

 

+ deputy : second-in-command (the rank just below the head of an organization); a legislator

+ bureau [ ˈbyu̇r-(ˌ)ō ] : a desk; an office (organization) that provides information on a particular subject

 

CLARK: Thanks for having me.

 

WOODS: Well, let's get into it. So we can talk about the characters. We can talk about the big, world-changing philosophies. We can talk all about the money involved. I mean, it's really got all the elements for a kind of modern Shakespearean drama.

 

CLARK: It has been an incredibly memorable 48 hours. I mean, this is the company responsible for ChatGPT, which has been a viral sensation over the past year and has really been the leading breakthrough of AI that has gotten so many people excited and nervous about the future of artificial intelligence. But like you said, you also have this cast of characters, namely Sam Altman. He's a longtime entrepreneur with incredible connections and a network unlike anyone else. So the fact that this board fired him expecting things to kind of go on as normal is really one of the most shocking pieces of this whole thing.

 

+ viral : quickly and widely spread or popularized especially by means of social media

+ namely : that is, in other words

 

WOODS: I'd love to just talk about the background to OpenAI, the company that runs ChatGPT. It's not quite a company. It's a nonprofit. So what is it? Is it trying to make a buck or is it trying to research artificial intelligence?

 

+ nonprofit : not operated for profit; a nonprofit organization

+ make a buck : to earn money

 

CLARK: The short answer is both. And it's not something that is normal in Silicon Valley or in tech. Basically, OpenAI is run by a nonprofit. That nonprofit has a for-profit subsidiary. That subsidiary has raised more than $10 billion from investors, mostly from Microsoft, but as well as venture capital firms like Sequoia Capital. They actually cap the profits that investors can get from the company. This is all part of ensuring that they're not prioritizing profits and that they're prioritizing safety. That is why you have a board that does not have equity shareholders like you would typically see.

 

+ equity [ ˈe-kwə-tē ] : a company's ownership capital; net value; fairness, impartiality

+ equity shareholder : a shareholder (holder of ownership stock)

+ cap : to cover; to set an upper limit on

 

WOODS: The goal of OpenAI is to develop AI responsibly. And so what does responsibly mean in this context?

 

CLARK: It really depends on who you ask. But there are, of course, people who are very afraid of a doomsday future in which, you know, AI is capable of doing terrifying things. And OpenAI, when it started in 2015 as a nonprofit, had a goal of developing artificial intelligence as safely as possible. It seems that the board thought that Sam Altman's decisions were focused too much on moving fast and making money. Long before this happened, there were disagreements within the company about whether they were developing that AI safely enough.

 

WOODS: Now, Sam Altman has been really public about the need for AI safety, but there is a camp in the board that pushed for Sam to leave who are even more hardline on AI safety. And there's one board member in particular, Ilya Sutskever.

 

+ camp : a side (faction)

+ hardline : taking an uncompromising stance; a hardline policy

 

CLARK: Yes. So there is a camp within OpenAI that felt the company was developing the technology too quickly and not safely enough. That camp was led by Ilya Sutskever, who is a co-founder of OpenAI. He was a core part of the fight toward AI safety and was really concerned. This gets a little more complicated because today Ilya said that he no longer supported the board's decision, and he now wanted to reunite the company, of course, because he wants the company not to fall apart. So this morning, I believe 400 OpenAI employees signed a letter saying that they would quit.

 

+ fall apart : to be on the verge of collapse; to be ruined

 

WOODS: Wow - 400 out of?

 

CLARK: Seven hundred.

 

WOODS: Four hundred out of 700, OK.

 

CLARK: Exactly. The number could be even bigger.

 

WOODS: And in fact, as we're going to air, we're seeing reports that hundreds more have signed the letter.

 

+ air : to ventilate; to voice (an opinion); to broadcast, to be broadcast

 

CLARK: I mean, that would leave OpenAI with nothing. It's already been a massive destruction of value over the weekend, and it could get worse by the minute. I think by Monday morning, it was extremely clear that there was no going back if they'd lost Sam. If they lost Sam, hundreds of those employees would have the opportunity to go join Microsoft or - who knows? But they're not going to work at OpenAI anymore, and that's a big problem for Ilya.

 

+ by the minute : moment by moment

 

WOODS: And what have you heard from people close to the action that you've been talking to? Like, how are they responding to Microsoft hiring Sam?

 

+ action : the events (of a story)

 

CLARK: I have heard that they are very optimistic about a potential that the board resigns and Sam Altman return to OpenAI. Me waking up this morning to some of these developments, I'm thinking, OK, Sam Altman has joined Microsoft, and that's the end of that chapter. Of course, you know, moments later, people are telling me, no, we're still fighting to get this back, to reunite this team and get everyone back together. Who knows? Maybe we'll have more news by the end of the day. Maybe this goes on for weeks. But it's definitely not over yet.

 

WOODS: I mean, I can imagine from the Microsoft perspective, this looks like a complete mess when you've got an investment of billions of dollars. And it seems to be either the world's highest stakes existential drama - one view of it is that - but another view is just this kind of squabbling between an unprofessional board.

 

+ existential : relating to existence; existentialist

+ squabble : to bicker; to quarrel over something trivial

 

CLARK: From what I understand, if Sam Altman does end up returning, there'll be a new board. Microsoft would like to have a board seat. They may have what's called a board observer seat, which means they can go to the meetings, but they can't actually vote on big decisions. But certainly I think moving forward, if he returns, you're going to see a venture capitalist. They are going to want to be in that room. Many would argue, and many have, that they should have already been in that room. But because of OpenAI's unusual structure, they really didn't have that opportunity.

All of this ties into a bigger conversation about governance on these really powerful startups. It's just so complicated because I understand why they didn't want a bunch of equity shareholders like venture capitalists on their board - because, of course, you might imagine those venture capitalists wouldn't be as concerned with safety. It really does make sense, the approach they'd taken. But then you have something like this where they fire the CEO for what appears to be very little cause. So had they had those investors on the board, that wouldn't have happened. But then you have a company that's prioritizing profits.

 

+ for what appears to be very little cause : (literal) for what appears to be almost no cause -> (idiomatic) for hardly any good reason

 

WOODS: I mean, this really shows the tension between wanting to be the exemplar in AI safety and also be able to pay for all those servers and graphics processing units and just distribution of these tools and development. You need a lot of money for that. And so is it even going to be possible to both be the AI safety people and the frontier-of-AI people at once?

 

+ exemplar : a model, an exemplary case

 

CLARK: I think this episode shows that that isn't possible. So, who knows? I mean, perhaps we'll see somewhat of a divorce between that nonprofit and the for-profit subsidiary, and maybe that's the best path forward. Or maybe you will see Sam Altman actually stay at Microsoft, and he'll build an AI lab under Microsoft's umbrella. And they'll have access to that technology. You then have a whole nother set of issues, which is that Microsoft has then power over both of these entities which may fight against one another. And that gets complicated. So there's really no path out of this that's straightforward. I mean, there's a lot of jokes on Twitter this morning about, you know, this board was supposed to determine when OpenAI's technology reached artificial - reached AGI, meaning that their technology was superior to humans.

 

+ nother = another (colloquial, as in "a whole nother")

+ straightforward : simple, frank, uncomplicated

+ AGI : Artificial General Intelligence 

 

WOODS: Right - artificial general intelligence.

 

CLARK: Yes, exactly. And that board couldn't even think one step ahead to clearly outline why they made their decision. So it is kind of insane, and I think it will make a great limited series at some point or a movie.

 

WOODS: Kate Clark, senior reporter for The Information, thank you so much for joining THE INDICATOR.

 

CLARK: Thanks for having me.

(SOUNDBITE OF MUSIC)