If Anyone Builds It, Everyone Dies: The Threat to Humanity of Superintelligent AI (The Guardian Best Books of 2025)

Product Information

List price: NT$934
Sale price: NT$738 (79% of list)
Out of stock; sourced after ordering (delivery in approximately 45-60 days)
Reward points earned with this order: 22

Book Description

Co-authored by AI researchers Eliezer Yudkowsky and Nate Soares, this book focuses on an issue that is rapidly taking shape yet remains poorly understood: whether we are prepared to coexist with artificial intelligence once its capabilities surpass our own. The authors argue that the pace of global investment in AI has far outstripped our ability to build consensus on safety and governance.

In a clear and structured way, the book lays out the systemic risks that superintelligent AI could pose, drawing on historical cases and reasoned argument to remind readers that technological development is not only a matter of innovation but also of choice, restraint, and collective responsibility. Rather than appealing to fear, it invites readers to understand the structure of the problem and to consider whether we need to slow down at a critical moment.

A New York Times bestseller, the book was also selected by The Guardian as one of its books of the year. It is a work of reflection meant to foster public dialogue rather than provoke panic.

・Focuses on AI's long-term risks and governance
・Clear argumentation, accessible to non-technical readers
・Invites a rational and necessary public discussion

The founder of the field of AI risk explains why superintelligent AI is a global suicide bomb and we must halt development immediately
AN INSTANT NEW YORK TIMES BESTSELLER

'The most important book of the decade' MAX TEGMARK, author of Life 3.0

'A loud trumpet call to humanity to awaken us as we sleepwalk into disaster - we must wake up' STEPHEN FRY

'The best no-nonsense, simple explanation of the AI risk problem I've ever read' YISHAN WONG, former Reddit CEO

AI is the greatest threat to our existence that we have ever faced.

The scramble to create superhuman AI has put us on the path to extinction – but it’s not too late to change course. Two pioneering researchers in the field, Eliezer Yudkowsky and Nate Soares, explain why artificial superintelligence would be a global suicide bomb and call for an immediate halt to its development.

The technology may be complex, but the facts are simple: companies and countries are in a race to build machines that will be smarter than any person, and the world is devastatingly unprepared for what will come next.

Could a machine superintelligence wipe out our entire species? Would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares explore the theory and the evidence, present one possible extinction scenario and explain what it would take for humanity to survive.

The world is racing to build something truly new – and if anyone builds it, everyone dies.
** A Guardian Biggest Book of the Autumn **

About the Authors

Eliezer Yudkowsky is a founding researcher of the field of AI alignment, with influential work spanning more than twenty years. As co-founder of the non-profit Machine Intelligence Research Institute (MIRI), Yudkowsky sparked early scientific research on the problem and has played a major role in shaping the public conversation about smarter-than-human AI. He appeared on Time magazine’s 2023 list of the 100 Most Influential People In AI, and has been discussed or interviewed in the New York Times, New Yorker, Newsweek, Forbes, Wired, Bloomberg, The Atlantic, The Economist, Washington Post, and elsewhere.
Nate Soares is the president of the non-profit Machine Intelligence Research Institute (MIRI). He has been working in the field for over a decade, after previous experience at Microsoft and Google. Soares is the author of a large body of technical and semi-technical writing on AI alignment, including foundational work on value learning, decision theory, and power-seeking incentives in smarter-than-human AIs.

Shopping Notes

Cover images shown for foreign-language books are samples provided by the publishers. Actual shipped copies are based on whichever edition the publisher currently supplies. For certain titles with special supply conditions, prices may be adjusted according to exchange rates.

Out-of-stock items will be sourced and air-shipped after you complete your order. To shorten the waiting time, we recommend ordering foreign-language books separately from other items for the fastest delivery; the average procurement time is one to two months.

To protect your rights, 三民網路書店 offers members a seven-day inspection period, starting from the day the item is received.

To return an item, please ship it back within the inspection period. The item must be in brand-new condition with complete packaging (product, accessories, invoice, free gifts, etc.); otherwise the return cannot be accepted.
