If anyone builds it, everyone dies : why superhuman AI would kill us all
(Book)

Contributors
Yudkowsky, Eliezer, author.
Soares, Nate, author.
Published
New York : Little, Brown & Company, 2025.
ISBN
9780316595643 hardcover, 0316595640 hardcover
Copies

Location | Call Number | Status
Oak Lawn Public Library - Stacks | 006.3 YUDKOWSK | Checked out
Addison Public Library - 2nd Floor - Adult Books | 006.3 YUD | On Shelf
Batavia Public Library District - Adult Nonfiction | 006.3 YUD | On Shelf
Berwyn Public Library - Adult New | 006.3 YUD | On Shelf
Bloomingdale Public Library - New Books | 006.3 YUD | Being transferred between libraries
Bridgeview Public Library - Stacks | 006 YUD | On Shelf

More Details

Format
Book
Edition
First edition.
Physical Desc
xii, 259 pages ; 24 cm
Language
English
Notes
Other Title
Why superhuman artificial intelligence would kill us all
Bibliography
Includes bibliographical references (pages 236-252) and index.
Description
"In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next. For decades, two signatories of that letter -- Eliezer Yudkowsky and Nate Soares -- have studied how smarter-than-human intelligences will think, behave, and pursue their objectives. Their research says that sufficiently smart AIs will develop goals of their own that put them in conflict with us -- and that if it comes to conflict, an artificial superintelligence would crush us. The contest wouldn't even be close. How could a machine superintelligence wipe out our entire species? Why would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares walk through the theory and the evidence, present one possible extinction scenario, and explain what it would take for humanity to survive. The world is racing to build something truly new under the sun. And if anyone builds it, everyone dies." -- Provided by publisher.

Citations

APA Citation, 7th Edition (Style Guide)

Yudkowsky, E., & Soares, N. (2025). If anyone builds it, everyone dies: Why superhuman AI would kill us all (First edition). Little, Brown & Company.

Chicago / Turabian - Author Date Citation, 18th Edition (Style Guide)

Yudkowsky, Eliezer, and Nate Soares. 2025. If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. New York: Little, Brown & Company.

Chicago / Turabian - Humanities (Notes and Bibliography) Citation, 18th Edition (Style Guide)

Yudkowsky, Eliezer, and Nate Soares. If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. New York: Little, Brown & Company, 2025.

UCL Harvard Citation (Style Guide)

Yudkowsky, E. and Soares, N. (2025). If anyone builds it, everyone dies: why superhuman AI would kill us all. First edn. New York: Little, Brown & Company.

MLA Citation, 9th Edition (Style Guide)

Yudkowsky, Eliezer, and Nate Soares. If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. First edition, Little, Brown & Company, 2025.

Note: Citations are generated automatically and should be used as a guideline; double-check them against the relevant style guide for accuracy. Citation formats are based on standards as of May 2025.

