{"id":105691,"date":"2024-11-21T01:16:59","date_gmt":"2024-11-21T07:16:59","guid":{"rendered":"https:\/\/milesfortis.com\/?p=105691"},"modified":"2024-11-21T01:31:10","modified_gmt":"2024-11-21T07:31:10","slug":"105691","status":"publish","type":"post","link":"https:\/\/milesfortis.com\/?p=105691","title":{"rendered":""},"content":{"rendered":"<p><a href=\"https:\/\/www.washingtontimes.com\/news\/2024\/nov\/19\/henry-kissinger-final-warning-prepare-superhuman-p\/\" target=\"_blank\" rel=\"noopener\">Kissinger\u2019s final warning: Prepare now for \u2018superhuman\u2019 people to control Earth<\/a><\/p>\n<p>Humanity must begin preparations to no longer be in charge of Earth because of artificial intelligence, according to a new book from the late statesman Henry Kissinger and a pair of the country\u2019s leading technologists.<\/p>\n<p>The rise of AI creating \u201csuperhuman\u201d people is a major topic of concern in \u201cGenesis,\u201d published Tuesday by Little, Brown and Company. It\u2019s the \u201clast book\u201d from Kissinger, according to the publisher\u2019s parent company Hachette. Kissinger was a longtime U.S. diplomat and strategist who died last year at age 100.<\/p>\n<p>Kissinger\u2019s co-authors, former Google CEO Eric Schmidt and longtime Microsoft senior executive Craig Mundie, finished the combined work after Kissinger\u2019s death, and The Washington Times has obtained an advance copy. Mr. Schmidt and Mr. 
Mundie wrote they were among the last people to speak with Kissinger and sought to honor his dying request to finish the manuscript.<\/p>\n<p>The authors offer a bracing message, warning that AI tools have already started outpacing human capabilities, so people might need to consider biologically engineering themselves to ensure they are not rendered inferior or wiped out by advanced machines.<\/p>\n<p>In a section titled \u201cCoevolution: Artificial Humans,\u201d the three authors encourage people to think now about \u201ctrying to navigate our role when we will no longer be the only or even the principal actors on our planet.\u201d<\/p>\n<p>\u201cBiological engineering efforts designed for tighter human fusion with machines are already underway,\u201d they add.<\/p>\n<p>Current efforts to integrate humans with machines include brain-computer interfaces, a technology that the U.S. military identified last year as being of the utmost importance. Such interfaces allow for a direct link between the brain\u2019s electrical signals and a device that processes them to accomplish a given task, such as controlling a battleship.<\/p>\n<p>The authors also raise the prospect of a society that chooses to create a hereditary genetic line of people specifically designed to work better with forthcoming AI tools. The authors describe such redesigning as undesirable, with the potential to cause \u201cthe human race to split into multiple lines, some infinitely more powerful than others.\u201d<\/p>\n<p>\u201cAltering the genetic code of some humans to become superhuman carries with it other moral and evolutionary risks,\u201d the authors write. \u201cIf AI is responsible for the augmentation of human mental capacity, it could create in humanity a simultaneous biological and psychological reliance on \u2018foreign\u2019 intelligence.\u201d<\/p>\n<p>Such a physical and intellectual dependence may create new challenges to separate man from the machines, the authors warn. 
As a result, designers and engineers should try to make the machines more human, rather than make humans more like machines.<\/p>\n<p>But that raises a new problem: choosing which humans to make the machines follow in a diverse and divided world.<\/p>\n<p>\u201cNo single culture should expect to dictate to another the morality of the intellects on which it would be relying,\u201d the authors wrote. \u201cSo, for each country, machines would have to learn different rules, formal and informal, moral, legal, and religious, as well as, ideally, different rules for each user and, within baseline constraints, for every conceivable inquiry, task, situation, and context.\u201d<\/p>\n<p>The authors say society can expect technical difficulties, but those difficulties will pale in comparison with designing machines to follow a moral code, because the authors do not believe good and evil are self-evident concepts.<\/p>\n<p>Kissinger, Mr. Schmidt and Mr. Mundie urged greater attention to aligning machines with human values. The trio said they would prefer that no artificial general intelligence surpassing humanity\u2019s intellect be allowed to emerge unless it is properly aligned with the human species.<\/p>\n<p>The authors said they are rooting for humanity\u2019s survival and hope people will figure it out, but that the task will not be easy.<\/p>\n<p>\u201cWe wish success to our species\u2019 gigantic project, but just as we cannot count on tactical human control in the longer-term project of coevolution, we also cannot rely solely on the supposition that machines will tame themselves,\u201d the authors wrote. 
\u201cTraining an AI to understand us and then sitting back and hoping that it respects us is not a strategy that seems either safe or likely to succeed.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Kissinger\u2019s final warning: Prepare now for \u2018superhuman\u2019 people to control Earth Humanity must begin preparations to no longer be in charge of Earth because of artificial intelligence, according to a new book from the late statesman Henry Kissinger and a pair of the country\u2019s leading technologists. The rise of AI creating \u201csuperhuman\u201d people is a &hellip; <a href=\"https:\/\/milesfortis.com\/?p=105691\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[87,55],"tags":[],"class_list":["post-105691","post","type-post","status-publish","format-standard","hentry","category-technology","category-they-made-a-movie-about-this"],"_links":{"self":[{"href":"https:\/\/milesfortis.com\/index.php?rest_route=\/wp\/v2\/posts\/105691","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/milesfortis.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/milesfortis.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/milesfortis.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/milesfortis.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=105691"}],"version-history":[{"count":2,"href":"https:\/\/milesfortis.com\/index.php?rest_route=\/wp\/v2\/posts\/105691\/revisions"}],"predecessor-version":[{"id":105693,"href":"https:\/\/milesfortis.com\/index.php?rest_route=\/wp\/v2\/posts\/105691\/revisions\/105693"}],"wp:attachment":[{"href":"https:\/\/milesfortis.com\/index.php?rest_route=%2Fwp%2F
v2%2Fmedia&parent=105691"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/milesfortis.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=105691"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/milesfortis.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=105691"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}