Authors: 张武举 (ZHANG Wuju); 胡泽恩 (HU Zeen) (Southwest University of Political Science & Law, Chongqing 401120, China)
Source: Journal of Henan Police College (《河南警察学院学报》), 2020, No. 6, pp. 75-86 (12 pages)
Funding: A phased result of the National Social Science Fund of China project "Research on the Application of Ethics in Criminal Justice" (15XZX016).
Abstract: As a prominent application of contemporary artificial intelligence, autonomous vehicles face ethical dilemmas in which harm cannot be avoided. The conflict between human-centered and personalized ethical settings appears more intuitively and more concretely in this field than elsewhere. At the ethical level, autonomous vehicles should be designed to avoid decisions that may harm human beings; in a genuine dilemma, however, some harm to human beings is already unavoidable. Autonomous vehicles must therefore accept human guidance, so that artificial intelligence is prevented from making autonomous decisions that harm human beings and so that human beings retain a full and effective right of self-determination in major decisions involving human life. In the field of criminal law, it is necessary to clarify the subject of criminal responsibility and the criminal liability that different ethical settings may entail in specific situations. On the premise of excluding criminal risks, a user-led decision-making scheme for the ethical dilemmas of autonomous vehicles can be established.