Authors: WANG Shuqing; YUAN Xiaojun
Affiliation: College of Public Administration, Hunan Normal University, Changsha, Hunan 410081, China
Source: Journal of Changsha University, 2023, No. 4, pp. 7-12 (6 pages)
Fund: National Social Science Fund of China, general project "Research on Ethical Issues of AI Social Experiments and Countermeasures" (No. 22BZX039)
Abstract: As robots become increasingly autonomous and interact ever more deeply with human society, whether they can hold rights is an unavoidable question of the intelligent age, and scholars remain divided on it. The interest theory and the will theory are two representative and competing accounts of the concept of rights. Analyzing both, Basl and Bowen argue that robots can hardly possess self-awareness and therefore cannot be bearers of their own interests, concluding that robots cannot have rights. However, given the enormous potential of intelligent technology, the possibility of robot consciousness has not been entirely ruled out; moreover, humans are not wholly unable to benefit from extending moral concern to robots. Robots therefore retain the possibility of acquiring moral rights, and Kant's idea of indirect duties can, to some extent, serve as a defense of those rights.
Classification codes: TP242 [Automation and Computer Technology — Detection Technology and Automatic Devices]; B82-057 [Ethics of Science and Technology]