Author: Shang Jiangang (商建刚)
Affiliations: [1] School of Economic Law, Shanghai University of Political Science and Law; [2] East China University of Political Science and Law
Source: Oriental Law (《东方法学》), No. 1, 2025, pp. 104-117 (14 pages)
Funding: Interim result of the Major Project of the National Social Science Fund of China for Research and Interpretation of the Spirit of the 20th CPC National Congress, "Research on Strengthening Legislation in Key, Emerging, and Foreign-Related Fields" (Grant No. 23ZDA075).
Abstract: The entry of humanoid robots into daily life and their interaction with non-professional users have created an "autonomy-safety paradox" and an "anthropomorphism trap", yet China lacks an established tort liability regime for humanoid robots. Analyzing the AI-product governance paradigms of the European Union, the United States, and Japan from a metaphorical perspective shows that one-size-fits-all models treating humanoid robots as "movable property", "children", "pets", "electronic persons", or "legal persons" do not fit their technical characteristics. The autonomy of humanoid robots is purely technological; they constitute a new kind of "intelligent" thing with features and meaning of its own. No single paradigm, whether product defect liability, vicarious liability, or insurance liability, can adequately regulate tort liability for humanoid robots. Based on the Hand formula and the reasonable alternative design test, a chain liability governance model can be constructed that applies to humanoid robot manufacturers, system and software developers, algorithm designers, operators, and operating users.
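For reference, the Hand formula invoked in the abstract is the standard cost-benefit negligence test from United States v. Carroll Towing: an actor is negligent when the burden of precautions is less than the expected harm. A minimal sketch follows; applying it per actor along the liability chain (with subscript i indexing manufacturer, developer, designer, operator, or user) is an illustrative assumption, not a formulation taken from the article.
\[
  \text{negligence} \iff B < P \cdot L
  \qquad\text{and, per actor } i:\quad B_i < P_i \cdot L_i
\]
Here B is the burden (cost) of adequate precautions, P the probability of the harm occurring, and L the gravity of the resulting loss.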