Affiliations: [1] School of Information Science and Technology, Micro Nano System Center, Fudan University, Shanghai 200433, China [2] Department of Anesthesiology, Huashan Hospital; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education; Behavioral and Cognitive Neuroscience Center, Institute of Science and Technology for Brain-Inspired Intelligence; MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China [3] Kuang Yaming Honors School, Nanjing University, Nanjing, Jiangsu 210023, China [4] Shanghai Key Laboratory of Intelligent Information Processing, School of Computer Science, Fudan University, Shanghai 200433, China [5] State Key Laboratory of Primate Biomedical Research, Institute of Primate Translational Medicine, Kunming University of Science and Technology, Kunming, Yunnan 650500, China [6] New Vision World LLC., Aliso Viejo, California 92656, USA [7] Behavioural and Clinical Neuroscience Institute, University of Cambridge, Cambridge, CB2 1TN, UK
Source: Zoological Research, 2023, Issue 5, pp. 967-980 (14 pages)
Funding: supported by the National Key R&D Program of China (2021ZD0202805, 2019YFA0709504, 2021ZD0200900); National Defense Science and Technology Innovation Special Zone Spark Project (20-163-00-TS-009-152-01); National Natural Science Foundation of China (31900719, U20A20227, 82125008); Innovative Research Team of High-level Local Universities in Shanghai; Science and Technology Committee Rising-Star Program (19QA1401400); 111 Project (B18015); Shanghai Municipal Science and Technology Major Project (2018SHZDZX01); Shanghai Center for Brain Science and Brain-Inspired Technology.
Abstract: Video-based action recognition is becoming a vital tool in clinical research and neuroscientific study for disorder detection and prediction. However, action recognition currently used in non-human primate (NHP) research relies heavily on intense manual labor and lacks standardized assessment. In this work, we established two standard benchmark datasets of NHPs in the laboratory: Monkey in Lab (MiL), which includes 13 categories of actions and postures, and MiL2D, which includes sequences of two-dimensional (2D) skeleton features. Furthermore, based on recent methodological advances in deep learning and skeleton visualization, we introduced the Monkey Monitor Kit (MonKit) toolbox for automatic action recognition, posture estimation, and identification of fine motor activity in monkeys. Using the datasets and MonKit, we evaluated the daily behaviors of wild-type cynomolgus monkeys within their home cages and experimental environments and compared these observations with the behaviors exhibited by cynomolgus monkeys carrying mutations in the MECP2 gene as a disease model of Rett syndrome (RTT). MonKit was used to assess motor function, stereotyped behaviors, and depressive phenotypes, with the outcomes compared against human manual detection. MonKit established consistent criteria for identifying behavior in NHPs with high accuracy and efficiency, thus providing a novel and comprehensive tool for assessing phenotypic behavior in monkeys.
Keywords: Action recognition; Fine motor identification; Two-stream deep model; 2D skeleton; Non-human primates; Rett syndrome