Authors: WANG Dong-kai; MIAO Yong-kang[1]; JIN Chang-kun; ZHOU Hai-ting (Geophysical Research Institute of SINOPEC Shengli Oilfield, Dongying 257022, China; Shengli Oilfield Exploration and Development Research Institute, Dongying 257022, China)
Affiliations: [1] Geophysical Research Institute, SINOPEC Shengli Oilfield Branch, Dongying, Shandong 257022; [2] Exploration and Development Research Institute, SINOPEC Shengli Oilfield Branch, Dongying, Shandong 257022
Source: Geophysical and Geochemical Exploration (物探与化探), 2021, No. 1, pp. 127-132 (6 pages)
Funding: National Science and Technology Major Project (2017ZX05072); Shandong Province government-sponsored overseas study program for non-education-system personnel, senior research talent visiting program (201802001).
Abstract: Aiming at the problems of time lag and phase lag in seismic data processing, an automatic recognition and correction (ARC) method is proposed. The method is based on the Hilbert transform and a relative-entropy algorithm, with Kullback-Leibler (KL) divergence as the discrimination criterion. The whole process is data driven: time lags and phase lags are identified and corrected automatically, which effectively reduces the cost of manual identification and avoids errors introduced by human factors. The paper expounds the underlying principles and the implementation process in detail, and verifies the correctness and effectiveness of the method with numerical simulation results. Application to merged and multi-component field data shows that, compared with manual identification and theoretical-value correction, the method effectively improves recognition and correction accuracy, enhances the consistency of events, improves section quality, and provides technical support for subsequent processing and interpretation.
Classification code: P631.4 [Astronomy and Earth Sciences: Geological and Mineral Exploration]
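The record contains only the abstract; no code accompanies the paper. As a minimal numpy-only sketch of the ingredients the abstract names (the Hilbert transform for phase estimation, a correlation pick for the time shift, and KL divergence as the similarity criterion), one might write something like the following. All function names, the envelope-based shift estimator, and the weighting scheme are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

def analytic_signal(x):
    """Hilbert/analytic signal via FFT: suppress negative frequencies."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def estimate_lags(ref, obs, dt):
    """Estimate the time lag between two traces from the cross-correlation
    of their Hilbert envelopes (insensitive to phase rotation), then the
    residual phase lag from the envelope-weighted mean difference of
    instantaneous phases after time alignment."""
    env_r = np.abs(analytic_signal(ref))
    env_o = np.abs(analytic_signal(obs))
    xc = np.correlate(env_o, env_r, mode="full")
    shift = int(np.argmax(xc)) - (len(ref) - 1)   # samples by which obs lags ref
    aligned = np.roll(obs, -shift)                # undo the estimated time shift
    dphi = np.angle(analytic_signal(aligned) * np.conj(analytic_signal(ref)))
    w = env_r                                     # weight by reference envelope
    return shift * dt, float(np.average(dphi, weights=w))

def kl_divergence(p, q, eps=1e-12):
    """KL divergence between traces treated as normalized amplitude
    distributions; used as a consistency criterion (smaller = more alike)."""
    p = np.abs(p) + eps
    q = np.abs(q) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Synthetic check: a Gaussian-windowed 30 Hz wavelet, delayed by 5 samples
# and phase-rotated by 0.5 rad, then recovered.
t = np.arange(0, 0.5, 0.002)
ref = np.exp(-((t - 0.25) / 0.05) ** 2) * np.cos(2 * np.pi * 30 * (t - 0.25))
obs = np.real(analytic_signal(np.roll(ref, 5)) * np.exp(1j * 0.5))
tlag, plag = estimate_lags(ref, obs, dt=0.002)
```

Using the envelope rather than the waveform for the cross-correlation decouples the time-shift estimate from the phase rotation, so the two corrections can be applied independently; the KL divergence then measures how closely the corrected trace matches the reference, in the spirit of the data-driven criterion described in the abstract.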