Explainable artificial intelligence and interpretable machine learning for agricultural data analysis (Cited by: 1)


Author: Masahiro Ryo

Affiliations: [1] Leibniz Centre for Agricultural Landscape Research (ZALF), Eberswalder Str. 84, 15374 Müncheberg, Germany; [2] Brandenburg University of Technology Cottbus–Senftenberg, Platz der Deutschen Einheit 1, 03046 Cottbus, Germany

Source: Artificial Intelligence in Agriculture (农业人工智能), 2022, Issue 1, pp. 257-265 (9 pages)

Funding: Supported by the ZALF Integrated Priority Project (IPP2022) "Co-designing smart, resilient, sustainable agricultural landscapes with cross-scale diversification"; the Bundesministerium für Bildung und Forschung (BMBF) Land-Innovation-Lausitz project "Landschaftsinnovationen in der Lausitz für eine klimaangepasste Bioökonomie und naturnahen Bioökonomie-Tourismus" (03WIR3017A); the BMBF project "Multi-modale Datenintegration, domänenspezifische Methoden und KI zur Stärkung der Datenkompetenz in der Agrarforschung" (16DKWN089); and the Brandenburgische Technische Universität Cottbus-Senftenberg GRS cluster project "Integrated analysis of Multifunctional Fruit production landscapes to promote ecosystem services and sustainable land-use under climate change" (GRS2018/19).

Abstract: Artificial intelligence and machine learning have been increasingly applied for prediction in agricultural science. However, many models are essentially black boxes: we cannot explain what the models learned from the data or the reasons behind their predictions. To address this issue, I introduce an emerging subdomain of artificial intelligence, explainable artificial intelligence (XAI), and its associated toolkit, interpretable machine learning. This study demonstrates the usefulness of several methods by applying them to an openly available dataset. The dataset includes the effect of no-tillage on crop yield relative to conventional tillage, together with soil, climate, and management variables. The data analysis found that no-tillage management can increase maize yield where the yield under conventional tillage is <5000 kg/ha and the maximum temperature is higher than 32°. These methods help answer (i) which variables are important for prediction in regression/classification, (ii) which variable interactions are important for prediction, (iii) how important variables and their interactions are associated with the response variable, (iv) what are the reasons underlying a predicted value for a certain instance, and (v) whether different machine learning algorithms offer the same answer to these questions. I argue that in current practice the goodness of model fit is overemphasized through model performance measures while these questions remain unanswered. XAI and interpretable machine learning can enhance trust and explainability in AI.
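As a rough illustration of the workflow the abstract describes (not the paper's own analysis code), the sketch below fits a black-box model to a generic tabular dataset and then addresses questions (i), (iii), and (iv) with permutation importance, partial dependence, and SHAP values. The file name, the column names "yield_ratio" and "max_temperature", and the choice of a random forest are hypothetical placeholders, not the paper's actual dataset schema or pipeline.

# A minimal, illustrative sketch, assuming a hypothetical CSV "no_till_dataset.csv"
# with a numeric response column "yield_ratio" and predictors including
# "max_temperature". Requires pandas, scikit-learn, and the third-party shap package.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import partial_dependence, permutation_importance
from sklearn.model_selection import train_test_split
import shap

# Load the (hypothetical) dataset and hold out a test set.
df = pd.read_csv("no_till_dataset.csv")
X = df.drop(columns=["yield_ratio"])
y = df["yield_ratio"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a black-box model; a good fit alone says nothing about why it predicts well.
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))

# (i) Which variables are important? Permutation importance on the test set.
pi = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for name, score in sorted(zip(X.columns, pi.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")

# (iii) How is an important variable associated with the response?
# One-way partial dependence along the (hypothetical) "max_temperature" feature.
pd_result = partial_dependence(model, X_test, features=["max_temperature"])
print(pd_result["average"])  # mean prediction along the feature grid

# (iv) Why this prediction for one instance? SHAP values decompose a single
# prediction into additive per-feature contributions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test.iloc[[0]])
print(dict(zip(X.columns, shap_values[0])))

Question (ii) could be probed by passing a feature pair to partial_dependence for a two-way plot, and question (v) by repeating the same interpretation steps with a different learner (e.g., gradient boosting) and comparing the resulting rankings; these are generic options, not necessarily the methods used in the paper.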

Keywords: Interpretable machine learning; Explainable artificial intelligence; Agriculture; Crop yield; No-tillage; XAI

Classification: TP181 [Automation and Computer Technology - Control Theory and Control Engineering]

 
