Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》), 2001, No. 3, pp. 327-331 (5 pages)
Funding: National Natural Science Foundation of China
Abstract: Based on the membership of fuzzy classes with respect to a given category in rough set theory, this paper presents LBR (Learning By Rough sets), a new method for extracting rules from decision tables, and on this basis proposes a new rough-fuzzy neural network (R-FNN) model. Applied to precipitation forecasting as an example, the model achieves a very good fit and therefore has broad application prospects.

Based on the membership function of a given class of the fuzzy partition in rough set theory, this paper puts forward a new method of extracting knowledge from decision tables, named LBR (Learning By Rough sets). In this way we can obtain possible rules and necessary rules together with their belief degrees. Because it is grounded in rough set theory, the method is well suited to handling incomplete and vague information. Neural networks, in turn, imitate the human mind, which itself works remarkably well with incomplete information. Combining LBR with a neural network therefore yields a strong and effective approach to handling incomplete and fuzzy data intelligently. This is the idea behind the rough-fuzzy neural network (R-FNN) model, which automates rule learning by means of LBR and removes the need to acquire rules before constructing the network, leading to an intelligent network. A by-product of the network is the membership functions of the fuzzy classes: a fuzzy neural network usually obtains the partition of the universe and the corresponding membership functions of the classes from expert experience, whereas the R-FNN proposed here can assign the membership functions arbitrarily at construction time and then adjust their parameters by training. Another advantage of the network is that it can exclude singletons automatically. Since the data are not preprocessed before construction, the rules are not reduced; if the classes induced by the partition of the universe are too numerous, the system will saturate because of too many hidden-layer nodes. To avoid this, a suitable threshold λ should be chosen for the real case, and the nodes representing rules whose belief degree is smaller than λ should be omitted. If a suitable λ is chosen through training of the network, the nodes can be reduced automatically.
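The abstract's description of LBR (deriving possible and necessary rules with belief degrees from a decision table, then pruning rules whose belief degree falls below a threshold λ) can be illustrated with a minimal sketch. This is not the authors' implementation: the attribute names and the toy decision table below are hypothetical, and the belief degree is taken here as the rough-membership fraction of matching objects that carry a given decision class.

from collections import Counter, defaultdict

def extract_rules(decision_table, condition_attrs, decision_attr, threshold=0.0):
    """Group objects by their condition-attribute values and derive rules
    'conditions -> decision class' with a belief degree equal to the
    fraction of matching objects that belong to that class."""
    groups = defaultdict(list)
    for obj in decision_table:
        key = tuple(obj[a] for a in condition_attrs)
        groups[key].append(obj[decision_attr])

    rules = []
    for conditions, decisions in groups.items():
        counts = Counter(decisions)
        total = len(decisions)
        for cls, n in counts.items():
            belief = n / total          # 1.0 -> certain (necessary) rule, <1.0 -> possible rule
            if belief >= threshold:     # prune weak rules, analogous to the threshold λ above
                rules.append({
                    "if": dict(zip(condition_attrs, conditions)),
                    "then": cls,
                    "belief": belief,
                })
    return rules

if __name__ == "__main__":
    # Hypothetical toy decision table: weather conditions -> rainfall class.
    table = [
        {"humidity": "high", "pressure": "low",  "rain": "heavy"},
        {"humidity": "high", "pressure": "low",  "rain": "light"},
        {"humidity": "high", "pressure": "high", "rain": "none"},
        {"humidity": "low",  "pressure": "high", "rain": "none"},
    ]
    for rule in extract_rules(table, ["humidity", "pressure"], "rain", threshold=0.5):
        print(rule)

Running the sketch prints one rule per distinct condition combination and decision class: the two conflicting "high humidity, low pressure" rules each receive belief 0.5 (possible rules), while the "high pressure" rules receive belief 1.0 (certain rules); lowering or raising the threshold controls which of these survive, mirroring the node-pruning role of λ in the R-FNN.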
Keywords: fuzzy sets; rule extraction; rough-fuzzy neural network; rough set theory; learning algorithm
Classification code: TP183 [Automation and Computer Technology / Control Theory and Control Engineering]