Implementation of Rapid Code Transformation Process Using Deep Learning Approaches  


Authors: Bao Rong Chang, Hsiu-Fen Tsai, Han-Lin Chou

Affiliations: [1] Department of Computer Science and Information Engineering, National University of Kaohsiung, Kaohsiung 811, Taiwan; [2] Department of Fragrance and Cosmetic Science, Kaohsiung Medical University, Kaohsiung 811, Taiwan

Source: Computer Modeling in Engineering & Sciences, 2023, No. 7, pp. 107-134 (28 pages)

Funding: Supported by the Ministry of Science and Technology, Taiwan, under Grant Nos. MOST 111-2221-E-390-012 and MOST 111-2622-E-390-001.

Abstract: Our previous work introduced program generation with the code transformation model GPT-2 and verified the generated programming codes through the simhash (SH) and longest common subsequence (LCS) algorithms. However, the entire code transformation process proved time-consuming. The objective of this study is therefore to speed up the code transformation process significantly. This paper proposes deep learning approaches that modify SH into a variational simhash (VSH) algorithm and replace LCS with a piecewise longest common subsequence (PLCS) algorithm to accelerate the verification process in the test phase. Besides the code transformation model GPT-2, this study also introduces Microsoft MASS and Facebook BART for a comparative analysis of their performance. Meanwhile, the explainable AI technique local interpretable model-agnostic explanations (LIME) is applied to interpret the decision-making of the AI models. The experimental results show that VSH reduces the number of qualified programs by 22.11%, and PLCS reduces the execution time of selected pocket programs by 32.39%. As a result, the proposed approaches speed up the entire code transformation process by 1.38 times on average compared with our previous work.
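For context, the sketch below illustrates the baseline verification pair the abstract says this paper speeds up: simhash fingerprinting compared by Hamming distance, and dynamic-programming LCS. The `piecewise_lcs` function is only an illustrative guess at the segmentation idea behind PLCS (run LCS on aligned fixed-size segments and sum the results); the paper's actual VSH and PLCS algorithms are not specified in this abstract, and all function names here are hypothetical.

```python
import hashlib

def simhash(tokens, bits=64):
    """Classic simhash: each token's hash casts a +1/-1 vote per bit;
    the fingerprint keeps the sign of each bit's total."""
    votes = [0] * bits
    for tok in tokens:
        h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    fingerprint = 0
    for i in range(bits):
        if votes[i] > 0:
            fingerprint |= 1 << i
    return fingerprint

def hamming_distance(a, b):
    """Number of differing bits; small distance means similar code."""
    return bin(a ^ b).count("1")

def lcs_length(a, b):
    """Standard O(len(a) * len(b)) dynamic-programming LCS length."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def piecewise_lcs(a, b, segment=64):
    """Hypothetical piecewise variant: run LCS on aligned fixed-size
    segments and sum the results, trading exactness for speed."""
    total = 0
    for start in range(0, max(len(a), len(b)), segment):
        total += lcs_length(a[start:start + segment],
                            b[start:start + segment])
    return total

# Usage: compare a reference program against a generated candidate.
ref_tokens = "def add ( a , b ) : return a + b".split()
gen_tokens = "def add ( x , y ) : return x + y".split()
print(hamming_distance(simhash(ref_tokens), simhash(gen_tokens)))
print(piecewise_lcs(ref_tokens, gen_tokens))
```

Segmenting replaces one large O(mn) LCS table with a series of small O(s^2) tables, which illustrates why a piecewise scheme can cut verification time, though the mechanism behind the reported 32.39% reduction may differ.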

Keywords: code transformation model; variational simhash; piecewise longest common subsequence; explainable AI; LIME

Classification: TP18 [Automation and Computer Technology: Control Theory and Control Engineering]

 
