Implementation of Rapid Code Transformation Process Using Deep Learning Approaches
Authors: Bao Rong Chang, Hsiu-Fen Tsai, Han-Lin Chou. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, Issue 7, pp. 107-134 (28 pages)
Our previous work introduced a program-generation pipeline built on the code transformation model GPT-2, verifying the generated programming codes through the simhash (SH) and longest common subsequence (LCS) algorithms. However, the entire code transformation process was time-consuming. The objective of this study is therefore to speed up the code transformation process significantly. This paper proposes deep learning approaches that modify SH with a variational simhash (VSH) algorithm and replace LCS with a piecewise longest common subsequence (PLCS) algorithm to accelerate the verification process in the test phase. Besides the code transformation model GPT-2, this study also introduces Microsoft MASS and Facebook BART for a comparative analysis of their performance. Meanwhile, the explainable AI technique of local interpretable model-agnostic explanations (LIME) is applied to interpret the decision-making of the AI models. The experimental results show that VSH can reduce the number of qualified programs by 22.11%, and PLCS can reduce the execution time of selected pocket programs by 32.39%. As a result, the proposed approaches speed up the entire code transformation process by 1.38 times on average compared with our previous work.
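The abstract builds on two baseline verification algorithms, simhash and longest common subsequence. As a minimal illustrative sketch (not the paper's VSH or PLCS variants, whose details are not given here), the classic forms of both can be written as:

```python
# Illustrative sketch of the two baseline verifiers named in the abstract:
# classic simhash fingerprinting and LCS length via dynamic programming.
import hashlib

def simhash(text: str, bits: int = 64) -> int:
    """Classic simhash: each token's hash votes on every bit position."""
    votes = [0] * bits
    for token in text.split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    # Bits with a positive vote total become 1 in the fingerprint.
    return sum(1 << i for i in range(bits) if votes[i] > 0)

def hamming(a: int, b: int) -> int:
    """Hamming distance between two fingerprints (smaller = more similar)."""
    return bin(a ^ b).count("1")

def lcs_len(a: str, b: str) -> int:
    """Longest common subsequence length via the standard O(len(a)*len(b)) DP."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if ca == cb else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]
```

In a verification setting such as the one described, a generated program would typically be compared against a reference: a small simhash Hamming distance filters near-duplicate candidates cheaply, and the LCS length then scores how much of the reference code the candidate preserves.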
Keywords: code transformation model, variational simhash, piecewise longest common subsequence, explainable AI, LIME