Code Transform Model Producing High-Performance Program (Cited by: 1)
Authors: Bao Rong Chang, Hsiu-Fen Tsai, Po-Wen Su. Computer Modeling in Engineering & Sciences (SCIE, EI), 2021, No. 10, pp. 253-277 (25 pages).
This paper introduces a novel transform method that uses a code transform model, the second-generation Generative Pre-trained Transformer (GPT-2), to produce newly generated programs, significantly improving program execution performance. In addition, a theoretical estimation in statistics gives the minimum number of generated programs required to guarantee that the best one can be found among them. The proposed approach can help a voice assistant machine resolve the problem of inefficient execution of application code. Beyond GPT-2, this study develops the variational Simhash algorithm to check the code similarity between the sample program and a newly generated program, and devises the piecewise longest common subsequence algorithm to examine the conformity of the two programs' execution outputs. The code similarity check screens out redundant generated programs, and the output conformity check finds the best-performing generative program. In addition to text, the proposed approach can also handle other media, including images, sounds, and movies. As a result, the newly generated program significantly outperforms the sample program: the number of code lines is reduced by 27.21%, and the program execution time is shortened by 24.62%.
Keywords: Newly generated programs; GPT-2; predetermined generative programs; variational Simhash algorithm; piecewise longest common subsequence
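
The abstract names a variational Simhash algorithm for the code similarity check but does not give its internals. As a rough illustration only, the Python sketch below computes a plain 64-bit Simhash fingerprint over code tokens and compares two programs by normalized Hamming distance; the tokenizer, the MD5 token hash, and the 64-bit width are all assumptions of this sketch, and whatever "variational" modification the paper applies is not reproduced here.

    import hashlib
    import re

    def simhash(tokens, bits=64):
        """Standard Simhash: sum signed bit votes from each token's hash,
        then keep the sign of each bit position as the fingerprint."""
        votes = [0] * bits
        for tok in tokens:
            h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
            for i in range(bits):
                votes[i] += 1 if (h >> i) & 1 else -1
        fingerprint = 0
        for i in range(bits):
            if votes[i] > 0:
                fingerprint |= 1 << i
        return fingerprint

    def tokenize_code(source):
        """Split program text into identifier and symbol tokens (simplified)."""
        return re.findall(r"[A-Za-z_]\w*|[^\sA-Za-z_]", source)

    def similarity(src_a, src_b, bits=64):
        """Similarity = 1 - normalized Hamming distance between fingerprints."""
        fa = simhash(tokenize_code(src_a), bits)
        fb = simhash(tokenize_code(src_b), bits)
        hamming = bin(fa ^ fb).count("1")
        return 1.0 - hamming / bits

    sample = "total = 0\nfor i in range(10):\n    total += i"
    generated = "total = sum(range(10))"
    print(f"similarity = {similarity(sample, generated):.2f}")

A generated program whose fingerprint sits too close to the sample's (or to an already-kept candidate's) would be discarded as redundant, which matches the deduplication role the abstract describes.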
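The output conformity check can be illustrated with a piecewise longest common subsequence: split both programs' outputs into pieces and average the per-piece LCS ratios, which keeps the quadratic LCS cost bounded per piece. The fixed piece size and the averaging rule below are assumptions of this sketch; the paper's actual segmentation scheme is not specified in the abstract.

    def lcs_length(a, b):
        """Classic dynamic-programming LCS length between two sequences."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if a[i - 1] == b[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1] + 1
                else:
                    dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
        return dp[m][n]

    def piecewise_lcs_conformity(out_a, out_b, piece_size=64):
        """Chunk both outputs into fixed-size pieces, score each aligned
        pair by LCS ratio, and average; unmatched trailing pieces from
        the longer output count as zero, penalizing length mismatch."""
        pieces_a = [out_a[i:i + piece_size] for i in range(0, len(out_a), piece_size)]
        pieces_b = [out_b[i:i + piece_size] for i in range(0, len(out_b), piece_size)]
        ratios = []
        for pa, pb in zip(pieces_a, pieces_b):
            longest = max(len(pa), len(pb))
            ratios.append(lcs_length(pa, pb) / longest if longest else 1.0)
        total = max(len(pieces_a), len(pieces_b))
        return sum(ratios) / total if total else 1.0

    # Identical outputs score 1.0; divergent outputs score lower.
    print(piecewise_lcs_conformity("hello world\n42\n", "hello world\n42\n"))

A candidate whose conformity score falls below a chosen threshold would fail the check, leaving only generated programs that reproduce the sample program's output.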
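The abstract also mentions a statistical estimate of the minimum number of generated programs needed to guarantee that the best one appears among them, without stating the derivation. One common way such a bound is obtained, shown below purely as an assumed illustration and not as the paper's actual formula, treats each generation as an independent draw that lands in the top fraction p of achievable performance with probability p.

    import math

    def min_generated_programs(top_fraction=0.1, confidence=0.95):
        """Smallest n such that P(at least one of n i.i.d. draws lands in
        the top `top_fraction` of performance) >= `confidence`:
            1 - (1 - p)^n >= c  =>  n >= log(1 - c) / log(1 - p)
        Both parameter values here are placeholders, not the paper's."""
        p, c = top_fraction, confidence
        return math.ceil(math.log(1.0 - c) / math.log(1.0 - p))

    print(min_generated_programs())  # 29 draws for p = 0.1, c = 0.95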