Journal Articles
2 articles found
A Hybrid Deep Learning Approach for PM2.5 Concentration Prediction in Smart Environmental Monitoring
Authors: Minh Thanh Vo, Anh H. Vo, Huong Bui, Tuong Le. Intelligent Automation & Soft Computing (SCIE), 2023, No. 6, pp. 3029-3041 (13 pages)
Air pollution is a serious environmental problem in developing countries, and particulate matter 2.5 (PM2.5) is one of its major pollutants. When PM2.5 concentrations are high, as in developing countries like Vietnam, they harm everyone's health. Accurate prediction of PM2.5 concentrations therefore supports sound decisions for protecting citizens' health. This study develops a hybrid deep learning approach, the PM25-CBL model, for PM2.5 concentration prediction in Ho Chi Minh City, Vietnam. First, the study analyzes the effect of each variable on PM2.5 concentrations in the Air Quality HCMC dataset; only variables that affect the results are selected for prediction. Second, an efficient PM25-CBL model that integrates a convolutional neural network (CNN) and Bidirectional Long Short-Term Memory (Bi-LSTM) is developed. The model consists of three modules: a CNN module, a Bi-LSTM module, and a fully connected module. Finally, experiments compare the proposed approach with several state-of-the-art deep learning models for time series prediction, including LSTM, Bi-LSTM, a combination of CNN and LSTM (CNN-LSTM), and ARIMA. The empirical results confirm that the PM25-CBL model outperforms the other methods on the Air Quality HCMC dataset in terms of several metrics, including Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE).
Keywords: time series prediction; PM2.5 concentration prediction; CNN; Bi-LSTM network
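The abstract describes a three-module pipeline: a CNN extracts local temporal features, a Bi-LSTM processes them in both directions, and a fully connected layer produces the prediction. The NumPy sketch below traces that data flow end to end; all dimensions, weight scales, and the single-output regression head are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1-D convolution with ReLU: x (T, C_in), w (K, C_in, C_out)."""
    K, _, c_out = w.shape
    T = x.shape[0] - K + 1
    out = np.zeros((T, c_out))
    for t in range(T):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def lstm_pass(x, Wx, Wh, b, reverse=False):
    """Single-layer LSTM over x (T, D); returns hidden states (T, H)."""
    T = x.shape[0]
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    hs = np.zeros((T, H))
    order = range(T - 1, -1, -1) if reverse else range(T)
    for t in order:
        z = x[t] @ Wx + h @ Wh + b                 # gate pre-activations (4H,)
        i, f, g, o = np.split(z, 4)
        i, f, o = 1 / (1 + np.exp(-i)), 1 / (1 + np.exp(-f)), 1 / (1 + np.exp(-o))
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        hs[t] = h
    return hs

# Illustrative shapes: 24 hourly steps, 5 selected air-quality features
T, C, F, H = 24, 5, 8, 16
x = rng.normal(size=(T, C))

# CNN module: local temporal feature extraction
feats = conv1d(x, rng.normal(scale=0.1, size=(3, C, F)), np.zeros(F))

# Bi-LSTM module: independent forward and backward passes, concatenated
def params():
    return (rng.normal(scale=0.1, size=(F, 4 * H)),
            rng.normal(scale=0.1, size=(H, 4 * H)),
            np.zeros(4 * H))

h_fwd = lstm_pass(feats, *params())
h_bwd = lstm_pass(feats, *params(), reverse=True)
h = np.concatenate([h_fwd[-1], h_bwd[0]])          # (2H,)

# Fully connected module: regress the next PM2.5 value
w_out = rng.normal(scale=0.1, size=(2 * H,))
pm25_pred = float(h @ w_out)
print(pm25_pred)
```

The forward pass reads the last forward hidden state and the first backward hidden state, each of which has seen the full sequence from its own direction; a trained version would fit the weights by gradient descent rather than sampling them randomly.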
A Method of Integrating Length Constraints into Encoder-Decoder Transformer for Abstractive Text Summarization
Authors: Ngoc-Khuong Nguyen, Dac-Nhuong Le, Viet-Ha Nguyen, Anh-Cuong Le. Intelligent Automation & Soft Computing, 2023, No. 10, pp. 1-18 (18 pages)
Text summarization aims to generate a concise version of an original text. The longer the summary, the more detail it retains from the original, and the appropriate length depends on the intended use. Generating summaries of a desired length is therefore a vital step toward putting the research into practice. To solve this problem, this paper proposes a new method for integrating the desired summary length into the encoder-decoder model for abstractive text summarization. The length parameter is integrated into the encoding phase at each self-attention step, and into the decoding process by preserving the remaining length for calculating head attention during generation and by using it as length embeddings added to the word embeddings. Experiments with the proposed model were conducted on two datasets, Cable News Network (CNN) Daily and NEWSROOM, with different desired output lengths. The results show the proposed model's effectiveness compared with related studies.
Keywords: length-controllable abstractive text summarization; length embedding
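The core idea in the abstract, adding an embedding of the remaining length budget to the word embeddings at each decoding step, can be sketched in a few lines. The vocabulary size, model dimension, and lookup-table construction below are illustrative assumptions; the paper additionally injects the length parameter into the encoder's self-attention, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

V, D, MAX_LEN = 100, 16, 64                           # vocab, model dim, max budget
tok_emb = rng.normal(scale=0.1, size=(V, D))          # word embedding table
len_emb = rng.normal(scale=0.1, size=(MAX_LEN + 1, D))  # remaining-length embeddings

def decoder_inputs(token_ids, desired_len):
    """At step t the decoder input is the word embedding plus the
    embedding of the remaining budget (desired_len - t), clipped at 0."""
    steps = np.arange(len(token_ids))
    remaining = np.clip(desired_len - steps, 0, MAX_LEN)
    return tok_emb[token_ids] + len_emb[remaining]

tokens = np.array([5, 17, 42, 8])                     # previously generated ids
x = decoder_inputs(tokens, desired_len=10)
print(x.shape)  # (4, 16)
```

Because the budget shrinks by one at every step, the model receives an explicit, learnable signal of how much room remains, which is what lets it steer the summary toward the requested length.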