Static Poisson’s ratio (vs) is crucial for determining geomechanical properties in petroleum applications, namely sand production. Some models have been used to predict vs; however, the published models were limited to specific data ranges with an average absolute percentage relative error (AAPRE) of more than 10%. The published gated recurrent unit (GRU) models do not consider trend analysis to show physical behaviors. In this study, we aim to develop a GRU model using trend analysis and three inputs for predicting vs based on a broad range of data: vs (0.1627-0.4492), bulk formation density (RHOB) (0.315-2.994 g/mL), compressional time (DTc) (44.43-186.9 μs/ft), and shear time (DTs) (72.9-341.2 μs/ft). The GRU model was evaluated using different approaches, including statistical error analyses. The GRU model showed the proper trends, and its data ranges were wider than those of previous models. The GRU model has the largest correlation coefficient (R) of 0.967 and the lowest AAPRE, average percent relative error (APRE), root mean square error (RMSE), and standard deviation (SD) of 3.228%, 1.054%, 4.389, and 0.013, respectively, compared to other models. The GRU model is highly accurate across the training, validation, testing, and whole datasets, with R and AAPRE values of 0.981 and 2.601%, 0.966 and 3.274%, 0.967 and 3.228%, and 0.977 and 2.861%, respectively. The group error analyses of all inputs show that the GRU model has less than 5% AAPRE for all input ranges, which is superior to other models that have AAPRE values of more than 10% at various input ranges.
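The GRU architecture referenced in the abstract above can be sketched as a single numpy GRU cell stepped over a short sequence. This is a minimal illustration of the gate equations only; the weight shapes, random initialization, and the mapping of the three inputs are assumptions, not the paper's trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W, U, b):
    """One GRU step. x: input (n_in,), h: previous hidden state (n_h,).
    W, U, b hold the update (z), reset (r), and candidate (c) parameters."""
    z = sigmoid(W["z"] @ x + U["z"] @ h + b["z"])        # update gate
    r = sigmoid(W["r"] @ x + U["r"] @ h + b["r"])        # reset gate
    c = np.tanh(W["c"] @ x + U["c"] @ (r * h) + b["c"])  # candidate state
    return (1.0 - z) * h + z * c                         # blend old and new

rng = np.random.default_rng(0)
n_in, n_h = 3, 8  # three inputs, e.g. scaled RHOB, DTc, DTs
W = {k: rng.normal(scale=0.1, size=(n_h, n_in)) for k in "zrc"}
U = {k: rng.normal(scale=0.1, size=(n_h, n_h)) for k in "zrc"}
b = {k: np.zeros(n_h) for k in "zrc"}

h = np.zeros(n_h)
for x in rng.normal(size=(5, n_in)):  # a short depth sequence
    h = gru_cell(x, h, W, U, b)
print(h.shape)  # (8,)
```

A regression head (a small dense layer mapping h to a single vs value) would sit on top of this cell in a full model.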
Determining the adsorption of shale gas on complex surfaces remains a challenge in molecular simulation studies. Difficulties essentially stem from the need to create a realistic shale structure model in terms of mineral heterogeneity and multiplicity. Moreover, precise characterization of the competitive adsorption of hydrogen and methane in shale generally requires the experimental determination of the related adsorptive capacity. In this study, the adsorption of the adsorbates methane (CH_(4)) and hydrogen (H_(2)) on heterogeneous shale surface models of Kaolinite, Orthoclase, Muscovite, Mica, C_(60), and Butane has been simulated within the framework of a molecular dynamics numerical technique. The results show that these behaviors are influenced by pressure and potential energy. On increasing the pressure from 500 to 2000 psi, the sorption effect for CH_(4) significantly increases but shows a decline at a certain stage (compared to H_(2)). The research findings also indicate that raw shale has a higher capacity to adsorb CH_(4) than hydrogen; however, in shale, this difference is negligible.
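The pressure- and potential-energy-dependent sorption described above rests on pairwise interaction potentials. As a minimal sketch, the 12-6 Lennard-Jones potential is commonly used for gas–surface interactions in such MD studies; the reduced-unit ε and σ values here are illustrative assumptions, not the paper's force field.

```python
import numpy as np

def lennard_jones(r, epsilon, sigma):
    """12-6 Lennard-Jones pair potential:
    U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# Illustrative parameters in reduced units (not a calibrated CH4 model).
eps, sig = 1.0, 1.0
r = np.linspace(0.9, 3.0, 500)
u = lennard_jones(r, eps, sig)
r_min = r[np.argmin(u)]  # potential minimum at 2**(1/6)*sigma
print(round(r_min, 2))  # 1.12
```

The well depth ε controls how strongly an adsorbate binds, which is why CH_(4) (deeper well than H_(2) in typical force fields) adsorbs more strongly at the same pressure.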
Leakages from subsea oil and gas equipment cause substantial economic losses and damage to the marine ecosystem, so it is essential to locate the source of the leak. However, due to the complexity and variability of the marine environment, the signals collected by hydrophones contain a variety of noises, which makes it challenging to extract useful signals for localization. To solve this problem, a hydrophone denoising algorithm is proposed based on variational modal decomposition (VMD) with grey wolf optimization. First, the average envelope entropy is used as the fitness function of the grey wolf optimizer to find the optimal solution for the parameters K and α. Afterward, the VMD algorithm decomposes the original signal with these parameters to obtain the intrinsic mode functions (IMFs). Subsequently, the correlation between each IMF and the original signal is calculated, a threshold is set, and the noise components are removed; the time difference is then computed using the valid signal obtained by reconstruction. Finally, the arrival time difference is used to locate the origin of the leak. The localization accuracy of the method in finding leaks is investigated experimentally by constructing a simulated leak test rig, and the effectiveness and feasibility of the method are verified.
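The IMF screening step described above can be sketched as follows: correlate each IMF with the original signal and reconstruct only from the modes above a threshold. The threshold value and the stand-in "IMFs" below are illustrative assumptions; a real pipeline would feed in actual VMD output.

```python
import numpy as np

def select_and_reconstruct(signal, imfs, threshold=0.5):
    """Keep IMFs whose Pearson correlation with the original signal
    exceeds the threshold, then sum them into a denoised signal."""
    kept = []
    for imf in imfs:
        rho = np.corrcoef(signal, imf)[0, 1]
        if abs(rho) > threshold:
            kept.append(imf)
    return np.sum(kept, axis=0), len(kept)

t = np.linspace(0.0, 1.0, 1000)
clean = np.sin(2 * np.pi * 5 * t)          # signal-like component
rng = np.random.default_rng(1)
noise = 0.2 * rng.normal(size=t.size)      # noise-like component
observed = clean + noise
# Stand-ins for VMD output: one signal-like mode, one noise-like mode.
imfs = [clean, noise]
denoised, n_kept = select_and_reconstruct(observed, imfs)
print(n_kept)  # 1
```

The noise-like mode correlates only weakly with the observed signal and is dropped, so the reconstruction recovers the clean component.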
Diagnosing various diseases such as glaucoma, age-related macular degeneration, cardiovascular conditions, and diabetic retinopathy involves segmenting retinal blood vessels. The task is particularly challenging when dealing with color fundus images due to issues like non-uniform illumination, low contrast, and variations in vessel appearance, especially in the presence of different pathologies. Furthermore, the speed of the retinal vessel segmentation system is of utmost importance. With the surge of now-available big data, the speed of the algorithm becomes increasingly important, carrying almost equivalent weight to its accuracy. To address these challenges, we present a novel approach for retinal vessel segmentation, leveraging efficient and robust techniques based on multiscale line detection and mathematical morphology. Our algorithm's performance is evaluated on two publicly available datasets, namely the Digital Retinal Images for Vessel Extraction (DRIVE) dataset and the Structured Analysis of the Retina (STARE) dataset. The experimental results demonstrate the effectiveness of our method, with mean accuracy values of 0.9467 for DRIVE and 0.9535 for STARE, as well as sensitivity values of 0.6952 for DRIVE and 0.6809 for STARE. Notably, our algorithm exhibits competitive performance with state-of-the-art methods. Importantly, it operates at an average speed of 3.73 s per image for DRIVE and 3.75 s for STARE. It is worth noting that these results were achieved using Matlab scripts containing multiple loops, which suggests that the processing time can be further reduced by replacing loops with vectorization; thus, the proposed algorithm can be deployed in real-time applications. In summary, our proposed system strikes a fine balance between swift computation and accuracy that is on par with the best available methods in the field.
Purpose: The purpose of this study is to serve as a comprehensive review of the existing annotated corpora for event extraction, which are limited but essential for training and improving existing event extraction algorithms. In addition to this primary goal, the study provides guidelines for preparing an annotated corpus and suggests suitable tools for the annotation task. Design/methodology/approach: This study employs an analytical approach to examine available corpora suitable for event extraction tasks. It offers an in-depth analysis of existing event extraction corpora and provides systematic guidelines for researchers to develop accurate, high-quality corpora. This ensures the reliability of the created corpus and its suitability for training machine learning algorithms. Findings: Our exploration reveals a scarcity of annotated corpora for event extraction tasks. In particular, the English corpora mainly focus on the biomedical and general domains. Despite this scarcity, several high-quality corpora are available and widely used as benchmark datasets. However, access to some of these corpora might be limited owing to closed-access policies or discontinued maintenance after their initial release, rendering them inaccessible owing to broken links. Therefore, this study documents the available corpora for event extraction tasks. Research limitations: Our study focuses only on well-known corpora available in English and Chinese. Nevertheless, it places a strong emphasis on the English corpora owing to English's status as a global lingua franca, making it widely understood compared to other languages. Practical implications: We genuinely believe that this study provides valuable knowledge that can serve as a guiding framework for preparing and accurately annotating events from text corpora. It provides comprehensive guidelines for researchers to improve the quality of corpus annotations, especially for event extraction tasks across various domains. Originality/value: This study comprehensively compiled information on the existing annotated corpora for event extraction tasks and provided preparation guidelines.
Rising concerns about climate change drive the demand for lightweight components. Magnesium (Mg) alloys are highly valued for their low weight, making them increasingly important in various industries. Research on enhancing the characteristics of Mg alloys and developing their Metal Matrix Composites (MMCs) has gained significant attention worldwide over the past decade, driven by the global shift towards lightweight materials. Friction Stir Processing (FSP) has emerged as a promising technique to enhance the properties of Mg alloys and produce Mg-MMCs. Initially adopted to refine grain size from the micro to the nano level, FSP has accelerated the development of MMCs owing to its solid-state nature and the synergistic effects of microstructure refinement and reinforcement, improving strength, hardness, ductility, wear resistance, corrosion resistance, and fatigue strength. However, producing defect-free and sound FSPed Mg and Mg-MMCs requires addressing several variables and their interdependencies, which opens up a broad range of practical applications. Although reviews exist on FSP of Mg, its alloys, and MMCs individually, this work analyzes the latest research on these three aspects collectively to enhance the understanding, application, and effectiveness of FSP for Mg and its derivatives. This review discusses the literature, classifies the importance of Mg alloys, provides a historical background, and explores developments and potential applications of FSPed Mg alloys. It focuses on novel fabrication methods, reinforcement strategies, machine and tool design parameters, material characterization, and integration with other methods for enhanced properties. The influence of process parameters and the emergence of defects are examined, along with specific applications in mono and hybrid composites and their microstructure evolution. The study identifies promising reinforcement materials and highlights research gaps in FSP for Mg alloys and MMC production. It concludes with significant recommendations for further exploration, reflecting ongoing advancements in this field.
Object detection has made a significant leap forward in recent years. However, the detection of small objects remains difficult for various reasons: they have a very small size, they are susceptible to missed detection due to background noise, and their information is degraded by downsampling operations. Deep learning-based detection methods have been utilized to address the challenge posed by small objects. In this work, we propose a novel method, the Multi-Convolutional Block Attention Network (MCBAN), to increase the detection accuracy of minute objects, aiming to overcome the information loss caused by downsampling. The multi-convolutional attention block (MCAB), composed of channel attention and a spatial attention module (SAM), has been crafted to accomplish small object detection with higher precision. We have carried out experiments on the Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) and Pattern Analysis, Statistical Modelling and Computational Learning (PASCAL) Visual Object Classes (VOC) datasets and have followed a step-wise process to analyze the results. The experimental results demonstrate significant performance gains, such as 97.75% for KITTI and 88.97% for PASCAL VOC. These findings assert unequivocally that MCBAN is much more efficient in the small object detection domain than other existing approaches.
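The channel-then-spatial gating that attention blocks like MCAB build on can be sketched in a few lines of numpy. This is a generic CBAM-style illustration; the pooling combination and the absence of learned MLP/convolution weights are simplifying assumptions, not the paper's exact design.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x):
    """x: feature map (C, H, W). Gate each channel by pooled statistics."""
    avg = x.mean(axis=(1, 2))   # global average pool -> (C,)
    mx = x.max(axis=(1, 2))     # global max pool -> (C,)
    gate = sigmoid(avg + mx)    # per-channel gate in (0, 1)
    return x * gate[:, None, None]

def spatial_attention(x):
    """Gate each spatial location by cross-channel statistics."""
    avg = x.mean(axis=0)        # (H, W)
    mx = x.max(axis=0)          # (H, W)
    gate = sigmoid(avg + mx)    # per-pixel gate in (0, 1)
    return x * gate[None, :, :]

rng = np.random.default_rng(0)
feat = rng.normal(size=(4, 8, 8))
out = spatial_attention(channel_attention(feat))
print(out.shape)  # (4, 8, 8)
```

Because every gate lies in (0, 1), the block re-weights features without changing their shape, which is what lets such modules drop into an existing detector backbone.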
Over the past two decades, machine learning techniques have been extensively used in predicting reservoir properties. While this approach has significantly contributed to the industry, selecting an appropriate model is still challenging for most researchers. Relying solely on statistical metrics to select the best model for a particular problem may not always be the most effective approach. This study encourages researchers to incorporate data visualisation in their analysis and model selection process. To evaluate the suitability of different models in predicting horizontal permeability in the Volve field, wireline logs were used to train Extra-Trees, Ridge, Bagging, and XGBoost models. The Random Forest feature selection technique was applied to select the relevant logs as inputs for the models. Based on statistical metrics, the Extra-Trees model achieved the highest test accuracy of 0.996, an RMSE of 19.54 mD, and an MAE of 3.18 mD, with XGBoost coming in second. However, when the results were visualised, it was discovered that the XGBoost model was more suitable for the problem being tackled: XGBoost was the better predictor within the sandstone interval, while the Extra-Trees model was more appropriate in non-sandstone intervals. Since this study aims to predict permeability in the reservoir interval, the XGBoost model is the most suitable. These contrasting results demonstrate the importance of incorporating data visualisation techniques as an evaluation metric. Given the heterogeneity of the subsurface, relying solely on statistical metrics may not be sufficient to determine which model is best suited for a particular problem.
Medical imaging plays a key role within modern hospital management systems for diagnostic purposes. Compression methodologies are extensively employed to mitigate storage demands and enhance transmission speed, all while upholding image quality. Moreover, an increasing number of hospitals are embracing cloud computing for patient data storage, necessitating meticulous scrutiny of server security and privacy protocols. Nevertheless, considering the widespread availability of multimedia tools, preserving digital data integrity surpasses the significance of compression alone. In response to this concern, we propose a secure storage and transmission solution for compressed medical image sequences, such as ultrasound images, utilizing a motion vector watermarking scheme. The watermark is generated employing an error-correcting code known as Bose-Chaudhuri-Hocquenghem (BCH) and is subsequently embedded into the compressed sequence via block-based motion vectors. In the process of watermark embedding, motion vectors are selected based on their magnitude and phase angle; no specific spatial area, such as a region of interest (ROI), is used in the images, and the embedding of watermark bits depends only on the motion vectors. Although reversible watermarking allows restoration of the original image sequences, we use an irreversible watermarking method, because restoring the original data could call ownership or other legal claims into question. The peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) serve as metrics for evaluating watermarked image quality. Across all images, the PSNR value exceeds 46 dB, and the SSIM value exceeds 0.92. Experimental results substantiate the efficacy of the proposed technique in preserving data integrity.
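The magnitude-and-phase selection of candidate motion vectors described above can be sketched as follows. The magnitude threshold is an illustrative assumption; the paper's actual selection rule and BCH encoding are not reproduced here.

```python
import numpy as np

def select_vectors(mvs, mag_thresh=2.0):
    """mvs: (N, 2) array of block motion vectors (dx, dy).
    Return the indices of vectors whose magnitude exceeds the threshold,
    along with the magnitude and phase angle of every vector."""
    dx, dy = mvs[:, 0], mvs[:, 1]
    mag = np.hypot(dx, dy)          # Euclidean magnitude
    phase = np.arctan2(dy, dx)      # phase angle in (-pi, pi]
    idx = np.nonzero(mag > mag_thresh)[0]
    return idx, mag, phase

mvs = np.array([[0.5, 0.5], [3.0, 4.0], [-2.0, -2.0], [0.0, 1.0]])
idx, mag, phase = select_vectors(mvs)
print(idx.tolist())  # [1, 2]
```

Larger vectors correspond to stronger motion, where a perturbed vector is less perceptible, which is the usual rationale for this kind of selection.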
Foam is utilized in enhanced oil recovery and CO_(2) sequestration. Surfactant-alternating-gas (SAG) is a preferred approach for placing foam into reservoirs because it enhances gas injection and minimizes corrosion in facilities. Our previous studies with cores of similar permeability show that during SAG injection, several banks occupy the near-well area, where the fluid exhibits distinct behaviour. However, underground reservoirs are heterogeneous, often layered, so it is crucial to understand the effect of permeability on fluid behaviour and injectivity in a SAG process. In this work, coreflood experiments are conducted in cores with permeabilities ranging from 16 to 2300 mD. We observe the same sequence of banks in cores with different permeabilities; however, the speed at which the banks propagate and their overall mobility vary with permeability. At higher permeabilities, the gas-dissolution bank and the forced-imbibition bank progress more rapidly during liquid injection, and the total mobilities of both banks decrease with permeability. By utilizing a bank-propagation model, we scale up our experimental findings and compare them to results obtained using the Peaceman equation. Our findings reveal that the liquid injectivity in a SAG foam process is misestimated by conventional simulators based on the Peaceman equation, and the lower the formation permeability, the greater the error.
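For reference, the Peaceman well model that conventional simulators apply relates the well rate to the grid-block pressure through a well index. A sketch for an isotropic square grid block is shown below in its standard form; the numerical values are purely illustrative and are not taken from the study above.

```python
import numpy as np

def peaceman_well_index(k, dz, dx, rw, skin=0.0):
    """Peaceman well index for an isotropic square grid block.
    k: permeability [m^2], dz: block thickness [m], dx: block size [m],
    rw: wellbore radius [m]. Equivalent radius ro ~= 0.2*dx."""
    ro = 0.2 * dx
    return 2.0 * np.pi * k * dz / (np.log(ro / rw) + skin)

# Illustrative numbers: 100 mD block, 10 m thick, 50 m cells, 0.1 m well.
mD = 9.869e-16  # m^2 per millidarcy
wi = peaceman_well_index(100 * mD, 10.0, 50.0, 0.1)
print(wi > 0)  # True
```

The well rate is then WI multiplied by the fluid mobility and the block-to-well pressure drop; the study's point is that the near-well foam banks violate the single-mobility assumption behind this formula.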
This study conducts a systematic literature review (SLR) of blockchain consensus mechanisms, the essential protocols that maintain the integrity, reliability, and decentralization of distributed ledger networks. The aim is to comprehensively investigate prominent mechanisms' security features and vulnerabilities, emphasizing their security considerations, applications, challenges, and future directions. The existing literature offers valuable insights into various consensus mechanisms' strengths, limitations, and security vulnerabilities, and their real-world applications. However, there remains a gap in synthesizing and analyzing this knowledge systematically; addressing it would facilitate a structured approach to comprehensively understanding consensus mechanisms' security and vulnerabilities. The study adheres to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and computer science standards, reviewing 3749 research papers from 2016 to 2024 and excluding grey literature, resulting in 290 articles for descriptive analysis. The research highlights an increased focus on blockchain consensus security, energy efficiency, and hybrid mechanisms in 60% of the research papers published after 2019, identifying gaps in scalability, privacy, and interoperability for future exploration. By synthesizing the existing research and identifying key trends, this SLR advances the understanding of blockchain consensus mechanisms' security and guides future research and structured innovation in blockchain systems and applications.
The rapid adoption of Internet of Things (IoT) technologies has introduced significant security challenges across the physical, network, and application layers, particularly with the widespread use of the Message Queue Telemetry Transport (MQTT) protocol, which, while efficient in bandwidth consumption, lacks inherent security features, making it vulnerable to various cyber threats. This research addresses these challenges by presenting a secure, lightweight communication proxy that enhances the scalability and security of MQTT-based IoT networks. The proposed solution builds upon the Dang-Scheme, a mutual authentication protocol designed explicitly for resource-constrained environments, and enhances it using Elliptic Curve Cryptography (ECC). This integration significantly improves device authentication, data confidentiality, and energy efficiency, achieving an 87.68% increase in data confidentiality and up to 77.04% energy savings during publish/subscribe communications in smart homes. The Middleware Broker System dynamically manages transaction keys and session IDs, offering robust defences against common cyber threats like impersonation and brute-force attacks. Penetration testing with tools such as Hydra and Nmap further validated the system's security, demonstrating its potential to significantly improve the security and efficiency of IoT networks while underscoring the need for ongoing research to combat emerging threats.
Shear logs, also known as shear velocity logs, are used for various types of seismic analysis, such as determining the relationship between amplitude variation with offset (AVO) and interpreting multiple types of seismic data. This log is an important tool for analyzing the properties of rocks and interpreting seismic data to identify potential areas of oil and gas reserves. However, these logs are often not collected due to cost constraints or poor borehole conditions that can lead to poor data quality, although various approaches exist in practice for estimating shear wave velocity. In this study, a detailed review of recent advances in the techniques used to measure or estimate shear wave (S-wave) velocity is carried out. These techniques include direct and indirect measurement, empirical relationships between S-wave velocity and other parameters, machine learning, and rock physics models. This study therefore assembles the employed techniques, enhancing the existing knowledge of this significant topic and offering a progressive approach for practical implementation in the field.
This study aims to formulate a steady-state mathematical model for a three-dimensional permeable enclosure (cavity) to determine the oil extraction rate using three distinct nanoparticles, SiO_(2), Al_(2)O_(3), and Fe_(2)O_(3), in unconventional oil reservoirs. The simulation is conducted for different volume fractions, porosities, and mass flow rates to determine the optimal oil recovery. The impact of the nanoparticles on relative permeability (kr) and water is also investigated. The simulation process utilizes the finite-volume solver ANSYS Fluent. The results show that when the mass flow rate at the inlet is low, oil recovery increases. They also indicate that silica nanoparticles are more effective at extracting oil from the reservoir than Al_(2)O_(3) and Fe_(2)O_(3): the maximum oil recovery with SiO_(2), Al_(2)O_(3), and Fe_(2)O_(3) is 97.8%, 96.5%, and 88%, respectively.
The segmentation of head and neck (H&N) tumors in dual Positron Emission Tomography/Computed Tomography (PET/CT) imaging is a critical task in medical imaging, providing essential information for diagnosis, treatment planning, and outcome prediction. Motivated by the need for more accurate and robust segmentation methods, this study addresses key research gaps in the application of deep learning techniques to multimodal medical images. Specifically, it investigates the limitations of existing 2D and 3D models in capturing complex tumor structures and proposes an innovative 2.5D UNet Transformer model as a solution. The primary research questions guiding this study are: (1) How can the integration of convolutional neural networks (CNNs) and transformer networks enhance segmentation accuracy in dual PET/CT imaging? (2) What are the comparative advantages of 2D, 2.5D, and 3D model configurations in this context? To answer these questions, we developed and evaluated advanced deep-learning models that leverage the strengths of both CNNs and transformers. Our methodology involved a comprehensive preprocessing pipeline, including normalization, contrast enhancement, and resampling, followed by segmentation using 2D, 2.5D, and 3D UNet Transformer models. The models were trained and tested on three diverse datasets: HeckTor2022, AutoPET2023, and SegRap2023. Performance was assessed using metrics such as the Dice Similarity Coefficient, Jaccard Index, Average Surface Distance (ASD), and Relative Absolute Volume Difference (RAVD). The findings demonstrate that the 2.5D UNet Transformer model consistently outperformed the 2D and 3D models across most metrics, achieving the highest Dice and Jaccard values, indicating superior segmentation accuracy. For instance, on the HeckTor2022 dataset, the 2.5D model achieved a Dice score of 81.777 and a Jaccard index of 0.705, surpassing the other model configurations. The 3D model showed strong boundary delineation performance but exhibited variability across datasets, while the 2D model, although effective, generally underperformed compared to its 2.5D and 3D counterparts. Compared to the related literature, our study confirms the advantages of incorporating additional spatial context, as seen in the improved performance of the 2.5D model. This research fills a significant gap by providing a detailed comparative analysis of different model dimensions and their impact on H&N segmentation accuracy in dual PET/CT imaging.
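The 2.5D input construction that gives such models their extra spatial context is commonly built by stacking each axial slice with its neighbours as channels. The sketch below assumes a slab of 3 slices with edge slices clamped; the paper's exact slab size and padding policy are not stated here.

```python
import numpy as np

def make_25d_stacks(volume, context=1):
    """volume: (D, H, W). Return (D, 2*context+1, H, W), where each slice
    is stacked with its neighbours along the channel axis (edges clamped)."""
    depth = volume.shape[0]
    stacks = []
    for i in range(depth):
        idx = np.clip(np.arange(i - context, i + context + 1), 0, depth - 1)
        stacks.append(volume[idx])  # neighbouring slices as channels
    return np.stack(stacks)

vol = np.arange(5 * 4 * 4, dtype=float).reshape(5, 4, 4)
stacks = make_25d_stacks(vol)
print(stacks.shape)  # (5, 3, 4, 4)
```

A 2D network then consumes each (2*context+1)-channel stack, gaining through-plane information without the memory cost of a full 3D model.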
The functionality of buildings depends on the climate they are subjected to; this means buildings constructed today need to perform efficiently within the current and future climate while reducing the greenhouse emissions imposed on the environment. In order to lower CO_(2) emissions and help reinforce the capability of cities to adapt to climate change while enhancing the quality of the built environment, it is vital to improve the environmental performance and energy efficiency of buildings. This study aims to identify and compare existing green building rating systems (GBRS) against Nigerian climatic conditions. A proposed criteria weighting was developed from questionnaires supplying expert opinions. The work began with a comparative analysis of GBRS, conducted to find the similarities and differences between existing rating systems and to devise an appropriate rating system for Nigeria. The approaches included fieldwork (a pilot study and survey), questionnaires, and both structured and unstructured interviews. The information generated in this pre-test investigation then formed the conversational guide and informed the decision on the main factors that should be included in the new assessment system for Nigeria.
The goals of this study are to assess the viability of waste tire-derived char (WTDC) as a sustainable, low-cost fine aggregate surrogate material for asphalt mixtures and to develop a statistically coupled neural network (SCNN) model for predicting the volumetric and Marshall properties of asphalt mixtures modified with WTDC. The study is based on experimental data acquired from laboratory volumetric and Marshall properties testing on WTDC-modified asphalt mixtures (WTDC-MAM). The input variables comprised waste tire char content and asphalt binder content; the output variables comprised mixture unit weight, total voids, voids filled with asphalt, Marshall stability, and flow. For predictive modeling, the SCNN model is employed, incorporating a three-layer neural network and preprocessing techniques to enhance accuracy and reliability. The optimal network architecture for the collected dataset was a 2:6:5 structure; the neural network was trained with 60% of the data, with 20% each used for cross-validation and testing. The network employed a hyperbolic tangent (tanh) activation function and feed-forward backpropagation. According to the results, the network model could accurately predict the volumetric and Marshall properties, with prediction accuracy above 98% and low prediction errors for both property sets. This study demonstrates WTDC's potential as a low-cost, sustainable aggregate replacement, and the SCNN-based predictive model proves its efficiency and versatility while promoting sustainable practices.
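The 2:6:5 feed-forward architecture described above amounts to the following forward pass: two inputs, six tanh hidden units, five linear outputs. The random weights are shown only to illustrate the shapes; the trained weights and the exact preprocessing are not reproduced here.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """2-input, 6-hidden (tanh), 5-output feed-forward pass."""
    hidden = np.tanh(x @ W1 + b1)   # (n, 6) hidden activations
    return hidden @ W2 + b2         # (n, 5): unit weight, voids, VFA, stability, flow

rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.5, size=(2, 6)), np.zeros(6)
W2, b2 = rng.normal(scale=0.5, size=(6, 5)), np.zeros(5)

# Inputs: waste tire char content and asphalt binder content (scaled).
X = rng.uniform(-1, 1, size=(10, 2))
Y = forward(X, W1, b1, W2, b2)
print(Y.shape)  # (10, 5)
```

Training such a network with backpropagation adjusts W1, b1, W2, b2 to minimize the error between Y and the measured volumetric and Marshall properties.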
In the era of 5G, seamless mobility handovers are critical in densely populated regions like Malaysia to mitigate disruptions and inefficiencies. 5G networks offer unprecedented data speeds and reliability, essential for advancing mobile communication and Internet of Things applications. However, ensuring continuous connectivity and service during mobility remains challenging, especially in urban settings. Digital twin technology presents a promising solution for enhancing 5G handover mechanisms. A digital twin network (DTN) mirroring Malaysia's 5G infrastructure is proposed, utilising real-time data and user behaviour insights to optimise energy consumption during handovers. The focus is on energy-efficient protocols and algorithms, reviewed through a systematic literature review. The DTN aims to improve mobility handover efficiency through predictive handovers and adaptive resource allocation, bolstered by sustainable practices such as edge computing. The potential of DTNs to optimise 5G handover processes is explored, starting with the foundational concepts of 5G mobility and digital twins and highlighting the need for improved strategies in high-mobility scenarios. Methodologies that leverage digital twins to predict network conditions, simulate handover scenarios, and proactively manage decisions are examined, reducing latency and packet loss. Case studies demonstrate how digital twins adapt dynamically to network changes and user mobility, thereby improving quality of service and user experience. Malaysia's specific 5G mobility challenges are addressed with a tailored DTN emphasising energy efficiency, evaluated through practical applications. Evaluation criteria assess effectiveness through in-depth analysis of methods, performance metrics, limitations, and recommendations for future research. Challenges and future directions, including scalability, security, and real-time data processing, are discussed, aiming to integrate digital twin technology with 5G networks for enhanced connectivity. This study provides a roadmap for leveraging digital twins to optimise 5G network performance sustainably, guiding future research and implementation strategies.
In underground coal mines, uncontrolled accumulation of methane and fine coal dust often leads to serious incidents such as explosion. Therefore, methane and dust dispersion in underground mines is closely monitored a...In underground coal mines, uncontrolled accumulation of methane and fine coal dust often leads to serious incidents such as explosion. Therefore, methane and dust dispersion in underground mines is closely monitored and strictly regulated. Accordingly, significant efforts have been devoted to study methane and dust dispersion in underground mines. In this study, methane emission and dust concentration are numerically investigated using a computational fluid dynamics(CFD) approach. Various possible scenarios of underground mine configurations are evaluated. The results indicate that the presence of continuous miner adversely affects the air flow and leads to increased methane and dust concentrations.Nevertheless, it is found that such negative effect can be minimized or even neutralized by operating the scrubber fan in suction mode. In addition, it was found that the combination of scrubber fan in suction mode and brattice results in the best performance in terms of methane and dust removal from the mining face.展开更多
Funding: The authors thank the Yayasan Universiti Teknologi PETRONAS (YUTP FRG Grant No. 015LC0-428) at Universiti Teknologi PETRONAS for supporting this study.
Abstract: Static Poisson's ratio (vs) is crucial for determining geomechanical properties in petroleum applications, namely sand production. Some models have been used to predict vs; however, the published models were limited to specific data ranges, with an average absolute percentage relative error (AAPRE) of more than 10%. The published gated recurrent unit (GRU) models do not consider trend analysis to show physical behaviors. In this study, we aim to develop a GRU model using trend analysis and three inputs for predicting vs based on a broad range of data: vs (0.1627-0.4492), bulk formation density (RHOB) (0.315-2.994 g/mL), compressional time (DTc) (44.43-186.9 μs/ft), and shear time (DTs) (72.9-341.2 μs/ft). The GRU model was evaluated using different approaches, including statistical error analyses. The GRU model showed the proper trends, and its data ranges were wider than those of previous models. Compared to other models, the GRU model has the largest correlation coefficient (R) of 0.967 and the lowest AAPRE, average percent relative error (APRE), root mean square error (RMSE), and standard deviation (SD) of 3.228%, 1.054%, 4.389, and 0.013, respectively. The GRU model has high accuracy across the training, validation, testing, and whole datasets, with R and AAPRE values of 0.981 and 2.601%, 0.966 and 3.274%, 0.967 and 3.228%, and 0.977 and 2.861%, respectively. The group error analyses of all inputs show that the GRU model has less than 5% AAPRE over all input ranges, which is superior to other models with AAPRE values of more than 10% at various ranges of inputs.
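The error measures quoted above (APRE, AAPRE, RMSE, SD) are standard statistical quantities; a minimal pure-Python sketch of how they are typically computed follows. The exact SD convention (here, the population SD of the relative errors) is an assumption, as the abstract does not define it.

```python
import math

def error_metrics(actual, pred):
    """Compute APRE, AAPRE, RMSE, and the SD of relative errors.

    Definitions follow common petroleum-engineering usage; the exact
    SD convention of the paper is assumed, not stated in the abstract.
    """
    rel = [(p - a) / a for a, p in zip(actual, pred)]  # relative errors
    n = len(rel)
    apre = 100.0 * sum(rel) / n                        # signed mean, %
    aapre = 100.0 * sum(abs(r) for r in rel) / n       # unsigned mean, %
    rmse = math.sqrt(sum((p - a) ** 2 for a, p in zip(actual, pred)) / n)
    mean_rel = sum(rel) / n
    sd = math.sqrt(sum((r - mean_rel) ** 2 for r in rel) / n)
    return apre, aapre, rmse, sd

# Toy check with a perfect prediction: every metric is zero.
print(error_metrics([0.2, 0.3, 0.4], [0.2, 0.3, 0.4]))  # → (0.0, 0.0, 0.0, 0.0)
```

A lower AAPRE and SD, together with a higher R, is what ranks the GRU model above the published correlations in the abstract.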
Funding: PETRONAS Research Fund (PRF) under PETRONAS Teknologi Transfer (PTT) Pre-Commercialization—External: YUTP-PRG Cycle 2022 (015PBC-020).
Abstract: Determining the adsorption of shale gas on complex surfaces remains a challenge in molecular simulation studies. Difficulties essentially stem from the need to create a realistic shale structure model in terms of mineral heterogeneity and multiplicity. Moreover, precise characterization of the competitive adsorption of hydrogen and methane in shale generally requires the experimental determination of the related adsorptive capacity. In this study, the adsorption of the adsorbates methane (CH₄) and hydrogen (H₂) on heterogeneous shale surface models of kaolinite, orthoclase, muscovite, mica, C₆₀, and butane has been simulated within the framework of a molecular dynamics numerical technique. The results show that these behaviors are influenced by pressure and potential energy. On increasing the pressure from 500 to 2000 psi, the sorption effect for CH₄ significantly increases but shows a decline at a certain stage (compared to H₂). The research findings also indicate that raw shale has a higher capacity to adsorb CH₄ compared to hydrogen. However, in shale, this difference is negligible.
Funding: Financially supported by the National Key Research and Development Program of China (Grant No. 2022YFC2806102), the National Natural Science Foundation of China (Grant Nos. 52171287, 52325107), the High Tech Ship Research Project of the Ministry of Industry and Information Technology (Grant Nos. 2023GXB01-05-004-03, GXBZH2022-293), the Science Foundation for Distinguished Young Scholars of Shandong Province (Grant No. ZR2022JQ25), and the Taishan Scholars Project (Grant No. tsqn201909063).
Abstract: Leakages from subsea oil and gas equipment cause substantial economic losses and damage to the marine ecosystem, so it is essential to locate the source of the leak. However, due to the complexity and variability of the marine environment, the signals collected by a hydrophone contain a variety of noises, which makes it challenging to extract useful signals for localization. To solve this problem, a hydrophone denoising algorithm is proposed based on variational modal decomposition (VMD) with grey wolf optimization. First, the average envelope entropy is used as the fitness function of the grey wolf optimizer to find the optimal solution for the parameters K and α. Afterward, the VMD algorithm decomposes the original signal to obtain the intrinsic mode functions (IMFs). Subsequently, the correlation between each IMF and the original signal was calculated, a threshold was set, and the noise components were removed; the time difference was then calculated using the valid signal obtained by reconstruction. Finally, the arrival time difference is used to locate the origin of the leak. The localization accuracy of the method in finding leaks is investigated experimentally by constructing a simulated leak test rig, verifying the effectiveness and feasibility of the method.
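The final step above localizes the leak from arrival-time differences between hydrophones. As a toy illustration (not the paper's implementation), the integer-sample delay between two discretized signals can be estimated by brute-force cross-correlation; multiplying the recovered delay by the sampling interval and the sound speed in water then gives a path-length difference.

```python
def xcorr_delay(x, y):
    """Estimate the lag (in samples) at which y best matches x,
    by brute-force cross-correlation over all integer shifts."""
    n = len(x)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-n + 1, n):
        s = 0.0
        for i in range(n):
            j = i - lag
            if 0 <= j < n:
                s += x[i] * y[j]
        if s > best_score:
            best_lag, best_score = lag, s
    return best_lag

# y is x delayed by 3 samples; correlating y against x recovers lag = 3.
x = [0, 0, 0, 1, 2, 1, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 0, 1, 2, 1, 0]
print(xcorr_delay(y, x))  # → 3
```

In practice the reconstructed (denoised) signals would be correlated instead of raw ones, which is precisely why the VMD-based denoising step precedes the time-difference calculation.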
Abstract: Diagnosing various diseases such as glaucoma, age-related macular degeneration, cardiovascular conditions, and diabetic retinopathy involves segmenting retinal blood vessels. The task is particularly challenging when dealing with color fundus images due to issues like non-uniform illumination, low contrast, and variations in vessel appearance, especially in the presence of different pathologies. Furthermore, the speed of the retinal vessel segmentation system is of utmost importance. With the surge of now-available big data, the speed of the algorithm becomes increasingly important, carrying almost equivalent weight to its accuracy. To address these challenges, we present a novel approach for retinal vessel segmentation, leveraging efficient and robust techniques based on multiscale line detection and mathematical morphology. Our algorithm's performance is evaluated on two publicly available datasets, namely the Digital Retinal Images for Vessel Extraction (DRIVE) dataset and the Structured Analysis of the Retina (STARE) dataset. The experimental results demonstrate the effectiveness of our method, with mean accuracy values of 0.9467 for DRIVE and 0.9535 for STARE, as well as sensitivity values of 0.6952 for DRIVE and 0.6809 for STARE. Notably, our algorithm exhibits competitive performance with state-of-the-art methods. Importantly, it operates at an average speed of 3.73 s per image for DRIVE and 3.75 s for STARE. It is worth noting that these results were achieved using Matlab scripts containing multiple loops, suggesting that the processing time can be further reduced by replacing loops with vectorization; the proposed algorithm can thus be deployed in real-time applications. In summary, our proposed system strikes a fine balance between swift computation and accuracy that is on par with the best available methods in the field.
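Mathematical morphology, one of the two building blocks named above, operates on binary masks with simple local rules. A hypothetical pure-Python sketch of binary dilation follows; the pipeline's actual operators and structuring elements are not specified in the abstract.

```python
def dilate(img, k=1):
    """Binary dilation of a 2D 0/1 mask with a (2k+1)x(2k+1)
    square structuring element: a pixel becomes 1 if any pixel
    in its neighborhood is 1 (vessel fragments thicken/merge)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if any(img[rr][cc]
                   for rr in range(max(0, r - k), min(h, r + k + 1))
                   for cc in range(max(0, c - k), min(w, c + k + 1))):
                out[r][c] = 1
    return out

# A single "vessel" pixel grows into a 3x3 block.
print(dilate([[0, 0, 0], [0, 1, 0], [0, 0, 0]]))
# → [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
```

Erosion is the dual operation (all neighbors must be 1); combinations such as opening and closing are the usual post-processing steps that remove speckle and bridge small vessel gaps.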
Abstract: Purpose: The purpose of this study is to serve as a comprehensive review of the existing annotated corpora. This review aims to provide information on the existing annotated corpora for event extraction, which are limited but essential for training and improving existing event extraction algorithms. In addition to this primary goal, the study provides guidelines for preparing an annotated corpus and suggests suitable tools for the annotation task. Design/methodology/approach: This study employs an analytical approach to examine available corpora that are suitable for event extraction tasks. It offers an in-depth analysis of existing event extraction corpora and provides systematic guidelines for researchers to develop accurate, high-quality corpora. This ensures the reliability of the created corpus and its suitability for training machine learning algorithms. Findings: Our exploration reveals a scarcity of annotated corpora for event extraction tasks. In particular, the English corpora are mainly focused on the biomedical and general domains. Despite this scarcity, several high-quality corpora are available and widely used as benchmark datasets. However, access to some of these corpora might be limited owing to closed-access policies or discontinued maintenance after their initial release, rendering them inaccessible through broken links. Therefore, this study documents the available corpora for event extraction tasks. Research limitations: Our study focuses only on well-known corpora available in English and Chinese. Nevertheless, it places a strong emphasis on the English corpora due to English's status as a global lingua franca, making it widely understood compared to other languages. Practical implications: We genuinely believe that this study provides valuable knowledge that can serve as a guiding framework for preparing and accurately annotating events from text corpora. It provides comprehensive guidelines for researchers to improve the quality of corpus annotations, especially for event extraction tasks across various domains. Originality/value: This study comprehensively compiles information on the existing annotated corpora for event extraction tasks and provides preparation guidelines.
Abstract: Rising concerns about climate change drive the demand for lightweight components. Magnesium (Mg) alloys are highly valued for their low weight, making them increasingly important in various industries. Research on enhancing the characteristics of Mg alloys and developing their metal matrix composites (MMCs) has gained significant attention worldwide over the past decade, driven by the global shift towards lightweight materials. Friction stir processing (FSP) has emerged as a promising technique to enhance the properties of Mg alloys and produce Mg-MMCs. Initially adapted to refine grain size from the micro to the nano level, FSP has accelerated the development of MMCs due to its solid-state nature and the synergistic effects of microstructure refinement and reinforcement, improving strength, hardness, ductility, wear resistance, corrosion resistance, and fatigue strength. However, producing defect-free and sound FSPed Mg and Mg-MMCs requires addressing several variables and their interdependencies, which opens up a broad range of practical applications. Despite existing reviews on the individual FSP of Mg, its alloys, and MMCs, this work analyzes the latest research on these three aspects collectively to enhance the understanding, application, and effectiveness of FSP for Mg and its derivatives. This review discusses the literature, classifies the importance of Mg alloys, provides a historical background, and explores developments and potential applications of FSPed Mg alloys. It focuses on novel fabrication methods, reinforcement strategies, machine and tool design parameters, material characterization, and integration with other methods for enhanced properties. The influence of process parameters and the emergence of defects are examined, along with specific applications in mono and hybrid composites and their microstructure evolution. The study identifies promising reinforcement materials and highlights research gaps in FSP for Mg alloy and MMC production. It concludes with significant recommendations for further exploration, reflecting ongoing advancements in this field.
Funding: Funded by Yayasan UTP FRG (YUTP-FRG), grant number 015LC0-280, and the Computer and Information Science Department of Universiti Teknologi PETRONAS.
Abstract: Object detection has made a significant leap forward in recent years. However, the detection of small objects continues to be a great difficulty for various reasons: they have a very small size, and they are susceptible to missed detection due to background noise. Additionally, small-object information is degraded by downsampling operations. Deep learning-based detection methods have been utilized to address the challenge posed by small objects. In this work, we propose a novel method, the Multi-Convolutional Block Attention Network (MCBAN), to increase the detection accuracy of minute objects, aiming to overcome the challenge of information loss during the downsampling process. The multi-convolutional attention block (MCAB), comprising a channel attention module and a spatial attention module (SAM), has been crafted to accomplish small object detection with higher precision. We have carried out experiments on the Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) and Pattern Analysis, Statistical Modelling and Computational Learning (PASCAL) Visual Object Classes (VOC) datasets and have followed a step-wise process to analyze the results. The experimental results demonstrate significant performance gains, such as 97.75% for KITTI and 88.97% for PASCAL VOC. The findings of this study assert quite unequivocally that MCBAN is much more efficient in the small object detection domain than other existing approaches.
Abstract: Over the past two decades, machine learning techniques have been extensively used in predicting reservoir properties. While this approach has significantly contributed to the industry, selecting an appropriate model is still challenging for most researchers. Relying solely on statistical metrics to select the best model for a particular problem may not always be the most effective approach. This study encourages researchers to incorporate data visualization in their analysis and model selection process. To evaluate the suitability of different models in predicting horizontal permeability in the Volve field, wireline logs were used to train Extra-Trees, Ridge, Bagging, and XGBoost models. The Random Forest feature selection technique was applied to select the relevant logs as inputs for the models. Based on statistical metrics, the Extra-Trees model achieved the highest test accuracy of 0.996, an RMSE of 19.54 mD, and an MAE of 3.18 mD, with XGBoost coming in second. However, when the results were visualised, it was discovered that the XGBoost model was more suitable for the problem being tackled: XGBoost was a better predictor within the sandstone interval, while the Extra-Trees model was more appropriate in non-sandstone intervals. Since this study aims to predict permeability in the reservoir interval, the XGBoost model is the most suitable. These contrasting results demonstrate the importance of incorporating data visualisation techniques as an evaluation metric. Given the heterogeneity of the subsurface, relying solely on statistical metrics may not be sufficient to determine which model is best suited for a particular problem.
Funding: Supported by the Yayasan Universiti Teknologi PETRONAS Grants, YUTP-PRG (015PBC-027) and YUTP-FRG (015LC0-311), Hilmi Hasan, www.utp.edu.my.
Abstract: Medical imaging plays a key role within modern hospital management systems for diagnostic purposes. Compression methodologies are extensively employed to mitigate storage demands and enhance transmission speed, all while upholding image quality. Moreover, an increasing number of hospitals are embracing cloud computing for patient data storage, necessitating meticulous scrutiny of server security and privacy protocols. Nevertheless, considering the widespread availability of multimedia tools, preserving digital data integrity surpasses the significance of compression alone. In response to this concern, we propose a secure storage and transmission solution for compressed medical image sequences, such as ultrasound images, utilizing a motion vector watermarking scheme. The watermark is generated employing an error-correcting code known as Bose-Chaudhuri-Hocquenghem (BCH) and is subsequently embedded into the compressed sequence via block-based motion vectors. In the process of watermark embedding, motion vectors are selected based on their magnitude and phase angle; no specific spatial area, such as a region of interest (ROI), is used in the images. Although reversible watermarking allows the restoration of the original image sequences, we use an irreversible watermarking method, because restoring the original data or images may call ownership or other legal claims into question. The peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) serve as metrics for evaluating watermarked image quality. Across all images, the PSNR value exceeds 46 dB, and the SSIM value exceeds 0.92. Experimental results substantiate the efficacy of the proposed technique in preserving data integrity.
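The PSNR figure quoted above (over 46 dB) follows from the standard definition; a minimal sketch for 8-bit images, with pixels flattened into plain lists for simplicity:

```python
import math

def psnr(orig, marked, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two equally sized
    8-bit images given as flat pixel lists; higher means the
    watermark introduced less distortion."""
    mse = sum((a - b) ** 2 for a, b in zip(orig, marked)) / len(orig)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)

# A 1-level change on every pixel gives MSE = 1, so
# PSNR = 10*log10(255^2) ≈ 48.13 dB.
print(round(psnr([10, 20, 30, 40], [11, 21, 31, 41]), 2))  # → 48.13
```

For context, watermarking schemes are generally considered visually transparent above roughly 40 dB, so the reported 46 dB floor indicates imperceptible embedding distortion.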
Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. U2240210, 52279098), the Natural Science Foundation of Jiangsu Province (Grant No. BK20200525), and the Fundamental Research Funds for the Central Universities (Grant No. B230201021). We express our gratitude to PETRONAS and Shell Global Solution International B.V. for their support of this work.
Abstract: Foam is utilized in enhanced oil recovery and CO₂ sequestration. Surfactant-alternating-gas (SAG) is a preferred approach for placing foam into reservoirs because it enhances gas injection and minimizes corrosion in facilities. Our previous studies with similar-permeability cores show that during SAG injection, several banks occupy the area near the well where fluid exhibits distinct behaviour. However, underground reservoirs are heterogeneous, often layered. It is crucial to understand the effect of permeability on fluid behaviour and injectivity in a SAG process. In this work, coreflood experiments are conducted in cores with permeabilities ranging from 16 to 2300 mD. We observe the same sequence of banks in cores with different permeabilities. However, the speed at which banks propagate and their overall mobility can vary with permeability. At higher permeabilities, the gas-dissolution bank and the forced-imbibition bank progress more rapidly during liquid injection. The total mobilities of both banks decrease with permeability. By utilizing a bank-propagation model, we scale up our experimental findings and compare them to results obtained using the Peaceman equation. Our findings reveal that the liquid injectivity in a SAG foam process is misestimated by conventional simulators based on the Peaceman equation; the lower the formation permeability, the greater the error.
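For reference, the standard single-phase Peaceman well model that conventional simulators use to relate well rate and well-block pressure takes the form below; the paper's exact formulation is not given in the abstract, so this is the textbook version:

```latex
% Peaceman well model, single-phase radial inflow into a grid block:
q = \frac{2\pi k h}{\mu\,\ln(r_o/r_w)}\,\bigl(p_{wf} - p_o\bigr),
\qquad r_o \approx 0.2\,\Delta x
```

where q is the injection rate, k the permeability, h the layer thickness, μ the fluid viscosity, p_wf the bottomhole pressure, p_o the well-block pressure, r_w the wellbore radius, and r_o the equivalent well-block radius for a grid spacing Δx. The abstract's point is that this single-phase relation does not capture the bank structure of SAG foam flow near the well, and the resulting injectivity error grows as formation permeability decreases.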
Funding: Funded by Universiti Teknologi PETRONAS grants (YUTP-PRG: 015PBC-011).
Abstract: This study conducts a systematic literature review (SLR) of blockchain consensus mechanisms, the essential protocols that maintain the integrity, reliability, and decentralization of distributed ledger networks. The aim is to comprehensively investigate prominent mechanisms' security features and vulnerabilities, emphasizing their security considerations, applications, challenges, and future directions. The existing literature offers valuable insights into various consensus mechanisms' strengths, limitations, and security vulnerabilities, as well as their real-world applications. However, there remains a gap in synthesizing and analyzing this knowledge systematically; addressing it would enable a structured, comprehensive understanding of consensus mechanisms' security and vulnerabilities. The study adheres to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and computer science standards, reviewing 3749 research papers from 2016 to 2024 and excluding grey literature, resulting in 290 articles for descriptive analysis. The research highlights an increased focus on blockchain consensus security, energy efficiency, and hybrid mechanisms within 60% of research papers post-2019, identifying gaps in scalability, privacy, and interoperability for future exploration. By synthesizing the existing research and identifying key trends, this SLR contributes to advancing the understanding of blockchain consensus mechanisms' security and guides future research and structured innovation in blockchain systems and applications.
Funding: Supported through Universiti Sains Malaysia (USM) and the Ministry of Higher Education Malaysia, which provided the research grant, Fundamental Research Grant Scheme (FRGS Grant No. FRGS/1/2020/TK0/USM/02/1).
Abstract: The rapid adoption of Internet of Things (IoT) technologies has introduced significant security challenges across the physical, network, and application layers, particularly with the widespread use of the Message Queue Telemetry Transport (MQTT) protocol, which, while efficient in bandwidth consumption, lacks inherent security features, making it vulnerable to various cyber threats. This research addresses these challenges by presenting a secure, lightweight communication proxy that enhances the scalability and security of MQTT-based IoT networks. The proposed solution builds upon the Dang-Scheme, a mutual authentication protocol designed explicitly for resource-constrained environments, and enhances it using elliptic curve cryptography (ECC). This integration significantly improves device authentication, data confidentiality, and energy efficiency, achieving an 87.68% increase in data confidentiality and up to 77.04% energy savings during publish/subscribe communications in smart homes. The middleware broker system dynamically manages transaction keys and session IDs, offering robust defences against common cyber threats like impersonation and brute-force attacks. Penetration testing with tools such as Hydra and Nmap further validated the system's security, demonstrating its potential to significantly improve the security and efficiency of IoT networks while underscoring the need for ongoing research to combat emerging threats.
Abstract: Shear logs, also known as shear velocity logs, are used for various types of seismic analysis, such as determining the relationship between amplitude variation with offset (AVO) and interpreting multiple types of seismic data. This log is an important tool for analyzing the properties of rocks and interpreting seismic data to identify potential areas of oil and gas reserves. However, these logs are often not collected due to cost constraints or poor borehole conditions that can degrade data quality, though various approaches are used in practice for estimating shear wave velocity. In this study, a detailed review of recent advances in the various techniques used to measure or estimate shear wave (S-wave) velocity is carried out. These techniques include direct and indirect measurement, empirical relationships between S-wave velocity and other parameters, machine learning, and rock physics models. This study therefore creates a collection of employed techniques, enhancing the existing knowledge of this significant topic and offering a progressive approach for practical implementation in the field.
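Among the empirical relationships the review covers, one widely cited example (shown here purely for illustration; the review itself surveys many more) is Castagna's mudrock line for water-saturated clastic rocks:

```python
def vs_castagna(vp_km_s):
    """Castagna et al. (1985) mudrock line for water-saturated
    clastics: Vs = 0.8621 * Vp - 1.1724, velocities in km/s.
    A quick S-wave estimate when only the compressional log exists."""
    return 0.8621 * vp_km_s - 1.1724

# Example: a P-wave velocity of 3.0 km/s implies Vs ≈ 1.41 km/s.
print(round(vs_castagna(3.0), 2))  # → 1.41
```

Relations like this are lithology-specific, which is exactly why the review also covers machine learning and rock physics models as alternatives when a single empirical trend does not fit the formation.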
Funding: The APC of this article is covered by Research Grant YUTP 015LCO-526.
Abstract: This study aims to formulate a steady-state mathematical model for a three-dimensional permeable enclosure (cavity) to determine the oil extraction rate using three distinct nanoparticles, SiO₂, Al₂O₃, and Fe₂O₃, in unconventional oil reservoirs. The simulation is conducted for different volume fractions, porosities, and mass flow rates to determine the optimal oil recovery. The impact of nanoparticles on relative permeability (kr) and water is also investigated. The simulation process utilizes the finite-volume ANSYS Fluent software. The results showed that oil recovery increases when the mass flow rate at the inlet is low. In addition, silica nanoparticles recover more oil from the reservoir than Al₂O₃ and Fe₂O₃: the maximum oil extraction rates for SiO₂, Al₂O₃, and Fe₂O₃ are 97.8%, 96.5%, and 88%, respectively.
Funding: Supported by the Scientific Research Deanship at the University of Ha'il, Saudi Arabia, through project number RG-23137.
Abstract: The segmentation of head and neck (H&N) tumors in dual positron emission tomography/computed tomography (PET/CT) imaging is a critical task in medical imaging, providing essential information for diagnosis, treatment planning, and outcome prediction. Motivated by the need for more accurate and robust segmentation methods, this study addresses key research gaps in the application of deep learning techniques to multimodal medical images. Specifically, it investigates the limitations of existing 2D and 3D models in capturing complex tumor structures and proposes an innovative 2.5D UNet Transformer model as a solution. The primary research questions guiding this study are: (1) How can the integration of convolutional neural networks (CNNs) and transformer networks enhance segmentation accuracy in dual PET/CT imaging? (2) What are the comparative advantages of 2D, 2.5D, and 3D model configurations in this context? To answer these questions, we developed and evaluated advanced deep-learning models that leverage the strengths of both CNNs and transformers. Our methodology involved a comprehensive preprocessing pipeline, including normalization, contrast enhancement, and resampling, followed by segmentation using 2D, 2.5D, and 3D UNet Transformer models. The models were trained and tested on three diverse datasets: HeckTor2022, AutoPET2023, and SegRap2023. Performance was assessed using metrics such as the Dice similarity coefficient, Jaccard index, average surface distance (ASD), and relative absolute volume difference (RAVD). The findings demonstrate that the 2.5D UNet Transformer model consistently outperformed the 2D and 3D models across most metrics, achieving the highest Dice and Jaccard values and indicating superior segmentation accuracy. For instance, on the HeckTor2022 dataset, the 2.5D model achieved a Dice score of 81.777 and a Jaccard index of 0.705, surpassing other model configurations. The 3D model showed strong boundary delineation performance but exhibited variability across datasets, while the 2D model, although effective, generally underperformed compared to its 2.5D and 3D counterparts. Compared to the related literature, our study confirms the advantages of incorporating additional spatial context, as seen in the improved performance of the 2.5D model. This research fills a significant gap by providing a detailed comparative analysis of different model dimensions and their impact on H&N segmentation accuracy in dual PET/CT imaging.
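The Dice similarity coefficient and Jaccard index used to rank the models above are simple overlap measures on binary masks; a minimal sketch with masks flattened into 0/1 lists:

```python
def dice_jaccard(pred, truth):
    """Dice similarity coefficient and Jaccard index for two
    binary segmentation masks given as flat 0/1 integer lists."""
    inter = sum(p & t for p, t in zip(pred, truth))
    psum, tsum = sum(pred), sum(truth)
    union = psum + tsum - inter
    dice = 2.0 * inter / (psum + tsum) if psum + tsum else 1.0
    jacc = inter / union if union else 1.0
    return dice, jacc

# Partially overlapping masks: Dice = 2*1/(2+1) ≈ 0.667, Jaccard = 1/2.
print(dice_jaccard([1, 1, 0, 0], [0, 1, 0, 0]))
```

The two metrics are monotonically related (Dice = 2J/(1+J)), so the 2.5D model topping both is expected; ASD and RAVD add the boundary and volume perspectives that pure overlap metrics miss.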
Abstract: The functionality of buildings depends on the climate they are subjected to, which means buildings constructed today need to perform efficiently within the current and future climate while reducing the greenhouse emissions imposed on the environment. In order to lower CO₂ emissions and help reinforce the capability of cities to adapt to climate change while enhancing the quality of the built environment, it is vital to improve the environmental performance and energy efficiency of buildings. This study aims to identify and compare existing green building rating systems (GBRS) against Nigerian climatic conditions. A proposed criteria weighting was developed from questionnaires supplying expert opinions. The work began with a comparative analysis of GBRS, conducted to find the similarities and differences between existing rating systems and arrive at an appropriate rating system for Nigeria. The approaches included fieldwork (a pilot study and survey), questionnaires, and both structured and unstructured interviews. The information generated in this pre-test investigation then formed the conversational guide and informed the decision on the main factors that should be included in the new assessment system for Nigeria.
基金the University of Teknologi PETRONAS(UTP),Malaysia,and Ahmadu Bello University,Nigeria,for their vital help and availability of laboratory facilities that allowed this work to be conducted successfully.
Abstract: The goals of this study are to assess the viability of waste tire-derived char (WTDC) as a sustainable, low-cost fine aggregate surrogate material for asphalt mixtures and to develop a statistically coupled neural network (SCNN) model for predicting the volumetric and Marshall properties of asphalt mixtures modified with WTDC. The study is based on experimental data acquired from laboratory volumetric and Marshall properties testing on WTDC-modified asphalt mixtures (WTDC-MAM). The input variables comprised waste tire char content and asphalt binder content. The output variables comprised mixture unit weight, total voids, voids filled with asphalt, Marshall stability, and flow. For predictive modeling, the SCNN model is employed, incorporating a three-layer neural network and preprocessing techniques to enhance accuracy and reliability. The optimal network architecture for the collected dataset was a 2:6:5 structure; the neural network was trained with 60% of the data, with 20% each used for cross-validation and testing. The network employed a hyperbolic tangent (tanh) activation function and feed-forward backpropagation. According to the results, the network model could accurately predict the volumetric and Marshall properties: the predictive accuracy of the SCNN was found to be higher than 98%, with low prediction errors for both property groups. This study demonstrates WTDC's potential as a low-cost, sustainable aggregate replacement, and the SCNN-based predictive model proves its efficiency and versatility while promoting sustainable practices.
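The 2:6:5 feed-forward architecture described above maps two inputs (char content and binder content) to five outputs through one hidden tanh layer. A minimal forward-pass sketch follows; the weights below are placeholders purely to show the shapes, since the trained weights are not published in the abstract.

```python
import math

def forward(x, w1, b1, w2, b2):
    """One hidden-layer feed-forward pass:
    2 inputs -> 6 tanh hidden units -> 5 linear outputs."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Placeholder (untrained) weights with the 2:6:5 shapes: 6x2, 6, 5x6, 5.
w1 = [[0.1 * (i + j) for j in range(2)] for i in range(6)]
b1 = [0.0] * 6
w2 = [[0.05] * 6 for _ in range(5)]
b2 = [0.0] * 5

# e.g. 5% char content, 4.5% binder content (hypothetical values).
out = forward([5.0, 4.5], w1, b1, w2, b2)
print(len(out))  # → 5 predicted mixture properties
```

In the actual model, the weights would be fitted by feed-forward backpropagation on the 60% training split, with inputs normalized during preprocessing before reaching the tanh layer.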
Funding: Yayasan UTP (Grant/Award Number 015LC0-312) and the Yayasan Universiti Teknologi PETRONAS Fundamental Research Grant (YUTP-FRG) 015PBC-011 generously funded this publication.
Abstract: In the era of 5G, seamless mobility handovers are critical in densely populated regions like Malaysia to mitigate disruptions and inefficiencies. 5G networks offer unprecedented data speeds and reliability, essential for advancing mobile communication and Internet of Things applications. However, ensuring continuous connectivity and service during mobility remains challenging, especially in urban settings. Digital twin technology presents a promising solution to enhance 5G handover mechanisms. A digital twin network (DTN) mirroring Malaysia's 5G infrastructure is proposed, utilising real-time data and user behaviour insights to optimise energy consumption during handovers. The focus is on energy-efficient protocols and algorithms, reviewed through a systematic literature review. The DTN aims to enhance mobility handover efficiency through predictive handovers and adaptive resource allocation, bolstered by sustainable practices such as edge computing. The potential of DTNs to optimise 5G handover processes is explored, starting with the foundational concepts of 5G mobility and digital twins and highlighting the need for improved strategies in high-mobility scenarios. Methodologies leveraging digital twins to predict network conditions, simulate handover scenarios, and proactively manage decisions are examined, reducing latency and packet loss. Case studies demonstrate how digital twins adapt dynamically to network changes and user mobility, thereby improving quality of service and user experience. Malaysia's specific 5G mobility challenges are addressed with a tailored DTN emphasising energy efficiency, evaluated through practical applications. Evaluation criteria assess effectiveness with in-depth analysis of methods, performance metrics, limitations, and recommendations for future research. Challenges and future directions, including scalability, security, and real-time data processing, are discussed, aiming to integrate digital twin technology with 5G networks for enhanced connectivity. This review provides a roadmap for leveraging digital twins to optimise 5G network performance sustainably, guiding future research and implementation strategies.
Funding: Financial support from McGill University, Canada, and NSERC Discovery Grant RGPIN-2015-03945.
Abstract: In underground coal mines, the uncontrolled accumulation of methane and fine coal dust often leads to serious incidents such as explosions. Therefore, methane and dust dispersion in underground mines is closely monitored and strictly regulated, and significant efforts have been devoted to studying it. In this study, methane emission and dust concentration are numerically investigated using a computational fluid dynamics (CFD) approach. Various possible scenarios of underground mine configurations are evaluated. The results indicate that the presence of the continuous miner adversely affects the air flow and leads to increased methane and dust concentrations. Nevertheless, it is found that this negative effect can be minimized or even neutralized by operating the scrubber fan in suction mode. In addition, the combination of the scrubber fan in suction mode and a brattice results in the best performance in terms of methane and dust removal from the mining face.