Multilateral wells promise cost savings to oil and gas fields as they have the potential to reduce overall drilling distances and minimize the number of slots required for the surface facility managing the well. However, drilling a multilateral well does not always increase the flow rate when compared to two single-horizontal wells, due to competition in production inside the mother-bore. Here, a holistic approach is proposed to find the optimum balance between single and multilateral wells in an offshore oil development. In so doing, the integrated approach finds the highest Net Present Value (NPV) configuration of the field considering drilling, subsurface, production and financial analysis. The model employs stochastic perturbation and Markov Chain Monte-Carlo methods to solve the global NPV-maximisation problem. In addition, a combination of Mixed-Integer Linear Programming (MILP), an improved Dijkstra algorithm and a Levenberg-Marquardt optimiser is proposed to solve the rate allocation problem. With the outcome from this analysis, the model suggests the optimum development, including the number of multilateral and single horizontal wells, that would result in the highest NPV. The results demonstrate the potential for modelling to find the optimal use of petroleum facilities and to assist with planning and decision making.
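To make the stochastic search step concrete, the following sketch runs a Metropolis-style Markov Chain Monte-Carlo walk over a discrete field configuration in which each slot is drilled either as a single horizontal or as a multilateral well, keeping the configuration with the highest NPV. The NPV function, rates, costs and the temperature parameter are invented placeholders rather than the paper's integrated drilling, subsurface, production and financial model.

```python
import math
import random

# Hypothetical toy NPV model: each slot is either a single horizontal well (0)
# or a multilateral well (1). All numbers are illustrative placeholders only.
def npv(config, oil_price=70.0, discount=0.1, years=10):
    rate = sum(80.0 if w else 50.0 for w in config)      # toy production rate per well
    rate *= 1.0 - 0.02 * sum(config)                     # crude penalty for mother-bore competition
    capex = sum(12e6 if w else 9e6 for w in config)      # toy drilling cost per well
    cashflow = rate * 365 * oil_price
    return sum(cashflow / (1 + discount) ** t for t in range(1, years + 1)) - capex

def mcmc_search(n_slots=8, iters=5000, temperature=5e6):
    config = [random.randint(0, 1) for _ in range(n_slots)]
    best = list(config)
    for _ in range(iters):
        proposal = list(config)
        proposal[random.randrange(n_slots)] ^= 1         # flip one well type
        delta = npv(proposal) - npv(config)
        # Metropolis acceptance: always accept improvements, sometimes accept worse moves
        if delta > 0 or random.random() < math.exp(delta / temperature):
            config = proposal
            if npv(config) > npv(best):
                best = list(config)
    return best, npv(best)

if __name__ == "__main__":
    best_config, best_npv = mcmc_search()
    print(best_config, f"{best_npv:,.0f}")
```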
This paper aims to frame a new rice disease prediction model that includes three major phases. Initially, median filtering (MF) is deployed during pre-processing and then 'proposed Fuzzy Means Clustering (FCM) based segmentation' is done. Following that, 'Discrete Wavelet Transform (DWT), Scale-Invariant Feature Transform (SIFT) and low-level features (colour and shape), Proposed Local Binary Pattern (LBP) based features' are extracted, which are classified via 'MultiLayer Perceptron (MLP) and Long Short Term Memory (LSTM)' and predicted outcomes are obtained. For exact prediction, this work intends to optimise the weights of the LSTM using the Inertia Weighted Salp Swarm Optimisation (IW-SSO) model. Eventually, the development of the IW-SSO method is established on varied metrics.
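As a rough illustration of the optimisation stage, the sketch below implements a salp swarm optimiser with a linearly decaying inertia weight applied to the follower update, minimising a placeholder objective that stands in for the LSTM validation loss; the exact IW-SSO update rule of the paper may differ.

```python
import numpy as np

def sphere(x):
    # placeholder objective standing in for the LSTM validation loss
    return float(np.sum(x ** 2))

def iw_sso(obj, dim=10, pop=30, iters=200, lb=-5.0, ub=5.0):
    """Salp swarm optimisation with a linearly decaying inertia weight on follower salps.
    This is a generic sketch; the paper's exact IW-SSO update rule may differ."""
    X = np.random.uniform(lb, ub, (pop, dim))
    fitness = np.array([obj(x) for x in X])
    food = X[fitness.argmin()].copy()            # best solution found so far
    for t in range(iters):
        c1 = 2 * np.exp(-(4 * t / iters) ** 2)   # standard SSA exploration coefficient
        w = 0.9 - 0.5 * t / iters                # assumed inertia-weight schedule
        for i in range(pop):
            if i < pop // 2:                     # leader salps move around the food source
                c2, c3 = np.random.rand(dim), np.random.rand(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                                # follower salps chain behind, damped by w
                X[i] = 0.5 * (w * X[i] + X[i - 1])
        X = np.clip(X, lb, ub)
        fitness = np.array([obj(x) for x in X])
        if fitness.min() < obj(food):
            food = X[fitness.argmin()].copy()
    return food, obj(food)

best, val = iw_sso(sphere)
print(val)
```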
Due to recent improvements in forensic DNA testing kit sensitivity, there has been an increased demand in the criminal justice community to revisit past convictions or cold cases. Some of these cases have little biological evidence other than touch DNA in the form of archived latent fingerprint lift cards. In this study, a previously developed optimised workflow for this sample type was tested on aged fingerprints to determine if improved short tandem repeat (STR) profiles could be obtained. Two-year-old samples processed with the optimised workflow produced an average of approximately five more STR alleles per profile over the traditional method. The optimised workflow also produced detectable alleles in samples aged out to 28 years. Of the methods tested, the optimised workflow resulted in the most informative profiles from evidence samples more representative of the forensic need. This workflow is recommended for use with archived latent fingerprint samples, regardless of the archival time.
The development of photocatalytic technology has grown significantly since its initial report and, as such, a number of screening methods have been developed to assess activity. In the field of environmental remediation, a crucial factor is the formation of highly oxidising species such as OH radicals. These radicals are often the primary driving force for the removal and breakdown of organic and inorganic contaminants. The quantification of such compounds is challenging due to the nature of the radical; however, indirect methods which deploy a chemical probe to essentially capture the radical have been shown to be effective. As discussed in the work presented here, optimisation of such a method is fundamental to its efficiency. A starting concentration range of coumarin from 50 mmol/L to 1000 mmol/L was used along with a catalyst loading of 0.01 g/L to 1 g/L TiO2 to identify that 250 mmol/L and 0.5 g/L TiO2 were the optimum conditions for production. Under these parameters a maximum production rate of 35.91 mmol/L (Rmax = 0.4 mmol/L OH* min^-1) was achieved, which yielded a photonic efficiency of 4.88 OH* moles photon^-1 under UV irradiation. The data set presented also highlighted the limitations associated with the method, which included rapid exhaustion of the probe molecule and process inhibition through UV light saturation. Identifying both the optimum conditions and the potential limitations of the process were concluded to be key for the efficient deployment of the photocatalytic screening method.
Nowadays it is known that the thermomechanical schedules applied during hot rolling of flat products provide the steel with improved mechanical properties. In this work an optimisation tool, OptiLam (OptiLam v.1), based on predictive software and capable of generating optimised rolling schedules to obtain the desired mechanical properties in the final product, is described. OptiLam includes some well-known metallurgical models which predict microstructural evolution during hot rolling and the austenite/ferrite transformation during cooling. Furthermore, an optimisation algorithm, based on the gradient method, has been added in order to design thermomechanical sequences when a specific final grain size is desired. OptiLam has been used to optimise rolling parameters, such as strain and temperature. Here, some of the results of the software validation performed by means of hot torsion tests are presented, showing also the functionality of the tool. Finally, the application of classical optimisation models, based on the gradient method, to hot rolling operations is also discussed.
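The gradient-method idea can be sketched as follows: starting from an initial schedule, the finishing temperature and accumulated strain are adjusted down the gradient of the squared deviation between a predicted grain size and the target. The grain-size model and all coefficients below are toy placeholders, not OptiLam's metallurgical models.

```python
def grain_size(T, eps):
    """Toy stand-in for OptiLam's metallurgical models: predicted ferrite grain size (microns)
    as a function of finishing temperature T (degC) and accumulated strain eps. Illustrative only."""
    return 5.0 + 0.02 * (T - 850.0) - 3.0 * eps

def optimise_schedule(target=4.0, T=950.0, eps=0.5, lr_T=5.0, lr_eps=0.01, steps=300):
    """Gradient-method adjustment of (T, eps) towards a target grain size,
    minimising the squared deviation with analytic gradients of the toy model."""
    for _ in range(steps):
        err = grain_size(T, eps) - target
        # d(err^2)/dT = 2*err*0.02 and d(err^2)/deps = 2*err*(-3.0) for the linear toy model
        T -= lr_T * 2.0 * err * 0.02
        eps -= lr_eps * 2.0 * err * (-3.0)
        T = min(max(T, 800.0), 1100.0)     # keep the schedule in a plausible rolling window
        eps = min(max(eps, 0.1), 1.2)
    return T, eps, grain_size(T, eps)

print(optimise_schedule())
```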
Global meteorology data are now widely used in various areas, but one of their applications, weather analogues, still requires exhaustive searches over the whole historical data. We present two optimisations for the state-of-the-art weather analogue search algorithms: a parallelization and a heuristic search. The heuristic search (NDRank) limits the final number of results and does initial searches on a lower-resolution dataset to find candidates that, in the second phase, are locally validated. These optimisations were deployed in the Cloud and evaluated with ERA5 data from ECMWF. The proposed parallelization attained speedups close to optimal, and NDRank attains speedups higher than 4. NDRank can be applied to any parallel search, adding similar speedups. A substantial number of executions returned a set of analogues similar to the existing exhaustive search, and most of the remaining results presented a numerical value difference lower than 0.1%. The results demonstrate that it is now possible to search for weather analogues in a faster way (even compared with parallel searches) with results that have little to no error. Furthermore, NDRank can be applied to existing exhaustive searches, providing faster results with a small reduction in the precision of the results.
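A minimal sketch of the two-phase idea behind NDRank is given below on synthetic fields: a cheap ranking on block-averaged (low-resolution) data selects a candidate set, which is then re-ranked at full resolution. The data, grid size and RMSE scoring are assumptions for illustration, not the ERA5 set-up or the exact NDRank heuristic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "historical archive": n fields on a 64x64 grid (stand-in for reanalysis slices)
n, H, W = 5000, 64, 64
archive = rng.normal(size=(n, H, W)).astype(np.float32)
query = archive[1234] + 0.05 * rng.normal(size=(H, W)).astype(np.float32)

def coarsen(fields, factor=8):
    """Block-average each field to a lower resolution (the cheap first-phase representation)."""
    f = fields.reshape(*fields.shape[:-2], H // factor, factor, W // factor, factor)
    return f.mean(axis=(-3, -1))

def ndrank_like(query, archive, k=10, candidates=200):
    # Phase 1: rank everything on the coarse grid and keep a small candidate set.
    coarse_archive = coarsen(archive)
    coarse_query = coarsen(query[None])[0]
    coarse_rmse = np.sqrt(((coarse_archive - coarse_query) ** 2).mean(axis=(1, 2)))
    cand = np.argsort(coarse_rmse)[:candidates]
    # Phase 2: validate the candidates at full resolution and return the k best analogues.
    full_rmse = np.sqrt(((archive[cand] - query) ** 2).mean(axis=(1, 2)))
    order = np.argsort(full_rmse)[:k]
    return cand[order], full_rmse[order]

idx, scores = ndrank_like(query, archive)
print(idx[:5], scores[:5])
```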
This paper proposes a modified grey wolf optimiser-based adaptive super-twisting sliding mode control algorithm for the trajectory tracking and balancing of the rotary inverted pendulum system. The super-twisting sliding mode algorithm significantly alleviates the chattering present in classical sliding mode control. It provides robustness against model uncertainties and external disturbances with knowledge of the upper bounds of the uncertainties and disturbances. The gains of the super-twisting sliding mode algorithm are selected through an adaptive law. Parameters of the adaptation law are tuned using a modified grey wolf optimisation algorithm, a meta-heuristic optimisation technique. Lyapunov stability analysis is carried out to analyse the overall control system stability. The performance of the proposed control algorithm is compared with two other sliding mode control strategies present in the literature, showing better performance of the proposed control scheme.
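The super-twisting law itself is compact enough to simulate directly; the sketch below applies it to a disturbed double integrator with fixed gains (k1, k2), whereas in the paper the gains are adapted online and the adaptation parameters are tuned by the modified grey wolf optimiser.

```python
import numpy as np

# Super-twisting sliding mode control on a disturbed double integrator
#   x1' = x2,  x2' = u + d(t)
# with sliding surface s = x2 + c*x1. Gains are fixed here; in the paper they are
# selected by an adaptive law tuned with a modified grey wolf optimiser.
def simulate(k1=4.0, k2=3.0, c=2.0, dt=1e-3, T=10.0):
    x1, x2, v = 1.0, 0.0, 0.0                       # initial state and controller integral term
    t_hist, x_hist = [], []
    for i in range(int(T / dt)):
        t = i * dt
        d = 0.3 * np.sin(2 * np.pi * t)             # bounded matched disturbance
        s = x2 + c * x1                             # sliding variable
        u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v  # continuous super-twisting term
        v += -k2 * np.sign(s) * dt                  # discontinuous action hidden under an integral
        u_total = u - c * x2                        # cancel the nominal surface dynamics
        x1 += x2 * dt
        x2 += (u_total + d) * dt
        t_hist.append(t); x_hist.append(x1)
    return np.array(t_hist), np.array(x_hist)

t, x1 = simulate()
print(f"|x1| after 10 s: {abs(x1[-1]):.4f}")        # should be driven close to zero
```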
This paper focuses on the trajectory tracking of quadrotors under bounded external disturbances. An optimised robust controller is proposed to drive the position and attitude of a quadrotor to converge to their references quickly. At first, nonsingular fast terminal sliding mode control is developed, which can guarantee not only the stability but also finite-time convergence of the closed-loop system. As the parameters of the designed controllers play a vital role in control performance, an improved beetle antennae search algorithm is proposed to optimise them. By employing the historical information of the beetle's antennae and dynamically updating the step size as well as the range of its searching, the optimising is accelerated considerably to ensure the efficiency of the quadrotor control. The superiority of the proposed control scheme is demonstrated by simulation experiments, from which one can see that both the error and the overshooting of the trajectory tracking are reduced effectively.
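For reference, a basic beetle antennae search loop is sketched below on a placeholder cost standing in for the tracking-error metric; the paper's improvements (reuse of historical antennae information and dynamic adjustment of the search range) are noted but not reproduced.

```python
import numpy as np

def objective(p):
    """Placeholder cost standing in for the tracking-error metric used to tune controller gains."""
    return float(np.sum((p - np.array([1.5, 0.5, 2.0])) ** 2))

def beetle_antennae_search(obj, dim=3, iters=300, step=1.0, d0=0.5, eta=0.97):
    """Basic beetle antennae search with a geometrically shrinking step; the improved
    variant in the paper additionally reuses historical antennae information and adapts
    the search range dynamically, which is not reproduced here."""
    x = np.random.uniform(-5, 5, dim)
    best, best_f = x.copy(), obj(x)
    for _ in range(iters):
        b = np.random.normal(size=dim)
        b /= np.linalg.norm(b) + 1e-12                 # random antennae direction
        f_left, f_right = obj(x + d0 * b), obj(x - d0 * b)
        x = x - step * b * np.sign(f_left - f_right)   # move towards the better antenna
        step *= eta                                    # shrink the step size each iteration
        d0 = max(0.95 * d0, 0.01)                      # shrink the sensing length as well
        fx = obj(x)
        if fx < best_f:
            best, best_f = x.copy(), fx
    return best, best_f

print(beetle_antennae_search(objective))
```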
We evaluate an adaptive optimisation methodology, Bayesian optimisation (BO), for designing a minimum weight explosive reactive armour (ERA) for protection against a surrogate medium calibre kinetic energy (KE) long rod projectile and surrogate shaped charge (SC) warhead. We perform the optimisation using a conventional BO methodology and compare it with a conventional trial-and-error approach from a human expert. A third approach, utilising a novel human-machine teaming framework for BO, is also evaluated. Data for the optimisation is generated using numerical simulations that are demonstrated to provide reasonable qualitative agreement with reference experiments. The human-machine teaming methodology is shown to identify the optimum ERA design in the fewest number of evaluations, outperforming both the stand-alone human and stand-alone BO methodologies. From a design space of almost 1800 configurations the human-machine teaming approach identifies the minimum weight ERA design in 10 samples.
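A generic BO loop of the kind described can be sketched with a Gaussian-process surrogate and an expected-improvement acquisition over a discrete design space of 1800 synthetic configurations; the design variables, cost function and sample counts below are placeholders, and the human-machine teaming element is not modelled.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)

# Synthetic discrete design space standing in for ~1800 ERA configurations:
# the three columns are made-up design variables in arbitrary units.
designs = np.array([[a, b, c] for a in range(10) for b in range(12) for c in range(15)], float)

def cost(x):
    """Placeholder 'areal weight subject to defeating the threat' objective; purely illustrative."""
    return (x[0] - 4) ** 2 + 0.5 * (x[1] - 7) ** 2 + 0.2 * (x[2] - 9) ** 2 + rng.normal(0, 0.1)

# Start from a handful of random evaluations (each would be a numerical simulation in practice).
idx = list(rng.choice(len(designs), 5, replace=False))
y = [cost(designs[i]) for i in idx]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(15):                                  # BO loop: fit surrogate, pick next design by EI
    gp.fit(designs[idx], np.array(y))
    mu, sigma = gp.predict(designs, return_std=True)
    best = min(y)
    imp = best - mu
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)     # expected improvement (minimisation)
    ei[idx] = -np.inf                                # do not re-evaluate known designs
    nxt = int(np.argmax(ei))
    idx.append(nxt)
    y.append(cost(designs[nxt]))

print("best design:", designs[idx[int(np.argmin(y))]], "cost:", min(y))
```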
Decomposition of a complex multi-objective optimisation problem (MOP) into multiple simple sub-MOPs, known as M2M for short, is an effective approach to multi-objective optimisation. However, M2M facilitates little communication/collaboration between sub-MOPs, which limits its use in complex optimisation scenarios. This paper extends the M2M framework to develop a unified algorithm for both multi-objective and many-objective optimisation. Through bilevel decomposition, an MOP is divided into multiple sub-MOPs at the upper level, each of which is further divided into a number of single-objective subproblems at the lower level. Neighbouring sub-MOPs are allowed to share some subproblems so that the knowledge gained from solving one sub-MOP can be transferred to another, and eventually to all the sub-MOPs. The bilevel decomposition is readily combined with some new mating selection and population update strategies, leading to a high-performance algorithm that competes effectively against a number of state-of-the-art algorithms studied in this paper for both multi- and many-objective optimisation. Parameter analysis and component analysis have also been carried out to further justify the proposed algorithm.
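One way to picture the bilevel decomposition, for a bi-objective case, is sketched below: weight vectors (single-objective subproblems) are grouped around sub-MOP direction vectors, and neighbouring sub-MOPs additionally share a few boundary subproblems so information can flow between them. The sharing rule and all sizes are assumptions, not the paper's exact strategy.

```python
import numpy as np

def bilevel_decomposition(n_weights=100, n_submops=5, overlap=3):
    """Sketch of the bilevel idea for a bi-objective problem: the MOP is split into
    sub-MOPs around central direction vectors (upper level); each sub-MOP owns the
    scalarising weight vectors nearest to its centre (lower level), and neighbouring
    sub-MOPs share a few boundary weights so knowledge can transfer between them."""
    t = np.linspace(0, 1, n_weights)
    weights = np.stack([t, 1 - t], axis=1)               # single-objective subproblems
    c = np.linspace(0, 1, n_submops)
    centres = np.stack([c, 1 - c], axis=1)               # sub-MOP direction vectors
    # assign every weight vector to its nearest centre (upper-level decomposition)
    dist = np.linalg.norm(weights[:, None, :] - centres[None, :, :], axis=2)
    owner = dist.argmin(axis=1)
    submops = {k: set(np.where(owner == k)[0].tolist()) for k in range(n_submops)}
    # let neighbouring sub-MOPs share the 'overlap' closest foreign weight vectors
    for k in range(n_submops):
        for nb in (k - 1, k + 1):
            if 0 <= nb < n_submops:
                foreign = np.where(owner == nb)[0]
                nearest = foreign[np.argsort(dist[foreign, k])[:overlap]]
                submops[k].update(nearest.tolist())
    return weights, centres, submops

weights, centres, submops = bilevel_decomposition()
print({k: len(v) for k, v in submops.items()})           # subproblem counts per sub-MOP
```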
To make the most of grid-connected decentralised wind power for voltage stability improvement and loss reduction in distribution networks, a multi-objective two-stage decentralised wind power planning method is proposed in this paper that accounts for network loss correction in extreme cold regions. First, an electro-thermal model is introduced to capture the effect of temperature on conductor resistance and to correct the calculated active network loss. Second, a two-stage model for decentralised wind power siting and capacity allocation and reactive voltage optimisation control is constructed with this loss correction: the first stage establishes a multi-objective planning model that considers the whole-life-cycle investment cost of wind turbine generators (WTGs), the system operating cost and the voltage quality of the power supply, while the second stage builds on it to develop the reactive voltage control strategy of the WTGs, yielding a distribution network loss reduction method based on WTG siting and capacity allocation and reactive power control. Finally, the optimal configuration scheme is solved by the manta ray foraging optimisation (MRFO) algorithm, and the losses of each branch line and bus of the distribution network before and after applying this loss reduction method are calculated on the IEEE33 distribution system as an example, which verifies the practicability and validity of the proposed method and provides a reference for decision-making in distributed energy planning of distribution networks.
In recent years, there has been remarkable progress in the performance of metal halide perovskite solar cells. Studies have shown significant interest in lead-free perovskite solar cells (PSCs) due to concerns about the toxicity of lead in lead halide perovskites. CH3NH3SnI3 emerges as a viable alternative to CH3NH3PbX3. In this work, we studied the effect of various parameters on the performance of lead-free perovskite solar cells using simulation with the SCAPS 1D software. The cell structure consists of α-Fe2O3/CH3NH3SnI3/PEDOT:PSS. We analyzed parameters such as thickness, doping, and layer concentration. The study revealed that, without considering other optimized parameters, the efficiency of the cell increased from 22% to 35% when the perovskite thickness varied from 100 to 1000 nm. After optimization, solar cell efficiency reaches up to 42%. The optimization parameters are such that, for example, for the perovskite the layer thickness is 700 nm, the doping concentration is 10^20 and the defect density is 10^13 cm^-3, and for hematite the thickness is 5 nm, the doping concentration is 10^22 and the defect concentration is 10^11 cm^-3. These results are encouraging because they highlight the good agreement between perovskite and hematite when used as the active and electron transport layers, respectively. Now, it is still necessary to produce real, viable photovoltaic solar cells with the proposed material layer parameters.
Over the last decade, the rapid growth in traffic and the number of network devices has implicitly led to an increase in network energy consumption. In this context, a new paradigm has emerged, Software-Defined Networking (SDN), an emerging technique that separates the control plane and the data plane of the deployed network, enabling centralized control of the network while offering flexibility in data center network management. Some research work is moving in the direction of optimizing the energy consumption of software-defined data center networks (SD-DCN), but still does not guarantee good performance and quality of service for SDN networks. To solve this problem, we propose a new mathematical model based on the principle of combinatorial optimization to dynamically solve the problem of activating and deactivating switches and unused links that consume energy in SDN networks, while guaranteeing quality of service (QoS) and ensuring load balancing in the network.
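A miniature version of such a combinatorial model can be written as a mixed-integer programme, as sketched below with the PuLP library: binary variables keep links powered only when they carry routed demand, subject to capacity and flow-conservation constraints. The topology, capacities, power figures and demands are made-up illustrative values, and the QoS/load-balancing terms of the full model are only hinted at in comments.

```python
import pulp

# Tiny energy-aware routing MILP in the spirit of the paper: decide which links stay
# powered (binary y) while routing all demands within capacity. All values are illustrative.
edges = {  # (u, v): (capacity in Mb/s, power in W when the link/port is on)
    ("s1", "s2"): (100, 5), ("s2", "s1"): (100, 5),
    ("s1", "s3"): (100, 5), ("s3", "s1"): (100, 5),
    ("s2", "s4"): (100, 5), ("s4", "s2"): (100, 5),
    ("s3", "s4"): (100, 5), ("s4", "s3"): (100, 5),
}
nodes = {u for e in edges for u in e}
demands = {("s1", "s4"): 60, ("s2", "s3"): 30}   # (source, destination): rate in Mb/s

prob = pulp.LpProblem("energy_aware_routing", pulp.LpMinimize)
y = {e: pulp.LpVariable(f"y_{e[0]}_{e[1]}", cat="Binary") for e in edges}
f = {(d, e): pulp.LpVariable(f"f_{d[0]}_{d[1]}_{e[0]}_{e[1]}", lowBound=0)
     for d in demands for e in edges}

# Objective: total power of the links left on (QoS/load-balancing terms could be added here).
prob += pulp.lpSum(edges[e][1] * y[e] for e in edges)

for e in edges:  # aggregate flow may use a link only if the link is powered
    prob += pulp.lpSum(f[(d, e)] for d in demands) <= edges[e][0] * y[e]

for d, rate in demands.items():  # flow conservation for every demand at every node
    src, dst = d
    for n in nodes:
        out_flow = pulp.lpSum(f[(d, e)] for e in edges if e[0] == n)
        in_flow = pulp.lpSum(f[(d, e)] for e in edges if e[1] == n)
        rhs = rate if n == src else (-rate if n == dst else 0)
        prob += out_flow - in_flow == rhs

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("links kept on:", [e for e in edges if y[e].value() > 0.5])
```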
In real-world applications, datasets frequently contain outliers, which can hinder the generalization ability of machine learning models. Bayesian classifiers, a popular supervised learning method, rely on accurate probability density estimation for classifying continuous datasets. However, achieving precise density estimation with datasets containing outliers poses a significant challenge. This paper introduces a Bayesian classifier that utilizes optimized robust kernel density estimation to address this issue. Our proposed method enhances the accuracy of probability density distribution estimation by mitigating the impact of outliers on the training sample’s estimated distribution. Unlike the conventional kernel density estimator, our robust estimator can be seen as a weighted kernel mapping summary for each sample. This kernel mapping performs the inner product in the Hilbert space, allowing the kernel density estimation to be considered the average of the samples’ mapping in the Hilbert space using a reproducing kernel. M-estimation techniques are used to obtain accurate mean values and solve the weights. Meanwhile, complete cross-validation is used as the objective function to search for the optimal bandwidth, which impacts the estimator. The Harris Hawks Optimisation optimizes the objective function to improve the estimation accuracy. The experimental results show that it outperforms other optimization algorithms regarding convergence speed and objective function value during the bandwidth search. The optimal robust kernel density estimator achieves better fitness performance than the traditional kernel density estimator when the training data contains outliers. The Naïve Bayesian with optimal robust kernel density estimation improves the generalization in the classification with outliers.
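The reweighting idea can be sketched with a kernelised iteratively reweighted scheme: each sample's distance to the current weighted kernel mean is computed in the reproducing-kernel Hilbert space and Huber-type weights downweight distant (outlying) samples. The bandwidth is fixed here; the paper's complete-cross-validation objective and Harris Hawks bandwidth search are not reproduced.

```python
import numpy as np

def gaussian_kernel(X, Y, h):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2))

def robust_kde_weights(X, h, c=1.0, iters=30):
    """Kernelised iteratively reweighted scheme (in the spirit of robust KDE):
    samples far from the kernel-space mean get Huber-downweighted, so outliers
    contribute less to the density estimate. A simplified sketch of the idea."""
    n = len(X)
    K = gaussian_kernel(X, X, h)
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        # squared distance of each mapped sample to the current weighted kernel mean
        d2 = np.diag(K) - 2 * K @ w + w @ K @ w
        d = np.sqrt(np.maximum(d2, 1e-12))
        psi = np.where(d <= c, 1.0, c / d)        # Huber influence: downweight far points
        w = psi / psi.sum()
    return w

def robust_kde(x_query, X, w, h):
    Kq = gaussian_kernel(x_query, X, h)
    norm = (2 * np.pi * h ** 2) ** (X.shape[1] / 2)
    return (Kq * w).sum(axis=1) / norm

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (200, 1)), rng.normal(8, 0.5, (10, 1))])  # 10 outliers near 8
w = robust_kde_weights(X, h=0.5)
grid = np.linspace(-4, 10, 5)[:, None]
print(np.round(robust_kde(grid, X, w, h=0.5), 4))
```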
CAE Academician Youxian Sun from Zhejiang University initiated Digital Twins and Applications (ISSN 2995-2182). It is published by Zhejiang University Press and the Institution of Engineering and Technology and sponsored by Zhejiang University. Digital Twins and Applications aims to provide a specialised platform for researchers, practitioners, and industry experts to publish high-quality, state-of-the-art research on digital twin technologies and their applications.
An excellent cardinality estimation can make the query optimiser produce a good execution plan. Although there are some studies on cardinality estimation, the prediction results of existing cardinality estimators are inaccurate and query efficiency cannot be guaranteed either. In particular, they find it difficult to accurately capture the complex relationships between multiple tables in complex database systems. When dealing with complex queries, the existing cardinality estimators cannot achieve good results. In this study, a novel cardinality estimator is proposed. It uses the core techniques of a BiLSTM network structure and adds an attention mechanism. First, the columns involved in the query statements in the training set are sampled and compressed into bitmaps. Then, the Word2vec model is used to embed the word vectors about the query statements. Finally, the BiLSTM network and attention mechanism are employed to deal with the word vectors. The proposed model takes into consideration not only the correlation between tables but also the processing of complex predicates. Extensive experiments and an evaluation of the BiLSTM-Attention Cardinality Estimator (BACE) on the IMDB datasets are conducted. The results show that the deep learning model can significantly improve the quality of cardinality estimation, which plays a vital role in query optimisation for complex databases.
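A minimal model of this kind can be sketched in PyTorch as below: token embeddings (standing in for the Word2vec vectors) feed a bidirectional LSTM, an attention layer pools the hidden states, and a small head regresses the (log-)cardinality. The bitmap inputs, real tokenisation and training pipeline of BACE are omitted.

```python
import torch
import torch.nn as nn

class BiLSTMAttnCardinality(nn.Module):
    """Minimal sketch of a BiLSTM + attention regressor over tokenised queries.
    Random token ids stand in for Word2vec-embedded query/predicate tokens; the real
    BACE model additionally consumes bitmap samples of the referenced columns."""
    def __init__(self, vocab=1000, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)            # scores each time step
        self.head = nn.Sequential(nn.Linear(2 * hidden, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, tokens):                          # tokens: (batch, seq_len) int64
        h, _ = self.lstm(self.embed(tokens))            # (batch, seq_len, 2*hidden)
        scores = torch.softmax(self.attn(h), dim=1)     # attention weights over time steps
        context = (scores * h).sum(dim=1)               # weighted sum = attended summary
        return self.head(context).squeeze(-1)           # predicted log-cardinality

model = BiLSTMAttnCardinality()
tokens = torch.randint(0, 1000, (8, 24))                # a batch of 8 tokenised queries
log_card = torch.rand(8) * 10                           # fake log-cardinality labels
loss = nn.functional.mse_loss(model(tokens), log_card)
loss.backward()
print(float(loss))
```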
Capsule endoscopy (CE) has proved to be an important non-invasive tool for diagnosing and monitoring Crohn's disease patients. It has the advantage of excellent visualization of the digestive tract mucosa, and good tolerability and safety in well-selected patients. The risk of retention can be diminished by good selection of patients using imaging techniques and by the use of a patency capsule. The aim of a capsule examination is not only an early diagnosis but also a very good stratification of prognosis, thus directing the treatment strategy towards either a step-up or top-down approach and also permitting the optimization of treatment depending on the findings. When symptoms and biomarkers point to a change in the disease's activity, we can either adjust the treatment directly, as recommended in the CALM study, or choose in selected patients to visualize the digestive mucosa through CE and take a decision afterwards. The appearance of the new capsule from Medtronic, the PillCam Crohn's, might be an important step forward in diagnosis, evaluating disease extent, the severity of the disease, prognosis, and management in a treat-to-target approach, with treatment modifications according to the data from CE examination. Serial examinations in the same patient can be compared, and a more objective evaluation of lesion changes from one examination to another can be performed. We present the latest developments, current status and evidence that, in selected patients, the capsule can be a tool in a treat-to-target approach.
A method for packing irregular particles with a prescribed volume fraction is proposed. Furthermore, the generated granular material adheres to the prescribed statistical distribution and satisfies the desired complex spatial arrangement. First, the irregular geometries of the realistic particles were obtained from the original particle images. Second, the Minkowski sum was used to check the overlap between irregular particles and place an irregular particle in contact with other particles. Third, the optimised advance front method (OAFM) generated irregular particle packing with the prescribed statistical distribution and volume fraction based on the Minkowski sum. Moreover, the signed distance function was introduced to pack the particles in accordance with the desired spatial arrangement. Finally, seven biaxial tests were performed using the UDEC software, which demonstrated the accuracy and potential usefulness of the proposed method. It can model granular material efficiently and reflect the meso-structural characteristics of complex granular materials. This method has a wide range of applications where discrete modelling of granular media is necessary.
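The Minkowski-sum overlap test can be illustrated for convex 2D particles: two convex polygons overlap exactly when the origin lies inside their Minkowski difference, built here as the convex hull of all pairwise vertex differences. Real particles in the paper are irregular (and handled together with the advance front placement and signed distance function), which this sketch does not reproduce.

```python
import numpy as np

def cross2(o, a, b):
    """z-component of (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(map(tuple, points)))
    if len(pts) <= 2:
        return np.array(pts, float)
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross2(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = half(pts), half(list(reversed(pts)))
    return np.array(lower[:-1] + upper[:-1], float)

def overlaps(A, B):
    """Convex particles A and B overlap iff the origin lies inside the Minkowski
    difference A + (-B), built here as the hull of all pairwise vertex differences."""
    D = convex_hull([(ax - bx, ay - by) for ax, ay in A for bx, by in B])
    n = len(D)
    for i in range(n):
        if cross2(D[i], D[(i + 1) % n], (0.0, 0.0)) < 0:   # origin outside this edge
            return False
    return True

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
tri_far = [(5, 5), (7, 5), (6, 7)]
tri_near = [(1, 1), (3, 1), (2, 3)]
print(overlaps(square, tri_far), overlaps(square, tri_near))   # expected: False, True
```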
Expansive soils are problematic due to the behaviour of their clay mineral constituents, which makes them exhibit shrink-swell characteristics. The shrink-swell behaviour makes expansive soils inappropriate for direct engineering application in their natural form. In an attempt to make them more feasible for construction purposes, numerous materials and techniques have been used to stabilise the soil. In this study, the additives and techniques applied for stabilising expansive soils are focused on, with respect to their efficiency in improving the engineering properties of the soils. We then discuss the microstructural interaction, chemical process, economic implication, nanotechnology application, as well as waste reuse and sustainability. Some issues regarding the effective application of the emerging trends in expansive soil stabilisation are presented in three categories, namely geoenvironmental, standardisation and optimisation issues. Techniques like predictive modelling and exploring methods such as reliability-based design optimisation, response surface methodology, dimensional analysis, and artificial intelligence technology are also proposed in order to ensure that expansive soil stabilisation is efficient.
Self-piercing riveting (SPR) is a cold forming technique used to fasten together two or more sheets of materials with a rivet without the need to predrill a hole. The application of SPR in the automotive sector has become increasingly popular, mainly due to the growing use of lightweight materials in transportation applications. However, SPR joining of these advanced light materials remains a challenge, as these materials often lack a good combination of high strength and ductility to resist the large plastic deformation induced by the SPR process. In this paper, SPR joints of advanced materials and their corresponding failure mechanisms are discussed, aiming to provide the foundation for future improvement of SPR joint quality. This paper is divided into three major sections: 1) joint failures, focusing on joint defects originating from the SPR process and joint failure modes under different mechanical loading conditions; 2) joint corrosion issues; and 3) joint optimisation via process parameters and advanced techniques.