Funding: Supported by the National Natural Science Foundation of China (Grant No. 62072031), the Applied Basic Research Foundation of Yunnan Province (Grant No. 2019FD071), and the Yunnan Scientific Research Foundation Project (Grant 2019J0187).
Abstract: With the development of vehicles towards intelligence and connectivity, vehicular data is diversifying and growing dramatically. A task allocation model and algorithm for heterogeneous Intelligent Connected Vehicle (ICV) applications are proposed for a dispersed computing network composed of heterogeneous task vehicles and Network Computing Points (NCPs). Considering the amount of task data and the idle resources of the NCPs, a computing resource scheduling model for the NCPs is established. Taking the heterogeneous task execution delay threshold as a constraint, the optimization problem is formulated as maximizing the utilization of the computing resources of the NCPs. The problem is proven to be NP-hard by reduction from the 0-1 knapsack problem. A many-to-many matching algorithm based on resource preferences is proposed. The algorithm first establishes mutual preference lists based on how well the task requirements fit the resources provided by the NCPs, which allows un-schedulable NCPs to be filtered out in the initial stage of matching and reduces the dimension of the solution space. The matching between ICVs and NCPs is then solved by a new many-to-many matching algorithm that yields a unique and stable optimal matching result. Simulation results demonstrate that the proposed scheme improves the resource utilization of the NCPs by an average of 9.6% compared with the reference scheme, and the total performance can be improved by up to 15.9%.
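The abstract gives no pseudocode; the following is a minimal, hypothetical Python sketch of preference-list matching between ICV tasks and computing points, simplified to a greedy one-NCP-per-task assignment (class names, resource fields, and the ranking rule are illustrative assumptions, not the paper's many-to-many algorithm):

```python
# Minimal, hypothetical sketch of preference-list matching between ICV tasks
# and Network Computing Points (NCPs). Simplified to a greedy one-NCP-per-task
# assignment; the paper's algorithm is many-to-many and proves stability.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Task:
    name: str
    demand: int                      # resource units the task needs (assumed unit)

@dataclass
class NCP:
    name: str
    idle: int                        # idle resource units offered
    accepted: List[str] = field(default_factory=list)

def preference_list(task: Task, ncps: List[NCP]) -> List[NCP]:
    """Rank only NCPs that can host the task at all (un-schedulable NCPs are
    filtered out here), preferring the tightest feasible fit so that NCP
    resource utilization stays high."""
    feasible = [n for n in ncps if n.idle >= task.demand]
    return sorted(feasible, key=lambda n: n.idle - task.demand)

def match(tasks: List[Task], ncps: List[NCP]) -> Dict[str, List[str]]:
    # Place larger tasks first so that big demands are not starved.
    for task in sorted(tasks, key=lambda t: -t.demand):
        for ncp in preference_list(task, ncps):
            if ncp.idle >= task.demand:
                ncp.accepted.append(task.name)
                ncp.idle -= task.demand
                break
    return {n.name: n.accepted for n in ncps}

if __name__ == "__main__":
    tasks = [Task("t1", 4), Task("t2", 2), Task("t3", 3)]
    ncps = [NCP("ncp1", 5), NCP("ncp2", 4)]
    print(match(tasks, ncps))        # {'ncp1': ['t3', 't2'], 'ncp2': ['t1']}
```

Ranking only feasible NCPs mirrors the idea of filtering un-schedulable NCPs before matching; the paper's algorithm additionally performs many-to-many assignment and establishes that the result is unique and stable.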
Funding: Supported by the Sichuan Province Science and Technology Support Program (2022JDTD0016, 2020YFG0176), Chengdu Science and Technology (2021-RC02-00005-CG), Sichuan Golden-Elephant Sincerity Chemical Co., Ltd (HX2020019), Zigong City Science and Technology (2019CXRC01, 2020YGJC13), the Opening Project of the Material Corrosion and Protection Key Laboratory of Sichuan Province (2019CL05, 2020CL19, 2018CL07), the Opening Project of Sichuan Province, the Foundation of Introduced Talent of Sichuan University of Science and Engineering (2017RCL16, 2019RC05, 2019RC07, 2020RC16), the Opening Project of Key Laboratories of Fine Chemicals and Surfactants in Sichuan Provincial Universities (2020JXY04), and Xi'an Weijingyi Art and Culture Communication Co., Ltd (HX2021385).
Abstract: To date, there has been no research dealing with biological waste as a filler in polyphenylene sulfide (PPS). In this study, oyster shells were recycled and treated to prepare thermally-treated oyster shells (TOS), which were used as PPS fillers to make new bio-based antibacterial composite materials. The effect of varying the TOS content was studied by means of structure and performance characterization. PPS/TOS composites were demonstrated to have an antibacterial effect on the growth of E. coli and S. aureus. Qualitative analysis showed that when the TOS content was ≥30% (i.e., 30% and 40%), the composites had a distinct inhibition zone. Quantitative analysis showed that the antibacterial activity increased with the TOS content. Fourier transform infrared spectroscopy indicated the formation of hydrogen bonds between the molecular chains of TOS and PPS and the occurrence of a coordination reaction. At 10% TOS, the composite tensile strength reached a maximum value of 72.5 MPa, which is 9.65% higher than that of pure PPS. The bending properties followed the same trend as the tensile properties, with the maximum reached for the composite with 10% TOS. At the same time, the crystallinity and contact angle were the highest, and the permeability coefficient was the lowest. The fatigue test results indicated that for the composite with 10% TOS, the tensile strength after fatigue was 23% lower than the static tensile strength, and the yield strength was 10% lower than the static yield strength. The results of the study showed that TOS not only reduces the cost of PPS but also imparts antibacterial properties and enhances the mechanical and barrier properties, the thermal stability, and the crystallinity.
Funding: This work was supported in part by the Sichuan Science and Technology Program (Grant No. 2022YFG0174) and in part by the Sichuan Gas Turbine Research Institute stability support project of China Aero Engine Group Co., Ltd (Grant No. GJCZ-0034-19).
Abstract: In the field of computer research, the growth of data driven by societal progress has been remarkable, and the management of this data and the analysis of related businesses have grown in popularity. The ability to extract key characteristics from secondary property data and use them to forecast house prices has numerous practical uses. The most popular approach to analyzing pricing information is to use regression methods in machine learning to segment the data set, examine the major factors affecting prices, and forecast house prices. It is challenging to generate precise forecasts because many of the regression models currently used in research cannot efficiently capture the distinctive features that are strongly correlated with house price movement. Ensemble learning is a prevalent and popular methodology in current forecasting studies, but the regression integration of large housing datasets can consume considerable computing resources and computation time, and ensemble learning requires more resources and machine support to integrate diverse models. The Average Model proposed in this paper uses the concept of fusion to produce integrated analysis results from several models, combining the strengths of the separate models. The Average Model is broadly applicable to regression prediction and significantly increases computational efficiency; the technique is also easy to replicate and effective in regression studies. Before applying regression processing techniques, this work averages different regression models using the AM (Average Model) algorithm in a novel way. By evaluating essential models with 90% accuracy, this technique significantly increases the accuracy of house price predictions. The experimental results show that the AM algorithm proposed in this paper has lower prediction error and markedly higher prediction accuracy than the comparison algorithms, and performs well in house price prediction.
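The abstract does not spell out the fusion step; a minimal sketch of averaging the predictions of several regression models is given below. The choice of base models, the synthetic data, and the unweighted mean are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch of an "average of models" regressor: fit several base
# regressors and use the mean of their predictions as the fused output.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_models = [LinearRegression(), Ridge(alpha=1.0),
               RandomForestRegressor(n_estimators=100, random_state=0)]

for model in base_models:
    model.fit(X_train, y_train)

# Fuse the base predictions by simple (unweighted) averaging.
avg_pred = np.mean([m.predict(X_test) for m in base_models], axis=0)
rmse = np.sqrt(np.mean((avg_pred - y_test) ** 2))
print(f"averaged-model RMSE: {rmse:.2f}")
```

Because the fusion is a single mean over already-trained models, it adds almost no cost on top of the base fits, which is the computational advantage the abstract refers to.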
Funding: Supported by the National Natural Science Foundation of China (No. 12001286) and the China Postdoctoral Science Foundation (No. 2022M711672).
Abstract: This paper presents a general framework for sparse portfolio optimization using the mean-CVaR (Conditional Value-at-Risk) model and regularization techniques. The framework incorporates a non-negativity constraint to prevent the portfolio from being too heavily weighted in certain assets. We propose a specific ADMM (alternating direction method of multipliers) scheme for solving the model and provide a subsequential convergence analysis for theoretical completeness. To demonstrate the effectiveness of the framework, we consider the l_1 and SCAD (smoothly clipped absolute deviation) penalties as notable instances within the unified framework. Additionally, we introduce a novel combination of the CVaR-based model with l_1/l_2 regularization. We examine the ADMM subproblems associated with CVaR and the presented regularization functions, using the gradient descent method to solve the subproblem related to CVaR and the proximal operator to evaluate the subproblems associated with the penalty functions. Finally, we evaluate the proposed framework through a series of parametric and out-of-sample experiments, which show that it achieves favorable out-of-sample performance. We also compare the performance of the proposed nonconvex penalties with that of convex ones, highlighting the advantages of nonconvex penalties such as improved sparsity and better risk control.
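As a point of reference, a standard sample-based (Rockafellar-Uryasev) form of such a regularized mean-CVaR model with non-negative weights is sketched below; the exact constraints, notation, and penalty in the paper may differ.

```latex
\min_{x \ge 0,\ \zeta \in \mathbb{R}} \quad
  \zeta + \frac{1}{(1-\alpha)S}\sum_{s=1}^{S}\max\{0,\, -r_s^{\top}x - \zeta\}
  + \lambda\, p(x)
\quad \text{s.t.} \quad x \in \mathcal{X}
```

Here $r_s$ are sampled asset return vectors, $\alpha$ is the CVaR confidence level, $\mathcal{X}$ collects the portfolio constraints (for example a budget and a target-return condition), and $p(x)$ is the penalty ($\|x\|_1$, SCAD, or the $\ell_1/\ell_2$ ratio in the variants considered). ADMM splits the CVaR term from the penalty so that, as the abstract describes, the CVaR subproblem can be handled by gradient descent and the penalty subproblem by its proximal operator.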
Funding: Supported by the National Basic Research Program (973) of China (No. 2008CB418006), the National Special Program of Water Environment (No. 2009ZX07106-001-002), the National Natural Science Foundation of China (No. 31070355), and the National Major Science and Technology Program for Water Pollution Control and Treatment (No. 2009ZX07101-013).
Abstract: A quantitative protocol for the rapid analysis of Microcystis cells and colonies in lake sediment was developed using a modified flow cytometer, the CytoSense. For cell enumeration, diluted sediment samples containing Microcystis were sonicated to disintegrate colonies into single cells. An optimized procedure indicated that a 5 mg dw (dry weight)/mL dilution combined with 200 W × 2 min sonication yielded the highest counting efficiency. Under the optimized conditions, the quantification limit of the protocol was 3.3×10^4 cells/g dw. For colony analysis, Microcystis colonies were isolated from the sediment by filtration. Colony lengths measured by flow cytometry were similar to those measured by microscopy over the size range from a single cell to almost 400 μm in length. Moreover, the relationship between colony size and cell number was determined for three Microcystis species: Microcystis flos-aquae, M. aeruginosa, and M. wesenbergii. Regression formulas were used to calculate the cell numbers in colonies of different sizes. The developed protocol was applied to field sediment samples from Lake Taihu. The results indicate the potential and applicability of flow cytometry as a tool for the rapid analysis of benthic Microcystis. This study provides a new capability for high-frequency monitoring of benthic overwintering and population dynamics of this bloom-forming cyanobacterium.
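The species-specific regression formulas themselves are not reproduced in the abstract; purely to illustrate how such formulas would be applied, the sketch below uses hypothetical power-law coefficients (the functional form and all numbers are assumptions, not values from the study):

```python
# Illustrative only: hypothetical coefficients relating colony length (um)
# to cell number; the study's species-specific regression formulas are not
# given in the abstract.
HYPOTHETICAL_COEFFS = {              # assumed form: cells = a * length**b
    "Microcystis flos-aquae": (0.05, 2.0),
    "M. aeruginosa":          (0.04, 2.1),
    "M. wesenbergii":         (0.03, 2.2),
}

def cells_in_colony(species: str, length_um: float) -> float:
    a, b = HYPOTHETICAL_COEFFS[species]
    return a * length_um ** b

# Total cell estimate over a set of measured colony lengths (example values).
lengths_um = [50.0, 120.0, 380.0]
total = sum(cells_in_colony("M. aeruginosa", L) for L in lengths_um)
print(f"estimated cells across colonies: {total:.0f}")
```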
Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 51902137 and 51672113), the Key Research and Development Plan (Grant No. BE2019094), and the Qing Lan Project ([2016]15) of Jiangsu Province. The calculations were carried out by the Advanced Computing East China Sub-center and the Big Data Center of Southeast University.
Abstract: Converting water into hydrogen fuel while simultaneously oxidizing benzyl alcohol to benzaldehyde under visible-light illumination is of great significance, but the fast recombination of photogenerated carriers in photocatalysts seriously decreases the conversion efficiency. Herein, a novel dual-functional 0D Cd_(0.5)Zn_(0.5)S/2D Ti_(3)C_(2) hybrid was fabricated by a solvothermal in-situ assembly method. The Cd_(0.5)Zn_(0.5)S nanospheres with a fluffy surface completely and uniformly covered the ultrathin Ti_(3)C_(2) nanosheets, increasing the number of Schottky barrier (SB) sites thanks to the large contact area, which accelerates electron–hole separation and improves light utilization. The optimized Cd_(0.5)Zn_(0.5)S/Ti_(3)C_(2) hybrid simultaneously delivers a hydrogen evolution rate of 5.3 mmol/(g·h) and a benzaldehyde production rate of 29.3 mmol/(g·h), which are ~3.2 and 2 times higher than those of pristine Cd_(0.5)Zn_(0.5)S, respectively. Both multiple experimental measurements and density functional theory (DFT) calculations further demonstrate the tight connection between Cd_(0.5)Zn_(0.5)S and Ti_(3)C_(2), the formation of a Schottky junction, and efficient photogenerated electron–hole separation. This paper presents a dual-functional composite catalyst for photocatalytic hydrogen evolution and benzaldehyde production, and provides a new strategy for preventing photogenerated electrons and holes from recombining by constructing a 0D/2D heterojunction with an increased number of SB sites.
Funding: Support for this work from the National Natural Science Foundation of China (No. 51775445), the Fundamental Research Funds for the Central Universities of China (No. 31020190503008), the Xi'an Science and Technology Project (No. 201805042YD20CG26(9)), and the Natural Science Basic Research Plan in Shaanxi Province of China (No. 2019JM-349) is thankfully acknowledged.
Abstract: In the fabrication of aero-engine blades, much is gained when massive material removal at the end of the process is avoided and as little material as possible is left on the blade billet. Because of uncertainty in the preceding processes, billet shapes are inconsistent, and sometimes the near-net-shape billet does not cover the blade design surface to be cut. Therefore, blade localization is necessary for these billets before machining. In conventional localization methods, the design surface is positioned so as to guarantee enough material to be cut. However, because the to-be-cut surface is near-net-shape and free-form, it is difficult to find a valid localized surface model from which to generate the tool path. Unlike previous investigations, in which the localized surface is treated as rigid, here it is allowed to deviate from the design surface by no more than the tolerance band. Based on this principle, the tolerance band is exploited to improve the localization ability. A series of optimization models with different priorities is established to avoid abandoning expensive blade billets. Finally, experiments performed on near-net-shape blades verify the blade localization theory and the improvement in localization ability.
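As a rough sketch of how the tolerance band can enter the localization model (the variables and objective below are illustrative assumptions, not the paper's exact formulation), one priority level could be written as:

```latex
\begin{aligned}
\min_{R,\,t,\,\varepsilon}\quad
  & \sum_{i}\max\bigl\{0,\ \delta_{\min} - d_i(R,t) - \varepsilon_i\bigr\} \\
\text{s.t.}\quad
  & |\varepsilon_i| \le \tau \ \ \forall i, \qquad R^{\top}R = I,\ \det R = 1,
\end{aligned}
```

where $(R,t)$ is the rigid localization transform, $d_i(R,t)$ the machining allowance between the transformed design surface and the billet at sample point $i$, $\delta_{\min}$ the minimum required allowance, and $\varepsilon_i$ the local deviation of the localized surface from the design surface, bounded by the tolerance band $\tau$. Treating $\varepsilon$ as a decision variable rather than fixing it to zero is what enlarges the set of billets that can still be machined.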
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 11205003, 11305001, and 11575002, the Key Research Foundation of the Education Ministry of Anhui Province of China under Grant Nos. KJ2017A032, KJ2016A749, and KJ2013A260, and the Natural Science Foundation of West Anhui University under Grant No. WXZR201614.
Abstract: The Inert Doublet Model (IDM) is one of many beyond-Standard-Model scenarios with an extended scalar sector that provide a suitable dark matter particle candidate. Production of dark matter in association with visible particles at high-energy colliders provides a unique way to determine the microscopic properties of the dark matter particle. In this paper, we investigate mono-W plus missing transverse energy production at the Large Hadron Collider (LHC), where the W boson decays to a lepton and a neutrino. We analyze the mono-W signal in the IDM together with the Standard Model (SM) backgrounds, and optimized selection criteria with suitable cuts on kinematic variables are chosen to maximize the signal significance. We also investigate the discovery potential in several benchmark scenarios at the 14 TeV LHC. When the light Z_2-odd scalar has a mass of about 65 GeV and the charged Higgs mass lies in the range from 120 GeV to 250 GeV, the prospects are best, with a signal significance of about 3σ at an integrated luminosity of about 3000 fb^(-1).
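For orientation, a significance of roughly 3σ at 3000 fb^(-1) corresponds to a simple counting-experiment estimate of the kind sketched below; the cross sections after cuts are placeholders, and the paper's significance definition may be more refined.

```python
# Illustrative counting-experiment significance; all rates are placeholders,
# not values from the paper.
import math

lumi_fb = 3000.0            # integrated luminosity [fb^-1]
sigma_sig_fb = 0.04         # hypothetical signal cross section after cuts [fb]
sigma_bkg_fb = 0.50         # hypothetical background cross section after cuts [fb]

S = sigma_sig_fb * lumi_fb  # expected signal events
B = sigma_bkg_fb * lumi_fb  # expected background events
Z = S / math.sqrt(S + B)    # simple significance estimate
print(f"S = {S:.0f}, B = {B:.0f}, significance ~ {Z:.1f} sigma")
```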