This article introduces the concept of load aggregation, which involves a comprehensive analysis of loads to acquire their external characteristics for the purpose of modeling and analyzing power systems. The online identification method is a computer-aided approach to data collection, processing, and system identification, commonly used for adaptive control and prediction. This paper proposes a method for dynamically aggregating large-scale adjustable loads to support the integration of a high proportion of new energy, aiming to study the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction methods. The experiment selected 300 central air conditioners as the research subject and analyzed their regulation characteristics, economic efficiency, and comfort. The experimental results show that as the adjustment time of the air conditioners increases from 5 minutes to 35 minutes, the stable adjustment quantity during the adjustment period decreases from 28.46 to 3.57, indicating that air conditioning loads can be controlled over a long period and have better adjustment effects in the short term. Overall, the experimental results demonstrate that analyzing the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction algorithms is effective.
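The abstract does not specify the online identification algorithm, so the sketch below uses recursive least squares (RLS) with a forgetting factor, a common choice for online identification, fitted to a hypothetical first-order load-response model y[k] = a·y[k-1] + b·u[k]; the model form and all names are illustrative, not taken from the paper.

```python
# Minimal recursive least squares (RLS) sketch for online identification
# of an aggregate load model y[k] = a*y[k-1] + b*u[k].
# Model form and parameter values are illustrative assumptions.

def rls_step(theta, P, phi, y, lam=0.99):
    """One RLS update with forgetting factor lam.
    theta: parameter estimates [a, b]; P: 2x2 covariance; phi: regressor [y_prev, u]."""
    Pphi = [P[0][0]*phi[0] + P[0][1]*phi[1],
            P[1][0]*phi[0] + P[1][1]*phi[1]]          # P @ phi
    denom = lam + phi[0]*Pphi[0] + phi[1]*Pphi[1]
    K = [Pphi[0]/denom, Pphi[1]/denom]                # gain vector
    err = y - (theta[0]*phi[0] + theta[1]*phi[1])     # prediction error
    theta = [theta[0] + K[0]*err, theta[1] + K[1]*err]
    # Covariance update: P = (P - K * phi^T P) / lam  (P stays symmetric)
    P = [[(P[i][j] - K[i]*Pphi[j]) / lam for j in range(2)] for i in range(2)]
    return theta, P

# Simulate a load with true parameters a=0.8, b=0.5 and identify them online.
a_true, b_true = 0.8, 0.5
theta, P = [0.0, 0.0], [[100.0, 0.0], [0.0, 100.0]]
y_prev = 0.0
for k in range(200):
    u = 1.0 if (k // 20) % 2 == 0 else -1.0           # square-wave excitation
    y = a_true*y_prev + b_true*u
    theta, P = rls_step(theta, P, [y_prev, u], y)
    y_prev = y
print(theta)  # converges near [0.8, 0.5]
```

With persistent excitation and noiseless data, the estimates converge to the true parameters; in practice a forgetting factor below 1 lets the identifier track slowly drifting load characteristics.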
Assessment of past-climate simulations of regional climate models (RCMs) is important for understanding the reliability of RCMs when used to project future regional climate. Here, we assess the performance and discuss possible causes of biases in a WRF-based RCM with a grid spacing of 50 km, named WRFG, from the North American Regional Climate Change Assessment Program (NARCCAP), in simulating wet season precipitation over the Central United States for a period when observational data are available. The RCM reproduces key features of the precipitation distribution during late spring to early summer, although it tends to underestimate the magnitude of precipitation. This dry bias is partially due to the model's lack of skill in simulating nocturnal precipitation, related to the absence of eastward-propagating convective systems in the simulation. Inaccuracy in reproducing large-scale circulation and environmental conditions is another contributing factor. The simulated pressure gradient between the Rocky Mountains and the Gulf of Mexico is too weak, resulting in weaker southerly winds between them and a reduction of warm moist air transport from the Gulf to the Central Great Plains. The simulated low-level horizontal convergence fields are also less favorable for upward motion than in the NARR, and hence for the development of moist convection. Therefore, a careful examination of an RCM's deficiencies and the identification of the sources of errors are important when using the RCM to project precipitation changes in future climate scenarios.
Condensed and hydrolysable tannins are non-toxic natural polyphenols and a commercial commodity industrialized for tanning hides to obtain leather and for a growing number of other industrial applications, mainly as substitutes for petroleum-based products. They are a definite class of sustainable materials from the forestry industry. They have been used for hundreds of years to manufacture leather and now serve a growing number of applications in a variety of other industries, such as wood adhesives, metal coatings, and pharmaceutical/medical applications. This review presents the main sources, either already commercial or potentially so, of these forestry by-products; their industrial and laboratory extraction systems; and their methods of analysis with their respective advantages and drawbacks, whether these methods are simple enough to appear primitive yet of proven effectiveness, or modern and instrumental. It constitutes a basic but essential summary of what one needs to know about these sustainable materials. In doing so, the review highlights some of the main challenges that remain to be addressed to deliver the quality and economics of tannin supply necessary to fulfill industrial production requirements for some materials-based uses.
Cultural relic line graphics serve as a crucial form of traditional artifact documentation; they are simple, intuitive products that are inexpensive to display compared with 3D models. Dimensionality reduction is undoubtedly necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D and 3D factor that measures the importance of points in describing surface features, the vertex weight, was designed. The selection threshold for feature guideline extraction was then determined based on the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments conducted on real cultural relic surface data. The results demonstrate the ability of the method to address the challenges associated with the automatic generation of line drawings for complex surfaces. The extraction method and the obtained results will be useful for the drawing, display, and promotion of cultural relic line graphics.
Lithium recovery from spent lithium-ion batteries (LIBs) has attracted extensive attention due to the skyrocketing price of lithium. Medium-temperature carbon reduction roasting is proposed for the preferential selective extraction of lithium from spent LiCoO2 (LCO) cathodes, overcoming the incomplete recovery and loss of lithium during recycling. The LCO layered structure was destroyed and lithium was completely converted into water-soluble Li2CO3 at a suitable temperature that controlled the reduced state of the cobalt oxide. The Co metal agglomerates generated during medium-temperature carbon reduction roasting were broken up by wet grinding and ultrasonic crushing to release the entrained lithium. The results showed that 99.10% of the total lithium could be recovered as Li2CO3 with a purity of 99.55%. This work provides a new perspective on the preferentially selective extraction of lithium from spent lithium batteries.
Traditional large-scale multi-objective optimization algorithms (LSMOEAs) encounter difficulties when dealing with sparse large-scale multi-objective optimization problems (SLMOPs), where most decision variables are zero. As a result, many algorithms use a two-layer encoding approach to optimize the binary variable Mask and the real variable Dec separately. Nevertheless, existing optimizers often focus on locating non-zero variable positions to optimize the binary Mask. However, approximating the sparse distribution of real Pareto optimal solutions does not necessarily mean that the objective function is optimized. In data mining, it is common to mine frequent itemsets appearing together in a dataset to reveal correlations between data. Inspired by this, we propose a novel two-layer encoding learning swarm optimizer based on frequent itemsets (TELSO) to address these SLMOPs. TELSO mines the frequent items of multiple particles with better objective values to find mask combinations that can obtain better objective values for fast convergence. Experimental results on five real-world problems and eight benchmark sets demonstrate that TELSO outperforms existing state-of-the-art sparse large-scale multi-objective evolutionary algorithms (SLMOEAs) in terms of performance and convergence speed.
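The core idea, mining which non-zero mask positions co-occur among better particles, can be sketched with a tiny pair-counting routine; this is a minimal stand-in for full frequent-itemset mining, and the masks and threshold are made-up illustrations, not TELSO's actual procedure.

```python
# Toy frequent-itemset step: find non-zero index pairs that co-occur in
# at least min_support of the elite particles' binary masks.
from itertools import combinations

def frequent_nonzero_pairs(masks, min_support):
    """Count co-occurring non-zero index pairs across binary masks and
    keep those with support >= min_support."""
    counts = {}
    for mask in masks:
        nz = [i for i, v in enumerate(mask) if v]
        for pair in combinations(nz, 2):
            counts[pair] = counts.get(pair, 0) + 1
    return {p: c for p, c in counts.items() if c >= min_support}

# Elite masks: positions 1 and 3 are consistently non-zero together,
# suggesting they belong in the next generation's mask combinations.
elite = [
    [0, 1, 0, 1, 0],
    [0, 1, 0, 1, 1],
    [1, 1, 0, 1, 0],
]
print(frequent_nonzero_pairs(elite, min_support=3))  # {(1, 3): 3}
```

In a real optimizer the frequent pairs (or larger itemsets) would seed new mask candidates, biasing the search toward variable combinations that repeatedly appear in high-quality solutions.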
Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large scale of these problems means a high-dimensional decision space, requiring algorithms to traverse a vast expanse with limited computational resources. Furthermore, owing to sparsity, most variables in Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper is dedicated to addressing the challenges posed by SLMOPs. To start, we introduce innovative objective functions customized to mine maximum and minimum candidate sets. This enhancement dramatically improves the efficacy of frequent pattern mining: selecting candidate sets is no longer based on the number of non-zero variables they contain but on a higher proportion of non-zero variables within specific dimensions. Additionally, we present a novel approach to association rule mining that delves into the intricate relationships between non-zero variables. This methodology aids in identifying sparse distributions that can potentially expedite reductions in the objective function value. We extensively tested our algorithm across eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across various challenges.
Electrochemical lithium extraction from salt lakes is an effective strategy for obtaining lithium at low cost. Nevertheless, the elevated Mg:Li ratio and the presence of numerous coexisting ions in salt lake brines give rise to challenges such as prolonged lithium extraction periods, diminished lithium extraction efficiency, and considerable environmental pollution. In this work, LiFePO4 (LFP) served as the electrode material for electrochemical lithium extraction. The conductive network in the LFP electrode was optimized by adjusting the type of conductive agent, resulting in high lithium extraction efficiency and extended cycle life. When the single conductive agent, acetylene black (AB) or multiwalled carbon nanotubes (MWCNTs), was replaced with the mixed conductive agent AB/MWCNTs, the average diffusion coefficient of Li+ in the electrode increased from 2.35×10^-9 or 1.77×10^-9 to 4.21×10^-9 cm^2·s^-1. At a current density of 20 mA·g^-1, the average lithium extraction capacity per gram of LFP electrode increased from 30.36 mg with the single conductive agent (AB) to 35.62 mg with the mixed conductive agent (AB/MWCNTs). With the mixed conductive agent, the capacity retention of the electrode after 30 cycles reached 82.9%, considerably higher than the 65.8% obtained with AB alone, and the electrode provided good cycling performance. When the conductive agent content decreased or the loading capacity increased, the electrode containing the mixed conductive agent continued to show excellent electrochemical performance. Furthermore, a self-designed, highly efficient, continuous lithium extraction device was constructed. The electrode utilizing the AB/MWCNT mixed conductive agent maintained excellent adsorption capacity and cycling performance in this device. This work provides a new perspective on the electrochemical extraction of lithium using LFP electrodes.
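The reported figures can be combined into a quick back-of-the-envelope comparison of the two conductive agents; the calculation below uses only the capacities and retention percentages stated in the abstract.

```python
# Effective per-gram Li extraction capacity after 30 cycles, implied by
# the abstract's reported initial capacities and retention percentages.

def capacity_after_cycling(initial_mg, retention_pct):
    """Capacity (mg) remaining after cycling at the given retention."""
    return initial_mg * retention_pct / 100.0

# Reported values: single AB agent vs. mixed AB/MWCNT agent.
single_ab = capacity_after_cycling(30.36, 65.8)
mixed = capacity_after_cycling(35.62, 82.9)
print(f"AB: {single_ab:.2f} mg, AB/MWCNT: {mixed:.2f} mg after 30 cycles")
```

The mixed agent's advantage compounds: it starts higher (35.62 vs 30.36 mg) and fades more slowly (82.9% vs 65.8% retention), so the gap widens over cycling.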
Tetracycline and its analogues are among the most used antibiotics in the dairy industry. Besides therapeutic uses, tetracyclines are often incorporated into livestock feed as growth promoters. A considerable amount of antibiotics is released unaltered through the milk of dairy animals. The presence of antibiotic residues in milk and their subsequent consumption can lead to potential health impacts, including cancer, hypersensitivity reactions, and the development of antibiotic resistance. Thus, it is important to monitor residual levels of tetracyclines in milk. The purpose of this study is to develop a quick and simple method for simultaneously extracting five tetracycline analogues from bovine milk. Specifically, five tetracycline analogues, chlortetracycline (CTC), demeclocycline (DEM), doxycycline (DC), minocycline (MC), and tetracycline (TC), were simultaneously extracted from milk using trifluoroacetic acid. Subsequently, the extracted analogues were separated by reverse-phase high-performance liquid chromatography (RP-HPLC) and detected at 355 nm using UV/Vis. Calibration curves for all five tetracycline analogues show excellent linearity (r^2 > 0.99). Percent recoveries for MC, TC, DEM, CTC, and DC were 31.88%, 96.91%, 151.29%, 99.20%, and 85.58%, respectively. The developed extraction method has good precision (RSD < 9.9% for 4 of the 5 analogues). The developed method, with minimal sample preparation and pretreatment, has the potential to serve as an initial screening test.
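The linearity check behind a calibration curve is an ordinary least-squares fit with r^2 computed from the residuals. The sketch below shows that computation on hypothetical standard concentrations and peak areas; the numbers are invented for illustration and are not the study's data.

```python
# Ordinary least squares fit and r^2, as used to verify calibration-curve
# linearity (the study requires r^2 > 0.99). Data below are hypothetical.

def linear_fit(x, y):
    """Fit y = m*x + b and return (m, b, r_squared)."""
    n = len(x)
    mx, my = sum(x)/n, sum(y)/n
    sxx = sum((xi - mx)**2 for xi in x)
    sxy = sum((xi - mx)*(yi - my) for xi, yi in zip(x, y))
    m = sxy / sxx
    b = my - m*mx
    ss_res = sum((yi - (m*xi + b))**2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my)**2 for yi in y)
    return m, b, 1.0 - ss_res/ss_tot

def percent_recovery(measured, spiked):
    """Recovery (%) of a spiked analyte."""
    return 100.0 * measured / spiked

# Hypothetical tetracycline standards (ug/mL) vs. peak area at 355 nm.
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
area = [10.2, 20.1, 40.5, 79.8, 160.3]
m, b, r2 = linear_fit(conc, area)
print(round(r2, 4))  # close to 1, i.e. r^2 > 0.99
```

An unknown sample's concentration is then read off the fitted line as (area - b) / m, and recovery is checked by spiking blank milk at a known level.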
A bedding slope is a typical heterogeneous slope consisting of different soil/rock layers and is likely to slide along the weakest interface. Conventional protection methods for bedding slopes, such as retaining walls, stabilizing piles, and anchors, are time-consuming and labor- and energy-intensive. This study proposes an innovative polymer grout method to improve the bearing capacity and reduce the displacement of bedding slopes. A series of large-scale model tests were carried out to verify the effectiveness of polymer grout in protecting bedding slopes. Specifically, load-displacement relationships and failure patterns were analyzed for different test slopes with various dosages of polymer. Results show the great potential of polymer grout in improving bearing capacity, reducing settlement, and protecting slopes from being crushed under shearing. The polymer-treated slopes remained structurally intact, while the untreated slope exhibited considerable damage when subjected to loads surpassing its bearing capacity. It was also found that polymer-cemented soils concentrate around the injection pipe, forming a fan-shaped, sheet-like structure. This study demonstrates the improvement provided by polymer grouting for bedding slope treatment and will contribute to the development of a fast method to protect bedding slopes from landslides.
Joint entity relation extraction models that integrate the semantic information of relations are favored by researchers because of their effectiveness in resolving overlapping entities, and methods that manually define relation semantic templates are particularly prominent in extraction performance because they can capture the deep semantic information of relations. However, manual definition has drawbacks, such as reliance on expert experience and poor portability. Inspired by rule-based entity relation extraction, this paper proposes a joint entity relation extraction model based on automatically constructed relation semantic templates, abbreviated RSTAC. The model refines the extraction rules of relation semantic templates from a relation corpus through dependency parsing, realizing the automatic construction of relation semantic templates. Based on these templates, the processes of relation classification and triple extraction are constrained, and finally the entity relation triples are obtained. Experimental results on three major Chinese datasets, DuIE, SanWen, and FinRE, show that the RSTAC model successfully captures rich deep relation semantics and improves the extraction of entity relation triples, increasing F1 scores by an average of 0.96% compared with classical joint extraction models such as CasRel, TPLinker, and RFBFN.
The spatial distributions of different kinds of ions are usually not identical during extraction. To study the reason for these different ion extraction characteristics, a simplified simulation model of the Cu+ and Cr+ ion extraction process was established using the 2D3V (two-dimensional in space and three-dimensional in velocity space) particle-in-cell (PIC) method. The effects of extraction voltages from 0 V to 500 V on the density distributions of Cu+ and Cr+ ions and the change of the plasma emission surface were analyzed. On the basis of this model, the ion density distribution characteristics of Cu+ ions mixed with Li+, Mg+, K+, Fe+, Y+, Ag+, Xe+, Au+, and Pb+ ions, respectively, under a 200 V extraction voltage were further simulated. The results reveal that the atomic mass of the ions is the key reason for the different ion density distributions when different kinds of ions are mixed and extracted, which provides support for further understanding of ion extraction characteristics.
Accurate positioning is one of the essential requirements for numerous applications of remote sensing data, especially in the event of a noisy or unreliable satellite signal. Toward this end, we present a novel framework for large-range aircraft geo-localization that requires only a downward-facing monocular camera, an altimeter, a compass, and an open-source Vector Map (VMAP). The algorithm combines matching and particle filter methods. A shape vector and the correlation between two building contour vectors are defined, and a coarse-to-fine building vector matching (CFBVM) method is proposed for the matching stage, in which the original matching results are described by a Gaussian mixture model (GMM). Subsequently, an improved resampling strategy is designed to reduce the computational expense of a huge number of initial particles, and a credibility indicator is designed to avoid localization mistakes in the particle filter stage. An experimental evaluation of the approach based on flight data is provided. On a flight at a height of 0.2 km over a distance of 2 km, the aircraft is geo-localized within a reference map of 11,025 km² using 0.09 km² aerial images without any prior information. The absolute localization error is less than 10 m.
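The paper's improved resampling strategy is not detailed in the abstract; as background, the standard low-variance baseline it would build on is systematic resampling, sketched below. The particle values and weights are illustrative.

```python
# Systematic (low-variance) resampling: draw one random offset, then take
# evenly spaced positions through the cumulative weight distribution.
import random

def systematic_resample(particles, weights):
    """Return a new particle set of the same size, sampled in proportion
    to the (unnormalized, non-negative) weights."""
    n = len(particles)
    u0 = random.random()
    positions = [(u0 + i) / n for i in range(n)]      # evenly spaced in (0, 1)
    cumw, total = [], 0.0
    for w in weights:
        total += w
        cumw.append(total)
    out, j = [], 0
    for p in positions:                               # positions are sorted,
        while cumw[j] < p * total:                    # so j only moves forward
            j += 1
        out.append(particles[j])
    return out

# A degenerate case: one particle holds all the weight, so it is cloned.
parts = ["a", "b", "c", "d"]
print(systematic_resample(parts, [0.0, 0.0, 1.0, 0.0]))  # ['c', 'c', 'c', 'c']
```

Compared with multinomial resampling, the single shared offset keeps the selection variance low, which matters when pruning a huge initial particle cloud down to plausible aircraft poses.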
Dexamethasone is classified as a corticosteroid and is commonly used among cancer patients to decrease swelling around the tumor. Among patients with cancer, in particular brain tumors, seizures can become a daily routine in their everyday lives. To counteract the seizures, an antiepileptic drug such as phenytoin is administered to act as an anticonvulsant. Phenytoin and dexamethasone are frequently administered concurrently to brain cancer patients. A previous study has shown that phenytoin serum concentration decreases when administered concurrently with dexamethasone. Thus, it is important to monitor the concentrations of these two drugs in biological samples to ensure that the proper dosages are administered to patients. This study aims to develop an effective extraction and detection method for dexamethasone and phenytoin. A reverse-phase high-performance liquid chromatography (HPLC) method with UV/Vis detection has been developed to separate phenytoin and dexamethasone from urine samples at 219 nm and 241 nm, respectively. The mobile phase consists of a mixture of 0.01 M KH2PO4, acetonitrile, and methanol adjusted to pH 5.6 (48:32:20) and is pumped at a flow rate of 1.0 mL/min. Calibration curves were prepared for phenytoin and dexamethasone (r^2 > 0.99). An efficient solid-phase extraction (SPE) method for extracting dexamethasone and phenytoin from urine samples was developed using C-18 cartridges. The percent recoveries for phenytoin and dexamethasone are 95.4% (RSD = 1.15%) and 81.1% (RSD = 3.56%), respectively.
The large-scale multi-objective optimization algorithm (LSMOA), based on the grouping of decision variables, is an advanced method for handling high-dimensional decision variables. However, in practical problems, the interactions among decision variables are intricate, leading to large group sizes and suboptimal optimization effects; hence, a large-scale multi-objective optimization algorithm based on weighted overlapping grouping of decision variables (MOEAWOD) is proposed in this paper. Initially, the decision variables are perturbed and categorized into convergence and diversity variables; subsequently, the convergence variables are subdivided into groups based on the interactions among different decision variables. If the size of a group surpasses a set threshold, that group undergoes weighted overlapping grouping. Specifically, the interaction strength is evaluated based on the interaction frequency and the number of objectives among the decision variables. The decision variable with the highest interaction in the group is identified and set aside, the remaining variables are reclassified into subgroups, and finally the decision variable with the strongest interaction is added to each subgroup. MOEAWOD minimizes the interactivity between different groups and maximizes the interactivity of decision variables within groups, which contributes to optimized convergence and diversity exploration across the groups. MOEAWOD was tested on 18 benchmark large-scale optimization problems, and the experimental results demonstrate the effectiveness of our method, which remains advantageous compared with the other algorithms.
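The perturbation step that detects interactions between decision variables can be illustrated with the classic differential-grouping test: two variables interact if the effect of perturbing one depends on the value of the other. This is a generic sketch of that test, not MOEAWOD's exact weighting scheme.

```python
# Perturbation-based interaction test: x[i] and x[j] interact under f if
# the change in f caused by perturbing x[i] depends on the value of x[j].

def interacts(f, x, i, j, delta=1.0, eps=1e-9):
    """Return True if variables i and j interact under objective f."""
    def with_(vec, k, v):
        y = list(vec)
        y[k] = v
        return y
    # Effect of perturbing x[i] at the base point...
    d1 = f(with_(x, i, x[i] + delta)) - f(x)
    # ...versus the same perturbation after x[j] has been shifted.
    x2 = with_(x, j, x[j] + delta)
    d2 = f(with_(x2, i, x2[i] + delta)) - f(x2)
    return abs(d1 - d2) > eps

# Example objective: x0 and x1 interact via a product term; x2 is separable.
f = lambda x: x[0] * x[1] + x[2] ** 2
base = [1.0, 1.0, 1.0]
print(interacts(f, base, 0, 1))  # True
print(interacts(f, base, 0, 2))  # False
```

Running this test over variable pairs yields the interaction structure from which groups are formed; MOEAWOD additionally weights these interactions by frequency and by the number of objectives they affect.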
With the development of big data and social computing, large-scale group decision making (LGDM) is now merging with social networks. Using social network analysis (SNA), this study proposes an LGDM consensus model that considers the trust relationships among decision makers (DMs). In the consensus measurement process, the social network is constructed according to the social relationships among DMs, and the Louvain method is introduced to partition the social network into subgroups. The weights of each decision maker and each subgroup are computed from comprehensive network weights and trust weights. In the consensus improvement process, a feedback mechanism with four identification rules and two direction rules is designed to guide the improvement. Based on the trust relationships among DMs, preferences are modified and the corresponding social network is updated to accelerate consensus. Compared with previous research, the proposed model not only allows the subgroups to be reconstructed and updated during the adjustment process but also improves the accuracy of the adjustment through the feedback mechanism. Finally, an example analysis is conducted to verify the effectiveness and flexibility of the proposed method, and comparisons with previous studies highlight its superiority in solving the LGDM problem.
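As a toy illustration of weighted consensus measurement, the sketch below aggregates DM preferences by weight and scores agreement as one minus the weighted mean deviation; the formula, preference scale, and numbers are assumptions for illustration, not the paper's actual consensus index.

```python
# Toy weighted consensus measure: aggregate preferences by DM weight,
# then score consensus as 1 minus the weighted spread around the group view.
# Preferences are assumed to be scaled to [0, 1]; all numbers are invented.

def consensus_degree(preferences, weights):
    """Return (group_preference, consensus_index); index 1 means full agreement."""
    total = sum(weights)
    group = sum(w * p for w, p in zip(weights, preferences)) / total
    spread = sum(w * abs(p - group) for w, p in zip(weights, preferences)) / total
    return group, 1.0 - spread

prefs = [0.8, 0.7, 0.75, 0.2]   # the fourth DM disagrees with the rest
w = [0.3, 0.3, 0.3, 0.1]        # weights from network centrality and trust
group, cd = consensus_degree(prefs, w)
print(round(group, 3), round(cd, 3))
```

A feedback mechanism would flag the low-weight dissenting DM (preference 0.2, far from the group value) and nudge its preference toward trusted neighbors until the consensus index crosses a threshold.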
The separation of aromatics from aliphatics is essential for achieving maximum exploitation of oil resources in the petrochemical industry. In this study, a series of metal chloride-based ionic liquids were prepared and their performance in separating 1,2,3,4-tetrahydronaphthalene (tetralin)/dodecane and tetralin/decalin systems was studied. Among these ionic liquids, 1-ethyl-3-methylimidazolium tetrachloroferrate ([EMIM][FeCl4]), which showed the highest selectivity, was used as the extractant. Density functional theory calculations showed that [EMIM][FeCl4] interacted more strongly with tetralin than with dodecane or decalin. Energy decomposition analysis of [EMIM][FeCl4]-tetralin indicated that electrostatics and dispersion played essential roles and that induction cannot be neglected, while independent gradient model analysis showed that van der Waals forces were a main effect. The tetralin distribution coefficient and selectivity were 0.8 and 110, respectively, with 10% (mol) tetralin in the initial tetralin/dodecane system, and 0.67 and 19.5, respectively, with 10% (mol) tetralin in the initial tetralin/decalin system. The selectivity increased with decreasing alkyl chain length of the extractant. The influence of the extraction temperature, extractant dosage, and initial concentrations of the system components on the separation performance was studied. Recycling experiments showed that the regenerated [EMIM][FeCl4] could be reused repeatedly.
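The two figures of merit quoted above follow the standard liquid-liquid extraction definitions: a distribution coefficient D per solute, and selectivity S as the ratio of the two D values. The sketch below just encodes those definitions and combines them with the reported tetralin/dodecane numbers.

```python
# Standard liquid-liquid extraction metrics.

def distribution_coefficient(x_extract, x_raffinate):
    """D = solute fraction in the extract phase / fraction in the raffinate."""
    return x_extract / x_raffinate

def selectivity(d_aromatic, d_aliphatic):
    """S = D(aromatic) / D(aliphatic): preference of the solvent for the aromatic."""
    return d_aromatic / d_aliphatic

# Reported tetralin/dodecane result: D(tetralin) = 0.8, S = 110, which
# implies an effective dodecane distribution coefficient of about 0.8/110.
d_tetralin = 0.8
s = 110.0
d_dodecane = d_tetralin / s
print(round(d_dodecane, 4))  # 0.0073
```

The small implied D for dodecane is what makes the ionic liquid useful: almost all of the alkane stays in the raffinate while a workable fraction of the tetralin transfers.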
The characteristics of the extracted ion current have a significant impact on the design and testing of ion source performance. In this paper, a 2D-in-space, 3D-in-velocity-space particle-in-cell (2D3V PIC) method is utilized to simulate plasma motion and ion extraction characteristics under various initial plasma velocity distributions and extraction voltages in a Cartesian coordinate system. The plasma density is of the order of 10^15-10^16 m^-3 and the extraction voltage is of the order of 100-1000 V. The study investigates the impact of various extraction voltages on the velocity and density distributions of electrons and positive ions, and analyzes the influence of different initial plasma velocity distributions on the extraction current. The simulation results reveal that the main cause of the variation in extraction current is the space-charge force formed by the relative aggregation of positive and negative net charges. This lays the foundation for a deeper understanding of extraction beam characteristics.
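The space-charge limitation on an extracted ion current is classically captured by the Child-Langmuir law for a planar gap, J = (4ε0/9)·sqrt(2q/m)·V^(3/2)/d². The sketch below evaluates it for illustrative parameters (Cu+ ions, 200 V, a 5 mm gap); these numbers are assumptions chosen for the example, not values from the paper.

```python
# Child-Langmuir space-charge-limited current density for a planar gap:
#   J = (4*eps0/9) * sqrt(2*q/m) * V^(3/2) / d^2
import math

EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # elementary charge, C
AMU = 1.66054e-27     # atomic mass unit, kg

def child_langmuir_j(voltage_v, gap_m, ion_mass_kg):
    """Space-charge-limited current density (A/m^2) for singly charged ions."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / ion_mass_kg) \
           * voltage_v ** 1.5 / gap_m ** 2

# Illustrative case: Cu+ (about 63.5 u) across a 5 mm gap at 200 V.
m_cu = 63.5 * AMU
j = child_langmuir_j(200.0, 5e-3, m_cu)
print(f"{j:.3f} A/m^2")  # about 0.78 A/m^2
```

The V^(3/2) scaling shows why raising the extraction voltage increases the current the space charge will admit, and the 1/sqrt(m) factor is consistent with the abstract's finding that ion mass drives differences between extracted species.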
Biometric recognition is a widely used technology for user authentication. In applying this technology, biometric security and recognition accuracy are two important issues to consider. For biometric security, cancellable biometrics is an effective technique for protecting biometric data. For recognition accuracy, feature representation plays a significant role in the performance and reliability of cancellable biometric systems. How to design good feature representations for cancellable biometrics is a challenging topic that has attracted a great deal of attention from the computer vision community, especially from researchers of cancellable biometrics. Feature extraction and learning in cancellable biometrics seek suitable feature representations that achieve satisfactory recognition performance while the privacy of the biometric data is protected. This survey reports the progress, trends, and challenges of feature extraction and learning for cancellable biometrics, shedding light on the latest developments and future research in this area.
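A widely cited cancellable-biometric transform is BioHashing: project the feature vector onto user-specific seeded random directions and keep only the signs, so the template can be revoked by issuing a new seed. The sketch below is a minimal version of that general idea; the feature vector and seeds are invented for illustration.

```python
# BioHashing-style cancellable template: seeded random projection followed
# by sign binarization. Changing the seed revokes and reissues the template.
import random

def biohash(feature_vec, seed, n_bits=16):
    """Return an n_bits binary template from a real-valued feature vector."""
    rng = random.Random(seed)                 # user-specific, revocable key
    bits = []
    for _ in range(n_bits):
        direction = [rng.gauss(0.0, 1.0) for _ in feature_vec]
        dot = sum(f * d for f, d in zip(feature_vec, direction))
        bits.append(1 if dot >= 0 else 0)     # keep only the sign
    return bits

def hamming(a, b):
    """Bit differences between two templates (matching score)."""
    return sum(x != y for x, y in zip(a, b))

feat = [0.3, -1.2, 0.7, 2.1]                  # toy biometric feature vector
t1 = biohash(feat, seed=42)
t2 = biohash(feat, seed=42)                   # same user, same seed: identical
t3 = biohash(feat, seed=7)                    # reissued template: differs w.h.p.
print(hamming(t1, t2), hamming(t1, t3))       # first value is 0
```

The binary template reveals only signs of random projections, so the original features are hard to reconstruct, while genuine comparisons under the same seed still yield small Hamming distances.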
The oceanic trace metals iron (Fe), nickel (Ni), copper (Cu), zinc (Zn), and cadmium (Cd) are crucial to marine phytoplankton growth and the global carbon cycle, and the analysis of their stable isotopes can provide valuable insights into their biogeochemical cycles within the ocean. However, the simultaneous isotopic analysis of multiple elements in seawater is challenging because of their low concentrations, the limited volumes of test samples, and the high salt matrix. In this study, we present a novel method for the simultaneous analysis of five isotope systems from a 1 L seawater sample. In the developed method, NOBIAS Chelate-PA1 resin was used to extract the metals from seawater, AG MP-1M anion-exchange resin to purify Cu, Fe, Zn, and Cd, and NOBIAS Chelate-PA1 resin to further separate Ni from the matrix elements. Finally, a multi-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) was employed for the isotopic measurements, using a double-spike technique or sample-standard bracketing combined with internal normalization. The method exhibited low total procedural blanks (0.04 pg, 0.04 pg, 0.21 pg, 0.15 pg, and 3 pg for Ni, Cu, Fe, Zn, and Cd, respectively) and high extraction efficiencies (100.5%±0.3%, 100.2%±0.5%, 97.8%±1.4%, 99.9%±0.8%, and 100.1%±0.2% for Ni, Cu, Fe, Zn, and Cd, respectively). The external errors and external precision of the method are negligible. The proposed method was further tested on seawater samples spanning the full vertical profile of a water column during the Chinese GEOTRACES GP09 cruise in the Northwest Pacific, and the results showed good agreement with previously reported data. This method will contribute to the advancement of isotope research and enhance our understanding of the marine biogeochemical cycling of Fe, Ni, Cu, Zn, and Cd.
Funding: supported by the State Grid Science & Technology Project (5100-202114296A-0-0-00).
Abstract: This article introduces the concept of load aggregation, which involves a comprehensive analysis of loads to acquire their external characteristics for the purpose of modeling and analyzing power systems. The online identification method is a computer-based approach to data collection, processing, and system identification, commonly used for adaptive control and prediction. This paper proposes a method for dynamically aggregating large-scale adjustable loads to support the integration of high proportions of new energy, aiming to study the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction methods. The experiment selected 300 central air conditioners as the research subject and analyzed their regulation characteristics, economic efficiency, and comfort. The experimental results show that as the adjustment time of the air conditioners increases from 5 minutes to 35 minutes, the stable adjustment quantity during the adjustment period decreases from 28.46 to 3.57, indicating that air-conditioning loads can be controlled over a long period and have better adjustment effects in the short term. Overall, the experimental results demonstrate that analyzing the aggregation characteristics of regional large-scale adjustable loads with online identification techniques and feature extraction algorithms is effective.
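The abstract does not state which online identification algorithm is used; a common choice for this kind of adaptive load modeling is recursive least squares (RLS) with a forgetting factor. The sketch below is a generic illustration under that assumption, not the paper's method: the first-order load model y = a·y_prev + b·u, the noise level, and all parameter values are invented for the demo.

```python
import numpy as np

def rls_step(theta, P, x, y, lam=0.99):
    """One recursive-least-squares update: theta holds the model
    parameters, P the covariance, x the regressor, y the measurement,
    and lam a forgetting factor for tracking slowly varying loads."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    err = y - x @ theta              # one-step prediction error
    theta = theta + k * err
    P = (P - np.outer(k, Px)) / lam
    return theta, P

# Identify a hypothetical first-order aggregate load y = a*y_prev + b*u
rng = np.random.default_rng(0)
a_true, b_true = 0.9, 0.5
theta = np.zeros(2)
P = np.eye(2) * 100.0
y_prev = 0.0
for _ in range(500):
    u = rng.normal()                                  # control input
    y = a_true * y_prev + b_true * u + 0.01 * rng.normal()
    theta, P = rls_step(theta, P, np.array([y_prev, u]), y)
    y_prev = y
```

With the forgetting factor below 1, the same loop can track an aggregate load whose parameters drift over time, which is what makes the method suitable for online use.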
Abstract: Assessment of past-climate simulations of regional climate models (RCMs) is important for understanding the reliability of RCMs when used to project future regional climate. Here, we assess the performance, and discuss possible causes of biases, of a WRF-based RCM with a grid spacing of 50 km, named WRFG, from the North American Regional Climate Change Assessment Program (NARCCAP), in simulating wet-season precipitation over the Central United States for a period when observational data are available. The RCM reproduces key features of the precipitation distribution during late spring to early summer, although it tends to underestimate the magnitude of precipitation. This dry bias is partially due to the model's lack of skill in simulating nocturnal precipitation, which is related to the absence of eastward-propagating convective systems in the simulation. Inaccuracy in reproducing large-scale circulation and environmental conditions is another contributing factor. The simulated pressure gradient between the Rocky Mountains and the Gulf of Mexico is too weak, producing weaker southerly winds between them and thereby reducing the transport of warm moist air from the Gulf to the Central Great Plains. The simulated low-level horizontal convergence fields are also less favorable for upward motion, and hence for the development of moist convection, than in the NARR. Therefore, a careful examination of an RCM's deficiencies and identification of the sources of error are important when using the RCM to project precipitation changes under future climate scenarios.
Abstract: Condensed and hydrolysable tannins are non-toxic natural polyphenols that are a commercial commodity, industrialized for tanning hides to obtain leather and for a growing number of other industrial applications, mainly as substitutes for petroleum-based products. They are a distinct class of sustainable materials from the forestry industry, having been used for hundreds of years to manufacture leather and now finding application in a variety of other industries, such as wood adhesives, metal coating, and pharmaceutical/medical products. This review presents the main sources, either already commercial or potentially so, of these forestry by-products; their industrial and laboratory extraction systems; and their methods of analysis with their advantages and drawbacks, whether these methods are so simple as to appear primitive yet of proven effectiveness, or modern and instrumental. It constitutes a basic but essential summary of what one needs to know about these sustainable materials. In doing so, the review highlights some of the main challenges that remain to be addressed to deliver the quality and economics of tannin supply necessary to meet industrial production requirements for materials-based uses.
Funding: National Natural Science Foundation of China (Nos. 42071444, 42101444).
Abstract: Cultural relic line graphics serve as a crucial form of traditional artifact documentation; compared with 3D models, they are a simple, intuitive, and low-cost means of display. Dimensionality reduction is therefore necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality-reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature-guideline extraction method for complex cultural relic surfaces. A combined 2D and 3D factor, the vertex weight, was designed to measure the importance of points in describing surface features. The selection threshold for feature-guideline extraction was then determined from the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate the method's ability to address the challenges of automatically generating line drawings for complex surfaces. The extraction method and the obtained results will be useful for drawing, displaying, and publicizing cultural relic line graphics.
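The shape index referred to above is the standard Koenderink measure computed from the two principal curvatures at a surface vertex; the paper's vertex weight is its own construct and is not reproduced here. As background, a minimal sketch of the shape index itself:

```python
import math

def shape_index(k1, k2):
    """Koenderink shape index from principal curvatures, k1 >= k2.
    Ranges over [-1, 1]: +1 for a dome, 0 for a saddle, -1 for a cup."""
    if k1 == k2:  # umbilical point (locally spherical or planar)
        return 1.0 if k1 > 0 else (-1.0 if k1 < 0 else 0.0)
    return (2.0 / math.pi) * math.atan((k1 + k2) / (k1 - k2))
```

Comparing the distribution of this quantity with the vertex-weight distribution is what the paper uses to pick its selection threshold.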
Funding: the Science and Technology Key Project of Anhui Province, China (No. 2022e03020004).
Abstract: Lithium recovery from spent lithium-ion batteries (LIBs) has attracted extensive attention due to the skyrocketing price of lithium. Medium-temperature carbon reduction roasting is proposed for the preferential selective extraction of lithium from spent LiCoO2 (LCO) cathodes, to overcome the incomplete recovery and loss of lithium during the recycling process. The LCO layered structure was destroyed and lithium was completely converted into water-soluble Li2CO3 at a temperature chosen to control the reduction state of the cobalt oxide. The Co metal agglomerates generated during medium-temperature carbon reduction roasting were broken up by wet grinding and ultrasonic crushing to release the entrained lithium. The results showed that 99.10% of the total lithium could be recovered as Li2CO3 with a purity of 99.55%. This work provides a new perspective on the preferential selective extraction of lithium from spent lithium batteries.
Funding: supported by the Scientific Research Project of Xiang Jiang Lab (22XJ02003), the University Fundamental Research Fund (23-ZZCX-JDZ-28), the National Science Fund for Outstanding Young Scholars (62122093), the National Natural Science Foundation of China (72071205), the Hunan Graduate Research Innovation Project (ZC23112101-10), the Hunan Natural Science Foundation Regional Joint Project (2023JJ50490), the Science and Technology Project for Young and Middle-aged Talents of Hunan (2023TJ-Z03), and the Science and Technology Innovation Program of Hunan Province (2023RC1002).
Abstract: Traditional large-scale multi-objective optimization algorithms (LSMOEAs) encounter difficulties when dealing with sparse large-scale multi-objective optimization problems (SLMOPs), in which most decision variables are zero. As a result, many algorithms use a two-layer encoding approach to optimize the binary variable Mask and the real variable Dec separately. Existing optimizers often focus on locating the positions of non-zero variables in order to optimize the binary Mask. However, approximating the sparse distribution of the real Pareto optimal solutions does not necessarily mean that the objective function is optimized. In data mining, it is common to mine frequent itemsets appearing together in a dataset to reveal correlations in the data. Inspired by this, we propose a novel two-layer encoding learning swarm optimizer based on frequent itemsets (TELSO) to address these SLMOPs. TELSO mines the frequent terms of multiple particles with better objective values to find mask combinations that yield better objective values for fast convergence. Experimental results on five real-world problems and eight benchmark sets demonstrate that TELSO outperforms existing state-of-the-art sparse large-scale multi-objective evolutionary algorithms (SLMOEAs) in both performance and convergence speed.
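TELSO's mining step is described only at a high level in the abstract; the sketch below illustrates the general idea of mining variable positions that occur frequently in the masks of better-scoring particles. The toy data, the 50% elite cut, and the support threshold are illustrative assumptions, not the paper's settings.

```python
from collections import Counter

def frequent_terms(masks, fitness, min_support=0.6):
    """Return variable indices that appear in at least min_support of
    the masks belonging to the better half of the swarm (lower fitness
    is better)."""
    order = sorted(range(len(masks)), key=lambda i: fitness[i])
    elite = [masks[i] for i in order[: max(1, len(masks) // 2)]]
    counts = Counter(idx for m in elite for idx in m)
    need = min_support * len(elite)
    return {idx for idx, c in counts.items() if c >= need}

# Four particles: sets of non-zero variable positions and their fitness
masks = [{0, 3}, {0, 3, 7}, {0, 5}, {1, 2, 4, 6}]
fitness = [0.1, 0.2, 0.3, 0.9]
core = frequent_terms(masks, fitness)  # positions shared by good masks
```

Positions mined this way can seed new mask combinations, which is the mechanism the abstract credits for TELSO's fast convergence.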
Funding: supported by the Open Project of Xiangjiang Laboratory (22XJ02003), the University Fundamental Research Fund (23-ZZCX-JDZ-28, ZK21-07), the National Science Fund for Outstanding Young Scholars (62122093), the National Natural Science Foundation of China (72071205), the Hunan Graduate Research Innovation Project (CX20230074), the Hunan Natural Science Foundation Regional Joint Project (2023JJ50490), the Science and Technology Project for Young and Middle-aged Talents of Hunan (2023TJZ03), and the Science and Technology Innovation Program of Hunan Province (2023RC1002).
Abstract: Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large scale of these problems means a high-dimensional decision space, requiring algorithms to traverse a vast expanse with limited computational resources. Furthermore, because of sparsity, most variables in Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper is dedicated to addressing the challenges posed by SLMOPs. To start, we introduce innovative objective functions customized to mine maximum and minimum candidate sets. This enhancement dramatically improves the efficacy of frequent pattern mining: candidate sets are selected not by the number of non-zero variables they contain but by a higher proportion of non-zero variables within specific dimensions. Additionally, we unveil a novel approach to association rule mining that delves into the relationships between non-zero variables and aids in identifying sparse distributions that can expedite reductions in the objective function value. We extensively tested our algorithm on eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across various challenges.
Funding: financially supported by the National Natural Science Foundation of China (No. 52072322) and the Department of Science and Technology of Sichuan Province, China (Nos. 23GJHZ0147, 23ZDYF0262, 2022YFG0294, and 2019-GH02-00052-HZ).
Abstract: Electrochemical lithium extraction from salt lakes is an effective strategy for obtaining lithium at low cost. Nevertheless, the elevated Mg:Li ratio and the numerous coexisting ions in salt-lake brines give rise to challenges such as prolonged lithium extraction periods, diminished extraction efficiency, and considerable environmental pollution. In this work, LiFePO4 (LFP) served as the electrode material for electrochemical lithium extraction. The conductive network in the LFP electrode was optimized by adjusting the type of conductive agent, resulting in high lithium extraction efficiency and extended cycle life. When the single conductive agent, acetylene black (AB) or multiwalled carbon nanotubes (MWCNTs), was replaced with the mixed conductive agent AB/MWCNTs, the average diffusion coefficient of Li+ in the electrode increased from 2.35×10⁻⁹ or 1.77×10⁻⁹ to 4.21×10⁻⁹ cm²·s⁻¹. At a current density of 20 mA·g⁻¹, the average lithium extraction capacity per gram of LFP electrode increased from 30.36 mg with the single conductive agent (AB) to 35.62 mg with the mixed conductive agent (AB/MWCNTs). With the mixed conductive agent, the capacity retention of the electrode after 30 cycles reached 82.9%, considerably higher than the 65.8% obtained with AB alone, and the electrode showed good cycling performance. When the conductive agent content decreased or the loading increased, the electrode containing the mixed conductive agent continued to show excellent electrochemical performance. Furthermore, a self-designed, highly efficient, continuous lithium extraction device was constructed, in which the electrode with the AB/MWCNT mixed conductive agent maintained excellent adsorption capacity and cycling performance. This work provides a new perspective on the electrochemical extraction of lithium using LFP electrodes.
Abstract: Tetracycline and its analogues are among the most used antibiotics in the dairy industry. Besides therapeutic uses, tetracyclines are often incorporated into livestock feed as growth promoters. A considerable amount of these antibiotics is released unaltered into the milk of dairy animals. The presence of antibiotic residues in milk and their subsequent consumption can lead to potential health impacts, including cancer, hypersensitivity reactions, and the development of antibiotic resistance. It is therefore important to monitor residual levels of tetracyclines in milk. The purpose of this study is to develop a quick and simple method for simultaneously extracting five tetracycline analogues from bovine milk. Specifically, chlortetracycline (CTC), demeclocycline (DEM), doxycycline (DC), minocycline (MC), and tetracycline (TC) were simultaneously extracted from milk using trifluoroacetic acid. The extracted analogues were then separated by reverse-phase high-performance liquid chromatography (RP-HPLC) and detected at 355 nm using UV/Vis. Calibration curves for all five tetracycline analogues show excellent linearity (r² > 0.99). Percent recoveries for MC, TC, DEM, CTC, and DC were 31.88%, 96.91%, 151.29%, 99.20%, and 85.58%, respectively. The developed extraction method has good precision (RSD < 9.9% for 4 of the 5 analogues). With minimal sample preparation and pretreatment, the method has the potential to serve as an initial screening test.
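The linearity and recovery figures above come from a standard validation workflow: fit a calibration line by ordinary least squares and report r², then express the recovered amount as a percentage of the spiked amount. A minimal, generic version of that arithmetic (the calibration numbers below are invented, not the study's data):

```python
def linear_fit(x, y):
    """Least-squares line y = a*x + b plus the coefficient of
    determination r^2, as used for an HPLC calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

def percent_recovery(found, spiked):
    """Recovered amount as a percentage of the spiked amount."""
    return 100.0 * found / spiked

# Hypothetical standards: concentration (ug/mL) vs. peak area
a, b, r2 = linear_fit([1.0, 2.0, 4.0, 8.0], [10.1, 19.8, 40.2, 79.9])
```

A recovery well above 100%, like the 151.29% reported for DEM, typically signals a matrix interference co-eluting with the analyte rather than a true gain of material.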
Funding: supported by the Fujian Science Foundation for Outstanding Youth (Grant No. 2023J06039), the National Natural Science Foundation of China (Grant Nos. 41977259 and U2005205), and the Fujian Province Natural Resources Science and Technology Innovation Project (Grant No. KY-090000-04-2022-019).
Abstract: A bedding slope is a typical heterogeneous slope consisting of different soil/rock layers and is likely to slide along the weakest interface. Conventional protection methods for bedding slopes, such as retaining walls, stabilizing piles, and anchors, are time-consuming and labor- and energy-intensive. This study proposes an innovative polymer grouting method to improve the bearing capacity and reduce the displacement of bedding slopes. A series of large-scale model tests was carried out to verify the effectiveness of polymer grout in protecting bedding slopes. Specifically, load-displacement relationships and failure patterns were analyzed for testing slopes with various dosages of polymer. The results show the great potential of polymer grout for improving bearing capacity, reducing settlement, and protecting slopes from being crushed under shearing. The polymer-treated slopes remained structurally intact, while the untreated slope exhibited considerable damage when subjected to loads surpassing its bearing capacity. It was also found that polymer-cemented soils concentrate around the injection pipe, forming a fan-shaped, sheet-like structure. This study demonstrates the improvement polymer grouting brings to bedding slope treatment and will contribute to the development of a fast method to protect bedding slopes from landslides.
Funding: supported by the National Natural Science Foundation of China (Nos. U1804263, U1736214, 62172435) and the Zhongyuan Science and Technology Innovation Leading Talent Project (No. 214200510019).
Abstract: Joint entity-relation extraction models that integrate the semantic information of relations are favored by researchers because of their effectiveness in handling overlapping entities, and manually defining relation semantic templates is particularly effective because it captures the deep semantic information of relations. However, this manual approach relies on expert experience and ports poorly to new domains. Inspired by rule-based entity-relation extraction, this paper proposes a joint entity-relation extraction model based on automatically constructed relation semantic templates, abbreviated RSTAC. The model refines extraction rules for relation semantic templates from a relation corpus through dependency parsing, realizing the automatic construction of the templates. Based on the relation semantic templates, the processes of relation classification and triplet extraction are constrained, and finally the entity-relation triplets are obtained. Experimental results on three major Chinese datasets (DuIE, SanWen, and FinRE) show that RSTAC successfully captures rich deep relation semantics and improves the extraction of entity-relation triples, raising F1 scores by an average of 0.96% over classical joint extraction models such as CasRel, TPLinker, and RFBFN.
Funding: the Presidential Foundation of the China Academy of Engineering Physics (Grant No. YZJJZQ2022016) and the National Natural Science Foundation of China (Grant No. 52207177).
Abstract: The spatial distributions of different kinds of ions are usually not completely the same during extraction. To study the reasons for the different characteristics of ion extraction, a simplified simulation model of the Cu+ and Cr+ ion extraction process was established using the 2D3V (two-dimensional in space, three-dimensional in velocity space) particle-in-cell (PIC) method. The effects of extraction voltages from 0 V to 500 V on the density distributions of Cu+ and Cr+ ions and on the change of the plasma emission surface were analyzed. On the basis of this model, the ion density distribution characteristics of Cu+ ions mixed respectively with Li+, Mg+, K+, Fe+, Y+, Ag+, Xe+, Au+, and Pb+ ions under a 200-V extraction voltage were further simulated. The results reveal that the atomic mass of the ions is the key reason for the different ion density distributions when different kinds of ions are mixed and extracted, which supports a further understanding of ion extraction characteristics.
Abstract: Accurate positioning is an essential requirement for numerous applications of remote sensing data, especially in the event of a noisy or unreliable satellite signal. Toward this end, we present a novel framework for aircraft geo-localization over a large range that requires only a downward-facing monocular camera, an altimeter, a compass, and an open-source Vector Map (VMAP). The algorithm combines matching and particle-filter methods. A shape vector and the correlation between two building contour vectors are defined, and a coarse-to-fine building vector matching (CFBVM) method is proposed for the matching stage, in which the original matching results are described by a Gaussian mixture model (GMM). Subsequently, an improved resampling strategy is designed to reduce the computational expense of a huge number of initial particles, and a credibility indicator is designed to avoid localization mistakes in the particle-filter stage. An experimental evaluation of the approach based on flight data is provided. On a flight at a height of 0.2 km over a distance of 2 km, the aircraft is geo-localized within a reference map of 11,025 km² using 0.09 km² aerial images without any prior information. The absolute localization error is less than 10 m.
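The paper's improved resampling strategy is not detailed in the abstract; as background, the sketch below shows plain systematic resampling, a common baseline that such strategies refine. It draws a single random offset and sweeps the cumulative weights with evenly spaced pointers, which is O(N) and preserves particle diversity better than N independent draws.

```python
import random

def systematic_resample(particles, weights, rng=None):
    """Resample particles in proportion to their weights using one
    random offset and evenly spaced pointers (systematic resampling)."""
    rng = rng or random.Random(0)
    n = len(particles)
    step = sum(weights) / n
    u = rng.uniform(0.0, step)      # single random offset in [0, step)
    out, cum, i = [], weights[0], 0
    for _ in range(n):
        while cum < u:              # advance to the particle whose
            i += 1                  # cumulative weight covers pointer u
            cum += weights[i]
        out.append(particles[i])
        u += step
    return out
```

A degenerate weight vector collapses the set onto the surviving hypothesis, which is why the paper pairs resampling with a credibility indicator to guard against committing to a wrong location.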
Abstract: Dexamethasone is classified as a corticosteroid and is commonly used among cancer patients to decrease swelling around the tumor. Among patients with cancer, in particular brain tumors, seizures can become part of everyday life. To counteract the seizures, an antiepileptic drug such as phenytoin is administered as an anticonvulsant. Phenytoin and dexamethasone are frequently administered concurrently to brain cancer patients. A previous study has shown that the phenytoin serum concentration decreases when it is administered concurrently with dexamethasone. Thus, it is important to monitor the concentrations of these two drugs in biological samples to ensure that proper dosages are administered. This study aims to develop an effective extraction and detection method for dexamethasone and phenytoin. A reverse-phase high-performance liquid chromatography (HPLC) method with UV/Vis detection has been developed to separate phenytoin and dexamethasone, detected at 219 nm and 241 nm respectively, from urine samples. The mobile phase consists of a mixture of 0.01 M KH2PO4, acetonitrile, and methanol adjusted to pH 5.6 (48:32:20) and is pumped at a flow rate of 1.0 mL/min. Calibration curves were prepared for phenytoin and dexamethasone (r² > 0.99). An efficient solid-phase extraction (SPE) method for extracting dexamethasone and phenytoin from urine samples was developed using C-18 cartridges. The percent recoveries for phenytoin and dexamethasone are 95.4% (RSD = 1.15%) and 81.1% (RSD = 3.56%), respectively.
Funding: supported in part by the Central Government Guides Local Science and Technology Development Funds (Grant No. YDZJSX2021A038), in part by the National Natural Science Foundation of China (Grant No. 61806138), and in part by the China University Industry-University-Research Collaborative Innovation Fund (Future Network Innovation Research and Application Project) (Grant No. 2021FNA04014).
Abstract: The large-scale multi-objective optimization algorithm (LSMOA) based on the grouping of decision variables is an advanced method for handling high-dimensional decision spaces. However, in practical problems the interactions among decision variables are intricate, leading to large group sizes and suboptimal optimization; hence, a large-scale multi-objective optimization algorithm based on weighted overlapping grouping of decision variables (MOEAWOD) is proposed in this paper. Initially, the decision variables are perturbed and categorized into convergence and diversity variables; subsequently, the convergence variables are subdivided into groups based on the interactions among decision variables. If the size of a group surpasses a set threshold, that group undergoes weighted overlapping grouping: the interaction strength is evaluated from the interaction frequency and the number of objectives involved among the decision variables, the variable with the strongest interaction in the group is identified and set aside, the remaining variables are reclassified into subgroups, and finally the strongest-interacting variable is added back to each subgroup. MOEAWOD minimizes the interaction between different groups and maximizes the interaction of decision variables within groups, which contributes to optimizing the direction of convergence and to diversity exploration across the groups. MOEAWOD was tested on 18 benchmark large-scale optimization problems, and the experimental results demonstrate the effectiveness of our method; compared with the other algorithms, it retains an advantage.
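Interaction between two decision variables is commonly detected with a differential-grouping style test: perturb variable i with and without a perturbation of variable j and compare the effects. The sketch below shows that standard building block on a toy function; it is not MOEAWOD's exact weighting scheme, and the toy objective and thresholds are assumptions.

```python
def interacts(f, x, i, j, delta=1.0, eps=1e-9):
    """Variables i and j interact if perturbing i changes f by a
    different amount depending on whether j was also perturbed."""
    def bump(v, k):
        w = list(v)
        w[k] += delta
        return w
    d_alone = f(bump(x, i)) - f(x)
    d_with_j = f(bump(bump(x, j), i)) - f(bump(x, j))
    return abs(d_alone - d_with_j) > eps

# Toy objective: x0 is separable; x1 and x2 interact through a product
f = lambda v: v[0] ** 2 + v[1] * v[2]
x0 = [0.0, 0.0, 0.0]
```

Counting how often such tests fire across perturbation points gives an interaction frequency, the kind of statistic the paper uses to weight its overlapping groups.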
Funding: supported by the Humanities and Social Sciences Fund of the Ministry of Education (No. 22YJA630119), the National Natural Science Foundation of China (No. 71971051), and the Natural Science Foundation of Hebei Province (No. G2021501004).
Abstract: With the development of big data and social computing, large-scale group decision making (LGDM) is now merging with social networks. Using social network analysis (SNA), this study proposes an LGDM consensus model that considers the trust relationships among decision makers (DMs). In the consensus-measurement process, the social network is constructed according to the social relationships among DMs, and the Louvain method is introduced to partition the network into subgroups. The weights of each DM and each subgroup are computed from comprehensive network weights and trust weights. In the consensus-improvement process, a feedback mechanism with four identification rules and two direction rules is designed to guide the improvement. Based on the trust relationships among DMs, preferences are modified and the corresponding social network is updated to accelerate consensus. Compared with previous research, the proposed model not only allows the subgroups to be reconstructed and updated during the adjustment process but also improves the accuracy of the adjustment through the feedback mechanism. Finally, an example analysis is conducted to verify the effectiveness and flexibility of the proposed method, and comparison with previous studies highlights its superiority in solving the LGDM problem.
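The consensus measure itself is not spelled out in the abstract. A common form, sketched below as a generic illustration rather than the paper's exact model, aggregates the DMs' preference matrices into a weighted collective matrix and scores consensus as one minus the weighted mean absolute deviation from it; the tiny 2x2 preference matrices are invented.

```python
def consensus_degree(prefs, weights):
    """1 minus the weighted mean absolute deviation of each decision
    maker's preference matrix from the weighted collective matrix.
    prefs: list of n x n matrices; weights: one weight per DM, sum 1."""
    n = len(prefs[0])
    collective = [[sum(w * p[i][j] for w, p in zip(weights, prefs))
                   for j in range(n)] for i in range(n)]
    dev = sum(w * abs(p[i][j] - collective[i][j])
              for w, p in zip(weights, prefs)
              for i in range(n) for j in range(n)) / (n * n)
    return 1.0 - dev

agree = [[0.5, 0.7], [0.3, 0.5]]    # two DMs with identical preferences
split = [[0.5, 0.1], [0.9, 0.5]]    # a dissenting DM
```

When the degree falls below a set threshold, feedback rules like the paper's pick out the DMs and preference entries most responsible and suggest the direction of adjustment.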
Funding: supported by the National Natural Science Foundation of China (22125802, 22078010).
Abstract: The separation of aromatics from aliphatics is essential for the maximum exploitation of oil resources in the petrochemical industry. In this study, a series of metal chloride-based ionic liquids were prepared, and their performance in separating the 1,2,3,4-tetrahydronaphthalene (tetralin)/dodecane and tetralin/decalin systems was studied. Among these ionic liquids, 1-ethyl-3-methylimidazolium tetrachloroferrate ([EMIM][FeCl4]), which had the highest selectivity, was used as the extractant. Density functional theory calculations showed that [EMIM][FeCl4] interacted more strongly with tetralin than with dodecane and decalin. Energy decomposition analysis of [EMIM][FeCl4]-tetralin indicated that electrostatics and dispersion played essential roles and that induction cannot be neglected; independent gradient model analysis showed van der Waals forces to be a main effect in [EMIM][FeCl4]-tetralin. The tetralin distribution coefficient and selectivity were 0.8 and 110, respectively, with 10% (mol) tetralin in the initial tetralin/dodecane system, and 0.67 and 19.5, respectively, with 10% (mol) tetralin in the initial tetralin/decalin system. The selectivity increased with decreasing alkyl chain length of the extractant. The influence of the extraction temperature, extractant dosage, and initial component concentrations on the separation performance was studied. Recycling experiments showed that the regenerated [EMIM][FeCl4] could be used repeatedly.
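The distribution coefficient and selectivity quoted above follow the standard liquid-liquid extraction definitions: D is the ratio of the solute's concentration in the extract phase to that in the raffinate, and S is the ratio of the two solutes' D values. A trivial sketch of that arithmetic, back-solving the dodecane coefficient implied by the reported numbers (the back-solved value is an inference, not a figure from the paper):

```python
def distribution_coefficient(x_extract, x_raffinate):
    """D: solute concentration in the extract phase over that in the
    raffinate phase (mole-fraction basis here)."""
    return x_extract / x_raffinate

def selectivity(d_aromatic, d_aliphatic):
    """S: ratio of distribution coefficients; higher means a sharper
    aromatic/aliphatic split."""
    return d_aromatic / d_aliphatic

# Reported: D_tetralin = 0.8 and S = 110 for tetralin/dodecane, which
# implies D_dodecane = 0.8 / 110 on the same basis.
d_dodecane = 0.8 / 110
s = selectivity(0.8, d_dodecane)
```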
Funding: Project supported by the Presidential Foundation of CAEP (Grant No. YZJJZQ2022016) and the National Natural Science Foundation of China (Grant No. 52207177).
Abstract: The characteristics of the extracted ion current have a significant impact on the design and testing of ion source performance. In this paper, a particle-in-cell method with two spatial and three velocity dimensions (2D3V PIC) is used to simulate plasma motion and ion extraction characteristics under various initial plasma velocity distributions and extraction voltages in a Cartesian coordinate system. The plasma density is of the order of 10¹⁵-10¹⁶ m⁻³ and the extraction voltage is of the order of 100-1000 V. The study investigates the impact of various extraction voltages on the velocity and density distributions of electrons and positive ions, and analyzes the influence of different initial plasma velocity distributions on the extraction current. The simulation results reveal that the main cause of variation in the extraction current is the space-charge force formed by the relative aggregation of positive and negative net charge. This lays the foundation for a deeper understanding of extracted beam characteristics.
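At the core of any PIC cycle is a particle mover. As background only, the sketch below shows a minimal 1D leapfrog push in a static field; it is not the paper's 2D3V implementation, and the uniform field stands in for the self-consistently solved space-charge field.

```python
def push(x, v, q_over_m, efield, dt, steps):
    """Leapfrog-push one macro-particle: velocity staggered half a step
    behind position, the mover used inside a PIC cycle."""
    v -= 0.5 * dt * q_over_m * efield(x)    # shift v to t = -dt/2
    for _ in range(steps):
        v += dt * q_over_m * efield(x)      # v: t - dt/2 -> t + dt/2
        x += dt * v                         # x: t -> t + dt
    return x, v

# Uniform accelerating field: leapfrog reproduces x(t) = a*t^2/2 exactly
x_end, v_end = push(0.0, 0.0, 1.0, lambda x: 1.0, 0.1, 10)
```

In a full PIC step this mover alternates with charge deposition onto the grid and a field solve, which is where the space-charge force identified by the paper enters.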
Funding: Australian Research Council, Grant/Award Numbers: DP190103660, DP200103207, LP180100663; UniSQ Capacity Building Grants, Grant/Award Number: 1008313.
Abstract: Biometric recognition is a widely used technology for user authentication. In applying this technology, biometric security and recognition accuracy are two important issues to consider. In terms of biometric security, cancellable biometrics is an effective technique for protecting biometric data. Regarding recognition accuracy, feature representation plays a significant role in the performance and reliability of cancellable biometric systems. How to design good feature representations for cancellable biometrics is a challenging topic that has attracted a great deal of attention from the computer vision community, especially from researchers of cancellable biometrics. Feature extraction and learning in cancellable biometrics seeks suitable feature representations that achieve satisfactory recognition performance while the privacy of biometric data is protected. This survey reports the progress, trends, and challenges of feature extraction and learning for cancellable biometrics, shedding light on the latest developments and future research in this area.
Funding: the National Key Research and Development Program of China under contract No. 2022YFE0136500, the National Natural Science Foundation of China under contract Nos. 41890801 and 42076227, and the Shanghai Pilot Program for Basic Research-Shanghai Jiao Tong University under contract No. 21TQ1400201.
Abstract: The oceanic trace metals iron (Fe), nickel (Ni), copper (Cu), zinc (Zn), and cadmium (Cd) are crucial to marine phytoplankton growth and the global carbon cycle, and the analysis of their stable isotopes can provide valuable insights into their biogeochemical cycles within the ocean. However, the simultaneous isotopic analysis of multiple elements in seawater is challenging because of their low concentrations, the limited volumes of test samples, and the high-salt matrix. In this study, we present a novel method developed for the simultaneous analysis of five isotope systems from a 1 L seawater sample. In the developed method, NOBIAS Chelate-PA1 resin was used to extract the metals from seawater, AG MP-1M anion-exchange resin to purify Cu, Fe, Zn, and Cd, and NOBIAS Chelate-PA1 resin to further separate Ni from the matrix elements. Finally, a multi-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) was employed for the isotopic measurements, using a double-spike technique or sample-standard bracketing combined with internal normalization. The method exhibited low total procedural blanks (0.04 pg, 0.04 pg, 0.21 pg, 0.15 pg, and 3 pg for Ni, Cu, Fe, Zn, and Cd, respectively) and high extraction efficiencies (100.5% ± 0.3%, 100.2% ± 0.5%, 97.8% ± 1.4%, 99.9% ± 0.8%, and 100.1% ± 0.2% for Ni, Cu, Fe, Zn, and Cd, respectively). The external errors and external precisions of the method could be considered negligible. The method was further tested on seawater samples from the whole vertical profile of a water column during the Chinese GEOTRACES GP09 cruise in the Northwest Pacific, and the results showed good agreement with previously published data. This method will contribute to the advancement of isotope research and enhance our understanding of the marine biogeochemical cycling of Fe, Ni, Cu, Zn, and Cd.
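Sample-standard bracketing reports isotope compositions in the usual delta notation: the per-mil deviation of the sample's isotope ratio from the mean of the standards run immediately before and after it. A minimal sketch of that arithmetic (the ratios below are invented, not cruise data):

```python
def delta_permil(r_sample, r_standard):
    """Delta value in per mil relative to a standard ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

def bracketed_delta(r_sample, r_std_before, r_std_after):
    """Sample-standard bracketing: average the two bracketing standard
    measurements to correct for instrumental drift."""
    return delta_permil(r_sample, 0.5 * (r_std_before + r_std_after))
```

The double-spike alternative mentioned above instead corrects mass bias from a known mixture of two enriched isotopes, which is preferred when column chemistry can fractionate the element.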