Traditional transgenic detection methods require demanding test conditions and struggle to be both sensitive and efficient. In this study, a one-tube dual recombinase polymerase amplification (RPA) reaction system for CP4-EPSPS and Cry1Ab/Ac was developed and combined with a lateral flow immunochromatographic assay, named “Dual-RPA-LFD”, to visualize the dual detection of genetically modified (GM) crops. The herbicide tolerance gene CP4-EPSPS and the insect resistance gene Cry1Ab/Ac were selected as targets, given that insect resistance and herbicide tolerance, together with their stacked traits, are currently the most widely deployed GM traits. Gradient-diluted plasmids, transgenic standards, and actual samples were used as templates for sensitivity, specificity, and practicality assays, respectively. The constructed method achieved visual detection of plasmid at levels as low as 100 copies, demonstrating its high sensitivity. In addition, good applicability to transgenic samples was observed, with no cross-interference between the two test lines and no influence from other genes. In conclusion, this strategy achieved the intended simultaneous detection of the two popular targets in GM crops within 20 min at 37°C in a rapid, equipment-free manner suitable for the field, providing a new alternative for rapid screening in transgenic assays in the field.
BACKGROUND: The concept of macroscopic on-site evaluation (MOSE) was introduced in 2015, when endoscopists observed better diagnostic yield when the macroscopically visible core on MOSE was longer than 4 mm. Recent studies suggest that MOSE by the endoscopist may be an excellent alternative to rapid on-site evaluation, and some classifications have been published. Few studies have assessed the adequacy of histologic cores in MOSE during endoscopic ultrasound-guided fine-needle aspiration/biopsy (EUS-FNA/FNB). AIM: To evaluate the performance of MOSE during EUS-FNA/FNB. METHODS: This multicentric prospective study was conducted in 16 centers in 3 countries (Egypt, Iraq, and Morocco) and included 1108 patients with pancreatic, biliary, or gastrointestinal pathology who were referred for EUS examination. We prospectively analyzed the MOSE in 1008 patients with available histopathological reports according to 2 classifications to determine the adequacy of the histological core samples. Data management and analysis were performed using the Statistical Package for the Social Sciences (SPSS) version 27. RESULTS: A total of 1074 solid lesions were biopsied in 1008 patients with available cytopathological reports. Mean age was 59 years, and 509 patients (50.5%) were male. The mean lesion size was 38 mm. The most frequently utilized needles were FNB-Franseen (74.5%) and 22 G (93.4%), with a median of 2 passes. According to the 2 classifications, 618 non-bloody cores (61.3%) and 964 good samples (95.6%) were adequate for histological evaluation. The overall diagnostic yield of cytopathology was 95.5%. The cytological examination confirmed the diagnosis of malignancy in 861 patients (85.4%), while 45 samples (4.5%) were inconclusive. Post-procedural adverse events occurred in 33 patients (3.3%). Statistical analysis showed a difference between needle types (P=0.035), with a high sensitivity of FNB (97%). The analysis of the relationship between the MOSE score and the final diagnosis showed a significant difference between the different MOSE scores (P<0.001). CONCLUSION: MOSE is a simple method that allows endoscopists to increase needle passes to improve sample quality. FNB sensitivity and cytopathology diagnostic yield are significantly higher with good MOSE cores.
Endoscopic ultrasound (EUS) has become an essential tool for the study of pancreatic diseases. Specifically, EUS plays a pivotal role in evaluating patients with a known or suspected pancreatic mass. In this setting, differential diagnosis remains a clinical challenge. EUS-guided fine-needle aspiration (FNA) and fine-needle biopsy (FNB) have been proven to be safe and useful tools in this setting. By obtaining cytological and/or histological samples, EUS-guided FNA and FNB are able to diagnose pancreatic lesions with high sensitivity and specificity. In this context, several methodological features aimed at increasing the diagnostic yield of EUS-guided FNA and FNB have been evaluated. In this review, we focus on the role of rapid on-site evaluation (ROSE). From data reported in the literature, ROSE may increase the diagnostic yield of EUS-FNA specimens by 10%-30%, and thus diagnostic accuracy. However, we should point out that many recent studies have reported adequacy rates of > 90% without ROSE, indicating that, perhaps, at high-volume centers, ROSE may not be indispensable to achieve excellent results. The use of ROSE can be considered important during the learning curve of EUS-FNA, and also in hospitals with diagnostic accuracy rates < 90%.
A “cloud-edge-end” collaborative system architecture is adopted for real-time security management of on-site work in power systems, and mobile edge computing equipment utilizes lightweight intelligent recognition algorithms for on-site risk assessment and alerting. Owing to its light weight and fast speed, YOLOv4-Tiny is often deployed on edge computing equipment for real-time video stream detection; however, its accuracy is relatively low. This study proposes an improved YOLOv4-Tiny algorithm based on an attention mechanism and optimized training methods, achieving higher accuracy without compromising speed. Specifically, a convolutional block attention module (CBAM) branch is added to the backbone network to enhance the feature extraction capability, and an efficient channel attention (ECA) mechanism is added to the neck network to improve feature utilization. Moreover, three optimized training methods are used to improve the training of the improved algorithm: transfer learning, mosaic data augmentation, and label smoothing. Finally, an edge computing experimental platform equipped with an NVIDIA Jetson Xavier NX chip is established, and the newly developed algorithm is tested on it. According to the results, the improved YOLOv4-Tiny algorithm detects the on-site dress-code compliance dataset at 17.25 FPS, and the mean average precision (mAP) is increased from 70.89% to 85.03%.
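To make the channel-attention idea above concrete, here is a minimal NumPy sketch of an efficient-channel-attention-style gate: global average pooling squeezes each channel to a scalar, a small shared 1D kernel mixes neighboring channel statistics, and a sigmoid gate rescales the feature map. The fixed averaging kernel and tensor shapes are illustrative assumptions, not the trained weights or exact layer used in the paper.

```python
import numpy as np

def eca(feature_map, k=3):
    """Efficient-channel-attention-style gate (sketch, NumPy only).

    feature_map: array of shape (C, H, W).
    k: odd size of the 1D kernel shared across channels (illustrative).
    """
    C, H, W = feature_map.shape
    # Squeeze: global average pooling over the spatial dimensions
    y = feature_map.mean(axis=(1, 2))                 # shape (C,)
    # Local cross-channel interaction: 1D convolution with a shared kernel
    # (a fixed averaging kernel stands in for the learned one)
    kernel = np.full(k, 1.0 / k)
    pad = k // 2
    y_padded = np.pad(y, pad, mode="edge")
    attn = np.array([np.dot(y_padded[i:i + k], kernel) for i in range(C)])
    # Excite: sigmoid gate, then rescale each channel of the input
    gate = 1.0 / (1.0 + np.exp(-attn))                # values in (0, 1)
    return feature_map * gate[:, None, None]
```

Because the gate is computed from only C scalars and a length-k kernel, the overhead is negligible next to the backbone convolutions, which is why this style of attention suits edge devices.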
Endoscopic ultrasound-guided fine-needle biopsy (EUS-FNB) is an excellent investigation to diagnose pancreatic lesions and has shown high accuracy for pathologic diagnosis. Recently, macroscopic on-site evaluation (MOSE) performed by an endoscopist was introduced as an alternative to rapid on-site cytologic evaluation to increase the diagnostic yield of EUS-FNB. The MOSE of the biopsy can estimate the adequacy of the sample directly by macroscopic evaluation of the core tissue obtained from EUS-FNB. Isolated pancreatic tuberculosis is extremely rare and difficult to diagnose because of its non-specific signs and symptoms. Therefore, this challenging diagnosis is based on endoscopy, imaging, and the bacteriological and histological examination of tissue biopsies. This uncommon presentation of tuberculosis can appear as a pancreatic mass mimicking cancer. EUS-FNB can be very useful in providing a valuable histopathological diagnosis. A calcified lesion with a cheesy core on MOSE should be suggestive of tuberculosis, prompting a GeneXpert test, which can detect Mycobacterium tuberculosis deoxyribonucleic acid and resistance to rifampicin. A sound diagnostic strategy is crucial to prevent unnecessary surgical resection and to provide conservative management with antitubercular therapy.
On-site programming big data refers to the massive data generated in the process of software development, characterized by real-time generation, complexity, and difficulty of processing. Therefore, data cleaning is essential for on-site programming big data. Duplicate data detection is an important step in data cleaning, which can save storage resources and enhance data consistency. Owing to the limitations of the traditional Sorted Neighborhood Method (SNM) and the difficulty of detecting duplicates in high-dimensional data, an optimized algorithm based on random forests with a dynamic, adaptive window size is proposed. The efficiency of the algorithm is elevated by improving the key-selection method, reducing the dimensionality of the data set, and using an adaptive variable-size sliding window. Experimental results show that the improved SNM algorithm exhibits better performance and achieves higher accuracy.
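The core SNM idea with an adaptive window can be sketched in a few lines: records are sorted by a key, candidates inside a sliding window are compared, and the window grows after a hit and shrinks after a miss. The similarity measure, window bounds, and threshold below are illustrative stand-ins for the paper's random-forest-based components.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """String similarity in [0, 1]; a stand-in for the learned comparator."""
    return SequenceMatcher(None, a, b).ratio()

def adaptive_snm(records, key=lambda r: r, min_w=2, max_w=6, threshold=0.85):
    """Sorted Neighborhood Method with an adaptive sliding window (sketch).

    Records are sorted by a key; each record is compared with the records
    inside the current window. The window widens when duplicates are found
    (dense duplicate region) and narrows otherwise, saving comparisons.
    """
    ordered = sorted(records, key=key)
    duplicates = set()
    w = min_w
    i = 0
    while i < len(ordered) - 1:
        found = False
        for j in range(i + 1, min(i + w, len(ordered))):
            if similarity(ordered[i], ordered[j]) >= threshold:
                duplicates.add((ordered[i], ordered[j]))
                found = True
        # Adapt: widen the window after a hit, narrow it after a miss
        w = min(max_w, w + 1) if found else max(min_w, w - 1)
        i += 1
    return duplicates
```

On a toy list such as `["alice", "bob", "john smith", "john smyth"]` the near-duplicate pair of names is the only pair reported.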
According to different testing purposes, methods, and available environmental conditions, seismograph testing can be divided into laboratory and on-site testing. The testing of the seismograph's key parameters and other relevant technical specifications is well described in guide documents (China Earthquake Administration, 2017). This includes seismometer sensitivity, linearity, and clip levels based on the shake-table test, as well as the seismometer natural period and damping constant based on electrical calibration (Wang Guangfu, 1986; Plešinger A., 1993) and instrumental self-noise collocation estimation (Holcomb L.G., 1989; Sleeman R. et al., 2006). However, with the development of seismic observation technology, many new requirements for the performance evaluation of seismographs have been put forward, and new testing items and methods have emerged.
We study the dynamical energy equipartition properties of the integrable Toda model with additional uniform or disordered on-site energies by extensive numerical simulations. The total energy is initially equidistributed among some of the lowest-frequency linear modes. For the Toda model with uniform on-site potentials, the energy spectrum keeps its profile nearly unchanged on a relatively short time scale. On a much longer time scale, the energies of the tail modes increase slowly with time, and energy equipartition remains far from being attained within the time scale we study. For the Toda model with disordered on-site potentials, the energy transfers continuously to the high-frequency modes and eventually tends towards energy equipartition. We further perform a systematic study of the equipartition time t_eq as a function of the energy density ε and the nonlinear parameter α in the thermodynamic limit for the Toda model with disordered on-site potentials. We find t_eq ∝ (1/ε)^a (1/α)^b, where b ≈ 2a. The values of a and b increase when increasing the strengths of the disordered on-site potentials or decreasing the number of initially excited modes.
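A scaling law of the form t_eq ∝ (1/ε)^a (1/α)^b is typically extracted by linear regression in log-log space, since log t_eq = a log(1/ε) + b log(1/α) + const. The sketch below recovers the exponents from synthetic data generated with illustrative values a=1, b=2 (chosen only to satisfy b = 2a; they are not the paper's fitted values).

```python
import numpy as np

# Synthetic equipartition times obeying t_eq ∝ (1/ε)^a (1/α)^b with b = 2a.
# The exponents a=1.0, b=2.0 are illustrative, not values from the paper.
a_true, b_true = 1.0, 2.0
eps = np.array([0.1, 0.2, 0.4, 0.8])      # energy densities ε
alpha = np.array([0.05, 0.1, 0.2, 0.4])   # nonlinear parameters α
E, A = np.meshgrid(eps, alpha)
t_eq = (1.0 / E) ** a_true * (1.0 / A) ** b_true

# Recover the exponents by least squares in log-log space:
#   log t_eq = a·log(1/ε) + b·log(1/α) + c
X = np.column_stack([np.log(1.0 / E).ravel(),
                     np.log(1.0 / A).ravel(),
                     np.ones(E.size)])
coeffs, *_ = np.linalg.lstsq(X, np.log(t_eq).ravel(), rcond=None)
a_fit, b_fit = coeffs[0], coeffs[1]
```

With exact power-law data the regression returns the generating exponents; on real simulation data the same fit yields a and b with error bars from the residuals.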
Approximately 20% of homes nationwide use an on-site treatment system as a form of household wastewater management. However, approximately 10% to 20% of on-site treatment systems malfunction each year, many of which have either failed or exceeded the soil's long-term acceptance rate (LTAR), causing environmental and human health risks. The objective of this field study was to evaluate the effects of soil condition (e.g., wet and dry) and product architecture type [i.e., chamber, gravel-less pipe (GLP), polystyrene-aggregate, and pipe-and-aggregate] on in-product solution storage and biomat thickness in a profile-limited soil in northwest Arkansas under increased loading rates, and to estimate the LTAR for each product. During Phase I of this study (March 13 to October 4, 2013), effluent loading rates were approximately doubled, while rates were approximately quadrupled during Phase II (October 8, 2013 to May 29, 2014), relative to the maximum allowable loading rate for each product. The pipe-and-tire-chip, 46-cm-wide-trench pipe-and-gravel, and 25-cm-diameter GLP products had the greatest (p < 0.001), while the 31-cm-wide and the 5.4-m-long chambers had the lowest (p < 0.001), in-product solution storage during the wet-soil conditions of Phase I monitoring. The 25-cm-diameter GLP product had the greatest (p < 0.001), while the 61-cm-wide, 5.4-m-long chamber had the lowest (p < 0.001), in-product solution storage during Phase II. The results of this study indicate that some alternative products may be able to effectively handle effluent loading rates in excess of those currently allowed by the State of Arkansas. Further research will be required to confirm these interpretations.
Dynamic envelope curve is a significant parameter for assessing the running safety of high-speed trains. Up to now, the method based on binocular stereo vision is the only way available to measure the dynamic envelope curve of a train whose speed exceeds 200 km/h. Nevertheless, the method has two limitations: one is the large field of view (FOV), the other is calibration time. Hence portable calibration equipment, an easy-to-build target, and a rapid calibration algorithm are required to complete the calibration. In this paper, a new rapid on-site calibration method with large FOV based on binocular stereo vision is proposed. To address these issues, a light target has been designed; the rail coordinate system (RCS) is represented by 40 fixed retroreflective points on the target, which are used to calibrate the parameters of the two cameras. In addition, the two cameras merely capture a single image of the target simultaneously, and the intrinsic and extrinsic parameters of the cameras can be calculated rapidly. To verify the proposed method, experiments have been conducted, and the results reveal that the accuracy can reach ±1 mm, which meets the measurement requirement.
The real-time monitoring of environmental radiation dose for nuclear facilities is an important part of safety; to guarantee the accuracy of the monitoring results, regular calibration is necessary. Around nuclear facilities, many environmental dosimeters are installed dispersedly; because of their huge quantity, wide distribution, and continuous real-time monitoring state, calibrating them in a standard laboratory would cost a great deal of manpower and finance and, moreover, would leave the environment unmonitored during calibration. To solve the problem of the measurement accuracy of stationary gamma radiation dosimeters, an on-site calibration method is proposed. The radioactive source is an X-ray spectrum, and the dose reference instrument, which has been calibrated by the national standard laboratory, is a high-pressure ionization chamber. On-site calibration is divided into two parts: first, the energy response of the dosimeter for high and low energies is measured in the laboratory, and the energy response curve is obtained in combination with Monte Carlo simulation; second, an experiment is carried out in the field on the dosimeter under test, using the substitution method to calibrate it; finally, the calibration coefficient is obtained through energy-curve correction. To verify the accuracy of the on-site calibration method, the calibrated dosimeter was tested in the standard laboratory, and the error was 3.4%. The result shows that the on-site calibration method using X-rays is feasible and improves the accuracy of the measurement results of stationary γ-ray instruments; more importantly, it has great reference value for radiation safety management and radiation environment evaluation.
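The substitution step above reduces to simple arithmetic: the calibration coefficient is the reference instrument's dose rate divided by the field dosimeter's reading, corrected by the laboratory-measured energy-response factor. The sketch below is a simplified reading of that procedure; the function names and the single multiplicative energy-response correction are illustrative assumptions, not the paper's exact formulas.

```python
def calibration_coefficient(reference_dose_rate, dosimeter_reading,
                            energy_response=1.0):
    """Substitution-method calibration coefficient (simplified sketch).

    The field dosimeter and the reference instrument measure the same
    field; the raw ratio is then corrected by the energy-response factor
    obtained from the laboratory curve (assumed multiplicative here).
    """
    raw = reference_dose_rate / dosimeter_reading
    return raw / energy_response

def relative_error(measured, standard):
    """Relative error of a calibrated reading against the standard value."""
    return abs(measured - standard) / standard
```

For example, a dosimeter whose calibrated reading is 96.6 against a standard value of 100 has a relative error of 3.4%, matching the verification figure quoted in the abstract.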
In current research on task offloading and resource scheduling in vehicular networks, vehicles are commonly assumed to maintain constant speed or relatively stationary states, and the impact of speed variations on task offloading is often overlooked. It is frequently assumed that vehicles can be accurately modeled during actual motion; however, in dynamic vehicular environments, both the tasks generated by the vehicles and the vehicles' surroundings are constantly changing, making real-time modeling of actual dynamic vehicular network scenarios difficult. Taking these dynamics into account, this paper considers the real-time, non-uniform movement of vehicles and proposes a dynamic task offloading and scheduling algorithm for single-task, multi-vehicle vehicular network scenarios, aiming to solve the dynamic decision-making problem in the task offloading process. The optimization objective is to minimize the average task completion time, formulated as a multi-constrained non-linear programming problem. Because of vehicle mobility, a constraint model is applied in the decision-making process to dynamically determine whether the communication range is sufficient for task offloading and transmission. Finally, the proposed dynamic offloading and scheduling algorithm, based on multi-agent deep deterministic policy gradient (MADDPG), is applied to solve the optimization problem. Simulation results show that the proposed algorithm achieves lower-latency task computation offloading; its average task completion time is improved by 7.6% compared to the MADDPG scheme and by 51.1% compared to deep deterministic policy gradient (DDPG).
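The communication-range constraint described above amounts to a feasibility check: a moving vehicle can offload only if it stays inside the server's coverage long enough to finish transmitting the task. The sketch below models this on a one-dimensional road with constant speed over the decision interval; all names, the geometry, and the simple rate model are illustrative assumptions, not the paper's constraint model.

```python
def can_offload(vehicle_pos, vehicle_speed, server_pos, comm_radius,
                task_size_bits, link_rate_bps):
    """Feasibility check for offloading from a moving vehicle (sketch).

    1-D road: the vehicle drives toward +x past a roadside server.
    Offloading is feasible if the residence time inside the coverage
    circle covers the task transmission time.
    """
    # Time needed to push the task over the V2I link
    tx_time = task_size_bits / link_rate_bps
    # Distance left before the vehicle exits the coverage circle
    dist_to_edge = (server_pos + comm_radius) - vehicle_pos
    if dist_to_edge <= 0:
        return False          # already past the coverage area
    time_in_range = dist_to_edge / vehicle_speed
    return time_in_range >= tx_time
```

In an RL formulation this check can mask infeasible offloading actions before the agent's decision is executed, which is one common way to enforce such a constraint.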
Collaborative edge computing is a promising direction for handling computation-intensive tasks in B5G wireless networks. However, edge computing servers (ECSs) from different operators may not trust each other, and thus the incentives for collaboration cannot be guaranteed. In this paper, we propose a consortium-blockchain-enabled collaborative edge computing framework, where users can offload computing tasks to ECSs from different operators. To minimize the total delay of users, we formulate a joint task offloading and resource optimization problem under the constraint of the computing capability of each ECS. We apply the Tammer decomposition method and heuristic optimization algorithms to obtain the optimal solution. Finally, we propose a reputation-based node selection approach to facilitate the consensus process, and also consider a completion-time-based primary node selection to avoid monopolization by certain edge nodes and enhance the security of the blockchain. Simulation results validate the effectiveness of the proposed algorithm; the total delay can be reduced by up to 40% compared with the non-cooperative case.
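A reputation-based primary node selection of the kind described can be sketched as a ranking over candidate nodes: reputation decides first, and the recent task completion time breaks ties, so a slow node cannot monopolize the primary role even at equal reputation. The record fields and the lexicographic rule are illustrative assumptions, not the paper's exact consensus logic.

```python
def select_primary(nodes):
    """Pick the consensus primary node (sketch; fields are illustrative).

    nodes: list of dicts with "reputation" (higher is better) and
    "completion_time" (lower is better). Higher reputation wins; a
    faster recent completion time breaks ties between equals.
    """
    return max(nodes, key=lambda n: (n["reputation"], -n["completion_time"]))
```

Rotating or decaying the reputation score between rounds would further spread the primary role across nodes, which is the anti-monopolization goal the abstract mentions.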
Vehicular edge computing (VEC) is emerging as a promising solution paradigm to meet the requirements of compute-intensive applications in the internet of vehicles (IoV). Non-orthogonal multiple access (NOMA) has advantages in improving spectrum efficiency and dealing with bandwidth scarcity and cost, so combining VEC and NOMA is an encouraging direction. In this paper, we jointly optimize the task offloading decision and resource allocation to maximize the service utility of the NOMA-VEC system. To solve the optimization problem, we propose a multi-agent deep graph reinforcement learning algorithm. The algorithm extracts the topological features and relationship information between agents from the system state as observations, and outputs the task offloading decision and resource allocation simultaneously with a local policy network, which is updated by a local learner. Simulation results demonstrate that the proposed method achieves a 1.52%-5.80% improvement over the benchmark algorithms in system service utility.
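The spectrum-efficiency advantage of NOMA mentioned above comes from superposing users on the same resource and decoding them by successive interference cancellation (SIC). The sketch below computes textbook uplink NOMA rates under SIC with normalized bandwidth; it is a generic illustration of the access scheme, not the paper's system model.

```python
import math

def noma_uplink_rates(powers_received, noise):
    """Uplink NOMA rates with successive interference cancellation (sketch).

    Users are decoded in descending received-power order; each user's
    Shannon rate treats the not-yet-decoded users as interference
    (normalized bandwidth). Purely illustrative.
    """
    order = sorted(range(len(powers_received)),
                   key=lambda i: -powers_received[i])
    rates = {}
    remaining = sum(powers_received)
    for i in order:
        interference = remaining - powers_received[i]
        rates[i] = math.log2(1 + powers_received[i] / (interference + noise))
        remaining -= powers_received[i]   # SIC removes the decoded signal
    return rates
```

The last user decoded sees no intra-cluster interference, which is why pairing users with very different channel gains is the standard NOMA clustering heuristic.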
In a network environment composed of different types of computing centers that can be divided into different layers (cloud, edge layer, and others), the interconnection between them offers the possibility of peer-to-peer task offloading. For many resource-constrained devices, the computation of many types of tasks is not feasible because they lack the memory and processing capacity to support such computations. In this scenario, it is worth considering transferring these tasks to resource-rich platforms, such as Edge Data Centers or remote cloud servers. For different reasons, it is more appropriate to offload various tasks to specific destinations depending on the properties and state of the environment and the nature of the tasks. At the same time, establishing an optimal offloading policy that ensures all tasks are executed within the required latency while avoiding excessive workload on specific computing centers is not easy. This study presents two alternatives to solve the offloading decision paradigm by introducing two well-known algorithms, Graph Neural Networks (GNN) and Deep Q-Network (DQN). It applies the alternatives in a well-known edge computing simulator called PureEdgeSim and compares them with the two default methods, Trade-Off and Round Robin. Experiments showed that the variants offer a slight improvement in task success rate and workload distribution; in terms of energy efficiency, they provided similar results. Finally, the success rates of different computing centers are tested, and the inability of remote cloud servers to respond to applications in real time is demonstrated. These novel ways of finding an offloading strategy in a local networking environment are unique in that they emulate the state and structure of the environment innovatively, considering the quality of its connections and constant updates. The offloading score defined in this research is a crucial feature for determining the quality of an offloading path in the GNN training process and has not previously been proposed. Simultaneously, the suitability of Reinforcement Learning (RL) techniques is demonstrated by the dynamism of the network environment, considering all the key factors that affect the decision to offload a given task, including the actual state of all devices.
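The DQN alternative above learns a value for each offload destination (local, edge, cloud) per state. A tabular Q-learning stand-in captures the same update rule without a neural network; the states, actions, and learning-rate/discount values are illustrative, not the study's configuration.

```python
# Tabular Q-learning stand-in for the DQN offloading agent (sketch).
# Q maps state -> {action: value}; actions are offload destinations.

def q_update(Q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """One Q-learning update: move Q(s,a) toward r + γ·max_a' Q(s',a')."""
    best_next = max(Q[next_state].values())
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

def greedy(Q, state):
    """Greedy offload destination for a given state."""
    return max(Q[state], key=Q[state].get)
```

A DQN replaces the table with a network over the device/network state vector, but the target term `r + γ·max Q(s',·)` is the same.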
Crowdsourcing technology is widely recognized for its effectiveness in task scheduling and resource allocation. While traditional methods for task allocation can help reduce costs and improve efficiency, they may encounter challenges when dealing with abnormal data-flow nodes, leading to decreased allocation accuracy and efficiency. To address these issues, this study proposes a novel two-part invalid-detection task allocation framework. In the first step, an anomaly detection model is developed using a dynamic self-attentive GAN to identify anomalous data. Compared to the baseline method, the model achieves an approximately 4% increase in F1 value on the public dataset. In the second step of the framework, task allocation is modeled using a bipartite graph matching method. This phase introduces a P-queue KM algorithm that implements a more efficient optimization strategy. The allocation efficiency is improved by approximately 23.83% compared to the baseline method. Empirical results confirm the effectiveness of the proposed framework in detecting abnormal data nodes, enhancing allocation precision, and achieving efficient allocation.
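The matching step above solves a maximum-weight bipartite assignment between workers and tasks, which is what KM-style (Kuhn-Munkres) algorithms compute. As a minimal, hedged stand-in for the paper's P-queue KM algorithm, the sketch below solves the same problem exhaustively for tiny instances; the weight matrix is illustrative.

```python
from itertools import permutations

def best_assignment(weights):
    """Exhaustive maximum-weight bipartite matching for small instances.

    weights[i][j] is the utility of assigning task j to worker i.
    A brute-force stand-in for KM-style algorithms, which find the same
    optimum in polynomial time at scale.
    """
    n = len(weights)
    best, best_perm = float("-inf"), None
    for perm in permutations(range(n)):
        total = sum(weights[i][perm[i]] for i in range(n))
        if total > best:
            best, best_perm = total, perm
    return best, best_perm
```

For `[[3, 1], [1, 2]]` the optimum keeps worker 0 on task 0 and worker 1 on task 1 for a total utility of 5; brute force is O(n!), which is exactly the cost a KM-style algorithm avoids.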
To address the problems of low solution accuracy and high decision pressure that a single agent faces in large-scale dynamic task allocation (DTA) with a high-dimensional decision space, this paper combines deep reinforcement learning (DRL) theory with a multi-agent architecture and proposes an improved Multi-Agent Deep Deterministic Policy Gradient algorithm (MADDPG-D2) with a dual experience replay pool and dual noise, to improve the efficiency of DTA. The algorithm builds on the traditional MADDPG algorithm: a double-noise mechanism is introduced to enlarge the action exploration space in the early stage of training, and a double experience pool is introduced to improve data utilization. At the same time, to accelerate the training of the agents and solve the cold-start problem, prior-knowledge techniques are applied to the training of the algorithm. Finally, MADDPG-D2 is compared and analyzed on a digital battlefield of ground-air confrontation. The experimental results show that agents trained by MADDPG-D2 achieve higher win rates and average rewards, utilize resources more reasonably, and better overcome the difficulty that traditional single-agent algorithms face in high-dimensional decision spaces. The MADDPG-D2 algorithm based on a multi-agent architecture thus shows clear advantages for DTA.
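The dual experience pool idea can be sketched as two buffers: transitions with high reward go to a priority pool, the rest to a regular pool, and training batches mix both. The reward threshold, eviction rule, and 50/50 sampling mix below are illustrative choices, not the exact MADDPG-D2 scheme.

```python
import random

class DualReplayBuffer:
    """Dual experience pool sketch: high-reward transitions in one pool,
    the rest in another; batches sample from both to keep rare good
    experiences in circulation. All parameters are illustrative."""

    def __init__(self, reward_threshold=0.0, capacity=10000, seed=0):
        self.good, self.rest = [], []
        self.threshold = reward_threshold
        self.capacity = capacity
        self.rng = random.Random(seed)

    def add(self, transition, reward):
        pool = self.good if reward > self.threshold else self.rest
        pool.append(transition)
        if len(pool) > self.capacity:
            pool.pop(0)          # drop the oldest transition

    def sample(self, k):
        half = k // 2
        batch = self.rng.sample(self.good, min(half, len(self.good)))
        batch += self.rng.sample(self.rest, min(k - len(batch), len(self.rest)))
        return batch
```

Oversampling the high-reward pool plays a role similar to prioritized replay: informative transitions are replayed more often, which speeds up early training.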
Introduction: The uncontrolled management of waste electrical and electronic equipment (W3E) causes respiratory problems in the handlers of this waste. The objective was to study the tasks associated with respiratory symptoms in W3E handlers. Methods: This was a cross-sectional study with an analytical focus on W3E handlers in the informal sector in Ouagadougou. A peer-validated questionnaire collected data on a sample of 161 handlers. Results: The most common W3E processing tasks were the purchase or sale of W3E (67.70%), its repair (39.75%), and its collection (31.06%). The prevalence of cough was 21.74%, that of wheezing 14.91%, phlegm 12.50%, and dyspnea at rest 10.56%. In bivariate analysis, there were significant associations at the 5% level between W3E repair and phlegm (p-value = 0.044), between W3E burning and wheezing (p-value = 0.011), and between W3E burning and cough (p-value = 0.01). The final logistic regression models suggested that the burning of W3E and the melting of lead batteries were risk factors for the occurrence of cough, with prevalence ratios of 4.57 and 4.63, respectively. Conclusion: Raising awareness of the wearing of personal protective equipment, in particular suitable masks, among W3E handlers, prioritizing those dedicated to the burning of electronic waste and the melting of lead, could reduce the risk of respiratory symptoms.
A cable circuit of a substation in the United Kingdom showed a high level of PD activity during a survey using handheld PD testing equipment. The authors were invited to carry out an on-site PD testing experiment to further diagnose and locate the potential problem in the cable system. This paper presents the authors' experience of carrying out the cable test. Following a brief introduction to the experimental equipment and physical connections, the paper analyses the data collected from the testing, including PD pulse shape analysis, frequency spectrum analysis, and phase-resolved PD pattern analysis. Combined with PD propagation direction identification, PD source diagnosis and localisation were performed. Four different types of sensors, which were used during the testing, are shown to have different frequency bandwidths and to perform differently. After comparing the parameters of the sensors and the PD signals detected by each sensor, an optimal PD monitoring bandwidth for cable systems is suggested.
With the development of vehicles towards intelligence and connectivity, vehicular data is diversifying and growing dramatically. A task allocation model and algorithm for heterogeneous Intelligent Connected Vehicle (ICV) applications are proposed for a dispersed computing network composed of heterogeneous task vehicles and Network Computing Points (NCPs). Considering the amount of task data and the idle resources of NCPs, a computing resource scheduling model for NCPs is established. Taking the heterogeneous task execution delay threshold as a constraint, the optimization problem is described as maximizing the utilization of the computing resources of the NCPs. The problem is proven to be NP-hard by reduction to a 0-1 knapsack problem. A many-to-many matching algorithm based on resource preferences is proposed. The algorithm first establishes mutual preference lists based on the adaptability between the task requirements and the resources provided by NCPs. This enables un-schedulable NCPs to be filtered out in the initial stage of matching, reducing the dimension of the solution space. To solve the matching problem between ICVs and NCPs, a new many-to-many matching algorithm is proposed to obtain a unique and stable optimal matching result. The simulation results demonstrate that the proposed scheme can improve the resource utilization of NCPs by an average of 9.6% compared to the reference scheme, and the total performance can be improved by up to 15.9%.
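The preference-list construction with early filtering described above can be sketched directly: an NCP enters a task's list only if it can hold the task's data and finish within the delay threshold, and the surviving NCPs are ranked by estimated completion time. The field names and the simple delay model (data divided by speed) are illustrative assumptions, not the paper's scheduling model.

```python
def build_preferences(tasks, ncps):
    """Build task-side preference lists, dropping un-schedulable NCPs (sketch).

    tasks: {task_id: {"data": amount, "deadline": max delay}}
    ncps:  {ncp_id: {"capacity": storage, "speed": processing rate}}
    An NCP is schedulable for a task if it has the capacity and can
    finish within the deadline; delay is modeled as data / speed.
    """
    prefs = {}
    for t_id, t in tasks.items():
        feasible = [n_id for n_id, n in ncps.items()
                    if n["capacity"] >= t["data"]
                    and t["data"] / n["speed"] <= t["deadline"]]
        # Tasks prefer NCPs that would finish them sooner
        prefs[t_id] = sorted(feasible,
                             key=lambda n_id: t["data"] / ncps[n_id]["speed"])
    return prefs
```

Filtering before matching is what shrinks the solution space: NCPs that could never serve a task are excluded up front, so the many-to-many matching only iterates over feasible pairs.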
Funding: supported by the Scientific and Innovative Action Plan of Shanghai (21N31900800), the Shanghai Rising-Star Program (23QB1403500), the Shanghai Sailing Program (20YF1443000), the Shanghai Science and Technology Commission Belt and Road Project (20310750500), the Talent Project of SAAS (2023-2025), the Runup Plan of SAAS (ZP22211), and the SAAS Program for Excellent Research Team (2022 (B-16)).
Abstract: Traditional transgenic detection methods require demanding test conditions and struggle to be both sensitive and efficient. In this study, a one-tube dual recombinase polymerase amplification (RPA) reaction system for CP4-EPSPS and Cry1Ab/Ac was proposed and combined with a lateral flow immunochromatographic assay, named "Dual-RPA-LFD", to visualize the dual detection of genetically modified (GM) crops. The herbicide tolerance gene CP4-EPSPS and the insect resistance gene Cry1Ab/Ac were selected as targets, given that insect resistance and herbicide tolerance, together with their stacked traits, are currently the most widely applied GM traits. Gradient-diluted plasmids, transgenic standards, and actual samples were used as templates for sensitivity, specificity, and practicality assays, respectively. The constructed method achieved visual detection of plasmid at levels as low as 100 copies, demonstrating its high sensitivity. In addition, good applicability to transgenic samples was observed, with no cross-interference between the two test lines and no influence from other genes. In conclusion, this strategy achieved the intended simultaneous detection of the two popular targets in GM crops within 20 min at 37°C in a rapid, equipment-free field manner, providing a new alternative for rapid screening in field transgenic assays.
Abstract: BACKGROUND: The concept of macroscopic on-site evaluation (MOSE) was introduced in 2015, when endoscopists observed better diagnostic yield when the macroscopically visible core on MOSE was longer than 4 mm. Recent studies suggest that MOSE by the endoscopist may be an excellent alternative to rapid on-site evaluation, and several classifications have been published. Few studies have assessed the adequacy of histologic cores in MOSE during endoscopic ultrasound-guided fine-needle aspiration/biopsy (EUS-FNA/FNB). AIM: To evaluate the performance of MOSE during EUS-FNA/FNB. METHODS: This multicenter prospective study was conducted in 16 centers in 3 countries (Egypt, Iraq, and Morocco) and included 1108 patients with pancreatic, biliary, or gastrointestinal pathology referred for EUS examination. We prospectively analyzed the MOSE in 1008 patients with available histopathological reports according to 2 classifications to determine the adequacy of the histological core samples. Data management and analysis were performed using the Statistical Package for the Social Sciences (SPSS) version 27. RESULTS: A total of 1074 solid lesions were biopsied in 1008 patients with available cytopathological reports. Mean age was 59 years, and 509 patients (50.5%) were male. The mean lesion size was 38 mm. The most frequently utilized needles were FNB-Franseen (74.5%) and 22 G (93.4%), with a median of 2 passes. According to the 2 classifications, 618 non-bloody cores (61.3%) and 964 good samples (95.6%) were adequate for histological evaluation. The overall diagnostic yield of cytopathology was 95.5%. The cytological examination confirmed the diagnosis of malignancy in 861 patients (85.4%), while 45 samples (4.5%) were inconclusive. Post-procedural adverse events occurred in 33 patients (3.3%). Statistical analysis showed a difference between needle types (P = 0.035), with a high sensitivity of FNB (97%). Analysis of the relationship between the MOSE score and the final diagnosis showed a significant difference between the different MOSE scores (P < 0.001). CONCLUSION: MOSE is a simple method that allows endoscopists to increase needle passes to improve sample quality. FNB sensitivity and cytopathology diagnostic yield are significantly higher with good MOSE cores.
Abstract: Endoscopic ultrasound (EUS) has become an essential tool for the study of pancreatic diseases. Specifically, EUS plays a pivotal role in evaluating patients with a known or suspected pancreatic mass. In this setting, differential diagnosis remains a clinical challenge. EUS-guided fine-needle aspiration (FNA) and fine-needle biopsy (FNB) have proven to be safe and useful tools in this setting. By obtaining cytological and/or histological samples, EUS-guided FNA and FNB are able to diagnose pancreatic lesions with high sensitivity and specificity. In this context, several methodological features intended to increase the diagnostic yield of EUS-guided FNA and FNB have been evaluated. In this review, we focus on the role of rapid on-site evaluation (ROSE). From data reported in the literature, ROSE may increase the diagnostic yield of EUS-FNA specimens by 10%-30%, and thus diagnostic accuracy. However, we should point out that many recent studies have reported adequacy rates of > 90% without ROSE, indicating that, perhaps, at high-volume centers, ROSE may not be indispensable to achieve excellent results. The use of ROSE can be considered important during the learning curve of EUS-FNA, and also in hospitals with diagnostic accuracy rates < 90%.
Funding: supported by the Science and Technology Project of State Grid Information & Telecommunication Group Co., Ltd. (SGTYHT/19-JS-218).
Abstract: A "cloud-edge-end" collaborative system architecture is adopted for real-time security management of on-site work in power systems, in which mobile edge computing equipment runs lightweight intelligent recognition algorithms for on-site risk assessment and alerting. Owing to its light weight and high speed, YOLOv4-Tiny is often deployed on edge computing equipment for real-time video stream detection; however, its accuracy is relatively low. This study proposes an improved YOLOv4-Tiny algorithm based on attention mechanisms and optimized training methods, achieving higher accuracy without compromising speed. Specifically, a convolutional block attention module branch is added to the backbone network to enhance the feature extraction capability, and an efficient channel attention mechanism is added to the neck network to improve feature utilization. Moreover, three optimized training methods are used to improve the training of the improved algorithm: transfer learning, mosaic data augmentation, and label smoothing. Finally, an edge computing experimental platform equipped with an NVIDIA Jetson Xavier NX chip is established, and the newly developed algorithm is tested on it. According to the results, the improved YOLOv4-Tiny algorithm detects an on-site dress-code-compliance dataset at 17.25 FPS, and the mean average precision (mAP) increases from 70.89% to 85.03%.
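Of the three training tricks listed in the abstract above, label smoothing is the simplest to show in isolation. A minimal framework-free sketch (the epsilon value and function name are illustrative assumptions, not taken from the paper):

```python
def smooth_labels(one_hot, epsilon=0.1):
    """Label smoothing: replace a one-hot target with a softened
    distribution that puts (1 - epsilon) on the true class and spreads
    epsilon uniformly over all classes, discouraging over-confident
    predictions."""
    k = len(one_hot)
    return [(1.0 - epsilon) * y + epsilon / k for y in one_hot]

# Example: 3-class one-hot target for class 0, epsilon = 0.1
print(smooth_labels([1.0, 0.0, 0.0]))  # ~[0.9333, 0.0333, 0.0333]
```

The smoothed vector still sums to 1, so it can be used as a drop-in soft target for a cross-entropy loss.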
Abstract: Endoscopic ultrasound-guided fine-needle biopsy (EUS-FNB) is an excellent investigation for diagnosing pancreatic lesions and has shown high accuracy in pathologic diagnosis. Recently, macroscopic on-site evaluation (MOSE) performed by an endoscopist was introduced as an alternative to rapid on-site cytologic evaluation to increase the diagnostic yield of EUS-FNB. MOSE can estimate the adequacy of the sample directly by macroscopic evaluation of the core tissue obtained from EUS-FNB. Isolated pancreatic tuberculosis is extremely rare and difficult to diagnose because of its non-specific signs and symptoms. This challenging diagnosis is therefore based on endoscopy, imaging, and the bacteriological and histological examination of tissue biopsies. This uncommon presentation of tuberculosis can appear as a pancreatic mass mimicking cancer. EUS-FNB can be very useful in providing a valuable histopathological diagnosis. A calcified lesion with a cheesy core on MOSE should suggest tuberculosis, prompting a GeneXpert assay, which can detect Mycobacterium tuberculosis deoxyribonucleic acid and resistance to rifampicin. A sound diagnostic strategy is crucial to prevent unnecessary surgical resection and to allow conservative management with antitubercular therapy.
Funding: supported by the National Key R&D Program of China (No. 2018YFB1003905), the National Natural Science Foundation of China under Grant No. 61971032, and the Fundamental Research Funds for the Central Universities (No. FRF-TP-18-008A3).
Abstract: On-site programming big data refers to the massive data generated in the process of software development, characterized by real-time generation, complexity, and difficulty of processing. Data cleaning is therefore essential for on-site programming big data. Duplicate data detection is an important step in data cleaning that can save storage resources and enhance data consistency. To address the insufficiency of the traditional Sorted Neighborhood Method (SNM) and the difficulty of detecting duplicates in high-dimensional data, an optimized algorithm based on random forests with a dynamic, adaptive window size is proposed. The efficiency of the algorithm is improved by refining key selection, reducing the dimensionality of the data set, and using an adaptive variable-size sliding window. Experimental results show that the improved SNM algorithm exhibits better performance and achieves higher accuracy.
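The SNM family of methods the abstract builds on can be sketched in a few lines: sort records by a blocking key, compare only records inside a sliding window, and adapt the window size to the local duplicate density. The blocking key, similarity test, and window bounds below are illustrative assumptions, not the paper's random-forest key selection:

```python
def sorted_neighborhood(records, key, is_dup, w_min=2, w_max=6):
    """Sorted Neighborhood Method with a simple adaptive window:
    sort by a blocking key, compare each record with its neighbors
    inside a window that grows while duplicates are being found and
    shrinks back when the neighborhood looks clean."""
    rows = sorted(records, key=key)
    pairs, w = set(), w_min
    for i in range(len(rows)):
        hit = False
        for j in range(i + 1, min(i + w, len(rows))):
            if is_dup(rows[i], rows[j]):
                pairs.add((min(rows[i]["id"], rows[j]["id"]),
                           max(rows[i]["id"], rows[j]["id"])))
                hit = True
        w = min(w + 1, w_max) if hit else max(w - 1, w_min)  # adapt window size
    return pairs

people = [
    {"id": 1, "name": "john smith"},
    {"id": 2, "name": "jon smith"},
    {"id": 3, "name": "mary jones"},
]
dups = sorted_neighborhood(
    people,
    key=lambda r: r["name"][:2],  # blocking key: first two letters
    is_dup=lambda a, b: a["name"][0] == b["name"][0]
                        and a["name"][-5:] == b["name"][-5:],
)
print(dups)  # {(1, 2)}
```

Sorting plus windowed comparison is what keeps the pair count near-linear instead of quadratic; the adaptive window is the knob the paper optimizes.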
Funding: sponsored by the Department of Earthquake Monitoring and Prediction, China Earthquake Administration.
Abstract: According to different testing purposes, methods, and available environmental conditions, seismograph testing can be divided into laboratory and on-site testing. The testing of a seismograph's key parameters and other relevant technical specifications is well described in guide documents (China Earthquake Administration, 2017). This includes seismometer sensitivity, linearity, and clip levels based on the shake-table test, as well as the seismometer natural period and damping constant based on electrical calibration (Wang Guangfu, 1986; Plešinger A., 1993) and instrumental self-noise collocation estimation (Holcomb L. G., 1989; Sleeman R. et al., 2006). However, with the development of seismic observation technology, many new requirements for the performance evaluation of seismographs have been put forward, and new testing items and methods have emerged.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 11575087 and 11305045) and the Fundamental Research Funds for the Central Universities, China (Grant No. 2017B17114).
Abstract: We study the dynamical energy equipartition properties of the integrable Toda model with additional uniform or disordered on-site energies by extensive numerical simulations. The total energy is initially equidistributed among some of the lowest-frequency linear modes. For the Toda model with uniform on-site potentials, the energy spectrum keeps its profile nearly unchanged on a relatively short time scale; on a much longer time scale, the energies of the tail modes increase slowly with time, and energy equipartition is far from being attained within the time scale studied. For the Toda model with disordered on-site potentials, energy transfers continuously to the high-frequency modes and eventually tends towards equipartition. We further perform a systematic study of the equipartition time t_eq as a function of the energy density ε and the nonlinear parameter α in the thermodynamic limit for the Toda model with disordered on-site potentials. We find t_eq ∝ (1/ε)^a (1/α)^b, where b ≈ 2a. The values of a and b increase when the strength of the disordered on-site potentials is increased or the number of initially excited modes is decreased.
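The reported scaling law can be used directly to extrapolate equipartition times between parameter settings. A small sketch (the prefactor C and exponent a are hypothetical placeholders; the abstract reports only the functional form and the relation b ≈ 2a):

```python
def t_eq(eps, alpha, C=1.0, a=1.5):
    """Evaluate the reported scaling t_eq ∝ (1/eps)^a * (1/alpha)^b
    with b ≈ 2a. C and a are made-up placeholders: the abstract gives
    the functional form, not the fitted exponent values."""
    b = 2.0 * a
    return C * (1.0 / eps) ** a * (1.0 / alpha) ** b

# Under this law, halving the energy density at fixed alpha multiplies
# the equipartition time by 2**a, independent of C and alpha:
ratio = t_eq(0.05, 0.5) / t_eq(0.10, 0.5)
print(ratio)  # 2**1.5 ≈ 2.83
```

The ratio form is how such power laws are usually checked against simulation data: the unknown prefactor cancels.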
Abstract: Approximately 20% of homes nationwide use an on-site treatment system as a form of household wastewater management. However, approximately 10% to 20% of on-site treatment systems malfunction each year, many of which have either failed or exceeded the soil's long-term acceptance rate (LTAR), causing environmental and human health risks. The objective of this field study was to evaluate the effects of soil condition (e.g., wet and dry) and product architecture type [i.e., chamber, gravel-less pipe (GLP), polystyrene-aggregate, and pipe-and-aggregate] on in-product solution storage and biomat thickness in a profile-limited soil in northwest Arkansas under increased loading rates, and to estimate the LTAR for each product. During Phase I of this study (March 13 to October 4, 2013), effluent loading rates were approximately doubled, while during Phase II (October 8, 2013 to May 29, 2014) rates were approximately quadrupled, relative to the maximum allowable loading rate for each product. The pipe-and-tire-chip, 46-cm-wide-trench pipe-and-gravel, and 25-cm-diameter GLP products had the greatest (p < 0.001) in-product solution storage during the wet-soil conditions of Phase I monitoring, while the 31-cm-wide and 5.4-m-long chambers had the lowest (p < 0.001). The 25-cm-diameter GLP product had the greatest (p < 0.001) in-product solution storage during Phase II, while the 61-cm-wide, 5.4-m-long chamber had the lowest (p < 0.001). Results of this study indicate that some alternative products may be able to effectively handle effluent loading rates in excess of those currently allowed by the State of Arkansas. Further research will be required to confirm these interpretations.
Funding: National Science and Technology Major Project of China (No. 2016ZX04003001).
Abstract: The dynamic envelope curve is a significant parameter for assessing the running safety of high-speed trains. To date, the method based on binocular stereo vision is the only available way to measure the dynamic envelope curve of a train running at over 200 km/h. Nevertheless, the method has two limitations: one is the large field of view (FOV) required, the other is calibration time. Hence, portable calibration equipment, an easy-to-build target, and a rapid calibration algorithm are required to complete the calibration. In this paper, a new rapid on-site calibration method with a large FOV based on binocular stereo vision is proposed. To address these issues, a light target has been designed; the rail coordinate system (RCS) is represented by 40 fixed retroreflective points on the target, which are used to calibrate the parameters of the two cameras. The two cameras merely capture a single image of the target simultaneously, and the intrinsic and extrinsic parameters of the cameras can be calculated rapidly. To verify the proposed method, experiments were conducted, and the results reveal that the accuracy can reach ±1 mm, which meets the measurement requirement.
Abstract: Real-time monitoring of the environmental radiation dose around nuclear facilities is an important part of safety, and regular calibration is necessary to guarantee the accuracy of the monitoring results. Many environmental dosimeters are installed dispersedly around nuclear facilities; because of their large number, wide distribution, and continuous real-time monitoring state, calibrating them in a standard laboratory would cost considerable manpower and money, and would moreover leave the environment unmonitored in the meantime. To solve the problem of the measurement accuracy of stationary gamma radiation dosimeters, an on-site calibration method is proposed. The radioactive source is an X-ray spectrum, and the dose reference instrument, calibrated by the national standard laboratory, is a high-pressure ionization chamber. On-site calibration is divided into two parts: first, the energy response of the dosimeter at high and low energies is measured in the laboratory, and the energy response curve is obtained in combination with Monte Carlo simulation; second, an experiment is carried out in the field with the dosimeter under test, using the substitution method to calibrate it; finally, the calibration coefficient is obtained through energy-curve correction. To verify the accuracy of the on-site calibration method, the calibrated dosimeter was tested in the standard laboratory, and the error was 3.4%. The result shows that the on-site calibration method using X-rays is feasible and improves the accuracy of the measurement results of stationary γ-ray instruments; more importantly, it has great reference value for radiation safety management and radiation environment evaluation.
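The substitution step described above reduces to a simple ratio with an energy-response correction. A minimal sketch, with variable names and numbers that are illustrative assumptions, not values from the paper:

```python
def calibration_coefficient(reference_dose_rate, dosimeter_reading,
                            energy_response=1.0):
    """Substitution method: the field dosimeter and a reference
    instrument (itself calibrated in a national standard laboratory)
    measure the same X-ray field in turn; their ratio, corrected by the
    dosimeter's energy-response factor at that beam quality, gives the
    calibration coefficient."""
    return reference_dose_rate / (dosimeter_reading * energy_response)

# Made-up example: reference reads 10.0 uSv/h, field dosimeter reads
# 9.5 uSv/h, energy-response factor 1.02 at this X-ray quality.
k = calibration_coefficient(10.0, 9.5, energy_response=1.02)
print(round(k, 4))  # 1.032
```

Corrected field readings would then be obtained by multiplying the raw dosimeter reading by k.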
Abstract: In current research on task offloading and resource scheduling in vehicular networks, vehicles are commonly assumed to maintain constant speed or relatively stationary states, and the impact of speed variations on task offloading is often overlooked. It is frequently assumed that vehicles can be accurately modeled during actual motion. However, in dynamic vehicular environments, both the tasks generated by the vehicles and the vehicles' surroundings are constantly changing, making real-time modeling of actual dynamic vehicular network scenarios difficult. Taking actual dynamic vehicular scenarios into account, this paper considers the real-time non-uniform movement of vehicles and proposes a vehicular task dynamic offloading and scheduling algorithm for single-task multi-vehicle vehicular network scenarios, attempting to solve the dynamic decision-making problem in the task offloading process. The optimization objective is to minimize the average task completion time, formulated as a multi-constrained non-linear programming problem. Because of vehicle mobility, a constraint model is applied in the decision-making process to dynamically determine whether the communication range is sufficient for task offloading and transmission. Finally, the proposed vehicular task dynamic offloading and scheduling algorithm, based on multi-agent deep deterministic policy gradient (MADDPG), is applied to find the optimal solution of the optimization problem. Simulation results show that the proposed algorithm achieves lower-latency task computation offloading. Meanwhile, the average task completion time of the proposed algorithm improves by 7.6% compared to the MADDPG scheme and by 51.1% compared to deep deterministic policy gradient (DDPG).
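The communication-range constraint mentioned in the abstract amounts to comparing the transmission time of a task with the time the vehicle remains in range. A simplified sketch assuming straight-line motion at the current relative speed (the paper itself handles non-uniform movement; all names and numbers here are illustrative):

```python
def can_offload(link_range_m, rel_speed_mps, distance_m,
                task_bits, rate_bps):
    """Feasibility check: offloading is only allowed if the vehicle
    stays inside the server's communication range long enough to
    transmit the task. rel_speed_mps > 0 means the vehicle is moving
    away from the server."""
    if distance_m > link_range_m:
        return False                      # already out of range
    if rel_speed_mps <= 0:
        return True                       # approaching/stationary: no exit bound
    dwell_time = (link_range_m - distance_m) / rel_speed_mps
    tx_time = task_bits / rate_bps
    return tx_time <= dwell_time

# A 4 Mbit task over a 10 Mbit/s link needs 0.4 s; a vehicle leaving
# range at 20 m/s with 30 m of range left has 1.5 s of dwell time.
print(can_offload(100.0, 20.0, 70.0, 4e6, 10e6))  # True
```

In a full scheduler this predicate would gate each candidate (vehicle, server) pair before the MADDPG agent's action is accepted.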
Funding: supported in part by the National Key R&D Program of China under Grant 2020YFB1005900, the National Natural Science Foundation of China under Grant 62001220, the Jiangsu Provincial Key Research and Development Program under Grant BE2022068, the Natural Science Foundation of Jiangsu Province under Grant BK20200440, the Future Network Scientific Research Fund Project FNSRFP-2021-YB-03, and the Young Elite Scientist Sponsorship Program, China Association for Science and Technology.
Abstract: Collaborative edge computing is a promising direction for handling computation-intensive tasks in B5G wireless networks. However, edge computing servers (ECSs) from different operators may not trust each other, so the incentives for collaboration cannot be guaranteed. In this paper, we propose a consortium-blockchain-enabled collaborative edge computing framework, where users can offload computing tasks to ECSs from different operators. To minimize the total delay of users, we formulate a joint task offloading and resource optimization problem under the constraint of the computing capability of each ECS. We apply the Tammer decomposition method and heuristic optimization algorithms to obtain the optimal solution. Finally, we propose a reputation-based node selection approach to facilitate the consensus process, and also consider completion-time-based primary node selection to avoid monopolization by certain edge nodes and enhance the security of the blockchain. Simulation results validate the effectiveness of the proposed algorithm; the total delay can be reduced by up to 40% compared with the non-cooperative case.
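The node selection idea above can be sketched as a ranking over candidate consensus nodes. The field names, scores, and tie-breaking rule below are illustrative assumptions, not the paper's actual reputation model:

```python
def pick_primary(nodes):
    """Reputation-based primary node selection sketch: among consensus
    candidates, choose the node with the highest reputation, breaking
    ties by the fastest recent completion time (the completion-time
    criterion mentioned in the abstract)."""
    return max(nodes, key=lambda n: (n["reputation"], -n["completion_time"]))

nodes = [
    {"id": "ecs-a", "reputation": 0.92, "completion_time": 1.4},
    {"id": "ecs-b", "reputation": 0.97, "completion_time": 2.0},
    {"id": "ecs-c", "reputation": 0.97, "completion_time": 1.1},
]
print(pick_primary(nodes)["id"])  # ecs-c
```

Rotating or randomizing among the top-ranked nodes, rather than always taking the maximum, is one common way to avoid the monopolization the abstract warns about.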
Funding: supported by the Talent Fund of Beijing Jiaotong University (No. 2023XKRC028), the CCF-Lenovo Blue Ocean Research Fund, and the Beijing Natural Science Foundation under Grant No. L221003.
Abstract: Vehicular edge computing (VEC) is emerging as a promising solution paradigm to meet the requirements of compute-intensive applications in the internet of vehicles (IoV). Non-orthogonal multiple access (NOMA) has advantages in improving spectrum efficiency and dealing with bandwidth scarcity and cost, so combining VEC and NOMA is an encouraging direction. In this paper, we jointly optimize the task offloading decision and resource allocation to maximize the service utility of the NOMA-VEC system. To solve the optimization problem, we propose a multi-agent deep graph reinforcement learning algorithm. The algorithm extracts topological features and relationship information between agents from the system state as observations, and outputs the task offloading decision and resource allocation simultaneously with a local policy network, which is updated by a local learner. Simulation results demonstrate that the proposed method achieves a 1.52%-5.80% improvement in system service utility compared with the benchmark algorithms.
Funding: funding from TECNALIA, Basque Research and Technology Alliance (BRTA); supported by the project "Optimization of Deep Learning algorithms for Edge IoT devices for sensorization and control in Buildings and Infrastructures (EMBED)", funded by the Gipuzkoa Provincial Council and approved under the 2023 call of the Guipuzcoan Network of Science, Technology and Innovation Program with File Number 2023-CIEN-000051-01.
Abstract: In a network environment composed of different types of computing centers that can be divided into different layers (cloud, edge layer, and others), the interconnection between them offers the possibility of peer-to-peer task offloading. For many resource-constrained devices, the computation of many types of tasks is not feasible because the devices do not have enough available memory and processing capacity. In this scenario, it is worth considering transferring these tasks to resource-rich platforms, such as edge data centers or remote cloud servers. For various reasons, it is more appropriate to offload different tasks to specific offloading destinations depending on the properties and state of the environment and the nature of the tasks. At the same time, establishing an optimal offloading policy, which ensures that all tasks are executed within the required latency and avoids excessive workload on specific computing centers, is not easy. This study presents two alternatives for solving the offloading decision problem by introducing two well-known algorithms, Graph Neural Networks (GNN) and Deep Q-Network (DQN). It applies the alternatives in a well-known edge computing simulator called PureEdgeSim and compares them with the two default methods, Trade-Off and Round Robin. Experiments showed that the variants offer a slight improvement in task success rate and workload distribution; in terms of energy efficiency, they provided similar results. Finally, the success rates of the different computing centers are tested, and the inability of remote cloud servers to respond to applications in real time is demonstrated. These ways of finding an offloading strategy in a local networking environment are novel in that they emulate the state and structure of the environment, considering the quality of its connections and constant updates. The offloading score defined in this research is a crucial feature for determining the quality of an offloading path in the GNN training process and has not previously been proposed. At the same time, the suitability of reinforcement learning (RL) techniques is demonstrated by the dynamism of the network environment, considering all the key factors that affect the decision to offload a given task, including the actual state of all devices.
Funding: National Natural Science Foundation of China (62072392).
Abstract: Crowdsourcing technology is widely recognized for its effectiveness in task scheduling and resource allocation. While traditional methods for task allocation can help reduce costs and improve efficiency, they may encounter challenges when dealing with abnormal data-flow nodes, leading to decreased allocation accuracy and efficiency. To address these issues, this study proposes a novel two-part invalid-detection task allocation framework. In the first step, an anomaly detection model is developed using a dynamic self-attentive GAN to identify anomalous data. Compared to the baseline method, the model achieves an approximately 4% increase in the F1 value on the public dataset. In the second step of the framework, task allocation is modeled using a two-part graph matching method. This phase introduces a P-queue KM algorithm that implements a more efficient optimization strategy. Allocation efficiency improves by approximately 23.83% compared to the baseline method. Empirical results confirm the effectiveness of the proposed framework in detecting abnormal data nodes, enhancing allocation precision, and achieving efficient allocation.
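The matching step in the second phase solves a maximum-weight assignment on a bipartite graph. The paper's P-queue KM variant is not spelled out in the abstract, so as a stand-in, a brute-force solver over a toy score matrix shows exactly what problem KM (Kuhn-Munkres) solves in polynomial time:

```python
from itertools import permutations

def best_assignment(score):
    """Exhaustive stand-in for the KM matching step in two-part graph
    allocation: enumerate every worker->task assignment and keep the
    one with maximum total score. Only viable for toy sizes; KM solves
    the same problem in O(n^3)."""
    n = len(score)
    best, best_perm = float("-inf"), None
    for perm in permutations(range(n)):   # perm[w] = task given to worker w
        total = sum(score[w][perm[w]] for w in range(n))
        if total > best:
            best, best_perm = total, perm
    return best, best_perm

# 3 workers x 3 tasks; score[w][t] = suitability of worker w for task t
scores = [[4, 1, 3],
          [2, 0, 5],
          [3, 2, 2]]
print(best_assignment(scores))  # (11, (0, 2, 1))
```

For real workloads one would swap the brute force for a proper KM/Hungarian implementation; the interface (score matrix in, assignment out) stays the same.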
Funding: This research was funded by the Project of the National Natural Science Foundation of China, Grant Number 62106283.
Abstract: To address the problems of low solution accuracy and high decision pressure that a single agent faces in large-scale dynamic task allocation (DTA) with a high-dimensional decision space, this paper combines deep reinforcement learning (DRL) theory with a multi-agent architecture and proposes an improved Multi-Agent Deep Deterministic Policy Gradient algorithm (MADDPG-D2) with a dual experience replay pool and dual noise to improve the efficiency of DTA. The algorithm is based on the traditional Multi-Agent Deep Deterministic Policy Gradient (MADDPG) algorithm. It introduces a dual-noise mechanism to enlarge the action exploration space in the early stage of training, and a dual experience pool to improve data utilization. At the same time, to accelerate the training speed and efficiency of the agents and to solve the cold-start problem of training, a priori knowledge is applied during training. Finally, the MADDPG-D2 algorithm is compared and analyzed on a digital battlefield of ground-air confrontation. The experimental results show that agents trained by the MADDPG-D2 algorithm achieve higher win rates and average rewards, utilize resources more reasonably, and better overcome the difficulty traditional single-agent algorithms face in high-dimensional decision spaces. The MADDPG-D2 algorithm based on the multi-agent architecture proposed in this paper thus shows clear advantages for DTA.
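The dual experience pool idea can be sketched independently of the rest of MADDPG-D2: keep high-reward transitions in a separate pool and mix them into each training batch. The capacity, threshold, and mixing ratio below are illustrative assumptions, not the paper's settings:

```python
import random
from collections import deque

class DualReplayBuffer:
    """Dual experience pool sketch: high-reward transitions go to a
    priority pool, the rest to an ordinary pool, and sampling mixes the
    two so rare good experiences are replayed more often."""
    def __init__(self, capacity=1000, reward_threshold=0.0, mix=0.5):
        self.good = deque(maxlen=capacity)
        self.rest = deque(maxlen=capacity)
        self.reward_threshold = reward_threshold
        self.mix = mix  # fraction of each batch drawn from the good pool

    def push(self, transition, reward):
        pool = self.good if reward > self.reward_threshold else self.rest
        pool.append(transition)

    def sample(self, batch_size):
        n_good = min(int(batch_size * self.mix), len(self.good))
        batch = random.sample(list(self.good), n_good)
        batch += random.sample(list(self.rest),
                               min(batch_size - n_good, len(self.rest)))
        return batch

buf = DualReplayBuffer()
for i in range(10):
    buf.push("s%d" % i, reward=1.0 if i % 2 else -1.0)
print(len(buf.sample(4)))  # 4
```

In the full algorithm the sampled batch would feed the critic update; the split simply biases which transitions the critic sees.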
Abstract: Introduction: The uncontrolled management of waste electrical and electronic equipment (W3E) causes respiratory problems in the handlers of this waste. The objective was to study the tasks associated with respiratory symptoms in W3E handlers. Methods: The study was cross-sectional with an analytical focus on W3E handlers in the informal sector in Ouagadougou. A peer-validated questionnaire collected data on a sample of 161 handlers. Results: The most common W3E processing tasks were the purchase or sale of W3E (67.70%), its repair (39.75%), and its collection (31.06%). The prevalence of cough was 21.74%, that of wheezing 14.91%, phlegm 12.50%, and dyspnea at rest 10.56%. In bivariate analysis, there were significant associations at the 5% level between W3E repair and phlegm (p-value = 0.044), between W3E burning and wheezing (p-value = 0.011), and between W3E burning and cough (p-value = 0.01). The final logistic regression models suggested that the burning of W3E and the melting of lead batteries were risk factors for the occurrence of cough, with prevalence ratios of 4.57 and 4.63, respectively. Conclusion: Raising awareness of the wearing of personal protective equipment, in particular suitable masks, among W3E handlers, especially those dedicated to the burning of electronic waste and the melting of lead, could reduce the risk of respiratory symptoms.
Abstract: A cable circuit of a substation in the United Kingdom showed a high level of PD activity during a survey using handheld PD testing equipment. The authors were invited to carry out an on-site PD testing experiment to further diagnose and locate the potential problem in the cable system. This paper presents the authors' experience of carrying out the cable test. Following a brief introduction to the experimental equipment and physical connections, the paper analyses the data collected from the testing, including PD pulse shape analysis, frequency spectrum analysis, and phase-resolved PD pattern analysis. Combined with PD propagation direction identification, PD source diagnosis and localisation were performed. Four different types of sensors, which were used during the testing, are shown to have different frequency bandwidths and to perform differently. After comparing the parameters of the sensors and the PD signals detected by each sensor, an optimal PD monitoring bandwidth for cable systems is suggested.
Funding: supported by the National Natural Science Foundation of China (Grant No. 62072031), the Applied Basic Research Foundation of Yunnan Province (Grant No. 2019FD071), and the Yunnan Scientific Research Foundation Project (Grant No. 2019J0187).
Abstract: With the development of vehicles towards intelligence and connectivity, vehicular data is diversifying and growing dramatically. A task allocation model and algorithm for heterogeneous Intelligent Connected Vehicle (ICV) applications are proposed for a dispersed computing network composed of heterogeneous task vehicles and Network Computing Points (NCPs). Considering the amount of task data and the idle resources of NCPs, a computing resource scheduling model for NCPs is established. Taking the heterogeneous task execution delay threshold as a constraint, the optimization problem is formulated as maximizing the utilization of the computing resources of the NCPs. The problem is proven to be NP-hard by reduction to the 0-1 knapsack problem. A many-to-many matching algorithm based on resource preferences is proposed. The algorithm first establishes mutual preference lists based on the fit between the task requirements and the resources provided by the NCPs. This enables un-schedulable NCPs to be filtered out in the initial stage of matching, reducing the dimension of the solution space. To solve the matching problem between ICVs and NCPs, a new many-to-many matching algorithm is proposed to obtain a unique and stable optimal matching result. The simulation results demonstrate that the proposed scheme can improve the resource utilization of NCPs by an average of 9.6% compared to the reference scheme, and the total performance can be improved by up to 15.9%.
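The 0-1 knapsack problem used in the hardness argument above has a standard dynamic-programming solution. A compact sketch, with the mapping to the scheduling setting stated as an interpretation rather than the paper's construction: tasks play the role of items, their resource demands the weights, and an NCP's idle resources the capacity:

```python
def knapsack_01(values, weights, capacity):
    """Classic 0-1 knapsack DP over integer capacities: dp[c] holds the
    best total value achievable within capacity c."""
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacity downwards so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Items (value, weight): within capacity 5 the best pick is the last
# two items, total value 7.
print(knapsack_01(values=[3, 4, 5], weights=[2, 3, 4], capacity=5))  # 7
```

The DP is pseudo-polynomial in the capacity, which is consistent with the problem being NP-hard in the strict sense while still tractable for small integer resource budgets.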