This research assessed the environmental impact of cement silo emissions on existing concrete batching facilities in M35-Mussafah, Abu Dhabi, United Arab Emirates. The assessment used an air quality dispersion model (AERMOD) to predict the ambient concentration of Portland cement particulate matter smaller than 10 microns (PM₁₀) emitted to the atmosphere during loading and unloading activities at 176 silos located in 25 concrete batching facilities. AERMOD was applied to simulate and describe the dispersion of PM₁₀ released from the cement silos into the air. Simulations were carried out for PM₁₀ emissions under controlled and uncontrolled cement silo scenarios. Results showed an incremental negative impact on air quality and public health from uncontrolled silo emissions and estimated that the uncontrolled PM₁₀ emission sources contribute 528,958.32 kg/year to air pollution. The modeling comparison between the controlled and uncontrolled silos shows that the highest annual average concentration from controlled cement silos is 0.065 μg/m³ and the highest daily average value is 0.6 μg/m³; both values are negligible and will not lead to a significant air quality impact anywhere in the study domain. By contrast, the uncontrolled cement silos' highest annual average concentration is 328.08 μg/m³ and their highest daily average is 1250.09 μg/m³, which could cause a significant air quality impact and health effects on the public and workers. The short-term and long-term average PM₁₀ concentrations predicted by the dispersion model at these receptors are discussed for both scenarios and compared with local and international air quality standards and guidelines.
This research study quantifies the PM₁₀ emission rates (g/s) from cement silos in 25 concrete batching facilities for both controlled and uncontrolled scenarios by applying the USEPA AP-42 step-by-step approach. The study evaluates the potential environmental impact of fugitive cement dust emissions from 176 cement silos located in 25 concrete batching facilities in the M35 Mussafah industrial area of Abu Dhabi, UAE. Emission factors are crucial for quantifying PM₁₀ emission rates (g/s): they support source-specific emission estimates for area-wide inventories, help identify major pollution sources, and provide screening inputs for compliance monitoring and air dispersion modeling. The required data, covering production, raw material usage, energy consumption, and process-related details, were obtained through field visits, surveys, and interviews with facility representatives so that emission rates could be calculated accurately. Statistical analysis was conducted on cement consumption and on emission rates for controlled and uncontrolled sources at the targeted facilities. The data show that average cement consumption among the facilities is approximately 88,160 MT/yr, with wide variation depending on facility size and production rate. Emission rates from controlled sources average 4.752 × 10⁻⁴ g/s, while rates from uncontrolled sources average 0.6716 g/s. The analysis shows a statistically significant relationship (p < 0.05) and a perfect positive correlation (r = 1) between cement consumption and emission rates, indicating that emission rates rise as cement consumption increases. Furthermore, comparing the controlled and uncontrolled scenarios reveals a significant difference between the two, highlighting the effectiveness of control measures in reducing PM₁₀ emissions. The study's findings provide insight into the impact of cement silo emissions on air quality and the importance of implementing control measures in concrete batching facilities, and the comparative analysis supports the development of pollution control strategies in the ready-mix industry.
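As a rough illustration of the emission-rate arithmetic this approach involves, the sketch below converts an annual cement throughput and a per-ton PM₁₀ emission factor into an average rate in g/s. The emission factors used here are illustrative values back-calculated from the averages reported above, not the AP-42 factors themselves, which should be taken from the guideline.

```python
# Minimal sketch of the AP-42-style emission-rate arithmetic described above.
# The emission factors below are illustrative placeholders back-calculated
# from the abstract's reported averages, NOT the AP-42 values; substitute
# the applicable factors for cement unloading to controlled/uncontrolled silos.

SECONDS_PER_YEAR = 365 * 24 * 3600

def pm10_emission_rate(cement_mt_per_year: float, ef_kg_per_mt: float) -> float:
    """Convert annual cement throughput and an emission factor (kg PM10 per
    metric ton of cement handled) into an average emission rate in g/s."""
    annual_kg = cement_mt_per_year * ef_kg_per_mt      # kg PM10 per year
    return annual_kg * 1000.0 / SECONDS_PER_YEAR       # g/s

# Example with the study's average facility throughput (~88,160 MT/yr) and
# hypothetical factors for an uncontrolled vs. a bag-filter-controlled silo.
for label, ef in [("uncontrolled", 0.24), ("controlled", 0.00017)]:
    rate = pm10_emission_rate(88_160, ef)
    print(f"{label}: {rate:.3e} g/s")
```

With these illustrative factors the example reproduces the order of magnitude of the averages quoted above (≈0.67 g/s uncontrolled, ≈4.75 × 10⁻⁴ g/s controlled).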
In parallel-batching machine scheduling, all jobs in a batch start and complete at the same time, and the processing time of the batch is the maximum processing time of any job in it. For the unbounded parallel-batching machine scheduling problem of minimizing the maximum lateness, denoted 1|p-batch|L_max, a dynamic programming algorithm with time complexity O(n²) is well known in the literature. This algorithm was later improved to O(n log n). In this note, we present another O(n log n) algorithm that simplifies the data structures and implementation details.
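For context, the sketch below implements the classical O(n²) dynamic program that the O(n log n) algorithms improve upon, assuming the standard structural result that some optimal schedule partitions the SPT-ordered jobs into consecutive batches; it is not the authors' algorithm.

```python
# A sketch of the classical O(n^2) dynamic program for 1|p-batch|L_max (not
# the note's O(n log n) algorithm). It relies on the standard structural
# result that some optimal schedule partitions the jobs, sorted by
# processing time, into consecutive batches processed in that order.

import math

def lmax_unbounded_pbatch(p, d):
    """p[i], d[i]: processing time and due date of job i. Returns optimal L_max."""
    jobs = sorted(zip(p, d))                     # SPT order
    n = len(jobs)
    p = [x for x, _ in jobs]
    d = [x for _, x in jobs]
    # F[j] = optimal L_max of jobs j..n-1 when the machine is free at time 0.
    F = [math.inf] * (n + 1)
    F[n] = -math.inf
    for j in range(n - 1, -1, -1):
        dmin = math.inf
        for k in range(j, n):                    # first batch = jobs j..k
            dmin = min(dmin, d[k])
            # the batch completes at p[k]; remaining jobs are delayed by p[k]
            F[j] = min(F[j], max(p[k] - dmin, p[k] + F[k + 1]))
    return F[0]

print(lmax_unbounded_pbatch([1, 2], [0, 10]))    # -> 1 (batches {1}, {2})
```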
A batch is a subset of jobs which must be processed jointly, in either serial or parallel form. For single-machine batching problems with the total completion time objective, the algorithmic aspects have been extensively studied in the literature. This paper presents the optimal batching structures of these problems under the constraint that all jobs are placed in exactly N batches, where N is an arbitrary fixed batch number with 1 < N < n.
The scheduling problem on a single batching machine with family jobs is studied. The batching machine can process a group of jobs simultaneously as a batch, and jobs in the same batch complete at the same time. The batch size is assumed to be unbounded, and jobs that belong to different families cannot be processed in the same batch. The objective is to minimize the maximum lateness. For the problem with a fixed number m of families and n jobs, a polynomial-time dynamic programming algorithm with time complexity O(n(n/m + 1)^m) is presented.
To improve productivity and resource utilization and to reduce the production cost of flexible job shops, this paper designs an improved two-layer optimization algorithm for the dual-resource scheduling optimization problem of a flexible job shop with workpiece batching. First, a mathematical model is established to minimize the maximum completion time. Second, an improved two-layer optimization algorithm is designed: the outer layer uses an improved PSO (Particle Swarm Optimization) to solve the workpiece batching problem, and the inner layer uses an improved GA (Genetic Algorithm) to solve the dual-resource scheduling problem. A rescheduling method is then designed to handle task disturbances, represented by machine failures, that occur during workshop production. Finally, the superiority and effectiveness of the improved two-layer optimization algorithm are verified on two typical cases. The results show that the improved algorithm increases average productivity by 7.44% compared with an ordinary two-layer optimization algorithm. By varying the number of AGVs (Automated Guided Vehicles) and analyzing the impact on the production cycle of the whole order, the paper uses two indicators, the decrease rate of the maximum completion time and the average AGV load time, to obtain the optimal number of AGVs, which saves production cost while ensuring production efficiency. This research ties the solved problem to the real production process, improving productivity and reducing the production cost of the flexible job shop, and provides new ideas for subsequent research.
For the goals of security and privacy preservation, we propose a data sharing protocol based on blind batch encryption and a public ledger, which allows the integrity of sensitive data to be audited by the public ledger while privacy information is preserved. Data owners can tightly manage their data with efficient revocation and grant only one-time, adaptive access to fulfill a request. We prove that our protocol is semantically secure, blind, and secure against oblivious requesters and malicious file keepers. We also provide a security analysis in the context of four typical attacks.
In this paper we study the problem of scheduling a batching machine with non-identical job sizes. The jobs arrive simultaneously and have unit processing times. The goal is to minimize the total completion time. Having shown that the problem is NP-hard, we put forward three approximation algorithms with worst-case ratios 4, 2, and 3/2, respectively.
Cloud service providers generally co-locate online services and batch jobs on the same computer cluster, where resources can be pooled to maximize data center resource utilization. Due to resource competition between batch jobs and online services, co-location frequently impairs the performance of online services. This study presents a quality of service (QoS) prediction-based scheduling model (QPSM) for co-located workloads. The performance prediction in QPSM consists of two parts: prediction of an online service's QoS anomalies based on XGBoost, and prediction of the completion time of an offline batch job based on random forest. Online-service QoS anomaly prediction is used to evaluate the influence of the batch job mix on online service performance, and batch job completion time prediction is used to reduce the total waiting time of batch jobs. When the same number of batch jobs are scheduled in experiments using typical test sets such as CloudSuite, the scheduling time required by QPSM is reduced by about 6 h on average compared with the first-come, first-served strategy and by about 11 h compared with the random scheduling strategy. Compared with the non-co-located situation, QPSM improves CPU resource utilization by 12.15% and memory resource utilization by 5.7% on average. Experiments show that the QPSM scheduling strategy can effectively guarantee the quality of online services and further improve cluster resource utilization.
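A minimal sketch of the two predictors described above, with synthetic stand-in features and labels rather than the paper's actual feature set:

```python
# Sketch of QPSM's two predictors as described in the abstract: an XGBoost
# classifier flagging online-service QoS anomalies and a random forest
# regressor estimating batch-job completion times. Features, labels, and the
# admission rule below are synthetic illustrations, not the paper's design.

import numpy as np
from xgboost import XGBClassifier
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic co-location features: e.g. CPU share, memory pressure, cache misses.
X = rng.random((1000, 3))

# 1) QoS anomaly label: here, anomalies occur under high combined pressure.
y_anomaly = (X.sum(axis=1) > 2.2).astype(int)
qos_clf = XGBClassifier(n_estimators=100, max_depth=4, eval_metric="logloss")
qos_clf.fit(X, y_anomaly)

# 2) Batch-job completion time grows with resource contention (synthetic rule).
y_time = 60 + 200 * X[:, 0] + 80 * X[:, 1] + rng.normal(0, 5, 1000)
time_reg = RandomForestRegressor(n_estimators=100, random_state=0)
time_reg.fit(X, y_time)

# A scheduler can then admit a batch job onto a node only if no QoS anomaly
# is predicted, and order admitted jobs by predicted completion time.
candidate = rng.random((1, 3))
if qos_clf.predict(candidate)[0] == 0:
    print("admit; predicted completion:", time_reg.predict(candidate)[0], "s")
else:
    print("defer: QoS anomaly predicted")
```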
The maturity of 5G technology has enabled crowd-sensing services to collect multimedia data over wireless networks, promoting the application of crowd-sensing services in different fields but also bringing more privacy and security challenges, the most common of which is privacy leakage. As a privacy protection technology combining data integrity checking with identity anonymity, the ring signature is widely used in the field of privacy protection. However, introducing signature technology incurs additional signature verification overhead, and in the crowd-sensing scenario existing signature schemes are inefficient at multi-signature verification. It is therefore necessary to design an efficient multi-signature verification scheme that still ensures security. In this paper, a batch-verifiable signature scheme is proposed for the crowd-sensing setting, which enables the sensing platform to verify multiple uploaded signatures efficiently, overcoming the defects of traditional signature schemes in multi-signature verification. Our proposal also presents a method for linking homologous data, which is valuable for incentive mechanisms and data analysis. Simulation results show that the proposed scheme performs well in terms of security and efficiency in crowd-sensing applications with large numbers of users and data.
The security of Federated Learning (FL) / Distributed Machine Learning (DML) is gravely threatened by data poisoning attacks, which destroy the usability of the model by contaminating training samples; such attacks are therefore called causative availability indiscriminate attacks. Because existing data sanitization methods are hard to apply in real-time applications due to their tedious processes and heavy computation, we propose a new supervised batch detection method for poison that can rapidly sanitize the training dataset before local model training. We design a training dataset generation method that helps to enhance accuracy, and we use data complexity features to train a detection model, which is then used in an efficient batch hierarchical detection process. Our model stockpiles knowledge about poison and can be expanded by retraining to adapt to new attacks. Being neither attack-specific nor scenario-specific, our method is applicable to FL/DML as well as other online or offline scenarios.
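The sketch below illustrates the general idea under stated assumptions: per-sample data-complexity features (distance to the class centroid and neighbor label disagreement, both chosen here purely for illustration) feed a supervised detector that batch-filters a training set. The paper's actual features and hierarchical detection process may differ.

```python
# Illustrative sketch of supervised, feature-based poison detection: featurize
# each sample, train a detector on data with known poison labels, then use it
# to batch-filter new training data. Features and the synthetic label-flip
# poisoning are assumptions for the sketch, not the paper's method.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors

def complexity_features(X, y):
    """Per-sample features: distance to own-class centroid, and the fraction
    of the 5 nearest neighbours whose label disagrees (label-noise signal)."""
    feats = np.zeros((len(X), 2))
    for c in np.unique(y):
        m = y == c
        feats[m, 0] = np.linalg.norm(X[m] - X[m].mean(axis=0), axis=1)
    nn = NearestNeighbors(n_neighbors=6).fit(X)
    _, idx = nn.kneighbors(X)                      # idx[:, 0] is the point itself
    feats[:, 1] = (y[idx[:, 1:]] != y[:, None]).mean(axis=1)
    return feats

rng = np.random.default_rng(0)
# Clean two-class data, then label-flip 10% of samples to act as "poison".
X = np.vstack([rng.normal(0, 1, (500, 5)), rng.normal(3, 1, (500, 5))])
y = np.repeat([0, 1], 500)
poisoned = rng.choice(1000, 100, replace=False)
y_p = y.copy(); y_p[poisoned] = 1 - y_p[poisoned]
is_poison = np.zeros(1000, dtype=int); is_poison[poisoned] = 1

feats = complexity_features(X, y_p)
detector = RandomForestClassifier(n_estimators=200, random_state=0)
detector.fit(feats[:700], is_poison[:700])         # supervised on known poison
keep = detector.predict(feats[700:]) == 0          # batch-filter a new chunk
print("kept", keep.sum(), "of", len(feats[700:]), "samples")
```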
This study focuses on the scheduling problem of unrelated parallel batch processing machines (BPM) with release times, a scenario derived from the moulding process in a foundry. In this process, a batch is initially formed, placed in a sandbox, and the sandbox is then positioned on a BPM for moulding. The complexity of the scheduling problem increases when BPM capacity and sandbox volume are considered. To minimize the makespan, a new cooperated imperialist competitive algorithm (CICA) is introduced. In CICA, the number of empires is not a parameter: four empires are maintained throughout the search process. Two types of assimilation are performed: the strongest and weakest empires cooperate in their assimilation, while the remaining two empires, whose normalized total costs are close, combine in theirs. A new form of imperialist competition is proposed to prevent insufficient competition, and the unique features of the problem are effectively exploited. Computational experiments are conducted across several instances, and extensive experimental results show that the new strategies of CICA are effective, indicating promising advantages for the considered BPM scheduling problems.
Graph learning, when used as a semi-supervised learning (SSL) method, performs well for classification tasks with a low label rate. We provide a graph-based batch active learning pipeline for pixel/patch-neighborhood multi- or hyperspectral image segmentation. Our batch active learning approach selects a collection of unlabeled pixels that satisfy a graph local-maximum constraint on the active learning acquisition function, which determines the relative importance of each pixel to the classification. This work builds on recent advances in the design of novel active learning acquisition functions (e.g., the Model Change approach in arXiv:2110.07739) while adding important further developments, including patch-neighborhood image analysis and batch active learning methods, to further increase the accuracy and greatly increase the computational efficiency of these methods. In addition to improving accuracy, our approach can greatly reduce the number of labeled pixels needed to achieve a given accuracy level compared with randomly selected labeled pixels.
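A minimal sketch of the batch selection rule described above, with a random placeholder in place of a real acquisition function such as Model Change:

```python
# Sketch of the batch selection rule: pick unlabeled nodes whose acquisition
# value is a local maximum over their graph neighborhood. The acquisition
# scores here are random placeholders, not the paper's acquisition function.

import numpy as np

def local_max_batch(adj, acquisition, labeled):
    """adj: boolean adjacency matrix (n x n); acquisition: score per node;
    labeled: boolean mask of already-labeled nodes. Returns the batch of
    unlabeled nodes whose score is >= all of their neighbors' scores."""
    batch = []
    for i in range(len(acquisition)):
        if labeled[i]:
            continue
        neighbors = np.flatnonzero(adj[i])
        if acquisition[i] >= acquisition[neighbors].max(initial=-np.inf):
            batch.append(i)
    return batch

rng = np.random.default_rng(0)
n = 12
adj = rng.random((n, n)) < 0.3
adj = adj | adj.T                     # symmetrize
np.fill_diagonal(adj, False)          # no self-loops
scores = rng.random(n)                # placeholder acquisition values
labeled = np.zeros(n, dtype=bool)
labeled[:2] = True
print("batch to label:", local_max_batch(adj, scores, labeled))
```

The local-maximum constraint keeps selected pixels spread out over the graph, which is what lets a whole batch be labeled per query round instead of one pixel at a time.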
Refined 3D modeling of mine slopes is pivotal for precise prediction of geological hazards. Addressing the inadequacy of existing single modeling methods in comprehensively representing both the overall and the localized characteristics of mining slopes, this study introduces a new method that fuses model data from unmanned aerial vehicle (UAV) tilt photogrammetry and 3D laser scanning through a control-point-based data alignment algorithm. First, the mini-batch K-Medoids algorithm is used to cluster the point cloud data from ground 3D laser scanning. Then, the elbow rule is applied to determine the optimal cluster number (K0), and the feature points are extracted. Next, the nearest-neighbor point algorithm is employed to match the feature points obtained from UAV tilt photogrammetry, and the internal point coordinates are adjusted through distance-weighted averaging to construct a 3D model. Finally, in an engineering case study, the K0 value is determined to be 8, with a matching accuracy between the two model datasets ranging from 0.0669 to 1.0373 mm. Compared with a modeling method based on the plain K-Medoids clustering algorithm, the new method significantly enhances computational efficiency, the accuracy of selecting the optimal number of feature points in 3D laser scanning, and the precision of the 3D model derived from UAV tilt photogrammetry. This method provides a research foundation for constructing mine slope models.
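One plausible reading of the matching-and-adjustment step, sketched below under stated assumptions: feature points from the two sources are matched by nearest neighbor, and interior points are then shifted by an inverse-distance-weighted average of the matched offsets. The paper's exact weighting scheme may differ.

```python
# Sketch of nearest-neighbor feature matching followed by inverse-distance-
# weighted adjustment of interior points. The matching rule, k, and weights
# are illustrative assumptions, not the paper's calibrated procedure.

import numpy as np
from scipy.spatial import cKDTree

def idw_adjust(internal_pts, scan_feats, uav_feats, k=4, eps=1e-9):
    """Match each scan feature point to its nearest UAV feature point, then
    shift every internal point by the inverse-distance-weighted average of
    the offsets at its k nearest feature points."""
    tree = cKDTree(uav_feats)
    _, idx = tree.query(scan_feats)             # nearest-neighbor matching
    offsets = uav_feats[idx] - scan_feats       # per-feature correction
    ftree = cKDTree(scan_feats)
    dist, nn = ftree.query(internal_pts, k=k)
    w = 1.0 / (dist + eps)                      # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)
    return internal_pts + (w[..., None] * offsets[nn]).sum(axis=1)

rng = np.random.default_rng(1)
scan_feats = rng.random((50, 3))
uav_feats = scan_feats + 0.002                  # constant misalignment
internal = rng.random((5, 3))
adjusted = idw_adjust(internal, scan_feats, uav_feats)
print(np.allclose(adjusted, internal + 0.002))  # True: offset recovered
```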
This paper discusses a queueing system with a retrial orbit and batch service, in which the number of customer rooms in the queue is finite while the retrial orbit is infinite. When the server starts serving, it serves all customers in the queue in a single batch, the so-called batch service. If a new or retrial customer finds all the customer rooms occupied, he decides whether or not to join the retrial orbit. Using the censoring technique and the matrix-analytic method, we first obtain the decay function of the stationary distribution for the number of customers in the retrial orbit and the number of customers in the queue. Then, based on the form of the decay rate function and the Karamata Tauberian theorem, we obtain the exact tail asymptotics of the stationary distribution.
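A toy discrete-event simulation can make the model concrete; the rates, join probability, and service-discipline details below are illustrative assumptions, not the paper's exact model:

```python
# Toy simulation of a batch-service queue with N waiting rooms and an
# infinite retrial orbit: Poisson arrivals, exponential batch service, and
# per-customer exponential retrials. All parameters are illustrative.

import random

def simulate(lam=1.0, mu=0.8, theta=0.3, N=5, p_join=0.7, horizon=100000, seed=0):
    rng = random.Random(seed)
    t, queue, orbit, busy = 0.0, 0, 0, False
    orbit_time = {}                              # time-weighted orbit histogram
    while t < horizon:
        rates = [lam, orbit * theta, mu if busy else 0.0]
        total = sum(rates)
        dt = rng.expovariate(total)
        orbit_time[orbit] = orbit_time.get(orbit, 0.0) + dt
        t += dt
        u = rng.random() * total
        if u < rates[0]:                         # new arrival
            if queue < N:
                queue += 1
            elif rng.random() < p_join:
                orbit += 1                       # rooms full: join orbit
        elif u < rates[0] + rates[1]:            # retrial attempt succeeds
            if queue < N:
                queue += 1
                orbit -= 1
        else:                                    # batch service completes
            busy = False
        if not busy and queue > 0:               # serve the whole queue
            busy, queue = True, 0
    total_t = sum(orbit_time.values())
    return {k: v / total_t for k, v in sorted(orbit_time.items())}

dist = simulate()
for k in list(dist)[:6]:
    print(f"P(orbit size = {k}) ~ {dist[k]:.4f}")
```

Plotting the simulated orbit-size probabilities on a log scale gives an empirical view of the geometric-type decay that the paper characterizes exactly.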
The lethal brain tumor glioblastoma (GBM) has the propensity to grow over time. To improve patient outcomes, it is essential to classify GBM accurately and promptly so that a focused and individualized treatment plan can be provided. Deep learning methods, particularly Convolutional Neural Networks (CNNs), have demonstrated high accuracy in a myriad of medical image analysis applications as a result of recent technical breakthroughs. The overall aim of this research is to investigate how CNNs can be used to classify GBM from medical imaging data, to improve prognosis precision and effectiveness. The study presents a methodology that uses a CNN architecture trained on a database of MRI images containing this tumor. The constructed model is assessed on its overall performance, and extensive experiments and comparisons with conventional machine learning techniques and existing classification methods are made. The possibility of early and accurate prediction in a clinical workflow is emphasized, because it can have a large impact on treatment planning and patient outcomes. The paramount objective is not only to address the classification challenge but also to outline a clear pathway towards enhancing prognosis precision and treatment effectiveness.
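A minimal CNN sketch of the kind of classifier investigated here, assuming single-channel 128×128 MRI slices and a binary GBM / non-GBM label; the architecture is illustrative rather than the paper's model:

```python
# Minimal CNN sketch for binary classification of MRI slices. Input shape,
# depth, and hyperparameters are illustrative assumptions, not the paper's.

import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 16 * 16, n_classes)

    def forward(self, x):
        x = self.features(x)                 # (B, 64, 16, 16) for 128x128 input
        return self.classifier(x.flatten(1))

model = SmallCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a synthetic batch standing in for MRI data.
x = torch.randn(8, 1, 128, 128)
y = torch.randint(0, 2, (8,))
loss = loss_fn(model(x), y)
opt.zero_grad()
loss.backward()
opt.step()
print("loss:", loss.item())
```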
In March 2024, a large number of participants from along the surfactant industrial chain attended the "2024 Chinese Surfactant Industrial Meeting", also the 2024 (2nd) Chinese International Surfactant Industrial Expo, held from March 25 to March 28 to explore new possibilities in the industry. The event was hosted by the China Research Institute of Daily Chemical Industry and the National Engineering Research Center for Surfactant (NERCS) and organized by the Productivity Promotion Centre for the Surfactant and Detergent Industry and the China Daily Chemical Industry Information Center, with special support from the China Quality Mark Certification Group.
In order to implement the spirit of the Central Economic Work Conference and systematically promote standardization work for commercial aerospace, SAC/TC 425, Space technology and operation, recently established three working groups for emerging fields and strategic emerging industries related to commercial aerospace. The scope of WG 1, on commercial aircraft launching, covers the research, development, and revision of standards for operation support, processes, industrial chains, and other fields of launching; it is also responsible for keeping up to date with standards documents published by the corresponding international standardization organizations for further analysis and adoption. WG 2, on the application of satellite internet, is responsible for the research, development, and revision of standards in areas such as the application scenarios and demands, functions and processes, and interfaces and data formats of satellite internet. Its first batch of members consists of 29 service providers, operators, equipment manufacturers, and parties representing public interests in this field.
This paper addresses the scheduling problem involving batch processing machines, which is also known as parallel batching in the literature. The presented mixed integer programming formulation first provides an elegant model for the problem under study. Furthermore, it enables solutions to problem instances beyond the capability of the exact methods developed so far. In order to alleviate the computational burden, the authors propose MIP-based heuristic approaches that balance solution quality and computing time.
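For reference, a generic textbook-style MIP for makespan minimization on a single batch processing machine with job sizes and a capacity limit (not the paper's formulation) can be written as follows:

```python
# Generic MIP for batching with makespan objective: assign each job to a
# batch, respect the capacity B, force each batch's processing time to cover
# its longest job, and minimize the sum of batch times. This is a textbook
# sketch, not the paper's formulation. Requires the PuLP package.

import pulp

p = [3, 5, 2, 7, 4]            # processing times
s = [2, 3, 2, 4, 1]            # job sizes
B = 6                          # machine capacity
n = len(p)
batches = range(n)             # at most n batches are ever needed

prob = pulp.LpProblem("parallel_batching", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (range(n), batches), cat="Binary")
P = pulp.LpVariable.dicts("P", batches, lowBound=0)

prob += pulp.lpSum(P[b] for b in batches)                       # makespan
for j in range(n):
    prob += pulp.lpSum(x[j][b] for b in batches) == 1           # assign once
for b in batches:
    prob += pulp.lpSum(s[j] * x[j][b] for j in range(n)) <= B   # capacity
    for j in range(n):
        prob += P[b] >= p[j] * x[j][b]                          # batch time = max p

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", pulp.value(prob.objective))
for b in batches:
    jobs = [j for j in range(n) if x[j][b].value() > 0.5]
    if jobs:
        print(f"batch {b}: jobs {jobs}, time {P[b].value()}")
```

Formulations like this grow quickly with the number of jobs, which is exactly why the paper pairs the exact model with MIP-based heuristics for larger instances.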