A DMVOCC-MVDA (distributed multiversion optimistic concurrency control with multiversion dynamic adjustment) protocol was presented to process mobile distributed real-time transactions in mobile broadcast environments. At the mobile hosts, all transactions perform local pre-validation. The local pre-validation process is carried out against the transactions committed at the server in the last broadcast cycle. Transactions that survive local pre-validation must be submitted to the server for final validation. The new protocol eliminates conflicts between mobile read-only and mobile update transactions, and resolves data conflicts flexibly by using multiversion dynamic adjustment of the serialization order to avoid unnecessary transaction restarts. Mobile read-only transactions can be committed without blocking, and the response time of mobile read-only transactions is greatly shortened. The tolerance of mobile transactions to disconnections from the broadcast channel is also increased. In global validation, mobile distributed transactions must perform checks to ensure distributed serializability across all participants. The simulation results show that the proposed concurrency control protocol offers better performance than other protocols in terms of miss rate, restart rate, and commit rate. Under a high workload (think time of 1 s), the miss rate of DMVOCC-MVDA is only 14.6%, significantly lower than that of other protocols. The restart rate of DMVOCC-MVDA is only 32.3%, showing that it can effectively reduce the restart rate of mobile transactions, and its commit rate reaches 61.2%, clearly higher than that of other protocols.
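As a minimal illustration of the backward pre-validation step described above, the Python sketch below checks a local transaction's read set against the write sets of transactions committed in the last broadcast cycle. The data structures, function name, and version-number comparison are assumptions chosen for illustration, not the authors' actual DMVOCC-MVDA implementation.

```python
# Minimal sketch of local pre-validation at a mobile host (illustrative
# only; names and structures are assumptions, not the paper's code).

def pre_validate(read_set, last_cycle_committed):
    """Backward-validate a local transaction against the write sets of
    transactions committed at the server in the last broadcast cycle.

    read_set            : dict mapping data item -> version read locally
    last_cycle_committed: list of dicts, each mapping item -> new version
    Returns True if no read is stale, i.e. the transaction survives
    pre-validation and may be submitted for final validation.
    """
    for write_set in last_cycle_committed:
        for item, version_read in read_set.items():
            # A conflict exists if a committed writer installed a newer
            # version of an item this transaction has already read.
            if item in write_set and write_set[item] > version_read:
                return False
    return True

# Example: T read x at version 3, but a committed transaction wrote x v4.
print(pre_validate({"x": 3}, [{"x": 4}]))  # False -> restart locally
print(pre_validate({"x": 4}, [{"y": 7}]))  # True  -> submit to server
```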
Recovery performance in the event of failures is very important for distributed real-time database systems. This paper presents a time-cognizant logging-based crash recovery scheme (TCLCRS) aimed at distributed real-time databases that adopt a main-memory database as their ground support. In our scheme, each site maintains a real-time log for local transactions and for the subtransactions that execute at the site, and performs local checkpointing independently. Log records are stored in non-volatile high-speed storage, which is divided into four partitions based on transaction classes. During restart recovery after a site crash, a partitioned crash recovery strategy is adopted to ensure that the site can be brought up before the entire local secondary database is reloaded into main memory. The partitioned crash recovery strategy not only guarantees that internal consistency is recovered, but also guarantees temporal consistency and recovery of the states of the physical world influenced by uncommitted transactions. Combined with the two-phase commit protocol, TCLCRS can guarantee failure atomicity of distributed real-time transactions.
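The sketch below illustrates the idea of a four-way partitioned log in which one critical partition is replayed first so the site can come up before the full database reload. The partition names are hypothetical stand-ins, since the abstract does not name the transaction classes, and an in-memory list stands in for the non-volatile store.

```python
# Illustrative sketch of a four-way partitioned real-time log; the actual
# TCLCRS partition criteria are not spelled out in the abstract.

PARTITIONS = ("class_0", "class_1", "class_2", "class_3")  # assumed names

class PartitionedLog:
    def __init__(self):
        self.partitions = {p: [] for p in PARTITIONS}

    def append(self, partition, record):
        # In TCLCRS the records go to non-volatile high-speed storage;
        # here a plain in-memory list stands in for that store.
        self.partitions[partition].append(record)

    def recover(self, critical_first="class_0"):
        # Partitioned recovery: replay the critical partition first so the
        # site can come up before the whole database is reloaded.
        yield from self.partitions[critical_first]
        for p in PARTITIONS:
            if p != critical_first:
                yield from self.partitions[p]

log = PartitionedLog()
log.append("class_0", ("T1", "write", "x", 42))
log.append("class_2", ("T7", "write", "y", 9))
print(list(log.recover()))  # critical partition's records come out first
```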
This paper formally defines and analyses a new notion of correctness called quasi serializability, and then outlines a corresponding concurrency control protocol, QDHP, for distributed real-time databases. Finally, through a series of simulation studies, it shows that with the new concurrency control protocol the performance of distributed real-time databases can be much improved.
In parallel real-time database systems, concurrency control protocols must satisfy time constraints as well as integrity constraints. The authors present a validation concurrency control (VCC) protocol, which enhances the performance of the real-time concurrency control mechanism by reducing the number of transactions that might miss their deadlines, and compare its performance with that of the HP2PL (high-priority two-phase locking) and OCC-TI-WAIT-50 (optimistic concurrency control-time interval-wait-50) protocols under a shared-disk architecture by simulation. The simulation results reveal that the presented protocol can effectively reduce the number of restarted transactions that might miss their deadlines, and performs better than HP2PL and OCC-TI-WAIT-50. It works well when the transaction arrival rate is below a threshold; however, due to resource contention, the percentage of missed deadlines increases sharply when the arrival rate exceeds that threshold.
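The abstract does not give the exact VCC validation test, so the sketch below shows textbook backward validation, the kind of check a validation-phase protocol performs before commit; the function and variable names are hypothetical.

```python
# Generic optimistic validation test, sketched to illustrate the kind of
# check done in a validation phase; this is textbook backward validation,
# not necessarily the exact VCC test.

def validate(trans_read_set, committed_write_sets):
    """Return True if none of the items read by the validating transaction
    were overwritten by transactions that committed during its read phase."""
    overwritten = set().union(*committed_write_sets) if committed_write_sets else set()
    return not (set(trans_read_set) & overwritten)

print(validate({"a", "b"}, [{"c"}, {"d"}]))  # True: no overlap, commit
print(validate({"a", "b"}, [{"b"}]))         # False: conflict, restart
```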
Most of the proposed concurrency control protocols for real-time database systems are based on the serializability theorem. Owing to the unique characteristics of real-time database applications and the importance of satisfying the timing constraints of transactions, serializability is too strong as a correctness criterion and is not suitable for real-time databases in most cases. On the other hand, relaxed notions of serializability, including epsilon serializability and similarity serializability, can allow more real-time transactions to satisfy their timing constraints, but database consistency may be sacrificed to some extent. We thus propose the use of weak serializability (WSR), which is more relaxed than conflict serializability while database consistency is maintained. In this paper, we first formally define the new notion of correctness called weak serializability. After the necessary and sufficient conditions for weak serializability are shown, a corresponding concurrency control protocol, WDHP (weak serializable distributed high-priority protocol), is outlined for distributed real-time databases, in which a new lock mode called the mask lock mode is proposed to simplify the condition for global consistency. Finally, through a series of simulation studies, it is shown that with the new concurrency control protocol the performance of distributed real-time databases can be greatly improved.
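The abstract introduces the mask lock mode without specifying its compatibilities. To make the notion of an extra lock mode concrete, the sketch below extends a shared/exclusive compatibility table with a hypothetical "MASK" mode; the compatibility entries are assumptions chosen for illustration, not the paper's actual rules.

```python
# Lock compatibility table extended with a hypothetical "mask" mode.
# S = shared (read), X = exclusive (write); the MASK row/column values
# are assumed for illustration only.
COMPAT = {
    #        requested: S          X           MASK
    "S":    {"S": True,  "X": False, "MASK": True},
    "X":    {"S": False, "X": False, "MASK": False},
    "MASK": {"S": True,  "X": False, "MASK": True},
}

def can_grant(held_modes, requested):
    """Grant the requested mode only if it is compatible with every mode
    currently held on the item."""
    return all(COMPAT[h][requested] for h in held_modes)

print(can_grant(["S", "MASK"], "S"))  # True under the assumed table
print(can_grant(["X"], "MASK"))       # False: exclusive blocks everything
```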
Deep engineering disasters, such as rockbursts and collapses, are closely related to the shear slip of rock joints. A novel multifunctional device was developed to study the shear failure mechanism in rocks. Using this device, the complete shear-deformation process and long-term shear creep tests can be performed on rocks under constant normal stiffness (CNS) or constant normal loading (CNL) conditions in real time at high temperature and true-triaxial stress. During the research and development process, five key technical challenges were overcome: (1) the ability to perform true-triaxial compression-shear loading tests on rock samples with high stiffness; (2) a shear box with ultra-low friction throughout the entire stress space of the rock sample during loading; (3) a control system capable of maintaining high stress for a long time while also responding rapidly to the brittle fracture of a rock sample; (4) a refined ability to measure the volumetric deformation of rock samples subjected to true-triaxial shearing; and (5) a heating system capable of maintaining uniform heating of the rock sample over a long time. By developing these technologies, loading under high true-triaxial stress conditions was realized. The apparatus has a maximum normal stiffness of 1000 GPa/m and a maximum operating temperature of 300 °C, and differences in the surface temperature of the sample are held to within 5 °C. Five types of true-triaxial shear tests were conducted on homogeneous sandstone to verify that the apparatus has good performance and reliability. The results show that temperature, lateral stress, normal stress and time influence the shear deformation, failure mode and strength of the sandstone. The novel apparatus can be reliably used to conduct true-triaxial shear tests on rocks subjected to high temperatures and stress.
All-solid-state batteries (ASSBs) are a class of safer and higher-energy-density devices compared to conventional batteries, of which solid-state electrolytes (SSEs) are the essential components. To date, investigations searching for high-ion-conducting solid-state electrolytes have attracted broad attention. However, obtaining SSEs with high ionic conductivity is challenging due to complex structural information and the less-explored structure-performance relationship. To address these challenges, developing a database containing typical SSEs from available experimental reports offers a new avenue for understanding structure-performance relationships and deriving design guidelines for rational SSEs. Herein, a dynamic experimental database containing >600 materials was developed over a wide range of temperatures (132.40–1261.60 K), including mono- and divalent cations (e.g., Li^(+), Na^(+), K^(+), Ag^(+), Ca^(2+), Mg^(2+), and Zn^(2+)) and various types of anions (e.g., halide, hydride, sulfide, and oxide). Data mining was conducted to explore the relationships among different variates (e.g., transport ion, composition, activation energy, and conductivity). Overall, we expect that this database can provide essential guidelines for the design and development of high-performance SSEs in ASSB applications. The database is dynamically updated and can be accessed via our open-source online system.
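One relationship such a database supports exploring is the Arrhenius law linking ionic conductivity and activation energy, sigma*T = A*exp(-Ea/(kB*T)). The sketch below fits Ea from synthetic (made-up) conductivity data; it illustrates the kind of data mining described, not the authors' actual pipeline.

```python
# Fit an activation energy from conductivity-vs-temperature data using the
# Arrhenius relation; the data here are synthetic, for illustration only.
import numpy as np

kB = 8.617333262e-5  # Boltzmann constant, eV/K

T = np.array([300.0, 350.0, 400.0, 450.0])       # temperatures, K
Ea_true, A = 0.30, 1.0e5                          # assumed values
sigma = (A / T) * np.exp(-Ea_true / (kB * T))     # conductivity, synthetic

# Linear fit of ln(sigma*T) against 1/T: slope = -Ea/kB.
slope, intercept = np.polyfit(1.0 / T, np.log(sigma * T), 1)
print(f"fitted Ea = {-slope * kB:.3f} eV")        # recovers ~0.300 eV
```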
Analyzing polysorbate 20 (PS20) composition and the impact of each component on stability and safety is crucial due to formulation variations and individual tolerance. The similar structures and polarities of PS20 components make accurate separation, identification, and quantification challenging. In this work, a high-resolution quantitative method was developed using single-dimensional high-performance liquid chromatography (HPLC) with charged aerosol detection (CAD) to separate 18 key components with multiple esters. The separated components were characterized by ultra-high-performance liquid chromatography-quadrupole time-of-flight mass spectrometry (UHPLC-Q-TOF-MS) with a gradient identical to that of the HPLC-CAD analysis. The polysorbate compound database and library were expanded more than 7-fold compared with the commercial database. The method investigated differences in PS20 samples of various origins and grades for different dosage forms to evaluate the composition-process relationship. UHPLC-Q-TOF-MS identified 1329 to 1511 compounds in 4 batches of PS20 from different sources. The method revealed the impact of 4 degradation conditions on peak components, identifying stable components and their tendencies to change. The HPLC-CAD and UHPLC-Q-TOF-MS results provided insights into fingerprint differences, distinguishing quasi-products.
The EU's Artificial Intelligence Act (AI Act) imposes requirements for the privacy compliance of AI systems. AI systems must comply with privacy laws such as the GDPR when providing services. These laws provide users with the right to issue a Data Subject Access Request (DSAR). Responding to such requests requires database administrators to accurately identify information related to an individual. However, manual compliance poses significant challenges and is error-prone, as database administrators must write queries through time-consuming labor. The demand for large amounts of data by AI systems has driven the development of NoSQL databases, and the flexible schema of NoSQL databases makes identifying personal information even more challenging. This paper develops an automated tool to identify personal information that can help organizations respond to DSARs. Our tool employs a combination of technologies, including schema extraction of NoSQL databases and relationship identification from query logs. We describe the algorithm used by our tool, detailing how it discovers and extracts implicit relationships from NoSQL databases and generates relationship graphs to help developers accurately identify personal data. We evaluate our tool on three datasets covering different database designs, achieving an F1 score of 0.77 to 1. Experimental results demonstrate that our tool successfully identifies information relevant to the data subject. Our tool reduces manual effort and simplifies GDPR compliance, showing practical application value in enhancing the privacy performance of NoSQL databases and AI systems.
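A minimal sketch of the schema-extraction step for schemaless documents is shown below: it walks each record and accumulates the set of types seen at every field path, which is the raw material for spotting candidate personal-data fields. The function and field names are illustrative, not the paper's tool.

```python
# Infer a schema from schemaless (NoSQL-style) documents by recording the
# types observed at each field path; illustrative, not the paper's tool.

def extract_schema(documents):
    schema = {}

    def walk(obj, prefix=""):
        if isinstance(obj, dict):
            for key, val in obj.items():
                walk(val, f"{prefix}.{key}" if prefix else key)
        else:
            schema.setdefault(prefix, set()).add(type(obj).__name__)

    for doc in documents:
        walk(doc)
    return schema

docs = [
    {"user": {"id": 1, "email": "a@example.com"}},
    {"user": {"id": 2, "phone": "555-0100"}},  # flexible schema: new field
]
for path, types in extract_schema(docs).items():
    print(path, sorted(types))
# user.email / user.phone may hold personal data -> DSAR candidates
```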
A real-time data processing system was designed for the carbon dioxide dispersion interferometer (CO_(2)-DI) on EAST. The system utilizes the parallel and pipelining capabilities of a field-programmable gate array (FPGA) to digitize and process the intensity signals from the detector. Finally, the real-time electron density signals are exported through a digital-to-analog converter (DAC) module in the form of analog signals. The system has been successfully applied in the CO_(2)-DI system to provide low-latency electron density input to the plasma control system on EAST. Experimental results from the latest campaign of long-pulse discharges on EAST (2022–2023) demonstrate that the system responds effectively to rapid density changes, proving its reliability and accuracy for future electron density calculation.
The co-frequency vibration fault is one of the common faults in the operation of rotating equipment, and real-time diagnosis of co-frequency vibration faults is of great significance for monitoring the health state of the equipment and suppressing its vibration. In engineering scenarios, co-frequency vibration faults manifest at the rotational frequency and are therefore difficult to identify, while existing intelligent methods require demanding hardware and are excessively time-consuming. Therefore, a lightweight convolutional neural network (LW-CNN) algorithm is proposed in this paper to achieve real-time fault diagnosis. The critical parameters of the sliding-window data augmentation method are discussed and verified using simulated and experimental signals. Based on the LW-CNN and data augmentation, real-time intelligent diagnosis of co-frequency faults is realized. Moreover, a real-time detection method is proposed that covers the whole chain from data acquisition to fault diagnosis. Experiments verify that the LW-CNN and sliding-window methods achieve both high accuracy and real-time performance.
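Sliding-window augmentation is the standard way to turn one long 1-D vibration record into many fixed-length, partially overlapping training samples. In the sketch below, the window length and stride are placeholders, not the paper's tuned settings.

```python
# Sliding-window augmentation for 1-D signals: one long record becomes
# many overlapping fixed-length samples. Window and stride values are
# placeholders, not the paper's tuned parameters.
import numpy as np

def sliding_windows(signal, window, stride):
    n = (len(signal) - window) // stride + 1
    return np.stack([signal[i * stride : i * stride + window] for i in range(n)])

sig = np.sin(2 * np.pi * 50 * np.arange(0, 1.0, 1e-4))  # synthetic 50 Hz tone
samples = sliding_windows(sig, window=1024, stride=256)
print(samples.shape)  # (36, 1024): 36 training samples from one record
```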
To address the impact of wind-power fluctuations on the stability of power systems, we propose a comprehensive approach that integrates multiple strategies and methods to enhance the efficiency and reliability of a system. First, we employ a strategy that restricts long- and short-term power output deviations to smooth wind-power fluctuations in real time. Second, we adopt the sliding-window instantaneous complete ensemble empirical mode decomposition with adaptive noise (SW-ICEEMDAN) strategy to achieve real-time decomposition of the energy storage power, facilitating internal power distribution within the hybrid energy storage system. Finally, we introduce a rule-based multi-fuzzy control strategy for secondary adjustment of the initial power allocation commands for the different energy storage components. Through simulation validation, we demonstrate that the proposed comprehensive control strategy can smooth wind-power fluctuations in real time and decompose the energy storage power. Compared with traditional empirical mode decomposition (EMD), ensemble empirical mode decomposition (EEMD), and complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) decomposition strategies, the configuration of the energy storage system under the SW-ICEEMDAN control strategy is more optimal. Additionally, the state of charge of the energy storage components fluctuates within a reasonable range, enhancing the stability of the power system and ensuring secure operation of the energy storage system.
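The decomposition assigns fast fluctuations to a fast storage device (e.g., a supercapacitor) and slow ones to a slow device (e.g., a battery). A real ICEEMDAN implementation is lengthy, so in the sketch below a simple moving-average low-pass split stands in purely to show the allocation pattern; it is not the SW-ICEEMDAN algorithm.

```python
# Stand-in for frequency-based power splitting in a hybrid storage system:
# a moving-average low-pass filter separates slow and fast components.
import numpy as np

def split_power(p_storage, window=30):
    kernel = np.ones(window) / window
    p_low = np.convolve(p_storage, kernel, mode="same")  # slow component
    p_high = p_storage - p_low                           # fast residual
    return p_low, p_high

rng = np.random.default_rng(0)
p = np.cumsum(rng.normal(0, 1, 600))   # synthetic storage power command
p_batt, p_sc = split_power(p)          # battery vs supercapacitor share
assert np.allclose(p_batt + p_sc, p)   # the split conserves total power
print(p_batt[:3], p_sc[:3])
```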
Database systems have consistently been prime targets for cyber-attacks and threats due to the critical nature of the data they store. Despite the increasing reliance on database management systems, this field continues to face numerous cyber-attacks. Database management systems serve as the foundation of any information system or application, and any cyber-attack can result in significant damage to the database system and loss of sensitive data. Consequently, cyber risk classifications and assessments play a crucial role in risk management and establish an essential framework for identifying and responding to cyber threats. Risk assessment aids in understanding the impact of cyber threats and developing appropriate security controls to mitigate risks. The primary objective of this study is to conduct a comprehensive analysis of cyber risks in database management systems, including classifying threats, vulnerabilities, impacts, and countermeasures. This classification helps to identify suitable security controls to mitigate cyber risks for each type of threat. Additionally, this research explores technical countermeasures to protect database systems from cyber threats. The study employs the content analysis method to collect, analyze, and classify data in terms of types of threats, vulnerabilities, and countermeasures. The results indicate that SQL injection attacks and denial-of-service (DoS) attacks were the most prevalent technical threats in database systems, each accounting for 9% of incidents. Vulnerable audit trails, intrusion attempts, and ransomware attacks were classified as the second level of technical threats in database systems, comprising 7% and 5% of incidents, respectively. Furthermore, the findings reveal that insider threats were the most common non-technical threats in database systems, accounting for 5% of incidents. Moreover, the results indicate that weak authentication, unpatched databases, weak audit trails, and multiple usage of an account were the most common technical vulnerabilities in database systems, each accounting for 9% of vulnerabilities. Additionally, software bugs, insecure coding practices, weak security controls, insecure networks, password misuse, weak encryption practices, and weak data masking were classified as the second level of security vulnerabilities in database systems, each accounting for 4% of vulnerabilities. The findings from this work can assist organizations in understanding the types of cyber threats and developing robust strategies against cyber-attacks.
This review explores glucose monitoring and management strategies, emphasizing the need for reliable and user-friendly wearable sensors as the next generation of sensors for continuous glucose detection. In addition, it examines key strategies for designing glucose sensors that are multi-functional, reliable, and cost-effective in a variety of contexts. The unique features of effective diabetes-management technology are highlighted, with a focus on nano/biosensor devices that can quickly and accurately detect glucose levels in the blood, improving patient treatment and control of potential diabetes-related infections. The potential of next-generation wearable and touch-sensitive nano-biomedical sensor designs for providing full control in implantable, continuous glucose monitoring is also explored. The challenges of standardizing drug or insulin delivery doses, low-cost real-time detection of elevated blood sugar levels in diabetics, and early digital health awareness controls for the adverse effects of injectable medication are identified as unmet needs. The market for biosensors is also expected to expand significantly due to the rising need for portable diagnostic equipment and an ever-increasing diabetic population. The paper concludes by emphasizing the need for further research and development of glucose biosensors to meet the stringent sensitivity and specificity requirements imposed by clinical diagnostics while being cost-effective, stable, and durable.
The real-time detection and instance segmentation of strawberries constitute fundamental components in the development of strawberry-harvesting robots, and real-time identification of strawberries in an unstructured environment is a challenging task. Current instance segmentation algorithms for strawberries suffer from issues such as poor real-time performance and low accuracy. To this end, the present study proposes an Efficient YOLACT (E-YOLACT) algorithm for strawberry detection and segmentation based on the YOLACT framework. The key enhancements of E-YOLACT encompass the development of a lightweight attention mechanism, pyramid squeeze shuffle attention (PSSA), for efficient feature extraction. Additionally, an attention-guided context feature pyramid network (AC-FPN) is employed instead of an FPN to optimize the architecture's performance. Furthermore, a feature-enhanced model (FEM) is introduced to enhance the prediction head's capabilities, while efficient fast non-maximum suppression (EF-NMS) is devised to improve non-maximum suppression. The experimental results demonstrate that E-YOLACT achieves a box-mAP of 77.9 and a mask-mAP of 76.6 on the custom dataset, and exhibits an impressive category accuracy of 93.5%. Notably, E-YOLACT also demonstrates a remarkable real-time detection capability at 34.8 FPS. The method proposed in this article presents an efficient approach for the vision system of a strawberry-picking robot.
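The details of EF-NMS are not given in the abstract; the sketch below shows plain greedy NMS, the baseline step that such variants refine, to make the suppression stage concrete.

```python
# Plain greedy non-maximum suppression on axis-aligned boxes; the baseline
# that variants like EF-NMS improve on, shown here for illustration.
import numpy as np

def nms(boxes, scores, iou_thr=0.5):
    """boxes: (N,4) as [x1,y1,x2,y2]; returns indices of kept boxes."""
    order = np.argsort(scores)[::-1]  # highest score first
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        # Intersection of the top box with all remaining boxes.
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_o = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                 (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + area_o - inter)
        order = order[1:][iou <= iou_thr]  # drop heavily overlapping boxes
    return keep

b = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]], float)
print(nms(b, np.array([0.9, 0.8, 0.7])))  # [0, 2]: overlapping box dropped
```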
The composite time scale (CTS) provides a stable, accurate, and reliable time scale for modern society. Improving the CTS's real-time performance improves its stability, which in turn strengthens the performance of related applications. Aiming at this goal, a method based on determining the optimal calculation interval and accelerating the adjustment stage is proposed in this paper. The determinants of the CTS's calculation interval (the characteristics of the clock ensemble, the measurement noise, the noise of the time and frequency synchronization system, and the auxiliary output generator noise floor) are studied, and the optimal calculation interval is obtained. We also investigate the effect of the ensemble algorithm's initial parameters on the CTS's adjustment stage, and design a strategy for obtaining reasonable initial parameters for the ensemble algorithm. The results show that with reasonable initial parameters the adjustment stage can be finished rapidly, or even shortened to zero. On this basis, we experimentally generate a distributed CTS with a calculation interval of 500 s whose stability outperforms those of the member clocks when the averaging time is longer than 1700 s. The experimental results prove that the CTS's real-time performance is significantly improved.
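At the heart of any composite time scale is a weighted average of the member clocks' deviations, computed once per calculation interval. The toy version below uses inverse-variance weights, a common textbook choice; it only illustrates the ensemble step, not the paper's algorithm or its initial-parameter strategy.

```python
# Toy ensemble step for a composite time scale: inverse-variance-weighted
# average of clock deviations from their predictions (illustrative only).
import numpy as np

def ensemble_time(readings, predictions, variances):
    """readings, predictions, variances: arrays over member clocks.
    Returns the ensemble correction at one calculation interval."""
    w = (1.0 / variances) / np.sum(1.0 / variances)  # inverse-variance weights
    return np.sum(w * (readings - predictions))

x = np.array([12.3e-9, 11.8e-9, 12.6e-9])   # clock-minus-reference, s
xp = np.array([12.0e-9, 12.0e-9, 12.0e-9])  # predicted deviations, s
var = np.array([1.0, 4.0, 2.0]) * 1e-20     # assumed clock noise variances
print(f"ensemble correction: {ensemble_time(x, xp, var):.2e} s")
```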
In recent years, frequent fire disasters have led to enormous damage in China. Effective firefighting rescues can minimize the losses caused by fires. During rescue processes, the travel time of fire trucks can be severely affected by traffic conditions, changing the effective coverage of fire stations. However, it is still challenging to determine the effective coverage of fire stations under dynamic traffic conditions. This paper addresses this issue by combining a travel-time calculation model with an effective-coverage simulation model. In addition, it proposes a new index, the total effective coverage area (TECA), based on the time-weighted average of the effective coverage area (ECA), to evaluate urban fire services. A fire station (FS-JX) in Changsha, China, is selected as the case study to validate the feasibility of the models. FS-JX and its surrounding 9,117 fire risk points are taken as the fire-service supply and demand points, respectively. A total of 196 simulation scenarios throughout a consecutive week are analyzed, eventually yielding 1,933,815 sets of valid sample data. The results show that the TECA of FS-JX is 3.27 km^(2), far below the standard requirement of 7.00 km^(2), due to traffic conditions. The visualization results show that three rivers around FS-JX interrupt the continuity of its effective coverage. The proposed method can provide data support for optimizing the locations of fire stations by accurately and dynamically determining their effective coverage.
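TECA as described is a time-weighted average of the ECA across scenarios. The worked example below uses invented durations and areas purely to show the formula; the paper's 196 weekly scenarios would enter the same computation.

```python
# TECA = time-weighted average of the effective coverage area (ECA).
# The periods and areas below are invented, for illustration only.

def teca(eca_by_period):
    """eca_by_period: list of (duration_hours, eca_km2) pairs."""
    total_time = sum(d for d, _ in eca_by_period)
    return sum(d * a for d, a in eca_by_period) / total_time

periods = [(6, 5.1), (4, 2.2), (14, 3.4)]  # e.g. night, rush hour, daytime
print(f"TECA = {teca(periods):.2f} km^2")  # 3.62 for these made-up values
```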
Fast neutron flux measurements with high count rates and high time resolution have important applications in devices such as tokamaks. In this study, real-time neutron and gamma discrimination was implemented on a self-developed 500-Msps, 12-bit digitizer, and the neutron and gamma spectra were calculated directly on an FPGA. A fast neutron flux measurement system with BC-501A and EJ-309 liquid scintillator detectors was developed, and a fast neutron measurement experiment was successfully performed on the HL-2M tokamak at the Southwestern Institute of Physics, China. The experimental results demonstrated that the system obtained the neutron and gamma spectra with a time accuracy of 1 ms. At count rates of up to 1 Mcps, the figure of merit was greater than 1.05 for energies between 50 keV and 2.8 MeV.
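The figure of merit quoted for neutron/gamma discrimination is conventionally FOM = |mu_n - mu_g| / (FWHM_n + FWHM_g), computed over the pulse-shape discrimination parameter distribution. The sketch below evaluates it on synthetic Gaussian data; the distributions are made up for illustration.

```python
# Figure of merit for pulse-shape n/gamma discrimination, evaluated on
# synthetic Gaussian PSD-parameter distributions (illustrative only).
import numpy as np

def fom(psd_neutron, psd_gamma):
    to_fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0))  # FWHM = 2.355 * sigma
    mu_n, mu_g = psd_neutron.mean(), psd_gamma.mean()
    fwhm_n, fwhm_g = to_fwhm * psd_neutron.std(), to_fwhm * psd_gamma.std()
    return abs(mu_n - mu_g) / (fwhm_n + fwhm_g)

rng = np.random.default_rng(1)
n = rng.normal(0.30, 0.02, 10_000)  # neutron tail-to-total ratios (synthetic)
g = rng.normal(0.15, 0.02, 10_000)  # gamma ratios (synthetic)
print(f"FOM = {fom(n, g):.2f}")     # ~1.6 for this synthetic separation
```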