Cable-stayed bridges have been widely used in high-speed railway infrastructure. The accurate determination of the cables' representative temperatures is vital during the intricate processes of design, construction, and maintenance of cable-stayed bridges. However, the representative temperatures of stay cables are not specified in the existing design codes. To address this issue, this study investigates the distribution of the cable temperature and determines its representative temperature. First, an experimental investigation spanning a period of one year was carried out near the bridge site to obtain temperature data. Statistical analysis of the measured data reveals that the temperature distribution is generally uniform along the cable cross-section, without a significant temperature gradient. Then, based on the limited data, the Monte Carlo, gradient boosted regression trees (GBRT), and univariate linear regression (ULR) methods are employed to predict the cable's representative temperature throughout the service life. These methods effectively overcome the limitations of insufficient monitoring data and accurately predict the representative temperature of the cables. However, each method has its own advantages and limitations in terms of applicability and accuracy. A comprehensive evaluation of the performance of these methods is conducted, and practical recommendations are provided for their application. The proposed methods and representative temperatures provide a sound basis for the operation and maintenance of in-service long-span cable-stayed bridges.
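As a rough illustration of the regression step named in this abstract, the minimal sketch below fits a gradient boosted regression trees model and a univariate linear regression to a synthetic ambient-vs-cable temperature relationship. The scikit-learn estimators, the choice of ambient temperature as the single predictor, and all numbers are assumptions for illustration, not the study's data or configuration.

```python
# Hypothetical sketch: predicting a cable's representative temperature with the
# two regression methods named in the abstract (GBRT and ULR). Data are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
ambient = rng.uniform(-10, 40, size=(1000, 1))                   # assumed predictor: ambient air temperature
cable = 1.05 * ambient[:, 0] + 2.0 + rng.normal(0, 0.8, 1000)    # synthetic cable temperature response

X_train, X_test, y_train, y_test = train_test_split(ambient, cable, random_state=0)

ulr = LinearRegression().fit(X_train, y_train)                   # univariate linear regression
gbrt = GradientBoostingRegressor(n_estimators=200).fit(X_train, y_train)

print("ULR  R^2:", ulr.score(X_test, y_test))
print("GBRT R^2:", gbrt.score(X_test, y_test))
```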
Long-term navigation ability based on consumer-level wearable inertial sensors plays an essential role in various emerging fields, for instance, smart healthcare, emergency rescue, and soldier positioning. The performance of existing long-term navigation algorithms is limited by the cumulative error of inertial sensors, the disturbed local magnetic field, and the complex motion modes of the pedestrian. This paper develops a robust data and physical model dual-driven trajectory estimation (DPDD-TE) framework, which can be applied to long-term navigation tasks. A Bi-directional Long Short-Term Memory (Bi-LSTM) based quasi-static magnetic field (QSMF) detection algorithm is developed for extracting useful magnetic observations for heading calibration, and another Bi-LSTM is adopted for walking speed estimation by considering hybrid human motion information over a specific time period. In addition, a data and physical model dual-driven multi-source fusion model is proposed to integrate basic INS mechanization with multi-level constraints and observations for maintaining accuracy in long-term navigation tasks, enhanced by a loop detection algorithm assisted by magnetic and trajectory features. Real-world experiments indicate that the proposed DPDD-TE outperforms existing algorithms, with final heading and positioning accuracies reaching 5° and less than 2 m, respectively, over a 30-min period.
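A minimal structural sketch of a bi-directional LSTM classifier of the kind used here for QSMF window detection is shown below. The input features (windows of tri-axial magnetometer samples), layer sizes, and the single-output sigmoid head are illustrative assumptions, not the paper's architecture.

```python
# Sketch of a Bi-LSTM window classifier (e.g., "is this window quasi-static?").
# Untrained; sizes and inputs are placeholders.
import torch
import torch.nn as nn

class BiLSTMDetector(nn.Module):
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)          # 2*hidden: forward + backward directions

    def forward(self, x):                             # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return torch.sigmoid(self.head(out[:, -1]))   # probability assigned to the window

model = BiLSTMDetector()
window = torch.randn(8, 100, 3)                       # 8 windows of 100 magnetometer samples
print(model(window).shape)                            # -> torch.Size([8, 1])
```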
Dead fine fuel moisture content (DFFMC) is a key factor affecting the spread of forest fires and plays an important role in the evaluation of forest fire risk. To achieve high-precision real-time measurement of DFFMC, this study established a long short-term memory (LSTM) network optimized by the particle swarm optimization (PSO) algorithm as the measurement model. A multi-point surface monitoring scheme combining a near-infrared measurement method and a meteorological measurement method is proposed. The near-infrared spectral information of dead fine fuels and the meteorological factors in the region are processed by data fusion technology to construct a spectral-meteorological data set. The surface fine dead fuel of Mongolian oak (Quercus mongolica Fisch. ex Ledeb.), white birch (Betula platyphylla Suk.), larch (Larix gmelinii (Rupr.) Kuzen.), and Manchurian walnut (Juglans mandshurica Maxim.) in the Maoershan experimental forest farm of Northeast Forestry University was investigated. The PSO-LSTM moisture content model was used to compare the near-infrared, meteorological, and spectral-meteorological fusion methods. The results show that the mean absolute errors of the DFFMC of the four stands obtained by the spectral-meteorological fusion method were 1.1% for Mongolian oak, 1.3% for white birch, 1.4% for larch, and 1.8% for Manchurian walnut, and these values were lower than those of the near-infrared method and the meteorological method. The spectral-meteorological fusion method provides a new way for high-precision measurement of the moisture content of fine dead fuel.
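To make the PSO step concrete, the sketch below runs a plain particle swarm loop over two hypothetical LSTM hyper-parameters (hidden units and learning rate). The objective function is a stand-in for "validation error of an LSTM trained with these settings"; the bounds, inertia/acceleration constants, and the toy objective are assumptions, not the paper's configuration.

```python
# Generic PSO loop; swap `objective` for a real LSTM training/validation run.
import numpy as np

def objective(params):                      # params = [hidden_units, learning_rate]
    h, lr = params
    return (h - 64) ** 2 / 1e3 + (np.log10(lr) + 2.5) ** 2   # toy stand-in, minimum near h=64, lr≈3e-3

rng = np.random.default_rng(0)
n_particles, n_iter = 20, 50
lo, hi = np.array([8.0, 1e-4]), np.array([256.0, 1e-1])
x = rng.uniform(lo, hi, size=(n_particles, 2))               # particle positions
v = np.zeros_like(x)                                         # particle velocities
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([objective(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best hidden units ~ %.0f, best learning rate ~ %.4f" % tuple(gbest))
```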
Emerging connected vehicle (CV) data sets have recently become commercially available, enabling analysts to develop a variety of powerful performance measures without deploying any field infrastructure. This paper presents several tools using CV data to evaluate traffic progression quality along a signalized corridor. These include both performance measures for high-level analysis and visualizations to examine details of the coordinated operation. With the use of CV data, it is possible to assess not only the movement of traffic on the corridor but also its origin-destination (O-D) path through the corridor. Results for the real-world operation of an eight-intersection signalized arterial are presented. A series of high-level performance measures are used to evaluate overall performance by time of day, with differing results by metric. Next, the details of the operation are examined with the use of two visualization tools: a cyclic time-space diagram (TSD) and an empirical platoon progression diagram (PPD). Comparing flow visualizations developed with different included O-D paths reveals several features, such as the presence of secondary and tertiary platoons on certain sections, that cannot be seen when only end-to-end journeys are included. In addition, speed heat maps are generated, showing both speed performance along the corridor and the locations and extent of queues. The proposed visualization tools portray the corridor's performance holistically instead of combining individual signal performance metrics. The techniques exhibited in this study are compelling for identifying locations where engineering solutions such as access management or timing plan changes are required. Recent progress in infrastructure-free sensing technology has significantly increased the scope of CV data-based traffic management systems, enhancing the significance of this study. The study demonstrates the utility of CV trajectory data for obtaining high-level details of corridor performance as well as drilling down into the minute specifics.
Wheel polygonal wear can immensely worsen wheel/rail interactions and the vibration performance of the train and track, and ultimately shorten the service life of railway components. At present, wheel/rail medium- or high-frequency frictional interactions are perceived as an essential cause of the high-order polygonal wear of railway wheels, potentially resulting from the flexible deformations of the train/track system or other external excitations. In this work, the effect of wheel/rail flexibility on the polygonal wear evolution of heavy-haul locomotive wheels is explored with the aid of long-term wheel polygonal wear evolution simulations, in which different flexible models of the heavy-haul wheel/rail coupled system are implemented. Further, mitigation measures for the polygonal wear of heavy-haul locomotive wheels are discussed. The results show that the evolution of polygonal wear of heavy-haul locomotive wheels can be realistically simulated when the flexibility of both the wheelset and the rails is considered. Mixed-line operation of heavy-haul trains and multicut wheel re-profiling can effectively reduce the development of wheel polygonal wear. This research provides a deeper understanding of the polygonal wear evolution mechanism of heavy-haul locomotive wheels and its mitigation measures.
To compare finite element analysis (FEA) predictions and stereovision digital image correlation (StereoDIC) strain measurements at the same spatial positions throughout a region of interest, a field comparison procedure is developed. The procedure includes (a) conversion of the finite element data into a triangular mesh, (b) selection of a common coordinate system, (c) determination of the rigid body transformation to place both measurements and FEA data in the same system, and (d) interpolation of the FEA nodal information to the same spatial locations as the StereoDIC measurements using barycentric coordinates. For an aluminum Al-6061 double edge notched tensile specimen, FEA results are obtained using both the von Mises isotropic yield criterion and Hill's quadratic anisotropic yield criterion, with the unknown Hill model parameters determined using full-field specimen strain measurements for the nominally plane stress specimen. Using Hill's quadratic anisotropic yield criterion, the point-by-point comparison of experimentally based full-field strains and stresses to finite element predictions is shown to be in excellent agreement, confirming the effectiveness of the field comparison process.
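Step (d), barycentric interpolation of nodal fields onto measurement points, can be sketched with SciPy's LinearNDInterpolator, which triangulates the nodes (Delaunay) and interpolates linearly with barycentric weights inside each triangle. The node coordinates and the linear "strain" field below are synthetic placeholders, not the study's FEA results.

```python
# Sketch of interpolating FEA nodal values to StereoDIC measurement locations.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(1)
fea_nodes = rng.uniform(0, 10, size=(500, 2))                    # (x, y) of aligned FEA nodes
fea_strain = 0.01 * fea_nodes[:, 0] + 0.002 * fea_nodes[:, 1]    # synthetic nodal strain field

interp = LinearNDInterpolator(fea_nodes, fea_strain)             # Delaunay triangulation + barycentric weights
dic_points = rng.uniform(1, 9, size=(200, 2))                    # StereoDIC measurement locations
fea_at_dic = interp(dic_points)                                  # FEA strain evaluated at the DIC points

# For a linear field the barycentric interpolation is exact (up to round-off):
residual = fea_at_dic - (0.01 * dic_points[:, 0] + 0.002 * dic_points[:, 1])
print("max |interpolated - exact|:", np.nanmax(np.abs(residual)))
```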
Because of its large capacity, high efficiency, and energy savings, the subway has gradually become the primary mode of transportation for citizens. A high density of passengers exists within a large-passenger-flow subway station, and the number of casualties and injuries during a fire emergency can be substantial. In this paper, Pathfinder software and on-site measured data from Pingzhou station in Shenzhen (China) were utilized to simulate a fire emergency evacuation in a large-passenger-flow subway station. The Required Safe Egress Time (RSET), the number of passengers, and the flow rates of stairs and escalators were analysed for three fire evacuation scenarios: train fire, platform fire, and hall fire. The evacuation time for the train fire, at 1173 s, was the longest, and 3621 occupants needed to evacuate when the train was fully loaded. Occupants could not complete the evacuation within 6 min in any of the three fire evacuation scenarios, which does not meet the current standard requirements and codes. By changing the number of passengers and the number of stairs available for evacuation, the flow rate capacity and evacuation time were explored; the results provide reference values for safety management and emergency evacuation plan optimization during peak hours of subway operation.
Recent progress in nuclear data measurement for ADS at the Institute of Modern Physics is reviewed briefly. Based on the cooler storage ring of the Heavy Ion Research Facility in Lanzhou, a nuclear data terminal was established. The nuclear data measurement facility for the ADS spallation target has been constructed, which provides a very important platform for experimental measurements of spallation reactions. A number of experiments have been conducted at the nuclear data terminal. A Neutron Time-of-Flight (NTOF) spectrometer was developed for the study of neutron production from spallation reactions related to the ADS project. Experiments with 400 MeV/u ^(16)O bombarding a tungsten target are presented using the NTOF spectrometer. Neutron yields for 250 MeV protons incident on a thick grain-made tungsten target and a thick solid lead target have been measured using the water-bath neutron activation method. Spallation residue production was studied by bombarding W and Pb targets with a 250 MeV proton beam using the neutron activation method. Benchmarking of evaluated nuclear data libraries was performed for D-T neutrons on ADS-relevant materials using the benchmark experimental facility at the China Institute of Atomic Energy.
Information analysis of high dimensional data was carried out through the application of similarity measures. High dimensional data were considered as an atypical structure. Additionally, overlapped and non-overlapped data were introduced, and the similarity measure analysis was illustrated and compared with a conventional similarity measure. As a result, for overlapped data it was possible to express similarity with the conventional similarity measure, whereas the similarity analysis of non-overlapped data provided the clue to solving the similarity of high dimensional data. The similarity measure for high dimensional data analysis was therefore designed with consideration of neighborhood information, and conservative and strict solutions were proposed. The proposed similarity measure was applied to express financial fraud among multi-dimensional datasets. In an illustrative example, financial fraud similarity with respect to age, gender, qualification, and job was presented, and with the proposed similarity measure, high dimensional personal data were evaluated for how similar they are to the financial fraud case. The calculation results show that the actual fraud has a rather high similarity measure compared to the average, ranging from a minimum of 0.0609 to a maximum of 0.1667.
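For readers unfamiliar with attribute-wise similarity scoring, the sketch below compares a personal record to a fraud profile over the four attributes mentioned (age, gender, qualification, job) using a generic distance-based similarity. It is only an illustrative stand-in; the paper's neighborhood-based measure for non-overlapped data is not reproduced here, and the profile values and normalization span are invented.

```python
# Generic attribute-wise similarity between a record and a reference profile.
def attribute_similarity(a, b, span=None):
    if span is not None:                       # numeric attribute: 1 - normalized distance
        return 1.0 - abs(a - b) / span
    return 1.0 if a == b else 0.0              # categorical attribute: exact match

def record_similarity(person, profile, spans):
    scores = [attribute_similarity(person[k], profile[k], spans.get(k)) for k in profile]
    return sum(scores) / len(scores)

fraud_profile = {"age": 42, "gender": "M", "qualification": "bachelor", "job": "broker"}
person = {"age": 39, "gender": "M", "qualification": "master", "job": "broker"}
print(record_similarity(person, fraud_profile, spans={"age": 60}))   # -> 0.7375
```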
The multi-point simultaneous long-term measurement of CO_(2) concentration in seawater can provide more valuable data for further understanding of the spatial and temporal distribution of CO_(2). Thus, the need for a low-cost sensor with high precision, low power consumption, and small size is becoming urgent. In this work, an in-situ sensor for CO_(2) detection in seawater, based on a permeable membrane and non-dispersive infrared (NDIR) technology, is developed. The sensor has a small size (Φ66 mm × 124 mm), light weight (0.7 kg in air), low power consumption (<0.9 W), low cost (<US$1000), and high pressure tolerance (<200 m). After laboratory performance tests, the sensor was found to have a measurement range of (0–2000)×10^(-6), a gas linear correlation R^(2) of 0.99, and a precision of about 0.98% at a sampling rate of 1 s. A comparison measurement was carried out against a commercial sensor in a pool for 7 days, and the results showed a consistent trend. Further, the newly developed sensor was deployed in Qingdao nearshore water for 35 days. The results proved that the sensor could continuously measure the dynamic changes of CO_(2) concentration in seawater and has the potential to carry out long-term observations on an oceanic platform. It is hoped that the sensor can be applied to field ocean observations in the near future.
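The linearity figure quoted above (R^(2) of 0.99 over 0–2000 ppm) corresponds to the kind of check sketched below: fit a straight line between reference concentrations and sensor response and compute the coefficient of determination. The reference points and synthetic readings are assumptions, not the published calibration data.

```python
# Linear-response check for a gas sensor: slope/intercept fit and R^2.
import numpy as np

reference_ppm = np.array([0, 250, 500, 750, 1000, 1250, 1500, 1750, 2000], dtype=float)
sensor_counts = 0.8 * reference_ppm + 120 + np.random.default_rng(2).normal(0, 5, reference_ppm.size)

slope, intercept = np.polyfit(reference_ppm, sensor_counts, 1)
predicted = slope * reference_ppm + intercept
ss_res = np.sum((sensor_counts - predicted) ** 2)
ss_tot = np.sum((sensor_counts - sensor_counts.mean()) ** 2)
print("R^2 =", 1 - ss_res / ss_tot)            # ~0.99 or better for a linear sensor
```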
Local arterials can be significantly impacted by diversions from adjacent work zones. These diversions often occur on unofficial detour routes due to guidance received on personal navigation devices. Often, these routes do not have sufficient sensing or communication equipment to obtain infrastructure-based traffic signal performance measures, so other data sources are required to identify locations being significantly affected by diversions. This paper examines the network impact caused by the start of an 18-month closure of the I-65/70 interchange (North Split), which usually serves approximately 214,000 vehicles per day in Indianapolis, IN. In anticipation of some proportion of the public diverting from official detour routes to local streets, a connected vehicle monitoring program was established to provide daily performance measures for over 100 intersections in the area without the need for vehicle sensing equipment. This study reports on 13 of the most impacted signals on an alternative arterial to identify locations and times of day where operations are most degraded, so that decision makers have quantitative information to make informed adjustments to the system. Individual vehicle movements at the studied locations are analyzed to estimate changes in volume, split failures, downstream blockage, arrivals on green, and travel times. Over 130,000 trajectories were analyzed in an 11-week period. Weekly afternoon peak period volumes increased by approximately 455%, split failures increased by 3%, downstream blockage increased by 10%, arrivals on green decreased by 16%, and travel times increased by 74%. The analysis performed in this paper will serve as a framework for any agency that wants to assess traffic signal performance at hundreds of locations with little or no existing sensing or communication infrastructure in order to prioritize tactical retiming and/or longer-term infrastructure investments.
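Of the measures listed above, arrivals on green is the simplest to illustrate: it is the share of trajectories that cross the stop bar while the signal is green. The sketch below assumes stop-bar crossing times and green windows are already available; it is a generic computation, not the paper's processing pipeline.

```python
# Fraction of vehicle arrivals that occur during a green interval.
def arrivals_on_green(arrival_times, green_windows):
    """arrival_times: seconds into the analysis period; green_windows: list of (start, end)."""
    on_green = sum(any(s <= t < e for s, e in green_windows) for t in arrival_times)
    return on_green / len(arrival_times)

greens = [(0, 40), (90, 130), (180, 220)]      # assumed green intervals, in seconds
arrivals = [5, 12, 47, 52, 95, 101, 140, 185, 200, 215]
print(f"arrivals on green: {arrivals_on_green(arrivals, greens):.0%}")   # -> 70%
```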
Since the first Diverging Diamond Interchange (DDI) implementation in 2009, most of the performance studies developed for this type of interchange have been based on simulations and historical crash data, with a small number of studies using Automated Traffic Signal Performance Measures (ATSPM). Simulation models require considerable effort to collect volumes and to model actual controller operations. Safety studies based on historical crashes usually require 3 to 5 years of data collection. ATSPMs rely on sensing equipment. This study describes the use of connected vehicle trajectory data to analyze the performance of a DDI located in the metropolitan area of Fort Wayne, IN. An extension of the Purdue Probe Diagram (PPD) is proposed to assess the levels of delay, progression, and saturation. Further, an additional PPD variation is presented that provides a convenient visualization to qualitatively understand progression patterns and to evaluate queue length for spillback in the critical interior crossover. Over 7000 trajectories and 130,000 GPS points were analyzed between the 7th and the 11th of June 2021 from 5:00 AM to 10:00 PM to estimate the DDI's arrivals on green, level of service, split failures, and downstream blockage. Although this technique was demonstrated for weekdays, the ubiquity of connected vehicle data makes it very easy to adapt these techniques to analysis during special events, winter storms, and weekends. Furthermore, the methodologies presented in this paper can be applied by any agency wanting to assess the performance of any DDI in their jurisdiction.
This paper describes the implementation of a data logger for the real-time in-situ monitoring of hydrothermal systems. A compact mechanical structure ensures the security and reliability of the data logger when used in the deep sea. The data logger is a battery-powered instrument, which can connect chemical sensors (pH electrode, H2S electrode, H2 electrode) and temperature sensors. In order to achieve major energy savings, dynamic power management is implemented in both the hardware and software design. The working current of the data logger is 15 μA in idle mode and 1.44 mA in active mode, which greatly extends the working time of the battery. The data logger was successfully tested in the first Sino-American Cooperative Deep Submergence Project from August 13 to September 3, 2005.
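The benefit of the dynamic power management can be seen with a back-of-the-envelope duty-cycle estimate using the two currents quoted in the abstract (15 μA idle, 1.44 mA active). The sampling schedule and battery capacity below are assumptions for illustration only.

```python
# Duty-cycle battery-life estimate from idle/active currents.
idle_mA, active_mA = 0.015, 1.44
active_s_per_hour = 60                       # assume the logger is active for 60 s each hour
duty = active_s_per_hour / 3600.0

avg_mA = duty * active_mA + (1 - duty) * idle_mA
battery_mAh = 2000                           # assumed primary cell capacity
print(f"average draw: {avg_mA:.4f} mA -> about {battery_mAh / avg_mA / 24:.0f} days per battery")
```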
Similarity measure design for discrete data groups was proposed, and similarity measure design for continuous membership functions was also carried out. The proposed similarity measures were designed based on fuzzy numbers and distance measures, and were proved. To calculate the degree of similarity of discrete data, the relative degree between each data point and the total distribution was obtained, and the discrete data similarity measure was completed by combining these relative degrees. A power interconnected system with multiple characteristics was considered as an application of the discrete similarity measure. Naturally, the similarity measure was extended to the multi-dimensional case and applied to the bus clustering problem.
Introduction: The extended administration of corticosteroids by the oral route for longer than 3 months defines long-term corticosteroid therapy. This treatment, used in numerous indications, often carries a risk of adverse effects, sometimes linked to physicians' prescribing habits. Patients and Methods: In order to study the prescription modalities of this treatment, we conducted a cross-sectional, multicentric, and descriptive study over a period of 2 months, from June 1st, 2017 to August 1st, 2017. It involved a questionnaire given to medical specialists of all specialties practicing in the University Hospital of Dakar. Results: 170 doctors were interviewed. Dermatologists and internists were the most represented, at 19.4% and 18.8% (33 and 34 doctors, respectively). Systemic autoimmune diseases alone accounted for 48% of prescription reasons. Prednisone was prescribed in 88% of cases. The immunosuppressive dose of 1 mg/kg was most often prescribed. Practitioners prescribe most adjuvant measures to prolonged systemic corticosteroid therapy very heterogeneously. The recommendation of a low-sodium diet (38% of physicians) and the systematic prescription of proton pump inhibitors (44.7% of physicians) and vitamin-calcium supplementation (34% of physicians) were frequent, while the low-carbohydrate diet was advocated by less than a quarter of doctors, and the prevention of pneumocystosis and osteoporosis was rare (61% and 52% of prescribers, respectively, did not prescribe it). Conclusion: The global analysis of the habits of our medical specialists concerning the use of long-term glucocorticoids reflected a diversity of indications and a heterogeneity of practices, with certain habits not in accordance with the usual recommendations.
In this work we discuss SDSPbMM, an integrated Strategy for Data Stream Processing based on Measurement Metadata, applied to an outpatient monitoring scenario. The measures associated with the attributes of the patient (the entity under monitoring) arrive from heterogeneous data sources as data streams, together with metadata associated with the formal definition of a measurement and evaluation project. Such metadata supports patient analysis and monitoring in a more consistent way, facilitating, for instance: i) the early detection of problems typical of data, such as missing values and outliers, among others; and ii) risk anticipation by means of on-line classification models adapted to the patient. We also performed a simulation using a prototype developed for outpatient monitoring in order to empirically analyze processing times and variable scalability, which sheds light on the feasibility of applying the prototype to real situations. In addition, we statistically analyze the results of the simulation in order to detect the components that introduce the most variability into the system.
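The first of the two facilities mentioned, on-line detection of missing values and outliers in an incoming measure stream, can be sketched with a rolling-window check as below. The window size, z-score threshold, and the heart-rate-like sample stream are illustrative assumptions; this is not the SDSPbMM implementation.

```python
# Minimal per-measure data-quality checks over a stream: missing values and outliers.
from collections import deque

class StreamChecker:
    def __init__(self, window=50, z_limit=3.0):
        self.history = deque(maxlen=window)
        self.z_limit = z_limit

    def check(self, value):
        if value is None:
            return "missing"
        if len(self.history) >= 10:
            mean = sum(self.history) / len(self.history)
            var = sum((v - mean) ** 2 for v in self.history) / len(self.history)
            if var > 0 and abs(value - mean) / var ** 0.5 > self.z_limit:
                return "outlier"             # flagged; not added to the rolling window
        self.history.append(value)
        return "ok"

checker = StreamChecker()
stream = [72, 74, 71, None, 73, 75, 72, 70, 74, 73, 71, 72, 180]   # e.g. heart-rate measures
print([checker.check(v) for v in stream])    # the None is "missing", the final 180 is "outlier"
```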
A data fusion method for online multisensors based on an artificial neuron is proposed in this paper. First, the dynamic data fusion model on the artificial neuron is built. Then the calibration of the data fusion is discussed with a self-adaptive weighting technique. Finally, the performance of the method is demonstrated by an online vibration measurement case. The results show that the fused data are more stable, sensitive, accurate, and reliable than single sensor data.
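The self-adaptive weighting idea can be illustrated with a simple inverse-variance fusion of redundant channels: each sensor's weight is adapted to its recent scatter, so noisier channels contribute less. This illustrates the weighting principle only, not the paper's neuron formulation; all signals and noise levels are synthetic.

```python
# Inverse-variance weighted fusion of three redundant sensor channels.
import numpy as np

rng = np.random.default_rng(3)
true_signal = np.sin(np.linspace(0, 4 * np.pi, 200))
noise_std = np.array([0.05, 0.20, 0.50])                       # three sensors of differing quality
readings = true_signal + noise_std[:, None] * rng.standard_normal((3, 200))

consensus = readings.mean(axis=0)                              # crude first estimate of the signal
noise_var = ((readings - consensus) ** 2).mean(axis=1)         # per-sensor scatter about the consensus
weights = (1.0 / noise_var) / (1.0 / noise_var).sum()          # self-adapted, normalized weights
fused = weights @ readings

print("per-sensor RMSE:", np.sqrt(((readings - true_signal) ** 2).mean(axis=1)).round(3))
print("fused RMSE:     ", np.sqrt(((fused - true_signal) ** 2).mean()).round(3))
```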
Similarity measure design on non-overlapped data was carried out and compared with the case of overlapped data. The inconsistency of similarity results between overlapped and non-overlapped data was shown by example. Using artificial data, it was proved that the conventional similarity measure is not appropriate for calculating the similarity in the non-overlapped case. To overcome this imbalance problem, a similarity measure on non-overlapped data was obtained by considering neighbor information. Hence, different approaches to designing the similarity measure were proposed and proved through the consideration of neighbor information. Similarity measure calculations were carried out on an artificial data example. An extension of the similarity measure to intuitionistic fuzzy sets (IFSs), which contain an uncertainty named hesitance, was also provided.
In this article we study an estimation method for a nonparametric regression measurement error model based on validation data. The estimation procedures are based on orthogonal series estimation and truncated series approximation methods, without specifying any structural equation or distributional assumption. The convergence rates of the proposed estimator are derived. Examples and simulations show that the method is robust against misspecification of the measurement error model.
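To make the orthogonal series idea concrete, the sketch below fits a truncated cosine-series expansion of an unknown regression function on [0, 1] by least squares, keeping only the first K coefficients. The cosine basis, the truncation level, and the synthetic data-generating model are assumptions for illustration; the validation-data correction of the paper is not reproduced.

```python
# Truncated orthogonal (cosine) series regression estimator on [0, 1].
import numpy as np

rng = np.random.default_rng(4)
n, K = 500, 15
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)     # noisy observations of m(x) = sin(2*pi*x)

def basis(x, K):
    cols = [np.ones_like(x)] + [np.sqrt(2) * np.cos(np.pi * k * x) for k in range(1, K)]
    return np.column_stack(cols)                              # orthonormal cosine basis on [0, 1]

coef, *_ = np.linalg.lstsq(basis(x, K), y, rcond=None)        # truncated series coefficients
grid = np.linspace(0, 1, 11)
estimate = basis(grid, K) @ coef
print(np.round(estimate, 2))                                  # roughly follows sin(2*pi*grid)
```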
Spatial linear features are often represented as a series of line segments joined by measured endpoints in surveying and geographic information science. There are not only the measuring errors of the endpoints but also the modeling errors between the line segments and the actual geographical features. This paper presents a Brownian bridge error model for line segments combining both the modeling and measuring errors. First, the Brownian bridge is used to establish the position distribution of the actual geographic feature represented by the line segment. Second, an error propagation model with the constraints of the measuring error distribution of the endpoints is proposed. Third, a comprehensive error band of the line segment is constructed, wherein both the modeling and measuring errors are contained. The proposed error model can be used to evaluate line segments' overall accuracy and trustability as influenced by modeling and measuring errors, and provides a comprehensive quality indicator for geospatial data.
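A simple numeric illustration of combining the two error sources along a segment is sketched below: endpoint measuring errors propagate linearly with the normalized position t, while the modeling error between the endpoints is given the standard Brownian-bridge variance proportional to t(1-t), which vanishes at the endpoints and peaks mid-segment. The variance values, the combination scheme, and the 95% factor are assumptions for illustration, not the paper's formulation.

```python
# Error-band half-width along a segment from measuring + Brownian-bridge modeling variance.
import numpy as np

t = np.linspace(0, 1, 11)          # normalized position along the segment
sigma_a2, sigma_b2 = 0.04, 0.04    # endpoint measuring variances (m^2), assumed
sigma_m2 = 0.25                    # Brownian-bridge modeling variance scale (m^2), assumed

measuring_var = (1 - t) ** 2 * sigma_a2 + t ** 2 * sigma_b2
modeling_var = sigma_m2 * t * (1 - t)                  # zero at the endpoints, largest mid-segment
half_width_95 = 1.96 * np.sqrt(measuring_var + modeling_var)
print(np.round(half_width_95, 3))                      # band is widest near the middle of the segment
```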