Funding: Supported by the National Science Fund for Distinguished Young Scholars (No. 61925102), the National Natural Science Foundation of China (No. 62201086, 92167202, 62201087, 62101069), the BUPT-CMCC Joint Innovation Center, and the State Key Laboratory of IPOC (BUPT) (No. IPOC2023ZT02), China.
Abstract: Visible light communication (VLC) has attracted much attention in research on sixth-generation (6G) systems, and channel modeling is the foundation for designing efficient and robust VLC systems. In this paper, we present extensive VLC channel measurement campaigns in indoor environments, i.e., an office and a corridor. Based on the measured data, the large-scale fading characteristics and multipath-related characteristics, including omnidirectional optical path loss (OPL), K-factor, power angular spectrum (PAS), angle spread (AS), and clustering characteristics, are analyzed and modeled through a statistical method. Based on the extracted statistics of these channel characteristics, we propose a statistical spatial channel model (SSCM) capable of modeling multipath in the spatial domain. The simulated statistics of the proposed model are then compared with the measured statistics. For instance, in the office, the simulated and measured path loss exponents (PLE) are 1.96 and 1.97, respectively, and the simulated and measured medians of AS are 25.94° and 24.84°, respectively. The close fit between simulated and measured results demonstrates the accuracy of our SSCM.
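As a rough illustration of how a path loss exponent of this kind is typically extracted, the Python sketch below fits the common close-in reference-distance model PL(d) = PL(d0) + 10*n*log10(d/d0) by least squares; the distances and loss values are invented for illustration, and this is not the authors' code.

    # Minimal sketch (not from the paper): estimating a path loss exponent (PLE)
    # from distance/path-loss samples with a close-in reference-distance model.
    import numpy as np

    # Hypothetical measurement samples: Tx-Rx distances (m) and optical path loss (dB).
    d0 = 1.0
    d = np.array([1.0, 2.0, 3.5, 5.0, 8.0])
    pl = np.array([60.1, 66.3, 70.8, 74.0, 78.2])

    pl_d0 = pl[0]                                  # path loss at the reference distance d0
    x = 10 * np.log10(d / d0)                      # regressor of the close-in model
    n = np.sum((pl - pl_d0) * x) / np.sum(x ** 2)  # least-squares path loss exponent
    sigma = (pl - (pl_d0 + n * x)).std(ddof=1)     # shadowing standard deviation (dB)
    print(f"PLE n = {n:.2f}, shadowing sigma = {sigma:.2f} dB")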
Abstract: The Bayesian structural equation model integrates the principles of Bayesian statistics, providing a more flexible and comprehensive modeling framework. In exploring complex relationships between variables, handling uncertainty, and dealing with missing data, the Bayesian structural equation model demonstrates unique advantages. Therefore, Bayesian methods are used in this paper to establish a structural equation model of innovative talent cognition, and the measurement of college students' cognition of innovative talent is studied. An in-depth analysis is conducted on the effects of innovative self-efficacy, social resources, innovative personality traits, and school education, aiming to explore the factors influencing college students' innovative talent. The results indicate that innovative self-efficacy plays a key role in this perception; social resources are significantly positively correlated with the perception of innovative talent; and innovative personality tendencies and school education are positively correlated with it, although their effects are not significant.
Funding: Supported by the Open Fund of the Hubei Key Laboratory of Oil and Gas Drilling and Production Engineering (Yangtze University), YQZC202309.
Abstract: A new measurement device, consisting of swirling blades and capsule-shaped throttling elements, is proposed in this study to eliminate typical measurement errors caused by complex flow patterns in gas-liquid flow. The swirling blades transform the complex flow pattern into a forced annular flow. Drawing on research into existing blockage flow meters and exploiting single-phase flow measurement theory, a formula is introduced to measure the separated gas and liquid phase flows. The formula requires the pressure ratio, the Lockhart-Martinelli number (L-M number), and the gas-phase Froude number. The unknown parameters in the formula are fitted through numerical simulation using computational fluid dynamics (CFD), which involves a comprehensive analysis of the flow field inside the device from multiple perspectives and takes into account the influence of pressure fluctuations. Finally, the measurement model is validated through an experimental error analysis. The results demonstrate that the measurement error can be kept within ±8% for various flow patterns, including stratified flow, bubble flow, and wave flow.
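For context, the sketch below evaluates two of the dimensionless groups named in the abstract under one common wet-gas-metering convention; the specific definitions and all numerical inputs are assumptions for illustration and may differ from the authors' formulation.

    # Minimal sketch: Lockhart-Martinelli parameter and gas densimetric Froude number,
    # using one common convention (not necessarily the authors'). Inputs are made up.
    import math

    m_g, m_l = 0.12, 0.35          # gas / liquid mass flow rates (kg/s), hypothetical
    rho_g, rho_l = 15.0, 998.0     # gas / liquid densities (kg/m^3), hypothetical
    D = 0.05                       # pipe inner diameter (m), hypothetical
    g = 9.81

    A = math.pi * D ** 2 / 4                     # pipe cross-sectional area
    U_sg = m_g / (rho_g * A)                     # superficial gas velocity
    X_lm = (m_l / m_g) * math.sqrt(rho_g / rho_l)            # Lockhart-Martinelli number
    Fr_g = U_sg / math.sqrt(g * D) * math.sqrt(rho_g / (rho_l - rho_g))  # gas Froude number
    print(f"X_LM = {X_lm:.3f}, Fr_g = {Fr_g:.3f}")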
Funding: Supported by the National Natural Science Foundation of China (No. 62293481, No. 62071058).
Abstract: As a novel paradigm, semantic communication provides an effective solution for breaking through the future development dilemma of classical communication systems. However, how to measure the information transmission capability of a given semantic communication method, and how to compare it with classical communication methods, remains an unsolved problem. In this paper, we first present a review of the semantic communication system, including its system model and the two typical coding and transmission methods for its implementation. To address the unsolved issue of measuring the information transmission capability of semantic communication methods, we propose a new universal performance measure called Information Conductivity. We provide its definition and physical significance to demonstrate its effectiveness in representing the information transmission capabilities of semantic communication systems, and present elaborations including its measurement methods, degrees of freedom, and progressive analysis. Experimental results in image transmission scenarios validate its practical applicability.
Funding: Supported by the Youth Science Foundation of Sichuan Province (Nos. 2022NSFSC1230 and 2022NSFSC1231), the Science and Technology Innovation Seedling Project of Sichuan Province (No. MZGC20230080), the General Project of the National Natural Science Foundation of China (No. 12075039), and the Key Project of the National Natural Science Foundation of China (No. U19A2086).
Abstract: A dedicated weak current measurement system was designed to measure the weak currents generated by a neutron ionization chamber. The system incorporates a second-order low-pass filter circuit and a Kalman filtering algorithm to filter out noise and minimize interference in the measurement results. Testing conducted under normal temperature conditions demonstrated the system's high-precision performance. However, it was observed that temperature variations can affect the measurement performance. Data were collected across temperatures ranging from -20 to 70℃, and a temperature correction model was established through linear regression fitting to address this issue. The feasibility of the temperature correction model was confirmed at temperatures of -5 and 40℃, where relative errors remained below 0.1% after applying the correction. The research indicates that the designed measurement system exhibits excellent temperature adaptability and high precision, making it particularly suitable for measuring weak currents.
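The filtering and correction steps can be illustrated with a minimal sketch: a scalar Kalman filter smoothing a noisy current reading, followed by a linear temperature correction whose coefficients are assumed to come from a regression fit of the kind described above. All numerical values (noise variances, coefficients a and b, the current level) are invented.

    # Minimal sketch (illustration only, not the instrument firmware).
    import numpy as np

    rng = np.random.default_rng(0)
    true_current = 2.0e-9                                    # "true" weak current (A), hypothetical
    z = true_current + 5e-11 * rng.standard_normal(500)      # noisy measurements

    # Scalar Kalman filter with a constant-signal model.
    x, p = z[0], 1e-20            # state estimate and its variance
    q, r = 1e-24, (5e-11) ** 2    # assumed process and measurement noise variances
    for zk in z[1:]:
        p = p + q                 # predict
        k = p / (p + r)           # Kalman gain
        x = x + k * (zk - x)      # update with the new measurement
        p = (1 - k) * p
    print(f"filtered current = {x:.3e} A")

    # Linear temperature correction with hypothetical regression coefficients a, b.
    a, b = 1.2e-12, -3.0e-11      # A per degC and offset (A), assumed fitted elsewhere
    T = 40.0                      # operating temperature (degC)
    corrected = x - (a * T + b)
    print(f"temperature-corrected current = {corrected:.3e} A")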
Funding: Financial support provided by Correlated Solutions Incorporated to perform StereoDIC experiments, and by the Department of Mechanical Engineering at the University of South Carolina for simulation studies, is deeply appreciated.
Abstract: To compare finite element analysis (FEA) predictions and stereovision digital image correlation (StereoDIC) strain measurements at the same spatial positions throughout a region of interest, a field comparison procedure is developed. The procedure includes (a) conversion of the finite element data into a triangular mesh, (b) selection of a common coordinate system, (c) determination of the rigid body transformation to place both measurements and FEA data in the same system, and (d) interpolation of the FEA nodal information to the same spatial locations as the StereoDIC measurements using barycentric coordinates. For an aluminum Al-6061 double-edge-notched tensile specimen, FEA results are obtained using both the von Mises isotropic yield criterion and Hill's quadratic anisotropic yield criterion, with the unknown Hill model parameters determined using full-field specimen strain measurements for the nominally plane stress specimen. Using Hill's quadratic anisotropic yield criterion, the point-by-point comparison of experimentally based full-field strains and stresses to finite element predictions is shown to be in excellent agreement, confirming the effectiveness of the field comparison process.
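Step (d) can be illustrated with a minimal sketch of barycentric interpolation: the FEA nodal values of the triangle enclosing a StereoDIC measurement point are weighted by that point's barycentric coordinates. The triangle, nodal strains, and query point below are invented for illustration.

    # Minimal sketch of barycentric interpolation of nodal values at a query point.
    import numpy as np

    def barycentric_interpolate(p, tri, nodal_vals):
        """Interpolate nodal_vals (one value per triangle vertex) at point p."""
        a, b, c = tri
        # Solve for weights (w_b, w_c) such that p = a + w_b*(b - a) + w_c*(c - a).
        m = np.column_stack((b - a, c - a))
        w_b, w_c = np.linalg.solve(m, p - a)
        w_a = 1.0 - w_b - w_c
        return w_a * nodal_vals[0] + w_b * nodal_vals[1] + w_c * nodal_vals[2]

    tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # FEA triangle vertices (mm), invented
    strains = np.array([0.0010, 0.0014, 0.0012])           # nodal strain values, invented
    p = np.array([0.25, 0.25])                             # StereoDIC measurement point
    print(barycentric_interpolate(p, tri, strains))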
Funding: Supported by the Ministry of Science and Technology under Grant No. MOST 108-2622-E-169-006-CC3.
Abstract: The heating, ventilating, and air conditioning (HVAC) system consumes nearly 50% of a building's energy, especially in Taiwan with its hot and humid climate. Given the challenges of obtaining energy sources and the negative environmental impacts of excessive energy use, it is essential to employ an energy-efficient HVAC system. This study focused on the machine tools building of a university. Field measurements were carried out, and the data were used to conduct energy modelling with EnergyPlus (EP) in order to identify improvements for energy-efficient design. Validation between the field measurements and the energy model was performed, and the error rate was less than 10%. Several energy-efficient strategies were then proposed, including room temperature settings, chilled water supply temperature settings, chiller coefficient of performance (COP), shading, and building location, and their potential to reduce annual energy consumption was evaluated. The results reveal that room temperature settings (3.8%), chilled water supply temperature settings (2.1%), chiller COP (5.9%), shading (9.1%), and building location (3.0%) could each reduce energy consumption by the indicated amounts. The analysis shows that a well-performing HVAC system and building shading are effective in lowering energy use, and that the energy modelling method can be an effective and satisfactory tool for determining potential energy savings.
Abstract: This paper presents a fuzzy logic approach to efficiently perform unsupervised character classification for improvement in robustness, correctness, and speed of a character recognition system. The characters are first split into eight typographical categories. The classification scheme uses pattern matching to classify the characters in each category into a set of fuzzy prototypes based on a nonlinear weighted similarity function. The fuzzy unsupervised character classification, which is natural in the repre...
基金supported by the National Key R&D Program of China (2022YFC3004701)the National Natural Science Foundation of China (52274242,51904293)+1 种基金the Natural Science Foundation of Jiangsu Province (BK20190627)the China Postdoctoral Science Foundation (2019M661998).
Abstract: In the course of green and smart mine construction in the context of carbon neutrality, China's coal safety situation has continuously improved in recent years. To characterize the development of coal production in China and prepare for future monitoring and prevention of safety incidents, this study elaborates on the basic situation of coal resources and national mining accidents over the past five years (2017-2021) along four dimensions (accident level, type, region, and time), and then proposes preventive measures based on the statistical patterns of accidents. The results show that coal resources have a marked geographic distribution, concentrated mainly in the midwest of the country, with coal resources in Shanxi and Shaanxi accounting for about 49.4%. The proportion of coal consumption dropped from 70.2% to 56% between 2011 and 2021, but still accounts for more than half of the total. Meanwhile, accident-prone areas are positively correlated with the amount of coal production. Among the different levels of coal mine accidents, general accidents had the highest number of accidents and deaths, with 692 accidents and 783 deaths, accounting for 87.6% and 54.64% respectively. Roof, gas, and transportation accidents occur relatively frequently, and the average number of deaths per accident is largest for gas accidents, at about 4.18. In terms of geographical distribution, the safety situation in Shanxi Province is the most severe. In terms of time distribution, accidents occurred mainly in July and August and rarely in February and December. Finally, a "4+4" safety management model is proposed that combines the statistical results with coal production in China. Based on existing health and safety management systems, management measures are divided into four sub-categories, and more specific measures are suggested.
Funding: National Natural Science Foundation of China (Nos. 42071372, 42221002).
Abstract: Spatial linear features are often represented as a series of line segments joined by measured endpoints in surveying and geographic information science. There are not only measuring errors in the endpoints but also modeling errors between the line segments and the actual geographical features. This paper presents a Brownian bridge error model for line segments that combines both the modeling and measuring errors. First, the Brownian bridge is used to establish the position distribution of the actual geographic feature represented by the line segment. Second, an error propagation model constrained by the measuring error distribution of the endpoints is proposed. Third, a comprehensive error band of the line segment is constructed, which contains both the modeling and measuring errors. The proposed error model can be used to evaluate line segments' overall accuracy and trustworthiness as influenced by modeling and measuring errors, and provides a comprehensive quality indicator for geospatial data.
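A minimal sketch of the underlying idea follows: the true feature between two measured endpoints is modeled as a Brownian bridge around the straight segment, while the endpoints themselves carry Gaussian measuring error. The variance parameters sigma_end and sigma_model below are assumptions for illustration, not values from the paper.

    # Minimal sketch: sample plausible true positions of a line feature between
    # noisy endpoints (measuring error) with Brownian-bridge deviation (modeling error).
    import numpy as np

    rng = np.random.default_rng(1)
    p0_true, p1_true = np.array([0.0, 0.0]), np.array([10.0, 0.0])
    sigma_end = 0.05        # endpoint measuring-error std (map units), assumed
    sigma_model = 0.20      # Brownian-bridge modeling-error scale, assumed

    def sample_segment(n=50):
        # Endpoints perturbed by measuring error.
        p0 = p0_true + sigma_end * rng.standard_normal(2)
        p1 = p1_true + sigma_end * rng.standard_normal(2)
        t = np.linspace(0.0, 1.0, n)[:, None]
        # Standard Brownian bridge in each coordinate: B(t) = W(t) - t * W(1).
        dw = rng.standard_normal((n, 2)) * np.sqrt(1.0 / (n - 1))
        dw[0] = 0.0
        w = np.cumsum(dw, axis=0)
        bridge = sigma_model * (w - t * w[-1])
        # Straight segment between measured endpoints plus bridge deviation.
        return (1 - t) * p0 + t * p1 + bridge

    paths = np.stack([sample_segment() for _ in range(200)])
    print("midpoint spread (std):", paths[:, 25, :].std(axis=0))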
Funding: Supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, as part of the Earth System Modeling Program. The NASA Modeling, Analysis, and Prediction (MAP) Program of the Science Mission Directorate at NASA Headquarters supported the work contributed by Teppei J. YASUNARI and William K. M. LAU. The NASA GEOS-5 simulation was implemented in the system of the NASA Center for Climate Simulation (NCCS). M. G. Flanner was partially supported by NSF 1253154. Support from the China Scholarship Fund is also acknowledged. The Pacific Northwest National Laboratory is operated for DOE by Battelle Memorial Institute under contract DE-AC06-76RLO1830.
Abstract: Light absorbing particles (LAP, e.g., black carbon, brown carbon, and dust) influence the water and energy budgets of the atmosphere and snowpack in multiple ways. In addition to their effects associated with atmospheric heating by absorption of solar radiation and interactions with clouds, LAP in snow on land and ice can reduce the surface reflectance (a.k.a. surface darkening), which is likely to accelerate the snow aging process, further reduce snow albedo, and increase the speed of snowpack melt. LAP in snow and ice (LAPSI) has been identified as one of the major forcings affecting climate change, e.g., in the fourth and fifth assessment reports of the IPCC. However, the uncertainty in quantifying this effect remains very high. In this review paper, we document various technical methods of measuring LAPSI and review the progress made in measuring LAPSI in the Arctic, the Tibetan Plateau, and other mid-latitude regions. We also report progress in modeling the mass concentrations, albedo reduction, radiative forcing, and climatic and hydrological impacts of LAPSI at global and regional scales. Finally, we identify some research needs for reducing the uncertainties in the impact of LAPSI on global and regional climate and the hydrological cycle.
Abstract: This study develops a procedure to rank agencies based on their incident responses, using roadway clearance times for crashes. The analysis is not intended to grade agencies but to help identify agencies requiring more training or resources for incident management. Previous NCHRP reports discussed the use of different factors, including incident severity, roadway characteristics, number of lanes involved, and time of incident, separately for estimating performance, but they do not explain how to incorporate all of the factors at the same time. Thus, this study aims to account for multiple factors to ensure fair comparisons. This study used 149,174 crashes from Iowa that occurred from 2018 to 2021. A Tobit regression model was used to find the effect of different variables on roadway clearance time. Variables that cannot be controlled directly by agencies, such as crash severity, roadway type, weather conditions, and lighting conditions, were included in the analysis, as this helps to reduce bias in the ranking procedure. The clearance time of each crash is then normalized to a base condition using the regression coefficients. The normalization makes the comparison fairer, as the effect of uncontrollable factors has already been mitigated. Finally, the agencies were ranked by their average normalized roadway clearance time. This ranking process allows agencies to track their performance on previous crashes, can be used to identify low-performing agencies that could use additional resources and training, and can be used to identify high-performing agencies to recognize for their efforts and performance.
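A minimal sketch of the normalization-and-ranking step is given below; the regression coefficients, factor definitions, and crash records are invented, and the sketch simply removes the estimated effect of uncontrollable factors before averaging, as described above.

    # Minimal sketch of adjusting clearance times to a base condition and ranking agencies.
    import numpy as np

    # Hypothetical fitted coefficients (minutes added per unit of each factor).
    beta = {"severe_crash": 25.0, "night": 8.0, "adverse_weather": 12.0}
    base = {"severe_crash": 0, "night": 0, "adverse_weather": 0}   # base condition

    crashes = [  # (agency, clearance_minutes, factor values), invented examples
        ("A", 95, {"severe_crash": 1, "night": 0, "adverse_weather": 1}),
        ("A", 40, {"severe_crash": 0, "night": 1, "adverse_weather": 0}),
        ("B", 55, {"severe_crash": 0, "night": 0, "adverse_weather": 0}),
        ("B", 88, {"severe_crash": 1, "night": 1, "adverse_weather": 0}),
    ]

    totals = {}
    for agency, minutes, x in crashes:
        # Remove the modeled effect of uncontrollable factors relative to the base condition.
        adjustment = sum(beta[k] * (x[k] - base[k]) for k in beta)
        totals.setdefault(agency, []).append(minutes - adjustment)

    ranking = sorted((np.mean(v), k) for k, v in totals.items())
    for mean_t, agency in ranking:
        print(f"agency {agency}: mean normalized clearance {mean_t:.1f} min")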
Abstract: Traumatic brain injury (TBI) is a major contributor to long-term disability and a leading cause of death worldwide. A series of secondary injury cascades can contribute to cell death, tissue loss, and ultimately to the development of functional impairments. However, there are currently no effective therapeutic interventions that improve brain outcomes following TBI. As a result, a number of experimental TBI models have been developed to recapitulate TBI injury mechanisms and to test the efficacy of potential therapeutics. The pig model has recently come to the forefront, as the pig brain is closer in size, structure, and composition to the human brain than traditional rodent models, making it an ideal large animal model for studying TBI pathophysiology and functional outcomes. This review focuses on the shared characteristics between humans and pigs that make pigs ideal for modeling TBI, and reviews the three most common pig TBI models: the diffuse axonal injury, controlled cortical impact, and fluid percussion models. It also reviews current advances in functional outcome assessment measures and other non-invasive, translational TBI detection and measurement tools, such as biomarker analysis and magnetic resonance imaging. The use of pigs as TBI models and the continued development and improvement of translational assessment modalities have made significant contributions to unraveling the complex cascade of TBI sequelae and provide an important means to study potentially clinically relevant therapeutic interventions.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51265017), the Jiangxi Provincial Science and Technology Planning Project, China (Grant No. GJJ12468), and the Science and Technology Planning Project of Ji'an City, China (Grant No. 20131828).
Abstract: Existing articulated arm coordinate measuring machines (AACMM) with a single measurement model tend to have low measurement accuracy, because the whole sampling space is much bigger, which results in unstable calibration parameters. To compensate for this deficiency, multiple measurement models are built using the Denavit-Hartenberg notation, homemade standard rod components are used as a calibration tool, and the Levenberg-Marquardt calibration algorithm is applied to solve for the structural parameters in the measurement models. During the tests of the multiple measurement models, the sample areas are selected in two situations. It is found that the measurement errors' sigma value obtained with one measurement model (0.0834 mm) is nearly twice that of the multiple measurement models (0.0431 mm) in the same sample area, while in a different sample area, the sigma value obtained with the multiple measurement models (0.0540 mm) is about 40% of that of one measurement model (0.1373 mm). The preliminary results suggest that the measurement accuracy of an AACMM using multiple measurement models is superior to that of the existing machine with one measurement model. This paper thus proposes multiple measurement models to improve the measurement accuracy of AACMMs without increasing any hardware cost.
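The calibration idea can be sketched as follows: structural parameters are estimated with a Levenberg-Marquardt solver so that the kinematic model reproduces reference measurements. A real AACMM carries many Denavit-Hartenberg parameters per joint; here a two-link planar arm with unknown link lengths stands in for it, and all data are synthetic.

    # Minimal sketch: Levenberg-Marquardt calibration of kinematic parameters.
    import numpy as np
    from scipy.optimize import least_squares

    def forward(lengths, angles):
        """End-point of a planar two-link arm for given joint angles."""
        l1, l2 = lengths
        t1, t2 = angles[:, 0], angles[:, 1]
        x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
        y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
        return np.column_stack((x, y))

    true_lengths = np.array([300.0, 250.0])               # mm, synthetic "ground truth"
    rng = np.random.default_rng(2)
    angles = rng.uniform(-np.pi / 2, np.pi / 2, (30, 2))  # sampled joint readings
    observed = forward(true_lengths, angles) + 0.02 * rng.standard_normal((30, 2))

    def residuals(params):
        # Difference between model prediction and reference (standard rod) measurements.
        return (forward(params, angles) - observed).ravel()

    fit = least_squares(residuals, x0=np.array([280.0, 270.0]), method="lm")
    print("calibrated link lengths:", fit.x)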
Funding: Supported by the UK Science and Engineering Research Council under contract GR3/7612.
Abstract: NOAA AVHRR data from the Bay of Biscay between 1988 and 1990 have been examined in order to extract information on the fluctuations of sea surface temperature (SST) at the diurnal time scale. The temporal and spatial distributions of diurnal warming in the area are obtained. The diurnal warming occurs during the summer months. Large diurnal warming in excess of 1℃ is found within 100 km of the west coast of France and within 30 km of the north coast of Spain. In the central Bay of Biscay the diurnal warming is typically about 0.5℃. Diurnal warming of up to 6℃ is observed occasionally in coastal areas where the wind speed is very low. A one-dimensional oceanic mixed-layer model has been used to simulate the diurnal warming. The results demonstrate that the diurnal warming increases as the wind speed decreases and the net heat flux increases. The comparison shows that the model results are in good agreement with the satellite measurements.
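A back-of-the-envelope slab estimate (not the one-dimensional model used in the study) shows why low wind, which keeps the mixed layer shallow, favors large diurnal warming for the same net heat flux; all inputs are illustrative.

    # Minimal sketch: slab estimate of diurnal warming, dT = Q_net * dt / (rho * cp * h).
    rho, cp = 1025.0, 3990.0        # seawater density (kg/m^3) and heat capacity (J/kg/K)
    q_net = 400.0                   # daytime net surface heat flux (W/m^2), assumed
    dt = 8 * 3600.0                 # duration of strong heating (s), assumed

    for h in (2.0, 10.0, 30.0):     # mixed-layer depth in m (shallower under low wind)
        warming = q_net * dt / (rho * cp * h)
        print(f"mixed-layer depth {h:4.1f} m -> diurnal warming {warming:.2f} K")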
基金supported by the National Science Fund for Distinguished Young Scholars(No.61925102)the National Key R&D Program of China(No.2020YFB1805002)the Key Project of State Key Lab of Networking and Switching Technology(No.NST20180105).
Abstract: Terahertz (THz) communication has been envisioned as a key enabling technology for sixth-generation (6G) systems. In this paper, we present an extensive THz channel measurement campaign for 6G wireless communications from 220 GHz to 330 GHz. The path loss is analyzed and modeled using two single-frequency path loss models and a multiple-frequencies path loss model. It is found that at most frequency points the measured path loss is larger than that in free space, but at around 310 GHz the propagation attenuation is relatively weaker than in free space. The frequency dependence of path loss is also observed, and the frequency exponent of the multiple-frequencies path loss model is 2.1. Moreover, the cellular performance of THz communication systems is investigated using the obtained path loss model. Simulation results indicate that the current inter-site distance (ISD) for the indoor scenario is too small for THz communications. Furthermore, a tremendous capacity gain can be obtained by using THz bands compared with microwave bands and millimeter wave bands. Overall, this work gives insight into the design and optimization of THz communication systems for 6G.
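For illustration, the sketch below fits a multi-frequency path loss model of the alpha-beta-gamma form PL = 10*alpha*log10(d/1 m) + beta + 10*gamma*log10(f/1 GHz) by linear least squares; this is one common convention rather than necessarily the exact model used by the authors, and the data points are invented.

    # Minimal sketch: alpha-beta-gamma multi-frequency path loss fit.
    import numpy as np

    d = np.array([1.0, 2.0, 4.0, 1.0, 2.0, 4.0])               # distance (m), hypothetical
    f = np.array([220.0, 220.0, 220.0, 330.0, 330.0, 330.0])   # frequency (GHz), hypothetical
    pl = np.array([80.0, 86.5, 92.8, 83.6, 90.2, 96.4])        # measured path loss (dB), invented

    # Linear least squares in the unknowns (alpha, beta, gamma).
    A = np.column_stack((10 * np.log10(d), np.ones_like(d), 10 * np.log10(f)))
    (alpha, beta, gamma), *_ = np.linalg.lstsq(A, pl, rcond=None)
    print(f"alpha = {alpha:.2f}, beta = {beta:.1f} dB, gamma = {gamma:.2f}")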
Abstract: The relationship between RSSI (Received Signal Strength Indication) values and distance is the foundation and the key of ranging and positioning technologies in wireless sensor networks. The log-normal shadowing model (LNSM), as a general signal propagation model, can describe the relationship between the RSSI value and distance well, but the variance parameter in the LNSM is set from experience and lacks self-adaptability. In this paper, by analyzing a large number of experimental data, it is found that the variance of the RSSI value changes regularly with distance. Based on this analysis, we propose a relationship function between the variance of RSSI and distance, and establish the log-normal shadowing model with dynamic variance (LNSM-DV). The method of least squares (LS) is used to estimate the coefficients in the model, so that the LNSM-DV can be adjusted dynamically according to changes in the environment and is self-adaptable. The experimental results show that the LNSM-DV can further reduce error and has strong self-adaptability to various environments compared with the LNSM.
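A minimal sketch of the LNSM-DV idea follows: the mean RSSI is fitted with the usual log-normal shadowing model, and the spread of the residuals is then fitted as a function of distance. The synthetic RSSI samples and the linear form chosen for sigma(d) are assumptions for illustration, not the authors' data or model.

    # Minimal sketch: log-normal shadowing model with a distance-dependent variance.
    import numpy as np

    rng = np.random.default_rng(3)
    d0 = 1.0
    distances = np.repeat(np.array([1.0, 2.0, 4.0, 8.0, 16.0]), 50)
    # Synthetic RSSI: path loss exponent 2.3, noise spread growing with distance.
    rssi = -45.0 - 10 * 2.3 * np.log10(distances / d0) \
           + (1.0 + 0.25 * distances) * rng.standard_normal(distances.size)

    # Step 1: least-squares fit of the mean model RSSI(d) = A - 10*n*log10(d/d0).
    X = np.column_stack((np.ones_like(distances), -10 * np.log10(distances / d0)))
    (A_fit, n_fit), *_ = np.linalg.lstsq(X, rssi, rcond=None)

    # Step 2: fit sigma(d) = a*d + b to the per-distance residual spread (assumed linear form).
    ds = np.unique(distances)
    sigmas = np.array([rssi[distances == dd].std(ddof=1) for dd in ds])
    (a, b), *_ = np.linalg.lstsq(np.column_stack((ds, np.ones_like(ds))), sigmas, rcond=None)
    print(f"n = {n_fit:.2f}, sigma(d) = {a:.2f}*d + {b:.2f}")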
基金financial support from the National major projects (Item No.2016ZX05006-003)
Abstract: Coal-formed gas generated from the Permo-Carboniferous coal measures has become one of the most important targets for deep hydrocarbon exploration in the Bohai Bay Basin, offshore eastern China. However, the proven gas reserves from this source rock remain low to date, and the distribution characteristics and accumulation model of the coal-formed gas are not clear. Here we review the coal-formed gas deposits formed from the Permo-Carboniferous coal measures in the Bohai Bay Basin. The accumulations are scattered and dominated by small to medium-sized gas fields, with proven reserves ranging from 0.002 to 149.4×10^8 m^3, an average of 44.30×10^8 m^3, and a median of 8.16×10^8 m^3. The commercially valuable gas fields are mainly found in the central and southern parts of the basin. Vertically, the coal-formed gas accumulates at multiple stratigraphic levels from the Paleogene to the Archaeozoic, among which the Paleogene and Permo-Carboniferous are the main reservoir strata. According to the transport pathway, filling mechanism, and the relationship between source rocks and reservoirs, the coal-formed gas accumulation model can be classified into three types: an "upward-migrated, fault-transported gas" accumulation model, a "laterally migrated, sandbody-transported gas" accumulation model, and a "downward-migrated, sub-source, fracture-transported gas" accumulation model. Source rock distribution, thermal evolution, and hydrocarbon generation capacity are the fundamental controls on the macro distribution and enrichment of the coal-formed gas, while fault activity and the configuration of faults and caprock control the vertical enrichment pattern.
Funding: Project (2007CB209402) supported by the National Basic Research Program of China; Project (SKLGDUEK0906) supported by the Research Fund of the State Key Laboratory for Geomechanics and Deep Underground Engineering of China.
Abstract: An optimization model for underground mining method selection was established on the basis of unascertained measurement theory. Considering geological conditions, technology, economy, and production safety, ten main factors influencing the selection of mining method were taken into account, and a comprehensive evaluation index system for mining method selection was constructed. The unascertained evaluation indices corresponding to the selected factors were determined both qualitatively and quantitatively for the actual situation, and new measurement standards were constructed. The unascertained measurement function of each evaluation index was then established. The index weights of the factors were calculated by entropy theory, and credible-degree recognition criteria were established according to unascertained measurement theory. The results of the mining method evaluation were obtained using the credible-degree criteria, and thus the best underground mining method was determined. Furthermore, this model was employed for the comprehensive evaluation and selection of the candidate standard mining methods at the Xinli Gold Mine in Sanshandao, China. The results show that the relative superiority degrees of mining methods can be calculated using the unascertained measurement optimization model, so the optimal method can be determined easily. Meanwhile, the proposed method can take into account a large amount of uncertain information in mining method selection, providing an effective way to select the optimal underground mining method.
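The entropy-weighting step mentioned above can be sketched as follows; the decision matrix (candidate mining methods by evaluation indices) is invented for illustration.

    # Minimal sketch: entropy weights for evaluation indices.
    import numpy as np

    # Rows: candidate mining methods, columns: evaluation indices (already scaled
    # so that larger values are better and all entries are positive).
    X = np.array([[0.8, 0.6, 0.9, 0.5],
                  [0.6, 0.9, 0.7, 0.7],
                  [0.9, 0.5, 0.6, 0.8]])

    P = X / X.sum(axis=0)                                 # proportion of each method per index
    m = X.shape[0]
    entropy = -(P * np.log(P)).sum(axis=0) / np.log(m)    # entropy of each index
    weights = (1 - entropy) / (1 - entropy).sum()         # entropy weights
    print("index weights:", np.round(weights, 3))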
Abstract: Measuring software quality requires software engineers to understand the system's quality attributes and their measurements. A quality attribute is a qualitative property; however, a quantitative feature is needed for software measurement, which is not considered during the development of most software systems. Many research studies have investigated different approaches for measuring software quality, but without practical approaches to quantify and measure quality attributes. This paper proposes a software quality measurement model, based on a software interconnection model, to measure the quality of software components and the overall quality of the software system. Unlike most existing approaches, the proposed approach can be applied at the early stages of software development, to different architectural design models, and at different levels of system decomposition. The article introduces a software measurement model that uses a heuristic normalization of the software's internal quality attributes, i.e., coupling and cohesion, for software quality measurement. In this model, the quality of a software component is measured based on its internal strength and the coupling it exhibits with other components. The proposed model was evaluated with nine software engineering teams that agreed to participate in the experiment during the development of their different software systems. The experiments showed that coupling reduces the internal strength of the coupled components by the amount of coupling they exhibit, which degrades their quality and the overall quality of the software system. The introduced model can help in understanding the quality of a software design. In addition, it identifies the locations in a software design that exhibit unnecessary couplings that degrade the quality of the software system and can be eliminated.
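A minimal sketch of the heuristic described above is given below, under our own simplifying assumption that a component's quality is its normalized internal strength (cohesion) minus the normalized coupling it exhibits, with system quality taken as the mean over components; the component names and scores are invented.

    # Minimal sketch (our assumption, not the paper's exact formulation):
    # component quality = internal strength degraded by coupling, normalized to [0, 1].
    components = {
        # name: (cohesion score, total coupling with other components), invented values
        "parser":   (8.0, 2.0),
        "planner":  (6.0, 5.0),
        "renderer": (9.0, 1.0),
    }

    def component_quality(cohesion, coupling, scale=10.0):
        """Internal strength reduced by the coupling the component exhibits."""
        return max(0.0, (cohesion - coupling) / scale)

    qualities = {name: component_quality(c, k) for name, (c, k) in components.items()}
    system_quality = sum(qualities.values()) / len(qualities)
    print(qualities, f"system quality = {system_quality:.2f}")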