Because differences in sensor precision and various random factors are difficult to control, actual measurement signals deviate from the target signals, which affects the reliability and precision of rotating machinery fault diagnosis. Traditional signal processing methods, such as classical inference and the weighted averaging algorithm, usually lack dynamic adaptability, making it easy for faults to be misjudged or missed. To enhance the measurement accuracy and precision of vibration signals in multi-sensor fault diagnosis of rotating machinery, a novel data-level fusion approach is presented, based on correlation function analysis, to rapidly determine the weights of multi-sensor vibration signals. The approach requires no prior information about the sensors; the weights are determined from the correlation measures of the real-time data during the data-level fusion process. A sensor signal with a larger correlation measure receives a larger weight, and vice versa. Because it exploits the sensors' own information to determine the weights, the approach effectively suppresses large errors and can continue fusing data even when a sensor fails. Moreover, it has good anti-jamming performance, since the correlation measures between noise and effective signals are usually small. Through simulations of typical signals collected from multiple sensors, the dynamic adaptability and fault tolerance of the proposed approach are compared with those of the traditional weighted averaging approach. Finally, a rotor dynamics and integrated fault simulator is used to verify the feasibility and advantages of the proposed approach. The results show that multi-sensor data-level fusion based on correlation function weighting outperforms the traditional weighted averaging approach in fusion precision and dynamic adaptability. The approach is also adaptable, easy to use, and applicable to other areas of vibration measurement. (Funding: National Hi-tech Research and Development Program of China (863 Program, Grant No. 2007AA04Z433); Hunan Provincial Natural Science Foundation of China (Grant No. 09JJ8005); Scientific Research Foundation of the Graduate School of Beijing University of Chemical Technology, China (Grant No. 10Me002).)
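To make the weighting idea concrete, here is a minimal sketch in Python (not the paper's implementation): each sensor's correlation measure is taken as its mean absolute correlation with the other sensors, the weights are these measures normalized to sum to one, and the fused signal is the weighted sum. A sensor that fails and outputs uncorrelated noise automatically receives a near-zero weight.

```python
import numpy as np

def correlation_weighted_fusion(signals):
    """Data-level fusion sketch: weight each sensor by its mean absolute
    correlation with the other sensors (an assumed correlation measure),
    then return the weighted sum of the signals.

    signals : (m, n) array -- m sensors, n samples each.
    """
    m = signals.shape[0]
    r = np.corrcoef(signals)                            # (m, m) correlation matrix
    measure = (np.abs(r).sum(axis=1) - 1.0) / (m - 1)   # drop self-correlation
    weights = measure / measure.sum()                   # normalize to sum to one
    return weights @ signals

# Toy check: two healthy sensors and one failed (noise-only) sensor.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
clean = np.sin(2 * np.pi * 50 * t)
sensors = np.vstack([clean + 0.1 * rng.standard_normal(t.size),
                     clean + 0.1 * rng.standard_normal(t.size),
                     0.5 * rng.standard_normal(t.size)])  # failed sensor
fused = correlation_weighted_fusion(sensors)  # failed sensor gets ~zero weight
```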
The Global Geopotential Models (GGMs) of GOCE (Gravity field and steady-state Ocean Circulation Explorer) differ globally as well as regionally in their accuracy and resolution, depending on the maximum degree and order (d/o) of the fully normalized spherical harmonic (SH) coefficients that express each GGM. The main idea of this study is to compare the free-air gravity anomalies and quasi-geoid heights determined from several recent GOCE-based GGMs with the corresponding ones from the Earth Gravitational Model 2008 (EGM2008) over Egypt on the one hand, and with ground-based measurements on the other. The comparison of the GOCE-based GGMs with terrestrial gravity and GPS/levelling data shows an improvement with respect to EGM2008. The fourth-release GOCE-based GGM developed with the space-wise solution strategy (SPW_R4) approximates the gravity field well over the Egyptian region. The SPW_R4 model is accordingly suggested as a reference model for recovering the long-wavelength (up to SH d/o 200) components of quasi-geoid heights when modelling the gravimetric quasi-geoid over Egypt. Finally, three types of transformation models, with four, five and seven parameters, have been applied to reduce the data biases and to better fit the quasi-geoid heights obtained from the studied GOCE-based GGMs to those from GPS/levelling data. These models reveal that the standard deviation of the vertical datum over Egypt is at the level of about 32 cm.
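As a sketch of the bias-reduction step, the classical four-parameter transformation fits the quasi-geoid height differences (GGM minus GPS/levelling) with a bias and three tilt-like terms; the five- and seven-parameter models add further terms. A minimal least-squares fit in Python, assuming latitudes and longitudes in radians (not the study's software):

```python
import numpy as np

def four_parameter_fit(phi, lam, dN):
    """Fit dN = a0 + a1*cos(phi)*cos(lam) + a2*cos(phi)*sin(lam) + a3*sin(phi)
    to quasi-geoid height differences by least squares (sketch of the
    standard four-parameter model; phi, lam in radians).
    Returns the parameters and the post-fit standard deviation.
    """
    A = np.column_stack([np.ones_like(phi),
                         np.cos(phi) * np.cos(lam),
                         np.cos(phi) * np.sin(lam),
                         np.sin(phi)])
    x, *_ = np.linalg.lstsq(A, dN, rcond=None)  # least-squares parameters
    res = dN - A @ x                            # post-fit residuals
    return x, res.std(ddof=4)
```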
Based on space-geodetic data recorded at globally distributed stations over solid land, spanning more than 20 years under the International Terrestrial Reference Frame 2008, our previous estimate of the weighted-average vertical variation of the Earth's solid surface suggests that the Earth's solid part has been expanding at a rate of 0.24 ± 0.05 mm/a over the past two decades. Meanwhile, satellite altimetry observations over the same period give a sea-level rise (SLR) rate of 3.2 ± 0.4 mm/a, of which 1.8 ± 0.5 mm/a is contributed by ice melting over land. This study shows that oceanic thermal expansion contributes 1.0 ± 0.1 mm/a, due to the temperature increase over the past half century, which coincides with estimates by previous authors. The altimetry-observed SLR is thus not balanced by ice melting and thermal expansion, a discrepancy that was an open problem before this study. We infer that the oceanic part of the Earth is expanding at a rate of about 0.4 mm/a. Combining the expansion rates of the land and oceanic parts, we conclude that the Earth has been expanding at a rate of 0.35 ± 0.47 mm/a over the past two decades. If the Earth expands at this rate, the altimetry-observed SLR is well explained. (Funding: National 973 Project of China (2013CB733305, 2013CB733301); National Natural Science Foundation of China (41174011, 41429401, 41210006, 41128003, 41021061).)
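The combined rate quoted above is consistent with an area-weighted mean of the land and ocean rates. Assuming land and ocean fractions of roughly 29% and 71% of the Earth's surface (standard values, not stated in the abstract), the arithmetic works out as:

```latex
\dot{R} \;\approx\; f_{\mathrm{land}}\,\dot{R}_{\mathrm{land}}
        + f_{\mathrm{ocean}}\,\dot{R}_{\mathrm{ocean}}
        \;\approx\; 0.29 \times 0.24 + 0.71 \times 0.4
        \;\approx\; 0.35~\mathrm{mm/a}.
```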
Based on GPS velocities during 1999-2007, large-scale GPS baseline time series during 1999-2008, and cross-fault leveling data during 1985-2008, this paper analyzes and summarizes the movement, tectonic deformation and strain accumulation characteristics of the Longmenshan fault and its surrounding area before the MS 8.0 Wenchuan earthquake, as well as the possible physical mechanism late in the seismic cycle. Multiple results indicate the following. GPS velocity profiles show that the obvious continuous deformation across the eastern Qinghai-Tibetan Plateau before the earthquake was distributed across a zone at least 500 km wide, while there was little deformation in the Sichuan Basin and the Longmenshan fault zone, meaning that the eastern Qinghai-Tibetan Plateau continuously supplied energy accumulation to the locked Longmenshan fault zone. GPS strain rates show that east-west compressive deformation was larger northwest of the mid-northern segment of the fault zone, that the deformation amplitude decreased gradually from the far field toward the fault zone, and that there was little deformation within the fault zone itself. East-west compressive deformation was significant around the southwestern segment, where the strain accumulation rate was larger than that of the mid-northern segment. Fault locking indicates that nearly the whole Longmenshan fault was locked before the earthquake, except the earthquake source, which was weakly locked, and a 20-km-wide patch in the southwestern segment between 12 km and 22.5 km depth that was creeping. Large-scale GPS baseline time series in the northeast direction became generally compressive from 2005 in the North-South Seismic Belt, reflecting enhanced relative compressive deformation. The cross-fault leveling data show that the annual vertical change rate and the accumulated deformation trend in the Longmenshan fault zone were small, indicating that vertical activity near the fault was very weak and the fault was tightly locked. From the GPS and cross-fault leveling analyses before the Wenchuan earthquake, we conclude that, at relatively small scales of crustal deformation, the Longmenshan fault was tightly locked from the surface to depth, with weak horizontal and vertical deformation around the fault. The process of weak deformation may be slow, and the weakly deforming area may grow as a large earthquake approaches. At larger scales, the continuous, slow compressive deformation across the eastern Qinghai-Tibetan Plateau before the earthquake provided the dynamic support for strain accumulation in the Longmenshan fault zone. (Funding: National Key R&D Program of China (2018YFC1503606, 2017YFC1500502); Earthquake Tracking Task (2019010215).)
Complex survey designs often involve unequal selection probabilities of clusters or units within clusters. When estimating models for complex survey data, scaled weights are incorporated into the likelihood, producing a pseudo-likelihood. In a three-level weighted analysis for a binary outcome, we implemented two methods for scaling the sampling weights in the National Health Survey of Pakistan (NHSP). With health care utilization as the binary outcome, we found age, gender, household (HH) goods, urban/rural status, community development index, province and marital status to be significant predictors of health care utilization (p-value < 0.05). The variance of the random intercepts using scaling method 1 is estimated as 0.0961 (standard error 0.0339) at the PSU level and 0.2726 (standard error 0.0995) at the household level. Both estimates are significantly different from zero (p-value < 0.05) and indicate considerable heterogeneity in health care utilization across households and PSUs. All three analyses of the NHSP data, weighted (two scaling methods) and unweighted, converged to almost identical results with few exceptions, possibly because of the large number of third- and second-level clusters and the relatively small ICC. We performed a simulation study to assess the effect of varying prevalence and intra-class correlation coefficients (ICCs) on the bias of fixed-effect parameters and variance components in a multilevel pseudo maximum likelihood (weighted) analysis. The simulation results showed that the performance of the scaled weighted estimators is satisfactory for both scaling methods. Incorporating simulation into the analysis of complex multilevel surveys allows the integrity of the results to be tested and is recommended as good practice.
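For reference, the two weight scalings most often used in multilevel pseudo-likelihood analysis (Pfeffermann et al., 1998) rescale the within-cluster weights to sum either to the cluster sample size or to an effective cluster size. A sketch follows; the correspondence to the paper's "method 1" and "method 2" labels is an assumption:

```python
import numpy as np

def scale_weights(w, method=1):
    """Scale within-cluster sampling weights for pseudo-likelihood
    estimation (sketch). Method 1: scaled weights sum to the cluster
    sample size n_j. Method 2: they sum to the effective cluster size
    (sum w)^2 / sum(w^2)."""
    w = np.asarray(w, dtype=float)
    if method == 1:
        return w * len(w) / w.sum()
    return w * w.sum() / (w ** 2).sum()

w = [1.0, 2.0, 2.0, 5.0]          # unequal selection probabilities
print(scale_weights(w, 1).sum())  # 4.0 (cluster size)
print(scale_weights(w, 2).sum())  # 100/34 ~ 2.94 (effective size)
```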
This research was an effort to select the best imputation method for missing upper-air temperature data over 24 standard pressure levels. We implemented four imputation techniques: inverse distance weighting, bilinear, natural and nearest-neighbour interpolation. The performance indicators adopted were the root mean square error (RMSE), the absolute mean error (AME), the correlation coefficient and the coefficient of determination (R²). We randomly held out 30% of the 324 total samples and predicted them from the remaining 70%. Although all four interpolation methods performed well (RMSE and AME below 1), the bilinear method was the most accurate, with the smallest errors. The RMSE for the bilinear method remained below 0.01 at all pressure levels except 1000 hPa, where it was 0.6. AME values were low (<0.1) at all pressure levels with bilinear imputation. A very strong correlation (>0.99) was found between actual and predicted air temperatures with this method, and the high coefficient of determination (0.99) indicates the best fit to the surface. Natural interpolation gave similar results, but after inspecting scatter plots for each month, its imputations appear slightly less accurate in certain months than the bilinear method.
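A sketch of the hold-out experiment in Python, using SciPy's scattered-data interpolation (its "linear" and "nearest" methods stand in for the bilinear and nearest-neighbour techniques; the station layout below is an illustrative assumption, not the study's data):

```python
import numpy as np
from scipy.interpolate import griddata

def impute_and_score(obs_pts, obs_vals, miss_pts, truth, method="linear"):
    """Impute held-out temperatures by spatial interpolation and score
    them with RMSE and absolute mean error (AME)."""
    pred = griddata(obs_pts, obs_vals, miss_pts, method=method)
    rmse = float(np.sqrt(np.nanmean((pred - truth) ** 2)))
    ame = float(np.nanmean(np.abs(pred - truth)))
    return pred, rmse, ame

# Toy 70/30 split of synthetic station data at one pressure level.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(324, 2))                   # station coordinates
vals = 15.0 - 0.5 * pts[:, 0] + rng.normal(0, 0.2, 324)   # synthetic temperatures
train, test = pts[:227], pts[227:]
_, rmse, ame = impute_and_score(train, vals[:227], test, vals[227:])
```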
The in-orbit commissioning of the ZY-1 02C satellite is proceeding smoothly. According to experts in the field, the imagery quality of the satellite has reached, or nearly reached, the level of comparable international satellites. The ZY-1 02C and ZY-3 satellites were successfully launched on December 22, 2011 and January 9, 2012, respectively. The China Centre for Resources Satellite Data and Application (CRSDA) was responsible for the building of a ground
A research study collected intensive longitudinal data from cancer patients on a daily basis, as well as non-intensive longitudinal survey data on a monthly basis. Although the daily data require separate analysis, they can also be used to generate predictors of monthly outcomes. This work addresses alternatives for generating daily-data predictors of monthly outcomes. Analyses are reported of depression, measured by the Patient Health Questionnaire 8, as the monthly survey outcome. Daily measures include the number of opioid medications taken, the number of pain flares, least pain levels, and worst pain levels. Predictors are averages of recent non-missing values for each daily measure recorded on or prior to the survey dates for depression values. Weights for recent non-missing values are based on the number of days between measurement of a recent value and the survey date. Five alternative averages are considered: averages with unit weights, averages with reciprocal weights, weighted averages with reciprocal weights, averages with exponential weights, and weighted averages with exponential weights. Adaptive regression methods based on likelihood cross-validation (LCV) scores are used to generate fractional polynomial models for possible nonlinear dependence of depression on each average. For all four daily measures, the best LCV score over all types of averages is generated by the average of recent non-missing values with reciprocal weights. The generated models are nonlinear and monotonic. Results indicate that an appropriate choice is to assume three recent non-missing values and use the average with reciprocal weights of the first three recent non-missing values.
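A minimal sketch of the selected predictor: the average, with reciprocal weights, of the first three recent non-missing daily values on or before the survey date. The day-count convention (the survey day counted as day 1 so the reciprocal stays finite) is an assumption; the abstract does not state it.

```python
import numpy as np

def reciprocal_weighted_average(days, values, k=3):
    """Average the k most recent non-missing daily values, each weighted
    by 1/d where d is the number of days between measurement and survey
    date (survey day = 1, an assumed convention). Inputs are sorted most
    recent first; values use np.nan for missing days."""
    d = np.asarray(days, dtype=float)
    v = np.asarray(values, dtype=float)
    keep = ~np.isnan(v)
    d, v = d[keep][:k], v[keep][:k]   # first k recent non-missing values
    w = 1.0 / d                        # reciprocal weights
    return float(np.sum(w * v) / np.sum(w))

# Worst-pain levels on days 1..5 before a survey, with day 2 missing:
print(reciprocal_weighted_average([1, 2, 3, 4, 5],
                                  [6.0, np.nan, 4.0, 5.0, 3.0]))
```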
Floods are the most widespread climate-related hazard in the world, and they impact more people globally than any other type of natural disaster. Floods cause over one third of the total economic loss from natural catastrophes and are responsible for two thirds of the people affected by natural disasters. Studies have shown that damage reductions due to improved forecasts can range from a few percentage points to as much as 35% of annual flood damages. About 300 people lose their lives each year due to floods and landslides in Nepal, with property damage exceeding 626 million NPR on average. The West Rapti River basin is one of the most flood-prone river basins in Nepal. A real-time flood early warning system, together with the development of water management and flood protection schemes, plays a crucial role in reducing the loss of lives and property and in the overall development of the basin. Non-structural mitigation measures keep people away from floods; they are designed to reduce the impact of flooding on society and the economy. This paper presents an overview of flood problems in the West Rapti River basin, the causes and consequences of recent floods, and the applicability and effectiveness of real-time data for flood early warning in Nepal.
Because consumers' privacy data sharing has multifaceted and complex effects on an e-commerce platform and its two-sided agents, consumers and sellers, a game-theoretic model in a monopoly e-market is set up to study the equilibrium strategies of the three agents (the platform, the seller on it, and consumers) under privacy data sharing. The equilibrium decisions show that, after sharing consumers' privacy data once, the platform can collect more privacy data from consumers. Meanwhile, privacy data sharing pushes the seller to reduce the product price, and the platform will increase the transaction fee if the privacy data sharing value is high. Privacy data sharing always benefits consumers and the seller; however, the platform's profit decreases if the privacy data sharing value is low and the privacy data sharing level is high. Finally, an extended model with an incomplete-information game among the agents is discussed. The results show that neither the platform nor the seller can obtain a high profit from privacy data sharing. Factors including the seller's probability of buying privacy data, the privacy data sharing value and the privacy data sharing level affect the two agents' payoffs. If the platform wishes to benefit from privacy data sharing, it should increase the probability that the seller buys privacy data or increase the privacy data sharing value. (Funding: National Social Science Foundation of China (No. 17BGL196).)
The sea-level anomaly (SLA) from satellite altimetry has high accuracy and can be used to improve ocean state estimation through assimilation techniques. However, the lack of an accurate mean dynamic topography (MDT) remains a troublesome issue in ocean data assimilation. Previous studies showed that errors in the MDT have significant impacts on assimilation results, especially on the time-mean components of ocean states, and on the time-variant parts of the states via nonlinear ocean dynamics. This study focuses on the temporal-spatial differences among three MDTs and their impacts on the SLA analysis in the South China Sea (SCS). Theoretical analysis shows that, even for linear models, errors in the MDT affect the SLA analysis in a sequential data assimilation scheme. Assimilation experiments based on the EnOI scheme and HYCOM with the three MDTs, from July 2003 to June 2004, also show that the SLA assimilation is very sensitive to the choice of MDT in the SCS, with obvious differences between the experimental results and observations in the centre of the SCS and in the vicinity of the Philippine Islands. A new MDT for the assimilation of SLA data in the SCS is proposed. The assimilation experiment with this new MDT shows a marked reduction in the RMSEs (and increase in the correlation coefficient) between the experimental and observed SLA. Furthermore, the subsurface temperature field is also improved with the new MDT in the SCS. (Funding: National Basic Research Program of China (Nos 2012CB417404, 2011CB403504); National Natural Science Foundation of China (No. 41075064); National High Technology Research and Development Program of China (No. 2008AA09A404-3).)
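The dependence on the MDT can be seen directly in the analysis step: altimetric SLA must be converted to absolute dynamic topography by adding the MDT before the innovation is formed, so any MDT error biases the innovation. A minimal EnOI update sketch (the matrix observation operator and variable names are assumptions, not the study's code):

```python
import numpy as np

def enoi_update(xb, E, sla_obs, mdt, H, R):
    """One EnOI analysis step for SLA data (sketch).
    xb: background state (n,); E: static ensemble anomalies (n, m);
    sla_obs, mdt: observed SLA and MDT at the obs points (p,);
    H: observation operator (p, n); R: obs-error covariance (p, p)."""
    PbHt = E @ (E.T @ H.T) / (E.shape[1] - 1)   # Pb H^T from the ensemble
    S = H @ PbHt + R                            # innovation covariance
    d = (sla_obs + mdt) - H @ xb                # MDT enters the innovation
    return xb + PbHt @ np.linalg.solve(S, d)    # analysis state
```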
The need for travel demand models is growing worldwide. Obtaining reasonably accurate level-of-service (LOS) attributes of different travel modes, such as travel time and cost, which represent the performance of the transportation system, is not a trivial task, especially in the growing cities of developing countries. This study investigates the sensitivity of a travel mode choice model to different specifications of network-based LOS attributes using a mixed logit model. The study also examines the possibilities of correcting some of the inaccuracies in network-based LOS attributes, and explores the effects of different LOS data specifications on implied values of time and on aggregate forecasting. The findings indicate that the implied values of time are very sensitive to the specification of data and model, implying that utmost care must be taken if the purpose of the model is to estimate values of time. Models estimated on all specifications of LOS data perform well in prediction, suggesting that the extra expense of developing more detailed and accurate network models to derive more precise LOS attributes is unnecessary for impact analyses of some policies.
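The sensitivity of implied values of time follows from how they are computed: with a linear-in-parameters utility, the value of time is the marginal rate of substitution between time and cost, i.e. the ratio of the two coefficients, so any distortion in either LOS attribute propagates directly into the ratio (and in a mixed logit the coefficients, and hence the ratio, may themselves be random). This is the standard definition, not necessarily the paper's exact specification:

```latex
\mathrm{VOT} \;=\; \frac{\partial U/\partial t}{\partial U/\partial c}
             \;=\; \frac{\beta_{\mathrm{time}}}{\beta_{\mathrm{cost}}}
```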
This paper presents a description and performance evaluation of a new bit-level, lossless, adaptive and asymmetric data compression scheme based on the adaptive character wordlength (ACW(n)) algorithm. The proposed scheme enhances the compression ratio of the ACW(n) algorithm by dividing the binary sequence into a number of subsequences (s), each satisfying the condition that the number of distinct decimal values (d) of its n-bit characters is at most 256. The new scheme is therefore referred to as ACW(n, s), where n is the adaptive character wordlength and s is the number of subsequences. The scheme was used to compress a number of text files from standard corpora. The results demonstrate that ACW(n, s) achieves a higher compression ratio than many widely used compression algorithms and performs competitively compared with state-of-the-art compression tools.
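A sketch of the subsequence condition in Python: the binary sequence is read as n-bit characters and cut greedily whenever a new character value would push the number of distinct values in the current subsequence past 256 (the paper's actual partitioning strategy may differ; any ragged tail shorter than n bits is ignored here):

```python
def split_subsequences(bits, n):
    """Greedily partition a bit string into subsequences of n-bit
    characters, each containing at most 256 distinct character values,
    as the ACW(n, s) condition requires (sketch)."""
    chars = [bits[i:i + n] for i in range(0, len(bits) - len(bits) % n, n)]
    subsequences, current, seen = [], [], set()
    for c in chars:
        if c not in seen and len(seen) == 256:   # would exceed the limit
            subsequences.append("".join(current))
            current, seen = [], set()
        seen.add(c)
        current.append(c)
    if current:
        subsequences.append("".join(current))
    return subsequences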