The empirical literature on China's outward foreign direct investment mainly relies on aggregate data from official statistics, but the reliability of such data is currently a matter of concern because it does not take account of relevant features such as industry breakdown, ownership structure and entry mode. A novel firm-level database, EMENDATA, compiled by matching data from several available sources on various types of cross-border deals and including information on group structure, provides a more accurate picture and enables new empirical analyses of the rapidly increasing presence of Chinese companies abroad. Based on this database, this paper offers a more precise assessment of the geographical and sector specialization patterns of Chinese outward foreign direct investment into Europe and suggests new avenues for future research.
Because differences in sensor precision and various random factors are difficult to control, actual measurement signals deviate from the target signals, which affects the reliability and precision of rotating machinery fault diagnosis. Traditional signal processing methods, such as classical inference and the weighted averaging algorithm, usually lack dynamic adaptability, so faults are easily misjudged or missed. To enhance the accuracy and precision of multi-sensor vibration signal measurement in rotating machinery fault diagnosis, a novel data-level fusion approach based on correlation function analysis is presented to quickly determine the weights of multi-sensor vibration signals. The approach does not require prior information about the sensors; the weight of each sensor is determined from the correlation measure of the real-time data in the data-level fusion process. A signal with a greater correlation measure receives a greater weight, and vice versa. Because the approach takes full advantage of the sensors' own information to determine the weights, it can effectively suppress large errors and can still fuse data even when a sensor fails. Moreover, it has good anti-jamming performance, because the correlation measures between noise and effective signals are usually small. Using simulated typical signals collected from multiple sensors, the dynamic adaptability and fault tolerance of the proposed approach are compared with those of the traditional weighted averaging approach. Finally, a rotor dynamics and integrated fault simulator is used to verify the feasibility and advantages of the proposed approach. The results show that the correlation-function-weighted data-level fusion approach outperforms the traditional weighted average approach in fusion precision and dynamic adaptability. The approach is also adaptable and easy to use, and can be applied to other areas of vibration measurement.
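The abstract does not give the paper's exact weighting formula, but the idea of deriving data-level fusion weights from inter-sensor correlation can be sketched as follows. The choice of the mean absolute pairwise correlation as the correlation measure is an illustrative assumption, not the paper's definition:

```python
import numpy as np

def corr_weighted_fusion(signals):
    """Fuse multi-sensor signals using correlation-based weights.

    Each sensor's weight is proportional to the summed absolute
    correlation of its signal with the other sensors' signals, so
    mutually consistent sensors dominate and an outlying (e.g. failed)
    sensor is down-weighted without any prior sensor information.
    """
    signals = np.asarray(signals)      # shape: (n_sensors, n_samples)
    r = np.corrcoef(signals)           # pairwise correlation matrix
    np.fill_diagonal(r, 0.0)           # ignore self-correlation
    measure = np.abs(r).sum(axis=1)    # correlation measure per sensor
    w = measure / measure.sum()        # normalized weights
    return w, w @ signals              # weights and fused signal

# Example: two consistent sensors and one failed (pure-noise) sensor
t = np.linspace(0.0, 1.0, 500)
target = np.sin(2 * np.pi * 50 * t)
rng = np.random.default_rng(0)
s1 = target + 0.05 * rng.standard_normal(t.size)
s2 = target + 0.05 * rng.standard_normal(t.size)
s3 = rng.standard_normal(t.size)       # failed sensor: no signal content
w, fused = corr_weighted_fusion([s1, s2, s3])
# w[2] comes out far smaller than w[0] and w[1]
```

The fused signal tracks the target even though one sensor has failed, which is the fault-tolerance property the abstract describes.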
The Global Geopotential Models (GGMs) of GOCE (Gravity field and steady-state Ocean Circulation Explorer) differ globally as well as regionally in their accuracy and resolution, depending on the maximum degree and order (d/o) of the fully normalized spherical harmonic (SH) coefficients that express each GGM. The main idea of this study is to compare the free-air gravity anomalies and quasi-geoid heights determined from several recent GOCE-based GGMs with the corresponding ones from the Earth Gravitational Model 2008 (EGM2008) over Egypt on the one hand, and with ground-based measurements on the other hand. The comparison of the GOCE-based GGMs with terrestrial gravity and GPS/levelling data shows a clear improvement with respect to EGM2008. The 4th-release GOCE-based GGM developed with the space-wise solution strategy (SPW_R4) approximates the gravity field well over the Egyptian region. The SPW_R4 model is accordingly suggested as a reference model for recovering the long-wavelength (up to SH d/o 200) components of quasi-geoid heights when modelling the gravimetric quasi-geoid over Egypt. Finally, three types of transformation models, four-, five- and seven-parameter transformations, have been applied to reduce the data biases and to provide a better fit of the quasi-geoid heights obtained from the studied GOCE-based GGMs to those from GPS/levelling data. These models reveal that the standard deviation of the vertical datum over Egypt is at the level of about 32 cm.
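The four-parameter transformation mentioned above is commonly written as dN = a0 + a1 cosφ cosλ + a2 cosφ sinλ + a3 sinφ and fitted by least squares to the differences between GPS/levelling and GGM-derived quasi-geoid heights. A minimal sketch, with illustrative coordinates and parameter values rather than the study's data:

```python
import numpy as np

def fit_four_parameter(phi, lam, dN):
    """Fit the classical four-parameter datum-fitting surface
    dN = a0 + a1*cos(phi)*cos(lam) + a2*cos(phi)*sin(lam) + a3*sin(phi)
    to quasi-geoid height differences dN (GPS/levelling minus GGM)
    by least squares; returns the parameters and the residuals."""
    A = np.column_stack([
        np.ones_like(phi),
        np.cos(phi) * np.cos(lam),
        np.cos(phi) * np.sin(lam),
        np.sin(phi),
    ])
    a, *_ = np.linalg.lstsq(A, dN, rcond=None)
    return a, dN - A @ a

# Synthetic check: recover known parameters from noiseless differences
rng = np.random.default_rng(1)
phi = np.radians(rng.uniform(22.0, 32.0, 50))   # latitudes roughly spanning Egypt
lam = np.radians(rng.uniform(25.0, 37.0, 50))   # longitudes roughly spanning Egypt
true = np.array([0.50, -0.30, 0.20, 0.80])      # metres, arbitrary illustrative values
dN = (true[0] + true[1] * np.cos(phi) * np.cos(lam)
      + true[2] * np.cos(phi) * np.sin(lam) + true[3] * np.sin(phi))
a, res = fit_four_parameter(phi, lam, dN)       # a recovers `true`
```

The five- and seven-parameter variants extend the design matrix with additional trigonometric terms in the same way.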
According to space-geodetic data recorded at globally distributed stations over solid land, spanning a period of more than 20 years under the International Terrestrial Reference Frame 2008, our previous estimate of the weighted-average vertical variation of the Earth's solid surface suggests that the Earth's solid part has been expanding at a rate of 0.24 ± 0.05 mm/a over the past two decades. Meanwhile, satellite altimetry observations over the same two decades give a sea level rise (SLR) rate of 3.2 ± 0.4 mm/a, of which 1.8 ± 0.5 mm/a is contributed by ice melting over land. This study shows that oceanic thermal expansion contributes 1.0 ± 0.1 mm/a, due to the temperature increase of the past half century, which coincides with the estimates of previous authors. The altimetry-observed SLR is thus not balanced by ice melting plus thermal expansion, which was an open problem before this study. Here, however, we infer that the oceanic part of the Earth is expanding at a rate of about 0.4 mm/a. Combining the expansion rates of the land and oceanic parts, we conclude that the Earth has been expanding at a rate of 0.35 ± 0.47 mm/a over the past two decades. If the Earth expands at this rate, the altimetry-observed SLR can be well explained.
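The budget-closure argument above can be checked arithmetically: the inferred ~0.4 mm/a expansion of the oceanic part closes the gap between the altimetry-observed rate and the sum of the ice-melting and thermal-expansion contributions. The quadrature combination of the uncertainties below is our assumption, not stated in the abstract:

```python
import math

# Rates in mm/a from the abstract: (value, 1-sigma uncertainty)
ice_melt  = (1.8, 0.5)   # ice melting over land
thermal   = (1.0, 0.1)   # oceanic thermal expansion
sea_floor = (0.4, 0.0)   # inferred expansion of the oceanic part (no sigma quoted)
observed  = (3.2, 0.4)   # altimetry-observed sea level rise

budget = ice_melt[0] + thermal[0] + sea_floor[0]        # 3.2 mm/a
sigma = math.sqrt(ice_melt[1] ** 2 + thermal[1] ** 2
                  + sea_floor[1] ** 2)                   # ~0.51 mm/a
gap = observed[0] - budget                               # ~0 mm/a
```

With the oceanic-expansion term included, the budget matches the observed 3.2 ± 0.4 mm/a within its uncertainty.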
Based on GPS velocities during 1999-2007, large-scale GPS baseline time series during 1999-2008 and cross-fault leveling data during 1985-2008, this paper analyzes and summarizes the movement, tectonic deformation and strain accumulation evolution of the Longmenshan fault and the surrounding area before the MS8.0 Wenchuan earthquake, as well as the possible physical mechanism late in the seismic cycle of the Wenchuan earthquake. Multiple results indicate the following. GPS velocity profiles show that obvious continuous deformation across the eastern Qinghai-Tibetan Plateau before the earthquake was distributed across a zone at least 500 km wide, while there was little deformation in the Sichuan Basin and the Longmenshan fault zone, which means that the eastern Qinghai-Tibetan Plateau continuously fed energy accumulation into the locked Longmenshan fault zone. GPS strain rates show that east-west compressive deformation was larger northwest of the mid-northern segment of the Longmenshan fault zone, that the deformation amplitude decreased gradually from the far field toward the fault zone, and that there was little deformation within the fault zone itself. East-west compressive deformation was significant around the southwestern segment of the Longmenshan fault zone, where the strain accumulation rate was larger than that of the mid-northern segment. Fault locking indicates that nearly the whole Longmenshan fault was locked before the earthquake, except for the earthquake source, which was weakly locked, and a 20-km-wide patch in the southwestern segment between 12 km and 22.5 km depth that was creeping. Large-scale GPS baseline time series in the northeast direction generally became compressive from 2005 in the North-South Seismic Belt, reflecting enhanced relative compressive deformation. The cross-fault leveling data show that the annual vertical change rate and the deformation trend accumulation rate in the Longmenshan fault zone were small, which indicates that vertical activity near the fault was very weak and the fault was tightly locked. Based on the analyses of GPS and cross-fault leveling data before the Wenchuan earthquake, we consider that the Longmenshan fault was tightly locked from the surface to depth, and that horizontal and vertical deformation were weak around the fault at the relatively small scale of crustal deformation. The process of weak deformation may be slow, and the weak-deformation area may grow larger as a large earthquake approaches. Continuous, slow compressive deformation across the eastern Qinghai-Tibetan Plateau before the earthquake provided dynamic support for strain accumulation in the Longmenshan fault zone at the relatively large scale of crustal deformation.
Complex survey designs often involve unequal selection probabilities of clusters or units within clusters. When estimating models for complex survey data, scaled weights are incorporated into the likelihood, producing a pseudo-likelihood. In a 3-level weighted analysis for a binary outcome, we implemented two methods for scaling the sampling weights in the National Health Survey of Pakistan (NHSP). For the NHSP, with health care utilization as a binary outcome, we found age, gender, household (HH) goods, urban/rural status, community development index, province and marital status to be significant predictors of health care utilization (p-value < 0.05). The variance of the random intercepts using scaling method 1 is estimated as 0.0961 (standard error 0.0339) at the PSU level and 0.2726 (standard error 0.0995) at the household level, respectively. Both estimates are significantly different from zero (p-value < 0.05) and indicate considerable heterogeneity in health care utilization with respect to households and PSUs. The NHSP data analysis showed that all three analyses, weighted (two scaling methods) and unweighted, converged to almost identical results, with few exceptions. This may have occurred because of the large number of 3rd- and 2nd-level clusters and the relatively small ICC. We performed a simulation study to assess the effect of varying prevalence and intra-class correlation coefficients (ICCs) on the bias of fixed-effect parameters and variance components of a multilevel pseudo maximum likelihood (weighted) analysis. The simulation results showed that the performance of the scaled weighted estimators is satisfactory for both scaling methods. Incorporating simulation into the analysis of complex multilevel surveys allows the integrity of the results to be tested and is recommended as good practice.
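The abstract does not spell out the two scaling methods. The two scalings most commonly used for multilevel pseudo-likelihood, rescaling within-cluster weights to sum to the cluster sample size or to the effective sample size, can be sketched as follows, under the assumption that these are the methods meant:

```python
import numpy as np

def scale_weights(w, method="size"):
    """Scale within-cluster sampling weights w (one cluster) using the
    two standard scalings for multilevel pseudo-likelihood analysis.

    "size":      scaled weights sum to the cluster sample size n_j
    "effective": scaled weights sum to the effective sample size
                 (sum w)^2 / sum(w^2)
    """
    w = np.asarray(w, dtype=float)
    if method == "size":
        lam = len(w) / w.sum()
    elif method == "effective":
        lam = w.sum() / (w ** 2).sum()
    else:
        raise ValueError(f"unknown method: {method}")
    return lam * w

# Unequal selection probabilities within one cluster of 4 units
w = np.array([1.0, 2.0, 4.0, 1.0])
w1 = scale_weights(w, "size")        # sums to 4 (cluster size)
w2 = scale_weights(w, "effective")   # sums to 8**2/22, about 2.91
```

Either set of scaled weights would then multiply the unit-level log-likelihood contributions in the pseudo-likelihood.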
The in-orbit commissioning of the ZY-1 02C satellite is proceeding smoothly. According to the relevant experts in this field, the imagery quality of the satellite has reached or nearly reached the level of international satellites of the same kind. The ZY-1 02C and ZY-3 satellites were successfully launched on December 22, 2011 and January 9, 2012, respectively. China Centre for Resources Satellite Data and Application (CRSDA) was responsible for the building of a ground
This research was an effort to select the best imputation method for missing upper-air temperature data over 24 standard pressure levels. We implemented four imputation techniques: inverse distance weighting, bilinear, natural and nearest-neighbour interpolation. The performance indicators adopted for these techniques were the root mean square error (RMSE), absolute mean error (AME), correlation coefficient and coefficient of determination (R²). We randomly held out 30% of the total samples (324 in all) to be predicted from the remaining 70% of the data. Although all four interpolation methods performed well (RMSE and AME below 1) for imputing air temperature data, the bilinear method was the most accurate, with the smallest errors. The RMSE for the bilinear method remained below 0.01 at all pressure levels except 1000 hPa, where it was 0.6. AME values were low (<0.1) at all pressure levels for bilinear imputations. A very strong correlation (>0.99) was found between actual and predicted air temperature data with this method. The high coefficient of determination (0.99) for the bilinear interpolation method indicates the best fit to the surface. We found similar results for imputation with the natural interpolation method, but after inspecting scatter plots for each month, imputations with this method appear slightly worse in certain months than those of the bilinear method.
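As a concrete illustration of one of the compared techniques, a minimal one-dimensional inverse-distance-weighting imputation might look like the sketch below. The pressure levels and temperatures are made-up values, and the study's interpolation was applied to its own station/level geometry rather than this toy setup:

```python
import numpy as np

def idw(x_known, y_known, x_query, power=2):
    """Inverse-distance-weighted estimate of y at x_query from known
    (x, y) samples: weights are 1/d**power; an exact coordinate match
    returns the known value directly."""
    x_known = np.asarray(x_known, dtype=float)
    y_known = np.asarray(y_known, dtype=float)
    d = np.abs(x_known - x_query)
    if np.any(d == 0):
        return float(y_known[d == 0][0])
    w = 1.0 / d ** power
    return float(w @ y_known / w.sum())

# Impute a missing temperature at 500 hPa from neighbouring levels
levels = np.array([400.0, 450.0, 550.0, 600.0])   # hPa
temps  = np.array([-23.0, -17.5, -7.2, -2.9])     # deg C, illustrative
imputed = idw(levels, temps, 500.0)               # about -12.47 deg C
```

The nearer 450 and 550 hPa levels dominate the estimate because their weights are four times those of the 400 and 600 hPa levels.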
A research study collected intensive longitudinal data from cancer patients on a daily basis as well as non-intensive longitudinal survey data on a monthly basis. Although the daily data need separate analysis, those data can also be utilized to generate predictors of monthly outcomes. Alternatives for generating daily data predictors of monthly outcomes are addressed in this work. Analyses are reported of depression measured by the Patient Health Questionnaire 8 as the monthly survey outcome. Daily measures include numbers of opioid medications taken, numbers of pain flares, least pain levels, and worst pain levels. Predictors are averages of recent non-missing values for each daily measure recorded on or prior to survey dates for depression values. Weights for recent non-missing values are based on days between measurement of a recent value and a survey date. Five alternative averages are considered: averages with unit weights, averages with reciprocal weights, weighted averages with reciprocal weights, averages with exponential weights, and weighted averages with exponential weights. Adaptive regression methods based on likelihood cross-validation (LCV) scores are used to generate fractional polynomial models for possible nonlinear dependence of depression on each average. For all four daily measures, the best LCV score over averages of all types is generated using the average of recent non-missing values with reciprocal weights. Generated models are nonlinear and monotonic. Results indicate that an appropriate choice would be to assume three recent non-missing values and use the average with reciprocal weights of the first three recent non-missing values.
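The reciprocal-weight average described above can be sketched as follows. The exact weight definition (here 1/(1 + lag in days)) and the sample values are illustrative assumptions; the paper states only that weights are based on the days between a value's measurement and the survey date:

```python
def reciprocal_weight_average(values_by_day, survey_day, k=3):
    """Weighted average of the k most recent non-missing daily values
    on or before survey_day, with weight 1/(1 + lag), where lag is the
    number of days between the value's day and the survey day."""
    recent = [(survey_day - day, v)
              for day, v in sorted(values_by_day.items(), reverse=True)
              if day <= survey_day and v is not None][:k]
    if not recent:
        return None
    wsum = sum(1.0 / (1 + lag) for lag, _ in recent)
    return sum(v / (1 + lag) for lag, v in recent) / wsum

# Worst-pain levels by study day; day 9 is missing
pain = {10: 6, 9: None, 8: 4, 7: 5, 6: 7}
avg = reciprocal_weight_average(pain, survey_day=10, k=3)
# uses days 10, 8, 7 with weights 1, 1/3, 1/4 -> 103/19, about 5.42
```

With k=3 this matches the paper's recommended choice of averaging the first three recent non-missing values with reciprocal weights.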
Because consumers' privacy data sharing has multifaceted and complex effects on an e-commerce platform and its two-sided agents, consumers and sellers, a game-theoretic model in a monopoly e-market is set up to study the equilibrium strategies of the three agents (the platform, the seller on it and consumers) under privacy data sharing. Equilibrium decisions show that after sharing consumers' privacy data once, the platform can collect more privacy data from consumers. Meanwhile, privacy data sharing pushes the seller to reduce the product price. Moreover, the platform will increase the transaction fee if the privacy data sharing value is high. It is also shown that privacy data sharing always benefits consumers and the seller. However, the platform's profit decreases if the privacy data sharing value is low and the privacy data sharing level is high. Finally, an extended model considering an incomplete-information game among the agents is discussed. The results show that neither the platform nor the seller can obtain a high profit from privacy data sharing. Factors including the seller's probability of buying privacy data, the privacy data sharing value and the privacy data sharing level affect the two agents' payoffs. If the platform wishes to benefit from privacy data sharing, it should increase the probability that the seller buys privacy data or increase the privacy data sharing value.
Floods are the most widespread climate-related hazards in the world, and they impact more people globally than any other type of natural disaster. Floods cause over one third of the total economic loss from natural catastrophes and are responsible for two thirds of the people affected by natural disasters. On the other hand, studies have shown that damage reductions due to forecast improvements can range from a few percentage points to as much as 35% of annual flood damages. About 300 people lose their lives each year due to floods and landslides in Nepal, with property damage exceeding 626 million NPR on average. The West Rapti River basin is one of the most flood-prone river basins in Nepal. A real-time flood early warning system, together with the development of water management and flood protection schemes, plays a crucial role in reducing the loss of lives and property and in the overall development of the basin. Non-structural mitigation measures keep people away from floods; they are designed to reduce the impact of flooding on society and the economy. This paper presents an overview of flood problems in the West Rapti River basin, the causes and consequences of recent floods, and the applicability and effectiveness of real-time data for flood early warning in Nepal.
This paper introduces the theory and approach of building driving-force models that reveal changes in land utilization level by integrating RS, GPS and GIS technologies, based on the example of Yuanmou County of Yunnan Province. We first created a land utilization type database, a database of natural driving forces for land utilization, and a database of human driving forces for land utilization. Then we obtained the dependent and independent variables of changes in land utilization level by exploring various data. Lastly, we screened the major factors affecting changes in land utilization level by using the powerful spatial correlation analysis and principal component analysis modules of GIS, and obtained a multivariable linear regression model of the changes in land utilization level by using the GIS spatial regression analysis module.
In this study, superficial marine sediments collected from 96 sampling sites were analyzed for 53 inorganic elements. Each sample was digested in aqua regia and analyzed by ICP-MS. A multifractal inverse distance weighted (IDW) interpolation method was applied to compile interpolated maps of both single-element and factor-score distributions. R-mode factor analysis was performed on 23 of the 53 analyzed elements. A three-factor model, accounting for 84.9% of the data variability, was chosen. The three elemental associations obtained were very helpful in distinguishing anthropogenic from geogenic contributions. The aim of this study is to distinguish the distribution patterns of pollutants on the sea floor of the Naples and Salerno bays. In general, local lithologies, water dynamics and anthropogenic activities determine the distribution of the analyzed elements. To estimate the pollution level in the area, Italian guidance, Canadian sediment quality guidance and Long's criteria were chosen for comparability. The results show that arsenic and lead may present highly adverse effects to living creatures.
Nowadays, deep learning methods are widely applied to analyze and predict the trends of various disaster events and to offer alternatives for making appropriate decisions, supporting water resource management and short-term planning. In this paper, the water levels of the Pattani River in southern Thailand are predicted for every hour of a 7-day forecast. A Time Series Transformer and linear regression were applied in this work. Both produced water level forecasts with high accuracy. Moreover, a water level forecasting dashboard was developed for monitoring the water levels of the Pattani River.
Funding: the project "The challenge of globalization: Technology driven foreign direct investment (TFDI) and its implications for the negotiation of International (bilateral and multilateral) Investment Agreements", funded by the Riksbank Foundation
Funding: supported by the National Hi-tech Research and Development Program of China (863 Program, Grant No. 2007AA04Z433), the Hunan Provincial Natural Science Foundation of China (Grant No. 09JJ8005), and the Scientific Research Foundation of the Graduate School of Beijing University of Chemical Technology, China (Grant No. 10Me002)
Funding: supported by the National 973 Project of China (2013CB733305, 2013CB733301) and the National Natural Science Foundation of China (41174011, 41429401, 41210006, 41128003, 41021061)
Funding: supported by the National Key R&D Program of China (2018YFC1503606, 2017YFC1500502) and the Earthquake Tracking Task (2019010215)
文摘Based on GPS velocity during 1999-2007,GPS baseline time series on large scale during1999-2008 and cross-fault leveling data during 1985-2008,the paper makes some analysis and discussion to study and summarize the movement,tectonic deformation and strain accumulation evolution characteristics of the Longmenshan fault and the surrounding area before the MS8. 0 Wenchuan earthquake,as well as the possible physical mechanism late in the seismic cycle of the Wenchuan earthquake. Multiple results indicate that:GPS velocity profiles show that obvious continuous deformation across the eastern Qinghai-Tibetan Plateau before the earthquake was distributed across a zone at least 500 km wide,while there was little deformation in Sichuan Basin and Longmenshan fault zone,which means that the eastern Qinghai-Tibetan Plateau provides energy accumulation for locked Longmenshan fault zone continuously. GPS strain rates show that the east-west compression deformation was larger in the northwest of the mid-northern segment of the Longmenshan fault zone,and deformation amplitude decreased gradually from far field to near fault zone,and there was little deformation in fault zone. The east-west compression deformation was significant surrounding the southwestern segment of the Longmenshan fault zone,and strain accumulation rate was larger than that of mid-northern segment.Fault locking indicates nearly whole Longmenshan fault was locked before the earthquake except the source of the earthquake which was weakly locked,and a 20 km width patch in southwestern segment between 12 km to 22. 5 km depth was in creeping state. GPS baseline time series in northeast direction on large scale became compressive generally from 2005 in the North-South Seismic Belt,which reflects that relative compression deformation enhances. 
The cross-fault leveling data show that annual vertical change rate and deformation trend accumulation rate in the Longmenshan fault zone were little,which indicates that vertical activity near the fault was very weak and the fault was tightly locked. According to analyses of GPS and cross-fault leveling data before the Wenchuan earthquake,we consider that the Longmenshan fault is tightly locked from the surface to the deep,and the horizontal and vertical deformation are weak surrounding the fault in relatively small-scale crustal deformation. The process of weak deformation may be slow,and weak deformation area may be larger when large earthquake is coming. Continuous and slow compression deformation across eastern Qinghai-Tibetan Plateau before the earthquake provides dynamic support for strain accumulation in the Longmenshan fault zone in relative large-scale crustal deformation.
文摘Complex survey designs often involve unequal selection probabilities of clus-ters or units within clusters. When estimating models for complex survey data, scaled weights are incorporated into the likelihood, producing a pseudo likeli-hood. In a 3-level weighted analysis for a binary outcome, we implemented two methods for scaling the sampling weights in the National Health Survey of Pa-kistan (NHSP). For NHSP with health care utilization as a binary outcome we found age, gender, household (HH) goods, urban/rural status, community de-velopment index, province and marital status as significant predictors of health care utilization (p-value < 0.05). The variance of the random intercepts using scaling method 1 is estimated as 0.0961 (standard error 0.0339) for PSU level, and 0.2726 (standard error 0.0995) for household level respectively. Both esti-mates are significantly different from zero (p-value < 0.05) and indicate consid-erable heterogeneity in health care utilization with respect to households and PSUs. The results of the NHSP data analysis showed that all three analyses, weighted (two scaling methods) and un-weighted, converged to almost identical results with few exceptions. This may have occurred because of the large num-ber of 3rd and 2nd level clusters and relatively small ICC. We performed a sim-ulation study to assess the effect of varying prevalence and intra-class correla-tion coefficients (ICCs) on bias of fixed effect parameters and variance components of a multilevel pseudo maximum likelihood (weighted) analysis. The simulation results showed that the performance of the scaled weighted estimators is satisfactory for both scaling methods. Incorporating simulation into the analysis of complex multilevel surveys allows the integrity of the results to be tested and is recommended as good practice.
Abstract: The in-orbit commissioning of the ZY-1 02C satellite is proceeding smoothly. According to experts in this field, the imagery quality of the satellite has reached or nearly reached the level of comparable international satellites. The ZY-1 02C and ZY-3 satellites were successfully launched on December 22, 2011 and January 9, 2012, respectively. The China Centre for Resources Satellite Data and Application (CRSDA) was responsible for the building of a ground
Abstract: This research was an effort to select the best imputation method for missing upper-air temperature data over 24 standard pressure levels. We implemented four imputation techniques: inverse distance weighting, bilinear, natural and nearest-neighbor interpolation. The performance indicators adopted in this research were the root mean square error (RMSE), the absolute mean error (AME), the correlation coefficient and the coefficient of determination ( R<sup>2</sup> ). We randomly withheld 30% of the 324 samples and predicted them from the remaining 70%. Although all four interpolation methods performed well for imputing air temperature data (RMSE and AME < 1), the bilinear method was the most accurate, with the smallest errors. The RMSE for the bilinear method remained < 0.01 at all pressure levels except 1000 hPa, where it was 0.6. Low AME values (< 0.1) were obtained at all pressure levels with bilinear imputation. A very strong correlation (> 0.99) was found between the actual and predicted air temperature data with this method. The high coefficient of determination (0.99) for the bilinear interpolation method indicates the best fit to the surface. We found similar results for imputation with the natural interpolation method, but after inspecting scatter plots for each month, imputations with that method appear slightly less accurate than the bilinear method in certain months.
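As an illustration of one of the compared techniques, a one-dimensional inverse distance weighting imputer together with the RMSE and AME error measures might look like the sketch below. This is a simplification under our own assumptions; the study's actual implementation operates on gridded upper-air fields.

```python
def idw_impute(x, known_x, known_y, power=2):
    # Inverse distance weighting: each known point contributes in
    # proportion to 1 / distance**power from the target location x.
    num = den = 0.0
    for xi, yi in zip(known_x, known_y):
        d = abs(x - xi)
        if d == 0:
            return yi  # target coincides with a known point
        w = 1.0 / d ** power
        num += w * yi
        den += w
    return num / den

def rmse(actual, predicted):
    # Root mean square error between observed and imputed values.
    n = len(actual)
    return (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5

def ame(actual, predicted):
    # Absolute mean error between observed and imputed values.
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
```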
Abstract: A research study collected intensive longitudinal data from cancer patients on a daily basis as well as non-intensive longitudinal survey data on a monthly basis. Although the daily data require separate analysis, they can also be used to generate predictors of monthly outcomes. This work addresses alternatives for generating daily-data predictors of monthly outcomes. Analyses are reported of depression, measured by the Patient Health Questionnaire 8, as the monthly survey outcome. Daily measures include the number of opioid medications taken, the number of pain flares, least pain levels, and worst pain levels. Predictors are averages of recent non-missing values for each daily measure recorded on or prior to the survey dates for depression values. Weights for recent non-missing values are based on the number of days between measurement of a recent value and the survey date. Five alternative averages are considered: averages with unit weights, averages with reciprocal weights, weighted averages with reciprocal weights, averages with exponential weights, and weighted averages with exponential weights. Adaptive regression methods based on likelihood cross-validation (LCV) scores are used to generate fractional polynomial models for possible nonlinear dependence of depression on each average. For all four daily measures, the best LCV score over averages of all types is generated using the average of recent non-missing values with reciprocal weights. The generated models are nonlinear and monotonic. The results indicate that an appropriate choice would be to assume three recent non-missing values and use the average with reciprocal weights of the first three recent non-missing values.
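The winning predictor type, the average with reciprocal weights of recent non-missing values, can be sketched as below. We assume here that a value recorded d days before the survey date receives weight 1/(d + 1); this is an illustrative choice, not necessarily the paper's exact weight definition.

```python
def reciprocal_weighted_average(values, days_before_survey):
    # Weighted average of recent non-missing daily values, where a
    # value recorded d days before the survey date gets weight
    # 1 / (d + 1), so more recent measurements count more heavily.
    weights = [1.0 / (d + 1) for d in days_before_survey]
    total = sum(w * v for w, v in zip(weights, values))
    return total / sum(weights)
```

For example, with the three most recent non-missing values and their lags in days, the call `reciprocal_weighted_average([v1, v2, v3], [d1, d2, d3])` yields the predictor used in the monthly depression model.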
Funding: The National Social Science Foundation of China (No. 17BGL196).
Abstract: Because consumers' privacy data sharing has multifaceted and complex effects on an e-commerce platform and its two-sided agents, consumers and sellers, a game-theoretic model in a monopoly e-market is set up to study the equilibrium strategies of the three agents (the platform, the seller on it, and consumers) under privacy data sharing. The equilibrium decisions show that after sharing consumers' privacy data once, the platform can collect more privacy data from consumers. Meanwhile, privacy data sharing pushes the seller to reduce the product price. Moreover, the platform will increase the transaction fee if the privacy data sharing value is high. It is also shown that privacy data sharing always benefits consumers and the seller. However, the platform's profit decreases if the privacy data sharing value is low and the privacy data sharing level is high. Finally, an extended model considering an incomplete-information game among the agents is discussed. The results show that neither the platform nor the seller can obtain a high profit from privacy data sharing. Factors including the seller's probability of buying privacy data, the privacy data sharing value and the privacy data sharing level affect the two agents' payoffs. If the platform wishes to benefit from privacy data sharing, it should increase the probability that the seller buys privacy data or increase the privacy data sharing value.
Abstract: Floods are the most widespread climate-related hazards in the world, and they impact more people globally than any other type of natural disaster. Flooding causes over one third of the total economic loss from natural catastrophes and is responsible for two thirds of the people affected by natural disasters. Studies have shown that damage reductions due to improved forecasts can range from a few percentage points to as much as 35% of annual flood damages. About 300 people lose their lives each year due to floods and landslides in Nepal, with property damage exceeding 626 million NPR on average. The West Rapti River basin is one of the most flood-prone river basins in Nepal. A real-time flood early warning system, together with the development of water management and flood protection schemes, plays a crucial role in reducing the loss of lives and property and in the overall development of the basin. Non-structural mitigation measures keep people away from floods and are designed to reduce the impact of flooding on society and the economy. This paper presents an overview of flood problems in the West Rapti River basin, the causes and consequences of recent floods, and the applicability and effectiveness of real-time data for flood early warning in Nepal.
Abstract: This paper introduces the theory and approach of building driving-force models that reveal changes in land utilization level by integrating RS, GPS and GIS technologies, using Yuanmou County of Yunnan Province as an example. We first created a land utilization type database, a database of natural driving forces of land utilization, and a database of human driving forces of land utilization. We then obtained the dependent and independent variables of changes in land utilization level by exploring these data. Finally, we screened the major factors affecting changes in land utilization level using the spatial correlation analysis and principal component analysis modules of GIS, and obtained a multivariable linear regression model of the changes in land utilization level using the GIS spatial regression analysis module.
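The final step, regressing the land utilization change variable on the screened driving factors, amounts to ordinary least squares with an intercept. A minimal sketch is below; the toy data in the usage example are illustrative, not from the Yuanmou County databases.

```python
import numpy as np

def fit_linear_model(X, y):
    # Ordinary least squares with an intercept: solves
    # min_b || [1, X] b - y ||^2 and returns [intercept, b1, b2, ...].
    A = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return coef
```

For example, data generated as y = 1 + 2*x1 + 3*x2 yields coefficients close to [1, 2, 3] when the predictor columns are not collinear.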
Abstract: In this study, superficial marine sediments collected from 96 sampling sites were analyzed for 53 inorganic elements. Each sample was digested in aqua regia and analyzed by ICP-MS. A multifractal inverse distance weighted (IDW) interpolation method was applied to compile interpolated maps of both single-element and factor-score distributions. R-mode factor analysis was performed on 23 of the 53 analyzed elements. A three-factor model, accounting for 84.9% of the data variability, was chosen. The three elemental associations obtained were very helpful in distinguishing anthropogenic from geogenic contributions. The aim of this study is to distinguish the distribution patterns of pollutants on the sea floor of the Naples and Salerno bays. In general, local lithologies, water dynamics and anthropogenic activities determine the distribution of the analyzed elements. To estimate the pollution level in the area, Italian guidelines, Canadian sediment quality guidelines and Long's criteria were chosen for comparison. The results show that arsenic and lead may have highly adverse effects on living creatures.
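The share of data variability accounted for by a factor model (the 84.9% figure above) can be sketched from the eigenvalues of the correlation matrix of the element concentrations. This unrotated principal-factor approximation is our simplification for illustration, not the paper's full multifractal and R-mode workflow.

```python
import numpy as np

def explained_variance(data, n_factors):
    # Fraction of total variability captured by the leading
    # n_factors eigenvalues of the correlation matrix of a
    # (samples x elements) data array.
    corr = np.corrcoef(np.asarray(data, dtype=float), rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
    return eigvals[:n_factors].sum() / eigvals.sum()
```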
Abstract: Nowadays, deep learning methods are widely applied to analyze and predict the trends of various disaster events and to offer alternatives for making appropriate decisions, supporting water resource management and short-term planning. In this paper, the water levels of the Pattani River in Southern Thailand are predicted hourly over a 7-day forecast horizon. A Time Series Transformer and Linear Regression were applied in this work. Both produced water-level forecasts with high accuracy. Moreover, a water-level forecasting dashboard was developed for monitoring water levels on the Pattani River.
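The linear-regression baseline can be sketched as a least-squares line fitted to the recent hourly series and extrapolated over the forecast horizon. This is a minimal stand-in under our own assumptions, not the paper's actual regression features or the Transformer model.

```python
def linear_forecast(levels, steps):
    # Fit a least-squares straight line through an hourly water-level
    # series (time index 0..n-1) and extrapolate `steps` hours ahead.
    n = len(levels)
    mean_t = (n - 1) / 2.0
    mean_y = sum(levels) / n
    s_tt = sum((t - mean_t) ** 2 for t in range(n))
    s_ty = sum((t - mean_t) * (y - mean_y) for t, y in zip(range(n), levels))
    slope = s_ty / s_tt
    intercept = mean_y - slope * mean_t
    return [intercept + slope * (n + h) for h in range(steps)]
```

A 7-day hourly forecast would correspond to `steps=168`; in practice the fit window and features would be richer than a single trend line.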