Decision-making by investors at the stock exchange can be based on the fundamental indicators of stocks, on technical indicators, or on a combination of the two. This paper emphasizes the domain of technical analysis. In the broader sense, technical analysis enables estimation of the dynamics of the expected future values of shares. This can be performed on the basis of data on historical trends of revenues, profits, and other balance-sheet indicators, but also on the basis of historical data on changes in share values. Companies generally belong to different sectors that have different presumptions of development resulting from global market trends, technology, and other characteristics. This research originates from the processing of historical price data of the outstanding shares of the Zagreb Stock Exchange (ZSE). Investors are interested in estimates of future stock returns as well as the size of the risk associated with the expected returns. The research task in this paper is finding the optimal portfolio at the ZSE based on the concept of the dominant portfolio in the Markowitz approach. The portfolio is created by solving a non-linear programming problem using common software tools. The obtained optimal portfolios support relevant conclusions about the specifics of individual shares and the characteristics of industrial sectors, and also provide further knowledge about how diverse sectors are treated at the stock exchange over a multi-year period.
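The Markowitz idea behind the dominant portfolio can be illustrated with a minimal sketch: for two assets, the minimum-variance weights have a closed form that follows directly from the portfolio-variance formula. All figures below are hypothetical, not ZSE data.

```python
def portfolio_stats(w1, mu, cov):
    """Expected return and variance of a two-asset portfolio with weights (w1, 1 - w1)."""
    w2 = 1.0 - w1
    ret = w1 * mu[0] + w2 * mu[1]
    var = w1**2 * cov[0][0] + w2**2 * cov[1][1] + 2 * w1 * w2 * cov[0][1]
    return ret, var

def min_variance_weight(cov):
    """Closed-form weight of asset 1 in the minimum-variance two-asset portfolio."""
    return (cov[1][1] - cov[0][1]) / (cov[0][0] + cov[1][1] - 2 * cov[0][1])

# Hypothetical annualized expected returns and covariance matrix
mu = [0.08, 0.12]
cov = [[0.040, 0.012],
       [0.012, 0.090]]

w_star = min_variance_weight(cov)
ret, var = portfolio_stats(w_star, mu, cov)
```

For more than two assets the same objective becomes the quadratic (non-linear) program mentioned in the abstract, which general-purpose solvers handle.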
In the era of big data, the general public is more likely to access big data, but they are unwilling to analyze it. Traditional data visualization, with its degree of professionalism, is therefore not easily accepted by a general public living at a fast pace. Against this background, a new general visualization method for dynamic time series data has emerged as the times require. Time series data visualization organizes abstract and hard-to-understand data into a form that is easily understood by the public. This method integrates data visualization into short videos, which is more in line with the way people get information in modern fast-paced lifestyles, and its modular approach also facilitates public participation in production. This paper summarizes dynamic visualization methods for time series data ranking, reviews the relevant literature, shows their value and existing problems, and gives corresponding suggestions and future research prospects.
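The ranking step behind such short-video visualizations (e.g., a "bar chart race") can be sketched minimally: each frame is just the items sorted by value at one time step. Names and figures below are invented for illustration.

```python
def ranking_frames(series):
    """For each time step, return the item names sorted by descending value.

    series: dict mapping name -> list of values, one value per time step.
    """
    n_steps = len(next(iter(series.values())))
    frames = []
    for t in range(n_steps):
        order = sorted(series, key=lambda name: series[name][t], reverse=True)
        frames.append(order)
    return frames

# Invented example: three items tracked over three time steps
data = {"A": [3, 5, 9], "B": [7, 6, 4], "C": [1, 8, 2]}
frames = ranking_frames(data)
```

Rendering each frame as an animated bar chart (and interpolating between frames) is then a presentation concern that any plotting library can handle.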
The development of vehicle-to-everything and cloud computing has brought new opportunities and challenges to the automobile industry. In this paper, a commuter-vehicle demand-torque prediction method based on historical vehicle speed information is proposed, which uses machine learning to predict and analyze vehicle demand torque. First, big data on vehicle driving are collected, and the driving data are cleaned and features are extracted based on road information. Then, a longitudinal vehicle dynamics model is established. Next, a vehicle simulator is built on this longitudinal dynamics model, and the driving torque of the vehicle is obtained. Finally, each trip is divided into several acceleration-cruise-deceleration road pairs for analysis, and the vehicle demand torque is predicted by a BP neural network and Gaussian process regression.
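The trip-segmentation step can be sketched as follows; the finite-difference rule, the tolerance, and the speed trace are assumptions for illustration, not the paper's exact procedure.

```python
def classify_phases(speeds, dt=1.0, tol=0.2):
    """Label each step of a speed profile as 'accel', 'cruise', or 'decel'
    by comparing the finite-difference acceleration against a tolerance."""
    phases = []
    for i in range(1, len(speeds)):
        a = (speeds[i] - speeds[i - 1]) / dt
        if a > tol:
            phases.append("accel")
        elif a < -tol:
            phases.append("decel")
        else:
            phases.append("cruise")
    return phases

profile = [0, 2, 5, 8, 10, 10, 10, 7, 3, 0]  # invented speed trace, m/s at 1 s steps
phases = classify_phases(profile)
```

Consecutive runs of the same label then form the acceleration-cruise-deceleration pairs on which the torque predictors are trained.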
In 2020, the COVID-19 pandemic spread rapidly around the world. To accurately predict the number of daily new cases in each country, Lanzhou University established the Global Prediction System of the COVID-19 Pandemic (GPCP). In this article, the authors use an ensemble empirical mode decomposition (EEMD) model and an autoregressive moving average (ARMA) model to improve the prediction results of GPCP. In addition, the authors conduct direct predictions for countries that have a small number of confirmed cases or are in the early stage of the disease, whose pandemic development trends do not fully comply with the laws of infectious diseases and cannot be predicted by the GPCP model. Judging from the results, revising the GPCP predictions with this method significantly reduces the absolute values of the relative errors for countries such as Cuba and brings the predicted trends closer to the real situations. For countries such as El Salvador with a small number of cases, the absolute values of the relative errors of prediction also become smaller. This article therefore concludes that the method is effective both for improving prediction results and for direct prediction.
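The correction idea can be sketched minimally: fit a low-order autoregressive model to the recent prediction errors of the base system and use it to revise the next forecast. A least-squares AR(1) fit stands in here for the full EEMD-ARMA pipeline, and the numbers are invented.

```python
def fit_ar1(errors):
    """Least-squares estimate of phi in the AR(1) model e_t = phi * e_{t-1}."""
    num = sum(errors[t] * errors[t - 1] for t in range(1, len(errors)))
    den = sum(errors[t - 1] ** 2 for t in range(1, len(errors)))
    return num / den

def corrected_forecast(base_forecast, errors):
    """Subtract the AR(1)-predicted next error from the base model's forecast."""
    phi = fit_ar1(errors)
    return base_forecast - phi * errors[-1]

# Invented, persistently decaying over-prediction errors of a base model
errors = [4.0, 2.0, 1.0, 0.5, 0.25]
revised = corrected_forecast(100.0, errors)
```

In the paper's setting the series would first be decomposed by EEMD and an ARMA model fitted per component; the AR(1) above is the simplest member of that family.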
Driven by market requirements, software services organizations have adopted various software engineering process models (such as the capability maturity model (CMM), capability maturity model integration (CMMI), ISO 9001:2000, etc.) and practice the project management concepts defined in the project management body of knowledge. While this has definitely helped organizations bring some method to the madness of software development, there is a persistent demand for comparing various groups within an organization in terms of their practice of these defined process models. Even though many metrics exist for comparison, given the variety of projects in terms of technology, life cycle, etc., finding a single metric that caters to this is a difficult task. This paper proposes a model for arriving at a rating of group maturity within the organization. Considering the linguistic, imprecise, and uncertain nature of software measurements, a fuzzy logic approach is used for the proposed model. Free of barriers such as technology or life-cycle differences, the proposed model helps the organization compare its different groups with reasonable precision.
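The fuzzy-rating idea can be sketched with one metric: a crisp measurement is fuzzified through triangular membership functions and defuzzified into a maturity score by a weighted average. The metric, the membership shapes, and the level scores below are all invented for illustration, not the paper's rule base.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def maturity_score(defect_density):
    """Fuzzify a hypothetical defect-density metric into low/medium/high
    maturity memberships, then defuzzify as a weighted average of level scores."""
    low = triangular(defect_density, 2.0, 5.0, 8.0)    # many defects -> low maturity
    med = triangular(defect_density, 1.0, 3.0, 5.0)
    high = triangular(defect_density, -1.0, 0.0, 2.0)  # few defects -> high maturity
    total = low + med + high
    if total == 0.0:
        return 0.0
    return (1.0 * low + 2.0 * med + 3.0 * high) / total

score = maturity_score(1.5)
```

A real rating would aggregate several such fuzzified metrics per group, which is why the approach tolerates the imprecise measurements the abstract mentions.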
OpenStreetMap (OSM) has a large number of volunteers. There is a hypothesis that volunteers with different cultural backgrounds exhibit different editing behaviors when contributing to OSM, and that these behaviors may be strongly related to OSM data quality and reliability. Regarding the heterogeneity and reliability of OSM data, previous research usually focuses on geometric accuracy, spatial location accuracy, and semantic integrity, while few researchers have analyzed these problems from the perspective of editing behavior. On the grounds of the relationship between mapping motivation and editing behavior, the dispersion of the editing trajectory and a clockwise direction index are proposed in this paper to explore whether volunteers are sufficiently motivated and knowledgeable. In the experiments, historical OSM data from four countries suggested that developed countries have lower trajectory dispersion. A lower degree of trajectory dispersion reflects a higher concentration and professionalism of volunteers, while a high degree of drawing-direction consistency shows that the volunteers who mapped the French data were natives with local knowledge. From this point of view, this paper verifies that analyzing volunteer editing behavior is an effective method for studying data quality heterogeneity and data reliability.
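The drawing-direction check can be sketched with the shoelace formula: the sign of a ring's signed area distinguishes clockwise from counterclockwise digitization, and the fraction of clockwise rings gives a crude direction-consistency index (a simplification of the paper's index, with invented coordinates).

```python
def signed_area(ring):
    """Shoelace signed area: positive for counterclockwise rings, negative for clockwise."""
    area = 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def clockwise_ratio(rings):
    """Fraction of rings drawn clockwise -- a crude direction-consistency index."""
    cw = sum(1 for r in rings if signed_area(r) < 0)
    return cw / len(rings)

ccw_square = [(0, 0), (1, 0), (1, 1), (0, 1)]   # invented building footprint
cw_square = list(reversed(ccw_square))
ratio = clockwise_ratio([ccw_square, cw_square, cw_square])
```

A population of volunteers who consistently digitize in one direction would push this ratio toward 0 or 1, which is the kind of consistency the French data showed.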
A long-term dataset of photosynthetically active radiation (Qp) is reconstructed from a broadband global solar radiation (Rs) dataset through an all-weather reconstruction model. The method is based on four years' worth of data collected in Beijing. Observational data for Rs and Qp from 2005-2008 are used to investigate the temporal variability of Qp and its dependence on the clearness index and solar zenith angle. A simple and efficient all-weather empirically derived reconstruction model is proposed to reconstruct Qp from Rs, and it is found to estimate instantaneous Qp with high accuracy. The annual mean of the daily values of Qp during the 1958-2005 period is 25.06 mol m-2 d-1. The long-term trend of the annual averaged Qp is -0.19 mol m-2 yr-1 from 1958-1997 and -0.12 mol m-2 yr-1 from 1958-2005. The trend in Qp exhibits sharp decreases in spring and summer and more gentle decreases in autumn and winter.
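A common form for such empirical reconstructions is a ratio model in which Qp/Rs varies with the clearness index; the sketch below uses that generic form with placeholder coefficients, not the paper's fitted values.

```python
def reconstruct_qp(rs, clearness, a=2.1, b=-0.3):
    """Estimate photosynthetically active radiation from global radiation
    with a clearness-dependent ratio: Qp = Rs * (a + b * kt).

    a and b are illustrative placeholders (in PAR units per W m-2),
    not the coefficients fitted in the paper."""
    return rs * (a + b * clearness)

qp_clear = reconstruct_qp(800.0, 0.7)   # clear sky: lower Qp/Rs ratio
qp_cloudy = reconstruct_qp(300.0, 0.3)  # cloudy sky: higher Qp/Rs ratio
```

Fitting a and b against a few years of co-located Rs/Qp observations, as done for Beijing, is an ordinary least-squares problem on the ratio.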
Screening similar historical fault-free candidate data greatly affects the effectiveness of fault detection based on principal component analysis (PCA). In order to find such candidate data, this study compares unweighted and weighted similarity factors (SFs), which measure the similarity of the principal component subspaces corresponding to the first k principal components of two datasets. Fault detection employs the principal component subspaces corresponding to the current measured data and the historical fault-free data. Within the historical fault-free database, load parameters are used to locate the candidate data similar to the current operating data, and the fault detection method for air conditioning systems is built on principal components. The results show that the weighted principal component SF improves both fault-free detection and fault detection: compared with the unweighted SF, the average fault-free detection rate of the weighted SF is 17.33% higher, and the average fault detection rate is 7.51% higher.
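The unweighted similarity factor compares the angles between the leading principal directions of two datasets. A minimal two-dimensional, k = 1 sketch (with invented data, not an air-conditioning dataset): the SF reduces to the squared cosine of the angle between the two first principal directions, so identical datasets score 1 and orthogonal ones score 0.

```python
import math

def first_pc_2d(data):
    """First principal direction (unit vector) of 2-D data, via the
    closed-form eigenvector of the 2x2 covariance matrix."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / n
    syy = sum((p[1] - my) ** 2 for p in data) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
    lam = 0.5 * (sxx + syy) + math.sqrt(0.25 * (sxx - syy) ** 2 + sxy ** 2)
    v = (lam - syy, sxy)          # eigenvector for the largest eigenvalue
    if math.hypot(*v) < 1e-12:
        v = (sxy, lam - sxx)      # alternative form when the first degenerates
    norm = math.hypot(*v)
    if norm < 1e-12:
        return (1.0, 0.0)         # isotropic data: direction is arbitrary
    return (v[0] / norm, v[1] / norm)

def similarity_factor(data1, data2):
    """cos^2 of the angle between first principal directions: the k = 1
    case of the PCA subspace similarity factor."""
    u = first_pc_2d(data1)
    w = first_pc_2d(data2)
    dot = u[0] * w[0] + u[1] * w[1]
    return dot * dot

d1 = [(0, 0), (1, 1), (2, 2), (3, 3)]     # invented data along y = x
d2 = [(0, 0), (1, -1), (2, -2), (3, -3)]  # invented data along y = -x
```

The weighted variant studied in the paper additionally weights each angle term by the explained variance of the corresponding components.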
China began to develop its meteorological satellite program in 1969. Over 50 years of growth, 17 Fengyun (FY) meteorological satellites have been launched successfully. At present, seven of them are in orbit providing operational service: three polar-orbiting meteorological satellites and four geostationary meteorological satellites. Since the last COSPAR report, no new Fengyun satellite has been launched. The information on the on-orbit FY-2, FY-3, and FY-4 series has been updated. The FY-3D and FY-2H satellites completed their commissioning tests and transitioned into operation in 2018, while the FY-2E satellite completed its service and was decommissioned in 2019. The web-based and Direct Broadcasting (DB) user communities requiring Fengyun satellite data and products keep growing worldwide, and a new mobile application service based on cloud technology was launched for Fengyun users in 2018. This report especially addresses the international and regional cooperation that facilitates the Fengyun user community. To strengthen data service in the Belt and Road countries, the Emergency Support Mechanism of the Fengyun satellites (FY_ESM) has been in place since 2018. Meanwhile, a project to recalibrate 30 years of archived Fengyun satellite data was founded in 2018; it targets generation of a Fundamental Climate Data Record (FCDR) as a space agency response to the Global Climate Observing System (GCOS). Finally, the future Fengyun program up to 2025 is introduced as well.
A coupled stress release model that considers the interaction between different regions is proposed in this paper, building on the stress release model of Vere-Jones, and is applied to historical earthquake data from North China. The results of this model are compared with those of the original stress release model using the AIC criterion, and they show that the coupled stress release model outperforms the original model.
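The AIC comparison used to rank the two models can be sketched directly; the log-likelihoods and parameter counts below are invented for illustration.

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L. Lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Invented values: the coupled model fits better but uses more parameters
aic_original = aic(log_likelihood=-120.0, n_params=3)
aic_coupled = aic(log_likelihood=-110.0, n_params=5)
better = "coupled" if aic_coupled < aic_original else "original"
```

AIC thus trades goodness of fit against model complexity, which is exactly the question raised by adding inter-region coupling terms.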
Focusing on the semi-enclosed basin of the Black Sea, the primary purpose of this paper is to present an overview of its physical features and circulation patterns. To achieve this goal, more than five decades of data, from 1960 to 2015, were analyzed, and the results were validated against acknowledged data: satellite data over the last two decades and in-situ measurements from the earlier decades. The circulation of the Black Sea basin has been studied for almost 400 years, since the Italian Count Luigi Marsigli first described the "two-layer" circulation through the Bosphorus Strait in 1681. Since climate change projections for the Black Sea region foresee a significant environmental impact in the coming decades, a set of adaptation and mitigation measures is required; therefore, more research is needed. Nowadays, the warming trend adds a sense of immediate urgency: according to the National Oceanic and Atmospheric Administration's National Centers for Environmental Information, July 2020 was the second-hottest month ever recorded for the planet. Its averaged land and ocean surface temperature tied with July 2016 as the second-highest for the month in NOAA's 141-year global temperature dataset, which dates back to 1880. It was 0.92°C above the 20th-century average of 15.8°C, only 0.01°C below the record value measured in July 2019.
When historical data from an early-phase trial and interim data from a Phase II trial are available, we should use them to give a more accurate prediction in both futility and efficacy analyses. The predictive power is an important measure of the practical utility of a proposed trial, and it is better than the classical statistical power at indicating the probability that the trial will demonstrate a positive or statistically significant outcome. In addition to the four predictive powers with historical and interim data available in the literature and summarized in Table 1, we derive another four predictive powers, also summarized in Table 1, for one-sided hypotheses. Moreover, we calculate eight predictive powers, summarized in Table 2, for the reversed hypotheses. Together, the two tables give a complete picture of the predictive powers with historical and interim data for futility and efficacy analysis. Furthermore, the eight predictive powers with historical and interim data are used to guide the futility analysis in the tamoxifen example. Finally, extensive simulations investigate the sensitivity of the different predictive powers to the priors, sample sizes, interim results, and interim time.
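The general notion can be sketched by Monte Carlo with the simplest member of this family, conditional power: treat the interim effect estimate as the truth for the remaining patients and compute the probability that the final one-sided z-test is significant. All numbers are invented, and the full predictive powers in the paper would additionally average over a prior informed by the historical data.

```python
import math
import random

def conditional_power(interim_mean, n_interim, n_total, sigma=1.0,
                      z_alpha=1.645, n_sim=5000, seed=7):
    """Fraction of simulated trials whose final z-statistic exceeds z_alpha,
    assuming the remaining observations have mean equal to the interim estimate."""
    rng = random.Random(seed)
    n_rest = n_total - n_interim
    hits = 0
    for _ in range(n_sim):
        rest_sum = sum(rng.gauss(interim_mean, sigma) for _ in range(n_rest))
        final_mean = (interim_mean * n_interim + rest_sum) / n_total
        z = final_mean * math.sqrt(n_total) / sigma
        if z > z_alpha:
            hits += 1
    return hits / n_sim

pp_strong = conditional_power(0.5, n_interim=40, n_total=80)  # promising interim effect
pp_null = conditional_power(0.0, n_interim=40, n_total=80)    # null interim effect
```

A low value of such a quantity at the interim look is the trigger for a futility stop, as in the tamoxifen example.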
Atmospheric water vapor is an essential climate variable (ECV) with extensive spatial and temporal variations. Microwave humidity observations from meteorological satellites provide important information on climate system variables, including atmospheric water vapor and precipitable water, and are assimilated in numerical weather prediction (NWP) and reanalysis. As one of the payloads onboard China's second-generation polar-orbiting operational meteorological Fengyun-3 (FY-3) satellites, the Microwave Humidity Sounder (MWHS) has been continuously observing global humidity since 2008. The reprocessing of historical FY-3 MWHS data is documented in detail in this study. After the data are calibrated and corrected, the quality of the reprocessed dataset is evaluated and the improvement is demonstrated. The results suggest that the MWHS observation bias is reduced to approximately 0.8 K compared with the METOP-A Microwave Humidity Sounder (MHS). The temporal variability of MWHS is highly correlated with the instrument temperature. After reprocessing, the scene-temperature dependency is mitigated for all 183 GHz channels, and the consistency and stability between FY-3A/B/C are also improved.
A system reliability model based on a Bayesian network (BN) is built via an evolutionary strategy called the dual genetic algorithm (DGA). A BN is a probabilistic approach to analyzing relationships between stochastic events. In contrast with traditional methods, where the BN model is built by professionals, the DGA is proposed for the automatic analysis of historical data and construction of the BN for the estimation of system reliability. The whole solution space of BN structures is searched by the DGA, and a more accurate BN model is obtained. The efficacy of the proposed method is shown on several examples from the literature.
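The structure-search idea can be sketched with a toy genetic algorithm: candidate structures are bitstrings over the possible edges of a fixed node ordering (which keeps every candidate acyclic), and a fitness function is maximized by selection, crossover, and mutation. The fitness below is an invented toy score matching a known target, standing in for a real BN scoring metric such as BIC; this is not the paper's dual genetic algorithm.

```python
import random

EDGES = [(0, 1), (0, 2), (1, 2)]  # possible edges under a fixed node ordering
TARGET = (1, 0, 1)                # the structure the toy fitness rewards

def fitness(bits):
    """Toy score: number of edge decisions matching TARGET (stand-in for BIC)."""
    return sum(1 for b, t in zip(bits, TARGET) if b == t)

def evolve(pop_size=20, generations=30, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in EDGES) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(EDGES))    # one-point crossover
            child = list(a[:cut] + b[cut:])
            for i in range(len(child)):           # bit-flip mutation
                if rng.random() < p_mut:
                    child[i] = 1 - child[i]
            children.append(tuple(child))
        pop = children
    return max(pop, key=fitness)

best = evolve()
```

With a real scoring metric evaluated on historical data, the same loop searches the space of BN structures automatically, which is the role the DGA plays in the paper.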
The subject of legal history lies at the interdisciplinary intersection of law and history. Because of this interdisciplinary attribute, reference to the experience accumulated in the development of historiography is very important: the subject of legal history must pay attention to the problems encountered in the development of historical scholarship and the experience accumulated there. Historical development since the last century displays the importance and necessity of social science theory in the study of history. Researchers of legal history should therefore focus on the comprehensive use of the theories and methods of the social sciences and, through the innovative use of historical data and multidisciplinary interpretation methods, re-interpret historical events and figures. With the aid of the comprehensive use of the methods of multiple disciplines, new ground can be opened in the study of legal history.
Observational facts from the Maldives, Goa, and Bangladesh in the Indian Ocean and from Fiji and New Caledonia in the Pacific record a high sea level in the 17th century, a low sea level in the 18th century, a high sea level in the early 19th century, and a stable sea level in the last 50-70 years. This cannot be understood in terms of glacial eustasy (or in terms of steric effects or tectonics), only in terms of rotational eustasy. The present paper gives a summary of the observational facts behind the formulation of the novel concept of rotational eustasy. It reveals a common trend of sea level changes, which is opposed to the sea level changes in the northern hemisphere and to global climatic changes in general. Rotational eustasy offers a logical explanation.
Risk management is relevant for every project that seeks to avoid and suppress unanticipated costs, basically calling for pre-emptive action. The current work proposes a new approach for handling risks based on predictive analytics and machine learning (ML) that can work in real time to help avoid risks and increase project adaptability. The main research aim of the study is to ascertain risk presence in projects by using historical data from previous projects, focusing on important aspects such as duration, task time, resources, and project results. The t-SNE technique applies feature engineering to reduce dimensionality while preserving important structural properties. The models are analysed using recall, F1-score, accuracy, and precision. The results demonstrate that the Gradient Boosting Machine (GBM) achieves an impressive 85% accuracy, 82% precision, 85% recall, and 80% F1-score, surpassing previous models. Additionally, predictive analytics achieves a resource-utilisation efficiency of 85%, compared with 70% for traditional allocation methods, and a project cost reduction of 10%, double the 5% achieved by traditional approaches. Furthermore, the study indicates that while GBM excels in overall accuracy, Logistic Regression (LR) offers more favourable precision-recall trade-offs, highlighting the importance of model selection in project risk management.
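The four evaluation metrics cited above can be sketched from first principles; the risk labels below are invented for illustration.

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary labels (1 = risk present)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Invented labels: 1 = risk materialized, 0 = no risk
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 1, 0, 1, 0]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
```

The precision-recall trade-off noted for GBM versus Logistic Regression is visible directly in these definitions: lowering the decision threshold raises recall at the cost of precision.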
Due to the increasing requirements for accuracy in weather and climate forecasting, it is necessary to develop strategies for model error correction alongside numerical modeling and data assimilation techniques. This study classifies correction strategies according to the types of forecast errors and reviews recent studies on these strategies. Among others, the analogue-dynamical method has been developed in China; it combines statistical methods with the dynamical model, corrects model errors based on analogue information, and effectively utilizes historical data in dynamical forecasts. This study introduces the fundamental principles and technical solutions of the analogue-dynamical method and the history of its development for forecasts on different timescales. It is shown that this method can effectively improve medium- and extended-range forecasts, monthly-average circulation forecasts, and short-term climate prediction. As an innovative technique independently developed in China, the analogue-dynamical method plays an important role in both weather forecasting and climate prediction, and has potential applications in wider fields.
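The core of the analogue idea can be sketched minimally: find the historical states closest to the current one and subtract the mean of the model errors observed in those situations from today's raw forecast. The scalar "state" and the numbers below are invented; real implementations match multidimensional circulation patterns.

```python
def analogue_correction(current_state, forecast, history, k=2):
    """history: list of (state, model_error) pairs from past forecasts.
    Pick the k states closest to current_state (absolute distance) and
    subtract their mean error from the raw model forecast."""
    nearest = sorted(history, key=lambda h: abs(h[0] - current_state))[:k]
    mean_error = sum(err for _, err in nearest) / k
    return forecast - mean_error

# Invented scalar states and the model errors observed in those situations
history = [(10.0, 1.5), (12.0, 2.5), (30.0, -4.0), (50.0, 0.0)]
corrected = analogue_correction(current_state=11.0, forecast=20.0, history=history)
```

This is how analogue information lets historical data correct a dynamical forecast without retraining or modifying the model itself.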
文摘Decision-making of investors at the stock exchange can be based on the fundamental indicators of stocks, on the technical indicators, or can exist as a combination of these two methods. The paper gives emphasis to the domain of technical analysis. In the broader sense the technical analysis enables the dynamics of the expected future values of the shares estimation. This can be performed on the basis of the data on historical trends of the revenues, profits and other indicators from the balance sheet, but also on the basis of historical data on changes in the values of the shares. Companies generally belong to the different sectors that have different presumptions of development resulting from the global market trends, technology and other characteristic. Processing of historical data values of the outstanding shares of the Zagreb Stock Exchange (ZSE) is origination of this research. Investors are interested to know the estimation of future returns for the stocks as well as the size of the risk associated with the expected returns. Research task in this paper is finding the optimal portfolio at the ZSE based on the concept of dominant portfolio by Markowitz approach. The portfolio is created by solving non-linear programming problem using the common software tools. The results of obtained optimal portfolios contain relevant conclusions about the specifics of the shares as well as the characteristics of the industrial sectors but also provide a further knowledge about diverse sectors treatment at the stock exchange in a multi-year period.
基金This research is funded by the Open Foundation for the University Innovation Platform in the Hunan Province,Grant No.18K103Hunan Provincial Natural Science Foundation of China,Grant No.2017JJ20162016 Science Research Project of Hunan Provincial Department of Education,Grant No.16C0269.This research work is implemented at the 2011 Collaborative Innovation Center for Development and Utilization of Finance and Economics Big Data Property,Universities of Hunan Province.Open project,Grant Nos.20181901CRP03,20181901CRP04,20181901CRP05 National Social Science Fund Project:Research on the Impact Mechanism of China’s Capital Space Flow on Regional Economic Development(Project No.14BJL086).
文摘In the era of big data,the general public is more likely to access big data,but they wouldn’t like to analyze the data.Therefore,the traditional data visualization with certain professionalism is not easy to be accepted by the general public living in the fast pace.Under this background,a new general visualization method for dynamic time series data emerges as the times require.Time series data visualization organizes abstract and hard-to-understand data into a form that is easily understood by the public.This method integrates data visualization into short videos,which is more in line with the way people get information in modern fast-paced lifestyles.The modular approach also facilitates public participation in production.This paper summarizes the dynamic visualization methods of time series data ranking,studies the relevant literature,shows its value and existing problems,and gives corresponding suggestions and future research prospects.
基金supported in part by National Natural Science Foundation(NNSF)of China(Nos.61803079,61890924,61991404)in part by Fundamental Research Funds for the Central Universities(No.N2108006)in part by Liaoning Revitalization Talents Program(No.XLYC1907087)。
文摘The development of vehicle-to-everything and cloud computing has brought new opportunities and challenges to the automobile industry.In this paper,a commuter vehicle demand torque prediction method based on historical vehicle speed information is proposed,which uses machine learning to predict and analyze vehicle demand torque.Firstly,the big data of vehicle driving is collected,and the driving data is cleaned and features extracted based on road information.Then,the vehicle longitudinal driving dynamics model is established.Next,the vehicle simulation simulator is established based on the longitudinal driving dynamics model of the vehicle,and the driving torque of the vehicle is obtained.Finally,the travel is divided into several accelerationcruise-deceleration road pairs for analysis,and the vehicle demand torque is predicted by BP neural network and Gaussian process regression.
基金This work was jointly supported by the National Natural Science Foundation of China[grant numbers 41521004 and 41875083]the Gansu Provincial Special Fund Project for Guiding Scientific and Technological Innovation and Development[grant number 2019ZX-06].
文摘In 2020,the COVID-19 pandemic spreads rapidly around the world.To accurately predict the number of daily new cases in each country,Lanzhou University has established the Global Prediction System of the COVID-19 Pandemic(GPCP).In this article,the authors use the ensemble empirical mode decomposition(EEMD)model and autoregressive moving average(ARMA)model to improve the prediction results of GPCP.In addition,the authors also conduct direct predictions for those countries with a small number of confirmed cases or are in the early stage of the disease,whose development trends of the pandemic do not fully comply with the law of infectious diseases and cannot be predicted by the GPCP model.Judging from the results,the absolute values of the relative errors of predictions in countries such as Cuba have been reduced significantly and their prediction trends are closer to the real situations through the method mentioned above to revise the prediction results out of GPCP.For countries such as El Salvador with a small number of cases,the absolute values of the relative errors of prediction become smaller.Therefore,this article concludes that this method is more effective for improving prediction results and direct prediction.
文摘Driven by market requirements, software services organizations have adopted various software engineering process models (such as capability maturity model (CMM), capability maturity model integration (CMMI), ISO 9001:2000, etc.) and practice of the project management concepts defined in the project management body of knowledge. While this has definitely helped organizations to bring some methods into the software development madness, there always exists a demand for comparing various groups within the organization in terms of the practice of these defined process models. Even though there exist many metrics for comparison, considering the variety of projects in terms of technology, life cycle, etc., finding a single metric that caters to this is a difficult task. This paper proposes a model for arriving at a rating on group maturity within the organization. Considering the linguistic or imprecise and uncertain nature of software measurements, fuzzy logic approach is used for the proposed model. Without the barriers like technology or life cycle difference, the proposed model helps the organization to compare different groups within it with reasonable precision.
基金National Natural Science Foundation of China(No.41771484)。
文摘OpenStreetMap has a large number of volunteers.There is a hypothesis that volunteers with different cultural backgrounds may have different editing behaviors when contributing to OSM.It may be strongly related to data quality and data reliability on OSM.As for the heterogeneity and the reliability of OSM data,previous research usually focuses on the geometric accuracy,spatial location accuracy and semantic integrity of OSM data,while few researchers have analyzed these problems from the perspective of editing behavior.On the grounds of relationship between mapping motivation and editing behavior,the dispersion of editing trajectory and clockwise direction index are proposed in the paper to explore whether the volunteers are sufficiently motivated and knowledgeable.In the experiments,the historical OSM data of four countries suggested that developed countries have lower trajectory dispersion.The lower degree of trajectory dispersion reflects the higher concentration and professionalism of volunteers.A high degree of drawing direction consistency shows volunteers who mapped French data were natives with local knowledge.From this point of view,this paper verifies that volunteer editing behavior is an effective method to analyze data quality heterogeneity and data reliability.
基金 (Funding): Supported by the National Basic Research Program of China (No. 2007CB407303).
文摘 (Abstract): A long-term dataset of photosynthetically active radiation (Qp) is reconstructed from a broadband global solar radiation (Rs) dataset through an all-weather reconstruction model. The method is based on four years of data collected in Beijing: observations of Rs and Qp from 2005-2008 are used to investigate the temporal variability of Qp and its dependence on the clearness index and solar zenith angle. A simple and efficient all-weather, empirically derived reconstruction model is proposed to reconstruct Qp from Rs, and it is found to estimate instantaneous Qp with high accuracy. The annual mean of the daily values of Qp during 1958-2005 is 25.06 mol m-2 d-1. The magnitude of the long-term trend in the annual averaged Qp is presented (-0.19 mol m-2 yr-1 for 1958-1997 and -0.12 mol m-2 yr-1 for 1958-2005). The trend in Qp exhibits sharp decreases in spring and summer and more gentle decreases in autumn and winter.
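A reconstruction of this kind can be sketched as follows, assuming a linear dependence of the PAR fraction on the clearness index. The coefficients are illustrative placeholders, not the fitted values of the paper's model; only the ~4.57 mol MJ-1 quanta-to-energy conversion for PAR is a standard constant.

```python
# Illustrative sketch of reconstructing Qp (mol m-2) from Rs (MJ m-2) via the
# clearness index kt. The linear PAR-fraction form and its coefficients are
# assumptions for illustration, not the paper's fitted model.

def par_fraction(kt):
    """Illustrative PAR/global ratio: cloudier skies (low kt) have a higher PAR share."""
    # Observed ratios typically lie near 0.42-0.48; this linear form is assumed.
    return 0.48 - 0.06 * kt

def reconstruct_qp(rs_mj, extraterrestrial_mj):
    """Estimate Qp from Rs using the clearness index."""
    kt = rs_mj / extraterrestrial_mj      # clearness index
    par_mj = par_fraction(kt) * rs_mj     # PAR energy content of Rs
    return par_mj * 4.57                  # ~4.57 mol MJ-1: PAR energy to quanta
```

The clearness-index dependence is what makes the model "all-weather": the same formula covers clear and overcast skies by letting the PAR share shift with cloudiness.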
基金 (Funding): Research Project of China Ship Development and Design Center.
文摘 (Abstract): Screening similar fault-free historical candidate data greatly affects the effectiveness of fault detection based on principal component analysis (PCA). To find such candidate data, this study compares unweighted and weighted similarity factors (SFs), which measure the similarity of the principal component subspaces spanned by the first k principal components of two datasets. The fault detection method for air conditioning systems employs the principal component subspaces of the current measured data and of the historical fault-free data, and load parameters are used to locate, within the historical fault-free database, the candidate data most similar to the current operating data. The results show that the weighted similarity factor improves both fault-free detection and fault detection: compared with the unweighted SF, the average fault-free detection rate of the weighted SF is 17.33% higher, and its average fault detection rate is 7.51% higher.
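The similarity factors compared above follow the general Krzanowski-type idea: the unweighted SF averages the squared cosines between the two k-dimensional PC subspaces, while the weighted variant scales each cosine by the variance explained along the paired components. The sketch below, including the particular weighting scheme, is an assumption rather than the paper's exact formulation.

```python
# Sketch of unweighted and variance-weighted PCA similarity factors between two
# datasets. The weighting scheme shown is one common choice and is assumed here,
# not taken verbatim from the paper.
import numpy as np

def pc_subspace(X, k):
    """First k principal-component loadings (columns) and their singular values."""
    Xc = X - X.mean(axis=0)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T, s[:k]

def similarity_factor(X1, X2, k, weighted=False):
    """Similarity (0..1) of the k-dimensional PC subspaces of two datasets."""
    L, s1 = pc_subspace(X1, k)
    M, s2 = pc_subspace(X2, k)
    if not weighted:
        # Average squared cosine between the two subspaces.
        return np.trace(L.T @ M @ M.T @ L) / k
    # Weight each pairwise cosine by the singular values of both components.
    W = np.outer(s1, s2) * (L.T @ M)
    return np.sum(W**2) / np.sum(s1**2 * s2**2)
```

Identical datasets give SF = 1 in both variants; values near 0 indicate nearly orthogonal subspaces, i.e. poor candidates for the fault-free reference set.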
基金 (Funding): Supported by the National Key Research and Development Program of China (2018YFB0504900, 2018YFB0504905).
文摘 (Abstract): China began developing its meteorological satellite program in 1969. Over 50 years of growth, 17 Fengyun (FY) meteorological satellites have been launched successfully. At present, seven of them are in orbit providing operational service, including three polar-orbiting and four geostationary meteorological satellites. Since the last COSPAR report, no new Fengyun satellite has been launched. The information on the in-orbit FY-2, FY-3, and FY-4 series has been updated. The FY-3D and FY-2H satellites completed commissioning tests and transitioned into operation in 2018; the FY-2E satellite completed its service and was decommissioned in 2019. The numbers of web-based users and Direct Broadcasting (DB) users requiring Fengyun satellite data and products keep growing worldwide, and a new mobile application service based on cloud technology was launched for Fengyun users in 2018. This report especially addresses the international and regional cooperation that facilitates the Fengyun user community. To strengthen data services in the Belt and Road countries, the Emergency Support Mechanism of the Fengyun satellites (FY_ESM) has been in place since 2018. Meanwhile, a project to recalibrate 30 years of archived Fengyun satellite data was founded in 2018; it targets generating a Fundamental Climate Data Record (FCDR) as a space-agency response to the Global Climate Observing System (GCOS). Finally, the future Fengyun program up to 2025 is introduced.
文摘 (Abstract): A coupled stress release model is proposed in this paper, considering the interaction between different parts, based on the stress release model of Vere-Jones, and is applied to historical earthquake data from North China. The results of this model are compared with those of the original stress release model using the AIC criterion. The results show that the coupled stress release model is better than the original model.
基金 (Funding): This work was carried out in the framework of the research project DREAM (Dynamics of the REsources and technological Advance in harvesting Marine renewable energy), supported by the Romanian Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI), grant number PN-III-P4-IDPCE-2020-0008.
文摘 (Abstract): Having as its target the semi-enclosed basin of the Black Sea, the primary purpose of this paper is to present an overview of its physical features and circulation patterns. To achieve this goal, more than five decades of data, from 1960 to 2015, were analyzed, and the results were validated against acknowledged data, both satellite data over the last two decades and in-situ measurements from the earlier decades. The circulation of the Black Sea basin has been studied for almost 400 years, since the Italian Count Luigi Marsigli first described the "two-layer" circulation through the Bosphorus Strait in 1681. Since climate change projections for the Black Sea region foresee a significant impact on the environment in the coming decades, a set of adaptation and mitigation measures is required; therefore, more research is needed. Nowadays, the warming trend adds a sense of immediate urgency: according to the National Oceanic and Atmospheric Administration's National Centers for Environmental Information, July 2020 was the second-hottest month ever recorded for the planet. Its averaged land and ocean surface temperature tied with July 2016 as the second-highest for the month in the 141-year history of NOAA's global temperature dataset, which dates back to 1880. It was 0.92°C above the 20th-century average of 15.8°C, only 0.01°C less than the record value measured in July 2019.
基金 (Funding): The research was supported by the National Social Science Fund of China [grant number 21XTJ001].
文摘 (Abstract): When historical data from an early-phase trial and interim data from the Phase II trial are available, we should use them to give a more accurate prediction in both futility and efficacy analysis. The predictive power is an important measure of the practical utility of a proposed trial, and it is better than the classical statistical power at indicating the probability that the trial will demonstrate a positive or statistically significant outcome. In addition to the four predictive powers with historical and interim data available in the literature and summarized in Table 1, we discover and calculate another four predictive powers, also summarized in Table 1, for one-sided hypotheses. Moreover, we calculate eight predictive powers, summarized in Table 2, for the reversed hypotheses. The combination of the two tables gives a complete picture of the predictive powers with historical and interim data for futility and efficacy analysis. Furthermore, the eight predictive powers with historical and interim data are used to guide the futility analysis in the tamoxifen example. Finally, extensive simulations have been conducted to investigate the sensitivity of the different predictive powers to priors, sample sizes, interim results, and interim time.
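One predictive power of this general kind can be approximated by Monte Carlo simulation: build a posterior for the treatment effect from a normal historical prior and the interim estimate, then simulate the remaining data and count how often the final one-sided test succeeds. The normal-endpoint setting, the prior form, and all parameter names below are illustrative assumptions, not the paper's formulas.

```python
# Monte Carlo sketch of a predictive power with historical (prior) and interim
# data, for a one-sided z-test on a normal endpoint. All modelling choices here
# are illustrative assumptions, not the paper's derivations.
import math, random
from statistics import NormalDist

def predictive_power(prior_mean, prior_var, interim_mean, n_interim, n_total,
                     sigma=1.0, alpha=0.025, draws=20000, seed=1):
    """Probability that the final one-sided test is significant."""
    rng = random.Random(seed)
    # Combine the historical prior with the interim estimate into a posterior.
    data_var = sigma**2 / n_interim
    post_var = 1.0 / (1.0 / prior_var + 1.0 / data_var)
    post_mean = post_var * (prior_mean / prior_var + interim_mean / data_var)
    z_alpha = NormalDist().inv_cdf(1.0 - alpha)
    n_rest = n_total - n_interim
    hits = 0
    for _ in range(draws):
        theta = rng.gauss(post_mean, math.sqrt(post_var))        # draw true effect
        rest_mean = rng.gauss(theta, sigma / math.sqrt(n_rest))  # future data mean
        final_mean = (n_interim * interim_mean + n_rest * rest_mean) / n_total
        hits += final_mean / (sigma / math.sqrt(n_total)) > z_alpha
    return hits / draws
```

A strongly positive interim estimate drives the predictive power toward 1 (continue for efficacy), while a strongly negative one drives it toward 0, which is the quantity a futility rule would act on.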
基金 (Funding): Supported by the National Key Research and Development Program of China (2018YFB0504900 and 2018YFB0504902) and the National Natural Science Foundation of China (41775020, 42005105, and 41905034).
文摘 (Abstract): Atmospheric water vapor is an essential climate variable (ECV) with extensive spatial and temporal variations. Microwave humidity observations from meteorological satellites provide important information on climate system variables, including atmospheric water vapor and precipitable water, and are assimilated in numerical weather prediction (NWP) and reanalysis. As one of the payloads onboard China's second-generation polar-orbiting operational meteorological Fengyun-3 (FY-3) satellites, the Microwave Humidity Sounder (MWHS) has been continuously observing global humidity since 2008. The reprocessing of historical FY-3 MWHS data is documented in detail in this study. After the data are calibrated and corrected, the quality of the reprocessed dataset is evaluated and the improvement is shown. The results suggest that the bias of MWHS observations is reduced to approximately 0.8 K relative to the METOP-A Microwave Humidity Sounder (MHS). The temporal variability of MWHS is highly correlated with the instrument temperature. After reprocessing, the scene-temperature dependency is mitigated for all 183 GHz channels, and the consistency and stability across FY-3A/B/C are also improved.
基金 (Funding): National Natural Science Foundation of China (No. 61203184).
文摘 (Abstract): A system reliability model based on a Bayesian network (BN) is built via an evolutionary strategy called the dual genetic algorithm (DGA). A BN is a probabilistic approach for analyzing relationships between stochastic events. In contrast with traditional methods, where the BN model is built by professionals, the DGA is proposed for the automatic analysis of historical data and construction of the BN for estimating system reliability. The whole solution space of BN structures is searched by the DGA, and a more accurate BN model is obtained. The efficacy of the proposed method is demonstrated on several examples from the literature.
文摘 (Abstract): The subject of legal history belongs to the interdisciplinary field of law and history. Because of this interdisciplinary attribute, drawing on the experience accumulated in the development of historiography is very important: the subject of legal history must attend to the problems encountered, and the experience gained, in the historical discipline's development. Historiographical development since the last century displays the importance and necessity of social science theory in the study of history. Researchers of legal history should focus on the comprehensive use of the theories and methods of the social sciences and, through the innovative use of historical data and multidisciplinary interpretation methods, re-interpret historical events and figures. With the aid of methods from multiple disciplines, a new phase in the study of legal history can be opened.
文摘 (Abstract): Observational facts from the Maldives, Goa and Bangladesh in the Indian Ocean and from Fiji and New Caledonia in the Pacific record a high sea level in the 17th century, a low sea level in the 18th century, a high sea level in the early 19th century and a stable sea level in the last 50-70 years. This cannot be understood in terms of glacial eustasy (or in terms of steric effects or tectonics), only in terms of rotational eustasy. The present paper gives a summary of the observational facts behind the formulation of the novel concept of rotational eustasy. It reveals a common trend of sea level changes that is opposed to the sea level changes in the northern hemisphere and to global climatic changes in general; rotational eustasy offers a logical explanation.
文摘 (Abstract): Risk management is relevant for every project that seeks to avoid and suppress unanticipated costs, basically calling for pre-emptive action. The current work proposes a new approach for handling risks based on predictive analytics and machine learning (ML) that can work in real time to help avoid risks and increase project adaptability. The main aim of the study is to ascertain risk presence in projects by using historical data from previous projects, focusing on important aspects such as duration, task time, resources, and project results. The t-SNE technique applies feature engineering to reduce dimensionality while preserving important structural properties. The process is evaluated using measures including recall, F1-score, accuracy, and precision. The results demonstrate that the Gradient Boosting Machine (GBM) achieves an impressive 85% accuracy, 82% precision, 85% recall, and 80% F1-score, surpassing previous models. Additionally, predictive analytics achieves a resource utilisation efficiency of 85%, compared to 70% for traditional allocation methods, and a project cost reduction of 10%, double the 5% achieved by traditional approaches. Furthermore, the study indicates that while GBM excels in overall accuracy, Logistic Regression (LR) offers more favourable precision-recall trade-offs, highlighting the importance of model selection in project risk management.
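A minimal sketch of the GBM risk-classification step is shown below, on synthetic project features (planned duration, mean task time, resource load). The features, the ground-truth rule, and the split are illustrative assumptions, not the study's dataset or pipeline.

```python
# Hedged sketch: flag at-risk projects with a Gradient Boosting classifier on
# synthetic features. The feature set and risk rule are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 400
X = np.column_stack([
    rng.uniform(30, 365, n),   # planned duration (days)
    rng.uniform(1, 40, n),     # mean task time (hours)
    rng.uniform(0.3, 1.2, n),  # resource load factor
])
# Illustrative ground truth: long, heavily loaded projects tend to be risky.
y = ((X[:, 0] > 200) & (X[:, 2] > 0.8)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)
```

On such a clean synthetic rule the classifier scores well above the 85% reported in the abstract; real project data is noisier, which is why precision-recall trade-offs (where LR was competitive) matter in model selection.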
基金 (Funding): Supported by the China Meteorological Administration Special Public Welfare Research Fund (GYHY201206009), the National Basic Research Program of China (2012CB955301), and the China University Research Talents Recruitment Program (B13045).
文摘 (Abstract): Due to the increasing requirements for accuracy in weather and climate forecasting, it is necessary to develop strategies for model error correction alongside numerical modeling and data assimilation techniques. This study classifies correction strategies according to the types of forecast errors and reviews recent studies on these strategies. Among them, the analogue-dynamical method has been developed in China; it combines statistical methods with the dynamical model, corrects model errors based on analogue information, and effectively utilizes historical data in dynamical forecasts. In this study, the fundamental principles and technical solutions of the analogue-dynamical method, and its development history for forecasts on different timescales, are introduced. It is shown that this method can effectively improve medium- and extended-range forecasts, monthly-average circulation forecasts, and short-term climate prediction. As an innovative technique independently developed in China, the analogue-dynamical method plays an important role in both weather forecasting and climate prediction, and has potential applications in wider fields.