Bailongjiang watershed in southern Gansu province, China, is one of the most landslide-prone regions in China, characterized by a very high frequency of landslide occurrence. In order to predict landslide occurrence, a comprehensive landslide susceptibility map is required, which may be significantly helpful in reducing loss of property and human life. In this study, an integrated model combining the information value method and logistic regression is proposed, using their merits to the maximum and overcoming their weaknesses, which may enhance the precision and accuracy of landslide susceptibility assessment. A detailed and reliable landslide inventory with 1587 landslides was prepared and randomly divided into two groups: (i) a training dataset and (ii) a testing dataset. Eight distinct landslide conditioning factors, including lithology, slope gradient, aspect, elevation, distance to drainages, distance to faults, distance to roads and vegetation coverage, were selected for landslide susceptibility mapping. The produced landslide susceptibility maps were validated by success rate and prediction rate curves. The validation results show that the success rate and the prediction rate of the integrated model are 81.7% and 84.6%, respectively, which indicates that the proposed integrated method is reliable for producing an accurate landslide susceptibility map and that the results may be used for landslide management and mitigation.
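The core of the information value method described above is a per-class statistic: the log-ratio of the landslide density inside a factor class to the overall landslide density of the study area. A minimal sketch follows; the inventory size (1587) comes from the abstract, but the cell counts for the example class are hypothetical and purely illustrative.

```python
import math

def information_value(landslide_cells, total_cells, class_landslide, class_total):
    """Information value of one factor class: the natural log of the ratio
    between the landslide density inside the class and the overall density.
    Positive values indicate the class favours landslide occurrence."""
    class_density = class_landslide / class_total
    overall_density = landslide_cells / total_cells
    return math.log(class_density / overall_density)

# Hypothetical counts: 1587 landslides over 1,000,000 grid cells;
# a steep-slope class of 50,000 cells containing 400 landslides.
iv = information_value(1587, 1_000_000, 400, 50_000)  # > 0: susceptible class
```

Summing such class values over all eight conditioning factors for each cell yields the susceptibility score that the map classifies.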
The present study is focused on a comparative evaluation of landslide hazard using the analytical hierarchy process and the information value method for hazard assessment in the highly tectonized Chamba region in the bosom of the Himalaya. During the study, information about the causative factors was generated and landslide hazard zonation maps were delineated using the Information Value Method (IV) and the Analytical Hierarchy Process (AHP) in ArcGIS (ESRI). For this purpose, the study area was selected in a part of the Ravi river catchment along the landslide-prone Chamba to Bharmour road corridor of National Highway NH-154A in Himachal Pradesh, India. A number of landslide-triggering geo-environmental factors, i.e. slope, aspect, relative relief, soil, curvature, land use and land cover (LULC), lithology, drainage density, and lineament density, were selected for landslide hazard mapping based on the landslide inventory. The landslide hazard zonation maps were categorized into five classes: very high hazard, high hazard, medium hazard, low hazard, and very low hazard. The results from the two methods were validated using Area Under Curve (AUC) plots. It is found that the hazard zonation maps prepared using the information value method and the analytical hierarchy process possess prediction rates of 78.87% and 75.42%, respectively. Hence, the landslide hazard zonation map obtained using the information value method is proposed to be more useful for the study area. These final hazard zonation maps can be used by various stakeholders such as engineers and administrators for proper maintenance and smooth traffic flow between the Chamba and Bharmour cities, which is the only route connecting these tourist places.
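The AHP step in studies like this one derives factor weights from a pairwise-comparison matrix via its principal eigenvector and checks the consistency ratio (CR). A sketch under assumed inputs; the three factors and their relative importances below are hypothetical, not taken from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Factor weights from an AHP pairwise-comparison matrix using the
    principal eigenvector, plus the consistency ratio (CR < 0.1 is the
    usual acceptability threshold)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalize to sum to 1
    lam = eigvals[k].real                          # principal eigenvalue
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty random index
    cr = (lam - n) / (n - 1) / ri if ri else 0.0
    return w, cr

# Hypothetical 3-factor judgment: slope is twice as important as lithology
# and three times as important as aspect.
A = [[1, 2, 3],
     [1/2, 1, 2],
     [1/3, 1/2, 1]]
w, cr = ahp_weights(A)
```

The weights then multiply the rated factor maps to produce the hazard score that is sliced into the five zonation classes.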
Slope aspect is one of the indispensable internal factors, besides lithology, relative elevation and slope degree. In this paper, the authors use an information value model with Geographical Information System (GIS) technology to study how slope aspect contributes to landslide growth in the Yunyang to Wushan segment of the Three Gorges Reservoir area, and the relationship between aspect and landslide growth is quantified. Through research on 205 landslide examples, it is found that south-facing slopes contribute the most, southeast- and southwest-facing slopes contribute moderately, and the other five aspects contribute little. The research result agrees well with field observations. The result of this paper can provide a sound basis for future construction in the Three Gorges Reservoir area.
Background: Schistosomiasis control is striving toward transmission interruption and even elimination; evidence-led control is of vital importance to eliminate the hidden dangers of schistosomiasis. This study attempts to identify high-risk areas of schistosomiasis in China by using information value and machine learning.
The concept of value of information (VOI) has been widely used in the oil industry when making decisions on the acquisition of new data sets for the development and operation of oil fields. The classical approach to VOI assumes that the outcome of the data acquisition process produces crisp values, which are uniquely mapped onto one of the deterministic reservoir models representing the subsurface variability. However, subsurface reservoir data are not always crisp; they can also be fuzzy and may correspond to various reservoir models to different degrees. The classical approach to VOI may not, therefore, lead to the best decision with regard to the need to acquire new data. Fuzzy logic, introduced in the 1960s as an alternative to classical logic, is able to manage the uncertainty associated with the fuzziness of data. In this paper, both classical and fuzzy theoretical formulations for VOI are developed and contrasted using inherently vague data. A case study, consistent with the future development of an oil reservoir, is used to compare the application of both approaches to the estimation of VOI. The results of the VOI process show that when the fuzzy nature of the data is included in the assessment, the value of the data decreases. In this case study, the results of the assessment using crisp data and fuzzy data change the decision from "acquire" the additional data (in the former) to "do not acquire" the additional data (in the latter). In general, different decisions are reached depending on whether the fuzzy nature of the data is considered during the evaluation. The implications of these results are significant in a domain such as the oil and gas industry, where investments are huge. This work strongly suggests the need to define the data as crisp or fuzzy for use in VOI, prior to implementing the assessment, to select and define the right approach.
The investment strategy choice of a state-owned commercial bank is related to information about changes in its franchise value. This paper analyzes the franchise-value change information of state-owned commercial banks. The information shows that the franchise value of state-owned commercial banks is declining. Along with this decline, a state-owned commercial bank strengthens its high-risk investment motive when choosing its investment strategy. State-owned commercial banks tend to run high risks when investing in securities because their investment variety is very limited. Based on the theoretical principles of controlling securities investment risk, this paper proposes countermeasures and suggestions for state-owned commercial banks to strengthen the control of their securities investment risk in order to perfect their investment strategies.
European Community policy concerning water is placing increasing demands on the acquisition of information about the quality of aquatic environments. The cost of this information has led to a reflection on the rationalization of monitoring networks and, therefore, on the economic value of the information produced by these networks. The aim of this article is to contribute to this reflection. To do so, we used the Bayesian framework to define the value of additional information in relation to the following three parameters: initial assumptions (prior probabilities) on the states of nature, costs linked to a poor decision (error costs) and accuracy of the additional information. We then analyzed the impact of these parameters on this value, particularly the combined role of prior probabilities and error costs, which increase or decrease the value of information depending on the initial uncertainty level. We then illustrated the results using a case study of a stream in the Bas-Rhin department in France.
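The Bayesian mechanics behind the three parameters named above can be sketched for a binary water-quality state: the value of an extra measurement is the reduction in expected loss it buys. All numbers below are hypothetical illustrations, not the paper's case-study values.

```python
def voi_measurement(p_good, cost_false_alarm, cost_miss, accuracy):
    """Value of one additional measurement for a good/degraded state.
    Wrong decisions cost: acting needlessly (false alarm) or waiting
    while the state is degraded (miss); correct decisions cost nothing."""
    def best_loss(p_g):
        # 'wait' risks a miss if degraded; 'act' wastes money if good.
        return min((1 - p_g) * cost_miss, p_g * cost_false_alarm)

    loss_prior = best_loss(p_good)
    loss_post = 0.0
    for says_good in (True, False):
        p_obs_g = accuracy if says_good else 1 - accuracy   # P(obs | good)
        p_obs_b = 1 - accuracy if says_good else accuracy   # P(obs | degraded)
        p_obs = p_obs_g * p_good + p_obs_b * (1 - p_good)
        posterior_good = p_obs_g * p_good / p_obs           # Bayes update
        loss_post += p_obs * best_loss(posterior_good)
    return loss_prior - loss_post   # reduction in expected loss

# Hypothetical stream: 70% prior that quality is good, a missed degradation
# costs 100, a needless remediation costs 30, sensor accuracy 90%.
voi = voi_measurement(0.7, 30.0, 100.0, 0.9)
```

As the abstract notes, the same accuracy is worth more or less depending on how the prior and the two error costs interact: when the prior already makes one action clearly best, the measurement's value shrinks.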
Background: This paper presents a case study on 100Credit, an Internet credit service provider in China. 100Credit began as an IT company specializing in e-commerce recommendation before getting into the credit rating business. The company makes use of Big Data on multiple aspects of individuals' online activities to infer their potential credit risk. Methods: Based on 100Credit's business practices, this paper summarizes four aspects related to the value of Big Data in Internet credit services. Results: 1) value from large data volume that provides access to more borrowers; 2) value from prediction correctness in reducing lenders' operational cost; 3) value from the variety of services catering to different needs of lenders; and 4) value from information protection to sustain credit service businesses. Conclusion: The paper also discusses the opportunities and challenges of Big Data-based credit risk analysis, which need to be addressed in future research and practice.
The management of information flow for production improvement has always been a target of research. In this paper, the focus is on an analysis model of the characteristics of information flow in shop floor operations, based on the influence that the dimension (support or medium), direction and quality of information flow have on the value of information flow, using machine learning classification algorithms. The classification algorithms used to analyze the value of information flow are Decision Trees (DT) and Random Forest (RF), which achieve a score of 0.99 and a mean absolute error of 0.005. The results also show that, under management of information flow with DT or RF, the dimension of information such as digital information has the greatest value of information flow in shop floor operations when the shop floor is totally digitalized. The direction of information flow does not have any great influence on shop floor operations processes when the operations processes are digitalized or performed by operators as well as machines.
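A classification setup of the kind described could look as follows. The feature encoding and the labeling rule are assumptions made for illustration (the paper does not publish its dataset); the sketch only shows the DT/RF-style workflow of fitting a classifier to flow records and reading off feature importances.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical encoding of 500 information-flow records:
# dimension (0=paper, 1=digital), direction (0=vertical, 1=horizontal),
# and a continuous quality score in [0, 1].
X = rng.random((500, 3))
X[:, :2] = (X[:, :2] > 0.5).astype(float)
# Assumed labeling rule for illustration: digital, reasonably high-quality
# flows are the high-value ones; direction is deliberately irrelevant.
y = ((X[:, 0] == 1) & (X[:, 2] > 0.4)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
score = clf.score(X_te, y_te)             # held-out accuracy
importances = clf.feature_importances_    # dimension and quality dominate
```

With such a rule the forest recovers that dimension and quality drive value while direction contributes little, mirroring the abstract's finding.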
In a large ancient landslide, approximately 240,000 m³ of sediments were reactivated, posing a grave threat to the safety of iron ore stopes. To trace the deformation and evolution history of the reactivated landslide, we conducted geological surveys combined with real-time monitoring equipment to analyze the landslide data since 1986 and the deformation status of the reactivated landslide. A multi-factor comprehensive landslide monitoring method and a Newton force early warning system (NFEWS) were established, focusing on underground stress, surface deformation information and landslide stability. Furthermore, we developed a four-level early warning grading standard, employing surface cracks and changes in underground stress thresholds as early warning indicators. This standard adds expert assessment to avoid false alarms and to capture the real-time dynamics of mining landslides during excavation and transportation. Through the case study and analysis of the Nanfen open-pit mine, the NFEWS system offers valuable insights and solutions for early warning of landslides in analogous open-pit mines. Finally, an evaluation index system of landslide hazard susceptibility was established by selecting the Newton force influence factor. A landslide susceptibility zoning map was constructed using the information value model. Its rationality and accuracy were assessed from three perspectives: frequency ratio, landslide hazard point density, and the receiver operating characteristic (ROC) curve. The improved Newton force landslide early warning system provides a good reference for the analysis and monitoring of the creep landslide evolution process.
Intertextile Beijing Apparel Fabrics, to be held from 29-31 March 2009 at the China International Exhibition Centre, will showcase the latest textiles from around the world on 48,000 sqm of exhibition space. The event has confirmed 1100 exhibitors from 14 countries and regions.
Autonomous underwater vehicle (AUV)-assisted data collection is an efficient approach to implementing the smart ocean. However, data collection in time-varying ocean currents is plagued by two critical issues: AUV yaw and sensor node movement. We propose an adaptive AUV-assisted data collection strategy for ocean currents to address these issues. First, we consider the energy consumption of an AUV in conjunction with the value of information (VoI) over the sensor nodes and formulate an optimization problem to maximize the VoI-energy ratio. The AUV yaw problem is then solved by deriving the AUV's reachable region in different ocean current environments and the optimal cruising direction to the target nodes. Finally, using the predicted VoI-energy ratio, we design a distributed path planning algorithm to sequentially select the next target node for the AUV. The simulation results indicate that the proposed strategy can utilize ocean currents to aid AUV navigation, thereby reducing the AUV's energy consumption and ensuring timely data collection.
Big data technologies have seen tremendous growth in recent years. They are widely used in both industry and academia. In spite of such exponential growth, these technologies lack adequate measures to protect data from misuse/abuse. Corporations that collect data from multiple sources are at risk of liabilities due to the exposure of sensitive information. In the current implementation of Hadoop, only file-level access control is feasible. Providing users with the ability to access data based on the attributes in a dataset or the user's role is complicated because of the sheer volume and multiple formats (structured, unstructured and semi-structured) of data. In this paper, we propose an access control framework which enforces access control policies dynamically based on the sensitivity of the data. This framework enforces access control policies by harnessing the data context, usage patterns and information sensitivity. Information sensitivity changes over time with the addition and removal of datasets, which can lead to modifications in access control decisions. The proposed framework accommodates these changes. The proposed framework is automated to a large extent, as the data itself determines the sensitivity with minimal user intervention. Our experimental results show that the proposed framework is capable of enforcing access control policies on non-multimedia datasets with minimal overhead.
With the rapid development of big data technology, the personal credit evaluation industry has entered a new stage. Among its topics, the evaluation of personal credit based on mobile telecommunications data is one of the hotspots of current research. However, due to the complexity and diversity of personal credit evaluation variables, in order to reduce the complexity of the model and improve its prediction accuracy, we need to reduce the dimension of the input variables. Using the data provided by a mobile telecommunications operator, this paper divides the data into a training set and a verification set. We perform correlation analysis on each indicator of the data in the training set, calculate the corresponding IV (information value) based on the WOE (weight of evidence) value of each selected index, and then bin the data with SPSS Modeler. The selected variables were modeled using a logistic regression algorithm. In order to make the regression results more practical, we extract scoring rules according to the results of the logistic regression, convert them into the form of a scorecard, and finally verify the validity of the model.
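The WOE/IV screening step mentioned above follows a standard formula: for each bin of a variable, WOE is the log-ratio of the good and bad distributions, and the bin's IV contribution is the distribution gap times its WOE. A minimal sketch with hypothetical bin counts (the paper's operator data are not public):

```python
import math

def woe_iv(good, bad, total_good, total_bad):
    """Weight of Evidence and IV contribution for one bin of a variable
    (good = repaid, bad = defaulted)."""
    dist_good = good / total_good
    dist_bad = bad / total_bad
    woe = math.log(dist_good / dist_bad)
    iv = (dist_good - dist_bad) * woe   # each contribution is >= 0
    return woe, iv

# Hypothetical bins of, say, a call-duration variable: (good, bad) counts.
bins = [(300, 20), (500, 50), (200, 130)]
total_good = sum(g for g, _ in bins)   # 1000
total_bad = sum(b for _, b in bins)    # 200
iv_total = sum(woe_iv(g, b, total_good, total_bad)[1] for g, b in bins)
# Common rule of thumb: IV > 0.3 marks a strong predictor worth keeping.
```

Variables whose total IV clears the chosen threshold enter the logistic regression, and the fitted WOE coefficients are then rescaled into scorecard points.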
In order to maximize the value of information (VoI) of collected data in unmanned aerial vehicle (UAV)-aided wireless sensor networks (WSNs), a UAV trajectory planning algorithm named maximum VoI first and successive convex approximation (MVF-SCA) is proposed. First, the Rician channel model is adopted in the system and sensor nodes (SNs) are divided into key nodes and common nodes. Secondly, the data collection problem is formulated as a mixed-integer non-linear program (MINLP). The problem is divided into two sub-problems according to the different types of SNs to seek a sub-optimal solution with low complexity. Finally, the MVF-SCA algorithm for UAV trajectory planning is proposed, which can not only be used for daily data collection in the target area, but can also collect time-sensitive abnormal data promptly when an exception occurs. Simulation results show that, compared with the existing classic traveling salesman problem (TSP) algorithm and a greedy path planning algorithm, the VoI collected by the proposed algorithm can be improved by about 15% to 30%.
The Maximum Likelihood method estimates the parameter values of a statistical model that maximize the corresponding likelihood function, given the sample information. This is the primal approach that, in this paper, is presented as a mathematical programming specification whose solution requires the formulation of a Lagrange problem. A result of this setup is that the Lagrange multipliers associated with the linear statistical model (where sample observations are regarded as a set of constraints) are equal to the vector of residuals scaled by the variance of those residuals. The novel contribution of this paper consists in deriving the dual model of the Maximum Likelihood method under normality assumptions. This model minimizes a function of the variance of the error terms subject to orthogonality conditions between the model residuals and the space of explanatory variables. An intuitive interpretation of the dual problem appeals to basic elements of information theory and an economic interpretation of Lagrange multipliers to establish that the dual maximizes the net value of the sample information. This paper presents the dual ML model for a single regression and provides a numerical example of how to obtain maximum likelihood estimates of the parameters of a linear statistical model using the dual specification.
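The multiplier result quoted above can be sketched directly (notation assumed here: $y$ is the $n$-vector of observations, $X$ the matrix of explanatory variables, $e$ the residual vector; signs depend on how the constraint is written):

```latex
% Primal ML under normality, with the observations as constraints:
\max_{\beta,\,\sigma^2,\,e}\;
  \ell = -\tfrac{n}{2}\log(2\pi\sigma^2) - \frac{e'e}{2\sigma^2}
  \quad\text{s.t.}\quad y = X\beta + e
% Lagrangian and first-order conditions:
L = \ell + \lambda'(y - X\beta - e), \qquad
\frac{\partial L}{\partial e} = -\frac{e}{\sigma^2} - \lambda = 0
  \;\Rightarrow\; \lambda = -\frac{e}{\sigma^2}, \qquad
\frac{\partial L}{\partial \beta} = -X'\lambda = 0
  \;\Rightarrow\; X'e = 0
```

So, up to sign, the multipliers are the residuals scaled by their variance, and the stationarity condition in $\beta$ yields exactly the orthogonality between residuals and explanatory variables that the dual model imposes as a constraint.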
In this work, we provide a more consistent alternative for performing value of information (VOI) analyses to address sequential decision problems in reservoir management and to generate insights into the process of reservoir decision-making. These sequential decision problems are often modeled and solved as stochastic dynamic programs, but once the state space becomes large and complex, traditional techniques, such as policy iteration and backward induction, quickly become computationally demanding and intractable. To resolve these issues and use fewer computational resources, we instead make use of a viable alternative called approximate dynamic programming (ADP), a powerful solution technique that can handle complex, large-scale problems and discover a near-optimal solution for intractable sequential decision-making. We compare and test the performance of several machine learning techniques within the domain of ADP to determine the optimal time for beginning a polymer flooding process within a reservoir development plan. The approximate dynamic approach used here takes into account both the effect of the information obtained before a decision is made and the effect of the information that might be obtained to support future decisions, while significantly improving both the timing and the value of the decision, thereby leading to a significant increase in economic performance.
This paper discusses an organizational model to be used for both conventional and virtual organizations. The model deals with variable relationships within an organization and provides a framework for overall organizational design that may include relationships among different design variables and external relationships with the environment. Based on research on virtual organizations, this paper also illustrates the new model of organization in the real world, with examples such as the Beijing 2008 Olympic Games and the Dongfeng Automobile Group.
To avoid the negative effects of disturbances on satellites, the characteristics of micro-vibration in flywheels are studied. Considering rotor imbalance, bearing imperfections and structural elasticity, an extended model of micro-vibration is established. In the feature extraction of micro-vibration, singular value decomposition combined with the improved Akaike Information Criterion (AIC-SVD) is applied for denoising. More robust and self-adaptive than peak-threshold denoising, AIC-SVD can effectively remove the noise components. Subsequently, the effective harmonic coefficients are extracted by a binning algorithm. The results show that the harmonic coefficients are highly identifiable in the frequency domain. Apart from the fundamental frequency caused by rotor imbalance, the harmonics are also caused by the coupling of imperfections in the bearing components.
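The SVD-based denoising step can be sketched on a synthetic harmonic signal: embed the series in a Hankel (trajectory) matrix, keep the leading singular components, and reconstruct by anti-diagonal averaging. In the paper the retained rank is chosen automatically by the AIC criterion; the sketch below fixes the rank by hand for brevity, and the signal and window length are illustrative assumptions.

```python
import numpy as np

def svd_denoise(signal, window, rank):
    """Hankel-matrix SVD denoising: build the trajectory matrix, truncate
    to the leading `rank` singular components, and reconstruct the 1-D
    series by averaging over anti-diagonals."""
    n = len(signal)
    rows = n - window + 1
    H = np.array([signal[i:i + window] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]      # low-rank approximation
    out = np.zeros(n)
    counts = np.zeros(n)
    for i in range(rows):                            # anti-diagonal averaging
        out[i:i + window] += H_r[i]
        counts[i:i + window] += 1
    return out / counts

t = np.linspace(0, 1, 400)
clean = np.sin(2 * np.pi * 25 * t)                   # single harmonic
noisy = clean + 0.5 * np.random.default_rng(1).standard_normal(400)
denoised = svd_denoise(noisy, window=50, rank=2)     # a sinusoid has rank 2
```

Replacing the fixed `rank` with an AIC-style scan over the singular-value spectrum gives the self-adaptive behavior the abstract attributes to AIC-SVD.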
All kinds of sensing organs in humans are able to reflect only the formal factors of objects, termed formal information. It is believed, however, that not only the formal information but also the content information and value information of objects play fundamental roles in the process of information understanding and decision-making in human thinking. Therefore, the questions of where and how the content information and the value information are produced from the formal information become critical in the theory of information understanding and decision-making. A conjectural theory that may reasonably answer these questions is presented in this paper.
Funding: Supported by the Project of the 12th Five-Year National Sci-Tech Support Plan of China (2011BAK12B09) and the China Special Project of Basic Work of Science and Technology (2011FY110100-2).
文摘Bailongjiang watershed in southern Gansu province, China, is one of the most landslide-prone regions in China, characterized by very high frequency of landslide occurrence. In order to predict the landslide occurrence, a comprehensive map of landslide susceptibility is required which may be significantly helpful in reducing loss of property and human life. In this study, an integrated model of information value method and logistic regression is proposed by using their merits at maximum and overcoming their weaknesses, which may enhance precision and accuracy of landslide susceptibility assessment. A detailed and reliable landslide inventory with 1587 landslides was prepared and randomly divided into two groups,(i) training dataset and(ii) testing dataset. Eight distinct landslide conditioning factors including lithology, slope gradient, aspect, elevation, distance to drainages,distance to faults, distance to roads and vegetation coverage were selected for landslide susceptibility mapping. The produced landslide susceptibility maps were validated by the success rate and prediction rate curves. The validation results show that the success rate and the prediction rate of the integrated model are 81.7 % and 84.6 %, respectively, which indicate that the proposed integrated method is reliable to produce an accurate landslide susceptibility map and the results may be used for landslides management and mitigation.
文摘The present study is focused on a comparative evaluation of landslide disaster using analytical hierarchy process and information value method for hazard assessment in highly tectonic Chamba region in bosom of Himalaya. During study, the information about the causative factors was generated and the landslide hazard zonation maps were delineated using Information Value Method(IV) and Analytical Hierarchy Process(AHP) using Arc GIS(ESRI). For this purpose, the study area was selected in a part of Ravi river catchment along one of the landslide prone Chamba to Bharmour road corridor of National Highway(NH^(-1)54 A) in Himachal Pradesh, India. A numeral landslide triggering geoenvironmental factors i.e. slope, aspect, relative relief, soil, curvature, land use and land cover(LULC), lithology, drainage density, and lineament density were selected for landslide hazard mapping based on landslide inventory. Landslide hazard zonation map was categorized namely "very high hazard, high hazard, medium hazard, low hazard, and very low hazard". The results from these two methods were validated using Area Under Curve(AUC) plots. It is found that hazard zonation map prepared using information value method and analytical hierarchy process methods possess the prediction rate of 78.87% and 75.42%, respectively. Hence, landslide hazardzonation map obtained using information value method is proposed to be more useful for the study area. These final hazard zonation maps can be used by various stakeholders like engineers and administrators for proper maintenance and smooth traffic flow between Chamba and Bharmour cities, which is the only route connecting these tourist places.
文摘Slope aspect is one of the indispensable internal factors besides lithology, relative elevation and slope degree. In this paper authors use information value model with Geo graphical Information System (GIS) technology to study how slope aspect contributes to landslide growth from Yunyang to Wushan segment in the Three Gorges Reservoir area, and the relationship between aspect and landslide growth is quantified. Through the research on 205 landslides examples, it is found that the slope contributes most whose aspect is towards south,southeast and southwest aspect contribute moderately, and other five aspects contribute little. The research result inosculates preferably with the fact. The result of this paper can provide potent gist to the construction of Three Gorges Reservoir area in future.
文摘Background:Schistosomiasis control is striving forward to transmission interruption and even elimination,evidence-lead control is of vital importance to eliminate the hidden dan gers of schistosomiasis.This study attempts to ide ntify high risk areas of schistosomiasis in China by using in formation value and machine learni ng.
文摘The concept of value of information(VOI)has been widely used in the oil industry when making decisions on the acquisition of new data sets for the development and operation of oil fields.The classical approach to VOI assumes that the outcome of the data acquisition process produces crisp values,which are uniquely mapped onto one of the deterministic reservoir models representing the subsurface variability.However,subsurface reservoir data are not always crisp;it can also be fuzzy and may correspond to various reservoir models to different degrees.The classical approach to VOI may not,therefore,lead to the best decision with regard to the need to acquire new data.Fuzzy logic,introduced in the 1960 s as an alternative to the classical logic,is able to manage the uncertainty associated with the fuzziness of the data.In this paper,both classical and fuzzy theoretical formulations for VOI are developed and contrasted using inherently vague data.A case study,which is consistent with the future development of an oil reservoir,is used to compare the application of both approaches to the estimation of VOI.The results of the VOI process show that when the fuzzy nature of the data is included in the assessment,the value of the data decreases.In this case study,the results of the assessment using crisp data and fuzzy data change the decision from"acquire"the additional data(in the former)to"do not acquire"the additional data(in the latter).In general,different decisions are reached,depending on whether the fuzzy nature of the data is considered during the evaluation.The implications of these results are significant in a domain such as the oil and gas industry(where investments are huge).This work strongly suggests the need to define the data as crisp or fuzzy for use in VOI,prior to implementing the assessment to select and define the right approach.
Abstract: The investment strategy choice of a state-owned commercial bank is related to information about changes in its franchise value. This paper analyzes that information, which shows that the franchise value of state-owned commercial banks is declining. As franchise value declines, a state-owned commercial bank's motive for high-risk investment strengthens when it chooses its investment strategy. Such banks tend to run high securities-investment risk because their range of investment instruments is very narrow. Based on the theoretical principles of controlling securities investment risk, this paper proposes countermeasures and suggestions for state-owned commercial banks to strengthen control of their securities investment risk in order to improve their investment strategies.
Abstract: European Community policy concerning water is placing increasing demands on the acquisition of information about the quality of aquatic environments. The cost of this information has led to a reflection on the rationalization of monitoring networks and, therefore, on the economic value of the information produced by these networks. The aim of this article is to contribute to this reflection. To do so, we used the Bayesian framework to define the value of additional information in relation to the following three parameters: initial assumptions (prior probabilities) on the states of nature, costs linked to a poor decision (error costs), and accuracy of the additional information. We then analyzed the impact of these parameters on this value, particularly the combined role of prior probabilities and error costs, which increase or decrease the value of information depending on the initial uncertainty level. We then illustrated the results using a case study of a stream in the Bas-Rhin department in France.
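The three parameters named above (prior probabilities, error costs, signal accuracy) can be combined in a standard Bayesian preposterior analysis. The sketch below is a minimal illustration under assumed symmetric error rates, not the article's actual model of the monitoring network:

```python
def expected_cost(p_bad, cost_fp, cost_fn):
    # Best decision under current beliefs: "act" risks a false positive when
    # water quality is actually good; "do nothing" risks a false negative.
    return min((1 - p_bad) * cost_fp, p_bad * cost_fn)

def value_of_information(p_bad, cost_fp, cost_fn, accuracy):
    # Preposterior analysis with a symmetric-error signal:
    # P(signal "bad" | bad) = P(signal "good" | good) = accuracy.
    p_sig_bad = accuracy * p_bad + (1 - accuracy) * (1 - p_bad)
    expected_posterior_cost = 0.0
    for p_signal, joint_bad in ((p_sig_bad, accuracy * p_bad),
                                (1 - p_sig_bad, (1 - accuracy) * p_bad)):
        if p_signal > 0:
            posterior_bad = joint_bad / p_signal   # Bayes update
            expected_posterior_cost += p_signal * expected_cost(
                posterior_bad, cost_fp, cost_fn)
    # VOI = cost of deciding now minus expected cost after observing the signal.
    return expected_cost(p_bad, cost_fp, cost_fn) - expected_posterior_cost
```

For example, with a 30% prior probability of poor quality, error costs of 20 and 100, and 90% accuracy, the additional measurement is worth 9.6 cost units; an uninformative signal (accuracy 0.5) is worth exactly zero, illustrating how priors and error costs jointly drive the value.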
Abstract: Background: This paper presents a case study of 100Credit, an Internet credit service provider in China. 100Credit began as an IT company specializing in e-commerce recommendation before getting into the credit rating business. The company makes use of Big Data on multiple aspects of individuals' online activities to infer their potential credit risk. Methods: Based on 100Credit's business practices, this paper summarizes four aspects related to the value of Big Data in Internet credit services. Results: 1) value from large data volume that provides access to more borrowers; 2) value from prediction correctness in reducing lenders' operational cost; 3) value from the variety of services catering to different needs of lenders; and 4) value from information protection to sustain credit service businesses. Conclusion: The paper also discusses the opportunities and challenges of Big Data-based credit risk analysis, which needs to be improved in future research and practice.
Abstract: The management of information flow for production improvement has always been a research target. This paper focuses on a model for analyzing the characteristics of information flow in shop floor operations, based on the influence that the dimension (support or medium), direction, and quality of information flow have on its value, using machine learning classification algorithms. The classification algorithms used to analyze the value of information flow are Decision Trees (DT) and Random Forest (RF), which achieved a score of 0.99 with a mean absolute error of 0.005. The results also show that, among the dimensions of information, digital information carries the greatest information-flow value in shop floor operations when the shop floor is fully digitalized, whereas the direction of information flow has no great influence on shop floor processes, whether they are carried out digitally or by operators and machines.
Funding: the National Natural Science Foundation of China (NSFC) (41941018 and 52304111) and the China Scholarship Council Program (202206430007).
Abstract: In a large ancient landslide, approximately 240,000 m3 of sediments were reactivated, posing a grave threat to the safety of iron ore stopes. To trace the deformation and evolution history of the reactivated landslide, we conducted geological surveys combined with real-time monitoring equipment to analyze landslide data recorded since 1986 and the deformation status of the reactivated landslide. A multi-factor comprehensive landslide monitoring method and a Newton force early warning system (NFEWS) were established, focusing on underground stress, surface deformation information, and landslide stability. Furthermore, we developed a four-level early warning grading standard, employing surface cracks and changes in underground stress thresholds as early warning indicators. This standard adds expert assessment to avoid false alarms and capture the real-time dynamics of mining landslides during excavation and transportation. Through the case study and analysis of the Nanfen open-pit mine, the NFEWS system offers valuable insights and solutions for early warning of landslides in analogous open-pit mines. Finally, an evaluation index system of landslide hazard susceptibility was established by selecting the Newton force influence factors, and a landslide susceptibility zoning map was constructed using the information value model. Its rationality and accuracy were assessed from three perspectives: frequency ratio, landslide hazard point density, and the receiver operating characteristic (ROC) curve. The improved Newton force landslide early warning system provides a good reference for the analysis and monitoring of the creep landslide evolution process.
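The information value model used here (and in several of the landslide studies above) scores each class of a conditioning factor by comparing its share of landslides with its share of area, I = ln[(Ni/N)/(Si/S)]. The sketch below uses hypothetical counts for a single slope-gradient factor; the class names and numbers are invented for illustration:

```python
import math

def information_value(n_class, n_total, area_class, area_total):
    # I = ln[(Ni/N) / (Si/S)]: positive when the class hosts a larger share of
    # landslides than of area, i.e. it is more susceptible than average.
    return math.log((n_class / n_total) / (area_class / area_total))

# Hypothetical landslide counts and areas (km^2) for three slope classes.
slope_classes = {"gentle": (50, 400.0), "moderate": (300, 350.0), "steep": (250, 250.0)}
N, S = 600, 1000.0
iv = {c: information_value(n, N, a, S) for c, (n, a) in slope_classes.items()}
# A cell's susceptibility index is the sum of the information values of the
# classes it falls into across all conditioning factors.
```

In this toy example the steep class gets a positive score (over-represented in landslides) and the gentle class a negative one, which is exactly the contrast the zoning map thresholds into susceptibility levels.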
Abstract: Intertextile Beijing Apparel Fabrics, to be held from 29-31 March 2009 at the China International Exhibition Centre, will showcase the latest textiles from around the world on 48,000 sqm of exhibition space. The event has confirmed 1100 exhibitors from 14 countries and regions including
Funding: supported by the National Natural Science Foundation of China (62071472, 62101556), the Natural Science Foundation of Jiangsu Province (BK20200650, BK20210489), and the Future Network Scientific Research Fund Project (FNSRFP2021-YB-12).
Abstract: Autonomous underwater vehicle (AUV)-assisted data collection is an efficient approach to implementing the smart ocean. However, data collection in time-varying ocean currents is plagued by two critical issues: AUV yaw and sensor node movement. We propose an adaptive AUV-assisted data collection strategy for ocean currents to address these issues. First, we consider the energy consumption of the AUV in conjunction with the value of information (VoI) over the sensor nodes and formulate an optimization problem to maximize the VoI-energy ratio. The AUV yaw problem is then solved by deriving the AUV's reachable region in different ocean current environments and the optimal cruising direction to the target nodes. Finally, using the predicted VoI-energy ratio, we design a distributed path planning algorithm that sequentially selects the next target node for the AUV. The simulation results indicate that the proposed strategy can utilize ocean currents to aid AUV navigation, thereby reducing the AUV's energy consumption and ensuring timely data collection.
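The core greedy step of such a strategy, picking the next node by predicted VoI per unit of travel energy, can be sketched as follows. This is a simplified illustration that ignores currents and yaw; the decay constant, speed, and power figures are invented, not taken from the paper:

```python
import math

def voi_energy_ratio(node, auv_pos, voi, speed=1.0, power=30.0, decay=0.05):
    # Predicted VoI decays with travel time (data grows stale); energy
    # consumption grows linearly with travel time at constant power.
    travel_time = math.dist(auv_pos, node) / speed
    predicted_voi = voi[node] * math.exp(-decay * travel_time)
    energy = power * travel_time
    return predicted_voi / energy

def next_target(auv_pos, pending, voi):
    # Greedy step of the path planner: visit the node with the best
    # predicted VoI-energy ratio first.
    return max(pending, key=lambda n: voi_energy_ratio(n, auv_pos, voi))

# A nearby node with slightly lower VoI beats a distant, slightly richer one.
voi = {(10.0, 0.0): 5.0, (100.0, 0.0): 6.0}
target = next_target((0.0, 0.0), list(voi), voi)
```

The full algorithm would additionally restrict candidates to the current-dependent reachable region before applying this ranking.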
Abstract: Big data technologies have seen tremendous growth in recent years. They are widely used in both industry and academia. In spite of such exponential growth, these technologies lack adequate measures to protect data from misuse/abuse. Corporations that collect data from multiple sources are at risk of liabilities due to the exposure of sensitive information. In the current implementation of Hadoop, only file-level access control is feasible. Providing users with the ability to access data based on the attributes in a dataset or the user's role is complicated because of the sheer volume and multiple formats (structured, unstructured and semi-structured) of data. In this paper, we propose an access control framework which enforces access control policies dynamically based on the sensitivity of the data. This framework enforces access control policies by harnessing the data context, usage patterns and information sensitivity. Information sensitivity changes over time with the addition and removal of datasets, which can lead to modifications in access control decisions. The proposed framework accommodates these changes. The proposed framework is automated to a large extent, as the data itself determines the sensitivity with minimal user intervention. Our experimental results show that the proposed framework is capable of enforcing access control policies on non-multimedia datasets with minimal overhead.
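The key idea, access decisions that follow the data's current sensitivity rather than a static file ACL, can be sketched in a few lines. This is an illustrative toy, not the paper's Hadoop framework: the attribute weights and the max-attribute sensitivity rule are invented for demonstration:

```python
class SensitivityAwareACL:
    # Hypothetical weights: more identifying attributes score higher.
    ATTRIBUTE_WEIGHTS = {"name": 1, "purchase": 1, "email": 2, "ssn": 5}

    def __init__(self):
        self.datasets = {}

    def update_dataset(self, name, attributes):
        # Re-registering a dataset with new attributes implicitly
        # re-derives its sensitivity on the next access check.
        self.datasets[name] = set(attributes)

    def sensitivity(self, name):
        # Sensitivity is driven by the most sensitive attribute present.
        return max((self.ATTRIBUTE_WEIGHTS.get(a, 0)
                    for a in self.datasets.get(name, ())), default=0)

    def can_access(self, name, clearance):
        return clearance >= self.sensitivity(name)

acl = SensitivityAwareACL()
acl.update_dataset("orders", ["name", "purchase"])
before = acl.can_access("orders", 2)   # low-sensitivity data: allowed
acl.update_dataset("orders", ["name", "purchase", "ssn"])
after = acl.can_access("orders", 2)    # sensitivity rose with the new data
```

The same user's access flips from allowed to denied purely because the dataset changed, which is the dynamic behavior the framework is designed to accommodate.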
Abstract: With the rapid development of big data technology, the personal credit evaluation industry has entered a new stage. In particular, the evaluation of personal credit based on mobile telecommunications data is one of the hotspots of current research. However, because of the complexity and diversity of personal credit evaluation variables, we need to reduce the dimension of the input variables in order to reduce the complexity of the model and improve its prediction accuracy. Using the data provided by a mobile telecommunications operator, this paper divides the data into a training set and a verification set. We perform correlation analysis on each indicator in the training set, calculate the corresponding IV value based on the WOE value of each selected indicator, and then bin the data with SPSS Modeler. The selected variables are modeled using a logistic regression algorithm. To make the regression results more practical, we extract scoring rules from the logistic regression results, convert them into the form of a scorecard, and finally verify the validity of the model.
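The WOE/IV screening step described above is a standard calculation: each bin's weight of evidence is WOE = ln(good share / bad share), and the variable's information value is the sum of (good share - bad share) x WOE over its bins. A minimal sketch with hypothetical binned counts (the counts are invented, not the operator's data):

```python
import math

def woe_iv(bins):
    # bins: (good_count, bad_count) per bin of one candidate variable.
    good_total = sum(g for g, _ in bins)
    bad_total = sum(b for _, b in bins)
    woes, iv = [], 0.0
    for g, b in bins:
        good_share, bad_share = g / good_total, b / bad_total
        woe = math.log(good_share / bad_share)   # WOE of this bin
        woes.append(woe)
        iv += (good_share - bad_share) * woe     # bin contribution to IV
    return woes, iv

# Hypothetical binned counts for one telecom indicator.
woes, iv = woe_iv([(100, 50), (200, 100), (100, 350)])
```

A common rule of thumb keeps variables with IV roughly above 0.1 and treats values above 0.5 as very strong; the IV computed here (about 0.88) would clear that screen and feed into the logistic regression.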
Funding: The National Key R&D Program of China (No. 2018YFB1500800), the Specialized Development Foundation for the Achievement Transformation of Jiangsu Province (No. BA2019025), the Pre-Research Fund of Science and Technology on Near-Surface Detection Laboratory (No. 6142414190405), and the Open Project of the Key Laboratory of Wireless Sensor Network & Communication of Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences (No. 20190907).
Abstract: In order to maximize the value of information (VoI) of collected data in unmanned aerial vehicle (UAV)-aided wireless sensor networks (WSNs), a UAV trajectory planning algorithm named maximum VoI first and successive convex approximation (MVF-SCA) is proposed. First, the Rician channel model is adopted in the system and sensor nodes (SNs) are divided into key nodes and common nodes. Secondly, the data collection problem is formulated as a mixed integer non-linear program (MINLP). The problem is divided into two sub-problems according to the different types of SNs to seek a sub-optimal solution with low complexity. Finally, the MVF-SCA algorithm for UAV trajectory planning is proposed, which can not only be used for daily data collection in the target area, but can also collect time-sensitive abnormal data promptly when an exception occurs. Simulation results show that, compared with the existing classic traveling salesman problem (TSP) algorithm and a greedy path planning algorithm, the VoI collected by the proposed algorithm can be improved by about 15% to 30%.
Abstract: The Maximum Likelihood method estimates the parameter values of a statistical model that maximize the corresponding likelihood function, given the sample information. This is the primal approach that, in this paper, is presented as a mathematical programming specification whose solution requires the formulation of a Lagrange problem. A result of this setup is that the Lagrange multipliers associated with the linear statistical model (where sample observations are regarded as a set of constraints) are equal to the vector of residuals scaled by the variance of those residuals. The novel contribution of this paper consists in deriving the dual model of the Maximum Likelihood method under normality assumptions. This model minimizes a function of the variance of the error terms subject to orthogonality conditions between the model residuals and the space of explanatory variables. An intuitive interpretation of the dual problem appeals to basic elements of information theory and an economic interpretation of Lagrange multipliers to establish that the dual maximizes the net value of the sample information. This paper presents the dual ML model for a single regression and provides a numerical example of how to obtain maximum likelihood estimates of the parameters of a linear statistical model using the dual specification.
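Two of the claims above can be checked numerically on simulated data: under normality the ML estimates coincide with least squares, the residuals satisfy the dual's orthogonality conditions X'e = 0, and the Lagrange multipliers equal the residuals scaled by (the inverse of) the residual variance. The data-generating process below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# ML estimates under normality coincide with least squares.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta_hat
sigma2_hat = e @ e / n                 # ML (not unbiased) variance estimate

# Dual orthogonality conditions: residuals lie in the null space of X'.
orthogonality = X.T @ e                # every component is ~ 0

# Multipliers proportional to residuals scaled by the residual variance.
lagrange_multipliers = e / sigma2_hat
```

Running this, `orthogonality` is zero to machine precision, which is exactly the constraint set the dual problem imposes in place of the primal's sample-observation constraints.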
Funding: The authors acknowledge financial support from the Research Council of Norway through the Petromaks-2 project DIGIRES (RCN no. 280473) and the industrial partners AkerBP, Wintershall DEA, ENI, Petrobras, Equinor, Lundin, and Neptune Energy.
Abstract: In this work, we provide a more consistent alternative for performing value of information (VOI) analyses to address sequential decision problems in reservoir management, and we generate insights into the process of reservoir decision-making. These sequential decision problems are often modeled and solved as stochastic dynamic programs, but once the state space becomes large and complex, traditional techniques such as policy iteration and backward induction quickly become computationally demanding and intractable. To resolve these issues and use fewer computational resources, we instead make use of a viable alternative called approximate dynamic programming (ADP), a powerful solution technique that can handle complex, large-scale problems and discover a near-optimal solution for intractable sequential decision making. We compare and test the performance of several machine learning techniques that lie within the domain of ADP to determine the optimal time for beginning a polymer flooding process within a reservoir development plan. The approximate dynamic approach used here accounts both for the effect of the information obtained before a decision is made and for the effect of the information that might be obtained to support future decisions, while significantly improving both the timing and the value of the decision, thereby leading to a significant increase in economic performance.
Abstract: This paper discusses an organizational model to be used for both conventional and virtual organizations. The model deals with variable relationships within an organization and provides a framework for overall organizational design that may include relationships among different design variables and external relationships with the environment. Based on research on virtual organizations, this paper also illustrates the new model with real-world organizations such as the Beijing 2008 Olympic Games and the Dongfeng Automobile Group.
Funding: National Natural Science Foundation of China (No. U1831123) and the Fundamental Research Funds for the Central Universities, China (No. 2232017A3-04).
Abstract: To avoid the negative effects of disturbances on satellites, the characteristics of micro-vibration in flywheels are studied. Considering rotor imbalance, bearing imperfections and structural elasticity, an extended model of micro-vibration is established. In the feature extraction of micro-vibration, singular value decomposition combined with the improved Akaike Information Criterion (AIC-SVD) is applied to denoise. More robust and self-adaptive than peak-threshold denoising, AIC-SVD can effectively remove the noise components. Subsequently, the effective harmonic coefficients are extracted by a binning algorithm. The results show that the harmonic coefficients are highly identifiable in the frequency domain. Apart from the fundamental frequency caused by rotor imbalance, harmonics are also caused by the coupling of imperfections in the bearing components.
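The general shape of SVD denoising with an AIC-selected rank can be sketched as follows. This is an illustrative reconstruction, not the paper's improved criterion: it embeds the signal in a Hankel trajectory matrix, picks the retained rank with the classical Wax-Kailath AIC model-order rule, truncates the SVD, and averages anti-diagonals back into a signal.

```python
import numpy as np

def aic_rank(singular_values, n_snapshots):
    # Wax-Kailath AIC model-order selection on the eigenvalues of the
    # sample covariance (squared singular values / number of snapshots).
    lam = (singular_values ** 2) / n_snapshots
    p = len(lam)
    aic = []
    for k in range(p):
        tail = lam[k:]
        # Ratio of geometric to arithmetic mean of the "noise" eigenvalues.
        ratio = np.exp(np.mean(np.log(tail))) / np.mean(tail)
        aic.append(-2 * n_snapshots * (p - k) * np.log(ratio) + 2 * k * (2 * p - k))
    return int(np.argmin(aic))

def svd_denoise(signal, window):
    # Hankel (trajectory) matrix: each column is a lagged window of the signal.
    n = len(signal) - window + 1
    H = np.array([signal[i:i + window] for i in range(n)]).T
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    r = max(aic_rank(s, n), 1)                 # AIC-selected signal rank
    Hr = (U[:, :r] * s[:r]) @ Vt[:r]           # rank-r reconstruction
    # Anti-diagonal averaging maps the matrix back to a 1-D signal.
    out = np.zeros(len(signal))
    cnt = np.zeros(len(signal))
    for i in range(window):
        out[i:i + n] += Hr[i]
        cnt[i:i + n] += 1
    return out / cnt
```

On a noisy sinusoid this recovers a visibly cleaner waveform, since a pure tone occupies only two singular directions while the noise energy spreads across all of them.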
Funding: The work was supported in part by the National Natural Science Foundation of China (Grant Nos. 60575034 and 60873001).
Abstract: All kinds of sensory organs in humans are able to reflect only the formal factors of objects, termed formal information. It is believed, however, that not only the formal information but also the content information and value information of objects play fundamental roles in the process of information understanding and decision-making in human thinking. Therefore, the questions of where and how the content information and the value information are produced from the formal information become critical in the theory of information understanding and decision-making. A conjectural theory that may reasonably answer these questions is presented in this paper.