Funding: This work is supported by the National Natural Science Foundation of China (General Program, Grant No. 61572253, YZ, http://www.nsfc.gov.cn) and the Innovation Program for Graduate Students of Jiangsu Province, China (Grant No. KYLX16_0384, JP, http://jyt.jiangsu.gov.cn).
Abstract: Nowadays cloud architecture is widely applied on the Internet. New malware aimed at stealing private data or mining cryptocurrency threatens the security of cloud platforms. In view of the problems with existing application behavior monitoring methods, such as coarse-grained analysis, high performance overhead and lack of applicability, this paper proposes a new fine-grained binary program monitoring and analysis method based on multiple system-level components, which is used to detect possible privacy leakage by applications installed on cloud platforms. It can be used online in cloud platform environments for fine-grained automated analysis of target programs, ensuring the stability and continuity of program execution. We combine external interception with internal instrumentation and design a variety of optimization schemes to further reduce the impact of fine-grained analysis on the performance of target programs, enabling the method to be employed in real environments. The experimental results show that the proposed method is feasible and achieves acceptable analysis performance while consuming a small amount of system resources. The optimization schemes outperform traditional dynamic instrumentation methods in analytical performance and are better suited to online analysis on cloud platforms.
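The internal-instrumentation idea above can be illustrated with a minimal, hedged sketch (not the paper's implementation): Python's built-in tracing hook stands in for fine-grained instrumentation, recording every function the monitored program calls while letting it run to completion. All names here (`run_monitored`, the toy target) are illustrative.

```python
import sys

def make_tracer(log):
    """Build a trace hook that records function calls (internal instrumentation)."""
    def tracer(frame, event, arg):
        if event == "call":
            log.append(frame.f_code.co_name)
        return tracer          # keep tracing nested frames
    return tracer

def run_monitored(target, *args):
    """Run `target` under the tracer; return (result, list of called functions)."""
    log = []
    sys.settrace(make_tracer(log))
    try:
        result = target(*args)
    finally:
        sys.settrace(None)     # always detach so later code runs at full speed
    return result, log

# A toy "target program" whose behavior we want to observe.
def read_secret():
    return "secret"

def target_program():
    data = read_secret()
    return len(data)

result, calls = run_monitored(target_program)
```

The `try/finally` detach mirrors the paper's concern for stability: monitoring must not disturb the target's normal execution even if it raises.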
Funding: Supported by the National Natural Science Foundation of China (Nos. U1131121, 11303021, U1231202, 11473037 and 11373073).
Abstract: Photometric observations of AH Cnc, a W UMa-type system in the open cluster M67, were carried out using the 50BIN telescope. About 100 h of time-series B- and V-band data were taken, based on which eight new times of light minima were determined. By applying the Wilson-Devinney method, the light curves were modeled and a revised photometric solution of the binary system was derived. We confirmed that AH Cnc is a deep-contact (f = 51%), low mass-ratio (q = 0.156) system. Adopting the distance modulus derived from study of the host cluster, we have re-calculated the physical parameters of the binary system, namely the masses and radii. The masses and radii of the two components were estimated to be 1.188(±0.061) M⊙ and 1.332(±0.063) R⊙ for the primary component, and 0.185(±0.032) M⊙ and 0.592(±0.051) R⊙ for the secondary. By adding the newly derived minimum timings to all the available data, the period variations of AH Cnc were studied. This shows that the orbital period of the binary is continuously increasing at a rate of dP/dt = 4.29 × 10⁻¹⁰ d yr⁻¹. In addition to the long-term period increase, a cyclic variation with a period of 35.26 yr was determined, which could be attributed to an unresolved tertiary component of the system.
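A period increase of this kind is usually detected through the quadratic term it leaves in the O−C (observed minus calculated) diagram of minimum timings. The sketch below converts the quoted dP/dt into a per-cycle period change and the resulting O−C shift; the period P0 and the cycle count are illustrative assumptions, not values from the paper.

```python
# Assumed orbital period of a W UMa system, in days (illustrative, not from the paper).
P0 = 0.36
dP_dt = 4.29e-10       # long-term period increase in days per year (from the abstract)

# One orbital cycle lasts P0 days = P0/365.25 years, so the period change
# per cycle is dP/dE = dP/dt * P0 / 365.25 (days per cycle).
dP_dE = dP_dt * P0 / 365.25

def o_minus_c(E):
    """Quadratic part of the O-C ephemeris at cycle number E, in days."""
    return 0.5 * dP_dE * E ** 2

# Accumulated timing shift after 100,000 cycles (roughly a century):
shift = o_minus_c(1e5)
```

Even after ~10⁵ cycles the shift is only a few thousandths of a day, which is why decades of minimum timings are needed to see such trends.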
Funding: This work is supported by the National Natural Science Foundation of China (Grant Nos. 62006024, 62071057), the Fundamental Research Funds for the Central Universities (BUPT Project No. 2019XD17), and the Aeronautical Science Foundation of China (No. 2019ZG073001).
Abstract: The radial contraction-expansion motion paradigm is a novel steady-state visual evoked experimental paradigm, and the electroencephalography (EEG) evoked potential differs from that of the traditional luminance modulation paradigm. The signal energy is concentrated chiefly in the fundamental frequency, while the higher-harmonic power is lower. Therefore, conventional steady-state visual evoked potential recognition algorithms that optimize multiple harmonic response components, such as the extended canonical correlation analysis (eCCA) and task-related component analysis (TRCA) algorithms, perform poorly under the radial contraction-expansion motion paradigm. This paper proposes an extended binary subband canonical correlation analysis (eBSCCA) algorithm for this paradigm. Binary subband filtering was used to optimize the weighting coefficients of different frequency response signals, thereby improving the recognition performance of EEG signals. The results of offline experiments involving 13 subjects showed that the eBSCCA algorithm exhibits better performance than the eCCA and TRCA algorithms under the stimulation of the radial contraction-expansion motion paradigm. In the online experiment, the average recognition accuracy of the 13 subjects was 88.68% ± 6.33%, and the average information transmission rate (ITR) was 158.77 ± 43.67 bits/min, demonstrating that the algorithm recognizes signals evoked by the radial contraction-expansion motion paradigm well.
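The ITR figure quoted above is conventionally computed with the Wolpaw formula, which combines the number of stimulus targets, the recognition accuracy, and the time per selection. A sketch, where the target count and selection time are assumptions for illustration (the abstract does not state them):

```python
import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate in bits/min.
    accuracy must exceed chance (1/n_targets) for a meaningful value."""
    p, n = accuracy, n_targets
    if p >= 1.0:
        bits = math.log2(n)
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / seconds_per_selection

# Illustrative numbers: a 40-target speller at the paper's mean accuracy,
# assuming a 1 s selection window (both assumptions, not from the paper).
rate = itr_bits_per_min(40, 0.8868, 1.0)
```

Note how sensitive ITR is to the selection window: halving the window doubles the rate at the same accuracy, which is why papers report both accuracy and ITR.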
Funding: Supported by the Ministerio de Economía y Competitividad under project FIS2012-31079.
Abstract: Several pupil filtering techniques have been developed in the last few years to obtain transverse superresolution (a narrower point spread function core). Such a core decrease entails two relevant limitations: a decrease of the peak intensity and an increase of the sidelobe intensity. Here, we calculate the Strehl ratio as a function of the core size for the most commonly used binary phase filters. Furthermore, we show that this relation approaches the fundamental limit of the attainable Strehl ratio at the focal plane for any filter. Finally, we show the calculation of the peak-to-sidelobe ratio in order to check the system's viability in every application.
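For the simplest case, a two-zone binary π-phase filter, the on-axis Strehl ratio has a closed form that a direct numerical integration can check. For a circularly symmetric pupil with transmission t(ρ) on the unit disk, the normalized on-axis field is U = 2∫₀¹ t(ρ) ρ dρ and the Strehl ratio is |U|²; with t = −1 inside radius a and +1 outside, this gives S = (1 − 2a²)². This is a hedged sketch of that textbook relation, not the paper's calculation:

```python
def strehl_binary_phase(a, n=100000):
    """Numerical on-axis Strehl ratio for a pupil whose inner zone rho < a
    carries a pi phase (transmission -1); midpoint-rule integration."""
    h = 1.0 / n
    u = 0.0
    for i in range(n):
        rho = (i + 0.5) * h
        t = -1.0 if rho < a else 1.0
        u += t * rho * h
    return (2.0 * u) ** 2      # Strehl ratio = |2 * integral t(rho) rho d rho|^2

def strehl_closed_form(a):
    """S = (1 - 2 a^2)^2 for the two-zone binary pi-phase filter."""
    return (1.0 - 2.0 * a ** 2) ** 2
```

The closed form makes the peak-intensity penalty explicit: already at a ≈ 0.3 the Strehl ratio falls to about 0.67, the trade-off the abstract refers to.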
Abstract: Inward foreign direct investment (FDI) is expected to grow further by virtue of economic globalization. A thorough understanding of the locational determinants of inward FDI will be conducive to enhanced efficiency in attracting direct and SOC-related investments from foreign entities. This study analyzes 51 cases of inward foreign direct investment made in the Incheon free economic zone (IFEZ) from 2002 to 2009 to determine the factors influencing FDI volume, the relevance of locations and the correlation between investment size and location. First, the relationship between the locational determinants of FDI and the total investment size (total expected project cost) is analyzed. Second, the relationship between the locational determinants of FDI and the FDI volume is analyzed. Third, the relationship between the locational determinants of FDI and the location choice is analyzed. The results indicate the determinants that influence the locations and investment size of FDI entities; whether these factors exercise influence in the zone; and the factors that have relatively significant effects. Ultimately, based on the analytical findings, a few implications for policy and practice are derived.
Funding: Partially supported by the Taiwan Ministry of Science and Technology grant NSC 102-2112-M-008-020-MY3.
Abstract: A low-mass X-ray binary (LMXB) contains either a neutron star or a black hole accreting material from its low-mass companion star. It is one of the primary astrophysical sources for studying stellar-mass compact objects and accretion phenomena. As with other binary systems, the most important parameter of an LMXB is the orbital period, which allows us to learn about the nature of the binary system and constrain the properties of the system's components, including the compact object. As a result, measuring the orbital periods of LMXBs is essential for investigating these systems, even though fewer than half of them have known orbital periods. This article introduces the different methods for measuring orbital periods in the X-ray band and reviews their application to various types of LMXBs, such as eclipsing and dipping sources, as well as pulsar LMXBs.
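One standard period-search technique for periodic features such as eclipses or dips is epoch folding: fold the event times on a grid of trial periods and pick the period whose folded profile deviates most from flat (largest chi-square). A minimal sketch on simulated data (all numbers illustrative):

```python
import random

def fold_chi2(times, period, nbins=10):
    """Chi-square of the folded event histogram against a uniform profile."""
    counts = [0] * nbins
    for t in times:
        phase = (t / period) % 1.0
        counts[int(phase * nbins) % nbins] += 1
    mean = len(times) / nbins
    return sum((c - mean) ** 2 / mean for c in counts)

def best_period(times, trial_periods, nbins=10):
    """Trial period with the strongest (least flat) folded profile."""
    return max(trial_periods, key=lambda p: fold_chi2(times, p, nbins))

# Simulated events recurring once per 5.0-unit cycle with some jitter.
random.seed(0)
true_period = 5.0
times = [k * true_period + random.gauss(0.0, 0.2) for k in range(200)]
trials = [4.0 + 0.05 * i for i in range(41)]   # 4.00 .. 6.00
found = best_period(times, trials)
```

Real analyses refine this with finer period grids, exposure corrections, and significance estimates for the chi-square peak, but the folding step is the same.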
Abstract: As one of the most notorious classes of programming errors, memory access errors still hurt modern software security. In particular, they hide deep in important software systems written in memory-unsafe languages like C/C++. Many approaches have been proposed to detect bugs leading to memory access errors. However, existing works fail to handle two challenges. First, they are not able to tackle fine-grained memory access errors, e.g., data overflow inside one data structure. These errors are usually overlooked for a long time, since they happen inside one memory block and do not lead to a program crash. Second, most existing works rely on source code or debugging information to recover memory boundary information, so they cannot be directly applied to the detection of memory access errors in binary code. However, searching for memory access errors in binary code is a very common scenario in software vulnerability detection and exploitation. To overcome these challenges, we propose Memory Access Integrity (MAI), a dynamic method to detect fine-grained memory access errors in off-the-shelf binary executables. The core idea is to recover a fine-grained accessing policy between memory access behaviors and memory ranges, and then detect memory access errors based on the policy. The key insight in our work is that memory accessing patterns reveal information for recovering the boundaries of memory objects and the accessing policy. Based on this recovered information, our method maintains a new memory model to simulate the life cycle of memory objects and reports errors when any accessing policy is violated. We evaluate our tool on popular CTF datasets and real-world software. Compared with the state-of-the-art detection tool, the evaluation results demonstrate that our tool can detect fine-grained memory access errors effectively and efficiently. As a practical impact, our tool has detected three 0-day memory access errors in an audio decoder.
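The accessing-policy idea can be caricatured in a few lines: associate each memory range with the access sites observed to use it, learn a site's binding on first contact, and flag any later access where a site bound to one object reaches into another (e.g., an intra-block overflow between adjacent fields). This toy model is purely illustrative; the names and the learning rule are assumptions, not MAI's actual design.

```python
class MemoryModel:
    """Toy life-cycle model: ranges plus the access sites allowed to touch them."""

    def __init__(self):
        self.objects = []                  # list of (start, end, allowed_sites)

    def allocate(self, start, size, site):
        self.objects.append((start, start + size, {site}))

    def access(self, addr, site):
        """True if the access obeys the learned policy, False on a violation."""
        for start, end, sites in self.objects:
            if start <= addr < end:
                if site in sites:
                    return True
                # Site never seen anywhere yet: learn the binding (first use).
                if all(site not in s for _, _, s in self.objects):
                    sites.add(site)
                    return True
                return False               # site bound to another object: overflow
        return False                       # access outside every known object

model = MemoryModel()
model.allocate(0x1000, 16, site="A")       # e.g. one 16-byte field
model.allocate(0x1010, 16, site="B")       # the adjacent field in the same block
ok = model.access(0x1008, site="A")            # in-bounds: allowed
violation_allowed = model.access(0x1012, site="A")  # "A" spills into B's range
```

The point of the sketch is that the violation at `0x1012` never crosses an allocation boundary, so coarse-grained (malloc-level) checkers would miss it.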
Abstract: Tackling binary program analysis problems has traditionally implied manually defining rules and heuristics, a tedious and time-consuming task for human analysts. In order to improve automation and scalability, we propose an alternative direction based on distributed representations of binary programs with applicability to a number of downstream tasks. We introduce Bin2vec, a new approach leveraging Graph Convolutional Networks (GCN) along with computational program graphs in order to learn a high-dimensional representation of binary executable programs. We demonstrate the versatility of this approach by using our representations to solve two semantically different binary analysis tasks: functional algorithm classification and vulnerability discovery. We compare the proposed approach to our own strong baseline as well as published results, and demonstrate improvement over state-of-the-art methods for both tasks. We evaluated Bin2vec on 49,191 binaries for the functional algorithm classification task, and on 30 different CWE-IDs, each including at least 100 CVE entries, for the vulnerability discovery task. We set a new state-of-the-art result by reducing the classification error by 40% compared to the source-code-based inst2vec approach, while working on binary code. For almost every vulnerability class in our dataset, our prediction accuracy is over 80% (and over 90% in multiple classes).
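The GCN building block behind such representations is a single propagation layer of the standard Kipf-Welling form, H' = ReLU(Â H W), where Â is the adjacency matrix with self-loops, symmetrically normalized. A dependency-free sketch on a tiny illustrative graph (the graph, features, and weights are made up; Bin2vec applies such layers to computational program graphs):

```python
import math

def matmul(a, b):
    """Plain list-of-lists matrix product."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def gcn_layer(adj, feats, weights):
    """One GCN layer: ReLU(D^-1/2 (A + I) D^-1/2 @ feats @ weights)."""
    n = len(adj)
    a = [[adj[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in a]
    a_hat = [[a[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
             for i in range(n)]
    h = matmul(matmul(a_hat, feats), weights)
    return [[max(0.0, v) for v in row] for row in h]   # ReLU

# 3-node path graph, 2-d node features, identity weights (all illustrative).
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weights = [[1.0, 0.0], [0.0, 1.0]]
out = gcn_layer(adj, feats, weights)
```

Stacking a few such layers lets each node's representation absorb information from its k-hop neighborhood, which is what makes graph-level embeddings of program structure possible.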
Abstract: Background: Radiological imaging plays a pivotal role in forensic anthropology. As imaging techniques have advanced, digital skeletal measurements have inched towards precision. Secular trends of the population keep changing in modern times. Hence, a precise technique of bone measurement, with greater reproducibility, in a modern population is always needed for building a population-specific biological profile. Aim and Objective: The aim of this study was to estimate the accuracy of foramen magnum measurements, obtained by three-dimensional multi-detector computed tomography using the volume rendering technique, with a cutoff value for each variable, in determining the sex of an individual. Materials and Methods: Two metric traits, the antero-posterior diameter (APD) and transverse diameter (TD), were measured digitally in an analysis of 130 radiological images with equal proportions of male and female samples. The foramen magnum index and the area of the foramen magnum (area by Radinsky's formula [AR], area by Teixeira's formula [AT]) were derived from APD and TD. Results: Descriptive statistical analysis, using the unpaired t-test, showed significantly higher values in males for all variables. Using Pearson correlation analysis, the maximum correlation was observed between the areas (AT and AR, r = 0.999) and between area and TD (AR r = 0.955 and AT r = 0.945, respectively). When used individually, TD had the highest predictive value (67.7%) for sex determination among all the parameters, followed by AT (65.4%) and AR (64.6%). The cutoff values of TD, AR and AT were 29.9 mm, 841.80 mm² and 849.70 mm², respectively. The receiver operating characteristic curve predicted male and female sex with 96.2% and 89.2% accuracy, respectively. The overall accuracy of the model was 92.7%. Conclusion: Measurements from 3D CT using the volume rendering technique were precise, and the application of logistic regression analysis predicted the sex with greater accuracy.
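The two area estimates and the cutoff rule can be sketched in a few lines. The formulas below are the commonly cited forms (Radinsky: the area of an ellipse with axes APD and TD; Teixeira: a circle on the averaged diameter) and should be checked against the paper before reuse; the sample measurements are illustrative, not from the study.

```python
import math

def area_radinsky(apd, td):
    """AR = pi/4 * APD * TD (ellipse with axes APD and TD), in mm^2."""
    return math.pi / 4.0 * apd * td

def area_teixeira(apd, td):
    """AT = pi * ((APD + TD)/4)^2 (circle on the mean radius), in mm^2."""
    return math.pi * ((apd + td) / 4.0) ** 2

def classify_sex_by_td(td, cutoff=29.9):
    """The abstract's TD cutoff: values at or above 29.9 mm classified male."""
    return "male" if td >= cutoff else "female"

# Illustrative measurements in mm (not taken from the study's data):
ar = area_radinsky(36.0, 30.5)
at = area_teixeira(36.0, 30.5)
```

Note that AT ≥ AR always holds (arithmetic mean versus geometric mean of the two diameters), consistent with the slightly larger AT cutoff reported above.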
Funding: Supported by the National Natural Science Foundation of China Key Project under Grant No. 70933003, and the National Natural Science Foundation of China under Grant Nos. 70871109 and 71203247.
Abstract: The probability of default (PD) is the key element in the New Basel Capital Accord and the most essential factor in financial institutions' risk management. To obtain good PD estimates, practitioners and academics have put forward numerous default prediction models. However, how to use multiple models to enhance overall performance on default prediction remains largely unexplored. In this paper, a parametric and non-parametric combination model is proposed. First, a binary logistic regression model (BLRM), a support vector machine (SVM), and a decision tree (DT) are used respectively to establish models with relatively stable and high performance. Second, in order to further improve the overall performance, a combination model using the method of multiple discriminant analysis (MDA) is constructed. In this way, the coverage rate of the combination model is greatly improved, and the risk of misclassification is effectively reduced. Lastly, the results of the combination model are analyzed using K-means clustering, and the clustering distribution is consistent with a normal distribution. The results show that the combination model based on parametric and non-parametric methods can effectively enhance the overall performance of default prediction.
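The combination step can be illustrated with a deliberately simplified stand-in: take the default probabilities produced by the three base models and merge them with a fixed linear score, thresholded into a default/non-default decision. This is only a sketch of the idea of combining parametric and non-parametric outputs; the weights are invented, and the paper's actual combiner is multiple discriminant analysis, not a fixed weighted sum.

```python
def combine_scores(p_blrm, p_svm, p_dt, weights=(0.4, 0.35, 0.25)):
    """Linear combination of three default probabilities; result stays in [0, 1]
    because the weights are non-negative and sum to 1."""
    w1, w2, w3 = weights
    return w1 * p_blrm + w2 * p_svm + w3 * p_dt

def predict_default(p_blrm, p_svm, p_dt, threshold=0.5):
    """Default decision from the combined score (illustrative 0.5 threshold)."""
    return combine_scores(p_blrm, p_svm, p_dt) >= threshold

# A borrower the base models disagree on: the combiner arbitrates.
score = combine_scores(0.7, 0.4, 0.6)
```

In the paper's setting, MDA would learn the combining coefficients from labeled defaults rather than fixing them a priori, which is what improves coverage over any single base model.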