Funding: This work was partially supported by the National Natural Science Foundation of China (No. 11571234).
Abstract: In this paper, we investigate the global complexity bound of the inexact Levenberg–Marquardt method, in which the Jacobian may be perturbed and the subproblems may be solved only approximately. Under reasonable assumptions, we show that the global complexity bound is O(ε^(−2)), the same as in the exact case. We also show that it can be reduced to O(log ε^(−1)) under a regularity assumption.
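To make the setting concrete, here is a minimal sketch of an inexact Levenberg–Marquardt iteration for min ‖r(x)‖², in which the Jacobian is perturbed by noise of scale `jac_noise`. The function names `r`, `J`, and the damping rule λ = μ‖r‖² are illustrative assumptions, not the paper's exact scheme (the paper also allows the linear subproblem itself to be solved inexactly).

```python
import numpy as np

def inexact_lm(r, J, x0, mu=1.0, jac_noise=0.0, tol=1e-10, max_iter=100, rng=None):
    """Inexact Levenberg-Marquardt sketch for min ||r(x)||^2.

    The Jacobian is perturbed by Gaussian noise of scale `jac_noise`;
    the damped normal equations are solved exactly here for simplicity.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx, Jx = r(x), J(x)
        Jx = Jx + jac_noise * rng.standard_normal(Jx.shape)  # perturbed Jacobian
        g = Jx.T @ rx                      # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < tol:
            break
        lam = mu * np.dot(rx, rx)          # damping scaled by residual norm
        # LM step: (J^T J + lam I) d = -J^T r
        d = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -g)
        x = x + d
    return x
```

With `jac_noise=0` this reduces to a standard LM iteration; for a zero-residual problem the damping vanishes near the solution and the method behaves like Gauss–Newton.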
Funding: Supported by the National Natural Science Foundation of China (No. 50808112) and the 2012 Elementary Science Research Fund for Central Universities, Dalian University of Technology (No. DUT12RW430).
Abstract: China's rapid urbanization over the past 30 years can be understood as the spatio-temporal product of a socio-economic mechanism built on productive factors, driven successively by the labor demographic bonus and the land capital discount. In the context of increasing global complexity, this paper argues that this factor-based spatial logic not only has structural deficiencies but also carries environmental, economic, social, and political risks along multiple dimensions: first, regional resource and environmental crises caused by the market distortion of productive factors; second, the loss of industrial growth momentum caused by the investment preference for land urbanization; third, social polarization caused by the incomplete urbanization of migrants; and fourth, the overdraft of government credit resulting from the unbalanced urbanization of population, industry, and land. Accordingly, the paper proposes a transition in China's existing urbanization policies, so as to achieve sustainable urbanization in the future.
Abstract: Starting from the assumption of space-for-time substitution, the Langbein–Schumm law was applied to assess the response of the fluvial erosion system to changes in mean annual precipitation induced by global greenhouse warming. A simple method was thereby put forward to predict changes in sediment yield, taking Ningxia Hui Autonomous Region on the northern fringe of the Loess Plateau of China as an example. The results show that, even for the same change in mean annual precipitation, the direction and magnitude of the resulting change in sediment yield can differ greatly among physico-geographical zones. When mean annual precipitation increases, sediment yield in arid or semi-arid areas with a mean annual precipitation of less than 400 mm will increase, while sediment yield in sub-humid or humid areas with a mean annual precipitation of more than 400 mm will decrease. In addition, the complex response of the fluvial erosion system over time, due to the lag of vegetation change behind precipitation change, is discussed qualitatively.
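The sign rule stated in the abstract can be sketched as a small function. The 400 mm threshold and the sign logic are taken directly from the abstract; the function name and discrete ±1 output are illustrative simplifications (the actual Langbein–Schumm relation is a continuous curve of sediment yield against effective precipitation).

```python
def sediment_yield_response(p_mm, dp_mm, threshold_mm=400.0):
    """Direction of sediment-yield change for a precipitation change dp_mm.

    Below ~400 mm mean annual precipitation (arid/semi-arid), erosion
    rises with rainfall because vegetation is sparse; above it
    (sub-humid/humid), denser vegetation cover dominates and sediment
    yield falls.  Returns +1 (increase), -1 (decrease), or 0 (no change).
    """
    if dp_mm == 0 or p_mm == threshold_mm:
        return 0
    direction = 1 if p_mm < threshold_mm else -1
    return direction if dp_mm > 0 else -direction
```

For example, a 50 mm increase at a 250 mm station implies increasing sediment yield, while the same increase at a 600 mm station implies a decrease.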
Abstract: Over the past decade, the use of open-source software has grown. Today, many companies, including Google, Microsoft, Meta, RedHat, MongoDB, and Apache, are major open-source contributors. With the increased use of open-source software, and its integration into custom-developed software, the quality of these software components becomes increasingly important. This study examined a sample of open-source applications from GitHub. Static software analytics were conducted, and each application was classified by risk level. Of the analyzed applications, 90% were classified as low risk or moderate-low risk, indicating a high level of quality for open-source applications.
Abstract: This paper applies software analytics to open-source code. Open-source software gives both individuals and businesses the flexibility to work with available code, modifying it or incorporating it into their own projects. The open-source software market is growing: major companies such as AWS, Facebook, Google, IBM, Microsoft, Netflix, SAP, Cisco, Intel, and Tesla have joined the open-source community. In this study, a sample of 40 open-source applications was selected, and traditional McCabe software metrics, including cyclomatic and essential complexity, were examined. An analytical comparison of these metrics, and of metrics derived for high-risk software, served as the basis for addressing risk management in decisions about adopting and integrating open-source software. From this comparison, refinements were added, and contemporary design and data metrics derived from cyclomatic complexity were integrated into a classification scheme for software quality. It was found that 84% of the sampled applications were classified as moderate-low risk or low risk, indicating that open-source software exhibits low-risk characteristics. The 40 applications formed the base data for the model, yielding a technique applicable to any open-source code regardless of functionality, language, or size.
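A classification scheme of the kind described can be sketched by bucketing per-module cyclomatic complexities into risk levels. The thresholds below are the commonly cited McCabe-style cutoffs (≤10 low, ≤20 moderate, ≤50 high), used here purely for illustration; the paper's own scheme also folds in essential-complexity and design/data metrics.

```python
def classify_risk(complexities):
    """Bucket per-module cyclomatic complexities into risk levels
    using commonly cited McCabe-style thresholds (illustrative)."""
    def level(v):
        if v <= 10:
            return "low"
        if v <= 20:
            return "moderate"
        if v <= 50:
            return "high"
        return "very high"
    counts = {"low": 0, "moderate": 0, "high": 0, "very high": 0}
    for v in complexities:
        counts[level(v)] += 1
    return counts
```

An application's overall risk label could then be derived from the share of its modules falling into each bucket.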
基金supported by the Research Fund from the National Natural Science Foundation of China(Nos.61521091,61650110516,and 61601013)
文摘Robustness of complex networks has been studied for decades,with a particular focus on network attack.Research on network repair,on the other hand,has been conducted only very lately,given the even higher complexity and absence of an effective evaluation metric.A recently proposed network repair strategy is self-healing,which aims to repair networks for larger components at a low cost only with local information.In this paper,we discuss the effectiveness and efficiency of self-healing,which limits network repair to be a multi-objective optimization problem and makes it difficult to measure its optimality.This leads us to a new network repair evaluation metric.Since the time complexity of the computation is very high,we devise a greedy ranking strategy.Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy.Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.
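The flavor of a greedy repair ranking can be sketched as follows: given the component sizes of a damaged network, each repair step adds one link joining the two largest surviving components, the merge that grows the giant component fastest per unit cost. This is an illustrative stand-in under simple assumptions, not the paper's actual algorithm or metric.

```python
def greedy_repair(components, budget):
    """Greedy repair sketch: repeatedly merge the two largest components
    (one repaired link per merge), up to `budget` repairs.

    `components` is a list of component sizes; returns the sizes after
    the repairs, largest first.
    """
    sizes = sorted(components, reverse=True)
    for _ in range(min(budget, len(sizes) - 1)):
        a = sizes.pop(0)      # largest component
        b = sizes.pop(0)      # second largest
        sizes.insert(0, a + b)  # merged component is again the largest
    return sizes
```

With component sizes [5, 3, 2, 2] and a budget of two repairs, the giant component grows to 10 while the remaining fragments stay isolated.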