Abstract: This work aims to analyse the actions that companies operating in large-scale distribution carry out along their value chains to minimise their impact on climate change. Companies in this field are aware that acting only on their core processes is of limited effectiveness and that the upstream value chain must be involved in their carbon reduction strategy. These businesses, in fact, need to focus on indirect GHG (greenhouse gas) emissions and depend on how their suppliers manage their own impacts. In this sector, virtuous companies collaborate with their suppliers, involving them in a common path of quantifying and cutting these impacts together. This aspect is particularly relevant for large-scale retailers. However, the process is not immediate, since the supply chain is usually very dense and diverse, with suppliers adopting various approaches that do not always coincide. In any case, the key step is mapping these suppliers: one of the tools most commonly used for this purpose is the survey, a quick instrument able to reach hundreds of suppliers at the same time and to receive fast, standardized responses that can easily be processed into a comprehensive and harmonized mapping of the results, as a first step towards the subsequent implementation of mitigation strategies.
Funding: This work was supported in part by the National Natural Science Foundation of China (61772493), the CAAI-Huawei MindSpore Open Fund (CAAIXSJLJJ-2020-004B), the Natural Science Foundation of Chongqing, China (cstc2019jcyjjqX0013), the Chongqing Research Program of Technology Innovation and Application (cstc2019jscx-fxydX0024, cstc2019jscx-fxydX0027, cstc2018jszx-cyzdX0041), the Guangdong Province Universities and Colleges Pearl River Scholar Funded Scheme (2019), the Pioneer Hundred Talents Program of the Chinese Academy of Sciences, and the Deanship of Scientific Research (DSR) at King Abdulaziz University (G-21-135-38).
Abstract: Protein-protein interactions (PPIs) are of great significance for understanding the functional mechanisms of proteins. With the rapid development of high-throughput genomic technologies, massive PPI data have been generated, making it very difficult to analyze them efficiently. To address this problem, this paper presents a distributed framework that reimplements one of the state-of-the-art algorithms, CoFex, using MapReduce. To do so, an in-depth analysis of its limitations is conducted from the perspectives of efficiency and memory consumption when it is applied to large-scale PPI data analysis and prediction. Solutions are then devised to overcome these limitations. In particular, we adopt a novel tree-based data structure to reduce the heavy memory consumption caused by the huge amount of protein sequence information. After that, the procedure is modified to follow the MapReduce framework so that the prediction task is carried out in a distributed manner. A series of extensive experiments has been conducted to evaluate the performance of our framework in terms of both efficiency and accuracy. Experimental results demonstrate that the proposed framework improves computational efficiency by more than two orders of magnitude while retaining the same high accuracy.
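As a rough sketch of the two ideas this abstract mentions, the snippet below shows a k-mer prefix tree that stores shared sequence substrings only once, plus a map/reduce-style split of pairwise prediction work. It is not the CoFex reimplementation itself; the feature coding and the cluster job configuration are omitted, and all names (`KmerTrie`, `map_phase`, the toy sequences) are illustrative assumptions.

```python
# Hypothetical sketch, not the paper's code: a k-mer prefix tree to deduplicate
# sequence substrings, and a map/reduce-style split of pairwise PPI scoring.
from functools import reduce

class KmerTrie:
    """Prefix tree over k-mers; proteins sharing a k-mer share its path."""
    def __init__(self, k=3):
        self.k, self.root = k, {}

    def add_sequence(self, protein_id, seq):
        for i in range(len(seq) - self.k + 1):
            node = self.root
            for aa in seq[i:i + self.k]:
                node = node.setdefault(aa, {})
            node.setdefault("_ids", set()).add(protein_id)

    def shared_kmers(self, a, b, node=None):
        node = self.root if node is None else node
        ids = node.get("_ids", set())
        hit = 1 if a in ids and b in ids else 0
        return hit + sum(self.shared_kmers(a, b, child)
                         for key, child in node.items() if key != "_ids")

def map_phase(pair_chunk, trie):
    """'Map' step: each worker scores one chunk of candidate protein pairs."""
    return [(a, b, trie.shared_kmers(a, b)) for a, b in pair_chunk]

def reduce_phase(partial_results):
    """'Reduce' step: merge the partial score lists produced by the workers."""
    return reduce(lambda acc, part: acc + part, partial_results, [])

# Tiny usage example with made-up sequences.
trie = KmerTrie(k=3)
for pid, seq in [("P1", "MKVLAT"), ("P2", "KVLATG"), ("P3", "GATTAC")]:
    trie.add_sequence(pid, seq)
chunks = [[("P1", "P2")], [("P1", "P3"), ("P2", "P3")]]
print(reduce_phase([map_phase(c, trie) for c in chunks]))
```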
Funding: Supported by the National Natural Science Foundation of China (No. 61201086, 61272495), the China Scholarship Council (No. 201506375060), the Planned Science and Technology Project of Guangdong Province (No. 2013B090500007), and the Dongguan Project on the Integration of Industry, Education and Research (No. 2014509102205).
Abstract: An antenna selection algorithm based on the large-scale fading between the transmitter and receiver is proposed for uplink receive antenna selection in distributed multiple-input multiple-output (D-MIMO) systems. By selecting radio access units (RAUs) on the basis of large-scale fading, the proposed algorithm greatly reduces computational complexity. Based on the characteristics of distributed systems, an improved particle swarm optimization (PSO) is then proposed for antenna selection after the RAU selection. To apply the improved PSO algorithm to antenna selection, the general form of the channel capacity is transformed into a binary expression by analyzing the capacity formula. The proposed algorithm makes full use of the advantages of D-MIMO systems and achieves near-optimal channel capacity with low computational complexity.
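For orientation, a minimal sketch of generic binary-PSO antenna selection follows; it is not the paper's improved PSO or its RAU pre-selection step. It assumes a synthetic Rayleigh channel, a fixed SNR, and the standard MIMO capacity of the selected receive-antenna subset as the fitness; all parameter values are illustrative.

```python
# Minimal sketch of binary-PSO antenna selection (generic, not the paper's
# improved variant). Fitness = capacity of the selected receive-antenna subset.
import numpy as np

rng = np.random.default_rng(0)
N_RX, N_TX, N_SEL, SNR = 8, 2, 4, 10.0     # receive/transmit antennas, subset size, linear SNR
H = (rng.normal(size=(N_RX, N_TX)) + 1j * rng.normal(size=(N_RX, N_TX))) / np.sqrt(2)

def capacity(mask):
    Hs = H[mask.astype(bool), :]            # rows = selected receive antennas
    G = Hs.conj().T @ Hs
    return np.log2(np.linalg.det(np.eye(N_TX) + (SNR / N_TX) * G).real)

def repair(mask):
    """Force exactly N_SEL selected antennas (favour currently-selected, strong rows)."""
    order = np.argsort(-np.linalg.norm(H, axis=1) * (mask + 0.5))
    fixed = np.zeros(N_RX)
    fixed[order[:N_SEL]] = 1
    return fixed

# Standard binary PSO with sigmoid velocity mapping.
P, ITERS, W, C1, C2 = 20, 50, 0.7, 1.5, 1.5
pos = np.array([repair(rng.integers(0, 2, N_RX)) for _ in range(P)])
vel = rng.normal(scale=0.1, size=(P, N_RX))
pbest, pbest_val = pos.copy(), np.array([capacity(p) for p in pos])
gbest = pbest[np.argmax(pbest_val)].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((P, N_RX)), rng.random((P, N_RX))
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))       # sigmoid -> selection probability
    pos = np.array([repair((rng.random(N_RX) < p).astype(float)) for p in prob])
    vals = np.array([capacity(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmax(pbest_val)].copy()

print("selected antennas:", np.flatnonzero(gbest), "capacity:", capacity(gbest))
```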
Abstract: This paper investigates large-scale distributed system design. It looks at key features and main design considerations, and uses the Netflix API, Cassandra and Oracle as examples of such systems. Moreover, the paper investigates the challenges of designing, developing, deploying and maintaining such systems with regard to the features presented. Finally, the paper discusses available solutions and current practices for the challenges that large-scale distributed systems face.
Funding: This project was funded by the National Natural Science Foundation of China (41871320, 61872139), the Provincial and Municipal Joint Fund of the Hunan Provincial Natural Science Foundation of China (2018JJ4052), the Hunan Provincial Natural Science Foundation of China (2017JJ2081), the Key Project of the Hunan Provincial Education Department (19A172), and the Scientific Research Fund of the Hunan Provincial Education Department (18K060).
Abstract: In the large-scale logistics distribution of a single logistics center, methods based on the traditional genetic algorithm evolve slowly and easily fall into local optima. To address this issue, we propose a hybrid genetic algorithm for large-scale logistics distribution for BBG Supermarket. We integrate a greedy algorithm and a hill-climbing algorithm into the genetic algorithm: the greedy algorithm is applied to initialize the population, and the hill-climbing algorithm is then used to optimize the individuals of each generation after selection, crossover and mutation. Our approach is evaluated on a dataset from BBG Supermarket, one of the top 10 supermarkets in China. Experimental results show that our method outperforms several other methods in the field.
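To make the greedy-init plus hill-climbing hybrid concrete, here is a toy sketch on a single-depot routing tour. The operators (nearest-neighbour seeding, order crossover, swap mutation, 2-opt local search) and all parameters are illustrative choices under that assumption, not those of the paper or the BBG dataset.

```python
# Toy sketch of a greedy-init + hill-climbing hybrid GA for a single-depot tour.
import random
import math

random.seed(1)
DEPOT = (0.0, 0.0)
STOPS = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(30)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    pts = [DEPOT] + [STOPS[i] for i in order] + [DEPOT]
    return sum(dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

def greedy_individual():
    """Greedy (nearest-neighbour) construction used to seed the population."""
    remaining, tour, cur = set(range(len(STOPS))), [], DEPOT
    while remaining:
        nxt = min(remaining, key=lambda i: dist(cur, STOPS[i]))
        tour.append(nxt)
        remaining.remove(nxt)
        cur = STOPS[nxt]
    return tour

def order_crossover(p1, p2):
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = p1[a:b]
    return child + [g for g in p2 if g not in child]

def mutate(order, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def hill_climb(order, tries=30):
    """2-opt style local search applied to each offspring (the hill-climbing step)."""
    best, best_len = order, tour_length(order)
    for _ in range(tries):
        i, j = sorted(random.sample(range(len(order)), 2))
        cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
        if tour_length(cand) < best_len:
            best, best_len = cand, tour_length(cand)
    return best

pop = [mutate(greedy_individual(), rate=1.0) for _ in range(40)]   # greedy seeds, lightly perturbed
for gen in range(100):
    pop.sort(key=tour_length)
    parents = pop[:20]                                             # truncation selection
    children = [hill_climb(mutate(order_crossover(random.choice(parents),
                                                  random.choice(parents))))
                for _ in range(20)]
    pop = parents + children
print("best tour length:", round(tour_length(min(pop, key=tour_length)), 2))
```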
Funding: Jointly sponsored by the National Basic Research Program of China ("973" Program, Grant No. 2012CB956203), the Knowledge Innovation Project (Grant No. KZCX2-EW-202), and the National Natural Science Foundation of China (Grant Nos. 91325108 and 51339004).
Abstract: Quantile regression (QR) is proposed to examine the relationships between large-scale atmospheric variables and all parts of the distribution of daily precipitation amount at Beijing Station from 1960 to 2008. QR is also applied to evaluate the relationship between large-scale predictors and extreme precipitation (90th percentile) at 238 stations in northern China. Finally, QR is used to fit observed daily precipitation amounts on wet days at four sample stations. Results show that meridional wind and specific humidity at both 850 hPa and 500 hPa (V850, SH850, V500 and SH500) strongly affect all parts of the Beijing precipitation distribution during the wet season (April–September). Meridional wind, zonal wind and specific humidity at 850 hPa only (V850, U850, SH850) are significantly related to the precipitation distribution in the dry season (October–March). The impact of these large-scale predictors on daily precipitation amounts becomes stronger at higher quantiles, whereas their impact on light precipitation is negligible. In addition, SH850 has a strong relationship with wet-season extreme precipitation across the entire region, whereas the impacts of V850, V500 and SH500 are confined mainly to semi-arid and semi-humid areas. For the dry season, both SH850 and V850 are the major predictors of extreme precipitation in the entire region. Moreover, QR can satisfactorily simulate the daily precipitation amount at each station and in each season if an optimum distribution family is selected. Therefore, QR is valuable for detecting the relationship between large-scale predictors and daily precipitation amounts.
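For readers unfamiliar with quantile regression, the following minimal sketch fits the tau-th conditional quantile by minimizing the pinball (check) loss. The data here are synthetic stand-ins for the precipitation amounts and large-scale predictors (e.g., V850, SH850) used in the study; the optimizer choice is an assumption made purely for illustration.

```python
# Minimal quantile-regression sketch: fit the tau-th conditional quantile by
# minimizing the pinball (check) loss over synthetic data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])   # intercept + two "predictors"
y = X @ np.array([1.0, 0.5, 2.0]) + rng.gamma(shape=2.0, scale=1.0, size=n)  # skewed noise

def pinball_loss(beta, X, y, tau):
    u = y - X @ beta
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

def fit_quantile(X, y, tau):
    res = minimize(pinball_loss, x0=np.zeros(X.shape[1]), args=(X, y, tau),
                   method="Nelder-Mead",
                   options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 20000})
    return res.x

for tau in (0.5, 0.9):   # median vs. the "extreme precipitation" quantile
    print(f"tau={tau}: coefficients {np.round(fit_quantile(X, y, tau), 2)}")
```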
Funding: This project was supported by NSFC Projects (60474047, 60334010), the Guangdong Province Natural Science Foundation of China (31406), and the China Postdoctoral Science Foundation (20060390725).
Abstract: The decentralized robust stabilization problem for discrete-time fuzzy large-scale systems with parametric uncertainties is considered. The uncertain fuzzy large-scale system consists of N interconnected T-S fuzzy subsystems, and the parametric uncertainties are unknown but norm-bounded. Based on Lyapunov stability theory and the decentralized control theory of large-scale systems, a design scheme for decentralized parallel distributed compensation (DPDC) fuzzy controllers that ensures the asymptotic stability of the whole fuzzy large-scale system is proposed. The existence conditions for these controllers take the form of linear matrix inequalities (LMIs). Finally, a numerical simulation example is given to show the utility of the proposed method.
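To give a flavour of what "conditions in the form of LMIs" means in practice, the sketch below checks the basic discrete-time Lyapunov LMI for a single test matrix. It does not reproduce the paper's DPDC controller-existence conditions; it assumes cvxpy with an SDP-capable solver (e.g., SCS) is installed, and the matrix A is an arbitrary example.

```python
# Minimal LMI feasibility check of the kind the abstract refers to: find P > 0
# with A' P A - P < 0 for one discrete-time subsystem. Not the paper's DPDC LMIs.
import numpy as np
import cvxpy as cp

A = np.array([[0.8, 0.2],
              [0.0, 0.5]])          # a stable discrete-time test matrix

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("status:", prob.status)       # 'optimal' means the LMIs are feasible
print("P =\n", np.round(P.value, 3))
```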
Funding: Project supported by the National Natural Science Foundation of China (No. 10572084) and the Shanghai Leading Academic Discipline Project (No. Y0103).
Abstract: The large-scale vortical structures produced by an impinging density jet in a shallow crossflow were numerically investigated in detail using the RNG turbulence model. The scales, formation mechanism and evolution features of the upstream wall vortex in relation to the stagnation point, and of the Scarf vortex in the near field, were analyzed. The computed characteristic scales of the upstream vortex show pronounced three-dimensionality and vary with the velocity ratio and the water depth. The Scarf vortex in the near field plays an important role in the lateral concentration distribution of the impinging jet in crossflow. When the velocity ratio is relatively small, a distinct zone of high lateral concentration forms at the lateral edge between the bottom-layer wall jet and the ambient crossflow, dominated by the near-field Scarf vortex.
Funding: Supported in part by the National Natural Science Foundation of China (NSFC) under Grant No. 61976242, in part by the Natural Science Fund of Hebei Province for Distinguished Young Scholars under Grant No. F2021202010, in part by the Fundamental Scientific Research Funds for the Interdisciplinary Team of Hebei University of Technology under Grant No. JBKYTD2002, by the Science and Technology Project of the Hebei Education Department under Grant No. JZX2023007, and by the 2022 Interdisciplinary Postgraduate Training Program of Hebei University of Technology under Grant No. HEBUT-YXKJC-2022122.
Abstract: Most neural network architectures are designed from human experience, which requires a long and tedious trial-and-error process. Neural architecture search (NAS) attempts to discover effective architectures without human intervention. Evolutionary algorithms (EAs) for NAS can find better solutions than human-designed architectures by exploring a large search space of possible architectures. Using multiobjective EAs for NAS, optimal neural architectures that meet various performance criteria can be explored and discovered efficiently. Furthermore, hardware-accelerated NAS methods can improve the efficiency of the search. While existing reviews have mainly focused on different strategies for completing NAS, only a few studies have explored the use of EAs for NAS. In this paper, we summarize and explore the use of EAs for NAS, as well as large-scale multiobjective optimization strategies and hardware-accelerated NAS methods. NAS performs well in healthcare applications such as medical image analysis, disease diagnosis classification and health monitoring. EAs for NAS can automate the search process and optimize multiple objectives simultaneously in a given healthcare task. Deep neural networks have been used successfully in healthcare, but they lack interpretability. Medical data are highly sensitive, and privacy leaks are frequently reported in the healthcare industry. To address these problems, we propose an interpretable neuroevolution framework for healthcare, based on federated learning, that addresses both search efficiency and privacy protection. We also point out future research directions for evolutionary NAS. Overall, for researchers who want to use EAs to optimize neural networks in healthcare, we analyze the advantages and disadvantages of doing so to provide detailed guidance, and we propose an interpretable privacy-preserving framework for healthcare applications.
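As a toy illustration of the evolutionary NAS loop surveyed here, the sketch below mutates simple (depth, width) architecture encodings and selects by a stand-in fitness. A real NAS run would train or estimate the accuracy of each candidate network; the encoding, fitness and all parameters are assumptions for illustration, not the paper's framework.

```python
# Toy evolutionary architecture search: mutate (depth, width) encodings and
# keep the fittest under a placeholder fitness (stands in for accuracy vs. size).
import random

random.seed(0)
WIDTHS = [16, 32, 64, 128, 256]

def random_arch():
    return {"depth": random.randint(1, 8), "width": random.choice(WIDTHS)}

def mutate(arch):
    child = dict(arch)
    if random.random() < 0.5:
        child["depth"] = max(1, min(8, child["depth"] + random.choice([-1, 1])))
    else:
        child["width"] = random.choice(WIDTHS)
    return child

def fitness(arch):
    # Placeholder: reward moderate depth/width, penalize a crude size proxy.
    accuracy_proxy = 1.0 - abs(arch["depth"] - 5) * 0.05 - abs(arch["width"] - 64) / 512
    size_penalty = arch["depth"] * arch["width"] / 4096
    return accuracy_proxy - 0.1 * size_penalty

population = [random_arch() for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                       # truncation selection
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=fitness)
print("best architecture:", best, "fitness:", round(fitness(best), 3))
```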
Abstract: With the advancement of clean heating projects and the integration of large-scale distributed heat pumps into rural distribution networks in northern China, power grid companies face tremendous pressure to invest in grid upgrades; this also brings opportunities for the integration of renewable power generation. Combining heating from distributed renewable energy with the flexible operation of heat pumps is a feasible alternative for dealing with the grid reinforcement challenges resulting from heating electrification. In this paper, a mathematical model for the collaborative planning of distributed wind power generation (DWPG) and the distribution network with large-scale heat pumps is developed. In this model, the operational flexibility of the heat pump load is fully considered and the requirement of a comfortable indoor temperature is met. By applying this model, the goals of both increasing the profit of DWPG and reducing the cost of the power grid upgrade can be achieved.
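To illustrate the kind of heat-pump flexibility such a planning model exploits, the toy simulation below shifts heat-pump operation towards hours of high (synthetic) wind output while a first-order building thermal model keeps indoor temperature inside a comfort band. All parameters and the simple rule-based dispatch are illustrative assumptions, not the paper's optimization model.

```python
# Toy heat-pump flexibility illustration: heat preferentially when wind is
# plentiful, subject to an indoor comfort band, using a first-order RC model.
import numpy as np

rng = np.random.default_rng(3)
HOURS = 24
T_out = -5 + 3 * np.sin(np.arange(HOURS) / 24 * 2 * np.pi)      # outdoor temperature, degC
wind = rng.uniform(0, 1, HOURS)                                  # normalized wind availability

R, C = 2.0, 10.0           # thermal resistance (degC/kW) and capacitance (kWh/degC)
COP, P_HP = 3.0, 3.0       # heat pump COP and electric rating (kW)
T_MIN, T_MAX = 18.0, 24.0  # comfort band (degC)

T_in, schedule = 20.0, []
for t in range(HOURS):
    heat_next = T_in + (COP * P_HP - (T_in - T_out[t]) / R) / C   # temperature if heating this hour
    idle_next = T_in - (T_in - T_out[t]) / (R * C)                # temperature if idling this hour
    if idle_next < T_MIN:
        on = 1                              # must heat to stay above the lower bound
    elif heat_next > T_MAX:
        on = 0                              # must idle to stay below the upper bound
    else:
        on = int(wind[t] > 0.5)             # otherwise follow the wind
    T_in = heat_next if on else idle_next
    schedule.append(on)

print("hours on:", sum(schedule), "of", HOURS)
print("share of on-hours with high wind:",
      round(sum(s for s, w in zip(schedule, wind) if w > 0.5) / max(sum(schedule), 1), 2))
```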
Funding: This study was sponsored by the National Natural Science Foundation of China (Grant Nos. 41902269 and 41702343) and a Project of the China Geological Survey (Grant No. DD20190717). The authors express their sincere thanks to the reviewers and the editor for their help.
Abstract: Landslides in the Tianshui Basin, Gansu Province, Northwest China, severely affect the local population and economy; therefore, understanding their evolution and kinematics is of great interest for landslide risk assessment and prevention. However, there is no unified classification standard for the types of loess landslides in Tianshui. In this study, we explored the landslide distribution and failure characteristics by means of field investigation, remote sensing interpretation, geological mapping, drilling exploration and shear-wave velocity tests, and established a database of Tianshui landslides. Our analysis shows that shear zones in mudstone usually develop in weak intercalated layers. Landslides occur mainly along the West Qinling faults, on slopes with gradients of 10° to 25° and on southeast- and southwest-facing slopes. These landslides were classified into five types: loess landslides, loess–mudstone interface landslides, loess flow-slides, loess–mudstone plane landslides and loess–mudstone cutting landslides. We discussed the evolution and failure process of each landslide type and analyzed the formation mechanism and motion characteristics of large-scale landslides. The results show that the landslides in the study area are characterized by gentle slopes, long runout and high risk. The relationship between the runout L and the vertical drop H of the large-scale landslides in the study area is L > 4H. There are good correlations between the equivalent friction coefficient of large-scale landslides and their maximum height, runout, area and volume. The sliding zone of large-scale landslides often develops in the bedrock contact zone or in a weak interlayer within the mudstone. Microstructure analysis shows that undisturbed mudstone consists mainly of small aggregates with dispersed inter-aggregate pores, whereas sheared clay has a more homogeneous structure. Linear striations are well developed on shear surfaces, and the clay pores in those surfaces have a more uniform distribution than those in undisturbed clay.
Funding: Co-supported by the Strategic Priority Program on Space Science of the Chinese Academy of Sciences (No. XDA15014902) and the Key Research Program of the Chinese Academy of Sciences (No. ZDRW-KT-2019-1-0102).
Abstract: Space swarms, enabled by the miniaturization of spacecraft, have the potential to lower costs, increase efficiency and broaden the horizons of space missions. The formation control problem of large-scale spacecraft swarms flying around an elliptic orbit is considered. The objective is to drive the entire formation to produce a specified spatial pattern. The relative motion between agents becomes complicated as the number of agents increases. Hence, a density-based method is adopted, which concerns the density evolution of the entire swarm instead of the trajectories of individuals. The density-based method manipulates the density evolution with partial differential equations (PDEs). The density-based control in this work has two aspects: global pattern control of the whole swarm and local collision avoidance between nearby agents. The global behavior of the swarm is driven by designing velocity fields. For each spacecraft, the Q-guidance steering law is adopted to track the desired velocity with accelerations in a distributed manner. However, the final stable velocity field is required to be zero in the classical density-based approach, which is an obstacle from the viewpoint of astrodynamics, since the periodic relative motion is always time-varying. To solve this issue, a novel transformation is constructed based on the periodic solutions of the Tschauner-Hempel (TH) equations. The relative motion in Cartesian coordinates is then transformed into a new coordinate system, which permits zero velocity in a stable configuration. The local behavior of the swarm, such as collision avoidance, is achieved via a carefully designed local density estimation algorithm. Numerical simulations are provided to demonstrate the performance of this approach.
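As a planar toy version of the two control layers described here, the sketch below combines a global velocity field that drives agents toward a target pattern (a ring) with a local repulsion term derived from a kernel density estimate. It is a generic 2D illustration only: it does not implement the paper's PDE-based density control, the Q-guidance law, or the TH-equation coordinate transform, and every parameter is an illustrative assumption.

```python
# Toy 2D swarm: global velocity field toward a ring + local density-based repulsion.
import numpy as np

rng = np.random.default_rng(0)
N, R_TARGET, DT, STEPS = 200, 1.0, 0.05, 200
pos = rng.normal(scale=0.3, size=(N, 2))          # initial cluster of agents

def kde_repulsion(pos, bandwidth=0.15):
    """Negative gradient of a Gaussian kernel density estimate at each agent
    (points away from crowded regions, giving simple collision avoidance)."""
    diff = pos[:, None, :] - pos[None, :, :]       # (N, N, 2) pairwise offsets
    sq = np.sum(diff**2, axis=-1)
    w = np.exp(-sq / (2 * bandwidth**2))
    np.fill_diagonal(w, 0.0)
    return np.sum(w[:, :, None] * diff, axis=1) / (bandwidth**2 * len(pos))

for _ in range(STEPS):
    r = np.linalg.norm(pos, axis=1, keepdims=True) + 1e-9
    v_global = (R_TARGET - r) * pos / r            # push radially toward the ring
    v_local = 0.05 * kde_repulsion(pos)            # spread out within the pattern
    pos += DT * (v_global + v_local)

r_final = np.linalg.norm(pos, axis=1)
print("mean radius:", round(r_final.mean(), 3), " std:", round(r_final.std(), 3))
```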