Eye diagnosis is a method for inspecting systemic diseases and syndromes by observing the eyes.With the development of intelligent diagnosis in traditional Chinese medicine(TCM);artificial intelligence(AI)can improve ...Eye diagnosis is a method for inspecting systemic diseases and syndromes by observing the eyes.With the development of intelligent diagnosis in traditional Chinese medicine(TCM);artificial intelligence(AI)can improve the accuracy and efficiency of eye diagnosis.However;the research on intelligent eye diagnosis still faces many challenges;including the lack of standardized and precisely labeled data;multi-modal information analysis;and artificial in-telligence models for syndrome differentiation.The widespread application of AI models in medicine provides new insights and opportunities for the research of eye diagnosis intelli-gence.This study elaborates on the three key technologies of AI models in the intelligent ap-plication of TCM eye diagnosis;and explores the implications for the research of eye diagno-sis intelligence.First;a database concerning eye diagnosis was established based on self-su-pervised learning so as to solve the issues related to the lack of standardized and precisely la-beled data.Next;the cross-modal understanding and generation of deep neural network models to address the problem of lacking multi-modal information analysis.Last;the build-ing of data-driven models for eye diagnosis to tackle the issue of the absence of syndrome dif-ferentiation models.In summary;research on intelligent eye diagnosis has great potential to be applied the surge of AI model applications.展开更多
This paper investigates the wireless communication with a novel architecture of antenna arrays,termed modular extremely large-scale array(XLarray),where array elements of an extremely large number/size are regularly m...This paper investigates the wireless communication with a novel architecture of antenna arrays,termed modular extremely large-scale array(XLarray),where array elements of an extremely large number/size are regularly mounted on a shared platform with both horizontally and vertically interlaced modules.Each module consists of a moderate/flexible number of array elements with the inter-element distance typically in the order of the signal wavelength,while different modules are separated by the relatively large inter-module distance for convenience of practical deployment.By accurately modelling the signal amplitudes and phases,as well as projected apertures across all modular elements,we analyse the near-field signal-to-noise ratio(SNR)performance for modular XL-array communications.Based on the non-uniform spherical wave(NUSW)modelling,the closed-form SNR expression is derived in terms of key system parameters,such as the overall modular array size,distances of adjacent modules along all dimensions,and the user's three-dimensional(3D)location.In addition,with the number of modules in different dimensions increasing infinitely,the asymptotic SNR scaling laws are revealed.Furthermore,we show that our proposed near-field modelling and performance analysis include the results for existing array architectures/modelling as special cases,e.g.,the collocated XL-array architecture,the uniform plane wave(UPW)based far-field modelling,and the modular extremely large-scale uniform linear array(XL-ULA)of onedimension.Extensive simulation results are presented to validate our findings.展开更多
Considering the large diameter effect of piles,the influence of different pile-soil analysis methods on the design of monopile foundations for offshore wind turbines has become an urgent problem to be solved.Three dif...Considering the large diameter effect of piles,the influence of different pile-soil analysis methods on the design of monopile foundations for offshore wind turbines has become an urgent problem to be solved.Three different pile-soil models were used to study a large 10 MW monopile wind turbine.By modeling the three models in the SACS software,this paper analyzed the motion response of the overall structure under the conditions of wind and waves.According to the given working conditions,this paper concludes that under the condition of independent wind,the average value of the tower top x-displacement of the rigid connection method is the smalle st,and the standard deviation is the smallest under the condition of independent wave.The results obtained by the p-y curve method are the most conservative.展开更多
The application model of epidemic disease assessment technology for Web-based large-scale pig farm was expounded from the identification of epidemic disease risk factors, construction of risk assessment model and deve...The application model of epidemic disease assessment technology for Web-based large-scale pig farm was expounded from the identification of epidemic disease risk factors, construction of risk assessment model and development of risk assessment system. The assessed pig farm uploaded the epidemic disease risk data information through on-line answering evaluating questionnaire to get the immediate evaluation report. The model could enhance the risk communication between pig farm veterinarian, manager and veterinary experts to help farm system understand and find disease risk factors, assess and report the potential high risk items of the pig farm in the three systems of engineering epidemic disease prevention technology, biological safety and immune monitoring, and promote the improvement and perfection of epidemic disease prevention and control measures.展开更多
A Long Short-Term Memory(LSTM) Recurrent Neural Network(RNN) has driven tremendous improvements on an acoustic model based on Gaussian Mixture Model(GMM). However, these models based on a hybrid method require a force...A Long Short-Term Memory(LSTM) Recurrent Neural Network(RNN) has driven tremendous improvements on an acoustic model based on Gaussian Mixture Model(GMM). However, these models based on a hybrid method require a forced aligned Hidden Markov Model(HMM) state sequence obtained from the GMM-based acoustic model. Therefore, it requires a long computation time for training both the GMM-based acoustic model and a deep learning-based acoustic model. In order to solve this problem, an acoustic model using CTC algorithm is proposed. CTC algorithm does not require the GMM-based acoustic model because it does not use the forced aligned HMM state sequence. However, previous works on a LSTM RNN-based acoustic model using CTC used a small-scale training corpus. In this paper, the LSTM RNN-based acoustic model using CTC is trained on a large-scale training corpus and its performance is evaluated. The implemented acoustic model has a performance of 6.18% and 15.01% in terms of Word Error Rate(WER) for clean speech and noisy speech, respectively. This is similar to a performance of the acoustic model based on the hybrid method.展开更多
The streamflow over the Yellow River basin is simulated using the PRECIS (Providing REgional Climates for Impacts Studies) regional climate model driven by 15-year (1979-1993) ECMWF reanalysis data as the initial ...The streamflow over the Yellow River basin is simulated using the PRECIS (Providing REgional Climates for Impacts Studies) regional climate model driven by 15-year (1979-1993) ECMWF reanalysis data as the initial and lateral boundary conditions and an off-line large-scale routing model (LRM). The LRM uses physical catchment and river channel information and allows streamflow to be predicted for large continental rivers with a 1°×1° spatial resolution. The results show that the PRECIS model can reproduce the general southeast to northwest gradient distribution of the precipitation over the Yellow River basin, The PRECIS- LRM model combination has the capability to simulate the seasonal and annual streamflow over the Yellow River basin. The simulated streamflow is generally coincident with the naturalized streamflow both in timing and in magnitude.展开更多
In relatively coarse-resolution atmospheric models,cumulus parameterization helps account for the effect of subgridscale convection,which produces supplemental rainfall to the grid-scale precipitation and impacts the ...In relatively coarse-resolution atmospheric models,cumulus parameterization helps account for the effect of subgridscale convection,which produces supplemental rainfall to the grid-scale precipitation and impacts the diurnal cycle of precipitation.In this study,the diurnal cycle of precipitation was studied using the new simplified Arakawa-Schubert scheme in a global non-hydrostatic atmospheric model,i.e.,the Yin-Yang-grid Unified Model for the Atmosphere.Two new diagnostic closures and a convective trigger function were suggested to emphasize the job of the cloud work function corresponding to the free tropospheric large-scale forcing.Numerical results of the 0.25-degree model in 3-month batched real-case simulations revealed an improvement in the diurnal precipitation variation by using a revised trigger function with an enhanced dynamical constraint on the convective initiation and a suitable threshold of the trigger.By reducing the occurrence of convection during peak solar radiation hours,the revised scheme was shown to be effective in delaying the appearance of early-afternoon rainfall peaks over most land areas and accentuating the nocturnal peaks that were wrongly concealed by the more substantial afternoon peak.In addition,the revised scheme enhanced the simulation capability of the precipitation probability density function,such as increasing the extremely low-and high-intensity precipitation events and decreasing small and moderate rainfall events,which contributed to the reduction of precipitation bias over mid-latitude and tropical land areas.展开更多
Model Order Reduction (MOR) plays more and more imp or tant role in complex system simulation, design and control recently. For example , for the large-size space structures, VLSI and MEMS (Micro-ElectroMechanical Sys...Model Order Reduction (MOR) plays more and more imp or tant role in complex system simulation, design and control recently. For example , for the large-size space structures, VLSI and MEMS (Micro-ElectroMechanical Systems) etc., in order to shorten the development cost, increase the system co ntrolling accuracy and reduce the complexity of controllers, the reduced order model must be constructed. Even in Virtual Reality (VR), the simulation and d isplay must be in real-time, the model order must be reduced too. The recent advances of MOR research are overviewed in the article. The MOR theor y and methods may be classified as Singular Value decomposition (SVD) based, the Krylov subspace based and others. The merits and demerits of the different meth ods are analyzed, and the existed problems are pointed out. Moreover, the applic ation’s fields are overviewed, and the potential applications are forecaste d. After the existed problems analyzed, the future work is described. There are som e problems in the traditional methods such as SVD and Krylov subspace, they are that it’s difficult to (1)guarantee the stability of the original system, (2) b e adaptive to nonlinear system, and (3) control the modeling accuracy. The f uture works may be solving the above problems on the foundation of the tradition al methods, and applying other methods such as wavelet or signal compression.展开更多
[Objective] The behavior of eating, drinking, defecating and peeing of 1 500 pigs in a large-scale microbial fermentation bed-equipped piggery was observed. We hoped to find some simple indicators that could reflect t...[Objective] The behavior of eating, drinking, defecating and peeing of 1 500 pigs in a large-scale microbial fermentation bed-equipped piggery was observed. We hoped to find some simple indicators that could reflect the health status of swinery and to provide experience for the swinery performance management in large-scale microbial fermentation bed-equipped piggery. [Method] The body weight (BW), daily BW gain, feed intake and other indicators of different-day-old pigs were recorded in details. Based on the recorded data, the models between BW, BW gain, average daily feed intake and feed/gain ratio and growth days (d) were established. In addition, the incidences of pox-like macula (dermatitis), diarrhea (gastrointestinal disease), cough (respiratory disease), stiff pig (malnutrition), conjunctivitis (eye disease) and foot inflection (trauma) among fattening pigs were also investigated. [Result] The BW range, average BW, daily BW gain, breeding days, daily feed intake range, average daily feed intake, staged feed intake, accumulated feed intake, feed/gain ratio and accumulated feed/gain ratio of different-day-old pigs were studied, respectively. Four dynamic models were established for the growth of pigs: (1) the BW (y)-age (x) mod- el: y=0.758 9x-19.883 (3=0.993 7); (2) the BW gain (y)-age (x) model: y=1.039 5x05051 (F=0.885 4); (3) the average daily feed intake (y)-age (x) model: y=0.023 5x-0.334 3 (F=0.991 7); (4) the feed/gain ratio (y)-age (x) model: y=0.022x+0.427 8 (P=0.988 5). Based on these models, the corresponding theoretical growth value of pigs at different growth stage could be predicted. The main diseases occurred among the swinery in the large-scale microbial fermentation bed piggery included pox-like macula (dermatitis), diarrhea (gastrointestinal disease), cough (respiratory disease), stiff pig (mal- nutrition), conjunctivitis (eye disease) and foot inflection (trauma). The deadly infec- tious diseases had been not found among the pigs. [Conclusion] When the actual BW, BW gain, average daily feed intake and feed/gain ratio were all lower than the theoretical values predicted by the models, the management should be enhanced. The average daily feed intake of 60 to 65-day-old pigs was lower than the theoretic value, indicating that the pigs could not adapt nicely to the fermentation bed at the very early stage. When the pigs grew up to 70 to 75 d old, the average daily feed intake was higher than the theoretical value, indicating that the pigs had adapted to the fermentation bed. In particularly, average daily feed intake of 75-day-old pigs was higher than the theoretical value by 21%. It was suggested the fermentation bed was conducive to the growth of pigs. Considering the occurrence of diseases among pigs, the overall incidence was relatively low. The incidence of each disease was all lower than 10% with little difficulty in treating. If the management of mattress was strength- ened, such as paying attention to feeding and keeping water clean, many diseases could heal by themselves.展开更多
As a result of rapid development in electronics and communication technology,large-scale unmanned aerial vehicles(UAVs)are harnessed for various promising applications in a coordinated manner.Although it poses numerou...As a result of rapid development in electronics and communication technology,large-scale unmanned aerial vehicles(UAVs)are harnessed for various promising applications in a coordinated manner.Although it poses numerous advantages,resource management among various domains in large-scale UAV communication networks is the key challenge to be solved urgently.Specifically,due to the inherent requirements and future development trend,distributed resource management is suitable.In this article,we investigate the resource management problem for large-scale UAV communication networks from game-theoretic perspective which are exactly coincident with the distributed and autonomous manner.By exploring the inherent features,the distinctive challenges are discussed.Then,we explore several gametheoretic models that not only combat the challenges but also have broad application prospects.We provide the basics of each game-theoretic model and discuss the potential applications for resource management in large-scale UAV communication networks.Specifically,mean-field game,graphical game,Stackelberg game,coalition game and potential game are included.After that,we propose two innovative case studies to highlight the feasibility of such novel game-theoretic models.Finally,we give some future research directions to shed light on future opportunities and applications.展开更多
The temperature control of the large-scale vertical quench furnace is very difficult due to its huge volume and complex thermal exchanges. To meet the technical requirement of the quenching process, a temperature cont...The temperature control of the large-scale vertical quench furnace is very difficult due to its huge volume and complex thermal exchanges. To meet the technical requirement of the quenching process, a temperature control system which integrates temperature calibration and temperature uniformity control is developed for the thermal treatment of aluminum alloy workpieces in the large-scale vertical quench furnace. To obtain the aluminum alloy workpiece temperature, an air heat transfer model is newly established to describe the temperature gradient distribution so that the immeasurable workpiece temperature can be calibrated from the available thermocouple temperature. To satisfy the uniformity control of the furnace temperature, a second order partial differential equation(PDE) is derived to describe the thermal dynamics inside the vertical quench furnace. Based on the PDE, a decoupling matrix is constructed to solve the coupling issue and decouple the heating process into multiple independent heating subsystems. Then, using the expert control rule to find a compromise of temperature rising time and overshoot during the quenching process. The developed temperature control system has been successfully applied to a 31 m large-scale vertical quench furnace, and the industrial running results show the significant improvement of the temperature uniformity, lower overshoot and shortened processing time.展开更多
Based on the explicit finite element(FE) method and platform of ABAQUS,considering both the inhomogeneity of soils and concave-convex fluctuation of topography,a large-scale refined two-dimensional(2D) FE nonlinear an...Based on the explicit finite element(FE) method and platform of ABAQUS,considering both the inhomogeneity of soils and concave-convex fluctuation of topography,a large-scale refined two-dimensional(2D) FE nonlinear analytical model for Fuzhou Basin was established.The peak ground motion acceleration(PGA) and focusing effect with depth were analyzed.Meanwhile,the results by wave propagation of one-dimensional(1D) layered medium equivalent linearization method were added for contrast.The results show that:1) PGA at different depths are obviously amplified compared to the input ground motion,amplification effect of both funnel-shaped depression and upheaval areas(based on the shape of bedrock surface) present especially remarkable.The 2D results indicate that the PGA displays a non-monotonic decreasing with depth and a greater focusing effect of some particular layers,while the 1D results turn out that the PGA decreases with depth,except that PGA at few particular depth increases abruptly; 2) To the funnel-shaped depression areas,PGA amplification effect above 8 m depth shows relatively larger,to the upheaval areas,PGA amplification effect from 15 m to 25 m depth seems more significant.However,the regularities of the PGA amplification effect could hardly be found in the rest areas; 3) It appears a higher regression rate of PGA amplification coefficient with depth when under a smaller input motion; 4) The frequency spectral characteristic of input motion has noticeable effects on PGA amplification tendency.展开更多
Wind energy has been rapidly developed in China during the past decades and the installed capacity has been the largest in the world. In the future, utilization of wind power is still expected to carry out in China ma...Wind energy has been rapidly developed in China during the past decades and the installed capacity has been the largest in the world. In the future, utilization of wind power is still expected to carry out in China mainly with a large-scale centralized layout. Here, we examine the potential climatic impacts of large-scale windfarms associated with deployment scale in China using numerical experiments, in which four deployment scenarios were designed. These four scenarios represented relatively small- (484 GW), medium- (2165 GW) and large-scale (3490 GW and 5412 GW) installed wind power capacities, respectively. Results showed that turbulent kinetic energy, wind velocity, and air temperature varied consistently within those windfarms with the largest changes in turbine hub heights. Moreover, the above relatively large- scale windfarms could induce regional wanning with a maximum of above 0.8 °C in North China. This regional warming may be linked to an anomalous circulation pattern with a negative pressure anomaly center in Northeast China and a positive pressure anomaly center in the middle and lower reaches of the Yangtze-Huaihe River Basin.展开更多
BACKGROUND Large-scale functional connectivity(LSFC)patterns in the brain have unique intrinsic characteristics.Abnormal LSFC patterns have been found in patients with dementia,as well as in those with mild cognitive ...BACKGROUND Large-scale functional connectivity(LSFC)patterns in the brain have unique intrinsic characteristics.Abnormal LSFC patterns have been found in patients with dementia,as well as in those with mild cognitive impairment(MCI),and these patterns predicted their cognitive performance.It has been reported that patients with type 2 diabetes mellitus(T2DM)may develop MCI that could progress to dementia.We investigated whether we could adopt LSFC patterns as discriminative features to predict the cognitive function of patients with T2DM,using connectome-based predictive modeling(CPM)and a support vector machine.AIM To investigate the utility of LSFC for predicting cognitive impairment related to T2DM more accurately and reliably.METHODS Resting-state functional magnetic resonance images were derived from 42 patients with T2DM and 24 healthy controls.Cognitive function was assessed using the Montreal Cognitive Assessment(MoCA).Patients with T2DM were divided into two groups,according to the presence(T2DM-C;n=16)or absence(T2DM-NC;n=26)of MCI.Brain regions were marked using Harvard Oxford(HOA-112),automated anatomical labeling(AAL-116),and 264-region functional(Power-264)atlases.LSFC biomarkers for predicting MoCA scores were identified using a new CPM technique.Subsequently,we used a support vector machine based on LSFC patterns for among-group differentiation.The area under the receiver operating characteristic curve determined the appearance of the classification.RESULTS CPM could predict the MoCA scores in patients with T2DM(Pearson’s correlation coefficient between predicted and actual MoCA scores,r=0.32,P=0.0066[HOA-112 atlas];r=0.32,P=0.0078[AAL-116 atlas];r=0.42,P=0.0038[Power-264 atlas]),indicating that LSFC patterns represent cognition-level measures in these patients.Positive(anti-correlated)LSFC networks based on the Power-264 atlas showed the best predictive performance;moreover,we observed new brain regions of interest associated with T2DM-related cognition.The area under the receiver operating characteristic curve values(T2DM-NC group vs.T2DM-C group)were 0.65-0.70,with LSFC matrices based on HOA-112 and Power-264 atlases having the highest value(0.70).Most discriminative and attractive LSFCs were related to the default mode network,limbic system,and basal ganglia.CONCLUSION LSFC provides neuroimaging-based information that may be useful in detecting MCI early and accurately in patients with T2DM.展开更多
Large-scale atmospheric information plays an important role in the regional model for the forecasts of weather such as tropical cyclone(TC).However,it is difficult to be fully represented in regional models due to dom...Large-scale atmospheric information plays an important role in the regional model for the forecasts of weather such as tropical cyclone(TC).However,it is difficult to be fully represented in regional models due to domain size and a lack of observation data,particularly at sea used in regional data assimilation.Blending analysis has been developed and implemented in regional models to reintroduce large-scale information from global model to regional analysis.Research of the impact of this large-scale blending scheme for the Global/Regional Assimilation and PrEdiction System(CMA-MESO)regional model on TC forecasting is limited and this study attempts to further progress by examining the adaptivity of the blending scheme using the two-dimensional Discrete Cosine Transform(2D-DCT)filter on the model forecast of Typhoon Haima over Shenzhen,China in 2016 and considering various cut-off wavelengths.Results showed that the error of the 24-hour typhoon track forecast can be reduced to less than 25 km by applying the scale-dependent blending scheme,indicating that the blending analysis is effectively able to minimise the large-scale bias for the initial fields.The improvement of the wind forecast is more evident for u-wind component according to the reduced root mean square errors(RMSEs)by comparing the experiments with and without blending analysis.Furthermore,the higher equitable threat score(ETS)provided implications that the precipitation prediction skills were increased in the 24h forecast by improving the representation of the large-scale feature in the CMA-MESO analysis.Furthermore,significant differences of the track error forecast were found by applying the blending analysis with different cut-off wavelengths from 400 km to 1200 km and the track error can be reduced less than by 10 km with 400 km cut-off wavelength in the first 6h forecast.It highlighted that the blending scheme with dynamic cut-off wavelengths adapted to the development of different TC systems is necessary in order to optimally introduce and ingest the large-scale information from global model to the regional model for improving the TC forecast.In this paper,the methods and data applied in this study will be firstly introduced,before discussion of the results regarding the performance of the blending analysis and its impacts on the wind and precipitation forecast correspondingly,followed by the discussion of the effects of different blending scheme on TC forecasts and the conclusion section.展开更多
The society in the digital transformation era demands new decision schemes such as e-democracy or based on social media.Such novel decision schemes require the participation of many experts/decision makers/stakeholder...The society in the digital transformation era demands new decision schemes such as e-democracy or based on social media.Such novel decision schemes require the participation of many experts/decision makers/stakeholders in the decision processes.As a result,large-scale group decision making(LSGDM)has attracted the attention of many researchers in the last decade and many studies have been conducted in order to face the challenges associated with the topic.Therefore,this paper aims at reviewing the most relevant studies about LSGDM,identifying the most profitable research trends and analyzing them from a critical point of view.To do so,the Web of Science database has been consulted by using different searches.From these results a total of 241 contributions were found and a selection process regarding language,type of contribution and actual relation with the studied topic was then carried out.The 87 contributions finally selected for this review have been analyzed from four points of view that have been highly remarked in the topic,such as the preference structure in which decision-makers’opinions are modeled,the group decision rules used to define the decision making process,the techniques applied to verify the quality of these models and their applications to real world problems solving.Afterwards,a critical analysis of the main limitations of the existing proposals is developed.Finally,taking into account these limitations,new research lines for LSGDM are proposed and the main challenges are stressed out.展开更多
A reduction in network energy consumption and the establishment of green networks have become key scientific problems in academic and industrial research.Existing energy efficiency schemes are based on a known traffic...A reduction in network energy consumption and the establishment of green networks have become key scientific problems in academic and industrial research.Existing energy efficiency schemes are based on a known traffic matrix,and acquiring a real-time traffic matrix in current complex networks is difficult.Therefore,this research investigates how to reduce network energy consumption without a real-time traffic matrix.In particular,this paper proposes an intra-domain energy-efficient routing scheme based on multipath routing.It analyzes the relationship between routing availability and energy-efficient routing and integrates the two mechanisms to satisfy the requirements of availability and energy efficiency.The main research focus is as follows:(1)A link criticality model is evaluated to quantitatively measure the importance of links in a network.(2)On the basis of the link criticality model,this paper analyzes an energy-efficient routing technology based on multipath routing to achieve the goals of availability and energy efficiency simultaneously.(3)An energy-efficient routing algorithm based on multipath routing in large-scale networks is proposed.(4)The proposed method does not require a real-time traffic matrix in the network and is thus easy to apply in practice.(5)The proposed algorithm is verified in several network topologies.Experimental results show that the algorithm can not only reduce network energy consumption but can also ensure routing availability.展开更多
Time-delays,due to the information transmission between subsystems,naturally exist in large-scale systems and the existence of the delay is frequently a source of instability. This paper considers the problems of robu...Time-delays,due to the information transmission between subsystems,naturally exist in large-scale systems and the existence of the delay is frequently a source of instability. This paper considers the problems of robust non-fragile fuzzy control for a class of uncertain discrete nonlinear large-scale systems with time-delay and controller gain perturbations described by T-S fuzzy model. An equivalent T-S fuzzy model is represented for discrete-delay nonlinear large-scale systems. A sufficient condition for the existence of such non-fragile controllers is further derived via the Lyapunov function and the linear matrix inequality( LMI) approach. Simulation results demonstrate the feasibility and the effectiveness of the proposed design and the proper stabilization of the system in spite of controller gain variations and uncertainties.展开更多
A continuous-time fuzzy large-scale system F consists of some interconnected Takagi-Sugeno fuzzy subsystems. Two sufficient conditions for the asymptotic stability of this system (namely, theorem 1 and theorem 2) are...A continuous-time fuzzy large-scale system F consists of some interconnected Takagi-Sugeno fuzzy subsystems. Two sufficient conditions for the asymptotic stability of this system (namely, theorem 1 and theorem 2) are derived via a multiple Lyapunov function approach. In theorem 1, the information of membership functions of fuzzy rules should be known in order to analyze the stability of F. But in general this information is not easy to be acquired for their time-varying property. So theorem 2 is provided to judge the asymptotic stability of F, based on which there is no need to know the information of membership functions in stability analysis. Finally, a numerical example is given to show the utility of the method proposed in this paper.展开更多
Social media data created a paradigm shift in assessing situational awareness during a natural disaster or emergencies such as wildfire, hurricane, tropical storm etc. Twitter as an emerging data source is an effectiv...Social media data created a paradigm shift in assessing situational awareness during a natural disaster or emergencies such as wildfire, hurricane, tropical storm etc. Twitter as an emerging data source is an effective and innovative digital platform to observe trend from social media users’ perspective who are direct or indirect witnesses of the calamitous event. This paper aims to collect and analyze twitter data related to the recent wildfire in California to perform a trend analysis by classifying firsthand and credible information from Twitter users. This work investigates tweets on the recent wildfire in California and classifies them based on witnesses into two types: 1) direct witnesses and 2) indirect witnesses. The collected and analyzed information can be useful for law enforcement agencies and humanitarian organizations for communication and verification of the situational awareness during wildfire hazards. Trend analysis is an aggregated approach that includes sentimental analysis and topic modeling performed through domain-expert manual annotation and machine learning. Trend analysis ultimately builds a fine-grained analysis to assess evacuation routes and provide valuable information to the firsthand emergency responders<span style="font-family:Verdana;">.</span>展开更多
基金National Natural Science Foundation of China(82274265 and 82274588)Hunan University of Traditional Chinese Medicine Research Unveiled Marshal Programs(2022XJJB003).
文摘Eye diagnosis is a method for inspecting systemic diseases and syndromes by observing the eyes.With the development of intelligent diagnosis in traditional Chinese medicine(TCM);artificial intelligence(AI)can improve the accuracy and efficiency of eye diagnosis.However;the research on intelligent eye diagnosis still faces many challenges;including the lack of standardized and precisely labeled data;multi-modal information analysis;and artificial in-telligence models for syndrome differentiation.The widespread application of AI models in medicine provides new insights and opportunities for the research of eye diagnosis intelli-gence.This study elaborates on the three key technologies of AI models in the intelligent ap-plication of TCM eye diagnosis;and explores the implications for the research of eye diagno-sis intelligence.First;a database concerning eye diagnosis was established based on self-su-pervised learning so as to solve the issues related to the lack of standardized and precisely la-beled data.Next;the cross-modal understanding and generation of deep neural network models to address the problem of lacking multi-modal information analysis.Last;the build-ing of data-driven models for eye diagnosis to tackle the issue of the absence of syndrome dif-ferentiation models.In summary;research on intelligent eye diagnosis has great potential to be applied the surge of AI model applications.
基金supported by the National Key R&D Program of China with Grant number 2019YFB1803400the National Natural Science Foundation of China under Grant number 62071114the Fundamental Research Funds for the Central Universities of China under grant numbers 3204002004A2 and 2242022k30005。
文摘This paper investigates the wireless communication with a novel architecture of antenna arrays,termed modular extremely large-scale array(XLarray),where array elements of an extremely large number/size are regularly mounted on a shared platform with both horizontally and vertically interlaced modules.Each module consists of a moderate/flexible number of array elements with the inter-element distance typically in the order of the signal wavelength,while different modules are separated by the relatively large inter-module distance for convenience of practical deployment.By accurately modelling the signal amplitudes and phases,as well as projected apertures across all modular elements,we analyse the near-field signal-to-noise ratio(SNR)performance for modular XL-array communications.Based on the non-uniform spherical wave(NUSW)modelling,the closed-form SNR expression is derived in terms of key system parameters,such as the overall modular array size,distances of adjacent modules along all dimensions,and the user's three-dimensional(3D)location.In addition,with the number of modules in different dimensions increasing infinitely,the asymptotic SNR scaling laws are revealed.Furthermore,we show that our proposed near-field modelling and performance analysis include the results for existing array architectures/modelling as special cases,e.g.,the collocated XL-array architecture,the uniform plane wave(UPW)based far-field modelling,and the modular extremely large-scale uniform linear array(XL-ULA)of onedimension.Extensive simulation results are presented to validate our findings.
基金financially supported by the Open Research Fund of Hunan Provincial Key Laboratory of Key Technology on Hydropower Development (Grant No.PKLHD202003)the National Natural Science Foundation of China (Grant Nos.52071058 and 51939002)+1 种基金the National Natural Science Foundation of Liaoning Province (Grant No.2022-KF-18-01)Fundamental Research Funds for the Central University (Grant No.DUT20ZD219)。
文摘Considering the large diameter effect of piles,the influence of different pile-soil analysis methods on the design of monopile foundations for offshore wind turbines has become an urgent problem to be solved.Three different pile-soil models were used to study a large 10 MW monopile wind turbine.By modeling the three models in the SACS software,this paper analyzed the motion response of the overall structure under the conditions of wind and waves.According to the given working conditions,this paper concludes that under the condition of independent wind,the average value of the tower top x-displacement of the rigid connection method is the smalle st,and the standard deviation is the smallest under the condition of independent wave.The results obtained by the p-y curve method are the most conservative.
基金Supported by the Fund Program of Jiangsu Academy of Agricultural Sciences(6111689)the Planning Program of"the Twelfth Five-year-plan"in National Science and Technology for the Rural Developme+nt in China(2015BAD12B04-1.2)the Fund for Independent Innovation of Agricultural Science and Technology of Jiangsu Province[CX(16)1006]~~
文摘The application model of epidemic disease assessment technology for Web-based large-scale pig farm was expounded from the identification of epidemic disease risk factors, construction of risk assessment model and development of risk assessment system. The assessed pig farm uploaded the epidemic disease risk data information through on-line answering evaluating questionnaire to get the immediate evaluation report. The model could enhance the risk communication between pig farm veterinarian, manager and veterinary experts to help farm system understand and find disease risk factors, assess and report the potential high risk items of the pig farm in the three systems of engineering epidemic disease prevention technology, biological safety and immune monitoring, and promote the improvement and perfection of epidemic disease prevention and control measures.
基金supported by the Ministry of Trade,Industry & Energy(MOTIE,Korea) under Industrial Technology Innovation Program (No.10063424,'development of distant speech recognition and multi-task dialog processing technologies for in-door conversational robots')
文摘A Long Short-Term Memory(LSTM) Recurrent Neural Network(RNN) has driven tremendous improvements on an acoustic model based on Gaussian Mixture Model(GMM). However, these models based on a hybrid method require a forced aligned Hidden Markov Model(HMM) state sequence obtained from the GMM-based acoustic model. Therefore, it requires a long computation time for training both the GMM-based acoustic model and a deep learning-based acoustic model. In order to solve this problem, an acoustic model using CTC algorithm is proposed. CTC algorithm does not require the GMM-based acoustic model because it does not use the forced aligned HMM state sequence. However, previous works on a LSTM RNN-based acoustic model using CTC used a small-scale training corpus. In this paper, the LSTM RNN-based acoustic model using CTC is trained on a large-scale training corpus and its performance is evaluated. The implemented acoustic model has a performance of 6.18% and 15.01% in terms of Word Error Rate(WER) for clean speech and noisy speech, respectively. This is similar to a performance of the acoustic model based on the hybrid method.
文摘The streamflow over the Yellow River basin is simulated using the PRECIS (Providing REgional Climates for Impacts Studies) regional climate model driven by 15-year (1979-1993) ECMWF reanalysis data as the initial and lateral boundary conditions and an off-line large-scale routing model (LRM). The LRM uses physical catchment and river channel information and allows streamflow to be predicted for large continental rivers with a 1°×1° spatial resolution. The results show that the PRECIS model can reproduce the general southeast to northwest gradient distribution of the precipitation over the Yellow River basin, The PRECIS- LRM model combination has the capability to simulate the seasonal and annual streamflow over the Yellow River basin. The simulated streamflow is generally coincident with the naturalized streamflow both in timing and in magnitude.
基金supported by the National Natural Science Foundation of China(Grant Nos.42375153,42075151).
文摘In relatively coarse-resolution atmospheric models,cumulus parameterization helps account for the effect of subgridscale convection,which produces supplemental rainfall to the grid-scale precipitation and impacts the diurnal cycle of precipitation.In this study,the diurnal cycle of precipitation was studied using the new simplified Arakawa-Schubert scheme in a global non-hydrostatic atmospheric model,i.e.,the Yin-Yang-grid Unified Model for the Atmosphere.Two new diagnostic closures and a convective trigger function were suggested to emphasize the job of the cloud work function corresponding to the free tropospheric large-scale forcing.Numerical results of the 0.25-degree model in 3-month batched real-case simulations revealed an improvement in the diurnal precipitation variation by using a revised trigger function with an enhanced dynamical constraint on the convective initiation and a suitable threshold of the trigger.By reducing the occurrence of convection during peak solar radiation hours,the revised scheme was shown to be effective in delaying the appearance of early-afternoon rainfall peaks over most land areas and accentuating the nocturnal peaks that were wrongly concealed by the more substantial afternoon peak.In addition,the revised scheme enhanced the simulation capability of the precipitation probability density function,such as increasing the extremely low-and high-intensity precipitation events and decreasing small and moderate rainfall events,which contributed to the reduction of precipitation bias over mid-latitude and tropical land areas.
文摘Model Order Reduction (MOR) plays more and more imp or tant role in complex system simulation, design and control recently. For example , for the large-size space structures, VLSI and MEMS (Micro-ElectroMechanical Systems) etc., in order to shorten the development cost, increase the system co ntrolling accuracy and reduce the complexity of controllers, the reduced order model must be constructed. Even in Virtual Reality (VR), the simulation and d isplay must be in real-time, the model order must be reduced too. The recent advances of MOR research are overviewed in the article. The MOR theor y and methods may be classified as Singular Value decomposition (SVD) based, the Krylov subspace based and others. The merits and demerits of the different meth ods are analyzed, and the existed problems are pointed out. Moreover, the applic ation’s fields are overviewed, and the potential applications are forecaste d. After the existed problems analyzed, the future work is described. There are som e problems in the traditional methods such as SVD and Krylov subspace, they are that it’s difficult to (1)guarantee the stability of the original system, (2) b e adaptive to nonlinear system, and (3) control the modeling accuracy. The f uture works may be solving the above problems on the foundation of the tradition al methods, and applying other methods such as wavelet or signal compression.
基金Supported by International Science and Technology Cooperation Project of China(2012DFA31120)Special Fund for Agro-scientific Research in the Public Interest(201303094)National Key Technology Research and Development Program(2012BAD14B15)~~
文摘[Objective] The behavior of eating, drinking, defecating and peeing of 1 500 pigs in a large-scale microbial fermentation bed-equipped piggery was observed. We hoped to find some simple indicators that could reflect the health status of swinery and to provide experience for the swinery performance management in large-scale microbial fermentation bed-equipped piggery. [Method] The body weight (BW), daily BW gain, feed intake and other indicators of different-day-old pigs were recorded in details. Based on the recorded data, the models between BW, BW gain, average daily feed intake and feed/gain ratio and growth days (d) were established. In addition, the incidences of pox-like macula (dermatitis), diarrhea (gastrointestinal disease), cough (respiratory disease), stiff pig (malnutrition), conjunctivitis (eye disease) and foot inflection (trauma) among fattening pigs were also investigated. [Result] The BW range, average BW, daily BW gain, breeding days, daily feed intake range, average daily feed intake, staged feed intake, accumulated feed intake, feed/gain ratio and accumulated feed/gain ratio of different-day-old pigs were studied, respectively. Four dynamic models were established for the growth of pigs: (1) the BW (y)-age (x) mod- el: y=0.758 9x-19.883 (3=0.993 7); (2) the BW gain (y)-age (x) model: y=1.039 5x05051 (F=0.885 4); (3) the average daily feed intake (y)-age (x) model: y=0.023 5x-0.334 3 (F=0.991 7); (4) the feed/gain ratio (y)-age (x) model: y=0.022x+0.427 8 (P=0.988 5). Based on these models, the corresponding theoretical growth value of pigs at different growth stage could be predicted. The main diseases occurred among the swinery in the large-scale microbial fermentation bed piggery included pox-like macula (dermatitis), diarrhea (gastrointestinal disease), cough (respiratory disease), stiff pig (mal- nutrition), conjunctivitis (eye disease) and foot inflection (trauma). The deadly infec- tious diseases had been not found among the pigs. [Conclusion] When the actual BW, BW gain, average daily feed intake and feed/gain ratio were all lower than the theoretical values predicted by the models, the management should be enhanced. The average daily feed intake of 60 to 65-day-old pigs was lower than the theoretic value, indicating that the pigs could not adapt nicely to the fermentation bed at the very early stage. When the pigs grew up to 70 to 75 d old, the average daily feed intake was higher than the theoretical value, indicating that the pigs had adapted to the fermentation bed. In particularly, average daily feed intake of 75-day-old pigs was higher than the theoretical value by 21%. It was suggested the fermentation bed was conducive to the growth of pigs. Considering the occurrence of diseases among pigs, the overall incidence was relatively low. The incidence of each disease was all lower than 10% with little difficulty in treating. If the management of mattress was strength- ened, such as paying attention to feeding and keeping water clean, many diseases could heal by themselves.
基金This work was supported by National Key R&D Program of China under Grant 2018YFB1800802in part by the National Natural Science Foundation of China under Grant No.61771488,No.61631020 and No.61827801+1 种基金in part by State Key Laboratory of Air Traffic Management System and Technology under Grant No.SKLATM201808in part by Postgraduate Research and Practice Innovation Program of Jiangsu Province under No.KYCX190188.
文摘As a result of rapid development in electronics and communication technology,large-scale unmanned aerial vehicles(UAVs)are harnessed for various promising applications in a coordinated manner.Although it poses numerous advantages,resource management among various domains in large-scale UAV communication networks is the key challenge to be solved urgently.Specifically,due to the inherent requirements and future development trend,distributed resource management is suitable.In this article,we investigate the resource management problem for large-scale UAV communication networks from game-theoretic perspective which are exactly coincident with the distributed and autonomous manner.By exploring the inherent features,the distinctive challenges are discussed.Then,we explore several gametheoretic models that not only combat the challenges but also have broad application prospects.We provide the basics of each game-theoretic model and discuss the potential applications for resource management in large-scale UAV communication networks.Specifically,mean-field game,graphical game,Stackelberg game,coalition game and potential game are included.After that,we propose two innovative case studies to highlight the feasibility of such novel game-theoretic models.Finally,we give some future research directions to shed light on future opportunities and applications.
基金Project(61174132)supported by the National Natural Science Foundation of ChinaProject(2015zzts047)supported by the Fundamental Research Funds for the Central Universities,ChinaProject(20130162110067)supported by the Research Fund for the Doctoral Program of Higher Education of China
文摘The temperature control of the large-scale vertical quench furnace is very difficult due to its huge volume and complex thermal exchanges. To meet the technical requirement of the quenching process, a temperature control system which integrates temperature calibration and temperature uniformity control is developed for the thermal treatment of aluminum alloy workpieces in the large-scale vertical quench furnace. To obtain the aluminum alloy workpiece temperature, an air heat transfer model is newly established to describe the temperature gradient distribution so that the immeasurable workpiece temperature can be calibrated from the available thermocouple temperature. To satisfy the uniformity control of the furnace temperature, a second order partial differential equation(PDE) is derived to describe the thermal dynamics inside the vertical quench furnace. Based on the PDE, a decoupling matrix is constructed to solve the coupling issue and decouple the heating process into multiple independent heating subsystems. Then, using the expert control rule to find a compromise of temperature rising time and overshoot during the quenching process. The developed temperature control system has been successfully applied to a 31 m large-scale vertical quench furnace, and the industrial running results show the significant improvement of the temperature uniformity, lower overshoot and shortened processing time.
基金Project(2011CB013601) supported by the National Basic Research Program of ChinaProject(51378258) supported by the National Natural Science Foundation of China
文摘Based on the explicit finite element(FE) method and platform of ABAQUS,considering both the inhomogeneity of soils and concave-convex fluctuation of topography,a large-scale refined two-dimensional(2D) FE nonlinear analytical model for Fuzhou Basin was established.The peak ground motion acceleration(PGA) and focusing effect with depth were analyzed.Meanwhile,the results by wave propagation of one-dimensional(1D) layered medium equivalent linearization method were added for contrast.The results show that:1) PGA at different depths are obviously amplified compared to the input ground motion,amplification effect of both funnel-shaped depression and upheaval areas(based on the shape of bedrock surface) present especially remarkable.The 2D results indicate that the PGA displays a non-monotonic decreasing with depth and a greater focusing effect of some particular layers,while the 1D results turn out that the PGA decreases with depth,except that PGA at few particular depth increases abruptly; 2) To the funnel-shaped depression areas,PGA amplification effect above 8 m depth shows relatively larger,to the upheaval areas,PGA amplification effect from 15 m to 25 m depth seems more significant.However,the regularities of the PGA amplification effect could hardly be found in the rest areas; 3) It appears a higher regression rate of PGA amplification coefficient with depth when under a smaller input motion; 4) The frequency spectral characteristic of input motion has noticeable effects on PGA amplification tendency.
基金s We acknowledged the financial support of the National Key Research and Development Program of China (2018YFB1502803), the National Natural Science Foundation of China (41475066), and Tsinghua University Initiative Sci entific Research Program (20131089357, 20131089356).
文摘Wind energy has been rapidly developed in China during the past decades and the installed capacity has been the largest in the world. In the future, utilization of wind power is still expected to carry out in China mainly with a large-scale centralized layout. Here, we examine the potential climatic impacts of large-scale windfarms associated with deployment scale in China using numerical experiments, in which four deployment scenarios were designed. These four scenarios represented relatively small- (484 GW), medium- (2165 GW) and large-scale (3490 GW and 5412 GW) installed wind power capacities, respectively. Results showed that turbulent kinetic energy, wind velocity, and air temperature varied consistently within those windfarms with the largest changes in turbine hub heights. Moreover, the above relatively large- scale windfarms could induce regional wanning with a maximum of above 0.8 °C in North China. This regional warming may be linked to an anomalous circulation pattern with a negative pressure anomaly center in Northeast China and a positive pressure anomaly center in the middle and lower reaches of the Yangtze-Huaihe River Basin.
基金Supported by the National Natural Science Foundation of China,No.81771815.
文摘BACKGROUND Large-scale functional connectivity(LSFC)patterns in the brain have unique intrinsic characteristics.Abnormal LSFC patterns have been found in patients with dementia,as well as in those with mild cognitive impairment(MCI),and these patterns predicted their cognitive performance.It has been reported that patients with type 2 diabetes mellitus(T2DM)may develop MCI that could progress to dementia.We investigated whether we could adopt LSFC patterns as discriminative features to predict the cognitive function of patients with T2DM,using connectome-based predictive modeling(CPM)and a support vector machine.AIM To investigate the utility of LSFC for predicting cognitive impairment related to T2DM more accurately and reliably.METHODS Resting-state functional magnetic resonance images were derived from 42 patients with T2DM and 24 healthy controls.Cognitive function was assessed using the Montreal Cognitive Assessment(MoCA).Patients with T2DM were divided into two groups,according to the presence(T2DM-C;n=16)or absence(T2DM-NC;n=26)of MCI.Brain regions were marked using Harvard Oxford(HOA-112),automated anatomical labeling(AAL-116),and 264-region functional(Power-264)atlases.LSFC biomarkers for predicting MoCA scores were identified using a new CPM technique.Subsequently,we used a support vector machine based on LSFC patterns for among-group differentiation.The area under the receiver operating characteristic curve determined the appearance of the classification.RESULTS CPM could predict the MoCA scores in patients with T2DM(Pearson’s correlation coefficient between predicted and actual MoCA scores,r=0.32,P=0.0066[HOA-112 atlas];r=0.32,P=0.0078[AAL-116 atlas];r=0.42,P=0.0038[Power-264 atlas]),indicating that LSFC patterns represent cognition-level measures in these patients.Positive(anti-correlated)LSFC networks based on the Power-264 atlas showed the best predictive performance;moreover,we observed new brain regions of interest associated with T2DM-related cognition.The area under the receiver operating characteristic curve values(T2DM-NC group vs.T2DM-C group)were 0.65-0.70,with LSFC matrices based on HOA-112 and Power-264 atlases having the highest value(0.70).Most discriminative and attractive LSFCs were related to the default mode network,limbic system,and basal ganglia.CONCLUSION LSFC provides neuroimaging-based information that may be useful in detecting MCI early and accurately in patients with T2DM.
基金Project of Shenzhen Science and Technology Innovation Commission(KCXFZ20201221173610028)。
文摘Large-scale atmospheric information plays an important role in the regional model for the forecasts of weather such as tropical cyclone(TC).However,it is difficult to be fully represented in regional models due to domain size and a lack of observation data,particularly at sea used in regional data assimilation.Blending analysis has been developed and implemented in regional models to reintroduce large-scale information from global model to regional analysis.Research of the impact of this large-scale blending scheme for the Global/Regional Assimilation and PrEdiction System(CMA-MESO)regional model on TC forecasting is limited and this study attempts to further progress by examining the adaptivity of the blending scheme using the two-dimensional Discrete Cosine Transform(2D-DCT)filter on the model forecast of Typhoon Haima over Shenzhen,China in 2016 and considering various cut-off wavelengths.Results showed that the error of the 24-hour typhoon track forecast can be reduced to less than 25 km by applying the scale-dependent blending scheme,indicating that the blending analysis is effectively able to minimise the large-scale bias for the initial fields.The improvement of the wind forecast is more evident for u-wind component according to the reduced root mean square errors(RMSEs)by comparing the experiments with and without blending analysis.Furthermore,the higher equitable threat score(ETS)provided implications that the precipitation prediction skills were increased in the 24h forecast by improving the representation of the large-scale feature in the CMA-MESO analysis.Furthermore,significant differences of the track error forecast were found by applying the blending analysis with different cut-off wavelengths from 400 km to 1200 km and the track error can be reduced less than by 10 km with 400 km cut-off wavelength in the first 6h forecast.It highlighted that the blending scheme with dynamic cut-off wavelengths adapted to the development of different TC systems is necessary in order to optimally introduce and ingest the large-scale information from global model to the regional model for improving the TC forecast.In this paper,the methods and data applied in this study will be firstly introduced,before discussion of the results regarding the performance of the blending analysis and its impacts on the wind and precipitation forecast correspondingly,followed by the discussion of the effects of different blending scheme on TC forecasts and the conclusion section.
基金supported by the Spanish Ministry of Economy and Competitiveness through the Spanish National Project PGC2018-099402-B-I00the Postdoctoral fellow Ramón y Cajal(RYC-2017-21978)+6 种基金the FEDER-UJA project 1380637ERDF,the Spanish Ministry of Science,Innovation and Universities through a Formación de Profesorado Universitario(FPU2019/01203)grantthe Junta de Andalucía,Andalusian Plan for Research,Development,and Innovation(POSTDOC 21-00461)the National Natural Science Foundation of China(61300167,61976120)the Natural Science Foundation of Jiangsu Province(BK20191445)the Natural Science Key Foundation of Jiangsu Education Department(21KJA510004)Qing Lan Project of Jiangsu Province。
文摘The society in the digital transformation era demands new decision schemes such as e-democracy or based on social media.Such novel decision schemes require the participation of many experts/decision makers/stakeholders in the decision processes.As a result,large-scale group decision making(LSGDM)has attracted the attention of many researchers in the last decade and many studies have been conducted in order to face the challenges associated with the topic.Therefore,this paper aims at reviewing the most relevant studies about LSGDM,identifying the most profitable research trends and analyzing them from a critical point of view.To do so,the Web of Science database has been consulted by using different searches.From these results a total of 241 contributions were found and a selection process regarding language,type of contribution and actual relation with the studied topic was then carried out.The 87 contributions finally selected for this review have been analyzed from four points of view that have been highly remarked in the topic,such as the preference structure in which decision-makers’opinions are modeled,the group decision rules used to define the decision making process,the techniques applied to verify the quality of these models and their applications to real world problems solving.Afterwards,a critical analysis of the main limitations of the existing proposals is developed.Finally,taking into account these limitations,new research lines for LSGDM are proposed and the main challenges are stressed out.
基金supported by the Program of Hainan Association for Science and Technology Plans to Youth R&D Innovation(QCXM201910)the National Natural Science Foundation of China(Nos.61702315,61802092)+1 种基金the Applied Basic Research Plan of Shanxi Province(No.2201901D211168)the Key R&D Program(International Science and Technology Cooperation Project)of Shanxi Province China(No.201903D421003).
文摘A reduction in network energy consumption and the establishment of green networks have become key scientific problems in academic and industrial research.Existing energy efficiency schemes are based on a known traffic matrix,and acquiring a real-time traffic matrix in current complex networks is difficult.Therefore,this research investigates how to reduce network energy consumption without a real-time traffic matrix.In particular,this paper proposes an intra-domain energy-efficient routing scheme based on multipath routing.It analyzes the relationship between routing availability and energy-efficient routing and integrates the two mechanisms to satisfy the requirements of availability and energy efficiency.The main research focus is as follows:(1)A link criticality model is evaluated to quantitatively measure the importance of links in a network.(2)On the basis of the link criticality model,this paper analyzes an energy-efficient routing technology based on multipath routing to achieve the goals of availability and energy efficiency simultaneously.(3)An energy-efficient routing algorithm based on multipath routing in large-scale networks is proposed.(4)The proposed method does not require a real-time traffic matrix in the network and is thus easy to apply in practice.(5)The proposed algorithm is verified in several network topologies.Experimental results show that the algorithm can not only reduce network energy consumption but can also ensure routing availability.
文摘Time-delays,due to the information transmission between subsystems,naturally exist in large-scale systems and the existence of the delay is frequently a source of instability. This paper considers the problems of robust non-fragile fuzzy control for a class of uncertain discrete nonlinear large-scale systems with time-delay and controller gain perturbations described by T-S fuzzy model. An equivalent T-S fuzzy model is represented for discrete-delay nonlinear large-scale systems. A sufficient condition for the existence of such non-fragile controllers is further derived via the Lyapunov function and the linear matrix inequality( LMI) approach. Simulation results demonstrate the feasibility and the effectiveness of the proposed design and the proper stabilization of the system in spite of controller gain variations and uncertainties.
文摘A continuous-time fuzzy large-scale system F consists of some interconnected Takagi-Sugeno fuzzy subsystems. Two sufficient conditions for the asymptotic stability of this system (namely, theorem 1 and theorem 2) are derived via a multiple Lyapunov function approach. In theorem 1, the information of membership functions of fuzzy rules should be known in order to analyze the stability of F. But in general this information is not easy to be acquired for their time-varying property. So theorem 2 is provided to judge the asymptotic stability of F, based on which there is no need to know the information of membership functions in stability analysis. Finally, a numerical example is given to show the utility of the method proposed in this paper.
文摘Social media data created a paradigm shift in assessing situational awareness during a natural disaster or emergencies such as wildfire, hurricane, tropical storm etc. Twitter as an emerging data source is an effective and innovative digital platform to observe trend from social media users’ perspective who are direct or indirect witnesses of the calamitous event. This paper aims to collect and analyze twitter data related to the recent wildfire in California to perform a trend analysis by classifying firsthand and credible information from Twitter users. This work investigates tweets on the recent wildfire in California and classifies them based on witnesses into two types: 1) direct witnesses and 2) indirect witnesses. The collected and analyzed information can be useful for law enforcement agencies and humanitarian organizations for communication and verification of the situational awareness during wildfire hazards. Trend analysis is an aggregated approach that includes sentimental analysis and topic modeling performed through domain-expert manual annotation and machine learning. Trend analysis ultimately builds a fine-grained analysis to assess evacuation routes and provide valuable information to the firsthand emergency responders<span style="font-family:Verdana;">.</span>