Prunus serotina and Robinia pseudoacacia are the most widespread invasive trees in Central Europe. In addition, according to climate models, decreased growth of many economically and ecologically important native trees will likely be observed in the future. We aimed to assess the impact of these two neophytes, which differ in the biomass range and nitrogen-fixing abilities observed in Central European conditions, on the relative aboveground biomass increments of the native oaks Quercus robur and Q. petraea and Scots pine Pinus sylvestris. We aimed to increase our understanding of the relationship between facilitation and competition between woody alien species and overstory native trees. We established 72 circular plots (0.05 ha) in two different forest habitat types and stands varying in age in western Poland. We chose plots with different abundances of the studied neophytes to determine how effects scaled along the quantitative invasion gradient. Furthermore, we collected growth cores of the studied native species and calculated aboveground biomass increments at the tree and stand levels. Then, we used generalized linear mixed-effects models to assess the impact of invasive species abundance on the relative aboveground biomass increments of the native tree species. We did not find a biologically or statistically significant impact of invasive R. pseudoacacia or P. serotina on the relative aboveground biomass increments of native oaks and pines, either along the quantitative gradient of invader biomass or in relation to the proportion of total stand biomass accounted for by the invaders. The neophytes neither acted as growth stimulators for the native trees nor competed with them for resources, which would have escalated the negative impact of climate change on pines and oaks. The neophytes should therefore not significantly modify the carbon sequestration capacity of the native species. Our work combines elements of the per capita effect of invasion with research on mixed forest management.
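As a minimal sketch of the kind of mixed-effects fit described above, the snippet below fits a linear mixed model with a random plot intercept; the column names (rel_increment, invader_biomass, plot) and the toy values are assumptions for illustration, not the study's actual variables or data.

```python
# Sketch of a mixed-effects fit relating relative biomass increment to invader
# abundance; column names and values are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "rel_increment":   [0.021, 0.018, 0.025, 0.030, 0.027, 0.019, 0.022, 0.024],
    "invader_biomass": [0.0, 5.2, 12.7, 0.0, 3.1, 20.4, 8.8, 1.5],  # hypothetical t/ha of the neophyte
    "plot":            ["p1", "p1", "p2", "p2", "p3", "p3", "p4", "p4"],
})

# Fixed effect of invader biomass, random intercept per plot.
model = smf.mixedlm("rel_increment ~ invader_biomass", df, groups=df["plot"])
result = model.fit()
print(result.summary())
```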
The scale and complexity of big data are growing continuously, posing severe challenges to traditional data processing methods, especially in the field of clustering analysis. To address this issue, this paper introduces a new method named Big Data Tensor Multi-Cluster Distributed Incremental Update (BDTMCDIncreUpdate), which combines distributed computing, storage technology, and incremental update techniques to provide an efficient and effective means for clustering analysis. Firstly, the original dataset is divided into multiple sub-blocks, and distributed computing resources are utilized to process the sub-blocks in parallel, enhancing efficiency. Then, initial clustering is performed on each sub-block using tensor-based multi-clustering techniques to obtain preliminary results. When new data arrives, incremental update technology is employed to update the core tensor and factor matrix, ensuring that the clustering model can adapt to changes in data. Finally, by combining the updated core tensor and factor matrix with historical computational results, refined clustering results are obtained, achieving real-time adaptation to dynamic data. Through experimental simulation on the Aminer dataset, the BDTMCDIncreUpdate method has demonstrated outstanding performance in terms of accuracy (ACC) and normalized mutual information (NMI) metrics, achieving an accuracy rate of 90% and an NMI score of 0.85, which outperforms existing methods such as TClusInitUpdate and TKLClusUpdate in most scenarios. Therefore, the BDTMCDIncreUpdate method offers an innovative solution to the field of big data analysis, integrating distributed computing, incremental updates, and tensor-based multi-clustering techniques. It not only improves efficiency and scalability in processing large-scale, high-dimensional datasets but also has been validated for effectiveness and accuracy through experiments. This method shows great potential in real-world applications where dynamic data growth is common, and it is of significant importance for advancing the development of data analysis technology.
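The abstract does not give the tensor update equations, but the overall block-parallel-then-incremental workflow can be illustrated with a much simpler stand-in: mini-batch k-means applied per data block and then updated as new data arrives. This is only a structural sketch under that assumption, not the paper's tensor-based method, and all names and parameters are invented.

```python
# Structural sketch: block-parallel initial clustering followed by incremental
# updates on newly arriving data. MiniBatchKMeans stands in for the paper's
# tensor-based multi-clustering; all parameters are illustrative.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(0)
blocks = [rng.normal(size=(500, 16)) for _ in range(4)]   # initial data split into sub-blocks

model = MiniBatchKMeans(n_clusters=5, random_state=0)
for block in blocks:          # in a real deployment each block would be handled by its own worker
    model.partial_fit(block)  # incremental update of the cluster model

# Data arriving later refines the same model instead of recomputing from scratch.
new_batch = rng.normal(size=(200, 16))
model.partial_fit(new_batch)
print(model.cluster_centers_.shape)   # (5, 16)
```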
Time-Sensitive Network (TSN) technology with deterministic transmission capability is increasingly used in many emerging fields. It mainly guarantees the Quality of Service (QoS) of applications with strict requirements on time and security. One of the core features of TSN is traffic scheduling with bounded low delay in the network. However, traffic scheduling schemes in TSN are usually synthesized offline and lack dynamism. To implement incremental scheduling of newly arrived traffic in TSN, we propose a Dynamic Response Incremental Scheduling (DR-IS) method for time-sensitive traffic and deploy it on a software-defined time-sensitive network architecture. Under the premise of meeting the traffic scheduling requirements, we adopt two modes, traffic shift and traffic exchange, to dynamically adjust the time slot injection position of the traffic in the original scheme, and determine the sending offset time of the new time-sensitive traffic to minimize the global traffic transmission jitter. The evaluation results show that the DR-IS method can effectively control the large increase of traffic transmission jitter in incremental scheduling without affecting the transmission delay, thus realizing the dynamic incremental scheduling of time-sensitive traffic in TSN.
The instability of plasma waves in the channel of field-effect transistors gives rise to electromagnetic waves at THz frequencies. Based on a self-consistent quantum hydrodynamic model, the instability of THz plasma waves in the channel of graphene field-effect transistors has been investigated with an external magnetic field and quantum effects included. We numerically analyzed the influence of weak magnetic fields, quantum effects, device size, and temperature on the instability of plasma waves under asymmetric boundary conditions. The results show that the magnetic field, quantum effects, and the thickness of the dielectric layer between the gate and the channel can increase the radiation frequency. Additionally, we observed that an increase in temperature leads to a decrease in both the oscillation frequency and the instability increment. The numerical results and accompanying images obtained from our simulations support these conclusions.
This study used a three-dimensional numerical model of a proton exchange membrane fuel cell with five types of channels: a smooth channel (Case 1) and channels with eight rectangular baffles arranged in the upstream (Case 2), midstream (Case 3), downstream (Case 4), and the entire cathode flow channel (Case 5), to study the effects of baffle position on mass transport, power density, net power, etc. Moreover, the effects of back pressure and humidity on the voltage were investigated. Results showed that, compared to the smooth channel, adding baffles increased oxygen and water transport at the diffusion layer-channel interface by 11.53%-20.60% and 7.81%-9.80%, respectively, at 1.68 A·cm^(-2). The closer the baffles were to the upstream, the higher the total oxygen flux, but the lower the flux uniformity and the worse the water removal. The oxygen flux with upstream baffles was 8.14% higher than that with downstream baffles, but oxygen flux uniformity decreased by 18.96% at 1.68 A·cm^(-2). The order of water removal and voltage improvement was Case 4 > Case 5 > Case 3 > Case 2 > Case 1. The net power of Case 4 was 9.87% higher than that of the smooth channel. For Case 4, the voltage increments were higher when the cell worked under low back pressure or high humidity. The potential increment at a back pressure of 0 atm was 0.9% higher than that at 2 atm (1 atm = 101.325 kPa). The potential increment at a humidity of 100% was 7.89% higher than that at 50%.
Currently, distributed routing protocols are constrained by offering a single path between any pair of nodes, thereby limiting the potential throughput and overall network performance. This approach not only restricts the flow of data but also makes the network susceptible to failures in case the primary path is disrupted. In contrast, routing protocols that leverage multiple paths within the network offer a more resilient and efficient solution. Multipath routing, as a fundamental concept, surpasses the limitations of traditional shortest path first protocols. It not only redirects traffic to unused resources, effectively mitigating network congestion, but also ensures load balancing across the network. This optimization significantly improves network utilization and boosts the overall performance, making it a widely recognized efficient method for enhancing network reliability. To further strengthen network resilience against failures, we introduce a routing scheme known as Multiple Nodes with at least Two Choices (MNTC). This innovative approach aims to significantly enhance network availability by providing each node with at least two routing choices. By doing so, it not only reduces the dependency on a single path but also creates redundant paths that can be utilized in case of failures, thereby enhancing the overall resilience of the network. To ensure the optimal placement of nodes, we propose three incremental deployment algorithms. These algorithms carefully select the most suitable set of nodes for deployment, taking into account various factors such as node connectivity, traffic patterns, and network topology. By deploying MNTC on a carefully chosen set of nodes, we can significantly enhance network reliability without the need for a complete overhaul of the existing infrastructure. We have conducted extensive evaluations of MNTC in diverse topological spaces, demonstrating its effectiveness in maintaining high network availability with minimal path stretch. The results are impressive, showing that even when implemented on just 60% of nodes, our incremental deployment method significantly boosts network availability. This underscores the potential of MNTC in enhancing network resilience and performance, making it a viable solution for modern networks facing increasing demands and complexities. The algorithms OSPF, TBFH, DC and LFC perform fast rerouting based on strict conditions, while MNTC is not restricted by these conditions. In five real network topologies, the average network availability of MNTC is improved by 14.68%, 6.28%, 4.76% and 2.84%, respectively, compared with OSPF, TBFH, DC and LFC.
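MNTC's exact selection rules are not given in the abstract; the sketch below only illustrates the generic idea of equipping each node with a primary next hop plus a loop-free alternate toward a destination. The networkx library, the tiny topology, and the classic loop-free-alternate condition are assumptions for illustration, not the MNTC algorithm itself.

```python
# Generic illustration of giving each node two next-hop choices toward a
# destination: the shortest-path next hop plus any loop-free alternates (LFA).
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 1), ("B", "C", 1), ("A", "D", 2), ("D", "C", 1), ("B", "D", 1),
])
dist = dict(nx.all_pairs_dijkstra_path_length(G))
dest = "C"

for node in G.nodes:
    if node == dest:
        continue
    primary = nx.shortest_path(G, node, dest, weight="weight")[1]
    # A neighbor is a loop-free alternate if it can reach dest without looping back.
    alternates = [n for n in G.neighbors(node)
                  if n != primary and dist[n][dest] < dist[n][node] + dist[node][dest]]
    print(node, "primary:", primary, "alternates:", alternates)
```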
Multispecies forests have received increased scientific attention, driven by the hypothesis that biodiversity improves ecological resilience. However, a greater species diversity presents challenges for forest management and research. Our study aims to develop basal area growth models for tree species cohorts. The analysis is based on a dataset of 423 permanent plots (2,500 m^(2)) located in temperate forests in Durango, Mexico. First, we define tree species cohorts based on individual and neighborhood-based variables using a combination of principal component and cluster analyses. Then, we estimate the basal area increment of each cohort through a generalized additive model to describe the effects of tree size, competition, stand density and site quality. The principal component and cluster analyses assign a total of 37 tree species to eight cohorts that differ primarily with regard to the distribution of tree size and vertical position within the community. The generalized additive models provide satisfactory estimates of tree growth for the species cohorts, explaining between 19 and 53 percent of the total variation of basal area increment, and highlight the following results: i) most cohorts show a "rise-and-fall" effect of tree size on tree growth; ii) surprisingly, the competition index "basal area of larger trees" showed a positive effect in four of the eight cohorts; iii) stand density had a negative effect on basal area increment, though the effect was minor in medium- and high-density stands; and iv) basal area growth was positively correlated with site quality except for an oak cohort. The developed species cohorts and growth models provide insight into their particular ecological features and growth patterns and may support the development of sustainable management strategies for temperate multispecies forests.
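A compressed illustration of the two-stage workflow (cohort definition, then a growth model per cohort) is given below. The feature names and synthetic data are invented, and a penalized spline pipeline is used as a stand-in for the generalized additive model rather than the study's actual specification.

```python
# Sketch: (1) PCA + clustering to form species cohorts, (2) a spline-based
# regression of basal area increment per cohort (GAM-like stand-in).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
# hypothetical columns: tree diameter, basal area of larger trees, stand density, site index
X = rng.normal(size=(400, 4))
bai = 0.5 + 0.3 * X[:, 0] - 0.1 * X[:, 0] ** 2 + rng.normal(scale=0.1, size=400)

cohorts = KMeans(n_clusters=8, random_state=1).fit_predict(PCA(n_components=2).fit_transform(X))

for c in range(8):                       # one growth model per cohort
    mask = cohorts == c
    gam_like = make_pipeline(SplineTransformer(n_knots=5), Ridge(alpha=1.0))
    gam_like.fit(X[mask], bai[mask])
    print(c, round(gam_like.score(X[mask], bai[mask]), 2))   # within-cohort R^2
```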
Humans are experiencing the inclusion of artificial agents in their lives, such as unmanned vehicles, service robots, voice assistants, and intelligent medical care. If artificial agents cannot align with social values or make ethical decisions, they may not meet the expectations of humans. Traditionally, an ethical decision-making framework is constructed by rule-based or statistical approaches. In this paper, we propose an ethical decision-making framework based on incremental ILP (Inductive Logic Programming), which can overcome the brittleness of rule-based approaches and the limited interpretability of statistical approaches. As current incremental ILP makes it difficult to resolve conflicts, we propose a novel ethical decision-making framework that considers conflicts and adopts our proposed incremental ILP system. The framework consists of two processes: the learning process and the deduction process. The first process records bottom clauses with their score functions and learns rules guided by the entailment and the score function. The second process obtains an ethical decision based on the rules. In an ethical scenario about chatbots for teenagers' mental health, we verify that our framework can learn ethical rules and make ethical decisions. Besides, we extract the incremental ILP system from the framework and compare it with state-of-the-art ILP systems based on ASP (Answer Set Programming), focusing on conflict resolution. The comparison results show that our proposed system can generate better-quality rules than most other systems.
The visions of Industry 4.0 and 5.0 have reinforced the industrial environment and have incorporated artificial intelligence as a major facilitator. Diagnosing machine faults has become a solid foundation for automatically recognizing machine failure, so that timely maintenance can ensure safe operations. Transfer learning is a promising solution that can enhance a machine fault diagnosis model by borrowing pre-trained knowledge from a source model and applying it to the target model, which typically involves two datasets. In response to the availability of multiple datasets, this paper proposes selective and adaptive incremental transfer learning (SA-ITL), which fuses three algorithms, namely, the hybrid selective algorithm, the transferability enhancement algorithm, and the incremental transfer learning algorithm. It is a selective algorithm that enables selecting and ordering appropriate datasets for transfer learning and selecting useful knowledge to avoid negative transfer. The algorithm also adaptively adjusts the portion of training data to balance the learning rate and training time. The proposed algorithm is evaluated and analyzed using ten benchmark datasets. Compared with other algorithms from existing works, SA-ITL improves the accuracy on all datasets. Ablation studies present the accuracy enhancements of the SA-ITL, including the hybrid selective algorithm (1.22%-3.82%), transferability enhancement algorithm (1.91%-4.15%), and incremental transfer learning algorithm (0.605%-2.68%). They also show the benefits of enhancing the target model with heterogeneous image datasets that widen the range of domain selection between source and target domains.
We investigated the parametric optimization of incremental sheet forming of stainless steel using Grey Relational Analysis (GRA) coupled with Principal Component Analysis (PCA). AISI 316L stainless steel sheets were used to develop a double-wall-angle pyramid with the aid of a tungsten carbide tool. GRA coupled with PCA was used to plan the experiment conditions. The effects of control factors such as Tool Diameter (TD), Step Depth (SD), Bottom Wall Angle (BWA), Feed Rate (FR) and Spindle Speed (SS) on Top Wall Angle (TWA) and Top Wall Angle Surface Roughness (TWASR) have been studied. Wall angle increases with increasing tool diameter due to the large contact area between tool and workpiece. As the step depth, feed rate and spindle speed increase, TWASR decreases with increasing tool diameter. As the step depth increases, the hydrostatic stress rises, causing severe cracks in the deformed surface. Hence it was concluded that the proposed hybrid method is suitable for optimizing the factors and responses.
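The grey relational grade itself is a short computation. The sketch below shows one common formulation (normalize the responses, compute grey relational coefficients, weight them, here with PCA-derived weights); the response values are made up and the weighting scheme is one reasonable choice, not necessarily the exact one used in the study.

```python
# Sketch of Grey Relational Analysis with PCA-derived weights.
# Responses (rows = experiments): TWA (larger-is-better), TWASR (smaller-is-better).
# All numbers are illustrative, not measured values from the study.
import numpy as np

resp = np.array([[58.0, 1.8],
                 [60.5, 1.5],
                 [62.1, 2.2],
                 [59.4, 1.3]])

# Normalize: larger-is-better for column 0, smaller-is-better for column 1.
norm = np.empty_like(resp)
norm[:, 0] = (resp[:, 0] - resp[:, 0].min()) / np.ptp(resp[:, 0])
norm[:, 1] = (resp[:, 1].max() - resp[:, 1]) / np.ptp(resp[:, 1])

# Grey relational coefficients against the ideal sequence (all ones).
delta = np.abs(1.0 - norm)
zeta = 0.5                                    # distinguishing coefficient
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Weights from PCA on the coefficients (squared loadings of the first component).
eigvals, eigvecs = np.linalg.eigh(np.cov(grc, rowvar=False))
w = eigvecs[:, -1] ** 2
w /= w.sum()

grg = grc @ w                                 # grey relational grade per experiment
print(np.argsort(grg)[::-1])                  # ranking, best experiment first
```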
Hyperspectral images typically have high spectral resolution but low spatial resolution, which impacts the reliability and accuracy of subsequent applications, for example, remote sensing classification and mineral identification. But in traditional methods via deep convolution neural networks, indiscriminately extracting and fusing spectral and spatial features makes it challenging to utilize the differentiated information across adjacent spectral channels. Thus, we proposed a multi-branch interleaved iterative upsampling hyperspectral image super-resolution reconstruction network (MIIUSR) to address the above problems. We reinforce spatial feature extraction by integrating detailed features from different receptive fields across adjacent channels. Furthermore, we propose an interleaved iterative upsampling process during the reconstruction stage, which progressively fuses incremental information among adjacent frequency bands. Additionally, we add two parallel three-dimensional (3D) feature extraction branches to the backbone network to extract spectral and spatial features of varying granularity. We further enhance the backbone network's construction results by leveraging the difference between two-dimensional (2D) channel-grouping spatial features and 3D multi-granularity features. The results obtained by applying the proposed network model to the CAVE test set show that, at a scaling factor of ×4, the peak signal-to-noise ratio, spectral angle mapping, and structural similarity are 37.310 dB, 3.525 and 0.9438, respectively. Besides, extensive experiments conducted on the Harvard and Foster datasets demonstrate the superior potential of the proposed model in hyperspectral super-resolution reconstruction.
To improve the prediction accuracy of chaotic time series and reconstruct a more reasonable phase space structure of the prediction network, we propose a convolutional neural network-long short-term memory (CNN-LSTM) prediction model based on the incremental attention mechanism. Firstly, a traversal search is conducted through the traversal layer for finite parameters in the phase space. Then, an incremental attention layer is utilized for parameter judgment based on the dimension weight criteria (DWC). The phase space parameters that best meet the DWC are selected and fed into the input layer. Finally, the constructed CNN-LSTM network extracts spatio-temporal features and provides the final prediction results. The model is verified using the Logistic, Lorenz, and sunspot chaotic time series, and the performance is compared along two dimensions: prediction accuracy and network phase space structure. Additionally, the CNN-LSTM network based on incremental attention is compared with long short-term memory (LSTM), convolutional neural network (CNN), recurrent neural network (RNN), and support vector regression (SVR) models for prediction accuracy. The experiment results indicate that the proposed composite network model possesses enhanced capability in extracting temporal features and achieves higher prediction accuracy. Also, the algorithm for estimating the phase space parameters is compared with the traditional CAO, false nearest neighbor, and C-C methods, three typical approaches for determining chaotic phase space parameters. The experiments reveal that the phase space parameter estimation algorithm based on the incremental attention mechanism is superior in prediction accuracy to the traditional phase space reconstruction methods across five networks, including CNN-LSTM, LSTM, CNN, RNN, and SVR.
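A bare-bones CNN-LSTM forecaster of the general shape described above is sketched below in PyTorch, trained on a logistic-map series; the traversal and incremental attention layers are omitted, and the layer sizes, window length, and training settings are arbitrary illustrative choices rather than the paper's configuration.

```python
# Minimal CNN-LSTM one-step-ahead forecaster; a simplified stand-in for the
# model in the abstract (no traversal or incremental attention layers).
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, channels=16, hidden=32):
        super().__init__()
        self.conv = nn.Sequential(nn.Conv1d(1, channels, kernel_size=3, padding=1), nn.ReLU())
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                        # x: (batch, window)
        h = self.conv(x.unsqueeze(1))            # (batch, channels, window)
        h, _ = self.lstm(h.transpose(1, 2))      # (batch, window, hidden)
        return self.head(h[:, -1])               # next-value prediction

# Toy data: logistic map, one-step-ahead forecasting with a sliding window.
r, n, window = 3.9, 2000, 32
series = [0.5]
for _ in range(n - 1):
    series.append(r * series[-1] * (1 - series[-1]))
series = torch.tensor(series)
X = torch.stack([series[i:i + window] for i in range(n - window - 1)])
y = series[window:n - 1].unsqueeze(1)

model = CNNLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(20):                              # short demonstration training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print(float(loss))
```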
With the continuous advancement of China's "peak carbon dioxide emissions and carbon neutrality" process, the proportion of wind power is increasing. In the current research, aiming at the problem that forecasting models become outdated due to the continuous updating of wind power data, a short-term wind power forecasting algorithm based on Incremental Learning-Bagging Deep Hybrid Kernel Extreme Learning Machine (IL-Bagging-DHKELM) with error affinity propagation cluster analysis is proposed. The algorithm effectively combines the deep hybrid kernel extreme learning machine (DHKELM) with incremental learning (IL). Firstly, an initial wind power prediction model is trained using the Bagging-DHKELM model. Secondly, the Euclidean morphological distance affinity propagation (AP) clustering algorithm is used to cluster and analyze the wind power prediction errors obtained from the initially trained model. Finally, the correlation between wind power prediction errors and Numerical Weather Prediction (NWP) data is introduced as incremental updates to the initial wind power prediction model. During the incremental learning process, multiple error performance indicators are used to measure the overall model performance, thereby enabling incremental updates of the wind power model. Practical examples show that the method proposed in this article reduces the root mean square error of the initial model by 1.9 percentage points, indicating that it can better adapt to the current scenario of continuously increasing wind power penetration. The accuracy and precision of wind power generation prediction are effectively improved through this method.
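The kernel extreme learning machine at the core of DHKELM has a closed-form solution; a plain single-kernel version is sketched below with generic symbols. The deep/hybrid-kernel structure, bagging, AP error clustering, and incremental stages from the abstract are omitted, and the data are synthetic.

```python
# Plain kernel extreme learning machine (single RBF kernel) as a stand-in for
# the DHKELM building block; data and features are synthetic placeholders.
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_fit(X, y, C=10.0, gamma=0.5):
    K = rbf_kernel(X, X, gamma)
    # Standard regularized KELM solution: beta = (K + I/C)^{-1} y
    return np.linalg.solve(K + np.eye(len(X)) / C, y)

def kelm_predict(X_train, beta, X_new, gamma=0.5):
    return rbf_kernel(X_new, X_train, gamma) @ beta

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))              # e.g. NWP-style wind speed/direction features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # stand-in for wind power output

beta = kelm_fit(X, y)
print(kelm_predict(X, beta, X[:5]).round(2), y[:5].round(2))
```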
Surrounding rocks at different locations are generally subjected to different stress paths during the process of deep hard rock excavation. In this study, to reveal the mechanical parameters of deep surrounding rock under different stress paths, a new cyclic loading and unloading test method with controlled true triaxial loading and unloading and principal stress direction interchange was proposed, and the evolution of the mechanical parameters of Shuangjiangkou granite under different stress paths was studied, including the deformation modulus, elastic deformation increment ratios, fracture degree, cohesion and internal friction angle. Additionally, a stress path coefficient was defined to characterize different stress paths, and the functional relationships among the stress path coefficient, rock fracture degree difference coefficient, cohesion and internal friction angle were obtained. The results show that during the true triaxial cyclic loading and unloading process, the deformation modulus and cohesion gradually decrease, while the internal friction angle gradually increases with increasing equivalent crack strain. The stress path coefficient is exponentially related to the rock fracture degree difference coefficient. As the stress path coefficient increases, the degrees of cohesion weakening and internal friction angle strengthening decrease linearly. During cyclic loading and unloading under true triaxial principal stress direction interchange, the direction of crack development changes and the deformation modulus increases, while the cohesion and internal friction angle decrease slightly, indicating that the principal stress direction interchange has a strengthening effect on the surrounding rocks. Finally, the influence of the principal stress direction interchange on the stability of deep excavation projects is discussed.
Since its inception, the Internet has been rapidly evolving. With the advancement of science and technology and the explosive growth of the population, the demand for the Internet has been on the rise. Many applications in education, healthcare, entertainment, science, and more are being increasingly deployed based on the Internet. Concurrently, malicious threats on the Internet are on the rise as well. Distributed Denial of Service (DDoS) attacks are among the most common and dangerous threats on the Internet today, and their scale and complexity are constantly growing. Intrusion Detection Systems (IDS) have been deployed and have demonstrated their effectiveness in defense against those threats. In addition, research on Machine Learning (ML) and Deep Learning (DL) in IDS has achieved effective results and gained significant attention. However, one of the challenges when applying ML and DL techniques in intrusion detection is the identification of unknown attacks. These attacks, which are not encountered during the system's training, can lead to misclassification with significant errors. In this research, we focused on addressing the issue of unknown attack detection by combining two methods: Spatial Location Constraint Prototype Loss (SLCPL) and Fuzzy C-Means (FCM). With the proposed method, we achieved promising results compared to traditional methods. The proposed method demonstrates a very high accuracy of up to 99.8% with a low false positive rate for known attacks on the Intrusion Detection Evaluation Dataset (CICIDS2017). Particularly, the accuracy is also very high, reaching 99.7%, and the precision goes up to 99.9% for unknown DDoS attacks on the DDoS Evaluation Dataset (CICDDoS2019). The success of the proposed method is due to the combination of SLCPL, an advanced Open-Set Recognition (OSR) technique, and FCM, a traditional yet highly applicable clustering technique. This has yielded a novel method in the field of unknown attack detection and further expands the trend of applying DL and ML techniques in the development of intrusion detection systems and cybersecurity. Finally, implementing the proposed method in real-world systems can enhance the security capabilities against increasingly complex threats on computer networks.
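Fuzzy C-Means, one half of the proposed combination, is compact enough to show in full. The implementation below is a generic FCM (fuzzifier m = 2) applied to synthetic feature vectors; it is not the paper's SLCPL-augmented pipeline, and all data are made up.

```python
# Generic Fuzzy C-Means clustering; the SLCPL open-set component from the
# abstract is not included here.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                        # random initial memberships
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]       # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))                        # membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=mu, size=(100, 4)) for mu in (0.0, 3.0, 6.0)])
centers, U = fuzzy_c_means(X, c=3)
print(U.argmax(axis=1)[:10])          # hard labels derived from the fuzzy memberships
```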
It is well known that Diabetes Specific Nutritional Supplements (DSNSs) are linked to improved glycemic control in individuals with diabetes. However, data on the efficacy of DSNSs in prediabetics is limited. This was a two-armed, open-labelled, randomized controlled six-week study of 199 prediabetics [30-65 years; Glycosylated Hemoglobin (HbA1c) 5.7%-6.4% and/or Fasting Blood Glucose (FBG) 100-125 mg/dl]. Two parallel phases were conducted: an Acute Blood Glucose Response (ABGR) phase and an intervention phase. Prediabetic participants were randomized into test (n = 100) and control (n = 99) groups. The primary objective was to assess the ABGR of the DSNS versus an isocaloric snack, measured by incremental Area under the Curve (iAUC). The test and control groups received 60 g of DSNS and 56 g of an isocaloric snack (cornflakes), respectively, both in 250 ml double-toned milk, on visit days 1, 15, 29 and 43. Postprandial Blood Glucose (PPG) was estimated at 30, 60, 90, 120, 150 and 180 minutes. During the 4-week intervention phase, the test group received DSNS with lifestyle counselling (DSNS + LC) and was compared with the control receiving lifestyle counselling alone (LC alone). Impact was studied on FBG, HbA1c, anthropometry, body composition, blood pressure, nutrient intake, and physical activity. The impact of DSNS was also studied using CGM between two 14-day phases: CGM1 baseline (days 1-14) and CGM2 endline (days 28-42). DSNS showed significantly lower PPG versus the isocaloric snack at 30 (p 12, and chromium were reported by DSNS + LC versus LC alone. No other significant changes were reported between groups. It may be concluded that DSNS may be considered as a snack for prediabetic or hyperglycemic individuals requiring nutritional support for improved glycemic control.
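The incremental Area Under the Curve used as the primary endpoint is a standard calculation: the trapezoidal area of glucose rises above the baseline (time-0) value. One common formulation is sketched below with made-up readings; variants exist (e.g., in how dips below baseline are handled), so this is only a generic illustration.

```python
# Incremental area under the curve (iAUC) by the trapezoidal rule, counting
# only the area above the baseline glucose value. Readings are hypothetical.
import numpy as np

def iauc(times_min, glucose_mg_dl):
    t = np.asarray(times_min, dtype=float)
    g = np.asarray(glucose_mg_dl, dtype=float)
    incr = np.clip(g - g[0], 0.0, None)           # only rises above the fasting baseline count
    return float(np.sum((incr[1:] + incr[:-1]) / 2.0 * np.diff(t)))

times = [0, 30, 60, 90, 120, 150, 180]
ppg = [98, 142, 155, 138, 120, 108, 101]          # hypothetical postprandial readings, mg/dl
print(round(iauc(times, ppg), 1), "mg*min/dl")
```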
The integration of set-valued ordered rough set models and incremental learning signifies a progressive advancement of conventional rough set theory, with the objective of tackling the heterogeneity and ongoing transformations in information systems. In set-valued ordered decision systems, when changes occur in the attribute value domain, such as adding conditional values, the preference relation between objects may change, indirectly leading to changes in approximations. In this paper, we effectively addressed the issue of updating approximations that arises from adding conditional values in set-valued ordered decision systems. Firstly, we classified the research objects into two categories: objects with changes in conditional values and objects without changes, and then conducted theoretical studies on updating approximations for these two categories, presenting approximation update theories for adding conditional values. Subsequently, we presented incremental algorithms corresponding to the approximation update theories. We demonstrated the feasibility of the proposed incremental update method with numerical examples and showed that our incremental algorithm outperformed the static algorithm. Ultimately, by comparing experimental results on different datasets, it is evident that the incremental algorithm efficiently reduced processing time. In conclusion, this study offered a promising strategy to address the challenges of set-valued ordered decision systems in dynamic environments.
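To make the rough-set vocabulary concrete, the toy sketch below computes dominance-based lower and upper approximations of an upward union of decision classes for a tiny set-valued table. The data and the dominance convention used (value-set inclusion on every condition attribute) are illustrative assumptions only and may differ from the paper's exact definitions.

```python
# Toy dominance-based approximations for a set-valued ordered table; the data
# and the inclusion-based dominance rule are assumptions for illustration.
objects = {
    "x1": {"skill": {1},       "exp": {1, 2}, "d": 1},
    "x2": {"skill": {1, 2},    "exp": {2},    "d": 2},
    "x3": {"skill": {2},       "exp": {2, 3}, "d": 2},
    "x4": {"skill": {1, 2, 3}, "exp": {2, 3}, "d": 3},
    "x5": {"skill": {1, 2},    "exp": {2, 3}, "d": 1},
}
attrs = ["skill", "exp"]

def dominates(x, y):                      # x is at least as good as y on every attribute
    return all(objects[y][a] <= objects[x][a] for a in attrs)

def dominating_set(y):                    # D+(y): objects dominating y
    return {x for x in objects if dominates(x, y)}

def dominated_set(y):                     # D-(y): objects dominated by y
    return {x for x in objects if dominates(y, x)}

t = 2
cl_up = {x for x in objects if objects[x]["d"] >= t}            # upward union: decision >= t
lower = {x for x in objects if dominating_set(x) <= cl_up}      # certain members
upper = {x for x in objects if dominated_set(x) & cl_up}        # possible members
print("lower:", sorted(lower))            # -> ['x4']
print("upper:", sorted(upper))            # -> ['x2', 'x3', 'x4', 'x5']
```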
Quinta Monroy is an award-winning co-designed settlement for 93 families on half a hectare of land at Iquique in northern Chile. Neighbors' complaints about the disorderly settlement peaked after the landowner's death and provoked untenured residents to seek government subsidies to redevelop the settlement. From 2003, a government social housing project was coordinated by the "Elemental" architecture firm with US$10,000 per household. With the residents' temporary relocation, 93 modular and interlinked apartments were built around a series of courtyards. These apartments, which were designed as "half-houses," were subsequently co-opted by residents adding rooms in locations planned in advance by Elemental. Many households have since doubled the size of their apartment and reformed the settlement in ways not anticipated by Elemental. This paper details a spatial and ethnographic study of the Quinta Monroy settlement since redevelopment to identify opportunities and risks that accompany this type of social housing model. The study reveals evidence that residents' capacities to enlarge apartments commonly exceed the architects' expectations and that unregulated expansions often compromise the settlement's livability. This research anticipates further opportunities for expansion in this semi-regulated settlement and investigates the possibility that another contested slum settlement may emerge.
The objective was to examine the effects of optimal (leaf nitrogen levels > 2.0%) and suboptimal (< 2.0%) nitrogen nutrition on the net photosynthetic rate, stem diameter increment, height growth increment and acorn mass of pedunculate oak during 2010, in the absence of drought stress, and during 2011, under the impact of moderate drought stress. According to the results, moderate drought stress significantly reduced the net photosynthetic rate, stem diameter increment and height growth increment, while acorn mass was not affected. Suboptimal nitrogen nutrition significantly reduced the net photosynthetic rate and stem diameter increment only in the wet year, and acorn mass in both the wet and dry years, while height growth increment was not significantly reduced by suboptimal nitrogen nutrition in either year. The results indicate that optimal nitrogen levels can stimulate the photosynthetic rate and stem diameter increment of pedunculate oak only in the absence of moderate drought stress. Moreover, the results show that moderate drought stress is a more dominant stressor for the photosynthesis and growth of pedunculate oak than suboptimal nitrogen nutrition, while for acorn development, suboptimal nitrogen nutrition is the more dominant stressor.
The meridional gradient of surface air temperature associated with "Warm Arctic-Cold Eurasia" (GradTAE) is closely related to climate anomalies and weather extremes in the mid-low latitudes. However, the Climate Forecast System Version 2 (CFSv2) shows poor capability for GradTAE prediction. Based on the year-to-year increment approach, analysis using a hybrid seasonal prediction model for GradTAE in winter (HMAE) is conducted with observed September sea ice over the Barents-Kara Sea, October sea surface temperature over the North Atlantic, September soil moisture in southern North America, and CFSv2-forecasted winter sea ice over Baffin Bay, Davis Strait, and the Labrador Sea. HMAE demonstrates good capability for predicting GradTAE, with a significant correlation coefficient of 0.84, and the percentage of the same sign is 88% in cross-validation during 1983-2015. HMAE also maintains high accuracy and robustness during independent predictions for 2016-20. Meanwhile, HMAE predicts the GradTAE in 2021 well in an experiment of routine operation. Moreover, the well-predicted GradTAE is useful for predicting the large-scale pattern of "Warm Arctic-Cold Eurasia" and has the potential to enhance the prediction skill for surface air temperature in eastern China.
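The year-to-year increment approach underlying HMAE predicts the change from the previous year and adds it back to last year's observation. A generic linear version is sketched below on synthetic data; the predictor names and the regression form are illustrative assumptions, not the HMAE specification.

```python
# Generic year-to-year increment prediction: fit the year-on-year change of the
# target on year-on-year changes of the predictors, then add the predicted
# increment to last year's observed value. Data and names are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
years = np.arange(1983, 2021)
preds = rng.normal(size=(len(years), 3))        # e.g. sea ice, SST, soil moisture indices
target = preds @ np.array([0.6, -0.4, 0.3]) + rng.normal(scale=0.2, size=len(years))

d_preds, d_target = np.diff(preds, axis=0), np.diff(target)   # year-to-year increments

model = LinearRegression().fit(d_preds[:-1], d_target[:-1])   # train up to the penultimate increment
pred_increment = model.predict(d_preds[-1:])                  # predicted increment for the final year
forecast = target[-2] + pred_increment[0]                     # add back to last year's observation
print(round(forecast, 2), round(target[-1], 2))
```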
基金financed by the National Science Centre,Poland,under project No.2019/35/B/NZ8/01381 entitled"Impact of invasive tree species on ecosystem services:plant biodiversity,carbon and nitrogen cycling and climate regulation"by the Institute of Dendrology,Polish Academy of Sciences。
文摘Prunus serotina and Robinia pseudoacacia are the most widespread invasive trees in Central Europe.In addition,according to climate models,decreased growth of many economically and ecologically important native trees will likely be observed in the future.We aimed to assess the impact of these two neophytes,which differ in the biomass range and nitrogen-fixing abilities observed in Central European conditions,on the relative aboveground biomass increments of native oaks Qucrcus robur and Q.petraea and Scots pine Pinus sylvestris.We aimed to increase our understanding of the relationship between facilitation and competition between woody alien species and overstory native trees.We established 72 circular plots(0.05 ha)in two different forest habitat types and stands varying in age in western Poland.We chose plots with different abundances of the studied neophytes to determine how effects scaled along the quantitative invasion gradient.Furthermore,we collected growth cores of the studied native species,and we calculated aboveground biomass increments at the tree and stand levels.Then,we used generalized linear mixed-effects models to assess the impact of invasive species abundances on relative aboveground biomass increments of native tree species.We did not find a biologically or statistically significant impact of invasive R.pseudoacacia or P.serotina on the relative aboveground,biomass increments of native oaks and pines along the quantitative gradient of invader biomass or on the proportion of total stand biomass accounted for by invaders.The neophytes did not act as native tree growth stimulators but also did not compete with them for resources,which would escalate the negative impact of climate change on pines and oaks.The neophytes should not significantly modify the carbon sequestration capacity of the native species.Our work combines elements of the per capita effect of invasion with research on mixed forest management.
基金sponsored by the National Natural Science Foundation of China(Nos.61972208,62102194 and 62102196)National Natural Science Foundation of China(Youth Project)(No.62302237)+3 种基金Six Talent Peaks Project of Jiangsu Province(No.RJFW-111),China Postdoctoral Science Foundation Project(No.2018M640509)Postgraduate Research and Practice Innovation Program of Jiangsu Province(Nos.KYCX22_1019,KYCX23_1087,KYCX22_1027,KYCX23_1087,SJCX24_0339 and SJCX24_0346)Innovative Training Program for College Students of Nanjing University of Posts and Telecommunications(No.XZD2019116)Nanjing University of Posts and Telecommunications College Students Innovation Training Program(Nos.XZD2019116,XYB2019331).
文摘The scale and complexity of big data are growing continuously,posing severe challenges to traditional data processing methods,especially in the field of clustering analysis.To address this issue,this paper introduces a new method named Big Data Tensor Multi-Cluster Distributed Incremental Update(BDTMCDIncreUpdate),which combines distributed computing,storage technology,and incremental update techniques to provide an efficient and effective means for clustering analysis.Firstly,the original dataset is divided into multiple subblocks,and distributed computing resources are utilized to process the sub-blocks in parallel,enhancing efficiency.Then,initial clustering is performed on each sub-block using tensor-based multi-clustering techniques to obtain preliminary results.When new data arrives,incremental update technology is employed to update the core tensor and factor matrix,ensuring that the clustering model can adapt to changes in data.Finally,by combining the updated core tensor and factor matrix with historical computational results,refined clustering results are obtained,achieving real-time adaptation to dynamic data.Through experimental simulation on the Aminer dataset,the BDTMCDIncreUpdate method has demonstrated outstanding performance in terms of accuracy(ACC)and normalized mutual information(NMI)metrics,achieving an accuracy rate of 90%and an NMI score of 0.85,which outperforms existing methods such as TClusInitUpdate and TKLClusUpdate in most scenarios.Therefore,the BDTMCDIncreUpdate method offers an innovative solution to the field of big data analysis,integrating distributed computing,incremental updates,and tensor-based multi-clustering techniques.It not only improves the efficiency and scalability in processing large-scale high-dimensional datasets but also has been validated for its effectiveness and accuracy through experiments.This method shows great potential in real-world applications where dynamic data growth is common,and it is of significant importance for advancing the development of data analysis technology.
基金supported by the Innovation Scientists and Technicians Troop Construction Projects of Henan Province(224000510002)。
文摘Time-Sensitive Network(TSN)with deterministic transmission capability is increasingly used in many emerging fields.It mainly guarantees the Quality of Service(QoS)of applications with strict requirements on time and security.One of the core features of TSN is traffic scheduling with bounded low delay in the network.However,traffic scheduling schemes in TSN are usually synthesized offline and lack dynamism.To implement incremental scheduling of newly arrived traffic in TSN,we propose a Dynamic Response Incremental Scheduling(DR-IS)method for time-sensitive traffic and deploy it on a software-defined time-sensitive network architecture.Under the premise of meeting the traffic scheduling requirements,we adopt two modes,traffic shift and traffic exchange,to dynamically adjust the time slot injection position of the traffic in the original scheme,and determine the sending offset time of the new timesensitive traffic to minimize the global traffic transmission jitter.The evaluation results show that DRIS method can effectively control the large increase of traffic transmission jitter in incremental scheduling without affecting the transmission delay,thus realizing the dynamic incremental scheduling of time-sensitive traffic in TSN.
基金Project supported by the National Natural Science Foundation of China (Grant No.12065015)the Hongliu Firstlevel Discipline Construction Project of Lanzhou University of Technology。
文摘The instability of plasma waves in the channel of field-effect transistors will cause the electromagnetic waves with THz frequency.Based on a self-consistent quantum hydrodynamic model,the instability of THz plasmas waves in the channel of graphene field-effect transistors has been investigated with external magnetic field and quantum effects.We analyzed the influence of weak magnetic fields,quantum effects,device size,and temperature on the instability of plasma waves under asymmetric boundary conditions numerically.The results show that the magnetic fields,quantum effects,and the thickness of the dielectric layer between the gate and the channel can increase the radiation frequency.Additionally,we observed that increase in temperature leads to a decrease in both oscillation frequency and instability increment.The numerical results and accompanying images obtained from our simulations provide support for the above conclusions.
基金financially supported by the Science&Technology Project of Beijing Education Committee(KM202210005013)National Natural Science Foundation of China(52306180)。
文摘This study used a three-dimensional numerical model of a proton exchange membrane fuel cell with five types of channels:a smooth channel(Case 1);eight rectangular baffles were arranged in the upstream(Case 2),midstream(Case 3),downstream(Case 4),and the entire cathode flow channel(Case 5)to study the effects of baffle position on mass transport,power density,net power,etc.Moreover,the effects of back pressure and humidity on the voltage were investigated.Results showed that compared to smooth channels,the oxygen and water transport facilitation at the diffusion layer-channel interface were added 11.53%-20.60%and 7.81%-9.80%at 1.68 A·cm^(-2)by adding baffles.The closer the baffles were to upstream,the higher the total oxygen flux,but the lower the flux uniformity the worse the water removal.The oxygen flux of upstream baffles was 8.14%higher than that of downstream baffles,but oxygen flux uniformity decreased by 18.96%at 1.68 A·cm^(-2).The order of water removal and voltage improvement was Case 4>Case 5>Case 3>Case 2>Case 1.Net power of Case 4 was 9.87%higher than that of the smooth channel.To the Case 4,when the cell worked under low back pressure or high humidity,the voltage increments were higher.The potential increment for the back pressure of 0 atm was 0.9%higher than that of 2 atm(1 atm=101.325 kPa).The potential increment for the humidity of 100%was 7.89%higher than that of 50%.
基金supported by Fundamental Research Program of Shanxi Province(No.20210302123444)the Research Project at the College Level of China Institute of Labor Relations(No.23XYJS018)+2 种基金the ICH Digitalization and Multi-Source Information Fusion Fujian Provincial University Engineering Research Center 2022 Open Fund Project(G3-KF2207)the China University Industry University Research Innovation Fund(No.2021FNA02009)the Key R&D Program(International Science and Technology Cooperation Project)of Shanxi Province China(No.201903D421003).
文摘Currently,distributed routing protocols are constrained by offering a single path between any pair of nodes,thereby limiting the potential throughput and overall network performance.This approach not only restricts the flow of data but also makes the network susceptible to failures in case the primary path is disrupted.In contrast,routing protocols that leverage multiple paths within the network offer a more resilient and efficient solution.Multipath routing,as a fundamental concept,surpasses the limitations of traditional shortest path first protocols.It not only redirects traffic to unused resources,effectively mitigating network congestion,but also ensures load balancing across the network.This optimization significantly improves network utilization and boosts the overall performance,making it a widely recognized efficient method for enhancing network reliability.To further strengthen network resilience against failures,we introduce a routing scheme known as Multiple Nodes with at least Two Choices(MNTC).This innovative approach aims to significantly enhance network availability by providing each node with at least two routing choices.By doing so,it not only reduces the dependency on a single path but also creates redundant paths that can be utilized in case of failures,thereby enhancing the overall resilience of the network.To ensure the optimal placement of nodes,we propose three incremental deployment algorithms.These algorithms carefully select the most suitable set of nodes for deployment,taking into account various factors such as node connectivity,traffic patterns,and network topology.By deployingMNTCon a carefully chosen set of nodes,we can significantly enhance network reliability without the need for a complete overhaul of the existing infrastructure.We have conducted extensive evaluations of MNTC in diverse topological spaces,demonstrating its effectiveness in maintaining high network availability with minimal path stretch.The results are impressive,showing that even when implemented on just 60%of nodes,our incremental deployment method significantly boosts network availability.This underscores the potential of MNTC in enhancing network resilience and performance,making it a viable solution for modern networks facing increasing demands and complexities.The algorithms OSPF,TBFH,DC and LFC perform fast rerouting based on strict conditions,while MNTC is not restricted by these conditions.In five real network topologies,the average network availability ofMNTCis improved by 14.68%,6.28%,4.76%and 2.84%,respectively,compared with OSPF,TBFH,DC and LFC.
基金The National Forestry Commission of Mexico and The Mexican National Council for Science and Technology(CONAFOR-CONACYT-115900)。
文摘Multispecies forests have received increased scientific attention,driven by the hypothesis that biodiversity improves ecological resilience.However,a greater species diversity presents challenges for forest management and research.Our study aims to develop basal area growth models for tree species cohorts.The analysis is based on a dataset of 423 permanent plots(2,500 m^(2))located in temperate forests in Durango,Mexico.First,we define tree species cohorts based on individual and neighborhood-based variables using a combination of principal component and cluster analyses.Then,we estimate the basal area increment of each cohort through the generalized additive model to describe the effect of tree size,competition,stand density and site quality.The principal component and cluster analyses assign a total of 37 tree species to eight cohorts that differed primarily with regard to the distribution of tree size and vertical position within the community.The generalized additive models provide satisfactory estimates of tree growth for the species cohorts,explaining between 19 and 53 percent of the total variation of basal area increment,and highlight the following results:i)most cohorts show a"rise-and-fall"effect of tree size on tree growth;ii)surprisingly,the competition index"basal area of larger trees"had showed a positive effect in four of the eight cohorts;iii)stand density had a negative effect on basal area increment,though the effect was minor in medium-and high-density stands,and iv)basal area growth was positively correlated with site quality except for an oak cohort.The developed species cohorts and growth models provide insight into their particular ecological features and growth patterns that may support the development of sustainable management strategies for temperate multispecies forests.
基金This work was funded by the National Natural Science Foundation of China Nos.U22A2099,61966009,62006057the Graduate Innovation Program No.YCSW2022286.
文摘Humans are experiencing the inclusion of artificial agents in their lives,such as unmanned vehicles,service robots,voice assistants,and intelligent medical care.If the artificial agents cannot align with social values or make ethical decisions,they may not meet the expectations of humans.Traditionally,an ethical decision-making framework is constructed by rule-based or statistical approaches.In this paper,we propose an ethical decision-making framework based on incremental ILP(Inductive Logic Programming),which can overcome the brittleness of rule-based approaches and little interpretability of statistical approaches.As the current incremental ILP makes it difficult to solve conflicts,we propose a novel ethical decision-making framework considering conflicts in this paper,which adopts our proposed incremental ILP system.The framework consists of two processes:the learning process and the deduction process.The first process records bottom clauses with their score functions and learns rules guided by the entailment and the score function.The second process obtains an ethical decision based on the rules.In an ethical scenario about chatbots for teenagers’mental health,we verify that our framework can learn ethical rules and make ethical decisions.Besides,we extract incremental ILP from the framework and compare it with the state-of-the-art ILP systems based on ASP(Answer Set Programming)focusing on conflict resolution.The results of comparisons show that our proposed system can generate better-quality rules than most other systems.
文摘The visions of Industry 4.0 and 5.0 have reinforced the industrial environment.They have also made artificial intelligence incorporated as a major facilitator.Diagnosing machine faults has become a solid foundation for automatically recognizing machine failure,and thus timely maintenance can ensure safe operations.Transfer learning is a promising solution that can enhance the machine fault diagnosis model by borrowing pre-trained knowledge from the source model and applying it to the target model,which typically involves two datasets.In response to the availability of multiple datasets,this paper proposes using selective and adaptive incremental transfer learning(SA-ITL),which fuses three algorithms,namely,the hybrid selective algorithm,the transferability enhancement algorithm,and the incremental transfer learning algorithm.It is a selective algorithm that enables selecting and ordering appropriate datasets for transfer learning and selecting useful knowledge to avoid negative transfer.The algorithm also adaptively adjusts the portion of training data to balance the learning rate and training time.The proposed algorithm is evaluated and analyzed using ten benchmark datasets.Compared with other algorithms from existing works,SA-ITL improves the accuracy of all datasets.Ablation studies present the accuracy enhancements of the SA-ITL,including the hybrid selective algorithm(1.22%-3.82%),transferability enhancement algorithm(1.91%-4.15%),and incremental transfer learning algorithm(0.605%-2.68%).These also show the benefits of enhancing the target model with heterogeneous image datasets that widen the range of domain selection between source and target domains.
文摘We investigated the parametric optimization on incremental sheet forming of stainless steel using Grey Relational Analysis(GRA) coupled with Principal Component Analysis(PCA). AISI 316L stainless steel sheets were used to develop double wall angle pyramid with aid of tungsten carbide tool. GRA coupled with PCA was used to plan the experiment conditions. Control factors such as Tool Diameter(TD), Step Depth(SD), Bottom Wall Angle(BWA), Feed Rate(FR) and Spindle Speed(SS) on Top Wall Angle(TWA) and Top Wall Angle Surface Roughness(TWASR) have been studied. Wall angle increases with increasing tool diameter due to large contact area between tool and workpiece. As the step depth, feed rate and spindle speed increase,TWASR decreases with increasing tool diameter. As the step depth increasing, the hydrostatic stress is raised causing severe cracks in the deformed surface. Hence it was concluded that the proposed hybrid method was suitable for optimizing the factors and response.
基金the National Natural Science Foun-dation of China(Nos.61471263,61872267 and U21B2024)the Natural Science Foundation of Tianjin,China(No.16JCZDJC31100)Tianjin University Innovation Foundation(No.2021XZC0024).
文摘Hyperspectral images typically have high spectral resolution but low spatial resolution,which impacts the reliability and accuracy of subsequent applications,for example,remote sensingclassification and mineral identification.But in traditional methods via deep convolution neural net-works,indiscriminately extracting and fusing spectral and spatial features makes it challenging toutilize the differentiated information across adjacent spectral channels.Thus,we proposed a multi-branch interleaved iterative upsampling hyperspectral image super-resolution reconstruction net-work(MIIUSR)to address the above problems.We reinforce spatial feature extraction by integrat-ing detailed features from different receptive fields across adjacent channels.Furthermore,we pro-pose an interleaved iterative upsampling process during the reconstruction stage,which progres-sively fuses incremental information among adjacent frequency bands.Additionally,we add twoparallel three dimensional(3D)feature extraction branches to the backbone network to extractspectral and spatial features of varying granularity.We further enhance the backbone network’sconstruction results by leveraging the difference between two dimensional(2D)channel-groupingspatial features and 3D multi-granularity features.The results obtained by applying the proposednetwork model to the CAVE test set show that,at a scaling factor of×4,the peak signal to noiseratio,spectral angle mapping,and structural similarity are 37.310 dB,3.525 and 0.9438,respec-tively.Besides,extensive experiments conducted on the Harvard and Foster datasets demonstratethe superior potential of the proposed model in hyperspectral super-resolution reconstruction.
Abstract: To improve the prediction accuracy of chaotic time series and reconstruct a more reasonable phase space structure for the prediction network, we propose a convolutional neural network-long short-term memory (CNN-LSTM) prediction model based on an incremental attention mechanism. Firstly, a traversal search is conducted through the traversal layer over the finite parameters of the phase space. Then, an incremental attention layer is used to judge the parameters against the dimension weight criteria (DWC). The phase space parameters that best meet the DWC are selected and fed into the input layer. Finally, the constructed CNN-LSTM network extracts spatio-temporal features and provides the final prediction results. The model is verified on the Logistic, Lorenz, and sunspot chaotic time series, and performance is compared along two dimensions: prediction accuracy and network phase space structure. Additionally, the CNN-LSTM network based on incremental attention is compared with long short-term memory (LSTM), convolutional neural network (CNN), recurrent neural network (RNN), and support vector regression (SVR) models in terms of prediction accuracy. The experimental results indicate that the proposed composite network model possesses enhanced capability in extracting temporal features and achieves higher prediction accuracy. The phase space parameter estimation algorithm is also compared with the Cao method, the false nearest neighbor method, and the C-C method, three typical approaches for determining chaotic phase space parameters. The experiments reveal that the phase space parameter estimation algorithm based on the incremental attention mechanism yields higher prediction accuracy than the traditional phase space reconstruction methods across five networks, namely CNN-LSTM, LSTM, CNN, RNN, and SVR.
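The phase space reconstruction step that precedes the CNN-LSTM can be illustrated with a standard time-delay embedding. The fixed delay and embedding dimension below are placeholders rather than values chosen by the paper's DWC-based incremental attention search, and the Logistic map simply provides a test series.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Build phase-space vectors [s_t, s_{t+tau}, ..., s_{t+(dim-1)*tau}] as rows."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])

# Logistic map as a chaotic test series
x = np.empty(2000)
x[0] = 0.4
for t in range(1999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

X = delay_embed(x, dim=3, tau=1)   # shape (1998, 3): rows [x_t, x_{t+1}, x_{t+2}]
y = x[3:]                          # next-step targets, shape (1997,)
X = X[:-1]                         # align inputs with targets: predict x_{t+3}
print(X.shape, y.shape)            # these pairs would feed the CNN-LSTM input layer
```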
Funding: Funded by the Liaoning Provincial Department of Science and Technology (2023JH2/101600058).
Abstract: With the continuous advancement of China's “peak carbon dioxide emissions and carbon neutrality” process, the proportion of wind power is increasing. To address the problem that forecasting models become outdated as wind power data are continuously updated, a short-term wind power forecasting algorithm based on Incremental Learning-Bagging Deep Hybrid Kernel Extreme Learning Machine (IL-Bagging-DHKELM) with error affinity propagation cluster analysis is proposed. The algorithm effectively combines the deep hybrid kernel extreme learning machine (DHKELM) with incremental learning (IL). Firstly, an initial wind power prediction model is trained using the Bagging-DHKELM model. Secondly, the affinity propagation (AP) clustering algorithm based on Euclidean morphological distance is used to cluster and analyze the prediction errors of the initial model. Finally, the correlation between wind power prediction errors and Numerical Weather Prediction (NWP) data is introduced to provide incremental updates to the initial wind power prediction model. During the incremental learning process, multiple error performance indicators are used to measure the overall model performance, thereby enabling incremental updates of the wind power models. Practical examples show that the proposed method reduces the root mean square error of the initial model by 1.9 percentage points, indicating that it is better adapted to the current scenario of continuously increasing wind power penetration. The method effectively improves the accuracy and precision of wind power prediction.
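As a simplified stand-in for the DHKELM component, the sketch below builds a hybrid-kernel (RBF plus polynomial) extreme learning machine in closed form and retrains it on an enlarged window whenever the recent batch error grows, mimicking the error-driven incremental update. The kernel weights, error threshold, toy signal, and the omission of Bagging and AP clustering are all assumptions.

```python
import numpy as np

def hybrid_kernel(A, B, gamma=0.5, degree=2, w=0.7):
    # Weighted mix of an RBF kernel and a polynomial kernel
    rbf = np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
    poly = (A @ B.T + 1.0) ** degree
    return w * rbf + (1.0 - w) * poly

class KELM:
    def fit(self, X, y, C=100.0):
        self.X = X
        K = hybrid_kernel(X, X)
        self.beta = np.linalg.solve(K + np.eye(len(X)) / C, y)   # ridge-regularized solve
        return self
    def predict(self, X):
        return hybrid_kernel(X, self.X) @ self.beta

# Incremental update loop on a toy signal (a stand-in for NWP-driven wind power data)
rng = np.random.default_rng(1)
t = np.arange(600.0)
power = np.sin(t / 25.0) + 0.1 * rng.normal(size=t.size)
X_all, y_all = power[:-1].reshape(-1, 1), power[1:]

model = KELM().fit(X_all[:200], y_all[:200])
for start in range(200, 580, 20):
    batch_X, batch_y = X_all[start:start + 20], y_all[start:start + 20]
    rmse = np.sqrt(np.mean((model.predict(batch_X) - batch_y) ** 2))
    if rmse > 0.15:                           # error-triggered incremental retraining
        model.fit(X_all[:start + 20], y_all[:start + 20])
```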
Funding: Financial support from the National Natural Science Foundation of China (Grant Nos. 51839003 and 42207221).
Abstract: Surrounding rocks at different locations are generally subjected to different stress paths during deep hard rock excavation. In this study, to reveal the mechanical parameters of deep surrounding rock under different stress paths, a new cyclic loading and unloading test method with controlled true triaxial loading and unloading and principal stress direction interchange was proposed, and the evolution of the mechanical parameters of Shuangjiangkou granite under different stress paths was studied, including the deformation modulus, elastic deformation increment ratios, fracture degree, cohesion and internal friction angle. Additionally, a stress path coefficient was defined to characterize different stress paths, and the functional relationships among the stress path coefficient, the rock fracture degree difference coefficient, cohesion and internal friction angle were obtained. The results show that during true triaxial cyclic loading and unloading, the deformation modulus and cohesion gradually decrease, while the internal friction angle gradually increases with increasing equivalent crack strain. The stress path coefficient is exponentially related to the rock fracture degree difference coefficient. As the stress path coefficient increases, the degrees of cohesion weakening and internal friction angle strengthening decrease linearly. During cyclic loading and unloading with true triaxial principal stress direction interchange, the direction of crack development changes and the deformation modulus increases, while the cohesion and internal friction angle decrease slightly, indicating that the principal stress direction interchange has a strengthening effect on the surrounding rocks. Finally, the influence of the principal stress direction interchange on the stability of deep excavation projects is discussed.
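For readers who want to reproduce the type of fit reported here, the short sketch below (with invented, purely illustrative numbers, not the paper's measurements) shows how an exponential relation between a stress path coefficient k and a fracture degree difference coefficient D could be fitted.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data points; the real values come from the true triaxial tests.
k = np.array([0.2, 0.4, 0.6, 0.8, 1.0])          # stress path coefficients
D = np.array([0.05, 0.08, 0.14, 0.22, 0.37])     # fracture degree difference coefficients

# Fit D = a * exp(b * k), the form of the exponential relation described above.
popt, _ = curve_fit(lambda x, a, b: a * np.exp(b * x), k, D, p0=(0.05, 2.0))
a, b = popt
print(f"D ~ {a:.3f} * exp({b:.3f} * k)")
```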
Funding: This research was partly supported by the National Science and Technology Council, Taiwan, under Grant Numbers 112-2221-E-992-045, 112-2221-E-992-057-MY3 and 112-2622-8-992-009-TD1.
Abstract: Since its inception, the Internet has been evolving rapidly. With the advancement of science and technology and the explosive growth of the population, demand for the Internet has been on the rise. Many applications in education, healthcare, entertainment, science, and other fields are increasingly deployed on the Internet. Concurrently, malicious threats on the Internet are on the rise as well. Distributed Denial of Service (DDoS) attacks are among the most common and dangerous threats on the Internet today, and their scale and complexity are constantly growing. Intrusion Detection Systems (IDS) have been deployed and have demonstrated their effectiveness in defending against such threats. In addition, research on Machine Learning (ML) and Deep Learning (DL) in IDS has produced effective results and attracted significant attention. However, one of the challenges in applying ML and DL techniques to intrusion detection is the identification of unknown attacks. These attacks, which are not encountered during the system's training, can lead to misclassification with significant errors. In this research, we focused on the issue of Unknown Attack Detection by combining two methods: Spatial Location Constraint Prototype Loss (SLCPL) and Fuzzy C-Means (FCM). The proposed method achieved promising results compared with traditional methods, demonstrating a very high accuracy of up to 99.8% with a low false positive rate for known attacks on the Intrusion Detection Evaluation Dataset (CICIDS2017). In particular, accuracy reaches 99.7% and precision reaches 99.9% for unknown DDoS attacks on the DDoS Evaluation Dataset (CICDDoS2019). The success of the proposed method stems from combining SLCPL, an advanced Open-Set Recognition (OSR) technique, with FCM, a traditional yet highly applicable clustering technique, yielding a novel method in the field of unknown attack detection. This further extends the trend of applying DL and ML techniques to the development of intrusion detection systems and cybersecurity. Finally, implementing the proposed method in real-world systems can enhance security capabilities against increasingly complex threats on computer networks.
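The FCM half of the pipeline is straightforward to sketch. The code below implements standard Fuzzy C-Means in NumPy and flags samples that lie far from every cluster center as unknown; the rejection radius, cluster count, and synthetic traffic features are assumptions, and SLCPL itself is not reproduced.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                                # random initial memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]               # membership-weighted centroids
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)                            # standard membership update
    return centers, U

def flag_unknown(X, centers, radius):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).min(axis=1)
    return d > radius                          # True = treated as a candidate unknown attack

rng = np.random.default_rng(1)
known = np.vstack([rng.normal(mu, 0.3, size=(100, 4)) for mu in (0.0, 2.0, 4.0)])
centers, U = fuzzy_c_means(known, c=3)
probe = np.vstack([rng.normal(2.0, 0.3, size=(5, 4)), rng.normal(9.0, 0.3, size=(5, 4))])
print(flag_unknown(probe, centers, radius=1.5))   # the last five samples should be flagged
```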
Abstract: It is well known that Diabetes Specific Nutritional Supplements (DSNSs) are linked to improved glycemic control in individuals with diabetes. However, data on the efficacy of DSNSs in prediabetics are limited. This was a two-armed, open-labelled, randomized controlled six-week study of 199 prediabetics [30-65 years; Glycosylated Hemoglobin (HbA1c) 5.7%-6.4% and/or Fasting Blood Glucose (FBG) 100-125 mg/dl]. Two parallel phases were conducted: an Acute Blood Glucose Response (ABGR) phase and an Intervention phase. Prediabetic participants were randomized into test (n = 100) and control (n = 99) groups. The primary objective was to assess the ABGR of the DSNS versus an isocaloric snack, measured by incremental Area under the Curve (iAUC). The test and control groups received 60 g of DSNS and 56 g of an isocaloric snack (cornflakes), respectively, both in 250 ml of double-toned milk, on visit days 1, 15, 29 and 43. Postprandial Blood Glucose (PPG) was estimated at 30, 60, 90, 120, 150 and 180 minutes. During the 4-week intervention phase, the test group received DSNS with lifestyle counselling (DSNS + LC) and was compared with the control group receiving lifestyle counselling alone (LC alone). The impact on FBG, HbA1c, anthropometry, body composition, blood pressure, nutrient intake, and physical activity was studied. The impact of DSNS was also studied using continuous glucose monitoring (CGM) between two 14-day phases: CGM1 baseline (days 1-14) and CGM2 endline (days 28-42). DSNS showed significantly lower PPG versus the isocaloric snack at 30 (p 12, and chromium were reported by DSNS + LC versus LC alone. No other significant changes were reported between groups. It may be concluded that DSNS may be considered as a snack for prediabetic or hyperglycemic individuals requiring nutritional support for improved glycemic control.
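The iAUC used as the primary endpoint can be computed with the trapezoidal rule on glucose rises above the fasting baseline. The convention of ignoring dips below baseline and the glucose values below are assumptions for illustration, not trial data.

```python
import numpy as np

def iauc(times_min, glucose, baseline=None):
    """Incremental area under the curve: trapezoidal area of the glucose rise above
    baseline, with excursions below baseline clipped to zero (a common convention;
    the study's exact variant is not specified here)."""
    glucose = np.asarray(glucose, dtype=float)
    baseline = glucose[0] if baseline is None else baseline
    rise = np.clip(glucose - baseline, 0.0, None)
    widths = np.diff(np.asarray(times_min, dtype=float))
    return float(np.sum((rise[1:] + rise[:-1]) / 2.0 * widths))   # units: mg/dl x min

t = [0, 30, 60, 90, 120, 150, 180]
dsns_response = [98, 122, 138, 130, 118, 108, 101]    # hypothetical test-arm PPG values
snack_response = [97, 140, 165, 152, 133, 117, 104]   # hypothetical control-arm PPG values
print(iauc(t, dsns_response), iauc(t, snack_response))
```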
Abstract: The integration of set-valued ordered rough set models and incremental learning signifies a progressive advancement of conventional rough set theory, with the objective of tackling the heterogeneity and ongoing transformation of information systems. In set-valued ordered decision systems, changes in the attribute value domain, such as the addition of conditional values, may alter the preference relation between objects and thereby indirectly change the approximations. In this paper, we address the issue of updating approximations that arises when conditional values are added in set-valued ordered decision systems. Firstly, we classified the research objects into two categories, objects whose conditional values change and objects whose values do not, and then conducted theoretical studies on updating approximations for these two categories, presenting approximation update theories for adding conditional values. Subsequently, we presented incremental algorithms corresponding to the approximation update theories. We demonstrated the feasibility of the proposed incremental update method with numerical examples and showed that our incremental algorithm outperforms the static algorithm. Ultimately, experimental comparisons on different datasets make it evident that the incremental algorithm efficiently reduces processing time. In conclusion, this study offers a promising strategy for addressing the challenges of set-valued ordered decision systems in dynamic environments.
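A static (non-incremental) baseline for the approximations discussed above can be sketched as follows: dominance is taken as attribute-wise set inclusion, and the lower and upper approximations of an upward union of decision classes follow the usual dominance-based definitions. The tiny decision table is invented, and the paper's incremental update rules for added conditional values are not reproduced.

```python
def dominates(x, y):
    """x dominates y if x's value set contains y's on every condition attribute."""
    return all(set(xv) >= set(yv) for xv, yv in zip(x, y))

def approximations(objects, decisions, d):
    """Lower/upper approximation of the upward union Cl>=d under the dominance relation."""
    n = len(objects)
    upward = {i for i in range(n) if decisions[i] >= d}
    dominating = {i: {j for j in range(n) if dominates(objects[j], objects[i])} for i in range(n)}
    dominated = {i: {j for j in range(n) if dominates(objects[i], objects[j])} for i in range(n)}
    lower = {i for i in range(n) if dominating[i] <= upward}
    upper = {i for i in range(n) if dominated[i] & upward}
    return lower, upper

# Four objects, two set-valued condition attributes, and an ordinal decision.
objects = [({"a"}, {"x"}), ({"a", "b"}, {"x"}), ({"a", "b"}, {"x", "y"}), ({"b"}, {"y"})]
decisions = [2, 1, 3, 1]
print(approximations(objects, decisions, d=2))
# -> ({2}, {0, 1, 2}); objects 0 and 1 form the boundary region because object 1
#    dominates object 0 yet carries a worse decision (an inconsistency).
```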
Abstract: Quinta Monroy is an award-winning co-designed settlement for 93 families on half a hectare of land at Iquique in northern Chile. Neighbors' complaints about the disorderly settlement peaked after the landowner's death and provoked untenured residents to seek government subsidies to redevelop the settlement. From 2003, a government social housing project was coordinated by the Elemental architecture firm with US$10,000 per household. With the residents temporarily relocated, 93 modular and interlinked apartments were built around a series of courtyards. These apartments, which were designed as “half-houses,” were subsequently co-opted by residents adding rooms in locations planned in advance by Elemental. Many households have since doubled the size of their apartment and reformed the settlement in ways not anticipated by Elemental. This paper details a spatial and ethnographic study of the Quinta Monroy settlement since redevelopment to identify opportunities and risks that accompany this type of social housing model. The study reveals evidence that residents' capacity to enlarge apartments commonly exceeds the architects' expectations and that unregulated expansions often compromise the settlement's livability. This research anticipates further opportunities for expansion in this semi-regulated settlement and investigates the possibility that another contested slum settlement may emerge.
Funding: Conducted as part of the research project "Reproductive physiology of pedunculate oak (Quercus robur L.) in Spačva", fully supported and funded by "Croatian Forests Ltd".
Abstract: The objective was to examine the effects of optimal (>2.0%) and suboptimal (<2.0%) leaf nitrogen nutrition on the net photosynthetic rate, stem diameter increment, height growth increment and acorn mass of pedunculate oak during 2010, in the absence of drought stress, and during 2011, under moderate drought stress. According to the results, moderate drought stress significantly reduced the net photosynthetic rate, stem diameter increment and height growth increment, while acorn mass was not affected. Suboptimal nitrogen nutrition significantly reduced the net photosynthetic rate and stem diameter increment only in the wet year, and acorn mass in both the wet and dry years, while height growth increment was not significantly reduced by suboptimal nitrogen nutrition in either year. The results indicate that optimal nitrogen levels can stimulate the photosynthetic rate and stem diameter increment of pedunculate oak only in the absence of moderate drought stress. Moreover, the results show that moderate drought stress is a more dominant stressor for the photosynthesis and growth of pedunculate oak than suboptimal nitrogen nutrition, whereas for acorn development, suboptimal nitrogen nutrition is the more dominant stressor.
Funding: This research is supported by the National Key R&D Program of China (Grant No. 2022YFF0801604).
Abstract: The meridional gradient of surface air temperature associated with “Warm Arctic–Cold Eurasia” (GradTAE) is closely related to climate anomalies and weather extremes in the mid-low latitudes. However, the Climate Forecast System Version 2 (CFSv2) shows poor capability for GradTAE prediction. Based on the year-to-year increment approach, analysis using a hybrid seasonal prediction model for winter GradTAE (HMAE) is conducted with observed September sea ice over the Barents–Kara Sea, October sea surface temperature over the North Atlantic, September soil moisture in southern North America, and CFSv2-forecasted winter sea ice over Baffin Bay, Davis Strait, and the Labrador Sea. HMAE demonstrates good capability for predicting GradTAE, with a significant correlation coefficient of 0.84 and a same-sign percentage of 88% in cross-validation during 1983-2015. HMAE also maintains high accuracy and robustness in independent predictions for 2016-20. Meanwhile, as an experiment in routine operation, HMAE predicts the 2021 GradTAE well. Moreover, a well-predicted GradTAE is useful for predicting the large-scale “Warm Arctic–Cold Eurasia” pattern and has the potential to enhance the prediction skill for surface air temperature over eastern China.
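The year-to-year increment (DY) idea behind HMAE can be illustrated with a minimal regression sketch: the increment of the predictand is regressed on increments of the predictors, and the predicted increment is added back to the previous year's observation. The synthetic series below stand in for the observed sea ice, SST, soil moisture, and CFSv2-forecast predictors; this is not the operational HMAE.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1983, 2016)
predictors = rng.normal(size=(len(years), 4))     # stand-ins for the four predictors
target = predictors @ np.array([0.8, -0.5, 0.3, 0.6]) + 0.2 * rng.normal(size=len(years))

dX = np.diff(predictors, axis=0)                  # year-to-year increments of the predictors
dy = np.diff(target)                              # year-to-year increment of the predictand
A = np.column_stack([dX, np.ones(len(dX))])       # linear increment model with intercept

coef, *_ = np.linalg.lstsq(A[:-1], dy[:-1], rcond=None)   # fit on all but the final year
prediction = target[-2] + A[-1] @ coef            # predicted increment + last year's observation
print(round(float(prediction), 3), round(float(target[-1]), 3))
```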