In the process of promoting the principal level system, practical dilemmas often arise, such as the misalignment of modern school system construction, the dislocation of school values, the absence of assessment and evaluation mechanisms, and the lack of endogenous development among the subjects involved. It is therefore necessary to actively establish a modern school system and stimulate the vitality of educators in running schools; to adhere to the conscience of educating people and appeal to the professional awareness of educators in running schools; to optimize the assessment and evaluation mechanism and strengthen the growth momentum of educators in running schools; and to enhance the connotative construction of the subject and lead the continuous development of educators. These are the main paths to be explored.
Space-Time Block Coded (STBC) Orthogonal Frequency Division Multiplexing (OFDM) satisfies higher data-rate requirements while maintaining signal quality in a multipath fading channel. However, conventional STBCs, including Orthogonal STBCs (OSTBCs), Non-Orthogonal STBCs (NOSTBCs), and Quasi-Orthogonal STBCs (QOSTBCs), do not provide both maximal diversity order and unity code rate simultaneously for more than two transmit antennas. This paper targets this problem and applies Maximum Rank Distance (MRD) codes to the design of STBC-OFDM systems. Following the direct-matrix construction method, binary extended finite field MRD-STBCs can be constructed for any number of transmit antennas. This work uses MRD-STBCs built over Phase-Shift Keying (PSK) modulation to develop an MRD-based STBC-OFDM system. The MRD-based STBC-OFDM system sacrifices minor error performance compared with traditional OSTBC-OFDM but shows improved results against NOSTBC- and QOSTBC-OFDM. It also provides 25% higher data rates than OSTBC-OFDM in configurations that use more than two transmit antennas. The tradeoffs are minor increases in computational complexity and processing delay.
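For orientation only: the sketch below is not the paper's MRD construction but the conventional two-antenna Alamouti OSTBC that the MRD-based STBC-OFDM system is compared against, applied symbol-pair by symbol-pair as one would per OFDM subcarrier. The QPSK mapping and symbol values are illustrative assumptions.

```python
import numpy as np

def alamouti_encode(symbols):
    """Alamouti (2-antenna) OSTBC encoding: maps each symbol pair to a
    2-time-slot x 2-antenna codeword (rate-1, full diversity)."""
    s = np.asarray(symbols, dtype=complex).reshape(-1, 2)
    codewords = []
    for s1, s2 in s:
        # rows: time slots, cols: transmit antennas
        codewords.append(np.array([[s1, s2],
                                   [-np.conj(s2), np.conj(s1)]]))
    return np.stack(codewords)

# Toy QPSK symbols on one OFDM subcarrier (hypothetical values)
bits_to_qpsk = {0: 1 + 1j, 1: -1 + 1j, 2: -1 - 1j, 3: 1 - 1j}
syms = np.array([bits_to_qpsk[b] for b in (0, 3, 1, 2)]) / np.sqrt(2)
print(alamouti_encode(syms).shape)  # (2, 2, 2): 2 codewords, 2 slots, 2 antennas
```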
To ensure flight safety, the complex network method is used to study the influence and invulnerability of air traffic cyber-physical system (CPS) nodes. According to the rules of air traffic management, the logical coupling relationship between routes and sectors is analyzed, an air traffic CPS network model is constructed, and indicators of node influence and invulnerability are established. The K-shell algorithm is improved to identify node influence, and invulnerability is analyzed under random and selective attacks. Taking the airspace of Eastern China as an example, its influential nodes are sorted by degree, K-shell, the improved K-shell (IKS), and betweenness centrality, and the invulnerability of the air traffic CPS under different attacks is analyzed. Results show that IKS can effectively identify the influential nodes in the air traffic CPS network, and that IKS and betweenness centrality are the two key indicators that affect the invulnerability of the air traffic CPS.
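As a minimal illustration of the plain K-shell step (not the paper's improved IKS, whose refinements are not reproduced here), the following sketch ranks the nodes of a toy route/sector coupling graph by shell index and, for comparison, by betweenness centrality; the node names are hypothetical.

```python
import networkx as nx

def kshell_ranking(G):
    """Rank nodes by their K-shell index (higher shell = more central)."""
    shells = nx.core_number(G)                       # node -> k-shell index
    return sorted(shells, key=shells.get, reverse=True)

# Toy route/sector coupling graph (hypothetical node names)
G = nx.Graph([("R1", "S1"), ("R2", "S1"), ("R2", "S2"),
              ("R3", "S2"), ("S1", "S2"), ("R1", "S2")])
print(kshell_ranking(G))
print(sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1]))
```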
Cloud computing is becoming a popular technology due to its functional properties and variety of customer-oriented services over the Internet. The design of reliable and high-quality cloud applications requires a strong Quality of Service (QoS) parameter metric. In a hyperconverged cloud ecosystem, building highly reliable cloud applications is a challenging job. The selection of cloud services is based on the QoS parameters, which play essential roles in optimizing and improving cloud rankings. The emergence of cloud computing is significantly reshaping the digital ecosystem, and the numerous services offered by cloud service providers (CSPs) are playing a vital role in this transformation. Hyperconverged, software-based unified utilities combine storage virtualization, compute virtualization, and network virtualization, and their availability has also raised the demand for QoS. Due to the diversity of services, the respective quality parameters are also abundant and need a carefully designed mechanism to compare and identify the critical, common, and impactful parameters. It is also necessary to reconsider market needs in terms of service requirements and the QoS provided by various CSPs. This research provides a machine learning-based mechanism to monitor QoS in a hyperconverged environment with three core service parameters: service quality, server downtime, and cloud service outage.
In this paper, the maximal and minimal ranks of the solution to a system of matrix equations over H, the real quaternion algebra, were derived. A previously known result could be regarded as a special case of the new result.
Heavy-duty machine tools are composed of many subsystems with different functions, and their reliability is governed by the reliabilities of these subsystems. It is important to rank the weaknesses of the subsystems and identify the weakest one in order to optimize products and improve their reliability. However, traditional ranking methods based on failure mode, effects, and criticality analysis (FMECA) do not consider the complex maintenance of products. Herein, a weakness ranking method for the subsystems of heavy-duty machine tools is proposed based on generalized FMECA information. In this method, eight reliability indexes, including maintainability and maintenance cost, are considered in the generalized FMECA information. Subsequently, the cognition best-worst method is used to calculate the weight of each screened index, and the weaknesses of the subsystems are ranked using the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, based on failure data collected from certain domestic heavy-duty horizontal lathes, the weakness ranking of the subsystems is obtained to verify the effectiveness of the proposed method. An improved weakness ranking method that can comprehensively analyze and identify weak subsystems is thus provided for designing complex electromechanical products and improving their reliability.
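A minimal sketch of the final TOPSIS ranking step is given below; the decision matrix, the three example indexes, and the weights are hypothetical stand-ins (in the paper the weights come from the cognition best-worst method and eight indexes are used).

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS: score alternatives (rows) on criteria (cols).

    matrix  : (m, n) decision matrix
    weights : (n,) criterion weights summing to 1
    benefit : (n,) bool, True if larger-is-better for that criterion
    Returns closeness scores in [0, 1]; lower = weaker subsystem."""
    X = np.asarray(matrix, float)
    V = X / np.linalg.norm(X, axis=0) * np.asarray(weights, float)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical subsystems scored on three example indexes
scores = topsis([[0.2, 30, 0.9], [0.5, 12, 0.7], [0.3, 20, 0.8]],
                weights=[0.5, 0.3, 0.2],
                benefit=[False, False, True])  # e.g. failure rate, cost: lower is better
print(np.argsort(scores))  # weakest subsystem first (lowest closeness)
```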
Offshore structures encounter serious environmental loads, so it is important to study structural system reliability and to evaluate the safety rank of structural components. In this paper, the branch-and-bound method is adopted to search for the main failure paths, and the Ditlevsen bound method is used to calculate the system failure probability. The structure is then assessed by the fuzzy comprehensive assessment method, which evaluates the safety rank of the structural components. The ultimate equation of the tubular cross-section is analyzed on the basis of ultimate strength analysis. The influence of effect coefficients on the structural system failure probability is investigated, and basic results are obtained. A general program for spatial frame structures based on the above method is developed and verified by numerical examples.
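For reference, a small sketch of the Ditlevsen (bi-modal) bounds on the failure probability of a series system follows, assuming the individual and pairwise joint failure-mode probabilities are already available (e.g., from the main failure paths found by branch-and-bound); the numerical values are hypothetical.

```python
import numpy as np

def ditlevsen_bounds(p, p_joint):
    """Ditlevsen (bi-modal) bounds on series-system failure probability.

    p       : (n,) individual failure-mode probabilities, ideally sorted descending
    p_joint : (n, n) pairwise joint failure probabilities P(Fi and Fj)
    Returns (lower, upper)."""
    p = np.asarray(p, float)
    P = np.asarray(p_joint, float)
    n = len(p)
    lower = p[0] + sum(max(0.0, p[i] - P[i, :i].sum()) for i in range(1, n))
    upper = p.sum() - sum(P[i, :i].max() for i in range(1, n))
    return lower, upper

# Hypothetical main failure paths and joint probabilities
p = [1e-3, 8e-4, 5e-4]
pj = np.array([[0.0, 2e-4, 1e-4],
               [2e-4, 0.0, 5e-5],
               [1e-4, 5e-5, 0.0]])
print(ditlevsen_bounds(p, pj))  # bounds bracket the system failure probability
```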
Expanding internet-connected services has increased cyberattacks, many of which have grave and disastrous repercussions. An Intrusion Detection System (IDS) plays an essential role in network security since it helps to protect the network from vulnerabilities and attacks. Although extensive research on IDS has been reported, detecting novel intrusions with optimal features and reducing false alarm rates remain challenging. Therefore, we developed a novel fusion-based feature importance method to reduce the high-dimensional feature space, which helps to identify attacks accurately with a lower false alarm rate. Initially, various preprocessing techniques are utilized to improve training data quality, and the Adaptive Synthetic oversampling technique generates synthetic samples for minority classes. In the proposed fusion-based feature importance, we use different approaches from the filter, wrapper, and embedded families, such as mutual information, random forest importance, permutation importance, Shapley Additive exPlanations (SHAP)-based feature importance, and statistical feature importance methods such as the difference of mean and median and the standard deviation, to rank each feature according to its importance. The most relevant features are then retrieved by simple plurality voting and fed to various models: Extra Tree (ET), Logistic Regression (LR), Support Vector Machine (SVM), Decision Tree (DT), and Extreme Gradient Boosting Machine (XGBM). The hyperparameters of the classification models are then tuned with Halving Random Search cross-validation to enhance performance. The experiments were carried out on both the original imbalanced data and the balanced data, and the outcomes demonstrate that the balanced-data scenario outperforms the imbalanced one. Finally, the experimental analysis shows that the proposed fusion-based feature importance performs well with XGBM, giving accuracies of 99.86%, 99.68%, and 92.4% with 9, 7, and 8 features and training times of 1.5, 4.5, and 5.5 s on the Network Security Laboratory-Knowledge Discovery in Databases (NSL-KDD), Canadian Institute for Cybersecurity (CIC-IDS 2017), and UNSW-NB15 datasets, respectively. In addition, the suggested technique has been examined and contrasted with state-of-the-art methods on the three datasets.
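The fusion step itself can be illustrated compactly: given several per-method feature rankings, simple plurality voting over each method's top-k features yields the fused ordering. The sketch below uses hypothetical feature names and only three of the importance methods.

```python
from collections import Counter

def fuse_top_k(rankings, k):
    """Plurality-vote fusion of several feature rankings.

    rankings : list of lists, each a ranking of feature names (best first)
               produced by a different importance method (MI, RF, SHAP, ...)
    k        : how many features each method 'votes' for
    Returns features ordered by the number of top-k votes received."""
    votes = Counter(f for r in rankings for f in r[:k])
    return [f for f, _ in votes.most_common()]

# Hypothetical rankings from three importance methods
mi = ["dur", "bytes", "flags", "proto", "rate"]
rf = ["bytes", "rate", "dur", "proto", "flags"]
shap = ["bytes", "dur", "rate", "flags", "proto"]
print(fuse_top_k([mi, rf, shap], k=3))  # e.g. ['dur', 'bytes', 'rate', ...]
```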
A number of risk ranking systems for contaminated sites have been developed by different jurisdictions. While the intent of each of these systems is similar, it is not clear whether they provide comparable results. In this paper, 20 contaminated sites are used to assess the United States’ Preliminary Assessment (PA) system, Sweden’s Methods for Inventories of Contaminated Sites (MICS), and New Zealand’s Risk Screening System (RSS). The results were compared with each other, with Canada’s National Classification System for Contaminated Sites (NCSCS), and with preliminary quantitative risk assessment (PQRA) results. The objectives were to determine whether the systems yield similar recommendations regarding further actions, and to assess whether there are acceptable correlations between the different methods. The study concludes that the PA, MICS, and NCSCS methods can reach similar conclusions, although a certain degree of inconsistency is present; that RSS can distinguish the very-high- and very-low-risk sites; and that acceptable correlations exist among the methods, except between PA and PQRA.
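One common way to quantify how well two ranking systems agree is the Spearman rank correlation of their site scores; a minimal sketch with hypothetical scores for 20 sites follows (the actual correlation analysis and acceptance thresholds used in the study may differ).

```python
from scipy.stats import spearmanr

# Hypothetical scores assigned to the same 20 sites by two ranking systems
pa_scores = [55, 72, 30, 90, 41, 66, 58, 73, 25, 80,
             47, 61, 35, 88, 52, 69, 44, 77, 29, 83]
mics_scores = [50, 70, 35, 85, 45, 60, 55, 75, 30, 78,
               48, 65, 33, 90, 50, 72, 40, 74, 27, 80]

rho, pval = spearmanr(pa_scores, mics_scores)
print(f"Spearman rho = {rho:.2f}, p = {pval:.3f}")
```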
Security measures for a computer network system can be enhanced with a better understanding of vulnerabilities and their behavior over time. It is observed that the effects of vulnerabilities vary over their life cycle. In the present study, we present a new methodology to assess the magnitude of the risk of a vulnerability as a “Risk Rank”. To derive this methodology, the well-known Markovian approach with a transition probability matrix is used, incorporating relevant risk factors for discovered and recorded vulnerabilities. In addition to observing the risk factor for each vulnerability individually, we introduce the concept of ranking vulnerabilities at a particular time, taking an approach similar to the Google PageRank algorithm. The new methodology is exemplified using a simple model of a computer network with three recorded vulnerabilities and their CVSS scores.
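A minimal sketch of the PageRank-style step is shown below: the stationary distribution of a (damped) transition probability matrix over vulnerability states is taken as the Risk Rank vector. The matrix entries here are hypothetical and only loosely motivated by normalized CVSS scores, not the paper's actual model.

```python
import numpy as np

def risk_rank(P, damping=0.85, tol=1e-10):
    """PageRank-style ranking over a vulnerability transition matrix.

    P : (n, n) row-stochastic transition probability matrix between
        vulnerability states (rows sum to 1).
    Returns a probability vector; larger entries = higher 'Risk Rank'."""
    n = P.shape[0]
    G = damping * P + (1 - damping) * np.full((n, n), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = r @ G                      # power iteration step
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Three recorded vulnerabilities; hypothetical transition weights
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])
print(risk_rank(P))
```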
Deep neural networks (DNNs) have achieved great success in many data processing applications. However, high computational complexity and storage cost make deep learning difficult to use on resource-constrained devices, and its large power cost is not environmentally friendly. In this paper, we focus on low-rank optimization for efficient deep learning techniques. In the space domain, DNNs are compressed by low-rank approximation of the network parameters, which directly reduces the storage requirement with a smaller number of network parameters. In the time domain, the network parameters can be trained in a few subspaces, which enables efficient training with fast convergence. Model compression in the spatial domain is summarized into three categories: pre-train, pre-set, and compression-aware methods. With a series of integrable techniques discussed, such as sparse pruning, quantization, and entropy coding, we can assemble them in an integrated framework with lower computational complexity and storage. In addition to a summary of recent technical advances, we present two findings to motivate future work. One is that the effective rank, derived from the Shannon entropy of the normalized singular values, outperforms other conventional sparse measures such as the ℓ1 norm for network compression. The other is a spatial and temporal balance for tensorized neural networks: for accelerating the training of tensorized neural networks, it is crucial to leverage redundancy for both model compression and subspace training.
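Two of the ingredients mentioned above are easy to sketch: the effective rank (the exponential of the Shannon entropy of the normalized singular values) and the basic truncated-SVD low-rank factorization of a weight matrix. The matrix sizes and data below are illustrative.

```python
import numpy as np

def effective_rank(W, eps=1e-12):
    """Effective rank: exp(Shannon entropy of the normalized singular values)."""
    s = np.linalg.svd(W, compute_uv=False)
    p = s / (s.sum() + eps)
    return float(np.exp(-(p * np.log(p + eps)).sum()))

def low_rank_factors(W, r):
    """Truncated-SVD factorization W ~= A @ B with A:(m,r), B:(r,n),
    the basic compression step for a fully connected layer's weights."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :r] * s[:r], Vt[:r]

rng = np.random.default_rng(0)
W = rng.normal(size=(256, 64)) @ rng.normal(size=(64, 512))  # intrinsically low rank
print(effective_rank(W))                 # well below min(256, 512)
A, B = low_rank_factors(W, r=64)
print(np.allclose(W, A @ B, atol=1e-8))  # rank-64 factorization recovers W
```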
Given the characteristics of high-end products, crowd-sourcing user stories can be an effective means of gathering requirements, involving a large user base and generating a substantial amount of unstructured feedback. The key challenge lies in transforming abstract user needs into specific ones, which requires integration and analysis. Therefore, we propose a topic mining-based approach to categorize, summarize, and rank product requirements from user stories. Specifically, after determining the number of story categories with pyLDAvis, we first classify the “I want to” phrases within the user stories. Classic topic models are then applied to each category to generate its name, and each post-classification user story category is defined as a requirement. Furthermore, a weighted ranking function is devised to calculate the importance of each requirement. Finally, we validate the effectiveness and feasibility of the proposed method using 2,966 crowd-sourced user stories related to smart home systems.
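A minimal sketch of the classic topic-model step on toy data is given below using scikit-learn's LDA; the story fragments, the number of categories, and the top-word "naming" are illustrative assumptions (the paper determines the number of categories with pyLDAvis and adds a weighted ranking function not shown here).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical "I want to ..." fragments extracted from user stories
stories = [
    "control the lights remotely from my phone",
    "dim the living room lights on a schedule",
    "get an alert when the door opens",
    "see who is at the door from the camera",
    "lower the thermostat automatically at night",
    "set the heating temperature by voice",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(stories)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

# Name each category by its top words, as a crude stand-in for topic naming
terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-3:][::-1]]
    print(f"requirement category {k}: {', '.join(top)}")
```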
The RPL (IPv6 Routing Protocol for Low-Power and Lossy Networks) protocol is essential for efficient communication within the Internet of Things (IoT) ecosystem. Despite its significance, RPL’s susceptibility to attacks remains a concern. This paper presents a comprehensive simulation-based analysis of the RPL protocol’s vulnerability to the decreased rank attack in both static and mobile network environments. We employ the Random Direction Mobility Model (RDM) for mobile scenarios within the Cooja simulator. Our systematic evaluation focuses on critical performance metrics, including Packet Delivery Ratio (PDR), Average End-to-End Delay (AE2ED), throughput, Expected Transmission Count (ETX), and Average Power Consumption (APC). Our findings illuminate the disruptive impact of this attack on the routing hierarchy, resulting in decreased PDR and throughput and increased AE2ED, ETX, and APC. These results underscore the urgent need for robust security measures to protect RPL-based IoT networks. Furthermore, our study emphasizes the exacerbated impact of the attack in mobile scenarios, highlighting the evolving security requirements of IoT networks.
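Two of the evaluated metrics, PDR and AE2ED, reduce to simple computations over send/receive logs; the sketch below assumes hypothetical per-packet timestamps such as one might export from a Cooja trace.

```python
def pdr_and_delay(sent, received):
    """Packet Delivery Ratio and Average End-to-End Delay from packet logs.

    sent     : dict packet_id -> send timestamp (seconds)
    received : dict packet_id -> receive timestamp (seconds)"""
    delivered = [p for p in sent if p in received]
    pdr = len(delivered) / len(sent) if sent else 0.0
    ae2ed = (sum(received[p] - sent[p] for p in delivered) / len(delivered)
             if delivered else float("nan"))
    return pdr, ae2ed

# Toy logs (hypothetical values); packet 3 is lost under attack
sent = {1: 0.00, 2: 0.50, 3: 1.00, 4: 1.50}
received = {1: 0.12, 2: 0.70, 4: 1.95}
print(pdr_and_delay(sent, received))  # (0.75, ~0.26)
```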
Purpose: The quantitative rankings of over 55,000 institutions and their institutional programs are based on the individual rankings of approximately 30 million scholars determined by their productivity, impact, and quality. Design/methodology/approach: The institutional ranking process developed here considers all institutions in all countries and regions, thereby including those that are established as well as those that are emerging in scholarly prowess. Rankings of individual scholars worldwide are first generated using the recently introduced, fully indexed ScholarGPS database. The rankings of individual scholars are extended here to determine the lifetime and last-five-year Top 20 rankings of academic institutions over all Fields of scholarly endeavor, in 14 individual Fields, in 177 Disciplines, and in approximately 350,000 unique Specialties. Rankings associated with five specific Fields (Medicine, Engineering & Computer Science, Life Sciences, Physical Sciences & Mathematics, and Social Sciences) and two Disciplines (Chemistry, and Electrical & Computer Engineering) are presented as examples, and changes in the rankings over time are discussed. Findings: For the Fields considered here, the Top 20 institutional rankings in Medicine have undergone the least change (lifetime versus last five years), while the rankings in Engineering & Computer Science have exhibited significant change. The evolution of institutional rankings over time is largely attributed to the recent emergence of Chinese academic institutions, although this emergence is shown to be highly Field- and Discipline-dependent. Practical implications: Existing rankings of academic institutions have (i) often been restricted to pre-selected institutions, clouding the potential discovery of scholarly activity in emerging institutions and countries; (ii) considered only broad areas of research, limiting the ability of university leadership to act on the assessments in a concrete manner; or, in contrast, (iii) considered only a narrow area of research for comparison, diminishing the broader applicability and impact of the assessment. In general, existing institutional rankings depend on which institutions are included in the ranking process, which areas of research are considered, the breadth (or granularity) of the research areas of interest, and the methodologies used to define and quantify research performance. In contrast, the methods presented here provide data over a broad range of granularity to allow responsible individuals to gauge the performance of any institution, from the Overall (all Fields) level down to the level of the Specialty. The methods may also assist in identifying the root causes of shifts in institution rankings and how these shifts vary across hundreds of thousands of Fields, Disciplines, and Specialties of scholarly endeavor. Originality/value: This study provides the first ranking of all academic institutions worldwide over Fields, Disciplines, and Specialties based on a unique methodology that quantifies the productivity, impact, and quality of individual scholars.