Abstract: The evolution of current networks is challenged by poor programmability, maintainability, and manageability caused by network ossification. This challenge led to the concept of software-defined networking (SDN), which decouples the control plane from the infrastructure (data) plane. The innovation, however, created the controller placement problem: how to effectively place controllers within a network topology so that the control plane can manage the data plane devices. This study was designed to empirically evaluate and compare two controller placement algorithms: the Pareto optimal combination (POCO) and the multi-objective combination (MOCO). The methodology adopted was an explorative and comparative investigation. The study evaluated the performance of the POCO and MOCO algorithms against calibrated controller positions within a software-defined network; the network environment and measurement metrics were held constant for both models during the evaluation, and the strengths and weaknesses of each model were justified. For the GoodNet network, the latencies of the two algorithms were 3100 ms for POCO and 2500 ms for MOCO; Switch-to-Controller Average Case latency was 2598 ms for POCO and 2769 ms for MOCO, and Worst Case Switch-to-Controller latency was 2776 ms for POCO and 2987 ms for MOCO.
For the Savvis network, the latencies compared as follows: 2912 ms (POCO) and 2784 ms (MOCO) in Switch-to-Controller Average Case latency; 3129 ms (POCO) and 3017 ms (MOCO) in Worst Case Switch-to-Controller latency; 2789 ms (POCO) and 2693 ms (MOCO) in Average Case Controller-to-Controller latency; and 2873 ms (POCO) and 2756 ms (MOCO) in Worst Case Controller-to-Controller latency. For the AARNet network, the latencies compared as follows: 2473 ms (POCO) and 2129 ms (MOCO) in Switch-to-Controller Average Case latency; 2198 ms (POCO) and 2268 ms (MOCO) in Worst Case Switch-to-Controller latency; 2598 ms (POCO) and 2471 ms (MOCO) in Average Case Controller-to-Controller latency; and 2689 ms (POCO) and 2814 ms (MOCO) in Worst Case Controller-to-Controller latency. The Average Case and Worst Case latencies for Switch-to-Controller and Controller-to-Controller are minimal and favourable to the POCO model as against the MOCO model when evaluated on the GoodNet, Savvis, and AARNet networks. This indicates that the POCO model has a speed advantage over the MOCO model, while the MOCO model appears to be more resilient than POCO.
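The latency metrics compared above can be illustrated with a minimal sketch. This is not the POCO or MOCO algorithm itself; it only shows how average-case and worst-case switch-to-controller latency are typically computed for one candidate placement, using a small made-up delay matrix (node names and delays are hypothetical, not from the study).

```python
# Hypothetical illustration of the latency metrics used to compare
# controller placements. Delays would normally be shortest-path
# latencies over a real topology; here they are invented values.
delay = {
    ("s1", "c1"): 10, ("s1", "c2"): 40,
    ("s2", "c1"): 30, ("s2", "c2"): 15,
    ("s3", "c1"): 25, ("s3", "c2"): 20,
}
switches = ["s1", "s2", "s3"]
controllers = ["c1", "c2"]

def placement_latencies(switches, controllers, delay):
    # Each switch attaches to its nearest controller.
    per_switch = [min(delay[(s, c)] for c in controllers) for s in switches]
    avg_case = sum(per_switch) / len(per_switch)   # average-case latency
    worst_case = max(per_switch)                   # worst-case latency
    return avg_case, worst_case

avg, worst = placement_latencies(switches, controllers, delay)
print(avg, worst)  # → 15.0 20
```

A placement algorithm such as POCO or MOCO would evaluate these two objectives (and others, e.g. resilience) over many candidate controller sets and keep the non-dominated ones.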
Funding: Supported by the National Key R&D Program of China (2021YFA1003001).
Abstract: In this paper, we solve obstacle problems on metric measure spaces with generalized Ricci lower bounds. We show the existence and Lipschitz continuity of the solutions, and then we establish some regularity of the free boundaries.
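For background, the classical Euclidean obstacle problem that this abstract generalizes can be stated as a constrained energy minimization; the symbols below (obstacle ψ, boundary data g) follow the standard textbook setup and are not taken from the paper itself:

```latex
\min_{u \in K} \int_\Omega |\nabla u|^2 \, dx, \qquad
K = \{\, u \in W^{1,2}(\Omega) : u \ge \psi \text{ in } \Omega,\ u = g \text{ on } \partial\Omega \,\}
```

The free boundary is the set ∂{u > ψ} separating the contact region from the region where the constraint is inactive; the paper's contribution is to carry out this analysis on metric measure spaces rather than domains in Euclidean space.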
Funding: Supported by the University of South Africa under Grant No. 409000.
Abstract: To professionally plan and manage the development and evolution of the Internet of Things (IoT), researchers have proposed several IoT performance measurement solutions. Such solutions can be very valuable for managing the development and evolution of IoT systems, as they provide insights into performance issues, resource optimization, predictive maintenance, security, reliability, and user experience. However, several issues can impact the accuracy and reliability of IoT performance measurements, including a lack of standardization, the complexity of IoT systems, scalability, data privacy, and security. While previous studies proposed several IoT measurement solutions in the literature, they did not evaluate any individual one to identify its respective measurement strengths and weaknesses. This study provides a novel scheme for evaluating proposed IoT measurement solutions using a metrology-coverage evaluation based on evaluation theory, metrology principles, and software measurement best practices. This evaluation approach was applied to 12 IoT measure categories and 158 IoT measurement solutions identified in a Systematic Literature Review (SLR) covering 2010 to 2021. The metrology coverage of these IoT measurement solutions was analyzed from four perspectives: across IoT categories, within each study, improvement over time, and implications for IoT practitioners and researchers. The criteria in this metrology-coverage evaluation allowed for the identification of strengths and weaknesses in the theoretical and empirical definitions of the proposed IoT measurement solutions. We found that metrological coverage varies significantly across IoT measurement solution categories and did not improve over the 2010–2021 timeframe. The detailed findings can help practitioners understand the limitations of the proposed measurement solutions and choose those with stronger designs. These evaluation results can also be used by researchers to improve current IoT measurement solution designs and suggest new solutions with a stronger metrology base.
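A metrology-coverage evaluation of the kind described can be pictured as scoring each measurement solution against a checklist of criteria. The criteria names below are illustrative assumptions drawn from general metrology practice, not the study's actual instrument:

```python
# Hypothetical sketch: metrology coverage as the fraction of evaluation
# criteria a proposed measurement solution's definition satisfies.
# These criteria are assumed for illustration; the study's own criteria
# are not listed in the abstract.
criteria = {
    "measurand defined",
    "unit specified",
    "scale type stated",
    "measurement procedure described",
    "empirical validation reported",
}

def coverage(satisfied):
    # satisfied: set of criteria a solution's published definition meets.
    return len(satisfied & criteria) / len(criteria)

solution_a = {"measurand defined", "unit specified"}
print(coverage(solution_a))  # → 0.4
```

Aggregating such scores per category and per year is one plausible way to reproduce the paper's "across categories" and "improvement over time" perspectives.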
Abstract: On the basis of software testing tools we developed for programming languages, we first present a new control flow graph model based on blocks. Using the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria and empirically analyze the subsumption relation between these criteria. We then define four block-based test complexity metrics: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.
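The block-based control flow graph underlying such metrics can be sketched briefly. The J-complexity metrics themselves are not defined in the abstract, so the counts below are only generic structural measures (branch count and a cyclomatic-style count) over an assumed toy graph, not the paper's definitions:

```python
# Hypothetical block-level control flow graph: basic blocks as nodes,
# successor edges between them. The structure is invented for
# illustration.
cfg = {
    "entry": ["cond"],
    "cond":  ["then", "else"],   # branching block: two successors
    "then":  ["exit"],
    "else":  ["exit"],
    "exit":  [],
}

def branch_count(cfg):
    # Decision points: blocks with more than one successor.
    return sum(1 for succs in cfg.values() if len(succs) > 1)

def edge_count(cfg):
    return sum(len(succs) for succs in cfg.values())

# Cyclomatic-style count E - N + 2 over the block graph.
cyclomatic = edge_count(cfg) - len(cfg) + 2
print(branch_count(cfg), cyclomatic)  # → 1 2
```

Adequacy criteria (e.g. block coverage vs. branch coverage) are then compared by asking whether every test set satisfying one criterion necessarily satisfies the other, which is the subsumption relation the paper analyzes.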
Abstract: In software engineering, software measures are often proposed without precise identification of the measurable concepts they attempt to quantify; consequently, the numbers obtained are challenging to reproduce in different measurement contexts and to interpret, either as base measures or in combination as derived measures. The lack of consistency when using base measures in data collection can affect both data preparation and data analysis. This paper analyzes the similarities and differences across three views of measurement methods (the ISO International Vocabulary on Metrology, ISO 15939, and ISO 25021), and uses a process proposed for the design of software measurement methods to analyze two examples of such methods selected from the literature.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11674234 and 11605205); the Fundamental Research Funds for the Central Universities, China (Grant No. 2012017yjsy143); the National Key Research and Development Program of China (Grant No. 2017YFA0305200); the Youth Innovation Promotion Association of the Chinese Academy of Sciences (CAS) (Grant No. 2015317); the Natural Science Foundation of Chongqing, China (Grant Nos. cstc2015jcyjA00021 and cstc2018jcyjAX0656); the Entrepreneurship and Innovation Support Program for Chongqing Overseas Returnees, China (Grant No. cx017134); the Fund of the CAS Key Laboratory of Microscale Magnetic Resonance, China; and the Fund of the CAS Key Laboratory of Quantum Information, China.
Abstract: Pre- and post-selected (PPS) measurement, especially weak PPS measurement, has proved to be a useful tool for measuring extremely tiny physical parameters. However, it is difficult to retain both the attainable highest measurement sensitivity and precision as the parameter to be measured increases. Here, a modulated PPS measurement scheme based on coupling-strength-dependent modulation is presented that retains the highest sensitivity and precision for an arbitrary coupling strength. The idea is demonstrated by comparing the modulated PPS measurement scheme with the standard PPS measurement scheme in the case of an unbalanced input meter. Using the Fisher information metric, we derive the optimal pre- and post-selected states, as well as the optimal coupling-strength-dependent modulation, without any restriction on the coupling strength. We also give a specific strategy for performing the modulated PPS measurement scheme, which may promote its practical application in precision metrology.
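For context, the standard weak-value amplification that weak PPS measurement relies on is usually written as follows; this is the textbook formula from the weak-measurement literature, not the paper's modulated scheme:

```latex
A_w \;=\; \frac{\langle \psi_f \,|\, \hat{A} \,|\, \psi_i \rangle}{\langle \psi_f \,|\, \psi_i \rangle}
```

Here |ψᵢ⟩ and |ψ_f⟩ are the pre- and post-selected states and Â the measured observable; near-orthogonal selections make |A_w| large, which amplifies the meter shift but only in the weak-coupling regime. The abstract's modulation scheme is aimed precisely at lifting that weak-coupling restriction.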
Abstract: An analysis is reported of conventional vs. alternative metrics used in measuring food production efficiency. Economic efficiency is driven by marketplace economics, while engineering efficiency is driven by useful energy conservation. As farming systems are optimized for maximum efficiency, how "efficiency" is defined will dictate the methods used in food production. Farming methods optimized for economic efficiency have environmental consequences not inherent to engineering efficiency; however, farming methods optimized for engineering efficiency have labor requirements not inherent to economic efficiency. A shift from optimizing food production for economic efficiency to engineering efficiency may be necessary in order to feed a growing human population.
Abstract: In this paper we propose a method to construct probability measures on the space of convex bodies. For this purpose, we first introduce the notion of the thinness of a body. We then show the existence of a measure whose pushforward by the thinness function is a probability measure with a truncated normal distribution. Finally, we refine this method to find a measure satisfying some important properties in geometric measure theory.
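The pushforward construction referred to here is the standard measure-theoretic one; writing t for the thinness function and μ for the measure on the space of convex bodies, the condition in the abstract reads:

```latex
(t_{*}\mu)(B) \;=\; \mu\bigl(t^{-1}(B)\bigr) \quad \text{for every Borel set } B \subset \mathbb{R},
```

with the requirement that t₎μ be a truncated normal distribution. The notation t and μ is assumed here for exposition; the paper may use different symbols.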
Funding: Supported by the Ningbo Natural Science Foundation (No. 2006A610016) and the Foundation of the National Education Ministry for Returned Overseas Students & Scholars (SRF for ROCS, SEM, No. 2006699).
Abstract: Metric measurement of digitized shapes is commonly applied in optical measuring systems. In this letter, three shape-related factors defined by the authors are used to construct a multiple linear regression model for computing the circumference of convex shapes in millimeters. The model is first built upon the relationship hypothesis, and its adequacy is then mathematically validated. The results of applying the developed model to a given number of convex shapes in a finite circumferential length range suggest that, in terms of percent error, the model's precision is satisfactory, remaining within ±4%. The test also shows the model's robustness against the shape's orientation anisotropy.
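The regression step can be sketched in miniature. The paper's three shape-related factors are not defined in the abstract, so this sketch fits a single invented factor x to made-up circumference data with closed-form ordinary least squares, then checks the same ±4% percent-error criterion the abstract reports:

```python
# Hypothetical sketch: predicting circumference (mm) from a shape factor
# via linear regression. Both the factor values and circumferences below
# are invented for illustration.
xs = [1.0, 2.0, 3.0, 4.0]        # assumed shape factor
ys = [10.2, 19.8, 30.1, 39.9]    # measured circumference in mm (made up)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Closed-form simple linear regression (least squares).
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
      / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def predict(x):
    return intercept + slope * x

# Percent-error check against the paper's ±4% criterion.
err = abs(predict(2.0) - 19.8) / 19.8 * 100
print(round(slope, 2), err < 4.0)  # → 9.94 True
```

The paper's actual model would use three such factors, i.e. a multiple regression fitted and validated the same way over many digitized convex shapes.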
Abstract: In this paper we study statistically self-similar measures together with statistically self-similar sets. A special kind of statistically self-similar measure is constructed, as well as a class of statistically self-similar sets.