Funding: The National Natural Science Foundation of China under Grants 61871209 and 61901210, in part by the Artificial Intelligence and Intelligent Transportation Joint Technical Center of HUST and Hubei Chutian Intelligent Transportation Co., LTD under the project "Intelligent Transportation Operation Monitoring Network and System".
Abstract: Nearly all real-world networks are complex networks and are usually in danger of collapse. Therefore, it is crucial to exploit and understand the mechanisms of network attacks and provide better protection for network functionalities. Network dismantling aims to find the smallest set of nodes such that, after their removal, the network is broken into connected components of sub-extensive size. To overcome the limitations and drawbacks of existing network dismantling methods, this paper focuses on the network dismantling problem and proposes a neighbor-loop-structure-based centrality metric, NL, which achieves a balance between computational efficiency and evaluation accuracy. In addition, we design a novel method combining NL-based node removal, greedy tree-breaking, and reinsertion. Moreover, we compare five baseline methods with our algorithm on ten widely used real-world networks and three types of model networks, including Erdős-Rényi random networks, Watts-Strogatz small-world networks, and Barabási-Albert scale-free networks with different network generation parameters. Experimental results demonstrate that our proposed method outperforms most peer methods by obtaining a minimal set of targeted attack nodes. Furthermore, the insights gained from this study may be of assistance to future practical research into real-world networks.
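As a rough illustration of the dismantling procedure described above, the following sketch removes the currently most central node one at a time until the largest connected component falls below a target size. It assumes networkx and uses degree centrality as a stand-in for the NL metric, which is not specified here; the greedy tree-breaking and reinsertion steps are likewise omitted.

```python
import networkx as nx

def dismantle(G, target_frac=0.01, rank=nx.degree_centrality):
    """Greedily remove the highest-ranked node (recomputing the ranking
    after each removal) until the largest connected component holds no
    more than `target_frac` of the original nodes."""
    G = G.copy()
    n = G.number_of_nodes()
    removed = []
    while max(len(c) for c in nx.connected_components(G)) > target_frac * n:
        scores = rank(G)                 # placeholder for the NL metric
        v = max(scores, key=scores.get)  # most central remaining node
        G.remove_node(v)
        removed.append(v)
    return removed

# Example on a Barabási-Albert model network.
G = nx.barabasi_albert_graph(1000, 3, seed=42)
attack_set = dismantle(G, target_frac=0.05)
print(f"{len(attack_set)} nodes removed to dismantle the network")
```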
Abstract: The number of botnet malware attacks on Internet devices has grown at an equivalent rate to the number of Internet devices that are connected to the Internet. Bot detection using machine learning (ML) with flow-based features has been extensively studied in the literature. Existing flow-based detection methods involve significant computational overhead and do not completely capture the network communication patterns that might reveal other features of malicious hosts. Recently, graph-based bot detection methods using ML have gained attention to overcome these limitations, as graphs provide a real representation of network communications. The purpose of this study is to build a botnet malware detection system utilizing centrality measures for graph-based botnet detection and ML. We propose BotSward, a graph-based bot detection system that is based on ML. We apply the efficient centrality measures Closeness Centrality (CC), Degree Centrality (DC), and PageRank (PR), and compare them with others used in the state of the art. The efficiency of the proposed method is verified on the available Czech Technical University 13 dataset (CTU-13). The CTU-13 dataset contains 13 real botnet traffic scenarios that are connected to a command-and-control (C&C) channel and that cause malicious actions such as phishing, distributed denial-of-service (DDoS) attacks, spam attacks, etc. BotSward is robust to zero-day attacks, suitable for large-scale datasets, and is intended to produce better accuracy than state-of-the-art techniques. The proposed BotSward solution achieved 99% accuracy in botnet attack detection with a false positive rate as low as 0.0001%.
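The centrality features named in the abstract are straightforward to extract with standard graph tooling. The sketch below (assuming networkx and pandas, with a handful of made-up flow records in place of CTU-13 data) builds a communication graph and computes closeness, degree, and PageRank per host as inputs for a downstream classifier; it is not the BotSward pipeline itself.

```python
import networkx as nx
import pandas as pd

# Illustrative flow records (src, dst); in practice these would come from
# a dataset such as CTU-13 after preprocessing.
flows = [("10.0.0.1", "10.0.0.9"), ("10.0.0.2", "10.0.0.9"),
         ("10.0.0.9", "192.168.1.5"), ("10.0.0.3", "10.0.0.9")]

G = nx.DiGraph()
G.add_edges_from(flows)

# Per-host centrality features to feed a downstream ML classifier.
features = pd.DataFrame({
    "closeness": nx.closeness_centrality(G),
    "degree": nx.degree_centrality(G),
    "pagerank": nx.pagerank(G),
})
print(features)
```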
Funding: The National Basic Research Program of China (973 Program) (No. 2005CB321802), the Program for New Century Excellent Talents in University (No. NCET-06-0926), and the National Natural Science Foundation of China (No. 60873097, 90612009).
Abstract: To resolve the ontology understanding problem, the structural features and the potentially important terms of a large-scale ontology are investigated from the perspective of complex network analysis. Through empirical studies of the gene ontology from various perspectives, this paper shows that the whole gene ontology displays the same topological features as complex networks, including the "small world" and "scale-free" properties, while some sub-ontologies have the "scale-free" property but no "small world" effect. The potentially important terms in an ontology are discovered by some well-known complex network centralization methods. An evaluation method based on information retrieval in MEDLINE is designed to measure the effectiveness of the discovered important terms. According to the relevant literature of the gene ontology terms, the suitability of these centralization methods for discovering important ontology concepts is quantitatively evaluated. The experimental results indicate that betweenness centrality is the most appropriate method among all the evaluated centralization measures.
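As an illustration of ranking ontology terms by centrality, the following sketch (a toy is-a graph with hypothetical term names, not actual GO identifiers) computes betweenness centrality, the measure the study found most appropriate.

```python
import networkx as nx

# Toy ontology graph: edges point from a term to its parent (is-a links).
# Real GO terms would be identifiers such as "GO:0008150".
edges = [("term_A", "term_root"), ("term_B", "term_root"),
         ("term_C", "term_A"), ("term_D", "term_A"), ("term_E", "term_B")]
G = nx.DiGraph(edges)

# Rank terms by betweenness centrality; terms that bridge many paths
# between other terms are candidates for "important" concepts.
bc = nx.betweenness_centrality(G.to_undirected())
for term, score in sorted(bc.items(), key=lambda kv: -kv[1]):
    print(term, round(score, 3))
```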
Abstract: To examine the interdependency and evolution of Pakistan's stock market, we consider the cross-correlation coefficients of daily stock returns belonging to the blue-chip Karachi Stock Exchange (KSE-100) index. Using the minimum spanning tree network-based method, we extend the financial network literature by examining the topological properties of the network and generating six minimum spanning tree networks around three general elections in Pakistan. Our results reveal a star-like structure after the general elections of 2018 and before those in 2008, and a tree-like structure otherwise. We also highlight key nodes and the presence of different clusters, and compare the differences between the three elections. Additionally, the sectorial centrality measures reveal economic expansion in three industrial sectors: cement, oil and gas, and fertilizers. Moreover, a strong overall intermediary role of the fertilizer sector is observed. The results indicate a structural change in the stock market network due to general elections. Consequently, through this analysis, policy makers can focus on monitoring key nodes around general elections to estimate stock market stability, while local and international investors can form optimal diversification strategies.
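A minimal sketch of the minimum-spanning-tree construction for a correlation-based stock network is shown below; the ticker symbols and returns are placeholders, and the transform d = sqrt(2(1 - rho)) is the commonly used mapping from correlation to a metric distance.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
# Illustrative daily returns for 5 hypothetical tickers (rows = days).
tickers = ["LUCK", "OGDC", "ENGRO", "HBL", "PSO"]   # placeholder symbols
returns = rng.normal(0, 0.01, size=(250, len(tickers)))

corr = np.corrcoef(returns, rowvar=False)
# Map correlation to a distance so that highly correlated stocks are close.
dist = np.sqrt(2.0 * (1.0 - corr))

G = nx.Graph()
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        G.add_edge(tickers[i], tickers[j], weight=dist[i, j])

mst = nx.minimum_spanning_tree(G)   # backbone of the correlation network
print(sorted(mst.edges(data="weight")))
```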
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61374180 and 61373136), the Ministry of Education Research in the Humanities and Social Sciences Planning Fund Project, China (Grant No. 12YJAZH120), and the Six Projects Sponsoring Talent Summits of Jiangsu Province, China (Grant No. RLD201212).
Abstract: A tiny fraction of influential individuals play a critical role in the dynamics of complex systems. Identifying the influential nodes in complex networks has theoretical and practical significance. Considering the uncertainties of network scale and topology, and the timeliness of dynamic behaviors in real networks, we propose a rapid identifying method (RIM) to find the fraction of high-influential nodes. Instead of ranking all nodes, our method aims at ranking only a small number of nodes in the network. We set the high-influential nodes as initial spreaders and evaluate the performance of RIM with the susceptible-infected-recovered (SIR) model. The simulations show that, in different networks, RIM performs well at rapidly identifying high-influential nodes, which is verified against typical ranking methods such as the degree, closeness, betweenness, and eigenvector centrality methods.
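The SIR-based evaluation mentioned above can be reproduced in a few lines. The sketch below (assuming networkx, with top-degree nodes standing in for the RIM-selected spreaders) runs a discrete-time SIR process from a chosen seed set and reports the final outbreak size.

```python
import random
import networkx as nx

def sir_outbreak(G, seeds, beta=0.1, rng=random.Random(1)):
    """Discrete-time SIR spread in which every infected node recovers after
    one step; returns the final outbreak size (number of recovered nodes)."""
    infected, recovered = set(seeds), set()
    while infected:
        new_infected = set()
        for u in infected:
            for v in G.neighbors(u):
                if v not in infected and v not in recovered and rng.random() < beta:
                    new_infected.add(v)
        recovered |= infected
        infected = new_infected - recovered
    return len(recovered)

# Compare seed sets chosen by degree centrality on a small-world network.
G = nx.watts_strogatz_graph(500, 6, 0.1, seed=3)
seeds = [n for n, _ in sorted(G.degree, key=lambda kv: -kv[1])[:5]]
print("outbreak size with top-degree seeds:", sir_outbreak(G, seeds))
```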
Abstract: Discovering and identifying the influential nodes in any complex network has been an important issue and is a significant factor in controlling the network. Through control over a network, information can be spread or stopped within a short span of time; both goals can be achieved, since a network of information can be extended as well as destroyed. Information spread and community formation have therefore become some of the most crucial issues in the world of SNA (Social Network Analysis). In this work, the complex network of the Twitter social network is formalized and the results are analyzed. For this purpose, different network metrics are utilized. Visualization of the network is provided in its original form and then filtered (at different percentages) to eliminate the less impactful nodes and edges for better analysis. This network is analyzed according to different centrality measures, such as edge betweenness, betweenness centrality, closeness centrality, and eigenvector centrality. Influential nodes are detected and their impact on the network is observed. The communities are analyzed in terms of network coverage, considering the minimum spanning tree, the shortest path distribution, and the network diameter. These are found to be very effective ways to find influential and central nodes in large social networks such as Facebook, Instagram, Twitter, LinkedIn, etc.
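As an example of the edge-betweenness and coverage analyses mentioned above, the sketch below (using networkx and the karate club graph as a stand-in for a Twitter subgraph) extracts communities via Girvan-Newman, which repeatedly removes the highest edge-betweenness edge, and reports the diameter and average shortest path length.

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

# Small stand-in for a social graph; a real Twitter network would be built
# from follower or retweet edges.
G = nx.karate_club_graph()

# Edge betweenness drives the Girvan-Newman community splitting.
communities = next(girvan_newman(G))
print("communities:", [sorted(c) for c in communities])

# Coverage-style summaries mentioned in the abstract.
print("diameter:", nx.diameter(G))
print("average shortest path length:",
      round(nx.average_shortest_path_length(G), 2))
```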
Funding: Supported by the National Natural Science Foundation of China (61272119, 61203372).
Abstract: A new centrality measure for complex networks, called resource flow centrality, is proposed in this paper. This centrality measure is based on the concept of resource flow in networks. It can be applied not only to connected networks but also to disconnected networks. Moreover, it overcomes some disadvantages of several common centrality measures. The performance of the proposed measure is compared with some standard centrality measures using a classic dataset, and the results indicate that the proposed measure performs more reasonably. The statistical distribution of the proposed centrality is investigated by experiments on large-scale computer-generated graphs and two networks from the real world.
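The resource flow centrality itself is not defined in this abstract, so it cannot be reproduced here; the sketch below only illustrates the kind of comparison against standard centrality measures on a classic dataset (Zachary's karate club), assuming networkx.

```python
import networkx as nx

# Zachary's karate club is a classic benchmark for comparing centrality
# measures; the "resource flow" centrality itself is not reproduced here.
G = nx.karate_club_graph()

measures = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
}

# Print the top-5 ranked nodes under each measure for a quick comparison.
for name, scores in measures.items():
    top = sorted(scores, key=scores.get, reverse=True)[:5]
    print(f"{name:12s} top-5 nodes: {top}")
```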
Abstract: The main aim of this paper is to compare the stability, in terms of systemic risk, of conventional and Islamic banking systems. To this aim, we propose correlation network models for stock market returns based on graphical Gaussian distributions, which allow us to capture the contagion effects that move across countries. We also consider Bayesian graphical models to account for model uncertainty in the measurement of financial system interconnectedness. Our proposed model is applied to the Middle East and North Africa (MENA) region banking sector, characterized by the presence of both conventional and Islamic banks, for the period from 2007 to the beginning of 2014. Our empirical findings show that there are differences in the systemic risk and stability of the two banking systems during crisis times. In addition, the differences are subject to country-specific effects that are amplified during crisis periods.
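A graphical Gaussian (sparse precision) model of returns can be estimated with off-the-shelf tooling. The sketch below, assuming scikit-learn and using synthetic returns in place of MENA bank stock data, recovers conditional-dependence edges from the estimated precision matrix; it does not include the Bayesian model-uncertainty layer described above.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(7)
# Illustrative standardized daily returns for 6 hypothetical bank stocks.
returns = rng.normal(size=(500, 6))

# Graphical Gaussian model: nonzero off-diagonal entries of the estimated
# precision matrix correspond to conditional-dependence (contagion) edges.
model = GraphicalLasso(alpha=0.1).fit(returns)
precision = model.precision_

edges = [(i, j) for i in range(6) for j in range(i + 1, 6)
         if abs(precision[i, j]) > 1e-8]
print("conditional-dependence edges:", edges)
```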
Abstract: We know that functional and structural organization is altered in the human brain network due to Alzheimer's disease. In this paper we highlight how graph theory techniques and structural parameters such as connectivity, diameter, vertex centrality, betweenness centrality, clustering coefficient, degree distribution, cluster analysis, and graph cores are used to analyse magnetoencephalography data and explore functional network integrity in patients affected by Alzheimer's disease. We also record that both weighted and unweighted undirected or directed graphs, depending on the functional connectivity analysis, with attention to network connectivity and vertex centrality, can model and explain the loss of links, the status of the hub in the parietal region, derailed synchronization in the network, and the clinically significant centrality loss at the vital left temporal region found in cases of Alzheimer's disease. We also note that graph-theory-driven measures such as characteristic path length and clustering coefficient can be used to study and report a sudden electroencephalography effect in Alzheimer's disease through cross-sample entropy. Finally, we provide an adequate literature survey of the latest and most advanced graphical tools for graph layout and visualization, to help understand complex brain networks and unravel the mysteries of Alzheimer's disease.
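Two of the graph-theoretic markers mentioned above, the clustering coefficient and the characteristic path length, can be computed as follows; the connectivity matrix is synthetic and the 0.7 threshold is an arbitrary illustration, not a value taken from the paper.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)
# Illustrative symmetric "functional connectivity" matrix for 20 regions;
# real values would come from MEG/EEG correlation or coherence estimates.
C = rng.uniform(0, 1, size=(20, 20))
C = (C + C.T) / 2
np.fill_diagonal(C, 0)

# Threshold the matrix to obtain an unweighted graph.
G = nx.from_numpy_array((C > 0.7).astype(int))

# Two graph-theoretic markers discussed in the abstract.
print("clustering coefficient:", round(nx.average_clustering(G), 3))
if nx.is_connected(G):
    print("characteristic path length:",
          round(nx.average_shortest_path_length(G), 3))
else:
    # Restrict to the largest connected component when the graph splits.
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    print("characteristic path length (giant component):",
          round(nx.average_shortest_path_length(giant), 3))
```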
Funding: This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior-Brasil (CAPES), Finance Code 001 (88887.712553/2022-00), the Conselho Nacional de Desenvolvimento Científico e Tecnológico-Brasil (CNPq) under grant 441016/2020-0, and the São Paulo Research Foundation (FAPESP) under grant 2021/10599-3.
Abstract: Since the COVID-19 pandemic was first reported in 2019, it has rapidly spread around the world. Many countries implemented several measures to try to control the spread of the virus. The healthcare system, and consequently the general quality of life of the population in the cities, have been significantly impacted by the coronavirus pandemic. The different waves of contagion were responsible for the increase in the number of cases that, unfortunately, many times led to death. In this paper, we aim to characterize the dynamics of the six waves of cases and deaths caused by COVID-19 in the city of Rio de Janeiro using techniques such as the Poincaré plot, approximate entropy, the second-order difference plot, and central tendency measures. Our results reveal that, by examining the structure and patterns of the time series with a set of non-linear techniques, we can gain a better understanding of the role of the multiple waves of COVID-19; we can also identify the underlying dynamics of disease spreading and extract meaningful information about the dynamical behavior of epidemiological time series. Such findings can help to closely approximate the dynamics of virus spread and obtain a correlation between the different stages of the disease, allowing us to identify and categorize the stages due to different virus variants that are reflected in the time series.
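Of the nonlinear techniques listed above, approximate entropy is the least standard to compute by hand, so a compact implementation is sketched below; the two-wave "cases" series is synthetic and the parameter choices (m = 2, r = 0.2 times the standard deviation) follow common practice rather than the paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D series, following Pincus (1991):
    ApEn = phi(m) - phi(m+1), where phi counts template matches within
    tolerance r (default 0.2 * standard deviation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).mean(axis=1)   # self-matches are included
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

# Example on a synthetic "daily cases" series with two wave-like bumps.
t = np.arange(300)
cases = 1000 * (np.exp(-((t - 80) / 25) ** 2) + np.exp(-((t - 200) / 40) ** 2))
cases += np.random.default_rng(0).normal(0, 20, size=t.size)
print("ApEn:", round(approximate_entropy(cases), 3))
```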
Abstract: The famous modernist architect Richard Neutra argued that movement through, and understanding of, a building could be choreographed by controlling the visual stimuli available to a person. These claims are tested by quantifying the lines of sight and intelligibility of five of Neutra's residential designs. A computational method, weighted axial line analysis, is used to investigate lines of sight and movement in five of Neutra's house designs. The cumulative lengths of the axial lines required to reach public and private spaces are compared, and centrality measures weighted with line length data are calculated for each design. Intelligibility metrics are calculated from these centrality measures. The first hypothesis, that visual stimuli in Neutra's architecture are greater when accessing public rather than private spaces, is supported by the results. The second hypothesis, that Neutra's architecture is highly intelligible, is not supported by the results. This research tests two theories used to explain the works of a famous architect, and it develops a new variation of the well-known Space Syntax technique to account for axial line lengths.
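A length-weighted centrality of the kind described above might be computed roughly as follows; the axial-line graph, the line lengths, and the choice to weight each step by the mean length of the two lines are all illustrative assumptions, not the paper's weighted axial line analysis.

```python
import networkx as nx

# Toy axial-line connectivity graph: nodes are axial lines, an edge means
# two lines intersect, and each node carries the length of its axial line.
# Real data would come from an axial map of one of the house plans.
lengths = {"entry": 8.0, "hall": 12.0, "living": 10.0,
           "bedroom": 5.0, "terrace": 7.0}
G = nx.Graph([("entry", "hall"), ("hall", "living"),
              ("hall", "bedroom"), ("living", "terrace")])

# Weight each step by the mean length of the two lines involved, so that
# closeness reflects cumulative line length rather than pure turn counts.
for u, v in G.edges:
    G[u][v]["length"] = (lengths[u] + lengths[v]) / 2

closeness = nx.closeness_centrality(G, distance="length")
print({k: round(v, 3) for k, v in closeness.items()})
```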
Abstract: In this paper, the relationship between the s-dimensional Hausdorff measures and the g-measures in R^d is discussed, where g is a gauge function which is equivalent to t^s and 0 < s ≤ d. It is shown that if s = d, then H^g = c1 H^d, C^g = c2 C^d and P^g = c3 P^d on R^d, where the constants c1, c2 and c3 are determined by g, and H^g, C^g and P^g are the g-Hausdorff, g-central Hausdorff and g-packing measures on R^d respectively. In the case 0 < s < d, some examples are given to show that the above conclusion may fail. However, there is always some s-set F ⊂ R^d such that H^g|F = c1 H^s|F, C^g|F = c2 C^s|F and P^g|F = c3 P^s|F, where the constants c1, c2 and c3 depend not only on g and s, but also on F. A criterion is presented for judging whether an s-set has the above properties.
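For reference, the textbook construction of the Hausdorff measure generated by a gauge function (the g-Hausdorff measure H^g referred to above) is recalled below; this is the standard definition, not a formula reproduced from the paper, and the reading of "g equivalent to t^s" given in the comment is the usual two-sided comparability assumption.

```latex
% g-Hausdorff measure generated by a gauge function g (standard construction):
% cover E by sets U_i of diameter at most \delta and let \delta \to 0.
\[
  \mathcal{H}^{g}(E)
  = \lim_{\delta \to 0}\;
    \inf\Bigl\{ \sum_{i} g\bigl(\operatorname{diam} U_i\bigr)
      \;:\; E \subset \bigcup_{i} U_i,\ \operatorname{diam} U_i \le \delta \Bigr\}.
\]
% With g(t) = t^{s} this reduces to the usual s-dimensional Hausdorff
% measure \mathcal{H}^{s}; "g equivalent to t^{s}" is usually taken to mean
% that g(t)/t^{s} is bounded away from 0 and \infty as t \to 0.
```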
Abstract: The blockchain art market is partitioned around the roles of artists and collectors and is highly concentrated among a few prominent figures. We hence propose to adapt Kleinberg's authority/hub HITS (Hyperlink-Induced Topic Search) method to rate artists and collectors in the art context. This seems a reasonable choice, since the original method deftly defines its scores in terms of a mutually recursive relationship between authorities/artists, the miners of information/art, and hubs/collectors, the assemblers of such information/art. We evaluated the proposed method on the collector-artist network of the SuperRare gallery, the major crypto art marketplace. We found that the proposed artist and collector metrics are weakly correlated with other network science metrics such as degree and strength. This hints at the possibility of coupling different measures in order to profile active users of the gallery, and suggests investment strategies with different risk/reward ratios for collectors as well as marketing strategies with different targets for artists.
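Kleinberg's HITS scores on a collector-to-artist graph can be computed directly with networkx, as in the sketch below; the purchase edges are invented, and the example omits the edge weights a real marketplace network would carry.

```python
import networkx as nx

# Toy collector -> artist purchase graph; edges point from the collector
# (hub) to the artist (authority), mirroring the adaptation described above.
purchases = [("collector_1", "artist_A"), ("collector_1", "artist_B"),
             ("collector_2", "artist_A"), ("collector_3", "artist_A"),
             ("collector_3", "artist_C")]
G = nx.DiGraph(purchases)

hubs, authorities = nx.hits(G, max_iter=500, normalized=True)
print("collector (hub) scores:",
      {n: round(s, 3) for n, s in hubs.items() if n.startswith("collector")})
print("artist (authority) scores:",
      {n: round(s, 3) for n, s in authorities.items() if n.startswith("artist")})
```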
Abstract: Routing in wireless sensor networks plays a crucial role in deploying and managing an efficient and adaptive network. Ensuring efficient routing entails an ever-increasing necessity for optimized energy consumption and reliable resource management of both the sensor nodes and the overall sensor network. An efficient routing solution is characterized by its ability to increase network lifetime, enhance efficiency, and ensure the appropriate quality of service. Therefore, routing protocols need to be designed with the ultimate objective of considering and efficiently managing many characteristics and requirements, such as fault tolerance, scalability, production costs, and others. Unfortunately, many of the existing solutions lead to higher power consumption and communication control overheads, which not only increase network congestion but also decrease network lifetime. In addition, most of these protocols consider a limited number of criteria, in contrast to the complexity and the evolution of WSNs. This paper presents a new adaptive and dynamic multi-criteria routing protocol. Our protocol operates in multi-constraint environments, where most of the current solutions fail to monitor successive and continuous changes in network state and user preferences. This approach provides a routing mechanism which creates a routing tree based on the evaluation of many criteria. These criteria can cover the topological metrics of neighboring nodes (the role of the nodes in intra-communications, connections between different parts of the network, etc.), the estimated power consumption to reach each direct neighbor, the path length (number of hops to the sink), the remaining energy of individual sensor nodes, and others. These criteria are controlled and supervised dynamically through a detection scheme. In addition, a dynamic selection mechanism, based on multi-attribute decision-making methods, is implemented to build and update the routing tree. In response to changes in the network state, user preferences, and application-defined goals, the election mechanism provides the best routing neighbor between each node and the sink.
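The multi-attribute neighbor election can be illustrated with a simple weighted-sum score, as sketched below; the criteria, weights, and values are hypothetical and stand in for the protocol's actual multi-attribute decision-making method.

```python
# A minimal sketch of multi-attribute next-hop selection using a simple
# weighted-sum score; attribute names, weights, and values are illustrative
# and not taken from the protocol described in the abstract.
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: str
    hops_to_sink: int        # path length criterion (lower is better)
    link_cost: float         # estimated energy to reach this neighbor (lower is better)
    residual_energy: float   # remaining energy, normalized to [0, 1] (higher is better)

def score(n: Neighbor, w_hops=0.4, w_cost=0.3, w_energy=0.3) -> float:
    # Convert "lower is better" criteria into benefits before weighting.
    return (w_hops * 1.0 / (1 + n.hops_to_sink)
            + w_cost * 1.0 / (1 + n.link_cost)
            + w_energy * n.residual_energy)

neighbors = [Neighbor("A", hops_to_sink=2, link_cost=1.5, residual_energy=0.9),
             Neighbor("B", hops_to_sink=1, link_cost=2.5, residual_energy=0.4),
             Neighbor("C", hops_to_sink=3, link_cost=0.8, residual_energy=0.7)]

best = max(neighbors, key=score)
print("selected next hop:", best.node_id)
```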