The modeling of the Application Triggering Architecture (ATA) in the IP Multimedia Subsystem (IMS) is presented. Session setup delay and system throughput are employed as metrics to investigate the performance of the ATA and the Serving Call Session Control Function (S-CSCF). Through theoretical analysis and simulation, we find that the number of Application Servers (ASs), the use of subsequent Filter Criteria (sFC), and the arrival rate have a significant impact on the session setup delay, and that the S-CSCF is the major bottleneck in the IMS network. The results are useful for planning IMS networks. Finally, we propose several possible solutions to reduce the session setup delay and decrease the load on the S-CSCF.
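For intuition about why the arrival rate and the number of traversed Application Servers drive the setup delay, the relation below uses a generic per-node M/M/1 approximation; this is an illustrative assumption for exposition, not necessarily the queueing model analyzed in the paper.

```latex
% Illustrative M/M/1 approximation (assumed for intuition, not the paper's exact model):
% per-node sojourn time and an additive estimate of the session setup delay.
W_i = \frac{1}{\mu_i - \lambda_i}\,, \quad \lambda_i < \mu_i,
\qquad
D_{\text{setup}} \;\approx\; \sum_{i \,\in\, \{\text{S-CSCF}\}\,\cup\,\{\text{AS}_1,\dots,\text{AS}_n\}} W_i .
```

As the offered load λ at the S-CSCF approaches its service rate μ, its term dominates the sum, which is consistent with the S-CSCF emerging as the bottleneck; involving more ASs or additional sFC evaluations simply adds terms to the estimate.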
With the increasing demand for a higher quality of life and the growing awareness of energy saving, improving indoor comfort while reducing energy expenditure and slowing the consumption of natural resources is becoming increasingly important. Open-plan housing theory can provide more flexible and adaptive space for users and bring sustainable and economic benefits by making full use of construction materials. Sustainable architectural design, as a response to this challenge, can lower a building's energy consumption and has enormous potential for creating a sustainable living environment and high-quality dwelling conditions. The primary aim of this research is to create a new sustainable architectural design method for occupancy by integrating open-plan housing theory with the application of sustainable technologies. Numerical simulation by computer program is applied to investigate and evaluate the potential of this method in terms of improving indoor comfort and energy-saving capacity.
The network on chip (NoC) is used as a solution to the communication problems in complex system on chip (SoC) design. To further enhance performance, a high-level modeling and evaluation method based on OPNET is proposed to analyze NoC architectures under different injection rates and traffic patterns. Simulation results for a general NoC, in terms of average latency and throughput, are analyzed and used as a guideline for making appropriate choices for a given application. Finally, an MPEG-4 decoder is mapped onto different NoC architectures. The results prove the effectiveness of the evaluation method.
The purpose of this paper is to discuss recent findings in neuroscience that can be useful to architecture. Knowing the working patterns of the brain and how space affects cerebral functions can help architects design buildings that improve users' behavior, performance and well-being. The built environment has a direct impact on the human brain. Social relations, focus, cognition, creativity, memory and well-being can all be influenced by the surrounding physical space. Although it is not possible to create the perfect room, space can be used strategically, depending on the task that individuals are supposed to perform there and on the people (age, gender, culture) who will use the space. Schools can be designed to improve cognition, learning and memorization; hospital buildings can help improve recovery; workspaces can improve performance, creativity and collaboration. Above all, all spaces of long occupation should be designed to improve well-being. How can architecture change automatic behaviors and nudge people to behave in a healthier way? Can architects create buildings and cities that improve socialization and happiness? Can criminality levels drop due to changes in the way environments are designed? These are some of the questions discussed in this paper.
The evolution of the current network faces challenges of programmability, maintainability and manageability due to network ossification. This challenge led to the concept of software-defined networking (SDN), which decouples the control plane from the infrastructure (data) plane. The innovation created the controller placement problem: how to effectively place controllers within a network topology so that they can manage the data-plane devices from the control plane. The study was designed to empirically evaluate and compare two controller placement algorithms, POCO and MOCO, using explorative and comparative investigation techniques. The study evaluated the performance of the Pareto optimal combination (POCO) and multi-objective combination (MOCO) algorithms in relation to calibrated positions of the controller within a software-defined network. The network environment and measurement metrics were held constant for both the POCO and MOCO models during the evaluation, and the strengths and weaknesses of both models were examined. The results showed that the overall latencies of the two algorithms on the GoodNet network are 3100 ms and 2500 ms for POCO and MOCO respectively; average-case switch-to-controller latency is 2598 ms and 2769 ms, and worst-case switch-to-controller latency is 2776 ms and 2987 ms, for POCO and MOCO respectively. On the Savvis network, the two algorithms compared as follows: 2912 ms and 2784 ms for POCO and MOCO respectively in average-case switch-to-controller latency, 3129 ms and 3017 ms in worst-case switch-to-controller latency, 2789 ms and 2693 ms in average-case controller-to-controller latency, and 2873 ms and 2756 ms in worst-case controller-to-controller latency. On the AARNet network, the comparison is: 2473 ms and 2129 ms for POCO and MOCO respectively in average-case switch-to-controller latency, 2198 ms and 2268 ms in worst-case switch-to-controller latency, 2598 ms and 2471 ms in average-case controller-to-controller latency, and 2689 ms and 2814 ms in worst-case controller-to-controller latency. The average-case and worst-case switch-to-controller and controller-to-controller latencies are minimal and favourable to the POCO model as against the MOCO model when evaluated on the GoodNet, Savvis, and AARNet networks. This indicates that the POCO model has a speed advantage over the MOCO model, which in turn appears to be more resilient than the POCO model.
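To make the latency metrics concrete, the sketch below scores one candidate controller placement on a toy topology using shortest-path delays. It is an illustrative evaluation of the metrics named above, not the POCO or MOCO implementations; the toy graph, edge "delay" weights, and function names are assumptions.

```python
# Illustrative sketch (not the POCO or MOCO implementations): scoring one candidate
# controller placement by average/worst switch-to-controller and controller-to-controller
# latency. The toy topology and edge "delay" weights are assumptions for illustration.
import itertools

import networkx as nx

def placement_metrics(graph, controllers):
    """Return the four latency metrics for a given set of controller nodes."""
    dist = dict(nx.all_pairs_dijkstra_path_length(graph, weight="delay"))
    # Every node acts as a switch and attaches to its nearest controller.
    s2c = [min(dist[s][c] for c in controllers) for s in graph.nodes]
    c2c = [dist[a][b] for a, b in itertools.combinations(controllers, 2)] or [0.0]
    return {
        "s2c_avg": sum(s2c) / len(s2c), "s2c_worst": max(s2c),
        "c2c_avg": sum(c2c) / len(c2c), "c2c_worst": max(c2c),
    }

if __name__ == "__main__":
    g = nx.Graph()
    g.add_weighted_edges_from(
        [(0, 1, 10.0), (1, 2, 15.0), (2, 3, 5.0), (3, 0, 20.0), (1, 3, 8.0)],
        weight="delay",
    )
    # Exhaustively score every two-controller placement (feasible only for small graphs).
    best = min(
        ((placement_metrics(g, p), p) for p in itertools.combinations(g.nodes, 2)),
        key=lambda item: item[0]["s2c_worst"],
    )
    print(best)
```

Real placement algorithms differ mainly in how they search this space and how they trade the metrics off against each other (Pareto enumeration versus a combined multi-objective score); the metric computation itself is common to both.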
Co-free Li-rich Mn-based layered oxides are promising candidates for next-generation lithium-ion batteries (LIBs) due to their high specific capacity, high voltage, and low cost. However, their commercialization is hindered by limited cycle life and poor rate performance. Herein, a simple and low-cost in-situ strategy that covers the bulk layered phase with a nanoscale double-layer architecture of lithium polyphosphate (LiPP) and spinel phase is developed for Li_(1.2)Mn_(0.6)Ni_(0.2)O_(2) (LMNO) using the Li^(+) conductor LiPP (denoted as LMNO@S-LiPP). With such a double-layer architecture, the LMNO@S-LiPP half-cell delivers an extremely high capacity of 202.5 mAh·g^(−1) at 1 A·g^(−1) and retains 85.3% of the initial capacity after 300 cycles, which is so far the best high-rate electrochemical performance of all previously reported LMNOs. The energy density of the full-cell assembled with commercial graphite reaches 620.9 Wh·kg^(−1) (based on the total weight of active materials in the cathode and anode). Mechanism studies indicate that the superior electrochemical performance of LMNO@S-LiPP originates from this nanoscale double-layer architecture, which accelerates Li-ion diffusion, restrains oxygen release, inhibits interfacial side reactions, and suppresses structural degradation during cycling. Moreover, the strategy is applicable to other high-energy-density cathodes, such as LiNi_(0.8)Co_(0.1)Mn_(0.1)O_(2), Li_(1.2)Ni_(0.13)Co_(0.13)Mn_(0.54)O_(2), and LiCoO_(2). Hence, this work presents a simple, cost-effective, and scalable strategy for developing high-performance cathode materials.
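For reference, the quoted full-cell figure is a gravimetric energy density referred to the combined mass of cathode and anode active materials; the symbols below are generic and not taken from the paper.

```latex
% Generic definition of full-cell gravimetric energy density on an active-material basis
% (illustrative; Q, V(q), and the masses are generic symbols, not values from the paper).
E_{\text{grav}} \;=\; \frac{\displaystyle\int_{0}^{Q} V(q)\,\mathrm{d}q}{m_{\text{cathode}} + m_{\text{anode}}}
\;\approx\; \frac{Q\,\bar{V}}{m_{\text{cathode}} + m_{\text{anode}}},
```

where Q is the discharge capacity, V(q) the cell voltage during discharge, and V̄ the mean discharge voltage.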
Decisions by state departments of transportation (DOTs) to invest resources in expanding or implementing intelligent transportation systems (ITS) programs, or even retiring existing infrastructure, need to be based on performance evaluations. Nonetheless, an apparent gap exists between the need for ITS performance measurement and its actual implementation, and the available evidence points to challenges in ITS performance measurement processes. This paper evaluated the state of practice of performance measurement for ITS across the US and provides insights. A comprehensive literature review assessed the use of performance measures by DOTs for monitoring implemented ITS programs. Based on the gaps identified through the literature review, a nationwide qualitative survey was used to gather insights from key stakeholders, which are presented in this paper. From the data gathered, performance measurement of ITS is fairly well integrated into DOT ITS programs, with most agencies considering the process beneficial. There are, however, reasons that prevent agencies from measuring ITS performance in greater detail and quality. These include lack of data, fragmented or incomparable data formats, the complexity of the endeavor, lack of data scientists, and difficulty assigning responsibilities when inter-agency collaboration is required. Additionally, DOTs do not benchmark or compare their ITS performance with others, for reasons that include lack of data, lack of guidance or best practices, and incomparable data formats. This paper is relevant because it provides insights expected to guide DOTs and other agencies in developing or reevaluating their ITS performance measurement processes.
A high performance computer (HPC) is a large and complex system whose architecture design faces increasing difficulties and risks. Traditional methods, such as theoretical analysis, component-level simulation and sequential simulation, are not applicable to system-level simulation of HPC systems. Even parallel simulation using large-scale parallel machines has many difficulties with scalability, reliability, generality, and efficiency. According to the current needs of HPC architecture design, this paper proposes a system-level parallel simulation platform: ArchSim. We first introduce the architecture of the ArchSim simulation platform, which is composed of a global server (GS), local server agents (LSAs) and entities. Secondly, we describe the key techniques of ArchSim, including the synchronization protocol, the communication mechanism and the distributed checkpointing/restart mechanism. We then test the main performance indices of ArchSim with the PHOLD benchmark and analyze the extra overhead introduced by ArchSim. Finally, based on ArchSim, we construct a parallel event-driven interconnection network simulator and a system-level simulator for a small-scale HPC system with 256 processors. The results of the performance tests and HPC system simulations demonstrate that ArchSim achieves a high speedup ratio and high scalability on the parallel host machine and supports system-level simulations for the architecture design of HPC systems.
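For clarity, "speedup ratio" and scalability are conventionally reported with the definitions below; these are standard definitions, not figures reproduced from the ArchSim experiments.

```latex
% Standard definitions behind "speedup ratio" and parallel efficiency
% (generic; not specific values from the ArchSim tests).
S(p) = \frac{T_{\text{seq}}}{T_{\text{par}}(p)}, \qquad E(p) = \frac{S(p)}{p},
```

where T_seq is the runtime of the sequential simulation, T_par(p) the runtime on p host processors, and an efficiency E(p) that stays close to 1 as p grows indicates good scalability.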
Traditional electrode manufacturing for lithium-ion batteries is well established, reliable, and has already reached high processing speeds and improvements in production costs. For modern electric vehicles, however, the need for batteries with high gravimetric and volumetric energy densities at the cell level is increasing, and new production concepts are required for this purpose. During the last decade, laser processing of battery materials emerged as a promising tool for either improving manufacturing flexibility and product reliability or enhancing battery performance. Laser cutting and welding have already reached a high level of maturity, and it is obvious that in the near future they will be frequently implemented in battery production lines. This review focuses on laser texturing of electrode materials because of its high potential for significantly enhancing battery performance beyond the state of the art. Technical approaches and processing strategies for new electrode architectures and concepts are presented and discussed with regard to energy and power density requirements. The boost in electrochemical performance due to laser texturing of energy storage materials has so far been proven at the laboratory scale. However, promising developments in high-power, ultrafast laser technology may soon push laser structuring of batteries to the next technical readiness level. For demonstration in pilot lines adapted to future cell production, process upscaling with regard to footprint area and processing speed is the main issue, along with the economic aspects of CapEx amortization and the benefits resulting from the next-generation battery. The review begins with an introduction to the three-dimensional battery and thick-film concepts made possible by laser texturing. Laser processing of electrode components, namely current collectors, anodes, and cathodes, is then presented. Different types of electrode architectures, such as holes, grids, and lines, were generated, and their impact on battery performance is illustrated. The use of high-energy materials that are on the threshold of commercialization is highlighted. The increase in battery performance is achieved by controlling lithium-ion diffusion kinetics in liquid-electrolyte-filled porous electrodes. The review concludes with a discussion of laser parameter tasks for process upscaling in a new type of extreme manufacturing.
School decision makers face a great many decisions when considering a school renovation or a new school building. All stakeholders want a building that is safe and provides an optimal learning environment. However, it is often difficult to know which building features will have the greatest effect on student learning. Because of the limited understanding of the relationship between individual building features and student learning, researchers at the University of Oklahoma are exploring how building components influence student and teacher performance. This paper examines the importance of school building features that can be designed and changed during a renovation project, with the long-term goal of determining which features have the greatest impact on student test scores. The research team believes that although it is difficult to establish the exact relationship between each building feature and student outcomes in a single study, if multiple users repeat the same or similar studies, the effect of these building features will eventually become known. In order to develop a building-features user survey and physical assessment tools, the investigators first had to develop a list of important building features and their definitions in layman's terms. This was accomplished by convening a community advisory board (CAB) and drawing on subject matter expert materials. In addition, previous research relating different school building features to student performance was reviewed. To refine and narrow the list, researchers, community education professionals, and building professionals rated, based on their professional experience, how directly each feature relates to student performance. The building feature list serves as a starting point for determining which features should be analyzed in a later phase of the project. It is hoped that the resulting tools can be used by school decision makers and researchers to assess building features that research has identified as important for student and teacher performance.
Heterogeneity is inevitable in enterprises because of their varied input requirements. The use of proprietary integration products increases costs for enterprises. During integration, the focus has often been found to address only the functional requirements, while the non-functional requirements are side-stepped during the initial stages of a project. Moreover, the use of proprietary integration products and non-standards-based integration platforms has given rise to inflexible integration infrastructures, resulting in adaptability concerns. Web-services-based integration, built on open standards, is deemed the only feasible solution in such cases. This paper presents a performance analysis of enterprise integration in heterogeneous environments for distributed and transactional applications. The analysis is seen as a step towards making intelligent decisions well in advance when choosing integration mechanisms and products to address both the functional and the non-functional requirements, taking future integration needs into consideration.
Neural architecture search (NAS) has become increasingly popular in the deep learning community, mainly because it offers interested users without rich expertise the opportunity to benefit from the success of deep neural networks (DNNs). However, NAS is still laborious and time-consuming because a large number of performance estimations are required during the search process, and training DNNs is computationally intensive. To address this major limitation, improving computational efficiency is essential in the design of NAS. However, a systematic overview of computationally efficient NAS (CE-NAS) methods is still lacking. To fill this gap, we provide a comprehensive survey of the state of the art in CE-NAS by categorizing existing work into proxy-based and surrogate-assisted NAS methods, together with a thorough discussion of their design principles and a quantitative comparison of their performance and computational complexity. The remaining challenges and open research questions are also discussed, and promising research topics in this emerging field are suggested.
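A minimal sketch of the surrogate-assisted idea the survey categorizes: a cheap regression model ranks candidate architectures so that only the most promising ones receive full (expensive) training. The architecture encoding, the sampler, the regressor choice, and the stand-in training function are illustrative assumptions, not a specific method from the survey.

```python
# Illustrative sketch of surrogate-assisted NAS (not a specific method from the survey).
# The encoding, sampler, regressor, and stand-in evaluation are placeholder assumptions.
import random

from sklearn.ensemble import RandomForestRegressor

def sample_architecture(rng):
    # Toy encoding: depth, width, kernel size.
    return [rng.randint(4, 32), rng.randint(16, 256), rng.choice([3, 5, 7])]

def train_and_evaluate(arch):
    # Stand-in for the expensive step (fully training the DNN); returns a synthetic
    # "accuracy" so the sketch runs end-to-end. Replace with real training in practice.
    depth, width, kernel = arch
    return 0.5 + 0.3 * (width / 256) + 0.1 * (depth / 32) - 0.05 * abs(kernel - 5) / 2

def surrogate_assisted_search(n_init=20, n_rounds=5, pool_size=200, top_k=5, seed=0):
    rng = random.Random(seed)
    archs = [sample_architecture(rng) for _ in range(n_init)]
    scores = [train_and_evaluate(a) for a in archs]              # expensive evaluations
    for _ in range(n_rounds):
        surrogate = RandomForestRegressor().fit(archs, scores)   # cheap model of accuracy
        pool = [sample_architecture(rng) for _ in range(pool_size)]
        ranked = sorted(zip(surrogate.predict(pool), pool), reverse=True)[:top_k]
        for _, arch in ranked:                                   # only top-k get real training
            archs.append(arch)
            scores.append(train_and_evaluate(arch))
    return max(zip(scores, archs))

if __name__ == "__main__":
    print(surrogate_assisted_search())
```

The computational saving comes from the ratio of surrogate predictions to full trainings: most candidates are only scored by the regressor, which is exactly the trade-off the survey quantifies across methods.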
This paper investigates the thermal performance of prefabricated exterior walls using the computational fluid dynamics (CFD) method with the aim of reducing energy consumption. The thermal performance of the prefabricated exterior wall was numerically simulated using ANSYS Fluent. After analysis, a composite wall containing an air cavity is taken as the research object. The simulation suggests that when the cavity thickness is 20 mm and 30 mm, the heat transfer coefficient of the air-sandwich wall is 1.3 and 1.29 W/(m²·K), respectively. Therefore, the optimal cavity width is 20 mm, and the most suitable material is the aerated concrete block. In addition, a comparative analysis is conducted of the cavity temperature in the wall under different conditions. It is shown that an intelligent environment control system can significantly improve thermal efficiency, providing a solid theoretical basis for further research on the external insulation of prefabricated buildings.
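For context, the heat transfer coefficient (U-value) of a layered wall with an air cavity is conventionally obtained from the series thermal resistances, as sketched below; the symbols are generic and the layer data are not taken from the paper.

```latex
% Conventional series-resistance relation for the U-value of a layered wall with an air
% cavity (generic symbols; layer thicknesses and conductivities are not the paper's data).
U \;=\; \left( R_{si} \;+\; \sum_{i} \frac{d_i}{\lambda_i} \;+\; R_{\text{cavity}} \;+\; R_{se} \right)^{-1},
```

where d_i and λ_i are the thickness and thermal conductivity of layer i, R_cavity is the thermal resistance of the air gap, and R_si and R_se are the internal and external surface resistances. A thicker cavity lowers U only until convection inside the gap limits R_cavity, which is consistent with the small difference found between the 20 mm and 30 mm cases.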
The expanding volume of information created by Internet of Things (IoT) devices places a strain on cloud computing, which is often used for data analysis and storage. This paper investigates a different approach based on edge cloud applications, in which data are filtered and processed before being delivered to a backup cloud environment. The paper proposes designing and implementing a low-cost, low-power cluster of single-board computers (SBCs) for this purpose, reducing the amount of data that must be transmitted elsewhere by using Big Data ideas and technology. An Apache Hadoop and Spark cluster running a test application was containerized and deployed on a Raspberry Pi cluster using Docker. To obtain system data and analyze the setup's performance, a Prometheus-based monitoring and alerting stack is employed. The paper assesses the system's complexity and demonstrates how containerization can improve fault tolerance and ease of maintenance, allowing the suggested solution to be used in industry. An evaluation of the overall performance is presented to highlight the capabilities and limitations of the suggested architecture, taking into consideration the solution's resource use with respect to device restrictions.
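As an illustration of the monitoring side, the snippet below pulls one metric from a Prometheus server to gauge per-node CPU load on such a cluster. The server URL and the PromQL expression are assumptions for illustration; /api/v1/query is Prometheus' standard instant-query endpoint, and node_cpu_seconds_total is the usual node_exporter metric.

```python
# Illustrative sketch: pulling one metric from a Prometheus server to gauge cluster load.
# The server URL and PromQL expression are assumptions; /api/v1/query is Prometheus'
# standard instant-query endpoint.
import requests

PROMETHEUS_URL = "http://raspberrypi-master:9090"  # hypothetical address of the master node

def query_instant(promql):
    """Run a PromQL instant query and return the result series."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query", params={"query": promql}, timeout=10
    )
    resp.raise_for_status()
    return resp.json()["data"]["result"]

if __name__ == "__main__":
    # Average CPU utilisation per node over the last 5 minutes (node_exporter metric).
    expr = '100 - avg by (instance) (rate(node_cpu_seconds_total{mode="idle"}[5m])) * 100'
    for series in query_instant(expr):
        print(series["metric"].get("instance"), series["value"][1])
```

Queries like this make it straightforward to relate observed throughput of the Hadoop/Spark test application to the CPU, memory, and I/O limits of the SBC nodes.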
Experience from recent earthquakes in Iran, such as the Gilan, Zanjan, Bam and Lorestan earthquakes, indicates that existing buildings are vulnerable to earthquakes. The vulnerability of these structures has various causes, such as design without consideration of seismic regulations, problems with the regulations themselves (design goals), implementation problems, changes of the building occupancy class, increases in the weight of building stories, the addition of new stories, and changes to the architecture of a building made without considering the structural system. The main objective of this research is therefore to examine the features of building configuration and their effects on the damage to buildings in past earthquakes. For this purpose, four earthquakes that occurred in Iran were first selected as case studies. Then three types of buildings (steel structures, concrete structures and masonry buildings) were analyzed in detail. The results show that most of the damage occurred in old steel structures and masonry buildings more than 25 years old. The study also showed that most of the buildings in the study area are steel structures and masonry buildings, while concrete structures are infrequent, and most of the concrete structures suffered no or only slight damage. Therefore, enhancing the performance of existing buildings against earthquake forces through rehabilitation methods is more important than ever. The results also indicate that the architectural design decisions that significantly affect the seismic performance of buildings can be divided into three categories: building configuration, restrictive formal architectural plan, and dangerous structural components. Since these categories are not mutually exclusive, each category can influence the others, so organizing design decisions in this way is very important in order to manage their effects and interdependencies.
Integrative approaches to architectural design + environmental technology pedagogy are essential in educating future generations to respond to impending building energy use challenges. This paper will describe new approaches to incorporating building physics and building technology in the design studio via a diverse cohort of students and faculty, with strong emphasis placed on the development of innovative architectural strategies operating at the intersection of urban demographics, house and housing design, building performance, and sustainability. The United States Department of Energy reports that our buildings account for forty percent of all energy consumed nationally. Our focus on high performance buildings at the Georgia Tech College of Architecture aims to reduce that percentage and meet the rising demand for design and building performance professionals to evaluate the environmental impact of design decisions. Continuing a twenty-five-year trajectory of research leadership, Tech students and faculty are leading the way in digital design, building simulation, engineering, and construction integration. Over the past four years, students from various schools across campus have been working together in a seminar and design studio setting to expand 21st century housing options. Changing urban demographics, sustainability targets, and alternative energy requirements are investigated through smartly researched and elegantly designed housing and public space propositions. The move from an ecologically aware architecture towards an architecture immersed in the emerging debates about carbon footprint and energy consumption is in part driven by increasing international concern over resource availability and delivery. Through reduced costs of alternative energy capture, higher efficiencies, rapid evolution of upstream technologies and applications, and more robust software platforms, along with growing social, political and economic debate, the definition of sustainability is evolving - moving to transform integral parts of architectural practice and education from a primarily aesthetic and assembly oriented trajectory to a more comprehensive understanding of the relationship between design thinking and building performance.
The flexibility offered by an Enterprise Service Bus (ESB) in enabling various applications to exchange data makes it a very important middleware layer responsible for transporting data in a Service-Oriented Architecture (SOA). The popularity of the ESB has given rise to a number of commercial off-the-shelf (COTS) products as well as open source ESBs. In this study, we evaluated three open source ESBs and compared them both qualitatively and quantitatively. The empirical results were statistically tested to determine their statistical significance.
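The sketch below shows the kind of statistical check described above, comparing response-time samples from three ESBs. The sample values are fabricated placeholders, and the test choice (Kruskal-Wallis followed by pairwise Mann-Whitney U) is an assumption, not necessarily the tests used in the study.

```python
# Illustrative sketch: testing whether response-time differences between three ESBs
# are statistically significant. The measurements below are fabricated placeholders;
# the choice of non-parametric tests is an assumption, not taken from the study.
from scipy import stats

esb_a = [112, 118, 121, 109, 115, 119, 123, 111]   # ms, placeholder measurements
esb_b = [131, 127, 135, 129, 140, 133, 128, 137]
esb_c = [114, 120, 117, 122, 116, 113, 125, 118]

h_stat, p_value = stats.kruskal(esb_a, esb_b, esb_c)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

if p_value < 0.05:
    # Pairwise follow-up to locate which ESBs actually differ.
    pairs = [("A vs B", esb_a, esb_b), ("A vs C", esb_a, esb_c), ("B vs C", esb_b, esb_c)]
    for name, x, y in pairs:
        u_stat, p = stats.mannwhitneyu(x, y, alternative="two-sided")
        print(f"{name}: U = {u_stat:.1f}, p = {p:.4f}")
```

Reporting the p-values alongside the raw throughput and latency figures makes clear which observed differences between the ESBs are unlikely to be due to measurement noise.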
Based on Simultaneous Multithreading (SMT), we propose a fault-tolerant scheme called Tri-modular Redundantly and Simultaneously threaded processor with Recovery (TRSTR). TRSTR has the following features. First, we introduce an arbitrator context into the conventional SRT (Simultaneous and Redundantly Threaded) processor; it acts as an arbitrator when the results from the other two contexts disagree, or as an ordinary thread otherwise, thus making full use of SMT's parallelism. Second, we make the sphere of replication in SRT reconfigurable, making it more flexible for changing demands and situations. Third, TRSTR has two working modes, Tri-Simultaneous with Voting (TSV) and Dual-Simultaneous with Arbitrator (DSA), which can be switched at will. Finally, in addition to transient-fault coverage, TRSTR has on-line self-checking and self-recovery abilities, so it can mask some permanent faults and reconfigure itself without stopping the critical job, improving its reliability and availability.
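A toy sketch of the two result-checking modes described above: TSV votes among three redundant results, while DSA compares two and consults the arbitrator only on a mismatch. It models only the decision logic, not the SMT microarchitecture; the function names are illustrative assumptions.

```python
# Toy model of the two TRSTR checking modes: TSV votes among three redundant results,
# DSA compares two and consults the arbitrator only when the pair disagrees.
# This sketches the decision logic only, not the SMT microarchitecture.
def tsv_vote(r0, r1, r2):
    """Tri-Simultaneous with Voting: majority of three redundant results."""
    if r0 == r1 or r0 == r2:
        return r0
    if r1 == r2:
        return r1
    raise RuntimeError("no majority: unrecoverable mismatch")

def dsa_check(r0, r1, arbitrate):
    """Dual-Simultaneous with Arbitrator: arbitrator re-executes only on a mismatch."""
    if r0 == r1:
        return r0
    r2 = arbitrate()          # arbitrator context redoes the work to break the tie
    return tsv_vote(r0, r1, r2)

if __name__ == "__main__":
    print(tsv_vote(42, 42, 7))                      # transient fault in one context -> 42
    print(dsa_check(42, 7, arbitrate=lambda: 42))   # arbitrator resolves the disagreement
```

The design choice this illustrates is that, fault-free, DSA leaves the third context available as an ordinary thread, while TSV pays for full triple redundancy all the time.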
We implemented a generalized infrastructure for the Internet of Things (IoT) intended to be applicable in various areas such as the Smart Grid. This IoT infrastructure offers two methods for storing sensor data. Both share the features of a double overlay structure, virtualization of sensors, and composite services federated through publish/subscribe, and both are implemented by synthesizing the same elemental architectures. Although the two methods largely share common architectural elements, they differ in how those elements are composed and utilized, and the actual implementations showed non-negligible performance differences caused by operational factors beyond the architectural elements themselves. In this paper, we present an analysis of the factors behind these differences based on the measured performance. In particular, we clarify that the negative side effects of naively combining independent elemental micro-solutions can be amplified when maximizing loose coupling is adopted as the top-priority design and operational policy. Such combinations should primarily be evaluated and verified during the basic design phase; however, the way multiple independent architectural elements are synthesized tends to become a blind spot when they are adopted together. As a practical lesson from this case, when carrying out a new synthesis of multiple architectures it is important to strike a balance among the architectural elements, or the solutions based on them, and there is a clear need to establish a methodology, including verification, for such architectural synthesis.
Developments in information technology are providing methods to improve current design practices, where uncertainties about various design elements can be simulated and studied from the design inception. Energy and thermal simulations, improved design representations, and enhanced collaboration using digital media are increasingly being used. With the expanding interest in energy-efficient building design, whole-building energy simulation programs are increasingly employed in the design process to help architects and engineers determine which design strategies save energy and improve building performance. The purpose of this research was to investigate the potential of these programs to perform whole-building energy analysis during the early stages of architectural design, and to compare the results with the actual building energy performance. The research was conducted by simulating the energy usage of a fully functional research laboratory building using two different simulation tools aimed at early schematic design. The results were compared with utility data for the building to identify how closely the simulation results match the actual energy usage. The results indicate that modeled energy data from one of the software programs was significantly higher than the measured, actual energy usage, while the results from the second application were comparable but did not correctly predict the building's monthly energy loads. This suggests that significant deviations may exist between modeled and actual energy consumption for buildings, and more importantly between different simulation software programs. Understanding the limitations and suitability of specific simulation programs is crucial for successfully integrating performance simulations into the design process.
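One hedged way to quantify the "degree of closeness" between simulated and measured monthly energy use is with the normalized mean bias error (NMBE) and the coefficient of variation of the RMSE, two commonly used calibration metrics; the twelve monthly values below are placeholders, not data from the study.

```python
# Illustrative sketch: comparing modeled vs. measured monthly energy use with two common
# calibration metrics, NMBE and CV(RMSE). The monthly values are placeholders, not the
# study's data; sign convention here is (modeled - measured).
import math

def nmbe(measured, modeled):
    n = len(measured)
    mean_measured = sum(measured) / n
    return sum(mod - mea for mea, mod in zip(measured, modeled)) / (n * mean_measured) * 100

def cv_rmse(measured, modeled):
    n = len(measured)
    mean_measured = sum(measured) / n
    rmse = math.sqrt(sum((mod - mea) ** 2 for mea, mod in zip(measured, modeled)) / n)
    return rmse / mean_measured * 100

if __name__ == "__main__":
    measured = [310, 295, 280, 250, 230, 260, 300, 310, 270, 255, 285, 305]  # placeholder units/month
    modeled  = [340, 330, 300, 270, 240, 275, 330, 345, 290, 270, 310, 335]
    print(f"NMBE     = {nmbe(measured, modeled):+.1f} %")
    print(f"CV(RMSE) = {cv_rmse(measured, modeled):.1f} %")
```

A large positive NMBE corresponds to the systematic over-prediction seen with the first tool, while a low NMBE combined with a high CV(RMSE) matches the second tool's behavior of getting the annual total roughly right but missing the monthly profile.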
基金Supported by the National Science Fund for Distinguished Young Scholars (No.60525110)the National 973 Program (No.2007CB307100, 2007CB307103)+2 种基金the Program for New Century Excellent Talents in University (No. NCET-04-0111)the Development Fund Project for Electronic and Information Industry (Mobile Service and Application System Based on 3G)the National Spe-cific Project for Hi-Tech Industrialization and Information Equipments (Mobile Intelligent Network Supporting Value-added Data Services)
文摘The modeling of Application Triggering Architecture (ATA) in the IP Multimedia Sub- system (IMS) is presented. The session setup delay and system throughput are employed as the measurement to investigate the performance of ATA and the Serving Call Session Control Punction (S-CSCF). With theoretical analysis and simulation results, we find that, the number of the ASs (Application Servers), the use of the subsequent Filter Criteria (sFC) and the arrival rate have heavy impact on the session setup delay and the S-CSCF is the major bottleneck in IMS network. The results are useful in constructing IMS network. At last, we propose several possible solutions to reduce the session setup delay and decrease the load of the S-CSCF.
文摘With the increasing requirement of a higher living quality and the growing awareness of energy saving, how to improve the indoor comfort level and to reduce the expenditure of energy and slow down the rate of natural resource consumption is becoming increasingly important. The theory of open-plan housing is able to provide a more flexible and adaptive space for the users and bring sustainable and economic benefit in the way of making full use of construction materials. Sustainable architecture design, as a method to respond the phenomenon, is able to low down the building' s energy consumption and has enormous potentials in creation of sustainable living environment and a high-quality dwelling condition. The primary aim of this research is to create a new sustainable architecture design method for occupancy by integrating openplan housing theory and application of sustainable technologies. Numerical simulation by computer program is applied in order to investigate and evaluate the possibility of this method in teruas of improving indoor comfort level and energy-saving capacity.
基金Supported by the Natural Science Foundation of China(61076019)the China Postdoctoral Science Foundation(20100481134)+1 种基金the Natural Science Foundation of Jiangsu Province(BK2008387)the Graduate Student Innovation Foundation of Jiangsu Province(CX07B-105z)~~
文摘The network on chip(NoC)is used as a solution for the communication problems in a complex system on chip(SoC)design.To further enhance performances,the NoC architectures,a high level modeling and an evaluation method based on OPNET are proposed to analyze their performances on different injection rates and traffic patterns.Simulation results for general NoC in terms of the average latency and the throughput are analyzed and used as a guideline to make appropriate choices for a given application.Finally,a MPEG4 decoder is mapped on different NoC architectures.Results prove the effectiveness of the evaluation method.
文摘The purpose of this paper is to discuss recent findings in neuroscience that can be useful to architecture.Knowing the working patterns of the brain and how space affects cerebral functions can help architects design buildings that improve the user’s behavior,performance and well-being.The built environment has a direct impact on the human brain.Social relations,focus,cognition,creativity,memory and well-being can be influenced by the surrounding physical space.Although it is not possible to create the perfect room,the space can be used in a strategic way,depending on the task that individuals are supposed to do there and depending on the people(age,gender,culture)who will make use of the space.Schools can be designed in a way to improve cognition,learning and memorization;hospital buildings can help improving recovery;workspaces can improve performance,creativity and collaboration.Above all,all spaces of long occupation should be designed in a way to improve well-being.How can architecture change automatic behaviors and nudge people to behave in a healthier way?Can architects create buildings and cities that improve socialization and happiness?Can criminality levels drop due to changes on the way the environments are designed?These are some of the questions that will be discussed in this paper.
文摘The evolution of the current network has challenges of programmability, maintainability and manageability, due to network ossification. This challenge led to the concept of software-defined networking (SDN), to decouple the control system from the infrastructure plane caused by ossification. The innovation created a problem with controller placement. That is how to effectively place controllers within a network topology to manage the network of data plane devices from the control plane. The study was designed to empirically evaluate and compare the functionalities of two controller placement algorithms: the POCO and MOCO. The methodology adopted in the study is the explorative and comparative investigation techniques. The study evaluated the performances of the Pareto optimal combination (POCO) and multi-objective combination (MOCO) algorithms in relation to calibrated positions of the controller within a software-defined network. The network environment and measurement metrics were held constant for both the POCO and MOCO models during the evaluation. The strengths and weaknesses of the POCO and MOCO models were justified. The results showed that the latencies of the two algorithms in relation to the GoodNet network are 3100 ms and 2500 ms for POCO and MOCO respectively. In Switch to Controller Average Case latency, the performance gives 2598 ms and 2769 ms for POCO and MOCO respectively. In Worst Case Switch to Controller latency, the performance shows 2776 ms and 2987 ms for POCO and MOCO respectively. The latencies of the two algorithms evaluated in relation to the Savvis network, compared as follows: 2912 ms and 2784 ms for POCO and MOCO respectively in Switch to Controller Average Case latency, 3129 ms and 3017 ms for POCO and MOCO respectively in Worst Case Switch to Controller latency, 2789 ms and 2693 ms for POCO and MOCO respectively in Average Case Controller to Controller latency, and 2873 ms and 2756 ms for POCO and MOCO in Worst Case Switch to Controller latency respectively. The latencies of the two algorithms evaluated in relation to the AARNet, network compared as follows: 2473 ms and 2129 ms for POCO and MOCO respectively, in Switch to Controller Average Case latency, 2198 ms and 2268 ms for POCO and MOCO respectively, in Worst Case Switch to Controller latency, 2598 ms and 2471 ms for POCO and MOCO respectively, in Average Case Controller to Controller latency, 2689 ms and 2814 ms for POCO and MOCO respectively Worst Case Controller to Controller latency. The Average Case and Worst-Case latencies for Switch to Controller and Controller to Controller are minimal, and favourable to the POCO model as against the MOCO model when evaluated in the Goodnet, Savvis, and the Aanet networks. This simply indicates that the POCO model has a speed advantage as against the MOCO model, which appears to be more resilient than the POCO model.
基金the financial support from the Ministry of Science and Technology of China(MoST,No.52090034)the Higher Education Discipline Innovation Project(No.B12015).
文摘Co-free Li-rich Mn-based layered oxides are promising candidates for next-generation lithium-ion batteries(LIBs)due to their high specific capacity,high voltage,low cost.However,their commercialization is hindered by limited cycle life and poor rate performance.Herein,an in-situ simple and low-cost strategy with a nanoscale double-layer architecture of lithium polyphosphate(LiPP)and spinel phase covered on top of the bulk layered phase,is developed for Li_(1.2)Mn_(0.6)Ni_(0.2)O_(2)(LMNO)using Li^(+)-conductor LiPP(denoted as LMNO@S-LiPP).With such a double-layer covered architecture,the half-cell of LMNO@S-LiPP delivers an extremely high capacity of 202.5 mAh·g^(−1)at 1 A·g^(−1)and retains 85.3%of the initial capacity after 300 cycles,so far,the best highrate electrochemical performance of all the previously reported LMNOs.The energy density of the full-cell assembled with commercial graphite reaches 620.9 Wh·kg^(−1)(based on total weight of active materials in cathode and anode).Mechanism studies indicate that the superior electrochemical performance of LMNO@S-LiPP is originated from such a nanoscale double-layer covered architecture,which accelerates Li-ion diffusion,restrains oxygen release,inhibits interfacial side reactions,suppresses structural degradation during cycling.Moreover,this strategy is applicable for other high-energy-density cathodes,such as LiNi_(0.8)Co_(0.1)Mn_(0.1)O_(2),Li_(1.2)Ni_(0.13)Co_(0.13)Mn_(0.54)O_(2),LiCoO_(2).Hence,this work presents a simple,cost-effective,scalable strategy for the development of high-performance cathode materials.
文摘State departments of transportation’s (DOTs) decisions to invest resources to expand or implement intelligent transportation systems (ITS) programs or even retire existing infrastructure need to be based on performance evaluations. Nonetheless, an apparent gap exists between the need for ITS performance measurements and the actual implementation. The evidence available points to challenges in the ITS performance measurement processes. This paper evaluated the state of practice of performance measurement for ITS across the US and provided insights. A comprehensive literature review assessed the use of performance measures by DOTs for monitoring implemented ITS programs. Based on the gaps identified through the literature review, a nationwide qualitative survey was used to gather insights from key stakeholders on the subject matter and presented in this paper. From the data gathered, performance measurement of ITS is fairly integrated into ITS programs by DOTs, with most agencies considering the process beneficial. There, however, exist reasons that prevent agencies from measuring ITS performance to greater detail and quality. These include lack of data, fragmented or incomparable data formats, the complexity of the endeavor, lack of data scientists, and difficulty assigning responsibilities when inter-agency collaboration is required. Additionally, DOTs do not benchmark or compare their ITS performance with others for reasons that include lack of data, lack of guidance or best practices, and incomparable data formats. This paper is relevant as it provides insights expected to guide DOTs and other agencies in developing or reevaluating their ITS performance measurement processes.
基金supported by the National High Technology Research and Development 863 Program of China under Grant No. 2007AA01Z117the National Basic Research 973 Program of China under Grant No.2007CB310900
文摘High performance computer (HPC) is a complex huge system, of which the architecture design meets increasing difficulties and risks. Traditional methods, such as theoretical analysis, component-level simulation and sequential simulation, are not applicable to system-level simulations of HPC systems. Even the parallel simulation using large-scale parallel machines also have many difficulties in scalability, reliability, generality, as well as efficiency. According to the current needs of HPC architecture design, this paper proposes a system-level parallel simulation platform: ArchSim. We first introduce the architecture of ArchSim simulation platform which is composed of a global server (GS), local server agents (LSA) and entities. Secondly, we emphasize some key techniques of ArchSim, including the synchronization protocol, the communication mechanism and the distributed checkpointing/restart mechanism. We then make a synthesized test of some main performance indices of ArchSim with the phold benchmark and analyze the extra overhead generated by ArchSim. Finally, based on ArchSim, we construct a parallel event-driven interconnection network simulator and a system-level simulator for a small scale HPC system with 256 processors. The results of the performance test and HPC system simulations demonstrate that ArchSim can achieve high speedup ratio and high scalability on parallel host machine and support system-level simulations for the architecture design of HPC systems.
基金The research to anode material development received funding from the German Research Foundation(DFG,project No.392322200)the development of cathode materials and upscaling strategies was funded by the Federal Ministry of Education and Research(Project NextGen-3DBat,03XP0198F).
文摘Traditional electrode manufacturing for lithium-ion batteries is well established,reliable,and has already reached high processing speeds and improvements in production costs.For modern electric vehicles,however,the need for batteries with high gravimetric and volumetric energy densities at cell level is increasing;and new production concepts are required for this purpose.During the last decade,laser processing of battery materials emerged as a promising processing tool for either improving manufacturing flexibility and product reliability or enhancing battery performances.Laser cutting and welding already reached a high level of maturity and it is obvious that in the near future they will become frequently implemented in battery production lines.This review focuses on laser texturing of electrode materials due to its high potential for significantly enhancing battery performances beyond state-of-the-art.Technical approaches and processing strategies for new electrode architectures and concepts will be presented and discussed with regard to energy and power density requirements.The boost of electrochemical performances due to laser texturing of energy storage materials is currently proven at the laboratory scale.However,promising developments in high-power,ultrafast laser technology may push laser structuring of batteries to the next technical readiness level soon.For demonstration in pilot lines adapted to future cell production,process upscaling regarding footprint area and processing speed are the main issues as well as the economic aspects with regards to CapEx amortization and the benefits resulting from the next generation battery.This review begins with an introduction of the three-dimensional battery and thick film concept,made possible by laser texturing.Laser processing of electrode components,namely current collectors,anodes,and cathodes will be presented.Different types of electrode architectures,such as holes,grids,and lines,were generated;their impact on battery performances are illustrated.The usage of high-energy materials,which are on the threshold of commercialization,is highlighted.Battery performance increase is triggered by controlling lithium-ion diffusion kinetics in liquid electrolyte filled porous electrodes.This review concludes with a discussion of various laser parameter tasks for process upscaling in a new type of extreme manufacturing.
文摘School decision makers are faced with a great many decisions when considering a school renovation or new school building. All stakeholders want a building that is safe and provides an optimal learning environment. However, it is often difficult to know which building features will have the greatest effect on student learning. Because of a limited understanding of the relationship between individual building features and student learning, researchers at the University of Oklahoma hope to explore how building components influence student and teacher performance. This paper explores the importance of school building features that can be designed and changed during a renovation project. The hope is to one day determine which features have the greatest impact on student test scores. The research team believes that although it is difficult to find the exact relationship between each building features and student outcomes with one study, if multiple users repeat the same or similar studies, hopefully we will one day know the effect of these building features. In order to develop feature building users survey and physical assessment tools, it was necessary for investigators to develop a list of important building features and their associated definitions in layman terms. This was accomplished through utilization and conducting of a CAB (community advisory board) and subject matter expert materials. In addition, previous research relating to different school building features and their associations with student performance were reviewed. To define and narrow the list the researchers, community educational, and building professionals rated based on their professional experience, how directly related each feature is to student performance. The building feature list serves as a starting point to determine which features should be analyzed in a later phase of the project. It is hoped that resulting tools based on the work of this project can be used by school decision-makers and researchers to access building features that have been identified through research as being important for student and teacher performance.
文摘Heterogeneity is inevitable in enterprises due to their various input requirements. The usage of proprietary integration products results in the increased cost of enterprises. During the integration, the focus area has been found to often address only the functional requirements, while the non-functional requirements are side-stepped during the initial stages of a project. Moreover, the use of proprietary integration products and non-standards-based integration platform has given rise to an inflexible integration infrastructure resulting in adaptability concerns. Web services-based integration, based on open standards, is deemed to be the only feasible solution in such cases. This paper explains the performance analysis of enterprise integration in heterogeneous environments for the distributed and the transactional applications. The analysis presented in this paper is seen as a step towards making intelligent decisions well in advance when choosing the integration mechanism/products to address the functional as well as the non-functional requirements considering the future integration needs.
基金This work was supported by a Ulucu PhD studentshipY.Jin is funded by an Alexander von Humboldt Professorship for Artificial Intelligence endowed by the German Federal Ministry of Education and Research.
文摘Neural architecture search(NAS)has become increasingly popular in the deep learning community recently,mainly because it can provide an opportunity to allow interested users without rich expertise to benefit from the success of deep neural networks(DNNs).However,NAS is still laborious and time-consuming because a large number of performance estimations are required during the search process of NAS,and training DNNs is computationally intensive.To solve this major limitation of NAS,improving the computational efficiency is essential in the design of NAS.However,a systematic overview of computationally efficient NAS(CE-NAS)methods still lacks.To fill this gap,we provide a comprehensive survey of the state-of-the-art on CE-NAS by categorizing the existing work into proxy-based and surrogate-assisted NAS methods,together with a thorough discussion of their design principles and a quantitative comparison of their performances and computational complexities.The remaining challenges and open research questions are also discussed,and promising research topics in this emerging field are suggested.
基金This study was sponsored by the“Civil Engineering,Brand Major Construction Site of Private Universities of Education Department of Henan Province 2017”(Henan Finance and Education:[2016]119).
文摘This paper investigates the thermal performance of prefabricated exterior walls using the Computational Fluid Dynamics method to reduce energy consumption.The thermal performance of the prefabricated exterior wall was numerically simulated using the software ANSYS Fluent.The composite wall containing the cavity is taken as the research object in this paper after analysis.The simulation suggests that when the cavity thickness is 20 mm and 30 mm,the heat transfer coefficient of the air-sandwich wall is 1.3 and 1.29,respectively.Therefore,the optimal width of the cavity is 20 mm,and the most suitable material is the aerated concrete block.In addition,a comparative analysis is conducted on the cavity temperature in the wall under different conditions.It is proven that an intelligent environment control system can significantly improve thermal efficiency and provide a solid theoretical basis for further research in the external insulation of prefabricated buildings.
基金This research project was supported by a grant from the“Research Center of College of Computer and Information Sciences”,Deanship of Scientific Research,King Saud University.
文摘The expanding amounts of information created by Internet of Things(IoT)devices places a strain on cloud computing,which is often used for data analysis and storage.This paper investigates a different approach based on edge cloud applications,which involves data filtering and processing before being delivered to a backup cloud environment.This Paper suggest designing and implementing a low cost,low power cluster of Single Board Computers(SBC)for this purpose,reducing the amount of data that must be transmitted elsewhere,using Big Data ideas and technology.An Apache Hadoop and Spark Cluster that was used to run a test application was containerized and deployed using a Raspberry Pi cluster and Docker.To obtain system data and analyze the setup’s performance a Prometheusbased stack monitoring and alerting solution in the cloud based market is employed.This Paper assesses the system’s complexity and demonstrates how containerization can improve fault tolerance and maintenance ease,allowing the suggested solution to be used in industry.An evaluation of the overall performance is presented to highlight the capabilities and limitations of the suggested architecture,taking into consideration the suggested solution’s resource use in respect to device restrictions.
文摘Experience from recent earthquakes such as Gilan, Zanjan, Bam and Lorestan earthquakes in Iran indicated that the constructed buildings are vulnerable against earthquake. Vulnerability of these structures is due to various reasons such as designing without considering seismic regulations, problems of regulations (design goals), implementation problems, changing of the building occupancy class, increasing the weight of building stories, adding new stories to the building and changing in architecture of building without considering structural system. So the main objective of this research is to examine the features of building configuration and their effects as for the damages to buildings in past earthquakes. For this purpose, initially four occurred earthquakes in Iran are selected as case study. Then three types of buildings (steel structure, concrete structure and masonry buildings) are analyzed with details. Results showed that the most of damages are occurred in the old steel structures and masonry buildings which their ages are more than 25 years. The study showed that most of the buildings in the study area are steel structure and masonry buildings while concrete structures are infrequent which most of them had no or slight damages. Therefore, the importance and need to enhance the performance of available buildings against earthquake forces by rehabilitating methods would be more important than before. Also results indicated that the decisions related to architectural plan which have significant effect on seismic performance of buildings, can be divided into three categories: configuration of building, restrictive formal architectural plan and dangerous structural components, as these categories are not obstacle of each other, it is possible that each category has an influential effect on others. So organizing the design decisions in this way is very important so as to manage their effects and interdependencies.
Abstract: Integrative approaches to architectural design and environmental technology pedagogy are essential in educating future generations to respond to impending building energy use challenges. This paper describes new approaches to incorporating building physics and building technology in the design studio through a diverse cohort of students and faculty, with strong emphasis placed on the development of innovative architectural strategies operating at the intersection of urban demographics, house and housing design, building performance, and sustainability. The United States Department of Energy reports that buildings account for forty percent of all energy consumed nationally. Our focus on high-performance buildings at the Georgia Tech College of Architecture aims to reduce that percentage and to meet the rising demand for design and building performance professionals who can evaluate the environmental impact of design decisions. Continuing a twenty-five-year trajectory of research leadership, Tech students and faculty are leading the way in digital design, building simulation, engineering, and construction integration. Over the past four years, students from various schools across campus have been working together in a seminar and design studio setting to expand 21st-century housing options. Changing urban demographics, sustainability targets, and alternative energy requirements are investigated through carefully researched and elegantly designed housing and public space propositions. The move from an ecologically aware architecture toward an architecture immersed in the emerging debates about carbon footprint and energy consumption is driven in part by increasing international concern over resource availability and delivery. Through reduced costs of alternative energy capture, higher efficiencies, the rapid evolution of upstream technologies and applications, more robust software platforms, and growing social, political and economic debate, the definition of sustainability is evolving, transforming integral parts of architectural practice and education from a primarily aesthetic and assembly-oriented trajectory toward a more comprehensive understanding of the relationship between design thinking and building performance.
Abstract: The flexibility offered by an Enterprise Service Bus (ESB) in enabling various applications to exchange data makes it a very important middleware layer responsible for transporting data in a Service-Oriented Architecture (SOA). The popularity of the ESB has given rise to a number of commercial off-the-shelf (COTS) products as well as open source ESBs. In this study, we evaluated three open source ESBs and compared them both qualitatively and quantitatively. The empirical results were statistically tested to determine their statistical significance.
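The abstract does not state which statistical test was used; one plausible approach for the quantitative comparison is a two-sample t-test on throughput measurements, sketched below with invented numbers purely for illustration.

```python
# Illustrative only: one possible way to test the significance of a
# throughput difference between two ESBs. The samples are invented and the
# choice of test is an assumption, not taken from the study.
from scipy import stats

esb_a_throughput = [512, 498, 530, 505, 520, 515]   # requests/s, hypothetical
esb_b_throughput = [471, 489, 465, 480, 477, 470]   # requests/s, hypothetical

t_stat, p_value = stats.ttest_ind(esb_a_throughput, esb_b_throughput)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference in mean throughput is statistically significant.")
```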
Funding: Supported by the Tenth Five-Year National Defence Pre-Research Project (41316.1.2).
Abstract: Based on Simultaneous Multithreading (SMT), we propose a fault-tolerant scheme called the Tri-modular Redundantly and Simultaneously threaded processor with Recovery (TRSTR). TRSTR has the following features. First, we introduce an arbitrator context into the conventional SRT (Simultaneous and Redundantly Threaded) processor; it acts as an arbitrator when the results from the other two contexts disagree, and as an ordinary thread otherwise, thus making full use of SMT's parallelism. Second, we make the sphere of replication in SRT reconfigurable, making it more flexible for changing demands and situations. Third, TRSTR has two working modes, Tri-Simultaneous with Voting (TSV) and Dual-Simultaneous with Arbitrator (DSA), which can be switched at will. Finally, in addition to transient-fault coverage, TRSTR has on-line self-checking and self-recovering abilities, so it can shield some permanent faults and reconfigure itself without stopping the crucial job, improving its reliability and availability.
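As a software analogy for the voting and arbitration modes described above (not the TRSTR hardware logic itself), the sketch below shows majority voting over three redundant results and a dual mode that consults a third computation only on disagreement.

```python
# Conceptual sketch of TMR-style voting and on-demand arbitration.
# This mirrors the idea of TSV and DSA modes, not their hardware realization.
def vote(result_a, result_b, result_c):
    """Return (value, ok): the majority value and whether voting succeeded."""
    if result_a == result_b or result_a == result_c:
        return result_a, True
    if result_b == result_c:
        return result_b, True
    return None, False  # all three disagree: no majority exists

def arbitrate(result_a, result_b, run_arbitrator):
    """Dual mode: run two copies; invoke the arbitrator only on a mismatch."""
    if result_a == result_b:
        return result_a
    return vote(result_a, result_b, run_arbitrator())[0]

# Hypothetical usage: the third computation executes only when needed.
print(arbitrate(42, 42, lambda: 42))   # fast path, arbitrator never runs
print(arbitrate(42, 41, lambda: 42))   # mismatch resolved by the arbitrator
```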
Abstract: We implemented a generalized infrastructure for the Internet of Things (IoT infrastructure) intended to be applicable in various areas such as the Smart Grid. The infrastructure provides two methods for storing sensor data. Both share the features of a double overlay structure, virtualization of sensors, and composite services federated through a publisher/subscriber mechanism, and both are implemented by synthesizing the same elemental architectures. Although the two methods largely share common architectural elements, they differ in how those elements are composed and utilized. In the actual implementations, we observed non-negligible differences in achieved performance caused by operational factors beyond these architectural elements. In this paper, we present the results of our analysis of the factors behind the observed differences, based on the measured performance. In particular, we clarify that the negative side effects of naively combining independent elemental micro-solutions can be amplified when maximizing the level of loose coupling is adopted as the top design and operational priority. Such combinations should primarily be evaluated and verified during the basic design phase; however, the variation in how they are synthesized tends to become a blind spot when multiple independent architectural elements are adopted together. The practical lesson from this case is that, when synthesizing multiple architectures, it is important to strike a natural balance among the architectural elements or the solutions based on them, and there is a clear demand for a methodology for architectural synthesis, including its verification.
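To illustrate the federation-by-publish/subscribe pattern mentioned in the abstract, here is a minimal in-process broker sketch. It is conceptual only and not the paper's implemented infrastructure; the topic names and payloads are hypothetical.

```python
# Conceptual sketch of a minimal publisher/subscriber broker of the kind used
# to federate composite services over virtualized sensors. Not the paper's code.
from collections import defaultdict
from typing import Callable, Dict, List

class Broker:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

broker = Broker()
# A composite service subscribes to a virtualized sensor topic...
broker.subscribe("sensor/power", lambda m: print("grid service got", m))
# ...and a sensor adapter publishes readings into the overlay.
broker.publish("sensor/power", {"node": "feeder-3", "kW": 12.4})
```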
Abstract: Developments in information technology are providing methods to improve current design practices, in which uncertainties about various design elements can be simulated and studied from the design inception. Energy and thermal simulations, improved design representations, and enhanced collaboration using digital media are increasingly being used. With the expanding interest in energy-efficient building design, whole-building energy simulation programs are increasingly employed in the design process to help architects and engineers determine which design strategies save energy and improve building performance. The purpose of this research was to investigate the potential of these programs to perform whole-building energy analysis during the early stages of architectural design, and to compare the results with the actual building energy performance. The research was conducted by simulating the energy usage of a fully functional research laboratory building using two different simulation tools aimed at early schematic design. The results were compared with utility data for the building to identify how closely the simulation results match its actual energy usage. The results indicate that the modeled energy data from one of the software programs was significantly higher than the measured energy usage, while the results from the second application were comparable but did not correctly predict the building's monthly energy loads. This suggests that significant deviations may exist between modeled and actual energy consumption for buildings, and more importantly between different simulation software programs. Understanding the limitations and suitability of specific simulation programs is crucial for successful integration of performance simulations with the design process.
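The abstract does not specify how the deviation between modeled and measured energy was quantified; a common way to express such a comparison is with the normalized mean bias error (NMBE) and CV(RMSE) over monthly data, sketched below with invented values.

```python
# Illustrative sketch, not the study's actual analysis: comparing modeled and
# measured monthly energy use with two commonly used calibration metrics.
# The monthly values are invented placeholders.
import math

measured = [120, 110, 100,  90,  85,  95, 105, 110, 100,  95, 105, 115]  # MWh
modeled  = [150, 140, 125, 105,  95, 110, 130, 135, 120, 110, 125, 140]  # MWh

n = len(measured)
mean_measured = sum(measured) / n

# NMBE: overall bias of the model relative to the measured mean, in percent.
nmbe = sum(s - m for s, m in zip(modeled, measured)) / (n * mean_measured) * 100
# CV(RMSE): month-to-month scatter of the model error, in percent.
cv_rmse = math.sqrt(sum((s - m) ** 2 for s, m in zip(modeled, measured)) / n) \
          / mean_measured * 100

print(f"NMBE = {nmbe:.1f}%  CV(RMSE) = {cv_rmse:.1f}%")
```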