Funding: Supported by the National Natural Science Foundation of China (NSFC) (Project No. 71273219) and the Hong Kong Research Grants Council (RGC) General Research Fund (GRF) (Project No. 17201917).
Abstract: It is generally accepted that the extra construction costs of green buildings will result in benefits including lower operation costs, higher sale/rental prices, and better sustainability performance. However, there has been little recognition of construction waste minimization (CWM) as one of the important sustainability benefits attributed to green buildings. This paper aims to provide a better understanding of the cost benefit of green buildings with respect to CWM by using big data in the context of Hong Kong. The study is innovative in that it conducts a cost-benefit analysis specifically on CWM of green buildings by mining large-volume datasets. A surprising finding is that Hong Kong's green building rating system (GBRS), i.e. the BEAM Plus, has a negligible effect on CWM, while it generally increases construction costs by approximately 24%. Hence, the increased construction cost of green buildings cannot be offset by CWM if the corresponding items in the BEAM Plus are not properly incentivized. This paper demonstrates the necessity of emphasizing CWM-related items in GBRSs and of taking appropriate measures to address them. It also provides better decision-support information on the increased construction costs and the attainable benefits of green building that developers may wish to consider when initiating a green building project.
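To make the offset argument concrete, the back-of-the-envelope sketch below compares the roughly 24% construction-cost premium reported in the abstract against hypothetical waste-disposal savings; all monetary figures and the waste-reduction rate are illustrative assumptions, not values from the study.

```python
# Illustrative cost-benefit check: can construction-waste savings offset a
# green-building cost premium? All inputs except the 24% premium are assumptions.
base_construction_cost = 500e6      # HKD, hypothetical project budget
green_premium_rate = 0.24           # ~24% extra construction cost (from the abstract)
waste_generated = 20_000            # tonnes, hypothetical
disposal_charge_per_tonne = 200     # HKD/tonne, hypothetical disposal charging rate
assumed_waste_reduction = 0.05      # 5% less waste, hypothetical (the study finds a negligible effect)

extra_cost = base_construction_cost * green_premium_rate
waste_saving = waste_generated * assumed_waste_reduction * disposal_charge_per_tonne

print(f"Extra construction cost: HKD {extra_cost:,.0f}")
print(f"Waste-disposal saving:   HKD {waste_saving:,.0f}")
print(f"Offset ratio: {waste_saving / extra_cost:.4%}")
```

Even under generous assumptions, the saving is a small fraction of the premium, which is the abstract's point about incentivizing CWM-related credits directly.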
Abstract: Based on the Data Envelopment Analysis (DEA) method, and using the CCR and BCC models, the Super Efficiency model, and the Malmquist model guided by input efficiency, the input-output efficiency of urban construction land in different jurisdictions of Beijing from 2005 to 2015 was studied. The results showed obvious differences in the input-output efficiency of urban construction land across the jurisdictions of Beijing: the efficiency of the capital core area and of Yanqing, Fangshan, and Huairou Districts was relatively high, while that of Daxing, Fengtai, and Miyun Districts was relatively low. There was no obvious correlation between efficiency differentiation and location factors; the differences are mainly determined by whether land use in each jurisdiction achieves scale effects, whether technology is improved, whether inputs are redundant, and whether outputs are insufficient. Jurisdictions with inefficient land use should strengthen awareness of intensive land use, improve their technical level, appropriately reduce redundant input elements, and pay attention to the output of social and ecological benefits.
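To illustrate the DEA machinery that this and the following efficiency studies rely on, the sketch below solves the input-oriented CCR model for a small set of decision-making units with scipy's linear-programming solver; the toy input/output data are invented for illustration and do not come from the Beijing study.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # decision vector z = [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                      # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                               # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.r_[-X[k, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):                               # sum_j lam_j * y_rj >= y_rk
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Toy data: 5 units, 2 inputs (land area, investment), 2 outputs (GDP, population served)
X = np.array([[10, 5], [8, 4], [12, 7], [9, 6], [11, 5]], dtype=float)
Y = np.array([[60, 30], [55, 28], [50, 25], [58, 27], [52, 29]], dtype=float)
for k in range(len(X)):
    print(f"DMU {k}: CCR efficiency = {ccr_efficiency(X, Y, k):.3f}")
```

The BCC variant adds the convexity constraint that the lambdas sum to one, and a Malmquist index compares such scores across time periods; both reuse the same envelopment structure shown here.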
Abstract: This study utilized Data Envelopment Analysis (DEA) to assess the efficiency of health centers in tuberculosis (TB) treatment. Assessing the efficiency of health centers treating TB is a vital and sensitive topic, because a growing amount of public funds is devoted to healthcare. In this research, a DEA model was applied to evaluate and assess the efficiency of 17 health centers. The researchers selected the health budget and the number of health workers as input variables, and the number of people served, the number of TB patients served, and the percentage of TB patients treated as output variables. Based on the results of the study, only five (5) of the seventeen (17) health centers were 100% efficient throughout the two-year period. It is recommended that the other health centers learn from the efficient peers identified by the DEA model so as to increase the overall performance of the healthcare system. Likewise, health centers should integrate Health Information Technology to deliver better care to their patients.
Abstract: The Energization and Radiation in Geospace (ERG) mission seeks to explore the dynamics of the radiation belts in the Earth's inner magnetosphere with a space-borne probe (the ERG satellite) in coordination with related ground observations and simulation/modeling studies. For this mission, the Science Center of the ERG project (ERG-SC) will provide a useful data analysis platform based on the THEMIS Data Analysis Software suite (TDAS), which has been widely used by researchers in many conjunction studies of the Time History of Events and Macroscale Interactions during Substorms (THEMIS) spacecraft and ground data. To import SuperDARN data into this highly useful platform, ERG-SC, in close collaboration with SuperDARN groups, developed a Common Data Format (CDF) design suitable for fitacf data and has prepared an open database of SuperDARN data archived in CDF. ERG-SC has also been developing programs written in Interactive Data Language (IDL) to load fitacf CDF files and to generate various kinds of plots: not only range-time-intensity plots but also two-dimensional map plots that can be superposed with other data, such as all-sky images from THEMIS-GBO and orbital footprints of various satellites. The CDF-TDAS scheme developed by ERG-SC will make it easier for researchers who are not familiar with SuperDARN data to access and analyze them, thereby facilitating collaborative studies with satellite data, such as the inner magnetosphere data provided by the ERG (Japan)-RBSP (USA)-THEMIS (USA) fleet.
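As a rough illustration of what loading fitacf-style CDF files looks like outside of IDL/TDAS, the Python sketch below uses the cdflib package to read a CDF file and draw a range-time-intensity style panel; the file name and variable names (Epoch, range_gate, pwr) are hypothetical placeholders, since the actual variable layout of the ERG-SC SuperDARN CDFs is not specified here.

```python
# Hedged sketch: read a (hypothetical) SuperDARN fitacf CDF and plot a
# range-time-intensity panel. Requires `pip install cdflib matplotlib`.
import cdflib
import matplotlib.pyplot as plt

cdf = cdflib.CDF("sd_fitacf_example.cdf")                   # hypothetical file name
epoch = cdflib.cdfepoch.to_datetime(cdf.varget("Epoch"))    # assumed time variable
gates = cdf.varget("range_gate")                            # assumed range-gate axis
power = cdf.varget("pwr")                                   # assumed 2-D (time x gate) power

fig, ax = plt.subplots(figsize=(8, 3))
mesh = ax.pcolormesh(epoch, gates, power.T, shading="auto")
fig.colorbar(mesh, ax=ax, label="backscatter power (dB)")
ax.set_xlabel("UT")
ax.set_ylabel("range gate")
plt.tight_layout()
plt.show()
```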
Funding: Financially supported by the National Natural Science Foundation of China (82230071, 82172098) and the Shanghai Committee of Science and Technology (23141900600, Laboratory Animal Research Project).
Abstract: Organoids, miniature and simplified in vitro model systems that mimic the structure and function of organs, have attracted considerable interest due to their promising applications in disease modeling, drug screening, personalized medicine, and tissue engineering. Despite substantial success in cultivating physiologically relevant organoids, challenges remain concerning the complexity of their assembly and the difficulties associated with data analysis. The advent of AI-Enabled Organoids, which interface with artificial intelligence (AI), holds the potential to revolutionize the field by offering novel insights and methodologies that can expedite the development and clinical application of organoids. This review succinctly delineates the fundamental concepts and mechanisms underlying AI-Enabled Organoids, summarizing their prospective applications in rapid screening of construction strategies, cost-effective extraction of multiscale image features, streamlined analysis of multi-omics data, and precise preclinical evaluation and application. We also explore the challenges and limitations of interfacing organoids with AI and discuss future directions for the field. Taken together, AI-Enabled Organoids hold significant promise for advancing our understanding of organ development and disease progression, ultimately laying the groundwork for clinical application.
Funding: This work was supported by the Serbian Ministry of Science and Education (project TR-32022) and by the companies Telekom Srbija and Informatika.
Abstract: Data center networks may comprise tens or hundreds of thousands of nodes and, naturally, suffer from frequent software and hardware failures as well as link congestion. Packets are routed along the shortest paths with sufficient resources to facilitate efficient network utilization and minimize delays. In such dynamic networks, links frequently fail or become congested, making recalculation of the shortest paths a computationally intensive problem. Various routing protocols have been proposed to overcome this problem by focusing on network utilization rather than speed. Surprisingly, the design of fast shortest-path algorithms for data centers has been largely neglected, even though such algorithms are universal components of routing protocols. Moreover, parallelization techniques have mostly been deployed for random network topologies rather than for the regular topologies often found in data centers. The aim of this paper is to improve scalability and reduce the time required for shortest-path calculation in data center networks through parallelization on general-purpose hardware. We propose a novel algorithm that parallelizes edge relaxations as a faster and more scalable solution for popular data center topologies.
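The paper's specific algorithm is not reproduced here; the sketch below merely illustrates the general idea of parallelizing edge relaxations, Bellman-Ford style, by relaxing disjoint chunks of the edge list in worker threads each round and merging the proposed distance updates at a barrier. The graph, chunking, and thread count are illustrative assumptions; in CPython the thread pool mainly demonstrates the structure rather than a true speedup.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def parallel_sssp(n, edges, source, workers=4):
    """Single-source shortest paths by rounds of parallel edge relaxation.
    edges: list of (u, v, weight) directed edges; nodes are 0..n-1."""
    dist = [math.inf] * n
    dist[source] = 0.0
    chunk = max(1, len(edges) // workers)
    chunks = [edges[i:i + chunk] for i in range(0, len(edges), chunk)]

    def relax(chunk_edges):
        # Each worker proposes updates based on the distances frozen this round.
        updates = {}
        for u, v, w in chunk_edges:
            cand = dist[u] + w
            if cand < dist[v] and cand < updates.get(v, math.inf):
                updates[v] = cand
        return updates

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(n - 1):                          # at most n-1 relaxation rounds
            proposals = list(pool.map(relax, chunks))   # barrier: wait for all workers
            changed = False
            for upd in proposals:                       # merge proposed updates
                for v, d in upd.items():
                    if d < dist[v]:
                        dist[v] = d
                        changed = True
            if not changed:
                break
    return dist

# Toy topology with unit link weights (illustrative only)
edges = [(0, 2, 1), (0, 3, 1), (1, 2, 1), (1, 3, 1),
         (2, 4, 1), (3, 5, 1), (4, 1, 1), (5, 0, 1)]
print(parallel_sssp(6, edges, source=0))
```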
Funding: Supported by the National Natural Science Foundation of China (No. 51806167), the China Postdoctoral Science Foundation (2017M623166), the Science and Technology on Thermal Energy and Power Laboratory Open Foundation of China (No. TPL2017BA004), and the Fund of Xi'an Science and Technology Bureau (2019218714SYS002CG024).
Abstract: Thermal analysis of data centers is urgently needed to ensure that computer chips remain below the critical temperature while the energy consumption for cooling is reduced. It is difficult to obtain detailed hotspot locations and chip temperatures by direct measurement in large data centers containing hundreds of racks or more. In this paper, a multi-scale thermal analysis method is proposed that can predict the temperature distribution of chips and solder balls in data centers. The multi-scale model is divided into six scales: room, rack, server, Insulated-Gate Bipolar Transistor (IGBT), chip, and solder ball. The concept of a sub-model is proposed, and the six levels are organized into four simulation sub-models: Sub-model 1 contains Room, Rack, and Server (RRS); Sub-model 2 contains Server and IGBT (SI); Sub-model 3 contains IGBT and Chip (IC); and Sub-model 4 contains Chip and Solder ball (CS). These four sub-models are one-way coupled by passing their results as boundary conditions between levels. A full-field simulation is employed to verify the efficiency and accuracy of the multi-scale simulation method for a single-rack data center. The two simulations place the highest temperature in the same location. The Single-rack Full-field Model (SRFFM) requires 2.5 times more computational time than the Single-rack Multi-scale Model (SRMSM). The deviations in the highest temperatures of the chips and solder balls between the two models are 1.57 °C and 0.2 °C, respectively, which indicates that the multi-scale simulation method has good prospects for data center thermal simulation. Finally, the multi-scale thermal analysis method is applied to a ship data center with 15 racks.
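The sketch below illustrates only the one-way coupling pattern the abstract describes: each sub-model is represented by a placeholder solver whose output becomes the boundary condition of the next, finer level. The solver functions and the returned temperature fields are hypothetical stand-ins, not the CFD models used in the paper.

```python
# Hedged sketch of one-way coupled multi-scale simulation: coarse results feed
# finer levels as boundary conditions. Solvers here are trivial placeholders.
from typing import Callable, Dict, List

def solve_rrs(bc: Dict) -> Dict:
    # Room/Rack/Server level: would be a room-scale CFD run in practice.
    return {"server_inlet_temp": bc["supply_air_temp"] + 8.0}

def solve_si(bc: Dict) -> Dict:
    # Server/IGBT level: heat sink and airflow inside one server.
    return {"igbt_case_temp": bc["server_inlet_temp"] + 25.0}

def solve_ic(bc: Dict) -> Dict:
    # IGBT/Chip level: conduction from case to die.
    return {"chip_junction_temp": bc["igbt_case_temp"] + 15.0}

def solve_cs(bc: Dict) -> Dict:
    # Chip/Solder-ball level: finest scale.
    return {"solder_ball_temp": bc["chip_junction_temp"] - 5.0}

def run_multiscale(supply_air_temp: float) -> Dict:
    pipeline: List[Callable[[Dict], Dict]] = [solve_rrs, solve_si, solve_ic, solve_cs]
    state: Dict = {"supply_air_temp": supply_air_temp}
    for solver in pipeline:
        state.update(solver(state))     # one-way coupling: results flow downward only
    return state

print(run_multiscale(supply_air_temp=18.0))
```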
Abstract: Hazard maps are usually prepared for each type of disaster, including seismic hazard maps, flood hazard maps, and landslide hazard maps. However, when members of the general public try to check their own disaster risk, most are not aware of which specific types of disaster matter for them, so they first need to know which hazards are important. Information that integrates multiple hazards is not well maintained, and there are few such studies. On the other hand, in Japan a great deal of hazard information is released on the Internet. We therefore summarized and assessed hazard data that can be accessed online for shelters (where evacuees live during disasters) and their catchments (areas assigned to each shelter) in Yokohama City, Kanagawa Prefecture. Based on the results, we investigated whether grouping by cluster analysis would allow a multi-hazard assessment. We used six parameters: four natural hazards (seismic, flood, tsunami, and sediment disaster) plus total population and senior population. However, since the characteristics of the total and senior populations were almost the same, only the total population data were used in the final examination. From the cluster analysis, it was found to be appropriate to group the designated evacuation centers in Yokohama City into six groups. In addition, each of the six groups was found to have explainable characteristics, confirming the effectiveness of multi-hazard mapping using cluster analysis; for example, there were groups in which all hazards are low, in which both flood and seismic hazards are high, in which sediment hazards are high, and so on. In many Japanese cities, disaster prevention measures have been designed with ground hazards in mind, mainly for earthquake disasters. In this paper, we checked the consistency between the multi-hazard evaluation results and the existing ground hazard map and examined the usefulness of the designated evacuation centers. Finally, the validity was confirmed by comparing the results with ground hazards based on actual measurements from past research: in places where the seismic hazard is large, the measured susceptibility to shaking is also large, so the two are consistent.
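As a rough illustration of the clustering step, the sketch below standardizes per-shelter hazard and population indicators and groups them into six clusters with k-means; the feature columns and the random toy data are assumptions for illustration, not the Yokohama dataset.

```python
# Hedged sketch: multi-hazard grouping of shelters via k-means (6 clusters).
# Feature columns and values are invented placeholders, not the study's data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# columns: seismic, flood, tsunami, sediment hazard scores, population (toy data)
features = rng.random((200, 5))

X = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

for g in range(6):
    members = features[labels == g]
    print(f"group {g}: {len(members):3d} shelters, "
          f"mean hazard profile = {members[:, :4].mean(axis=0).round(2)}")
```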
Funding: Supported by the National Natural Science Foundation of China.
Abstract: The Chandra Galactic Center Survey detected approximately 800 X-ray point-like sources in the 2° × 0.8° sky region around the Galactic Center. We study the spatial and luminosity distributions of these sources according to their spectral properties. Fourteen bright detected sources are used to jointly fit an absorbed power-law model, from which the power-law photon index is determined to be approximately 2.5. Assuming that all other sources have the same power-law form, the relation between hardness ratio and HI column density NH is used to estimate the NH values for all sources. Monte Carlo simulations show that these sources are more likely concentrated in the Galactic center region rather than distributed throughout the Galactic disk. We also find that the luminosities of the sources are positively correlated with their HI column densities, i.e., a more luminous source has a higher HI column density. From this relation, we suggest that the X-ray luminosity comes from the interaction between an isolated old neutron star and the interstellar medium (mainly dense molecular clouds). Using the standard Bondi accretion theory and statistical information on molecular clouds in the Galactic center, we confirm this positive correlation and calculate the luminosity range in this scenario, which is consistent with the observations (10^32-10^35 erg s^-1).
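For orientation, the sketch below evaluates the standard Bondi accretion-rate formula, Mdot = 4π(GM)²ρ/(v² + c_s²)^{3/2}, and the corresponding accretion luminosity L ≈ GM·Mdot/R for an isolated neutron star moving through a dense molecular cloud; the neutron-star and cloud parameters are illustrative textbook values, not those derived in the paper.

```python
# Hedged order-of-magnitude estimate of Bondi accretion luminosity for an
# isolated old neutron star in a dense molecular cloud (illustrative values).
import math

G    = 6.674e-8          # cm^3 g^-1 s^-2
M_NS = 1.4 * 1.989e33    # neutron star mass, g
R_NS = 1.0e6             # neutron star radius, cm (10 km)
m_H  = 1.67e-24          # hydrogen mass, g

n_H2 = 1.0e4             # cloud number density, cm^-3 (assumed)
rho  = 2.0 * m_H * n_H2  # molecular hydrogen mass density, g cm^-3
v    = 50e5              # NS velocity relative to the cloud, cm/s (assumed)
c_s  = 0.3e5             # cloud sound speed, cm/s (assumed, negligible here)

mdot = 4 * math.pi * (G * M_NS) ** 2 * rho / (v**2 + c_s**2) ** 1.5   # g/s
L    = G * M_NS * mdot / R_NS                                         # erg/s

print(f"Bondi accretion rate: {mdot:.2e} g/s")
print(f"Accretion luminosity: {L:.2e} erg/s")   # ~1e34 erg/s for these inputs
```

Varying the assumed cloud density and relative velocity over plausible ranges moves this estimate across roughly the 10^32-10^35 erg s^-1 interval quoted in the abstract.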
Funding: Project (2000G033) supported by the S&T fund of the Ministry of Railways, China.
Abstract: It is very important to monitor surrounding rock deformation in tunnel construction. The principle, functions, development, and application of a system composed of a total station and a computer for monitoring and analyzing surrounding rock deformation are discussed. New methods of two-free-station 3D measurement and the corresponding mathematical adjustment model are presented. The development of the on-board software for the total station and the post-processing software for the computer is also described. Controlled by the on-board software, the total station can complete all measurements to the target points without being centered or having its height measured. Monitoring data can be processed by the post-processing software, and regression-analysis results and forecast information on tunnel surrounding rock deformation can be provided in time. Practical use shows that the system is practicable, highly accurate, and efficient, and that it satisfies the needs of safe, information-based tunnel construction in underground engineering.
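The regression-and-forecast step can be pictured with the sketch below, which fits a commonly used exponential convergence law, u(t) = a(1 - e^(-t/b)), to displacement readings and extrapolates it; the monitoring data and the choice of curve are illustrative assumptions rather than the system's actual post-processing algorithm.

```python
# Hedged sketch: fit an exponential convergence curve to tunnel deformation
# readings and forecast future displacement. Data here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def convergence(t, a, b):
    """u(t) = a * (1 - exp(-t/b)); a = final displacement, b = time constant."""
    return a * (1.0 - np.exp(-t / b))

days = np.array([1, 2, 4, 7, 10, 14, 21, 28], dtype=float)
disp = np.array([2.1, 3.8, 6.4, 8.9, 10.3, 11.6, 12.8, 13.3])   # mm, synthetic

(a, b), _ = curve_fit(convergence, days, disp, p0=(15.0, 7.0))
print(f"estimated final displacement: {a:.1f} mm, time constant: {b:.1f} days")
for t in (35, 60, 90):
    print(f"forecast at day {t:3d}: {convergence(t, a, b):.1f} mm")
```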
Abstract: Video surveillance applications need video data centers to provide elastic virtual machine (VM) provisioning. However, the workloads of the VMs are hard to predict for an online video surveillance service, and the unknown arrival workloads easily lead to workload skew among VMs. In this paper, we study how to balance the workload skew in an online video surveillance system. First, we design the system framework for the online surveillance service, which consists of video capturing and analysis tasks. Second, we propose StreamTune, an online resource scheduling approach for workload balancing, to deal with irregular video analysis workloads using the minimum number of VMs. We aim at balancing the workload skew on video analyzers in a timely manner without depending on any workload prediction method. Furthermore, we evaluate the performance of the proposed approach using a traffic surveillance application. The experimental results show that our approach adapts well to workload variation and achieves workload balance with fewer VMs.
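StreamTune itself is not specified in this abstract; the sketch below only illustrates the general flavor of prediction-free online balancing, assigning each arriving analysis task to the currently least-loaded VM and adding a VM when every existing one is near saturation. The task stream, load model, and threshold are illustrative assumptions.

```python
# Hedged sketch: prediction-free online assignment of video-analysis tasks to
# VMs (least-loaded first, scale out when all VMs are saturated). Illustrative.
import heapq
import random

CAPACITY = 10.0          # max load units per VM (assumed)
SCALE_OUT_AT = 0.8       # add a VM when even the least-loaded VM exceeds 80% (assumed)

def schedule(task_loads):
    vms = [(0.0, 0)]                       # min-heap of (current load, vm id)
    assignment = []
    for load in task_loads:
        cur, vm_id = heapq.heappop(vms)    # least-loaded VM
        if cur + load > CAPACITY * SCALE_OUT_AT:
            # every other VM is at least as loaded as this one; start a new VM
            heapq.heappush(vms, (cur, vm_id))
            vm_id, cur = len(vms), 0.0
        heapq.heappush(vms, (cur + load, vm_id))
        assignment.append(vm_id)
    return assignment, sorted(vms, key=lambda x: x[1])

random.seed(1)
tasks = [random.uniform(0.5, 3.0) for _ in range(30)]   # synthetic task loads
assignment, vms = schedule(tasks)
print("VMs used:", len(vms))
for load, vm in vms:
    print(f"  vm {vm}: load {load:.1f}")
```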
Abstract: This study calculates the efficiency of Rural Health Centers (RHCs) and investigates the impact of other variables affecting that efficiency. The study considers 29 RHCs: 13 in District Faisalabad, 9 in Toba, and 7 in Jhang; a survey was conducted to collect data from each RHC for the year 2016. A Data Envelopment Analysis (DEA) model was used to obtain the efficiency scores, after which Tobit regression was applied in a second stage. Of the 29 Rural Health Centers, only 11 (38%) are working efficiently compared to the others. Distance from the tehsil headquarters, distance from the road (probability 0), distance from a private hospital (probability 0), behavior of the staff (probability 0.0064), and laboratory equipment (probability 0) have an impact on the efficiency scores. Distance from other health facilities, staff behavior, and the list of medicines and equipment used at RHCs should be improved to increase the efficiency of the RHCs.
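The second-stage Tobit step can be sketched as follows: DEA efficiency scores are censored at 1, so the regression maximizes a censored-normal likelihood. The covariates and the synthetic data below are placeholders, and this simple right-censored Tobit fitted with scipy is a generic illustration, not the exact specification used in the study.

```python
# Hedged sketch: second-stage Tobit regression of DEA efficiency scores
# (right-censored at 1) on facility covariates. Data are synthetic placeholders.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n = 29
# covariates: intercept, distance to road, distance to private hospital, staff behavior score
X = np.column_stack([np.ones(n), rng.uniform(0, 20, n),
                     rng.uniform(0, 30, n), rng.uniform(1, 5, n)])
true_beta = np.array([0.9, -0.01, -0.005, 0.03])
latent = X @ true_beta + rng.normal(0, 0.08, n)
score = np.minimum(latent, 1.0)                      # DEA scores censored at 1

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    censored = score >= 1.0
    ll = np.where(censored,
                  stats.norm.logsf((1.0 - mu) / sigma),               # P(latent >= 1)
                  stats.norm.logpdf((score - mu) / sigma) - log_sigma)
    return -ll.sum()

res = minimize(neg_loglik, x0=np.r_[np.zeros(X.shape[1]), np.log(0.1)], method="BFGS")
print("Tobit coefficients:", np.round(res.x[:-1], 4))
```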