Funding: Supported by the Major Program of the National Natural Science Foundation of China (T2192970-T2192974) and the CAMS Innovation Fund for Medical Sciences (CIFMS, 2021-I2M-1-027).
Abstract: Bear bile has been a valuable and effective medicinal material in traditional Chinese medicine (TCM) for over 13 centuries. However, the current practice of obtaining it through bear farming is under scrutiny for its adverse impact on bear welfare. Here, we present a new approach for creating artificial bear bile (ABB) as a high-quality and sustainable alternative to natural bear bile. This study addresses the scientific challenges of creating bear bile alternatives through interdisciplinary collaborations across fields including resources, chemistry, biology, medicine, pharmacology, and TCM. A comprehensive efficacy assessment system that bridges the gap between TCM and modern medical terminology has been established, allowing for the systematic screening of therapeutic constituents. Using chemical synthesis and enzyme engineering technologies, our research has achieved the environmentally friendly, large-scale production of bear bile therapeutic compounds, as well as the optimization and recomposition of ABB formulations. The resulting ABB not only closely resembles natural bear bile in composition but also offers advantages such as consistent product quality, readily available raw materials, and independence from threatened or wild resources. Comprehensive preclinical efficacy evaluations have demonstrated the equivalence of the therapeutic effects of ABB and those of commercially available drained bear bile (DBB). Furthermore, preclinical toxicological assessments and phase I clinical trials show that the safety of ABB is on par with that of the currently used DBB. This innovative strategy can serve as a new research paradigm for developing alternatives to other endangered TCM materials, thereby strengthening the integrity and sustainability of TCM.
Funding: National Key R&D Program of China (No. 2017YFC0803300) and the National Natural Science Foundation of China (No. 61703013).
Abstract: Spark is the most popular in-memory processing framework for big data analytics. Memory is the crucial resource for workloads to achieve performance acceleration on Spark. The existing memory capacity configuration approach in Spark is to statically configure the memory capacity for workloads based on the user's specifications. However, lacking deep knowledge of a workload's system-level characteristics, users in practice often conservatively overestimate the memory utilization of their workloads and ask the resource manager to grant a larger memory share than they actually need, which leads to severe waste of memory resources. To address this issue, SMConf, an automated memory capacity configuration solution for in-memory computing workloads in Spark, is proposed. SMConf is designed based on the observation that, although there is no one-size-fits-all configuration, a one-size-fits-bunch configuration can be found for in-memory computing workloads. SMConf classifies typical Spark workloads into categories based on metrics across the layers of the Spark system stack. For each workload category, an individual memory requirement model is learned from the workload's input data size and the strongly correlated configuration parameters. For an ad hoc workload, SMConf matches its memory requirement signature, measured on small-sized input data, to one of the workload categories and determines its proper memory capacity configuration with the corresponding memory requirement model. Experimental results demonstrate that, compared with the conservative default configuration, SMConf can reduce the memory provisioned to Spark workloads by up to 69% with only slight performance degradation, and reduce the average turnaround time of Spark workloads by up to 55% in multi-tenant environments.
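The two-stage idea behind SMConf (grouping workloads by metric signatures, then fitting a per-group memory model over input size) can be illustrated with a minimal sketch. This is a hypothetical, simplified reconstruction, not the paper's actual implementation: the group centroids, the two-dimensional signatures, and the training numbers below are all invented for illustration, and a real system would use richer cross-layer metrics and a more robust learner.

```python
import math

def nearest_group(signature, centroids):
    """Return the index of the centroid closest to a metric signature."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(centroids)), key=lambda i: dist(signature, centroids[i]))

def fit_linear(sizes_gb, mem_gb):
    """Least-squares fit of mem = a * size + b for one workload group."""
    n = len(sizes_gb)
    mx = sum(sizes_gb) / n
    my = sum(mem_gb) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(sizes_gb, mem_gb))
         / sum((x - mx) ** 2 for x in sizes_gb))
    return a, my - a * mx

# Two illustrative workload groups (e.g., shuffle-heavy vs. cache-heavy),
# each with a made-up signature (GC ratio, shuffle ratio) and training data
# of (input size in GB -> observed memory requirement in GB).
centroids = [(0.7, 0.2), (0.2, 0.8)]
models = [fit_linear([1, 2, 4], [2.0, 3.5, 6.5]),    # shuffle-heavy group
          fit_linear([1, 2, 4], [3.0, 5.5, 10.5])]   # cache-heavy group

def recommend_memory_gb(signature, input_size_gb):
    """Match an ad hoc workload to a group, then predict its memory need."""
    a, b = models[nearest_group(signature, centroids)]
    return a * input_size_gb + b

# An ad hoc workload profiled on a small input: its signature falls near
# the cache-heavy group, so that group's model extrapolates to the full input.
print(recommend_memory_gb((0.25, 0.75), 8))
```

The key design point this mirrors is that the signature is measured cheaply on small-sized input data, while the memory prediction is made for the production-sized input via the matched group's model.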