In tea plants, abundant flavonoid compounds are responsible for health benefits to the human body and define the astringent flavor profile. While the downstream mechanisms of flavonoid biosynthesis have been extensively studied, the role of chalcone synthase (CHS) in this secondary metabolic process in tea plants remains less clear. In this study, we compared the evolutionary profile of the flavonoid metabolism pathway and discovered that gene duplication of CHS occurred in tea plants. We identified three CsCHS genes, along with a CsCHS-like gene, as potential candidates for further functional investigation. Unlike the CsCHS-like gene, the CsCHS genes effectively restored flavonoid production in Arabidopsis chs mutants. Additionally, CsCHS transgenic tobacco plants exhibited higher flavonoid accumulation than their wild-type counterparts. Most notably, our examination of promoter activity and gene expression levels for the selected CHS genes revealed distinct responses to UV-B stress in tea plants. Our findings suggest that environmental factors such as UV-B exposure could have been key drivers behind the CHS gene duplication events.
Systems based on EHRs (electronic health records) have been in use for many years, and their impact has been felt increasingly in recent times. They have produced pioneering collections of massive volumes of health data. Duplicate detection involves discovering records that refer to the same real-world entity, a task that generally depends on several input parameters supplied by experts. Record linkage refers to the problem of finding matching records across various data sources. The similarity between two records is characterized by domain-based similarity functions over different features. De-duplication of one dataset, or the linkage of multiple datasets, has become a highly significant operation in the data processing stages of different data mining programmes. The objective is to match all the records associated with the same entity. Various measures have been used to represent the quality and complexity of data linkage algorithms, and many novel metrics have been introduced. An outline of the problems in measuring data linkage and de-duplication quality and complexity is presented. This article focuses on the processing of health data that is horizontally divided among data custodians, where custodians hold similar features for different sets of patients. The first step of the technique is the automatic selection of high-quality training examples from the compared record pairs; the second step trains the reciprocal neuro-fuzzy inference system (RANFIS) classifier. For the optimal-threshold classifier, it is presumed that the true match status of all compared record pairs is known, and an optimal threshold is then computed for the respective RANFIS using Ant Lion Optimization. The Febrl, Clinical Decision (CD), and Cork Open Research Archive (CORA) data repositories are used to analyze the proposed method and benchmark it against current techniques.
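The threshold-based match decision in the abstract above can be sketched generically. This is a minimal illustration, not the paper's RANFIS classifier or its Ant Lion Optimization step: the field names, the averaged string similarity, and the 0.85 threshold are all assumptions chosen for the example.

```python
from difflib import SequenceMatcher

def field_similarity(a: str, b: str) -> float:
    """Normalized similarity in [0, 1] between two field values."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def pair_similarity(rec1: dict, rec2: dict, fields: list) -> float:
    """Average the per-field similarities into one comparison score for the pair."""
    scores = [field_similarity(str(rec1[f]), str(rec2[f])) for f in fields]
    return sum(scores) / len(scores)

def classify_pair(rec1: dict, rec2: dict, fields: list, threshold: float = 0.85) -> str:
    """Label a compared record pair as match / non-match against a threshold."""
    return "match" if pair_similarity(rec1, rec2, fields) >= threshold else "non-match"

# Hypothetical patient records split across two custodians
p1 = {"name": "Jon Smith", "dob": "1950-03-12"}
p2 = {"name": "John Smith", "dob": "1950-03-12"}
p3 = {"name": "Mary Jones", "dob": "1984-07-01"}
```

In a trained system the threshold would be learned rather than fixed; the sketch only shows where such a threshold sits in the pipeline.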
Duplicated inferior vena cava with bilateral iliac vein compression is extremely rare. We report the case of an 87-year-old man who presented with bilateral lower-extremity swelling and was found to have a duplicated inferior vena cava on computed tomography angiography (CTA), which also revealed bilateral iliac vein compression caused by surrounding structures. Anticoagulant treatment combined with stent implantation completely alleviated this chronic debilitating condition, with no recurrence during 2 months of follow-up.
On-site programming big data refers to the massive data generated in the process of software development, characterized by real-time arrival, complexity, and difficulty of processing. Data cleaning is therefore essential for on-site programming big data. Duplicate data detection is an important step in data cleaning, which can save storage resources and enhance data consistency. To address the shortcomings of the traditional Sorted Neighborhood Method (SNM) and the difficulty of detecting duplicates in high-dimensional data, an optimized algorithm based on random forests with a dynamic, adaptive window size is proposed. The efficiency of the algorithm is improved by refining the key-selection method, reducing the dimensionality of the data set, and using an adaptively sized sliding window. Experimental results show that the improved SNM algorithm exhibits better performance and achieves higher accuracy.
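The sorted-neighborhood idea behind SNM, including a crude form of window adaptation, can be sketched as follows. This is a minimal illustration, not the paper's random-forest-based algorithm: the widening rule, the string-similarity choice, and the 0.85 threshold are assumptions.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

def snm_adaptive(records, key, base_window=3, threshold=0.85):
    """Sorted Neighborhood Method with a simple adaptive window:
    sort on a sorting key, slide a window over the sorted order, and
    widen the window while compared pairs still look like duplicates."""
    ordered = sorted(records, key=key)
    duplicates = []
    for i in range(len(ordered)):
        w = base_window
        j = i + 1
        while j < len(ordered) and j <= i + w:
            if similarity(key(ordered[i]), key(ordered[j])) >= threshold:
                duplicates.append((ordered[i], ordered[j]))
                w += 1  # a hit near the edge suggests a cluster: widen the window
            j += 1
    return duplicates

names = ["alice smith", "alice smyth", "bob jones", "bobby jones", "carol wu"]
pairs = snm_adaptive(names, key=lambda s: s)
```

Sorting once and comparing only within a window is what keeps SNM cheaper than comparing all record pairs; the adaptive size trades a few extra comparisons for better recall inside duplicate clusters.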
The reliability evaluation of a multistate network is primarily based on d-minimal paths/cuts (d-MPs/d-MCs). However, since it is a nondeterministic polynomial-time hard (NP-hard) problem, searching for all d-MPs is a rather challenging task. In existing implicit enumeration algorithms based on minimal paths (MPs), duplicate d-MP candidates may be generated, and an extra step is needed to locate and remove them, which costs significant computational effort. This paper proposes an efficient method to prevent the generation of duplicate d-MP candidates in implicit enumeration algorithms for d-MPs. First, the mechanism by which duplicate d-MP candidates arise in implicit enumeration algorithms is discussed. Second, a direct and efficient duplicate-avoidance method is proposed. Third, an improved algorithm is developed, followed by complexity analysis and illustrative examples. Computational experiments comparing the method with two existing algorithms show that it can significantly improve the efficiency of generating d-MPs for a particular demand level d.
In this issue of the Journal of Geriatric Cardiology, Jing et al. report near-perfect results of percutaneous coronary interventions (PCI) through the transfemoral approach (TFA) and transradial approach (TRA) in elderly Chinese patients. All patients were older than 60 years of age, with an average of 67.
The number of systematic reviews is gradually increasing over time, and the methods for performing a systematic review are being improved. However, little attention has been paid to the issue of how to find duplicates in systematic reviews. On the basis of surveys and systematic reviews by our team and others, we review the prevalence, significance, and classification of duplicates, and the methods for finding duplicates in a systematic review. Notably, although a preliminary method for finding duplicates has been established, its usefulness and convenience need to be further confirmed.
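A common first pass at finding duplicate citations in a review's search results is key-based matching on a normalized title. This is a hypothetical minimal sketch, not the method surveyed in the abstract above; the field names and the normalization rule are assumptions.

```python
import re

def normalize_title(title: str) -> str:
    """Collapse a citation title to lowercase alphanumerics so that
    punctuation and capitalization differences do not hide duplicates."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def find_duplicate_citations(records):
    """Return pairs of records whose normalized titles collide."""
    seen, dupes = {}, []
    for rec in records:
        key = normalize_title(rec["title"])
        if key in seen:
            dupes.append((seen[key], rec))  # earlier record paired with the repeat
        else:
            seen[key] = rec
    return dupes

# Hypothetical search results exported from two databases
refs = [
    {"id": 1, "title": "Aspirin for primary prevention"},
    {"id": 2, "title": "ASPIRIN FOR PRIMARY PREVENTION."},
    {"id": 3, "title": "Statins in the elderly"},
]
dupes = find_duplicate_citations(refs)
```

Real deduplication tools also compare DOIs, authors, and years before declaring a match; title normalization alone is only the cheapest filter.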
Semantic duplicates in databases represent an important data quality challenge today, one that leads to bad decisions. In large databases, we sometimes find tens of thousands of duplicates, which necessitates automatic deduplication. For this, it is necessary to detect duplicates with a method reliable enough to find as many duplicates as possible and fast enough to run in a reasonable time. This paper proposes, and compares on real data, effective duplicate detection methods for the automatic deduplication of files based on names, working with French or English texts and with names of people or places, in Africa or in the West. After establishing a more complete classification of semantic duplicates than the usual classifications, we introduce several methods for detecting duplicates whose observed average complexity is less than O(2n). Through a simple model, we define a global efficacy rate combining precision and recall. We propose a new distance metric between records, as well as rules for automatic duplicate detection. Analyses performed on a database containing real data from an administration in Central Africa, and on a well-known standard database of restaurant names in the USA, show better results than those of known methods, with lower complexity.
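A rate combining precision and recall can be illustrated with the standard harmonic mean (F-measure); the paper's own global efficacy rate may be defined differently, so treat this as a generic sketch over sets of duplicate pairs.

```python
def dedup_quality(predicted: set, truth: set) -> dict:
    """Precision, recall, and their harmonic mean (one way to combine the
    two into a single efficacy rate) for predicted duplicate pairs
    measured against the true duplicate pairs."""
    tp = len(predicted & truth)  # correctly detected duplicate pairs
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(truth) if truth else 0.0
    combined = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "efficacy": combined}

# Hypothetical detection output vs. ground truth
pred = {("r1", "r2"), ("r3", "r4"), ("r5", "r6")}
true = {("r1", "r2"), ("r3", "r4"), ("r7", "r8")}
q = dedup_quality(pred, true)
```

Here one false positive and one missed pair pull both precision and recall to 2/3, and the combined rate reflects that balance.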
The duplicate form of the generalized Gould-Hsu inversions has been obtained by Shi and Zhang. In this paper, we present a simple proof of this duplicate form. With the same method, we construct the duplicate form of the generalized Carlitz inversions. Using this duplicate form, we obtain several terminating basic hypergeometric identities and some limiting cases.
Transmission Control Protocol (TCP) performance over MANETs is an area of extensive research. Congestion control mechanisms are major components of TCP that affect its performance, and improving them is a big challenge, especially in wireless environments. Additive Increase Multiplicative Decrease (AIMD) mechanisms control how much the transmission rate is increased or decreased in response to changes in the level of contention for router buffer space and link bandwidth. Transmitting the proper amount of data under an AIMD mechanism is not easy, especially over a MANET, because a MANET has a very dynamic topology and wireless links with high bit error rates that cause packet loss. Such loss can be misinterpreted as severe congestion by the transmitting TCP node, leading to an unnecessarily sharp reduction in the transmission rate that can degrade TCP throughput. This paper introduces a new AIMD algorithm that, when a timeout occurs, takes the number of duplicate ACKs already received into account in deciding the amount of multiplicative decrease; specifically, it decides the point from which the Slow-start mechanism should begin its recovery of the congestion window size. The new AIMD algorithm has been developed as a new TCP variant, which we call TCP Karak. The aim of TCP Karak is to be more adaptive to mobile wireless network conditions by distinguishing between loss due to severe congestion and loss due to link breakages or bit errors. Several simulated experiments have been conducted to evaluate TCP Karak and compare its performance with TCP NewReno. Results show that TCP Karak achieves higher throughput and goodput than TCP NewReno under various mobility speeds, traffic loads, and bit error rates.
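The idea of scaling the multiplicative decrease by the evidence carried in duplicate ACKs can be sketched as follows. The cut factors and the three-dup-ACK boundary here are illustrative assumptions, not TCP Karak's published rules.

```python
def window_after_timeout(cwnd: float, dup_acks: int, floor: float = 2.0) -> float:
    """Illustrative AIMD-style timeout reaction: the more duplicate ACKs
    were seen before the timeout, the more the loss looks like a transient
    link error rather than severe congestion, so the congestion window is
    cut less aggressively. Returns the window from which Slow-start resumes."""
    if dup_acks >= 3:      # packets were still arriving: likely not heavy congestion
        factor = 0.5
    elif dup_acks > 0:     # some delivery evidence: cut harder, but not fully
        factor = 0.25
    else:                  # silence before the timeout: treat as severe congestion
        factor = 0.0
    return max(cwnd * factor, floor)  # never shrink below a minimal window
```

A conventional TCP would collapse to the minimal window on every timeout; conditioning the cut on dup-ACK history is what lets a variant like this keep throughput up over lossy wireless links.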
Aneurysm at the origin of a duplicated middle cerebral artery (DMCA) is very rare; only 29 treated cases have been reported. All were treated by direct surgery except one ruptured case treated by intentional partial coil embolization. We report the first unruptured case treated by coil embolization and review the previously published cases. Coil embolization can be an alternative treatment for an unruptured aneurysm at the origin of the DMCA. Stable framing that spares the origin of the artery and prevention of thromboembolic complications are the keys to safe treatment.
Thirty-six daily duplicate diet samples were collected from 12 healthy female Japanese vegans, and sodium, potassium, calcium, magnesium, phosphorus, iron, zinc, copper, manganese, iodine, selenium, chromium, and molybdenum in the diets were measured to estimate mineral and trace element intake by Japanese vegans. Significantly higher intake of potassium, magnesium, phosphorus, iron, copper, manganese, and molybdenum was observed in vegans than in general Japanese women, but no difference was observed in sodium, iodine, selenium, or chromium intake. Vegan calcium intake tended to be lower than that of general women, but the difference was not significant. Since high potassium, magnesium, and iron intakes cannot be achieved by general Japanese diets, and high intake of potassium and magnesium may prevent hypertension and cardiovascular disease in vegans, there are few problems with Japanese vegan diets regarding mineral and trace element intake, except for calcium intake, which is low, as it is in the general Japanese population.
Variations of the anterior cerebral artery (ACA)-anterior communicating artery (ACoA) complex are commonly observed in association with a symptomatic intracranial aneurysm. We report an asymptomatic ACoA aneurysm associated with a duplicated hypoplastic A1 segment of the right ACA, observed in a 70-year-old female cadaver. Furthermore, the aneurysm, practically substituting for the ACoA, caused a remarkable depression on the internal surface of the right frontal lobe, anterior to the optic chiasm. Aneurysms and other anomalies of the ACA and ACoA are common, and their microvascular surgical management requires sound knowledge of the normal and variant vascular anatomy. The usual causes of such anomalies are persistence of embryonic vessels that normally disappear, disappearance of vessels that would normally persist, or sprouting of new vessels due to hemodynamic and genetic factors. The high incidence of coexisting vascular anomalies and aneurysms suggests that such abnormalities predispose to aneurysm formation through changes in regional blood flow. A1 segment duplication has been reported in 4% of subjects in cadaveric studies and in 0.5%-9.7% of cases of ACoA aneurysm surgery. Angiographic hypoplasia and aplasia of the A1 segment have also been correlated with ACoA aneurysm patients.
BACKGROUND: Thumb polydactyly is one of the most common congenital hand deformities, and the Bilhaut-Cloquet procedure or a modified version is often used. However, controversy remains over the rare instances in which the two thumbs are not of similar length or are far apart. AIM: To evaluate the clinical outcomes of pedicle complex tissue flap transfer in the treatment of duplicated thumbs of unequal size. METHODS: From January 2014 to December 2020, 15 patients underwent duplicated thumb reconstruction by pedicle complex tissue flap transfer at our hand surgery center. The technique was used when it was necessary to combine different tissues from the severed and preserved thumbs because the two were not of similar length or were far apart. Subjective parents' evaluations and functional outcomes (ALURRA and Tada criteria) were obtained. Alignment deviation, instability, range of motion (as a percentage of the opposite thumb) of the interphalangeal and metacarpophalangeal joints, and aesthetic aspects, including circumference, length, nail size, and nail deformity, were used to assess the clinical outcomes. RESULTS: The average age at surgery was 13 mo, and the mean final follow-up occurred at 42 mo. An appropriate volume with a stable joint and good appearance was obtained in 14 reconstructed thumbs; an unstable interphalangeal joint occurred in one thumb. The flexion-extension arc at the metacarpophalangeal joint was good, while that at the interphalangeal joint was poor. Most parents were satisfied with the cosmetic and functional results of the reconstructed thumbs. The mean ALURRA score was 21.8 (range: 20-24), and the mean Tada score was 6.9 (range: 5-8). Compared with the non-operated side, the length of the operated thumb was approximately 95%, the girth was 89%, and the nail width was 82.9%. The mean range of motion was 62.1% of that of the unaffected thumb at the interphalangeal joint and 78.3% at the metacarpophalangeal joint. CONCLUSION: Harvesting a pedicle flap from a severed thumb is a safe and reliable procedure. Defects of the preserved thumb, such as skin, nail, and bone, can be effectively restored using the complex tissue flap.
Since the discovery of the first transposon by Dr. Barbara McClintock, the prevalence and diversity of transposable elements (TEs) have been gradually recognized. As fundamental genetic components, TEs drive organismal evolution not only by contributing functional sequences (e.g., regulatory elements, or "controllers" as phrased by Dr. McClintock) but also by shuffling genomic sequences. In the latter respect, TE-mediated gene duplications have contributed to the origination of new genes and attracted extensive interest. In response to the development of this field, we herein attempt to provide an overview of TE-mediated duplication by focusing on common rules emerging across duplications generated by different TE types. Specifically, despite the huge divergence of transposition machinery across TEs, we identify three common features of various TE-mediated duplication mechanisms: end bypass, template switching, and recurrent transposition. These three features lead to one common functional outcome: TE-mediated duplicates tend to be subjected to exon shuffling and neofunctionalization. Therefore, the intrinsic properties of the mutational mechanism constrain the evolutionary trajectories of these duplicates. We finally discuss the future of this field, including in-depth characterization of both the duplication mechanisms and the functions of TE-mediated duplicates.
Fatty acyl reductases (FARs) are key enzymes that participate in sex pheromone biosynthesis by reducing fatty acids to fatty alcohols. Lepidoptera typically harbor numerous FAR gene family members. Although FAR genes are involved in the biosynthesis of sex pheromones in moths, the key FAR gene of Spodoptera litura remains unclear. In this work, we predicted 30 FAR genes from the S. litura genome and identified a domain duplication within the gene SlitFAR3, which exhibited high and preferential expression in the pheromone glands (PGs) of sexually mature females and a rhythmic expression pattern during the scotophase, when sex pheromone is produced. Molecular docking of SlitFAR3, predicted using a 3D model, revealed a binding cavity for the cofactor NADPH and two substrate-binding cavities. Functional expression in yeast cells combined with comprehensive gas chromatography indicated that the SlitFAR3 gene can produce fatty alcohol products. This study is the first to focus on the special phenomenon of FAR domain duplication, which will advance our understanding of biosynthesis-related genes from the perspective of evolutionary biology.
Funding (tea plant CHS study): supported by the National Natural Science Foundation of China (U21A20232, 32372756, and 32202551).
Funding (EHR de-duplication study): this research project was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R234), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding (on-site programming big data study): supported by the National Key R&D Program of China (No. 2018YFB1003905), the National Natural Science Foundation of China under Grant No. 61971032, and the Fundamental Research Funds for the Central Universities (No. FRF-TP-18-008A3).
Funding (d-MP generation study): supported by the National Natural Science Foundation of China (71701207), the Science and Technology on Reliability & Environmental Engineering Laboratory (6142004004-2), and the Science and Technology Commission of the CMC (2019-JCJQ-JJ-180).
文摘The reliability evaluation of a multistate network is primarily based on d-minimal paths/cuts(d-MPs/d-MCs).However,being a nondeterminism polynomial hard(NP-hard)problem,searching for all d-MPs is a rather challenging task.In existing implicit enumeration algorithms based on minimal paths(MPs),duplicate d-MP candidates may be generated.An extra step is needed to locate and remove these duplicate d-MP candidates,which costs significant computational effort.This paper proposes an efficient method to prevent the generation of duplicate d-MP candidates for implicit enumeration algorithms for d-MPs.First,the mechanism of generating duplicate d-MP candidates in the implicit enumeration algorithms is discussed.Second,a direct and efficient avoiding-duplicates method is proposed.Third,an improved algorithm is developed,followed by complexity analysis and illustrative examples.Based on the computational experiments comparing with two existing algorithms,it is found that the proposed method can significantly improve the efficiency of generating d-MPs for a particular demand level d.
文摘 In this issue of the Journal of Geriatric Cardiology;Jing et al. showed off their near perfect results of percutaneous coronary interventions (PCI) through transfemoral approach (TFA) and transradial approach (TRA) in the elderly Chinese patients. All patients were older.than 60years of age, with an average of 67.……
文摘The number of systematic reviews is gradually increasing over time. Also, the methods to perform a systematic review are being improved. However, little attention has been paid for the issue regarding how to find duplicates in systematic reviews. On the basis of the survey and systematic reviews by our team and others, we review the prevalence, significance and classification of duplicates and the method to find duplicates in a systematic review. Notably, although a preliminary method to find duplicates is established, its usefulness and convenience need to be further confirmed.
文摘Semantic duplicates in databases represent today an important data quality challenge which leads to bad decisions. In large databases, we sometimes find ourselves with tens of thousands of duplicates, which necessitates an automatic deduplication. For this, it is necessary to detect duplicates, with a fairly reliable method to find as many duplicates as possible and powerful enough to run in a reasonable time. This paper proposes and compares on real data effective duplicates detection methods for automatic deduplication of files based on names, working with French texts or English texts, and the names of people or places, in Africa or in the West. After conducting a more complete classification of semantic duplicates than the usual classifications, we introduce several methods for detecting duplicates whose average complexity observed is less than O(2n). Through a simple model, we highlight a global efficacy rate, combining precision and recall. We propose a new metric distance between records, as well as rules for automatic duplicate detection. Analyses made on a database containing real data for an administration in Central Africa, and on a known standard database containing names of restaurants in the USA, have shown better results than those of known methods, with a lesser complexity.
文摘The duplicate form of the generalized Gould-Hsu inversions has been obtained by Shi and Zhang. In this paper, we present a simple proof of this duplicate form. With the same method, we construct the duplicate form of the generalized Carlitz inversions. Using this duplicate form, we obtain several terminating basic hypergeometric identities and some limiting cases.
文摘Transmission Control Protocol (TCP) performance over MANET is an area of extensive research. Congestion control mechanisms are major components of TCP which affect its performance. The improvement of these mechanisms represents a big challenge especially over wireless environments. Additive Increase Multiplicative Decrease (AIMD) mechanisms control the amount of increment and decrement of the transmission rate as a response to changes in the level of contention on routers buffer space and links bandwidth. The role of an AIMD mechanism in transmitting the proper amount of data is not easy, especially over MANET. This is because MANET has a very dynamic topology and high bit error rate wireless links that cause packet loss. Such a loss could be misinterpreted as severe congestion by the transmitting TCP node. This leads to unnecessary sharp reduction in the transmission rate which could degrades TCP throughput. This paper introduces a new AIMD algorithm that takes the number of already received duplicated ACK, when a timeout takes place, into account in deciding the amount of multiplicative decrease. Specifically, it decides the point from which Slow-start mechanism should begin its recovery of the congestion window size. The new AIMD algorithm has been developed as a new TCP variant which we call TCP Karak. The aim of TCP Karak is to be more adaptive to mobile wireless networks conditions by being able to distinguish between loss due to severe congestion and that due to link breakages or bit errors. Several simulated experiments have been conducted to evaluate TCP Karak and compare its performance with TCP NewReno. Results have shown that TCP Karak is able to achieve higher throughput and goodput than TCP NewReno under various mobility speeds, traffic loads, and bit error rates.
Abstract: An aneurysm at the origin of a duplicated middle cerebral artery (DMCA) is very rare, and only 29 treated cases have been reported. All were treated by direct surgery except one ruptured case treated by intentional partial coil embolization. We report the first unruptured case treated by coil embolization and review the previously published cases. Coil embolization can be an alternative treatment for an unruptured aneurysm at the origin of the DMCA. Stable framing that spares the origin of the DMCA and prevention of thromboembolic complications are keys to safe treatment.
Funding: Comprehensive research on cardiovascular and lifestyle disease from the Ministry of Health, Labour and Welfare of Japan.
Abstract: Thirty-six daily duplicate diet samples were collected from 12 healthy female Japanese vegans, and sodium, potassium, calcium, magnesium, phosphorus, iron, zinc, copper, manganese, iodine, selenium, chromium and molybdenum in the diets were measured to estimate mineral and trace element intake by Japanese vegans. Significantly higher intakes of potassium, magnesium, phosphorus, iron, copper, manganese and molybdenum were observed in vegans than in general Japanese women, but no difference was observed in sodium, iodine, selenium or chromium intake. Vegan calcium intake tended to be lower than that of general women, but the difference was not significant. Since high potassium, magnesium and iron intakes cannot be achieved by typical Japanese diets, and high intakes of potassium and magnesium may prevent hypertension and cardiovascular disease in vegans, Japanese vegan diets pose few problems regarding mineral and trace element intake, except for calcium intake, which is low, as it is in the general Japanese population.
Abstract: Variations of the anterior cerebral artery (ACA)-anterior communicating artery (ACoA) complex are commonly observed in association with symptomatic intracranial aneurysms. We report an asymptomatic ACoA aneurysm associated with a duplicated hypoplastic A1 segment of the right ACA, observed in a 70-year-old female cadaver. Furthermore, the aneurysm, practically substituting for the ACoA, caused a remarkable depression on the internal surface of the right frontal lobe, anterior to the optic chiasm. Aneurysms and other anomalies of the ACA and ACoA are common, and their microvascular surgical management requires sound knowledge of the normal and variant vascular anatomy. Persistence of embryonic vessels that normally disappear, disappearance of vessels that would normally persist, or sprouting of new vessels due to hemodynamic and genetic factors are the usual causes of such anomalies. The high incidence of coexisting vascular anomalies and aneurysms suggests that such abnormalities predispose to aneurysm formation through changes in regional blood flow. A1 segment duplication has been reported in 4% of subjects in cadaveric studies and in 0.5%-9.7% of cases in ACoA aneurysm surgery. Angiographic hypoplasia and aplasia of the A1 segment have also been correlated with ACoA aneurysms.
Funding: the China Scholarship Council, No. 201808080126 (to Liu FX).
Abstract: BACKGROUND: Thumb polydactyly is one of the most common congenital hand deformities, and the Bilhaut-Cloquet procedure or a modified version is often used. However, controversy remains over the rare instances in which the two thumbs are not of similar length or are far apart. AIM: To evaluate the clinical outcomes of pedicle complex tissue flap transfer in the treatment of duplicated thumbs of unequal size. METHODS: From January 2014 to December 2020, 15 patients underwent duplicated thumb reconstruction by pedicle complex tissue flap transfer at our hand surgery center. The technique was used when it was necessary to combine different tissues from both the severed and the preserved thumb because the two were not of similar length or were far apart. Subjective parental evaluations and functional outcomes (ALURRA and Tada criteria) were obtained. Alignment deviation, instability, range of motion (as a percentage of the opposite thumb) of the interphalangeal and metacarpophalangeal joints, and aesthetic aspects, including circumference, length, nail size, and nail deformity, were used to assess the clinical outcomes. RESULTS: The average age of patients at the time of surgery was 13 mo, and the mean final follow-up occurred at 42 mo. An appropriate volume with a stable joint and good appearance was obtained in 14 reconstructed thumbs; an unstable interphalangeal joint occurred in one thumb. The flexion-extension arc at the metacarpophalangeal joint was good, while that at the interphalangeal joint was poor. Most parents were satisfied with the cosmetic and functional results of the reconstructed thumbs. The mean ALURRA score was 21.8 (range: 20-24), and the mean Tada score was 6.9 (range: 5-8). Compared with the non-operated side, the length of the operated thumb was approximately 95%, the girth 89%, and the nail width 82.9%. The mean ranges of motion were 62.1% of that of the unaffected thumb at the interphalangeal joint and 78.3% at the metacarpophalangeal joint. CONCLUSION: Harvesting a pedicle flap from a severed thumb is a safe and reliable procedure. Defects of the preserved thumb, such as of the skin, nail, and bone, can be effectively restored using the complex tissue flap.
Funding: the Ministry of Agriculture and Rural Affairs of China; the National Key R&D Program of China (2019YFA0802600); the Chinese Academy of Sciences (ZDBS-LY-SM005, XDPB17); the National Natural Science Foundation of China (31970565).
Abstract: Since the discovery of the first transposon by Dr. Barbara McClintock, the prevalence and diversity of transposable elements (TEs) have been gradually recognized. As fundamental genetic components, TEs drive organismal evolution not only by contributing functional sequences (e.g., regulatory elements, or "controllers" as phrased by Dr. McClintock) but also by shuffling genomic sequences. In the latter respect, TE-mediated gene duplications have contributed to the origination of new genes and have attracted extensive interest. In response to the development of this field, we herein attempt to provide an overview of TE-mediated duplication by focusing on common rules emerging across duplications generated by different TE types. Specifically, despite the huge divergence of transposition machinery across TEs, we identify three common features of the various TE-mediated duplication mechanisms: end bypass, template switching, and recurrent transposition. These three features lead to one common functional outcome, namely, that TE-mediated duplicates tend to be subjected to exon shuffling and neofunctionalization. Therefore, the intrinsic properties of the mutational mechanism constrain the evolutionary trajectories of these duplicates. We finally discuss the future of this field, including an in-depth characterization of both the duplication mechanisms and the functions of TE-mediated duplicates.
Funding: China Agriculture Research System of MOF and MARA (CARS-24-C-03); National Key R&D Program of China (Grant No. 2019YFD1002100).
Abstract: Fatty acyl reductases (FARs) are key enzymes that participate in sex pheromone biosynthesis by reducing fatty acids to fatty alcohols. Lepidoptera typically harbor numerous FAR gene family members. Although FAR genes are involved in the biosynthesis of sex pheromones in moths, the key FAR gene of Spodoptera litura remains unclear. In this work, we predicted 30 FAR genes from the S. litura genome and identified a domain duplication within the gene SlitFAR3, which exhibited high and preferential expression in the sexually mature female pheromone glands (PGs) and a rhythmic expression pattern during the scotophase of sex pheromone production. Molecular docking of SlitFAR3, as predicted using a 3D model, revealed a co-factor NADPH-binding cavity and two substrate-binding cavities. Functional expression in yeast cells combined with comprehensive gas chromatography indicated that the SlitFAR3 gene can produce fatty alcohol products. This study is the first to focus on the special phenomenon of FAR domain duplication, which will advance our understanding of biosynthesis-related genes from the perspective of evolutionary biology.