Funding: supported by grants from the National Key R&D Program of China (2023YFD1201300) and the CAAS Agricultural Science and Technology Innovation Project.
Abstract: Soybean is a broadly popular and extensively cultivated crop; however, many high-yield, high-quality varieties require specific growth conditions, restricting their widespread adoption. Appropriate light conditions and photoperiod must be attained for these varieties to thrive in new environments. In this study, we employed CRISPR/Cas9 with two sgRNAs to knock out the maturity-related gene E4 in 'Jack', a major American soybean variety belonging to maturity group MGII. The E4 gene is primarily involved in photoperiodic flowering and maturity in soybean, making it an ideal candidate for genetic manipulation. We obtained one homozygous E4-SG1 mutant carrying a 1-bp insertion, and four homozygous E4-SG2 mutants carrying a 2-bp deletion, a 7-bp deletion, a 61-bp deletion, and a 1-bp insertion, respectively. The homozygous e4 mutant plants contained premature termination codons and were free of transgenic elements. Additionally, no potential off-target sites of the E4 gene were detected. A comparative analysis revealed that, unlike the wild type, the homozygous e4 mutants matured early under both short-day and long-day conditions. These mutants offer novel germplasm resources that may be used to modify the photoperiod sensitivity and maturity of soybean, enhancing its adaptability to high-latitude regions.
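The premature termination codons in the e4 mutants follow from frameshifts: an insertion or deletion whose length is not a multiple of three shifts the reading frame, so a stop codon typically appears early in the shifted translation. A minimal sketch of this check, using a short hypothetical CDS rather than the real E4 sequence:

```python
# Standard genetic code, built from the canonical TCAG ordering.
BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {
    b1 + b2 + b3: AMINO[16 * i + 4 * j + k]
    for i, b1 in enumerate(BASES)
    for j, b2 in enumerate(BASES)
    for k, b3 in enumerate(BASES)
}

def translate(seq):
    """Translate a DNA sequence codon by codon, stopping at the first stop (*)."""
    protein = []
    for pos in range(0, len(seq) - 2, 3):
        aa = CODON_TABLE[seq[pos:pos + 3]]
        protein.append(aa)
        if aa == "*":
            break
    return "".join(protein)

def has_premature_stop(cds, edit_pos, insertion):
    """True if inserting `insertion` at `edit_pos` yields a stop codon
    before the final complete codon of the edited sequence."""
    edited = cds[:edit_pos] + insertion + cds[edit_pos:]
    protein = translate(edited)
    if "*" not in protein:
        return False
    return len(protein) * 3 < (len(edited) // 3) * 3

# Hypothetical toy CDS (not the real E4 sequence): M-A-E-V-P-stop
WT_CDS = "ATGGCTGAAGTTCCTTAA"
```

With this toy sequence, a 1-bp insertion of "T" after position 6 produces a TGA stop at the third codon, mirroring how the reported 1-bp insertions and non-triplet deletions truncate the E4 protein.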
Abstract: BACKGROUND: Prostaglandin E1 (PGE1), or alprostadil, is a potent vasodilator that improves hepatic blood flow and reduces ischemia-reperfusion injury after liver transplantation (LT). However, the benefits of PGE1 on renal function after LT have not yet been well described. AIM: To assess the impact of PGE1 administration on renal function in patients who underwent liver or liver-kidney transplant. METHODS: This retrospective study included all patients who underwent liver or liver-kidney transplant at our institution from January 2011 to December 2021. Patients were classified based on whether they received PGE1. PGE1 was administered post-LT to those with transaminases >1000 U/L in the immediate postoperative period. Demographics, post-LT treatments and/or complications, renal function, and survival were analyzed. Multivariable logistic regression analysis was performed, and a two-tailed P value <0.05 was considered statistically significant. RESULTS: A total of 145 patients underwent LT, with 44 (30%) receiving PGE1. Baseline patient characteristics were comparable, except that the PGE1 group had significantly higher aspartate aminotransferase (AST) (1961.9 ± 1862.3 U/L vs 878 ± 741.4 U/L, P < 0.001), higher alanine aminotransferase (1070.6 ± 895 U/L vs 547.7 ± 410 U/L, P < 0.001), a higher international normalized ratio on post-LT day 1 (2 ± 0.74 vs 1.8 ± 0.4, P = 0.03), a longer intensive care unit stay (8.1 ± 11.8 days vs 3.8 ± 4.6 days, P = 0.003), more vasopressor use (55.53 ± 111 hours vs 16.33 ± 26.3 hours, P = 0.002), and more immediate postoperative complications (18.6% vs 4.9%, P = 0.04). The PGE1 group also had a significantly higher 90-day readmission rate (29.6% vs 13.1%, P = 0.02) and lower 1-year liver graft survival (87.5% vs 98.9%, P = 0.005). However, 30-day readmission (31.6% vs 27.4%, P = 0.64), LT complications (hepatic artery thrombosis, biliary complications, rejection of the liver graft, cardiomyopathy), 1-year patient survival (96.9% vs 97.8%, P = 0.77), overall liver graft survival, and overall patient survival were similar between the two groups (95.4% vs 93.9%, P = 0.74 and 88.4% vs 86.9%, P = 0.81, respectively). Although the PGE1 group had a significantly lower estimated glomerular filtration rate (eGFR) on post-LT day 7 (46.3 ± 26.7 mL/minute vs 62.5 ± 34 mL/minute, P = 0.009), the eventual need for renal replacement therapy (13.6% vs 5.9%, P = 0.09), the number of dialysis sessions (0.91 vs 0.27, P = 0.13), and eGFR at 1 month (37.2 ± 35.9 mL/minute vs 42 ± 36.9 mL/minute, P = 0.49), 6 months (54.8 ± 21.6 mL/minute vs 62 ± 21.4 mL/minute, P = 0.09), and 12 months (63.7 ± 20.7 mL/minute vs 62.8 ± 20.3 mL/minute, P = 0.85) post-LT were similar to those in the non-PGE1 group. CONCLUSION: In patients who received PGE1 for ischemia-reperfusion injury, despite immediate acute kidney injury post-LT, renal function at 1, 6, and 12 months post-LT was similar to that in patients without ischemia-reperfusion injury. Prospective clinical trials are needed to further elucidate the benefits of PGE1 on renal function.
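The group comparisons above (e.g., AST in the PGE1 vs non-PGE1 group) are two-sample tests on summary statistics. A hedged sketch of Welch's unequal-variance t-test from means, SDs, and group sizes; the non-PGE1 group size n = 101 is inferred from the abstract (145 − 44), and a large-sample normal approximation replaces the exact t distribution for the p-value:

```python
from math import sqrt
from statistics import NormalDist

def welch_from_stats(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and a two-tailed p-value via the normal
    approximation (reasonable here, where degrees of freedom exceed 50)."""
    se = sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)  # standard error of the difference
    t = (mean1 - mean2) / se
    p = 2 * NormalDist().cdf(-abs(t))
    return t, p

# AST (U/L): PGE1 group (n = 44) vs non-PGE1 group (n assumed 101)
t, p = welch_from_stats(1961.9, 1862.3, 44, 878.0, 741.4, 101)
```

Despite the large SDs, the difference in means is several standard errors wide, consistent with the highly significant P value reported for AST.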
Abstract: Context: The advent of Artificial Intelligence (AI) requires modeling prior to its implementation in algorithms for most human skills. This requires a detailed and precise understanding of the interfaces of verbal and emotional communication. The progress of AI is significant on the verbal level but modest in the recognition of facial emotions, even though this ability is one of the oldest in humans and omnipresent in daily life. Dysfunction in facial emotional expression is present in many brain pathologies encountered by psychiatrists, neurologists, psychotherapists, and other mental health professionals, including social workers. It cannot be objectively verified and measured owing to a lack of reliable tools that are valid and consistently sensitive. Indeed, the scientific literature on Visual-Facial-Emotions-Recognition (ViFaEmRe) suffers from the absence of 1) consensual and rational tools for continuous quantified measurement, and 2) operational concepts. We have developed software based on computer morphing that attempts to address these two obstacles, identified as the Method of Analysis and Research of the Integration of Emotions (M.A.R.I.E.). Our primary goal is to use M.A.R.I.E. to understand the physiology of ViFaEmRe in normal, healthy subjects by standardizing the measurements, which will then allow us to focus on subjects manifesting abnormalities in this ability. Our second goal is to contribute to the progress of AI by adding the dimension of recognition of facial emotional expressions. Objective: To study 1) categorical vs dimensional aspects of ViFaEmRe, 2) universality vs idiosyncrasy, 3) immediate vs ambivalent Emotional-Decision-Making, 4) the Emotional-Fingerprint of a face, and 5) the creation of population reference data. Methods: M.A.R.I.E. enables a rational, quantified measurement of Emotional-Visual-Acuity (EVA) 1) in a) an individual observer and b) a population aged 20 to 70 years; 2) measures the range and intensity of expressed emotions using 3 Face-Tests; 3) quantifies the performance of a sample of 204 observers with hypernormal measures of cognition, "thymia" (defined elsewhere), and low levels of anxiety; and 4) analyzes the 6 primary emotions. Results: We have individualized the following continuous parameters: 1) "Emotional-Visual-Acuity", 2) "Visual-Emotional-Feeling", 3) "Emotional-Quotient", 4) "Emotional-Decision-Making", 5) "Emotional-Decision-Making Graph" or "Individual-Gun-Trigger", 6) "Emotional-Fingerprint" or "Key-graph", 7) "Emotional-Fingerprint-Graph", 8) detection of "misunderstanding", and 9) detection of "error". This allowed us to build a taxonomy with coding of the face-emotion pair. Each face has specific measurements and graphics. EVA improves from ages 20 to 55 years, then decreases. It does not depend on the sex of the observer or on the face studied. In addition, 1% of people endowed with normal intelligence do not recognize emotions. The categorical dimension is a variable for everyone. The range and intensity of ViFaEmRe are idiosyncratic and not universally uniform. The recognition of emotions is purely categorical for a single individual and dimensional for a population sample. Conclusions: First, M.A.R.I.E. has made it possible to bring out new concepts and new continuous measurement variables. The comparison between healthy and abnormal individuals makes it possible to appreciate the significance of this line of study. From now on, these new functional parameters will allow us to identify and name "emotional" disorders* or illnesses, adding a dimension to behavioral disorders in all pathologies that affect the brain. Second, ViFaEmRe is idiosyncratic, categorical, and a function of the identity of the observer and of the observed face. These findings challenge Artificial Intelligence, which cannot rely on a single globalist or regionalist algorithm programmed into a robot, nor can AI compete with human abilities and judgment in this domain. *Here "emotional disorders" refers to disorders of emotional expression and recognition.
Abstract: Context: The advent of Artificial Intelligence (AI) requires modeling prior to its implementation in algorithms for most human skills. This requires a detailed and precise understanding of the interfaces of verbal and emotional communication. The progress of AI is significant on the verbal level but modest in the recognition of facial emotions, even though this ability is one of the oldest in humans and omnipresent in daily life. Dysfunction in facial emotional expression is present in many brain pathologies encountered by psychiatrists, neurologists, psychotherapists, and other mental health professionals, including social workers. It cannot be objectively verified and measured owing to a lack of reliable tools that are valid and consistently sensitive. Indeed, the scientific literature on Visual-Facial-Emotions-Recognition (ViFaEmRe) suffers from the absence of 1) consensual and rational tools for continuous quantified measurement, and 2) operational concepts. We have developed software based on computer morphing that attempts to address these two obstacles, identified as the Method of Analysis and Research of the Integration of Emotions (M.A.R.I.E.). Our primary goal is to use M.A.R.I.E. to understand the physiology of ViFaEmRe in normal, healthy subjects by standardizing the measurements, which will then allow us to focus on subjects manifesting abnormalities in this ability. Our second goal is to contribute to the progress of AI by adding the dimension of recognition of facial emotional expressions. Objective: To study 1) categorical vs dimensional aspects of ViFaEmRe, 2) universality vs idiosyncrasy, 3) immediate vs ambivalent Emotional-Decision-Making, 4) the Emotional-Fingerprint of a face, and 5) the creation of population reference data. Methods: M.A.R.I.E. enables the rational, quantified measurement of Emotional-Visual-Acuity (EVA) in an individual observer and in a population aged 20 to 70 years. It can also measure the range and intensity of expressed emotions through three Face-Tests, quantify the performance of a sample of 204 observers with hypernormal measures of cognition, "thymia" (defined elsewhere), and low levels of anxiety, and analyze the six primary emotions. Results: We have individualized the following continuous parameters: 1) "Emotional-Visual-Acuity", 2) "Visual-Emotional-Feeling", 3) "Emotional-Quotient", 4) "Emotional-Decision-Making", 5) "Emotional-Decision-Making Graph" or "Individual-Gun-Trigger", 6) "Emotional-Fingerprint" or "Key-graph", 7) "Emotional-Fingerprint-Graph", 8) detection of "misunderstanding", and 9) detection of "error". This allowed us to build a taxonomy with coding of the face-emotion pair. Each face has specific measurements and graphics. EVA improves from ages 20 to 55 years, then decreases. It does not depend on the sex of the observer or on the face studied. In addition, 1% of people endowed with normal intelligence do not recognize emotions. The categorical dimension is a variable for everyone. The range and intensity of ViFaEmRe are idiosyncratic and not universally uniform. The recognition of emotions is purely categorical for a single individual and dimensional for a population sample. Conclusions: First, M.A.R.I.E. has made it possible to bring out new concepts and new continuous measurement variables. The comparison between healthy and abnormal individuals makes it possible to appreciate the significance of this line of study. From now on, these new functional parameters will allow us to identify and name "emotional" disorders* or illnesses, adding a dimension to behavioral disorders in all pathologies that affect the brain. Second, ViFaEmRe is idiosyncratic, categorical, and a function of the identity of the observer and of the observed face. These findings challenge Artificial Intelligence, which cannot rely on a single globalist or regionalist algorithm programmed into a robot, nor can AI compete with human abilities and judgment in this domain. *Here "emotional disorders" refers to disorders of emotional expression and recognition.
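An acuity measured over a morph continuum reduces, in its simplest form, to a recognition threshold: the lowest morph intensity at and above which an observer consistently identifies the emotion. The abstract does not specify M.A.R.I.E.'s actual scoring rule, so the following is only an illustrative sketch of such a threshold, with hypothetical response data:

```python
def recognition_threshold(responses):
    """Illustrative threshold rule (not M.A.R.I.E.'s published algorithm):
    return the lowest morph intensity (percent) at and above which the
    observer recognizes the emotion at every tested level; None if
    recognition is never sustained up to 100%."""
    threshold = None
    for level in sorted(responses, reverse=True):  # scan from 100% downward
        if responses[level]:
            threshold = level      # recognition still sustained at this level
        else:
            break                  # first failure ends the sustained run
    return threshold

# A hypothetical observer judging one face-emotion pair across a morph series
series = {10: False, 20: False, 30: True, 40: True, 50: True,
          60: True, 70: True, 80: True, 90: True, 100: True}
```

Under this rule the hypothetical observer's threshold is 30%; a lower threshold would correspond to finer acuity, and comparing thresholds across faces and observers is one way the idiosyncratic, observer-dependent pattern described above could be quantified.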
Abstract: Designing and optimizing complex scientific code for new computing architectures is a challenging task. To address this issue in E3SM land model (ELM) development, we developed a software tool called SPEL, which facilitates code generation, verification, and performance tuning using compiler directives within a Function Unit Test framework. In this paper, we present a SPEL extension that leverages the version control system (e.g., Git) for autonomous code generation, and we demonstrate its application to continuous code integration and development of the ELM software system. This study can benefit the scientific software development community.
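The abstract does not detail how the Git-based extension selects what to regenerate. As a hedged sketch, assuming autonomous generation starts from the set of source files changed between two revisions, the selection step might look like the following; the function names and the file-stem-to-unit mapping are illustrative, not SPEL's actual interface:

```python
import subprocess
from pathlib import PurePosixPath

def changed_files(base="HEAD~1", head="HEAD"):
    """Files changed between two Git revisions (requires running in a repo)."""
    out = subprocess.run(["git", "diff", "--name-only", base, head],
                         capture_output=True, text=True, check=True)
    return out.stdout.splitlines()

def units_to_regenerate(files):
    """Map changed Fortran sources to Function Unit Test targets, here
    simply by module file stem; non-Fortran changes are ignored."""
    return sorted({PurePosixPath(f).stem
                   for f in files if f.endswith((".F90", ".f90"))})
```

A continuous-integration hook could then call `units_to_regenerate(changed_files())` after each merge and regenerate only the affected unit tests, which is the kind of incremental, version-control-driven workflow the extension describes.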