Funding: Supported by the National Natural Science Foundation of China (No. 82000602), the Chen Xiao-Ping Foundation for the Development of Science and Technology of Hubei Province (No. CXPJJH11900001-2019330), and the Innovation Team Project of Health Commission of Hubei Province (No. WJ2021C001).
Abstract: Objective: We aimed to identify new, more accurate risk factors for liver transplantation for liver cancer using the Surveillance, Epidemiology, and End Results (SEER) database. Methods: Using the SEER database, we identified patients who had undergone surgical resection for non-metastatic hepatocellular carcinoma (HCC) and subsequent liver transplantation between 2010 and 2017. Overall survival (OS) was estimated using the Kaplan-Meier method. Cox proportional hazards regression modelling was used to identify factors independently associated with recurrent disease [presented as adjusted hazard ratios (HR) with 95% CIs]. Results: In total, 1530 eligible patients were included in the analysis. There were significant differences in ethnicity (P=0.04), cancer stage (P<0.001), vascular invasion (P<0.001), and gall bladder involvement (P<0.001) between the groups that survived, died of cancer, or died of other causes. In the Cox regression model, there were no significant differences in 5-year OS between operative strategies (autotransplantation versus allotransplantation), nor in 1-year survival with neoadjuvant radiotherapy. However, neoadjuvant radiotherapy did appear to improve survival at both 3 years (HR: 0.540, 95% CI: 0.326-0.896, P=0.017) and 5 years (HR: 0.338, 95% CI: 0.153-0.747, P=0.007) from diagnosis. Conclusion: This study demonstrated differences in patient characteristics between prognostic groups after liver resection and transplantation for HCC. These criteria can be used to inform patient selection and consent in this setting. Preoperative radiotherapy may improve long-term survival post-transplantation.
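The overall-survival estimate above rests on the Kaplan-Meier product-limit formula. As a hedged illustration of the arithmetic (the follow-up times below are hypothetical toy values, not the SEER cohort), a minimal pure-Python estimator might look like:

```python
# Minimal Kaplan-Meier product-limit estimator.
# The follow-up times and event flags are invented for illustration only.

def kaplan_meier(times, events):
    """Return [(time, survival_probability)] at each distinct event time.

    times  -- follow-up time for each patient
    events -- 1 if the patient died (event), 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]   # events at time t
            removed += 1           # events + censorings leaving the risk set
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Hypothetical follow-up times (months) and event indicators:
curve = kaplan_meier([6, 12, 12, 18, 24, 30], [1, 1, 0, 1, 0, 0])
print(curve)
```

Each factor multiplies in the conditional probability of surviving past an event time, which is what makes the estimate valid in the presence of censoring.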
Funding: Supported by the National Magnetic Confinement Fusion Science Program of China (No. 2014GB103000).
Abstract: A disruption database and a disruption warning database of the EAST tokamak have been established by a disruption research group. The disruption database, based on Structured Query Language (SQL), comprises 41 disruption parameters, which include current quench characteristics, EFIT equilibrium characteristics, kinetic parameters, halo currents, and vertical motion. At present, most disruption databases are based on plasma experiments on non-superconducting tokamak devices. The purposes of the EAST database are to determine disruption characteristics and disruption statistics for the fully superconducting tokamak EAST, to elucidate the physics underlying tokamak disruptions, to explore the influence of disruptions on superconducting magnets, and to extrapolate toward future burning plasma devices. In order to quantitatively assess the usefulness of various plasma parameters for predicting disruptions, an SQL database similar to that of Alcator C-Mod has been created for EAST by compiling values for a number of proposed disruption-relevant parameters sampled from all plasma discharges in the 2015 campaign. Detailed statistical results and analysis of the two databases on the EAST tokamak are presented.
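The statistical work described here amounts to SQL queries over a table of per-shot disruption parameters. A hedged miniature using Python's built-in sqlite3 is sketched below; the table and column names (shot, t_disrupt, quench_rate, halo_fraction) are illustrative assumptions, not the actual 41-parameter EAST schema:

```python
import sqlite3

# Hypothetical miniature of a disruption database; the real EAST schema
# with its 41 parameters is not reproduced here.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE disruptions (
    shot INTEGER PRIMARY KEY,
    t_disrupt REAL,      -- time of disruption (s)
    quench_rate REAL,    -- current quench rate (MA/s)
    halo_fraction REAL   -- peak halo current / pre-disruption Ip
)""")
rows = [(60123, 4.21, 95.0, 0.18),
        (60124, 3.80, 140.0, 0.25),
        (60125, 5.02, 60.0, 0.10)]
con.executemany("INSERT INTO disruptions VALUES (?, ?, ?, ?)", rows)

# A typical statistical query: shots whose halo-current fraction exceeds 0.15.
high_halo = con.execute(
    "SELECT shot FROM disruptions WHERE halo_fraction > 0.15 ORDER BY shot"
).fetchall()
print(high_halo)  # [(60123,), (60124,)]
```

Aggregations over such a table (means, histograms, threshold counts) are what produce the disruption statistics the abstract refers to.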
Funding: Supported by the National High-Tech Research and Development Program of China (863 Program) (No. 2009AA063005) and the Natural Science Foundation of Shandong Province (No. ZR2009EM001).
Abstract: The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600 to 750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton populations was developed. For laboratory-simulated mixed samples prepared from the 43 algal species (with the algae of one division accounting for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for samples mixed from 32 red tide algal species (with the dominant species accounting for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples in which the dominant species accounted for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay, Qingdao, in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level.
The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algal fluorescence auto-analyzer for phytoplankton populations.
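At its core, fluorometric discrimination compares a measured spectrum against reference spectra and picks the best match. The toy sketch below uses a plain correlation coefficient over invented intensity vectors; the paper's actual method additionally applies wavelet feature extraction and a 43-species database, neither of which is reproduced here:

```python
# Toy spectral discrimination: classify a measured emission vector by its
# correlation with reference spectra.  All values are hypothetical.

def correlation(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

references = {  # invented normalised intensities at six wavelengths
    "Bacillariophyta": [0.1, 0.3, 0.5, 0.9, 0.7, 0.4],
    "Dinophyta":       [0.2, 0.2, 0.4, 0.6, 0.9, 0.8],
}
measured = [0.12, 0.28, 0.52, 0.88, 0.72, 0.41]
best = max(references, key=lambda k: correlation(measured, references[k]))
print(best)  # Bacillariophyta
```

In a mixed sample, the same idea extends to fitting the measured spectrum as a weighted sum of references, which is what yields the relative-content estimates reported above.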
Abstract: OBJECTIVE: In patients undergoing cardiac surgery, reduced preoperative ejection fraction (EF) and advanced age are associated with worse outcomes. As most outcome data available for these patients come mainly from Western surgical populations involving specific surgery types, our aim was to evaluate the real-world characteristics and perioperative outcomes of surgery in senior-aged heart failure patients with reduced EF across a broad range of cardiac surgeries. METHODS: Data were obtained from the China Heart Failure Surgery Registry (China-HFSR) database, a nationwide multicenter registry study in China's Mainland. Multivariable regression analysis was performed in patients over 75 years old to identify risk factors associated with mortality. RESULTS: From 2012 to 2017, 578 senior-aged (>75 years) patients were enrolled in China-HFSR, 21.1% of whom were female. Isolated coronary artery bypass grafting (CABG) was performed in 71.6% of patients, 10.1% of patients underwent isolated valve surgery, and 8.7% received CABG combined with valve surgery. In-hospital mortality was 10.6%, and the major complication rate was 17.3%. Multivariate analysis identified diabetes mellitus (odds ratio (OR)=1.985), increased creatinine (OR=1.007), New York Heart Association (NYHA) class III (OR=1.408), NYHA class IV (OR=1.955), cardiogenic shock (OR=6.271), and preoperative intra-aortic balloon pump insertion (OR=3.426) as independent predictors of in-hospital mortality. CONCLUSIONS: In senior-aged patients, preoperative evaluation should be performed carefully, and strict management of reversible factors needs more attention. Senior-aged patients commonly have more severe disease combined with more frequent comorbidities, which may lead to a high risk of mortality.
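The predictors above are reported as odds ratios. As a worked example of the underlying arithmetic (the 2x2 counts below are hypothetical, not China-HFSR data), the crude odds ratio and its Wald 95% confidence interval can be computed as:

```python
import math

# Hypothetical 2x2 table for a binary predictor vs in-hospital death:
#               died  survived
# exposed         a       b
# unexposed       c       d
a, b, c, d = 20, 80, 50, 400

odds_ratio = (a * d) / (b * c)                      # cross-product ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(odds_ratio, lo, hi)
```

The registry's reported ORs are adjusted estimates from a multivariable model rather than crude cross-products, but the interpretation of an OR and its CI is the same.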
Abstract: HA (hashing array), a new algorithm for mining frequent itemsets in large databases, is proposed. It employs a hash-array structure, ItemArray(), to store the information of the database and then uses it in place of the database in later iterations. With this improvement, only two scans of the whole database are necessary, so the computational cost can be reduced significantly. To overcome the performance bottleneck of frequent 2-itemset mining, a modified algorithm of HA, DHA (direct-addressing hashing and array), is proposed, which combines HA with a direct-addressing hashing technique. The new hybrid algorithm, DHA, not only overcomes the performance bottleneck but also inherits the advantages of HA. Extensive simulations are conducted in this paper to evaluate the performance of the proposed algorithms, and the results show that the new algorithm is more efficient and reasonable.
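The abstract does not spell out the exact ItemArray() layout, so the sketch below shows only the general two-scan, hash-bucket idea it alludes to (in the spirit of PCY-style counting); the structure and names are assumptions, not the paper's implementation:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, minsup, n_buckets=101):
    """Two-scan frequent-pair mining with a hashed count array.

    Scan 1 counts single items and hashes every candidate pair into a
    bucket array; scan 2 counts only pairs whose items are frequent AND
    whose bucket met the support threshold, pruning most candidates.
    """
    item_counts = Counter()
    buckets = [0] * n_buckets
    for t in transactions:                       # scan 1
        item_counts.update(t)
        for a, b in combinations(sorted(set(t)), 2):
            buckets[hash((a, b)) % n_buckets] += 1
    frequent_items = {i for i, c in item_counts.items() if c >= minsup}
    pair_counts = Counter()
    for t in transactions:                       # scan 2
        for a, b in combinations(sorted(set(t) & frequent_items), 2):
            if buckets[hash((a, b)) % n_buckets] >= minsup:
                pair_counts[(a, b)] += 1
    return {p for p, c in pair_counts.items() if c >= minsup}

txns = [["milk", "bread"], ["milk", "bread", "eggs"],
        ["bread", "eggs"], ["milk", "bread"]]
print(frequent_pairs(txns, minsup=3))  # {('bread', 'milk')}
```

Hash collisions can only inflate bucket counts, so the bucket filter never discards a truly frequent pair; false candidates are removed by the exact count in the second scan.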
Funding: Supported by the National Natural Science Foundation of China (U1431227) and the Guangzhou Science and Technology Planning Project (201604010037).
Abstract: Using a subset of observed stars in a CCD image to find their corresponding matched stars in a stellar catalog is an important issue in astronomical research. Subgraph isomorphism-based algorithms are the most widely used methods in star catalog matching. When more subgraph features are provided, the CCD images are recognized better. However, when the navigation feature database is large, the method requires more time to match the observation model. To solve this problem, this study investigates and improves subgraph isomorphism matching algorithms. We present an algorithm based on a locality-sensitive hashing technique, which allocates the quadrilateral models in the navigation feature database into different hash buckets and reduces the search range to the bucket in which the observed quadrilateral model is located. Experimental results indicate the effectiveness of our method.
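The bucket idea can be sketched in a few lines: quantise a quadrilateral's invariant features so that similar models collide in the same hash bucket, then search only that bucket. The feature choice (side-length ratios) and grid size below are illustrative assumptions, not the paper's exact design:

```python
from collections import defaultdict

def bucket_key(features, cell=0.05):
    """Quantise each feature to a grid cell; nearby models share a key."""
    return tuple(int(f / cell) for f in features)

# Hypothetical catalog quadrilaterals described by three invariant ratios.
catalog = {
    "quad_A": (0.42, 0.81, 0.63),
    "quad_B": (0.43, 0.82, 0.64),   # close to quad_A -> same bucket
    "quad_C": (0.91, 0.12, 0.33),
}
buckets = defaultdict(list)
for name, feats in catalog.items():
    buckets[bucket_key(feats)].append(name)

# Matching an observed quadrilateral touches one bucket, not the whole DB.
observed = (0.42, 0.81, 0.64)
candidates = buckets.get(bucket_key(observed), [])
print(candidates)  # ['quad_A', 'quad_B']
```

A production scheme would hash with several offset grids (or random projections) so that models near a cell boundary are not missed, but the search-space reduction shown here is the essential gain.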
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 51327803 and 51406041) and the Fundamental Research Funds for the Central Universities (Grant No. HIT.NSRIF.2014090).
Abstract: Considering features of stellar spectral radiation and sky surveys, we established a computational model for stellar effective temperatures, detected angular parameters, and gray rates. Using known stellar flux data in some bands, we estimated stellar effective temperatures and detected angular parameters using stochastic particle swarm optimization (SPSO). We first verified the reliability of SPSO and then determined reasonable parameters that produced highly accurate estimates under certain gray deviation levels. Finally, we calculated 177,860 stellar effective temperatures and detected angular parameters using data from the Midcourse Space Experiment (MSX) catalog. These derived stellar effective temperatures were accurate when compared with known values from the literature. This research makes full use of catalog data and presents an original technique for studying stellar characteristics. It proposes a novel method for calculating stellar effective temperatures and detecting angular parameters, and provides theoretical and practical data for finding information about radiation in any band.
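Particle swarm optimization fits parameters by letting candidate solutions move under attraction to their own best and the swarm's best positions. The abstract does not give the SPSO variant's details, so the following is a generic PSO sketch with a toy objective standing in for the flux-fitting residual; all parameter settings are assumptions:

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=42):
    """Minimise f over a box with a basic particle swarm (generic sketch)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy 2-D objective with minimum at (1, 1), standing in for a model residual.
best, val = pso(lambda x: sum((xi - 1.0) ** 2 for xi in x), dim=2, bounds=(-5, 5))
print(best, val)
```

In the paper's setting, the objective would instead be the deviation between modelled and cataloged fluxes, with effective temperature and angular parameter as the search dimensions.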
Funding: Supported by the National Natural Science Foundation of China.
Abstract: We compare the performance of Bayesian Belief Networks (BBN), Multilayer Perceptron (MLP) networks, and Alternating Decision Trees (ADtree) on separating quasars from stars using a database drawn from the 2MASS and FIRST survey catalogs. Given a training sample of sources of known object types, the classifiers are trained to separate quasars from stars. Based on the statistical properties of the sample, the features important for classification are selected. We compare the classification results with and without feature selection. Experiments show that the results with feature selection are better than those without. From the high accuracy achieved, it is concluded that these automated methods are robust and effective for classifying point sources. They may all be applied to large survey projects (e.g. selecting input catalogs) and to other astronomical issues, such as the parameter measurement of stars and the redshift estimation of galaxies and quasars.
Funding: Supported by the National Key Basic Research and Development (973) Program of China (Nos. 2012CB315801 and 2011CB302805), the National Natural Science Foundation of China A3 Program (No. 61161140320), the National Natural Science Foundation of China (No. 61233016), and the Intel Research Councils UPO program "Security Vulnerability Analysis based on Cloud Platform with Intel IA Architecture".
Abstract: The archiving of Internet traffic is an essential function for retrospective network event analysis and forensic computer communication. The state-of-the-art approach for network monitoring and analysis involves storage and analysis of network flow statistics. However, this approach loses much valuable information within the Internet traffic. With the advancement of commodity hardware, in particular the volume of storage devices and the speed of interconnect technologies used in network adapter cards and multi-core processors, it is now possible to capture 10 Gbps and beyond of real-time network traffic using a commodity computer, with tools such as n2disk. Also, with the advancement of distributed file systems (such as Hadoop, ZFS, etc.) and open cloud computing platforms (such as OpenStack, CloudStack, and Eucalyptus), it is practical to store such large volumes of traffic data and analyze the internal communication in depth within an acceptable latency. In this paper, based on the well-known TimeMachine, we present TIFAflow, the design and implementation of a novel system for archiving and querying network flows. First, we enhance the traffic archiving system named TImemachine+FAstbit (TIFA) with flow granularity, i.e., we supply the system with a flow table and a flow module. Second, based on real network traces, we conduct performance comparison experiments of TIFAflow against other implementations such as a common database solution, TimeMachine, and the TIFA system. Finally, based on the comparison results, we demonstrate that TIFAflow achieves higher storage and query performance than TimeMachine and TIFA, in both time and space metrics.
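Flow granularity means grouping captured packets under a 5-tuple key so queries can address whole flows instead of scanning raw packets. The sketch below shows that indexing idea only; the field names and sample packets are illustrative, not TIFAflow's actual on-disk layout:

```python
from collections import defaultdict

def flow_key(pkt):
    """Classic 5-tuple identifying a flow."""
    return (pkt["src"], pkt["sport"], pkt["dst"], pkt["dport"], pkt["proto"])

# Hypothetical captured packets.
packets = [
    {"src": "10.0.0.1", "sport": 5000, "dst": "10.0.0.9", "dport": 80,
     "proto": "tcp", "len": 60},
    {"src": "10.0.0.1", "sport": 5000, "dst": "10.0.0.9", "dport": 80,
     "proto": "tcp", "len": 1500},
    {"src": "10.0.0.2", "sport": 5001, "dst": "10.0.0.9", "dport": 53,
     "proto": "udp", "len": 80},
]

# Build the flow table: per-flow packet and byte counters.
flow_table = defaultdict(lambda: {"packets": 0, "bytes": 0})
for p in packets:
    entry = flow_table[flow_key(p)]
    entry["packets"] += 1
    entry["bytes"] += p["len"]

# Query one flow directly rather than scanning every stored packet.
k = ("10.0.0.1", 5000, "10.0.0.9", 80, "tcp")
print(flow_table[k])  # {'packets': 2, 'bytes': 1560}
```

In the real system the flow table is an index into the packet archive, so a flow-level query first narrows to matching flows and only then touches the corresponding raw packets.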
Abstract: As an application of artificial intelligence and expert system technology to database design, this paper presents an intelligent design tool, NITDT, which comprises a requirements specification language, NITSL; a knowledge representation language, NITKL; and an inference engine with uncertainty reasoning capability. NITDT currently covers the requirements analysis and conceptual design stages of database design. However, it can be integrated with another database design tool, NITDBA, also developed at NIT, to become an integrated design tool supporting the whole process of database design.
Funding: The author's research is funded by the Canadian Consortium on Neurodegeneration in Aging, the Canadian Institutes for Health Research, and The Weston Brain Institute.
Abstract: Background: The NIA-AA research framework proposes a biological definition of Alzheimer's disease, in which asymptomatic persons with amyloid deposition would be considered as having the disease prior to symptoms. Discussion: Notwithstanding the fact that amyloid deposition in isolation is not associated with dementia, even the combined presence of amyloid and tau pathology does not inevitably lead to dementia over age 65. Other pathological factors may play a leading or accelerating role in age-associated cognitive decline, including vascular small vessel disease, neuroinflammation, and Lewy body pathology. Conclusion: Research should aim at understanding the interaction between all these factors, rather than focusing on them individually. Hopefully this will lead to a personalized approach to the prevention of brain aging, based on individual biological, genetic, and cognitive profiles.