Data acquisition and modeling are two important, difficult, and costly aspects of a Cybercity project. 2D GIS is mature and already manages large volumes of spatial data, so 3D GIS should make the best use of 2D GIS data and technology. Constructing a useful synthetic environment requires integrating multiple types of information, such as DEMs, texture images, and 3D representations of objects such as buildings. This paper presents a data model and visualization method for 3D city landscapes based on integrated databases. Since the volume of raster data is very large, special strategies (for example, a pyramid gridded method) must be adopted to manage raster data efficiently. Three different data acquisition methods, an appropriate data structure, and a simple modeling method are presented as well. Finally, a pilot project for Shanghai Cybercity is illustrated.
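The pyramid idea for managing large rasters can be sketched as a resolution pyramid built by repeated 2×2 block averaging. This is a minimal illustration, not the paper's actual implementation; the array size and averaging rule are assumptions.

```python
import numpy as np

def build_pyramid(raster, min_size=2):
    """Build a resolution pyramid by repeated 2x2 block averaging.
    Coarse levels let a viewer fetch only as much raster detail as
    the current zoom level needs."""
    levels = [raster]
    while min(levels[-1].shape) > min_size:
        r = levels[-1]
        h, w = r.shape[0] // 2 * 2, r.shape[1] // 2 * 2  # trim odd edges
        r = r[:h, :w]
        coarse = (r[0::2, 0::2] + r[0::2, 1::2] +
                  r[1::2, 0::2] + r[1::2, 1::2]) / 4.0
        levels.append(coarse)
    return levels

base = np.arange(64, dtype=float).reshape(8, 8)
pyr = build_pyramid(base)
print([lvl.shape for lvl in pyr])  # [(8, 8), (4, 4), (2, 2)]
```

Each level quarters the data volume, so the whole pyramid costs only about one third more storage than the base raster.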
1 Introduction Geological outcrops or sections are the basis of geological research. However, they have traditionally been presented mainly through photographs, which fall short of delivering a true visual sense (Deng et al., 2009; Hou et al., 2014). Meanwhile, image acquisition with single-lens reflex (SLR) cameras, image synthesis, large-file storage and retrieval, panoramic visualization, and network technology have continued to develop.
Building on former studies of weather simulator modules in the IPMist laboratory, work on the visual programming of a stochastic weather generator (VS-WGEN) was continued. In this study, Markov chains, Monte Carlo sampling, Fourier series, and weakly stationary processes were used to generate daily weather data in Matlab 6.0, with input from 40 years of weather records from the Beijing Weather Station. The generated data include daily maximum temperature, minimum temperature, precipitation, and solar radiation. It has been verified that the weather data generated by VS-WGEN are more accurate than those from the old WGEN when twenty new model parameters are included. VS-WGEN has wide software applications, such as pest risk analysis, pest forecasting, and pest management. It can also be used in hardware development, such as weather control in weather chambers and greenhouses for research on the ecological adaptation of crop varieties to a given location over time and space. Overall, VS-WGEN is a very useful tool for studies in theoretical and applied ecology.
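The combination of a Markov chain with Monte Carlo draws and a Fourier harmonic can be sketched as follows. The transition probabilities and temperature coefficients are illustrative assumptions, not parameters fitted to the Beijing data.

```python
import math
import random

# First-order Markov chain for daily rain occurrence; the two
# transition probabilities below are assumed, not fitted values.
P_WET_GIVEN_DRY = 0.25   # P(rain today | dry yesterday)
P_WET_GIVEN_WET = 0.60   # P(rain today | wet yesterday)

def generate_year(seed=42):
    rng = random.Random(seed)
    wet, days = False, []
    for d in range(365):
        p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = rng.random() < p                      # Monte Carlo draw
        # Seasonal cycle of daily max temperature as one Fourier harmonic
        tmax = 15.0 + 14.0 * math.sin(2 * math.pi * (d - 80) / 365)
        days.append((wet, round(tmax, 1)))
    return days

year = generate_year()
print(sum(1 for wet, _ in year if wet), "wet days simulated")
```

A real generator like VS-WGEN would fit the transition probabilities and harmonic coefficients per month from the observed station record.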
CHDTEPDB (URL: http://chdtepdb.com/) is a manually integrated database for congenital heart disease (CHD) that stores CHD expression profiling data derived from published papers, aiming to provide rich resources for investigating the correlation between human CHD and aberrant transcriptome expression. The development of human diseases involves important regulatory roles of RNAs, and expression profiling data can reflect the underlying etiology of inherited diseases. Hence, collecting and compiling expression profiling data is of critical significance for a comprehensive understanding of the mechanisms and functions that underpin genetic diseases. CHDTEPDB stores the expression profiles of over 200 datasets covering 7 types of CHD and provides users with convenient basic analytical functions. Because datasets differ in clinical indicators such as disease type and contain unavoidable measurement error, users can select the corresponding data for personalized analysis. Moreover, a submission page lets researchers submit their own data, so that additional expression profiles and other histological data can be added to the database. CHDTEPDB offers a user-friendly interface that allows users to quickly browse, retrieve, download, and analyze their target samples. CHDTEPDB will significantly improve current knowledge of expression profiling data in CHD and has the potential to become an important tool for future research on the disease.
Data are limitless, but they rarely come in the form we need. Most data providers deliver their data as Microsoft Excel spreadsheets, which are compatible with ArcGIS, the most widely used GIS (Geographic Information System) software. However, such tables often contain information that a given project does not need. Using the raw data can increase processing times and reduce the performance of geoprocessing tools. This study shows, step by step, how raw data can be processed using ArcGIS ModelBuilder and a Python script.
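The column-pruning step described above can be sketched with the standard library alone. The field names and rows here are hypothetical; the real workflow would read the provider's exported table and feed the trimmed result to ArcGIS.

```python
import csv
import io

# Hypothetical raw table: only NAME, LAT, LON are needed for the GIS layer.
raw = io.StringIO(
    "NAME,LAT,LON,FAX,INTERNAL_CODE\n"
    "Site A,37.1,-122.0,555-0100,X91\n"
    "Site B,36.9,-121.8,555-0101,X92\n"
)

KEEP = ["NAME", "LAT", "LON"]  # fields the project actually uses

out = io.StringIO()
reader = csv.DictReader(raw)
writer = csv.DictWriter(out, fieldnames=KEEP)
writer.writeheader()
for row in reader:
    writer.writerow({k: row[k] for k in KEEP})  # drop unused fields

print(out.getvalue())
```

Dropping unused fields before import is exactly what shrinks attribute tables and speeds up downstream geoprocessing.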
Optical character recognition for right-to-left, cursive languages such as Arabic is challenging and has received little attention from researchers compared with Latin-script languages. Moreover, the absence of a standard, publicly available dataset for several low-resource languages, including Pashto, has remained a hurdle to progress in language processing. Since a clean dataset is the fundamental requirement of character recognition, this research begins with dataset generation and aims toward a system capable of fully autonomous recognition of the cursive Pashto script. The first contribution of this research is a clean, standard dataset for the isolated characters of the Pashto script: a database of isolated Pashto characters covering forty-four letters in various font styles. To overcome the shortage of font styles, the graphical software Inkscape was used to generate sufficient image samples for each character. The dataset was pre-processed, reduced to 32×32 pixels, and converted into binary format with a black background and white text so that it resembles the Modified National Institute of Standards and Technology (MNIST) database. The benchmark database is publicly available for further research on the GitHub and Kaggle servers, in both pixel and Comma-Separated Values (CSV) formats.
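The resize-and-binarize step can be sketched as below. This is a simplified stand-in (nearest-neighbour sampling and a fixed threshold) for whatever pipeline the authors actually used; the glyph array is synthetic.

```python
import numpy as np

def to_mnist_style(img, size=32, threshold=128):
    """Resize a grayscale glyph to size x size (nearest neighbour)
    and binarize to white text (1) on a black background (0)."""
    h, w = img.shape
    rows = np.arange(size) * h // size   # sampled source rows
    cols = np.arange(size) * w // size   # sampled source columns
    resized = img[rows][:, cols]
    return (resized >= threshold).astype(np.uint8)

glyph = np.zeros((64, 48), dtype=np.uint8)
glyph[16:48, 12:36] = 255              # a bright stroke in the middle
binary = to_mnist_style(glyph)
print(binary.shape, int(binary.sum()))
```

After this step every sample has the same 32×32 binary shape, which is what makes the collection drop-in compatible with MNIST-style training code.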
This paper presents a tool for managing, reusing, and analysing C source code based on database techniques. Abstract information about the entire codebase is stored in a program database that serves as the conceptual scheme of the software, with the reuse component as a subscheme. Relational algebra can then be used conveniently to manage, analyse, and reuse C code. With the tool, we can manage, analyse, and reuse any component in the program database and rapidly extract the source code of any component or construct the program code of a new system. A rule system is introduced for reusing source code.
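The "relational algebra over a program database" idea can be sketched with SQLite. The schema here (a `function` relation and a `calls` relation) is an invented toy, not the paper's actual conceptual scheme.

```python
import sqlite3

# Toy program database: one table of functions, one of call relations.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE function(name TEXT PRIMARY KEY, file TEXT);
CREATE TABLE calls(caller TEXT, callee TEXT);
INSERT INTO function VALUES ('main','main.c'), ('parse','parse.c'),
                            ('emit','emit.c');
INSERT INTO calls VALUES ('main','parse'), ('main','emit'),
                         ('parse','emit');
""")

# Relational-algebra style question: which files must be reused
# together with parse.c? (join calls with function on the callee)
rows = db.execute("""
    SELECT DISTINCT f.file FROM calls c
    JOIN function f ON f.name = c.callee
    WHERE c.caller = 'parse'
""").fetchall()
print(rows)  # [('emit.c',)]
```

Selections, projections, and joins like this one are what make code queries ("extract the source of any component") straightforward once the program is in relational form.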
This paper presents a case study on the structural design and establishment of a database application system for alien species in Shandong Province, integrating Geographic Information System (GIS), computer network, and database technology into alien species research. The modules of the alien species database, including classified data input, statistics and analysis, species pictures and distribution maps, and data export, were developed with Visual Studio .NET 2003 and Microsoft SQL Server 2000. The alien species information covers classification, distinguishing characteristics, biological characteristics, area of origin, distribution area, mode and route of entry, invasion time, invasion cause, interaction with endemic species, growth state, danger state, and spatial information, i.e. distribution maps. On this basis, several modules were developed, including application, checking, modifying, printing, adding, and returning modules. Furthermore, through index tables and index maps, data such as pictures, text, and GIS maps can also be queried spatially. This research established a technological platform for sharing scientific information on alien species in Shandong Province, providing the basis for dynamic queries on alien species, early-warning technology for prevention, and a fast-reaction system. The database application system is practical, offers a friendly user interface, and is convenient to use. It supplies full and accurate information services on alien species for users and provides dynamic database management functions for the administrator.
Database applications are becoming increasingly popular, mainly due to the advanced data management facilities that the underlying database management system offers compared with traditional legacy software applications. The interaction of such applications with the database system, however, introduces a number of issues, among which this paper addresses the impact analysis of changes performed at the database schema level. Our motivation is to provide the software engineers of database applications with automated methods that facilitate major maintenance tasks, such as source code corrections and regression testing, which should be triggered by such changes. The presented impact analysis is thus two-fold: the impact is analysed in terms of both the affected source code statements and the affected test suites for these applications. The former objective is achieved with a program slicing technique based on an extended version of the program dependency graph. The latter requires the analysis of test suites generated for database applications, which is accomplished with testing techniques tailored to this type of application. Using both the slicing and the testing techniques enhances program comprehension of database applications, while also supporting a number of practical metrics of their maintainability against schema changes. To evaluate the feasibility and effectiveness of the presented techniques and metrics, a software tool called DATA has been implemented. The experimental results from its use on the TPC-C case study are reported and analysed.
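Slicing over a program dependency graph boils down to reachability. The tiny graph below is invented for illustration (the paper's extended PDG carries far richer edge kinds); it shows how a schema element's change set falls out of backward slices.

```python
from collections import deque

# Toy program dependency graph: edges point from a statement to the
# statements (or schema elements) it depends on.
pdg = {
    "s4": ["s2", "s3"],            # s4 uses values defined at s2 and s3
    "s3": ["s1"],
    "s2": ["s1"],
    "s1": ["schema:users.name"],   # s1 reads a schema element
    "schema:users.name": [],
}

def backward_slice(node):
    """Everything the given node transitively depends on."""
    seen, work = set(), deque([node])
    while work:
        n = work.popleft()
        for dep in pdg.get(n, []):
            if dep not in seen:
                seen.add(dep)
                work.append(dep)
    return seen

# Impact of changing users.name = every statement whose slice reaches it.
affected = sorted(s for s in pdg if s.startswith("s")
                  and "schema:users.name" in backward_slice(s))
print(affected)
```

The affected-statement set is what then drives both the source-code corrections and the selection of regression tests to re-run.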
There are hundreds of records in the curriculum of our Language Learning Lab every semester, and every record has several important properties. Managing this information in the traditional way takes too much time and invites mistakes. Instead, we manage the information with database technology: a program written with Visual C accesses the database, a combo box is placed in the application UI, and curriculum information is queried by selecting a room name in the combo box. The program is easy for staff members to operate, the results are accurate, and this approach improves the efficiency of information management.
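The lookup behind the combo box is just a filter by room name. The room names and records below are invented, not the lab's real data.

```python
# Minimal sketch of the query behind the room-name combo box.
curriculum = [
    {"room": "Lab-101", "course": "English Listening", "slot": "Mon 8:00"},
    {"room": "Lab-101", "course": "French Basics",     "slot": "Tue 10:00"},
    {"room": "Lab-202", "course": "Japanese Reading",  "slot": "Mon 8:00"},
]

def records_for_room(room):
    """What the UI lists after the user picks a room in the combo box."""
    return [r for r in curriculum if r["room"] == room]

for rec in records_for_room("Lab-101"):
    print(rec["course"], "@", rec["slot"])
```

In the actual application this filter would be a parameterized SQL query against the curriculum table rather than an in-memory list.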
This paper describes an extendable graphical framework, aflak, which provides a visualization and provenance management environment for the analysis of multi-spectral astronomical datasets. Via its node editor interface, aflak allows the astronomer to compose transforms on input datasets queried from public astronomical data repositories, and then to export the results of the analysis as Flexible Image Transport System (FITS) files, in such a way that the full provenance of the output data is preserved and reviewable and the exported file remains usable by other common astronomical analysis software. FITS is the standard for data interchange in astronomy. By embedding aflak's provenance data into FITS files, we achieve both interoperability with existing software and full reproducibility of the process by which astronomers make discoveries.
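One simple way to embed provenance in a FITS file is to append each applied transform as a header record, in the spirit of FITS `HISTORY` cards (fixed 80-character header lines). The pipeline steps below are invented, and aflak's actual provenance encoding may differ.

```python
# Sketch: record each transform as an 80-character FITS-style
# HISTORY card so the whole pipeline stays reviewable in the header.
def history_card(text):
    card = ("HISTORY " + text)[:80]   # truncate over-long entries
    return card.ljust(80)            # FITS cards are fixed 80 characters

provenance = []
for step in ["fetch dataset M51 from public repository",
             "apply continuum subtraction",
             "integrate flux over 6555-6575 Angstrom"]:
    provenance.append(history_card(step))

for card in provenance:
    print(repr(card.rstrip()))
print(all(len(c) == 80 for c in provenance))  # True
```

Because `HISTORY` cards are part of the FITS standard, any conforming reader will carry the provenance along untouched, which is what gives both interoperability and reproducibility.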
Based on a study of the two current methods for integrating logic programming and relational databases (interpretation and compilation), a new precompilation-based interpretive approach is proposed. It inherits the advantages of both methods while overcoming their drawbacks. A new integrated system based on this approach is presented, which has been implemented on a MicroVAX II and applied in practice as the kernel of the GKBMS knowledge base management system. Also discussed are the key implementation techniques, including the coupling of logic and relational database systems, the combination of logic and relational database languages, partial evaluation and static optimization of user programs, fact scheduling, and version management in problem-solving.
The purpose of this research is the design and implementation of a support system for learning programming. To achieve this purpose, we propose a Puzzle Programming System that uses jigsaw puzzles as an example of physical visualization, which maps logical constraints onto physical ones. The system aims to teach basic programming concepts by presenting the invisible constraints of programming language syntax through the visible constraints of jigsaw puzzle pieces. It runs on an Apple iPad and was developed with the Unity game engine, using YAML as the serialization format for structured data. By inviting high school students to try out a prototype, we confirmed the usefulness of the Puzzle Programming System. The experimental evaluation also shed light on aspects of the game that need to be redesigned and on parts of the visual programming model that need to be modified and expanded.
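The piece-compatibility idea can be sketched as follows: each puzzle piece carries a "tab" and a "socket" type, and two pieces connect only when the types match, mirroring how, say, a loop header must be followed by a body block. The piece names and types here are invented for illustration.

```python
# Hypothetical piece catalogue: tab = what a piece offers downward,
# socket = what it accepts from above.
PIECES = {
    "while-header": {"tab": "needs-body", "socket": "statement"},
    "body-block":   {"tab": "statement",  "socket": "needs-body"},
    "expression":   {"tab": "value",      "socket": "statement"},
}

def can_connect(upper, lower):
    """A piece accepts the next one only if tab and socket agree --
    the physical analogue of a syntax rule."""
    return PIECES[upper]["tab"] == PIECES[lower]["socket"]

print(can_connect("while-header", "body-block"))  # True
print(can_connect("while-header", "expression"))  # False
```

A catalogue like this is also a natural fit for YAML serialization, since each piece is just a small mapping of named fields.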
To simulate the cold roll-forming process, a new method is adopted. Its theoretical foundation is an elastic-plastic large-deformation spline finite strip method based on object-oriented programming. Combined with computer graphics technology, a visual simulation of cold roll-forming was completed and the system established. The analysis of a common channel steel section demonstrates the process, covering the theoretical method, the model, and the display of results. The simulation system is thus a mature and effective tool for analysing the cold roll-forming process.
BACKGROUND Gastric cancer (GC) is prevalent and aggressive, especially when patients have distant lung metastases, which often places them in advanced stages. By identifying prognostic variables for lung metastasis in GC patients, it may be possible to construct a good prediction model for both overall survival (OS) and the cumulative incidence prediction (CIP) plot of the tumour. AIM To investigate the predictors of GC with lung metastasis (GCLM), to produce nomograms for OS, and to generate CIP plots using cancer-specific survival (CSS) data. METHODS Data from January 2000 to December 2020 involving 1652 patients with GCLM were obtained from the Surveillance, Epidemiology, and End Results (SEER) program database. The major observational endpoint was OS; hence, patients were separated into training and validation groups. Correlation analysis determined various connections. Univariate and multivariate Cox analyses validated the independent predictive factors. Nomogram discrimination and calibration were assessed with the time-dependent area under the curve (AUC) and calibration curves. To evaluate the accuracy and clinical usefulness of the nomograms, decision curve analysis (DCA) was performed. The clinical utility of the novel prognostic model was compared with that of the 7th edition of the American Joint Committee on Cancer (AJCC) staging system using the Net Reclassification Improvement (NRI) and Integrated Discrimination Improvement (IDI). Finally, the OS prognostic model was compared with a Cox-AJCC risk stratification model modified for the AJCC system. RESULTS A CIP plot based on CSS was generated for the OS nomogram. Cox multivariate regression analysis identified eleven significant prognostic factors (P < 0.05): liver metastasis, bone metastasis, primary site, surgery, regional surgery, treatment sequence, chemotherapy, radiotherapy, positive lymph node count, N staging, and time from diagnosis to treatment. It was clear from the DCA (net benefit > 0), the time-dependent ROC curve (training/validation set AUC > 0.7), and the calibration curve (reliability slope closer to 45 degrees) that the OS nomogram demonstrated a high level of predictive efficiency. The OS prediction model (new model AUC = 0.83) also performed much better than the older Cox-AJCC model (AUC difference greater than 0) in terms of risk stratification (P < 0.0001) and verification with the IDI and NRI. CONCLUSION The OS nomogram for GCLM successfully predicts 1- and 3-year OS. Moreover, this approach can help to appropriately classify patients into high-risk and low-risk groups, thereby guiding treatment.
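Discrimination measures like the AUC reported above rest on the idea that higher predicted risk should pair with shorter survival. A minimal sketch of that idea is Harrell's concordance index over (time, event, risk score) triples; the patient data below are made up, and the study's own metrics were computed with full time-dependent methods.

```python
# Harrell's c-index on toy data: higher risk should mean earlier failure.
patients = [  # (survival months, event observed, model risk score)
    (6,  True,  0.9),
    (12, True,  0.7),
    (20, True,  0.4),
    (30, False, 0.2),   # censored: no event observed by 30 months
]

def c_index(data):
    usable, concordant = 0, 0.0
    for i in range(len(data)):
        for j in range(len(data)):
            ti, ei, ri = data[i]
            tj, _, rj = data[j]
            if ei and ti < tj:          # comparable: i failed before j
                usable += 1
                if ri > rj:
                    concordant += 1     # risk ordering agrees
                elif ri == rj:
                    concordant += 0.5   # ties count half
    return concordant / usable

print(round(c_index(patients), 2))  # 1.0 -- perfectly ordered toy data
```

A c-index of 0.5 is chance-level ranking and 1.0 is perfect; the time-dependent AUC used in the study generalizes this by evaluating discrimination at specific horizons such as 1 and 3 years.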
Funding (geological outcrop study): National Natural Science Foundation of China (Grant No. 41725007); Chinese Academy of Sciences (Grant Nos. XDB10010100 and XXH13506); State Key Laboratory of Palaeobiology and Stratigraphy, NIGPAS (Grant No. 20183127).
Funding (stochastic weather generator study): supported jointly by the Project "973" grant "Fundamental Studies on Invasion and Control of Extra Pest" (2002CB111400) and the Key Project of the Ministry of Science and Technology of China, "Development of New Technologies for Pest Forecasting" (2001BA50PB01).
Funding (alien species database study): Key Project of the Natural Science Foundation of Shandong Province (No. Z2003D05); Key Project of Environmental Protection Science of Shandong Province (No. 2004057); Outstanding Young Scientists Grants of Shandong Province (No. 2005BS08010), China.
Funding (aflak study): JSPS KAKENHI (Japan), Grant Numbers 17K00173 and 17H00737.
文摘This paper describes an extendable graphical framework,aflak,which provides a visualization and provenance management environment for the analysis of multi-spectral astronomical datasets.Via its node editor interface,aflak allows the astronomer to compose transforms on input datasets queryable from public astronomical data repositories,then to export the results of the analysis as Flexible Image Transport System(FITS)files,in a manner such that the full provenance of the output data be preserved and reviewable,and that the exported file be usable by other common astronomical analysis software.FITS is the standard of data interchange in astronomy.By embedding aflak’s provenance data into FITS files,we both achieve interoperability with existing software and full reproducibility of the process by which astronomers make discoveries.
Abstract: Based on a study of the two current methods for integrating logic programming and relational databases, interpretation and compilation, a new precompilation-based interpretive approach is proposed. It inherits the advantages of both methods while overcoming their drawbacks. A new integrated system based on this approach is presented, which has been implemented on the MicroVAX II and applied in practice as the kernel of the GKBMS knowledge base management system. The key implementation techniques are also discussed, including the coupling of logic and relational database systems, the combination of logic and relational database languages, the partial evaluation and static optimization of user programs, fact scheduling, and version management in problem solving.
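The core of coupling a logic language to a relational store is evaluating rule bodies as joins over stored relations. The following is a toy sketch of that idea, not the GKBMS implementation; the predicate names, rule, and data are invented for illustration:

```python
def eval_rule(head_vars, body, relations):
    """Evaluate one Datalog-style rule bottom-up by joining the body
    atoms against a relational store. `body` is a list of
    (predicate, argument-tuple) atoms; arguments starting with '?'
    are variables, anything else is a constant to match."""
    bindings = [{}]  # list of partial variable bindings
    for pred, args in body:
        next_bindings = []
        for b in bindings:
            for row in relations[pred]:
                nb = dict(b)
                ok = True
                for a, v in zip(args, row):
                    if a.startswith("?"):
                        if nb.setdefault(a, v) != v:  # variable already bound differently
                            ok = False
                            break
                    elif a != v:                      # constant mismatch
                        ok = False
                        break
                if ok:
                    next_bindings.append(nb)
        bindings = next_bindings
    return {tuple(b[v] for v in head_vars) for b in bindings}

# grandparent(?x, ?z) :- parent(?x, ?y), parent(?y, ?z)
relations = {"parent": [("ann", "bob"), ("bob", "cal"), ("bob", "dee")]}
rule_body = [("parent", ("?x", "?y")), ("parent", ("?y", "?z"))]
print(eval_rule(("?x", "?z"), rule_body, relations))
```

A precompilation-based system would analyse such rules ahead of time and push the joins down to the relational engine as SQL, rather than enumerating rows in the interpreter as this sketch does.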
Abstract: The purpose of this research is the design and implementation of a support system for learning programming. To achieve this purpose, we propose a Puzzle Programming System that uses jigsaw puzzles as an example of physical visualization, which maps logical constraints to physical ones. The Puzzle Programming System aims to teach basic programming concepts by presenting the invisible constraints of programming-language syntax through the visible constraints of jigsaw puzzle pieces. The system runs on an Apple iPad and was developed using the Unity game engine, with YAML as the serialization format for structured data management. By inviting high school students to try out a prototype, we were able to confirm the usefulness of the Puzzle Programming System. The experimental evaluation also shed light on aspects of the game that need to be redesigned and parts of the visual programming model that need to be modified and expanded.
Fund: This project is supported by the Provincial Natural Science Foundation of Hebei (No. 502214).
Abstract: To simulate the cold roll-forming process, a new method is adopted. Its theoretical foundation is an elastic-plastic large-deformation spline finite strip method based on object-oriented programming. Combined with computer graphics technology, a visual simulation of cold roll-forming is completed and the system is established. By analyzing a common channel steel, the process is shown and explained, covering the theoretical method, the model, and the result display. The simulation system is thus already a mature and effective tool for analyzing the cold roll-forming process.
Fund: Supported by the Peng-Cheng Talent-Medical Young Reserve Talent Training Program, No. XWRCHT20220002; the Xuzhou City Health and Health Commission Technology Project Contract, No. XWKYHT20230081; and the Key Research and Development Plan Project of Xuzhou City, No. KC22179.
Abstract: BACKGROUND Gastric cancer (GC) is prevalent and aggressive, especially when patients have distant lung metastases, which often place patients into advanced stages. By identifying prognostic variables for lung metastasis in GC patients, it may be possible to construct a good prediction model for both overall survival (OS) and the cumulative incidence prediction (CIP) plot of the tumour. AIM To investigate the predictors of GC with lung metastasis (GCLM), to produce nomograms for OS, and to generate CIP plots by using cancer-specific survival (CSS) data. METHODS Data from January 2000 to December 2020 involving 1652 patients with GCLM were obtained from the Surveillance, Epidemiology, and End Results (SEER) program database. The major observational endpoint was OS; hence, patients were separated into training and validation groups. Correlation analysis determined various connections. Univariate and multivariate Cox analyses validated the independent predictive factors. Nomogram discrimination and calibration were assessed with the time-dependent area under the curve (AUC) and calibration curves. To evaluate the accuracy and clinical usefulness of the nomograms, decision curve analysis (DCA) was performed. The clinical utility of the novel prognostic model was compared to that of the 7th edition of the American Joint Committee on Cancer (AJCC) staging system by utilizing the Net Reclassification Improvement (NRI) and Integrated Discrimination Improvement (IDI). Finally, the OS prognostic model and a Cox-AJCC risk stratification model modified for the AJCC system were compared. RESULTS For the purpose of creating the OS nomogram, a CIP plot based on CSS was generated. Cox multivariate regression analysis identified eleven significant prognostic factors (P < 0.05): liver metastasis, bone metastasis, primary site, surgery, regional surgery, treatment sequence, chemotherapy, radiotherapy, positive lymph node count, N staging, and time from diagnosis to treatment. It was clear from the DCA (net benefit > 0), the time-dependent ROC curve (training/validation set AUC > 0.7), and the calibration curve (reliability slope closer to 45 degrees) that the OS nomogram demonstrated a high level of predictive efficiency. The OS prediction model (new model AUC = 0.83) also performed much better than the old Cox-AJCC model (AUC difference between the new model and the old model greater than 0) in terms of risk stratification (P < 0.0001) and verification using the IDI and NRI. CONCLUSION The OS nomogram for GCLM successfully predicts 1- and 3-year OS. Moreover, this approach can help to appropriately classify patients into high-risk and low-risk groups, thereby guiding treatment.
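The categorical NRI used above to compare the nomogram with the AJCC model has a simple closed form: the net proportion of events moved up a risk category plus the net proportion of non-events moved down. A minimal sketch with invented risk classifications (not the study's data):

```python
def net_reclassification_improvement(old_risk, new_risk, event):
    """Categorical NRI: (net fraction of events reclassified upward)
    + (net fraction of non-events reclassified downward).
    `old_risk`/`new_risk` are ordered category labels (e.g. 0 = low,
    1 = high); `event` is 1 if the outcome occurred, else 0."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, ev in zip(old_risk, new_risk, event):
        if ev:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Illustrative data: 4 events and 4 non-events (0 = low risk, 1 = high risk).
old = [0, 0, 1, 1, 0, 1, 0, 0]
new = [1, 1, 1, 1, 0, 0, 0, 0]
ev  = [1, 1, 1, 1, 0, 0, 0, 0]
print(net_reclassification_improvement(old, new, ev))  # 2/4 + 1/4 = 0.75
```

A positive NRI, as reported for the nomogram against the Cox-AJCC model, means the new model reclassifies patients in the clinically correct direction more often than not.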