Objectives: The aim of this study was to investigate and develop a data storage and exchange format for the process of automatic systematic reviews (ASR) of traditional Chinese medicine (TCM). Methods: A lightweight and commonly used data format, JavaScript Object Notation (JSON), was adopted in this study. We designed a fully described data structure, based on JSON syntax, to collect TCM clinical trial information. Results: A compact and expressive data format, JSON-ASR, was developed. JSON-ASR uses a plain-text format of key/value pairs and consists of six sections with more than 80 preset pairs. JSON-ASR adopts extensible structured arrays to support multi-group and multi-outcome situations. Conclusion: JSON-ASR is lightweight, flexible, and scalable, making it well suited to the complex data of clinical evidence.
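The exact JSON-ASR schema is not reproduced in this abstract, but the key/value sections and the extensible arrays for multi-group, multi-outcome trials can be sketched in Python; all field names below are illustrative assumptions, not the published specification.

```python
import json

# Hypothetical trial record in a JSON-ASR-like shape: plain key/value
# pairs, with arrays so several arms and outcomes can coexist.
trial = {
    "identification": {"title": "Herbal formula X for condition Y",
                       "registry_id": "ILLUSTRATIVE-0001"},
    "design": {"type": "RCT", "blinding": "double-blind"},
    "groups": [  # extensible array: one entry per study arm
        {"name": "intervention", "n": 60, "treatment": "formula X"},
        {"name": "control", "n": 58, "treatment": "placebo"},
    ],
    "outcomes": [  # extensible array: one entry per reported outcome
        {"name": "response rate", "type": "dichotomous",
         "events": [42, 25], "totals": [60, 58]},
    ],
}

text = json.dumps(trial, indent=2)   # serialize for storage/exchange
restored = json.loads(text)          # round-trip back to Python objects
assert restored["groups"][1]["treatment"] == "placebo"
```

Because the format is plain text, any JSON-capable tool can parse such records without a dedicated reader, which is what makes the format convenient for exchange pipelines.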
On November 4th, AQSIQ (General Administration of Quality Supervision, Inspection and Quarantine of the People's Republic of China), SAC (Standardization Administration of China), the National Audit Office of China (CNAO), and the Ministry of Finance of China jointly held a press conference in Beijing on the national standard Information Technology - Data Interface of Accounting Software (GB/T 19581-2004). The standard was approved and issued on September 20, 2004 by AQSIQ and SAC, and would take effect nationwide on January 1, 2005.
The development of network technology brings new opportunities to traditional manufacturing industries. Application technology for rapid response manufacturing in distributed network environments, that is, how to take advantage of intranets and the Internet to combine the numerous manufacturing resources spread around a region, a country, or even the globe, is the key to agile design and manufacturing and to building comprehensive competitive power; it is also an important research direction in the field of advanced manufacturing technology. Rapid response manufacturing in a distributed network environment is a new manufacturing pattern that can be used to implement the concept of agile design and manufacturing, but it brings new problems that directly influence an enterprise's ability to respond rapidly and can lead to the failure of the alliance and the loss of orders. In this paper, we establish approaches to solve these problems in the product development process. The paper then presents research on key application technologies and solutions, including: a network safety strategy that guarantees data transfer among alliance members; production data management based on Web/DOT (Distributed Object Technology) and XML criteria, which guarantees data exchange in environments with varying data structures; and a network platform that provides conversion services between different types of CAD files. All of these solutions target technology problems existing in distributed network environments and among alliance members.
Finally, the paper illustrates these solutions through the establishment of the online application service system for the Shanghai Advanced Manufacturing Technology Research Center.
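As a rough illustration of XML-based product data exchange between alliance members whose internal structures differ (the element names here are invented for the example, not taken from the paper's actual criteria):

```python
import xml.etree.ElementTree as ET

# One partner builds a small product-data document...
doc = ET.Element("product", {"id": "P-100"})
ET.SubElement(doc, "name").text = "bracket"
ET.SubElement(doc, "revision").text = "B"

payload = ET.tostring(doc, encoding="unicode")  # serialize for transfer
received = ET.fromstring(payload)               # ...and the other parses it back
assert received.find("name").text == "bracket"
```

Because both sides agree only on the XML vocabulary rather than on internal data structures, each member can map the exchanged elements onto its own storage, which is the point of using XML criteria in a structure-variant environment.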
Chronic myeloid leukemia (CML) in minors is a rare disease that has been effectively treatable with tyrosine kinase inhibitors (TKIs) since the year 2000. Most pediatricians will encounter only one or two CML patients in the course of their careers and will typically have to rely on written information along with their own intuition to provide care. Knowledge of response to TKIs and of age-specific side effects shapes the design of pediatric CML trials in many ways, aiming to contribute toward greater predictability of clinical improvements. A registry on a rare disease like CML offers the enormous benefit of enabling treating physicians to interact and share their collective experience. The International Registry on Pediatric CML (IR-PCML) was founded at Poitiers, France, almost 10 years ago. Since then, the number of collaborating centers and, in parallel, of registered patients has continuously increased (more than 550 patients as of December 2019). Ideally, data from a given treatment center in a country are transferred to a national coordinator, who interacts with the IR-PCML. In terms of quality assurance, the registry can disseminate knowledge on state-of-the-art diagnostics (including reference appraisal), optimal treatment approaches, and follow-up procedures within a network whose strength lies in participation. With continuous growth in recent years, very rare subgroups of patients could be identified (e.g., CML diagnosed at age <3 years, or children presenting with specific problems at diagnosis or during treatment) that had not been described before. Publications from the IR-PCML have disseminated this useful information, derived from patients who actively participate and share information about their disease, among themselves and with their caregivers and clinicians. Patient input driving the collection of data on this rare leukemia is the basis for the considerable success of bringing new therapeutics into clinical use.
An ever-increasing number of sensor resources are being exposed via the World Wide Web to become part of the Digital Earth. Discovery, selection, and use of these sensors and their observations require a robust sensor information model, but the consistent description of sensor metadata is a complex and difficult task. Currently, the only available robust model is SensorML, which is intentionally designed in a very generic way. Because of this generality, interoperability can hardly be achieved without application profiles that further constrain the use and expressiveness of the root language. So far, such SensorML profiles have been developed only to a limited extent. This work describes a new approach for defining sensor metadata, the Starfish Fungus Language (StarFL) model. This language follows a more restrictive approach and incorporates concepts from the recently published Semantic Sensor Network Ontology to overcome the key issues users are experiencing with SensorML. StarFL defines a restricted vocabulary and model for sensor metadata to achieve a high level of interoperability and straightforward reusability of sensor descriptions.
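The StarFL vocabulary itself is not listed in the abstract, but the core idea of a restricted model, where a closed set of terms replaces free-form description, can be sketched as follows (the quantity names and class shape are assumptions for illustration, not StarFL's actual terms):

```python
from dataclasses import dataclass

# Illustrative closed vocabulary in the spirit of a restricted profile.
ALLOWED_QUANTITIES = {"air_temperature", "wind_speed", "relative_humidity"}

@dataclass
class SensorDescription:
    sensor_id: str
    observed_quantity: str
    unit: str

    def __post_init__(self):
        # A fully generic model would accept any string here; a restricted
        # model rejects terms outside the agreed vocabulary, which is what
        # makes descriptions interoperable and reusable across systems.
        if self.observed_quantity not in ALLOWED_QUANTITIES:
            raise ValueError(f"unknown quantity: {self.observed_quantity}")

s = SensorDescription("thermo-01", "air_temperature", "Cel")
```

The trade-off is deliberate: expressiveness is sacrificed so that any two systems using the same vocabulary interpret a description identically.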
The vision of a Digital Earth calls for more dynamic information systems, new sources of information, and stronger capabilities for their integration. Sensor networks have been identified as a major information source for the Digital Earth, while Semantic Web technologies have been proposed to facilitate integration. So far, sensor data are stored and published using the Observations & Measurements standard of the Open Geospatial Consortium (OGC) as the data model. With the advent of Volunteered Geographic Information and the Semantic Sensor Web, work on an ontological model gained importance within Sensor Web Enablement (SWE). In contrast to data models, an ontological approach abstracts from implementation details by modeling the physical world from the perspective of a particular domain. Ontologies restrict the interpretation of vocabularies toward their intended meaning. The ongoing paradigm shift to Linked Sensor Data complements this effort. Two questions have to be addressed: (1) how to refer to changing and frequently updated data sets using Uniform Resource Identifiers, and (2) how to establish meaningful links between those data sets, that is, observations, sensors, features of interest, and observed properties? In this paper, we present a Linked Data model and a RESTful proxy for OGC's Sensor Observation Service to improve the integration and interlinkage of observation data for the Digital Earth.
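The two questions above can be illustrated with a minimal sketch: mint time-stamped URIs so that references into a growing data set stay stable, and record typed links between observations, sensors, features of interest, and observed properties. The URI pattern and predicate names below are assumptions for illustration, not the paper's actual scheme.

```python
BASE = "http://example.org/sos"  # hypothetical proxy base URI

def observation_uri(offering, timestamp):
    # Time-stamped URIs keep references stable even as new observations
    # are appended to the underlying data set (question 1).
    return f"{BASE}/observations/{offering}/{timestamp}"

obs = observation_uri("water-level", "2011-06-01T12:00:00Z")

# Typed links between the SWE resources (question 2), written as
# subject/predicate/object triples in the Linked Data style.
triples = [
    (obs, "om:procedure",         f"{BASE}/sensors/gauge-17"),
    (obs, "om:featureOfInterest", f"{BASE}/features/river-section-3"),
    (obs, "om:observedProperty",  f"{BASE}/properties/waterlevel"),
]
```

A RESTful proxy in this style would dereference each such URI to a representation of the observation, so the links double as navigable API paths.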
The exchange of information between transmission system operators (TSOs) and distribution system operators (DSOs) is common practice. However, the evolution of regulatory frameworks in Europe has increased the need to enhance TSO-DSO data exchange and interoperability. This paper provides an overview of TSO-DSO data exchanges and demonstrates best practices using the International Electrotechnical Commission (IEC) Common Information Model (CIM), including the implementation of the IEC Common Grid Model Exchange Standard (CGMES), and discusses the corresponding advantages, disadvantages, and challenges. Furthermore, this paper evaluates and reports on activities already carried out within European projects, with a particular focus on TSO-DSO interoperability. Finally, this paper concludes that TSOs and DSOs need to rely on standards-based solutions when performing TSO-DSO data exchange, enabling the efficient operation and development of future power systems.
The Geoscience Markup Language (GeoSciML) has been developed to enable the interchange of geoscience information, principally that portrayed on geological maps as well as boreholes. A GeoSciML testbed was developed both to test the implementation of the data model and to test its application in web services. The OneGeology-Europe project aims to use the GeoSciML data model, building on the experience of the GeoSciML testbed, to implement a geoportal for a harmonised geological map of Europe at 1:1 million scale. This involves the integration of web services from 20 participating organisations. An important objective of OneGeology-Europe is to contribute to the Infrastructure for Spatial Information in the European Community (INSPIRE), both through the development of a geological data specification and through the use of the INSPIRE technical architecture. GeoSciML and the OneGeology-Europe project are also steps towards incorporating geoscience data into a Digital Earth. Both the development of GeoSciML and the implementation of web services for GeoSciML and OneGeology-Europe have followed a standards-based methodology. The technical architecture comprises a geoportal providing access to a Catalogue Service for the Web, with metadata describing both the data and the services available. OneGeology-Europe will provide both Web Map Service (view) and Web Feature Service (download) services, which aim to be compliant with the INSPIRE implementing rules.
We can adequately study broad global issues and policies only by taking the geosciences into account. Our research and decision-making must share and make effective use of interdisciplinary data sources, models, and processes. Non-interoperability impedes the sharing of data and computing resources. Standards from the Open Geospatial Consortium (OGC) and other organizations are the basis for successfully deploying a seamless, distributed information infrastructure for the geosciences. Several specifications now adopted through the OGC consensus process are the result of OGC interoperability initiatives. The OGC standards, deployment architectures, and interoperability initiatives are described, showing how the OGC standards baseline has been developed and how it applies to the geosciences.
Integration of data across multiple independently developed data sources can be challenging due to the variety of heterogeneities that exist across such systems. Data mediation technologies provide approaches for overcoming these heterogeneities. Standards such as the Geoscience Markup Language can address some heterogeneity issues by providing schema standards to which sources can adhere. This article addresses the issue of semantic heterogeneity across information resources by using domain ontologies and registering schema elements and data values to those ontologies. Registering data to ontologies provides a powerful search and data integration capability across disparate geoscience information resources.
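A minimal sketch of what registration to a domain ontology buys: once fields from independent sources are mapped to the same concept, queries can be posed at the concept level rather than per source. The source, field, and concept names are invented for illustration, not taken from the article.

```python
# "Registration" table: (source, schema element) -> ontology concept.
ONTOLOGY_REGISTRATION = {
    ("source_a", "rock_type"): "geo:LithologyTerm",
    ("source_b", "lithology"): "geo:LithologyTerm",
    ("source_b", "sample_depth"): "geo:DepthQuantity",
}

def concept_for(source, field):
    # Resolve a source-specific field to its registered concept.
    return ONTOLOGY_REGISTRATION.get((source, field))

def fields_for_concept(concept):
    # The inverse lookup: find every source field registered to a concept,
    # which is what enables integrated search across disparate sources.
    return [(s, f) for (s, f), c in ONTOLOGY_REGISTRATION.items() if c == concept]

matches = fields_for_concept("geo:LithologyTerm")
```

A mediator can then translate one concept-level query into per-source queries over `rock_type` and `lithology` without the user knowing either schema.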
In this paper, novel mesh techniques are proposed for wind field simulation of flexible spatial structures. For mesh generation, an interpolation strategy is presented to obtain a mesh system with variable density. Two spatial structure examples are used to examine the efficiency and applicability of this technique. Then, based on the structured mesh system generated by this technique, the mesh nodal coordinates are updated to adapt to moving boundary conditions by means of mapping interpolation functions, and examples are given to verify the effectiveness. Furthermore, a constrained counterforce distribution technique and a projection interpolation strategy are developed to implement the data exchange on the wind-structure interaction surface. Finally, the computational accuracy is numerically validated.
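The paper's own interpolation strategy is not detailed in the abstract; one common way to obtain a mesh with variable density, shown here purely as an illustrative stand-in, is geometric grading of node spacing away from a refined boundary:

```python
def graded_nodes(x0, x1, n, ratio):
    # Geometric grading: each cell is `ratio` times wider than the last,
    # concentrating nodes near x0 (e.g., near the structure surface).
    widths = [ratio ** i for i in range(n)]
    total = sum(widths)
    xs, x = [x0], x0
    for w in widths:
        x += (x1 - x0) * w / total
        xs.append(x)
    return xs

# Five cells on [0, 1], growing by a factor of 1.5 away from x = 0.
nodes = graded_nodes(0.0, 1.0, 5, 1.5)
```

The same one-dimensional grading applied along each coordinate direction yields a structured mesh whose density varies smoothly, which is the general effect the abstract's interpolation strategy is after.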
With the development of information technology over the past 12 years, China has established specialized, vertical, web-based information systems for the collection of data on diseases and related risk factors. These systems are collectively described as public health information systems (PHIS) in China. Because PHIS has evolved in a haphazard and fragmented way under administrative, economic, and legal pressures, local health workers, overburdened with multiple and time-consuming reporting requirements, cannot deliver timely, accurate, and complete data. Given that disease surveillance information in China is provided by hospitals and health service centers, and that electronic medical records (EMR) and electronic health records (EHR) have been developing under national health policy and investment since 2009, there should be a connection and data-sharing mechanism between EMR/EHR and PHIS to support public health surveillance and decision-making. This paper discusses the current status and problems of PHIS in China, briefly introduces the blueprint for health information technology in China, explores solutions for interoperability between PHIS and EHR/EMR, and shares experiences and lessons from a pilot project on automatic notifiable infectious disease reporting.
Fixture design and planning is one of the most important manufacturing activities, playing a pivotal role in deciding the lead time for product development. Fixture design, which affects part quality in terms of geometric accuracy and surface finish, can be enhanced by using the product manufacturing information (PMI) stored in the neutral Standard for the Exchange of Product model data (STEP) file, thereby integrating design and manufacturing. The present paper proposes a fixture design approach that extracts geometry information from STEP application protocol (AP) 242 files of computer-aided design (CAD) models to provide automatic suggestions of locator positions and clamping surfaces. Automatic feature extraction software, "FiXplan", developed in the programming language C#, is used to extract part feature, dimension, and geometry information. The information from the STEP AP 242 file is deduced using geometric reasoning techniques, which in turn are utilized for fixture planning. The developed software is adept at identifying the primary, secondary, and tertiary locating faces and locator position configurations of prismatic components. Structural analysis of the prismatic part under different locator positions was performed using the commercial finite element software ABAQUS, and the optimized locator position was identified on the basis of minimum deformation of the workpiece. The area-ratio (base locator enclosed area (%) / workpiece base area (%)) for the ideal locator configuration was observed to be 33%. Experiments were conducted on a prismatic workpiece using a specially designed fixture, for different locator configurations.
The surface roughness and waviness of the machined surfaces were analysed using an Alicona non-contact optical profilometer. The best surface characteristics were obtained for the surface machined under the ideal locator positions with an area-ratio of 33%, validating the predicted numerical results. The efficiency, capability, and applicability of the developed software are demonstrated for the finishing operation of a sensor cover, a typical prismatic component with applications in the naval industry, under different locator configurations. The best results were obtained under the proposed ideal locator configuration with an area-ratio of 33%.
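The area-ratio used above is simply the area enclosed by the base locators expressed as a percentage of the workpiece base area; with illustrative dimensions (not the paper's actual workpiece), the reported ideal value of roughly 33% looks like this:

```python
def area_ratio(locator_enclosed_area, workpiece_base_area):
    # Area-ratio as defined in the abstract: the area enclosed by the
    # base locators as a percentage of the workpiece base area.
    return 100.0 * locator_enclosed_area / workpiece_base_area

# Illustrative numbers only: a 150 mm x 100 mm base (15000 mm^2) with
# locators enclosing 4950 mm^2 gives the ~33% ratio reported as ideal.
r = area_ratio(4950.0, 150.0 * 100.0)
```

Given a candidate locator layout, this one-line metric lets a planner check quickly whether the configuration is near the empirically identified optimum before running a full finite element analysis.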
Funding (JSON-ASR study): the National Key R&D Program of China (Grant No. 2019YFC1709803) and the National Natural Science Foundation of China (Grant No. 81873183).
Funding (Linked Sensor Data paper): this work was developed within the 52°North semantics community and partly funded by the European projects UncertWeb (FP7-248488), ENVISION (FP7-249170), and the GENESIS project (an Integrated Project, contract number 223996).
Funding (TSO-DSO data exchange paper): the OneNet, TDX-ASSIST, EU-SysFlex, and INTER-RFACE projects, funded by the European Union's Horizon 2020 Research and Innovation Programme (Grants No. 957739, No. 774500, No. 773505, and No. 824330).
Funding (data mediation article): the US National Science Foundation, via grants 0225673 and 0744229.
Funding (wind field mesh study): the National Natural Science Foundation of China (No. 50778111) and the Doctoral Disciplinary Special Research Project of the Chinese Ministry of Education (No. 200802480056).
文摘In this paper,novel mesh techniques are proposed for wind field simulation of flexible spatial structure.For mesh generation,an interpolation strategy is presented to obtain a mesh system with variable density.Two spatial structure examples are used to examine the efficiency and applicability of this technique.Then based on the structured mesh system generated by the technique,the mesh nodal coordinates are updated to adapt the moving boundary conditions by means of the mapping interpolation functions and some examples are given to verify the effectiveness.Furthermore,the constrained counterforce distribution technique and projection interpolation strategy are developed to implement the data exchange on the interaction surface of wind and structure.Finally,the computational accuracy is numerically validated.
Abstract: With the development of information technology over the past 12 years, China has established specialized, vertical, web-based information systems for the collection of data on diseases and related risk factors. These systems are described as public health information systems (PHIS) in China. Because PHIS have evolved in a haphazard and fragmented way under administrative, economic, and legal pressures, local health workers, who are overburdened with multiple and time-consuming reporting requirements, cannot deliver timely, accurate, and complete data. Given that the information for disease surveillance in China is provided by hospitals and health service centers, and that the development of electronic medical records (EMR) and electronic health records (EHR) has proceeded under national health policy and investment since 2009, there should be a connection and data-sharing mechanism between EMR/EHR and PHIS to support public health surveillance and public health decision-making. This paper discusses the current status and problems of PHIS in China, briefly introduces the blueprint of health information technology in China, explores solutions for interoperability between PHIS and EHR/EMR, and shares experiences and lessons from a pilot project on automatic notifiable infectious disease reporting.
Funding: the Department of Science and Technology, Government of India, for providing financial support under the FIST scheme (No. SR/FST/ETI-388/2015).
Abstract: Fixture design and planning is one of the most important manufacturing activities, playing a pivotal role in deciding the lead time for product development. Fixture design, which affects part quality in terms of geometric accuracy and surface finish, can be enhanced by using the product manufacturing information (PMI) stored in the neutral standard for the exchange of product model data (STEP) file, thereby integrating design and manufacturing. The present paper proposes a unique fixture design approach to extract the geometry information from STEP application protocol (AP) 242 files of computer-aided design (CAD) models, providing automatic suggestions of locator positions and clamping surfaces. Automatic feature extraction software, "FiXplan", developed using the programming language C#, is used to extract part feature, dimension, and geometry information. The information from the STEP AP 242 file is deduced using geometric reasoning techniques, which in turn are utilized for fixture planning. The developed software is shown to be adept at identifying the primary, secondary, and tertiary locating faces and the locator position configurations of prismatic components. Structural analysis of the prismatic part under different locator positions was performed using the commercial finite element method software ABAQUS, and the optimized locator position was identified on the basis of minimum deformation of the workpiece. The area ratio (base locator enclosed area/workpiece base area) for the ideal locator configuration was observed to be 33%. Experiments were conducted on a prismatic workpiece using a specially designed fixture for different locator configurations. The surface roughness and waviness of the machined surfaces were analysed using an Alicona non-contact optical profilometer. The best surface characteristics were obtained for the surface machined under the ideal locator positions having an area ratio of 33%, thus validating the predicted numerical results. The efficiency, capability, and applicability of the developed software are demonstrated for the finishing operation of a sensor cover, a typical prismatic component with applications in the naval industry, under different locator configurations. The best results were obtained under the proposed ideal locator configuration with an area ratio of 33%.
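The area ratio used to rank locator configurations can be computed directly from the locator coordinates. The sketch below uses the shoelace formula for the locator-enclosed area; the workpiece dimensions and locator coordinates are hypothetical numbers chosen only to illustrate a configuration near the reported 33% ratio, not data from the paper.

```python
def polygon_area(pts):
    """Shoelace formula for the area enclosed by planar points."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1]
            - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def area_ratio(locator_pts, base_w, base_h):
    """Locator-enclosed area as a fraction of the workpiece base area."""
    return polygon_area(locator_pts) / (base_w * base_h)

# Illustrative 3-locator layout on an assumed 100 mm x 60 mm base,
# placed so the enclosed triangle covers about a third of the base.
r = area_ratio([(10, 5), (90, 5), (50, 54.5)], 100, 60)
# r == 0.33
```

Evaluating this ratio for each candidate layout gives a cheap geometric screen before running the full finite element deformation analysis.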