Funding: Sponsored by the major tendering projects of the National Social Sciences Foundation "Study on Accelerating Economic Adjustment and Coordinated Development" (Grant No. 12&ZD084) and "Study on Contribution of Consumption to Economic Growth under Shifting Demand Structure" (Grant No. 15ZDC011), and by the National Social Sciences Foundation projects "Study on China's Structural Growth Deceleration, Transition Risks and Efficiency Improvement Path" (Grant No. 14AJL006) and "Study on the Scale, Spatial Clustering and Management Model of Chinese Cities" (Grant No. 15ZDC011).
Abstract: The transition from the middle-income to the high-income stage is fraught with risks of growth divergence. Economic transition is clouded by the following possibilities: (1) a falling share of the industrial sector through industrial depression and weakening growth momentum caused by large urbanization costs; (2) the subordination of the service sector as a result of nearly irreversible industrial specialization, which falters the process of service sector transition and upgrading; (3) inefficient allocation of knowledge production and human capital upgrading due to the absence of incentivized compensation for knowledge consumption. We suggest that a country should reshape its efficiency model, with the upgrading of the knowledge factor and human capital as the prerequisite. Given the dilemmas of transition, China should embrace the factorization trend of the service sector and reshape its efficiency model through institutional reform, ensuring that the service sector develops in tandem with the industrial sector.
Abstract: Understanding the definitions of, and differences between, processes, knowledge processes, and business processes is the first step in integrating knowledge processes into an organization's management systems. The next step is to understand, throughout the company, why these processes should be introduced and continuously maintained. Knowledge is one of the most valuable assets of a company and a relevant part of its intellectual capital. Managing knowledge and its lifecycle can give the organization a market advantage. In the nuclear industry this is a vital requirement for maintaining the safe and reliable operation of a nuclear facility or radiation safety activities. Companies that have already implemented an integrated management system were following international standards or good practices (such as ISO 9001, EFQM, and the Standard Nuclear Performance Model developed by the Nuclear Energy Institute (NEI)). This article focuses on nuclear industry organizations, and on approaches and methods for integrating knowledge processes into the management system. This is the last step of knowledge management implementation in an organization. When it is done, we can say that the knowledge processes are embedded in the organization's day-to-day life and that knowledge is managed in the organization like all other resources.
Funding: Support from the Defense Threat Reduction Agency (DTRA) under Grant No. HDTRA12110012, with Dr. Richard Fry as the Program Officer, and partial project support from the Air Force Office of Scientific Research (AFOSR) under Grant No. FA9550-24-1-0017, with Dr. Chiping Li as the Program Officer.
Abstract: This research explores the integration of large language models (LLMs) into scientific data assimilation, focusing on combustion science as a case study. Leveraging foundational models integrated with a Retrieval-Augmented Generation (RAG) framework, the study introduces an approach to process diverse combustion research data spanning experimental studies, simulations, and literature. The multifaceted nature of combustion research emphasizes the critical role of knowledge processing in navigating and extracting valuable information from a vast and diverse pool of sources. The developed approach minimizes computational and economic expenses while optimizing data privacy and accuracy. It incorporates prompt engineering and offline open-source LLMs, offering users autonomy in selecting base models. The study provides a thorough examination of text segmentation strategies, conducts comparative studies between LLMs, and explores various optimized prompts to demonstrate the effectiveness of the framework. By incorporating an external vector database, the framework outperforms a conventional LLM in generating accurate responses and constructing robust arguments. Additionally, the study investigates optimized prompt templates for efficient extraction of information from the scientific literature. Furthermore, we present a targeted scaling study to quantify the algorithmic performance of the framework as the number of prompt tokens increases. The research addresses concerns related to hallucinations and false research articles by introducing a custom workflow with a detection algorithm to filter out inaccuracies. Despite identified areas for improvement, the framework consistently delivers accurate domain-specific responses with minimal human oversight. The prompt-agnostic approach introduced holds promise for future improvements. The study underscores the significance of integrating LLMs and knowledge processing techniques in scientific research, providing a foundation for advancements in data assimilation and utilization.
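The abstract does not give implementation details; the following is only a loose, minimal sketch of the retrieval step such a RAG pipeline relies on. The embedding function, corpus chunks, and prompt template below are placeholder assumptions, not the authors' code.

```python
# Minimal sketch of the retrieval step in a Retrieval-Augmented Generation (RAG)
# pipeline, in the spirit of the abstract above. The embedding model, corpus,
# and prompt template are placeholders, not the authors' implementation.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: replace with any offline sentence-embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

# Offline "vector database": pre-embedded text chunks from papers and simulations.
corpus = [
    "Laminar flame speed of methane-air mixtures at 1 atm ...",
    "Ignition delay times measured in a shock tube for n-heptane ...",
    "Large-eddy simulation of a turbulent jet flame (Sandia Flame D) ...",
]
index = np.vstack([embed(c) for c in corpus])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    q = embed(query)
    scores = index @ q
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

def build_prompt(query: str) -> str:
    """Augment the user question with retrieved context before calling the offline LLM."""
    context = "\n".join(retrieve(query))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"

print(build_prompt("What is the ignition delay of n-heptane?"))
```

In a real deployment the placeholder embedding would be an offline embedding model and the corpus would come from segmented papers and simulation reports, which is where the text segmentation strategies discussed above matter.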
Funding: Support provided by the National Natural Science Foundation of China (22122802, 22278044, and 21878028), the Chongqing Science Fund for Distinguished Young Scholars (CSTB2022NSCQ-JQX0021), and the Fundamental Research Funds for the Central Universities (2022CDJXY-003).
Abstract: To equip data-driven dynamic chemical process models with strong interpretability, we develop a light attention–convolution–gate recurrent unit (LACG) architecture with three sub-modules—a basic module, a brand-new light attention module, and a residue module—that are specially designed to learn the general dynamic behavior, transient disturbances, and other input factors of chemical processes, respectively. Combined with a hyperparameter optimization framework, Optuna, the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process. The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models, including the feedforward neural network, convolution neural network, long short-term memory (LSTM), and attention-LSTM. Moreover, compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1, the LACG parameters are demonstrated to be interpretable, and more details on the variable interactions can be observed from the model parameters in comparison with the traditional interpretable model attention-LSTM. This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling, paving a route to intelligent manufacturing.
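The abstract names the sub-modules but not their exact layer definitions; the following is only a rough structural sketch of an attention-convolution-GRU stack in PyTorch, with illustrative layer sizes and wiring that are assumptions rather than the published LACG model.

```python
# Rough structural sketch of an attention-convolution-GRU stack for dynamic
# process data, inspired by the LACG description above. Layer sizes, wiring,
# and the attention form are illustrative assumptions, not the authors' model.
import torch
import torch.nn as nn

class AttnConvGRU(nn.Module):
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.attn = nn.Linear(n_features, n_features)        # light attention over inputs
        self.conv = nn.Conv1d(n_features, hidden, kernel_size=3, padding=1)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)  # learns general dynamics
        self.head = nn.Linear(hidden, 1)                      # e.g. discharge flowrate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features)
        weights = torch.softmax(self.attn(x), dim=-1)         # per-step feature weights
        x = x * weights
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)      # local transient patterns
        out, _ = self.gru(x)
        return self.head(out[:, -1])                          # predict the next value

model = AttnConvGRU(n_features=8)
y = model(torch.randn(4, 60, 8))   # 4 windows of 60 time steps, 8 sensors
print(y.shape)                     # torch.Size([4, 1])
```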
Abstract: We present a step-by-step approach for constructing a framework for knowledge process analysis (KPA). We intend to apply this framework to the analysis of our own research projects in an exploratory way and to elaborate it through the accumulation of case studies. This study is based on a methodology consisting of knowledge process modeling, primitives synthesis, and reflective verification. We describe the details of the methodology and present the results of case studies: a novel methodology, a practical work guide, and a tool for KPA; insights for improving future research projects and education; and the integration of existing knowledge creation theories.
Abstract: This paper analyzes the relationship between entrepreneurial orientation and new product development performance from the perspective of the knowledge creation process. Through a questionnaire survey, we found that entrepreneurial orientation is positively related to new product performance, and that the knowledge creation process plays a mediating role in this relationship. This article examines the role of entrepreneurial orientation in new product innovation performance in the Chinese context, and it is the first to test the mediating function of each dimension of the knowledge creation process between entrepreneurial orientation and new product development performance.
Abstract: Knowledge transfer is widely emphasized as a strategic issue for firm competition. A model for intra-firm horizontal knowledge transfer is proposed to address shortcomings in current knowledge transfer research. The concept model of intra-firm horizontal knowledge transfer is described, and a framework is provided to define the main components of the transfer process. In horizontal knowledge transfer, knowledge is transferred from the source to recipients at the same hierarchical level as the target. Horizontal knowledge transfer constitutes a strategic area of knowledge management research; however, little is known about the circumstances under which one particular mechanism is the most appropriate. To address these issues, some significant conclusions are drawn concerning knowledge transfer mechanisms in a real-world setting.
Abstract: Product modeling and process planning, two essential bases of realizing concurrent engineering, are investigated. A logical modeling technique, a grammar representation scheme for technology knowledge, and an architecture of an expert system for process planning within a concurrent engineering environment are proposed. They have been utilized in a real research project.
Funding: The National Outstanding Young Scientist Foundation by NSFC (No. 60425206), the National Natural Science Foundation of China (No. 60503020), and the Natural Science Foundation of Jiangsu Province (No. BK2006094).
Abstract: In order to reduce the knowledge reasoning space and improve knowledge processing efficiency, a framework of distributed attribute reduction in concept lattices is presented. By employing an idea similar to that of the rough set, the characterizations of core attributes, dispensable attributes, and unnecessary attributes are described from the point of view of local formal contexts and virtual global contexts. A determinant theorem of attribute reduction is derived. Based on these results, an approach for distributed attribute reduction is presented. It first performs reduction independently on each local context using existing approaches, and then local reducts are merged to compute reducts of global contexts. An algorithm implementation is provided and its effectiveness is validated. The distributed reduction algorithm not only improves computation efficiency but also avoids the problems caused by existing approaches, such as data privacy and communication overhead.
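As a schematic of the "reduce locally, then merge" workflow described above: the local reduction routine and the merge rule below are placeholders, since the paper's determinant theorem governs how local reducts actually combine into global reducts.

```python
# Schematic of the "reduce locally, then merge" workflow described above.
# The local reduction routine and the merge rule are placeholders; the paper's
# determinant theorem specifies how local reducts combine into global reducts.
from typing import Dict, List, Set

def local_reduct(context: Dict[str, Set[str]]) -> Set[str]:
    """Placeholder: compute an attribute reduct of one local formal context."""
    # In practice this would call an existing concept-lattice reduction algorithm.
    return {a for attrs in context.values() for a in attrs if not a.startswith("redundant")}

def merge_reducts(local_reducts: List[Set[str]]) -> Set[str]:
    """Placeholder merge: here simply the union of the local reducts."""
    merged: Set[str] = set()
    for r in local_reducts:
        merged |= r
    return merged

# Each site keeps its own local context; only reducts (attribute names) are shared,
# which is what lets the approach avoid moving raw data between sites.
site_contexts = [
    {"obj1": {"a", "b", "redundant_x"}, "obj2": {"a"}},
    {"obj3": {"b", "c"}, "obj4": {"c", "redundant_y"}},
]
global_reduct = merge_reducts([local_reduct(c) for c in site_contexts])
print(global_reduct)   # e.g. {'a', 'b', 'c'}
```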
Abstract: The discovery of useful forecasting rules from observational weather data is an outstanding and interesting topic. The traditional methods of acquiring forecasting knowledge are manual analysis and investigation performed by human scientists. This paper presents the experimental results of an automatic machine learning system which derives forecasting rules from real observational data. We tested the system on two large real data sets from the areas of central China and Victoria, Australia. The experimental results show that the forecasting rules discovered by the system are very competitive with those of human experts. The forecasting accuracy rates are 86.4% and 78% for the two data sets, respectively.
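The paper's learning system is not described at code level here; as a generic, hypothetical illustration of deriving if-then forecasting rules from observational data, the sketch below fits a decision tree to synthetic weather-like features and prints the induced rules.

```python
# Generic illustration of deriving if-then forecasting rules from observations
# with a decision tree; synthetic data and scikit-learn stand in for the paper's
# own machine learning system, which is not described at code level here.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Columns: pressure (hPa), relative humidity (%), temperature (C) -- invented data.
X = rng.uniform([980, 40, 5], [1035, 100, 35], size=(500, 3))
rain = ((X[:, 0] < 1005) & (X[:, 1] > 75)).astype(int)   # toy labelling rule

tree = DecisionTreeClassifier(max_depth=3).fit(X, rain)
print(export_text(tree, feature_names=["pressure_hPa", "humidity_pct", "temp_C"]))
```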
Abstract: The paper presents our research efforts motivated by the apparent need to combine conventional, preexisting computing functions with novel knowledge-based functions. This has been likened to what occurred in the evolution of primates, where the 'new brain' (the cortex) was added to, layered upon, and given control over the 'old brain' common to the less complex animals.
Abstract: The simulator is a combination of computational functions describing the physical system, distributed rule and knowledge bases, and a reasoning machine, all executing in parallel. The hardware, comprising two levels of multiprocessor network (a T800 transputer and STD-BUS computer systems) together with analogue and I/O boards, has been used successfully to train and simulate a 100 MW power system at a low cost. The system has wide applications in industry.
Abstract: Although many theories regarding the implementation of knowledge management (KM) in organizations have been proposed and studied, most applications tend to stand alone without incorporating the business processes. Different categories of knowledge provide different benefits, and how to integrate various categories of KM into a hybrid approach as an effective KM practice remains strategically important and yet understudied. Therefore, in this paper a hybrid model that integrates principal KM applications for new service development (NSD) and measures the resulting financial benefits has been developed. The proposed KM model incorporates newsgroups, knowledge forums, knowledge asset management, and knowledge application processes as a hybrid means for sharing organizational knowledge along two axes: explicit vs. implicit and individual vs. collective. One of the largest management consulting companies in Taiwan, China, whose process model of NSD stood alone without KM applications, was selected for the case study. A set of hybrid KM processes was developed to implement the proposed KM model, illustrating an application with greater financial benefits from integrating hybrid KM practices into the business process. Based on knowledge value added (KVA) validation, the proposed KM model provides a new operating system for sharing NSD knowledge within an organization. Through the case study and the measurement of the achieved financial results, the proposed KM model is found to provide an exclusive hybrid platform with an empirical process model to address innovative approaches and the practical value of KM within an organization.
Funding: The National Natural Science Foundation of China (Nos. 51405299 and 51175340) and the Natural Science Foundation of Shanghai (No. 14ZR1428700).
Abstract: Auto body process monitoring and root cause diagnosis based on data-driven approaches are vital ways to improve the dimensional quality of sheet metal assemblies. However, during the launch of mass production, when an off-line measurement strategy is used, traditional statistical methods struggle to perform process control effectively. Drawing on the powerful information fusion abilities of Bayesian methods, a systematic Bayesian-based quality control approach is presented to solve quality problems under conditions of an incomplete dataset. For process monitoring, a Bayesian estimation method is used to give out-of-control signals in the process. With the abnormal evidence, the Bayesian network (BN) approach is employed to identify the fixture root causes. A novel BN structure and conditional probability training methods based on process knowledge representation are proposed to obtain the diagnostic model. Furthermore, based on the diagnostic performance analysis, a case study is used to evaluate the effectiveness of the proposed approach. Results show that the Bayesian-based method has better diagnostic performance for multi-fault cases.
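As a toy illustration of the Bayesian root-cause reasoning sketched above: the priors, likelihoods, and single-fault simplification below are invented for illustration and are not the paper's trained Bayesian network.

```python
# Minimal illustration of Bayesian root-cause reasoning for a fixture fault,
# in the spirit of the approach above. The priors and likelihoods are invented
# numbers, and the single-fault structure is a simplification of the paper's BN.
priors = {"pin_wear": 0.05, "clamp_loose": 0.03, "no_fault": 0.92}
# P(out-of-control deviation observed | cause)
likelihood = {"pin_wear": 0.85, "clamp_loose": 0.70, "no_fault": 0.02}

def posterior(evidence_observed: bool) -> dict[str, float]:
    """Update fault probabilities with Bayes' rule given one monitoring signal."""
    unnorm = {
        c: priors[c] * (likelihood[c] if evidence_observed else 1 - likelihood[c])
        for c in priors
    }
    z = sum(unnorm.values())
    return {c: p / z for c, p in unnorm.items()}

print(posterior(evidence_observed=True))
# pin_wear becomes the most probable root cause once the deviation is observed
```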
Funding: Supported by the Comunidad de Madrid under Grant No. S2009/TIC-1692 and the Spanish MEC under Grant Nos. TIN2007-67353-C02-01 and TIN2008-06735-C02-01.
Abstract: This paper considers the eventual leader election problem in asynchronous message-passing systems where an arbitrary number t of processes can crash (t < n, where n is the total number of processes). It considers weak assumptions both on the initial knowledge of the processes and on the network behavior. More precisely, initially a process knows only its identity and the fact that process identities are different and totally ordered (it knows neither n nor t). Two eventual leader election protocols and a lower bound are presented. The first protocol assumes that a process also knows a lower bound α on the number of processes that do not crash. This protocol requires the following behavioral properties from the underlying network: the graph made up of the correct processes and fair lossy links is strongly connected, and there is a correct process connected to (n-f)-α other correct processes (where f is the actual number of crashes in the considered run) through eventually timely paths (paths made up of correct processes and eventually timely links). This protocol is not communication-efficient, in the sense that each correct process has to send messages forever. The second protocol is communication-efficient: after some time, only the final common leader has to send messages forever. This protocol does not require the processes to know α, but requires stronger properties from the underlying network: each pair of correct processes has to be connected by fair lossy links (one in each direction), and there is a correct process whose n-f-1 output links to the rest of the correct processes have to be eventually timely. A matching lower bound result shows that any eventual leader election protocol must have runs with this number of eventually timely links, even if all processes know all the process identities. In addition to being communication-efficient, the second protocol has another noteworthy efficiency property: whether the run is finite or infinite, all the local variables and message fields have a finite domain in the run.
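As a toy sketch of the local rule that eventual-leader (Omega) protocols of this kind converge on, namely that each process elects the smallest identity it does not currently suspect of having crashed; the timing, link, and message assumptions of the two protocols above are not modelled here.

```python
# Toy sketch of the rule most eventual-leader (Omega) protocols converge on:
# each process elects the smallest identity it currently believes has not crashed.
# The message-passing, timing, and link assumptions of the protocols above are
# not modelled; this only illustrates the local leader-selection step.
from typing import Set

def current_leader(known_ids: Set[int], suspected_crashed: Set[int]) -> int:
    """Return the smallest identity not currently suspected of having crashed."""
    alive = known_ids - suspected_crashed
    return min(alive)

ids = {3, 7, 12, 25}
print(current_leader(ids, suspected_crashed=set()))     # 3
print(current_leader(ids, suspected_crashed={3, 7}))    # 12: once correct processes
# eventually stop suspecting the same correct process, they agree on one leader
```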
Funding: Supported by the Shenzhen Science and Technology Program (KCXFZ20201221173412035), the National Natural Science Foundation of China (51761135024), the UK-China Research & Innovation Partnership Fund through the Met Office Climate Science for Service Partnership (CSSP) China as part of the Newton Fund (Project: Climate Risk Assessment Tool for Chinese Cities), and the UK-China Cooperation on Climate Change Risk Assessment (Phase 3) for financial support.
Abstract: One of the key issues in climate risk management is to develop climate-resilient infrastructure so as to ensure the safety and sustainability of urban functioning systems as well as mitigate the adverse impacts associated with increasing climate hazards. However, conventional methods of assessing risks do not fully address the interaction of various subsystems within the city system and are unable to consolidate the diverse opinions of various stakeholders on their assessments of sector-specific risks posed by climate change. To address this gap, this study advances an integrated systems analysis tool, the Climate Risk Assessment of Infrastructure Tool (CRAIT), and applies it to analyze and compare the extent of risk factor exposure and vulnerability over time across five critical urban infrastructure sectors in Shanghai and Shenzhen, two cities that have distinctive geo-climate profiles and histories of infrastructure development. The results show a significantly higher level of variation between the two cities in terms of vulnerability levels than of exposure. More specifically, the sectors of critical buildings, water, energy, and information & communication in Shenzhen have significantly higher vulnerability levels than in Shanghai in both the 2000s and the 2050s. We further discuss the vulnerability levels of subsystems in each sector and propose twelve potential adaptation options for the roads system based on four sets of criteria: technical feasibility, flexibility, co-benefits, and policy compatibility. The application of CRAIT is bound to be a knowledge co-production process with local experts and stakeholders. This knowledge co-production process highlights the importance of management advancements and nature-based green solutions in managing climate change risk in the future, though differences are observed across the efficacy categories due to the geographical and meteorological conditions in the two cities. This study demonstrates that this knowledge co-creation process is valuable in facilitating policymakers' decision-making and their feedback to scientific understanding in climate risk assessment, and that this approach has general applicability for cities in other regions and countries.