AIM: To investigate the role of 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) in the diagnosis of small pancreatic cancer. METHODS: This study involved 31 patients with proven invasive ductal cancer of the pancreas. The patients were divided into 3 groups according to the maximum diameter of the tumor: TS1 (maximum tumor size ≤ 2.0 cm), TS2 (> 2.0 cm and ≤ 4.0 cm) or TS3-4 (> 4.0 cm). The relationships between the TS and various diagnostic tools, including FDG-PET with dual time point evaluation, were analyzed. RESULTS: The tumors ranged from 1.3 to 11.0 cm in diameter. Thirty of the 31 patients (97%) had a positive FDG-PET study. There were 5 patients classified as TS1, 15 as TS2 and 11 as TS3-4. The sensitivities of FDG-PET, computed tomography (CT) and magnetic resonance imaging (MRI) were 100%, 40% and 0% in TS1; 93%, 93% and 89% in TS2; and 100%, 100% and 100% in TS3-4. The sensitivity of FDG-PET was significantly higher than that of CT and MRI in patients with TS1 (P < 0.032). The mean standardized uptake values (SUVs) did not differ significantly across the TS groups (TS1: 5.8 ± 4.5, TS2: 5.7 ± 2.2, TS3-4: 8.2 ± 3.9). All the TS1 tumors (13 to 20 mm) showed higher SUVs in the delayed phase than in the early phase of FDG-PET with dual time point evaluation, which suggested the lesions were malignant. CONCLUSION: These results indicate that FDG-PET with dual time point evaluation is a useful modality for the detection of small pancreatic cancers with a diameter of less than 20 mm.
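The dual-time-point criterion described above can be sketched as a retention-index check. The formula is the standard percent change in SUV between the two acquisitions; the "greater than zero" malignancy cutoff is an illustrative assumption, not the study's protocol.

```python
# Sketch of dual-time-point FDG-PET evaluation: a lesion whose delayed-phase
# SUV exceeds its early-phase SUV is flagged as suspicious for malignancy.
# The positive-retention cutoff below is an assumption for illustration.

def retention_index(suv_early, suv_delayed):
    """Percent change in SUV between the early and delayed acquisitions."""
    return 100.0 * (suv_delayed - suv_early) / suv_early

def is_suspicious(suv_early, suv_delayed):
    # Rising uptake over time (positive retention index) suggests malignancy.
    return retention_index(suv_early, suv_delayed) > 0.0

print(retention_index(4.0, 6.0))  # 50.0
print(is_suspicious(4.0, 6.0))    # True
```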
In the last century, there has been significant development of methods to predict ground movement due to underground extraction, and remarkable advances in three-dimensional computational methods have supported civil engineering, subsidence engineering and mining engineering practice. However, the ground movement problem arising from a mining extraction sequence is effectively four-dimensional (4D), and rational prediction is becoming more and more important for long-term underground mining planning. Hence, computer-based analytical methods that realistically simulate the spatially distributed, time-dependent ground movement process are needed for reliable long-term underground mining planning that minimizes surface environmental damage. In this research, a new computational system is developed to simulate 4D ground movement by combining a stochastic medium theory, the Knothe time-delay model and geographic information system (GIS) technology. All the calculations are implemented by a computational program in which GIS components are used to realize the spatial-temporal analysis model. A tight coupling strategy based on the component object model of GIS technology is used to overcome the problems of the complex three-dimensional extraction model and spatial data integration. Moreover, the implementation of the computational interfaces of the developed tool is described. The GIS-based tool is validated by two case studies. Because the computational tool and models are realized within the GIS system, an effective and efficient calculation methodology is obtained, and the simulation of 4D ground movement due to an underground mining extraction sequence can be carried out by applying the developed tool in GIS.
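The Knothe time-delay component named above can be sketched with its standard closed form for an instantaneous extraction, w(t) = w_f (1 − e^(−ct)), where w_f is the final subsidence and c is the time coefficient. The parameter values below are illustrative, not taken from the paper's case studies.

```python
# Minimal sketch of the Knothe time-delay model: surface subsidence w(t)
# approaches its final value w_f at a rate set by the time coefficient c.
# w_f = 2.0 m and c = 0.5 per year are illustrative assumptions.
import math

def knothe_subsidence(w_final, c, t):
    """Time-dependent subsidence after an instantaneous extraction."""
    return w_final * (1.0 - math.exp(-c * t))

# Subsidence converges toward the final value as t grows.
for t in (0.0, 1.0, 5.0, 50.0):
    print(t, round(knothe_subsidence(2.0, 0.5, t), 4))
```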
This study looks into new perspectives in the assessment of preschoolers at risk for learning disabilities. Specifically, two innovative assessment approaches are examined in order to reveal new research perspectives. The first tool, a traditional approach, is the "Early Dyslexia Identification Test"; the second, a computerized approach, is an Internet-based Speech Pathology Diagnostic Expert System named "APLo". Both evaluate the sectors of phonological awareness, memory, psychomotor development, pre-writing and pre-reading skills in Greek. The findings of the current study point in three directions: (1) the complementarity of speech-language and learning disorders as a systemic approach, (2) the diagnosis of suspicious (risk) factors for learning disabilities even at preschool age, and (3) the application of alternative methods of assessment aiming for a multidimensional approach, with the combined prospect and potential of web tools in the early diagnosis of and intervention in learning disabilities.
Software tools have been developed for the computer realization of syntactic, semantic, and morphological models of natural language texts, using rule-based programming. The tools are efficient for a language that has free word order and a rich morphological structure, such as Georgian. For instance, a Georgian verb has several thousand verb forms. It is very difficult to express the rules of morphological analysis with a finite automaton, and doing so would be inefficient as well; some problems of full morphological analysis of Georgian words cannot be resolved by a finite automaton at all. Splitting some Georgian verb forms into morphemes requires a non-deterministic search algorithm, which needs many backtrackings. To minimize backtracking, it is necessary to impose the constraints that exist among morphemes and verify them as soon as possible, to avoid false directions of search. The software tool for syntactic analysis has means to reduce rules that have the same members in different orders. The authors used the tool for semantic analysis as well. Thus, the proposed software tools provide many means to construct an efficient parser, and to test and correct it. The authors realized morphological and syntactic analysis of Georgian texts with these tools. In the present paper, the authors describe the software tools and their application to the Georgian language.
The present paper examines the impact of air temperature on electricity demand. The annual maximum load is recorded for the years 2009 through 2012. A curve-fitting technique is applied, with mathematical and computational tools, to the actual values for 2009 through 2012, considering the lower values, the higher values and the average values of the annual maximum loads for the Kingdom of Bahrain. For each of the three scenarios, a model is obtained by the curve-fitting technique. Finally, a model of the actual loads is obtained, which yields the values closest to those observed.
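The curve-fitting step above can be sketched as an ordinary least-squares line fit of annual peak load against year. The yearly peak-load figures below are invented placeholders, not the Bahrain data used in the paper.

```python
# Least-squares line fit y = a*x + b, the simplest instance of the
# curve-fitting technique described above. The load values (MW) are
# hypothetical and chosen to lie exactly on a line for clarity.
def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

years = [2009, 2010, 2011, 2012]
peak_load = [1930.0, 2010.0, 2090.0, 2170.0]   # hypothetical MW values
a, b = fit_line(years, peak_load)
print(a, b)          # slope 80.0 MW/year, intercept -158790.0
print(a * 2013 + b)  # 2250.0 -- extrapolated 2013 peak
```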
Derivatives are fundamental to mathematical calculation; however, for some functions, applying the differentiation rules directly may lead to cumbersome steps. This paper therefore provides a simple way, using a transformation approach, to differentiate reciprocal functions.
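The abstract does not spell out the transformation; a plausible reading, sketched here as an assumption, is to rewrite the reciprocal implicitly and differentiate, avoiding the quotient rule:

```latex
% Write y = 1/f(x) as y\,f(x) = 1 and differentiate both sides:
%   y'f + y f' = 0 \implies y' = -\frac{y f'}{f},
% which, substituting y = 1/f, gives
\left(\frac{1}{f(x)}\right)' = -\,\frac{f'(x)}{f(x)^{2}}.
```

For example, with f(x) = x² + 1 this yields (1/(x²+1))′ = −2x/(x²+1)² in one step, without invoking the quotient rule.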
Brain tumors are a major cause of mortality among children and adults. This article proposes an improved method based on magnetic resonance imaging (MRI) of the brain and image segmentation. Automated classification is motivated by the need for high accuracy when dealing with human life. Detection of brain tumors is a challenging problem due to the high diversity in tumor appearance and ambiguous tumor boundaries. MRI images are chosen for the detection of brain tumors because they are well suited to imaging soft tissues. First, image preprocessing is used to improve image quality. Second, a multi-scale decomposition using the dual-tree complex wavelet transform is used to analyze the texture of an image. Feature extraction then derives features from the image using the gray-level co-occurrence matrix (GLCM). A neuro-fuzzy technique is used to classify brain tumor stages as benign, malignant, or normal based on the texture features. Finally, the tumor location is detected using Otsu thresholding. The performance of the classifier is evaluated on the basis of classification accuracy. The simulated results show that the proposed classifier provides better accuracy than the previous method.
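The GLCM step described above can be sketched in a few lines. The example below uses a single horizontal one-pixel offset, a 4-level toy image, and one derived statistic (contrast); real pipelines, including this paper's, use several offsets, quantization levels and texture statistics.

```python
# Minimal gray-level co-occurrence matrix (GLCM) for a horizontal offset
# of one pixel, plus the standard contrast statistic. The 4-level image
# is an illustrative toy, not data from the paper.
def glcm_horizontal(image, levels):
    """Co-occurrence counts of gray-level pairs (i, j) one pixel apart."""
    glcm = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):
            glcm[a][b] += 1
    return glcm

def contrast(glcm):
    """GLCM contrast: sum of (i - j)^2 weighted by normalized counts."""
    total = sum(sum(row) for row in glcm)
    return sum((i - j) ** 2 * c / total
               for i, row in enumerate(glcm) for j, c in enumerate(row))

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
g = glcm_horizontal(img, 4)
print(g[0][0], g[1][1], g[2][2])  # 2 2 3
print(contrast(g))                # 7/12, about 0.583
```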
The lack of a standard for electronic circuit modeling has led to the development of many tools and modeling languages for electronic circuits, so several tools are needed at the different description stages of a design. This paper presents a tool called SF^2HDL (Stateflow to Hardware Description Language or State Transition Table) that translates a finite state machine in the state transition diagram representation, as described by the Stateflow tool, either into a standard input file for the TABELA program or directly into behavioral VHDL (Very High Speed Integrated Circuits Hardware Description Language). The TABELA program was used to optimize the finite state machine. After that, the TAB2VHDL program was used to generate VHDL code at the register transfer level, which permits comparisons with results obtained by synthesis. The finite state machine must be described by the Mealy model, and the user can describe the machine at a high level of abstraction using all the Simulink supports. The tool was very efficient in computational cost and translated several cases into the two VHDL description models. Every translated state machine was simulated and implemented on the device EP2C20F484C7 using the Quartus II environment.
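A Mealy machine of the kind SF^2HDL translates can be represented as a transition table mapping (state, input) to (next state, output) — the same information a state transition table file carries. The rising-edge detector below is a toy example, not one of the paper's test cases.

```python
# A Mealy machine as a transition table: output depends on both the
# current state and the current input. This toy machine emits 1 exactly
# when the input rises from 0 to 1.
TABLE = {
    ("S0", 0): ("S0", 0),
    ("S0", 1): ("S1", 1),   # rising edge seen: emit 1
    ("S1", 0): ("S0", 0),
    ("S1", 1): ("S1", 0),
}

def run(table, start, inputs):
    """Simulate the Mealy machine, returning the output sequence."""
    state, outputs = start, []
    for bit in inputs:
        state, out = table[(state, bit)]
        outputs.append(out)
    return outputs

print(run(TABLE, "S0", [0, 1, 1, 0, 1]))  # [0, 1, 0, 0, 1]
```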
This paper surveys the state of the art of artificial intelligence techniques to investigate the relationships between the cognitive actions addressed in the steps of mathematical modeling and computational semiotics activities. It also briefly reviews the main techniques of artificial intelligence, with particular emphasis on intelligent systems techniques. The analysis uses semiotic concepts in order to identify new techniques for modeling intelligent systems through the integrated use of mathematical and computational tools. Finally, having established that semiotics can contribute to the study of intelligent systems, a methodology for computational semiotics modeling is proposed, based on the formalization of concepts extracted from the semiotic theory of Charles Sanders Peirce.
Transformers utilizing HTS (high-temperature superconductors) are considered a timely invention. The number of power transformers more than 30 years old and nearing retirement is increasing. If this window of opportunity is not seized, there will be great reluctance to replace recently installed, highly priced capital assets. Major projects developing HTS transformers are making good progress in the United States, Europe, Japan, Korea and China, which indicates the level of interest. These efforts must be justified through the economic benefit of the discounted losses. Consequently, it is very important to develop an understanding of the fundamental HTS transformer design issues that can provide guidance for developing practical devices of interest to the electric utility industry. The parameters of an HTS transformer need to be validated before any effort is made to model the behaviour of a distribution network under a range of conditions. The predicted performance and reliability of HTS transformers can then be verified through network modelling and analysis calculations. The ultimate purpose is to furnish electric utilities with precise information as to which HTS transformers work in various applications with greater technical efficiency and proven reliability.
A particle swarm optimization (PSO) algorithm is presented for the layout stage of integrated circuit (IC) design. Particle swarm optimization, based on swarm intelligence, is a relatively new evolutionary computational tool that has been successfully applied in function optimization, neural network design, classification, pattern recognition, signal processing, robotics and so on. A modified algorithm is presented and applied to IC layout. For a given layout plane, the algorithm first generates the corresponding grid group from the barriers and the nets' ports, following the idea of gridless net routing, and establishes an initial fuzzy matrix; it then exploits PSO's global optimization character to find the best layout route, if one exists. Simulation results indicate that the PSO algorithm is feasible and efficient in IC layout design.
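The PSO family the paper adapts can be sketched in its textbook form. The example below minimizes a toy 2-D sphere function; the inertia and cognitive/social coefficients are common defaults, and none of the paper's grid-group or fuzzy-matrix machinery is reproduced.

```python
# Minimal particle swarm optimization: each particle is pulled toward its
# own best position (pbest) and the swarm's best (gbest). Coefficients
# (0.7, 1.5, 1.5) are textbook defaults, not the paper's tuned values.
import random

def pso(f, dim=2, particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)
best, val = pso(sphere)
print(val)  # close to 0
```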
In most network analysis tools, the computation of the shortest paths between all pairs of nodes is a fundamental step in the discovery of other properties, among them closeness centrality, a measure that shows how central a vertex is in a given network. In this paper, the authors present a method to compute the all-pairs shortest paths on graphs that exhibit two characteristics: an abundance of nodes with degree one, and the existence of articulation points along the graph. These characteristics are present in many real-life networks, especially networks with a power-law degree distribution, as is the case for biological networks. The method compacts the degree-one nodes onto their source node, then uses the network's articulation points to disconnect the network and compute the shortest paths within the biconnected components. In a final step, the proposed method merges the results to provide the shortest paths of the whole network. The method achieves a remarkable speedup over state-of-the-art shortest-path methods: as much as a 7-fold speedup on artificial graphs and a 3.25-fold speedup on real application graphs. Unlike previous research, the performance improvement requires no elaborate setup, since the algorithm can process significant instances on an ordinary workstation.
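The degree-one compaction step described above can be sketched as follows: leaves are detached, BFS all-pairs shortest paths are computed on the remaining core, and leaf distances are recovered as core distances plus one hop per attached leaf. The sketch assumes a connected unweighted graph in which every degree-one vertex hangs off a non-leaf vertex; the paper's further splitting at articulation points is omitted for brevity.

```python
# All-pairs shortest paths with degree-one compaction: peel leaves,
# BFS the core, then re-attach leaf distances as core distance + 1.
from collections import deque

def bfs_dists(adj, src):
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def apsp_compacted(adj):
    # Map each degree-one vertex to the core vertex it hangs off.
    leaves = {u: next(iter(vs)) for u, vs in adj.items() if len(vs) == 1}
    core = {u: {v for v in vs if v not in leaves}
            for u, vs in adj.items() if u not in leaves}
    dist = {u: bfs_dists(core, u) for u in core}
    # Re-attach leaves: d(leaf, x) = d(parent, x) + 1 per leaf endpoint.
    full = {}
    for u in adj:
        full[u] = {}
        for v in adj:
            du = dist[leaves.get(u, u)][leaves.get(v, v)]
            full[u][v] = du + (u in leaves) + (v in leaves) if u != v else 0
    return full

adj = {1: {2}, 2: {1, 3, 4}, 3: {2, 4}, 4: {2, 3, 5}, 5: {4}}
d = apsp_compacted(adj)
print(d[1][5])  # 3, via 1-2-4-5; only the 3-vertex core was BFS-searched
```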
Recognizing the drawbacks of stand-alone computer-aided tools in engineering, several hybrid systems have been suggested with varying degrees of success. In transforming a design concept into a finished product, in particular, smooth interfacing of the design data is crucial to reduce product cost and time to market. A product model that contains the complete product description, and computer-aided tools that can understand each other, are the primary requirements for achieving this interfacing goal. This article discusses a development methodology for hybrid engineering software systems, with particular focus on the application of soft computing tools such as genetic algorithms and neural networks. Forms of hybridization are discussed and the applications are elaborated using two case studies. The primary aim is to develop hybrid systems that combine the strengths of each tool: the learning, pattern recognition and classification power of neural networks with the powerful capacity of genetic algorithms for global search and optimization. While most optimization tasks need some form of model, there are many processes in the mechanical engineering field that are difficult to model using conventional techniques. The proposed hybrid system handles such difficult-to-model processes and contributes to the effort of smoothly interfacing design data to other downstream processes.
The theoretical analysis discussed in this work is a suitable mathematical tool by which the performance of the proposed collector can be predicted. The experimental results coincide with the theoretical data obtained from the devised computer program. A controlled output temperature can be obtained from the proposed system. The performance of the tested collector under the proposed intermittent flow conditions surpasses that of the conventional thermosiphon flow collector.
Generalized B-splines have been employed as geometric modeling and numerical simulation tools for isogeometric analysis (IGA for short). However, the models previously used in IGA, such as trigonometric generalized B-splines or hyperbolic generalized B-splines, are not a unified mathematical representation of conics and polynomial parametric curves/surfaces. In this paper, a unified approach to constructing generalized non-uniform B-splines over the space spanned by {α(t), β(t), ξ(t), η(t), 1, t, ..., t^(n-4)} is proposed, and the corresponding isogeometric analysis framework for PDE solving is also studied. Compared with the NURBS-IGA method, the proposed framework has several advantages, such as high accuracy and easy-to-compute derivatives and integrals, owing to its non-rational form. Furthermore, with the proposed spline models, isogeometric analysis can be performed on computational domains bounded by transcendental curves/surfaces, such as the involute of a circle, the helix/helicoid, the catenary/catenoid and the cycloid. Several numerical examples of isogeometric heat conduction problems are presented to show the effectiveness of the proposed methods.
Metallic implants are commonly used in various orthopaedic surgeries, such as fracture fixation, spinal instrumentation, joint replacement and bone tumour surgery. Patients may need to adapt to the fixed dimensions of standard implants, which may result in a suboptimal fit to the host bones and possible adverse clinical results. Standard traditional implants may also not address reconstructive challenges such as severe bone deformity or bone loss after implant loosening and bone tumour resection. With the advent of digital technologies in medical imaging, computer programming in three-dimensional (3D) modelling and computer-assisted tools for the precise placement of implants, patient-specific implants (PSI) have gained more attention in complex orthopaedic reconstruction. Additive manufacturing technology, in contrast to conventional subtractive manufacturing, is a flexible process that can fabricate anatomically conforming implants matching the patient's anatomy and surgical requirements. Complex internal structures with porous scaffolds can also be built to enhance osseointegration for better implant longevity. Although basic studies have suggested that additively manufactured (AM) metal structures are good engineered biomaterials for bone replacement, little peer-reviewed literature is available on the clinical results of the new types of implants. This article gives an overview of the metallic materials commonly used for fabricating orthopaedic implants; describes metal-based additive manufacturing technology and the processing chain for metallic implants; discusses the features of AM implants; reports the current status of orthopaedic surgical applications; and comments on the challenges of AM implants in orthopaedic practice.
Compressed sensing is a new signal acquisition method that acquires a signal in compressed form and then recovers it by the use of computational tools and techniques. This means that fewer measurements of the signal are needed, saving a large amount of time and storage space. In this paper, we consider the compressed sensing of sparse integer-valued signals (referred to as "q-states signals" throughout the paper). In order to accelerate reconstruction, we adopt sparse rather than dense measurement matrices. Using methods and tools developed in statistical physics, we locate the reconstruction limit for the L0-reconstruction method and propose a belief-propagation-based algorithm that can deal with instances of large size; its typical reconstruction performance is also analyzed.
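The setting above — a few measurements of a sparse integer-valued signal through a sparse matrix — can be illustrated with a toy recovery routine. This is not the paper's belief-propagation algorithm; it is a brute-force check for 1-sparse signals, included only to show that m < n measurements can determine a sufficiently sparse x exactly.

```python
# Toy compressed sensing: y = A x with a sparse 0/1 measurement matrix A
# and a 1-sparse integer signal x. Recovery tries each column and checks
# whether a single integer coefficient explains all measurements.
def recover_1sparse(A, y):
    """Recover x with at most one nonzero integer entry from y = A x."""
    m, n = len(A), len(A[0])
    for j in range(n):
        col = [A[i][j] for i in range(m)]
        rows = [i for i in range(m) if col[i] != 0]
        if not rows:
            continue
        c = y[rows[0]] / col[rows[0]]
        if c == int(c) and all(y[i] == c * A[i][j] for i in range(m)):
            x = [0] * n
            x[j] = int(c)
            return x
    return None

# 3 measurements suffice for this length-6 signal with one nonzero entry.
A = [[1, 0, 1, 0, 1, 0],
     [0, 1, 1, 0, 0, 1],
     [1, 1, 0, 1, 0, 0]]
x_true = [0, 0, 0, 0, 3, 0]
y = [sum(a * b for a, b in zip(row, x_true)) for row in A]
print(recover_1sparse(A, y))  # [0, 0, 0, 0, 3, 0]
```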
A molecular-level kinetics model has been developed for the pyrolysis of heavy residual oil. The resid structure was modeled in terms of three attribute groups: cores, inter-core linkages, and side chains. The concentrations of the attributes were constrained by probability density functions (PDFs), which were optimized by minimizing the difference between the properties of the computational representation, obtained by juxtaposing the attributes, and the measured properties, obtained by analytical chemistry. Computational tools were used to build a reaction network constructed from model compounds and their associated kinetics. For cases with an intractable number of species, equations were written in terms of the three attribute groups, and the molecular composition was retained implicitly through the juxtaposition. These modeling methods were applied to the Shengli and Daqing resids. The composition of the simulated molecular feedstock fit well with the analytical chemistry measurements. After simulated pyrolysis, both resids showed representative increases in the weight fractions of lighter hydrocarbons. Relevant end-use properties were predicted for the product mixtures.
A novel light scattering technique for mapping metal surface corrosion is presented, and its results on copper exposed to the atmosphere are reported. The front end of the instrument is a sensor module comprising a thin-beam light-emitting diode (LED) illuminating a small spot on the metal surface, and a matched pair of photodetectors, one capturing the reflected light and the other sampling the scattered light. The analog photocurrent signals are digitized and processed online by a personal computer (PC) to determine the corrosion factor, defined in terms of the two current values. By scanning the sample surface with the light beam and computing the corrosion factor values simultaneously, a three-dimensional graph and a two-dimensional contour map are generated on the PC using Matlab tools. The corrosion factor values measured for different durations of exposure to the atmosphere obey a bilogarithmic law, which testifies to the validity of our mathematical model.
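The abstract defines the corrosion factor only "in terms of the two current values", so the formula below is an assumed stand-in, not the paper's definition: the scattered photocurrent over the total detected photocurrent, chosen so that a rougher, more corroded spot scatters more light and yields a larger factor.

```python
# Assumed corrosion factor: fraction of detected light that is scattered
# rather than specularly reflected. This is an illustrative definition;
# the paper's actual formula is not given in the abstract.
def corrosion_factor(i_reflected, i_scattered):
    return i_scattered / (i_reflected + i_scattered)

# A polished spot reflects most light; a corroded spot scatters more.
print(corrosion_factor(9.0, 1.0))   # 0.1
print(corrosion_factor(4.0, 6.0))   # 0.6
```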
文摘AIM:To investigate the role of 18 F-fluorodeoxyglucose positron emission tomography(FDG-PET) in the diagnosis of small pancreatic cancer. METHODS:This study involved 31 patients with proven invasive ductal cancer of the pancreas.The patients were divided into 3 groups according to the maximum diameter of the tumor:TS1(maximum tumor size≤2.0 cm) ,TS2(>2.0 cm and≤4.0 cm) or TS3-4(>4.0 cm) .The relationships between the TS and various diagnostic tools,including FDG-PET with dual time point evaluation,were analyzed. RESULTS:The tumors ranged from 1.3 to 11.0 cm in diameter.Thirty of the 31 patients(97%) had a positive FDG-PET study.There were 5 patients classified as TS1,15 as TS2 and 11 as TS3-4.The sensitivity of FDG-PET,computed tomography(CT) and magnetic resonanceimaging(MRI) were 100%,40%,0%in TS1,93%,93%,89%in TS2 and 100%,100%,100%in TS3-4. The sensitivity of FDG-PET was significantly higher in comparison to CT and MRI in patients with TS1(P< 0.032) .The mean standardized uptake values(SUVs) did not show a significant difference in relation to the TS(TS1:5.8±4.5,TS2:5.7±2.2,TS3-4:8.2±3.9) ,respectively.All the TS1 tumors(from 13 to 20 mm) showed higher SUVs in FDG-PET with dual time point evaluation in the delayed phase compared with the early phase,which suggested the lesions were malignant. CONCLUSION:These results indicate that FDG-PET with dual time point evaluation is a useful modality for the detection of small pancreatic cancers with a diameter of less than 20 mm.
文摘In the last century, there has been a significant development in the evaluation of methods to predict ground movement due to underground extraction. Some remarkable developments in three-dimensional computational methods have been supported in civil engineering, subsidence engineering and mining engineering practice. However, ground movement problem due to mining extraction sequence is effectively four dimensional (4D). A rational prediction is getting more and more important for long-term underground mining planning. Hence, computer-based analytical methods that realistically simulate spatially distributed time-dependent ground movement process are needed for the reliable long-term underground mining planning to minimize the surface environmental damages. In this research, a new computational system is developed to simulate four-dimensional (4D) ground movement by combining a stochastic medium theory, Knothe time-delay model and geographic information system (GIS) technology. All the calculations are implemented by a computational program, in which the components of GIS are used to fulfill the spatial-temporal analysis model. In this paper a tight coupling strategy based on component object model of GIS technology is used to overcome the problems of complex three-dimensional extraction model and spatial data integration. Moreover, the implementation of computational of the interfaces of the developed tool is described. The GIS based developed tool is validated by two study cases. The developed computational tool and models are achieved within the GIS system so the effective and efficient calculation methodology can be obtained, so the simulation problems of 4D ground movement due to underground mining extraction sequence can be solved by implementation of the developed tool in GIS.
文摘This study looks into new perspectives in preschoolers' assessment of being at risk for learning disabilities. Precisely, two innovative assessment approaches are examined in order to reveal new research perspectives. The first tool, a traditional approach, is the "Early Dyslexia Identification Test" and the second tool, a computerized approach, is an lnternet based Speech Pathology Diagnostic Expert System named "APLo". Both evaluate the sectors of phonological awareness, memory, psychomotor development, pre-writing and pre-reading skills in Greek. The findings o f the current study formulate three directions: (1) the complementary of speech language and learning disorders as a systemic approach, (2) the diagnosis of suspicious factors and compatibilities of learning disabilities even at the preschool age, and (3) the application of alternative methods of assessment aiming for a multidimentional approach with the combined prospect and potential of web tools in the early diagnosis and intervention in learning disabilities.
文摘Software tools are developed for computer realization of syntactic, semantic, and morphological models of natural language texts, using rule based programming. The tools are efficient for a language, which has free order of words and developed morphological structure like Georgian. For instance, a Georgian verb has several thousand verb-forms. It is very difficult to express rules of morphological analysis by finite automaton and it will be inefficient as well. Resolution of some problems of full morphological analysis of Georgian words is impossible by finite automaton. Splitting of some Georgian verb-forms into morphemes requires non-deterministic search algorithm, which needs many backtrackings. To minimize backtrackings, it is necessary to put constraints, which exist among morphemes and verify them as soon as possible to avoid false directions of search. Software tool for syntactic analysis has means to reduce rules, which have the same members in different order. The authors used the tool for semantic analysis as well. Thus, proposed software tools have many means to construct efficient parser, test and correct it. The authors realized morphological and syntactic analysis of Georgian texts by these tools. In the presented paper, the authors describe the software tools and its application for Georgian language.
文摘The present paper proposes the impact of the air temperature on electricity demand as expected. It is clear that the annual maximum load is recorded versus the years starting by the year 2009 up to 2012. At present, the graph fitting technique is applied with some mathematical and computational tools based on the actual values of the years 2009 up to 2012 considering the lower values, the higher values and the average values of the annual maximum loads for Kingdom of Bahrain. For the three scenarios, the models are obtained by curve fitting technique. As well, the model of actual loads is obtained finally which has mostly the closest values obtained.
文摘Derivatives are the foundation of mathematical calculations,however,for some functions, using the rules of finding a derivative may lead to cumbersome steps. Therefore, this paper provides a simple way using transformation thought for the reciprocal function derivative.
Abstract: Brain tumor is a major cause of mortality among both children and adults. This article proposes an improved method based on magnetic resonance imaging (MRI) of the brain and image segmentation. Automated classification is motivated by the need for high accuracy when dealing with a human life. Detection of brain tumors is a challenging problem due to the high diversity in tumor appearance and ambiguous tumor boundaries. MRI images are chosen for the detection of brain tumors as they are well suited to the examination of soft tissues. First, image preprocessing is used to improve image quality. Second, a multi-scale decomposition based on the dual-tree complex wavelet transform is used to analyze image texture. Feature extraction then draws features from the image using the gray-level co-occurrence matrix (GLCM). A neuro-fuzzy technique is used to classify brain tumor stages as benign, malignant, or normal based on the texture features. Finally, the tumor location is detected using Otsu thresholding. The performance of the classifier is evaluated on the basis of classification accuracy. The simulated results show that the proposed classifier provides better accuracy than the previous method.
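The GLCM feature-extraction step can be sketched as follows, in a minimal NumPy version for a single pixel offset on a toy quantized image; the wavelet decomposition and the neuro-fuzzy classifier are not reproduced here:

```python
import numpy as np

def glcm(image, levels, offset=(0, 1)):
    """Gray-level co-occurrence matrix for one pixel offset.

    Counts how often gray level i occurs at `offset` from gray level j.
    """
    dr, dc = offset
    rows, cols = image.shape
    m = np.zeros((levels, levels), dtype=np.int64)
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r, c], image[r2, c2]] += 1
    return m

# Toy 4-level "image"; real MRI slices would be quantized first.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
m = glcm(img, levels=4)
p = m / m.sum()                       # normalize to joint probabilities

# One classic GLCM texture feature: contrast.
contrast = sum((i - j) ** 2 * p[i, j]
               for i in range(4) for j in range(4))
print(m)
print(float(contrast))
```

Features such as contrast, energy and homogeneity computed this way would form the texture vector fed to the classifier.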
Abstract: The lack of a standard for electronic circuit modeling has made possible the development of many tools and modeling languages for electronic circuits. Consequently, several tools are needed at the different description stages of a design. This paper presents a tool called SF^2HDL (Stateflow to Hardware Description Language or State Transition Table) that translates a finite state machine, represented as a state transition diagram described with the Stateflow tool, into a standard input file for the TABELA program or directly into behavioral VHDL (Very High Speed Integrated Circuits Hardware Description Language). The TABELA program was used to optimize this finite state machine. After that, the TAB2VHDL program was used to generate VHDL code at the register transfer level, which permits comparisons with results obtained by synthesis. The finite state machine must be described by a Mealy model, and the user can describe the machine at a high level of abstraction using all the support Simulink provides. The tool proved very efficient in computational cost and translated several cases for the two VHDL description models. Every state machine translated was simulated and implemented on the EP2C20F484C7 device using the Quartus II environment.
Abstract: This paper surveys the state of the art in artificial intelligence techniques in order to investigate the relationships between the cognitive actions addressed in the steps of mathematical modeling and computational semiotics activities. It also briefly reviews the main techniques of artificial intelligence, with particular emphasis on intelligent systems techniques. The analysis uses semiotic concepts to identify new techniques for modeling intelligent systems through the integrated use of mathematical and computational tools. Finally, having established that semiotics can contribute to the study of intelligent systems, a methodology for computational semiotics modeling is proposed, based on the formalization of concepts extracted from the semiotic theory of Charles Sanders Peirce.
Abstract: Transformers utilizing HTS (high-temperature superconductors) are considered a timely invention. The number of power transformers aged more than 30 years and nearing retirement is increasing. If this window of opportunity is not seized, there will be great reluctance to replace recently installed, highly priced capital assets. Major projects developing HTS transformers are making good progress in the United States, Europe, Japan, Korea and China, which indicates the level of interest. These efforts must be appropriately justified through the economic value of the discounted losses. Consequently, it is very important to develop an understanding of the fundamental HTS transformer design issues that can provide guidance for developing practical devices of interest to the electric utility industry. The parameters of an HTS transformer need to be validated before any effort is made to model the behaviour of a distribution network under a range of conditions. The predicted performance and reliability of HTS transformers can then be verified through network modelling and analysis calculations. The ultimate purpose is to furnish electric utilities with precise information as to which HTS transformers work under various applications with greater technical efficiency and proven reliability.
Abstract: A particle swarm optimization (PSO) algorithm is presented for integrated circuit (IC) layout design. Particle swarm optimization, based on swarm intelligence, is a relatively new evolutionary computational tool that has been successfully applied in function optimization, neural network design, classification, pattern recognition, signal processing, robotics and other fields. A modified algorithm is presented and applied to IC layout. For a given layout plane, the algorithm first generates the corresponding grid group from the barriers and the nets' ports, following the concept of gridless net routing, and establishes an initialization fuzzy matrix; it then exploits PSO's global optimization character to find the best layout route, if one exists. The results of model simulation indicate that the PSO algorithm is feasible and efficient in IC layout design.
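The basic global-best PSO iteration can be sketched as below; this is the standard optimizer on a stand-in cost function, not the paper's modified IC-layout variant:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best particle swarm optimization (minimization)."""
    x = rng.uniform(-5, 5, (n_particles, dim))    # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()          # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, float(pbest_val.min())

# The sphere function stands in for a layout cost; minimum at the origin.
best, best_val = pso(lambda p: float((p ** 2).sum()), dim=3)
print(best_val)
```

In the paper's setting the cost function would instead score a candidate route over the fuzzy grid matrix, but the swarm update itself is unchanged.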
Abstract: In most network analysis tools, the computation of the shortest paths between all pairs of nodes is a fundamental step in the discovery of other properties, among them closeness centrality, a measure showing how central a vertex is in a given network. In this paper, the authors present a method to compute the all-pairs shortest paths on graphs that present two characteristics: an abundance of nodes with degree one, and the existence of articulation points along the graph. These characteristics are present in many real-life networks, especially in networks that show a power-law degree distribution, as is the case for biological networks. The method compacts the degree-one nodes onto their source node, then uses the network's articulation points to disconnect the network and compute the shortest paths within the biconnected components. In the final step, the proposed method merges the results to provide the shortest paths of the whole network. The method achieves a remarkable speedup compared with state-of-the-art shortest-path methods: as much as a 7-fold speedup on artificial graphs and a 3.25-fold speedup on real application graphs. Unlike previous research, this performance improvement does not involve elaborate setups, since the algorithm can process significant instances on a popular workstation.
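The degree-one compaction and merge steps can be sketched as follows on a toy unweighted graph; the articulation-point decomposition into biconnected components is omitted for brevity:

```python
from collections import deque

def bfs_dist(adj, src):
    """Single-source shortest paths by BFS (unweighted graph)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Toy graph: core triangle {a, b, c} with degree-one leaves l1-a, l2-b.
adj = {
    "a": ["b", "c", "l1"], "b": ["a", "c", "l2"],
    "c": ["a", "b"], "l1": ["a"], "l2": ["b"],
}

# Compaction step: drop degree-one nodes, remembering their anchors.
anchors = {v: nbrs[0] for v, nbrs in adj.items() if len(nbrs) == 1}
core = {v: [u for u in nbrs if u not in anchors]
        for v, nbrs in adj.items() if v not in anchors}

# All-pairs shortest paths on the (smaller) core only.
core_dist = {v: bfs_dist(core, v) for v in core}

# Merge step: a leaf's distance is its anchor's distance plus one.
def dist(u, v):
    # Assumes u != v; the full method also splits at articulation points.
    du, dv = (1 if u in anchors else 0), (1 if v in anchors else 0)
    cu, cv = anchors.get(u, u), anchors.get(v, v)
    return du + dv + (0 if cu == cv else core_dist[cu][cv])

print(dist("l1", "l2"))   # l1 -> a -> b -> l2
```

The saving comes from running the expensive all-pairs step only on the compacted core, which is where power-law graphs concentrate their structure.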
Abstract: Recognizing the drawbacks of stand-alone computer-aided tools in engineering, several hybrid systems have been suggested, with varying degrees of success. In transforming a design concept into a finished product, in particular, smooth interfacing of the design data is crucial to reduce product cost and time to market. A product model that contains the complete product description, and computer-aided tools that can understand each other, are the primary requirements for achieving this interfacing goal. This article discusses a development methodology for hybrid engineering software systems, with particular focus on the application of soft computing tools such as genetic algorithms and neural networks. Forms of hybridization are discussed, and the applications are elaborated using two case studies. The effort aims to develop hybrid systems that combine the strengths of each tool: for example, the learning, pattern recognition and classification power of neural networks with the powerful capacity of genetic algorithms for global search and optimization. While most optimization tasks need some form of model, there are many processes in the mechanical engineering field that are difficult to model using conventional modeling techniques. The proposed hybrid system solves such difficult-to-model processes and contributes to the effort of smoothly interfacing design data with other downstream processes.
Abstract: The theoretical analysis discussed in this work is a suitable mathematical tool by which the performance of the proposed collector can be predicted. The experimental results coincide with the theoretical data obtained from the devised computer program. A controlled output temperature can be obtained from the proposed system. The performance of the tested collector under the proposed intermittent flow conditions exceeds that of the conventional thermosyphon flow collector.
Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China under Grant No. LR16F020003; the National Natural Science Foundation of China under Grant Nos. 61472111 and 61602138; and the Open Project Program of the State Key Lab of CAD&CG (A1703), Zhejiang University.
Abstract: Generalized B-splines have been employed as geometric modeling and numerical simulation tools for isogeometric analysis (IGA for short). However, the previous models used in IGA, such as trigonometric generalized B-splines or hyperbolic generalized B-splines, are not a unified mathematical representation of conics and polynomial parametric curves/surfaces. In this paper, a unified approach to constructing generalized non-uniform B-splines over the space spanned by {α(t), β(t), ξ(t), η(t), 1, t, ..., t^(n-4)} is proposed, and the corresponding isogeometric analysis framework for PDE solving is also studied. Compared with the NURBS-IGA method, the proposed frameworks have several advantages, such as high accuracy and easy-to-compute derivatives and integrals, due to their non-rational form. Furthermore, with the proposed spline models, isogeometric analysis can be performed on computational domains bounded by transcendental curves/surfaces, such as the involute of a circle, the helix/helicoid, the catenary/catenoid and the cycloid. Several numerical examples for isogeometric heat conduction problems are presented to show the effectiveness of the proposed methods.
Abstract: Metallic implants are commonly used in various orthopaedic surgeries, such as fracture fixation, spinal instrumentation, joint replacement and bone tumour surgery. Patients may need to adapt to the fixed dimensions of standard implants, which may result in a suboptimal fit to the host bones and possibly adverse clinical results. Standard traditional implants may not address reconstructive challenges such as severe bone deformity or bone loss after implant loosening and bone tumour resection. With the advent of digital technologies in medical imaging, computer programming in three-dimensional (3D) modelling, and computer-assisted tools for the precise placement of implants, patient-specific implants (PSI) have gained more attention in complex orthopaedic reconstruction. Additive manufacturing technology, in contrast to conventional subtractive manufacturing, is a flexible process that can fabricate anatomically conforming implants that match the patient's anatomy and surgical requirements. Complex internal structures with porous scaffolds can also be built to enhance osseointegration for better implant longevity. Although basic studies have suggested that additively manufactured (AM) metal structures are good engineered biomaterials for bone replacement, not much peer-reviewed literature is available on the clinical results of the new types of implants. This article gives an overview of the metallic materials commonly used for fabricating orthopaedic implants; describes metal-based additive manufacturing technology and the processing chain for metallic implants; discusses the features of AM implants; reports the current status of orthopaedic surgical applications; and comments on the challenges of AM implants in orthopaedic practice.
Abstract: Compressed sensing is a new signal acquisition method that acquires a signal in compressed form and then recovers it by the use of computational tools and techniques. This means fewer measurements of the signal are needed, saving a large amount of time and storage space. In this paper, we consider the compressed sensing of sparse integer-valued signals (referred to as "q-states signals" throughout the paper). In order to accelerate reconstruction, we adopt sparse rather than dense measurement matrices. Using methods and tools developed in statistical physics, we locate the reconstruction limit for the L0-reconstruction method and propose a belief-propagation-based algorithm that can deal with instances of large size; its typical reconstruction performance is also analyzed.
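A minimal sketch of sparse recovery from few measurements, using orthogonal matching pursuit as a stand-in for the paper's belief-propagation decoder, and a dense Gaussian matrix rather than the paper's sparse one, for simplicity:

```python
import numpy as np

rng = np.random.default_rng(1)

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy recovery of a k-sparse signal.

    Repeatedly picks the column most correlated with the residual, then
    re-solves least squares on the selected support.
    """
    n = A.shape[1]
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ sol
    x = np.zeros(n)
    x[support] = sol
    return x

n, m, k = 64, 32, 4                        # signal length, measurements, sparsity
x_true = np.zeros(n)                       # k-sparse integer-valued signal
x_true[rng.choice(n, k, replace=False)] = rng.choice([-2, -1, 1, 2], k)
A = rng.normal(size=(m, n)) / np.sqrt(m)   # measurement matrix
y = A @ x_true                             # m << n compressed measurements
x_hat = omp(A, y, k)
print(np.allclose(x_hat, x_true, atol=1e-8))
```

The point of the demonstration is that m = 32 measurements suffice to recover a 4-sparse signal of length 64 exactly; the paper's belief-propagation algorithm targets the same regime with sparse measurement matrices at much larger scale.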
Abstract: A molecular-level kinetics model has been developed for the pyrolysis of heavy residual oil. The resid structure was modeled in terms of three attribute groups: cores, inter-core linkages, and side chains. The concentrations of the attributes were constrained by probability density functions (PDFs) that were optimized by minimizing the difference between the properties of the computational representation, obtained by juxtaposing the attributes, and the measured properties obtained from analytical chemistry. Computational tools were used to build a reaction network constructed from model compounds and their associated kinetics. For cases with an intractable number of species, equations were written in terms of the three attribute groups, and the molecular composition was retained implicitly through the juxtaposition. These modeling methods were applied to the Shengli and Daqing resids. The composition of the simulated molecular feedstock fit well with analytical chemistry measurements. After simulated pyrolysis, both resids showed representative increases in the weight fractions of lighter hydrocarbons. Relevant end-use properties were predicted for the product mixtures.
Abstract: A novel light scattering technique for mapping metal surface corrosion is presented, and its results on copper exposed to the atmosphere are reported. The front end of the instrument is a sensor module comprising a thin-beam light-emitting diode (LED) illuminating a small spot on the metal surface, and a matched pair of photodetectors, one capturing the reflected light and the other sampling the scattered light. The analog photocurrent signals are digitized and processed online by a personal computer (PC) to determine the corrosion factor, defined in terms of the two current values. By scanning the sample surface with the light beam and computing the corrosion factor values simultaneously, a three-dimensional graph and a two-dimensional contour map are generated on the PC using Matlab tools. The values of the corrosion factor measured for different durations of atmospheric exposure, which obey a bilogarithmic law, testify to the validity of the mathematical model.