Funding: Research in the Lemmon/Bixby lab is supported by NIH grants NS080145 and NS059866, and by the Miami Project to Cure Paralysis.
Abstract: Progress in developing robust therapies for spinal cord injury (SCI), traumatic brain injury (TBI), and peripheral nerve injury has been slow. A great deal has been learned over the past 30 years regarding both the intrinsic factors and the environmental factors that regulate axon growth, but this large body of information has not yet resulted in clinically available therapeutics. This therapeutic bottleneck has many root causes, but a consensus is emerging that one contributing factor is a lack of standards for experimental design and reporting. The absence of reporting standards, and even of commonly accepted definitions of key words, also makes data mining and bioinformatics analysis of neural plasticity and regeneration difficult, if not impossible. This short review will consider relevant background and potential solutions to this problem in the axon regeneration domain.
Abstract: Commentary. Most would agree that providing comprehensive detail in scientific reporting is critical for the development of meaningful therapies and treatments for diseases. Such stellar practices 1) allow for reproduction of experiments to confirm results, 2) promote thorough analyses of data, and 3) foster the incremental advancement of valid approaches. Unfortunately, most would also agree that we have far to go to reach this vital goal (Hackam and Redelmeier, 2006; Prinz et al., 2011; Baker et al., 2014).
Funding: Project 61374140, supported by the National Natural Science Foundation of China.
Abstract: Real industrial processes often run in multiple operating modes, so the collected data follow a complex multimodal distribution. Most traditional process monitoring methods are therefore no longer applicable, because they presume that the sampled data obey a single Gaussian or non-Gaussian distribution. To solve this problem, a novel weighted local standardization (WLS) strategy is proposed to standardize multimodal data; it eliminates the multimode characteristics of the collected data and normalizes them into a unimodal distribution. After a detailed analysis of the proposed preprocessing strategy, a new algorithm combining WLS with support vector data description (SVDD) is put forward for multimode process monitoring. Unlike strategies that build multiple local models, the developed method uses only a single model and requires no prior knowledge of the multimode process. To demonstrate the proposed method's validity, it is applied to a numerical example and to the Tennessee Eastman (TE) process. The simulation results show that the WLS strategy is very effective at standardizing multimodal data, and that the WLS-SVDD monitoring method has great advantages over traditional SVDD and over PCA combined with a local standardization strategy (LNS-PCA) in multimode process monitoring.
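Below is a minimal Python sketch of the idea, under stated assumptions: the neighborhood size, the inverse-distance weighting, and the use of scikit-learn's OneClassSVM as a stand-in for SVDD (scikit-learn ships no SVDD class; with a stationary kernel such as the RBF, the two models are equivalent) are all choices of this illustration, not details taken from the paper.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import OneClassSVM

def wls_transform(X_train, X, k=20, eps=1e-8):
    """Standardize each sample by a distance-weighted local mean and std."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_train)
    dist, idx = nn.kneighbors(X)
    dist, idx = dist[:, 1:], idx[:, 1:]   # drop nearest point (self, for training samples)
    w = 1.0 / (dist + eps)                # closer neighbors weigh more
    w /= w.sum(axis=1, keepdims=True)
    mu = np.einsum('ij,ijk->ik', w, X_train[idx])                       # weighted local mean
    var = np.einsum('ij,ijk->ik', w, (X_train[idx] - mu[:, None, :]) ** 2)
    return (X - mu) / np.sqrt(var + eps)  # multimodal data -> one unimodal cloud

# Two synthetic operating modes; after WLS both collapse onto one distribution.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 1, (200, 4)), rng.normal(8, 2, (200, 4))])
model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(wls_transform(X_train, X_train))
X_new = rng.normal(8, 2, (5, 4))          # in-control samples from the second mode
print(model.predict(wls_transform(X_train, X_new)))   # +1 = normal, -1 = alarm
```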
Abstract: 1.1. Development of international data exchange standards in the securities field. The securities market involves a large number of participants, such as investors, securities companies, exchanges, and clearing corporations. Business among the participants is completed via data exchange. Therefore, data exchange protocols are an important factor in determining and promoting the safe and rapid development of the securities market.
Abstract: The general use of aluminium as an indentation standard for the iteration of contact heights in the determination of ISO 14577 hardness and elastic modulus is challenged because of hitherto unappreciated phase changes in the physical force-depth standard curve, which had seemed secure on the basis of claims from 1992. Physical and mathematical analyses with closed formulas avoid the still worldwide-standardized violation of the energy law: reserving none of the loading force, and thus energy, for events that do not produce depth (33.33% according to the h^2 belief, 20% according to the h^(3/2) physical law), and instead using 100% of it for depth formation, is a severe violation, because the non-depth-producing part of the indentation work cannot be done with zero energy. Both twinning and structural phase-transition onsets, as well as normalized phase-transition energies, are now calculated without iterations, from physically correct closed arithmetic equations. These are reported for Berkovich and cube-corner indentations, including their comparison on geometric grounds, and an indentation standard without mechanical twinning is proposed. Characteristic data are reported. This is the first detection of indentation twinning of aluminium at room temperature, and the mechanical twinning of fused quartz is also new; their disqualification as indentation standards is established. The higher-load phase transitions, found here again, likewise disqualify aluminium and fused quartz as ISO-ASTM 14577 (International Organization for Standardization and American Society for Testing and Materials) standards for the contact-depth ("hc") iterations. The incorrect and still worldwide-used black-box values for H and Er (the latter still falsely called "Young's moduli" even though they are not directional) are therefore invalid, as are all mechanical properties that depend on them, and they bear no relation to bulk moduli from compression experiments. Experimentally obtained and published force-depth parabolas always follow the linear equation F_N = k·h^(3/2) + F_a, where F_a is the axis cut before and after the phase-transition branches (never "h^2", as falsely enforced and used for H and Er, which yields incorrectly calculated parameters). The regression slopes k are the precise physical hardness values, which for the first time allow precise calculation of mechanical qualities by indentation in relation to the geometry of the indenter tip. Exactly 20% of the applied force, and thus energy, is not available for the indentation depth. Only these scientific k-values should be used for AI advice, in place of falsely iterated indentation hardness H-values. Incorrect ISO-ASTM H-values, and equally the iterated ISO-ASTM Er modulus values, of technical materials fed into artificial intelligence would be a disaster for daily safety; the AI must be told that these values are unscientific and must be replaced by physical data. Iterated data (with 3 and 8 free parameters!) cannot be transformed into physical data. One has to start from real experimental loading curves and an absolute Zerodur® standard, calibrated with standard force and standard length, to create absolute indentation results.
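The fitting step this abstract insists on is straightforward to reproduce. The sketch below regresses normal force against h^(3/2) to recover the slope k (the physical hardness) and the axis cut F_a; the numbers and units are synthetic, chosen only to illustrate the linearization, and a single branch without a phase transition is assumed.

```python
import numpy as np

# Synthetic loading curve obeying F_N = k*h**1.5 + F_a (single branch, no
# phase transition); k_true and Fa_true are invented values, not measurements.
h = np.linspace(50, 500, 20)                 # depth in nm (assumed units)
k_true, Fa_true = 0.004, 0.15                # mN/nm^1.5 and mN
rng = np.random.default_rng(1)
F = k_true * h**1.5 + Fa_true + rng.normal(0, 0.02, h.size)

x = h**1.5                                   # linearizing transform
k_fit, Fa_fit = np.polyfit(x, F, 1)          # slope = k, intercept = F_a
print(f"k = {k_fit:.5f} mN/nm^1.5, F_a = {Fa_fit:.3f} mN")
```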
Abstract: In Brazil and various regions globally, the initiation of landslides is frequently associated with rainfall; yet the spatial arrangement of geological structures and stratification considerably influences landslide occurrences. The multifaceted nature of these influences makes the surveillance of mass movements a highly intricate task, requiring an understanding of numerous interdependent variables. Recent years have seen an emergence of scholarly research aimed at integrating geophysical and geotechnical methodologies, and the joint examination of geophysical and geotechnical data offers an enhanced perspective on subsurface structures. Within this work, a methodology is proposed for the synchronous analysis of electrical resistivity data and geotechnical data, specifically those extracted from the Light Dynamic Penetrometer (DPL) and the Standard Penetration Test (SPT). The study involved a linear fitting process to correlate resistivity with N10/SPT N-values from DPL/SPT soundings, culminating in a 2D profile of N10/SPT N-values predicated on electrical profiles. The findings furnish invaluable insights into slope stability by allowing a two-dimensional representation of penetration-resistance properties. Through the synthesis of geophysical and geotechnical data, this project aims to augment the comprehension of subsurface conditions, with potential implications for refining landslide risk evaluations, and offers insight into the formulation of more effective and precise slope management protocols and disaster prevention strategies.
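As a hedged illustration of the correlation step described here: the sketch below fits a linear relation between co-located resistivity and N10 values, then maps a 2D resistivity section to a pseudo-section of penetration resistance. All numbers, units, and the fitted coefficients are invented for the example; the real coefficients come from site data.

```python
import numpy as np

# Co-located calibration pairs (invented numbers): resistivity in ohm·m and
# DPL N10 blow counts at the same depths.
rho = np.array([120.0, 180.0, 260.0, 340.0, 410.0])
n10 = np.array([4.0, 7.0, 11.0, 15.0, 18.0])
a, b = np.polyfit(rho, n10, 1)               # linear fit: N10 ≈ a*rho + b

# Apply the fit to a 2D electrical resistivity section (synthetic here) to
# obtain a pseudo-section of penetration resistance.
rho_section = np.random.default_rng(2).uniform(100, 450, size=(30, 80))
n10_section = a * rho_section + b
print(f"N10 ≈ {a:.4f}*rho + {b:.2f}; predicted section shape: {n10_section.shape}")
```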
Funding: Supported by the National Key Basic Research Development Plan (2010CB428602).
Abstract: Using data from 169 sounding stations around the world, NCEP/NCAR reanalysis data were evaluated, and the distribution characteristics of the standard errors of the geopotential height, temperature, and wind speed fields from the upper troposphere to the lower stratosphere (mostly over land) were analyzed. The results showed that the standard error distribution of the reanalysis wind speed field was mainly affected by the jet stream zone, where obvious differences from the actual wind field existed. The distribution of standard error in the wind speed field showed clear seasonal differences between winter and summer, and the average deviation was larger near the coastline. The high-value zones of the standard errors of the reanalysis geopotential height and temperature fields were concentrated mainly in the low-latitude region of the Eastern Hemisphere (the Indian Ocean coast). The distribution of standard error was basically consistent with that of average error, so the standard error could be explained well by the average error. The standard errors of the reanalysis temperature and geopotential height data were lower in inland zones; the high-value zones were distributed mainly along the coastline, and the average error of the wind speed field was larger near the coastline. This is closely related to the quality of the sounding-station data, to regional differences, and to the fact that land observation stations are dense while ocean observation stations are few.
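For concreteness, the two verification statistics this abstract contrasts can be computed as below. The interpretation of "standard error" as the standard deviation of the reanalysis-minus-observation differences, and the placeholder numbers, are assumptions of this sketch.

```python
import numpy as np

# Matched reanalysis values and sounding observations for one variable at one
# level (placeholder numbers; geopotential height in gpm).
reanalysis = np.array([5620.0, 5588.0, 5710.0, 5665.0, 5602.0])
sounding = np.array([5605.0, 5597.0, 5686.0, 5671.0, 5611.0])

diff = reanalysis - sounding
avg_error = diff.mean()          # average error (bias)
std_error = diff.std(ddof=1)     # spread of the differences about the bias
print(f"average error = {avg_error:+.1f} gpm, standard error = {std_error:.1f} gpm")
```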
Funding: National Natural Science Foundation of China (No. 61074079); Shanghai Leading Academic Discipline Project, China (No. B504).
Abstract: Complex industrial processes often contain multiple operating modes, and the challenge of multimode process monitoring has recently gained much attention. However, most multivariate statistical process monitoring (MSPM) methods are based on the assumption that the process has only one nominal mode; when the process data contain different distributions, these methods may not function as well as in single-mode processes. To address this issue, an improved partial least squares (IPLS) method was proposed for multimode process monitoring. By utilizing a novel local standardization strategy, the normal data in multiple modes could be centralized after being standardized, so that the fundamental assumption of partial least squares (PLS) became valid again in multimode processes. In this way, the PLS method was extended to suit not only single-mode but also multimode processes. The efficiency of the proposed method was illustrated by comparing the monitoring results of PLS and IPLS on the Tennessee Eastman (TE) process.
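A minimal sketch of the local standardization idea follows, assuming the operating-mode labels are known (the paper's mode-identification details are not reproduced here): z-score each mode with its own mean and standard deviation so that the pooled data satisfy the PLS assumption again, then fit one global model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
# Two operating modes with different locations/scales but the same underlying
# X -> Y relation (all numbers synthetic).
X1 = rng.normal(0, 1, (150, 6))
X2 = rng.normal(10, 3, (150, 6))
Y1 = X1[:, :2] + 0.1 * rng.normal(size=(150, 2))
Y2 = X2[:, :2] + 0.1 * rng.normal(size=(150, 2))

def local_standardize(blocks):
    """z-score each mode's block with its own statistics, then pool the modes."""
    return np.vstack([(B - B.mean(axis=0)) / B.std(axis=0) for B in blocks])

X, Y = local_standardize([X1, X2]), local_standardize([Y1, Y2])
pls = PLSRegression(n_components=3).fit(X, Y)    # one global model, valid again
print(f"R^2 on pooled, locally standardized data: {pls.score(X, Y):.3f}")
```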
Funding: Supported by the National High Technology Research and Development Program of China (863 Program) (No. AA420060).
Abstract: In network-supported collaborative design, data processing plays a vital role. Much effort has been spent in this area, and many kinds of approaches have been proposed. Building on the relevant literature, this paper presents an extensible markup language (XML) based strategy for several important data-processing problems in network-supported collaborative design, such as representing the standard for the exchange of product model data (STEP) with XML for product information expression, and managing XML documents using a relational database. The paper gives a detailed exposition of how to define the mapping between the XML structure and the relational database structure, and of how XML-QL queries can be translated into structured query language (SQL) queries. Finally, the structure of an XML-based data processing system is presented.
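A toy illustration of such a mapping, with an invented schema: a small product-data document is shredded into a relational table, and a path-style query of the kind XML-QL expresses is answered in SQL. This is a sketch of the general shredding approach, not the paper's actual STEP mapping.

```python
import sqlite3
import xml.etree.ElementTree as ET

XML_DOC = """<products>
  <part id="P-100"><name>bracket</name><material>steel</material></part>
  <part id="P-200"><name>shaft</name><material>aluminium</material></part>
</products>"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE part (id TEXT PRIMARY KEY, name TEXT, material TEXT)")
for part in ET.fromstring(XML_DOC):              # shred: one row per <part>
    conn.execute("INSERT INTO part VALUES (?, ?, ?)",
                 (part.get("id"), part.findtext("name"), part.findtext("material")))

# A path-style query over the document, translated into SQL on the shredded table:
#   /products/part[material = 'steel']/name
for (name,) in conn.execute("SELECT name FROM part WHERE material = 'steel'"):
    print(name)                                  # -> bracket
```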
Abstract: To make inorganic structure data more useful for further studies, a five-point list of simple procedures to be followed by authors of crystal structure papers is proposed. 1. A crystal structure should be described with the space group corresponding to its true symmetry. 2. A new structure proposal should be tested as to whether it is realistic in principle. 3. A structure should be described with a space group in a setting given in the International Tables. 4. For comparison with other structures, the structure data should be standardized with the program STRUCTURE TIDY. 5. "New" structure data should be checked in the databases, Chemical Abstracts, or online internet resources to establish whether they are really new. The list is supplemented with many explanations, commentaries, examples and references.
Funding: The US National Science Foundation, for its long-time support of the development of the IGSN (Grant Nos. NSF-0445178, NSF-0514551, NSF-0552123), the EarthChem system (Grant No. NSF-0522195), and the operation of both systems as part of the IEDA Data Facility (Grant Nos. NSF-0950477, NSF-1636653); and the Alfred P. Sloan Foundation, for a grant to Columbia University to support the development of a global, scalable, and sustainable technical and organizational infrastructure for persistent unique identifiers of physical scientific samples.
Abstract: 1 Introduction. The primary goal of the Deep-time Digital Earth project is to develop an open collaboration and data sharing platform that enables the transition of deep-time geoscientific research to a Big Data driven paradigm. Such an open platform will require the ability to effectively and efficiently access and integrate a wide variety of digital Earth data.
Abstract: Data migration is a multi-step process that begins with analyzing old data and culminates in uploading and reconciling data in new applications. With the rapid growth of data, organizations constantly need to migrate data. Data migration can be a complex process, as testing must be done to ensure data quality, and it can also be very costly if best practices are not followed and hidden costs are not identified at an early stage. On the other hand, many organizations today, instead of buying IT equipment (hardware and/or software) and managing it themselves, prefer to buy services from IT service providers. The number of service providers is increasing dramatically, and the cloud is becoming the preferred tool for cloud storage services. However, as more information and personal data are transferred to the cloud (to social media sites, DropBox, Baidu WangPan, etc.), data security and privacy are called into question, so academia and industry strive to find effective ways to secure data migration in the cloud; various methods and encryption techniques have been implemented. This work covers many important aspects of data migration, including strategy, challenges, needs, methodology, categories, risks, and uses with cloud computing. Finally, it discusses the data migration security and privacy challenge and how to address it through a proposed model that enhances data security and privacy by combining Advanced Encryption Standard-256 (AES-256), data dispersion algorithms, and Secure Hash Algorithm-512 (SHA-512). The model achieves verifiable security ratings and fast execution times.
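To make the three ingredients concrete, here is a hedged Python sketch assuming the widely used `cryptography` package: AES-256-GCM for confidentiality, plain chunking as a stand-in for the paper's data dispersion algorithm (which may well be an erasure code), and SHA-512 for integrity verification after reassembly. The composition is illustrative, not the paper's exact protocol.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def disperse_for_migration(data: bytes, n_shares: int = 3):
    """Encrypt with AES-256-GCM, hash with SHA-512, split into shares."""
    key, nonce = AESGCM.generate_key(bit_length=256), os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    digest = hashlib.sha512(ciphertext).hexdigest()      # integrity reference
    step = -(-len(ciphertext) // n_shares)               # ceiling division
    shares = [ciphertext[i:i + step] for i in range(0, len(ciphertext), step)]
    return key, nonce, digest, shares                    # shares go to separate providers

def reassemble(key, nonce, digest, shares):
    """Rejoin the shares, verify integrity, then decrypt."""
    ciphertext = b"".join(shares)
    if hashlib.sha512(ciphertext).hexdigest() != digest:
        raise ValueError("ciphertext altered in transit")
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key, nonce, digest, shares = disperse_for_migration(b"customer table dump ...")
print(reassemble(key, nonce, digest, shares))            # b'customer table dump ...'
```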