Journal Articles
718,198 articles found
Research on the Application of Meteorological Observation Fields in Automotive Proving Grounds
1
Author: Chen Haijian. 时代汽车 (Auto Time), 2024, No. 14, pp. 172-174, 178 (4 pages)
Automotive proving grounds are important venues for vehicle road testing, used to verify the quality and reliability of automotive products. Beyond the test roads themselves, meteorological conditions are a key element of road testing, and GB/T 12534-1990, General Rules for Automobile Road Test Methods, states explicit requirements: tests should be conducted in rain- and fog-free weather, with relative humidity below 95%, air temperature between 0 and 40 °C, and wind speed no greater than 3 m/s. Meteorological conditions also serve as an important basis for managing the proving-ground roads: real-time wind speed, rainfall, and visibility data give site managers the necessary reference for issuing speed-limit, road-restriction, and site-closure notices, directly affecting how promptly road-test safety is managed. This article therefore studies the application of meteorological observation fields in automotive proving grounds, covering their construction, meteorological services, and road management in abnormal weather.
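The quoted GB/T 12534-1990 thresholds reduce to a simple go/no-go check. A minimal sketch (the function name and argument conventions are ours, not from the standard):

```python
def road_test_weather_ok(rain: bool, fog: bool, rh_pct: float,
                         temp_c: float, wind_ms: float) -> bool:
    """Go/no-go check using the thresholds quoted from GB/T 12534-1990:
    no rain or fog, relative humidity < 95%, air temperature 0-40 deg C,
    wind speed <= 3 m/s."""
    return (not rain and not fog
            and rh_pct < 95
            and 0 <= temp_c <= 40
            and wind_ms <= 3)
```

A site-management system could evaluate this against live sensor feeds before lifting a closure notice.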
Contribution of the MERISE-Type Conceptual Data Model to the Construction of Monitoring and Evaluation Indicators of the Effectiveness of Training in Relation to the Needs of the Labor Market in the Republic of Congo
2
Authors: Roch Corneille Ngoubou, Basile Guy Richard Bossoto, Régis Babindamana. Open Journal of Applied Sciences, 2024, No. 8, pp. 2187-2200 (14 pages)
This study proposes the use of the MERISE conceptual data model to create indicators for monitoring and evaluating the effectiveness of vocational training in the Republic of Congo. The importance of MERISE for structuring and analyzing data is underlined, as it enables the measurement of the adequacy between training and the needs of the labor market. The innovation of the study lies in the adaptation of the MERISE model to the local context, the development of innovative indicators, and the integration of a participatory approach including all relevant stakeholders. Contextual adaptation and local innovation: the study suggests adapting MERISE to the specific context of the Republic of Congo, considering the local particularities of the labor market. Development of innovative indicators and new measurement tools: it proposes creating indicators to assess skills matching and employer satisfaction, which are crucial for evaluating the effectiveness of vocational training. Participatory approach and inclusion of stakeholders: the study emphasizes actively involving training centers, employers, and recruitment agencies in the evaluation process; this participatory approach ensures that the perspectives of all stakeholders are considered, leading to more relevant and practical outcomes. Using the MERISE model allows for:
• Rigorous data structuring, organization, and standardization: clearly defining entities and relationships facilitates data organization and standardization, crucial for effective data analysis.
• Facilitation of monitoring, analysis, and relevant indicators: developing both quantitative and qualitative indicators helps measure the effectiveness of training in relation to the labor market, allowing for a comprehensive evaluation.
• Improved communication and common language: by providing a common language for different stakeholders, MERISE enhances communication and collaboration, ensuring that all parties have a shared understanding.
The study’s approach and contribution to existing research lie in:
• Structured theoretical and practical framework and holistic approach: the study offers a structured framework for data collection and analysis, covering both quantitative and qualitative aspects, thus providing a comprehensive view of the training system.
• Reproducible methodology and international comparison: the proposed methodology can be replicated in other contexts, facilitating international comparison and the adoption of best practices.
• Extension of knowledge and new perspective: by integrating a participatory approach and developing indicators adapted to local needs, the study extends existing research and offers new perspectives on vocational training evaluation.
Keywords: MERISE; Conceptual Data Model (MCD); Monitoring Indicators; Evaluation of Training Effectiveness; Training-Employment Adequacy; Labor Market; Information Systems Analysis; Adjustment of Training Programs; Employability; Professional Skills
Cyber Resilience through Real-Time Threat Analysis in Information Security
3
Authors: Aparna Gadhi, Ragha Madhavi Gondu, Hitendra Chaudhary, Olatunde Abiona. International Journal of Communications, Network and System Sciences, 2024, No. 4, pp. 51-67 (17 pages)
This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration to the human element and how people fit into the security process. On the other hand, cyber security adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect presents moral questions, since it is becoming more widely accepted that society has a duty to protect weaker members of society, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, a lot of these sources frequently neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cybersecurity and information security on a larger scale. The study also highlights other areas of cybersecurity, which include defending people, social norms, and vital infrastructure from threats that arise online, in addition to information and technology protection. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security is a paradigm shift in this regard [1].
Keywords: Cybersecurity; Information Security; Network Security; Cyber Resilience; Real-Time Threat Analysis; Cyber Threats; Cyberattacks; Threat Intelligence; Machine Learning; Artificial Intelligence; Threat Detection; Threat Mitigation; Risk Assessment; Vulnerability Management; Incident Response; Security Orchestration; Automation; Threat Landscape; Cyber-Physical Systems; Critical Infrastructure; Data Protection; Privacy; Compliance; Regulations; Policy; Ethics; Cybercrime; Threat Actors; Threat Modeling; Security Architecture
Planning and construction of Xiong’an New Area (city of over 5 million people): Contributions of China’s geologists and urban geology
4
Authors: Bo Han, Zhen Ma, Liang-jun Lin, Hong-wei Liu, Yi-hang Gao, Yu-bo Xia, Hai-tao Li, Xu Guo, Feng Ma, Yu-shan Wang, Ya-long Zhou, Hong-qiang Li. China Geology (CAS, CSCD), 2024, No. 3, pp. 382-408 (27 pages)
China established Xiong’an New Area in Hebei Province in 2017, which is planned to accommodate about 5 million people, aiming to relieve Beijing City of the functions non-essential to its role as China’s capital and to expedite the coordinated development of the Beijing-Tianjin-Hebei region. From 2017 to 2021, the China Geological Survey (CGS) took the lead in multi-factor urban geological surveys involving space, resources, environments, and disasters according to the general requirements of “global vision, international standards, distinctive Chinese features, and future-oriented goals” in Xiong’an New Area, identifying the engineering geologic conditions and geologic environmental challenges of this area. The achievements also include a 3D engineering geological structure model for the whole area, along with “one city proper and five clusters”; insights into the ecology and the background endowment of natural resources like land, geothermal resources, groundwater, and wetland of the area before engineering construction; a comprehensive monitoring network of resources and environments in the area; and the “Transparent Xiong’an” geological information platform that is open, shared, dynamically updated, and three-dimensionally visualized. China’s geologists and urban geology have played a significant role in the urban planning and construction of Xiong’an New Area, providing whole-process geological solutions for urban planning, construction, operation and management. The future urban construction of Xiong’an New Area will necessitate the theoretical and technical support of earth system science (ESS) from various aspects, with the purpose of enhancing the resilience of the new type of city and supporting the green, low-carbon, and sustainable development of this area.
Keywords: Low-Carbon New City; Planning and Construction; Land; Geothermal Resources; Groundwater; Wetland; Underground Space; Geologic Disasters; Site Stability; Natural Resources; Ecosystem; Geological Safety; Transparent Xiong’an; Resilient City; Xiong’an New Area
Quantitative Comparative Study of the Performance of Lossless Compression Methods Based on a Text Data Model
5
Authors: Namogo Silué, Sié Ouattara, Mouhamadou Dosso, Alain Clément. Open Journal of Applied Sciences, 2024, No. 7, pp. 1944-1962 (19 pages)
Data compression plays a key role in optimizing the use of memory storage space and also reducing latency in data transmission. In this paper, we are interested in lossless compression techniques because their performance is exploited with lossy compression techniques for images and videos, generally using a mixed approach. To achieve our intended objective, which is to study the performance of lossless compression methods, we first carried out a literature review, a summary of which enabled us to select the most relevant, namely the following: arithmetic coding, LZW, Tunstall’s algorithm, RLE, BWT, Huffman coding and Shannon-Fano. Secondly, we designed a purposive text dataset with a repeating pattern in order to test the behavior and effectiveness of the selected compression techniques. Thirdly, we designed the compression algorithms and developed the programs (scripts) in Matlab in order to test their performance. Finally, following the tests conducted on relevant data that we constructed according to a deliberate model, the results show that these methods, presented in order of performance, are very satisfactory: LZW, arithmetic coding, the Tunstall algorithm, and BWT + RLE. Likewise, it appears that, on the one hand, the performance of certain techniques relative to others is strongly linked to the sequencing and/or recurrence of symbols that make up the message, and on the other hand, to the cumulative time of encoding and decoding.
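As a concrete illustration of the top-ranked method, here is a minimal LZW encoder in Python (the paper's own scripts were written in Matlab; this sketch only mirrors the technique, not the authors' code):

```python
def lzw_compress(text: str) -> list[int]:
    """LZW encoding: grow a phrase dictionary on the fly and emit one
    integer code per longest phrase already in the dictionary."""
    # Initialize the dictionary with all single-character strings.
    table = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in text:
        wc = w + c
        if wc in table:
            w = wc                     # extend the current phrase
        else:
            out.append(table[w])       # emit code for the known prefix
            table[wc] = len(table)     # register the new phrase
            w = c
    if w:
        out.append(table[w])           # flush the final phrase
    return out
```

On the classic repetitive input `"TOBEORNOTTOBEORTOBEORNOT"` this emits 16 codes for 24 input characters, and the gap widens quickly on longer repetitive inputs, which matches the paper's observation that performance is tied to the recurrence of symbols in the message.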
Keywords: Arithmetic Coding; BWT; Compression Ratio; Comparative Study; Compression Techniques; Shannon-Fano; Huffman; Lossless Compression; LZW; Performance; Redundancy; RLE; Text Data; Tunstall
Cyberattack Ramifications, The Hidden Cost of a Security Breach
6
Author: Meysam Tahmasebi. Journal of Information Security, 2024, No. 2, pp. 87-105 (19 pages)
In this in-depth exploration, I delve into the complex implications and costs of cybersecurity breaches. Venturing beyond just the immediate repercussions, the research unearths both the overt and concealed long-term consequences that businesses encounter. This study integrates findings from various research, including quantitative reports, drawing upon real-world incidents faced by both small and large enterprises. This investigation emphasizes the profound intangible costs, such as trade name devaluation and potential damage to brand reputation, which can persist long after the breach. By collating insights from industry experts and a myriad of research, the study provides a comprehensive perspective on the profound, multi-dimensional impacts of cybersecurity incidents. The overarching aim is to underscore the often-underestimated scope and depth of these breaches, emphasizing the entire timeline post-incident and the urgent need for fortified preventative and reactive measures in the digital domain.
Keywords: Artificial Intelligence (AI); Business Continuity; Case Studies; Copyright; Cost-Benefit Analysis; Credit Rating; Cyberwarfare; Cybersecurity Breaches; Data Breaches; Denial of Service (DoS); Devaluation of Trade Name; Disaster Recovery; Distributed Denial of Service (DDoS); Identity Theft; Increased Cost to Raise Debt; Insurance Premium; Intellectual Property; Operational Disruption; Patent; Post-Breach Customer Protection; Recovery Point Objective (RPO); Recovery Time Objective (RTO); Regulatory Compliance; Risk Assessment; Service Level Agreement; Stuxnet; Trade Secret
Intelligent System for Parallel Fault-Tolerant Diagnostic Tests Construction
7
Authors: Anna Yankovskaya, Sergei Kitler. Journal of Software Engineering and Applications, 2013, No. 4, pp. 54-61 (8 pages)
This investigation deals with an intelligent system for parallel fault-tolerant diagnostic test construction. A modified parallel algorithm for fault-tolerant diagnostic test construction is proposed. The algorithm makes it possible to optimize the processing time of test construction. A matrix model of data and knowledge representation, as well as various kinds of regularities in data and knowledge, are presented. An applied intelligent system for diagnostics of the mental health of the population, developed using the intelligent system for parallel fault-tolerant diagnostic test construction, is suggested.
Keywords: Intelligent System; Test Methods of Pattern Recognition; Matrix Model of Knowledge and Data Representation; Revealing of Various Kinds of Regularities; Fault-Tolerant Diagnostic Tests; Parallel Algorithm; Irredundant H-Fold Column Coverings of a Boolean Matrix
Identification of Categorical Registration Data of Domain Names in Data Warehouse Construction Task
8
Authors: Rasim Alguliev, Rena Gasimova. Intelligent Control and Automation, 2013, No. 2, pp. 227-234 (8 pages)
This work is dedicated to the formation of a data warehouse for processing a large volume of registration data of domain names. Data cleaning is applied in order to increase the effectiveness of decision-making support. Data cleaning is applied in warehouses for the detection and deletion of errors and discrepancies in data in order to improve their quality. For this purpose, fuzzy record-comparison algorithms for cleaning domain-name registration data are reviewed in this work. Also, an identification method for domain-name registration data for data warehouse formation is proposed. Decision-making algorithms for the identification of registration data are implemented in DrRacket and Python.
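The Damerau-Levenshtein distance listed in the keywords is the core primitive for this kind of fuzzy record comparison. A sketch of the restricted (optimal string alignment) variant — an illustrative reimplementation, not the authors' DrRacket/Python code:

```python
def osa_distance(a: str, b: str) -> int:
    """Optimal-string-alignment variant of Damerau-Levenshtein distance:
    edits are insertion, deletion, substitution, and transposition of
    two adjacent characters."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                       # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j                       # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]
```

Flagging registrant records within a small distance of each other (e.g. `osa_distance("example.com", "exmaple.com") == 1`) is a typical way such algorithms detect near-duplicate domain-name entries before warehouse loading.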
Keywords: Domain; Domain Name System; Registrar; Registrant; Category Data; Data Warehouse; Data Cleaning; Fuzzy Search Algorithms; Damerau-Levenshtein Distance; Decision Tree
Modern Corrosion Mapping of Storage Tank Bottoms--Notable Advancements in Critical Zone Coverage,Inspection Efficiency and Data Integrity
9
Authors: Andrew J. Simpson, Matthew A. Boat. Journal of Civil Engineering and Architecture, 2024, No. 3, pp. 148-153 (6 pages)
Every day, an NDT (Non-Destructive Testing) report will govern key decisions and inform inspection strategies that could affect the flow of millions of dollars, which ultimately affects local environments and potential risk to life. There is a direct correlation between report quality and equipment capability. The more able the equipment is, in terms of efficient data gathering, signal-to-noise ratio, positioning, and coverage, the more actionable the report is. This results in optimal maintenance and repair strategies, providing the report is clear and well presented. Furthermore, when considering tank floor storage inspection, it is essential that asset owners have total confidence in inspection findings and the ensuing reports. Tank floor inspection equipment must not only be efficient and highly capable, but data sets should be traceable and integrity maintained throughout. Corrosion mapping of large surface areas such as storage tank bottoms is an inherently arduous and time-consuming process. MFL (magnetic flux leakage) based tank bottom scanners present a well-established and highly rated method for inspection. There are many benefits of using modern MFL technology to generate actionable reports. Chief among these is efficiency of coverage while gaining valuable information regarding defect location, severity, surface origin and the extent of coverage. More recent advancements in modern MFL tank bottom scanners afford the ability to scan and record data sets at areas of the tank bottom which were previously classed as dead zones, or areas not scanned due to physical restraints. An example of this includes scanning the CZ (critical zone), which is the area close to the annular-to-shell junction weld. Inclusion of these additional dead zones increases overall inspection coverage, quality and traceability. Inspection of the CZ areas allows engineers to quickly determine the integrity of arguably the most important area of the tank bottom. Herein we discuss notable developments in CZ coverage, inspection efficiency and data integrity that combine to deliver an actionable report. The asset owner can interrogate this report to develop pertinent and accurate maintenance and repair strategies.
Keywords: Storage Tank; Tank Bottom; CZ; MFL; stars; Corrosion; Corrosion Mapping; Efficiency; Coverage; Paperless Reporting; Data Traceability
Statistical Methods of SNP Data Analysis and Applications
10
Authors: Alexander Bulinski, Oleg Butkovsky, Victor Sadovnichy, Alexey Shashkin, Pavel Yaskov, Alexander Balatskiy, Larisa Samokhodskaya, Vsevolod Tkachuk. Open Journal of Statistics, 2012, No. 1, pp. 73-87 (15 pages)
We develop various statistical methods important for multidimensional genetic data analysis. Theorems justifying application of these methods are established. We concentrate on multifactor dimensionality reduction, logic regression, random forests, and stochastic gradient boosting, along with their new modifications. We use complementary approaches to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors are examined. To perform the data analysis concerning coronary heart disease and myocardial infarction, the Lomonosov Moscow State University supercomputer “Chebyshev” was employed.
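Of the methods listed, multifactor dimensionality reduction is the easiest to sketch: for each SNP pair, genotype combinations are pooled into "high-risk" and "low-risk" groups by their case:control ratio, and the pair whose pooled rule classifies best is selected. A toy version (not the authors' implementation; the data layout and threshold are illustrative):

```python
from collections import Counter
from itertools import combinations

def mdr_pair_risk(genotypes, labels, threshold=1.0):
    """Toy MDR over SNP pairs.

    genotypes: list of tuples of genotype codes (0/1/2) per subject.
    labels: 1 = case, 0 = control.
    Returns (accuracy, best_pair, high_risk_combos).
    """
    n_snps = len(genotypes[0])
    best = None
    for i, j in combinations(range(n_snps), 2):
        cases, controls = Counter(), Counter()
        for g, y in zip(genotypes, labels):
            (cases if y == 1 else controls)[(g[i], g[j])] += 1
        # A genotype combination is "high risk" when its case:control
        # ratio exceeds the threshold (missing keys count as 0).
        high = {k for k in set(cases) | set(controls)
                if cases[k] > threshold * controls[k]}
        # Accuracy of the pooled high/low-risk classification rule.
        correct = sum(((g[i], g[j]) in high) == (y == 1)
                      for g, y in zip(genotypes, labels))
        acc = correct / len(labels)
        if best is None or acc > best[0]:
            best = (acc, (i, j), high)
    return best
```

Real MDR adds cross-validation over the candidate pairs; this sketch only shows the dimensionality-reduction step itself.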
Keywords: Genetic Data; Statistical Analysis; Multifactor Dimensionality Reduction; Ternary Logic Regression; Random Forests; Stochastic Gradient Boosting; Independent Rule; Single Nucleotide Polymorphisms; Coronary Heart Disease; Myocardial Infarction
Potential Applications of Milk Fractions and Valorization of Dairy By-Products: A Review of the State-of-the-Art Available Data, Outlining the Innovation Potential from a Bigger Data Standpoint (cited 3 times)
11
Authors: Serge Rebouillat, Salvadora Ortega-Requena. Journal of Biomaterials and Nanobiotechnology, 2015, No. 3, pp. 176-203 (28 pages)
The unique composition of milk makes this basic foodstuff into an exceptional raw material for the production of new ingredients with desired properties and diverse applications in the food industry. The fractionation of milk is the key in the development of those ingredients and products; hence continuous research and development in this field, especially various levels of fractionation and separation by filtration, have been carried out. This review focuses on the production of milk fractions as well as their particular properties, applications and processes that increase their exploitation. Whey proteins and caseins from the protein fraction are excellent emulsifiers and protein supplements. Besides, they can be chemically or enzymatically modified to obtain bioactive peptides with numerous functional and nutritional properties. In this context, valorization techniques for cheese-whey proteins, a by-product of the dairy industry that constitutes both an economic and an environmental problem, are being developed. Phospholipids from the milk fat fraction are powerful emulsifiers and also have exclusive nutraceutical properties. In addition, enzyme modification of milk phospholipids makes it possible to tailor emulsifiers with particular properties. However, several aspects remain to be overcome; those refer to a deeper understanding of the healthy, functional and nutritional properties of these new ingredients that might be barriers to their use and acceptability. Additionally, in this review, alternative applications of milk constituents in the non-food area, such as in the manufacture of plastic materials and textile fibers, are also introduced. The unmet needs, the cross-fertilization between various protein domains, the carbon footprint requirements, the environmental necessities, the new demand for health and wellness, etc., are dominant factors in the search for innovation approaches; these factors also outline the further innovation potential deriving from those “apparent” constraints, obliging science and technology to take them into account.
Keywords: Milk Product; Milk Fractionation; Casein; Phospholipid; Whey Protein; Non-Food Application; Valorization; Enzyme Modification; Bioactive Peptides; Bigger Data; Innovation: Closed, Open, Collaborative, Disruptive, Inclusive, Nested
A Review: On Smart Materials Based on Some Polysaccharides; within the Contextual Bigger Data, Insiders, “Improvisation” and Said Artificial Intelligence Trends (cited 1 time)
12
Authors: Serge Rebouillat, Fernand Pla. Journal of Biomaterials and Nanobiotechnology, 2019, No. 2, pp. 41-77 (37 pages)
Smart Materials are, along with Innovation attributes and Artificial Intelligence, among the most used “buzz” words in all media. Central to their practical occurrence, many talents are to be gathered within new contextual data influxes. Has this, in the last 20 years, changed some of the essential fundamental dimensions and the required skills of the actors such as providers, users, insiders, etc.? This is a preliminary focus and prelude of this review. As an example, polysaccharide materials are the most abundant macromolecules present as an integral part of the natural system of our planet. They are renewable, biodegradable, and carbon neutral, with low environmental, health and safety risks, and serve as structural materials in the cell walls of plants. Most of them have been used, for many years, as engineering materials in many important industrial processes, such as pulp and papermaking and the manufacture of synthetic textile fibres. They are also used in other domains such as conversion into biofuels and, more recently, in the design of processes using polysaccharide nanoparticles. The main properties of polysaccharides (e.g. low density, thermal stability, chemical resistance, high mechanical strength…), together with their biocompatibility, biodegradability, functionality, durability and uniformity, allow their use for manufacturing smart materials such as blends and composites, electroactive polymers and hydrogels, which can be obtained 1) through direct utilization and/or 2) after chemical or physical modifications of the polysaccharides. This paper reviews recent works developed on polysaccharides, mainly on cellulose, hemicelluloses, chitin, chitosans, alginates, and their by-products (blends and composites), with the objective of manufacturing smart materials. It is worth noting that, today, the fundamental understanding of the molecular-level interactions that confer smartness to polysaccharides remains poor, and one can predict that new experimental and theoretical tools will emerge to develop the necessary understanding of the structure-property-function relationships that will enable polysaccharide smartness to be better understood and controlled, giving rise to the development of new and innovative applications such as nanotechnology, foods, cosmetics and medicine (e.g. controlled drug release and regenerative medicine), and so opening up major commercial markets in the context of green chemistry.
Keywords: Polysaccharides; Cellulose; Hemicelluloses; Chitosan; Alginate; Composites; Blends; Hydrogels; Smart Materials; Electro-Active Papers; Sensors; Actuators; Bigger Data; Innovation; Science in Education; Jazz; 4C; CRAC
The Information Protection in Automatic Reconstruction of Not Continuous Geophysical Data Series (cited 1 time)
13
Author: Osvaldo Faggioni. Journal of Data Analysis and Information Processing, 2019, No. 4, pp. 208-227 (20 pages)
We show a quantitative technique, characterized by low numerical mediation, for the reconstruction of temporal sequences of geophysical data of length L interrupted for a time ΔT. The aim is to protect the information acquired before and after the interruption by means of a numerical protocol with the lowest possible calculation weight. The signal reconstruction process is based on the synthesis of the low-frequency signal extracted by subsampling (subsampling ∇Dirac = ΔT, in phase with ΔT) with the high-frequency signal recorded before the crash. The SYRec (SYnthetic REConstruction) method, for its simplicity and speed of calculation and for its spectral response stability, is particularly effective in studies of high-speed transient phenomena that develop in very perturbed fields. This operative condition is fundamental when almost immediate informational responses are required of the observation system. In this example we are dealing with geomagnetic data coming from an underwater counter-intrusion magnetic system. The system produces (on time) information about the transit of local magnetic singularities (magnetic perturbations with low spatial extension), originated by quasi-point-form and kinematic sources (divers), in harbor magnetic underwater fields. The performance and stability of the SYRec system make it usable also in long and medium periods of observation (activity of geomagnetic observatories).
Keywords: Geomatics; Geomagnetism; Non-Continuous Data Series; Synthetic Reconstruction; Protection of Physical Information; Data Manipulation
A Study of EM Algorithm as an Imputation Method: A Model-Based Simulation Study with Application to a Synthetic Compositional Data
14
Authors: Yisa Adeniyi Abolade, Yichuan Zhao. Open Journal of Modelling and Simulation, 2024, No. 2, pp. 33-42 (10 pages)
Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, or sums to a constant, like 100%. The statistical linear model is the most used technique for identifying hidden relationships between underlying random variables of interest. However, data quality is a significant challenge in machine learning, especially when missing data is present. The linear regression model is a commonly used statistical modeling technique used in various applications to find relationships between variables of interest. When estimating linear regression parameters, which are useful for things like future prediction and partial-effects analysis of independent variables, maximum likelihood estimation (MLE) is the method of choice. However, many datasets contain missing observations, which can lead to costly and time-consuming data recovery. To address this issue, the expectation-maximization (EM) algorithm has been suggested as a solution for situations involving missing data. The EM algorithm repeatedly finds the best estimates of parameters in statistical models that depend on variables or data that have not been observed. This is called maximum likelihood or maximum a posteriori (MAP). Using the present estimate as input, the expectation (E) step constructs a log-likelihood function. Finding the parameters that maximize the anticipated log-likelihood, as determined in the E step, is the job of the maximization (M) phase. This study looked at how well the EM algorithm worked on a synthetic compositional dataset with missing observations, using both the robust least-squares and ordinary least-squares regression techniques. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
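The E/M alternation described above can be made concrete in the simplest setting: a bivariate normal with some y values missing. This is a sketch of the textbook algorithm, not the paper's compositional-data setup:

```python
import statistics

def em_bivariate_impute(pairs, n_iter=100):
    """EM imputation for a bivariate normal where y may be missing.

    pairs: list of (x, y) with y possibly None.
    Returns ((mx, my, sxx, syy, sxy), completed), where `completed`
    has every missing y replaced by its conditional expectation E[y | x].
    """
    n = len(pairs)
    xs = [x for x, _ in pairs]
    ys_obs = [y for _, y in pairs if y is not None]
    # x is fully observed, so its mean and variance are fixed once.
    mx = statistics.fmean(xs)
    sxx = statistics.fmean([(x - mx) ** 2 for x in xs]) or 1e-9
    # Initialise the y-parameters from the observed cases only.
    my = statistics.fmean(ys_obs)
    syy = statistics.fmean([(y - my) ** 2 for y in ys_obs]) or 1e-9
    sxy = 0.0
    ey_all = [y for _, y in pairs]
    for _ in range(n_iter):
        b = sxy / sxx                     # regression slope of y on x
        vcond = max(syy - sxy * b, 0.0)   # Var(y | x) under current params
        sy = syy2 = sxy2 = 0.0
        ey_all = []
        for x, y in pairs:
            if y is None:                 # E-step: expected statistics
                ey = my + b * (x - mx)
                ey2 = ey * ey + vcond     # E[y^2 | x] adds Var(y | x)
            else:
                ey, ey2 = y, y * y
            ey_all.append(ey)
            sy += ey
            syy2 += ey2
            sxy2 += x * ey
        # M-step: re-estimate the y-parameters from expected statistics.
        my = sy / n
        syy = syy2 / n - my * my
        sxy = sxy2 / n - mx * my
    completed = [(x, ey) for (x, _), ey in zip(pairs, ey_all)]
    return (mx, my, sxx, syy, sxy), completed
```

With observed points lying exactly on y = 2x, the iteration converges to slope 2 and fills the gaps on that line, illustrating how the E and M steps reinforce each other.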
Keywords: Compositional Data; Linear Regression Model; Least Square Method; Robust Least Square Method; Synthetic Data; Aitchison Distance; Maximum Likelihood Estimation; Expectation-Maximization Algorithm; k-Nearest Neighbor and Mean Imputation
Modeling and Simulation Study of Space Data Link Protocol
15
Authors: Ismail Hababeh, Rizik M. H. Al-Sayyed, Ja’far Alqatawna, Yousef Majdalawi, Marwan Nabelsi. International Journal of Communications, Network and System Sciences, 2014, No. 10, pp. 440-452 (13 pages)
This research paper describes the design and implementation of the Consultative Committee for Space Data Systems (CCSDS) standards [1] for the Space Data Link Layer Protocol (SDLP). The primary focus is the telecommand (TC) part of the standard. The implementation of the standard was in the form of DLL functions using the C++ programming language. The second objective of this paper was to use the DLL functions with the OMNeT++ simulation environment to create a simulator in order to analyze the mean end-to-end packet delay, the maximum achievable application-layer throughput for a given fixed link capacity, and the normalized protocol overhead, defined as the total number of bytes transmitted on the link in a given period of time (e.g. per second) divided by the number of bytes of application data received at the application-layer model data sink. In addition, the DLL was also integrated with the Ground Support Equipment Operating System (GSEOS), a software system for space instruments and small spacecraft especially suited for low-budget missions. The SDLP is designed for rapid test-system design and high flexibility for changing telemetry and command requirements. GSEOS can be seamlessly moved from EM/FM development (bench testing) to flight operations. It features the Python programming language as a configuration/scripting tool and can easily be extended to accommodate custom hardware interfaces. This paper also shows the results of the simulations and their analysis.
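The TC transfer frame at the heart of the protocol starts with a 5-octet primary header. A pack/unpack sketch based on our reading of the CCSDS TC standard's bit layout (version 2 bits, bypass flag 1, control command flag 1, spare 2, spacecraft ID 10, virtual channel ID 6, frame length 10, frame sequence number 8) — treat the field widths as an assumption to verify against the Blue Book, not a reference implementation:

```python
import struct

def pack_tc_primary_header(version, bypass, ctrl_cmd, scid, vcid,
                           frame_len, seq):
    """Pack a 5-octet TC transfer frame primary header (assumed layout:
    version(2) | bypass(1) | ctrl_cmd(1) | spare(2) | scid(10) ||
    vcid(6) | frame_len(10) || seq(8), all big-endian)."""
    word1 = (version << 14) | (bypass << 13) | (ctrl_cmd << 12) | scid
    word2 = (vcid << 10) | frame_len
    return struct.pack(">HHB", word1, word2, seq)

def unpack_tc_primary_header(raw):
    """Inverse of pack_tc_primary_header; returns a field dictionary."""
    word1, word2, seq = struct.unpack(">HHB", raw)
    return {"version": word1 >> 14,
            "bypass": (word1 >> 13) & 1,
            "ctrl_cmd": (word1 >> 12) & 1,
            "scid": word1 & 0x3FF,
            "vcid": word2 >> 10,
            "frame_len": word2 & 0x3FF,
            "seq": seq}
```

The overhead metric defined in the abstract then falls out directly: header bytes per frame divided into the application payload carried per frame.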
Keywords: Consultative Committee for Space Data Systems Standards; Space Data Link Protocol; Mean End-to-End Packet Delay; Maximum Achievable Application Layer Throughput; Normalized Protocol Overhead; Telecommand; Spacecraft; Space Instruments
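The two headline metrics of the abstract follow directly from their definitions; a minimal sketch (function and variable names are illustrative assumptions, not from the paper):

```python
def normalized_overhead(link_bytes_per_s: float, app_bytes_per_s: float) -> float:
    """Total bytes transmitted on the link per second divided by
    application-data bytes received at the data sink per second."""
    return link_bytes_per_s / app_bytes_per_s

def mean_end_to_end_delay(delays_s: list[float]) -> float:
    """Average per-packet delay between send and receive timestamps."""
    return sum(delays_s) / len(delays_s)

# Example: 1200 B/s on the link carrying 1000 B/s of application data
print(normalized_overhead(1200.0, 1000.0))        # 1.2
print(mean_end_to_end_delay([0.10, 0.20, 0.30]))  # ~0.2 s
```

A normalized overhead of 1.2 means the framing, headers, and retransmissions cost 20% on top of the useful application payload.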
Empowering the Future: Exploring the Construction and Characteristics of Lithium-Ion Batteries
16
Author: Dan Tshiswaka Dan 《Advances in Chemical Engineering and Science》 CAS 2024, No. 2, pp. 84-111 (28 pages)
Lithium has attracted remarkable attention for energy storage devices over the past 30 years. Lithium is a light element with the low atomic number 3, just after hydrogen and helium in the periodic table. The lithium atom has a strong tendency to release one electron and form a positive charge, as Li<sup>+</sup>. Initially, lithium metal was employed as the negative electrode, which released electrons. However, it was observed that its structure changed after repeated charge-discharge cycles. To remedy this, the cathode came to consist mainly of layered metal oxides and olivines, e.g., cobalt oxide, LiFePO<sub>4</sub>, etc., along with some lithium content, while the anode was assembled from graphite, silicon, etc. Moreover, the electrolyte was prepared using a lithium salt in a suitable solvent to attain a greater concentration of lithium ions. Owing to the role of the lithium ions, the battery was named the lithium-ion battery. Herein, the presented work describes the working and operational mechanism of the lithium-ion battery. Further, the general view and future prospects of lithium-ion batteries are also elaborated.
Keywords: Lithium-Ion Batteries; Battery Construction; Battery Characteristics; Energy Storage; Electrochemical Cells; Anode Materials; Cathode Materials; State of Charge (SOC); Depth of Discharge (DOD); Solid Electrolyte Interface (SEI)
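The state-of-charge and depth-of-discharge terms among the keywords are related by SOC = 1 − DOD, and SOC is commonly tracked by coulomb counting. A simple sketch (illustrative only, not from the article):

```python
def soc_coulomb_counting(soc0: float, current_a: float, dt_s: float,
                         capacity_ah: float) -> float:
    """Update state of charge by integrating current over one time step.
    Positive current = discharge. SOC is clamped to [0, 1]."""
    delta = (current_a * dt_s / 3600.0) / capacity_ah  # fraction of capacity moved
    return min(1.0, max(0.0, soc0 - delta))

# Discharging a 2 Ah cell at 1 A for 36 s removes 0.5% of capacity
soc = soc_coulomb_counting(0.80, 1.0, 36.0, 2.0)
print(round(soc, 3))  # 0.795
dod = 1.0 - soc       # depth of discharge
```

Real battery management systems correct this integration with voltage-based estimates, since current-sensor drift accumulates over time.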
Data Mining Based Research of Development Direction of Waist Protection Equipment
17
Authors: Lingfeng ZHU, Zhizhen LU, Haijie YU, Haifen YING, Zheming LI, Huashan FAN 《Medicinal Plant》 2024, No. 2, pp. 84-90 (7 pages)
[Objectives] To explore brands' trends in the design of waist protection products through data mining, and to provide reference for the design concept of the contour of a waist protection pillow. [Methods] The structural design information of all waist protection equipment was collected from national Internet platforms, the data were classified, and a database was established. IBM SPSS 26.0 and MATLAB 2018a were used to analyze the data, which were tabulated in Tableau 2022.4. After the association rules were clarified, the data were imported into Cinema 4D R21 to create the concept contour of a waist protection pillow. [Results] The mean and standard deviation of the single-airbag design were the highest among all groups, with a mean of 0.511 and a standard deviation of 0.502. The mean and standard deviation of the upper-and-lower dual-airbag design were the lowest among all groups, with a mean of 0.015 and a standard deviation of 0.120. The correlation coefficient between the single airbag and 120° arc stretching was 0.325, a positive correlation (P<0.01); the correlation coefficient between multiple airbags and 360° encircling fitting was 0.501, a positive correlation and the highest correlation degree among all groups (P<0.01). [Conclusions] The single-airbag design is well recognized by companies and has received the highest attention among all brand products. While focusing on the single-airbag design, most brands will consider adding 120° arc-stretching elements in product design. When focusing on multiple-airbag designs, some brands believe that 360° encircling fitting elements need to be added to the product, and the correlation between the two is the highest among all groups.
Keywords: Spine; Low back pain; Data mining; Airbag; Stretching; Fitting; Steel plate support; Bidirectional compression; Conceptual contour; Design
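The statistics reported in the abstract are plain feature means, standard deviations, and pairwise correlation coefficients over the product database. The same quantities can be computed with NumPy on binary feature columns (the data below are invented for illustration, not the study's database):

```python
import numpy as np

# Hypothetical per-product binary features: 1 = the product has the feature
single_airbag  = np.array([1, 1, 0, 1, 0, 1, 1, 0])
arc_stretching = np.array([1, 1, 0, 1, 0, 0, 1, 0])

r = np.corrcoef(single_airbag, arc_stretching)[0, 1]
print(f"mean={single_airbag.mean():.3f}, "
      f"std={single_airbag.std(ddof=1):.3f}, r={r:.3f}")
```

With 0/1 feature coding, the mean is simply the share of products carrying the feature, which is why a mean of 0.511 in the abstract reads as "about half of all products use a single airbag".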
Spatio-temporal variability of surface chlorophyll a in the Yellow Sea and the East China Sea based on reconstructions of satellite data of 2001-2020
18
Authors: Weichen XIE, Tao WANG, Wensheng JIANG 《Journal of Oceanology and Limnology》 SCIE CAS CSCD 2024, No. 2, pp. 390-407 (18 pages)
Chlorophyll-a (Chl-a) concentration is a primary indicator for marine environmental monitoring. The spatio-temporal variations of sea surface Chl-a concentration in the Yellow Sea (YS) and the East China Sea (ECS) in 2001-2020 were investigated by reconstructing MODIS Level 3 products with the data interpolation empirical orthogonal function (DINEOF) method. The reconstructions obtained by interpolating the combined MODIS daily + 8-day datasets were found to be better than those obtained by interpolating daily or 8-day data alone. Chl-a concentration in the YS and the ECS reached its maximum in spring, with blooms occurring, decreased in summer and autumn, and increased in late autumn and early winter. By performing empirical orthogonal function (EOF) decomposition of the reconstructed data fields and correlation analysis with several potential environmental factors, we found that sea surface temperature (SST) plays a significant role in the seasonal variation of Chl a, especially during spring and summer. The increase of SST in spring, together with the upper-layer nutrients mixed up during the preceding winter, might favor the occurrence of spring blooms. High SST throughout the summer would strengthen vertical stratification and prevent nutrient supply from deep water, resulting in low surface Chl-a concentrations. The sea surface Chl-a concentration in the YS was found to have decreased significantly from 2012 to 2020, possibly related to the Pacific Decadal Oscillation (PDO).
Keywords: chlorophyll a (Chl a); data interpolation empirical orthogonal function (DINEOF); empirical orthogonal function (EOF) analysis; Yellow Sea; East China Sea
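The EOF decomposition used in the abstract amounts to a singular value decomposition of the time-anomaly matrix of the field. A minimal NumPy sketch on synthetic data (not the MODIS fields):

```python
import numpy as np

rng = np.random.default_rng(0)
field = rng.normal(size=(120, 50))      # 120 time steps x 50 grid points

anom = field - field.mean(axis=0)       # remove the time mean at each point
u, s, vt = np.linalg.svd(anom, full_matrices=False)

eofs = vt                               # rows: spatial patterns (EOF modes)
pcs = u * s                             # principal-component time series
explained = s**2 / np.sum(s**2)         # variance fraction per mode
print(explained[:3])                    # leading modes explain the most variance
```

In practice only the few leading modes with physically interpretable patterns (e.g. the SST-driven seasonal cycle) are retained; the rest are treated as noise.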
A Panel Data Model Analysis of the Impact of FDI on China's Income Distribution (Cited: 3)
19
Author: Lin Hong 《浙江统计》 (Zhejiang Statistics) 2005, No. 3, pp. 19-21 (3 pages)
Keywords: FDI; panel data
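The paper's actual specification is not reproduced in this listing; a generic two-way fixed-effects panel model of the kind such studies use (the symbols below are illustrative assumptions, not the paper's) is:

```latex
Y_{it} = \alpha_i + \gamma_t + \beta\,\mathrm{FDI}_{it} + X_{it}'\delta + \varepsilon_{it}
```

where $Y_{it}$ is an income-distribution measure for region $i$ in year $t$, $\alpha_i$ and $\gamma_t$ are region and time effects, $X_{it}$ collects control variables, and $\beta$ captures the effect of FDI.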
Proposed Caching Scheme for Optimizing Trade-off between Freshness and Energy Consumption in Named Data Networking Based IoT (Cited: 1)
20
Authors: Rahul Shrimali, Hemal Shah, Riya Chauhan 《Advances in Internet of Things》 2017, No. 2, pp. 11-24 (14 pages)
Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT expands the existing common concepts, anytime and anyplace, to connectivity for anything. The proliferation of IoT offers opportunities but may also bear risks. A hitherto neglected aspect is the possible increase in power consumption, as smart devices in IoT applications are expected to be reachable by other devices at all times. This implies that a device consumes electrical energy even when it is not in use for its primary function. Many research communities have started addressing the storage ability (cache memory) of smart devices using the concept called Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge, especially when the internal memory of a node exceeds its limit: data with the highest degree of freshness may not be accommodated, and the entire scenario behaves like a traditional network. In such cases, data caching is not performed by intermediate nodes to guarantee the highest degree of freshness. For the periodical updates sent from data producers, it is strongly demanded that data consumers get up-to-date information at the cost of the least energy. Consequently, there is a challenge in maintaining the trade-off between freshness and energy consumption during publisher-subscriber interaction. In our work, we propose an architecture that overcomes the cache-strategy issue with a Smart Caching Algorithm for improved memory management and data freshness. The smart caching strategy updates the data at precise intervals while keeping garbage data into consideration. It is also observed from experiments that data redundancy can easily be avoided by ignoring/dropping data packets carrying information that is not of interest to other participating nodes in the network, ultimately optimizing the trade-off between freshness and the energy required.
Keywords: Internet of Things (IoT); Named Data Networking; Smart Caching Table; Pending Interest; Forwarding Information Base; Content Store; Content Centric Networking; Information Centric Networking; Data & Interest Packets; SCT (Smart Caching Table)
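A freshness-aware content store of the kind the abstract describes can be sketched as follows (a minimal illustration under assumed names; the paper's actual Smart Caching Algorithm is more elaborate):

```python
class FreshnessCache:
    """Tiny NDN-style content store: entries expire after `freshness_s`
    seconds; when full, the stalest entry is evicted first."""

    def __init__(self, capacity: int, freshness_s: float):
        self.capacity = capacity
        self.freshness_s = freshness_s
        self.store = {}  # name -> (data, insert_time)

    def put(self, name: str, data: bytes, now: float) -> None:
        # Drop expired ("garbage") entries before considering eviction
        self.store = {n: (d, t) for n, (d, t) in self.store.items()
                      if now - t < self.freshness_s}
        if len(self.store) >= self.capacity:
            stalest = min(self.store, key=lambda n: self.store[n][1])
            del self.store[stalest]
        self.store[name] = (data, now)

    def get(self, name: str, now: float):
        entry = self.store.get(name)
        if entry and now - entry[1] < self.freshness_s:
            return entry[0]
        return None  # miss or stale -> forward the Interest upstream

cache = FreshnessCache(capacity=2, freshness_s=10.0)
cache.put("/sensor/temp", b"21C", now=0.0)
print(cache.get("/sensor/temp", now=5.0))   # b'21C' (still fresh)
print(cache.get("/sensor/temp", now=15.0))  # None (stale)
```

Serving fresh entries locally saves the energy of forwarding the Interest upstream, while the expiry check prevents stale data from being returned, which is exactly the trade-off the paper targets.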