Journal Articles
23,900 articles found
1. Phase equilibrium data prediction and process optimization in butadiene extraction process
Authors: Baowei Niu, Yanjie Yi, Yuwen Wei, Fuzhen Zhang, Lili Wang, Li Xia, Xiaoyan Sun, Shuguang Xiang. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2024, No. 7, pp. 1-12.
In response to the lack of reliable physical parameters in process simulation of butadiene extraction, a large amount of phase equilibrium data were collected in the context of the actual process of butadiene production by acetonitrile. The accuracy of five prediction methods, UNIFAC (UNIQUAC Functional-group Activity Coefficients), UNIFAC-LL, UNIFAC-LBY, UNIFAC-DMD and COSMO-RS, applied to the butadiene extraction process was verified using partial phase equilibrium data. The results showed that the UNIFAC-DMD method had the highest accuracy in predicting phase equilibrium data for the missing systems. COSMO-RS predictions for multiple systems also showed good accuracy, and a large number of missing phase equilibrium data were estimated using the UNIFAC-DMD and COSMO-RS methods. The predicted phase equilibrium data were checked for consistency. The NRTL-RK (Non-Random Two-Liquid-Redlich-Kwong equation of state) and UNIQUAC thermodynamic models were used to correlate the phase equilibrium data. Industrial device simulations were used to verify the accuracy of the thermodynamic model applied to the butadiene extraction process. The simulation results showed that the average deviation of the results obtained with the correlated thermodynamic model from the actual plant values was less than 2%, far smaller than the deviation of simulations using the built-in database of the commercial simulator Aspen Plus (>10%), indicating that the obtained phase equilibrium data are highly accurate and reliable. The best phase equilibrium data and thermodynamic model parameters for butadiene extraction are provided. This improves the accuracy and reliability of the design, optimization and control of the process, and provides a basis and guarantee for developing a more environmentally friendly and economical butadiene extraction process.
Keywords: butadiene extraction; phase equilibrium data; prediction methods; thermodynamic modeling; process simulation
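The NRTL correlation used in entry 1 can be illustrated with a short, self-contained sketch. This is not the paper's fitted model: the binary NRTL activity-coefficient equations below are standard, but the interaction parameters tau12, tau21 and alpha are hypothetical placeholders rather than values regressed from the butadiene-acetonitrile data.

```python
# Minimal sketch: binary NRTL activity-coefficient model of the kind used to
# correlate the phase-equilibrium data in entry 1. Parameter values are
# hypothetical placeholders, not the paper's regressed parameters.
import numpy as np

def nrtl_binary(x1, tau12, tau21, alpha=0.3):
    """Return (gamma1, gamma2) for a binary mixture at liquid mole fraction x1."""
    x2 = 1.0 - x1
    G12 = np.exp(-alpha * tau12)
    G21 = np.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return np.exp(ln_g1), np.exp(ln_g2)

# Example call; in a simulator these gammas enter modified Raoult's law,
# y_i * P = x_i * gamma_i * Psat_i, to compute vapor-liquid equilibrium.
gamma1, gamma2 = nrtl_binary(x1=0.4, tau12=1.2, tau21=0.8)
print(gamma1, gamma2)
```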
2. Big Data Application Simulation Platform Design for Onboard Distributed Processing of LEO Mega-Constellation Networks
Authors: Zhang Zhikai, Gu Shushi, Zhang Qinyu, Xue Jiayin. China Communications (SCIE, CSCD), 2024, No. 7, pp. 334-345.
Due to the restricted satellite payloads in LEO mega-constellation networks (LMCNs), remote sensing image analysis, online learning and other big data services desirably need onboard distributed processing (OBDP). In existing technologies, the efficiency of big data applications (BDAs) in distributed systems hinges on stable, low-latency links between worker nodes. However, LMCNs with highly dynamic nodes and long-distance links cannot provide such conditions, which makes the performance of OBDP hard to measure intuitively. To bridge this gap, a multidimensional simulation platform is indispensable that can simulate the network environment of LMCNs and place BDAs in it for performance testing. Using STK's APIs and a parallel computing framework, we achieve real-time simulation of thousands of satellite nodes, which are mapped to application nodes through software-defined networking (SDN) and container technologies. We elaborate the architecture and mechanism of the simulation platform, and take Starlink and Hadoop as realistic examples for simulations. The results indicate that LMCNs have dynamic end-to-end latency which fluctuates periodically with the constellation movement. Compared to ground data center networks (GDCNs), LMCNs deteriorate the computing and storage job throughput, which can be alleviated by the use of erasure codes and data-flow scheduling of worker nodes.
Keywords: big data application; Hadoop; LEO mega-constellation; multidimensional simulation; onboard distributed processing
3. Data processing method for aerial testing of rotating accelerometer gravity gradiometer
Authors: QIAN Xuewu, TANG Hailiang. 《中国惯性技术学报》 (EI, CSCD, PKU Core), 2024, No. 8, pp. 743-752.
A novel method for noise removal from the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using the Chebyshev optimal consistent approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. The results of hardware-in-the-loop simulations and prototype experiments show that the proposed method yields a 14% improvement in the measurement variance of gravity gradient signals over other commonly used methods, with measurement accuracy reaching within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insight for high-precision airborne gravity gradiometry.
Keywords: airborne gravity gradiometer; data processing; band-pass filter; evaluation function
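The core step of entry 3, zero-phase band-pass filtering with head-to-tail data expansion, can be sketched roughly as follows. The sampling rate, extension length, pass band and Chebyshev filter order are illustrative assumptions, not the parameters selected by the paper's SNR-gain/smoothness/cross-correlation scheme.

```python
# Minimal sketch of zero-phase band-pass filtering with a head-to-tail
# (mirrored) extension of the record, so filter transients do not distort
# the ends. All numeric parameters are assumed for illustration.
import numpy as np
from scipy import signal

fs = 100.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.randn(t.size)   # toy record

n_ext = 500                                 # head-to-tail (mirror) extension length
x_ext = np.concatenate([x[n_ext:0:-1], x, x[-2:-n_ext - 2:-1]])

sos = signal.cheby1(N=4, rp=0.5, Wn=[0.1, 2.0], btype="bandpass",
                    fs=fs, output="sos")    # Chebyshev type-I band-pass design
y_ext = signal.sosfiltfilt(sos, x_ext)      # forward-backward => zero phase
y = y_ext[n_ext:n_ext + x.size]             # discard the extensions
```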
4. Cloud-Edge Collaborative Federated GAN Based Data Processing for IoT-Empowered Multi-Flow Integrated Energy Aggregation Dispatch
Authors: Zhan Shi. Computers, Materials & Continua (SCIE, EI), 2024, No. 7, pp. 973-994.
The convergence of the Internet of Things (IoT), 5G, and cloud collaboration offers tailored solutions to the rigorous demands of multi-flow integrated energy aggregation dispatch data processing. While generative adversarial networks (GANs) are instrumental in resource scheduling, their application in this domain is impeded by challenges such as slow convergence, inferior optimality-searching capability, and the inability to learn from failed decision-making feedback. Therefore, a cloud-edge collaborative federated GAN-based communication and computing resource scheduling algorithm with long-term constraint violation sensitiveness is proposed to address these challenges. The proposed algorithm facilitates real-time, energy-efficient data processing by optimizing transmission power control, data migration, and computing resource allocation. It employs federated learning for global parameter aggregation to enhance GAN parameter updating, and dynamically adjusts GAN learning rates and global aggregation weights based on energy-consumption constraint violations. Simulation results indicate that the proposed algorithm effectively reduces data processing latency, energy consumption, and convergence time.
Keywords: IoT; federated learning; generative adversarial network; data processing; multi-flow integrated energy aggregation dispatch
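The federated aggregation step that entry 4 builds on can be sketched as a plain FedAvg-style weighted average of per-node parameters. Weighting by per-node sample counts is an assumption for illustration; the paper additionally adapts learning rates and aggregation weights to energy-consumption constraint violations.

```python
# Minimal sketch of federated (FedAvg-style) parameter aggregation: edge nodes
# train local GAN parameters and the cloud averages them with weights. Toy
# parameter shapes and sample counts are assumptions.
import numpy as np

def federated_average(local_params, weights):
    """Weighted average of per-node parameter dicts {name: ndarray}."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return {name: sum(wi * p[name] for wi, p in zip(w, local_params))
            for name in local_params[0]}

# Three edge nodes with toy 2x2 "generator" weights and per-node sample counts.
nodes = [{"gen.w": np.random.randn(2, 2)} for _ in range(3)]
global_params = federated_average(nodes, weights=[100, 250, 150])
print(global_params["gen.w"])
```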
5. Machine Learning-based Identification of Contaminated Images in Light Curve Data Preprocessing
Authors: Hui Li, Rong-Wang Li, Peng Shu, Yu-Qiang Li. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 4, pp. 287-295.
Attitude is one of the crucial parameters for space objects and plays a vital role in collision prediction and debris removal. Analyzing light curves to determine attitude is the most commonly used method. In photometric observations, outliers may exist in the obtained light curves due to various reasons, so preprocessing is required to remove them and obtain high-quality light curves. Through statistical analysis, the causes of outliers can be categorized into two main types: first, the brightness of the object significantly increases due to the passage of a nearby star, referred to as "stellar contamination," and second, the brightness markedly decreases due to cloud cover, referred to as "cloudy contamination." The traditional approach of manually inspecting images for contamination is time-consuming and labor-intensive, so we propose the use of machine learning methods as a substitute. Convolutional neural networks and SVMs are employed to identify cases of stellar contamination and cloudy contamination, achieving F1 scores of 1.00 and 0.98 on a test set, respectively. We also explore other machine learning methods such as ResNet-18 and Light Gradient Boosting Machine, and conduct comparative analyses of the results.
Keywords: techniques: image processing; methods: data analysis; light pollution
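A minimal sketch of the SVM branch described in entry 5 is given below: feature vectors stand in for the contaminated/clean image data, and the classifier is scored with the F1 metric quoted in the abstract. The random features and labels are synthetic stand-ins, not the paper's photometric data.

```python
# Minimal sketch: SVM classification of "contaminated" vs "clean" frames,
# evaluated with the F1 score. Synthetic features stand in for image data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 64))                       # stand-in image feature vectors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)        # stand-in contamination labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("F1 on test set:", f1_score(y_te, clf.predict(X_te)))
```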
6. Combination of Tsoft and ET34-ANA-V80 software for the preprocessing and analysis of tide gauge data in French Polynesia (cited 1 time)
Authors: Bernard Ducarme, Jean-Pierre Barriot, Fangzhao Zhang. Geodesy and Geodynamics (CSCD), 2023, No. 1, pp. 26-34.
Since 2008, a network of five sea-level monitoring stations has been progressively installed in French Polynesia. The stations are autonomous, and the data, collected at a sampling rate of 1 or 2 min, are not only recorded locally but also transferred in real time by a radio link to the NOAA through the GOES satellite. The new ET34-ANA-V80 version of ETERNA, initially developed for Earth tide analysis, is now able to analyze ocean tide records. Through a two-step validation scheme, we took advantage of the flexibility of this new version, operated in conjunction with the preprocessing facilities of the Tsoft software, to recover corrected data series able to model sea-level variations after elimination of the ocean tide signal. We performed the tidal analysis of the tide gauge data with the highest possible selectivity (optimal wave grouping) and a maximum of additional terms (shallow-water constituents). Our goal was to provide corrected data series and modelled ocean tide signals to compute tide-free sea-level variations, as well as tidal prediction models with centimeter precision. We also present the characteristics of the ocean tides in French Polynesia and preliminary results concerning non-tidal variations of the sea level related to the tide gauge setting.
Keywords: tide gauges; tidal data processing; mean sea level
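Behind ETERNA-style tidal analysis in entry 6 lies least-squares estimation of constituent amplitudes and phases. The sketch below fits two major constituents (M2, S2) to a synthetic hourly record; the record, its noise level and the restriction to two constituents are assumptions, and real analyses use optimal wave grouping over many more constituents.

```python
# Minimal sketch of a least-squares harmonic tidal fit (two constituents).
# M2/S2 frequencies in cycles per hour are standard; the record is synthetic.
import numpy as np

freqs_cph = {"M2": 1 / 12.4206012, "S2": 1 / 12.0}
t = np.arange(0, 30 * 24, 1.0)                          # hourly samples, 30 days
h = (0.5 * np.cos(2 * np.pi * freqs_cph["M2"] * t - 0.3)
     + 0.2 * np.cos(2 * np.pi * freqs_cph["S2"] * t + 1.1)
     + 0.05 * np.random.randn(t.size))                  # toy sea-level record (m)

# Design matrix: mean term plus cos/sin pair per constituent.
cols = [np.ones_like(t)]
for f in freqs_cph.values():
    cols += [np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, h, rcond=None)

for i, name in enumerate(freqs_cph):
    a, b = coef[1 + 2 * i], coef[2 + 2 * i]
    print(name, "amplitude:", np.hypot(a, b), "phase:", np.arctan2(b, a))
```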
7. Big Data Analytics Using Graph Signal Processing
Authors: Farhan Amin, Omar M. Barukab, Gyu Sang Choi. Computers, Materials & Continua (SCIE, EI), 2023, No. 1, pp. 489-502.
Networks are fundamental to our modern world and appear throughout science and society. Access to a massive amount of data presents a unique opportunity to the research community. As networks grow in size, their complexity increases, and our ability to analyze them using the current state of the art is at severe risk of failing to keep pace. Therefore, this paper initiates a discussion on graph signal processing for large-scale data analysis. We first provide a comprehensive overview of the core ideas in graph signal processing (GSP) and their connection to conventional digital signal processing (DSP). We then summarize recent developments in basic GSP tools, including methods for graph filtering and graph learning, graph signals, the graph Fourier transform (GFT), spectra, and graph frequencies. Graph filtering is a basic task that isolates the contribution of individual frequencies and therefore enables the removal of noise. We then consider a graph filter as a model that helps extend the application of GSP methods to large datasets. To show its suitability and effectiveness, we created a noisy graph signal and applied the filter to it. After several rounds of simulation, the filtered signal appears smoother and closer to the original noise-free distance-based signal. Using this example application, we demonstrate that graph filtering is efficient for big data analytics.
Keywords: big data; data science; big data processing; graph signal processing; social networks
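The noisy-graph-signal experiment summarized in entry 7 can be reproduced in miniature: build a graph Laplacian, take the graph Fourier transform (GFT) of a noisy signal, and keep only the low graph frequencies. The ring graph, noise level and cutoff index below are illustrative assumptions.

```python
# Minimal sketch of graph filtering via the graph Fourier transform: a noisy
# signal on a small ring graph is low-pass filtered in the Laplacian eigenbasis.
import numpy as np

n = 40
A = np.zeros((n, n))
for i in range(n):                           # ring-graph adjacency
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
L = np.diag(A.sum(1)) - A                    # combinatorial Laplacian
lam, U = np.linalg.eigh(L)                   # graph frequencies and GFT basis

x = np.sin(2 * np.pi * np.arange(n) / n)     # smooth graph signal
x_noisy = x + 0.3 * np.random.randn(n)

x_hat = U.T @ x_noisy                        # forward GFT
h = (lam <= lam[8]).astype(float)            # ideal low-pass graph filter (cutoff assumed)
x_filt = U @ (h * x_hat)                     # inverse GFT of the filtered spectrum
print("error power before/after filtering:",
      np.mean((x_noisy - x) ** 2), np.mean((x_filt - x) ** 2))
```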
8. Application and evaluation of layering shear method in LADCP data processing
Authors: Zijian Cui, Chujin Liang, Binbin Guo, Feilong Lin, Yong Mu. Acta Oceanologica Sinica (SCIE, CAS, CSCD), 2023, No. 12, pp. 9-21.
Current velocity observation with LADCP (Lowered Acoustic Doppler Current Profiler) has the advantages of a large vertical observation range and high operability compared with traditional current measurement methods, and is widely used in ocean observation. The shear and inverse methods are commonly used by the international marine community to process LADCP data and calculate ocean current profiles. The two methods have their own advantages and shortcomings: the shear method calculates the current shear more accurately but is less accurate in the absolute value of the current, while the inverse method calculates the absolute value of the current velocity more accurately but gives a less accurate shear. Based on the shear method, this paper proposes a layering shear method that calculates the current velocity profile by "layering averaging", and proposes corresponding current calculation methods for the different types of problems found in several field observation datasets from the western Pacific, forming an independent LADCP data processing system. The comparison results show that the layering shear method achieves the same effect as the inverse method in calculating the absolute value of current velocity, while retaining the advantages of the shear method in calculating the current shear.
Keywords: LADCP; data processing; layering shear method; western Pacific
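The "layering averaging" idea behind entry 8 can be sketched as: average the per-bin vertical shear inside fixed depth layers, then integrate the layer-averaged shear to obtain a relative velocity profile (the integration constant must still be referenced, e.g. against ship-GPS or bottom-track velocities). The layer thickness and toy shear data are assumptions.

```python
# Minimal sketch of layer-averaged shear integration for a velocity profile.
import numpy as np

dz = 8.0                                        # depth-layer thickness (m), assumed
z_edges = np.arange(0, 1000 + dz, dz)           # layer edges, 0-1000 m
z_obs = np.sort(np.random.uniform(0, 1000, 5000))           # bin depths from many casts
shear_obs = (-1e-4 * np.cos(2 * np.pi * z_obs / 1000)
             + 1e-5 * np.random.randn(z_obs.size))          # toy du/dz observations (1/s)

# Average the observed shear inside each depth layer ("layering averaging").
layer_idx = np.digitize(z_obs, z_edges) - 1
shear_layer = np.array([shear_obs[layer_idx == k].mean()
                        for k in range(len(z_edges) - 1)])

# Integrate the layer-averaged shear downward: velocity relative to the surface.
u_rel = np.concatenate([[0.0], np.cumsum(shear_layer * dz)])
print(u_rel[:5])
```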
9. Data processing of the Kuiyang-ST2000 deep-towed high-resolution multichannel seismic system and application to South China Sea data
Authors: Yanliang PEI, Mingming WEN, Zhengrong WEI, Baohua LIU, Kai LIU, Guangming KAN. Journal of Oceanology and Limnology (SCIE, CAS, CSCD), 2023, No. 2, pp. 644-659.
The Kuiyang-ST2000 deep-towed high-resolution multichannel seismic system was designed by the First Institute of Oceanography, Ministry of Natural Resources (FIO, MNR). The system is mainly composed of a plasma spark source (source level: 216 dB, main frequency: 750 Hz, frequency bandwidth: 150-1200 Hz) and a towed hydrophone streamer with 48 channels. Because the source and the towed hydrophone streamer move constantly according to the towing configuration, accurate positioning of the towed hydrophone array and moveout correction of deep-towed multichannel seismic data before imaging are challenging. Initially, according to the characteristics of the system and the streamer shape in deep water, a travel-time positioning method was used to construct the hydrophone streamer shape, and the results were corrected using polynomial curve fitting. Then, a new data-processing workflow for Kuiyang-ST2000 data was introduced, mainly including floating-datum setting, residual static correction, and phase-based moveout correction, which allows the imaging algorithms of conventional marine seismic data processing to be extended to deep-towed seismic data. We successfully applied the Kuiyang-ST2000 system and this data-processing methodology to a gas hydrate survey of the Qiongdongnan and Shenhu areas in the South China Sea, and the results show that the profiles have very high vertical and lateral resolutions (0.5 m and 8 m, respectively), providing full and accurate details of gas hydrate-related and geohazard sedimentary and structural features in the South China Sea.
Keywords: Kuiyang-ST2000 system; deep-towed system; seismic data processing; plasma spark source; high resolution; gas hydrate; South China Sea
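The streamer-shape step in entry 9, travel-time positioning followed by polynomial curve fitting, can be sketched with a simple low-order fit. The channel offsets, toy depth profile and polynomial order are assumptions, not the paper's geometry.

```python
# Minimal sketch: smooth a travel-time-positioned streamer shape with a
# low-order polynomial fit over the 48 channel positions.
import numpy as np

offsets = np.linspace(0, 600, 48)                 # along-streamer channel offsets (m), assumed
depth_raw = (2000 + 0.05 * offsets
             + 2.0 * np.random.randn(offsets.size))   # noisy travel-time depths (m), toy data

coeffs = np.polyfit(offsets, depth_raw, deg=3)    # low-order polynomial fit
depth_fit = np.polyval(coeffs, offsets)           # corrected streamer shape
print(np.round(depth_fit[:5], 1))
```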
10. Status of UnDifferenced and Uncombined GNSS Data Processing Activities in China
Authors: Pengyu HOU, Delu CHE, Teng LIU, Jiuping ZHA, Yunbin YUAN, Baocheng ZHANG. Journal of Geodesy and Geoinformation Science (CSCD), 2023, No. 3, pp. 135-144.
With the continued development of multiple Global Navigation Satellite Systems (GNSS) and the emergence of various frequencies, UnDifferenced and UnCombined (UDUC) data processing has become an increasingly attractive option. In this contribution, we provide an overview of the current status of UDUC GNSS data processing activities in China. These activities encompass the formulation of Precise Point Positioning (PPP) models and PPP Real-Time Kinematic (PPP-RTK) models for processing single-station and multi-station GNSS data, respectively. Regarding single-station data processing, we discuss the advancements in PPP models, particularly the extension from a single system to multiple systems, and from dual frequencies to single and multiple frequencies. Additionally, we introduce the modified PPP model, which accounts for the time variation of receiver code biases, a departure from the conventional PPP model that typically assumes these biases to be time-constant. In the realm of multi-station PPP-RTK data processing, we introduce the ionosphere-weighted PPP-RTK model, which enhances the model strength by considering the spatial correlation of ionospheric delays. We also review the phase-only PPP-RTK model, designed to mitigate the impact of unmodelled code-related errors. Furthermore, we explore GLONASS PPP-RTK, achieved through the application of the integer-estimable model. For large-scale network data processing, we introduce the all-in-view PPP-RTK model, which alleviates the strict common-view requirement at all receivers. Moreover, we present the decentralized PPP-RTK data processing strategy, designed to improve computational efficiency. Overall, this work highlights the various advancements in UDUC GNSS data processing, providing insights into the state-of-the-art techniques employed in China to achieve precise GNSS applications.
Keywords: Global Navigation Satellite Systems (GNSS); UnDifferenced and UnCombined (UDUC); Precise Point Positioning (PPP); PPP Real-Time Kinematic (PPP-RTK); single-station data processing; multi-station data processing
11. Inter-agency government information sharing under data-driven blockchain framework
Authors: XIAO Jiong-en, HONG Ming, DING Li-ping. 《控制理论与应用》 (EI, CAS, CSCD, PKU Core), 2024, No. 8, pp. 1369-1376.
Inter-agency government information sharing (IAGIS) plays an important role in improving the service and efficiency of government agencies. Currently, there is still no effective and secure way for data-driven IAGIS to fulfill the dynamic demands of information sharing between government agencies. Motivated by blockchain and data mining, a data-driven framework is proposed for IAGIS in this paper. Firstly, a blockchain is used as the core of the framework for monitoring and preventing leakage and abuse of government information, in order to guarantee information security. Secondly, a four-layer architecture is designed for implementing the proposed framework. Thirdly, the classical data mining algorithms PageRank and Apriori are applied to dynamically design smart contracts for information sharing, for the purpose of flexibly adjusting the information sharing strategies according to the practical demands of government agencies for public management and public service. Finally, a case study is presented to illustrate the operation of the proposed framework.
Keywords: government data processing; blockchain; PageRank; Apriori
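Entry 11 applies PageRank when composing information-sharing smart contracts; the algorithm itself reduces to a short power iteration. The four-node link graph and damping factor 0.85 below are illustrative assumptions.

```python
# Minimal sketch of PageRank by power iteration on a tiny directed graph.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}    # node -> outgoing links (toy graph)
n, d = 4, 0.85                                     # number of nodes, damping factor

# Column-stochastic transition matrix.
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[dst, src] = 1.0 / len(outs)

r = np.full(n, 1.0 / n)
for _ in range(100):                               # power iteration
    r = (1 - d) / n + d * M @ r
print("PageRank scores:", r / r.sum())
```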
12. Research on Tensor Multi-Clustering Distributed Incremental Updating Method for Big Data
Authors: Hongjun Zhang, Zeyu Zhang, Yilong Ruan, Hao Ye, Peng Li, Desheng Shi. Computers, Materials & Continua (SCIE, EI), 2024, No. 10, pp. 1409-1432.
The scale and complexity of big data are growing continuously, posing severe challenges to traditional data processing methods, especially in the field of clustering analysis. To address this issue, this paper introduces a new method named Big Data Tensor Multi-Cluster Distributed Incremental Update (BDTMCDIncreUpdate), which combines distributed computing, storage technology, and incremental update techniques to provide an efficient and effective means for clustering analysis. Firstly, the original dataset is divided into multiple sub-blocks, and distributed computing resources are utilized to process the sub-blocks in parallel, enhancing efficiency. Then, initial clustering is performed on each sub-block using tensor-based multi-clustering techniques to obtain preliminary results. When new data arrive, incremental update technology is employed to update the core tensor and factor matrix, ensuring that the clustering model can adapt to changes in the data. Finally, by combining the updated core tensor and factor matrix with historical computational results, refined clustering results are obtained, achieving real-time adaptation to dynamic data. In experimental simulation on the Aminer dataset, the BDTMCDIncreUpdate method demonstrated outstanding performance in terms of accuracy (ACC) and normalized mutual information (NMI), achieving an accuracy rate of 90% and an NMI score of 0.85, outperforming existing methods such as TClusInitUpdate and TKLClusUpdate in most scenarios. The BDTMCDIncreUpdate method therefore offers an innovative solution for big data analysis, integrating distributed computing, incremental updates, and tensor-based multi-clustering techniques. It not only improves efficiency and scalability in processing large-scale, high-dimensional datasets, but has also been validated for effectiveness and accuracy through experiments. The method shows great potential in real-world applications where dynamic data growth is common, and is of significant importance for advancing the development of data analysis technology.
Keywords: tensor; incremental update; distributed; clustering processing; big data
13. A novel medical image data protection scheme for smart healthcare system
Authors: Mujeeb Ur Rehman, Arslan Shafique, Muhammad Shahbaz Khan, Maha Driss, Wadii Boulila, Yazeed Yasin Ghadi, Suresh Babu Changalasetty, Majed Alhaisoni, Jawad Ahmad. CAAI Transactions on Intelligence Technology (SCIE, EI), 2024, No. 4, pp. 821-836.
The Internet of Multimedia Things (IoMT) refers to a network of interconnected multimedia devices that communicate with each other over the Internet. Recently, smart healthcare has emerged as a significant application of the IoMT, particularly in the context of knowledge-based learning systems. Smart healthcare systems leverage knowledge-based learning to become more context-aware, adaptable, and auditable while maintaining the ability to learn from historical data. In smart healthcare systems, devices capture images such as X-rays and magnetic resonance imaging. The security and integrity of these images are crucial for the databases used in knowledge-based learning systems to foster structured decision-making and enhance the learning abilities of AI. Moreover, in knowledge-driven systems, the storage and transmission of HD medical images place a burden on the limited bandwidth of the communication channel, leading to data transmission delays. To address the security and latency concerns, this paper presents a lightweight medical image encryption scheme utilising bit-plane decomposition and chaos theory. The experiments yield entropy, energy, and correlation values of 7.999, 0.0156, and 0.0001, respectively. This validates the effectiveness of the proposed encryption system, which offers high-quality encryption, a large key space, key sensitivity, and resistance to statistical attacks.
Keywords: data analysis; medical image processing; security
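The two ingredients named in entry 13, bit-plane decomposition and a chaos-based keystream, can be sketched in a few lines. This toy combines them with a plain XOR and is not the paper's full scheme; the logistic-map key values are arbitrary assumptions.

```python
# Minimal sketch: bit-plane decomposition of an 8-bit image plus a logistic-map
# keystream XORed with the pixels. Illustrative only; key values are arbitrary.
import numpy as np

img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in image

# Bit-plane decomposition: planes[k] holds bit k of every pixel.
planes = [(img >> k) & 1 for k in range(8)]
recombined = np.zeros_like(img)
for k in range(8):
    recombined |= planes[k].astype(np.uint8) << k
assert np.array_equal(recombined, img)            # planes losslessly rebuild the image

# Logistic-map keystream: x_{n+1} = mu * x_n * (1 - x_n); (mu, x0) act as the key.
mu, x = 3.99, 0.4321
ks = np.empty(img.size)
for i in range(img.size):
    x = mu * x * (1 - x)
    ks[i] = x
keystream = (ks * 255).astype(np.uint8).reshape(img.shape)

cipher = img ^ keystream                          # encrypt
recovered = cipher ^ keystream                    # decrypt (XOR is its own inverse)
assert np.array_equal(recovered, img)
```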
14. Terrorism Attack Classification Using Machine Learning: The Effectiveness of Using Textual Features Extracted from GTD Dataset
Authors: Mohammed Abdalsalam, Chunlin Li, Abdelghani Dahou, Natalia Kryvinska. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 2, pp. 1427-1467.
One of the biggest dangers to society today is terrorism, where attacks have become one of the most significant risks to international peace and national security. Big data, information analysis, and artificial intelligence (AI) have become the basis for making strategic decisions in many sensitive areas, such as fraud detection, risk management, medical diagnosis, and counter-terrorism. However, there is still a need to assess how terrorist attacks are related, initiated, and detected. For this purpose, we propose a novel framework for classifying and predicting terrorist attacks. The proposed framework posits that neglected text attributes included in the Global Terrorism Database (GTD) can influence the accuracy of the model's classification of terrorist attacks, as each part of the data can provide vital information to enrich the classifier's learning. Each data point in a multiclass taxonomy has one or more tags attached to it, referred to as "related tags." We applied machine learning classifiers to classify terrorist attack incidents obtained from the GTD. A transformer-based technique called DistilBERT extracts and learns contextual features from the text attributes to acquire more information from the text data. The extracted contextual features are combined with the "key features" of the dataset and used to perform the final classification. The study explored different experimental setups with various classifiers to evaluate the model's performance. The experimental results show that the proposed framework outperforms the latest techniques for classifying terrorist attacks, with an accuracy of 98.7% using a combined feature set and an extreme gradient boosting classifier.
Keywords: artificial intelligence; machine learning; natural language processing; data analytics; DistilBERT; feature extraction; terrorism classification; GTD dataset
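Entry 14's feature pipeline, DistilBERT embeddings of the text attributes concatenated with the tabular "key features" and fed to a boosted-tree classifier, can be sketched as below. The toy texts, features and labels are assumptions, and scikit-learn's gradient boosting stands in for the paper's extreme gradient boosting (XGBoost).

```python
# Minimal sketch: DistilBERT text embeddings + tabular key features, combined
# and classified with gradient boosting. Toy data; not the GTD pipeline itself.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.ensemble import GradientBoostingClassifier

texts = ["armed assault on a checkpoint", "bombing near a market",
         "hostage taking reported", "facility attacked overnight"]
key_features = np.array([[1, 0], [0, 1], [1, 1], [0, 0]], dtype=float)  # toy tabular features
labels = np.array([0, 1, 2, 0])                                         # toy attack-type labels

tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
bert = AutoModel.from_pretrained("distilbert-base-uncased")
enc = tok(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    emb = bert(**enc).last_hidden_state[:, 0, :].numpy()   # first-token ([CLS]) embeddings

X = np.hstack([emb, key_features])                          # combined feature set
clf = GradientBoostingClassifier().fit(X, labels)
print(clf.predict(X[:2]))
```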
15. Enhancing the vertical resolution of lunar penetrating radar data using predictive deconvolution
Authors: Chao Li, JinHai Zhang. Earth and Planetary Physics (EI, CAS, CSCD), 2024, No. 4, pp. 570-578.
The Yutu-2 rover onboard the Chang'E-4 mission performed the first lunar penetrating radar detection on the farside of the Moon. The high-frequency channel presented us with many unprecedented details of the subsurface structures within a depth of approximately 50 m. However, it was still difficult to identify finer layers among the cluttered reflections and scattering waves. We applied deconvolution to improve the vertical resolution of the radar profile by extending the limited bandwidth associated with the emitted radar pulse. To overcome the challenges arising from mixed-phase wavelets and the problematic amplification of noise, we performed predictive deconvolution to remove the minimum-phase components from the Chang'E-4 dataset, followed by a comprehensive phase rotation to rectify phase anomalies in the radar image. Subsequently, we implemented irreversible migration filtering to mitigate the noise and diminutive clutter echoes amplified by deconvolution. The processed data show an evident enhancement of the vertical resolution, with a widened bandwidth in the frequency domain and better signal clarity in the time domain, providing more undisputed details of the subsurface structures near the Chang'E-4 landing site.
Keywords: Chang'E-4; lunar penetrating radar; data processing; predictive deconvolution; irreversible migration filtering
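Predictive deconvolution as used in entry 15 amounts to designing a prediction filter from the trace autocorrelation and outputting the prediction error. The sketch below uses a synthetic trace; the filter length, prediction gap and prewhitening level are assumptions rather than the values used for the Chang'E-4 data.

```python
# Minimal sketch of predictive deconvolution: solve the Toeplitz normal
# equations for a prediction filter, then output the prediction error.
import numpy as np
from scipy.linalg import toeplitz, solve

rng = np.random.default_rng(1)
refl = rng.normal(size=500) * (rng.random(500) < 0.05)       # sparse reflectivity
wavelet = np.exp(-0.5 * ((np.arange(30) - 8) / 3.0) ** 2)    # smooth source pulse (toy)
trace = np.convolve(refl, wavelet, mode="same")              # band-limited trace

nfilt, gap, prewhite = 40, 1, 0.01
r = np.correlate(trace, trace, mode="full")[trace.size - 1:]  # autocorrelation, lags >= 0
R = toeplitz(r[:nfilt]) + prewhite * r[0] * np.eye(nfilt)     # prewhitened normal equations
g = r[gap:gap + nfilt]
f = solve(R, g)                                               # prediction filter

pef = np.zeros(gap + nfilt)                                   # prediction-error filter
pef[0] = 1.0
pef[gap:] = -f
decon = np.convolve(trace, pef)[:trace.size]                  # widened-bandwidth output
```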
16. Comparison of Results of Different GPS Post-processing Software
Authors: Dapeng SHI. Asian Agricultural Research, 2024, No. 6, pp. 33-35.
In order to obtain high-precision GPS control point results and provide high-precision known points for various projects, this study uses several mature GPS post-processing software packages to process the observation data of the GPS control network of Guanyinge Reservoir and compares the results they produce. Based on the test results, the reasons for the accuracy differences between the packages are analyzed, and the optimal results are identified through the comparison. The purpose of this paper is to provide a useful reference for GPS software users when processing data.
Keywords: GPS; data processing; point position; precision
17. Lévy Area Analysis and Parameter Estimation for fOU Processes via Non-Geometric Rough Path Theory
Authors: Zhongmin QIAN, Xingcheng XU. Acta Mathematica Scientia (SCIE, CSCD), 2024, No. 5, pp. 1609-1638.
This paper addresses the estimation problem of an unknown drift parameter matrix for a fractional Ornstein-Uhlenbeck process in a multi-dimensional setting. To tackle this problem, we propose a novel approach based on rough path theory that allows us to construct pathwise rough path estimators from both continuous and discrete observations of a single path. Our approach is particularly suitable for high-frequency data. To formulate the parameter estimators, we introduce a theory of pathwise Itô integrals with respect to fractional Brownian motion. By establishing the regularity of fractional Ornstein-Uhlenbeck processes and analyzing the long-term behavior of the associated Lévy area processes, we demonstrate that our estimators are strongly consistent and pathwise stable. Our findings offer a new perspective on estimating the drift parameter matrix for fractional Ornstein-Uhlenbeck processes in multi-dimensional settings, and may have practical implications for fields including finance, economics, and engineering.
Keywords: Itô integration; Lévy area; non-geometric rough path; fOU processes; pathwise stability; long-time asymptotics; high-frequency data
18. Ending Privacy's Gremlin: Stopping the Data-Broker Loophole to the Fourth Amendment's Search Warrant Requirement
Authors: Samantha B. Larkin, Shakour Abuzneid. Journal of Information Security, 2024, No. 4, pp. 589-611.
Advances in technology require upgrades in the law. One such area involves data brokers, which have thus far gone unregulated. Data brokers use artificial intelligence to aggregate information into data profiles about individual Americans, derived from consumer use of the internet and connected devices. These data profiles are then sold for profit. Government investigators use a legal loophole to purchase this data instead of obtaining a search warrant, which the Fourth Amendment would otherwise require. Consumers have lacked a reasonable means to fight or correct the information data brokers collect. Americans may not even be aware of the risks of data aggregation, which upends the test of reasonable expectations used in a search warrant analysis. Data aggregation should be controlled and regulated, which is the direction some privacy laws take. Legislatures must step forward to safeguard against shadowy data-profiling practices, whether abroad or at home. In the meantime, courts can modify their search warrant analysis by including data privacy principles.
Keywords: access control; access rights; artificial intelligence; consumer behavior; consumer protection; criminal law; data brokers; data handling; data privacy; data processing; data profiling; digital forensics
19. Robotic Process Automation with New Future Trends
Authors: Abu Tayab, Yanwen Li. Journal of Computer and Communications, 2024, No. 6, pp. 12-24.
Robotic Process Automation (RPA) has emerged as a transformative technology that has reshaped business processes by programming repetitive tasks and streamlining operations. This research focuses on the RPA environment and its future features in order to elaborate projected directions based on accumulated experience. Industry continues to look for IT solutions that scale the enterprise, improve business flexibility, customer satisfaction, productivity and accuracy, and reduce costs; with its quick scalability, RPA has appeared as an advanced technology with exceptional performance. The study emphasizes future trends and foresees the evolution of RPA through the integration of artificial intelligence, machine learning and cognitive automation into RPA frameworks. It also analyzes technical constraints, including scalability, security issues and interoperability, while investigating the regulatory and ethical considerations that are important to the responsible use of RPA. By providing a comprehensive analysis of RPA and its emerging trends, the study aims to offer valuable insights into its benefits for industrial performance and to guide strategic decisions and future implementations of RPA.
Keywords: robotic process automation; artificial intelligence; machine learning; cognitive computing; interoperability; data security
20. A Review of the Status and Development Strategies of Computer Science and Technology Under the Background of Big Data
Authors: Junlin Zhang. Journal of Electronic Research and Application, 2024, No. 2, pp. 49-53.
This article discusses the current status and development strategies of computer science and technology in the context of big data. Firstly, it explains the relationship between big data and computer science and technology, focusing on the current application status of computer science and technology in big data, including data storage, data processing, and data analysis. It then proposes development strategies for big data processing. Computer science and technology play a vital role in big data processing by providing strong technical support.
Keywords: big data; computer science and technology; data storage; data processing; data visualization