The inter-city linkage heat data provided by Baidu Migration is employed as a characterization of inter-city linkages in order to facilitate the study of the network linkage characteristics and hierarchical structure of the urban agglomeration in the Greater Bay Area through the use of social network analysis methods. This is the inaugural application of big data based on location-based services (LBS) in the study of urban agglomeration network structure, which represents a novel research perspective on this topic. The study reveals that the density of network linkages in the Greater Bay Area urban agglomeration has reached 100%, indicating a mature network-like spatial structure. This structure has given rise to three distinct communities: Shenzhen-Dongguan-Huizhou, Guangzhou-Foshan-Zhaoqing, and Zhuhai-Zhongshan-Jiangmen. Additionally, cities within the Greater Bay Area urban agglomeration play different roles, suggesting that varying development strategies may be necessary to achieve staggered development. The study demonstrates that large datasets represented by LBS can offer novel insights and methodologies for the examination of urban agglomeration network structures, contingent on the appropriate mining and processing of the data.
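For readers who want a concrete picture of the social network analysis summarized above, the following minimal sketch (not the authors' code) builds a weighted graph from hypothetical inter-city migration heat values with networkx, then computes the network density and a modularity-based community partition; the city pairs and heat values are invented placeholders.

```python
# Minimal sketch of social-network analysis on inter-city linkage data.
# The migration "heat" values below are hypothetical placeholders, not Baidu data.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

linkage_heat = {
    ("Shenzhen", "Dongguan"): 95, ("Dongguan", "Huizhou"): 60,
    ("Guangzhou", "Foshan"): 90, ("Foshan", "Zhaoqing"): 40,
    ("Zhuhai", "Zhongshan"): 55, ("Zhongshan", "Jiangmen"): 35,
    ("Shenzhen", "Guangzhou"): 70, ("Guangzhou", "Zhuhai"): 30,
}

G = nx.Graph()
for (a, b), heat in linkage_heat.items():
    G.add_edge(a, b, weight=heat)

# Density = realized links / possible links; 1.0 would mean a fully connected network.
print("network density:", nx.density(G))

# Modularity-based community detection as one way to recover city sub-groups.
for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"community {i}: {sorted(community)}")
```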
Based on high-tide shoreline data extracted from 87 Landsat satellite images from 1986 to 2019, as well as using the linear regression rate and performing a Mann-Kendall (M–K) trend test, this study analyzes the linear characteristics and nonlinear behavior of the medium- to long-term shoreline evolution of Jinghai Bay, eastern Guangdong Province. In particular, shoreline rotation caused by a shore-normal coastal structure is emphasized. The results show that the overall shoreline evolution over the past 30 years is characterized by erosion on the southwest beach, with an average erosion rate of 3.1 m/a, and significant accretion on the northeast beach, with an average accretion rate of 5.6 m/a. Results of the M–K trend test indicate that significant shoreline changes occurred in early 2006, which can be attributed to shore-normal engineering. Prior to that engineering construction, the shorelines were slightly eroded, with an average erosion rate of 0.7 m/a. After the shore-normal engineering was performed, however, the shoreline was characterized by significant erosion (3.2 m/a) on the southwest beach and significant accretion (8.5 m/a) on the northeast beach, indicating that the shore-normal engineering at the updrift headland contributes to clockwise shoreline rotation. Further analysis shows that the clockwise shoreline rotation is promoted not only by longshore sediment transport processes from southwest to northeast, but also by cross-shore sediment transport processes. These findings are crucial for beach erosion risk management, coastal disaster zoning, regional sediment budget assessments, and further observations and predictions of beach morphodynamics.
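The two statistical tools named in this abstract, a linear regression rate of shoreline change and the Mann-Kendall trend test, can be illustrated with the rough sketch below; the shoreline positions are synthetic, and the M–K implementation is the basic no-ties textbook form rather than the study's processing chain.

```python
# Sketch: linear regression rate and Mann-Kendall trend test on a shoreline time series.
# Positions (metres relative to a baseline) are synthetic, for illustration only.
import numpy as np

years = np.arange(1986, 2020)
positions = 5.6 * (years - 1986) + np.random.default_rng(0).normal(0, 4, years.size)

# Linear regression rate (m/a): slope of a least-squares fit of position vs time.
rate, intercept = np.polyfit(years, positions, 1)
print(f"linear regression rate: {rate:.2f} m/a")

def mann_kendall(x):
    """Return the M-K statistic S and the standardized Z score (ties not handled)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s, z = mann_kendall(positions)
print(f"M-K statistic S = {s}, Z = {z:.2f} (|Z| > 1.96 suggests a significant trend at 5%)")
```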
The present study focuses on the analysis and description of lineaments interpreted as secondary structures in order to describe the nature of the Senegalo-Malian Discontinuity (SMD). These lineaments cross-cut the large north-south oriented transcurrent lithospheric structure known as the SMD. Two lineaments oriented NNE (N15˚ to N25˚) were selected, one at Dialafara and one at Sadiola. Four profiles were laid out on each lineament in these two zones, so that there were two on each side of the SMD. The ground data collected were processed using appropriate parameters and software. Filters were applied to enhance the signal level. These ground data were then compared with the existing airborne magnetic data for consistency and accuracy using the upward continuation filter. The results show that the quality of the ground data is good. In addition, the ground magnetic data show the presence of certain local anomalies that are not visible in the regional data. The analytical signal was also used to determine domain boundaries or possible contact zones. The contact zone can be highlighted on certain profiles such as L300 and L600. The study showed that the west and east sides of the SMD are not the same. Secondary structures become wider when approaching the SMD on both sides. They are also duplicated to the east of the SMD when moving progressively away from it. In the Dialafara area, the ground magnetic profiles intersect an interpreted fold. The results of this work confirm the presence of the secondary structures and their evolution in relation to the SMD. The relationships between the secondary structures in the Dialafara and Sadiola zones and their relations with the SMD are highlighted. The technique used in this study is an important approach for better describing and interpreting regional structures using secondary structures and for proposing a structural model.
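Two processing steps mentioned above, upward continuation (used to compare ground and airborne data) and the analytic signal (used to locate contacts), can be sketched for a single magnetic profile as follows; the profile, station spacing, and continuation height are invented, and the Hilbert-transform estimate of the vertical derivative is a standard textbook shortcut for 2-D profiles rather than the workflow actually used in the study.

```python
# Sketch: upward continuation and analytic-signal amplitude for a 2-D magnetic profile.
# The profile below is synthetic; dx and the continuation height are arbitrary choices.
import numpy as np
from scipy.signal import hilbert

dx = 10.0                                        # station spacing (m)
x = np.arange(0, 2000, dx)
profile = 50 * np.exp(-((x - 900) / 150) ** 2)   # toy anomaly (nT)

# Upward continuation: multiply the spectrum by exp(-|k| * h).
h = 100.0                                        # continuation height (m)
k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)
continued = np.real(np.fft.ifft(np.fft.fft(profile) * np.exp(-np.abs(k) * h)))

# Analytic-signal amplitude: sqrt((dT/dx)^2 + (dT/dz)^2); for a profile, the vertical
# derivative can be approximated by the Hilbert transform of the horizontal derivative.
dT_dx = np.gradient(profile, dx)
dT_dz = np.imag(hilbert(dT_dx))
analytic_amplitude = np.hypot(dT_dx, dT_dz)

print("peak of analytic signal near x =", x[np.argmax(analytic_amplitude)], "m")
```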
The South Yellow Sea basin is filled with Mesozoic-Cenozoic continental sediments overlying pre-Palaeozoic and Mesozoic-Palaeozoic marine sediments. Conventional multi-channel seismic data cannot describe the velocity structure of the marine residual basin in detail, leading to a lack of deeper understanding of its distribution and lithology owing to strong energy shielding at the top interface of the marine sediments. In this study, we present seismic tomography data from ocean bottom seismographs that describe the NEE-trending velocity distributions of the basin. The results indicate that strong velocity variations occur at shallow crustal levels. Horizontal velocity bodies show good correlation with surface geological features, and multi-layer features exist in the vertical velocity framework (depth: 0–10 km). The analyses of the velocity model, gravity data, magnetic data, multi-channel seismic profiles, and drilling data showed that high-velocity anomalies (>6.5 km/s) of small (thickness: 1–2 km) and large (thickness: >5 km) scales were caused by igneous complexes in the multi-layer structure, which were active during the Palaeogene. Possible locations of good-quality Mesozoic and Palaeozoic marine strata are limited to the Central Uplift and the western part of the Northern Depression along the wide-angle ocean bottom seismograph array. Following the Indosinian movement, strong compression existed in the Northern Depression during the extensional phase, which caused the formation of folds in the middle of the survey line. This study is useful for reconstructing the regional tectonic evolution and delineating the distribution of the marine residual basin in the South Yellow Sea basin.
The research purpose of this dissertation is threefold: to innovate artificial intelligence methods, to create the intersection of artificial intelligence and biological research, and to innovate human methodology. The work I have done in my research includes: improving logical structure and logical engineering, using my theory to study the innovation of the development path of artificial intelligence, using my theory to create biomimetic logic, a new intersection of artificial intelligence and biological research, and exploring the innovation of human methodology through the previous two works. The results of the research are as follows: 1) Introduction to bionic logic, incorporating simulations of people, society, and life as core principles. 2) Definition of the logical structure as the primary focus of research, with logic mechanics serving as foundational research principles. 3) Examination of the logical structure’s environment through logical fields and networks. 4) Study of logical structure communication via logical networks and main lines. 5) Proposal of data logic. 6) Investigation into the logic of logical structures, employing structural diagrams of logical equations. 7) Development of a theory of life activity within logical structures, encompassing information reasoning, its corresponding control structure, and structural reasoning. 8) Introduction of the lifecycle theory for logical structures and examination of the clock equation. 9) Exploration of logical structure intelligence. 10) Study of logical structures in mathematical forms. 11) Introduction of logic engineering. 12) Examination of artificial intelligence’s significance. 13) Investigation into the significance of human methodology.
With the deepening of the Guangdong-Hong Kong-Macao Greater Bay Area strategy and the accelerated integration and development of the east and west sides of the Pearl River Estuary, Zhuhai’s hub position is becoming more and more prominent. The city of Zhuhai has a dense water network and is divided into two urban areas, the east and the west, under the influence of the Modaomen waterway. Based on space syntax theory, this paper carries out an analytical study of the urban spatial structure of Zhuhai, identifies the distribution characteristics of urban points of interest (POIs), and provides theoretical support for the urban development of Zhuhai.
Data protection in databases is critical for any organization, as unauthorized access or manipulation can have severe negative consequences. Intrusion detection systems are essential for keeping databases secure. Advancements in technology will lead to significant changes in the medical field, improving healthcare services through real-time information sharing. However, issues of reliability and consistency still need to be addressed. Safeguards against cyber-attacks are necessary due to the risk of unauthorized access to sensitive information and potential data corruption. Disruptions to data items can propagate throughout the database, making it crucial to reverse fraudulent transactions without delay, especially in the healthcare industry, where real-time data access is vital. This research presents a role-based access control architecture for an anomaly detection technique. Additionally, the Structured Query Language (SQL) queries are stored in a new data structure called Pentaplet. These pentaplets allow us to maintain the correlation between SQL statements within the same transaction by employing the transaction-log entry information, thereby increasing detection accuracy, particularly for individuals within the company exhibiting unusual behavior. To identify anomalous queries, this system employs a supervised machine learning technique called Support Vector Machine (SVM). According to experimental findings, the proposed model performed well in terms of detection accuracy, achieving 99.92% through SVM with One-Hot Encoding and Principal Component Analysis (PCA).
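A minimal sketch of the classification stack cited at the end of the abstract, one-hot encoding followed by PCA and an SVM, is given below; the query records and labels are fabricated toy data, and the pentaplet data structure itself is not reproduced.

```python
# Sketch: one-hot encoding + PCA + SVM for classifying SQL-query feature records.
# The feature rows and labels are toy placeholders, not real transaction-log data.
from sklearn.preprocessing import OneHotEncoder
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Each record: (command, table, role); label 1 = anomalous, 0 = normal.
records = [
    ("SELECT", "patients", "nurse"), ("SELECT", "patients", "doctor"),
    ("UPDATE", "billing", "clerk"),  ("DELETE", "patients", "clerk"),
    ("SELECT", "billing", "clerk"),  ("UPDATE", "patients", "doctor"),
    ("DELETE", "billing", "nurse"),  ("SELECT", "patients", "nurse"),
]
labels = [0, 0, 0, 1, 0, 0, 1, 0]

X = OneHotEncoder(handle_unknown="ignore").fit_transform(records).toarray()
X = PCA(n_components=3).fit_transform(X)   # compress the sparse one-hot space

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0, stratify=labels)
model = SVC(kernel="rbf").fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, model.predict(X_test)))
```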
To capitalize on the primary role of major course teaching and to facilitate students’ understanding of abstract concepts in the data structure course, it is essential to increase their interest in learning and to develop case studies that highlight fine traditional culture. By incorporating these culture-rich case studies into classroom instruction, we employ a project-driven teaching approach. This not only allows students to master professional knowledge, but also enhances their ability to solve specific engineering problems, ultimately fostering cultural confidence. Over the past few years, during which the educational reforms have been conducted as trial runs, the feasibility and effectiveness of these reform schemes have been demonstrated.
A robust and efficient algorithm is presented to build multiresolution models (MRMs) of arbitrary meshes without the requirement of subdivision connectivity. To overcome the sampling difficulty of arbitrary meshes, edge contraction and vertex expansion are used as the downsampling and upsampling methods. Our MRMs of a mesh are composed of a base mesh and a series of edge split operations, which are organized as a directed graph. Each split operation encodes two parts of information: one is the modification to the mesh, and the other is the dependency relation among splits. Such an organization ensures the efficiency and robustness of our MRM algorithm. Examples demonstrate the functionality of our method.
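One plausible way to represent the organization described above, a base mesh plus a directed dependency graph of edge-split records, is sketched below; the class and field names are invented for illustration and are not taken from the paper.

```python
# Sketch: an edge-split record and a dependency graph over splits, as one possible
# encoding of a progressive multiresolution mesh (field names are illustrative only).
from dataclasses import dataclass, field

@dataclass
class SplitRecord:
    split_id: int
    vertex_to_expand: int          # vertex produced earlier by an edge contraction
    new_vertex_pos: tuple          # position of the vertex re-introduced by the split
    incident_faces: list           # faces that must be re-created around the new edge
    depends_on: set = field(default_factory=set)   # splits that must be applied first

class MultiresolutionModel:
    def __init__(self, base_mesh):
        self.base_mesh = base_mesh
        self.splits = {}                       # split_id -> SplitRecord

    def add_split(self, record: SplitRecord):
        self.splits[record.split_id] = record

    def refine(self, split_id, applied=None):
        """Apply a split after recursively applying every split it depends on."""
        applied = set() if applied is None else applied
        if split_id in applied:
            return applied
        for dep in self.splits[split_id].depends_on:
            self.refine(dep, applied)
        # ...the actual vertex expansion of the mesh would be performed here...
        applied.add(split_id)
        return applied

# Toy usage: split 2 depends on split 1 having been applied first.
m = MultiresolutionModel(base_mesh="coarse mesh placeholder")
m.add_split(SplitRecord(1, vertex_to_expand=7, new_vertex_pos=(0, 0, 1), incident_faces=[3, 4]))
m.add_split(SplitRecord(2, vertex_to_expand=9, new_vertex_pos=(1, 0, 1), incident_faces=[5], depends_on={1}))
print("applied splits:", m.refine(2))
```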
More and more web pages are applying AJAX (Asynchronous JavaScript and XML) due to its rich interactivity and incremental communication. It is observed that AJAX contents, which cannot be seen by traditional crawlers, are generally well-structured and belong to one specific domain. Extracting the structured data from AJAX contents and annotating their semantics are very significant for further applications. In this paper, a structured AJAX data extraction method for the agricultural domain based on an agricultural ontology is proposed. Firstly, Crawljax, an open-source AJAX crawling tool, was extended to explore and retrieve the AJAX contents; secondly, the retrieved contents were partitioned into items and then classified by combining them with the agricultural ontology. HTML tags and punctuation were used to segment the retrieved contents into entity items. Finally, the entity items were clustered and semantic annotations were assigned to the clustering results according to the agricultural ontology. Experimental evaluation showed that the proposed approach is effective in resource exploration, entity extraction, and semantic annotation.
In this paper, a new concept called the numerical structure of seismic data is introduced, and the difference between the numerical structure and the numerical values of seismic data is explained. Our study shows that the numerical seismic structure is closely related to oil- and gas-bearing reservoirs, so it is very useful for a geologist or a geophysicist to precisely interpret the oil-bearing layers from the seismic data. This technology can be applied at any exploration or production stage. The new method has been tested on a series of exploratory or development wells and proved to be reliable in China. Hydrocarbon detection with this new method for 39 exploration wells on 25 structures indicates a success ratio of over 80 percent. The new method of hydrocarbon prediction can be applied to: (1) depositional environments of reservoirs with marine facies, delta facies, or non-marine facies (including fluvial facies and lacustrine facies); (2) sedimentary rocks of reservoirs that are non-marine clastic rocks and carbonate rocks; and (3) burial depths ranging from 300 m to 7000 m, with a minimum reservoir thickness of over 8 m (main frequency about 50 Hz).
In order to improve the quality of web search, a new query expansion method that chooses meaningful structured data from a domain database is proposed. It categorizes attributes into three different classes, namely concept attributes, context attributes, and meaningless attributes, according to their semantic features, which are document frequency features and distinguishing-capability features. It also defines the semantic relevance between two attributes when they have correlations in the database. It then proposes a trie-bitmap structure and pair pointer tables to implement efficient algorithms for discovering attribute semantic features and detecting their semantic relevances. By using semantic attributes and their semantic relevances, expansion words can be generated and embedded into a vector space model with interpolation parameters. The experiments use an IMDB movie database and real text collections to evaluate the proposed method by comparing its performance with a classical vector space model. The results show that the proposed method can improve text search efficiently and that the discovered semantic features and semantic relevances have good separation capabilities.
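As a generic illustration of the expansion step described above (not the paper's trie-bitmap algorithm), the sketch below mixes an original query vector with a vector built from assumed expansion words using an interpolation parameter, in the spirit of embedding expansion words into a vector space model; vocabulary, weights, and the expansion source are all toy assumptions.

```python
# Sketch: embedding expansion words into a vector space model with an interpolation
# parameter lambda. Vocabulary, query, and expansion terms are toy placeholders.
import numpy as np

vocabulary = ["comedy", "director", "paris", "romance", "thriller"]
index = {term: i for i, term in enumerate(vocabulary)}

def to_vector(terms):
    v = np.zeros(len(vocabulary))
    for t in terms:
        if t in index:
            v[index[t]] += 1.0
    return v / (np.linalg.norm(v) or 1.0)

query_terms = ["romance", "paris"]
# Expansion words would come from semantically relevant database attributes
# (e.g. genre or location values); here they are simply assumed.
expansion_terms = ["comedy", "romance"]

lam = 0.7   # interpolation parameter: weight kept by the original query
expanded = lam * to_vector(query_terms) + (1 - lam) * to_vector(expansion_terms)

doc = to_vector(["romance", "comedy", "director"])
print("similarity with expanded query:", float(np.dot(expanded, doc)))
```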
Seismic data structure characteristics refer to the waveform character arranged in time sequence at discrete data points in each 2-D or 3-D seismic trace. Hydrocarbon prediction using seismic data structure characteristics is a new reservoir prediction technique. When the main pay interval is in carbonate fracture and fissure-cavern type reservoirs with very strong inhomogeneity, hydrocarbon prediction presents some difficulties. Because of the special geological conditions of the eighth zone of the Tahe oil field, we applied seismic data structure characteristics to hydrocarbon prediction for the Ordovician reservoir in this zone. We divided the oil zone of the area into favorable and unfavorable blocks. Eighteen well locations were proposed and drilled in the favorable oil block and recovered higher output of oil and gas.
Efficient methods for incorporating engineering experience into the intelligent generation and optimization of shear wall structures are lacking, hindering the assessment and enhancement of intelligent design performance. This study introduces an assessment method for the intelligent design and optimization of shear wall structures that effectively combines mechanical analysis with the formulaic encoding of empirical rules. First, the critical information about the structure was extracted through data structuring. Second, an empirical rule assessment method was developed based on the engineer's experience and design standards to complete a preliminary assessment and screening of the structure. Subsequently, an assessment method based on mechanical performance and material consumption was used to compare different structural schemes comprehensively. Finally, the assessment effectiveness was demonstrated using a typical case. Compared with traditional assessment methods, the proposed method is more comprehensive and significantly more efficient, promoting the intelligent transformation of structural design.
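A highly simplified sketch of the two-stage assessment flow summarized above, empirical-rule screening followed by a comparison based on mechanical performance and material consumption, is given below; all thresholds, metric names, and weights are invented placeholders rather than values from the study.

```python
# Sketch: rule-based screening of candidate shear-wall schemes, then ranking the
# survivors by a weighted score. All thresholds and weights are illustrative only.
from dataclasses import dataclass

@dataclass
class Scheme:
    name: str
    max_drift_ratio: float      # from mechanical analysis (smaller is better)
    period_ratio: float         # e.g. torsional / translational period
    concrete_volume_m3: float   # material consumption (smaller is better)

def passes_empirical_rules(s: Scheme) -> bool:
    # Placeholder encodings of "engineer's experience / design standard" rules.
    return s.max_drift_ratio <= 1 / 1000 and s.period_ratio <= 0.9

def score(s: Scheme, w_mech=0.6, w_material=0.4) -> float:
    # Lower drift and lower material use both increase the score.
    return w_mech * (1 / 1000) / s.max_drift_ratio + w_material * 500.0 / s.concrete_volume_m3

candidates = [
    Scheme("A", 1 / 1200, 0.85, 620.0),
    Scheme("B", 1 / 900,  0.80, 540.0),   # screened out by the drift rule
    Scheme("C", 1 / 1500, 0.88, 700.0),
]

feasible = [s for s in candidates if passes_empirical_rules(s)]
best = max(feasible, key=score)
print("feasible:", [s.name for s in feasible], "| preferred scheme:", best.name)
```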
Aiming to increase the efficiency of gem design and manufacturing, a new method for the computer-aided design (CAD) of convex faceted gem cuts (CFGC) based on the half-edge data structure (HDS), including the algorithms for its implementation, is presented in this work. Using object-oriented methods, the geometrical elements of CFGC are classified and corresponding geometrical feature classes are established. Each class is implemented and embedded based on the gem-cutting process. Matrix arithmetic and analytical geometry are used to derive the affine transformation and the cutting algorithm. Based on the demand for a diversity of gem cuts, CAD functions for both free-style faceted cuts and parametric designs of typical cuts, as well as visualization and human-computer interaction of the CAD system, including two-dimensional and three-dimensional interactions, have been realized, which enhances the flexibility and universality of the CAD system. Furthermore, data in this CAD system can also be used directly by the gem CAM module, which will promote gem CAD/CAM integration.
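The half-edge data structure (HDS) named above is a standard mesh representation; the sketch below shows a generic minimal version with origin, twin, and next pointers plus a face-loop traversal, not the feature classes actually implemented in the gem CAD system.

```python
# Sketch: a minimal half-edge data structure with a face-loop traversal.
# This is the generic textbook structure, not the gem-CAD implementation.
class Vertex:
    def __init__(self, x, y, z):
        self.pos = (x, y, z)
        self.outgoing = None          # one half-edge leaving this vertex

class HalfEdge:
    def __init__(self, origin):
        self.origin = origin          # vertex the half-edge starts at
        self.twin = None              # oppositely oriented half-edge
        self.next = None              # next half-edge around the same face
        self.face = None

class Face:
    def __init__(self, edge):
        self.edge = edge              # any half-edge bounding this face

    def vertices(self):
        """Walk the next-pointers once around the face loop."""
        e, out = self.edge, []
        while True:
            out.append(e.origin.pos)
            e = e.next
            if e is self.edge:
                return out

# Build one triangular facet.
v = [Vertex(0, 0, 0), Vertex(1, 0, 0), Vertex(0, 1, 0)]
edges = [HalfEdge(v[0]), HalfEdge(v[1]), HalfEdge(v[2])]
for i, e in enumerate(edges):
    e.next = edges[(i + 1) % 3]
face = Face(edges[0])
for e in edges:
    e.face = face
print("facet loop:", face.vertices())
```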
Major interactions are known to trigger star formation in galaxies and alter their color. We study major interactions in filaments and sheets using SDSS data to understand the influence of large-scale environments on galaxy interactions. We identify the galaxies in filaments and sheets using the local dimension and also find the major pairs residing in these environments. The star formation rate (SFR) and color of the interacting galaxies as a function of pair separation are analyzed separately in filaments and sheets. The analysis is repeated for three volume-limited samples covering different magnitude ranges. The major pairs residing in the filaments show a significantly higher SFR and bluer color than those residing in the sheets up to a projected pair separation of ~50 kpc. We observe a complete reversal of this behavior for both the SFR and color of the galaxy pairs having a projected separation larger than 50 kpc. Some earlier studies report that galaxy pairs align with the filament axis. Such alignment inside filaments indicates anisotropic accretion that may cause these differences. We do not observe these trends in the brighter galaxy samples. The pairs in filaments and sheets from the brighter galaxy samples trace relatively denser regions in these environments. The absence of these trends in the brighter samples may be explained by the dominant effect of the local density over the effects of the large-scale environment.
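The core comparison in this abstract, the SFR of pairs as a function of projected separation split by environment, can be illustrated with a generic binning sketch; the pair catalogue below is random toy data, the 50 kpc bin edge is simply taken from the abstract, and only the grouping pattern, not the published analysis, is shown.

```python
# Sketch: median SFR of galaxy pairs vs projected separation, split by environment.
# The "catalogue" is random toy data; only the binning/grouping pattern is illustrated.
import numpy as np

rng = np.random.default_rng(1)
n = 500
separation_kpc = rng.uniform(5, 150, n)                 # projected pair separation
environment = rng.choice(["filament", "sheet"], n)      # from a local-dimension classification
sfr = rng.lognormal(mean=0.0, sigma=0.5, size=n)        # star formation rate (arbitrary units)

bins = np.array([0, 25, 50, 100, 150])                  # kpc
for env in ("filament", "sheet"):
    sel = environment == env
    idx = np.digitize(separation_kpc[sel], bins)
    medians = [np.median(sfr[sel][idx == b]) if np.any(idx == b) else np.nan
               for b in range(1, len(bins))]
    print(env, "median SFR per separation bin:", np.round(medians, 2))
```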
With the increasing number of digital devices generating vast amounts of video data, the recognition of abnormal image patterns has become more important. Accordingly, it is necessary to develop a method that achieves this task using the object and behavior information within video data. Existing methods for detecting abnormal behaviors focus only on simple motions and therefore cannot determine the overall behavior occurring throughout a video. In this study, an abnormal behavior detection method that uses deep learning (DL)-based video-data structuring is proposed. Objects and motions are first extracted from continuous images by combining existing DL-based image analysis models. The weight of the continuous data pattern is then analyzed through data structuring to classify the overall video. The performance of the proposed method was evaluated using varying parameter settings, such as the size of the action clip and the interval between action clips. The model achieved an accuracy of 0.9817, indicating excellent performance. Therefore, we conclude that the proposed data structuring method is useful in detecting and classifying abnormal behaviors.
Joint inversion is one of the most effective methods for reducing non-uniqueness in geophysical inversion. Current joint inversion methods can be divided into structural consistency constraint methods and petrophysical consistency constraint methods, which are mutually independent. There is therefore a need for joint inversion methods that can comprehensively consider both structural consistency constraints and petrophysical consistency constraints. This paper develops the structural similarity index (SSIM) as a new structural and petrophysical consistency constraint for the joint inversion of gravity and vertical gradient data. The SSIM constraint is in the form of a fraction, which may have analytical singularities. Therefore, converting the fractional form to a subtractive form solves the problem of analytic singularity and finally yields a modified structural consistency index for the joint inversion, which enhances the stability of the SSIM constraint applied to the joint inversion. Compared with the reconstructed results from cross-gradient inversion, the proposed method presents good performance and stability. The SSIM algorithm is a new joint inversion method with petrophysical and structural constraints. It can promote the consistency of the recovered models in both the distribution and the structure of the physical property values. Applications to synthetic data illustrate that the algorithm proposed in this paper can process the synthetic data well and acquire good reconstructed results.
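For orientation, the sketch below evaluates the usual fractional form of SSIM between two model windows and a subtractive rearrangement of the same terms; the subtractive form is only one plausible reading of the modified structural consistency index described in the abstract (moving numerator and denominator into a difference so there is no denominator to vanish), and the windows and constants are toy values.

```python
# Sketch: SSIM between two model windows in its usual fractional form, and a
# subtractive rearrangement of the same terms that avoids the division
# (one possible reading of the "modified structural consistency index").
import numpy as np

def ssim_terms(x, y, c1=1e-4, c2=1e-4):
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    numerator = (2 * mx * my + c1) * (2 * cov + c2)
    denominator = (mx**2 + my**2 + c1) * (vx + vy + c2)
    return numerator, denominator

x = np.array([[1.0, 1.2], [0.9, 1.1]])      # window from model 1 (e.g. density)
y = np.array([[2.0, 2.4], [1.8, 2.2]])      # window from model 2 (e.g. gradient-data model)

num, den = ssim_terms(x, y)
print("fractional SSIM:", num / den)         # close to 1 when structure/values agree
print("subtractive misfit:", den - num)      # non-negative; minimized instead of divided
```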
In conjunction with association rules for data mining, the connections between testing indices and strong and weak association rules were determined, and new derivative rules were obtained by further reasoning. Association rules were used to analyze correlation and check consistency between indices. This study shows that the judgment obtained from weak association rules or non-association rules is more accurate and more credible than that obtained from strong association rules. When the testing grades of two indices in a weak association rule are inconsistent, the testing grades of the indices are more likely to be erroneous, and the mistakes are often caused by human factors. Clustering data mining technology was used to analyze the reliability of a diagnosis, or to perform health diagnosis directly. The analysis showed that the clustering results are related to the indices selected, and that if the indices selected are more significant, the characteristics of the clustering results are also more significant, and the analysis or diagnosis is more credible. The indices and the diagnosis analysis function produced by this study provide a necessary theoretical foundation and new ideas for the development of hydraulic metal structure health diagnosis technology.
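As a generic illustration of the association-rule machinery mentioned above (not the study's actual mining run), the sketch below computes the support and confidence of a rule between two testing indices from fabricated inspection records, which is the usual basis for labelling a rule "strong" or "weak" against chosen thresholds; the records and thresholds are arbitrary.

```python
# Sketch: support and confidence of the rule "index A abnormal -> index B abnormal"
# computed from fabricated inspection records; strong/weak thresholds are arbitrary.
records = [
    {"A_abnormal", "B_abnormal"},
    {"A_abnormal", "B_abnormal", "C_abnormal"},
    {"A_abnormal"},
    {"B_abnormal"},
    {"A_abnormal", "B_abnormal"},
    set(),
]

def support(itemset):
    return sum(itemset <= r for r in records) / len(records)

def confidence(antecedent, consequent):
    return support(antecedent | consequent) / support(antecedent)

rule = ({"A_abnormal"}, {"B_abnormal"})
sup, conf = support(rule[0] | rule[1]), confidence(*rule)
kind = "strong" if sup >= 0.4 and conf >= 0.7 else "weak"
print(f"support={sup:.2f}, confidence={conf:.2f} -> {kind} association rule")
```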