Attention is a scarce resource in decentralized autonomous organizations (DAOs), as their self-governance relies heavily on the attention-intensive decision-making process of "proposal and voting". To prevent the negative effects of proposers' attention-capturing strategies that contribute to the "tragedy of the commons", and to ensure an efficient distribution of attention among multiple proposals, it is necessary to establish a market-driven allocation scheme for DAOs' attention. First, Harberger tax-based attention markets are designed to facilitate attention allocation via continuous and automated trading, where an individualized Harberger tax rate (HTR) determined by the proposers' reputation is adopted. Then, a Stackelberg game model is formulated in these markets, casting attention owners in the role of leaders and other competitive proposers as followers. Its equilibrium trading strategies are also discussed to unravel the intricate dynamics of attention pricing. Moreover, using the single-round Stackelberg game as an illustrative example, the existence of Nash equilibrium trading strategies is demonstrated. Finally, the impact of the individualized HTR on trading strategies is investigated, and the results suggest that it is negatively correlated with leaders' self-assessed prices and ownership duration, but its effect on their revenues varies under different conditions. This study is expected to provide valuable insights into leveraging attention resources to improve DAOs' governance and decision-making processes.
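A toy single-round version of such a game can illustrate the reported negative relationship between the tax rate and the leader's self-assessed price. The uniform follower valuation, the owner's retained value `v_own`, and the grid search below are illustrative assumptions, not the paper's actual model:

```python
# Hypothetical sketch: an attention owner (leader) posts a self-assessed
# price p and pays Harberger tax tau * p; a follower with valuation
# v ~ U(0, 1) buys iff v > p. The leader's expected utility is
#   U(p) = p * P(v > p) + v_own * P(v <= p) - tau * p
# and we locate the maximizing p by a simple grid search.
def leader_best_price(v_own: float, tau: float, steps: int = 1000) -> float:
    best_p, best_u = 0.0, float("-inf")
    for i in range(steps + 1):
        p = i / steps
        u = p * (1 - p) + v_own * p - tau * p  # expected utility U(p)
        if u > best_u:
            best_p, best_u = p, u
    return best_p

# A higher individualized tax rate pushes the self-assessed price down,
# mirroring the negative correlation described above.
p_low_tax = leader_best_price(v_own=0.4, tau=0.05)   # analytic optimum 0.675
p_high_tax = leader_best_price(v_own=0.4, tau=0.30)  # analytic optimum 0.550
assert p_high_tax < p_low_tax
```

For this quadratic utility the optimum is p = (1 + v_own − tau)/2, so the grid search can be checked against the closed form.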
Centralized storage and identity identification methods pose many risks, including hacker attacks, data misuse, and single points of failure. Additionally, existing centralized identity management methods face interoperability issues and rely on a single identity provider, leaving users without control over their identities. Therefore, this paper proposes a mechanism for identity identification and data sharing based on decentralized identifiers. The scheme utilizes blockchain technology to store the identifiers and data hashes on the chain to ensure permanent identity recognition and data integrity. Data is stored on the InterPlanetary File System (IPFS) to avoid the risk of single points of failure and to enhance data persistence and availability. At the same time, compliance with World Wide Web Consortium (W3C) standards for decentralized identifiers and verifiable credentials increases the mechanism's scalability and interoperability.
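The "hash on-chain, data off-chain" pattern described here can be sketched in a few lines. The in-memory `ledger` dictionary below is a hypothetical stand-in for blockchain storage, and no real IPFS node is involved:

```python
import hashlib
import json

ledger = {}  # stand-in for on-chain storage: DID -> SHA-256 hash of the data

def register(did: str, payload: bytes) -> str:
    """Anchor only the content hash 'on-chain'; the payload stays off-chain."""
    digest = hashlib.sha256(payload).hexdigest()
    ledger[did] = digest
    return digest

def verify(did: str, payload: bytes) -> bool:
    """Integrity check: recompute the off-chain data's hash and compare."""
    return ledger.get(did) == hashlib.sha256(payload).hexdigest()

# A minimal W3C-style DID document as the off-chain payload.
doc = json.dumps({"@context": "https://www.w3.org/ns/did/v1",
                  "id": "did:example:123"}).encode()
register("did:example:123", doc)
assert verify("did:example:123", doc)
assert not verify("did:example:123", b"tampered payload")
```

Because only the fixed-size digest is anchored, any later modification of the off-chain data is detectable without storing the data itself on the chain.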
In permissioned blockchain networks, the Proof of Authority (PoA) consensus, which uses the election of authorized nodes to validate transactions and blocks, has been widely advocated thanks to its high transaction throughput and fault tolerance. However, PoA suffers from the drawback of centralization dominated by a limited number of authorized nodes and from a lack of anonymity due to the round-robin block proposal mechanism. As a result, traditional PoA is vulnerable to a single point of failure that compromises the security of the blockchain network. To address these issues, we propose a novel decentralized reputation management mechanism for permissioned blockchain networks to enhance security, promote liveness, and mitigate centralization while retaining the same throughput as traditional PoA. This paper aims to design an off-chain reputation evaluation and an on-chain reputation-aided consensus. First, we evaluate the nodes' reputation in the context of the blockchain networks and make the reputation globally verifiable through smart contracts. Second, building upon traditional PoA, we propose a reputation-aided PoA (rPoA) consensus to enhance security without sacrificing throughput. In particular, rPoA can incentivize nodes to autonomously form committees based on reputation authority, which prevents block generation from being tracked through the randomness of reputation variation. Moreover, we develop a reputation-aided fork-choice rule for rPoA to promote the network's liveness. Finally, experimental results show that the proposed rPoA achieves higher security performance while retaining transaction throughput compared to traditional PoA.
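One way to picture reputation-based committee formation is weighted sampling without replacement, where a public seed (e.g. derived from a recent block hash) keeps the draw verifiable yet hard to predict in advance. This is a generic sketch, not the rPoA protocol itself:

```python
import random

def select_committee(reputation: dict, k: int, seed: int) -> list:
    """Draw k distinct nodes, each pick weighted by current reputation.

    'seed' is assumed to come from public randomness (e.g. a block hash),
    so anyone can re-run the selection and verify the committee.
    """
    rng = random.Random(seed)
    pool = list(reputation.keys())
    weights = [reputation[n] for n in pool]
    chosen = []
    for _ in range(min(k, len(pool))):
        idx = rng.choices(range(len(pool)), weights=weights, k=1)[0]
        chosen.append(pool.pop(idx))
        weights.pop(idx)
    return chosen

rep = {"A": 9.0, "B": 5.0, "C": 1.0, "D": 0.5}  # toy reputation scores
committee = select_committee(rep, k=2, seed=42)
assert len(committee) == 2 and len(set(committee)) == 2
```

Because reputation scores change between rounds, the effective sampling weights (and hence the likely committee) shift unpredictably, which is the intuition behind making block generation hard to track.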
In Decentralized Machine Learning (DML) systems, system participants contribute their resources to assist others in developing machine learning solutions. Identifying malicious contributions in DML systems is challenging, which has led to the exploration of blockchain technology. Blockchain leverages its transparency and immutability to record the provenance and reliability of training data. However, storing massive datasets or implementing model evaluation processes on smart contracts incurs high computational costs. Additionally, current research on preventing malicious contributions in DML systems primarily focuses on protecting models from being exploited by workers who contribute incorrect or misleading data. Less attention has been paid to the scenario where malicious requesters intentionally manipulate test data during evaluation to gain an unfair advantage. This paper proposes a transparent and accountable training data sharing method that securely shares data among potentially malicious system participants. First, we introduce a blockchain-based DML system architecture that supports secure training data sharing through the IPFS network. Second, we design a blockchain smart contract that transparently splits shared datasets into training and test sets without involving system participants. Under this system, transparent and accountable training data sharing can be achieved with attribute-based proxy re-encryption. We present a security analysis of the system and conduct experiments on the Ethereum and IPFS platforms to show its feasibility and practicality.
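The key property of a contract-driven split is that no participant chooses which items land in the test set, yet anyone can reproduce the split. A minimal sketch of that idea, using a public value such as a block hash as the shuffle seed (the function and ratio are illustrative assumptions, not the paper's contract):

```python
import hashlib
import random

def transparent_split(item_ids, block_hash: str, test_ratio: float = 0.2):
    """Deterministic, auditable train/test split driven by public randomness.

    The seed is derived from 'block_hash', assumed to be an on-chain value
    no participant could have chosen, so the split is unbiased and anyone
    can recompute and verify it.
    """
    seed = int(hashlib.sha256(block_hash.encode()).hexdigest(), 16)
    order = sorted(item_ids)            # canonical order before shuffling
    random.Random(seed).shuffle(order)
    cut = int(len(order) * (1 - test_ratio))
    return order[:cut], order[cut:]

ids = [f"img{i}" for i in range(10)]
train_ids, test_ids = transparent_split(ids, "0xabc123")
assert len(train_ids) == 8 and len(test_ids) == 2
# Re-running with the same public value reproduces the identical split.
assert transparent_split(ids, "0xabc123") == (train_ids, test_ids)
```

This removes the requester's ability to steer evaluation data, which is the attack scenario the paper highlights.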
This paper addresses the decentralized optimal control and stabilization problems for interconnected systems subject to asymmetric information. Compared with previous work, a closed-loop optimal solution to the control problem and sufficient and necessary conditions for the stabilization problem of the interconnected systems are given for the first time. The main challenge lies in three aspects: first, the asymmetric information results in coupling between control and estimation and in the failure of the separation principle; second, two extra unknown variables are generated by the asymmetric information (different information filtrations) when solving the forward-backward stochastic difference equations; third, the existence of additive noise makes the study of mean-square boundedness an obstacle. The adopted technique is to prove and assume the linear form of the controllers and to establish the equivalence between the two systems with and without additive noise. A dual-motor parallel drive system is presented to demonstrate the validity of the proposed algorithm.
This paper investigates the anomaly-resistant decentralized state estimation (SE) problem for a class of wide-area power systems which are divided into several non-overlapping areas connected through transmission lines. Two classes of measurements (i.e., local measurements and edge measurements) are obtained, respectively, from the individual areas and the transmission lines. A decentralized state estimator, whose performance is resistant against measurements with anomalies, is designed based on the minimum error entropy with fiducial points (MEEF) criterion. Specifically, 1) an augmented model, which incorporates the local prediction and the local measurement, is developed by resorting to the unscented transformation approach and the statistical linearization approach; 2) using the augmented model, an MEEF-based cost function is designed that reflects the local prediction errors of the state and the measurement; and 3) the local estimate is first obtained by minimizing the MEEF-based cost function through a fixed-point iteration and then updated by using the edge measurement information. Finally, simulation experiments with three scenarios are carried out on the IEEE 14-bus system to illustrate the validity of the proposed anomaly-resistant decentralized SE scheme.
The aim of this paper is to broaden the application of the Stochastic Configuration Network (SCN) in the semi-supervised domain by utilizing the unlabeled data that is abundant in daily life. Doing so can enhance the classification accuracy of decentralized SCN algorithms while effectively protecting user privacy. To this end, we propose a decentralized semi-supervised learning algorithm for SCN, called DMT-SCN, which introduces teacher and student models by combining the idea of consistency regularization to improve the response speed of model iterations. In order to reduce the possible negative impact of unsupervised data on the model, we deliberately change the way noise is added to the unlabeled data. Simulation results show that the algorithm can effectively utilize unlabeled data to improve the classification accuracy of SCN training and is robust under different simulation environments.
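In the usual teacher-student consistency-regularization setup, the teacher's weights track an exponential moving average (EMA) of the student's, and a consistency penalty pulls their predictions on the same unlabeled input together. A generic sketch of those two ingredients (the weight lists and constants are toy values, not DMT-SCN's actual update):

```python
def ema_update(teacher_w, student_w, alpha: float = 0.99):
    """Teacher <- alpha * teacher + (1 - alpha) * student, per weight."""
    return [alpha * t + (1 - alpha) * s for t, s in zip(teacher_w, student_w)]

def consistency_loss(teacher_out, student_out):
    """Mean squared difference between the two models' predictions
    on the same (typically differently-noised) unlabeled input."""
    n = len(teacher_out)
    return sum((t - s) ** 2 for t, s in zip(teacher_out, student_out)) / n

teacher = [0.0, 0.0]
student = [1.0, -1.0]
teacher = ema_update(teacher, student)  # teacher drifts slowly toward student
```

The slow EMA makes the teacher a smoothed, more stable target, which is why adding noise to the unlabeled inputs (rather than to the targets) tends to behave well.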
The COVID-19 pandemic has devastated our daily lives, leaving horrific repercussions in its aftermath. Due to its rapid spread, it was quite difficult for medical personnel to diagnose it in such large numbers. Patients who test positive for COVID-19 are diagnosed via a nasal PCR test. Polymerase chain reaction (PCR) findings take a few hours to a few days. The PCR test is expensive, although the government may bear the expense in certain places. Furthermore, subsets of the population resist invasive testing like swabs. Therefore, chest X-rays or Computed Tomography (CT) scans are preferred in most cases; more importantly, they are non-invasive, inexpensive, and provide a faster response time. Recent advances in Artificial Intelligence (AI), in combination with state-of-the-art methods, have allowed for the diagnosis of COVID-19 using chest X-rays. This article proposes a method for classifying COVID-19 as positive or negative on a decentralized dataset based on a federated learning scheme. In order to build a progressive global COVID-19 classification model, two edge devices are employed to train the model on their respective localized datasets, and a 3-layered custom Convolutional Neural Network (CNN) model, which can be deployed from the server, is used in the training process. The two edge devices then communicate their learned parameters and weights to the server, where they are aggregated into an updated global model. The proposed model is trained using an image dataset available on Kaggle. The Kaggle collection contains more than 13,000 X-ray images, from which 9,000 images of normal and COVID-19-positive cases are used. Each edge node possesses a different number of images: edge node 1 has 3,200 images, while edge node 2 has 5,800. There is no association between the datasets of the various nodes in the network; each node has access to a separate image collection with no correlation to the others. The diagnosis of COVID-19 has become considerably more efficient with the proposed algorithm and dataset, and the findings we have obtained are quite encouraging.
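The server-side aggregation step can be sketched in FedAvg style, with each node's contribution weighted by its local sample count (3,200 vs. 5,800 images here). Whether the paper uses exactly this weighting is an assumption; the "weights" below are toy scalars standing in for CNN parameters:

```python
def aggregate(weight_sets, sample_counts):
    """Weighted average of per-node parameter vectors (FedAvg-style):
    global_w[i] = sum_k( w_k[i] * n_k ) / sum_k( n_k )."""
    total = sum(sample_counts)
    n_params = len(weight_sets[0])
    return [sum(w[i] * n / total for w, n in zip(weight_sets, sample_counts))
            for i in range(n_params)]

node1 = [1.0, 2.0]   # toy "weights" from edge node 1 (3,200 images)
node2 = [2.0, 4.0]   # toy "weights" from edge node 2 (5,800 images)
global_w = aggregate([node1, node2], [3200, 5800])
# First parameter: (1*3200 + 2*5800) / 9000 = 14800 / 9000
assert abs(global_w[0] - 14800 / 9000) < 1e-9
```

Weighting by sample count keeps the larger node from being diluted, which matters precisely because the two nodes hold unequal, uncorrelated collections.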
Straw return is a promising strategy for managing soil organic carbon (SOC) and improving yield stability. However, the optimal straw return strategy for sustainable crop production in the wheat (Triticum aestivum L.)-cotton (Gossypium hirsutum L.) cropping system remains uncertain. The objective of this study was to quantify the long-term (10-year) impact of carbon (C) input on SOC sequestration, soil aggregation and crop yields in a wheat-cotton cropping system in the Yangtze River Valley, China. Five treatments were arranged in a single-factor randomized design as follows: no straw return (Control), return of wheat straw only (Wt), return of cotton straw only (Ct), return of 50% wheat and 50% cotton straw (Wh-Ch), and return of 100% wheat and 100% cotton straw (Wt-Ct). In comparison to the Control, the SOC content increased by 8.4 to 20.2% under straw return. A significant positive linear correlation between SOC sequestration and C input (1.42-7.19 Mg ha^(-1) yr^(-1)) (P<0.05) was detected. The percentages of aggregates of sizes >2 and 1-2 mm at the 0-20 cm soil depth were also significantly elevated under straw return, with the greatest increase in aggregate stability in the Wt-Ct treatment (28.1%). The average wheat yields increased by 12.4-36.0% and cotton yields by 29.4-73.7%, and significant positive linear correlations were also detected between C input and the yields of wheat and cotton. The average sustainable yield index (SYI) reached a maximum value of 0.69 when the C input was 7.08 Mg ha^(-1) yr^(-1), which was close to the maximum value (SYI of 0.69, C input of 7.19 Mg ha^(-1) yr^(-1)) in the Wt-Ct treatment. Overall, the return of both wheat and cotton straw was the best strategy for improving SOC sequestration, soil aggregation, yields and their sustainability in the wheat-cotton rotation system.
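The sustainable yield index is commonly computed in long-term trials as SYI = (mean yield − standard deviation) / maximum yield; whether this study uses exactly that form is an assumption. A minimal sketch with a hypothetical ten-year yield series:

```python
import statistics

def syi(yields):
    """Sustainable yield index: (mean - stdev) / max, on one treatment's
    multi-year yield series. Values near 1 indicate high, stable yields."""
    return (statistics.mean(yields) - statistics.stdev(yields)) / max(yields)

# Toy ten-year wheat yield series, Mg ha^-1 (hypothetical numbers):
treatment = [4.1, 4.3, 4.0, 4.5, 4.4, 4.2, 4.6, 4.3, 4.1, 4.5]
assert 0 < syi(treatment) < 1
```

The index rewards both a high mean and low year-to-year variability, which is why it is a natural summary of "yields and their sustainability" in the comparison above.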
Parkinson's disease (PD), a prevalent neurodegenerative disorder, is characterized by the loss of dopaminergic neurons and the aggregation of α-synuclein protein into Lewy bodies. While the current standards of therapy have been successful in providing some symptom relief, they fail to address the underlying pathophysiology of PD and, as a result, have no effect on disease progression.
Red blood cells (RBCs) are the most abundant human blood cells. RBC aggregation and deformation strongly determine blood viscosity, which impacts hemorheology and microcirculation. In turn, RBC properties depend on different endogenous and exogenous factors. One such factor is nitric oxide (NO), which is mainly produced by endothelial cells (ECs) from the amino acid L-arginine in the circulatory system. Since the mechanisms of the RBC-endothelium interplay are not clear to date, and considering their possible clinical importance, the aims of this study are to investigate in vitro: (1) the effect of L-arginine-induced NO on RBC aggregation and adhesion to endothelium; (2) the NO effect on RBC aggregation and deformation induced by L-arginine and sodium nitroprusside without the presence of endothelium in the samples. The RBC aggregation and adhesion to a monolayer of ECs were studied using optical tweezers (OT). The RBC deformability and aggregation without endothelium in the samples were studied using the flow chamber method and a Myrenne aggregometer. We confirmed that NO increases the deformability and decreases the aggregation of RBCs. We showed that the soluble guanylate cyclase pathway appears to be the only NO signaling pathway involved. In the samples with endothelium, a "bell-shaped" dependence of the RBC aggregation force on L-arginine concentration was observed, which improves our knowledge about the process of NO production by the endothelium. Additionally, data related to L-arginine accumulation by the endothelium were obtained: the necessity of the presence of extracellular L-arginine stated by other authors was put into question. In our study, NO decreased RBC-endothelium adhesion; however, the tendency appeared to be weak and was not confirmed in another set of experiments. To our knowledge, this is the first attempt to measure the forces of RBC adhesion to an endothelium monolayer with OT.
In recent times, technology has advanced significantly and is currently being integrated into educational environments to facilitate distance learning and interaction between learners. Integrating the Internet of Things (IoT) into education can facilitate the teaching and learning process and expand the context in which students learn. Nevertheless, learning data is very sensitive and must be protected when transmitted over the network or stored in data centers. Moreover, the identity and authenticity of interacting students, instructors, and staff need to be verified to mitigate the impact of attacks. However, most current security and authentication schemes are centralized, relying on trusted third-party cloud servers to facilitate continuous secure communication. In addition, most of these schemes are resource-intensive; thus, security and efficiency issues arise when heterogeneous and resource-limited IoT devices are being used. In this paper, we propose a blockchain-based architecture that accurately identifies and authenticates learners and their IoT devices in a decentralized manner and prevents the unauthorized modification of stored learning records in a distributed university network. It allows students and instructors to easily migrate to and join multiple universities within the network using their identity, without the need for user re-authentication. The proposed architecture was tested using a simulation tool and measured to evaluate its performance. The simulation results demonstrate the ability of the proposed architecture to significantly increase the throughput of learning transactions (40%), reduce the communication overhead and response time (26%), improve authentication efficiency (27%), and reduce IoT power consumption (35%) compared to centralized authentication mechanisms. In addition, the security analysis proves the effectiveness of the proposed architecture in resisting various attacks and ensuring the security requirements of learning data in the university network.
Ubiquitous data monitoring and processing with minimal latency is one of the crucial challenges in real-time and scalable applications. The Internet of Things (IoT), fog computing, edge computing, cloud computing, and the edge of things are the spine of all real-time and scalable applications. Accordingly, this study proposes a novel framework for real-time and scalable applications that change dynamically with time. In this study, IoT deployment is recommended for data acquisition. Pre-processing of data with local edge and fog nodes is implemented. A threshold-oriented data classification method is deployed to improve the performance of the intrusion detection mechanism. Machine-learning-empowered intelligent algorithms are employed in a distributed manner to enhance the overall response rate of the layered framework. The placement of respondent nodes near the framework's IoT layer minimizes the network's latency. For an economical evaluation of the proposed framework with minimal effort, the EdgeCloudSim and FogNetSim++ simulation environments are deployed in this study. The experimental results confirm the robustness of the proposed system through its improved threshold-oriented data classification and intrusion detection approach, improved response rate, and prediction mechanism. Moreover, the proposed layered framework provides a robust solution for real-time and scalable applications that change dynamically with time.
Nonorthogonal Multiple Access (NOMA) is incorporated into wireless network systems to achieve better connectivity, spectral and energy effectiveness, a higher data transfer rate, and a high quality of service (QoS). In order to improve throughput and minimize latency, a Multivariate Renkonen Regressive Weighted Preference Bootstrap Aggregation based Nonorthogonal Multiple Access (MRRWPBA-NOMA) technique is introduced for network communication. In the downlink transmission, each mobile device's resources and characteristics, such as energy, bandwidth, and trust, are measured. Then, Weighted Preference Bootstrap Aggregation is applied to recognize resource-efficient mobile devices for aware data transmission by constructing different weak hypotheses, i.e., Multivariate Renkonen Regression functions. Based on this classification, resource- and trust-aware devices are selected for transmission. Simulations of the proposed MRRWPBA-NOMA technique and existing methods are carried out with different metrics such as data delivery ratio, throughput, latency, packet loss rate, energy efficiency, and signaling overhead. The simulation results indicate that the proposed MRRWPBA-NOMA outperforms the conventional methods.
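The bootstrap-aggregation (bagging) core of this approach can be sketched generically: fit several weak hypotheses on bootstrap resamples and average their outputs. The paper's Multivariate Renkonen Regression weak learners are replaced here by a trivial mean predictor purely for illustration:

```python
import random
import statistics

def fit_weak(sample):
    """Toy 'weak hypothesis': always predict the sample mean.
    (A stand-in for the paper's regression functions.)"""
    m = statistics.mean(sample)
    return lambda: m

def bagging_predict(data, n_learners: int = 25, seed: int = 0) -> float:
    """Fit n_learners weak models on bootstrap resamples of 'data'
    and aggregate their predictions by averaging."""
    rng = random.Random(seed)
    learners = []
    for _ in range(n_learners):
        boot = [rng.choice(data) for _ in data]  # bootstrap resample
        learners.append(fit_weak(boot))
    return statistics.mean(f() for f in learners)

scores = [0.6, 0.7, 0.65, 0.8, 0.75]  # toy device suitability scores
pred = bagging_predict(scores)
assert min(scores) <= pred <= max(scores)
```

Averaging over resamples reduces the variance of any single weak hypothesis, which is the property bagging contributes regardless of the weak learner chosen.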
The evolution of smart mobile devices has significantly impacted the way we generate and share content and has introduced a huge volume of Internet traffic. To address this issue and take advantage of the short-range communication capabilities of smart mobile devices, the decentralized content sharing approach has emerged as a suitable and promising alternative. Decentralized content sharing uses a peer-to-peer network among co-located smart mobile device users to fulfil content requests. Several articles have been published to date addressing its different aspects, including group management, interest extraction, message forwarding, participation incentives, and content replication. This survey paper summarizes and critically analyzes recent advancements in decentralized content sharing and highlights potential research issues that need further consideration.
Aggregation of species with similar ecological properties is one of the effective methods to simplify food web research. However, species aggregation affects not only the complexity of the modeling process but also the accuracy of the models' outputs. The selection of aggregation methods and the number of trophospecies are the keys to studying the simplification of food webs. In this study, three aggregation methods, including taxonomic aggregation (TA), structural equivalence aggregation (SEA), and self-organizing maps (SOM), were analyzed and compared with the linear inverse model-Markov Chain Monte Carlo (LIM-MCMC) model. The impacts of aggregation methods and trophospecies number on food webs were evaluated based on the robustness and unitlessness of ecological network indices. Results showed that the SEA method performed better than the other two methods in estimating food web structure and function indices. The effects of the aggregation methods were driven by differences in species aggregation principles, which alter food web structure and function through the redistribution of energy flow. According to the results of the mean absolute percentage error (MAPE), which can be applied to evaluate the accuracy of the model, we found that the MAPE of food web indices increases as the number of trophospecies is reduced, and that the MAPE of food web function indices was smaller and more stable than that of food web structure indices. Therefore, the trade-off between simplifying food webs and reflecting the status of the ecosystem should be considered in food web studies. These findings highlight the importance of aggregation methods and trophospecies number in the analysis of food web simplification. This study provides a framework to explore the extent to which food web models are affected by different species aggregations and will provide a scientific basis for the construction of food webs.
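The accuracy comparison above rests on the mean absolute percentage error, MAPE = (100/n) Σ |(aᵢ − pᵢ)/aᵢ|. A minimal implementation, with toy index values (the numbers are illustrative, not the study's data):

```python
def mape(actual, predicted) -> float:
    """Mean absolute percentage error between reference and aggregated
    model outputs, in percent. Assumes no actual value is zero."""
    n = len(actual)
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / n

full_model = [10.0, 20.0, 30.0]   # toy indices from the unaggregated food web
aggregated = [11.0, 19.0, 33.0]   # same indices after species aggregation
# Per-index errors are 10%, 5%, 10%, so MAPE = 25/3 ≈ 8.33%
assert abs(mape(full_model, aggregated) - 25.0 / 3.0) < 1e-9
```

Being a relative, unitless error, MAPE lets structure indices and function indices with very different scales be compared on one footing, which is what the study's robustness comparison requires.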
A simplex centroid design method was employed to design the gradation of recycled coarse aggregate. The bulk density was measured, while the specific surface area and average excess paste thickness were calculated for different gradations. The fluidity, dynamic yield stress, static yield stress, printed width, printed inclination, compressive strength, and ultrasonic wave velocity of 3D-printed recycled aggregate concrete (3DPRAC) were further studied. The experimental results demonstrate that, with an increase in the content of small-sized aggregate (4.75-7 mm), the bulk density initially increases and then decreases, and the specific surface area gradually increases. The average excess paste thickness fluctuates with both the bulk density and the specific surface area. The workability of 3DPRAC is closely related to the average excess paste thickness. With an increase in average paste thickness, there is a gradual decrease in dynamic yield stress, static yield stress and printed inclination, accompanied by an increase in fluidity and printed width. The mechanical performance of 3DPRAC closely correlates with the bulk density. With an increase in bulk density, there is an increase in ultrasonic wave velocity, accompanied by a slight increase in compressive strength and a significant decrease in the anisotropic coefficient. Furthermore, an index for the buildability failure of 3DPRAC based on the average excess paste thickness is proposed.
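For q mixture components, a simplex centroid design contains one run per non-empty subset of components, with equal proportions within the subset (pure vertices, binary blends, and so on up to the overall centroid), giving 2^q − 1 runs. A small generator, assuming the standard textbook construction rather than any study-specific variant:

```python
from itertools import combinations

def simplex_centroid(q: int):
    """Generate the 2^q - 1 mixture points of a simplex centroid design:
    for each non-empty subset of the q components, assign proportion 1/r
    to each of its r members and 0 elsewhere."""
    points = []
    for r in range(1, q + 1):
        for subset in combinations(range(q), r):
            points.append(tuple(1.0 / r if i in subset else 0.0
                                for i in range(q)))
    return points

design = simplex_centroid(3)   # classic 7 runs for three size fractions
assert len(design) == 7
assert all(abs(sum(p) - 1.0) < 1e-12 for p in design)
```

For three aggregate size fractions this yields the three pure gradations, the three 50/50 blends, and the equal-thirds centroid, which is exactly the kind of gradation grid the study's design implies.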
Underwater pulse waveform recognition is an important method for underwater object detection. Most existing works focus on the application of traditional pattern recognition methods, which ignore the time- and space-varying characteristics of sound propagation channels and cannot easily extract valuable waveform features. Sound propagation channels in seawater are time- and space-varying convolutional channels. When extracting the waveform features of underwater acoustic signals, high-accuracy recognition is achieved by eliminating the influence of these time- and space-varying convolutional channels to the greatest extent possible. We propose a hash aggregate discriminative network (HADN), which combines hash learning and deep learning to minimize the time- and space-varying effects of convolutional channels and adaptively learns effective underwater waveform features to achieve high-accuracy underwater pulse waveform recognition. In extracting the hash features of acoustic signals, a discrete constraint between clusters within a hash feature class is introduced. This constraint ensures that the influence of convolutional channels on hash features is minimized. In addition, we design a new loss function called aggregate discriminative loss (AD-loss). The combined use of AD-loss and softmax-loss increases the discriminativeness of the learned hash features. Experimental results show that on pool and ocean datasets, collected in pools and oceans, respectively, using acoustic collectors, the proposed HADN performs better than other comparative models in terms of accuracy and mAP.
Protein aggregation has been linked with many neurodegenerative diseases, such as Alzheimer's disease (AD) and Parkinson's disease. AD belongs to a group of heterogeneous and incurable neurodegenerative disorders collectively known as tauopathies. These comprise frontotemporal dementia, Pick's disease, and corticobasal degeneration, among others. The symptomatology varies with the specific tau protein variant involved and the affected brain region or cell type. However, they share a common neuropathological hallmark: the formation of proteinaceous deposits named neurofibrillary tangles. Neurofibrillary tangles, primarily composed of aggregated tau (Zhang et al., 2022), disrupt normal neuronal functions, leading to cell death and cognitive decline.
The bioreduction of graphene oxide (GO) using environmentally functional bacteria such as Shewanella represents a green approach to producing reduced graphene oxide (rGO). This process differs from chemical reduction, which involves instantaneous molecular reactions. In bioreduction, the contact between bacterial cells and GO is considered the rate-limiting step. To reveal how bacteria-GO integration regulates rGO production, comparative experiments on GO and three Shewanella strains were carried out. Fourier-transform infrared spectroscopy, X-ray photoelectron spectroscopy, Raman spectroscopy, and atomic force microscopy were used to characterize the degree of reduction and the degree of aggregation. The results showed that a spontaneous aggregation of GO and Shewanella into a condensed entity occurred within 36 h. A positive linear correlation was established, comprehensively linking three indexes: the aggregation potential, the bacterial reduction ability, and the reduction degree (ID/IG).
Funding: supported by the National Natural Science Foundation of China (62103411) and the Science and Technology Development Fund of Macao SAR (0093/2023/RIA2, 0050/2020/A1).
Abstract: Attention is a scarce resource in decentralized autonomous organizations (DAOs), as their self-governance relies heavily on the attention-intensive decision-making process of “proposal and voting”. To prevent the negative effects of proposers’ attention-capturing strategies, which contribute to the “tragedy of the commons”, and to ensure an efficient distribution of attention among multiple proposals, it is necessary to establish a market-driven allocation scheme for DAOs’ attention. First, Harberger tax-based attention markets are designed to facilitate attention allocation via continuous and automated trading, where an individualized Harberger tax rate (HTR) determined by the proposers’ reputation is adopted. Then, a Stackelberg game model is formulated in these markets, casting attention owners in the role of leaders and other competitive proposers as followers. Its equilibrium trading strategies are also discussed to unravel the intricate dynamics of attention pricing. Moreover, using the single-round Stackelberg game as an illustrative example, the existence of Nash equilibrium trading strategies is demonstrated. Finally, the impact of the individualized HTR on trading strategies is investigated, and the results suggest that it is negatively correlated with leaders’ self-assessed prices and ownership duration, but its effect on their revenues varies under different conditions. This study is expected to provide valuable insights into leveraging attention resources to improve DAOs’ governance and decision-making process.
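The core Harberger-tax mechanics behind this market (an owner self-assesses a price, pays tax proportional to it, and anyone may buy at that price) can be sketched as follows. This is an illustrative sketch only; the class and method names are ours, and the paper's market additionally individualizes the tax rate by reputation and embeds it in a Stackelberg game.

```python
# Minimal sketch of a Harberger-taxed attention slot, assuming a fixed
# per-period individualized tax rate (HTR). Names are illustrative.
class AttentionSlot:
    def __init__(self, owner, self_assessed_price, htr):
        self.owner = owner                # proposer currently holding attention
        self.price = self_assessed_price  # owner's self-assessed sale price
        self.htr = htr                    # individualized Harberger tax rate

    def tax_due(self):
        # Each period the owner pays tax proportional to the posted price,
        # which discourages overpricing to block competitors.
        return self.htr * self.price

    def buy(self, buyer, new_price):
        # Under Harberger rules, anyone may take over at the posted price.
        paid = self.price
        self.owner = buyer
        self.price = new_price
        return paid
```

A higher self-assessed price deters takeover but raises the recurring tax, which is the trade-off the Stackelberg leaders optimize.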
Abstract: Centralized storage and identity identification methods pose many risks, including hacker attacks, data misuse, and single points of failure. Additionally, existing centralized identity management methods face interoperability issues and rely on a single identity provider, leaving users without control over their identities. Therefore, this paper proposes a mechanism for identity identification and data sharing based on decentralized identifiers. The scheme utilizes blockchain technology to store the identifiers and data hashes on the chain to ensure permanent identity recognition and data integrity. Data is stored on the InterPlanetary File System (IPFS) to avoid the risk of single points of failure and to enhance data persistence and availability. At the same time, compliance with World Wide Web Consortium (W3C) standards for decentralized identifiers and verifiable credentials increases the mechanism’s scalability and interoperability.
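The hash-on-chain, data-off-chain pattern described above can be sketched as follows. This is a hedged illustration, not the paper's implementation: the blockchain is mocked with a dict keyed by DID, and IPFS is represented by a content-addressed map.

```python
import hashlib

chain = {}  # DID -> data hash (stands in for the blockchain)
ipfs = {}   # content hash -> data (stands in for IPFS)

def register(did: str, data: bytes) -> str:
    """Store data off-chain and anchor only its hash on-chain under the DID."""
    h = hashlib.sha256(data).hexdigest()
    ipfs[h] = data   # off-chain storage (content-addressed, like IPFS)
    chain[did] = h   # on-chain: just the hash, ensuring integrity
    return h

def verify(did: str) -> bool:
    """Integrity check: recompute the hash of the retrieved data."""
    h = chain[did]
    return hashlib.sha256(ipfs[h]).hexdigest() == h
```

Any tampering with the off-chain copy makes the recomputed hash disagree with the on-chain anchor, so `verify` fails.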
Funding: Supported by the Shenzhen Science and Technology Program under Grants KCXST20221021111404010, JSGG20220831103400002, JSGGKQTD20221101115655027, and JCYJ20210324094609027; the National Key R&D Program of China under Grant 2021YFB2700900; the National Natural Science Foundation of China under Grants 62371239, 62376074, and 72301083; and the Jiangsu Specially-Appointed Professor Program 2021.
Abstract: In permissioned blockchain networks, the Proof of Authority (PoA) consensus, which uses the election of authorized nodes to validate transactions and blocks, has been widely advocated thanks to its high transaction throughput and fault tolerance. However, PoA suffers from centralization dominated by a limited number of authorized nodes and a lack of anonymity due to the round-robin block proposal mechanism. As a result, traditional PoA is vulnerable to a single point of failure that compromises the security of the blockchain network. To address these issues, we propose a novel decentralized reputation management mechanism for permissioned blockchain networks to enhance security, promote liveness, and mitigate centralization while retaining the same throughput as traditional PoA. This paper aims to design an off-chain reputation evaluation and an on-chain reputation-aided consensus. First, we evaluate the nodes’ reputation in the context of the blockchain networks and make the reputation globally verifiable through smart contracts. Second, building upon traditional PoA, we propose a reputation-aided PoA (rPoA) consensus to enhance security without sacrificing throughput. In particular, rPoA can incentivize nodes to autonomously form committees based on reputation authority, which prevents block generation from being tracked through the randomness of reputation variation. Moreover, we develop a reputation-aided fork-choice rule for rPoA to promote the network’s liveness. Finally, experimental results show that the proposed rPoA achieves higher security performance while retaining transaction throughput compared to traditional PoA.
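A reputation-weighted committee draw of the kind this abstract describes can be sketched as below. The paper's actual rPoA selection rule is not specified here, so this is only a plausible sketch: members are sampled without replacement with probability proportional to reputation, and the seed stands in for the unpredictable reputation variation that makes block proposers hard to track.

```python
import random

def select_committee(reputations: dict, k: int, seed: int) -> list:
    """Draw k distinct nodes, weighted by reputation (illustrative only).

    The seed would, in an rPoA-like design, derive from unpredictable
    reputation variation so selection cannot be anticipated.
    """
    rng = random.Random(seed)
    nodes = list(reputations)
    weights = [reputations[n] for n in nodes]
    committee = []
    for _ in range(k):
        pick = rng.choices(nodes, weights=weights)[0]
        idx = nodes.index(pick)
        committee.append(pick)
        nodes.pop(idx)       # sample without replacement so members
        weights.pop(idx)     # are distinct
    return committee
```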
Funding: Supported by the MSIT (Ministry of Science and ICT), Korea, under the Special R&D Zone Development Project (R&D), Development of R&D Innovation Valley support program (2023-DD-RD-0152), supervised by the Innovation Foundation. Also partially supported by the Ministry of Science and ICT (MSIT), Korea, under the Information Technology Research Center (ITRC) support program (IITP-2024-2020-0-01797), supervised by the Institute for Information & Communications Technology Planning & Evaluation (IITP).
Abstract: In Decentralized Machine Learning (DML) systems, system participants contribute their resources to assist others in developing machine learning solutions. Identifying malicious contributions in DML systems is challenging, which has led to the exploration of blockchain technology. Blockchain leverages its transparency and immutability to record the provenance and reliability of training data. However, storing massive datasets or implementing model evaluation processes on smart contracts incurs high computational costs. Additionally, current research on preventing malicious contributions in DML systems primarily focuses on protecting models from being exploited by workers who contribute incorrect or misleading data. Less attention has been paid to the scenario where malicious requesters intentionally manipulate test data during evaluation to gain an unfair advantage. This paper proposes a transparent and accountable training data sharing method that securely shares data among potentially malicious system participants. First, we introduce a blockchain-based DML system architecture that supports secure training data sharing through the IPFS network. Second, we design a blockchain smart contract to transparently split training datasets into training and test sets without involving system participants. Under the system, transparent and accountable training data sharing can be achieved with attribute-based proxy re-encryption. We present a security analysis for the system and conduct experiments on the Ethereum and IPFS platforms to show its feasibility and practicality.
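The participant-independent split described above can be illustrated with a deterministic shuffle whose seed is derived from public on-chain data, so anyone can reproduce and audit the split. This is a sketch under our own assumptions (a block hash as the seed source), not the paper's smart contract.

```python
import hashlib
import random

def split_dataset(items, block_hash: str, test_ratio: float = 0.2):
    """Deterministically split items into (train, test) using a public seed.

    Deriving the seed from a block hash means no participant chooses the
    split, yet everyone can verify it by re-running the same computation.
    """
    seed = int(hashlib.sha256(block_hash.encode()).hexdigest(), 16)
    order = list(items)
    random.Random(seed).shuffle(order)
    n_test = int(len(order) * test_ratio)
    return order[n_test:], order[:n_test]  # (train, test)
```

Because the split is a pure function of the dataset and the block hash, a malicious requester cannot steer which samples land in the test set.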
Funding: Supported by the National Natural Science Foundation of China (62273213, 62073199, 62103241); the Natural Science Foundation of Shandong Province Innovation and Development Joint Funds (ZR2022LZH001); the Natural Science Foundation of Shandong Province (ZR2020MF095, ZR2021QF107); the Taishan Scholarship Construction Engineering; the Original Exploratory Program Project of the National Natural Science Foundation of China (62250056); the Major Basic Research of the Natural Science Foundation of Shandong Province (ZR2021ZD14); and the High-level Talent Team Project of Qingdao West Coast New Area (RCTD-JC-2019-05).
Abstract: This paper addresses the decentralized optimal control and stabilization problems for interconnected systems subject to asymmetric information. Compared with previous work, a closed-loop optimal solution to the control problem and sufficient and necessary conditions for the stabilization problem of the interconnected systems are given for the first time. The main challenge lies in three aspects. First, the asymmetric information results in coupling between control and estimation and failure of the separation principle. Second, two extra unknown variables are generated by the asymmetric information (different information filtrations) when solving forward-backward stochastic difference equations. Third, the existence of additive noise makes the study of mean-square boundedness an obstacle. The adopted technique is to assume and then prove the linear form of the controllers, and to establish the equivalence between the two systems with and without additive noise. A dual-motor parallel drive system is presented to demonstrate the validity of the proposed algorithm.
Funding: Supported in part by the National Natural Science Foundation of China (61933007, U21A2019, 62273005, 62273088, 62303301); the Program of Shanghai Academic/Technology Research Leader of China (20XD1420100); the Hainan Province Science and Technology Special Fund of China (ZDYF2022SHFZ105); the Natural Science Foundation of Anhui Province of China (2108085MA07); and the Alexander von Humboldt Foundation of Germany.
Abstract: This paper investigates the anomaly-resistant decentralized state estimation (SE) problem for a class of wide-area power systems which are divided into several non-overlapping areas connected through transmission lines. Two classes of measurements (i.e., local measurements and edge measurements) are obtained, respectively, from the individual areas and the transmission lines. A decentralized state estimator, whose performance is resistant against measurements with anomalies, is designed based on the minimum error entropy with fiducial points (MEEF) criterion. Specifically, 1) an augmented model, which incorporates the local prediction and the local measurement, is developed by resorting to the unscented transformation approach and the statistical linearization approach; 2) using the augmented model, an MEEF-based cost function is designed that reflects the local prediction errors of the state and the measurement; and 3) the local estimate is first obtained by minimizing the MEEF-based cost function through a fixed-point iteration and then updated by using the edge measurement information. Finally, simulation experiments with three scenarios are carried out on the IEEE 14-bus system to illustrate the validity of the proposed anomaly-resistant decentralized SE scheme.
Abstract: The aim of this paper is to broaden the application of the Stochastic Configuration Network (SCN) to the semi-supervised domain by utilizing the unlabeled data that is common in daily life. This can enhance the classification accuracy of decentralized SCN algorithms while effectively protecting user privacy. To this end, we propose a decentralized semi-supervised learning algorithm for SCN, called DMT-SCN, which introduces teacher and student models and combines the idea of consistency regularization to improve the response speed of model iterations. To reduce the possible negative impact of unsupervised data on the model, we deliberately change the way noise is added to the unlabeled data. Simulation results show that the algorithm can effectively utilize unlabeled data to improve the classification accuracy of SCN training and is robust under different simulated environments.
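The teacher-student consistency-regularization idea mentioned above can be sketched in two steps: the teacher's weights track an exponential moving average (EMA) of the student's, and a consistency penalty discourages the student from disagreeing with the teacher on perturbed unlabeled inputs. The toy linear model below is our own simplification; DMT-SCN applies this idea to stochastic configuration networks, not to this stub.

```python
import numpy as np

def ema_update(teacher_w: np.ndarray, student_w: np.ndarray,
               alpha: float = 0.99) -> np.ndarray:
    """Teacher weights follow an exponential moving average of the student's."""
    return alpha * teacher_w + (1 - alpha) * student_w

def consistency_loss(student_w, teacher_w, x_unlabeled,
                     noise_scale: float = 0.01, seed: int = 0) -> float:
    """Penalize student/teacher disagreement on noised unlabeled inputs."""
    rng = np.random.default_rng(seed)
    x_noisy = x_unlabeled + rng.normal(0.0, noise_scale, x_unlabeled.shape)
    # Student sees the noisy input, teacher sees the clean one.
    return float(np.mean((x_noisy @ student_w - x_unlabeled @ teacher_w) ** 2))
```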
Funding: Supported by the Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2023R66), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: The COVID-19 pandemic has devastated our daily lives, leaving horrific repercussions in its aftermath. Due to its rapid spread, it was quite difficult for medical personnel to diagnose it in such large quantities. Patients who test positive for COVID-19 are diagnosed via a nasal PCR test. In comparison, polymerase chain reaction (PCR) findings take a few hours to a few days. The PCR test is expensive, although the government may bear the expense in certain places. Furthermore, subsets of the population resist invasive testing like swabs. Therefore, chest X-rays or Computed Tomography (CT) scans are preferred in most cases and, more importantly, are non-invasive, inexpensive, and provide a faster response time. Recent advances in Artificial Intelligence (AI), in combination with state-of-the-art methods, have allowed for the diagnosis of COVID-19 using chest X-rays. This article proposes a method for classifying COVID-19 as positive or negative on a decentralized dataset based on a federated learning scheme. To build a progressive global COVID-19 classification model, two edge devices are employed to train the model on their respective localized datasets, and a 3-layered custom Convolutional Neural Network (CNN) model, which can be deployed from the server, is used in the training process. These two edge devices then communicate their learned parameters and weights to the server, which aggregates them and updates the global model. The proposed model is trained using an image dataset available on Kaggle. There are more than 13,000 X-ray images in the Kaggle collection, from which 9,000 Normal and COVID-19-positive images are used. Each edge node possesses a different number of images: edge node 1 has 3,200 images, while edge node 2 has 5,800. There is no association between the datasets of the various nodes included in the network; in this manner, each node has access to a separate image collection with no correlation to the others. The diagnosis of COVID-19 has become considerably more efficient with the deployment of the suggested algorithm and dataset, and the findings we have obtained are quite encouraging.
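The server-side aggregation step described above can be sketched as a FedAvg-style weighted average, with each node's contribution proportional to its dataset size (3,200 and 5,800 images in the abstract's setup). The layer shapes and the exact aggregation rule used in the paper may differ; this is an illustrative sketch.

```python
import numpy as np

def aggregate(node_weights, node_sizes):
    """Average per-layer weights across nodes, weighted by dataset size.

    node_weights: list (one entry per node) of lists of layer arrays.
    node_sizes:   number of training samples held by each node.
    """
    total = sum(node_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(node_weights, node_sizes))
        for i in range(len(node_weights[0]))
    ]
```

With sizes 3,200 and 5,800, node 2's parameters carry weight 5800/9000 in the global model, so the larger local dataset dominates each update.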
Funding: Supported by the National Natural Science Foundation of China (32071968); the Jiangsu Agricultural Science and Technology Innovation Fund, China (CX(22)2015); and the Jiangsu Collaborative Innovation Center for Modern Crop Production, China.
Abstract: Straw return is a promising strategy for managing soil organic carbon (SOC) and improving yield stability. However, the optimal straw return strategy for sustainable crop production in the wheat (Triticum aestivum L.)-cotton (Gossypium hirsutum L.) cropping system remains uncertain. The objective of this study was to quantify the long-term (10 years) impact of carbon (C) input on SOC sequestration, soil aggregation and crop yields in a wheat-cotton cropping system in the Yangtze River Valley, China. Five treatments were arranged in a single-factor randomized design as follows: no straw return (Control), return of wheat straw only (Wt), return of cotton straw only (Ct), return of 50% wheat and 50% cotton straw (Wh-Ch), and return of 100% wheat and 100% cotton straw (Wt-Ct). In comparison to the Control, the SOC content increased by 8.4 to 20.2% under straw return. A significant positive linear correlation between SOC sequestration and C input (1.42-7.19 Mg ha^(-1) yr^(-1)) (P<0.05) was detected. The percentages of aggregates of sizes >2 and 1-2 mm at the 0-20 cm soil depth were also significantly elevated under straw return, with the greatest increase in aggregate stability in the Wt-Ct treatment (28.1%). The average wheat yields increased by 12.4-36.0% and cotton yields increased by 29.4-73.7%, and significant positive linear correlations were also detected between C input and the yields of wheat and cotton. The average sustainable yield index (SYI) reached a maximum value of 0.69 when the C input was 7.08 Mg ha^(-1) yr^(-1), which was close to the maximum value (SYI of 0.69, C input of 7.19 Mg ha^(-1) yr^(-1)) in the Wt-Ct treatment. Overall, the return of both wheat and cotton straw was the best strategy for improving SOC sequestration, soil aggregation, yields and their sustainability in the wheat-cotton rotation system.
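The linear SOC-sequestration-versus-C-input relationship reported above is an ordinary least-squares fit. The sketch below shows the computation; the data points are made up for illustration and are not the study's measurements, only the C-input range (1.42-7.19 Mg ha^(-1) yr^(-1)) is taken from the abstract.

```python
import numpy as np

# Hypothetical data within the study's reported C-input range.
c_input = np.array([1.42, 3.0, 4.5, 6.0, 7.19])     # Mg C ha^-1 yr^-1
soc_seq = np.array([0.10, 0.35, 0.55, 0.80, 0.95])  # illustrative SOC sequestration

# Least-squares line and Pearson correlation, as in a "significant positive
# linear correlation" analysis.
slope, intercept = np.polyfit(c_input, soc_seq, 1)
r = np.corrcoef(c_input, soc_seq)[0, 1]
```

A positive slope with r close to 1 is the pattern the study reports between C input and SOC sequestration (and, analogously, crop yields).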
Funding: The authors acknowledge the financial support received from the Michael J. Fox Foundation through the Target Advancement Program Grant Award (Grant No. MJFF-000649) (to HK).
文摘Parkinson's disease(PD),a prevalent neurodegenerative disorder,is chara cterized by the loss of dopaminergic neurons and the aggregation ofα-synuclein protein into Lewy bodies.While the current standards of therapy have been successful in providing some symptom relief,they fail to address the underlying pathophysiology of PD and as a result,they have no effect on disease progression.
Funding: Supported by the Russian Science Foundation, Grant No. 22-15-00120.
Abstract: Red blood cells (RBCs) are the most abundant human blood cells. RBC aggregation and deformation strongly determine blood viscosity, which impacts hemorheology and microcirculation. In turn, RBC properties depend on different endogenous and exogenous factors. One such factor is nitric oxide (NO), which is mainly produced by endothelial cells (ECs) from the amino acid L-arginine in the circulatory system. Since the mechanisms of the RBC-endothelium interplay are not clear to date, and considering its possible clinical importance, the aims of this study are to investigate in vitro: (1) the effect of L-arginine-induced NO on RBC aggregation and adhesion to endothelium; (2) the NO effect on RBC aggregation and deformation induced by L-arginine and sodium nitroprusside without the presence of endothelium in the samples. RBC aggregation and adhesion to a monolayer of ECs were studied using optical tweezers (OT). RBC deformability and aggregation without endothelium in the samples were studied using the flow chamber method and a Myrenne aggregometer. We confirmed that NO increases deformability and decreases aggregation of RBCs. We showed that the soluble guanylate cyclase pathway appears to be the only NO signaling pathway involved. In the samples with endothelium, a “bell-shaped” dependence of RBC aggregation force on L-arginine concentration was observed, which improves our knowledge about the process of NO production by endothelium. Additionally, data related to L-arginine accumulation by endothelium were obtained: the necessity of the presence of extracellular L-arginine stated by other authors was called into question. In our study, NO decreased RBC-endothelium adhesion; however, the tendency appeared to be weak and was not confirmed in another set of experiments. To our knowledge, this is the first attempt to measure the forces of RBC adhesion to an endothelium monolayer with OT.
Abstract: In recent times, technology has advanced significantly and is currently being integrated into educational environments to facilitate distance learning and interaction between learners. Integrating the Internet of Things (IoT) into education can facilitate the teaching and learning process and expand the context in which students learn. Nevertheless, learning data is very sensitive and must be protected when transmitted over the network or stored in data centers. Moreover, the identity and authenticity of interacting students, instructors, and staff need to be verified to mitigate the impact of attacks. However, most current security and authentication schemes are centralized, relying on trusted third-party cloud servers to facilitate continuous secure communication. In addition, most of these schemes are resource-intensive; thus, security and efficiency issues arise when heterogeneous and resource-limited IoT devices are being used. In this paper, we propose a blockchain-based architecture that accurately identifies and authenticates learners and their IoT devices in a decentralized manner and prevents the unauthorized modification of stored learning records in a distributed university network. It allows students and instructors to easily migrate to and join multiple universities within the network using their identity, without the need for user re-authentication. The proposed architecture was tested using a simulation tool and measured to evaluate its performance. The simulation results demonstrate the ability of the proposed architecture to significantly increase the throughput of learning transactions (40%), reduce the communication overhead and response time (26%), improve authentication efficiency (27%), and reduce IoT power consumption (35%) compared to centralized authentication mechanisms. In addition, the security analysis proves the effectiveness of the proposed architecture in resisting various attacks and ensuring the security requirements of learning data in the university network.
Abstract: Ubiquitous data monitoring and processing with minimal latency is one of the crucial challenges in real-time and scalable applications. The Internet of Things (IoT), fog computing, edge computing, cloud computing, and the edge of things are the spine of all real-time and scalable applications. Accordingly, this study proposes a novel framework for real-time and scalable applications that change dynamically with time. In this study, IoT deployment is recommended for data acquisition, and pre-processing of data with local edge and fog nodes is implemented. A threshold-oriented data classification method is deployed to improve the performance of the intrusion detection mechanism. Machine-learning-empowered intelligent algorithms are employed in a distributed manner to enhance the overall response rate of the layered framework. The placement of respondent nodes near the framework's IoT layer minimizes the network's latency. For economic evaluation of the proposed framework with minimal effort, the EdgeCloudSim and FogNetSim++ simulation environments are deployed in this study. The experimental results confirm the robustness of the proposed system through its improved threshold-oriented data classification and intrusion detection approach, improved response rate, and prediction mechanism. Moreover, the proposed layered framework provides a robust solution for real-time and scalable applications that change dynamically with time.
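The threshold-oriented classification idea above amounts to flagging a reading when any monitored metric crosses its limit, a check cheap enough to run on edge or fog nodes. The metric names and thresholds below are invented for illustration; the paper's actual features and limits are not given in the abstract.

```python
# Hypothetical per-metric limits for an edge-side intrusion pre-filter.
THRESHOLDS = {"packet_rate": 1000.0, "failed_logins": 5, "payload_kb": 512.0}

def classify(reading: dict) -> str:
    """Flag a reading as suspicious if any metric exceeds its threshold."""
    breached = [k for k, limit in THRESHOLDS.items() if reading.get(k, 0) > limit]
    return "suspicious" if breached else "normal"
```

Readings flagged here would then be forwarded to the heavier machine-learning stage, keeping most traffic off the cloud path and latency low.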
Funding: Supported by the Taif University Researchers Supporting Project number (TURSP-2020/36), Taif University, Taif, Saudi Arabia, and funded by the Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R97), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Nonorthogonal Multiple Access (NOMA) is incorporated into wireless network systems to achieve better connectivity, spectral and energy effectiveness, a higher data transfer rate, and high quality of service (QoS). To improve throughput and minimize latency, a Multivariate Renkonen Regressive Weighted Preference Bootstrap Aggregation based Nonorthogonal Multiple Access (MRRWPBA-NOMA) technique is introduced for network communication. In the downlink transmission, each mobile device's resources and characteristics, such as energy, bandwidth, and trust, are measured. Then, Weighted Preference Bootstrap Aggregation is applied to recognize resource-efficient mobile devices for aware data transmission by constructing different weak hypotheses, i.e., multivariate Renkonen regression functions. Based on the classification, resource- and trust-aware devices are selected for transmission. Simulations of the proposed MRRWPBA-NOMA technique and existing methods are carried out with different metrics such as data delivery ratio, throughput, latency, packet loss rate, energy efficiency, and signaling overhead. The simulation results indicate that the proposed MRRWPBA-NOMA outperforms the conventional methods.
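The bootstrap-aggregation (bagging) core of the scheme above, i.e., fitting several weak hypotheses on resampled data and averaging their votes, can be sketched as follows. The weak learner below is a deliberately trivial above-mean test of our own; the paper's weak hypotheses are multivariate Renkonen regression functions, which are not reproduced here.

```python
import numpy as np

def bagged_score(x: float, data: np.ndarray,
                 n_models: int = 10, seed: int = 0) -> float:
    """Aggregate the votes of weak hypotheses fit on bootstrap resamples."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_models):
        # Bootstrap resample: draw len(data) points with replacement.
        sample = rng.choice(data, size=len(data), replace=True)
        # Toy weak hypothesis: does x exceed the resample's mean?
        preds.append(float(x > sample.mean()))
    return sum(preds) / n_models  # aggregated preference score in [0, 1]
```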
Abstract: The evolution of smart mobile devices has significantly impacted the way we generate and share content and has introduced a huge volume of Internet traffic. To address this issue and take advantage of the short-range communication capabilities of smart mobile devices, the decentralized content sharing approach has emerged as a suitable and promising alternative. Decentralized content sharing uses a peer-to-peer network among co-located smart mobile device users to fulfil content requests. Several articles have been published to date addressing its different aspects, including group management, interest extraction, message forwarding, participation incentives, and content replication. This survey paper summarizes and critically analyzes recent advancements in decentralized content sharing and highlights potential research issues that need further consideration.
Funding: Supported by the National Key R&D Program of China (Nos. 2019YFD0901204, 2019YFD0901205).
Abstract: Aggregation of species with similar ecological properties is one of the effective methods to simplify food web research. However, species aggregation affects not only the complexity of the modeling process but also the accuracy of the models' outputs. The selection of aggregation methods and the number of trophospecies are the keys to studying the simplification of food webs. In this study, three aggregation methods, including taxonomic aggregation (TA), structural equivalence aggregation (SEA), and self-organizing maps (SOM), were analyzed and compared with the linear inverse model-Markov Chain Monte Carlo (LIM-MCMC) model. The impacts of aggregation methods and trophospecies number on food webs were evaluated based on the robustness and unitless nature of ecological network indices. Results showed that the SEA method performed better than the other two methods in estimating food web structure and function indices. The effects of the aggregation methods were driven by differences in species aggregation principles, which alter food web structure and function through the redistribution of energy flow. According to the results of the mean absolute percentage error (MAPE), which can be applied to evaluate the accuracy of the model, we found that MAPE in food web indices increases with decreasing trophospecies number, and that MAPE in food web function indices was smaller and more stable than in food web structure indices. Therefore, the trade-off between simplifying food webs and reflecting the status of the ecosystem should be considered in food web studies. These findings highlight the importance of aggregation methods and trophospecies number in the analysis of food web simplification. This study provides a framework to explore the extent to which food web models are affected by different species aggregations and will provide a scientific basis for the construction of food webs.
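The MAPE metric used above to compare simplified food-web model outputs against the full model is a standard computation; the sketch below shows it on illustrative inputs (the arrays are not the study's data).

```python
import numpy as np

def mape(actual, predicted) -> float:
    """Mean absolute percentage error, in percent.

    Assumes no zero values in `actual` (division by the actual values).
    """
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)
```

In the study's setting, `actual` would be an index from the full food web and `predicted` the same index from an aggregated web; a larger MAPE means the simplification distorts that index more.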
Funding: Funded by the National Natural Science Foundation of China (No. U1904188).
Abstract: A simplex centroid design method was employed to design the gradation of recycled coarse aggregate. The bulk density was measured, while the specific surface area and average excess paste thickness were calculated for different gradations. The fluidity, dynamic yield stress, static yield stress, printed width, printed inclination, compressive strength, and ultrasonic wave velocity of 3D printed recycled aggregate concrete (3DPRAC) were further studied. The experimental results demonstrate that, with an increase of small-sized aggregate (4.75-7 mm) content, the bulk density initially increases and then decreases, and the specific surface area gradually increases. The average excess paste thickness fluctuates with both bulk density and specific surface area. The workability of 3DPRAC is closely related to the average excess paste thickness. With an increase in average paste thickness, there is a gradual decrease in dynamic yield stress, static yield stress and printed inclination, accompanied by an increase in fluidity and printed width. The mechanical performance of 3DPRAC closely correlates with the bulk density. With an increase in the bulk density, there is an increase in the ultrasonic wave velocity, accompanied by a slight increase in the compressive strength and a significant decrease in the anisotropic coefficient. Furthermore, an index for buildability failure of 3DPRAC based on the average excess paste thickness is proposed.
Funding: Partially supported by the National Key Research and Development Program of China (No. 2018AAA0100400); the Natural Science Foundation of Shandong Province (Nos. ZR2020MF131 and ZR2021ZD19); and the Science and Technology Program of Qingdao (No. 21-1-4-ny-19-nsh).
Abstract: Underwater pulse waveform recognition is an important method for underwater object detection. Most existing works focus on the application of traditional pattern recognition methods, which ignore the time- and space-varying characteristics of sound propagation channels and cannot easily extract valuable waveform features. Sound propagation channels in seawater are time- and space-varying convolutional channels. In extracting the waveform features of underwater acoustic signals, high-accuracy recognition depends on eliminating the influence of these time- and space-varying convolutional channels to the greatest extent possible. We propose a hash aggregate discriminative network (HADN), which combines hash learning and deep learning to minimize the time- and space-varying effects of convolutional channels and adaptively learns effective underwater waveform features to achieve high-accuracy underwater pulse waveform recognition. In extracting the hash features of acoustic signals, a discrete constraint between clusters within a hash feature class is introduced. This constraint ensures that the influence of convolutional channels on hash features is minimized. In addition, we design a new loss function called aggregate discriminative loss (AD-loss). The use of AD-loss together with softmax-loss can increase the discriminativeness of the learned hash features. Experimental results show that on pool and ocean datasets, which were collected in pools and oceans, respectively, using acoustic collectors, the proposed HADN performs better than other comparative models in terms of accuracy and mAP.
Funding: Funded by the European Union Horizon 2020 research and innovation programme under GA 952334 (PhasAGE); the Spanish Ministry of Science and Innovation (PID2019-105017RB-I00); and by ICREA, ICREA Academia 2015 and 2020 (to SV).
Abstract: Protein aggregation has been linked with many neurodegenerative diseases, such as Alzheimer’s disease (AD) or Parkinson’s disease. AD belongs to a group of heterogeneous and incurable neurodegenerative disorders collectively known as tauopathies. They comprise frontotemporal dementia, Pick’s disease, and corticobasal degeneration, among others. The symptomatology varies with the specific tau protein variant involved and the affected brain region or cell type. However, they share a common neuropathological hallmark: the formation of proteinaceous deposits named neurofibrillary tangles. Neurofibrillary tangles, primarily composed of aggregated tau (Zhang et al., 2022), disrupt normal neuronal functions, leading to cell death and cognitive decline.
Funding: Supported by the National Natural Science Foundation of China (22178293) and the Natural Science Foundation of Fujian Province of China (2022J01022).
Abstract: The bioreduction of graphene oxide (GO) using environmentally functional bacteria such as Shewanella represents a green approach to producing reduced graphene oxide (rGO). This process differs from chemical reduction, which involves instantaneous molecular reactions. In bioreduction, the contact of bacterial cells and GO is considered the rate-limiting step. To reveal how bacteria-GO integration regulates rGO production, comparative experiments with GO and three Shewanella strains were carried out. Fourier-transform infrared spectroscopy, X-ray photoelectron spectroscopy, Raman spectroscopy, and atomic force microscopy were used to characterize the reduction degree and the aggregation degree. The results showed that a spontaneous aggregation of GO and Shewanella into a condensed entity occurred within 36 h. A positive linear correlation was established, comprehensively linking three indexes: the aggregation potential, the bacterial reduction ability, and the reduction degree (ID/IG).