Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 62171405, 62225114, and 62101489.
Abstract: Filter bank multicarrier (FBMC) systems with offset quadrature amplitude modulation (OQAM) need long data blocks to achieve high spectral efficiency. However, the transmission of long data blocks in underwater acoustic (UWA) communication systems often faces the challenge of time-varying channels. This paper proposes a time-varying channel tracking method for short-range, high-rate UWA FBMC-OQAM communication. First, a known preamble is used to initialize the channel estimate at the start of the signal block. Next, the estimated channel is applied to detect data symbols over several symbol periods. The detected data symbols are then reused as new pilots to estimate the channel at the next time instant. To support these steps, the unified transmission matrix model is extended to describe the time-varying channel input-output relationship and is used for symbol detection. Simulation results show that the channel tracking error can be reduced to below −20 dB when the channel temporal coherence coefficient exceeds 0.75 within one block period of the FBMC-OQAM signal. Compared with conventional known-pilot-based methods, the proposed method requires lower system overhead while exhibiting similar time-varying channel tracking performance. Sea trial results further demonstrate the practicality of the proposed method.
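The preamble-initialize / detect / reuse-as-pilot loop described in this abstract is the classic decision-directed tracking idea. The following is a minimal sketch of that loop under simplifying assumptions of my own (a one-tap per-subcarrier channel, QPSK symbols, a hypothetical forgetting factor `alpha`), not the paper's transmission-matrix model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub = 64  # number of subcarriers (assumed for illustration)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Hypothetical frequency-flat-per-subcarrier channel to be tracked.
h_true = (rng.standard_normal(n_sub) + 1j * rng.standard_normal(n_sub)) / np.sqrt(2)

def transmit(symbols, h, snr_db=25):
    """One-tap frequency-domain channel plus AWGN."""
    noise_std = 10 ** (-snr_db / 20)
    noise = noise_std * (rng.standard_normal(n_sub)
                         + 1j * rng.standard_normal(n_sub)) / np.sqrt(2)
    return h * symbols + noise

# Step 1: initialize the channel estimate from a known preamble (LS estimate).
preamble = qpsk[rng.integers(0, 4, n_sub)]
h_est = transmit(preamble, h_true) / preamble

# Steps 2-3: detect data symbols with the current estimate, then reuse the
# hard decisions as new pilots to re-estimate (track) the channel.
alpha = 0.6  # assumed smoothing/forgetting factor
for _ in range(5):
    data = qpsk[rng.integers(0, 4, n_sub)]
    rx = transmit(data, h_true)
    # one-tap equalization followed by nearest-constellation-point decision
    eq = rx / h_est
    detected = qpsk[np.argmin(np.abs(eq[:, None] - qpsk[None, :]), axis=1)]
    # decision-directed LS re-estimate, smoothed against decision errors
    h_est = alpha * h_est + (1 - alpha) * rx / detected

# Normalized tracking error in dB (the abstract's figure of merit)
nmse_db = 10 * np.log10(np.mean(np.abs(h_est - h_true) ** 2)
                        / np.mean(np.abs(h_true) ** 2))
```

At moderate SNR the decisions are mostly correct, so the re-estimates behave like fresh pilot estimates and the tracking error stays low without spending overhead on known pilots, which is the trade-off the abstract claims.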
Abstract: In recent years, as newer technologies have evolved around the healthcare ecosystem, more and more data have been generated. Advanced analytics could harness data collected from numerous sources, both from healthcare institutions and from individuals themselves via apps and devices, to drive innovations in the treatment and diagnosis of diseases; improve the care given to patients; and empower citizens to participate in decision-making about their own health and well-being. However, the sensitive nature of health data prevents healthcare organizations from sharing them. The Personal Health Train (PHT) is a novel approach that aims to establish a distributed data analytics infrastructure enabling the (re)use of distributed healthcare data while data owners stay in control of their own data. The main principle of the PHT is that data remain in their original location; analytical tasks visit the data sources and execute there. The PHT provides a distributed, flexible approach to using data in a network of participants, incorporating the FAIR principles. It facilitates the responsible use of sensitive and/or personal data by adopting international principles and regulations. This paper presents the concepts and main components of the PHT and demonstrates how it complies with the FAIR principles.
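The PHT principle stated above, that data stay put and the analysis travels, can be sketched in a few lines. This is a toy illustration with hypothetical names (`make_station`, `age_task`), not the actual PHT infrastructure: each "station" only ever returns what a visiting "train" computes, never the raw records.

```python
# Sketch of the PHT principle: data remain at each station; an analysis
# "train" (a plain function) visits stations, executes locally, and only
# aggregate results leave the premises.

def make_station(records):
    """A data station exposes task execution, never the raw records."""
    def run_task(task):
        return task(records)  # the task runs where the data live
    return run_task

# Two hypothetical hospitals, each holding patient ages locally.
station_a = make_station([34, 51, 47, 62])
station_b = make_station([29, 58, 44])

# The train: returns (count, sum) so partial results can be pooled.
def age_task(records):
    return len(records), sum(records)

partials = [station(age_task) for station in (station_a, station_b)]
total_n = sum(n for n, _ in partials)
mean_age = sum(s for _, s in partials) / total_n  # pooled mean; no raw data moved
```

The design point is that only the aggregates `(count, sum)` cross institutional boundaries, which is what keeps the data owners in control.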
Funding: This work was supported in part by the NWO Aspasia Grant (91716421) and by the Maastricht York-Partnership Grant.
Abstract: Easy access to data is one of the main avenues to accelerate scientific research. As a key element of scientific innovation, data sharing allows the reproduction of results and helps prevent data fabrication, falsification, and misuse. Although the research benefits of data reuse are widely acknowledged, today's data collections are still kept in silos. Indeed, monitoring what happens to data once they have been handed to a third party is not feasible within current data sharing practices. We propose a blockchain-based system to trace data collections and potentially create a more trustworthy data sharing process. In this paper, we present the LUCE (License accoUntability and CompliancE) architecture, a decentralized blockchain-based platform supporting data sharing and reuse. LUCE is designed to provide full transparency about what happens to data after they are shared with third parties. The contributions of this work are i) the design of a decentralized data sharing solution with accountability and compliance by design, and ii) the inclusion of a dynamic consent model for personalized data sharing preferences and for enabling legal compliance mechanisms. We test the scalability of the platform in a real-time environment in which a growing number of users access and reuse different datasets. Compared to existing data sharing solutions, LUCE provides transparency over data sharing practices, enables data reuse, and supports regulatory requirements. The experiments show that the platform can scale to a large number of users.
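The accountability property the abstract describes, an auditable trail of what happened to shared data, is what a hash-chained ledger provides. Below is a toy in-memory version with hypothetical record fields, not LUCE's actual smart contracts: every share, reuse, and consent change becomes an immutable, chained record the data owner can query.

```python
import hashlib
import json

# Toy append-only ledger illustrating LUCE-style accountability (hypothetical
# structure): each record is chained to the previous one by its SHA-256 hash,
# so past events cannot be silently altered.

class Ledger:
    def __init__(self):
        self.chain = []

    def record(self, event):
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        body = {"event": event, "prev": prev_hash}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.chain.append({**body, "hash": digest})

    def history(self, dataset_id):
        """Everything that ever happened to one dataset, in order."""
        return [b["event"] for b in self.chain
                if b["event"].get("dataset") == dataset_id]

ledger = Ledger()
ledger.record({"dataset": "D1", "action": "consent_granted", "scope": "research-only"})
ledger.record({"dataset": "D1", "action": "shared_with", "party": "lab-X"})
ledger.record({"dataset": "D1", "action": "consent_revoked"})

audit = ledger.history("D1")  # the owner's full audit trail for D1
```

The dynamic consent model mentioned in the abstract maps naturally onto such a trail: granting and revoking consent are just further events, and compliance checks replay the history.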
Funding: Supported in part by the National Natural Science Foundation of China under Grant Nos. 60621003 and 60873014.
Abstract: The widening gap between processor and memory speeds makes the cache an important issue in computer system design. Compared with the working sets of programs, cache resources are often scarce, so it is very important for a computer system to use the cache efficiently. For the DOOC (Data-Object Oriented Cache), a dynamically reconfigurable cache proposed recently, this paper proposes a quantitative framework for analyzing the cache requirements of data objects, including cache capacity, block size, associativity, and coherence protocol. A graph-coloring algorithm that handles the competition between data objects in the DOOC is proposed as well. Finally, we apply our approaches to the compiler management of the DOOC and test them on both a single-core platform and a four-core platform. Compared with traditional caches, the DOOC achieves an average miss-rate reduction of 44.98% and 49.69% on the two platforms, respectively, and its performance is very close to that of the ideal optimal cache.
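Graph coloring for competing data objects, as mentioned in the abstract, typically means building an interference graph (an edge joins two objects that are live or accessed in the same phase) and coloring it so that conflicting objects land in different cache partitions. The sketch below uses a plain greedy Welsh-Powell-style coloring on a made-up conflict graph; the paper's own algorithm and cost model are not reproduced here:

```python
# Greedy graph coloring sketch for partitioning a cache among data objects.
# Vertices are data objects; an edge means two objects compete for the cache
# at the same time and should not share a partition.

def greedy_coloring(conflicts):
    """conflicts: dict mapping object -> set of conflicting objects."""
    color = {}
    # Color higher-degree objects first (Welsh-Powell ordering).
    for obj in sorted(conflicts, key=lambda o: len(conflicts[o]), reverse=True):
        used = {color[n] for n in conflicts[obj] if n in color}
        c = 0
        while c in used:  # smallest color not used by any colored neighbor
            c += 1
        color[obj] = c    # objects with the same color may share a partition
    return color

# Hypothetical conflict graph: A, B, C compete pairwise; D only with C.
conflicts = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}
coloring = greedy_coloring(conflicts)
n_partitions = max(coloring.values()) + 1
```

Here D can reuse a partition already assigned to A or B because it never conflicts with them, so four objects fit into three partitions; that reuse is exactly the saving such a coloring buys.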
Abstract: Research Data Management (RDM) has become increasingly important for more and more academic institutions. Using the Peking University Open Research Data Repository (PKU-ORDR) project as an example, this paper reviews a library-based, university-wide open research data repository project and the implementation of related RDM services, including project kickoff, needs assessment, establishment of partnerships, software investigation and selection, software customization, and data curation services and training. The review also discusses issues revealed during the implementation process, such as awareness of research data, demands from data providers and users, data policies and requirements of the home institution, requirements from funding agencies and publishers, collaboration between administrative units and libraries, and concerns of data providers and users. The significance of the study is that it offers an example of creating an Open Data repository and RDM services for other Chinese academic libraries planning to implement RDM services for their home institutions. The authors have also observed that, since the PKU-ORDR and RDM services were implemented in 2015, the Peking University Library (PKUL) has helped numerous researchers throughout the entire research life cycle, enhanced Open Science (OS) practices on campus, and influenced the national OS movement in China through various national events and activities hosted by the PKUL.