Journal Articles
306 articles found
1. Duplicated chalcone synthase (CHS) genes modulate flavonoid production in tea plants in response to light stress
Authors: Mingzhuo Li, Wenzhao Wang, Yeru Wang, Lili Guo, Yajun Liu, Xiaolan Jiang, Liping Gao, Tao Xia. 《Journal of Integrative Agriculture》, SCIE CAS CSCD, 2024, No. 6, pp. 1940-1955 (16 pages)
In tea plants, the abundant flavonoid compounds are responsible for the health benefits for the human body and define the astringent flavor profile. While the downstream mechanisms of flavonoid biosynthesis have been extensively studied, the role of chalcone synthase (CHS) in this secondary metabolic process in tea plants remains less clear. In this study, we compared the evolutionary profile of the flavonoid metabolism pathway and discovered that gene duplication of CHS occurred in tea plants. We identified three CsCHS genes, along with a CsCHS-like gene, as potential candidates for further functional investigation. Unlike the CsCHS-like gene, the CsCHS genes effectively restored flavonoid production in Arabidopsis chs-mutants. Additionally, CsCHS transgenic tobacco plants exhibited higher flavonoid compound accumulation compared to their wild-type counterparts. Most notably, our examination of promoter and gene expression levels for the selected CHS genes revealed distinct responses to UV-B stress in tea plants. Our findings suggest that environmental factors such as UV-B exposure could have been the key drivers behind the gene duplication events in CHS.
Keywords: TEA flavonoids biosynthesis CHS gene duplication UV-B stress
2. Duplicate identification model for deep web (Cited by 4)
Authors: 刘丽楠, 寇月, 孙高尚, 申德荣, 于戈. 《Journal of Southeast University (English Edition)》, EI CAS, 2008, No. 3, pp. 315-317 (3 pages)
A duplicate identification model is presented to deal with semi-structured or unstructured data extracted from multiple data sources in the deep web. First, the extracted data is transformed into entity records in the data preprocessing module; then, the heterogeneous records processing module calculates the similarity degree of the entity records to obtain the duplicate records, based on the weights calculated in the homogeneous records processing module. Unlike traditional methods, the proposed approach is implemented without schema matching in advance, and multiple estimators with selective algorithms are adopted to reach better matching efficiency. The experimental results show that the duplicate identification model is feasible and efficient.
Keywords: duplicate records; deep web; data cleaning; semi-structured data
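The weighted record-similarity step described in the abstract above can be illustrated with a minimal sketch. The field names, weights, and the 0.8 decision threshold below are hypothetical placeholders rather than values from the paper, and difflib stands in for whatever similarity estimators the model actually uses.

```python
from difflib import SequenceMatcher

def field_similarity(a: str, b: str) -> float:
    """Normalized string similarity between two field values, in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def record_similarity(rec1: dict, rec2: dict, weights: dict) -> float:
    """Weighted average of per-field similarities over the fields both records share."""
    shared = [f for f in weights if f in rec1 and f in rec2]
    if not shared:
        return 0.0
    total_weight = sum(weights[f] for f in shared)
    score = sum(weights[f] * field_similarity(str(rec1[f]), str(rec2[f])) for f in shared)
    return score / total_weight

# Hypothetical field weights and decision threshold, for illustration only.
weights = {"title": 0.5, "author": 0.3, "year": 0.2}
r1 = {"title": "Duplicate identification model for deep web", "author": "Liu L.", "year": "2008"}
r2 = {"title": "A duplicate identification model for the deep Web", "author": "Liu Linan", "year": "2008"}
print(record_similarity(r1, r2, weights) > 0.8)  # treat the pair as duplicates above the threshold
```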
3. Real-world evidence and randomized controlled trials: an introduction to the methodology of the RCT DUPLICATE project (Cited by 15)
Authors: 石舒原, 赵厚宇, 周庆欣, 孙凤, 詹思延. 《药物流行病学杂志》, CAS, 2020, No. 3, pp. 198-205 (8 pages)
Evidence generated from analyses of real-world data (RWD) outside traditional randomized controlled trials (RCTs), i.e., real-world evidence (RWE), is attracting growing attention in medical research. In 2018, a research team at Harvard University launched the RCT DUPLICATE project, which uses RWD to conduct non-randomized studies that replicate or predict the results of RCTs, explores the theory and methods involved in generating RWE from RWD, and promotes the wider application of RWE. This article briefly introduces the background of the project and the selection criteria for the RCTs to be replicated or predicted, and focuses on the biases that arise in RWD studies and the strategies for addressing them in practice. It further summarizes the key issues that researchers should attend to when designing, conducting, and evaluating such studies, and outlines the structured process framework for RWD studies proposed by the project team, in order to help domestic scholars better understand the value and limitations of RWD research, provide a reference for scholars in related fields to carry out more in-depth work, and supply evidence to support decision-making by medical regulatory authorities.
Keywords: randomized controlled trial; real-world data; real-world evidence; RCT DUPLICATE project; confounding; bias
4. Real-world evidence and randomized controlled trials: results of the RCT DUPLICATE project (Cited by 3)
Authors: 石舒原, 周庆欣, 孙凤, 詹思延. 《药物流行病学杂志》, CAS, 2019, No. 11, pp. 757-762 (6 pages)
With the promotion and use of big data in healthcare, policymakers and researchers are increasingly focusing on generating real-world evidence (RWE) from real-world data (RWD) and on related research. In 2018, a Harvard research team launched the RCT DUPLICATE project, which uses RWD to construct non-randomized observational studies that replicate the results of randomized controlled trials (RCTs), in order to fully evaluate the preliminary theory behind generating RWE from RWD and to promote its wider application. After briefly introducing the background of the RCT DUPLICATE project and two examples of drugs approved on the basis of RWE generated from RWD, this article elaborates on the research process, progress, and results of the project's four sub-projects, in the hope of helping domestic scholars understand the value of RWE and carry out more in-depth research in the future.
Keywords: randomized controlled trial; non-randomized observational study; real-world evidence; RCT DUPLICATE project
5. Real-world evidence and randomized controlled trials: an overview of the RCT DUPLICATE project (Cited by 11)
Authors: 姚晓莹, 张靖雪, 詹思延. 《药物流行病学杂志》, CAS, 2019, No. 8, pp. 495-497, 517 (4 pages)
In recent years, generating real-world evidence (RWE) from real-world data (RWD) through appropriate design and analysis has become a topic of shared concern for academia, industry, and regulators. However, whether RWE can substitute for evidence from tightly controlled randomized controlled trials (RCTs) remains uncertain. To address this, the RCT DUPLICATE project was launched in the United States in 2018, aiming to use real-world evidence to replicate the results of RCTs through non-randomized observational studies. This article reviews the background of RCT DUPLICATE and highlights the project's research team, objectives, content, and significance, in order to help domestic scholars better understand the scope and value of RWE.
Keywords: real-world evidence; randomized controlled trial; non-randomized observational study; RCT DUPLICATE project
6. Duplicated appendix complicated by appendiceal cancer (Cited by 1)
Authors: Hugh J Freeman. 《World Journal of Gastroenterology》, SCIE CAS CSCD, 2011, No. 1, pp. 135-136 (2 pages)
A 37-year-old male presented with an acute abdomen suggestive of an appendiceal perforation. Urgent laparotomy showed a duplicated appendix with one of the lumens involved with appendicitis and a focal periappendicular abscess, while the other lumen had a localized appendiceal cancer. Recognition of congenital intestinal duplications in adults is important to avoid serious clinical consequences.
Keywords: duplicated appendix; bifid appendix; appendiceal cancer; congenital duplication
7. Random Forests Algorithm Based Duplicate Detection in On-Site Programming Big Data Environment (Cited by 1)
Authors: Qianqian Li, Meng Li, Lei Guo, Zhen Zhang. 《Journal of Information Hiding and Privacy Protection》, 2020, No. 4, pp. 199-205 (7 pages)
On-site programming big data refers to the massive data generated in the process of software development, characterized by real-time generation, complexity, and high difficulty of processing. Therefore, data cleaning is essential for on-site programming big data. Duplicate data detection is an important step in data cleaning, which can save storage resources and enhance data consistency. Due to the insufficiency of the traditional Sorted Neighborhood Method (SNM) and the difficulty of high-dimensional data detection, an optimized algorithm based on random forests with a dynamic and adaptive window size is proposed. The efficiency of the algorithm is improved by refining the key-selection method, reducing the dimensionality of the data set, and using an adaptive variable-size sliding window. Experimental results show that the improved SNM algorithm exhibits better performance and achieves higher accuracy.
Keywords: On-site programming big data; duplicate record detection; random forests; adaptive sliding window
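As a rough illustration of the Sorted Neighborhood idea with an adaptive window (not the paper's random-forest key selection or dimensionality reduction), the sketch below sorts records by a key and grows or shrinks the comparison window depending on whether matches were just found; the similarity function, window bounds, and threshold are assumptions.

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    return SequenceMatcher(None, a, b).ratio() >= threshold

def snm_adaptive(records, key=lambda r: r, w_min=2, w_max=10):
    """Sorted Neighborhood Method with a crude adaptive window.
    Records are sorted by a key; each record is compared only with its
    neighbors inside a window that grows after matches and shrinks otherwise."""
    recs = sorted(records, key=key)
    duplicates = []
    window = w_min
    for i, rec in enumerate(recs):
        matched = False
        for j in range(i + 1, min(i + window, len(recs))):
            if similar(key(rec), key(recs[j])):
                duplicates.append((rec, recs[j]))
                matched = True
        # enlarge the window after a match, shrink it otherwise
        window = min(window + 1, w_max) if matched else max(window - 1, w_min)
    return duplicates

names = ["Data Cleaning", "data cleaning ", "Data Mining", "Data  Cleaning", "Deep Learning"]
print(snm_adaptive(names, key=lambda s: s.lower().strip()))
```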
8. Duplicate publication bias weakens the validity of meta-analysis of immunosuppression after transplantation
Authors: Cameron J Fairfield, Ewen M Harrison, Stephen J Wigmore. 《World Journal of Gastroenterology》, SCIE CAS, 2017, No. 39, pp. 7198-7200 (3 pages)
Duplicate publication can introduce significant bias into a meta-analysis if studies are inadvertently included more than once. Many studies are published in more than one journal to maximize readership and impact of the study findings. Inclusion of multiple publications of the same study within a meta-analysis affords inappropriate weight to the duplicated data if reports of the same study are not linked together. As studies with positive findings are more likely to be published in multiple journals, this leads to a potential overestimate of the benefits of an intervention. Recent advances in immunosuppression strategies following liver transplantation have led to many studies investigating immunosuppressive regimes, including immunosuppression monotherapy. In this letter we focus on a recently published meta-analysis by Lan et al investigating studies assessing immunosuppression monotherapy for liver transplantation. The authors claim to have identified fourteen separate randomised studies investigating immunosuppression monotherapy. Seven of the references appear to relate to only three studies which have been subject to duplicate publication. Several similarities can be identified in each of the duplicate publications, including similar authorship, identical immunosuppression regimes, identical dates of enrolment, and citation of the original publication in the subsequent manuscripts. We discuss the evidence of duplicate publication inclusion in the meta-analysis.
Keywords: liver transplantation; immunosuppression; meta-analysis; duplicate publication; bias
9. Approximate Discovery of Service Nodes by Duplicate Detection in Flows
Authors: Zhou Changling, Xiao Jianguo, Cui Jian, Zhang Bei, Li Feng. 《China Communications》, SCIE CSCD, 2012, No. 5, pp. 75-89 (15 pages)
Discovery of service nodes in flows is a challenging task, especially in large ISPs or campus networks where the amount of traffic across the network is massive. We propose an effective data structure called Round-robin Buddy Bloom Filters (RBBF) to detect duplicate elements in flows. A two-stage approximate algorithm based on RBBF, which can be used for detecting service nodes from NetFlow data, is also given, and the performance of the algorithm is analyzed. In our case, the proposed algorithm uses about 1% of the memory of a hash table with a false-positive error rate of less than 5%. A prototype system using the proposed data structure and algorithm, compatible with both IPv4 and IPv6, is introduced. Some real-world case studies based on the prototype system are discussed.
Keywords: duplicate detection; service nodes discovery; buddy bloom filter; round-robin schema; NetFlow
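A plain Bloom filter is enough to show the duplicate-detection idea behind the approach above; it is a simplified stand-in for the paper's Round-robin Buddy Bloom Filters, and the bit-array size, hash count, and sample flows are arbitrary choices for illustration.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: constant memory, no false negatives,
    a false-positive rate that grows as the filter fills up."""
    def __init__(self, m_bits: int = 1 << 20, k_hashes: int = 4):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray(m_bits // 8 + 1)

    def _positions(self, item: str):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

# Flag flow endpoints that appear repeatedly (candidate service nodes).
seen, repeated = BloomFilter(), set()
flows = [("10.0.0.5", 80), ("10.0.0.7", 443), ("10.0.0.5", 80), ("10.0.0.5", 80)]
for ip, port in flows:
    key = f"{ip}:{port}"
    if key in seen:          # probable duplicate (may rarely be a false positive)
        repeated.add(key)
    else:
        seen.add(key)
print(repeated)
```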
10. An Automatic Threshold Selection Using ALO for Healthcare Duplicate Record Detection with Reciprocal Neuro-Fuzzy Inference System
Authors: Ala Saleh Alluhaidan, Pushparaj, Anitha Subbappa, Ved Prakash Mishra, P.V.Chandrika, Anurika Vaish, Sarthak Sengupta. 《Computers, Materials & Continua》, SCIE EI, 2023, No. 3, pp. 5821-5836 (16 pages)
Systems based on EHRs (electronic health records) have been in use for many years, and their amplified realizations have been felt recently. They are still pioneering collections of massive volumes of health data. Duplicate detection involves discovering records that refer to the same practical components, a task that is generally dependent on several input parameters that experts yield. Record linkage refers to the problem of finding identical records across various data sources. The similarity between two records is characterized by domain-based similarity functions over different features. De-duplication of one dataset or the linkage of multiple data sets has become a highly significant operation in the data processing stages of different data mining programmes. The objective is to match all the records associated with the same entity. Various measures have been in use for representing the quality and complexity of data linkage algorithms, and many other novel metrics have been introduced. An outline of the problems in measuring data linkage and de-duplication quality and complexity is presented. This article focuses on the reprocessing of health data that is horizontally divided among data custodians, with the purpose of custodians giving similar features to sets of patients. The first step in this technique is the automatic selection of training examples of superior quality from the compared record pairs, and the second step involves training the reciprocal neuro-fuzzy inference system (RANFIS) classifier. Using the optimal threshold classifier, it is presumed that there is information about the original match status for all compared record pairs (i.e., Ant Lion Optimization), and therefore an optimal threshold can be computed based on the respective RANFIS. The Febrl, Clinical Decision (CD), and Cork Open Research Archive (CORA) data repositories help analyze the proposed method against benchmarks from current techniques.
Keywords: duplicate detection healthcare record linkage dataset pre-processing reciprocal neuro-fuzzy inference system and ant lion optimization fuzzy system
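The threshold-selection step can be illustrated with a brute-force search that maximizes F1 over labeled record pairs; this is a simplified stand-in for the paper's RANFIS classifier and Ant Lion Optimization, and the sample scores are made up.

```python
def best_threshold(scored_pairs):
    """Pick the similarity threshold that maximizes F1 on labeled record pairs.
    scored_pairs: list of (similarity_score, is_true_match) tuples.
    A brute-force stand-in for an optimization-based threshold search."""
    best_t, best_f1 = 0.0, -1.0
    for t in sorted({s for s, _ in scored_pairs}):
        tp = sum(1 for s, y in scored_pairs if s >= t and y)
        fp = sum(1 for s, y in scored_pairs if s >= t and not y)
        fn = sum(1 for s, y in scored_pairs if s < t and y)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        if f1 > best_f1:
            best_t, best_f1 = t, f1
    return best_t, best_f1

# Hypothetical (score, true-match) pairs for illustration only.
pairs = [(0.95, True), (0.90, True), (0.72, False), (0.65, True), (0.40, False), (0.20, False)]
print(best_threshold(pairs))
```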
11. Search for d-MPs without duplicates in two-terminal multistate networks based on MPs
Authors: XU Bei, FANG Yining, BAI Guanghan, ZHANG Yun’an, TAO Junyong. 《Journal of Systems Engineering and Electronics》, SCIE EI CSCD, 2022, No. 6, pp. 1332-1341 (10 pages)
The reliability evaluation of a multistate network is primarily based on d-minimal paths/cuts (d-MPs/d-MCs). However, being a nondeterministic polynomial-time hard (NP-hard) problem, searching for all d-MPs is a rather challenging task. In existing implicit enumeration algorithms based on minimal paths (MPs), duplicate d-MP candidates may be generated, and an extra step is needed to locate and remove these duplicate candidates, which costs significant computational effort. This paper proposes an efficient method to prevent the generation of duplicate d-MP candidates in implicit enumeration algorithms for d-MPs. First, the mechanism by which duplicate d-MP candidates arise in the implicit enumeration algorithms is discussed. Second, a direct and efficient duplicate-avoidance method is proposed. Third, an improved algorithm is developed, followed by complexity analysis and illustrative examples. Computational experiments comparing the method with two existing algorithms show that it can significantly improve the efficiency of generating d-MPs for a particular demand level d.
Keywords: reliability; multistate network; d-minimal path (d-MP); duplicate
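The "extra step" of removing duplicate d-MP candidates that the abstract refers to amounts, in its naive form, to keeping a set of already-seen state vectors; the sketch below shows that baseline (the paper's contribution is to avoid generating the duplicates in the first place, which this sketch does not attempt), and the sample vectors are hypothetical.

```python
def dedup_candidates(candidates):
    """Naive post-hoc removal of duplicate d-MP candidate state vectors.
    Each candidate is a vector of arc capacities; a set keeps only the
    first occurrence of each vector."""
    seen = set()
    unique = []
    for vector in candidates:
        key = tuple(vector)
        if key not in seen:
            seen.add(key)
            unique.append(key)
    return unique

# Hypothetical candidate vectors produced from two different MPs.
cands = [(1, 0, 2, 1), (0, 2, 1, 1), (1, 0, 2, 1)]
print(dedup_candidates(cands))  # the repeated vector appears only once
```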
12. How to duplicate the procedural success of coronary interventions by the radial approach: tips and tricks in the selection and manipulations of guides
Authors: Thach Nguyen, Lan Nguyen. 《Journal of Geriatric Cardiology》, SCIE CAS CSCD, 2007, No. 1, pp. 17-19 (3 pages)
In this issue of the Journal of Geriatric Cardiology, Jing et al. showed off their near-perfect results of percutaneous coronary interventions (PCI) through the transfemoral approach (TFA) and transradial approach (TRA) in elderly Chinese patients. All patients were older than 60 years of age, with an average of 67...
Keywords: How to duplicate the procedural success of coronary interventions by the radial approach; LAD
13. Fast Semantic Duplicate Detection Techniques in Databases
Authors: Ibrahim Moukouop Nguena, Amolo-Makama Ophélie Carmen Richeline. 《Journal of Software Engineering and Applications》, 2017, No. 6, pp. 529-545 (17 pages)
Semantic duplicates in databases represent an important data quality challenge today, one that leads to bad decisions. In large databases, we sometimes find ourselves with tens of thousands of duplicates, which necessitates automatic deduplication. For this, it is necessary to detect duplicates with a method that is fairly reliable, to find as many duplicates as possible, and powerful enough to run in a reasonable time. This paper proposes and compares, on real data, effective duplicate detection methods for automatic deduplication of files based on names, working with French or English texts and with the names of people or places, in Africa or in the West. After conducting a more complete classification of semantic duplicates than the usual classifications, we introduce several methods for detecting duplicates whose observed average complexity is less than O(2n). Through a simple model, we highlight a global efficacy rate combining precision and recall. We propose a new metric distance between records, as well as rules for automatic duplicate detection. Analyses made on a database containing real data for an administration in Central Africa, and on a known standard database containing names of restaurants in the USA, have shown better results than those of known methods, with lower complexity.
Keywords: semantic duplicate detection technique; detection capability; automatic deduplication; detection rates and error rates
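A minimal sketch of name normalization plus a string-distance measure conveys the flavor of this kind of name-based semantic duplicate detection; the normalization rules and sample names are illustrative assumptions, not the metric or rules proposed in the paper.

```python
import re
import unicodedata
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Normalize a personal or place name: strip accents, punctuation, case, extra spaces."""
    text = unicodedata.normalize("NFKD", name)
    text = "".join(c for c in text if not unicodedata.combining(c))
    text = re.sub(r"[^\w\s]", " ", text)
    return " ".join(text.lower().split())

def name_distance(a: str, b: str) -> float:
    """A simple distance in [0, 1]: 0 means identical after normalization."""
    return 1.0 - SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# Hypothetical name pairs: the first two should look like duplicates, the last should not.
pairs = [("Ngoupeyou, Jean-Marc", "NGOUPEYOU Jean Marc"),
         ("Yaoundé", "Yaounde"),
         ("Douala", "Garoua")]
for a, b in pairs:
    print(a, "|", b, "->", round(name_distance(a, b), 2))
```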
14. Duplicates in systematic reviews: A critical, but often neglected issue
Authors: Xing-Shun Qi, Ming Bai, Zhi-Ping Yang, Wei-Rong Ren. 《World Journal of Meta-Analysis》, 2013, No. 3, pp. 97-101 (5 pages)
The number of systematic reviews is gradually increasing over time, and the methods for performing a systematic review are being improved. However, little attention has been paid to the issue of how to find duplicates in systematic reviews. On the basis of the survey and systematic reviews by our team and others, we review the prevalence, significance, and classification of duplicates and the method to find duplicates in a systematic review. Notably, although a preliminary method to find duplicates has been established, its usefulness and convenience need to be further confirmed.
Keywords: duplicates; systematic review; method; prevalence; significance
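One common heuristic for finding duplicate records retrieved from several databases during a systematic review is to match on DOI when available and otherwise on a normalized title plus publication year; the sketch below shows that heuristic, which is an assumption on our part rather than the preliminary method the authors describe, and the sample references use a placeholder DOI.

```python
def find_duplicate_references(references):
    """Group candidate duplicate records retrieved from several databases.
    A record is matched first on DOI (when present), otherwise on a
    normalized title plus publication year."""
    groups = {}
    for ref in references:
        doi = (ref.get("doi") or "").lower().strip()
        if doi:
            key = ("doi", doi)
        else:
            title = "".join(ch for ch in ref.get("title", "").lower() if ch.isalnum())
            key = ("title", title, ref.get("year"))
        groups.setdefault(key, []).append(ref)
    return [refs for refs in groups.values() if len(refs) > 1]

# Hypothetical records; the DOI is a placeholder, not a real identifier.
refs = [
    {"title": "Duplicates in systematic reviews", "year": 2013, "doi": ""},
    {"title": "Duplicates in Systematic Reviews.", "year": 2013, "doi": ""},
    {"title": "An unrelated trial", "year": 2012, "doi": "10.1234/example"},
]
print(find_duplicate_references(refs))  # the two title-matched records form one group
```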
15. Duplicated inferior vena cava in a patient undergoing right transumbilical laparoendoscopic single-site (LESS) radical nephrectomy
Authors: Wang Linhui, Wu Zhenjie, Liu Bing, Yang Qing, Sun Yinghao. 《Journal of Medical Colleges of PLA (China)》, CAS, 2011, No. 5, pp. 279-282 (4 pages)
Laparoendoscopic single-site (LESS) renal surgery, emerging as a potential alternative to conventional laparoscopy, is technically challenging, and a major vascular anomaly may increase the risk of intraoperative haemorrhage. Herein, we present a case of right transumbilical LESS radical nephrectomy that was successfully performed in the presence of a double inferior vena cava and duplicated the standard laparoscopic techniques. Most importantly, we aim to bring such aberrant vascular anatomy to the attention of laparoscopic, and especially LESS, surgeons with high-resolution pictorial illustrations.
Keywords: duplicated inferior vena cava; anomaly; laparoendoscopic single-site surgery; radical nephrectomy
16. A case of duplicated inferior vena cava with bilateral iliac vein compression
Authors: Liang Yang, Hongdong Xu, Shibin Hu, Shuangling Yao. 《Journal of Interventional Medicine》, 2023, No. 3, pp. 134-136 (3 pages)
Duplicated inferior vena cava with bilateral iliac vein compression is extremely rare. We report the case of an 87-year-old man who presented with bilateral lower extremity swelling and was noted to have a duplicated inferior vena cava on computed tomography angiography (CTA). Imaging also revealed bilateral iliac vein compression caused by surrounding structures. Anticoagulant treatment combined with stent implantation completely alleviated this chronic debilitating condition, with no recurrence during 2 months of follow-up.
Keywords: duplicated inferior vena cava; stent; venous compression syndromes; computed tomography angiography
17. Duplicate Form of the Generalized Carlitz Inversions and Summation Formulae
Authors: Qiaoying Dong. 《Journal of Applied Mathematics and Physics》, 2019, No. 4, pp. 900-911 (12 pages)
The duplicate form of the generalized Gould-Hsu inversions has been obtained by Shi and Zhang. In this paper, we present a simple proof of this duplicate form. With the same method, we construct the duplicate form of the generalized Carlitz inversions. Using this duplicate form, we obtain several terminating basic hypergeometric identities and some limiting cases.
Keywords: duplicate inversions; generalized Gould-Hsu inversions; generalized Carlitz inversions
18. TCP Karak: A New TCP AIMD Algorithm Based on Duplicated Acknowledgements for MANET
Authors: Wesam A. Almobaideen, Njoud O. Al-maitah. 《International Journal of Communications, Network and System Sciences》, 2014, No. 9, pp. 396-407 (12 pages)
Transmission Control Protocol (TCP) performance over MANET is an area of extensive research. Congestion control mechanisms are major components of TCP that affect its performance, and improving these mechanisms represents a big challenge, especially in wireless environments. Additive Increase Multiplicative Decrease (AIMD) mechanisms control the amount of increase and decrease of the transmission rate in response to changes in the level of contention for router buffer space and link bandwidth. The role of an AIMD mechanism in transmitting the proper amount of data is not easy, especially over MANET, because MANET has a very dynamic topology and high-bit-error-rate wireless links that cause packet loss. Such a loss could be misinterpreted as severe congestion by the transmitting TCP node, leading to an unnecessarily sharp reduction in the transmission rate which could degrade TCP throughput. This paper introduces a new AIMD algorithm that takes the number of duplicated ACKs already received when a timeout occurs into account when deciding the amount of multiplicative decrease. Specifically, it decides the point from which the Slow-start mechanism should begin its recovery of the congestion window size. The new AIMD algorithm has been developed as a new TCP variant which we call TCP Karak. The aim of TCP Karak is to be more adaptive to mobile wireless network conditions by distinguishing between loss due to severe congestion and loss due to link breakages or bit errors. Several simulated experiments have been conducted to evaluate TCP Karak and compare its performance with TCP NewReno. Results show that TCP Karak is able to achieve higher throughput and goodput than TCP NewReno under various mobility speeds, traffic loads, and bit error rates.
Keywords: TCP congestion control; Additive Increase Multiplicative Decrease; Mobile Ad Hoc Networks; duplicated acknowledgement
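The core idea, using the count of duplicate ACKs received before a timeout to decide how aggressively to shrink the congestion window, can be sketched as below; the threshold, floor, and halving rule are illustrative assumptions, not TCP Karak's actual parameters.

```python
def window_after_timeout(cwnd: float, dup_acks: int, dup_ack_threshold: int = 3,
                         ssthresh_floor: float = 2.0) -> float:
    """Illustrative AIMD-style decision on a retransmission timeout.

    Sketch of the idea (not TCP Karak's exact rule): the more duplicate ACKs
    were received before the timeout, the more likely the loss was a transient
    link problem rather than severe congestion, so the congestion window is
    cut less aggressively."""
    if dup_acks >= dup_ack_threshold:
        # Packets are still getting through: treat as mild congestion / link error.
        return max(cwnd / 2.0, ssthresh_floor)
    # No evidence of delivery: treat as severe congestion and restart slow start.
    return ssthresh_floor

for dup in (0, 1, 3, 6):
    print(dup, "dup ACKs ->", window_after_timeout(cwnd=32.0, dup_acks=dup))
```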
19. Unruptured Aneurysm at the Origin of the Duplicated Middle Cerebral Artery Treated by Coil Embolization: A Case Report
Authors: Shingo Toyota, Tetsuya Kumagai, Hirofumi Sugano, Shota Yamamoto, Kanji Mori, Takuyu Taki. 《Open Journal of Modern Neurosurgery》, 2015, No. 1, pp. 27-33 (7 pages)
Aneurysm at the origin of a duplication of the middle cerebral artery (DMCA) is very rare, and only 29 treated cases have been reported. All of these cases were treated by direct surgery except one ruptured case treated by intentional partial coil embolization. We report the first unruptured case treated by coil embolization and review the previously published cases. Coil embolization can be an alternative treatment for an unruptured aneurysm at the origin of the DMCA. Stable framing that spares its origin and prevention of thromboembolic complications are key to safe treatment.
Keywords: duplicated middle cerebral artery; aneurysm; coil embolization
20. Research on using hemacyte segregation apparatus to collect duplicate thrombocyte from one blood donor
《中国输血杂志》, CAS CSCD, 2001, No. S1, p. 326 (1 page)
Keywords: Research on using hemacyte segregation apparatus to collect duplicate thrombocyte from one blood donor