Background: The local pivotal method (LPM), which utilizes auxiliary data in sample selection, has recently been proposed as a sampling method for national forest inventories (NFIs). Its performance compared to simple random sampling (SRS) and to LPM with geographical coordinates has produced promising results in simulation studies. In this simulation study we compared all of these sampling methods to systematic sampling. The LPM samples were selected either solely using the coordinates (LPMxy) or, in addition, using auxiliary remote sensing-based forest variables (RS variables). We used field measurement data (NFI-field) and Multi-Source NFI (MS-NFI) maps as target data, and independent MS-NFI maps as auxiliary data. The designs were compared using relative efficiency (RE): the ratio of the mean squared error of the reference sampling design to that of the studied design. Applying a method in an NFI also requires a proven estimator for the variance. Therefore, three different variance estimators were evaluated against the empirical variance of replications: 1) an estimator corresponding to SRS; 2) a Grafström-Schelin estimator repurposed for LPM; and 3) a Matérn estimator applied in the Finnish NFI for the systematic sampling design. Results: LPMxy was nearly comparable with the systematic design for most target variables. The REs of the LPM designs utilizing auxiliary data, compared to the systematic design, varied between 0.74 and 1.18, depending on the target variable. The SRS variance estimator was, as expected, the most biased and conservative estimator. Similarly, the Grafström-Schelin estimator gave overestimates in the case of LPMxy. When the RS variables were used as auxiliary data, the Grafström-Schelin estimates tended to underestimate the empirical variance. In systematic sampling, the Matérn and Grafström-Schelin estimators performed equally well for practical purposes. Conclusions: LPM optimized for a specific variable tended to be more efficient than systematic sampling, but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables. The Grafström-Schelin estimator could be used as such with LPMxy, or instead of the Matérn estimator in systematic sampling. Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.
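The relative efficiency described above can be sketched as a ratio of mean squared errors over simulation replications. The following is a minimal illustration with toy data, not the authors' code; the function and variable names are hypothetical:

```python
import numpy as np

def relative_efficiency(est_reference, est_studied, true_value):
    """RE = MSE(reference design) / MSE(studied design), computed from
    replicated estimates. RE > 1 means the studied design is more efficient."""
    mse_ref = np.mean((np.asarray(est_reference) - true_value) ** 2)
    mse_stu = np.mean((np.asarray(est_studied) - true_value) ** 2)
    return mse_ref / mse_stu

# Toy replications: estimates of a population mean under two designs.
rng = np.random.default_rng(0)
true_mean = 100.0
ref = rng.normal(true_mean, 4.0, size=1000)  # reference design, sd = 4
stu = rng.normal(true_mean, 3.0, size=1000)  # studied design, sd = 3
print(round(relative_efficiency(ref, stu, true_mean), 2))  # ≈ (4/3)² ≈ 1.8
```

In the study, the empirical variance of replicated sample-based estimates plays the role of the MSE terms for each design and target variable.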
Back in August 2004, the Standing Committee of the National People's Congress, the highest legislature of China, published the Decision on Improving the People's Assessor System (hereinafter referred to as the Decision). Since it became effective on May 1, 2005, the Decision has proved important to ensuring the right of citizens to participate in adjudication activities in accordance with the law. It has helped expand judicial democracy and safeguard the legitimate rights and interests of the litigant parties.
In this paper a family of distributions, called the pivotal family, is considered. A pivotal family is determined by a generalized pivotal model. Analytical results show that a great many parametric families of distributions are pivotal. For a pivotal family of distributions, a general method of deriving fiducial distributions of parameters is proposed. In the method, a fiducial model plays an important role: a fiducial model is a function of a random variable with a known distribution, called the pivotal random element, when the observation of a statistic is given. The method of this paper subsumes several existing methods of deriving fiducial distributions; in particular, the first fiducial distribution given by Fisher can be derived by the method. For the monotone likelihood ratio family of distributions, which is a pivotal family, the fiducial distributions have a frequentist property in the Neyman-Pearson sense. Fiducial distributions of regular parametric functions also have this frequentist property. Some advantages of fiducial inference are exhibited in four applications of the fiducial distribution. Many examples are given in which the fiducial distributions cannot be derived by existing methods.
Grid Based on Mobile Agent (GBMA) is a new grid scheme. The purpose of this paper is to solve the pivotal technology of GBMA combined with the idea of the Virtual Organization (VO). In GBMA, a virtual organization is viewed as the basic management unit of the grid, and a mobile agent is regarded as an important means of interaction. Grid architecture, grid resource management, and grid task management are the core technical problems of GBMA. The simulation results show that the Inter-VO pattern has an obvious advantage because it can make full use of resources from other virtual organizations in a GBMA environment.
Reductionist thinking will no longer suffice to address contemporary, complex challenges that defy sectoral, national, or disciplinary boundaries. Furthermore, lessons learned from the past cannot be confidently used to predict outcomes or guide future actions. The authors propose that the confluence of a number of technology and social disruptors presents a pivotal moment in history to enable real-time, accelerated, and integrated action that can adequately support a 'future earth' through transformational solutions. Building on more than a decade of dialogues hosted by the International Society for Digital Earth (ISDE), and evolving a briefing note presented to delegates of Pivotal 2015, the paper presents an emergent context for collectively addressing spatial information, sustainable development, and good governance through three guiding principles for enabling prosperous living in the twenty-first century. These are: (1) open data, (2) real-world context, and (3) informed visualization for decision support. The paper synthesizes an interdisciplinary dialogue to create a credible and positive future vision of collaborative and transparent action for the betterment of humanity and the planet. It is intended that these Pivotal Principles can be used as an elegant framework for action toward the Digital Earth vision, across local, regional, and international communities and organizations.
This paper offers a portrait of C. S. Peirce as a playful thinker, but also an account of play as that upon which much turns. He was, after all, a philosopher who in his maturity insisted, "a bit of fun helps thought and tends to keep it pragmatical." In doing so, however, Peirce showed in his youth how evasions of responsibility might, in their own way, often be instances of engagement at once playful and more deeply responsible than any attempt to meet the formal expectations of external authority. For the account of play as pivotal, the author draws upon Cornelius Castoriadis and, to a far greater extent, John Dewey. Moreover, he explores an apparently stark contrast between the Peircean emphasis upon habit and the Derridean celebration of play, showing that the opposition is not as thoroughgoing as it might appear. Interweaving Dewey's insights with Peirce's, the author highlights the power of intense play to melt the rigidity of sedimented habits and, thereby, to generate opportunities for habit-change. This suggests that the capacity to act in an imaginative or creative way both ineluctably draws upon habits and also inevitably modifies, often in a dramatic manner, these habits. If the ultimate logical interpretant of a sign-process is, as Peirce suggests, a habit-change, then play is truly pivotal for seeing one of the most important ways in which such alteration takes place.
To gain superior computational efficiency, it might be necessary to change the underlying philosophy of the simplex method. In this paper, we propose a Phase-1 method along this line. We relax not only the conventional condition that some function value increases monotonically, but also the condition that all feasible variables remain feasible after a basis change in Phase-1; that is, we take a purely combinatorial approach to achieving feasibility. This enables us to dispense with the ratio test in pivoting, reducing the computational cost per iteration to a large extent. Numerical results on a group of problems are encouraging.
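For context, the conventional minimum-ratio test that this Phase-1 approach dispenses with can be sketched as follows. This is an illustrative sketch of the textbook rule only, not the authors' algorithm, and the function name is hypothetical:

```python
import numpy as np

def ratio_test(b, d, eps=1e-10):
    """Conventional simplex ratio test: given the current basic values
    b >= 0 and the entering column d (expressed in the current basis),
    choose the leaving row r = argmin { b_i / d_i : d_i > 0 }, so that
    all basic variables stay feasible after the pivot."""
    ratios = np.full(len(b), np.inf)
    mask = d > eps                 # only rows with positive direction limit the step
    ratios[mask] = b[mask] / d[mask]
    r = int(np.argmin(ratios))
    if not np.isfinite(ratios[r]):
        return None                # no positive entry: unbounded direction
    return r

b = np.array([4.0, 2.0, 6.0])
d = np.array([2.0, 2.0, 1.0])
print(ratio_test(b, d))  # min(4/2, 2/2, 6/1) is attained at index 1
```

Skipping this per-iteration scan (and the feasibility guarantee it enforces) is precisely the source of the cost reduction the abstract describes.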
Pancreatic cancer is one of the deadliest cancers, ranking fourth among cancer-related deaths. Despite all the major molecular advances and treatment breakthroughs, mainly targeted therapies, the cornerstone treatment of metastatic pancreatic cancer (mPC) remains cytotoxic chemotherapy. In 2016, more than 40 years after the introduction of gemcitabine in the management of mPC, the best choice for first-line treatment has not yet been fully elucidated. Two main strategies have been adopted to enhance treatment efficacy. The first is based on combining non-cross-resistant drugs, while the second involves the development of newer generations of chemotherapy. More recently, two new regimens, FOLFIRINOX and gemcitabine/nab-paclitaxel (GNP), have both been shown to improve overall survival in comparison with gemcitabine alone, at the cost of increased toxicity. Therefore, the best choice for first-line therapy is a matter of debate. For some authors, FOLFIRINOX should be the first choice in patients with an Eastern Cooperative Oncology Group score of 0-1, given its lower hazard ratio. However, others do not share this opinion. In this paper, we review the main points of comparison between FOLFIRINOX and GNP. We analyze the two pivotal trials to determine the similarities and differences in study design. In addition, we compare the toxicity profiles of the two regimens as well as their impact on quality of life. Finally, we present studies reporting real-life experience and review the advantages and disadvantages of possible second-line therapies, including their cost-effectiveness.
When sampling from a finite population, there is often auxiliary information available at the unit level. Such information can be used to improve the estimation of the target parameter. We show that probability samples that are well spread in the auxiliary space are balanced, or approximately balanced, on the auxiliary variables. A consequence of this balancing effect is that the Horvitz-Thompson estimator will be a very good estimator for any target variable that can be well approximated by a Lipschitz continuous function of the auxiliary variables. Hence we give a theoretical motivation for the use of well-spread probability samples. Our conclusions imply that well-spread samples, combined with the Horvitz-Thompson estimator, are a good strategy in a variety of situations.
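The Horvitz-Thompson estimator referred to here weights each sampled value by the inverse of its first-order inclusion probability. A minimal sketch with hypothetical toy data:

```python
import numpy as np

def horvitz_thompson_total(y_sample, pi_sample):
    """Horvitz-Thompson estimator of a population total: the sum of
    y_i / pi_i over the sample, where pi_i is the first-order inclusion
    probability of unit i under the sampling design."""
    y = np.asarray(y_sample, dtype=float)
    pi = np.asarray(pi_sample, dtype=float)
    return float(np.sum(y / pi))

# Equal-probability sample of n = 4 from N = 8 units: pi_i = n/N = 0.5,
# so the HT total reduces to (N/n) times the sample sum.
y = [3.0, 5.0, 2.0, 6.0]
print(horvitz_thompson_total(y, [0.5] * 4))  # 2 * 16 = 32.0
```

The estimator is design-unbiased for any set of positive inclusion probabilities; the abstract's point is that its variance becomes small when the sample is well spread in the auxiliary space.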
Based on the geological tectonics, aftershock activity, earthquake surface rupture, and peak ground motion, the geometric and dynamic characteristics of the seismogenic tectonics of the 1995 Hanshin earthquake are analysed. The Nojima fault and the Rokko fault have the same trend direction but opposite dips. Their rising and falling plates are in symmetrically diagonal distribution. The two faults can be defined as thrust strike-slip faults and together constitute a pivotal strike-slip fault. The earthquake occurred at the pivot, which is the seismotectonic setting in which the earthquake developed and occurred. Pivotal movement along a strike-slip fault often leads to the occurrence of large earthquakes, and its dynamic process can be demonstrated by stress analysis of the torsion of a beam with a rectangular section. The displacement of the earthquake surface rupture, the aftershock density, and the peak acceleration change within a certain range of epicentral distance, much as the shear stress changes from the center to the sides of the rectangular section. The distribution characteristics of the most heavily damaged areas are also discussed from the aspects of the particular geological tectonics and seismotectonic conditions. The results obtained can be applied not only to recognizing potential earthquake sources in the medium to long term, but also to reasonably building prediction models of earthquake hazard.
Funding (LPM forest inventory study): the Ministry of Agriculture and Forestry key project "Puuta liikkeelle ja uusia tuotteita metsästä" ("Wood on the move and new products from forest") and the Academy of Finland (project numbers 295100 and 306875).
Funding (pivotal family study): supported by the National Natural Science Foundation of China (Grant Nos. 10271013 and 10071090).