A deconvolution data processing method is developed for obtaining a Functionalized Data Operator (FDO) model that is trained to approximate past and present input-output data relations. The FDO model is designed to predict future output features for deviated input vectors from any expected, feared, or conceivable future input, for optimum control, forecasting, or early-warning hazard evaluation. The linearized FDO provides a fast analytical input-output solution in matrix equation form. If the FDO is invertible, the input necessary for a desired output may be explicitly evaluated. A numerical example is presented for FDO model identification and hazard evaluation of methane inflow into the working face of an underground mine. First, a Physics-Based Operator (PBO) model is identified to match the monitored data. Second, FDO models are identified to match the observed short-term time variations in the measured methane inflow data, varying model parameters and simplifications according to the parsimony principle of Occam's Razor. The numerical coefficients of the PBO and FDO models are found to differ by two to three orders of magnitude for methane release as a function of short-term barometric pressure variations. Being data-driven, the significantly different results from the FDO versus the PBO model indicate either that the methane release processes are poorly understood and modeled in the PBO, missing some physics behind the pressure spikes; or that there are problems in the monitored data fluctuations, erroneously sampled in time; or a false correlation. Either way, the FDO model originates from the functionalized form of the monitored data, and its result is considered experimentally significant within the specified RMS error of model matching.
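The linearized-FDO idea described above — an operator identified from input-output data that can be applied forward for prediction and, if invertible, backward to recover the input required for a desired output — can be illustrated with a minimal sketch. This is not the authors' implementation; it simply assumes a linear model Y ≈ A·X fitted by least squares, with synthetic data standing in for the monitored inputs and outputs:

```python
import numpy as np

# Hypothetical sketch: identify a linear "FDO-like" operator A from paired
# input-output data (X, Y), so that Y ≈ A @ X (synthetic data, not mine data).
rng = np.random.default_rng(0)
A_true = np.array([[2.0, 0.5],
                   [0.1, 1.5]])        # unknown operator to be recovered
X = rng.normal(size=(2, 50))           # monitored input vectors (columns)
Y = A_true @ X                         # corresponding output vectors

# Least-squares identification (the deconvolution step in matrix form):
# solve X.T @ B = Y.T for B, then A_fit = B.T.
B, *_ = np.linalg.lstsq(X.T, Y.T, rcond=None)
A_fit = B.T

# Forward use: predict the output for a new, deviated input vector.
x_new = np.array([1.0, -2.0])
y_pred = A_fit @ x_new

# Inverse use: if A_fit is invertible, evaluate the input required
# to produce a desired target output explicitly.
y_target = np.array([3.0, 1.0])
x_required = np.linalg.solve(A_fit, y_target)
```

With noise-free synthetic data the fitted operator matches the generating one, so applying `A_fit` to `x_required` reproduces `y_target`; with real monitored data the fit would only hold within an RMS matching error, as the abstract notes.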
Most traditional artificial intelligence (AI) systems of the past decades are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam's razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking nontraditional predictions based on Zuse's thesis of the computer-generated universe. We first briefly review the history of AI since Gödel's 1931 paper, then discuss recent post-2000 approaches that are currently transforming general AI research into a formal science.