Abstract: The first part of this article develops [1] a closed-universe model deployed by identical multiplication of a Friedmann-Planck micro-universe; this micro-universe thus constitutes the grains of the vacuum of the universe. The initial quantum expansion is quadratic as a function of time. Using this model, the density of matter calculated for the present time gives a correct numerical result. The essential point is that, during the periods of expansion following the initial quadratic period, the model reveals a surprising phenomenon: the function expressing the radius of curvature as a function of time depends on the individual mass of the heaviest elementary particles created at the end of the quadratic period. The model also invites reflection on dark matter. The second part imagines a new type of Big Rip based on the following hypothesis: when the acceleration of the Universe, driven by dark energy, reaches the Planck acceleration, the microscopic structure of the Universe is destroyed and replaced by a macroscopic structure (photon spheres) identical to that of the initial Planck element. A new Big Bang could then begin on an immensely larger scale. This reasoning ultimately leads to reflection on the origins of the Big Bang itself.
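As an illustration of the two quantities this abstract invokes, the sketch below writes a quadratic initial expansion normalized to the Planck element and recalls the standard definition of the Planck acceleration. The normalization R(t_P) = ℓ_P is an assumption made here for illustration, not the paper's exact solution; the full coefficients are developed in [1].

```latex
% Illustrative only: a quadratic initial expansion normalized so that
% the radius of curvature equals the Planck length at the Planck time.
R(t) = \ell_P \left(\frac{t}{t_P}\right)^{2}, \qquad R(t_P) = \ell_P .

% Planck acceleration, the trigger value in the Big Rip hypothesis:
a_P = \frac{c}{t_P} = \sqrt{\frac{c^{7}}{\hbar G}}
    \approx 5.6 \times 10^{51}\ \mathrm{m\,s^{-2}} .
```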
Abstract: The big problem of Big Data is the lack of a machine-learning process that both scales and finds meaningful features. Humans fill in for the insufficient automation, but the complexity of the tasks outpaces the human mind's capacity to comprehend the data. Heuristic partition methods may help, but they still need humans to adjust the parameters. The same problems exist in many other disciplines and technologies that depend on Big Data or machine learning. Proposed here is a fractal groupoid-theoretical method that recursively partitions the problem and requires no heuristics or human intervention. It takes two steps. First, make explicit the fundamental causal nature of information in the physical world by encoding it as a causal set. Second, construct a functor F: C → C′ on the category of causal sets that morphs causal set C into a smaller causal set C′ by partitioning C into a set of invariant groupoid-theoretical blocks. Repeating the construction yields a sequence of progressively smaller causal sets C, C′, C″, … The sequence defines a fractal hierarchy of features, with the features being invariant and hence endowed with physical meaning, and the hierarchy being scale-free and hence ensuring proper scaling at all granularities. Fractals exist in nature nearly everywhere and at all physical scales, and invariants have long been known to be meaningful to us. The theory is also of interest for NP-hard combinatorial problems that can be expressed as a causal set, such as the Traveling Salesman Problem (TSP). The recursive groupoid partition promoted by functor F works against their combinatorial complexity and appears to allow a low-order polynomial solution. A true test of this property requires special hardware that is not yet available. However, as a proof of concept, a suite of sequential, non-heuristic algorithms was developed and used to solve a real-world 120-city instance of TSP on a personal computer. The results are reported.
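To make the two-step construction concrete, here is a minimal, hedged Python sketch. It models a causal set as a map from each element to its set of immediate predecessors, and it approximates the paper's invariant groupoid-theoretical blocks by the simplest causal invariant available: grouping elements that share the same direct past. The names coarsen and hierarchy are illustrative, not the paper's algorithms.

```python
from collections import defaultdict

def coarsen(causet):
    """One application of a functor-like map C -> C': group elements
    with identical sets of immediate predecessors into blocks, then
    induce the causal relation on the blocks. (The paper's groupoid
    blocks are assumed to refine this simple invariant.)"""
    groups = defaultdict(set)
    for elem, past in causet.items():
        groups[frozenset(past)].add(elem)            # same direct past -> same block
    block_of = {m: frozenset(members)
                for members in groups.values() for m in members}
    quotient = {}
    for elem, past in causet.items():
        blk = block_of[elem]
        preds = {block_of[p] for p in past} - {blk}  # induced precedence on blocks
        quotient.setdefault(blk, set()).update(preds)
    return quotient

def hierarchy(causet, max_levels=10):
    """Iterate the construction to obtain C, C', C'', ... and stop
    when the causal set no longer shrinks."""
    levels = [causet]
    while len(levels) <= max_levels:
        smaller = coarsen(levels[-1])
        if len(smaller) >= len(levels[-1]):
            break
        levels.append(smaller)
    return levels

# Usage: a four-element "diamond" causal set a < b, a < c, b < d, c < d.
# b and c share the same past {a}, so the first coarsening merges them,
# shrinking the causal set from 4 elements to 3.
C = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
print([len(level) for level in hierarchy(C)])        # -> [4, 3]
```

Even this toy invariant exhibits the shrinking sequence C, C′, C″, … that the abstract describes; the groupoid machinery of the paper is what makes the blocks invariant in the strong, physically meaningful sense.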