Abstract: In network traffic classification, it is important to understand the correlation between network traffic and its causal application, protocol, or service group, for example, in facilitating lawful interception, ensuring quality of service, preventing application choke points, and identifying malicious behavior. In this paper, we review existing network classification techniques, such as port-based identification and those based on deep packet inspection, statistical features in conjunction with machine learning, and deep learning algorithms. We also explain the implementations, advantages, and limitations associated with these techniques. Our review also extends to publicly available datasets used in the literature. Finally, we discuss existing and emerging challenges, as well as future research directions.
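Port-based identification, the first technique the review names, can be sketched as a lookup against well-known server ports. A minimal sketch in Python; the port table is a small illustrative subset of the IANA registry, and the function name is ours:

```python
# Minimal port-based traffic classifier: maps a flow's server-side port
# to a protocol label. Illustrative subset of IANA well-known ports only.
# Dynamic ports and tunneling defeat this approach, which is why the
# review also covers DPI and ML-based methods.
WELL_KNOWN_PORTS = {
    20: "FTP-DATA", 21: "FTP", 22: "SSH", 25: "SMTP",
    53: "DNS", 80: "HTTP", 110: "POP3", 443: "HTTPS",
}

def classify_by_port(server_port: int) -> str:
    """Return the protocol label for a port, or 'UNKNOWN'."""
    return WELL_KNOWN_PORTS.get(server_port, "UNKNOWN")

print(classify_by_port(443))   # -> HTTPS
print(classify_by_port(8080))  # -> UNKNOWN
```

The `UNKNOWN` bucket is exactly where such a classifier fails in practice, motivating the statistical and deep learning techniques surveyed next.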
Abstract: In the world of wireless sensor networks (WSNs), optimizing performance and extending network lifetime are critical goals. In this paper, we propose a new model called DTLR-Net (Deep Temporal LSTM Regression Network) that employs long short-term memory (LSTM) networks, which are effective for long-term dependencies and can therefore handle the arbitrary movement patterns of mobile sinks. The parameters were initialized iteratively, and each node updated its position, mobility level, and other important metrics at each turn, with key measurements including the active/inactive node ratio, energy consumption per cycle, packets received per node, contact time, and inter-node connection time, among others. These metrics help determine whether the model can remain stable under a variety of conditions. Beyond stability and security, these measurements also help us predict future node behavior and how the network operates. The results show that the proposed model outperformed all other models, achieving a lifetime of 493.5 s for a 400-node WSN that persisted through 750 rounds, a value the other models fell significantly short of. This research has many implications: incorporating deep learning approaches into WSN dynamics is one way to improve network performance, dependability, and sustainability.
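Several of the per-round metrics listed above (active/inactive node ratio, energy consumption per cycle, packets received per node) are simple aggregates over node state. A sketch under assumed record fields; the field names and the energy threshold for "active" are our illustrative choices, not the paper's implementation:

```python
# Sketch of per-round WSN metrics computed over a list of node records.
# Record fields and the 'active' energy threshold are illustrative
# assumptions, not the DTLR-Net paper's actual bookkeeping.
def round_metrics(nodes, active_threshold=0.0):
    active = [n for n in nodes if n["energy"] > active_threshold]
    return {
        "active_ratio": len(active) / len(nodes),
        "energy_per_cycle": sum(n["energy_used"] for n in nodes),
        "packets_received": sum(n["rx_packets"] for n in nodes),
    }

nodes = [
    {"energy": 0.8, "energy_used": 0.05, "rx_packets": 12},
    {"energy": 0.0, "energy_used": 0.00, "rx_packets": 0},   # dead node
    {"energy": 0.4, "energy_used": 0.03, "rx_packets": 7},
]
m = round_metrics(nodes)
print(m["active_ratio"], m["packets_received"])  # 2 of 3 active, 19 packets
```

Sequences of such per-round dictionaries are the kind of temporal data an LSTM regression model would consume.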
Funding: Supported by the National Natural Science Foundation of China (NSFC) Nos. 62172189 and 61772235; the Natural Science Foundation of Guangdong Province No. 2020A1515010771; the Science and Technology Program of Guangzhou No. 202002030372; the UK Engineering and Physical Sciences Research Council (EPSRC) grants EP/P004407/2 and EP/P004024/1; and Innovate UK grant 106199-47198.
Abstract: The advent of Network Function Virtualization (NFV) and Service Function Chains (SFCs) unleashes the power of dynamic creation of network services using Virtual Network Functions (VNFs). This is of great interest to network operators, since poor service quality and resource wastage can hurt their revenue in the long term. However, this study shows, with a set of test-bed experiments, that packet loss at certain positions (i.e., at different VNFs) in an SFC causes varying degrees of resource wastage and performance degradation because of repeated upstream processing and transmission of retransmitted packets. To overcome this challenge, this study focuses on resource scheduling and deployment of SFCs while considering packet loss positions. We developed a novel SFC packet dropping cost model, formulated an SFC scheduling problem that aims to minimize overall packet dropping cost as a Mixed-Integer Linear Program (MILP), and proved that it is NP-hard. We propose Palos, an efficient scheme that exploits the functional characteristics of VNFs and their positions in SFCs to schedule resources and deployment so as to optimize packet dropping cost. Extensive experiment results show that Palos achieves up to 42.73% improvement in packet dropping cost and up to 33.03% reduction in average SFC latency compared with two state-of-the-art schemes.
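The packet dropping cost model rests on the observation that a packet lost late in the chain wastes all upstream processing and transmission. A minimal sketch of that cumulative cost; the per-VNF cost figures are invented for illustration and this is not the paper's MILP formulation:

```python
# If a packet is dropped at VNF position i (0-based), all processing and
# transmission already spent at VNFs 0..i is wasted, and a retransmission
# repeats that upstream work. Cost figures are illustrative only.
def drop_cost(proc_costs, tx_costs, drop_pos):
    """Wasted cost when a packet is dropped after VNF `drop_pos`."""
    return sum(proc_costs[: drop_pos + 1]) + sum(tx_costs[: drop_pos + 1])

proc = [1.0, 2.0, 4.0]   # hypothetical per-VNF processing cost
tx   = [0.5, 0.5, 0.5]   # hypothetical per-hop transmission cost
print(drop_cost(proc, tx, 0))  # drop at the first VNF: 1.5
print(drop_cost(proc, tx, 2))  # drop at the last VNF:  8.5
```

The widening gap between early and late drops is what makes the loss position, not just the loss rate, worth optimizing.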
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 62005307 and 61975228).
Abstract: Structured illumination microscopy (SIM) is a popular and powerful super-resolution (SR) technique in biomedical research. However, the conventional reconstruction algorithm for SIM relies heavily on accurate prior knowledge of the illumination patterns and the signal-to-noise ratio (SNR) of the raw images. To obtain high-quality SR images, several raw images must be captured at high fluorescence levels, which further restricts SIM's temporal resolution and its applications. Deep learning (DL) is a data-driven technology that has been used to push the limits of optical microscopy. In this study, we propose a deep neural network based on multi-level wavelets and an attention mechanism (MWAM) for SIM. Our results show that the MWAM network can extract the high-frequency information contained in SIM raw images and accurately integrate it into the output image, producing superior SR images compared with those generated from wide-field images as input data. We also demonstrate that the number of SIM raw images can be reduced to three, one per illumination orientation, to achieve the optimal tradeoff between temporal and spatial resolution. Furthermore, our MWAM network exhibits superior reconstruction ability on low-SNR images compared with conventional SIM algorithms. We have also analyzed the adaptability of this network to other biological samples and successfully applied the pretrained model to other SIM systems.
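The multi-level wavelet front end separates an image into low- and high-frequency sub-bands; the high-frequency bands carry the information the network must recover. As a stand-in, here is a one-level 1-D Haar transform, the simplest such sub-band split; the real MWAM model uses multi-level 2-D wavelets with attention, which this sketch does not attempt:

```python
import math

# One-level 1-D Haar wavelet transform: the simplest example of the
# sub-band split that multi-level wavelet front ends perform on images.
# Pure-Python stand-in for illustration; it is orthogonal, so signal
# energy is preserved across the two sub-bands.
def haar_1d(x):
    """Return (approximation, detail) coefficients; len(x) must be even."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

a, d = haar_1d([4.0, 4.0, 5.0, 1.0])
print(a)  # low-frequency (smooth) content
print(d)  # high-frequency (edge/detail) content
```

The detail coefficients `d` are the analogue of the high-frequency image content the MWAM network is trained to restore.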
Funding: National Key Research and Development Program of China under Grant No. 2023YFE0102900; National Natural Science Foundation of China under Grant Nos. 52378506 and 52208164.
Abstract: Although the classical spectral representation method (SRM) has been widely used in the generation of spatially varying ground motions, there are still challenges in efficiently simulating the non-stationary stochastic vector process in practice. The first problem is the inherent limitation and inflexibility of the deterministic time/frequency modulation function. Another difficulty is estimating the evolutionary power spectral density (EPSD) from only a few samples. To tackle these problems, the wavelet packet transform (WPT) algorithm is utilized to build a time-varying spectrum of the seed recording that describes the energy distribution in the time-frequency domain. The time-varying spectrum is proven to preserve the time and frequency marginal properties, as the theoretical EPSD does for the stationary process. For the simulation of spatially varying ground motions, the auto-EPSD for all locations is estimated directly from the time-varying spectrum of the seed recording rather than by matching predefined EPSD models. The constructed spectral matrix is then incorporated in SRM to simulate spatially varying non-stationary ground motions using efficient Cholesky decomposition techniques. In addition to a good match with the target coherency model, two numerical examples indicate that the generated time histories retain the physical properties of the prescribed seed recording, including waveform, temporal/spectral non-stationarity, normalized energy buildup, and significant duration.
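The classical SRM that the paper builds on synthesizes a sample as a sum of cosines with random phases. A minimal scalar, stationary sketch with an arbitrary toy PSD; the paper's contributions (WPT-estimated time-varying spectra, multi-location Cholesky coherency) are layered on top of this core and are not shown:

```python
import math, random

# Classical spectral representation method (scalar, stationary case):
#   X(t) = sum_k sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k),  phi_k ~ U(0, 2*pi)
# The one-sided toy PSD S(w) below is an arbitrary choice for illustration.
def srm_sample(psd, w_max, n_freq, t, rng):
    dw = w_max / n_freq
    total = 0.0
    for k in range(n_freq):
        w_k = (k + 0.5) * dw                       # midpoint frequency
        phi_k = rng.uniform(0.0, 2.0 * math.pi)    # random phase
        total += math.sqrt(2.0 * psd(w_k) * dw) * math.cos(w_k * t + phi_k)
    return total

psd = lambda w: 1.0 / (1.0 + w * w)   # toy one-sided PSD
rng = random.Random(0)
print(srm_sample(psd, w_max=20.0, n_freq=256, t=0.0, rng=rng))
```

By construction the variance of the samples equals the integral of the PSD, which is the property the time-varying spectrum generalizes to the non-stationary case.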
Funding: This work was supported by a research grant from Seoul Women's University (2023-0183).
Abstract: Broadcasting gateway equipment generally switches to a spare input stream when a failure occurs in the main input stream. However, when the transmission environment is unstable, frequent switching can shorten equipment lifespan and cause interruption, delay, and stoppage of services. A machine learning (ML) method is therefore required that can automatically judge and classify network-related service anomalies and switch multi-input signals without dropping or altering them, by predicting or quickly determining when transmission errors occur so that streams are switched smoothly. In this paper, we propose an intelligent packet switching method based on supervised ML classification that assesses, from data, the risk level of abnormal multi-streams occurring in broadcasting gateway equipment. Furthermore, we subdivide the risk levels obtained from classification into probabilities, derive vectorized representative values for each attribute of the collected input data, and continuously update them. The resulting reference vector is used in the switching decision through its cosine similarity with the input data observed when a dangerous situation occurs. Broadcasting gateway equipment using the proposed method can switch more stably and intelligently than before, avoiding reliability problems and broadcasting accidents while maintaining stable video streaming.
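The switching decision reduces to a cosine similarity between the continuously updated reference vector and the current observation. A minimal sketch; the vectors and the 0.9 threshold are invented for illustration:

```python
import math

# Cosine similarity between the learned reference vector for a healthy
# stream and the current observation; below a (hypothetical) threshold,
# the gateway switches to the spare stream. All values are illustrative,
# not from the paper.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def should_switch(reference, observation, threshold=0.9):
    return cosine_similarity(reference, observation) < threshold

reference = [0.8, 0.1, 0.05, 0.05]    # learned "healthy stream" profile
healthy   = [0.75, 0.15, 0.05, 0.05]
degraded  = [0.2, 0.3, 0.3, 0.2]
print(should_switch(reference, healthy))   # -> False (keep main stream)
print(should_switch(reference, degraded))  # -> True  (switch to spare)
```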
Funding: Supported by the National Natural Science Foundation of China (U1904177); the Excellent Youth Natural Science Foundation of Henan Province of China (212300410079); the Subproject of the Key Project of the National Development and Reform Commission of China (202203001); the Project of Young Key Teachers in Henan Province of China (2019GGJS01); and Horizontal Research Projects (20230352A).
Abstract: Grouting defects are an inherent challenge in construction practice, exerting a considerable impact on the operational structural integrity of connections. This investigation employed the impact-echo technique to detect grouting anomalies within connections, enhancing its precision through the integration of wavelet packet energy principles for damage identification. A series of grouting completeness assessments were conducted, taking into account variables such as the divergent material properties of the sleeves and the configuration of adjacent reinforcement. The findings revealed that: (i) the energy distribution for the high-strength concrete cohort predominantly occupied frequency bands 42, 44, 45, and 47, whereas for the other groups it was concentrated within frequency bands 37 to 40; (ii) empty sleeves could be effectively discerned by examining the wavelet packet energy ratios across the spectrum of frequencies, although distinguishing between sleeves with 50% and full grouting density proved challenging; and (iii) the wavelet packet energy analysis yielded detection outcomes that varied with the material attributes of the sleeves, demonstrating heightened sensitivity when applied to ultrahigh-performance concrete matrices and GFRP-reinforced steel bars.
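The damage indicator above is the fraction of signal energy falling in each wavelet packet frequency band. A two-level pure-Python Haar wavelet packet decomposition illustrates the computation; practical impact-echo analysis would use a deeper decomposition and a longer wavelet:

```python
import math

# Two-level Haar wavelet packet decomposition: unlike the plain wavelet
# transform, WPD splits BOTH the low- and high-frequency halves at every
# level, giving 4 equal-width bands after 2 levels. The per-band energy
# ratio is the damage-sensitive feature used in impact-echo analysis.
def haar_split(x):
    s = math.sqrt(2.0)
    low  = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    high = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return low, high

def wpd_energy_ratios(x):
    bands = [x]
    for _ in range(2):                     # two decomposition levels
        bands = [half for b in bands for half in haar_split(b)]
    energies = [sum(c * c for c in b) for b in bands]
    total = sum(energies)
    return [e / total for e in energies]

signal = [math.sin(2.0 * math.pi * 3.0 * i / 16.0) for i in range(16)]
print([round(r, 3) for r in wpd_energy_ratios(signal)])  # four ratios, sum 1
```

A shift of energy between bands, relative to a fully grouted reference, is what flags an empty or partially grouted sleeve.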
Abstract: This paper proposes an adaptive hybrid forward error correction (AH-FEC) coding scheme for coping with dynamic packet loss events in video and audio transmission. Specifically, the proposed scheme consists of a hybrid Reed-Solomon and low-density parity-check (RS-LDPC) coding system combined with a Kalman filter-based adaptive algorithm. The hybrid RS-LDPC coding accommodates a wide range of code length requirements, employing RS coding for short codes and LDPC coding for medium-to-long codes. We delimit the short and medium-length codes by coding performance so that both codes remain in their optimal regions. Additionally, a Kalman filter-based adaptive algorithm has been developed to handle dynamic alterations in the packet loss rate. The Kalman filter estimates the packet loss rate from observation data and system models, and a redundancy decision module is then established through receiver feedback. As a result, lost packets can be fully recovered by the receiver from the redundant packets. Experimental results show that the proposed method enhances decoding performance significantly under the same redundancy and channel packet loss.
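The adaptive part can be sketched as a scalar Kalman filter tracking a slowly varying loss rate; the random-walk model and the noise variances below are our illustrative assumptions, not the paper's tuning:

```python
# Scalar Kalman filter tracking a slowly varying packet loss rate.
# Model: p_k = p_{k-1} + w (random walk), observation z_k = p_k + v.
# Process/measurement noise variances Q and R are illustrative values.
class LossRateKalman:
    def __init__(self, p0=0.05, var0=1.0, Q=1e-5, R=1e-3):
        self.p, self.var, self.Q, self.R = p0, var0, Q, R

    def update(self, z):
        # Predict: the random-walk model just inflates the variance.
        var_pred = self.var + self.Q
        # Correct with the observed loss fraction z.
        K = var_pred / (var_pred + self.R)       # Kalman gain
        self.p = self.p + K * (z - self.p)
        self.var = (1.0 - K) * var_pred
        return self.p

kf = LossRateKalman()
for z in [0.02, 0.03, 0.025, 0.04, 0.03]:        # observed loss fractions
    est = kf.update(z)
print(round(est, 4))  # smoothed estimate near the recent observations
```

The smoothed estimate, rather than the raw (noisy) per-window loss fraction, would drive the redundancy decision module.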
Abstract: Many domains, including communication, signal processing, and image processing, use the Fourier Transform as a mathematical tool for signal analysis. Although it can analyze signals with steady and transitory properties, it has limits. The Wavelet Packet Decomposition (WPD) is a novel technique that we suggest in this study as a way to improve on the Fourier Transform and overcome these drawbacks. In this experiment, we specifically considered the use of Daubechies level 4 for the wavelet transformation, a choice motivated by several reasons: Daubechies wavelets are known for their compact support, orthogonality, and good time-frequency localization, and level 4 strikes a balance between preserving important transient information and avoiding excessive noise or oversmoothing in the transformed signal. We then compared the outcomes of our suggested approach to the conventional Fourier Transform using a non-stationary signal. The findings demonstrated that the suggested method offered a more accurate representation of non-stationary and transient signals in the frequency domain. Our method showed a 12% reduction in MSE and a 3% rise in PSNR relative to the standard Fourier transform, as well as a 35% decrease in MSE and an 8% increase in PSNR for voice signals when compared to the traditional wavelet packet decomposition method.
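The MSE and PSNR figures quoted above follow the standard definitions, which for reference are (the sample signals and the peak value are illustrative):

```python
import math

# Standard definitions of the two quality metrics quoted above.
# MSE: mean squared error between reference and reconstruction.
# PSNR: 10*log10(peak^2 / MSE), in dB; higher is better.
def mse(ref, rec):
    return sum((a - b) ** 2 for a, b in zip(ref, rec)) / len(ref)

def psnr(ref, rec, peak=1.0):
    m = mse(ref, rec)
    return float("inf") if m == 0 else 10.0 * math.log10(peak * peak / m)

ref = [0.0, 0.5, 1.0, 0.5]   # illustrative reference signal
rec = [0.1, 0.4, 0.9, 0.5]   # illustrative reconstruction
print(mse(ref, rec))              # approx 0.0075
print(round(psnr(ref, rec), 2))   # -> 21.25 (dB)
```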
Abstract: With the arrival of 4G and 5G, telecommunications networks have expanded considerably. This expansion enabled the integration of many services with adequate throughput, allowing operators to respond to the growing demand of users. This rapid evolution has forced operators to adapt their methods to the new technologies. The complexity grows when these networks combine several different access technologies in a heterogeneous network, as in 4G. The new dimensioning challenges include the considerable increase in demand for services, compatibility with existing networks, management of users' intercellular mobility, and the provision of better quality of service. The solution proposed to meet these new requirements is the dimensioning of the EPC (Evolved Packet Core) network to support the 5G access network. For the case of Orange Guinea, this involves setting up an architecture for interconnecting the core networks of Sonfonia and Camayenne. The objectives of our work are twofold: (1) to propose solutions and recommendations for dimensioning the EPC core network and the deployment to be adopted; (2) to propose an architecture interconnecting the new EPC core network with an existing core network. In our work, we use a communication traffic model to calculate the traffic generated by each technology linked to the core network.
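The per-technology traffic calculation mentioned in the last sentence can be illustrated with a standard busy-hour aggregation; every subscriber count and rate below is a hypothetical placeholder, not an Orange Guinea figure:

```python
# Busy-hour throughput dimensioning sketch: the aggregate demand the EPC
# core must carry is summed over access technologies as
#   subscribers * activity_factor * per-user busy-hour rate.
# All numbers are hypothetical placeholders, not operator data.
def core_demand_mbps(technologies):
    return sum(t["subs"] * t["activity"] * t["rate_mbps"]
               for t in technologies)

techs = [
    {"name": "3G", "subs": 200_000, "activity": 0.02, "rate_mbps": 0.5},
    {"name": "4G", "subs": 300_000, "activity": 0.05, "rate_mbps": 2.0},
    {"name": "5G", "subs": 50_000,  "activity": 0.10, "rate_mbps": 10.0},
]
print(core_demand_mbps(techs))  # roughly 82,000 Mbps of busy-hour demand
```

Capacity for the interconnected EPC would then be provisioned above this aggregate with an engineering margin.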
Abstract: In the special theory of relativity, massive particles can travel at neither the speed of light c nor faster. Meanwhile, since the photon was quantized, many have thought of it as a point particle. How pointed? The idea could be a mathematical device or a physical simplification. By contrast, the preceding notion of wave-group duality has two velocities: a group velocity vg and a phase velocity vp. In light, vp = vg = c; but it follows from special relativity that, in massive particles, vp > c. The phase velocity is the product of the two best-measured variables, and so their product constitutes internal motion that travels, verifiably, faster than light. How does vp then appear in Minkowski space? For light, the spatio-temporal Lorentz invariant metric is s^2 = c^2t^2 − x^2 − y^2 − z^2, the same in whatever frame it is viewed. The space is divided into three parts: firstly, a cone, symmetric about the vertical axis ct > 0, that represents the world line of a stationary particle, while the conical surface at s = 0 represents the locus of light rays that travel at the speed of light c. Since no real thing travels faster than the speed of light c, the surface is also a horizon for what can be seen by an observer starting from the origin at time t = 0. Secondly, an inverted cone represents, equivalently, time past. Thirdly, outside the cones lies inaccessible space. The phase velocity vp, group velocity vg, and speed of light are all equal in free space: vp = vg = c, constant. By contrast, for particles, where causality is due to particle interactions having rest mass mo > 0, we have to employ the Klein-Gordon equation with s^2 = c^2t^2 − x^2 − y^2 − z^2 + mo^2c^2. Now special relativity requires a complication: vp·vg = c^2, where vg < c and therefore vp > c. In the volume outside the cones, causality due to light interactions cannot extend beyond the cones.
However, since vp > c, and even vp >> c when the wavelength λ is long, extreme phase velocities are limited in their causal effects by the particle uncertainty σ, i.e. to vg·t ± σ/ω, where ω is the particle angular frequency. This is the first time the phase range has been described for a massive particle.