A method is presented for detecting changes to the distribution of a criminal or terrorist point process between two time periods using a non-model-based approach. By treating the criminal/terrorist point process as an intelligent site selection problem, changes to the process can signify changes in the behavior or activity level of the criminals/terrorists. The locations of past events and an associated vector of geographic, environmental, and socio-economic feature values are employed in the analysis. By modeling the locations of events in each time period as a marked point process, we can then detect differences in the intensity of each component process. A modified PRIM (patient rule induction method) is implemented to partition the high-dimensional feature space, which can include mixed variables, into the most likely change regions. Monte Carlo simulations are easily and quickly generated under random relabeling to test a scan statistic for significance. By detecting local regions of change, not only can it be determined whether change has occurred in the study area, but the specific spatial regions where change occurs are also identified. An example of breaking-and-entering crimes over two time periods is provided to demonstrate the use of this technique for detecting local regions of change. This methodology also applies to detecting regions of difference between two types of events, such as in case–control data.
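The random-relabeling significance test lends itself to a compact implementation. The sketch below is a minimal illustration, not the paper's PRIM-based procedure: it permutes period labels to approximate the null distribution of a simple scan statistic for a single candidate change region. The function name, the choice of statistic, and the default simulation count are all illustrative assumptions.

```python
import numpy as np

def relabeling_pvalue(labels, in_region, n_sims=999, rng=None):
    """Monte Carlo test of a candidate change region via random relabeling.

    labels   : (n,) array of 0/1 period indicators for all events
    in_region: (n,) boolean mask selecting events inside the candidate region
    Returns a one-sided p-value for an excess of period-1 events in the region.
    """
    rng = np.random.default_rng(rng)

    # Scan statistic (illustrative): period-1 share of events in the region.
    def stat(lab):
        inside = lab[in_region]
        return inside.mean() if inside.size else 0.0

    observed = stat(labels)
    # Under the null of no change, period labels are exchangeable,
    # so we can permute them to simulate the null distribution.
    exceed = 0
    for _ in range(n_sims):
        if stat(rng.permutation(labels)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_sims + 1)
```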
The golden ring to which most physicists aspire is a unified field theory that incorporates all of modern and classical physics. Some scientists and others call this a TOE or ‘theory of everything’, but it is no more than false hubris to believe that humans could possibly know and explain everything about the universe at this time. Einstein chased this goal for the last three decades of his life, basing his theoretical research on his general theory of relativity. Meanwhile, the vast majority of scientists supporting the other major accomplishment of the Second Scientific Revolution were investing all of their time and effort in advancing the quantum theory, and their quest has been extremely successful. They originally had no interest in a unified field theory. After Einstein died in 1955, his efforts were all but abandoned because of his philosophical stance against the prevalent Copenhagen Interpretation of quantum theory, even though he had been one of quantum theory’s founders. During the 1970s the tables started to turn and quantum theorists became interested in unifying physics, although not from the foundational principles of relativity theory. They claimed that quantum theory was more fundamental than relativity, so they began the same quest from a totally different direction, even while claiming to be continuing Einstein’s quest. Throughout the development of the ensuing standard quantum model, superstring theory and many other theoretical schemes, quantum theorists have remained resolute in their conviction that the quantum and relativity are mutually incompatible, so the quantum must completely do away with and replace relativity once and for all. However, the quantum theory and relativity are not actually incompatible and, in fact, say some of the same things about the nature of physical reality. Only when the similarities are fully defined and studied, and the basic assumptions behind each theory are altered to reflect the similarities instead of the incompatibilities, can the point of their compatibility be determined and act as a unifying principle, resulting in a completed unified field theory of the type that Einstein once sought. The development of this physical model of reality is not without irony. Not only is the quantum theory incomplete, as Einstein argued in EPR, but Einstein’s general relativity is also seriously incomplete, and true unification cannot be rendered complete at any level of reality until all the theoretical models being unified are themselves complete.
This summary sheet includes: (1) the “Single Field Theory” (SOFT) equations and their relationships, which amount to a final version of Einstein’s unified field theory; (2) a “Unification Tree” that shows the historical relationship between various unification attempts; (3) the chart “Evolution of Classical Unified Field Theories”, which shows the development of Einstein’s and related work; (4) a 1980 diagram of a particle in quantized curved space-time; (5) a more modern depiction of a particle in quantized curved space-time; (6) a rough picture of a lithium-6 nucleus that unifies the features of the shell models, fluid drop models, Yukawa potential and quantum tunneling; (7) a graphical representation of the electronic orbits around a nucleus in the context of the quantized space-time continuum; and (8) a graph showing “The Atom as single field potential”. This handout from the Vigier IX Symposium accompanied the presentation "Modern Fysics Phallacies: The best way ‘not to unify physics’ turns out to be the only way ‘to unify physics’."
Many physicists believe that the unification of physics within a single paradigmatic theory has been the primary goal in science for only the past few decades, but this is not true. Unification was the original goal of Einstein and a few other physicists from the 1920s to the 1960s, a period during which quantum theorists were ironing out their own unique set of problems. The unification of physics under the guise of the quantum paradigm only emerged during the 1970s and has since overshadowed all attempts to unify physics from the fundamental principles of relativity. Today, too many physicists believe, without any supporting evidence, that the concept of the quantum is far more fundamental than relativity, so earlier attempts based on the continuity of relativity have been all but abandoned. However, both approaches are basically flawed, because both relativity and the quantum theory are incomplete as they now stand. Had either side of the quantum versus relativity controversy simply simplified its worldview and sought commonality with the other, unification would have been accomplished long ago. The point is, literally, that the discrete quantum, continuous relativity, basic physical geometry and classical physics all share one common characteristic: a paradoxical duality between a dimensionless point (discrete) and an extended length (continuity) in any dimension. If the problem of unification is approached from an understanding of how this paradox relates to each paradigm, all of physics could be unified under a single new theoretical paradigm. Unfortunately, there has never been a method, either mathematical or physical, by which a three-dimensional space can be generated from two or more dimensionless points. This shortcoming raises the question: how can the dimensionless point-particles of the Standard Model, which presently dominates physics, be extended to account for the three-dimensional space in which the physical interactions they describe occur? Fortunately, this question can now be answered, but the answer does not favor the Standard Model of the quantum as it is presently interpreted. Instead, a unified field theory based on continuity has been completed that unifies the quantum and relativity and incorporates the Standard Model as well as the Superstring, Brane and commonly accepted classical theories.
One aspect of tactical crime or terrorism analysis is predicting the location of the next event in a series. The objective of this article is to present a methodology for identifying the optimal parameters of, and testing the performance of, temporally weighted kernel density estimation models for predicting the next event in a criminal or terrorist event series. By placing event series in a space-time point pattern framework, the next-event prediction models are shown to be based on estimating a conditional spatial density function. We use temporal weights that indicate how much influence past events have on predicting future event locations and that can also incorporate uncertainty in the event timing. Results of applying this methodology to crime series in Baltimore County, MD, indicate that performance can vary greatly by crime type, varies little with series length, and is fairly robust to the choice of bandwidth.
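As a concrete illustration, the conditional spatial density can be sketched as a temporally weighted kernel density estimate: each past event contributes a spatial kernel whose weight decays with the time elapsed since that event. The snippet below is a minimal sketch under assumed choices (an isotropic Gaussian kernel and exponential temporal decay); the function name and parameter defaults are hypothetical, not the paper's calibrated model.

```python
import numpy as np

def next_event_density(grid_xy, events_xy, event_times, t_now,
                       bandwidth=500.0, decay=0.05):
    """Temporally weighted Gaussian KDE for next-event prediction.

    grid_xy    : (m, 2) prediction locations
    events_xy  : (n, 2) past event locations
    event_times: (n,) event times (same units as t_now)
    decay      : exponential temporal decay rate (illustrative choice)
    Returns an (m,) array proportional to the conditional spatial density.
    """
    # Exponential temporal weights: recent events count more.
    w = np.exp(-decay * (t_now - np.asarray(event_times, dtype=float)))
    w = w / w.sum()
    # Squared distances between every grid point and every past event.
    d2 = ((grid_xy[:, None, :] - events_xy[None, :, :]) ** 2).sum(axis=2)
    # Weighted sum of isotropic Gaussian kernels centered at past events.
    k = np.exp(-0.5 * d2 / bandwidth**2) / (2 * np.pi * bandwidth**2)
    return k @ w
```

Scoring a model then amounts to ranking grid cells by this density and checking where the actual next event falls, which is one natural way to compare bandwidths and decay rates.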
Objective:
This article explores patterns of terrorist activity over the period from 2000 through 2010 across three target countries: Indonesia, the Philippines and Thailand.
Methods:
We use self-exciting point process models to create interpretable and replicable metrics for three key terrorism concepts: risk, resilience and volatility, as defined in the context of terrorist activity.
Results:
Analysis of the data shows significant and important differences in the risk, volatility and resilience metrics over time across the three countries. For the three countries analysed, we show that risk varied on a scale from 0.005 to 1.61 “expected terrorist attacks per day”, volatility ranged from 0.820 to 0.994 “additional attacks caused by each attack”, and resilience, as measured by the number of days until risk subsides to a pre-attack level, ranged from 19 to 39 days. We find that, of the three countries, Indonesia had the lowest average risk and volatility and the highest level of resilience, indicative of the relatively sporadic nature of terrorist activity in Indonesia. The high terrorism risk and low resilience in the Philippines were a function of a more intense, less clustered pattern of terrorism than was evident in Indonesia.
Conclusions:
Mathematical models hold great promise for creating replicable, reliable and interpretable “metrics” for key terrorism concepts such as risk, resilience and volatility.
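One plausible reading of these metrics for a self-exciting (Hawkes) point process with an exponential kernel is: risk as the current conditional intensity, volatility as the branching ratio (expected additional attacks triggered per attack), and resilience as the time for one attack's excess intensity to decay back toward the baseline. The sketch below illustrates this reading with made-up parameter values; it is not the paper's fitted model, and the threshold used for resilience is an assumption.

```python
import numpy as np

def hawkes_intensity(t, history, mu, alpha, beta):
    """Conditional intensity of a Hawkes process with exponential kernel:
    lambda(t) = mu + sum_{t_i < t} alpha * beta * exp(-beta * (t - t_i))."""
    past = np.asarray(history, dtype=float)
    past = past[past < t]
    return mu + alpha * beta * np.exp(-beta * (t - past)).sum()

# Illustrative parameters (hypothetical, not fitted values from the paper):
mu, alpha, beta = 0.05, 0.5, 0.1   # baseline rate, branching ratio, decay
attacks = [1.0, 3.5, 4.0, 10.0]    # event times in days

risk = hawkes_intensity(11.0, attacks, mu, alpha, beta)  # attacks/day now
volatility = alpha                  # expected offspring per attack
# Resilience: days until one attack's excess intensity drops below eps.
eps = 0.01
resilience = np.log(alpha * beta / eps) / beta            # ~16 days here
```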
The use of graphics processing unit (GPU) parallel processing is becoming a part of mainstream statistical practice. The reliance of Bayesian statistics on Markov chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. We illustrate that computing the likelihood with GPU parallel processing yields substantial gains in computational time for MCMC and other methods of evaluation. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010, with a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
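The key pattern, evaluating the likelihood over many observations and many MCMC proposals in parallel on the GPU, can be sketched in a few lines. The example below uses JAX rather than the paper's implementation, and a plain negative-binomial log-likelihood as a stand-in for the paper's convolution of two negative-binomial processes; all names and parameter values are illustrative.

```python
import jax
import jax.numpy as jnp
from jax.scipy.special import gammaln

def nb_loglik(params, counts):
    """Negative-binomial log-likelihood of daily event counts.
    params = (r, p); a simplified stand-in for the convolved likelihood."""
    r, p = params
    k = counts
    return jnp.sum(gammaln(k + r) - gammaln(r) - gammaln(k + 1.0)
                   + r * jnp.log(1.0 - p) + k * jnp.log(p))

# Evaluate the likelihood for a whole batch of MCMC proposals at once;
# on a GPU, jit + vmap parallelise over observations and proposals.
batched_loglik = jax.jit(jax.vmap(nb_loglik, in_axes=(0, None)))

counts = jnp.array([0., 2., 1., 0., 3., 5., 1.])        # toy daily counts
proposals = jnp.stack([jnp.linspace(0.5, 5.0, 1024),    # r proposals
                       jnp.full(1024, 0.4)], axis=1)    # p proposals
logliks = batched_loglik(proposals, counts)
```

The design point is that the likelihood factors into independent per-observation terms, so the expensive sum is embarrassingly parallel even though the MCMC chain itself remains sequential.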
The presence of recurring arrhythmic events (also known as cardiac dysrhythmia or irregular heartbeats), as well as erroneous beat detection due to low signal quality, significantly affects estimation of both time- and frequency-domain indices of heart rate variability (HRV). Reliable, real-time classification and correction of ECG-derived heartbeats is a necessary prerequisite for accurate online monitoring of HRV and cardiovascular control. We have developed a novel point-process-based method for real-time R-R interval error detection and correction. Given an R-wave event, we assume that the length of the next R-R interval follows a physiologically motivated, time-varying inverse Gaussian probability distribution. We then devise an instantaneous automated detection and correction procedure for erroneous and arrhythmic beats, using the probability of occurrence of the observed beat provided by the model. We test our algorithm on two datasets from the PhysioNet archive. The Fantasia normal rhythm database is artificially corrupted with known erroneous beats to test both the detection and correction procedures. The benchmark MIT-BIH Arrhythmia database is further considered to test the detection of real arrhythmic events and to compare results with previously published algorithms. Our automated algorithm represents an improvement over previous procedures, with the best specificity for the detection of correct beats and the highest sensitivity to missed, extra, and artificially misplaced beats, as well as to real arrhythmic events. Near-optimal heartbeat classification and correction, together with the ability to adapt to time-varying changes in heartbeat dynamics in an online fashion, may provide a solid base for building a more reliable real-time HRV monitoring device.
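The core check, asking how probable the observed R-R interval is under the model's current inverse Gaussian distribution, can be sketched as follows. This is a minimal illustration assuming fixed mean and shape parameters at the current time (in the actual method these evolve with the point-process model); the function names, grid resolution, and flagging threshold are hypothetical.

```python
import numpy as np

def inv_gauss_pdf(t, mu, lam):
    """Inverse Gaussian density of the next R-R interval length t,
    with mean mu and shape lam."""
    return np.sqrt(lam / (2 * np.pi * t**3)) * np.exp(
        -lam * (t - mu) ** 2 / (2 * mu**2 * t))

def flag_beat(rr_observed, mu_t, lam_t, n_grid=4096, alpha=0.001):
    """Flag a beat whose observed R-R interval is improbable under the
    current model. The tail probability is computed numerically; the
    threshold alpha is an assumed value, not the paper's."""
    grid = np.linspace(1e-4, 5 * mu_t, n_grid)
    pdf = inv_gauss_pdf(grid, mu_t, lam_t)
    cdf = np.cumsum(pdf) * (grid[1] - grid[0])    # crude numeric CDF
    p_lower = np.interp(rr_observed, grid, cdf)
    p_tail = min(p_lower, 1.0 - p_lower)          # two-sided tail prob.
    return p_tail < alpha                         # True -> suspect beat
```

A flagged interval would then be handed to a correction step, e.g. testing whether splitting it (a missed beat) or merging it with a neighbor (an extra beat) restores a probable value.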
Likelihood-based encoding models founded on point processes have received significant attention in the literature because of their ability to reveal the information encoded by spiking neural populations. We propose an approximation to the likelihood of a point-process model of neurons that holds under assumptions about the continuous-time process that are physiologically reasonable for neural spike trains: the presence of a refractory period, the predictability of the conditional intensity function, and its integrability. These properties apply to a large class of point processes arising in applications other than neuroscience. The proposed approach has several advantages over conventional ones. In particular, one can use standard fitting procedures for generalized linear models based on iteratively reweighted least squares while improving the accuracy of the approximation to the likelihood and reducing bias in the estimation of the parameters of the underlying continuous-time model. As a result, the proposed approach can use a larger bin size to achieve the same accuracy as conventional approaches would with a smaller bin size. This is particularly important when analyzing neural data with high mean and instantaneous firing rates. We demonstrate these claims on simulated and real neural spiking activity. By allowing a substantive increase in the required bin size, our algorithm has the potential to lower the barrier to the use of point-process methods in an increasing number of applications.
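For context, the conventional discrete-time approximation that such methods build on bins spike times and treats each bin as an approximately Poisson count, so the point-process log-likelihood reduces to a GLM objective. Below is a minimal sketch of that conventional baseline, with illustrative names; it does not implement the paper's improved approximation.

```python
import numpy as np

def binned_pp_loglik(spikes, lam, dt):
    """Conventional discrete-time point-process log-likelihood:
        log L ~= sum_i [ y_i * log(lam_i * dt) - lam_i * dt ]
    spikes: (n,) spike counts per bin (0/1 when dt is small)
    lam   : (n,) conditional intensity evaluated in each bin
    dt    : bin width; this approximation degrades as dt grows,
            which is the regime the paper's method targets."""
    lam_dt = lam * dt
    return float(np.sum(spikes * np.log(lam_dt) - lam_dt))
```

Because this objective coincides with a Poisson regression log-likelihood, it can be maximized with standard IRLS-based GLM software, which is the fitting machinery the paper's approximation preserves.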
Robust and automated classification and correction of ECG-derived heartbeats are a necessary prerequisite for accurate real-time estimation of measures of heart rate variability and cardiovascular control. In particular, low signal quality, as well as the presence of recurring arrhythmic events, may significantly affect estimation accuracy. We present here a novel point-process-based method for real-time R-R interval error detection and correction.