Geoffrey Matthews
  • Bellingham, Washington, United States
Inverse kinematics greatly simplifies the animation of models: positioning the hand alone will position the entire arm. In some dance styles, the position of the feet, together with which foot the weight is on, determines much of the bending and rotation of the legs, hips, rib cage, shoulders, and arms. Cuban motion is a highly stylized example of this, used in several dances such as Salsa, Rumba, and Cha-cha. The principles governing Cuban motion, when combined with normal rigging constraints, allow a wide variety of dances and dance moves to be synthesized rapidly with minimal input. Only the timing of the weight changes (usually fixed for each dance) and the placement of the feet (usually fixed for each dance move) need be specified. In this talk we outline the principles of Cuban motion and demonstrate how natural-looking dance moves can be procedurally generated. We have found that the code for dance moves can be simplified enough to resemble the instructions given in dance guidebooks.
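The minimal input described above (weight-change timing plus foot placement) can be sketched in a few lines. All names, the data layout, and the move fragment below are hypothetical illustrations, not the talk's actual code:

```python
# A dance move as a list of (beat, foot, (x, y)) entries: weight
# transfers to `foot` at `beat` and stays there until the next entry.
# Toy fragment of a Rumba box step; placements are illustrative.
RUMBA_BOX = [
    (1, "L", (0.0, 0.3)),   # left foot forward, weight onto left
    (2, "R", (0.0, 0.0)),   # weight back onto right, in place
    (3, "L", (-0.3, 0.0)),  # left foot to the side, weight onto left
]

def weighted_foot_at(steps, beat):
    """Return which foot carries the weight at a given beat.

    Per the Cuban-motion principles above, this single value (plus the
    foot placements) would drive the hips, rib cage, and arm styling
    through ordinary rigging constraints.
    """
    current = steps[0][1]
    for b, foot, _placement in steps:
        if b <= beat:
            current = foot
    return current
```

Note how close the data resembles a guidebook's step chart: one line per beat, naming the foot and where it goes.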
We present an overview of current research in the measurement of enjoyment in video games. We attempt to systematize the disparate approaches found, using the vocabulary and constructs of quasi-experimentation. In each area we try to make some recommendations for improvement, and we find some areas in need of further research.
This research paper describes a new method by which man-made structures, specifically medieval castles, can be intelligently sited and constructed in a three-dimensional environment given only a random terrain and simple user-defined suggestions. Our method of procedural modeling makes use of convex hulls, extracted from a given terrain, to provide several construction elements that are used to complete the finished model. In developing the procedurally modeled castles, we employed hybrid multifractals using Perlin noise as a way to generate random terrains. Implementing random fractal terrain generation was essential for testing many cases, as opposed to using the same terrain over and over, and provided a fast and efficient means of testing components of the modeling method on several different terrains. The first step this modeling procedure takes is common to the construction of man-made buildings. Just as one would do when first setting up a tent, for example, the first step is to locate a suitable site: a satisfactory foundation on which to build. Our approach was to search the generated terrain for those facets whose normals are within some tolerance of straight up. This finds the portions of the terrain that are the most horizontal and therefore offer the most potential as a natural foundation for the castle while requiring the least movement of soil. Once the horizontal facets of the terrain have been located, they are grouped together. A decision must then be made as to what shape the encompassing foundation will take, because in most cases the grouped facets will contain holes or appear to branch, which does not lend itself well to a foundation. Obviously bounding squares, rectangles, circles, or any other user-defined shape could do this job.
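The site-search step described above, keeping facets whose normals are within some tolerance of straight up, can be sketched as follows. The triangle-soup data layout and the tolerance value are assumptions for illustration, not the paper's implementation:

```python
import math

def up_facets(triangles, tol_deg=10.0):
    """Return facets whose normals are within tol_deg of straight up.

    triangles: list of ((x, y, z), (x, y, z), (x, y, z)) terrain facets.
    These near-horizontal facets are the candidate foundation sites.
    """
    keep = []
    for a, b, c in triangles:
        # Two edge vectors and their cross product give the facet normal.
        u = tuple(b[i] - a[i] for i in range(3))
        v = tuple(c[i] - a[i] for i in range(3))
        n = (u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0])
        length = math.sqrt(sum(x * x for x in n)) or 1.0
        cos_angle = n[2] / length  # dot product with straight-up (0, 0, 1)
        if cos_angle >= math.cos(math.radians(tol_deg)):
            keep.append((a, b, c))
    return keep
```

A flat triangle in the xy-plane passes the test; a vertical cliff facet, whose normal is horizontal, is rejected.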
In fact, many castles are indeed built to be quite symmetric, and using a user-defined shape may provide a very desirable foundation. However, our method implements a more general approach that provides a unique foundation, and therefore a unique castle construction, each time it is presented with a terrain. Our approach to foundation shape creation is to take the points from the grouped horizontal facets and use their convex hull as the boundary of the foundation. One major advantage and positive visual effect of a convex hull foundation over a user-defined bounding shape is that it gives the appearance of a castle built to suit its environment, with minimal modification to the terrain. To provide easy access to these candidate foundation hulls, a scoring system was created which accepts a user's request for a convex hull from the set generated initially. The request defines desired attributes such as position, altitude, and area, along with weights that specify the importance of each attribute. The foundation finder then scores the hulls according to the user's request and returns the highest-scoring hull. We discovered that the convex hulls lent themselves very nicely to the other elements of the castle. Castle walls could be built simply by projecting the sides of the foundation hull, or of a similar hull, onto the terrain where the walls should be built. In addition, concentric outer and inner walls could be built by using hulls varying in scale and by perturbing the points of the hull to provide additional randomness. Towers could simply be placed at the points along the boundary of the hulls. The convex hulls were also employed in the creation of the castle moat.
Using three hulls: one for the inner shore, another to define the middle band of the moat where it is at its deepest, and one for the outer shore, the terrain can be interpolated into a v-shape to create the moat. Additional terrain interpolation was also implemented to erode raised foundations back into the generated terrain, preventing sharp cliff-like artifacts; this too used convex hull boundaries to specify the points that were to be interpolated. Convex hulls were also used when expanding the size of the castles. Castles often outgrew their usefulness and would need to be expanded in order to further serve a meaningful purpose. When expanding the area of a castle, we again put the request system to use, reissuing the foundation request that was used to obtain the first convex hull foundation. Using the same request lets us retrieve the most similar, and more importantly the closest, piece of foundation-worthy land. We can then create a new convex hull from an enlarged version of our initial foundation hull together with the newly requested hull, for example in order to create a new outer wall.
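The request and scoring system mentioned above might look roughly like this. The attribute names and the linear weighted-distance score are guesses for illustration, not the paper's actual interface:

```python
def score_hull(hull, request):
    """Score a candidate foundation hull against a user request.

    hull:    dict of measured attributes, e.g. {"altitude": 40, "area": 900}
    request: {attribute: (desired_value, weight)}, where a higher weight
             makes that attribute matter more. Higher score = better match.
    """
    return -sum(weight * abs(hull[attr] - desired)
                for attr, (desired, weight) in request.items())

def best_hull(hulls, request):
    """Return the highest-scoring candidate hull for the request."""
    return max(hulls, key=lambda h: score_hull(h, request))
```

Reissuing the same request after the first hull is consumed would then return the closest remaining foundation-worthy hull, as used in the castle-expansion step.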
This report is part of an on-going series of annual reports and special project reports that document the Lake Whatcom monitoring program. This work is conducted by the Institute for Watershed Studies and other departments at Western Washington University. The major objective of this program is to provide long-term baseline water quality monitoring in Lake Whatcom and selected tributaries. Each section contains brief explanations about the water quality data, along with discussions of patterns observed in Lake Whatcom.
A methodology is described for clustering data in which no distance metric or similarity function is used. Instead, clusterings are optimized based on their intended function: the accurate prediction of properties of the data. The resulting clustering methodology is applicable, without further ad hoc assumptions or transformations of the data, (1) when features are heterogeneous (both discrete and continuous)
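One toy way to make "clustering for prediction" concrete is to score a clustering by how well cluster membership alone predicts each feature's value, with no distance metric anywhere. This is an illustrative sketch over discrete features, not the paper's actual objective function:

```python
def prediction_score(data, clusters):
    """Score a clustering by predictive accuracy, not by distances.

    For every cluster and every feature, predict the cluster's majority
    (modal) value and count how many members are predicted correctly.
    A clustering that groups mutually predictive records scores higher.

    data:     list of tuples of discrete feature values.
    clusters: list of lists of indices into data.
    """
    correct = 0
    n_features = len(data[0])
    for members in clusters:
        for f in range(n_features):
            values = [data[i][f] for i in members]
            majority = max(set(values), key=values.count)
            correct += sum(1 for v in values if v == majority)
    return correct
```

A clustering could then be optimized by any search procedure (hill climbing, swaps) that increases this score; note that discrete and continuous features could be mixed by swapping in an appropriate per-feature predictor.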
Ecological studies and multispecies ecotoxicological tests are based on the examination of a variety of physical, chemical and biological data with the intent of finding patterns in their changing relationships over time. The data sets resulting from such studies are often noisy, incomplete, and difficult to envision. We have developed machine learning and visualization software to aid in the analysis,
In this research program, new methods of data analysis were applied to the analysis of multispecies toxicity tests using three complex toxicants. The water-soluble fractions of the turbine fuels Jet-A, JP-4, and JP-8 have been examined as stressors for two microcosm protocols, the standardized aquatic microcosm (SAM) and the mixed flask culture (MFC). The SAM is a 3 L system inoculated with standard cultures of algae, zooplankton, bacteria, and protozoa. In contrast, the MFC is 1 L and is inoculated with a complex mixture of organisms derived from a natural source. Analysis of the organism counts and physical data was conducted using conventional and newly derived multivariate nonmetric clustering methods and computer visualization techniques. Several fundamental discoveries regarding the impacts of toxicants on ecological systems were made. The first is that recovery of an ecosystem, in the sense that it returns to the original or reference state, is not a property of these systems; in fact, it is unlikely that recovery is a property of other, larger ecological systems. In our experiments, the various treatment groups incorporated information about toxicant concentration that was expressed after periods of so-called recovery. The differentiation of the treatment groups occurred even after the elimination of the toxicant from the test system. Another fundamental discovery is that multispecies toxicity tests are not repeatable, although within one experiment the replicates of a treatment group are replicable. In other words, initial conditions are important. The outcome of this research may lead to a new viewpoint in describing the impacts of toxicants on complex ecological systems. This viewpoint is described as the Community Conditioning Hypothesis.
This study evaluated the usefulness of groups of biomarkers as measures of exposure, and which statistical approach would be most robust for analysing such data. We used both analysis of variance (ANOVA) and nonmetric cluster and association analysis (NCAA) to look for patterns in biomarker responses of populations of gray-tailed voles (Microtus canicaudus) in field enclosures exposed to azinphos-methyl (Guthion 2S) at 0.0, 1.55, and 4.67 kg active ingredient (AI) ha-1 (four enclosures per treatment level). Biomarkers measured were hematocrit, total leukocyte counts, leukocyte differentiation, plasma lactate dehydrogenase (LDH), isocitrate dehydrogenase, and creatine phosphokinase (CPK) activities, and plasma creatinine and blood urea nitrogen concentrations. Brain cholinesterase (AChE) activity was measured in a subset of animals. The ANOVA was able to distinguish differences between treatment groups only for brain AChE. The NCAA confirmed the ANOVA analysis that brain AChE activity differed ...
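For reference, the one-way ANOVA F statistic that the study relies on can be computed directly from the standard sums-of-squares formulas. This is a generic textbook computation, not the study's code:

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic for k groups of measurements.

    F = (between-group mean square) / (within-group mean square);
    larger F indicates stronger separation of the group means.
    groups: list of lists of numeric measurements.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Sum of squares between groups: group sizes times squared
    # deviations of group means from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Sum of squares within groups: squared deviations from group means.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

Identical group means give F = 0; well-separated means give a large F, which would then be compared against the F distribution with (k-1, n-k) degrees of freedom.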
AUTHORS: Wayne G. Landis, Robin A. Matthews, Geoffrey B. Matthews. FUNDING NUMBERS: USAFOSR Grant No. F49620-94-1-285. Geoffrey B. Matthews, Computer Science Department, Western Washington University, Bellingham, WA 98225 ...
A test of the community conditioning hypothesis: Persistence of effects in model ecological structures dosed with the jet fuel JP-8. Wayne G. Landis, April J. Markiewicz, Robin A. Matthews, Geoffrey B. Matthews. Article first published online: 2 NOV 2009.
In this paper we compare principal components analysis, hierarchical clustering, correspondence analysis, and conceptual clustering to show their effectiveness for identifying patterns in a large limnological data set. The data for this comparison come from a multi-year study of Lake Whatcom, a large lake located in the Puget Sound lowlands of the state of Washington. The data include both physical and chemical parameters (temperature, dissolved oxygen, pH, alkalinity, turbidity, conductivity, and nutrients) as well as biological parameters (Secchi depth, chlorophyll a, and phytoplankton species and total counts). The patterns we expected to find include (a) temperature and dissolved oxygen interactions, (b) ordination by algal bloom sequences, and (c) clustering due to the effects of stratification. Principal components analysis was somewhat useful for confirming known water quality trends, but did not successfully identify large-scale patterns such as stratification and seasonal plankton changes. Correspondence analysis proved to be superior to principal components analysis for detecting phytoplankton trends, but was not as good for interpreting water quality changes. Hierarchical clustering produced highly unbalanced trees for both the water quality and phytoplankton data, and was useless as an exploratory tool. A new approach to clustering, implemented in the computer program riffle, is introduced here. This clustering algorithm outperformed the other exploratory tools in clustering and parameter ordination, and successfully identified a number of expected and unexpected patterns in the limnological data.
Macroinvertebrates were collected at four sites in Padden Creek, a small second-order stream in Whatcom County, Washington, USA. Two upstream sites were characterized by high densities of sensitive taxa, predominantly mayflies, stoneflies, and caddisflies, and two downstream sites showed high densities of tolerant taxa, especially true flies, annelids, Baetis mayflies, and gastropods. Despite the small sample size, some statistical techniques proved useful. The first two components of correspondence analysis were used to confirm the existence of both seasonal and spatial trends in the benthic macroinvertebrate populations of the stream. Neither component alone, however, ordinated the samples with respect to these trends. Combinations of the first two components were required. A standard clustering technique, k-means clustering with squared Euclidean distance, further confirmed the seasonal trend. Nonmetric clustering, not widely used in the analysis of ecological data, was necessary...
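The standard technique named above, k-means clustering with squared Euclidean distance, can be sketched in plain Python. This is a generic implementation for illustration, not the study's code:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means with squared Euclidean distance.

    points: list of equal-length numeric tuples (the samples).
    Returns a list of cluster labels, one per point.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)       # initialize from the data
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center,
        # measured by squared Euclidean distance.
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centers[c])))
        # Update step: each center moves to the mean of its members.
        for c in range(k):
            members = [points[i] for i in range(len(points))
                       if labels[i] == c]
            if members:
                centers[c] = tuple(sum(xs) / len(members)
                                   for xs in zip(*members))
    return labels
```

Run on seasonal sample vectors, the resulting labels can be compared against known season or site groupings, which is how the technique confirmed the seasonal trend here.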
... ecological reality. Natural systems may have similarities, but because ... (Environ. Toxicol. Chem. 15, 1996, R. A. Matthews et al.; Fig. 1: total Daphnia per ml and total unicellular algae, 10^3 per ml, in 0-15% WSF Jet-A.)
The community conditioning hypothesis is used as a framework in which to place the layers of effects during and after pesticide intoxication. Community conditioning states that information about the history of a system can be and is written at a variety of organismal and ecological levels. This historical component or etiology determines the future dynamics of a system. The storage of information concerning prior stressor events has been observed in a variety of compartments. Fish populations have been observed to have different genetic structures in populations that have been exposed to toxicant stressors. Analysis of biomarker data from field experiments reveals a variety of patterns, some due to the location of the field plots. Treatment groups within a series of microcosm experiments maintain their identities long after the degradation of the toxicant. The dynamics of the treatment groups in multivariate ecological space are characteristic of a particular treatment. Other microcosm systems differentially respond to invasion depending upon the order of the inoculation of the biotic components, even though at the time of the invasion the systems are indistinguishable. A major factor in the uncertainty of pesticide risk assessment will be the unknown etiology of the system of interest.
The theory of general relativity has produced some great insights into the nature of space and time. Unfortunately, its relevance to the problem of the direction of time has been overestimated. This paper points out that the problem of the direction of time can be ...
