Peter Cheeseman


    In many robotic applications the need to represent and reason about spatial relationships is of great importance. However, our knowledge of particular spatial relationships is inherently uncertain. The most used method for handling the uncertainty is to "pre-engineer" the problem away, by structuring the working environment and using specially-suited high-precision equipment. In some advanced robotic research domains, however, such as automatic task planning, off-line robot programming, and autonomous vehicle operation, prior structuring will not be possible, because of dynamically changing environments, or because of the demand for greater reasoning flexibility. Spatial reasoning is further complicated because relationships are often not described explicitly, but are given by uncertain relative information. This is particularly true when many different frames of reference are used, producing a network of uncertain relationships. Rather than treat spatial uncertainty as a side issue in geometrical reasoning, we believe it must be an intrinsic part of spatial representations. In this paper, we describe a representation for spatial information, called the stochastic map, and associated procedures for building it, reading information from it, and revising it incrementally as new information is obtained. The map always contains the best estimates of relationships among objects in the map, and their uncertainties. The procedures provide a general solution to the problem of estimating uncertain relative spatial relationships. The estimates are probabilistic in nature, an advance over the previous, very conservative, worst-case approaches to the problem. Finally, the procedures are developed in the context of state-estimation and filtering theory, which provides a solid basis for numerous extensions.
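The core operation behind such a stochastic map, compounding two uncertain relative transforms, can be sketched with first-order covariance propagation. The sketch below is illustrative only, assuming 2D poses (x, y, theta) with 3x3 covariances; the function name and parameterization are assumptions, not taken from the paper.

```python
import numpy as np

def compound(p1, C1, p2, C2):
    """First-order compounding of two uncertain 2D poses (x, y, theta).

    Returns the composed pose and its covariance, propagated through
    the Jacobians of the compounding operation."""
    x1, y1, t1 = p1
    x2, y2, t2 = p2
    c, s = np.cos(t1), np.sin(t1)
    # Composed pose: rotate p2 into p1's frame, then translate.
    p = np.array([x1 + c * x2 - s * y2,
                  y1 + s * x2 + c * y2,
                  t1 + t2])
    # Jacobian of the composed pose with respect to p1.
    J1 = np.array([[1.0, 0.0, -s * x2 - c * y2],
                   [0.0, 1.0,  c * x2 - s * y2],
                   [0.0, 0.0,  1.0]])
    # Jacobian with respect to p2.
    J2 = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
    # First-order propagation: each input covariance passes through
    # its own Jacobian.
    C = J1 @ C1 @ J1.T + J2 @ C2 @ J2.T
    return p, C
```

Chaining two unit forward motions with heading uncertainty, for example, inflates the lateral (y) variance of the result, which is the qualitative behavior a worst-case bound cannot capture cheaply.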
    The availability of a reclassification of the IRAS LRS Atlas of spectra using a new Bayesian classification procedure (AutoClass) is announced. The classes of objects which result from the application of the AutoClass algorithm include many of the previously known LRS classes. New classes which have interesting astronomical and astrophysical interpretations were also found.
    This paper presents a new method for calculating the conditional probability of any attribute value, given particular information about the individual case. The calculation is based on the principle of maximum entropy and yields the most unbiased probability estimate, given the available evidence. Previous methods for computing maximum entropy values are either very restrictive in the probabilistic information (constraints) they can use or are combinatorially explosive. The computational complexity of the new procedure depends on the interconnectedness of the constraints, but in practical cases it is small.
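As an illustration of the maximum-entropy idea (a sketch, not the paper's procedure), iterative proportional fitting over two binary attributes finds the joint distribution that satisfies given marginal constraints while remaining as close to uniform as possible; conditional probabilities then fall out of the fitted joint. The constraint values below are hypothetical.

```python
import numpy as np

def maxent_table(target_pA, target_pB, iters=50):
    """Iterative proportional fitting: start from the uniform joint
    (the unconstrained maximum-entropy distribution) and rescale rows
    and columns until the marginal constraints are satisfied."""
    P = np.full((2, 2), 0.25)  # uniform prior over two binary attributes
    row_targets = np.array([1 - target_pA, target_pA])  # P(A=0), P(A=1)
    col_targets = np.array([1 - target_pB, target_pB])  # P(B=0), P(B=1)
    for _ in range(iters):
        P *= (row_targets / P.sum(axis=1))[:, None]  # match P(A)
        P *= (col_targets / P.sum(axis=0))[None, :]  # match P(B)
    return P

# Hypothetical constraints: P(A=1) = 0.3, P(B=1) = 0.6.
P = maxent_table(0.3, 0.6)
# Conditional probability of B=1 given A=1, read from the maxent joint.
p_b_given_a = P[1, 1] / P[1].sum()
```

With only marginal constraints the maximum-entropy joint is the independent one, so P(B=1 | A=1) recovers P(B=1) = 0.6; richer constraint sets (the paper's concern) produce genuinely coupled joints, and the cost grows with how interconnected the constraints are.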
    This paper describes a method for extracting information from data to form the knowledge base for a probabilistic expert system. The information the method finds consists of joint probabilities that show significant probabilistic connections between the associated attribute values. These joint probabilities can be combined with information about particular cases to compute particular conditional probabilities as described in [3]. The search procedure and significance test required are presented for different types of data. The significance test requires finding if the minimum message length (in the information theory sense) required to encode the data is reduced if the joint probability being tested is given explicitly. This significance test is derived from Bayes' theorem and is shown to find the hypothesis (i.e. set of significant joint probabilities) with the highest posterior probability given the data.
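A toy version of the message-length comparison can be sketched as follows. The (k/2) log2 N parameter cost used here is a standard MDL-style approximation, assumed for illustration rather than taken from the paper's exact coding scheme, and the counts are hypothetical.

```python
import numpy as np

def total_length(counts, probs, n_params):
    """Two-part message length in bits: an assumed (k/2) * log2(N)
    model cost plus the data cost -sum(n_i * log2(p_i))."""
    n = counts.sum()
    data_bits = -np.sum(counts * np.log2(probs))
    return 0.5 * n_params * np.log2(n) + data_bits

# Hypothetical 2x2 counts for two binary attributes A and B.
counts = np.array([[40.0, 10.0], [10.0, 40.0]])  # strongly dependent
n = counts.sum()
joint = counts / n                                # full joint: 3 free params
pa = counts.sum(axis=1) / n
pb = counts.sum(axis=0) / n
indep = np.outer(pa, pb)                          # independence: 2 free params

dependent_bits = total_length(counts, joint, 3)
independent_bits = total_length(counts, indep, 2)
# The joint probability is "significant" when stating it explicitly
# shortens the total message despite the extra parameter.
significant = dependent_bits < independent_bits
```

For these counts the dependence is strong enough that paying for one extra parameter buys a much shorter data encoding, so the joint probability would be admitted to the knowledge base; with near-independent counts the comparison goes the other way.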
    ... The various fuzzy approaches (fuzzy sets, fuzzy logic, possibility theory and higher order ... The evidential reasoning approach for MADA under both probabilistic and fuzzy uncertainties. ... T Reeves, RS Lockhart in Journal of Experimental Psychology: General (1993).
    The author discusses the development of artificial intelligence (AI). He explains the basic elements of AI: heuristic search, knowledge representation, AI languages and tools, natural language processing, computer vision, expert systems, and problem solving and planning.
    A major goal of NASA's Systems Autonomy Demonstration Project is to focus research in artificial intelligence, human factors, and dynamic control systems in support of Space Station automation. Another goal is to demonstrate the use of these technologies in real space systems, for both ground-based mission support and on-board operations. The design, construction, and evaluation of an intelligent autonomous system shell is recognized as an important part of the Systems Autonomy research program. This paper describes autonomous systems and executive controllers, outlines how these intelligent systems can be utilized within the Space Station, and discusses a number of key design issues that have been raised during some preliminary work to develop an autonomous executive controller shell at NASA Ames Research Center.
    I am in complete agreement with the arguments presented in "A Probabilistic and Statistical View of Fuzzy Methods" (henceforth PSFM). In particular, PSFM states: "we have found ... no solutions using FST [fuzzy set theory] that could not have been achieved at least as effectively using probability and statistics" (p. 260). The same conclusion was also previously put forward by me (Cheeseman 1986, 1988) and others. The failure of the FST community to provide a single sustainable instance of the alleged superiority of FST over probability, despite many attempts, is further evidence of this negative conclusion. The authors of PSFM are to be congratulated for providing detailed examples of how to do with probability what is claimed to be only possible using FST. I fear, however, that no amount of demonstrations of this kind will convince the FST community that FST is not a theoretical advance. My own experience has been that demonstrating the probabilistic solution to instances supposedly only solvable by FST methods just brings forth more examples; the larger pattern is missed. The following comments are a clarification of some of the points in PSFM.
    This paper describes a representation of time and its associated inference rules that is being applied to process planning. The representation associates a time interval and a duration with every proposition in a world model, so that particular time relationships can be reasoned about explicitly. These time intervals are defined by their beginning and end instants, so interval relationships, such as "before" and "overlaps", are defined indirectly by the corresponding relations between the end instants. This new representation allows simpler rules for deducing the usual planning interval relationships. Similarly, interval durations are associated with each proposition by asserting the maximum and minimum durations, thus allowing the discovery of critical paths and allowing partial plans to be optimized. A planner has been designed to use the above time representation to produce plans for work stations with multiple robots, for example. A method is given for the detection and correction of interactions in the current partial plan that avoids any conflicts and deadlocks that could arise.
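The endpoint-based definition of interval relationships can be sketched directly. The definitions below follow the usual conventions for "before" and "overlaps"; the exact boundary conditions are an assumption, since the paper's precise definitions are not reproduced here.

```python
def before(i1, i2):
    """Interval i1 = (start, end) ends strictly before i2 begins."""
    return i1[1] < i2[0]

def overlaps(i1, i2):
    """i1 starts first and ends strictly inside i2, so the two
    intervals share a proper sub-interval."""
    return i1[0] < i2[0] < i1[1] < i2[1]
```

Because both relations reduce to comparisons between end instants, a planner can test any interval relationship with the same small set of instant-level predicates, which is what makes the deduction rules simpler than reasoning over intervals as primitives.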
    The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. 
It has been successfully used with the following implementations of Common Lisp on the Sun and similar UNIX platforms: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp; under the Lucid Common Lisp implementations on VAX/VMS v5.4, VAX/Ultrix v4.1, and MIPS/Ultrix v4, rev. 179; and on the Macintosh personal computer. The minimum Macintosh required is the IIci. This program will not run under CMU Common Lisp or VAX/VMS DEC Common Lisp. A minimum of 8Mb of RAM is required for Macintosh platforms and 16Mb for workstations. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 3.5 inch diskette in Macintosh format. An electronic copy of the documentation is included on the distribution medium. AUTOCLASS was developed between March 1988 and March 1992. It was initially released in May 1991. Sun is a trademark of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. DEC, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation. Macintosh is a trademark of Apple Computer, Inc. Allegro CL is a registered trademark of Franz, Inc.
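AUTOCLASS itself is a Common Lisp program; the following Python sketch illustrates only the soft class-membership calculation described above, assuming Gaussian class distributions over a single numeric attribute with known parameters. The function name and parameters are illustrative, not part of the program's interface.

```python
import numpy as np

def soft_memberships(x, means, sds, weights):
    """Class-membership probabilities for each datum under a Gaussian
    mixture: P(class j | x_i) by Bayes' rule, rather than a hard
    partition of the data into disjoint subsets."""
    x = np.asarray(x, dtype=float)[:, None]          # shape (n, 1)
    # Weighted Gaussian density of each datum under each class.
    dens = (weights / (sds * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x - means) / sds) ** 2))
    # Normalize across classes so each row sums to 1.
    return dens / dens.sum(axis=1, keepdims=True)
```

A datum lying between two class means receives partial membership in both, which is the intuitive behavior the abstract contrasts with absolute partitioning.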
    In the near future NASA intends to explore various regions of our solar system using robotic devices such as rovers, spacecraft, airplanes, and/or balloons. Such platforms will likely carry imaging devices, and a variety of analytical instruments intended to evaluate the chemical and ...
    An account is given of the many related activities employing AI that are classifiable as 'automatic planning and scheduling'. A human can form plans and successfully execute them, but no current automatic-planning AI system can match this robustness and generality; it is in fact suggested that automatic planning is unlikely to be achieved by a general-purpose planning system. It is judged likely that partially-specialized planners will emerge for the efficient solution of specific classes of problems. Current planners are also found to make unrealistic informational demands, especially in the requirement that the 'state of the world' be known at all times, and that the only changes that occur are under the planner's control.
    Paper discusses principal issues to be resolved in development of autonomous executive-controller shell for Space Station. Shell represents major increase in complexity of automated systems. More-complex control tasks require system that deals with different goals requiring sequences of tasks that change state of system world in complex ways. Requires integration of all functions. Applications include space station communications, tracking, life support, data processing support, navigation, and control of thermal and structural subsystems.

    And 90 more