Reconstructability Analysis (RA) and Bayesian Networks (BN) are both probabilistic graphical modeling methodologies used in machine learning and artificial intelligence. There are RA models that are statistically equivalent to BN models and there are also models unique to RA and models unique to BN. The primary goal of this paper is to unify these two methodologies via a lattice of structures that offers an expanded set of models to represent complex systems more accurately or more simply. The conceptualization of this lattice also offers a framework for additional innovations beyond what is presented here. Specifically, this paper integrates RA and BN by developing and visualizing: (1) a BN neutral system lattice of general and specific graphs, (2) a joint RA-BN neutral system lattice of general and specific graphs, (3) an augmented RA directed system lattice of prediction graphs, and (4) a BN directed system lattice of prediction graphs. Additionally, it (5) extends RA notation to encompass BN graphs and (6) offers an algorithm to search the joint RA-BN neutral system lattice to find the best representation of system structure from underlying system variables. All lattices shown in this paper are for four variables, but the theory and methodology presented in this paper are general and apply to any number of variables. These methodological innovations are contributions to machine learning and artificial intelligence and more generally to complex systems analysis. The paper also reviews some relevant prior work of others so that the innovations offered here can be understood in a self-contained way within the context of this paper.
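For readers unfamiliar with how candidate structures in such a lattice are evaluated, the following minimal sketch (my own illustration, not the paper's search algorithm) scores one loopless RA model, AB:BC, on a toy three-variable distribution: the model's maximum-entropy reconstruction has a closed form, and the information it loses is the KL divergence (transmission) from the data. The variable names and distribution are illustrative.

import numpy as np

# Toy joint distribution p(A, B, C) over three binary variables (values illustrative only).
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

# Projections needed by the loopless RA model AB:BC.
p_ab = p.sum(axis=2)        # p(A, B)
p_bc = p.sum(axis=0)        # p(B, C)
p_b = p.sum(axis=(0, 2))    # p(B)

# Closed-form maximum-entropy reconstruction for AB:BC:
#   q(a, b, c) = p(a, b) * p(b, c) / p(b)
q = p_ab[:, :, None] * p_bc[None, :, :] / p_b[None, :, None]

# Transmission (KL divergence, in bits) measures the information the model loses.
mask = p > 0
transmission = np.sum(p[mask] * np.log2(p[mask] / q[mask]))
print(f"T(data : AB:BC reconstruction) = {transmission:.4f} bits")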
Reconstructability analysis, a methodology based on information theory and graph theory, was used to perform a sensitivity analysis of an agent-based model. The NetLogo BehaviorSpace tool was employed to do a full 2^k factorial parameter sweep on Uri Wilensky's Wealth Distribution NetLogo model, to which a Gini-coefficient convergence condition was added. The analysis identified the most influential predictors (parameters and their interactions) of the Gini coefficient wealth inequality outcome. Implications of this type of analysis for building and testing agent-based simulation models are discussed.
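As a rough illustration of the kind of setup the abstract describes, the sketch below enumerates a full 2^k factorial design and computes a Gini coefficient from a wealth vector. It is not the study's actual BehaviorSpace configuration, and the parameter names and low/high levels are stand-ins for the NetLogo model's sliders.

import itertools
import numpy as np

def gini(wealth):
    """Gini coefficient of a non-negative wealth vector (0 = equality, ~1 = maximal inequality)."""
    w = np.sort(np.asarray(wealth, dtype=float))
    n = w.size
    if w.sum() == 0:
        return 0.0
    cum = np.cumsum(w)
    # Standard formula based on the cumulative shares of the sorted values.
    return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

# Full 2^k factorial design over hypothetical low/high parameter settings.
params = {
    "num_people": (250, 1000),
    "max_vision": (1, 10),
    "metabolism_max": (5, 25),
    "percent_best_land": (5, 25),
}
design = [dict(zip(params, combo)) for combo in itertools.product(*params.values())]
print(f"{len(design)} runs in the full factorial design")
print("example run:", design[0])
print("example Gini:", round(gini(np.random.default_rng(1).pareto(2.0, 500)), 3))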
Reconstructability analysis (RA) is a method for detecting and analyzing the structure of multivariate categorical data. While Jones and his colleagues extended the original variable-based formulation of RA to encompass models defined in terms of system states, their focus was the analysis and approximation of real-valued functions. In this paper, we separate two ideas that Jones had merged together: the "g to k" transformation and state-based modeling. We relate the idea of state-based modeling to established variable-based RA concepts and methods, including structure lattices, search strategies, metrics of model quality, and the statistical evaluation of model fit for analyses based on sample data. We also discuss the interpretation of state-based modeling results for both neutral and directed systems, and address the practical question of how state-based approaches can be used in conjunction with established variable-based methods.
Fourier methods used in 2- and 3-dimensional image reconstruction can also be used in reconstructability analysis (RA). These methods maximize a variance-type measure instead of information-theoretic uncertainty, but the two measures are roughly collinear and the Fourier approach yields results close to those of standard RA. The Fourier method, however, does not require iterative calculations for models with loops. Moreover, the error in Fourier RA models can be assessed without actually generating the full probability distributions of the models; calculations scale with the size of the data rather than the state space. State-based modeling using the Fourier approach is also readily implemented. Fourier methods may thus enhance the power of RA for data analysis and data mining.
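The following toy sketch (my own illustration, not the paper's formulation) conveys the Fourier idea for binary variables: a joint distribution's Walsh-Hadamard spectrum separates terms by interaction order, and a truncated model that keeps only low-order terms captures a quantifiable share of a variance-type measure via Parseval's relation.

import numpy as np
from scipy.linalg import hadamard

# Toy joint distribution over 3 binary variables, flattened in natural binary order
# so index i = 4*a + 2*b + c for variables (A, B, C). Values are illustrative.
rng = np.random.default_rng(2)
p = rng.random(8)
p /= p.sum()

H = hadamard(8)                      # Sylvester-ordered Walsh-Hadamard matrix
coeffs = H @ p / 8                   # Walsh-Hadamard spectrum of p (p == H @ coeffs)

# The "order" of coefficient i is the number of variables in its parity term,
# i.e., the popcount of i under Sylvester ordering.
order = np.array([bin(i).count("1") for i in range(8)])

# Keep only main effects and pairwise interactions (order <= 2); drop the 3-way term.
kept = coeffs.copy()
kept[order > 2] = 0.0

# Parseval-style variance accounting (excluding the constant, order-0 term).
total = np.sum(coeffs[order > 0] ** 2)
captured = np.sum(kept[order > 0] ** 2)
print(f"fraction of variance-type measure captured by order <= 2 terms: {captured / total:.4f}")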
The building block hypothesis implies that genetic algorithm effectiveness is influenced by the relative location of epistatic genes on the chromosome.  We find that this influence exists, but depends on the generation in which it is measured.  Early in the search process it may be more effective to have epistatic genes widely separated.  Late in the search process, effectiveness is improved when they are close together.  The early search effect is weak but still statistically significant; the late search effect is much stronger and plainly visible. We demonstrate both effects with a set of simple problems, and show that information-theoretic reconstructability analysis can be used to decide on optimal gene ordering.
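Below is a compact toy GA (not the paper's experimental setup) in which the separation between two epistatic loci is a parameter; the fitness function, population size, and rates are arbitrary choices meant only to show how one could probe the interaction between gene placement and one-point crossover at different generations.

import numpy as np

rng = np.random.default_rng(3)

def fitness(pop, locus_a, locus_b, bonus=10.0):
    # Simple epistasis: ones-counting plus a bonus when both epistatic loci are set.
    return pop.sum(axis=1) + bonus * (pop[:, locus_a] & pop[:, locus_b])

def one_point_crossover(a, b):
    cut = rng.integers(1, a.size)
    return np.concatenate([a[:cut], b[cut:]]), np.concatenate([b[:cut], a[cut:]])

def run_ga(separation, n_bits=32, pop_size=60, generations=40, mut_rate=0.01):
    locus_a, locus_b = 0, separation          # epistatic loci placed `separation` apart
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    history = []
    for _ in range(generations):
        fit = fitness(pop, locus_a, locus_b)
        history.append(fit.mean())
        children = []
        while len(children) < pop_size:       # tournament selection, then crossover
            idx = rng.integers(0, pop_size, size=4)
            p1 = pop[idx[0]] if fit[idx[0]] >= fit[idx[1]] else pop[idx[1]]
            p2 = pop[idx[2]] if fit[idx[2]] >= fit[idx[3]] else pop[idx[3]]
            c1, c2 = one_point_crossover(p1, p2)
            children.extend([c1, c2])
        pop = np.array(children[:pop_size])
        pop ^= (rng.random(pop.shape) < mut_rate).astype(pop.dtype)   # bit-flip mutation
    return history

for sep in (1, 31):
    hist = run_ga(sep)
    print(f"separation {sep:2d}: mean fitness at gen 5 = {hist[5]:.2f}, at gen 39 = {hist[-1]:.2f}")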
When the reconstructability analysis of a directed system yields a structure in which a generated variable appears in more than one subsystem, information from all of the subsystems can be used in modeling the relationship between generating and generated variables. The conceptualization and procedure proposed here are discussed in relation to Klir's concept of control uniqueness.
Modified Reconstructibility Analysis (MRA), a novel decomposition within the framework of set-theoretic (crisp possibilistic) reconstructibility analysis, is presented. It is shown that in some cases, while three-variable NPN-classified Boolean functions are not decomposable using Conventional Reconstructibility Analysis (CRA), they are decomposable using MRA. Also, it is shown that whenever a decomposition of three-variable NPN-classified Boolean functions exists in both MRA and CRA, MRA yields simpler or equal complexity decompositions. A comparison of the corresponding complexities for Ashenhurst-Curtis decompositions and MRA is also presented. While both AC and MRA decompose some but not all NPN-classes, MRA decomposes more classes, and consequently more Boolean functions. MRA for many-valued functions is also presented, and algorithms using two different methods (intersection and union) are given. A many-valued case is presented where CRA fails to decompose but MRA decomposes.
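For orientation, the sketch below implements only the conventional set-theoretic check that MRA modifies: project the relation defined by a three-variable Boolean function onto its dyadic faces, recompose by intersecting their cylindrical extensions, and compare with the original relation. This is my illustration of CRA, not of MRA itself.

from itertools import product

def cra_decomposable(f):
    """Check whether the relation {(a,b,c) : f(a,b,c)=1} equals the intersection
    of the cylindrical extensions of its three dyadic projections (conventional RA)."""
    relation = {(a, b, c) for a, b, c in product((0, 1), repeat=3) if f(a, b, c)}
    proj_ab = {(a, b) for a, b, _ in relation}
    proj_bc = {(b, c) for _, b, c in relation}
    proj_ac = {(a, c) for a, _, c in relation}
    closure = {
        (a, b, c)
        for a, b, c in product((0, 1), repeat=3)
        if (a, b) in proj_ab and (b, c) in proj_bc and (a, c) in proj_ac
    }
    return closure == relation

# Parity (3-variable XOR) is a classic non-decomposable case; AND is decomposable.
print("XOR decomposable by CRA:", cra_decomposable(lambda a, b, c: a ^ b ^ c))
print("AND decomposable by CRA:", cra_decomposable(lambda a, b, c: a & b & c))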
To achieve reduced training time and improved generalization with artificial neural networks (ANN, or NN), it is important to use a reduced complexity NN structure. A "problem" is defined by constraints among the variables describing it. If knowledge about these ...
Reconstructability analysis (RA) is a method for detecting and analyzing the structure of multivariate categorical data. Jones and his colleagues extended the original variable-based formulation of RA to encompass models defined in terms of system states (Jones 1982; Jones 1985; ...
Modified reconstructability analysis (MRA), a novel decomposition technique within the framework of set-theoretic (crisp possibilistic) reconstructability analysis, is applied to three-variable NPN-classified Boolean functions. MRA is superior to conventional reconstructability analysis, i.e. it decomposes more NPN functions. MRA is compared to Ashenhurst-Curtis (AC) decomposition using two different complexity measures: log-functionality, a measure suitable for machine learning, and the count of the total number of two-input gates, a measure suitable for circuit design. MRA is superior to AC using the first of these measures, and is comparable to, but different from AC, using the second.
We consider the problem of matching domain-specific statistical structure to neural-network (NN) architecture. In past work we have considered this problem in the function approximation context; here we consider the pattern classification context. General Systems Methodology tools for finding problem-domain structure suffer exponential scaling of computation with respect to the number of variables considered. Therefore we introduce the use of Extended Dependency Analysis (EDA), which scales only polynomially in the number of variables, for the desired analysis. Based on EDA, we demonstrate a number of NN pre-structuring techniques applicable for building neural classifiers. An example is provided in which EDA results in significant dimension reduction of the input space, as well as capability for direct design of an NN classifier. Keywords: neural networks, structure, pattern recognition, classifier, information theoretic reconstructability, extended dependency analysis, optical Fouri...
The Medical Quality Improvement Consortium (MQIC) data warehouse contains de-identified data on more than 3.6 million patients, including their problem lists, test results, procedures, and medication lists. This study uses reconstructability analysis (RA), an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes, including myocardial infarction and microalbuminuria. The risk factors identified match those reported in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research and of RA as a technique for mining clinical data warehouses.
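The study itself ran RA against the MQIC warehouse; as a stand-in, the sketch below applies a single-predictor information-theoretic screen (mutual information between each candidate factor and an outcome) to synthetic binary data. Factor names and prevalences are invented for illustration.

import numpy as np

def mutual_information(x, y):
    """Mutual information (bits) between two discrete variables given as 1-D integer arrays."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    np.add.at(joint, (x, y), 1)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Synthetic binary data: 'hypertension' is constructed to carry information about the outcome.
rng = np.random.default_rng(4)
n = 5000
factors = {name: rng.integers(0, 2, n) for name in ("hypertension", "smoking", "obesity")}
outcome = (rng.random(n) < 0.05 + 0.10 * factors["hypertension"]).astype(int)

for name, values in factors.items():
    print(f"I({name}; outcome) = {mutual_information(values, outcome):.4f} bits")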
This paper is a systems theoretic examination of Eugen Rosenstock-Huessy's 'cross of reality', a structure that fuses a spatial dyad of inner-outer and a temporal dyad of past-future into a space-time tetrad. This structure is compatible not only with the 'human-centered' point of view that Rosenstock-Huessy favours, but also with the 'world-centered' point of view inherent in science. The structure, based in his analysis of speech, is applied by him to a wide variety of individual and collective human phenomena, including language, religion, and social critique. To appropriate terminology used by physicists, the cross of reality could be viewed as Rosenstock-Huessy's 'theory of everything', a framework for the social sciences and humanities that can be used to model entities, events and processes. The cross diagrams some basic notions of systems theory. Rosenstock-Huessy's critique of science is partially shared by systems thought, and the goal he posited for sociology of understanding and alleviating human suffering can gain support from systems ideas and methods.
This paper is a systems theoretic examination of Eugen Rosenstock-Huessy's "cross of reality," a structure that fuses a vertical spatial dyad of inner-outer and a horizontal temporal dyad of past-future into a space-time tetrad. This tetrad is compatible not only with the human-centered phenomenological point of view that Rosenstock-Huessy favors, but also with a world-centered scientific point of view. It is applied by him explicitly or implicitly to a wide variety of individual and collective human experiences. In this paper I mention a few examples of these applications from the realm of language, religion, and social critique. I also show that Rosenstock-Huessy's tetradic structure accords with and diagrams some basic concepts in systems theory.
This paper uses a systems-theoretic model to structure an account of human history. According to the model, a process, after its beginning & early development, often reaches a critical stage where it encounters some limitation. If the limitation is overcome, development does not face a comparable challenge until a second critical juncture is reached, where obstacles to further advance are more severe. At the first juncture, continued development requires some complexity-managing innovation; at the second, it needs some event of systemic integration in which the old organizing principle of the process is replaced by a new principle. Overcoming the first blockage sometimes occurs via a secondary process that augments & blends with the primary process, & is subject in turn to its own developmental difficulties.

Applied to history the model joins together the materialism of Marx with the cultural emphasis of Toynbee & Jaspers. It describes human history as a triad of developmental processes which encounter points of difficulty.  The ‘primary’ process began with the emergence of the human species, continued with the development of agriculture, & reached its first critical juncture after the rise of the great urban civilizations.  Crises of disorder & complexity faced by these civilizations were eased by the religions & philosophies that emerged in the Axial period.  These Axial traditions became the cultural cores of major world civilizations, their development constituting a ‘secondary’ process that merged with & enriched the first.

This secondary process also eventually stalled, but in the West, the impasse was overcome by a ‘tertiary’ process: the emergence of humanism & secularism & – quintessentially – the development of science & technology. This third process blended with the first two in societal & religious change that ushered in what we call ‘modernity.’ Today, this third current of development also falters, & inter-civilizational tension afflicts the secondary stream. Much more seriously, the primary process has reached its second & critically hazardous juncture – the current global environmental-ecological crisis. System integration via a new organizing principle is needed on a planetary scale.
Systems theory offers a language in which one might formulate a metaphysics or more specifically an ontology of problems. This proposal is based upon a conception of systems theory shared by von Bertalanffy, Wiener, Boulding, Rapoport, Ashby, Klir, and others, and expressed succinctly by Bunge, who considered game theory, information theory, feedback control theory, and the like to be attempts to construct an "exact and scientific metaphysics."
Our prevailing conceptions of "problems" are concretized yet also fragmented and in fact dissolved by the standard reductionist model of science, which cannot provide a general framework for analysis. The idea of a "systems theory," however, suggests the possibility of an abstract and coherent account of the origin and essence of problems. Such an account would constitute a secular theodicy.
This claim is illustrated by examples from game theory, information processing, non-linear dynamics, optimization, and other areas. It is not that systems theory requires as a matter of deductive necessity that problems exist, but it does reveal the universal and lawful character of many problems which do arise.
A partial review of Thomas Nagel's book, Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False, is used to articulate some systems-theoretic ideas about the challenge of understanding subjective experience. The article accepts Nagel's view that reductionist materialism fails as an approach to this challenge, but argues that seeking an explanation of mind based on emergence is more plausible than one based on panpsychism, which Nagel favors. However, the article proposes something similar to Nagel's neutral monism by positing a hierarchy of information processes that span the domains of matter, life, and mind. As depicted in this hierarchy, subjective experience is emergent, but also continuous with informational phenomena at lower levels.
"Freedom" is a phenomenon in the natural world. This phenomenon - and indirectly the question of free will - is explored using a variety of systems-theoretic ideas. It is argued that freedom can emerge only in systems that are partially... more
"Freedom" is a phenomenon in the natural world. This phenomenon - and indirectly the question of free will - is explored using a variety of systems-theoretic ideas. It is argued that freedom can emerge only in systems that are partially determined and partially random, and that freedom is a matter of degree. The paper considers types of freedom and their conditions of possibility in simple living systems and in complex living systems
that have modeling (cognitive) subsystems. In simple living systems, types of freedom include independence from fixed materiality, internal rather than external determination, activeness that is unblocked and holistic, and the capacity to choose or alter environmental constraint. In complex living systems, there is freedom in satisfaction of lower level needs that allows higher potentials to be realized. Several types of freedom also manifest in the modeling subsystems of these complex systems: in the transcending of automatism in subjective experience, in reason as instrument for passion yet also in reason ruling over passion, in independence from informational colonization by the
environment, and in mobility of attention. Considering the wide range of freedoms in simple and complex living systems allows a panoramic view of this diverse and
important natural phenomenon.
This article explores aspects of Rosenzweig's Star of Redemption from the perspective of systems theory. Mosès, Pollock, and others have noted the systematic character of the Star. While "systematic" does not mean "systems theoretic," the philosophical theology of the Star encompasses ideas that are salient in systems theory. The Magen David star to which the title refers, and which deeply structures Rosenzweig's thought, fits the classic definition of "system": a set of elements (God, World, Human) and relations between the elements (Creation, Revelation, Redemption). The Yes and No of the elements and their reversals illustrate the bridging of element and relation with the third category of "attribute," a notion also central to the definition of "system." In the diachronics of "the All," the relations actualize what is only potential in the elements in their primordial state and thus remedy the incompleteness of these elements, fusing them into an integrated whole. Incompleteness is a major theme of systems theory, which also explicitly examines the relations between wholes and parts and offers a formal framework for clearly expressing such fusions. In this article, the systems character of Parts I & II of the Star is explored through extensive use of diagrams; a systems exploration of Part III is left for future work. Remarkably, given its highly architectonic character, diagrams are absent in Rosenzweig's book, except for the triangle of elements, the triangle of relations, and the hexadic star, which are presented on the opening page of each part of the book. While structures can be explicated entirely in words, diagrams are a visual medium of communication that supplements words and supports a nonverbal understanding that structures both thought and experience.
A graph can specify the skeletal structure of an idea, onto which meaning can be added by interpreting the structure. This paper considers several directed and undirected graphs consisting of four nodes, and suggests different meanings that can be associated with these different structures. Drawing on John G. Bennett's "systematics," specifically on the Tetrad that systematics offers as a model of "activity," the analysis formalizes and augments the systematics account and shows that the Tetrad is a versatile model of problem-solving, regulation and control, and other processes. Discussion is extended to include hypergraphs, in which links can relate more than two nodes, and the possibility of a "reconstructability analysis of ideas" is suggested.
Mario Bunge (1973) has provided a deep and succinct characterization of systems and cybernetics theories, e.g., information theory, game theory, automata theory, and the like, as attempts to construct an exact and scientific metaphysics. These theories can be ...
One significant but not widely appreciated impact of the “new religions” has been to reopen the question of the relation of religion to science. I speak of new religions in the sense defined by Needleman in his book by that title, that is, I am referring primarily to Eastern teachings ...
This essay is a selective review of Systems: New Paradigms for the Human Sciences, edited by Gabriel Altmann and Walter A. Koch (Berlin: Walter de Gruyter, 1998). It is selective because it is impossible to engage such a varied collection of systems-theoretic essays in a review ...
This paper discusses similarities of both form and meaning between two symbolic structures: the Diagram of the Supreme Pole of Song Neo-Confucianism and the Kabbalistic Tree of medieval Jewish mysticism. These similarities are remarkable in the light of the many differences that exist between Chinese and Judaic thought, which also manifest in the two symbols. Intercultural influence might account for the similarities, but there is no historical evidence for such influence. An alternative explanation would attribute the similarities to the ubiquity of religious-philosophical ideas about hierarchy, polarity, and macrocosm-microcosm parallelism, but this does not adequately account for the similar overall structure of the symbols. The question of how to understand these similarities remains open.
The following is a dialog, published in The Global Spiral, January 9, 2008, about the idea of a systems-theoretic ‘secular theodicy,’ discussed in the author’s “Towards an Ontology of Problems,” “Understanding Imperfection,” and (exemplified in a preliminary way in) “Incompleteness, Negation, and Hazard: On the Precariousness of Systems.” The dialog was inspired by Susan Neiman’s Evil in Modern Thought: An Alternative History of Philosophy, Princeton University Press, 2002.
This paper uses a systems-theoretic model to structure an account of human history. According to the model, a process, after its beginning and early development, often reaches a critical stage where it encounters some limitation. If the limitation is overcome, development does not face a comparable challenge until a second critical juncture is reached, where obstacles to further advance are more severe. At the first juncture, continued development requires some complexity-managing innovation; at the second, it needs some event of systemic integration in which the old organizing principle of the process is replaced by a new principle. Overcoming the first blockage sometimes occurs via a secondary process that augments and blends with the primary process, and is subject in turn to its own developmental difficulties. Applied to history the model joins together the materialism of Marx and the cultural emphasis of Toynbee and Jaspers. It describes human history as a triad of developmenta...
Many symbolic structures used in religious and philosophical traditions are composed of “elements” and relations between elements. Similarities between such structures can be described using the systems theoretic idea of “isomorphism.” This paper demonstrates the existence of a near isomorphism between two symbolic structures: the Diagram of the Supreme Pole of Song Neo-Confucianism and the Kabbalistic Tree of medieval Jewish mysticism. The similarities of these two symbols in form and meaning are remarkable in the light of the many differences that exist between Chinese and Judaic thought. Intercultural influence might account for these similarities, but there is no historical evidence for such influence. An alternative explanation would invoke the ubiquity of ideas about hierarchy, polarity, and macrocosm-microcosm parallelism, but this does not adequately account for the extent of similarity of the symbols. The question of how to explain their resemblance remains unresolved.
as the attempt to construct an "exact and scientific metaphysics" (ESM). By metaphysics Bunge means general propositions about the world which hold for a wide variety of systems ("metaphysics" here thus does not refer to questions of the existence of God, free will, etc.). By scientific, he means grounded in, i.e., drawing upon and contributing to, the sciences. By exact, he means mathematical, or capable ultimately of being expressed mathematically (Bunge actually includes "exactness" within "scientific," but I pull it out as a separate idea). This ESM is presently constituted by a multiplicity of systems theories, e.g., information theory, control theory, game theory, and the like. A singular systems theory does not exist. In a sense, this is the goal of the systems project, a systems "theory of everything" (TOE) radically different from the more familiar TOE sought by physicists trying t
h a conception. Section 5 concludes by noting some virtues and deficiencies of this approach. This is only a sketch of an argument which needs to be made in more detail; a comprehensive treatment is currently in progress. 2. Systems Theory In the view of Mario Bunge (1973), systems theory reflects an attempt to construct an "exact and scientific metaphysics." "Metaphysics" here means a system of abstract propositions of general interest and applicability; an "exact" metaphysics is one which is expressed mathematically or is at least a candidate for mathematical formalization; a "scientific" metaphysics is one which bears upon - draws from and/or contributes to - one or more scientific disciplines. Bunge's conception is close to views of von Bertalanffy (1968), Wiener (1950), Boulding (1956), Rapoport (1986), Ashby (1956), Klir (1991) and many others.
The Genetic Algorithm (GA) and Simulated Annealing (SA), two techniques for global optimization, were applied to a reduced (simplified) form of the phase problem (RPP) in computational crystallography. Results were compared with those of "enhanced pair flipping" (EPF), a more elaborate problem-specific algorithm incorporating local and global searches. Not surprisingly, EPF did better than the GA or SA approaches, but the existence of GA and SA techniques more advanced than those used in this study suggests that these techniques still hold promise for phase problem applications. The RPP is, furthermore, an excellent test problem for such global optimization methods.
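For context, a generic simulated-annealing skeleton is sketched below on a stand-in objective; the actual RPP objective, the GA variant, and the EPF algorithm from the study are not reproduced here.

import math
import random

def simulated_annealing(objective, initial, neighbor, t0=1.0, cooling=0.995, steps=20000):
    """Generic SA minimizer: accept worse moves with probability exp(-delta / T)."""
    random.seed(5)
    current, best = initial, initial
    f_cur = f_best = objective(initial)
    temp = t0
    for _ in range(steps):
        candidate = neighbor(current)
        f_cand = objective(candidate)
        delta = f_cand - f_cur
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current, f_cur = candidate, f_cand
            if f_cur < f_best:
                best, f_best = current, f_cur
        temp *= cooling
    return best, f_best

# Stand-in objective: a rugged 1-D function (the real RPP objective is phase-dependent).
objective = lambda x: (x - 2.0) ** 2 + math.sin(25 * x)
neighbor = lambda x: x + random.gauss(0, 0.1)
print(simulated_annealing(objective, initial=0.0, neighbor=neighbor))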
The n-player prisoner's dilemma (PD) is a useful model of multilevel selection for altruistic traits. It highlights the non-zero-sum interactions necessary for the evolution of altruism as well as the tension between individual and group-level selection. The parameters of the ...
Reciprocal altruism and inclusive fitness are generally considered alternative mechanisms by which cooperative, altruistic traits may evolve. Here we demonstrate that very general versions of Hamilton's inclusive fitness rule (developed by Queller) can be applied to traditional reciprocal altruism models such as the iterated prisoner's dilemma. In this way we show that both mechanisms rely fundamentally on the same principle - the positive assortment of helping behaviors. We discuss barriers to this unified view, including phenotype/genotype differences and nonadditive fitness (or utility) functions that are typical of reciprocal altruism models. We then demonstrate how Queller's versions of Hamilton's rule remove these obstacles.
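For readers who want the rule being generalized: Hamilton's condition and one common covariance (Queller-style) form of it are shown below, where b is the benefit to the recipient, c the cost to the actor, r the relatedness (regression) coefficient, g the actor's genotypic value, p the actor's helping phenotype, and p' the partners' helping phenotype. The precise formulation Queller develops and that the paper applies may differ in detail.

% Hamilton's rule: an altruistic trait can spread when
\[
  r\,b - c > 0 .
\]
% A covariance (Queller-style) form replaces r by a ratio of covariances, so the
% trait increases when
\[
  b\,\operatorname{Cov}(g,\,p') - c\,\operatorname{Cov}(g,\,p) > 0 ,
  \qquad\text{with}\qquad
  r = \frac{\operatorname{Cov}(g,\,p')}{\operatorname{Cov}(g,\,p)} .
\]

The term Cov(g, p') is what makes "positive assortment of helping behaviors" the unifying quantity the abstract refers to.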
We define three information-theoretic methods for measuring genetic diversity and compare the dynamics of these measures in simple evolutionary models consisting of a population of agents living, reproducing, and dying while competing for resources. The models are static resource models, i.e., the distribution of resources is constant for all time. Simulation of these models shows that (i) focusing the diversity measures on used alleles and loci especially highlights the adaptive dynamics of diversity, and (ii) even though resources are static, the evolving interactions among the agents make the effective environment for evolution dynamic.
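The sketch below computes one generic diversity measure of this flavor, the per-locus Shannon entropy of allele frequencies averaged over loci, on a toy population; it is not necessarily any of the paper's three measures.

import numpy as np

def mean_locus_entropy(population):
    """Average Shannon entropy (bits) of allele frequencies across loci.
    `population` is an (agents x loci) array of integer allele codes."""
    entropies = []
    for locus in population.T:
        _, counts = np.unique(locus, return_counts=True)
        freqs = counts / counts.sum()
        entropies.append(-np.sum(freqs * np.log2(freqs)))
    return float(np.mean(entropies))

rng = np.random.default_rng(6)
uniform_pop = rng.integers(0, 4, size=(200, 10))          # 4 alleles, roughly evenly used
converged_pop = np.zeros((200, 10), dtype=int)            # diversity lost at every locus
print("diverse population  :", round(mean_locus_entropy(uniform_pop), 3), "bits/locus")
print("converged population:", round(mean_locus_entropy(converged_pop), 3), "bits/locus")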
Many origins-of-life scenarios depict a situation in which there are common and potentially scarce resources needed by molecules that compete for survival and reproduction. The dynamics of RNA assembly in a complex mixture of sequences is a frequency-dependent process and mimics such scenarios. By synthesizing Azoarcus ribozyme genotypes that differ in their single-nucleotide interactions with other genotypes, we can create molecules that interact among each other to reproduce. Pairwise interplays between RNAs involve both cooperation and selfishness, quantifiable in a 2 × 2 payoff matrix. We show that a simple model of differential equations based on chemical kinetics accurately predicts the outcomes of these molecular competitions using simple rate inputs into these matrices. In some cases, we find that mixtures of different RNAs reproduce much better than each RNA type alone, reflecting a molecular form of reciprocal cooperation. We also demonstrate that three RNA genotypes can stably coexist in a rock-paper-scissors analog. Our experiments suggest a new type of evolutionary game dynamics, called prelife game dynamics or chemical game dynamics. These operate without template-directed replication, illustrating how small networks of RNAs could have developed and evolved in an RNA world.
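As a generic stand-in for the chemical-kinetic differential equations used in the paper, the sketch below integrates standard two-type replicator dynamics driven by a 2 × 2 payoff matrix, which captures the frequency-dependent character of such pairwise competitions; the payoff values are arbitrary.

import numpy as np

def replicator_step(x, payoff, dt=0.01):
    """One Euler step of two-type replicator dynamics for frequency x of type 1."""
    freqs = np.array([x, 1.0 - x])
    fitness = payoff @ freqs                      # expected payoff of each type
    mean_fitness = freqs @ fitness
    return x + dt * x * (fitness[0] - mean_fitness)

# Arbitrary 2x2 payoff matrix in which mixed pairings outperform self-pairings,
# loosely analogous to reciprocal cooperation between two RNA genotypes.
payoff = np.array([[1.0, 3.0],
                   [2.0, 1.0]])

x = 0.1
for _ in range(5000):
    x = replicator_step(x, payoff)
print(f"equilibrium frequency of type 1 ~ {x:.3f}")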
Several methods of image reconstruction from projections are treated within a unified formal framework to demonstrate their common features and highlight their particular differences. This is done analytically (ignoring computational factors) for the following techniques: the Convolution method, Algebraic Reconstruction, Back-projection, and the Fourier-Bessel approach.
In this presentation, the possible role of computer graphics in macromolecular chemistry is demonstrated by reference to a molecular model building program running on a computer with a graphics console. The program was developed to pursue studies in protein configurations. A description of the program is given together with examples of the display produced.
R. W. Schevitz, A. D. Podjarny, M. Zwick, J. Hughes, E. M. Westbrook, D. Feldman, and P. B. Sigler, Dept. of Biophysics and Theoretical Biology, The University of Chicago, Chicago, Ill. 60637

Density modification is a method of direct phase improvement or extension in which sensible restrictions not dependent on a detailed interpretation of the electron density map are imposed on an initial or provisional map to yield in turn more accurate phases following (fast) Fourier transformation. These phases are then merged with the initial set in subsequent iterations to give a new image of greater interpretability. Non-negativity of the electron density and constancy of the solvent regions were the restrictions exploited in three macromolecular structural studies ranging from low to high resolution. 2633 MIR phases of yeast tRNA, which spanned from 14 to 4.5 Å resolution and had an average phase error of 68°, were improved and extended following density modification to a 3545-reflection phase set ranging from 100 to 4 Å resolution with an average phase error of 43°. Interpretability of the map was improved, and it resembled closely the map calculated from the refined molecular coordinates. A 2.5 Å MIR map of phospholipase A2 from C. atrox was improved sufficiently by density modification to substantially improve the tracing of the backbone. A 6 Å MIR map of ketosteroid isomerase was improved by density modification to allow recognition of the molecular boundaries of two independent dimers in the asymmetric unit.
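The cycle described above can be caricatured in one dimension as follows; this is a schematic toy, not the crystallographic code used in the study, and the "structure", solvent mask, and starting phase errors are all synthetic.

import numpy as np

def density_modification(f_obs, phases0, solvent_mask, n_cycles=20):
    """Toy 1-D density-modification loop: non-negativity + solvent flattening,
    then re-impose observed amplitudes and update phases each cycle."""
    phases = phases0.copy()
    for _ in range(n_cycles):
        rho = np.fft.ifft(f_obs * np.exp(1j * phases)).real   # current map
        rho[rho < 0] = 0.0                                     # non-negativity
        rho[solvent_mask] = rho[solvent_mask].mean()           # flatten solvent region
        f_new = np.fft.fft(rho)
        phases = np.angle(f_new)                               # adopt modified phases
    return phases

# Synthetic 1-D "structure": a few peaks in an otherwise flat (solvent) cell.
rng = np.random.default_rng(7)
n = 256
rho_true = np.zeros(n)
rho_true[40:60] = 5.0
rho_true[120:130] = 3.0
f_true = np.fft.fft(rho_true)
f_obs = np.abs(f_true)                                      # "observed" amplitudes only
phases_start = np.angle(f_true) + rng.normal(0, 1.0, n)     # noisy starting phases
solvent_mask = rho_true == 0

phases_out = density_modification(f_obs, phases_start, solvent_mask)
err0 = np.abs(np.angle(np.exp(1j * (phases_start - np.angle(f_true))))).mean()
err1 = np.abs(np.angle(np.exp(1j * (phases_out - np.angle(f_true))))).mean()
print(f"mean phase error: start {np.degrees(err0):.1f} deg -> after {np.degrees(err1):.1f} deg")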
This course will be based on Professor Zwick’s new book, to be published by Springer in the summer of 2023, titled Elements and Relations: Aspects of a Scientific Metaphysics.
This book develops the core proposition that systems theory is an attempt to construct an “exact and scientific metaphysics,” a system of general ideas central to science that can be expressed mathematically. Collectively, these ideas would constitute a nonreductionist “theory of everything” unlike what is being sought in physics. Inherently transdisciplinary, systems theory offers ideas and methods that are relevant to all of the sciences and also to professional fields such as systems engineering, public policy, business, and social work.
The book has three parts: Essay, Notes, and Commentary. The Essay section is a short distillation of systems ideas that illuminate the problems that many types of systems face. Commentary explains systems thinking, its value, and its relation to mainstream scientific knowledge. It shows how systems ideas revise our understanding of science and how they impact our views on religion, politics, and history. Finally, Notes contains all the mathematics in the book, as well as scientific, philosophical, and poetic content that is accessible to readers without a strong mathematical background.
Elements and Relations is intended for researchers and students in the systems (complexity) field as well as related fields of social science modeling, systems biology and ecology, and cognitive science. It can be used as a textbook in systems courses at the undergraduate or graduate level and for STEM education. As much of the book does not require a background in mathematics, it is also suitable for general readers in the natural and social sciences as well as in the humanities, especially philosophy.
For more information contact:
Prof. Martin Zwick
zwick@pdx.edu
This seminar will draw on Bunge’s conception of systems science as aimed at the construction of “an exact and scientific metaphysics” (ESM): a system of very general transdisciplinary ideas that are applicable to the sciences and capable of being expressed in the exact language of mathematics. These ideas come from non-linear dynamics, information theory, game and decision theory, thermodynamics, evolutionary theory, and other sources.

These ideas are applied to the theme of problems encountered by many types of systems. Of special interest are problems faced by social systems, such as political dysfunction and environmental unsustainability. Systems metaphysics is also relevant to philosophy and to the troubled relation between science and religion.
In this course, information theory is used as a framework for modeling and data mining: for analyzing static or dynamic relations among discrete* variables, for detecting complex interaction effects, and for discovering nonlinearities in continuous variables made discrete by binning. 

In the systems literature, these information-theoretic and related set-theoretic methods, used together with graph theory techniques, are called “Reconstructability Analysis” (RA).  RA overlaps with and extends log-linear modeling in the social sciences, Bayesian networks and graphical models in machine learning, decomposition techniques in multi-valued logic design, Fourier methods for compression, and other modeling approaches.  It can be used for confirmatory and exploratory statistical modeling as well as for non-statistical applications.

Because of their applicability to both qualitative and quantitative variables, RA methods are very general.  They are usable in the natural sciences, social sciences, engineering, business, and other professional fields. The ideas of RA define “structure,” “complexity,” “holism,” and other basic notions, and are foundational for systems science.  For course-related research and publications, see items listed in the Discrete Multivariate Modeling category in my Selected Works website, https://works.bepress.com/martin_zwick/ .

This is the theory course that goes with the project course, SySc 431/531 Data Mining with Information Theory, next offered in Winter 2022. It is also the theory course for the Occam software, which has recently been made open source; see https://www.occam-ra.io/
*Discrete variables are typically nominal (categorical, symbolic), but may be ordinal or integer.
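A minimal sketch of the binning idea mentioned above: discretize two continuous variables and compute their mutual information, which picks up a strong nonlinear relation that the Pearson correlation misses. Data and names are synthetic.

import numpy as np

def binned_mutual_information(x, y, bins=8):
    """Mutual information (bits) between two continuous variables after equal-width binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

rng = np.random.default_rng(8)
x = rng.uniform(-2, 2, 4000)
y_nonlinear = x**2 + rng.normal(0, 0.2, x.size)   # strong but non-monotonic relation
y_noise = rng.normal(0, 1, x.size)                # unrelated

print("I(x; x^2 + noise) =", round(binned_mutual_information(x, y_nonlinear), 3), "bits")
print("I(x; pure noise)  =", round(binned_mutual_information(x, y_noise), 3), "bits")
print("Pearson r with x^2 + noise =", round(np.corrcoef(x, y_nonlinear)[0, 1], 3))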
In this course, information theory is used as a framework for modeling and data mining: for analyzing static or dynamic relations among discrete (typically, nominal) variables, for detecting complex interaction effects, and for discovering nonlinearities in continuous variables made discrete by binning. In the systems literature, these information-theoretic and related set-theoretic methods, used together with graph theory techniques, are called "Reconstructability Analysis" (RA). RA overlaps with and extends log-linear modeling in the social sciences, Bayesian networks and graphical models in machine learning, decomposition techniques in multi-valued logic design, Fourier methods for compression, and other modeling approaches. It can be used for confirmatory and exploratory statistical modeling as well as for non-statistical applications.
Occam is a Discrete Multivariate Modeling (DMM) tool based on the methodology of Reconstructability Analysis (RA). Its typical usage is for analysis of problems involving large numbers of discrete variables. Models are developed which consist of one or more components, which are then evaluated for their fit and statistical significance. Occam can search the lattice of all possible models, or can do detailed analysis on a specific model.

In Variable-Based Modeling (VBM), model components are collections of variables. In State-Based Modeling (SBM), components identify one or more specific states or substates.

Occam provides a web-based interface, which allows uploading a data file, performing analysis, and viewing or downloading results. For papers on Reconstructability Analysis, see the Discrete Multivariate Modeling section on the Selected Works page of Dr. Zwick: https://works.bepress.com/martin_zwick/

For an overview of RA, see the following two papers:
“Wholes and Parts in General Systems Methodology”
“An Overview of Reconstructability Analysis”
We consider the problem of matching domain-specific statistical structure to neural-network (NN) architecture. In past work we have considered this problem in the function approximation context; here we consider the pattern classification context. General Systems Methodology tools for finding problem-domain structure suffer exponential scaling of computation with respect to the number of variables considered. Therefore we introduce the use of Extended Dependency Analysis (EDA), which scales only polynomially in the number of variables, for the desired analysis. Based on EDA, we demonstrate a number of NN pre-structuring techniques applicable for building neural classifiers. An example is provided in which EDA results in significant dimension reduction of the input space, as well as capability for direct design of an NN classifier.
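EDA itself is not reproduced here; as a simplified stand-in for the dimension-reduction step, the sketch below ranks single inputs and input pairs by mutual information with the class label on synthetic data, illustrating why a pairwise (dependency-style) screen can find structure, such as an XOR interaction, that single-variable screening misses. All variable names and data are hypothetical.

import numpy as np
from itertools import combinations

def mi_bits(labels_a, labels_b):
    """Mutual information (bits) between two discrete 1-D integer arrays."""
    joint = np.zeros((labels_a.max() + 1, labels_b.max() + 1))
    np.add.at(joint, (labels_a, labels_b), 1)
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])))

# Synthetic classification data: the class depends on x0 alone and on the x1 XOR x2 interaction.
rng = np.random.default_rng(9)
X = rng.integers(0, 2, size=(4000, 5))
y = (X[:, 0] | (X[:, 1] ^ X[:, 2])).astype(int)

print("single-variable screen:")
for j in range(X.shape[1]):
    print(f"  I(x{j}; y) = {mi_bits(X[:, j], y):.3f} bits")

print("pairwise screen (joint variable coded as 2*xi + xj):")
for i, j in combinations(range(X.shape[1]), 2):
    print(f"  I(x{i},x{j}; y) = {mi_bits(2 * X[:, i] + X[:, j], y):.3f} bits")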
Speculations in Science and Technology, Vol. 1, No. 2 (1978). "Quantum Measurement and Gödel's Proof," Martin Zwick, Systems Science, Portland State University, PO Box 751, Portland, Oregon 97207, USA. Received: 19 June 1978. Abstract: The measurement ...

And 185 more

Discount flyer for purchase of Elements and Relations.