Many software metrics have been proposed for programs. By contrast, metrics for databases have been neglected on the grounds that databases were mere plain files that did not significantly affect the maintainability of information systems. However, later enhancements to database systems ...
An unknown quantum state |φ⟩ can be disassembled into, then later reconstructed from, purely classical information and purely nonclassical Einstein-Podolsky-Rosen (EPR) correlations. To do so the sender, "Alice," and the receiver, "Bob," must prearrange the sharing of an EPR-correlated pair of particles. Alice makes a joint measurement on her EPR particle and the unknown quantum system, and sends Bob the classical result of this measurement. Knowing this, Bob can convert the state of his EPR particle into an exact replica of the unknown state |φ⟩ which Alice destroyed.
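As a reminder of the algebra behind this protocol (a standard identity added for the reader, not text from the abstract), write the unknown state as |φ⟩ = a|0⟩ + b|1⟩ and take the shared EPR pair to be the singlet; expanding Alice's two particles in the Bell basis gives

\[
|\phi\rangle_1 \otimes |\Psi^-\rangle_{23}
= \tfrac{1}{2}\Big[\,|\Psi^-\rangle_{12}\,(-a|0\rangle - b|1\rangle)_3
+ |\Psi^+\rangle_{12}\,(-a|0\rangle + b|1\rangle)_3
+ |\Phi^-\rangle_{12}\,(a|1\rangle + b|0\rangle)_3
+ |\Phi^+\rangle_{12}\,(a|1\rangle - b|0\rangle)_3\Big],
\]

so each of Alice's four equally likely Bell outcomes leaves Bob's particle in |φ⟩ up to a known Pauli rotation, which her two classical bits tell him how to undo.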
Interference of photons emerging from independent sources is essential for modern quantum-information processing schemes, above all quantum repeaters and linear-optics quantum computers. We report an observation of nonclassical interference of two single photons originating from two independent, separated sources, which were actively synchronized with an rms timing jitter of 260 fs. The resulting (two-photon) interference visibility was (83 ± 4)%.
Invariant, additive, and separable parameters for measures of human, social, and natural capital have repeatedly proven their value and utility globally over the last 50 years. Given growing demand for comparable living capital metrics, metrological organizations should position themselves to provide the needed calibration and traceability services.
We have created heralded coherent state superpositions (CSS) by subtracting up to three photons from a pulse of squeezed vacuum light. To produce such CSSs at a sufficient rate, we used our high-efficiency photon-number-resolving transition-edge sensor to detect the subtracted photons. This is the first experiment enabled by and utilizing the full photon-number-resolving capabilities of this detector. The CSS produced by three-photon subtraction had a mean photon number of 2.75 (+0.06 / −0.24) and a fidelity of 0.59 (+0.04 / −0.14) with an ideal CSS. This confirms that subtracting more photons results in higher-amplitude CSSs.
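For reference (standard definitions, not data from the experiment), the ideal odd coherent state superposition against which such states are usually compared, the fidelity quoted above, and the mean photon number of the ideal odd CSS are

\[
|\mathrm{CSS}_-\rangle = \frac{|\alpha\rangle - |{-\alpha}\rangle}{\sqrt{2\left(1 - e^{-2|\alpha|^2}\right)}},
\qquad
F = \langle \mathrm{CSS}_-|\,\rho\,|\mathrm{CSS}_-\rangle,
\qquad
\langle n\rangle = |\alpha|^2 \coth\!\left(|\alpha|^2\right),
\]

so a larger measured mean photon number corresponds to a larger cat amplitude |α|, which is the sense in which subtracting more photons yields higher-amplitude CSSs.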
In classical control theory, tracking refers to the ability to perform measurements and feedback on a classical system in order to enforce some desired dynamics. In this paper we investigate a simple version of quantum tracking, namely, we look at how to optimally transform the state of a single qubit into a given target state, when the system can be prepared in two different ways, and the target state depends on the choice of preparation. We propose a tracking strategy that is proved to be optimal for any input and target states. Applications in the context of state discrimination, state purification, state stabilization, and state-dependent quantum cloning are presented, where existing optimality results are recovered and extended.
Universal quantum computation on decoherence-free subspaces and subsystems (DFSs) is examined with particular emphasis on using only physically relevant interactions. A necessary and sufficient condition for the existence of decoherence-free (noiseless) subsystems in the Markovian regime is derived here for the first time. A stabilizer formalism for DFSs is then developed which allows for the explicit understanding of these in their dual role as quantum error correcting codes. Conditions for the existence of Hamiltonians whose induced evolution always preserves a DFS are derived within this stabilizer formalism. Two possible collective decoherence mechanisms arising from permutation symmetries of the system-bath coupling are examined within this framework. It is shown that in both cases universal quantum computation which always preserves the DFS (natural fault-tolerant computation) can be performed using only two-body interactions. This is in marked contrast to standard error correcting codes, where all known constructions using one- or two-body interactions must leave the code space during the on-time of the fault-tolerant gates. A further consequence of our universality construction is that a single exchange Hamiltonian can be used to perform universal quantum computation on an encoded space whose asymptotic coding efficiency is unity. The exchange Hamiltonian, which is naturally present in many quantum systems, is thus asymptotically universal.
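As a pointer for the reader (a standard paraphrase of the Markovian condition referred to above, not a quotation from the paper), a subspace spanned by states {|k⟩} is decoherence-free under a Lindblad master equation with Lindblad operators F_α provided

\[
F_\alpha |k\rangle = c_\alpha |k\rangle \quad \text{for all } \alpha \text{ and all } k,
\]

with the constants c_α independent of k, together with the requirement that the system Hamiltonian (including the induced Lamb-shift-like term) map the subspace into itself.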
About 90 pages of very detailed descriptions of proofs and detailed solutions to selected exercises from Steen Thorbjørnsen's book "Fundamental Measure- and Integration Theory".
What are Quantum Jumps? Phys. Scr. 1988 (T21) 49; full text available via IOPscience: http://iopscience.iop.org/1402-4896/1988/T21/009
In this paper (part II), we analyze some Measure Theory equations and Ramanujan formulas with the developments of the MRB Constant. We obtain new possible mathematical connections with some Cosmological parameters and sectors of String Theory.
We discuss a recently proposed extension of Bohmian mechanics to quantum field theory. For more or less any regularized quantum field theory there is a corresponding theory of particle motion, which in particular ascribes trajectories to the electrons or whatever sort of particles the quantum field theory is about. Corresponding to the nonconservation of the particle number operator in the quantum field theory, the theory describes explicit creation and annihilation events: the world lines for the particles can begin and end.
Perceiver characteristics need to be modeled in sociometric measurement. The author accomplishes this with a latent trait model of sociometric choice that is especially successful when unlimited nominations are collected.
The need for a theory of social presence is more pressing as the Internet and virtual environments become increasingly social. With time we can observe an increase in social interaction not only among users, but also between users and computer agents. A robust and detailed theory and measure of social presence could contribute to understanding and explaining social behavior in mediated environments, allow researchers to predict and measure differences among media interfaces, and guide the design of new social environments and interfaces.
Database and data model evolution cause significant problems in the highly dynamic business environment that we experience these days. To support the rapidly changing data requirements of agile companies, conceptual data models, which constitute the foundation of database design, should be sufficiently flexible to be able to incorporate changes easily and smoothly. In order to understand what factors drive the maintainability of conceptual data models and to improve conceptual modelling processes, we need to be able to assess conceptual data model properties and qualities in an objective and cost-efficient manner. The scarcity of early available and thoroughly validated maintainability measurement instruments motivated us to define a set of metrics for Entity-Relationship (ER) diagrams. In this paper we show that these easily calculated and objective metrics, measuring structural properties of ER diagrams, can be used as indicators of the understandability of the diagrams. Understandability is a key factor in determining maintainability as model modifications must be preceded by a thorough understanding of the model. The validation of the metrics as early understandability indicators opens up the way for an in-depth study of how structural properties determine conceptual data model understandability. It also allows building maintenance-related prediction models that can be used in conceptual data modelling practice.
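Purely as a hedged illustration (the tiny diagram encoding and the metric names below are hypothetical, not the specific metrics defined in the paper), structural ER-diagram metrics of this kind reduce to simple, objectively computable counts over entities, attributes, and relationships:

```python
# Hypothetical sketch: counting-style structural metrics for an ER diagram.
# The diagram encoding and metric names are illustrative, not the paper's definitions.
from dataclasses import dataclass, field

@dataclass
class ERDiagram:
    entities: dict[str, list[str]] = field(default_factory=dict)              # entity -> attributes
    relationships: list[tuple[str, str, str]] = field(default_factory=list)   # (name, entity1, entity2)

def structural_metrics(d: ERDiagram) -> dict[str, float]:
    n_e = len(d.entities)                            # number of entities
    n_a = sum(len(a) for a in d.entities.values())   # total number of attributes
    n_r = len(d.relationships)                       # number of relationships
    return {
        "NE": n_e,
        "NA": n_a,
        "NR": n_r,
        "RvsE": n_r / n_e if n_e else 0.0,           # crude connectivity: relationships per entity
    }

diagram = ERDiagram(
    entities={"Customer": ["id", "name"], "Order": ["id", "date", "total"]},
    relationships=[("places", "Customer", "Order")],
)
print(structural_metrics(diagram))
```

Measures of this simple, count-based kind are what make early, cheap, and objective assessment of diagram understandability plausible in the first place.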
Class voting is supposedly in severe decline in advanced industrial democracies. However, this conventional wisdom derives from research using problematic methods and measures and an overly simple model of political change. This chapter overviews past and current comparative research into changes in and explanations of class-based political behavior and argues for the continued significance of class voting and, by extension, class politics in contemporary democracies. I particularly emphasize the importance of using more appropriate methods and the application and testing of theories that integrate developments in this area with those in studies of voting behavior more generally. This translates into a need for the systematic testing of bottom-up/top-down interactions in the relations between social structure and political preferences and the precise specification and measurement of explanatory mechanisms that can account for the association between class position and voting.
"In many different fields, the empiric phenomenons seem to obey certain general law that is necessary to call the Law of the Big Numbers. This law says that the derived numeric proportions of the observation of a very big number of... more
"In many different fields, the empiric phenomenons seem to obey certain general law that is necessary to call the Law of the Big Numbers. This law says that the derived numeric proportions of the observation of a very big number of similar events it practically remains constant, provided that these events are partly governed by constant factors, partly for variable factors whose variations are irregular and they don’t cause a systemic change in a defined address. Certain values of these proportions are characteristic of each class of events given. When the longitude of the series of observations increasing the derived proportions of those observations they approach more and more to those characteristic constants. One can hope it reproduces them exactly if it were possible to make series of observations of infinite longitude".
With regard to my position today, I can justify it from two different points of view. The first says that it depends on the positions I have occupied in the past, weighted by the influence they still exert in the present, plus a present disturbance caused by external random factors that are independent of my past. The second says that it is the result of the environment in which I live and of infinitely many random, independent disturbances that have affected me in the past, weighted by the influence those disturbances have on my present. Both positions are surely valid; they must be combined with the care and hierarchy that each argument deserves, depending solely on the phenomenon to which they are to be applied.
The first paragraph was taken from an English translation in the introduction to an edition of Poisson's treatise. The second paragraph is the interpretation that Professor Emilia Correa, of the Universidad Nacional de Colombia, Medellín campus, gives to the ARIMA model in her notes for the time series course, first edition, October 2000.
Software metrics should be used in order to improve the productivity and quality of software, because they provide critical information about the reliability and maintainability of the system. In this paper, we propose a cognitive complexity metric for evaluating the design of object-oriented (OO) code. The proposed metric is based on an important feature of OO systems: inheritance. It calculates complexity at the method level by considering the internal structure of methods, and uses inheritance to calculate the complexity of class hierarchies. The proposed metric is validated both theoretically and empirically. For theoretical validation, principles of measurement theory are applied, since measurement theory has been proposed and extensively used in the literature as a means of evaluating software engineering metrics. We applied our metric to a real project for empirical validation and compared it with the Chidamber and Kemerer (CK) metrics suite. The theoretical, practical and empirical validations and the comparative study demonstrate the robustness of the measure.
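Purely as a hedged sketch (the weighting scheme below is illustrative and is not the metric defined in the paper), an inheritance-aware complexity can be assembled by summing method-level weights and propagating class complexity down an inheritance chain:

```python
# Illustrative sketch only: a toy inheritance-aware complexity measure.
# Weights and aggregation rules are hypothetical, not the paper's metric.
class_methods = {             # class -> per-method structural weights
    "Base":    {"load": 3, "validate": 2},
    "Derived": {"validate": 4, "report": 1},
}
parents = {"Derived": "Base", "Base": None}   # single-inheritance chain

def class_complexity(cls: str) -> int:
    """Own method complexity plus the inherited complexity of all ancestors."""
    own = sum(class_methods.get(cls, {}).values())
    parent = parents.get(cls)
    return own + (class_complexity(parent) if parent else 0)

for c in class_methods:
    print(c, class_complexity(c))   # Base -> 5, Derived -> 10
```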
In modern science, we usually associate value with a numerical determination – such as, for instance, the value of the Planck constant. However, once we examine value as a co-original facet of measure, we are led to distinguish – with Spinoza – what we could call the natura naturans of measure from what we understand by measurement as its natura naturata. First, I discuss the tensions and the connections between the extensive side of measures (molis, magnitude) and their intensive side (virtus, worth) to provide a preliminary map for plotting the relations between measures and the social–moral–technical environments where they are performed. The second part of the text presents the articles in this special issue, highlighting how they tackle the social ecology of measures, drawing on distinct theoretical lineages.
The absolute value of a real number is a basic notion in mathematics, and more specifically in one of its main branches: differential and integral calculus. The definitions of fundamental concepts such as limit and continuity rest on this notion, and those concepts constitute genuine epistemological obstacles in learning, as countless studies have shown. Reflecting on this fact, we considered it worthwhile to investigate whether one of the causes was a lack of understanding of this basic notion. It seems so simple that we rarely give it a thorough treatment in teaching. Nor do we ask whether the usual approach is adequate, especially given the need to initiate students into the variational thinking required to grasp the essence of calculus. In this respect, we have found that precalculus teaching does not put sufficient emphasis on designing situations that help students understand this concept. Its properties are simply listed, and its first applications, such as solving equations and inequalities with absolute value, are treated only superficially. We therefore decided to carry out a study whose population consisted of students in the final year of Educación Polimodal, with an orientation in Natural Sciences, Health and Environment, at a public school in the city of San Luis. We adopted a cognitive approach based on registers of semiotic representation and their impact on the learning of mathematics. This approach holds that "understanding a mathematical concept involves the coherent articulation of the different semiotic systems that come into play in problem solving" and that "knowledge associated with a concept is stable in an individual if he or she can articulate the different representations of an object without contradictions" (4). This articulation also includes acquiring strategies for using those semiotic systems that facilitate interpretation, the path to a solution, and the checking of results, according to the characteristics of the problem. The objectives of the experience were to investigate:
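For reference (standard definitions added only to fix notation, not part of the study), the notion in question and the kind of inequality whose classroom treatment is discussed above are

\[
|x| = \begin{cases} x, & x \ge 0,\\ -x, & x < 0,\end{cases}
\qquad
|x - a| < \delta \iff a - \delta < x < a + \delta,
\]

the second of which is exactly the form in which absolute value underlies the definitions of limit and continuity.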
We criticize speculations to the effect that quantum mechanics is fundamentally about information. We do this by pointing out how unfounded such speculations in fact are. Our analysis focuses on the dubious claims of this kind recently made by Anton Zeilinger.
This paper is intended to give an exposition of the Cantor set: its construction, its properties, and its measurability. The key objective is to point the beginner, and others interested in the study of this famous and fascinating set, to some of the main ideas regarding perfect, compact, nonempty sets that deserve further research. The paper serves as an accessible introduction for beginners and other scholars interested in studying the Cantor set, especially its properties and measurability.
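A one-line version of the measurability result discussed here (a standard computation, not a claim specific to this paper): after n construction steps the Cantor set C is covered by 2^n closed intervals of length 3^{-n}, so

\[
\lambda(C) \;\le\; 2^{n}\cdot 3^{-n} = \left(\tfrac{2}{3}\right)^{n} \xrightarrow[\;n\to\infty\;]{} 0,
\]

hence C is an uncountable, compact, perfect set of Lebesgue measure zero.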
With the development of technology in recent decades, especially in computing, some areas of mathematics have regained interest, owing to the great advantage of being able to perform millions of calculations in a few seconds on a computer. One such area is the study of the iteration of rational functions of one complex variable, a subject that Julia and Fatou had treated in the first decades of the twentieth century.
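A minimal sketch of the kind of computation that makes this area attractive on a computer (the parameter below is chosen arbitrarily for illustration; this is the escape-time test for the quadratic family, the simplest rational maps, not a general algorithm from the text):

```python
# Escape-time iteration for z -> z^2 + c.
# Points whose orbit stays bounded approximate the filled Julia set of the map.
def escape_time(z: complex, c: complex, max_iter: int = 100) -> int:
    for n in range(max_iter):
        if abs(z) > 2.0:          # once |z| > 2 the orbit escapes to infinity
            return n
        z = z * z + c
    return max_iter               # treated as "did not escape"

c = -0.8 + 0.156j                 # arbitrary illustrative parameter
for z0 in (0.0 + 0.0j, 1.5 + 0.0j):
    print(z0, escape_time(z0, c))
```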
The word "fractal" comes from the Latin fractus, meaning "fragmented," "fractured," or simply "broken," a very apt term for objects whose dimension is fractional. The term was coined by Benoît Mandelbrot in 1977 and appeared in his book The Fractal Geometry of Nature.
This work is mainly a compilation of the theory of Iterated Function Systems (IFS), where the attractor of the dynamical system is the support of the invariant measure that arises naturally from the system (in its stochastic version) when it is modeled as a Markov process, and where the attractor exhibits fractal behavior. Some examples and computer simulations are included.
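A brief sketch of the stochastic version described above (the Sierpinski-triangle IFS is used only as a standard example, not necessarily one studied in the text): iterating a randomly chosen contraction produces points distributed according to the invariant measure supported on the attractor.

```python
import random

# Chaos game for the Sierpinski-triangle IFS: three contractions of ratio 1/2
# toward the triangle's vertices, each chosen with probability 1/3.
vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]

def chaos_game(n_points: int = 10_000):
    x, y = 0.3, 0.3                        # arbitrary starting point
    points = []
    for _ in range(n_points):
        vx, vy = random.choice(vertices)   # pick one map of the IFS at random
        x, y = (x + vx) / 2, (y + vy) / 2  # apply the chosen contraction
        points.append((x, y))
    return points                          # empirical sample of the invariant measure

pts = chaos_game()
print(len(pts), pts[-1])
```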
The purpose of this study is to describe how Guttman, Rasch, and Mokken approached issues related to invariant measurement. These measurement theorists were chosen to illustrate the evolution of our conceptualizations of invariant measurement during the 20th century within the research tradition of item response theory. Item response theory can be viewed as consisting of nonparametric and parametric scaling models. Invariant measurement consists of requirements related to the item-invariant measurement of individuals, and the sample-invariant calibration of items. Guttman was selected to represent a nonparametric approach to invariant measurement based on a deterministic scaling model, whereas Rasch was selected to represent a parametric approach based on a probabilistic model. Mokken presents a nonparametric approach based on a probabilistic model. Mokken combines aspects of Guttman Scaling with Rasch measurement theory, and provides a nonparametric framework for examining Rasch's requirements for invariant measurement.
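For concreteness (a standard formulation, not quoted from the study), the dichotomous Rasch model writes the probability that person n answers item i correctly as

\[
P(X_{ni}=1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)},
\]

whose clean separation of the person parameter θ_n and the item parameter b_i is what makes item-invariant measurement of persons and sample-invariant calibration of items possible.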
We present a framework for efficiently performing Monte Carlo wave-function simulations in cavity QED with moving particles. It relies heavily on the object-oriented programming paradigm as realised in C++, and is extensible and applicable for simulating open interacting quantum dynamics in general. The user is provided with a number of "elements", e.g. pumped moving particles, pumped lossy cavity modes, and various interactions, from which complex interacting systems (containing several particles moving in electromagnetic fields of various configurations) can be composed and on which wave-function simulations can be performed. A number of tools are provided to facilitate the implementation of new elements.
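To make the underlying method concrete without pretending to reproduce the framework's C++ interface, here is a minimal, self-contained Monte Carlo wave-function sketch (in Python, for brevity) for a single decaying two-level atom; it only illustrates the generic quantum-jump step that such simulations are built on.

```python
import numpy as np

# Minimal Monte Carlo wave-function (quantum-jump) trajectory for a decaying qubit.
# Illustrative only; real frameworks compose many such "elements" and interactions.
gamma, dt, steps = 1.0, 0.01, 500
sm = np.array([[0, 1], [0, 0]], dtype=complex)      # lowering operator |g><e|
H_eff = -0.5j * gamma * (sm.conj().T @ sm)          # non-Hermitian effective Hamiltonian

rng = np.random.default_rng(0)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # start in (|g> + |e>)/sqrt(2)
excited_pop = []
for _ in range(steps):
    jump_prob = gamma * dt * np.vdot(psi, (sm.conj().T @ sm) @ psi).real
    if rng.random() < jump_prob:                    # quantum jump: a photon is emitted
        psi = sm @ psi
    else:                                           # no-jump (non-Hermitian) evolution
        psi = psi - 1j * dt * (H_eff @ psi)
    psi = psi / np.linalg.norm(psi)                 # renormalize after either branch
    excited_pop.append(abs(psi[1]) ** 2)

print(excited_pop[::100])   # excited-state population along one trajectory
```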
We describe the physical processes by which a vertically localized absorber perturbs the top-of-atmosphere solar backscattered ultraviolet (UV) radiance. The distinct spectral responses to perturbations of an absorber in its column amount and layer altitude provide the basis for a practical satellite retrieval technique, the Extended Iterative Spectral Fitting (EISF) algorithm, for the simultaneous retrieval of these quantities for an SO2 plume. In addition, the EISF retrieval provides an improved UV aerosol index for quantifying the spectral contrast of apparent scene reflectance at the bottom of the atmosphere bounded by the surface and/or cloud; hence it can be used for detection of the presence or absence of UV-absorbing aerosols. We study the performance and characterize the uncertainties of the EISF algorithm using synthetic backscattered UV radiances, retrievals from which can be compared with the values used in the simulation. Our findings indicate that the presence of aerosols (both absorbing and nonabsorbing) does not cause large errors in EISF retrievals under most observing conditions when they are located below the SO2 plume. EISF retrievals assuming a homogeneous field of view can provide accurate column amounts for inhomogeneous scenes, but they always underestimate the plume altitudes. The EISF algorithm reduces systematic errors present in existing linear retrieval algorithms that use prescribed SO2 plume heights. Applying the EISF algorithm to Ozone Monitoring Instrument satellite observations of the recent Kasatochi volcanic eruption, we demonstrate the successful retrieval of the effective plume altitude of volcanic SO2, and we also show the improvement in accuracy of the corresponding SO2 columns.
Establishing predictive validity of measures is a major concern in marketing research. This paper investigates the conditions favoring the use of single items versus multi-item scales in terms of predictive validity. A series of complementary studies reveals that the predictive validity of single items varies considerably across different (concrete) constructs and stimulus objects. In an attempt to explain the observed instability, a comprehensive simulation study is conducted, aimed at identifying the influence of different factors on the predictive validity of single versus multi-item measures. These include the average inter-item correlations in the predictor and criterion constructs, the number of items measuring these constructs, as well as the correlation patterns of multiple and single items between the predictor and criterion constructs. The simulation results show that, under most conditions typically encountered in practical applications, multi-item scales clearly outperform single items in terms of predictive validity. Only under very specific conditions do single items perform equally well as multi-item scales. Therefore, the use of single-item measures in empirical research should be approached with caution, and the use of such measures should be limited to special circumstances.
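The flavor of such a simulation can be sketched as follows (all parameter values are placeholders, not those of the reported study): generate a latent predictor and a criterion with a known correlation, build noisy items, and compare the predictive validity of one item against the mean of several items.

```python
import numpy as np

# Toy single-item vs multi-item predictive-validity comparison.
# Parameter values are arbitrary placeholders, not the study's design.
rng = np.random.default_rng(1)
n, k, true_r, noise_sd = 5_000, 4, 0.5, 1.0

latent_x = rng.normal(size=n)                                      # latent predictor construct
criterion = true_r * latent_x + np.sqrt(1 - true_r**2) * rng.normal(size=n)

items = latent_x[:, None] + noise_sd * rng.normal(size=(n, k))     # k noisy indicators

single_item = items[:, 0]
multi_item = items.mean(axis=1)                                    # simple mean score over k items

def validity(score):
    return np.corrcoef(score, criterion)[0, 1]                     # correlation with the criterion

print("single-item validity:", round(validity(single_item), 3))
print("multi-item validity: ", round(validity(multi_item), 3))
```

Averaging the items cancels part of the item-specific noise, which is why the multi-item score typically correlates more strongly with the criterion in such setups.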
In this paper, we describe the challenge of adequately characterizing and measuring experiences associated with playing digital games. We discuss the applicability of traditional usability metrics to user-centred game design, and highlight two prominent concepts, flow and immersion, as potential candidates for evaluating gameplay. The paper concludes by describing the multi-measure approach taken by the Game Experience Research Lab in Eindhoven.
Mathematically considered, a Triangular Norm is a kind of binary operation frequently used in the context of Probabilistic Metric Spaces, but also in other very interesting fields, such as Fuzzy Logic or, more generally, Multi-Valued Logic (MVL). The T-conorm, or S-norm, is the dual concept. Both ideas allow us to generalize the intersection and the union in a Lattice, or disjunction and conjunction in Logic. It is also very interesting to introduce a special class of real monotone operations, the so-called Copulas, which are very useful in many fields. We therefore offer a comprehensive analysis of all these aggregation operators.
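For readers new to the terminology (standard examples, not results of this paper), three classical t-norms and their dual t-conorms, obtained via S(x,y) = 1 − T(1−x, 1−y), are

\[
T_{\min}(x,y)=\min(x,y),\qquad T_{\mathrm{prod}}(x,y)=xy,\qquad T_{\mathrm{Luk}}(x,y)=\max(x+y-1,\,0),
\]
\[
S_{\max}(x,y)=\max(x,y),\qquad S_{\mathrm{prod}}(x,y)=x+y-xy,\qquad S_{\mathrm{Luk}}(x,y)=\min(x+y,\,1),
\]

and every copula C is sandwiched between the Łukasiewicz t-norm and the minimum, max(x+y−1, 0) ≤ C(x,y) ≤ min(x,y), the Fréchet–Hoeffding bounds.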
The goal of performance shaping factors (PSFs) is to provide measures to account for human performance. PSFs fall into two categories: direct and indirect measures of human performance. While some PSFs, such as "time to complete a task", are directly measurable, other PSFs, such as "fitness for duty", can only be measured indirectly through other measures and PSFs, such as fatigue measures. This paper explores the role of direct and indirect measures in human reliability analysis (HRA) and the implications that measurement theory has on analyses and applications using PSFs. The paper concludes with suggestions for maximizing the reliability and validity of PSFs.
Hermitian matrices, and consider the cone of the positive definite ones. We say that the random variable S, taking its values in this cone, has the complex Wishart distribution γ_{p,σ} if E[exp tr(θS)] = (det(I_r − σθ))^(−p), where σ and σ^(−1) − θ lie in the cone, and where p = 1, 2, …, r − 1 or p > r − 1. In this paper, we compute all moments of S and S^(−1). The techniques involve in particular the use of the irreducible characters of the symmetric group.
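As a quick sanity check on the parametrization (an elementary consequence of the Laplace transform above, not one of the paper's new results), differentiating the cumulant generating function at θ = 0 gives the first moment:

\[
\left.\frac{\partial}{\partial \theta}\,\log E\!\left[\exp \operatorname{tr}(\theta S)\right]\right|_{\theta=0}
= \left. p\,\sigma\,(I_r - \sigma\theta)^{-1}\right|_{\theta=0} = p\,\sigma,
\qquad \text{so } E[S] = p\,\sigma .
\]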
Signal causality, the prohibition of superluminal information transmission, is the fundamental property shared by quantum measurement theory and relativity, and it is the key to understanding the connection between nonlocal measurement effects and elementary interactions. To prevent those effects from transmitting information between the generating and observing process, they must be induced by the kinds of entangling interactions that constitute measurements, as implied in the Projection Postulate. They must also be nondeterministic as reflected in the Born Probability Rule. The nondeterminism of entanglement-generating processes explains why the relevant types of information cannot be instantiated in elementary systems, and why the sequencing of nonlocal effects is, in principle, unobservable. This perspective suggests a simple hypothesis about nonlocal transfers of amplitude during entangling interactions, which yields straightforward experimental consequences.
Abstract: This study reviews the main theoretical and methodological contributions to the study of empathy. The origins of the term empathy are reviewed. The history of its study is addressed, with reference to the different perspectives from which it has been approached: the debate between the dispositional and situational positions, and the cognitive, affective, and finally integrative views. The main measures of empathy from the different approaches mentioned are collected and discussed. Studies of empathy from various perspectives (neuropsychological, differential, and social) are discussed, as are the most important practical applications of the study of empathy in clinical and organizational settings. Finally, an attempt is made to organize all of this information into an integrative explanatory model of empathy, and the questions that remain open, or that require greater research effort in order to advance the study of this construct, are raised. Keywords: empathy; measurement; perspective taking; emotion.
The purpose of this study is to examine the interactions among measurement theories, writing theories, and writing assessments in the United States from an historical perspective. The assessment of writing provides a useful framework for examining how theories influence, and in some cases fail to influence, actual practice. Two research traditions are described to classify measurement theories (test-score and scaling), and three research traditions are proposed for classifying writing theories (form, idea and content, and sociocultural context). The results of this study trace the impact of measurement and writing traditions on writing assessment practices within selected time periods during the 20th century in the United States. One of the major findings of this historical analysis is that measurement theory has had a strong influence on writing assessments, while writing theory has had minimal influence on writing assessments. We also found support for the idea that a new discipline of writing assessment has emerged. This new discipline combines multiple fields including the writing, composition, and measurement communities of scholars, and it has the potential to set the stage for the future of writing assessment in the 21st century. Highlights: reviews 100 years of measurement theory, writing theory, and writing assessments; analyzes the influence of measurement and writing theory on writing assessment within the United States; analyzes the bidirectional influence between measurement theory and writing theory; finds that writing theory has had little influence on writing assessment practice; emphasizes the need for cross-discipline conversations.
This paper offers a new methodological framework to guide researchers attempting to quantitatively assess how a pluralistic audience perceives a standardized television advertisement. Rasch (1960) measurement theory is introduced as an alternative to the more commonly employed multigroup confirmatory factor analysis (CFA) approach to assessing cross-cultural scalar equivalence. By analyzing a multicultural data set, we are able to make various inferences concerning the scalar equivalence of Schlinger's confusion scale. The methodology reveals the limits of the scale, which in all probability would not have been detected using traditional approaches. For researchers attempting to develop new scales, or even to refine existing scales, strict adherence to established guidelines of item generation together with the application of the proposed methodology should ensure better results for both theorists and practitioners.
The present conference takes place in the year that marks the centenary of Albert Einstein's birth. Hence it is a good occasion to reflect on those problems which have been at the core of Einstein's intellectual activity. Undoubtedly the foundation of quantum mechanics (QM) is one of these problems. It is known that Einstein was never convinced by the interpretation of quantum mechanics accepted, in his time and still now, by the majority of physicists. The fact that he shared this skepticism with people like Schrödinger, and above all the fact that no convincing answer to the doubts of these people had emerged from a debate more than half a century old, helped keep alive the attention of a growing number of people on this problem.
In this document, we prove the existence and uniqueness (up to a positive constant multiple) of the left Haar measure and the right Haar measure on a locally compact topological group. We assume all the results and notation in "Topological Groups Notes", together with results from elementary measure theory, for which I believe Analysis by Elliott H. Lieb and Michael Loss (see the References section) is a good reference. We also compute examples of Haar measure at the end, namely for R^n, Mat_{n×n}(R) and GL_n(R).
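For comparison with those closing examples (standard facts; the normalizations used in the document may differ), the Haar measures in question can be written explicitly as

\[
d\mu_{\mathbb{R}^n}(x) = dx,\qquad
d\mu_{\mathrm{Mat}_{n\times n}(\mathbb{R})}(X) = dX,\qquad
d\mu_{GL_n(\mathbb{R})}(X) = \frac{dX}{|\det X|^{\,n}},
\]

where dX denotes Lebesgue measure on the n² matrix entries; the first two groups are additive, and GL_n(R) is unimodular, so its left and right Haar measures coincide.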
In this paper, we analyze various equations regarding "area-minimizing oriented boundaries". We describe new possible mathematical connections with some sectors of Number Theory, String Theory and cosmological parameters.