We formulate a family of direct utility functions for the consumption of a differentiated good. This family is based on a generalization of the Shannon entropy. It includes dual representations of all additive random utility discrete choice models, as well as models in which goods are complements. Demand models for market shares can be estimated by plain regression, enabling the use of instrumental variables. Models for microdata can be estimated by maximum likelihood.
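As a brief illustration of the duality referred to above (a textbook fact, not the paper's general family): when the entropy term is the plain Shannon entropy, maximizing entropy-perturbed linear utility over the unit simplex recovers logit market shares.

```latex
\max_{x \in \Delta}\; \sum_j v_j x_j - \sum_j x_j \ln x_j
\qquad\Longrightarrow\qquad
x_j^{*} = \frac{e^{v_j}}{\sum_k e^{v_k}}
```

Generalizing the entropy term changes the implied choice probabilities, which is how such a family can represent other additive random utility models.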
The disaggregation problem arises each time the question of interest requires knowledge of microeconomic behavior that must be based on aggregate sample data or model results. A related question is, what is the optimal level of disaggregation when facing different types of data at different scale levels? Using the most disaggregate level conserves as much information as possible, but may not be justified given the additional model complexity. In this paper we develop a data-consistent approach to the estimation of cropping choices by farmers at a disaggregate (field) level using more aggregate (regional-level) data or the results of an aggregate production policy model. Our data disaggregation procedure requires two steps. The first step consists of specifying a dynamic model of crop allocation and estimating it using aggregate data. In the second step, we disaggregate the outcomes of the aggregate model using maximum entropy (ME). Two points should be noted. First, we explicitl...
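The second (ME) step can be sketched as a small constrained program: choose field-level crop shares of maximum entropy subject to reproducing the aggregate regional crop areas. All names and numbers below are made up for illustration; they are not the paper's data or model.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical ME disaggregation: recover field-level crop shares
# from regional crop totals. Fields, crops, and areas are made up.
field_area = np.array([10.0, 20.0, 30.0])    # hectares per field
regional_crop_area = np.array([24.0, 36.0])  # hectares per crop (sums to 60)
n_fields, n_crops = len(field_area), len(regional_crop_area)

def neg_entropy(theta):
    """Negative Shannon entropy of the candidate shares."""
    t = np.clip(theta, 1e-12, 1.0)
    return float(np.sum(t * np.log(t)))

constraints = [
    # each field's crop shares must sum to one
    {"type": "eq",
     "fun": lambda t: t.reshape(n_fields, n_crops).sum(axis=1) - 1.0},
    # field-level allocations must reproduce the regional crop areas
    {"type": "eq",
     "fun": lambda t: field_area @ t.reshape(n_fields, n_crops)
                      - regional_crop_area},
]

theta0 = np.full(n_fields * n_crops, 1.0 / n_crops)
res = minimize(neg_entropy, theta0, method="SLSQP",
               bounds=[(0.0, 1.0)] * theta0.size, constraints=constraints)
shares = res.x.reshape(n_fields, n_crops)  # max-entropy field-level shares
```

`shares` gives each field's crop mix; among all allocations consistent with the aggregates, it is the least-informative (most uniform) one.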
The maximum entropy principle (MEP), which has been popular in the modeling of droplet size and velocity distributions in sprays, is, strictly speaking, only applicable to isolated systems in thermodynamic equilibrium, whereas spray formation processes are irreversible and non-isolated, with interaction between the atomizing liquid and its surrounding gas medium. In this study, a new model for the droplet size distribution has been developed based on a thermodynamically consistent concept: the maximization of entropy generation during the liquid atomization process. The model prediction compares favorably with size distributions measured near the liquid bulk breakup region for droplets produced by an air-blast annular nozzle, a planar nozzle, and a practical gas turbine nozzle. Therefore, the present model can be used to predict the initial droplet size distribution in sprays.
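For contrast with the entropy-generation model, the classical MEP step the abstract criticizes can be sketched as follows: choose droplet-size probabilities of maximum Shannon entropy subject to normalization and a prescribed mean cubed diameter (a volume/mass-type constraint). The maximizer has a Gibbs form with a single multiplier to solve for; the diameter grid and target value below are made up for demonstration.

```python
import numpy as np
from scipy.optimize import brentq

# Classical MEP sketch (not the paper's entropy-generation model):
# maximize Shannon entropy of the droplet-size distribution subject
# to a mean-d^3 constraint. Grid and target are illustrative only.
d = np.linspace(10e-6, 200e-6, 50)   # droplet diameter classes (m)
target_d3 = (80e-6) ** 3             # prescribed mean of d^3

def mean_d3(lam):
    """Mean d^3 under the Gibbs-form distribution p_i ~ exp(-lam d_i^3)."""
    e = -lam * d**3
    w = np.exp(e - e.max())          # shift exponents for stability
    p = w / w.sum()
    return p @ d**3

# One multiplier lam enforces the moment constraint; solve for it.
lam = brentq(lambda l: mean_d3(l) - target_d3, 0.0, 1e15)
w = np.exp(-lam * d**3)
p = w / w.sum()                      # maximum-entropy size distribution
```

The entropy-generation model replaces the equilibrium entropy objective with entropy generated during atomization, but yields a distribution over the same kind of size classes.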
Agriculture and Agri-Food Canada (AAFC) has an ongoing research program to provide information on the effect of potential agricultural policy and technology scenarios on the environment and the economic conditions, behavior and performance in the agriculture sector. Included in this work program is a project to improve our farm level data on cost of production and farm management practices for economic and environmental analysis. As part of this effort to improve our data, this report evaluates an analytical method, called Maximum Entropy (ME), for its effectiveness in extracting detailed, enterprise level, cost of production information from whole-farm data. The ME method has been shown to be a promising and cost-effective option for obtaining these enterprise-level estimates from whole-farm data sets already available.
This paper deals with abstract density topologies on the family of Lebesgue measurable sets, generated by an operator similar to the lower density operator defined on the family of measurable sets.
Abstract—Classifying text data has been an active area of research for a long time. A text document is a multifaceted object and is often inherently ambiguous by nature. Multi-label learning deals with such ambiguous objects. Classification of such ambiguous text objects ...
In this paper, freight transportation is considered. One of the models used for modelling "Origin-Destination" freight flows is the log-regression model obtained by applying a log-transformation to the traditional gravity model. Freight flows between ten provinces of Turkey are analyzed using the generalized maximum entropy estimator of the log-regression model for freight flows. The data set is gathered from the axle load survey performed by the Turkish Directorate of Highways and from other socioeconomic and demographic variables related to the provinces of interest. Relations between the considered socioeconomic and demographic variables and freight flows are identified and the results are discussed.
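The log-transformation referred to above takes the traditional gravity model to a linear regression (standard form; the symbols are generic, not the paper's notation):

```latex
T_{ij} = k\,\frac{O_i^{\alpha} D_j^{\beta}}{d_{ij}^{\gamma}}
\quad\Longrightarrow\quad
\ln T_{ij} = \ln k + \alpha \ln O_i + \beta \ln D_j - \gamma \ln d_{ij} + \varepsilon_{ij}
```

where T_ij is the flow from origin i to destination j, O_i and D_j are origin and destination masses, and d_ij is the distance between them. The generalized maximum entropy estimator is then applied to this linear form.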
Image segmentation is an essential yet difficult component of low-level vision, image analysis, pattern recognition, and robotic systems. It is one of the most challenging tasks in image processing and determines the quality of the final result of the image analysis. Image segmentation is the process of dividing an image into regions such that each region is homogeneous. Various image segmentation algorithms are discussed. Examples in different image formats are presented, and the overall results are discussed and compared with respect to different parameters.
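One classical criterion such surveys cover is entropy-based (Kapur-style) thresholding, sketched below on a synthetic bimodal image; the data and parameter choices are illustrative only.

```python
import numpy as np

# Sketch of entropy-based (Kapur-style) thresholding. The synthetic
# bimodal "image" below is made up for demonstration purposes.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 10, 5000)])
img = np.clip(img, 0, 255).astype(int)

p = np.bincount(img, minlength=256).astype(float)
p /= p.sum()                        # gray-level histogram as probabilities

def kapur_entropy(t):
    """Background + foreground entropy when thresholding at level t."""
    pb, pf = p[:t + 1], p[t + 1:]
    wb, wf = pb.sum(), pf.sum()
    if wb == 0.0 or wf == 0.0:
        return -np.inf
    qb, qf = pb[pb > 0] / wb, pf[pf > 0] / wf
    return -(qb @ np.log(qb)) - (qf @ np.log(qf))

best_t = max(range(255), key=kapur_entropy)  # threshold between the modes
```

Pixels at or below `best_t` form one homogeneous region and the rest form the other, which is exactly the two-region case of the definition above.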
Enhancing the production in liquid-loaded horizontal natural gas wells using an acoustic liquid atomizer tool is proposed as a possible artificial lift method. The more liquid that is converted to droplets, the more the available gas is able to carry the liquid to the surface, resulting in an increase in production. The acoustic atomizer was selected as the atomization device because it can create very small droplets at certain frequencies, leading to a mist flow. The contribution of this research includes obtaining experimental data using different laboratory procedures for horizontal and slightly inclined tubulars. Two-phase (gas and water) injection stream lines are joined to the test section to introduce gas and water at desired rates. An ultrasonic atomizer inside the test section is used to better understand the atomization mechanism as an artificial lift technique. Several experiments are conducted varying factors that influence the acoustic properties, including varying liquid and ga...
This paper estimates von Neumann and Morgenstern utility functions comparing the generalized maximum entropy (GME) estimator with OLS, using data obtained by utility elicitation methods. Thus, it provides a comparison of the performance of the two estimators in a real-data, small-sample setup. The results confirm those obtained for small samples through Monte Carlo simulations. The difference between the two estimators is small and it decreases as the width of the parameter support vector increases. Moreover, the GME ...
In this paper, a data-constrained generalized maximum entropy (GME) estimator for the general linear measurement error model is proposed. GME estimation, as developed by A. Golan, G. Judge and D. Miller, Maximum Entropy Econometrics: Robust Estimation with Limited Data (Wiley, New York, 1996), was formulated as a convex mixed-integer nonlinear optimization problem. Shannon's entropy measure and its generalizations, namely the 'entropy of order r' of Tsallis and Rényi, are briefly discussed. A Monte Carlo comparison is made with ...
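The GME reparameterization at the core of this line of work can be sketched for the plain linear model: each coefficient is a convex combination over a chosen support, each error over a symmetric support, and the joint Shannon entropy of those weights is maximized subject to the data. The supports and the simulated data below are illustrative choices, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal GME sketch for y = X @ beta + eps in the Golan-Judge-Miller
# spirit. Supports and simulated data are made up for illustration.
rng = np.random.default_rng(1)
n, K = 12, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.2, size=n)

z = np.array([[-5.0, 0.0, 5.0]] * K)   # coefficient supports (K x M)
v = np.array([-1.0, 0.0, 1.0])         # error support (J,)
M, J = z.shape[1], v.size

def unpack(theta):
    return theta[:K * M].reshape(K, M), theta[K * M:].reshape(n, J)

def neg_entropy(theta):
    t = np.clip(theta, 1e-12, 1.0)
    return float(np.sum(t * np.log(t)))

def data_residual(theta):
    w, u = unpack(theta)
    beta = (z * w).sum(axis=1)         # beta_k = z_k . w_k
    return y - X @ beta - u @ v        # must vanish at the solution

cons = [
    {"type": "eq", "fun": lambda t: unpack(t)[0].sum(axis=1) - 1.0},
    {"type": "eq", "fun": lambda t: unpack(t)[1].sum(axis=1) - 1.0},
    {"type": "eq", "fun": data_residual},
]
theta0 = np.concatenate([np.full(K * M, 1 / M), np.full(n * J, 1 / J)])
res = minimize(neg_entropy, theta0, method="SLSQP",
               bounds=[(0.0, 1.0)] * theta0.size, constraints=cons,
               options={"maxiter": 500})
w_hat, _ = unpack(res.x)
beta_gme = (z * w_hat).sum(axis=1)     # GME coefficient estimates
```

The paper's measurement-error setting adds further constraints on top of this reparameterization; the sketch covers only the basic GME mechanism itself.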
In the Rossi (1989) interview, Zellner gave the following answer to the question of how he came up with the SUR (seemingly unrelated regressions) idea he developed in Zellner (1962a): "On a rainy Seattle evening in 1956 or 1957, the idea of writing the multivariate model in single-equation form somehow occurred to me. While I was thinking about how to solve it, everything fell into place, because many single-variable results could then be translated to multivariate systems, and the analysis of the multivariate system was notationally, mathematically, and conceptually much simplified." (Rossi, 1989: 292).
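The "single-equation form" of the anecdote is the stacked SUR system, in standard notation:

```latex
\begin{pmatrix} y_1 \\ \vdots \\ y_M \end{pmatrix}
=
\begin{pmatrix} X_1 & & \\ & \ddots & \\ & & X_M \end{pmatrix}
\begin{pmatrix} \beta_1 \\ \vdots \\ \beta_M \end{pmatrix}
+
\begin{pmatrix} \varepsilon_1 \\ \vdots \\ \varepsilon_M \end{pmatrix},
\qquad
\operatorname{E}[\varepsilon\varepsilon'] = \Sigma \otimes I_n
```

Writing the M equations as one big regression with block-diagonal design lets single-equation GLS results carry over directly to the multivariate system.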
The main objective of this study is to define the methods needed to optimize a monitoring network designed for the source characterization of atmospheric releases. Optimization here consists of determining the optimal number and positions of sensors to deploy in order to meet this type of need. In this context, the optimization is carried out for the first time by coupling the data-inversion technique known as "renormalization" with metaheuristic optimization algorithms. The inversion method was first evaluated for point-source characterization, and then made it possible to define optimality criteria for network design. In this study, the optimization process was evaluated in the framework of experiments carried out over flat, obstacle-free terrain (DYCE) and in an idealized urban environment (MUST). Three problems were defined and tested on these experiments. They concern (i) the determination ...
We present a new, information-theoretic approach for estimating a system of many demand equations where the unobserved reservation or choke prices vary across consumers. We illustrate this method by estimating a nonlinear, almost ideal demand system (AIDS) for four types of meat using cross-sectional data from Mexico, where most households did not buy at least one type of meat during the survey week. The system of demand curves varies across demographic groups.
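For reference, the AIDS budget-share equation being estimated has the standard Deaton-Muellbauer form (the censoring at choke prices is the paper's contribution and is not shown here):

```latex
w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j + \beta_i \ln\!\left(\frac{x}{P}\right)
```

where w_i is the budget share of good i, the p_j are prices, x is total expenditure, and P is a price index.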
Abstract: A moving-average process represents a time series as a finite linear combination of uncorrelated random variables. Our main interest is to compare a classical estimation method, namely exact maximum likelihood estimation (EMLE), with the generalized maximum entropy (GME) approach for estimating the parameters of second-order moving-average processes. In this paper, to apply EMLE we have to find the exact likelihood function by deriving the probability density function of the series. ...
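The process in question is, in standard notation:

```latex
y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2},
\qquad \varepsilon_t \sim \text{white noise}(0,\sigma^2)
```

EMLE maximizes the exact likelihood of the observed series in the parameters (theta_1, theta_2, sigma^2), while GME instead reparameterizes the coefficients over discrete supports and maximizes entropy subject to the data.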
TEZ9981 Thesis (Master's) -- Çukurova Üniversitesi, Adana, 2016. Bibliography (pp. 52-54). xi, 58 p. : tables ; 29 cm. Simultaneous equation models are used intensively in economics, econometrics, and statistics. Using ordinary least squares (OLS) to estimate simultaneous equation models yields biased and inconsistent estimates. To estimate these models, methods are used that correct for the simultaneity between the model's error terms and its endogenous variables. However, when multicollinearity is present, the parameter estimates obtained by these methods are unstable and have high variance. In that case, estimators that reduce the effect of multicollinearity and give more stable estimates are used. This study uses the simultaneous equation model with multicollinearity that Klein formulated in his 1950 work "Economic Fluctuations in the United States". The model is estimated with the traditional two-stage least squares (2SLS) estimator and ...
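The 2SLS estimator mentioned above has the standard closed form (generic notation):

```latex
\hat{\beta}_{2SLS} = \left(X' P_Z X\right)^{-1} X' P_Z y,
\qquad P_Z = Z (Z'Z)^{-1} Z'
```

where Z collects the instruments (the system's predetermined variables). Multicollinearity in X inflates the variance of this estimator, which motivates the more stable alternatives the thesis studies.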
Spatial econometrics is a subdiscipline that has gained huge popularity over the last twenty years, not only in theoretical econometrics but also in empirical studies. Basically, spatial econometric methods measure spatial interaction and incorporate spatial structure into regression ...
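A canonical way spatial structure enters a regression is the spatial lag (SAR) model, in standard notation:

```latex
y = \rho W y + X\beta + \varepsilon
```

where W is a spatial weights matrix encoding which units are neighbors and the parameter rho measures the strength of spatial interaction.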