Airborne laser scanners enable the geometric acquisition of the terrain surface, including objects like trees or buildings which rise from the terrain. Even though for a number of applications a so-called Digital Surface Model (DSM), representing the surface geometry by an object-independent distribution of points, is sufficient, further qualification of the original scanner data is necessary for more sophisticated tasks like visualizations or high-quality 3D simulations.
In this paper we investigate whether consideration of store-level heterogeneity in marketing mix effects improves the accuracy of the marketing mix elasticities, fit, and forecasting accuracy of the widely applied SCAN*PRO model of store sales. Models with continuous and discrete representations of heterogeneity, estimated using hierarchical Bayes (HB) and finite mixture (FM) techniques, respectively, are empirically compared to the original model, which does not account for store-level heterogeneity in marketing mix effects and is estimated using ordinary least squares (OLS). The empirical comparisons are conducted in two contexts: Dutch store-level scanner data for the shampoo product category, and an extensive simulation experiment. The simulation investigates how between- and within-segment variance in marketing mix effects, error variance, the number of weeks of data, and the number of stores impact the accuracy of marketing mix elasticities, model fit, and forecasting accuracy. Contrary to expectations, accommodating store-level heterogeneity does not improve the accuracy of marketing mix elasticities relative to the homogeneous SCAN*PRO model, suggesting that little may be lost by employing the original homogeneous SCAN*PRO model estimated using ordinary least squares. Improvements in fit and forecasting accuracy are also fairly modest. We pursue an explanation for this result, since research in other contexts has shown clear advantages from assuming some type of heterogeneity in market response models. In an Afterthought section, we comment on the controversial nature of our result, distinguishing factors inherent to household-level data and associated models vs. general store-level data and associated models vs. the unique SCAN*PRO model specification.
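The homogeneous SCAN*PRO baseline discussed above is, in essence, OLS on a log-linearized multiplicative sales model. The following is a minimal sketch on synthetic weekly data; the variable names, effect sizes, and the reduced set of promotion covariates are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic weekly store data (names and effect sizes are illustrative only)
n_weeks = 104
log_price_index = rng.normal(0.0, 0.1, n_weeks)      # ln(actual price / regular price)
feature = rng.integers(0, 2, n_weeks).astype(float)  # feature-ad dummy
display = rng.integers(0, 2, n_weeks).astype(float)  # in-store display dummy
log_sales = 5.0 - 3.2 * log_price_index + 0.4 * feature + 0.3 * display \
            + rng.normal(0.0, 0.2, n_weeks)

# Homogeneous SCAN*PRO-style estimation: OLS on the log-linearized model
X = np.column_stack([np.ones(n_weeks), log_price_index, feature, display])
beta, *_ = np.linalg.lstsq(X, log_sales, rcond=None)

# beta[1] is the own-price (deal) elasticity around the regular price
print(f"estimated price elasticity: {beta[1]:.2f}")
```

The HB and FM variants compared in the paper would let `beta` vary by store (continuously or by segment) rather than fitting one pooled vector.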
Management of brand equity has come to be viewed as critical to a brand's optimal long-term performance. The authors evaluate the usefulness of brand equity estimates obtained from store-level data for monitoring the health of a brand. They use a random coefficients logit demand model calibrated on store-level scanner data to track brand equity estimates over time in two consumer packaged goods categories that experienced several new product introductions during the period of the empirical investigation. Using these tracked measures, the authors also study the impact of marketing actions, such as advertising, sales promotions, and product innovations, on brand equity. They find that the brand equity estimates effectively capture the high equity of strongly positioned popular brands and brands that command a significant price premium in niche markets. Using an example, the authors illustrate how these brand equity estimates can be used to monitor changes in brand equity, which measures such as market share may fail to capture. The substantive results indicate that advertising has a positive effect on brand equity in both product categories, whereas the effect of sales promotions is not significant in either category. Furthermore, the results reveal that new product innovations have a positive impact on brand equity and can explain a significant proportion of its variation. Overall, the analysis shows that a brand manager can track brand equity using store-level data, gain insights into the drivers of the brand's equity, and manage these drivers to achieve brand equity targets.
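The paper's model is a random coefficients logit; as a simplified, hypothetical illustration of the underlying idea, the sketch below recovers brand intercepts (an equity proxy net of price) from store-level shares using Berry's inversion for a plain homogeneous logit, on noise-free synthetic data. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic store-week shares for two brands plus an outside option
n_obs = 200
price = rng.uniform(1.0, 3.0, (n_obs, 2))
alpha_true = np.array([1.5, 0.8])       # brand intercepts: the "equity" to recover
beta_true = -1.2                        # price sensitivity
util = alpha_true + beta_true * price   # mean utility per brand
expu = np.exp(util)
share = expu / (1.0 + expu.sum(axis=1, keepdims=True))  # logit shares
share_out = 1.0 - share.sum(axis=1)                     # outside-good share

# Berry (1994) inversion for a plain logit: ln(s_j) - ln(s_0) = alpha_j + beta * price_j
y = np.log(share) - np.log(share_out)[:, None]
yy = y.reshape(-1)                                # stack brand 1, brand 2 per week
brand = np.tile(np.eye(2), (n_obs, 1))            # brand dummies in matching order
X = np.column_stack([brand, price.reshape(-1)])
coef, *_ = np.linalg.lstsq(X, yy, rcond=None)
print(coef)  # recovers [1.5, 0.8, -1.2]
```

Tracking `coef[:2]` over rolling windows is the spirit of the paper's equity monitoring, though the actual model adds consumer heterogeneity via random coefficients.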
The effects of topography on the radiometric properties of multispectral scanner (MSS) data are examined in the context of the remote sensing of forests in mountainous regions. The two test areas considered for this study are located in the coastal mountains of British Columbia, ...
This study characterises the ‘deal-proneness’ of consumers by analysis of the consumer-level characteristics of price sensitivity and brand loyalty. The study first develops a multinomial logistic (MNL) latent class model suitable for use with universal product code (UPC) point-of-sale (hypermarket) scanner data. The model is then used to assess the deal-proneness of consumers with respect to monetary promotions (price reductions) and non-monetary promotions (store flyers). The results show that almost 47% of consumers can be considered deal-prone, in that both kinds of sales promotions have a significant effect on their choice behaviour. The findings provide important insights for retail management in seeking to optimise the results obtained from promotional budgets.
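A latent class MNL of the kind described above mixes segment-specific logit probabilities by segment size. The sketch below computes choice probabilities for two hypothetical segments (a deal-prone and a loyal one); the prices, coefficients, and the reuse of the 47% figure as a segment share are illustrative assumptions only.

```python
import numpy as np

# Two latent classes with different price sensitivity and loyalty (illustrative values)
price = np.array([2.0, 2.5, 3.0])        # prices of three brands
loyalty = np.array([1.0, 0.0, 0.0])      # this consumer's loyalty indicator, brand 1

def mnl_prob(beta_price, beta_loyal):
    """Multinomial logit choice probabilities for one consumer."""
    u = beta_price * price + beta_loyal * loyalty
    e = np.exp(u - u.max())              # subtract max for numerical stability
    return e / e.sum()

# Deal-prone segment: strong price response, weak loyalty
p_deal = mnl_prob(beta_price=-2.0, beta_loyal=0.5)
# Loyal segment: weak price response, strong loyalty
p_loyal = mnl_prob(beta_price=-0.3, beta_loyal=2.5)

# Unconditional choice probabilities mix the segments by their sizes
share_deal = 0.47                        # e.g. a deal-prone share like the one reported
p_mix = share_deal * p_deal + (1 - share_deal) * p_loyal
print(p_mix.round(3))                    # mixture probabilities, summing to 1
```

Estimation in the study recovers the segment shares and coefficients jointly (typically via EM or maximum likelihood); the sketch only shows the probability structure being estimated.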
X-ray fluorescence (XRF) scanning of unlithified, untreated sediment cores is becoming an increasingly common method used to obtain paleoproxy data from lake records. XRF-scanning is fast and delivers high-resolution records of relative variations in the elemental composition of the sediment. However, lake sediments display extreme variations in their organic matter content, which can vary from just a few percent to well over 50%. As XRF scanners are largely insensitive to organic material in the sediment, increasing levels of organic material effectively dilute those components that can be measured, such as the lithogenic material (the closed-sum effect). Consequently, in sediments with large variations in organic material, the measured variations in an element will to a large extent mirror the changes in organic material. It is therefore necessary to normalize the elements in the lithogenic component of the sediment against a conservative element to allow changes in the input of the elements to be addressed. In this study we show that Al, which is the lightest element that can be measured using the Itrax XRF-scanner, can be used to effectively normalize the elements of the lithogenic fraction of the sediment against variations in organic content. We also show that care must be taken when choosing resolution and exposure time to ensure optimal output from the measurements.
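The closed-sum dilution and the Al normalization described above can be shown in a few lines. The element choice (Ti), count levels, and organic fractions below are synthetic placeholders; the point is only that the ratio to a conservative element cancels the dilution.

```python
import numpy as np

# Synthetic XRF counts: a lithogenic element (here Ti) and Al,
# both diluted by a varying organic fraction
organic_frac = np.array([0.10, 0.30, 0.55, 0.40, 0.15])
ti_input = np.full(5, 100.0)   # constant lithogenic Ti input to the lake
al_input = np.full(5, 500.0)   # conservative reference element

# Measured counts mirror the closed-sum dilution by organics
ti_counts = ti_input * (1.0 - organic_frac)
al_counts = al_input * (1.0 - organic_frac)

# Raw Ti falls as organics rise, even though Ti input never changed...
print(ti_counts)
# ...but the Ti/Al ratio removes the dilution and recovers the constant input signal
print(ti_counts / al_counts)   # constant 0.2 throughout
```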
The data acquired from the hyperspectral airborne sensor DAIS-7915 over the Izrael Valley in northern Israel was processed to yield quantitative soil property maps of organic matter, soil field moisture, soil saturated moisture, and soil salinity. The method adopted for this purpose was the Visible and Near Infrared Analysis (VNIRA) approach, which yields an empirical model for predicting the soil property in question from both wet chemistry and spectral information of a representative set of samples (calibration set). Based on spectral laboratory data that show a significant capability to predict the above soil properties and populations using the VNIRA strategy, the next step was to examine this feasibility under a hyperspectral remote sensing (HSR) domain. After atmospherically rectifying the DAIS-7915 data and omitting noisy bands, the VNIRA routine was performed to yield a prediction equation model for each property, using the reflectance image data. Applying this equation on a pixel-by-pixel basis revealed images that described spatially and quantitatively the surface distribution of each property. The VNIRA results were validated successfully from a priori knowledge of the area characteristics and from data collected from several sampling points. Following these examinations, a procedure was developed in order to create a soil property map of the entire area, including soils under vegetated areas. This procedure employed a random selection of more than 80 points along nonvegetated areas from the quantitative soil property images and interpolation of the points to yield an isocontour map for each property. It is concluded that the VNIRA method is a promising strategy for quantitative soil surface mapping; furthermore, the method could be improved even further if a better quality of HSR data were used.
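The two-step calibrate-then-predict workflow described above can be sketched with a plain linear calibration; the band count, coefficients, and noise level are invented, and the actual VNIRA procedure involves band selection and stepwise regression rather than this bare least-squares fit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic calibration set: reflectance in a few bands vs. a lab-measured property
# (band count and coefficients are illustrative, not the DAIS-7915 configuration)
n_samples, n_bands = 40, 6
reflectance = rng.uniform(0.1, 0.6, (n_samples, n_bands))
w_true = np.array([2.0, -1.5, 0.0, 3.0, 0.5, -0.8])
organic_matter = reflectance @ w_true + 1.0 + rng.normal(0.0, 0.05, n_samples)

# Calibration step: fit a linear prediction equation on the calibration set
X = np.column_stack([np.ones(n_samples), reflectance])
coef, *_ = np.linalg.lstsq(X, organic_matter, rcond=None)

# Prediction step: apply the equation pixel by pixel to image spectra
pixels = rng.uniform(0.1, 0.6, (3, n_bands))            # three example pixels
predicted = np.column_stack([np.ones(3), pixels]) @ coef
print(predicted.shape)                                  # one property value per pixel
```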
Consumers in developed countries are increasingly interested in the consumption of products incorporating ethical aspects, particularly fair trade products. They are usually distributed in a network of World Shops and more recently also introduced in supermarkets and shopping centres. The fair trade product with the highest share of the total is coffee. This study aims to ascertain the implicit price paid by Italian consumers for the fair trade content of coffee. The sample utilised is based on the purchase data of a representative sample of supermarket and shopping centre consumers observed from 1998 to 2002. Since scanner data are used, the analysis can allow for the numerous coffee attributes described by the labels: branded, organic, decaffeinated, fair trade, soluble, and so on. The empirical approach followed is the calculation of hedonic prices for the fair trade content and other attributes of coffee.
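A hedonic price calculation of the kind described regresses the shelf price on attribute indicators, so each coefficient is the implicit price of that attribute. The sketch below uses synthetic data with invented premiums; it is not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic coffee scanner records (attributes and premiums are illustrative)
n = 500
fair_trade = rng.integers(0, 2, n).astype(float)
organic = rng.integers(0, 2, n).astype(float)
decaf = rng.integers(0, 2, n).astype(float)
# Hedonic data-generating price: base price plus attribute premiums
price = 3.0 + 0.9 * fair_trade + 0.5 * organic + 0.3 * decaf + rng.normal(0.0, 0.2, n)

# Hedonic regression: each coefficient is the implicit price of that attribute
X = np.column_stack([np.ones(n), fair_trade, organic, decaf])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
print(f"implicit price of the fair trade attribute: {coef[1]:.2f}")
```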
In this paper we introduce a new flexible mixed model for multinomial discrete choice where the key individual- and alternative-specific parameters of interest are allowed to follow an assumption-free nonparametric density specification, while other alternative-specific coefficients are assumed to be drawn from a multivariate normal distribution, which eliminates the independence of irrelevant alternatives assumption at the individual level. A hierarchical specification of our model allows us to break down a complex data structure into a set of submodels with the desired features that are naturally assembled in the original system. We estimate the model using a Bayesian Markov Chain Monte Carlo technique with a multivariate Dirichlet Process (DP) prior on the coefficients with nonparametrically estimated density. We employ a "latent class" sampling algorithm which is applicable to a general class of models including non-conjugate DP base priors. The model is applied to supermarket choices of a panel of Houston households whose shopping behavior was observed over a 24-month period in 2004-2005. We estimate the nonparametric density of two key variables of interest: the price of a basket of goods based on scanner data, and driving distance to the supermarket based on their respective locations. Our semi-parametric approach allows us to identify a complex multi-modal preference distribution which distinguishes between inframarginal consumers and consumers who strongly value either lower prices or shopping convenience.
Knowledge of a high correlation between a consumer's residence and his place of grocery shopping has allowed researchers to use scanner data to assess the relationship between income and shopping behavior. This study addresses the shopping behavior of over 100,000 consumers who patronize six supermarkets weekly. Three of these supermarkets are best characterized as stores that service primarily lower-income shoppers, and three are best characterized as stores that service primarily higher-income shoppers. A key objective of this research is to determine if purchasing patterns differ for the two income groups and, if so, to determine if these differences are consistent with economic theory. The results show that the dominant income group for a given area makes purchase decisions that are so widespread and prominent that the confounding effects of other income shoppers are completely overshadowed. Simply stated, the statistical evidence is so strong that it overcomes all possible deviating effects which may result from data outliers.
Estimating permeability from NMR well logs or mobile NMR core scanner data is an attractive method as the measurements can be performed directly in the formation or on fresh cores right after drilling. Furthermore, the method is fast and non-destructive. Compared to T1 relaxation times, commonly measured T2 distributions are influenced by external and internal magnetic field gradients. We performed two-dimensional T1 and T2 relaxation experiments on samples of Rhaetian sandstone, a rock with low porosity and small pore radii, using a mobile NMR core scanner which operates within a nearly homogeneous static magnetic field. Because small pore sizes are associated with high internal magnetic field gradients, standard methods from NMR logging in the oil industry cannot be applied for accurate permeability prediction. Therefore, a new model theory was developed, which describes the pore radius dependence of the surface relaxivity ρ2 by both an analytical and a more practical empirical equation. Using corrected ρ2 values, permeability can be predicted accurately from the logarithmic mean of the T2 distribution and the physically based Kozeny-Carman equation. Additional core plug measurements of structural parameters such as porosity, permeability, specific inner surface area and pore radius distributions supported the NMR results.
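The quantity the prediction hinges on, the logarithmic mean of the T2 distribution, is simple to compute; feeding it into an SDR-style permeability relation k = C·φ⁴·T2lm² illustrates the general T2-to-permeability route. The distribution, the prefactor C, and the porosity below are placeholders, not the paper's calibrated model, which corrects the surface relaxivity for pore-size effects first.

```python
import numpy as np

# Illustrative T2 distribution: amplitudes over relaxation-time bins (synthetic)
t2_bins = np.logspace(-3, 0, 30)   # relaxation times in seconds
amplitudes = np.exp(-0.5 * ((np.log10(t2_bins) + 1.5) / 0.4) ** 2)  # one broad peak

# Logarithmic mean of the T2 distribution (amplitude-weighted geometric mean)
t2_lm = np.exp(np.sum(amplitudes * np.log(t2_bins)) / np.sum(amplitudes))

# SDR-style estimate k = C * phi**4 * T2lm**2; C and phi are illustrative values
C = 4.0e6          # mD / s^2, placeholder prefactor
phi = 0.08         # fractional porosity typical of a tight sandstone
k = C * phi**4 * t2_lm**2
print(f"T2_lm = {t2_lm * 1000:.1f} ms, k = {k:.4f} mD")
```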
Rapidly available and accurate information about the location and extent of avalanche events is important for avalanche forecasting, safety assessments for roads and ski resorts, verification of warning products, as well as for hazard mapping and avalanche model calibration/validation. Today, observations from individual experts in the field provide isolated information with very limited coverage. This study presents a methodology for an automated, systematic and wide-area detection and mapping of avalanche deposits using optical remote sensing data of high spatial and radiometric resolution. A processing chain, integrating directional, textural and spectral information, is developed using ADS40 airborne digital scanner data acquired over a test site near Davos, Switzerland. Though certain limitations exist, encouraging detection and mapping accuracies can be reported. The presented approach is a promising addition to existing field observation methods for remote regions, and can be applied in otherwise inaccessible areas.
A criticism of purchase-based brand loyalty measures is that they are confounded by the marketing mix variables that affect brand choice. This paper investigates the magnitude and direction of the associations for share of category requirements (SCR), defined as each brand's share among the group of households who bought the brand at least once during the time period under consideration.
While a significant literature has emerged recently on the longer-term effects of price promotions, as inferred from persistence models, very little if any attention has been paid to whether such longer-term effects vary across different types of consumers. This paper takes a first step in that direction by exploring whether the adjustment, permanent, and total effects of price promotions, and the duration of the adjustment period, differ between consumers segmented based on their usage rates in a product category and their loyalty to a brand. We also investigate whether such consumer segmentation will improve the forecasting performance of persistence models at both product category and brand levels. Expectations are developed based on consumer behavior theory on various effects of price promotions, such as the post-deal trough, the mere purchase effect, the promotion usage effect, and responsiveness to competitors' reactions. Evidence from household-level supermarket scanner data on four product categories is provided. We find substantial differences between consumer segments and provide insights on how managers can increase the longer-term effectiveness of price promotions by targeting each consumer segment with a different promotion program. In addition, consumer segmentation is found to significantly improve the forecasting performance of the persistence model for two of the four product categories. For the other two product categories, consumer segmentation provides forecasting performance similar to that obtained from aggregate-level persistence models.
This paper employs a nation-wide sample of supermarket scanner data to estimate a large brand-level demand system for beer in the U.S. using the Distance Metric method of Pinkse, Slade and Brett [Pinkse, J., Slade, M., Brett, C., 2002. Spatial price competition: a semiparametric approach. Econometrica 70]. Unlike previous studies, this work estimates the own- and cross-advertising elasticities in addition to price elasticities. Positive and negative cross-advertising elasticities imply the presence of both cooperative and predatory effects of advertising expenditures across brands; however, the former effect appears to dominate, suggesting that advertising increases the overall demand for beer. We discuss the implications of these results in this industry.
Brain-computer interfaces (BCIs) utilize neurophysiological signals originating in the brain to activate or deactivate external devices or computers. Different neuroelectric signals have been used to control external devices, including EEG oscillations, electrocorticograms (ECoGs) from implanted electrodes, event-related potentials (ERPs) such as the P300 and slow cortical potential (SCP), short latency subcortical potentials and visually evoked potentials, and action potential spike trains from implanted multielectrodes. In comparison, the development of BCIs based on metabolic activity of the brain using two different imaging methods, functional magnetic resonance imaging (fMRI) [2] and functional near infrared spectroscopy [3], has been more recent.
The National Aeronautics and Space Administration's Clouds and the Earth's Radiant Energy System (CERES) Project was designed to improve our understanding of the relationship between clouds and solar and longwave radiation. This is achieved using satellite broad-band instruments to map the top-of-atmosphere radiation fields with coincident data from satellite narrow-band imagers employed to retrieve the properties of clouds associated with those fields. This paper documents the CERES Edition-2 cloud property retrieval system used to analyze data from the Tropical Rainfall Measuring Mission Visible and Infrared Scanner and by the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on board the Terra and Aqua satellites covering the period 1998 through 2007. Two daytime retrieval methods are explained: the Visible Infrared Shortwave-infrared Split-window Technique for snow-free surfaces and the Shortwave-infrared Infrared Near-infrared Technique for snow or ice-covered surfaces. The Shortwave-infrared Infrared Split-window Technique is used for all surfaces at night. These methods, along with the ancillary data and empirical parameterizations of cloud thickness, are used to derive cloud boundaries, phase, optical depth, effective particle size, and condensed/frozen water path at both pixel and CERES footprint levels. Additional information is presented, detailing the potential effects of satellite calibration differences, highlighting methods to compensate for spectral differences and correct for atmospheric absorption and emissivity, and discussing known errors in the code. Because a consistent set of algorithms, auxiliary input, and calibrations across platforms is used, instrument- and algorithm-induced changes in the data record are minimized. This facilitates the use of the CERES data products for studying climate-scale trends.
We study the optimal levels of advertising and promotion budgets in dynamic markets with brand equity as a mediating variable. To this end, we develop and estimate a state-space model based on the Kalman filter that captures the dynamics of brand equity as influenced by its drivers, which include a brand's advertising and sales promotion expenditures. By integrating the Kalman filter with the random coefficients logit demand model, our estimation allows us to capture the dynamics of brand equity as well as model consumer heterogeneity using store-level data. Using these demand model estimates, we determine the Markov Perfect Equilibrium advertising and promotion strategies. Our empirical analysis is based on store-level scanner data in the orange juice category, which comprises two major brands: Tropicana and Minute Maid. The calibration of the demand model reveals that sales promotions have a significant positive effect on consumers' utility and induce consumers to switch to the promoted brand. However, there is also a negative effect of promotions on brand equity that carries over from period to period. Overall, we find that while sales promotions have a net positive impact both in the short term and in the long term, the implied total elasticity including the long-term effect is smaller than the short-term elasticity. Correspondingly, we expect myopic decision-makers to allocate higher than optimal expenditures to sales promotions. Our results from the supply side analysis reveal that the equilibrium forward-looking promotion levels are higher for Minute Maid, the brand for which the adverse long-term effect of promotions is lower. However, the observed promotion levels are higher for Tropicana compared to Minute Maid, a result consistent only with the myopic case. Further, our results reveal that the actual promotion levels for both brands are higher than the optimal budgets for the forward-looking as well as the two-year planning horizon scenarios.
Hence, it may be profitable for both brands to reduce their promotion levels. The equilibrium forward-looking advertising levels are higher for Tropicana, the brand that has a higher responsiveness to advertising. Further, the optimal forward-looking advertising levels are higher than the optimal budgets for the myopic as well as the two-year planning horizon scenarios. However, the actual advertising expenditures for both brands are higher than these optimal levels. Hence, it may be optimal for both brands to reduce their advertising levels.
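The filtering step at the heart of a state-space model like the one described can be sketched as a scalar Kalman filter tracking a latent brand-equity state driven by advertising. All parameters and the one-dimensional setup below are illustrative assumptions; the paper's model is multivariate and estimates these quantities jointly with the demand side.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic latent brand-equity process: carryover plus advertising effect
n_weeks = 120
carryover, ad_effect = 0.9, 0.3
advertising = rng.uniform(0.0, 1.0, n_weeks)
equity = np.zeros(n_weeks)
for t in range(1, n_weeks):
    equity[t] = carryover * equity[t - 1] + ad_effect * advertising[t] + rng.normal(0, 0.05)
observed = equity + rng.normal(0.0, 0.2, n_weeks)   # noisy demand-side signal

# Scalar Kalman filter with (here, known) transition parameters
q, r = 0.05**2, 0.2**2        # state and observation noise variances
m, p = 0.0, 1.0               # filtered mean and variance
filtered = np.zeros(n_weeks)
for t in range(n_weeks):
    # Predict step: propagate last week's estimate through the transition equation
    m_pred = carryover * m + ad_effect * advertising[t]
    p_pred = carryover**2 * p + q
    # Update step: blend prediction with this week's observation
    k_gain = p_pred / (p_pred + r)
    m = m_pred + k_gain * (observed[t] - m_pred)
    p = (1.0 - k_gain) * p_pred
    filtered[t] = m

# Filtering should track the latent equity better than the raw observations
err_filter = np.mean((filtered - equity) ** 2)
err_raw = np.mean((observed - equity) ** 2)
print(err_filter < err_raw)
```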
The workhorse brand choice models in marketing are the multinomial logit (MNL) and nested multinomial logit (NMNL). These models place strong restrictions on how brand share and purchase incidence price elasticities are related. In this paper, we propose a new model of brand choice, the “price consideration” (PC) model, that allows more flexibility in this relationship. In the PC model,
We propose that consumers use the presence of a restriction (i.e., purchase limit, purchase precondition, or time limit) as a source of information to evaluate a deal. In a series of four studies we present evidence suggesting that restrictions serve to accentuate deal value and act as "promoters" of promotions. We begin by using aggregate-level scanner data to test our hypothesis that a sales restriction (e.g., "limit X per customer") results in higher sales. Via three subsequent experiments, we then investigate contextual and individual factors moderating this effect. Study 2 suggests that restrictions only have a positive effect for low-need-for-cognition individuals. Study 3 explores the potential mediating role of deal evaluations on purchase intent across discount levels. Study 4 examines the effect of three types of restrictions (purchase limits, time limits, and purchase preconditions) across discount levels and explores the underlying beliefs driving these effects. An integrative model across studies demonstrates the robustness of the restriction effect and supports the premise that restrictions work through signaling value. Implications for how consumers determine promotional value are discussed.
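The aggregate-level test described in the first study amounts to asking whether a restriction dummy lifts sales after controlling for the discount. A minimal sketch on synthetic data, with invented effect sizes and variable names:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic weekly aggregates for a promoted item (effect sizes are illustrative)
n = 300
discount = rng.uniform(0.0, 0.4, n)             # fractional price discount
limit = rng.integers(0, 2, n).astype(float)     # 1 if "limit X per customer" is posted
log_units = 2.0 + 3.0 * discount + 0.25 * limit + rng.normal(0.0, 0.3, n)

# Restriction-effect test: does the limit dummy lift sales beyond the discount?
X = np.column_stack([np.ones(n), discount, limit])
coef, *_ = np.linalg.lstsq(X, log_units, rcond=None)
print(f"restriction lift (log units): {coef[2]:.2f}")
```

A positive, significant `coef[2]` is the pattern the authors report; the subsequent experiments probe who responds and why.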
The purpose of this paper is to examine the dynamic effects of sales promotions. We create dynamic brand sales models (for weekly store-level scanner data) by relating store intercepts and a brand's own price elasticity to a measure of the cumulated previous price discounts (amount and time) for that brand as well as for other brands. The brand's own non-price promotional response parameters are related to the time since the most recent promotion for that brand as well as for other brands. We demonstrate that these dynamic econometric models provide greater managerial relevance than static models can.
Over the past two decades, marketing scientists in academia and industry have employed consumer choice models calibrated using supermarket scanner data to assess the impact of price and promotion on consumer choice, and they continue to do so today. Despite the extensive usage of scanner panel data for choice modeling, very little is known about the impact of data preparation strategies on the results of modeling efforts. In most cases, scanner panel data is pruned prior to model estimation to eliminate less significant brands, sizes, product forms, etc., as well as households with purchase histories not long enough to provide information on key consumer behavior concepts such as loyalty, variety seeking, and brand consideration. Further, product entity aggregation is usually also part of data preparation, since hundreds of SKUs are available as choice alternatives in many product categories. This study conducts an extensive simulation experiment to investigate the effects of data pruning and entity aggregation strategies on estimated price and promotion sensitivities. Characteristics of the data that may moderate the effects of data preparation strategies are also manipulated. The results show that data preparation strategies can result in significant bias in estimated parameters. Based on the results, we make recommendations on how the model builder can prepare scanner panel data so as to avoid significant biases in estimated price and promotion responses.
In Austria about half of the entire area (46%) is covered by forests. The majority of these forests are highly managed and controlled in growth. Besides timber production, forest ecosystems play a multifunctional role including climate control, habitat provision and, ...
Cognitive deficits in Huntington's disease (HD) have been attributed to neuronal degeneration within the striatum; however, postmortem and structural imaging studies have revealed more widespread morphological changes. To examine the impact of HD-related changes in regions outside the striatum, we used functional magnetic resonance imaging (fMRI) in HD to examine brain activation patterns using a Simon task that required a button press response to either congruent or incongruent arrow stimuli. Twenty mild to moderate stage HD patients and 17 healthy controls were scanned using a 3 T GE scanner. Data analysis involved the use of statistical parametric mapping software with a random effects analysis model to investigate group differences in brain activation patterns relative to baseline. HD patients recruited frontal and parietal cortical regions to perform the task, and also showed significantly greater activation, compared to controls, in the caudal anterior cingulate, insula, inferior parietal lobules, superior temporal gyrus bilaterally, right inferior frontal gyrus, right precuneus/superior parietal lobule, left precentral gyrus, and left dorsal premotor cortex. The significantly increased activation in anterior cingulate-frontal-motor-parietal cortex in HD may represent a primary dysfunction due to direct cell loss or damage in cortical regions, and/or a secondary compensatory mechanism of increased cortical recruitment due to primary striatal deficits.
Though it has been widely reported in the marketing literature that temporary price discounts generate substantial short-term sales increases, the shape of the deal effect curve constitutes a key research topic for which there are still limited empirical results. To address this issue, a semiparametric regression approach is used to model the complex nature of this phenomenon. Our model is developed at the brand level using daily store-level scanner data, which allows the study of several previously unreported promotional effects, such as the influence of the day of the week in both promotional and nonpromotional periods. The results show that the weekend is the most effective period for increasing promotional sales and that asymmetric and neighborhood effects hold. However, 9-ending promotional prices are not impactful.
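The nonparametric shape of a deal effect curve can be illustrated with a Nadaraya-Watson kernel smoother; this simple estimator, the Gaussian kernel, the bandwidth, and the data points are stand-ins for the paper's semiparametric specification.

```python
import math

def kernel_smooth(depths, lifts, x0, bandwidth=0.05):
    """Estimate the sales lift at discount depth x0 by locally
    weighting observed (depth, lift) pairs with a Gaussian kernel.
    An illustrative stand-in, not the paper's estimator."""
    weights = [math.exp(-0.5 * ((d - x0) / bandwidth) ** 2) for d in depths]
    return sum(w * y for w, y in zip(weights, lifts)) / sum(weights)

# hypothetical observations: discount depth -> multiplicative sales lift
depths = [0.05, 0.10, 0.15, 0.20, 0.25]
lifts = [1.1, 1.4, 1.9, 2.6, 3.0]
print(kernel_smooth(depths, lifts, 0.12))
```

Because the curve is estimated locally, no parametric shape (linear, log, S-shaped) is imposed on the deal effect.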
We investigate the theoretical possibility and empirical regularity of two troublesome anomalies that frequently arise when cross-price elasticities are estimated for a set of brands expected to be substitutes. These anomalies are the occurrence of: (a) negatively signed cross-elasticities; and (b) sign asymmetries in pairs of cross price elasticities. Drawing upon the Slutsky equation from neoclassical demand theory, we show how and why these anomalies may occur when cross elasticities are estimated for pairs of brands that are substitutes. We empirically examine these issues in the context of the widely used Multiplicative Competitive Interaction (MCI) and Multinomial Logit (MNL) specifications of the fully extended attraction models (Cooper and Nakanishi 1988). Utilizing a database of store-level scanner data for 25 categories and 127 brands of frequently purchased branded consumer goods, we find that about 18% of a total of 732 cross elasticity estimates are negative and approximately 40% of the 366 pairs of cross elasticities are sign asymmetric. Finally, we find that the occurrence of negatively signed cross elasticities can be partially explained by a set of hypothesized relationships between cross-price elasticities and brand share and elasticities of income and category demand.
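Why these anomalies are noteworthy can be seen from a plain (non-extended) MNL attraction model, where cross elasticities are structurally positive and symmetric whenever the price coefficient is negative; the coefficient and prices below are hypothetical.

```python
import math

def mnl_shares(prices, beta=-2.0):
    """Market shares from a simple MNL attraction model with
    attraction_i = exp(beta * price_i) (brand intercepts omitted
    for brevity; beta and prices are hypothetical)."""
    attraction = [math.exp(beta * p) for p in prices]
    total = sum(attraction)
    return [a / total for a in attraction]

def cross_elasticity(prices, j, beta=-2.0):
    """Elasticity of any competitor's share w.r.t. brand j's price.
    In the plain MNL this equals -beta * price_j * share_j for every
    i != j: with beta < 0 it is always positive and identical across
    competitors, so negative or asymmetric cross elasticities can
    only arise in the fully *extended* attraction models studied."""
    shares = mnl_shares(prices, beta)
    return -beta * prices[j] * shares[j]

print(cross_elasticity([1.0, 1.0], j=1))  # 1.0 for two identical brands
```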
In this paper we present thermal characteristics of coal fires as measured during simulated fires under an experimental setting in Germany in July 2002. It is thus a continuation of the previously published paper "Thermal surface characteristics of coal fire 1: Results of in-situ measurement", in which we presented temperature measurements of real subsurface coal fires in China [Zhang, J., Kuenzer, C., Thermal Surface Characteristics of Coal Fires 1: Results of in-situ measurements, accepted for publication in the Journal of Applied Geophysics]. The focus is on simulated coal fires, which are less complex in nature than fires under natural conditions. In the present study we simulated all the influences usually occurring under natural conditions in a controllable manner (uniform background material of known thermal properties, known ventilation pathways, homogeneous coal substrate), creating two artificial outdoor coal fires under simplified settings. One surface coal fire and one subsurface coal fire were observed over the course of 2 days. The setup of the fires allowed for measurements not always feasible under "real" in-situ conditions: thus, compared to the in-situ investigations presented in paper one, we could retrieve numerous temperature measurements inside the fires. Single temperature measurements, diurnal profiles and airborne thermal surveying present the typical temperature patterns of a small surface and a subsurface fire under undisturbed conditions (easily accessible terrain, 24-hour measurement periods, homogeneous materials). We found that the outside air temperature does not influence the fire's surface temperature (up to 900°C), while fire centre temperatures of up to 1200°C strongly correlate with surface temperatures of the fire. The fires could heat their surroundings up to a distance of 4.5 m.
However, thermal anomalies on the background surface persist only as long as the fire is burning and disappear very fast once the heat source is removed. Furthermore, heat outside of the fires is transported mainly by convection and not by radiation. In spatial thermal line scanner data the diurnal thermal patterns of the coal fire are clearly represented. Our experiments during that data collection also visualize the thermal anomaly differences between covered (underground) and uncovered (surface) coal fires. The latter could not be observed in-situ in a real coal fire area. Subsurface coal fires express a much weaker signal than open surface fires and contrast by only a few degrees against the background. In airborne thermal imaging scanner data the fires are also well represented. Here we could show that the mid-infrared domain (3.8 μm) is more suitable for picking up very hot anomalies than the common thermal (8.8 μm) domain. Our results help to understand coal fires and their thermal patterns as well as the limitations occurring during their analysis. We believe that the results presented here can help in the planning of coal fire thermal mapping campaigns, including remote sensing methods, and that the thermal data can be included in numerical coal fire modelling as initial or boundary conditions.
While a significant literature has emerged recently on the longer-term effects of price promotions, as inferred from persistence models, very little if any attention has been paid to whether such longer-term effects vary across different types of consumers. This paper takes a first step in that direction by exploring whether the adjustment, permanent, and total effects of price promotions, and the duration of the adjustment period, differ between consumers segmented based on their usage rates in a product category and their loyalty to a brand. We also investigate whether such consumer segmentation will improve the forecasting performance of persistence models at both product category and brand levels. Expectations are developed based on consumer behavior theory on various effects of price promotions, such as the post-deal trough, the mere purchase effect, the promotion usage effect, and responsiveness to competitors' reactions. Evidence from household-level supermarket scanner data on four product categories is provided. We find substantial differences between consumer segments and provide insights on how managers can increase the longer-term effectiveness of price promotions by targeting each consumer segment with a different promotion program. In addition, consumer segmentation is found to significantly improve the forecasting performance of the persistence model for two of the four product categories. For the other two product categories, consumer segmentation provides forecasting performance similar to that obtained from aggregate-level persistence models.
Landmark-based navigation of autonomous mobile robots or vehicles has been widely adopted in industry. Such a navigation strategy relies on the identification and subsequent recognition of distinctive environment features or objects that are either known a priori or extracted dynamically. This process has inherent difficulties in practice due to sensor noise and environment uncertainty. This paper proposes a navigation algorithm that simultaneously localizes the robot and updates landmarks in a manufacturing environment. A key issue addressed is how to improve localization accuracy for mobile robots in continuous operation; the Kalman filter algorithm is adopted to integrate odometry data with scanner data to achieve the required robustness and accuracy. Kohonen neural networks are used to recognize landmarks from scanner data in order to initialize and recalibrate the robot position by means of triangulation when necessary.
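The core odometry/scanner fusion step can be sketched as a scalar Kalman update; this 1-D illustration (with hypothetical numbers) shows how the scanner fix corrects the drifting odometry estimate, whereas the paper's filter operates on the full robot pose.

```python
def kf_fuse(x_pred, var_pred, z_scan, var_scan):
    """One scalar Kalman-filter update: fuse the odometry-based
    position prediction (x_pred, var_pred) with a scanner-derived
    position fix (z_scan, var_scan)."""
    gain = var_pred / (var_pred + var_scan)   # Kalman gain
    x = x_pred + gain * (z_scan - x_pred)     # corrected position
    var = (1.0 - gain) * var_pred             # variance shrinks after fusion
    return x, var

# odometry predicts 10.0 m (var 1.0); scanner triangulation reads 12.0 m (var 1.0)
print(kf_fuse(10.0, 1.0, 12.0, 1.0))  # (11.0, 0.5)
```

With equal variances the estimate lands halfway between the two sources, and the fused variance is smaller than either input, which is the accuracy gain the integration is after.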
Using supermarket scanner data, we test a variety of hypotheses from trade journals about the invasion of private-label food products. According to conventional industry wisdom, name-brand firms defended their brands against new private-label products by lowering their prices, engaging in additional promotional activities, and increasingly differentiating their products. Our empirical evidence is inconsistent with these beliefs.
The advent of very high-resolution satellite programs and digital airborne cameras with ultra high resolution offers new possibilities for very accurate mapping of the environment. With these sensors of improved spatial resolution, however, the user community faces a new problem in the analysis of this type of image data. Standard classification techniques have to be augmented with appropriate analysis procedures because the required homogeneity of land-use/land-cover classes can no longer be achieved by the integration effect of large pixel sizes (e.g., 20–80 m). New intelligent techniques will have to be developed that make use of multisensor approaches, geographic information system (GIS) integration and context-based interpretation schemes.
Stimulation of the vagus nerve in the neck can reduce seizures in epilepsy patients, and may be helpful in treating depression. PET studies have shown that vagus nerve stimulation (VNS) in epilepsy patients causes acute dose (intensity) dependent changes in regional cerebral blood flow. We sought to use the newly developed VNS synchronized fMRI technique to examine whether VNS BOLD signal changes depend on the frequency of stimulation. Six adults with recurrent depression were scanned inside a 1.5 T MR scanner. Data were acquired at rest, with the VNS device on for 7 s, and also, for comparison, while the patient listened to a tone for 7 s. In two separate back-to-back sessions, the VNS stimulation frequency was set to either 5 or 20 Hz. Data were transformed into Talairach space and then compared by condition. Compared to 5 Hz, 20 Hz VNS produced more acute activity changes from rest in regions similar to our initial VNS synchronized fMRI feasibility study in depression. Brain regions activated by hearing a tone were also greater when VNS was intermittently being applied at 20 Hz than at 5 Hz. In depressed adults, left cervical VNS causes regional brain activity changes that depend on the frequency of stimulation or total dose, or both. In addition to the acute immediate effects of VNS on regional brain activity, this study suggests further that VNS at different frequencies likely has frequency- or dose-dependent modulatory effects on other brain activities (e.g. hearing a tone).
In 2006 the FDA announced that consumers should not eat fresh spinach in the wake of a large foodborne illness outbreak of E. coli O157:H7. This paper investigates the response of consumers to the announcement. We use an AIDS demand model with 5 food safety shock variables and retail scanner data to analyze market response. Even fifteen months after the outbreak, predicted sales of spinach in bags were still down 10 percent from what they would have been in the absence of the food safety shock. After the outbreak, consumers shifted to other leafy greens such as bulk iceberg lettuce, other bulk lettuce, and bagged salads without spinach.
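The AIDS (Almost Ideal Demand System) share equation underlying such an analysis can be sketched as follows; all coefficient values and goods are hypothetical illustrations, not estimates from the paper, and the food-safety shock variables would enter as additional shifters.

```python
import math

def aids_budget_share(alpha, gammas, beta, prices, expenditure, price_index):
    """Budget share of one good in an AIDS demand system:
    w_i = alpha_i + sum_j gamma_ij * ln(p_j) + beta_i * ln(x / P),
    where x is total expenditure and P a price index."""
    return (alpha
            + sum(g * math.log(p) for g, p in zip(gammas, prices))
            + beta * math.log(expenditure / price_index))

# hypothetical share of bagged spinach given two leafy-green prices
print(aids_budget_share(0.2, [0.05, -0.03], 0.01, [2.0, 1.5], 100.0, 1.0))
```

When all log-price terms and the real-expenditure term are zero, the share collapses to the intercept alpha, which is a quick sanity check on any implementation.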
Determining the impacts on consumers of government policies affecting the demand for food products requires a theoretically consistent micro-level demand model. We estimate a system of demands for weekly city-level dairy product purchases by nonlinear three stage least squares to account for joint determination between quantities and prices. We analyze the distributional effects of federal milk marketing orders, and find results that vary substantially across demographic groups. Families with young children suffer, while wealthier childless couples benefit. We also find that households with lower incomes bear a greater regulatory burden due to marketing orders than those with higher income levels.
This paper presents a research agenda for making scanner data more useful to managers. Recommendations are generated in three areas. First, we draw on three case examples to distill a list of characteristics of research that encourage managerial impact. Second, we generate a master list of managerial research issues, and then, compile a list of nine priority topics. Third, we define a set of research topics on translating research into action. These topics involve the manner in which managers react to research and how they communicate research findings with other managers. The theme throughout is that a combination of rigor and relevance is a potent force for improving managerial practice.
Retailers who wish to make decisions for a single store, and who have access both to the scanner data of all purchases and to the scanner data of customer card-holders, may worry about erroneous inferences when relying on only one of the two databases and when using the same models to estimate the effects of their main marketing variables. The questions are: ''do the transactions reflected by the customer cards necessarily reflect the usual purchase behavior of all customers?''; and ''do the same customer response models apply equally to regular customers and to the rest of the individuals shopping at the store?''. To answer these questions, a multinomial logit brand choice model is applied to a chosen product category (ground coffee). The major findings are that the regular price elasticity of all brands in the card-holder segment is twice that estimated from total purchases, and that the effects of brand and type of coffee are greater in the card-holder segment than in total purchases.
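How the same multinomial logit specification can imply very different price elasticities across the two databases can be sketched with the standard own-price elasticity formula; the prices and the two beta values below are hypothetical, chosen only to mimic a more price-sensitive card-holder segment.

```python
import math

def mnl_own_price_elasticity(prices, i, beta):
    """Own-price elasticity of brand i's choice share in a
    multinomial logit model: beta * price_i * (1 - share_i)."""
    utilities = [beta * p for p in prices]
    denom = sum(math.exp(u) for u in utilities)
    share = math.exp(utilities[i]) / denom
    return beta * prices[i] * (1.0 - share)

prices = [2.0, 1.8, 2.2]                                 # three coffee brands
all_buyers = mnl_own_price_elasticity(prices, 0, beta=-1.0)
card_holders = mnl_own_price_elasticity(prices, 0, beta=-2.0)
print(all_buyers, card_holders)
```

A more negative beta, as the card-holder calibration might yield, roughly doubles the elasticity magnitude, matching the pattern the abstract reports.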
Fracture aperture is usually estimated by the cubic law, which assumes flow between two smooth parallel plates. However, many researchers have shown that fracture surfaces are not smooth but have tortuous paths and roughness, and hence the flow behavior is different. Previous research showed that fracture aperture follows a lognormal distribution. Nevertheless, no research has been conducted to validate the fracture aperture distribution under changing stress conditions, which are common in fractured reservoirs. With the advent of the X-ray CT scanner in the field of petroleum engineering, fracture apertures can be visualized and measured. Since there is no direct calculation of fracture aperture from CT scanner data, a calibration curve needs to be established. We developed a calibration curve based on existing calibration techniques, which involves area integration of the fracture region to obtain a correlation between integrated CT numbers and the calibrated fracture aperture. Using this calibration curve, we obtained distribution patterns for fracture apertures along the length of the core for various stress conditions, from about six thousand fracture aperture measurements for each stress condition. The results show that aperture distributions still follow a lognormal distribution under various stress conditions.
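The area-integration calibration step can be sketched in two parts: integrating CT numbers across the fracture region, then mapping the integral to an aperture through a fitted curve. The linear calibration form, slope, intercept, and CT profile below are hypothetical; in practice the curve is fitted on cores with known apertures.

```python
def integrated_ct(ct_profile, pixel_size_mm):
    """Area-integrate CT numbers across a transect of the fracture
    region (simple Riemann sum), the quantity correlated with
    aperture in the calibration described above."""
    return sum(ct_profile) * pixel_size_mm

def aperture_from_ct(ct_integral, slope, intercept):
    """Map the integrated CT number to a fracture aperture via a
    calibration curve, assumed linear here for illustration."""
    return slope * ct_integral + intercept

ct_profile = [120.0, 480.0, 600.0, 450.0, 110.0]   # hypothetical CT numbers
integral = integrated_ct(ct_profile, pixel_size_mm=0.25)
print(aperture_from_ct(integral, slope=0.0005, intercept=0.02))
```

Repeating this per slice along the core yields the aperture samples whose distribution is then tested against the lognormal under each stress condition.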
Food manufacturers have an incentive to include nutrient content claims, health claims, or other types of labeling statements on foods if they believe that consumers will be willing to pay more for products with specific attributes. We estimated semi-log hedonic price regressions for five breakfast bar and cereal product categories using Nielsen ScanTrack scanner data for 2004 and found that labeling statements for these foods are often associated with substantial increases in consumer willingness to pay. The largest effects were associated with "carb-conscious" carbohydrate labeling (reflecting the time period of the data), followed by fat and sugar content labeling statements.
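In a semi-log hedonic regression, the premium implied by a 0/1 labeling-claim dummy is read off its coefficient as a percentage price change; the coefficient value below is hypothetical, not an estimate from the paper.

```python
import math

def claim_premium_pct(coefficient):
    """Percentage price premium implied by a dummy-variable
    coefficient b in a semi-log hedonic regression
    ln(price) = Xb + e: 100 * (exp(b) - 1)."""
    return 100.0 * (math.exp(coefficient) - 1.0)

# e.g. a hypothetical fitted coefficient of 0.12 on a "low-carb" claim dummy
print(claim_premium_pct(0.12))
```

Using exp(b) - 1 rather than b itself matters for larger coefficients, where the common "b equals percent" shortcut overstates or understates the premium.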
This paper uses over two years of weekly scanner data from two small US cities to characterize time and state dependence of grocers' pricing decisions. In these data, the probability of a nominal adjustment declines with the time since the last price change even after controlling for heterogeneity across store-product cells and replacing sale prices with regular prices. We also detect state dependence: The probability of a nominal adjustment is highest when a store's price substantially differs from the average of other stores' prices. However, extreme prices typically reflect the selling store's recent nominal adjustments rather than changes in other stores' prices.
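The declining-hazard finding can be made concrete with the empirical hazard of a price change by spell duration; the estimator below is the standard life-table hazard, and the spell data are toy values, not the paper's.

```python
def empirical_hazard(spell_lengths):
    """Empirical hazard of a price change: among price spells still
    unchanged entering week t, the share that change in week t.
    `spell_lengths` are completed durations in weeks between
    consecutive price changes for one store-product cell."""
    hazard = {}
    for t in range(1, max(spell_lengths) + 1):
        at_risk = sum(1 for d in spell_lengths if d >= t)
        ending = sum(1 for d in spell_lengths if d == t)
        hazard[t] = ending / at_risk
    return hazard

# hazard per week of duration for some toy spell lengths
print(empirical_hazard([1, 1, 1, 2, 2, 4]))
```

The paper's point is that this hazard still declines with duration after conditioning on cell heterogeneity and stripping out sale prices, which rules out the simplest explanations for the pattern.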