This study addresses the roles of digital avatars in communication in two distinct ways. The first part discusses what kinds of meanings avatars carry for their users. To answer this question, a new proposition is put forward based on the semiotic theories of Saussure and Lacan: a new perspective is offered by combining Saussure's theory of the sign with Lacan's theory of the chain of signifiers as an entry into selfhood. The second part discusses the meanings avatars hold for receivers in digital communication through Berger's Uncertainty Reduction Theory.
This study deals with gathering information on how interpersonal relationships develop on ephemeral media platforms. Self-disclosure plays a vital role in the interpersonal relationships formed on these platforms. Ephemeral media platforms such as Snapchat, Facebook Stories and Instagram Stories are the main focus, as these three are the major known platforms for ephemeral content. Social Penetration Theory and Uncertainty Reduction Theory informed both the survey used for data gathering and the guide for the focus group discussions this study conducted to obtain both general and specific information. Exactly 134 respondents answered the survey, and two focus group discussions were conducted. The common themes, as well as the most used ephemeral media, were observed and interpreted in the study.
I developed an original classroom activity that applies Uncertainty Reduction Theory to initial relationship development using "Hitch." Using my "Uncertainty Reduction Strategies" worksheet to screen a scene between the film's protagonists (played by Eva Mendes and Will Smith), students discover that initiating interpersonal relationships is both rule-governed and imaginative.
Uncertainty is a pervasive feature of our times. In this paper I argue that an age of uncertainty is not a reason for despair, but rather an opportunity to develop and mobilize our capacities for creativity, complexity, and collaboration. Creativity because the future is not determined and has to be created. Complex thought because in an uncertain, networked world our thinking cannot be reductive and disjunctive, but must be able to address interconnectedness and uncertainty. Collaboration, because the crises facing humanity today require an awareness of our shared destiny and in a pluralistic society we need the ability to navigate creativity and complexity together.
Evaluation of slope stability is one of the day-to-day practices of geotechnical engineers, and different methods are now available to evaluate the stability of a particular slope. Despite the advances that have been made in site exploration, evaluating the stability of slopes remains a challenge. Recently, Ethiopia has been constructing newly planned railway routes to connect the country's development centers and link with the ports of neighboring countries. However, these routes will pass through the heart of highly fragile mountainous terrain and earthquake-prone regions. Therefore, the prime objective of this paper is to investigate the stability of a railway embankment using three different stochastic approaches (First Order Reliability Method, Point Estimate Method and Monte Carlo Simulation) with commercially available finite element and finite difference programs. Moreover, the seismic response of the railway embankment was studied using a nonlinear analysis program (FLAC2D v 7.0). The first order reliability method (FORM), Monte Carlo Simulation (MCS) and point estimate method (PEM) gave probabilities of failure of 3.2%, 4.14% and 1.5% respectively. Meanwhile, no indication of liquefaction was observed, owing to the stiff foundation clay soils and the deep groundwater table.
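The Monte Carlo branch of the three stochastic approaches can be sketched in a few lines. The infinite-slope model, soil statistics and all parameter values below are illustrative assumptions, not the embankment model or data from the paper; only the sampling logic (draw random soil parameters, count realizations with FS below 1) reflects the method.

```python
import math
import random

random.seed(0)

def factor_of_safety(c, phi_deg, gamma=18.0, H=5.0, beta_deg=30.0):
    """Infinite-slope factor of safety (illustrative model, not the paper's).

    c: cohesion (kPa) and phi_deg: friction angle (deg) are the random
    soil parameters; gamma (kN/m^3), H (m) and beta (deg) are fixed.
    """
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + gamma * H * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * H * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Monte Carlo Simulation: sample the soil parameters, count FS < 1 cases.
N = 100_000
failures = sum(
    factor_of_safety(random.gauss(10.0, 3.0), random.gauss(28.0, 3.0)) < 1.0
    for _ in range(N)
)
pf = failures / N
print(f"Estimated probability of failure: {pf:.2%}")
```

FORM and PEM replace the raw sampling with analytical approximations of the same failure probability, which is why the three methods give similar but not identical answers.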
A strategy for adaptive control and energetic optimization of aerobic fermentors was implemented, with both air flow and agitation speed as manipulated variables. The strategy is separable into its components: control, optimization, and estimation. Parameter estimation (based on the usual KLa correlation) was optimized using sinusoidal excitation of air flow and agitation speed, and was implemented through a recursive least squares algorithm with a forgetting factor. Separate trials were carried out on the control, optimization, and estimation algorithms, using an original computational simulation environment with noise- and delay-generating facilities for data sampling and filtering.
Our results show the convergence and robustness of the estimation algorithm, improved by the use of both the forgetting factor and KLa dead-band facilities. The control algorithm compares favorably with PID using the integrated-area criterion for the deviation between oxygen molarity and the critical molarity (set point). The optimization algorithm clearly reduces energy consumption while respecting the critical molarity. Integration of the control, optimization, and adaptive algorithms was implemented, but future work is needed on stability; methods for stability improvement were defined and implemented. Data acquisition and computer manipulation of air flow and agitation speed were also implemented for actual fermentors.
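The recursive least squares estimator with forgetting factor mentioned above follows a standard recursion, sketched here on synthetic data. The log-linearised KLa regressors and all numerical values are assumptions for illustration; only the RLS update itself is the standard algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

def rls_forgetting(X, y, lam=0.98):
    """Recursive least squares with forgetting factor lam (0 < lam <= 1)."""
    n_params = X.shape[1]
    theta = np.zeros(n_params)
    P = np.eye(n_params) * 1e3                # large initial covariance
    for x_t, y_t in zip(X, y):
        K = P @ x_t / (lam + x_t @ P @ x_t)   # gain vector
        theta = theta + K * (y_t - x_t @ theta)
        P = (P - np.outer(K, x_t @ P)) / lam  # discount old information
    return theta

# Synthetic log-linearised correlation: y = a*u1 + b*u2 + noise,
# with u1, u2 standing in for log air flow and log agitation speed.
true_theta = np.array([0.5, 1.2])
X = rng.uniform(0.5, 2.0, size=(500, 2))
y = X @ true_theta + rng.normal(0.0, 0.01, 500)
est = rls_forgetting(X, y)
print(est)
```

The forgetting factor discounts old samples geometrically, which is what lets the estimator track slowly drifting KLa parameters during a fermentation run.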
In this essay, we will try to see uncertainty in the art market positively, as an essential element of the creative process and therefore as a natural component of its organisation. First, we will analyse the nature of uncertainty, as being both qualitative and quantitative (I). Then we will distinguish the management of uncertainty on two timescales: the short-term, with the daily strategies developed by the producer to reduce uncertainty (II), and the long-term, in which uncertainty is the raison d’être of the art market (III).
Developmental disorders such as autism have generally been theorized as due to some kind of modular “deficit” or “dysfunction”—typically of cortical origin, i.e., failures of “theory of mind”, of the “mirror neuron system”, of “weak central coherence” or of the balance of “empathizing” and “systemizing”, to name just a few. The broad array of autonomic and sensorimotor differences experienced and reported by people with autism has typically been sidelined by such theories as “co-morbidities,” possibly sharing genetic causes, but rendered as incidental and behaviorally irrelevant symptoms—surely disconnected from cognition. This article entertains the idea that the development of cortically based mental processes and autonomous control relies on the complexities and proper function of the peripheral nervous systems. Through such an “embodied” lens, the heterogeneous symptoms of autism invite new interpretations. We propose here that many behavioral-level findings can be re-defined as downstream effects of how developing nervous systems attempt to cope and adapt to the challenges of various noisy, unpredictable, and unreliable peripheral inputs.
Prediction is the key objective of many machine learning applications. Accurate, reliable and robust predictions are essential for optimal and fair decisions by downstream components of artificial intelligence systems, especially in high-stakes applications such as personalised health, self-driving cars, finance, new drug development, and forecasting of election outcomes and pandemics. Many modern machine learning algorithms output overconfident predictions, resulting in incorrect decisions and technology acceptance issues. Classical calibration methods rely on artificial assumptions and often result in overfitting, whilst modern calibration methods attempt to solve calibration issues by modifying components of black-box deep learning systems. While this provides a partial solution, such modifications do not provide mathematical guarantees of prediction validity, and are intrusive, complex, and costly to implement. This thesis introduces novel methods for producing well-calibrated probabilistic predictions for machine learning classification and regression problems. A new method for multi-class classification problems is developed and compared to traditional calibration approaches. In the regression setting, the thesis develops novel methods for probabilistic regression that derive predictive distribution functions which are valid under a nonparametric IID assumption in terms of guaranteed coverage, and which contain more information than classical conformal prediction methods whilst improving computational efficiency. Experimental studies demonstrate the advantages of the introduced methods over the state of the art. The main advantage of split conformal predictive systems is their guaranteed validity, whilst cross-conformal predictive systems enjoy higher predictive efficiency and empirical validity in the absence of excess randomisation.
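The split conformal calibration step behind the guaranteed-validity claim can be illustrated compactly. This is a minimal sketch of generic split conformal regression with a least-squares base model, not the thesis's predictive systems; the data and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_conformal_interval(x_tr, y_tr, x_cal, y_cal, x_new, alpha=0.1):
    """Split conformal interval with finite-sample 1 - alpha coverage."""
    a, b = np.polyfit(x_tr, y_tr, 1)         # base model: least-squares line
    predict = lambda x: a * x + b
    scores = np.abs(y_cal - predict(x_cal))  # nonconformity: |residual|
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level)           # conformal quantile
    yhat = predict(x_new)
    return yhat - q, yhat + q

x = rng.uniform(0.0, 10.0, 400)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, 400)
lo, hi = split_conformal_interval(x[:200], y[:200], x[200:], y[200:], 5.0)
print(f"90% predictive interval at x=5: [{lo:.2f}, {hi:.2f}]")
```

The coverage guarantee holds under the IID assumption regardless of how poor the base model is; a worse model simply yields wider intervals.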
This intervention contributes to recent work in urban geography that integrates the conceptual frameworks of assemblages and actor-network theory by highlighting two additional directions that require more rigorous and detailed theorization. The first direction concerns the relationship between contingency and necessity in urban assemblages and actor-networks, and the paper delineates four specific propositions as a starting point for further reflection. The second direction suggests that urban assemblages and actor-networks require a more explicit vocabulary for thinking about competition and cooperation within and between cities. To this end, the paper introduces a new concept – delayed asymmetric counterforces – that can foster a better understanding of competition-induced urban change and destabilization. The novel concept is developed in conjunction with a typology of delays in competitive urban dynamics, which helps illuminate how delayed asymmetric counterforces are both a cause and an effect of the complexity inherent in the urban realm.
This chapter reviews the extensive literature on bias in favor of in-groups at the expense of out-groups. We focus on five issues and identify areas for future research: (a) measurement and conceptual issues (especially in-group favoritism vs. out-group derogation, and explicit vs. implicit measures of bias); (b) modern theories of bias highlighting motivational explanations (social identity, optimal distinctiveness, uncertainty reduction, social dominance, terror management); (c) key moderators of bias, especially those that exacerbate bias (identification, group size, status and power, threat, positive-negative asymmetry, personality and individual differences); (d) reduction of bias (individual vs. intergroup approaches, especially models of social categorization); and (e) the link between intergroup bias and more corrosive forms of social hostility.
A paper I presented in the panel "Myths as Theoretical Models for Religious Identity in Ancient Greece" at the Nordic TAG Conference held at the University of Oslo.
Terror management theory (TMT) proposes that thoughts of death trigger a concern about self-annihilation that motivates the defense of cultural worldviews. In contrast, uncertainty theorists propose that thoughts of death trigger feelings of uncertainty that motivate worldview defense. University students (N = 414) completed measures of the chronic fear of self-annihilation and existential uncertainty as well as the need for closure. They then evaluated either a meaning threat stimulus or a control stimulus. Consistent with TMT, participants with a high fear of self-annihilation and a high need for closure showed the greatest dislike of the meaning threat stimulus, even after controlling for their existential uncertainty. Contrary to the uncertainty perspective, fear of existential uncertainty showed no significant effects.
Actors in competitive environments are bound to decide and act under conditions of uncertainty because they rarely have accurate foreknowledge of how their opponents will respond and when. When a competitor makes a move to improve her standing on a given variable relative to a target competitor, she should expect the latter to counteract with an iterative lagged asymmetric response, that is, with a sequence of countermoves (iteration) that is very different in kind from its trigger (asymmetry) and that will be launched at some unknown point in the future (time lag). The paper explicates the broad relevance of the newly proposed concept of "iterative lagged asymmetric responses" to the social study of temporality and to fields as diverse as intelligence and counterintelligence studies, strategic management, futures studies, military theory, and long-range planning. By bringing to the foreground and substantiating the observation that competitive environments place a strategic premium on surprise, the concept of iterative lagged asymmetric responses contributes to the never-ending and many-pronged debate about the extent to which the future can be predicted.
DOI: https://doi.org/10.1177/0961463X17752652
There is now considerable evidence that human sentence processing is expectation based: As people read a sentence, they use their statistical experience with their language to generate predictions about upcoming syntactic structure. This study examines how sentence processing is affected by readers’ uncertainty about those expectations. In a self-paced reading study, we use lexical subcategorization distributions to factorially manipulate both the strength of expectations and the uncertainty about them. We compare two types of uncertainty: uncertainty about the verb’s complement, reflecting the next prediction step; and uncertainty about the full sentence, reflecting an unbounded number of prediction steps. We find that uncertainty about the full structure, but not about the next step, was a significant predictor of processing difficulty: Greater reduction in uncertainty was correlated with increased reading times. We additionally replicated previously observed effects of expectation violation (surprisal), orthogonal to the effect of uncertainty. This suggests that both surprisal and uncertainty affect human reading times. We discuss the consequences for theories of sentence comprehension.
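The two quantities manipulated in such studies, surprisal of the observed continuation and entropy over upcoming structure, are straightforward to compute from a next-step distribution. The subcategorization probabilities below are invented for illustration; only the information-theoretic definitions are standard.

```python
import math

def surprisal(p):
    """Surprisal, in bits, of an observed continuation with probability p."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy, in bits, of a distribution over continuations."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical P(complement type | verb) before and after a disambiguating
# word; the numbers are made up.
before = {"NP": 0.5, "S-complement": 0.3, "PP": 0.2}
after = {"NP": 0.9, "S-complement": 0.05, "PP": 0.05}

print(f"surprisal of the NP continuation: {surprisal(before['NP']):.2f} bits")
print(f"uncertainty before: {entropy(before):.2f} bits")
print(f"uncertainty after:  {entropy(after):.2f} bits")
print(f"uncertainty reduction: {entropy(before) - entropy(after):.2f} bits")
```

The study's finding corresponds to the reduction term: larger drops in entropy over the full structure predicted longer reading times, independently of the surprisal of the word itself.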
Various recent research on online avatars has debated their authenticity in terms of representing the individuals that manage them. Seemingly, users construct an enhanced or idealized presence of themselves online, yet fail to realize that others do the same when seeking information about other users through their avatars. This phenomenon becomes even more curious inside online video game spaces, since video game avatars are already expected to be unrelated to their players but are still seen as sources of information about them. This study approaches the issue as a communication problem and tries to explain the process through Berger’s Uncertainty Reduction Theory (URT). Merging URT with various other nonverbal and visual communication approaches, it is debated how video game avatars – seemingly unrelated or arbitrarily related entities with respect to their users – become information sources about them. Additionally, to elaborate further on the process, the relationship between self and avatars is also analyzed. To create this link, the semiotic theories of Saussure and Lacan were expanded and a new approach was proposed: Saussure’s signification process and Lacan’s chains of signification were adapted to digital avatars to define an ongoing feedback loop between the video game avatars and the self.
Abstract—Dempster-Shafer evidence theory is very important in the fields of information fusion and decision making. However, it brings high computational cost when the frames of discernment to deal with become large. To reduce the heavy computational load involved in many rules of combination, approximation of a general belief function is needed. In this paper we present a new general principle for uncertainty reduction based on the hierarchical proportional redistribution (HPR) method, which allows any general basic belief assignment (bba) to be approximated at a given level of non-specificity, up to the ultimate level 1 corresponding to a Bayesian bba. The level of non-specificity can be adjusted by the user. Some experiments are provided to illustrate the proposed HPR method. Index Terms—Belief functions, hierarchical proportional redistribution (HPR), evidence combination, belief approximation.
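For context, the combination step whose cost motivates approximation methods such as HPR is Dempster's rule, sketched below for two bbas on a small frame. The masses are illustrative; the HPR redistribution itself is not reproduced here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic belief assignments.

    Focal elements are frozensets over the frame of discernment; the
    pairwise products over focal elements are what make large frames
    computationally heavy.
    """
    combined, conflict = {}, 0.0
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    # Normalise by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

A, B, AB = frozenset("A"), frozenset("B"), frozenset("AB")
m1 = {A: 0.6, AB: 0.4}
m2 = {B: 0.3, AB: 0.7}
m12 = dempster_combine(m1, m2)
print({"".join(sorted(s)): round(w, 3) for s, w in m12.items()})
```

The number of focal-element pairs grows with the square of the number of focal elements, and the power set of the frame grows exponentially, which is the cost HPR trades against non-specificity.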
The needs both for increased experimental throughput and for in operando characterization of functional materials under increasingly realistic experimental conditions have emerged as major challenges across the whole of crystallography. A novel measurement scheme that allows multiplexed simultaneous measurements from multiple nearby sample volumes is presented. This new approach enables better measurement statistics or direct probing of heterogeneous structure, dynamics or elemental composition. To illustrate, the submicrometer precision that optical lithography provides has been exploited to create a multiplexed form of ultra-small-angle scattering based X-ray photon correlation spectroscopy (USAXS-XPCS) using micro-slit arrays fabricated by photolithography. Multiplexed USAXS-XPCS is applied to follow the equilibrium dynamics of a simple colloidal suspension. While the dependence of the relaxation time on momentum transfer, and its relationship with the diffusion constant and the ...
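The central observable in XPCS is the intensity autocorrelation function g2(τ); a minimal sketch on a synthetic, exponentially correlated signal is shown below. The relaxation time and signal model are assumptions standing in for one multiplexed sample volume, not data from the experiment.

```python
import numpy as np

rng = np.random.default_rng(3)

def g2(intensity, max_lag):
    """Intensity autocorrelation g2(tau) = <I(t) I(t+tau)> / <I>^2."""
    mean_sq = intensity.mean() ** 2
    return np.array([
        (intensity[:-lag] * intensity[lag:]).mean() / mean_sq
        for lag in range(1, max_lag + 1)
    ])

# Synthetic intensity trace: AR(1) fluctuations about the mean with a
# relaxation time of ~20 frames (all values are made up).
n, tau_relax = 20_000, 20.0
rho = np.exp(-1.0 / tau_relax)
noise = rng.normal(0.0, 1.0, n)
x = np.empty(n)
x[0] = noise[0]
for t in range(1, n):
    x[t] = rho * x[t - 1] + np.sqrt(1.0 - rho**2) * noise[t]
I = 1.0 + 0.3 * x
curve = g2(I, max_lag=100)
print(f"g2 at lag 1: {curve[0]:.3f}, at lag 100: {curve[-1]:.3f}")
```

For an equilibrium colloidal suspension the decay rate of g2 versus momentum transfer is what yields the diffusion constant; multiplexing simply runs this analysis on several sample volumes at once.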
Uncertainty is real. This paper aims to describe Competitive Stakeholder Theory as a business strategy. CSR (the Triple Bottom Line philosophy) and Stakeholder Theory are competing theories of strategic management for achieving objectives through value maximization. The goal of Stakeholder Theory is to benefit all stakeholders involved: every stakeholder, including shareholders, shares and creates values together that are useful to all. Stakeholder Theory is a dynamic process shaped by the power and control of stakeholders, embedded in ethics and philosophy; existing issues; cost-effective strategies; morals and trust; PDCA (Plan-Do-Check-Act); and the recognition and creation of values. These form a continuous process with interrelated relationships. Competitive Stakeholder Theory can be applied in stable situations and in unstable ones such as uncertainty, turbulence, chaos, limited resources, remote areas, risk minimization, and issues of social responsibility. Key words: Stakeholder theory, Uncertainty
The present research describes the manner in which individuals use various media in the interpersonal information seeking process. Stephens' (2007) information and communication technology (ICT) succession theory was applied to an interpersonal information seeking context, and ...
Trends in rainfall at 39 locations of the Nile River Basin (NRB) in Africa were analyzed. A comparison was made between rainfall trend results from the long-term data and those of short-term series selected over different time periods, and the bias in trend results from short-term records was quantified. A homogeneity test was conducted to assess the coherence of the trend directions on a regional basis. Based on an assumed population (for simplicity) of rainfall data time periods in the range of 75–100 years, bias in the short-term trend analysis was noted to reduce by about 10% for every 10% increase in record length. Under certain conditions, it was possible to derive trends at stations with short rainfall records from those at nearby stations in the same region with longer records. Using the same record length and a uniform time period at all selected stations, improved regional coherence of the rainfall trend results was obtained. In the equatorial region, the trend in annual rainfall was found to be mainly positive and significant at level α = 5% in 4 of the 7 stations. Collectively for Sudan, Ethiopia and Egypt, trends in annual rainfall were mostly negative and significant at α = 5% in 69% of the 32 stations. Heterogeneity in the trend directions for the entire NRB was confirmed at α = 1% in 13% of the 39 stations. These findings are vital for water and agricultural management practices.
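A common nonparametric choice for testing rainfall trends at a given significance level is the Mann-Kendall test. The abstract does not name its exact test, so the simplified version below (ignoring ties) is an illustrative stand-in, run on synthetic annual rainfall.

```python
import math

def mann_kendall(series, z_crit=1.96):
    """Mann-Kendall trend test (ties ignored); z_crit=1.96 ~ alpha = 5%."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n) for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z, abs(z) > z_crit

# 40 years of synthetic annual rainfall: upward trend plus alternation.
rain = [100.0 + 0.8 * t + ((-1) ** t) * 5.0 for t in range(40)]
s, z, significant = mann_kendall(rain)
print(f"S = {s}, Z = {z:.2f}, significant at 5%: {significant}")
```

Because the S statistic counts concordant minus discordant year pairs, shorter records produce noisier Z values, which is one way record length biases short-term trend results.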
Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching has been related to hypothesis testing only in an incidental, marginal, and methodologically dispersed manner; although some authors take the relationship for granted, it has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules adopted the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generation makes the SPT a complementary instrument for studying decision making, which might shed some light on the debate about irrationality. The importance of reaction times, both before and after responding, is also discussed.
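The nonoptimality of probability matching on a biased two-choice task is easy to demonstrate by simulation. The 70/30 contingency below is an assumption for illustration, not the SPT's actual parameters.

```python
import random

random.seed(1)

def accuracy(strategy, p_left=0.7, trials=10_000):
    """Proportion of correct predictions on a biased two-choice task."""
    correct = 0
    for _ in range(trials):
        outcome = "L" if random.random() < p_left else "R"
        if strategy() == outcome:
            correct += 1
    return correct / trials

# Probability matching: choose each side in proportion to its contingency.
matcher = lambda: "L" if random.random() < 0.7 else "R"
# Maximizing: always choose the majority side.
maximizer = lambda: "L"

acc_match = accuracy(matcher)
acc_max = accuracy(maximizer)
print(f"matching: {acc_match:.3f}, maximizing: {acc_max:.3f}")
```

Matching yields an expected accuracy of p² + (1-p)² (0.58 here) against p (0.70) for maximizing, which is why rule-generating participants who matched were less successful.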
The interaction between the patient's expected outcome of an intervention and the inherent effects of that intervention can have extraordinary effects. Thus in clinical trials an effort is made to conceal the nature of the administered intervention from the participants, i.e. to blind it. Yet in practice perfect blinding is impossible to ensure or even verify. The current standard is to follow up the trial with an auxiliary questionnaire, which allows trial participants to express their belief concerning the assigned intervention and which is used to compute a measure of the extent of blinding in the trial. If the estimated extent of blinding exceeds a threshold the trial is deemed sufficiently blinded; otherwise, the trial is deemed to have failed. In this paper we make several important contributions. Firstly, we identify a series of fundamental problems with the aforesaid practice and discuss them in the context of the most commonly used blinding measures. Secondly, motivated by the highlighted problems, we formulate a novel method for handling imperfectly blinded trials. We too adopt a post-trial feedback questionnaire, but interpret the collected data using an original approach, fundamentally different from those previously proposed. Unlike previous approaches, ours is free of ad hoc parameters, is robust to small changes in auxiliary data, and is not predicated on strong assumptions used to interpret participants' feedback.