Journal of Organizational Behavior, J. Organiz. Behav. 33, 412–427 (2012)
Published online 18 January 2012 in Wiley Online Library (wileyonlinelibrary.com) DOI: 10.1002/job.1776
Sociometric badges: Using sensor technology to
capture new forms of collaboration
TAEMIE KIM1*, ERIN MCFEE2, DANIEL OLGUIN OLGUIN3, BEN WABER3 AND
ALEX “SANDY” PENTLAND4
1Accenture Technology Labs, San Jose, California, U.S.A.
2Harvard Business School, Allston, Massachusetts, U.S.A.
3Sociometric Solutions, Inc., Boston, Massachusetts, U.S.A.
4MIT Media Lab, Cambridge, Massachusetts, U.S.A.
Summary
This article introduces sociometric badges as a research tool that captures with great accuracy fine-scale
speech patterns and body movements among a group of individuals at a scale that heretofore has been
impossible in groups and teams studies. Such a tool offers great potential for studying the changing ecology
of team structures and new modes of collaboration. Team boundaries are blurring as members disperse across
multiple cultures, convene through various media, and operate in new configurations. As the how and why of
collaboration evolves, an opportunity emerges to reassess the methods used to capture these interactions and
to identify new tools that might help us create synergies across existing approaches to teams research. We
offer sociometric badges as a complement to existing data collection methods—one that is well-positioned
to capture real-time collaboration in new forms of teams. Used as one component in a multi-method approach,
sociometric badges can capture what an observer or cross-sectional survey might miss, contributing to a more
accurate understanding of group dynamics in new teams. We also revisit traditional teams research to suggest
how we might use these badges to tackle long-standing challenges. We conclude with three case studies
demonstrating potential applications of these sociometric badges. Copyright © 2012 John Wiley & Sons, Ltd.
Keywords: sociometric badges; collaboration; teams; technology; methodology
Introduction
Contemporary collaboration moves beyond traditional team conceptualizations such as small collocated face-to-face
groups (Leavitt, 1975). As discussed in the introduction to this special issue, team boundaries are blurring as
members disperse across multiple cultures, languages, and configurations. Virtuality and increasingly ambiguous
team structures contribute to an evolving understanding of member interdependence, and measuring team effectiveness becomes increasingly complex as individuals move in and out of teams and juggle multiple team memberships.
New leadership challenges emerge as organizations promote self-forming, self-governing teams and charge them with
leading organizational change. As the how and why of collaboration evolves, an opportunity emerges to reassess
the methods used to capture these interactions and to identify new tools that might help us create synergies across
existing approaches to teams research.
The supporting infrastructures outside of the team are also changing. Workplaces increasingly deploy various
media alternatives to leverage for collective work: video and telephone conferencing, instant messaging, and
enterprise social media platforms, for example. These technologies have freed team members from having to be
in the same place at the same time, allowing reliable access to distributed human capital that may otherwise
*Correspondence to: Taemie Kim, Accenture Technology Labs, San Jose, California, U.S.A. E-mail: taemie.j.kim@accenture.com
Received 30 November 2011, Accepted 1 December 2011
have remained elusive. As a result, there has been more emphasis in research on quantitative analysis of online
communication, such as email and social media (e.g., Rooksby et al., 2009). However, these media only cover
one aspect of collaborative exchanges (e.g., email exchanges among team members) and fail to capture the richness
of face-to-face interactions. For example, in one of the case studies that we discuss in this article, we found that
email volume analysis alone underestimated the amount of inter-team collaboration, which often happened as
informal spontaneous interactions. Therefore, multiple modes of observation are required to capture the various
dimensions of team member interactions, minimizing the bias in the data. Even with multi-method research,
however, certain aspects of collaboration in teams might still elude comprehensive depictions in the literature, in part
because the complexities of real-world teams and their organizational environments are extremely difficult and
expensive to capture for analysis. As team structures, memberships, and tasks combine in novel ways through novel
applications of technology, we might expect these data collection challenges to become increasingly salient.
In response to these emergent considerations, recent studies have explored the usefulness of technology both for
capturing original data and for calibrating with other kinds of observations. Research on face-to-face communication
suggests that verbal interaction patterns are meaningful predictors of future team performance (Curhan & Pentland,
2007). Past laboratory studies have used microphones to record audio input as a primary means of understanding group
collaboration patterns (Curhan & Pentland, 2007; DiMicco, Pandolfo, & Bender, 2004; Dong et al., 2007), and video
capture is common for analyzing physical interaction patterns, expressions, and transmission of social signals among
group members (Aran & Gatica-Perez, 2010; Brdiczka, Maisonnasse, & Reignier, 2005; Pianesi et al., 2008). Sensor
technologies such as accelerometers (Sung & Pentland, 2009), proximity and motion sensors (Gips, 2006), and sociometers (Choudhury & Pentland, 2004) have been developed to measure acceleration and vibration, group social
structures, and who the wearer is facing, respectively. It is in this vein that we offer the sociometric badge as a tool
for groups and teams research.
Sociometric badges utilize a technology that enables automatic measurement of collective patterns of behavior over
time and across physical boundaries at a scale that heretofore has been impossible in groups research. They are worn on
the person like an identification badge and capture with great accuracy fine-scale speech patterns and body movements
among a group of individuals (Olguín, Gloor, & Pentland, 2009; Pentland, 2008; see Appendix A for a full description
of the technical specifications of the badges and an example of badge data). Types of data captured with these badges
include speech and movement energy, speech frequency, and group interaction patterns, among other things. They
record streaming real-time emergent collaboration patterns on a large scale (e.g., organization-wide) and with a
calibrated precision that is impossible to replicate with observation alone. When compared with the labor-intensive task
of preparing observation notes and coding videos for analysis, sociometric badge data are much easier to work with in
this phase. They are ready for analysis almost immediately, requiring just a couple of minutes to process into a spreadsheet
format for every hour of data recorded. Included in this article are specific examples of how sociometric badges might be
used to address important emergent and long-standing questions in groups research.
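To make this processing step concrete, the sketch below rolls hypothetical raw badge samples into per-minute summary rows of the kind a researcher might load into a spreadsheet. The tuple format and the numbers are invented for illustration; they are not the badges' actual export schema.

```python
# Illustrative sketch only: the tuple format below is an invented stand-in for
# the badges' raw export, not the actual Sociometric Solutions schema. It shows
# how per-second samples (speech energy, movement energy) might be rolled up
# into per-minute rows ready for spreadsheet analysis.
from statistics import mean

# (timestamp_in_seconds, wearer_id, speech_energy, movement_energy)
raw_samples = [
    (0, "A", 0.62, 0.10), (1, "A", 0.58, 0.12), (60, "A", 0.05, 0.40),
    (0, "B", 0.04, 0.05), (1, "B", 0.06, 0.07), (61, "B", 0.71, 0.09),
]

def to_minute_rows(samples):
    """Aggregate raw samples into one summary row per (wearer, minute)."""
    buckets = {}
    for ts, wearer, speech, movement in samples:
        buckets.setdefault((wearer, ts // 60), []).append((speech, movement))
    return [
        {
            "wearer": wearer,
            "minute": minute,
            "mean_speech_energy": round(mean(s for s, _ in vals), 3),
            "mean_movement_energy": round(mean(m for _, m in vals), 3),
        }
        for (wearer, minute), vals in sorted(buckets.items())
    ]

rows = to_minute_rows(raw_samples)
```

Each row summarizes one wearer-minute, which is roughly the granularity at which badge data become tractable for the team-level analyses discussed in this article.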
First, we discuss how new forms of collaboration might benefit from this methodological tool. Second, we suggest
how we might also use sociometric badges to revisit traditional challenges in teams research and to add an additional
dimension of rigor to existing work. Third, we offer three brief case studies in which the badges capture the following
phenomena: dynamic inter-team collaborations, patterns of intra-team integration for newly formed teams, and the
effects of real-time interventions for distributed teams. These studies illustrate potential applications and highlight
additional research questions we might explore with this technology both in the laboratory and in the field.
(Mis)Understanding New Forms of Collaboration
Sociometric badges are a research tool that, through the advantages listed earlier, supports an accurate understanding
of the changing nature of team interactions—one that traditional methods alone might miss. This is not to suggest
that traditional methods are faulty but rather that they would benefit from this tool given the enormity of the task
of capturing collaboration across locations, time zones, and cultures. For example, we might use sociometric badges
to record a more accurate account of increasingly fluid memberships. Whereas in the past, team membership often
appeared as a simple survey item or through direct observation of team interactions, increasingly, membership is not
this straightforward. Formal team memberships might not reflect influential informal interactions, and self-reports
might be misleading. Survey measures alone could paint an incomplete picture of how and with whom collaboration
happens in a particular organizational setting. Observation hours are limited, and although field notes can provide a
rich set of insights from a small subset of cases, the researcher then runs into challenges with generalizability of
findings. By using sociometric badges alongside survey items, we can determine how perceptions of membership
correlate with contributions to team performance. We might test the hypothesis that perceptions of membership
predict the frequency and intensity of interactions around project-related tasks. We could also examine the relationship between the status of individuals who did not report formal membership (as measured by specific status cues
such as organizational title) and their influence over team project outcomes (as captured by the stream of interaction
data recorded by the sociometric badges). Informal mentoring relationships are often challenging to encourage in
organizations that structure compensation around production. In addition to researchers better understanding
power dynamics among collaborators, organizations might also be able to attribute and reward individuals who
serve as informal advisors to others or who make substantive contributions to projects that are not part of their
formal responsibilities.
Along with the fluidity of team membership, multiple team memberships offer a second example of how
sociometric badge data can support a more complete understanding of new forms of collaboration. These memberships tend to be captured using self-report survey items, often as a stated percentage of time spent on each team.
Although these are useful in framing the number and priority of projects that individuals work on, we suggest that
sociometric badge data can capture three elements that are impossible to collect with survey data and very difficult to
capture effectively through observation: (i) the exact amount of time spent on each team; (ii) the difference in an
individual’s behavior and speech patterns between focal and peripheral teams; and (iii) the difference in the team’s
treatment of the members for whom the team is their focal versus peripheral team. We might imagine a study in
which we ask individuals to put the badges on each morning and to wear them over several months. In doing this,
we could capture without having to be present the exact amount of time that individuals spend interacting with
members from various teams. Combined with self-report percentages, we could explore any perceptual discrepancies between time reported on each team and actual time spent on various teams. This could reveal new types of
challenges that individuals might face given their multiple memberships, such as managing escalating commitments
to peripheral team projects. To the second point, we might use sociometric badge data to calibrate survey and
observation data on interpersonal interactions. We might test the hypothesis that individuals behave and speak
with greater frequency and intensity on their core teams than they do on their peripheral ones. From the team’s
perspective, we might also hypothesize differential treatment of team members for whom the team is a core concern
versus those with less commitment. Because of the complexities of observing individuals with multiple team
memberships, sociometric badges could greatly facilitate data collection and capture an element of interpersonal
exchanges necessary to answer such questions that arise from these new forms of collaboration.
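The first of these elements, the discrepancy between reported and badge-recorded time allocation, can be sketched as a simple comparison. The team names, minutes, and percentages below are invented for illustration.

```python
# Hypothetical sketch: team names, minutes, and percentages are invented.
# It compares self-reported time allocation across multiple team memberships
# with the interaction time a badge deployment would actually record.

# Badge-derived minutes of project interaction, keyed by team
observed_minutes = {"Team Alpha": 900, "Team Beta": 300}

# Self-reported share of time on each team, from a survey item (percentages)
self_report_pct = {"Team Alpha": 60, "Team Beta": 40}

def perceptual_discrepancy(observed, reported):
    """Reported share minus observed share, in percentage points, per team."""
    total = sum(observed.values())
    return {team: round(reported[team] - 100 * minutes / total, 1)
            for team, minutes in observed.items()}

gaps = perceptual_discrepancy(observed_minutes, self_report_pct)
# A positive gap (here, Team Beta at +15 points) would suggest that members
# overestimate the time they give to that team.
```

A gap of this kind, computed per person and per team, is exactly the perceptual discrepancy that survey data alone cannot reveal.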
In a third example, we explore how traditional methods might miss some of the crucial effects of distributed team
members on team processes and performance. In a field setting with members located across two sites, it is possible
to set up data collection with sociometric badges in both sites. A conference call of all team members across
locations might disband into sub-teams comprising collocated members. The badges can collect the physical and
speech pattern interactions on the call, afterward in the sub-groups, and within these groups as side conversations
take place. Dialing into the call directly might afford insight into the content of formal collaboration moments. Direct
observation captures the dynamics in the room. By combining all these methods, a researcher might capture the
differential behavior and speech patterns across sites and record impromptu collaboration as it occurs. What might
have been challenging or impossible to capture with a remote or single-site presence can now be recorded at a low
cost and with little interference to natural team behaviors. We might examine how collocated team members’
behaviors were different when their team members spoke versus when the remote team members had the floor. With
only the observation data, we might assume that decisions made in the formal collaboration moment guide each
site’s team in much the same way. Combining the content of the phone and sub-group conversations with the badge
data, we could test the hypothesis that critical decisions are also made in informal collaboration moments, which
could help explain some of the challenges of leading dispersed teams.
As an illustration of some of the methodological challenges named earlier, we provide later in the article a case
example of how a single methodology might have resulted in skewed findings in teams leveraging technology
for collaboration. We have just discussed how the sociometric badge is well-positioned to complement existing
methodologies as collaborators disperse, adopt more ambiguous roles on teams, and lead complex team lives. Alone,
the badge data will not offer a complete understanding of these new forms of team interactions, but traditional
methodologies in isolation are also at risk of missing crucial aspects of team life because of the cost and operational
challenges behind collecting these data in real time and at scale. Thus, we suggest that the sociometric badges are a
critical tool for creating synergies between existing network studies, team studies, and conversation analysis.
Revisiting “Old” Teams
Much in the same way sociometric badges might enhance measurement of new team forms, they could also
contribute substantively to data collection in more traditional teams and team settings. In a review of existing
teams research, we noted several consistent methodological constraints that limit our understanding of team-level
phenomena. We have included in this discussion two challenges in particular that sociometric badges are best suited
to address; as such, this is not a comprehensive list of methodological constraints. In the sections that follow, we discuss
how badge data might help verify self-report items and record causal linkages.
In general, there seems to be agreement that self-report measures are a good source of people’s feelings and
perceptions (Spector, 1994). The challenge arises when self-report items are used as a source for capturing objective
changes, such as structural changes to the operating environment (Frese & Zapf, 1988). As such, these measures are
limited by source bias and attribution effects (Staw, 1975) and impacted by other things such as respondent attitudes,
cognitive processes, mood, and personality (Spector, 1992). For example, in her influential work on task group
effectiveness in marketing teams in the communications industry, Gladstein (1984, p. 514) notes in her limitations
section that “[The study] relied on one-time self-report measures to represent most constructs. . .[which] raises the
question of how much of the explained variance is common-method variance, and how much is true variance.”
We learned from this important work that group ratings of open communication, supportiveness, and active
leadership among other variables were positively associated with group ratings of satisfaction and performance,
even though these same variables had either no impact on or a negative relationship to the team’s actual overall
performance (Gladstein, 1984).
With the sociometric badges today, we might revisit a similar organizational setting and have the team members
wear the badges to collect continuous data from their team meetings. We could then calibrate their self-report
measures from the original main surveys against the speech pattern and behavioral interaction data from the badges
to determine whether group members held accurate perceptions of open communication, supportiveness, and active
leadership. Counting interruptions and comparing the frequencies and durations of each member’s contributions
might verify openness of communication. Examining two types of interruptions might confirm member supportiveness: those that resulted in further elaboration by the original speaker and those that resulted in the group discussion
changing direction. Active leadership could be measured as the number and intensity of communications emerging
from the named leader. At little additional financial cost, this process both satisfies the call for such objective
benchmarks against which we can compare self-reported measures (Spector, 1994) and when visualized (see example
provided in Case Study #2) provides a useful tool for communicating the team’s performance back to its members,
another call issued in Gladstein’s work.
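The calibration logic described above can be sketched from a turn-taking log. The log, names, and thresholds below are invented for illustration; a real analysis would work from the badges' actual speech-segment output.

```python
# Hedged sketch of the calibration step: from a turn-taking log of
# (speaker, start_second, end_second) tuples, tally each member's speaking
# time and the interruptions they initiate (a turn that begins before the
# previous turn ends). The log below is invented for illustration.

turns = [
    ("leader", 0, 30),
    ("m1", 28, 45),      # begins before the leader finishes -> interruption
    ("m2", 46, 60),
    ("leader", 58, 90),  # begins before m2 finishes -> interruption
]

def speaking_seconds(log):
    """Total seconds each member held the floor."""
    totals = {}
    for speaker, start, end in log:
        totals[speaker] = totals.get(speaker, 0) + (end - start)
    return totals

def interruptions(log):
    """Count interruptions initiated by each member."""
    counts = {}
    for prev, cur in zip(log, log[1:]):
        if cur[1] < prev[2]:  # new turn starts before the previous ends
            counts[cur[0]] = counts.get(cur[0], 0) + 1
    return counts

floor_time = speaking_seconds(turns)   # the leader holds the floor for 62 s
interrupt_counts = interruptions(turns)
```

Comparing floor-time shares would speak to openness of communication, and the named leader's turn counts and interruption behavior would operationalize active leadership in the sense used above.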
A second frequently encountered challenge is determining causality. Longitudinal data collection and field
observations are well suited to investigate the antecedents and consequences of particular team actions, but they also
have a distinct set of challenges with regard to accurately capturing causality. Outcomes of a longitudinal study
hinge on the timing of data collection, site selection, and the level of complexity associated with such a project
(Van de Ven & Huber, 1990). Field observers or participant observers face problems with influencing or being
influenced by the system they are observing (Barley, 1990; Leonard-Barton, 1990). Cross-sectional designs limit
tests of dynamic models. As Jehn (1995, p. 277) notes in her seminal work on intragroup conflict, “A dynamic
model of conflict would take into account the possibility that one type of conflict may change into another
type. . .The cross sectional design [of this study] also limits my ability to untangle causal relationships in the model.”
She goes on to suggest a longitudinal design might be used to examine causal factors, but as mentioned earlier, that
too has its challenges. We suggest that a combination of cross-sectional, longitudinal, and sociometric badge data
could afford novel insights into long-standing research questions on groups and teams.
If we were to return to Jehn’s (1995) 105 collocated work groups and management teams in the international
headquarters of a large freight transportation firm, we might deploy 589 badges for the sample to wear during
team meetings. We could use the badge data as the connective tissue between her multi-method approach
(surveys, interviews, observations, and performance appraisal ratings) to support the dynamic model of conflict
in groups that she named in her discussion and with limited added cost and data preparation time. Some of the
factors Jehn names as critical to a dynamic model are the onset of the conflict, the changes in conflict over time,
and the inconsistencies between latent and manifest conflicts. Sociometric badge data can be used to examine
these items by recording the temporal sequences of team member interactions. The intensity and frequency of
speech patterns (e.g., frequent interruptions) combined with behavioral cues (e.g., leaning in closer to the
conversation partner) can signal the onset of a manifest conflict. This timing can be calibrated with observational
data to ensure consistency in analysis across team meetings. If observations of relational and task-related
conflicts in groups yield distinct speech and movement patterns, we might investigate and better understand a
group’s devolution from a potentially productive task conflict to an unproductive relational conflict. Finally,
linking survey, interview, and observational data to meetings in which latent and manifest conflicts
occurred, we might query the badge data to determine interaction patterns that precede both. Because the cost
to collect a significant volume of data is low, it becomes far more feasible to test resultant models on
different teams in different settings, supporting more generalizable findings than are often the case in in-depth
case studies.
To understand team-level phenomena, we must consider the context, content, and process of team interactions.
Sociometric badges offer a novel means of capturing detailed content of team interactions that, when combined with
other methodological approaches, can contribute to new understandings of existing group phenomena and to the
exploration of new phenomena as teams adopt new forms and practices. A key benefit of these badges is the cost
and speed of data preparation. Observations typically require at least one researcher present for the number of hours
observed; then, the field researcher returns to his or her office and spends a great deal of time sorting through and
coding notes into the desired format for analysis. The result is a set of coded observations representing some hours
of team interactions. With sociometric badges in hand, that same researcher could add a significant data set of unobserved team interactions as a complement for testing, dramatically adding to the robustness of any findings. One
hour of recorded sociometric badge data takes just a few minutes to process into a spreadsheet, and that processing
is automated. Although the badge data alone would be a qualitatively thin source of evidence, they can
quantitatively substantiate insights drawn from the observational work.
In the following section, we provide three supporting examples that highlight how automatically measured
interaction data help us extend the depth of understanding of changing forms of collaboration. The first case study
shows how observing both email and sociometric badge data provided an unbiased understanding of inter-team
collaboration patterns. The second case study delineates the detailed integration process of a newly formed team,
using fine-grained data collection points. The last case study shows that real-time collection of collaboration data
opens opportunities for real-time automated intervention.
Case Study #1: Understanding Inter-Team Collaboration Patterns
As the forms of collaboration change, cross-team collaboration is increasingly common to the point that the formal
team structure can serve more for an administrative organization than as a governor of interaction and knowledge
exchange.
To understand these inter-team collaboration patterns, we collected email and interaction data for a period of one
month (20 working days) in the marketing division of a German bank. The division comprised 22 employees
distributed into four teams. A manager oversaw each of these teams, and a mid-level manager, in turn, supervised
each manager. Each mid-level manager was responsible for two teams and reported directly to the division
manager. We treated the mid-level and division-level managers as a single team in the analysis (for the division’s
organizational chart, see Figure 1). We collected interaction data using the sociometric badges, which recorded
the wearer’s speech, interaction, and body movement patterns, as well as his or her location. All information was
transmitted in real time. We instructed each employee to wear a sociometric badge every day from the moment
he or she arrived at work until he or she left his or her office. In total, we collected 2200 hours of data (an average
of 100 hours per employee) and 880 reciprocal emails.
Using the email and face-to-face interaction data collected in the experiment, we were able to create a dynamic
visualization that shows communication flow within and between teams. Figure 2a–c illustrates inter-team email
volume (arcs below each box) and face-to-face interaction volume (arcs above each box). This provides a useful
and intuitive method of visualizing the total communication within an organization.
These data present a more nuanced view of inter-team collaboration practices. In addition to providing accurate data
on the volume of each type of interaction (email versus face-to-face), we also explored the dynamic utilization patterns
of each to better understand how communication is used to manage certain events. Email communication occurred mostly
between managers and the other teams, and this pattern was replicated quite evenly across the first three days. We
found this pattern of behavior to continue throughout the duration of the study. In contrast, face-to-face interactions
were much more variable across the three days. These daily changes continued even after the first three days.
On average, 63.4 per cent of all the email transactions were between individuals on different teams (inter-team
emails), whereas the remaining 36.6 per cent of emails were between individuals on the same team (intra-team
email). Of all face-to-face interactions, 68.1 per cent of the detections were between individuals on different teams,
and 31.9 per cent of the detections were between individuals on the same team. Thus, the ratios of inter-team and
intra-team interactions are similar between the two communication media. However, the behavioral patterns
underlying these similar ratios are quite different. The daily variance in the volume of email communication is
not the same as that of face-to-face interactions. For email, the standard deviation of all days is 8.5 per cent, whereas
it is 20.1 per cent for the face-to-face interactions. Thus, face-to-face interactions had a 2.4 times higher variation in
Figure 1. German bank marketing division organizational chart
Figure 2. (a) Day 1: email and face-to-face communication patterns, (b) Day 2: email and face-to-face communication patterns,
and (c) Day 3: email and face-to-face communication patterns
daily volume. Hence, we can conclude two things: (i) a good portion of the face-to-face communication happens
across teams and (ii) who talks to whom, and how much, changes considerably over time. For example, the sales and
support groups accounted for the bulk of face-to-face interactions on Day 2, whereas the development and sales groups
had the largest share of face-to-face communication on Day 3. Moreover, the visualization of the volume of
communication of the first 3 days (Figures 2a–c) suggests that a big portion of email includes managers, whereas
a big portion of face-to-face interaction happens between non-manager teams.
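The two summary statistics used in this case study, the inter-team share of interactions and the ratio of daily variation between media, can be recomputed from raw counts along the following lines. The counts below are invented for illustration; the study's raw data are not reproduced here.

```python
# Illustrative recomputation of the two summary statistics discussed in this
# case study, using invented counts rather than the study's actual data.
from statistics import pstdev

# Interaction counts per pair of individuals, and each individual's team
pair_counts = {("a", "b"): 3, ("a", "c"): 5, ("b", "c"): 2}
team_of = {"a": 1, "b": 1, "c": 2}

# Hypothetical daily interaction volumes over three days
f2f_daily = [10, 30, 20]
email_daily = [19, 21, 20]

def inter_team_share(counts, teams):
    """Share of interactions whose two parties sit on different teams."""
    total = sum(counts.values())
    inter = sum(n for (a, b), n in counts.items() if teams[a] != teams[b])
    return inter / total

def daily_variation_ratio(f2f, email):
    """How much more variable face-to-face volume is than email volume."""
    return pstdev(f2f) / pstdev(email)

share = inter_team_share(pair_counts, team_of)
ratio = daily_variation_ratio(f2f_daily, email_daily)
```

Applied to the study's full data set, the same two quantities yield the 68.1 per cent inter-team share of face-to-face detections and the 2.4-times variation ratio reported above.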
Across all 20 observed days, communication involving managers accounted for 37.2 per cent of email exchanges but
only 23.9 per cent of face-to-face interactions (t(14) = 2.84, p < .05); email data alone would therefore overstate
the extent of collaboration involving managers relative to findings that include the badge data.
With the sociometric badge data, we see that including the face-to-face analysis presents a more thorough portrait
of what actually transpired in these real teams: there is a great deal of inter-team communication, and these
interactions are highly variable. Capturing both media allowed us to deduce that email is the method used to
communicate regularly with managers, whereas face-to-face communication supports inter-team collaboration
responding to daily needs. Additionally, even though the team structure was fixed, the composition of collaborating
groups varied. The face-to-face data accurately captured the collaboration happening outside the formal
team structure. This collaboration happened quite dynamically, which might have been difficult to capture using
self-reported data.
Building on this case study of inter-team communication patterns, we envision this technology affecting researchers
and practitioners in several ways. First, researchers can begin to examine causal linkages between collaboration
patterns and team outcomes. For example, did a flurry of face-to-face interactions precede a successful outcome
in the project strategy? Does a team or department’s heavier-than-average reliance on electronic communication
leave them politically underrepresented when it comes time to make decisions? Does an inconsistent collaboration
practice lead to inconsistent outcome quality? Second, organizations attempting to improve cross-team collaboration
can analyze progress through fairly straightforward visualizations of team interactions coupled with content
calibration to ensure decisions that would benefit from a collaborative approach are the ones receiving such
collective attention. Third, organizations can optimize technology solutions for employees on the basis of real usage
of platforms for various activities (e.g., informational interactions versus decision-making ones). Along those same
lines, industrial engineers can use collaboration patterns to plan work flow, design, and physical space to optimize
knowledge exchange practices. These and other research questions are afforded a new perspective with the sensor
technology and sociometric badges.
Case Study #2: Measuring the Integration Process in Multi-Cultural Teams
We continue our exploration of the utility of these badges by examining the critical launch phase of newly formed
teams. Capturing fine-scale data on temporal changes in team dynamics has long been a challenge for groups
researchers. Harrison, Mohammed, McGrath, Florey, and Vanderstoep (2003) conducted weekly observations to understand how the patterns of collaboration change over time. Although the study yielded important insights on
changes in collaboration patterns, the timing of the observations limited their findings.
With sociometric technology in hand, we set out to better understand how integration happens in newly formed
multi-cultural teams. Previous work has also examined this question; for example, in one longitudinal study,
researchers found that the effects of demographic diversity reduce over time (Harrison, Price, Gavin, & Florey,
2002). Their method included participant surveys on perceived diversity and collaboration at both the outset of the
study and five weeks later. The study provided important findings on the dynamic qualities of diversity effects on
collaboration. However, we still lacked a complete understanding of exactly how the team integrated. We were interested in extending the findings of this and other research by providing more detailed temporal information on team
integration processes, namely who was responsible for leading the integration and at what point it began to occur.
To answer our research questions, we used the sociometric badges at a leadership forum held in Tokyo, Japan.
The forum brought together 20 university students from the U.S. and 20 university students from Japan. The forum
was held for two weeks: the first week was devoted to lectures and field trips, and the second week
was spent working on a team engineering project. Each team was asked to build a Rube Goldberg machine,
which was later evaluated by experts. The student participants wore the badges during all working hours for seven
working days of the second week. The first few days involved brainstorming on the design, whereas the latter days
were used to physically build the machine and troubleshoot problems. We recorded the dynamic change in their
communication patterns as they were working together in these groups.
Figure 3 shows a sample team visualization over the one-week period as captured by the sociometric badges. Each
circle in the diagram represents a member of that team, whose identity is represented by their initials next to the
diagram. The size of each circle represents the amount of time a person participated in the conversation. The shade
of the circle represents how interactive each person was on the basis of their turn-taking pattern (darker means more
interactive), and the width of the edges represents pairwise turn taking (thicker links indicate that the two
connected nodes engaged in more back-and-forth conversation).
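These node and edge statistics can be computed directly from an ordered log of speaking turns. The sketch below is purely illustrative; the member IDs and turn log are invented, and the badges' actual feature extraction is more elaborate:

```python
from collections import Counter

# Hypothetical ordered turn log: (speaker, duration in seconds).
turns = [
    ("4SI", 12.0), ("4GO", 4.5), ("4SI", 8.0), ("4MH", 3.0),
    ("4GO", 6.0), ("4SI", 10.0), ("4KF", 2.5), ("4MH", 5.0),
]

# Node size: total speaking time per member.
speaking_time = Counter()
for speaker, duration in turns:
    speaking_time[speaker] += duration

# Edge width: number of turn transitions between each unordered pair.
edge_weight = Counter()
for (a, _), (b, _) in zip(turns, turns[1:]):
    if a != b:
        edge_weight[frozenset((a, b))] += 1

# Node shade ("interactivity"): a member's share of all turn transitions.
total = sum(edge_weight.values())
interactivity = {
    m: sum(w for pair, w in edge_weight.items() if m in pair) / total
    for m in speaking_time
}

print(speaking_time["4SI"])                    # 30.0
print(edge_weight[frozenset(("4SI", "4GO"))])  # 3
```

From this tiny log, 4SI accumulates the most speaking time and shares the heaviest edge with 4GO, which is how the visualization would render a dominant pair.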
In this particular team, half the students in the team were Japanese (4KF, 4MH, 4YI, and 4TS) and the other half
were American (4HS, 4GO, 4SI, and 4BR). We can observe from the width of the links (turn-taking frequency) that
the American students interacted mostly among themselves during the first few days. On the first day, the only two
Figure 3. Intra-team interactions over a one-week period
Japanese students included in the conversation (4MH and 4KF) were those studying in the U.S., who
were probably more accustomed to U.S. culture and language. We observe that this pattern changed over the course
of the week: the Japanese students became more integrated into the team. We can also see that one of the students
(4SI) took the most dominant role on the second day but dramatically changed his behavior the next day. By the end
of the week, the patterns of the team became more balanced and highly interactive, showing no sign of collaborative
separation on the basis of the nationality of the participants.
By quantifying the turn-taking patterns within the team, we were able to understand the sequence and timing of
integrating multi-cultural actors. Moreover, we were able to measure multiple teams at the same time over a period
of multiple days without sourcing field observers who needed to be present at all times. The sociometric badges
afforded us the opportunity to gather complete data sets for a longitudinal study on the temporal changes in
dynamics within teams.
Coupling the sensor technology with visualization and mapping techniques, we are now better equipped to
explore the power and influence dynamics in team settings. Including the physical movements of the team members,
we might address questions of influence and decision quality. For example, in the study we just discussed, we see
that 4SI’s participation in the group burgeoned between Days 1 and 2 but generally declined over the remaining five
days. What happened to cause these participation level changes? Tying participation to team outcomes, we might
explore how deliverables on Day 2 differed in quality from those on Day 7, when 4SI participated the least.
Alternatively, Day 5 saw the most evenly distributed participation among team members: what were the environmental
conditions that created this scenario? What was the team’s performance quality on that day? Is this emergent
pattern typical in a team’s forming cycle, germane to U.S.–Japanese teams, a particular function of the seven-day exercise design, or something else? Unfiltered interaction patterns captured over extended periods can provide a
substantive starting point for evaluating our existing ideas of the role of individual characteristics and power and
influence in collaborative contexts. They can also support exploration of questions previously underexplored in the
literature. For example, if 4SI were to wear the badge for the rest of the conference, would he replicate this
pattern of early dominance and delayed reticence in sessions, cocktail conversations, and other
events? In other words, is this a quality of the individual that can be diagnosed or something that is instead emergent
in a particular team context? How might that impact how we design teams? With sociometric badges, we can now
examine these and other questions about individual member attributes and collaborative behaviors. We explore some
practical applications of these badges in the next case study.
Case Study #3: Sociometric Intervention for Distributed Teams
Along with all the aforementioned capabilities, sociometric badges can collect and analyze data in real time, thus
creating new opportunities for real-time interventions. Live substantive feedback can change the team’s trajectory
and ultimately impact overall performance. We set out to better understand the effects of mixed motives in teams
on a social dilemma task that pitted individual goals against collective goals. Previous work explores the effect of
interventions on social dilemma tasks and suggests that providing opportunities for task-related group discussions
greatly increases the level of cooperation compared with groups that do not get a chance to talk (Orbell, Kragt,
& Dawes, 1988). Such task-related discussions often lead to shared commitments about cooperation, which make
group members more likely to follow through with cooperative decisions. Also, correcting for spatial distortions of
eye gaze in video conferencing can improve the level of cooperation further, supporting comparable performance
between distributed and collocated teams (Nguyen & Canny, 2009).
To further our understanding of these phenomena, we conducted a laboratory study to explore whether we could
improve the performance of distributed teams by providing real-time interventions on the basis of sociometric
feedback. We chose geographically distributed teams as our target because they have been shown to perform more
poorly than co-located teams in certain tasks (Hinds & Bailey, 2003) and because of their increasing use in
organizational collaboration. Moreover, members in distributed teams often have an inaccurate understanding of
their communication behavior and the group dynamics (Cramton, 2001). Sociometric badges address this challenge
by simultaneously capturing interaction data from team members not present in the same location. The badge
collects wearer-centered data and uploads it to a central server, where data from multiple badges are combined to
support a comprehensive understanding of the whole unit.
We created 45 groups of four randomly assigned individuals who were given a social dilemma task (i.e., each
member had to decide whether to maximize their own earnings or the group’s overall earnings). Participants could choose
whether to invest portions of their money into a group fund; group earnings were maximized when everyone invested fully in
the fund, but personal gain was maximized by withholding contributions. In such social dilemma tasks, cooperation
is challenging when individual interest is in conflict with the group’s interest (Bazerman, Mannix, & Thompson,
1988). This is especially true for geographically distributed groups: trust breaks down in computer-mediated communication, undermining cooperative behavior (Bicchieri & Lev-On, 2007; Rocco, 1998). It can become extremely
difficult for distributed groups to establish and maintain trust and cooperation compared with groups communicating
face-to-face. Because of this and the aforementioned findings, we designed a real-time feedback system that provides
visual feedback to groups on their patterns of participation and turn taking. It was visually designed to encourage
more balanced participation among members and a higher frequency of turn transitions (Appendix B). We conducted
a 2 × 2 between-subjects study to understand the effects of geographic distribution and sociometric feedback.
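The task follows the structure of a linear public goods game. Below is a minimal sketch of the payoff logic, using an invented endowment and multiplier rather than the study's actual parameters:

```python
def public_goods_payoff(contributions, endowment=10.0, multiplier=1.6):
    """Per-member payoff in a linear public goods game (illustrative parameters).

    Each member keeps whatever they withhold and receives an equal share of
    the multiplied group fund. With a multiplier below the group size,
    withholding maximizes individual payoff, but full contribution maximizes
    the group total.
    """
    n = len(contributions)
    share = multiplier * sum(contributions) / n
    return [endowment - c + share for c in contributions]

# Everyone cooperates fully: each member earns 16.0, the best group total.
coop = public_goods_payoff([10, 10, 10, 10])
# One member withholds everything: the defector out-earns the cooperators.
mixed = public_goods_payoff([10, 10, 10, 0])
print(coop)   # [16.0, 16.0, 16.0, 16.0]
print(mixed)  # [12.0, 12.0, 12.0, 22.0]
```

The defector's advantage in the mixed case is exactly the tension the task creates between individual and collective goals.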
There were clear differences in communication patterns between the groups with and without feedback. Feedback
successfully increased members’ speaking time and the frequency of turn transitions. The results
held for both co-located and distributed groups, but the effect size was much larger in distributed groups. But
was all of this turn taking and sharing related to the quality of the group’s decision? To determine this, we examined
the total amount of money invested in the group fund as a measure of cooperation. In the no-feedback condition, the
cooperation level was significantly lower for individuals in groups that were distributed first (mean = $27.85 and
$23.29 for co-located and distributed, respectively). In the presence of feedback, we see that there was a significant
increase in the cooperation level in groups that were distributed first (mean = $26.64 and $28.69 for co-located and
distributed, respectively; χ2(3, 176) = 10.17, p < .05; Figure 4). Hence, the sociometric feedback helped distributed-first groups to perform no worse than co-located-first groups. The main effect is not significant for either feedback
Figure 4. Results of intervention on team cooperation levels for collocated and distributed teams
(F(1, 176) = 2.43, p = .12) or geographical condition (F(1, 176) = 0.86, p = .35), but the interaction effect is significant (F(1, 176) = 6.02, p < .05).
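The crossover behind the significant interaction can be seen directly in the four reported cell means. The following is a minimal sketch of the interaction contrast (the full ANOVA, of course, requires the raw data):

```python
# Reported cell means of cooperation (total group-fund investment, in dollars).
means = {
    ("no_feedback", "colocated"): 27.85,
    ("no_feedback", "distributed"): 23.29,
    ("feedback", "colocated"): 26.64,
    ("feedback", "distributed"): 28.69,
}

# Simple effect of geographic distribution within each feedback condition.
gap_no_fb = means[("no_feedback", "distributed")] - means[("no_feedback", "colocated")]
gap_fb = means[("feedback", "distributed")] - means[("feedback", "colocated")]

# Interaction contrast: how much the distribution gap changes with feedback.
interaction = gap_fb - gap_no_fb

print(round(gap_no_fb, 2))    # -4.56: distributed-first groups cooperate less
print(round(gap_fb, 2))       # 2.05: with feedback, the gap reverses
print(round(interaction, 2))  # 6.61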
In this example, we examined the change in cooperation levels on a social dilemma task in teams of four. We
also might use similar study designs to explore interactions of decision-making task types, intervention timings,
and performance outcomes in a way that heretofore has been difficult to isolate at scale. For example, what effects
do intra-team versus extra-team interventions have on performance? How much does the source of the feedback
matter (e.g., high status, low status, and no status)? How does drawing the team’s attention to the volume of their
contributions in a visual and objective way change their perception of their role on the team and potentially alter
other beneficial or harmful collective behaviors? We suggest that these three case studies are just the starting point
for leveraging these sociometric badges in research settings.
Discussion, Limitations, and Conclusion
Discussion
In this paper, we introduce sensor technology as a methodological complement intended to support and expand our
understanding of collaboration in teams and various organizational settings. We examined how badge data might
help us avoid misinterpretations of patterns of collaboration on new forms of teams and how we might revisit
traditional teams with this new tool to support a more well-rounded approach to long-standing questions of self-report bias and determining causal linkages. We also provided three mini case studies to illustrate potential
applications of this technology.
In one application of this technology, we used sociometric badges to collect face-to-face and distributed
interaction data in a series of laboratory and field studies. Once wearers habituate to them, the badges are likely to be far less
intrusive than a human observer, potentially limiting any social facilitation distortions to the data. Sensor technology
allows us to chart the speech and movement patterns of the wearer and adds an informative dimension to established
methods and tools for data collection.
Additionally, this technology is highly mobile, going everywhere the host goes. From setup to researcher use,
these key-card-sized devices make it easy to collect data continuously even as the wearer moves throughout his
or her day. In part because of the compact size, we can measure both formal (in-person meetings, presentations,
distributed technologically mediated meetings, etc.) and informal interactions (water cooler conversations, coffee
and lunch breaks, shared-space interactions, etc.). As workplaces increasingly adopt open designs in an effort to
foster collaboration, technology such as this could capture otherwise evanescent interpersonal exchanges to better
assess the impact of such designs on collaboration.
When using the sociometric badges, we synchronized data from various locations on a central server. This
capability allowed us to gather data from many different locales simultaneously and capture collaborative
interactions and responses to team member feedback that might otherwise prove too costly or time-consuming using
traditional methods alone. As such, we were able to follow distributed team members in ways that other approaches
to data collection might not be able to reasonably achieve. Today’s members of collaborative work groups
potentially have memberships on multiple teams (Mortensen, Woolley, & O’Leary, 2007), perform a variety of conceptual and decision-making tasks, and split their time across myriad locales. Sensor technology allows us to follow
these people through many of these transitions.
Finally, sociometric badges are also scalable and record continuous streams of interaction data for indefinite
periods. By capturing streaming data, two things are now possible that were previously very difficult to execute:
(i) objective data-supported real-time interventions and (ii) extensive causal analyses of team processes, their
mediators, moderators, and outcomes. In particular, using badge data to complement existing data capture (e.g.,
diaries and surveys) will provide an interesting avenue for exploration as potential discrepancies between self-report
and objective data capture emerge. We might also consider how the presence of a recording device of any type might
affect the self-report process itself. Attacking the question of what we experience and report versus what actually
transpired is an exciting proposition and one that will result in a better understanding of team interactions
and motives.
Limitations
One significant limitation of sensor technology is that it does not collect conversation content. This design decision
was due to the privacy concerns of subjects and the low accuracy of speech recognition. It is our hope, however,
that this limitation will actually enable more data collection opportunities, as privacy issues are mitigated through
anonymous data capture. Additionally, although we tested the badges in several pilot studies, the potential for this
research tool has yet to be widely tapped. Lastly, although data capture and processing are relatively simple,
managing such large volumes of data requires new considerations for research design and strategy.
Conclusion
Recently, a debate has sprung up around the constitutionality of law enforcement using GPS tracking devices on
suspects’ vehicles. Proponents argue that it is a technologically mediated and efficient means of conducting the same
surveillance work that law enforcement has always performed. Opponents cite the Fourth Amendment, suggesting
that the tracking devices that transmit continuous location data violate expectations of privacy and amount to
unreasonable search and seizure. Debate aside, law enforcement officials have begun to utilize these tools at a
different point in the process of building a case against a suspect: before and during the commission of a crime. This
is a dramatic step away from the traditional evidence-based policing that occurs after a crime has been committed.
Somewhat analogous to the application of GPS technology in this instance, sociometric badges can offer teams
researchers entry at an earlier point in the research process, potentially reframing the way we conceptualize research
design. Because of their unobtrusiveness and low cost, badge data from pilot studies may also prove fertile grounds
for exploring and challenging assumptions present in extant research.
Once in the field, these badges can help transition field observers from part of the context to part of the action. We
have here positioned the sociometric badge as a tool to capture collaboration patterns of teams in real time. However,
we suspect there are uses of this technology beyond our current conceptualizations. We thus call on researchers
across disciplines to unpack the possibilities for sociometric badge data to further our understanding of the changing
ecology of teams.
Author biographies
Taemie Kim is a researcher at the Accenture Technology Labs focusing on social collaboration in enterprises. She
received her PhD from the MIT Media Lab, where she conducted research on the sociometric badges. Her research focuses on team collaboration patterns and on using those patterns to influence team behavior and performance.
Erin McFee is currently a research associate in Organizational Behavior at Harvard Business School and the lead
research associate for Organizing for Health. Her research interests are in teams, leadership, psychology, and health
care. Erin received her MBA from the Simmons College School of Management.
Daniel Olguin Olguin is the CTO of Sociometric Solutions. He received a PhD from the MIT Media Lab, where he
worked on the development of a sensor-based organizational design and engineering approach that combines human
behavioral sensor data with performance information to model, simulate, and optimize organizational performance.
Ben Waber is a Senior Researcher at Harvard Business School and a visiting scientist at the MIT Media Lab, where
he received his PhD. Ben is also the President and CEO of Sociometric Solutions, a management consulting
firm based on his research.
Alex “Sandy” Pentland directs MIT’s Human Dynamics Laboratory and the MIT Media Lab Entrepreneurship
Program. He previously helped create and direct the MIT Media Lab and is among the most-cited computational
scientists in the world. His most recent book is ‘Honest Signals,’ published by MIT Press.
References
Aran, O., & Gatica-Perez, D. (2010). Fusing audio-visual nonverbal cues to detect dominant people in group conversations.
Proceedings International Conference on Pattern Recognition (ICPR) (pp. 3687–3690) doi:10.1109/ICPR.2010.898
Barley, S. R. (1990). Images of imagining: Notes on doing longitudinal field work. Organization Science, 1(3), 220–247.
Bazerman, M. H., Mannix, E. A., & Thompson, L. L. (1988). Groups as mixed-motive negotiations. Advances in Group
Processes, 5, 195–216.
Bicchieri, C., & Lev-On, A. (2007). Computer-mediated communication and cooperation in social dilemmas: An experimental
analysis. Politics, Philosophy & Economics, 6, 139–168.
Brdiczka, O., Maisonnasse, J., & Reignier, P. (2005). Automatic detection of interaction groups. Proceedings of the IEEE
International Conference on Multimodal Interface (pp. 32–36). doi:10.1145/1088463.1088473
Choudhury, T., & Pentland, A. (2004). Characterizing social interactions using the sociometer. Proceedings of NAACSOS.
Cramton, C. D. (2001). The mutual knowledge problem and its consequences for dispersed collaboration. Organization Science, 12, 346–371.
Curhan, J., & Pentland, A. (2007). Thin slices of negotiation: Predicting outcomes from conversational dynamics within the first
five minutes. Journal of Applied Psychology, 92, 802–811.
DiMicco, J. M., Pandolfo, A., & Bender, W. (2004). Influencing group participation with a shared display. Proceedings of the
ACM Conference on Computer Supported Cooperative Work (p. 614). doi:10.1145/1031607.1031713
Dong, W., Lepri, B., Cappelletti, A., Pentland, A., Pianesi, F., & Zancanaro, M. (2007). Using the influence model to recognize
functional roles in meetings. Proceedings of the ACM International Conference on Multimodal Interaction (pp. 271–278).
doi:10.1145/1322192.1322239
Frese, M., & Zapf, D. (1988). Methodological issues in the study of work stress: Objective vs. subjective measurement of work
stress and the question of longitudinal studies. In C. L. Cooper, & R. Payne (Eds.), Causes, coping, and consequences of stress
at work (pp. 375–409). West Sussex, England: John Wiley.
Gips, J. P. (2006). Social motion: Mobile networking through sensing human behavior (Master’s thesis). MIT Media Laboratory,
Cambridge, MA.
Gladstein, D. L. (1984). Groups in context: A model of task group effectiveness. Administrative Science Quarterly, 29(4), 499–517.
Harrison, D. A., Mohammed, S., McGrath, J. E., Florey, A. T., & Vanderstoep, S. W. (2003). Time matters in team performance:
Effects of member familiarity, entrainment, and task discontinuity on speed and quality. Personnel Psychology, 56, 633–669.
Harrison, D. A., Price, K. H., Gavin, J. H., & Florey, A. T. (2002). Time, teams, and task performance: Changing effects of
surface- and deep-level diversity on group functioning. Academy of Management Journal, 45, 1029–1045.
Hinds, P., & Bailey, D. (2003). Out of sight, out of sync: Understanding conflict in distributed teams. Organization Science, 14,
615–632.
Jehn, K. (1995). A multi-method examination of the benefits and detriments of intragroup conflict. Administrative Science
Quarterly, 40, 256–282.
Kim, T., Brdiczka, O., Chu, M., & Begole, B. (2009). Predicting shoppers’ interest from social interaction using sociometric
sensors. Proceedings of the ACM Conference on Human Factors In Computing Systems (pp. 4513–4518). doi:10.1145/
1520340.1520692
Leavitt, H. J. (1975). Suppose we took groups seriously. In E. L. Cass, & G. G. Zimmer (Eds.), Men and work in society: A report
on the symposium held on the occasion of the 50th anniversary of the original Hawthorne studies (pp. 67–77). New York, NY:
Van Nostrand Reinhold.
Leonard-Barton, D. A. (1990). A dual methodology for case studies: Synergistic use of a longitudinal single site with replicated
multiple sites. Organization Science, 1(3), 1–19.
Mortensen, M., Woolley, A. W., & O’Leary, M. B. (2007). Conditions enabling effective multiple team membership. In
K. Crowston, S. Sieber, & E. Wynn (Eds.), Virtuality and virtualization (Vol. 236, pp. 215–228). Portland, OR: Springer
Publishers.
Nguyen, D. T., & Canny, J. F. (2009). More than face-to-face: Empathy effects of video framing. CHI 2009, April 4–9, Boston,
Massachusetts, USA. Available at bid.berkeley.edu/files/papers/CHIempathy09.pdf
Olguín, D. A., Gloor, P. A., & Pentland, A. (2009). Wearable sensors for pervasive healthcare management. 3rd International
Conference on Pervasive Computing Technologies for Healthcare. London.
Orbell, J., Kragt, A., & Dawes, R. (1988). Explaining discussion-induced cooperation. Journal of Personality and Social
Psychology, 54, 811–819.
Pentland, A. (2008) Honest signals: How they shape our world. Cambridge, MA: MIT Press.
Pianesi, F., Zancanaro, M., Not, E., Leonardi, C., Falcon, V., & Lepri, B. (2008). Multimodal support to group dynamics.
Personal Ubiquitous Computing, 12, 181–195.
Rocco, E. (1998). Trust breaks down in electronic contexts but can be repaired by some initial face-to-face contact. Proceedings of
the ACM Conference on Human Factors in Computing Systems (pp. 496–502). doi:10.1145/274644.274711
Rooksby, J., Baxter, G., Cliff, D., Greenwood, D., Harvey, N., Kahn, A. W., . . . Sommerville, I. (2009). The UK large scale
complex IT systems initiative. www.LSCITS.org
Spector, P. E. (1992). A consideration of the validity and meaning of self-report measures of job conditions. In C. L. Cooper, &
I. T. Robertson (Eds.), International review of industrial and organizational psychology. West Sussex, England: John Wiley.
Spector, P. E. (1994). Using self-report questionnaires in OB research: A comment on the use of a controversial method. Journal
of Organizational Behavior, 15, 385–392.
Staw, B. M. (1975). Attribution of the ‘causes’ of performance: A general alternative interpretation of cross-sectional research on
organizations. Organizational Behavior and Human Performance, 13, 414–432.
Sung, M., & Pentland, A. (2009). Stress and lie detection through non-invasive physiological sensing. Biomedical Soft
Computing and Human Sciences, 14, 109–116.
Van de Ven, A. H., & Huber, G. P. (1990). Longitudinal field research methods for studying processes of organizational change.
Organization Science, 1(3), 213–219.
Appendix A: Sociometric Badges
The sociometric badge is a wearable electronic-sensing device that collects and analyzes social behavioral data
(Olguín et al., 2009; Figure 5). It is designed to be worn around one’s neck like a typical company ID badge.
The badge extracts speech features in real time to measure nonlinguistic social signals. It does not record speech
content but is capable of identifying social behavior such as speaking time, speaking speed, and speech energy of the
user. Turn-taking patterns and short affirming phrases reveal social dynamics that can be measured by synchronizing
the badges of multiple participants. Sociometric badges also measure body movement using a single three-axis
accelerometer. This can detect individual activities such as gestures and posture as well as social interactions such
as body movement mimicry or rhythmic patterns. The badges can send and receive information over 2.4 GHz radio
to and from different users and base stations in real time. The proximity data collected from this radio paint a picture
of the relational distance and position of multiple wearers (up to a 10-m radius at 1-m resolution). This function
can be used to detect the distribution of team members. Additionally, the badges feature an infrared sensor, which
Figure 5. Sociometric badge
can capture and identify face-to-face interactions; the sensor not only records the encounter but also the postural
direction of the interacting bodies. Researchers can also perform indoor user localization by measuring received
signal strength from fixed base stations, detecting proximal travel patterns.
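One common way to map received signal strength to approximate distance is the log-distance path-loss model. The sketch below uses an assumed reference RSSI and path-loss exponent, not the badges' actual calibration:

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-55.0, path_loss_exp=2.2):
    """Estimate distance in meters from received signal strength.

    Log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10 * n * log10(d).
    Both parameters are illustrative; a deployed system would calibrate
    them per environment.
    """
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

print(rssi_to_distance(-55.0))  # 1.0 m at the reference strength
print(rssi_to_distance(-77.0))  # 10.0 m at a much weaker signal
```

In practice, RSSI fluctuates heavily with multipath and body shadowing, which is consistent with the badges reporting proximity only at roughly 1-m resolution.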
Figure 6 shows sample data from a couple shopping for furniture (Kim, Brdiczka, Chu, & Begole, 2009). The couple was
asked to wear the sociometric badges while strolling through a furniture store and browsing
multiple items. The figure shows the multimodal data collected while they looked at a single item, a couch. The darker
lines indicate the female’s data and the lighter lines the male’s, with the x-axis representing time. The top row
shows how the couple moved about during the period: the flat area in the middle corresponds to the time when
the couple was sitting down on a couch. The second row shows the speech energy of the couple; the spiking patterns
show fluctuations in volume and speech variance. The third row depicts a binary value indicating whether each individual
was speaking. One can see the turn transitions of the couple as well as the overlap in speech, which is especially
pronounced during the time they were sitting on the couch. The fourth row shows infrared hits, which
occur when the couple turned their body so that they were directly facing each other. The last row shows the signal
strength of the radio, which estimates the proximity of the couple. Together, the multimodal data provide a broad
understanding of the dynamics and communication between the couple as they toured the store, engaged in
conversation while sitting on one of the couches, and responded to each other’s dialog.
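The third row of Figure 6, a binary is-speaking signal per wearer, is sufficient to recover turn transitions and speech overlap. Here is a simplified two-person sketch, with invented framing and sampling rather than the badges' actual speech processing:

```python
def turn_stats(a, b):
    """Count turn transitions and overlapping frames from two binary
    voice-activity streams (hypothetical per-frame 0/1 samples).
    """
    overlap = sum(1 for x, y in zip(a, b) if x and y)
    # A turn transition: the most recent solo speaker changes.
    transitions, last_solo = 0, None
    for x, y in zip(a, b):
        if x != y:  # exactly one person speaking in this frame
            solo = "a" if x else "b"
            if last_solo and solo != last_solo:
                transitions += 1
            last_solo = solo
    return transitions, overlap

# a speaks, b answers, both overlap briefly, then a speaks again.
a = [1, 1, 0, 0, 1, 1, 1, 0]
b = [0, 0, 1, 1, 1, 0, 0, 0]
print(turn_stats(a, b))  # (2, 1): two turn transitions, one overlapping frame
```

Summed over a session and synchronized across badges, counts like these are the raw material for the turn-taking measures used throughout the case studies.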
Appendix B: Sociometric Feedback Application
The sociometric feedback was designed for collaboration settings in which active and balanced participation and
frequent turn transitions are beneficial to the group performance. Each of the four participants is represented by a
colored square in the corner of the screen. In the study, the square colors were identical to the color of each
participant’s badge and seat.
We display each member’s speaking time through the thickness of the line connecting the central circle with each
member’s corner. The circle in the center of the image portrays the participation balance within the group: the more a
participant talks, the closer they pull the circle toward their corner. Lastly, the color of the central circle gradually
shifts between white and green to indicate the frequency of turn transitions. Green corresponds to frequent turn
transitions, which can be a predictor of the group interactivity level (Figure 7).
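The balance display's geometry can be sketched as a speaking-time-weighted average of the four corner positions. The coordinates below are invented for illustration and are not the study's exact layout:

```python
def circle_position(speaking_times, corners=((0, 0), (1, 0), (0, 1), (1, 1))):
    """Place the central circle at the speaking-time-weighted mean of corners.

    speaking_times: accumulated seconds for the four members, in corner order.
    With perfectly balanced participation the circle sits at the center
    (0.5, 0.5); it drifts toward whoever dominates the conversation.
    """
    total = sum(speaking_times)
    if total == 0:
        return (0.5, 0.5)  # no one has spoken yet: circle rests at the center
    x = sum(t * cx for t, (cx, _) in zip(speaking_times, corners)) / total
    y = sum(t * cy for t, (_, cy) in zip(speaking_times, corners)) / total
    return (x, y)

print(circle_position([30, 30, 30, 30]))  # (0.5, 0.5): balanced group
print(circle_position([90, 10, 10, 10]))  # pulled toward the first corner
```

Because the position is recomputed from accumulated speaking times, the display drifts slowly, matching the design goal of peripheral rather than attention-demanding feedback.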
The data are accumulated throughout the meeting, showing the accumulated communication patterns from the
start of the meeting to the current time. We also placed the devices on the table facing each user and at the
periphery of the user’s attention. The display updates periodically so that it does not require constant attention
from the user.
Figure 6. Sample badge data of a couple shopping for furniture
Figure 7. Team interaction snapshots