Designing a Data-Driven Survey System: Leveraging Participants' Online Data to Personalize Surveys
Abstract
1 Introduction
2 Related Work
2.1 Existing Survey Tools
2.2 The Use of Chatbots in Survey Research
2.3 Existing Instrumental Toolkits
2.4 Researchers’ Challenges in Data-Driven Survey Creation
3 Formative Research
3.1 Literature Review
3.1.1 Method.
3.1.2 Results.
| Paper | Data Collection Method | Purpose of Data Collection |
|---|---|---|
| Epstein et al. [27] | Fitbit API | to implement an ad-hoc data-driven survey |
| Zufferey et al. [90] | Fitbit API | to infer participants’ personality traits |
| Orlosky et al. [62] | Fitbit API | did not specify |
| Dreher et al. [24] | used a proprietary service (Fitabase*) | to validate Fitbit usage |
| Stück et al. [75] | asked members of a health campaign (AchieveMint) | to analyze physical activity behavior |
| Shin [74] | used a mobile app called “HealthExported for Fitbit to CSV” | to analyze physical activity behavior |
| Dai et al. [20] | did not specify | to validate Fitbit usage |
| Preusse et al. [66] | downloaded manually | did not specify |

\* See https://www.fitabase.com/, last accessed Feb. 2024.
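Most of the ad-hoc approaches in the table above rely on the Fitbit Web API. As a minimal sketch of that collection route, the snippet below builds an authorized request for a participant's daily activity summary and extracts the step count from the response; the helper names and sample payload are illustrative assumptions, not artifacts from any of the cited studies.

```python
import json
import urllib.request

# Daily activity summary endpoint of the Fitbit Web API ("-" = current user).
FITBIT_ACTIVITY_URL = "https://api.fitbit.com/1/user/-/activities/date/{date}.json"


def build_fitbit_request(access_token: str, date: str) -> urllib.request.Request:
    """Build an OAuth-authorized GET request for one day of activity data."""
    url = FITBIT_ACTIVITY_URL.format(date=date)
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {access_token}"}
    )


def daily_steps(activity_json: str) -> int:
    """Extract the step count from an activity-summary response body."""
    payload = json.loads(activity_json)
    return payload["summary"]["steps"]


# Response fragment shaped like the endpoint's output (values invented):
sample = '{"summary": {"steps": 8421, "caloriesOut": 2310}}'
req = build_fitbit_request("ACCESS_TOKEN", "2024-02-01")
print(req.full_url)        # the per-day URL such studies would request
print(daily_steps(sample)) # 8421
```

Sending the request (e.g., via `urllib.request.urlopen`) requires a valid OAuth 2.0 access token obtained through Fitbit's consent flow, which is what makes this route participant-authorized rather than scraped.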
| Freq. | Survey Goals | Examples | Primary* | Screener* | Time** | Reliability** | Quality** | Personaliz.** |
|---|---|---|---|---|---|---|---|---|
| n = 9 | Assessed device ownership, brand, and usage patterns | [25, 67] | ✓ | - | ✓ | ✓ | - | - |
| n = 2 | Assessed respondent’s motivation or behavior for physical activity | [63, 84] | ✓ | - | - | - | ✓ | - |
| n = 3 | Assessed perceived usefulness or value of data | [44, 80] | ✓ | - | - | - | ✓ | - |
| n = 4 | Assessed willingness to share data and data-sharing behavior | [33, 73] | ✓ | - | - | - | ✓ | - |
| n = 6 | Assessed perceived data sensitivity and privacy concerns | [53, 54] | ✓ | - | - | - | ✓ | - |
| n = 4 | Presented threat scenarios to study privacy-coping strategies | [8, 33] | ✓ | - | - | - | ✓ | - |
| n = 2 | Tested data-sharing ideas using hypothetical scenarios (e.g., mock-up interfaces) | [46, 81] | ✓ | - | - | - | - | ✓ |
| n = 36 | Assessed device ownership, brand, usage patterns, and use of third-party apps | [58, 79] | - | ✓ | ✓ | ✓ | - | - |
| n = 1 | Verified device ownership by asking for a photo of the device | [89] | - | ✓ | ✓ | ✓ | - | - |
| n = 2 | Asked participants to bring sample data to (follow-up) interviews | [38, 74] | - | ✓ | ✓ | - | ✓ | - |

\* These columns distinguish papers where the survey was a primary instrument from those where it was a screener tool.

\** These columns show the potential benefits of a data-driven approach for surveys with different goals: (i) saving respondents’ time by omitting easily collectible questions (e.g., device brand), (ii) enhancing response reliability by preventing dishonest answers (e.g., false device ownership claims on Prolific), (iii) improving data quality by reducing reliance on self-reported information, and (iv) personalizing questions (e.g., customizing mock-up interfaces).
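Benefits (i)–(iv) in the footnote above amount to deriving skip logic, reliability checks, and personalized wording from a participant's fetched data. The sketch below illustrates that idea under assumed field names (`device_brand`, `steps_last_week`, etc.); it is not the paper's implementation, only a hypothetical shape such logic could take.

```python
def personalize_survey(claimed_brand: str, fetched: dict) -> dict:
    """Derive a survey plan (skips, prefills, reliability flags) from
    data fetched via a device API, illustrating benefits (i)-(iv)."""
    plan = {"skip": [], "prefill": {}, "flags": []}

    if "brand" in fetched:
        # (i) Save time: omit the easily collectible brand question.
        plan["skip"].append("device_brand")
        # (iii) Improve quality: use the fetched value, not self-report.
        plan["prefill"]["device_brand"] = fetched["brand"]
        # (ii) Reliability: flag a mismatch with the claimed brand.
        if claimed_brand and claimed_brand != fetched["brand"]:
            plan["flags"].append("brand_mismatch")

    if "steps_last_week" in fetched:
        # (iv) Personalization: embed the participant's own data in a prompt.
        plan["prefill"]["activity_prompt"] = (
            f"Last week you logged {fetched['steps_last_week']} steps. "
            "How typical was that week for you?"
        )
    return plan


plan = personalize_survey("Garmin", {"brand": "Fitbit", "steps_last_week": 52340})
```

Here the mismatch between the claimed brand ("Garmin") and the fetched one ("Fitbit") would surface as a reliability flag, the kind of dishonest-screener answer footnote (ii) describes.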
3.2 Online Survey
| Gender | n | % |
|---|---|---|
| Woman | 26 | 50.0% |
| Man | 23 | 44.2% |
| Non-binary | 1 | 1.9% |
| Prefer not to disclose | 2 | 3.8% |

| Age | n | % |
|---|---|---|
| 25-34 years | 15 | 28.8% |
| 35-44 years | 24 | 46.2% |
| 45-54 years | 7 | 13.5% |
| 55-64 years | 3 | 5.8% |
| 65+ years | 2 | 3.8% |
| Prefer not to disclose | 1 | 1.9% |

| Main Research Field | n | % |
|---|---|---|
| Security and privacy (S&P)* | 26 | 50.0% |
| HCI | 19 | 36.5% |
| Information systems (IS) | 3 | 5.8% |
| Social/professional topics | 2 | 3.8% |

| Surveys Conducted (Last 5 Yrs) | n | % |
|---|---|---|
| 4+ | 43 | 82.7% |
| 3 | 5 | 9.6% |
| 2 | 3 | 5.8% |
| 1 | 1 | 1.9% |

| Proportion of Research Using Surveys | n | % |
|---|---|---|
| None | 0 | 0.0% |
| Very little | 1 | 1.9% |
| A little | 5 | 9.6% |
| About half | 16 | 30.8% |
| A lot | 11 | 21.1% |
| A great deal | 16 | 30.8% |
| All | 3 | 5.8% |

| Survey Platforms Used (multiple selections possible) | n | % |
|---|---|---|
| Qualtrics | 43 | 82.7% |
| Google Forms | 36 | 69.2% |
| SurveyMonkey | 27 | 51.9% |
| LimeSurvey | 12 | 23.1% |
| Microsoft Forms | 2 | 3.8% |
| Typeform | 2 | 3.8% |
| Unipark | 2 | 3.8% |
| Alchemer | 1 | 1.9% |
| AWS | 1 | 1.9% |
| Checkbox Survey Solutions | 1 | 1.9% |
| EU Survey | 1 | 1.9% |
| QuestionPro | 1 | 1.9% |
| SharePoint | 1 | 1.9% |
| Slido | 1 | 1.9% |
| UserZoom | 1 | 1.9% |
| WJX | 1 | 1.9% |
| Made their own system | 2 | 3.8% |

\* Note that 96.2% of these researchers reported working in the sub-field of usable security and privacy.
3.2.1 Method.
3.2.2 Results.
4 Design and Implementation of DDS
4.1 Design Introduction
4.1.1 Design Goals.
4.1.2 Design Methodology.
4.1.3 Choice of Integrated Platforms and Services.
4.1.4 Overview and Core Features of DDS.
4.1.5 Distributing DDS Surveys.
4.2 Privacy Considerations
4.3 Architecture
4.4 Researcher Flow
4.5 Participant Flow
4.6 Extensions and Forward Compatibility
5 Discussion
5.1 Contribution
5.2 Limitations and Future Work
5.2.1 Limitations.
5.2.2 Future Work.
6 Conclusion
Acknowledgments
A Survey Transcript
| Survey sections | Question numbers |
|---|---|
| Sec. 1 | Q1 |
| Sec. 2 Research Field | Q2, Q3 |
| Sec. 3 Screening | Q4, Q5, Q6 |
| Sec. 4 Main - Survey Flow and Skip Logic | Q7, Q8, Q9, Q10, Q11, Q12, Q13 |
| Sec. 5 Main - Questions and Answers | Q14, Q15, Q16, Q17, Q18, Q19, Q20 |
| Sec. 6 Main - Custom Variables | Q21, Q22, Q23, Q24, Q25, Q26, Q27 |
| Sec. 7 Main - Open | Q28 |
| Sec. 8 Background | Q29, Q30 |
| Sec. 9 Follow-up | Q31 |

You can click on section numbers or question numbers to jump to the associated section or question.
B Detailed Architecture
Supplemental Material
- Transcript (7.57 MB)
- Transcript (218.07 MB)
References
Publisher: Association for Computing Machinery, New York, NY, United States
Badges: Honorable Mention