Abstract
Mobile health applications (‘health apps’) that promise to help users with some aspect of their health are very popular: for-profit apps such as MyFitnessPal, Fitbit, or Headspace have tens of millions of users each. For-profit health apps are designed and run as optimization systems. One would expect these health apps to optimize the health of the user, but in reality they aim to optimize user engagement and, in effect, conversion. This is problematic, I argue, because digital health environments that aim to optimize user engagement risk being manipulative. To develop this argument, I first provide a brief analysis of the underlying business models and the resulting designs of the digital environments provided by popular for-profit health apps. In a second step, I present a concept of manipulation that can help analyze digital environments such as health apps. In the last part of the article, I use this concept of manipulation to analyze the manipulative potential of for-profit health apps. Although for-profit health apps can certainly empower their users, the conditions for empowerment largely overlap with the conditions for manipulation. As a result, we should be cautious when embracing the empowerment discourse surrounding health apps. An additional aim of this article is to contribute to the rapidly growing literature on digital choice architectures and the ethics of influencing behavior through such choice architectures. I take health apps to be a paradigmatic example of digital choice architectures that give rise to ethical questions, so my analysis of the manipulative potential of health apps can also inform this larger literature.
Notes
I thus do not focus on non-profit health apps, or on the more medical health apps that are built—often in cooperation with academic hospitals—to address a very specific medical problem. I focus on the big commercial players that develop ‘healthy lifestyle and wellness’ services for the general consumer population.
Conversion refers to the process of turning users into profitable users.
Thanks to Thomas Nys for pointing this out to me. See also Nys (2016).
See Fahy et al. (2018) for a more elaborate analysis of Apple’s and Google’s developer platforms and the different kinds of monetization strategies apps can use.
Apple, ‘Choosing a Business Model’: https://developer.apple.com/app-store/business-models/.
Google, ‘Earn more revenue with the right monetization options’: https://developer.android.com/distribute/best-practices/earn/monetization-options.
In the case of health apps this means advertising that is deliberately designed to ‘look and feel’ like health content, thereby obfuscating to the user the true commercial nature of the content.
This option is not explicitly mentioned by Apple or Google. It is, however, a serious option. MyFitnessPal, an immensely popular calorie counting app with millions of users, is a good example. On its jobs page, MyFitnessPal announces that “MyFitnessPal has the largest database of human eating habits in the world. The opportunities for a data scientist here are almost endless.” (https://www.myfitnesspal.com/jobs) Forbes reports that health care providers and researchers can access the database when they enter into a “formal partnership” with MyFitnessPal (Olson 2014).
Apple, ‘Using the Freemium Model’: https://developer.apple.com/app-store/freemium-business-model/.
Google, ‘Improve Conversion Using Google Analytics for Firebase’: https://developer.android.com/distribute/best-practices/earn/improve-conversions.
Headspace, ‘About Headspace’: https://www.headspace.com/about-us.
This lack of access to the (business) operations of Big Tech companies could of course be criticized for a variety of reasons. For example, it makes it harder for investigative journalists and academics to scrutinize the practices of these companies. The same holds for policymakers and regulators who often have a hard time gaining access to Big Tech companies.
Headspace, ‘Senior Data Analyst,’ job description that has since been removed, screenshot available here: https://imgur.com/a/qtSe4Ii.
Headspace, ‘VP of Growth,’ job description that has since been removed, screenshot available here: https://imgur.com/a/qtSe4Ii.
There is a rich literature on Foucauldian biopower and health and the role (digital) technologies can play in the exercise of biopower (e.g., Foucault 1975; Armstrong 1995; Petersen and Bunton 1997; Casper and Morrison 2010; Lupton 2012; Mayes 2015; Ajana 2017; Fotopoulou and O’Riordan 2017; Sanders 2017). Although this literature provides interesting and promising perspectives for my research, I do not have enough space at my disposal in this article to incorporate this complex literature into my argument.
NativeAdBuzz, ‘This Health and Wellness Boom Has Been Building for Years… And It’s Finally About to ERUPT (Urgent: Your Free VIP Christmas Gift Has Arrived)’: http://www.nativeadbuzz.com/blog/this-health-and-wellness-boom-has-been-building-for-years-and-its-finally-about-to-erupt-urgent-your-free-vip-christmas-gift-has-arrived/.
Fitbit, ‘Why Fitbit’: https://www.fitbit.com/whyfitbit.
MyFitnessPal, ‘These Playlists Were Built to Make You Better’: https://blog.myfitnesspal.com/these-playlists-were-built-to-make-you-better/.
MyFitnessPal, ‘These On-Ear Headphones Can Actually Withstand Your Workouts’: https://blog.myfitnesspal.com/these-on-ear-headphones-can-actually-withstand-your-workouts/.
MyFitnessPal, ‘Why and How You Should Nix an Alarm Clock’: https://blog.myfitnesspal.com/why-and-how-you-should-nix-an-alarm-clock/.
MyFitnessPal, ‘A Day in the Life of a Yoga Teacher’: https://blog.myfitnesspal.com/day-life-yoga-teacher/.
MyFitnessPal, ‘How a Nutritionist Spends $50 at Whole Foods’: https://blog.myfitnesspal.com/how-a-nutritionist-spends-50-at-whole-foods/.
Susser et al. (2019a) speak of ‘vulnerabilities.’ I rather use the looser term ‘exploitable characteristics of a person’ because the term ‘vulnerabilities’ is sometimes associated (especially in legal discourse) with a fixed set of narrowly defined weaknesses, such as those that result from one’s age (‘the old and the young’) or from one’s physical or mental infirmities (‘people with medically diagnosed handicaps’).
Rudinow (1978, p. 346, emphasis added) explains that “the manipulator’s behavior is normally either deceptive or predicated on some privileged insight into the personality of his intended manipulee.”
Mills (2014, p. 138) provides a similar argument, referring to Gorin (2014) and Barnhill (2014): “Both Gorin and Barnhill point out that manipulation does not need to involve deception or covertness; these are not defining features of manipulation necessarily present in all cases of what we could agree to be manipulation. But most manipulators seek to hide the degree to which they are angling to achieve their desired result and would find the success of their project seriously compromised if their manipulative intentions were revealed.”
Christman (1991, p. 10) has formulated a popular, somewhat weaker alternative: “What matters is what the agent thinks about the process of coming to have the desire, and whether she resists that process when (or if) given the chance.”
I admit that the resulting picture can feel a bit messy or unclear. Someone can have a manipulative mindset but, in the end, be drawn to techniques that are not manipulative in nature: a mindset that is manipulative in principle does not necessarily lead to manipulation. Although I agree that the resulting picture is messy, I see no way to avoid this. The empirical reality of data-driven, dynamically adjustable choice architectures simply is very messy. The industry is constantly running (multiple, parallel) experiments on a plethora of tweaks to its digital choice architectures to test whether those tweaks can successfully shape (patterns of) behavior. In this constant hunt for new behavioral influence techniques, some will turn out to be manipulative, while others will turn out to be something else (e.g., coercive).
It could of course still be argued that quite a few business-to-consumer practices rise to the level of manipulation. Take, for example, Santilli (1983) and Crisp (1987), who argue that nearly all advertising is—at least slightly—manipulative. If one really wants to stretch my concept of manipulation, one could even try to argue that a billboard showing advertising is manipulative. Such a billboard with an advertisement for company X is (1) put up intentionally by company X, (2) with the aim to further the ends of company X without a genuine regard for the interests of the people passing by the billboard, (3) designed by company X in such a manner that it targets either particular people in the street, or particular desires or fears of people in the street, and (4) does not explicitly communicate that “company X is trying to target you in such a manner that company X earns as much as possible.” Even if we agree that a billboard can, strictly speaking, be interpreted as manipulative, it does not follow that every instance of manipulation warrants the same level of scrutiny. There is a significant difference between, on the one hand, a billboard that displays one and the same message to every person at a fixed location, and, on the other hand, a digital health environment that builds a relationship with the user over time, offers a continuous communication channel to the user, and can be personalized in real time based on what continuous experiments indicate will lead to maximum engagement. Unlike billboards, digital technologies like the health apps discussed here can get to know their users over time and can, whenever they see fit (e.g., through push notifications), try to leverage that information to manipulate every user personally.
Already in 1999, Hanson and Kysar used the concept of ‘market manipulation’ to identify such cases, a concept that was later updated by Calo (2014) who spoke of ‘digital market manipulation.’ Calo (2014, p. 1018) noted how “firms will increasingly be able to create suckers, rather than waiting for one to be born.” Spencer (2019, p. 34) has argued in a similar vein that “[r]ather than discovering existing vulnerabilities, marketers could exacerbate or even create vulnerabilities in individual subjects and then exploit those vulnerabilities.”
For example, Headspace was funded through four funding rounds, raising $75 million (https://www.crunchbase.com/organization/headspace#section-investors). MyFitnessPal also received funding from venture capitalists (https://www.crunchbase.com/organization/myfitnesspal#section-investors) and was later acquired by Under Armour for $475 million (Olson 2015). Fitbit also saw four funding rounds raising $66 million from venture capitalists (https://www.crunchbase.com/organization/fitbit#section-investors).
Consider also Culbert et al.’s (2015) meta-analysis of what causes people’s problematic relation to food. They emphasize that especially for perfectly healthy adolescent and young adult females, (digital) media exposure, and more specifically health ideals portrayed in those media, “have all been shown to prospectively predict increased levels of disordered eating cognitions and behaviors (e.g., body dissatisfaction, dieting, bulimic symptoms)” (Culbert et al. 2015, p. 1145).
Here are a few examples. The blog post called ‘A Day in the Life of a Yoga Teacher’ (https://blog.myfitnesspal.com/day-life-yoga-teacher/) is, in reality, native advertising for skincare products that are framed as being part of a healthy, mindful life. The blog post called ‘These Playlists Were Built to Make You Better’ (https://blog.myfitnesspal.com/these-playlists-were-built-to-make-you-better/) is native advertising for a particular brand of headphones. The blog post called ‘Why and How You Should Nix an Alarm Clock’ (https://blog.myfitnesspal.com/why-and-how-you-should-nix-an-alarm-clock/) is native advertising for a company offering “certified sleep coaches” and for a company selling a wide range of sleep products.
References
Ajana, B. (2017). Digital health and the biopolitics of the quantified self. Digital Health, 3, 1–18.
Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us hooked. New York: Penguin Press.
Anderson, J. H. (2014). Autonomy and vulnerability entwined. In C. Mackenzie, W. Rogers, & S. Dodds (Eds.), Vulnerability: New essays in ethics and feminist philosophy (pp. 134–161). Oxford: Oxford University Press.
Anderson, J. H., & Honneth, A. (2005). Autonomy, vulnerability, recognition, and justice. In J. Christman & A. Honneth (Eds.), Autonomy and the challenges to liberalism (pp. 127–149). Cambridge: Cambridge University Press.
Armstrong, D. (1995). The rise of surveillance medicine. Sociology of Health & Illness, 17(3), 393–404.
Barnhill, A. (2014). What is manipulation? In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 51–72). Oxford: Oxford University Press.
Baron, M. (2003). Manipulativeness. Proceedings and Addresses of the American Philosophical Association, 77(2), 37–54.
Boorse, C. (1975). On the distinction between disease and illness. Philosophy and Public Affairs, 5(1), 49–68.
Boorse, C. (1977). Health as a theoretical concept. Philosophy of Science, 44(4), 542–573.
Bovens, L. (2009). The ethics of nudge. In T. Grüne-Yanoff & S. Hansson (Eds.), Preference change: Approaches from philosophy, economics and psychology. New York: Springer.
Brodesser-Akner, T. (2018). How Goop’s haters made Gwyneth Paltrow’s company worth $250 million: Inside the growth of the most controversial brand in the wellness industry. The New York Times Magazine, July 25, 2018. Retrieved from https://www.nytimes.com/2018/07/25/magazine/big-business-gwyneth-paltrow-wellness.html.
Buss, S. (2005). Valuing autonomy and respecting persons: Manipulation, seduction, and the basis of moral constraints. Ethics, 115(2), 195–235.
Calo, R. (2014). Digital market manipulation. George Washington Law Review, 82(4), 995–1051.
Casper, M., & Morrison, D. (2010). Medical sociology and technology: Critical engagements. Journal of Health and Social Behavior, 51(1), 12–32.
Cederström, C., & Spicer, A. (2015). The wellness syndrome. Cambridge: Polity Press.
Chaykowski, K. (2017). Meet Headspace, the app that made meditation a $250 million business. Forbes, January 8, 2017. Retrieved from https://www.forbes.com/sites/kathleenchaykowski/2017/01/08/meet-headspace-the-app-that-made-meditation-a-250-million-business/#7641f4f81f1b.
Christman, J. (1991). Autonomy and personal history. Canadian Journal of Philosophy, 21(1), 1–24.
Christman, J. (2004). Relational autonomy, liberal individualism, and the social constitution of selves. Philosophical Studies, 117(1/2), 143–164.
Code, L. (1991). What can she know? Feminist theory and the construction of knowledge. Ithaca: Cornell University Press.
Cohen, S. (2018). Manipulation and deception. Australasian Journal of Philosophy, 96(3), 483–497.
Crawford, R. (2006). Health as a meaningful social practice. Health, 10(4), 401–420.
Crisp, R. (1987). Persuasive advertising, autonomy, and the creation of desire. Journal of Business Ethics, 6(5), 413–418.
Culbert, K. M., Racine, S. E., & Klump, K. L. (2015). Research review: What we have learned about the causes of eating disorders – a synthesis of sociocultural, psychological, and biological research. Journal of Child Psychology and Psychiatry, 56(11), 1141–1164.
Eikey, E. V., & Reddy, M. C. (2017). “It’s definitely been a journey”: A qualitative study on how women with eating disorders use weight loss apps. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 642–654.
Eikey, E. V., Reddy, M. C., Booth, K. M., Kvasny, L., Blair, J. L., Li, V., & Poole, E. (2017). Desire to be underweight: Exploratory study on weight loss app community and user perceptions of the impact on disordered eating behaviors. JMIR mHealth and uHealth, 5(10), e150.
European Commission (2012). Communication from the Commission to the Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Health Action Plan 2012–2020—Innovative healthcare for the 21st century (Brussels, 6.12.2012. COM/2012/0736 final). Brussels: European Union.
European Commission. (2014). Green paper on mobile health (Brussels, 10.4.2014, COM(2014) 135 final). Available at https://ec.europa.eu/digital-single-market/en/news/green-paper-mobile-health-mhealth.
Eyal, N. (2014). Hooked: How to build habit-forming products. New York: Portfolio/Penguin.
Faden, R. R., & Beauchamp, T. L. (1986). A history and theory of informed consent. New York: Oxford University Press.
Fahy, R., Van Hoboken, J., & Van Eijk, N. (2018). Data privacy, transparency and the data-driven transformation of games to services. In Proceedings of the IEEE Games, Entertainment, Media Conference, pp. 1–9.
Floridi, L. (2005). The ontological interpretation of informational privacy. Ethics and Information Technology, 7(4), 185–200.
Fotopoulou, A., & O’Riordan, K. (2017). Training to self-care: Fitness tracking, biopedagogy and the healthy consumer. Health Sociology Review, 26(1), 54–68.
Foucault, M. (1975). The birth of the clinic: An archeology of medical perception. New York: Vintage Books.
Frischmann, B., & Selinger, E. (2018). Re-engineering humanity. Cambridge: Cambridge University Press.
Gorin, M. (2014). Towards a theory of interpersonal manipulation. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 73–97). Oxford: Oxford University Press.
Greenspan, P. (2003). The problem with manipulation. American Philosophical Quarterly, 40(2), 155–164.
Hanson, J. D., & Kysar, D. A. (1999). Taking behavioralism seriously: The problem of market manipulation. New York University Law Review, 74(3), 630–749.
Hardin, R. (2002). Trust and trustworthiness. New York: Russell Sage Foundation.
Hayek, F. A. (2006 [1960]). Freedom and coercion. In D. L. Miller (Ed.), The liberty reader (pp. 80–99). New York: Routledge.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Lanzing, M. (2016). The transparent self. Ethics and Information Technology, 18(1), 9–16.
Lanzing, M. (2018). “Strongly recommended”: Revisiting decisional privacy to judge hypernudging in self-tracking technologies. Philosophy & Technology, online first.
Lupton, D. (2012). M-Health and health promotion: The digital cyborg and surveillance society. Social Theory & Health, 10(3), 229–244.
Lupton, D. (2013). Quantifying the body: Monitoring and measuring health in the age of mHealth technologies. Critical Public Health, 23(4), 393–403.
Lupton, D. (2018). I just want it to be done, done, done! Food tracking apps, affects, and agential capacities. Multimodal Technologies and Interaction, 2(2), 29–44.
Mackenzie, C., & Stoljar, N. (2000). Introduction: Autonomy refigured. In C. Mackenzie & N. Stoljar (Eds.), Relational autonomy: Feminist perspectives on autonomy, agency, and the social self (pp. 3–31). Oxford: Oxford University Press.
Mayes, C. (2015). The biopolitics of lifestyle: Foucault, ethics and healthy choices. London: Routledge.
Meyers, D. (1989). Self, society and personal choice. Oxford: Oxford University Press.
Mills, C. (2014). Manipulation as an aesthetic flaw. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 135–150). Oxford: Oxford University Press.
Nedelsky, J. (1989). Reconceiving autonomy: Sources, thoughts and possibilities. Yale Journal of Law and Feminism, 1, 7–36.
Noggle, R. (1996). Manipulative actions: A conceptual and moral analysis. American Philosophical Quarterly, 33(1), 43–55.
Nordenfelt, L. (1986). Health and disease: Two philosophical perspectives. Journal of Epidemiology and Community Health, 41, 281–284.
Nordenfelt, L. (1987). On the nature of health: An action theoretic approach. Dordrecht: Reidel.
Nys, T. R. V. (2016). Autonomy, trust, and respect. Journal of Medicine and Philosophy, 41(1), 10–24.
Olson, P. (2014). MyFitnessPal starts tracking steps to grow the world’s largest nutrition database. Forbes, May 1, 2014. Retrieved from https://www.forbes.com/sites/parmyolson/2014/05/01/myfitnesspal-starts-tracking-steps-to-grow-the-worlds-largest-nutrition-database/#341b09d05968.
Olson, P. (2015). Under Armour buys health-tracking app MyFitnessPal for $475 million. Forbes, February 4, 2015. Retrieved from https://www.forbes.com/sites/parmyolson/2015/02/04/myfitnesspal-acquisition-under-armour/#1a75350c6935.
Overdorf, R., Kulynych, B., Balsa, E., Troncoso, C., & Gürses, S. (2018). POTs: Protective optimization technologies. arXiv:1806.02711.
Patterson, H. (2013). Contextual expectations of privacy in self-generated health information flows. In The 41st Research Conference on Communication, Information and Internet Policy. Retrieved from https://ssrn.com/abstract=2242144.
Petersen, A., & Bunton, R. (Eds.). (1997). Foucault, health, and medicine. London: Routledge.
Rudinow, J. (1978). Manipulation. Ethics, 88(4), 338–347.
Sanders, R. (2017). Self-tracking in the digital era: Biopower, patriarchy, and the new biometric body projects. Body & Society, 23(1), 36–63.
Santilli, P. C. (1983). The informative and persuasive functions of advertising: A moral appraisal. Journal of Business Ethics, 2(1), 27–33.
Sax, M., Helberger, N., & Bol, N. (2018). Health as a means towards profitable ends: mHealth apps, user autonomy, and unfair commercial practices. Journal of Consumer Policy, 41(2), 103–134.
Spencer, S. B. (2019). The problem of online manipulation. Retrieved from https://ssrn.com/abstract=3341653.
Stoljar, N. (2011). Informed consent and relational conceptions of autonomy. The Journal of Medicine and Philosophy, 36(4), 375–384.
Susser, D. (2019). Invisible influence: Artificial intelligence and the ethics of adaptive choice architectures. In AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society, January 27–28, Honolulu, Hawaii, USA. Retrieved from http://www.aies-conference.com/wp-content/papers/main/AIES-19_paper_54.pdf.
Susser, D., Roessler, B., & Nissenbaum, H. (2019a). Online manipulation: Hidden influences in a digital world. Georgetown Law Technology Review, 4(1), 1–45.
Susser, D., Roessler, B., & Nissenbaum, H. (2019b). Technology, autonomy, and manipulation. Internet Policy Review. https://doi.org/10.14763/2019.2.1410.
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.
Wojdynski, B. W., & Evans, N. J. (2016). Going native: Effects of disclosure position and language on the recognition and evaluation of online advertising. Journal of Advertising, 45(2), 157–168.
Wojdynski, B. W., Bang, H., Keib, K., Jefferson, B. N., Choi, D., & Malson, J. L. (2017). Building a better native advertising disclosure. Journal of Interactive Advertising, 17(2), 150–161.
Wood, A. W. (2014). Coercion, manipulation, exploitation. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 17–50). Oxford: Oxford University Press.
World Health Organization. (2011). mHealth. New horizons for health through mobile technologies. Report based on the findings of the second global survey on eHealth. Geneva: World Health Organization.
Yeung, K. (2017). ‘Hypernudge’: Big data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136.
Zuboff, S. (2015). Big other: Surveillance capitalism and the prospect of an information civilization. Journal of Information Technology, 30(1), 75–89.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the frontier of power. New York: Public Affairs.
Acknowledgments
This article was written during a research visit to The Digital Life Initiative at Cornell Tech. I would like to thank Helen Nissenbaum, Eran Toch, Elizabeth O’Neill, Jake Goldenfein, Michael Byrne, Erica Du, Yvonne Wang, Lauren van Haaften-Schick, Mason Marks, and Jessie Taft for providing me with a welcoming and intellectually stimulating environment during my research visit. Special thanks to Elizabeth O’Neill, Eva Groen-Reijman, Thomas Nys, and Beate Roessler for extensive feedback on earlier versions of this article. I would also like to thank the anonymous reviewers for their constructive reviews and helpful suggestions for improvement of the article. The study was supported by the Research Priority Area ‘Personalised Communication’ of the University of Amsterdam.
Sax, M. Optimization of what? For-profit health apps as manipulative digital environments. Ethics Inf Technol 23, 345–361 (2021). https://doi.org/10.1007/s10676-020-09576-6