My research investigates how technology designers and engineers address privacy in their professional practices, and how design methods and approaches can be used to proactively surface, explore, discuss, and critique privacy and other social values-related concerns during the design process. I draw upon work and approaches from science & technology studies, human-computer interaction, and speculative and critical design.
I am a PhD candidate at the UC Berkeley School of Information, where I work with Professor Deirdre Mulligan and am a member of the BioSENSE research group. My research investigates how design methods and approaches can be used to proactively raise privacy and other social values-related concerns in technology design, and to explore alternative ways of developing technologies that are cognizant of these issues. I draw upon work and approaches from science & technology studies, human-computer interaction, and speculative and critical design.
Most recently, I have created workbooks of speculative design fictions depicting biosensing technologies in a range of scenarios to help reflect on technical, social, and legal aspects of privacy. I have also used these workbooks as probes to engage research participants in discussions about privacy (and other social values), trying to understand how they conceptualize privacy-related issues and where they see points of intervention to address those issues, whether technical, policy, or social.
I graduated from Cornell University in 2014, where I double majored in Information Science and Science & Technology Studies. I completed a senior honors thesis entitled 'Wireless Visions: Creating and Contesting Sociotechnical Imaginaries of Electromagnetic Spectrum' under the guidance of Prof. Steve Jackson, investigating the technical and policy work being done to advance new frameworks for sharing radio spectrum frequencies, motivated by the prospect of providing more wireless broadband.
I was an intern at the White House Office of Science and Technology Policy (OSTP) in the summer of 2012, working with the President's Council of Advisors on Science and Technology and OSTP Awards and Events, and was a 2011 Fulbright Summer Institute Participant, spending a month in London studying aspects of British citizenship.
I am also a recipient of an NSF Graduate Research Fellowship in Computer & Information Science & Engineering.
In my spare time, I enjoy playing and composing music, video editing, and curling with the Bay Area Curling Club and the Cal Curling Group.
Richmond Y. Wong and Deirdre K. Mulligan. (2019). Bringing Design to the Privacy Table: Broadening "Design" in "Privacy by Design". In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI'19).
Download PDF Pre-Print
James Pierce, Sarah Fox, Nick Merrill, and Richmond Y. Wong. (2018). Differential Vulnerabilities and a Diversity of Tactics: What Toolkits Teach Us about Cybersecurity. Proceedings of the ACM on Human-Computer Interaction - CSCW. 2, CSCW, Article 139 (November 2018), 23 pages.
Download PDF
Open Download from ACM
Honorable Mention Award: Richmond Y. Wong, Nick Merrill and John Chuang. (June 2018). When BCIs have APIs: Design Fictions of Everyday Brain-Computer Interface Adoption. In Proceedings of the ACM Conference on Designing Interactive Systems (DIS '18).
Download PDF
Open Download from ACM
Project Page
James Pierce, Sarah Fox, Nick Merrill, Richmond Y. Wong and Carl DiSalvo. (2018). An Interface without A User: An Exploratory Design Study of Online Privacy Policies and Digital Legalese. In Proceedings of the ACM Conference on Designing Interactive Systems (DIS '18).
Download PDF
Open Download from ACM
Best Paper Award: Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce and John Chuang. (2017). Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proceedings of the ACM on Human-Computer Interaction (CSCW Online First). 1, CSCW, Article 111 (December 2017), 26 pages.
Download PDF
Open Download from ACM
Project Page
Richmond Y. Wong, Ellen Van Wyk and James Pierce. (2017). Real-Fictional Entanglements: Using Science Fiction and Design Fiction to Interrogate Sensing Technologies. In Proceedings of the ACM Conference on Designing Interactive Systems (DIS '17).
Download PDF
Open Download from ACM
Workbook PDF
Project Page
Steven Weber and Richmond Y. Wong. (2017). The new world of data: Four provocations on the Internet of Things. First Monday 22(2).
Read Online
Richmond Y. Wong and Deirdre K. Mulligan. (2016). These Aren’t the Autonomous Drones You’re Looking for: Investigating Privacy Concerns Through Concept Videos. Journal of Human-Robot Interaction 5(3).
Read Online
Project Page
Richmond Y. Wong and Deirdre K. Mulligan. (2016). When a Product Is Still Fictional: Anticipating and Speculating Futures through Concept Videos. In Proceedings of the ACM Conference on Designing Interactive Systems (DIS '16).
Download PDF
Open Download from ACM
Project Page
Honorable Mention Award: Richmond Y. Wong and Steven J. Jackson. (2015). Wireless Visions: Infrastructure, Imagination, and US Spectrum Policy. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '15). ACM, New York, NY, USA, 105-115.
Download PDF
Open Download from ACM
Elaine Sedenberg, Richmond Wong, and John Chuang. (2018). A window into the soul: Biosensing in public. In Surveillance, Privacy and Public Space, Bryce Clayton Newell, Tjerk Timan, Bert-Jaap Koops (eds.)
Download PDF Pre-print
Download from Publisher
Richmond Y. Wong and Vera Khovanskaya. (2018). Speculative Design in HCI: From Corporate Imaginations to Critical Orientations. In New Directions in Third Wave Human-Computer Interaction: Volume 2 - Methodologies, Michael Filimowicz and Veronika Tzankova (eds.)
Download PDF Pre-print
Download from Springer
Richmond Wong. (September 2017). Exploring Biosensing Privacy Futures with Design Fiction and Science Fiction. 4S 2017, Boston, MA.
Richmond Wong, Lauren Kilgour. (December 2016). Performing Algorithms: TED Talks and Public Understandings of (Computer) Science. Algorithms in Culture, Berkeley, CA.
Richmond Wong, Deirdre Mulligan. (August 2016). Framing Future Privacy Concerns through Corporate Concept Videos. 4S/EASST 2016, Barcelona, Spain.
Deirdre Mulligan, Leslie Harris, Sebastian Benthall, Richmond Wong. (October 2015). What Really Happened After the Battle Against SOPA and PIPA? Computers, Freedom, and Privacy Conference, Washington DC.
Richmond Wong. (10 April 2015). Big Data Narratives of Crowd and Cloud. ST Global, Washington DC.
Poster Presentation: Richmond Y. Wong, Deirdre K. Mulligan, and John Chuang. (2017). Using science fiction texts to surface user reflections on privacy. In Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers (UbiComp '17). 213-216.
Workshop Organizer: Nick Merrill, Richmond Wong, Noura Howell, Luke Stark, Lucian Leahu, Dawn Nafus. (June 2017). Interrogating Biosensing in Everyday Life. ACM Conference on Designing Interactive Systems (DIS '17).
Workshop Position Paper: Rena Coen, Jennifer King, Richmond Wong. (June 2016). The Privacy Policy Paradox. SOUPS 2016, Workshop on Privacy Indicators.
Workshop Position Paper: Richmond Wong, Deirdre Mulligan. (May 2016). Using Concept Videos and Speculative Design with Privacy by Design. CHI 2016, Bridging the Gap between Privacy by Design and Privacy in Practice Workshop.
Scenarios Report: Contributing Author. (May 2016). Cybersecurity Futures 2020. UC Berkeley Center for Long-Term Cybersecurity. Available at https://cltc.berkeley.edu/scenarios/
Guest Blog: Richmond Wong. (January 2016). Reviewing Danielle Citron’s Hate Crimes in Cyberspace. UC Berkeley Center for Technology, Society & Policy. Available at https://ctsp.berkeley.edu/reviewing-danielle-citrons-hate-crimes-in-cyberspace
Report: Nicholas Doty, Ann Drobnis, Deirdre Mulligan, Richmond Wong. (February 2015). Privacy by Design – State of Research and Practice: Workshop 1 Report. Computing Community Consortium. Available at http://cra.org/ccc/events/pbd-state-of-research-and-practice/
Guest Blog: Nicholas Doty, Richmond Wong. (February 2015). Privacy by Design Workshop: Concepts and Connections. Computing Community Consortium. Available at http://www.cccblog.org/2015/02/17/privacy-by-design-workshop-concepts-and-connections/
Jan 2019 - I am a 2019 Berkeley Center for Long-Term Cybersecurity Research Grantee and Center for Technology, Society & Policy fellow for projects in collaboration with James Pierce, Sarah Fox, Nick Merrill, Noura Howell, and Franchesca Spektor.
Jan 2019 - My paper with Deirdre Mulligan, Bringing Design to the Privacy Table: Broadening "Design" in "Privacy by Design" has been accepted to CHI 2019
Oct 2018 - My CSCW 2018 paper with Deirdre Mulligan, Ellen Van Wyk, James Pierce, and John Chuang, Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, has received a Best Paper Award
Sep 2018 - Research done in collaboration with James Pierce, Sarah Fox, and Nick Merrill, Differential Vulnerabilities and a Diversity of Tactics: What toolkits teach us about cybersecurity, has been accepted to CSCW 2018
Jul 2018 - A book chapter in collaboration with Elaine Sedenberg and John Chuang on biosensing privacy in public spaces has been published, A window into the soul: Biosensing in public
Jul 2018 - Our chapter with collaborator Vera Khovanskaya, Speculative Design in HCI: From Corporate Imaginations to Critical Orientations, has been published as a part of the book New Directions in Third Wave Human-Computer Interaction: Volume 2 - Methodologies
Jun 2018 - My paper When BCIs have APIs: Design Fictions of Everyday Brain-Computer Interface Adoption with I School co-authors Nick Merrill and John Chuang has received an honorable mention award at the 2018 ACM Conference on Designing Interactive Systems (DIS '18)
May 2018 - Our pictorial, An Interface without A User: An Exploratory Design Study of Online Privacy Policies and Digital Legalese, has been accepted to the 2018 ACM Conference on Designing Interactive Systems (DIS '18), with collaborators James Pierce, Sarah Fox, Nick Merrill, and Carl DiSalvo
Mar 2018 - My paper When BCIs have APIs: Design Fictions of Everyday Brain-Computer Interface Adoption with I School co-authors Nick Merrill and John Chuang has been accepted to the 2018 ACM Conference on Designing Interactive Systems (DIS '18)
Mar 2018 - I am honored to have received a UC Berkeley Outstanding Graduate Student Instructor (GSI) Award from the GSI Teaching & Resource Center for my work in Prof. Deirdre Mulligan's Technology and Delegation course in Fall 2017.
Jan 2018 - I am a 2018 joint Research Grantee of the Berkeley Center for Long-Term Cybersecurity and Center for Technology, Society & Policy, along with collaborators James Pierce, Sarah Fox, Noura Howell, and Nick Merrill
Dec 2017 - My CSCW 2018 paper, "Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks" with Deirdre Mulligan, Ellen Van Wyk, James Pierce, and John Chuang is now available in the ACM Digital Library
Nov 2017 - I presented a talk covering 2 recent papers, Interrogating Biosensing Privacy Futures with Design Fiction (video), at the Berkeley I School's PhD Research Reception
Sep 2017 - My paper Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, with Deirdre Mulligan, Ellen Van Wyk, James Pierce, and John Chuang, will be published in November as part of CSCW 2018 Online First in the Proceedings of the ACM on Human-Computer Interaction.
Sep 2017 - I will be presenting a poster at Ubicomp discussing work done with Deirdre Mulligan and John Chuang.
Aug 2017 - I will be speaking at 4S in Boston, discussing my paper Exploring Biosensing Privacy Futures with Design Fiction and Science Fiction during the Sensing Subjectivities panel.
Apr 2017 - My paper with Ellen Van Wyk and James Pierce, 'Real-Fictional Entanglements: Using Science Fiction and Design Fiction to Interrogate Sensing Technologies' was accepted to the 2017 ACM Designing Interactive Systems (DIS) Conference
Mar 2017 - We will be hosting a workshop on interrogating biosensing at DIS 2017 in Edinburgh in June. See our CfP for more information
Feb 2017 - I will be speaking at the February Privacy Lab talking about the Berkeley Center for Long-Term Cybersecurity scenarios of cybersecurity in the year 2020
Feb 2017 - My paper with Steve Weber, The new world of data: Four provocations on the Internet of Things, has been published in First Monday
Dec 2016 - My paper with Deirdre Mulligan, These Aren’t the Autonomous Drones You’re Looking for: Investigating Privacy Concerns Through Concept Videos, has been published in the Journal of Human-Robot Interaction
Dec 2016 - Co-presented with Lauren Kilgour at the Algorithms in Culture Conference analyzing how algorithms are represented in TED Talks
Sep 2016 - Presented work with Deirdre Mulligan on Framing Future Privacy Concerns through Corporate Concept Videos at 4S/EASST 2016
Aug 2016 - I will be co-leading a Social Science Matrix Research Team with Anne Jonas this semester entitled 'Assembling Critical Practices in the Social Sciences' to explore the connections and relationships between critical practices and theories from different disciplines
Jun 2016 - Participated in the Workshop on Privacy Indicators and the Workshop on the Future of Privacy Notices and Indicators at the 2016 Symposium on Usable Privacy and Security
Jun 2016 - Presented my paper with Deirdre Mulligan, 'When a Product Is Still Fictional: Anticipating and Speculating Futures through Concept Videos', at the 2016 ACM Designing Interactive Systems (DIS) Conference
May 2016 - Presented a workshop position paper with Deirdre Mulligan, 'Using Concept Videos and Speculative Design with Privacy by Design' at the CHI 2016 Bridging the Gap between Privacy by Design and Privacy in Practice Workshop
Apr 2016 - My paper with Deirdre Mulligan, 'When a Product Is Still Fictional: Anticipating and Speculating Futures through Concept Videos', was accepted to the 2016 ACM Designing Interactive Systems Conference
Apr 2016 - I was awarded a 2016 NSF Graduate Research Fellowship in Computing & Information Science & Engineering: Computer Security and Privacy
Mar 2016 - I will be moderating a panel on The Future of User Centered Design at this year's Berkeley I School InfoCamp
Feb 2016 - Contributed to a blog post covering privacy by design and tech policy events during early 2016.
Jan 2016 - Reviewed Danielle Citron's Hate Crimes in Cyberspace on the Berkeley Center for Technology, Society & Policy Blog.
Feb 2015 - Contributed to a guest blog post on Computing Community Consortium Blog with Nick Doty on a February Privacy by Design Workshop
Jan 2015 - My paper with Steve Jackson, Wireless Visions: Infrastructure, Imagination, and U.S. Spectrum Policy has received an honorable mention best paper award at CSCW 2015
Read more blog posts on The Bytegeist Blog on Wordpress.
Posted on December 20, 2018
In the spirit of taking a break over the holidays, this is more of a fun post with some very rough thoughts (though inspired by some of my prior work on paying attention to and critiquing narratives and futures portrayed by tech advertising). The basic version is that the Cricket Wireless 2018 Holiday Ad, Four the Holidays (made by ad company Psyop), portrays a narrative that offers a slight critique of an always-connected world and suggests that physical face-to-face interaction is a more enjoyable experience for friends than digital sharing. While perhaps an over-simplistic critique of mobile technology use, the twin messages of “buy a wireless phone plan to connect with friends” and “try to disconnect to spend time with friends” highlight important tensions and contradictions present in everyday digital life.
But let’s look at the ad in a little more detail!
Last month, while streaming Canadian curling matches (it’s more fun than you might think; case in point, I’ve blogged about the sport’s own controversy with broom technology), I saw a short Cricket ad with a holiday jingle. And I’m generally inclined to pay attention to an ad with a good jingle. Looking it up online brought up a 3-minute short film version expanding upon the 15-second commercial (embedded above), which I’ll describe and analyze below.
It starts with Cricket’s animated characters Ramon (the green blob with hair), Dusty (the orange fuzzy ball), Chip (the blue square), and Rose (the green oblong shape) on a Hollywood set, “filming” the aforementioned commercial, singing their jingle:
The four, the merrier! Cricket keeps us share-ier!
Four lines of unlimited data, for a hundred bucks a month!
After their shoot is over, Dusty wants the group to watch fireworks from the Cricket water tower (which is really the Warner Brothers Studio water tower, though maybe we should call it Chekhov’s water tower in this instance) on New Year’s Eve. Alas, the crew has other plans, and everyone flies to their holiday destinations: Ramon to Mexico, Dusty to Canada, Chip to New York, and Rose to Aspen.
The video then shows each character enjoying the holidays in their respective locations with their smartphones. Ramon uses his phone to take pictures of food shared on a family table; Rose uses hers to take selfies on a ski lift.
The first hint that there might be a message critiquing an always-connected world is when the ad shows Dusty in a snowed-in, remote Canadian cabin. Presumably this tells us that he gets a cell signal up there, but in this scene, he is not using his phone. Rather, he’s making cookies with his two (human) nieces (not sure how that works, but I’ll suspend my disbelief), highlighting a face-to-face familial interaction using a traditional holiday group activity.
The second hint that something might not be quite right is the Dutch angle establishing shot of New York City in the next scene. The non-horizontal horizon line (which also evokes the off-balance establishing shot of New York from an Avengers: Infinity War trailer) visually puts the scene off balance. But the moment quickly passes, as we see Chip on the streets of New York taking Instagram selfies.
Dutch angle of New York from Cricket Wireless’ “Four the Holidays” (left) and Marvel’s Avengers: Infinity War (right)
Then comes a rapid montage of photos and smiling selfies that the group is sending and sharing with each other, in a sort of digital self-presentation utopia. But as the short film has been hinting at, this utopia is not reflective of the characters’ lived experience.
The video cuts to Dusty, who skates alone on a frozen pond and successfully completes a trick, but then realizes that he has no one to share the moment with. He then sings “The four the merrier, Cricket keeps us share-ier” in a minor key as he re-envisions clouds in the sky as the forms of the four friends. The minor key and Dusty’s singing show skepticism toward the lyrics’ claim that being share-ier is indeed merrier.
The minor key continues, as Ramon sings while envisioning a set of holiday lights as the four friends, and Rose sees a department store window display as the four friends. Chip attends a party where the Cricket commercial (from the start of the video) airs on a TV, but is still lonely. Chip then hails a cab, dramatically stating in a deep voice “Take me home.”
In the last scene, Chip sits atop the Cricket Water Tower (or, Chekhov’s Water Tower returns!) at 11:57pm on New Year’s Eve, staring alone at his phone, discontent. This is the clearest signal of the lack of fulfillment he finds in his phone, and by extension, in the digitally mediated connection with his friends.
This is immediately juxtaposed with Ramon singing with his guitar from the other side of the water tower, still in the minor key. Chip hears him and immediately becomes happier; the music shifts to a major key as Rose and Dusty enter, the tempo picks up, and the drums and orchestra of instruments join in. The commercial ends with the four of them watching New Year’s fireworks together. It’s worth noting the lyrics at the end:
Ramon: The four the merrier…
Chip [spoken]: Ramon?! You’re here!
Rose: There’s something in the air-ier
All: That helps us connect, all the season through. The four, the merrier
Dusty: One’s a little harrier (So hairy!)
All: The holidays are better, the holidays are better, the holidays are better with your crew.
Nothing here is explicitly about Cricket Wireless, or the value of being digitally connected. It’s also worth noting that the phone Chip was previously staring at is nowhere to be found after he sees Ramon. There is some ambiguous use of the word “connect,” which could refer to either a face-to-face interaction or a digitally mediated one, but the tone of the scene and the emotional storyline bringing the four friends physically together suggest that “connect” refers to the value of face-to-face interaction.
So what might this all mean (beyond the fact that I’ve watched this commercial too many times and have the music stuck in my head)? Perhaps the larger and more important point is that the commercial/short film is emblematic of a series of tensions around connection and disconnection in today’s society. Being digitally connected is seen as a positive that allows for greater opportunity (and greater work output), but at the same time discontent is reflected in culture and media, ranging from articles on tech addiction, to guides on grayscaling iPhones to combat color stimulation, to disconnection camps. There’s also a moralizing force behind these tensions: to be a good employee/student/friend/family member/etc., we are told that we must be digitally connected and always-on, but at the same time, we are told that we must also be disconnected or interact face-to-face in order to be good subjects.
In many ways, the tensions expressed in this video — an advertisement for a wireless provider trying to encourage customers to sign up for its wireless plans, while presenting a story highlighting the need to digitally disconnect — parallel the tensions that Ellie Harmon and Melissa Mazmanian find in their analysis of media discourse about smartphones: that there is both a push for individuals to integrate the smartphone into everyday life, and to dis-integrate the smartphone from everyday life. What is fascinating to me here is that this video from Cricket exhibits both of those ideas at the same time. As Harmon and Mazmanian write,
The stories that circulate about the smartphone in American culture matter. They matter for how individuals experience the device, the ways that designers envision future technologies, and the ways that researchers frame their questions.
While Four the Holidays doesn’t tell the most complex or nuanced story about connectivity and smartphone use, the narrative that Cricket and Psyop created veers away from a utopian imagining of the world with tech, and instead begins to reflect some of the inherent tensions and contradictions of smartphone use and mobile connectivity that are experienced as a part of everyday life.
Posted on November 11, 2018
This blog post is a version of a talk I gave at the 2018 ACM Computer Supported Cooperative Work and Social Computing (CSCW) Conference based on a paper written with Deirdre Mulligan, Ellen Van Wyk, John Chuang, and James Pierce, entitled Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, which was honored with a best paper award. Find out more on our project page, our summary blog post, or download the paper: [PDF link] [ACM link]
In the work described in our paper, we created a set of conceptual speculative designs to explore privacy issues around emerging biosensing technologies, technologies that sense human bodies. We then used these designs to help elicit discussions about privacy with students training to be technologists. We argue that this approach can be useful for Values in Design and Privacy by Design research and practice.
Image from publicintelligence.net. Note the middle bullet point in the middle column – “avoids all privacy issues.”
Let me start with a motivating example, which I’ve discussed in previous talks. In 2007, the US Department of Homeland Security proposed a program to try to predict criminal behavior in advance of the crime itself, using thermal sensing, computer vision, eye tracking, gait sensing, and other physiological signals. And supposedly it would “avoid all privacy issues.” But it seems pretty clear that privacy was not fully thought through in this project. Homeland Security projects do go through privacy impact assessments, and I would guess that in this case the assessment would find that the system doesn’t store the biosensed data, and so conclude that privacy is protected. But while this might address one conception of privacy related to storing data, there are other conceptions of privacy at play. There are still questions here about consent and movement in public space, about data use and collection, or about fairness and privacy from algorithmic bias.
While that particular imagined future hasn’t come to fruition, a lot of these types of sensors are now becoming available as consumer devices, used in applications ranging from health and quantified self, to interpersonal interactions, to tracking and monitoring. And it often seems like privacy isn’t fully thought through before new sensing devices and services are publicly announced or released.
A lot of existing privacy approaches, like privacy impact assessments, are deductive, checklist-based, or assume that privacy problems are already known and well-defined in advance, which often isn’t the case. Furthermore, the term “design” in discussions of Privacy by Design is often seen as a way of providing solutions to problems identified by law, rather than as a generative set of practices useful for understanding what privacy issues might need to be considered in the first place. We argue that speculative design-inspired approaches can help explore and define problem spaces of privacy in inductive, situated, and contextual ways.
We created a design workbook of speculative designs. Workbooks are collections of conceptual designs drawn together to allow designers to explore and reflect on a design space. Speculative design is a practice of using design to ask social questions, by creating conceptual designs or artifacts that help create or suggest a fictional world. We can create speculative designs to explore different configurations of the world and to imagine and understand possible alternative futures, which helps us think through issues that have relevance in the present. So rather than start by trying to find design solutions for privacy, we wanted to use design workbooks and speculative designs together to create a collection of designs to help us explore what the problem space of privacy might look like with emerging biosensing technologies.
A sampling of the conceptual designs we created as part of our design workbook
In our prior work, we created a design workbook to do this exploration and reflection. Inspired by recent research, science fiction, and trends from the technology industry, we created a couple dozen fictional products, interfaces, and webpages of biosensing technologies. These included smart camera enabled neighborhood watch systems, advanced surveillance systems, implantable tracking devices, and non-contact remote sensors that detect people’s heart rates. This process is documented in a paper from Designing Interactive Systems. These were created as part of a self-reflective exercise, for us as design researchers to explore the problem space of privacy. However, we wanted to know how non-researchers, particularly technology practitioners, might discuss privacy in relation to these conceptual designs.
A note on how we’re approaching privacy and values. Following other values in design work and privacy research, we want to avoid providing a single universalizing definition of privacy as a social value. We recognize privacy as inherently multiple – something that is situated and differs within different contexts and situations.
Our goal was to use our workbook as a way to elicit values reflections and discussion about privacy from our participants – rather than looking for “stakeholder values” to generate design requirements for privacy solutions. In other words, we were interested in how technologists-in-training would use privacy and other values to make sense of the designs.
Growing regulatory calls for “Privacy by Design” suggest that privacy should be embedded into all aspects of the design process, and at least partially done by designers and engineers. Because of this, the ability of technology professionals to surface, discuss, and address privacy and related values is vital. We wanted to know how people training for those jobs might use privacy to discuss their reactions to these designs. We conducted an interview study, recruiting 10 graduate students from a West Coast US university who are training to go into technology professions, most of whom had prior tech industry experience via previous jobs or internships. At the start of the interview, we gave them a physical copy of the designs and explained that the designs were conceptual, but didn’t tell them that the designs were initially made to think about privacy issues. Below, I’ll show a few examples of the speculative design concepts we shared – you can see more of them in the paper. Then I’ll discuss the ways in which participants used values to make sense of or react to some of the designs.
This design depicts an imagined surveillance system for public spaces like airports that automatically assigns threat statuses to people by color-coding them. We intentionally left it ambiguous how the design makes its color-coding determinations to try to invite questions about how the system classifies people.
Conceptual TruWork design – “An integrated solution for your office or workplace!”
In our designs, we also began to iterate on ideas relating to tracking implants, and different types of social contexts they could be used in. Here’s a scenario advertising a workplace implantable tracking device called TruWork. Employers can subscribe to the service and make their employees implant these devices to keep track of their whereabouts and work activities to improve efficiency.
Conceptual CoupleTrack infographic depicting an implantable tracking chip for couples
We also re-imagined the implant as “coupletrack,” an implantable tracking chip for couples to use, as shown in this infographic.
We found that participants centered values in their discussions when looking at the designs – predominantly privacy, but also related values such as trust, fairness, security, and due process. We found eight themes of how participants interacted with the designs in ways that surfaced discussion of values, but I’ll highlight three here: Imagining the designs as real; seeing one’s self as multiple users; and seeing one’s self as a technology professional. The rest are discussed in more detail in the paper.
Conceptual product page for a small, hidden, wearable camera
Even though participants were aware that the designs were imagined, some participants imagined the designs as seemingly real by thinking about long-term effects in the fictional world of the design. This design (pictured above) is an easily hideable, wearable, live-streaming HD camera. One participant imagined what could happen to social norms if these became widely adopted, saying “If anyone can do it, then the definition of wrong-doing would be questioned, would be scrutinized.” He suggests that previously unmonitored activities would become open for surveillance and tracking, like “are the nannies picking up my children at the right time or not? The definition of wrong-doing will be challenged”. Participants became actively involved in fleshing out and creating the worlds in which these designs might exist. This reflection is also interesting because it begins to consider some secondary implications of widespread adoption, highlighting potential changes in social norms with increasing data collection.
Second, participants took multiple user subject positions in relation to the designs. One participant read the webpage for TruWork and laughed at the design’s claim to create a “happier, more efficient workplace,” saying, “This is again, positioned to the person who would be doing the tracking, not the person who would be tracked.” She notes that the website is really aimed at the employer. She then imagines herself as an employee using the system, saying:
If I called in sick to work, it shouldn’t actually matter if I’m really sick. […] There’s lots of reasons why I might not wanna say, “This is why I’m not coming to work.” The idea that someone can check up on what I said—it’s not fair.
This participant put herself both in the viewpoint of an employer using the system and in that of an employee using the system, bringing up issues of workplace surveillance and fairness. This allowed participants to see values implications of the designs from different subject positions or stakeholder viewpoints.
Third, participants also looked at the designs through the lens of being a technology practitioner, relating the designs to their own professional practices. Looking at the design that automatically detects and flags supposedly suspicious people, one participant reflected on his self-identification as a data scientist and the values implications of predicting criminal behavior with data when he said:
the creepy thing, the bad thing is, like—and I am a data scientist, so it’s probably bad for me too, but—the data science is predicting, like Minority Report… [and then half-jokingly says] …Basically, you don’t hire data scientists.
Here he began to reflect on how his practices as a data scientist might be implicated in this product’s creepiness – that his initial propensity to want to use the data to predict whether subjects are criminals or not might not be a good way to approach this problem, and might have implications for due process.
Another participant compared the CoupleTrack design to a project he was working on. He said:
[CoupleTrack] is very similar to our idea. […] except ours is not embedded in your skin. It’s like an IOT charm which people [in relationships] carry around. […] It’s voluntary, and that makes all the difference. You can choose to keep it or not to keep it.
In comparing the fictional CoupleTrack product to the product he’s working on in his own technical practice, the value of consent, and how one might revoke consent, became very clear to this participant. Again, we thought it was compelling that the designs led some participants to begin reflecting on the privacy implications in their own technical practices.
Given the workbooks’ ability to help elicit reflections on and discussion of privacy in multiple ways, we see this approach as useful for future Values in Design and Privacy by Design work.
The speculative workbooks helped open up discussions about values, similar to some of what Katie Shilton identifies as “values levers,” activities that foreground values, and cause them to be viewed as relevant and useful to design. Participants’ seeing themselves as users to reflect on privacy harms is similar to prior work showing how self-testing can lead to discussion of values. Participants looking at the designs from multiple subject positions evokes value sensitive design’s foregrounding of multiple stakeholder perspectives. Participants reflected on the designs both from stakeholder subject positions and through the lenses of their professional practices as technology practitioners in training.
While Shilton identifies a range of people who might surface values discussions, we see the workbook as an actor to help surface values discussions. By depicting some provocative designs that raised some visceral and affective reactions, the workbooks brought attention to questions about potential sociotechnical configurations of biosensing technologies. Future values in design work might consider creating and sharing speculative design workbooks for eliciting values reflections with experts and technology practitioners.
More specifically, with this project’s focus on privacy, we think that this approach might be useful for “Privacy by Design”, particularly for technologists trying to surface discussions about the nature of the privacy problem at play for an emerging technology. We analyzed participants’ responses using Mulligan et al.’s privacy analytic framework. The paper discusses this in more detail, but the important thing is that participants went beyond just saying privacy and other values are important to think about. They began to grapple with specific, situated, and contextual aspects of privacy – such as considering different ways to consent to data collection, or noting different types of harms that might emerge when the same technology is used in a workplace setting compared to an intimate relationship. Privacy professionals are looking for tools to help them “look around corners,” to help understand what new types of problems related to privacy might occur in emerging technologies and contexts. This provides a potential new tool for privacy professionals in addition to many of the current top-down, checklist approaches, which assume that the concepts of privacy at play are well known in advance. Speculative design practices can be particularly useful here – not to predict the future, but to help open and explore the space of possibilities.
Thank you to my collaborators, our participants, and the anonymous reviewers.
Paper citation: Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce, and John Chuang. 2017. Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 111 (December 2017), 26 pages. DOI: https://doi.org/10.1145/3134746
Posted on October 17, 2018
This post summarizes a research paper, Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, co-authored with Deirdre Mulligan, Ellen Van Wyk, John Chuang, and James Pierce. The paper will be presented at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) on Monday November 5th (in the afternoon Privacy in Social Media session). Full paper available here.
Recent wearable and sensing devices, such as Google Glass, Strava, and internet-connected toys, have raised questions about ways in which privacy and other social values might be implicated by their development, use, and adoption. At the same time, legal, policy, and technical advocates for “privacy by design” have suggested that privacy should be embedded into all aspects of the design process, rather than being addressed after a product is released, or being treated as just a legal issue. By advocating that privacy be addressed through technical design processes, the ability of technology professionals to surface, discuss, and address privacy and other social values becomes vital.
Companies and technologists already use a range of tools and practices to help address privacy, including privacy engineering practices, or making privacy policies more readable and usable. But many existing privacy mitigation tools are either deductive or assume that privacy problems are already known and well-defined in advance. However, we often don’t have privacy concerns well-conceptualized in advance when creating systems. Our research shows that design approaches (drawing on a set of techniques called speculative design and design fiction) can help better explore, define, and perhaps even anticipate what we mean by “privacy” in a given situation. Rather than trying to look at a single, abstract, universal definition of privacy, these methods help us think about privacy as relations among people, technologies, and institutions in different types of contexts and situations.
We created a set of design workbooks — collections of design proposals or conceptual designs, drawn together to allow designers to investigate, explore, reflect on, and expand a design space. We drew on speculative design practices: in brief, our goal was to create a set of slightly provocative conceptual designs to help engage people in reflections or discussions about privacy (rather than propose specific solutions to problems posed by privacy).
A set of sketches that comprise the design workbook
Inspired by science fiction, technology research, and trends from the technology industry, we created a couple dozen fictional products, interfaces, and webpages of biosensing technologies, or technologies that sense people. These included smart camera enabled neighborhood watch systems, advanced surveillance systems, implantable tracking devices, and non-contact remote sensors that detect people’s heart rates. In earlier design work, we reflected on how putting the same technologies in different types of situations, scenarios, and social contexts would vary the types of privacy concerns that emerged (such as the different types of privacy concerns that would emerge if advanced miniature cameras were used by the police, by political advocates, or by the general public). However, we wanted to see how non-researchers might react to and discuss the conceptual designs.
Through a series of interviews, we shared our workbook of designs with master’s students in an information technology program who were training to go into the tech industry. We found several ways in which they brought up privacy-related issues while interacting with the workbooks, and highlight three of those ways here.
TruWork — A product webpage for a fictional system that uses an implanted chip allowing employers to keep track of employees’ location, activities, and health, 24/7.
First, our interviewees discussed privacy by taking on multiple user subject positions in relation to the designs. For instance, one participant looked at the fictional TruWork workplace implant design by imagining herself in the positions of an employer using the system and an employee using the system, noting how the product’s claim of creating a “happier, more efficient workplace,” was a value proposition aimed at the employer rather than the employee. While the system promises to tell employers whether or not their employees are lying about why they need a sick day, the participant noted that there might be many reasons why an employee might need to take a sick day, and those reasons should be private from their employer. These reflections are valuable, as prior work has documented how considering the viewpoints of direct and indirect stakeholders is important for considering social values in design practices.
CoupleTrack — an advertising graphic for a fictional system that uses an implanted chip that people in a relationship wear in order to keep track of each other’s location and activities.
A second way privacy reflections emerged was when participants discussed the designs in relation to their professional technical practices. One participant compared the fictional CoupleTrack implant to a wearable device for couples that he was building, in order to discuss different ways in which consent to data collection can be obtained and revoked. CoupleTrack’s embedded nature makes it much more difficult to revoke consent, while a wearable device can be more easily removed. This is useful because we’re looking for ways workbooks of speculative designs can help technologists discuss privacy in ways that they can relate back to their own technical practices.
Airport Tracking System — a sketch of an interface for a fictional system that automatically detects and flags “suspicious people” by color-coding people in surveillance camera footage.
A third theme that we found was that participants discussed and compared multiple ways in which a design could be configured or implemented. Our designs tend to describe products’ functions but do not specify technical implementation details, allowing participants to imagine multiple implementations. For example, a participant looking at the fictional automatic airport tracking and flagging system discussed the privacy implications of two possible implementations: one where the system only identifies and flags people with a prior criminal history (which might create extra burdens for people who have already served their time for a crime and have been released from prison); and one where the system uses behavioral predictors to try to identify “suspicious” behavior (which might go against a notion of “innocent until proven guilty”). The designs were useful at provoking conversations about the privacy and values implications of different design decisions.
This work provides a case study showing how design workbooks and speculative design can be useful for thinking about the social values implications of technology, particularly privacy. In the time since we’ve made these designs, some (sometimes eerily) similar technologies have been developed or released, such as workers at a Swedish company embedding RFID chips in their hands, or Logitech’s Circle Camera.
But our design work isn’t meant to predict the future. Instead, what we tried to do is take some technologies that are emerging or on the near horizon, and think seriously about ways in which they might get adopted, or used and misused, or interact with existing social systems — such as the workplace, or government surveillance, or school systems. How might privacy and other values be at stake in those contexts and situations? We aim for these designs to help shed light on the space of possibilities, in an effort to help technologists make more socially informed design decisions in the present.
We find it compelling that our design workbooks helped technologists-in-training discuss emerging technologies in relation to everyday, situated contexts. These workbooks don’t depict far-off speculative science fiction with flying cars and spaceships. Rather, they imagine future uses of technologies by having someone look at a product website, an amazon.com page, or an interface, and think about the real and diverse ways in which people might experience those technology products. Using these techniques, which focus on the potential adoptions and uses of emerging technologies in everyday contexts, helps raise issues that might not be immediately obvious if we only think about positive social implications of technologies, and it also helps surface issues that we might not see if we only think about social implications of technologies in terms of “worst case scenarios” or dystopias.
Paper Citation:
Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce, and John Chuang. 2017. Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 111 (December 2017), 26 pages. DOI: https://doi.org/10.1145/3134746
This post is crossposted with the ACM CSCW Blog
© 2019, Richmond Wong