[Image: a woman sitting at her desk in front of her computer. Caption: When you have ads from divorce lawyers in your social media feed, that’s no coincidence.]

After a few days, my husband received suggestions for divorce lawyers

Helena Tängdén
3 min read · Jun 14, 2021

A friend of mine asked for help finding Swedish dating sites. As a result, a few days later, my husband received suggestions for divorce lawyers in his Facebook feed. My search engine, Google, sold my searches to Facebook, which linked my search behavior to a possible consequence: that my husband might need a divorce lawyer. Facebook then sold that insight to divorce lawyers as a basis for targeted marketing.

Every time I recount this episode to friends, I expect them to be flabbergasted; instead, everyone recognizes the situation. As a user, I have no way to predict how Google uses and sells the information it collects about me.

Using Google comes with a price: my privacy. Choosing another search engine also comes with a price, because Google knows me, my family, and my behavior, and therefore delivers fantastic search results.

The same goes for Facebook: it does not report how it uses the data it collects or passes on, or whether that data originates from me (who did the Google search) or from my husband, who was not involved in the original course of events at all.

None of the users involved knew. How can we design services that respect ethical rules?

In my example, none of the users involved knew how the information was sold and used. Furthermore, all revenue from the process ended up with the service providers, and the users had no real opportunity to influence or predict the course of events.

Revenue sharing

When the EU’s General Data Protection Regulation (GDPR) became law, it introduced, among other things, a requirement that the consent suppliers obtain to handle our personal data must be worded so that it is understandable. Instead, the agreements of the big tech companies are extensive, impenetrable, and one-sided. There is no room for me as a user to negotiate contract terms: if I want to use the service, I must accept their terms 100%.

If we imagine real transparency in how data is used today, it could take the form of a few explicit choices for the user. And since financial gain drives the entire data collection, revenue sharing could be introduced.

The point of revenue sharing is that the user should be presented with a genuine choice: either I let you use my data, and we share the revenue it generates, or I do not permit you to use my data at all.
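To make the idea concrete, here is a minimal sketch of what such a consent record could look like, written in TypeScript. Everything in it is hypothetical: the type names, the `settleRevenue` helper, and the 30% figure are invented for illustration, not taken from any real platform.

```typescript
// Hypothetical consent model: the user either opts in to data use in
// exchange for a share of the revenue, or opts out entirely.
type DataConsent =
  | { kind: "share"; revenueShare: number } // fraction of ad revenue paid to the user
  | { kind: "deny" }; // the data may not be used or resold

interface ConsentRecord {
  userId: string;
  consent: DataConsent;
  grantedAt: Date;
}

// Illustrative helper: compute the user's cut of revenue earned from their data.
function settleRevenue(record: ConsentRecord, adRevenue: number): number {
  if (record.consent.kind === "deny") {
    return 0; // no permission, no data use, nothing to split
  }
  return adRevenue * record.consent.revenueShare;
}

// Example: a user opts in with a 30% share (an arbitrary figure).
const record: ConsentRecord = {
  userId: "user-123",
  consent: { kind: "share", revenueShare: 0.3 },
  grantedAt: new Date(),
};

console.log(settleRevenue(record, 10)); // 3: the user's cut of 10 units of ad revenue
```

The design point is that denial is a first-class state rather than a buried setting: when consent is “deny”, there is simply no data use and no revenue to split.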

Optional ethical and transparent nudging

As in the example above, you as a user might even benefit from your spouse getting a discreet hint that they could need a divorce attorney, as long as you know about it and agree. However, what tech companies do with my data must be ethical and transparent, and one way to get there could be prompting the user with explicit choices.

Two example options could be displayed to the user at the time of a search. First choice: feel free to show suitable offers based on this search to my friends. Second choice: don’t send any hints to my friends or family.
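As a sketch of how such a search-time prompt could be modeled, again in TypeScript and again with invented names (`SearchHintChoice`, `buildPrompt`): the essential properties are that the choice is made per search, that its consequences are spelled out in plain language, and that the most private option is the default.

```typescript
// Hypothetical search-time prompt: the user decides, per search, whether
// friends and family may be shown offers based on it.
type SearchHintChoice = "offer-to-friends" | "no-hints";

interface SearchConsentPrompt {
  query: string;
  options: { value: SearchHintChoice; label: string }[];
  preselected: SearchHintChoice; // the most private option is the default
}

function buildPrompt(query: string): SearchConsentPrompt {
  return {
    query,
    options: [
      {
        value: "offer-to-friends",
        label: "Feel free to show suitable offers based on this search to my friends.",
      },
      {
        value: "no-hints",
        label: "Don't send any hints to my friends or family.",
      },
    ],
    preselected: "no-hints", // privacy-preserving default
  };
}

// Example: the prompt a user would see alongside their search.
const prompt = buildPrompt("Swedish dating sites");
console.log(prompt.options.map((o) => o.label).join("\n"));
```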

Choices with understandable consequences

The objective is to facilitate good use and make the user’s life easier. We have long said that you, as a user of social media, should not write things you would not say out loud. By the same token, tech companies should not design invisible services that do things we would not do offline.

Written by Helena Tängdén

Experience Designer | Malmö, Sweden