Breach


Game Design - Pratt Institute (Visiting Semester)
Royal College of Art / Imperial College London
Systems Design, Product Design, Game Design, Big Data & Privacy
(October - December 2020)






A card game modeled after real-world scenarios involving data, privacy, and the products and services we know all too well.


Designed to pit friends and family against each other in (mostly) friendly competition, racing to keep hidden the things that never should have been brought to light.



I’ve spent my time at the Pratt Institute looking into this nebulous concept of “Big Data”, working to understand how the system works, who the big players in the space are, and how we–the users and unknowing participants–play a part. This body of work is meant to help inform a larger project I hope to finish as my Master’s thesis at the Royal College of Art and Imperial College, but for my time at Pratt I wanted to bring these concepts home.


Goal:

To spread awareness about how we–as normal, everyday citizens–are involved with data: how the system works, what data even is, how that data moves within the system, and what its impact on each of us can be.

Hope:

To start creating transparency in the systems we cannot see, so that one day we can begin to trust data as the asset and resource it can be, with the confidence that we won’t be taken advantage of.




Check out the Pitch Video:











Question 1.


At the beginning of this exploration, I found myself diving into this big, nebulous concept of “Big Data,” a phrase we keep hearing in the media, in workplace conversation, and definitely in higher education. My questions were prompted by my earlier work on the effects of digital design on mental health, and by the realization that the interactions most dangerous to our mental health are entirely driven by data. That sent me down this path with one main question:



What even is big data?




For answers, I initially looked at a number of impactful books–like Mindf*ck by Christopher Wylie, Democracy Hacked by Martin Moore, and The Age of Surveillance Capitalism by Shoshana Zuboff, all wonderfully scary books–but I quickly realized that despite sharing terrifying concepts and examples of how big data can be used against us, they didn’t really explain the system itself in the depth I was looking for.

So I turned to the law–specifically the most recent large-scale laws intended to govern data and privacy: GDPR and CCPA. The General Data Protection Regulation (GDPR) is defined as:


“The General Data Protection Regulation (GDPR) is the toughest privacy and security law in the world. Though it was drafted and passed by the European Union (EU), it imposes obligations onto organizations anywhere, so long as they target or collect data related to people in the EU. The regulation was put into effect on May 25, 2018. The GDPR will levy harsh fines against those who violate its privacy and security standards, with penalties reaching into the tens of millions of euros.” [ source. ]


GDPR applies to any company that targets or collects data on people in the EU, and it was the first major piece of regulation put in place to govern the movement and ownership of personal data and information. The California Consumer Privacy Act (CCPA) is the first American equivalent aimed at doing the same thing. It’s defined by the California Office of the Attorney General as:


“The California Consumer Privacy Act of 2018 (CCPA) gives consumers more control over the personal information that businesses collect about them. This landmark law secures new privacy rights for California consumers, including:

  • The right to know about the personal information a business collects about them and how it is used and shared;
  • The right to delete personal information collected from them (with some exceptions);
  • The right to opt-out of the sale of their personal information; and
  • The right to non-discrimination for exercising their CCPA rights.

“Businesses are required to give consumers certain notices explaining their privacy practices. The CCPA applies to many businesses, including data brokers.” [ source. ]


After reading through both sets of regulation and consulting with an attorney who specializes in them, I was able to break the system down into its key players and the key interactions that take place.

Turns out, it’s not quite as complex as I initially thought.





A simplified diagram of how the system governing “big data” functions, including key players within the system and key interactions regarding the movement of personal information and data being exchanged or sold.



Question 2.


A quick summary of the process diagrammed above: data first flows from the user to the controller (a flexible title for the user’s first point of contact), then from controller to processor (another flexible title–it could be the same organization or another one entirely, depending on where data is stored and processed for insight), and finally from either controllers or processors to third parties, where it is sold or exchanged.
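To keep those roles straight, I found it helpful to sketch the flow out in a few lines of code. This is purely an illustrative Python sketch–the role names (controller, processor, third party) come from GDPR, but the record structure, functions, and company names are hypothetical:

from dataclasses import dataclass, field

@dataclass
class DataRecord:
    subject: str                                  # the user the data is about
    contents: str                                 # e.g. "location history"
    held_by: list = field(default_factory=list)   # every party that has received it

def collect(record, controller):
    # User -> Controller: the user's first point of contact collects the data.
    record.held_by.append(controller)
    return record

def process(record, processor):
    # Controller -> Processor: stored and analyzed, possibly by a different organization.
    record.held_by.append(processor)
    return record

def share(record, third_party):
    # Controller/Processor -> Third party: sold or exchanged.
    record.held_by.append(third_party)
    return record

record = DataRecord(subject="me", contents="location history")
share(process(collect(record, "MapApp Inc."), "CloudAnalytics Ltd."), "AdBroker Co.")
print(record.held_by)  # ['MapApp Inc.', 'CloudAnalytics Ltd.', 'AdBroker Co.']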

It was only after understanding this process that my next big question formed–one I have asked many times, and one I assumed was widely misunderstood by the public:


How do you even define personal data?



Luckily, GDPR and CCPA provide those answers as well, but the answer was broader and more overwhelming than I could have ever expected.





Definitions listed in GDPR. The definitions in CCPA differ slightly, but they reference the same types of information and are just as broad in scope.




the answer: basically everything.




Personal information or data is literally any piece of data or information that could be directly related to the identification of a person. That includes the simple things we think of, like phone numbers or ID card numbers, but also subjective things like opinions or judgments recorded somewhere online.



Impact.


When it comes down to it, these regulations were put in place for our personal protection–but with what goal in mind? Most likely, to protect us against anyone who wants to do us harm.

We’ve seen numerous examples of major data breaches resulting in the loss of massive amounts of data, with famous targets including Facebook, the Aadhaar ID system in India, Yahoo, Twitter, Microsoft, and many, many others. (I highly recommend checking out this infographic on informationisbeautiful.net to grasp even a portion of the scale of these breaches over the years.)


That’s not to say we shouldn’t work to protect against bad actors in this space, but I’ve spent my time thinking instead about the potential good that can come from accepting the nature of big data.



Just as there are numerous bad examples, there are numerous examples of the good that data can provide the world. Recently, in the world of science, we’ve seen the breakthrough from Google’s AI division in determining the shapes of protein structures–a process entirely driven by data that will open new avenues for developing drugs and vaccines. In the world of big data and privacy, we can look to Estonia, an entirely digitally driven nation, using data to streamline public services and increase both security and freedom for its citizens.

Reading these stories makes me hopeful for a world where we can begin to trust data and continue to make progress in technology, in society, and in everyday life. But in order to get there, the public needs to trust data... so how do we do that?




building trust
in data





1. Autonomy & Control




Building trust in the system that controls our lives is a difficult challenge, but I believe it’s achievable. It’s going to take two things. The first is creating autonomy and control for individual users–autonomy and control over their personal information–because I believe this information is inherently ours to control, seeing as it’s all about us: our lives, our movements, our habits, our interests, our friends and family.

The issue is that creating that level of autonomy and control in most modern systems as they exist today, especially in the United States, will have to come through systemic change–arguably, and unfortunately, the slowest possible route to change.

Luckily, there’s a second half to the equation.




2. Understanding & Transparency




Building trust in any system also requires a level of understanding, clarity, and transparency about the process, the key players, and the impact of that system–in this case, the impact of the data being collected, transferred, and analyzed. This piece of the puzzle is much easier to address while we work to systemically change a process we currently seem to have no control over.

Now, these are big, sweeping statements that assume the general public is more or less on the same level of understanding–or maybe the same lack of understanding. So I decided to ask and find out.




Question 3.


I’ve spent this whole time talking about the system: how it works, what needs to change, and how we get there. The piece I quickly realized I was missing was an understanding of what the public actually does or doesn’t know. Through a 14-question survey, I received 134 responses from participants across the United States and across all age ranges, asking them a series of questions built around my third big question:


What do you even understand about your personal data?




Despite the fact that this was a limited survey (I plan to run more), my suspicions were unfortunately confirmed. Just like me before beginning this project, it doesn’t appear that anyone really understands what big data is.





Only three responses from the survey, but three answers that I felt accurately communicated the overarching sentiment of most, if not all, participants.



The three responses above relate to only one question, but across all participant answers I was seeing patterns in a few different areas. After reviewing all of those responses, I wanted to capture those feelings in a few descriptors:


  •  fear of manipulation 

  •  worries about security 

  •  hesitation 

  •  curiosity 

  •  helplessness 

  •  ignorance 



These are all powerful words: fear, worry, hesitation, curiosity, helplessness, ignorance. They say a lot about the way the public may feel as a whole. Realistically, when you think about it, it makes a lot of sense. All we see in the news are articles about fears over election results, interference from outside powers, companies taking advantage of us, monopolies dominating the advertising or e-commerce spaces, and so on. It’s no wonder there’s fear and worry, but I found it interesting how many responses came from people who were purely curious to know more. That says one of two things to me:


Either there is a clear lack of education in this area, or there’s a lack of access for people to learn more even if they want to.



Luckily, both of those areas are fairly easy to address. So I decided to bring the world of big data down to our level, into our homes, into our lives in a much more transparent way–despite the fact that we already live and breathe big data without even knowing it. 




Introducing:

BREACH




BREACH is a card game modeled after real-world scenarios involving data, privacy, and the products and services we know all too well. It’s designed to pit friends and family against each other in (mostly) friendly competition, racing to keep hidden the things that never should have been brought to light.

If you haven’t watched the 3-minute pitch video at the top of the page yet, I recommend watching it [ here. ]




Description of the game and its purpose, and a mockup of how the finished package would look when purchased.



Game mechanics.




1.

Players download the BREACH app, where they each input their name and a piece of collateral.

Collateral can be anything, but we recommend a secret or something of value. This gives the game a sense of meaning–a real thing you’re fighting to protect. It also gives you a reason to win, because whichever player or players lose will have their collateral read aloud, while the remaining collateral is deleted.

(Obviously this becomes a fun point of conversation, because every person I’ve spoken to about this project immediately assumes the platform will just become a source for collecting blackmail, but I promise that isn’t the point.)




A quick walkthrough of the BREACH app, screen by screen, demonstrating how players would interact with the digital aspect of the game.




Breach mobile app highlights.




2.

The BREACH app then randomly assigns player roles: one player is selected to play the System, while the rest are left to fend for themselves.

Letting the app designate player roles gives players their first taste of the lack of control they have when it comes to big data and their privacy, and forces them to start placing some control in a system they may not understand.
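For the curious, the assignment itself doesn’t need to be anything clever. A minimal sketch of what the app does conceptually (illustrative Python only, not the actual app code):

import random

def assign_roles(player_names):
    # One player is randomly selected as the System; everyone else fends for themselves.
    system = random.choice(player_names)
    return {name: ("System" if name == system else "Player") for name in player_names}

print(assign_roles(["Avery", "Jordan", "Sam", "Riley"]))
# e.g. {'Avery': 'Player', 'Jordan': 'System', 'Sam': 'Player', 'Riley': 'Player'}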



3.

Players collect their 8 unique data tokens, along with $1000. 

Each data token represents real data that is collected on us every day–some more valuable than others, and some more generic, the kind we’re used to giving away.





Player tokens and cash allotment to start the game.




3D models of each individual data token.




Test prints of the data tokens


Cash card design




Printed versions of each card for testing gameplay.




4.

Players take turns flipping over a scenario card and dealing with the consequences, while the System sits outside this rotation, playing cards whenever it chooses.


Player goal:

Survive till the end of the round with at least one data token in hand.


System goal:

Collect all player data by any means necessary.



This way of pitting players not only against each other but against the System as well was designed to mimic the reality of today, where we often make decisions that we feel are in our best interest but that could easily have a negative impact on other players–despite the fact that we are all collectively working against the system.





Player rotation and system control.




Card types.




5.

Scenario cards are modeled after real-world situations, some common and others less so. Each scenario impacts not only individual players but the System as well.

This once again speaks to the reality of the world we live in. Some situations put us in a good place, giving us control of our data and funds and hurting the System. Others are quite the opposite, causing the loss of data or penalties for doing something illegal or unethical.
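One way to think about balancing these cards is to reduce each scenario to a pair of effects: one on the player who drew it and one on the System. A rough illustrative sketch (the card titles and values below are placeholders, not the printed cards):

from dataclasses import dataclass

@dataclass
class ScenarioCard:
    title: str
    player_tokens: int   # data tokens gained (+) or lost (-) by the player who drew it
    player_cash: int     # cash gained (+) or lost (-) by that player
    system_tokens: int   # data tokens gained (+) or lost (-) by the System

# Placeholder examples of a "good" and a "bad" scenario for the player.
good = ScenarioCard("You enable two-factor authentication", 0, +100, -1)
bad = ScenarioCard("A service you use gets breached", -2, 0, +2)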





3 examples of Scenario cards that players could encounter.




Stacked cards with tokens, an example of how the full box set would be assembled.


The card profile was designed to match the profile of the collection of 8 individual data tokens.




6.

Players can collect Support or Hacker cards to help in their pursuit of survival.

The concept for these cards came from the realities of the world we live in. Support cards demonstrate the power we do have over big data and our privacy. Hacker cards, conversely, show some of the threats that exist in our world–some more likely than others, but all very real.




Examples of Support cards (top row) and Hacker cards (bottom row).




7.

When the timer ends, the game is over. At that point, players assess who survived and who lost. Any player who did not finish with at least 1 data token becomes a losing player. However, if every player survived with at least 1 token, then the System becomes the losing player instead.

This is where the game gets interesting, because in reality, if the System doesn’t have complete control, it isn’t doing its job. And when players are working purely to survive, if they weren’t up to the task, then obviously they don’t deserve their privacy. Right?
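The end-of-game check itself is simple enough to state in a few lines. A minimal sketch, assuming the app tracks each player’s remaining token count (illustrative Python, not the actual app logic):

def losing_players(token_counts):
    # token_counts maps each non-System player to their remaining data tokens.
    losers = [name for name, tokens in token_counts.items() if tokens < 1]
    # If every player kept at least one token, the System loses instead.
    return losers if losers else ["System"]

print(losing_players({"Avery": 3, "Jordan": 0, "Sam": 1}))  # ['Jordan']
print(losing_players({"Avery": 3, "Jordan": 2, "Sam": 1}))  # ['System']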

 

The two possible end results: one where a player lost, and one where the System lost.




8.

Players return to the app, select the losing players, and then sit back while each losing player’s collateral is read aloud.

This is what makes the game worth it for each player: the chance to learn something new about your friends, family, or whoever you’re playing with. And it really hits home when you hear these things read aloud by a system you can’t control.




 Welcome to BREACH. 







Conclusion.


I hope that with BREACH we can start to open up this conversation about data, privacy, and people, because it’s only by starting to talk that we’ll be able to make changes for the better.


Otherwise we’ll just be stuck circling until the day our data is breached in reality.



Overall, this project was far too much fun–figuring out what types of mechanisms can be used to manipulate each individual player, all within the constraints of big data and privacy. The sad thing is that it really didn’t take much to come up with example after example; there’s a lot to choose from. There’s a strong chance I’ll continue to flesh this out through the rest of my Master’s program, and who knows, it may just end up on Kickstarter at some point so anyone can play.

...

Lastly, a big thank you to my family, who helped with the ideation and testing of the game, as well as the filming of the video. I had nothing but a blast working and laughing with them.

 


Additional process photos




...