Privacy settings

Privacy settings are "the part of a social networking website, internet browser, piece of software, etc. that allows you to control who sees information about you". With the growing prevalence of social networking services, opportunities for privacy exposures also grow. Privacy settings allow a person to control what information is shared on these platforms.

Many social networking services (SNS), such as Facebook, have default privacy settings that leave users prone to oversharing personal information. Privacy settings are shaped by users, companies, and external forces. Factors that influence user activity in privacy settings include the privacy paradox and the third-person effect; the third-person effect helps explain why privacy settings can remain unchanged over time. Companies can enforce a principle of reciprocity (PoR), where users must decide what information they are willing to share in exchange for others' information.

With the growing focus on internet privacy, there are technologies and programs designed to enhance and encourage more privacy setting activity. Applications such as the Personal Data Manager (PDM) are used to improve the efficiency of privacy setting management. Privacy by design can enhance privacy settings through incorporating privacy notifications or prompting users to occasionally manage their privacy settings.

Significance
SNS are designed to connect people online; users share information and build relationships there. Privacy leaks can still occur even with privacy settings intact. Users' connections on SNS can reveal personal information: for example, having many friends from the same university can support the inference that a person attends that university. Furthermore, even if a person has strict privacy settings enabled, their privacy can still be leaked through connections who have fewer privacy settings in place. This calls for enhanced privacy settings that can tolerate differing privacy configurations among connections while still allowing people to connect online.
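The inference risk described above can be illustrated with a minimal sketch (all names, data, and the threshold are invented for the example): given the universities a user's friends disclose publicly, a simple majority vote can guess the user's own hidden attribute.

```python
from collections import Counter

def infer_attribute(friend_attributes, min_share=0.5):
    """Guess a hidden attribute (e.g. a university) from friends' public values.

    Returns the most common disclosed value if it covers at least `min_share`
    of the friends who disclose one, else None.
    """
    disclosed = [a for a in friend_attributes if a is not None]
    if not disclosed:
        return None
    value, count = Counter(disclosed).most_common(1)[0]
    return value if count / len(disclosed) >= min_share else None

# A user hides their university, but most of their friends list the same one;
# None represents friends with stricter settings who disclose nothing.
friends = ["State U", "State U", "State U", None, "Tech College"]
print(infer_attribute(friends))  # prints "State U"
```

This is why strict settings on one's own profile are not sufficient: the signal comes entirely from connections' disclosures, which the user does not control.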

The ability to control who views their content influences users' decisions to share or not share images on SNS such as WeChat or Qzone. Different communities call for different levels of privacy; for example, an individual is more likely to share a photo of themselves and a close friend with their close friend circle and family than with strangers. This reveals a need for fine-grained privacy settings that give users more flexibility in what they share. Hu Xiaoxu et al. suggest that privacy settings should encourage social networking on SNS while simultaneously protecting user privacy.

Default settings
Privacy settings for SNS have defaults that set users up to automatically share the personal information they have entered. For example, Twitter users are given a public profile by default when an account is first made. Furthermore, SNS privacy policies have been shown to be too complex for consumers to fully understand, leading to personal information being shared regardless of user awareness. Even after a user deletes their Facebook profile, Facebook can still use and sell user information according to its privacy policy. Facebook's default settings allow friends to view a person's profile and anyone to search for it. Default settings can be chosen for their convenience: users do not have to exert as much effort as when personalizing their privacy settings.

Theories
Privacy settings are situated in the framework of communication privacy management theory. This theory states that privacy management involves setting boundaries and agreements with individuals, highlighting that once information is shared, it becomes their information as well. In a study about teenagers and their privacy, privacy concern was revealed to be the biggest contributor to managing both personal and interpersonal privacy (see Privacy concerns with social networking services). Privacy settings can be inaccessible or effortful to implement. Teenagers who feel fatalistic about their personal privacy are more likely to depend on interpersonal privacy techniques. De Wolf's study revealed that teenagers used more personal privacy techniques than interpersonal privacy management, emphasizing the need for accessible, clear privacy settings.

Voluntary servitude is the idea that people knowingly give their support to authoritative figures by subjecting themselves to servitude. In the context of social media, voluntary servitude describes how users expose their information to companies and perpetuate data collection and monetization. Romele et al. offer a possible explanation of voluntary servitude through system justification theory, which states that people learn to internalize societal hierarchies and perpetuate their existence. Users comply with the power hierarchy by allowing their information to be extracted by these companies, for example by sharing personal information with close family and friends, which can then be harvested by companies and third parties.

The theory of planned behavior links beliefs, attitudes, norms, and behavior. In a study that surveyed undergraduate students on their Facebook use, a majority responded that when their close friends think privacy protection is important or engage in privacy protection activity, they are more likely to intend to engage in privacy protection behavior themselves.

Privacy paradox
By accepting privacy policies and thereby agreeing to companies' default settings, users are prone to oversharing information. However, users usually do not change their privacy settings unless they personally experience a privacy invasion or unintentionally share information. In a study exploring the connection between Facebook attitudes and behaviors, having a friend experience a privacy invasion (e.g. someone hacking into their profile) did not lead to participants implementing privacy changes. A possible explanation is the third-person effect: the belief that one is less likely than others to experience a privacy invasion. This mindset is flawed because anyone is susceptible to privacy invasions, and managing privacy settings can help decrease that risk.

The privacy paradox is the idea that individuals' privacy attitudes and beliefs do not match their privacy behavior. This concept offers an explanation of why individuals may be complacent about privacy setting management. The privacy paradox intertwines with the third-person effect because individuals believe privacy is important but do not believe a privacy-related incident will happen to them rather than to others. Recognizing personal privacy as important is a low-cost effort, but actually taking measures to protect one's privacy may be too high-cost for individuals, explaining the privacy paradox.

An extension of the privacy paradox is the discrepancy between the nothing to hide claim and the use of privacy settings. In a study of Canadian teenagers' use of SNS, a majority of participants who claimed they had nothing to hide still utilized privacy settings such as blocking other users. In addition, participants portrayed different selves across different SNS such as Facebook and Snapchat, which is a form of privacy in itself.

Influences
Another reason users do not alter their privacy settings is simply not knowing the settings exist. Facebook users who know privacy settings exist are more likely to change them than users who do not. Furthermore, some Facebook users explain their lack of privacy setting alteration by noting that choosing who counts as a Facebook friend is already a form of privacy. However, Debatin et al. emphasize that the criteria individuals use to decide who is a Facebook friend are typically relaxed, posing privacy risks for users. A different study showed that the number of friends with a private profile increased the likelihood that a user would adopt one; spending more time on Facebook and being a woman also increased that likelihood. A private Facebook profile was defined as changing the default settings so non-friends cannot search for the profile. Users say they are more likely to engage in privacy behavior when their data is valuable, privacy is prevalent on the app, and implementing privacy settings is easy.

Updated privacy settings
In May 2020, Facebook began rolling out an option for users to delete or archive past posts from a certain time period or involving certain people. This "Manage Activity" tool gives users more security and privacy control. It is only accessible through the mobile app and has yet to be adapted to the web version of Facebook.

Policy design
Users are not the sole party involved in privacy; companies are also responsible for implementing default privacy settings and creating options. Romele et al. acknowledge that social media companies often play a significant role in perpetuating the voluntary servitude of users. Social media privacy policies can be complex, and default settings are set up to collect data that is beneficial and profitable to companies. In a study that examined Facebook's privacy policy from 2005 to 2015, Shore et al. found that the policy became increasingly unclear and unaccountable; a portion regarding the use of personal data and third-party involvement was found to be increasingly confusing. Besides portraying their privacy protections as clear and beneficial for users, companies also do little to heighten awareness of constant data collection. An example is Facebook ad preferences: Facebook portrays personalized ads as a specialized, beneficial option for users but does not explicitly state that these ads largely benefit the companies and third parties involved.

Principle of reciprocity
Default settings put users on a specific trajectory regarding their privacy. The principle of reciprocity (PoR) plays out in terms of privacy and sociability on these networks: users who give up some privacy receive application utility in return. In a WhatsApp study testing user privacy choices (see Reception and criticism of WhatsApp security and privacy features), users were interviewed about the Last Seen Online (LSO) and Read Receipt (RR) options. Participants showed a stronger preference for keeping RR on than LSO. Individuals were willing to share whether they had read a message in exchange for their recipients' read status, an example of the principle of reciprocity. LSO was less practical than RR because being online did not directly translate to having read messages. Younger participants were more likely to have LSO turned off, indicating that older participants were less likely to use restrictive privacy settings. This could be because older users keep closer circles of contacts and/or place less emphasis on what they share. Designing privacy settings to be reciprocal forces users to decide what private information they are willing to exchange for others' private information.

Profit
SNS companies that want to increase advertising revenue can do so by increasing the number of users and the retention rate. Lin et al. revealed that both informational and territorial privacy are user concerns with SNS. Territory coordination, the access to an individual's virtual territory such as a Facebook profile or a Twitter page, influences user privacy management more than informational disclosure. Lin et al. recommend finer-grained privacy settings to better suit a wide range of territorial and informational privacy preferences. By targeting user preferences and needs, companies can increase the number of users on their platforms and thereby their revenue, a monetary motivation for companies to adjust privacy settings to better support users.

Culture
Cultural differences, such as living in a collectivistic versus an individualistic society, can influence the privacy settings users choose. In a study that analyzed Twitter privacy behaviors across the globe, cultural differences appeared between countries. Culture was measured through individualism and uncertainty avoidance. According to Hofstede's cultural dimensions, individualism places a heavy focus on the person, while collectivism is based more on group effort; uncertainty avoidance is how uncomfortable an individual is with the unknown. Collectivistic societies tend to be less individualistic and less avoidant of uncertainty, while individualistic societies tend to be more individualistic and more avoidant of uncertainty. The results of this study show that collectivistic cultural values support more permissiveness, as measured by sharing geographic location on a person's profile: privacy settings in collectivistic communities supported more information sharing than in individualistic communities.

The distinction between in-group and out-group could also play a role in this relationship. Collectivistic societies, such as Japan or China, distinguish more between in-group and out-group than their individualistic counterparts. Users in collectivistic countries typically keep a smaller inner circle, which fosters a more intimate, trusting environment for sharing personal information. Individualistic societies, such as the U.S. or much of Europe, typically keep a bigger social circle in which in-group and out-group do not differ much. Internet penetration, the proportion of a country's population that uses the Internet, also predicted user privacy: in areas with low internet penetration, users tended to have private accounts but did not conceal their location. This study reveals how cultural values and Internet access can affect a person's privacy settings.

Societal norms
Societal norms can also contribute to privacy setting behavior. Close friends' own privacy behavior can influence a person's intentions to participate in similar behavior. However, intentions may not always lead to action, as seen in Saeri et al.'s study: two weeks after indicating intentions on a survey to engage in privacy protection behavior, most participants had done nothing. An explanation lies in the habitual nature of using Facebook, which makes it difficult for some to change their behavior. Facebook also encourages users to share information, which shapes the norms that influence privacy behavior.

Design
With the growing prevalence of social media, the risk of privacy leaks grows as well. Privacy settings can help protect users if designed and used in specific ways. Privacy settings could be re-designed with the intention of protecting the user as much as possible, an approach called privacy by design. Privacy by design aims to limit the risks of information sharing while maintaining or possibly increasing the benefits. Privacy policies can be complex and unclear, which is an obstacle to users understanding their privacy. Potential changes include simpler privacy policies that are concise, clear, and understood by the user. Incorporating reminders for users to occasionally check, and possibly update, their privacy settings may increase privacy awareness. Because default settings are designed for users to keep an open profile, Watson et al. offered a different default-setting design: Facebook's default privacy settings could be made more conservative (e.g. not being searchable by anyone on Facebook) to prevent unintentional information sharing. However, utility needs to be balanced against default privacy settings; if defaults were too strict and closed off, the functionality of social media apps could decrease. A balance should be sought between default privacy settings that protect users from unwanted privacy leaks and settings that allow users to socialize and interact online.

When first choosing privacy settings, it may be useful to choose from pre-made profiles with varying levels of privacy. Sanchez et al.'s study revealed that profile examples accurately reflect user privacy preferences and beliefs, limiting the discrepancy between privacy beliefs and privacy behavior. Along with the profile examples, users could also manually adjust their privacy settings afterwards. This keeps the process flexible and more convenient than manually choosing every privacy setting option. Furthermore, a simulation tool that informs users about the visibility of posts and comments can encourage more use of privacy settings. Sayin et al. created a Facebook simulation tool that showed users how a post's audience setting would affect comment owners and their privacy. For example, if a user commented on a post that was initially viewable only by friends but was later changed to public, the user's privacy is put at risk because Facebook does not notify users of the audience change. Of the participants who used the simulation tool, 95% believed it would help increase Facebook privacy awareness.
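The audience-change risk studied by Sayin et al. can be illustrated with a small sketch (the audience names, ordering, and data are invented, not Facebook's actual settings): a checker that flags commenters whose comments become visible to a wider audience than the one they originally commented under.

```python
# Hypothetical sketch of the audience-change problem: audiences are ordered
# by reach, and a comment is flagged when the post's audience is later
# widened beyond the one the commenter saw at comment time.
AUDIENCE_REACH = {"only_me": 0, "friends": 1, "friends_of_friends": 2, "public": 3}

def comments_at_risk(comments, new_audience):
    """Return the authors of comments written under a narrower audience
    than the post's new audience setting."""
    new_reach = AUDIENCE_REACH[new_audience]
    return [c["author"] for c in comments
            if AUDIENCE_REACH[c["audience_at_comment_time"]] < new_reach]

comments = [
    {"author": "alice", "audience_at_comment_time": "friends"},
    {"author": "bob", "audience_at_comment_time": "public"},
]
# The post owner widens the audience from "friends" to "public":
print(comments_at_risk(comments, "public"))  # prints ['alice']
```

A simulation tool like the one in the study would surface exactly this list to the commenters, who otherwise receive no notification of the change.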

Software
Any time a new connection is made on a social media app, users could be prompted to set privacy settings for that specific individual. This may be too tedious and effortful for some users to do effectively, but it can be balanced with the assistance of software tools such as a personal data manager (PDM). This software takes into account a user's privacy preferences and applies matching privacy settings to the user's accounts. However, more research is needed to ensure such software can accurately translate privacy preferences into privacy settings. Personal data managers have the potential to help users become more involved in their privacy and to lessen the effort of setting privacy controls. Another tool, AID-S (Adaptive Inference Discovery Service), personalizes each user's privacy preferences, since what counts as private information varies between individuals. Torre et al. found that AID-S can be used to find a user's preferred privacy settings and help users make more informed privacy decisions, including about third-party involvement. Furthermore, a framework has been created for smart-home information processing that includes two-layer security to improve user privacy. This framework incorporates user privacy preferences, similar to the personal data manager, and uses the Data Encryption Standard (DES) and Top Three Most Significant Bit (TTMSB) embedding to safely transmit data from smart-home devices. TTMSB takes into account user privacy preferences and conceals sensitive information during data transmission.
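The core idea of a personal data manager can be sketched as a mapping exercise (every preference key, platform name, and setting name below is invented for the illustration; real PDM research systems are more elaborate): one set of stated user preferences is translated into each platform's own setting names, so the user does not configure every account by hand.

```python
# Hypothetical sketch of a personal data manager (PDM). It holds one generic
# preference set and translates it into per-platform setting names.
USER_PREFERENCES = {
    "profile_searchable": False,   # non-friends cannot find the profile
    "share_location": False,
    "posts_audience": "friends",
}

# Each platform exposes different names for overlapping concepts; a platform
# may simply not support some preference.
PLATFORM_SETTING_MAP = {
    "facebook": {
        "profile_searchable": "who_can_look_me_up",
        "share_location": "location_sharing",
        "posts_audience": "default_post_audience",
    },
    "twitter": {
        "profile_searchable": "discoverability",
        "posts_audience": "restricted_posts",
    },
}

def apply_preferences(platform, prefs):
    """Translate generic preferences into the platform's setting names,
    silently skipping preferences the platform does not support."""
    mapping = PLATFORM_SETTING_MAP[platform]
    return {mapping[k]: v for k, v in prefs.items() if k in mapping}

print(apply_preferences("twitter", USER_PREFERENCES))
# prints {'discoverability': False, 'restricted_posts': 'friends'}
```

The open research question noted above is precisely whether such a translation can be made accurate: the skipped `share_location` preference shows how a platform gap can silently leave a user less protected than they asked to be.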

Data science has been used to extract inferred information from databases of collected SNS user information. However, the same inference capability can be used to better inform users about how their information can be used and to support more informed privacy decisions. Control over one's privacy, and transparency about how one's information will be used, can help facilitate a relationship between data science and privacy.

Trust-based negotiations
Trust-based negotiations are based on a contingency of acceptance or rejection from the user. With respect to privacy, trust-based negotiations have been offered as an alternative to the binary of accepting or rejecting privacy policies in full, allowing acceptance and rejection of specific parts of a privacy policy. This gives users more control over their privacy and allows interaction between the two parties. The General Data Protection Regulation (GDPR) requires that third parties (TPs) use concise, clear language in their privacy policies and that the user give a clear response of acceptance or rejection. There must be a consensual agreement between the user and the TPs; if either side is unsatisfied with the privacy terms, they can negotiate. A PDM plays a major role here by mediating between the user and the TP: it applies the user's privacy preferences to the TP's privacy statements and either accepts or rejects them based on those preferences. If there is a rejection, both parties can enter negotiation. The advancement of an interactive privacy model attempts to make privacy settings a one-time action that is applied across the Internet of things (IoT).
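The per-clause mediation step can be sketched as follows (this is an illustration of the idea, not the GDPR mechanism or any published system; the clause categories and preferences are invented): a mediator evaluates each policy clause against the user's preferences, and rejected clauses become negotiation items rather than sinking the whole policy.

```python
# Hypothetical per-clause policy evaluation, as a PDM mediator might do it.
USER_PREFS = {
    "analytics": True,        # user accepts analytics data use
    "third_party_ads": False, # user rejects sharing with ad partners
    "data_resale": False,
}

POLICY_CLAUSES = [
    {"id": 1, "category": "analytics"},
    {"id": 2, "category": "third_party_ads"},
    {"id": 3, "category": "data_resale"},
]

def evaluate_policy(clauses, prefs):
    """Accept or reject each clause individually. Unknown categories are
    rejected by default, so the user must opt in explicitly."""
    accepted = [c for c in clauses if prefs.get(c["category"], False)]
    to_negotiate = [c for c in clauses if not prefs.get(c["category"], False)]
    return accepted, to_negotiate

accepted, to_negotiate = evaluate_policy(POLICY_CLAUSES, USER_PREFS)
print([c["id"] for c in accepted])      # prints [1]
print([c["id"] for c in to_negotiate])  # prints [2, 3]
```

The design choice worth noting is the reject-by-default rule for unrecognized clauses: it mirrors the opt-in stance the text attributes to GDPR-style consent, where silence is never treated as acceptance.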

Interactive educational games
The privacy paradox is driven by lack of privacy knowledge, fatigue, and feeling distant from privacy invasions. Education and increased intrinsic motivation can help alleviate these contributors and, in turn, the privacy paradox. In a study that tested the effectiveness of an interactive privacy smartwatch game, players were more likely to engage in privacy behavior such as enabling a lock screen. The game was personalized: users could create their own avatar and had to complete time-sensitive tasks. The time constraint encouraged continued engagement with the game, and the personalization connected users to it. The tasks were split into levels by difficulty and included checking app permissions, enabling screen lock, disabling GPS, and turning off SMS permissions. Alongside the tasks, individuals also had to answer questions about privacy settings, such as "How can you stop your contacts from being used by apps?" A person's digital character had health determined by task performance and answering questions correctly. This study emphasizes the potential of interactive privacy games to increase privacy literacy and encourage more use of privacy settings.