A tripartite model of trust in Facebook: acceptance of information personalization, privacy concern, and privacy literacy
Main Authors:
Other Authors:
Format: Article
Language: English
Published: 2021
Subjects:
Online Access: https://hdl.handle.net/10356/145658
Institution: Nanyang Technological University
Summary: This study draws on the mental accounting perspective and a tripartite model of trust to explain why users trust Facebook. We argue that trust in Facebook is related to (1) trust in companies that collect personal data, (2) acceptance of information personalization, (3) low privacy concern, and (4) low privacy literacy. Further, we argue that privacy literacy amplifies the relationship between privacy concern and the other factors because, among individuals with high privacy literacy, privacy concern is especially diagnostic of the potential harms of a loss of privacy. These arguments align broadly with theorizations about the factors influencing privacy-related cognitions. We analyzed cross-national survey data from 4,684 mobile internet users and found support for our predictions. Our findings suggest that privacy concern has a weak relationship with trust-related beliefs except among individuals with good privacy literacy. Among those individuals, privacy concern is negatively related to trust, potentially threatening an important revenue stream for data-driven companies, especially amid growing calls for privacy literacy education.