The proliferation of smart home devices has ushered in an age of unprecedented convenience, yet a comprehensive new study reveals a deeply unsettling reality lurking behind the seamless user experience of many popular applications. An in-depth analysis of 49 widely used smart home apps from Apple’s App Store in mainland China has uncovered a systemic and alarming disregard for the privacy of individuals who are not the primary users of these devices. This research highlights a disturbing ecosystem where privacy policies are inconsistent with actual app functions, user controls are often illusory, and the personal data of family members, guests, and even neighbors—collectively termed “bystanders”—is collected without their knowledge or consent. The findings paint a stark picture of a surveillance-by-proxy environment, where the convenience for one user comes at the unacknowledged privacy cost of many others, facilitated by misleading information presented even on trusted platforms like the App Store.
Pervasive and Sensitive Data Collection
The investigation’s core finding is the sheer breadth and depth of data collected by these smart home applications, a process that begins the moment a user signs up. Every single one of the 49 apps examined mandated user registration through a mobile phone number, which must be verified via an SMS code. This practice is not merely a security measure; it aligns with China’s stringent real-name registration system, immediately and irrevocably linking the app’s activity to a specific, state-identifiable individual. The data acquisition, however, extends far beyond this initial contact information. The study cataloged extensive indirect collection of technical data, including unique device identifiers like IMEI and MAC addresses, operating system details, and comprehensive network information. This includes Wi-Fi network specifics, IP addresses, and even cellular base station data, a combination that can be leveraged to achieve highly precise and persistent location tracking, often without the user’s active awareness of how this information is being triangulated.
Beyond the technical and registration data, the privacy policies of these applications openly disclose the collection of intensely personal and sensitive information, justified as necessary for the devices’ intended functions. An overwhelming majority—39 of the 49 apps—reported collecting financial data, which could encompass everything from transaction amounts to bank card numbers or third-party payment account details. More alarmingly, the privacy policies for all 49 applications stated that they collected biometric information, with a specific and recurring emphasis on facial recognition data captured by smart cameras and doorbells. The intrusion into users’ private lives deepens further, with several apps also acknowledging the collection of personal health and wellness data. This can include highly intimate metrics such as blood pressure readings, blood sugar levels, menstrual cycle patterns, heart rate, sleep quality, body weight, and physical activity logs, effectively turning the smart home into a constant source of sensitive health surveillance.
The Overlooked Issue of Bystander Privacy
A central and deeply troubling focus of the research was the handling of data related to “bystanders”—individuals who are captured by smart home devices but are not the primary account holders. The study categorized these individuals into three distinct groups: live-in bystanders such as family members or roommates, visiting bystanders like guests, and uninvolved bystanders, including neighbors or passersby who might be inadvertently recorded. The analysis concluded that privacy protections for these groups are virtually nonexistent across the entire sample of applications. All privacy controls, settings, and consent mechanisms were designed exclusively for the primary account owner, creating a digital hierarchy where one person’s control over a device grants them unilateral control over the data of everyone else in its vicinity. This design choice effectively renders bystanders invisible from a data rights perspective, leaving them without any agency or recourse regarding their own personal information.
This systemic neglect becomes even more apparent when examining the apps’ features and policies for shared access. While 19 of the applications offered features for sharing device control with other users, such as family members, their privacy policies failed to articulate how consent should be obtained from these secondary users or how their data would be managed separately from the primary account holder’s. For visiting and uninvolved bystanders, the researchers found a complete absence of any explicit mention in privacy policies or any corresponding features in the user interface. Consequently, the study’s traceability analysis classified bystander privacy protections as “absent” across all 49 apps. This means that bystanders receive no notification when they are being monitored, are offered no mechanism for providing or withholding consent, and have no independent control over their own data, which is entirely managed—or mismanaged—through the primary user’s account without their input.
Divergence Between Policies, Interfaces, and Guarantees
The study meticulously mapped the statements made in privacy policies against the actual behavior and options presented in the apps’ user interfaces, revealing critical and recurring mismatches that undermine user trust. A particularly stark example involves data sharing with government authorities. Every single app’s privacy policy contained clauses referencing their legal obligations to share user data with law enforcement or public security agencies under Chinese law. Despite this universal disclosure, not one of the 49 apps provided a corresponding feature in its user interface to notify users if or when such data sharing has occurred. This universal failure to provide transparency in practice led the researchers to classify every app as “broken” in this crucial category, leaving users completely in the dark about when their most private data is handed over. Further inconsistencies were found in how the apps handled user control over marketing, with some apps describing opt-out controls in their policies without implementing any corresponding settings in the interface, creating a false sense of control for the user.
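The policy-versus-interface mapping described above can be sketched as a simple three-way classification. This is an illustrative reconstruction, not the study’s actual tooling: the function name, data structures, and field names are assumptions; only the category labels (“consistent,” “broken,” “absent”) come from the findings reported here.

```python
# Hedged sketch of a traceability check: for each privacy practice, compare
# what an app's policy claims against what its UI actually offers.
# Category names follow the article; everything else is illustrative.

def classify(policy_mentions: bool, ui_provides: bool) -> str:
    """Classify one privacy practice for one app."""
    if policy_mentions and ui_provides:
        return "consistent"    # policy claim backed by a real control
    if policy_mentions and not ui_provides:
        return "broken"        # promised in the policy, missing in the app
    if ui_provides:
        return "undocumented"  # a control the policy never explains
    return "absent"            # never addressed in policy or interface

# Example: government data-sharing notification. Every policy discloses the
# legal obligation, but no app surfaces a notification control in its UI,
# so each app lands in the "broken" category for this practice.
apps = [
    {"name": "app_a", "policy_mentions": True, "ui_provides": False},
    {"name": "app_b", "policy_mentions": True, "ui_provides": False},
]
results = {a["name"]: classify(a["policy_mentions"], a["ui_provides"])
           for a in apps}
```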
This structural lack of user control is further complicated by usability issues and legal frameworks that override stated policies. The researchers documented significant problems with data deletion processes, where users often faced confusing or repetitive workflows to delete their accounts or revoke consent. More critically, deletion rules were applied inconsistently. One documented case showed that a user’s connected devices remained active and potentially continued collecting data even after their account had been formally deleted. These deficiencies are embedded within a legal context where user rights are fundamentally constrained. China’s Cybersecurity Law mandates that companies provide technical assistance to state security authorities upon request, creating a core conflict where privacy policies offering deletion mechanisms are ultimately superseded by the state’s broad access to personal data for national security purposes, rendering user control effectively meaningless in certain contexts.
Apple’s Privacy Labels: A Source of Misinformation
Perhaps the most striking and globally relevant finding was the widespread inaccuracy of Apple’s App Store privacy labels for these Chinese smart home apps. The researchers conducted a detailed comparison between the information presented on these labels, the apps’ own privacy policies, and their observed data collection behaviors, uncovering a pattern of systemic underreporting and outright misinformation. This directly challenges the integrity of a system designed to empower users to make informed privacy decisions. For instance, 26 of the applications claimed in their privacy labels that they collected no data used to track users across other apps or websites. This claim was directly contradicted by their own privacy policies, which disclosed the use of third-party Software Development Kits (SDKs) for user monitoring and analytics—precisely the behavior the labels are supposed to disclose.
The deception presented on the App Store labels extended beyond tracking claims. A total of 23 apps falsely claimed that they did not collect any data linked to the user’s identity. This assertion is patently false, as all 49 apps in the study required phone-number-based registration, a primary personal identifier that directly links all collected data back to a specific individual. Most egregiously, six of the applications declared in their App Store labels that they collected no data whatsoever, in stark opposition to their own detailed privacy policies, which outlined extensive data collection, processing, and sharing practices. These findings suggest that the self-reported nature of Apple’s privacy labels is being exploited, turning a tool intended for transparency into a vehicle for misleading consumers and obscuring the true extent of data harvesting.
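The label-versus-policy audit described above amounts to a set comparison: the data categories a privacy label declares versus those the privacy policy itself discloses. The sketch below is a minimal illustration under assumed names; the category strings and app data are invented, not taken from the study.

```python
# Hedged sketch of a privacy-label audit: flag the data categories an app's
# own policy admits to collecting but its App Store label omits.
# Category names and the example sets are illustrative assumptions.

def label_omissions(label_declared: set, policy_disclosed: set) -> set:
    """Categories disclosed in the policy but missing from the label."""
    return policy_disclosed - label_declared

# Hypothetical app: the label declares only contact info, while the policy
# discloses biometric, financial, and SDK-based tracking data as well.
label = {"contact_info"}
policy = {"contact_info", "biometrics", "financial_info", "tracking_sdk"}

missing = label_omissions(label, policy)
# Any non-empty result marks the kind of underreporting the study
# found in a majority of the 49 labels examined.
```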
A Call for Systemic Reform
The research painted a troubling picture of a smart home ecosystem where the privacy of users, and especially that of bystanders, was systematically sidelined in favor of functionality and data acquisition. The investigation concluded by calling for urgent and comprehensive reforms to address these deep-seated issues. Recommendations included the implementation of stronger, bystander-focused privacy controls that would require explicit consent from anyone being monitored, not just the device owner. Furthermore, the study emphasized the need for greater transparency within app interfaces regarding actual data practices, particularly concerning data sharing with third parties and government authorities. Finally, the findings underscored the necessity for much stricter and more proactive enforcement of consistency between Apple’s App Store privacy labels and the true data handling practices of the applications, ensuring that these labels serve as a reliable source of information rather than a facade for invasive surveillance.
