Ahead of Tuesday's #AppleEvent, protests outside #Apple Store by @EFF & other #privacy advocates urge company not to roll out operating system feature that scans for child abuse content. 6000 petition signatures collected. #csam #abc7now pic.twitter.com/tdwCJ0XGuB
— David Louie (@abc7david) September 14, 2021
Today, September 14, Apple hosts an event to announce a range of new products and services, including the new iPhone 13 and Apple Watch. The launch is not without controversy: the expected premiere of iOS 15 will bring changes that privacy-oriented users may see as contrary to Apple's long-standing messaging that privacy is a core value for the company.
Apple recently announced that iOS 15 would scan users' photos and notify authorities in order to detect child sexual abuse material (CSAM). The change raised concerns among users that the proposed feature could compromise their privacy. Privacy advocates, including the EFF, protested outside an Apple Store in San Francisco, and a petition against the feature collected 6,000 signatures.
In the background, Apple also faces lawsuits over allegedly excessive fees for in-app purchases. Developers argue that the 30% commission is unreasonable and that it ultimately translates into higher prices for users.
Pixalate recently released its Mobile Delisted Apps Report: H1 2021, offering insights into how the Apple App Store and Google Play Store protect users by delisting malicious apps. According to Pixalate's analysis, 59% of all apps delisted from the Apple App Store in H1 2021 did not have a privacy policy prior to delisting. Jalal Nasir pointed out that the CSAM policy may address some issues, but far more important matters remain to be addressed.
David Louie, ABC7 News tech reporter, cited Pixalate's research, reporting that "Pixalate studied apps in Apple App Store and found that some do not have a privacy policy, exposing weaknesses that even proposed scanning program, known as CSAM, doesn't."
Jalal Nasir, CEO of Pixalate, explained that “Apple, with the CSAM announcement, might monitor the front door in some ways, but the window in the backdoor of this house needs much more transparency and policing than the front doors right now.”
Download a copy of the H1 2021 Delisted Apps Report here for free.
You can also register for our webinar on September 30, 2021, where we will review this data, along with other data about risk factors in the mobile in-app ecosystem, in greater detail.
Disclaimer: The content of this page reflects Pixalate's opinions with respect to the factors that Pixalate believes can be useful to the digital media industry. Any proprietary data shared is grounded in Pixalate's proprietary technology and analytics, which Pixalate is continuously evaluating and updating. Any references to outside sources should not be construed as endorsements. Pixalate's opinions are just that: opinions, not facts or guarantees.
Per the MRC, "'Fraud' is not intended to represent fraud as defined in various laws, statutes and ordinances or as conventionally used in U.S. Court or other legal proceedings, but rather a custom definition strictly for advertising measurement purposes." Also per the MRC, "'Invalid Traffic' is defined generally as traffic that does not meet certain ad serving quality or completeness criteria, or otherwise does not represent legitimate ad traffic that should be included in measurement counts. Among the reasons why ad traffic may be deemed invalid is it is a result of non-human traffic (spiders, bots, etc.), or activity designed to produce fraudulent traffic."