On the App Store, Apple is legislator, judge, jury, and executioner. Apple makes the rules. It has the final say about which apps you can officially purchase, download, and use on your iPhone or iPad. And importantly, Apple can change its mind at any time and make an app disappear — even to promote Apple’s own apps at the expense of a competitor and even if that competitor is a small business that relies on the App Store for its very existence.
As the world takes a closer look at the power Silicon Valley wields, that status quo is facing new scrutiny. Presidential candidate Sen. Elizabeth Warren (D-MA) actually believes Apple should be broken up: “Either they run the platform or they play in the store,” she told The Verge in March. The Supreme Court recently let an antitrust lawsuit proceed against Apple. And one recent scandal, in particular, has raised the question yet again: does Apple moderate the App Store fairly?
Apple is fully aware that it’s in the crosshairs: just this week, the company published a new webpage titled “App Store - Principles and Practices” defending its stewardship of the store. The App Store offers “equal opportunities to developers,” Apple argues, going so far as to list competing apps (including Google Maps, Facebook Messenger, and Amazon Music) that are freely available on the App Store.
But Apple’s defense is full of holes. Yes, Apple has guidelines for the App Store and a review process, but after a decade, it’s clear that the company doesn’t enforce those rules consistently, and it often seems to enforce them precisely when doing so benefits Apple. Even the apps that are allowed on the store have to fight an uphill battle against Apple’s own services. Spotify — as the company’s EU antitrust complaint makes clear — can never be the default music app on an iPhone. Plus, Apple’s 30 percent cut means that if Spotify sells subscriptions through the App Store, it has to charge customers more just to break even. Apple’s rules even prevent Spotify from directing customers in the app to its website, where they could subscribe without paying Apple those fees.
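To make that break-even math concrete, here’s a minimal sketch of the arithmetic. The $9.99 figure is illustrative, and note that Apple’s commission on subscriptions drops to 15 percent after a subscriber’s first year.

```swift
import Foundation

// Illustrative only: what a service like Spotify would have to charge through
// Apple's in-app purchase system to net the same revenue as a direct $9.99 sale.
let directPrice = 9.99   // price charged on the web, with no Apple commission
let appleCut = 0.30      // App Store commission during a subscriber's first year

// Gross in-app price needed so that price * (1 - appleCut) equals directPrice.
let breakEvenPrice = directPrice / (1 - appleCut)
print(String(format: "Break-even in-app price: $%.2f", breakEvenPrice))
// Prints "Break-even in-app price: $14.27"
```

Spotify, for its part, charged $12.99 a month through Apple’s in-app purchase system before dropping the option altogether: more than its $9.99 direct price, yet still not enough to fully cover Apple’s cut.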
The most recent example of these issues is Apple’s conveniently timed ban of apps that let parents control and monitor what their kids can do on a phone. On April 27th, The New York Times reported that Apple had coincidentally started banning or restricting “at least 11 of the 17 most downloaded screen-time and parental-control apps” right around the time Apple debuted its own version of that idea, Screen Time, in iOS 12. “Apple has approved our software for over five years 37 times,” an OurPact representative told The Verge. “So right now what they’re doing is retroactively enforcing these restrictions that haven’t really been in place.”
According to Apple, the removal of these apps was simply business as usual: the company responded to the Times article by explaining that those apps had broken the rules. Apple updated its App Store policies back in 2017 to bar consumer-focused apps from using an extremely powerful feature, known as mobile device management (MDM), to enable those parental controls. MDM is generally used by IT departments at companies and schools to manage employees’ and students’ devices, and Apple argued that it would be “incredibly risky… for a private, consumer-focused app business to install MDM control over a customer’s device,” given the privacy fallout if a bad actor found their way into a kid’s iPhone.
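For a sense of how much control MDM actually hands over, here’s a rough Swift sketch of the kind of restrictions payload a parental-control service could push to an enrolled device. The overall configuration-profile structure and the `com.apple.applicationaccess` payload type are Apple’s, but the specific keys and bundle IDs here are simplified and illustrative, not a complete working profile.

```swift
import Foundation

// A simplified sketch of an MDM restrictions payload of the sort a
// parental-control service could push to an enrolled, supervised iPhone.
// Key names follow Apple's restrictions payload, but the set shown here is
// trimmed down and illustrative, not a deployable configuration profile.
let restrictions: [String: Any] = [
    "PayloadType": "com.apple.applicationaccess",  // Apple's restrictions payload type
    "PayloadVersion": 1,
    "PayloadIdentifier": "com.example.parental.restrictions",  // hypothetical identifier
    "PayloadUUID": UUID().uuidString,
    "allowExplicitContent": false,
    // Hide specific apps on a supervised device (bundle IDs are placeholders).
    "blacklistedAppBundleIDs": ["com.example.videoapp", "com.example.game"],
]

let profile: [String: Any] = [
    "PayloadType": "Configuration",
    "PayloadVersion": 1,
    "PayloadIdentifier": "com.example.parental.profile",  // hypothetical identifier
    "PayloadUUID": UUID().uuidString,
    "PayloadContent": [restrictions],
]

// Configuration profiles are delivered as XML property lists; serialize and print one.
if let data = try? PropertyListSerialization.data(fromPropertyList: profile,
                                                  format: .xml,
                                                  options: 0),
   let xml = String(data: data, encoding: .utf8) {
    print(xml)
}
```

The point of the sketch is the scope: a profile like this governs the whole device, which is exactly the kind of reach Apple says it doesn’t want consumer apps to have, and exactly what made these parental-control apps work.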
Apple isn’t entirely off base here. In 2010, a company called EchoMetrix, which sold software that let parents monitor their children’s internet traffic, was caught passing that data to the other side of its business: Pulse, the company’s market research arm.
But if Apple is so concerned about the privacy risks of MDM software, why did it offer that feature in the first place, approve these banned parental control apps for years before it changed the policy in 2017, and still fail to remove them even after that change was enacted? As OurPact — one of the now-banned apps — documented, Apple approved its MDM-using apps dozens of times over the years, including 10 updates in 2018. “From day one, the very first version of OurPact that we submitted to the App Store for review has MDM in it. We’ve clarified questions for the App Review team about our use of MDM,” notes Dustin Dailey, a senior product manager at OurPact. Other apps, like Kidslox and Qustodio, also saw their updates rejected starting in the summer of 2018 when — again, coincidentally — Apple’s Screen Time feature was first announced. (The two companies have since filed an antitrust complaint against Apple.)
Meanwhile, the developers of these apps have banded together to demand an API from Apple that would allow them to offer those services again in an Apple-approved format, going so far as to propose actual specifications for what that might entail. After all, they argue, if Apple is really committed to a “competitive, innovative app ecosystem,” the company should put its money where its mouth is and let these services compete. That seems unlikely to work, though: according to Dailey, Apple told OurPact that even if it found another approved method to make the app work, the very function of blocking apps was fundamentally problematic to Apple.
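For illustration, here’s a purely hypothetical Swift sketch of the shape such a sanctioned API could take. None of these types exist in iOS, and this isn’t the developers’ actual proposed specification; the point is simply a model where Apple owns enforcement and third-party apps merely request limits.

```swift
import Foundation

// Purely hypothetical: a sketch of what a sanctioned screen-time API could
// look like. These types don't exist in iOS, and this isn't the developers'
// actual proposal; it just illustrates a model where the OS enforces limits
// and third-party apps never get device-wide MDM control or raw activity data.
struct AppLimit {
    let bundleIdentifier: String      // e.g. "com.example.game"
    let dailyAllowance: TimeInterval  // seconds of use allowed per day
}

protocol ScreenTimeManaging {
    /// Ask the system for permission to manage a child's device, with the
    /// parent approving through an Apple-controlled prompt.
    func requestAuthorization(completion: @escaping (Bool) -> Void)

    /// Hand limits to the OS, which enforces them on the app's behalf.
    func apply(_ limits: [AppLimit]) throws

    /// Read back only the aggregate usage the system chooses to share.
    func dailyUsage(for bundleIdentifier: String) -> TimeInterval?
}
```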
The timing of Apple’s enforcement just isn’t a good look, even if the company insists it’s a coincidence, as an Apple spokesperson told The New York Times. (When The Verge reached out to clarify some of these inconsistent policies, Apple declined to comment further.)
Meanwhile, Apple still allows plenty of MDM apps on the App Store, like the business-focused Jamf Now or any number of education-focused MDM tools that schools use to manage students’ iOS devices. Why does Apple allow employers to leave their employees’ data vulnerable, or schools to put their students’ data at risk, but not allow parents to make similar decisions about devices they’ve purchased for their kids?
The most charitable explanation is that Apple really believes that using these APIs is an unacceptable risk for consumers, and that it allows businesses and schools to use them simply because there’s no other recourse or because those larger institutions are better equipped to handle the risk.
But it’s a view that’s oddly restrictive toward this one type of app, and it doesn’t account for the fact that nearly every app and service we use comes with some risk of bad actors. After all, Facebook is allowed to stay on the App Store despite numerous security breaches that have compromised user data, and Amazon can ask for your credit card number without anyone worrying that Jeff Bezos will steal it. So for Apple to say that these parental control apps are too much of a risk feels like an arbitrary line in the sand, and it’s not clear why we should trust big enterprise companies not to misuse customer data any more than these now-banned small ones.
At best, Apple’s stewardship here is inconsistent; at worst, it’s biased in favor of its own services. Neither possibility says anything positive about Apple’s ability to run or moderate the App Store fairly. (Apple’s former app approval chief says he’s “really worried” about the company’s behavior.) It all highlights the biggest problem with Apple’s walled garden: you live or die by Apple’s whim. Even if you’re a developer who’s been building an app for years, the whole thing can be yanked out from under you in an instant simply because Apple changed the rules of the game.
Apple is well aware that its leadership of the App Store is under fire, and it already seems to be making moves to appear less anti-competitive. Take Valve’s Steam Link app, which finally made its surprise debut nearly a whole year after Apple mysteriously blocked it for “business conflicts with app guidelines” (despite the fact that it worked similarly to other LAN-based remote desktop apps that you could already download from the store). The approval came just days after the Supreme Court’s ruling that Apple would have to face an antitrust case about monopolistic practices on the App Store.
Next week, the company will have its biggest opportunity yet to convince developers that it will treat them fairly. Monday marks the beginning of Apple’s Worldwide Developers Conference (WWDC), where the company makes its annual pitch for why developers should build apps for its platforms, and where it’s expected to arrive with new software and hardware in tow.
For many, the most important feature in iOS 13 might not be a new Dark Mode or undo gesture. Instead, it’ll be a promise that Apple will let you build a business without fear that some new rule will suddenly bring it crumbling down.
https://www.theverge.com/2019/5/31/18647249/wwdc-apple-parental-control-app-store-mdm-spotify-moderation-developers-2019