
The Root of the Problem

Why the Details of a Recent Story about Facebook Matter

Irina Raicu

Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics at Santa Clara University.  Views are her own.

On January 29, TechCrunch reported on an iOS app whose users were paid $20 per month for allowing Facebook to collect information about their online behavior. The app appeared to have replaced an earlier Facebook product, called Onavo, which Apple had banned from its App Store last August for failing to meet Apple's revamped data collection policies.

TechCrunch reporter Josh Constine explained that the app had “been referred to as Project Atlas since at least mid-2018, around when …  Apple instituted its new rules that prohibited Onavo.” According to security expert Will Strafach, the developers of the new Facebook app, however, “didn't even bother to change the function names, the selector names, or even the ‘ONV’ class prefix. it's literally all just Onavo code with a different UI.”

The app’s users were asked to give Facebook root access to their phones. According to Strafach, “[t]his hands Facebook continuous access to the most sensitive data about you, … there is no good way to articulate just how much power is handed to Facebook when you do this.”

Users were also instructed to install the app via an enterprise certificate. However, Apple’s Enterprise Certificate Program is restricted, under Apple’s own terms, to internal apps distributed to a company’s own employees.

On Twitter, Strafach wrote, “this is the most defiant behavior I have EVER seen by an App Store developer. it's mind blowing...” His entire Twitter thread on this subject is well worth reading, if you want to understand his shock (and get details about misuse of enterprise certificates, the data to which the app requested access, responses to particular claims that Facebook made after the TechCrunch story was published, etc.).

By the end of the day on the 29th, Apple had revoked Facebook’s enterprise certificate, and none of Facebook’s internal iOS apps worked. A day later, TechCrunch reported that Google had also been running a research app through an enterprise certificate (though Google’s app did not seek root access, and it tried to screen out teenagers). In response, Google released a statement that read, in part, “The Screenwise Meter iOS app should not have operated under Apple’s developer enterprise program — this was a mistake, and we apologize. We have disabled this app on iOS devices.” (It bears noting that Facebook’s Onavo VPN, the one banned by Apple last summer, is still available through the Google Play store.) Apple revoked, but soon restored, Google’s enterprise certificate, too.

(The deployment of the app under the name “Project Atlas” suddenly had an ironic ring to it. In Greek mythology, Atlas was a Titan who lost the battle with the gods for control of the heavens. Is Facebook the titan? Are Apple and Google the gods?)

Many articles followed, reflecting on Apple’s de facto role as privacy regulator. Some praised Apple. At least one commentator accused it of “grandstanding” and argued that, if it really cared, it could do more—that Apple “is in the best position to enforce a set of values about data access and collection, if the company truly believes in them.” Others pointed out, however, that Apple’s actions already raised questions about its own vast power.

In a New York Times op-ed, Kara Swisher described the controversy as “two digital giants engaged in an esoteric tussle over how Facebook did an end run around Apple’s very strict rules about data use on its app platform,” and added that “the details hardly matter, except to note that what Facebook did is the equivalent of sneaking out of the house after the curfew your parents imposed after you last sneaked out of the house.”

But the details do matter. Rather than gloss over them, we should see this as an opportunity for all of us to learn—about the definition of “root access,” for example. About the power of platforms (for good, or not) and the inadequacy of laws that constrain them. About the lengths to which some companies will go to collect certain personal data and to camouflage that collection. As TechCrunch explained in a subsequent article,

Facebook claims it didn’t hide the program, but it was never formally announced like every other Facebook product. There were no Facebook Help pages, blog posts, or support info from the company. It used intermediaries … to run the program under names like Project Atlas and Project Kodiak. Users only found out Facebook was involved once they started the sign-up process and signed a non-disclosure agreement prohibiting them from discussing it publicly.

There is nothing esoteric about that.

The key issue in last week’s clash is not the violation of Apple’s Terms of Service. It’s that mere users have little if any control in an environment whose workings and power structures are opaque to most of us. Articles like the ones in TechCrunch are a means of empowering citizens/consumers (and regulators), by making that environment less opaque.

Photo by Esther Vargas, used without modification under a Creative Commons license.

Feb 5, 2019
