Mobile Privacy and the APPS Rights Act...
Mobile privacy is suddenly on the national political agenda in a big way. On Friday, the F.T.C. announced a set of recommended mobile privacy guidelines that "lays out a clear picture of what sort of activities might bring a company under investigation". This comes right on the heels of Congressman Hank Johnson (D-GA) releasing the APPS Rights Act, which aims to "improve the security and transparency of user data in mobile apps".
What's at issue here is that, now that people statistically connect to the Internet more often through mobile devices like smartphones and tablets than through PCs or laptops, the laws regulating the types of data that companies can collect, store, and sell to third parties are quickly proving to be either obsolete or nonexistent. There have been cases where companies collected personal information about users younger than 13 and then also accessed information about all of the contacts in their address books, and other cases where companies conveyed the impression that an app would gather geolocation data only once when, in fact, it did so repeatedly.
There has always been some level of online privacy legislation in the works; mobile is just lagging behind. Thus, there have been policy proposals for a Do-Not-Track setting in common PC-based web browsers like Firefox and Internet Explorer, but not for mobile browsers or apps.
But this is hardly a slam-dunk, no-brainer of an issue. First of all, there are the familiar competing interests to consider between what is good for individual privacy versus what is good for innovation and macroeconomic growth. As some tech observers have argued, these types of federal privacy guidelines are usually unenforceable, written by people outside the industry, and place a huge legal burden on small developers.
Second, if you actually read the text of the APPS Rights Act, you'll notice that, when it gets down to it, the noble goals of "Transparency, User Control, and Security" are all addressed by a set of guidelines that will most likely just be incorporated into the legalese of companies' Terms of Service agreements (which nobody ever reads). Virtually no common practices by mobile developers have actually been prohibited.
That said, I would argue that such federal guidelines still serve an important and positive role. Whether or not they develop into meaningful regulations with the force of law isn't necessarily the point. Such guidelines act as policy catalysts.
For instance, while it's been introduced several times in the past decade, a Do-Not-Track bill has never actually been passed by Congress. Instead, companies like Microsoft, Apple, and the non-profit Mozilla Foundation have chosen to voluntarily incorporate Do-Not-Track features into their software - either because they saw a legitimate market demand for it, or because they sought to preempt government regulation. Either way, they've done it. Additionally, the World Wide Web Consortium (W3C) - the leading standards-setting institution for the Web - is currently in the process of developing a Do-Not-Track technical standard, the DNT header field, to build the feature into the code itself.
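For readers curious about the mechanics, the DNT mechanism the W3C is standardizing is quite simple: a participating browser sends an HTTP request header, `DNT: 1`, and a cooperating server checks it before setting tracking cookies or logging analytics. A minimal sketch of the server-side check (the plain-dictionary `headers` lookup is illustrative, not tied to any particular web framework):

```python
# Sketch of honoring the proposed W3C DNT ("Do Not Track") header.
# Per the draft standard: "1" means the user opts out of tracking,
# "0" means the user consents, and an absent header means no
# preference was expressed.

def tracking_allowed(headers):
    """Return False when the client's DNT header opts out of tracking."""
    dnt = headers.get("DNT")
    if dnt == "1":
        return False  # user opted out: skip analytics / ad cookies
    return True       # "0" or no header: no opt-out expressed

print(tracking_allowed({"DNT": "1"}))  # -> False
print(tracking_allowed({}))            # -> True
```

The voluntary nature of the scheme is visible right in the code: nothing forces a server to call a check like this at all, which is exactly why guidelines and standards, rather than statute, are currently doing the work.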
What this demonstrates is that, while federal online privacy guidelines might truly be unenforceable, they nevertheless often serve a vital purpose in guiding the direction of policymaking in the private sector. And even if the F.T.C.'s new recommendations and the APPS Rights Act ultimately go nowhere as law, their influence on industry practice will still matter.