Our smartphones are increasingly exposing us
Australia, December 29, 2017
Over the course of 2017, people in the US and around the world became increasingly concerned about how their digital data are transmitted, stored and analysed.
As news broke that every Yahoo email account had been compromised, as well as the financial information of nearly every adult in the US, the true scale of how much data private companies have about people became clearer than ever.
This, of course, brings them enormous profits, but comes with significant social and individual risks.
Many scholars are researching aspects of this issue, both describing the problem in greater detail and identifying ways people can reclaim power over the data their lives and online activity generate.
Here we spotlight seven examples from our 2017 archives.
The Government View
The government does not think much of user privacy: One major concern people have about digital privacy is how much access the police might have to their online information, like what websites people visit and what their emails and text messages say. Mobile phones can be particularly revealing, not only containing large amounts of private information but also tracking their users’ locations.
As HV Jagadish at the University of Michigan writes, the government does not think that smartphones’ locations are private information.
The legal logic defies common sense:
By carrying a cellphone – which communicates on its own with the phone company – you have effectively told the phone company where you are.
Therefore, your location is not private, and the police can get that information from the cellphone company without a warrant, and without even telling you that they are tracking you.
Private data reported
Neither do software designers: Mobile phone companies and the government aren’t the only ones with access to data on people’s smartphones. Mobile apps of all kinds can monitor location, user activity and data stored on their users’ phones.
As an international group of telecommunications security scholars found, “More than 70% of smartphone apps are reporting personal data to third-party tracking companies like Google Analytics, the Facebook Graph API or Crashlytics.”
Those companies can even merge information from different apps – one that tracks a user’s location and another that tracks, say, time spent playing a game or money spent through a digital wallet – to develop extremely detailed profiles of individual users.
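That merging is straightforward in practice. The sketch below is purely illustrative, with invented data and field names (no real tracker's API is shown): two apps that know nothing about each other both report events keyed on the same advertising identifier, and the tracking company joins them into one profile.

```python
# Illustrative sketch with hypothetical data: how a third-party tracker
# could merge records from two unrelated apps that report the same
# advertising ID. Field names and values are assumptions for illustration.
from collections import defaultdict

# Events a location-tracking app might report
location_events = [
    {"ad_id": "user-123", "lat": 42.28, "lon": -83.74, "time": "2017-06-01T09:00"},
]

# Events a mobile game might report to the same tracker
game_events = [
    {"ad_id": "user-123", "minutes_played": 45, "purchase_usd": 4.99},
]

def build_profiles(*event_streams):
    """Merge events from separate apps, keyed on the shared advertising ID."""
    profiles = defaultdict(dict)
    for stream in event_streams:
        for event in stream:
            ad_id = event["ad_id"]
            # Fold every field except the key into one combined profile
            profiles[ad_id].update({k: v for k, v in event.items() if k != "ad_id"})
    return dict(profiles)

profiles = build_profiles(location_events, game_events)
print(profiles["user-123"])
```

The join requires nothing more than a shared identifier, which is why a single advertising ID reused across apps is enough to link otherwise separate slices of a person's life.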
People care, but struggle to find information: Despite how concerned people are, they cannot easily find out what is being shared about them, when, or with whom. Florian Schaub at the University of Michigan explains the conflicting purposes of apps’ and websites’ privacy policies:
That can leave consumers without the information they need to make informed choices.
Another problem with privacy policies is that they are incomprehensible. Anyone who does try to read and understand them will be quickly frustrated by the legalese and awkward language.
Karuna Pande Joshi and Tim Finin from the University of Maryland, Baltimore County suggest that artificial intelligence could help:
“What if a computerised assistant could digest all that legal jargon in a few seconds and highlight key points? Perhaps a user could even tell the automated assistant to pay particular attention to certain issues, like when an email address is shared, or whether search engines can index personal posts.”
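A toy version of that idea can be sketched with simple keyword matching. This is not Joshi and Finin's actual system, which uses far more sophisticated language analysis; the watch terms and policy text below are invented for illustration.

```python
# Toy sketch of an automated privacy-policy assistant: scan a policy for
# user-specified issues and surface the sentences that mention them.
# The watch terms and sample policy are assumptions, not a real system.
import re

WATCH_TERMS = {
    "email sharing": ["email address", "e-mail"],
    "search indexing": ["search engine", "index"],
}

def highlight(policy_text, watch_terms=WATCH_TERMS):
    """Return, per issue, the policy sentences that mention its keywords."""
    findings = {}
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    for issue, keywords in watch_terms.items():
        hits = [s for s in sentences
                if any(k in s.lower() for k in keywords)]
        if hits:
            findings[issue] = hits
    return findings

policy = ("We may share your email address with partners. "
          "Search engines may index public posts.")
for issue, hits in highlight(policy).items():
    print(issue, "->", hits)
```

Even this crude filter shows the shape of the idea: the user names the issues they care about once, and the assistant reads the legalese so they don't have to.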
Programmers can help
Protecting users’ privacy in software is often assumed to be inherently error-prone. Jean Yang at Carnegie Mellon University is working to change that assumption.
At the moment, she explains, computer programmers have to keep track of users’ choices about privacy protections throughout all the various programs a site uses to operate. That makes errors both likely and hard to track down.
Yang’s approach, called ‘Policy-Agnostic Programming,’ builds sharing restrictions right into the software design process. That forces developers to address privacy, and makes it easier for them to do so.
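The shift can be illustrated with a toy: instead of every code path re-checking who may see a value, the value carries its own sharing policy and enforces it at every display point. This is a simplified sketch of the general idea, not Yang's actual system; the class and names are invented.

```python
# Toy illustration (not Jean Yang's actual implementation): a value carries
# its own sharing policy, so no individual code path can forget the check.
class Protected:
    def __init__(self, value, policy):
        self.value = value
        self.policy = policy  # function: viewer -> bool

    def show(self, viewer):
        """Every output path goes through this one policy check."""
        return self.value if self.policy(viewer) else "<hidden>"

# A user's location, visible only to viewers the policy approves
location = Protected("Ann Arbor, MI", policy=lambda viewer: viewer == "friend")

print(location.show("friend"))    # Ann Arbor, MI
print(location.show("stranger"))  # <hidden>
```

The design point is centralization: with the restriction attached to the data rather than scattered across the program, there is one place to get the privacy logic right instead of many places to get it wrong.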
But it may not be enough to rely on individual software developers choosing programming tools that protect their users’ data.
Scott Shackelford from Indiana University discussed the movement to declare cybersecurity – including data privacy – a human right recognized under international law. He predicts real progress will result from consumer demand:
“As people use online services more in their daily lives, their expectations of digital privacy and freedom of expression will lead them to demand better protections. Governments will respond by building on the foundations of existing international law, formally extending into cyberspace the human rights to privacy, freedom of expression and improved economic well-being.”
But governments can be slow to act, leaving people to protect themselves in the meantime.
The real basis of all privacy is strong encryption.
The fundamental way to protect privacy is to make sure data is stored so securely that only the people authorized to access it are able to read it.
Susan Landau at Tufts University explains the importance of individuals having access to strong encryption. She also observes that police and the intelligence community are coming around to this view:
“Increasingly, a number of former senior law enforcement and national security officials have come out strongly in support of end-to-end encryption and strong device protection …, which can protect against hacking and other data theft incidents.”
One day, perhaps, governments and businesses will have the same concerns about individuals’ privacy as people themselves do. Until then, strong encryption without special access for law enforcement or other authorities will remain the only reliable guardian of privacy.
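The principle behind that guardianship can be shown with the simplest cipher that achieves it, the one-time pad. This is a toy for illustration only, not production cryptography (real systems use vetted libraries and protocols), but it makes the core point concrete: without the key, the ciphertext reveals nothing about the message.

```python
# Toy one-time pad, for illustration only -- not production crypto.
# XOR-ing a message with a truly random, same-length, never-reused key
# yields ciphertext that is unreadable without that key.
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message), "a one-time pad key must match the message length"
    return bytes(m ^ k for m, k in zip(message, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to sender and receiver

ciphertext = encrypt(message, key)
recovered = decrypt(ciphertext, key)

print(recovered)  # b'meet at noon'
```

Only someone holding the key can turn the ciphertext back into the message, which is exactly the property Landau describes: data stored this way stays private even if it is stolen.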
Jeff Inglis is Science + Technology Editor, The Conversation, USA. The above article, which appeared in ‘The Conversation’, has been reproduced here under a Creative Commons licence.
Photo Caption: Jeff Inglis (Courtesy: Twitter.com)