The future of safe technology

Three panelists discuss the implications of technology in the D.B. Clarke Theatre. Photo by Savanna Craig.

Online users should take precautions with the information they share

In our electronically dependent society, the majority of people carry a smartphone in their hand. This has created a greater opportunity for some to intertwine their lives with social media and self-tracking apps, such as Facebook or Fitbit. The production and distribution of these new technological advances make the near future a bit of a mystery. Can we ensure the private information we share with apps and social media accounts will remain secure?

On March 1, two journalists and an assistant professor from the Concordia Institute for Information Systems Engineering took to the stage of the D.B. Clarke Theatre to answer that question. The talk, “Connecting to Your Tech Future: A Conversation About What’s Next,” was the third discussion in the Thinking Out Loud series, presented by Concordia and moderated by The Globe and Mail’s Jared Bland.

Graham Carr, vice-president of Research and Graduate Studies at Concordia University, opened the conversation. “As individuals, we’re poised to become walking, talking, big data sets,” Carr said. The talk was then taken over by two panelists: journalist Nora Young, host of CBC Radio’s technology talk show Spark and author of The Virtual Self: How Our Digital Lives Are Altering the World Around Us, and Jeremy Clark, an assistant professor at the Concordia Institute for Information Systems Engineering.

Graphic by Florence Yee.

Young described the main difference between present-day society and the past as the fact that technology is now everywhere around us. “Data and information now is ubiquitous,” Young said.

Take ads that track your browser history and personalize advertisements for you. Young and Clark explained this as the result of algorithms monitored by computer programs.

“What we’re seeing is shaped so much by algorithms,” said Young. “We don’t know what the biases are, baked into the algorithms. We don’t know [how] it is determining what we see in our newsfeeds.”

The Concordian talked to Young prior to the event to discuss how algorithms can determine what advertising makes it on our screens. “So, in the Facebook newsfeeds case, an algorithm is figuring out what news stories and posts to show you, based on a number of factors,” said Young. She mentioned that it is not known what the factors are, but it is assumed that they are stories you have read in the past, shared or commented on. “As far as ads go, sometimes the ads you see are just following you around online based on what you searched for last,” she said.

The ads you frequently notice after searching for a product, such as a book on Amazon, are one example. The longer you are online, the more you are being tracked by algorithms, she said.

Young also mentioned that users of self-tracking apps should take precautions, as these apps may record potentially sensitive information, such as details about your health. “That might include personally identifying information, so it makes sense to consider what kind of data trails we’re leaving,” Young said. “You might not care about that data being used for the purposes of targeted ads, but you might care about it being used for other reasons.”

According to an article from Lifehacker, a weblog that discusses technology and personal productivity, the health apps we use may be selling the data we submit to private companies. As Lifehacker points out, the buyers include advertising companies. Ad Age, a magazine about marketing and media, reported that a 2014 Federal Trade Commission (FTC) study of 12 health and fitness apps found that the apps had distributed information submitted by users to 76 other parties. Ad Age wrote that “the FTC did not reveal which apps or wearable devices it analyzed in its study; however it said it analyzed data sharing by free apps for pregnancy, smoking cessation and exercise.” Jah-Jiun Ho, an attorney from the FTC, told Ad Age that of the 12 apps studied, four sent data to one company specifically. Ho also said the shared data included the names and email addresses of the apps’ users. Ad Age claimed that, because of this, companies could connect customer data from each app to uncover more personal information.

The Concordian asked Young what discretion we should exercise while online. “In many ways, the problem is a lack of plain language transparency on the part of these companies as to what exactly is being done with the data,” said Young. “It can therefore be difficult to know what’s being collected, and therefore what precautions you ought to take.” She said the best way to avoid this is simply not signing up for social media, though Young called that an “imperfect” way of keeping your personal information from being sold.

What other forms of security should we be worried about? Clark said that, currently, online security is terrible. “There was a search engine that someone created where you could look at all sorts of cameras of people online or people who, in their homes, have nanny cams set up pointed at their crib,” said Clark.

Clark said this is not hacking, as people have set up these cameras to be publicly accessible by a URL. While you have to actively search for sites like these, they are easily available online. However, he noted that there were also security problems in the early days of computers and networks; security has since improved tremendously, and people still use networks today. “The sad security state of the internet … is sort of a phase,” Clark said. “[People will] get better and we’ll stop worrying about those early concerns, then we’ll turn to the longer-term concerns, which are the privacy implications of these things.”

The Concordian also asked Clark what precautions social media and apps users should take, to which he said we should question these apps and what types of data they are collecting. If they are collecting data, we should ask if this data will be given or sold to other companies. “The answers might be buried in terms of service or a privacy policy, but it can be hard to find it and parse out exactly what the legalese means,” said Clark. “Ultimately we all take calculated risks. It is good to assume the worst and decide if the utility we get from tracking is worth it.”

The trouble may lie in our willingness to accept the safety of apps and social media sites, as Young suggested. She described our ease of acceptance as a cultural issue. “We have a lot of blind faith in data and we have a lot of blind faith that algorithms are neutral,” Young said.
