Algorithm editors and what they mean

What would journalism be without editors? Well, in my opinion, it would be pretty chaotic.

Editors are the backbone of journalism. Take them out of the equation and you set loose a tsunami of fake news and badly written, poorly researched stories: in short, total amateurism.

But, what do editors actually do?

According to Amelia Pisapia, journalist and former editorial director of Novel, editors are talented problem solvers who excel at putting information in context, assessing the accuracy of data and weeding out bias.

“They view issues from multiple angles, connect the dots and uncover human stories in complex systems,” writes Pisapia.

Pisapia adds that editors work within established ethical frameworks. She says that all editors have five values in common: accuracy, independence, impartiality, humanity and accountability.

However, in recent years editors have started to quite literally lose some of their humanity. With developments in technology and artificial intelligence, more and more media and news distributing platforms have started to use algorithms as editors instead of actual humans.

A good example is the algorithm behind Facebook’s news feed. Tobias Rose-Stockwell, a strategist, designer and journalist, wrote in his article for Quartz: “[Facebook’s algorithm] shows you stories, tracks your responses, and filters out the ones that you are least likely to respond to. It is mapping your brain, seeking patterns of engagement.”
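In spirit, the loop Rose-Stockwell describes is an engagement ranker: score each candidate story by how likely you are to respond to it, then surface only the top scorers. A toy sketch of that idea (the function, the stand-in “model” and all the data here are illustrative assumptions, not Facebook’s actual system):

```python
# Toy illustration of engagement-based feed ranking: score each story
# by predicted engagement and drop the least-engaging ones.
# Entirely illustrative; not Facebook's actual system.

def rank_feed(stories, engagement_model, keep=3):
    """Sort stories by predicted engagement and keep only the top ones."""
    return sorted(stories, key=engagement_model, reverse=True)[:keep]

# A stand-in "model": past click counts by topic act as predicted engagement.
past_clicks = {"politics": 9, "sports": 2, "local": 5, "science": 1}
stories = ["sports", "politics", "science", "local"]
print(rank_feed(stories, lambda topic: past_clicks[topic], keep=2))
# -> ['politics', 'local']
```

The filtering step is the crux: stories you rarely engage with simply never make the cut, which is how a feed that “maps your brain” ends up narrowing what you see.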

Sounds great, doesn’t it? Only quality news you are interested in, delivered right to your doorstep without you having to move a muscle.

Well, if it sounds too good to be true, that’s because it is. Algorithms are very far from the perfect editors we hope they would be. They have massive flaws and are, in fact, dangerous.

Don’t misunderstand me: algorithm editors have their good sides. They even surpass humans on some points, their consistency as editors, for example.

In his article, “Can an Algorithm be an Editor?,” José Moreno, former multimedia director at Motorpress Lisboa explains that an algorithm has the silver lining of always acting the same way.

“Human editors always act differently on the basis of a common code,” Moreno says. “In a way, there is more accuracy and reliability in a ‘system’ that always performs a function in the same way than in a ‘system’ that always performs differently.”

So, yes, algorithms have some upsides; Professor Pablo Boczkowski from Northwestern University even called Facebook’s algorithm “the greatest editor in the history of humanity.”

But unfortunately, despite these virtues, the positive aspects that algorithms present are heavily outweighed by their negative counterparts.

The study The Editor vs. the Algorithm: Targeting, Data and Externalities in Online News, conducted by professors from several universities, compared AI and human editors. The researchers discovered an alarming number of problems with algorithmic editors. For example, algorithms tend to serve readers a less diverse mix of news, creating a “bubble” effect as readers are presented with a narrower set of topics. One example from the study concerned readers in German states with a high share of votes for extreme political parties: in the last election, those readers were more likely to increase their consumption of political stories when the stories were selected by algorithms.

Another flaw of algorithms is their lack of social awareness; every calculation they make is based on individual-level data. Algorithms don’t take into account “socially optimal reading behaviour,” according to the study.

“It doesn’t differentiate between factual information and things that merely look like facts,” said Rose-Stockwell, referring to the Facebook example above. “It doesn’t identify content that is profoundly biased, or stories that are designed to propagate fear, mistrust, or outrage.”

The worst part of all this is that algorithms have even started to change the way some human editors think, as well as the behavior of some news organizations. We have entered a traffic-at-all-costs mentality: news outlets are now driven by numbers, clicks and views rather than by journalistic values.

Despite all their flaws, regrettably, algorithm editors are still here and due to humans’ lust for technology and artificial intelligence, they are probably going to stay and even multiply.

But why should algorithm editors be pitted against human editors? Why should it be human versus machine?

The solution is easy: use a mix of both. The researchers from the study mentioned above concluded that “the optimal strategy for a news outlet seems to be to employ a combination of the algorithm and the human to maximize user engagement.”

In the digital age, machines will continue to take over more and more aspects of life. But humans are more relevant than ever, because these machines aren’t always optimal. In the end, a symbiosis between humans and machines is actually a comforting thought: the promise of a better tomorrow, where machines help humans rather than supplant them.

Graphic by @sundaeghost

Student Life

Four Montreal students take first place at HackHarvard


“HackHarvard was maybe my 10th hackathon,” said Nicolas MacBeth, a first-year software engineering student at Concordia. He and his friend Alex Shevchenko, also a first-year software engineering student, have decided to make a name for themselves and frequent as many hackathon competitions as they can. The pair have already participated in many hackathons over the last year, both together and separately. “I just went to one last weekend [called] BlocHacks, and I was a finalist at that,” said MacBeth.

Most notable of the pair’s achievements, together with their teammates Jay Abi-Saad and Ajay Patal, two students from McGill, is their team’s first-place ranking as ‘overall best’ at the HackHarvard Global 2018 competition on Oct. 19. According to MacBeth, while all hackathons are international competitions, “HackHarvard was probably the one that had the most people from different places than the United States.” The competition is sponsored by some of the largest transnational conglomerates in the tech industry, including Alibaba Cloud, a subsidiary of Alibaba Group, a multinational conglomerate specializing in e-commerce, retail and artificial intelligence (AI) technology, and Zhejiang Lab, a Zhejiang provincial government-sponsored institute whose research focuses on big data and cloud computing.

MacBeth said he and Shevchenko sifted through events on the ‘North American Hackathons’ section of the Major League Hacking (MLH) website, the official student hacking league that supports over 200 competitions around the world, according to their website. “We’ve gone to a couple hackathons, me and Alex together,” said MacBeth. “And we told ourselves ‘Why not? Let’s apply. [HackHarvard] is one of the biggest hackathons.’ […] So we applied for all the ones in the US. We both got into HackHarvard, and so we went.”

Essentially, MacBeth, Shevchenko, Abi-Saad, and Patal spent 36 hours conceptualizing, designing, and coding their program called sober.AI. The web application uses AI in tandem with visual data input to “increase accuracy and accessibility, and to reduce bias and cost of a normal field sobriety test,” according to the program’s description on Devpost. “I read a statistic somewhere that only a certain amount of police officers have been trained to be able to detect people [under the influence],” said MacBeth. “Drunk, they can test because they have [breathalyzers], but high, it’s kind of hard for people to test.”

MacBeth explained that the user-friendly web application could be helpful in a range of situations: convincing an inebriated friend not to drive, letting law enforcement officials conduct roadside testing in a way that reduces bias, or allowing employees who may have to prove sobriety for work to do so non-invasively.

Sober.AI estimates an overall percentage of sobriety through a series of tests relayed via visual data, either a photo of an individual’s face or a video of the individual performing a task, which is fed into two neural networks designed by the team of students.

“We wanted to recreate a field sobriety test in a way that would be as accurate as how police officers do it,” said MacBeth.

The first stage is an eye exam, where a picture of an individual is fed to the first neural network, which gives an estimation of sobriety based on the droopiness of the eye, any glassy haze, redness, and whether the pupils are dilated. The second stage is a dexterity test where individuals have to touch their finger to their nose, and the third is a balance test where people have to stand on one leg. “At the end, we compile the results and [sober.AI] gives a percentage of how inebriated we think the person is,” said MacBeth.
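As described, each stage yields its own score, and sober.AI compiles them into a single percentage. A minimal sketch of what such a compilation step could look like (the stage names mirror the tests in the article, but the weights and the weighted-average scheme are invented for illustration and are not the team’s actual code):

```python
# Illustrative only: combining per-stage scores into an overall estimate.
# Stage names follow the tests described above; the weights and the
# averaging scheme are assumptions for demonstration.

STAGE_WEIGHTS = {
    "eye_exam": 0.5,    # neural-network score from the face photo
    "dexterity": 0.25,  # finger-to-nose test
    "balance": 0.25,    # one-leg balance test
}

def combine_scores(stage_scores):
    """Turn per-stage scores (0.0 = sober, 1.0 = inebriated) into an
    overall inebriation percentage from 0 to 100."""
    total = sum(STAGE_WEIGHTS[stage] * score
                for stage, score in stage_scores.items())
    return round(100 * total, 1)

print(combine_scores({"eye_exam": 0.8, "dexterity": 0.4, "balance": 0.6}))
# -> 65.0
```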

“Basically, what you want to do with AI is recreate how a human would think,” explained MacBeth. AI programs become increasingly more accurate and efficient as more referential data is inputted into the neural networks. “The hardest part was probably finding data,” explained MacBeth. “Because writing on the internet ‘pictures of people high’ or ‘red eyes’ and stuff like that is kind of a pain.” MacBeth said that he took to his social media pages to crowdsource photos of his friends and acquaintances who were high, which provided some more data. However, MacBeth said his team made a name for themselves at the hackathon when they started going from group to group, asking their competitors to stand on one leg, as if they were sober, then again after spinning around in a circle ten times. “That was how we made our data,” said MacBeth. “It was long and hard.”

Participating in such a prestigious competition and having sober.AI win ‘overall best’ left MacBeth and Shevchenko thirsty for more. “HackHarvard had a lot more weight to it. We were on the international level, and just having the chance of being accepted into HackHarvard within the six or seven hundred students in all of North America that were accepted, I felt like we actually needed to give it our all and try to win—to represent Concordia, to represent Montreal.”

MacBeth and Shevchenko have gone their separate ways in terms of competitions for the time being; however, the pair’s collaborations are far from over. Both are planning to compete separately in ConUHacks IV at the end of January 2019, where MacBeth explained that they will team up with other software engineering students who have yet to compete in hackathons. “We’re gonna try to groom other people into becoming very good teammates,” said MacBeth.

The first-year software engineer concluded with some advice for fellow Concordia students. “For those in software engineering and even computer science: just go to hackathons,” advised MacBeth. “Even if you’re skilled, not skilled, want to learn, anything, you’re going to learn in those 24 hours, because you’re either gonna be with someone who knows, or you’re gonna learn on your own. Those are the skills you will use in the real world to bring any project to life.”

Feature photo courtesy of Nicolas MacBeth


The dark side of social media platforms

The recent Facebook scandal highlights the ways our privacy doesn’t exist online

The hashtag #DeleteFacebook was trending on social media last week, raising awareness of how much private information the platform knows about its users. It began when the Observer reported that the private data of more than 50 million Facebook users was obtained by Cambridge Analytica, a British political consulting firm. The data was used during the 2016 American presidential election to profile voters, predict their behaviours and target them with personalized political advertisements. Similar tactics were used for the Leave campaign leading up to the Brexit vote, reported the Observer.

According to Global News, Cambridge Analytica worked for U.S. senator Ted Cruz’s campaign as well as Donald Trump’s campaign. Christopher Wylie, the whistleblower, told the Observer the firm acquired data and used a software system to target specific Facebook users’ “inner demons.” On top of that, Global News reported that Facebook has since stated there is proof Cambridge Analytica hasn’t deleted the data used during those political campaigns, which is problematic. Why are they still holding on to that information?

It’s not surprising that Facebook’s response is trying to draw attention away from the platform itself. Nor is it surprising that Facebook was involved in this type of scandal to begin with. As Facebook reiterates its commitment to privacy, users need to be smart and stop burying their heads in the sand. Every day, I witness users sharing their most personal thoughts and details about their lives on Facebook. People need to realize just how accessible Facebook is to strangers, and how privacy settings only do so much in the age of big data. How can users be upset about this situation if they are basically an open book on Facebook?

Ever since I first had access to the internet, my parents have told me to be careful about what I post on social media. We live in a technological era where it’s easy to go on a computer and find someone’s personal information. Users must always be aware of the dark side of social media platforms like Facebook.

Facebook is a double-edged sword. It allows people to connect with whomever they like, but it also makes their personal life publicly available. It’s hard to ignore that social media, specifically Facebook, has a creepy reputation of knowing its users’ activities.

I believe Facebook needs to strengthen its privacy settings to gain back the trust of its users. Third parties like Cambridge Analytica should not be able to obtain data—especially without the users’ permission or knowledge. Nonetheless, private information will always be more easily accessible on the internet, and I believe society will have to deal with these kinds of problems more frequently. Everything about a person’s life can be found on the internet, which has become an extension of the individual.

People’s entire lives are plastered across the online world. But even knowing the dark side of social media, I will not delete Facebook. I am aware of the privacy risks, but to me, the perks surpass the downsides. Of course, I always think twice before posting or liking anything on Facebook, and I encourage everyone to do the same.

I think the hashtag #DeleteFacebook is legitimate for everyone who felt betrayed by the platform. I understand why these users are angry. People’s Facebook profile data had been stolen to fuel political agendas without their permission, which is just plain wrong.

Even though Facebook is to blame for lacking safeguards to protect user data, these users have a duty to be informed about what happens when they publish information online and agree to use a website. It’s important to not blindly trust Facebook or other social media platforms. I take it upon myself to make smart decisions and be critical about how I interact with my Facebook feed—everyone should do the same.

Graphic by Zeze Le Lin


Hacking your way to a more democratic society

On Saturday, several dozen people and laptops crammed a room near Montreal’s Old Port to pore over numbers and fine-tune website designs. Fueled by coffee and sandwiches, they collectively turned endless spools of data into helpful projects for the average Montrealer to use.
To Jonathan Brun, this room full of humming laptops and people quietly working is a sign that Montreal has finally become a leader in open data, and a “testament” to the enthusiasm of a community looking to improve the city.
Brun is one of the four founders of Montréal Ouvert, a group of business types and open data advocates that has been lobbying the City of Montreal to release a trove of information to the public in open data formats.
Everywhere, coders and designers have been culling, or “scraping,” data from government databases in order to create a variety of websites and apps for people to use. But until a few months ago, Montreal was behind the times.
“It got started because the four of us [were] sort of passionate about open government and making government more democratic, more accessible to people. And the open data movement has sort of taken off in a lot of Canadian cities, but was not happening in Montreal,” said Brun.
After about a year of sophisticated lobbying, which involved meeting with elected officials, targeting “key” open data fans at City Hall, and holding two hackathons and three public meetings, a formal report on the city’s proposed open data policy was finally presented last September.
“Montreal is putting together a ‘table de concertation’ – a group of people that think about open data,” said Brun, referencing a project unique to Montreal.
While the open data portal, which is mostly in French, is not yet perfect, the city also provided a venue for Saturday’s Hackathon, and sponsored it, along with other groups like The Montreal Gazette and the local OpenFile outlet.
Students made up a modest number of the Hackathon participants, like Concordia undergraduates Tavish Armstrong and Natalie Black, who study software engineering and computer science, respectively.
On Saturday, both were working to make an ongoing website project more user-friendly. It uses open data to track restaurant health violations across Montreal. The top offender: a Chinese restaurant near Concordia’s downtown campus that has racked up over $33,000 in fines.
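Projects like the violations tracker start from exactly this kind of raw material: a tabular file from an open data portal turned into records an app can display. A toy sketch of that step (the column names and rows below are invented for illustration, not the city’s actual data):

```python
import csv
import io

# Illustrative only: parsing an open-data CSV (invented columns and rows)
# into structured records, then tallying the fines.
raw = """establishment,violation,fine
Cafe A,temperature control,500
Resto B,sanitation,1200
"""

def parse_open_data(text):
    """Read a CSV string into a list of dicts keyed by column name."""
    return list(csv.DictReader(io.StringIO(text)))

records = parse_open_data(raw)
total_fines = sum(int(row["fine"]) for row in records)
print(total_fines)  # -> 1700
```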
By the end of the day’s marathon session, four students from Université Laval in Quebec City had the chance to explain their project to the group. Though still a work in progress, it is a search engine that aggregates public library catalogues. A quick search for a book will tell you which libraries have it in stock, as well as the price on Amazon, and whether it’s available to read on Google Books.
Jacques-Olivier Desjardins, a Laval computer engineering student, said his team did it for the fun of it, but also for the networking and to get their name out in the business.
Montreal has taken its own path to get to this point, said Mercier, Open Montreal’s city hall ‘champion’, a small, smiling woman with grey hair. She was hard-pressed to name her favourite project. “I’m like a mother who has many children, I don’t have favourites,” she said. “I find it extraordinary. What happened today is the concrete application of what open data is. You have city employees with citizens, and they’re working to make a project with data that belongs to everybody. It doesn’t get better than that.”
