News

Can data aggregation help you understand controversial topics?

Researchers at the École Polytechnique Fédérale de Lausanne are investigating whether you can use a search platform to understand controversial topics. Angelique Lu speaks with Claudiu Cristian Musat about his research.

by WAN-IFRA Staff executivenews@wan-ifra.org | May 7, 2015

Using developments in Artificial Intelligence, Musat and his colleagues are developing the Perspectives platform, a ‘search engine’ which extracts positive and negative opinions about a controversial topic, and presents them to the individual, allowing them to arrive at their own conclusions. “We’re showing the reader how many articles were positive and how many were negative,” Musat said. “We look at the controversy, see the ratio of positive to negative articles and when they go in, they can choose to read the positive or the negative side.”

Musat, an associate researcher in the Artificial Intelligence Lab at the École Polytechnique Fédérale de Lausanne, came across the idea for Perspectives during the 2007 financial crisis. “When, in 2007, 2008, things started crashing all around me and I was in the stock market, I had no idea what was going on,” Musat explained. “I said to myself – while I was losing money – that I was reading the wrong things. It turned out there was a huge volume of information out there already; if I had read it before, it would have helped me not lose money.”

“At that time I didn’t know what I wanted to do,” he explained. “I had just finished my Bachelor’s degree (in computer science) and was already working in the industry, thinking of a career in investing, and then 2007 happened, which made me rethink everything.”

“That was a powerful motivator for me to go into academia, actually,” he said. Musat went on to investigate media mining, and Perspectives is the platform he developed after seven years of work and research.

Using pairs of opposing emotive words, such as good and bad or excellent and terrible, the platform attempts to gauge an article’s slant. “These words have an intrinsic polarity,” Musat said.

The next step the Perspectives platform takes is extracting the ideas found in these articles. “Using this algorithm you get ideas rather than keywords. It can be a collection of keywords that have the same meaning in that specific context.”

“On one hand you have words like good and bad, low and high and so on, and on the other you have what are usually nouns. Like NSA, Snowden, surveillance and so on.”

The platform then analyses phrase structures, assessing both their semantics and their syntax. Articles that mention nouns from controversial topics, like surveillance, are matched with the opinion words that accompany them. Using this methodology, Musat said, they are able to arrive at a relatively accurate gauge of the positive and negative opinions on a topic.
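The lexicon-based idea Musat describes can be sketched roughly as follows. This is a minimal illustration, not the Perspectives code: the word lists are invented, and the real system pairs opinion words with topic nouns through phrase analysis, whereas this sketch simply counts polarity words across the whole article.

```python
import re

# Illustrative (assumed) lexicons of intrinsically positive / negative words.
POSITIVE = {"good", "excellent", "beneficial", "safe"}
NEGATIVE = {"bad", "terrible", "harmful", "dangerous"}

def slant(article: str) -> float:
    """Score in [-1, 1]: +1 for all-positive wording, -1 for all-negative."""
    words = re.findall(r"[a-z]+", article.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(slant("Surveillance by the NSA is dangerous and bad, not good."))
# one positive word vs. two negative words -> a negative slant
```

Aggregating these per-article scores over a topic gives the positive-to-negative ratio the platform shows readers.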

“The current accuracy is somewhere between 85 and 90 percent,” he said. “As for the rest, these are exceptions – they are context-dependent.”

Objectivity and impartiality are the subject of an ongoing debate within the media. The methodology offered by the Perspectives platform assumes that objectivity – balancing between two opposing perspectives – is necessarily a good idea. Some media critics, like Jay Rosen, have questioned the “he said, she said” approach to journalism. “Journalists associate the middle with truth when there may be no reason to,” Rosen wrote on his blog.

How, then, do Musat and his research team deal with topics like global warming, where the majority of academics agree that it is occurring?

“Obviously source reputation is an important issue here,” Musat said. “You can’t have the New York Times and a blog with 10 users have the same weighting.”

“Right now, if it’s very negative it will be on top; if it’s very positive it will be on top,” Musat said. “The most negative and the most positive are the least reputable.”

“I think in the end we’re going to keep the separation between positive and negative, and within each camp give a higher weight to reputation in the opinion score.”
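The weighting Musat outlines might look something like the sketch below: keep the positive and negative camps separate, but within each camp rank articles by opinion strength scaled by source reputation. The reputation values, default weight, and data layout are all assumptions for illustration.

```python
# Sketch: within one camp (e.g. the negative articles), demote strongly
# opinionated pieces from low-reputation sources. Not the actual
# Perspectives ranking; weights and sources are made up.
def rank_camp(articles, reputation, default_rep=0.1):
    """articles: list of (source, opinion_score) pairs.
    Returns them sorted by reputation-weighted opinion strength."""
    return sorted(
        articles,
        key=lambda a: abs(a[1]) * reputation.get(a[0], default_rep),
        reverse=True,
    )

reputation = {"nytimes.com": 1.0, "small-blog.example": 0.2}
negative = [("small-blog.example", -0.95), ("nytimes.com", -0.6)]
print(rank_camp(negative, reputation))
```

Even though the blog's article is more negative, the weighting puts the higher-reputation source first – matching the observation that the most extreme articles tend to be the least reputable.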

“We’re focusing people on the subjects that would be difficult to understand,” Musat explained. “If you’re talking about the Israeli-Arab conflict, there are so many sides to this story it’s difficult to get your head around them. If you’re talking about Serena Williams’ latest win, that’s not really controversial,” he said. “We don’t want to get into factual reports, like the weather – this you can get from basically anywhere. What we want to focus on is the controversial side.”

New points of view

Users of the platform are able to read opposing opinions immediately. It’s this function, Musat explained, that allows users to overcome what researchers call ‘selective exposure’, where an individual only reads articles that they agree with. “We’re offering the option of seeing the opposite point of view at any time,” he said. “We have proven that people will take that option.”

Studies conducted by Musat and his colleagues show a willingness to read opposing opinions. “We did a user study: do people click the ‘show me the opposite’ button?” he said. “We’ve done that for English and for French, in economic articles and in health-related articles. We reduced the imbalance by 50 percent, which is a lot. For each topic, if you read four positive articles and one negative article, that’s quite imbalanced. In the control group we had this huge imbalance, but in our group this was reduced to half. This is huge, because there was this assumption that people will want to click and to see the opposite side – which is true, people will do that.”
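The arithmetic behind the example Musat gives can be made concrete. The metric below is an assumed definition (the fraction by which reading skews to one side), not necessarily the exact measure used in the study:

```python
# Sketch of a reading-imbalance metric: 0.0 means perfectly balanced
# reading, 1.0 means reading only one side. The definition is an
# illustrative assumption, not the study's published metric.
def imbalance(pos_read: int, neg_read: int) -> float:
    total = pos_read + neg_read
    return abs(pos_read - neg_read) / total

control = imbalance(4, 1)  # four positive articles, one negative
print(control)             # -> 0.6
print(control / 2)         # a 50% reduction, as reported for the test group
```

On this definition, the four-to-one reader scores 0.6, and halving the imbalance brings the group to 0.3 – equivalent to a much more even reading diet.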

Uses for media

The technology, Musat said, has major potential for journalism. He is currently in talks with major news publications, as the Perspectives platform is able to gauge how partial a publication is on a particular topic. “I can run their content through my system and tell them, okay, you’re balanced, okay, you’re not,” Musat said. “Or for a journalist who is writing a story, I can tell them by looking at the story, okay, you’re 80% negative towards this, and present our tool as a way to improve the story before it gets out.”

The technology can also allow readers to monitor the opinions or potential biases of a journalist. Musat said that this can be used to understand the connection between news coverage of politicians and a journalist’s own biases. “I want to see whether a journalist always agrees with a party, or to profile sensitive opinions towards sensitive subjects or sources,” he said. “I have a source, or a person at an individual level, and I want to see how that person’s opinions vary through time on a subject. What’s his normal stance on each of these controversies?

“If that person has written on politics, I want to see what are the controversies he covered and what’s his general opinion about each one of those. That’s a way to bring transparency. I want to know whether the person I’m reading right now is always of the same opinion, or does he shift. It’s a reputation tool.”

Businesses are also seeing the potential of the technology. Musat and his colleagues are working with Nestlé to help the company understand its public profile. The platform allows companies to save time and resources on data analysis. “This can be used to understand what others say about you,” he said. “If you use tools… those will give you a lot of information, but processing that information is not easy. You need an army of people to see: what does that mean? We have 1000 tweets about Nestlé – are they good or are they bad?”

A new social network 

Also under development is Topic Profile Collaborative Filtering (TPCF), an idea network aimed at connecting people with similar ideas. The difference from other social networks, Musat said, is the ability to see an individual’s thoughts on other controversial topics.

“We want to add a bit of depth to this online chatting experience,” Musat said. “We wanted to create deeper interactions.”

“So the connection between these two is that we can separate the positive and the negative aspects of something that is controversial, and we can let our readers decide which they are, or whether to be on a side or just accept arguments from both sides. This is how we believe communities are formed. That’s what we’re working on right now: to help people find like-minded individuals.”

Musat wanted to capitalise on the internet’s forum communities. “An estimate is that 550 million people are forum users. Those people are part of a community for whatever reason – a coin-collecting society, or they’re in a computer forum, and so on. People who search for like-minded individuals and discuss the things that they’re passionate about – those are the people that we’re targeting with this system.”

According to Musat, their platform emulates Reddit. “The main difference is that we bring structure,” he said. “I tell you exactly who is pro-guns, anti-abortion and so on.”

The platform, Musat said, also shares similarities with internet dating. “What are the ideas that define you? It leads to a different sort of interaction. In a sense we’re doing a sort of online dating, but for penpals.”

“This recommendation part is missing from other networks,” Musat said, “because you don’t have the data. Because of the way we structured it, for us this data is natural. I know exactly what you’re interested in, and I know who to recommend to you. The way we created the recommendation system is that I recommend people who are quite similar but have at least one difference. If I recommend people who are exactly the same as you are, you won’t gain much out of that interaction. I want you to interact with someone who is 80% similar.”
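The “80% similar, at least one difference” rule can be sketched by comparing users’ stances (say, +1 or -1) across the controversies they have both taken a position on. The data layout, the stance encoding, and the exact matching rule are assumptions for illustration, not the TPCF implementation:

```python
# Sketch of a stance-based recommender: prefer people near a target
# similarity (0.8) while excluding exact matches. All names and
# stance profiles below are invented examples.
def similarity(a: dict, b: dict) -> float:
    """Fraction of shared controversies on which two users agree."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(a[t] == b[t] for t in shared) / len(shared)

def recommend(user: dict, others: dict, target: float = 0.8):
    """Rank candidates by closeness to `target`, skipping perfect matches."""
    scored = [(name, similarity(user, stances)) for name, stances in others.items()]
    scored = [(name, s) for name, s in scored if s < 1.0]  # at least one difference
    return sorted(scored, key=lambda c: abs(c[1] - target))

me = {"guns": -1, "abortion": +1, "surveillance": -1, "gmo": +1, "nuclear": -1}
others = {
    "alice": {"guns": -1, "abortion": +1, "surveillance": -1, "gmo": +1, "nuclear": +1},
    "bob":   {"guns": -1, "abortion": +1, "surveillance": -1, "gmo": +1, "nuclear": -1},
}
print(recommend(me, others))  # "bob" is identical, so only "alice" remains
```

Here “bob” agrees on everything and is filtered out, while “alice” agrees on four of five topics – exactly the kind of mostly-similar-but-not-identical match described in the interview.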

Declaration: Musat’s research is part of WAN-IFRA’s Global Alliance for Media Innovation
