(DNA Top 10 in 2014) Censorship 2.0: Shadowy forces controlling online conversations: Page 2 of 4
Mailing list manipulation
Thinkst set out to investigate the phenomenon, determined to come up with good, scientific and repeatable tests.
“One of the big challenges is that everyone kind of suspects that this kind of thing is already happening, but it’s kind of hard to measure,” Haroon acknowledged. “How do we determine if we were successful in changing people’s opinions?”
Thinkst first looked into mailing lists – which may sound like a very Web 1.0 thing to do, given today’s tweeting and instagramming online communities, where even blogging sounds quaint.
But as pointed out by Thinkst researcher Azhar Desai, people still use mailing lists or get email from them.
“So we looked at whether we could get more people to read an email on the mailing list, or get fewer people to read it. To measure how many people read an email, we used link clicks – we put links in the email that you have to click to read,” he said.
The team first sent out a control email with a link, waited 48 hours, then counted the number of clicks. For the experimental email, it sent out an email with a link, then used sock puppets to send several replies to make the discussion thread longer (which makes people curious to find out what’s going on), waited 48 hours, then counted the clicks.
It found that more people read the experimental email (see below).
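The measurement the team describes can be sketched as a simple click-through comparison. Everything below (tokens, logs, recipient counts) is hypothetical illustration, not Thinkst’s actual tooling:

```python
def click_through(click_log, recipients):
    """Fraction of recipients who clicked their unique tracking link.

    click_log: recipient tokens seen in the web server log over 48 hours
    recipients: number of unique tracking links sent out
    """
    return len(set(click_log)) / recipients

# Hypothetical 48-hour logs for the two emails, 100 recipients each
control_log = ["u3", "u7", "u7", "u12"]                        # 3 unique readers
experiment_log = ["u1", "u4", "u5", "u9", "u9", "u11", "u15"]  # 6 unique readers

print(click_through(control_log, 100))     # 0.03
print(click_through(experiment_log, 100))  # 0.06
```

Counting unique tokens rather than raw hits means a reader who clicks twice is not counted twice, which is what makes the control-versus-experiment comparison meaningful.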

To test how mailing lists could be manipulated so that fewer people would read a particular item, the team took the same steps as above, but instead of using the sock puppets to send replies that lengthened the discussion thread, it used them to send several separate emails starting new threads to distract everyone.
And true enough, fewer people read the experimental email (see below).

One of the false email threads Thinkst planted, on a so-called vulnerability in the Mac OS X operating system, was actually picked up by a site dedicated to vulnerability reports.
“So you can use sock puppets to attract people’s attention to something you want them to read, or to distract them from something you don’t want them to read,” said Haroon. “It’s relatively simple, and all too intuitive.”
Thinkst believes that such sock puppetry can be used not only to draw people’s attention to some items and distract them from others, but also to discredit opponents by running a clumsy sock puppetry operation on their ‘behalf.’
Online polls
Online polls are a perennial favourite, easily gamed and very influential too, since major news sites use them.
“There is an ongoing tradition of gaming online polls, the most famous being TIME magazine’s ‘Most Influential’ poll of 2009,” said Azhar, referring to the online poll that ‘moot’ (aka Christopher Poole) – the founder of the discussion board 4chan – won.
“You can go for the obvious landslide, or more subtle wins,” he added.
Malaysians may remember the TIME poll because the second most influential person in the world was apparently local opposition leader Anwar Ibrahim, but the poll had actually been manipulated to spell out a cryptic message ‘Marblecake Also the Game,’ noted Fox News.
Closer to home, in 2011, a poll on The Star Online on whether a proposed ‘Bersih’ pro-democracy rally should be cancelled was also manipulated. It received more than one million responses in less than a day, compared with the average of about 30,000 responses over several days, which led The Star Online to realise the poll was being manipulated.
“Furthermore, the total number of unique visitors to The Star Online is about 400,000 per day, lending further credence to suspicions that there was manipulation,” the news portal reported.
“We found it hard to find an online poll that wasn’t seriously gameable,” Haroon said at HITBSecConf.
UGC: Twitter and Reddit
On user-generated content (UGC), Thinkst explored Twitter and Reddit. For the former, Haroon referred to a study conducted by data scientist Gilad Lotan, who bought followers and then analysed the effects of buying Twitter followers – a practice that even politicians (or their even less scrupulous ‘social media consultants’) are guilty of.
“One thing’s crystal clear – on social media it is easy to mistake popularity for credibility,” Lotan had said.
“One of the things he [Lotan] found was that ‘bought followers’ actually win you ‘organic followers,’ and that those real followers stay on even after your bought followers dropped off,” said Haroon.
“This is kind of intuitive, because if you use Twitter, you’d be more likely to follow someone with a high follower count than someone with a low count,” he added.
“Why does this matter? Because of the way we use Twitter – it’s not like an RSS feed. You don’t go in to catch up on all the day’s tweets, it’s a stream that you dip into.
“If I can convince a lot of people to follow me just by tweeting more, I get to dominate that person’s timeline, and essentially, what I get to do is crowd out other conversations. I can crowd out what I don’t want them to see,” Haroon said, referring to the practice of ‘timeline crowding.’
Meanwhile, Azhar said that Reddit is one of the best examples of UGC: It’s about people submitting links or stories, and the community voting on this content and policing it.
“You can use proxy accounts to ‘upvote’ articles you want to promote; or to ‘downvote’ articles you want to kill,” he said.
“Reddit expects people to try and game the system, so has many defences in place,” he added.
Still, Thinkst managed to breach those defences easily. It registered 50 proxy accounts and found that this was sufficient to upvote or downvote articles in subreddits – the various, more specialised sections on the platform.
Articles that are downvoted enough times are removed from the page for Reddit admins to investigate – but that act in itself is a victory, because, as noted, the article is knocked off the page.
“You can also try ‘trickle downvoting’ – only downvote as many times as new articles have upvotes. It keeps the score even, and while you do this, upvote the article you want people to see,” Azhar said.
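Azhar’s ‘trickle downvoting’ amounts to a simple balancing rule. A minimal sketch, with hypothetical article IDs and vote counts:

```python
def trickle_downvotes(net_upvotes, promote):
    """Cast just enough downvotes to cancel each rival article's organic
    upvotes (pinning its score near zero), while upvoting the one
    article the operator wants people to see.

    net_upvotes: dict article_id -> net upvotes gained since the last pass
    promote: the article_id to push instead
    Returns dict article_id -> votes to cast (+1 per upvote, -1 per downvote).
    """
    plan = {aid: -max(net, 0) for aid, net in net_upvotes.items() if aid != promote}
    plan[promote] = 1  # one upvote per sock-puppet pass for the promoted item
    return plan

# Rival articles gained 3, 1 and 0 net upvotes since the last sweep
plan = trickle_downvotes({"rival_a": 3, "rival_b": 1, "rival_c": 0}, promote="ours")
# plan == {"rival_a": -3, "rival_b": -1, "rival_c": 0, "ours": 1}
```

The point of matching downvotes to upvotes, rather than mass-downvoting, is that scores hover near zero and nothing looks obviously attacked.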
When Thinkst did some mass downvoting on the WorldNews subreddit, downvoting all new articles as they appeared, the admins suspected something was up, but all they did was to put up a notice telling their users not to panic.
When Thinkst tried the same type of manipulation on the more specialised NetSec subreddit – devoted to news and matters about cyber-security – “the moderators responded with intelligent discussions and roped in official reddit admins to talk about the problem.”
“But even then, they didn’t seem to have a handle on what was really going on. We had 50 bots operating, but they estimated only 20,” said Azhar.
“This shouldn’t have been so hard to spot at all,” added Haroon. “You have a bunch of users all voting on the same article, and you can isolate and detect things like voting in sync; signup times that are similar; common email domains; patterns in usernames; IP (Internet Protocol) addresses from known open proxies; and more, including users with low karma scores.”
‘Karma’ is how Reddit users are rated for their contributions to and engagement with the community.
“This stuff should be easy to spot once you’re looking at what you know is a compromised thread,” said Haroon. “This was as basic a bot attack as you could have.”
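The signals Haroon lists translate directly into simple heuristics. A toy sketch follows; the thresholds, blacklisted domain, data shapes and example accounts are all assumptions for illustration, not Reddit’s actual defences:

```python
def synced_voters(votes, window_seconds=60):
    """Flag accounts whose votes on one article land within a tight
    window of each other -- the 'voting in sync' signal."""
    ordered = sorted(votes, key=lambda v: v[1])  # (account, unix_time) pairs
    flagged = set()
    for (a1, t1), (a2, t2) in zip(ordered, ordered[1:]):
        if t2 - t1 <= window_seconds:
            flagged |= {a1, a2}
    return flagged

def suspicion_score(account, known_proxies):
    """One point per matching signal; high scores get human review."""
    score = 0
    if account["signup_age_days"] < 7:
        score += 1  # freshly registered, like a batch of new bots
    if account["email_domain"] in {"mailinator.com"}:
        score += 1  # throwaway email domain (hypothetical blacklist)
    if account["ip"] in known_proxies:
        score += 1  # address on a known open-proxy list
    if account["karma"] < 10:
        score += 1  # little real engagement with the community
    return score

bot = {"signup_age_days": 2, "email_domain": "mailinator.com",
       "ip": "198.51.100.7", "karma": 1}
print(suspicion_score(bot, known_proxies={"198.51.100.7"}))  # 4
print(synced_voters([("a", 0), ("b", 30), ("c", 500)]))      # flags 'a' and 'b'
```

Each heuristic is weak on its own; it is the combination across all 50 accounts voting on one article that makes the attack, as Haroon says, easy to spot.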