Just Security: Taking No Chances, Thailand’s Junta Locks Down the Internet ahead of Elections


After repeated postponements, Thailand will hold general elections on March 24—the first vote since high-ranking military officers effected a coup d’état in 2014 and installed themselves as the National Council for Peace and Order.

The elections are hardly a gamble for the junta. A new constitution, drafted under the generals’ supervision, provides the structural means for the military to maintain political dominance while tolerating superficially democratic processes. To control the government, a party must secure a majority of seats in the two houses of parliament combined. In the lower chamber, the House of Representatives, a total of 350 constituency seats and 150 party-list seats will be filled through direct elections. But in the Senate, the junta will appoint all 250 members for the first five-year term, and six seats will be held by senior military officials. With the Senate’s support guaranteed, pro-junta parties such as Palang Pracharath will need to capture only a third of the lower house to select the prime minister. This bodes well for coup leader Prayuth Chan-ocha, the incumbent prime minister and leading candidate of Palang Pracharath.

Despite the tailor-made constitutional structure and weakened electoral laws, the junta is moving aggressively to squelch support for the opposition. For example, the politicized Constitutional Court recently dissolved Thai Raksa Chart, an opposition party affiliated with ousted former prime minister Thaksin Shinawatra, after Ubolratana Rajakanya, the maverick sister of Thailand’s king, attempted to run as the party’s candidate for prime minister.

But the most telling sign of the junta’s determination to silence dissenting voices ahead of the elections is its onerous regime of restrictions on social media, as outlined in Freedom House’s Internet Freedom Election Monitor.

Censoring Online Content

In January, the Election Commission of Thailand released strict guidelines that limit parties’ use of social media. Parties must register their social media pages with the commission or face fines and prison terms. The rules also include penalties for sharing or “liking” defamatory content or spreading “false information.” If found in violation, parties can be dissolved and candidates can be jailed or disqualified from politics. The rules have already prompted candidates to self-censor; some have even opted to delete their accounts entirely. Others have hired legal teams to ensure that their posts are compliant, diverting resources that could otherwise be used to bring citizens into the political process.

The Thai authorities also commonly use removal requests to censor political content, and this has continued into the election period. The Election Commission set up a “war room” with a half-dozen monitors reviewing and flagging content deemed to be in violation of the guidelines. Requests have then been sent to platforms to remove posts that “spread lies, slander candidates, or use rude language.”

Arrests and Prosecutions for Online Activity 

The junta has an arsenal of repressive laws with which to target opposition leaders, activists, and journalists, most notably the Computer-Related Crime Act (CCA). Despite recent amendments, the CCA includes a number of broad and vague provisions that allow for politically motivated prosecutions. For example, it prohibits the dissemination of “false” and “distorted” information, and criminalizes content that the government deems to harm national security. Internet users are also routinely charged for lèse majesté, or insulting the monarchy.

At least five people have reportedly been charged under the CCA with sharing false information, endangering national security, and defaming officials in connection with the elections. Thanathorn Juangroongruangkit, the popular leader of the newly established opposition Future Forward Party, is facing charges under the CCA for “uploading false information” after he used Facebook in June 2018 to criticize the junta’s efforts to maintain power. In March, a junta official filed a complaint with police accusing Pongsakorn Rodchompoo, deputy leader of the Future Forward Party, of violating the CCA. Pongsakorn had shared an article on Facebook about a member of the military supposedly buying $377 coffee cups, though he quickly deleted the post when he discovered that it was false. Individuals charged under the CCA face up to five years in prison if convicted.

Keeping Voters in the Dark

Freedom House has long designated Thailand’s online environment as Not Free, and conditions seem at risk of further deterioration. In a country where a reported 74 percent of the population uses social media, platforms such as Facebook, Line, Twitter, and Instagram are fundamental to many people’s everyday lives. The junta’s targeting of social media suggests that it understands the unique role digital tools can play in opposition forces’ efforts to disseminate information and reach out to voters. The Future Forward Party has gained traction in part by using Line to build a following at the village level. It is also worth noting that the Election Commission’s guidelines are likely to have a disproportionate impact on younger voters, who use social media more regularly and may be more inclined to support progressive parties.

The military-dominated government’s controls on social media ensure that even within the narrow space for political competition allowed by the new constitution and related electoral laws, Thai voters will not have access to the open debate and discussion necessary to make an informed choice. The elections were never going to be free or fair enough to permit a democratic change in leadership, but they might have served as a roughly accurate barometer of public views. With the information environment so tightly circumscribed, even that gauge will be severely distorted.

Image: Future Forward Party leader Thanathorn Juangroongruangkit (L) is projected on a giant screen while speaking as New Economic Party leader Mingkwan Saengsuwan (2nd L), Pheu Thai party prime ministerial candidate Sudarat Keyuraphan (R), and Democrat Party leader Abhisit Vejjajiva listen during a live televised debate for candidates in Bangkok, Thailand on March 20, 2019, ahead of the March 24 general election. Thais will head to the polls on March 24 for the country’s much-anticipated elections after nearly half a decade of junta rule. (Photo by Romeo GACAD/AFP/Getty Images)

The post Taking No Chances, Thailand’s Junta Locks Down the Internet ahead of Elections appeared first on Just Security.


“trump and republican party” – Google News: The Republican party has surrendered to racist-in-chief Trump — will the rest of America? – AlterNet



We are about to learn something important about this country. Donald Trump is running for re-election as a flat-out racist and bigot, and we are going to discover …


“Russia influence in Eastern Europe” – Google News: The US is losing the nuclear energy export race to China and Russia. Here’s the Trump team’s plan to turn the tide – CNBC



Belarusian and Russian national flags are seen at the construction site of the very first Belarusian nuclear power plant, which will have two power-generating …


“crime and terror” – Google News: Birmingham mosque attacks: Counter-terror police called in to investigate incidents at Muslim places of worship across city – The Independent



Counter-terrorism police have been called in to investigate a series of attacks on mosques across Birmingham. Four mosques in the city had windows smashed …


“organized crime and terrorism” – Google News: Israeli spyware used to target slain Mexican journalist’s widow, report says – Haaretz



Pegasus, licensed by NSO, allows monitoring of devices and their content, including the remote activation of cameras and microphones without users’ knowledge.


mikenov on Twitter: The Nazi Fox News serving the Nazi Pig Trump gets a new old kapo: “Lachlan Murdoch takes control of Fox Corp. But how will he deal with President Trump?” Wapo is concerned… How will The Pig get his TV coverage? trumpinvestigations.bl


The Nazi Fox News serving the Nazi Pig Trump gets a new old kapo: “Lachlan Murdoch takes control of Fox Corp. But how will he deal with President Trump?” Wapo is concerned… How will The Pig get his TV coverage? trumpinvestigations.blogspot.com/2019/03/the-na… pic.twitter.com/btRvOh4Rp1



Posted by mikenov on Thursday, March 21st, 2019, 9:05am



Just Security: Sizing Up Facebook’s New Disclosures About the Christchurch Shooting


This article is co-published with Protego Press.

Facebook has just released new information about how the video of the recent terrorist attack in Christchurch, New Zealand, circulated online. There is much to digest here, and much more surely to come from technology companies, governments, and researchers in coming months; but Facebook’s new disclosures prompt four initial thoughts.

First and most simply, Facebook has made an effort to say more about its role in the spread of videos of the horrific violence in Christchurch—and to do so relatively quickly. Having been criticized for how slow and reluctant the company was to share information on the spread of disinformation during the 2016 presidential campaign cycle, the company is clearly taking a more proactive approach here. That’s to its credit, even if there are still more questions some will justifiably want answered, such as the full extent of Facebook user engagement with the uploaded videos and the average length of time these videos remained on the platform before being removed. And it’s worth keeping in mind the possibility that law enforcement (in the United States or elsewhere) might be asking the company not to share certain information while the attack is being investigated.

Second, Facebook’s disclosures demonstrate powerfully the cross-platform complexity of how violence spreads online. Facebook’s report reaffirms earlier indications that approximately 200 people watched the attack in real time and none reported it, and clarifies that the audience of 200 wasn’t assembled using Facebook itself. That, in turn, prompts the question of how that audience was assembled, and suggests the possibility that it might have amassed through a closed chat group on an end-to-end encrypted application like Signal, Telegram, or WhatsApp (itself owned by Facebook). That’s one way in which the spread of this vile material crossed platforms. Another way emerged once the video was no longer live but recorded: Facebook’s disclosures indicate that, even after Facebook essentially cleaned its platform of the recorded video, it was shared on other platforms (like YouTube), then reappeared on Facebook as it was uploaded there once again, in modified forms. All of this drives home the key point that addressing today’s content policy challenges can’t be viewed as a problem that any one company can solve on its own.

Third, Facebook’s report updates the company’s earlier estimates of how many distinct versions of the video Facebook has shared via the database jointly maintained by a number of tech companies through the Global Internet Forum to Counter Terrorism. It’s remarkable that nearly a thousand variations of the video have been created so quickly. That exceeds the rate at which jihadist groups like ISIS and their supporters tend to create versions of their video materials, perhaps as those groups seek to maintain stricter “brand control.” And it shows how diffuse this challenge has become: presumably these variations of the video are being made—rapidly—by individuals distributed globally, wholly unconnected to the Christchurch attacker, and yet intent on spreading video footage of his violence. Accentuating that point even further is the astonishing pace at which copies of the video were uploaded, leading some experts to speculate that the video’s rapid amplification was, in part, the product of a more deliberate, coordinated campaign. If that proves true, it would show a more tightly linked global community supporting white nationalist terrorism than has been seen before, adding to the increasingly international nature of what has long been called “domestic terrorism.”

Fourth and most fundamentally, much of the commentary over the past week has lumped together the real-time broadcast of the violence in Christchurch with the continued availability and indeed spread of recordings of that violence after the attack was over. Those are, of course, related challenges insofar as they involve the same basic content, but they’re distinguishable in key ways; and conflating them muddies rather than clarifies the problem and its potential solutions. The ability to broadcast in real time has become entrenched not only on Facebook but also on other leading communications platforms like Twitter; and figuring out how, if at all, to moderate that real-time content is a particular (and particularly difficult) challenge. One possibility would be for companies like Facebook to introduce a delay in broadcasting much as network television utilizes; but that has its downside, too, such as by inviting authoritarian regimes to demand that companies use that delay to block dissidents before their voices can be heard. Plus, it’s not clear how effectively violence can be caught through automated review in just seconds or even minutes of delayed broadcast. In contrast, the challenges posed by content that has been recorded and is now being maintained, recycled, and even repackaged are, in key respects, distinguishable. They include questions about what content should be removed; what role computer algorithms versus human review should play in making those removal decisions; what content should be blocked—or at least delayed—before being uploaded in the first place; and what other content should be offered via algorithm to a viewer as something to watch next.

These are hard, important, and urgent questions. Facebook’s disclosures today provide more fodder for discussing and debating them; and that is, itself, valuable.

Image: People view flowers and tributes by the botanical gardens on March 19, 2019 in Christchurch, New Zealand – Carl Court/Getty Images

The post Sizing Up Facebook’s New Disclosures About the Christchurch Shooting appeared first on Just Security.
