The State of Internet Freedom Around the World
Who: Jillian C. York, EFF
Date: Thursday, October 22nd
This year has been one of the most difficult years the Internet Freedom community has experienced in the last decade. Join EFF’s very own Jillian C. York, as she gives us a year in review, helping us understand the trends we are experiencing, and what we can do in the coming year to support Internet Freedom. Join this session and:
- Learn about the state of Internet Freedom and developments in different regions.
- Hear about common trends in the digital rights space.
- Get a look at upcoming, next-generation challenges on the horizon.
- Find out what you can do to help the community.
Jillian C. York is EFF's Director for International Freedom of Expression and is based in Berlin, Germany. Her work examines state and corporate censorship and its impact on culture and human rights, with an emphasis on marginalized communities. At EFF, she leads Onlinecensorship.org and works on platform censorship and accountability, state censorship, the impact of sanctions, and digital security. Jillian's writing has been featured in Motherboard, Buzzfeed, the Guardian, Quartz, the Washington Post, and the New York Times, among others. She is also a regular speaker at global events.
Notes
- We are at a much more advanced generation of blocking compared to 10 years ago.
- Currently, we are in an era where legislation passed in the USA and Europe is having a disproportionate impact on the rest of the world. We are also seeing an export of laws and policies, just as we have seen an export of technology.
- In Turkey, for example, the new internet law mimics the German law aimed at hate speech and misinformation (the NetzDG), which forces companies to take down content within 24 hours. What makes the Turkish law different from the German one is that it imposes an even shorter takedown period and requires companies to have a local employee or representative.
- The issue with these policies is that while they start off targeting social media platforms to address hate speech and mis- and disinformation, they are then used to target activists, journalists, protests, and so on.
- Internet shutdowns are much more sophisticated today than they used to be. Some of the first Internet shutdowns happened in Mauritius, Myanmar, and Guinea. Trends happening in other parts of the world could very easily reach the USA.
- The General Data Protection Regulation (GDPR) coming out of Europe helps many of us. However, other laws coming up in the EU are concerning, particularly those trying to deal with terrorism. Currently, companies share content that has been marked as extremist or terrorist with each other so that they can all take it down, which is usually done via automated technology rather than by humans. The issue is that these efforts disproportionately focus on Islam, for example, rather than on right-wing extremists.
- The terrorist content regulations in Europe are going to require content to be taken down within one hour, which means more automation and AI. Twitch, for example, admitted to taking down anti-terrorist content because it is too hard to tell the difference between it and pro-terrorism content.
- The upcoming Digital Services Act is a good place to push for some of the things we want.
- Disinformation is impacting conflict zones. As it becomes a bigger problem in smaller countries, we are going to see a lot more of the downsides of tackling disinformation, for example the “accidental” removal of journalists’ and activists’ accounts.
- In the US, we should be mindful and work to make sure Internet shutdowns never happen here. A shutdown is a possibility, even though it may seem far-fetched.
- What is worrying about the future is platforms and their behavior. Why do companies like Facebook continue to centralize power? Why don’t they understand freedom of speech, and why do they seem to promote hate speech? For example, why can you deny the Holocaust but not show a nude body? Platforms seem to be okay with encouraging nasty behavior. Someone needs to intervene, but who should it be? Do we want the government? Other companies? A lot of the features companies are putting in now should have been put out years ago, or the underlying problems should have been solved at the root. We don’t talk about a lot of the problems that led us to where we are. We have to talk about inequality, race, gender, and everything else.
- What we can’t do is think that there is one answer. We have to think about these problems holistically. Educate lawmakers in the US, the EU, and your own country, especially in the EU, since what happens there will have an impact everywhere else.
- We need more art that talks about these issues. How about bringing back culture jamming?
- It’s so important to engage the platforms. We can stop them from doing the worst things.
- Currently, Jillian and others are working on revamping the Santa Clara Principles on Transparency and Accountability in Content Moderation. They have received a lot of recommendations from Brazil and Kenya.
- Currently, platforms moderate speech without liability, and they can moderate speech as they see fit. They exert enormous control over our speech. We also have to define freedom of speech, because certain speech harms vulnerable communities.
- More automated tools than ever are being used to moderate content. AI can be effective in certain circumstances, but it doesn’t solve everything. You need humans, but you have to make sure moderators are taken care of; it is a very difficult job because they see so much hateful and horrible material. (A minimal illustrative sketch of this kind of human-in-the-loop moderation appears after these notes.)
- Jillian has written a new book: Silicon Values! https://www.penguinrandomhouse.com/books/667400/silicon-values-by-jillian-york/
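To make the human-in-the-loop point above concrete, here is a minimal sketch of confidence-threshold routing between an automated classifier and human reviewers. Everything in it is hypothetical (the Post type, the classify stub, the thresholds, and the review queue are illustrative placeholders, not any real platform’s pipeline): the idea is that automation only acts on its own when it is very confident, and uncertain cases, such as documentation or counter-speech about terrorism, go to trained human moderators instead of being removed automatically.

```python
# Illustrative sketch only: hypothetical human-in-the-loop content moderation routing.
# The classifier, thresholds, and queue below are made-up placeholders, not a real system.
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    text: str


def classify(post: Post) -> float:
    """Stub standing in for a trained model; returns a policy-violation probability."""
    # A real system would call a model here; this stub just keys off an obvious word.
    return 0.95 if "terror" in post.text.lower() else 0.05


def moderate(post: Post, human_queue: list,
             remove_threshold: float = 0.98,
             review_threshold: float = 0.60) -> str:
    """Auto-remove only at very high confidence; defer uncertain cases to humans."""
    score = classify(post)
    if score >= remove_threshold:
        return "removed_automatically"
    if score >= review_threshold:
        human_queue.append(post)  # uncertain: let a trained human reviewer decide
        return "queued_for_human_review"
    return "left_up"


review_queue: list = []
# Counter-speech / documentation trips the keyword stub but is NOT auto-removed:
print(moderate(Post("1", "Documenting terror attacks for a news report"), review_queue))
# -> queued_for_human_review
```

The thresholds are where the policy trade-off lives: push them down to meet one-hour takedown deadlines and the “accidental” removals of journalists’ and activists’ content described above become much more likely.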