January 12 2023 GM

'''Glitter Meetups'''

Glitter Meetup is the weekly town hall of the Internet Freedom community at the IF Square on the TCU Mattermost, at 9am EST / 2pm UTC. Do you need an invite? Learn how to get one here.
*'''Date:''' Thursday, January 12th
*'''Time:''' 9am EST / 2pm UTC
*'''Who:''' Stella Cooper, Ray Adams Row Farr, Marija Ristic, and Sophie Dyer
*'''Moderator:''' Islam
*'''Where:''' On TCU Mattermost "IF Square" Channel.

'''AMA Evidence Lab: Feminist and Ethical Perspectives on OSINT'''

For this GM, we’re meeting with the frontline of Amnesty’s Evidence Lab, where you will have the space to ask any questions about open source research methodologies, tactics, and ethics. The use of data gathered through open source research in humanitarian, human rights, and open source investigations has been on the rise, and in several contexts, such as the MENA region, the question of ethics is integral to how data is collected and presented. This is a chance for everyone to learn more about this work, and an opportunity to contribute to the questions that frontline researchers deal with.

'''Bio:''' Stella Cooper and Ray Adams Row Farr, who are at the frontline of the Evidence Lab’s research, Marija Ristic, who leads the Digital Verification Corps (the student volunteer cohort), and Sophie Dyer, who is a visual researcher and Tactical Research Advisor at Amnesty, will speak with each other and with us about the ways we can embed OSINT in our work from feminist and ethical perspectives.

==Notes==
This is the [https://citizenevidence.org/ Evidence Lab Blog], where you can find how-tos, case studies, reflections and ethical discussions from the Evidence Lab team.
Today we're speaking with @stella.cooper, @rayadamsrowfarr, @marienr and @sophie_d (you can find them on Mattermost with these usernames): Stella Cooper and Ray Adams Row Farr, who are at the frontline of the Evidence Lab’s research, and Marija Ristic, who leads the Digital Verification Corps (the student volunteer cohort), alongside Sophie Dyer, who is a visual researcher and Tactical Research Advisor at Amnesty!
 
===== '''Tell us more about yourselves: how do you personally define OSINT (if you'd like to!), and how did you get into this type of work/research?''' =====
*Stella:
**I took a zigzag path to where I am now. When I was in university, I really loved travel and research, so I started looking for career paths that would let me do both. A professor recommended I email C4ADS (Center for Advanced Defense Studies), a Washington DC based non-profit. They use open source (I had never heard the phrase at the time) to map and track global illicit networks that contribute to weapons trafficking, environmental crime, nuclear proliferation, and forced labor. At the time it was a really small organization (fewer than 15 people) and I thought their reports were fascinating. I started interning for them my senior year of college and stayed on staff for 4 years after that. During those 4 years I got to develop new open source techniques to work on issues I really cared about, like building a database of weapons in South Sudan, uncovering the first database of property ownership in Dubai, and even helping build a flight tracking platform. I learned how to collaborate and innovate.
**Still, that job wasn’t the right fit for me, and I was curious about combining more traditional investigative approaches (testimony, narrative storytelling, etc.) with open source methods. I applied to the New York Times fellowship program and was very lucky to join their Visual Investigations desk (first as a fellow and afterwards as a freelance video journalist). My plan was to focus on open source in long-term investigations, but the New York Times quickly realized that verification and video forensics were becoming central to breaking news. I started covering a lot of US news, including the BLM protests and the rise of white supremacy in the leadup to January 6th.
**After that I landed at Amnesty where I’m an open source researcher with the Evidence Lab, which basically means that I look for ways to apply open source methods across Amnesty International. Alongside Ray, Marija, and Sophie, I help Amnesty researchers by adding skills like verification/geolocation, analyzing videos from social media, working with public records, and many more open source approaches.
**I define open source as using any digital/publicly available information for investigative purposes. I like to keep that definition wide because it’s always evolving. One note: our team makes an effort to use the phrase “open source” rather than “OSINT.” We made this choice because the “-INT” terminology is a product of the Anglo-American intelligence industry, and we feel that it shouldn’t be centered in the broader open source community.
**I feel really strongly that you shouldn't need a specific technical background to begin exploring open source investigations (I certainly didn't). I think the beauty of this work is in finding research partners with complementary skills. That's a big part of why I've been able to learn what I know now. Some people will code, others will map, or interview, or endlessly scrub the web. You don't need any one skill to bring value to the space.
**There is also the myth of the #OSINT "sofa sleuth" who conducts high-stakes investigations from their bedroom. This might be true in a few exceptional circumstances, but working for Amnesty we get a lot of professional support, e.g. psychosocial support, which mitigates the risks of open source work such as secondary stress and trauma or security threats.
*Ray:
**I’ve been doing open source research on human rights issues for 5 years, first volunteering for Amnesty as part of our Digital Verification Corps when I was at university, and then for the past 18 months as an Open Source Researcher in the Evidence Lab. I also teach the ethics, challenges and practicalities of open source research at the University of Cambridge, the UK’s National Centre for Research Methods and MIT. One part of my job I’m really passionate about is sharing knowledge and having open discussions about the tensions and difficulties of doing the work we do.
**For me, open source is a way of using openly accessible digital sources to conduct research into a given topic. Most commonly it’s thought of in terms of photos and videos, but - especially in the Evidence Lab - it can mean anything from satellite imagery and large data sets (like [https://acleddata.com/ ACLED]) to pollution data and text analysis.
*Marien: For me, open source is a body of data, docs, images, videos, etc., that are publicly available. My background is in journalism, the same as Stella, so this came naturally. At the beginning of my career, I used a lot of traditional open source information, such as documents, and then as the field developed, I started working more and more with images, videos, data scraping, etc.
*Sophie:
**My main responsibility is leading [https://decoders.amnesty.org/projects/ Amnesty Decoders] and working on our visual investigations. Amnesty Decoders combines open source information with microtasking, a.k.a. people power, and other forms of digital participation to generate the data for human rights investigations.
**My training is in design, research and spatial investigations (via the Centre for Research Architecture at Goldsmiths). Before Amnesty, I worked at Airwars, a small NGO that uses social media to record civilian casualties, at the time from international air strikes in Syria and Iraq.
**Until recently I co-organized the Feminist Open Source Investigations Group. As soul food, I also maintain an art practice.
**I personally define digital open source research as any type of research that makes use of information that is in the public domain, i.e. not private or classified, nor so expensive that the cost would be prohibitive. This can include freedom of information requests or purchased satellite imagery.
 
===== '''What are the major methodological challenges of open source research (these can be ethical, operational, etc.)?''' =====
*Sophie says that when running a crowdsourcing/crowdsolving project, one of the biggest challenges is finding a research question that fits. You need a big problem that can be broken down into many simple tasks and that does not contain any graphic or otherwise traumatic content. A good example of this was Decode Surveillance NYC, where 7K volunteers helped us count and categorize CCTV cameras on Google Street View (a toy sketch of the aggregation step behind this kind of project follows this list). But we scope lots of projects before finding one that could work.
*Stella adds that, right now, she is thinking A LOT about overconfidence in digital evidence. We live in a world where people believe what they see. In the professional community we occupy, I’ve seen far too many times that people fixate on the minutiae of videos (e.g. giving the exact time something happened) and lose sight of the implications of the work. For example, it may be possible to provide an exact play-by-play of how an officer-involved shooting unfolded, but does that matter if you traumatize your audience and retraumatize the victims by publishing that footage? I see a lot of emphasis on “truth” and “verification” in mainstream open source right now, and fewer questions of “should we be applying these methods in these ways?” This has so many implications for privacy in a world where open source information is constantly being created about all of us.
*To build on what Stella’s mentioned, one methodological challenge Ray struggles with on a daily basis is how you decide on the threshold of “verified” for a piece of content. When collaborating with others, this is often framed as the ‘goal’ of our research, as it’s a commonly used term and part of the wider community’s understanding of what open source is.
*What is meant by “verified” is highly contextual and can vary from case to case. Overreliance on this one term can oversimplify a complex reality. The challenge then is to use language sensitively and accurately to represent what you’ve found out.
*The other issue with defining our research in terms of ‘verified’ or ‘unverified’ is that it obscures the labor that goes into research which is ultimately unverifiable. Doing work which goes nowhere is a really important part of our work, and [https://citizenevidence.org/2021/12/10/not-everything-is-verifiable-but-thats-ok-lessons-from-a-failed-geolocation/ finding something unverifiable is often just as important], as it rules out what you definitely can’t say with confidence. On a personal level, these challenges shape my perception of 'success' in the field and how accomplished you feel on a given project.
*For Sophie, bias is also a big problem in open source. Some human rights violations are simply more visible because they show up in photos or are not taboo subjects (unlike, say, sexual or gender-based violence). Here is some recent writing on bias:
**[https://www.researchgate.net/publication/350749186_Open_Source_Information's_Blind_Spot Open Source Information’s Blind Spot] by Yvonne McDermott, Alexa Koenig and Daragh Murray
**[https://academic.oup.com/jicj/article/19/1/55/6276591 Power and Privilege: Investigating Sexual Violence with Digital Open Source Information] by Alexa Koenig and Ulic Egan
**[https://twitter.com/sophiecdyer/status/1501253912799723522?s=20&t=k_4wnRpiGIyxPsjDJa1PSQ Twitter thread] about the limits of open source in the context of the invasion of Ukraine
*Stella adds to this: Right now, a lot of resources are being poured into using open source for traditional conflict issues, which often align with military interests. It's changing quickly though, and I think this is evident when you look at how young people already apply a lot of these "research methods" all the time. TBH I have Gen Z cousins that can unravel information online faster than most researchers/journalists I've worked with.
*For Ray, this is a constant source of frustration, and it illustrates a core tension in open source: it often uses other people's publicly available data, justified by the moral purpose of 'doing good'. Examples like the one you've mentioned test the limits of the extent to which that can justify invasive practices in the field.
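To make Sophie's microtasking point above concrete, here is a toy sketch of the aggregation step behind a project like Decode Surveillance NYC: the same location is shown to several volunteers independently, and a simple majority vote turns noisy individual answers into one data point. This is purely illustrative; the task IDs, labels, and agreement threshold are invented, and it is not the Decoders platform's actual code.

<syntaxhighlight lang="python">
# Illustrative only: majority-vote aggregation of redundant volunteer labels,
# the basic pattern behind microtasking projects. All values are invented.
from collections import Counter

# Each Street View location is labelled by several volunteers independently.
volunteer_labels = {
    "panorama-001": ["attached", "attached", "standalone"],
    "panorama-002": ["no-camera", "no-camera", "no-camera"],
}

def aggregate(votes, min_agreement=0.6):
    """Return the majority label, or None when volunteers disagree too much."""
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= min_agreement else None

for task, votes in volunteer_labels.items():
    print(task, "->", aggregate(votes))
# panorama-001 -> attached
# panorama-002 -> no-camera
</syntaxhighlight>

Locations where volunteers disagree (the function returns None) would typically be routed to a reviewer rather than discarded.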
 
===== '''Where to get started? What are the best places/accessible or free resources you recommend to learn about open source investigation?''' =====
*Different people get inspiration from different things! Some people like to follow data journalists and investigative reporters on Twitter. This can give you awareness of how these methods are applied.
*Marien: I agree with Stella, and would add that there are many ethical challenges the open source community is facing, and in general the ethics of open source is very poorly defined. It is usually up to organizations to develop their own ethical guidelines and determine how they use open source content. To list some ethical concerns: there is, for example, a lack of guidance on how we credit the content we take, and on whether or not we ask permission. In many cases, people just repost what is online, rarely asking for consent. So people who post are often denied the agency to decide how something is used further by the open source community.
*Sophie builds on Marien’s feedback: while there are sometimes valid security reasons for not naming a source, too often investigators do not try to get permission. Until recently, I felt that there was a sense of exceptionalism in open source research, where the normal rules don't apply. I think this is changing, but not fast enough. When we are doing this work, we should always be asking: who benefits? Who is put at risk by the work? Justice for whom?
*Check out blogs/guides like:
**[https://www.bellingcat.com/ Bellingcat] and [https://gijn.org/online-research-tools/ GIJN’s online research tools]
**[https://citizenevidence.org/ Citizen Evidence], by Evidence Lab
**[https://advocacyassembly.org/en/partners/amnesty/ Free and interactive course on Open Source Research], by Evidence Lab
**Read research papers from groups like [https://c4ads.org/ C4ADS] or [https://www.info-res.org/ CIR]
 
===== '''Is there a responsible data framework for [https://responsibledata.io/2016/11/14/responsible-data-open-source-intelligence/ open source investigation]?''' =====
*This is an amazing question! Sadly, the answer (to Stella’s knowledge) is no. Right now, things are being done on an institutional basis (and not all groups take the time). Her hope is that as more groups publish their ethical guides, community-wide guidelines will become clearer.
*But also keep in mind that all ethics are contextual, so they should vary for each person and group.
*Sophie adds: This is not yet a field-wide effort, but the Engine Room's [https://www.theengineroom.org/ethical-considerations-for-open-source-investigations/ Ethical Considerations for Open Source Investigations] was published recently, and I think it is a really accessible and thoughtful resource.
 
===== '''What are the tools you use in this work? What criteria do you use when evaluating a new tool?''' =====
*When looking at a new tool, Stella explains, I always examine how data may be stored by the tools I’m using. Where is the data hosted? Who has access to it? Who has a backdoor? Will I be selling someone else’s data by using this tool? Can I ensure that my information will not be co-opted?
*This is especially important at Amnesty because we often integrate witness/survivor testimony into our investigation. We have to be protective of our investigations to respect the people Amnesty interviews.
 
*Stella also thinks about whether the tool itself is doing anything unethical. For example, there are some "people search" tools that basically get data from their users' address books and make that information searchable to others. To me, that crosses the line of what individuals have consented to putting online, and I don't use them.
*Building on what Stella mentioned, Marien says that they try to use tools that have previously been vetted, either by other organizations or by us. I always pay attention to who is behind a tool and for what purpose it was designed. Additionally, I pay particular attention to whether any data is stored, or whether I am leaving a digital footprint by using a specific tool.
*Ray finds that tools are most useful at making existing manual processes easier, or helping answer one part of a very specific question (e.g. if you want to know what time of day a video was captured, you turn to [https://www.suncalc.org/ SunCalc] to use the shadows to work out the time, or [https://www.wolframalpha.com/input?i=weather%3A+lithuania+16+november Wolfram Alpha] to search for the weather on the day; see the shadow-matching sketch after this list). The most difficult bit is often not finding an appropriate tool, but working out where a tool may be needed in the first place. This points more broadly to the most useful tool in open source: your analytical, curious and critical mindset! You don't need access to fancy, complicated or paid tools to get started and do a successful open source investigation. That being said, one free and easy recommendation is [https://www.google.com/search?q=invid&oq=invid&aqs=chrome..69i57j69i59j0i271j69i61l3.1105j0j7&sourceid=chrome&ie=UTF-8 InVID] as a catch-all combining a few useful processes like reverse image search and metadata downloading.
*Here are some good sites to check:
**[https://www.marinetraffic.com/en/ais/home/centerx:-12.0/centery:25.0/zoom:4 Marine Traffic]
**[https://apps.sentinel-hub.com/eo-browser/?zoom=12&lat=9.38968&lng=31.5022&themeId=DEFAULT-THEME&toTime=2023-01-12T14%3A48%3A38.873Z Sentinel Hub]
**[https://opencorporates.com/ OpenCorporates]
**Maltego
**Google Earth
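To unpack Ray's SunCalc example above: an object's shadow length per unit of height is 1/tan(solar elevation), so if you can measure the shadow-to-height ratio in a geolocated frame, you can scan the sun's elevation over the day and recover candidate capture times. Below is a minimal sketch of that logic in Python, assuming the third-party astral package for the solar geometry; the coordinates, date, and measured ratio are invented for the example, and SunCalc itself does all of this interactively in the browser.

<syntaxhighlight lang="python">
# A minimal sketch of shadow-based time-of-day estimation (the logic behind
# what SunCalc does interactively). Location, date, and ratio are invented.
from datetime import datetime, timedelta, timezone
from math import radians, tan

from astral import LocationInfo          # pip install astral
from astral.sun import elevation

# From a geolocated frame: a 2 m pole casts a 3.1 m shadow.
observed_ratio = 3.1 / 2.0

place = LocationInfo("example", "example", "UTC",
                     latitude=54.69, longitude=25.28)

t = datetime(2022, 11, 16, 0, 0, tzinfo=timezone.utc)
candidates = []
while t.day == 16:
    elev = elevation(place.observer, t)      # solar elevation in degrees
    if elev > 0:                             # sun above the horizon
        ratio = 1 / tan(radians(elev))       # shadow length per unit height
        if abs(ratio - observed_ratio) < 0.05:
            candidates.append(t)
    t += timedelta(minutes=10)

print(candidates)  # usually two clusters: one morning, one afternoon
</syntaxhighlight>

Measurement error and terrain widen the window, which is why Ray pairs the shadow estimate with an independent check like the weather lookup above.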
 
===== '''What would you recommend to digital rights defenders in terms of open source? Particularly because we live under increasingly oppressive conditions, we are pushed by a sense of urgency (which can then lead to unethical processes under the guise of the greater good).''' =====
*Stella:
**Let honest questions about your investigation drive the work, not just the research methods. Often I see people who are starting out in open source let practicing established open source methods drive their work, even though those methods may not directly contribute to understanding the problem (e.g. you spend 15 hours geolocating a video that doesn’t actually need to be geolocated). Exploration and practice are important, but try to keep the bigger picture in mind. What are you trying to prove and how can you honor that story?
**Take your problem/research question and think about all the signals of that activity. When looking at a major event I think “who may have captured the event on camera?,” like government cameras (CCTV), observers with cell phones, neighbors with home surveillance systems, satellites from above. When looking at a company I think, “what official processes do they have to go through to exist as a company?,” like filing taxes, registering licenses, settling lawsuits, etc.  When looking at a weapons delivery, I think “where did this have to travel to get here?,” like from a manufacturer with a catalog online, through a port with records, on a vessel that you can track with AIS. Don't be afraid to use interviews and human sources alongside your work. Stay curious about your research question and keep looking for new data sources.
*Sophie:
**Look after yourself
**If something feels uncomfortable or risky stop and get support or a second opinion
**Think twice before you publish. Some information might be best shared privately or in a dark archive. This [https://citizenevidence.org/2020/11/10/ethics-data-open-source/ blog post] we wrote includes a series of decision trees for the collection and sharing of geolocated data in crisis situations (a condensed, hypothetical sketch of such a tree follows this list).
*Ray: just because you can access/research/publish something, that doesn't mean you should. Integrate reflection and evaluation (by yourself and others) into your research as a core part of the process.
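To give a feel for what a publish-or-not decision tree encodes, here is a heavily condensed, hypothetical version in Python. The questions and outcomes are our own illustrative assumptions, not the actual trees from the blog post above, which cover collection and sharing in far more depth.

<syntaxhighlight lang="python">
# Hypothetical, condensed publish/share decision tree for geolocated material.
# The questions and outcomes are illustrative, not Amnesty's actual criteria.
def sharing_decision(has_consent: bool,
                     identifies_people: bool,
                     ongoing_physical_risk: bool) -> str:
    if identifies_people and not has_consent:
        return "do not publish: seek consent or redact identifying details first"
    if ongoing_physical_risk:
        return "do not publish openly: share privately or hold in a dark archive"
    return "ok to publish, with credit to the source where it is safe to give"

print(sharing_decision(has_consent=True,
                       identifies_people=True,
                       ongoing_physical_risk=False))
# -> ok to publish, with credit to the source where it is safe to give
</syntaxhighlight>

The value of writing the tree down, in whatever form, is that the reflection Ray describes becomes a repeatable step rather than an afterthought.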
 
===== '''What are some of the best recent investigations/reports based on open source research that had a human rights impact that you recommend checking out? And some of the open source research experts we can follow on Twitter or Mastodon?''' =====
*In an act of shameless self-promotion, we would love to suggest [https://www.amnesty.org/en/latest/research/2022/11/myanmar-the-supply-chain-fueling-war-crimes/ this report our team contributed to recently]! I think it's an amazing example of mixing open source methods (social media, weapons analysis, satellite imagery, vessel tracking) and traditional reporting (leaked documents, interviews, etc.)
