Edit-a-Thon (September 17, 2019)

Revision as of 18:18, 18 September 2019 by Ntnsndr (talk | contribs)

Strategic refusal!

September 17, 2019, 2-4 p.m. Eastern Time

This edit-a-thon will focus on the idea of strategic refusal of concerning forms of educational technology. The event will begin with a conversation between Donna Lanclos and Charles Logan. They will explore Donna’s recent keynote address on strategic refusal, the challenges associated with it, and how Charles is currently using strategic refusal in his work as an educational technologist. The conversation will then open to all participants, after which we’ll draft texts to use for strategic refusal.


  • 60 min: Videoconference introduction on Zoom
    • 30 min: Conversation between Donna Lanclos and Charles Logan
    • 30 min: Open discussion between all participants
  • 60 min: Edit-a-thon

Recommended Reading


If you can make it, edit this section and add your name below. (You'll need to log in or create an account first. Check your spam folder if you don't receive an account confirmation.)

Edit-a-thon results

Reading recommendations mentioned in discussion

Remixable text for explaining why you won't use or support a certain edtech tool

  • The following text was written by Charles Logan, a campus edtech staff member, as an ethical rationale for refusing to support a certain tool. It is shared here so that others can freely adapt it for their own purposes.

Dear [name],

After a lot of reading, listening, and reflecting over the past year, I have decided to no longer support the use of [tool name]. I struggled with the decision. I am committed to assisting you as we work together to teach students, and I know my choice means I will not be present to help on this particular issue.

I ultimately arrived at my decision for many reasons. First, as I read through the reviews of [tool name] left by students in the Chrome Web Store, I was struck by the fear, anger, anxiety, and desperation that run through many of the posts. Second, I examined [tool name]'s website. The company promises academic integrity using “machine learning and advanced facial recognition technologies to remove human error and bias.” I do not believe the technology can remove human error and bias. I arrived at this conclusion based on the work of people like Dr. Safiya Umoja Noble and Dr. Chris Gilliard. A third reason I've decided to no longer support [tool name] has to do with surveillance. I'm concerned that students, and all of us, really, are living under increasing surveillance from different institutions, and that this surveillance has an outsized impact on marginalized peoples. I worry about what normalizing surveillance means for students' freedom and their ability to trust me.

In sharing my reasoning with you, I do not intend to imply that you want to make your students feel uncomfortable or that you distrust your students. I have witnessed firsthand your passion for teaching and your desire to support your students' success. I also know the frustration that comes when an institution demands we do more and more with less and less support, so I understand why [tool name] appears to meet the challenges of large enrollment and its related issues.

I welcome the chance to talk through my thinking on [tool name] as well as provide ideas and support for reimagining your assessments in such a way that [tool name] is no longer needed.