
1028 Words on Free Software

The promise of free software is a near-future utopia, built on democratized technology. This future is just and it is beautiful, full of opportunity and fulfillment for everyone everywhere. We can create the things we dream about when we let our minds wander into the places they want to. We can be with the people we want and need to be, when we want and need to.

This is currently possible with the technology we have today, but its availability is limited by the reality of the world we live in – the injustice, the inequity, the inequality. Technology runs the world, but it does not serve the interests of most of us. In order to create a better world, our technology must be transparent, accountable, trustworthy. It must be just. It must be free.

The job of the free software movement is to demonstrate that this world is possible by living its values now: justice, equity, equality. We build them into our technology, and we build technology that makes it possible for these values to exist in the world.

At the Free Software Foundation, we liked to say that we used all free software because it was important to show that we could. You can do anything with free software, so we did everything with it. We demonstrated the importance of unions for tech workers and non-profit workers by having one. We organized collectively and protected our rights for the sake of ourselves and one another. We had non-negotiable salaries, based on responsibility level and position. That didn’t mean we worked in an office free from the systemic problems that plague workplaces everywhere, but we were able to think about them differently.

Things were this way because of Richard Stallman – but I view his influence on these things as negative rather than positive. He was a force that imposed these outcomes, rather than someone supportive of the desires and needs of others. Rather than indulge in gossip or stories, I would like to jump to the idea that he was supposed to have been deplatformed in October 2019. In resigning from his position as president of the FSF, he certainly lost some of his ability to reach audiences. However, Richard still gives talks. The FSF continues to use his image and rhetoric in their own messaging and materials. They gave him time to speak at their annual conference in 2020. He maintains leadership in the GNU project and otherwise within the FSF sphere. The people who empowered him for so many years are still in charge.

Richard, and the continued respect and space he is given, is not the only problem. It represents a bigger problem. Sexism and racism (among others) run rampant in the community. This happens because of bad actors and, more significantly, because of the complacency of organizations, projects, and individuals afraid of losing contributors, respect, or funding. In a sector with so much money and so many resources, women are still being paid less than men; we deny people opportunities to learn and grow in the name of immediate results; people who aren’t men, who aren’t white, are abused and harassed; people are mentally and emotionally taken advantage of. We are coerced into burnout and into giving up our lives for these companies and projects, and we are paid for tolerating all of this by being told we’re doing a good job or making a difference.

But we’re not making a difference. We’re perpetuating the worst of the status quo that we should be fighting against. We must not continue. We cannot. We need to live our ideals as they are, and take the natural next steps in their evolution. We cannot have a world of just technology when we live in a world of exclusion; we cannot have free software if we continue to allow, tolerate, and laud the worst of us. I’ve been in and around free software for seventeen years. Nearly every part of it I’ve participated in has members and leadership that benefit from allowing and encouraging the continuation of maleficence and systemic oppression.

We must purge ourselves of these things – of sexism, racism, injustice, and the people who continue and enable it. There is no space to argue over whether a comment was transphobic – if it hurt a trans person then it is transphobic and it is unacceptable. Racism is a global problem and we must be anti-racist or we are complicit. Sexism is present and all men benefit from it, even if they don’t want to. These are free software issues. These are things that plague software, and these are things software reinforces within our societies.

If a technology is exclusionary, it does not work. If a community is exclusionary, it must be fixed or thrown away. There is no middle ground here. There is no compromise. Without doing this, without taking the hard, painful steps to actually live the promise of user freedom and everything it requires and entails, our work is pointless and free software will fail.

I don’t think it’s too late for there to be a radical change – the radical change – that allows us to create the utopia we want to see in the world. We must do that by acknowledging that just technology leads to a just society, and that a just society allows us to make just technology. We must do that by living within the principles that guide this future now.

I don’t know what will happen if things don’t change soon. I recently saw someone comment that change doesn’t happen unless one person is willing to sacrifice everything to make that change, to lead and inspire others to play small parts. This is unreasonable to ask of or expect from someone. I’ve been burning myself out to meet other people’s expectations for seventeen years, and I can’t keep doing it. Of course I am not alone, and I am not the only one working on and occupied by these problems. More people must step up, not just for my sake, but for the sake of all of us, the work free software needs to do, and the future I dream about.

Why should you work on free software (or other technology issues)?

Twice this week I was asked how it can be okay to work on free software when there are issues like climate change and racial injustice. I have a few answers for that.

You can work on injustice while working on free software.

A world in which all technology is just cannot exist under capitalism. It cannot exist under racism or sexism or ableism. It cannot exist at all if we are ravaged by the effects of climate change. At the same time, free software is part of the story of each of these. The modern technology state fuels capitalism, and capitalism fuels it. Just technology cannot exist without transparency at all levels of the creation process. Proprietary software and algorithms reinforce racial and gender injustice. Technology contributes heavily to the climate crisis. By working to make technology more just, by making it more free, we are working to address these issues. Software makes the world work, and oppressive software creates an oppressive world.

You can work on free software while working on injustice.

Let’s say you do want to devote your time to working on climate justice full time. Activism doesn’t have to only happen in the streets or in legislative buildings. Being a body in a protest is activism, and so is running servers for your community’s federated social network, providing wiki support, developing custom software, and otherwise bringing your free software skills into new environments. As long as your work is being accomplished under an ethos of free software, with free software, and under free software licenses, you’re working on free software issues while saving the world in other ways too!

Not everyone needs to work on everything all the time.

When your house is on fire, you need to put out the fire. However, maybe you can’t help put out the fire. Maybe you don’t have the skills or knowledge or physical ability. Maybe your house is on fire, but there’s also an earthquake and a meteor and an airborne toxic event all coming at once. When that happens, we have to split up our efforts, and that’s okay.

Transparency

Technology must be transparent in order to be knowable. Technology must be knowable in order for us to be able to consent to it in good faith. Good faith informed consent is necessary to preserving our (digital) autonomy.

Let’s now look at this in reverse, considering first why informed consent is necessary to our digital autonomy.

Let’s take the concept of our digital autonomy as being one of the highest goods. It is necessary to preserve and respect the value of each individual, and the collectives we choose to form. It is a right to which we are entitled by our very nature, and a prerequisite for building the lives we want, that fulfill us. This is something that we have generally agreed on as important or even sacred. Our autonomy, in whatever form it takes, in whatever part of our life it governs, is necessary and must be protected.

One of the things we must do in order to accomplish this is to build a practice and culture of consent. Giving consent — saying yes — is not enough. This consent must come from a place of understanding of that to which one is consenting. “Informed consent is consenting to the unknowable.”(1)

Looking at sexual consent as a parallel: even when we have a partner who discloses their sexual history and activities, we cannot know whether they are being truthful and complete. Even if they are, and we can trust this, there is a limit to how much they know about their own body, health, and experience. They might not know the extent of their other partners’ experience. They might be carrying HPV without symptoms; we rarely test for herpes.

Arguably, we have more potential to definitively know what is occurring when it comes to technological consent. Technology can be broken apart. We can share and examine code, schematics, and design documentation. Certainly, lots of information is being hidden from us — a lot of code is proprietary, technical documentation is unavailable, and the skills to process these things are treated as special, arcane, and even magical. Tracing the resource pipelines for the minerals and metals essential to building circuit boards is not possible for the average person. Knowing the labor practices of each step of this process, and understanding what those imply for individuals, societies, and the environments they exist in, seems improbable at best.

Even though true informed consent might not be possible, it is an ideal towards which we must strive. We must work with what we have, and we must be provided as much as possible.

A periodic conversation that arises in the consideration of technology rights is whether companies should build backdoors into technology for the purpose of government exploitation. A backdoor is a hidden vulnerability in a piece of technology that, when used, would afford someone else access to your device or work or cloud storage or whatever. As long as the source code that powers computing technology is proprietary and opaque, we cannot truly know whether backdoors exist and how secure we are in our digital spaces and even our own computers, phones, and other mobile devices.

We must commit wholly to transparency and openness in order to create the possibility of as-informed-as-possible consent, and thereby to protect our digital autonomy. We cannot exist in a vacuum; practical autonomy relies on networks of truth to provide the opportunity for the ideal of informed consent. These networks of truth are created through the open availability and sharing of information about how and why technology works the way it does.

(1) Heintzman, Kit. 2020.

Endorsements

Transparency is essential to trusting a technology. Through transparency we can understand what we’re using and build trust. When we know what is actually going on, what processes are occurring and how it is made, we are able to decide whether interacting with it is something we actually want, and we’re able to trust it and use it with confidence.

This transparency could mean many things, though it most frequently refers to the technology itself: the code or, in the case of hardware, the designs. We could also apply it to the overall architecture of a system. We could think about the decision making, practices, and policies of whoever is designing and/or making the technology. These are all valuable in some of the same ways, including that they allow us to make a conscious choice about what we are supporting.

When we choose to use a piece of technology, we are supporting those who produce it. This could be because we are directly paying for it; however, our support is not limited to direct financial contributions. In some cases our use benefits producers through things hidden within a technology: tracking mechanisms or backdoors that could allow companies or governments access to what we’re doing. When we create files on a computer, those files can contain metadata that says what software was used to make them. This is an implicit endorsement, and you can also explicitly endorse a technology by talking about it or about how you use it. In this, you have a right (not just a duty) to be aware of what you’re supporting. This includes, for example, organizational practices and whether a given company relies on abusive labor policies, indentured servitude, or slave labor.
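To make that implicit endorsement concrete, here is a minimal Python sketch of how a document format can record the software that produced it. OpenDocument files, for instance, are zip archives whose meta.xml carries a generator element naming the creating application. The archive built below is a hand-made stand-in, and the "LibreOffice/7.0" string and the `generator_of` helper are made up for illustration, not taken from any real file.

```python
# Sketch: a document file quietly records which application made it.
import io
import zipfile
import xml.etree.ElementTree as ET

META_NS = "urn:oasis:names:tc:opendocument:xmlns:meta:1.0"
OFFICE_NS = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"

# A minimal meta.xml like the one inside an .odt archive.
meta_xml = (
    f'<office:document-meta xmlns:office="{OFFICE_NS}" '
    f'xmlns:meta="{META_NS}"><office:meta>'
    '<meta:generator>LibreOffice/7.0</meta:generator>'
    '</office:meta></office:document-meta>'
)

# Build a tiny stand-in for an OpenDocument file in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("meta.xml", meta_xml)

def generator_of(odf_bytes: bytes) -> str:
    """Return the software name recorded in the file's metadata."""
    with zipfile.ZipFile(io.BytesIO(odf_bytes)) as zf:
        root = ET.fromstring(zf.read("meta.xml"))
    elem = root.find(f".//{{{META_NS}}}generator")
    return elem.text if elem is not None else ""

print(generator_of(buf.getvalue()))  # the tool that "made" the file
```

Every copy of a file like this that you share carries that generator string with it — a small, machine-readable endorsement of the tool you chose.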

Endorsements inspire others to choose a piece of technology. I investigate most of my technology purely for functionality, and the pieces I investigate are based on what the people I know use. The people I trust in these cases are more inclined than most to do this kind of research, to perform technical interrogations, and to be aware of what producers of technology are up to.

This is how technology spreads and becomes common or the standard choice. In one sense, we all have the responsibility (one I am shirking) to investigate our technologies before we choose them. However, we must acknowledge that not everyone has the resources for this – the time, the skills, the knowledge – and so endorsements become even more important to recognize.

Those producing a technology have the responsibility of making all of these angles something one could investigate. Understanding cannot only be the realm of experts. It should not require an extensive background in research and investigative journalism to find out whether a company punishes employees who try to unionize or pays non-living wages. Instead, these must be easy activities to carry out. It should be standard for a company (or other technology producer) to be open and share with people using their technology what makes them function. It should be considered shameful and shady to not do so. Not only does this empower those making choices about what technologies to use, but it empowers others down the line, who rely on those choices. It also respects the people involved in the processes of making these technologies. By acknowledging their role in bringing our tools to life, we are respecting their labor. By holding companies accountable for their practices and policies, we are respecting their lives.

Free Software Activities – September 2020

I haven’t done one of these in a while, so let’s see how it goes.

Debian

The Community Team has been busy. We’re planning a sprint to work on a bigger writing project and have some tough discussions that need to happen.
I personally have only worked on one incident, but we’ve had a few others come in.
I’m attempting to step down from the Outreach team, which is more work than I thought it would be. I had a very complicated relationship with the Outreach team. When no one else was there to take on making sure we did GSoC and Outreachy, I stepped up. It wasn’t really what I wanted to be doing, but it’s important. I’m glad to have more time to focus on other things that feel more aligned with what I’m trying to work on right now.

GNOME

In addition to, you know, work, I joined the Code of Conduct Committee. Always a good time! Rosanna and I presented at GNOME Onboard Africa Virtual about the GNOME CoC. It was super fun!

Digital Autonomy

Karen and I did an interview on FLOSS Weekly with Doc Searls and Dan Lynch. Super fun! I’ve been doing some more writing, which I still hope to publish soon, and a lot of organization on it. I’m also in the process of investigating some funding, as there are a few things we’d like to do that come with price tags. Separately, I started working on a video to explain the Principles. I’m excited!

Misc

I started a call that meets every other week where we talk about Code of Conduct stuff. Good peeps. Into it.

busy busy

I’ve been working with Karen Sandler over the past few months on the first draft of the Declaration of Digital Autonomy. Feedback welcome, please be constructive. It’s a pretty big deal for me, and feels like the culmination of a lifetime of experiences and the start of something new.

We talked about it at GUADEC and HOPE. We don’t have any other talks scheduled yet, but are available for events, meetups, dinner parties, and b’nai mitzvahs.

Fire

The world is on fire.

I know many of you are either my parents’ friends or here for the free software thoughts, but rather than listen to me, I want you to listen to Black voices in these fields.

If you’re not Black, it’s your job to educate yourself on issues that affect Black people all over the world and the way systems are designed to benefit White Supremacy. It is our job to acknowledge that racism is a problem — whether it appears as White Supremacy, Colonialism, or something as seemingly banal as pay gaps.

We must make space for Black voices. We must make space for Black Women. We must make space for Black trans lives. We must do this in technology. We must build equity. We must listen.

I know I have a platform. It’s one I value highly because I’ve worked so hard for the past twelve years to build it against sexist attitudes in my field (and the world). However, it is time for me to use that platform for the voices of others.

Please pay attention to Black voices in tech and FOSS. Do not just expect them to explain diversity and inclusion, but go to them for their expertise. Pay them respectful salaries. Mentor Black youth and Black people who want to be involved. Volunteer or donate to groups like Black Girls Code, The Last Mile, and Resilient Coders.

If you’re looking for mentorship, especially around things like writing, speaking, community organizing, or getting your career going in open source, don’t hesitate to reach out to me. Mentorship can be a lasting relationship, or a brief exchange around a specific issue or event. If I can’t help you, I’ll try to connect you to someone who can.

We cannot build the techno-utopia unless everyone is involved.

Racism is a Free Software Issue

Racism is a free software issue. I gave a talk that touched on this at CopyLeft Conf 2019. I also talked a little bit about it at All Things Open 2019 and FOSDEM 2020 in my talk The Ethics Behind Your IoT. I know statistics, theory, and free software. I don’t know about race and racism nearly as well. I might make mistakes – I have made some and I will make more. Please, when I do, help me do better.

I want to look at a few particular technologies and think about how they reinforce systemic racism. Put another way: how is technology racist? How does technology hurt Black, Indigenous, and People of Color (BIPOC)? How does technology keep us racist? How does technology make it easier to be racist?

Breathalyzers

In the United States, Latinx folks are less likely to drink than white people and, overall, less likely to be arrested for DUIs. However, they are more likely to be stopped by police while driving.

Who is being stopped by police is up to the police and they pull over a disproportionate number of Latinx drivers. After someone is pulled over for suspected drunk driving, they are given a breathalyzer test. Breathalyzers are so easy to (un)intentionally mis-calibrate that they have been banned as valid evidence in multiple states. The biases of the police are not canceled out by the technology that should, in theory, let us know whether someone is actually drunk.

Facial Recognition

I could talk about this for quite some time and, in fact, have. So have others. Google’s image recognition software recognized Black people as gorillas – and to fix the issue it removed gorillas from its image-labeling technology.

Facial recognition software does a bad job of recognizing Black people. In fact, it’s also terrible at identifying Indigenous people and other people of color. (Incidentally, it’s also not great at recognizing women, but let’s not talk about that right now.)

As we use facial recognition technology for more things, from automated store checkouts (even more relevant in the socially distanced age of Covid-19) to airport ticketing, phone unlocking, police identification, and a number of other applications, it becomes a bigger problem that this software cannot tell the difference between two Asian people.

Targeted Advertising

Black kids see 70% more online ads for food than white kids, and twice as many ads for junk food. In general BIPOC youth are more likely to see junk food advertisements online. This is intentional, and happens after they are identified as BIPOC youth.

Technology Reinforces Racism; Racism Builds Technology

The technology we have developed reinforces racism on a society-wide scale because it makes it harder for BIPOC people to interact with this world that is run by computers and software. It’s harder not to be racist when the technology around us is being used to perpetuate racist paradigms. For example, if a store implements facial recognition software for checkout, Black women are less likely to be identified. They are then more likely to be targeted as trying to steal from the store. We are more likely to take this to mean that Black women are more likely to steal. This is how technology builds racism.

People are being excluded largely because they are not building these technologies, because they are not welcome in our spaces. There simply are not enough Black and Hispanic technologists and that is a problem. We need to care about this because when software doesn’t work for everyone, it doesn’t work. We cannot build on the promise of free and open source software when we are excluding the majority of people.

Computing Under Quarantine

Under the current climate of lockdowns, self-isolation, shelter-in-place policies, and quarantine, the integral role computers play in our lives is becoming evident to more people. Students are learning entirely online, those who can are working from home, and our personal relationships are being carried largely by technology like video chats, online games, and group messages. When these things become our only means of socializing with those outside our homes, we begin to realize how important they are – and the inequity inherent in many technologies.

Someone was telling me how a neighbor doesn’t have a printer, so they are printing off school assignments for them. People I know are sharing internet connections with others in their buildings, when possible, to help save on costs as people lose jobs. I worry now even more about people who have limited access to home devices or poor internet connections.

As we are forced into our homes and are increasingly limited in the resources we have available, we find ourselves potentially unable to easily fill material needs and desires. In my neighborhood, it’s hard to find flour. A friend cannot find yeast. A coworker couldn’t find eggs. Someone else is without dish soap. Supply chains are not designed to meet the demand currently being exerted on the system.

This problem is mimicked in technology. If your computer breaks, it is much harder to fix it, and you lose a lot more than just a machine – you lose your source of connection with the world. If you run out of toner cartridges for your printer – and only one particular brand works – the risk of losing your printer, and your access to school work, becomes a bigger deal. As an increasing number of things in our homes are wired, networked, and only able to function with a prescribed set of proprietary parts, gaps in supply chains become an even bigger issue. When you cannot use whatever is available, and instead need to wait for the particular thing, you find yourself either hoarding or going without. What happens when you can’t get the toothbrush heads for your smart toothbrush due to prioritization and scarcity with online ordering when it’s not so easy to just go to the pharmacy and get a regular toothbrush?

In response to COVID-19 Adobe is offering no-cost access to some of their services. If people allow themselves to rely on these free services, they end up in a bad situation when a cost is re-attached.

Lock-in is always a risk, but when people are desperate, unemployed, and lacking the resources they need to survive, the implications of being trapped in these proprietary systems are much more painful.

What worries me even more than this is the reliance on insecure communication apps. Zoom, which is becoming the default service in many fields right now, offers anti-features like attendee attention tracking and user reporting.

We are now being required to use technologies designed to maximize opportunities for surveillance to learn, work, and socialize. This is worrisome to me for two main reasons: the violation of privacy and the normalization of a surveillance state. It is a violation of privacy to have our actions tracked. It also gets us used to being watched, which is dangerous as we look towards the future.

Seven hundred words on Internet access

I wrote this a few months ago, and never published it. Here you go.

In the summer of 2017, I biked from Boston, MA to Montreal, QC. I rode across Massachusetts, then up the New York/Vermont border, weaving between the two states over two days. I spent the night in Washington County, NY at a bed and breakfast that generously fed me dinner even though they weren’t supposed to. One of the proprietors told me about his history as a physics teacher, and talked about volunteer work he was doing. He somewhat casually mentioned that in his town there isn’t really internet access.

At the time (at least) Washington County wasn’t served by broadband companies. Instead, for $80 a month you could purchase a limited data package from a mobile phone company, and use that. A limited data package means limited access. This could mean no or limited internet in schools or libraries.

This was not the first time I heard about failings of Internet penetration in the United States. When I first moved to Boston I was an intern at One Laptop Per Child. I spoke with someone interested in bringing internet access to their rural community in Maine. They had hope for mesh networks, linking computers together into a web of connectivity, bouncing signals from one machine to another in order to bring internet to everyone.

Access to the Internet is a necessity. As I write this, 2020 is only weeks away, which brings our decennial, nationwide census. There had been discussions of making the census entirely online, but it was settled that people could fill it out “online, by telephone, or via mail” and that households can “answer the questions on the internet or by phone in English and 12 Non-English languages.” [1][2]

This is important because a comprehensive census is important. A census provides, if nothing else, population and demographics information, which is used to assist in the disbursement of government funding and grants to geographic communities. Apportionment, or the redistribution of the 435 seats occupied by members of the House of Representatives, is done based on the population of a given state: more people, more seats.

Researchers, students, and curious people use census data to carry out their work. Non-profits and activist organizations can better understand the populations they serve.

As things like the Census increasingly move online, the availability of access becomes increasingly important.

Some things are only available online – including job applications, customer service assistance, and even education opportunities like courses, academic resources, and applications for grants, scholarships, and admissions.

The Internet is also a necessary point of connection between people, and necessary for building our identities. Being acknowledged with their correct names and pronouns decreases the risk of depression and suicide among trans youths – and one assumes adults as well. [3] Online spaces provide acknowledgment and recognition that is not being met in physical spaces and geographic communities.

Internet access has been important to me in my own mental health struggles and understanding. My bipolar exhibits itself through long, crushing periods of depression during which I can do little more than wait for it to be over. I fill these quiet spaces by listening to podcasts and talking with my friends using apps like Signal to manage our communications.

My story of continuous recovery includes a particularly gnarly episode of bulimia in 2015. I was only able to really acknowledge that I had a problem with food and purging, using both as opportunities to inflict violence on myself, when reading Tumblr posts by people with eating disorders. This made it possible for me to talk about my purging with my therapist, my psychiatrist, and my doctor in order to modify my treatment plan and start getting the help I needed.

All of these things are made possible by having reliable, fast access to the Internet. We can respond to our needs immediately, regardless of where we are. We can find or build the communities we need, and serve the ones we already live in, whether they’re physical or exist purely as digital.

[1]: https://census.lacounty.gov/census/ Accessed 29.11.2019
[2]: https://www.census.gov/library/stories/2019/03/one-year-out-census-bureau-on-track-for-2020-census.html Accessed 29.11.2019
[3]: https://news.utexas.edu/2018/03/30/name-use-matters-for-transgender-youths-mental-health/ Accessed 29.11.2019