
Fire

The world is on fire.

I know many of you are either my parents' friends or here for the free software thoughts, but rather than listen to me, I want you to listen to Black voices in these fields.

If you’re not Black, it’s your job to educate yourself on issues that affect Black people all over the world and the way systems are designed to benefit White Supremacy. It is our job to acknowledge that racism is a problem — whether it appears as White Supremacy, Colonialism, or something as seemingly banal as pay gaps.

We must make space for Black voices. We must make space for Black Women. We must make space for Black trans lives. We must do this in technology. We must build equity. We must listen.

I know I have a platform. It’s one I value highly because I’ve worked so hard for the past twelve years to build it against sexist attitudes in my field (and the world). However, it is time for me to use that platform for the voices of others.

Please pay attention to Black voices in tech and FOSS. Do not just expect them to explain diversity and inclusion, but go to them for their expertise. Pay them respectful salaries. Mentor Black youth and Black people who want to be involved. Volunteer or donate to groups like Black Girls Code, The Last Mile, and Resilient Coders.

If you’re looking for mentorship, especially around things like writing, speaking, community organizing, or getting your career going in open source, don’t hesitate to reach out to me. Mentorship can be a lasting relationship, or a brief exchange around a specific issue or event. If I can’t help you, I’ll try to connect you to someone who can.

We cannot build the techno-utopia unless everyone is involved.

Racism is a Free Software Issue

Racism is a free software issue. I gave a talk that touched on this at CopyLeft Conf 2019. I also talked a little bit about it at All Things Open 2019 and FOSDEM 2020 in my talk The Ethics Behind Your IoT. I know statistics, theory, and free software. I don’t know about race and racism nearly as well. I might make mistakes – I have made some and I will make more. Please, when I do, help me do better.

I want to look at a few particular technologies and think about how they reinforce systemic racism. Worded another way: how is technology racist? How does technology hurt Black Indigenous People of Color (BIPOC)? How does technology keep us racist? How does technology make it easier to be racist?

Breathalyzers

In the United States, Latinx folks are less likely to drink than white people and, overall, less likely to be arrested for DUIs [3, 4]. However, they are more likely to be stopped by police while driving [5, 6].

Who gets stopped is up to the police, and they pull over a disproportionate number of Latinx drivers. After someone is pulled over for suspected drunk driving, they are given a breathalyzer test. Breathalyzers are so easy to (un)intentionally miscalibrate that they have been banned as valid evidence in multiple states. The biases of the police are not canceled out by the technology that should, in theory, let us know whether someone is actually drunk.

Facial Recognition

I could talk about facial recognition for quite some time and, in fact, have. So have others. Google’s image recognition software recognized black people as gorillas, and to fix the issue it removed gorillas from its image-labeling technology.

Facial recognition software does a bad job at recognizing black people. In fact, it’s also terrible at identifying indigenous people and other people of color. (Incidentally, it’s also not great at recognizing women, but let’s not talk about that right now.)

As we use facial recognition technology for more things, from automated store checkouts (even more relevant in the socially distanced age of Covid-19) to airport ticketing, phone unlocking, police identification, and a number of other things, it becomes a bigger problem that this software cannot tell the difference between two Asian people.

Targeted Advertising

Black kids see 70% more online ads for food than white kids, and twice as many ads for junk food. In general BIPOC youth are more likely to see junk food advertisements online. This is intentional, and happens after they are identified as BIPOC youth.

Technology Reinforces Racism; Racism Builds Technology

The technology we have developed reinforces racism on a society-wide scale because it makes it harder for BIPOC people to interact with this world that is run by computers and software. It’s harder to not be racist when the technology around us is being used to perpetuate racist paradigms. For example, if a store implements facial recognition software for checkout, black women are less likely to be identified. They are then more likely to be targeted as trying to steal from the store. We are more likely to take this to mean that black women are more likely to steal. This is how technology builds racism.

People are being excluded largely because they are not building these technologies, because they are not welcome in our spaces. There simply are not enough Black and Hispanic technologists and that is a problem. We need to care about this because when software doesn’t work for everyone, it doesn’t work. We cannot build on the promise of free and open source software when we are excluding the majority of people.

Iron Cocktail Club: Ancho Paloma

The purpose of Iron Cocktail Club is to pick a cocktail and have the participants attempt to make it from whatever they have available.

I picked the Ancho Paloma because it’s the kind of drink I adore, but rarely think to make for myself. It’s turning into spring, and I wanted something light and refreshing.

An ancho paloma, in a peanut butter jar, with a grapefruit wedge.

Ingredients

1½ oz. Siete Misterios Doba-Yej mezcal
½ oz. Ancho Reyes
¾ oz. fresh grapefruit juice
½ oz. fresh lime juice
¼ oz. agave nectar
2 drops salt solution (1:1 salt to water)
Club soda
Grapefruit wedge dipped in sal de gusano

Combine all the ingredients except the club soda in a shaker with ice and shake until chilled. Strain into a Collins glass over ice. Top with soda and garnish.

My Recipe

1½ oz. Casamigos Reposado Tequila
½ oz. Ancho chile simple syrup*
¾ oz. fresh grapefruit juice
½ oz. fresh lime juice
¼ oz. (potato) vodka
pinch smoked salt**
Club soda
Grapefruit wedge sprinkled with smoked salt**

Combine all the ingredients except the club soda in a shaker with ice and shake until chilled. Strain into a mason jar over ice. Top with soda and garnish.

* Ancho Chile Simple Syrup

To make this, take:
½ cup sugar
¼ cup water
2 cut ancho chiles
Cayenne to taste

Put the sugar in a small pot on medium heat. Add the water, ancho chiles, and cayenne. Now comes the hardest part: leave it alone. Just don’t mess with it. Watch it, but don’t touch it. After a while it’ll turn a caramel color. At this point, turn the temperature down to the lowest it goes and stir it. Once it’s appropriately syrupy (think a bit thinner than agave), turn off the heat and remove it from the burner. LET IT COOL DOWN BEFORE TRYING IT. Then, add cayenne to taste. Delicious, delicious taste.

A pot of simmering caramel syrup with ancho chiles in it.

**Smoked salt

½ cup kosher salt (or similarly ground salt)
1 tablespoon liquid smoke

Preheat your oven to 300 degrees. Mix the salt and liquid smoke. Spread out on a parchment-paper-covered cookie sheet/baking pan. Put in the oven for 10-15 minutes, or however long it takes to dry out.

Smoked salt drying on a baking sheet.

Thoughts

I loved it! It was a Paloma with a bit of heat. Very refreshing. Wonderful spring drink!

IoT

We, in the US, are starting to talk more widely about the dangers posed by Internet of Things (IoT) devices. This is great!

IoT devices are by and large terrible. They’re truly horrendous. They can be nice in a lot of ways — I enjoy controlling the music in the kitchen from my phone — but they normalize the situation where we trade our privacy and data for convenience. This is not just true of obvious surveillance technologies, though it is especially true for them and I want to talk about those.

Most of the conversation I have seen about surveillance IoT — like Ring doorbells and home surveillance devices — is focused on the insecurity of it. Major news outlets covered when a girl was harassed by someone hacking into a Ring camera her parents had installed in her bedroom.

Ignoring how creepy it is that her parents decided to install a camera in her bedroom, this story is disturbing because it’s about someone violating the sanctity of what should be a safe space for a family and, more so, for a child. This is posed as a security issue affecting an individual.

We need to shift the conversation in two ways:

1) No amount of security will make these kinds of devices safe;

and

2) This is not just about the individual — these types of surveillance put communities at risk.

I think the latter is the more important point, and something I want to focus on. The conversation should not just be about the security risk of someone breaking into my home surveillance. Instead it should focus on how, for example, surveillance systems are putting your neighbors at risk, especially as these systems are being co-opted by law enforcement and faulty facial recognition tech is being used.

We should talk about how victims of domestic violence and stalking can be monitored and tracked more easily by their abusers.

I believe strongly that the people making decisions, designing, building, and selling these technologies have a responsibility to the people who purchase them as well as those who come in contact with them. I view broadening the conversations beyond the “unhackability” of devices as a necessary next step.


MollyGive 2019

After much deliberation, I decided to not do MollyGive 2019. This was a bit of a blow, especially after MollyGive 2018 had a lot less reach than previous years. (For MollyGive 2018 I supported a large, matching donation to the Software Freedom Conservancy.)

I’ve spent the past seven months helping a friend pay their rent. I’ve paid for groceries, train tickets, meals, coffee, books for people I know and people I don’t. Medications for strangers. I stopped keeping track of the “good” I was doing, and instead just gave when I saw people in need.

This is counter to my past behavior. I’ve been burned a few times when offering funds to people who have a major need in their life — thousands of dollars to help people make major life changes only to have them instead use the money on other things.

I believe pretty strongly that, generally, people in need know what they need and are capable of taking care of it themselves. I don’t think it’s my place to dictate or prescribe. The experience of being burned and my thought about people knowing what they need were at odds.

At the same time, I saw people suffering around me. People who knew what they needed. People in positions I’ve been in: food or medication? A winter jacket or rent? I have the resources to take care of those material needs, so I supported them when the opportunity presented itself.

I have a friend who is scheduled to have surgery this spring. They have been given advice on how to fundraise for the surgery. In fact, people facing the prospect of life-saving, crushing-debt-generating treatments are given lots of information about how to run successful crowdfunding campaigns. This is appalling. You should be disgusted by it. You need to be disgusted by it.

Giving to charity helps. Giving to your neighbors helps. However, this is not enough. The sheer level of suffering and injustice in the world, in your country, your neighborhood, your home is sickening and giving ourselves a reprieve by donating to charities will not fix these systemic problems.

All of that being said, I have made donations to non-profits, and will make more. I hope you’ll join me in supporting groups that are doing good, necessary work. I also hope you’ll join me in striving to bring about the big societal changes that will make it so we don’t need so many charities.

My career has been in non-profits; it is my dearest hope to one day be out of a job. In the meantime, I’ll continue to work, and I’ll continue to give in whatever ways I can.

Consent

I was walking down the platform at the train station when I caught eyes with a police officer. Instinctively, I smiled and he smiled back. When I got closer, he said “Excuse me, do you mind if I swipe down your bag?” He gestured to a machine he was holding. “Just a random check.”

The slight tension I’d felt since I first saw him grabbed hold of my spine, shoulders, and jaw. I stood up a little straighter and clenched my teeth down.

“Sure, I guess,” I said uncertainly.

He could hear something in my voice, or read something in my change of posture. “You have to consent in order for me to be allowed to do it.”

Consent. I’d just been writing about consent that morning, before going to catch the train down to New York for Thanksgiving. It set me on edge and made more real what was happening: someone wanted to move into my personal space. There was now a legal interaction happening. “I don’t want to be difficult, but I’d rather you didn’t if you don’t have to.”

“It’s just a random check,” he said. “You don’t have to consent.”

“What happens if I say no?”

“You can’t get on the train,” he gestured to the track with his machine.

“So, my options are to let you search my bag or not go see my family for Thanksgiving?”

“You could take a bus,” he offered.

I thought about how I wanted to say this. Words are powerful and important.

“I consent to this in as much as I must without having any other reasonable option presented to me.”

He looked unconvinced, but swiped down my bag anyway, declared it safe, and sent me off.

Did I really have the right to withhold consent in this situation? Technically, yes. I could have told him no, but I had no other reasonable option.

At the heart of user freedom is the idea that we must be able to consent to the technology we’re directly and indirectly using. It is also important to note that we should not suffer unduly by denying consent.

If I don’t want to interact with a facial recognition system at an airport, I should be able to say no, but I should not be required to give up my seat, risk missing my flight, or spend exceptional effort as a consequence of refusing to consent. Consent should not be coerced, especially by making refusal carry extra risk or extra sacrifices.

In many situations, especially with technology, we are presented with the option to opt out, but that doesn’t just mean opting out of playing a particular game: it can mean choosing whether or not to get a life-saving medical implant; not filing your taxes because of government-mandated tax software; or being unable to count yourself in a census.

When the choice is “agree or suffer the consequences” we do not have an option to actually consent.

Ethical Source (2)

Continued from “Ethical Source.”

Keeping Ethics in Open Source

For the sake of argument, we’re currently going to assume that open source (defined as “software under an OSI approved license”) does not adequately address social issues.

Ethical Source proponents suggest adopting licenses with “ethics clauses,” also frequently known as “do no harm clauses.” These include such points as:

  • must conform to local labor laws;
  • may not be used by governments;
  • may not cause environmental destruction; and
  • may not be used to profit from “the destruction of people’s physical and mental health”

as well as the above examples from the Vaccine License and the Hippocratic License.

I would argue that these types of clauses are inherently flawed either due to ambiguity or unintended consequences.

To address the former, I want us to look at “environmental destruction.” There is a solid argument that all software causes environmental destruction – due to the drain on non-renewable energy resources. Software that makes cars safer also powers these cars, which fits into a narrative of car driven environmental damage.

When considering “the destruction of people’s physical and mental health,” we have to acknowledge how much software is damaging to both the physical and the mental. I am definitely having back problems due to poor posture as I sit typing away all day at my laptop. Social media has enabled bullying that has literally killed people.

These sorts of clauses are just too ambiguous to use.

Then there are firmer qualifiers, like “must conform to local labor laws.” This seems rather straightforward, but there are plenty of places where women are still fighting for the right to work, for equal pay, and against all forms of discrimination. In some countries husbands can prevent their wives from working. Following local labor laws means creating a community where whole groups of people are not allowed to participate in the building of open source software.

I also want to point out that “government use” is a very broad category. Governments provide health care, social security, scientific funding, arts funding, and necessary infrastructure. By restricting government use, we are restricting our access to things like education and weather data.

Licenses are not the tool to push for social issues. Licenses are a tool to build equity, and they are even a tool to fight against inequality, but they alone are not enough.

Seth Vargo pulled source code from the Chef code base when it came to light that Chef was working with ICE. Google employees staged walkouts and protests against Project Dragonfly. Tech workers and contributors can institute codes of conduct, ban companies doing evil from their communities, refuse to accept pull requests or contracts, unionize, collectively organize, and simply refuse to work when the technology they’re creating is being used for evil or by evil.

The other problem with Do No Harm licenses is that they require the adoption of those licenses. There are already many open source licenses to choose from. Much of the critical infrastructure we’re discussing is being built by companies, which I think are unlikely to adopt Do No Harm licenses.

Acknowledgments to Elana Hashman for ideas here.

Ethical Source

This is going to be post one of some unknown number. I cannot write everything I want to say in one post; it would just be indigestibly long.

Read part 2!

Is Ethical Source Open?

Let’s first define our terms here: “open source,” for the sake of this particular post, means “under an OSI approved license.” An OSI approved license must meet the points laid out in the Open Source Definition (OSD) — a ten point list of qualifications. “Ethical source” is being defined as being “under a license that applies moral or ethical limitations to the use and modification of the software.”

Ethical source is not open source. Every ethical source license I’ve seen violates OSD 5 and/or OSD 6.

5. No Discrimination Against Persons or Groups

The license must not discriminate against any person or group of persons.

6. No Discrimination Against Fields of Endeavor

The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

The Vaccine License is a good example of the first:

The Vaccine License is a software license that requires that users vaccinate their children, and themselves, and that user businesses make a similar requirement of their employees, to the greatest extent legally possible. The required vaccinations are those recommended by the user’s national administration, for example the United States Center for Disease Control. There is an exception for those who, for medical reasons, should not receive a vaccine.

The Vaccine License is saying that you cannot use software under the vaccine license if you’re not vaccinated (medical exceptions exist).

The Hippocratic License violates the second point:

No Harm: The software may not be used by anyone for systems or activities that actively and knowingly endanger, harm, or otherwise threaten the physical, mental, economic, or general well-being of other individuals or groups, in violation of the United Nations Universal Declaration of Human Rights (https://www.un.org/en/universal-declaration-human-rights/).

Services: If the Software is used to provide a service to others, the licensee shall, as a condition of use, require those others not to use the service in any way that violates the No Harm clause above.

If you have been following the conversation around licenses and ethical source, this is not new. If you haven’t, then it might be!

In the former case, there is a straightforward connection: not vaccinated? Not eligible to use it! This is specifically about the individual user.

The latter example, the Hippocratic License, violates OSD 5 in that the software may not be used by individuals (or groups of individuals) found in violation, but it also makes certain fields of endeavor verboten — horrible, illegal ones, but fields of endeavor nonetheless. You cannot use this software for torture.

Neither of these licenses are open source.

In general, ethical source licenses place restrictions on individuals, groups, or fields of endeavor; this means that they cannot be open source.

Does it matter that they are not “open source”?

There is commercial and social value in a license being open source. For a company, it’s a friendly certification mark that appeals to customers, consumers, and potential employees. From a social perspective, by creating open source software you’re adding to the Software Commons — the resources available to everyone. This is just nice. Plenty of people want their software to be open source, and they especially want it to be open source on their terms.

In some contexts and for some people, ethical technology is nearly synonymous with free/open technology — or it is a prerequisite that a piece of technology be open source for it to be ethical.

There is also already a strong community around open source software. People consider themselves not just a member of a project’s community, but the open source community. By being part of the open source community, you are getting access to a lot of people and you are part of something. There’s a lot of value in that. It is understandable why proponents of Ethical Source licenses would want it to also be open source.

However, under the current circumstances, something simply cannot be open if there are restrictions to “ethical” cases.

Autonomy

I’ve been stuck on the question: Why is autonomy an ethical imperative? Or, worded another way: Why does autonomy matter? I think if we’re going to argue that free software matters (or if I am, anyway), there needs to be a point where we can answer why autonomy matters.

I’ve been thinking about this in the framing of technology and consent since the summer of 2018, when Karen Sandler and I spoke at HOPE and DebConf 18 on user and software freedom. Sitting with Karen before HOPE, I had a bit of a crisis of faith and lost track of why software freedom matters after I moved to the point that consent is necessary to our continued autonomy. But why does autonomy matter?

Autonomy matters because autonomy matters. It is the postulate on which I have built not only my arguments, but my entire world view. It is an idea instilled in us so deeply that all arguments about what should be a legal right are framed around it. The idea of autonomy is so fundamental to our society that we have been trained to have negative, sometimes physical, reactions to its loss. Pro-choice and anti-choice arguments both boil down to the question of respecting autonomy — but whose autonomy? Arguments against euthanasia come down to autonomy — questions of whether someone really has the agency to decide to die, versus concerns about autonomy being ignored and death being forced on a person. Even climate change is a question of autonomy — how can we be autonomous if we can’t even be?

Personal autonomy means we can consent; user freedom is a tool for consent; software freedom is a tool for user freedom; free software is a tool for software freedom. We can also think about this in reverse:

Free software is the reality of software freedom. Software freedom protects user freedom. User freedom enables consent. Consent is necessary to autonomy. Autonomy is essential. Autonomy is essential because autonomy is essential. And that’s enough.

Free software activities (November 2019)

November brings two things very demanding of my time: Thanksgiving and the start of fundraising season.

Free software activities (personal)

  • The Open Source Initiative had its twice-a-year face-to-face board meeting! Good times all around.
  • Debian is having a GR. I’ve been following the development of proposals and conversation, which is basically a part time job in and of itself.
  • Participated in Debian Community Team meetings.
  • I started drafting Bits from the Debian Community Team.
  • Wrote some blog posts! I liked them this month.
  • Wearing multiple hats, I attended SustainNYC, talking about sustainability in free and open source software.
  • I submitted to some CFPs — SCaLE, FOSSASIA, and OSCON.
  • I am serving on the papers committee for CopyLeftConf, and for this I reviewed proposals.

Free software activities (professional)

  • We launched a fundraiser! (About a patent infringement case)
  • Funding a legal case is an expensive proposition, so I am also meeting with companies and potential large donors interested in helping out with the case.
  • We launched another fundraiser! (About general Foundation activities)
  • I participated in the hiring process to fill two roles at the GNOME Foundation.