
All the signs were there, but we didn’t understand them: an essay on gatekeeping in IT


This is the first in a series of posts about gatekeeping in Information Technology and other fields.

I am not the first — nor will I be the last — to say that the terminology we use in the technology sphere is problematic.

Language and gender bias

Let’s look at several tweets collected in the last month. We’ll start with Michele Cynowicz imploring people to use realistic examples in code tutorials:

Here’s my response, which is laced with snark, because I believe that a number of the words we use by convention need to be phased out, especially if they carry negative connotations:

Here we have Karen Lopez, a community leader and expert in the data world, making what I think is a very simple request:

This is an interesting one. Anyone, however they identify, should be able to express their gender without being forced into a binary system of male or female. However, many database systems are (and continue to be) designed with a Boolean gender column, which only provides for two states: true and false.

In the original draft of this post, I suggested that a Boolean value might favour “1” or “True” for Male, meaning that “0” or “False” reflected Female. I deleted that because I thought it was too much of a stretch.

Then this happened:

For the nitpickers who might suggest that a nullable Boolean allows for three states, namely true, false, and null, this is not a solution to the problem where gender and sex (which are separate things) are being forced into a binary system. Null means unknown, and forcing everyone outside of the binary to use that is also problematic.
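The point above can be made concrete with a small sketch. This is my own illustration (the table and column names are invented for the example, not taken from any real system): a nullable Boolean column physically cannot represent anyone outside the binary, while a free-text column lets people describe themselves and reserves NULL for its actual meaning, "not provided".

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The problematic design: a (nullable) Boolean can only hold true, false, or NULL.
conn.execute(
    "CREATE TABLE person_v1 (name TEXT, is_male INTEGER CHECK (is_male IN (0, 1)))"
)

# A more inclusive design: store gender as text (or a reference to a
# user-extensible lookup table), so the schema doesn't force a binary choice.
conn.execute("CREATE TABLE person_v2 (name TEXT, gender TEXT)")
conn.execute("INSERT INTO person_v2 VALUES (?, ?)", ("Alex", "non-binary"))
# NULL now genuinely means "not provided", rather than "everyone else".
conn.execute("INSERT INTO person_v2 VALUES (?, ?)", ("Sam", None))

row = conn.execute("SELECT gender FROM person_v2 WHERE name = 'Alex'").fetchone()
print(row[0])  # non-binary
```

The same idea applies in any relational database; SQLite is used here only because it ships with Python.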

And finally, Markus Winand pointing out that the terminology we use in Information Technology is opaque:

This is the crux of this post: we use jargon that is unclear. I try to take care in my writing to assume nothing about my audience except that they can read English. When I’ve been called out for glossing over a concept, I like to go back and make sure I’ve covered it properly. Opacity is the primary tool of gatekeeping.

In the right context, gatekeeping has a useful role. Requiring certifications and licenses to operate vehicles on public roads is a good example. Requiring lawyers and accountants to pass board exams is another. However, gatekeeping becomes a problem for access to career and learning opportunities in the modern Internet age, where information wants to be free. In other words, despite a wealth of opportunities to learn about programming, database administration, information security, writing, filmmaking, or acting (to name some of my own interests), gatekeepers prevent newcomers from entering a field unless those newcomers fit a particular ideal of who should have access. Gatekeeping is a form of profiling.

In Information Technology alone there are hundreds — if not thousands — of examples on Twitter, StackOverflow and elsewhere, of people complaining that they don’t understand the terminology, or that the terminology is being used as a gatekeeper. Coincidentally, these people appear to be predominantly women, or non-white, or informally educated (if not all three). It seems as if computer science has built jargon-filled walls around the field to keep people out.

So it should not be a surprise that people think terminology must evolve along with human language, especially given how fast language and technology are changing thanks to the Internet. Just as dictionaries are descriptive, not prescriptive, so too should our terminology evolve to make the field more accessible.

That burden is on us, the well-established IT professionals. We need to discuss the words we use, and understand why they matter. If they are problematic or too abstract, we must change them to be more inclusive and relatable, and therefore accessible.


My friend Janie Larson (blog | Twitter) wrote an essay a few years ago about how technical interviews are designed as gatekeepers (there’s that word again). The Algorithms of Discrimination is a damning indictment of interviews for technical positions, and is well worth your time to read. The takeaway quote for me is this:

Does the test honestly reflect this person’s potential?

In other words, are we interviewing for people who look and sound like us, who know all the right words that we learned in computer science school, who know how to pass? Or, are we testing the candidate’s actual potential?

Which one of these is better suited for working in your organization? Study (MIT) after study (Kellogg Insight) shows that diversity and inclusion (D&I) improves an organization’s revenue, decision-making, and innovation, by hiring people from diverse backgrounds, listening to what they have to say, and incorporating their feedback. In other words, you will succeed by picking people who don’t look or sound like you, and giving them an opportunity to express themselves.

Speaking personally, I want to hire the person who brings a different life experience, who is willing to learn, who will grow with the organization, and who collaborates well. No whiteboard algorithm recital is going to tell me that.

I’ve personally been in several interviews where my technical knowledge was measured. I had to write code, design database tables, draw on a whiteboard, and so on. The vast majority of these interviews were a waste of my time and the interviewers’. In almost every case I could have looked up the information on my favourite search engine and completed the tasks in half the time, more efficiently, and with better error checking. Interviews are stressful, and the wrong environment in which to test problem-solving ability.

When I was asked about Big O notation, for instance, I told the interviewer flat out that I had no idea why it was relevant, because this was a problem that had already been solved.

The first paragraph on the Wikipedia page for Big O notation says:

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. It is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann–Landau notation or asymptotic notation.

… What?

Tell me how that is clear or meaningful to anyone trying to study for a whiteboard-style interview who hasn’t done a degree in computer science or tertiary-level mathematics. How does this have any relevance to writing code to display text on a webpage, or retrieve data from a database?

It doesn’t, and that’s why it’s a gatekeeper. It prevents people who lack access to (or any need for) tertiary education, yet are more than capable of learning on the job, from applying. Think back to your current job, and how little you knew about the environment when you joined.
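Ironically, the idea buried under that Wikipedia prose is simple enough to show in a few lines of code. This is my own plain-language sketch, not the post’s interview question: Big O just describes how the amount of work grows as the input grows. Scanning a list for a value takes more steps the longer the list gets (O(n)), while a set lookup stays roughly constant (O(1)).

```python
def contains_linear(items, target):
    """O(n): in the worst case we check every item, so steps grow with len(items)."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            return True, steps
    return False, steps

data = list(range(1_000))

# The target is at the very end, so we take one step per item: 1,000 steps.
found, steps = contains_linear(data, 999)
print(found, steps)  # True 1000

# The same membership test against a set is O(1) on average: roughly the
# same amount of work no matter how large the collection grows.
lookup = set(data)
print(999 in lookup)  # True
```

None of this requires a computer science degree, which is rather the point: if the concept can be explained this simply, the jargon-heavy framing is a choice.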

In the next few weeks we’ll look at how the language and terminology we use matters, how we can be more inclusive in our words and actions, and talk in depth about making technology more accessible. Stay tuned!

Share your thoughts in the comments below.

Photo by Maria Krisanova on Unsplash.

3 thoughts on “All the signs were there, but we didn’t understand them: an essay on gatekeeping in IT”

  1. The impenetrability of Wikipedia’s math-related articles is well known and hurts everyone.

  2. “In the original draft of this post, I suggested that a Boolean value might favour “1” or “True” reflecting Male, meaning that “0” or “False” reflected Female.”

    Not trying to read too far into the post, but when I first started working with datasets I wondered why Male / Female was coded 1/2 instead of say, 0/1. Then, I realized that both 1 and 2 reflect as logical true values for selection whereas 0 codes as logically false.

  1. PeopleSoft (pre-Oracle, when I worked on their HRMS implementations) used to default gender drop-downs to Female, which was a nice touch. It still stored a binary value, but it was a start.
