September 27, 2021 | By Chris Miciek
TAGS: technology, diversity and inclusion, trends and predictions, strategic planning, member voices
It would stand us all in good stead to remember that infatuation with high-tech social sorting emerges most aggressively in countries riven by severe inequality and governed by totalitarians.
– Virginia Eubanks, “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor”
How can we engage constructively with vendors and tech partners in the career development and hiring space?
A major concern that has emerged in the recent artificial intelligence/machine learning (AI/ML) push is ethics washing—companies making public statements about how they are developing and/or using technology in ethical ways without taking substantial steps to actually ensure ethical development of tools. Like greenwashing, this practice allows organizations with opaque processes and operations to assure their audiences they are not bad actors but are contributing to societal good. Often, however, the only evidence of this is website pledges and PR announcements. Is such signaling to be trusted?
Rational discrimination does not require class or racial hatred, or even unconscious bias, to operate. It only requires ignoring bias that already exists. When automated decision-making tools are not built to explicitly dismantle structural inequalities, their speed and scale intensifies them.
– Eubanks, p. 190
Note that the workforces of many tech firms are not representative of the U.S. population; this can undermine attempts to create digital tools that are unbiased and nonharmful. In the wake of the protests following the murder of George Floyd and others in 2020, many tech companies, in an effort to include more voices in the development process, pledged to dedicate resources to diversifying their workforces. How are these companies doing in keeping those promises? A recent review by Blendoor found that most were not meeting their pledges, and some were even losing ground.
Of course, there are companies that are doing good. How do we identify them? And how do we press the others who want to work with us to do better?
First, those of us working in career development roles need to acknowledge and accept responsibility for trying to fuse competing values into our work and the processes we teach and manage. Hiring, like the rest of modern business, takes its cues from Frederick Winslow Taylor, whose scientific management work from the late 19th and early 20th centuries has informed much of management science since. Efficiency holds the place of primary virtue. In Taylorism, workers are not people but organic parts of the machine; they cannot be trusted and must be managed, studied, and tweaked to improve productivity. Taylor's work is ultimately dehumanizing, at least for employees, yet it sits at the heart of business thinking and is integral to how practitioners in fields like industrial and organizational psychology are taught and approach their work. It is the unexamined and unchallenged fundamental, and it dominates our contexts.
Contrast the values of Taylorism with what dominates career development. For those of us who work with people to help them grow their careers, our values tend to be centered on the human. Our one-on-one interactions, programming, and planning focus on the student and how to reach them and help them grow. Why? So they can get jobs and improve our metrics, sure, but more so that they may find increased access to opportunities and live more fulfilling lives. Beyond that, as a field and association we were quick to speak out on behalf of DACA students. We decry, resist, and root out the ways biases like racism become systematized. We create safe spaces and learn about microaggressions. Then we reform and update workshops, publications, and language to advance these efforts. This is all human-centered work that reflects our belief that people can grow and change. But this work and these values sit in tension with the Taylor-shaped environments that dominate the workplace. We contend with and even pursue efficiency, but only up to a point; it stands to the side, a consideration that may help us reach a few more students in the face of tight budgets and limited staffing.
A dangerous form of magical thinking often accompanies new technological developments, a curious assurance that a revolution in our tools inevitably wipes the slate of the past clean.
– Eubanks, p. 183
A similar debate has emerged during the pandemic around learning management systems and online proctoring software, and the assumptions built into them: that students are neurotypical and that a camera watching them take an exam adds no pressure.
For career development, the specifics are a bit different, but the underlying issues remain. If the hiring system has long been problematic, are we solving its problems by adding layers of technology? Simply saying a system or tool reduces bias doesn’t make it so. And such claims without evidence should be suspect when development teams and auditors are not diverse. In a recent piece for EDRi titled “Beyond Debiasing: Regulating AI and Its Inequalities,” the authors point to the hazards of framing AI bias as a narrowly technical problem while ignoring the broader social and organizational factors that create the bias in the first place.
How can we maintain the human-centered aspect of our work? One way is to model the Career Readiness Competencies we put in front of students. Specifically, we can exercise the “Critical Thinking” and “Equity & Inclusion” competencies when engaging with technology. We can assess the ethical and downstream impacts of our technology choices for our offices and operations. What does our use of these technologies teach students and applicants about their value and how we—our offices and institutions—value them (or don’t)? Are students worth our time? Are we teaching students to interact with us and the world in ways that are human-centered? Or are we ultimately fostering a dependency on these technologies?
New technologies develop momentum as they are integrated into institutions. As they mature, they become increasingly difficult to challenge, redirect, or uproot.
– Eubanks, p. 187
To help, below are questions to consider as starting points for internal conversations and for use with current or prospective vendors.
The window for us to engage deeply and shape a future we want—one that is healthy for our students—is closing. Technology has a place in that future, but whether the tools we use truly promote good or reinforce a damaging status quo depends on the choices we make now. Will we master an understanding of technology in order to use it well, as we teach students to do through the career readiness competencies? Or will we consign our students and ourselves to a future as cogs in a machine?
Chris Miciek is the director of the Career Development Center at Thomas Jefferson University. In 2002, while at Baker College Online, he began building the first 100 percent online career center in the United States, becoming a pioneer in leveraging online technology and social media for delivering career development advising and instruction. He earned a B.A. (Calvin College) and M.A. (Wesleyan University) in psychology. He can be reached at email@example.com.