Psychopathic Algorithms

Recruitment in an Age of Data

I have never enjoyed the recruitment process. What I felt was not excitement; it was compulsion, perhaps even addiction. The hunt stirred something predatory in me. I crafted the perfect application bait, colouring it with strategic stripes of war paint, designed to draw the reader’s eye to specific areas. It was a psychological dance of glitter and performance, dressed in company values, tailored to seduce a system I did not respect.

I recognised that the recruitment system itself was psychopathic: cold, mechanical, stripped of all humanity. It rewarded detachment and punished vulnerability. To succeed in the system, I learned how to soothe the recruiters’ complexes and mirror their vanity. I became calculating, tactical, psychological. I found myself sharpening the very traits I disliked in others, simply because they got me results. It was a game that forced me to amputate the parts of myself that did not serve the system. I became less human.

By 2010, employment filtering software was becoming more common, and I evolved with it. I responded by matching its energy, experimenting, tweaking, fine-tuning, and even trolling it on occasion. Each application became a test subject. Each response, or lack of one, fed my growing understanding of what the machine wanted. I studied my field’s language like code and its tone like camouflage. The centralisation of desirable roles on Seek widened my field of play, giving me a panoramic view of the job market. I began to decode the subtle correlation between a company’s advertised values and the real corporate culture lurking beneath, and the price they offered and demanded. Over time, I could smell the psychological profile they were fishing for, and I shamelessly served it to them, always remaining compartmentalised and indifferent to rejection. Rinse, repeat, apply became my side hustle.

Within a brief time, I stopped feeling anything about it. Rejection became data; no responses became time savers. I compartmentalised myself so effectively that my inner life became unreachable during the process. Authentic emotions were a liability, so I buried them. When I was overlooked, it was not personal; it was formula misalignment. If I was chosen, it meant I had nailed the formula. The detached mindset made me more resilient. I could pivot fast, bounce back faster and adapt, qualities that looked like strength from the outside. However, beneath it all, there was a strange emptiness. I was not growing; I was cocooning.

Still, the discipline it bred in me was real. I learned to analyse without sentiment, perform under pressure devoid of stress, and maintain a surgical clarity most people do not develop until much later in life, if at all. It laid the groundwork for a psychological toughness I cannot unlearn.

The Rise of Algorithmic Control

At the time, the public hadn’t quite caught on. They were still playing by the old rules. Many still do, submitting hopeful applications, then waiting, obsessively checking their phones for missed calls, mistaking silence for personal failure. But the reality was that they were already submitting to algorithmic filters, coded to favour the spiritually vacant, the predictable, the fluent in corporate doublespeak. Filters designed to reward compliance and output over originality, carefully screening out anyone who brings depth or challenge.

By the 2020s, I had retired from this toxic game, or so I thought. Circumstances, however, forced me back in, stepping into a landscape even more soulless than before. Selection had become a purely mechanical affair. Recruitment was no longer a conversation between people but a transaction between two algorithms or two screens. Just yesterday, I was directed to an interview designed to filter candidates using a bot. I seemed to have missed the exact point where recruitment stopped being a process and became a closed loop: a system perfecting itself until nothing unexpected could break through. The only way to pass through was to strip oneself of all opinions and emotions, suppressing human characteristics such as intuition and even humour.

This hyper-mechanisation does more than just change the process; it trains people to become less human. The relentless demand for perfection, error-free performance and data-driven optimisation turns imperfection into a liability rather than a natural part of growth. As a result, candidates grow increasingly insecure about showing vulnerability or making mistakes, knowing that every slip is recorded.

Such insecurity breeds dependence: dependence on scripts, rehearsed answers and AI tools that perfect résumés and cover letters. The very systems designed to reduce bias and increase efficiency are amplifying the fear of imperfection, pushing candidates to surrender more of their humanity and individuality to algorithms. In this way, AI is not just a filter; it is a social engineer conditioning us to distrust our instincts and values. Forget body image; this goes further, creating a full-blown identity crisis. Yet feminism, unions and governments remain silent on the psychological effects of identity suppression, failing to apply any pressure on the corporate world. Diversity has never been more restricted.

Much like the beauty industry, with its endless cycle of plastic surgery and concoctions built on dishonest promises and superstition, these systems prey on our self-doubt and dependence. They convince us that perfection is attainable through external means if we surrender our uniqueness. Just as people sacrifice their natural appearance to fit impossible ideals, candidates abandon their authentic voices to appease algorithms. Both systems profit from cultivating insecurities, pushing people into ever deeper reliance on what is deemed beautiful or successful. The result? A work culture of unsettling uniformity, with depression and anxiety on the rise.

Yes, the democratisation of AI has levelled the playing field. Candidates now wield tools that let them reverse-engineer job ads, craft applications with ruthless precision, mimic the system’s mechanics, and even deliver an AI-prompted, rehearsed smile, perfectly timed to sparkle and ding at just the right moment in the conversation to reinforce a key message. The power dynamic has shifted. But in levelling the field, we’ve also flattened ourselves.

The Gatekeepers

What follows is a recruitment landscape devoid of human texture. After countless cut, paste, and apply-button clicks, rejection no longer arrives with an awkward phone call or a strained apology. Instead, it comes via auto-generated emails. At least throughout the 2000s and 2010s, recruiters still hid behind the ‘cultural fit’ excuse, a conveniently ambiguous and weaponised criterion that allowed employers to reject candidates without accountability. But, as manipulative as it was, ‘cultural fit’ at least exposed the insecurities embedded in a workplace’s psyche. Today, even that faint projection has been replaced by the sterile, algorithmic finality of an ‘Employer Unlikely to Proceed with Your Application’ digital post-it note pinned to your automated application graveyard.

With AI-authored job applications filtered by HR AI, and interviews orchestrated by HR AI bots following AI-guided questions, the whole process becomes fully automated and the role of HR faces increasing obsolescence. Many sensitive functions, once performed internally, are already being outsourced to elite agencies that possess specialised expertise and broader professional networks. This shift is driving a burgeoning market that continues to expand steadily, reshaping how organisations manage recruitment and administrative processes.

What about maintaining or enhancing a corporate culture? Most HR personnel are neither qualified for nor have any notion of how to socially engineer culture. If anything, their dabbling has the opposite effect, as they lack a comprehensive understanding of human psychology and group dynamics. Their insistence on manufactured rituals such as “Welcome to Country” opening statements, “Purple Pride Days” propaganda, diversity workshops and wheelchaired celebrity talks, or hiring a trophy trans, is more reminiscent of cult ritual, designed to mask the deeper realities of office life, alienation and compliance, and to keep employees focused on token acts of care rather than questioning the system itself.

You would be forgiven for asking why such a department has been allowed to pass itself off as corporate psychologists and anthropologists, given that it lacks the skills, qualifications and outcomes required of a People & Culture brand. This cloak is extremely advantageous to a company, because it also conceals the hidden work within HR: hiring the right kind of people into key positions, not just those who fit the job, but those who fit something else, something harder to name but always felt. HR’s cultural social-engineering ridiculousness provides the perfect distraction and cover. And despite how well the corporate fool’s glove fits, HR personnel appear to be lining themselves up to be squeezed out into agencies or redundancy within the decade.

There are advantages to this new system for companies too. Outsourcing not only helps erase accountability and keep complaints external, but it also consolidates power. As corporations offload most of their recruitment functions, they inevitably hand greater control to the digital platforms that now dominate the hiring landscape. These platforms do not merely connect candidates with jobs; they have become silent gatekeepers, controlling access, visibility and opportunity on a scale far beyond the reach of any single HR department.

Centralised platforms such as Seek, LinkedIn and AI-powered applicant tracking systems manage your CV, record the number of job applications you submit, monitor your digital activity and track your patterns of rejection or success. Some also store your ID and work visa details. Your application history becomes a dossier: a digital record scrutinised by machines to predict your fit. Recruitment has been transformed into a system of surveillance, empowering these platforms to cancel, ban, ghost or blacklist anyone deemed unfavourable.

More disturbing than the risk itself is the absence of any open discussion about the risk job platforms carry in passively filtering the job market on a national scale. There doesn’t seem to be any awareness of the possibility that such platforms could pose a serious national security threat. It is rarely acknowledged that a single platform holds the power to exclude an individual, or an entire group, from the job market, or to promote them within it. The threat of foreign interference in a nation’s labour ecosystem is both plausible and deeply concerning. These platforms do not merely filter candidates; they shape the talent pools of entire industries. With so much control centralised in the hands of a few private entities, the risk of deliberate manipulation, whether by corporate interests, hostile actors or state-level forces, is no longer theoretical. It represents a structural vulnerability that no one is willing to address.

What has happened to the job market, and the direction in which it is evolving, is not simply about recruitment favouring the predictable. It reveals something far deeper: how readily individuals surrender their agency, even their very identity, to systems they neither question nor control, all for the uncertain promise of a stipend. The pressing question is whether society will continue to participate in a game that is, by design, rigged.

Yet the most confronting truth is not just the dehumanisation itself, but the unsettling realisation of how willingly, and even skilfully, we have played along. Perhaps the darkest part of this story is that we’ve been hunting ourselves all along, allowing a system engineered to erase one’s shape to push us toward invisibility. And so, in the face of this relentless erasure, the most radical act we can undertake is to look in the mirror and refuse to disappear.

Annabelle Fearn