Don’t let the robot discard your resume: Applying for work in an automated world
Many companies use artificial intelligence systems to screen CVs. Here’s how to avoid their biases and exploit their weaknesses
Company hiring managers are integrating artificial intelligence (AI) systems to automate decisions on job applicants. “Job candidates are often unaware of these methods,” warns Cristina Colom, director of Digital Future Society, an agency from Spain that informs citizens about technology issues.
Applicant Tracking Systems (ATS) are one of the most widely used types of software in the human resources field. They offer specific solutions for recruiting personnel and cull applications “to leave a manageable number of resumes that meet a position’s requirements,” explains Carme Roselló, the coordinator of labor market projects at Barcelona Activa.
First and foremost, automation allows for “fast screening,” says recruitment consultant Juan González. Softgarden, the company he works for, currently processes 2 million applications a year for 40,000 recruiters. “There are companies that receive 400 candidates for one position. If you’re hiring for three jobs, how could you sift through 1,200 candidates at once?” he asks.
Technologies like Softgarden, a Software as a Service (SaaS) system that allows candidates and companies to connect to applications installed in the cloud, “streamline tasks, accomplishing in 30 minutes what used to take two hours,” says González. He also points out the importance of “respecting the digital rights” set forth in European data protection regulations.
The system “uses artificial intelligence to automatically collect data [by] searching for keywords in the text,” González notes. That’s why he recommends that candidates use a text-only resume, without columns or graphics, that has a simple format and lists experience, education and languages in reverse chronological order. “Applicants should avoid symbols, illustrations and diagrams because creativity works against us in ATS searches. The resume should be simple and use standard fonts like Arial or Verdana, because machines don’t read them all equally well,” Roselló adds.
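To make the mechanism concrete, here is a minimal sketch of the kind of keyword scan these systems perform. It is illustrative only: the keyword list and resume text are invented, and real ATS software is far more elaborate, but it shows why plain text and the posting’s exact terms matter.

```python
# Minimal sketch of an ATS-style keyword scan (illustrative only; real systems
# are more sophisticated). It assumes the resume has already been exported as
# plain text -- columns, graphics and unusual fonts are exactly what breaks
# this extraction step.
import re

JOB_KEYWORDS = {"python", "sql", "project management", "english"}  # hypothetical list

def keyword_matches(resume_text, keywords):
    """Return which job keywords literally appear in the resume text."""
    text = resume_text.lower()
    return {kw: re.search(r"\b" + re.escape(kw) + r"\b", text) is not None
            for kw in keywords}

resume = "Project management experience, fluent English, SQL and Python."
hits = keyword_matches(resume, JOB_KEYWORDS)
print(hits)                              # which terms were found
print(sum(hits.values()) / len(hits))    # crude match score a filter might threshold
```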
One must “make sure that the position’s keywords appear on the resume, because the algorithm works like a dictionary to match different descriptions,” advises Jaume Alemany, the co-founder of the Etalentum recruitment company. His company recently invested $497,650 to create the Robinson system, “a robot that can manage millions of data [points] and reduce the risk of automated discards.”
Keywords and employee proximity
Robinson’s algorithm is “based on information from more than 3,000 recruitment processes, so its learning system identifies the variables that companies consider most important.” Even with the rise in telecommuting since the pandemic, the first concern is employee proximity, Etalentum’s co-founder says. “Although it sounds ridiculous, it’s important to indicate the city where you live and where you want to work, if they’re not the same. For example, if you say that you work in [Los Angeles], but you want to move to [New York City] or take a job there, include the latter city, because otherwise the machine will discard your resume,” Softgarden consultant Juan González notes.
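As an illustration of the proximity rule González describes, a screening step might look something like the sketch below. The rule and data structure are hypothetical, not Etalentum’s actual Robinson logic; the point is simply that a resume that never mentions the job’s city can be dropped before a human ever sees it.

```python
# Hypothetical location screen in the spirit of the rule described above
# (not Etalentum's actual Robinson system): if the resume mentions neither
# the job's city nor a willingness to work there, it is discarded.
from dataclasses import dataclass, field

@dataclass
class Candidate:
    current_city: str
    target_cities: list = field(default_factory=list)  # cities the applicant explicitly lists

def passes_location_screen(candidate, job_city):
    """Keep the application only if the job's city appears on the resume."""
    mentioned = {candidate.current_city.lower()}
    mentioned.update(city.lower() for city in candidate.target_cities)
    return job_city.lower() in mentioned

# A candidate in Los Angeles who never mentions New York is filtered out;
# adding "New York" to the resume lets the same profile through.
print(passes_location_screen(Candidate("Los Angeles"), "New York"))                 # False
print(passes_location_screen(Candidate("Los Angeles", ["New York"]), "New York"))   # True
```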
Language can also be an important factor to bear in mind. “Depending on the position you’re applying for, it’s [generally] advisable to write your resume in two or three languages, especially in English, because there are machines that are better trained to read [that language],” Alemany points out.
Because “recruitment processes are increasingly automated, especially when applications are initially received and screened, we need to personalize and optimize [our] resumes according to the job description and emphasize the information that’s most relevant to the position,” says Nilton Navarro, the brand manager at Infojobs.
Carme Roselló notes that it’s essential for “a resume to contain the position’s keywords.” For that reason, she recommends researching the qualities that companies, or robots in this case, are seeking for a given position. “There are many ways to search for keywords,” she says. The most basic way is to analyze the terminology that companies use and find information about similar job profiles on networks like LinkedIn.
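One rough, do-it-yourself way to surface such keywords is to count the most frequent meaningful terms in the job posting itself and mirror them in the resume. The snippet below is a sketch under that assumption; the posting text and stop-word list are made up for the example.

```python
# Rough, do-it-yourself keyword research (illustrative; the posting and the
# stop-word list are invented): count the most frequent meaningful terms in a
# job description so they can be mirrored in the resume.
from collections import Counter
import re

STOP_WORDS = {"the", "and", "a", "an", "to", "of", "in", "for", "with", "will", "our"}

def top_terms(job_description, n=10):
    """Return the n most frequent non-stop-word terms in a job posting."""
    words = re.findall(r"[a-z']+", job_description.lower())
    meaningful = [w for w in words if w not in STOP_WORDS and len(w) > 2]
    return Counter(meaningful).most_common(n)

posting = """Project manager wanted. The project manager will lead data analysis
projects, coordinate data pipelines in SQL, and report analysis results in English."""
print(top_terms(posting))
# e.g. [('project', 2), ('manager', 2), ('data', 2), ('analysis', 2), ...]
```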
To make sure that your resume isn’t discarded even though you meet the job requirements, you can avail yourself of tools such as Resume Worded or Zipjob. They let you “upload your resume to perform a scan and see if it would pass the ATS filter,” Roselló says. She also mentions Google’s “very interesting” Interview Warmup, which is “a key interview training tool that gives you information about the words used most frequently in the sector you’re targeting”; she says using it is “a great practice.” The expert insists that it’s important to “stay up to date” on these tricks, because “technology advances very quickly.” She notes that it’s unfortunate that automated systems sometimes discard “good candidates” because of “something outside [the applicants’] control.”
The risk of algorithmic bias
The proliferation of smart technology and automated decision systems “can jeopardize some working people’s fundamental rights and increase the risk of discrimination based on gender, race, sexual orientation, religion or simply personal tastes ... that has an impact that goes beyond the digital world and can widen the social gap, especially among more vulnerable groups,” warns Digital Future Society’s director Cristina Colom.
However, she notes that “companies’ use of algorithms and artificial intelligence is here to stay, so the solution isn’t to eliminate the technology but to better control how it’s used and ensure that it protects labor rights and equal opportunities to workplace access.”
Colom says that it’s important to practice “technological humanism” from the get-go in designing these digital solutions. She advocates “integrating different visions to create them, with greater diversity in the design and programming teams in terms of gender, race and religion, as well as by discipline, broadening them to include areas such as psychology, anthropology and sociology, to ensure that the sources that feed AI aren’t biased.” Finally, she emphasizes that “technology can help us to be more efficient and effective, but it can never fully replace people.”