The Breakthrough Hiring Show: Recruiting and Talent Acquisition Conversations

EP 150: AI solutions for modern hiring with Brainner’s Co-Founders, Fede Grinblat and Guillermo Gette

September 12, 2024 | James Mackey

James Mackey, CEO of leading RPO provider SecureVision, and Elijah Elkins, CEO of Avoda, a highly rated global recruiting firm, host Brainner's Co-Founders, Fede Grinblat and Guillermo Gette, in our special series on AI for Hiring.

They share their journey in developing an AI-powered resume screening tool designed to boost hiring efficiency. They discuss how Brainner helps recruiters make faster, fairer decisions by extracting key information from resumes and analyzing it based on predefined criteria. With features like dynamic weighting and ATS integration, the tool is tailored for talent acquisition managers and tech companies.

0:00 Brainner’s generative AI for resume screening

22:22 Developing an AI for criteria-based CV screening

40:47 Enhancing ATS candidate filtering and analysis

56:28 Navigating bias in AI screening





Thank you to our sponsor, SecureVision, for making this show possible!


Our host: James Mackey

Follow us:
https://www.linkedin.com/company/82436841/

#1 Rated Embedded Recruitment Firm on G2!
https://www.g2.com/products/securevision/reviews

Thanks for listening!


Speaker 1:

Welcome to the Breakthrough Hiring Show. I'm your host, James Mackey, and I've got my co-host, Elijah Elkins, with us today. How are you doing, Elijah? I'm doing well, excited to be here. All right, good stuff. We also have Fede and Guillermo, the co-founders of Brainner. We're really excited to have you both with us today. Thanks for joining us.

Speaker 2:

Thank you for having us, guys. We are very excited to sit here and discuss what we're doing, and also just to discuss recruitment in general. I think it's a great space you guys created to discuss these topics. Thank you for having us.

Speaker 3:

Thanks for the invitation.

Speaker 1:

Yeah, it's great to have you both here. Just to start us off, we'd like to hear a little bit about the founding story and, essentially, what problem you were looking to solve when you started the business. Guillermo, Fede, who would like to answer that one?

Speaker 2:

I'll go ahead, I'll go ahead. Yeah, okay. Everyone, this is

Speaker 1:

Guillermo speaking, just so everybody knows. There we go, so we can get used to your accent

Speaker 2:

as we move forward. This is Guillermo speaking. Yeah, so we started about a year ago. The three of us, the three founders of the business, have been friends since high school, so we know each other very well. Each of us had our own sort of businesses before, and life brought us again to a situation where we were looking to start a business. This problem of resume screening was already on our radar, and then there was also the rise of AI.

Speaker 2:

I come from a technical background, very interested in the whole AI space, everything that's happened in AI, all the benefits and all the tools and things you can use. And, I think while discussing with a friend, I can't remember exactly at what point, we said: well, yeah, this is the idea. But it was just an exploration of where AI can help businesses and where it can really make a big difference. AI seems like it can do everything, but if you bring it down to the basics, where can it bring value today? So what did we do?

Speaker 2:

We actually spoke with probably 50 different recruiters, talent acquisition managers, hiring managers, and CEOs. We've been working in the industry for 10 to 15 years, so we have a lot of contacts between the three of us, and we spent a long time just talking with people. That's really how we got started, and that's what I'd call our starting point. Once we had a really good understanding of the problem, we started building an MVP of possible solutions, and then slowly we grew into what Brainner is today. But I would say the main founding story is that it was all built by talking with people. We had an idea of what we wanted to build, but we also wanted to make sure it was aligned with how people are working today and how they are using these tools today.

Speaker 1:

Yeah, I appreciate the background there, and so let's dive into the primary value proposition from the top. What problem did you decide to initially tackle with generative AI?

Speaker 2:

Yeah, so the first problem was very simple: it was just about resume screening. How can we speed it up? The first time someone brought this up, it was: I have a lot of candidates in my pipeline and I'm not able to go through all of them. Can we use AI to at least help me extract some information from the resume, so it can help me make my decision faster? That's how the problem started. Recruiters are spending, depending on who you talk to, maybe a couple of hours per day just sitting in their ATS, opening one candidate at a time and looking at the resume, maybe doing a scorecard or advancing or tagging. There's some sort of very manual process of reviewing resumes. So with Brainner, we created this product, we'll get into more details later, but basically it helps you screen resumes faster and smarter.

Speaker 2:

And this process we're looking into, the resume screening process, is the top of your funnel. You've got a new open position. If you publish that open position, you're probably going to get hundreds of applicants, probably in the first 24 to 48 hours from the moment you publish the job. There's a stat, I think from Ashby, the ATS, that says that between 2021 and 2024 there have been three times more applicants than we used to have. That volume of applicants directly affects your cost to hire, your time to hire, your accuracy. If you have more candidates and you don't have enough resources, you're not going to be able to cover all of them. Maybe you get a hundred applicants and, in the first 20 that apply, you already find someone you like, and then the other 80 may not get the same level of attention. You will try, but this is the problem: we found these two people.

Speaker 2:

Do we keep looking for more talent or not? Do we stay with these? What our solution does is make the process fairer as well, because it actually does an analysis of every resume according to your own criteria, and it just gives you a report. It doesn't tell you who's going to be the best candidate. It doesn't tell you who you should hire. It's only going to extract the information you care about for each candidate, so you then have a report that allows you to pick your top candidates much faster than going one by one.

Speaker 2:

So the main value is: how can we spend less time reading resumes and more time talking to candidates? Are we doing more interviews if necessary? Are we doing more phone screenings? That gets you to the first interview faster, which makes you hire faster and reduces the time it takes to hire. So we are focusing on that first part, which is resume screening, and making sure that as a company you don't miss anyone, and also that, as a candidate, each candidate gets a fair chance as well.

Speaker 1:

Okay, yeah, that sounds really good. I think everybody would like to hear more about your customer base. I'm sure, as an early-stage company, maybe it's evolving, but I'd love to get a sense of the early adopters, who you're seeing come on as initial customers, and whether since then you've dialed in on an ideal customer profile or you're going wide. But I'd be curious to hear, specifically for resume screening: are you seeing customers that do a lot of high-volume hiring? Are you seeing specific industries or types of roles? I'd love to get some insight there from you.

Speaker 3:

We have three different kinds of clients currently. We have a lot of traditional companies, like healthcare companies or banks, for example, but also tech companies. We're also working with some RPO agencies, and we're working with staff augmentation companies as well. So it's a really wide range of companies we're working with nowadays. But one thing we've found is that small companies don't need this kind of system, because they don't have a high volume of resume applications. That's something I can tell you.

Speaker 2:

Yeah, for sure. And I think the other way we approached it is, when we first started, when we first built our MVP, we didn't really have integrations with an ATS. We were just like: oh, just upload your resumes here and we're going to do the analysis for you. You can upload a hundred resumes and in a couple of minutes you'll have your analysis, your report, done.

Speaker 2:

When we started talking to customers, of course, the customers you want to look at, the ones with many open roles and a high volume of applicants, are companies that use an ATS. If you have even two or three open positions, you probably need an ATS already. So we started focusing on customers that are using an ATS, and that already narrows down the market, because using an ATS already sets a profile. And we started with Lever as our first ATS. So then we went further: okay, let's just focus on customers that are using Lever as their ATS and have at least a couple of current open positions. That kind of thing really helps you narrow down your customer base.

Speaker 2:

And the good thing is, all this information is public as well, so it becomes quite easy to spot a company and think: this company could benefit from Brainner. And we can see that, because when we reach out to those companies, we actually get a really high response rate, because you're talking their language. They're like: yeah, I'm dealing with this problem. Yes, I am spending a long time reading resumes. What do you have to say? So the better you get at finding those companies, the better the chances that people are actually interested in what you have to say.

Speaker 1:

I'm curious. Elijah, we had, I think it was the founder and CEO of Qual, right? They were doing voice AI screening calls, so it was the top-of-funnel initial touchpoint, typically, sometimes the second touchpoint, but it sounded like it was typically being used at the top of the funnel. Essentially, as soon as somebody would apply, they would do this voice AI screening call, so it was higher volume. And, correct me if I'm wrong here, my memory isn't serving me properly, but it sounds like he had a fair amount of customers in staff augmentation and the contingent workforce space, a lot within light industrial, right? Is that right?

Speaker 4:

Okay, yeah. I can't remember if he was saying everyone actually gets the link to use Qual, or if it would be something like: Brainner is used to reduce, let's say, 3,000 applications down to 2,000, and then those 2,000 are sent the Qual link to go through an automated kind of screening.

Speaker 1:

Yeah, I don't know. I think the way the CEO was describing it was basically that for that industry, for light industrial, particularly for the RPO firms and contingent workforce firms that service it, a lot of the time folks in those industries don't do a really good job putting together their resumes.

Speaker 1:

So he's like: all right, we've got to screen these people so we can actually decide who it makes sense to set up a call with. So I guess the reason I'm going down this little side path is I'm wondering, from a staffing perspective, whether it's staffing agencies, RPOs, or companies: is it similar in the sense that you're getting a lot of the very high-volume types of roles? It might not necessarily just be engineering openings at a growth-stage tech company; maybe you're seeing more adoption from companies in light industrial, like Qual's customers, or maybe healthcare, different spaces where there's just a ton of applicants and maybe similar challenges to what Qual is solving, obviously from a different approach. I'm just wondering what you're seeing more of right now.

Speaker 2:

I would say, if I had to pick one industry that is most prominent in our customer base, it's the tech industry; so far that's been the majority of our cases. When we go into demos, there's always some sort of technical position they're working on. In many cases they're a remote business, so they hire from an even bigger pool of people. That's why tech positions get a lot of applicants, and they're also the ones that sometimes require an extra level of knowledge to actually do a resume screening. You have to be a technical talent acquisition person, a technical recruiter, because you need to understand AWS or Google Cloud or Azure, and you have to have the language and understand what they're looking for: Python, JavaScript, or Node. The good thing about using something like Brainner is that the AI has that knowledge. So if you decide: okay, this is my position, I'm hiring for a technical role and I'm looking for someone with at least five years of experience in Python, the AI can say this candidate has it and that candidate doesn't. That's how AI helps even recruiters navigate these terms.

Speaker 2:

Then the other day, we had a customer that, for example, was looking for a facilities manager, so they were looking for architects, and we got a lot of architects applying. We had a challenge where many of those architects were actually uploading resumes as images. They had these very nicely designed image resumes, and it was harder for our system to parse them, so we had to improve the system. I'm telling you these details because every industry comes with different challenges. Some people may not prepare their resume very well; some people may prepare their resume too well.

Speaker 1:

That's the other big issue, right.

Speaker 2:

That's the other part, where you get a three-page resume with all these images and all these things; it's their way of expressing themselves in the resume, right? Whatever they decide to do, that's okay, but the system has to consider those cases as well. And then, yes, in the healthcare industry, I think this is the other one where there's a lot of rotation: nurses and healthcare practitioners. They need to be hiring continuously, so they have these positions that never start and finish; they're continuously looking. They say: hey, if you can do this, we want to talk to you. Those jobs get a lot of applicants as well. And then we had another customer, an RPO customer based in the Netherlands, and they actually use our API and embedded our process in their own proprietary system. So they use our API to connect to their system and do the resume screening directly in their own process.

Speaker 2:

And that was another challenge for us. Another thing: okay, we need an API, we're going to build an API. And then, you're based in Europe, so now we have to have European servers and GDPR and all these sorts of compliance things. But those are growing pains; those are the things that push you to make your product better and bigger. So it is a challenge. Again, there's no one type of customer. I think it's part of the process right now: we're trying to refine and fine-tune our targeting continuously and constantly. So far we try to go for a vertical, see what happens, get some response, and iterate like that.

Speaker 1:

Yeah, just so you guys know, too, the guests we've had on thus far: we just had the CEO of Workable on the show, the CEO of BrightHire, the CEO of Pillar, and then there's one other, Elijah, I'm blanking on the name of the other company. We've had enough podcasts in the series now that I'm starting to blank out a little bit. But anyways, what's been interesting is that, across the board, we're seeing companies go really wide. I don't know why, but I just had this assumption in my head: whether it's resume screening or AI note-taking or evaluation, packaging and organizing data, or whatever part of the workflow, I thought we were probably going to see more companies specializing a little bit in a specific industry or a specific role. But I guess now it makes a lot more sense, because the more I'm learning, the more I see that the way this technology is built, you can apply it to any role. But it's still just interesting, because we're hearing this from different CEOs.

Speaker 1:

Oh, we have customers all over the place. It doesn't seem like there's one industry per se that is leveraging AI-driven technology these days.

Speaker 2:

And I think it also has to do with the fact that when you're just getting started, you're not going to say no to a customer. So in many cases it's: oh, you want to talk to us? Or they come directly; they found us on LinkedIn or somewhere, they book a meeting, they tell us what they're doing [inaudible] faster, blah, blah, blah. Maybe later we'll narrow our approach, but at this stage it feels more like we are trying to discover where this works best. And the reality is, yes, there are many industries where this could work. Maybe we'll never really focus on one industry, and maybe we'll continue like this. At a certain point, especially for marketing efforts and content, you sometimes do need a narrower focus. So from a go-to-market perspective we may have a narrower focus, but in the back we'll be like: hey, if a customer comes

Speaker 1:

that doesn't fit, we can still serve them, for sure. Yeah, let's do it. That's the go-to-market. I don't know what you're seeing out there from a go-to-market strategy perspective, but that's another interesting question, right? Do you try to dial in on a specific segment, knowing you can basically service other areas, or do you just present as going wide from the beginning: we can help you with any role, company-wide? I'm seeing more companies just do that. It seems like they're trying not to even say which industry they're getting more traction in; they just really want to keep it as open as possible.

Speaker 1:

Right now, I think everybody's still trying to figure out whether traction is going to be consistent across different industries. I think another thing is that people want customers in technology, but after the past few years, founders don't only want customers in tech; they want more diversification outside of tech, just because of the volatility. At least for me, I'll just speak for myself: at SecureVision, my company has helped over 200 companies through our RPO solution. A lot of them are in the tech industry, and we're making a big push for diversification outside of tech.

Speaker 1:

A lot of our relationships are still in tech, but I want more customers in manufacturing, life sciences, banking and finance, construction, real estate. I'm really trying to push out as much as I can to diversify our customer base, so when things like the Silicon Valley Bank crash happen, we have customers in different segments. I'm curious, from a go-to-market perspective, what are you seeing out there? How are you positioning the company? Just curious to get your thoughts here.

Speaker 3:

Yes, we are always targeting talent acquisition managers. That's something we realized exploring the market, right? We started targeting recruiters, but we noticed that recruiters sometimes don't have the power to make a decision, or sometimes they're just screening resumes and don't want to go and push a new initiative or a new software. So a great discovery was that talent acquisition managers or recruiting managers were the ideal persona. And then, as Guille said, we prefer tech companies, of course, but we are always finding new kinds of companies. Staff augmentation companies are one example. Because we are from LATAM, we have a lot of contacts there, and United States companies are hiring developers from Argentina, from Colombia, from Uruguay. That's why we have a lot of staff augmentation companies: we have contacts at companies in Argentina that find those profiles, hire them in Argentina, and put them to work for the United States. We are finding new cases every day. That's the thing; we don't have something really narrow.

Speaker 1:

Yeah, the more you think about staff augmentation firms being early adopters of these more AI-native solutions, the more sense it makes, because they're always hiring. In any market, if they're in business, they're hiring. Even in a slower market, if they're a bigger firm, there's still going to be a high volume of hiring, particularly compared to corporate companies that are just in tech or whatever else. So I think it's going to be interesting too.

Speaker 1:

I think a lot of companies in the recruiting tech space that are rolling out new tech are probably going to have a decent segment that comes from that industry, just due to the need for efficiency gains. Particularly because these are services firms that have to hire a lot of headcount to deliver business. To the extent that they can be a little more optimized and cut out inefficiencies, that helps them move a lot faster, and not only to increase revenues. The reality is that services firms have to be incredibly focused on sustainability and margins, and be very careful. So my guess is that we're going to hear from more CEOs that some of the customers they're seeing right now are in that space.

Speaker 3:

Yes, and the staff augmentation companies are always hiring, and hiring for the same tech positions. So they have a big database of candidates already analyzed with Brainner, a lot of candidates that are a good fit that they can keep contacting and hiring.

Speaker 1:

Of course, yeah, for sure. Elijah, I feel like I've been talking my head off, man. What questions do you have?

Speaker 4:

Yeah, all good. So I'd love to rewind to when you were looking at starting the company. You mentioned you talked to, let's say, 50-plus recruiters and talent acquisition managers. What did you do next? Take us back to that time: how did you really get started building the AI, dipping into the technical side a little bit? I just think it'd be really interesting for our listeners to hear that firsthand experience of actually building AI for recruitment.

Speaker 2:

For sure. Yeah, thank you for the question. I think it's a great segue into how we did that. Once we had the feedback, I mean, we are three people, so it was mostly Fede and Ignacio having the calls. Some of the calls I would participate in as well, but I was in a dark room. They didn't let me leave that room; they made me code, and the only thing I could do was code. That's how we got things done. They would just slide me pizza under the door and I kept coding.

Speaker 2:

That's how we started building the software. No, but in a real sense, first of all we had to see if the technology could do it. So the first thing we did, I remember, we just went on ChatGPT and said: here's a resume, here's a job description, is it a good fit? Honestly, that's something anyone can do today. You don't even need to pay; you can do it with any sort of AI. You can just paste your resume, paste your job description, and the AI will actually give you a very decent analysis of that applicant versus that job description. So we were like: hey, this is something we can use. The technology is there.

Speaker 2:

Then we went back to the interviews and asked: hey, what do you think of this report? Do you find this useful if this report is automatically generated for every candidate? And they were like: oh yeah, that would be useful. So we said: okay, let's build a very simple product, a very simple UI, a very simple website, something we can show, because you need something to show and you need to test it out. You can have so many conversations, but people need to see things. We called this the MVP, your minimum viable product, and at that time it was just a simple UI where you drop your 150 resumes and it does the analysis. It would compare the job description with each resume and give you a report and a score from one to a hundred of how good the fit is. And we thought that was a killer product. We were like: oh, this is amazing, it gives you a score, it gives you a report. And then we started showing it.
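The first experiment Guillermo describes, pasting a resume and a job description into an LLM and asking for a fit analysis, essentially comes down to assembling one prompt. A minimal sketch of what such a prompt builder might look like (this is hypothetical, not Brainner's code; `ask_llm` is a placeholder for whatever chat-style LLM API you use, and the prompt wording is illustrative):

```python
def build_screening_prompt(resume_text, job_description):
    """Assemble a resume-vs-job-description screening prompt in the
    spirit of the early MVP experiment: ask for a per-requirement
    fit analysis, not a hire/no-hire verdict."""
    return (
        "You are assisting with resume screening.\n"
        "Compare the resume below against the job description.\n"
        "For each requirement in the job description, state whether the\n"
        "candidate meets it, citing evidence from the resume.\n"
        "Do not recommend hiring or rejecting the candidate.\n\n"
        f"JOB DESCRIPTION:\n{job_description}\n\n"
        f"RESUME:\n{resume_text}\n"
    )

prompt = build_screening_prompt(
    resume_text="5 years of Python at a fintech startup; AWS; led a team of 3.",
    job_description="Backend engineer. 5+ years Python. AWS experience required.",
)
# The prompt would then be sent to any chat LLM, e.g.:
#   report = ask_llm(prompt)   # ask_llm is a placeholder, not a real API
```

The key design point from the conversation survives even in this toy version: the model is asked to extract and compare information, never to make the hiring decision.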

Speaker 2:

And when we started showing it, the first thing that came to recruiters' minds was: but how do I control this? What is this report based on? Why is this score this score? There were a lot of questions about how the AI was doing the analysis. How was the AI deciding that this candidate gets a 70 versus the other candidate's 50? It was a bit of black-box magic that was happening. Initially we thought that would be great, but when we started talking to the people who would be using the software, that black-box magic is not something they want to see. They actually want more control over how that analysis is done and why, where the score is coming from. Ultimately, they want to set up their own logic, their own rules for the scoring. So we went back to the whiteboard: okay, what do we need to do here? And that's when we introduced it.

Speaker 2:

I think that was probably a tipping point, or a turning point, for our product: we introduced the concept of defining your criteria. So instead of a black-box sort of magic thing that happens and gives you a score, we say: okay, put in your job description, and from there we're going to extract the criteria, what you're really looking for in this candidate. A number of years of experience with a certain technology, having worked at a high-growth company, being based in a certain geographical location. You go for objective things, not subjective things or soft skills so much, but the things you'd put on a scorecard: has this candidate been working for more than five years? Has this candidate had experience with this technology? You list all your criteria, the analysis is done based on that criteria, and the AI will say whether the candidate meets each criterion partially, completely, or fails to meet it. So let's say five years of experience: this candidate partially meets that criterion, because he's got three years of experience. Has worked at a high-growth company: this candidate worked at Airbnb, so that's fully met.

Speaker 2:

So then you can imagine you start having this kind of green-yellow-red status per criterion, per candidate, and based on that you get the score. Hopefully that makes sense. You set up your job description, you set up your criteria, and then that criteria gets analyzed on every candidate. What you have at the end is a dashboard with all your candidates and this color coding. If a candidate has everything green, it means they meet all your criteria, and they will probably have a 100 score. But maybe you have a candidate that fits almost everything, with a couple of those criteria partial, so you have an 80 score.
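The per-criterion traffic-light scoring Guillermo describes can be sketched roughly as follows. This is a hypothetical simplification, not Brainner's actual implementation: the credit values, the uniform default weights, and the criterion names are all illustrative assumptions.

```python
# Credit given for each evaluation status: full = 1.0, partial = 0.5, fail = 0.0
STATUS_CREDIT = {"full": 1.0, "partial": 0.5, "fail": 0.0}

def score_candidate(evaluations, weights=None):
    """Combine per-criterion evaluations into a 0-100 score.

    evaluations: dict mapping criterion name -> "full" | "partial" | "fail"
    weights: optional dict mapping criterion name -> relative weight
             (uniform if omitted; a stand-in for the "dynamic weighting"
             feature mentioned in the show notes).
    """
    if weights is None:
        weights = {name: 1.0 for name in evaluations}
    total = sum(weights[name] for name in evaluations)
    earned = sum(weights[name] * STATUS_CREDIT[status]
                 for name, status in evaluations.items())
    return round(100 * earned / total)

# The example from the conversation: a candidate who only partially meets
# "5+ years of Python" (three years) but fully meets the rest.
evaluations = {
    "5+ years Python": "partial",
    "worked at a high-growth company": "full",   # e.g. Airbnb
    "based in required location": "full",
    "cloud experience (AWS/GCP/Azure)": "full",
}
print(score_candidate(evaluations))  # one partial out of four criteria -> 88
```

The point of structuring it this way is the one made in the conversation: every score decomposes into visible per-criterion judgments, so a recruiter can see exactly why a candidate scored 88 instead of 100.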

Speaker 2:

And when we did that, that's when people really started understanding what the software does and how useful it would be, and things really started moving in the right direction. It was a matter of not automating everything and not making everything so magic and so powerful, but actually taking a step back, giving more tools to the recruiter, and then letting the AI do a very objective analysis. The AI is not telling you whether it's the best candidate or not; it's just going to tell you this candidate has more than five years of experience in Python, and that's it. Then it's up to you to move that candidate ahead, et cetera, and set up the scoring. But that's something that, you know,

Speaker 2:

came from just talking with customers and potential customers and really understanding their processes and what they wanted to see. And I think that's a topic we can discuss: AI can do so much. It could even be an AI avatar talking right now instead of me; that technology is already possible. It's just that we are not really ready for that. We can't go from crawling to running. We have to crawl, then walk, and get there.

Speaker 1:

Okay. So I have a question about scoring. I brought up scoring with those founders, the CEO of Workable, and I briefly touched on it with BrightHire and Pillar. So Pillar, BrightHire, and Workable are all staying away from scoring. There's no match per se, no saying this is a 90 match based on criteria. They're not doing it in numbers, and they're not even doing any kind of stack rank, upper percentile, or best-fit batching. Their focus is on organizing the data, I think is how Nikos put it, and packaging it in an easy format for hiring teams to consume. Now, granted, these companies are focused down-funnel, so they're used when candidates are in the interview process.

Speaker 1:

But I specifically asked Nikos about scoring for top of funnel. You have a thousand applicants — it would be good to have some kind of matching criteria to evaluate and score them. And he basically — Elijah, do you remember what he said? I think he was getting into how difficult mistakes can happen, this and the other. He shared a lot of really brilliant things and I learned a lot from him, but I don't know if I totally understood or agreed on this part. Quite honestly, I'm a big advocate of showing relevancy — at a minimum an evaluation, not just presenting data.

Speaker 4:

But I think he might've meant an actual scorecard or evaluation perspective. If you go onto Workable's website — I think it's workable-ai at the end of the URL — they do have something called a screening assistant that uses AI for semantic matching against candidate resumes, and they do have a best-match AI where a candidate gets, say, a 40 or a 60.

Speaker 1:

They do, yeah. He didn't mention that at all, because I remember I asked him a few times.

Speaker 4:

He might've thought you meant an actual scorecard or evaluation, right? I don't know. Guillermo or Fede, how do you feel — can you speak to this? Yeah, yeah.

Speaker 2:

We can speak to this. Yeah, I can speak to this.

Speaker 4:

Yeah, yeah, we can speak to this. Yeah, please do.

Speaker 2:

Because this is what we do. We look at all of them, we try all of them — and because ATSs are partners we want to work with, we're not looking to compete with the ATS in any way. Some ATSs, like Workable — and I'm sure other ATSs will eventually develop some sort of technology; I'm not going to mention any names — do have these sorts of features. But when we talk to customers — this is not about any particular ATS — they do mention these features: "oh, they have this thing, they have that thing." And the normal feedback we get is: "Yeah, but we don't know how it works. We don't understand where it's coming from, how it's making this decision — why this is the best match and why the other one isn't." It's definitely helpful, because the suggestions from the ATS tend to be accurate: if they say it's the best match, it probably is the best match. That's not incorrect. The problem is: how are you deciding that this is the best match, and why are the others not? What's the logic behind that? And — not to be repetitive — that's why we introduced the whole criteria approach, and that's why we talk about scoring based on "do you fit this criterion, yes or no?" Then your score comes from how many of those criteria you meet. And that's it. I think for an ATS it's a much bigger product

Speaker 2:

development effort. One thing is to add a best match and an AI analysis; another thing is to introduce a whole criteria-definition program. So I think that's what's happening. Customers say: "Yeah, we see the AI feature, but we don't really get it, we don't really use it, we don't understand where this is coming from." So it helps.

Speaker 2:

But I think recruiters and talent acquisition managers want to take it one step further. When you do a Brainner analysis, it's not only about the best match. It's also about, say: don't just look at your top 10 — look at the 10 below that, who may be really good for your next position. You may be able to say, "Hey, this candidate has 10 years of development experience. He may not have this one little thing I need for this role, but he's going to be great for this other position." That's information the ATS is not going to give you — it's only going to tell you the best match.

Speaker 2:

But what about the second-best match? That is so powerful as well, because sometimes your second-best match is the person you end up hiring, and it's just a matter of a resume that didn't surface the information well. Those AI tools are based purely on what's written in the resume and what's in the job description. But what about information that wasn't in the job description yet was important to you? Where does that sit?

Speaker 2:

Again, that's where the criteria come in very handy: they allow you to add criteria that may not be in your job description but are still critical for your process. And I understand why they don't want to put a score, because when you put a score without explaining how it's calculated, it can be deceiving. So I understand why they end up moving away from scoring. Unless you give people the tools to control how the score is calculated — which is what we do — it's better to just offer a best match or recommendations.

Speaker 1:

Yeah, one of the things he kept coming back to is: "we don't want AI to make the decisions, we just want to organize the data." But I don't think assigning a score means the AI is making the decision — no one is saying the AI should make the decision. It's just that when you have a thousand-plus applicants, we need a starting place, and it's still organizing data. That's literally what it is. Scoring is still data; it's just presented in a format where we can get the best leverage on our time. I think scoring is incredibly important. Maybe not down-funnel, where you're comparing four candidates — you may not need as much analysis on fit, because a hiring manager can look at four people and the data can be packaged; that's a slightly different conversation. But when you're looking top of funnel with volume, to me that should be table stakes.

Speaker 2:

It's almost like taking an exam at school. They give you a score, and the only way your whole class has any relevancy to each other is that we all take the same test: we get 10 questions and then we see how many we got right. In this case we're not giving candidates a test, but in a way you're saying: "Hey, do you have this? Yes. Do you have this? No. All right, this is your score." It doesn't mean you're a better candidate or a better person — it's just that everyone is, in effect, taking the same test. That's what we're trying to do: organize the information and give you relevancy across every candidate. I think that's what you mentioned, and it's a great way to start. It's a matter of how you organize this information.

Speaker 1:

It's literally what junior recruiters or junior sourcers are doing — they're going through this same motion, probably a lot slower, probably missing more, looking at resumes to find the relevant ones to push along to the recruiter. Elijah, you look like you want to say something?

Speaker 4:

Yeah, I'm just curious. If your customers have defined a scorecard in their ATS, do they copy and paste what they were using for the scorecard into Brainner as the criteria, or is it different? I'm just curious whether they're actually using the scorecard. It seems like they should be, if they have one, right?

Speaker 3:

Exactly. It's similar — the same scorecard they're using. But we've studied this a lot and written a lot of guides on how to write good criteria. For example, quantify the criteria: it's not the same to put "experience in Python" versus "2+ years of experience in Python" versus "experience in Python in the last two roles." Each gives the criterion a different level of precision. Or, for example, we suggest they include things that weren't in the job description — say you're targeting the competition and you want people who are working at competitors. So we write a lot of guides on how to write good criteria, and yes, they're similar to the scorecard, but we include more than the scorecard does.

Speaker 1:

That's a really good example — the whole competitor one — because that comes up in kickoff calls. I'm sure Elijah and I have both been on a ton of calls where customers were like: do you have a target list of companies? Are there companies you want us to stay away from?

Speaker 4:

That's another good one, right.

Speaker 3:

Sometimes you want something like "they worked at a certain company in the past," right? There's all sorts of nuance to that. We have a client with a list of 300 companies he's targeting, and he has a criterion with the names of those 300 companies, and we tell him whether each resume aligns with that criterion or not.

Speaker 2:

Yeah, and he can put more weight on that criterion as well, so it affects the scoring too. If he gets a candidate that meets that criterion, it automatically bumps up the score, because "this is very important to me — he may not have this other attribute, this other criterion, but he does have this, which is very important to me." That's the interesting thing about having your own criteria on the side. And I want to mention, for anyone listening: when you set up your job — when you upload your job description to Brainner — we extract the criteria automatically.

Speaker 2:

You don't have to do it from scratch. You start with a base that we give you, based on our best recommendations. We have a little AI model trained to extract criteria from the job description and hand them to you, just to save you time. It probably takes you three to five minutes to tune it — you're not starting from scratch, copy-pasting everything, and we're already embedding our best practices to make the process much faster. We've been talking about it so much that it might seem like a tedious process — it's really not. It only takes a couple of minutes to get set up, and then you start tuning it.

Speaker 2:

Once you get 100 applicants, you might think, "Hmm, this scoring — I need to go back." So you go back to your criteria and you tune them: you add criteria, you remove some, you change the weights, and that updates the scoring. You can see it on live data and re-run the analysis on just the candidates you already have, which I think is very powerful — especially for staff augmentation. Say you have another role that's very similar to a previous one: you open the role, go back to the previous one, tune the criteria, and there's your next candidate pool to start with.

Speaker 1:

Can we talk about weight for a minute? You mentioned you can weight the criteria — can you walk us through that? Actually, I don't think we've talked about it. No, that's fine.

Speaker 2:

Yeah. What we hear from customers is that they want to give more weight to certain criteria — for example, the one Fede just mentioned about working for a specific company, or having more than five years of experience with certain technologies. They want that criterion to rank above another criterion that's not as important — maybe a soft skill, or some particular technology or framework.

Speaker 2:

I'm a technical person, so I always talk about tech roles. In tech roles it's often "five years of experience as a developer," and maybe working with AWS is a plus. In job descriptions they always say these technologies are a plus or a bonus — you don't need to have them, but it's really good if you do. Those go a bit below the must-haves, like five years as a software engineer, or a degree — sometimes they only want people who graduated with master's or technical degrees. Those must-haves are very important to the client, and they want a candidate who meets them bumped up above a candidate who only fits the bonus criteria.

Speaker 2:

So that's what we allow you to do: give a criterion more weight. That weight affects the scoring, which in turn affects how the candidate is sorted in the table, because the table is sorted with top scores at the top. The ability to weight criteria has been very highly requested — once people understand the process, they ask, "How do I give this more priority?" Before we had weights, people would actually enter the same criterion more than once: they'd put "five years of experience with Python" three times, because that would affect the score. That's how people worked around it before we introduced weights. But I'm curious — is that something you guys consider, that some criteria are more important than others? It must be quite common in your industry as well.

Speaker 1:

Yeah, I really like that functionality. It's not something I'd actually thought of, but if I was using your product, I would very soon be saying: oh yeah, this is important, we need to be able to do this.

Speaker 2:

Yeah, you don't want to miss that one, you know.

Speaker 1:

Yeah, for sure. So is it as simple as the product asking the customer what's most important? Is it really that simple? What does it look like?

Speaker 2:

Yeah, we started as simply as marking whether a criterion is mandatory. We have two categories — mandatory requirements and non-mandatory — and we split the criteria into those two columns. For the mandatory ones you probably want to see all green, because for every criterion we give you a little color bar. Then you have the ideal criteria — the non-mandatory ones — where you're okay with some being missing. That was the first division we introduced, and it already helps a lot of customers. One step further — we don't have it live yet — is letting you say: this is very important, this is somewhat important, and this is just average. That will be the other dimension we introduce: not only mandatory versus not, but degrees of importance, fed straight into the scoring formula.

Speaker 2:

The scoring formula is something we share publicly with our customers. It's not a secret — here's the code, here's how we calculate the score. There are no magic numbers and no AI in the scoring; it's purely looking at the flags: yes, no, partial.
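
To make the discussion concrete, here is a minimal sketch of the kind of transparent, criteria-based scoring described above — weighted yes/partial/no flags, no AI in the arithmetic. The names, weights, and flag values are illustrative assumptions, not Brainner's actual formula.

```python
# Flag values are an assumption: a "partial" match counts for half.
FLAG_VALUES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def score(criteria, flags):
    """criteria: list of (name, weight); flags: {name: 'yes'|'partial'|'no'}.
    Returns the weighted percentage of criteria met, 0-100."""
    total_weight = sum(w for _, w in criteria)
    earned = sum(w * FLAG_VALUES[flags[name]] for name, w in criteria)
    return round(100 * earned / total_weight)

criteria = [
    ("5+ years of Python", 3),        # weighted up: a must-have
    ("worked at a target company", 3),
    ("AWS experience", 1),            # nice-to-have
]
flags = {"5+ years of Python": "yes",
         "worked at a target company": "no",
         "AWS experience": "partial"}

print(score(criteria, flags))  # (3 + 0 + 0.5) out of 7 -> 50
```

Because the formula is just weighted flag counting, anyone can recompute a candidate's score by hand — which is exactly the explainability point made in the conversation.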

Speaker 3:

Another important thing is that you can also filter by criteria. For example, if you only want candidates with experience in Python, you can filter to those candidates. Why is this important? Because companies also need to archive candidates. If they want to archive candidates who aren't in the right location, they can filter the candidates who aren't from, say, the United States and archive them in bulk, instead of going into each resume one by one. And that matters for the candidate experience — it's not only about the good candidates, but also about archiving, and giving a reason to the applicants you're not going to call.

Speaker 2:

We talk so much about finding the best candidate, but the reality is that many times it's just about archiving the ones that don't qualify. You get a thousand applicants, and maybe 500 of them you can archive with two filters: do they have five years of experience? No — how many is that? 200. Select them all, archive them, and that's already reflected in your ATS. By the time you've archived everyone who doesn't fit your criteria, you're left with a much shorter list. You can still go one by one if you want, but you've removed the noise, if you will.
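
The filter-then-bulk-archive workflow just described can be sketched in a few lines. The candidate records and the archive step here are hypothetical stand-ins — a real integration would write the archived status back to the ATS via its API.

```python
# Toy candidate records; "flags" would come from the criteria analysis.
candidates = [
    {"name": "A", "flags": {"5+ years experience": "yes"}, "archived": False},
    {"name": "B", "flags": {"5+ years experience": "no"},  "archived": False},
    {"name": "C", "flags": {"5+ years experience": "no"},  "archived": False},
]

def bulk_archive(cands, criterion):
    """Archive every candidate whose flag for `criterion` is 'no'.
    Returns how many were archived."""
    misses = [c for c in cands if c["flags"].get(criterion) == "no"]
    for c in misses:
        c["archived"] = True   # a real tool would sync this to the ATS here
    return len(misses)

archived = bulk_archive(candidates, "5+ years experience")
print(archived)                                               # 2
print([c["name"] for c in candidates if not c["archived"]])   # ['A']
```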

Speaker 1:

That makes a lot of sense to me. Elijah, do you have any other questions?

Speaker 4:

Just a final question. Let's say, in your scenario, Guillermo, you archived 500 of those candidates, but a lot of them might be relevant for other positions in the candidate database the company has. Does it suggest them? Can a candidate have a score for one role and a different score for another role? I'm just curious — could you have 15 scores, like a high score for this role based on this JD, but a much higher or lower score for that one?

Speaker 2:

for this Totally and I'm glad you're bringing this up because it's very aligned to one of our latest features that it has to do with finding candidates in your database. So thank you for the flag. But you know what we did. We allow you to search for candidates on your ATS. So what we find from recruiters was that because we are just getting started with Brainerd, so it's not so much the data we have on Brainerd, we are just getting started with Brainerd, so it's not so much the data that we have on Brainerd, it's actually the data that you already have in your ATS, because you already, in many cases, they already use tags, they already use stages, they already somehow they have this organized Not as good as having them on in Brainerd, but there is some sort of system in general that they already have. So what we allow you to do is you have your position and then you go on Brainerd and be like you know what I want to search in my ATS database the candidates that apply to this other role or that are candidates that have this tag, or these candidates that have this source, or you know there's many filters that you can apply and what we do is, with all those filters. We go, get all those candidates from your ATS and we bring them into Brainerd and we do the same analysis that we did for the candidates that apply directly, like the candidates, the inbound candidates and then we import the ones from your other positions from your ATS and we import them into Brainerd and we run the same analysis and we basically are able to we allow you to spot really good candidates from your previous positions. That is that's what we have right now. That's something that customers are using right now, where we see this. We want to take it even one step further and be more aligned to your question where, once you have enough candidates in Brainerd, you can start doing also some sort of smart searching on. 
You could go to your database and say, "I want all the candidates with more than five years of experience in Python," and Brainner would bring you just that. Then you start searching not by keyword, not by tags, but by actual semantic meaning — real things like candidates who worked at this company, candidates who did this — because the AI can run that analysis and find you the right candidates. The more candidates we get into Brainner, the more we'll keep evolving this.
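
Elijah's per-role-score question maps to a simple structure: the same candidate gets a different score for each role, because each role defines its own criteria. This toy sketch uses plain attribute checks in place of the AI analysis; all names and rules are illustrative.

```python
# Structured attributes that an AI extraction step would normally produce.
candidate = {"years_python": 6, "worked_at_target_company": False}

# Each role carries its own criteria checks, so scores differ per role.
roles = {
    "backend-engineer":  [lambda c: c["years_python"] >= 5,
                          lambda c: c["worked_at_target_company"]],
    "python-contractor": [lambda c: c["years_python"] >= 2],
}

scores = {role: round(100 * sum(check(candidate) for check in checks) / len(checks))
          for role, checks in roles.items()}
print(scores)  # {'backend-engineer': 50, 'python-contractor': 100}
```

The candidate who misses a must-have for one role can still surface at 100% for another — the "second-best match" point made earlier in the conversation.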

Speaker 2:

I think searching your database is a critical component, and it really wasn't possible before. Keyword matching works, but you leave a lot on the table, and people also game it — they stuff all the keywords into their resume. The AI doesn't get confused by that: if it just reads keywords, it doesn't matter, because the AI actually understands the text, the dates, how you've moved through your career, et cetera. So we've started with importing from your ATS, and the next stage will be searching it this way. So far, what recruiters were telling us is, "I already have tags for—"

Speaker 4:

Or — I think they call them second place or third place, or whatever they use in their process. Nice. Yeah, that's really helpful, because I know on the roles I work on, whenever I need to search through a database and try to find candidates, or re-look at candidates and review a pipeline again —

Speaker 4:

It's just irritating, honestly. You've either looked at the candidate before and now you're trying to reassess — it would be really nice to have a score based on objective criteria, and not just trust the tags or, like you're saying, Boolean searches and keyword matching. Yeah, that's a powerful feature.

Speaker 2:

Yeah, we're getting there — one thing at a time. There are so many things we want to build, but that is one of them. And if you want, we can talk about where this is going, and other parts you guys might find interesting. Search on the database is the number one.

Speaker 2:

The second one is taking it a step further on communication: what if you could send customized rejection emails, not just a generic one — or actually include a bit of feedback in your rejection emails? Or what about when a candidate applies and already has a 90 or 95 score — why not let them book a meeting right away? Why not get that candidate on a phone call? If the candidate is already really good, why wait three days to reach out when you can have them book the meeting immediately? Imagine the candidate applies, we run the analysis live, and they book a meeting with the recruiter on the spot — the recruiter wakes up the next day and already has a couple of meetings with potential candidates for the role.

Speaker 1:

That's cool, I really like that one. Just chiming in there — I think that's a really cool use case.

Speaker 2:

And the other one is about what you mentioned before — resumes not having the right information. Many times recruiters add a lot of questions when you go to apply. If you've ever applied for a job, you know this personally: you submit your resume, and then you face 10 questions like "do you have experience with this?" — and in my head I'm thinking, this is already in my resume. What am I completing this form for?

Speaker 2:

But I know how it works now: in the ATS they use that form to add tags and then run Boolean searches and filters. What if those questions could be dynamically generated instead? What if we look at your resume and say, "It sounds like you have it all — but have you ever worked with this framework, or with AWS?" Maybe the candidate just didn't mention it on the resume. They fill in the answer, we feed it back into the analysis, and you save a lot of time because the candidate could express something they forgot to put on the resume. When the recruiter reviews it, they'll see it right there in the question. We're not turning this into a full interview — we don't want to be like that other company you guys mentioned; we're not trying to get into interviewing.

Speaker 2:

But yes — complementing those knowledge gaps that people sometimes forget to cover in the resume could make a big difference for the applicant. So yeah, those are some of the things. I hadn't written them down, but I feel like you guys will find them interesting, and I'm curious to hear your thoughts as well. You'll be my customer research. I love it.
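
The dynamically generated questions idea reduces to: find the criteria with no evidence in the resume, and ask only about those. In this sketch a naive keyword check stands in for the AI's semantic reading of the resume; all names are illustrative.

```python
def missing_criteria(resume_text, criteria_keywords):
    """Return the criteria keywords that never appear in the resume text.
    A real system would use semantic matching, not substring search."""
    text = resume_text.lower()
    return [kw for kw in criteria_keywords if kw.lower() not in text]

def screening_questions(resume_text, criteria_keywords):
    """Generate one targeted question per criterion the resume is silent on."""
    return [f"Have you worked with {kw}?"
            for kw in missing_criteria(resume_text, criteria_keywords)]

resume = "Senior developer, 6 years of Python, built services on GCP."
questions = screening_questions(resume, ["Python", "AWS"])
print(questions)  # ['Have you worked with AWS?']
```

The candidate's answers would then be folded back into the criteria flags, so the analysis no longer penalizes an omission that was just a gap in the resume.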

Speaker 1:

Elijah, would anything come top of mind for you right now?

Speaker 1:

No, go ahead. Yeah, I guess I'm wondering about the feedback loop as candidates move down-funnel. I'm just thinking out loud here: we said this is the role criteria, and we initially thought these folks were a 95% match, but we're noticing the people at the final round are at 75% — they've matched different criteria. I wonder if there's any kind of feedback loop there. Maybe not —

Speaker 2:

— again, I literally just thought of this, but I'm curious. I get it. Yeah, a lot of that lives in the ATS — the notes after the first phone screen. It's not so easy for a third-party provider to get into that part of the process, because it becomes very sensitive information: you have the hiring manager's notes, the recruiter's notes, and so on, and ATSs don't necessarily share that information easily. But I hear you — it could definitely all be linked together; there could be something further done there. It just feels like not right now. What happens right after advancing the candidate is a more sensitive process; we leave that to them, and every company has its own little process there.

Speaker 1:

What about you, Elijah? Go ahead, man.

Speaker 4:

Oh, sorry, I was just going to say part of your go-to-market.

Speaker 4:

— if you haven't considered it, could potentially be powering matching features for different ATSs. Maybe they don't have the internal resources to develop this sort of feature, and powering it in the background could give them a differentiated product in the market. You don't have to go sell a thousand customers — you get, at a lower rate, a thousand customers from company X. Plus, if it went really well, it could turn into them just buying you out. I don't know — what do you think?

Speaker 2:

If anyone at an ATS is listening and you end up being the buyer, we'll give you a finder's fee. But no — the interesting part is, yes, 100%, you could become an infrastructure company, the Stripe of this space, if you will, and that's a big dream. What's on our side is that AI has this unique technology proposition: you have an AI model that you've customized, that we're training and tailoring, and the more customers we get, the better we get at doing this. That becomes really valuable, because it's built just for you and your company.

Speaker 2:

And I can see how, if you wanted to start doing what we're doing, we're already a year ahead — you'd have to cover all that ground, and that's not something you can really speed up. You need a lot of data, you need to train the model, you need to test it and make sure it works correctly, and that becomes very valuable. So I could see it being an ATS, or maybe even a job board — job boards try to get into that process as well, helping you find the best matches for your job. There have been a couple of conversations, from very far off, but we'll see where that goes.

Speaker 1:

All right, I have one more question. We have Daniel Chait, co-founder and CEO of Greenhouse, on the show fairly often — about once every few months — and we've talked about AI in a few episodes. He talks a lot about his concerns with AI. More than sharing excitement about potential applications, he's mostly concerned.

Speaker 1:

He has a lot of big customers, yeah. One of the things he talked about, from a resume-matching perspective, is this arms race — he called it an arms race — between AI resume-evaluation tools, possibly similar to yours, and AI-generated resumes.

Speaker 1:

He was talking about how the market is being flooded with AI-generated resumes, and at the same time we're seeing more resume-evaluation and matching tools. So I'm curious to get your thoughts on that topic. More specifically: do you worry about that? Do you think a tool like yours needs a way to figure out whether a resume was AI-generated, or is that just not a concern? What do you think?

Speaker 2:

Yeah, definitely — it's something we discuss: your resume gets made by AI, and then AI is evaluating your resume, so it's AI on both sides. If you take a step back — and this is more long-term, just my personal opinion — I think we have to look at the resume as a medium. Are we going to go after AI itself, or are we going to look at the way the process works right now? It's a process that has been done the same way for how long — since forever.

Speaker 2:

We've always used a resume as a way to introduce ourselves to a company. But why does the resume have to be in a particular format? Why is the resume the best way to sell me as a candidate? If you think about it, most people have one resume and send the same one to everybody. Now there's this new trend where they tell you to tweak your resume for each job position, and that makes a lot of sense: why would I tell you about all this experience that doesn't apply to this job? Let me show you the experience that does. So it makes sense that you'd have almost a unique resume per job application.

Speaker 2:

Maybe people don't have the time to do it, and the tools aren't quite there yet, but it almost feels like you could have one resume per job you apply to, and the resume might become more like a letter: "This is my story, these are the places I've worked — but here's what you really care about."

Speaker 2:

"I worked with this technology, I worked with these people, I worked on that," and so on. So, if anything — if AI is in the resume and AI is doing the analysis — I think the resume just turns into something else: a record of work experience plus answers to some questions, and that's going to be the process. The resume won't be what we know it as today. AI technology is here to stay; we're not going to be able to say, "No, we don't want AI — we only want resumes in a Word document that we read ourselves, and that's it."

Speaker 1:

It's just going to happen, so it's more about how we adapt to this process. I definitely agree with you. Everybody listening should go back and listen to, I think, the most recent episode I did with Daniel; you can check the episode description, and I think there's a chapter that highlights this, so you guys should definitely check that out too.

Speaker 1:

He brought up AI being used for matching, I think in healthcare or something like that. The type of ML being used picked a very random piece of information, a correlated trend across a ton of images, and basically decided that piece of data should be used to group them into clusters. But it was something totally irrelevant, something that wasn't relevant to the decision at all, and he cited that as an example of why we have to be careful. But this is happening, and I think the tools are already becoming a lot more sophisticated than that. This isn't just basic clustering or anything like that; this is totally different.

Speaker 1:

It was just an interesting conversation. This wasn't that long ago, probably within the past few months, and he was, I don't want to say pessimistic, but cautious about it. And I get why he can't exactly say otherwise, right?

Speaker 4:

All of the enterprise companies using Greenhouse, versus, say, a smaller ATS with a bunch more startups. Yeah, I don't know. Anyways, go ahead.

Speaker 1:

I don't know.

Speaker 2:

There's definitely a bad side; maybe let's talk about that, because we did something that was interesting. Bias is something that is embedded in AI models. It exists, it's clear, and you can actually test it. Here's a very simple test for anyone out there: take one resume and make three copies, and all you do in each copy is change the name of the person, using names from different ethnicities and different cultures. Then you give those three resumes to the AI, Gemini or whatever provider you want to try, with the same job description, and ask: who is the better fit for this role? Same resumes, just different names. You'll be surprised: in some cases it will weight one candidate over another based on what country they seem to be from or what their name is. I can imagine the CEO of Greenhouse being like, that is really bad. Yes, and we all agree that we don't want that to happen, and that's why we can't let the AI do the work for us. We can't let the AI make the decisions for us.
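To make that test concrete, here is a minimal sketch in Python. The names, resume text, and prompt wording are all illustrative assumptions, not anything from Brainner; you would send the final prompt to whichever model you want to probe.

```python
# Sketch of the bias probe described above: three resumes that are
# identical except for the candidate's name, folded into a single
# "who is the best fit?" prompt. All data here is made up.

RESUME_TEMPLATE = """\
Name: {name}
Experience: 5 years as a backend engineer.
Skills: Python, PostgreSQL, AWS.
"""

# Hypothetical names, chosen only to vary perceived ethnicity/origin.
NAMES = ["Emily Walsh", "Lakisha Washington", "Mohammed Al-Farsi"]

def build_bias_probe(names, template):
    """Build identical resumes that differ only in the name line."""
    resumes = [template.format(name=n) for n in names]
    prompt = (
        "Here are three resumes for a backend engineer role. "
        "Who is the best fit? Answer with one name.\n\n"
        + "\n---\n".join(resumes)
    )
    return resumes, prompt

resumes, prompt = build_bias_probe(NAMES, RESUME_TEMPLATE)

# Sanity check: apart from the name line, every resume is identical,
# so any preference the model expresses can only come from the name.
bodies = [r.splitlines()[1:] for r in resumes]
assert all(b == bodies[0] for b in bodies)
```

If the model's answer changes when only the name changes, you have observed exactly the kind of embedded bias described here.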

Speaker 2:

We also have to be very smart about the questions we ask the AI. What do we ask it to do? Do we want it to analyze everything and tell us who's the best match, or just to tell us that this candidate has five years of experience, period? It's almost like cooking with a child. Are you going to put the child in charge of the cooking, or are you going to ask, can you pass me the eggs? When I'm making pancakes with my kids, is the kid really doing all the cooking at the stove? No, I say, grab the eggs, pass me the eggs. That is how we should treat AI: like a kid. It's not ready to run the whole thing, but it's really good at those small tasks. It's really good at giving you the eggs, really good at giving you the flour. I hope the analogy is good, but that's how I frame it.

Speaker 1:

Yeah, I think that makes a lot of sense. It's also about putting in the specific criteria of what good looks like. It doesn't sound like it's pulling from a database of what good looks like; it's more about taking the search criteria and then using that to determine what good looks like. In terms of preventing bias, and I'm not an engineer, do you train systems to specifically not make a decision based on certain information? Can you get it to ignore names completely, or is there a way to-

Speaker 2:

You definitely can, but you have to ask the AI not to give you an analysis. The whole thing here is that you don't want the AI to give you conclusions, the kind where it says: this is a good candidate because of A, B, C. You don't want the AI to do that, because that's where the bias comes in, that's where you lose control. Where is that reasoning coming from? AI today is not able to reason; reasoning is not something AI can do.

Speaker 2:

What AI can do today, from all this internet data, is come up with some sort of conclusion based on what it's been trained on, but it's not really thinking. It's really good at reading text, but it's not really good at coming up with its own conclusions. All it's doing is matching text and producing the most likely text that follows. Sorry, it's a bit technical, but what I'm trying to say is you can't have the AI think. And because the AI can't think, and all it does is use the text it has been trained on, the data on the internet, it has bias. We all have bias: we write articles, blog posts, and so on, and that bias gets embedded in the model. It's hard not to have bias.

Speaker 2:

What you can do is ask questions where the bias is almost none, objective questions: is this green or red? Is this black or white? Those are the questions the AI is really good at, and that's the new thing versus a keyword search. You can ask: did this candidate work at Amazon, yes or no? Has this candidate worked for more than five years at Amazon, yes or no? The AI can do that very well and very efficiently. It doesn't get confused, it doesn't overlook things, and it does it in a matter of seconds. It all has to do with how you present the software and how you prompt the AI. But the bias, I think, is always going to be there until we get to the next version of AI, where AI will start thinking, and then I don't know what will happen.
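As a rough sketch of that criteria-based approach (the question wording and parsing here are my own illustration, not Brainner's actual implementation): ask one narrow yes/no question per criterion and reject any answer that isn't binary.

```python
# Sketch of criteria-based screening: one objective yes/no question
# per criterion, instead of asking the model for an overall judgment.
# Criteria and wording are illustrative; wire build_question's output
# to your model of choice and feed the reply to parse_answer.

CRITERIA = [
    "Did this candidate work at Amazon?",
    "Has this candidate worked for more than five years at Amazon?",
]

def build_question(resume: str, criterion: str) -> str:
    """Frame one narrow, objective question; no open-ended analysis."""
    return (
        f"Resume:\n{resume}\n\n"
        f"Question: {criterion}\n"
        "Answer strictly YES or NO."
    )

def parse_answer(raw: str) -> bool:
    """Map the model's reply onto a boolean; reject anything else."""
    token = raw.strip().upper()
    if token.startswith("YES"):
        return True
    if token.startswith("NO"):
        return False
    raise ValueError(f"non-binary answer: {raw!r}")

question = build_question("5 years at Amazon as an engineer.", CRITERIA[0])
assert parse_answer("YES") is True
assert parse_answer("no, they did not.") is False
```

The point of the strict YES/NO contract is exactly what the speakers describe: the model never gets to explain or rank, so there is no free-text "analysis" for bias to leak into.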

Speaker 3:

From another perspective, humans also have bias when they're analyzing a resume. So I prefer having the AI extract whether the candidate has five years of experience in Python or not, and then provide the recruiter with the full list of candidates and which criteria each one meets. You can do that in Brainner: you can hide the names and just see the list of candidates and which one meets which criteria. So we decided to have the AI do the things it can do more productively and more impartially than people. That's the thing.
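A tiny sketch of that anonymized view (the field names and data are invented for illustration): strip the candidate names and show each candidate only as a row of criterion results.

```python
# Sketch of the anonymized criteria view: drop the name field and
# label candidates by position, so the reviewer sees only which
# criteria each candidate meets. All data here is made up.

candidates = [
    {"name": "Jane Doe",   "5y Python": True,  "Worked at Amazon": False},
    {"name": "John Smith", "5y Python": False, "Worked at Amazon": True},
]

def anonymize(cands):
    """Replace names with neutral labels, keeping criterion results."""
    return [
        {"id": f"Candidate {i + 1}",
         **{k: v for k, v in c.items() if k != "name"}}
        for i, c in enumerate(cands)
    ]

rows = anonymize(candidates)
assert all("name" not in r for r in rows)
assert rows[0]["id"] == "Candidate 1"
```

Because the ranking decision stays with the recruiter and the names are hidden, the model's name-based bias has no channel into the final call.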

Speaker 1:

Yeah, it makes a lot of sense and we're going a little bit over time today, which has been great. Thank you for spending more time recording with us. This was very insightful. I really enjoyed this, so I just wanted to say thank you so much for coming on the show and sharing all of your insight with us today.

Speaker 2:

Thank you for having us, guys. I think it was a great conversation, and it will be interesting to do this again next year, not only from our perspective as a company, but just in terms of the trends in tech and HR in general. I think it's going to be a very interesting couple of years ahead of us.

Speaker 1:

Yeah, definitely. We have probably at least 15 more episodes in this series, so make sure to check those out; I think this is our fifth or sixth episode. And for folks tuning in who are interested in the AI topic, there are, I think, two or three episodes with Daniel Chait over at Greenhouse about AI, and we've also done a few episodes with Steve Bartel, co-founder and CEO of Gem, discussing how Gem's implementing AI and his thoughts on how AI is going to be incorporated into hiring. So we have a fair amount of really good content on the topic already, but we're really leveling up the amount of knowledge we're going to be able to share with everybody here. Anyways, thanks again, guys, it's been a great conversation. Elijah, as always, great to have you here too. Thanks. Cool, thanks for joining us. Talk to you soon. Bye.
