The Breakthrough Hiring Show: Recruiting and Talent Acquisition Conversations

EP 146: AI in Action: The blueprint for smarter hiring with the CEO of BrightHire


James Mackey, CEO of leading RPO provider SecureVision, and Elijah Elkins, CEO of Avoda, a highly rated global recruiting firm, host BrightHire CEO Ben Sesser in our special series on AI for Hiring.

In this episode they uncover the ways AI is optimizing how companies recruit talent. Ben shares valuable insights into how BrightHire's platform is transforming interview planning, execution, and assessment, making the process more structured and inclusive. From creating detailed job descriptions to building comprehensive interview plans, this episode reveals how you can significantly improve hiring outcomes.

0:00 BrightHire's core features

10:49 Evaluating hiring decisions with BrightHire

22:00 Customizing your hiring process with AI

30:42 The technology behind BrightHire


Thank you to our sponsor, SecureVision, for making this show possible!


Our host James Mackey

Follow us:
https://www.linkedin.com/company/82436841/

#1 Rated Embedded Recruitment Firm on G2!
https://www.g2.com/products/securevision/reviews

Thanks for listening!


Speaker 1:

Hello, this is the Breakthrough Hiring Show. This is your co-host, James Mackey. Today we're joined by my co-host, Elijah Elkins, as well as our guest, Ben Sesser. Ben is the co-founder and CEO of BrightHire, and he's going to be here today telling us a little bit about his product and how it incorporates machine learning. Then the conversation's going to progress into a general overview of how AI is impacting hiring today and in the foreseeable near-term future, and then we're going to be talking about potential applications 18 months or more out. Who knows how accurate we're really going to be, but we're going to take a stab at it regardless. Anyways, Ben, welcome to the show.

Speaker 2:

Thank you for having me. Super excited.

Speaker 1:

Yeah, of course. So let's start with an intro on you, and if you want to dive right into your value prop for BrightHire, that'd be pretty cool too.

Speaker 2:

Yeah, absolutely. So I'm Ben, the co-founder and CEO of BrightHire, as James mentioned. I've been building this business for five years, and I'm super passionate about the people space. BrightHire is an end-to-end platform for running an exceptional interview process with AI at the core. We help companies like Canva, Rippling, Zapier, Lattice and a couple hundred others hire exceptionally well and much more efficiently, with a particular focus on making the interview process better for the organization. Happy to go into more detail, but that's the high level.

Speaker 1:

Yeah, that's super helpful. Could we go into a little bit more detail? I'm looking at your product page on the website, and one of the things at the top just talks about building interview plans within minutes. So is it more about building a structured hiring process, or are you actually joining the interviews, or essentially hosting the interviews for customers? Can you just walk us through it? Make it real.

Speaker 2:

Yeah, at a very high level, think of BrightHire, and I know it's a cliche, but truly, as an AI co-pilot for your recruiting team and your hiring teams to run the interview process much more effectively every day, plus a bunch of insights for the organization to understand the quality of the process holistically. Very practically, that spans the whole interview process. It starts from step one of planning: helping you figure out what we're trying to hire for, what the key competencies and skills are, and so on, creating that architecture for a new role you're hiring for, building a job description, optimizing it, making sure it's inclusive and clear, and translating that into an actual assessment plan and interview plan. And then we operationalize that interview plan, so we actually make it show up for your interviewers in a meeting like this, in Zoom, in real time.

Speaker 2:

We record, we transcribe and we summarize the conversations. As a candidate is sharing really important information with you throughout the process, how do we not lose any of that? We capture it and surface it in a really structured way that helps you make much higher-fidelity decisions. So we make it a little bit easier to run a great interview, particularly for folks that don't do it all the time, which is most folks, by freeing them from note-taking, giving them the plan in real time, and then capturing and synthesizing that information in a whole bunch of ways that help with calibration, decision-making and everything along the way, and ultimately, hopefully, a much more objective, informed decision. And then, across the interview process, we're gathering basically a whole new data set that never existed before about the quality of the interviews you're running, what you're assessing for, and what you're hearing from candidates, so that you can continuously optimize over time and get more predictive in terms of making the right hires, understand your candidates better, and improve your close rates. So it's really very much end to end.

Speaker 1:

Now, when you say building interview plans, is it primarily focused around custom questions? Or are you actually structuring interviews: okay, you need three interviews, here should be the focus of each interview, and here are the questions for each? How is that structured or done?

Speaker 2:

Yeah, it's a great question. All of those things. In some cases, it'll be a net new role that a company has never hired for. And so step one is, what are we hiring for, which is a step that's often taken for granted. There's very often a trade-off, and I don't know if y'all have experienced this, between "let's kick this off really fast because we're in a rush and we want to just get things going," and then we chase our tail because we didn't exactly know what we were looking for in the first place, or "let's be deliberate and build a really thoughtful plan: understand exactly what we're trying to assess for, build a great job description, build an interview plan," but that could take a week or two, depending on whose plate the responsibility lands on. So we obviate that trade-off. If it's a blank-page new role you've never hired for, there's actually an interactive workflow that a recruiter can do, a hiring manager can do, or they can do together.

Speaker 2:

That actually is like a back and forth to help build out the architecture of the role, and so you provide a little bit of information about the role itself and then we'll start proposing here's some skills that may be relevant. Here's some experiences, here's some qualifications, and so on and so forth. You could add your own, you could ask for more. We're not making the choices for you. We're expanding the mind of the recruiter and the hiring manager and helping them really think through what they're looking for.

Speaker 2:

And then, once you've solidified that, which I think of as the foundation, we turn that into, okay, how do we express this to the market in a job description that's going to be inclusive and clear and well-written? And then how do we pull that thread all the way through to, to your point, James, building a full structured interview plan? So you can actually define, we want to have five interviews, the first one's a recruiter screen, and we'll take all of that infrastructure and automatically allocate the topics, the competencies, the skills logically into the interview plan, and then give you very specific questions you can ask, situational, behavioral, technical, tied to the specific items you identified as important. So it's a full structured interview plan. And then you can either use that in BrightHire, where we can operationalize it, so you can say James, you take this, Elijah, you take that, Ben, you take this, and it'll show up automatically, or you can drop it into your ATS really seamlessly with our product and then we'll operationalize it through your ATS.
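
To make the shape of that output concrete, here's a rough sketch of how a role's competencies could be allocated across interview stages. The data model, field names, and simple round-robin allocation are illustrative assumptions, not BrightHire's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Competency:
    name: str
    questions: list[str] = field(default_factory=list)  # suggested questions for this competency

@dataclass
class Interview:
    title: str
    interviewer: str
    competencies: list[Competency] = field(default_factory=list)

def build_interview_plan(competencies: list[Competency],
                         stages: list[tuple[str, str]]) -> list[Interview]:
    """Spread the role's competencies across the interview stages round-robin,
    so each stage has a defined focus and nothing is left uncovered."""
    interviews = [Interview(title, interviewer) for title, interviewer in stages]
    for i, competency in enumerate(competencies):
        interviews[i % len(interviews)].competencies.append(competency)
    return interviews

# Hypothetical example: an account executive role with three stages.
plan = build_interview_plan(
    [
        Competency("Quota attainment", ["What has your quota attainment been over the last two years?"]),
        Competency("Discovery skills", ["Walk me through how you run a discovery call."]),
        Competency("Territory planning", ["How would you build a plan for a new territory?"]),
    ],
    stages=[("Recruiter screen", "James"), ("Hiring manager interview", "Elijah"), ("Panel", "Ben")],
)
for interview in plan:
    print(interview.title, "->", [c.name for c in interview.competencies])
```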

Speaker 1:

Nice. Elijah, I think you were going to mention something.

Speaker 3:

So I know one thing, Ben, that you guys are really excited about at BrightHire is the interview quality report that came out, I think, about a week ago or so, and then also digging into candidate motivation. I'm curious what that looked like from concept, whoever's idea it was, to actually getting it launched. Did you already have all the data you needed collected, or was there more work to do? I'm just curious what that looks like, because it's really exciting that it's out now.

Speaker 2:

Yeah, so the idea for the interview quality score came from our clients. We've always had a lot of specific metrics that we capture that speak to different aspects of the hiring process. We have data that we're capturing from interviews, we also take in your ATS data, and we combine those things so you can slice and dice in a bunch of interesting ways. So we've always had a bunch of data about the dynamics of the interview, what the talk ratio was and that sort of thing. But they're all individual metrics, and ultimately it's singular KPIs and North Stars that really drive behavior and galvanize an organization toward excellence. Some of our clients, larger ones in particular, said, can we actually bring all this individual data together and create a composite, an overall score, that gives us the North Star for the org and for different departments? So, practically speaking, it really came from a bunch of our customers, and we worked with them to build it in a way where they can actually customize it, because that's the other thing you learn when you talk to different companies: they may have different priorities or conceptions of what is great and what they really care about. And so we have a bunch of metrics.

Speaker 2:

We added some new metrics, to your point, Elijah, that we didn't have before, so we're always adding new data. An example would be that we can now track whether or not the agenda was set. If you care about candidate experience, it's generally a good candidate experience to say, hey, so nice to meet you, here's what we're going to cover in today's interview, and set that agenda. That's a new data point, among others. So our clients can go in, turn on the metrics they care about, roll those into a composite score, set the thresholds, and now track this longitudinally over time. And yeah, it's been really fun. To your point, it's only been out for two weeks publicly. It's been out a little bit longer for some of our clients, but it's now a core KPI.
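
As an illustration of how a composite score like that could be assembled (the metric names, weights, and numbers below are hypothetical, not BrightHire's actual configuration), here's a minimal sketch:

```python
# Hypothetical per-organization metrics, each normalized to 0.0-1.0.
interview_metrics = {
    "agenda_set": 1.0,            # share of interviews where an agenda was set up front
    "started_on_time": 1.0,
    "scorecard_submitted": 0.9,
    "plan_topics_covered": 0.7,   # competencies from the interview plan actually covered
}

# Each organization enables the metrics it cares about and weights them.
enabled_weights = {
    "agenda_set": 0.15,
    "started_on_time": 0.15,
    "scorecard_submitted": 0.30,
    "plan_topics_covered": 0.40,
}

def quality_score(metrics: dict[str, float], weights: dict[str, float]) -> float:
    """Roll the enabled metrics into a single 0-100 composite score."""
    total_weight = sum(weights.values())
    weighted = sum(metrics.get(name, 0.0) * w for name, w in weights.items())
    return round(100 * weighted / total_weight, 1)

print(quality_score(interview_metrics, enabled_weights))  # 85.0 with the numbers above
```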

Speaker 2:

Alongside recruiter-screen-to-hiring-manager ratio and other core metrics, one of the core metrics in these companies' ELT pack is now the interview quality score, and it's already driving competitiveness, I would say, among leaders. Right, like, why aren't we at 100? How do we get to 100? And that's the goal, right? Metrics are there to understand where you stand and to drive behavior, so that one's been super fun to build. And on the candidate motivations, we've always had the data, because the data is text, right?

Speaker 2:

We transcribe interviews, which is a new data source. One of the amazing things about AI and how it's evolved, generative AI in particular, is that a lot of folks think about it in terms of summarization and those sorts of things, but it's actually great at taking unstructured text or other sources and turning it into structure, being able to infer meaning from it in a way that's much smarter than keywords, and then creating data. So we're using our interview transcripts to develop those sorts of metrics, and they're really interesting because they tell you things about your candidates that you would never know. Right, no one fills out candidate NPS surveys, and Glassdoor reviews are glossy. So you actually can learn things about what's motivating your candidates, what they're asking about, what they care about.
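
The pattern Ben describes, using a language model to turn unstructured transcript text into structured data, could look roughly like the sketch below. The theme schema, prompt, and `call_llm` hook are placeholders made up for illustration, not BrightHire's pipeline.

```python
import json

# Hypothetical motivation themes to extract from a transcript.
MOTIVATION_SCHEMA = {
    "compensation": "mentions of pay, equity, or benefits",
    "growth": "mentions of learning, promotion, or career path",
    "flexibility": "mentions of remote or hybrid work, hours, location",
    "mission": "mentions of product, impact, or company mission",
}

def extract_candidate_motivations(transcript: str, call_llm) -> dict:
    """Ask an LLM to read an interview transcript and return structured data:
    which motivation themes the candidate raised, with a supporting quote each.
    `call_llm` is any function that takes a prompt string and returns a string."""
    prompt = (
        "Read this interview transcript and return JSON mapping each theme to "
        "{'present': bool, 'quote': str}, using only the candidate's own words.\n"
        f"Themes: {json.dumps(MOTIVATION_SCHEMA)}\n"
        f"Transcript:\n{transcript}"
    )
    return json.loads(call_llm(prompt))

# Usage with whatever LLM client you already have, e.g.:
# motivations = extract_candidate_motivations(transcript_text, my_llm_client.complete)
```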

Speaker 3:

That you would never know otherwise. So just to zoom in on that a little bit. Basically, you guys are getting all this additional data that companies don't normally get, right, because the interviews aren't typically recorded. You're able to transcribe that, turn it into text, and then build additional product features to make available to customers based on what the customer wants and the feedback they give. And you also mentioned, let's just say, ChatGPT, I don't know what you guys use, but whatever you use from a gen AI perspective, it can take some of that text and data and synthesize it so you can simplify it, summarize it, and create those additional features for customers, right?

Speaker 2:

Yeah, we have basically off-the-shelf data that speaks to: are we running a rigorous and consistent interview process, are we delivering a good candidate experience, what motivates our candidates and what do they care about? Different facets of hiring that are all really important. The candidate motivations data, as an example, influences your EVP, the outbound messages you're sending in sourcing, how you close candidates, a whole range of things. That data never existed before, and it's all off the shelf in our product. There's an insights dashboard in our product that any client can use.

Speaker 2:

But we often prototype with clients.

Speaker 2:

They'll say, hey, could you help us answer this question or that question? And that's a great way for us to figure out what has value and what can be solved. And to your point earlier about using ChatGPT or what have you, it's amazing. With these tools, and this is getting to the bigger conversation, you can get 80% of the way there like that. But that last 20%, that's where all the work is, because we have a really high standard around accuracy and recall and precision and all the things that matter, because hiring is so important we can't screw it up. So that's actually where all the work is. It takes us a very short period of time to say, yeah, this is doable, but a much, much longer period of time to get to a place where we feel confident putting it into the product and having it drive hiring.

Speaker 1:

Is this, as a product, across the board for every role, or do you dial in on specific openings for customers?

Speaker 2:

As it relates to using BrightHire in general, or insights in particular?

Speaker 1:

Either or.

Speaker 2:

Yeah, yeah, it's a great question. We have clients that run BrightHire wall-to-wall, so they're running it from recruiter screen to offer for every role in the company, and these are scaled businesses. We also have really scaled businesses where they might start in the go-to-market team: we want to use BrightHire for sales hiring, it's a huge focus, we're not getting the quality that we want, and we think BrightHire is going to save our recruiters and hiring teams, the sales reps we want selling, a ton of time and help us improve the quality we get. And so we can scope it and serve just that part of the organization.

Speaker 2:

I just had a conversation earlier today with a company with a thousand folks on their go-to-market team, and they might use BrightHire to start within that part of the organization and then expand later. So both are super common. And then on the insights side, yeah, you could run these insights and say, I want to look at them for the whole company. We have all of this metadata from your ATS and from our system, so I could say, I want to look at candidate motivations for engineering roles in the US, or in Europe, or in this region, and really drill down and understand what's happening, whether it's around the quality or what we're hearing from candidates, at a very granular level.

Speaker 1:

So I guess my question around that is, you mentioned the generative AI piece gets you 80% of the way there, and then the 20% is really where the work comes in for your team. Is that work something you're doing across the board for every opening, or do you have to do it role by role?

Speaker 2:

We have a client that's hiring 25,000 nurses a year, a massive healthcare system, and we have clients that are hiring the most specialized AI engineers on the planet, right? Very different. But the things that we build have to work in both of those contexts, and so when we test our features, we actually have a pretty rigorous process for making sure that we're testing against a lot of different scenarios. That could be functional roles, countries, accents, all sorts of things, because we want to make sure there's going to be consistency in the quality of the outputs across the board. As we get into some of the more future-facing things that we may invest in, there have been interesting opportunities to double-click and build things specific to a role that's very evergreen and very common, which might drive a lot of value for our customers. Sales is a good example, engineering is a good example, call centers. But today we really build features that have to work in all those environments, which, again, is part of the 20% that takes a long time. It's just making sure that we're at the bar that we're responsible to.
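
That kind of cross-scenario testing discipline, holding extracted outputs to a precision and recall bar across very different roles, can be sketched roughly like this; the scenarios, labels, thresholds, and `extract` function are hypothetical stand-ins:

```python
def precision_recall(predicted: set[str], expected: set[str]) -> tuple[float, float]:
    """Precision: share of predicted items that are correct.
    Recall: share of expected items that were found."""
    true_positives = len(predicted & expected)
    precision = true_positives / len(predicted) if predicted else 1.0
    recall = true_positives / len(expected) if expected else 1.0
    return precision, recall

# Hypothetical labeled test cases spanning very different hiring contexts.
test_cases = [
    {"scenario": "nurse, US, phone screen", "expected": {"licensure", "shift flexibility"}},
    {"scenario": "ML engineer, EU, technical", "expected": {"distributed training", "visa status"}},
]

def evaluate(extract, cases, min_precision=0.95, min_recall=0.9) -> bool:
    """Return True only if every scenario clears the quality bar."""
    for case in cases:
        predicted = extract(case["scenario"])  # stand-in for running the real pipeline
        p, r = precision_recall(predicted, case["expected"])
        print(f"{case['scenario']}: precision={p:.2f} recall={r:.2f}")
        if p < min_precision or r < min_recall:
            return False
    return True

# Example with a toy extractor that just echoes the expected labels:
print(evaluate(lambda s: next(c["expected"] for c in test_cases if c["scenario"] == s), test_cases))
```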

Speaker 1:

Got it. So I'm looking again, I'm still on the product page here, right? It's plan, then interview, then it goes into decide, right? Thinking about increasing quality of hire: with the evaluation aspect of helping companies decide, is it more so just providing the summaries, matching the top role requirements and what's important for the role with candidate responses? Or is BrightHire actually providing an evaluation, maybe not stack ranking, but how does it present that data? What's the view for the hiring team?

Speaker 2:

Yeah, fantastic question. Our product ethos at the core is that hiring is hard, people don't do it all the time, it's staffed by volunteers, and technology can make people feel much more confident and successful doing hiring. So a lot of what we've built is to equip people to be as successful as possible. And it relates to this question because, to your point, we're not stack ranking, saying hire Ben, don't hire Ben, and applying AI to make those decisions. What we're doing is applying AI to give you vastly better information that's more objective and complete, and not based on memories, recollections and the post-it note on your laptop about what Elijah told me in the interview, to make a decision, to be calibrated and to have high fidelity going into those decisions.

Speaker 2:

It manifests in a bunch of different ways, but, James, it's very similar to what you described, which is: candidates just spend a lot of time with us, talk to a lot of people, and share a lot of really important information about their qualifications.

Speaker 2:

Let's not lose any of that. Let's find all the most important context they shared about things that are relevant for this role, and let's give it to the hiring team in a structured way that helps you know what you know and what you don't know, and make more objective decisions. And then we have a bunch of other capabilities in our platform that help people get calibrated faster and help them know what great sounds like, so that you have the context, you know what you know and don't know, and you also know how to evaluate that context really effectively, much more so than in the pre-BrightHire days, when you were calibrating through debriefs and games of telephone. I don't know, Elijah or James, if you had those experiences previously, but it's hard to calibrate, particularly with new interviewers, or new roles, or people coming from different organizations. You're trading secrets about what you're listening for, and it can be pretty tough. When you can share context, it's a pretty massive unlock.

Speaker 1:

So it sounds like, okay, it's providing those summaries, making sure you're aware of everything you don't know, potential blind spots in the evaluation, which is pretty cool, and it's basically pulling in data from each interview round. So, as opposed to looking at individual written feedback from each interviewer within an applicant tracking system, for instance Greenhouse, where you have a scorecard, it's compiling all of that feedback into one summarized evaluation across all the prior interview rounds, so that the team can review everything in one view, essentially one page.

Speaker 2:

Yeah, just to get really concrete. Greenhouse uses BrightHire, and they're an amazing partner. We, one, make all the feedback being submitted on every interview much more robust, and make it much faster to submit. So all the scorecards, or feedback forms, or whatever you want to call them that you're collecting from your interviewers are going to be much more robust, because we take all the most relevant information from every interview and make it basically a click for that interviewer to push that information into the scorecard, add their analysis, which is actually the most important part, on top of the objective evidence, and submit. So you're going to have much more robust scorecards. And in conjunction with that, BrightHire is, imagine, a panel flying out on top of that candidate profile with a list of all the things you said were most important for this role: the competencies, the skills, et cetera. It's going to tell you what we actually covered or didn't cover, and how deeply. So, is there anything we missed? It's basically a heat map for risk, and then, against every one of those criteria, a summary and a set of highlights to revisit.
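
A simplified version of that "heat map for risk", checking which of the agreed criteria were covered across the loop and how deeply, might look like this; the criteria, minute counts, and threshold are invented for the example:

```python
# Hypothetical coverage data: for each interview, how many minutes were spent
# on each criterion the hiring team said was important for the role.
criteria = ["Quota attainment", "Discovery skills", "Territory planning", "Team leadership"]
coverage_minutes = {
    "Recruiter screen": {"Quota attainment": 6, "Discovery skills": 4},
    "Hiring manager":   {"Quota attainment": 10, "Territory planning": 8},
    "Panel":            {"Discovery skills": 12},
}

def coverage_report(criteria, coverage, depth_threshold=5):
    """Flag each criterion as covered in depth, touched lightly, or missed entirely."""
    report = {}
    for criterion in criteria:
        total = sum(minutes.get(criterion, 0) for minutes in coverage.values())
        if total == 0:
            report[criterion] = "MISSED"
        elif total < depth_threshold:
            report[criterion] = "LIGHT"
        else:
            report[criterion] = f"COVERED ({total} min)"
    return report

for criterion, status in coverage_report(criteria, coverage_minutes).items():
    print(f"{criterion}: {status}")
# "Team leadership: MISSED" is the gap a hiring team would want to see before deciding.
```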

Speaker 2:

So very often what will happen is there'll be a scorecard or some analysis from an interviewer that says, I'm not so sure about XYZ as it relates to Ben, or so on and so forth. In the old days that was tough for a hiring manager, because you're either overruling Elijah, which is a tough thing to do to your interviewers without actually being in the room, or you're just saying pass, and maybe the candidate loses an opportunity they shouldn't, a false negative. Now I can get that feedback and very easily see the objective information, dive into it, listen to it myself, and say, oh, that's a good point, I agree. Or, that's coachable, great flag, but I'm okay with that, because that's actually an area we've been really successful coaching folks on, and make a much more intelligent, informed decision about the hire.

Speaker 1:

That makes a lot of sense. And I think we also talked about improving, and this is one of the more recent product feature releases, right, basically providing statistics on what could be done to improve the interview process. Maybe we touched on this already, but I know we're covering a lot of ground, so just to make sure I understand it: what are the core stats that go into improving interview quality? Is it identifying gaps, like, okay, we haven't covered these types of attributes or skills? Could you provide a little more clarity on that?

Speaker 2:

Yeah. If you think about what makes a great interview or a great interview process, you want folks that are going to do a good job by the candidate and deliver a great experience, right? That's a very important responsibility as a hiring team: someone's putting themselves out there for an opportunity, and it's really important to treat them well and to do your job. And you want folks that are going to follow the plan. We know what we want to assess for, we assess for it, and we do it consistently, because at the end of the day, if you run a consistent process, you will get consistent results, and then you can iterate from there. And if you run a random process, you will get random results. So how do we make sure we're consistent? And so, specifically to those two: one, we're going to collect a whole bunch of data.

Speaker 2:

That's not going to be, again, a lossy NPS survey from your candidates or a Glassdoor review, but the actual behaviors that make a good or a bad candidate experience: are you showing up on time, are you setting an agenda, are you leaving time for candidate questions? So, metrics that help us understand whether we're doing right by the candidate. And then metrics that help us understand whether we're being consistent and running a rigorous process. So, if this was the plan for the hiring process, are we consistently hitting the attributes, the skills, the competencies that we identified as important, the interview guide or the kit from our ATS, those questions? Are we doing that, and are we doing it consistently over time?

Speaker 2:

And then, on the latter, ultimately you can look a year later, a year and a half later, at performance, and go back and say, is there anything we can learn about what we assessed for, what we heard, and what happened? Can we identify the questions, the factors, the competencies, the skills, the interviewers that are more apt to lead to success? So those are two of the biggest things as we think about quality: candidate experience and rigor. And then there's a host of other metrics I was mentioning around understanding your candidates. That's not assessing the quality of your interview process, that's just intelligence for your talent acquisition team to do a variety of things, whether it's adjusting your EVP, knowing how to close more effectively, or selling more effectively. Those are the buckets of insights.

Speaker 3:

A couple of quick questions I'm curious about, because you mentioned you guys have been building BrightHire for what? About five years?

Speaker 2:

Yeah, yeah.

Speaker 3:

Okay, so take us back to when ChatGPT comes out, blows up, tons of users. What did you guys do? Did you immediately start looking at ways to use it? Were you already testing out similar things, and were you well aware of it before the public was? I'm just curious what that moment with gen AI has been like for BrightHire.

Speaker 2:

Great question. Yeah, it's definitely been a game changer. There's software where gen AI is like sprinkles on the cupcake, it's this feature here and that feature there. For us, it's the product. It's such a perfectly tailored fit for the type of data we gather and what we want to do with that data that it's literally throughout the entire product.

Speaker 2:

Going back, we've been using AI for five years, right? Transcription is AI. A lot of the analytics that we create would classically be data science or machine learning. So it's not like we were using no machine learning or no AI, but large language models in particular took a leap in terms of their capabilities. We were obviously aware of large language models and thinking, oh, this is pretty powerful technology, but then the new model gets released and it's a sea change from previous iterations, and there were immediately a clear five use cases we could unlock with it that would drive not nice-to-have value, but foundational, change-the-game value for our customers.

Speaker 2:

So the first was, interviewing is hard, it's distracting. I have to take notes, remember what you said, think of the next thing, and so forth. So, what if we can take your interview notes for you? That sounds small, but it's actually massive, right, because it's going to free your interviewers to focus on the candidate and be much better interviewers, we're going to increase scorecard completion rates, and we're going to have much better context. So that was thing one, but it was very evident that there was going to be a whole set of use cases we could unlock pretty quickly. And then every month the models improve, and there's a new sort of cost curve associated with the new capabilities. And yeah, it's just so core to what we do.

Speaker 1:

So I have two questions I've thought of here. We talked about trying to get very clear on role requirements up front, taking more time in that prep stage, but in a sense not taking as much time as a traditional hiring process does. It actually takes a fair amount of time to prep for a job opening the right way. When I took on my first head of talent acquisition role at a hundred-person SaaS company, I had a nine-step process of things we needed to collect before we'd even do a kickoff call with the hiring manager, and an approval process too, of course. So there are a bunch of different things, but it sounds like this is probably just as comprehensive; you could probably just do it a lot faster now. But one of the questions I have is that, nonetheless, even when you do it the right way, sometimes role requirements change, and these evaluations and summaries are based on a specific set of role requirements.

Speaker 1:

If a customer changes a role requirement, does it update all of the evaluations? Can I change it after the fact, or how does that work?

Speaker 2:

Totally, yeah. If we think about how we summarize and synthesize the context from a candidate, it's ultimately, in some sense, infinitely customizable. So take one specific interview. We can summarize it as question-answer, question-answer: I want every question we asked and a synthesis of the candidate's answer. So we do that. We also have this concept of templates, so you could say, that's great.

Speaker 2:

But I actually want you to summarize and structure this context we've learned according to the scorecard, which is actually different from question and answer, right? So take the categories and the things I need to provide feedback on as they relate to my feedback form or my scorecard, and re-synthesize that way. We can do both, and you can make your own template. And similarly, at the candidate level, yes, we're basically saying this is the most relevant context and objective evidence against this set of requirements. But if you went in and updated those requirements, we would just go back into the same interviews, find different information that's relevant for the new competencies or topics or skills you've added or removed, and re-synthesize. So yeah, that's exactly how it works.
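
The re-synthesis behavior Ben describes, re-running the same transcripts against whatever the requirements are now, can be sketched like this; `summarize_against`, the fake LLM, the transcripts, and the requirement lists are illustrative placeholders, not the product's internals:

```python
def summarize_against(transcripts: list[str], requirements: list[str], call_llm) -> dict[str, str]:
    """Produce one evidence summary per requirement from the same set of
    interview transcripts; changing the requirements only changes the prompts,
    no new interviews are needed."""
    summaries = {}
    for requirement in requirements:
        prompt = (
            f"From these interview transcripts, summarize the evidence about: {requirement}. "
            "If it was never discussed, say 'not covered'.\n\n" + "\n---\n".join(transcripts)
        )
        summaries[requirement] = call_llm(prompt)
    return summaries

def fake_llm(prompt: str) -> str:
    return "(model output)"  # stand-in for a real LLM call

transcripts = ["(recruiter screen transcript...)", "(hiring manager transcript...)"]

# Initial requirements for the role.
v1 = summarize_against(transcripts, ["Enterprise sales experience", "Quota attainment"], fake_llm)

# The role changes: no new interviews, just re-synthesize against the new criteria.
v2 = summarize_against(transcripts, ["Enterprise sales experience", "Team leadership"], fake_llm)
print(list(v2))  # ['Enterprise sales experience', 'Team leadership']
```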

Speaker 1:

Yeah, that's super helpful. And I suppose the other thing is consistency across the LLM in terms of asking the same questions, because I know it's generating a list of questions for a role, but when you're looking at all the roles across your whole customer base, of course there are best practices. We need to make sure we're asking salespeople these things every single time.

Speaker 1:

Is there training in the model to make sure that when it's collecting role requirements or creating these customized questions, maybe some of the questions aren't customized? Maybe some of them are just standard, like, are you open to hybrid, right? Do you see what I'm saying? So how does it account for that, to ensure you're providing a consistent, rigorous process and the right questions across the board for every customer? How do you do that?

Speaker 2:

Yeah, look, different customers are ultimately going to have different things they care about, or different standard questions they want to ask, or values, or what have you, so they can bring their own, which is actually pretty common among our customers, particularly customers that are scaled. If you're a multi-thousand-person or larger organization, you might have standard competencies you want to assess every hire against, standard values, what have you. So you can bring your own standards, and those standards can be, we always want an interview that covers these things with these questions, or, our job descriptions always have this language. And that's part of the product, right, that's the build. The cheap version is, I'm going to use ChatGPT, and I get what I get and I don't get upset. The expensive version that we invest in is a workflow that's really thoughtful in terms of pulling out the right information, where you're bringing your own structure to the table, and it puts that all together and actually works in the flow of work. So, to your point, yeah, it's a combination: we will generate whatever you need, so it might just be the domain-specific questions, while you have standard questions for assessing for your values or whatever. In terms of making recommendations across all of our customers for common things, that just happens by default.

Speaker 2:

If you start planning an AE role, an account executive sales role, you're likely going to get a suggestion that you want someone who's met or exceeded quota, and then there'll be a question around, what's been your track record or performance as an account executive? So those things generally do become very standardized. And then, within a company, standardization, to your point, is also super important. Once they go through this process for a role, if it's not a novel, one-off role, that becomes the standard, right? They don't have to keep doing it every time they're hiring for that role. They have that in place and they can edit it later. But very often they're just saying, we already have a standard for this role, we're hiring for the same thing, it's just in a different region, or it's a senior version of the same thing, so let's version it, or we just want to update it a little bit when they're running a new hire.

Speaker 1:

That's super helpful. But on collecting role requirements, are you also essentially helping your customer not have gaps? And we might be a little redundant here, we've covered a lot of ground, so I just want to make sure I'm slowing down. They might say, here are the preset questions, and you have the LLM generate the questions. But what if the hiring team doesn't ask, what was your quota attainment, and what if, for whatever reason, the LLM doesn't generate that question? Are there, again, safeguards in place to make sure there aren't those types of gaps?

Speaker 2:

Yeah, totally. As you go through the scoping process, one of the big values is that a lot of folks are going to come in with something in mind: okay, I'm hiring an account executive, I probably need to assess for these sorts of things. But the workflow is, let's give you a much broader menu of things you might want to consider, based on all the other things we've seen folks hire for, assess for, or ask about for a role like this. And we're just going to present that, and then you can select, yes, that's relevant, no, that's not. But we want you to not have those gaps.

Speaker 2:

So that's the workflow, and very often folks know some of the things they want to assess for, but then they'll see new ideas or suggestions as to what might be really important, and they'll say, actually, wait, that's a great point, that is really important, and oops, I don't want to miss that. Then they'll run the process, and, as I was talking about, in the candidate summary at the end of the process we want to show them: did you miss anything? Did you actually not get to something that's really important? Otherwise you never know, and there's always that conversation at the end where you're like, is there something we missed, or did we cover X? Now we actually just know. It's a visual display we can look at and say, oh, it looks like no one drilled into quota attainment. I thought you had that, I ran out of time, and, oops, that's pretty important. We never want to make a hire and not know what we don't know, and so that's the back end of it.

Speaker 1:

That's super helpful. Elijah, do you have any other product-related questions? Afterward we can transition to more of a general AI conversation.

Speaker 3:

Yeah, just one more question. So what would you say, Ben, for people who are already doing video interviewing with something like Zoom, and maybe they're using Zoom's AI transcription or a free general transcription tool: what are they really missing by not using something like BrightHire? That's the first question, BrightHire versus general summarization-type tools that join video interviews. And then the second piece is, amidst the interview intelligence space, which arguably BrightHire created but is now no longer the only one in, why would someone go with BrightHire specifically versus some of those other options? So, why BrightHire versus general solutions, and what are they missing? And then, why BrightHire versus other interview intelligence platforms? Does that make sense?

Speaker 2:

Yeah, totally. For BrightHire versus, let's use Zoom AI Companion, I don't know where to begin, but the note summary is one little part of a very broad platform we've built, which we've been talking about, which is really, how do we make people great from step one to step ten, across the process, day in, day out, in their flow of work, whether it's scoping the role, running a better interview, making a decision, calibrating, debriefing, enabling, getting insight, and improving. So it's kind of night and day. Many of the clients that we work with have Zoom AI Companion. It's turned on for the whole organization, they use it in meetings every day, and it's not even a consideration for them that it might be an alternative, because the capabilities are so different. But then also, it's not in the flow of work. If something's not in the flow of work, no one will use it. If all the interviewers have to say, I use Zoom AI Companion, then I go to Zoom's dashboard, or whatever my team's dashboard is, and I download the notes, no one will use it, and the notes themselves won't be good, accurate, and reliable.

Speaker 2:

This is hiring. It really matters that this stuff is accurate and works. So we just have a much broader set of capabilities, and we're in the flow of work. And then there's the compliance and security side: role-based access control, data deletion, GDPR compliance, so many things on the backend that enable us to work with not just little companies but massive enterprises with really important obligations to their interviewers and to their candidates as it relates to data privacy. So it's pretty night and day. And then, as it relates to us and other folks in the market, I won't talk about specific folks, but generally what we hear from prospects we talk to is, one, that we're a much broader, more robust, deeper platform. It's an end-to-end solution to run an exceptional interview process, an exceptional hiring process. It's not a note taker, and that's a lot of what we've been talking about, which is not just the breadth, end-to-end-wise, but also the depth of features, features that face the user, but also things on the backend that help recruiting ops teams run this really efficiently at scale, and so on and so forth.

Speaker 2:

The second is actually being enterprise-ready and proven. I mentioned this earlier: we have clients as big as 150,000 employees hiring with hundreds of recruiters on our platform, just from that one organization, and these are enterprise-wide deployments. When you see the logos on our site, it's not one user, it's the company, and there's a lot that goes into making those folks successful that's under the hood, right? Security, compliance, configurability, extensibility, customer success. That'd be the second thing. And then the third is that latter point, client success. Y'all have both, I'm sure, implemented a lot of recruiting technology in your lives, and the technology is part of the equation; the other part is the enablement, the support, the success. Do you have a partner that's actually going to make sure this thing gets up and running, works beautifully, solves your problems, teaches you best practices, and makes sure you're realizing value? We invest in that very early. So I'd say those are probably the three biggest areas we hear about from folks we're talking to.

Speaker 1:

Okay, cool, and we're coming up on time. So I think we can stick to a dialed-back version of section three to talk about general AI applications, and keep it more relevant to BrightHire in terms of future releases and the product roadmap. What are you thinking? What are your customers asking for next, like they wanted it yesterday? What do you think is six months out, a year out, as far out as you can think? I'd be curious to hear from you what the future looks like.

Speaker 2:

I'll stick to some near-term things, just in the interest of business strategy and secret sauce. We talked a little bit about the insights side of things, and that is just so, so important. Data is power, and for our clients, again, these are very scaled businesses in some cases, and the hiring process is going to set the stage for their success or failure, full stop. The hires you make are the team you build, and the team you build is the company you build, and that's it.

Speaker 2:

And there's obviously a lot of conversation about quality of hire, but any additional data we can bring to the table that's going to help them improve the process, have more predictive outcomes and results, and then connect what's happening in the process to outcomes and close the loop, is incredibly valuable. There's massive enterprise value associated with doing that and enabling our clients to get it right consistently at scale. So that's an area we've been investing in, and we're going to keep investing in it, because it's so important to our clients. It really gives talent acquisition teams enormous empowerment, because they can bring so much more insight to the table about a really important part of building a team, in a way that just wasn't possible before. So that's massive. We've invested a lot, and we're not stopping, because it's so important to our clients. So that's the insights side of things.

Speaker 1:

Yeah, that's great, and we're coming up on time here. I want to say, Ben, thank you so much for joining us today and sharing your insights with our community, and we'd love to have you back on the show one of these days. I'm sure Elijah and I could continue to ask you questions for the next 30 minutes or an hour, but this was, I think, a pretty good episode. We packed a lot into a short amount of time.

Speaker 2:

Yeah, hopefully it was good on your end. Happy to go deeper into stuff in the product. Sorry we didn't get a chance to talk about some of the more general AI stuff, but hopefully this was useful for y'all and interesting to hear how we're actually, practically, trying to bottle this technology up and implement it in a way that shows up and does something. Again, not a nice-to-have, but something that solves an actual problem in the daily work of our customers, because at the end of the day, that's the only thing that matters: helping them do their jobs better.

Speaker 1:

For sure. And for everybody tuning in, we're doing a whole series on AI for hiring. We're going to be bringing to the table the co-founders, CEOs, and top executives from the companies that our research shows are making waves in the space and doing really innovative things on the market to fundamentally change how companies hire.
