How Suited Leverages AI to Improve the Hiring Process

ABOUT THIS EPISODE

Traditional hiring processes often place undue emphasis on GPA and alma mater, data points that generally aren’t very predictive of future performance.

Aaron Myers, Ph.D., Chief Technology Officer, and the team at Suited have built a company around a better way to hire—using AI to measure personality traits, experiences, cognitive data, and other metadata so that companies and job candidates find the most suitable match. 

Aaron discusses how Suited works, how the company helps customers implement better hiring data practices, and what the future holds for technology like AI and ML. 

We discuss:

  • Leveraging AI to deliver better hiring outcomes
  • Onboarding, setting expectations, and defining KPIs
  • Explainable AI and the future of hiring and performance management 

Want to hear more stories from high-growth software companies? Follow Application Modernization on Apple Podcasts, Spotify, or check out our website.

Listening on a desktop & can’t see the links? Just search for Application Modernization in your favorite podcast player.

You are listening to Application Modernization, a show that spotlights the forward-thinking leaders of high-growth software companies. From scaling applications and accelerating time to market to avoiding expensive licensing costs, we discuss how you can innovate with new technology and forward-thinking processes and save some cash in the process. Let's get into it.

Thanks for listening to the Application Modernization podcast, presented by Shadow-Soft. I'm your host, Nick Marcarelli. Today we spoke to Aaron Myers. He's the CTO at Suited. Their organization focuses on helping find the right candidates for specialized firms in legal and financial services. Their platform helps dive into all the unique characteristics that make a candidate the right fit, based on metrics. So it really became a great conversation around AI and how customers can leverage technology to help them see through the biases in the hiring process. It was a great conversation. I was really excited to have it and really excited to jump into it, having hired teams for many years, so I learned a lot. Hopefully you will as well. Thanks to Red Hat for continuing to sponsor the podcast. Enjoy our conversation with Aaron.

Thanks for joining us today. How are we doing?

Yeah, thanks for having me. I'm excited to be here.

Awesome. Well, you know, we always like to start on the podcast by getting to know you a little better, so why don't you take a few minutes and tell us about you, your career, your path, and then lead into your organization?

Yeah, of course. I'll try to be somewhat brief here, because my career has been quite the winding road. I originally started out as a mechanical engineer straight out of college. That's what my degree was in and that's what I went into immediately after school, and I only spent a few years there before I started to really become interested in entrepreneurship. At that point I decided to hop to graduate business school to help fill some of the knowledge gaps I had; I wasn't quite ready to hop into entrepreneurship with only my engineering degree. However, I didn't immediately jump into startups afterwards. I went into consulting out of business school, as many of my peers did, again just to get some more experience, and, as happens to many people in this field, I got burned out on consulting pretty quickly and started to look for a different type of challenge, one that moved me closer to startups, and specifically technology startups. So my indirect path there, continuing down my winding road, was to pursue a PhD in computer science and mathematics. I got really deep into AI, I got heavily involved in the startup scene here in Austin, Texas, and it wasn't long after finishing the PhD that I started my first company around AI in hiring. I moved on from that after a few years and joined another startup, the one I'm currently at, to run their data science and machine learning efforts before ultimately taking on the CTO role here at Suited.

Gotcha. So I already have questions, which I love, when folks tell us things. You come out of engineering school, and then you stated there were some things you wanted to fill some gaps on, so going to graduate school made sense. What are some of those high-level concepts that you wanted to shore up coming out of engineering school?
Yeah, I think what triggered business school for me was that immediately after my engineering degree, I started to learn a little bit about startups, and I started to talk to some people that were doing startups or had had successful exits, and they talked about a lot of concepts that I didn't understand. They talked about, oh, you know, you've got to have a great idea, and one of the ways you know an idea is great is if it has a large market, you have a big TAM. You have to understand how to manage a product, how to make product decisions instead of being a sales-led organization. There were all these concepts and things I had never really heard about, and I thought, well, business school might be a great way to shore that up, because I was lacking at such a fundamental level of business knowledge, of how to run and build a company, that I thought there was some low-hanging fruit there: go to graduate school and learn some of those concepts, how to run a business, how to be thoughtful about sales and marketing and technology altogether. And it was a really helpful time that I spent there.

Gotcha. Now that's interesting, because it's not every day I speak to someone with a PhD. I find it interesting how people choose that path, right, because that's a huge commitment.

You know, a lot of people go and get their undergrad, and it's often in something they maybe didn't intend on using, or you're just surviving. I feel like undergrad is about surviving: can I do this, right? But then, when people make a decision to go pursue a master's or go to business school, there's usually an inflection point that drives it, or it's "I'm still seeking the path," right? So for you there was obviously an inflection point. Why a PhD in computer science and mathematics?

Yeah, there's another kind of about-face there after business school. The original path I thought I was on was, I kind of need to shore up some knowledge on the business side to make sure I can be an effective leader and entrepreneur. And then in consulting, it was largely management consulting, so it was basically helping people run their business. We went into a lot of companies that were facing bankruptcy, and it was how do we get them out of that, how do you better manage a business and turn this thing around, and that was all really helpful. But that alone wasn't really enough for me. One of the things that has always really interested me, and it's why I got an engineering degree to begin with, is that I love math, I love science, I love technology, I love being really, really close to it. So I decided, not too long after business school when I was in consulting, that the person I wanted to be in startups probably wasn't a CEO, it probably wasn't just a cofounder. I really wanted to be leading technology efforts for a startup, in particular around artificial intelligence and machine learning, because I was developing a really strong interest in that stuff at the time, and I knew those things were so cutting edge. I think you probably could be effective in a role like mine without a PhD; I think that's certainly possible, a lot of people do that, and some people are even more capable than me. That's certainly the case. But it helped. It helped me gain a level of comfort around a very new and emerging field that is very complicated and very difficult. I probably could have gotten that on my own, but getting it through a PhD really helped my confidence and my knowledge base, to where I go, I understand enough about this technology to really help lead where it's going in the future. And so it always felt to me that the PhD was the best path for me personally to get there.

Gotcha. And it gave you a bit of a framework for how to work in a space and a technology that is evolving rapidly. It's pretty fluid, and there's lots of marketing around that space, so I think there's a lot of education that goes into it for people you may hire or customers you may talk to. It's not as straightforward as I think people would think. So, okay, that's interesting. Tell us about Suited.

Great question. Generally speaking, our goal at Suited is to move the hiring process away from traditional hiring metrics, things that tend to be both quite biased and not very predictive of future performance or tenure or all the positive hiring outcomes that most companies want. We all know that today a lot of companies filter on GPA, they filter on alma mater; there are a lot of these common hiring data points that are used to make hiring decisions, and in some cases they literally are filters. So there are people who are excluded, in other words, where a hundred percent of the decision not to hire them is based on the GPA.
This is a thing that is demonstrably not super predictive, and so we wanted to shift the focus to data that is far more predictive of great hiring outcomes and doesn't cause adverse impact. The way we do that, specifically, is we measure both companies and their candidates on a wide range of personality traits, cognitive data, their experiences, and other metadata, and we combine that with artificial intelligence to help firms find the best candidates for their companies and to give candidates the best shot at getting the jobs they want. So for those candidates where it's, well, you didn't go to an Ivy League school, sorry, the odds that you get into a top firm somewhere are pretty low, we can go, well, no, now with Suited, firms are going to be looking at more relevant data and now you have a shot, because the prestige of the undergrad institution you went to, or the graduate school you went to, is no longer a heavily weighted factor. It now matters what your personality is, what your cognitive ability is, what your personal working preferences are. Those are things that tend to matter more and are predictive of performance and tenure. We certainly believe that today's hiring practices leave a lot of room for improvement in creating equal opportunity, but we also think those same practices tend to lead companies to hire suboptimal candidates because they're looking at the wrong data. So we want to correct all that.

Gotcha. So really the first step is removing bias, which seems to be what everyone wants to talk about in the data space: how do we pull the bias out of our data, or how do we model our data, right? Because I think that's the hardest part. We have preconceived ideas of what something is. You know, we're in Atlanta, so obviously we like Georgia Tech engineers. Georgia Tech is a great school. But we also have some other good schools around here that people outside of Georgia might not know about that produce great engineers, so we're constantly checking ourselves around that. Also, the whole nation knows that Georgia Tech engineers are pretty great, so when we go other places we're competing less, and I think that could be an advantage. So Suited has the customers and also the candidates. Do people put their profile into your platform, or how do you go about the matchmaking and the KPIs and things like that? How does that work?

That's a great question. Internally, we have developed a suite of assessments. Again, these measure personality traits, cognitive traits, experiences; they measure a whole host of things. We don't just give this to the candidates. Prior to even what we call launching a predictive model for a firm, we give all of these assessments to the company, to the employees that are currently at the firm. They take the same suite of assessments that candidates are required to take, and then we get performance data for those people as well, and that's where the magic comes in, right? We have very, I'll say, intimate data on a firm: who is working there, who is successful, who is not, what are the personality traits and cognitive abilities of those that are successful versus those that aren't. And just because of the depth of that data, we are able to build models that are so extremely predictive for firms that it outcompetes really any other data point you can use in a hiring process. And that's really the magic: the data that we collect from the individual firms.

Gotcha. So these firms may not know how to quantify how Sally is great, they just know Sally's great. So you guys help them quantify that with actual metrics and performance, so that they can try to hire people with similar traits or motivations or things like that?

Exactly. To be even more specific, if you say, well, Sally is great, we kind of try to figure out why she's great, and we do that through the rest of the data. We go, yeah, well, Sally and Bill and Angela are all great, and these are the things that they have in common. The reason why we use AI is what happens if you use any other, more fundamental models. If you just take the approach of, most of our high performers are, pick a personality trait, warmth, they're all high in warmth, so what we're going to do now is just give a warmth assessment to all our candidates and filter on people having a high warmth score, what that tends to do is, one, it misses the complete picture, and two, it tends to copy and paste a single type of person who's successful, because the reality is that there are probably plenty of people that could be very successful who are low in warmth, in this example here.
So what AI allows you to do is look at this data and look at all the nonlinearities, all the complexities, all the different ways people have been successful at your firm, and help you identify any way that a candidate could end up becoming a successful person. It's not so rudimentary that we just say, I just want to copy and paste Sally. It's more about, I want to find all the ways someone can be successful here and filter for that when I'm looking at a candidate, so that the composition of that person could look completely different but have similar outcomes based on however they're made up.

Exactly. That's really the key.

Yeah, I mean, that would make sense, right, because it's hard to pin down, like, we do this one thing. And you guys focus on the legal practice, if I remember correctly, so you're very specific about a certain type of customer. I could relate this to when we try to find salespeople. What's the one quality everybody thinks of when they think of sales? They think of extraversion, which is a terrible way to hire. Yes, some of the best salespeople I've ever met are highly introverted, and there are some extroverted ones that are great too, but people think salespeople, they think, you know, quite the talker, the gift of gab, and that's just not true.

Yeah, that's exactly right. Human nature is that we really want to simplify things, and we want an efficient hiring process.
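For readers who want a concrete picture of the kind of modeling Aaron is describing, here is a minimal sketch in Python with scikit-learn. It is not Suited's actual pipeline; the trait names, file names, and label column are hypothetical, and a real system would need fairness and adverse-impact validation that this toy example omits.

```python
# A minimal, illustrative sketch (not Suited's pipeline): train a model on current
# employees' assessment scores plus a performance label, then score candidates.
# The CSV files, trait columns, and "high_performer" label are all hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

employees = pd.read_csv("employee_assessments.csv")   # one row per current employee
features = ["warmth", "initiative", "abstract_reasoning", "work_pace"]
X, y = employees[features], employees["high_performer"]  # 1 = rated a high performer

# A nonlinear model can learn combinations of traits rather than a single-trait cutoff.
model = GradientBoostingClassifier(random_state=0)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
model.fit(X, y)

# Score candidates who took the same assessments.
candidates = pd.read_csv("candidate_assessments.csv")
candidates["p_success"] = model.predict_proba(candidates[features])[:, 1]
print(candidates.sort_values("p_success", ascending=False).head())
```

The point of using a nonlinear model is the one Aaron makes: it can surface many different trait combinations that lead to success, instead of filtering everyone on one score like warmth.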

And so it's convenient to pick single traits, especially ones that are intuitive to us, like extraversion and introversion. But yeah, you're right, it doesn't explain the whole story, and it really leaves you handicapped at finding the best possible candidate. You are filtering out some great talent, and probably including some not-so-great talent, by filtering on these, I'll refer to them as somewhat arbitrary or stereotypical, data points.

Right. So you described warmth. Is that an aspect, well, I guess that would be a high-level aspect, that somebody might consider when they're doing an assessment? You guys obviously have a different viewpoint, like warmth is fine, but if we double-click down, what is a quality you would replace that with? I'm sorry, I'm kind of putting you on the spot, but I'm pretty interested in this.

That's okay. There are, I think, around a dozen personality traits we measure, and you can group and aggregate them in different ways. But warmth is one of the things we measure, and it is one of the lowest-level attributes, so there are no subcomponents to warmth; there are specific questions that measure warmth. The way our assessments work is that for any one personality trait or competency, we refer to them as competencies, though I think most people would call them personality traits, we'll have anywhere from six to fifteen questions that relate to that specific trait. So they'll be asked questions like, are you more likely to feel this way or that way in situation X? Are you more likely to do this or that? And the responses to those questions are how we tease out what level of warmth you have relative to the rest of the population. But there are no subcomponents, at least for us, underneath.

Got it. Okay, thanks for explaining that. So, we love customer stories. Obviously you don't have to name a customer, but can you give an example of an interesting customer story, maybe how you helped that organization, at a high level what you were solving for, and some of the results?

Yeah, I'll share my favorite story to date. We deal with a wide range of clients, and I can talk about what we tend to look for, but occasionally we get the client that is a little bit hesitant about implementing Suited. They're either nervous about the AI or they don't understand the assessments; it's a new way of doing things, or change is just hard sometimes. We had one client that was quite hesitant about using Suited, and they didn't really know where to put Suited in their process. We were struggling with this with them for a little while, making a lot of recommendations, and what they ultimately decided to do, which was kind of their idea, was to run an experiment. They have a really high volume of hiring every year, and so a hire here or a hire there, they felt, was riskless enough that they decided to hire a candidate that had been rejected by their interviewers. This was a candidate that applied, had been interviewed, and they had said no, but the person's Suited score was really high. She had a really high probability of success; that tends to be the context of our scores, the probability of this person being successful if they joined your firm in this role. And this person had one of the highest scores we had on the platform.
So they decided to take a chance and they hired her, and at her six-month performance review, the comments from all the partners at the firm were along the lines of, this is the best associate we've had. There was one partner that had been there for thirty years who said, this is the best associate I've ever seen at this firm, ever, in my thirty years. Comments like, I can't believe how successful she was. Comments from the interviewers who had rejected her saying, wow, I can't believe how wrong I got this; I was looking at this, this, and this, but this person was phenomenal. So yeah, it remains one of the most powerful stories we have, and in the end it gave them quite a bit of trust in our data, and now the Suited data, the predictions, are a necessary component of their hiring process, because they realized how powerful it can be.

What a good story. Yeah, I need to get a story like that: I helped somebody with a pipeline tool that changed the way they view the world.

It's also a turnaround story, because these were actually clients we were worried about, because they were so hesitant about using our product, and then they made a complete one-eighty.

Yeah, that's what's interesting when you bring new ideas into an organization. When you've been doing something for a while, you get stubborn. Sometimes you need some more data. We did a similar thing here around our hiring process. We brought in a platform, and if anything, we learned a lot about the construction of people. I would say the tool we're using is a little bit more motivations- and traits-centered.

But it was a big learning experience for our entire team. At the time, we were able to go, oh, that's why Johnny does this; Johnny's wired this way because that's his preference, and people have a vote in how they work. So there's always something to learn.

There is. We firmly believe that hiring for so long has been such a human-focused space, and we've relied so much on human judgment, and it's just been the norm for so long, that it is hard to get people to think outside of it: how can we do this a bit better, how can we be more thoughtful, what really makes people successful and what doesn't? And the understanding of that is a fundamentally very complex question that requires some very nuanced answers.

So let me ask another random thing, sorry, this wasn't part of the original plan: how are you guys approaching the generational gap in hiring? I'll provide some context. At some point I realized I'm old, and the people I've been interviewing and hiring are twenty years younger than me and I don't understand them. Is there anything that you do in your platform, or any knobs that you tweak, that help with that?

Not directly aimed at that problem, but one of the things we do here is we don't just provide predictions about how likely someone is to be successful, we also provide context for those predictions, because again, it's a very human-centric process and field and industry. So we tell people why a particular model thinks someone is likely to be successful or not at your firm in a particular role, and that context really helps you understand who that person is. A lot of the context is like their warmth score: this person is more likely to exhibit these kinds of behaviors, these kinds of preferences. What that really allows companies to do is take that data and go, okay, knowing this, I'm going to change the way I onboard this person, and knowing this, I'm going to change the person internally who's going to be their mentor, because there seems to be alignment there. So they can use that data and go, I don't necessarily understand the younger generations, but I have enough guidance and information on who they are and what they care about that I can adjust my hiring and onboarding and engagement strategies to accommodate them. Even if it's something they don't feel personally, or don't feel necessarily aligned with themselves, they can at least understand the people that are coming into their organizations.

Gotcha. Now that's insightful. I think there was a point where I was interviewing someone, and I did hire this person and she's been amazing, but she didn't really know what The Fresh Prince of Bel-Air was, and that was a tough day for me. It felt like, my life is over. But there's a generational gap that we're starting to see in the workforce. Folks are retiring and young people are coming in in droves, and we're a technology-focused society, so it really lends itself towards young people, and there are some cultural aspects to that that people in the middle, like me, are constantly adjusting to. So I think an important part of hiring is understanding and catering to your employees and what they care about and how they can be successful.
It's definitely becoming more and more important. That's certainly the case. Especially as we work with law firms, and we also work with investment banks: if you rise to the top of these firms, they're definitely wealth-building opportunities, and I think a lot of these firms just relied on that carrot as how you keep people around. It's like, you're going to have these wealth-building opportunities. What they're realizing is that the needs are shifting, right? Having a large salary isn't quite enough for folks if they're miserable or don't have the personal lives they want. They care more about that now. And so, yeah, a lot of these firms are having to adjust the way they approach engaging and retaining as a lot of these young people come into their organizations.

Yeah, it's funny, money used to solve everything, right? Like, you want to make a million bucks? This is what you do. And, probably for the better, I'd say the couple of generations below me either value mobility, or they value experience, or they value mission, which kind of flies in the face of "money will solve it." Money doesn't solve it if you value mission. So yeah, it's kind of funny.

Yeah, that pressure definitely exists, and I think it's only building.

And the companies that react faster are going to start to win a lot of talent, I think.

Yeah. Great. So when you engage with a customer, the customer needs to have a feel for what kind of KPIs would make sense, right? There's some type of extraction process that your team probably goes through as they onboard a customer. Can you talk about that at a high level, like what that experience looks like, maybe on the best and worst ends of the spectrum?

Yeah, let me start with identifying, when we're in the sales process, how we know if this is a firm that is going to fit with us really well, because that's kind of where the onboarding really starts, in the sales process, almost.

Right, so an ideal client profile, basically.

Right, and it's really important, because this is a new field, it's emerging and growing, and it's changing the way something has been done for a very long time, so we tend to look for a particular shape. What we look for in firms, when we're on calls with them in the sales process, is whether a lot of the people involved are what we call, we use the term, hiring evangelists. These are people that believe there is a better way to hire. They have seen the data on how unpredictive and biased most recruiting data is, and when they see the hard science behind what we do and how hard we work to build models that are really predictive of performance and are unbiased, they realize how compelling it is, and they are likely to become our champions inside the company. These are also people that tend to be very dissatisfied with their hiring outcomes. So these are people that will say things like, people aren't staying long enough, how do I get them to stay longer, or how do I identify earlier the people that are going to stay? We struggle to compete for top talent. Our hiring classes aren't diverse enough and we're struggling to get them to be more diverse. They usually come to us with these kinds of issues. Contrary to that, if someone says things like, we just want to make the process more efficient, or, we think we're good at this but we're always looking for ways to be better, those kinds of comments are signs that they might not fully adopt our perspective or fully adopt our software. So we tend to ask, in those early calls, do we have a group of hiring evangelists or do we have a group of non-evangelists? And that really dictates how well the onboarding goes, how good a client they will be, and ultimately, what we always care about, it's predictive of how much value we're going to end up adding to that organization. So that tends to be what we look for. As for the way we identify and go find these people and attract them to us, there are a lot of ways we do that, but one is by connecting with companies that attend conferences around improving hiring outcomes. In particular, for law, there's an organization called NALP that does a lot of law firm hiring work, and they put on conferences; we go to them and find people that fit that hiring evangelist profile. We constantly run webinars on better hiring, and those hiring evangelists tend to be attracted to those kinds of webinars. But I think the biggest thing so far is word of mouth. Hiring evangelists tend to network with other hiring evangelists, and they spread the word about what we're trying to do and how great we are.
As far as the onboarding process, it is pretty heavy, because we get such a depth of data on a particular firm. It typically takes anywhere from two to six weeks just to execute the data collection process. All the while, we're talking to them about how to use Suited, how to engage with Suited, and what metrics and KPIs we'll look at to find out whether we've been successful at the end. Then it's another few weeks on top of that to build and launch a model. But typically, in about eight weeks, we can go from someone signing a contract to being able to predict on their candidates. On the KPIs, we try to move things like, what percentage of your people are getting high performance scores? How long are the people that you've hired through Suited staying at your firm? It depends on the firm, but we tend to target things like that.

Gotcha. So performance management, them having a framework around that, is pretty important for a prospective customer, I would imagine, because that's probably what drives some of the assessment work you do on the front end to help build them the right model.

Yes. We do typically look for companies that already have a pretty strong view of performance management; they're already doing it for the employees in their firm and can give us that data. We will still work with firms that don't have it, but we want them to do it, so there are some firms where the onboarding process is a bit longer if they don't already have that data. We give recommendations on, here's how to go about identifying performance, here's the best way to do that, to give us the most holistic and objective data you can.

So a client not having readily available performance data isn't necessarily a deal breaker. If they are open to us prescribing a way to do it, we will still engage with them. But that data is very critical, so if it's not done well, it can cause a lot of problems.

Gotcha. Okay, so shifting to the technology side of the equation, and this probably has more to do with my lack of knowledge of the space: when I hear people talk about machine learning and AI, I giggle a little bit, because I feel like those are just terms we throw around to mean "automatic." Nothing in technology is automatic; as much as we'd like to tell customers, yeah, this will change the way you do everything, it doesn't. So since you've spent a lot of time on this, you've done formal study around it, and this is your business, what would you tell the common consumer, or a not-super-technical person like me, what should they know about ML and AI?

That's a good question. I think there is a lot of excitement around the terms; they're very buzzworthy. There are a lot of terms that are used interchangeably that shouldn't be, and I certainly have personal opinions on the language here. But I think the important thing, when you're thinking about AI, is that you really have to have the problem statement in mind. What are you trying to do with AI, and why is AI the best tool? So that you don't fall into the trap of, we just want to say we use AI, or we want to sound fancy, we want to sound cutting edge, so we're going to use those terms. I think there's a lot of artificial intelligence and machine learning being used where basic analytics would be fine, basic data science would be fine, or even no data science at all. I do think it gets overused. I tend to draw a line between AI and ML: when you get into neural networks specifically, you're talking more about artificial intelligence; anything short of that is what I call machine learning. That's a very personal take, and for most people the difference is largely irrelevant. The more important thing as a business leader, especially a non-technology business leader, is to think about what problem you need to solve, and then get tech people in the room to answer the question: is this specific technology the right solution to this problem? Because I think the answer is often that it's not the right answer. It might be strange for an AI enthusiast like myself to say that, but I think it is overused. Another big piece of this that I think doesn't get enough focus is that it's more about data than it is about AI and machine learning. A lot of this technology is being open sourced. There's scikit-learn and TensorFlow, and there are so many open source packages, I've just mentioned two of them, that will help you build machine learning models fairly easily. You don't need a PhD in computer science to be able to build an AI model today; that is certainly the case. So what is also important, as a non-technology business leader of any kind, is to not just focus on the problem you're trying to solve.
If AI ends up being the right answer, then the bigger question is not necessarily how do I go build AI. I think the more important question is, how do you go collect the data you need to build strong AI? Because once that question is answered, the building-the-AI part is, I don't want to eat my words later, but relatively simple and straightforward compared to getting the right data and getting it clean and robust. And that, in my mind, has been a harder problem for Suited, and I think it's a harder problem for most people trying to build AI, than the actual model building itself or serving models up into production on a platform.

That's an interesting perspective, because I've never heard anyone say that, and it kind of makes sense. Or maybe the two just aren't vocalized in the same context: you'll have some people going, I need to aggregate my data, I need to make my data available, and they're not the same people building machine learning, necessarily, right? It's a very operational thing versus a very science-and-math thing. And those groups tend to be siloed in the same way that development and operations are siloed, so we see the same problem in a different technology space. That's my hypothesis, anyway.

Yeah, that makes sense.

Well, that's really helpful. I think people need more perspective on the buzzwords that we all throw around and say.
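To make Aaron's point concrete, here is a rough sketch of what "the model is the easy part" tends to look like in practice with scikit-learn, which he mentions above. The file names, column names, and rating cutoff are hypothetical; the point is only the ratio of data work to modeling code, not a recommended pipeline.

```python
# Illustrating the claim that the hard part is the data, not the model.
# Everything above the "model" comment is data work; the ML itself is a few lines.
# File names, column names, and the rating cutoff are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# --- the bulk of the effort: collecting, joining, and cleaning the data ---
assessments = pd.read_csv("assessments.csv")            # one row per employee
reviews = pd.read_csv("performance_reviews.csv")        # possibly messy ratings
df = assessments.merge(reviews, on="employee_id", how="inner")
df = df.dropna(subset=["rating"])                        # drop unrated employees
df["high_performer"] = (df["rating"] >= 4).astype(int)   # hypothetical cutoff
feature_cols = [c for c in df.columns if c.startswith("trait_")]
X, y = df[feature_cols], df["high_performer"]

# --- the "AI" part: with an open-source library, a couple of lines ---
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```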

So if we're looking around the corner, where do you see your space heading? What would you say the next frontier is around assessing and stacking up the competencies of people, and performance management, and all those components? It's a very complicated business. But when you talk to someone who does, say, virtual meetings: I had someone on the podcast a few weeks ago and they do translation services via Zoom, which was pretty important when the pandemic hit. I asked the question around natural language processing, at what point do the robots take over, and he said, we're a long way from that; it's a cool idea, it's something we're all working on, but it's not very good yet. What do you see, maybe way far out there around the corner, that could be something to aim for, but we've got some time?

Yeah, I tend to agree that the AI-robots-taking-over scenario is quite far away, if it ever happens. Our tool, today, is built around helping recruiters and hiring managers do what they do. We have no intention of replacing anyone. It's a relevant fear, I get why it scares people, people like job security, but it's not coming anytime soon, especially not for complex tasks like hiring, like evaluating humans for jobs. But I think eventually the world will shift towards recognizing the things that are actually important to career success and happiness and fulfillment. I think the data is clear, and truth always wins in the long run. So I think we'll move away from the current form of traditional resumes and application forms, and we're more likely to all have individual profiles that capture all this relevant information and help guide and give a recommendation to both companies and candidates on where they're most likely to be successful and happy and fulfilled. Sometimes I think of it as the Suited version of LinkedIn. LinkedIn is basically digitizing resumes, somewhat, right? But I think Suited wants to really change the nature of that resume in a big way. It might include your personality data, your working-style data, your collaboration preferences; it might also allow you to share relevant but maybe more personal data like cognitive ability. I think that's the way the world is going to head. A lot of this can also be done the wrong way, though. There's plenty of, I'll call it fear mongering, out there, and there are plenty of people that reasonably call out valid concerns. There's a book called Weapons of Math Destruction, and there are independent films and studies and documentaries that have been done on personality assessments and AI. If we take all that with a grain of salt, the point is to understand that this can also be done poorly. You can use AI in hiring in a way that ingrains biases instead of helping eliminate them. You can use assessments that don't really capture relevant information for your firm or your industry or the role you're hiring for. So there are a lot of ways to do really bad science here and to do it really poorly. I don't want to say this is all straightforward and easy. There are going to be hurdles, and there are definitely landmines in this space, and you have to be careful and work with vendors that know what they're doing.
But I do think we are going to trend towards making it much easier to hire on the things that are relevant to hiring. Easy data points like GPA and school are nice, they're convenient because they're easy to filter on and they've been around for a while, but we know they're not that predictive. They should be used in the process only in accordance with how predictive they are, and they are predictive of a very small amount relative to this other data. So use them in proportion to how much they predict, right? If GPA predicts three percent of performance, use it as three percent of your hiring decision, not as a hundred percent. So it's more about adjusting to really start using the data that is predictive, and then I think the natural consequence is that people become more aware, and that data becomes more visible and used more in hiring processes.

Gotcha. So, and this doesn't have to be specific to what you're doing day to day, but as a technologist, what's something you would like to see in the future, something far out where you'd go, that'd be really cool, I hope that happens?

Something far out. One of the things I keep my eye on, because again, back to us being in a very human-centric space, one that's been historically very human-centric, is explainable AI. The advances in the complexity of AI have been astounding.

But working in the business we're in now, and I know you said it could be outside of where I am now, that's naturally where I go, because I'm very excited about this technology.

It doesn't have to be, but yeah, this is a new term for me. So, explainable AI, explain that again?

Yeah, explainable AI; the acronym is XAI for short. We're in a space where previously decisions were made by human judgment and human relationships, and it is hard for people to trust AI enough to use it in their process. So we spend a good amount of time researching not only how to make the AI better, but how to make it easier to understand for someone who's not a technologist or who has some concern or fear about AI. We do a lot of work on explainable AI so that we can present things like I talked about earlier: not only, here is the prediction of how likely Jane Smith is to be successful at your firm, but here is why this model made this specific prediction, in human terms, in psychology terms and cognitive-ability terms that people can understand. And it's really hard, especially as you get into neural networks; those things become so nonlinear and so complex that extracting why the model made this prediction on this specific data point is very, very hard to tease out. It is, in my mind, one of the cutting-edge areas. It's getting better and better every day, and we do research on it. We have a team of internal PhDs who research this; it's where they spend the most time focusing. So we're not only excited about the future here, but hoping that Suited itself can push this boundary forward a bit.

I have a follow-up on that. In this space, if, speaking in the highest-level terms, the math happens and the math identifies this outcome, wouldn't someone have to translate what that outcome is into words? Is that what you're talking about? Like, the math piece will then translate into the language that we'd use to describe personality traits or motivations or performance metrics? So wouldn't someone have to associate a value with a word, or is that part of the magic? Not to take the magic out of it.

It is part of the magic, yeah. We end up doing more math to translate things into words, but we do do that. We basically say, this person has, whatever it might be, an eighty-six percent probability of being a high performer at your firm. What it'll say below that is something like, this person is likely to take initiative on tasks without anyone's direction, and the culture of your firm shows that those who take their own direction, instead of waiting for direction, tend to be very successful. So you get that context of, oh, here's why this person aligns with your current organization and how it operates, in more intuitive terms. People go, oh yeah, I understand that, and that resonates. Or the opposite could happen: hey, this person is rated highly because you have a firm that doesn't allow people to really go wild with their own ideas; it's very power- or hierarchy-oriented, so people wait for direction from those above them, and that's what makes people successful at your firm, because that's your culture. So depending on where they're at on the spectrum of that particular trait, it will have that context in there.

Gotcha.
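For the curious, here is one toy way an explanation layer like the one Aaron describes can work, using a plain logistic regression so the per-trait contributions are easy to read off. This is only an illustrative sketch under simplifying assumptions, not Suited's method (their models and XAI research are far more sophisticated); the trait names, sentence templates, and synthetic training data are hypothetical.

```python
# A toy explanation layer: turn a linear model's per-feature contributions into
# plain-language statements. Illustrative only; real XAI work on nonlinear models
# is much harder than this. Trait names and templates are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

traits = ["initiative", "warmth", "abstract_reasoning"]
templates = {
    "initiative": "tends to take initiative on tasks without direction",
    "warmth": "tends to build warm relationships with colleagues",
    "abstract_reasoning": "tends to pick up abstract problems quickly",
}

# Train on synthetic data standing in for employee assessments + performance labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.2, 0.1, 0.8]) + rng.normal(scale=0.5, size=200) > 0).astype(int)
model = LogisticRegression().fit(X, y)

def explain(candidate):
    proba = model.predict_proba([candidate])[0, 1]
    # For a linear model, coefficient * (value - mean) is each trait's contribution.
    contributions = model.coef_[0] * (candidate - X.mean(axis=0))
    top = traits[int(np.argmax(contributions))]
    return (f"{proba:.0%} probability of being a high performer; "
            f"largest positive factor: this person {templates[top]}.")

print(explain(np.array([1.5, -0.2, 0.3])))
```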
Okay, yeah, I was just thinking, there could be an infinite number of words or phrases or descriptors. Like, what if you get this crazy profile of a person that you've never seen before, and it just kind of goes off the rails a little bit? That would be really exciting.

It would be. We tend to monitor those kinds of cases. They do happen, and we monitor how well that person did in the hiring process: did this person get an offer, did they do well? And then we can track it. Okay, this person is now part of our data set, they did well or they didn't, and so now we've expanded the boundaries of what we can look at, because we got this unusual personality that came through, we've figured out whether they were successful or not, and we've adjusted the model accordingly. That's the other thing: we don't ship a model and go, we're done, you've got a model that predicts performance, bye, see you never. It gets updated every time they give us new data. After a hiring cycle, we retrain the model and ship a new one. It's something that, by and large, gets better and better and smarter over time.

Gotcha. All right, so last question. What is some advice you'd share with an aspiring technologist? Anything you want, something you've learned along the way.

I'm trying to come up with something unique that I haven't given before but that I think is important. Something that falls into that category for me personally is to not forget the soft skills of being a technologist. One of the things I see all the time at technology meetups, or even at CTO conferences I go to, is an extreme focus on what's the next greatest technology.

At the last CTO conference I went to, I lost count of how many times I was asked, what percentage of your time are you still hands-on-keyboard, which is another way of asking how much time you still spend writing code. And it was implied that the higher the percentage, the better you were as a CTO, the better you were as a technologist: you had fought off the urge to switch into management. It was very much that kind of environment. And I don't hear enough about the softer sides of being a technologist, or even being an engineer, or even a CTO. Very rarely did I hear questions like, what training have you done recently on how to collaborate better with non-technologists? What have you done to understand your own working preferences and to share those with the team you collaborate with most? How can you help elevate junior engineers across the very difficult threshold of becoming a senior engineer, of being able to think about architecture and design? Those topics are present, but I don't see them often enough. And I think the best engineers I've seen in my career, and where I have gained a lot of success in my own career, have been that way because they were able to pick up on some of those soft skills, maybe faster than others, or put more importance on them than others. It's not to discount the technology stuff, but the pressure to learn the most advanced technology and stay up to date is already there; I don't need to add pressure there. But learning the soft skills, I think people, especially engineers, might not realize how much growth that can unlock for their career. Even if they just want to rise through the ranks as an individual contributor, just want to be, say, a staff engineer, and that's the ceiling, even in those positions, learning soft skills is exceptionally helpful in allowing you to be successful and to get there faster too.

That's a great point. As people walk down the path of their career, you can build a bunch of great stuff, but those projects, those lines of code, those databases filled with stuff, don't have mouths. The people you work with, if you're a joy to work with, will talk about you. They'll create and open more doors for you. So I think that's often the soft-skills thing on the engineering side. We talk to our team about it all the time. You don't have to be an extrovert, you don't have to be a sales guy, but talking to people is good, and understanding the health of their cat is fine, and just getting to know people, just like you would in your own friend circle. You actually spend more time with people at work than you usually do with your family and your friends.

Yeah, so emphasis on communicating.

Yeah, exactly, and it helps a lot, especially if you can just learn some self-awareness too and write it down. I have a lot of working-style preferences. For example, when people put an idea in front of me, my first instinct is to attack it: let me tell you why this is wrong, let me tell you how this could go wrong, why it doesn't make sense. And that can very much come off as a personal attack sometimes; it can cause a lot of friction in a room.
But if you're aware of it, you can go, hey, this is how I tend to behave; let me know if that doesn't work for you, or just be prepared for it. There's someone I work with now who struggles with that part of my personality, and one of the things she has asked of me, because I made this known, is, maybe give me a compliment before you go down that path. So I'll go, this work is great, I'm glad you put this together, this is really helpful, and then we're unblocked and I can go be me again. Just having that awareness allows us to build bridges and relationships with people where we otherwise might not have had a good relationship, because there was friction or problems or collaboration difficulties. A little bit of self-awareness, and then communicating that self-awareness, can go a long way.

That would apply well to a spouse, too. For all the single people out there: lead with a compliment before criticism. It goes a long way. That's awesome. Well, Aaron, thanks so much for your time. I know you're very busy, but this has, I think, been a really interesting conversation and I learned a bunch, so I know listeners will learn a ton as well. So thanks a lot.

Well, thanks for having me. It's been a pleasure to be here. I love talking about this stuff, so yeah, happy to do it again. I certainly enjoyed this; the time really flew by.

Yeah, it did. Awesome. Thanks, sir. Appreciate it.

Application Modernization is sponsored by Red Hat, the world's leading provider of enterprise open source solutions, including high-performing Linux, cloud, container, and Kubernetes technologies. Thanks for listening to Application Modernization, a podcast for high-growth software companies. Don't forget to subscribe to the show on your favorite podcast player so you never miss an episode, and if you use Apple Podcasts, do us a favor and leave a quick rating by tapping the stars. Join us on the next episode to learn more about modernizing your infrastructure and applications for growth.

Until next time.
