Speaker 1: What we're going to talk about here is a series of tests that we ran across hundreds and hundreds of landing pages. In 2020, we did a radical redesign of our landing page layout. This ties in a little with what Dr. Pond was talking about. He was talking about radical change at a larger, organizational scale, [00:00:30] but we're talking about it at the landing page level. Just to make sure everyone is on the same page, we're going to do a little catch-up on the difference between incremental testing, what we would call A/B testing, where you're testing a single element on a landing page. So, say, does this headline test better than that headline? Another example might be something as simple as the color of a button, that type of thing. Speaker 1: So this is just a [00:01:00] silly, non-landing-page example of what we mean by testing a single element on a page. We're looking at this versus — Tammy, if you want to switch to the next slide — this. A little aesthetic change. Those are the types of changes we mean when we're talking about A/B testing. In order to get any kind of information from an A/B test, you do want to keep it pretty simple, so it's not convoluted, so you know it was this button color [00:01:30] versus that button color. That's a simple A/B test. When we're talking about a radical redesign, that's when you do a complete overhaul of your copy and design. We can go ahead and switch to the next slide. As an example, that would be the difference between something like this room and something like this room. So, Tammy, can you tell us a little about the value of each of those types of testing and why [00:02:00] you'd want to do a radical redesign? Speaker 2: Yeah. A/B testing, as you mentioned, is more of an iterative process. You want to know if one headline works better than another headline, and then that becomes a practice. Then you might add on to that: okay, does this call to action work better than that call to action? You define a winner and you add it to the past results, so it's a cumulative process. With a radical redesign, you look [00:02:30] more holistically at everything: what on this page, as a complete experience, could be improved? Quite often it's a design change, or how you're layering the copy, the story you're using in the narrative, that kind of thing, and you go ahead and change everything. The biggest benefit, in our experience, is that when you do a radical redesign you get a much bigger lift than you would on an iterative, [00:03:00] single-element A/B test. On a single element you might average 10%, and more often than not it's neutral on a one-to-one comparison. With radical redesigns, more often than not you're looking at 25, 30, 40, 50, even 60%, depending on what you're testing against. So yes, it's a lot more work and a lot more thought, but the potential gains are greater.
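To make the incremental side of this concrete, here is a minimal sketch of the arithmetic behind a simple A/B comparison: conversion rate, relative lift, and a rough two-proportion z-test. The variable names, sample numbers, and significance threshold below are illustrative assumptions, not the team's actual data or tooling.

// Minimal A/B comparison sketch: conversion rates, relative lift,
// and a pooled two-proportion z-test. Illustrative numbers only.

interface Variant {
  name: string;
  visitors: number;
  conversions: number;
}

function conversionRate(v: Variant): number {
  return v.conversions / v.visitors;
}

function relativeLift(control: Variant, variation: Variant): number {
  const a = conversionRate(control);
  const b = conversionRate(variation);
  return (b - a) / a; // e.g. 0.30 means a 30% lift over control
}

// Two-proportion z-test statistic; |z| > 1.96 is roughly the 95%
// confidence threshold for a two-sided test.
function zScore(control: Variant, variation: Variant): number {
  const p1 = conversionRate(control);
  const p2 = conversionRate(variation);
  const pooled =
    (control.conversions + variation.conversions) /
    (control.visitors + variation.visitors);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / variation.visitors)
  );
  return (p2 - p1) / se;
}

// Hypothetical numbers, purely for illustration.
const control = { name: "control", visitors: 5000, conversions: 400 };
const variation = { name: "radical-redesign", visitors: 5000, conversions: 520 };

console.log(relativeLift(control, variation)); // ~0.30, i.e. about a 30% lift
console.log(zScore(control, variation));       // well above 1.96 at these volumes

The same arithmetic applies whether the variation is a single changed element or a full radical redesign; what differs, as discussed above, is how large a lift you can reasonably expect and how much you learn about why it happened.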
Speaker 1: And would you say, in terms of order of operations, [00:03:30] once we've done a radical redesign and you get a big gain like that — or, as we might talk about later, a big loss — any kind of large result leads to what becomes your new leading model. From there, that's when you might want to do some refined testing. Once you've got that larger lift, that's when you'd go in and look at changing a headline or changing a [00:04:00] form, doing that type of testing after you've made the big change. Speaker 2: Yeah, exactly. Now that you've got a result that looks promising from a radical redesign, you can start to get curious: was it this element we changed that contributed more, or was it that element? You can start picking it apart, working backwards. We find it's usually a lot more effective and a lot more insightful that way. Speaker 1: Interesting. Okay. Again, this is just to set the framework for the test we're telling you about. So [00:04:30] in 2020, we did a massive redesign. We tested it on a couple of clients and a couple of pages to start out, and when we saw that we had big gains, we rolled it out across hundreds and hundreds of landing pages. If you want to switch the slide, Tammy, we'll show you an example. This is just a small screenshot of a fraction of both of those pages, to show you what we're talking about. Even though we're using the word radical, at a glance it might not seem like we reinvented the wheel. A lot of our best [00:05:00] practices are still in place, but we did implement some really different design changes. Speaker 1: And we implemented some very different approaches to copy, all based on our best practices and the learnings we'd gathered over the years. We had this massive redesign, and across the board — again, across hundreds and hundreds of pages — we saw an increase in conversion of around [00:05:30] 30% on average. We also saw a decrease in cost per lead. So the point of that story is that we do great landing pages and it won across the board. Isn't that right, Tammy? Speaker 2: Wouldn't that be nice? Speaker 1: Is that not true? It didn't? Speaker 2: No. Speaker 1: Darn. I thought we were done the talk. Speaker 2: I know. Speaker 1: Okay. So it won in the majority of cases. Is that fair to say? Speaker 2: Yes. [00:06:00] Yeah. Speaker 1: And we had a couple of pesky outliers. Okay. So we did this massive redesign, we see this incredible lift, 30% across the board, except some pages don't. And even though we're looking across clients, across hundreds of pages, we do see some commonality in the pages that don't get a lift. Is that right? Speaker 2: Yeah. We started to notice patterns around specific programs, regardless of the market. We found that really [00:06:30] quite interesting, and they were more consistent on what we deemed the problem pages — the problem programs.
Speaker 1: So again, I'm guessing we might have a bit of a shy group on this call, but I do want to put it out there. I'd be interested to know if there are any guesses as to which programs we were having problems with. Any guesses? You can put it in the chat, or you're welcome to just turn on your mic and call it out. Of course, it's up to you if you don't [00:07:00] want to do either of those things, but I wanted to give the opportunity. I would be super interested to know which programs everyone else out there running schools suspects, or maybe the ones they're having problems with. Anyone? Accounting — I heard accounting. Did someone say that? It doesn't look like we want to participate that way, which is totally fine. Yeah, accounting is one; accounting was one of the problem areas. We also saw it in medical office careers and [00:07:30] in beauty. So again, we're talking dozens and dozens of other programs that all saw a successful lift from this same design, but now we have an area to focus on. So, Tammy, what was our first plan of attack when we identified these programs? Speaker 2: Well, I think we wanted to reassess those pages and how the content compared — the control versus our new variations. So we did a deep dive on the [00:08:00] control pages, and I think we determined that maybe they were a bit more school-focused, which usually goes against what we normally do in our marketing. So we decided to do some rewriting for those pages. Speaker 1: Yeah. So we have these three program areas: accounting, medical office, and beauty careers. And of course they all come to me and they say, Katie Ellen, you are our senior copywriter, you've been writing in the EDU market for [00:08:30] 15 years, would you take a crack at these pages? And I said, of course, how could I turn down the challenge? So I rewrote these landing pages, and then they still lost. Speaker 2: Do you believe it? Speaker 1: I know. It's not entirely true — I was able to get a win on accounting — but we were still struggling. We still had trouble with medical office, and we still had trouble with beauty. So we had done one round of testing. Now [00:09:00] we'd done a radical redesign, and then we rewrote the pages. Years ago, when we were little baby testers, when we were just starting out in testing, if we tested a page, we had our control — like what you're looking at on your screen now — and we had our variation. If our variation lost, that was the end of that test. Look, we tried two things, the control was better, the control is the page, and that's where it would end. [00:09:30] So previously, years ago, had we run these tests, Speaker 1: that would have been the end of the line for us, and we would have said, hey, we tried, we ran a test, now we know our control is the good one, and we would have stopped there. But now I think we're a little bit addicted to this testing. We're too inquisitive now; we cannot stop ourselves. And it makes me super excited, because we have further to go on these tests. I can't rest on medical office [00:10:00] and beauty. Why medical office? Why beauty?
What is it about these program areas that makes them perform so differently from these hundreds of other pages? And the reality is, we didn't know why. We didn't know at the time. But you know who does know what their motivations are for filling out a form? The actual prospects. A prospect comes to that page and fills out that [00:10:30] form. Speaker 1: They want information, and they know where they're coming from. They know if they want information about schools or if they want information about careers. They know what their motivations are. So we wanted to come up with a way to just ask them, and that's exactly what we did. For those of you who were on Shane's talk earlier today, he talked a little bit about this. Tammy, if you want to advance the slide — actually, we can go back [00:11:00] one slide. This is just a graphic representation of our best practice form, because what we implemented was a two-step form, and I wanted to give everyone a picture of what the single-step form looks like first. So this is a pretty standard form — I mean, it is our best practice form — but it's a single-step form, meaning that when a prospect fills it out, that's the end of their form journey. They've submitted their information, the school has that as a complete lead, [00:11:30] and that's the end of the form process. But for these program area pages that we were having trouble with, we implemented what we call a super form. It's a two-step form. What happens in that case is that a prospect fills out this form, and when they hit submit — yeah, Tammy, you can go ahead and move to the next slide — Speaker 1: they get some follow-up questions. Whether or not the [00:12:00] prospect fills out the rest of these questions, it's still a complete lead. Their lead has been submitted. We have their name, their contact information, everything you would get from your regular form submission. But we put these other questions in front of them, and we asked them, hey, what made you fill out this form? You can see some of the questions we're asking — describe what prompted you to reach out today — and we asked them [00:12:30] whether they were more in the mindset of "I don't really know what career I'm after," or whether they know what career they're after and are looking more at the school specifics. So you can see this is a two-step form with some conditional logic applied. This is the first question we'd ask, and then, depending on whether they identified what prompted them as school or career, we would ask a follow-up question. So yeah, Tammy, you can go ahead and move the slide. Speaker 2: [00:13:00] Oh, geez. Sorry. Speaker 1: No problem. So you can see here, those were some follow-up, drill-down questions for whether they were motivated by school or motivated by career. Tammy, do you want to speak to some of the results we had from that? Speaker 2: Yeah. With our medical office prospects, it was 50-50: 50% of the people were actually ready to pursue the career — they knew enough about the career.
[00:13:30] They're just taking the next step — and 50% were still deciding on a career path. Whereas the cosmetology prospects mostly already knew they were interested in cosmetology — they were sort of bought in — and 30% were still exploring career options. Speaker 1: Right. So now we're armed with this information going back to our pages. Let's say, for example, for these beauty prospects, 70% of them already know they want to work in a beauty career. Typically, when I'm going [00:14:00] to write a page, I would not have anticipated 70%; I wouldn't have known it was that high. A lot of the copy at the top of the page might still have been selling them on getting into beauty — talking about the advantages of working in that field, about how if you're this type of person, you'll be well suited to this career. In this case, I know from this data that 70% of the people converting on this page are already bought in. They've been [00:14:30] in the beauty industry in their minds for years already. Speaker 1: They just need to know what the experience of going to school is like, and what that tells me as a copywriter is how to weight that information on the landing page. Conversely, on medical office, knowing that it's 50-50 also informs the way I'm going to write it, because I need to balance both of those motivations. We don't want to overload the page with information exclusively about the school, because we still need to be appealing [00:15:00] to the folks who are unsure, who need to know a little bit about the career. It can also tell us where to focus — we could even do two landing pages and split that information up, that type of thing. And this is information we got directly from the prospects themselves. It's a whole other talk to go through the statistical analysis of how this two-step form performs, but what I can tell you anecdotally is that we were [00:15:30] floored by how many people would fill out more information after the form. Would you say that's true, Tammy? Speaker 2: Yes. Yeah. We expected a relatively low percentage. I don't have solid stats, but quite a few people gave us information in detail. Speaker 1: Yeah. I would say that from the time we started this work, from the time virtual advisor was developed, I have been wrong almost 100% of the [00:16:00] time. When we first implemented it — if there's anyone on the call who knows our career training readiness pathway — I was sure no one would fill out all of those questions, and I have been proved wrong again and again. And with these two-step forms, whether or not someone gives you further information on the second step, you still have their full, complete lead. And [00:16:30] what's so interesting about this two-step is that you have this person at their highest point of motivation. They are completely invested; they've already given you their phone number. That's a much bigger ask than asking them, hey, why did you reach out today, or what are you looking for from the school? Speaker 1: And it also personalizes the experience.
So they know it's very different from filling out a contact form on [00:17:00] a lead vendor's site or some general university or college portal. They know that the information they're going to get from this school is going to be tailored to them, because we've asked about it. And what this information allowed us to do — to come back to the topic here of redefining failure — it would have been very easy to just say, hey, [00:17:30] we didn't win with this copywriting, go back to the control. But the only way any test you run is actually a failure is if you fail to learn anything from it. If the new page you make doesn't beat your control, that's really not the game we're in. The game we're in is learning; [00:18:00] the game we're in is learning more about your prospects. So even if you implement a radical redesign that doesn't move the needle, that gives you information — information you can put into further design work, further copywriting, further initiatives to investigate with your prospects. Um, so Tammy, you're deeply involved in the testing side. Can you [00:18:30] speak to this evolution of how we've moved away from "oh, we won, oh, we lost" and into some of this deeper testing and redefining how we look at it? Speaker 2: Yeah, I think it piggybacks onto Shane's talk earlier. We fully, wholly, and truthfully want to understand the motivations of these prospects and have a deep understanding — cosmetology versus medical office versus accounting — because, [00:19:00] with our schools, we want to be sure that we're marketing to them in their own voice. What are the core things that are really important to the person who's going to be interested in that career? And it's hard to do that if you're just guessing. In our early days of testing, we would make an assumption, throw up a test, and it won or lost, and if it won, we would assume that was the truth of the matter. Maybe it's true, maybe it isn't. We now have the ability to engage these [00:19:30] prospects — they've already asked for information and they're willing to give more personal detail about why they're interested in this career. That helps us not only serve them in their interactions with the admissions people who reach out to them, because they're laying out everything that's important to them, but it also helps us speak to them more as individuals rather than just broadly. Yeah. Speaker 1: If you want to go ahead and move the slides, we [00:20:00] have some takeaways in terms of general testing. These are the points we just touched on. There truly is no such thing as a failed test; there is only a failure to learn from a test. And, as I mentioned, I've done a couple of these redefining failure talks outside the context of marketing, and I think it's really true across the board in your life, right?
Anytime we're getting knocked down, the only way that really [00:20:30] is a failure is if we just don't learn anything from it and continue — in my case, the same types of stubborn behaviors that led to the failure in the first place. Speaker 1: So always be asking: what does this test tell us, regardless of outcome? That also means that if a test wins, ask why it won. If you get, say, a 30% increase but you're not inquisitive about what caused it, you'll enjoy the fruits [00:21:00] of that increase for a time. But when — as always happens — trends change, when your prospect changes a little, you won't have any insight from that winning test to carry into the next changes that are coming. So anytime you finish a test, it can be incredibly helpful to dig in and see what it was about that test that caused [00:21:30] the gain. Which brings us to the third point: consider how you can apply that learning to future tests. Speaker 1: And we have a final slide on redefining failure, from an icon of our time: as long as you're learning, you're not failing. That's absolutely true of these tests. And I see Shane has turned on his camera, and I imagine he has some comments, which I do want to get to. Just before we do, I want to mention one more thing about these two-step forms. [00:22:00] Another reason that content is so valuable, or that it's such a valuable place to gather information, is that the second part of the form is presented to prospects on your thank-you page. They've filled out your form, they've reached your thank-you page — and the thank-you page of anything, whether it's a landing page, your website, or social media, anywhere you have a form, is an often wildly overlooked opportunity. Speaker 1: [00:22:30] A lot of the time, people treat the thank-you page like it's the end, when really it's the beginning. It's the absolute first step of your meaningful interaction with that prospect. A lot of CRMs — really, almost any kind of software at this point — have some type of form technology. Even if you have a Vimeo or YouTube video, you can often embed a form at the end of it, but it's a very simple form, [00:23:00] usually just a couple of fields, and once it's submitted, that's it. There's no further process, there's not a great thank-you page, and you're not staying engaged with that person, which I think is a huge missed opportunity. So the way these super forms work: when you fill out the first form and you get those follow-up questions, that's taking place on the thank-you page. Speaker 1: And if you're not engaging your prospects properly on your thank-you page, it's similar to knocking [00:23:30] on someone's door and saying, hey, do you have some time to talk about this great vacuum I'm selling, and them saying yes, and you saying, oh great, I'll follow up with you and come back another time. So that's my evangelical bit — that's a personal take on how I think thank-you pages are being underutilized. Yeah. No, that's great.
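As a rough sketch of the two-step "super form" flow described above — step one captures the complete lead, step two asks conditional follow-up questions on the thank-you page — the branching might look something like this. The field names, question wording, and branch structure here are assumptions for illustration, not the actual form configuration.

// Sketch of a two-step ("super") form: step one is the standard lead
// form; step two shows follow-up questions on the thank-you page,
// branching on what prompted the prospect to reach out.
// All field names and question text are illustrative assumptions.

type Motivation = "career" | "school";

interface LeadSubmission {
  name: string;
  email: string;
  phone: string;
}

interface FollowUpAnswers {
  prompt?: Motivation;      // "What prompted you to reach out today?"
  careerDetail?: string;    // asked only if prompt === "career"
  schoolDetail?: string;    // asked only if prompt === "school"
}

// Step one: the lead is complete as soon as the first form is submitted.
function submitLead(lead: LeadSubmission): void {
  console.log("Complete lead captured:", lead);
}

// Step two: conditional follow-up questions shown on the thank-you page.
// Returns the next question to ask, or null when there is nothing left.
function nextQuestion(answers: FollowUpAnswers): string | null {
  if (answers.prompt === undefined) {
    return "What prompted you to reach out today: a career, or a school?";
  }
  if (answers.prompt === "career" && answers.careerDetail === undefined) {
    return "What about the career are you still deciding on?";
  }
  if (answers.prompt === "school" && answers.schoolDetail === undefined) {
    return "What do you most want to know about the school itself?";
  }
  return null; // no further questions
}

// Usage sketch:
submitLead({ name: "Pat", email: "pat@example.com", phone: "555-0100" });
let answers: FollowUpAnswers = {};
console.log(nextQuestion(answers));   // asks the prompt question
answers = { prompt: "career" };
console.log(nextQuestion(answers));   // asks the career follow-up
answers = { prompt: "career", careerDetail: "still exploring options" };
console.log(nextQuestion(answers));   // null: nothing further to ask

The design point the sketch tries to capture is that step two only ever adds information: the lead from step one stands on its own whether or not the follow-up questions are answered.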
Shane, it looks like you have some questions. Speaker 3: Not questions — a question. Excellent. So, [00:24:00] Katie and Tammy have both been with our company the longest. Katie is employee number one; Tammy, I think maybe five — I don't remember, not number two, but close. What changed in our culture, or did something change? How did we get from kind of "oh, we lost, oh well" to a much deeper way of looking at it? [00:24:30] And the reason I'm asking is that, for someone who's running a school and is trying to implement a culture that's exploratory — like this ethos we have — there are things that prevent it and things that nurture it, and you two, having been around for a long time, have insight into what changed, or what exists in our culture, that has made this possible. Speaker 1: I think it [00:25:00] really comes down to creating the right environment. I'm passionate about this topic of redefining failure — even having that word in it, failure, right? If you take failure off the table as an option — a test can't really fail, there's only a failure to learn — I think that really frees people up to try things. So if someone implements something and it [00:25:30] doesn't boost conversion, or you even have a landing page running for a brief period that is dropping your conversion — if you have a culture where something like that would feel like a failure, where you wouldn't be able to share it, you wouldn't take a risk like that. I mean, on a much larger scale, this is what Dr. Speaker 1: Pond was talking about, in terms of people being too frightened for their own job security to take risks. And on a much smaller [00:26:00] scale, if we're talking about a landing page or something to implement on a website, or trying something out — the same thing happens if you have a culture where perfectionism is prized, or even just running out the clock, clock-punching, as opposed to exploratory work that might not be quantifiable for its value in the short term. In our team culture, [00:26:30] the real value is the pursuit of knowledge, not the outcome, and that has created a fun spirit of innovation and a lot of the movement I've seen in the time I've been here. Speaker 3: Tammy, what about you — anything to add? Speaker 2: Yeah. I think in leadership it's really important to create that safety, so you know you have the option to explore something. And certainly this past year, with the initiatives [00:27:00] to get all our team members involved in thinking about testing areas — whether in their own work or even outside it — we've shown that we fully support exploration and curiosity, and questioning a little more deeply than we normally would. You've got the work you have to do in a given day, but please also spend time exploring this area if it's interesting to you in your own role. [00:27:30] You're free to do that. I find that just bubbles up a lot of that openness to exploring things a little more deeply. Hmm.
Speaker 2: And that, for me, has been a pivotal thing, because we did a lot of A/B testing, and in the end it's always ROI — we want to get a better result for our clients, improve their leads, and lower their costs. That's the end game. But I think over the last couple of years we've been asking a lot more: what's the [00:28:00] research, what's the meaning, what are the insights underneath all of that? Because I think that knowledge, on top of the testing, is going to produce better results in the long run. Speaker 3: That's awesome. Happy to hear that. And we're not going to change — I'm thinking about this because we're all competitive, right? We want to win. And like you said, [00:28:30] the end game is better ROI. We're trying to spend client money more prudently, extract more for them, squeeze every drop of juice there is out of it. So I'm trying to understand the difference between being competitive in a healthy way and being competitive in a way where failure is punished. To me, that's maybe the difference. [00:29:00] And for the people on the call, if you're trying to think about how to build a culture that rewards innovation, nurtures innovation, nurtures exploration, and maybe gets some of the way toward what Dr. Pond was talking about this morning — Speaker 3: I think, to me, that's the end game. What I'm taking away from this little conference is thinking, okay, how do we transform? [00:29:30] Instead of being iterative, how do we be out-of-this-world innovators? Or even beyond being iteratively innovative, how do we be transformational? What a great question that is. And I appreciate the thought from you both, because we have strived to [00:30:00] avoid punishing failure, because that just kills all initiative at the root. It's like digging up the root — it kills everything. And it's not realistic to think you're going to win even the majority of the time. It's just not, especially as you get farther along; it gets harder to make gains. It's not realistic to think, oh, I had some terrific idea in the shower this morning and it's going to [00:30:30] change everything. It just doesn't happen. Speaker 1: I mean, I think that's why it's so important — to the point about really investigating why a test has won, regardless of the outcome. Some of that will be intuition. So if you do a radical redesign, or even a small test, and you get a win, [00:31:00] when we talk about taking that a step further: you look at the test that won and you ask yourself, why did it win? Let's say you think it won because we spoke in a more conversational tone than we had in the other version. That's the thing you want to take and apply to another page or another instance, and see if you get the same result or a different result. And if you get a different result, why is it different?
Is there [00:31:30] different circumstances or was your initial instinct on why it won in the first place? Maybe not the case. And was there something else at, at play there? So really it's, it's a consistency of, uh, inquisitiveness that is going to be your long term, uh, gain mm-hmm Speaker 3: I think similar, Speaker 1: Oh, go ahead. I was gonna say, in relation to your talk also, Shane, for those of you were on it, uh, Shane was talking about the different personality types, different social styles than individuals have. And he was talking about [00:32:00] it from a, a sales point of view, but I've also found in our team am as we have grown, I think I've seen how our tests have gotten so much more robust because we have a deep well of, uh, developers whose brains work in a certain way. We have a deep well of, uh, copy. We have a deep well of folks who can work both sides of that. And then, and it's really in collaboration of those because I will only ever see things as story. That's [00:32:30] how my brain works and that's how I approach tests. And then we have these brilliant analytical developers who see, they don't, the copy is completely irrelevant to them and they see how it's laid out and they see design and they see, uh, backend interaction stuff. And so also encouraging collaboration, I think in, in terms of we're speaking to that culture, mm-hmm Speaker 4: , that's a big piece. Yes. Speaker 3: The collaboration. Yeah. Mm-hmm, [00:33:00] interesting. Here's here's my question. And hopefully someone won't answer it. One of the things we're we're I would say struggling with, but like trying to understand the edge of is, so if we go find some motivations, Hey, I'm, I'm still thinking about the career versus I'm I'm commit into the career. I'm now thinking about the school and we submit that to your CRM, right. Or that gets documented somewhere and that gets pushed, pushed to you in, in [00:33:30] through some, through some way. Right? Can you meaningfully do anything with it? Cause we can. Uh, um, and, and, and if you're not a client just, you know, rhetorically this cause where I would like to, where we would like to go with the company is that as we get more Intel on prospects, right. And, and my, my belief is [00:34:00] that marketing's gonna get harder, right? Speaker 3: So that we've, we've enjoyed a, a, a tailwind with COVID it's running out and things are gonna get harder and it's just, it's, marketing's gonna be harder. And disruption is gonna create a more competitive environment and, you know, there's lots of stuff that's gonna happen. And so we, we focus on, okay, more insight. We can fine tune marketing campaigns, right. And be more effective. And Chris and Trent can do their good work and we can build websites [00:34:30] that are more persuasive and we can, you know, fight the, fight, the trends towards higher costs. We can bump conversion rates and like we can generate more leads. But what we would really love is to provide more insight around who that prospect is so that the, the, the school can do more with it. Right. They have a better, better chance of connecting meaningfully with that person. Shoot. We got two minutes mm-hmm so if anyone's courageous and is willing to talk about this, like whether [00:35:00] internally they can figure out what to do with that, I would, would love the insight. Speaker 4: No. So I'm sorry. Was the question, like, what would we do as the client receiving the detail, the rich information. 
Speaker 3: So, can you do something with it, or will you? Speaker 4: We can, absolutely. [00:35:30] Any kind of richer information helps our admissions counselors. I run our tech stack, so any richer information helps our admissions counselors create a more robust and well-formed dialogue when they finally connect with the prospect. I think that knowing these motivations and having any additional information helps them connect with the prospect where they are, and actually helps fit them [00:36:00] into the programs that would be most helpful to them in meeting their goals by coming to our school for a career. Speaker 3: Awesome. Good. That's our goal for the coming year: we're going to try to kick butt — refine campaigns, et cetera — but one of the big thrusts is going to be providing more intel on the prospect, to help improve engagement at the school end. [00:36:30] And I think we can do it. I've got an awesome team here, and they're thinking awesome thoughts.
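For the hand-off Shane describes — getting the motivation answers documented and pushed to the school — one way the data might travel with the lead is sketched below. The endpoint URL, field names, and payload shape are hypothetical assumptions for illustration, not any particular CRM's API or the company's actual integration.

// Sketch of enriching a lead record with the two-step form answers
// before handing it to a school's system. Endpoint and field names
// are hypothetical; no specific CRM's API is implied.

interface EnrichedLead {
  name: string;
  email: string;
  phone: string;
  program: string;                    // e.g. "medical-office"
  motivation?: "career" | "school";   // from the follow-up step, if answered
  motivationDetail?: string;          // free-text answer, if given
}

async function pushToCrm(lead: EnrichedLead): Promise<void> {
  // Hypothetical webhook URL standing in for the school's CRM intake.
  await fetch("https://example.com/crm/leads", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(lead),
  });
}

pushToCrm({
  name: "Pat",
  email: "pat@example.com",
  phone: "555-0100",
  program: "medical-office",
  motivation: "career",
  motivationDetail: "Still comparing medical office roles",
}).catch(console.error);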