Many of you have worked with Simpson Scarborough in the past. Today's presentation, Enhancing University Websites Through Data-Driven Insights, is about turning insights into action. In higher education, a website is more than just a digital front door. It's an open conversation, one that draws in prospective students, keeps current students engaged, and builds lasting connections with alums.
But here's the challenge. How can you tell if your website is performing? How do you strike the right balance between creativity and measurable results or between functionality and aesthetic appeal? And perhaps more importantly, how do you ensure you're meeting the needs of the people who matter most, your users?
Well, that's exactly what we're diving into today. You'll hear from two of Simpson Scarborough's digital experts who live and breathe this work every day. And by the end of today's session, you'll hopefully have a clear picture of how data-driven decision making and user-experience testing can transform your website into a strategic powerhouse.
First up, we have Cassie Gold, Simpson Scarborough's Associate Director of Analytic Strategy. With over eight years of experience in digital, web, brand, and media analytics, Cassie bridges numbers and knowledge. She specializes in transforming complex data into clear, strategic insights that move the needle. Cassie's innovative, solutions-focused approach helps institutions see what's working and what isn't so they can make smarter decisions faster.
Joining her is Dan Moore, Senior UX Designer. Dan is the architect behind some of the most impactful higher ed websites out there. He combines usability research, UX strategy, and wireframe designs to create digital experiences that are as functional as they are engaging. Sitting at the crossroads of digital strategy and creative, Dan ensures that every click, scroll and interaction is not just functional but thoughtfully tailored for the best user experience. So without further ado, I'll hand it over to Cassie and Dan to take us into the world of data, design, and digital success. Welcome.
Thanks, Ron. Go ahead and get the share going here. All righty. Well, that was an amazing introduction.
Obviously, we are who we are. We're here to share what we've got with you: a couple of case studies from our recent projects and how we track analytics and different data measurements through that process. Just a brief overview of what we're going to get into today.
One is understanding the current state of affairs. Then we're going to get into our specific data-driven approach, followed by a couple of case studies showing how we're using that process with our clients, and then, lastly, how we continue to move this momentum forward.
So understanding the current state of affairs, our POV as an agency is ultimately that institutions may or may not be fully harnessing their analytics to inform and optimize their content and web strategies or communicate the true value of the website. What we find is ultimately that they're underutilizing the tools at their disposal to really enhance the website from a data-tracking and data-enhancement lens.
This comes from our own internal study of higher ed, specifically the CMO study, where we're finding that budgets have ultimately decreased for technical and web support, meaning that schools are investing less in web development and in their digital footprint. When we look at institutions across the board, we find that only about half have really specific web-tailored roles, and even fewer than half have the deep, rich roles that really drive impact and strategy, like UX and SEO alongside content, UI, and web development. So with that, I'm going to pass it over to Cassie to get into the nitty gritty of what kind of data we should be looking at.
Awesome. Thanks for that intro, Ron and Dan. So I'm just going to start off by saying that higher education websites serve as a crucial touchpoint for prospective students, current students, alumni, and faculty. But to ensure that we're effectively supporting these diverse audiences, we need to ensure that the website's optimized for the user experience and aligned with institutional goals. And then tracking the right data helps us understand how those audiences are engaging with the content, where improvements are needed, and how to drive better outcomes across enrollment, retention, and engagement.
So that's where this stat from our CMO study comes into play, where we found that 45% of colleges and universities have regular reporting of their digital analytics to leadership. And I'm not entirely surprised, but I want to see that number increase.
Obviously, we said it's a very important element of our marketing portfolio. And so we want to make sure that we have the data we need to ensure that it's working as effectively as it can. So how do we do that?
We need to provide a holistic UX and website design methodology by integrating different data sources. So let's talk a little bit about this in the digital realm. When thinking about analytics, we have a lot of different data sources. We have Google Analytics. We have Optimizely. You might have Hotjar or another heat-mapping tool.
You have a CRM tool. You might have some survey data, like we do on our end. We'll talk a little bit about how we pair that with some of the traditional metrics for a website. But there's a lot of different data sources that we have. And it can seem overwhelming.
So we need to make sure that we're integrating what needs to be integrated and then pulling the right data at the right times. It can be overwhelming when you look at data sets. There's so much data yelling at you, essentially. And we need to be able to sift through and navigate to find exactly what we need to be showing to leadership and to our stakeholders.
And that might be different depending on their different levels. Somebody in the content team might want something different than what leadership wants. So we have to take all of that into consideration as we're thinking about measurement for the website.
So we talked about this a little bit at the beginning, but we have a lot of target audiences. So that means that there's a lot of different needs through the website for these audiences. What a prospective student wants even in that grouping, what an undergraduate student wants versus a graduate student wants, is different. And then think about what current students want, what community groupings want, and employees, they're all looking for something different. So there's a lot that we're trying to do with the website, and we need to narrow in to decide what exactly needs to be measured to show success at that point in time.
So here's just a quick example of some different site objectives that we narrowed down for each audience grouping to help us more easily decide what measurement we're going to need and what data we're going to need to support it. So I'll only talk to the prospective students audience because they're the primary audience of these websites. But they need to be guided through a series of experiences on the website-- we're thinking the home page, the academic pages, and the admissions pages-- to sell them on attending.
So we need to make sure that we have the right data to prove that the home page is effective, that the academic program pages are giving them what they need, and that they can actually get to the admissions page. It shouldn't be seven pages deep before they can hit the Apply Now button. So we've got to think about all of these things as we're approaching and trying to measure success of a website.
So keeping on that example of our primary audience, the prospective students, we took this from a survey that Nielsen Norman Group did. Nielsen Norman Group, for those who don't know, is a leader in the UX discipline. This comes directly from their study of higher ed from about two years ago, where they looked at over 200 websites and had a little over 120 respondents to understand what's needed in a digital experience to ensure prospective students have the information they need.
And these are the four key things that pop to the top. Prospective students want to know, does this institution have the program I'm interested in? So that program page that we just talked about before is really important. How much will it cost to go here? Cost is always one of the top concerns of prospective students and their parents, so we have to make sure that they can easily find this information and that they can interpret it.
So the language that we use in higher ed might not be exactly the same language that should be used for a prospective student because they're just new to this. They're learning even what undergraduate means. So we have to take those things into consideration.
Will I get accepted here? That means understanding the qualifications of a typical student at that institution. And can I see myself here? Your culture, your experience needs to shine through in your website. And so we need to find ways to ensure that that is up front and available for the prospective students but also measurable. Are we effectively providing them these four sources of information to make their decision easier?
So I talked a lot about what we're going to measure, but how are we going to do that? That's what we'll talk about a little bit here. A measurement-first approach is essential for demonstrating value. We want to make sure that, when it comes to website analytics, reporting is a part of the bigger story.
Remember, we saw that only 45% had regular digital reporting to leadership. But this is a massive investment that universities make. It supports awareness. It supports reputation. It supports perception.
And all of these things are things that executives can understand. So how do we pair what website metrics we have to speak to awareness, reputation, and perception for that audience? So we'll talk a little bit about this.
So first, every successful web analytics implementation starts with goals. And so that's why, in those last few slides, I started with, What are we trying to achieve with the website? instead of, What am I trying to measure?
So we want to be able to understand the audiences, understand what they're trying to do, understand the information that they need, and what the goal is that we're trying to get them to complete in this instance. We might just want them to submit a request-information form. So then that helps me understand what I need to tag.
Then we'll develop the measurement plan, decide what's going to happen and how we're going to track it. We're going to use that to create those custom events in GA4. We're going to talk a little bit more about GA4 and all the exciting things that have been happening in the past few years here in a minute.
But GA4 is an important part here. GTM is an important part. GTM is Google Tag Manager, not go to market. I know. We have all these different lingos that cross reference each other, but in this case GTM means Google Tag Manager.
And then making sure that it's set up to track it, that it is collecting the right information. We're creating those audiences in GA4. So then when it's time, when the data is passing through, we can easily understand what is happening and it doesn't feel like a massive, massive lift to go find that data.
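To make that concrete, here's a hypothetical sketch of what firing one of those custom events from the page might look like. The event name (`request_info_click`) and its parameters are illustrative assumptions, not GA4 defaults or our actual naming conventions, and the `gtag` stub just lets the sketch run outside a browser.

```javascript
// Hypothetical sketch: firing a custom GA4 event when a prospective
// student clicks a "Request Info" button. Event and parameter names
// here are illustrative, not a GA4 or GTM standard.

// Build the event payload separately so it can be inspected or tested.
function buildRequestInfoEvent(pagePath, audience) {
  return {
    name: 'request_info_click',
    params: {
      page_path: pagePath, // where the click happened
      audience: audience,  // e.g. 'prospective_undergrad'
    },
  };
}

// In the browser this would go through gtag.js (or a GTM dataLayer
// push); here we stub gtag so the sketch runs anywhere.
const sent = [];
const gtag = (command, name, params) => sent.push({ command, name, params });

const evt = buildRequestInfoEvent('/admissions/request-info', 'prospective_undergrad');
gtag('event', evt.name, evt.params);

console.log(sent[0].name); // request_info_click
```

Keeping the payload builder separate from the send call is also what makes the tagging documentable: the same names can be copied straight into the measurement framework spreadsheet.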
This is my favorite thing. I think it's a little bit silly, when we talk about measurement and analytics, that my favorite thing is documentation in Excel essentially. But it's true. This is a measurement framework.
Again, we have to start out with that plan. We have to start out with goals. So that's what a measurement framework allows us to do. It provides a systematic approach to not only collecting data but also assessing it and interpreting the results, so that everything aligns up to the institutional goals.
So we talked about this, that the website is sometimes the first place that people are engaging with your brand. And so that should support all the way up through to your overall institutional goals. What do you exist for? What are you trying to do?
And then let's break that down by campaign goals. What are we specifically trying to do this year? Marketing goals, what are we specifically trying to do with maybe undergraduate students? How are we communicating to them? Are there any different types of segmentations of target audiences and goals?
And then the bottom half of the measurement framework-- the top half was a lot of goal setting and statements. The bottom half is a little bit more technical. It's the what, or the how.
So we have our objectives, which is, again, like we talked about, awareness, reputation, and perception. Sometimes we also say awareness, engagement, and conversion. So there's a little bit of different languages here. But here we have an awareness and engagement bucket. And then essentially figuring out what data sources, what KPIs, benchmarks, dimensions, and cadences of looking at this data, how does that ladder up to proving awareness of the website?
So it's a really helpful tool to outline everything in terms of analytics and measurement. And then for me, I keep it in Excel because I'm an Excel wizard. I should have a badge.
But the nice thing about that is that you can hide columns. You can hide rows and then pare it down for the specific audiences. They might not need to know all this information, but you have it centralized so that you can easily communicate it to someone that might be on the content team or someone in the executive leadership team.
So here's an example. I did pare it down a lot. This is what I call a micro measurement framework. Again, so just imagine I hid some rows and I hid some columns to get to this point.
But here this is an example we had from another client. They had three business goals. They want to improve enrollment, improve advancement, and increase engagement. So how do we break that down for the website specifically? We're thinking about applications, inquiries, scheduling visits, donations, and web visits.
And then let's break it down a little bit more. What's the actual action that we're trying to measure on the website? So for instance, when you do receive applications, some might click Apply Now. And then they get sent to another website to actually apply, so you're not necessarily going to have that full tracking. But what's the closest measure that we can track on our website to show that action and that engagement from that particular, let's say in this case, the prospective student?
And then let's take it a step further. After we have mature setups and after my-- we'll talk about this a little bit later. But my typical recommendation is wait three months. Collect some data before you set a benchmark.
But we want to make sure that we're then setting a goal. In this case, in this example, it's annual targets. You might get sophisticated. You might have monthly targets that are different because we know that higher ed is seasonal and has those different trends. So we might want to get a little bit more sophisticated there.
Or we might be basic. We might just have an even average across and know that it's going to ebb and flow. But ultimately we're trying to get to x number of Apply Now clicks or things like that. So here's just a quick example of how that actually plays out more tangibly in website analytics.
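As a rough illustration of that benchmarking step, here's a small sketch that averages a few months of observed clicks into a baseline and projects a simple annual target. The counts and the 10% uplift factor are made-up numbers for the example, not a recommendation.

```javascript
// Hypothetical sketch: after collecting ~3 months of data, turn the
// observed counts into a benchmark and a simple annual target for an
// event like Apply Now clicks. All numbers below are illustrative.

function monthlyBenchmark(observedMonthlyCounts) {
  // Average the observed months to get a baseline monthly rate.
  const total = observedMonthlyCounts.reduce((sum, n) => sum + n, 0);
  return total / observedMonthlyCounts.length;
}

function annualTarget(observedMonthlyCounts, upliftFactor) {
  // Project the baseline across 12 months, then apply a growth goal.
  return Math.round(monthlyBenchmark(observedMonthlyCounts) * 12 * upliftFactor);
}

// Three months of observed Apply Now clicks (invented numbers),
// with a 10% improvement goal for the year.
const observed = [420, 380, 460];
console.log(monthlyBenchmark(observed));  // 420
console.log(annualTarget(observed, 1.1)); // 5544
```

A seasonal version would swap the flat average for per-month weights, which is the "more sophisticated" variant mentioned above.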
So now that we're--
I'll hand it back--
Oh, go ahead. Yeah, sorry.
Back to Dan. [LAUGHS]
Yeah, a lovely digital handoff. So now that we know what we're looking for, what we're trying to track, and we have those tools at our disposal, what do we do with that information? One of the biggest things we come across is that oftentimes the current state of a higher ed website is way too large in scale, and it really needs refinement.
We have these tools, but now we have something that's too big to really measure. We really need to hone in and refine this. At the end of the day, why do we have to make it so complicated? We just want to make this simple and easy to understand, both for our users and for us as the managers of the website's content, analytics, and data tracking.
So ultimately, what we do is we take those four critical user journeys that Cassie shared from Nielsen Norman Group, and we really look at the whole website through that lens. This is the most important thing, and those are our most important drivers for admissions, enrollment, and success. And so we want to analyze through the way that they're experiencing the website.
That way we can find their specific pain points. We can conduct additional user research with that audience and then restructure the site based on our goals there. This way, we can improve the navigation and ultimately illustrate the improved user experience through their lens.
So what can user data do for us? This comes from our Hotjar and heat-mapping tools. You can use Microsoft Clarity as well. What it does is it tells us how well the site's doing the job.
In this example from Concordia, nobody's making it past this large format video. They're just getting stuck here, and they're going into the search or into the menu, missing all the rich, really branded content on the homepage. And we know that's a keystone moment that people want to see.
And from this example from William and Mary, user data tells us what's most important. We found that there were really only two items on this visit and tour page that prospective users were interested in: scheduling a tour and taking an online virtual tour. Everything else was tangential, and if we focus on these items, making them bigger and higher up on the page, that would drive greater success for this type of content.
And then user data also allows us to make refinements. In this example from Berkeley Haas, this was the program page. And this intake form is taking up the majority of the viewport for what is supposed to be the deep information on the program itself. So we can take that data and refine the page to make it a little bit more seamless, a little bit easier to find that information, rather than having the form take up the whole screen.
Continuing down this trend, we just want to use this lens to really streamline the specific experiences. This is all coming from Saint Edward's, where we're taking the user journey for academics, and we're cutting that down to get to the program page faster. Getting to that cost and aid information faster-- separating out cost and aid information is the one thing we do with almost every client.
Oftentimes, this is deeply buried either within admissions or financial aid, which are oftentimes linked together. And one thing we find is that undergraduate students don't always know the right terminology for these things. A different study from another agency, Ologie, found that around 60% of undergraduate students don't even know what the term undergraduate means when they're in their discovery phase. So we want to make things as clear and as simple as possible for them to get to the information they need.
Similarly, getting into this sort of admissions process, making sure that that's separate from cost and that they really are getting into the nitty gritty of what it takes to be a student at that institution. Another thing that we find really competes, especially when it comes to our data tracking, is subdomains. They are a necessary part of web development, but they are not the end-all, be-all. And they should not represent the departmental hierarchy of the institution.
They really need to be specific and doing a specific job, not duplicative. In this example, we have the main.edu, an undergraduate admissions and a graduate admissions, all of which have program pages on each of them, all of which have admissions information on each of them, all of which have student life information on each of them. So what this leads to is a severe cannibalization of that content and that information.
So it's really going to throw off our data. We're seeing it in different streams, so we're not going to know what's 100% accurate. And ultimately it creates a really confusing user journey for folks who are landing in the main experience and getting shoved off to a sub experience, or who are just coming in from organic search.
Looking at Google, we have four different financial aid pages. Which one is the source of truth? So the lesson here is that we want to establish key sources of truth for those key critical user journeys, making sure that information is always related and always in a single place that can be found. Duplicative content is ultimately going to throw off your SEO, and then it's going to throw off your GA4 event data as well, because you might not be getting the right thing to the right place at the right time.
So when we do conduct this kind of usability research and we're looking at these different data points, what do we take away from it? Ultimately, websites need to be simple and distinct. We need to answer those key questions. We need to infuse the brand to continue to tell that story, and incorporate intuitive, helpful features like menus and cost-appropriate tools, orienting the whole space around answering these questions. That means making information comprehensive and scannable, making sure the design systems combine the brand with UI best practices, and making information digestible in snippets for a prospective student user rather than letting them get lost in the higher ed nomenclature that we're used to.
And then last, that concept of navigation and making it accommodating, making sure that the menu is set up to address those key critical user journeys. That's the most important information. Of course, we're going to have our secondary and tertiary audiences in there. But at the end of the day, who are we serving? We're trying to serve these prospective student audiences and establish patterns for them to get to their information most critically.
All right, so we talked a little bit about setting our goals and making sure that we have that in the front of our minds. And we talked a little bit about the impact and the things that we're looking for in the website. So I want to talk a little bit now about, how do we make GA4 tracking customized to higher ed? Because it's not, and that shouldn't be a surprise.
GA, Google Analytics, has been created for e-commerce. And then it was widely adopted by everyone else. But still the main point of it is to help e-commerce companies track.
And so what we have to do is fit it to our needs in higher education. So we talked a little bit about those KPI metrics that I outlined in our measurement framework, or our micro measurement framework. We talked about request information. We talked about applications and different things like that.
Well, some of that can automatically be tracked with an asterisk. So I'm a queen of caveats. Data people usually are.
But the really great change between GA3 to GA4 was that they now have what we call Enhanced Measurement Events. So by clicking one button in your settings, you can automatically track these several events.
Sometimes there's a little bit of wonkiness with, I think, the form starts and sometimes the file downloads events. So we typically add to this or optimize it. You can turn things off, and instead of having that tracked automatically, you can do it manually. There's some more information there.
But at the bare minimum we can track page views. We can track search results. Remember that example from Dan with that heat map where they were getting stuck at the top? They were probably just going to the search. So what are they searching for? That might help us optimize that page to ensure that content is closer to the front.
And then we have some additional metrics like scroll rates, different clicks that are going to those other subdomains. That's important because we just talked about the use of other different subdomains. So we can start to understand, where are they going? What information are they getting? Are they getting the right information? and things like that.
However, as you can imagine, this is not enough. We need more. So I have some examples here of additional measures that we typically do on all of our implementations for higher ed institutions. And this, again, ladders up to our measurement framework and what we outlined as success.
So instead of tracking every single thing on the website, we want to focus in on form submissions. So that could be application submissions. That could be request info forms, visit forms, et cetera.
That's when they hit Submit: they fill out all their information and send it to you. But before that, they have to click the link or the button. And so we also want to track that.
So we track Apply, Give, Request Info, and Visit link clicks, just the little buttons that say something like Request Info, because we want to understand how many people are clicking that and then not actually submitting their form, and start to understand what might be happening there. Is our form too long? Do they not have the right information? Did we not send them to the right spot for what they were thinking they were trying to get to?
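Here's a quick sketch of what that click-to-submission comparison might look like once both totals are pulled from GA4. The event counts are invented for illustration.

```javascript
// Hypothetical sketch: comparing Request Info link clicks against
// completed form submissions to estimate drop-off. In practice both
// totals would come out of GA4 event reports; the numbers here are
// made up.

function dropOffRate(linkClicks, formSubmissions) {
  // Share of people who clicked through but never submitted the form.
  if (linkClicks === 0) return 0;
  return (linkClicks - formSubmissions) / linkClicks;
}

// e.g. 1,000 clicks on the Request Info button, 350 completed forms.
const rate = dropOffRate(1000, 350);
console.log((rate * 100).toFixed(1) + '%'); // 65.0%
```

A high drop-off like this is the signal to start asking the questions above: is the form too long, or did the click land somewhere the user didn't expect?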
So we can start to understand that journey a little bit better. So we customized the inputs. Now we have to customize the outputs.
And so we in GA4-- I'll start here. In GA4, we have some standard reports already set in there. But again, GA4 was made for e-commerce, not for higher ed.
So we want to make sure that we are optimizing the report section. And for those who don't know, in GA4, there are two sections where you can get data. There's the report section, which has pretty high-level, quick but restricted reports. You're sometimes limited in the number of dimensions or metrics you're allowed to pull in.
But it is really great for someone to quickly understand a little bit of what's getting driven to the site and how long are they staying on the site and what's their engagement rate, those kinds of things on a quick high level. So we want to make sure that we're customizing this so that it meets the needs of what higher ed has.
And so you'll see here that we slimmed that down a ton. And that's important because, like I talked about at the beginning, a lot of times we're just overwhelmed with data. We have so much data yelling at us, saying, hey, I'm important. It's not all important. I promise you it's not.
It might be in certain cases but not on a large scale. So let's slim that down. Let's focus in on the things that are most important. It's that traffic and the conversions, how people are engaging with our content and our events. Search Console is a really beneficial part. One of the reasons for the upgrade from GA3 to GA4 was so that Google could become more integrated across its different systems.
And so that means that, if we integrate it, we can actually see some Search Console data in GA4 itself, so you're not bopping from platform to platform to understand. You can really have that cohesive experience within one of their softwares. And then understanding the users, the demographics, the geos, and how they're getting there-- the tech is actually pretty interesting too because we need to know that for designing the site.
What's the majority of people visiting the site on? Is it their phone? Is it their tablet? Is it their laptop? Things like that. So being able to customize that is important.
So we set our goals. We talked about the website itself. We talked about how to get that tracking in and how to set up our reports. So after all of that comes results. Dan's going to take a couple, and then I have a couple examples as well.
Yeah. Exactly as Cassie said, we've set up a new data system, new data stream. We've designed a beautiful-looking website. What's happening now?
One thing that's really critical that my team takes on is some post-launch usability testing, and that involves looking through our analytics setup and seeing what's going wrong. And we were realizing that these high-level links weren't getting clicked. Users weren't getting out of the menu. They were getting stuck within the menu. And then our main KPI links also weren't getting seen at all.
So using that analytics data and our usability testing, we can refine this to make it a better experience: highlighting these links, making sure that they have a caret that shows that, hey, this is a link, it's going somewhere; bringing up that menu so it's not stuck at the bottom, making sure it's peeking out like a hover state so we're not getting stuck in that menu or losing that close button; and then differentiating our KPI links here as well, making them a little bit more prominent and a little bit more visible. And all of this is coming from that analytics tracking data and our usability testing.
And then this is something Cassie and I developed together, impression testing for the higher ed websites to see if we moved the needle from the start of the project to the end of the project. Cassie, you can take over from here.
Awesome, yeah. So impression testing is something that is used often in UX research and design, but I hadn't seen it used very often in higher ed. So it was something that we wanted to adopt and leverage, especially during the transition between GA3 and GA4, where that data was not apples to apples. It was apples to oranges, because even sessions, which have the same name in both, are defined differently on the back end. We switched from bounce rate to engagement rate. We switched a bunch of things in that transition. So we wanted to have a complementary metric to show success.
And to get to that other side of measurement that we don't often get to, which is: how is this perceived? We understand how people are navigating, what they're doing, what content they like, because they're clicking on it. But we want to understand a little bit more of, did this website redesign resonate? I could say, from wave 1 to wave 2, wave 2 looks amazing. I love it.
But how do we back that up, that it is resonating with our prospective students? Well, that's going to be an impression test. So it is a research methodology that gauges users' first impression of a design. So we only take the above the fold of a homepage.
We show it to users for five seconds. And then we ask a series of four to five questions based on their viewing of it. And we want to really understand their recall and their appeal of the website.
It's repeatable. It's meant to happen at the beginning of a website redesign and after a website redesign to understand how we were able to shift those perspectives. And then we typically use a paid panel for this because it's hard to find these audiences through lists, and they don't have a high response rate.
So we're able to isolate, typically by defining who your prospective student is and what the regions are. And it's more like an agile test. We're not going to see an n size of 400. It's not a massive, comprehensive study. It's more about getting a sample size to see if we're moving in the right direction.
So we have two examples here. We have York College. You can see the before and after. And then we pair that with the results that we saw.
And we actually saw, on the next slide, we improved net appeal. We improved the word associations aligned with our strategic intent, which was inspiring, creative, and innovative. That's what the client wanted to have come through on the new website. And we increased the desire to explore the website further, which is really important because we want them to get to the site, and we want them to consume the content and want to continue to find information.
I don't know about you all, but I know that my attention span is shortening. Thank you, TikTok. But think about your prospective students. Theirs is probably even shorter than mine. So we need to make sure we build engaging, appealing experiences that make them want to consume more content. And this test proved that we improved on that from the first website to the second.
We have another example of this with Saint Edward's University. Obviously, I think that the second website looks phenomenal, but we got the data to back it up. And we got an award, so that's like a pat on the back.
But again, we improved net appeal. We increased the word associations aligned with that strategic intent. We saw inspiring again. It's not uncommon for clients to want an inspiring representation of their institution. Because it is higher ed, we want to make sure we're up-leveling people, giving them new opportunities and new information.
So we want it to be inspiring but also impressive. They wanted to lean into impressive as well, and we did see that come through in the results. And then, again, we saw an increase in the desire to continue exploring the website.
So pairing this data with the GA4 data, we were able to tell a more comprehensive story. And we also had usability testing and other research to validate the strategies that Dan's team took to create these beautiful websites and the additional program pages.
So where do we go from here? We've optimized our data. We've strategically optimized our website experience. What's something that can be taken away, especially for a system like UMass? What's the future hold?
Ultimately, our POV as an agency is that the future of higher ed digital experiences requires investment, specifically in efficient, tailored, and technologically sophisticated interactions. That's what's going to keep people on site and allow us to serve information at the right place at the right time.
And ultimately, our website is the front door to our brand. This is who we are and where we're capturing the most people at the right time. So we have eight points here that we should be considering as we move forward.
One is making sure we're investing in education and training for our staff and our teams, making sure they're up to date on the technologies available to them. From there, making sure we have a clear data strategy, implementing things like measurement frameworks and goal setting in GA4.
Gathering more tools: one that I love to pitch to clients is Microsoft Clarity. It's free, and you can collect so much interesting data on how users are behaving on your website. You get really rich heatmap and click-map data. That's a great tool to have at your disposal and start learning from.
And then, from there, encourage data-driven practices, making sure we carve out space within our teams for the development of this work. Then go out to our other teams and our leadership for regular stakeholder and user feedback. Go to our students and just ask them quickly, hey, what do you think about this part of the website?
And then promote collaboration across university units and central website leadership. We are all in this together. This is all our product. So let's make sure we're all on the same page about its goals, its purposes, and its needs.
And then, ultimately, celebrate success. Celebrate the small wins. When we launch a new landing page, when we see increased giving, when we've increased apply clicks, those are worth celebrating. That means our strategies are working and that we've done the work.
And then, ultimately, continue to learn and adapt as we move on. These tools are changing all the time. AI is going to come in and start to change the way we search and the way we organize content. So we just need to make sure we are always on the cutting edge.
And that's all we have for you. We referenced a lot from our CMO study. Right here is a QR code, and I think this presentation will also be shared after the conference. You can use that to access the CMO study and download some reports and webinars from us as well. Cassie, is there anything else you had for us?
Nope, that was it.
Great. Well, thank you so much. We really appreciate being here. I don't know if we have time for questions or not, but if we do, then this would be the time.
We do. We have a couple of minutes. First, though, thank you for your presentation. It was terrific. And, Cassie, I mispronounced your name. I know it's Cassie Golda--
You're OK. [LAUGHS]
--without the A, so I apologize. One question so far: how do you recommend utilizing analytics to help improve a complicated menu and web structure for a department-based website that has a lot of content split between external users and internal, login-required audiences? We've received feedback that the site is hard to navigate, but we have a lot of technical, policy-based content that has to be referenced.
Do you want to start, Dan? Yeah, yeah, that's fine. Yeah, that's a really tough one, and it depends on the department, where you sit in the institutional hierarchy, and how important you are to external audiences.
At the end of the day, you want to streamline your menu to be the most useful for the most people who use it, so considering who's coming to that page the most, what is the information that they need the most, and pull that forward. And then understanding, where are points where we can drop things out of the menu and just put them on page? On-page navigation is just as important as menu navigation. And I know one thing that we run into a lot is trying to shove too much into menus, so making sure that we're pulling that out and into the on-page experience as well.
What I would add to that, for understanding how they're leveraging the different menus, is that we have some ability in GA4 to isolate audiences. What is really hard is that you can no longer use IP addresses. That's not a function anymore. It was in GA3, but it's not in GA4.
So it's about understanding the types of actions. Like you mentioned, they have to log in to get to the internal parts of the site, so we can tag that login and then flag them as an internal user. For whatever duration you set when you build your custom audiences within GA4, say the last six months, as soon as they log in and that action is tracked, we can isolate their activity for the next six months, start to understand it a little better, and then, to Dan's point, start to prioritize what content they need and find that balance between the two audiences.
Is it just a click out to a different resource for the internal audience versus the external audience? Things like that. So it's not an exact science, but we can start to collect more data in that instance to diagnose what's happening on that specific menu with those specific users.
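The audience logic described above can be sketched in a few lines: once a user fires a login event, treat everything they do for the next six months as internal traffic. This is a hypothetical illustration against GA4-export-style event records; the field names (`user_id`, `event_name`, `timestamp`) are simplified stand-ins, not the exact export schema, and the real segmentation would be configured in GA4's audience builder rather than in code.

```python
from datetime import datetime, timedelta

MEMBERSHIP = timedelta(days=180)  # six-month audience membership window

def split_audiences(events):
    """Partition events into (internal, external) lists.

    events: list of dicts with keys user_id, event_name, timestamp.
    A user's events count as internal from their login event until
    the membership window expires.
    """
    internal_until = {}  # user_id -> expiry of the internal flag
    internal, external = [], []
    for e in sorted(events, key=lambda e: e["timestamp"]):
        if e["event_name"] == "login":
            internal_until[e["user_id"]] = e["timestamp"] + MEMBERSHIP
        expiry = internal_until.get(e["user_id"])
        if expiry is not None and e["timestamp"] <= expiry:
            internal.append(e)
        else:
            external.append(e)
    return internal, external

# Illustrative data: user "a" logs in, user "b" never does
events = [
    {"user_id": "a", "event_name": "page_view", "timestamp": datetime(2024, 1, 1)},
    {"user_id": "a", "event_name": "login",     "timestamp": datetime(2024, 1, 2)},
    {"user_id": "a", "event_name": "page_view", "timestamp": datetime(2024, 3, 1)},
    {"user_id": "b", "event_name": "page_view", "timestamp": datetime(2024, 1, 1)},
]
internal, external = split_audiences(events)
```

With the traffic split this way, you can compare which menu items each audience actually uses and decide what to pull forward for each.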
And to piggyback on that, next page path is also really important in that discovery: finding out where they click into and where they finally leave, because that means that's the information they were looking for. And if we find that a lot of people are leaving from one specific page, we should elevate that page and streamline the path to it.
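The exit-page idea above can be sketched simply: for each session's page path, treat the last page as the exit page, and surface the pages users most often leave from as candidates to elevate in the navigation. This is a hypothetical illustration with a simplified data shape (a list of page paths per session), not a GA4 API call.

```python
from collections import Counter

def top_exit_pages(sessions, n=3):
    """Most common exit pages across sessions.

    sessions: list of page-path lists, each in visit order.
    Returns up to n (page, count) pairs, most frequent first.
    """
    exits = Counter(path[-1] for path in sessions if path)
    return exits.most_common(n)

# Illustrative sessions: two visitors end on the apply page
sessions = [
    ["/", "/admissions", "/admissions/apply"],
    ["/", "/policies", "/policies/grading"],
    ["/", "/admissions", "/admissions/apply"],
]
print(top_exit_pages(sessions))
```

Frequent exits on a deep page suggest that is the content visitors actually came for, which is the signal for promoting it out of a buried menu and onto the page itself.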
All right. Any more questions?