We were very excited to welcome Michelle Robbins, Vice President of Product & Innovation at AimClear, to our DeepCrawl webinar this month. Joining our host, DeepCrawl’s CAB Chairman, Jon Myers, Michelle explored how to use your analytics and crawl data to understand user behaviour and optimise your content for increased conversions.
We’d like to say a big thank you to Michelle for her great presentation and for taking the time to answer our audience’s questions, as well as to Jon for hosting and all those who attended.
You can watch the full recording here:
Plus, revisit Michelle’s slides here:
We are drowning in data
The amount of data that marketers are faced with daily is ever-increasing, with more than 2.5 quintillion bytes created every day. It’s also estimated that by next year, 1.7MB of data will be created per person every second. Along with this, we have an increasing number of business data sources to contend with: web analytics and Google Search Console data, for example, as well as advertising data from platforms such as Google, Bing, Facebook and Amazon.
However, as data collection has increased, marketers’ confidence in their ability to derive insights from all of this data has decreased, according to a survey of marketers released in September 2019. If collecting massive amounts of data doesn’t lead us to make better business decisions, what are we getting out of it? Simply feeding data into an analytics program and then pulling reports out of it does little to derive actual meaning from the data.
Measure what matters
Instead of focusing on collecting every data point imaginable and then reporting on every metric possible, Michelle highlighted that it is important to measure what matters. It’s also important to consider who it matters to and what the desired outcome should be. One example of getting this wrong, specifically in SEO, is focusing on robots instead of the user.
User experience metrics
Another example of this is addressing the impact of page speed metrics for users, rather than bots. One way of doing this is by measuring user behaviour on a site in order to understand and meaningfully measure what users are telling us with their behaviour. The concept of tracking and identifying rage clicks is one method for doing so. Rage clicks occur when a user rapidly clicks on a page element; they can indicate frustration with specific elements and identify when users are likely to exit or bounce. Behaviours such as rage clicks or cursor thrashing are actionable metrics to measure.
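Session-replay and heat-mapping tools surface rage clicks out of the box, but the underlying heuristic is straightforward. Below is a minimal Python sketch, assuming a hypothetical export of click events (session ID, element selector, timestamp); the threshold and time window are illustrative values, not an industry standard.

```python
from collections import defaultdict

# Hypothetical click events exported from an analytics tool: each event
# has a session ID, the element clicked, and a timestamp in seconds.
clicks = [
    {"session": "abc", "element": "#checkout-btn", "ts": 0.0},
    {"session": "abc", "element": "#checkout-btn", "ts": 0.4},
    {"session": "abc", "element": "#checkout-btn", "ts": 0.9},
    {"session": "xyz", "element": "#nav-home", "ts": 5.0},
]

def find_rage_clicks(events, threshold=3, window=2.0):
    """Flag (session, element) pairs that received `threshold` or more
    clicks within `window` seconds -- a simple rage-click heuristic."""
    groups = defaultdict(list)
    for e in events:
        groups[(e["session"], e["element"])].append(e["ts"])
    flagged = []
    for key, times in groups.items():
        times.sort()
        # Slide a window across the click timestamps for this element
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                flagged.append(key)
                break
    return flagged

print(find_rage_clicks(clicks))  # [('abc', '#checkout-btn')]
```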
Asking the right questions
In order to understand customers and how they are interacting with a website, it is also important to ask the right questions of our data. We often find ourselves focusing too much on overall metrics and highlighting broad aggregate data such as pageviews, channels, bounce rates and conversions, but these numbers are not always actionable. The questions that we ask of the data determine the quality of the insights that we can derive and the actions that we can take in response to what we are seeing.
While asking questions such as ‘how many visitors converted?’ and ‘how many returning vs new visitors did we get?’ will generate a numerical value, they won’t tell you much about your customers or provide any specific actions you can take to make improvements.
However, asking questions such as ‘how do visitors who converted behave differently from those who didn’t?’ and ‘how do returning visitors behave differently from new visitors?’ can help us directly understand how customers are behaving. By answering these questions we will be able to formulate insights and tactics to improve upon.
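As a concrete illustration of the difference between these two kinds of questions, here is a minimal pandas sketch comparing the behaviour of converters and non-converters; the session-level columns are a hypothetical export, not a standard analytics schema.

```python
import pandas as pd

# Hypothetical per-session export from an analytics platform.
sessions = pd.DataFrame({
    "session_id":   [1, 2, 3, 4, 5, 6],
    "pages_viewed": [8, 2, 5, 1, 7, 3],
    "used_search":  [True, False, True, False, True, False],
    "converted":    [True, False, True, False, True, False],
})

# Instead of one aggregate conversion number, compare the two segments:
# how do visitors who converted behave differently from those who didn't?
comparison = sessions.groupby("converted").agg(
    avg_pages=("pages_viewed", "mean"),
    search_usage_rate=("used_search", "mean"),
)
print(comparison)
```

The gap between the two rows is the actionable part: if converters use site search far more often than non-converters, for example, making search more prominent becomes a testable hypothesis.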
Focus on behaviours, not just metrics
Other questions that provide insightful rather than just aggregate data include ‘does behaviour change based on geographic location?’ or ‘does organic traffic from Google behave differently than traffic from paid campaigns?’ Asking questions of the data that yield measurable actions is the key to increasing conversions and decreasing latency in the funnel.
Surfacing behavioural data
In order to understand a customer’s journey through your site, knowing the user path is critical. This behavioural data is a way to understand the steps your customers actually took, in comparison to those you think they might take.
User path analysis
By performing user path analysis (sketched in code after this list), we will be able to answer questions such as:
- Is the site architecture optimised for conversions?
- How are the navigation and sub-navigation utilised?
- Are people clicking on breadcrumbs?
- How is internal linking impacting a user’s path?
- Are there technical problems leading to bounces or exits?
- Is site search required for customers to find the information they need?
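To give a flavour of how this works under the hood, here is a minimal Python sketch that counts page-to-page transitions across sessions; the clickstream data is hypothetical, shaped like what you might export from an analytics property.

```python
from collections import Counter

# Hypothetical clickstream: one ordered list of page paths per session.
sessions = [
    ["/home", "/category/shoes", "/product/42", "/cart"],
    ["/home", "/search", "/product/42", "/cart"],
    ["/home", "/category/shoes", "/product/42"],
    ["/home", "/category/shoes", "/product/42", "/cart"],
]

# Count every consecutive page-to-page transition across all sessions.
transitions = Counter()
for path in sessions:
    for step, next_step in zip(path, path[1:]):
        transitions[(step, next_step)] += 1

# The most common transitions show the routes users actually take,
# which you can compare against the routes your navigation assumes.
for (step, next_step), count in transitions.most_common(3):
    print(f"{step} -> {next_step}: {count}")
```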
However, the real power comes with segmenting users within these paths: being able to apply filters and drill into events that occur during a user’s visit. This is where Google Analytics can help.
New Google Analytics reports
Traditionally, analytics platforms haven’t made this information easy to find and explore. However, Google Analytics recently introduced a new property type called App + Web Analytics, and, even though ‘app’ is included in the name, you don’t need to have an app in order to utilise the new reporting capabilities that come with this property.
Exploration
One of the new reporting features available within this property is exploration reporting, which allows you to drag and drop segments, dimensions and metrics to create reports and data visualisations within Google Analytics itself.
Funnel Analysis
The funnel analysis report is another new feature, providing an improved view of the funnels currently available within Google Analytics, with more options to build analyses based on segments, dimensions and metrics than are available in Universal Analytics.
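The report does this in the UI, but the underlying arithmetic is worth seeing. Here is a minimal sketch with hypothetical step counts showing how step-to-step drop-off is calculated:

```python
# Hypothetical counts of sessions reaching each funnel step, in order.
funnel = [
    ("Viewed product", 10_000),
    ("Added to cart",   2_500),
    ("Began checkout",  1_200),
    ("Purchased",         600),
]

# Step-to-step completion rates reveal where the funnel leaks most.
for (prev_name, prev_count), (name, count) in zip(funnel, funnel[1:]):
    rate = count / prev_count
    print(f"{prev_name} -> {name}: {rate:.0%} continue, {1 - rate:.0%} drop off")
```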
Path Analysis
As Michelle mentioned, understanding user paths and clickstream data on a website is key, and another area within the App + Web Analytics reporting which allows you to do this is the path analysis report. This report includes a number of new features which are not available within the classic user behaviour and event flow analytics. For example, you can apply filters and segments to narrow down the set of users you want to analyse or change between node types to customise each step.
The keys to conversions: paths and prediction
Understanding user behaviour, and how people navigate through a website on their path to conversion, is critical. But how can we use this data to optimise, guide and influence a user’s behaviour and reduce friction in the customer journey? Michelle explained that this is where predictive analysis, through AI and machine learning models, can be used. Typically, conversations about machine learning and AI focus on big data from large-scale sites. However, applying a subset of AI known as Narrow AI can provide useful and actionable predictions with less data required.
What is Narrow AI?
Narrow AI describes artificial intelligence systems designed to handle a single or limited task; examples include attribution modelling and personalisation. Narrow AI is also associated with relatively straightforward machine learning algorithms, rather than the more advanced artificial general intelligence. So, while true AI requires large data sets, Narrow AI algorithms are well suited to addressing the challenges and questions faced by marketers.
Sequence prediction models
When trying to understand a user’s next step on the path to conversion, the kinds of Narrow AI models we want to explore are those around sequence prediction modelling. A sequence prediction model uses historical sequence information to predict the next step. Examples include Compact Prediction Trees (CPT) using Python and Markov chains using R. To produce reliable predictions you will need a sufficiently large data set, with enough common paths being taken by users, as well as a developer familiar with Python or R and with data transformation and modelling.
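Michelle’s examples used CPT in Python and Markov chains in R; to give a flavour of the idea, here is a minimal first-order Markov chain in Python with hypothetical toy paths. A real model would need far more sessions, and CPT-style models retain more context than this single-step approach.

```python
from collections import defaultdict, Counter

# Hypothetical historical user paths through a site.
paths = [
    ["/home", "/category/shoes", "/product/42", "/cart", "/checkout"],
    ["/home", "/search", "/product/42", "/cart", "/checkout"],
    ["/home", "/category/shoes", "/product/42", "/exit"],
    ["/home", "/category/shoes", "/product/42", "/cart", "/exit"],
]

# "Train" a first-order Markov chain by counting transitions out of each page.
transitions = defaultdict(Counter)
for path in paths:
    for current, nxt in zip(path, path[1:]):
        transitions[current][nxt] += 1

def predict_next(page):
    """Return the most likely next page and its estimated probability."""
    counts = transitions[page]
    if not counts:
        return None, 0.0
    nxt, n = counts.most_common(1)[0]
    return nxt, n / sum(counts.values())

page, prob = predict_next("/product/42")
print(f"After /product/42, users most often go to {page} (p={prob:.2f})")
# -> /cart with p=0.75 on this toy data
```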
In terms of learning Python for SEO, Michelle recommended watching Hamlet Batista’s recent webinar with DeepCrawl, where he explained how Python is being used within SEO and how to get started with learning it.
Benefits of path analysis and prediction
There are a number of benefits to setting up an App + Web Analytics property in order to understand user behaviour, including:
- Being able to better understand your users
- Discovering what might be broken in your funnel
- Determining if you have a tech, content or optimisation problem
- Adapting goals to capture how people are actually converting
In addition, by applying prediction to path analysis you will be able to:
- Programmatically intervene and direct users away from non-converting paths
- Reduce friction in site usage
- Optimise frequent paths
- Implement an internal linking program designed to lead, or keep users, on a path to conversion
Regular site crawls are critical
Site crawls and audits are also a crucial step in user path analysis, as they enable you to understand your site in the same way that search engines do. Site crawls are also foundational to comprehensively understanding how your site is structured and ensuring it is accessible to both users and search engines. Traffic from search engines remains one of the key converting channels for most sites, and visibility depends entirely on whether or not your site is optimally set up for being crawled, indexed and ranked by search engines.
Using a platform like DeepCrawl, you will be able to identify problems both for search engines, such as blocked pages, 404s and duplicate content, and for users, such as thin content, poor internal linking and navigation issues.
In order to provide your customers with a conversion-path-optimised experience on your site, you need to ensure that they can first find the website in search, and that your pages deliver on the experience they expect. Site crawls will surface these things for you. Regular site audits will also help to keep you ahead of the problem, allowing you to stay proactive, rather than reactive, about any issues your site may face.
Michelle also shared some examples of the DeepCrawl reports she has used to highlight problems which, once resolved, ensured maximum search visibility. These include the main dashboard, the uncrawled and non-indexable pages reports, and the duplicate content report.
Adhering to an ongoing process of crawling, optimising, analysing, optimising and then crawling again will set your site up for success in both attracting and converting customers. Michelle recommends regularly crawling your site in order to:
- Surface problems for bots accessing content
- Identify problems impacting user experience
- Discover content optimisation opportunities
Once you have this information, the next step is fixing any errors that may have surfaced in the crawl, pushing changes to production and then analysing the results from these optimisation efforts. Questions you will want to ask during your analysis phase include:
- Did customers return more frequently?
- Did we acquire new customers?
- Are customers spending more time on the site?
- How are users moving through to conversion?
- Have the paths taken to conversion changed?
The results from this analysis will lead to further optimisation, helping you understand whether the optimal user path is predictable while optimising towards those conversion paths.
Michelle concluded her webinar with an important message: the web is not static, so our sites shouldn’t be either; we should always be optimising towards providing the best customer experience.
Hear more from Michelle in our upcoming Q&A post
The audience asked so many brilliant questions during the webinar and we have collated all of these for an upcoming Q&A post which will be available soon on the DeepCrawl blog.
Get started with DeepCrawl
If you’re interested in learning how DeepCrawl can help you to identify issues on your site that are impacting both search engines and users, while also assisting with your optimisation efforts, why not get started with a DeepCrawl account today?
Don’t miss our next webinar with Dawn Anderson
Our next webinar will be taking place on Wednesday 18th December at 4PM GMT/11AM EST with Dawn Anderson, Managing Director at Berty. She will be exploring what Natural Language Processing is and how it can be used to gain a better understanding of search intent, while also covering how Google is using its BERT model to better understand search queries.