Why Google dropped Bing Discover pages from its index

Earlier this month, Google dropped a subset of Bing’s website from its index. The Bing Discover pages were removed entirely from Google’s search results, costing Bing millions of pageviews from Google search.

Bing.com/discover not in Google’s index. Here is a screenshot of Google showing no results for a site:bing.com/discover query.

How much traffic was lost? SEMrush reports show a significant amount; just look at this chart showing the visibility drop:

Here is one of the tweets about this huge traffic loss:

Massive traffic. Roey Skif wrote an article back in April of this year about how Bing was able to use Google search to drive insane traffic to its pages. He showed how “Bing piggybacks on the low competition of Google’s image search and how to get huge amounts of organic traffic.” Then, a few months later, Google seemed to have had enough.

Google’s guidelines. Google’s webmaster guidelines say, “Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages.” Google has long said it does not want search results pages showing up in its own search results. And Bing Discover was just that.

Google and Bing comments. Google would not send us a comment on the issue, but Microsoft told us “this was a change that Google made so we cannot answer on their behalf.” Microsoft did not make any changes to Bing Discover that would have led to this removal. I do believe Microsoft made changes after Google removed those pages from its index, but before that, no changes were made.

Why we should care. If you go against Google’s guidelines, no matter how big or small you are, Google may eventually take action against your site, and you can lose millions of visitors as a result. Read and understand Google’s webmaster guidelines, stay within them, and do not risk your site being penalized or removed from Google search.

About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.

This article was translated from Search Engine Land.

E-commerce brands: Track spend efficiency with this Google Ads Script

It goes without saying that tracking ROAS (return on ad spend) is essential for e-commerce brands using Google Ads, as it is how most measure the profitability of their campaigns.

Although Google provides topline ROAS figures, it’s hard to see the health of your account at the keyword level and identify quick wins.

A reader recently wrote in with a smart way of doing this using stacked bar graphs to help visualize ROAS performance of keywords across the account week on week.

However, they were still doing this manually in Excel.

So the developers at my employer designed this Google Ads script to automate this process and create a visual representation showing the keyword level performance.

How the script works

The script pulls all of the keywords within your account and their respective ROAS.

It then turns this data into two charts: the first lets you visualize what percentage of keywords have a specific ROAS, and the second lets you compare week on week to see whether ROAS has improved across your account.

Within the controls of the script, you define your ROAS buckets. The first graph will then show you the number of keywords within that bucket, like below.

The second graph will show you the number of keywords within the different ROAS buckets, but this time it is weighted by spend to give you a more representative view of what is going on.
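The bucketing at the heart of the process can be sketched roughly as follows. This is a minimal illustration, not the vendor’s actual script: the threshold values, the field names (`roas`, `cost`) and the function names are all assumptions.

```javascript
// Rough sketch of the bucketing logic (illustrative only; the real
// script's variable and field names may differ).
var ROAS_BUCKETS = [100, 200, 300]; // bucket boundaries, in percent

// Return a label for the bucket a ROAS value falls into.
function bucketFor(roas) {
  for (var i = 0; i < ROAS_BUCKETS.length; i++) {
    if (roas < ROAS_BUCKETS[i]) {
      return '< ' + ROAS_BUCKETS[i];
    }
  }
  return '>= ' + ROAS_BUCKETS[ROAS_BUCKETS.length - 1];
}

// Aggregate keywords into buckets twice: once by keyword count
// (first chart) and once weighted by spend (second chart).
function aggregate(keywords) {
  var byCount = {};
  var bySpend = {};
  keywords.forEach(function (kw) {
    var bucket = bucketFor(kw.roas);
    byCount[bucket] = (byCount[bucket] || 0) + 1;
    bySpend[bucket] = (bySpend[bucket] || 0) + kw.cost;
  });
  return { byCount: byCount, bySpend: bySpend };
}
```

In the real script, the keyword rows would come from a Google Ads report, and the two resulting objects would feed the two charts in the spreadsheet.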

Setting up the script

Before you run the script, you will need to change the following settings:

  • Set your ROAS thresholds. We generally suggest doing this in increments of 100 (e.g., 100, 200, 300). These should be defined on line 17 of the script, between the square brackets, with each increment separated by a comma.
  • Go to Google Sheets and create a new document. Copy the URL of the sheet, then paste it into line 21 of the script between the quotation marks.
  • Finally, set a lookback window on line 24. This is the time frame from which the script will pull data.
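Taken together, settings like these typically sit at the top of a Google Ads script as plain constants. The variable names and values below are illustrative, not the script’s actual ones; only the line numbers mentioned above come from the script itself.

```javascript
// Illustrative settings block; the real script's variable names may differ.
var ROAS_THRESHOLDS = [100, 200, 300, 400]; // bucket boundaries (line 17)
var SPREADSHEET_URL = 'https://docs.google.com/spreadsheets/d/your-sheet-id/edit'; // (line 21)
var LOOKBACK_DAYS = 30; // how many days of data to pull (line 24)
```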

The script

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

About The Author

Wesley is managing director at Clicteq. He currently manages a $4M-plus Google Ads portfolio across a range of different sectors. He regularly writes in leading marketing publications such as Econsultancy and Campaign Magazine.

Are your DSAs really outperforming standard ads? Find out with this ad copy length performance analysis script

I’m sorry to say it, but the rumors are true: size matters. Well, ad copy length does, anyway.

Why else would Google keep increasing character limits? Google’s own research found that the new expanded ads got 15% more clicks than other formats.

But are you actually making use of the space that’s available to you?

If you’ve never tested this before, it’s high time to assess your ad performance based on copy length.

With this script, you can do just that… and more! It can compare the performance of standard ads against Dynamic Search Ads (DSAs) across your account so you can check whether DSAs are actually working for you. As much as I love Google, you shouldn’t always trust them blindly – testing is key!

What does the script do?

This script allows you to see the performance of your ads over the last month aggregated by the number of characters used in each part of your ad copy: headlines, descriptions, and paths. So you get aggregated statistics for headline 1 with 30 characters, 29 characters, and so on.

It downloads a report of the account into a Google spreadsheet and creates a number of tabs: Headline 1, Headline 2, Headline 3, Description 1, Description 2, Description 3, Path 1, Path 2, and Path 3. It also creates three tabs (Headline, Description, and Path) where it concatenates all the respective components.

For each of these components, the report shows the number of ads with a certain character count, and then the sum of those ads’ clicks, impressions, cost, and conversions. It also shows an average cost per click (CPC), click-through rate (CTR), and cost per acquisition (CPA).
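As a rough sketch of that per-length aggregation (the field names such as `headline1` and the metric fields are assumptions for illustration; the real script reads them from an Ads report):

```javascript
// Hedged sketch: aggregate ads by the character count of one component
// (e.g., headline 1) and derive CTR, CPC and CPA per length.
function aggregateByLength(ads, field) {
  var rows = {};
  ads.forEach(function (ad) {
    var len = (ad[field] || '').length; // DSAs show 0 characters for headlines
    var r = rows[len] || (rows[len] = {
      ads: 0, clicks: 0, impressions: 0, cost: 0, conversions: 0
    });
    r.ads += 1;
    r.clicks += ad.clicks;
    r.impressions += ad.impressions;
    r.cost += ad.cost;
    r.conversions += ad.conversions;
  });
  // Derive the averaged metrics for each character count.
  Object.keys(rows).forEach(function (len) {
    var r = rows[len];
    r.ctr = r.impressions ? r.clicks / r.impressions : 0;
    r.cpc = r.clicks ? r.cost / r.clicks : 0;
    r.cpa = r.conversions ? r.cost / r.conversions : 0;
  });
  return rows;
}
```

The zero-character row is what lets you pull the DSA aggregate out for comparison against standard ads.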

In the headline tabs, DSAs are the ones shown with zero characters (though zeroes in descriptions and paths aren’t necessarily DSAs), so they’re easy to spot. By comparing them to the standard ads, you can check whether DSAs are really outperforming expanded text ads.

Why does it matter?

With more ad space, you can be more relevant to the search query and landing page. In other words, you can earn a better Quality Score (and who wouldn’t want that?).

For example, if your CTR performance is underwhelming for two headlines with 30 characters, you might want to consider adding a third headline or using your character count more effectively.

If you spot paths only a few characters long, you’re probably missing out on valuable space. Longer paths look more natural to users, and improve relevance by telling users exactly what to expect from the landing page.

You can also verify what percentage of your spend is coming from small ad space, e.g. old accounts with old ad formats that haven’t been updated yet.

How to get started

The setup for this one is super easy. First, create a blank spreadsheet. Then, copy the script below and paste it in the scripts section of Google Ads. Replace YOUR_SPREADSHEET_URL_HERE at the top with your blank spreadsheet’s URL, and you’re ready to run it. Easy peasy.

You can also play around with changing the date range and metrics if that works better for you. Here’s a link to the script. Have fun!

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

About The Author

Daniel Gilbert is the CEO at Brainlabs, the best paid media agency in the world (self-declared). He has started and invested in a number of big data and technology startups since leaving Google in 2010.

Google shortnames and the case of the disappearing reviews

Late last month, Google introduced a number of new Google My Business (GMB) features, chief among them a short URL/shortname for businesses. However, last week SEOs started noticing that adding shortnames to their clients’ GMB profiles caused reviews to disappear or listings to be suspended.

These are apparently two separate issues, according to Google.

Two different problems, happening erratically. The disappearing reviews and listing suspensions weren’t happening consistently but often enough to impact numerous local SEOs. Here’s a representative example:

Shortnames are intended to provide a short URL that local businesses can promote anywhere and that will directly surface the GMB profile when searched on Maps or Google. Since the issues first surfaced last week, Google has been aware of both the listings problem and the disappearing reviews.

Resolution coming “soon.” Google provided us with the following statement about what’s happening:

The recent concerns around the visibility of certain business listings are being corrected. The business listings were not suspended, but instead were not being shown as visible due to a technical issue. Business owners who experienced issues should be able to see their listings in Search soon. While some users may have experienced an improvement to the situation with the removal of their short name, the issue was not directly tied to the short name feature.

Apparently the problem of disappearing reviews is technically not identical to the missing listings bug. Both issues are being worked on. Google was not specific about a timeframe for resolution of these issues (beyond “soon”), but assured us they are being addressed.

Why we should care. Google’s shortname feature is a very useful marketing and branding tool for local businesses and a convenient way for consumers to quickly search for specific businesses. It’s unfortunate that the rollout has been buggy, but the problems are expected to be resolved, at which point marketers can resume adding shortnames to their customers’ GMB profiles.

About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He researches and writes about the connections between digital and offline commerce. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

Alexa, Google Home now battling perception of being ‘surveillance’ devices

Google acknowledged an earlier report from Belgian broadcaster VRT News about third-party subcontractors being able to access recordings of Google Home device owners. According to the report, the audio clips reviewed reportedly included enough information to determine the home addresses of several of the individuals involved.

Clips used by speech and language experts. Google says it sends clips to third-party language experts to ensure that Google is understanding local speech and accents. The company explained this process in a blog post today, saying “We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”

Google goes on to discuss how it safeguards user privacy and how only a small percentage of audio clips go to third parties for review. With this incident, however, Google adds to the perception that smart speaker devices are “listening” to their owners. This has been a problem for Alexa, which has received lots of negative coverage for its alleged eavesdropping on owners.

Privacy concerns a growing problem. A recent survey from NPR suggests that some consumers are now hesitating to buy smart speakers for privacy reasons.

Compared with a similar 2017 survey, consumers are now more concerned about privacy and security. The top reasons for not owning a smart speaker were hacking, concerns about smart speakers “always listening” and worry about “government eavesdropping.” This latter issue (government surveillance) is exacerbated by recent revelations that immigration authorities are mining DMV databases with facial recognition technology without Americans’ knowledge.

Current smart speaker owners also expressed similar privacy fears as non-owners, according to the NPR survey.

Why we should care. Around 2010, Facebook CEO Mark Zuckerberg talked about new social norms around information sharing in which people allegedly were less concerned about privacy. He didn’t quite say “privacy is dead,” which is often how his statements are characterized, but he did seem to suggest that privacy norms were now quite different. In the past year, Facebook has made a massive pivot toward privacy, as consumers have demanded more control over their information and Facebook has received mounting criticism over data security and privacy.

Unfortunately, Google, Amazon and Facebook haven’t gone far enough to instill confidence and genuinely give users control over their data, though they would undoubtedly dispute this statement. Behind the scenes, some large tech companies have been trying to weaken California’s forthcoming CCPA rules.

Smart speakers and displays are an important piece of new consumer technology – and have the potential to be an effective platform for marketers. But this nascent channel is increasingly at risk unless Google, Amazon, Facebook and, to a lesser degree, Apple can do more to boost privacy and consumer confidence.

About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He researches and writes about the connections between digital and offline commerce. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

Video: Lily Ray on recovering after a Google core update

I spoke with Lily Ray of Path Interactive at its NYC office on the topic of Google core algorithm updates. I asked Lily, a Search Engine Land author and SMX speaker, how her team approaches the task of helping new customers who were negatively impacted by a core update improve their rankings. As you know, Google said there are no fixes for core algorithm updates, but as you also know, you cannot tell your clients, “Sorry, we can’t fix it.”

In this video interview, we discussed how sites need to build trust through their staff and authors. We also discussed the challenges she faces when convincing clients that their internal staff need to take a more active role in the website’s content. The conversation also went into how transparent Google is, or isn’t, around these algorithm updates.

I started this vlog series recently, and if you want to sign up to be interviewed, you can fill out this form on Search Engine Roundtable. You can also subscribe to my YouTube channel by clicking here.

About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.

What two big studies tell us about the state of local marketing

Two new large studies, released within a week of one another, provide valuable insight into the current state of local digital marketing. FreshChalk published an analysis of 150,000 local business websites and SOCi conducted a Localized Social Marketing benchmarking analysis (registration required) of 163 leading franchise brands in ten vertical categories.

What they reveal, at the highest level, is that many multi-location brands and local marketers are not following basic local and SEO best practices. By not fully building out location pages across Google, Facebook and Yelp and more directly engaging consumers at the local level, marketers are missing opportunities to improve visibility and local rankings. There’s also a correlation between those brands executing well locally and sales growth.

The Localized Social Marketing (LSM) benchmarking study, co-produced by LSA, sought to understand whether a relationship exists between local-social marketing performance and business outcomes for multi-location brands and franchises. It also sought to establish success benchmarks overall and by industry vertical. Disclosure: I was involved in the data analysis and scoring of the SOCi study.

Looked at thousands of local brand pages

LSM examined 100 randomly selected locations associated with 163 brands drawn from the Franchise Times and Entrepreneur top franchise lists — many thousands of pages. It scored each location’s presence, reviews/ratings and local engagement (e.g., local content posting, responding to reviews, Q&A) on Facebook, Google (My Business) and Yelp. It then generated an overall ranking and rankings by industry.

The ten industries represented in the study were: Food & Beverage, Hotels, Personal Care Services, Education, Retail-General, Retail-Convenience, Business Services, Auto Parts & Services, Home Services and Real Estate. The study found that the top performers had sales growth 3x the average for all 163 brands (based on third party sales data). Category leaders had 2x sales growth.

GMB Pages for McAlister’s Deli, the overall LSM winner

Content differences on Google, Facebook, Yelp

One might have expected nearly 100 percent of locations for these brands to be claimed/verified across Google, Facebook and Yelp. LSM found, however, that on average 78% of locations were claimed. Google had the highest percentage (85%), while Facebook had 74% and Yelp had 75%.

As one might expect, Facebook was the platform that featured the most local engagement, by far. Nearly 75% of franchise locations were posting content on Facebook. However, these locations were doing very little in the way of posting content or otherwise engaging customers on Google My Business or Yelp. Only 2.3% of locations were responding to Google Q&A, for example.

Review response rates were (somewhat) better:

  • Facebook – 48% (of locations responded to consumer reviews)
  • Google – 36%
  • Yelp – 16%

The fact that only 16% of franchise locations were responding to reviews on Yelp is a surprise, given that reviews are Yelp’s central feature. Multiple studies show that consumers expect a response to their reviews (especially negative/critical reviews) within 24 hours and often much more quickly. This was a major area of weakness and potential improvement for these brands.

Google has the most reviews

Another surprise came in the form of ratings differences across platforms. Facebook had the highest average review scores (4.27), followed by Google (3.45) and then Yelp (2.09). These numbers are for the identical locations. Someone consulting reviews on Facebook would potentially get a very different sense of a business compared with someone looking on Yelp.

In terms of review volumes, Google had more reviews than Facebook and Yelp combined: Google My Business had 2x the number of reviews Facebook had and more than 10x the number Yelp had. This is no doubt a result of Google’s (and local SEOs’) increasing attention to reviews and its recruitment of Local Guides (now 95 million globally).

One might read this report and conclude that Yelp is less and less relevant and start to ignore the site. And it’s clear that Yelp was the lowest priority for these brands. However, Yelp continues to be an important consumer destination and ranks very well in Google results.

Yelp ranks for 92% of local queries

According to the FreshChalk study, “Yelp appears in the top five search results for 92% of Google web queries that consist of a city and business category.” Beyond its status as a consumer destination, it’s an important site for “barnacle SEO.” However, Yelp doesn’t rank equally in the top results for all cities.

HomeAdvisor and Angie’s List (owned by IAC) are the next sites to rank, while Facebook ranks for only 3% of local queries according to this study. (But Facebook ratings appear in Google Knowledge Panels.) FreshChalk points out, however, that Google My Business is the most important “local directory” for ranking in Google search results.

The study also found that better GMB reviews correlated with higher rankings on Google, which makes sense since reviews are a local ranking factor. Not surprisingly, businesses with ratings between 4 and 5 ranked better than those with lower ratings. FreshChalk did not find that better Yelp reviews were correlated with higher rankings in Google SERPs, but did find that Yelp review volume correlated with better Google organic rankings.

Best practices are clear

Without question, GMB is the most important local platform or “local directory,” as FreshChalk called it. But Facebook and Yelp should capture similar attention and effort. Many of the brands in the LSM study were paying insufficient attention to Yelp, which also syndicates its content to Bing, Apple Maps and Alexa.

It should be a no-brainer to claim and verify 100% of brand locations across Google, Facebook and Yelp. And while there are practical, logistical challenges in managing hundreds or thousands of locations, profiles with local content and images should be built out for each one. Most importantly, marketers must respond to reviews (and Google Q&A). And do so within 24 hours at the latest.

Conceptually all this is pretty straightforward, though most multi-location brands are not executing consistently or particularly well. Out of a possible 100 points in the LSM study, 89 was the top score (McAlister’s Deli), but 45 was the average across 163 franchise brands. There’s obviously considerable room for improvement.

About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He researches and writes about the connections between digital and offline commerce. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

PPC and machine learning: Where do we draw the line on automation?

One of the most hotly debated (and least understood) topics in the PPC world, automation is a behemoth to take on. Functionally, we have two distinct levels of automation happening simultaneously: on the PPC platforms themselves and in the processes used to manage PPC media. Our industry is at an inflection point, and we will not become better marketers simply by automating everything with blind faith in machines. Rather, we must take an intelligent approach to automation and to the future roles that PPC professionals will need to shift into.

A behemoth of a topic obviously requires some industry heavyweights to take it on at SMX Advanced 2019, so it was no surprise to see Frederick Vallaeys of Optmyzr and Brad Geddes of AdAlysis take the stage.

Before we dive into the insights shared during this session, it’s important to note one key theme that was prevalent throughout. That theme is the idea that we don’t need to automate everything. While it can be easy to imagine a fully automated future where, at the touch of a button, everything falls into place – that is simply not the reality of the world we are facing. Indeed, the digital transformation is particularly challenging for this exact reason – some things should be automated and some things absolutely shouldn’t – but where do we draw the line?

Insights From Frederick Vallaeys

Hot off the press, Fred just released an excellent new book entitled “Digital Marketing in an AI World: Futureproofing Your PPC Agency,” which I started reading on the flight to Seattle and, honestly, couldn’t put down! Many of the highlights of his book were featured during his session.

As one of Google’s early employees, Fred has been around long enough to see how the world of PPC has evolved and has, himself, provided a number of functional automation solutions to the wider PPC community.

Fred kicked off his session talking about the key reasons to automate PPC processes, specifically to: 

  • Save time (reducing costs to grow the bottom line)
  • Improve quality and reduce churn
  • Allow for scale (in order to make more money)

These reasons, of course, are not exclusive to the world of digital marketing but are key considerations for automation across all industries. The key idea here is, just because we can automate something, doesn’t necessarily mean we should. In order to prioritize our automation work, we need to leverage a Degree of Impact framework. Here’s one that Fred shared during his presentation:

“PPC is Becoming Easier & Harder at the Same Time” – Frederick Vallaeys

This sentiment was echoed by Brad Geddes as well, and fundamentally means, “in-platform account management is becoming easier” while “cross-platform advertising is making it harder as users fragment and walled gardens prevent the sharing of audience/performance data.”

Fundamentally, the role of the PPC manager is being reshaped by the waves of automation available to both platforms and advertisers. The PPC community has often pushed back on Google’s automation technology, claiming the new solutions don’t work or that a human can still outperform a computer. But, as Fred notes, perhaps this is a failure of the PPC manager not understanding how to train a machine correctly and giving up too quickly. This idea is key and will likely be the main trend we see in future generations of PPC experts: instead of knowing how to manually optimize, we must think about how to feed data into the machine and allow the machine to optimize from there. By understanding the core principles of machine learning, the best PPC managers of the future will take a data science approach to campaign management, which in turn will free up more time for them to actually become good marketers.

The evolving role of the PPC manager

Moving beyond the evolution of the PPC Manager to handle platform-specific automation, Fred jumped into the topic of automating workflows associated with PPC management. While it may appear to be a backhanded compliment for one of my favorite companies from our host city (Seattle), I absolutely loved this quote from Fred:

“Starbucks doesn’t make the world’s best coffee, but they do have the world’s best process.”

– Frederick Vallaeys

And it’s true: Starbucks has scaled because of its ability to bring that local, curated experience to thousands of city blocks around the world. At the end of the day, a defined, repeatable process is required to scale, and that means looking at automating your organization’s management processes.

Fred provided several great examples and a mapped hierarchy of process that other organizations can use to start compartmentalizing their processes:

Further, Fred shared some great insights regarding Automation tactics including:

  • Tactics are shifting: the old way was crunching numbers and looking at spreadsheets.
  • PPC marketers need to become MORE marketers and think about contextual alignment: setting the right goals and ensuring the measurement systems are correctly in place.

Yes, I decided to type in all caps for that last one, but it’s a fundamental truth: we need to rethink how we engage with the machine learning (ML) based future.

Next, Fred took us to an advanced place as we started to think about the nature of automation layers interacting with each other. This is a key concept to keep in mind with every automation, and we need to ask ourselves: What could the unintended consequences of this automation be, and what could it interact with?

One particular example looked at the relationship between bidding automation and the selected Attribution model applied to the associated goal in Google Ads:

Finally, Fred summed up his session with a few key insights:

  • The engines are unlikely to be great at cross-platform automation.
  • Automations are point-solutions. Humans still need to define the overall process.
  • Your role as a PPC expert will shift from tactical to strategic.

Insights from Brad Geddes

“I have a confession to make – I don’t *actually* want to automate everything”

– Brad Geddes

Brad is probably one of the smartest people in the world of PPC, and to see him present a session (rather than in his typical curation role on the SMX team) was nothing short of fantastic. Indeed, the idea that we want to, or should, automate everything is absurd, but there is a happy medium to be found. For one, functions like reporting should absolutely be automated: a recent study by the AdAlysis team found that “the average agency spends 8 days a month creating reports.” That’s 26% of our time dedicated to a repeatable task!

Brad kicked off by talking about the various tiers of automation that can exist and their respective core benefits: 

When we think about the way we should be using automation, it’s important to keep in mind that “Automation is the machine recommending optimizations – BUT you need to decide what to do.” These recommendations sit within two core categories:

  • Optimizations: The rules for the category are “debatable”
    • Pause poor performing ads
    • Use rotate ad serving
    • Pause duplicate keywords
    • Use negatives to stop multi-ad group query serving
  • Repairs: Everything in this category is useful
    • Missing extensions
    • Keyword conflicts
    • Broken URLs

But before we even look at recommendations, you need to know your KPIs, know what will actually change as a result (education), understand other changes that might occur (unintended consequences), and the general workflow. In order to understand if we should act on the recommendations provided, Brad shared a great Evaluation Framework:

If you can’t make a workflow then you can’t automate it, you can’t repeat it, and you can’t be consistent in what you do. Be sure to look at your recommendation and ask: 

  • Does this fit with how we think accounts should be optimized?
  • Should we test it? If so, put a test in place and actually run it, rather than making blind modifications.
  • Did it work? If yes, consider rolling it out in other places. You often do not want to roll it out account-wide, as that can have far-reaching impacts.

Additionally, when making changes based on automation recommendations, make a note of what you did, why, and when:

  • Avoid making the same mistakes twice by documenting your iteration process.
  • Project management systems are probably your most powerful tool in paid search because they keep everyone on the same page about what actually happened.
  • Remember, Google will not remind you of a change you may have recently made that has yet to truly show its impact.

Next, Brad tackled the topic of unintended consequences with a client example where a recommendation was actually impacted by other settings:

  • Client example: a client moved from eCPC to Target CPA; after three weeks, CPA was almost double what they were targeting. Why didn’t it work? They had device bid modifiers in place (+100% for desktop, -25% for mobile), and because the machine factors those modifiers in, it is unable to optimize correctly. 
  • When you switch to any automated ad system, it changes ad rotation – this is a big deal, as you can move away from your actual best ads (highest converting vs. highest CTR)
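The modifier interaction above can be sketched with some toy numbers (the figures below are invented for illustration; they are not from Brad’s client example): when a legacy device bid modifier is layered on top of an automated CPA target, the effective target drifts away from what the advertiser set.

```python
# Toy illustration (numbers invented, not from the client example): a legacy
# device bid modifier layered on top of an automated CPA target.
def effective_cpa(target_cpa, device_modifier):
    """Effective CPA the system ends up chasing after the modifier is applied."""
    return target_cpa * (1 + device_modifier)

target = 40.0  # the advertiser's intended CPA, in dollars

for device, modifier in [("desktop", 1.00), ("mobile", -0.25)]:
    print(f"{device}: effective CPA ~ ${effective_cpa(target, modifier):.2f}")

# desktop: effective CPA ~ $80.00 -- roughly double the intended $40,
# mirroring the "almost double" CPA the client saw after three weeks.
```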

One of the big ideas that Brad shared is to always ensure that any automation will follow the same rules you do. For example, an automation such as a bid modifier by time of day can be heavily impacted by conversion volume and may not always follow the best practices a human would set out.

Finally, Brad summed up his session with a few key insights:

  • Recommendations are great starting places to determine what to automate vs use a workflow.
  • Always understand how the automation works. 
  • Put a recommendation & automation evaluation workflow in place.

The State of Automation

If there were a Gartner Hype Cycle specific to the world of PPC automation, I believe we would be descending rapidly from the Peak of Inflated Expectations and adjusting to the rapids within the Trough of Disillusionment. As mentioned at the outset of this article, we are at an inflection point in the world of PPC. It is now that we set forth toward our ML-driven future, and we must all embrace intelligent automation, albeit cautiously and with an understanding of the potentially massive impact of unintended consequences.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

About The Author

Simon has had a passion for finding creative ways to measure real-world challenges from an early age. Combining an affinity for psychology, statistics and digital marketing, he is currently the Senior Director of Digital Intelligence at Wpromote. Simon regularly speaks at industry events, including SMX West, SMX East, Cleveland Research Group’s eCommerce Catalyst for Change and SMX Advanced on a variety of topics related to data-driven digital marketing. You can find out where he will next be speaking over at spoulton.com. Simon was also a finalist for Search Engine Land’s “Search Marketer of the Year 2018”, and was named one of the top 25 Social Business Leaders by IBM and the Economist in 2015.

This article was translated from Search Engine Land

Microsoft Advertising says it’s keeping average position reporting

Microsoft Advertising says it’s keeping average position reporting

Microsoft Advertising (formerly Bing Ads) has added the position-based impression share metrics that Google introduced last fall. But, unlike Google, it said average position reporting will be sticking around.

Adding prominence metrics. Now called prominence metrics in Microsoft Advertising (rather than share of voice), the six new stats are available at the campaign, ad group and keyword levels.

  • Top impression share
  • Top impression share lost to rank
  • Top impression share lost to budget
  • Absolute top impression share
  • Absolute top impression share lost to rank
  • Absolute top impression share lost to budget
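As a rough sketch of how these shares fit together (based on the general definition of impression share, not Microsoft’s published formulas, and with invented numbers), the share plus the two “lost to” figures partition the impressions an ad was eligible to receive at the top of the page:

```python
# Sketch of how the prominence shares relate (based on the general
# definition of impression share, not Microsoft's published formulas;
# all figures are invented for illustration).
eligible_top = 10_000  # auctions where the ad was eligible to show above organic results
shown_top = 6_500      # times it actually did
lost_to_rank = 2_500   # eligible top impressions missed due to ad rank
lost_to_budget = 1_000 # eligible top impressions missed due to budget

top_is = shown_top / eligible_top
print(f"Top impression share:  {top_is:.0%}")                          # 65%
print(f"Top IS lost to rank:   {lost_to_rank / eligible_top:.0%}")     # 25%
print(f"Top IS lost to budget: {lost_to_budget / eligible_top:.0%}")   # 10%
# The three shares together account for every eligible top impression (100%).
```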

The new prominence metric columns are available now. Image: Microsoft Advertising.

Removing others. Say goodbye to the other competitive metrics: impression share lost to bid, lost to relevance, and lost to expected CTR.

Average position. Microsoft Advertising confirmed it is keeping average position reporting.

“One key metric that will remain in your reporting is average position, as we’ve heard continuous feedback that shows this information is still very valuable to you,” said Nahva Tecklu, a Microsoft Advertising program manager, in the blog post.

Why we should care. Google announced in February that it will be retiring average position later this year. The news came a few months after it introduced the position metrics Microsoft Advertising added this week.

Google’s reasoning for removing average position is that it’s no longer a great indicator of where your ads actually appear, since position one may actually be at the bottom of the page on some search results. Microsoft is bucking this decision, but we’ll see how long average position sticks around if advertisers get comfortable not having it in Google Ads.

About The Author

Ginny Marvin is Third Door Media’s Editor-in-Chief, managing day-to-day editorial operations across all of our publications. Ginny writes about paid online marketing topics including paid search, paid social, display and retargeting for Search Engine Land, Marketing Land and MarTech Today. With more than 15 years of marketing experience, she has held both in-house and agency management positions. She can be found on Twitter as @ginnymarvin.

This article was translated from Search Engine Land

TF-IDF: The best content optimization tool SEOs aren’t using

TF-IDF: The best content optimization tool SEOs aren’t using

TF-IDF, short for term frequency–inverse document frequency, identifies the most important terms used in a given document. It is also one of the most overlooked content optimization tools available to SEOs today.

TF-IDF fills in the gaps of standard keyword research. The saturation of target keywords on-page doesn’t determine relevance – anyone can practice keyword stuffing. Search marketers can use TF-IDF to uncover the specific words top-ranking pages use to give target keywords context, which is how search engines understand relevance.

Why should SEOs care about TF-IDF?

Conducting a TF-IDF analysis shows you the most important words used in the top 10 pages for a given keyword. You’ll see the exact terms that search engines consider highly relevant for your keyword and can then compare your own content with competitors’.

Now, I’m not suggesting you throw out your other keyword research tools—they are still very useful in the beginning stages when choosing your target keyword. However, they simply do not provide the semantic keywords necessary to fully represent a topic.

Let’s compare a keyword research tool’s semantic abilities with TF-IDF:

Keyword: ‘how to make coffee’

Say you’re writing a guide about how to make coffee. Here’s what Ahrefs would suggest including:

These tools provide excellent keyword variations but do not offer any keywords to improve topical relevance.

On the other hand, a TF-IDF tool would provide these insights:

In the top 10 pages about how to make coffee, the most weighted words include:

  • water
  • cup
  • brew
  • filter
  • beans

One glance at these words reveals the topic without a single mention of the word coffee. That’s because TF-IDF surfaces semantically related keywords – “context” keywords, as one can think of them – that search engines statistically expect to see around the topic of “how to make coffee.”

The exclusion of these words from an article about making coffee would absolutely indicate a lack of relevance to search engines… which means you can say goodbye to your chances of high rankings. Traditional keyword research just doesn’t provide this type of insight. 

But some may ask: what about E-A-T? Won’t a good reputation be enough to override the content?

The answer is: No, not really.

In his presentation on technical content optimization, Mike King of iPullRank offers an excellent “David and Goliath” example of the importance of content relevance:

Moz, arguably one of the most relevant sites for SEO-related keywords, ranks #20 for “what does seo stand for.”

Moz’s page (URL rating of 56 and 2.54k backlinks):

Alpine Web Design, the “David” in this situation, ranks #2 for the same keyword.

Alpine’s page (URL rating of 15 and 75 backlinks):

From an authority and UX perspective, Moz is the clear winner. But TF-IDF analysis tells a different side of the story:



As you can see, Moz’s page does not adequately represent many contextual keywords that Google finds relevant for the term “what does SEO stand for.” A significantly higher URL rating and backlink profile couldn’t save it.
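The comparison these tools automate can be approximated with a simple gap check: take the high-weight terms from a TF-IDF analysis of the top-ranking pages and see which ones your page never mentions. A sketch with invented term lists (not the actual data from Mike King’s example):

```python
import re

# Hypothetical term list (invented for illustration, not real analysis data):
# high-weight terms a TF-IDF run over the top-ranking pages might surface.
top_weighted_terms = {"seo", "search", "engine", "optimization", "ranking", "traffic"}

my_page_text = """
SEO stands for search engine optimization, the practice of improving
a site's visibility in organic search results.
"""

# Tokenize the page and flag high-weight terms it never uses.
my_terms = set(re.findall(r"[a-z]+", my_page_text.lower()))
missing = sorted(t for t in top_weighted_terms if t not in my_terms)
print("Consider covering:", missing)  # ['ranking', 'traffic']
```

In a real workflow you would also compare the weights themselves, not just presence and absence, but even this coarse check shows where a page is thin on context.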

How to implement TF-IDF with free tools

The advantages of adding TF-IDF to your content strategy are clear. Fortunately, several free tools exist to simplify this process:

1. Seobility’s TF-IDF tool

Personally, this is my favorite tool. It’s the only one I’ve found that’s completely free, with no download or sign-up necessary. You get three TF-IDF checks per day to start, five with free sign-up, or 50 with the premium plan.

You also gain access to their text editing tool so you can optimize your content with the tool’s suggestions.

2. Ryte’s content success tool

Ryte’s TF-IDF tool is another excellent choice. You can sign up for Ryte for free and get 10 TF-IDF analyses per month, which includes keyword recommendations and topic inspiration.

This tool also includes a text editor for easy content optimization.

3. Link Assistant’s website auditor

This tool is my honorable mention because it requires downloading to gain access. Once downloaded, you should get unlimited TF-IDF analyses.

If you do decide to download, this video explains how to navigate to the TF-IDF dashboard. 

Final word: TF-IDF is a tool, not the tool

It’s important to note: using TF-IDF is no substitute for having authoritative authors or reviewers, especially when it comes to YMYL topics.

This method of research should be used primarily to increase your understanding of the most weighted terms in a given document, and perhaps influence the variety of words used in your pages. It will never replace the expertise of a professional in the field.

Similarly, TF-IDF should not be taken at face value. You will be unsuccessful if you mimic the exact average of the weighted terms in your own content. Don’t force words in if they don’t make sense.

TF-IDF is just one method of content optimization, not the basket to put all your eggs in. If you get one thing out of this post, it should be to consider adding TF-IDF analysis to your toolbox when creating or updating content, not replacing your existing method of keyword research.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

About The Author

Abby Reimer is a digital strategist at Uproer, where she develops SEO and content strategies for e-commerce and technology companies. Her career dream is to use public speaking and content to make SEO more accessible for marketers at all levels of expertise. She believes wholeheartedly that better search results are better for everyone.

This article was translated from Search Engine Land


About Smoop

Smoop has been helping small and medium-sized businesses with their online visibility since 2011. Smoop contributes at a strategic level to get maximum results from your marketing budget.

Would you like (faster) revenue growth for your business? Then get in touch.


Altingstraat 17
2593 SP
Den Haag

[email protected]

Copyright 2019 ©  SEO Smoop