How to Onboard Clients with Immersion Workshops - Whiteboard Friday

Spending quality time getting to know your client, their goals and capabilities, and getting them familiar with your team sets you up for a better client-agency relationship. Immersion workshops are the answer. Learn more about how to build a strong foundation with your clients in this week’s Whiteboard Friday presented by Heather Physioc.

Video Transcription

Hey, everybody, and welcome back to Whiteboard Friday. My name is Heather Physioc, and I’m Group Director of Discoverability at VMLY&R. So I learned that when you onboard clients properly, the rest of the relationship goes a lot smoother.

Through some hard knocks and bumps along the way, we’ve come up with this immersion workshop model that I want to share with you. So I actually conducted a survey of the search industry and found that we tend to onboard clients inconsistently from one to the next if we bother to do a proper onboarding with them at all. So to combat that problem, let’s talk through the immersion workshop.

Why do an immersion workshop with a client?

So why bother taking the time to pause, slow down, and do an immersion workshop with a client? 

1. Get knowledgeable fast

Well, first, it allows you to get a lot more knowledgeable about your client and their business a lot faster than you would if you were picking it up piecemeal over the first year of your partnership. 

2. Opens dialogue

Next it opens a dialogue from day one.

It creates the expectation that you will have a conversation and that the client is expected to participate in that process with you. 

3. Build relationships

You want to build a relationship where you know that you can communicate effectively with one another. It also starts to build relationships, so not only with your immediate, day-to-day client contact, but people like their bosses and their peers inside their organization who can either be blockers or advocates for the search work that your client is going to try to implement.

4. Align on purpose, roadmap, and measuring success

Naturally the immersion workshop is also a crucial time for you to align with your client on the purpose of your search program, to define the roadmap for how you’re going to deliver on that search program and agree on how you’re going to measure success, because if they’re measuring success one way and you’re measuring success a different way, you could end up at completely different places.

5. Understand the DNA of the brand

Ultimately, the purpose of a joint immersion workshop is to truly understand the DNA of the brand, what makes them tick, who are their customers, why should they care what this brand has to offer, which helps you, as a search professional, understand how you can help them and their clients. 

Setting

Do it live! (Or use video chats)

So the setting for this immersion workshop ideally should be live, in-person, face-to-face, same room, same time, same place, same mission.

But worst-case scenario, if for some reason that’s not possible, you can also pull this off with video chats, so at least you’re getting that face-to-face communication. There’s going to be a lot of back-and-forth dialogue, so that’s really, really important. It’s also important for building the empathy, communication, and trust between people. Seeing each other’s faces makes a big difference. 

Over 1–3 days

Now the ideal setting for the immersion workshop is two days, in my opinion, so you can get a lot accomplished.

It’s a rigorous two days. But if you need to streamline it for smaller brands, you can totally pull it off with one. Or if you have the luxury of stretching it out and getting more time with them to continue building that relationship and digging deeper, by all means stretch it to three days. 

Customize the agenda

Finally, you should work with the client to customize the agenda. So I like to send them a base template of an immersion workshop agenda with sessions that I know are going to be important to my search work.

But I work side-by-side with that client to customize sessions that are going to be the right fit for their business and their needs. So right away we’ve got their buy-in to the workshop, because they have skin in the game. They know which departments are going to be tricky. They know what objectives they have in their heads. So this is your first point of communication to make this successful.

Types of sessions

So what types of sessions do we want to have in our immersion workshop? 

Vision

The first one is a vision session, and this is actually one that I ask the clients to bring to us. So we slot about 90 minutes for the client to give us a presentation on their brand, their overarching strategy for the year, their marketing strategy for the year.

We want to hear about their goals, revenue targets, objectives, problems they’re trying to solve, threats they see to the business. Whatever is on their mind or keeps them up at night or whatever they’re really excited about, that’s what we want to hear. This vision workshop sets the tone for the entire rest of the workshop and the partnership. 

Stakeholder

Next we want to have stakeholder sessions.

We usually do these on day one. We’re staying pretty high level on day one. So these will be with other departments that are going to integrate with search. So that could be the head of marketing, for example, like a CMO. It could be the sales team. If they have certain sales objectives they’re trying to hit, that would be really great for a search team to know. Or it could be global regions.

Maybe Latin America and Europe have different priorities. So we may want to understand how the brand works on the global scale as opposed to just at HQ. 

Practitioner

On day two is when we start to get a little bit more in the weeds, and we call these our practitioner sessions. So we want to work with our day-to-day SEO contacts inside the organization. But we also set up sessions with people like paid search if they need to integrate their search efforts.

We might set up time with analytics. So this will be where we demo our standard SEO reporting dashboards and then we work with the client to customize it for their needs. This is a time where we find out who they’re reporting up to and what kinds of metrics they’re measured on to determine success. We talk about the goals and conversions they’re measuring, how they’re captured, why they’re tracking those goals, and their existing baseline of performance information.

We also set up time with developers. Technology is essential to actually implementing our SEO recommendations. So we set up time with them to learn about their workflows and their decision-making process. I want to know if they have resource constraints or what makes a good project ticket in Jira to get our work done. Great time to start bonding with them and give them a say in how we execute search.

We also want to meet with content teams. Now content tends to be one of the trickiest areas for our clients. They don’t always have the resources, or maybe the search scope didn’t include content from day one. So we want to bring in whoever the content decision-makers or creators are. We want to understand how they think, their workflows and processes. Are they currently creating search-driven content, or is this going to be a shift in mentality?

So a lot of times we get together and talk about process, editorial calendaring, brand tone and voice, whatever it takes to get content done for search.

Summary and next steps

So after all of these, we always close with a summary and next steps discussion. So we work together to think about all the things that we’ve accomplished during this workshop and what our big takeaways and learnings are, and we take this time to align with our client on next steps.

When we leave that room, everybody should know exactly what they’re responsible for. Very powerful. You want to send a recap after the fact saying, “Here’s what we learned and here’s what we understand the next steps to be. Are we all aligned?” Heads nod. Great. 

Tools to use

So here are a couple of tools that we’ve created, and we’ll make sure to link to these below.

Download all the tools

Onboarding checklist

We’ve created a standard onboarding checklist. The thing about search is when we’re onboarding a new client, we pretty commonly need the same things from one client to the next. We want to know things about their history with SEO. We need access and logins. Or maybe we need a list of their competitors. Whatever the case is, this is a completely repeatable process. So there’s no excuse for reinventing the wheel every single time.

So this standard onboarding checklist allows us to send this list over to the client so they can get started and get all the pieces in place that we need to be successful. It’s like mise en place when you’re cooking. 

Discussion guides

We’ve also created some really helpful session discussion guides. So we give our clients a little homework before these sessions to start thinking about their business in a different way.

We’ll ask them open-ended questions like: What kinds of problems are your business unit solving this year? Or what is one of the biggest obstacles that you’ve had to overcome? Or what’s some work that you’re really proud of? So we send that in advance of the workshop. Then in our business unit discussions, which are part of the stakeholder discussions, we’ll actually use a few of the questions from that discussion guide to start seeding the conversation.

But we don’t just go down the list of questions, checking them off one by one. We just start the conversation with a couple of them and then follow it organically wherever it takes us, open-ended, follow-up, and clarifying questions, because the conversations we are having in that room with our clients are far more powerful than any information you’re going to get from an email that you just threw over the fence.

Sticky note exercise

We also do a pretty awesome little sticky note exercise. It’s really simple. So we pass out sticky notes to all the stakeholders that have attended the sessions, and we ask two simple questions. 

  1. What would cause this program to succeed? What are all the factors that can make this work?
  2. What will cause it to fail?

Before you know it, the client has revealed, in their own words, what their internal obstacles and blockers will be. What are the things that they’ve run into in the past that have made their search program struggle? By having that simple exercise, it gets everybody in the mind frame of what their role is in making this program a success. 

Search maturity assessment

The last tool, and this one is pretty awesome, is an assessment of the client’s organic search maturity.

Now this is not about how good they are at SEO. This is how well they incorporate SEO into their organization. Now we’ve actually done a separate Whiteboard Friday on the maturity assessment and how to implement that. So make sure to check that out. But a quick overview. So we have a survey that addresses five key areas of a client’s ability to integrate search with their organization.

  • It’s stuff like people. Do they have the right resources? 
  • Process. Do they have a process? Is it documented? Is it improving? 
  • Capacity. Do they have enough budget to actually make search possible? 
  • Knowledge. Are they knowledgeable about search, and are they committed to learning more? Stuff like that.

So we’ve actually created a five-part survey that has a number of different questions that the client can answer. We try to get as many people on the client side to answer these questions as we can. Then we take the numerical answers and the open-ended answers and compile them into a maturity assessment for the brand after the workshop.

So we use that workshop time to actually execute the survey, and we have something that we can bring back to the client not long after to give them a picture of where they stand today and where we’re going to take them in the future and what the biggest obstacles are that we need to overcome to get them there. 

So this is my guide to creating an immersion workshop for your new clients. Be sure to check out the Whiteboard Friday on the maturity assessment as well.

We’d love to hear what you do to onboard your clients in the comments below. Thanks and we’ll see you on the next Whiteboard Friday.

Video transcription by Speechpad.com


Heather shared even more strong team-building goodness in her MozCon 2019 talk. Get access to her session and more in our newly released video bundle, plus access 26 additional future-focused SEO topics from our top-notch speakers:

Grab the sessions now!

Make sure to schedule a learning sesh with the whole team and maximize your investment in SEO education!

Translated from MOZ

Search Ads 360 rolls out auction-time bidding for Google Search campaigns

Google Search Ads 360, the enterprise-level search management piece of Google Marketing Platform that enables advertisers and agencies to manage campaigns across multiple search engines, has added auction-time bidding for Google Search campaigns.

Why we should care

For Search Ads 360 users, this brings the Smart Bidding capabilities offered in Google Ads to your Google Search campaigns. Smart bidding uses machine learning to make real-time bids based on a number of signals such as device, location and time of day.

Auction-time bidding analyzes your account history, conversions registered by Floodlight, Google Marketing Platform’s conversion tracking system, and other signals to predict when a conversion is likely to occur from an ad click. In contrast, a Search Ads 360 bid strategy analyzes campaign performance approximately every 6 hours to set keyword bids and set or recommend bid adjustments.

“By activating auction-time bidding you can enhance your performance when bidding on Google Search, while still maintaining your cross-channel bidding strategy powered by Search Ads 360,” explained Jason Krueger, a Search Ads 360 product manager, in the announcement.

More on the news

  • Auction-time bidding is generally available for Google Search campaigns and in open beta for Google Shopping campaigns.
  • Because auction-time bidding is machine learning-driven and requires data to train, Google says its Smart Bidding system will learn for at least a week before it begins setting auction-time bids.
  • Google says hundreds of advertisers participated in the beta and saw, on average, a conversion lift of 15 to 30% at the same or better ROI.
  • Click on an existing bid strategy to enable auction-time bidding in your Google Search campaigns as shown in the screenshot below. When you opt in, you consent to share the bid strategy’s Floodlight conversions with Google Ads to enable both bidding systems to use the same conversion data. For more details, see the help page.


About The Author

Ginny Marvin is Third Door Media’s Editor-in-Chief, managing day-to-day editorial operations across all of our publications. Ginny writes about paid online marketing topics including paid search, paid social, display and retargeting for Search Engine Land, Marketing Land and MarTech Today. With more than 15 years of marketing experience, she has held both in-house and agency management positions. She can be found on Twitter as @ginnymarvin.

This article was translated from Search Engine Land

How could custom software be useful for your business?

Businesses, whether small or big, use applications every day. Some come at no cost and handle primary duties like web browsing. Other applications are costly and can manage complex tasks such as project management or employee timekeeping. Many firms benefit from developing custom-made applications to fulfill their requirements specifically.

Custom-made software, also called bespoke software, is built to satisfy a specific, goal-oriented purpose. Since every company has its own needs, it’s hard to address numerous jobs in one product. Creating custom software is often the better choice, making the process measurable. Put simply, such a program is designed to suit the organization’s needs and requirements.

Customized software is adaptable and can be used across the whole organization, which makes it an ideal option for many businesses. It offers numerous advantages beyond what off-the-shelf software permits, with particular benefits in the areas of integration, guaranteed maintenance, and scalability. A company that specializes in this is AGR Technology.

Overall, customized software provides an exceptional foundation for the continuous improvement of a business.

The clear advantage of custom-made software is that it supplies solutions for functionality that is not possible in off-the-shelf applications. Taking a more comprehensive look, these advantages can be broken down further, as below.

1. You are using multiple applications to accomplish one task.

One software program for sales, one for billing, and one for shipping: surely there’s a better way to do this. If you’ve got one software program that monitors sales and stock, another that sends invoices out, and another that schedules shipments, it impacts your bottom line.

2. You are monitoring and assessing data manually.

Spreadsheets are fantastic. They permit you to monitor data in a number of ways. But if you are keeping tabs on stock, revenue reports, and other pieces of information this way, you are making data analysis tougher than necessary.

3. You are doing repetitive jobs manually.

Regular tasks like payroll, inventory, and invoicing are examples of manual jobs that occur frequently. Just because a job is repetitive does not mean it is error-proof: a transposed number or an extra keystroke can cause all sorts of issues.

4. Your software is not scalable.

Flexibility is crucial in business today. The normal cycles of growth and contraction in the market usually make their way into your business. Software that can’t scale to meet these demands is software that is not effective for your company.

5. You are constantly in search-and-test mode.

There are a lot of software programs intended for specific jobs; you know this because you are in the process of systematically evaluating them all. Some offer significantly less than what you want. Others are overly complicated, with features you’ll never use. You might even be attempting to stitch several pieces of unrelated software together, hoping to achieve a whole-business solution.

6. You have to meet compliance criteria.

Some businesses are subject to government regulations, and the compliance particulars are typically quite exact. Compliance can be harder to achieve if your company does not have the capacity to access the essential information.

7. You’ve got procedures that should be easy… but aren’t.

When one action depends on multiple other actions to finish, it is difficult to run your company effectively. How much time and effort do your employees put into these apparently easy tasks? How else could it be done?

8. You’ve got several locations.

Firms that operate from more than one location often need to access and update data across all of them. While off-the-shelf software is common in multi-location setups, it can also make communication between those locations more complex than necessary.

9. You’ve got a lot of paper.

While many visionaries predicted a paperless office, the simple fact is that few companies have managed to tame the paper tiger. Nonetheless, it is possible to offer decent customer support without mountains of paper. Eliminate the whiteboards, sticky notes, and printed dispatch reports.

 

Google can crawl AJAX just fine

“You no longer need to do anything special for hash-bang URLs,” Google’s John Mueller said on the September 17 edition of #AskGoogleWebmasters, “we’ll just try to render them directly.”

The question. “What is the current status of #! AJAX crawling? How do I set up redirects?” asked user @olegko via Twitter.

The answer. As stated above, webmasters do not need to take any special action for Google to crawl their AJAX applications.

“The AJAX crawling scheme was something we proposed in the early days of JavaScript sites, way back in 2009,” Mueller contextualized. “This worked great for a number of years, but over time, it became kind of redundant. Search engines — or at least, Google — had learned how to render most pages like a browser would. And, in the meantime, we’re even using a special version of Chrome for crawling and rendering.”

“In order to move to a different URL structure, you need to use JavaScript on these pages to create the redirects,” he added. “It’s not possible to use server-side redirects since everything after a hash — so, the number symbol — is not sent to the server, but rather processed in the browser. Once you’ve set up those redirects, as Googlebot reprocesses the hash-bang URLs on the site, it’ll spot the redirect and follow it appropriately.”
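Mueller’s instructions can be sketched in a few lines of client-side JavaScript. This is an illustrative mapping only; the `hashBangToPath` helper and the path scheme are assumptions you would adapt to your own URL structure.

```javascript
// Client-side redirect from legacy hash-bang URLs to clean paths.
// The fragment (everything after #) never reaches the server, so this
// mapping has to happen in the browser.
function hashBangToPath(hash) {
  // "#!/products/42" -> "/products/42"
  if (hash.startsWith('#!')) {
    return hash.slice(2) || '/';
  }
  return null; // not a hash-bang URL, nothing to do
}

// In the page itself you would wire it up roughly like this:
// const target = hashBangToPath(window.location.hash);
// if (target) window.location.replace(target);
```

Because the redirect happens in JavaScript, Googlebot only sees it once the page is rendered, which is exactly why Mueller notes that Googlebot will spot it as it reprocesses the hash-bang URLs.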

Why we should care. If you inherited a codebase that uses AJAX, or are using AJAX for routing URLs, you’ll be glad to know that Google should have no problem indexing your pages.

Learn more about JavaScript and the evergreen Googlebot. Here are some additional resources to give you a better idea of how Googlebot handles your interactive pages and applications.


About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.

This article was translated from Search Engine Land

Google search adds key moments for videos in search

Yesterday we spotted Google testing a new video feature, titled “in this video,” that shows a timeline of what happened in a specific video. Google has now officially launched this feature, calling it key moments in videos.

What it looks like. Here is a screen shot of the feature that shows this in action:

Find content within videos. This helps searchers quickly find content within videos. Google said “When you search for things like how-to videos that have multiple steps, or long videos like speeches or a documentary, Search will provide links to key moments within the video, based on timestamps provided by content creators. You’ll be able to easily scan to see whether a video has what you’re looking for, and find the relevant section of the content. For people who use screen readers, this change also makes video content more accessible.”

How can my videos show this? Google said this will work for English videos hosted on YouTube, where creators have provided timestamp information in the video description. It is not supported outside of YouTube yet, but Google opened a form to ask video creators about their interest outside of YouTube. The form is available over here. Those not on YouTube will need to mark up their videos using video schema and fill out that form.
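For videos hosted off YouTube, the video schema mentioned above can be expressed as JSON-LD using schema.org’s VideoObject and Clip types. The sketch below is one plausible shape under those vocabularies; the video names, URLs, and offsets are hypothetical, and you should check Google’s current structured-data documentation for the exact properties it requires for key moments.

```javascript
// Sketch of JSON-LD video markup with timestamped segments, using
// schema.org's VideoObject and Clip types. Treat this as a starting
// shape, not a guaranteed-valid implementation.
const videoMarkup = {
  '@context': 'https://schema.org',
  '@type': 'VideoObject',
  name: 'How to tile a bathroom',            // hypothetical video
  description: 'Step-by-step tiling guide.',
  uploadDate: '2019-09-17',
  contentUrl: 'https://example.com/videos/tiling.mp4',
  hasPart: [
    {
      '@type': 'Clip',
      name: 'Preparing the wall',
      startOffset: 0,   // seconds from the start of the video
      endOffset: 95,
      url: 'https://example.com/tiling?t=0',
    },
    {
      '@type': 'Clip',
      name: 'Cutting tiles',
      startOffset: 95,
      endOffset: 240,
      url: 'https://example.com/tiling?t=95',
    },
  ],
};

// Embed this in the page as <script type="application/ld+json">:
const jsonLd = JSON.stringify(videoMarkup, null, 2);
```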

Why we care. Video is big, really big, and Google shows many videos in its search results. If you do video as part of your content marketing strategy, you probably want to test the impact of having key moments in your videos in search versus not having them. Will those videos rank better? Will searchers interact with your videos less or more? Will your conversion metrics around these videos suffer or improve? Make sure to test and experiment before implementing it on all your videos.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.

This article was translated from Search Engine Land

How to Automate Pagespeed Insights For Multiple URLs using Google Sheets

Calculating individual page speed performance metrics can help you to understand how efficiently your site is running as a whole. Since Google uses the speed of a site (frequently measured by and referred to as PageSpeed) as one of the signals used by its algorithm to rank pages, it’s important to have that insight down to the page level.

One of the pain points in website performance optimization, however, is the lack of ability to easily run page speed performance evaluations en masse. There are plenty of great tools like PageSpeed Insights or the Lighthouse Chrome plugin that can help you understand more about the performance of an individual page, but these tools are not readily configured to help you gather insights for multiple URLs — and running individual reports for hundreds or even thousands of pages isn’t exactly feasible or efficient.

In September 2018, I set out to find a way to gather sitewide performance metrics and ended up with a working solution. While this method resolved my initial problem, the setup process is rather complex and requires that you have access to a server.

Ultimately, it just wasn’t an efficient method. Furthermore, it was nearly impossible to easily share with others (especially those outside of UpBuild).

In November 2018, two months after I published this method, Google released version 5 of the PageSpeed Insights API. V5 now uses Lighthouse as its analysis engine and also incorporates field data provided by the Chrome User Experience Report (CrUX). In short, this version of the API now easily provides all of the data that is provided in the Chrome Lighthouse audits.

So I went back to the drawing board, and I’m happy to announce that there is now an easier, automated method to produce Lighthouse reports en masse using Google Sheets and Pagespeed Insights API v5.

Introducing the Automated PageSpeed Insights Report:

With this tool, we are able to quickly uncover key performance metrics for multiple URLs with just a couple of clicks.

All you’ll need is a copy of this Google Sheet, a free Google API key, and a list of URLs you want data for — but first, let’s take a quick tour.

How to use this tool

The Google Sheet consists of the three following tabs:

  • Settings
  • Results
  • Log

Settings

On this tab, you will be required to provide a unique Google API key in order to make the sheet work.

Getting a Google API Key

  1. Visit the Google API Credentials page.
  2. Choose the API key option from the ‘Create credentials’ dropdown (as shown):

  3. You should now see a prompt providing you with a unique API key:

  4. Next, simply copy and paste that API key into the section found on the “Settings” tab of the Automated Pagespeed Insights spreadsheet.

Now that you have an API key, you are ready to use the tool.
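Behind the scenes, each URL in the sheet becomes one request to the PageSpeed Insights API v5 endpoint, keyed by that API key. The sketch below shows roughly how such a request is built and how the Lighthouse audits map onto the metrics the Results tab reports. The audit keys are the v5 names as I understand them; verify them against the current API reference before relying on them.

```javascript
// Sketch: build a PageSpeed Insights API v5 request and pull out the
// Lighthouse metrics the sheet reports.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

function buildPsiUrl(pageUrl, apiKey, strategy = 'mobile') {
  const params = new URLSearchParams({ url: pageUrl, key: apiKey, strategy });
  return `${PSI_ENDPOINT}?${params.toString()}`;
}

// Given a parsed v5 response, extract the audits the Results tab tracks.
function extractMetrics(response) {
  const audits = response.lighthouseResult.audits;
  const pick = (key) => (audits[key] ? audits[key].displayValue : null);
  return {
    timeToInteractive: pick('interactive'),
    firstContentfulPaint: pick('first-contentful-paint'),
    firstMeaningfulPaint: pick('first-meaningful-paint'),
    timeToFirstByte: pick('time-to-first-byte'),
    speedIndex: pick('speed-index'),
  };
}
```

In the spreadsheet this fetching happens inside Google Apps Script; the same request/extraction logic works anywhere you can make an HTTP call.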

Setting the report schedule

On the Settings tab, you can schedule which day and time that the report should start running each week. As you can see from this screenshot below, we have set this report to begin every Wednesday at 8:00 am. This will be set to the local time as defined by your Google account.

As you can see this setting is also assigning the report to run for the following three hours on the same day. This is a workaround to the limitations set by both Google Apps Scripts and Google PageSpeed API.

Limitations

Our Google Sheet uses a Google Apps Script to run all the magic behind the scenes. Each time the report runs, Google Apps Script sets a six-minute execution time limit (thirty minutes for G Suite Business / Enterprise / Education and Early Access users).

In six minutes you should be able to extract PageSpeed Insights for around 30 URLs.

Then you’ll be met with the following message:

In order to continue running the function for the rest of the URLs, we simply need to schedule the report to run again. That is why this setting will run the report again three more times in the consecutive hours, picking up exactly where it left off.
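That pick-up-where-it-left-off behavior boils down to batching against a time budget and persisting an index between runs. Here is a minimal sketch of the pattern in plain JavaScript; the function names and the in-memory state object are illustrative, since the real sheet stores its progress inside Apps Script rather than a local variable.

```javascript
// Process URLs until a time budget is nearly spent, remember where we
// stopped, and let the next scheduled run resume from that index.
const TIME_BUDGET_MS = 6 * 60 * 1000; // Apps Script's ~six-minute ceiling
const SAFETY_MARGIN_MS = 30 * 1000;   // stop early rather than be killed

function runBatch(urls, state, processUrl, budgetMs = TIME_BUDGET_MS, now = Date.now) {
  const start = now();
  let i = state.nextIndex || 0;
  while (i < urls.length) {
    if (now() - start > budgetMs - SAFETY_MARGIN_MS) break; // out of time
    processUrl(urls[i]); // e.g. call the PageSpeed API and write the row
    i += 1;
  }
  state.nextIndex = i;           // the next run resumes here
  state.done = i >= urls.length; // all URLs handled?
  return state;
}
```

Each scheduled trigger simply calls `runBatch` again with the saved state, which is why four consecutive hourly runs can cover roughly four times as many URLs as a single run.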

The next hurdle is the limitation set by Google Sheets itself.

If you’re doing the math, you’ll see that since we can only automate the report a total of four times, we will theoretically only be able to pull PageSpeed Insights data for around 120 URLs. That’s not ideal if you’re working with a site that has more than a few hundred pages!

The schedule function in the Settings tab uses the Google Sheet’s built-in Triggers feature. This tells our Google Apps script to run the report automatically at a particular day and time. Unfortunately, using this feature more than four times causes the “Service using too much computer time for one day” message.

This means that our Google Apps Script has exceeded the total allowable execution time for one day. It most commonly occurs for scripts that run on a trigger, which have a lower daily limit than scripts executed manually.

Manually?

You betcha! If you have more than 120 URLs that you want to pull data for, then you can simply use the Manual Push Report button. It does exactly what you think.

Manual Push Report

Once clicked, the ‘Manual Push Report’ button (linked from the PageSpeed Menu on the Google Sheet) will run the report. It will pick up right where it left off with data populating in the fields adjacent to your URLs in the Results tab.

For clarity, you don’t even need to schedule the report to run to use this document. Once you have your API key, all you need to do is add your URLs to the Results tab (starting in cell B6) and click ‘Manual Push Report’.

You will, of course, be met with the inevitable “Exceed maximum execution time” message after six minutes, but you can simply dismiss it, and click “Manual Push Report” again and again until you’re finished. It’s not fully automated, but it should allow you to gather the data you need relatively quickly.

Setting the log schedule

Another feature in the Settings tab is the Log Results function.

This will automatically take the data that has populated in the Results tab and move it to the Log sheet. Once it has copied over the results, it will automatically clear the populated data from the Results tab so that when the next scheduled report run time arrives, it can gather new data accordingly. Ideally, you would want to set the Log day and time after the scheduled report has run to ensure that it has time to capture and log all of the data.

You can also manually push data to the Log sheet using the ‘Manual Push Log’ button in the menu.

How to confirm and adjust the report and log schedules

Once you’re happy with the scheduling for the report and the log, be sure to set it using the ‘Set Report and Log Schedule’ from the PageSpeed Menu (as shown):

Should you want to change the frequency, I’d recommend first setting the report and log schedule using the sheet.

Then adjust the runLog and runTool functions using Google Script Triggers.

  • runLog controls when the data will be sent to the LOG sheet.
  • runTool controls when the API runs for each URL.

Simply click the pencil icon next to each respective function and adjust the timings as you see fit.

You can also use the ‘Reset Schedule’ button in the PageSpeed Menu (next to Help) to clear all scheduled triggers. This can be a helpful shortcut if you’re simply using the interface on the ‘Settings’ tab.

PageSpeed results tab

This tab is where the PageSpeed Insights data will be generated for each URL you provide. All you need to do is add a list of URLs starting from cell B6. You can either wait for your scheduled report time to arrive or use the ‘Manual Push Report’ button.

You should now see the following data generating for each respective URL:

  • Time to Interactive
  • First Contentful Paint
  • First Meaningful Paint
  • Time to First Byte
  • Speed Index

You will also see a column for Last Time Report Ran and Status on this tab. This tells you when the data was gathered and whether the API request was successful. A successful API request will show a status of “complete” in the Status column.
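Under the hood, each row is the product of one PageSpeed Insights v5 API call. As a rough sketch of where these five numbers live in the response, the audit keys below are assumptions based on the v5 Lighthouse payload at the time of writing (Time to First Byte, for example, has since been renamed in the API), so verify them against the current documentation.

```javascript
// Sketch of extracting the five report metrics from a PageSpeed Insights
// v5 response object. The audit keys are assumptions based on the v5
// Lighthouse payload; check them against the current API reference.
var AUDIT_KEYS = {
  'Time to Interactive': 'interactive',
  'First Contentful Paint': 'first-contentful-paint',
  'First Meaningful Paint': 'first-meaningful-paint',
  'Time to First Byte': 'time-to-first-byte', // later renamed 'server-response-time'
  'Speed Index': 'speed-index'
};

function extractMetrics(response) {
  var audits = response.lighthouseResult.audits;
  var row = {};
  Object.keys(AUDIT_KEYS).forEach(function (label) {
    var audit = audits[AUDIT_KEYS[label]];
    // displayValue is the human-readable string (e.g. "3.2 s");
    // fall back to null if the audit is missing from the payload.
    row[label] = audit ? audit.displayValue : null;
  });
  return row;
}
```

Each value here is the human-readable `displayValue`; the raw `numericValue` field is also available in the audit object if you prefer milliseconds.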

Log tab

Logging the data is a useful way to keep a historical record of these important speed metrics. There is nothing to modify in this tab; however, you will want to ensure that there are plenty of empty rows. When the runLog function runs (which is controlled by the Log schedule you assign in the “Settings” tab, or via the Manual Push Log button in the menu), it will move all of the rows from the Results tab that contain a status of “complete”. If there are no empty rows available on the Log tab, it will simply not copy over any of the data. All you need to do is add several thousand rows, depending on how often you plan to check in on and maintain the Log.
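The move-and-clear behaviour described above boils down to a simple filter. Here is a plain-JavaScript sketch of the idea, using an assumed row shape rather than the tool's actual Apps Script code.

```javascript
// Sketch of the Log move described above: rows whose Status reads
// "complete" are appended to the log, everything else stays in Results.
// Row shape { url, status } is an assumption for illustration.
function pushToLog(resultRows, logRows, emptyLogRowsAvailable) {
  var completed = resultRows.filter(function (r) { return r.status === 'complete'; });
  // Mirror the tool's guard: if the Log tab has no room, nothing is copied.
  if (completed.length > emptyLogRowsAvailable) {
    return { results: resultRows, log: logRows };
  }
  return {
    // Completed rows are cleared from Results so the next run starts fresh.
    results: resultRows.filter(function (r) { return r.status !== 'complete'; }),
    log: logRows.concat(completed)
  };
}
```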

How to use the log data

The scheduling feature in this tool has been designed to run on a weekly basis to allow you enough time to review the results, optimize, then monitor your efforts. If you love spreadsheets then you can stop right here, but if you’re more of a visual person, then read on.

Visualizing the results in Google Data Studio

You can also use this Log sheet as a Data Source in Google Data Studio to visualize your results. As long as the Log sheet stays connected as a source, the results should automatically publish each week. This will allow you to work on performance optimization and evaluate results using Data Studio easily, as well as communicate performance issues and progress to clients who might not love spreadsheets as much as you do.

Blend your log data with other data sources

One great Google Data Studio feature is the ability to blend data. This allows you to compare and analyze data from multiple sources, as long as they have a common key. For example, if you wanted to blend the Time to Interactive results against Google Search Console data for those same URLs, you can easily do so. You will notice that the column in the Log tab containing the URLs is titled “Landing Page”. This is the same naming convention that Search Console uses and will allow Data Studio to connect the two sources.
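Conceptually, the blend Data Studio performs here is an inner join on that shared key. A small sketch of the idea, with illustrative field names rather than a real Search Console schema:

```javascript
// Sketch of blending two sources on the shared "Landing Page" key,
// as Data Studio does. Field names other than "Landing Page" are
// illustrative assumptions, not an actual Search Console schema.
function blendOnLandingPage(speedRows, searchConsoleRows) {
  // Index the Search Console rows by their join key for fast lookup.
  var byPage = {};
  searchConsoleRows.forEach(function (r) { byPage[r['Landing Page']] = r; });
  return speedRows
    // Inner join: keep only pages present in both sources.
    .filter(function (r) { return byPage[r['Landing Page']]; })
    .map(function (r) {
      var sc = byPage[r['Landing Page']];
      return {
        'Landing Page': r['Landing Page'],
        'Time to Interactive': r['Time to Interactive'],
        'Clicks': sc['Clicks'],
        'Avg Position': sc['Avg Position']
      };
    });
}
```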

There are several ways that you can use this data in Google Data Studio.

Compare your competitors’ performance

You don’t need to limit yourself to just your own URLs in this tool; you can use any set of URLs. This makes it a great way to compare your competitors’ pages and even see if there are any clear indicators of speed affecting positions in search results.

Improve usability

Don’t immediately assume that your content is the problem. Your visitors may not be leaving the page because they don’t find the content useful; it could be slow load times or other incompatibility issues that are driving visitors away. Compare bounce rates, time on site, and device type data alongside performance metrics to see if it could be a factor.

Increase organic visibility

Compare your performance data against Search ranking positions for your target keywords. Use a tool to gather your page positions, and fix performance issues for landing pages on page two or three of Google Search results to see if you can increase their prominence.

Final thoughts

This tool is all yours.

Make a copy and use it as is, or tear apart the Google Apps Script that makes this thing work and adapt it into something bigger and better (if you do, please let me know; I want to hear all about it).

Remember, the PageSpeed Insights API v5 now includes all of the same data that is provided in the Chrome Lighthouse audits, which means there are many more details you can extract beyond the five metrics this tool generates.

Hopefully, this tool helps you gather performance data a little more efficiently between now and when Google releases its recently announced Speed report for Search Console.

Translated from Moz

Google cracks down on some review rich results

Google announced it has applied an “algorithmic update to reviews in rich results” in order to determine whether Google should show the reviews (the stars) in the search results snippets. Google said this will “ease implementation” going forward.

In short, Google is now showing review rich results only for:

  • A clear set of schema types for review snippets
  • Non “self-serving” reviews
  • Reviews that have the name property within the markup

Schema types allowed. Here is a list of the schema types that are currently allowed to show review rich results in the Google search results:

Self-serving reviews aren’t allowed. Google said in addition to the above, self-serving reviews aren’t allowed. “Reviews that can be perceived as “self-serving” aren’t in the best interest of users. We call reviews “self-serving” when a review about entity A is placed on the website of entity A – either directly in their markup or via an embedded 3rd party widget,” Google wrote.

Name property required. Google also said that when you are reviewing something, you must add the name property to the markup for it to qualify for review rich results. “the name property is now required, so you’ll want to make sure that you specify the name of the item that’s being reviewed,” Google added.
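Putting the new rules together, a compliant snippet might look something like the example below: a reviewed item carrying the now-required name property, hosted on a site other than the entity being reviewed. All names and values here are invented for illustration; the vocabulary is standard schema.org Product/Review/AggregateRating markup.

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "review": {
    "@type": "Review",
    "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
    "author": { "@type": "Person", "name": "Jane Doe" }
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
```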

What are review rich results. Here is a screenshot of a sample snippet in the Google search results that has reviews within the result. These stars are the review rich results:

Updated help docs. Google has updated the review snippet developer documents to clarify these changes going forward.

Why we care. If pages on your site show review rich results, i.e. the stars, in the search results, then you will want to make sure they continue to show these stars. Review these new rules and make sure your pages and schema markup comply; if they do, you should be okay. If your review rich results disappear today or in the near future, it is likely related to this new algorithmic update around review rich results.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.

This article was translated from Search Engine Land

10 Link Building Lies You Must Ignore

Even though link building has been a trade for more than a decade, it’s clear that there is still an enormous amount of confusion around it.

Every so often, there is a large kerfuffle. Some of these controversies and arguments arise simply from a necessity to fill a content void, but some of them arise from genuine concern and confusion:

“Don’t ask for links!”

“Stick a fork in it, guest posting is done!”

“Try to avoid link building!”

SEO is an ever-changing industry; what worked yesterday might not work today. Google’s personnel don’t always help the cause. In fact, they often add fuel to the fire. That’s why I want to play the role of “link building myth-buster” today. I’ve spent over ten years in link building, and I’ve seen it all.

I was around for Penguin, and every iteration since. I was around for the launch of Hummingbird. And I was even around for the Matt Cutts videos.

So, if you’re still confused about link building, read through to have ten of the biggest myths in the business dispelled.

1. If you build it, they will come

There is a notion among many digital marketers and SEOs that if you simply create great content and valuable resources, the users will come to you. If you’re already a widely recognized brand/website, this can be a true statement. If, however, you are like the vast majority of websites — on the outside looking in — this could be a fatal mindset.

In order to get people to find you, you have to build the roads that will lead them to where you want. This is where link building comes in.

A majority of people searching Google end up clicking on organic results. In fact, for every click on a paid result in Google, there are 11.6 clicks to organic results!

And in order to build your rankings in search engines, you need links.

Which brings me to our second myth around links.

2. You don’t need links to rank

I can’t believe that there are still people who think this in 2019, but there are. That’s why I recently published a case study regarding a project I was working on.

To sum it up briefly, the more authoritative, relevant backlinks I was able to build, the higher the site ranked for its target keywords. This isn’t to say that links are the only factor in Google’s algorithm that matters, but there’s no doubt that a robust and relevant backlink profile goes a long way.

3. Only links with high domain authority matter

As a link builder, you should definitely seek target sites with high metrics. However, they aren’t the only prospects that should matter to you.

Sometimes a low domain authority (DA) might just be an indication that it is a new site. But forget about the metrics for a moment. Along with authority, relevancy matters. If a link prospect is perfectly relevant to your website but has a low DA, you should still target it. In fact, most sites that are highly relevant to yours will likely not have the most eye-popping metrics, precisely because they are so niche. More often than not, relevancy is more important than DA.

When you focus solely on metrics, you will lose out on highly relevant opportunities. A link that sends trust signals is more valuable than a link that has been deemed important by metrics devised by entities other than Google.

Another reason is that Google’s algorithm looks for diversity in your backlink profile. You might think that a profile with over 100 links, all of which have a 90+ DA, would be the aspiration. In fact, Google will look at it as suspect. So while you should absolutely target high-DA sites, don’t neglect the “little guys.”

4. You need to build links to your money pages

When I say “money pages,” I mean the pages where you are specifically looking to convert, whether it’s users into leads or leads into sales.

You would think that if you’re going to put in the effort to build the digital highways that will lead traffic to your website, you would want all of that traffic to find these money pages, right?

In reality, though, you should take the exact opposite approach. First off, approaching sites that are in your niche and asking them to link to your money pages will come off as really spammy/aggressive. You’re shooting yourself in the foot.

But most importantly, these money pages are usually not pages that have the most valuable information. Webmasters are much more likely to link to a page with resourceful information or exquisitely created content, not a page displaying your products or services.

Building links to your linkable assets (more on that in a second) will increase your chances of success and will ultimately raise the profile of your money pages in the long run as well.

5. You have to create the best, most informative linkable asset

If you’re unfamiliar with what a linkable asset is exactly, it’s a page on your website designed specifically to attract links/social shares. Assets can come in many forms: resource pages, funny videos, games, etc.

Of course, linkable assets don’t grow on trees, and the process of coming up with an idea for a valuable linkable asset won’t be easy. This is why some people rely on “the skyscraper technique.” This is when you look at the linkable assets your competitors have created, you choose one, and you simply try to outdo it with something bigger and better.

This isn’t a completely ineffective technique, but you shouldn’t feel like you have to do this.

Linkable assets don’t need to be word-heavy “ultimate guides” or heavily-researched reports. Instead of building something that really only beats your competitor’s word count, do your own research and focus on building an authoritative resource that people in your niche will be interested in.

The value of a linkable asset has much more to do with finding the right angle and the accuracy of the information you’re providing than the amount.

6. The more emails you send, the more links you will get

I know several SEOs who like to cast a wide net — they send out emails to anyone and everyone that has even the tiniest bit of relevancy or authority within their niche. It’s an old sales principle: the idea that more conversations will lead to more purchases/conversions. And indeed in sales, this is usually going to be the case.

In link building? Not so much.

This is because, in link building, your chances of getting someone to link to you are increased when the outreach you send is more thoughtful/personalized. Webmasters pore over emails on top of emails on top of emails, so much so that it’s easy to pass over the generic ones.

They need to be effectively persuaded as to the value of linking to your site. If you choose to send emails to any site with a pulse, you won’t have time to create specific outreach for each valuable target site.

7. The only benefit of link building is algorithmic

As I mentioned earlier, links are fundamental to Google’s algorithm. The more quality backlinks you build, the more likely you are to rank for your target keywords in Google.

This is the modus operandi for link building. But it is not the only reason to build links. In fact, there are several non-algorithmic benefits which link building can provide.

First off, there’s brand visibility. Link building will make you visible not only to Google in the long term but to users in the immediate term. When a user comes upon a resource list with your link, they aren’t thinking about how it benefits your ranking in Google; they just might click your link right then and there.

Link building can also lead to relationship building. Because of link building’s very nature, you will end up conversing with many potential influencers and authority figures within your niche. These conversations don’t have to end as soon as they place your link.

In fact, if the conversations do end there every time, you’re doing marketing wrong. Take advantage of the fact that you have their attention and see what else you can do for each other.

8. You should only pursue exact match anchors

Not all myths are born out of complete and utter fiction. Some myths persist because they have an element of truth to them or they used to be true. The use of exact match anchor text is such a myth.

In the old days of SEO/link building, one of the best ways to get ahead was to use your target keywords/brand name as the anchor text for your backlinks. Keyword stuffing and cloaking were particularly effective as well.

But times have changed in SEO, and I would argue mostly for the better. When Google sees a backlink profile that uses only a couple of variations of anchor text, you are now open to a penalty. It’s now considered spammy. To Google, it does not look like a natural backlink profile.

As such, it’s important to note now that the quality of the link itself is far more important than the anchor text that comes with it.

It really should be out of your hands anyway. When you’re link building the right way, you are working in conjunction with the webmasters who are publishing your link. You do not have 100 percent control of the situation, and the webmaster will frequently end up using the anchor text of their choice.

So sure, you should optimize your internal links with optimized anchor text when possible, but keep in mind that it is best to have diverse anchor text distribution.

9. Link building requires technical abilities

Along with being a link builder, I am also an employer. When hiring other link builders, one skepticism I frequently come across relates to technical skills. Many people who are unfamiliar with link building think that it requires coding or web development ability.

While having such abilities certainly won’t hurt you in your link building endeavors, I’m here to tell you that they aren’t at all necessary. Link building is more about creativity, communication, and strategy than it is about knowing how to write a for loop in JavaScript.

If you have the ability to effectively persuade, create valuable content, or identify trends, you can build links.

10. All follow links provide equal value

Not all links are created equal, and I’m not even talking about the difference between follow links and no-follow links. Indeed, there are distinctions to be made among follow links alone.

Let’s take .edu links, for example. These links are some of the most sought after for link builders, as they are thought to carry inordinate power. Let’s say you have two links from the same .edu website. They are both on the same domain, same authority, but they are on different pages. One is on the scholarship page, the other is on a professor’s resource page which has been carefully curated.

They are both do-follow links, so naturally, they should both carry the same weight, right?

Fail. Search engines are smart enough to know the difference between a hard-earned link and a link that just about anyone can submit to.

Along with this, the placement of a link on a page matters. Even if two links are on the exact same page (not just the same domain) a link that is above-the-fold (a link you can see without scrolling) will carry more weight.

Conclusion

Link building and SEO are not rocket science. There’s a lot of confusion out there, thanks mainly to the fact that Google’s standards change rapidly and old habits die hard, and the answers and strategies you seek aren’t always obvious.

That said, the above points are some of the biggest and most pervasive myths in the industry. Hopefully, I was able to clear them up for you.

Translated from Moz

Subdomain leasing and the giant hole in Google’s Medic update

ConsumerAffairs provides buying guides for everything from mattresses to home warranties. But they also direct consumers on purchasing hearing aids, dentures, diabetic supplies, and even LASIK surgery. Many have questioned the legitimacy of ConsumerAffairs buying guides, largely because top-rated brands often have financial relationships with the organization.

ConsumerAffairs’ health content has been hit in the post-Medic world, but now it seems they’ve found a way to circumvent the algorithm update by hosting slightly modified versions of their buying guides on local news websites around the country. Google “hearing aids in Phoenix” and you’ll discover just how well this strategy is working. Local ABC affiliate station ABC15 hosts all of ConsumerAffairs’ buying guides, including those in the health category, on their new “reviews” subdomain.

So far, I’ve counted almost 100 of these ConsumerAffairs content mirrors. Despite cracking down on low-authority medical advice and subdomain leasing, Google seems to be missing this huge hack on their ranking algorithm.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Abram Bailey, AuD is a Doctor of Audiology and the founder of HearingTracker.com, the leading independent resource for informed hearing aid consumers.

This article was translated from Search Engine Land

E-A-T and the Quality Raters' Guidelines - Whiteboard Friday

EAT — also known as Expertise, Authoritativeness, and Trustworthiness — is a big deal when it comes to Google’s algorithms. But what exactly does this acronym entail, and why does it matter to your everyday work? In this bite-sized version of her full MozCon 2019 presentation, Marie Haynes describes exactly what E-A-T means and how it could have a make-or-break effect on your site.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. My name is Marie Haynes, from Marie Haynes Consulting, and I’m going to talk to you today about EAT and the Quality Raters’ Guidelines. By now, you’ve probably heard of EAT. It’s a bit of a buzzword in SEO. I’m going to share with you why EAT is a big part of Google’s algorithms, how we can take advantage of this news, and also why it’s really, really important to all of us.

The Quality Raters’ Guidelines

Let’s talk about the Quality Raters’ Guidelines. These guidelines are a document that Google has provided to this whole army of quality raters. There are apparently 16,000 quality raters, and what they do is they use this document, the Quality Raters’ Guidelines, to determine whether websites are high quality or not.

Now the quality raters do not have the power to put a penalty on your website. They actually have no direct bearing on rankings. Instead, what happens is they feed information back to Google’s engineers, and Google’s engineers can take that information and determine whether their algorithms are doing what they want them to do. Ben Gomes, the Vice President of Search at Google, said in a recent interview with CNBC that the information in the Quality Raters’ Guidelines is fundamentally what Google wants the algorithm to do.

“They fundamentally show us what the algorithm should do.”
– Ben Gomes, VP Search, Google

So we believe that if something is in the Quality Raters’ Guidelines, either Google is already measuring this algorithmically, or they want to be measuring it, and so we should be paying close attention to everything that is in there. 

How Google fights disinformation

There was a guide that was produced by Google earlier, in February of 2019, and it was a whole guide on how they fight disinformation, how they fight fake news, how they make it so that high-quality results are appearing in the search results.

There were a couple of things in here that were really interesting. 

1. Information from the quality raters allows them to build algorithms

The guide talked about the fact that they take the information from the quality raters and that allows them to build algorithms. So we know that it’s really important that the things that the quality raters are assessing are things that we probably should be paying attention to as well. 

2. Ranking systems are designed to ID sites with high expertise, authoritativeness, and trustworthiness

The thing that was the most important to me or the most interesting, at least, is this line that said our ranking systems are designed to identify sites with a high indicia of EAT, of expertise, authoritativeness, and trustworthiness.

So whether or not we want to argue about whether EAT is a ranking factor, I think that’s semantics. Whatever the phrase “ranking factor” means, what we really need to know is that EAT is really important in Google’s algorithms. We believe that if you’re trying to rank for any term that really matters to people, you need to pay attention to EAT. “Your money or your life” really means a page that is helping people make a decision in their lives or helping people part with money, and Google doesn’t want to rank websites for important queries if they’re lacking EAT.

The three parts of E-A-T

So it’s important to know that EAT has three parts, and a lot of people get hung up on just expertise. I see a lot of people come to me and say, “But I’m a doctor, and I don’t rank well.” Well, there are more parts to EAT than just expertise, and so we’re going to talk about that. 

1. Expertise

But expertise is very important. If you have not read the Quality Raters’ Guidelines yet, you really, really should read this document.

It’s a little bit long, but it’s full of so much good information. The raters are given examples of websites, and they’re told, “This is a high-quality website. This is a low-quality website because of this.” One of the things that they say for one of the posts is this particular page is to be considered low quality because the expertise of the author is not clearly communicated.

Add author bios

So the first clue we can gather from this is that for all of our authors we should have an author bio. Perhaps if you are a nationally recognized brand, then you may not need author bios. But for the rest of us, we really should be putting an author bio that says here’s who wrote this post, and here’s why they’re qualified to do so.

Another example in the Quality Raters’ Guidelines was a post about the flu. What the quality raters were told is that there’s no evidence that this author has medical expertise. There are other examples where there’s no evidence of financial expertise, and legal expertise is another one. Think about it.

If you were diagnosed with a medical condition, would you want to be reading an article that’s written by a content writer who’s done good research? It might be very well written. Or would you rather see an article that is written by somebody who has been practicing in this area for decades and has seen every type of side effect that you can have from medications and things like that?

Hire experts to fact-check your content

Obviously, the doctor is who you want to read. Now I don’t expect us all to go and hire doctors to write all of our content, because there are very few doctors that have time to do that, and the same is true of experts in any other YMYL profession. But what you can do is hire these people to fact-check your posts. We’ve had some clients see really nice results from having content writers write the posts in a well-researched and referenced way, and then hiring physicians to say this post was medically fact-checked by Dr. So-and-so. So this is really, really important for any type of site that wants to rank for a YMYL query. 

One of the things that we started noticing, in February of 2017, we had a number of sites that came to us with traffic drops. That’s mostly what we do. We deal with sites that were hit by Google algorithm updates. What we were noticing is a weird thing was happening.

Prior to that, sites that were hit tended to have all sorts of technical issues, and we could say, “Yes, there’s a really strong reason why this site is not ranking well.” These sites were, for the most part, technically sound. But what we noticed is that, in every instance, the posts that were now stealing the rankings they used to have were written by people with real-life expertise.

This is not something that you want to ignore. 

2. Authoritativeness

We’ll move on to authoritativeness. Authoritativeness is really very, very important, and in my opinion this is the most important part of EAT. There’s another reference in the Quality Raters’ Guidelines about a good post, and it says, “The author of this blog post has been known as an expert on parenting issues.”

So it’s one thing to actually be an expert. It’s another thing to be recognized online as an expert, and this is what we should all be working on: having other people online recognize us or our clients as experts in our subject matter. That sounds a lot like link building, right? We want to get links from authoritative sites.

The disinformation guide I mentioned earlier actually tells us that PageRank and EAT are closely connected. So this is very, very important. I personally believe — I can’t prove this just yet — but I believe that Google does not want to pass PageRank through sites that do not have EAT, at least for YMYL queries. This could explain why Google feels really comfortable that they can ignore spam links from negative SEO attacks, because those links would come from sites that don’t have EAT.

Get recommendations from experts

So how do we do this? It’s all about getting recommendations from experts. The Quality Raters’ Guidelines say in several places the raters are instructed to determine what do other experts say about this website, about this author, about this brand. It’s very, very important that we can get recommendations from experts. I want to challenge you right now to look at the last few links that you have gotten for your website and look at them and say, “Are these truly recommendations from other people in the industry that I’m working in? Or are they ones that we made?”

In the past, pretty much every link that we could make would have the potential to help boost our rankings. Now, the links that Google wants to count are ones that truly are people recommending your content, your business, your author. So I did a Whiteboard Friday a couple of years ago that talked about the types of links that Google might want to value, and that’s probably a good reference to find how can we find these recommendations from experts.

How can we do link building in a way that boosts our authoritativeness in the eyes of Google? 

3. Trustworthiness

The last part, which a lot of people ignore, is trustworthiness. People would say, “Well, how could Google ever measure whether a website is trustworthy?” I think it’s definitely possible. Google has a patent. Now, just because there’s a patent doesn’t mean they’re necessarily doing this.

Reputation via reviews, blog posts, & other online content

But they do have a patent that talks about how they can gather information about a brand, about an individual, about a website from looking at a corpus of reviews, blog posts, and other things that are online. What this patent talks about is looking at the sentiment of these blog posts. Now some people would argue that maybe sentiment is not a part of Google’s algorithms.

I do think it’s a part of how they determine trustworthiness. So what we’re looking for here is if a business really has a bad reputation, if you have a reputation where people online are saying, “Look, I got scammed by this company.” Or, “I couldn’t get a refund.” Or, “I was treated really poorly in terms of customer service.” If there is a general sentiment about this online, that can affect your ability to rank well, and that’s very important. So all of these things are important in terms of trustworthiness.

Credible, clear contact info on website

You really should have very credible and clear contact information on your website. That’s outlined in the Quality Raters’ Guidelines. 

Indexable, easy-to-find info on refund policies

You should have information on your refund policy, assuming that you sell products, and it should be easy for people to find. All of this information I believe should be visible in Google’s index.

We shouldn’t be noindexing these pages. Don’t worry about the fact that they might be kind of thin or irrelevant or perhaps even duplicate content. Google wants to see this, and so we want it to be visible to their algorithms. 

Scientific references & scientific consensus

Other things too, if you have a medical site or any type of site that can be supported with scientific references, it’s very important that you do that.

One of the things that we’ve been seeing with recent updates is a lot of medical sites are dropping when they’re not really in line with scientific consensus. This is a big one. If you run a site that has to do with natural medicine, this is probably a rough time for you, because Google has been demoting sites that talk about a lot of natural medicine treatments, and the reason for this, I think, is because a lot of these are not in line with the general scientific consensus.

Now, I know a lot of people would say, “Well, who is Google to determine whether essential oils are helpful or not, because I believe a lot of these natural treatments really do help people?” The problem though is that there are a lot of websites that are scamming people. So Google may even err on the side of caution in saying, “Look, we think this website could potentially impact the safety of users.”

If that’s the case, you may have trouble ranking well. So if you have posts on natural medicine, or on anything else outside the generally accepted scientific consensus, one thing you can do is try to show both sides of the story and talk about how conventional physicians would actually treat the condition.

That can be tricky. 

Ad experience

The other thing that can speak to trust is your ad experience. I don’t think this is actually in the algorithms just yet; I think it’s coming, though perhaps it’s already there. But the Quality Raters’ Guidelines talk a lot about ads that are distracting or disruptive, or that block readers from seeing content. Ads like that can be a sign of low trustworthiness.

“If any of Expertise, Authoritativeness, or Trustworthiness is lacking, use the ‘low’ rating.”

I want to leave you with this last quote, again from the Quality Raters’ Guidelines, and this is significant. The raters are instructed that if any one of expertise, authoritativeness, or trustworthiness is lacking, then they are to rate a website as low quality. Again, that’s not going to penalize that website. But it’s going to tell the Google engineers, “Wait a second. We have these low-quality websites that are ranking for these terms. How can we tweak the algorithm so that that doesn’t happen?”



But the important thing here is that if any one of these three things, the E, the A, or the T are lacking, it can impact your ability to rank well. So hopefully this has been helpful. I really hope that this helps you improve the quality of your websites. I would encourage you to leave a comment or a question below. I’m going to be hanging out in the comments section and answering all of your questions.

I have more information on these subjects at mariehaynes.com/eat and also /trust if you’re interested in these trust issues. So with that, I want to thank you. I really wish you the best of luck with your rankings, and please do leave a question for me below.

Video transcription by Speechpad.com


Feeling like you need a better understanding of E-A-T and the Quality Raters’ Guidelines? You can get even more info from Marie’s full MozCon 2019 talk in our newly released video bundle. Go even more in-depth on what drives rankings, plus access 26 additional future-focused SEO topics from our top-notch speakers:

Grab the sessions now!

Invest in a bag of popcorn and get your whole team on board to learn!

Translated from MOZ
