A round-up of some of the most interesting Google updates and industry news for March 2012.
Google released a blog post listing over 40 search improvements, which are being picked over by SEOs around the globe. Changes from the updates have already been reported, and highlights include:
Increased local results in searches - searches for generic terms that often have local meaning such as “flower delivery” are seeing more local listings appear in the natural search listings, even beyond the Google Places box. This has also been linked with increased recognition of which URLs belong to which country.
This now means that any 'bricks and mortar' businesses will need to make sure that they have well optimised and useful landing pages for all their local stores, in addition to their normal Google Places pages.
International launch of shopping rich snippets. Shopping rich snippets [project codename “rich snippets”] are used by Google to highlight product prices, availability, ratings and review counts. This month Google is expanding shopping rich snippets globally (they were previously only available in the US, Japan and Germany).
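For context, pages qualify for these snippets by carrying structured markup around the product details. The Python sketch below assembles an illustrative fragment; the data-vocabulary.org property names approximate the vocabulary Google documented at the time, and all values are invented:

```python
# Illustrative only: assembles the kind of microdata fragment that
# shopping rich snippets are generated from. Property names approximate
# the data-vocabulary.org Product vocabulary; values are made up.
def product_markup(name, price, currency, rating, review_count):
    """Return an HTML fragment carrying product data for rich snippets."""
    return (
        '<div itemscope itemtype="http://data-vocabulary.org/Product">\n'
        '  <span itemprop="name">%s</span>\n'
        '  <span itemprop="price">%s %.2f</span>\n'
        '  Rated <span itemprop="rating">%s</span> from\n'
        '  <span itemprop="count">%d</span> reviews\n'
        '</div>'
    ) % (name, currency, price, rating, review_count)

print(product_markup("Bouquet of 12 red roses", 24.99, "GBP", "4.5", 87))
```

Whatever vocabulary you use, Google's own testing tool is the final word on whether the markup is eligible.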
Link evaluation. This was initially thought to be a big change, as Google stated they would be changing the way in which they evaluate links: apparently they were turning off a method of link analysis that they had been using for several years. “We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.”
However, Google engineer John Mu recently played this down, stating that it was such a minor change that they even debated whether to include it in the update list.
“Although the link evaluation has been played down, I think it’s time to step back and take thought. If Google were to de-rank the value of anchor text, how would we be affected? More link strategies should be in place to cover such a change. Quality posts with brand mentions, social mentions and citations should be included a lot more as part of a link strategy”
Site-wide issues now split out from per-page issues.
Google used to report DNS resolution failures, connectivity issues with your web server, and problems fetching your robots.txt file on a per-URL basis. These have now been split out so that webmasters can track the failure rates for each type of site-wide error.
If there aren’t any errors, these are summarised with some friendly check marks to let webmasters know everything is OK with their website.
However, some data is now only available via the API:
Some data, including the number of URLs reported and the specifics of the errors, was removed from the main Google Webmaster Tools console earlier in March. Much of this detail is still available via the GData API, though.
Two different types of files are available that provide detail about crawl errors:
- A download of eight CSV files, one of which is a list of all crawl errors
- A crawl errors feed, which enables you to programmatically fetch 25 errors at a time
What this means is that different slices of data are available in four ways:
- User interface display
- User interface-based CSV download
- API-based download
- API-based feed
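As a sketch of the API-based feed route, the Python snippet below parses a crawl errors feed. The `wt:` element names are our assumption based on the GData Webmaster Tools namespace, and the sample feed is invented, so treat this as illustrative rather than definitive:

```python
# A sketch of consuming the crawl errors feed. The Atom structure and
# the wt: element names reflect our reading of the GData Webmaster
# Tools docs; the sample feed below is made up for illustration.
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
WT = "{http://schemas.google.com/webmasters/tools/2007}"

SAMPLE_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:wt="http://schemas.google.com/webmasters/tools/2007">
  <entry>
    <wt:url>http://www.example.com/old-page</wt:url>
    <wt:issue-type>not-found</wt:issue-type>
    <wt:detail>404 (Not found)</wt:detail>
  </entry>
  <entry>
    <wt:url>http://www.example.com/blocked</wt:url>
    <wt:issue-type>blocked-by-robots-txt</wt:issue-type>
    <wt:detail>URL blocked by robots.txt</wt:detail>
  </entry>
</feed>
"""

def parse_crawl_errors(feed_xml):
    """Extract (url, issue_type, detail) tuples from a crawl errors feed."""
    root = ET.fromstring(feed_xml)
    errors = []
    for entry in root.findall(ATOM + "entry"):
        errors.append((
            entry.findtext(WT + "url"),
            entry.findtext(WT + "issue-type"),
            entry.findtext(WT + "detail"),
        ))
    return errors

for url, issue, detail in parse_crawl_errors(SAMPLE_FEED):
    print(url, issue, detail)
```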
The eight CSV files are:
- Top Pages
- Top Queries
- Crawl Errors
- Content Errors
- Content Keywords
- Internal Links
- External Links
- Social Activity
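As an illustration of working with the downloads, the sketch below tallies the Crawl Errors CSV by error type. The column headings and rows here are assumptions for illustration; check them against the actual file before relying on them:

```python
# Illustrative sketch: count crawl errors by type from the CSV download.
# The column headings and rows below are invented for the example.
import csv
import io
from collections import Counter

sample = io.StringIO(
    "URL,Error type,Detected\n"
    "http://www.example.com/a,404 (Not found),2012-03-05\n"
    "http://www.example.com/b,404 (Not found),2012-03-07\n"
    "http://www.example.com/c,DNS error,2012-03-08\n"
)

# Tally how many URLs fall into each error category.
counts = Counter(row["Error type"] for row in csv.DictReader(sample))
for error_type, n in counts.most_common():
    print(error_type, n)
```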
Google released a notification regarding over 40 search improvements in February this year. There were several sizeable updates within the release, one of which was Google’s Venice update, which will have a large impact on how local search results are displayed in Google’s results.
Searches for generic terms that often have local relevancy such as “flower delivery” or “eye test” are now seeing more local listings appearing in the place of more traditional landing pages and separately from the Google Places result.
Shown Above - A search query for the keyphrase “eye test” with the location set to Oxford, UK.
Shown Above - A search query for the same keyphrase “eye test” with the location set to Manchester, UK.
Google also state that this has been linked with increased recognition of which URLs belong to which country.
“Local optimisation is the one area users cannot turn off and local bias is set to become an even more fundamental part of Google’s results. Local optimisation now looks set to be one of the key opportunities for retailers with a high street presence and should be a vital part of their SEO strategy”
Gary Moyle, SEO Manager
“For stores with many branches across the UK, the new Venice update will provide the greatest opportunity to drive relevant local traffic to their website. This will definitely help the larger brands become larger, but the smaller shops may struggle a little and find themselves pushed down further by the bigger brands.”
On a recent panel at SXSW, Matt Cutts (head of the webspam team at Google) stated:
“We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now.”
However the concept itself is nothing new as Google have been making modifications to their algorithm regarding 'over optimisation' for some time now (think keyword stuffing, aggressive footer links, too many backlinks with similar anchor text etc).
Google have yet to release an official statement or documentation about the update; however, it is sure to create a great deal of speculation over the implementation of traditional SEO techniques. Google have relied on a culture of fear in the past to control SEO activity, so it could be a case of Google's bark being worse than its bite.
“As with most of Google's algorithm changes, we won't know the real world impact until the update is actually live. In 2009, Matt did a video on over optimization penalties saying there was no such thing, so it will be interesting to see whether this actually comes in the form of an actual penalty or whether Googlebot will focus less on signals that are easy to game (h1, page title, internal links etc).”
“It won’t affect you as long as you continue to ensure you create a useful, 'well optimised' website with good content; this is something Matt says in the article he wants to reward.
Also, there are two points to what Matt described:
1) How Google can help websites that are not very well set up still be competitive.
2) Penalising websites that 'over optimise'. The examples of 'over optimising' given were keyword stuffing, bulk link exchanges etc. So this is business as usual, but with the dial turned up a bit.”
“Google has recently adopted a much sterner approach to webmaster communications, perhaps emboldened on the back of forcefully promoting Google+. Whether it’s losing organic keyphrase data or receiving an ‘unnatural backlink’ Google Webmaster Tools message, this shift is less about levelling the playing field and more about showing those who use questionable SEO practices who’s boss.
The reality is policing the search results is too big a job for thousands of engineers and quality raters. Factoring in social signals from 2 billion internet users would help build a quality global product. I’d expect a flurry of algorithm updates (and rollbacks) over the next few months as this unfolds.”
"Google has announced that by default, it will begin encrypting search beyond Google.com in the coming weeks. Since the reason Google uses secure search is primarily to enable its “Search Plus Your World” personalized results, this means SPYW is likely to expand beyond the US. It also means that search marketers can expect the percentage of “not provided” data they see to greatly increase."
Google has announced this roll-out internationally; however, it seems likely that it will be targeted towards English-speaking countries first, such as google.co.uk (Google UK), with an eventual roll-out to most common languages.
We are already seeing large spikes in “not provided” data for UK clients, starting around 5th March, and this figure looks set to go beyond 10% for many clients despite Google’s claims that the percentage of “not provided” data should remain under 10%.
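To track your own exposure, the share of hidden keywords is simple to compute from a keyword report export. The figures below are invented for illustration:

```python
# Estimate the "(not provided)" share from an exported organic keyword
# report of (keyword, visits) rows. All figures here are made up.
rows = [
    ("(not provided)", 300),
    ("flower delivery", 1200),
    ("eye test oxford", 700),
    ("seo agency", 300),
]

total = sum(visits for _, visits in rows)
hidden = sum(visits for kw, visits in rows if kw == "(not provided)")
share = 100.0 * hidden / total

print("%.1f%% of organic visits have no keyword data" % share)  # 12.0%
```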
Bing have recommended the "one URL per content item" strategy, which is becoming a popular standard for website development.
“At Bing, we want to keep things simple by proposing the "one URL per content item" strategy. For each website, instead of having different URLs per platform (one URL for desktop, another for mobile devices, etc.), our feedback is that producing fewer variations of URLs will benefit you by avoiding sub-optimal and under-performing results. It can help manage unwanted bandwidth usage as well.” Duane Forrester, Bing
Google’s stance remains less clear, with individual Googlers recommending the single URL approach, yet the main guidelines still focus on mobile redirection.
“It seems likely with the web development community also getting behind the single URL policy that Google will follow suit before long. Already individual Google engineers are highlighting this as best practice via forum threads and Google+. The issue for clients with enterprise level web projects however is trying to implement this retrospectively which remains an enormous challenge.”
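For reference, the single URL policy boils down to serving every device from the same URL and varying only the response. The Python sketch below is a simplified illustration of that idea (the user-agent tokens are a simplification, not a recommended detection list):

```python
# A minimal sketch of the "one URL per content item" idea: the same URL
# serves every device, and the server picks a template from the
# User-Agent while telling caches about it via the Vary header.
# The token list below is a toy simplification for illustration.
MOBILE_TOKENS = ("iPhone", "Android", "BlackBerry", "Windows Phone")

def render(url, user_agent):
    """Return (template, headers) for a request - same URL for all devices."""
    is_mobile = any(token in user_agent for token in MOBILE_TOKENS)
    template = "mobile.html" if is_mobile else "desktop.html"
    # Vary tells caches that the response depends on the User-Agent.
    return template, {"Vary": "User-Agent"}

print(render("/products/roses", "Mozilla/5.0 (iPhone; CPU iPhone OS 5_1)"))
```

The appeal for search engines is that all links and crawl signals consolidate on one URL rather than being split across desktop and mobile variants.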
Google has decided that its Display Network is long overdue a separation from the way traditional search campaigns are managed in the AdWords interface and, as a result, has awarded the Display Network its own tab! This will enable us to bid, target and optimise display campaigns all from a single place.
Since effective contextual targeting is pivotal to the successful running of display campaigns, Google has enhanced its contextual targeting to, in Google’s words, ‘combine the reach of display with the precision of search, using Next-Gen Keyword Contextual Targeting’. In short, being able to tweak and adjust your contextual campaigns down to individual keyword level. This level of transparency gives us the ability to now optimise display campaigns to aggressively target those particular keywords which are proving to perform well, as well as making it ‘easier for you to extend search campaigns to display and more efficient to run the two types of campaigns together’ (Google).
As part of this update, Google are also introducing a way to visualise the reach of your display campaigns, and assess how that reach is influenced by combining multiple targeting types, such as keywords, placements, topics or interests.
This feature is rolling out across all accounts over the coming weeks.
Google has released a new function in AdWords which complements existing ad diagnosis tools, to identify reasons why your PPC ads currently might not be showing. The updated Status column, accessible from the Ads tab, will provide more detailed insight into any issues with ad approval or policy limitations for individual ads.
To access the new functionality, simply hover over the speech bubble: a window is displayed showing whether the ad is showing for that particular keyword in a particular location and, if not, the reason why:
You also have the option to re-diagnose the ad with a different keyword or target location by clicking ‘edit’ next to the relevant field. For target location you can either enter specific coordinates, or more usefully search by place name:
“This is a nice little feature for quickly obtaining insight into your ad’s eligibility within the AdWords interface. It’s particularly useful for advertisers whose products might potentially infringe on some of Google’s advertising policies, for example resellers of trademarked goods. In these instances ads often have ‘approved (limited)’ status or may even be disapproved, so quickly accessible information on the reasons why enhances the ability to make quick amendments or exception requests. For those advertisers selling ‘restricted’ products or services, the target location diagnostic is extremely helpful, as some areas of business deemed restricted by Google might only have limitations in specific countries.”
Google has been trialling a new match type, ‘near phrase, near exact’, which has been in limited beta since November last year and is due to complete at the end of March. This is designed to broaden exact and phrase match keywords with syntactic variants including plurals, misspellings, abbreviations and acronyms, but does not include synonyms. Google has stated that the benefits of this match type are that it assists the advertiser in driving supplementary traffic with minimal effort, carefully broadens the coverage of existing exact and phrase match keywords and eliminates the need to build out extensive keyword lists covering all possible variations of a keyword.
A typical use case is for a brand sensitive or direct response client who is a heavy exact/phrase match user. This feature will enable these types of clients to capture more query volume while still maintaining fairly strict control over the types of queries their keywords match to.
“The introduction of near phrase, near exact further demonstrates Google’s focus on driving more volume (and therefore spend) for advertisers, and will no doubt make certain, perhaps even relatively untouched or undiscovered SERPs, more saturated. Many (smaller) advertisers use phrase and exact matches to both ensure and maximise qualified traffic for limited spend, particularly in competitive verticals, so may be reluctant to diverge from these. In fact, this may better benefit the current broad match advertiser where volume can be refined without killing traffic levels altogether, but as always it will be interesting to see the results when fully launched.”
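To make the 'close variants' idea concrete, the toy Python function below generates the sort of plural and spelling variants the new match type is said to fold in automatically. This is not Google's matching logic, just an illustration of what you would no longer need to enumerate as separate keywords:

```python
# Toy illustration only - not Google's algorithm. Shows the kind of
# close variants (plurals, British/US spellings) that near exact and
# near phrase matching fold into an existing keyword for you.
def naive_variants(keyword):
    """Generate a few close variants of an exact-match keyword."""
    variants = {keyword}
    if keyword.endswith("s"):
        variants.add(keyword[:-1])            # naive singular
    else:
        variants.add(keyword + "s")           # naive plural
    variants.add(keyword.replace("our", "or"))  # colour -> color
    return sorted(variants)

print(naive_variants("colour printing"))
```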
Google has recently introduced Social Media Reports in Google Analytics, intended to enable advertisers to more accurately assess the value of their social media activities, in particular which channels are driving engagement and proving effective both on their website and across the web. Google says that ‘the new reports bridge the gap between social media and the business metrics you care about - allowing you to better measure the full value of the social channel for your business’. The reports include an overview of social performance and its impact on conversions, visibility on which conversion goals are being impacted by social media and which of your content is most shared across the web, to name a few.
For more detailed insight, Social Media & Analytics Manager at Guava, Mark Edmondson, has written an article on how these reports can now expose the true value of social media.
If you weren’t already aware, Yahoo! and MSN have officially announced that the Search Alliance transition is taking place at the end of April, long overdue from their initial announcement. This means that all Yahoo! paid search accounts will be transitioned to, and subsequently managed through, Microsoft adCenter. In terms of management, only needing to optimise from one platform is more effective, but we will not have visibility on segmented search volume by engine.
All Guava clients will be informed of the exact changeover date and our PPC teams are continuing to manage and optimise the accounts until the transition is complete, to ensure clients keep receiving valuable traffic from both accounts.