Wednesday 23 February 2011

Making websites mobile-friendly

The Google Webmaster blog has posted some advice about making websites mobile-friendly, in response to an increasing number of questions from website owners and developers.

The blog explains how the current mix of mobile phones accesses the Internet, distinguishing between traditional mobile phones (i.e. phones with browsers that cannot render normal desktop webpages) and the newer generation of 'smartphones' (phones with browsers that can render normal desktop pages, at least to some extent, such as BlackBerry devices, iPhones and Android phones).

Google uses two search engine 'crawlers' that are relevant to mobile: Googlebot and Googlebot-Mobile. Googlebot crawls desktop-browser webpages and the content embedded in them, while Googlebot-Mobile crawls mobile content. The blog post explains how to recognise Googlebot-Mobile and serve it appropriate content.
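
As a rough illustration of that kind of crawler detection, the sketch below serves mobile-formatted markup when the Googlebot-Mobile token appears in the User-Agent header. It is a minimal sketch only: the render helpers are hypothetical, and a real setup would also verify the crawler (for example via reverse DNS) rather than trusting the header alone.

```python
# Illustrative sketch: serve mobile content when the request appears to
# come from Googlebot-Mobile. Helper names are hypothetical placeholders.

def is_mobile_crawler(user_agent: str) -> bool:
    """Very rough check for Google's mobile crawler in the User-Agent header."""
    return "googlebot-mobile" in user_agent.lower()

def render_mobile_page() -> str:
    return "<html><!-- lightweight mobile markup --></html>"

def render_desktop_page() -> str:
    return "<html><!-- full desktop markup --></html>"

def handle_request(headers: dict) -> str:
    user_agent = headers.get("User-Agent", "")
    if is_mobile_crawler(user_agent):
        # Serve the same content that mobile visitors receive.
        return render_mobile_page()
    return render_desktop_page()

print(handle_request({"User-Agent": "Googlebot-Mobile/2.1"}))
```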

Google says they expect smartphones to handle desktop experience content, so there is no real need for mobile-specific effort from webmasters. However, for many websites it may still make sense for the content to be formatted differently for smartphones, and the decision to do so should be based on how website owners can best serve their users.

Most websites currently have only one version of their content, written in HTML designed for desktop web browsers, which means all browsers access the content from the same URL. These websites may not serve traditional mobile phone users well, and the quality of the experience for smartphone users depends on the mobile browser being used - at best it can be as good as browsing from the desktop.

Thursday 14 October 2010

New link data in Google Webmaster Tools

The Google Webmaster blog has announced the introduction of new information within the "Links to your site" feature. The updated content now shows a summary of which domains link the most to the website, the pages on the website with the most links, and a sample of the anchor text that external sites are using when they link to the website.

This is an excellent addition to the array of data provided for website owners; knowledge about inbound links in particular can be a key factor in marketing and in the volume of traffic coming to a site.

Friday 12 March 2010

Managing multi-regional websites

The Google Webmaster blog has posted an article about managing multi-regional websites. This covers many of the questions often raised by international companies who have developed, or want to develop, different country versions of their website, including in different languages. There can be a number of issues that need to be considered in these cases, including domain name structure, duplicated content and of course maintaining and updating content efficiently.

The Google post is a brief introduction to these questions, covering the main points to consider when planning the domain structure and the advantages and disadvantages of each approach. It also reviews geotargeting factors, how these can affect the visibility of websites in Google's regional results, and the best ways to deal with duplicate content.
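
By way of illustration, the three domain structures usually weighed up for a country-specific version of a site are a country-code top-level domain, a subdomain, and a subdirectory. The sketch below lists them for a hypothetical example.com site targeting France; the domains and the broad-brush trade-offs in the comments are illustrative only.

```python
# Hypothetical illustration of common URL structures for a French version
# of an international site; the "example" domains are placeholders.
domain_structures = {
    # ccTLD: strong geotargeting signal, but each domain must be
    # registered, maintained and promoted separately.
    "ccTLD": "http://www.example.fr/",
    # Subdomain: one registration, can be geotargeted in Webmaster Tools,
    # but users may not recognise it as a local site.
    "subdomain": "http://fr.example.com/",
    # Subdirectory: easiest to maintain on one host and geotargetable per
    # folder, though everything shares a single server setup.
    "subdirectory": "http://www.example.com/fr/",
}

for approach, url in domain_structures.items():
    print(f"{approach:12s} -> {url}")
```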

Friday 17 April 2009

Google make changes to search results

Google has made a number of notable changes to their search results. The first, as described by Search Engine Land, has increased the frequency of local business listings being displayed within the first page of results. Whereas previously the small map and 10 business listings only appeared for popular search terms combined with a location in the search query, Google has now started to include these results for these common terms even if the user has not included a location in the query.

This is being done when Google recognises a term that has local search intent and combines this with the identification of a user's location by their IP address. The mapped results are not shown at the top of the search listings - which happens if a search is made with a locational term - but the inclusion within the results aims to improve the local search focus for users.

It's by no means a perfect solution for searchers, and much will depend on the IP address assigned by a user's ISP (Internet Service Provider), but this change will have a big impact for local businesses, who now get a further opportunity to appear in the search results for a potential customer. It also highlights the importance of setting up an optimised local business listing with Google.

The second recent change to the ranking results has been reported by the Google Webmaster Blog and concerns the 'sitelinks' that are often displayed under a large or popular website's listing so that users have more opportunities to click directly into a prominent section of the website. Until now, sitelinks have only ever appeared on the first search result, and so at most one site could have sitelinks per query.

Google has now introduced an expansion of these sitelinks into a single row of links which will be displayed for results that didn't show sitelinks before, even for results that aren't in the first position. This means multiple results on one query can now have sitelinks and up to 4 sitelinks can show up right above the page URL, instead of the usual two columns below the URL of the first result.

This will help to show users some relevant sub-pages in the site and give an idea of what the site is about. Comparing the sitelinks that appear for each result can even illustrate the difference between the sites. Google says that, just like regular sitelinks, the new one-line sitelinks are generated algorithmically and the decisions on when to show them and which links to display are entirely based on the expected benefit to users.

For webmasters, this new feature means their site may start showing sitelinks for a number of queries where it previously didn't. Although site owners can't tell Google which links to include, they can block links they don't want to show through the Google Webmaster Tools console. In most cases this change will probably increase the visibility of, and traffic to, a website, whilst also improving the experience of users, so it's another change that can support the search marketing for websites.

Monday 16 February 2009

Leading search engines combine to clean up results

The New York Times reports on a move by the three main search engines - Google, Yahoo and Microsoft - to clean up the amount of 'clutter' on the web by creating a new web standard that allows website publishers to flag duplicate pages on their sites. This should allow the search engines to remove many duplicated or 'dead' pages from their indexes, making them more efficient and potentially more comprehensive.

This cooperation between the search engines follows the previous standards developed for the sitemap protocol and this time targets those large dynamic websites (such as e-commerce stores) that generate multiple URLs that all point to the same page. This effect can confuse the search engine 'spiders' that are trawling the web and lead to the indexing of the same pages multiple times. Some estimates claim that as much as 20% of URLs on the web may be duplicates, although this is possibly on the high side.

Google has led the way with this move, giving website owners the chance to indicate when a URL is a duplicate and, if so, which is the principal, or 'canonical', URL that search engines should index. Yahoo and Microsoft have agreed to support the same standard. This new Canonical Link Tag, as the standard is known, should make it easier for both publishers and search engines to address the problem, but of course the most important thing is to make web publishers aware of it and give them an incentive to add the tag to their pages.
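
The tag itself is simply a link element placed in the head of each duplicate page, pointing at the preferred URL. The sketch below is illustrative only - the example.com URLs are made up, and stripping every query parameter is a simplification, since some parameters do change a page's content.

```python
# Illustrative sketch: an e-commerce page reachable under several URLs
# (session IDs, sort orders, tracking parameters) declares one canonical
# version via the canonical link tag. URLs and parameters are hypothetical.
from urllib.parse import urlsplit, urlunsplit

def canonical_link_tag(request_url: str) -> str:
    """Strip the query string and fragment and return the <link> element
    that would be placed in the page's <head>."""
    parts = urlsplit(request_url)
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{canonical}" />'

# All three of these duplicate URLs point search engines at the same page:
for url in [
    "http://www.example.com/product/123?sessionid=abc",
    "http://www.example.com/product/123?sort=price&ref=homepage",
    "http://www.example.com/product/123#reviews",
]:
    print(canonical_link_tag(url))
```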

Friday 17 October 2008

Google warns potentially hackable websites

The Google Webmaster blog has announced a new service for users of Webmaster Tools - an alert to webmasters when Google identifies possible issues with a CMS or online publishing system (such as WordPress) that could leave a website exposed to hackers.

This is currently undergoing a trial phase, but Google says that they are seeing more websites getting hacked because of various security holes, so this new service should help to provide valuable information to website owners if there is a potential vulnerability. A message will then be posted in the Webmaster Tools account (and even if a website hasn't yet signed up for this tool, the message will be available once the account is opened).

If this service proves effective and is extended across the system then it adds another valuable element to the Webmaster Tools console. Of course Google isn't going to pick up every potential hacker issue, but sites that use common software systems where there are issues should be notified and this will give website owners an advance warning to fix the problem.

Friday 12 September 2008

Google's view of duplicate content

Google's Webmaster Blog has posted a 'definitive' answer to the question of duplicate web page content and whether a penalty is applied to these pages or sites. It provides a summary of previous posts and articles related to the duplicate content issue, as well as a reminder of Google's guidelines on the matter.

Google's stance on duplicate content aims to discourage the use of 'cookie-cutter' templates that create multiple pages with very similar content, and 'screen scraping', where websites copy information directly from other sites - a practice common among affiliate marketers. The bottom line is that Google wants to index original content, although it recognises that this is not always possible for some websites, particularly those generating content dynamically.

Google's webmaster guidelines state that 'duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results'. The post links to further information about this issue and ways to avoid it.

Like most search engines, Google aims to present a degree of variety within the search results and will therefore filter out duplicate documents so that users experience less redundancy. This is done in several steps: grouping duplicate URLs into one cluster, selecting what is seen to be the 'best' URL to represent that cluster in search results, and then consolidating the properties of the URLs in the cluster, such as link popularity, onto the representative URL.
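
A much-simplified sketch of that grouping idea is below: duplicate URLs are clustered by a content fingerprint and a link count is consolidated onto one representative URL per cluster. This only illustrates the concept; it is not Google's actual system, and the URLs and link counts are invented.

```python
# Toy illustration of clustering duplicate URLs by a content fingerprint
# and consolidating link counts onto one representative URL per cluster.
import hashlib
from collections import defaultdict

pages = {
    "http://www.example.com/article/42": ("Same article text...", 120),
    "http://www.example.com/article/42?print=1": ("Same article text...", 3),
    "http://example.com/article/42": ("Same article text...", 15),
    "http://www.example.com/about": ("About us page...", 40),
}

clusters = defaultdict(list)
for url, (content, links) in pages.items():
    fingerprint = hashlib.md5(content.encode()).hexdigest()
    clusters[fingerprint].append((url, links))

for members in clusters.values():
    # Pick a representative (here: the URL with the most links) and credit
    # it with the consolidated link popularity of the whole cluster.
    representative = max(members, key=lambda m: m[1])[0]
    total_links = sum(links for _, links in members)
    print(f"{representative} (consolidated links: {total_links})")
```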

In summary, the post says that Google is unlikely to apply any form of penalty unless it decides that a website is duplicating content deliberately - instead, one version of the duplicated content is simply chosen as the 'best' option to be displayed within the ranking results.

Wednesday 20 August 2008

Using 404 pages

The Google Webmaster blog has been running a series of posts over the past two weeks outlining the use of the 404 'response code', which is generated when a server cannot find a requested page, whether because of a bad link or a mistyped URL or page name. In these cases a website should display an error page that informs the user of the problem and helps direct them to the home page or another part of the site.

One post describes the differences between a 'soft 404' and a 'hard 404' response, recommending that the former not be used. There is also a series of FAQs about how the 404 should be used, leading up to the latest post, which promotes a new 'widget' provided by Google.

This allows webmasters to add some JavaScript to their customised 404 pages that will suggest to the user the closest match to a truncated URL or missing page. In addition, this 404 widget - which is still in a test phase - will suggest a link to the parent subdirectory, a sitemap webpage, or site search query suggestions and a search box, if these are available on the site.
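
Whatever the error page contains, the recommendation against 'soft 404s' means the server should return a genuine 404 status code along with the helpful page, rather than a normal 200 response. A minimal sketch of that behaviour, using Python's standard http.server purely for illustration (the page contents are placeholders):

```python
# Minimal illustration of a 'hard 404': the server returns the 404 status
# code together with a helpful custom error page, instead of a normal
# 200 response (a 'soft 404'). Uses only the Python standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer

KNOWN_PAGES = {"/": b"<html><body><h1>Home</h1></body></html>"}

ERROR_PAGE = (b"<html><body><h1>Page not found</h1>"
              b'<p>Try the <a href="/">home page</a> or the site map.</p>'
              b"</body></html>")

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PAGES:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(KNOWN_PAGES[self.path])
        else:
            self.send_response(404)  # the 'hard' 404 status code
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(ERROR_PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```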
