A Smart Guide to Choosing the Best Web Analytics Tool

Best Web Analytics Tools Review


Imagine a train running down the track with no gauges showing fuel used, distance covered, or engine temperature. Or picture an airplane heading east with no cockpit instruments reporting engine temperature, speed, engine status, or remaining fuel. In both situations, the driver or pilot has no idea what is actually happening or how the machine is performing.

Do you think such a lack of awareness lets the driver or pilot reach the intended destination safely and efficiently? Practically, the answer is 'No'. The same thing can happen to your online business site if you do not know what is happening on it once it goes live. Knowing what is happening on your site is precisely what analyzing it means.

Questions That Inspire You to Analyze


Let us consider a real-life scenario: you run a clothing boutique, and I am a shopper looking for formal and casual wear from brands only, after a bad experience with unbranded shirts and trousers in the past. Now I look only at branded items, and I land on your site through a friend's recommendation.

Like me, many more people out there will be interested in your online store. So, how do you ensure your brand's awareness and popularity online? You ensure it by finding answers to the following questions, and improving the weak areas the answers reveal:

  • What is the site’s rank on the search engines result pages?
  • How many visitors are landing at your site each day?
  • How many new visitors are finding your site each day?
  • Which pages are the visitors hitting the most?
  • Which pages are encouraging purchase?
  • Which products do the visitors love more?
  • Is the website design effective in keeping visitors on a page for long?
  • How many different sources does the traffic (visitors) come from?

These queries are like indicators that show you what is happening on your site. They measure your site's performance in its niche and report how successful, or how improvable, that performance is! If you are not keen on doing this yourself, services such as the Commence sales tracking and reporting services are available for hire. It is best to leave it to the experts if you are not so inclined, and there is no shame in that.

Think a Minute

Do you think that without the above answers, your site can ever fulfill your business goals? If you do not track or monitor, it is like leaving the train (site) to its own speed (capabilities), which can get derailed at any time (lost online visibility or rank). After all, you should know where you stand (site rank) before moving further toward the target!

The Easiest Method That Encourages You to Analyze


This is none other than Web analytics tools! You can obtain quick answers to all the above questions by using a Web analytics tool. With a simple yet powerful design for reporting on your site's performance, these tools track a range of statistics. Through them, you can see the number of people viewing each page, the sources they came from, and which site features are most popular.

Web analytics tools track your site's key metrics to measure traffic and Click-Through Rates (CTRs) for new pages or content, and to help you comprehend your visitors' requirements and behavior. These measurements tell you whether your site needs improvement or not!
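To make CTR concrete: it is just clicks divided by impressions, the share of people who saw a page or link and clicked through. A quick sketch with invented numbers:

```python
# Hypothetical numbers for a new landing page.
impressions = 5000   # how many times the page/link was shown
clicks = 230         # how many times it was clicked

ctr = 100 * clicks / impressions
print(f"CTR: {ctr:.1f}%")  # CTR: 4.6%
```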

Quick Aid

If the absence of the above statistics keeps you awake at night, you are a prime candidate for a Web analytics tool.

Before I take you on a quick tour of some of the most important tools in the world of Web Analytics 2.0, let us find out which metrics you ought to measure with them.

Metrics (Data Indicators) That You Should Analyze


Think a Minute:

Does it make sense to look for a tool when you have no idea what information needs to be tracked? The information to track can vary from a simple visitor count to a complex analysis of visitor behavior. So which vital metrics should you pay attention to while analyzing the site through a tool? Here is a simple breakdown:

Category 1: Primarily, you will have to know how many people are visiting your site. While this one seems easy to find, it is actually not. Metrics answering this query include:

  • Unique Visitors: Shows the total number of visitors on the site during a set period. It does not include repeated visitors. This statistic reveals how effective your marketing efforts are in reaching a wider audience.
  • Repeat Visitors: Shows the number of visitors coming back to your website multiple times. It indicates how big your loyal fan base is!
  • Visits: Shows the number of visits to a specific page or site. The trend in the number of visits over time gives you insight into your brand's popularity. Comparing the number of visits to each page shows which parts of your site are useful.
  • Page Views: Shows the number of times a visitor views a specific page. Increased page views simply denote that your site is becoming more engaging.
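As a rough sketch of how these four counts relate, here is how a tool might derive them from a raw hit log (the log format and all data are hypothetical, purely for illustration):

```python
# Hypothetical raw hit log for one reporting period:
# each entry is (visitor_id, session_id, page).
hits = [
    ("v1", "s1", "/home"), ("v1", "s1", "/shop"),
    ("v1", "s2", "/home"),
    ("v2", "s3", "/home"),
    ("v3", "s4", "/shop"), ("v3", "s4", "/shop"),
]

page_views = len(hits)                              # every page load counts
visits = len({sid for _, sid, _ in hits})           # distinct sessions
unique_visitors = len({vid for vid, _, _ in hits})  # distinct people

# A repeat visitor has more than one session in the period.
sessions_by_visitor = {}
for vid, sid, _ in hits:
    sessions_by_visitor.setdefault(vid, set()).add(sid)
repeat_visitors = sum(1 for s in sessions_by_visitor.values() if len(s) > 1)

print(page_views, visits, unique_visitors, repeat_visitors)  # 6 4 3 1
```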

Category 2: You will now be interested in who these visitors are and what they are doing on your site when they visit. This helps in knowing which site pages or features attract them the most and which ones go unattended. Metrics in this category include:

  • Visitor Information: Includes region or country, Web browser in use, and more details about the visitors.
  • Registered Users: This shows what each logged-in user did on your site in case your portal requires visitors to log in. This facilitates a more detailed understanding of what diverse visitors are doing on your portal.
  • Bounce Rate: This shows the percentage of visits during which a visitor left your portal after looking at just one page. This statistic reflects visit quality. For instance, a high bounce rate for a blog is no cause for worry, because reading one article and leaving is sensible behavior. For an eCommerce site, however, it is a worry, because it indicates that the entrance pages are either unattractive or irrelevant to the visitors.
  • Click Path: This shows how visitors navigate across your site. Also known as click tracks, these visual representations show the visitors' journeys. For example, a click path can reveal that 10% of home page visitors click the Resources link, while 50% click Current Offers, 20% go to the About Us page, and 20% leave the site.
  • Conversion: Tracks the number of people responding to your CTAs (calls to action) or doing what you want them to do. For example, it can show the number of users who clicked a Newsletter link to subscribe, or who clicked Free Offer on the home page and went ahead to grab it. This metric is somewhat complex, as it must be set up in a tool, but it is truly a valuable one.
  • Top Entry and Exit Pages: Shows the pages through which most visitors tend to enter and leave the site. It is unwise to assume that the home page is always the entry page.
  • Site Search: Shows what people search for on the site, if the analytics tool supports it. The metric helps you comprehend what visitors want from your site.
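To make the bounce rate concrete, here is a minimal sketch that derives it from per-session page lists (the session data is invented for illustration):

```python
# Hypothetical session log: session_id -> pages viewed, in order.
sessions = {
    "s1": ["/home", "/shop", "/cart"],
    "s2": ["/blog/post-1"],          # one page, then left: a bounce
    "s3": ["/home"],                 # also a bounce
    "s4": ["/home", "/about"],
}

bounces = sum(1 for pages in sessions.values() if len(pages) == 1)
bounce_rate = 100 * bounces / len(sessions)
print(f"bounce rate: {bounce_rate:.0f}%")  # bounce rate: 50%
```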

Category 3: Next, you would surely like to know from where the visitors are coming to your site. Knowing this information can help in better comprehending the types of sources that move people to your site. Metrics include:

  • Organic Traffic Sources: Shows the percentage of people who come to your site by clicking its organic listings on the search engine results page. If the percentage is below 40%, that is a sign you should revamp your content.
  • Search Keyword: Shows the words or phrases that visitors typed into a search engine such as Google to reach your site. This helps in determining the effectiveness, competitiveness, and success of keywords. It is wise to monitor your keywords to ensure you reach your target audience.
  • Inbound Links: Shows the number of links to your site coming from other sites. Many tools also show which of these links bring people to your site. This metric provides insight into how effective your link-building strategy is and what types of sites consider your content useful. The rule is that the higher the other site's domain authority, the more valuable the link.
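A traffic-source report of the kind described above boils down to simple counting. A hypothetical sketch, including the 40% organic guideline mentioned in the first bullet (the per-visit source labels are invented):

```python
from collections import Counter

# Hypothetical per-visit traffic sources, as a tool might classify them.
visit_sources = ["organic", "organic", "direct", "referral",
                 "organic", "social", "direct", "organic"]

counts = Counter(visit_sources)
organic_pct = 100 * counts["organic"] / len(visit_sources)
print(counts)
print(f"organic: {organic_pct:.0f}%")  # organic: 50% -- above the 40% guideline
```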

While the aforementioned metrics should be adequate to get started, many robust analytics tools come with even more sophisticated metrics and analysis features. In fact, many professionals earn their livelihood by analyzing these statistics; they are typically hired to perform in-depth analysis for big sites.

When considering the diverse analytics tools for your business, the assortment of options can be confusing, as you might not know how to use them. This is why it is recommended to hire a professional to dig into your site's data and generate all the vital reports.

I cannot go ahead without mentioning a golden rule in this regard. It is known as the 90/10 rule: if you are spending $100 on analytics to make new decisions for your site, it makes sense to invest $10 in the analytics tools and $90 in analysts with great expertise. Without a proper understanding of the details the tools provide, the reports remain raw, unprocessed data, which is of no use to your business.

Marketing consultant Bryan Eisenberg affirms that the key is investing in tools and in the people who need those tools to be successful; what matters is the people comprehending the data. According to Avinash Kaushik, author of Web Analytics: An Hour a Day, “The quest for a tool that can answer all your questions will ensure that your business will get scrapped and that your career (Web Chief Marketing Officer and Analysts) will be short-lived.” He therefore suggests focusing on 'multiplicity': the use of multiple tools for different purposes.

In addition, it is worth keeping in mind that not all tools offer the same metrics, nor do they work in the same way. Because of the high complexity of analytics, driven by each site's unique characteristics and visitors' varying preferences, each tool handles metrics differently. This is perhaps why Mr. Kaushik emphasizes using multiple tools. Doing so simply deepens your insight into the site's success rate as well as its customer base.

Identifying web actions as those of a 'unique visitor' is complex as well as subjective, so different tools compute these statistics differently. For example, some tools count traffic from the logs of pages served by the web server, while others depend on reports from cookies (small information packets sent along by each user's browser). It is therefore common to get different values for the same metric from different tools.
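To see why two tools can disagree on 'unique visitors', consider this toy example in which one tool keys on IP addresses (as a server-log tool might) while another keys on cookie IDs. The traffic is identical; the counts are not:

```python
# Hypothetical hit log: (ip_address, cookie_id) per page request.
hits = [
    ("10.0.0.1", "c1"),   # laptop at home
    ("10.0.0.1", "c2"),   # phone at home (same IP, different cookie)
    ("10.0.0.2", "c1"),   # same laptop at a cafe
    ("10.0.0.3", "c1"),   # same laptop on mobile data
]

by_ip = len({ip for ip, _ in hits})      # a log/IP-based tool reports 3
by_cookie = len({c for _, c in hits})    # a cookie-based tool reports 2
print(by_ip, by_cookie)  # 3 2
```

One person on three networks inflates the IP-based count; two devices sharing a network deflate it. Neither tool is "wrong" -- they just define a visitor differently.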

For big businesses, the more robust tools can be really useful, although at a handsome cost; for small and mid-sized businesses, cheaper and even free tools are available for getting the desired control over site statistics. Obviously, you will not use all the available tools all the time, but employing a few is the need of the hour as well as a vital part of your overall web strategy.

Think a Minute

It is now clear that no single tool can be the best for analyzing a site. So how do you determine the most suitable suite of tools for analytics?

Recall Web Analytics 2.0


According to Mr. Kaushik, Web Analytics 2.0 is the analysis of quantitative and qualitative data from your site and the competition, used to drive continual improvement of the online experience of customers and prospects, which translates into your desired outcomes (online and offline).

Rather than a one-off data analysis exercise, Web Analytics 2.0 is an ongoing, three-tiered process of data analysis and delivery for all types of businesses. First is the data itself, which reveals the clicks, page views, traffic, and more for both your site and your direct competition. Second is what is done with that data: how you process the gathered information and apply it to both existing and new customers to ensure a meaningful, better experience.

The last tier is how those actions align with the most vital business objectives (both online and offline), such as sales, social presence, and customer engagement. In short, this is where the data lets you watch how your site is performing.

Based on these tiers, Mr. Kaushik has also come up with an acceptable order of key components of Web Analytics, which are:

  1. Clickstream: Measures ‘What’
  2. Multiple Outcome Analysis: Measures ‘How Much’
  3. Experimentation and Testing: Measures ‘Why’
  4. Voice of Customer: Measures ‘Why’ just as the third component
  5. Competitive Intelligence: Measures ‘What Else’
  6. Insights: Gives ‘Gold!’

Making your site reach the desired performance levels is simply not easy. It takes persistent tracking and frequent measurement of diverse metrics to optimize usage. This is exactly where Analytics comes in, implemented with tools belonging to its different component levels. This is how 'multiplicity' is used: employing the most suitable tool at each component stage. Let us now look at the most reliable and efficient Web analytics tools available per component.

Clickstream Analysis Tools

Clickstream measures data for each page visited, including how the visitor arrived at that page (keywords, social media, ads), how long the visitor stayed on that page or site, and what the visitor did (purchased, subscribed to an offer).
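As a sketch of the kind of record clickstream analysis works with, here is a minimal data structure holding the details listed above (the field names are my own, not any tool's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class PageVisit:
    """One clickstream record: a single page view and its context."""
    page: str                  # which page was visited
    referrer: str              # keyword, social link, or ad that led here
    seconds_on_page: float     # time spent before moving on
    actions: list = field(default_factory=list)  # e.g. "purchased", "subscribed"

# Example record: a visitor arrived from a search, stayed 42.5s, subscribed.
v = PageVisit("/offers", "search: 'summer sale'", 42.5, ["subscribed"])
print(v.page, v.actions)
```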

Google Analytics (Free, Ideal for all Businesses)


This is among the most robust, simplest, and most thorough analyzers on the market, and it creates detailed statistics about visitors. You can find out which sources bring visitors to your site, their actions on the site, and how often they come back. You can even find their demographics, such as geographical location and interests, the number of daily visits, the number of currently active visitors, and the daily revenue generated by the site. Because it integrates with AdWords, the largest paid-search platform online, Internet marketers can monitor landing page activity and make the necessary improvements.

For small business owners, it is perhaps the only comprehensive as well as easiest tool to rely upon.

Pros: Easy to set up and use, support for mobile sites, goal tracking, custom data gathering and reports, fast report generation, PDF report downloads, quick dashboard sharing, good help to get started, multiple dashboards support, world-class analytical tools

Cons: No brand-matching customization (limited fonts and layouts), fewer metrics (widgets) per dashboard, requires constant training, limited goals, Premium version advanced but costly

Piwik (Free for Hosting on Your Web Server)


Go for this one if you are a technically oriented person who loves adventure. Piwik is an open-source tool built on MySQL and PHP. It is perhaps the leading open-source tool providing valuable insights into marketing campaigns, visitors, and more, helping you optimize your strategy to ensure the best online experience for your visitors. The tool is designed as an open-source alternative to Google Analytics.

For those who do not like sharing all data with Google, Piwik is an ideal substitute.

Pros: Privacy, as data on your server is not shared; customizable interface; real-time reports; security ensured by blocking URLs and traffic; tracking for multiple sites; several data export formats; Android and iOS apps; no data storage limit (unlike Google Analytics); all Google Analytics features included; exhaustive help files

Cons: A somewhat time-consuming installation process

Woopra (Free for up to 30,000 Actions per Month, after Which Several Paid Packs Apply)


Woopra is another top-notch tool providing real-time tracking of multiple sites for improving customer engagement. It beats Google Analytics on update speed, as the latter takes longer to refresh its data. The desktop tool shows live visitor stats, including location, the pages visitors are currently on, and the pages or sections they have visited.

I recommend the tool especially for e-commerce site owners who get a chance to chat live with their visitors.

Pros: Well-organized interface, real time viewing, possible customer insights revealed, mobile app supported, notifications for important customer activities, customization of customer data, behavioral profiles for each customer by watching what they did on the website.

Cons: Confusing for beginners due to much data on the dashboard.

You can extend your control, and the power of the above tools, by using Google or Bing Webmaster Tools. These tools help you obtain SEO details and major diagnostic data about your site. Using Feedburner is also recommended for tracking clickstream activity within your RSS feed.

Multiple Outcome Analysis

Outcome analysis is likely to happen within the tools mentioned above and below this section. For instance, you can configure goals and ecommerce tracking in Piwik or Google Analytics. Similarly, a campaign's profit and margin can be computed in Microsoft Excel through a relevant database query. Therefore, no specific tools are suggested for this component.

Experimentation and Testing Tools

These tools aim at improving user experience and increasing the chances for having conversions (leads). At this stage, you should consider conducting different tests such as clickmaps, heatmaps, multivariate (MVT) testing, A/B testing (multiple versions of a site executing simultaneously), geotargeting, and so on. Because there are many tools, I will be covering them briefly.

  • Google Website Optimizer (Free): It is a powerful A/B and multivariate (MVT) testing solution that rotates varying content segments to see which sections get the most clicks. You can choose different page sections for testing, such as images and headlines.
  • Optimizely: It is for those who wish to test very fast, especially with A/B testing. It is a new but simple tool offering powerful results for improving site performance without requiring any technical knowledge.
  • Crazy Egg: This tool employs the Heatmap technology to offer a visual picture of what visitors are doing on pages, where they are moving their mouse, where they are scrolling, and where they click. Such tracking allows observing the most attractive sections of pages and improving the ignored sections for optimizing user interaction. You can also split the data to see how visitors from a single traffic source behaved in comparison with those from other sources.
  • MouseFlow: This tool records what the visitors do on your site, right from moving a mouse to filling a text box. It helps in tracking the visitors’ behavior and comfort level through heat maps as well. Mouseflow combines Crazy Egg and UserTesting features.
  • UserTesting: This one employs a distinct way to test your site: getting real feedback from visitors in a chosen demographic. You pay for participants chosen to answer your questionnaire about your site. A video of the user and his or her activity is recorded, and you get the feedback within an hour, stating the user's actual thoughts. The tool is costly, but worth it for fast, real feedback.
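None of these products is needed to grasp the statistics behind A/B testing. A minimal sketch of the standard two-proportion z-test that tools of this kind typically run under the hood (the conversion numbers are invented for illustration):

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))      # standard error
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 120/2400 conversions on A, 156/2400 on B.
z, p = ab_test(120, 2400, 156, 2400)
print(f"z={z:.2f}, p={p:.3f}")  # p < 0.05 here, so B's lift looks significant
```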

Voice of Customer Tools

Clickstream data provides insights into what visitors did on the site, but it does not say why they left without staying long. That requires visitor research, which can be done with the tools described below.

  • Kissinsights (Kissmetrics – Free to Paid): It helps with real-time customer acquisition and retention by revealing information on user habits and engagement before and after they use your site. It is easy to install and use, as it asks customers all the queries via a simple dashboard, gathering customized feedback in the form of short comments. Its data funnel points to the weakest step blocking conversions, while the revenue tracking features show what customers do after purchases. In short, you get all the data required to increase conversion rates.
  • 4Q (iPerceptions – Free): This is a pure online survey tool allowing you to comprehend the 'Why' aspect of visitors. Surveys facilitate gaining vital insights from customers' real experiences. It answers questions such as how satisfied the customers are, what they do at the site, whether they are doing what they need to do, and so on.
  • Qualaroo: This is also a real-time survey tool for getting feedback from users currently on the site. However, it stands apart because it allows you to use any combination of variables. For example, it can ask about the shopping experience when the visitor fills the shopping cart, and about the usage experience as the visitor leaves the site. You can get both entry and exit reviews. Consider this tool for sites with a high turnover of visitors and for site owners who need to know 'why'.
  • Clicktale (Free to Paid): This one is most prominent in customer experience analytics, which aims to optimize usability and maximize conversion rates. Its most useful features are session playback and visitor recordings, which let you watch everything any visitor does on the site, from the first click to the last. The tool also comes with scroll maps, heatmaps, and visual conversion funnels for gaining more useful details about visitors and their habits. However, it is a costly choice.

Competitive Intelligence

Around since the invention of business itself, competitive intelligence means knowing what your competitors are doing so you can do it better and faster. Such data enables you to recognize what is known and what is unknown, and to analyze ecosystem trends, failures, weaknesses, opportunities, and rising customer preferences. These details help in planning better marketing programs.

  • Google Trends & Google Insights (Free): Both search tools are now merged into a single tool. The Trends tool offers worldwide visitor data for sites deployed anywhere in the world and makes it easy to comprehend traffic and search keywords. The Insights tool points to the global database of customer intentions; you can learn what your customers are thinking and feeling. You can use this tool for analyzing industries, emerging trends, share of search, and offline marketing strategies as well.
  • Google AdWords Keyword Tool (Free): Consider this tool for refining your SEO strategy around the most-searched keywords, downloading specific user-typed queries, and gaining insights into specific brands under analysis.
  • Facebook Insights (Free): Track your popularity on Facebook through follower counts, post comments, and likes with this tool based on users and interactions. It helps in improving customer engagement on the biggest social media platform.
  • Twitalyzer (Free): This one offers a detailed dashboard with several metrics for measuring Twitter influence and engagement with your brand identity. You can track your account's impact on followers, retweet level, and level of conversational engagement. It is the easiest one to use for your Twitter account.
  • Klout: Consider this one for having the cleanest metrics for measuring performance on Facebook and Twitter.

Conclusion

With Web analytics tools, the responsibility of improvement now rests with us. With so many tools available, it is not difficult to improve your online visibility as well as performance. The need, however, is to choose a few of the above tools, collect all the data, and process it to make the right decisions. Although Google Analytics is still the most famous tool out there, it is better to use it in combination with other tools to push your site to its maximum potential.

If your budget is not tight and customization is a top priority, it is recommended to select the paid tools that offer all the features and metrics for improving your site's performance and expanding your online customer base.

Read More

Beware Online Pirates, Google’s New Pirate Update Is Infallible


Pirates have been triggering copyright infringement in every possible manner on the vast Caribbean of the World Wide Web. This has long pushed the different industries online, especially the entertainment industry in recent years, to ask Google to come up with an anti-piracy policy or tool. As expected, these requests have fallen on attentive ears, and the search engine giant has responded by implementing different anti-piracy measures.

Google has kept changing its search engine algorithms to demote the most wicked pirate sites. One of the most renowned changes is an improved effort to make such sites less visible in search results, meaning they will not appear on the initial search pages. Google has been running a down-ranking system since 2012, but it was reported to lack effectiveness by copyright industry groups such as the RIAA and MPAA.

Just last week, the giant announced an improved version that aims to address this criticism. With the updated version of its 'How Google Fights Piracy' report, originally introduced in 2013 to answer claims by film and music copyright holders, Google gives an overview of all its anti-piracy efforts and highlights copyright holders' own responsibility for making content available. The 26-page report delineates the following anti-piracy principles:

  • Defense against abuse
  • Generation of better and more legitimate alternatives to keep piracy at bay
  • Provision of transparency
  • Funds monitoring, as Google believes that the most effective way to fight against online pirates is to reduce their money supply while prohibiting rogue sites from its ad and payment services
  • Guarantee of effectiveness, efficiency, and scalability

Introduction to Google Pirate Update

Established in August 2012, the Pirate Update refers to a filter designed to prevent sites with several copyright infringement filings, received through Google's Digital Millennium Copyright Act (DMCA) system, from ranking well in search results. Because this filter is updated regularly, formerly affected sites can escape it once they have rectified their mistakes or made improvements. At the same time, the filter can also catch new sites that escaped before, as well as release 'falsely caught' sites.

The update works just like other updates such as Penguin: it processes all sites to catch any that appear to be in violation. Once caught, a site is stuck with a downgrade until it accrues fewer or no complaints and can get back into the race. However, since its introduction, the filter had never been rerun, which means real pirate sites, along with new violators from those two years that needed to be punished, may have escaped. This is perhaps why Google has finally updated its Pirate filter after two years!

Sensing Pirates

Google has a smart way to guess whether copyright infringement is happening: DMCA 'takedown' requests. Through these requests, pirated content can be removed from Google; it is just a matter of filing a request, and the pirated page can be thrown out of search results. Although such a request is not evidence of copyright infringement, it is an allegation that can be challenged. Keeping this in mind, Google assesses each request and removes the content only if it deems the request valid.

It would not be wrong to say that filing such requests used to be a painful task, because each request addressed only one web page, a real problem when the target website was big. Today, however, the anti-pirate game has shifted from a page-by-page to a site-by-site basis. A guide to the sites most likely to be affected is available in the online transparency report, which ranks sites by the total number of takedowns received.

Therefore, if your site has received too many DMCA 'takedown' requests, it is high time to watch out. A new penalty is out to lower its rank in Google's search results, and it is applied in conjunction with other penalties or updates such as Penguin and Panda.

Valid Requests: How Valid Can They Be?


But what if someone files a request or complaint that seems to be true even if it is not? Yes, the chances of such an incident are high. A filed request needs to be 'valid', and the definition of 'valid' is simply that a filing with the right paperwork has not received a counter-notice.

It is a fact that only copyright holders can determine whether something is authorized and only a court can confirm whether a copyright was infringed or not. The search engine giant itself cannot find whether a specific page violates the copyright policy or not.

While the new filter continues to influence the ranking order of search results, Google will not remove any page from the results until it receives a justifiable copyright-removal notice from the rights owner. Further, it will continue to offer counter-notice tools for reinstating content believed to be wrongly removed.

The digital rights group EFF is particularly worried about the false-positives problem. For instance, governments have wrongly flagged sites that may legitimately post the supposedly infringing material. The EFF therefore asserts that without knowing how Google plans to combat piracy, it is not wise to assume it will successfully prevent similar mistakes.

It is also often argued that takedown requests are only accusations of copyright infringement. Neither Google nor a court confirms the validity of the accusations, although copyright owners might be held accountable for bad-faith accusations. Downgrading search results amounts to telling visitors that these sites are less relevant, which ends up giving more control to copyright owners based on accusations alone.

A blog post by Public Knowledge, an online rights group, has raised fair concerns alongside reassurances. It believes the filter is a winner if the new policy helps in spotting legitimate sources, avoids penalizing lawful sites, and defends the legal interests of copyright holders. However, a new system such as this carries risks and unintended consequences, apart from the danger of being misused. Public Knowledge knows Google is aware of this side of the coin, but it is waiting to see how Google will tackle such problems and whether it will continue to give top priority to the interests of users.

The Upcoming Update 2014


Keeping the above facts and controversies in mind, Google has planned its next update so that pirate sites become harder to find, instead of dropping them from the search result pages. These sites will have lower visibility for common search terms, such as a song or movie name.

The new tweak will ensure that a few of the most 'notorious' pirate sites are less likely to appear on result pages when searchers use keywords related to films, music, and other copyrighted content. However, the fact remains that those who know how to search smartly will still be able to see the pirate sites, at least their pages that have not been caught or removed through DMCA filings. Nevertheless, the update is likely to come with improvements and new efforts as follows:

  • Ad Formats: Katherine Oyama, Google's senior copyright counsel, revealed that Google is testing new ad formats that show links to authorized digital video and music services when a search includes keywords such as 'free', 'watch', and 'download', and that it is discarding terms from its autocomplete facility when the returned results consist of several DMCA-demoted sites. This will surely help in finding legitimate media sources, and the legal sites are likely to float to the top of the page. Apart from testing new ad formats in search results, Google is also testing other ways of exposing legitimate sources of media, especially through the right panel on the search results page. Right now, these results are shown only in the United States, but this will be expanded internationally.
  • Better DMCA Demotion Signal: Google has refined the signal to affect the visible rankings of a few most notorious sites coming up early on result pages for a targeted keyword. This will help down rank the truly violating sites.
  • Utilizing Autocomplete: The new update also aims at removing more terms from the well-known autocomplete feature, which will be done strictly on the basis of DMCA removal notices. Google has actually started demoting autocomplete predictions that fetch URLs demoted under the DMCA on result pages.

This update is perhaps an essential move to allow the entertainment giants to get the most out of the Internet. With this update, no brand in the industry would continue to blame Google instead of its outdated distribution models. In fact, the MPAA has already admired this move of Google.

According to a post at The Guardian, Michael O'Leary, the senior executive vice-president at the Motion Picture Association of America, praised the move. He is quite optimistic that this update will guide visitors to the several legitimate ways of accessing media online without stealing the hard work of innovative people.

Even the RIAA has praised this update and believes it will better prioritize licensed music. It considers the update a potentially significant change that can prove truly meaningful to creators by ranking copyright-violating sites lower in results than before. This, in turn, should ensure better rankings for licensed media services that not only deliver the best music but also pay the creative artists. The RIAA has affirmed the update to be a vital move in the right direction, something that all media players have been willing to see.

It can be concluded that most entertainment brands consider this update a rational step that treats copyright in a consistent manner. Certainly, Google has shown an innovative willingness to treasure the rights of creators, considering that the digital marketplace on the Web for licensed digital services is far better than it was a few years ago.

The Reason behind the Two-Year Delay in Updating the Pirate Filter

While algorithm updates have been around for quite some time, the piracy check is a recent introduction. The reason for this is perhaps the recently recognized need for partnerships by this content distribution company.

Google, therefore, has finally decided to tackle the upsetting situation of pirate sites showing up higher in result pages. However, one needs to ask why only Google is pressed to enforce such an anti-piracy policy while the same happens even at Bing, about which hardly anyone is concerned.

Breaking the silence, Google has asserted that the anti-piracy change is happening now because it has all the required data that was lacking before. Since it introduced copyright removals in 2012, it has received much data about online content infringement from copyright owners.

Today, Google is obtaining and processing more copyright removal notices on a daily basis than it did in 2009. It will now use this data for ranking pages in search results. According to the updated 'How Google Fights Piracy' report, Google received more than 224 million DMCA requests for search results in 2013, with an average handling time below six hours per request. Google also revealed that it removed 222 million of them, which indicates roughly a 1% rejection rate due to lack of additional information, false claims of infringement, or inability to find the page.

At present, while gaining praise from the entertainment industry and digital rights groups, Google is also deepening its partnerships with them. Recently, it collaborated with Paramount Pictures to advertise their new movie 'Interstellar' through a highly interactive site. While strengthening these relationships, Google continues to strive hard to overcome piracy across its diverse services.

Read More

A Beginner’s Guide to Google Schema Markup


Schema markup is one of the recent SEO evolutions, introduced in 2011 as a new, powerful medium of optimization. Despite this, it remains one of the least used forms of SEO. It reflects Google's support for structured data markup, a mechanism that helps express relations between things in a machine-readable format.

For some years, Google has supported three standards for structured markup, namely RDFa, microformats, and microdata. However, schema markup focuses only on the microdata standard to keep implementation simpler and to boost uniformity across search engines, striking a balance between the simplicity of microformats and the extensibility of RDFa.

Including schema microdata in digital pages is a hot topic in the SEO world, as it is a ranking factor recognized by Google. Although structured markup formats have been popular for several years, very few sites use schema microdata, and surprisingly, even fewer individuals truly know what it is or why it matters.

As per a recent study posted on Searchmetrics, less than 1% of domains include schema markup. Nevertheless, over 33% of Google search results show rich snippets, which add pieces of information that make a result stand out on the search engine result pages (SERPs) and increase the chances of click-throughs and consequently ranking.

This itself says how essential it has become to include schema markup in your SEO strategy portfolio. Therefore, this guide is dedicated to all those who wish to include the schema in their Web pages.

Overview of Schema, Schema markup, and Schema.org


Fundamentally, schema refers to a kind of microdata that allows a search engine to parse and decode the information on pages more easily as well as effectively. This helps them in offering relevant results to the visitors as per the search queries.

A schema markup is an Internet marketer's new best friend: a distinct set of HTML tags (metadata) added to HTML pages to assist search engines in gaining a deeper insight into what a specific page is about. As a result, it becomes easier for visitors to get exactly what they are searching for online and consequently enjoy a better search experience.

While a majority of HTML tags tell search engines what the site says, schema markup informs them of what your site means. This point of distinction aids the search engines in delivering search results of better quality.

Schema.org is the centralized hub for the Schema project on the Web, a collaboration among Google, Yahoo!, Bing, and Yandex (the Russian search engine that has experimented with ranking without links) to standardize the mechanism of structured markup. The outcome is an agreed set of code markers that convey to the key search engines how to deal with the data on sites.

Schema.org is not a markup protocol or language, but a collection of schemas or Microdata markup vocabulary detailed and organized on the site. It is a standard vocabulary for search engines to deal with data for ranking. In simple words, Schema.org is an initiative to make structured markup simpler for search crawlers and site owners.

Technically, a schema is a type of rich HTML snippet for adding extra detail to the text shown below the URL on SERPs. Rich snippets help you inform search engines directly about who you are, what you are offering, and any other information you are providing. They are like a signpost that eliminates confusion. For example, a schema tells search engines that the given content on 'Titanic' is related to the ship that sank in the early 20th century rather than the Oscar-nominated film.

Therefore, schema is also the preferred method of markup for most search engines. A schema markup offers all types of options to make the listing of a site look flashy as well as relevant on an SERP. It refers to a code (semantic vocabulary) on your site to assist search engines in returning more relevant and informative results.

For example, implementing a proper schema markup for a page optimized for 'vegetarian thukpa recipe' can show an image, calorie count, star rating, cooking time, and other such interesting bits of information under its URL on an SERP, which simply allows the click rate to increase. Here, the right schema markup tells the search engine that the keyword is not just a set of random words, but a cooking recipe that the Web page offers. Therefore, it can be concluded that a schema markup uses a distinct semantic vocabulary in the format of microdata.
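A minimal sketch of what such recipe markup could look like (the property names come from the schema.org Recipe and NutritionInformation types; the file name and values are purely illustrative):

```html
<div itemscope itemtype="https://schema.org/Recipe">
  <h2 itemprop="name">Vegetarian Thukpa</h2>
  <img itemprop="image" src="thukpa.jpg" alt="Vegetarian thukpa">
  <!-- ISO 8601 duration: 30 minutes of cooking time -->
  <time itemprop="cookTime" datetime="PT30M">30 minutes</time>
  <div itemprop="nutrition" itemscope itemtype="https://schema.org/NutritionInformation">
    <span itemprop="calories">250 calories</span>
  </div>
</div>
```

With markup like this in place, a crawler can attach the image, cooking time, and calorie count to the page's listing rather than guessing them from the surrounding text.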

When a site has schema markup, users can observe on the SERPs what a site is all about, what it does, where it is, what its rating is, what an item costs, and more. Therefore, from another viewpoint, schema markup is for users, which webmasters implement as a 'virtual business card'. This is user-focused progress that fulfills the goal of giving the most relevant information to users.

Google calls schema markup the 'preferred' method to structure your content in case you wish to show a rich snippet on an SERP. Well, the good news is that you are not required to learn new coding skills, as knowledge of HTML is sufficient. The only difference is inserting pieces of schema.org vocabulary as microdata into the existing HTML.

Working of Schema Microdata

Just like other markups, schema microdata is applied directly to page content to define exactly what it means and how a search engine should treat it. A webmaster needs to add schema elements and attributes directly to the HTML code of a page to offer additional information to search crawlers. Let us take an example of a rich snippet of a schema markup.

(Figure: sample schema markup for a movie page)
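The original screenshot is not reproduced here; based on the description that follows, the markup would have looked roughly like this (the director and genre properties are illustrative additions):

```html
<div itemscope itemtype="https://schema.org/Movie">
  <h1 itemprop="name">Gravity</h1>
  <span itemprop="director">Alfonso Cuaron</span>
  <span itemprop="genre">Science fiction</span>
</div>
```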
In the above example, the emphasis is on content about Alfonso Cuaron's 2013 movie, 'Gravity'. By adding the itemscope attribute, you are conveying that the code within the <div> tag is about a particular item. On its own, however, itemscope does not specify what kind of item it is; the itemtype attribute specifies that the item is a movie, whose definition exists in the schema.org type hierarchy. Kindly note that the item type is specified as a URL. It indicates that the information within the <div> block is about a movie.

It is interesting to note that <h1>Gravity</h1> informs the browser to display 'Gravity' as a first-level heading. However, the tag itself does not provide any information about the meaning of the string, which makes it tough for search engines to intelligently display content that is relevant to a user. The itemscope and itemtype attributes remove this difficulty. Let us look at one more example:

(Figure: sample schema markup for an event page)
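Again, the screenshot is unavailable; a minimal sketch consistent with the description below (the event name and date are illustrative) might be:

```html
<div itemscope itemtype="https://schema.org/Event">
  <span itemprop="name">Jazz Night</span>
  <!-- the datetime attribute gives crawlers an unambiguous ISO 8601 date -->
  <time itemprop="startDate" datetime="2014-11-15T19:00">November 15, 7 PM</time>
</div>
```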

In the above example, the schema markup tells the search engine that an event (the itemtype is an Event) will be taking place on a particular day (via the startDate property and datetime attribute). This makes it simpler for search engines to display relevant results to seekers without any ambiguity. Dates and times are actually tough for search engines to interpret precisely due to differences in formats; such a schema markup simplifies this interpretation.

By defining item types and item properties (itemprop), you give search engines structured information for delivering the most relevant results. Without such semantics, crawlers have to interpret the data on the site on their own, which means the listing they build might not be the site's best representation.

The Significance of Using Schema Markup


It is imperative to include a schema markup if you wish the listing of your site to stand out from the rest. Chances are high that you have already set up Google authorship, which may have improved your rankings or at least made your listing visually appealing. If you simply add some more interesting details for the users on top of this, the appeal would surely increase.

Including schema markup is not only for SEO purposes but also for benefiting the searcher. With more relevant and useful details shown on SERPs, searchers are in a better position to make an informed choice. The ultimate picture is to make the World Wide Web a better place offering the most reliable as well as relevant results on results pages.

Considering that less than 1% of Web pages use such rich snippets while more than 30% of results contain schema-derived markups, there is a big, empty playing field for you to enter and experience the benefits quickly. Moreover, pages with schema.org markup rank better than pages without it. While structured data markups are rarely used by webmasters, they are massively displayed in Google SERPs. This itself points to the growing significance of schema markups.

The Benefits of Using Schema Markup

 

Although schema.org semantics do not improve ranking directly, pages implementing schema markup have a greater chance of gaining visibility, which in turn can increase traffic and consequently ranking. This is something that Google says and emphasizes all the time. In addition to a prominent position, this new form of optimization offers benefits to searchers, search crawlers, and webmasters alike.

For the Searchers

  • Find precisely and quickly what they are looking for, as the rich snippet offers information in a summarized form (rating, price, description, and so on)
  • Click the right URL based on the information seen, matching it with what is needed
  • Get more relevant results for the query

For the Crawlers

  • Easier, more accurate, and intelligent interpretation of data on a site in an organized way
  • Smart display of rich snippets on SERPs
  • More relevant detection of pages as per the user's query
  • Increased efficacy

For the Webmasters

  • Make the Web a more useful world of information
  • Increased Click Through Rates (CTRs) by 15-50%
  • Time-saving due to a joint effort of schema.org being associated with Google, Yahoo, and Bing, which eliminates the need to add different markup codes for each search engine
  • More chance to appear higher in search with the usage of more schema types
  • Major opportunity due to less than 1% of sites using schema currently
  • Reduced bounce rate
  • More effective email marketing as well as local SEO
  • No clash with the usage of social media tags
  • Easy conversion of a site not developed in HTML5

 

Top 5 Motivations to Use Schema Markup in Your Web Pages

Even after reading the above discussion, you might still doubt whether you should use schema markup for your pages. In that case, here are some facts that can motivate you to use it right away!

Matt Cutts, who has shut down countless sites utilizing dubious SEO strategies, has been recommending schema markup for Web pages for a few years now. Because he polices the largest search engine on the planet, his words carry enough weight to influence your decision.

In a webmaster help video in 2012, Cutts said that implementing schema.org does not necessarily result in ranking higher, but if you typed 'lasagne' and clicked 'recipes' in the left-hand sidebar, this is where schema.org might help, as you are more likely to show up in that filtered view.

Rich Snippets Deliver Higher CTRs


Using schema.org confers visible benefits, one of which is an improved listing on SERPs. While listings generally include only items such as titles and page snippets, a targeted schema markup allows including customer ratings, photos, date/time, and more. The latter is not only more aesthetically appealing but also more effective.

According to Search Engine Land, rich snippet listings that show more details than standard listings can increase the CTRs by 30%. This means 30% more traffic, which is truly significant for any company to reach the next level while increasing search engine visibility.

Apart from making it easier for the crawlers to chunk and organize your site's content, schema microdata helps define as well as display rich snippets of your information on SERPs. Contrary to a widespread misconception, Google does use schema markup to surface information snippets. If the snippets are clear and concise, they can generate higher click-through rates, because visitors can easily and quickly determine whether the site's content is what they are searching for.

Schema-implemented Pages Rank 4 Positions Higher on Google on Average

Although very few webmasters have implemented schema markup, those who have are being rewarded at present! Their sites now rank an average of four positions higher on Google, which means a page that would have ranked fourth can now rank first. The Searchmetrics study explains further that Google shows pages with schema markup for more than 36% of keyword queries, whereas those without markup are shown less frequently.

Continued Support to Existing Rich Snippets Markups


In case you have some markup already on your pages, done with the help of RDFa or microformats, Schema.org will support it. This means you do not have to start from scratch and remove the existing markup, which saves a lot of time and energy when implementing the new schema markup. However, take utmost care that the formats are not mixed together on the same page, as doing so can confuse the crawlers.

Proactive Testing

(Screenshot: Google's rich snippets testing tool)

Just as you test your Web page, it is useful to test how your markup is seen. A successful test simply tells you that the crawler is able to parse the data precisely. You can do so by using the rich snippets testing tool. While this tool shows the parsed marked-up information, its preview still does not render the text for schema.org markup, a functionality that is expected to be added soon.

The bottom line is that experts at the top search engines recommend schema, which means implementing it is surely not going to hurt in any way; rather, it increases your chances of higher click-through rates and consequently a higher rank.

Applications for Which You Should Use Schema.org

The new schema markup improves the visibility of your site for almost all content types. There is a markup for movies, events, products, articles, restaurants, book reviews, local businesses, TV episodes and ratings, blogs, authorships, and software applications. Countless markup types, ranging right from toy stores to IT apps exist. Therefore, you can associate almost any type of data on your Web page with ‘itemscope’ and ‘itemtype’ attributes. Several ways exist for implementing schema markup in Web pages, which include:

  • Videos: Matt Cutts has revealed that Google recommends using rich snippets on a distinct video landing page even if the video is embedded.
  • Testimonials: This is another worthy application of schema markup, which can show review description and rating of the item being reviewed. Most consumers rely on online reviews, which makes it sensible to markup testimonials on SERPs.
  • Brand Identity: This includes Name, Address, and Phone (NAP) that reveals your business’ contact as well as geographic information precisely. You can use local schema categories in the footer or page saying ‘Contact Us’.
  • Events: This one announces upcoming events, whose details on SERPs are truly handy for visitors.
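As an illustration of the Brand Identity point above, NAP details could be marked up in a page footer along these lines (the business name, address, and phone number are placeholders):

```html
<footer itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Acme Cloth Boutique</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">12 Main Street</span>,
    <span itemprop="addressLocality">Chicago</span>
  </div>
  <span itemprop="telephone">+1-555-0100</span>
</footer>
```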

When Google introduced schema.org, several webmasters were disappointed to learn that the types of information supported by other structured formats were incompatible with the new microdata. The good news is that this problem is now solved, as the schema supports several data types featured in other structured formats. For highlighting the aforementioned information, Google officially supports the following types of data (schemas):

  • Reviews: For showing reviewer, average rating, total number of reviews, description, and so on
  • Products: For showing availability, price range, image, description, brand, seller, and so on
  • Videos: For showing time, video thumbnail image, and so on
  • Events: For showing the date, place, time, and so on
  • Offers: For showing payment method, delivery method, place, availability of items for which an offer such as a coupon or discount is running, and so on
  • People: For showing location, organization, position, and so on
  • Businesses and Organizations: For showing a local business' address, phone, geo-location, logo, and so on
  • Recipes: For showing calories, cooking time, ingredients, and so on
  • Breadcrumbs: For showing title, child, and URL
  • Music: For showing song, album name, duration, and so on
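As an illustration of the review type in the list above, a review rich snippet can be produced with markup along these lines (the reviewer name and scores are placeholders):

```html
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="author">Jane D.</span> rated this shirt
  <div itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```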

Do you think the above schema vocabulary is insufficient? If yes, you might wonder whether you can add your own types to the schema vocabulary. The answer is neither 'yes' nor 'no'. Schema's type hierarchy has several commonly used item types, each with relevant subtypes. However, the level of these subtypes can differ. At times, you might prefer adding a custom itemtype, which is possible by using extensions.

For declaring your own item type, you only need to add a slash as a suffix to an existing item type and mention the new term. For example, you can write 'Person/Doctor/Dentist', wherein Person is the existing itemtype, while Doctor and Dentist are the custom types. You can find information about naming conventions and customizing properties and enumerated items on Schema.org.
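Following the slash convention just described, a hypothetical markup for a dentist's page might look like this (the extended type and the name are illustrative; crawlers that do not understand the extension can fall back to the base Person type):

```html
<div itemscope itemtype="https://schema.org/Person/Doctor/Dentist">
  <span itemprop="name">Dr. A. Fernando</span>
</div>
```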

Mechanism to Markup Pages with Schema Microdata

Sitting and manually adding markup code to the HTML of each of several pages can be painful, especially if the number of pages runs into the hundreds or more. Manual addition means far more work for bigger sites than for smaller ones. To simplify work for bigger sites, it is best to use the plugins offered for WordPress, or you can consider using the following Google tools:

  • Structured Data Markup Helper: This is a real time saver because it automates the task of marking up your page details in a user-friendly manner. It helps in decoding what schema needs to be added to a page. Just select a data type and enter your page's URL for marking it up.
  • Structured Data Testing Tool: This is a testing tool that diagnoses any code errors or issues after implementation. You are only required to enter a URL, and you instantly get direction on the required changes for yielding correct code. Use this tool only after you are done including all markup tags.

For a first-timer, a question might arise as to whether it is essential to include every property or attribute of the markup on every page. Well, it is not mandatory to do so, but the more properties you apply schema microdata to, the more transparent the content's nature and purpose will be to the crawlers.

Moreover, do keep in mind that you need to apply markup to a few properties before Google generates rich snippets by referring to your microdata. With the help of the testing tool mentioned above, it is easy to check the kind of information that will be extracted from the specified markup.

Integration with Social Media Tags


Social giants such as Facebook and Twitter have come up with their own rich data types for markup. Facebook has released the Open Graph protocol, built on the foundation of RDFa, due to which most tagging goes under <meta> tags. As a result, the tags remain invisible to humans. Twitter has come up with Twitter Cards. Well, no official declaration from either of the two giants exists about their support for the new schema microdata. However, webmasters are successfully using these protocols and Schema.org together on one page.

A few marketers wrongly think that including these two is enough to ensure that the given content is highly shareable. However, you can use schema microdata along with social media tags to offer even more details about a page's content to search engines. Consider including markup next to Open Graph tags, delivering content that is not only optimized but also shareable.
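A page can carry both kinds of markup side by side. A minimal sketch, assuming a recipe page (titles and types are placeholders):

```html
<head>
  <!-- Open Graph tags, read by Facebook and other social platforms -->
  <meta property="og:title" content="Vegetarian Thukpa Recipe">
  <meta property="og:type" content="article">
</head>
<body>
  <!-- schema.org microdata, read by search crawlers; the two do not clash -->
  <div itemscope itemtype="https://schema.org/Recipe">
    <h1 itemprop="name">Vegetarian Thukpa Recipe</h1>
  </div>
</body>
```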

Tips on Using Schema Markup

Consider the following tips to dive even deeper into the world of schema microdata and obtain richer results. All you need is to follow the tips proactively.

  • Explore schema.org in detail to know about the data types and properties supported and the extent of customization possible.
  • Find all schema types that are useful for your Web pages.
  • Identify the types that are most commonly implemented in Web pages similar to yours. Such a comparison can also help in choosing the most useful markup types. Schema.org contains the most common markup types. In short, because of the sheer assortment of markup types, you need to explore the full vocabulary on Schema.org.
  • The more you mark up your pages, the better the results will be. Once you start comprehending the diverse collection of itemtypes, it is easy to conclude how much there is on the page to mark up. However, this does not mean that you need to mark up all content. The key here is to mark up the content that is visible to humans who visit the page, not hidden content in div tags and other hidden HTML elements.
  • Avoid overusing the markup in a misleading or inconsistent manner, which can certainly force Google to impose a penalty that will trigger a negative effect on the ranking as well as the reputation of your site on the World Wide Web. Therefore, be ethical and play fair.
  • There is no need to mark up every property.
  • Do not redo the existing content in the new format of schema.org, as Google continues to support snippets for the prevalent content. Moreover, schema.org supports all types of information that even microformats and RDFa support. Nevertheless, consider switching to the novel markup format for gaining more benefits in the long run.
  • In case the site has content that schema.org does not support currently, consider using a less specific markup type, or utilize the extension mechanism as a powerful customization tool to create a new type.
  • Do set up a Google+ business page for boosting business or brand exposure as well as fetching better reviews. Although this is not directly related to schema microdata, it can get you more reviews as well as schema information, because this social media platform has integrated with Google Places for a well-managed page.
  • Do implement Google Authorship.
  • Always test your schema markups. If you are satisfied with the results, it simply means that search engines can easily crawl your pages the next time and interpret the data more effectively, which means increased chances of being more visible on SERPs.

 

Conclusion

Despite the significance and SEO benefits of schema markup, its acceptance has been low, perhaps due to the lack of technical expertise or even awareness. This has actually created an amazing opportunity for webmasters and businesses, particularly those in IT and marketing, who work together towards a single goal. While marketers might comprehend the importance of structured data markup, webmasters need to implement it with the help of more advanced HTML knowledge.

In other words, millions of sites are missing out by not being a part of this big SEO potential. By using schema markup, you will automatically have a leg up on the big ladder of competition. Including schema microdata in HTML surely increases visibility, but it is in no way a quick fix or a shady SEO strategy for getting a higher rank.

Rather, it is a best practice that makes it simpler for search engines to recognize your pages for a set of keywords on which they can prominently appear in SERPs. Schema markup is an SEO innovation that is perhaps going to last for a long time. Therefore, it is the right time to learn and implement the most suitable microdata for boosting your search results. However, you need to do it the right way to be ahead of the curve. I hope this exclusive guide helps you use schema markup to achieve its SEO benefits in an easier and quicker way.

Read More

Powerful Local SEO Strategies for Accelerating Your Local Brand Awareness in 2014

It was almost a decade ago that local search gained momentum on the World Wide Web, with customers searching for places and for ways to reach those places. Today, however, more than 750 million GPS-enabled phones are helping customers explore places locally. As per the latest statistics, over 85% of users search locally on their smartphones through Web browsers and apps such as Facebook and Google Maps. This inspired Rob Reed to say that local search is a mobile experience at a social level.

In 2012, it was found that over 60% of local searches triggered successful purchases. Compared to other digital marketing initiatives, local Search Engine Optimization (SEO) ensures a high return on investment. According to a 2013 survey by MarketingSherpa, around 54% of the marketers surveyed agreed that local SEO positively influenced their brand.

Search Engines and Local SEO: What’s the Current Impact


With the introduction of Google's Hummingbird update in 2013, search engines have become more intelligent in returning relevant, localized results for people's searches. Add the Panda and Penguin updates, and local SEO has actually become a challenging necessity for businesses. This has triggered the need to target location-based keywords without swamping Google with plenty of inapt local search terms.

However, forecasting local SEO trends is a bit difficult because local SEO is not a fad or passing phenomenon. In fact, it is a never-fading asset that allows optimizing the search experience for users residing in any part of the globe. It is a basic mindset that cannot be bolted on later. However, the difficulty disappears when you choose a Google-friendly strategy.

Getting Ready to Prepare a Local SEO Strategy: Gaining the Vision


Due to the Google Pigeon update, global brands are forced to reconsider their global SEO strategies, which now need to be competitive even against local businesses in the targeted nations. Whether you are a small or big brand, local SEO is bound to increase your customer base, boost brand awareness, and expand your business growth as well as success.

So, is it possible for a global brand to compete locally? Yes, it is! Even a small business can compete with global brands. In both the situations, only a good local SEO strategy is going to work. For the strategy to work, one needs to have a comprehensive vision. Just as in case of content marketing, the key here is to think like a customer who is always interested in getting the desired information through the replies for different questions.

When you put yourself in the position of your customer, you can easily identify the dos and don’ts in each of the technical and non-technical aspects of local SEO. The common and bigger picture goes like this: Assuming you have your upfront stores in different countries, you need to create individual Web pages for each of them and optimize them to target local customer base first.

Before Creating Local SEO Strategies

Before we check the strategies, here are some factors to consider for preparing the most effective local SEO plan!

  • Local Laws and Ordinances: These might not apply to all businesses, but for some they act as the primary set of considerations. For example, if your business is a swimming pool or home repair company, your customers obviously need to follow zoning and setback laws. Remember, each nation has its own regulations, so customers are more likely to search with phrases such as ‘home repair laws in Chicago’ or ‘swimming pool company in Chicago’.
  • On-site, Off-site, Social/Mobile, and Review: Once you know the local laws and ordinances, you can easily determine their impact on the keywords and other SEO elements present on the pages, off the pages, on social networks, and on review sites.

Effective Strategies for Different Business Types

Businessman drawing strategy concepts

  • If You Are a Global Brand Having Local Branches or Separate Franchises Run by Locally Registered Firms: Consider developing a separate Google+ page for each and creating local listings on major sites permitting customer reviews, such as Facebook and Yelp. It is advisable to decentralize a sole G+ page into individual branch pages. Moreover, search engines prefer brands that have verified their sites, running on local IPs, with the search engine.
  • If You Are a Brand Allowing Sales through Partners or Distributors: Consider motivating them to build awareness of your brand locally. Although officially registered business directories allow personalized names, your goal should be to gain local backlinks whose anchor text contains your brand name; for instance, the links can refer to a partner.
  • If You Are a Global Brand Having a Centralized Approach Without a Local Presence Anywhere: Consider purchasing local domains and localizing the site. An ideal way to improve local backlinks without hurting your famed global presence is to subscribe to local domains called ccTLDs. These domains help develop brand awareness among local communities effortlessly. You can even set your location-based target in Webmaster Tools. Also include the target country in your site and localize content elements such as metric units and currencies.
  • If You Are a Global Brand with Some Local Popularity but Without Local Branches: Consider having your name, address, and phone (NAP) details in your site or subdirectories on a consistent basis. Your brand might be popular through showrooms, training centres, or warehouses, so obtaining a local phone number should not be difficult. Utilize schema.org to mark up NAP details so that they can be easily surfaced in search engine result pages (SERPs). Search engines prefer businesses functioning locally to some extent, which can increase your rankings.
  • If You Are a Globally Reputed Yet Local Brand: Consider having a single-domain, multilingual site if you are running a university, tourist agency, or family-operated business. Never buy local domains such as ccTLDs or rely on local SEO providers in different nations for piecemeal link-building tasks. Doing so can bring down the reputation you have earned by growing your brand locally. Rather, utilize your local domain power to create a single site that offers subdirectories (<sitename.de>/Spanish) for different languages to reach out to customers.
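The NAP markup mentioned above can be sketched with schema.org microdata; the business name, address, and phone below are hypothetical placeholders:

```html
<!-- Hypothetical NAP block marked up as a schema.org LocalBusiness -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Repairs, Chicago Branch</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example Street</span>,
    <span itemprop="addressLocality">Chicago</span>,
    <span itemprop="addressRegion">IL</span>
  </div>
  Phone: <span itemprop="telephone">+1-312-555-0100</span>
</div>
```

Whatever form the markup takes, the name, address, and phone should match, character for character, the NAP used on your Google+ page and local listings.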

Local SEO Strategies 2014 for All Businesses

You may have already optimized your site for local SEO, whose rules still apply, but some newer strategies are likely to be more effective in 2014 regardless of business type. Let us check them out!

Keyword Strategy: Make it Local and Long Tail

Long tail keyword

Local SEO might not seem vital for e-commerce webmasters, but increasingly, Internet-marketing professionals are choosing to localize their sites. This is because local SEO can bridge the gap between the current 500 clicks per week and the desired goal of 2,000. One way to build this bridge is to include the geographic location in the keyword.

While localizing your site, it is essential to revamp your keyword strategy and turn keywords into more meaningful phrases for attracting local customers. As per Moz.com, webmasters should wisely include the city and state name in which the corporate operations take place within the meta title and description, page title tag, H1 headings, URLs, page content, and alt tags. These elements exist for each page, and each should include the local keywords with the city and state name. However, do not stuff the page content so much that Google is forced to penalize you.
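To make the placement concrete, here is a minimal sketch of a page targeting a hypothetical local phrase; the business, city, and file names are invented for illustration:

```html
<!-- Hypothetical page targeting 'home repair in Chicago, IL' -->
<!-- URL: example.com/home-repair-chicago-il/ -->
<head>
  <title>Home Repair in Chicago, IL | Example Repairs</title>
  <meta name="description" content="Licensed home repair services in Chicago, IL.">
</head>
<body>
  <h1>Home Repair Services in Chicago, IL</h1>
  <img src="crew.jpg" alt="Example Repairs crew at work in Chicago, IL">
</body>
```

Note that the city and state appear once per element; repeating them further down the copy is where keyword stuffing begins.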

After the Hummingbird update, keywords in the question form deliver better and more relevant search results than robot-friendly equivalents that do not have question format. For example, ‘Which is the best pizzeria in Belgium’ delivers better results than ‘Pizzeria in Belgium’.

A few years ago, both these keywords would have shown similar results. Today, however, most of the result pages for the two differ drastically, because Google now comprehends user intent far better. Therefore, use long tail keywords throughout the content of your site and linked sites to target this second class of search users.

However, keep in mind that using these keywords does not by itself mean ranking on the first page of Google Maps results for that keyword. For this to happen, you need to take care of more things, which are discussed ahead.

Organic SEO Strategy: Directly Influencing Your Local Search Results

Local SEO results

According to Mike Blumenthal (Blumenthal.com) and Adam Steele (Lean Marketing), an individual site’s organic SEO rank highly influences its local results. Both performed some local searches to prove this relation between local and organic SEO. It directly means that ranking well in local SEO requires ranking well in organic SEO.

This means continuing to work on organic SEO efforts, which include optimizing the meta title and meta description, images, page speed, and customized URLs, along with strategic link building, social media, conversion, and content marketing.

Social Media Strategy: Tapping the Power of Google+, Pinterest, Facebook, Twitter, and Google Maps

social media platform

Are you ignoring social media for localizing your brand awareness? If yes, then it is high time to stop! Such platforms are the biggest off-site SEO opportunity to spread awareness of your brand.

Nowadays, not all searches occur in the browser, as several customers use apps to perform a local search. Do you know that the Facebook app is highly used after Google Maps for local search?

Consider learning and implementing new ways of exploring social media sites as a means to expand your business, even locally. While doing so, consider some vital SEO points. First, whatever you do, keep it fresh. Stay active on Facebook, Twitter, and Pinterest by posting new, quality content, which shows search engines and visitors that your brand is flourishing. Search engines favor such fresh updates with a higher rank.

Second, consider increasing the number of likes/shares/retweets by publishing interesting posts. The higher their number, the greater the customers’ interest in your content and the more favor you receive from search engines. However, avoid cluttering your feeds with worthless information, which neither visitors nor customers like.

Third, engage with your visitors frequently by responding to them, irrespective of the social platform. This shows that you care for your business as well as your online community, which is a plus point for better SEO results.

Lastly, give priority to Google+, as Google is still the giant player when it comes to site ranking. Do create a Google+ account for your brand as well as yourself, add to your circles, share relevant content on your corporate and personal pages, and follow people. For local SEO, consider the following two strategies in addition:

  • Place Pins in Pinterest: This is the latest feature of Pinterest and is of great interest to local businesses, especially those in the service sectors. As a user, you can pin new destinations onto your boards or search for novel locations on Foursquare’s map. If your products/services have a good local fan following, pinners can see your service and NAP information, along with images of your products and services, directly on the map.
  • Optimize Google+ Local Page and Google Maps: Both these media are core factors for local SEO ranking. The integration of Google Maps in your site and business previews from Google+ Local are strongly associated with local SEO. While optimizing maps and G+ Local pages, ensure that you provide precise NAP information that matches the details on your site. Similarly, try to increase the number of reviews on your Google+ page, links from popular and quality sites that mention your business, site, or NAP details, local citations from local listing or review sites, and third-party reviews gained on sites other than Google+. In the case of maps, ensure the correct proximity is shown for a city recognized by Google.

Local Citation Strategy: An Increasingly Important Essential

Although nothing new needs to be done if you already have proper and adequate local citations (business listings), it is vital to know that this strategy is acquiring more importance than before. In the post-Hummingbird era, getting local citations from reputable, authorized directories as well as locally relevant sites matters more for a higher ranking in Google’s local searches.

To know how well your site is listed on locally relevant sites across the Web, visit GetListed, which shows whether you are listed in the precise category with Google Places. Track all these business listings through an Excel worksheet that can serve as your template to save as well as update local listings with the right NAP information.

In case you need to create citations, consider publishing a citation each on Google Places, Foursquare, and Yelp.com. Currently, these are the most reliable and highly targeted business directories. Moreover, they are important for obtaining local but competitive keywords. As per the Search Engine Journal, professionals can wisely submit citations to other directories like industry-specific sites (Avvo), data aggregators (LocalEze), and region-specific sites such as Ontario.com/attractions. In short, consider increasing the number of valid citations by retaining NAP details consistently.

Ensure fully informed and consistent NAP details across all listings. Your business might already have been listed by someone somewhere. In that situation, just claim the listing (known as a citation). To do so, first check whether another business of the same name exists at the same location. For this, use the Yext or Localeze tool to spot local listings online, use Google Map Maker to search by phone number so you can find and remove duplicate listings, and look for different names of your business.

Claiming means accepting and verifying, by phone or email, that you are the proprietor of the listed business. The Yext and Localeze tools can help you accelerate this process by creating citations or making them more consistent, ensuring precise local information for your brand or company.

Strategy to Have More Online Reviews: Highly Effective

customer review

As per a study published on Search Engine Land, the search engine giant Google considers online reviews a key factor for ranking a site. In effect, your reviews speak to the users who view your brand on a search engine result page. Reviews help consumers decide whether your site’s link should be clicked or not, and they facilitate taking this decision quite quickly.

Therefore, you should try to increase the number of positive reviews online. Luckily, several ways exist to improve your company’s reviews along with their numbers. Because you want to avoid fake reviews and should not offer presents to those leaving good reviews, you will have to follow natural ways of obtaining reviews, which is what Google likes.

Some ways to get genuine reviews: put a review button on your home page for an easy start, prompt visitors to write a review after buying a product, hiring a service, or visiting a specific landing page, or talk with customers in stores about leaving a review.

Mobile SEO Strategy: Just Cannot Be Ignored

mobile SEO

With more than 85% of Internet users accessing sites through smartphones, it is obvious that an effective mobile SEO strategy is essential for successful local SEO. As per a recent survey, nearly 40% of mobile searches are targeted locally, which encourages every business to go locally mobile.

While local SEO is yet to be influenced by mobile SEO, it surely governs mobile SEO. This is because not all small and big businesses have successfully optimized their sites for mobile usage, which also means the desktop rankings of these sites are not yet affected by their mobile performance. However, given the tremendously increasing number of smartphone and tablet users, search engines may well start ranking sites by their mobile-friendly versions in the near future.

The other side is that if you rank high in local SEO, chances are high that you will rank well in mobile SEO too. In that situation, when a mobile user visits your site, it is essential to offer a matchless mobile experience, in the absence of which your local rankings will fall. Here are some statistics that prove why mobile SEO is essential for local SEO.

First, 90% of American adults have a cell phone and 58% of them own smartphones, as per Pew Research. The same institution has also found that around 63% use the phone for accessing sites. According to Google, 40% of mobile searches are made for local places, resources, products, or services. Econsultancy even states that 75% of mobile searches lead to follow-up actions, which encompass tasks like social media sharing, store visits, research, and even a phone call.

These statistics speak for themselves about the importance of having a mobile-friendly site or responsive design. If you do not have a mobile site, it is wise to start looking for responsive web design tools and tutorials. Remember, quality search engines like Google waste no time in changing or updating algorithms to boost the quality of local search engine result pages. Although this is beneficial in the long run, it imposes a challenge in the short run, particularly for mid- and small-sized local businesses that do not have a good budget for hiring local SEO experts to keep up with the latest trends.

Despite the challenge and the absence of local SEO experts, small businesses can really overcome these difficulties on their own. This is possible only by knowing what you need to do and why, which helps in prioritizing local SEO strategies accordingly. Quick adoption of popular or standardized practices, avid reading of case studies, and readiness to experiment personally are critical to surviving this Internet marketing mission.

In short, the time is near when you will race against bigger brands on search that is local and highly specific. Therefore, optimizing all your Web pages for local SEO is truly indispensable.


WordPress SEO: Detecting And Fixing The Problems Through Smart Strategies

According to a recent CMS analysis involving over a million sites with high Alexa ranks, WordPress holds around a 60% market share, followed by Joomla and Drupal. This alone says that the WordPress Content Management System (CMS) is widely used by millions for creating as well as managing blogs and sites. Well, this raises the most inquisitive question: What makes WordPress so popular? There are two reasons: it is legendarily user-friendly and SEO-friendly, which motivates professional developers to use and recommend WordPress.

Peering into its SEO Potential

Benefits Guaranteed to be Enjoyed

The secret behind being touted as the most recommended blogging platform is the ability of WordPress to act as a great SEO tool for web marketers. It is outstandingly versatile when it comes to integrating best optimization practices.

In 2009, Matt Cutts, the head of the webspam team at Google, honored WordPress as the ideal blogging platform for SEO at the WordCamp conference. According to Matt, WordPress takes care of around 80 to 90% of the mechanics of SEO, even before taking the help of a few famous SEO plugins. This itself speaks volumes about the SEO capabilities of WordPress.

All known search engines easily notice domains running WordPress, perhaps due to the manner in which it is coded, adhering to the search engine guidelines accepted by Google. This greatly lessens the work of SEO professionals, who now only have to supply the script with a domain name and title. The script utilizes these details to generate SEO tags sensed by the search engines. This means the professionals can readily fill the Web pages or blogs with keyword-rich content and other optimization elements. Such a mechanism guarantees the following salient benefits:

  • Independent and Efficient Content, Design, and SEO Management: Separating content from presentation and SEO elements is a foundation of Web design principles, which WordPress honors. This allows SEO professionals to tweak SEO-critical components without disturbing the content or design, and vice versa. As a result, it becomes easy to keep pace with ever-changing SEO algorithms.
  • Clean Coding and Diverse Functionalities: Setting aside some free themes, the CMS ensures clean code, which means no site penalization. Further, several available plug-ins enhance site or blog functionality without extra coding. Right from developing XML sitemaps to optimizing meta tags, plug-ins integrate best SEO practices in a flawless, modular way.
  • Open Source Access: WordPress is an open-source CMS, which means free download and setup of the site or blog on your own server. It also means thousands of programmers are still working on it to render it more efficient, which means constant updates to keep up with the changing SEO algorithms. This makes a lot of difference!
  • High Level of Customization: One of the admirable powers of WordPress is the ability to customize URL appearance. SEO professionals can insert all relevant keywords in the path, which makes it simpler for search engines to sense these phrases and rank the pages.
  • Quick Loading: Despite being database driven, WordPress queries databases efficiently, which means minimal impact on bandwidth. Further, developers can implement database optimization practices to reduce loading time even further. This is highly important, as quick-loading pages secure higher rankings.
  • Built-in Really Simple Syndication (RSS) Feeds: This is a vital SEO benefit, as it allows the content to reach a feed directory and send a link back to your site. The post is associated with other directories where visitors subscribe to your feed to get notified when new content is posted.

Looking at the Other Side of the Coin

From the aforementioned benefits and capabilities, there is no doubt that WordPress is becoming even more popular, not only amongst SEO experts but also amongst SEO beginners. The CMS is simple yet highly effective in grabbing the desired page rank, which sets it apart from its counterparts. Above all, you need no computer genius to run and manage your site through WordPress.

However, you need to ensure that you stick to the SEO standards set by the changing algorithms to grab as well as retain the desired ranking position. This is best done by sticking to best practices and the latest plug-ins and features. Well, there is a catch here! Knowing about these solutions is a job half done; the remaining half involves implementing them seamlessly.

Somehow or other, the gap between knowledge and implementation results in many problems while optimizing WordPress pages. Sometimes, it is the lack of knowledge itself that creates SEO problems. However, what you perceive is the symptom; the problem always lies beneath your strategic efforts. For example, you may notice that the page rank is not improving despite all your WordPress SEO efforts. That, however, is merely the symptom indicating an underlying problem.

In short, when you see anything undesirable, or observations that fall short of the goal, your mind suddenly triggers the following three questions:

  1. How do I track the problem through this symptom?
  2. How do I now ensure that my rank continues to stay stable for the given set of primary keywords, despite the prevalent symptom?
  3. How can I now climb the ladder of page rank for improved visibility despite the problem?

Keeping these questions in mind, let us explore the common problems that tend to affect your WordPress SEO negatively. After all, no SEO professional would like to have these problems in their WordPress blogs or sites, as it is matter of online reputation and brand awareness!

Problem 1: Unclear and Non-unique Titles and Page Description

wordpress seo optimization service

  • Main Symptom: Your links in results pages are not clicked. Therefore, it makes sense to write catchy but unique titles for your pages. After all, you need to convince a searcher to click on your link from the search results page. Just consider yourself a searcher and you will know the importance of the page title in making the visitor click!
  • Tags: Not having unique H1 or meta labels on each page is a significant issue. As per the latest search engine optimization rules, each page of the site, including the landing page, must have its own set of meta and H tags, which involves title, subtitles, and keywords. It is wise to stick to 5 to 7 words for the H1 or title, of which 2 to 3 words should be keywords so that visitors can find your page quickly.
  • Remember: Some WordPress themes do not use the title tag properly or may not use it at all. Therefore, you should add it with the apt page title before the content begins. Moreover, keep in mind that in a few themes, the main title is ‘H1’ and ‘H2’ is the post title. Write a title that is unique and clear, but that also successfully conveys to the searcher what information you are offering. If this much is conveyed, the seeker is bound to click.
  • Call to Action: Use an SEO plug-in such as WordPress SEO by Yoast or All in One SEO, which allows you to write unique titles. For a unique and catchy title, condense your story to its root by asking three questions: What is the story’s main point? How will it benefit the reader? What would I put in the search box if I were looking for such a story?

Problem 2: Improper Basic SEO

 common seo errors

Main Symptoms: Your site or blog is not getting satisfactory SEO results from the beginning. Indexing is not happening at all. Google is unable to recognize all the pages of your site or blog.

Basic SEO includes writing and optimizing unique title tags for both pages and posts along with meta tags, including keywords in the right density per total number of words, having meaningful URLs, optimizing images, and generating an XML sitemap. If there is any inaccuracy, ignorance, or lack of optimization in these aspects, your blog or site is bound to face undesirable consequences.

When a visitor searches Google for a site, a small snippet of content appears below the page link. You can control this content by customizing the page’s meta tags, namely the title and description. Similarly, you need to add keywords to your page and post title tags to convey to the search engines what your site or page is all about.

It is vital to know that Google no longer crawls the keywords tag, because it now has better ways of ranking and checking relevance. However, this is not true of all other search engines, so it is still best to insert the keywords. It is recommended to do so by activating ‘dynamic’ keywords using the All in One SEO plug-in.
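Putting the tags above together, a post’s head section might carry something like the following; the title, description, and keywords are hypothetical:

```html
<head>
  <title>SEO in WordPress: A Practical Checklist</title>
  <!-- Hand-written snippet that appears under your link in the results page -->
  <meta name="description" content="A practical checklist for optimizing WordPress titles, tags, and URLs.">
  <!-- Ignored by Google, but still read by some other search engines -->
  <meta name="keywords" content="wordpress seo, seo checklist">
</head>
```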

For descriptions, no standard way exists to automate them; in fact, the best ones are hand-written. For better configuration of each post, you can consider using the Headspace plugin. It enables auto-filling a post’s meta description as per the category description, which is handy if you post a lot.

Google keyword planner 2015

Unless you are deep into branding, it is wise to optimize your site for one keyword that can fetch search traffic. Several blogs end up obtaining most of their links to the homepage. Therefore, it is better to leverage these links by targeting rankings for a relevant phrase that is fairly competitive with a decent search volume. Search for such a phrase in the Google Keyword Planner tool.

Once you determine the keywords through the Google AdWords Keyword Planner, you will have to use them so that your content and tags are optimized. Within the content, you need to consider keyword density: the number of times a keyword is used in your text relative to the page’s total word count. Ideally, there is no standard keyword density, although most people take it to be about 1%, i.e., around five mentions in 500 words.

However, it is best to follow this guideline: put the main or primary keyword in your H1 or title tag, in an H2 tag, and in the first and last paragraphs. The remaining keywords should appear at least once over the entire text. However, use keywords naturally; do not force your sentences to turn meaningless just because you need to place a keyword there. Moreover, Google is not going to reward you much after the first three instances, and stuffing more will only bring down your ranking position.

Apart from the content, the main keyword for each page should also be present in the page’s title tag, the post’s title, meta keyword list, and in meta title. The main keyword of the entire site should be present in the title tag of your homepage, the site’s heading, logo, and as the anchor text in links from external sources such as other blogs or sites.

wordpress meta description seo

Call to Action: Include meta descriptions and titles having the main keyword, which should also appear in the page title, post title, subtitles, links, and page content. Include the rest of the keywords once in the content and in the meta tag.

Next, you need to optimize your WordPress URLs, which by default are simply meaningless. Post URLs look like domain.com/?p=35, where 35 is the post ID, but search engines are impressed only if this URL appears as domain.com/seo-in-wordpress/. This is because the categorized or customized URL informs visitors what the page is about before it is clicked. The id-based URL is also a reason why a user might not click it on the result page, because it is not clear what the page is all about.

To change this URL structure, go to Settings → Permalinks. Permalinks are the URLs of your posts. It is best to use /post-name or /category/post-name. If you wish to use the first option, change the Common Settings to /%postname%/. However, if you wish to include the category, select Custom Structure and enter the permalink as /%category%/%postname%/. WordPress will then take care of redirects.

The last thing to do about permalinks for improving SEO is to eliminate stop words such as ‘and’, ‘a’, and ‘the’. If you are using the latest WordPress SEO plugin, these words are removed automatically once the post is saved, which means you will not get ugly long URLs. However, avoid such changes after posts are live and receiving clicks. If you do change a permalink then, ensure proper redirection of the post. Usually, WordPress should redirect to the new URL, but if it fails, manual intervention is indispensable.

Call to Action: Give meaningful permalinks to pages and remove stop words.

Often overlooked is the optimization of images, which is done by writing good alt tags and giving files meaningful names. These two acts can attract extra traffic from the diverse image search engines. They are also a great aid to visually impaired readers who visit your site with a screen reader.

wordpress featured image

Call to Action: Write alt tags for images and save them with meaningful file names.
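As a quick sketch of both acts, with a hypothetical file name and product:

```html
<!-- Meaningful file name plus a descriptive alt tag -->
<img src="/images/blue-widget-product-photo.jpg"
     alt="Blue widget shown from the front on a white background">
```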

Last but not least, ensure that you have an XML sitemap, as it is the easiest way for search engines to learn about the content on your site. Having a sitemap is like telling them yourself. A sitemap is a special file that allows search engines to index all pages of your site as well as get notified when new content is added or existing content is changed. While Google favors XML sitemaps, Bing and human users prefer HTML sitemaps. You can use both by publishing an HTML sitemap on the blog and an XML sitemap in your root directory for improved SEO.

how to make a sitemap

Call to Action: Generate both sitemaps by using plugins and keep the XML one in the root directory. Further, submit the XML sitemap to Google Webmaster Tools.
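For reference, the XML sitemap a plugin generates boils down to entries like the following, per the sitemaps.org protocol; the URL and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/seo-in-wordpress/</loc>
    <lastmod>2014-05-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```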

As per the latest updates, keyword density is no longer highly important given the increasing use of Latent Semantic Indexing (LSI) relevance, although keyword optimization is still essential for best results. When it comes to keyword density and semantic relevance, the difference between a page and a post matters, as the latter allows comments whose vocabulary and semantics are very rarely optimized. This means that for ranking a vital piece of content easily, it is best to publish it on a page rather than through a post, as a page’s vocabulary and semantics are controllable. Moreover, by default, pages gain more priority than posts in the core of WordPress.

Call to Action: Consider making more pages than posts.

Problem 3: Content Duplication

 duplicate content tool

Main Symptom: Site or blog is penalized and gets a lower rank.

Believe it or not, duplicate content is amongst the biggest obstacles to top page ranking. Therefore, it is the first thing that needs to be removed while optimizing the blog or site. If not removed, the page is likely to incur a site-wide penalty because of the Panda algorithm. Listed below are the different symptoms of duplication along with their strategic solutions.

  • Thin Content: Refers to little distinct or unique text on a page. In short, the page lacks original text or images. Therefore, you need at least 300+ words (not a standard) of unique content on any Web page, as anything lower than this increases the chance of being marked as highly similar to other pages, triggering the duplicate content filter. However, you might have a few videos and infographics from other portals. For video posts, consider adding a transcription or some paragraphs summarizing the content.

Call to Action: Write 300+ words of original, creative content on each page.

  • Boilerplate Content: Refers to content replicated across several pages. The best example would be eCommerce sites whose product pages convey shipping details and returns policies in tabbed panes. In this case, you will need to include unique content in the product description or review to prevent penalties for duplication.

Call to Action: Shift that boilerplate content to separate pages and include a link to find the information. Even a pop-up window will solve the problem if you do not wish to have separate pages; alternatively, reduce the volume by using bullet points.

  • Junk Pages: You might think you have none, but that might not be so. For example, you might have a different URL on each product page for accepting feedback: domain.com/product1/fdb and domain.com/product2/fdb. Here, the feedback form is the same, but the URL generated on each page is different. That means 500 feedback pages for 500 products, which is nothing but duplication. To solve this, add a noindex tag under the <head> tag to tell search engines not to index the page: <meta name="robots" content="noindex, follow">. The 'follow' value conveys that, despite not indexing the page, the search engine can still crawl it and index any links it contains.
  • Incorrectly Configured Sorting URLs: Most sites offer an option to sort products by attributes such as color, popularity, and price. Although this is a cool functionality, it can lead to duplicate content if not configured correctly, as each parameter can trigger a different page for indexing, for instance, category?orderby=color and category?orderby=price. The solution is to set the canonical URL with a specific meta tag, <link rel="canonical" href="http://www.domain.com/category"/>, under the <head> tag of each page. Doing so tells the search engine that it should not index this page version and that /category is the canonical version.

Call to Action: Do not allow indexing for pages that do not offer any value in the search results.
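
Putting the two bullets above together, here is a minimal sketch of what the <head> of such a page could contain. Note that domain.com and /category are placeholder URLs carried over from the examples above; adapt them to your own site, and in practice a given page normally uses one tag or the other, not both.

```html
<head>
  <!-- Junk pages (e.g. per-product feedback forms): keep them out of the
       index, but let crawlers still follow the links they contain. -->
  <meta name="robots" content="noindex, follow">

  <!-- Sorting/parameter URLs: point every variant (?orderby=color,
       ?orderby=price, ...) at the one canonical page. -->
  <link rel="canonical" href="http://www.domain.com/category"/>
</head>
```

Use noindex for pages that should never appear in search results at all, and rel="canonical" for parameter variants of a page that should appear once, under its clean URL.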

  • Multiple Taxonomies: Taxonomy refers to the manner in which your content is categorized, by parameters such as tags and dates. This, again, can lead to duplicate content. You can solve the problem in two ways: either use only one type of taxonomy for sorting and do not worry about date archives or tags, or choose a main taxonomy and apply noindex to the rest. The second option is preferable, as it still lets your visitors navigate in different ways.

Call to Action: Choose a single taxonomy and do not allow the rest to get indexed.
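
As a sketch of the second option: assuming tags are your chosen main taxonomy, a secondary archive page (here, a date archive) would carry the noindex tag in its <head>, while tag archive pages would omit it.

```html
<!-- On date-archive pages only: visitors can still browse by date,
     but search engines will not index these duplicate listings. -->
<meta name="robots" content="noindex, follow">
```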

  • WWW and non-WWW URLs: It is fine to use either www.domain.com or domain.com, as 'www' is not mandatory. However, using both results in double URLs in search results, as search engines treat domain.com and www.domain.com as different URLs. Therefore, you need to go to Google Webmaster Tools and convey your preferred domain to the search engine.

Call to Action: On the main Webmaster Tools page, click the cogwheel at the top right, then Site Settings. Now, select your preferred domain in the Preferred Domain section.
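
A canonical link can reinforce the Webmaster Tools setting at the page level. Assuming www.domain.com is your preferred host (domain.com is a placeholder), each page can declare it:

```html
<!-- Served on both http://domain.com/page and http://www.domain.com/page:
     both variants declare the www version as the canonical URL. -->
<link rel="canonical" href="http://www.domain.com/page"/>
```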

Conclusion

The above calls to action are bound to improve your content as well as your SEO outcomes. Implemented, they give your blog or site the best possible chance of being Google-friendly in a seamless manner.
