24 August 2009

How building your website correctly from the outset will continue to benefit you now

With all the changes that have been happening lately in the world of SEO - Google buying YouTube, the Google Caffeine algorithm upgrade and the Bing and Yahoo search deal - it is becoming increasingly difficult to keep up and stay ahead when it comes to optimising your website.

In this day and age, having a website no longer means simply having a site designed, built and uploaded and leaving it at that. Internet marketing has grown exponentially in recent times, and in order to take full advantage of this trend and reap the associated benefits, you need a fully functional, well-built website before you can even think of embarking on any form of optimisation to gain rankings and attract the right kind of traffic.

The basics of building websites still count for something

That said, if your website is well indexed and was built correctly from the word go, or has been kept up to date with the latest build recommendations, you are well placed to move forward with your intended optimisation and capture your market share.

Sites built in frames or framesets, Flash and/or tables are considered by web authorities (W3Schools and others) to be outdated and in some instances deprecated, meaning they are no longer supported in XHTML 1.0. Such sites are also poorly crawled, if at all, by search engine bots and will be left behind in the race for rankings.
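
If you are unsure whether an older site still carries this legacy markup, a quick scan for the offending tags is easy to automate. Below is a minimal Python sketch using the standard library's html.parser; the file name index.html is a placeholder for whichever page you want to inspect.

    from html.parser import HTMLParser

    # Tags dropped or deprecated in XHTML 1.0 Strict.
    DEPRECATED = {"frameset", "frame", "noframes", "font", "center"}

    class DeprecatedTagFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.found = set()

        def handle_starttag(self, tag, attrs):
            if tag in DEPRECATED:
                self.found.add(tag)

    finder = DeprecatedTagFinder()
    with open("index.html") as f:  # placeholder path
        finder.feed(f.read())
    print("Deprecated tags found:", ", ".join(sorted(finder.found)) or "none")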

Domain Age is a factor in Search Engine Optimization

Although it may not be a defining factor, domain age still counts, and for two reasons:


  1. The age of a domain signals longevity, to which Google gives higher relevancy. Search engines generally tend to put more trust in sites that have been around longer.

  2. The older the site, the more time it has had to build up links from other sites.

Where to from here

There are many things that could be suggested. For example, you could consider rebuilding your site if it has not been built correctly. It might also be a good idea to consider taking over old domains in order to benefit from their age, whether your current domain is not well liked or you are just starting out. This is especially true if the domain is relevant and the site is built with high-quality content.

What else needs to be considered

Before Google Caffeine kicks in, take the time to test the search keywords and phrases you wish your site to be found under, and see where you are listed on both www.google.co.za and www2.sandbox.google.com. Then compare the results and make a decision from there.
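
To keep that comparison honest, note down the result URLs you see on each frontend and let a small script report where your domain sits. A minimal Python sketch follows; the example.co.za domain and the URL lists are purely hypothetical.

    def position_of(domain, result_urls):
        # Return the 1-based rank of the first result containing `domain`.
        for rank, url in enumerate(result_urls, start=1):
            if domain in url:
                return rank
        return None

    # Hypothetical results copied by hand from the two frontends:
    current = ["http://rival.co.za/", "http://example.co.za/widgets", "http://other.com/"]
    sandbox = ["http://example.co.za/widgets", "http://rival.co.za/"]

    print("google.co.za rank:", position_of("example.co.za", current))  # -> 2
    print("sandbox rank:", position_of("example.co.za", sandbox))       # -> 1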

Web 2.0 sites and those with user-generated content are now also considered the way to go, and something to keep in mind if you plan on taking full advantage of the upcoming changes and want your website to perform at its best.

Visit the following sites for more information:

Domain Age http://www.a1articles.com/article_1036559_81.html

Google Caffeine http://www.seobook.com/google-caffeine

SEO Basics

http://www.deondesigns.ca/blog/learn-the-basics-of-seo-before-you-try-anything-fancy/

http://www.scribd.com/doc/18752155/Basics-of-SEO

10 June 2009

So a search engine likes my site - Now what?

This pattern is fast becoming a trend in the realm of search engine optimisation: website owners are finding that they have plenty of visitors coming to their sites, all being pushed there by SEO techniques, Google AdWords and other pay-per-click traffic generation methods.

The sad part is that all these extra visitors do not necessarily translate into extra enquiries and sales at the same rate at which traffic to the site has increased.

The answer to this perplexing conundrum is actually fairly simple: Search Engines Are Not People.

Search engines may love your site: it may rank incredibly well in Google and other prominent search engines and get excellent keyword reach and relevancy from all the articles and content you post, but at the end of the day a search engine is not going to enquire about your latest and greatest product. So how do we get around the problem?

There are some very simple guidelines for designing your landing pages which will increase the chances of real people liking what they see and actually engaging further with you:

  1. Keep the page simple. People have come through to that page for a specific reason and are looking for a specific thing. Don't complicate the page with tons of other "stuff"; keep the info and required actions on the page relevant to what they were searching for.
  2. Ensure that the visitor to the page is 100% sure about what it is that you need them to do on that page. If they need to send you a mail – tell them! And tell them a few times for the sake of clarity and ease of use. The worst thing you can have a user do is get to your landing page and say to himself, “now what do I do?”!
  3. Test multiple landing page designs. Don't assume that the first design you do is the correct one. Try a few designs, see which one gets you the best conversion results, and then stick to that one (see the sketch after this list). Be careful of making too many changes at the same time, because then you will not be able to judge which changes worked.
  4. Tell your visitor where they are. They have just come through to you through a search engine which told them that you were the right people to speak to; now reinforce that with some compelling copy that highlights what it is that they were searching for.
  5. Once your visitor has completed the required action for that page, your site should direct them to a thank-you page which lets them know that their enquiry has been successfully submitted. The cardinal sin of most websites is that they leave the page at that and don't try to cross-sell or up-sell a visitor who is very clearly interested in your product offering. Use this page to give them more info or to offer them something else they would be interested in.
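
On point 3, the simplest way to split traffic between competing designs is to assign each visitor a variant deterministically, so that repeat visits see the same page, and then compare conversion rates per variant. A minimal Python sketch; the file names and figures are hypothetical.

    import hashlib

    VARIANTS = ["landing_a.html", "landing_b.html"]  # hypothetical designs

    def assign_variant(visitor_id):
        # Hash the visitor id so the same visitor always gets the same page.
        digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    def conversion_rate(conversions, visitors):
        return 100.0 * conversions / visitors if visitors else 0.0

    print(assign_variant("session-1234"))  # stable per visitor
    print(f"A: {conversion_rate(18, 600):.1f}%  B: {conversion_rate(31, 590):.1f}%")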

Following these guidelines will ensure that your landing pages attract not merely a multitude of visitors but a throng of active participants who are willing to engage with you and take the next step in the buying cycle.


04 June 2009

Bing has arrived in South Africa with local search

It has taken Microsoft many long years to realise that the reason no one liked MSN Search and Live Search was that there was no immediate way to filter out results from other countries. It used to be a case of "good luck finding an SA company". Microsoft has finally stepped up to the plate with Bing, a search engine that does indeed compete with Google in that it has an "only for South Africa" option. The majority of website users in South Africa are looking for local products and local sites, which is what has put Google ahead of the game for many years as the only major search engine to offer this option.

Microsoft has invested over $100-million in advertising for their "Decision Engine". As they try to grow beyond 8% of the US search market, my guess is that they will now see more of their market share coming from international users because of this local search feature.

So what is so different about Bing?

My first impression was not favourable. When I first logged onto Bing I was sure that I was staring at a Google clone.

  • The layout was the same except for Bing having a pretty interactive picture in the background.
  • The links all seemed to mirror the same options as Google
  • The search results (SERPs) were identical to Google's, with the standard blue links and green URL reference
  • The preferences and options are identical
  • The cached page option is identical
  • Windows Live is the same as iGoogle and Bing Maps is very similar to Google Maps
  • Microsoft Advertising, Microsoft's PPC platform, looks exactly like Google AdWords and sits in the same place on the page

At first I thought this was a mistake by Microsoft, but the more I thought about it, the more I realised that they really did not have a choice. There was no way to take on a legend like Google head-on, so, as the old saying goes, "if you can't beat them, join them". Microsoft took the Google shell that users are used to and tweaked it by adding slightly more functionality. Some very neat additional features are as follows:

  • The very cool extra-content pop-out, revealed by hovering to the right of an individual result. The great thing about this is that it also gives you a few of the target page's internal links.
  • The related search bar to the left of the results is quick and easy to access
  • Their image search is much better presented to the user
  • They have integrated Bing into MSN Search so that it searches Bing by default.
  • They have the very cool "Popular Now" link which shows the top searches (only on Bing.com, not Bing.co.za)

How will Bing affect SEO companies

The bottom line is that SEO companies will now have to cater for Bing just as much as they do for Google. They need to start using the tools that Bing offers, such as Bing's Webmaster Tools and Bing Maps. Microsoft is starting to cater for South Africa, so we will need to make sure we stay ahead of the game by trying out their product. The search results themselves differ: if your site is #1 on Google it may be #8 on Bing. The two search engines may look very similar, but behind the scenes they use different ranking algorithms, and SEO companies need to adjust to that.

Will Bing gain search market share in South Africa?

My answer is "yes". Many South Africans use MSN Messenger and other Microsoft programs that may make Bing their default. If South Africans can get the same sort of local results they usually get from Google, they may indeed embrace the concept.


22 May 2009

Organic Search versus Sponsored Search Results

Search engines such as Yahoo, Google, MSN and AltaVista generally deliver two types of search results: sponsored results and organic results. A sponsored or paid listing usually refers to one achieved through pay-per-click (PPC) search engine advertising such as AdWords. An organic listing, on the other hand, is achieved through search engine optimisation (SEO). These are the two most popular Internet marketing techniques. Both are "white hat" means of attracting a higher quantity and quality of traffic to a particular website, which is often the primary Internet marketing objective.

Through years of practice, experts in the field of Internet marketing have developed a number of techniques to improve a website's search engine ranking. These techniques are collectively known as search engine optimisation. Although each expert practises his or her own brand of SEO based on specific experience, some of the most effective SEO interventions include:

  • Keyword research
  • Keyword density analysis (see the sketch after this list)
  • Web page optimisation – title tags, meta tags, page URL and page content
  • Image optimisation
  • Submission to search engines and directories
  • Link building
  • Affiliate programs
  • Tracking and reporting
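
To illustrate the keyword density analysis mentioned in the list above: one common definition is the share of a page's words taken up by occurrences of the target phrase. A minimal Python sketch; definitions vary from tool to tool, so treat the number as a rough guide.

    import re

    def keyword_density(text, phrase):
        # Percentage of the words in `text` belonging to occurrences of `phrase`.
        words = re.findall(r"[a-z0-9']+", text.lower())
        target = phrase.lower().split()
        n = len(target)
        hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
        return 100.0 * hits * n / max(len(words), 1)

    sample = "Organic search delivers organic results; organic listings build trust."
    print(f"{keyword_density(sample, 'organic'):.1f}%")  # 3 of 9 words -> 33.3%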

Pay-per-click is based on the concept of purchasing advertising space; however, it differs from a traditional print advertisement in that the advertiser bids on a specific search phrase and is charged each time a visitor clicks through to their website. Adverts look almost exactly like organic search results, except that they are served as "sponsored links" and are ranked or listed according to how much the advertiser is willing to pay for the phrase.
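
The billing model is easy to reason about with a worked example; the figures below are invented purely for illustration.

    clicks = 400           # visitors who clicked the advert this month
    cost_per_click = 5.00  # hypothetical winning bid of R5.00 per click
    enquiries = 12         # conversions those clicks produced

    spend = clicks * cost_per_click       # R2,000.00 total for the month
    cost_per_enquiry = spend / enquiries  # roughly R166.67 per enquiry
    print(f"Spend: R{spend:.2f}, cost per enquiry: R{cost_per_enquiry:.2f}")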

Why “Organic”?

Organic search or natural search is the term used to describe an algorithmic search engine ranking; in other words, a ranking achieved by satisfying a search engine's algorithmic criteria. Without getting into too much detail about the ins and outs of search engine technology, it bears mentioning that search engines deliver results based on a set of criteria which extensive research tells them will satisfy their users' input queries.

People often ask how the word organic relates to search engine results. On the one hand, the dictionary definition of organic is: having properties associated with living organisms. On the other hand, the word brings to mind something nourishing, wholesome or healthy. Although many traditional marketers would disagree, one of a website's primary functions is to serve as a resource. Internet marketers know that resourceful websites which "feed" their users' interests and imaginations attract traffic and garner brand loyalty.

While SEO takes time (usually a few months) to take effect and involves structural improvements, it is also more cost-effective and more durable than other Internet marketing techniques. It is thus perceived as the "healthier" Internet marketing alternative. Another reason websites may be associated with living organisms is their tendency to change, grow and mature.

Organic Search vs. Pay-Per-Click

As mentioned above, both sponsored and organic search results are very effective ways of attracting traffic and optimising Internet marketing efforts. Both techniques are recommended, and neither method can be said to outperform the other: a generalisation in this regard would fail to take into account internal and external environmental factors such as industry performance, competitors, budget and marketing objectives, to name a few.

The major differences between the two methods come down to time and cost. PPC allows you to get to the top quickly, whilst attaining the same results organically can take considerably longer. The cost of a PPC campaign will depend on the demand for your search phrase or keywords: the more popular the phrase, the more costly the campaign is likely to be, which in turn affects the campaign's sustainability. Although SEO may take longer to implement effectively, its results will be sustained over a longer period of time.


13 May 2009

Recession: Why Some Industries like Internet Marketing continue to thrive

A recession is defined as a decline in GDP for two or more consecutive quarters. It is a period of general economic decline characterised by rising unemployment, lower reported company profits and a drop in household spending. The current global recession is said to have begun with a crisis in the US housing industry at the end of 2007. The recession spread quickly, plunging an integrated global economy into possibly the worst financial crisis since the Great Depression of the 1930s.

Many sectors have been hit hard by this recession, including the construction, financial and automotive industries. In fact, industries across the globe have experienced significant declines in demand as a result of the recession. Worldwide, bankruptcy, poverty, joblessness and crime are on the rise; for some, however, the global economic downturn tells another story.

Certain industries, it seems, are recession resistant. Despite the general decline being experienced across sectors, some companies are thriving in these trying economic times. Two such industries are open source software and Internet marketing.

The use and development of open source software seems to be coming into its own. The recession has caused many corporations to reconsider their software options, giving preference to more cost-effective solutions. Companies such as RedHat, SugarCRM and Vyatta continue to grow whilst the technology sector in general is declining.

With Linux based software gaining popularity even Microsoft is including open source components in their software packages. Microsoft spokesman Sam Ramji says "The bulk of the opportunity is open source applications on Windows, but these things are all coming together. We are really bullish on being a platform for some of this innovation as customers want more functionality out of their hardware and software. We are becoming a conduit for other technologies."

Internet marketing is another industry in which growth figures continue to climb steadily. Many organisations faced with budget cuts say they will continue and even increase spend on Internet marketing. This is not surprising in light of the results of a 2007 survey in which 81% of marketers surveyed said that their social media spending would meet or exceed their traditional advertising spending within the next five years.

Faced with tight budgets, marketers are pressed to look for the best value and to employ innovative marketing techniques such as those offered in Internet marketing packages. Search engine optimisation, e-mail marketing, affiliate marketing, pay-per-click and social media optimisation are measurable, efficient and cost-effective.

It stands to reason that industries which serve basic needs such as food, education, healthcare and energy will withstand recessionary pressure due to the low income elasticity of demand. Businesses which offer great value on vital products and services will appeal to price-sensitive buyers. Likewise, in the B2B arena, enterprises which cater to their clients' basic operational needs in a cost-effective manner will thrive in the cash-strapped recessionary climate.

29 April 2009

Google Enhancements Place More Emphasis on Content

The latest improvements to Google's algorithm place greater emphasis on quality content. Ori Allon, who was responsible for developing the breakthrough Orion algorithm, blogs about the integration of new technology which greatly enhances users' search experience.

Relevant content has always attracted excellent search engine rankings. Content which is interesting, relevant and frequently updated ensures that web pages are indexed and served in search engine results pages. Content is therefore every website's foremost SEO concern. Last week, the search engine leader Google announced two improvements to its search engine results pages, both of which place even more emphasis on relevant content.

The two improvements which have caused so much excitement in the Internet marketing industry are an expanded list of useful related searches and the addition of longer search result descriptions. Now, this may not sound like much to the average marketer or business owner, but to Internet marketers across the globe these seemingly minute details have great implications.

The announcement came from the desks of Google's Ori Allon, technical lead on the Search Quality team, and Ken Wilder, Snippets Team engineer. As a research student in Australia, Ori Allon developed the Orion search engine, which was considered "revolutionary" for its ability to deliver search results not only for a specific keyword but also for phrases related to the key search term. Google purchased the rights to the algorithm from Allon in 2006 before hiring him to work at their California headquarters.

In a 2005 interview Allon was quoted as saying "By displaying results to other associated key words directly related to your search topic, you gain additional pertinent information that you might not have originally conceived, thus offering an expert search without having an expert's knowledge." At the time even Bill Gates agreed that Orion would have far-reaching implications for search, saying "We need to take the search way beyond how people think of it today. We believe that Orion™ will do that."

Related search results appear at the bottom of the search engine results page. According to Allon and Wilder, the technology deployed last week allows the search engine to "better understand concepts and associations related to your search". This is the height of intuitive web technology: search results are now even more relevant to input queries, serving up not only what you need to know but what you did not even know to ask for. The second enhancement applies to longer search queries, whose results now include longer, more detailed descriptions. Together, the two improvements are an indication of Google's strong focus on quality results.

With this move Google have once again placed themselves way ahead of the competition. I predict it will be a long time before other search engines catch up; in the meantime Google will continue to gain market share. Once again looking at this from an Internet marketer's point of view, my advice to companies whose businesses depend on sales generated through Internet marketing is to invest in content and reap the long-term SEO benefits that it delivers.

16 April 2009

Blogs and Blog Topics

Bloggers often wonder which topics they should or should not include in their blogs. Many bloggers have interests and views they would like to discuss on their blogs but are not sure how doing so will affect their search engine ranking and readership. The fact is that a large amount of unique content will attract a higher ranking, and the variety of topics covered will strengthen the domain, making it easier to rank well for new keywords. However, this move may cost the blogger his or her readers.

What makes a great blog?

Five qualities make for a great blog, and content counts as the first three: content, content, content, interactivity and syndication will ensure successful blogging. Readers will visit your blog, return to it, reference you and subscribe to your feeds as long as you provide content which is interesting, informative and relevant. That established, it stands to reason that bloggers' trepidation regarding the selection and presentation of blog topics is completely justified.

Topic selection


Topic selection and the leeway you allow yourself with regards to relevance will depend on the blogs purpose. These days, blogs are commonly found on commercial and corporate websites in-which the author discusses topics related to their products, services, industry and organisation.

In this case, delving into related topics is interesting and adds spice to the content; however, straying too far from the main topic will cause readers to lose interest. Chances are you will also attract a great deal of irrelevant traffic, skewing your analytics and conversion ratio. The point is that content should always be written with the target audience in mind.

In the case of personal or social blogs, the author is often looking to attract the interest of a wide circle of like-minded individuals. Although this model allows more room to express views on a wider variety of topics, it is important to keep the discussion general and not get too specific. This type of blog benefits from a diverse cross-section of incoming links driving its search engine ranking upwards, whilst at the same time keeping readers enthralled by the shifting content.

The niche blog, on the other hand, focuses on a very specific topic and usually contains a number of advertisements. Straying from the main topic of a niche blog would be counterproductive. I have, however, come across a number of content portals containing several niche blogs in one website. These sites obviously require a great deal of resources to maintain and may be better served as several separate blogs, each with its own topic.

In answer to the question of whether one blog with many topics is better than many blogs with a single topic each, I would reiterate that the number of topics a blog includes should depend on the nature of the blog and its purpose.

Also, including more topics in a blatant effort to improve search engine optimisation may be transparent to regular readers. It is a good idea to consider topic selection carefully and to test the waters by gauging readers' reactions to changes in topic.

20 March 2009

Time to look at gearing websites for mobile access

originally published by BusinessDay on 24/03/2009

Smaller businesses need to start gearing up their websites for mobile access to accommodate the growing number of people that have internet capable mobile phones and want to access information while on the move.

In February, the GSM Association announced the four-billionth mobile phone connection and predicted that this will increase to six billion connections by 2013; an increasing number of the cellular devices being sold have internet capabilities.

Emerging markets like SA and the rest of Africa have vast potential for mobile internet. As far back as August 2006, the BBC carried a story on its website in which it said that of the international users that accessed its website with an internet capable mobile phone, 61% were based in Nigeria and 19% in SA. It will become increasingly attractive to access web content from a mobile phone as the internet capabilities of these devices become more advanced.

Juniper Research says by 2013, 23% of all new cell phones will be smart phones, which allow internet browsing.


Global research company In-Stat forecasts more than 300 million Wi-Fi-enabled mobile phones will be sold in 2012 and the number of mobile phones that are capable of receiving video content will exceed half a billion by then.

"Some smart phones already allow high resolution pictures to be taken and uploaded directly to the web," says Ceri James, sales and marketing director at JD Internet Consulting .


He says companies like Amazon.com and the BBC have mobile-friendly websites, and Radio 702 and Makro are among the local companies that do too. Some mobile-friendly websites automatically detect that the user is on a mobile device, but with websites like Facebook users are required to access a specific address - in this case m.facebook.com, says James.

A mobile browser can be downloaded free from www.opera.com/mini/download/, allowing content from any website to be accessed in a compressed format; users can zoom into any specific area.

Companies need to understand how important it is for their customers and potential customers to access their website from a mobile device and what kind of content they are most likely to be browsing for.

Most websites cannot be accessed effectively by mobile users, in many cases because they have been created in different sections containing different content. When these web pages are viewed on a mobile device, all the frames of content are squashed into one cramped page.

More recently designed websites use techniques that overcome this problem. Websites containing a lot of high-resolution graphics present another major problem for mobile users, because they slow down response times and chew up bandwidth when users try to access information.

"Mobile users want to be able to search for and retrieve information quickly and view it clearly on the screen."

To allow for better mobile access, all the unnecessary rich content needs to be stripped out and graphic images reduced to one to two kilobytes.
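
For the images, a batch resize-and-recompress pass gets file sizes down dramatically. A minimal sketch using the Pillow imaging library (my assumption; any image tool will do), with placeholder file names:

    from PIL import Image  # assumes the Pillow package is installed

    def shrink_for_mobile(src, dest, max_px=120, quality=40):
        # Scale the longest side down to max_px and save as a heavily
        # compressed JPEG - small enough for slow mobile connections.
        img = Image.open(src)
        img.thumbnail((max_px, max_px))
        img.convert("RGB").save(dest, "JPEG", quality=quality, optimize=True)

    shrink_for_mobile("product_large.jpg", "product_mobile.jpg")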

"If you browse on Amazon.com with a mobile phone, all the images are small, but are of a high enough resolution to view them properly." Companies can set up specific web pages that contain a light version of the same content and a process that recognises when a mobile user is browsing and diverts them to these pages from the same website address.

However, this will require keeping the mobile-friendly web pages in sync with the main website, which could become a major issue if more than one person is posting content. It will also result in duplicate content on the website, which search engines do not like and which might affect the website's rating.

A better alternative is to build intelligence into the website that will recognise when users are browsing with a mobile phone and optimise the content and the way it is presented to them, which is becoming the trend. "This will involve a few hours of additional development to an existing website."
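
That recognition step usually comes down to inspecting the browser's User-Agent header. Below is a minimal sketch as a Python WSGI application; the hint list and template file names are illustrative rather than exhaustive.

    MOBILE_HINTS = ("mobile", "opera mini", "symbian", "blackberry", "windows ce")

    def is_mobile(user_agent):
        ua = user_agent.lower()
        return any(hint in ua for hint in MOBILE_HINTS)

    def app(environ, start_response):
        # Serve a lightweight template to mobile browsers, the full page otherwise.
        page = "mobile.html" if is_mobile(environ.get("HTTP_USER_AGENT", "")) else "index.html"
        with open(page, "rb") as f:
            body = f.read()
        start_response("200 OK", [("Content-Type", "text/html")])
        return [body]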

Information on the recently introduced international development standard for mobile website content, XHTML, can be viewed at www.w3.org.

"It is possible to have a mix of XTML and HTML generated content on a website."

Companies can check how mobile-friendly their own or any other website is against the standard by visiting validator.w3.org/mobile/, says James.

Google has developed an internet tool that allows users to select a mobile view of websites when searching for information; more information can be found at www.google.com/pda.

Tools like Google Analytics will provide a detailed breakdown of browser types and screen resolutions. James says that when gearing a website for mobile access it is important to work with a web development company that understands the specific needs of these users.

13 March 2009

What You Should Pay for Search Engine Optimisation

Search engine optimisation (SEO) is becoming an important marketing discipline in South Africa. Many organisations are beginning to recognise the value of Internet marketing and are allocating their marketing budgets to online media. A mushrooming of new entrants to the Internet marketing industry means that there are many companies offering search engine optimisation services, some at very competitive rates.

As a result, when website custodians or owners attempt to ascertain whether their websites require SEO and what the possible costs will be, they are faced with an overwhelming variety of service offerings and widely diverse cost structures. One reason costs differ so vastly is the nature of the Internet marketing industry: SEO is very much a natural progression for website development firms wishing to grow their business and expand their service offerings.

On the one hand, SEO services are often provided by companies which established themselves as website hosting and development agencies over the last decade. The larger, more reputable organisations feel comfortable contracting with an established SEO service provider: they feel a greater sense of security in knowing that the company has been around the block a few times, and they are willing to pay a premium for a provider with a proven track record and continuity of service.

On the other hand there are a number of newcomers to the field who operate with little infrastructure and low overheads. Many of these companies are run by highly skilled SEO practitioners who are quite capable of delivering great results at a fraction of the cost.

Whilst both the larger, established agencies and the smaller up-and-coming practices present great solutions, many marketers struggle with the choice and want to know what they should be paying for SEO. To compound the problem, the industry does not have a standard costing structure: many charge per hour, others charge on a project-by-project basis, and still others opt for a share of profits. It cannot be said that "x" amount per hour is the going rate; that would be a gross oversimplification. Rather, as with almost every marketing decision, the costs should be considered in relation to the benefits.

Search Engine Optimisation – A Cost/Benefit Analysis

A few guidelines can be used to determine the value an SEO campaign would have for your business. The following list is not comprehensive, as each business's needs will differ, but these principles should assist with the decision-making process.

Get your house in order:
Should there be problems elsewhere in your business, it would be better to have those resolved before investing in an SEO campaign. If your website design and development are good, your operations are in order and your products and services have been developed to an acceptable standard, you are ready for SEO.

Goals and Objectives:

What you should pay for search engine optimisation will depend on your Internet marketing objectives. If your goals are very ambitious, you will probably require an aggressive strategy and should be willing to spend more to meet your objectives. It is important to set attainable goals and to know what they equate to in monetary terms. Then ask yourself what percentage of the expected return you are willing to spend to realise those goals.

Maintain a healthy development to IM ratio:

You should be prepared to spend three times what you spent on initial website design and development on Internet marketing initiatives designed to attract traffic, raise search engine rankings, increase conversions and ensure return visits. A great website alone will not lead to growth in market share. In any given period, development should thus account for roughly one quarter of your total web budget; for example, a R25,000 website build warrants in the region of R75,000 in Internet marketing spend over the same period.

Choosing an SEO partner:
Other considerations which impact the cost of search engine optimisation include the size of the firm you are dealing with, its years of experience and the demand for its services. A more reputable and experienced company's rates may be higher than those of a smaller new entrant. It is important to bear in mind that Internet marketing is a process: it takes time, and you should feel comfortable in the knowledge that you will be engaged in a long-term relationship with your SEO service provider.

Industry specific factors:

Factors pertaining specifically to your business and website will also have an impact on the cost. The size and complexity of the website, the demand for your brand and competitiveness of your industry will all impact on the cost, time and nature of the project.

Having taken all the above into consideration, you should have a fairly good idea of what you can afford to pay for search engine optimisation, and with the number of service providers in this industry you should find one who meets the needs of your business. Finally, your SEO service provider should offer transparent reporting on your website's current performance, the SEO measures taken and their impact. This will allow you to monitor your campaign and make the necessary adjustments.


27 February 2009

Crowdsourcing - How the Consumer is Producing the Product They Want.

What is crowdsourcing?

Crowdsourcing is a novel concept in problem solving. It is based on the idea that (paraphrasing Wikipedia) tasks can be outsourced to an undefined and generally large group of people or community in the form of an open call. It takes cognizance of the fact that many minds can accomplish more than a few and uses integrated media to tap into the intellects, creativity and preferences of the public.

The term is relatively new, coined by Wired's Jeff Howe in 2006. Howe puts forth a second definition of crowdsourcing, describing it as "the application of open-source principles outside the field of software".

Crowdsourcing in action

Although the term may be new, the principle has been in use for some time. In fact, we have all seen crowdsourcing in action, whether we were aware of it or not. To cite an example, consider the process followed in 1994 in the implementation of the new South African flag, in which a call for designs invited the general public to participate in the design process, thereby ensuring their buy-in. That was an older form of crowdsourcing.

Popular reality television programmes such as The Apprentice and Idols, where viewers are invited to vote for their favourite candidate, are highly publicised recent examples of crowdsourcing. Although Wikipedia co-founder Jimmy Wales objects to the term, the collaborative free-content encyclopaedia is another example of a successfully crowdsourced initiative.

A particularly noteworthy recent application of crowdsourcing was the use of Internet driven participatory politics in American President Barack Obama’s presidential campaign. The Obama campaign made brilliant use of new media to not only communicate regularly with voters but also to gain insight into what people were thinking, what their concerns were and ultimately what it would take to win their votes.

It seems as though Obama's term in office will maintain the high level of interaction the American public has come to expect, with other politicians following suit. N.Y. Public Advocate Mark Green recently called on New Yorkers to aid him in setting his agenda, saying: "Contact me at MarkGreen.com with 20 ways to fix our City so I can learn as I campaign".

These are just a few of the larger instances in which crowdsourcing lends itself to massive self-promotion, giving the organisation, candidate or initiative widespread exposure.

Crowdsourcing as a business model

Crowdsourcing as a business model is used across various industries and sectors. It is also applied as a public problem solving model and has been used effectively in the design and development of products as diverse as software, pharmaceuticals, clothing and online stock-photo and map repositories.

Some of the perceived benefits of crowdsourcing include low costs, short lead times and increased customer input as well as buy in.

Ultimately the success of the crowdsourcing model can be attributed to the combined application of a few basic business and marketing principles:

  • It places great emphasis on research and development – a key facet of the competitive marketing strategy
  • It is an innovative research and development solution, using new media to improve reach and shorten lengthy research times
  • It is highly responsive to customer needs
  • It is interactive and invites participation from the target market
  • It uses an integrated media approach, blending online and offline media
  • It promotes brand building through increased exposure and the use of publicity.

The model is quite controversial and has many vocal critics; however, the customer-centric nature of crowdsourcing affords consumers a new level of involvement in the product development process.

11 February 2009

Google's Gmail Slams IE6 – Have They Gone Too Far?

Google recently discouraged users of their popular Gmail service from using Microsoft's Internet Explorer 6 (IE6). They seem to favour and encourage the use of alternative browsers such as Mozilla's Firefox and Google's own recently launched Chrome.

It stands to reason that Google will promote the use of their newly developed Chrome but have they really gone too far in slamming IE6? Most people to whom this question is posed will shout an emphatic “absolutely not”. I’ll leave it to you to decide for yourself.

Gmail users accessing their account via an IE6 browser are greeted with a message saying "Get Faster Gmail". The link leads to a page which provides numerous browser alternatives for different operating systems, including Chrome and Firefox 3 amongst others. At the same time, claims that Chrome and Firefox are "twice as fast" offer many users a compelling reason to switch. Apparently IE6 also fails to run certain Gmail features, earning it the label "Unsupported Browser".

Google's obvious endorsement of browsers other than IE6 led to quite a flurry of online comment, most of it in support of the move. For years developers have complained that IE6 fails to support numerous web applications. Problems with the browser include its poor handling of style sheets and its security vulnerabilities.

It has become standard practice for developers to create web pages for Firefox and other advanced browsers then spend hours backtracking to tweak each page for IE6 compliance. Developers complain endlessly about the sheer absurdity of this practice not to mention the inefficiency. It makes sense for developers to favor browsers which comply with web standards and support fast JavaScript engines.

The launch of Firefox was welcomed by the development community, with exposure to the browser gaining momentum in 2007. Word spread and Firefox quickly became the web community's preferred browser. Although IE is losing market share to Firefox, Microsoft still retains a large portion of the browser market. In my opinion there are two key factors contributing to IE's stickiness: corporate aversion to change, and the fact that IE comes standard with Windows.

Small business and home users of IE6 stay with the browser for lack of information: they are not aware of its shortcomings or of the alternatives. Corporations, on the other hand, even those aware of the alternatives and their comparative benefits, stay with the browser they have been using rather than incur switching costs and risk breaking backward compatibility. As long as IE6 holds its market share, developers will be faced with cross-browser compatibility issues.

Ultimately Google, the website development community and the W3C are slamming IE6 and Microsoft for their lackadaisical attitude towards progress. While other industry role-players are hungry for innovation, pushing the envelope and continuously testing the web's capabilities, the software giant seems to have contracted the destructive virus prevalent amongst organisations whose products dominate the technological landscape: indifference. In such cases there's nothing like some healthy competition to awaken the beast from its slumber; perhaps coming head to head with a worthy opponent will jar them into action.

Download Firefox Now
Download Google Chrome
Download Safari for Apple users
Flock for social media
IE7+ if you must

23 January 2009

Dealing with Duplicate Content: A Practical Guide

The Internet thankfully is still all about great content. Despite all the new marketing disciplines (SEO, SEM, SMO, SMM, ORM and IM) punting guaranteed search results, search engines serve appropriate results rich in content, thus keeping the system honest and ensuring their own survival.

The issue of duplicate content is something which mystifies, even plagues, many Internet marketers. Every Internet marketer fears a drop in their website's search engine ranking, and duplicate content could cause exactly that, or worse: being blacklisted and having penalties instituted against you could deal a devastating blow to your search engine marketing efforts and prove very hard to recover from.

Before getting into ways of preventing search engines from perceiving your content as duplicated, it is necessary to understand that there are different types of duplicate content. Some occurrences are unintentional, whilst others are due to malicious and unimaginative "black-hat" marketing tactics.

Google defines duplicate content as: “substantive blocks of content within or across domains that either completely match other content or are appreciably similar.” They also acknowledge that “mostly, this is not deceptive in origin”.

Duplicate content can occur in the following ways:

Having two domains carry the same content. When an organisation decides, for geographical and/or other search reasons (including canonical domains), to use the same website content on two domains, it confuses search engine robots and can cause them not to list the site at all.

Using dynamic URLs. Database-driven websites, in which content is stored in a database and accessed on demand, may serve the same content under different URLs depending on the search parameters used. Search engines can view this as duplicate content.

Circular navigation, in which there are multiple paths to the same page, can also cause content to appear to be duplicated.

Providing printer-friendly pages. Users often require hard copies of the information on your website, so it is sometimes necessary to give them the option to print content. This can be done in one of two ways: you could host a printer-friendly copy (i.e. duplicate the content), or, if you are using a CMS, have the system supply the same content in both web and printer formats. Unfortunately, both approaches cause your content to appear to be duplicated.

Providing a mobile-friendly version of your website. With the number of Internet-capable mobile devices hitting the market recently, it is becoming essential to offer a very basic version of your website specifically for users browsing your site from a mobile phone. As in the case above, however, you run the risk of duplicating content.

Syndication: In cases where your content is syndicated to other websites, or where your articles and blog posts are submitted to content repositories, search engines are likely to pick up your content as a duplicate and possibly link to the more "popular" version of it.

E-commerce product descriptions: Common e-commerce products often have manufacturer descriptions which are reused when the products are retailed online, particularly in the case of affiliates. This too could be viewed as duplicate content and affect a website's ranking. The same applies in cases where products have multiple categorisations.

Plagiarism is presenting the written intellectual product of another person as your own without their permission. Copying online content is very easy to do and can be a quick fix for websites requiring content in a hurry. Although in most cases search engine crawlers can be relied on to select and serve a link to the originator of the content this unethical duplication of content could result in a loss of traffic.

Content Scraping refers to the use of a crawler to copy website content and rephrase it in an attempt to present it as unique content.

Preventing penalties

Whilst more advanced search engine crawlers do a good job of distinguishing between genuine cases of duplicate content and inadvertent instances, you can avoid being wrongfully penalised for duplicate content by doing the following:

Redirects: To ensure that search engines don't view multiple domains and dynamic URLs as duplicated content, select a canonical domain and use 301 redirects for permanently moved pages and 302 redirects for pages whose content is changed or updated often.
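
In practice the redirect usually lives in the web server's configuration, but the logic is simple enough to sketch as a Python WSGI application; www.example.com stands in for whichever domain you choose as canonical.

    CANONICAL_HOST = "www.example.com"  # hypothetical canonical domain

    def app(environ, start_response):
        path = environ.get("PATH_INFO", "/")
        if environ.get("HTTP_HOST", "") != CANONICAL_HOST:
            # Permanent redirect: search engines transfer the listing
            # (and most of the link value) to the canonical address.
            start_response("301 Moved Permanently",
                           [("Location", f"https://{CANONICAL_HOST}{path}")])
            return [b""]
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><body>Canonical content here.</body></html>"]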

Robots.txt: In the case of printer-friendly or mobile versions of your website, you need to indicate to the search engine that the information contained therein is not for crawling; use robots.txt to do so. You can also request that syndicated copies of your content be blocked using this method.
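
The file itself is just a few plain-text directives. Here is a sketch that writes one, with /print/ and /mobile/ as hypothetical directory names for the alternate versions of your pages.

    # Keep crawlers out of the duplicate print and mobile copies.
    lines = ["User-agent: *", "Disallow: /print/", "Disallow: /mobile/"]
    with open("robots.txt", "w") as f:
        f.write("\n".join(lines) + "\n")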

Style sheets: As an alternative to creating a duplicate of your web page for print purposes, you could write a dedicated print style sheet. Provided that your website is programmed well, this may be a viable alternative to using robots.txt.

Link back to your site: If blocking syndicated content using robots.txt is not an option for you, ensure that your articles contain links back to your website.

Categorise and enhance product descriptions: It is really important to categorise products carefully, taking users' browsing behaviour into consideration, rather than duplicating a single page under various categories. Manufacturers' product descriptions and reviews should be enhanced with additional unique content.

Use a content checker: You can detect duplicate content using online duplicate-content software; a proactive approach is required here.
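
Such a check need not be elaborate. One textbook approach is word shingling: break each page into overlapping five-word sequences and measure the overlap between the two sets. A minimal sketch; the 0.8 threshold and file names are arbitrary illustrations.

    def shingles(text, k=5):
        # All overlapping k-word sequences in the text.
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def similarity(a, b, k=5):
        # Jaccard overlap: 0.0 = nothing shared, 1.0 = identical.
        sa, sb = shingles(a, k), shingles(b, k)
        return len(sa & sb) / max(len(sa | sb), 1)

    page_a = open("page_a.txt").read()  # placeholder files
    page_b = open("page_b.txt").read()
    if similarity(page_a, page_b) > 0.8:
        print("Warning: these pages look like near-duplicates.")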

Claim ownership of your content: In cases where your content is continuously ripped off and you are convinced that it negatively impacts your search engine ranking, you can file a DMCA infringement request with search engines such as Google, MSN and Yahoo.

Knowledge of duplicate content issues and how to prevent them is invaluable when optimising your website for better search engine performance. Keep in mind that your website could already contain duplicated content, and take steps to ensure that preventative measures are implemented.

Check your site for duplicate content

www.copyscape.com
