
Part 3

(To view Part 1 go to: http://sourcewire.com/releases/rel_display.php?relid=33080&h...=)

(To view Part 2 go to: http://sourcewire.com/releases/rel_display.php?relid=33087&h...= )

If you would like a free PDF of this whitepaper, email me at gary@stickyeyes.com.

*************************************************************************************


16. Bump Your Competitor's Multiple Listings Out of Google and Pick Up a Position or Two

Ever wonder why, during a search, you find a competitor that has two pages listed above you? I call them kicker listings. The home page is always the second listing, and the first is an internal page that actually has relevant content.

Here is why this happens. When you submit a query, Google looks at the rank of each of your pages, and if two of them are close to each other in the results, it groups them together. If you are showing up in the first couple of pages of the SERPs, it is likely that you are listed again much deeper in the results. But when two pages rank close together, say both in the top ten or top twenty, Google shows them as a pair: the second, usually the index page, is listed below the first and indented.

The default number of results can be changed in 'advanced search', or you can edit the URL that Google shows after a search for your keyword: insert 'num=8&' directly after the 'search?' portion of the URL and the results will be more refined. Changing this number may change the results; if it doesn't, reduce the number further. This will show you where your competitor's second page should actually be.
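For example, with a hypothetical keyword, the edited URL would look something like this:

    http://www.google.com/search?num=8&q=your+keyword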

Okay, so now go back to the original search that showed the double listing. Within the search results, look where your competitor is showing up, then look below his listings for a non-competitor. It could be anything: a video, a news story, or a Wikipedia or eBay listing. Use the guide in Tip #11 to do some social bookmarking, or even link to the page from your website (preferably from a page in a second-level subdirectory).

What this will do is add a little boost to the non-competitive website and bump the 'kicker' listing that your competitor has back to where it belongs: below your listing.

This is surprisingly easy and quick using a combination of bookmarks and back links. It may even boost your trust rating with Google by having an outbound link to a high ranking website.

Using this method on eBay sometimes provides a double boost, because if it is an auction rather than a store item it may drop off the SERPs once the auction is over.


17. Automate XML Sitemaps

In the past you had to create several versions of your sitemap for the different search engine bots. They required these to properly crawl your website's content for indexing (inclusion in their results).

Since then, two major changes have been made.

a. A universal sitemap format was adopted: XML (this even includes Ask.com).
b. The bots were taught to check your robots.txt file first for a path to the XML file, so they know where to go. The robots.txt file also gained features that let you prevent the bots from crawling and indexing unnecessary files such as cPanel, administration or even image files.

You can specify the location of the Sitemap using a robots.txt file by simply adding the following line:

Sitemap: <sitemap_location>

The <sitemap_location> should be the complete URL to the Sitemap, such as: http://www.example.com/sitemap.xml

This directive is independent of the user-agent line, so it doesn't matter where you place it in your file. If you have a Sitemap index file, you can include the location of just that file; you don't need to list each individual Sitemap contained in the index file.
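Putting it together, a minimal robots.txt might look like this (the directory names here are hypothetical; use your own):

    User-agent: *
    Disallow: /admin/
    Disallow: /images/
    Sitemap: http://www.example.com/sitemap.xml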

There are lots of places that offer a free XML sitemap generator;

SourceForge.net
xml-sitemaps.com
auditmypc.com

I think GSoft also has an open-source program that will automatically create the XML sitemap and upload it via FTP once you set it up.

Because a properly optimised site has ever-changing content, and because of the many sites running CMSs (content management systems) and the millions of static sites out there, automation is the method that I recommend.

Additionally, fresh content will keep the robots coming back to index your site. Most of these programs will insert the creation date into the file to document that it is a revised version. The most important part, though, is that whenever you change anything within the site (and based on this article you may have a few changes to make), the change will be picked up by the engines automatically, without the time we used to spend expediting this process.
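As a rough illustration of what these generators produce, here is a minimal Python sketch that writes a sitemaps.org-format sitemap.xml (the URL list is hypothetical; a real generator would crawl the site or read the pages from your CMS):

    import datetime

    # Hypothetical page list; a real generator would discover these.
    urls = [
        "http://www.example.com/",
        "http://www.example.com/about.html",
    ]

    today = datetime.date.today().isoformat()

    entries = "\n".join(
        f"  <url>\n    <loc>{u}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
        for u in urls
    )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

    with open("sitemap.xml", "w") as f:
        f.write(sitemap)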

Part of the problem is that if you add navigation to this new content and put it in the site template, as I mentioned earlier, search engines will parse (remember) the template content and skip over it to preserve their allotment of data that they can crawl on each URL. They want to crawl deep and get as much content as possible, so they skip pages that provide no new content. (This is why multiple RSS feeds are important, as mentioned in Tip #13.)


18. Finding What Terms Are Converting Into Sales/Tracking Keywords to Conversion With Weighting

Having 100,000 unique visitors a day really doesn’t matter in the end if you aren’t getting any conversions (new members, info requests, sales).

Measuring successes and failures for landing pages, on-page content like CTAs, and especially keyword-to-sale performance produces some of the most important information that you can gather and use to improve and optimise your overall website.

Here are two scenarios to better illustrate this point;

I. Paid Advertising –

A car insurance company starts a paid advertising campaign on Google and after a week or so they see that the name of their company or their ‘brand’ seems to be converting the majority of their sales. Because of this discovery, they target the majority of their budget on their brand terms like ABC Insurance and ABC Insurance Company.

A week later they see that their CPA (cost per acquisition) has sky-rocketed almost two-fold and they can't figure out why. When they look at Google Analytics and other third-party tracking software, both say the same thing.

So why is this?

Let's take a look at the buying process (also called funnel tracking) to see where they went wrong. Mrs. INeedInsurance hopped online while enjoying her morning java to look for insurance, because last night, when Mr. INeedInsurance opened his renewal notice, he got a significant premium hike. At dinner they decided to start shopping around for insurance. Mrs. INeedInsurance searched 'car insurance' between 6-8am that day, going in and out of different companies' websites, learning what she was up against: tens of thousands of results.

So at work (11am-2pm is the #1 time people shop online, though not necessarily making purchases), Mrs. INeedInsurance, having learned a bit about search, decides to add her city to the query. This time she searches 'car insurance London' and still gets several thousand results, but at least they are localised, and there are a few that she recognises from the morning, so she goes in and fills out a few of the forms to get quotes. Throughout the rest of the day she gets the quotes, either immediately from the website or via email. Now she's getting somewhere.

Jump forward to after dinner that evening. Mr. INeedInsurance looks through the notes his wife brought home and decides that ABC Insurance offers the best deal for the money, then goes to Google, searches for ABC Insurance and makes the purchase.

See what happened here? I use this as an example because this is exactly what I identified for a client a few years back, and it ultimately led to changes that doubled their conversions.

The problem is that all the data pointed to ABC Insurance’s brand name as being the top converting term, so that’s where they concentrated the bulk of their budget. In actuality, ‘car insurance’ and then ‘car insurance London’ were the terms that actually led up to the sale.

The reason this is important for PPC campaigns, or any paid advertising, is that many platforms will allow you to do keyword weighting. This is where you increase or decrease your bids by a percentage according to day parting. Day parting means turning your ads up or down according to a timetable that you put in place.

In this instance I would turn my bids up to 125% on 'car insurance' and 'car insurance London' in the morning and afternoon, then down at night. On 'ABC Insurance' I would turn the bids down to 50% in the morning, and then back up to 125% in the evening.
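As a rough sketch of the idea (Python; the hours are hypothetical and the multipliers are the ones from the example above):

    def adjusted_bid(base_bid, hour, schedule):
        """Apply whichever day-parting multiplier covers this hour of the day."""
        for (start, end), multiplier in schedule.items():
            if start <= hour < end:
                return base_bid * multiplier
        return base_bid

    # 'car insurance' / 'car insurance London': up during the morning and
    # afternoon research window, down at night (0.75 is illustrative).
    research_schedule = {(6, 18): 1.25, (18, 24): 0.75}

    # 'ABC Insurance' (the brand term): down in the morning, up in the evening.
    brand_schedule = {(6, 12): 0.50, (18, 24): 1.25}

    print(adjusted_bid(1.00, 9, research_schedule))   # 1.25
    print(adjusted_bid(1.00, 20, brand_schedule))     # 1.25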

These platforms also allow you to track your weighted keywords through to conversion. A cookie is placed on the end-user's computer to track which keyword brought them to the site, which keyword resulted in a quote, and which keyword resulted in a sale.
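The first step in that chain, capturing the keyword that brought the visitor in, can be sketched server-side like this (Python; the extracted keyword would then be written into the tracking cookie):

    from urllib.parse import urlparse, parse_qs

    def keyword_from_referrer(referrer):
        """Pull the search keyword out of a Google referrer URL, if present."""
        parsed = urlparse(referrer)
        if "google" in parsed.netloc:
            return parse_qs(parsed.query).get("q", [None])[0]
        return None

    # keyword_from_referrer("http://www.google.com/search?q=car+insurance")
    # returns "car insurance"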

This is beneficial because I can further adjust my bidding strategies according to demographics and geographical metrics.

With these cookies I can also measure and establish the LTV (lifetime value) of the average customer. This allows me to adjust the conversion value, which in turn allows me to go back to my company/client and potentially get a higher advertising budget.

Using this same insurance company as an example; initially they gave me a conversion value of $25. Now, since we were able to identify other sales made by this customer, the conversion value is $40.

Offline, this company spends 100,000 on advertising through different venues, acquiring customers at an average cost of £/$56. Guess what happened the next month? They increased the budget by 100,000.
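To make the arithmetic explicit, here is a quick sketch using the figures above (currency units omitted, as in the original):

    initial_value = 25.0   # conversion value before the LTV analysis
    ltv_value = 40.0       # conversion value once repeat sales are counted
    offline_cpa = 56.0     # the offline benchmark: average cost per acquisition

    uplift = (ltv_value - initial_value) / initial_value
    print(f"Justifiable online spend per conversion rises by {uplift:.0%}")  # 60%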

II. Organic Advertising –

Same scenario as above, except ABC Insurance Company identifies through log files or Google Analytics that its top converting keyword, the one getting the sales, is 'car insurance'.

In light of this, the decision maker decides to create a fully optimised landing page so that the relevancy grade that all three search engines use will increase their organic positions, which it will.

The problem here is that the term that was actually bringing them to the website to buy was ‘cheap car insurance’. If they had identified this they could have built the page around the term, ‘cheap car insurance’ rather than just ‘car insurance’. This would have served double-duty and acted as a great landing page for both keyword phrases.

This is why tracking your keywords to conversion is so important. It can save thousands on paid advertising and identify the actual keyword phrases that need pages built around them to improve organic rankings.

If you are experiencing a high bounce rate or what you feel is high cart abandonment, you might be surprised to find that many didn’t buy elsewhere; they actually came back to you and bought.

This is also helpful in refining your stats. Rather than showing this customer as three separate visitors, it identifies (through the cookies) that they were actually just one visitor, and the bounce rate or cart abandonment is significantly reduced.
This information can be invaluable as well.

For instance, maybe I was seeing cart abandonment from unique users that rose significantly once they reached checkout. I know that happens when I add shipping costs into the total at that stage. So I might do some A/B testing: shipping costs listed separately, added into the price up front, or added during checkout, and see which converts better. Or I may set the website up to recognise the cookie and show a drop-down offering free shipping today with any purchase over $/£XX.XX.
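A minimal sketch of how visitors might be bucketed consistently into such a test (Python; the variant names and visitor ID are hypothetical):

    import hashlib

    VARIANTS = ["shipping_separate", "shipping_in_price", "shipping_at_checkout"]

    def assign_variant(visitor_id):
        """Deterministically assign a visitor to a test variant.

        Hashing the visitor ID means the same visitor always sees the
        same variant, which keeps the test results clean.
        """
        digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    # e.g. assign_variant("visitor-cookie-12345") -> one of the three variants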

There are endless possibilities for using this information.

There are many good tools out there that measure these variables and let you set up rules and keyword weighting, as well as many other great features. Some of these are;

WebTrends
DirectTracks
BidBuddy


19. Supplemental Results – What They Are, How to Find Them and How to Get Out of Them

“Supplemental sites are part of Google’s auxiliary index. Google is able to place fewer restraints on sites that we crawl for this supplemental index than they do on sites that are crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to Google’s supplemental index.

The index in which a site is included is completely automated; there’s no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank.”
Nonsense!

At the time of this article, Google was already starting to remove the supplemental label from its search results. Until recently, all you had to do was go to the last few pages of your query and locate the pages that had ' - Supplemental Result' just after the page size. They aren't showing these anymore. Here's what they had to say,

“Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.

The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results." Of course, you will continue to benefit from Google's supplemental index being deeper and fresher.”

Google then said that the easiest way to identify these pages is like this; “First, get a list of all of your pages. Next, go to the webmaster console [Google Webmaster Central] and export a list of all of your links. Make sure that you get both external and internal links, and concatenate the files.

Now, compare your list of all your pages with your list of internal and external backlinks. If you know a page exists, but you don't see that page in the list of sites with backlinks, that deserves investigation. Pages with very few backlinks (either from other sites or internally) are also worth checking out.”

Nonsense!

The easiest way to identify your supplemental pages is by entering this query ‘site:www.yoursite.com/&’

Okay, so now you have identified the pages that are in the supplemental results and not showing up anywhere in the normal results.

Now we need to identify why they are there. The main reasons that a page goes to supplemental results are;

1. Duplicate Content
2. 301s: Redirected pages that have a cache date prior to the 301 being put in place
3. A 404 was returned when Google attempted to crawl it
4. New Page
5. Bad Coding
6. Page Hasn't Been Updated in a While
7. Pages That Have Lost Their Back Links
8. And according to Matt Cutts of Google, "PageRank is the primary factor determining whether a URL is in the main web index vs. supplemental results"

Now this isn't the end-all, but it covers about 95% of the reasons that you may be in the supplementals.
So now we know what they are, how to find them and why they are most likely in the supplemental results. Now let's get them out of there.

Here are the different methods that I use when I find that a page has gone supplemental;

1. Add fresh content to the page
2. Add navigation to the page from the main page
3. Move the pages to the first subdirectory if it is not already there
4. Get a back link to the page and/or create a link from an existing internal page with the anchor text containing the keywords for that page
5. Do some social bookmarking on the page
6. Make sure the page is included in the XML sitemap and then resubmit it to Webmaster Central.
7. Lastly, if none of the above seem to be working after 90 days, and I have another page that is relevant, does have PageRank and isn't listed in the supplementals, I do a 301 (permanent redirect) to it from the supplemental page.
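If the site runs on Apache, the 301 in step 7 can be done with a single line in the .htaccess file (a sketch with hypothetical page names):

    Redirect 301 /old-supplemental-page.html http://www.example.com/relevant-page.html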


20. My Top SEM Tools

This is a comprehensive list of tools that my team and I use daily at Stickyeyes.

The first one, Firefox with the three extensions, is the toolset that I spoke about at SES London 2007. I encourage you to install it and check it out. It's one of the better tools I have used.

I use all three of these applications; IBP, SEOElite and WebCEO. Each has its own best tool and each is indispensable.

Firefox 2
Firefox SEO Add-Ons (1), (2) & (3) (Density, links, code cleaner, W3C Compliance, etc.)
Google Analytics - Provides deep analysis on all traffic, including paid search.
IBP - Several tools for checking rank positions, basic SEO page analysis and link building
SEO Elite – Excellent for Link Building, analysis, finding where competitors are advertising
WebCEO - Site optimization, promotion and analysis.

Back Linking/Bookmark Tools:-

Bookmark Demon and BlogCommentDemon – Automates the process of bookmarking and posting to blogs
Link Building 101 - Basic Link Building Instructions and Tips.
Link Baiting - Good Link Baiting Tutorial
Google Webmaster Central – Statistics, diagnostics and management of Google's crawling and indexing of your website, including Sitemap submission and reporting.
Comprehensive Link Building 101
Link Baiting *Instruction

Pay Per Click Tools:-

Keyword Elite – I use this within my arsenal of keyword tools
Wordtracker - Data is based on the Metacrawler and Overture search engines.
KeywordDiscovery - Data is aggregated from a number of search engines.
Keyword Optimizer - Enter a list of keywords and this tool will remove any duplicate entries and re-order the list alphabetically.
Google Analytics - Provides deep analysis on all traffic, including paid search.
Google Suggest - As you type, Google provides the top 10 most popular keywords that begin with the keyed-in letters, in order of popularity.
SpyFu - Find out what competitors are bidding on, with estimates for the cost of PPC advertising and other bells and whistles.
Hittail – Finds and easily groups the actual terms being used to find your site into an Excel format. Great for finding niches and long keyword strings.
Google Trends - Graphs historical trends of various Google searches.
Google Keyword Tool External - Historical trends in keyword popularity.
BidCenter - A good tool for comparative analysis and easy to use
SEO Sleuth - Find what AOL users search for (AOL produces 2x the retail conversions of any other engine)
ROI Calculator - This calculator measures the ROI (return on investment) of a CPC (cost per click) campaign.
Adwords Wrapper – Concatenates multiple words into a usable format in Adwords

PPC Hijacking *Information
PPC 101 *Instruction
PPC 102 *Instruction

Site Tools:

Virtual Webmaster – This is a great tool for the 'Do-It-Yourself' type. 200 web developers will complete any website change request in 48 hours.
C-Class Checker - Use the Class C Checker if you own several cross-linked sites. If you do, it may be more efficient (for SEO purposes) to host them on different Class C IP ranges.
Code to Text Ratio - This tool will help you discover the percentage of text in a web page (as compared to the combined text and code).
Future PageRank - This tool will query Google's various data centers to check for any changes in PageRank values for a given URL.
Internet Officer - Checks for Redirects
Live PR - The Live PageRank calculator gives you the current PageRank value in the Google index, not just the snapshot that is displayed in the toolbar.
Keyword Cloud - This tool provides a visual representation of keywords used on a website.
Keyword Difficulty Check - Use the Keyword Difficulty Check Tool to see how difficult it would be to rank for specific keywords or keyword phrases.
Page Size - This tool will help you to determine HTML web page size.
Site Link Analyzer - This tool will analyze a given web page and return a table of data containing columns of outbound links and their associated anchor text.
Link Analysis - Find out about all links on a page, including hidden ones and nofollow-markers
Spider Simulator - This tool simulates a search engine spider by displaying the contents of a web page in exactly the way the spider would see it.
URL Rewriting - This tool converts dynamic URLs to static URLs. You will need to create an .htaccess file to use this tool.
Keyword Misspelling Generator - allows you to generate various misspellings of a keyword or phrase to match common typing errors.
Useful for creating keyword lists around your most important keywords to bid on.
Keyword Density Analysis Tool - finds common words and phrases on your site.
Hub Finder - finds topically related pages by looking at link co-citation. post about tool
Page Text Rank Checker - tool allows you to check where your site ranks for each phrase or term occurring on the page.
XML Sitemaps - makes XML sitemaps for sites.
PageRank Toolbar For Mac - A widget to show PageRank for the site you are on.
Xenu Link Sleuth – Use to find broken links. Supports SSL sites and also reports on redirects.
Mobile Readiness Report – See how well your site is formatted for mobile phones. Includes Visualisation.
Javascript Content Hiding – Hide content on your site from search engines and other crawler/bots

Google Tools:-

Google Webmaster Central *Tool
Google Labs *Tool
Check For Google Supplemental Results
SpyFu for Google Bidding
Google Future PR
Google Sandbox
Google Dance Watch
Google Page Rank Formula and Sandbox Explanation
Google Google Information and FAQ
Google Reinclusion Request
Banned by Google?
Google Advanced Search
Google Data Center Pages Indexed Check
Google Page Rank Check (All DC's)
Google Keyword Ranking Check
Google "Need-To-Know" Info
Beginner Adwords Tips
How Google Analytics Work
Check Google Keyword Prices
Hit Tail *Advanced Adwords
Hit Tail Documented
Fake Page Rank Detection Tool
Adwords Click Fraud Study *Information

And Even More SEO Tools:-

Getting into DMOZ - http://forums.seochat.com/open-directory-project-13/submissi...
Meta Tag Generator
RoboForm - A MUST-HAVE!
Keyword Density
Redirect Checker
Robots.txt Generator
Link Popularity
Domain Age Check
Code-to-Text
Spider Simulator
Who Supplies Who with Search Results
Abuse IP Checker Tool
IP Information Tool
IP, City and reverse IP Lookup *Tool
Ping Tool
Traceroute Tool

Other Useful Tools:-

Data Recovery Software – Powerful Data Recovery with ‘on-the-fly’ viewing
Create MultiMedia PDF eBooks – Great for eTailers. This creates search engine optimised, customer-facing PDF documents as mentioned in Tip #5
Streaming Video For Your Website - Add Streaming Audio Or Video To Your Website Easily And Quickly
Add YouTube or Arcade Scripts - Entertainment Scripts Which Allow You To Start Your Own YouTube, MySpace, Break Or Arcade Website.
WordPress Auto Content Generator – Auto Generates Fresh Content for Your Blog
Next Generation RSS (SEO) Software – Add RSS feeds to your site easily
Bookmark Demon and BlogCommentDemon – Automates the process of bookmarking and posting to blogs
Aaron Wall's SEO Book – Yeah, I know, why am I listing this… I guess because it's a great resource that provides a few things I haven't listed here.
Traffic Travis – Another Unique SEO Tool with its Own Unique Merits


21. SEO Checklist

I am currently having this checklist developed into an automated tool.

Email me (re: Automated SEO Checklist) if you would like a beta version when it is available.

This checklist will take care of approximately 75% of your SEO.

-----------
SEO Checklist
KW:
Page: http://
-----------

Tool --- Metatags and on-page optimisation

http://www.seochat.com/seo-tools/meta-analyzer/ --- Are the keywords in the title with a 1-word buffer (Max - 1 keyword phrase)

http://www.seochat.com/seo-tools/meta-analyzer/ --- Are Keywords in META keywords? It's not necessary for Google, but a good habit. Keep the META keywords short (128 characters max, or 10 keywords).

http://www.seochat.com/seo-tools/meta-analyzer/ --- Are Keywords in META description? Keep the keyword close to the left, but in a full sentence.
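Taken together, the three META checks above might look like this in the page source (a hypothetical example built around the keyword 'car insurance'):

    <head>
      <title>Car Insurance Quotes from ABC Insurance</title>
      <meta name="keywords" content="car insurance, cheap car insurance, car insurance London">
      <meta name="description" content="Car insurance quotes compared in minutes by ABC Insurance.">
    </head>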

Check Content --- Are Keywords in the top portion of the page, in the first sentence of the first full-bodied paragraph (plain text: no bold, no italic, no style).

Check Content --- Are Keywords in an H2-H4 heading

Check Content --- Are Keywords in bold – second paragraph if possible and anywhere but the first usage on page.

Check Content --- Are Keywords in italic – no more than once

Check Content --- Are Keywords in subscript/superscript - no more than once

Check Content --- Are Keywords in URL (directory name, filename, or domain name). Do not duplicate the keyword in the URL.

Check Code --- Are Keywords in an image filename used on the page.

Check Content --- Are Keywords in ALT tag of that previous image mentioned.

Check Content --- Are Keywords in the title attribute of that image.

Check Content --- Are Keywords in an internal link’s text.

Check Code --- Are Keywords in title attribute of all links targeted in and out of page.

Check Code --- Are Keywords in the filename of your external CSS (Cascading StyleSheet) or JavaScript file.

Check Content --- Are Keywords in an inbound link on site (preferably from your home page).

Check Content --- Are Keywords in an inbound link from a related offsite (if possible).

Check Content --- Are Keywords in a link to a site that has a PageRank of 8 or better (e.g. .gov or .edu)

Check Code --- Are Keywords in an html comment tag? <!-- keyword -->

Tool --- Technical

IBP --- What is the code-to-text ratio? (the amount of text should, at a minimum, be higher than the amount of code)

IBP --- How many links are pointing to the full url (w/http://)

IBP --- How many links are pointing to the domain?

IBP --- Have you associated both the www and non-www versions of your site with Google?

IBP --- What is the Domain name visibility? A count of results at Google for a search for the domain, showing URL visibility rather than incoming link count.

IBP --- Number of internal pages that link to the home page?

IBP --- Number of Technorati links?

IBP --- Number of del.icio.us links?

IBP --- What is the page size?

IBP --- How long does it take to load the page?

IBP --- Is the top keyword density on each page between 3-7%?
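Tools measure this differently, but a simple sketch of the calculation (Python) is:

    def keyword_density(text, keyword):
        """Percentage of the page's words accounted for by the keyword phrase."""
        words = text.lower().split()
        kw = keyword.lower().split()
        n = len(kw)
        hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
        return 100.0 * hits * n / max(len(words), 1)

    # keyword_density("cheap car insurance from our car insurance experts",
    #                 "car insurance") returns 50.0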

http://www.internetofficer.com/redirect-check.html --- Are there any redirects?

http://validator.w3.org/detailed.html --- Is the page W3C Compliant?

http://www.copyscape.com/ --- Is there any duplicate content out on the web?

http://www.123promotion.co.uk/directory/ --- Is the site in the top 10 directories?

http://www.seochat.com/seo-tools/spider-simulator/ --- Is a spider seeing all of the site content?

Maximum 1 (one) instance of the '=' symbol in the URL --- Does each page have a title that is not dynamically generated?

IBP --- Is there JavaScript within the content? If so, move it into an external file.

Other Issues:

Check Content --- Are there at least 250 words in the content?

Check Code --- Javascript in external files?

Check Code --- Alternative navigation on flash or frames?

Check Content --- Xml and html sitemap?

Xenu (Download) --- Are there any broken links?

Check Code --- Is there a robots.txt file?

Check Code --- Do you have a path to the xml sitemap in the robots.txt file?

http://www.netmechanic.com/toolbox/power_user.htm --- Browser Compatibility (IE, Netscape, Opera, Firefox, Mosaic and Safari)

Linking:

SEO Elite --- Google backlinks?

SEO Elite --- MSN backlinks?

SEO Elite --- Yahoo backlinks?

SEO Elite --- DMOZ listing?

Check Site --- Does the site offer outbound RSS feeds?

Check Site --- Does the page have RSS feeds delivering fresh on-page content to pages other than the index page?

Check Site --- Does the site have an SEO optimised 404 page?

Search Google for site: and .pdf --- Are PDF-optimised docs in the root folder, with a navigation page listing each doc's description and link? Also a separate XML sitemap for these, submitted separately.

http://home.snafu.de/tilman/xenulink.html --- Are there any 302 redirects? (Change them to 301s - Google will penalise you for these if you leave them up too long.)


---About The Author---

Gary R. Beal is originally from the United States. Now living in the UK, he travels to conferences all over the world.

Gary has "crossed the pond" to close the gap between the US and Europe in online marketing, training many U.S.-based Search Managers at top agencies, companies and conferences. In 2007 Gary spoke at the SES conference in London, and at Gaming and Affiliate conferences around Europe.

Gary is the Director of Search at Stickyeyes - one of the UK's leading internet marketing agencies with a client portfolio that includes major corporations such as MTV, Jaguar, O2, Jet2, Littlewoods Bingo, Mecca Bingo, First Direct, Lloyds TSB and many others.

Gary attended Ohio State University in the U.S. and holds a Master's degree in Biometrics and Mathematical Statistics. He has been instrumental in the development of many Search Engine Optimisation and Pay Per Click tools as an analyst and consultant.

He is well known in most of the top SEO/SEM/PPC forums, a staff writer for DevShed and SEOChat, and a Moderator at SEO Chat. He has worked for many years in lead aggregation for highly competitive industries such as Online Gaming, Banking and Finance, Insurance, Travel and Investments and can effectively speak about doing business in these industries, as well as successfully doing business on the internet.

GaryTheScubaGuy

I am a regular poster on SEORefugee, SERoundtable, WebmasterWorld, DigitalPoint, SearchEngineLand, BruceClay and Marketing Pilgrim, as well as a Moderator on SEO Chat where I will gladly answer questions for you. I am also a Staff Writer for DevShed and SEOChat so join up and add me to your watch list for additional tips as they happen.

Please feel free to contact me at 0113-391-2929 or gary@stickyeyes.com

****************************************************************************

To go to part 1: http://sourcewire.com/releases/rel_display.php?relid=33080&h...=
To go to part 2: http://sourcewire.com/releases/rel_display.php?relid=33087&h...=

****************************************************************************

-ENDS-

This press release was distributed by ResponseSource Press Release Wire on behalf of SEPR in the following categories: Media & Marketing, Computing & Telecoms, for more information visit http://pressreleasewire.responsesource.com/about.