
December 29, 2005

It's A Wonderful Internet

For all of the fans of Frank Capra's well-known movie, It's A Wonderful Life, this link is for you:
It's A Wonderful Internet

Enjoy, and Happy Holidays from MasterLink!

December 19, 2005

AOL picks Google over MSN in new deal

AOL has chosen to use Google for all aspects of its search needs. No more than two weeks ago, the Wall Street Journal stated that MSN was in the final stages of inking a deal to provide AOL with its search services. The deal is expected to be announced formally tomorrow.

Published reports claim that Google has paid $1 billion to take a 5 per cent stake in the media giant. This is good news for AOL as it has struggled with a lower subscriber base and problems with its merger with Time Warner.

Google will promote AOL web properties and include AOL's online videos "amongst the search results", adds the Wall Street Journal, ambiguously. 'Amongst' may mean 'as part of', or it may mean 'alongside', and there's a world of difference between the two. Google emerges from the negotiations as an advertising powerhouse. It will use AOL's ad sales team and make use of a range of advertising inventory, such as banners, that brand-conscious advertisers demand.

The deal is principally about advertising, but the pair also agreed to extend their search agreement, signed in 2001, for another five years.

In MSN's quest to 'build a better mouse trap', it seems Google has thrown a large obstacle in the way of MSN's climb to the top.

-Mark Barrera
The MasterLink Group, Inc.
Search Engine Marketing Services

December 15, 2005

Is Google’s algorithm working against them?

A large debate has sprung up on the Internet between Jeremy Zawodny and Matt Cutts over the practice of selling links. In a post, Zawodny discussed the many ways he has attempted to profit from his popular blog. The latest is selling sponsored links to clients, who benefit by receiving a piece of his well-ranked PR8 site. It seems Cutts had no objections until it came to selling sponsored links outside of large advertisers such as Google. Recently Google has been altering its algorithm to fight the purchase of links, penalizing sites that buy links to gain a higher PageRank.

There is also a different issue that must be addressed. Google itself makes the majority of its revenue by selling ads through its contextual advertising program, AdSense. The question at hand is whether Google can really start penalizing sites for selling links, because this is essentially what Google itself does when it plasters AdSense all over the Internet. It is as if Google deems itself the Internet authority: everyone must play by its rules or suffer the consequences, while Google abides by no rules itself. Google is attempting to influence these link sellers to use the rel="nofollow" attribute in order to prevent the passing of PageRank. This leads to the question of for whom websites are created. Should developers focus all their efforts on making Google happy, or design for the benefit of the reader as well as the company that owns the site? It seems as if Google is attempting to force its rules on all web developers and punish people for practices that have been common since the introduction of the Internet. Matt Cutts takes the position that Google should be allowed to down-rank pages such as Zawodny's for selling links if it chooses to do so.
How fair is it to punish a site with some of the most relevant information just because people are willing to pay for links from such a popular site? By putting such a strong emphasis on PageRank, Google may be causing problems that were not foreseen and are now coming to the forefront of discussion. Google may also be worried that webmasters will take the selling of ads into their own hands and stray away from one of Google's cash cows, AdSense. I would like to say, however, that I do think some standard should be created to prevent spammy sites from attaining high PageRanks just to make a buck; but punishing everyone who sells links, even sites with relevant content, just doesn't seem fair. Google should be VERY careful in deciding how to attack this issue, as I predict that an algorithm change in the near future will focus on it.

On a side note, it seems that Matt Cutts had been linking to Zawodny's site without using a "nofollow" tag. After Cutts' comments on this topic, the links have now been changed to include this attribute or removed altogether. Removing PR benefits from a non-paid link? Hmm....can't everyone do this and essentially eliminate the whole idea behind PageRank?

Ahh.....the things we do to make Google happy.

-Mark Barrera
The MasterLink Group, Inc
Search Engine Marketing Services

December 12, 2005

Excellent Review of SEO Ranking Factors

I got this link today on one of my RSS feeds. It came to me from the Beanstalk Blog.
Many lists have been written of the various factors used to rank websites; however, I've yet to see one as well researched and presented as the one on the SEOmoz website.

The contributors to this list include:
  • Danny Sullivan
  • Dan Thies
  • Jill Whalen
  • Scottie Claiborne
  • among others
I found this very useful; getting multiple perspectives from active industry experts cuts through a ton of the misinformation that is out there.
Jack Spirko
The MasterLink Group, Inc

December 07, 2005

Microsoft and AOL may be close to a deal

According to a report in the New York Times today, Time Warner will most likely not sell a stake in AOL or sell the business outright, but will strike a partnership with either Microsoft or Google. A story in the Wall Street Journal (subscription required) names Microsoft as the frontrunner, saying the two are close to a deal.

Google currently provides AOL with its organic and paid search results, and shares revenue generated by the ads with AOL. Microsoft is currently piloting its own paid search listings, and has its own organic search engine, which could replace Google on AOL. Google's current contract with AOL ends in 2006.

The Times said the deal with Microsoft could include creating a joint advertising sales force with MSN, but that Google would not be interested in such an arrangement.

This just adds fuel to the fire in regard to my article MSN Building a Better Mousetrap?

Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

December 06, 2005

Is MSN Search Building a Better Mouse Trap?

It is an old cliché: build a better mouse trap and the world will beat a path to your door. Looking back at the history of the search industry, there is no better proof of this than the success of the internet search giant Google. But are they now being beaten at their own game? Possibly. Let's look at how Google got to where they are, what has happened since they became the "big dog", and what their most recent algorithm changes mean for web users and Google's competitors.
First, let's start with why Google became "the search engine". In the early days of search engines there were a ton of players, some you may remember and some you may not. Some survived, others died off, and many of the small fish were eaten up (bought out) by their brothers as the big three came to control the vast majority of the search market. Google came out of this with by far the biggest chunk of that majority. It didn't hurt that for a while they were actually providing their competitors' results (it was not that long ago that Yahoo was simply a mirror of Google's results), but there was something more at the core of their success.
Keep in mind that as this war was raging, the internet was really still quite young, and there were far fewer sites online even, say, three years ago than there are today. Take that back 7-8 years, to when the net was really starting to grow and Google first launched in 1998. At that time people were not yet doing a ton of "buying" online; they were still in the information-gathering world. You really have to think about what it was like just a short time back. People were quickly moving to looking up any question, concept, or piece of information online, and for a huge number of people it was a brand new experience. Further, when you started trying to find sites about monarch butterflies or classic muscle cars or fine cigars, there were actually far fewer sites out there about whatever subject you were looking to learn about.

Today's New Tool

It honestly amazes me how much information can be acquired if you just know where to look. For a while I have kept this little gem to myself but today I will share it with you.

This tool is called the Marketleap Link Popularity Check

Yes, yet another link-checking tool, but this one is very cool in its own way.

First, you enter any site you want to check links for, plus up to three additional sites. You also have the option to select a "category" such as books, computers, media, etc.

You then click Generate Report. The first report you get is a listing of the websites you entered, their total link popularity across all the main search engines, and about 30 sites from the category you selected (if you don't select a category it defaults to "general"). You then see your site's back links and how they compare to those other 30 sites across a spectrum that measures your "Total Presence".

Presence is classified into one of the following areas:

Limited Presence
Average Presence
Above Average
900 Pound Gorilla

The number of links needed to qualify for each level varies with the category and appears to be based on how the top 5-6 sites in each category rank.
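As a rough illustration of that idea, a presence tier could be assigned by comparing a site's back-link count against the link counts of the top sites in its category. To be clear, the tier cutoffs and sample numbers below are my own assumptions for the sketch, not Marketleap's actual formula:

```python
# Hypothetical sketch of how "presence" tiers might be assigned: a site's
# total back-link count is compared against the average link count of the
# category leaders. The ratio cutoffs are assumptions for illustration only.

def classify_presence(site_links, category_leader_links):
    """Return a presence tier based on links relative to the category's top sites."""
    benchmark = sum(category_leader_links) / len(category_leader_links)
    ratio = site_links / benchmark if benchmark else 0
    if ratio >= 1.0:
        return "900 Pound Gorilla"
    elif ratio >= 0.5:
        return "Above Average"
    elif ratio >= 0.1:
        return "Average Presence"
    return "Limited Presence"

leaders = [120000, 95000, 88000, 80000, 76000]  # made-up top-5 category sites
print(classify_presence(150000, leaders))  # out-links the leaders: 900 Pound Gorilla
print(classify_presence(2000, leaders))    # Limited Presence
```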

As if this were not enough, once you run a report on your site you create a benchmark which is stored in the Marketleap database. When you come back, you can pull up a "Trend History Report" and see a graph of your growth or loss of back links. No user name, password, or account info is required; any time anyone runs a report on ANY SITE, the data point is stored. Hence you may find a lot of info on your competition already stored for your review.

One more cool thing: since anyone can run a report on any site, your competitors may be running checks on you. If you go in and check one of your sites and see that trending data is available, you KNOW someone is checking up on your site. The more dates recorded, the more often you are being checked on. Keep track of the days you run your reports and you will know every time a competitor uses this tool to check up on you. You may not like being "checked out", but one thing is for sure: if you're getting looked at, you know you are doing something right.

There is also a Search Engine Saturation Report, which tells you how many pages of a site are indexed by the main search engines; this report shows trending as well. This is a great tool for the do-it-yourself type, and a great tool professionals can use to show clients trending in their back links and indexing.

Again you can find this tool at the following Link - Marketleap Link Popularity Check

Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc - Dallas Web Design


December 02, 2005

Yet Another Tool

This morning I was wondering what new tool I could find that would add value to the things I have already posted about. 
This tool is really great if you understand how to use it right. What it will do is identify common SEO techniques for any term you are looking to rank for.
What you do is find a site that ranks well for a term you are looking to rank a page of your own for, enter the competitor's site and the terms you want to rank for, and click
"Analyze Page"
Now, a lot of tools do this, but what I particularly like is that the data is compared on one page, side by side, which allows you to quickly identify what I consider to be under-utilized soft targets. Not only does it show on-page factors like keyword density, tags, and even keyword densities for the first 250, 500 and 1000 characters; it also shows back links. I know if you use similar tools you are thinking "big deal", right?
But wait: go try it with a site you are familiar with, and what should quickly jump out at you is that you can identify four specific types of sites that rank at the top for a term or phrase:
1.  Good on-page factors (tags, density, etc.) and good off-page factors (inbound links) - This term is the most difficult to acquire, as both sides need to be right to sit on top of the heap.
2.  Bad on-page factors and good off-page linking - This may actually be a harder competitor to beat. In any event, you know this term is driven highly by off-site factors, and that will be important to both how you go after it and IF you go after it, depending on the number of links the number-one and number-ten ranked competitors on Google have.
3.  Bad off-page factors but good on-page factors - This is a relatively soft target: it is driven mostly by on-page factors, which you can analyze and beat, and the off-page factors are weak, so a few good links can really push you to the top.
4.  Relatively poor off-page factors and relatively poor on-page factors - If this type of phrase has traffic behind it, you have found a nugget of pure gold! It is a term for which you can just get a few good links, adjust the page for density and tagging, and simply take the top of all the big engines. This is one of the key things we target for our clients at MasterLink.
The key here is to understand that if you rank well for a SINGLE term that brings you 1,000 visitors a month, it will be competitive and you will have to stay on it heavily; one major update can tank it, and you have to start all over to push it back up. However, if over time you can take good rankings for 50-100 niche terms that combine to provide the same 1,000 visitors, then if you lose any of them in an update your exposure is limited. This can easily be compared to buying a mutual fund vs. a single stock to limit total risk.
The right way, of course, is to build over time a portfolio of rankings that includes all four types of terms outlined here. The time to acquire and the labor necessary to maintain each will vary; by making sure you have a good mix you can build a lot of stability and, more importantly, a ton of steady traffic. What I really like about this tool is that it will quickly help you sort terms into one of the above four categories.
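The four-way sort described above can be sketched as a tiny classifier. The 0.0-1.0 scores and the 0.5 cutoff here are arbitrary assumptions for illustration; real on-page and off-page strength is judged from the tool's side-by-side data, not a single number:

```python
# Minimal sketch of sorting a keyword's top-ranked competitor into the four
# categories above, given rough on-page and off-page scores between 0 and 1.
# The 0.5 threshold is an arbitrary assumption, not a real ranking rule.

def classify_term(on_page_score, off_page_score, threshold=0.5):
    strong_on = on_page_score >= threshold
    strong_off = off_page_score >= threshold
    if strong_on and strong_off:
        return "1. Hard target: strong on-page and off-page"
    if not strong_on and strong_off:
        return "2. Link-driven: weak on-page, strong off-page"
    if strong_on and not strong_off:
        return "3. Soft target: strong on-page, weak off-page"
    return "4. Pure gold: weak on-page and off-page"

print(classify_term(0.2, 0.3))  # -> "4. Pure gold: weak on-page and off-page"
```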
Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

November 29, 2005

Google Click-to-Call

Google is now testing a new click-to-call service that lets web surfers speak with advertisers on its search results page without having to pick up the phone and dial.

A Web surfer can click a small phone icon adjacent to a PPC ad, enter his or her own phone number, and then click a "connect for free" button. Google's system then calls the advertiser's phone number; the surfer just picks up the receiver and hears ringing as the call to the advertiser is connected.

Right now it is not clear how broad the testing is. I did a few sample searches for frequently searched topics, such as "cell phones," "car rental" and "mortgage," and was not able to find any of the icons, but fellow SEO blogger Greg Yardley was able to obtain screen shots, which he posted last Wednesday.

Google said it pays for the calls, whether local or long distance. The only charge the Web surfer may incur is airtime if calling from a cell phone, which is fairly obvious: use your cell phone and you spend your cell phone minutes.

Rival Microsoft, which of course provides MSN Search, announced in August that it had purchased Teleo, an Internet calling company with the potential to allow MSN to offer click-to-call capabilities. Time will tell how well this new feature is received by customers and advertisers. It seems like a great feature, but getting surfers to recognize and use it will determine the success or failure of this program.

I am not quite sure yet as to my opinion on this. On the surface it looks like a clear winner; however, think about the fact that internet users generally want to scan, research, and then contact or buy, and you begin to question just how popular this feature is going to become.

We will just have to wait and see,

Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

Google API Key

Well, only a matter of hours after my post on Back Link Analyzer, I have had a few people ask, "just how do you get a Google API Key?" Given it has been so long since I got mine, I forgot how screwed up Google is about this if you already have an account. Everyone who asked me has an existing Google account, by the way.
Now why is this a pain in the butt? Because apparently the folks at Google did not really think the API page through.
Here is what you do: go to http://www.google.com/apis/ and you will see all types of wonderful information about the API Key, what it can do, and an invitation to set up an account. If you don't have a Google account yet, all you do is set one up, and because you came through the API portion of the Google site they will send you an API key when you sign up. No real trouble there.
Now for those who have an existing Google account.
You will notice there is nothing on the API key page about creating an API key, nothing as simple as "click here to create an API key".
Nope, that would be too simple; the only thing you see to click on is "create an account", which you already have!
So if you search Google for "create a google api key" you will find the URL www.api.google.com/create key, yet it just takes you back to the create-an-account page, like some twisted circle set up just to annoy you. Of course you can't set up another account, because you already have one, and holding two accounts would violate the terms and conditions (heaven forbid you do that and get banned). All you want is your API key, but Google just mocks you!
Of course no one at Google seems to have had the foresight to put a link on said page that says something like "existing account holders click here to sign in and get your API key".
Now doesn't this make you wonder how a company as successful as Google could do something this confusing?
So if you have a Google account and you want to get an api key added to it just how the heck do you do it?
The only way I can see to get it done is this:
1.  Go to https://www.google.com/accounts/ and log into your Google account.
2.  Go to www.api.google.com/create key after you have logged in, during the same browser session, and Google will recognize you as logged in and will create and email your key to you.
Now, is it just me, or does this seem overly complicated? Perhaps someone should point this out to Matt Cutts?
Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

Back Link Analyzer - Great Free Tool

Now this is a really great tool that gives you more information than just about any free tool I have yet seen, and in fact more information than a lot of expensive SEO software out there.
The tool is called Backlink Analyzer, and it does a ton more than just show back links.
Here is how it works: you enter the URL you want to check for links, select the search engine or engines you want to check, set the parameters (such as whether to show the link type, i.e. whether each link is reciprocal or one-way), and click go. In just a few minutes (seconds, if you have only a few links) it develops a report that shows:
1.  The page linking to you
2.  The IP address of that page
3.  The anchor text of the link, or the alt tag if the link is an image and has one (this is extremely valuable)
4.  The total number of links on the page your link is on, and how many are outbound vs. inbound
5.  Whether the link is one way or reciprocal
6.  A summary report of the raw numbers of reciprocal links, one-way links, total link popularity, and the number of IP C-blocks represented across all your linking pages
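The summary numbers in a report like that can be derived from raw link data in a few lines. A rough sketch follows; the site names and IPs are made up, and "C-block" here simply means the first three octets of the linking page's IP:

```python
# Rough sketch of deriving a Backlink-Analyzer-style summary from raw link
# data. Sample sites and IPs are invented for illustration.

def c_block(ip):
    """An IP's class-C block is its first three octets."""
    return ".".join(ip.split(".")[:3])

def summarize(backlinks, my_outbound_links):
    """backlinks: list of (linking_page, linking_ip); my_outbound_links: pages I link to."""
    reciprocal = [p for p, ip in backlinks if p in my_outbound_links]
    one_way = [p for p, ip in backlinks if p not in my_outbound_links]
    c_blocks = {c_block(ip) for _, ip in backlinks}
    return {
        "reciprocal": len(reciprocal),
        "one_way": len(one_way),
        "total": len(backlinks),
        "c_blocks": len(c_blocks),
    }

links = [
    ("http://a.example/page", "64.233.161.1"),
    ("http://b.example/", "64.233.161.2"),   # same C-block as the first link
    ("http://c.example/", "216.109.112.5"),
]
print(summarize(links, {"http://b.example/"}))  # 1 reciprocal, 2 one-way, 2 C-blocks
```

Fewer distinct C-blocks than total links suggests many of your links come from the same network neighborhood, which is exactly what the report is trying to surface.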
I found this tool extremely useful, very intuitive and very fast. 
There were a few hang ups though.
1.  You need a Google API Key to get the detailed Google results (no big deal), and that limits your Google results to 1,000 a day.
2.  Several times while running a link check the program hung (always with Yahoo, by the way). This seemed to occur when one of the linking sites was down and unresponsive to the program's query. In other words, Yahoo had the info, but when the individual site was asked for an update the program simply did not give up. I checked the offending site and it was indeed down. The program has a "stop" button, but clicking it did not work; it told me it might take a few minutes to stop, but 20 minutes later it was still hung. I used Windows Task Manager to shut it down, and that worked at once.
Despite the hang-up on Yahoo for one site, the tool has proven quite effective and I highly recommend it. Again, it is called Backlink Analyzer, and you can find it by clicking here.
Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

November 28, 2005

New Article on our SEM Site

We have just added a new article about Hiring an SEO Firm on our new dedicated SEO Website. Here is a part of it:
It is no secret that the topic of search engine marketing (SEM), and in fact internet marketing as a whole, has become a hot topic lately. Many companies, from small businesses to the largest corporations, have really come around to understanding just how powerful web site optimization can be when done properly. Even the companies that as recently as two years ago were convinced the internet wasn't for them are beginning to see the light, primarily because they see their competitors succeeding online and feel they are falling behind.

Considering all of this, it is no surprise that over the same period an entire new business segment has developed with the rise of Search Engine Optimization Firms (also called SEO Firms), ranging in size from small one-man operations to large professional companies serving America's top multi-national corporations. Some of these firms are excellent; many leave a lot to be desired. To compound the confusion in the marketplace, search engine optimization is not very well understood by most people. Then add in this little tidbit:

 “Whatever you know about search engine optimization and in fact any internet marketing method will be at least a little bit different next month and will continue to change almost daily for years to come”.

Read the Full Article about Hiring a Website Optimization Firm

Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

November 23, 2005

RankQuest another FireFox Extension

I have decided to post about RankQuest and the toolbar they offer. The issue with this, to me, is that you don't really need the toolbar: bookmarking the RankQuest website is really all you need to do. But the toolbar is very convenient.
This toolbar does a lot of great things, but all by simply taking you to their website and automatically entering the URL you typed into the toolbar as you use each tool. I myself have decided to keep the bar installed in Firefox so I don't have to dig into my bookmarks to use its features.
So just what does this tool do? Well, to start off it does the typical stuff:
Google, MSN and Yahoo back links
Keyword density
Indexed pages
For those types of features I actually prefer, and will continue to use, the SearchStatus bar from Quirk, as it sits at the bottom right of the Firefox browser, out of the way, and can do these things much quicker, including using Firefox's tabs to show all back links and all indexed pages for all three engines in one click. Yet RankQuest has some very cool features that the Quirk tool simply does not.
1.  Lynx View - With this you can see what your site would look like in the Lynx browser. At least, that is the claim; having used the Lynx browser, I disagree, but it is a good text-only view.
2.  Keyword Density Compare - To me this is the best feature and the reason I am willing to keep this bar installed. You can enter several URLs and a group of keyword phrases and get a side-by-side analysis of all the sites for all the phrases in, again, one click. I don't know how many URLs or phrases you can check at one time, but I tried four sites with five phrases and got results in a few seconds.
3.  A pretty good META tag analyzer
They also have an HTML Validation Tool and Code Cleaner, which I think need work. I tried the code cleaner on a few pages and found that it seemed to cause layout problems, image file name changes, etc. So personally I won't be using those features for the time being.
All in all it is a great tool, worth having mostly for the side-by-side densities. Some people will claim densities do not matter any more, and those people are right and wrong at the same time.
If the term is a major term like "cell phone" or "chevy", that is true, as tons of webmasters and optimizers are acquiring targeted anchor links for those terms, heavily weighting the off-page factors. Yet if you are searching for
"cingular cell phone dealer dallas"
"original used chevy parts wholesale"
density plays a major role, and such phrases are the ones buyers use. Keep that in mind as you continue your SEM efforts.
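For the curious, the density figure these toolbars report is simple to compute yourself. Definitions vary between tools; this sketch counts phrase occurrences against total words, which is one common choice, and the sample text is invented:

```python
# A simple keyword-phrase density calculation, the kind of figure SEO
# toolbars report. Tools differ on the exact formula; this version counts
# the words covered by phrase matches as a share of all words on the page.

import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9]+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words) if words else 0.0

sample = "Used chevy parts. We sell original used chevy parts wholesale."
print(round(keyword_density(sample, "used chevy parts"), 1))  # -> 60.0
```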
Jack Spirko
The MasterLink Group, Inc

November 22, 2005

Google Sitemaps Improves

Last week when updating Sitemaps for our Search Engine Marketing Clients, I noticed a major upgrade to the service. Google is giving more advanced reporting tools that allow you to better understand how Google is crawling your site, as well as other insights. In order to be able to view these new advanced statistics, you must first verify that you are the owner of the domain by a two step process which helps to prevent unauthorized access to your site statistics.

Google now allows you to see in depth crawl stats. You can see a distribution of the pages successfully crawled and the errors encountered on each page as well as a distribution of current PageRanks for all pages in your site.

The URL error reports now include 40 different types of errors in 5 categories, which allows you to correct crawl errors much faster without having to crawl the site yourself to locate problems. If a link is broken or a page is no longer there, you will be notified every time Google crawls your site. This is a great tool for large websites with link structures that are complex and constantly changing. You can also better manage the URLs that are blocked from being crawled, as well as be notified of URLs that are not being reached when they should be.

Another neat feature is the query stats that are now provided. These stats are shown for your domain whether or not you have submitted a site map to Google. It allows you to see the top Google search queries that return pages to your site as well as the top queries that caused users to click on your site in the search results. This allows you to better write titles and descriptions for pages that are top search queries and are not returning a high click through rate. It is also useful for sites that do not track referral stats by monitoring log files. At this point in time, Google is only showing the top 4 queries and the top 4 search query clicks. However, I feel that Google will show more results in the future as more people start to take advantage of this free feature and they are better able to fine tune the service.
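For anyone who has not yet submitted one, a Sitemap itself is just an XML file listing your URLs. A minimal one can be generated with a short script; the URLs and dates below are made up for illustration, and this is a sketch rather than a full implementation of the protocol (which also supports fields like changefreq and priority):

```python
# Generate a minimal Google Sitemaps XML file. The URLs and lastmod dates
# are invented examples; the 0.84 namespace is the Sitemaps schema Google
# used at the time.

from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </url>"
        for loc, lastmod in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
        f"{entries}\n</urlset>"
    )

print(build_sitemap([
    ("http://www.example.com/", "2005-11-22"),
    ("http://www.example.com/services.html", "2005-11-20"),
]))
```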

Learn more about Google Sitemaps by visiting: https://www.google.com/webmasters/sitemaps/docs/en/about.html

-Mark Barrera
The MasterLink Group, Inc
Search Engine Marketing Services

November 18, 2005

Online Search Marketing Tool

Well, I have one more keyword / search engine tool for everyone today. It is called Site Report Card and has been around for a long time. A fellow named Gene Culver turned me onto it about three years ago, and I have used it with clients, especially when in the field and away from my own computer, to explain a lot of things like keyword density, site optimization, inbound links, PageRank, etc.

One of the real cool things to do with this tool is to run a report on one of your pages, then run the keyword report on it, then open a new window and search for that term or group of terms and see who is number one for it on whatever engine you want.

Then put the URL of the competitor who is at number one into the box to compare it to yours and it will show the key word densities side by side for both sites. It does not show everything but does show basic density and works very quickly.

If you take this tool and combine it with the two Mozilla/Firefox extensions I mentioned earlier today, you have about 60-75% of what many SEO software suites provide.

So there you go; your new tool kit is...

Just download the Mozilla Browser

Download and install the Quirk SearchStatus Tool Bar Extension

Download and install the SEO Links Extension

Then bookmark and remember to use SiteReportCard

With those resources you can very quickly make adjustments and do checks on the fly greatly enhancing your SEO efforts,

Jack Spirko
The MasterLink Group, Inc

Tool for those using Adsense

Here is the code to tell Google to start emphasizing the content on your page:

<!-- google_ad_section_start -->

Then after your article put the following code:

<!-- google_ad_section_end -->

Using these tags will help ensure that Google puts AdSense content on your page that actually relates to what your page is about, and of course this will help your click-through rates. In many instances it will also result in more expensive keywords being displayed, therefore increasing revenue from your AdSense campaign.
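If your pages come out of a template or script, you can apply the tags consistently by wrapping the article body programmatically. A small sketch, with a made-up helper name and sample content:

```python
# Small helper that wraps an article's HTML in the AdSense section-targeting
# comments shown above, so a template or CMS applies them consistently.
# The function name and sample page body are invented for illustration.

def emphasize_for_adsense(article_html):
    return (
        "<!-- google_ad_section_start -->\n"
        f"{article_html}\n"
        "<!-- google_ad_section_end -->"
    )

page_body = "<h1>Classic Muscle Cars</h1><p>Restoring a 1969 Camaro...</p>"
print(emphasize_for_adsense(page_body))
```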

Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

Another SEO Links Extension

This SEO Links extension gives you tooltips specially enhanced for SEOs. When enabled, hovering over any link in Firefox will show you Yahoo, MSN, and Google link popularity and ranking data for the URL and anchor text pair. For times when you just want regular surfing and tooltips, the extension can be quickly toggled on and off in a single click.
It works really easily: once installed, you simply hover over any hyperlink and a little hover box shows information about the page it points to. This information covers MSN, Yahoo and Google and is basically:
1.  The number of links indexed in each engine that point to the target page
2.  The rank on each engine (if ranked in the top ten) for the term held in the anchor text
Mark Barrera, another member of the MasterLink Internet Marketing Team, turned me on to this tool.
Here is the link to learn more about it,
Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

SearchStatus: A Search Extension for Mozilla and Mozilla Firefox

Now this is a great tool for Mozilla Firefox. It shows things like Google PageRank, which of course the Google Toolbar for IE also does, but this tool offers so much more.
In addition, it shows features of the web page you are viewing, like:
Keyword density
Backward links on MSN, Google, Yahoo, or all three at once (via the tabs in the browser)
Total indexed pages for the site, again in any or all of the big three search engines
The Alexa rank
It also shows quite a bit more. I am not an anti-Microsoft guy; I am just big on Mozilla because it works better, and this is just one more example of that.
You can download this tool at the following link: http://www.quirk.biz/searchstatus/
I have to say this is one of the best SEO Tools I have ever worked with,
Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

Here Comes the Real Results of Jagger3

Well, it looks like the full effect of Jagger3 is about to be upon us in full force. Quite a few other SEOs seem to have noticed the same thing I noticed last night: results would show the update, then a refresh would flip back to the Jagger2 results. This morning the Jagger3 results seem steady and consistent.
Here is what I have come to realize watching the results change over the three updates:
1.  The value of direct reciprocal linking was clearly affected, but not to the degree many, including myself, had surmised. I know of a few extremely competitive keyphrases that were number one on Google where the SEOs behind them relied on reciprocal links to a high degree. Most of them maintained their number one rankings, or if they dropped, only went down to number two.
2.  The big winners, as everyone expected, were the HUGE sites. In quite a few of the above-mentioned competitive phrases, when the old number one was pushed to number two or three, the new king is about.com. I think About was the biggest winner in most of the areas where I saw change; that just does not seem like what is really best for the user, does it?
3.  Professional SEOs and do-it-yourselfers alike can still gain advantage with reciprocal links, but they need to work on one-way linking as well and stay away from small rings. I feel arrangements that link site A to site B, site B to site C, and then site C back to site A have been found out as well.
4.  Deep links were always important, as were varied links (to your home page and many other pages of your site). Honestly, I feel reciprocal links will still work great as long as you use a "change up" approach: set up a rotation, and each time you work on links, change the page you are getting links to. Organize this, keep moving it around, and your links, even reciprocal links, will continue to work well.
5.  This may sound a bit contradictory, but I also feel that in some instances the value of on-page factors has gone up. Here is what I mean: in some cases with moderately competitive phrases, where the top dogs got there with a ton of obvious reciprocal links mostly pointing to one page on their site, some of their competitors with far fewer (in fact very few) natural links and great on-page factors have pulled ahead.
The old view was that content is king, and while that has been mitigated a bit by Google's heavy dependence on links, it is clear that content is still a much bigger part of the equation than many seem to realize,
Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

November 17, 2005

New SEO Site [Beta Version]

MasterLink has launched a brand new dedicated SEO site, Dallas Search Engine Marketing. This is a brand new site in initial development, but we will be moving quickly to develop a huge database of SEO and SEM articles.
Jack Spirko

Google Base

On November 16th Google announced the launch of Google Base, a free online resource for the selling of products and services online.

It appears that Google Base is an extension of Google's existing content collection efforts, such as its traditional web crawl system – as well as Google Sitemaps, Google Print and Google Video – all of which enable content owners to easily make their information searchable via Google. According to Google, "the goal of Google Base is to improve the overall quality and breadth of Google Search results by collecting even more information about a wider diversity of content".

There's not much in there yet, but I'd highly recommend checking out Google Base at http://base.google.com/ and becoming familiar with how it all works. If you sell a product or service, then this free tool stands to be a fantastic addition to your marketing arsenal.

I also found this great article on Google Base by Search Engine Watch editor Danny Sullivan at http://searchenginewatch.com/searchday/article.php/3564506 in which he points out some of the problems he sees with this new service. I would suggest reading it, but I would also suggest that when doing so we all keep in mind that Google Base is still in beta, and I would venture a lot of changes are still to come.

Jack Spirko
Search Engine Marketing Specialist
The MasterLink Group, Inc

November 14, 2005

Getting Your Site Out of the Google Sandbox

By Jack Spirko – SEO Specialist
The MasterLink Group, Dallas Texas

So Just what is a “Google Sandbox”?

There has been a lot of talk about Google placing new sites in a “sandbox” away from the regular ranking index; to rank well in Google’s general organic listings, a new site simply must get out of this sandbox phase as soon as possible. So how long does this period last if you don't take specific steps to speed it up? Generally 120-180 days; that is up to half a year just to start getting a fair shake. During this period, new sites tend to have low PageRank, or even a PageRank of zero; however, they often rank well in MSN or Yahoo in spite of being shunned by Google.

To me personally this is just one big reason Google will continue to lose market share to Yahoo and MSN. People want the content most germane to their search, and ruling out everything put up over the past 180 days rules out a lot of content.

If that is true then just why did Google create the sandbox?

In theory the sandbox prevents new sites from using what Google sees as "tricks" to rapidly gain rankings. By filtering out sites that are new, Google forces you to build real and relevant content and links over time. While this seems good in theory, it basically assumes your site has nothing relevant to contribute until it passes some arbitrary time period that someone at Google simply decided was "long enough". As you can tell, I am not a fan of the "sandbox". Of course, as an SEO specialist I guess that is natural, but if you think about it logically, this does seem a bit like throwing the baby out with the bath water, does it not?

What triggers the sandbox?

According to the few people at Google (such as GoogleGuy) who will comment on the sandbox, “The sandbox is triggered mainly by unnatural linkage patterns. As evidenced by Google’s recent patent, Google has been keeping track of historical data for some time now. Google has an excellent understanding of what natural linkage and text look like”. Which, to me personally, is Google simply stating that it has decided a new site is not capable of quickly gaining quality links in any quantity, no matter how germane the content may be or what the quality of the site may be.

There are a few things you should be aware of to understand this issue.

First Your Anchor Text

One of the biggest signs of over-optimization is your anchor text. If the majority of your anchor text uses the same keyword phrase, Google may penalize your site. Natural anchor text is varied, and Google knows this. So if you want to optimize for, let's say, “low mortgage rates”, make sure you use varied anchors such as “get low mortgage rates” and “find low mortgage rates” as you build links both on and off site.
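For example, three links pointing at the same page can each use a different anchor. The URL below is just a placeholder I made up for illustration:

```html
<!-- Varied anchor text for the same target page (example URL is hypothetical) -->
<a href="http://www.example.com/rates.html">low mortgage rates</a>
<a href="http://www.example.com/rates.html">get low mortgage rates</a>
<a href="http://www.example.com/rates.html">find low mortgage rates today</a>
```

All three still contain the phrase you care about, but the mix looks a lot more like links real people would write.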

Next Your Incoming Links

Google’s algorithm knows your incoming links are not natural if you just launched your site and already have five thousand links to it. Google knows how many incoming links your site has, the rate of incoming link growth, the anchor text used, and what IP addresses the links come from. These and many other signs of artificial efforts to inflate rankings too quickly will almost always lock the sandbox down harder.

Finally the location of your links and where they point to are of key importance.

The location of your off-site links is very important. Natural links occur on many different sites, hosted on different servers all over the country and the world; if you have 100 links from a single IP address, that is sure to trigger the filter, and even after you are “out of the box” those links will still be discounted and do you little if any good. As for where your incoming links point: if all, or even most, point to your home page, there is a high probability you will indeed trigger the filter. So as you build links, focus on 5-10 pages (at least) on your site and try to create a balance of home page and deep links.

So what can you do to avoid the sandbox?

In theory, not all new sites are placed in the box, though I tend to disagree. I personally feel they all are; or, if you don’t trigger it, that means your link popularity is so low it won’t matter, and you still will not rank for any competitive terms. However, you can indeed cut down your time in the box with a few simple steps.

First, acquire off-site links slowly and avoid “link farms”. Work hard, especially in the beginning, to gain 20-30 links on very much “on topic” pages, and vary your anchor text: use your main phrases surrounded by different terms, and even break a phrase up or leave a word out from time to time. By the time you gain 10-15 links to your home page, try to have 20-30 “deep links” to various pages on your site. One of the biggest things you can do is, the second you register your domain, put up a temporary page and get it indexed. If your site is going to be in initial development for, say, 1-2 months, DO NOT WASTE THAT TIME: get indexed fast and let Google see you continuing to add content; as soon as a page is decent, get it added to your online web.

Even during construction, try to obtain some off-site links and start pushing links to your sub-pages right away. Continue to develop your content and stay aware of all the important on-page factors as well. We may not like the sandbox, but it is here to stay, for a while anyway.

Now here is the good news; no, make that great news. Even though Yahoo and MSN don’t at this time have a sandbox of their own, the reality is that if you take the approach I advise here, you are going to do very well with both of them as well.