Tuesday, February 24, 2009

Website Traffic Reports and Site Marketing

Why You Need Them

Website traffic reports are a key element of search engine tracking and site optimization. Without traffic reports, you can't tell:

• if search engine spiders are seeing your site at all

• if there are any errors that prevent search engine spiders from successfully reviewing your entire site

• how often search engine spiders return to your site to update their information (if they don't return on their own, you need to resubmit)

• which sites linking to your site are generating high-quality traffic

• which links are only generating sporadic traffic (and thus may need a rewritten description)

• which relevant keywords, typed into the search engines, are bringing people through to your site

• which terms that aren't relevant to your site are bringing people through to your site (i.e. "false click-throughs")

Site traffic reports are used throughout the duration of a site optimization project. You start a project by first reviewing the site traffic report to get a baseline of total visits, visits from search engines, search terms used, and link activity. Plus, the initial review of a site traffic report can help pinpoint any problems in your code or server setup that are preventing search engines from spidering your site.

Then, after resubmission, you review ongoing site traffic reports to monitor spider visits and revisits, and once you see that the spiders have visited, you review the reports to check the changes in click-through patterns and to make sure no new "false click-through" patterns have been created by accident.
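If you do have access to the raw server logs behind those reports, the same questions can be answered with a little scripting. Below is a minimal sketch of that kind of analysis, assuming logs in the common "combined" format; the log file name, the spider list, and the "q" query parameter are illustrative assumptions, not a complete reporting tool.

# A minimal sketch of the analysis a traffic report automates: count spider
# visits, referring sites, and the search phrases people clicked through on.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

SPIDERS = ("Googlebot", "Slurp", "msnbot")  # a few well-known crawler user agents
LINE = re.compile(r'"[^"]*" \d+ \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

spider_hits = Counter()
referring_sites = Counter()
search_terms = Counter()

with open("access.log") as log:  # hypothetical combined-format log file
    for line in log:
        match = LINE.search(line)
        if not match:
            continue
        referrer, agent = match.group("referrer"), match.group("agent")
        for spider in SPIDERS:
            if spider in agent:
                spider_hits[spider] += 1
        if referrer and referrer != "-":
            parsed = urlparse(referrer)
            referring_sites[parsed.netloc] += 1
            for term in parse_qs(parsed.query).get("q", []):  # "q" holds the search phrase on most engines
                search_terms[term.lower()] += 1

print("Spider visits:", dict(spider_hits))
print("Top referring sites:", referring_sites.most_common(10))
print("Top search phrases:", search_terms.most_common(10))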

Can your site be optimized without access to site traffic reports? Yes, there are things that can be done, but it's a bit more difficult for the following reasons:

• With regard to keyword phrases, the site traffic report tells you which ones are currently useful and being clicked on in the search engines -- so without the report you have no idea which of your current keywords are or aren't working. This means you have no idea whether you need to change your keywords, so any keyword research is basically done as a shot in the dark.

• With regard to links, without the traffic reports you have no indication of which current links are working for you, which means you don't know what types of sites you should request additional links from. This makes any link requests totally untargeted and may mean that hours are wasted requesting links from sites that don't tend to drive any traffic your way.

• The timing of your optimization project is affected by not having access to site traffic reports. Since you don't have a record of when search engine spiders have visited, you have to assume they haven't and wait blindly to see if you're appearing in the search engines. Since many search engines update only once a month, and since you won't know if your site gets reviewed during the open window, you just have to wait, try your terms in the search engine, and wait some more. This means that running any final reports showing your search engine position can't be scheduled in advance.

• Most importantly, without site traffic reports you won't know if your optimization project was successful, because you won't have any idea if your site is getting more traffic or not! The bottom line in search engine optimization is to get targeted traffic to the site -- and without a site traffic report you have no idea if your optimization led to an increase or a decrease in visitors.

Doing Without

So, what can be done without a site traffic report?

1. Competitors' sites can be reviewed and their Meta tags (i.e. title, meta keywords, meta description) looked at for ideas.

2. Titles and Meta tags can be written based on words used in the company's marketing material and on research into popular, site-relevant keywords.

3. The home page URL can be submitted to directories like Yahoo (there's a fee involved for listing) and Open Directory.

4. A selection of pages can be submitted via "pay for reindexing" subscriptions at Inktomi and AltaVista (prices differ depending on the number of pages submitted).

5. Using variations of the keywords selected for Meta tags, searches can be run to find sites that might link to your site - once you find one, you'll need to look it over carefully to see if it already links to you and to see if it looks like it would send relevant traffic.

6. Site pages can be submitted to the free search engine submission sites.

7. Search engine ranking reports can be run to see where your site stands for each keyword phrase.

8. You can pay for outsourced tracking -- this usually means adding a small "bug" or hidden image on each page of your site. These bugs are stored on the tracking company's server, and reports are run on their server logs to see how many times your particular bugs are called and who's calling them (a rough sketch of how this works follows this list). The major downside is that network congestion could slow down the loading of your site (since each page has to call an object on another server). Also, some of these services are very expensive, and those that aren't expensive often have very limited reporting.
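To make the "bug" idea concrete, here is a rough sketch of what the tracking company's end might look like, assuming a plain Python web server; the port and the logging are illustrative only, not any particular vendor's service.

# Each page on the tracked site embeds a hidden image pointing at this server,
# so every page view shows up in this server's log along with the referring page.
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

# A 1x1 transparent GIF -- the classic hidden-image "bug".
PIXEL = base64.b64decode("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

class TrackingPixel(BaseHTTPRequestHandler):
    def do_GET(self):
        referrer = self.headers.get("Referer", "-")   # the page that called the bug
        agent = self.headers.get("User-Agent", "-")   # who called it
        print(f"hit page={referrer} agent={agent}")   # a real service would write this to its logs
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("", 8080), TrackingPixel).serve_forever()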

So, there's still plenty that can be done even if your hosting company doesn't provide site logs or traffic reports -- but it makes YOUR job more difficult, because instead of having real evidence of who's coming and where they're coming from, you just have to hope that the descriptions in the search engines are tempting click-throughs and that no-one is clicking through on terms that don't apply.

Wednesday, February 18, 2009

SEO: Increase Your Business Website's Ranking

I'm going through an SEO overhaul of our company's website. There is definitely a mixture of art and science involved that I never appreciated until having to do it myself. So, to save others some time, here are some very basic things that any business should do to optimize the ranking of its website in search engine results. Nothing explained here is necessarily a secret or hard to find out. The trick is to find a good cookbook and a process that is easy to follow.

1. Keywords. Decide on a manageable number of keywords to work into the content of your website (I chose 10). This falls into the category of art. You need to think of words that people would potentially use to look for what you're offering, similar to the process you go through when starting an AdWords campaign. You need to climb into the brain of your customer and understand how he or she might try to find you. For example, our company often refers to itself as a SaaS company. However, potential customers might first think of the words "online" or "hosted" instead of SaaS. Tools exist that can help with this, but the end goal is to list your important keywords.

2. Site changes. Depending on what software or hosting services you use, you may need to adapt this process, but here are some common ways to influence how your site gets indexed:

* Page title. These are the words at the top of the browser window that describe the current page. Search engines look at these words. Be strategic here. For example, if you have a page devoted to a product that allows online document collaboration, don't use "Product" in the page title. Use "Online document collaboration" instead. Also, don't put your product name first. Search engines will hopefully already know what that is; they tend to look at the first three words, so make them the most important.

* H1 tags. Use the same strategy. Search engines prioritize H1 tags, so use important keywords here. Try to stick to one H1 tag per page.

* URLs. Same deal here: use keywords. For example, instead of http://mycompany.com/product, use http://mycompany.com/online-document-collaboration. Separating words with hyphens is best.

* Keyword density. Make sure to use your keywords in the body of your pages. Density is important; i.e. 2 keywords for every 5 words count for more than 2 for every 15 (a quick way to check this is sketched after this list).

* Meta description tags. This will sound redundant but... use keywords. This is also where you should put a 25-word description of your site. Search engines sometimes display this description under links in search results.

* Links. Link to other pages on your website, but (get ready for it!) use keywords. For our example, use "online document collaboration" as the link text, not "product name."

* Content. Create pages on your site that focus on your most important keywords and topics. And then link to those pages using the method described above. Do not copy content from other sites because this only confuses search engines.
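As a quick illustration of the keyword density point above, here is a minimal sketch of a density check; the sample text and the phrase are placeholders, and real tools weigh many more factors.

# Count how often a keyword phrase appears relative to the total word count.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    hits = sum(1 for i in range(len(words) - len(phrase_words) + 1)
               if words[i:i + len(phrase_words)] == phrase_words)
    return hits * len(phrase_words) / len(words) if words else 0.0

sample = ("Online document collaboration lets teams edit together. "
          "Our online document collaboration product keeps every revision.")
print(f"{keyword_density(sample, 'online document collaboration'):.1%}")  # 40.0%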

3. Link love. We all know that the more inbound links pointing to your site, the better. Get people to write and comment about your business. We also know that the quality of the site linking to yours is very important. What you may not know is that the words in links are important, too. Try to ensure that the text in those inbound links contains keywords; for example, use "business collaboration software" (see how I'm learning) instead of something generic like "click here."

You can do other, more technical things, like create an XML site map and submit it to search engines. Our company brought in SEO experts, and that can be very beneficial if you require help in this area. Regardless, the whole process makes you really focus on what your company does and how you describe it. It isn't easy, but it can pay great dividends with leads and site traffic. How do you practice SEO? Any tips or strategies that I missed?
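For the XML site map mentioned above, here is a minimal sketch of generating one with the Python standard library; the domain and page list are placeholders.

# Write a bare-bones sitemap.xml listing the pages you want indexed.
import xml.etree.ElementTree as ET

PAGES = ["", "online-document-collaboration", "pricing", "about"]  # hypothetical paths

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = "http://mycompany.com/" + page
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)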

Thursday, February 12, 2009

SEO Inc. Chief Operating Officer to Speak at SMX West 2009

CARLSBAD, CA - SEO Inc.'s Brad Lipschultz, COO, will speak at this year's Search Marketing Expo (SMX) West conference in an educational session on search engine optimization.

An experienced Internet and SEO veteran, Brad is responsible for managing the day-to-day activities at SEO Inc., a leading full-service integrated SEO and search engine marketing company.

Brad will be presenting in Theater B of the SMX Theater on the Expo Hall floor Wednesday, February 11 at 1:40 pm. All SMX attendees are invited to attend and hear him discuss various aspects of SEO with an emphasis on web accessibility for the visually impaired.

SMX West will take place at the Santa Clara Convention Center, California from February 10 through February 12 and will feature three days of sessions, keynotes, networking activities, special educational presentations and meals for which SMX events are famous.

SMX is programmed by the sharpest minds in search marketing. It is where experienced SEO and search engine marketing professionals go to learn everything that they need to know about Internet marketing. There are sessions for everyone ranging from beginners to the extremely advanced.

For more information about SEO Inc. call 877-736-0006, visit www.seoinc.com or visit booth #323 at the SMX West Expo Hall.

About Search Engine Optimization Inc.:

Search Engine Optimization Inc. is a professional search engine optimization firm that specializes in achieving top search engine placement for websites through highly targeted SEO campaigns that are geared towards the unique online business objectives of each client. SEO Inc. is an Inc. 5000 company that leverages more than 120 years of combined Internet marketing experience. SEO Inc.'s certified search engine specialists have developed and honed a suite of highly effective, proprietary optimization methodologies and technologies that have placed more than 700 leading corporations in the top rankings of the world's leading search engines.

Wednesday, February 4, 2009

Google Releases Software to Track Mobile Users

U.S. Internet search company Google Inc released software on Wednesday that allows users of mobile phones and other wireless devices to automatically share their whereabouts with family and friends.

Users in 27 countries will be able to broadcast their location to others constantly, using Google Latitude. Controls allow users to select who receives the information or to go offline at any time, Google said on its Web site.

"Fun aside, we recognize the sensitivity of location data, so we've built fine-grained privacy controls right into the application," Google said in a blog post announcing the service.

"You not only control exactly who gets to see your location, but you also decide the location that they see."

Friends' whereabouts can be tracked on a Google map, either from a handset or from a personal computer.

Google's new service is similar to the service offered by privately-held Loopt.

Companies including Verizon Wireless, owned by Verizon Communications and Vodafone Group Plc, already offer Loopt's service, which also works on iPhone from Apple Inc.

Latitude will work on Research In Motion Ltd's BlackBerry, on devices running Symbian S60 or Microsoft Corp's Windows Mobile, and on some T-Mobile phones running Google's Android software.

The software will eventually run on Apple's iPhone and iPod Touch and on many Sony Ericsson devices.

In 2005, Google acquired, but subsequently shut down, a location-finding service that used text messaging to keep mobile phone users aware of their friends' proximity.

Tuesday, February 3, 2009

PageRank is Important for Websites

PageRank (PR) is an important factor in judging the quality of a website.

It is one of Google's methods of measuring the importance of a web page according to its popularity and the links pointing to it from other websites.

PageRank is a value that Google assigns to a web page based on the importance of that page on the Internet, which is determined by the number of incoming links to the page and a few other factors.


How is PageRank measured?


To determine the PageRank for a page, all of its inbound links are taken into account. These are links from within the site and links from outside the site.

PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

That's the equation that calculates a page's PageRank. It's the original one that was published when PageRank was being developed, and it is probable that Google uses a variation of it but they aren't telling us what it is. It doesn't matter though, as this equation is good enough.

In the equation 't1 - tn' are pages linking to page A, 'C' is the number of outbound links that a page has and 'd' is a damping factor, usually set to 0.85.


We can think of it in a simpler way:-


a page's PageRank = 0.15 + 0.85 * (a "share" of the PageRank of every page that links to it)

"share" = the linking page's PageRank divided by the number of outbound links on the page.

A page "votes" an amount of PageRank onto each page that it links to. The amount of PageRank that it has to vote with is a little less than its own PageRank value (its own value * 0.85). This value is shared equally between all the pages that it links to.

From this, we could conclude that a link from a page with PR4 and 5 outbound links is worth more than a link from a page with PR8 and 100 outbound links. The PageRank of a page that links to yours is important but the number of links on that page is also important. The more links there are on a page, the less PageRank value your page will receive from it.
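To put numbers on that, using the "share" calculation above: the PR4 page passes roughly 0.85 * 4 / 5 = 0.68 to each page it links to, while the PR8 page passes only 0.85 * 8 / 100 = 0.068.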


If the PageRank value differences between PR1, PR2, ..., PR10 were equal then that conclusion would hold up, but many people believe that the values between PR1 and PR10 (the maximum) are set on a logarithmic scale, and there is very good reason for believing it. Nobody outside Google knows for sure one way or the other, but the chances are high that the scale is logarithmic, or similar. If so, it means that it takes a lot more additional PageRank for a page to move up to the next PageRank level than it took to move up from the previous one. The result is that it reverses the previous conclusion, so that a link from a PR8 page that has lots of outbound links is worth more than a link from a PR4 page that has only a few outbound links.

Whichever scale Google uses, we can be sure of one thing. A link from another site increases our site's PageRank. Just remember to avoid links from link farms.


Note that when a page votes its PageRank value to other pages, its own PageRank is not reduced by the value that it is voting. The page doing the voting doesn't give away its PageRank and end up with nothing. It isn't a transfer of PageRank. It is simply a vote according to the page's PageRank value. It's like a shareholders meeting where each shareholder votes according to the number of shares held, but the shares themselves aren't given away. Even so, pages do lose some PageRank indirectly, as we'll see later.


Ok so far? Good. Now we'll look at how the calculations are actually done.

For a page's calculation, its existing PageRank (if it has any) is abandoned completely and a fresh calculation is done where the page relies solely on the PageRank "voted" for it by its current inbound links, which may have changed since the last time the page's PageRank was calculated.

The equation shows clearly how a page's PageRank is arrived at. But what isn't immediately obvious is that it can't work if the calculation is done just once. Suppose we have 2 pages, A and B, which link to each other, and neither have any other links of any kind. This is what happens:-


Step 1: Calculate page A's PageRank from the value of its inbound links


Page A now has a new PageRank value. The calculation used the value of the inbound link from page B. But page B has an inbound link (from page A) and its new PageRank value hasn't been worked out yet, so page A's new PageRank value is based on inaccurate data and can't be accurate.


Step 2: Calculate page B's PageRank from the value of its inbound links


Page B now has a new PageRank value, but it can't be accurate because the calculation used the new PageRank value of the inbound link from page A, which is inaccurate.

It's a Catch-22 situation. We can't work out A's PageRank until we know B's PageRank, and we can't work out B's PageRank until we know A's PageRank.


Now that both pages have newly calculated PageRank values, can't we just run the calculations again to arrive at accurate values? No. We can run the calculations again using the new values and the results will be more accurate, but we will always be using inaccurate values for the calculations, so the results will always be inaccurate.


The problem is overcome by repeating the calculations many times. Each time produces slightly more accurate values. In fact, total accuracy can never be achieved because the calculations are always based on inaccurate values. 40 to 50 iterations are sufficient to reach a point where any further iterations wouldn't produce enough of a change to the values to matter. This is precisely what Google does at each update, and it's the reason why the updates take so long.
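Here is a minimal sketch of that iterative process in Python, using the published formula above with d = 0.85. The two-page graph is just the A/B example from the text, and the starting value is arbitrary, since each round of calculations refines it.

# Repeatedly apply PR(A) = (1 - d) + d * (PR(t1)/C(t1) + ... + PR(tn)/C(tn))
# until the values settle down.
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pr = {page: 0.0 for page in links}  # arbitrary starting guess
    for _ in range(iterations):
        new_pr = {}
        for page in links:
            # Sum the "share" voted by every page that links to this one.
            inbound = sum(pr[src] / len(targets)
                          for src, targets in links.items()
                          if page in targets)
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# The Catch-22 example: A links to B and B links to A.
print(pagerank({"A": ["B"], "B": ["A"]}))
# Both pages settle near 1.0; with more pages and links the values differ,
# and each further iteration changes them by less and less.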


One thing to bear in mind is that the results we get from the calculations are proportions. The figures must then be set against a scale (known only to Google) to arrive at each page's actual PageRank. Even so, we can use the calculations to channel the PageRank within a site around its pages so that certain pages receive a higher proportion of it than others.