


Wednesday, 22 April 2015

America’s Online Home Improvement Place™

Showroom Partners has been America's Online Home Improvement Place™ since 2007. As a trusted resource for home owners, Showroom Partners provides a wide array of fine building products that home owners can review and order to ensure that their home improvement, renovation, furnishing or building needs are met. Our products include full service installation and delivery options that make it even easier for you to shop for quality, name brand building products. Showroom Partners' interactive website features leading product manufacturers that provide premium fireplaces, custom shower doors, custom wardrobes and closets, bath products, garage doors, windows, kitchen finishes and insulation. We are the building partners that America's home owners can trust and rely on. Our product specialists are committed to providing the latest product updates and features, as well as the dedicated customer service quality we guarantee all our customers. Contact Showroom Partners at showroompartners@gmail.com anytime.

Wednesday, 5 March 2014

Search Engine Optimization: For Authors

Driving usage and readership is critically important to raise the
visibility of your research. Wiley Online Library is one of the most
highly visited scientific web sites, with over half of our traffic
originating directly from Google, Google Scholar and other search
engines. Wiley has a robust search engine optimization strategy and
we are actively engaged in ensuring that all of our research content
is visible and high ranking in the search results of Google and other
engines. One of the key factors in sustaining long-term usage of your research is search engine optimization (SEO). Authors can also play a crucial role in optimizing search results at the article level by following the tips below.
[Chart from authorservices.wiley.com – Wiley Online Library traffic sources: search engines 58%, referring sites 31%, direct links 11%]

Top Tips to Make Your Article Discoverable Online
1.  Make sure you have an SEO-friendly title for your article
The title needs to be descriptive and must incorporate a key phrase related to your topic. Put your keywords within the first 65 characters of the title.
2. Carefully craft your abstract using keywords, keywords, keywords
a.  Choose the appropriate keywords and phrases for your article. Think of a phrase of 2-4 words that a researcher might search on to find your article.
b.  Consider looking up specific keywords on Google Trends or the Google AdWords keyword tool to find out which search terms are popular.
c.  Repeat your keywords and phrases 3-4 times throughout the abstract in a natural, contextual way.
d.  BUT don't go overboard with repetition, as search engines may de-index your article as a result.
3. Provide at least five keywords or phrases in the keywords field
Include the keywords and phrases you repeated in your abstract. Provide additional relevant keywords
and synonyms for those keywords as they relate to your article. Keywords are not only important for SEO,
they are also used by abstracting and indexing services as a mechanism to tag research content.
4. Stay consistent
Refer to authors’ names and initials in a consistent manner throughout the paper and make sure you’re
referring to them in the same way they’ve been referred to in past online publications.
5. Use headings
Headings for the various sections of your article tip off search engines to the structure and content of your
article. Incorporate your keywords and phrases in these headings wherever it’s appropriate.
6. Cite your own, or your co-authors', previous publications
Cite your previous work as appropriate, because citations of your past work factor into how search engines rank your current and future work.

Example of Well-Optimized Abstract 

Ocean Acidification and Its Potential Effects on Marine Ecosystems
Keywords
ocean acidification; climate change; carbonate saturation state; seawater chemistry; marine ecosystems; anthropogenic CO₂
Abstract
Ocean acidification is rapidly changing the carbonate system of the world oceans. Past mass extinction 
events have been linked to ocean acidification, and the current rate of change in seawater chemistry is 
unprecedented. Evidence suggests that these changes will have significant consequences for marine 
taxa, particularly those that build skeletons, shells, and tests of biogenic calcium carbonate. Potential 
changes in species distributions and abundances could propagate through multiple trophic levels of 
marine food webs, though research into the long-term ecosystem impacts of ocean acidification is in its 
infancy. This review attempts to provide a general synthesis of known and/or hypothesized biological 
and ecosystem responses to increasing ocean acidification. Marine taxa covered in this review include 
tropical reef-building corals, cold-water corals, crustose coralline algae, Halimeda, benthic mollusks, 
echinoderms, coccolithophores, foraminifera, pteropods, seagrasses, jellyfishes, and fishes. The risk of 
irreversible ecosystem changes due to ocean acidification should enlighten the ongoing CO₂ emissions
debate and make it clear that the human dependence on fossil fuels must end quickly. Political will and 
significant large-scale investment in clean-energy technologies are essential if we are to avoid the most 
damaging effects of human-induced climate change, including ocean acidification.
Promoting Your Article after Publication Using the Internet and Social Media
Once your article is written and published, there are still a few more steps to take to ensure your article is 
discoverable and visible. The best way to do this is to inform everyone in your academic and social networks 
about it. The volume of in-bound links also plays a factor in search engine rankings.
Share/include your article on the following platforms (as applicable in your discipline):
- LinkedIn
- Facebook
- Twitter
- Your blog, or websites that you contribute to
- Your institution's repository
- Mendeley
- ResearchGate
- Your website
- Your academic institution's website

Monday, 3 March 2014

Will my competitor be penalised for unnatural links? Why doesn't Google just ignore bad links? Is link building dead?

Your guess is as good as mine. Sometimes they will, sometimes they won’t. You can
always tell Google about them or out them in Google forums. If you have the energy
to be bothered with that – perhaps focusing some of that energy on making your site a better
landing prospect for Google's customers is a more productive use of your time.
Why doesn’t Google just ignore bad links?
Where would the fun in that be? Google wants our focus on low quality backlinks
now, and so, it is. It’s in Google’s interest to keep us guessing at every stage of seo.
Is link building dead?
No – this is what seo (I use the term collectively) is all about. If Google didn’t do this
every now and again, ‘search engine optimisation’ wouldn’t exist. Opportunity will
exist as long as Google doesn’t do away with organic listings because they can’t be
trusted or produce a 'frustrating' user experience in themselves – and not until Google
convinces people of that.
One thing has been constant in Google since day 2. SPAM, or Sites Positioned
Above Me. I think it’s safe to say there will always be spam; some of
your competition will always use methods that break the rules and beat you down.
There will be ways to get around Google – at least, there always have been.
I can tell you I am auditing the backlink profiles of clients we work with  – and new
projects I’m invited to advise on. Those obviously manipulative backlinks aren’t going
to increase in quality over time, and if Google is true to its word,  it might just slap us
for them.
Matt said that there will be a large Penguin (“webspam algorithm update”) update in
2013 that he thinks will be one of the more talked about Google algorithm updates
this year. Google’s search quality team is working on a major update to the Penguin
algorithm, which Cutts called very significant. 

So how do we get natural links?

The simple answer is we’re all going to have to think harder and work harder to
get links from real sites. I think it’s fair to say you need to avoid links from websites
designed to give you a link. It's hard not to think Google will at some point take down
guest blogs and press release sites, much like the recent action they took on
advertorials.
I’d certainly:
- stay away from just about all ARTICLE SITES
- avoid most DIRECTORIES
- avoid most BLOG NETWORKS
- IGNORE LOW QUALITY SPAM EMAILS offering you links (or cheap seo services)
- be wary of ADVERTORIALS
- avoid LOW QUALITY GUEST POSTS
- avoid LOW QUALITY, OFF TOPIC SITEWIDE LINKS.
Have a think for a minute and work out if the article you are going to have a link on
will end up duplicated across many low quality sites, for a start.
NOTE – In my experience you do not need to remove every instance of a site-wide
link. NOT if they are on topic, and editorially given.

What Google says about building natural links

The best way to get other sites to create relevant links to yours is to create unique,
relevant content that can quickly gain popularity in the Internet community. The more
useful content you have, the greater the chances someone else will find that content
valuable to their readers and link to it. Before making any single decision, you should
ask yourself: Is this going to be beneficial for my page’s visitors? It is not only the
number of links you have pointing to your site that matters, but also the quality and
relevance of those links. Creating good content pays off: Links are usually editorial
votes given by choice, and the buzzing blogger community can be an excellent place
to generate interest.
Ironically, Google has ignored its own rules on many occasions with apparently little
long-term consequence. Big brands have been hit recently too, including (in the
UK) the BBC and INTERFLORA. Big brands certainly DO seem to be able to get
away with a LOT more than your average webmaster, and so these problems
are often short-lived, especially if they make the news.

Link Schemes: What Google says about link schemes

Your site’s ranking in Google search results is partly based on analysis of those sites
that link to you. The quantity, quality, and relevance of links influence your ranking.
The sites that link to you can provide context about the subject matter of your site,
and can indicate its quality and popularity. Any links intended to manipulate a site’s
ranking in Google search results may be considered part of a link scheme. This
includes any behaviour that manipulates links to your site, or outgoing links from
your site. Manipulating these links may affect the quality of our search results, and
as such is a violation of Google’s Webmaster Guidelines. The following are
examples of link schemes which can negatively impact a site’s ranking in search
results:
- Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a "free" product in exchange for them writing about it and including a link
- Excessive link exchanging ("Link to me and I'll link to you")
- Linking to web spammers or unrelated sites with the intent to manipulate PageRank
- Building partner pages exclusively for the sake of cross-linking
- Using automated programs or services to create links to your site
- Text advertisements that pass PageRank
- Links that are inserted into articles with little coherence
- Low-quality directory or bookmark site links
- Links embedded in widgets that are distributed across various sites
- Widely distributed links in the footers of various sites
- Forum comments with optimized links in the post or signature

Are you really penalised, or is Google just ignoring your links?

If you start with nothing, get top rankings in 3 months, and then end up with nothing –
are you really penalised? Or is Google just ignoring your links? If the 'penalty' is an
algorithmic shift, then by the very nature of it, getting good links (links Google has no
reason to believe are suspect) to your website should tip the balance in your favour
again.
Google can’t tell the difference between good seo and good spam. The payday loans
market is taking a pounding at the minute as some folk brute force Google’s
algorithms using basic protocols Google basically has to respect (for now at least).
If you see obviously spammy links to your site, and your rankings are in the toilet,
perhaps disavowing the links is an option. I’ve seen improvement, and heard of other
successes using the tool. For me, the jury is still out on whether you can actually use
the disavow tool as a new seo tool. 

Will your rankings come back?

This depends on what, if any, quality signals are left in your backlink profile and
what’s happening in your niche. If you have decent links, individual rankings can
come back, that is for sure. I’ve yet to see a site where total traffic levels have come
back to previous best positions. Sometimes there’s just better, more information rich
pages out there these days. Often there are a couple of low quality or
spammy sites between your site and number 1 in Google. Hey, there's always
Adwords.
But YES, I’ve seen rankings come back after a manual penalty. Sometimes better
than they were before. I’ve yet to see site-wide traffic levels return to normal in most
cases. 

Should I use the Disavow Tool?

This is an advanced feature and should only be used with caution. If used
incorrectly, this feature can potentially harm your site’s performance in Google’s
search results. We recommend that you disavow backlinks only if you believe you
have a considerable number of spammy, artificial, or low-quality links pointing to your
site, and if you are confident that the links are causing issues for you. In most cases,
Google can assess which links to trust without additional guidance, so most normal
or typical sites will not need to use this tool. – Google
Some might recommend pulling links down instead of using this tool from Google.
Lots of people have different angles. If you have a manual penalty, you’ll probably
also need to actually get some of these links physically removed, too. Yes that
means emailing them.
If you have a manual penalty, lots of links, and actually removing the low quality
links is going to be a hard task – then definitely, yes. I'm also proactively using it on sites that
are obviously algorithmically penalised for particular keywords or on links I expect
will cause a problem later on. One would expect penalties are based on algorithmic
detection on some level for some sites.
If you’ve ever attempted to manipulate Google, now’s the time to at least quantify the
risk attached to those links.
It's clear Google is better at identifying your low quality links. Google already knows
about your crap links. Google is very definitely ignoring some of your links. Google
has probably already penalised you in areas you are not aware of.
For instance, I've helped a few sites that got the unnatural links message that
were clearly algorithmically slapped a year before and never noticed it until it started
to hurt.
Using the disavow tool
To upload a list of links to disavow:
1.  Go to the disavow links tool page.
2.  Select your website.
3.  Click Disavow links.
4.  Click Choose file.
Google says:
It may take some time for Google to process the information you’ve uploaded. In
particular, this information will be incorporated into our index as we recrawl the web
and reprocess the pages that we see, which can take a number of weeks.
… and they are telling it like it is.
You really do need to wait for a few weeks (after you submit your disavow list) before
you submit a reinclusion request (if you have a manual penalty).

What are unnatural links? Do I need to remove bad links? Do I need to audit my backlinks?

Well, if you’ve been actively promoting your website, sit back for a moment and think
about all the links you managed to generate to your site because you DID NOT
come from a position of actually building a rich informative site – yes – all those links.
If you paid someone else, like a seo, to get you links: yes, those links (probably). If
you used cheap submission services that are not an outright scam: yes, those
links. Those easy-to-get links you got from sites that were also linking to your
competitors' websites? Yes, those links.
In short – if you are using unnatural links to get top positions and don’t deserve them
Google will nuke your site if it detects them. Google knows exactly which keywords
to hit you for to destroy your ability to rank. Sometimes keyword phrase by keyword
phrase, sometimes page by page – sometimes site by site!
I've seen sites penalised for their main keyword even when the main keyword in anchor text
backlinks from other sites was not the obvious problem.
Sensible opportunistic links still pass a manual review, it appears. Paid links and lots
of 'spam' still dominate lots of competitive niches – that is, white hat seo has little, if
any, chance of ranking in these serps.
The important thing to realise is there is a certain amount of risk now associated with
backlinks that point to any site and any page.
How Do I know if I have unnatural links?
If you honestly do not have a clue….
Google is telling a lot of people by email if you are subscribed in Google Webmaster
Tools. If you have unnatural links you need to worry about – the best place I think to
detect any issues is rather obviously Google Analytics.
There is a case to be made that Google is kind of forcing people into using Google
Webmaster Tools.
What happens to my site if Google detects unnatural links?
Sometimes you’ll get an email from Google:
Dear site owner or webmaster of http://www.example.com/, We’ve detected that
some of your site’s pages may be using techniques that are outside Google’s
Webmaster Guidelines. Specifically, look for possibly artificial or unnatural links
pointing to your site that could be intended to manipulate PageRank. Examples of
unnatural linking could include buying links to pass PageRank or participating in link
schemes. We encourage you to make changes to your site so that it meets our
quality guidelines. Once you’ve made these changes, please submit your site for
reconsideration in Google’s search results. If you find unnatural links to your site that
you are unable to control or remove, please provide the details in your
reconsideration request. If you have any questions about how to resolve this issue,
please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
Google is moving in various directions:
In less severe cases, we sometimes target specific spammy or artificial links created
as part of a link scheme and distrust only those links, rather than taking action on a
site’s overall ranking. The new messages make it clear that we are taking “targeted
action on the unnatural links instead of your site as a whole.”
Other times the indicators might be more subtle. You might not rank at all in Google
for something you used to rank for very well for. Your traffic might reduce month by
month. You might disappear overnight for valuable keywords associated with your
content. You might disappear for one keyword phrase.  You might be reviewed
manually. If you are actually penalised, you're going to have to clean your links up if
you want to restore your 'reputation' in Google. Penalties can last from 30 days to,
well, forever (if the penalty is a manual action).
Google appears to crawl a site slower under a penalty. Google caches changes to
your pages a lot less frequently, too, it appears, and new content seems to struggle a
bit more to actually get into Google. In some cases – you might not rank for your
brand name (like happened to Interflora a few weeks ago). In the very worst cases –
your site can disappear from Google.
When you get a penalty revoked, things start to get back to normal within a month or
two.
What can I do about unnatural links?
If you are a small business – you probably don’t want to start again with a new
domain. Do you want to use 301 redirects to postpone a Google slap? That option
works, at least for a while. The best option is to clean them up.
First, you’ll need to download your backlinks from Google.
Download links to your site
On the Webmaster Tools home page, click the site you want.
On the Dashboard, click Traffic, and then click Links to Your Site.
Under Who links the most, click More.
Click Download more sample links. If you click Download latest links, you’ll see
dates as well.
Note: When looking at the links to your site in Webmaster Tools, you may want to
verify both the www and the non-www version of your domain in your Webmaster
Tools account. To Google, these are entirely different sites. Take a look at the data
for both sites. More information

Which unnatural links am I supposed to worry about?

I think these can be summed up if you are ranking for money terms with a low quality
site and have:
- a high % of backlinks on low quality sites
- a high % of backlinks on duplicate articles
- a high % of links with duplicate anchor text
Basically the stuff that used to work so well for everyone and is mainly detectable by
Googlebot. Google doesn’t just ignore these links anymore if intent
to manipulate Google is easy to work out. Most low quality links are (probably) easy
to detect algorithmically.

Do I need to remove bad links?

We know that perhaps not every link can be cleaned up, but in order to deem a
reconsideration request as successful, we need to see a substantial good-faith effort
to remove the links, and this effort should result in a decrease in the number of bad
links that we see. GOOGLE
It kind of looks as though we're going to have to, especially if you receive a manual
action notice.

How To Remove Unnatural Links?

There are services popping up everywhere offering to remove unnatural links – I'll
blog about those later, as I have little experience with any of them. An seo needs to
be able to deal with this new problem in seo with the most basic of tools.
I've had success using simple methods:
- Removing pages that are the target of unnatural links
- Google Webmaster Tools
- Excel
- PageRank

Do I need to audit my backlinks? 

Most definitely. Google is fully expected to make a lot of noise about unnatural links 
this year, and that always involves website rankings being nuked with traffic 
decimated, and lots of ‘collateral’ damage.
Whether or not you eventually use the Disavow Tool in Google, you should be
looking at your backlink profile to see what various links are doing to your rankings.
You should at least know who links to you, and the risk to high rankings
now attached to those links.
Download your links from Google Webmaster Tools and pop them into Excel. I assume
you have SEO Tools for Excel (I also have URL Tools installed)?
Get the root domain of each link (I've used URL Tools for this for a while), and check
its toolbar PageRank with SEO Tools for Excel. Most of those links with zero or -1
PageRank on the domain are worth looking at. Do the same for the actual page your
links are on (on domains with PR). Similarly, if you have lots of links and all your
links are on pages with -1 PageRank, that's probably not a good indicator of reputation.
If you have a LOT of links (tens of thousands), filtering in Excel for only unique
domains can speed up this process.
I normally get the PAGE TITLE of the linking page too (using SEO Tools for Excel), 
so I can easily detect duplicate articles on lower quality sites, and sites not yet 
affected by a PageRank drop.
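The resulting worksheet looks something like this – a purely illustrative layout, with made-up URLs and values:

Link URL                            Root domain            Domain PR   Page PR   Page title
http://spamdir-example.com/a.html   spamdir-example.com    -1          -1        Free Links Directory
http://realblog-example.org/post/   realblog-example.org   4           2         A Real Post About Widgets

Duplicate page titles across different domains are usually the duplicated articles mentioned above.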
Of course, there are some false positives. PageRank can be glitchy, or flat out 
misleading. So a human eye is often needed to reduce these. If you are using this 
method, you can run it again in the future and see if sites you identified as low quality 
by PageRank have changed, and perhaps modify your disavow list. 
Using this method I’ve successfully identified lower quality sites fairly easily. To be 
fair, I know a crap link. Ultimately, if you have a lot of links, you can never be too 
sure which particular links are ‘toxic’. It may very well be the volume of a specific 
tactic used that gets your site in trouble – and not one solitary link.
If you have a load of low quality directory submissions in your backlink profile, or 
have taken part in low quality article marketing recently, the next Google update 
might just be targeted at you (if it hasn’t already had an impact on your rankings).
Once you’ve examined your links and identified low quality links, you can then 
submit a list of links to Google in a simple text file called disavow.txt.
What is the disavow tool?
A tool provided by Google in Google Webmaster Tools. You can specify which
domains you want to disavow the links from (you can also specify individual pages).
Generally speaking, if disavowing a link, you are better off disavowing the entire
domain (if it is a spammy domain).
The disavow.txt is just a simple text file with the following list of domains:
domain:google.com
domain:plus.google.com
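A slightly fuller sketch of a disavow.txt, with made-up domains and URLs – lines starting with # are comments the tool ignores, and whole domains can be mixed with individual pages:

# spammy directories found in the link audit
domain:spammydirectory-example.com
domain:lowqualityarticles-example.net
# a single bad page on an otherwise fine domain
http://blog-example.org/paid-links-page.html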
The way it appears to work is you tell Google which links to ignore when they are 
calculating whether or not to rank you high or boot your rankings in the balls.
If you’ve done as much work as you can to remove spammy or low-quality links from 
the web, and are unable to make further progress on getting the links taken down, 
you can disavow the remaining links. In other words, you can ask Google not to take 
certain links into account when assessing your site.

What Not To Do In SEO?

Google has now released a search engine optimisation starter guide for
webmasters, which they use internally:
Although this guide won’t tell you any secrets that’ll automatically rank your
site first for queries in Google (sorry!), following the best practices outlined
below will make it easier for search engines to both crawl and index your
content. Google
Still worth a read even if it is fairly basic, generally accepted (in the
industry) best practice search engine optimisation for your site.
Here's a list of what Google tells you to avoid in the document:
1.  choosing a title that has no relation to the content on the page
2.  using default or vague titles like "Untitled" or "New Page 1"
3.  using a single title tag across all of your site’s pages or a large group of pages
4.  using extremely lengthy titles that are unhelpful to users
5.  stuffing unneeded keywords in your title tags
6.  writing a description meta tag that has no relation to the content on the page
7.  using generic descriptions like “This is a webpage” or “Page about baseball
cards”
8.  filling the description with only keywords
9.  copy and pasting the entire content of the document into the description meta tag
10.  using a single description meta tag across all of your site’s pages or a large
group of pages
11.  using lengthy URLs with unnecessary parameters and session IDs
12.  choosing generic page names like “page1.html”
13.  using excessive keywords like “baseball-cards-baseball-cards-baseball-cards.htm”
14.  having deep nesting of subdirectories like “…/dir1/dir2/dir3/dir4/dir5/dir6/
page.html”
15.  using directory names that have no relation to the content in them
16.  having pages from subdomains and the root directory (e.g. “domain.com/
page.htm” and “sub.domain.com/page.htm”) access the same content
17.  mixing www. and non-www. versions of URLs in your internal linking structure
18.  using odd capitalization of URLs (many users expect lower-case URLs and
remember them better)
19.  creating complex webs of navigation links, e.g. linking every page on your site
to every other page
20.  going overboard with slicing and dicing your content (it takes twenty clicks to get
to deep content)
21.  having a navigation based entirely on drop-down menus, images, or animations
(many, but not all, search engines can discover such links on a site, but if a user
can reach all pages on a site via normal text links, this will improve the
accessibility of your site)
22.  letting your HTML sitemap page become out of date with broken links
23.  creating an HTML sitemap that simply lists pages without organizing them, for
example by subject (Edit Shaun – Safe to say especially for larger sites)
24.  allowing your 404 pages to be indexed in search engines (make sure that your
webserver is configured to give a 404 HTTP status code when non-existent
pages are requested)
25.  providing only a vague message like "Not found", "404", or no 404 page at all
26.  using a design for your 404 pages that isn’t consistent with the rest of your site
27.  writing sloppy text with many spelling and grammatical mistakes
28.  embedding text in images for textual content (users may want to copy and
paste the text and search engines can’t read it)
29.  dumping large amounts of text on varying topics onto a page without paragraph,
subheading, or layout separation
30.  rehashing (or even copying) existing content that will bring little extra value to
users
Pretty simple stuff, but sometimes it's the simple seo that often gets overlooked. Of
course, you put the above together with Google's Guidelines for webmasters.
Search engine optimization is often about making small modifications to parts
of your website. When viewed individually, these changes might seem like
incremental improvements, but when combined with other optimizations, they
could have a noticeable impact on your site’s user experience and
performance in organic search results.
Don’t make simple mistakes…..
1.  Avoid duplicating content on your site found on other sites. Yes, Google likes
content, but it *usually* needs to be well linked to, unique and original to get you
to the top!
2.  Don’t hide text on your website. Google may eventually remove you from the
SERPS (search engine results pages).
3.  Don’t buy 1000 links and think “that will get me to the top!” Google likes natural
link growth and often frowns on mass link buying.
4.  Don’t get everybody to link to you using the same “anchor text” or link phrase.
This could flag you as an seo.
5.  Don't chase Google PR by chasing 100s of links. Think quality of links... not
quantity.
6.  Don’t buy many keyword rich domains, fill them with similar content and link them
to your site, no matter what your seo company says. This is lazy seo and could
see you ignored or, worse, banned from Google. It might have worked yesterday
but it sure does not work today!
7.  Do not constantly change your site pages names or site navigation. This just
screws you up in any search engine.
8.  Do not build a site with a JavaScript navigation that Google, Yahoo and MSN
cannot crawl.
9.  Do not link to everybody who asks you for reciprocal links. Only link out to quality
sites you feel can be trusted.
10.  Do not submit your website to Google via submission tools. Get a link on a
trusted site and you will get into Google in a week or less.

How To Implement Google Authorship Markup – What is Rel Author & Rel Me?

Google is piloting the display of author information in search results to help
users discover great content. Google.
We’ve implemented Google Authorship Markup on the Hobo blog so my
profile pic appears in Google search snippets.
This helps draw attention to your search listing in Google, and may increase
click-through rate for your listing. Many expect Authorship reputation to play a
role in rankings in the near future. Google has released videos to help you get
your face in Google serps.  If you have a Google profile (or Google Plus) you
can implement these so that you can get a more eye-catching serp snippet in
Google results.
http://www.hobo-web.co.uk/how-to-implement-google-authorship-markup/
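As a rough sketch of the basic pattern – the profile ID here is a placeholder, use your own Google+ profile URL – the markup is either a visible byline link:

<a href="https://plus.google.com/000000000000000000000?rel=author">Author Name</a>

or a link element in the head of the page:

<link rel="author" href="https://plus.google.com/000000000000000000000"/>

If your byline links to an author page on your own site instead, that author page can link on to the Google+ profile with rel="me" to complete the chain. The "Contributor to" section of your Google+ profile also needs a link back to the site.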
Rich Snippets
Rich Snippets in Google enhance your search listing in Google search engine
results pages. You can include reviews of your products or services, for
instance. Rich Snippets help draw attention to your listing in serps. You’ve no
doubt seen yellow stars in Google natural results listings, for instance.
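As a minimal sketch of the sort of markup behind those stars – schema.org microdata with a made-up product and rating values:

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>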

Canonical Tag – Canonical Link Element Best Practice

Google SEO – Matt Cutts from Google shares tips on the new rel=”canonical”
tag (more accurately – the canonical link element) that the 3 top search
engines now support. Google, Yahoo!, and Microsoft have all agreed to work
together in a “joint effort to help reduce duplicate content for larger, more
complex sites and the result is the new Canonical Tag”.
Example Canonical Tag From Google Webmaster Central blog:
<link rel="canonical"
href="http://www.example.com/product.php?item=swedish-fish" />
You can put this link tag in the head section of the duplicate content urls, if
you think you need it.
I add a self-referring canonical link element as standard these days – to ANY
web page.
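A self-referring canonical simply means the page names its own URL as the preferred version; for a hypothetical page at http://www.example.com/page.html, the head would contain:

<link rel="canonical" href="http://www.example.com/page.html" />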
Is rel=”canonical” a hint or a directive?
It’s a hint that we honour strongly. We’ll take your preference into account, in
conjunction with other signals, when calculating the most relevant page to
display in search results.
Can I use a relative path to specify the canonical, such as <link rel="canonical" href="product.php?item=swedish-fish" />?
Yes, relative paths are recognized as expected with the <link> tag. Also, if
you include a <base> link in your document, relative paths will resolve
according to the base URL.
Is it okay if the canonical is not an exact duplicate of the content?
We allow slight differences, e.g., in the sort order of a table of products. We
also recognize that we may crawl the canonical and the duplicate pages at
different points in time, so we may occasionally see different versions of your
content. All of that is okay with us.
What if the rel=”canonical” returns a 404?
We’ll continue to index your content and use a heuristic to find a canonical,
but we recommend that you specify existent URLs as canonicals.
What if the rel=”canonical” hasn’t yet been indexed?
Like all public content on the web, we strive to discover and crawl a
designated canonical URL quickly. As soon as we index it, we’ll immediately
reconsider the rel=”canonical” hint.
Can rel=”canonical” be a redirect?
Yes, you can specify a URL that redirects as a canonical URL. Google will
then process the redirect as usual and try to index it.
What if I have contradictory rel=”canonical” designations?
Our algorithm is lenient: We can follow canonical chains, but we strongly
recommend that you update links to point to a single canonical page to ensure
optimal canonicalization results.

Does Only The First Link Count In Google?

One of the more interesting discussions in the seo community of late has been
trying to determine which links Google counts as links on pages on your site.
Some say the link Google finds higher in the code is the link Google will
'count' if there are two links on a page going to the same page.
Update – I tested this recently with the post Google Counts The First Internal
Link.
For example (and I am talking internal links here) – if you took a page and I placed
two links on it, both going to the same page (OK – hardly scientific, but you
should get the idea), will Google only 'count' the first link? Or will it read the
anchor text of both links, and give my page the benefit of the text in both,
especially if the anchor text is different in each? Will Google ignore the
second link?
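To make the test concrete, this is the sort of page I mean – two internal links to the same URL with different anchor text (the path is just an illustration):

<a href="/target-page.html">blue widgets</a>
... page copy ...
<a href="/target-page.html">cheap blue widgets</a>

The question is whether the target page gets credit for both phrases, or only for whichever link appears first in the code.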
What is interesting to me is that knowing this leaves you with a question. If
your navigation array has your main pages linked to in it, perhaps your links in
content are being ignored, or at least, not valued.
I think links in body text are invaluable. Does that mean placing the navigation
below the copy to get a wide and varied internal anchor text to a page?
Perhaps.

Do I Need A Google XML Sitemap For My Website?

No. You do not need an XML Sitemap to optimise a site for Google, again, if
you have a sensible navigation system. But it's wise to have one.
An XML Sitemap is a method by which you can help a search engine, including
Google, find & index all the pages on your site. It is sometimes useful for very
large sites, perhaps if the content changes often, but still not necessary if you
have a good navigation system.
1.  Make sure all your pages link to at least one other in your site
2.  Link to your important pages often, with varying anchor text, in the navigation
and in page text content
Remember Google needs links to find all the pages on your site.
Sitemaps are an easy way for webmasters to inform search engines about
pages on their sites that are available for crawling. In its simplest form, a
Sitemap is an XML file that lists URLs for a site along with additional metadata
about each URL (when it was last updated, how often it usually changes,
and how important it is, relative to other URLs in the site) so that search
engines can more intelligently crawl the site.
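In its simplest form that looks something like this – a sketch with a made-up URL and dates, following the sitemaps.org protocol described in the quote above:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>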
I don’t use xml sitemaps that much at all, as I am confident I can get all my
pages indexed via links on the website and via RSS feed if I am blogging. I
would however suggest you use a ‘website’ sitemap – a list of the
important pages on your site.