White Hat way of introducing a new site

About a month ago I set up a new site: guides in Krakow and Poland.

Before submission I checked the pages very carefully, as indexing a new site takes a huge amount of time, and any mistake would literally last for weeks, since Google may visit the site only once a week and the other two engines even less often. The pages were checked with the W3C validators for valid XHTML and CSS code.
Pages I was still not certain about were given noindex,nofollow meta tags, and there were no empty links to non-existent pages.
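For reference, the tag in question sits in the head of each uncertain page and looks roughly like this (a sketch of the standard robots meta tag, not copied from the actual site):

    <meta name="robots" content="noindex,nofollow" />

With noindex the page stays out of the index, and with nofollow its links pass nothing on, so a half-finished page cannot hurt the rest of the site.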

Another check was made for duplicated content, to make sure that the text is original (different from anything already published) and will be indexed by the search engines.
I am not certain about the so-called canonicalization effect, but two other measures were put in place. All links to the home page refer to '/', not to 'index.php'. I also used a 301 redirect from cracow.name to www.cracow.name, as we decided the www form looks better. Links on the pages use the relative '/', not the full URL.
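For illustration, assuming the site runs on Apache with mod_rewrite (I haven't confirmed the actual server setup), the non-www to www 301 redirect could be done in .htaccess roughly like this:

    # send any request for the bare domain to the www host, permanently (301)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^cracow\.name$ [NC]
    RewriteRule ^(.*)$ http://www.cracow.name/$1 [R=301,L]

The server-side 301 is the important part: it tells the engines which hostname is the canonical one, so the two variants are not indexed as duplicates.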

I didn't put in any outbound links (except weatherunderground.com). They are going to be added soon, but slowly, one by one, to see the influence of each link and to avoid a 'bad neighbourhood', as you don't know exactly what is 'bad' for big G. Of course there could be some links, but for now there aren't any, especially as I need people to link to me, not the other way round.

Next, the robot-friendly files robots.txt and sitemap.xml were created. They are not strictly necessary, but sniffing bots get suspicious when they don't find either of these files. Later on, the files are a 'must'.
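A minimal version of both files could look like this (just a sketch; the real files on the site may list more rules and URLs, or sit at different paths):

    # robots.txt - allow everything, point bots at the sitemap
    User-agent: *
    Disallow:
    Sitemap: http://www.cracow.name/sitemap.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.cracow.name/</loc>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>

An empty Disallow line blocks nothing, and the Sitemap line lets the bots discover the sitemap without a separate submission.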

I've submitted the site to the three engines, GYM (Google, Yahoo, MSN), and kept waiting.

1 Response to "White Hat way of introducing a new site"

Rumburak Says:
24 July 2008 at 05:44

Robots.txt is absolutely necessary.
I will write more about it soon.
