How to harness the Googlebot

n64man120

2[H]4U
Joined
Jan 11, 2004
Messages
3,498
Does anyone know of a way to selectively choose which sub-portions of a page get indexed by Googlebot, and then choose which pages are given priority in Google's results? Or at least a way to influence this? I've already got a complete sitemap.xml set up, and Google shows no errors on it.
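For reference, my sitemap entries look roughly like the sketch below (the URLs are placeholders, not my real ones). As far as I know, the sitemap protocol's <priority> tag is only a relative hint within your own site, and Google treats it as a suggestion at best:

[CODE]
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical URLs; <priority> is a relative hint (0.0 to 1.0)
       among the pages of your own site, not an absolute ranking signal -->
  <url>
    <loc>http://example.com/videos/baseball-highlights</loc>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://example.com/</loc>
    <priority>0.5</priority>
  </url>
</urlset>
[/CODE]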

I run a college video website, and here is the current scenario:

Search for "baseball" within my site using Google

Result 1: Homepage, which happens to have the baseball article on it since it's recent
Result 2: Page of sports-related videos that includes the baseball one
Result 3: Page with a cooking video, indexed for baseball because the right-hand margin lists baseball as a recently added video on the site
Result 4: The actual baseball video we're looking for

So I don't want to discourage hits to the homepage, but at the same time I want users to be able to find what they're looking for as quickly as possible. Any ideas on how to bump Result 4 up the rankings, and keep Result 3 from appearing, or at least push it to the bottom?
 
The simplest way (assuming your site is dynamic - PHP/ASP/Ruby, etc.) is to interrogate the user agent in the HTTP headers and only display certain portions of the page when you see a search-engine bot.
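Something like this minimal PHP sketch, for example. The bot names and the sidebar include are placeholders; check your access logs for the user-agent strings that actually hit your site:

[CODE]
<?php
// Rough sketch: detect common search-engine bots via the User-Agent header.
// The substrings below (Googlebot, bingbot, Yahoo's Slurp) are assumptions;
// verify against your own logs before relying on them.
function is_search_bot() {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    foreach (array('Googlebot', 'bingbot', 'Slurp') as $bot) {
        if (stripos($ua, $bot) !== false) {
            return true;
        }
    }
    return false;
}

// Only render the "recently added" sidebar for human visitors, so crawlers
// don't index unrelated video titles on every page.
if (!is_search_bot()) {
    include 'sidebar_recent_videos.php'; // hypothetical include
}
?>
[/CODE]

One design note: only hiding boilerplate like the sidebar, rather than changing the main content, keeps what the bot sees essentially the same page a user sees.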
 
Interesting thought; I don't see why that wouldn't work. Thanks!

How static is Googlebot's user agent, though? I.e., has it reported the same user agent string for years, or does it change often enough to require update maintenance?
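It has changed over the years (at some point it picked up the Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) form), so substring matching on "Googlebot" is about as stable as you can get. If you want something that doesn't depend on the string at all, Google documents a double reverse-DNS check: resolve the requesting IP to a hostname, make sure it falls under googlebot.com or google.com, then resolve that hostname forward again and confirm it maps back to the same IP. A rough PHP sketch, with the same caveats as above:

[CODE]
<?php
// Sketch: verify that a request claiming to be Googlebot really came from
// Google, using the double reverse-DNS check Google describes.
function is_verified_googlebot($ip) {
    $host = gethostbyaddr($ip);            // reverse lookup: IP -> hostname
    if ($host === false || $host === $ip) {
        return false;                      // no PTR record for this IP
    }
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false;                      // not a Google-owned hostname
    }
    // Forward-confirm: the hostname must resolve back to the original IP,
    // otherwise the PTR record could be spoofed.
    return gethostbyname($host) === $ip;
}

var_dump(is_verified_googlebot($_SERVER['REMOTE_ADDR']));
?>
[/CODE]

DNS lookups on every request are slow, so in practice you'd cache the verdict per IP.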
 