drupal - Can I tell sitecrawlers to visit a certain page?


I have a Drupal website built around a document database. By design, you find these documents by searching the site. But I want all of the documents to be indexed by Googlebot and the other crawlers, so I was wondering: what would happen if I made a page that lists all the documents? Would the robots see that index page, follow its links, and index every document page?

Is this possible, or is there a better way to do it?

Maybe a sitemap?

Google introduced Google Sitemaps so that web developers can publish lists of links from their sites. The basic premise is that some sites have a large number of dynamic pages that are reachable only through forms and user input. Sitemap files can be used to tell a web crawler how to find such pages. Google, Bing, Yahoo!, and Ask now jointly support the Sitemap protocol.
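As a rough sketch, a minimal sitemap following the sitemaps.org protocol might look like this (the example.com URLs and dates are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical document URLs for illustration; list your real pages here. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/documents/1</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>https://example.com/documents/2</loc>
    <lastmod>2024-02-03</lastmod>
  </url>
</urlset>
```

You can point crawlers at the file with a `Sitemap: https://example.com/sitemap.xml` line in robots.txt, or submit it through each search engine's webmaster tools. For Drupal in particular, contributed modules exist (such as XML sitemap) that can generate a sitemap of your content automatically, which may be simpler than maintaining an index page by hand.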

