A search engine needs to fetch information about your site, and that is why crawlers keep visiting it.
So how do you keep your site crawler friendly?
The crawler comes to one of your pages first and picks up its content: the title, meta tags, body text, and the links on that page. It then starts visiting the pages that the current page links to, and the crawl process continues from there.
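The extraction step above can be sketched in a few lines of Python. This is a simplified illustration, not how any real search engine works: it uses the standard library's `HTMLParser` to pull out the title and links from one page (a real crawler would also fetch each discovered link and repeat the process).

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the title and outgoing links from one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            # Each <a href="..."> is a page the crawler can visit next.
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Parse a tiny example page instead of fetching one over the network.
parser = LinkExtractor()
parser.feed('<html><head><title>Home</title></head>'
            '<body><a href="/about.html">About</a></body></html>')
print(parser.title)   # title of the page
print(parser.links)   # links the crawler would follow next
```

A crawler repeats this for every link it finds, which is why well-linked pages get discovered and isolated pages do not.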
Making a site crawlable (making it crawler friendly) is the basic thing you need for proper indexing of the website and, subsequently, for ranking.
What do you need to do?
Use of Sitemap
A sitemap is a page on your website that links to all the other pages on it. This makes crawling easier: once a search engine crawler reaches the sitemap, it can get to any page that is directly linked from it.
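As a rough sketch (the file name and page names here are just examples), a simple HTML sitemap page is nothing more than a list of links:

```html
<!-- sitemap.html: a hand-made page listing every page on the site -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/about.html">About</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```

Link to this page from your home page so the crawler can find it easily.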
Now, what if you want to restrict crawlers from accessing some of your pages?
You can do this with a file called ‘robots.txt’ placed in the root of your website. My next post will cover how to create and use this robots instruction file.
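As a quick preview (the directory name is just an example), a minimal robots.txt telling all crawlers to stay out of one folder looks like this:

```txt
# robots.txt, served from the site root, e.g. example.com/robots.txt
User-agent: *
Disallow: /private/
```

`User-agent: *` addresses every crawler, and each `Disallow` line names a path they are asked not to visit.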