Lock up your web forms: Google has started indexing the "invisible web"!

Webmasters and web marketing managers should be sure to read a recent post in the Google Webmaster Central blog. Google is beginning to submit web forms in an effort to find additional content that resides behind them.

It's part of Google's effort to index the "invisible web" -- pages that aren't currently being spidered by the Googlebot. Until now, when the Google spider hit a page that required visitors to fill out a form to continue, it would stop there. But now Google says that in some situations, the spider will attempt to submit the form so it can find out what's on the other side.

Is this good or bad? I'd say it's more good than bad. It's good for web searchers who use Google. And it can be good for webmasters and marketers, as long as you're aware of Google's new spidering policy and you design your web forms with the Googlebot's new capabilities in mind.

Take a typical B2B landing page. It's a single page with an offer -- let's say "Download our latest white paper on widgets!" But to get the white paper PDF, the visitor must fill out a form. Under the old rules, Google couldn't reach that white paper because it was housed behind the form.

But with this new initiative, Google might try to submit that form and reach the white paper. Once there, it would spider the document (Google can, of course, index PDFs), and the white paper might appear in search results. So if someone searches for "widgets" and your white paper is relevant enough, it could rank high in the results -- and people could read it, thanks to Google, without ever filling out the form!

So if you have critical pieces of content, like white papers, that you don't want to appear in Google searches, make sure you exclude those form pages -- and the content behind them -- in your robots.txt file. (A simple Google search can tell you how to edit your site's robots.txt file.)
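As a rough illustration, here's what such an exclusion might look like in robots.txt. The paths below are hypothetical -- substitute the actual URLs of your landing page and gated content:

```text
# Hypothetical paths -- replace with your site's real URLs.
User-agent: *
# Keep crawlers off the landing/form page itself:
Disallow: /landing/widget-whitepaper.html
# And off the gated content, in case a form does get submitted:
Disallow: /downloads/whitepapers/
```

Note that robots.txt rules are case-sensitive prefix matches, so a path ending in a slash blocks everything under that directory.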
