My main stumbling block is that I have a lot of pages generated from different databases, plus a number of static pages. I thought building a sitemap that covered all of them would take a long time.
Imagine my relief when I read deeper into Google's page and found this quote: "This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program."
If you've got a lot of pages that are four or five clicks away from your home page, Google's less likely to find those pages and get around to crawling them. Their logic is that if they're that buried, they must not be that important. So those pages will be found more slowly and crawled less often.
What a sitemap does is sort of let those pages say "hey, here I am, and I'm more important than I seem" to the Google spider. So for your pages that are one or two links away from the home page, or that get direct links from other sites, getting them in the sitemap may be of middling value, because the spider will find them easily and regularly anyway. But for the pages that live deeper in your directory structure, a sitemap may be a great tool for getting them a little daylight.
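For reference, the sitemap file itself is just a plain XML list of URLs, per Google's documentation, and the optional priority element is literally how a page raises its hand. Something like this (the URL and dates here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/archives/buried-page.html</loc>
    <lastmod>2005-06-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Everything but the loc element is optional, which is part of why a quick and dirty version is so feasible.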
For me, that was a big relief. The scattered static pages, despite being harder to round up and put in a sitemap, are generally easier for Google to find. The pages that are harder for Google to find are generally easier to generate a sitemap file for. So, if I want to generate a quick and dirty sitemap with the more buried content, I can do it without harming the SEO on the content I leave out.
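Generating that file for the database-driven pages is mostly a matter of walking the tables that already know about those pages. Here's a rough sketch in Python of what I mean; the database file, table, and column names are invented for illustration, and you'd swap in whatever actually holds your page records:

```python
import sqlite3
from xml.sax.saxutils import escape

# Hypothetical database of page records; the "pages" table and its
# columns are stand-ins for whatever your site actually stores.
conn = sqlite3.connect("site.db")
rows = conn.execute("SELECT url, last_modified FROM pages")

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
    for url, last_modified in rows:
        # Escape the URL for XML; lastmod should be a W3C date (YYYY-MM-DD).
        f.write('  <url>\n')
        f.write('    <loc>%s</loc>\n' % escape(url))
        f.write('    <lastmod>%s</lastmod>\n' % last_modified)
        f.write('  </url>\n')
    f.write('</urlset>\n')

conn.close()
```

One query per database and you've got the buried content covered, while the static pages keep getting found the old-fashioned way.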