A sitemap does not guarantee indexing, but it is the most reliable way to tell search engines which pages you consider canonical and how often they change. For sites larger than a few hundred pages, it dramatically improves crawl efficiency.
The sitemap should contain only canonical, indexable URLs that return HTTP 200. Including redirected, 404, or `noindex` URLs wastes crawl budget and sends conflicting signals to crawlers. Most modern frameworks generate sitemaps automatically; verify that the output reflects what you actually want indexed.
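The filtering step can be sketched as follows. This is a minimal illustration, not any framework's API: `Page` and its fields are hypothetical stand-ins for whatever crawl or CMS data you have available.

```python
from dataclasses import dataclass

@dataclass
class Page:
    # Hypothetical record of what a crawler would observe for each URL.
    url: str
    status: int         # final HTTP status code
    is_canonical: bool  # URL matches its own rel="canonical"
    noindex: bool       # page carries a noindex directive

def sitemap_urls(pages: list[Page]) -> list[str]:
    """Keep only canonical, indexable URLs that return HTTP 200."""
    return [
        p.url
        for p in pages
        if p.status == 200 and p.is_canonical and not p.noindex
    ]

pages = [
    Page("https://example.com/", 200, True, False),
    Page("https://example.com/old", 301, False, False),   # redirect: exclude
    Page("https://example.com/gone", 404, False, False),  # 404: exclude
    Page("https://example.com/drafts", 200, True, True),  # noindex: exclude
]
print(sitemap_urls(pages))  # → ['https://example.com/']
```

Running a check like this against the generated sitemap catches the common failure mode where a framework dumps every route, including redirects and drafts, into the file.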
Sitemaps are referenced in `robots.txt` (`Sitemap: https://example.com/sitemap.xml`) and submitted directly in Google Search Console and Bing Webmaster Tools. For very large sites (>50,000 URLs), split into multiple sitemaps under a sitemap index file.
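A sitemap index is itself an XML file that lists the child sitemaps. The splitting logic can be sketched as below, assuming a flat list of URLs and the protocol's 50,000-URL-per-file limit; the `sitemap-N.xml` naming scheme is illustrative, not prescribed.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
LIMIT = 50_000  # per-sitemap URL limit from the sitemaps.org protocol

def build_sitemaps(urls: list[str], base: str) -> dict[str, str]:
    """Split `urls` into chunks of at most LIMIT and return a mapping of
    filename -> XML, including a sitemap index that references each chunk."""
    files: dict[str, str] = {}
    chunks = [urls[i:i + LIMIT] for i in range(0, len(urls), LIMIT)]
    index = ET.Element("sitemapindex", xmlns=NS)
    for n, chunk in enumerate(chunks, start=1):
        name = f"sitemap-{n}.xml"  # illustrative naming scheme
        urlset = ET.Element("urlset", xmlns=NS)
        for u in chunk:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        files[name] = ET.tostring(urlset, encoding="unicode")
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = base + name
    files["sitemap.xml"] = ET.tostring(index, encoding="unicode")
    return files

files = build_sitemaps(
    [f"https://example.com/p/{i}" for i in range(120_000)],
    "https://example.com/",
)
print(sorted(files))  # three 50k/50k/20k chunks plus the index
```

The index file (`sitemap.xml` here) is the single URL you reference in `robots.txt` and submit to Search Console; the child sitemaps are discovered through it.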

