In a robust web-based system you would expect to have multiple servers serving content. In Sitecore, this typically means a content management server plus one or more content delivery servers.
Sitecore also offers a ‘Sitemap XML module’, which is pretty neat. When you publish content it generates an updated Sitemap.xml file – essentially a listing of pages that a web crawler like Googlebot can use to crawl and index a site. It can also ping the major search engines, telling them that the sitemap has been updated and that they should recrawl it at their leisure.
However, things get trickier when you have content delivery servers. When content is published, you will want the Sitemap.xml file to be regenerated on ALL servers. This is not what happens by default – only the publishing server’s sitemap is updated.
You can make Sitecore do this, though. In your SitemapXML.config include file, add the following handler to the ‘events’ section. You’ll want to do this on all servers:
<handler type="Sitecore.Modules.SitemapXML.SitemapHandler, Sitemap.XML" method="RefreshSitemap" />
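For context, that handler needs to sit inside an event definition so it fires when a remote publish completes. A sketch of what the relevant part of SitemapXML.config might look like – assuming the standard Sitecore publish:end:remote event name, which is what delivery servers receive when the management server publishes:

```xml
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <events>
      <!-- Fires on delivery servers when a remote (CM-initiated) publish completes -->
      <event name="publish:end:remote">
        <handler type="Sitecore.Modules.SitemapXML.SitemapHandler, Sitemap.XML" method="RefreshSitemap" />
      </event>
    </events>
  </sitecore>
</configuration>
```

The module already wires the same handler into publish:end for the publishing server itself; this addition mirrors that wiring for the remote event so delivery servers regenerate their own copy of the file.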
With this in place, a remote publish triggers a Sitemap.xml refresh on the delivery servers too. Job done.
Edit: I’m not the first to discover this: https://www.captechconsulting.com/blogs/sitecore-sitemap-shared-source-module-configuing-for-scaled-environments