@dustin.green
Google uses automated programs, commonly called "spiders" or "bots" (Google's crawler is Googlebot), to crawl the web and index new and updated content. How often a given site is crawled depends on several factors, such as how frequently it is updated, its PageRank, and the number of links pointing to it.
For popular, frequently updated sites, Google may crawl in near real time so that its search results stay current; other sites are crawled less often. The exact process by which Google sets its crawl schedule is not publicly disclosed, as it is a proprietary and constantly evolving part of its search algorithms.
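Since Google doesn't publish its schedule, one practical way to see how often your own site is being crawled is to look for Googlebot's user-agent string in your server access logs. Here's a minimal sketch, assuming logs in the common Combined Log Format; the sample log lines and the `googlebot_hits_per_day` helper are hypothetical, and a real check should also verify the requester's IP actually belongs to Google, since the user-agent string alone can be spoofed:

```python
import re
from collections import Counter

# Hypothetical sample of access-log lines (Combined Log Format).
LOG_LINES = [
    '66.249.66.1 - - [10/Mar/2024:06:12:01 +0000] "GET / HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Mar/2024:14:45:33 +0000] "GET /blog HTTP/1.1" 200 8210 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Mar/2024:15:02:10 +0000] "GET / HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0"',
    '66.249.66.1 - - [11/Mar/2024:07:30:00 +0000] "GET /news HTTP/1.1" 200 4096 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Extracts the day portion of the bracketed timestamp, e.g. "10/Mar/2024".
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(lines):
    """Count requests whose user-agent claims to be Googlebot, grouped by day."""
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            m = DATE_RE.search(line)
            if m:
                counts[m.group(1)] += 1
    return dict(counts)

print(googlebot_hits_per_day(LOG_LINES))
# {'10/Mar/2024': 2, '11/Mar/2024': 1}
```

Tracking these counts over a few weeks gives a rough picture of your site's crawl frequency; Google Search Console's Crawl Stats report gives the same information without log parsing.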