Webmasters and content providers started
optimizing websites for
search engines in the 1990s, as the first search engines were indexing the early Web. Originally, webmasters needed only to submit the address of a page, or URL, to the various search engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight assigned to specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
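The crawl-index-schedule loop described above can be sketched in a few lines of Python. This is an illustrative toy, not any real search engine's implementation: the sample page, URLs, and function names are invented, a dictionary stands in for the stored copies a spider would download, and the "index" simply maps each word to its position on the page.

```python
from html.parser import HTMLParser
from collections import deque

class LinkAndTextExtractor(HTMLParser):
    """Pulls hyperlinks and visible words out of one downloaded page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # collect links for the scheduler
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):  # collect words for the indexer
        self.words.extend(data.split())

def index_page(url, html, scheduler, index):
    """Indexer step: record where each word occurs on the page, then hand
    newly discovered links back to the scheduler for a later crawl."""
    parser = LinkAndTextExtractor()
    parser.feed(html)
    # word -> position on the page (simplified: last occurrence wins)
    index[url] = {word: pos for pos, word in enumerate(parser.words)}
    for link in parser.links:
        scheduler.append(link)

# Hypothetical stored copy of a downloaded page, in place of a real fetch.
pages = {
    "http://example.com/":
        '<a href="http://example.com/about">About</a> early web page'
}
scheduler = deque(["http://example.com/"])  # URLs submitted by webmasters
index = {}
crawled = set()

while scheduler:
    url = scheduler.popleft()
    if url in crawled or url not in pages:
        continue  # a real spider would download the page here
    crawled.add(url)
    index_page(url, pages[url], scheduler, index)

print(sorted(index["http://example.com/"]))  # → ['About', 'early', 'page', 'web']
```

The link found on the page is placed back into the scheduler, mirroring how early engines discovered new URLs from pages they had already crawled rather than requiring every page to be submitted.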