Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return the information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weighting given to specific words, as well as all the links the page contains. The extracted links are then placed into a scheduler so that the pages they point to can be crawled at a later date.
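
To make that crawl-and-index loop concrete, here is a minimal sketch in Python using only the standard library. It is an illustration under simplifying assumptions, not how any real search engine is implemented: the seed URL, the page limit, and names such as crawl and LinkAndTextExtractor are invented for this example, and real spiders also respect robots.txt, throttle their requests, and use far larger scheduling and storage systems.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkAndTextExtractor(HTMLParser):
    """Plays the role of the "indexer": collects the words on a page and the links it contains."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

def crawl(seed_url, max_pages=5):
    """Download pages, index their words, and schedule discovered links for a later visit."""
    schedule = deque([seed_url])   # the "scheduler": pages waiting to be crawled
    index = {}                     # url -> words found on that page
    seen = set()

    while schedule and len(index) < max_pages:
        url = schedule.popleft()
        if url in seen:
            continue
        seen.add(url)

        try:
            # the "spider" downloads the page
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except (OSError, ValueError):
            continue   # skip unreachable or malformed URLs

        # the "indexer" extracts words and links from the downloaded page
        parser = LinkAndTextExtractor()
        parser.feed(html)
        index[url] = parser.words

        # every link found goes back into the scheduler for a later crawl
        for link in parser.links:
            schedule.append(urljoin(url, link))

    return index

if __name__ == "__main__":
    pages = crawl("https://example.com")
    for url, words in pages.items():
        print(url, len(words), "words indexed")

Running it against a seed page prints each URL visited and how many words were indexed from it, mirroring the spider, indexer, and scheduler roles described above.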

A content marketer, for example, might create a series of blog posts that generate leads from a new ebook the business recently created. The company's social media marketer could then help promote those blog posts through paid and organic posts on the business's social media accounts. The email marketer, in turn, might build an email campaign that sends more information about the company to everyone who downloads the ebook. We'll talk more about these specific digital marketers in a minute.
