Optimizing websites for search engines is as old as the web itself. Back in the early 1990s, webmasters needed only to submit a URL to the relatively few engines, which would then scan the site's pages. The engine would extract information about each page, such as the words it contained and where they were located, any weight given to specific words, and all the links the page contained; those links were placed into a scheduler to be crawled at a later date. That part of the process is still valid today; what has changed is the algorithms search engines use to index the pages. Originally, indexing relied on additional data provided by the webmaster, known as metadata.
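The indexing step described above can be sketched in a few lines. This is a minimal illustration, not any real engine's implementation: it extracts the words a page contains and their positions, collects the page's links, and places those links into a queue for a later crawl. All names here are assumptions for the sake of the example, and it parses a hard-coded HTML string rather than fetching anything over the network.

```python
from collections import deque
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Toy indexer: records word positions and outgoing links."""
    def __init__(self):
        super().__init__()
        self.word_positions = {}   # word -> list of positions in the page text
        self.links = []            # hrefs to schedule for later crawling
        self._pos = 0

    def handle_starttag(self, tag, attrs):
        # Collect every <a href="..."> for the crawl scheduler.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Record each word and where it appears on the page.
        for word in data.lower().split():
            self.word_positions.setdefault(word, []).append(self._pos)
            self._pos += 1

# The "scheduler": links found now are crawled at a later date.
crawl_queue = deque()

page = '<html><body><h1>Welcome</h1><p>See <a href="/about">about</a> us.</p></body></html>'
indexer = PageIndexer()
indexer.feed(page)
crawl_queue.extend(indexer.links)

print(indexer.word_positions)  # word -> positions, e.g. 'welcome' at position 0
print(list(crawl_queue))       # ['/about'] queued for a later crawl
```

A real crawler would also fetch pages over HTTP, deduplicate URLs, and respect robots.txt, but the extract-index-enqueue loop above is the core of the process the early engines ran.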
This metadata served as a guide to the site's content and the keywords relevant to it. The mechanism was seriously abused to rank sites highly regardless of their actual content or purpose. As computing power increased and search engines became businesses, how a site is indexed and rated changed many times. Today, how search providers like Google determine who is placed at the top is a closely guarded industry secret, and the algorithms are believed to change frequently in an unseen "war" between less scrupulous webmasters and the search engines.
Here is a shortened list of some of the major changes Google has made to keep its results "clean":
• In 1998, Google used general factors such as PageRank, the links to and from other sites, in addition to keywords, meta tags, and site layout.
• In 2004, Google had many undisclosed factors it used to counter the many "tricks" the industry had created to falsely "buy" page ranking.
• In 2005, Google began personalizing search results for each user based on their history of previous searches.
• In 2007, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through use of the nofollow attribute on links.
• In late 2010, with the growing popularity of social media sites and in an attempt to make search results timelier, Google changed its algorithms to allow fresh content to rank quickly within the search results.
• In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources.
• In April 2012, Google launched the Penguin update, whose goal was to penalize websites that used manipulative techniques to improve their rankings on the search engine.
Given this quick insight into how search engines index sites, anyone claiming to be able to "buy" top placement, or to make quick changes to a site that affect placement, should be treated with skepticism.