A growing number of sources say that your actual rank depends largely on how many pages link to your site. This is not to suggest that you should skip optimizing your site for web crawlers. Google uses the phrase in your Description meta tag, and some other search engines do as well, though not all of them. How much weight the description carries also depends on how closely a particular search query matches it.
Another important thing to know is that there are literally billions of pages on the Internet, and web crawling bots devote only so much time to indexing any one of them. If a page is not optimized (that is, if it contains too much unnecessary code relative to actual content), chances are crawlers will give up before indexing it.
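One rough way to gauge how code-heavy a page is would be to compare the visible text against the total size of the HTML. The sketch below uses Python's standard-library html.parser; the idea of measuring a text-to-markup ratio this way is purely illustrative here, not a published crawler metric.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text from an HTML document, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self._skip_depth = 0   # > 0 while inside <script> or <style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def content_ratio(html: str) -> float:
    """Return visible-text length divided by total HTML length (0.0 to 1.0)."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / max(len(html), 1)

page = "<html><head><script>var x=1;</script></head><body><p>Hello, world</p></body></html>"
print(round(content_ratio(page), 2))  # prints 0.14
```

A low ratio suggests the page ships far more markup and script than readable content, which is exactly the situation the paragraph above warns about.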
The best way to earn higher rankings is to have as many people as possible link to your web site. At the same time, links from highly rated pages carry more weight than links from obscure home pages.
Avoid submitting your link to "link farms." Most web crawlers recognize them and will effectively lower your page rank as a result.
I know it sounds tricky, but that's because it is. Ten years ago, web crawlers were much more forgiving; today, with billions of web pages in existence, the situation is different.