Does An Internal Page Link Affect Ranking?


wangs8

Recommended Posts

Hi

 

I checked the Awstats report and found that there are no "Links from an internal page" shown in the report. However, I did link a few pages to each other. Why are these links not shown in the report? Do these internal links increase my site's ranking in search engines?

 

Another question is about body text. I read some messages in the forum saying that body text can increase rank. Does it mean that I have to put as many relevant keywords as possible on my home page, or is it enough to have them on one of the pages in the site?

 

Thanks in advance

 

Click to my site


Wangs8...if I'm not mistaken, the "Links from an internal page" on your Awstats report is telling you how many times your pages have been accessed from within your own site. For example, if I go to one of your site pages, and then follow a link to another one of your pages, that click would count as a "Link from an internal page".

 

Anyone else out there, feel free to correct me if I'm wrong...

 

As for body text...that is one of the most important factors in your site getting good PageRank and indexing. You should have good text on all of your pages. Use the keywords specific to each page. For example, on my site I have a photo gallery of animals and a gallery of landscapes. Those pages have different keywords, descriptions, and text for their respective topics.

 

Look up some of the "Search Engine Optimization" topics in the forum to read more. Also, here is a great article on search engine optimization!


nat is right in what she said here.

 

Every time a page on your website is accessed, the server checks what the referrer to that page is. If the referrer is a page on the same domain, then it counts as an internal page link. If the referrer is an outside domain, then it counts as an external "link," even if no actual link was clicked - the check only looks at the previously viewed page. And even that isn't foolproof, because it depends upon the browser, and the referrer can be manipulated.
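In code, the referrer check Scott describes might look like the minimal sketch below. The domain is a made-up example, and Awstats' real logic is more involved; this just shows the internal/external/direct split based on the Referer header:

```python
from urllib.parse import urlparse

SITE_DOMAIN = "example.com"  # assumption: stand-in for your own domain

def classify_hit(referrer: str) -> str:
    """Classify a page view the way a stats package might, based only on
    the Referer header (which can be empty, missing, or spoofed)."""
    if not referrer:
        return "direct"          # no referrer: bookmark, typed URL, etc.
    host = urlparse(referrer).hostname or ""
    if host == SITE_DOMAIN or host.endswith("." + SITE_DOMAIN):
        return "internal"        # link from a page on the same domain
    return "external"            # link from an outside domain

print(classify_hit("http://example.com/gallery.html"))   # internal
print(classify_hit("http://www.google.com/search?q=x"))  # external
print(classify_hit(""))                                  # direct
```

As Scott notes, none of this is foolproof: many browsers and proxies omit or rewrite the Referer header, so "direct" is really a catch-all.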

Edited by TCH-Scott

Yes, she is :lol:

 

[ Thumbs Up Robert]

 

 

Does it mean that I have to put all relevant keywords as many as possible inside of my home page.

Sounds like you may be flirting with spam... I would read the [just ok] article that Tracy recommended (and follow the links) and you will get a better idea of preparing your pages for the search engines.


Another question is about body text. I read some messages in the forum saying that body text can increase rank.

Gaining in the ranks is tough, and there is a lot of work to do to get to the top, but it's not too hard once you get it set up.

You should learn more about how Google, for instance, works (most are similar), and you can get a lot of good webmaster information from Google by clicking here.

 

Some extra tips:

  • Try to use the keywords from your Meta tags in your page title and first paragraph.
  • Try to work the Meta description into your first paragraph too.
  • Images help too. When naming images, give them names that are in your keyword list.
  • Use the image ALT attribute, putting the keywords from your Meta tags in it.
  • Hang out in online forums with a link in your signature to your site. The more links hitting you, the better. I wouldn't do that myself :goof:
  • Go into the news groups (using Google's is good) and find groups that talk about your site's interests - hang out there too and leave messages with a signature that contains a link to your site. Be honest though; really have something to offer in the way of chatter.
  • Hide things - things like a 1x1 transparent pixel that links to another page on your site. Use the meta words, name the image with something useful, and do this on each page. This helps bots and sniffers roam your site.
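The first few tips boil down to repeating the same phrases across the title, Meta tags, description, and first paragraph. A minimal sketch of keeping those elements in sync (the keywords and title here are made up for illustration, not taken from any real site):

```python
def head_snippet(title: str, keywords: list[str], description: str) -> str:
    """Assemble the <head> elements the tips above say should all
    repeat the same keyword phrases."""
    return "\n".join([
        f"<title>{title}</title>",
        f'<meta name="keywords" content="{", ".join(keywords)}">',
        f'<meta name="description" content="{description}">',
    ])

# Hypothetical page data, echoing the "cheap web hosting" example later
# in this thread.
snippet = head_snippet(
    title="Cheap Web Hosting | Example Site",
    keywords=["cheap web hosting", "domain names"],
    description="tch - providing cheap web hosting and domain names",
)
print(snippet)
```

The point is consistency: the same phrases appear in the title, the keyword meta, and the description, and (per the tips) would also appear in the first paragraph and image names.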

CAVEAT: Hide your email addresses! On your site, don't use a mailto: link unless you munge the address. I use a small CGI script that reverses the address into an unreadable email format, but when the user clicks on it, it works right. When a spam-bot is hunting them down, it gets garbage.

Munge your email address in News groups. Some web masters run a cgi that detects spam-bots and sends them on a wild spree collecting thousands of fake email addresses.
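Shiva's actual CGI script isn't shown, but the reversal idea he describes could be sketched like this (a guess at the approach, not his real code - the stored page carries only the reversed string, and it is flipped back at click time):

```python
def munge(address: str) -> str:
    """Reverse the address so harvesters scraping the raw page text
    see only garbage."""
    return address[::-1]

def demunge(munged: str) -> str:
    """Undo the reversal before actually using the address (the post
    describes doing this server-side when the link is clicked)."""
    return munged[::-1]

m = munge("webmaster@example.com")
print(m)           # moc.elpmaxe@retsambew - what a spam-bot would scrape
print(demunge(m))  # webmaster@example.com - what the user's click yields
```

Any reversible transform works the same way; reversal is just the simplest one that leaves the scraped text useless as an address.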

 

Ramble mode off:


Hide things - things like a 1x1 transparent pixel that links to another page on your site. Use the meta words, name the image with something useful, and do this on each page. This helps bots and sniffers roam your site.
All the advice is good with this one exception: Never use hidden text or links... never.

 

 

From Google Information for Webmasters - Webmaster Guidelines

 

Quality Guidelines - Specific recommendations:

 

Avoid hidden text or hidden links.

 

 

 

I use a small CGI script that reverses the address into an unreadable email format but when the user clicks on it, it works right. When a spam-bot is hunting them down, it gets garbage.

 

Shiva, this would be a nice script to show... start another post in Scripting Talk.


Does it mean that I have to put all relevant keywords as many as possible inside of my home page.

 

How does that "flirt" with spam?

 

Well "as many as possible" sounds like it might trigger a keyword density filter (i.e. there is a maximum number).

 

One of the factors that affected so many sites in Google's November changes (a.k.a. the Florida Update) is the fact that stemming introduced a lot more 'counted' keywords within a page. Thus, if you were near the maximum limit for a given word or phrase, the 'other' variations of those words within your page may have put you over the limit.
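The stemming point can be illustrated with a toy example: a filter that stems words before counting sees every variation of a word as another occurrence of the same keyword, so a page near the density limit on the raw count can go over it on the stemmed count. The suffix stripping below is deliberately crude, just to show the grouping - it is not a real stemmer:

```python
import re
from collections import Counter

def naive_stem(word: str) -> str:
    """Very crude suffix stripping, only to illustrate how stemming
    groups variations of a word together (not a real stemmer)."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def stemmed_counts(text: str) -> Counter:
    """Count words by their stem instead of their exact spelling."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(naive_stem(w) for w in words)

# Four distinct spellings, but all four count toward the stem 'host'.
print(stemmed_counts("hosting hosted hosts host"))
```

A raw count would report four different words appearing once each; the stemmed count reports one keyword appearing four times.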


Ok Scott. So if a news article on a page is about the President, and it involves a 10-minute interview ("Reporter said:", "President said:"), you are suggesting the words "reporter" and "president", which may appear 50 times each if there are 50 responses from the president, will make Google call it spam - that the number of words is over their limit? How does a web page become junk mail?

 

I don't think wise webmasters are going to start designing pages around what some consider to be too many keywords. I know what you're referring to - and I think it isn't the number of same words used, but the format they are set in within a Meta tag. For example, "google, google works, google search, googles, googled, googler" might be what you are referring to. Stuffing those into the keywords, the description, the first paragraph, the last paragraph, the footer, and iframes or hidden comment code is the no-no. But Google clearly eats it up anyway. I don't suggest people do this.

 

Matching your keywords, using them in the description meta, in the first paragraph, in the page title, and in images works safely and doesn't break any 'rules'.

 

Naming an image cheap_web_hosting.jpg is better than image1.gif, and adding the keywords "cheap web hosting" to the keyword meta helps, not hinders. And a transparent gif named "cheap_web_hosting.gif" with alt="cheap web hosting" on it won't hinder either. Adding the same term to the Description as "tch - providing cheap web hosting" boosts, not detracts from, the ranking.

 

I'm floored at how Google would confuse web page content with spam. Makes no sense at all. Too much sunshine in Florida.

 

AltaVista, at one time, was the better of the lot in that it allowed for Boolean operands. Google, by contrast, still won't display hits by relevancy of the search term, but instead displays by popularity - for example, a search for "shaving cream" puts the "Nebraska Coeds - Real college girls going crazy on camera!" web site second in the rankings, and the description has nothing to do with the words "shaving cream". Neither word is used on the Nebraska Coeds... web site. So, why does Google shove it into the results for "shaving cream"? Money. There are hundreds of other search engines out there that are better at what they do. Google went into the gutters a long time ago. Internet users will learn, in time, that Google is not the internet - just a crummy tool.

 

My web stats vary across the sites I manage, but as an example, I'll use December stats with a total of 24523 hits: they put Google as bringing 0.18% of the traffic, followed by Yahoo, MSN, and some others. The rest of the visitors come in directly (4599 - 18%) but, for the most part, via links from other pages, news groups, forums :) and advertising.

 

99.9% of the information on the internet is pure garbage. 99% of the remaining 0.1% is useless, and 99% of the possibly useful 0.001% is no longer there.


Shiva: Ok Scott. So if a news article on a page is about the President, and it involves a 10-minute interview ("Reporter said:", "President said:"), you are suggesting the words "reporter" and "president", which may appear 50 times each if there are 50 responses from the president, will make Google call it spam - that the number of words is over their limit?

 

D. Scott: Well, it could; it would depend on the conversation. If the interviewer used very, very few words in the questions (which would be quite difficult) and the President answered with 'Yes' or 'No', then the density of 'Reporter' and/or 'President' might be too high.

 

‘Shiva’ has a 0.87% density (1 out of 115), as does ‘D. Scott’.

 

Another example:

 

Reporter: Hi Mr. President.

President: Hi

Reporter: How are you?

President: Good.

Reporter: What is your favorite color?

President: Blue.

Reporter: Do you eat beets?

President: Yes.

Reporter: How disgusting!

 

 

‘Reporter’ has a 16.7% density (5 out of 30) and ‘President’ has a 13.3% density (4 out of 30). No keyword stuffing here (as I said, it would be quite difficult).
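These density figures can be reproduced with a simple word count. One caveat: a naive count also catches the 'President' in "Hi Mr. President.", so it reports 5 out of 30 (16.7%) for that word, where the post counted only the 4 speaker labels (13.3%):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` as a fraction of all words, in percent."""
    words = re.findall(r"[a-z]+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

interview = (
    "Reporter: Hi Mr. President. President: Hi "
    "Reporter: How are you? President: Good. "
    "Reporter: What is your favorite color? President: Blue. "
    "Reporter: Do you eat beets? President: Yes. "
    "Reporter: How disgusting!"
)

# 30 words total; 'reporter' appears 5 times -> 16.7%.
print(round(keyword_density(interview, "reporter"), 1))   # 16.7
# 'president' also hits 5 times here (4 labels + "Mr. President") -> 16.7%,
# versus the 4/30 = 13.3% in the post above.
print(round(keyword_density(interview, "president"), 1))  # 16.7
```

Which occurrences "should" count is exactly the kind of judgment call a density filter has to make mechanically.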

 

 

I'm floored at how Google would confuse web page content with spam.
It is not just Google but all the search engines that deal with this. Think about this, though: a sophisticated algorithm is written in order to rank pages according to 'relevancy' for a given search phrase. As long as there are people who pick away at that algorithm in order to understand it and, more importantly, exploit it, the search engines will have to 'filter'. No filter will be perfect, meaning that some 'good' content will be filtered and some 'bad' content will escape (i.e. confusion). However, it will be a continuous battle (on both sides).

 

 

So, why does Google shove it into the results for "shaving cream"? Money.
No, it has nothing to do with money (well unless you are considering the money supplied to the optimizer). I did not look at the site (I'm a prude :) ) but if there is no 'shaving' or 'cream' in the source (not just the text) then the likely answer is in the linking text. This is a 'trick' that is referred to as Google Bombing. The most famous case: 'miserable failure'. Google is aware of this flaw and I am sure will be filtering it in the near future.

 

 

...I'll use Dec stats with a total of 24523 hits - put google as bringing 0.18% of the traffic followed by Yahoo, MSN, and some others.
It is excellent that you are obtaining traffic from 'other' sources, but I think that you may be missing a huge pool of potentially interested viewers. It is a fact that most people search for what they are looking for. If a given site does not rank well for the most logical (and useful) phrases, then that site will not be viewed. Would it not be ideal to have your same situation with the added bonus of top rankings for the most relevant keyword phrases? Who could complain?

 

 

99.9% of the information on the internet is pure garbage. 99% of the remaining 0.1% is useless, and 99% of the possibly useful 0.001% is no longer there.
I am not as pessimistic. Yes, there is a lot of junk. But I generally can find the information I seek... it is a wonderful resource (and I always use Google :D )!

CAVEAT: Hide your email addresses! On your site, don't use a mailto: link unless you munge the address. I use a small CGI script that reverses the address into an unreadable email format, but when the user clicks on it, it works right. When a spam-bot is hunting them down, it gets garbage.

Munge your email address in News groups.  Some web masters run a cgi that detects spam-bots and sends them on a wild spree collecting thousands of fake email addresses.

Shiva, can you post a good cgi email-scrambler code? Perhaps better in the 'scripting' forum, but I was just wondering how this code worked. Thanks!

 

Jeff

www.tornadocentral.com

