
Converting to PHP


TCH-Andy


I've had a good play with PHP now, and seen the many advantages that it gives me - thanks to all those who have provided support, ideas, etc.

 

I'd now like to upgrade my site from straight HTML to PHP.

 

How do I do this without losing my position with the search engines?

 

One way seems to be to set the .htaccess file to say that all files ending in .htm are really PHP files. That way I wouldn't be changing any file names, and the existing ones would work until I upgraded them... but is there a drawback to doing it this way?
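Something along these lines in .htaccess is what I had in mind (I believe the exact handler name depends on how PHP is set up on the server, so this is only a sketch):

# Ask Apache to run .htm and .html files through the PHP handler.
# application/x-httpd-php is common for mod_php setups, but the
# handler name can differ depending on how the host installed PHP.
AddHandler application/x-httpd-php .htm .html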

 

A second way would be to place a permanent redirect for each of the html files as I change them in my .htaccess file.
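Each of those redirect lines would presumably look something like this (the page names and domain here are just placeholders):

# Permanently redirect one old .htm page to its new .php version
# (about.htm, about.php and example.com are made-up example names)
Redirect 301 /about.htm http://www.example.com/about.php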

 

Is there an alternative or better method, or should I use one of the above?

 

All ideas / suggestions welcome.

 

Andy


A second way would be to place a permanent redirect for each of the html files as I change them in my .htaccess file.

 

Redirect 301 (as you and I have discussed before) is the only way to go.

 

The 301 redirect is the safest way to preserve your existing rankings. Any spider that visits your pages will obey the redirect and in the next index update will drop the old file name and path and replace it with the new one... no loss of recognition or rank.

 

 

:lol: :lol:

 

301 Redirects explained here


Thanks Scott.

 

What about search engines like Inktomi that don't crawl dynamic sites by default?

 

From the Inktomi site:

Dynamic links: Slurp now has the ability to crawl dynamic links or dynamically generated documents. It will not, however, crawl them by default. There are a number of good reasons for this. A couple of reasons are that dynamically generated documents can make up infinite URL spaces, and that dynamically generated links and documents can be different for every retrieval so there is no use in indexing them.

If your Web site is based on dynamic links and you want your site to appear in our search engine, one approach is to have some static pages which have links to your dynamic pages. Some static pages can provide information about your Web site and services, others can provide indices or directories into your site.

 

This seems to indicate that a completely dynamic site would not be fully crawled. I guess just having a main index page in static HTML would provide enough pointers to get the rest linked.

 

On the other hand, Google seems to be by far the most important, and it does crawl dynamic sites.

 

What is the problem with setting PHP to interpret all your .htm files? It would increase the server load slightly, but not really if you were converting them all to PHP anyway. I'll take the 301 redirect route, but I'm just curious :lol:

 

Thanks again,

 

Andy


What is the problem with setting PHP to interpret all your .htm files?

Honestly not sure about the advantage vs. disadvantage for this route... would have to look into it more.

 

I agree with your selection of the 301 redirect route, that I do know.

 

As to your other comments, my advice would be to have a static HTML index page and a static HTML site map (off the home page). The rest of the site can be dynamically generated. Just my nickel's worth... as you may note in my other posts I only give 2 cents, so this would suggest that this advice is more valuable (i.e. take the advice).

:lol: :lol:


Dsdemmin is the guy to talk to on SEO topics... but I think that there is an aspect to this question that needs clarification. I've researched this issue extensively.

 

If you re-read the comments from Inktomi, I think you'll find that they don't have problems with dynamic content... whether JS, ASP, PHP, CF, SHTML, etc.

 

The problem is with dynamic links containing '&', '?', '=' and the like.

 

You'll commonly see this with database-driven sites such as catalogs, and also with Nuke and postNuke.

 

Example:

 

Inktomi and other search engines are very reluctant to index pages with dynamic links because they are concerned that the links are temporary and don't really point to a real page. There is also an apparent risk that the search engine spider gets caught in an infinite loop of dynamically generated links.

 

So, if you have a catalog or postNuke website or pages with dynamic link characters, then there is good evidence that many of the search engines won't index these pages.

 

I know that some of my pages with dynamic URLs have been seen and indexed by Google... but every search engine is different.

 

There are a variety of methods you can use to turn dynamic URLs into static-looking URLs.

 

The method I have used involves mod_rewrite combined with a script that parses the page URL and pulls the variables out of it.

 

The mod_rewrite instruction tells the server: "If the 'catalog' folder is referenced, then parse catalog as a PHP file... not a folder... and everything that comes after /catalog/ is to be ignored... they're just variables for the script."

 

In other words... there's not a catalog folder but rather a catalog.php page that gets called and parsed by the server.
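As a rough sketch (the folder and variable names are just made up for illustration), the .htaccess side of it looks something like this:

RewriteEngine on

# /catalog/ isn't a real folder: hand any request under it to
# catalog.php and let that script pull its variables out of the
# requested path, e.g. /catalog/widgets/42 instead of
# /catalog.php?section=widgets&item=42
RewriteRule   ^catalog/(.*)$   catalog.php   [L]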

 

(Mod rewrite can be tricky and should be done by someone who knows what they are doing. Serious website chaos can result if you goof it.)

 

There is plenty of info on the subject online.

 

Basically, if you are good at coding PHP, and you have the time and patience, you can rewrite your catalog, bulletin board, or postNuke site to have static-looking URLs... even though they're dynamically generated.

 

But Andy, you need not worry that search engines discriminate against php or asp pages... they don't... or I haven't found any evidence to the contrary.

 

It's dynamic URLs that are the issue with Inktomi.


Jack,

 

Many thanks for the explanation. I understand what you mean about the difference, and fortunately I plan for all my pages to be without the dynamic '&', '?' or '='. So everything should be fine :)

 

Thanks again

 

Andy


3 weeks later...

I've updated my whole site, and tested it so that I'm happy with it.

 

I have done a permanent redirect for my main pages, but didn't want to do this for 300+ pages :D, hence I've used the following in my .htaccess file. It seems to work fine (if a .php file exists, then permanently redirect to that; if it doesn't, stick with the .htm file).

 

Can anyone see any problem with doing it this way (as it saves me typing in 300 lines, and I'm bound to make a mistake)? B)

 

RewriteEngine on
RewriteBase   /

#   parse out basename, but remember the fact
RewriteRule   ^(.*)\.htm$   $1   [NC,E=WasHTM:yes]

#   rewrite to document.php if it exists
RewriteCond   %{REQUEST_FILENAME}.php -f
RewriteRule   ^(.*)$   $1.php   [R=301,S=1]

#   else reverse the previous basename cutout
RewriteCond   %{ENV:WasHTM}   ^yes$
RewriteRule   ^(.*)$   $1.htm

 

Thanks

 

Andy

