Everything posted by surefire
-
It's a VERY good practice to back up your MySQL database often. How often depends on how important your data is. On my job posting site, I'd hate to have to tell a member that their resume is gone. So I back up daily.

Directions:
1- Log into cPanel
2- Open phpMyAdmin
3- Click on your database (not the tables, the actual database link above your tables)
4- In the main window, click Export
5- Choose 'Structure and Data', 'Save as File' and any other features you want
6- Submit, then choose the location to download and store the text file on your computer

Repeat as often as you like. Having good security measures in place is good, but there's nothing like having a solid backup of your database in case all h*ll breaks loose. Also, from experience I can tell you that the biggest threat comes from within. Back up your database before testing out that new script you wrote. I once zapped a precious table of data with one of my own scripts. And if you don't have a backup and something goes wrong, your only chance at salvation is to ask the Help Desk if they can rescue you. But you need to ask immediately. And ask nicely.
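For anyone with shell access, a command-line version of the same backup looks roughly like this sketch. The database name and user are placeholders for your own cPanel details, and the command is only echoed here since actually running it needs your password and a live server:

```shell
#!/bin/sh
# Placeholders -- substitute your own cPanel database name and user.
DB_NAME="mydb"
DB_USER="myuser"
# Date-stamped filename so daily backups don't overwrite each other.
BACKUP="backup_$(date +%Y%m%d).sql"
# mysqldump produces the same structure-and-data dump phpMyAdmin exports.
# Echoed rather than executed so nothing happens without your say-so:
echo "mysqldump -u $DB_USER -p $DB_NAME > $BACKUP"
```

Drop something like that in a cron job and you never have to remember step 1 through 6 again.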
-
Glad to help. One thing I forgot to mention... those ideas aren't mutually exclusive... you can try combinations of some or all of them if you like.

Also, another point on security... and this goes for MySQL as well. Make sure that you test the user input. In other words, before you ask the PHP script to access the file or database, do some checking on the variables. PHP has nifty functions to check:
1- Is this variable a number?
2- Is this variable devoid of HTML code?
3- Does this variable contain words like "DROP"?

Also, be very, very careful NOT to use a variable to determine which file to open. In other words, don't write something like fopen($file_var . '.php', 'r'); Your script should explicitly name the file that is to be opened. Don't pass the file name in a variable.
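Here's a sketch of those three checks. The function names are my own invention, and the keyword list is just a sample, not exhaustive:

```php
<?php
// Sketches of the three input checks described above.
// Names and keyword list are mine, not from any particular script.
function is_a_number($v)      { return is_numeric($v); }
function is_free_of_html($v)  { return $v === strip_tags($v); }
function has_sql_keywords($v) { return (bool) preg_match('/\b(drop|delete|truncate)\b/i', $v); }
```

Run every form field through tests like these before it gets anywhere near fopen() or a query.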
-
Boxturt, since I'm rather new to cron (my old hosting company didn't allow it) I can't say for sure... but here's how I would troubleshoot: I'd put a simple PHP script in the same directory that does nothing but send you an email. Then I'd set up a new cron directive set to execute the script every other minute. This should give you some indication of whether your idea is going to work (and I think it will). Also, it looks like you are setting up a cron job with the 'advanced' option (writing it by hand). Have you considered setting it up using the 'basic' cpanel interface?
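The whole test script is two lines. Everything here is a placeholder (the address, the path, even the filename), but it shows the shape of it:

```php
<?php
// cronping.php -- a throwaway script that does nothing but email you,
// so you can tell whether cron is actually firing.
// Matching crontab entry (every other minute), path is a placeholder:
//   */2 * * * * php /home/youruser/cronping.php
$msg = 'Cron ran at ' . date('r');
@mail('you@example.com', 'cron ping', $msg); // @ hides warnings if mail isn't configured
```

If the emails arrive every two minutes, cron works and you can swap in your real script.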
-
Sometimes I feel like a nut... sometimes I don't. Anyhow, here are my thoughts:
1- MySQL is not difficult, and it is more secure (MORE... not 100%)
2- MySQL should be faster with complex queries
3- TCH, in their infinite kindness and wisdom, gives all users great access to MySQL at no additional charge... you don't know what a gift that is!

Ok. You wanted to know about flat files. Although I'm pretty darn good with PHP... your question is really about security. I know enough, but it seems that others here know quite a bit more than I do. So here are some of my ideas. Please keep in mind that I use MySQL, so I have either borrowed these ideas I'm giving you or I'm kinda making them up on the spot.
1- Use .htaccess to password protect the folder containing the flat file. Your PHP script can still read the file.
2- Name the flat file with a .php extension.
3- Put in a few lines of code that detect hacks, send you an email with the hacker's IP, and redirect the hacker.
4- Place the flat file outside your web root so it is a bit harder for the general public to find or access. In other words, don't put it in your public_html folder.
5- Place an index.htm (or .php) file in the folder that the flat file is in, so that the public won't get a directory listing of the files when they access the folder through http.
6- Name the flat file beginning with a period... I read at php.net in one of the user comments that this makes it invisible to those who don't know what they are looking for.

Closing remarks: If security is vital (NSA secrets, location of Jimmy Hoffa, social security numbers of your friends and family, etc.) then don't put it on the web. There are all sorts of security measures you can take... but if someone with the motive, skills, and time decides to crack your system regardless of what it takes... there's a good chance they'll succeed. Luckily, most script kiddies go after easy targets and the ones with skill go after government databases and big corporations. imho... MySQL is many times safer and more versatile than a flat file system.

Oh, one other thing. If you only need to write to the file on rare occasions, then you can change the chmod settings, write to the file, and change the settings back. PHP can still read the file if you are only using it to present existing data. Obviously, if you are looking to write data to the file whenever you please, then this would be tedious. I hope some of these ideas help.
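That chmod-toggle trick looks something like this in PHP. The function name is mine, and the mode bits assume the file is owned by the same user PHP runs as:

```php
<?php
// Sketch of the chmod-toggle write described above.
// append_guarded() is a made-up name; adjust modes to your setup.
function append_guarded($file, $line) {
    chmod($file, 0644);          // open the file up for writing
    $fp = fopen($file, 'a');
    fwrite($fp, $line . "\n");
    fclose($fp);
    chmod($file, 0444);          // lock it back down to read-only
}
```

Between writes the file sits at read-only, so a buggy or hostile script can't quietly clobber it.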
-
The ForceType rule indicated by Borfast in the other thread is the way I would do it. I used a mod_rewrite rule instead only because the old hosting company I used to host with didn't allow ForceType. I wrote something close to this: the rewrite rule specifically forces the server to parse an address like http://www.yoursite.com/path/folder/345/abcd as http://www.yoursite.com/path/folder.php and then the script in folder.php parses the url and pulls out variables using the explode() function. So in this case, var1=345 and var2=abcd. It may sound confusing, and it is, but there's plenty of info on the net. Once again, if ForceType is available (and it probably is at a host as awesome as TCH) then that's the way to go. Simple and clean. Follow that other link.
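The actual rule didn't survive in the post, so the following is my reconstruction of both halves, not the original. The .htaccess side would look roughly like the two commented lines, and the PHP side pulls the variables back out of the URL:

```php
<?php
// My reconstruction -- the original rule isn't in the post.
// In .htaccess, something close to:
//   RewriteEngine On
//   RewriteRule ^folder/.*$ folder.php [L]
// Then folder.php unpacks the requested URL with explode():
function parse_pretty_url($uri) {
    $parts = explode('/', trim($uri, '/'));
    $n = count($parts);
    // last two path segments become the two variables
    return array('var1' => $parts[$n - 2], 'var2' => $parts[$n - 1]);
}
```

With the example address above, parse_pretty_url('/path/folder/345/abcd') hands back var1=345 and var2=abcd.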
-
I ain't no expert... But my recent experience indicates that freshbot revisits if you modify your pages... although I have no way of knowing how Google 'knows'... so maybe the cause and effect relationship only exists in my head. Based on my interpretation of other forums dedicated to SEO, if you have already been through a dance, then freshbot activity can have an immediate effect on rank... but the dance has a greater effect... or affects a greater number of websites. So freshbot changes in rank might end up being fleeting. Remember... I'm not the resident TCH expert on SEO...
-
Dsdemmin, I've done mod rewrites to eliminate dynamic urls, so no problem with that one. But there are a couple of other things I'd mention. After posting this thread, I've noticed that Google does indeed index these dynamic urls, based on pages that I've seen freshbot visit. Also, my Google PR bar indicates that the TCH forums have a PR of 4 on most pages. So doesn't that mean that these pages are indexed? I've also noticed that Google's freshbot doesn't seem deterred by sessions on my pages. Now, if I'm left out of next month's dance, then I'll have an idea why. But otherwise, early results would seem to indicate that Google doesn't care much about sessions stored as cookies, but maybe it has a problem with session ids appended to urls in dynamic fashion (as most BB scripts do). Oh well, I have to have access-restricted pages, so sessions seem to be the way to go for now.
-
Yes. I've done it successfully at TCH. I used a mod rewrite command in .htaccess
-
Oxyg3n's questions were resolved in another thread.
-
Sure. This is relatively easy. You need to be able to log into your cpanel. Once you do, you have to set up your own database. You determine the name of the db, the username, and the password. The host is 'localhost'. There is a great page of resources for this at TCH that you can find here: http://www.totalchoicehosting.com/help/php...hpmysqlpage.htm

The thing that confuses a lot of folks is that there are three basic steps, not two:
1- Create a user
2- Create a database
3- Assign the user to the database

A lot of people forget that last part and wonder why it doesn't work. That should do it for you. Read those resources at the link I gave. You don't need to be an expert at mysql... just know the basics.
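Once those three steps are done, the connection from PHP looks something like this sketch. The user, password, and database names are placeholders (cPanel usually prefixes them with your account name), and note that the host really is just 'localhost':

```php
<?php
// 'account_user' / 'account_dbname' are placeholders for whatever
// you created in cPanel; the host on shared hosting is 'localhost'.
$link = mysql_connect('localhost', 'account_user', 'your_password')
    or die('Could not connect: ' . mysql_error());
mysql_select_db('account_dbname', $link)
    or die('Could not select database: ' . mysql_error());
```

If the connect succeeds but the select fails, you almost certainly skipped step 3.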
-
Although this isn't a simple fix... it's super effective: Follow the directions and it's pretty straightforward. My personal experience is that sending mail using SMTP option is MUCH faster. I'm in the process of replacing all of my mail() functions with this mail class script.
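The post doesn't name the class, but if it's one of the usual SMTP mail classes (PHPMailer, say), the swap away from mail() looks roughly like this sketch. The include path and addresses are placeholders; the property and method names are PHPMailer's:

```php
<?php
// Sketch only -- assumes a PHPMailer-style class; adjust the
// include path and addresses to your own setup.
require('class.phpmailer.php');

$mail = new PHPMailer();
$mail->IsSMTP();                  // send via SMTP instead of mail()
$mail->Host = 'localhost';        // the host's own mail server
$mail->From = 'you@yoursite.com';
$mail->AddAddress('member@example.com');
$mail->Subject = 'Your resume';
$mail->Body = 'Testing the SMTP switch.';
$mail->Send();
```

Every mail() call in your scripts becomes a handful of lines like that, which is why replacing them all takes a while.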
-
I am currently using mod_rewrite on my Silver account. The mod_rewrite directives are in an .htaccess file for the folder I wanted to modify. I used this feature to tell the server to parse certain pages as PHP regardless of extension. Hope that helps.
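For comparison, here's what the ForceType version of the same trick looks like in .htaccess (a sketch; "catalog" is a made-up extensionless filename):

```apache
# Tell Apache to run the extensionless file "catalog" through PHP.
<Files catalog>
  ForceType application/x-httpd-php
</Files>
```

Either route gets you extension-free URLs; ForceType is just less to write if the host allows it.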
-
If it's been more than three weeks with no action from DMOZ, consider submitting to the next higher directory category. Do this only once, since you don't want to be penalized for submitting too often.

With Yahoo, my research indicates you have three options:
1- Regional directory listing for free
2- Express Submit for $299
3- Submit every two weeks and wait a really long time

I have gone the second route. I have read that the third route works... but if you feel that your site NEEDS to be in a certain Yahoo category, then I'd go with #2 and avoid #3. Look for other directories that are relevant to your subject. There are smaller, less well known directories like Joe Ant and Gypsy. You can't control when DMOZ or Google will get around to doing what we want them to do. Yahoo's Express service gives you a seven-day turnaround guarantee... just make sure you get it right the first time.

If you read other threads, you'll see that Google is doing weird things lately. No telling what will happen over the next couple of weeks. What you really want is a good deepcrawl... sounds perverted. Naughty. Anyhow, you shouldn't expect much from freshbot activity. What I've read is that a freshbot crawl indicates that Google knows where you are. Then comes the monthly deepcrawl. Then Google indexes those deepcrawl results a month later. Lots of things are changing at Google... but I'd say that if you are getting freshbot visits, then keep working on backlinks and eventually things will work. You have a choice: focus on those things you can control, or focus on things you can't control and go crazy. Over the past several months, I've been tempted to choose the crazy route... but hang in there.
-
By the way... it was freshbot that visited me. Maybe it would be a good idea to start a thread the next time a TCH user (most likely dsdemmin) sees the Google deepcrawl come through. Freshbot has looked at 1500+ pages of my site. I'm thinking this is a VERY good thing.
-
You are exactly right. If you are planning on sending any email to hotmail users, then you should read the user comments at php.net as there are other headers that you need to put into your email function in order to make it work with some hotmail accounts. http://us4.php.net/manual/en/ref.mail.php
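The extra headers those comments talk about are things like From, Reply-To, and MIME-Version. Here's a sketch with a made-up helper name; check the php.net comments at the link above for the exact set hotmail wants:

```php
<?php
// build_mail_headers() is a made-up helper; the header list is a
// plausible starting point, not a guaranteed hotmail-pleaser.
function build_mail_headers($from) {
    $h  = "From: $from\r\n";
    $h .= "Reply-To: $from\r\n";
    $h .= "MIME-Version: 1.0\r\n";
    $h .= "Content-Type: text/plain; charset=iso-8859-1\r\n";
    return $h;
}
// Then: mail($to, $subject, $body, build_mail_headers('you@yoursite.com'));
```

Without a proper From header, some providers silently drop the message, which is why mail() "works" everywhere except hotmail.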
-
Which server are you on? register_globals is on on server 11. I swear that must have been changed recently... but maybe I was seeing things. Dunno. Ignore my previous suggestion re: globals.
-
It has taken me four months to finally get freshbot to come by my site. My extensive research indicates that I've done everything 'right'... it just takes time. The good news is that doing the right thing can pay off. The fact that freshbot has looked at over 1500 pages of my site this month indicates that I should get a deepcrawl visit from Google soon. To speed things up, get listed in dmoz.org and other directories (Yahoo) Also, get high page ranked sites to link to you. The more similar to your 'theme' the better... and generally, the more the better. Google likes to find sites through other sites.
-
I'm not sure if this helps... but I've gotten that same error several times during the transfer of one of my sites that is database driven. I wish I could tell you exactly what the problem is... but I'll try to point you in the right direction. I had to recode MANY pages of my site to account for register_globals being off. If you look at the phpinfo() page, you'll see what I'm talking about re: globals. Anyhow, my old scripts would assign variables that were passed by forms without any special code. In other words, if I had a field in a form named 'email', then the script receiving the data could just say $var = $email;. I've had to go through my pages and put in code that looks like $var = $_POST['email'];. I don't know if that's the issue, but some older scripts seem to be written based on the assumption that register_globals is on. TCH has it off. I'd hate to send you on a wild goose chase but maybe this will help.
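Here's the before-and-after in miniature. The first line just fakes a form submission so the sketch stands alone:

```php
<?php
// Pretend a form just posted this field (so the sketch is self-contained).
$_POST['email'] = 'member@example.com';

// Old style, register_globals ON: $email appeared out of thin air.
// New style, register_globals OFF (as at TCH): pull it out explicitly.
$email = isset($_POST['email']) ? $_POST['email'] : '';
```

The isset() guard also means a missing field gives you an empty string instead of an undefined-variable notice.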
-
Open the file in WordPad. Either right-click on the file and choose Open With... or launch WordPad first and open the file from there.
-
Question: Will putting certain headers in my <head> help Google index my site if some of my pages have session ids on the urls? What is the correct way to write those headers? Example: no-cache. Background: Lots of folks use bulletin boards that append session ids to the url. I have read that this can hurt Google rankings because Google will see some pages as duplicates. Also, I use sessions instead of cookies, but I'm not sure whether googlebot sees session ids appended to the urls. Maybe I'm worrying about something inconsequential. Thanks for your input.
-
Disallowing a spider from visiting is very different from cloaking. Cloaking is serving different content to spiders than the content that you show to human visitors. The idea is that you can design the site that humans see, and then show a different, optimized version to spiders so your rank is higher than it normally would be. Search engines don't like being fooled. From what I have read, most tricks should be avoided. I say 'most' because some would argue over what is considered a trick. I guess if you are trying to 'fool' the search engines, then you have to be aware that there is a good chance that if you are caught, then your site could be banned.
-
Darn, I just realized that I wasn't archiving my raw log files... and so unless I'm missing something, I can't look back through previous days... which is where the Google information is. Shucks.
-
(Blushing...) As stupid as it is... I could never resist potty humor. It's cheap humor... but it never lets you down. On to a serious question... do you have a log analyzer program you'd recommend for TCH log files? I'd like to know which of the Google bots is checking out my site. Thanks
-
What are you doing analyzing your logs... that's gross.
-
Dsdemmin, what is your feeling on having a robots.txt file? (I think I got that right.) Second question, does your post mean that you suggest having this Meta tag on pages?