
Posts posted by tr4nc3

  1. I'm not sure if this is really the correct place to post this, but I'll give it a shot.. :)


    I noticed the release of Gallery v2 posted today on /. and decided to give it a go by installing the minimal version. The installation process looks to be very good, but I've paused at step 3.


    With our reseller account I would like to try to set up a "Multisite installation" of Gallery 2. This is a quote from the install script's step 3...

    Gallery can support multiple independent sites with a single installation of the code. Choose this installation type if you want to install a new Gallery on the same webserver but in a different path, subdomain or URL domain.

    With our reseller account, on a shared server, can we install the codebase at our main website and host the gallery install for any number of the websites we resell to? This would save on disk space by requiring only one 12-18 MB copy of Gallery 2 to be installed. On the sites we host/manage, 4 small files, a database, a few tables, etc., is all that would be required to set up a gallery.


    The only problem seems to be that files running under our reseller account that reside in the path...




    ...cannot access files in another account's path.


    E.g.: /home/foobar/public_html/gallery2


    Would we have to be on a dedicated server to be able to do this?

    Any thoughts on how we could go about setting up a multiple site install?


    The only alternative that comes to mind is to set up subdomains at our main website for each website we host that will be using the gallery script.


    Or, we could upload multiple installations of Gallery 2 to most/some of the accounts we host/manage, but then we'd really be wasting disk space.


    Would using the subdomain idea from above be the best way for us to go, for now, since we're on a shared server?


    I’d really appreciate any advice on this topic.


    Thank you very much TCH,


    Mike J.

  2. Can anybody tell me why my "User Defined Mime Types" in cPanel keep disappearing about once a month? This has exposed database passwords to the net twice now. I guess I should have the passwords in an included file, but really, my defined types should stay defined and shouldn't just disappear.


    Btw, I'm setting a type of: "application/x-httpd-php" for "rss"
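    If Apache on the server honors per-directory overrides (an assumption about TCH's config, so check first), the same mapping can live in an .htaccess file in your own webspace, where it can't silently vanish along with the cPanel entries:

```apache
# Same mapping as the cPanel "User Defined MIME Types" entry,
# kept in .htaccess so it survives whatever keeps resetting cPanel
AddType application/x-httpd-php .rss
```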

  3. I'm attempting to resize images using Imagemagick and PHP. Anyone know what the path to convert is?


    Here is a sample of a command I've tried with both system() and exec()


    convert -geometry 100x100 temp/thumb_pic.jpg temp/thumb_pic.jpg


    I've also tried..


    /usr/local/bin/convert -geometry 100x100 temp/thumb_pic.jpg temp/thumb_pic.jpg


    system() returns an error for me. I can't seem to find any information on TCH about the whereabouts of ImageMagick. Help? (Note: the geometry argument must have no spaces -- 100x100, not 100 x 100 -- or convert treats the pieces as separate arguments.)
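    Here is a sketch of what I'm attempting. The candidate paths are guesses (the actual location of convert varies per server), and the geometry is written without spaces, which was the bug in my earlier attempts:

```php
<?php
// Probe a few common install locations for ImageMagick's convert binary.
// These paths are assumptions -- adjust for your server.
function find_convert() {
    $candidates = array('/usr/local/bin/convert', '/usr/bin/convert', '/usr/X11R6/bin/convert');
    foreach ($candidates as $path) {
        if (@is_executable($path)) { return $path; }
    }
    return false;
}

// Build the resize command; note the geometry has no spaces (100x100, not 100 x 100).
function build_resize_cmd($convert, $src, $dst, $w, $h) {
    return $convert . ' -geometry ' . $w . 'x' . $h . ' '
         . escapeshellarg($src) . ' ' . escapeshellarg($dst);
}

$convert = find_convert();
if ($convert !== false) {
    exec(build_resize_cmd($convert, 'temp/thumb_pic.jpg', 'temp/thumb_pic.jpg', 100, 100), $out, $rc);
    if ($rc !== 0) { echo "convert failed with exit code $rc\n"; }
}
?>
```

    escapeshellarg() keeps filenames with spaces or shell metacharacters from breaking the command.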




  4. You're welcome.. Happy to hear you like it. :)


    Make sure to look over the SourceForge forums for this script. It gets really tricky after you process the upload and try to do a header redirect, sessions, or cookies. It's worth it though.. lol



  5. Thanks Don. I actually tried that program first before setting out on my journey of installing Mega Upload. It has an upload limit of 500k, so it won't work very well for my purposes. (I'm having admin users upload clients' Flash (*.FLA) sites to our server, and they can be up to 10 megs.)


    I managed to get the 1.35 version of Mega Upload to work with no problems at all. I guess in the most recent 'experimental' version (1.42) the script doesn't POST data back to the PHP script. It sends GET variables instead, but in the meantime PHP deletes the temporary uploaded POST data.. lol


    It's actually a very nice script, and I plan on upgrading my other scripts that have upload functions to include it. It has the potential to be heavy on resources, so I'll be taking that into consideration when implementing it, of course.


    I searched Google for probably half an hour and had tried the Java applet before deciding to go with the complicated Perl/PHP solution. But in the end it was worth it. :)




    note to self: change "Kbps" to "kB/sec"..

  6. HA! Well, if I'd listened to the author...


    Special Notes.


    Version 1.4x is still new though 1.35 is very stable. This version may have

    race conditions, which may result in the loss of uploaded files.


    I'm always trying to use the newest version of anything.. :blink:


    It works without a hitch in 1.35.


    Big thanks to TCH-Robert and dsdemmin because I think they were trying to help me.


    Here is a link to 1.35 if you're still interested in this script.




    /smacks self for not trying the 'stable' version before asking for help at TCH.

  7. Ok.. I'm really close to giving up on this one. I've been trying to install what is supposedly a simple script for the past 3 hours. It uses a combination of Perl and PHP to show a progress bar during a file upload.


    The files are being created in the temp directory that I set, but they are not being copied/moved after the upload is complete.


    I have a strong feeling that the CGI script can't POST form data back to the PHP script, because when I check $_FILES['file'] and its 'tmp_name', they are blank.


    The progress bar works perfectly. If I could only get the rest of the script to work.


    Anybody have experience with this script? Or is anybody willing to give installing it on TCH a shot? It takes about 2 minutes to set up. LOL


    I've checked cPanel and I'm pretty sure we have all the required Perl Modules. (CGI, Fcntl and File)


    You can find the script at http://www.raditha.com/php/progress.php


    Thanks in advance,


  8. Ohh yeah, you wanted just the pages and not all the files. Well, this PHP script will give you the count of all the .html, .htm, and .php files in your site.


    function direcho($path){global $a;if($dir=opendir($path)){while(false!==($file=readdir($dir))){if(is_dir($path.$file)){if($file!='.'&&$file!='..'){direcho($path.$file.'/');}}elseif(strstr($file,".htm")||strstr($file,".php")){$a++;}}closedir($dir);}return $a;}
    echo direcho('./'); 

  9. Put this php script in the root dir of your site and run it. It will return the total number of files.


    function direcho($path){global $a;if($dir=opendir($path)){while(false!==($file=readdir($dir))){if(is_dir($path.$file)){if($file!='.'&&$file!='..'){direcho($path.$file.'/');}}else{$a++;}}closedir($dir);}return $a;}
    echo direcho('./'); 

  10. I've been using CuteFTP Pro for years.


    Wow, I was starting to think I'm the only one using CuteFTP Pro. I've used a lot of others and CFTP PRO is by far the best in my opinion.


    One of my favorite (and most used) features is the Folder Monitor (or transfer engine). It is a feature that sits in the system tray and monitors a folder for changes every 5 (or X) seconds. Once it detects a change, it automatically connects to the FTP account you have it set up for and uploads the changed files. This allows me to be more productive since I don't ever have to manually upload anything. I just wait 5 seconds and it's there.


    Sometimes it will upload my temp files and renamed files.. (because it doesn't end up deleting files... which is probably a good thing, don't you think?) So I just use the synchronize feature of CFTP or Dreamweaver to 'clean' the folder when I've decided it's time.


    I used to use a program called 'Homepage Upper' that had the same functionality as the folder monitor, but it kept crashing on me after I upgraded to WinXP. I stopped using it, and it took me a couple of years before I set out to find another good utility that would monitor folders.


    CuteFTP Pro does that and SOOO much more. A little complicated at first, but it is truly easy to use once you've learned the basic interface.


    I doubt I'll ever use another ftp app... CUTE PRO R0XX!


    and so does TCH! Rock Sign

  11. I looked over that link. So I guess it is just a behavior of HTTP, which is weird because I hadn't experienced this problem until just recently.


    But it's good that I found and learned about this now because I'll know not to do the same thing in the future.


    The code I posted works great. It just appends the PHPSESSID to the end of the URL. So once you're in the members area you're set.. I guess as long as you don't do any more header redirection once you're inside.


    Which I just thought of: my application happens to do a header redirect after every form is submitted. I checked, and I get booted once I submit the form. LOL! So I've got some more work ahead of me once I wake up..


    I'm also going to test out session_write_close().. One of the messages in the bug listing mentioned that using s_w_c() fixed the problem for them.


    >I've also had a similar problem of session variables not being passed
    following a call to header(). I am running PHP 4.0.15 on an XP m/c.
    The following worked for me, by placing a session_write_close() before
    the call to header(), followed by an exit():
    header("Location: $strPage");
    I hope this will be of use to some.


    But he's running on WinXP. I'll find out if it applies to us here at TCH.


    Thanks for the link! I've bookmarked it. I didn't even think about looking there.

  12. I was using a system of logging users in with sessions. The login form would submit to login.php, which would look up the password in the DB and register a session if the PW matched. If it registered a session, it would then perform a header redirect to the homepage of the member area.


    Well, this system had worked fine ever since I implemented it 2 months ago. But just today I noticed that it was no longer working. My session variables were not being stored.


    I ended up bugging the guys at the help desk about it because I thought it must have something to do with the server side of things. (Since my code had previously worked for the past 2 months.)


    Turns out, it actually did have something to do with the server side of things. They updated PHP. (Before posting this thread I searched for PHP UPDATE and PHP UPDATED, and only the first query came up with one result, which was someone asking how often PHP is updated. No announcement that PHP had been updated was found.)


    I guess with this update, session variables are not kept through a header redirect. So I am posting this message to warn everyone who might be using a similar programming structure.


    I was redirecting with the common..


    >header("Location: /members/index.php");


    But now I have updated/upgraded my code to redirect in this fashion...


     function session_redirect ($url = "")
     {
         function _safe_set (&$var_true, $var_false = "")
         {
             if (!isset ($var_true))
             { $var_true = $var_false; }
         }
         $parse_url = parse_url ($url);
         _safe_set ($parse_url["scheme"], "http");
         _safe_set ($parse_url["host"], $_SERVER['HTTP_HOST']);
         _safe_set ($parse_url["path"], "");
         _safe_set ($parse_url["query"], "");
         _safe_set ($parse_url["fragment"], "");
         if (substr ($parse_url["path"], 0, 1) != "/")
         {
             $parse_url["path"] = dirname ($_SERVER['PHP_SELF']) .
                                  "/" . $parse_url["path"];
         }
         if ($parse_url["query"] != "")
         { $parse_url["query"] = $parse_url["query"] . "&"; }
         $parse_url["query"] = "?" . $parse_url["query"] .
                               session_name () . "=" .
                               strip_tags (session_id ());
         if ($parse_url["fragment"] != "")
         { $parse_url["fragment"] = "#" . $parse_url["fragment"]; }
         $url = $parse_url["scheme"] . "://" . $parse_url["host"] .
                $parse_url["path"] . $parse_url["query"] .
                $parse_url["fragment"];
         session_write_close ();
         header ("Location: " . $url);
         exit ();
     }


    This might save somebody the hours that the help-desk team and I spent trying to figure out what this problem was.


    My apologies to the help-desk team for bringing a scripting issue in their direction.

  13. You might know that probably the only meta tag Google will even consider is the description tag (besides obeying robots directives).


    Also, I noticed you don't have a keywords tag. There are still other search engines that use it.


    The robots tag ALLOW is wrong. Try searching Google for the following.


    name="ROBOTS" content="ALLOW"




    name="ROBOTS" content="FOLLOW"


    The first search has 1 match; the second has 1,050.


    Considering the search results from Google, I'm pretty sure that ROBOTS ALLOW isn't correct.


    Also, I'm quite positive that you don't have to give a robot permission to crawl your site. They're going to crawl it until you specifically tell them not to.
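    For reference (engines vary, so treat this as the common convention rather than a guarantee), the values the robots meta tag actually defines are index/noindex and follow/nofollow. An explicit opt-in looks like this, though omitting the tag entirely has the same effect:

```html
<!-- Equivalent to the default: crawlers index the page and follow its links -->
<meta name="robots" content="index,follow">
<!-- The only case where the tag really matters: opting out -->
<meta name="robots" content="noindex,nofollow">
```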

  14. Well, I gave up at first, since I had a working database on my local machine. But lately the volume of work I have has demanded that I design the sites on TCH's servers, so I had to get Dreamweaver to connect to the DBs here.


    It turns out it's not that bad.


    Make sure you set up all your UN/PW info in Site Definition.


    Local Info:


    Site Name: ?

    Local Root Folder: ?

    etc, etc...


    Remote Info:


    Access: FTP

    FTP Host: ftp.******

    Host Directory: /www/

    Login: username

    Password: ?


    Testing Server:


    Server Model: PHP MySQL

    Access: FTP

    FTP Host: ftp.******

    Host Dir: /www/

    Login: username

    Pass: ?

    URL Prefix: http://www.******/



    That's it for that area. Now go to create a record set in your document.


    Click on 'Define...', then 'New'.


    MySQL Connection:


    Connection Name: whatever you wanna call this connection.

    MySQL Server: www.******

    UserName: username_username (we'll get to this in a second.)

    Password: ???


    OK, now if you click 'Select', you should receive a list of the DBs under your account. But this isn't going to work unless you do a few things in CPanel first. (I had gotten to about this point when I gave up. :) )


    Go into CPanel...


    Go to MySQL Databases...


    Under 'Access Hosts', either add % or your IP. I've tried 68.% and 68.%.%.% for example, and neither works.


    Next, right above 'Access Hosts', is Users:. If you haven't added one yet, add one now. It is required for all this to work.


    Now you've added a user and an access host, so the last thing to do is give the new user you created access to your databases. (I'm assuming you have databases already created.)


    At the top of the page, you'll see a form with a bunch of check boxes. (Privileges:) Just leave "ALL" checked or check off whatever type of access you want to give to your user. Then click the submit button 'ADD USER TO DB'. The next screen will say, 'Account added to Access List', click 'Go Back', and you should see that the top half looks a little bit different now.


    It will say...


    Users in [yourDBname]


    username_username (Privileges: ALL PRIVILEGES) [delete]


    Connection Strings

    .... etc etc.. (it will list PHP and Perl DB connection string for you to use.)


    But you don't need any of that, because Dreamweaver will do your dirty work for you now.


    Now back to Dreamweaver MySQL Connection settings....


    MySQL Connection:


    Connection Name: whatever you wanna call this connection.

    MySQL Server: www.******

    UserName: username_username (the username you created.)

    Password: ???


    Click on 'Select' and you should see your database(s). Select it and you're good to go.


    I'm not giving any warranties, but this worked for me, and I'm able to duplicate it for the rest of my resold accounts with ease. It took a help desk ticket for me to figure out that you have to "add the user to the DB".


    Hope this helps.. Good luck Thumbs Up


    Rock Sign

  15. I read your post on OMGN. The first thing I thought was: why didn't you use an automated task to do that? It's really not that hard to update and enable a task once you have it set up. I don't think the task scheduler works with command line switches very well, so create a batch file and run the batch file to shut down your computer. Hell, that 3rd party app could have just been a nasty lil' virus.. who knows..




    c:\windows\System32\shutdown.exe -s -t 0



    Or actually, you could make a shortcut to this batch file on your desktop and not even use a scheduled task. Just load up and play Winamp, then click a shortcut that links to a batch file with...


    c:\windows\System32\shutdown.exe -s -t 1800


    That would set you up for 30 minutes, and your system WILL shut down properly because it is initiated from Windows' shutdown utility.


    And just to top it all off... Let's say that you happen to not want to shutdown your computer after you initiated the 30 minute sequence..


    Just run "shutdown -a" to abort the countdown.


    I happen to have two icons on my desktop. One to reboot, the other to shutdown.


    This isn't my computer, or else I'd have it on all the time.

  16. I've got Firefox, at least for website testing purposes.

    I use Firefox on my local computer for creating thumbnails of URLs via a web interface from a remote computer. The reason I use Firefox is its popup blocking. (If my routine gets a popup (with IE), the VBScript can't close all the windows, and then the whole 'process' doesn't work too well after that.)



  17. You can access the logs via FTP




    I'm not familiar with StatsNow, but you could set up a cron job to run a PHP script that will download the logs via FTP and save them in the same directory as your script. (Then you wouldn't even need an absolute path.)


    I'm sure you'll figure something out if you're determined enough. Just know that you can have access to your logs on demand. You just have to get a 'local copy' of them first.


    I'm trying to setup a program called StatsNow which monitors web site usage but it needs to know the absolute path to the raw apache log files. Does anyone know the path to where these files for my site are kept? I've looked through my root folder but all the log files I can find are for the awstats etc.
  18. Actually guys... You can forget about having to use Perl for this. You had me all bummed out thinking I wasn't going to be able to get this done!


    >You can only access the domain log of the account you are trying to access it with.


    You can access anybody's logs from anywhere. (As long as you have their UN & PW.)


    Just ftp to:




    I needed a script that would, as you know, keep my database updated with the bandwidth usage of all of my resold accounts. (So I didn't have to try to make sense of cPanel's usage chart.)


    FYI, this is how I do it...


    A PHP script, triggered by a cron @ 12:01 AM every day..


    FTP to the location where you can pick up your logs, download the log, strip it of all lines that are not from yesterday, parse it and get the bytes transferred, delete the downloaded file, update my DB with the new bandwidth total, and restart the loop for every domain in the DB.


    Works like a charm!
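    A minimal sketch of the parsing step, assuming the logs are Apache common/combined format (the FTP download and DB update parts are omitted, and the date format and field positions are assumptions about the actual logs):

```php
<?php
// Sum the bytes transferred on a given day from an Apache common/combined-format log.
// Assumes lines like: 1.2.3.4 - - [05/Mar/2004:10:11:12 -0500] "GET / HTTP/1.1" 200 1234 ...
function bytes_for_day($logfile, $day) {   // $day like '05/Mar/2004'
    $total = 0;
    $fh = fopen($logfile, 'r');
    if (!$fh) { return 0; }
    while (($line = fgets($fh)) !== false) {
        // Only count lines stamped with the requested day.
        if (strpos($line, '[' . $day) === false) { continue; }
        // The response size is the field right after the 3-digit status code;
        // a "-" size (no body) simply won't match and is skipped.
        if (preg_match('/" \d{3} (\d+)/', $line, $m)) {
            $total += (int)$m[1];
        }
    }
    fclose($fh);
    return $total;
}

// Yesterday, formatted to match the log's timestamp day.
$yesterday = date('d/M/Y', time() - 86400);
// echo bytes_for_day('access_log', $yesterday);
?>
```

    Line-by-line reading with fgets() keeps memory flat even on big logs, which matters when the cron runs against every resold domain in a loop.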

  19. I have a broadband connection and it didn't take that long to do.


    Yeah, same here.. But my upstream is limited to 35 kB/sec, so that is kinda slow sometimes.. You can never have a connection that's TOO fast.. Thumbs Up


    Just thought I'd ask to see if it was possible. I've found that many things are possible with TCH, but not this, I guess.. Overall, NOT a big deal.


    TCH ROCKS!!!


    Rock Sign
