edward19283
Member
Posts: 31
Everything posted by edward19283
-
That seems to work. Thanks. Ed S.
-
I am looking to embed the username and password in the URL to auto-submit login info for access to files in a protected directory. I've done this with other sites, but can't figure out how, or whether, I can do this with my TCH site. Here is what I am currently doing: http://mydomain/protecteddirectory/filename.ext?User ID$=username&Password$=password I am using User ID$ and Password$ because those are what get me in when I auto-submit to access my cPanel (via an auto-submit program I use to help manage access to websites). It does not appear to be working here, as the username and password dialog box still pops up. Any suggestions? Thanks. Ed S.
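For what it's worth, cPanel's Web Protect uses HTTP Basic Authentication, and Basic-Auth credentials normally go before the hostname rather than in the query string. A minimal sketch of building such a URL; the credentials (edsmith/secret) and domain are made up for illustration:

```shell
# Hypothetical credentials, for illustration only
USER="edsmith"
PASS="secret"

# Basic-Auth credentials go before the hostname, not in the query string
URL="http://${USER}:${PASS}@mydomain/protecteddirectory/filename.ext"
echo "$URL"
```

Note that some browsers block or warn about credentials embedded in URLs, so whether this works depends on the client being used.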
-
I submitted a trouble ticket with TCH and there was a syntax error in my cron job. Here is the cron that is working for me: mysqldump -u cpanelusername_dbname --password='password' cpanelusername_dbname | gzip -c > /home/cpanelusername/path
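For reference, here is roughly what that command looks like wrapped in a full crontab entry — a sketch only; the schedule, paths, and credentials are placeholders:

```shell
# min hour day month weekday  command -- runs nightly at 02:30 (illustrative schedule)
30 2 * * * mysqldump -u cpanelusername_dbname --password='password' cpanelusername_dbname | gzip -c > /home/cpanelusername/path/backup.sql.gz
```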
-
I found instructions on using tar to backup the directory, located here: http://www.tldp.org/LDP/lame/LAME/linux-ad...ver-backup.html It seems to be working fine for me.
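The core of that approach is a one-line tar command. A minimal, self-contained sketch using a throwaway directory (all paths here are illustrative, not the actual site layout):

```shell
# Create a throwaway directory standing in for the one to back up
mkdir -p /tmp/tar_demo/site
echo "hello" > /tmp/tar_demo/site/index.html

# -c create, -z gzip-compress, -f archive filename; -C sets the working directory
tar -czf /tmp/tar_demo/site-backup.tar.gz -C /tmp/tar_demo site

# List the archive contents to verify the backup
tar -tzf /tmp/tar_demo/site-backup.tar.gz
```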
-
I checked usernames and passwords and made sure I was using my cPanel username where appropriate, and still got the same problem. I created a new database username and password, added it to the database with all privileges, and created a new cron with this information, and I got the same response, copied below, except it now says it is not using the password:

Enter password: mysqldump: Got error: 1045: Access denied for user: 'cpanelusername_dbusername@localhost' (Using password: NO) when trying to connect

I looked around hotscripts.com, per your suggestion, and found another script for MySQL backups (automysqlbackup). I decided to try it, configured it to run through cron, and it appears to be having the same "Access denied for user" problem. Its error message is copied below:

mysqldump: Got error: 1044: Access denied for user: '@localhost' to database 'dbname1' when selecting the database
/home/cpanelusername/automysqlbackup.sh.2.2: line 572: /bin/mail: Permission denied

Any suggestions? Thanks.
-
I seem to be having a problem with my script to do a simple MySQL backup. Here is the script:

mysqldump -u cpanelusernamename_dbusername -ppassworrd cpanelname_dbname | gzip -c > /home/cpanelname/backups/backup-dbname.sql.gz

cpanelusernamename = your login username for cPanel
password = your password for the database (note: there is NO space between -p and the password)
dbname = the name of your database

The error I am getting is:

mysqldump: Got error: 1045: Access denied for user: 'cpanelusernamename_dbusername@localhost' (Using password: YES) when trying to connect

Also, I would like to do a simple backup of a directory. What would be the script to do that? Thanks.
-
After some digging around, I think I somewhat answered my own question.

First, cPanel creates a raw log file that is accessible in the main menu. I downloaded it (it covers the current month to date; there is a menu option to have cPanel save the last month's raw log file), unzipped it, and opened it in Excel, after some manipulation with delimiters to get the data parsed properly. This data is pretty raw, but I could do some basic sorting and find out which hits were RSS readers vs. browser clients, and, while there are multiple hits from the same aggregator, one could probably tell if it's a different aggregator by looking at its corresponding IP address. Whether in Excel or MS Access, I could probably figure out how to do counts of aggregators based on different IP addresses.

Second, since I do not have a database-driven site, I learned that to track click-throughs from the RSS feeds independent of a browser, I can create the feeds with the link back to my site or an article in the site, but attach the following at the end of the URL in the RSS feed: ?src=rss (rss is the name I give it, but it could be anything). For example, with URLs to my site from emails I send out to subscribers, I can attach ?src=email. It appears that Awstats does not track a URL with the ?src= parameter, but the raw log file does, so I suppose, while a little cumbersome, I could look at my raw log file and get an idea of click-throughs to my site from the RSS feed or my emails, independent of browser hits.

So, in summary, I can get a rough idea of the unique readers that hit my feed, and get an idea of the click-throughs from the URLs in the feeds.
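The raw-log counting described above can also be done with standard shell tools instead of Excel. A sketch against a made-up three-line excerpt of a combined-format access log (the IPs, paths, and dates are invented for illustration):

```shell
# Fake excerpt of a raw access log, standing in for the cPanel download
cat > /tmp/access_sample.log <<'EOF'
10.0.0.1 - - [01/Jan/2005:10:00:00 -0500] "GET /article.html?src=rss HTTP/1.1" 200 1234
10.0.0.2 - - [01/Jan/2005:10:05:00 -0500] "GET /article.html?src=email HTTP/1.1" 200 1234
10.0.0.1 - - [01/Jan/2005:10:10:00 -0500] "GET /other.html?src=rss HTTP/1.1" 200 5678
EOF

# Count hits tagged as RSS click-throughs via the ?src=rss marker
RSS_HITS=$(grep -c 'src=rss' /tmp/access_sample.log)
echo "RSS click-throughs: $RSS_HITS"
```

The same grep with src=email would separate out the email click-throughs.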
-
Awstats gives me the views of my RSS feeds. Is there any way, in Awstats or otherwise, to get information about the unique RSS aggregators that hit the feed? I care less at the moment what kind of aggregator it is (although longer term it would be beneficial to know that); I just want to know the unique aggregator hits.

I have used Feedburner, but that is not a good permanent solution. If I am redirecting my feed subscribers to Feedburner, then I lose control if I ever want to dump Feedburner, and subscribers would have to change the feed address. Also, using Feedburner gives them the search-engine credit for linking. As a workaround, I have the same feed in two directories. Users subscribe to the feed from one directory, while Feedburner is pointed at the feed in the second directory. Thus, when I want a snapshot of my feed stats, I can redirect the subscriber feed to Feedburner, wait 24 hours or so, check the stats on Feedburner, then delete the redirect. So far it seems to work fine, but I am looking for a more permanent solution rather than having to employ this redirect. I don't think it's fair to Feedburner, either, for me to be doing this.

Is there some other stats package out there that provides a Feedburner-like tracking service, but which is installed on the server? I am not really a developer (a hack at best), so I suppose creating a script that captures the header data coming from aggregator hits would work, but that is far beyond my technical capabilities and I am not sure I have the time to learn it. Thanks.
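If the feed URL shows up in the raw access log, a rough unique-aggregator count can be pulled with shell tools by counting distinct client IPs. A sketch over an invented log excerpt (the feed path and addresses are illustrative; two distinct IPs hit the feed, one of them twice):

```shell
# Fake access-log lines standing in for real feed hits
cat > /tmp/feed_sample.log <<'EOF'
10.0.0.1 - - [01/Jan/2005:10:00:00 -0500] "GET /feed.xml HTTP/1.1" 200 900
10.0.0.2 - - [01/Jan/2005:10:30:00 -0500] "GET /feed.xml HTTP/1.1" 200 900
10.0.0.1 - - [01/Jan/2005:11:00:00 -0500] "GET /feed.xml HTTP/1.1" 200 900
EOF

# Unique client IPs requesting the feed -- a rough proxy for unique aggregators
UNIQUE=$(grep '/feed.xml' /tmp/feed_sample.log | awk '{print $1}' | sort -u | wc -l | tr -d ' ')
echo "Unique feed clients: $UNIQUE"
```

It is only a proxy: hosted aggregators fetch once for many subscribers, and dynamic IPs can inflate the count.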
-
Cron Job Requiring Username And Password
edward19283 replied to edward19283's topic in CPanel and Site Maintenance
Well, I should have read ALL the posts, as I followed the last entry at the bottom, using this script, and it worked: php /home/cpanelname/public_html/main/wordpress2/wp-mail.php
-
I have a password protected directory and need to execute a cron job. I read this post Cron Job For Password Protected Directory and followed the directions and still get a permission denied. I am guessing I need to put the username and password in somewhere, but not sure where. I don't really know php, other than being able to read it a little and make minor changes, like font, etc., so modifying the script I am using is probably not something I should do. Is there another way to pass the username and password in the cron job itself? Thanks.
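One common way to pass a username and password in the cron job itself is wget's HTTP auth options, which avoid the permission-denied prompt on a Web-Protected URL. A sketch of a crontab entry; the schedule, URL, and credentials are placeholders, not the actual setup:

```shell
# Runs every 15 minutes; --http-user/--http-passwd supply the Web Protect login
*/15 * * * * wget -q -O /dev/null --http-user=username --http-passwd=password http://mydomain/protecteddirectory/wp-mail.php
```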
-
I seem to have fixed my problem, although I am not exactly sure what it was. After repeatedly trying to install again, I did a fresh download of the tar.gz file, instead of the .zip file I had been working with, uploaded it to my server, then unzipped it from there. I also opened up permissions on all top-level folders, in addition to the subfolders required, and went through the install. All worked fine from there.
-
Thanks. That worked great. Ed S.
-
Thanks. Ed S
-
I normally unzip server-side programs on my desktop and upload them via FTP. I am working on a new program and find that when I FTP a large number of files, the server or connection hangs at some point, so my FTP gets cut off. I use SmartFTP and have passive file transfer enabled. I am having problems getting my current program working, so I am wondering if having to restart the FTP where I left off may be causing me to leave out some files. Is there a way for me to upload the zip file of the software and unzip it from the server so that all the files get unzipped and copied properly? Thanks. Ed S.
-
What is the PHP memory size? I am working on installing TikiWiki and the recommendation is at least 8 MB, but preferably 16 or 32. I believe I can easily get by with 8 MB. Thanks. Ed S.
-
I have not found documentation about the problem yet, and I duplicated the install on another domain on another server that I have at TCH and came up with the same issue. However, while I am at it, one issue I came across in my search through the TikiWiki forums probably needs confirmation. The PHP script memory size must be at least 8 MB (16 or 32 is recommended). What is the size at TCH?
-
Yes, there is a setup script, and there are fairly detailed setup instructions, which are easy to follow. I am trying to search through their forums now to identify the source of this latest problem...nothing so far. I will see what I can find and report back here. Thanks
-
I have multiple choices, from simply 'MySQL' to 'MySQL 3.0.x', 'MySQL 4.0.x', etc. Since you asked, I chose simply 'MySQL' and it seemed to be accepted, although now I am encountering the following error: Call to undefined function: mssql_get_last_message() in /home/dkqopvli/public_html/tiki/lib/adodb/drivers/adodb-mssql.inc.php on line 470
-
Yes.
-
I am installing TikiWiki 1.8.5 and, as far as I can tell, have successfully FTP'd it to its own directory, set the appropriate folder permissions, and created the MySQL database. On install, I keep getting the error message that it cannot find the database connection. I am providing the following:

Database type: MySQL 4.0.x (my cPanel shows version 4.0.22-standard)
Hostname: localhost
User: mysitename_admin
Password: password
Database name: mysitename_tiki

(mysitename is the name provided by TCH when I created the account, not my domain name.) Is there something very obvious that I am not doing properly? Any other recommendations? Thanks.
-
I accessed my webmail the other day via Horde and saw a much-improved look...version 3.0.2. I also maintain a website for my wife with TCH, and her version of Horde is an earlier one (not sure which) that used to be what mine looked like. She uses webmail all the time...what is the process for upgrading her account? Thanks.
-
For the heck of it, I tried this using IE and it worked fine. So perhaps the question has to do with Firefox. Other than the cache and the password list, is Firefox retaining the information somewhere else?
-
I web-protected a directory within public_html earlier and it worked fine. What I have are PDF forms in the directory that require a username and password to access. Now it is not working, allowing anyone full access. I have reset it by deselecting Web Protect for that directory and deleting the users and passwords, then reselecting protection and adding new users and passwords. It says it's protected, but I can access it via the web without a username and password. I checked the server status and all appears OK. I have cleared my cache and checked my passwords list in Firefox, and still I can get into the directory without a problem. Am I missing something, or is there something else going on? Thanks
-
I just figured out what my problem was. Thanks for the reply. Also, I use an FTP program, except that it is a cumbersome way to upload images. I want to be able to do it within WordPress directly, instead of having to use my FTP program, and I think I just enabled that by solving my issue with changing directory permissions.
-
I am using WordPress and am configuring the file/content upload for use within the WordPress system. My current permissions are set to 755 for the content directory. WordPress says to use either 755, 765, or 777, even though the current setting, 755, does not appear to allow file uploads. I have tried changing to the other two options and am unable to do so through cPanel's File Manager. Any advice? Thanks
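If File Manager refuses the change, the same permission change can be made from a shell (or most FTP clients' CHMOD dialog). A minimal sketch on a throwaway directory; the path is illustrative, not the real WordPress layout:

```shell
# Stand-in for the WordPress content directory
mkdir -p /tmp/wp_demo/content

# 755 = rwxr-xr-x: owner full access, group and world read/execute
chmod 755 /tmp/wp_demo/content

# Print the octal mode to confirm the change took effect
stat -c '%a' /tmp/wp_demo/content
```

On a real site, 777 makes the directory world-writable, which is why upload scripts often require it but hosts often discourage it.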
