charlesleo Posted November 6, 2009 (edited)

Background information: I have a website with TCH called www.hdrsource.com. It was created using WordPress, and I run an online store using the WordPress plugin WP eStore for digital downloads. I'm hosting VERY large images (50 MB - 3 GB) on an external website called 4shared.com.

How WP eStore works: WP eStore uses PHP to encrypt the link of any digital asset you are selling, so that others cannot pass the link around. You can host your files locally or externally; it shouldn't matter. After an allotted time, the encrypted link goes dead.

The problem: When I host the large image files on an external website (mediafire.com and/or 4shared.com), downloads through the encrypted link created by the WP eStore script (locally) cut out prematurely.

Troubleshooting so far:
1) I've contacted both hosts, mediafire.com and 4shared.com, thinking it might be a security issue on their end. Both have informed me that it isn't.
2) I've been in contact with the WP eStore author for the past few weeks and we've been trying to work out what could be wrong. He showed me that he was able to download one of the test files (60 MB) successfully through his website with the plugin. When I try the exact same thing from mine, with the same file and the same external source, it still gives me a timeout.
3) Direct links to the 4shared and mediafire websites obviously work.
4) When the file is hosted locally with TCH, it downloads just fine.

Which leads me to believe there is some sort of scripting timeout or memory parameter somewhere, except I don't know where to begin to look. And that's why I'm asking you guys. Is there a php.ini file somewhere? A .htaccess parameter that should be adjusted? Maybe it's neither of those?

The question: How do I fix download timeouts when a script pulls files from a third-party website? It currently works fine with other sites, but not with mine on TCH.
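In case it helps anyone picture what's going on, here's a rough sketch of what I understand a "download gateway" script like this to be doing. This is NOT WP eStore's actual code, just my own simplified guess at the general approach; the URL and filename are placeholders. The point is that PHP relays every byte of the remote file, so the script keeps running for the whole transfer even though the file lives on 4shared/mediafire:

<?php
// Sketch only -- not the plugin's real code. Assumes allow_url_fopen is
// enabled so fopen() can read a remote HTTP URL.
$remoteUrl = 'http://www.example.com/path/to/large-image.tif'; // placeholder
$fileName  = 'large-image.tif';                                // placeholder

// Ask PHP to allow a long-running request (a slow connection may need hours
// for a multi-gigabyte file). The host may still enforce its own ceiling.
set_time_limit(7200);

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $fileName . '"');

$in = fopen($remoteUrl, 'rb');
if ($in === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit('Could not reach the remote file host.');
}

// Relay in small chunks so memory use stays flat regardless of file size.
while (!feof($in)) {
    echo fread($in, 8192);
    flush();
}
fclose($in);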
TCH-Bruce Posted November 6, 2009

First question, does the download start? If the download starts then it sounds like an issue with the site serving the file.
charlesleo Posted November 6, 2009 Author

The download starts on both of these other sites. The script works perfectly fine with the author's WordPress site to pull these files down in full, but it doesn't work fully on mine (partial download, then it cuts out seemingly at random). That's why I was asking here whether it might be a php.ini or .htaccess permission problem, perhaps some sort of memory or scripting timeout. Maybe I'm completely off base; unfortunately, I'm not a security expert. I just don't see why it would work on one website and not another.

As I mentioned before, the script also works completely fine on mine if I host the files locally. Unfortunately, I already have the largest plan with you guys and I'm about to run out of space, which is why I'm trying to host all the files remotely.
TCH-Bruce Posted November 6, 2009

Sounds more like the sending server is losing its connection with the receiver. I'm only guessing here. I can see php.ini having settings that limit file uploads, but I don't think there's anything in it that would block downloads. Could it be a routing issue?
Bob Crabb Posted November 7, 2009

It might be a script timeout issue. If so, you can try adjusting max_execution_time by setting a value in your .htaccess file, for example:

> php_value max_execution_time "120"

This would set it to 120 seconds. I would experiment with one of the smaller files that isn't working, and work up from there.

Also, if the script is reading the file and passing it on to the user, then you might have a memory limit issue. In that case you would need to experiment with the memory_limit value by setting it in your .htaccess file.
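If you end up needing both, the two lines together in your .htaccess would look something like this (the numbers are only examples to experiment with, and whether php_value overrides are honored depends on how PHP is set up on the server):

> php_value max_execution_time "300"
> php_value memory_limit "128M"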
charlesleo Posted November 7, 2009 Author

Okay. So do I have a php.ini file somewhere? Or do I only have access to .htaccess?
TCH-Bruce Posted November 7, 2009

You cannot access the php.ini file. You only have access to your .htaccess files.
charlesleo Posted November 12, 2009 Author (edited)

Thanks guys for your information and knowledge. Are there any negative consequences to raising the value below past '120' in the .htaccess file?

> php_value max_execution_time "120"

I should add that, having changed this value to '7200' seconds, it seems to be working just fine now. Thank you. The reason I chose '7200' is that I figured a 2-3 GB file over a slower connection might take someone 1-2 hours to download. Perhaps I'm way off in thinking that a value of '7200' covers the script timeout issue. If you think this was a wise thing to do, please say so; otherwise I'd like to know of any potential consequences of such a large value.

Another issue I've encountered is with Firefox: while I'm downloading the file (although technically it is hosted remotely), it won't let me continue browsing my website while the download is active. It's as if the download takes up or locks out resources. Does anyone know what is going on here? On the other hand, if I open a different browser such as Internet Explorer while that download is running, I am able to surf the site again.
charlesleo Posted November 14, 2009 Author (edited)

Now I seem to have hit another wall, unfortunately. Files for download (again, hosted externally but served through my TCH-hosted site) that are above 447 MB time out consistently, even after raising the execution time well beyond 7200. So it appears that, after modifying the max execution time, I'm hitting a 'roof' somewhere else in the server configuration or .htaccess. At least it no longer happens on the smaller files. Does anyone know what could be causing the error for the larger 447 MB+ files?
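If it turns out to be the memory_limit ceiling Bob mentioned earlier, I'm guessing the .htaccess line would be something along these lines (just my assumption; I haven't confirmed whether the server allows overriding it, or what value it's currently set to):

> php_value memory_limit "512M"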
TCH-Bruce Posted November 14, 2009

There may be a server limit to how high you can set this value. I would open a ticket with the help desk to find out.
charlesleo Posted November 15, 2009 Author

Thanks Bruce. I posted a ticket after seeing another TCH thread on something similar and am waiting to hear back. For now, my workaround is simply to divide the file into several .rar parts available for download. It works, but I would still like to find the answer to this.