Linode Community Forums
PostPosted: Wed Feb 29, 2012 8:15 am 
Senior Member

Joined: Wed Jan 21, 2009 7:13 pm
Posts: 126
Location: Portugal
Hi,

I have a PHP file that runs a heavy task.
It makes two nested loops: one over the rows of one table, and for each row it loops again over another table.

At the moment the first table has roughly 100 records, and the second loop will take some time for each of them because it needs to export some CSV.

I'm planning to run this as an hourly cron job, but I have some concerns about memory exhaustion and timeouts.

I can adjust php.ini for the CLI to allow a longer timeout and the needed memory allocation, but I suspect there is a better way to do this and to control the memory and time the task needs.
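The php.ini settings involved would be along these lines (the values here are only illustrative, not recommendations):

```ini
; Illustrative php.ini overrides for the CLI job -- example values only
memory_limit = 512M        ; raise only if the export genuinely needs it
max_execution_time = 0     ; 0 = no limit (the CLI SAPI already defaults to 0)
```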

Is there a better option/daemon/language that lets me control memory usage and running time, and guarantees that in the end everything runs smoothly and the data is exported without flaws?

Thanks


PostPosted: Wed Feb 29, 2012 9:31 am 
Senior Member

Joined: Sat Aug 30, 2008 1:55 pm
Posts: 1739
Location: Rochester, New York
It sounds like you might be able to do this entirely in SQL, which would probably avoid problems. If it's a simple nested loop situation, a cross join may do the trick.
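For instance (the table and column names below are made up, since we haven't seen the schema), a loop-within-a-loop export can often collapse into a single JOIN, and MySQL can even write the CSV itself:

```sql
-- Hypothetical schema: 'customers' and 'orders' stand in for the two tables.
-- One JOIN replaces the PHP nested loops, and INTO OUTFILE writes the CSV
-- server-side without pulling every row through PHP.
SELECT c.id, c.name, o.total
FROM customers AS c
JOIN orders AS o ON o.customer_id = c.id
INTO OUTFILE '/tmp/export.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```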

Barring that, most other languages won't have problems with memory leaks(*) or timeouts, so choosing something other than PHP would also avoid problems.

(*) 'course, if you're loading gigabytes of data into memory at once, you'll run out of memory. But that's what iterators are for.

_________________
Code:
/* TODO: need to add signature to posts */


PostPosted: Wed Feb 29, 2012 10:11 am 
Senior Member

Joined: Wed Jan 21, 2009 7:13 pm
Posts: 126
Location: Portugal
Hello,

Thank you for your feedback. I can optimize the DB/queries, but my main concern is guaranteeing that everything is processed without errors.

Suppose each of the 100 iterations takes 1 minute. Can PHP with a long timeout, started from cron, run for 100 minutes by itself without any problems?

Is there any daemon/cron alternative that controls this type of process?

I'm very worried about the long time that this task can take!

Thanks


PostPosted: Wed Feb 29, 2012 12:34 pm 
Senior Member

Joined: Tue May 26, 2009 3:29 pm
Posts: 1691
Location: Montreal, QC
PHP does not inherently leak memory, and it's perfectly possible to write a long-running PHP app. The problem is that garbage collection was kind of meh before 5.3. In fact, I'm not sure that it's that much better in 5.3, only that they claim it is. Regardless, if you code carefully, you can write long-running PHP apps.

Years ago, I wrote an IRC bot in PHP that would run for days or weeks at a time (I restarted it when I had to make changes, although I used separate processes for the plugins, which accounted for most updates, so those could be changed on the fly).

In fact, the biggest problem was that the most popular feature of my bot was the integration with a chatterbot, written in C (not by me), and THAT had memory leaks; it could only run for a few days at a time before the memory leak got too bad.


PostPosted: Wed Feb 29, 2012 1:46 pm 
Junior Member

Joined: Fri Jul 08, 2011 7:46 pm
Posts: 44
Website: http://ericsonwilkinson.me
Location: United States
If your logic is fairly straightforward, consider using a stored procedure/function in MySQL.

You can also create triggers if you don't want to run it as a cron job.
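A trigger-based sketch (again with made-up table names, since the real schema hasn't been posted) keeps a summary table current as rows arrive, instead of rebuilding everything hourly:

```sql
-- Hypothetical: maintain a running per-customer total on every insert into
-- 'orders', so the hourly export only has to dump 'customer_totals'.
CREATE TRIGGER orders_after_insert
AFTER INSERT ON orders
FOR EACH ROW
  UPDATE customer_totals
  SET total = total + NEW.total
  WHERE customer_id = NEW.customer_id;
```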

Eric

PostPosted: Thu Mar 01, 2012 4:32 pm 
Senior Member

Joined: Tue Nov 24, 2009 1:59 pm
Posts: 362
Isn't php-cli hardcoded to never time out, no matter what you set in php.ini?
And yeah, it definitely sounds like something you should be able to do in SQL faster and better, at least from the vague description you gave us.

_________________
rsk, providing useless advice on the Internet since 2005.


PostPosted: Thu Mar 01, 2012 5:36 pm 
Senior Member

Joined: Fri May 02, 2008 8:44 pm
Posts: 1121
rsk wrote:
Isn't php-cli hardcoded to never timeout no matter what you set in php.ini?

Yes, unless you manually override it from inside the script by calling set_time_limit().

One side effect of this behavior is that if your script hangs for whatever reason, like an infinite loop, it will keep running until you kill it. So if you're going to run complicated PHP scripts repeatedly from cron, it's often a good idea to check if the previous run has completed successfully.
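One common way to do that "has the previous run finished?" check is a lock taken in the crontab entry itself (the lock path and the command it guards are hypothetical placeholders here); `flock -n` exits non-zero immediately instead of starting a second copy while the first still holds the lock:

```shell
#!/bin/sh
# Cron wrapper sketch: flock(1) grabs an exclusive lock on the lock file;
# with -n it fails fast if the previous run still holds it, so an hourly
# cron entry never starts a second overlapping export.
# In a real crontab the guarded command would be e.g. "php /path/to/export.php".
LOCKFILE=/tmp/export.lock
flock -n "$LOCKFILE" sh -c 'echo "lock acquired; export would run here"'
```

If the script dies, the kernel releases the lock when the process exits, so there is no stale-lockfile cleanup to worry about, unlike hand-rolled PID files.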

As for memory leaks, they are mostly a thing of the past, unless you use a horribly written library. In PHP 5.3, calling gc_enable() at the beginning usually helps, even in the case of rogue libraries.


Powered by phpBB® Forum Software © phpBB Group