Error 500 Debug

n64man120

I'm working on a script that parses a large CSV file and loads it into an associative array. Below is the basic debug code I'm trying to get working for now.

It works with the CSV files I have that are up to 1.5MB, but it fails with my 4MB file and returns a generic Error 500. I'm on a 1&1 account, so as far as I know more detailed logs aren't available. One valuable clue (I believe): when I execute the script through an SSH session, it completes and returns the value properly.

Any ideas where to start looking for timeout/filesize settings that would cause it to fail in the browser but not in a shell session?

Code:
// Assumes $handle was already opened with fopen() on the CSV file
if ($handle) {
	$listings = array();

	// First pipe-delimited line holds the column headings
	$header = fgetcsv($handle, 1000, "|");

	// Build one associative row per data line, keyed by heading
	while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
		$row = array();
		foreach ($header as $key => $heading) {
			$row[$heading] = isset($data[$key]) ? $data[$key] : '';
		}
		$listings[] = $row;
	}
	fclose($handle);

	print_r($listings[6]["AGENT_FORM_NAME"]);
}

Thanks,
Eric
 
I'm not really a PHP guy, but looking at your description and the spec of fgetcsv(), it would appear that the second argument, 1000, is what's causing the error. You can either increase it or, if you're on PHP 5, remove it altogether.
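
If you're on PHP 5, a rough sketch of what that change looks like (passing 0 as the length means "no line-length limit"; the filename is just a placeholder):

Code:
// PHP 5+: a length of 0 tells fgetcsv() not to limit line length
// (a bit slower, but long rows won't be truncated)
$handle = fopen('listings.csv', 'r'); // placeholder filename
if ($handle) {
	while (($data = fgetcsv($handle, 0, "|")) !== FALSE) {
		// $data holds the pipe-separated fields of one line
	}
	fclose($handle);
}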

Btw, all an HTTP 500 error means is that something went wrong on the server; yes, it couldn't be less descriptive. Look up exception handling to get more detail and to cover your arse when something goes wrong.
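
fgetcsv() itself doesn't throw exceptions, but you can at least get the real error message out instead of a bare 500 while debugging. A minimal sketch (on shared hosting you may need to set display_errors in php.ini or .htaccess instead):

Code:
// Show the underlying error instead of a blank 500 page while debugging
error_reporting(E_ALL);
ini_set('display_errors', '1');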
 
How are you getting the CSV file? Is it uploaded from a form? I believe the default maximum file size for uploads in PHP is 2MB, though you can override that setting in php.ini.
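
If the file does come in through a form upload, it's worth checking what the host will actually accept; upload_max_filesize and post_max_size are the two settings involved. A quick check (nothing here is 1&1-specific):

Code:
// What the server currently allows for form uploads
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size:       ' . ini_get('post_max_size') . "\n";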
 
Shared hosting with 1&1? Seems like something is stopping you from going past 2MB. Have you tried a file just over 2MB? How about using a smaller buffer/length? I vaguely remember running into this problem myself, so I want to say it's some PHP setting, but I can't remember what it was. :\
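
For what it's worth, the CLI and the web server usually load different php.ini files, which could explain it working over SSH but failing in the browser. A small sketch to compare the two environments (run it both ways):

Code:
// Run this from the browser and again over SSH and compare the output;
// phpinfo() will also show which php.ini each environment loaded.
echo 'memory_limit:       ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";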
 
When run at the CLI, how long does it take to run? How much memory is allocated by the process (try memory_get_peak_usage()) when you're done?

My first guess would be that you're exceeding your time/space resource limits.
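
If it is a resource limit, a rough sketch of how you could measure the run and ask for more headroom for just this script (whether ini_set() is honoured depends on the host; the values are only examples):

Code:
// Ask for more headroom for this script only; shared hosts may ignore these
ini_set('memory_limit', '64M'); // example value
set_time_limit(120);            // seconds

// ... CSV parsing goes here ...

// How much memory the run actually needed
echo memory_get_peak_usage() . " bytes peak\n";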
 