
RE: [PHP] Segmentation fault

------------ Original Message ------------
> Date: Sunday, June 21, 2015 12:22:28 AM +0200
> From: Anatol Belski <anatol.php@belski.net>
>
> Hi Frank,
>
>> -----Original Message-----
>> From: Frank Arensmeier [mailto:farensmeier@gmail.com]
>> Sent: Sunday, June 21, 2015 12:05 AM
>>
>> Hi List!
>>
>> I am working on a CLI script (PHP 5.5.25) that parses and imports
>> large amounts of data into a database. Usually, the script runs
>> for 5-15 minutes. For the most part, the script is working just
>> fine. But for some reason, it seems to hit some limit and PHP
>> crashes with a segmentation fault. I managed to hook GDB to
>> the PHP process and got a backtrace. Unfortunately, I am not
>> much wiser than before.
>>
>> Can someone be so kind and point me into the right direction and
>> give me a hint why PHP crashes? The backtrace:
>>
>> Program received signal EXC_BAD_ACCESS, Could not access memory.
>> Reason: KERN_INVALID_ADDRESS at address: 0x0000000000000018
>> 0x000000010e8f416f in zend_signal_handler_unblock ()
>> (gdb) bt
>> #0  0x000000010e8f416f in zend_signal_handler_unblock ()
>> #1  0x000000010e8b47d5 in _zend_mm_alloc_int ()
>> #2  0x000000010e884cd4 in vspprintf ()
>> #3  0x000000010f5e112f in xdebug_error_cb ()
>> #4  0x000000010e8d5875 in zend_error ()
>> #5  0x000000010e8f4121 in zend_signal_handler ()
>> #6  0x000000010e8f3fae in zend_signal_handler_defer ()
>> #7  <signal handler called>
>> #8  0x00007fff855fce9a in write$NOCANCEL ()
>> #9  0x00007fff8fae8970 in _swrite ()
>> #10 0x00007fff8fae15e9 in __sflush ()
>> [...]
>>
>> I have already tried to give PHP more memory (512MB) but that had
>> no impact on the segmentation fault. Any ideas?
>>
> You can try to disable xdebug and see what happens.
>
> Regards
>
> Anatol
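[Editor's note: one quick way to test the suggestion above, assuming Xdebug is loaded via php.ini -- the extension path below is a placeholder, not taken from the thread:

```ini
; In php.ini (or the separate ini file that loads Xdebug), comment out
; the zend_extension line, then re-run the CLI script:
;zend_extension="/path/to/xdebug.so"
```

If the segfault disappears, the crash is happening inside Xdebug's error callback (visible as xdebug_error_cb in the backtrace) rather than in the script itself.]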

Are you trying to read in the whole data set, process it, and then
spit it out, or are you reading it incrementally (by line or by a
defined chunk)? Based on what you said, I suspect the former -- the
whole data set. In general, reading and writing incrementally (where
possible) keeps you from running into resource (e.g., memory) limits
and is a more robust long-term approach: as the input data set grows,
it is less likely to cause unexpected processing failures (e.g.,
resource limit issues) over time.
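[Editor's note: a minimal sketch of the incremental approach described above, assuming a line-delimited input file; process_line() is a hypothetical stand-in for the real parse-and-insert logic, not code from the thread:

```php
<?php
// Hypothetical per-line handler; in a real import this would parse the
// line and run the database INSERT.
function process_line(string $line): array {
    return str_getcsv(trim($line));
}

// Read the file one line at a time instead of loading it whole, so
// memory use stays flat regardless of the input file's size.
function import_file(string $path): int {
    $fh = fopen($path, 'r');
    if ($fh === false) {
        throw new RuntimeException("Cannot open $path");
    }
    $count = 0;
    while (($line = fgets($fh)) !== false) {
        if (trim($line) === '') {
            continue; // skip blank lines
        }
        process_line($line);
        $count++;
    }
    fclose($fh);
    return $count;
}
```

The contrast is with file_get_contents() or file(), which hold the entire data set in memory at once and scale poorly as the input grows.]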





--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
