eZ Platform Discussions

Allowed memory size exhausted: eZ Platform content import

Hello guys, I have a Symfony command that imports data from a CSV file and displays output for each line, indicating whether the line was successfully imported or not.
The command mainly does four things (a rough sketch of one iteration follows the list):

  1. Recover the data from a CSV file line.
  2. Encapsulate it in a DTO object.
  3. Persist it in the database.
  4. Reindex the object in Solr and delete the cache pool.
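
Roughly, one iteration does something like this (simplified; the 'article' content type, the field names and the parent location id 2 are just placeholders, not my real code):

    // One CSV line -> one published content item (simplified sketch).
    $fields = str_getcsv($line, ';');

    $contentType  = $contentTypeService->loadContentTypeByIdentifier('article');
    $createStruct = $contentService->newContentCreateStruct($contentType, 'eng-GB');
    $createStruct->setField('title', $fields[0]);
    $createStruct->setField('body', $fields[1]);

    $locationCreateStruct = $locationService->newLocationCreateStruct(2); // parent location

    $draft = $contentService->createContent($createStruct, array($locationCreateStruct));
    $contentService->publishVersion($draft->versionInfo);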

The command works without any issue for 400 lines, but when the CSV contains more lines the command gets killed and a

PHP Fatal Error: Allowed Memory Size Exhausted …bytes exhausted (tried to allocate…)

is thrown.

This is part of my code:

class ImportCSVCommand extends Command
{
    // the CSV can contain more than 20k lines
    foreach ($lines as $line) {
        try {
            $fields = $this->getFieldsFromLine($line);
            // ....
            $draft = $contentService->createContent($contentCreateStruct, array($locationCreateStruct));
            $content = $contentService->publishVersion($draft->versionInfo);
            $output->writeln('Line has been successfully imported');
        } catch (\Exception $e) {
            $output->writeln('Cannot import line ...');
        }
    }

Most of the time when I work with CLI PHP migration scripts, even with 20k content items, I don’t get any CPU problems. But with this Symfony command I get the exception because it consumes a lot of memory.

I already tried executing it with php -d memory_limit=-1, but that didn’t work well. Does anyone have an idea how to optimize my Symfony command, or any solution that could help me import a huge amount of content?

Any serious help would be appreciated and upvoted.

Hi Ahmed!
One thing you could do is to not index each object. Just skip that, and when the import is complete, reindex the whole database. I think this will be more efficient, at least if the amount of content you import is larger than what you already have.
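
As a sketch (not a drop-in; ezplatform:reindex is the full-reindex console command in recent eZ Platform versions, adjust it to your setup), you would remove the per-line Solr call from the loop and run one reindex at the very end of the command:

    // After the whole CSV has been processed, reindex everything in one go
    // instead of reindexing each object inside the loop.
    exec('php bin/console ezplatform:reindex', $reindexOutput, $exitCode);
    if ($exitCode !== 0) {
        $output->writeln('Full reindex failed, check the Solr logs.');
    }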

Such memory issues can be hard to fix. A fairly easy way to work around them is to have an outer script that reads through all your CSV lines and then sends a limited number of them, say 100, to an inner script that does the actual import. The inner script then terminates before the memory problem happens, and the outer script restarts it with the next 100 items, and so on. For this to work, you can’t include() or require() the inner script; you have to start it as a separate process using exec(), system() or similar.
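
For illustration, here is a rough sketch of such an outer script in plain PHP. It assumes the inner import command is called app:import-csv and accepts --offset and --limit options, which you would have to add to your command yourself:

    <?php
    // outer.php - run the real import in chunks of 100 lines, each in a
    // fresh PHP process, so memory is released between chunks.
    $csvFile    = '/path/to/import.csv';
    $totalLines = count(file($csvFile));
    $chunkSize  = 100;

    for ($offset = 0; $offset < $totalLines; $offset += $chunkSize) {
        $cmd = sprintf(
            'php bin/console app:import-csv %s --offset=%d --limit=%d',
            escapeshellarg($csvFile),
            $offset,
            $chunkSize
        );
        // system() streams the inner command's output to the console.
        system($cmd, $exitCode);
        if ($exitCode !== 0) {
            echo "Chunk starting at line $offset failed\n";
        }
    }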

NB: You have a memory problem, not a CPU problem :wink:


How do I skip indexing for each object? Is there any option on the create, publish, delete, hideContent… functions that can help?

I don’t know, but I think there are ways to disable the indexing temporarily.
Since you wrote the following, I thought you were indexing manually: