Allowed memory size exhausted in PHP for loop
I'm facing a fatal error while trying to manipulate a huge array of arrays in PHP and return the result as the response to an HTTP POST request:




Allowed memory size of 536870912 bytes exhausted




I have already tried setting ini_set('memory_limit', '-1'); to see if I could get the result, but I didn't get any kind of response. Postman crashed every time I made the POST request.



The starting structure of the array is shown below. The body size is around 25 MB. The main array contains around 22k arrays with this structure; I have included just 2:



Array
(
    [0] => Array
        (
            [id] => 14
            [isActive] => 1
            [personId] => 0023fff16d353d16a052a267811af53bc8bd42f51f2266a2904ca41db19dfd32_0
            [gender] => m
            [age] => 69
            [linedata_0] => 2018-03-01 17:15:18, 155.59, 294.076; 2018-03-01 17:16:04, 502.968, 249.947; 2018-03-01 17:16:44, 276.837, 270.593; 2018-03-01 17:17:28, 431.68, 371.14; 2018-03-01 17:17:34, 851.622, 355.915
        )

    [1] => Array
        (
            [id] => 180
            [isActive] => 1
            [personId] => 02659982ae8286409cc5bb283089871b62f2bafbbad517941d64e77ecf2b62b1_0
            [gender] => m
            [age] => 69
            [linedata_0] => 2018-03-01 13:20:05, 155.599, 293.841; 2018-03-01 13:20:48, 495.468, 249.582; 2018-03-01 13:21:28, 258.791, 260.748; 2018-03-01 13:23:20, 859.061, 352.237; 2018-03-01 13:23:32, 56.1404, 269.858
        )
)


Here below is the PHP part that manipulates the array to build the expected final result, splitting out the timestamp and coordinates for each user:



$final_result = [];

foreach ($query_result as $row) {
    $line_datas = explode(";", $row["linedata_0"]);
    $linedata = [];
    $final = [];
    $d = [];

    for ($s = 0; $s < count($line_datas); $s++) {
        $line_data = explode(",", $line_datas[$s]);
        $d["timestamp"] = utf8_encode($line_data[0]);
        $d["x"] = utf8_encode($line_data[1]);
        $d["y"] = utf8_encode($line_data[2]);

        array_push($linedata, $d);
    }

    $final["id"] = $row["id"];
    $final["isActive"] = $row["isActive"];
    $final["personId"] = utf8_encode($row["personId"]);
    $final["name"] = NULL;
    $final["gender"] = utf8_encode($row["gender"]);
    $final["age"] = utf8_encode($row["age"]);
    $final["linedata"] = $linedata;

    array_push($final_result, $final);
}

return $final_result;


As far as I can see, there is no infinite loop or bad practice that would justify a memory issue. The only real problem could be the size of the array that needs to be manipulated.



Any suggestions?

























    The first optimization step I would do is to use the $result array returned from the database and not build another monster array. Also, if any work needs to be done on the data, let the database do it and get the final result data ready to be used, to reduce the work inside the PHP layer.

    – Accountant م
    Nov 22 '18 at 20:44













  • @Accountantم the main problem is that I already made a monster aggregation in SQL to get that particular result. I don't think I can get the final expected result directly with MySQL. That's why I thought about doing some parts of the transformation within PHP.

    – UgoL
    Nov 22 '18 at 20:48













  • Try and set a limit on the DB query to only fetch a few rows and see if the result is the same. If it is, then you must have some infinite loop somewhere in the code you haven't posted.

    – Magnus Eriksson
    Nov 22 '18 at 20:54













  • @MagnusEriksson already checked. With the LIMIT it works properly.

    – UgoL
    Nov 22 '18 at 20:55











  • I can imagine that Postman crashes if you're trying to return that much data in one go. Use some pagination instead of returning it all at once.

    – Magnus Eriksson
    Nov 22 '18 at 20:58
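To illustrate the pagination suggestion above, here is a minimal, hedged sketch (the function and parameter names are the editor's, not from the thread): the client requests one page at a time, and the query fetches only that slice instead of all 22k rows.

```php
<?php
// Editor's sketch of the pagination idea; pageOffset() and $pageSize are
// hypothetical names, not from the original post.
function pageOffset(int $page, int $pageSize = 500): int {
    // Page numbers start at 1; page 1 maps to offset 0.
    return (max(1, $page) - 1) * $pageSize;
}

$page   = (int)($_GET['page'] ?? 1);
$offset = pageOffset($page);
// The query then fetches only one slice, e.g.:
//   SELECT * FROM my_table LIMIT 500 OFFSET :offset
echo "fetching 500 rows starting at offset $offset\n";
```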


















php arrays for-loop foreach memory-size






edited Nov 22 '18 at 20:42 by JazZ

asked Nov 22 '18 at 20:36 by UgoL








3 Answers
































This answer is an example of how to implement a buffer (a limited array in memory) in your code; when it is filled, flush its contents to disk, and at the end you will find a huge array on disk in JSON format. I used this approach in a situation similar to yours and got great results regarding memory usage, but as I told you in the comments, you need to rethink why you need that HUGE array in the first place, and if there is a way to avoid it, go with it.



Using this function will save you the memory used by your $final_result array and replace it with a $final_result string buffer, whose use of memory we control. However, your $query_result array will still take the memory it needs.



Note that you need to alter the function as needed, because I used your variables, which are undefined in my code.



/**
 * Process the HUGE array and save it to disk in JSON format [element,element]
 *
 * @param string $fileName absolute path of the file you want to save the processed array in
 * @return int processed elements count
 */
function buildMyArrayInFile($fileName)
{
    $flushCheckPoint = 100; // set the buffer size as needed, depending on the PHP memory limit and the average element size
    $processedElements = 0;
    $final_result = "[";

    file_put_contents($fileName, ""); // prepare the file and erase anything in it

    foreach ($query_result as $row) {
        $line_datas = explode(";", $row["linedata_0"]);
        $linedata = [];
        $final = [];
        $d = [];

        for ($s = 0; $s < count($line_datas); $s++) {
            $line_data = explode(",", $line_datas[$s]);
            $d["timestamp"] = utf8_encode($line_data[0]);
            $d["x"] = utf8_encode($line_data[1]);
            $d["y"] = utf8_encode($line_data[2]);

            array_push($linedata, $d);
        }

        $final["id"] = $row["id"];
        $final["isActive"] = $row["isActive"];
        $final["personId"] = utf8_encode($row["personId"]);
        $final["name"] = NULL;
        $final["gender"] = utf8_encode($row["gender"]);
        $final["age"] = utf8_encode($row["age"]);
        $final["linedata"] = $linedata;

        $final_result .= json_encode($final) . ",";
        $processedElements++;
        if ($processedElements % $flushCheckPoint === 0) {
            // the buffer has reached the limit, flush it to disk
            file_put_contents($fileName, $final_result, FILE_APPEND);
            $final_result = "";
        }
    }

    $final_result = rtrim($final_result, ","); // trim the last comma
    $final_result .= "]";
    // flush the remaining data in $final_result
    file_put_contents($fileName, $final_result, FILE_APPEND);

    return $processedElements;
}


This is another, simpler version of the function, for testing:



// test
var_dump(buildMyArrayInFile2("/home/myuser/myArray.json"));
// outputs int(7)

function buildMyArrayInFile2($fileName)
{
    $flushCheckPoint = 2; // set the buffer size as needed
    $processedElements = 0;
    $final_result = "[";

    file_put_contents($fileName, ""); // prepare the file and erase anything in it

    $test_result = [1, 2, 3, 4, "wee", "hello\nworld", 5];
    foreach ($test_result as $row) {
        $final_result .= json_encode($row) . ",";
        $processedElements++;
        if ($processedElements % $flushCheckPoint === 0) {
            // the buffer has reached the limit, flush it to disk
            file_put_contents($fileName, $final_result, FILE_APPEND);
            $final_result = "";
        }
    }

    $final_result = rtrim($final_result, ","); // trim the last comma
    $final_result .= "]";
    // flush the remaining data in $final_result
    file_put_contents($fileName, $final_result, FILE_APPEND);

    return $processedElements;
}































  • Thank you. Give me some time to look at it.

    – UgoL
    Nov 22 '18 at 23:29











  • you are welcome, take the time you need

    – Accountant م
    Nov 22 '18 at 23:31











    Finally I have tested all the hypotheses, including the yield solution. But this one was the only solution that worked in my case.

    – UgoL
    Nov 28 '18 at 15:50

































You are collecting a large amount of data into the array, and only then returning it.



If you instead collect a single $final item and yield it inside the foreach loop, rather than pushing it into an ever-growing variable, you will still be able to foreach over the function call.



Here is a simplistic example, where $i stands in as a sample return value instead of your $final array of collected data.



<?php
function count_one_to_three() {
    for ($i = 1; $i <= 3; $i++) {
        // Note that $i is preserved between yields.
        yield $i;
    }
}

$generator = count_one_to_three();
foreach ($generator as $value) { // you can also foreach (count_one_to_three() as $value)
    echo "$value\n";
}


Information on 'yield' in PHP
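For concreteness, here is a hedged sketch of how the question's loop could become a generator that streams one encoded row at a time. The function name and the sample row are the editor's; the field names are assumed from the question, and the utf8_encode() calls are omitted for brevity.

```php
<?php
// Hypothetical adaptation of the question's loop: yield one transformed
// row at a time instead of accumulating 22k entries in $final_result.
function transformRows(array $query_result): Generator {
    foreach ($query_result as $row) {
        $linedata = [];
        foreach (explode(";", $row["linedata_0"]) as $segment) {
            [$timestamp, $x, $y] = array_map('trim', explode(",", $segment));
            $linedata[] = ["timestamp" => $timestamp, "x" => $x, "y" => $y];
        }
        // Hand one row to the caller; nothing accumulates inside the function.
        yield [
            "id"       => $row["id"],
            "isActive" => $row["isActive"],
            "personId" => $row["personId"],
            "name"     => NULL,
            "gender"   => $row["gender"],
            "age"      => $row["age"],
            "linedata" => $linedata,
        ];
    }
}

// Streaming usage: encode and emit each row separately, so the full result
// never has to exist as one PHP array or one giant JSON string in memory.
$sample = [[
    "id" => 14, "isActive" => 1, "personId" => "abc_0",
    "gender" => "m", "age" => 69,
    "linedata_0" => "2018-03-01 17:15:18, 155.59, 294.076; 2018-03-01 17:16:04, 502.968, 249.947",
]];
foreach (transformRows($sample) as $final) {
    echo json_encode($final), "\n";
}
```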
































  • Thank you for the suggestion. Honestly, I had never heard about yield. I'm not sure how I could correctly implement it in my current code. Is this a better procedure than the batches approach, in theory?

    – UgoL
    Nov 22 '18 at 22:19











  • It's pretty much as simple as replacing the array_push($final_result, $final); with yield $final;, and not needing to return anything. Instead, it returns one item to the caller per loop, and then continues from where it left off. The link has more details. It's worth reading up on - php.net/manual/en/language.generators.syntax.php

    – Alister Bulman
    Nov 22 '18 at 22:37











  • I have read it. But honestly I didn't get the point about how it could be useful for my case... can you show me an example according to my current code? Maybe explain what should be the advantage?

    – UgoL
    Nov 23 '18 at 0:30

































It's bad practice to work with big data sets like this.



Imagine this: you have a variable $a which contains 22k arrays, and you start forming a second variable $b which will contain 22k arrays too.



So at the end of your script you will have 2 variables with 22k arrays each.



To escape these problems you should get your data in batches, for example 500 rows per loop.





function findRows($offset = 0, &$final_result = [])
{
    $sql = 'SELECT * FROM my_table LIMIT ' . $offset . ', 500';
    // your code to run the query and fill $query_result

    if ($query_result) {
        $offset = $offset + 500;

        foreach ($query_result as $row) {
            // your other transformation code
            array_push($final_result, $final);
        }
        findRows($offset, $final_result);
    }

    return $final_result;
}

return findRows();































  • Can you show me an example of getting data by batches based on my current code?

    – UgoL
    Nov 22 '18 at 20:58











  • Check the code)

    – user10685556
    Nov 22 '18 at 21:46











  • I have applied your solution with the offset of 0,500. But the memory size error still occurs.

    – UgoL
    Nov 22 '18 at 23:10













  • And also the $final_result is replaced every time. So with your code we'll get null as final result. So an empty array.

    – UgoL
    Nov 23 '18 at 0:33













  • Okay, try this, because I didn't check the code before. This works well.

    – user10685556
    Nov 23 '18 at 7:21












3 Answers
3






active

oldest

votes








3 Answers
3






active

oldest

votes









active

oldest

votes






active

oldest

votes









1














This answer is an example of how to implement a buffer(a limited array in memory) in your code and when it is filled, flush it's contents to disk, at the end you will find a huge array on disk in JSON format. I used this way in a situation similar to yours and got great result regarding "memory usage", but as I told you in comments you need to rethink why you need that HUGE array in the first place, and if there is a way to avoid it, go with it.



using this function will save you the memory used by your $final_result array and replace it with $final_result string buffer but we are controlling it's use of memory. However your $query_result array will still taking the memory it needs.



Note that you need to alter the function as you need because I used your variables which are undefined in my code.



/**
* proccess the HUGE array and save it to disk in json format [element,element]
*
* @param string $fileName absulote file name path you want to save the proccessed array in
* @return int processed elements count
*/
function buildMyArrayInFile($fileName)
{
$flushCheckPoint = 100;// set the buffer size as needed, depending on the size of PHP allowed memory and average element size
$processedElements = 0;
$final_result = "[";

file_put_contents($fileName, "");//prepare the file and erase anything in it

foreach($query_result as $row)
{
$line_datas =explode(";",$row["linedata_0"]);
$linedata = ;
$final = ;
$d = ;

for($s =0; $s < count($line_datas); $s++){
$line_data = explode(",",$line_datas[$s]);
$d["timestamp"] = utf8_encode($line_data[0]);
$d["x"]= utf8_encode($line_data[1]);
$d["y"] = utf8_encode($line_data[2]);

array_push($linedata,$d);
}

$final["id"]= $row["id"];
$final["isActive"]= $row["isActive"];
$final["personId"]= utf8_encode($row["personId"]);
$final["name"] = NULL;
$final["gender"] = utf8_encode($row["gender"]);
$final["age"] = utf8_encode($row["age"]);
$final["linedata"]=$linedata;


$final_result .= json_encode($final) . ",";
$processedElements ++;
if($processedElements % $flushCheckPoint === 0){
//the array has reached the limit, flush the array to disk
file_put_contents($fileName, $final_result, FILE_APPEND);
$final_result = "";
}

}

$final_result = rtrim($final_result, ",");//trim the last comma
$final_result .= "]";
//flush the remaning data in $final_result
file_put_contents($fileName, $final_result, FILE_APPEND);

return $processedElements;

}


this is another simple version of the function for testing



// test
var_dump(buildMyArrayInFile2("/home/myuser/myArray.json"));
// outputs int(7)



function buildMyArrayInFile2($fileName)
{
$flushCheckPoint = 2;// set the buffer size as needed, depending on the size of PHP allowed memory and average element size
$processedElements = 0;
$final_result = "[";

file_put_contents($fileName, "");//prepare the file and erase anything in it

$test_result = [1,2,3,4,"wee","hellonworld",5];
foreach($test_result as $row)
{
$final_result .= json_encode($row) . ",";
$processedElements ++;
if($processedElements % $flushCheckPoint === 0){
//the array has reached the limit, flush the array to disk
file_put_contents($fileName, $final_result, FILE_APPEND);
$final_result = "";
}

}

$final_result = rtrim($final_result, ",");//trim the last comma
$final_result .= "]";
//flush the remaning data in $final_result
file_put_contents($fileName, $final_result, FILE_APPEND);

return $processedElements;
}





share|improve this answer


























  • Thank you. Give me some time to look at it.

    – UgoL
    Nov 22 '18 at 23:29











  • you are welcome, take the time you need

    – Accountant م
    Nov 22 '18 at 23:31






  • 1





    Finally I have tested all the hypothesis, including the yield solution. But this one was the only solution that worked in my case.

    – UgoL
    Nov 28 '18 at 15:50
















1














This answer is an example of how to implement a buffer(a limited array in memory) in your code and when it is filled, flush it's contents to disk, at the end you will find a huge array on disk in JSON format. I used this way in a situation similar to yours and got great result regarding "memory usage", but as I told you in comments you need to rethink why you need that HUGE array in the first place, and if there is a way to avoid it, go with it.



using this function will save you the memory used by your $final_result array and replace it with $final_result string buffer but we are controlling it's use of memory. However your $query_result array will still taking the memory it needs.



Note that you need to alter the function as you need because I used your variables which are undefined in my code.



/**
 * Process the HUGE array and save it to disk in JSON format [element,element]
 *
 * @param string $fileName absolute path of the file to save the processed array in
 * @return int processed elements count
 */
function buildMyArrayInFile($fileName)
{
    $flushCheckPoint = 100; // buffer size: tune it to PHP's allowed memory and the average element size
    $processedElements = 0;
    $final_result = "[";

    file_put_contents($fileName, ""); // prepare the file and erase anything in it

    foreach ($query_result as $row) {
        $line_datas = explode(";", $row["linedata_0"]);
        $linedata = [];
        $final = [];
        $d = [];

        for ($s = 0; $s < count($line_datas); $s++) {
            $line_data = explode(",", $line_datas[$s]);
            $d["timestamp"] = utf8_encode($line_data[0]);
            $d["x"] = utf8_encode($line_data[1]);
            $d["y"] = utf8_encode($line_data[2]);

            array_push($linedata, $d);
        }

        $final["id"] = $row["id"];
        $final["isActive"] = $row["isActive"];
        $final["personId"] = utf8_encode($row["personId"]);
        $final["name"] = NULL;
        $final["gender"] = utf8_encode($row["gender"]);
        $final["age"] = utf8_encode($row["age"]);
        $final["linedata"] = $linedata;

        // add a comma only before elements after the first, so the file
        // never ends with a dangling ",]" when a flush lands on the last element
        $final_result .= ($processedElements > 0 ? "," : "") . json_encode($final);
        $processedElements++;
        if ($processedElements % $flushCheckPoint === 0) {
            // the buffer has reached the limit, flush it to disk
            file_put_contents($fileName, $final_result, FILE_APPEND);
            $final_result = "";
        }
    }

    $final_result .= "]";
    // flush the remaining data in $final_result
    file_put_contents($fileName, $final_result, FILE_APPEND);

    return $processedElements;
}


Here is a simpler version of the function, for testing:



// test
var_dump(buildMyArrayInFile2("/home/myuser/myArray.json"));
// outputs int(7)

function buildMyArrayInFile2($fileName)
{
    $flushCheckPoint = 2; // buffer size: tune it to PHP's allowed memory and the average element size
    $processedElements = 0;
    $final_result = "[";

    file_put_contents($fileName, ""); // prepare the file and erase anything in it

    $test_result = [1, 2, 3, 4, "wee", "hello\nworld", 5];
    foreach ($test_result as $row) {
        $final_result .= ($processedElements > 0 ? "," : "") . json_encode($row);
        $processedElements++;
        if ($processedElements % $flushCheckPoint === 0) {
            // the buffer has reached the limit, flush it to disk
            file_put_contents($fileName, $final_result, FILE_APPEND);
            $final_result = "";
        }
    }

    $final_result .= "]";
    // flush the remaining data in $final_result
    file_put_contents($fileName, $final_result, FILE_APPEND);

    return $processedElements;
}





  • Thank you. Give me some time to look at it. – UgoL Nov 22 '18 at 23:29

  • you are welcome, take the time you need – Accountant م Nov 22 '18 at 23:31

  • Finally I have tested all the hypothesis, including the yield solution. But this one was the only solution that worked in my case. – UgoL Nov 28 '18 at 15:50














edited Nov 22 '18 at 23:29 · answered Nov 22 '18 at 23:18 by Accountant م

You are collecting a large amount of data into the array, and only then returning it.

If, instead, you collect a single $final item and yield it inside the foreach loop, rather than pushing it into an ever-growing variable, you can still foreach over the function call.

Here is a simplistic example, where $i stands in as a sample yielded value instead of your $final array of collected data.



<?php
function count_one_to_three() {
    for ($i = 1; $i <= 3; $i++) {
        // Note that $i is preserved between yields.
        yield $i;
    }
}

$generator = count_one_to_three();
foreach ($generator as $value) { // you can also foreach (count_one_to_three() as $value)
    echo "$value\n";
}


Information on 'yield' in PHP
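Applied to the loop from the question, the same idea might look like the sketch below. $query_result and the column names come from the question; the sample row is made up for illustration, and only the fields needed to show the point are kept.

```php
<?php
// Sketch: yield one finished $final element per row instead of
// accumulating all of them in a $final_result array.
function processRows(array $query_result)
{
    foreach ($query_result as $row) {
        $linedata = [];
        foreach (explode(";", $row["linedata_0"]) as $point) {
            [$timestamp, $x, $y] = array_map('trim', explode(",", $point));
            $linedata[] = ["timestamp" => $timestamp, "x" => $x, "y" => $y];
        }
        // only one element is alive in memory at a time
        yield [
            "id"       => $row["id"],
            "isActive" => $row["isActive"],
            "personId" => $row["personId"],
            "linedata" => $linedata,
        ];
    }
}

// Made-up sample row in the question's shape:
$query_result = [
    [
        "id" => 14,
        "isActive" => 1,
        "personId" => "0023fff1_0",
        "linedata_0" => "2018-03-01 17:15:18, 155.59, 294.076; 2018-03-01 17:16:04, 502.968, 249.947",
    ],
];

// The caller can still foreach over it, e.g. emitting JSON row by row:
foreach (processRows($query_result) as $final) {
    echo json_encode($final), "\n";
}
```

The advantage is that memory usage stays proportional to one row, not to the whole result set, as long as the caller consumes each yielded element instead of collecting them all.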






  • Thank you for the suggestion. Honestly I never heard about yield. I'm not sure how I could correctly implement it in my current code. Is this a better procedure than the batches approach, in theory? – UgoL Nov 22 '18 at 22:19

  • It's pretty much as simple as replacing the array_push($final_result, $final); with yield $final;, and not needing to return anything. Instead, it returns one item to the caller per loop, and then continues from where it left off. The link has more details. It's worth reading up on - php.net/manual/en/language.generators.syntax.php – Alister Bulman Nov 22 '18 at 22:37

  • I have read it. But honestly I didn't get the point about how it could be useful for my case... can you show me an example according to my current code? Maybe explain what the advantage should be? – UgoL Nov 23 '18 at 0:30
















edited Nov 22 '18 at 22:39 · answered Nov 22 '18 at 22:05 by Alister Bulman

It's bad practice to work with big data sets like this all at once.

Imagine this: you have a variable $a which contains 22k arrays, and you start building a second variable $b which will contain 22k arrays too.

So at the end of your script you will have two variables, each holding 22k arrays.

To avoid these problems you should fetch your data in batches, for example 500 rows per loop.





    function findRows($offset = 0, &$final_result = [])
    {
        $sql = 'SELECT * FROM my_table LIMIT ' . $offset . ', 500';
        // your code to run the query and fill $query_result

        if ($query_result) {
            $offset = $offset + 500;

            foreach ($query_result as $row) {
                // your other processing code
                array_push($final_result, $final);
            }
            findRows($offset, $final_result);
        }

        return $final_result;
    }

    return findRows();
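A small aside (my own sketch, not part of the answer): the recursive version nests one call per batch, and a plain loop expresses the same idea without recursion. The $fetchBatch callable below is a hypothetical stand-in for running the "SELECT ... LIMIT $offset, $size" query.

```php
<?php
// Iterative variant of the batching idea. $fetchBatch is a hypothetical
// callable standing in for "run SELECT ... LIMIT $offset, $size".
function findRowsLoop(callable $fetchBatch, int $size = 500): array
{
    $final_result = [];
    $offset = 0;
    while (true) {
        $batch = $fetchBatch($offset, $size);
        if (!$batch) {              // empty batch means no more rows
            break;
        }
        foreach ($batch as $row) {
            $final_result[] = $row; // your per-row processing goes here
        }
        $offset += $size;
    }
    return $final_result;
}

// Usage with a fake data source of 7 rows, fetched 3 at a time:
$all = range(1, 7);
$rows = findRowsLoop(function ($offset, $size) use ($all) {
    return array_slice($all, $offset, $size);
}, 3);
// $rows now holds all 7 values, in order
```

Note that, like the original answer, this still accumulates the whole result in memory; batching mainly bounds how much the query layer holds at any one time.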





  • Can you show me an example of getting data by batches based on my current code? – UgoL Nov 22 '18 at 20:58

  • Check the code) – user10685556 Nov 22 '18 at 21:46

  • I have applied your solution with the offset of 0,500. But the memory size error still occurs. – UgoL Nov 22 '18 at 23:10

  • And also the $final_result is replaced every time. So with your code we'll get null as the final result. So an empty array. – UgoL Nov 23 '18 at 0:33

  • Okay, try this. Because I didn't check the code before. This works well – user10685556 Nov 23 '18 at 7:21





edited Nov 23 '18 at 7:19 · answered Nov 22 '18 at 20:56 by user10685556
