PHP - for loop repeating itself / going out of sequence
I'm very new to PHP, making errors and learning as I go. Please be gentle! :)
I want to access some data from Blizzard.com's API. For this particular data set, it's not a single block of JSON; rather, each object has its own URL. I estimate there are approximately 150,000 objects, but I don't know the start or end of the number range, so I'm having to assume it starts at 1 and work past the highest number I know (269065).
To get the data, I need to access each object's data via a JSON file, which I read, get the contents of, and drop into a text file (this could be written as an insert into a SQL DB instead, and I'm able to do that if the text file is the issue). But to be honest, I would love to get to the bottom of why this is happening as much as anything!
I wasn't going to try to run ~250,000 iterations in a for loop straight away, so I thought I'd try something I considered small: 2000.
The for loop starts with $a at 1, uses $a as part of the URL, loads and decodes the JSON, and checks whether the first field (id) in the object is set. If it is, it writes a few fields to data.txt; if it isn't, it just writes $a to data.txt (so I know it's a null, for other purposes not outlined here).
Simple! Or so I thought. After approximately 183 iterations, the data written to the text file goes awry, as seen in the quote below. It goes out of sequence, starts at 1 again, then jumps back to 184, ad nauseam. The loop then seems to be locked in some kind of infinite loop, outputting in a random order until I close the page 10-20 minutes later.
I have obviously made a big mistake! But I have no idea what I have done wrong to cause this. During my attempts I have rewritten the code with new variable names, so that a fresh test does not conflict with code that could still be running in memory.
I've tried resetting variables to blank at the end of the loop in case something was being reused and causing a problem.
If anyone could point out any errors in my code, or suggest something for me to look into to handle bigger loops, that would be brilliant. I am assuming my issue may be a timeout or memory problem, but I don't know where to start and was hoping I'd find some suggestions here.
If it's relevant, I am using 000webhostapp.com as my hosting provider for now, until I get some paid hosting.
1 ... 182 183 1 184 2 3 185 4 186 5 187 6 188 7 189 190 8 191
for ($a = 1; $a <= 2000; $a++) {
    $json = "https://eu.api.battle.net/wow/recipe/".$a."?locale=en_GB&<MYPRIVATEAPIKEY>";
    $contents = file_get_contents($json);
    $data = json_decode($contents, true);
    if (isset($data['id'])) {
        $file = fopen("data.txt", "a");
        fwrite($file, $data['id'].",'".$data['name']."'\n");
        fclose($file);
    } else {
        $file = fopen("data.txt", "a");
        fwrite($file, $a."\n");
        fclose($file);
    }
}
The content of the file I'm trying to access is:
{"id":33994,"name":"Precise Strikes","profession":"Enchanting","icon":"spell_holy_greaterheal"}
I scrapped the original plan and wrote this instead. Thank you again to everyone who took the time out of their day to help and offer suggestions!
$b = $mysqli->query("SELECT id FROM `static_recipes` ORDER BY id DESC LIMIT 1;")->fetch_object()->id;
if (empty($b)) $b = 1;
$count = $b + 101;
$write = [];
for ($a = $b + 1; $a < $count; $a++) {
    $json = "https://eu.api.battle.net/wow/recipe/".$a."?locale=en_GB&apikey=";
    $contents = @file_get_contents($json);
    $data = json_decode($contents, true);
    if (isset($data['id'])) {
        $write[] = "(".$data['id'].",'".addslashes($data['name'])."','".addslashes($data['profession'])."','".addslashes($data['icon'])."')";
    } else {
        $write[] = "(".$a.",'a','a','a')";
    }
}
$SQL = 'INSERT INTO `static_recipes` (id, name, profession, icon) VALUES '.implode(',', $write);
$mysqli->query($SQL);
$mysqli->close();
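As an aside, addslashes() is not a reliable escape for SQL. A sketch of the same insert using a mysqli prepared statement instead, which removes the need to escape each field by hand; it assumes the question's $mysqli connection, and the $rows array of [id, name, profession, icon] tuples is a hypothetical stand-in for the values collected in the loop:
$stmt = $mysqli->prepare(
    "INSERT INTO `static_recipes` (id, name, profession, icon) VALUES (?, ?, ?, ?)"
);
// bind_param() binds by reference, so the variables can be bound once
// and refilled on every pass through the loop.
$stmt->bind_param("isss", $id, $name, $profession, $icon);
foreach ($rows as [$id, $name, $profession, $icon]) {   // $rows is hypothetical
    $stmt->execute();   // reads the bound variables at execute time
}
$stmt->close();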
php json for-loop fwrite
We can't help much without seeing the content of your data. I would suggest that opening and closing this file 2000 times is a waste of resources, though...
– miken32
Nov 15 '18 at 18:22
Open the file once before you start the loop and only close after the loop is finished. There is no reason to open and close it each time through the loop. You're going to be appending to it regardless.
– Dave
Nov 15 '18 at 18:25
Are you running this from the CLI, i.e. a single instance? If not (e.g. triggered via an HTTP server), you could have two or more instances running in parallel.
– YvesLeBorg
Nov 15 '18 at 18:44
Thanks for the suggestions, I'll try it now.
– kosta12
Nov 16 '18 at 8:36
@YvesLeBorg yeah, I am running it as a single instance to start with, until I'm sure that I have the most efficient / low-footprint code before I go nuts with it. I will try your suggestion to run in parallel.
– kosta12
Nov 16 '18 at 8:44
3 Answers
$write = [];
for ($a = 1; $a <= 2000; $a++) {
    $json = "https://eu.api.battle.net/wow/".$a."?locale=en_GB&<MYPRIVATEAPIKEY>";
    $contents = file_get_contents($json);
    $data = json_decode($contents, true);
    if (isset($data['id'])) {
        $write[] = $data['id'].",'".$data['name']."'\n";
    } else {
        $write[] = $a."\n";
    }
}
$file = fopen("data.txt", "a");
fwrite($file, implode('', $write));
fclose($file);
Also, why do you think no IDs are duplicated across the several "https://eu.api.battle.net/wow/[N]" URLs' data?
Also, if you are eventually going to run ~250,000 requests, think about curl_multi_init(): http://php.net/manual/en/function.curl-multi-init.php
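A minimal curl_multi sketch for fetching a small batch in parallel; it assumes the same recipe URL pattern and key placeholder as the question, and the batch size of 10 is illustrative:
$mh = curl_multi_init();
$handles = [];
for ($a = 1; $a <= 10; $a++) {
    $ch = curl_init("https://eu.api.battle.net/wow/recipe/$a?locale=en_GB&<MYPRIVATEAPIKEY>");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
    curl_multi_add_handle($mh, $ch);
    $handles[$a] = $ch;
}
// Drive all transfers until none are still active.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait for socket activity rather than busy-looping
    }
} while ($active && $status === CURLM_OK);
foreach ($handles as $a => $ch) {
    $data = json_decode(curl_multi_getcontent($ch), true);
    echo isset($data['id']) ? "$data[id],'$data[name]'\n" : "$a\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);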
Thank you for the new code suggestion, Николай Лубышев. The ID has a one-to-one relationship within the game of World of Warcraft; each ID represents an object in game. I will also look at and start learning cURL, thank you!
– kosta12
Nov 16 '18 at 8:40
This worked, putting the data in an array and inserting that has overcome the problem of data going out of sequence and repeating. Thank you! Also thank you to all that helped & had similar suggestions.
– kosta12
Nov 16 '18 at 9:01
Well! It didn't work after all. I used the above code with a range of $a = 2000; $a <= 3000; it got to 2998, 2999, 3000, then went back to 2000 again, looped up to 3000, and restarted at 2000, for 1 1/2 hours! I'm wondering if this is a memory/stack problem, an unsigned data type problem, or maybe an HTTP header issue. I am going to completely change my direction on this: write it directly to the database in chunks of 20 rows, re-reading the latest ID in the DB and setting that to $a, with a cron job running every minute to blast through this.
– kosta12
Nov 16 '18 at 14:23
I can't help without source code. Check "re-reading the latest ID in the DB".
– Николай Лубышев
Nov 16 '18 at 15:25
Hi Николай Лубышев, the source code & JSON contents are in my original post. I just did not include my MySQLi login details and <?php ?> tags. The source code is very small because it was doing one simple job :)) The cron job is working now, thank you!
– kosta12
Nov 17 '18 at 16:50
I can't really see anything obviously wrong with your code; I can't run it, though, as I don't have the JSON.
It could be possible that there is some kind of race condition since you're opening and closing the same file hundreds of times very quickly.
File operations might seem atomic, but that's not necessarily so - here's an interesting SO thread:
Does PHP wait for filesystem operations (like file_put_contents) to complete before moving on?
Like some others suggested - maybe just open the file before you enter the loop, then close the file when the loop breaks.
I'd try it first and see if it helps.
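If parallel instances really are a possibility (the same script triggered twice over HTTP, say), an exclusive lock makes each append atomic; a one-line sketch, where $record is a hypothetical stand-in for whatever line the loop built:
// FILE_APPEND adds to the end of the file; LOCK_EX takes an exclusive
// lock for the duration of the write, serialising concurrent appenders.
file_put_contents("data.txt", $record . "\n", FILE_APPEND | LOCK_EX);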
I'll try the suggestions, thank you for the article as well.
– kosta12
Nov 16 '18 at 8:35
There's nothing in your original code that would cause that sort of behaviour. PHP will not arbitrarily change the value of a variable. You are opening this file in append mode; are you certain that you're not looking at old data? Maybe output some debug messages as you process the data. It's likely you'd run up against some rate limiting on the API server, so putting a pause in there somewhere may improve reliability.
The only substantive change I'd suggest to your code is opening the file once and closing it when you're done.
$file = fopen("data_1_2000.txt", "w");
for ($a = 1; $a <= 2000; $a++)
$json = "https://eu.api.battle.net/wow/recipe/$a?locale=en_GB&<MYPRIVATEAPIKEY>";
$contents = file_get_contents($json);
$data = json_decode($contents, true);
if (!empty($data['id']))
$data["name"] = str_replace("'", "\'", $data["name"]);
$record = "$data[id],'$data[name]'";
else
$record = $a;
fwrite($file, "$recordn");
sleep(1);
echo "$a "; if ($a % 50 === 0) echo "n";
fclose($file);
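On the rate-limiting point: after a file_get_contents() call over the http(s) stream wrapper, PHP populates the $http_response_header array automatically, so the status line can be inspected on failure. A short sketch; the 429 status and the back-off delay are assumptions, not anything the API is documented here to return:
$contents = @file_get_contents($json);
if ($contents === false && isset($http_response_header[0])) {
    // e.g. "HTTP/1.1 429 Too Many Requests" when throttled
    error_log("request $a: " . $http_response_header[0]);
    sleep(5); // back off before retrying; the delay value is an assumption
}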
Yeah, I agree. I've pegged it down to me not handling the HTTP header responses. I took the file_get_contents out and ran the code with big loops without an issue. The cron job / small loop seems to be working most of the time. I can handle repeats in the SQL DB a lot easier than in a text file. I just thought it would have been easier to have made the CSV; also it let me try something new. Turns out it was a bit of a PITA! Thanks for your help!
– kosta12
Nov 17 '18 at 16:45