How to UPDATE many rows (1,500,000) fast
I have a table with 1.5 million rows and 47k values to update.
I've tried two ways of doing it and both are pretty slow.
The first is 47k separate statements of the form:
UPDATE $table SET name='$name' WHERE id='$id'
The second builds one big CASE expression:
$prefix = "UPDATE $table SET name = (CASE ";
$mid = "";
foreach ($updates as $id => $name) {   // $updates: the 47k id => name pairs
    $mid .= "WHEN id = '$id' THEN '$name' ";
}
$suffix = "ELSE name END);";           // ELSE keeps the other ~1.45M rows unchanged
$query = $prefix . $mid . $suffix;
Is there a way of doing this faster? Maybe with LOAD DATA INFILE? I can't figure out the UPDATE syntax with that one.
mysql sql-update load-data-infile
asked Nov 23 '18 at 21:43 by Andy Core
1 Answer
I had to import large files on a daily basis, and tried all sorts of things.
In the end I got the best performance with a specific combination of:
- First copy the CSV to the database server and load it from the local disk there, instead of loading it from your client machine.
- Make sure you have a table structure that matches the file exactly. I used a temporary table for the import, and then ran separate queries on that to get the data into the final table (see the full sketch at the end of this answer).
- No foreign key or unique index checks on the tmp table.
- That alone speeds things up a lot. If you need to squeeze out more performance, you can increase the log buffer size (a sketch of these session settings follows just below).
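A minimal sketch of those session settings, assuming InnoDB and a throwaway staging table whose data you trust; the variable names are standard MySQL, but tune the size to your server:
SET unique_checks = 0;        -- skip unique index checks while loading the staging table
SET foreign_key_checks = 0;   -- skip foreign key checks for this session
-- innodb_log_buffer_size is a global setting; on older MySQL versions it is not
-- dynamic and has to be raised in my.cnf followed by a restart:
-- SET GLOBAL innodb_log_buffer_size = 64 * 1024 * 1024;
-- ... run the LOAD DATA INFILE here ...
SET unique_checks = 1;
SET foreign_key_checks = 1;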
And obviously:
- Make sure that you don't import anything you don't need. Be critical about which fields and which rows you include.
- If a text column only has a few distinct values, store a numeric code instead.
- Do you really need 8 decimals in your floats?
- Are you repeatedly importing the same data, when you could import only the changes?
- Make sure that you don't trigger unnecessary type conversions during import. Prepare your data so that it matches the target table's column types as closely as possible.
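Applied to the question's 47k name updates, the whole flow could look roughly like this. The table and file names (items, /tmp/names.csv) are made up for the example; adjust the column types, path, and field/line terminators to your data, and note that LOAD DATA INFILE needs the FILE privilege and is subject to secure_file_priv.
-- Hypothetical names: items(id, name) is the 1.5M-row target table,
-- /tmp/names.csv holds the 47k "id,name" pairs on the database server.
CREATE TEMPORARY TABLE tmp_names (
    id   INT PRIMARY KEY,     -- index on id keeps the final UPDATE ... JOIN fast
    name VARCHAR(255)
) ENGINE=InnoDB;

LOAD DATA INFILE '/tmp/names.csv'
INTO TABLE tmp_names
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, name);

-- One statement applies all 47k updates via an indexed join.
UPDATE items
JOIN tmp_names ON tmp_names.id = items.id
SET items.name = tmp_names.name;

DROP TEMPORARY TABLE tmp_names;
If you can't copy the file onto the server, LOAD DATA LOCAL INFILE reads it from the client instead; that is usually slower and requires local_infile to be enabled on both the client and the server.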
answered Nov 23 '18 at 22:26 by Wouter van Nifterick, edited Nov 23 '18 at 22:32