Textblob giving memory error while using NaiveBayesAnalyzer on large dataset
I am opening each text file and assigning it a label (pos or neg) for training a NaiveBayes classifier. The data set contains about 12,000 txt files. I am using the TextBlob library for sentiment analysis.



import glob
from textblob.classifiers import NaiveBayesClassifier

train = [('I dont like this movie', 'neg')]
path = 'C://TextDemo//senti//aclImdb//train//neg//*.txt'
for f in glob.glob(path):
    with open(f, "r", encoding="UTF-8") as read_file:
        for line in read_file:
            # this path points at the neg folder, so label these lines 'neg'
            train.append((line.replace("<br />", ""), 'neg'))

cl = NaiveBayesClassifier(train)


  • You are using up too much memory. Either buy more memory (and make sure you use 64-bit Python), or train your classifier in batches.

    – deets
    Nov 25 '18 at 13:22
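A minimal sketch of the batch-training idea suggested in the comment above. Instead of building one 12,000-entry list up front, the files are read in fixed-size batches; `labeled_batches` is a hypothetical helper, and the TextBlob calls in the usage comments are assumptions based on TextBlob's documented `NaiveBayesClassifier.update()` method for incremental training:

```python
import glob

def labeled_batches(pattern, label, batch_size=500):
    """Yield lists of (text, label) pairs, at most batch_size pairs per list."""
    batch = []
    for path in glob.glob(pattern):
        with open(path, "r", encoding="UTF-8") as f:
            batch.append((f.read().replace("<br />", ""), label))
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Hypothetical usage (path as in the question; TextBlob calls are assumptions
# based on its documented update() method):
# from textblob.classifiers import NaiveBayesClassifier
# batches = labeled_batches('C://TextDemo//senti//aclImdb//train//neg//*.txt', 'neg')
# cl = NaiveBayesClassifier(next(batches))  # seed with the first batch
# for batch in batches:
#     cl.update(batch)                      # train incrementally
```

Note that this only lowers the peak memory of building the training list; the classifier itself still accumulates training data internally as you call `update()`, so very large corpora may still exhaust RAM.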

  • Alright! Thank you. I thought I hadn't optimized my code :) I will try this on 16 GB of RAM.

    – user10646468
    Nov 30 '18 at 15:14
python textblob
asked Nov 25 '18 at 12:59
user10646468