How to run a long-running Python script in the background from a Flask REST API? [duplicate]
This question already has an answer here:
Reporting yielded results of long-running Celery task (6 answers)
I have a Flask REST API set up correctly.
When I hit this API, I want it to start a Python script/function and immediately return a run ID; the user can then hit another endpoint with that run ID to get information about the run.
How can I return immediately while the other Python script/function keeps running in a separate thread or process?
Here is my REST API:
from flask import Flask, request

app = Flask(__name__)

@app.route('/start/', methods=['POST'])
def start_run():
    run_id = "SOME RANDOM NUMBER"
    # I want to start the long-running Python script here
    return run_id

@app.route('/get_report', methods=['GET'])
def get_report():
    run_id = request.args.get("run_id")
    return some_method(run_id)  # some_method looks up the report for this run id
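For reference, the intended client-side flow would look roughly like this (a sketch, assuming the app runs locally on port 5000):

import requests

resp = requests.post('http://localhost:5000/start/')
run_id = resp.text  # the run id returned by start_run()

report = requests.get('http://localhost:5000/get_report',
                      params={'run_id': run_id})
print(report.text)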
python flask subprocess python-multithreading
edited Nov 26 '18 at 6:20 by Raja Simon
asked Nov 26 '18 at 6:16 by Ishan Bhatt
marked as duplicate by davidism Nov 26 '18 at 14:49
This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.
1 Answer
Running a long-running background task is awkward in a single-process WSGI application. The best approach is to hand the work off to a background task queue or scheduler such as Celery or APScheduler; this answer uses the Celery approach.
There is a good tutorial on running Celery background tasks with Flask that you can adapt to your project.
Take a look at the example below.

@celery.task()
def add_together(a, b):
    return a + b

result = add_together.delay(23, 42)  # queue the job; returns an AsyncResult
print(result.id)                     # the Celery task id for this job

By default Celery tracks task state, and you can use that as well.

task = add_together.AsyncResult(task_id)
print(task.state)
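Putting this together with the question's routes, a minimal sketch might look like the following. The Redis broker/backend URL and the long_job task are assumptions for illustration, not part of the original code:

from celery import Celery
from flask import Flask, jsonify, request

app = Flask(__name__)
# Assumed broker/backend: a local Redis instance; adjust to your setup.
celery = Celery(app.name,
                broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/0')

@celery.task()
def long_job(payload):
    # placeholder for the real long-running work
    return {'done': True, 'input': payload}

@app.route('/start/', methods=['POST'])
def start_run():
    result = long_job.delay(request.get_json(silent=True))
    # the Celery task id doubles as the run id
    return jsonify({'run_id': result.id})

@app.route('/get_report', methods=['GET'])
def get_report():
    run_id = request.args.get('run_id')
    task = long_job.AsyncResult(run_id)
    return jsonify({'state': task.state,
                    'result': task.result if task.successful() else None})

Note that a separate Celery worker process (e.g. celery -A your_module.celery worker) has to be running alongside the Flask app for the queued task to actually execute.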
answered Nov 26 '18 at 6:40 by Raja Simon