Airflow DAGs in Kubernetes ask for Google Cloud Platform OAuth when accessing a PostgreSQL database
I have set up Airflow in Kubernetes. This Airflow instance uses a PostgreSQL database running in other pods in the same cluster.

When I try to run some DAGs that connect to that same PostgreSQL instance through a configured connection, the connection to the PostgreSQL pod asks for OAuth authorization:



 Reading local file: /root/airflow/logs/anu_funnel_analysis_v2/task_4_get_bigquery_pandas/2018-06-29T02:30:00+00:00/1.log
[2018-11-23 15:56:22,796] {models.py:1335} INFO - Dependencies all met for <TaskInstance: anu_funnel_analysis_v2.task_4_get_bigquery_pandas 2018-06-29T02:30:00+00:00 [queued]>
[2018-11-23 15:56:22,820] {models.py:1335} INFO - Dependencies all met for <TaskInstance: anu_funnel_analysis_v2.task_4_get_bigquery_pandas 2018-06-29T02:30:00+00:00 [queued]>
[2018-11-23 15:56:22,821] {models.py:1547} INFO -
--------------------------------------------------------------------------------
Starting attempt 1 of 6
--------------------------------------------------------------------------------

[2018-11-23 15:56:22,838] {models.py:1569} INFO - Executing <Task(PythonOperator): task_4_get_bigquery_pandas> on 2018-06-29T02:30:00+00:00
[2018-11-23 15:56:22,839] {base_task_runner.py:124} INFO - Running: ['bash', '-c', u'airflow run anu_funnel_analysis_v2 task_4_get_bigquery_pandas 2018-06-29T02:30:00+00:00 --job_id 82 --raw -sd DAGS_FOLDER/bigquery_for_funnel.py --cfg_path /tmp/tmpJ1Pe0O']
[2018-11-23 15:56:23,656] {base_task_runner.py:107} INFO - Job 82: Subtask task_4_get_bigquery_pandas [2018-11-23 15:56:23,655] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-23 15:56:23,817] {base_task_runner.py:107} INFO - Job 82: Subtask task_4_get_bigquery_pandas [2018-11-23 15:56:23,816] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-23 15:56:23,952] {base_task_runner.py:107} INFO - Job 82: Subtask task_4_get_bigquery_pandas [2018-11-23 15:56:23,950] {models.py:258} INFO - Filling up the DagBag from /root/airflow/dags/bigquery_for_funnel.py
[2018-11-23 15:56:24,737] {base_task_runner.py:107} INFO - Job 82: Subtask task_4_get_bigquery_pandas /usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/bigquery_operator.py:148: DeprecationWarning: Deprecated parameter `bql` used in Task id: task_2_bq_read_new_table_funnel. Use `sql` parameter instead to pass the sql to be executed. `bql` parameter is deprecated and will be removed in a future version of Airflow.
[2018-11-23 15:56:24,738] {base_task_runner.py:107} INFO - Job 82: Subtask task_4_get_bigquery_pandas category=DeprecationWarning)
[2018-11-23 15:56:24,778] {base_task_runner.py:107} INFO - Job 82: Subtask task_4_get_bigquery_pandas [2018-11-23 15:56:24,777] {cli.py:492} INFO - Running <TaskInstance: anu_funnel_analysis_v2.task_4_get_bigquery_pandas 2018-06-29T02:30:00+00:00 [running]> on host airflow-5d87576f5b-v5qmg
[2018-11-23 15:56:24,837] {logging_mixin.py:95} INFO - [2018-11-23 15:56:24,837] {base_hook.py:83} INFO - Using connection to: anudata-postgresql-service

[2018-11-23 15:56:25,335] {logging_mixin.py:95} INFO - Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=495642085510-k0tmvj2m941jhre2nbqka17vqpjfddtd.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awgxxxxxxxxxxxxxxxxxxx


Why does this happen? In the same pod, why does one connection ask for OAuth and the other does not?
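For context, the OAuth prompt in the log appears to come from the Google client libraries used by the BigQuery task, not from PostgreSQL itself: when no service-account key is configured, they fall back to the interactive installed-app flow (the accounts.google.com/o/oauth2/auth URL above). A minimal sketch of the two kinds of Airflow connection URIs involved follows; the PostgreSQL service name is taken from the log, but the credentials, database name, and key-file path are assumptions for illustration only:

```python
from urllib.parse import urlparse, parse_qs, quote

# PostgreSQL connection: plain username/password auth, no OAuth involved.
# Credentials and database name here are hypothetical.
pg_uri = "postgresql://airflow:secret@anudata-postgresql-service:5432/airflow"

# Google Cloud Platform connection: pointing it at a service-account key
# file (via the Airflow 1.x extra field) avoids the interactive OAuth flow.
key_path = "/var/secrets/google/key.json"  # hypothetical mount path
gcp_uri = (
    "google-cloud-platform://?extra__google_cloud_platform__key_path="
    + quote(key_path, safe="")
)

# Inspect both URIs to confirm they parse as expected.
print(urlparse(pg_uri).hostname)
print(parse_qs(urlparse(gcp_uri).query)["extra__google_cloud_platform__key_path"][0])
```

These URIs can be handed to Airflow through `AIRFLOW_CONN_<CONN_ID>` environment variables on the pod, so the BigQuery task authenticates with the key file instead of prompting for OAuth.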
  • Could you provide more information about the configuration of the connection between Airflow and PostgreSQL? Also, isn't it your logs that ask you to authorize the application for further use via the link https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=495642085510-k0tmvj2m941jhre2nbqka17vqpjfddtd.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awgxxxxxxxxxxxxxxxxxxx?

    – Artem Golenyaev
    Nov 26 '18 at 14:38
kubernetes google-cloud-platform airflow
edited Nov 23 '18 at 16:38
asked Nov 23 '18 at 16:23 by Rio Harapan Pangihutan