Merged
4 changes: 2 additions & 2 deletions README.md
@@ -21,8 +21,8 @@ Available addons
----------------
addon | version | maintainers | summary
--- | --- | --- | ---
[queue_job](queue_job/) | 19.0.1.1.0 | <a href='https://github.com/guewen'><img src='https://github.com/guewen.png' width='32' height='32' style='border-radius:50%;' alt='guewen'/></a> <a href='https://github.com/sbidoul'><img src='https://github.com/sbidoul.png' width='32' height='32' style='border-radius:50%;' alt='sbidoul'/></a> | Job Queue
[test_queue_job](test_queue_job/) | 19.0.1.0.1 | <a href='https://github.com/sbidoul'><img src='https://github.com/sbidoul.png' width='32' height='32' style='border-radius:50%;' alt='sbidoul'/></a> | Queue Job Tests
[queue_job](queue_job/) | 19.0.2.0.0 | <a href='https://github.com/guewen'><img src='https://github.com/guewen.png' width='32' height='32' style='border-radius:50%;' alt='guewen'/></a> <a href='https://github.com/sbidoul'><img src='https://github.com/sbidoul.png' width='32' height='32' style='border-radius:50%;' alt='sbidoul'/></a> | Job Queue
[test_queue_job](test_queue_job/) | 19.0.2.0.0 | <a href='https://github.com/sbidoul'><img src='https://github.com/sbidoul.png' width='32' height='32' style='border-radius:50%;' alt='sbidoul'/></a> | Queue Job Tests


Unported addons
49 changes: 44 additions & 5 deletions queue_job/README.rst
@@ -11,7 +11,7 @@ Job Queue
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! source digest: sha256:8f055109b96365bbd4bbcdd3273a3d2be459c003b4d49604bac2e2988bcf5c49
!! source digest: sha256:9837fe197bd7c0731992dd5b828a33c6b5c98a20220d8983df29f094e17c342f
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

.. |badge1| image:: https://img.shields.io/badge/maturity-Mature-brightgreen.png
@@ -85,6 +85,36 @@ Features:
.. contents::
:local:

Use Cases / Context
===================

Odoo processes tasks synchronously: when you import a list of products,
for instance, every line is handled within one big task. "Queue job"
gives you the ability to split a big task into many smaller ones.

Imagine you have to change data on thousands of orders. Doing it in one
step puts a heavy load on the server and may hurt the performance of
Odoo. With queue_job you can divide the work into jobs and run thousands
of them, one job per order. Another benefit is that a failing line does
not block the processing of the others, since the jobs are independent.
You can also schedule the jobs and set a number of retries.
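The splitting pattern described above can be sketched in plain Python. This is a toy model for illustration only, not queue_job code; with queue_job itself you would call ``record.with_delay()`` on each order instead of looping inline:

```python
# Toy illustration: split one big task into independent "jobs" so that one
# failure neither blocks nor aborts the others, and each job can be
# retried on its own. NOT the queue_job implementation.

def process_order(order_id):
    # Pretend order 3 always fails.
    if order_id == 3:
        raise ValueError(f"order {order_id} failed")
    return f"order {order_id} done"

def run_jobs(order_ids, max_retries=2):
    results, failed = {}, []
    for oid in order_ids:  # one independent "job" per order
        for attempt in range(1 + max_retries):
            try:
                results[oid] = process_order(oid)
                break
            except ValueError:
                if attempt == max_retries:
                    failed.append(oid)  # recorded; other jobs keep running
    return results, failed

results, failed = run_jobs([1, 2, 3, 4])
print(failed)  # order 3 fails for good, orders 1, 2 and 4 still succeed
```

With queue_job the loop body becomes roughly ``order.with_delay()._process()``, and the retry counting and failure bookkeeping are handled by the job runner instead of application code.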

Here are some community usage examples:

- Mass sending invoices:
`account_invoice_mass_sending <https://github.com/OCA/account-invoicing/tree/17.0/account_invoice_mass_sending>`__
- Import data in the background:
`base_import_async <https://github.com/OCA/queue/tree/17.0/base_import_async>`__
- Export data in the background:
`base_export_async <https://github.com/OCA/queue/tree/17.0/base_export_async>`__
- Generate contract invoices with jobs:
`contract_queue_job <https://github.com/OCA/contract/tree/17.0/contract_queue_job>`__
- Generate partner invoices with jobs:
  `partner_invoicing_mode <https://github.com/OCA/account-invoicing/tree/17.0/partner_invoicing_mode>`__
- Process the Sales Automatic Workflow actions with jobs:
`sale_automatic_workflow_job <https://github.com/OCA/sale-workflow/tree/17.0/sale_automatic_workflow_job>`__

Installation
============

@@ -99,10 +129,14 @@ Configuration

- ``ODOO_QUEUE_JOB_CHANNELS=root:4`` or any other channels
configuration. The default is ``root:1``
- if ``xmlrpc_port`` is not set: ``ODOO_QUEUE_JOB_PORT=8069``

- Start Odoo with ``--load=web,queue_job`` and ``--workers`` greater
than 1. [1]_
- ``ODOO_QUEUE_JOB_PORT=8069``, default ``--http-port``
- ``ODOO_QUEUE_JOB_SCHEME=https``, default ``http``
- ``ODOO_QUEUE_JOB_HOST=load-balancer``, default
``--http-interface`` or ``localhost`` if unset
- ``ODOO_QUEUE_JOB_HTTP_AUTH_USER=jobrunner``, default empty
- ``ODOO_QUEUE_JOB_HTTP_AUTH_PASSWORD=s3cr3t``, default empty
- Start Odoo with ``--load=web,queue_job`` and ``--workers`` greater
than 1. [1]_
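Combining the environment variables above, a startup could look like the following. Host name, credentials, database and worker count are placeholders; the variable names are the ones documented above:

```shell
# Hypothetical deployment sketch for the job runner configuration.
export ODOO_QUEUE_JOB_CHANNELS=root:4
export ODOO_QUEUE_JOB_SCHEME=https
export ODOO_QUEUE_JOB_HOST=load-balancer
export ODOO_QUEUE_JOB_PORT=443
export ODOO_QUEUE_JOB_HTTP_AUTH_USER=jobrunner
export ODOO_QUEUE_JOB_HTTP_AUTH_PASSWORD=s3cr3t

# queue_job must be server-wide loaded, and multiple workers are required.
odoo --load=web,queue_job --workers=4 -d mydb
```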

- Using the Odoo configuration file:

@@ -116,6 +150,11 @@
(...)
[queue_job]
channels = root:2
scheme = https
host = load-balancer
port = 443
http_auth_user = jobrunner
http_auth_password = s3cr3t

- Confirm the runner is starting correctly by checking the odoo log
file:
2 changes: 1 addition & 1 deletion queue_job/__manifest__.py
@@ -2,7 +2,7 @@

{
"name": "Job Queue",
"version": "19.0.1.1.0",
"version": "19.0.2.0.0",
"author": "Camptocamp,ACSONE SA/NV,Odoo Community Association (OCA)",
"website": "https://github.com/OCA/queue",
"license": "LGPL-3",
38 changes: 28 additions & 10 deletions queue_job/controllers/main.py
@@ -13,8 +13,8 @@
from werkzeug.exceptions import BadRequest, Forbidden

from odoo import SUPERUSER_ID, api, http
from odoo.modules.registry import Registry
from odoo.service.model import PG_CONCURRENCY_ERRORS_TO_RETRY
from odoo.tools import config

from ..delay import chain, group
from ..exception import FailedJobError, RetryableJobError
@@ -38,8 +38,10 @@ def _prevent_commit(cr):
def forbidden_commit(*args, **kwargs):
raise RuntimeError(
"Commit is forbidden in queue jobs. "
"If the current job is a cron running as queue job, "
"modify it to run as a normal cron."
'You may want to enable the "Allow Commit" option on the Job '
"Function. Alternatively, if the current job is a cron running as "
"queue job, you can modify it to run as a normal cron. More details on: "
"https://github.com/OCA/queue/wiki/Upgrade-warning:-commits-inside-jobs"
)

original_commit = cr.commit
@@ -103,11 +105,16 @@ def _try_perform_job(cls, env, job):
job.set_done()
job.store()
env.flush_all()
env.cr.commit()
if not config["test_enable"]:
env.cr.commit()
_logger.debug("%s done", job)

@classmethod
def _enqueue_dependent_jobs(cls, env, job):
if not job.should_check_dependents():
return

_logger.debug("%s enqueue depends started", job)
tries = 0
while True:
try:
Expand Down Expand Up @@ -136,13 +143,13 @@ def _enqueue_dependent_jobs(cls, env, job):
time.sleep(wait_time)
else:
break
_logger.debug("%s enqueue depends done", job)

@classmethod
def _runjob(cls, env: api.Environment, job: Job) -> None:
def retry_postpone(job, message, seconds=None):
job.env.clear()
with Registry(job.env.cr.dbname).cursor() as new_cr:
job.env = api.Environment(new_cr, SUPERUSER_ID, {})
with job.in_temporary_env():
job.postpone(result=message, seconds=seconds)
job.set_pending(reset_retry=False)
job.store()
@@ -167,24 +174,22 @@ def retry_postpone(job, message, seconds=None):
# traceback in the logs we should have the traceback when all
# retries are exhausted
env.cr.rollback()
return

except (FailedJobError, Exception) as orig_exception:
buff = StringIO()
traceback.print_exc(file=buff)
traceback_txt = buff.getvalue()
_logger.error(traceback_txt)
job.env.clear()
with Registry(job.env.cr.dbname).cursor() as new_cr:
job.env = job.env(cr=new_cr)
with job.in_temporary_env():
vals = cls._get_failure_values(job, traceback_txt, orig_exception)
job.set_failed(**vals)
job.store()
buff.close()
raise

_logger.debug("%s enqueue depends started", job)
cls._enqueue_dependent_jobs(env, job)
_logger.debug("%s enqueue depends done", job)

@classmethod
def _get_failure_values(cls, job, traceback_txt, orig_exception):
@@ -229,6 +234,7 @@ def create_test_job(
failure_rate=0,
job_duration=0,
commit_within_job=False,
failure_retry_seconds=0,
):
if not http.request.env.user.has_group("base.group_erp_manager"):
raise Forbidden(http.request.env._("Access Denied"))
@@ -266,6 +272,12 @@ def create_test_job(
except ValueError:
max_retries = None

if failure_retry_seconds is not None:
try:
failure_retry_seconds = int(failure_retry_seconds)
except ValueError:
failure_retry_seconds = 0

if size == 1:
return self._create_single_test_job(
priority=priority,
@@ -275,6 +287,7 @@
failure_rate=failure_rate,
job_duration=job_duration,
commit_within_job=commit_within_job,
failure_retry_seconds=failure_retry_seconds,
)

if size > 1:
@@ -287,6 +300,7 @@ def create_test_job(
failure_rate=failure_rate,
job_duration=job_duration,
commit_within_job=commit_within_job,
failure_retry_seconds=failure_retry_seconds,
)
return ""

@@ -300,6 +314,7 @@ def _create_single_test_job(
failure_rate=0,
job_duration=0,
commit_within_job=False,
failure_retry_seconds=0,
):
delayed = (
http.request.env["queue.job"]
@@ -313,6 +328,7 @@
failure_rate=failure_rate,
job_duration=job_duration,
commit_within_job=commit_within_job,
failure_retry_seconds=failure_retry_seconds,
)
)
return f"job uuid: {delayed.db_record().uuid}"
@@ -329,6 +345,7 @@ def _create_graph_test_jobs(
failure_rate=0,
job_duration=0,
commit_within_job=False,
failure_retry_seconds=0,
):
model = http.request.env["queue.job"]
current_count = 0
@@ -355,6 +372,7 @@
failure_rate=failure_rate,
job_duration=job_duration,
commit_within_job=commit_within_job,
failure_retry_seconds=failure_retry_seconds,
)
)
