Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')


Python version 3.7.2

Celery – Distributed task queues

Celery is a simple, flexible and reliable distributed system that processes large amounts of messages and provides the necessary tools to maintain such a system. It is a task queue that focuses on real-time processing and also supports task scheduling.

Usage scenario: a user sends a request and waits for the response. In some views a time-consuming job runs before the response can be returned, such as sending an email or an SMS verification code, and the long wait makes for a poor user experience. With Celery the situation is different. Solution: hand the time-consuming job to Celery to execute asynchronously and return the response right away.

  • Celery official website
  • Celery documentation (Chinese)

Celery terminology (a sketch tying these terms together follows this list):

  • Task: a Python function to be executed.
  • Queue: tasks waiting to be executed are added to a queue.
  • Worker: a separate process that executes tasks from the queue.
  • Broker: responsible for scheduling and transporting the task messages; Redis needs to be deployed in advance.
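
A minimal sketch tying the terms together (hypothetical module name and broker database, not the article's actual setup):

from celery import Celery

# Broker: Redis transports and schedules the task messages
app = Celery('demo', broker='redis://127.0.0.1:6379/0')

# Task: a plain Python function registered with the app
@app.task
def add(x, y):
    return x + y

# Queue: add.delay(2, 3) pushes a message onto the queue;
# a Worker process, started separately, picks it up and executes it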

Install the packages:

pip3 install celery
pip3 install django-celery
pip3 install "celery[librabbitmq,redis,auth,msgpack]"

Celery version 4.3.0

Example

1) Create the view sayhello in assetinfo/views.py.

import time
from django.http import HttpResponse
...

def sayhello(request):
    print('hello ... ')
    time.sleep(2)
    print('world ... ')
    return HttpResponse("hello world")

2) Configure it in assetinfo/urls.py.

from django.urls import path
from . import views

urlpatterns = [
    # ex: /assetinfo/sayhello
    path('sayhello', views.sayhello, name='sayhello'),
]

3) Start the server and enter the following URL in the browser:

http://127.0.0.1:8000/assetinfo/sayhello

4) The request takes about 2 seconds to complete because the view sleeps for 2 seconds.
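
To see the delay for yourself, a quick timing check works (a sketch using only the standard library; it assumes the dev server from step 3 is still running):

import time
import urllib.request

start = time.time()
urllib.request.urlopen('http://127.0.0.1:8000/assetinfo/sayhello').read()
# roughly 2 seconds now; near-instant once the work is offloaded to Celery below
print('elapsed: %.2fs' % (time.time() - start))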

5) Register djcelery in project/settings.py.

INSTALLED_APPS = (
  ...
  'djcelery',
)
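
For reference, django-celery's usual companion configuration looks like the sketch below. Note that djcelery historically targeted Celery 3.x, and this article only uses it for its database tables, configuring the broker directly in tasks.py instead:

import djcelery
djcelery.setup_loader()

# Celery 3.x-style setting read by django-celery
BROKER_URL = 'redis://127.0.0.1:6379/8'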

6) Create a celery_tasks package to hold the tasks.py task script.

7) Create the tasks.py file in the celery_tasks directory.

from celery import Celery
import time

# Create an instance of the Celery class
app = Celery('celery_tasks.tasks', broker='redis://127.0.0.1:6379/8')

@app.task
def async_sayhello():
    print('hello ... ')
    time.sleep(2)
    print('world ... ')

8) Open the assetinfo/views.py file and modify the sayhello view as follows:

from celery_tasks.tasks import async_sayhello

def sayhello(request):
    # print('hello ... ')
    # time.sleep(2)
    # print('world ... ')
    async_sayhello.delay()
    return HttpResponse("hello world")
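delay() is shorthand for apply_async() and returns an AsyncResult immediately, which is why the response no longer waits 2 seconds. A sketch of what the result object offers (retrieving the return value would additionally require a result backend, which this setup leaves disabled):

result = async_sayhello.delay()
print(result.id)          # task id assigned when the message is queued
# result.get(timeout=10)  # would block for the outcome; needs a result backend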

9) Run the migration to generate the data tables celery needs:

python3 manage.py migrate

The migration creates djcelery's tables (celery_taskmeta, djcelery_periodictask, and so on).

10) Start Redis (skip this step if it is already running).

[root@server01 ~]# ps -ef | grep redis
root      3694  3462  0 00:41 pts/4    00:00:00 grep --color=auto redis
root     31054     1  0 Jun27 ?        01:12:52 ./redis-server 127.0.0.1:6379
[root@server01 ~]# redis-cli
127.0.0.1:6379>

11) Start the worker:

celery -A celery_tasks.tasks worker --loglevel=info

12) Visit http://127.0.0.1:8000/assetinfo/sayhello again.

The worker reports the following error:

[2019-08-01 00:16:03,062: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "g:\python3\python371\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "g:\python3\python371\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)

Some searching showed that this is a known issue when newer versions of Celery (4.x) run on Windows 10.
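
Besides the eventlet fix used below, another workaround often reported for Celery 4.x on Windows is to set an environment variable before the worker starts, for example at the top of celery_tasks/tasks.py (an assumption to verify on your own setup):

import os

# Reported to make billiard initialize its worker processes correctly on
# Windows, sidestepping the 'not enough values to unpack' error
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')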

13) Resolve the error

Solution:

Unable to Run Tasks under Windows

This is a problem with running Celery 4.x on Win10: the default prefork pool fails there, so an alternative execution pool is needed.

Start by installing eventlet:

pip3 install eventlet

Then add the pool parameter (-P) when starting the worker:

celery -A <mymodule> worker -l info -P eventlet
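If eventlet is not an option, the gevent pool (-P gevent, after pip3 install gevent) or the single-threaded solo pool (-P solo) are common alternatives on Windows.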

The actual run looks like this:

(venv) F:\pythonProject\django-pratice>celery -A celery_tasks.tasks worker -l info -P eventlet

 -------------- celery@USC2VG2F9NPB650 v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-10-10.0.17763-SP0 2019-08-01 00:22:21
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         celery_tasks.tasks:0x1e4a24170f0
- ** ---------- .> transport:   redis://127.0.0.1:6379/8
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 12 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celery_tasks.tasks.async_sayhello

[2019-08-01 00:22:21,629: INFO/MainProcess] Connected to redis://127.0.0.1:6379/8
[2019-08-01 00:22:21,700: INFO/MainProcess] mingle: searching for neighbors
[2019-08-01 00:22:22,913: INFO/MainProcess] mingle: all alone
[2019-08-01 00:22:23,012: INFO/MainProcess] celery@USC2VG2F9NPB650 ready.
[2019-08-01 00:22:23,045: INFO/MainProcess] Connected to redis://127.0.0.1:6379/8.

Once again, visit http://127.0.0.1:8000/assetinfo/sayhello

The page returns "hello world" immediately, and the worker console prints the task's hello/world output in the background.