Hello, I'm trying to use the `spawn` start method so that this works correctly with Django database connections, as described in this post:
One possibility is to use the multiprocessing `spawn` child process creation method, which will not copy Django's DB connection details to the child processes. The child processes need to bootstrap from scratch, but are free to create/close their own Django DB connections.
In the calling code:

```python
import multiprocessing

from myworker import work_one_item  # <-- your worker method

...

# Uses connection A
list_of_items = django_db_call_one()

# 'spawn' starts new Python processes
with multiprocessing.get_context('spawn').Pool() as pool:
    # work_one_item will create its own DB connection
    parallel_results = pool.map(work_one_item, list_of_items)

# Continues to use connection A
another_db_call(parallel_results)
```
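For the child processes to bootstrap from scratch, the worker module itself has to configure Django before touching the ORM, because `spawn` starts a fresh interpreter that re-imports the module. A minimal sketch of what `myworker.py` could look like; `myproject.settings`, `myapp`, and `Item` are hypothetical names, not from the original post:

```python
# myworker.py -- a sketch only; settings module and model are assumptions.
import os

import django

# A spawned child is a fresh interpreter, so Django must be configured
# here, at import time, before the ORM is used.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

from django.db import connection
from myapp.models import Item


def work_one_item(item_id):
    # The first query lazily opens a DB connection owned by this process.
    value = Item.objects.get(pk=item_id).pk
    # Close the per-process connection so it is not left dangling.
    connection.close()
    return value
```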
but I get this error:

```
  File "/usr/local/lib/python3.6/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'wrap_work_function_for_pipe.<locals>.closure'
```
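The traceback names `wrap_work_function_for_pipe.<locals>.closure`: under `spawn`, the pool pickles the work function by its qualified name, and a function defined locally inside another function cannot be pickled. A standalone sketch (all names hypothetical) showing the difference:

```python
import multiprocessing


def make_worker(factor):
    # A closure defined inside a function: spawn must pickle a reference
    # to it, and pickle cannot serialize local objects, so passing it to
    # pool.map raises the "Can't pickle local object" AttributeError.
    def closure(x):
        return x * factor
    return closure


def top_level_worker(x):
    # Module-level functions pickle by qualified name, so they work
    # with the spawn start method.
    return x * 2


if __name__ == "__main__":
    ctx = multiprocessing.get_context("spawn")
    with ctx.Pool() as pool:
        print(pool.map(top_level_worker, [1, 2, 3]))  # works: [2, 4, 6]
        # pool.map(make_worker(2), [1, 2, 3])  # fails: local object
```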
I tried setting `CONTEXT = multiprocessing.get_context("spawn")` in core.py.