
[Python basics] Python coroutines

2022-02-02 07:40:00 Python Yixi

Conceptually, we all know about multiprocessing and multithreading; coroutines, by contrast, achieve concurrency within a single thread. Syntactically, a coroutine looks like a generator: both are functions containing the yield keyword. The difference is that in a coroutine, yield usually appears on the right side of an expression: ``datum = yield``. This instantly unsettles beginners: wasn't yield supposed to simply pause execution and produce a value? How can it sit on the right side of an assignment?

From generator to coroutine
Let's look at what may be the simplest example of a coroutine:

>>> def simple_coroutine():
...     print("-> coroutine started")
...     x = yield
...     print("-> coroutine received:", x)
>>> my_coro = simple_coroutine()
>>> my_coro
<generator object simple_coroutine at 0x0000019A681F27B0>
>>> next(my_coro)
-> coroutine started
>>> my_coro.send(42)
-> coroutine received: 42
Traceback (most recent call last):
  File "<input>", line 1, in <module>
StopIteration

The reason yield can appear on the right side is that a coroutine can receive a value pushed in by the caller via ``.send()``.

yield can also have an expression of its own on its right side. See the following example:

def simple_coro2(a):
    b = yield a
    c = yield a + b

my_coro2 = simple_coro2(14)

The execution proceeds as follows:

Call next(my_coro2): executes yield a, producing 14.
Call my_coro2.send(28): assigns 28 to b, then executes yield a + b, producing 42.
Call my_coro2.send(99): assigns 99 to c; the coroutine then ends.

From this we can conclude that, for the line b = yield a, the code on the right of = executes before the assignment takes place.
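The three-step run described above can be reproduced verbatim; a minimal, runnable sketch:

```python
def simple_coro2(a):
    b = yield a        # yields a, then pauses; the next sent value becomes b
    c = yield a + b    # yields a + b, then pauses; the next sent value becomes c

my_coro2 = simple_coro2(14)
print(next(my_coro2))      # 14  (runs up to the first yield)
print(my_coro2.send(28))   # 42  (b = 28, runs to the second yield)
try:
    my_coro2.send(99)      # c = 99, then the function body ends
except StopIteration:
    print("coroutine ended")
```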

In the first example, you had to call ``next(my_coro)`` to start the generator and leave it paused at the yield statement before you could send data. This is because a coroutine can be in one of four states:

'GEN_CREATED' waiting to start execution
'GEN_RUNNING' currently being executed by the interpreter
'GEN_SUSPENDED' paused at a yield expression
'GEN_CLOSED' execution has finished

Data can be sent only in the GEN_SUSPENDED state. The preparatory step that gets the coroutine there is called priming: you can prime by calling ``next(my_coro)`` or by calling ``my_coro.send(None)``; the effect is the same.
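The four states (and the effect of priming) can be observed directly with `inspect.getgeneratorstate`:

```python
from inspect import getgeneratorstate

def simple_coroutine():
    x = yield
    print("-> coroutine received:", x)

my_coro = simple_coroutine()
print(getgeneratorstate(my_coro))  # GEN_CREATED
next(my_coro)                      # prime: advance to the yield
print(getgeneratorstate(my_coro))  # GEN_SUSPENDED -- now .send() is legal
try:
    my_coro.send(42)               # body runs to the end
except StopIteration:
    pass
print(getgeneratorstate(my_coro))  # GEN_CLOSED
```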

Priming
A coroutine must be primed before use: before calling send, first call next to put the coroutine in the GEN_SUSPENDED state. This step is easy to forget. To avoid forgetting it, you can define a priming decorator, for example:

from functools import wraps

def coroutine(func):
    """Decorator: primes `func` by advancing to the first yield."""
    @wraps(func)
    def primer(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # prime the coroutine
        return gen
    return primer
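Here is such a priming decorator in action (repeated in full so the sketch is self-contained): the decorated coroutine accepts `.send()` immediately, with no explicit `next()` call.

```python
from functools import wraps

def coroutine(func):
    """Prime `func` by advancing it to the first yield."""
    @wraps(func)
    def primer(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # the priming call that is so easy to forget
        return gen
    return primer

@coroutine
def echo():
    while True:
        received = yield
        print("got:", received)

e = echo()
e.send("hello")  # works right away: prints "got: hello"
```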

But Python actually offers a more elegant way, called yield from, which primes the coroutine automatically.

Note that a custom priming decorator is not compatible with yield from, because yield from primes the subgenerator itself.

yield from
yield from is roughly equivalent to the await keyword in other languages. Its role: when a generator gen uses yield from subgen(), subgen gains control and passes the values it yields directly to gen's caller; that is, the caller can drive subgen directly. Meanwhile, gen blocks, waiting for subgen to finish.

yield from can be used to simplify a for loop that does nothing but yield:

for c in "AB":
    yield c

This loop can be replaced with a single line:

yield from "AB"
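Wrapped in generator functions, the two forms produce exactly the same items:

```python
def gen_for_loop():
    for c in "AB":
        yield c

def gen_yield_from():
    yield from "AB"

print(list(gen_for_loop()))    # ['A', 'B']
print(list(gen_yield_from()))  # ['A', 'B']
```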

The first thing the expression yield from x does with x is call iter(x) to obtain an iterator from it, so x can be any iterable.

But yield from does more than that. Its more important role is to open a bidirectional channel, as shown in the figure below:

[figure: the bidirectional channel opened by yield from, connecting caller, delegating generator, and subgenerator]

This diagram packs in a lot of information and is hard to digest at first.

The first things to understand are three concepts: the caller, the delegating generator, and the subgenerator.

The caller
Simply put, the main function — the familiar program entry point.

# the client code, a.k.a. the caller
def main(data):                          # <8>
    results = {}
    for key, values in data.items():
        group = grouper(results, key)    # <9>
        next(group)                      # <10>
        for value in values:
            group.send(value)            # <11>
        group.send(None)  # important!   # <12>
    # print(results)  # uncomment to debug
    report(results)

Delegating generator
The function that contains the yield from statement — that is, the coroutine itself.

# the delegating generator
def grouper(results, key):                    # <5>
    while True:                               # <6>
        results[key] = yield from averager()  # <7>

Subgenerator
The generator that follows yield from on the right of the statement.

# the subgenerator
def averager():                    # <1>
    total = 0.0
    count = 0
    average = None
    while True:
        term = yield               # <2>
        if term is None:           # <3>
            break
        total += term
        count += 1
        average = total/count
    return Result(count, average)  # <4>

The code is much easier to digest than the terminology suggests.

Then there are five lines of interaction: send, yield, throw, StopIteration, and close.

While the coroutine is paused at the yield from expression, the main function can send data through it to the subgenerator on the right of yield from.
The subgenerator passes the values it yields back up through the yield from expression to the main function.
The main function calls group.send(None), passing in a None value to terminate the subgenerator's while loop. Control then returns to the delegating generator so it can proceed; otherwise it would remain paused at the yield from statement indefinitely.
When the subgenerator returns, the interpreter raises StopIteration with the return value attached to the exception object; the delegating generator then resumes.
After the main function finishes, it can call the close() method to terminate the coroutine.
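The three roles and their interactions can be assembled into one runnable program. A self-contained sketch based on the snippets above (Result is a namedtuple, and report is replaced by a plain return so the sketch has no external dependencies):

```python
from collections import namedtuple

Result = namedtuple('Result', 'count average')

def averager():                      # subgenerator
    total = 0.0
    count = 0
    average = None
    while True:
        term = yield
        if term is None:
            break
        total += term
        count += 1
        average = total / count
    return Result(count, average)    # becomes the value of the yield from expression

def grouper(results, key):           # delegating generator
    while True:
        results[key] = yield from averager()

def main(data):                      # the caller
    results = {}
    for key, values in data.items():
        group = grouper(results, key)
        next(group)                  # prime the delegating generator
        for value in values:
            group.send(value)        # forwarded straight through to averager
        group.send(None)             # terminate the current averager
    return results

data = {'girls;kg': [40.9, 38.5, 44.3], 'boys;m': [1.38, 1.5]}
print(main(data))
```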

The overall flow should now be clear. I won't dig further into the technical details here; if time allows, I'll explore them in a future Python internals series.

In Python 3.4, yield from was often combined with the standard library's ``@asyncio.coroutine`` decorator (deprecated since Python 3.8 in favor of native async def coroutines).

Using a coroutine as an accumulator
This is a common use of coroutines. The code is as follows:

def averager():
    total = 0.0
    count = 0
    average = None
    while True:  # <1>
        term = yield average  # <2>
        total += term
        count += 1
        average = total/count
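Driving the accumulator: prime it once, then each `.send()` returns the running average so far.

```python
def averager():
    total = 0.0
    count = 0
    average = None
    while True:
        term = yield average
        total += term
        count += 1
        average = total/count

avg = averager()
next(avg)            # prime; pauses at yield with the initial average of None
print(avg.send(10))  # 10.0
print(avg.send(30))  # 20.0
print(avg.send(5))   # 15.0
```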

Implementing concurrency with coroutines
The example here is a bit more complex.

The core code snippet is :

def taxi_process(ident, trips, start_time=0):  # <1>
    """Yield to simulator issuing event at each state change"""
    time = yield Event(start_time, ident, 'leave garage')  # <2>
    for i in range(trips):  # <3>
        time = yield Event(time, ident, 'pick up passenger')  # <4>
        time = yield Event(time, ident, 'drop off passenger')  # <5>

    yield Event(time, ident, 'going home')  # <6>
    # end of taxi process # <7>
def main(end_time=DEFAULT_END_TIME, num_taxis=DEFAULT_NUMBER_OF_TAXIS,
         seed=None):
    """Initialize random generator, build procs and run simulation"""
    if seed is not None:
        random.seed(seed)  # get reproducible results

    taxis = {i: taxi_process(i, (i+1)*2, i*DEPARTURE_INTERVAL)
             for i in range(num_taxis)}
    sim = Simulator(taxis)

This example shows how to handle events in a main loop and how to drive coroutines by sending data. This is the basic idea underlying the asyncio package: implementing concurrency with coroutines instead of threads and callbacks.
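To make the driving loop concrete, here is a minimal, self-contained sketch of the idea: events go into a priority queue ordered by time, and the loop repeatedly pops the earliest event and resumes the taxi that produced it. Event, the run function, and the fixed trip duration are simplified stand-ins for the full example's Simulator class and random durations.

```python
import queue
from collections import namedtuple

Event = namedtuple('Event', 'time proc action')

def taxi_process(ident, trips, start_time=0):
    """Yield to the simulator, issuing an event at each state change."""
    time = yield Event(start_time, ident, 'leave garage')
    for i in range(trips):
        time = yield Event(time, ident, 'pick up passenger')
        time = yield Event(time, ident, 'drop off passenger')
    yield Event(time, ident, 'going home')

def run(procs, end_time):
    """Prime each taxi, then pump events in time order until done."""
    events = queue.PriorityQueue()
    for _, proc in sorted(procs.items()):
        events.put(next(proc))            # prime: queue each taxi's first event
    while not events.empty():
        event = events.get()              # earliest event first
        if event.time > end_time:
            break
        print(event)
        proc = procs[event.proc]
        next_time = event.time + 1        # fixed duration instead of random
        try:
            events.put(proc.send(next_time))  # resume the taxi with the new time
        except StopIteration:
            del procs[event.proc]         # this taxi is done

taxis = {i: taxi_process(i, trips=2, start_time=i * 5) for i in range(2)}
run(taxis, end_time=100)
```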

Copyright notice
Author: Python Yixi. Please include a link to the original when reprinting, thank you.
