
Summary of Python technical points

2022-01-30 05:50:12 An unexpected glow



The with statement

The with statement is designed for working with resources: it guarantees that the necessary cleanup is performed and the resource is released whether or not an exception occurs while the resource is in use. Under the hood, context management is implemented through the __enter__ and __exit__ methods.


file = './a.txt'

with open(file) as f:
    file_data = f.read()
    print(file_data)
    

For more details, see: How the with keyword works in Python: https://juejin.cn/post/6959886107496415246
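
As a sketch of what happens under the hood, any class that implements __enter__ and __exit__ can be used with with. The ManagedFile class below is made up for illustration and simply mirrors open():

class ManagedFile:
    """ A minimal hand-rolled context manager (illustrative only) """

    def __init__(self, path):
        self.path = path
        self.file = None

    def __enter__(self):
        # acquire the resource and return it as the `as` target
        self.file = open(self.path)
        return self.file

    def __exit__(self, exc_type, exc_value, traceback):
        # always runs, even if an exception was raised inside the with block
        if self.file:
            self.file.close()
        return False    # do not suppress exceptions


with ManagedFile('./a.txt') as f:
    print(f.read())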


Closures and decorators

Closures

The concept of a closure: when an inner function references variables from the enclosing (outer) scope, and the outer function typically returns that inner function, the inner function together with the variables it captures is called a closure.


def func(number):

    # Define another function inside func that uses a variable of the
    # outer function; this inner function plus the captured variable
    # are what we call the closure
    def func_in(number_in):
        print("in func_in, number_in is %d" % number_in)
        return number + number_in

    # What is returned here is the closure
    return func_in

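For example, calling the outer function produces a closure that remembers number (the values below are arbitrary):

add_10 = func(10)    # func_in plus the captured variable number=10 form the closure
print(add_10(5))     # prints the message, then 15
print(add_10(20))    # number=10 is still remembered, so this prints 30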

Decorators are built on exactly these closure properties, together with the fact that functions can be passed around as references.

For more details, see: Python closures explained simply: https://juejin.cn/post/6960487978703519775


Decorators

A decorator is a Python construct for extending the behavior of an existing function. Writing @decorator_name above a function passes that function as an argument to the decorator function; the @decorator_name line is Python syntactic sugar. Decorators have many uses, such as adding logging, measuring run time, login authentication, and transactions.

They improve code reuse without touching the internals of the decorated function, which makes them a natural fit for aspect-oriented programming (AOP).

import time


def calc_time(func):
    """ Measure how long a function takes to run """

    def wrapper():
        start_time = time.time()
        func()    # call the wrapped function
        use_time = time.time() - start_time
        print(use_time)

    return wrapper
        

@calc_time 
def demo():
    
    for i in range(100000):
        print(i)
   

What @calc_time does is make the Python interpreter execute demo = calc_time(demo).
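
In other words, the @ line is just shorthand for applying the decorator by hand; the following sketch is equivalent:

def demo():
    for i in range(100000):
        print(i)

demo = calc_time(demo)    # exactly what @calc_time does behind the scenes
demo()                    # runs wrapper(), which times the original demo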


Interview question: write a decorator that takes parameters

A parameterized decorator simply has one more level of nested functions; once you see through that, it is no different from an ordinary decorator.

The general idea: call the outer function with the parameters, and have it return the inner decorator. That is all a parameterized decorator is.

# 1. Define a function that takes the parameters
def router(name):

    # 2. The actual decorator is defined inside the function
    def _router(func):

        def wrapper():
            pass

        return wrapper

    # 3. Return the decorator
    return _router

# 1. Define a function that takes the parameters
def router(path):

    # 2. The actual decorator is defined inside the function
    def _router(func):

        def wrapper():
            print('path:', path)    # use the parameter captured from router
            func()

        return wrapper

    # 3. Return the decorator
    return _router


# Use the router decorator
@router('/index')
def index():
    print('home page')


index()

# The output is as follows
path: /index
home page


The execution flow of the program above is as follows:

  1. First, router('/index') is called, which returns the _router decorator.
  2. Then the Python syntactic sugar @_router is applied, passing the index function to _router(func):
    • index = _router(index)
  3. Finally, index() is called.
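
Put together, the @router('/index') line is shorthand for:

index = router('/index')(index)    # call router('/index') to get _router, then apply it to index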

For more details, see: Python decorators in detail: https://juejin.cn/post/6961360227690086437


Iterators and generators

Iterators

Iteration is a way of accessing the elements of a collection. An iterator is an object that remembers its position in a traversal: it starts at the first element of the collection and moves forward until every element has been visited, and it can only move forward, never back.

An iterator object is obtained through __iter__(), and elements are then fetched one by one through __next__(). __iter__() and __next__() are implementation details; in practice, the built-in iter() function is used to get an iterator from an iterable, and next() is then used to step through it.
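
For example, with a built-in list (the values are arbitrary):

nums = [1, 2, 3]
it = iter(nums)     # get an iterator from the iterable; this calls nums.__iter__()
print(next(it))     # 1  -- calls it.__next__()
print(next(it))     # 2
print(next(it))     # 3
# another next(it) would raise StopIteration, which is how a for loop knows when to stop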


Let's see how to build an iterator for the Fibonacci sequence:

class FibIterator(object):
    """ Fibonacci sequence iterator """
    
    def __init__(self, n):
        """ :param n: int, how many leading Fibonacci numbers to generate """
        self.n = n

        # current counts how many numbers have been produced so far
        self.current = 0

        # num1 holds the previous number; its initial value is the first number in the sequence, 0
        self.num1 = 0

        # num2 holds the next number; its initial value is the second number in the sequence, 1
        self.num2 = 1

    def __next__(self):
        """ Called via next() to produce the next number """
        if self.current < self.n:
            num = self.num1
            self.num1, self.num2 = self.num2, self.num1+self.num2
            self.current += 1
            return num
        else:
            raise StopIteration

    def __iter__(self):
        """ An iterator's __iter__ simply returns itself """
        return self


if __name__ == '__main__':
    fib = FibIterator(10)
    for num in fib:
        print(num, end=" ")

# The output is as follows
0 1 1 2 3 5 8 13 21 34

For more details, see: Python iterators: https://juejin.cn/post/6948437239286202375


Generators

With an iterator, every value we obtain during iteration (via next()) is produced according to some rule. But to implement an iterator, we have to record the state of the current iteration ourselves so that the next value can be derived from it. To record that state (save the context) while still iterating with next(), Python offers a simpler construct: the generator.

Python generators: https://juejin.cn/post/6948437559533895693


How to create a generator

Change the [] of a list comprehension to () and you get a generator expression:

In [21]: L = [i for i in range(10)]

In [22]: G = (i for i in range(10))

In [23]: type(L)
Out[23]: list

In [24]: type(G)
Out[24]: generator
    

Another way is to use the yield keyword.

Using a generator to implement the Fibonacci sequence:

def fib(n):

    cur = 0
    num1 = 0
    num2 = 1

    while cur < n:

        yield num1

        num1, num2 = num2, num1 + num2

        cur += 1
        
        
In [11]: f = fib(5)

In [12]: type(f)
Out[12]: generator

In [13]: next(f)
Out[13]: 0

In [14]: next(f)
Out[14]: 1

In [15]: next(f)
Out[15]: 1

In [16]: next(f)
Out[16]: 2

In [17]: next(f)
Out[17]: 3

In [18]: next(f)
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
<ipython-input-18-aff1dd02a623> in <module>
----> 1 next(f)

StopIteration:

In [19]: for i in fib(10):
    ...:     print(i, end=' ')
    ...:
0 1 1 2 3 5 8 13 21 34
In [20]:
    

With a list comprehension we can create the list directly. But memory is limited, so a list's capacity is necessarily bounded, and building a list with millions of elements not only takes a lot of memory: if we only ever need the first few elements, the space taken by the rest is wasted. So there is no need to build the complete list up front. In Python we can use a generator instead, which computes values as the loop runs, and therefore saves a great deal of memory.


import sys
import time


def calc_time(func):
    """ Decorator that measures how long a function takes to run """

    def wrapper(*args, **kwargs):
        start_time = time.time()
        f = func(*args, **kwargs)
        use_time = time.time() - start_time
        print(use_time, 's')
        return f
    return wrapper


@calc_time
def get_list(n):
    """ List derivation generates list data """
    return [i for i in range(n)]


@calc_time
def generate_list(n):
    """ Generate list data through generator """
    return (i for i in range(n))


n = 1_0000_0000
li = get_list(n)
g_li = generate_list(n)

print('li size:', sys.getsizeof(li))

print('g_li size', sys.getsizeof(g_li))

for i in range(5):
    print('li', li[i], 'g_li', next(g_li))


Running results

4.615425109863281s	# time taken by the list comprehension
0.0s				# time taken by the generator expression

li size: 859724472	# size of the object produced by the list comprehension
g_li size 120		# size of the object produced by the generator

li 0 g_li 0
li 1 g_li 1
li 2 g_li 2
li 3 g_li 3


As you can see, the list comprehension takes a long time and a lot of memory to produce that much data, while the generator takes almost no time to create, because it produces values on demand at run time. A generator therefore does not have to keep all the data around, only the few context variables it uses. On the other hand, a generator is less convenient to work with: it does not support slicing and lacks the rich methods of a list. Each approach has its pros and cons.


Interview question: a file contains 10 million rows of data and cannot fit into memory all at once; how do you read it?

def read_file(file):
    with open(file, mode='r') as f:

        # f.read()       loads the whole file into memory
        # f.readline()   reads a single line from the file
        # f.readlines()  returns a list of all lines (which also loads everything)

        for row in f:       # iterate over the file object itself, one line at a time
            yield row       # hand each row back through yield


file = 'aaa.txt'

for row in read_file(file):
    print(row)           
            

Assignment, shallow copy, and deep copy

  • Direct assignment: simply a new reference (an alias) to the same object.
  • Shallow copy (copy): copies the parent object but not the child objects inside it. For immutable data types nothing is copied at all; the "copy" just points to the same object.
  • Deep copy (deepcopy): the deepcopy function in the copy module copies the parent object and all of its children, completely and recursively.

The built-in copy module provides both shallow copy (copy.copy) and deep copy (copy.deepcopy).


Let's work through an example with c = [ [1, 2], [3, 4] ].

#  Direct assignment 
In [54]: import copy

In [55]: c = [ [1, 2], [3, 4] ]

In [56]: d = c

In [57]: id(d), id(c)
Out[57]: (2365035580680, 2365035580680)

d = c assigns a reference: c and d point to the same object and therefore have the same address.


In [58]: #  Shallow copy 

In [59]: e = copy.copy(c)

In [60]: id(e), id(c)
Out[60]: (2365034915848, 2365035580680)

In [61]: e
Out[61]: [[1, 2], [3, 4]]

In [62]: c
Out[62]: [[1, 2], [3, 4]]

In [63]: e[0].append(5)

In [64]: e
Out[64]: [[1, 2, 5], [3, 4]]

In [65]: c
Out[65]: [[1, 2, 5], [3, 4]]


e = copy.copy(c) is a shallow copy: c and e are independent objects (different addresses), but their child objects are still references to the same objects. That is why appending 5 to e[0] also affects c.


In [66]: #  Deep copy 

In [67]: c=[ [1, 2], [3, 4] ]

In [68]: f = copy.deepcopy(c)

In [69]: id(c), id(f)
Out[69]: (2365035892552, 2365035662792)

In [70]: c
Out[70]: [[1, 2], [3, 4]]

In [71]: f
Out[71]: [[1, 2], [3, 4]]

In [72]: f.append(5)

In [73]: f[0].append(6)

In [74]: f[1].append(7)

In [75]: c
Out[75]: [[1, 2], [3, 4]]

In [76]: f
Out[76]: [[1, 2, 6], [3, 4, 7], 5]
    

f = copy.deepcopy(c) is a deep copy: f is a complete copy of c's parent object and all of its children, so the two are entirely independent, and no modification of f affects c.

(Diagram: assignment vs. shallow copy vs. deep copy)


Notes

  • copy.copy() makes a shallow copy of mutable types.
  • copy.copy() does not copy immutable types at all; it simply returns a reference to the same object (see the sketch below).
  • copy.deepcopy() treats mutable and immutable types alike: everything is copied recursively, and the result is completely independent of the original.
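
A quick check of the immutable case, continuing the IPython session style above (the values are arbitrary):

In [77]: t = (1, 2, 3)           # an immutable tuple

In [78]: t2 = copy.copy(t)       # no new object is created

In [79]: t2 is t
Out[79]: True

In [80]: lst = [1, 2, 3]         # a mutable list

In [81]: lst2 = copy.copy(lst)   # a new (shallow) copy is created

In [82]: lst2 is lst
Out[82]: False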

For more on mutable vs. immutable data types and how copying treats them, see:

In-depth analysis of Python assignment, shallow copy, and deep copy: https://juejin.cn/post/6955354098988220423/


The GIL

GIL stands for Global Interpreter Lock. Because of this lock, even when CPython runs multiple threads concurrently, only one thread executes Python bytecode at any given moment, so a program cannot make full use of a multi-core CPU. That sounds crippling, but when a thread blocks on I/O the interpreter releases the GIL and lets other threads run, so in I/O-bound scenarios such as reading files or making network requests, multithreading can still improve performance noticeably.


Working around the GIL

1. Use multiple processes

Principle: each process gets its own interpreter and therefore its own GIL; the built-in multiprocessing module can start multiple processes, as in the sketch below.

Drawback: higher resource consumption, plus the extra cost of serializing data and communicating between processes.
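
A minimal sketch of option 1, assuming a CPU-bound count_down function that we want to spread across cores (the function and the numbers are made up for illustration):

from multiprocessing import Pool


def count_down(n):
    """ A CPU-bound task; each worker process has its own interpreter and GIL """
    while n > 0:
        n -= 1
    return n


if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # the four calls run in parallel across processes, not limited by a single GIL
        results = pool.map(count_down, [10_000_000] * 4)
    print(results)    # [0, 0, 0, 0]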


2. Use C extension modules

Principle: the execution of a C extension is isolated from the Python interpreter, and the GIL can be released from within the C code; libraries such as numpy and pandas do exactly this.

Drawback: the GIL is held while the C function is being called, so if that call blocks, the interpreter cannot release the GIL.

How to use: insert the dedicated macros into the C code, or access C code through tools such as the ctypes library or Cython (by default, ctypes releases the GIL automatically while calling into C code).


3. Use an alternative interpreter without a GIL instead of CPython

Principle: run the code on a Python implementation that has no GIL.

Drawback: these implementations are not fully compatible with CPython.

How to use: currently Jython and IronPython have no GIL.


Processes, threads, and coroutines

  • A process (Process) is the operating system's basic unit of resource allocation.
    • In Python, processes are created with the built-in multiprocessing module.
  • A thread (Thread) is the smallest unit of execution scheduled by the CPU; a single process can contain multiple threads (a small example follows this list).
    • In Python, threads are created with the built-in threading module.
  • A coroutine (Coroutine) is a lightweight, user-mode "micro thread" that saves and switches its own execution context.
    • Put simply, thread scheduling is the operating system's job: when a thread sleeps, waits, or wakes up is decided by the OS, and the developer has no say. With coroutines, the developer controls when execution switches: a function can be suspended partway through, give up the CPU, and later resume from the break point, with the switching points chosen by the developer.
    • In Python, coroutines can be implemented with the yield keyword, or with third-party libraries such as greenlet and gevent.
    • Since Python 3.5, the async/await keywords together with the built-in asyncio module support concurrent, asynchronous programming.
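
A minimal sketch of starting threads with the threading module (the worker function and its arguments are made up for illustration):

import threading


def worker(name, count):
    """ A simple task that runs in its own thread """
    for i in range(count):
        print(name, i)


threads = [
    threading.Thread(target=worker, args=('t1', 3)),
    threading.Thread(target=worker, args=('t2', 3)),
]

for t in threads:
    t.start()    # start each thread

for t in threads:
    t.join()     # wait for both threads to finish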

For more on using processes, threads, and coroutines in Python, refer to the following links:


Asynchronous programming with asyncio

asyncio was added to the standard library in Python 3.4, and Python 3.5 added the async/await keywords, which make it very convenient to create coroutines and control when they switch context.


Adding the async keyword in front of a function definition means that calling the function returns a <class 'coroutine'> object, an instance of the coroutine class that Python wraps around the function internally.

In [93]: async def task():
    ...:
    ...:     i = 0
    ...:     while i < 5:
    ...:         print(i)
    ...:         i += 1
    ...:

In [94]: c = task()

In [95]: c
Out[95]: <coroutine object task at 0x00000226A71505C8>

In [96]: type(c)
Out[96]: coroutine

import asyncio


async def task1():
    i = 0
    while i < 5:
        print('task1', i)
        i += 1
        await asyncio.sleep(0.1)    # await suspends this task while it waits, letting the other run


async def task2():
    i = 0
    while i < 5:
        print('task2', i)
        i += 1
        await asyncio.sleep(0.1)


def main():
    loop = asyncio.get_event_loop()

    tasks = [
        task1(),
        task2()
    ]

    # run the event loop until all tasks have completed
    loop.run_until_complete(asyncio.wait(tasks))
    print('end')


if __name__ == '__main__':
    main()


The output is as follows:

task2 0
task1 0
task2 1
task1 1
task2 2
task1 2
task2 3
task1 3
task2 4
task1 4
end
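
On Python 3.7 and later, the same program can be written more simply with asyncio.run and asyncio.gather; this is a sketch of an equivalent entry point, reusing task1 and task2 from above:

import asyncio


async def main():
    # schedule both coroutines concurrently and wait for them to finish
    await asyncio.gather(task1(), task2())
    print('end')


if __name__ == '__main__':
    asyncio.run(main())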

Implementation of the singleton pattern

1. Override the class's __new__() method

class Singleton(object):

    def __init__(self, name):
        self.name = name

    def __new__(cls, *args, **kwargs):

        if not hasattr(cls, '_instance'):
            cls._instance = super().__new__(cls)

        return cls._instance


s1 = Singleton('hui')
s2 = Singleton('jun')

print(id(s1), id(s2))
print(s1.name, s2.name)

# The output is as follows
3097281233736 3097281233736		# the addresses are the same: both names refer to one instance
jun jun		# __init__ still runs on every call, so the second call overwrites name with 'jun'

2. Use a decorator

def singleton(cls):

    _instance = {}

    def _singleton(*args, **kwargs):

        if cls not in _instance:

            _instance[cls] = cls(*args, **kwargs)

        return _instance[cls]

    return _singleton


@singleton
class Demo(object):

    def __init__(self, name, age=18):
        self.name = name
        self.age = age


d1 = Demo('hui', age=18)
d2 = Demo('jun', age=21)

print(id(d1), id(d2))
print(d1.name, d1.age)
print(d2.name, d2.age)


# The output is as follows
2107174554696 2107174554696
hui 18
hui 18

Let's analyze what happens by adding a few prints:


def singleton(cls):

    print(type(cls))

    _instance = {}

    def _singleton(*args, **kwargs):

        print(args)
        print(kwargs)

        if cls not in _instance:

            print(cls.__dict__)
            instance = cls(*args, **kwargs)
            print(type(instance))

            _instance[cls] = instance

        return _instance[cls]

    return _singleton


@singleton
class Demo(object):

    def __init__(self, name, age=18):
        self.name = name
        self.age = age


d1 = Demo('hui', age=18)
d2 = Demo('jun', age=21)

print(id(d1), id(d2))
print(d1.name, d1.age)
print(d2.name, d2.age)

The output, with analysis:

<class 'type'>    # the type of the class object (cls) passed to singleton
('hui',)          # positional arguments of the first Demo(...) call
{'age': 18}       # keyword arguments of the first Demo(...) call

# The __dict__ attribute of the Demo class object
{'__module__': '__main__',
 '__init__': <function Demo.__init__ at 0x00000239C8784798>, 
 '__dict__': <attribute '__dict__' of 'Demo' objects>,
 '__weakref__': <attribute '__weakref__' of 'Demo' objects>,
 '__doc__': None
}

<class '__main__.Demo'>    # the instance built from the class object
('jun',)                   # positional arguments of the second Demo(...) call
{'age': 21}                # keyword arguments of the second call
2447199767752 2447199767752    # the addresses are the same

# The instance is only created on the first call; later calls return the cached instance directly,
# so the arguments passed on later calls have no effect
hui 18
hui 18

For more ways to implement a singleton, see: www.cnblogs.com/huchong/p/8…


Metaclasses

Tracing the ancestor of Python classes, the metaclass: https://juejin.cn/post/6957631734343008269


CGI and WSGI

CGI stands for Common Gateway Interface, a standard protocol that defines how programs interact with web servers.

WSGI stands for Python Web Server Gateway Interface; it is the interface specification for communication between a web server and a Python web application, and it is what lets the two sides interoperate.


(Diagram: how a Python web application communicates with the web server)
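
A minimal WSGI application is just a callable that takes environ and start_response; the sketch below serves it with the standard library's wsgiref server (the host, port, and response body are chosen arbitrarily):

from wsgiref.simple_server import make_server


def application(environ, start_response):
    """ The WSGI callable that the web server invokes for every request """
    status = '200 OK'
    headers = [('Content-Type', 'text/plain; charset=utf-8')]
    start_response(status, headers)
    # the body must be an iterable of bytes
    return [b'Hello, WSGI']


if __name__ == '__main__':
    with make_server('127.0.0.1', 8000, application) as server:
        server.serve_forever()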


Python memory management and garbage collection

Memory in Python is managed as a private heap: all Python objects and data structures live in it. Programmers have no direct access to this private heap; the Python interpreter manages it for them.

Python's memory management relies on reference counting, garbage collection, and memory pools.

Garbage collection is driven mainly by reference counting, supplemented by mark-and-sweep and generational collection.
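
A quick way to observe the reference count and trigger the cycle collector by hand (the exact counts depend on the interpreter, so treat the numbers as illustrative):

import sys
import gc

a = [1, 2, 3]
b = a                        # another reference to the same list
print(sys.getrefcount(a))    # e.g. 3: a, b, and the temporary argument passed to getrefcount

del b                        # dropping a reference lowers the count
print(sys.getrefcount(a))    # e.g. 2

print(gc.collect())          # run a cycle-detecting collection manually; returns the number collected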


For further reading: www.pythonav.com/wiki/detail…

Copyright notice
Author: An unexpected glow. Please include the original link when reprinting, thank you.
https://en.pythonmana.com/2022/01/202201300550081647.html
