Introduction

Processes do not share data with each other, but they do share the same file system, so it is perfectly normal for several processes to access the same file or the same print terminal. Sharing, however, brings competition, and competition leads to garbled results. The way to control this is locking.

Example 1: multiple processes sharing one print terminal

① Concurrent execution is efficient, but competing for the same terminal garbles the printed output
from multiprocessing import Process
import os,time
def work():
    print('%s is running' %os.getpid())
    time.sleep(2)
    print('%s is done' %os.getpid())

if __name__ == '__main__':
    p1=Process(target=work)
    p2=Process(target=work)
    p3=Process(target=work)
    p4=Process(target=work)
    p1.start()
    p2.start()
    p3.start()
    p4.start()

Result:

(screenshot omitted; the four processes' output can interleave on the terminal)

② Concurrency becomes serial: we sacrifice speed but avoid the competition
from multiprocessing import Process,Lock
import os,time
def work(lock):
    lock.acquire()
    print('%s is running' %os.getpid())
    time.sleep(2)
    print('%s is done' %os.getpid())
    lock.release()
if __name__ == '__main__':
    lock=Lock()
    p1=Process(target=work,args=(lock,))
    p2=Process(target=work,args=(lock,))
    p3=Process(target=work,args=(lock,))
    p4=Process(target=work,args=(lock,))
    p1.start()
    p2.start()
    p3.start()
    p4.start()

Result:

(screenshot omitted; each process now prints its "running"/"done" pair without interruption)
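The acquire()/release() pair above works, but if the code between the two calls raises an exception, the lock is never released and every other process blocks forever. multiprocessing.Lock also supports the with statement, which acquires on entry and releases on exit no matter how the block ends; a sketch of the same example in that form:

```python
from multiprocessing import Process, Lock
import os, time

def work(lock):
    # "with lock:" acquires on entry and releases on exit,
    # even if the body raises, unlike a bare acquire()/release() pair
    with lock:
        print('%s is running' % os.getpid())
        time.sleep(0.5)
        print('%s is done' % os.getpid())

if __name__ == '__main__':
    lock = Lock()
    procs = [Process(target=work, args=(lock,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```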

Example 2: multiple processes sharing one file

① Basic version:

Suppose there is a num.txt file containing the number 10 (the code creates it on the first run if it is missing). We use multiple processes to read the value, and each read decrements the number in the file by 1. Without a lock, the code looks like this:

from multiprocessing import Process
import time
import os

def func(i):
    if os.path.exists('num.txt'):
        with open('num.txt', 'r') as f:
            num = int(f.read())
            print(i, 'read num:', num)
            num -= 1
        time.sleep(1)
        with open('num.txt', 'w') as f:
            f.write(str(num))
            print(i, 'wrote num:', num)
    else:
        # first run: create the file with the starting value
        with open('num.txt', 'w') as f:
            f.write('10')

if __name__ == '__main__':

    print('main process starting...')
    p_list = []
    for i in range(10):
        p = Process(target=func, args=(i,))
        p_list.append(p)
        p.start()

    for p in p_list:
        p.join()

    with open('num.txt', 'r') as f:
        num = int(f.read())
        print('final value: {}'.format(num))
        print('main process done.')

Result:

(screenshot omitted; the final value is not 1)


Although we used ten processes to read and modify the txt file, the final value is not 1. This is exactly the corruption caused by concurrent access to a shared resource: several processes read the same old value before any of them writes the decremented value back, so decrements are lost. To get the expected result, lock the resource:

from multiprocessing import Process
from multiprocessing import Lock
import time
import os

def func(i, lock):
    if os.path.exists('num.txt'):
        # only one process at a time may perform the read-modify-write
        lock.acquire()
        with open('num.txt', 'r') as f:
            num = int(f.read())
            print(i, 'read num:', num)
            num -= 1
        time.sleep(1)
        with open('num.txt', 'w') as f:
            f.write(str(num))
            print(i, 'wrote num:', num)
        lock.release()
    else:
        with open('num.txt', 'w') as f:
            f.write('10')

if __name__ == '__main__':

    print('main process starting...')
    lock = Lock()
    p_list = []
    for i in range(10):
        p = Process(target=func, args=(i, lock))
        p_list.append(p)
        p.start()

    for p in p_list:
        p.join()

    with open('num.txt', 'r') as f:
        num = int(f.read())
        print('final value: {}'.format(num))
        print('main process done.')

Result:

(screenshot omitted; the processes modify the file one at a time and the final value is 1)
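The file here is just a stand-in for a shared resource. If all you need is a shared counter, multiprocessing.Value (a ctypes value in shared memory) is a lighter alternative, and the same lock pattern protects its read-modify-write; a minimal sketch:

```python
from multiprocessing import Process, Value, Lock

def decrement(counter, lock):
    # counter.value -= 1 is a read-modify-write, not atomic,
    # so it still needs the lock
    with lock:
        counter.value -= 1

if __name__ == '__main__':
    lock = Lock()
    counter = Value('i', 10)  # 'i' = signed int, initial value 10
    procs = [Process(target=decrement, args=(counter, lock)) for _ in range(10)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print('final value:', counter.value)  # 0: no decrement is lost
```

Note that `Value` also carries its own internal lock, reachable via `counter.get_lock()`, which can replace the explicit `Lock()` here.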

② Advanced version: a ticket-grabbing simulation
Note: the file db.txt contains: {"count":1}
Note: the quotes must be double quotes, otherwise the json module cannot parse the file
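The double-quote requirement comes from the JSON format itself; Python's json module rejects single-quoted keys:

```python
import json

# valid JSON: keys and strings must use double quotes
print(json.loads('{"count": 1}'))

try:
    json.loads("{'count': 1}")  # single quotes are not valid JSON
except json.JSONDecodeError as e:
    print('invalid JSON:', e)
```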

Concurrent execution is fast, but competing writes to the same file corrupt the data.

The following version has no lock, so concurrent read-modify-write cycles lose updates:

from multiprocessing import Process, Lock
import time, json

def search(i):
    with open('db.txt') as f:
        dic = json.load(f)
    print(i, '- tickets left:', dic['count'])

def get(i):
    with open('db.txt') as f:
        dic = json.load(f)
    time.sleep(0.1)  # simulate network latency on the read
    if dic['count'] > 0:
        dic['count'] -= 1
        time.sleep(0.2)  # simulate network latency on the write
        with open('db.txt', 'w') as f:
            json.dump(dic, f)
        print(i, 'got a ticket')
    else:
        print(i, 'no tickets left')

def task(lock, i):
    search(i)
    get(i)  # the lock is passed in but deliberately unused in this version

if __name__ == '__main__':
    lock = Lock()
    for i in range(100):  # simulate 100 clients grabbing tickets concurrently
        p = Process(target=task, args=(lock, i))
        p.start()
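
The lost update in the unlocked run above can be reproduced deterministically without any processes at all: two clients both take a snapshot of the data before either writes back, so one decrement overwrites the other. A single-process sketch of that interleaving:

```python
db = {'count': 1}  # one ticket left

# both clients read before either writes back (the sleep() window above)
snapshot_a = dict(db)
snapshot_b = dict(db)

snapshot_a['count'] -= 1   # client A saw count == 1, so it sells the ticket...
db = snapshot_a            # ...and writes count = 0

snapshot_b['count'] -= 1   # client B also saw count == 1, so it sells too...
db = snapshot_b            # ...and writes count = 0, overwriting A's update

print('tickets sold: 2, but count only dropped from 1 to', db['count'])
```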

A lock guarantees that when multiple processes modify the same data, only one task can modify it at a time; the modifications become serial. Yes, it is slower, but we trade speed for data safety.

The version below holds the lock around the whole read-modify-write:
Note: the file db.txt now starts with: {"count":5}

from multiprocessing import Process, Lock
import time, json

def search(i):
    with open('db.txt') as f:
        dic = json.load(f)
    print(i, '- tickets left:', dic['count'])

def get(i, lock):
    lock.acquire()
    with open('db.txt') as f:
        dic = json.load(f)
    print(i, '- tickets left:', dic['count'])
    time.sleep(0.1)  # simulate network latency on the read
    if dic['count'] > 0:
        dic['count'] -= 1
        time.sleep(0.2)  # simulate network latency on the write
        with open('db.txt', 'w') as f:
            json.dump(dic, f)
        print(i, 'got a ticket')
    else:
        print(i, 'no tickets left')
    lock.release()

def task(lock, i):
    search(i)
    get(i, lock)

if __name__ == '__main__':
    lock = Lock()
    for i in range(10):  # simulate 10 clients grabbing tickets concurrently
        p = Process(target=task, args=(lock, i))
        p.start()

Result:

(screenshot omitted; exactly 5 of the 10 clients get a ticket and the rest fail)