
Purpose: make use of the computer's multiple CPU cores.
Multithreading in Python is not true parallel execution: because of the Global Interpreter Lock (GIL), only one thread runs Python bytecode at a time. To make full use of a multi-core CPU, in most cases you need multiple processes in Python.
Python's multiprocessing library is used to build multi-process programs. It provides an easy-to-use interface for running several processes in parallel, so the computing power of a multi-core processor can be fully exploited. The library lets you create processes, manage process pools, share memory, and communicate between processes. It is well suited to CPU-bound tasks such as scientific computing and image processing, can significantly improve a program's throughput, and is an important tool for parallel computing.
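To make the point above about multithreading concrete, here is a minimal sketch (not part of the original notes) that runs the same CPU-bound countdown first with four threads and then with four processes; on a multi-core machine the process version normally finishes several times faster, because the threads are serialized by the GIL.

import multiprocessing as mp
import threading
import time

def count_down(n):
    while n > 0:
        n -= 1

if __name__ == "__main__":
    N = 5_000_000

    start = time.time()
    threads = [threading.Thread(target=count_down, args=(N,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("4 threads:   %.2f s" % (time.time() - start))

    start = time.time()
    processes = [mp.Process(target=count_down, args=(N,)) for _ in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    print("4 processes: %.2f s" % (time.time() - start))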
import multiprocessing
print(multiprocessing.cpu_count())  # number of CPU cores available on this machine
import multiprocessing as mp  # import multiprocessing

def job(a, b):
    print("aaaaa")

if __name__ == "__main__":
    p1 = mp.Process(target=job, args=(1, 2))  # specify the target function and its arguments
    p1.start()                                # start the child process
    p1.join()                                 # the main process waits for the child process to finish
p1 = mp.Process(target=job, args=(1, 2))  # specify the target function; to start several workers, create Process objects like this in a for loop
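A minimal sketch (not in the original notes) of what that comment suggests: create several Process objects in a loop, start them all, then join them all.

import multiprocessing as mp

def job(a, b):
    print("worker got", a, b)

if __name__ == "__main__":
    processes = [mp.Process(target=job, args=(i, i + 1)) for i in range(4)]
    for p in processes:
        p.start()   # launch all workers
    for p in processes:
        p.join()    # wait for all of them to finish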
import multiprocessing as mp

def job(q):
    res = 0
    for i in range(1000):
        res += i + i**2 + i**3
    q.put(res)                              # 3. put the result into the queue

if __name__ == "__main__":
    q = mp.Queue()                          # 1. create the Queue
    p1 = mp.Process(target=job, args=(q,))  # 2. pass the Queue to each process
    p2 = mp.Process(target=job, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    res1 = q.get()                          # 4. read the results back in the main process
    res2 = q.get()
    print(res1)
    print(res2)
    print(res1 + res2)
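The Queue above covers message passing; shared memory, mentioned in the introduction, is the other option. Below is a minimal sketch (not in the original notes) in which two workers increment one shared counter: multiprocessing.Value keeps a number in shared memory, and a Lock prevents lost updates.

import multiprocessing as mp

def job(counter, lock):
    for _ in range(1000):
        with lock:               # serialize access so no increment is lost
            counter.value += 1

if __name__ == "__main__":
    counter = mp.Value("i", 0)   # shared integer, initial value 0
    lock = mp.Lock()
    p1 = mp.Process(target=job, args=(counter, lock))
    p2 = mp.Process(target=job, args=(counter, lock))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    print(counter.value)         # 2000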
import multiprocessing as mp

def job(x):
    return x * x

def multicore():
    pool = mp.Pool()                 # 1. create the process pool
    res = pool.map(job, range(100))  # 2. map the task over the inputs
    print(res)

if __name__ == "__main__":
    multicore()
import multiprocessing as mp

def job(x):
    return x * x

def multicore():
    pool = mp.Pool(processes=2)      # 1. create the pool; processes sets the number of workers, omit it to use all cores
    res = pool.map(job, range(100))
    print(res)

if __name__ == "__main__":
    multicore()
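As a side note (not in the original notes), Pool can also be used as a context manager; the pool is shut down automatically when the with block exits, so no manual cleanup is needed.

import multiprocessing as mp

def job(x):
    return x * x

if __name__ == "__main__":
    with mp.Pool(processes=2) as pool:   # pool is shut down automatically when the block exits
        res = pool.map(job, range(100))
    print(res)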
import multiprocessing
import time

def func(name):
    print("start: %s" % name)
    time.sleep(2)
    return "end: %s" % name

if __name__ == "__main__":
    name_list = ["winter", "elly", "james", "yule"]
    pool = multiprocessing.Pool(3)  # create a pool with 3 worker processes
    for member in name_list:
        res = pool.apply(func, (member,))  # run the task in a child process; apply blocks until it returns, no start() needed
        print(res)
    pool.close()
    pool.join()  # call close() before join(), otherwise an error is raised; after close() no new tasks can be submitted to the pool
    print("all done...")
start: winter
end: winter
start: elly
end: elly
start: james
end: james
start: yule
end: yule
all done...
import multiprocessing
import time

def func(name):
    print("start: %s" % name)
    time.sleep(2)
    return "end: %s" % name

def func_exp(msg):
    print("callback: %s" % msg)

if __name__ == "__main__":
    name_list = ["winter", "elly", "james", "yule"]
    res_list = []
    pool = multiprocessing.Pool()  # create a pool with one worker per CPU core
    for member in name_list:
        res = pool.apply_async(func, (member,), callback=func_exp)  # submit the task asynchronously; no start() needed
        res_list.append(res)  # note: append the AsyncResult itself, not res.get(), otherwise this loop would block
    for res_mem in res_list:
        print(res_mem.get())
    pool.close()
    pool.join()  # call close() before join(), otherwise an error is raised; after close() no new tasks can be submitted to the pool
    print("all done...")
start: winter
start: elly
start: james
start: yule
callback: end: winter
end: winter
callback: end: elly
end: elly
callback: end: james
end: james
callback: end: yule
end: yule
all done...
import pymysql

def func(i):
    # each worker process opens its own connection; connections cannot be shared across processes
    mysqldb = pymysql.connect(host="127.0.0.1", user="root", password="123456",
                              db="data", port=3306, charset="utf8")
    mysqlcursor2 = mysqldb.cursor()  # get a cursor to operate on the database
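A minimal sketch (not in the original notes) of how a function like this might be driven by a pool. The query and the input range are hypothetical placeholders, and the connection parameters are the ones from the snippet above; the point is simply that each worker process creates and closes its own connection.

import multiprocessing
import pymysql

def func(i):
    # hypothetical credentials, as in the snippet above
    mysqldb = pymysql.connect(host="127.0.0.1", user="root", password="123456",
                              db="data", port=3306, charset="utf8")
    cursor = mysqldb.cursor()
    cursor.execute("SELECT 1")  # hypothetical query standing in for the real work
    row = cursor.fetchone()
    mysqldb.close()
    return (i, row)

if __name__ == "__main__":
    with multiprocessing.Pool(4) as pool:
        print(pool.map(func, range(8)))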