Handling large files in Python
阿新 • Published: 2019-02-11
Method 1:
def read_in_chunks(filePath, chunk_size=1024 * 1024):
    """
    Lazy function (generator) to read a file piece by piece.
    Default chunk size: 1 MB; you can set your own chunk size.
    """
    with open(filePath, 'rb') as file_object:
        while True:
            chunk_data = file_object.read(chunk_size)
            if not chunk_data:
                break
            yield chunk_data

if __name__ == "__main__":
    filePath = './path/filename'
    for chunk in read_in_chunks(filePath):
        process(chunk)  # <do something with chunk>
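As a concrete use of the generator above, the chunks can feed an incremental computation that never holds the whole file in memory. A minimal sketch, hashing a file chunk by chunk (the temporary file and the choice of MD5 are illustrative, not part of the original article):

```python
import hashlib
import os
import tempfile

def read_in_chunks(file_object, chunk_size=1024 * 1024):
    # Yield successive fixed-size chunks until EOF.
    while True:
        chunk = file_object.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Create a small sample file standing in for a large one.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * (3 * 1024 * 1024))  # 3 MB of data
    path = tmp.name

# Hash the file incrementally: each 1 MB chunk updates the digest.
digest = hashlib.md5()
with open(path, "rb") as f:
    for chunk in read_in_chunks(f):
        digest.update(chunk)

print(digest.hexdigest())
os.remove(path)
```

Because `read_in_chunks` is a generator, memory use stays at one chunk (1 MB here) regardless of the file's total size.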
Method 2:
# If the file is line based
with open(...) as f:
    for line in f:
        process(line)  # <do something with line>
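A file object is itself a lazy line iterator, so this pattern also reads only one line at a time. A small runnable sketch (the temporary file and the summing step stand in for `process(line)` and are assumptions for illustration):

```python
import os
import tempfile

# Write a small line-based sample file.
with tempfile.NamedTemporaryFile("w", delete=False) as tmp:
    tmp.write("10\n20\n30\n")
    path = tmp.name

total = 0
with open(path) as f:        # the file object yields lines lazily
    for line in f:
        total += int(line)   # in place of process(line): sum the numbers

print(total)
os.remove(path)
```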
Method 3:
import fileinput

for line in fileinput.input(['sum.log']):
    print(line)
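fileinput also iterates lazily and can chain several files into one line stream. A minimal sketch, with two temporary files standing in for 'sum.log' (the file contents here are illustrative):

```python
import fileinput
import os
import tempfile

# Two small log files stand in for 'sum.log'.
paths = []
for text in ("a\nb\n", "c\n"):
    with tempfile.NamedTemporaryFile("w", delete=False, suffix=".log") as tmp:
        tmp.write(text)
        paths.append(tmp.name)

# fileinput.input() walks the files in order, yielding one line at a time.
lines = []
for line in fileinput.input(paths):
    lines.append(line.rstrip("\n"))

print(lines)
for p in paths:
    os.remove(p)
```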