I have a very large 4GB file, and when I try to read it my computer hangs. So I want to read it piece by piece, process each piece, store the processed piece in another file, and then read the next piece.

Is there any way to yield these pieces?

I would like a lazy approach.


Current answer

You can use the following code.

file_obj = open('big_file') 

open() returns a file object.

Then use os.stat to get the size of the file:

import os
file_size = os.stat('big_file').st_size

for i in range(file_size // 1024):  # integer division is required in Python 3
    print(file_obj.read(1024))
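
Because the integer division drops the remainder, the loop above never reads a final partial chunk, and mixing a byte count from os.stat with a text-mode read() is also slightly off. A minimal sketch that reads in binary mode and handles the tail, assuming a hypothetical process() callback:

import os

chunk_size = 1024
file_size = os.stat('big_file').st_size
num_chunks = -(-file_size // chunk_size)  # round up to include the partial tail

with open('big_file', 'rb') as file_obj:
    for _ in range(num_chunks):
        process(file_obj.read(chunk_size))  # process() is a placeholder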

Other answers

In Python 3.8+, you can use .read() in a while loop with the walrus operator:

with open("somefile.txt") as f:
    while chunk := f.read(8192):
        do_something(chunk)

Of course, you can use any chunk size you want; you don't have to use 8192 (2**13) bytes. Unless your file's size happens to be a multiple of the chunk size, the last chunk will be smaller than the chunk size.
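
As a sketch of the read/process/write workflow from the question using this pattern (the file names and the transform() function here are placeholders, not part of the original answer):

def transform(chunk):
    # placeholder for whatever per-chunk processing you need
    return chunk.upper()

with open("somefile.txt") as src, open("processed.txt", "w") as dst:
    while chunk := src.read(8192):
        dst.write(transform(chunk))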

To write a lazy function, just use yield:

def read_in_chunks(file_object, chunk_size=1024):
    """Lazy function (generator) to read a file piece by piece.
    Default chunk size: 1k."""
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data


with open('really_big_file.dat') as f:
    for piece in read_in_chunks(f):
        process_data(piece)

Another option is to use iter and a helper function:

f = open('really_big_file.dat')
def read1k():
    return f.read(1024)

for piece in iter(read1k, ''):
    process_data(piece)
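
The helper can also be replaced with functools.partial; note that for a file opened in binary mode the sentinel must be b'' rather than '':

from functools import partial

with open('really_big_file.dat', 'rb') as f:
    for piece in iter(partial(f.read, 1024), b''):
        process_data(piece)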

If the file is line-based, the file object is already a lazy generator of lines:

for line in open('really_big_file.dat'):
    process_data(line)

Update: You can also use file_object.readlines(chunk_size) if you want the chunks to consist of complete lines, i.e. no partial lines will appear in the results.

For example:

def read_in_chunks(file_object, chunk_size=1024):
    """Lazy function (generator) to read a file piece by piece.
    Default chunk size: 1k."""
    while True:
        # readlines(hint) returns a list of complete lines whose total
        # size is roughly chunk_size bytes
        data = file_object.readlines(chunk_size)
        if not data:
            break
        yield data
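
Since readlines returns a list, each yielded chunk is a list of complete lines rather than a string; a usage sketch:

with open('really_big_file.dat') as f:
    for lines in read_in_chunks(f):
        for line in lines:
            process_data(line)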

-- Adding on to the answer given above --

When reading a file in chunks (let's assume a text file named split.txt), the problem I faced was that I had a use case where I was processing the data line by line, and because I was reading the text file in chunks, a chunk would sometimes end with a partial line, which ended up breaking my code (which expected complete lines to process).

After some reading I learned that I could overcome this problem by keeping track of the last bit of each chunk: if the chunk ends with '\n', it contains only complete lines; otherwise I store the partial last line in a variable and prepend it to the first, incomplete line of the next chunk. With that I successfully overcame the problem.

Sample code:

# In this function I am reading the file in chunks
def read_in_chunks(file_object, chunk_size=1024):
    """Lazy function (generator) to read a file piece by piece.
    Default chunk size: 1k."""
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

# File where I am writing my final output
write_file = open('split.txt', 'w')

# Variable I am using to store the last partial line from each chunk
placeholder = ''
file_count = 1

try:
    with open('/Users/rahulkumarmandal/Desktop/combined.txt') as f:
        for piece in read_in_chunks(f):
            line_by_line = piece.split('\n')

            for one_line in line_by_line:
                # If the placeholder is non-empty, the previous chunk ended
                # with a partial line that we need to concatenate with the
                # current one
                if placeholder:
                    one_line = placeholder + one_line
                    # Reset the placeholder so the next partial line can be
                    # stored for the following chunk
                    placeholder = ''

                # Further logic specific to my use case: a complete line has
                # at least 18 '~'-separated fields, so anything shorter is
                # treated as a partial line and carried over
                segregated_data = one_line.split('~')
                if len(segregated_data) < 18:
                    placeholder = one_line
                    continue
                else:
                    placeholder = ''
                if segregated_data[2] == '2020' and segregated_data[3] == '2021':
                    # write this record
                    data = "~".join(segregated_data)
                    write_file.write(data)
                    write_file.write('\n')
                    print(write_file.tell())
                elif segregated_data[2] == '2021' and segregated_data[3] == '2022':
                    # write this record
                    data = "-".join(segregated_data)
                    write_file.write(data)
                    write_file.write('\n')
                    print(write_file.tell())
except Exception as e:
    print('error is', e)
finally:
    write_file.close()
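
The same bookkeeping can be packaged as a small reusable generator that yields only complete lines no matter where the chunk boundaries fall. A sketch (complete_lines is my own name, not part of the answer above):

def complete_lines(file_object, chunk_size=1024):
    """Yield complete lines, carrying partial lines across chunk boundaries."""
    leftover = ''
    while True:
        chunk = file_object.read(chunk_size)
        if not chunk:
            break
        lines = (leftover + chunk).split('\n')
        leftover = lines.pop()  # last element is '' or a partial line
        for line in lines:
            yield line
    if leftover:  # emit a final line that had no trailing newline
        yield leftover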

I think we can write it like this:

def read_file(path, block_size=1024): 
    with open(path, 'rb') as f: 
        while True: 
            piece = f.read(block_size) 
            if piece: 
                yield piece 
            else: 
                return

for piece in read_file(path):
    process_piece(piece)

f = ...  # a file-like object, i.e. one supporting read(size) and
         # returning the empty string '' when there is nothing to read

def chunked(file, chunk_size):
    return iter(lambda: file.read(chunk_size), '')

for data in chunked(f, 65536):
    process_data(data)  # process the data
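
Note that the '' sentinel assumes text mode; for a file opened in binary mode read() returns b'' at EOF, so the loop above would never terminate. A binary-mode variant (chunked_binary is a hypothetical name):

def chunked_binary(file, chunk_size):
    # read() returns b'' at EOF in binary mode, so use b'' as the sentinel
    return iter(lambda: file.read(chunk_size), b'')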

Update: this approach is best explained in https://stackoverflow.com/a/4566523/38592