Append multiple files in Python

I have up to 8 separate Python processes creating temp files in a shared folder. Then I'd like the controlling process to append all the temp files, in a certain order, into one big file. What's the quickest way of doing this at an OS-agnostic shell level?

asked Apr 1, 2011 at 6:26


6

Just using simple file IO:

# tempfiles is a list of file handles to your temp files. Order them however you like
f = open("bigfile.txt", "w")
for tempfile in tempfiles:
    f.write(tempfile.read())

That's about as OS-agnostic as it gets. It's also fairly simple, and the performance ought to be about as good as using anything else.
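A buffered alternative is the standard library's shutil.copyfileobj, which copies in chunks instead of loading each file fully into memory. A minimal sketch (the sample files and paths here are just for demonstration, not part of the original answer):

import os
import shutil
import tempfile

# Create a few sample temp files to concatenate (demo setup only)
tmpdir = tempfile.mkdtemp()
tempfile_paths = []
for i, text in enumerate(["alpha\n", "beta\n", "gamma\n"]):
    path = os.path.join(tmpdir, "part%d.txt" % i)
    with open(path, "w") as f:
        f.write(text)
    tempfile_paths.append(path)

# Concatenate them with a buffered, constant-memory copy
big_path = os.path.join(tmpdir, "bigfile.txt")
with open(big_path, "wb") as outfile:
    for path in tempfile_paths:
        with open(path, "rb") as infile:
            shutil.copyfileobj(infile, outfile)

Opening both sides in binary mode also sidesteps any newline translation, so the big file is a byte-exact concatenation.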

answered Apr 1, 2011 at 6:29

Rafe KettlerRafe Kettler

74.2k19 gold badges152 silver badges149 bronze badges

2

I'm not aware of any shell-level command for appending one file to another. But appending at the Python level is sufficiently easy that I am guessing Python developers did not think it necessary to add it to the standard library.

The solution depends on the size and structure of the temp files you are appending. If they are all small enough that you don't mind reading each of them into memory, then the answer from Rafe Kettler (repeated below) does the job with the least amount of code.

# tempfiles is an ordered list of temp files (open for reading)
f = open("bigfile.txt", "w")
for tempfile in tempfiles:
    f.write(tempfile.read())

If reading files fully into memory is not possible or not an appropriate solution, you will want to loop through each file and read it piece-wise. If your temp files contain newline-terminated lines which can be read individually into memory, you might do something like this:

# tempfiles is an ordered list of temp files (open for reading)
f = open("bigfile.txt", "w")
for tempfile in tempfiles:
    for line in tempfile:
        f.write(line)

Alternatively - something which will always work - you may choose a buffer size and just read the file piece-wise, e.g.

# tempfiles is an ordered list of temp files (open for reading)
f = open("bigfile.txt", "w")
for tempfile in tempfiles:
    while True:
        data = tempfile.read(65536)
        if data:
            f.write(data)
        else:
            break

The input/output tutorial has a lot of good info.

answered Apr 1, 2011 at 10:28

CptJeanLucCptJeanLuc

1411 silver badge5 bronze badges

1

Rafe's answer was lacking proper open/close statements; using with blocks fixes that, e.g.

# tempfiles is a list of file handles to your temp files. Order them however you like
with open("bigfile.txt", "w") as fo:
    for tempfile in tempfiles:
        with open(tempfile, 'r') as fi:
            fo.write(fi.read())

However, be forewarned: if you later want to sort the contents of the big file, this method does not catch the case where the last line in one or more of your temp files has a different EOL format, which will cause some strange sort results. In that case, you will want to strip the temp file lines as you read them, and then write lines with a consistent EOL to the big file (i.e. an extra line of code).
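The EOL normalization described above can be sketched as follows (the demo files here, one missing its trailing newline and one with CRLF endings, are hypothetical):

import os
import tempfile

# Demo setup: one file missing its final newline, one with CRLF endings
tmpdir = tempfile.mkdtemp()
paths = [os.path.join(tmpdir, "part1.txt"), os.path.join(tmpdir, "part2.txt")]
with open(paths[0], "w", newline="") as f:
    f.write("one\ntwo")           # last line has no EOL at all
with open(paths[1], "w", newline="") as f:
    f.write("three\r\nfour\r\n")  # CRLF line endings

# Concatenate, stripping each line's original EOL and writing a consistent one
big = os.path.join(tmpdir, "bigfile.txt")
with open(big, "w", newline="") as fo:
    for name in paths:
        with open(name, "r", newline="") as fi:
            for line in fi:
                fo.write(line.rstrip("\r\n") + "\n")

Passing newline="" disables Python's universal-newline translation, so the code sees, and rewrites, the raw line endings.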

answered Aug 16, 2013 at 16:24


ksedksed

3361 gold badge4 silver badges13 bronze badges

I feel a bit stupid to add another answer after 8 years and so many answers, but I arrived here by the "append to file" title, and didn't see the right solution for appending to an existing binary file with buffered read/write.

So here is the basic way to do that:

def append_file_to_file(_from, _to):
    block_size = 1024*1024
    with open(_to, "ab") as outfile, open(_from, "rb") as infile:
        while True:
            input_block = infile.read(block_size)
            if not input_block:
                break
            outfile.write(input_block)

Given this building block, you can use:

for filename in ['a.bin','b.bin','c.bin']:
    append_file_to_file(filename, 'outfile.bin')

answered Dec 15, 2019 at 14:40

ishahakishahak

6,1835 gold badges34 silver badges53 bronze badges

import os

# Append every file in the current directory to temp.txt,
# skipping temp.txt itself so we don't read our own output
with open("temp.txt", "a") as f2:
    for name in os.listdir("./"):
        if name == "temp.txt":
            continue
        with open(name) as f:
            for line in f:
                f2.write(line)

We can use the above code to read the contents of all the files present in the current directory and store them in the temp.txt file.

answered Mar 16, 2017 at 11:27

Sumit NaikSumit Naik

7141 gold badge7 silver badges11 bronze badges

Use fileinput:

import fileinput

with open("bigfile.txt", "w") as big_file:
    with fileinput.input(files=tempfiles) as inputs:
        for line in inputs:
            big_file.write(line)

This is more memory efficient than @RafeKettler's answer as it doesn't need to read the whole file into memory before writing to big_file.

answered Oct 11, 2014 at 17:30


Peter WoodPeter Wood

23.2k5 gold badges58 silver badges94 bronze badges

Try this. It's very fast (much faster than line-by-line, and shouldn't thrash virtual memory for large files), and should run on just about anything, including CPython 2.x, CPython 3.x, PyPy, PyPy3 and Jython. It should also be highly OS-agnostic, and it makes no assumptions about file encodings.

#!/usr/local/cpython-3.4/bin/python3

'''Cat 3 files to one: example code'''

import os

def main():
    '''Main function'''
    input_filenames = ['a', 'b', 'c']

    block_size = 1024 * 1024

    # Windows needs O_BINARY; the flag doesn't exist on other platforms
    o_binary = getattr(os, 'O_BINARY', 0)
    # O_CREAT so the open succeeds even if the output file doesn't exist yet
    output_file = os.open('output-file', os.O_WRONLY | os.O_CREAT | o_binary)
    for input_filename in input_filenames:
        input_file = os.open(input_filename, os.O_RDONLY | o_binary)
        while True:
            input_block = os.read(input_file, block_size)
            if not input_block:
                break
            os.write(output_file, input_block)
        os.close(input_file)
    os.close(output_file)

main()

There is one (nontrivial) optimization I've left out: It's better to not assume anything about a good blocksize, instead using a bunch of random ones, and slowly backing off the randomization to focus on the good ones (sometimes called "simulated annealing"). But that's a lot more complexity for little actual performance benefit.

You could also make the os.write keep track of its return value and restart partial writes, but that's only really necessary if you're expecting to receive (nonterminal) *ix signals.
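That partial-write handling could be sketched like this (write_all is a hypothetical helper name, not part of the answer's original code):

import os
import tempfile

def write_all(fd, data):
    # os.write may write fewer bytes than requested (e.g. after a
    # nonterminal signal); keep restarting until the buffer is drained.
    view = memoryview(data)
    while len(view):
        written = os.write(fd, view)
        view = view[written:]

# Demo: push a buffer out through the raw fd interface
path = os.path.join(tempfile.mkdtemp(), "out.bin")
fd = os.open(path, os.O_WRONLY | os.O_CREAT)
write_all(fd, b"x" * 100000)
os.close(fd)

The memoryview slice avoids copying the remaining bytes on each retry.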

answered Oct 11, 2014 at 19:06

user1277476user1277476

2,83111 silver badges9 bronze badges

In this code, you can indicate the path and name of the input/output files, and it will create the final big file in that path:

import os

dir_name = "Your_Desired_Folder/Goes_Here"    #path
input_files_names = ["File1.txt", "File2.txt", "File3.txt"]     #input files
file_name_out = "Big_File.txt"     #choose a name for the output file
file_output = os.path.join(dir_name, file_name_out)
fout = open(file_output, "w")

for tempfile in input_files_names:
    inputfile = os.path.join(dir_name, tempfile)
    fin = open(inputfile, 'r')
    for line in fin:
        fout.write(line)
    fin.close()

fout.close()

answered Sep 11, 2018 at 1:56

mah65mah65

5168 silver badges18 bronze badges

A simple and efficient way to copy data from multiple files into one big file. Before that, rename your files to sequential integers, e.g. 1.txt, 2.txt, 3.txt, etc. Code:

#Rename Files First

import os

path = 'directory_name'
files = os.listdir(path)
i = 1
for file in files:
    os.rename(os.path.join(path, file), os.path.join(path, str(i)+'.txt'))
    i = i+1

# Code for copying data from multiple files

import os

path = 'directory_name'
fout = open("output_filename", "a")

# Append the renamed files in numeric order: 1.txt, 2.txt, ...
for i in range(1, len(os.listdir(path)) + 1):
    f = open(os.path.join(path, "%s.txt" % i), 'r')
    for line in f:
        fout.write(line)
    f.close()

fout.close()

answered Oct 8, 2018 at 14:59

There's also the fileinput module in Python 3, which is perfect for this sort of situation.

answered Sep 12, 2019 at 9:39

MikeTheTallMikeTheTall

3,0064 gold badges28 silver badges39 bronze badges

1

I was solving a similar problem: combining multiple files in a folder into one big file, in the same folder, sorted by file modification time. Hints are in the comments in the code block.

from glob import glob
import os

# Folder is where files are stored 
# This is also where the big file will be stored
folder = r".\test_folder"
big_filename = r"connected.txt"

# Get all files except the big file and sort by last modified
all_files = glob(folder + "/*")
all_files = [fi for fi in all_files if big_filename not in fi]
all_files.sort(key=os.path.getmtime)

# Get content of each file and append it to a list
output_big_file = []
for one_file in all_files:
    with open(one_file, "r", encoding="utf-8") as f:
        output_big_file.append(f.read())

# Save list as a file
save_path = os.path.join(folder, big_filename)
with open(save_path, "w", encoding="utf-8") as f:
    f.write("\n".join(output_big_file))

answered Sep 14, 2021 at 8:37


1

Just change the target directory:

import os

d = "./output_dir"

# Open the output file once; skip it when iterating so we
# don't append the output file to itself
with open(d + '/' + "output.csv", "a") as f2:
    for name in os.listdir(d):
        if name == "output.csv":
            continue
        with open(d + '/' + name) as f:
            for line in f:
                f2.write(line)

answered Mar 19 at 20:11


ValknutValknut

9202 gold badges14 silver badges19 bronze badges
