Python 3 - Can pickle handle byte objects larger than 4GB?


Based on this comment and the referenced documentation, pickle protocol 4.0 in Python 3.4+ should be able to pickle byte objects larger than 4 GB.

However, using Python 3.4.3 or Python 3.5.0b2 on Mac OS X 10.10.4, I get an error when I try to pickle a large byte array:

>>> import pickle
>>> x = bytearray(8 * 1000 * 1000 * 1000)
>>> fp = open("x.dat", "wb")
>>> pickle.dump(x, fp, protocol=4)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
OSError: [Errno 22] Invalid argument

Is there a bug in my code, or am I misunderstanding the documentation?
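The error appears to come from the underlying file write rather than from pickle itself: on the affected OS X versions, a single write() call of 2 GB or more fails with EINVAL, which is what CPython issue 24658 tracks. A minimal check, as a sketch that bypasses pickle entirely (on an unaffected platform it simply writes the file):

# Assumption: on the affected OS X builds a single write() of 2 GB or more
# fails with OSError: [Errno 22] Invalid argument; elsewhere it just writes.
with open("big.dat", "wb") as fp:
    fp.write(bytes(2**31))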

Here is a simple workaround for issue 24658: use pickle.dumps and pickle.loads, and break the bytes object into chunks of size 2**31 - 1 when writing them to or reading them from the file.

import pickle
import os.path

file_path = "pkl.pkl"
n_bytes = 2**31
max_bytes = 2**31 - 1
data = bytearray(n_bytes)

## write: pickle in memory, then write the result in sub-2 GB chunks
bytes_out = pickle.dumps(data, protocol=4)
with open(file_path, 'wb') as f_out:
    # iterate over len(bytes_out), not n_bytes: the pickled output is
    # slightly longer than the raw data because of pickle overhead
    for idx in range(0, len(bytes_out), max_bytes):
        f_out.write(bytes_out[idx:idx + max_bytes])

## read: reassemble the pickled bytes in sub-2 GB chunks, then unpickle
bytes_in = bytearray(0)
input_size = os.path.getsize(file_path)
with open(file_path, 'rb') as f_in:
    for _ in range(0, input_size, max_bytes):
        bytes_in += f_in.read(max_bytes)
data2 = pickle.loads(bytes_in)

assert data == data2
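A variation on the same idea, offered as a sketch rather than a drop-in replacement: instead of building the entire pickled byte string with pickle.dumps, a small file-like wrapper (ChunkedWriter below is a made-up helper, not part of the standard library) can split every write that pickle issues into sub-2 GB pieces, so pickle.dump can write straight to the file without holding a second copy of the data in memory.

import pickle

max_bytes = 2**31 - 1  # largest single write that stays under the OS X limit

class ChunkedWriter:
    # Hypothetical helper: pickle.dump only needs a write() method,
    # so this forwards each write in slices no larger than max_bytes.
    def __init__(self, f):
        self._f = f

    def write(self, b):
        view = memoryview(b)
        for idx in range(0, len(view), max_bytes):
            self._f.write(view[idx:idx + max_bytes])
        return len(view)

data = bytearray(2**31)
with open("pkl.pkl", "wb") as f_out:
    pickle.dump(data, ChunkedWriter(f_out), protocol=4)

Reading back can be handled the same way as in the snippet above: reassemble the pickled bytes with chunked read() calls and pass them to pickle.loads, or wrap read() and readline() similarly for pickle.load.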
