Out of Memory error

Asked by Jack Ellsworth IV

I received an out-of-memory error when trying to commit. I did not have any large files or a large number of files; I even tried to commit just one small file and still received the same error. Here is the log from that commit. Any help would be greatly appreciated.

Mon 2010-09-20 14:27:52 -0400
0.172 bazaar version: 2.1.1
0.172 bzr arguments: [u'qsubprocess', u'--bencode', u'l6:commit2:-m41:Added read in and process FPGA output.sce16:Output Plot Data23:toolbox/DSP toolbox.sci21:Create DSP Signal.sce20:Create TB Signal.sce46:DSP polyphase filter Coefficient Formating.sce17:Data playback.sce37:FPGA filter Coefficient Formating.sce30:Transmitter Design Testing.sce32:filter Coefficient Formating.sce15:lpf1000_48k.txt12:lpf33_1k.txt12:lpf33_2k.txt14:lpf500_48k.txt23:read in FPGA output.sce20:read in and plot.sce35:read in and process FPGA output.sce50:Output Plot Data/3100 Hz demod low level FPGA2.txt34:Output Plot Data/lpf 6000 FPGA.txt30:Output Plot Data/mult_imag.txt30:Output Plot Data/mult_real.txte']
0.172 looking for plugins in D:/Documents and Settings/jaellswo/Application Data/bazaar/2.0/plugins
0.172 looking for plugins in C:/Program Files/Bazaar/plugins
0.281 encoding stdout as osutils.get_user_encoding() 'cp1252'
0.344 bazaar version: 2.1.1
0.344 bzr arguments: [u'commit', u'-m', u'Added read in and process FPGA output.sce', u'Output Plot Data', u'toolbox/DSP toolbox.sci', u'Create DSP Signal.sce', u'Create TB Signal.sce', u'DSP polyphase filter Coefficient Formating.sce', u'Data playback.sce', u'FPGA filter Coefficient Formating.sce', u'Transmitter Design Testing.sce', u'filter Coefficient Formating.sce', u'lpf1000_48k.txt', u'lpf33_1k.txt', u'lpf33_2k.txt', u'lpf500_48k.txt', u'read in FPGA output.sce', u'read in and plot.sce', u'read in and process FPGA output.sce', u'Output Plot Data/3100 Hz demod low level FPGA2.txt', u'Output Plot Data/lpf 6000 FPGA.txt', u'Output Plot Data/mult_imag.txt', u'Output Plot Data/mult_real.txt']
0.344 encoding stdout as osutils.get_user_encoding() 'cp1252'
0.406 opening working tree 'D:/Working/Scilab'
1.343 preparing to commit
[ 5764] 2010-09-20 14:27:53.688 INFO: Committing to: T:/PROD_DEV/Repo/Products/AFTC5/bzr/source/dev/Scilab/
1.687 Selecting files for commit with filter [u'Create DSP Signal.sce', u'Create TB Signal.sce', u'DSP polyphase filter Coefficient Formating.sce', u'Data playback.sce', u'FPGA filter Coefficient Formating.sce', u'Output Plot Data', u'Transmitter Design Testing.sce', u'filter Coefficient Formating.sce', u'lpf1000_48k.txt', u'lpf33_1k.txt', u'lpf33_2k.txt', u'lpf500_48k.txt', u'read in FPGA output.sce', u'read in and plot.sce', u'read in and process FPGA output.sce', u'toolbox/DSP toolbox.sci']
[ 5764] 2010-09-20 14:27:53.720 INFO: added lpf33_2k.txt
[ 5764] 2010-09-20 14:27:53.720 INFO: added read in and process FPGA output.sce
[ 5764] 2010-09-20 14:27:53.736 INFO: modified FPGA filter Coefficient Formating.sce
[ 5764] 2010-09-20 14:27:53.736 INFO: added Transmitter Design Testing.sce
[ 5764] 2010-09-20 14:27:53.736 INFO: modified Create TB Signal.sce
[ 5764] 2010-09-20 14:27:53.736 INFO: modified read in FPGA output.sce
[ 5764] 2010-09-20 14:27:53.736 INFO: added DSP polyphase filter Coefficient Formating.sce
[ 5764] 2010-09-20 14:27:53.752 INFO: added Output Plot Data/3100 Hz demod low level FPGA2.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: added Output Plot Data/lpf 6000 FPGA.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: added Output Plot Data/mult_imag.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: added Output Plot Data/mult_real.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: modified toolbox/DSP toolbox.sci
[ 5764] 2010-09-20 14:27:53.752 INFO: modified Data playback.sce
[ 5764] 2010-09-20 14:27:53.752 INFO: added lpf33_1k.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: modified read in and plot.sce
[ 5764] 2010-09-20 14:27:53.752 INFO: modified filter Coefficient Formating.sce
[ 5764] 2010-09-20 14:27:53.752 INFO: added lpf500_48k.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: added lpf1000_48k.txt
[ 5764] 2010-09-20 14:27:53.752 INFO: modified Create DSP Signal.sce
1.953 Auto-packing repository GCRepositoryPackCollection(CHKInventoryRepository('file:///D:/Working/Scilab/.bzr/repository/')), which has 8 pack files, containing 30 revisions. Packing 7 files into 1 affecting 7 revisions
1.953 repacking 7 revisions
2.046 repacking 7 inventories
2.062 repacking chk: 7 id_to_entry roots, 7 p_id_map roots, 13 total keys
2.093 repacking 66 texts
3.874 Adding the key (<bzrlib.btree_index.BTreeGraphIndex object at 0x00B4B290>, 132669091, 137021513) to an LRUSizeCache failed. value 300101948 is too big to fit in a the cache with size 41943040 52428800
13.465 Adding the key (<bzrlib.btree_index.BTreeGraphIndex object at 0x00B4B650>, 132629801, 137021513) to an LRUSizeCache failed. value 300101948 is too big to fit in a the cache with size 41943040 52428800
17.823 Adding the key (<bzrlib.btree_index.BTreeGraphIndex object at 0x00B4B290>, 65, 132627187) to an LRUSizeCache failed. value 295707666 is too big to fit in a the cache with size 41943040 52428800
49.004 aborting commit write group because of exception:
49.004 Traceback (most recent call last):
  File "bzrlib\commit.pyo", line 402, in _commit
  File "bzrlib\repository.pyo", line 179, in commit
  File "bzrlib\repository.pyo", line 1563, in commit_write_group
  File "bzrlib\repofmt\pack_repo.pyo", line 2314, in _commit_write_group
  File "bzrlib\repofmt\pack_repo.pyo", line 2174, in _commit_write_group
  File "bzrlib\repofmt\pack_repo.pyo", line 1476, in autopack
  File "bzrlib\repofmt\pack_repo.pyo", line 1516, in _do_autopack
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 693, in _execute_pack_operations
  File "bzrlib\repofmt\pack_repo.pyo", line 761, in pack
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 478, in _create_pack_from_packs
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 461, in _copy_text_texts
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 402, in _copy_stream
  File "bzrlib\groupcompress.pyo", line 1739, in _insert_record_stream
  File "bzrlib\groupcompress.pyo", line 1633, in flush
  File "bzrlib\groupcompress.pyo", line 310, in to_bytes
  File "bzrlib\groupcompress.pyo", line 303, in _create_z_content
  File "bzrlib\groupcompress.pyo", line 291, in _create_z_content_from_chunks
MemoryError

[ 5764] 2010-09-20 14:28:41.006 INFO: aborting commit write group: MemoryError()
49.066 Traceback (most recent call last):
  File "bzrlib\commands.pyo", line 853, in exception_to_return_code
  File "bzrlib\commands.pyo", line 1055, in run_bzr
  File "bzrlib\commands.pyo", line 661, in run_argv_aliases
  File "bzrlib\commands.pyo", line 665, in run_direct
  File "bzrlib\cleanup.pyo", line 122, in run_simple
  File "bzrlib\cleanup.pyo", line 156, in _do_with_cleanups
  File "C:/Program Files/Bazaar/plugins\qbzr\lib\commands.py", line 788, in run
  File "C:/Program Files/Bazaar/plugins\qbzr\lib\subprocess.py", line 786, in run_subprocess_command
  File "bzrlib\commands.pyo", line 1055, in run_bzr
  File "bzrlib\commands.pyo", line 661, in run_argv_aliases
  File "bzrlib\commands.pyo", line 665, in run_direct
  File "bzrlib\cleanup.pyo", line 122, in run_simple
  File "bzrlib\cleanup.pyo", line 156, in _do_with_cleanups
  File "bzrlib\builtins.pyo", line 3138, in run
  File "bzrlib\decorators.pyo", line 194, in write_locked
  File "bzrlib\workingtree_4.pyo", line 197, in commit
  File "bzrlib\decorators.pyo", line 194, in write_locked
  File "bzrlib\mutabletree.pyo", line 225, in commit
  File "bzrlib\commit.pyo", line 257, in commit
  File "bzrlib\cleanup.pyo", line 118, in run
  File "bzrlib\cleanup.pyo", line 156, in _do_with_cleanups
  File "bzrlib\commit.pyo", line 402, in _commit
  File "bzrlib\repository.pyo", line 179, in commit
  File "bzrlib\repository.pyo", line 1563, in commit_write_group
  File "bzrlib\repofmt\pack_repo.pyo", line 2314, in _commit_write_group
  File "bzrlib\repofmt\pack_repo.pyo", line 2174, in _commit_write_group
  File "bzrlib\repofmt\pack_repo.pyo", line 1476, in autopack
  File "bzrlib\repofmt\pack_repo.pyo", line 1516, in _do_autopack
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 693, in _execute_pack_operations
  File "bzrlib\repofmt\pack_repo.pyo", line 761, in pack
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 478, in _create_pack_from_packs
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 461, in _copy_text_texts
  File "bzrlib\repofmt\groupcompress_repo.pyo", line 402, in _copy_stream
  File "bzrlib\groupcompress.pyo", line 1739, in _insert_record_stream
  File "bzrlib\groupcompress.pyo", line 1633, in flush
  File "bzrlib\groupcompress.pyo", line 310, in to_bytes
  File "bzrlib\groupcompress.pyo", line 303, in _create_z_content
  File "bzrlib\groupcompress.pyo", line 291, in _create_z_content_from_chunks
MemoryError

49.066 Transferred: 0KiB (0.0K/s r:0K w:0K)
49.066 return code 3

Question information

Language: English
Status: Solved
For: Bazaar
Assignee: No assignee
Solved by: Jack Ellsworth IV

John A Meinel (jameinel) said:
#1

So, first off: this isn't failing because of the new commit you are
generating. It is failing because we are trying to automatically repack
the repository where you already have data. So while the current commit
isn't particularly big, you have earlier commits that are.
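
The same repack can also be run on its own (a full repack, rather than the
partial autopack shown in the log), which makes the failure reproducible
without a pending commit. A minimal sketch of the equivalent bzrlib calls,
assuming Branch.open and Repository.pack() (roughly what the 'bzr pack'
command does):

from bzrlib import branch

# Illustrative reproduction sketch only: open the local branch and repack
# its repository directly.  Repository.pack() takes its own write lock,
# so no explicit locking is needed here.
b = branch.Branch.open('D:/Working/Scilab')
b.repository.pack()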

...
> 3.874 Adding the key (<bzrlib.btree_index.BTreeGraphIndex object at 0x00B4B290>, 132669091, 137021513) to an LRUSizeCache failed. value 300101948 is too big to fit in a the cache with size 41943040 52428800
> 13.465 Adding the key (<bzrlib.btree_index.BTreeGraphIndex object at 0x00B4B650>, 132629801, 137021513) to an LRUSizeCache failed. value 300101948 is too big to fit in a the cache with size 41943040 52428800
> 17.823 Adding the key (<bzrlib.btree_index.BTreeGraphIndex object at 0x00B4B290>, 65, 132627187) to an LRUSizeCache failed. value 295707666 is too big to fit in a the cache with size 41943040 52428800

These are a bit worrying. They indicate that you have a roughly 4MB
block (137021513 - 132669091 = 4.15MB) which is then expanding to over
300MB in memory. I don't know what your content is, but something is
*highly* compressible.
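
To spell out the arithmetic (using the offsets and sizes straight from the
quoted log lines):

# Illustrative arithmetic only, with the numbers from the log above.
compressed = 137021513 - 132669091    # ~4.35 million bytes (~4.15 MiB) on disk
expanded = 300101948                  # bytes once decompressed
print expanded / (1024.0 * 1024.0)    # about 286 MiB held in memory
print expanded / float(compressed)    # roughly a 69:1 expansion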

But it also hints that we probably have a few hundred MB of content that
we are computing a delta for.

Interestingly, we don't seem to be failing where I would expect, but
instead we are failing inside of 'zlib.compress()' (we call
map(compressor.compress, content), which doesn't show up in the traceback).

That isn't the place I would have thought we would consume the most
memory. Though if the content isn't very compressible at that point, we
can go up as high as 2x the size of a given content (one copy for the
original text, one for the not-really-compressed text). Maybe even a
little higher if it is reallocating a buffer.
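
For illustration, the general shape of that code path is something like the
sketch below (not bzrlib's actual implementation): the uncompressed chunks
and the compressed output are alive at the same time, which is where the
~2x figure comes from.

import zlib

def compress_chunks(chunks):
    # Sketch only -- not bzrlib's real code.  The original chunks and the
    # compressed output coexist in memory, so for data that barely
    # compresses, peak usage approaches twice the content size.
    compressor = zlib.compressobj()
    z_chunks = map(compressor.compress, chunks)
    z_chunks.append(compressor.flush())
    return ''.join(z_chunks)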

At the moment we do have an open bug about delta compression taking up
too much memory (something like 5-6x the largest content). I think we
can get that down in a variety of ways (mostly, our core data structure
for the compression isn't optimized for being updated, and we update as
we go).

If you just want to poke at it, you could do something like this:
=== modified file 'bzrlib/diff-delta.c'
--- bzrlib/diff-delta.c 2010-08-02 16:35:11 +0000
+++ bzrlib/diff-delta.c 2010-09-20 20:19:23 +0000
@@ -34,7 +34,7 @@
  * for more data. Tweaking this number above 4 doesn't seem to help much,
  * anyway.
  */
-#define EXTRA_NULLS 4
+#define EXTRA_NULLS 2

It would require recompiling afterward, and depending on the data, it may
not actually help. But it *is* a cheap first step. (The number 4 was
settled on at the time as the 'best performance', but I wasn't really
tweaking for peak memory consumption.)

John
=:->

Jack Ellsworth IV (jack-ellsworth-iv) said:
#2

I updated to version 2.2 and I was able to complete the commit.