id	summary	reporter	owner	description	type	status	priority	milestone	component	version	resolution	keywords	cc	launchpad_bug
1045	Memory leak during massive file upload	francois	somebody	"Today, I copied about 12,000 files, totalling roughly 52 GB, into Tahoe via the SFTP frontend.

Here's what {{{top}}} has to say about the Tahoe process after this operation.
{{{
 2765 francois  20   0 2059m 1.5g 2472 D    2 75.4 527:08.83 python
}}}

I will update this ticket as soon as I can gather more details.

David-Sarah Hopwood proposed repeating the same test via the WAPI to help locate the leak.

Here is what Brian Warner proposed on IRC:

> keep track of the process size vs time, with munin or a script that saves values and then graph them with gnuplot or something
> I think tahoe's /stats WAPI will give you process-memory-size info
> the idea is to do some operation repeatedly, measure process-space change while that's running, then switch to some other operation and measure that slope, and look for differences
> 'cp' to an SFTP-mounted FUSEish thing, vs 'tahoe cp' might be a good comparison"	defect	new	major	undecided	code	1.6.1		performance reliability upload memory sftp		
