[sane-devel] memory problem

Gerhard Jaeger gerhard@gjaeger.de
Wed, 25 Aug 2004 18:30:17 +0200


Hi,

as long as you run this stuff on Linux, I suggest using the process
model and NOT the pthread model; the pthread implementation on
Linux is somewhat crappy.
In any case, it should not happen that all of your memory gets used
up, so I'm pretty sure you missed something obvious.
What about linking the libs statically and running the backend under
valgrind or efence?

Ciao,
  Gerhard

On Wednesday 25 August 2004 17:49, Oliver Schirrmeister wrote:
> Hi,
>
> I'm trying to find a memory leak in the fujitsu backend.
> When I do a duplex scan (for example with scanadf), the VM size
> of the process grows by approximately the size of one image per
> sheet (~4MB for DIN-A5 gray, ~8MB for DIN-A4 gray at 300dpi).
> So after roughly 250 sheets the process hits the 2GB limit and
> I run into problems.
>
> I've configured sane to use pthreads, and I use sanei_thread_begin
> to start a reader thread that reads the data from the scanner and
> writes it into two pipes (one for the front side and one for the back side).
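>
> In outline, the setup looks like this (simplified sketch; the struct
> and variable names here are made up, only the sanei_thread calls are
> the real API):
>
>   #include <unistd.h>
>   #include "../include/sane/sanei_thread.h"
>
>   /* illustrative per-scan state, not the real backend struct */
>   struct scan_state {
>     int front_fd;    /* write end of the front-side pipe */
>     int back_fd;     /* write end of the back-side pipe  */
>     int reader_pid;  /* pid (or thread id) from sanei_thread_begin */
>   };
>
>   static int reader_func (void *arg)
>   {
>     struct scan_state *s = arg;
>     /* ... read blocks from the scanner, write them to
>        s->front_fd / s->back_fd ... */
>     close (s->front_fd);
>     close (s->back_fd);
>     return 0;  /* becomes the exit status in the process model */
>   }
>
>   static int start_reader (struct scan_state *s)
>   {
>     /* sanei_thread_init() has been called once in sane_init() */
>     s->reader_pid = sanei_thread_begin (reader_func, s);
>     return s->reader_pid;
>   }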
>
> The scanner returns the image data in alternating order: one block
> front side, one block back side, and so on.
> So I write the front-side data directly to the front-side pipe and
> collect the back-side data in a buffer the size of the whole image.
> When the transfer is complete, I write that buffer into the
> back-side pipe.
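>
> Roughly, the per-block logic is this (heavily simplified; buffer and
> fd names are invented, error checking omitted):
>
>   #include <stdlib.h>
>   #include <string.h>
>   #include <unistd.h>
>
>   /* malloc'ed to the full image size before the scan starts */
>   static unsigned char *back_buf;
>   static size_t back_off;
>
>   static void handle_block (const unsigned char *data, size_t len,
>                             int is_front, int front_fd)
>   {
>     if (is_front)
>       write (front_fd, data, len);   /* front side goes out directly */
>     else {
>       memcpy (back_buf + back_off, data, len);  /* back side buffered */
>       back_off += len;
>     }
>   }
>
>   /* called once after the last block of the sheet */
>   static void flush_back_side (int back_fd)
>   {
>     write (back_fd, back_buf, back_off);
>     free (back_buf);   /* same pointer that malloc() returned */
>     back_buf = NULL;
>     back_off = 0;
>   }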
> I've checked it several times: I really am freeing that buffer, and
> the address I pass to free() is really the address I got from malloc().
>
> If I don't use a pipe for the back side but write to a file instead
> (so I don't have to allocate that buffer), there is no problem.
>
> When I don't use threads (--enable-fork-process=YES) there is no
> problem, because the reader process terminates and frees all of its
> memory on exit. So I think the problem is in the reader thread.
>
> I'm not killing the reader thread; I assume it terminates on its own
> when the reader procedure returns.
>
> Any suggestions are welcome.
>
> Oliver