More than 1 million, fewer than 15 million packets, depending on the file...
Nothing on our network believes in using a packet larger than 384 bits (as
opposed to up to 1500).
I can't think of how you'd parallelize it easily either, but I was hoping
you might have come up with a way. ;)
So it's pretty much what I was guessing; time to get a bigger/faster box. :)
Thanks,
Chris
> -----Original Message-----
> From: Guy Harris [mailto:guy@xxxxxxxxxx]
> Sent: Thursday, April 04, 2002 5:07 PM
> To: Chris Robertson
> Cc: 'ethereal-users@xxxxxxxxxxxx'
> Subject: Re: [Ethereal-users] Large memory footprint
>
>
> On Thu, Apr 04, 2002 at 04:44:32PM -0800, Chris Robertson wrote:
> > A quick question: I'm loading fairly large capture files (created
> > with tcpdump -w <input.file>, between 100MB-400MB, up to millions
> > of packets) into Ethereal and I'm getting a memory footprint of
> > 600MB to just under 1GB. Is this normal?
>
> Given that the GTK+ widget Ethereal uses to display the list of
> packets allocates a string for *every* column in *every* row, the
> memory footprint could be large. That'd probably take at least 35 or
> so bytes per packet for starters.
>
> That widget also allocates a data structure per row, and Ethereal
> also allocates a data structure per packet - possibly more than one,
> in order to keep state information needed for some protocols.
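>
> For illustration, a hypothetical per-packet record (the field names
> and layout here are invented, not Ethereal's actual structures):
>
>     #include <stdint.h>
>     #include <sys/time.h>
>
>     /* Invented example of per-packet bookkeeping; a capture tool
>        keeps something comparable for every frame it has read. */
>     typedef struct packet_rec {
>         struct packet_rec *next;     /* next packet in the capture */
>         long               file_off; /* offset of the frame on disk */
>         uint32_t           num;      /* packet number */
>         uint32_t           cap_len;  /* bytes actually captured */
>         struct timeval     abs_ts;   /* arrival timestamp */
>     } packet_rec;
>
> Even at a few tens of bytes per record, millions of packets means
> hundreds of megabytes on top of the column strings.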
>
> When you say "up to millions of packets", how many are "millions"?
> 1 million packets? 10 million? More?
>
> > Also, Ethereal is single threaded, correct?
>
> It's single-threaded except when you do an "Update list of packets in
> real time" capture, in which case there's one process doing the
> capturing and another process updating the display.
>
> They're not "threads" in the sense of, say, pthreads, but they could
> run on different CPUs.
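>
> A minimal sketch of that two-process split using fork() - the actual
> Ethereal code is more involved, and the tcpdump invocation below is
> just a stand-in for the real capture child:
>
>     #include <stdio.h>
>     #include <unistd.h>
>     #include <sys/types.h>
>     #include <sys/wait.h>
>
>     int main(void)
>     {
>         pid_t pid = fork();
>         if (pid == 0) {
>             /* child: run a capture, writing packets to a file */
>             execlp("tcpdump", "tcpdump", "-w", "capture.out",
>                    (char *)NULL);
>             _exit(1);  /* only reached if exec fails */
>         }
>         /* parent: would periodically read the growing file and
>            update the packet list; here it just waits */
>         waitpid(pid, NULL, 0);
>         return 0;
>     }
>
> Because they're separate processes, a multiprocessor can schedule the
> capture and the display update on different CPUs.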
>
> Nothing else uses more than one thread of control; I'm not certain
> that there's anything that could make good use of multiple CPUs or,
> even if there is, that it wouldn't involve a considerable amount of
> work to parallelize.
>
> _______________________________________________
> Ethereal-users mailing list
> Ethereal-users@xxxxxxxxxxxx
> http://www.ethereal.com/mailman/listinfo/ethereal-users
>