Ethereal-users: [Ethereal-users] Tethereal analyzing a large capture file crashes...

Note: This archive is from the project's previous web site, ethereal.com. This list is no longer active.

From: "Joel R. Helgeson" <joel@xxxxxxxxxxxx>
Date: Mon, 7 Mar 2005 15:44:21 -0600
I have a 24-hour packet capture that I did at a customer's network.  I only captured the 68-byte headers, and the file size came out to a mere 40 GB.
 
I'd now like to analyze the file using tethereal; I'd like to give them a graph showing kbps at 5-minute intervals:
tethereal -r capture.cap -z io,stat,300 -q >io_stat_results.txt
 
Running that command, or any other similar analysis, just causes my machine to process the file for a while and then bomb out.  I'm running the analysis on a Windows XP SP2 machine with 2 GB of RAM and 400 GB of disk space; the processor is a 3.0 GHz Intel.  What seems to happen is that tethereal uses up RAM and swap, up to a total of 2 GB used, then it crashes and states that it failed to allocate xxxx bytes of memory...
 
Can anyone help me? Is there any way I can run reports on a file this huge without tethereal bombing out on me?
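One workaround I've been wondering about (just a sketch, untested, assuming the editcap utility that ships with Ethereal is on the PATH and that the chunk file names match the glob below) would be to split the capture into smaller pieces and run the io,stat pass on each piece, so tethereal only ever holds one chunk's state in memory:

```shell
# Sketch (untested): split the 40 GB capture into 1,000,000-packet chunks,
# then run the same io,stat analysis over each chunk in turn.
editcap -c 1000000 capture.cap chunk.cap

# editcap's exact output naming may differ (e.g. chunk_00000.cap),
# so the glob below may need adjusting to match.
for f in chunk*.cap; do
    tethereal -r "$f" -z io,stat,300 -q >> io_stat_results.txt
done
```

The 5-minute intervals that straddle a chunk boundary would of course be split across two reports and would have to be summed by hand afterwards.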
 
Any ideas?
 
Joel