Ethereal-dev: Re: [Ethereal-dev] When are packets dissected?

Note: This archive is from the project's previous web site, ethereal.com. This list is no longer active.

From: Guy Harris <gharris@xxxxxxxxx>
Date: Sat, 1 Jun 2002 13:35:22 -0700
On Sun, Jun 02, 2002 at 10:17:47AM -0700, Chris Waters wrote:
> I am trying to understand at what point Ethereal dissects packets. From the
> README.developer it sounds like a packet is only dissected when it becomes
> the selected packet and needs to be displayed in the tree view.

No.  If README.developer implies that, it needs to be fixed.

Ethereal dissects each frame when the capture file is read in.  A
packet is also dissected when:

	it's selected and displayed;

	it's printed;

	it's processed by a display filter;

	it's processed by the "Protocol Hierarchy Statistics" code.
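The operations above can be sketched as calls that all funnel through one dissection routine, with no dissection saved between them.  This is a minimal illustration; the type and function names are invented, not Ethereal's actual API:

```c
#include <stdbool.h>

/* Hypothetical sketch: every operation listed above runs the same
   dissection routine; none of them reuses a saved dissection.
   Names are illustrative, not Ethereal's actual API. */

typedef struct {
    const unsigned char *data;
    int len;
    bool build_tree;          /* only some callers need a protocol tree */
} frame_t;

static int dissect_count = 0;

static void dissect_frame(frame_t *f, bool build_tree)
{
    f->build_tree = build_tree;
    /* ...the registered dissectors would run here... */
    dissect_count++;
}

/* Run one frame through all five operations listed above. */
static int run_all_operations(frame_t *f)
{
    dissect_count = 0;
    dissect_frame(f, false);  /* file read-in (tree only if needed)  */
    dissect_frame(f, true);   /* selected and displayed              */
    dissect_frame(f, true);   /* printed                             */
    dissect_frame(f, false);  /* display filter pass                 */
    dissect_frame(f, false);  /* protocol hierarchy statistics       */
    return dissect_count;     /* dissected once per operation        */
}
```

The point of the sketch is that the same packet can be dissected many times over its lifetime, once per operation that needs it.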

However:

> This seems
> like the most memory efficient scheme because you would only need enough
> memory to hold the data for all of  the packets that have been captured, and
> not all of the data generated by the dissectors as well.

	1) the protocol tree is not necessarily generated in some of
	   those cases (it's generated when the file is read in only if
	   a "read filter" is being used or the display is being
	   colored);

	2) the protocol tree is not saved after Ethereal is finished
	   processing the frame (that would, indeed, significantly
	   increase Ethereal's memory usage).
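Points 1) and 2) together amount to "build the tree only on demand, and free it immediately afterwards".  A minimal sketch, with invented names standing in for Ethereal's real structures:

```c
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical sketch of points 1) and 2): the protocol tree is built
   only when a caller needs it, and is freed as soon as the frame has
   been processed.  Names are illustrative, not Ethereal's real API. */

typedef struct proto_tree {
    int num_fields;               /* stand-in for the real field nodes */
} proto_tree;

static proto_tree *proto_tree_create(void)
{
    proto_tree *t = malloc(sizeof *t);
    t->num_fields = 0;
    return t;
}

static void proto_tree_free(proto_tree *t)
{
    free(t);
}

/* Dissect one frame; build a tree only if the caller asked for one. */
static proto_tree *dissect_frame(bool need_tree)
{
    proto_tree *tree = need_tree ? proto_tree_create() : NULL;
    /* ...dissectors would add fields to `tree` here, if non-NULL... */
    return tree;
}

/* Read-in path: a tree is needed only for a read filter or coloring.
   Returns whether a tree was actually built. */
static bool process_frame(bool have_read_filter, bool coloring)
{
    bool need_tree = have_read_filter || coloring;
    proto_tree *tree = dissect_frame(need_tree);
    bool built = (tree != NULL);
    proto_tree_free(tree);        /* never kept after processing */
    return built;
}
```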

Note also that the raw packet data is not kept in memory - it's read
from the capture file as needed.
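In other words, only a small per-frame record (offset and length) has to stay in memory, and the bytes are re-read when needed.  A sketch of that idea, with hypothetical structures rather than Ethereal's actual ones:

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch of on-demand packet data access: only a per-frame record of
   (file offset, captured length) lives in memory; the bytes themselves
   are re-read from the capture file whenever the frame is dissected
   again.  Hypothetical structures, not Ethereal's actual ones. */

typedef struct {
    long file_off;   /* where this frame's bytes start in the file */
    int  cap_len;    /* number of bytes captured for this frame    */
} frame_record;

/* Re-read one frame's raw data from the capture file.
   The caller frees the returned buffer after dissection. */
static unsigned char *read_frame_data(FILE *fh, const frame_record *fr)
{
    unsigned char *buf = malloc(fr->cap_len);
    if (buf == NULL)
        return NULL;
    if (fseek(fh, fr->file_off, SEEK_SET) != 0 ||
        fread(buf, 1, fr->cap_len, fh) != (size_t)fr->cap_len) {
        free(buf);
        return NULL;
    }
    return buf;
}
```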

> If this is the case, how does statistics collection and filtering
> work?

If by "statistics collection" you mean the stuff done by "Protocol
Hierarchy Statistics" under the "Tools" menu, then the answer for both
statistics and filtering is "they work because it's not the case." :-)

> Don't you need to have dissected a packet to apply a filter because the
> filters rely on the fields extracted by the dissectors? I guess that each
> packet could be dissected when it is captured, tested against the filter and
> then the dissection freed.

The packet is dissected when it's read, not when it's captured, but that
is exactly how it works.
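That dissect-test-free loop can be sketched as follows; the "dissector" and "filter" here are trivial stand-ins, not real ones:

```c
#include <stdbool.h>

/* Sketch of the read-time filtering loop described above: each frame
   is dissected, tested against the filter, and its dissection dropped
   before the next frame is touched.  All names are hypothetical. */

typedef struct { int proto_id; } dissection;

/* Trivial stand-in "dissector": even-numbered frames are protocol 1. */
static dissection dissect(int frame_num)
{
    dissection d = { frame_num % 2 == 0 ? 1 : 2 };
    return d;
}

/* Stand-in display filter: "keep protocol 1". */
static bool filter_matches(const dissection *d)
{
    return d->proto_id == 1;
}

/* Count frames passing the filter; no dissection survives the loop
   body, so memory use stays flat no matter how many frames there are. */
static int count_matches(int num_frames)
{
    int matches = 0;
    for (int i = 0; i < num_frames; i++) {
        dissection d = dissect(i);        /* dissect      */
        if (filter_matches(&d))           /* test filter  */
            matches++;
        /* d goes out of scope here:         free         */
    }
    return matches;
}
```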

> The reason for these questions is that I have got my Windows GUI for
> Ethereal working well, but it is taking far too much memory. The strategy
> that I had adopted is that each packet is fully dissected when it is
> received, but I wonder now if that was a good idea.

The Windows GUI should be using the code in "file.c", and thus should be
doing what the GTK+ GUI does.  Unfortunately, "file.c" *currently* has a
bunch of GTK+ calls in it, so you'd have to use #ifdefs to control
whether it uses the native Windows or GTK+ routines; at some point we
should remove all the GTK+ calls from "file.c" and have it call routines
in a GUI library to do all its work.
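The interim #ifdef arrangement could look something like this; the routine names and the USE_WIN32_GUI symbol are invented for illustration, and the stubs stand in for real toolkit calls:

```c
/* Sketch of the interim #ifdef approach for "file.c": choose native
   Windows or GTK+ GUI routines at compile time.  All names here are
   invented stubs, not real toolkit or Ethereal routines. */

static const char *last_toolkit = "none";

static void win32_statusbar_set(const char *msg)   /* hypothetical */
{
    last_toolkit = "win32";
    (void)msg;
}

static void gtk_statusbar_set(const char *msg)     /* hypothetical */
{
    last_toolkit = "gtk";
    (void)msg;
}

/* The one call site in "file.c" compiles against either toolkit. */
static void cf_set_status(const char *msg)
{
#ifdef USE_WIN32_GUI
    win32_statusbar_set(msg);
#else
    gtk_statusbar_set(msg);
#endif
}
```

The longer-term fix suggested above is cleaner: replace the #ifdefs with calls into a GUI-neutral library, so "file.c" never mentions a toolkit at all.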