Ethereal-dev: Re: [Ethereal-dev] problems on windows

Note: This archive is from the project's previous web site, ethereal.com. This list is no longer active.

From: Jaap Keuter <jaap.keuter@xxxxxxxxx>
Date: Sun, 19 Mar 2006 10:03:12 +0100 (CET)
Hi,

Issue 1 looks to me like you have decoding logic inside the tree
presentation code. Let me explain with a code sample.

if (tree) {
	operation = tvb_get_guint8(tvb, offset);

	proto_tree_add_item(tree, hf_operation, tvb, offset, 1, FALSE);

	if (operation == PROTO_MDI)
		call_subdissector(tvb_new_subset(tvb, offset, -1, -1));
}

Now guess what happens when your dissector is called with tree=NULL.
Whether or not tree is NULL depends on all kinds of things. Normally the
packet list is built by calling the dissector with tree=NULL, but when,
for instance, a display filter is applied or coloring rules match, tree is set.
The above code should be split like so:

if (tree) {
	proto_tree_add_item(tree, hf_operation, tvb, offset, 1, FALSE);
}

operation = tvb_get_guint8(tvb, offset);

if (operation == PROTO_MDI)
	call_subdissector(tvb_new_subset(tvb, offset, -1, -1));
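The effect of the split can be seen in a standalone sketch. The snippet below
does not use the real Ethereal/Wireshark APIs (buffer access and the
call_subdissector/dissect_* names are stand-ins I've made up to model the
control flow); it just shows that with the buggy pattern the subdissector is
never reached on the tree=NULL pass, while the split pattern always decodes:

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

#define PROTO_MDI 0x01

static bool subdissector_called;

/* Stand-in for handing off to the MDI subdissector. */
static void call_subdissector(void)
{
	subdissector_called = true;
}

/* Buggy pattern: decoding logic lives inside the tree branch,
   so nothing is decoded when tree == NULL. */
static void dissect_buggy(const unsigned char *buf, void *tree)
{
	if (tree) {
		unsigned char operation = buf[0]; /* models tvb_get_guint8() */
		if (operation == PROTO_MDI)
			call_subdissector();
	}
}

/* Split pattern: only presentation is guarded by tree;
   decoding and subdissector hand-off always run. */
static void dissect_split(const unsigned char *buf, void *tree)
{
	if (tree) {
		/* proto_tree_add_item(...) would go here */
	}
	unsigned char operation = buf[0];
	if (operation == PROTO_MDI)
		call_subdissector();
}
```

Calling dissect_buggy() with tree=NULL (as happens when the packet list is
built) never reaches the subdissector, which is exactly the kind of
inconsistent display described in issue 1; dissect_split() behaves the same
on both passes.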

Hope it helps,
Jaap


On Sat, 18 Mar 2006, Julian Cable wrote:

> Hi,
>
> I have a couple of problems with the dissectors I've written. Any ideas
> would be really helpful.
>
> 1) I get different displays on windows and linux. Specifically this
> file http://217.35.80.115/drm/example.pcap displays "MDI" in the
> protocol column on linux, but a different, lower level decode "DCP-TP"
> on windows. If I select a display filter including an MDI layer field
> (eg "MDI") then the windows display changes to be more like the linux
> one.
>