Title: Preventing malformed packet errors with a dissector when desegmentation is turned off
I have a dissector for an application-layer protocol that runs on top of
TCP and is registered on a well-known TCP port.
\code
guint
get_xyz_pdu_len(packet_info *pinfo, tvbuff_t *tvb, int offset)
{
    guint32 plen;

    /* Get the length of the TCP XYZ payload. */
    plen = tvb_get_ntohl(tvb, offset);

    /* The TCP XYZ payload length value does not include itself,
       so add that in. */
    return plen + 4;
}

void
dissect_xyz_tcp(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree)
{
    tcp_dissect_pdus(tvb, pinfo, tree,
                     desegment_xyz_messages, TCP_XYZ_PAYLOAD_HDR_LENGTH,
                     get_xyz_pdu_len, dissect_xyz_tcp_pdu);
}

static void
dissect_xyz_tcp_pdu(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree)
{
    ...
}

void
proto_reg_handoff_xyz(void)
{
    xyz_tcp_handle = create_dissector_handle(dissect_xyz_tcp, proto_xyz);
    dissector_add_uint("tcp.port", WKS_TCP_PORT, xyz_tcp_handle);
    heur_dissector_add("udp", dissect_xyz_udp_heur, proto_xyz);
}
\endcode
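For context, 'desegment_xyz_messages' is an ordinary boolean protocol
preference; I have left its registration out of the snippet above. It looks
roughly like the sketch below (the preference name and the title/description
strings are placeholders, not the exact ones from my dissector):
\code
static gboolean desegment_xyz_messages = TRUE;

void
proto_register_xyz(void)
{
    module_t *xyz_module;

    /* ... proto_register_protocol(), header field and subtree setup ... */

    xyz_module = prefs_register_protocol(proto_xyz, NULL);
    prefs_register_bool_preference(xyz_module, "desegment",
        "Reassemble XYZ messages spanning multiple TCP segments",
        "Whether the XYZ dissector should reassemble messages "
        "spanning multiple TCP segments",
        &desegment_xyz_messages);
}
\endcode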
When 'desegment_xyz_messages' is true, the dissector appears to work fine
and I do not get any malformed packets in my desegmented TCP messages.
However, when I set the desegment preference to false, I get
'[Malformed Packet]' next to the TCP segments, because my dissector ends up
running on each individual segment, and that is what produces the errors.
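To illustrate the failure mode, one mitigation I have been considering is a
length guard at the top of the per-PDU routine, so that when a PDU spans
segments and has not been reassembled, the routine flags it and stops
instead of dissecting into missing bytes. This is only an untested sketch,
and the '[Truncated XYZ PDU]' info text is a placeholder:
\code
static void
dissect_xyz_tcp_pdu(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree)
{
    guint32 plen;

    /* Bail out if the 4-byte length field itself is not all here. */
    if (!tvb_bytes_exist(tvb, 0, 4)) {
        col_append_str(pinfo->cinfo, COL_INFO, " [Truncated XYZ PDU]");
        return;
    }

    plen = tvb_get_ntohl(tvb, 0);

    /* With desegmentation off, a PDU that spans segments arrives here
       truncated; flag it and stop rather than throwing a
       malformed-packet exception. */
    if (!tvb_bytes_exist(tvb, 0, plen + 4)) {
        col_append_str(pinfo->cinfo, COL_INFO, " [Truncated XYZ PDU]");
        return;
    }

    /* ... normal dissection of one complete XYZ PDU ... */
}
\endcode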
So my question is: what is the correct behavior to implement? Should I
simply not offer a desegment option? I could decide not to run my dissector
at all when the preference is false and just leave all of the packets in the
conversation as plain TCP:
\code
void
dissect_xyz_tcp(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree)
{
    if (desegment_xyz_messages) {
        tcp_dissect_pdus(tvb, pinfo, tree,
                         desegment_xyz_messages, TCP_XYZ_PAYLOAD_HDR_LENGTH,
                         get_xyz_pdu_len, dissect_xyz_tcp_pdu);
    }
}
\endcode
In this scenario, if desegment is on, I get the protocol messages; if
it's off, it looks like unadorned TCP messages.
I'm just wondering what the expectations are for TCP-based
application-layer protocol dissectors.
Best regards,
John Dill