Ethereal-users: Re: [Ethereal-users] Problem with h245 dissector


From: "Martin Regner" <martin.regner@xxxxxxxxx>
Date: Sat, 22 Nov 2003 23:16:00 +0100

Ronnie Sahlberg wrote:
> I am pretty sure your fix is wrong and the real problem is that your
> implementation that generated the packet is broken.
> Your fix does cause ethereal to dissect that particular packet in the way
> you wish it to, but it does not dissect it according to the ASN.1
> definition of h.245.
>
> As Guy wrote below, the definition for this type is:
> signalType
>                  IA5String(SIZE (1) ^ FROM ("0123456789#*ABCD!")),
>
> but the alphabet string passed to
> "dissect_per_restricted_character_string()" is "!#*0123456789ABCD".
>
> Note that ethereal specifies the alphabet using a different order of the
> possible characters.
> The reason for this is that a restricted alphabet in ASN.1 does not require
> the individual characters in the alphabet to be specified in any
> given order.   You can rearrange them any way you want and it will still
> describe the same restricted alphabet and the encoding will still be the
> same.
> Thus  IA5String(SIZE (1) ^ FROM ("0123456789#*ABCD!")),
> is equivalent to and is encoded in exactly the same way as
> IA5String(SIZE (1) ^ FROM ("DCBA9876543210#*!")),
> When restricted alphabets are used, the individual characters in the
> restricted subset will automatically be reordered in the order they
> appear in some table in the standard (the ASCII value order?)
>
> Anyway, Ethereal does NOT do this reordering of the individual characters
> but REQUIRES the alphabet to be specified in the order the
> characters will be assigned values. That is why the alphabet characters are
> specified in a different order.
>
>
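
To make sure I follow the reordering argument, here is a small Python
sketch of my own (the names are mine, this is not code from Ethereal):

    # Two FROM clauses that list the same characters in different orders
    # describe the same restricted alphabet once the characters are put
    # in canonical (ASCII) order.
    a = "0123456789#*ABCD!"   # order used in the H.245 ASN.1 module
    b = "DCBA9876543210#*!"   # a reordering of the same characters

    canonical = "".join(sorted(a))
    assert canonical == "".join(sorted(b))
    print(canonical)          # -> "!#*0123456789ABCD", the order Ethereal wants
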
> OK. Now look at how the characters in a restricted character string are
> encoded:
> For a restricted string, each character in the string will be encoded as a
> bitfield of as many bits as are required to represent the
> entire restricted alphabet.  I.e. we need n bits to describe 2^n
> or fewer restricted characters.
> For ALIGNED PER (which this uses) there is an additional restriction: the
> number of bits used to represent each restricted character must also
> be a power of two.
> I.e. in ALIGNED PER the number of bits used to encode each character MUST
> be either 1, 2, 4 or 8.
>
> This alphabet consists of 17 characters.  We need 5 bits to be able to
> represent 17 different values. However, since we use ALIGNED PER,
> 5 bits per character is not legal, so we go to the next larger valid
> number: 8.
>
> Each character in the string is thus encoded as 8 bits.   (would only be 5
> bits for UNALIGNED PER)
>
>
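
To double-check the bit counts here, a small Python helper of my own
(the function name is hypothetical, not anything from the dissector):

    import math

    def bits_per_character(alphabet_size, aligned=True):
        # Smallest n such that 2**n >= alphabet_size
        b = max(1, math.ceil(math.log2(alphabet_size)))
        if aligned:
            b = 1 << (b - 1).bit_length()   # round up to a power of two
        return b

    print(bits_per_character(17, aligned=False))  # 5 bits (UNALIGNED PER)
    print(bits_per_character(17, aligned=True))   # 8 bits (ALIGNED PER)
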
> The values used in the encoding start at 0 and are incremented by 1 for
> each character in the restricted alphabet:
> Thus the characters would then be encoded as 8-bit values with the
> following values:
> !     0x00
> #    0x01
> *    0x02
> 0    0x03
> 1    0x04
> 2    0x05
> 3    0x06
> 4    0x07
> 5    0x08
> 6    0x09
> 7    0x0a
> 8    0x0b
> 9    0x0c
> A    0x0d
> B    0x0e
> C    0x0f
> D    0x10
>
> These are all the possible values in the restricted alphabet.
> Each character is encoded as 1 BYTE; the ONLY legal values in this
> encoding are 0x00 up to and including 0x10.
>
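
As a check on my own understanding of that table, a tiny Python sketch
(my own code and names, not taken from the dissector):

    # Each character of the 17-character alphabet is encoded as one byte
    # holding its index, so the only legal octet values are 0x00..0x10.
    ALPHABET = "!#*0123456789ABCD"

    def encode_signal_type(s):
        return bytes(ALPHABET.index(c) for c in s)   # '2' -> 0x05, 'D' -> 0x10

    def decode_signal_type(data):
        return "".join(ALPHABET[b] for b in data)    # fails for values above 0x10

    assert encode_signal_type("2") == b"\x05"
    assert decode_signal_type(b"\x10") == "D"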

There is something on page 446 in Dubuisson's book "ASN.1 Communication
Between Heterogeneous Systems" (chapter 20.6.10 Character Strings and Dates)
that I think could be an explanation for why the decoding with Ethereal
0.9.16 fails for this scenario.

Dubuisson's book can be downloaded from OSS Nokalva's homepage:
http://www.oss.com/asn1/dubuisson.html

I'm not completely sure if I have understood the text correctly, but my
preliminary interpretation of the text is that since 8 bits are used for
each character, every character of the string is encoded as its
associated number in the interval (0..127) for IA5 characters (i.e. 0x32
for '2') instead of being reindexed to its index in
"!#*0123456789ABCD" (i.e. 0x05 for '2').

If the constraint had been e.g. "ACGT" then it would have been a different
scenario, since 127 > 2^B - 1 in that case (2 or 4 bits are not enough to
handle values between vmin=0 and vmax=127; at least 7 bits would be needed).
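
As a sketch of the decision rule as I currently read it (my own code and
hypothetical names, so take it with a grain of salt):

    # Tentative rule (my reading of 20.6.10): if the canonical character
    # numbers (0..127 for IA5String) already fit in the b bits chosen for
    # ALIGNED PER, they are used directly; only otherwise are the
    # characters reindexed into the sorted restricted alphabet.
    def character_value(c, alphabet, bits_per_char, vmax=127):
        if vmax <= (1 << bits_per_char) - 1:
            return ord(c)                       # '2' -> 0x32 when b = 8
        return sorted(alphabet).index(c)        # 'C' -> 1 for "ACGT", b = 2

    print(hex(character_value('2', "!#*0123456789ABCD", 8)))  # 0x32
    print(character_value('C', "ACGT", 2))                    # 1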

I need to check this in more detail and also test your patch with several
different captures.