Ethereal-users: Re: [Ethereal-users] Decoding RTP?

From: "James A. Crippen" <james@xxxxxxxxxxxx>
Date: Mon, 5 Feb 2001 12:05:58 -0900 (AKST)
On Mon, 5 Feb 2001, Craig Rodrigues wrote:

> Hi,
> 
> I have an RTP question for the protocol gurus out there.
> I looked at RFC 1889 for RTP (http://www.ietf.org/rfc/rfc1889.txt)
> and borrowed the following code from it:
> 
> /*
>  * RTP data header
>  */
> typedef struct {
>   unsigned int version:2;   /* protocol version */
>   unsigned int p:1;         /* padding flag */
>   unsigned int x:1;         /* header extension flag */
>   unsigned int cc:4;        /* CSRC count */
>   unsigned int m:1;         /* marker bit */
>   unsigned int pt:7;        /* payload type */
>   u_int16_t seq;              /* sequence number */
>   u_int32_t ts;               /* timestamp */
>   u_int32_t ssrc;             /* synchronization source */
>   u_int32_t csrc[1];          /* optional CSRC list */
> } rtp_hdr_t;  
> 
> 
> I am running on i386 Linux.
> 
> I create a packet where I set the version number to 2, the payload type to
> 32, and everything else to zero.  When I send this packet over the
> network via UDP, Ethereal decodes it with a version number of 0.
> The first byte of the packet, according to Ethereal, is 0xa (10).
> 
> Am I doing things wrong?  Do I need to redefine my struct depending on
> what byte order I am in?

Probably that's your problem.  Network byte order is big endian, but Intel
IA-32 (x86) processors are little endian, and C leaves the order in which
bit-fields are packed up to the compiler; gcc on a little-endian host
allocates them starting from the least significant bit.  I recall this bug
biting me once.  The bit-fields within each byte should be declared in
reverse order on your machine, and the seq, ts, and ssrc fields still need
to go through htons()/htonl() before they hit the wire.
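
Something like this is what I'd try (an untested sketch; the
RTP_BIG_ENDIAN/RTP_LITTLE_ENDIAN macros and the fill_rtp_hdr() helper are
just mine for illustration, not taken verbatim from the RFC):

#include <sys/types.h>            /* u_int16_t, u_int32_t */
#include <string.h>
#include <arpa/inet.h>            /* htons(), htonl() */

#define RTP_LITTLE_ENDIAN 1       /* i386 Linux; define RTP_BIG_ENDIAN on big-endian hosts */

/*
 * RTP data header.  The bit-fields within the first two bytes are
 * declared in opposite orders depending on which end the compiler
 * starts packing them from.
 */
typedef struct {
#if defined(RTP_BIG_ENDIAN)
  unsigned int version:2;   /* protocol version */
  unsigned int p:1;         /* padding flag */
  unsigned int x:1;         /* header extension flag */
  unsigned int cc:4;        /* CSRC count */
  unsigned int m:1;         /* marker bit */
  unsigned int pt:7;        /* payload type */
#else                       /* RTP_LITTLE_ENDIAN */
  unsigned int cc:4;        /* CSRC count */
  unsigned int x:1;         /* header extension flag */
  unsigned int p:1;         /* padding flag */
  unsigned int version:2;   /* protocol version */
  unsigned int pt:7;        /* payload type */
  unsigned int m:1;         /* marker bit */
#endif
  u_int16_t seq;            /* sequence number */
  u_int32_t ts;             /* timestamp */
  u_int32_t ssrc;           /* synchronization source */
  u_int32_t csrc[1];        /* optional CSRC list */
} rtp_hdr_t;

/* Fill in a header the way you described: version 2, payload type 32,
 * everything else zero.  The bit-fields need no byte swapping, but the
 * 16- and 32-bit fields do. */
static void fill_rtp_hdr(rtp_hdr_t *h, u_int16_t seq, u_int32_t ts, u_int32_t ssrc)
{
  memset(h, 0, sizeof *h);
  h->version = 2;           /* RTP version 2 */
  h->pt      = 32;          /* your payload type */
  h->seq     = htons(seq);
  h->ts      = htonl(ts);
  h->ssrc    = htonl(ssrc);
}

With the little-endian branch, gcc puts cc in the low four bits and version
in the top two bits of the first byte, which is what the wire format
expects, so Ethereal should show version 2 again.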

HTH
'james

-- 
James A. Crippen <james@xxxxxxxxxxxx> ,-./-.  Anchorage, Alaska,
Lambda Unlimited: Recursion 'R' Us   |  |/  | USA, 61.2069 N, 149.766 W,
Y = \f.(\x.f(xx)) (\x.f(xx))         |  |\  | Earth, Sol System,
Y(F) = F(Y(F))                        \_,-_/  Milky Way.