https://bugs.wireshark.org/bugzilla/show_bug.cgi?id=6854
Summary: tn3270 dissector: decoding WCC and SF attribute bytes
incorrectly
Product: Wireshark
Version: 1.6.5
Platform: x86-64
OS/Version: Windows 7
Status: NEW
Severity: Minor
Priority: Low
Component: Wireshark
AssignedTo: bugzilla-admin@xxxxxxxxxxxxx
ReportedBy: michael.wojcik@xxxxxxxxxxxxxx
Build Information:
Version 1.6.5 (SVN Rev 40429 from /trunk-1.6)
Copyright 1998-2012 Gerald Combs <gerald@xxxxxxxxxxxxx> and contributors.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Compiled (64-bit) with GTK+ 2.22.1, with GLib 2.26.1, with WinPcap (version
unknown), with libz 1.2.5, without POSIX capabilities, without libpcre, without
SMI, with c-ares 1.7.1, with Lua 5.1, without Python, with GnuTLS 2.10.3, with
Gcrypt 1.4.6, without Kerberos, with GeoIP, with PortAudio V19-devel (built Jan
10 2012), with AirPcap.
Running on 64-bit Windows 7 Service Pack 1, build 7601, with WinPcap version
4.1.2 (packet.dll version 4.1.0.2001), based on libpcap version 1.0 branch
1_0_rel0b (20091008), GnuTLS 2.10.3, Gcrypt 1.4.6, without AirPcap.
Built using Microsoft Visual C++ 9.0 build 21022
--
I'm a bit hesitant to open this bug, because it seems unlikely this would have
gone unnoticed for so long; but I've triple-checked the IBM docs, and it really
does appear the tn3270 sub-dissector is wrong.
Here's the problem: when tn3270 analyzes the bits in the WCC (Write Control
Character), or the attribute byte that follows the SF (Start Field) order, and
possibly other bit vectors that appear in the TN3270 stream, it's interpreting
the bits in the wrong order.
The IBM 3270 Data Stream Programmer's Reference documents these bits, numbering
them from 0 to 7; but bit 0 is the *most* significant bit, not the LSB. (IBM
does this to sow confusion among their enemies.)
The tn3270 dissector treats what IBM calls "bit 0" as the LSB, not the MSB.
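To make the convention concrete: IBM's "bit n" corresponds to the mask
0x80 >> n, not 1 << n. A quick sketch (Python, purely illustrative; the helper
name is mine, not anything in the dissector):

```python
def ibm_bit(value, n):
    """Return IBM-numbered bit n (0 = MSB) of an 8-bit value."""
    # IBM 3270 docs count bits from the most significant bit down,
    # so IBM "bit n" is the mask 0x80 >> n.
    return (value >> (7 - n)) & 1

# WCC 0xc2 = 1100 0010: IBM bits 0, 1, and 6 are set.
wcc = 0xC2
set_bits = [n for n in range(8) if ibm_bit(wcc, n)]
```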
For example, here's the tn3270 dissector's breakdown of the WCC 0xc2:
Write Control Character: WCC Reset, WCC Keyboard Restore, WCC Reset MDT
.... ...0 = WCC NOP: False
.... ..1. = WCC Reset: True
.... .0.. = WCC Printer1: False
.... 0... = WCC Printer2: False
...0 .... = WCC Start Printer: False
..0. .... = WCC Sound Alarm: False
.1.. .... = WCC Keyboard Restore: True
1... .... = WCC Reset MDT: True
But what the dissector is calling "WCC NOP" is in fact the MSB (it's used in
the same manner as the "Graphic Convert 1" bit in the SF attribute byte, if the
client doesn't support partitions), "WCC Reset" is the second-most-significant,
etc. From the 3270 Data Stream Programmer's Reference, Table 3-2:
| | If the Reset function is not supported, the only function of |
| | bits 0 and 1 is to make the WCC byte an |
| | EBCDIC/ASCII-translatable character. Bits 0 and 1 are set in |
| | accordance with Figure C-1 in topic C.0. |
Figure C-1 is the table used to convert the six bits of a byte in a 12-bit 3270
buffer address to printable characters. It determines the settings for the two
most-significant bits.
Note that the CICS Programmer's Guide, in its example of a 3270 output stream
(Table 35 in the CICS 4.1 Prog Guide, SC34-7022-01), uses that same WCC value
of 0xc2, and explains it as "WCC; this value unlocks the keyboard, but does not
sound the alarm or reset the MDTs." But the dissector analysis above shows "WCC
Reset MDT" as "True".
0xc2 is binary 1100 0010. The first two 1 bits are the graphic-convert bits;
they have no other meaning unless this is a 3270 model that supports
partitions, in which case the second 1 bit will reset the partitions. The last
1 bit is the keyboard-restore bit, which IBM calls bit 6. So the dissector
should be displaying this as:
Write Control Character: WCC NOP, WCC Reset, WCC Keyboard Restore
.... ...0 = WCC Reset MDT: False
.... ..1. = WCC Keyboard Restore: True
.... .0.. = WCC Sound Alarm: False
.... 0... = WCC Start Printer: False
...0 .... = WCC Printer2: False
..0. .... = WCC Printer1: False
.1.. .... = WCC Reset: True
1... .... = WCC NOP: True
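In other words, the fix amounts to moving each field's mask to its IBM
position. Here is a sketch of the corrected WCC masks (field names follow the
dissector output above; this is illustrative Python, not the actual C patch):

```python
# Corrected WCC bit masks, with IBM bit 0 as the MSB (0x80).
WCC_FIELDS = [
    ("WCC NOP",              0x80),  # IBM bit 0 (graphic-convert use)
    ("WCC Reset",            0x40),  # IBM bit 1
    ("WCC Printer1",         0x20),  # IBM bit 2
    ("WCC Printer2",         0x10),  # IBM bit 3
    ("WCC Start Printer",    0x08),  # IBM bit 4
    ("WCC Sound Alarm",      0x04),  # IBM bit 5
    ("WCC Keyboard Restore", 0x02),  # IBM bit 6
    ("WCC Reset MDT",        0x01),  # IBM bit 7
]

def decode_wcc(wcc):
    """Return the names of the WCC functions set in the byte."""
    return [name for name, mask in WCC_FIELDS if wcc & mask]
```

With these masks, 0xc2 decodes as NOP + Reset + Keyboard Restore, and Reset
MDT is *not* set, matching the CICS Programmer's Guide description.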
The situation with SF attribute bytes is a bit more complicated. Here's the
dissector's breakdown of an attribute byte:
3270 Field Attribute: 0xf0, Display: Display Not Selector Pen Detectable,
Graphic Convert1, Graphic Convert2, Numeric, Protected
.... ..00 = Display: Display Not Selector Pen Detectable (0x00)
1... .... = Graphic Convert1: True
.1.. .... = Graphic Convert2: True
.... .0.. = Modified: False
...1 .... = Numeric: True
..1. .... = Protected: True
.... 0... = Reserved: False
The four more-significant bits are correct here: Graphic Convert 1 and 2,
Protected, and Numeric. The bits in the second nybble are wrong, but they're
not backward - they're just mixed up. They should be:
.... 00.. = Display: Display Not Selector Pen Detectable (0x00)
.... ...0 = Modified: False
.... ..0. = Reserved: False
That is, the first two are the display/pen bits (I haven't verified whether the
dissector gets the four combinations of these bits correct), the second-to-last
bit is reserved, and the final bit is the MDT.
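The corrected attribute-byte layout can be sketched the same way. The
display/selector-pen value strings below are my reading of the Data Stream
Programmer's Reference; as noted above, I haven't verified the dissector's
handling of those four combinations, so treat them as an assumption:

```python
# Corrected SF attribute-byte masks (IBM bit 0 = MSB):
#   bits 0-1: graphic convert, bit 2: protected, bit 3: numeric,
#   bits 4-5: display/selector pen, bit 6: reserved, bit 7: MDT.
DISPLAY = {
    0b00: "Display Not Selector Pen Detectable",
    0b01: "Display Selector Pen Detectable",
    0b10: "Intensified Display Selector Pen Detectable",
    0b11: "Nondisplay Nondetectable",
}

def decode_attribute(attr):
    """Decode an SF attribute byte into its fields."""
    return {
        "graphic_convert1": bool(attr & 0x80),  # IBM bit 0
        "graphic_convert2": bool(attr & 0x40),  # IBM bit 1
        "protected":        bool(attr & 0x20),  # IBM bit 2
        "numeric":          bool(attr & 0x10),  # IBM bit 3
        "display":          DISPLAY[(attr >> 2) & 0x03],  # IBM bits 4-5
        "reserved":         bool(attr & 0x02),  # IBM bit 6
        "modified":         bool(attr & 0x01),  # IBM bit 7 (MDT)
    }
```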
It's probably relatively easy to fix these in the dissector source. I haven't
taken a look at it yet.