Bug 10955 - gawk fails on first record if using FS other than default
Keywords:
Status: CLOSED NOTABUG
Alias: None
Product: Red Hat Linux
Classification: Retired
Component: gawk
Version: 6.2
Hardware: All
OS: Linux
Priority: medium
Severity: high
Target Milestone: ---
Assignee: David Lawrence
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2000-04-21 04:24 UTC by jerry cloe
Modified: 2008-05-01 15:37 UTC (History)
CC: 1 user

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2000-05-18 14:43:41 UTC



Description jerry cloe 2000-04-21 04:24:24 UTC
gawk fails on the first record (remaining records are OK) when using any FS
value other than the default (space).  I have generated input files and
tested with a variety of FS values, and the first record ALWAYS returns the
entire record as $1 instead of breaking it down into the appropriate
fields.  Remaining records return as expected.  For the first record, only
$1 has a value (the whole record) and the remaining fields are null.

I can reproduce this error under Red Hat 5.2, 6.1, and 6.2, using a
variety of different FS values, including ":" (shown below), "/", "a",
and several other values.

Example:

input file:
cat /etc/passwd
root:x:0:0:root:/root:/bin/bash
bin:x:1:1:bin:/bin:
daemon:x:2:2:daemon:/sbin:

looking for first record:
cat /etc/passwd | gawk '{FS=":"}{print $1}'
root:x:0:0:root:/root:/bin/bash
bin
daemon

looking for 5th record:
cat /etc/passwd | gawk '{FS=":"}{print $5}'
<<--this field returned nothing except an output record separator;
  --I would have expected the word "root"
bin
daemon

Comment 1 jerry cloe 2000-04-22 06:45:59 UTC
In my examples, I meant to say "looking for first field" and "looking for 5th
field" rather than record...  sorry for any confusion

Comment 2 Florian La Roche 2000-05-18 14:43:59 UTC
use awk -F: '{ print $1 }'
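
The one-liner above works because -F sets the field separator before any
input is read. Assigning FS inside a per-record rule, as in the original
report, takes effect only for subsequent records: by the time the rule body
runs, the current record has already been split using the previous FS. A
minimal sketch (passwd.sample is a made-up file standing in for
/etc/passwd):

```shell
# Hypothetical sample data standing in for /etc/passwd
printf 'root:x:0:0:root:/root:/bin/bash\nbin:x:1:1:bin:/bin:\n' > passwd.sample

# Broken: FS is assigned after the first record has already been split,
# so the first line comes back whole as $1.
gawk '{FS=":"} {print $1}' passwd.sample

# Fixed: -F (or an assignment in a BEGIN block) sets FS before any input
# is read, so every record, including the first, is split on ":".
gawk -F: '{print $1}' passwd.sample            # prints "root" then "bin"
gawk 'BEGIN {FS=":"} {print $1}' passwd.sample # same result
```

A BEGIN block is equivalent to -F here and is the usual choice when FS is
computed inside the script rather than passed on the command line.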

