Description of problem:
Go versions prior to 1.9 do not support OID components that require more than 28 bits.
A backport of https://github.com/golang/go/commit/94aba76639cf4d5e30975d846bb0368db8202269 is required to support the language maximum of 31 bits.
Version-Release number of selected component (if applicable):
Anything less than golang 1.9
Steps to Reproduce:
1. Minimum reproducer: https://play.golang.org/p/wITjVuO0-F
2. The following test cert can also reproduce the problem: https://www.viathinksoft.de/~daniel-marschall/asn.1/oid-sizecheck/oid_size_test.pem
Actual results:
asn1: structure error: base 128 integer too large
Expected results:
The certificate is parsed correctly.
### What version of Go are you using (`go version`)?
`go version go1.7.5 linux/amd64` (should be the same in go 1.8+)
### What operating system and processor architecture are you using (`go env`)?
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build829310214=/tmp/go-build -gno-record-gcc-switches"
### What did you do?
Minimum reproducer: https://play.golang.org/p/wITjVuO0-F
The following test cert can also reproduce the problem: https://www.viathinksoft.de/~daniel-marschall/asn.1/oid-sizecheck/oid_size_test.pem
### What did you expect to see?
No cert parsing errors.
### What did you see instead?
`2009/11/10 23:00:00 failed to parse cert: asn1: structure error: base 128 integer too large`
The fundamental issue is that `asn1.ObjectIdentifier` is defined as `[]int`, so each component is an `int` rather than a `big.Int`. According to http://luca.ntop.org/Teaching/Appunti/asn1.html:
> INTEGER, an arbitrary integer.
> OBJECT IDENTIFIER, an object identifier, which is a sequence of integer components that identify an object such as an algorithm or attribute type.
> The contents octets shall be an (ordered) list of encodings of subidentifiers concatenated together. Each subidentifier is represented as a series of (one or more) octets. Bit 8 of each octet indicates whether it is the last in the series: bit 8 of the last octet is zero; bit 8 of each preceding octet is one. Bits 7 to 1 of the octets in the series collectively encode the subidentifier. Conceptually, these groups of bits are concatenated to form an unsigned binary number whose most significant bit is bit 7 of the first octet and whose least significant bit is bit 1 of the last octet. The subidentifier shall be encoded in the fewest possible octets, that is, the leading octet of the subidentifier shall not have the value 80 (base 16).
it would seem that `int` is not the correct type.
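The base-128 scheme quoted above, together with the 28-bit cap, can be sketched in Go. This is a simplified illustration of how the pre-1.9 decoder behaves, not the stdlib code verbatim: the shift guard rejects any subidentifier needing a fifth content octet, i.e. anything over 4 × 7 = 28 payload bits.

```go
package main

import (
	"errors"
	"fmt"
)

// parseBase128Int decodes one base-128 subidentifier starting at offset.
// Bit 8 of each octet is a continuation flag; bits 7-1 carry the value.
// The shifted == 4 guard is the 28-bit limitation discussed in this report.
func parseBase128Int(bytes []byte, offset int) (ret, next int, err error) {
	for shifted := 0; offset < len(bytes); shifted++ {
		if shifted == 4 { // a 5th octet would exceed 28 payload bits
			return 0, 0, errors.New("asn1: structure error: base 128 integer too large")
		}
		ret <<= 7
		b := bytes[offset]
		ret |= int(b & 0x7f)
		offset++
		if b&0x80 == 0 { // bit 8 clear: last octet of this subidentifier
			return ret, offset, nil
		}
	}
	return 0, 0, errors.New("asn1: truncated base 128 integer")
}

func main() {
	// 0x81 0x00 encodes 128 and parses fine.
	v, _, err := parseBase128Int([]byte{0x81, 0x00}, 0)
	fmt.Println(v, err) // 128 <nil>

	// Five octets are needed for any value over 28 bits, so this fails.
	_, _, err = parseBase128Int([]byte{0x90, 0x80, 0x80, 0x80, 0x00}, 0)
	fmt.Println(err)
}
```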
However, since the type is a simple type definition, converting between them is trivial. Any change to the underlying type would likely be considered backwards incompatible; I will leave it to the Go team to decide whether that is the case.
We have run into a real-world case where an individual was assigned an OID component that requires 29 bits to represent, so at this time they cannot use their certs: the current Go implementation (https://github.com/golang/go/blob/master/src/encoding/asn1/asn1.go#L298) is limited to 28 bits. It should be possible to use the full 31 bits of an `int`, which would be fully backwards compatible and consistent on all machines. Alternatively, accumulating into a temporary `int64` would allow storing greater values on machines where `int` is larger than 32 bits, but that would not be consistent across machine architectures.
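The 31-bit mitigation mirrors the approach of the upstream commit referenced above: accumulate in an `int64` and reject only values that do not fit an `int32`. The code below is a sketch reconstructed from that description, not the actual patch.

```go
package main

import (
	"errors"
	"fmt"
	"math"
)

// parseBase128Int accumulates in an int64 and rejects only values that
// cannot fit an int32, allowing the full 31 bits on every platform.
func parseBase128Int(bytes []byte, offset int) (ret, next int, err error) {
	var ret64 int64
	for shifted := 0; offset < len(bytes); shifted++ {
		// Five octets carry up to 35 bits; anything longer certainly
		// overflows, and the MaxInt32 check below catches bits 32-35.
		if shifted == 5 {
			return 0, 0, errors.New("asn1: structure error: base 128 integer too large")
		}
		ret64 <<= 7
		b := bytes[offset]
		ret64 |= int64(b & 0x7f)
		offset++
		if b&0x80 == 0 {
			if ret64 > math.MaxInt32 {
				return 0, 0, errors.New("asn1: structure error: base 128 integer too large")
			}
			return int(ret64), offset, nil
		}
	}
	return 0, 0, errors.New("asn1: truncated base 128 integer")
}

func main() {
	// 2^28 needs 29 bits and five octets; it now parses on every platform.
	v, _, err := parseBase128Int([]byte{0x81, 0x80, 0x80, 0x80, 0x00}, 0)
	fmt.Println(v, err) // 268435456 <nil>
}
```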
Both of these methods are simple mitigations and would not help in cases such as https://github.com/docker/distribution/issues/1370, where http://www.oid-info.com/get/2.25 was used to store a 128-bit UUID. They are also not tolerant of implementations that may pad the integer with unnecessary leading zeros.
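For contrast, a decoder with no size limit at all would have to accumulate into a `math/big.Int`. This is purely an illustration of what lifting the limit would look like; `encoding/asn1` offers nothing like it, which is exactly why the 2.25 UUID arcs above cannot be represented today.

```go
package main

import (
	"errors"
	"fmt"
	"math/big"
)

// parseBase128BigInt decodes one subidentifier of arbitrary size.
func parseBase128BigInt(bytes []byte, offset int) (*big.Int, int, error) {
	ret := new(big.Int)
	for offset < len(bytes) {
		b := bytes[offset]
		ret.Lsh(ret, 7)                       // make room for 7 more bits
		ret.Or(ret, big.NewInt(int64(b&0x7f))) // append the payload bits
		offset++
		if b&0x80 == 0 { // bit 8 clear: last octet
			return ret, offset, nil
		}
	}
	return nil, 0, errors.New("asn1: truncated base 128 integer")
}

func main() {
	// Ten octets encoding 2^63, a value that does not even fit a
	// positive int64, let alone an int.
	enc := []byte{0x81, 0x80, 0x80, 0x80, 0x80, 0x80, 0x80, 0x80, 0x80, 0x00}
	v, _, err := parseBase128BigInt(enc, 0)
	fmt.Println(v, err) // 9223372036854775808 <nil>
}
```

As a side effect this decoder also accepts padding with leading 0x80 octets, although DER forbids that encoding.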
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.