We have a user report where the coredump contains a $pc pointing into .bss; gdb's "disassemble" consequently tries to disassemble the *entire* .bss section. Since this is done by an automated gdb run (gdb -batch -ex disassemble), there is no user at the keyboard to realize that this isn't useful, and it can produce many megabytes of output. Please add a parameter or an option to the "disassemble" command which can be used to cap the output size. Something along the lines of:

(gdb) disassemble /maxlines:20000
Such a simple use case is really scriptable with Python. For example, to print only the first 3 lines:

(gdb) python import re; print(re.compile("^([^\n]*\n){3}").match(gdb.execute("disassemble main", False, True)).group())
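Generalizing that one-liner: a small helper can cap whatever "disassemble" returns at N lines. This is a sketch for gdb's Python scripting; only gdb.execute is the real gdb API, while cap_lines and the 20000-line limit are made-up names for illustration:

```python
def cap_lines(text, max_lines):
    """Return at most max_lines lines of text, noting any truncation."""
    lines = text.splitlines()
    if len(lines) <= max_lines:
        return text
    kept = "\n".join(lines[:max_lines])
    return kept + "\n... (%d more lines suppressed)" % (len(lines) - max_lines)

# Inside gdb one would then run, e.g.:
#   (gdb) python print(cap_lines(gdb.execute("disassemble main", False, True), 20000))
```

Note that this still makes gdb generate and buffer the full disassembly first, so it helps readability but not the execution-time or memory problem described below.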
(In reply to Jan Kratochvil from comment #1)
> Such simple usage is really scriptable with Python. For example first 3
> lines:
>
> (gdb) python import re;print
> re.compile("^([^\n]*\n){3}").match(gdb.execute("disassemble
> main",False,True)).group()

See bug 995889:

"""
Generating core_backtrace
Lock file './.lock' is locked by process 6567
Generating backtrace
Backtrace is too big (33564131 bytes), reducing depth to 512
Backtrace is too big (33564131 bytes), reducing depth to 256
Backtrace is too big (33564131 bytes), reducing depth to 128
etc.

The backtrace never decreases in size and I am unable to submit the bug. On a Core-i7, it is very slow and can easily take 30+ minutes to run through the gamut before giving up.
"""

IOW: the execution time, and the possible memory exhaustion while temporarily storing megabytes of disassembly, are also part of the problem here.
An alternative is:

(gdb) disassemble main,+20000

The limit is in bytes rather than lines, but that should not matter much.
Oops, that is not right: it does not stop at the end of main, so it still disassembles past the function.
This one should work. It binary-searches with "info symbol" for the largest offset still inside main, then disassembles exactly that many bytes:

import re

f = "main"
l = 1
r = 20000
# re.escape is Python's equivalent of Perl's \Q quoting.
p = re.compile("^%s [+] " % re.escape(f))
while l < r:
    m = (l + r) // 2  # // so it also works under Python 3
    if p.match(gdb.execute("info symbol %s+%d" % (f, m), False, True)):
        l = m + 1
    else:
        r = m
gdb.execute("disassemble %s,+%d" % (f, r))
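The loop above is a standard boundary binary search over a half-open range: it narrows [l, r) down to the first offset at which "info symbol" no longer reports "main + ...". The same logic can be checked outside gdb by swapping the gdb.execute call for an arbitrary monotonic predicate (a stand-in assumption; find_boundary and the pretend 137-byte function are illustrative, not gdb API):

```python
def find_boundary(lo, hi, inside):
    """Binary-search [lo, hi) for the first offset where inside(offset)
    becomes False; inside must be True below the boundary, False above."""
    while lo < hi:
        mid = (lo + hi) // 2
        if inside(mid):
            lo = mid + 1
        else:
            hi = mid
    return lo

# Stand-in for '"info symbol main+m" still resolves inside main':
# pretend main occupies bytes [0, 137).
size = find_boundary(1, 20000, lambda m: m < 137)
```

This costs one "info symbol" per probe, i.e. O(log r) gdb commands, instead of disassembling anything until the final bound is known.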