binstats: Basic statistics on binary code
As you may know, my work consists of developing software for embedded devices. We usually say that an embedded device is a piece of hardware with limited resources (memory, CPU, flash...), phones being the classic example. Nowadays, only the core software of a smartphone is really embedded; the rest sometimes runs on hardware more powerful than the computer I'm writing this post on. But never mind, I don't work in telephony. Here (at Neotion), we do really embedded software, with chipsets clocked from 100MHz to 200MHz, RAM from 1MB to 32MB, and flash up to 8MB.
After years of development, a piece of software can become too big to fit in its allocated flash partition. So, to find the guilty functions, I wrote a simple Perl script (~130 lines) that counts the number of instructions of each function in objdump's output (produced with the -ld switches) and displays statistics per function and per file (it doesn't look at the .data or .bss sections). To use the script correctly, you have to compile your program with the -ggdb option (to get line numbers and file paths), but you can still enable optimisations (-OX). A rough sketch of the parsing idea is shown after the example below.
Example with main.c
#include <stdio.h>

int function1(int a, int b)
{
    return a*b+4;
}

int function2(int a, int b, char* c)
{
    printf("Result %d*%d+4 = %s\n", a, b, c);
    return 0;
}

int main(int argc, char** argv)
{
    char buf[32];

    sprintf(buf, "%d", function1(5, 4));
    function2(5, 4, buf);
}
> gcc main.c -ggdb -o test
> objdump -ld test > test.txt
> ./binstats.pl --in test.txt
Total instructions 63

63 (100.00%) /home/soutade/main.c
	38 main
	16 function2
	9 function1
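If you're curious how such a tool can work, here is a minimal Perl sketch of the approach. It is not the actual binstats.pl (and, as an assumption for the sketch, it reads objdump's output from stdin rather than via --in): it detects function labels, remembers the current source file from the -l annotations, and counts disassembled instruction lines.

#!/usr/bin/perl
# Hypothetical sketch of the idea (NOT the real binstats.pl):
# count instructions per function and per file in "objdump -ld"
# output read from stdin.
use strict;
use warnings;

my %per_function;       # function name -> instruction count
my %per_file;           # source file   -> instruction count
my $current_func = '';
my $current_file = '';

while (my $line = <STDIN>) {
    chomp $line;
    if ($line =~ /^[0-9a-fA-F]+ <(.+)>:\s*$/) {
        # Function label, e.g. "00000000004004d6 <function1>:"
        $current_func = $1;
    } elsif ($line =~ /^(\/.+):\d+\s*$/) {
        # "-l" annotation, e.g. "/home/soutade/main.c:3"
        $current_file = $1;
    } elsif ($line =~ /^\s*[0-9a-fA-F]+:\s/ && $current_func ne '') {
        # A disassembled instruction, e.g. "  4004da: 55   push %rbp"
        $per_function{$current_func}++;
        $per_file{$current_file}++ if $current_file ne '';
    }
}

my $total = 0;
$total += $_ for values %per_function;
die "No instructions found\n" unless $total;

print "Total instructions $total\n\n";
# The real script groups functions under their file; this sketch
# simply prints file totals followed by function totals.
for my $file (sort { $per_file{$b} <=> $per_file{$a} } keys %per_file) {
    printf "%d (%.2f%%) %s\n", $per_file{$file},
           100.0 * $per_file{$file} / $total, $file;
}
for my $func (sort { $per_function{$b} <=> $per_function{$a} } keys %per_function) {
    printf "\t%d %s\n", $per_function{$func}, $func;
}

Fed with the test.txt produced above, such a sketch would print something in the same spirit as the statistics shown, although the exact layout of the real script differs.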
There are also options to filter out small files, small functions and particular paths, which helps to focus on the big ones. Have fun!
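The real option names are the script's own; just as a hypothetical illustration of how such filters might be combined on the command line:

> ./binstats.pl --in test.txt --min-func-size 20 --exclude-path /home/soutade/thirdparty

Here --min-func-size and --exclude-path are made-up names standing in for the actual filtering options.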