It all comes from the defined requirements and specifications.
e.g. "You shall handle x messages within y milliseconds."
From that, you derive your worst-case buffer size, given that the buffer is serviced at least once every z milliseconds (note that this is a hard real-time requirement, since it is a bounded maximum time).
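To make the arithmetic concrete, here is a minimal sketch in C of turning such a requirement into a compile-time buffer size. The numbers and names (MAX_MSGS_PER_WINDOW, WINDOW_MS, MAX_SERVICE_PERIOD_MS) are hypothetical placeholders, not from any particular spec:

```c
/* Requirement: handle up to X messages every Y milliseconds.
 * Design bound: the buffer is drained at least once every Z milliseconds.
 *
 * Worst case: messages keep arriving at the maximum rate for a full
 * Z-millisecond service interval, so the buffer must hold at least
 * ceil(X * Z / Y) messages.
 */
#define MAX_MSGS_PER_WINDOW   100u   /* X: hypothetical requirement  */
#define WINDOW_MS             10u    /* Y: hypothetical requirement  */
#define MAX_SERVICE_PERIOD_MS 25u    /* Z: hypothetical design bound */

/* Integer ceiling division so the buffer size is never rounded down. */
#define MSG_BUFFER_DEPTH \
    ((MAX_MSGS_PER_WINDOW * MAX_SERVICE_PERIOD_MS + WINDOW_MS - 1u) / WINDOW_MS)

typedef struct { unsigned char payload[8]; } message_t;  /* placeholder type */

static message_t rx_buffer[MSG_BUFFER_DEPTH];  /* 250 entries with these numbers */
```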
As TickleSteve said, you define the worst case rather than analyze it.
As for the stack-usage requirement, it seems this could be determined statically by some kind of automated analysis, but I'm no expert on this.
Does anyone see a reason why there couldn't be an algorithm that statically analyzes the code to derive its worst-case stack usage?
For example: for each function, assume every local variable declaration will actually be needed and sum their sizes to get that function's frame requirement. Then follow every path down the call tree, adding up the required stack at each call, and take the maximum over all paths.
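A rough sketch of that call-tree walk, with made-up frame sizes and a hypothetical call graph. A real analyzer would extract the frame sizes and the call graph from the compiler's output, and it would have to give up or warn on recursion and calls through function pointers, since those make the worst case unbounded or unknowable statically:

```c
#include <stddef.h>
#include <stdio.h>

typedef struct func {
    const char         *name;
    size_t              frame_size;   /* bytes this function's own frame needs */
    const struct func **callees;      /* NULL-terminated list of possible callees */
} func_t;

/* Worst-case stack usage of f = its own frame plus the worst of its callees. */
static size_t worst_case_stack(const func_t *f)
{
    size_t worst_callee = 0;
    for (const func_t **c = f->callees; c && *c; ++c) {
        size_t need = worst_case_stack(*c);
        if (need > worst_callee)
            worst_callee = need;
    }
    return f->frame_size + worst_callee;
}

/* Hypothetical call tree: main -> {read_sensor, log_msg}, read_sensor -> log_msg */
static const func_t  log_msg      = { "log_msg",     64,  NULL };
static const func_t *rs_calls[]   = { &log_msg, NULL };
static const func_t  read_sensor  = { "read_sensor", 128, rs_calls };
static const func_t *main_calls[] = { &read_sensor, &log_msg, NULL };
static const func_t  top          = { "main",        32,  main_calls };

int main(void)
{
    /* Deepest path is main -> read_sensor -> log_msg: 32 + 128 + 64 = 224 bytes. */
    printf("worst-case stack: %zu bytes\n", worst_case_stack(&top));
    return 0;
}
```

This is essentially what tools that report worst-case stack depth do: per-function frame sizes plus a traversal of the call graph, with the hard parts being indirect calls and recursion.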