An engineer implements a new Cisco UCM based telephony system per these requirements:
The local Ethernet bandwidth is sized based on the total bandwidth per call.
A G.736 codec is used.
The bit rate is 64 kbps.
The codec sample interval is 10 ms.
The voice payload size is 160 bytes per 20 ms.
What should the size of the Ethernet bandwidth be per call?
A. 31.2 kbps
B. 38.4 kbps
C. 55.2 kbps
D. 87.2 kbps
All details match the G.711 codec, for which the calculation is:
(160 + (40 + 18)) / 160 x 64 = 87.2 kbps
where 160 bytes is the voice payload, 40 bytes is the IP/UDP/RTP header overhead, and 18 bytes is the Ethernet Layer 2 overhead.
https://www.cisco.com/c/en/us/support/docs/voice/voice-quality/7934-bwidth-consume.html
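The calculation above can be sketched in a few lines of Python, using the values from the question and the standard header sizes from the linked Cisco document (40 bytes of IP/UDP/RTP plus 18 bytes of Ethernet overhead); the variable names here are just for illustration:

```python
# Per-call Ethernet bandwidth for VoIP, using the values in the question.
voice_payload_bytes = 160        # voice payload per packet (160 bytes / 20 ms)
packetization_ms = 20            # one packet carries 20 ms of audio
codec_bit_rate_kbps = 64         # G.711-style 64 kbps codec bit rate

ip_udp_rtp_overhead = 40         # 20 B IP + 8 B UDP + 12 B RTP headers
ethernet_overhead = 18           # Ethernet Layer 2 header + FCS

# 1000 ms / 20 ms per packet = 50 packets per second
packets_per_second = 1000 // packetization_ms

# Total bytes on the wire per packet: payload plus all headers
frame_bytes = voice_payload_bytes + ip_udp_rtp_overhead + ethernet_overhead

# Bandwidth = frame size in bits x packets per second
bandwidth_kbps = frame_bytes * 8 * packets_per_second / 1000
print(bandwidth_kbps)  # 87.2

# Equivalent ratio form used in the thread: (payload + overhead) / payload x codec rate
ratio_kbps = frame_bytes / voice_payload_bytes * codec_bit_rate_kbps
print(ratio_kbps)      # 87.2
```

Both forms give 87.2 kbps, matching answer D.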
If we’re given the bit rate for the codec, I’m hoping the options will include only one that’s actually higher, making it the correct answer. Otherwise, Cisco is expecting us to memorize these bit rates, which is kinda dumb.
Check this link under the “Introduction to Codecs” menu on the left. You will see a table of codecs with their bandwidth calculations.
https://www.cisco.com/c/en/us/td/docs/ios-xml/ios/voice/cube/configuration/cube-book/cube-codec-basic.html
A step-by-step calculation can be found here: https://www.cbtnuggets.com/blog/technology/networking/how-to-calculate-bandwidth-for-cisco-ip-calls
Do not get confused by the G.736 codec; we only need the bit rate, which is 64 kbps.
I think it should be B.
https://www.cisco.com/c/en/us/support/docs/voice/voice-quality/7934-bwidth-consume.html
Sorry. D