Thanks for the reply.
The question I asked is not totally baseless.
Just look at the scenario:
A bit must have a certain duration, and the shorter that duration, the more bits can be transmitted per second.
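As a quick sanity check of that relation, here is a minimal sketch; it assumes the standard E1 line rate of 2.048 Mbit/s from G.703/G.704:

```python
# Bit rate is the reciprocal of bit duration: the shorter each bit,
# the more bits fit into one second.
E1_RATE_BPS = 2_048_000  # standard E1 line rate (G.703/G.704)

bit_duration_s = 1 / E1_RATE_BPS
print(f"E1 bit duration: {bit_duration_s * 1e9:.1f} ns")  # ~488.3 ns
```

So at a fixed line rate the duration of each bit on the wire is fixed by the standard; neither node can shrink it.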
In the case of E1,
we have time slots (as is usual with TDM), and a certain number of bits fit into each slot.
Now, the standard itself cannot be altered; I mean, the source and destination nodes cannot change it. Both nodes have studied at the same school of standardization, so they have no choice but to obey the rules of G.703.
But the case takes a twist when
both nodes agree to interpret the sequence of bits delivered to them over E1 in some other way.
For example, they might proceed like this:
"We will do nothing until our tray is filled with 20 bits."
These bits are not taken at face value; they are actually a code that only the source and destination know.
For every 20 bits received, the destination node looks at the sequence
and compares it against a coding table,
and it might be possible that these 20 bits correspond to another sequence of 40 bits.
Thus it might be possible that
even though you have received only 20 bits in one second,
you have understood 40 bits, and in that same second.
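The codebook idea can be sketched like this; the table contents and names are purely illustrative, not any real standard:

```python
CODE_BITS, DATA_BITS = 20, 40

# Hypothetical codebook shared by source and destination.
# A real table could hold at most 2**20 entries, so it can only
# ever name 2**20 of the 2**40 possible 40-bit sequences.
codebook = {
    0x00001: 0x00000000FF,
    0x00002: 0x123456789A,
}

def decode(code: int) -> int:
    """Destination maps a received 20-bit code to its agreed 40-bit meaning."""
    return codebook[code]

# 20 bits arrive on the E1 link, 40 bits come out of the table --
# but only 20 bits of *information* were actually transferred,
# since only 2**20 distinct 40-bit outcomes are reachable.
print(f"{decode(0x00001):010x}")
```

This is essentially what compression schemes do: it gives a real gain only when the traffic is compressible, i.e. when the 40-bit messages you actually send are drawn from a small enough set to fit in the table.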
What do you say?
I am not well versed in transmission standards, but when I hear things like
ATM over SDH
IP over SDH
it just confuses me: why are we riding one transmission technique over another if it is not providing any gain in
what you call
bits per second, or
information at a faster rate?