Encrypting voice communications
Ian Stirling
openmoko at mauve.plus.com
Sat Feb 3 16:19:17 CET 2007
Mikko Rauhala wrote:
> Fri, 2007-02-02 at 09:54 -0800, Tim Newsom wrote:
>> So, though possibly inefficient, could we not somehow take the analog
>> audio stream, do some predictable and reversible encoding/encryption,
>> then convert it into sound again, like doing base64 encoding for
>> binary data?
>>
>> In that way we are still sending audio information and letting it get
>> encoded by the gsm module.
>
> Well. Even if the hardware supports feeding the GSM chip with audio from
> the SoC instead of the mic directly (does it? it would be useful for
> other stuff too, but I don't really know), it would be a rather
> nontrivial exercise to make an encryption transform that would properly
> survive lossy GSM encoding (and two d-a-d-conversions...) and be
> readable.
GSM encoding is not lossy in that way.
If you have perfect knowledge of the sample stream that the GSM coder
has been passed, then you can accurately predict the output of the GSM
coder.
You can then predict the decoder's output stream just as exactly, and
recover a clean bitstream that matches what was put in.
In practice, well...
If we have no A/D on either end, it might actually work, though there
will probably be horrible framing problems.
Bit errors are going to be really, really annoying, as the GSM codec has
many properties of an encryption algorithm.
It is designed so that a single bit error still produces something that
sounds - most of the time, with reasonable input - similar to the ear.
In practice this means a single bit error on the channel does not give
you one bit error in the output, but leaves most of the output bits in
error.
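You can see that avalanche with the same library - again just a sketch
assuming libgsm; the test signal and the bit position flipped are
arbitrary:

#include <gsm.h>
#include <stdio.h>

int main(void)
{
    gsm_signal in[160], clean[160], damaged[160];
    gsm_frame frame;
    gsm enc = gsm_create(), dec1 = gsm_create(), dec2 = gsm_create();
    int i, changed = 0;

    for (i = 0; i < 160; i++)            /* arbitrary test signal */
        in[i] = (gsm_signal)(1000 * ((i % 7) - 3));

    gsm_encode(enc, in, frame);
    gsm_decode(dec1, frame, clean);      /* what should come out */

    frame[10] ^= 0x04;                   /* one bit error "on the air" */
    gsm_decode(dec2, frame, damaged);

    for (i = 0; i < 160; i++)
        if (clean[i] != damaged[i])
            changed++;

    printf("%d of 160 output samples changed\n", changed);

    gsm_destroy(enc); gsm_destroy(dec1); gsm_destroy(dec2);
    return 0;
}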
It _is_ possible to correct this - you construct a model of the GSM
codec state, and for each packet received, you update this model.
If an output packet does not checksum cleanly, you take the GSM codec's
output, back-calculate what the input to it must have been, and try
flipping bits on that input until you get output that checksums correctly.
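One way that search could look - a sketch only, and it won't build as-is:
codec_model, decode_with_state_copy() and payload_ok() are hypothetical
(libgsm has no public call to snapshot the decoder state), and the
candidate flips here are done on the received frame itself, which is the
simpler end to attack:

#define FRAME_BYTES 33
#define FRAME_BITS  (FRAME_BYTES * 8)

/* Hypothetical helpers: a snapshot of the far end's codec state, a
 * decode that works on a copy of that state, and a checksum over the
 * data bits recovered from the decoded audio. */
struct codec_model;
void decode_with_state_copy(const struct codec_model *m,
                            const unsigned char *frame, short *pcm);
int  payload_ok(const short *pcm);

/* Returns 0 if the frame was fine, 1 if one flipped bit fixed it,
 * -1 if we give up and drop the frame. */
int repair_frame(const struct codec_model *m, unsigned char frame[FRAME_BYTES])
{
    short pcm[160];
    int bit;

    decode_with_state_copy(m, frame, pcm);
    if (payload_ok(pcm))
        return 0;

    for (bit = 0; bit < FRAME_BITS; bit++) {
        frame[bit / 8] ^= 1u << (bit % 8);      /* try this flip */
        decode_with_state_copy(m, frame, pcm);
        if (payload_ok(pcm))
            return 1;                           /* corrected */
        frame[bit / 8] ^= 1u << (bit % 8);      /* undo, keep looking */
    }
    return -1;
}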
Unfortunately, this takes a lot of CPU power. I would be surprised if
it's possible to correct double bit errors in real time on the latest
desktop CPUs.
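To put rough numbers on it (assuming the full-rate codec): a frame is
about 260 bits every 20 ms, so single-bit repair is at worst around 260
trial decodes per frame, or roughly 13,000 a second, while double-bit
repair is on the order of 260*259/2, about 34,000 combinations per frame -
roughly 1.7 million trial decodes a second, each dragging a copy of the
codec state along with it.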
I suspect in practice this scheme may only work for absolutely perfect
links, where there is no noise, or where there is little enough noise
that you can afford to throw away the affected packets.
It would give drastically better bandwidth when it works, but you're
going to need a fallback option for the other 80% of the time.