Question
BMAD
You have two 3-bit sensors, A and B, that measure the same quantity -- the temperature of the room, radioactivity levels, whatever. Both sensors are hooked up to the same CPU, which takes in the sensor readings. You know that the sensors are designed so that their two readings differ in at most one bit position. We claim that if B knows that A has already sent the CPU its 3-bit reading, then B only needs to send 2 bits, and the CPU will still be able to reconstruct B's 3-bit measurement exactly, thereby conserving bandwidth. How is this so?
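
One scheme that makes the claim work (my own sketch, not something spelled out in the post) is coset coding: split the eight 3-bit strings into four pairs in which the two members differ in all three positions (000/111, 001/110, 010/101, 011/100). B sends 2 bits naming its pair. Because the two members of a pair are three bit-flips apart, at most one of them can be within one bit-flip of A's reading, so the CPU can pick out B's exact value. Note that B never needs to see A's actual reading, only to know that A has sent it. The Python sketch below (the helper names encode_b and decode_b are mine) checks the scheme exhaustively.

```python
from itertools import product

def hamming(x, y):
    """Number of bit positions in which two 3-bit values differ."""
    return bin(x ^ y).count("1")

def encode_b(b):
    """B's 2-bit message: the pair (b2^b1, b1^b0), which names the
    coset of {0b000, 0b111} that b falls into."""
    b2, b1, b0 = (b >> 2) & 1, (b >> 1) & 1, b & 1
    return (b2 ^ b1, b1 ^ b0)

def decode_b(a, syndrome):
    """CPU side: given A's full 3-bit reading and B's 2-bit message,
    recover B's reading. The two candidates in a coset are Hamming
    distance 3 apart, so at most one can be within distance 1 of a."""
    candidates = [b for b in range(8) if encode_b(b) == syndrome]
    matches = [b for b in candidates if hamming(a, b) <= 1]
    assert len(matches) == 1
    return matches[0]

# Exhaustive check over every pair (a, b) that differs in at most one bit.
for a, b in product(range(8), repeat=2):
    if hamming(a, b) <= 1:
        assert decode_b(a, encode_b(b)) == b
print("2-bit scheme recovers B exactly for all valid (A, B) pairs")
```

Two bits is also the minimum: for a fixed reading from A there are four possible readings for B (A itself, or A with one of its three bits flipped), so B's message must be able to distinguish at least four cases.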