Game Development Community

Bit Precision

by Jared Schnelle · in Torque Game Engine · 11/20/2002 (12:18 am) · 3 replies

*** This post has been answered (by myself) further down ***

I was placing a Power Bar (much like the health bar) into the PlayGui.

Since there's going to be a graphical update on the screen, there needed to be some code added to the engine.

My question comes from the final parts of the code that I needed to add. Whenever I change my Power Level (by casting a spell or whatnot), I do a setMask(PowerMask) so it knows to trigger an update in the next available packet.

So, this works fine, except when I unpack the information. (On a side note, I noticed that the floating-point values are scaled down to a value between 0 and 1, then scaled back up to whatever they were on the receiving end... is this to save bandwidth?)

if (stream->writeFlag(mask & PowerMask)) {
   F32 clamp = mClampF(mPower / mDataBlock->maxPower, 0.f, 1.f);
   stream->writeFloat(clamp, PowerLevelBits);
The mClampF makes sure that the value satisfies 0 ≤ value ≤ 1. The variable clamp is there just so I can echo to the console what value I'm trying to pack.

...

A few lines down in the unpack function, I do the following:
F32 clamp = stream->readFloat(PowerLevelBits);
I then echo the value for what I unPacked from the packet.

This is where I'm having trouble: these two values (what I packed and what I unpacked) are not the same. They're close, but not the same.

When I define:
PowerLevelBits = 6 (in shapebase.h), the value is about 5% off.
PowerLevelBits = 8, the value is about 2% off.

You see the pattern: the more bits I let it use, the more precise it becomes. Is there a better way around this? It seems the engine forces you to pack a number between 0 and 1 into the bitstream to save bandwidth, but here it seems to do just the opposite.

Thanks for any insight. It's late and I've probably missed some stuff, so please point me in the right direction.

-Jared

#1
11/20/2002 (11:47 am)
Every bit you add should halve the inaccuracy. And remember, the server is the authoritative ruler, so if the client is off a little bit, it's not a big problem.

Maybe instead of a float for your power level, you should use an int? It might save you rounding headaches, and such... (ie, if you do it right, the player might well not notice that the 255px-wide bar only has 255 different "levels")
#2
11/20/2002 (6:01 pm)
Ok, here are my findings.

Let's keep with the Power bar example I started above.

If I assume that in my game the maximum power will be no more than 4096, that allows me to set a cap of 12 bits on the power, because 2^12 = 4096, so 12 bits gives 4096 distinct values (0 through 4095).

So, if I put 12 bits for my number in the packet, I can send any integer value across with 100% success. I won't have any ambiguity in the results; however, is this really necessary?

For my purpose, this number updates the PowerBar GUI. Do I need to be 100% accurate while stretching the bar back and forth? Nope.

I did a little bit of C++ and came up with a few programs to convert dec -> binary and calculate the error, just like the engine does when it builds the packets.

Let's just give an example. Say I want to send the value 3412 across the network. I could send it at 12 bits and it'd arrive just like that, 3412; however, I have the option to do this:

3412 / 4096 (my max value) = 0.8330078125

If I convert that to binary using 6 bits, send it across the network, and then turn it back into decimal, the number I get comes out to 0.828125, which corresponds to 3392.

This number is off by about 0.5% of the full range (a difference of 20 out of 4096). Considering this is only used for updating a GUI bar, this is incredible, and the difference in the values isn't that noticeable on a bar that displays 0 to 4096.

--------------
!!!However!!!
--------------

The savings really only pay off for values that need 10 bits or more, because if you're not using at least 8 bits of precision, your numbers generally come out less precise than just sending the value over unmodified.

In my case, I think I'll go with a 10-bit number and use 10 bits the whole time, because even though some of my players may be up around power level 1023, most of them will be at the lower numbers, where this little trick doesn't work so well.

Also, in the general demo, the player's health (max 100) is a 7-bit number. The network code transmits it at 6 bits, so it can be a tad inaccurate when updating your health bar. I'd just set the value to 7 bits and take out the extra multiplications/divisions that happen client- and server-side before and after the packet is sent.

Take a look in shapebase.cc, specifically
DamageLevelBits = 6,

Hope this made sense,
-Jared
#3
11/20/2002 (6:58 pm)
One more thing that needs to be mentioned: if you're going to pass a value across the network that's greater than 1, it must be passed as an integer. I haven't dug down deep enough yet, but when you pass a float, the packUpdate expects that float to be < 1.

So, if you want to change your level of bit precision on your health to something that makes sense (really, who has 0.5 of a health point anyhow?), you need to change all the

F32 mDamage ----> S32 mDamage and so on, and pass it as an integer into the packUpdate.

Doing that took me about 10 minutes, but I now have on-the-mark health displaying through the GUI, and I think this will make more sense when I implement a different sort of regen.

Ok, have fun.
-Jared