LuaJIT bit.band stopped working

Came across an interesting feature/artifact of using LuaJIT today.

In Smith and Winston we use the bit package that LuaJIT provides to flag and mask collisions. The types of collision (for example world, bullet, vehicle) were hard coded on both the engine and Lua side. Today I moved toward the types being loaded from an external file by the engine/editor and injecting those into the Lua global space for use by the scripts. No more hard coding, everything in one place, what's not to like?
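From the script side, the end result should just look like a nested global table, something like this (WORLD's value of 2 is real; the other names and values are purely illustrative):

-- a sketch of what the scripts end up seeing; WORLD really is 2,
-- the other names and values are just for illustration
vox = {
    col = {
        WORLD   = 2,
        BULLET  = 4,
        VEHICLE = 8,
    },
}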

Well it didn’t work. Take the following code:

function Projectile:collision_trigger(othergo, localpos, 
                                      worldpos, otherlocalpos, 
                                      othercoltype, normal)

    --othercoltype = bit.tobit(othercoltype)
    --vox.print(string.format("bang %d\n", 23))
    if bit.band(othercoltype, vox.col.WORLD) ~= 0 then

        self:explosion(worldpos, normal)

    end
end

othercoltype should be a bit field showing the type of thing this GameObject has hit (for example WORLD). If it's the world I want to cause an explosion at the impact location (worldpos) and move on. This didn't work!
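Just to spell out the intent of the mask test: in a plain LuaJIT session this behaves exactly as you'd expect (BULLET's value of 4 here is made up), which made the in-game failure all the more confusing:

local bit = require("bit")

local WORLD  = 2    -- the real value injected by the engine
local BULLET = 4    -- hypothetical value, for illustration only

local othercoltype = bit.bor(WORLD, BULLET)   -- hit the world and a bullet: 6
print(bit.band(othercoltype, WORLD) ~= 0)     -- true: the WORLD bit is set
print(bit.band(othercoltype, 8) ~= 0)         -- false: no other bit is set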

The weird thing is that if I printed out othercoltype first, the if statement worked!

The constant vox.col.WORLD is injected into the Lua state using LuaBridge before anything gets a chance to run:

static int kWorld = ColFlags::kWorld;
getGlobalNamespace (L)
    .beginNamespace ("vox")
        .beginNamespace ("col")
            .addVariable("WORLD", &kWorld, false)
        .endNamespace ()
    .endNamespace ();

and has the value 2.

So it seems that bit.band(othercoltype, 2) doesn't work, which means there is a difference between bit fields and plain numbers in LuaJIT. There is a clue to this in the LuaJIT bit documentation: the fact that it has a bit.tobit function at all implies there is something magic going on. To test this I changed the code to bit.band(othercoltype, bit.tobit(2)) and it started working. I have NO idea why printing the value worked, but that's obviously not a long-term solution.
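For reference, bit.tobit normalizes a Lua number into the signed 32-bit range that all the bit operations work on. A few examples along the lines of the LuaJIT documentation:

local bit = require("bit")

print(bit.tobit(2))            --> 2, small values pass through unchanged
print(bit.tobit(0xffffffff))   --> -1, wrapped into the signed 32-bit range
print(bit.tobit(2^40 + 1234))  --> 1234, higher bits are dropped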

So now I need to inject the normalized bit values into the Lua state, not just the raw power-of-two numbers as I'd first thought.

Unfortunately this means playing with the Lua stack directly; I've been using LuaBridge to do all the heavy lifting up until now.

ColFlags &colflags = ColFlags::Get();
lua_createtable(L, 0, colflags.GetNumFlags());                  /* table */

for(size_t i = 0; i < colflags.GetNumFlags(); ++i){
    ColGroup colgroup = static_cast<ColGroup>(1 << i);
    std::string colname = colflags.NameFromFlag(colgroup);

    lua_pushstring(L, colname.c_str());                         /* table | name */

    lua_getglobal(L, "bit");                                    /* table | name | bit table */
    lua_getfield(L, -1, "tobit");                               /* table | name | bit table | tobit func */
    lua_pushinteger(L, colgroup);                               /* table | name | bit table | tobit func | colgroup */
    lua_pcall(L, 1, 1, 0);                                      /* table | name | bit table | hexified colgroup */
    lua_remove (L, -2);                                         /* table | name | hexified colgroup */
    lua_settable(L, -3);                                        /* table */
}

lua_setglobal(L, "const");                                      /* <empty stack> */

In true Lua example fashion I've added comments showing the state of the Lua stack (relative to where the lua_createtable call starts). I use a singleton called ColFlags that is initialized with the names and values of all the collision flags when the editor or engine starts up.
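With the flags pre-normalized and sitting in a global const table, the collision check can use them directly; something like this (a sketch of the same trigger as above):

function Projectile:collision_trigger(othergo, localpos,
                                      worldpos, otherlocalpos,
                                      othercoltype, normal)

    -- const.WORLD has already been through bit.tobit on the C++ side,
    -- so no per-call normalization is needed here
    if bit.band(othercoltype, const.WORLD) ~= 0 then

        self:explosion(worldpos, normal)

    end
end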

Now everything works, and I've got all the collision flags defined in one place for the Editor, the Engine and the Lua scripts.