k1lib.serpent module
This module is for Lua’s serpent module, which serializes objects into strings similar to JSON. But maddeningly, no one has actually written serialization/deserialization code for it in Python before, and I desperately needed it for a Factorio project. Because this was written in a rush, I’m in no way guaranteeing that it will work on all serialized objects out there, but from my testing, it seems pretty robust. This is exposed automatically with:
from k1lib.imports import *
serpent.loads(...) # exposed
- k1lib.serpent.deconstruct(s: str) → list[list[5]] [source]
Not intended for the end user. Deconstructs and grabs metadata of some Lua objects. Example:
a = '{1,2,3,{a=3,b={c=6,d={5,6,7}}},{b=3}}'
serpent.deconstruct(a)
That returns:
[['_v3', 21, 27, False, '_v2'], ['_v2', 14, 28, True, '_v1'], ['_v1', 7, 29, True, '_v0'], ['_v4', 31, 35, True, '_v0'], ['_v0', 0, 36, False, 'root']]
The columns are: [unique index of bracket, start byte, end byte, is it a dictionary?, parent index]
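To make the columns concrete, here’s a quick sketch that maps each row back to its substring. It assumes the end byte is inclusive, which is consistent with the sample output above:
a = '{1,2,3,{a=3,b={c=6,d={5,6,7}}},{b=3}}'
for idx, start, end, isDict, parent in serpent.deconstruct(a):
    print(idx, repr(a[start:end+1]), isDict, parent)
# _v3 '{5,6,7}' False _v2
# _v2 '{c=6,d={5,6,7}}' True _v1
# ...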
This is a crucial step within
listCorrection()
- k1lib.serpent.listCorrection(s: str) → str [source]
Not intended for the end user. Corrects for lists in Lua. Example:
a = '{1,2,3,{a=3,b={c=6,d={5,6,7}}},{b=3}}'
serpent.listCorrection(a) # returns '[1,2,3,{a=3,b={c=6,d=[5,6,7]}},{b=3}]'
See how some curly brackets have been replaced with square brackets?
This is because there are no list or tuple types in Lua, and there are no sets in JSON either, so I kinda have to roll my own solution (see the sketch below).
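Here’s a minimal sketch of the bracket-swapping idea, built on top of deconstruct()’s metadata. This is only an illustration, not the actual implementation inside listCorrection():
def listCorrectionSketch(s: str) -> str:
    chars = list(s)
    for idx, start, end, isDict, parent in serpent.deconstruct(s):
        if not isDict:       # list-like table, so swap its curly brackets
            chars[start] = "["
            chars[end] = "]"
    return "".join(chars)

a = '{1,2,3,{a=3,b={c=6,d={5,6,7}}},{b=3}}'
listCorrectionSketch(a) # returns '[1,2,3,{a=3,b={c=6,d=[5,6,7]}},{b=3}]'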
- k1lib.serpent.loads_monolith(lua: str) → object [source]
Not intended for the end user. Core loading mechanism. See
loads()
- k1lib.serpent.loads_fragments(lua: str) → object [source]
Not intended for the end user. See
loads()
. Deserializes Lua objects, breaking up the work into multiple fragments. So here’s the general gist:
s = "{1, 2, 3, {4, 5, 6}, {7, 8, 9}}"
# then, we grab the fragments, which are the top level {} blocks, assigning each a unique key (the character and autoInc index)
fragments = {"0": "{4, 5, 6}", "1": "{7, 8, 9}"}
# then, we replace the fragments with their keys
s = '{1, 2, 3, "0", "1"}'
# then we load s; it will run fast since the fragments are just simple strings
s = serpent.loads_monolith(s)
# then we patch s, replacing the keys with the actual parsed objects
s = [1, 2, 3, [4, 5, 6], [7, 8, 9]]
Why so convoluted? Well, it turns out loads_monolith() is pretty slow. It has a for loop, and there’s a .replace() inside it, which is a hidden loop that copies the entire string over and over again, which slows things down. I haven’t done extensive testing, but it felt like O(n^2) time complexity while I was working with it.
So this optimization assumes that the top level {} blocks are small but numerous, and thus assigns less work (a shorter string, hence a faster .replace()) to each loads_monolith() call. So if there are 10k fragments, this can potentially be 10k times faster.
This assumption is of course not that great and not very general, and you can easily find cases where it doesn’t hold. But it’s just enough for my use case right now, which is to analyze Factorio. The correct way would be to dive deeper and benchmark everything more carefully, but I don’t have time for that.
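For illustration, here’s a rough sketch of the fragment idea described above, built on top of deconstruct() and loads_monolith(). The placeholder key format ("frag0", "frag1", ...) and the helper name are made up for this example; the real implementation likely differs in its details:
def loads_fragments_sketch(lua: str):
    meta = serpent.deconstruct(lua)
    rootIdx = [m[0] for m in meta if m[4] == "root"][0]  # the outermost bracket
    topLevel = [m for m in meta if m[4] == rootIdx]      # its direct children
    fragments = {}
    # replace each top level block with a short placeholder key, going back
    # to front so that earlier start/end offsets stay valid
    for i, (idx, start, end, isDict, parent) in enumerate(sorted(topLevel, key=lambda m: -m[1])):
        key = f"frag{i}"
        fragments[key] = lua[start:end+1]
        lua = lua[:start] + f'"{key}"' + lua[end+1:]
    obj = serpent.loads_monolith(lua)  # fast, since lua is now a short string
    parsed = {k: serpent.loads_monolith(v) for k, v in fragments.items()}
    def patch(x):  # swap placeholder keys back in with their parsed fragments
        if isinstance(x, str) and x in parsed: return parsed[x]
        if isinstance(x, list): return [patch(e) for e in x]
        if isinstance(x, dict): return {k: patch(v) for k, v in x.items()}
        return x
    return patch(obj)

loads_fragments_sketch("{1, 2, 3, {4, 5, 6}, {7, 8, 9}}") # [1, 2, 3, [4, 5, 6], [7, 8, 9]]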
- k1lib.serpent.loads(lua: str) [source]
Deserialize lua objects from string. Example:
# returns [1, 2, 3, {'a': 3, 'b': {'c': 6, 'd': [5, 6, 7]}}, {'b': 3}]
loads("{ 1, 2, 3, { a = 3, b = { c = 6, d = {5, 6, 7} } }, { b = 3 } }")
See also:
dumps()
What’s the relative speed here? Because everything is written in Python, I expect it to be slower than json, but by how much? Here’re some benchmark results:
|             | lua  | json | binary |
| ----------- | ---- | ---- | ------ |
| from python | 21us | 11us | 184us  |
| to python   | 92us | 10us | 5.8us  |

The “lua” column uses serpent.loads(), “json” uses json.loads(), and “binary” uses dill.loads().
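For reference, here’s a rough sketch of how the “to python” row could be reproduced with timeit. The exact payload used in the original benchmark isn’t documented, so the object below is just an arbitrary stand-in:
import json, dill, timeit
from k1lib.imports import * # exposes `serpent`

lua = "{ 1, 2, 3, { a = 3, b = { c = 6, d = {5, 6, 7} } }, { b = 3 } }"
obj = serpent.loads(lua)                  # Python object to round-trip
jsonStr = json.dumps(obj); binStr = dill.dumps(obj)

n = 10_000
print("lua:   ", timeit.timeit(lambda: serpent.loads(lua), number=n) / n)
print("json:  ", timeit.timeit(lambda: json.loads(jsonStr), number=n) / n)
print("binary:", timeit.timeit(lambda: dill.loads(binStr), number=n) / n)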