SWIG support (for Python and other languages) #1
Hi, looking at the limitations, not really my cup of tea. Limited record size and limited database size are not really attributes I'm attracted to .. :) .. however I hadn't realised there were serious LMDB forks, I probably need to have a browse. |
These restrictions come from the original DNA of LMDB. |
Mm, maybe it's Google translate playing up .. I thought records were limited to 4k and database size limited to memory size? |
Not 'limited', but |
Ahhhh, Ok. |
Actually, the 'hard' limits come from libfptu, specifically from here:
|
Ok, I'm thinking my approach is a little different: rather than dealing in fields / columns, I'm writing JSON blobs as values and converting back and forth between Python dict items on read/write. I'm seeing around 40,000 writes per second on a single core (in Python), or 30,000 writes per second on a table with a compound index. Reading is much faster: reading through a compound index with 5 keys yields around 200,000 records per second (again, this is in Python). |
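For anyone following along, a minimal sketch of the JSON-blob approach described above, using the py-lmdb package mentioned later in this thread. The database path, map size, keys and record contents are just example values, not anything from the actual project:

```python
import json
import lmdb  # py-lmdb

# Assumed example settings: 1 GiB map, a local example database file.
env = lmdb.open('example.mdb', map_size=2**30)

def put_record(key, record):
    # Serialize the dict to a JSON blob and store it under the given key.
    with env.begin(write=True) as txn:
        txn.put(key, json.dumps(record).encode('utf-8'))

def get_record(key):
    # Fetch the JSON blob (if any) and convert it back to a dict.
    with env.begin() as txn:
        raw = txn.get(key)
        return json.loads(raw) if raw is not None else None

put_record(b'user:1', {'name': 'alice', 'age': 30})
print(get_record(b'user:1'))
```

Compound/secondary indexes would sit on top of this as extra key->primary-key entries; they are omitted here to keep the sketch short. |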
I think I need to explain a little bit of my plans. So, in
Therefore, the
|
Ok, that sounds good. Currently I'm relying on the Python-lmdb package .. do you have an equivalent, or is this the bit you're missing? |
Nowadays mdbx/fptu/fpta don't have any support for Python. I think Python support would be useful only with schema and JSON (de)serialization. |
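As a purely hypothetical illustration of what "schema and JSON (de)serialization" could mean for a Python binding (this is not the fpta API; the Field class, the example schema and all names below are invented for the sketch):

```python
import json
from dataclasses import dataclass

@dataclass
class Field:
    name: str
    type_: type

# Invented example schema: two typed fields per record.
PERSON_SCHEMA = [Field('name', str), Field('age', int)]

def _check(record, schema):
    # Reject records whose fields are missing or of the wrong type.
    for field in schema:
        if not isinstance(record.get(field.name), field.type_):
            raise TypeError(f'{field.name} must be {field.type_.__name__}')

def serialize(record, schema):
    _check(record, schema)
    return json.dumps(record).encode('utf-8')

def deserialize(raw, schema):
    record = json.loads(raw)
    _check(record, schema)  # fail loudly on a malformed blob
    return record

blob = serialize({'name': 'alice', 'age': 30}, PERSON_SCHEMA)
print(deserialize(blob, PERSON_SCHEMA))
```

The point of the schema layer is that type errors surface at (de)serialization time rather than deep inside the storage engine. |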
Sure, at the end of the day, very few use-cases for databases involve low-level programming, so access from the likes of Python, Node, PHP etc, and the performance of those interfaces, is pretty key. The driving force for me was seeing my Python write speed down to ~ 2000/sec with MongoDB. |
Related to jnwatson/py-lmdb#204 |