Repairing MetaKit databases
WARNING: this wasn't the magic fix I hoped for, and may have contributed to a real crash a day or so later. Use at your own risk.
The Python Community Server has been a little unstable over the last few days, and it's reminding me of what happened a year or so ago when the database file got corrupted after a system crash. Basically, the server process suddenly starts using huge quantities of CPU time for no apparent reason. This time, I can just restart it and it goes back to normal, but last time it was hanging completely.
The solution was to take the MetaKit database and dump out all the data, then create another database with the same structure and fill it up with the data. Radio/Frontier calls this 'compacting', and Zope calls it 'packing', I think -- it's not exactly an uncommon thing to do. However, I don't know of any utilities that will pack MetaKit databases.
Does anybody know of such a tool?
Update: Scratch that request, I've done it myself. Wasn't hard - MetaKit lets you just pick up a view and drop it into another database. Great!
... more like this: [MetaKit, Python, Python Community Server]
import metakit

# path to your (possibly broken) settings.dat file:
FN = 'settings.dat'
# where to save the compacted version:
NEW_FN = 'settings_2.dat'

# open the databases
s = metakit.storage(FN, 1)
new_s = metakit.storage(NEW_FN, 1)

# we'll build up a copy of the description string here, just to make
# sure we didn't do anything dumb.
total_desc = []

def process_table(desc):
    "copy a single table between the databases"
    print "copying table:", desc
    total_desc.append(desc)
    # get the table from the source, and create it on the dest
    src = s.getas(desc)
    dest = new_s.getas(desc)
    # copy all rows
    for row in src:
        dest.append(row)

def main():
    # grab full database description
    desc = s.description()
    # split it up into tables and process them one at a time
    sofar = []
    add = sofar.append
    depth = 0
    for c in desc:
        if c == '[':
            depth += 1
        elif c == ']':
            depth -= 1
        if depth == 0 and c == ',':
            process_table("".join(sofar))
            sofar[:] = []
        else:
            add(c)
    process_table("".join(sofar))
    # all done: make sure we really did process them all
    assert ",".join(total_desc) == desc, "didn't get everything!"
    # and save ...
    new_s.commit()

if __name__ == '__main__':
    main()
    print "done."
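As an aside, the trickiest part of the script is splitting the database description on top-level commas only (commas inside the brackets of a subview must not split the string). That loop can be factored into a standalone helper for easier testing; this is just a sketch, and the name split_tables is my own, not part of MetaKit:

```python
def split_tables(desc):
    """Split a MetaKit description string on top-level commas,
    ignoring commas nested inside [...] subview definitions."""
    tables = []
    current = []
    depth = 0
    for c in desc:
        if c == '[':
            depth += 1
        elif c == ']':
            depth -= 1
        if depth == 0 and c == ',':
            # a top-level comma ends the current table description
            tables.append("".join(current))
            current = []
        else:
            # brackets and nested commas are kept as part of the description
            current.append(c)
    # don't forget the final table after the last comma
    tables.append("".join(current))
    return tables
```

For example, split_tables("posts[title:S,body:S],users[name:S]") yields the two table descriptions intact, with the commas inside the brackets left alone.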