Most efficient way to update a database
I have a table that auto-updates from time to time (say, daily). The updated fields are of type text and might hold lots of data, but I know the data doesn't change much; maybe 30 characters are added or deleted. What is more efficient: somehow merging in the changes, or deleting the old data and inserting the new? And if merging is the way to do it, how should I do that? Is there a keyword or function that makes this easier and more efficient?
P.S. I'm new to databases in general; this is the first time I've ever created and used one, so sorry if this is a silly question.
Due to its MVCC model, PostgreSQL always writes a new row version for any set of changes applied in a single UPDATE. It doesn't matter how small the change is. There is no "merge way".
It's similar to (but not the same as) deleting the row and inserting a new one.
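A minimal sketch of how to observe this, assuming a hypothetical table `documents` with a large text column `body` (the system columns `ctid` and `xmin` identify the physical row version and the transaction that wrote it):

```sql
-- Hypothetical table with a large text column.
CREATE TABLE documents (
    id   bigint PRIMARY KEY,
    body text
);

INSERT INTO documents VALUES (1, repeat('some long text ', 10000));

-- Note the current physical row version.
SELECT ctid, xmin FROM documents WHERE id = 1;

-- Even a tiny change produces a whole new row version ...
UPDATE documents SET body = body || ' appended' WHERE id = 1;

-- ... visible as a different ctid / xmin.
SELECT ctid, xmin FROM documents WHERE id = 1;
```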
Since your columns are big, they are going to be TOASTed, meaning they are compressed and stored out-of-line in a separate table. In an UPDATE, these columns can be preserved as-is if they remain unchanged, so an UPDATE is considerably cheaper than a DELETE and INSERT. Quoting the manual here:
"During an UPDATE operation, values of unchanged fields are normally preserved as-is; so an UPDATE of a row with out-of-line values incurs no TOAST costs if none of the out-of-line values change."
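To illustrate with a sketch (table and column names are hypothetical): if an UPDATE touches only a small column, the big TOASTed value is carried over by reference and not rewritten.

```sql
-- Hypothetical table: a small, frequently-changed column
-- next to a large, rarely-changed one.
CREATE TABLE documents (
    id         bigint PRIMARY KEY,
    updated_at timestamptz,
    body       text          -- large values get TOASTed out-of-line
);

-- Only the small column changes; the out-of-line "body"
-- datum is preserved as-is, so no TOAST cost is incurred.
UPDATE documents
SET    updated_at = now()
WHERE  id = 1;
```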
If your rows have lots of columns and only some of them are updated a lot, it might pay to keep them in two separate tables with a 1:1 relationship. But that's an extreme case.
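A sketch of such a split, with hypothetical table names: the frequently-updated small columns live in a narrow table, and the big, rarely-changing text lives in a companion table joined 1:1 by primary key.

```sql
-- Narrow table: cheap to update often.
CREATE TABLE article_meta (
    id         bigint PRIMARY KEY,
    title      text,
    updated_at timestamptz
);

-- Wide companion table: holds the big, rarely-changing value.
CREATE TABLE article_body (
    id   bigint PRIMARY KEY REFERENCES article_meta (id),
    body text
);

-- Frequent updates touch only the narrow table:
UPDATE article_meta SET updated_at = now() WHERE id = 1;

-- Readers join the two when they need the full content:
SELECT m.title, b.body
FROM   article_meta m
JOIN   article_body b USING (id)
WHERE  m.id = 1;
```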