
sql - Limiting the number of records in a Sqlite DB

What I'm trying to implement is a constraint whereby a SQLite database holds only the most recent 1000 records. Each record has a timestamp. One obviously inefficient approach is to check the total number of records and, if it exceeds 1000, simply delete the ones that fall outside that window.

However, I would have to do this check on every INSERT, which makes it highly inefficient.

What would be a better approach? Can we do something with triggers?

Some related questions that follow the same logic are posted on SO:

Delete oldest records from database

SQL Query to delete records older than two years


1 Reply


You can use the implicit "rowid" column for that.

Assuming you don't also delete rows manually in other ways:

DELETE FROM yourtable WHERE rowid < (last_row_id - 1000)

You can obtain the last rowid via the C API (sqlite3_last_insert_rowid()) or as MAX(rowid).
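
A minimal sketch of that, assuming the table is named yourtable and that rows are only ever appended (so rowid order matches insertion order):

-- keep only the ~1000 most recently inserted rows;
-- gaps in rowid left by earlier deletes may leave fewer than 1000
DELETE FROM yourtable
WHERE rowid <= (SELECT MAX(rowid) FROM yourtable) - 1000;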

If you don't need exactly 1000 records (e.g. you just want to clean up old records), it isn't necessary to do this on each insert. Add a counter in your program and run the cleanup, for instance, once every 100 inserts.

UPDATE:

Either way, you pay a performance cost on each insert or on each select, so the choice depends on which you have more of: INSERTs or SELECTs.

If you don't have so many inserts that performance is a concern, you can use the following trigger to keep no more than 1000 records:

CREATE TRIGGER triggername AFTER INSERT ON tablename BEGIN
    -- delete everything older than the 1000th-newest timestamp
    DELETE FROM tablename WHERE timestamp <
        (SELECT MIN(timestamp) FROM (SELECT timestamp FROM tablename ORDER BY timestamp DESC LIMIT 1000));
END;

Creating a unique index on the timestamp column is a good idea too (in case it isn't the PK already). Also note that SQLite supports only FOR EACH ROW triggers, so when you bulk-insert many records it is worth temporarily dropping the trigger and re-creating it afterwards, as sketched below.
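
A rough sketch of a bulk load with the trigger out of the way, reusing the hypothetical triggername/tablename from above (SQLite has no "disable trigger" statement, so the trigger is dropped and re-created):

BEGIN TRANSACTION;
DROP TRIGGER triggername;                -- no way to merely disable it
-- ... bulk INSERT statements go here ...
CREATE TRIGGER triggername AFTER INSERT ON tablename BEGIN
    DELETE FROM tablename WHERE timestamp <
        (SELECT MIN(timestamp) FROM (SELECT timestamp FROM tablename ORDER BY timestamp DESC LIMIT 1000));
END;
-- the trigger did not fire during the load, so run the cleanup once by hand
DELETE FROM tablename WHERE timestamp <
    (SELECT MIN(timestamp) FROM (SELECT timestamp FROM tablename ORDER BY timestamp DESC LIMIT 1000));
COMMIT;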

If there are too many INSERTs, there isn't much you can do on the database side. You can make the trigger fire less often by adding a condition such as WHEN NEW.rowid % 100 = 0, and for SELECTs just use LIMIT 1000 (or create an appropriate view); see the sketch below.
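
A sketch of that variant, again using the hypothetical names from above and replacing the earlier trigger definition (the 100 and the view name latest1000 are arbitrary):

-- run the cleanup only on roughly every 100th insert
CREATE TRIGGER triggername AFTER INSERT ON tablename
WHEN NEW.rowid % 100 = 0
BEGIN
    DELETE FROM tablename WHERE timestamp <
        (SELECT MIN(timestamp) FROM (SELECT timestamp FROM tablename ORDER BY timestamp DESC LIMIT 1000));
END;

-- for reads, expose only the newest 1000 rows
CREATE VIEW latest1000 AS
    SELECT * FROM tablename ORDER BY timestamp DESC LIMIT 1000;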

I can't predict how much faster that would be; the best approach is simply to measure the performance gain in your particular case.

