Best method for repeated searches on large list of dicts
Let's say I have a function that returns 1000 records from a Postgres
database that look like this (but much bigger):
[ {"thing_id" : 245, "thing_title" : "Thing title", "thing_url":
"thing-url"},
{"thing_id" : 459, "thing_title" : "Thing title II", "thing_url":
"thing-url/2"}]
I have a process that requires around 600 individual searches on this list
for the right dict, each based on a given unique thing_id. Rather than
iterating through the entire list for every search, wouldn't it be more
efficient to build a dict of dicts, using each record's thing_id as its
key, like this:
{245: {"thing_id": 245, "thing_title": "Thing title", "thing_url": "thing-url"},
 459: {"thing_id": 459, "thing_title": "Thing title II", "thing_url": "thing-url/2"}}
If so, is there a preferred way of building it? Obviously I could build
the dict by iterating through the list, but I was wondering whether there
are any built-in methods for this, and if not, what the most elegant
approach is. I'd also like to know whether there is a better way of
repeatedly retrieving data from the same large set of records than what I
am proposing here.