If you happen to be using macOS, there's mdfind, which queries the Spotlight database. Spotlight is kept up to date continuously, unlike locate/updatedb, where updatedb is expensive to run even if you've run it recently.
I have yet to find a good solution for the Linux CLI: something that uses an internal database kept up to date with all directory-structure changes.
Maybe someone else has seen something cool for this? :D
It's still not clear to me whether you want the files themselves inside the database or just the metadata. The stat part suggests the latter?
I guess I can see that, but now you have a cache-invalidation problem (someone else linked to Spotlight, which handles this with a background process). SQLite files can be larger than any physical medium you can purchase, so why not go the distance?
I'm looking to do the opposite: given a location, recurse into it, stat() each file, and build the database.
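A minimal sketch of that approach in Python, assuming SQLite as the backing store (the table name and column choices here are my own, not anything from the thread):

```python
import os
import sqlite3

def index_tree(root, db_path="file_index.db"):
    """Walk `root`, stat() each file, and store the metadata in SQLite."""
    conn = sqlite3.connect(db_path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS files (
            path  TEXT PRIMARY KEY,
            size  INTEGER,
            mtime REAL,
            mode  INTEGER
        )
    """)
    with conn:  # commit everything in one transaction
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.stat(path)
                except OSError:
                    continue  # broken symlink, permission error, etc.
                conn.execute(
                    "INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
                    (path, st.st_size, st.st_mtime, st.st_mode),
                )
    conn.close()
```

Once built, queries like `SELECT path FROM files WHERE size > 1000000 ORDER BY mtime` are instant, though without something like inotify the snapshot goes stale as soon as the tree changes.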