The title says it all...
Do we really need NLP or Full-Text Search (FTS), or are simple keywords enough?
When you have a large database full of content—like user messages—an LLM can feel like magic: it can return a text snippet of exactly the size you specify (with a character limit you set) that precisely answers your search query or question.
But this comes with a cost!
Take product databases, for example, with queries like this:
Let's say a user searches for "OIL EXTRA VIRGIN 250ml".
I split the phrase into words (using regex or split), and the SQLite query looks something like:
SELECT * FROM products
WHERE product LIKE '%' || ? || '%'
  AND product LIKE '%' || ? || '%'
  AND product LIKE '%' || ? || '%';

with each keyword bound as a parameter (concatenating the keywords straight into the SQL string, as in '%" & keywords(0) & "%', also opens the door to SQL injection).
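The keyword-splitting approach above can be sketched in Python with the stdlib sqlite3 module. The table name, column, and sample rows are hypothetical stand-ins for the product database described:

```python
import sqlite3

# In-memory demo table (hypothetical schema matching the example).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (product TEXT)")
conn.executemany("INSERT INTO products VALUES (?)", [
    ("OLIVE OIL EXTRA VIRGIN 250ml",),
    ("SUNFLOWER OIL 1L",),
    ("EXTRA VIRGIN OLIVE OIL 500ml",),
])

def search(conn, phrase):
    # Split the phrase into keywords and AND the LIKE clauses together,
    # binding each keyword as a parameter instead of concatenating strings.
    keywords = phrase.split()
    where = " AND ".join(["product LIKE '%' || ? || '%'"] * len(keywords))
    sql = f"SELECT product FROM products WHERE {where}"
    return [row[0] for row in conn.execute(sql, keywords)]

print(search(conn, "OIL EXTRA VIRGIN 250ml"))
# → ['OLIVE OIL EXTRA VIRGIN 250ml']
```

Note that this only matches rows containing every keyword as a substring; word order and typos are not handled, which is exactly the limitation being discussed.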
There are other approaches—normalizing, stemming, or truncating words—but none of them achieve the naturalness and accuracy that a large language model (LLM) can deliver.
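As a middle ground between raw LIKE queries and an LLM, SQLite's built-in FTS5 extension handles tokenization, case folding, and ranking for you. A minimal sketch, assuming an SQLite build with FTS5 enabled (the default in most distributions) and the same hypothetical product data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# FTS5 virtual table: the tokenizer splits and lowercases words at insert time.
conn.execute("CREATE VIRTUAL TABLE products_fts USING fts5(product)")
conn.executemany("INSERT INTO products_fts VALUES (?)", [
    ("OLIVE OIL EXTRA VIRGIN 250ml",),
    ("SUNFLOWER OIL 1L",),
])

# MATCH treats the query terms as an implicit AND, independent of word order;
# bm25() orders results by relevance.
rows = conn.execute(
    "SELECT product FROM products_fts WHERE products_fts MATCH ? "
    "ORDER BY bm25(products_fts)",
    ("oil extra virgin",),
).fetchall()
print([r[0] for r in rows])
# → ['OLIVE OIL EXTRA VIRGIN 250ml']
```

This still won't answer a question the way an LLM does, but it gets you word-order independence and ranking with no extra infrastructure.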
What do you think?
How do you handle product databases, or databases with messages/help texts/books in your systems?