Neural Search
I recently started on an interesting little side project – using pre-trained AI models to search and retrieve information from public channels in exported Slack messages.
The goal was to sift through all the messages and enable intelligent search, with the ability to interpret/summarise the results rather than just returning the relevant messages. I wanted to be able to find and interpret that one crucial piece of information hidden in an ocean of conversations!
The biggest challenge? Adapting pre-existing code to work with text files (the exported Slack messages) and generating BERT embeddings (vector encodings that capture the context of words) from those files. The embeddings were the key – they were fed into a Qdrant vector database, the backbone of the search engine.
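To give a feel for what the vector database is doing, here is a minimal sketch in Rust of the core idea: rank stored message embeddings by cosine similarity to a query embedding. In the real tool Qdrant handles this at scale; the three-dimensional vectors below are toy stand-ins, not actual BERT outputs.

```rust
// Toy demonstration of embedding-based search: score each stored
// "embedding" against the query embedding and rank by similarity.

fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    // Hypothetical messages with made-up 3-d embeddings.
    let messages = vec![
        ("deploy failed on friday", vec![0.9, 0.1, 0.0]),
        ("lunch options near the office", vec![0.0, 0.2, 0.9]),
        ("rollback fixed the deploy", vec![0.8, 0.3, 0.1]),
    ];
    // A made-up query embedding, e.g. for "what happened to the deploy?"
    let query = vec![1.0, 0.2, 0.0];

    // Rank messages by similarity, highest first.
    let mut ranked: Vec<_> = messages
        .iter()
        .map(|(text, emb)| (*text, cosine_similarity(&query, emb)))
        .collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());

    for (text, score) in &ranked {
        println!("{score:.3}  {text}");
    }
}
```

With real BERT embeddings the vectors have hundreds of dimensions, which is why an indexed store like Qdrant beats a brute-force loop like this one.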
It wasn't just about coding; it was about using existing AI models (BERT and Mistral7b) to meet my requirements. The result? A proof-of-concept command-line tool that provides a great start for future experiments. It was written in Rust and made use of the Orca project to implement the search. The code for the query tool can be found here.
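The "interpret/summarise" step is essentially retrieval-augmented generation: the top-scoring messages are stitched into a prompt for the language model. A rough sketch of that prompt assembly is below; the prompt wording is illustrative, not the exact template the tool uses.

```rust
// Sketch of assembling retrieved messages into an LLM prompt
// (retrieval-augmented generation). The template text is an
// illustrative assumption, not the tool's actual prompt.

fn build_prompt(question: &str, retrieved: &[&str]) -> String {
    let mut prompt = String::from(
        "Answer the question using only the Slack messages below.\n\n",
    );
    for (i, msg) in retrieved.iter().enumerate() {
        prompt.push_str(&format!("Message {}: {}\n", i + 1, msg));
    }
    prompt.push_str(&format!("\nQuestion: {question}\nAnswer:"));
    prompt
}

fn main() {
    let retrieved = ["deploy failed on friday", "rollback fixed the deploy"];
    let prompt = build_prompt("What happened to the deploy?", &retrieved);
    println!("{prompt}");
}
```

The resulting string would then be sent to a model such as Mistral7b, which answers from the retrieved context rather than from the whole archive.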
For those interested, you can find more about the Orca project here. The models BERT and Mistral7b can be obtained from popular repositories like Hugging Face or their respective project pages.
I have to say this would not have been possible without the community and the ongoing collaboration on display on sites like Hugging Face.
The key thing is that AI is not just a buzzword (it is a buzzword, but it is also much more); it's a tool that, when wielded carefully, can lead to useful applications that were simply not possible before.