Oasis
Oasis is a local LLM solution that ingests and stores your own data in a local vector database. Oasis enables secure use of large language models without the need to send your data to the cloud.
Tuning
The Oasis system is built on LangChain and HuggingFace, making it easily customizable through a range of parameters. From personality traits like creativity or factuality, to chunk sizing that balances context against speed - you can adjust it all with Oasis.
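As a rough illustration, here is a minimal tuning sketch assuming a LangChain + HuggingFace stack as described above; the model name, chunk sizes, and temperature values are illustrative assumptions, not Oasis defaults.

```python
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.llms import HuggingFacePipeline

# Chunk sizing: larger chunks carry more context per answer,
# smaller chunks retrieve faster and more precisely.
splitter = RecursiveCharacterTextSplitter(
    chunk_size=1000,    # characters per chunk (illustrative value)
    chunk_overlap=100,  # overlap preserves context across chunk boundaries
)

# "Personality" knobs: temperature trades creativity against factuality.
llm = HuggingFacePipeline.from_model_id(
    model_id="google/flan-t5-large",   # any locally runnable HuggingFace model
    task="text2text-generation",
    pipeline_kwargs={
        "temperature": 0.2,            # low = factual, high = creative
        "max_new_tokens": 256,
    },
)
```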
Querying
Interacting with Oasis is as simple as asking a question. Open a new session through the CLI or any of the supported clients and query the database - remember to re-run ingestion periodically to stay up to date! Once you have an answer, you can explore the sources or ask another question.
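A minimal query sketch follows, assuming documents were already ingested into a local Chroma store at ./db with HuggingFace embeddings; the store path, embedding model, and the llm object from the tuning sketch are assumptions for illustration.

```python
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma
from langchain.chains import RetrievalQA

embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
db = Chroma(persist_directory="./db", embedding_function=embeddings)

qa = RetrievalQA.from_chain_type(
    llm=llm,                           # the local model configured earlier
    retriever=db.as_retriever(search_kwargs={"k": 4}),
    return_source_documents=True,      # lets you explore where the answer came from
)

result = qa({"query": "How do I configure chunk sizing?"})
print(result["result"])
for doc in result["source_documents"]:
    print(doc.metadata.get("source"))
```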
Continuous Learning
Ingestion is easy with drag-and-drop file handling, but automation is the future! The core Oasis engine is up and running, and we are working on big things to make the experience seamless: think automatic ingestion of your favourite new sources, crawling your favourite website, or automatically staying up to date with the documentation for various projects. A sketch of the ingestion step is shown below.
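This is a rough sketch of what a re-runnable ingestion pass could look like, assuming plain-text files under ./docs and a persisted Chroma store at ./db; both paths and the embedding model are illustrative assumptions.

```python
from langchain.document_loaders import DirectoryLoader, TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma

# Load every .txt file in the folder; re-run this periodically (or from a
# file-watcher) to keep the database up to date.
loader = DirectoryLoader("./docs", glob="**/*.txt", loader_cls=TextLoader)
documents = loader.load()

chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(documents)

embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
db = Chroma.from_documents(chunks, embeddings, persist_directory="./db")
db.persist()  # write the index to disk so later query sessions can use it
```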
Messaging Service Integration
An intuitive end-user experience is our real goal - not everyone has access to a GPU, and the convenient interface a messaging service provides is ideally suited to Oasis. Oasis will be extendable across a variety of platforms, starting with MS Teams, with Slack, Discord, and others to follow.
Collaboration
I am short on time at the moment, so if you want to jump on board, just let me know! I am easily reachable via the contact page.