
BioGenie

A RAG-based AI assistant for biological data analysis


Bioinformatics Software Tutorial Chatbot


Keywords

Section 003, React.js, JavaScript, HTML, CSS, Python, Flask, Ollama, Retrieval-Augmented Generation (RAG), Generative AI, Large Language Model (LLM)

Project Overview

This project aims to create a chatbot powered by Generative AI and RAG to help bioinformatics researchers solve problems by providing accurate answers and tutorials using predefined methods.

Project Abstract

The Bioinformatics Chatbot is a web application designed to help bioinformatics researchers work through complex problems more efficiently. The application enables users to ask the chatbot questions and receive relevant, accurate answers. Using generative language-model technology and a conversational interface, the chatbot guides researchers with step-by-step tutorials for complex bioinformatics questions. The methods provided to the chatbot serve as the foundation for generating precise responses, enabling users to save time and focus on advancing their research. By harnessing the strengths of AI, the web application changes how researchers tackle difficult problems, helping them make progress faster.

High-Level Requirements

When the user opens the link to our web application, they land on a welcome page that briefly summarizes the application's purpose and lists the contributors who built the Bioinformatics Chatbot. To begin using the chatbot, the user clicks the "Begin New Chat" button, which activates the chat interface.

The chat interface is the crux of our application. The user types a question and submits it by clicking the black paper plane icon; the chatbot's answers are grounded in user-provided documentation. If the user is unhappy with a response or wants to change their question, they can edit their query by clicking the pencil icon. Additionally, once the user is done conversing with the chatbot, they can download the conversation history for later reference by pressing the download button in the bottom right-hand corner.

For the chatbot to deliver quality answers tailored to bioinformatics, there is a backend application where users can upload relevant documentation. After logging in, users can navigate to the documentation upload page by clicking the "Upload PDF" link. From there, the user can upload bioinformatics documentation, which is stored in our database for the LLM to reference when producing accurate responses.
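At a high level, the backend follows the standard RAG pattern: the uploaded documentation is split into chunks and stored, and at question time the chunks most relevant to the user's query are retrieved and passed to the LLM as context. The sketch below illustrates that general flow against Ollama's HTTP API; the model names, the in-memory chunk list, and the helper functions are illustrative assumptions, not the project's actual code.

    # rag_sketch.py -- illustrative retrieval-augmented generation flow (not the project's actual code)
    import requests

    OLLAMA = "http://localhost:11434"  # default Ollama endpoint; model names below are assumptions

    def embed(text: str) -> list[float]:
        # Ask Ollama for an embedding vector for the given text.
        r = requests.post(f"{OLLAMA}/api/embeddings",
                          json={"model": "nomic-embed-text", "prompt": text})
        return r.json()["embedding"]

    def cosine(a: list[float], b: list[float]) -> float:
        # Cosine similarity between two embedding vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    def answer(question: str, chunks: list[str]) -> str:
        # Rank stored documentation chunks by similarity to the question,
        # then ask the LLM to answer using only the top matches as context.
        q_vec = embed(question)
        ranked = sorted(chunks, key=lambda c: cosine(q_vec, embed(c)), reverse=True)
        context = "\n\n".join(ranked[:3])
        prompt = f"Answer using only this documentation:\n{context}\n\nQuestion: {question}"
        r = requests.post(f"{OLLAMA}/api/generate",
                          json={"model": "llama3", "prompt": prompt, "stream": False})
        return r.json()["response"]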

How to run this project in your own environment?

Note: You will need to upload your own documentation for the chatbot to work.

  1. Install and open Docker Desktop and VS Code.
  2. Download and unzip the source code from the most recent project release.
  3. Open the unzipped project folder in VS Code, then create a .env file and a database.env file inside the backend folder.
  4. In the .env file, add the following (these are the standard variables read by the official Postgres Docker image):
#database variables
POSTGRES_USER=admin
POSTGRES_PASSWORD=admin
POSTGRES_DB=database
POSTGRES_HOST_AUTH_METHOD=trust
  5. In the database.env file, add the same values:
#database variables
POSTGRES_USER=admin
POSTGRES_PASSWORD=admin
POSTGRES_DB=database
POSTGRES_HOST_AUTH_METHOD=trust
  6. If you are on Windows, convert docker-entrypoint.sh (in the backend folder) and docker-compose.yml from CRLF to LF line endings in VS Code: press Ctrl+Shift+P, choose "Change End of Line Sequence", select LF, and save the files. There won't be any visible change, but the end-of-line sequence will be updated. (A scripted alternative is sketched after this list.)
  7. In the VS Code terminal, cd into the project folder for the release you downloaded; after extraction, the project is in a folder named after the release.
  • For example, for release v1.0.0: cd project-003-bioinformatics-chatbot-1.0.0
  • For release v2.0.0: cd project-003-bioinformatics-chatbot-2.0.0
  8. Run this command in the terminal; it may take a while to build because of the LLM:
    docker compose -f docker-compose.yml up --build
  9. Frontend host: http://localhost:5173/
    Backend host: http://localhost:444/
    Default username: admin
    Default password: admin
    (A quick connectivity check is sketched below.)
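
If you would rather script the Windows line-ending fix from step 6 than use the VS Code menu, a small Python one-off can rewrite the two files with LF endings. This is a minimal sketch, not part of the project; the paths assume you run it from the project root.

    # fix_line_endings.py -- rewrite CRLF as LF in the two files from step 6 (illustrative sketch)
    from pathlib import Path

    # Paths assume the repository layout described in step 3.
    for path in (Path("backend/docker-entrypoint.sh"), Path("docker-compose.yml")):
        data = path.read_bytes()
        # Replace Windows-style CRLF endings with Unix-style LF and write the file back.
        path.write_bytes(data.replace(b"\r\n", b"\n"))
        print(f"Converted {path} to LF line endings")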
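
Once docker compose reports the containers as running, you can sanity-check that the two hosts from step 9 respond. This is a small illustrative check using the requests library and the default ports above; it is not part of the project.

    # smoke_test.py -- confirm the default frontend and backend hosts respond (illustrative sketch)
    import requests

    # Default hosts from step 9; adjust if you changed the ports in docker-compose.yml.
    for url in ("http://localhost:5173/", "http://localhost:444/"):
        try:
            response = requests.get(url, timeout=5)
            print(f"{url} -> HTTP {response.status_code}")
        except requests.RequestException as exc:
            print(f"{url} is not reachable yet: {exc}")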

How to host and port-forward this project for use in your own lab?

(Please check the hardware requirements below before proceeding.)

  1. Click the Hosting link to access the branch created specifically for the hosting service.
  2. Then read this documentation for more information.

Required Resources

Hardware Requirements for local development

  • A computer with a modern operating system (Windows, macOS, or Linux)
  • At least 8 GB of RAM
  • At least 2 GB of free disk space

Hardware Requirements for Hosting service

  • A computer with a modern operating system (Windows, macOS, or Linux)
  • At least 8 GB of RAM
  • At least 10 GB of VRAM (a 12 GB NVIDIA GPU with CUDA support is preferred)
  • At least 2 GB of free disk space

Software Requirements

  • An up-to-date web browser:
    • Google Chrome: Version 131 or later
    • Mozilla Firefox: Version 133 or later
    • Microsoft Edge: Version 131 or later
    • Apple Safari: Version 18.0 or later
  • Docker Desktop (handles all required dependencies for development)
  • Internet access

Collaborators

kBunn411 (Keith C Bunn)
ahgoldmeer (Amitai Goldmeer)
ishmam02 (Ishmam Kabir)
khanhquocng2801 (Khanh Q Nguyen)
korlovskiy (Katerina Orlovskiy)
JustinTruong456 (Justin Truong)
chroy2 (Troy K Witmer)