LLaMA_cosmoChat

LLaMA_cosmoChat is a Python module that generates SQL queries for CosmoHub using open-source LLaMA generative AI models.
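For illustration only, the snippet below sketches the underlying idea with the generic Hugging Face transformers text-generation pipeline; it does not use LLaMA_cosmoChat's own interface, and the model name, prompt, and question are placeholders:

# Sketch only: generic transformers usage, not the LLaMA_cosmoChat API.
from transformers import pipeline

# Hypothetical choice of an instruction-tuned LLaMA model (requires gated access on Hugging Face).
generator = pipeline("text-generation", model="meta-llama/Meta-Llama-3-8B-Instruct")

question = "How many galaxies brighter than i = 22 are in the catalogue?"
prompt = (
    "You write SQL queries for the CosmoHub platform.\n"
    f"Question: {question}\nSQL:"
)

# The generated text should contain a candidate SQL query to run on CosmoHub.
output = generator(prompt, max_new_tokens=128)
print(output[0]["generated_text"])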

Installation

(Last update Sept 3 2024)

  • Create a new conda environment. It is usually best to stay one or two Python versions behind the latest release; we recommend 3.11.
conda create -n chatcosmohub -c conda-forge python=3.11 pip=24.0
conda activate chatcosmohub
  • Clone the repo into your machine and perform an editable installation:
git clone https://github.com/lauracabayol/LLaMA_cosmoChat.git 
cd LLaMA_cosmoChat
pip install -e .
  • To access the LLaMA models, you need to request access from Hugging Face. Once access is granted, declare your Hugging Face access token as an environment variable called 'KERNEL_HIGGINFACE' (see the sketch after this list).

  • If you want to use the notebooks via JupyterHub, you'll also need to install ipykernel and jupytext:

pip install ipykernel jupytext
python -m ipykernel install --user --name chatcosmohub --display-name chatcosmohub
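A minimal sketch of the access-token step above (assuming a bash-like shell and the huggingface_hub package; replace the placeholder value with your own token):

export KERNEL_HIGGINFACE=hf_xxxxxxxxxxxx

Then, in Python, the token can be used to authenticate before downloading a gated LLaMA model:

import os
from huggingface_hub import login

# Log in to Hugging Face with the token stored in the environment variable above.
login(token=os.environ["KERNEL_HIGGINFACE"])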

Tutorials:

The notebooks folder contains a tutorial stored as a .py script; to pair it with an .ipynb notebook, run:

jupytext your_script --to ipynb
