Repository contents: .chainlit/, llm/, prompts/, public/, utils/, .env, chainlit.md, chainlit_zh-CN.md, readme.md, readme_zh.md, requirements.txt, run.py

readme.md

English | 中文

Welcome to Chat Demo Application

This is a simple demo application designed to showcase multi-turn conversations and project Q&A functionalities.

Features

  • Supports multi-turn conversations
  • Supports online Q&A
  • Supports uploading local zip files for project Q&A and modifications
  • Supports inputting GitHub project links for Q&A and modifications

Installation

  1. Clone the repository locally.
  2. Start the model. You can deploy a model with vLLM or Ollama and expose an OpenAI-format API (then set the deployed api_base and api_key), or obtain an API key for the CodeGeeX API. Fill in the corresponding information in the .env file, for example:

    # Using open.bigmodel.cn API
    openai_api_key = ""
    openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
    model_name = "codegeex-4"

    # Using vllm
    openai_api_key = "EMPTY"
    openai_api_base = "http://xxxx:xxxx/v1"
    model_name = "codegeex4-all-9b"
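
Below is a minimal sketch of how these settings might be consumed, assuming the python-dotenv and openai packages; the variable names follow the .env example above, but the project's actual loading code may differ.

    # Minimal sketch: load the .env settings and call an OpenAI-format endpoint.
    # Assumes the python-dotenv and openai packages; not the project's actual code.
    import os
    from dotenv import load_dotenv
    from openai import OpenAI

    load_dotenv()  # reads openai_api_key, openai_api_base, model_name from .env

    client = OpenAI(
        api_key=os.getenv("openai_api_key"),
        base_url=os.getenv("openai_api_base"),
    )
    response = client.chat.completions.create(
        model=os.getenv("model_name"),
        messages=[{"role": "user", "content": "Hello, CodeGeeX!"}],
    )
    print(response.choices[0].message.content)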
    
  3. Fill in the corresponding model information in the .env file, and set bing_search_api if you want to use online queries. During a chat, turn on the online query switch on the left side of the input box; it is off by default.
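
The online query feature needs a web search backend. As a rough illustration only, the helper below assumes the Bing Web Search v7 REST API and reads the key from the bing_search_api setting; the project's actual search integration may differ.

    # Hypothetical helper for the online query feature.
    # Assumes the Bing Web Search v7 REST API and the bing_search_api .env setting;
    # not the project's actual code.
    import os
    import requests

    def bing_search(query, count=5):
        """Return the top result snippets for a query via Bing Web Search."""
        resp = requests.get(
            "https://api.bing.microsoft.com/v7.0/search",
            headers={"Ocp-Apim-Subscription-Key": os.getenv("bing_search_api")},
            params={"q": query, "count": count},
            timeout=10,
        )
        resp.raise_for_status()
        return [item["snippet"] for item in resp.json()["webPages"]["value"]]

    print(bing_search("CodeGeeX4"))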

  4. Install dependencies: pip install -r requirements.txt

  5. Run the application: chainlit run run.py --port 8899
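
To give a sense of what that command launches, here is a bare-bones Chainlit app; it only illustrates the framework's message hook, and the repository's run.py implements the actual demo logic.

    # Bare-bones Chainlit skeleton; illustrative only, not the repository's run.py.
    import chainlit as cl

    @cl.on_message
    async def on_message(message: cl.Message):
        # Echo the user's input; the real app forwards it to the configured model.
        await cl.Message(content=f"You said: {message.content}").send()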

Notes

Please ensure your network environment can access the CodeGeeX API.

Acknowledgments

Thank you for using our application. If you have any questions or suggestions, please feel free to contact us. We look forward to your feedback and are committed to providing you with better service.