
English | 中文

# Welcome to the Chat Demo Application

This is a simple demo application designed to showcase multi-turn conversations and project Q&A functionalities.

## Features

- Supports multi-turn conversations
- Supports online Q&A
- Supports uploading local zip files for project Q&A and modifications
- Supports inputting GitHub project links for Q&A and modifications

## Installation

  1. Clone the repository locally.
  2. Start the model. You can deploy it with vllm or ollama, both of which serve an OpenAI-compatible API, and set the deployed api_base and api_key; alternatively, request an API key for the CodeGeeX API. Fill in the corresponding information in the .env file:

     ```
     # Using the open.bigmodel.cn API
     openai_api_key = ""
     openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
     model_name = "codegeex-4"

     # Using vllm
     openai_api_key = "EMPTY"
     openai_api_base = "http://xxxx:xxxx/v1"
     model_name = "codegeex4-all-9b"
     ```
  3. Fill in the corresponding model information, and the bing_search_api key if you want to try online queries, in the .env file. During a chat, toggle the online-query switch on the left side of the input box; it is off by default.

  4. Install dependencies: `pip install -r requirements.txt`

  5. Run the application: `chainlit run run.py --port 8899`
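
After completing step 2, you can smoke-test the configured endpoint before launching the app. The snippet below is a minimal sketch, not part of the project's code: the `parse_env` helper is hypothetical (the project may load .env differently), and it assumes the `openai` Python package is installed and that the endpoint at `openai_api_base` is reachable.

```python
def parse_env(text):
    """Parse simple `key = "value"` lines like those in the .env example above."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blanks, comments, and anything that is not a key=value pair.
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip().strip('"')
    return config


if __name__ == "__main__":
    from openai import OpenAI  # pip install openai

    cfg = parse_env(open(".env").read())
    client = OpenAI(api_key=cfg["openai_api_key"], base_url=cfg["openai_api_base"])
    reply = client.chat.completions.create(
        model=cfg["model_name"],
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(reply.choices[0].message.content)
```

If this prints a model response, the same .env values should work for the chainlit app.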

## Notes

Please ensure your network environment can access the CodeGeeX API.

## Acknowledgments

Thank you for using our application. If you have any questions or suggestions, please feel free to contact us. We look forward to your feedback and are committed to providing you with better service.