
Merge pull request #44 from XingYu-Zhong/main

Optimize repo demo readme
Shaobo 1 year ago
parent commit 683913d09c

+ 20 - 29
repodemo/chainlit.md

@@ -1,50 +1,41 @@
-# CodeGeeX
-
-# Welcome to My Chat Demo Application
-
-This is a simple demonstration application.
-
-## Instructions
-
-1. Enter your question.
-2. Wait for a response.
-3. Enjoy the conversation!
+![](../resources/logo.jpeg)
+## Welcome to Chat Demo Application
+![](https://github.com/user-attachments/assets/f2cb6c13-a715-4adf-bf3a-b9ca5ee165df)
+This is a simple demo application designed to showcase multi-turn conversations and project Q&A functionalities.
 
 ## Features
 
-- Supports multi-turn conversations.
-- Supports online Q&A.
-- Supports uploading local zip packages for project Q&A and modifications.
-- Supports inputting GitHub project links for project Q&A and modifications.
-
+- Supports multi-turn conversations
+- Supports online Q&A
+- Supports uploading local zip files for project Q&A and modifications
+- Supports inputting GitHub project links for Q&A and modifications
+![](https://github.com/user-attachments/assets/ff6f6e32-457c-4733-815b-b639e4197899)
 ## Installation
 
 1. Clone the repository locally.
-2. Start the model. You can deploy the model using vllm or ollama, provide the OpenAI request format, and set the deployed `api_base` and `api_key`. Alternatively, visit [CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4) to get the API key.
+2. Start the model. You can deploy the model via vllm or ollama, provide the OpenAI request format, set the deployed `api_base` and `api_key`, or access the [CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4) to get an API key. Fill in the corresponding information in the .env file.
+![](https://github.com/user-attachments/assets/6aabc3e4-a930-4853-b511-68b9389fa42f)
 
 ```shell
-#use open.bigmodel.cn api
-openai_api_key = "<|apikey|>"
+# Using open.bigmodel.cn API
+openai_api_key = ""
 openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
 model_name = "codegeex-4"
-#use vllm
+# Using vllm
 openai_api_key = "EMPTY"
 openai_api_base = "http://xxxx:xxxx/v1"
 model_name = "codegeex4-all-9b"
 ```
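The `.env` fragment above is plain `key = "value"` lines; the demo loads it via python-dotenv (see requirements.txt), but the mechanics can be sketched with the standard library alone. The `parse_env` helper below is illustrative, not part of the repo:

```python
# Minimal sketch of reading .env-style settings like those shown above.
# The actual demo uses python-dotenv; parse_env is a hypothetical
# stand-in that handles `key = "value"` lines and `#` comments.

def parse_env(text: str) -> dict:
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip().strip('"')
    return config

env_text = '''
# Using vllm
openai_api_key = "EMPTY"
openai_api_base = "http://localhost:8000/v1"
model_name = "codegeex4-all-9b"
'''

config = parse_env(env_text)
print(config["model_name"])  # codegeex4-all-9b
```

Note that later keys overwrite earlier ones, so leaving both the open.bigmodel.cn block and the vllm block uncommented in one `.env` file means the vllm values take effect.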
 
-3. Fill in the corresponding model information and `bing_search_api` (if you want to experience online search) in the `.env` file.
-4. Install dependencies: `pip install -r requirements.txt`.
-5. Run the application: `chainlit run run.py --port 8899`.
+3. Fill in the corresponding model information and `bing_search_api` (if you want to experience online queries) in the .env file. Turn on the online query switch on the left side of the input box during the chat, which is off by default.
+![](https://github.com/user-attachments/assets/e9d9b620-cfc7-4c2d-bedc-a01d41f79e29)
+4. Install dependencies: `pip install -r requirements.txt`
+5. Run the application: `chainlit run run.py --port 8899`
 
-## Note
+## Notes
 
 Please ensure your network environment can access the CodeGeeX API.
 
-## Disclaimer
-
-This application is for educational and research purposes only and should not be used for any commercial purposes. The developer is not responsible for any loss or damage caused by the use of this application.
-
-## Acknowledgements
+## Acknowledgments
 
 Thank you for using our application. If you have any questions or suggestions, please feel free to contact us. We look forward to your feedback and are committed to providing you with better service.

+ 20 - 26
repodemo/chainlit_zh-CN.md

@@ -1,51 +1,45 @@
-# CodeGeeX
+![](../resources/logo.jpeg)
+## 欢迎使用Chat Demo应用
+![](https://github.com/user-attachments/assets/f2cb6c13-a715-4adf-bf3a-b9ca5ee165df)
+这是一个简单的演示应用程序,用于展示多轮对话和项目问答功能。
 
-# 欢迎使用我的chat demo应用
-
-这是一个简单的演示应用程序。
-
-## 使用说明
-
-1. 输入您的问题
-2. 等待回复
-3. 享受对话!
 
 ## 功能
 
--  支持多轮对话
--  支持联网问答
--  支持上传本地zip压缩包项目,可以进行项目问答和对项目进行修改
--  支持输入GitHub链接项目,可以进行项目问答和对项目进行修改。
+- 支持多轮对话
+- 支持联网问答
+- 支持上传本地zip压缩包项目进行问答和修改
+- 支持输入GitHub链接项目进行问答和修改
+![](https://github.com/user-attachments/assets/ff6f6e32-457c-4733-815b-b639e4197899)
 
 ## 安装
 
 1. 克隆仓库到本地
-2. 启动模型,可以通过vllm或者ollama部署模型,提供openai的请求格式,设置部署的api_base和api_key,或者访问[CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4)获取apikey.
+2. 启动模型,可以通过vllm或者ollama部署模型,提供openai的请求格式,设置部署的api_base和api_key,或者访问[CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4)获取apikey。在.env文件中填写对应的信息
+![](https://github.com/user-attachments/assets/6aabc3e4-a930-4853-b511-68b9389fa42f)
 
 ```shell
-#use open.bigmodel.cn api
-openai_api_key = "<|apikey|>"
+# 使用open.bigmodel.cn API
+openai_api_key = ""
 openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
 model_name = "codegeex-4"
-#use vllm
+# 使用vllm
 openai_api_key = "EMPTY"
 openai_api_base = "http://xxxx:xxxx/v1"
 model_name = "codegeex4-all-9b"
 ```
 
-3. 到.env文件里填写对应模型信息和bing_search_api(如果需要体验联网查询)
+3. 在.env文件中填写对应模型信息和bing_search_api(如果需要体验联网查询),并且在聊天的时候在输入框左侧打开
+联网查询开关,默认关闭。
+![](https://github.com/user-attachments/assets/e9d9b620-cfc7-4c2d-bedc-a01d41f79e29)
 4. 安装依赖:`pip install -r requirements.txt`
-5. 运行应用:`chainlit run run.py --port 8899` 
+5. 运行应用:`chainlit run run.py --port 8899`
 
-
-## 注意
+## 注意事项
 
 请确保您的网络环境可以访问CodeGeeX的API。
 
-## 免责声明
-
-本应用仅供学习和研究使用,不得用于任何商业用途。开发者不对因使用本应用而导致的任何损失或损害负责。
 
 ## 感谢
 
-感谢您使用我们的应用。如果您有任何问题或建议,请随时联系我们。我们期待您的反馈,并致力于为您提供更好的服务。
+感谢您使用我们的应用。如果您有任何问题或建议,请随时联系我们。我们期待您的反馈,并致力于为您提供更好的服务。

+ 21 - 29
repodemo/readme.md

@@ -1,50 +1,42 @@
-# CodeGeeX
-
-# Welcome to My Chat Demo Application
-
-This is a simple demonstration application.
-
-## Instructions
-
-1. Enter your question.
-2. Wait for a response.
-3. Enjoy the conversation!
+![](../resources/logo.jpeg)
+[English](./readme.md) | [中文](./readme_zh.md)
+## Welcome to Chat Demo Application
+![](https://github.com/user-attachments/assets/f2cb6c13-a715-4adf-bf3a-b9ca5ee165df)
+This is a simple demo application designed to showcase multi-turn conversations and project Q&A functionalities.
 
 ## Features
 
-- Supports multi-turn conversations.
-- Supports online Q&A.
-- Supports uploading local zip packages for project Q&A and modifications.
-- Supports inputting GitHub project links for project Q&A and modifications.
-
+- Supports multi-turn conversations
+- Supports online Q&A
+- Supports uploading local zip files for project Q&A and modifications
+- Supports inputting GitHub project links for Q&A and modifications
+![](https://github.com/user-attachments/assets/ff6f6e32-457c-4733-815b-b639e4197899)
 ## Installation
 
 1. Clone the repository locally.
-2. Start the model. You can deploy the model using vllm or ollama, provide the OpenAI request format, and set the deployed `api_base` and `api_key`. Alternatively, visit [CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4) to get the API key.
+2. Start the model. You can deploy the model via vllm or ollama, provide the OpenAI request format, set the deployed `api_base` and `api_key`, or access the [CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4) to get an API key. Fill in the corresponding information in the .env file.
+![](https://github.com/user-attachments/assets/6aabc3e4-a930-4853-b511-68b9389fa42f)
 
 ```shell
-#use open.bigmodel.cn api
-openai_api_key = "<|apikey|>"
+# Using open.bigmodel.cn API
+openai_api_key = ""
 openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
 model_name = "codegeex-4"
-#use vllm
+# Using vllm
 openai_api_key = "EMPTY"
 openai_api_base = "http://xxxx:xxxx/v1"
 model_name = "codegeex4-all-9b"
 ```
 
-3. Fill in the corresponding model information and `bing_search_api` (if you want to experience online search) in the `.env` file.
-4. Install dependencies: `pip install -r requirements.txt`.
-5. Run the application: `chainlit run run.py --port 8899`.
+3. Fill in the corresponding model information and `bing_search_api` (if you want to experience online queries) in the .env file. Turn on the online query switch on the left side of the input box during the chat, which is off by default.
+![](https://github.com/user-attachments/assets/e9d9b620-cfc7-4c2d-bedc-a01d41f79e29)
+4. Install dependencies: `pip install -r requirements.txt`
+5. Run the application: `chainlit run run.py --port 8899`
 
-## Note
+## Notes
 
 Please ensure your network environment can access the CodeGeeX API.
 
-## Disclaimer
-
-This application is for educational and research purposes only and should not be used for any commercial purposes. The developer is not responsible for any loss or damage caused by the use of this application.
-
-## Acknowledgements
+## Acknowledgments
 
 Thank you for using our application. If you have any questions or suggestions, please feel free to contact us. We look forward to your feedback and are committed to providing you with better service.

+ 45 - 0
repodemo/readme_zh.md

@@ -0,0 +1,45 @@
+![](../resources/logo.jpeg)
+[English](./readme.md) | [中文](./readme_zh.md)
+## 欢迎使用Chat Demo应用
+![](https://github.com/user-attachments/assets/f2cb6c13-a715-4adf-bf3a-b9ca5ee165df)
+这是一个简单的演示应用程序,用于展示多轮对话和项目问答功能。
+
+
+## 功能
+
+- 支持多轮对话
+- 支持联网问答
+- 支持上传本地zip压缩包项目进行问答和修改
+- 支持输入GitHub链接项目进行问答和修改
+![](https://github.com/user-attachments/assets/ff6f6e32-457c-4733-815b-b639e4197899)
+## 安装
+
+1. 克隆仓库到本地
+2. 启动模型,可以通过vllm或者ollama部署模型,提供openai的请求格式,设置部署的api_base和api_key,或者访问[CodeGeeX API](https://open.bigmodel.cn/dev/api#codegeex-4)获取apikey。在.env文件中填写对应的信息
+![](https://github.com/user-attachments/assets/6aabc3e4-a930-4853-b511-68b9389fa42f)
+
+```shell
+# 使用open.bigmodel.cn API
+openai_api_key = ""
+openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
+model_name = "codegeex-4"
+# 使用vllm
+openai_api_key = "EMPTY"
+openai_api_base = "http://xxxx:xxxx/v1"
+model_name = "codegeex4-all-9b"
+```
+
+3. 在.env文件中填写对应模型信息和bing_search_api(如果需要体验联网查询),并且在聊天的时候在输入框左侧打开
+联网查询开关,默认关闭。
+![](https://github.com/user-attachments/assets/e9d9b620-cfc7-4c2d-bedc-a01d41f79e29)
+4. 安装依赖:`pip install -r requirements.txt`
+5. 运行应用:`chainlit run run.py --port 8899`
+
+## 注意事项
+
+请确保您的网络环境可以访问CodeGeeX的API。
+
+
+## 感谢
+
+感谢您使用我们的应用。如果您有任何问题或建议,请随时联系我们。我们期待您的反馈,并致力于为您提供更好的服务。

+ 2 - 1
repodemo/requirements.txt

@@ -1,4 +1,5 @@
 chainlit==1.1.305
 beautifulsoup4
 python-dotenv
-gitpython
+gitpython
+openai==1.35.4

+ 3 - 3
repodemo/run.py

@@ -49,8 +49,8 @@ def tools_choose_agent(input_text):
 async def chat_profile():
     return [
         cl.ChatProfile(
-            name="联网聊天",
-            markdown_description="聊天demo:支持多轮对话。支持联网回答用户问题。默认联网,如不联网在输入框左边关闭联网功能。",
+            name="chat聊天",
+            markdown_description="聊天demo:支持多轮对话。支持联网回答用户问题(需要在输入框左边打开联网开关)。默认联网,如不联网在输入框左边关闭联网功能。",
             starters=[
                 cl.Starter(
                     label="请你用python写一个快速排序。",
@@ -107,7 +107,7 @@ async def start():
             Switch(
                 id="is_online",
                 label="CodeGeeX4 - is_online",
-                initial=True
+                initial=False
             ),
         ]
     ).send()
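The `initial=False` change above makes online search opt-in: the Chainlit `Switch` widget stores its state in a settings dict keyed by `id`. How downstream code might branch on that switch can be sketched as follows (`needs_web_search` is a hypothetical helper, not a function from run.py):

```python
# Hypothetical helper mirroring the is_online switch semantics from the
# diff: the widget's state lives in a settings dict under its id, and
# web search runs only when the user has explicitly turned it on.

def needs_web_search(settings: dict) -> bool:
    # Default mirrors initial=False — search stays off unless enabled.
    return bool(settings.get("is_online", False))

print(needs_web_search({}))                   # switch untouched -> False
print(needs_web_search({"is_online": True}))  # user enabled it  -> True
```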