ChatGPT-Turbo SMS
ChatGPT-Turbo SMS is a Flask application that allows users to send SMS text messages to ChatGPT-Turbo using Twilio and receive instant responses. The application can be run locally or hosted on a service like DigitalOcean, where tmux keeps it active 24/7. It uses the Flask, Twilio, OpenAI, and ngrok Python libraries.
Video Demonstration
A video demonstration of the ChatGPT-Turbo SMS application, hosted on DigitalOcean and utilizing tmux to keep the session open after closing the console, is shown below.
https://user-images.githubusercontent.com/63890666/225838063-f71c5a9b-fe22-4882-ab70-b3c20e26ba11.mp4
Features
- Send text messages to ChatGPT-Turbo and receive instant responses
- Utilizes the Twilio API for SMS handling and the OpenAI API for processing user queries
- Supports local deployment and can also be deployed on DigitalOcean, using tmux for persistent sessions
- Utilizes ngrok to expose the application so Twilio can reach it
- The system role can be changed by replacing the content 'You are a helpful assistant.' with anything else on line 25 of app.py:
  `{"role": "system", "content": "You are a helpful assistant."}`
Prerequisites
- Python 3.6 or later
- Twilio account with an SMS-enabled phone number
- OpenAI API key
- ngrok
Installation
1. Clone the repository:

   ```
   git clone https://github.com/Kaludii/ChatGPT-Turbo-SMS.git
   cd ChatGPT-Turbo-SMS
   ```

2. Create and activate a virtual environment:

   ```
   python3 -m venv venv
   source venv/bin/activate
   ```

3. Install the required packages:

   ```
   pip install -r requirements.txt
   ```

4. Create a `.env` file in the project root directory and copy your Twilio Account SID, Auth Token, and OpenAI API Key into it:

   ```
   TWILIO_ACCOUNT_SID=your_twilio_account_sid
   TWILIO_AUTH_TOKEN=your_twilio_auth_token
   OPENAI_API_KEY=your_openai_api_key
   ```

   Replace `your_twilio_account_sid`, `your_twilio_auth_token`, and `your_openai_api_key` with the corresponding values from your Twilio and OpenAI accounts.
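For reference, a typical way an app like this can pick up those values at startup is sketched below (this assumes the `python-dotenv` package; the exact loading code in app.py may differ):

```python
import os
from dotenv import load_dotenv

# Read TWILIO_ACCOUNT_SID, TWILIO_AUTH_TOKEN, and OPENAI_API_KEY from .env
load_dotenv()

TWILIO_ACCOUNT_SID = os.getenv("TWILIO_ACCOUNT_SID")
TWILIO_AUTH_TOKEN = os.getenv("TWILIO_AUTH_TOKEN")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
```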
Usage
Running the application locally

1. Start the Flask app:

   ```
   python app.py
   ```

2. Sign up for ngrok at https://ngrok.com/ and connect your authtoken to your account.

3. In a separate terminal, start ngrok:

   ```
   ngrok http 5000
   ```

4. Configure your Twilio phone number's messaging webhook with the generated ngrok URL followed by `/sms`. For example, if your ngrok URL is https://abcd1234.ngrok.io, set the webhook to https://abcd1234.ngrok.io/sms.

   Add the webhook URL to the following two Twilio pages: Phone Numbers > Manage > Active Numbers, and Conversations > Manage > Global Webhooks. Make sure both use HTTP POST, and for the second page make sure "onMessageAdded" is selected in the Post-webhooks section.

5. Send an SMS to your Twilio phone number. ChatGPT-Turbo will process the message and you'll receive an immediate response.
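If you want to sanity-check the endpoint before wiring up Twilio, you can simulate an incoming message with a plain HTTP POST. This is an optional, hypothetical test using the `requests` package (install it separately if it isn't in requirements.txt); Twilio sends the incoming text in the `Body` form field:

```python
import requests

# Simulate Twilio's webhook: it POSTs form-encoded data, including "Body" and "From"
response = requests.post(
    "http://localhost:5000/sms",
    data={"Body": "Hello, ChatGPT!", "From": "+15551234567"},
)

# The app should answer with TwiML (XML) wrapping ChatGPT-Turbo's reply
print(response.status_code)
print(response.text)
```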
Hosting the application on DigitalOcean
To host the application on DigitalOcean as in the video example shown above, follow these steps:

1. Create a DigitalOcean account if you don't have one: https://www.digitalocean.com/

2. Create a new droplet (virtual machine) by clicking "Create" and then "Droplets." Choose an appropriate plan, region, and operating system (e.g., Ubuntu 20.04 LTS) for your droplet. (I used the cheapest $4 plan and it works perfectly; you don't need any more than that.)

3. Use an SSH client (e.g., PuTTY on Windows or `ssh` on macOS/Linux) to connect to your droplet using its IP address, the username `root`, and the provided password.

4. Update your system packages and install the necessary dependencies:

   ```
   sudo apt-get update
   sudo apt-get upgrade
   sudo apt-get install python3-pip python3-venv
   ```

5. Clone the repository or transfer the application files to the droplet using `scp` or an SFTP client, and configure the `.env` file as described in the Installation section.

6. Navigate to the application directory and create a virtual environment:

   ```
   python3 -m venv venv
   ```

7. Activate the virtual environment and install the application's dependencies:

   ```
   source venv/bin/activate
   pip install -r requirements.txt
   ```

8. Set the Flask environment variables:

   ```
   export FLASK_APP=app.py
   export FLASK_ENV=production
   ```

9. Run the Flask application using the `flask run` command:

   ```
   flask run
   ```

   This starts the Flask application on the default port 5000.

10. In another terminal (SSH) session, install ngrok on your DigitalOcean droplet and connect the authtoken from your ngrok account:

    ```
    wget https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
    sudo apt-get install unzip
    unzip ngrok-stable-linux-amd64.zip
    ```

11. Run ngrok to create a tunnel to the Flask application:

    ```
    ngrok http 5000
    ```

12. Copy the ngrok URL from the ngrok window, append `/sms`, and add it as the webhook on the two Twilio pages, as in step 4 of the running-locally section.
Hosting the application on DigitalOcean (using tmux to keep the session active after the console is closed)

1. Deploy a new Droplet on DigitalOcean.

2. Install the required dependencies on the Droplet.

3. Clone the repository and configure the `.env` file as described in the Installation section.

4. Start a new `tmux` session:

   ```
   tmux new -s chatgpt-turbo-sms
   ```

5. Run the application within the `tmux` session:

   ```
   python app.py
   ```

6. In a separate terminal, start a new `tmux` session:

   ```
   tmux new -s chatgpt-turbo-sms-ngrok
   ```

7. Install ngrok on your DigitalOcean droplet and connect the authtoken from your ngrok account:

   ```
   wget https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
   sudo apt-get install unzip
   unzip ngrok-stable-linux-amd64.zip
   ```

8. Start ngrok:

   ```
   ngrok http 5000
   ```

9. Copy the ngrok URL from the ngrok window, append `/sms`, and add it as the webhook on the two Twilio pages, as in step 4 of the running-locally section.

10. Detach from the `tmux` session by pressing `Ctrl-b` followed by `d`.

The application will continue running even after you close your console. To reattach to the tmux sessions, use:

```
tmux attach -t chatgpt-turbo-sms
tmux attach -t chatgpt-turbo-sms-ngrok
```
About the Developer
This app was developed by Kaludii using the libraries linked above. Kaludii is an AI enthusiast who is passionate about developing and applying large language models to solve real-world problems quickly and stress-free.