
Use Sentiment Analysis



This guide will help you set up and use Sentiment Analysis by NeurochainAI for monitoring sentiment in Telegram group chats. With this setup, you'll be able to analyze group messages, track sentiment, and collect insights using Docker and Python.


Prerequisites

Before starting, ensure you have the following software installed:

  • Python 3.10 or higher - Download Python

  • Docker - Download Docker

  • Docker Compose - Download Docker Compose
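
To confirm the tools are available, you can print their versions from a terminal (newer Docker installations ship Compose as the `docker compose` plugin rather than a standalone `docker-compose` binary, so adjust the last command if needed):

# Check installed versions
python3 --version
docker --version
docker-compose --version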


Step 1: Clone the Repository

Begin by cloning the Sentiment Analysis repository and navigating to the project directory:

# Clone the repository
git clone https://github.com/NeuroChainAi/python-telegram-sentiment.git

# Navigate to the project directory
cd python-telegram-sentiment
    

Step 2: Install Dependencies

Install the Python dependencies from the requirements file:

# Install dependencies
pip install -r requirements.txt
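
Optionally, you can install the dependencies into a virtual environment so they stay isolated from your system Python (not required by this guide, just common practice):

# Optional: create and activate a virtual environment before installing
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt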
    

Step 3: Configure Environment Variables

Create a `.env` file with the following environment variables:

# .env file content
NEUROCHAIN_API_KEY=your_neurochain_api_key
TELEGRAM_API_KEY=your_telegram_api_key
DATABASE_URL=postgres://your_user:your_password@postgres/your_database
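
For reference, the snippet below is a minimal sketch of how a Python script can read these values at runtime. It assumes the python-dotenv package is installed; the repository's own loading code may differ.

# Sketch: load the .env values in Python (assumes python-dotenv is installed)
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the current directory

neurochain_api_key = os.environ["NEUROCHAIN_API_KEY"]
telegram_api_key = os.environ["TELEGRAM_API_KEY"]
database_url = os.environ["DATABASE_URL"]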
    

Step 4: Set Up Docker Compose

Docker Compose sets up both PostgreSQL and the Telegram bot. Use the following command to start the services:

# Build and start Docker services
docker-compose up --build
    

This command will:

  • Start a PostgreSQL container

  • Build and run the Telegram bot container
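
If you would rather keep your terminal free, the same command can run the stack in the background; the -d flag detaches the containers:

# Build and start the services in detached (background) mode
docker-compose up --build -d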

Step 5: Docker Compose File Structure

The `docker-compose.yml` file defines both PostgreSQL and the bot containers. Below is the structure:

version: '3.8'

services:
  # PostgreSQL container
  postgres:
    image: postgres:14
    environment:
      POSTGRES_USER: your_user
      POSTGRES_PASSWORD: your_password
      POSTGRES_DB: your_database
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  # Telegram bot container
  telegram-bot:
    build: ./bot  # Path to the bot folder where the Dockerfile is located
    environment:
      DATABASE_URL: postgres://your_user:your_password@postgres/your_database  # Database connection string
      NEUROCHAIN_API_KEY: your_neurochain_api_key
      TELEGRAM_API_KEY: your_telegram_api_key
    depends_on:
      - postgres  # Ensures PostgreSQL starts before the bot
    volumes:
      - ./bot:/app  # Mount the bot folder to the container
    restart: always

volumes:
  postgres_data:
    driver: local
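
If you prefer not to hard-code credentials in docker-compose.yml, Docker Compose's standard env_file key can load them from the .env file created in Step 3 instead. The fragment below is a sketch of that variation for the bot service:

  # Alternative: load the variables from the .env file instead of listing them inline
  telegram-bot:
    build: ./bot
    env_file:
      - .env
    depends_on:
      - postgres
    volumes:
      - ./bot:/app
    restart: always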
    

Step 6: Running the Bot

Once the containers are running, the bot should start interacting with the Telegram API and PostgreSQL. You can monitor logs to check the status:

# Monitor logs
docker-compose logs -f
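
Once messages start flowing, you can also open a psql shell inside the PostgreSQL container to inspect the stored data (substitute the user and database names from your compose file; the exact tables depend on the bot's schema):

# Open an interactive psql session in the postgres service
docker-compose exec postgres psql -U your_user -d your_database

# Inside psql, list the tables the bot has created
\dt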
    

Step 7: Stopping the Containers

To stop and remove the containers, use the following command:

docker-compose down
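
If you also want to delete the stored data, add the -v flag, which removes the named postgres_data volume along with the containers (this permanently deletes the collected messages and sentiment records):

# Stop the containers and remove the postgres_data volume
docker-compose down -v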
    
