
    CI/CD for AI Apps


    Deploying manually works, but it gets tedious fast. What if your app could test itself and deploy automatically every time you push code? That's exactly what CI/CD does.

    What is CI/CD?

    CI/CD stands for Continuous Integration and Continuous Deployment (or Continuous Delivery):

    • Continuous Integration (CI): Automatically runs tests and checks every time code is pushed. Catches bugs before they reach production.
    • Continuous Deployment (CD): Automatically deploys your app after tests pass. No manual deployment steps needed.

    The CI/CD Pipeline

    You push code → CI runs tests → Tests pass? → CD deploys automatically
                                      ↓ fail
                                You get notified, deployment blocked
    

    This pipeline ensures that:

    1. Broken code never reaches production
    2. Deployments happen consistently (no "I forgot to deploy" moments)
    3. Every change is tested the same way
    4. Your team can ship faster with confidence

    GitHub Actions Basics

    GitHub Actions is the most popular CI/CD tool for projects hosted on GitHub. It's free for public repos and has generous free minutes for private repos.

    Key Concepts

    Concept     What It Is                                      Example
    Workflow    An automated process defined in a YAML file     Build and deploy on push
    Trigger     What starts the workflow                        Push to main, pull request
    Job         A set of steps that run on a virtual machine    "build", "test", "deploy"
    Step        An individual action within a job               Run npm install, run tests
    Action      A reusable step made by someone else            actions/checkout@v4

    Where Workflows Live

    Workflows are YAML files in .github/workflows/:

    my-project/
    ├── .github/
    │   └── workflows/
    │       ├── ci.yml          # Runs tests on every push
    │       └── deploy.yml      # Deploys when tests pass
    ├── src/
    └── package.json
    

    Your First CI Workflow

    Let's create a workflow that runs your tests on every push and pull request.

    # .github/workflows/ci.yml
    name: CI
    
    on:
      push:
        branches: [main]
      pull_request:
        branches: [main]
    
    jobs:
      test:
        runs-on: ubuntu-latest
    
        steps:
          - name: Checkout code
            uses: actions/checkout@v4
    
          - name: Set up Node.js
            uses: actions/setup-node@v4
            with:
              node-version: "20"
              cache: "npm"
    
          - name: Install dependencies
            run: npm ci
    
          - name: Run linter
            run: npm run lint
    
          - name: Run type check
            run: npx tsc --noEmit
    
          - name: Run tests
            run: npm test
            env:
              OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

    What This Does

    1. Triggers on pushes to main and on pull requests targeting main
    2. Checks out your code
    3. Sets up Node.js 20 with npm caching (faster installs)
    4. Installs dependencies with npm ci (a clean, reproducible install from package-lock.json — preferred over npm install in CI)
    5. Runs your linter, type checker, and tests
    6. Injects your API key from GitHub Secrets (for tests that need it)
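
    The lint, type-check, and test steps assume matching scripts in your package.json. A minimal sketch (script contents and versions are illustrative and will vary by project):

```json
{
  "scripts": {
    "lint": "eslint .",
    "test": "jest",
    "build": "next build"
  },
  "devDependencies": {
    "eslint": "^9.0.0",
    "jest": "^29.0.0",
    "typescript": "^5.0.0"
  }
}
```

    If a script is missing (say, no "lint" script), the corresponding workflow step fails — so align the workflow with the scripts your project actually defines.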

    What to ask your AI: "Create a GitHub Actions CI workflow for my [framework] project. Include linting, type checking, and testing."

    Automated Testing Before Deploy

    For AI apps, you want to test more than just code correctness. Here are tests that matter:

    Unit Tests

    Test individual functions:

    // __tests__/formatResponse.test.ts
    import { formatAIResponse } from "../src/lib/formatResponse";
    
    test("formats AI response correctly", () => {
      const raw = "Here is a **bold** response";
      const result = formatAIResponse(raw);
      expect(result).toContain("<strong>bold</strong>");
    });
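
    For context, a formatAIResponse that passes this test could be as simple as a regex replacement. This is a sketch only — a production app would more likely use a Markdown parser plus HTML sanitization:

```typescript
// Sketch of the function under test: converts Markdown **bold**
// markers into HTML <strong> tags. A real implementation would
// likely use a Markdown library and sanitize the resulting HTML.
export function formatAIResponse(raw: string): string {
  // Replace each **text** pair with <strong>text</strong>
  return raw.replace(/\*\*(.+?)\*\*/g, "<strong>$1</strong>");
}
```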

    API Integration Tests

    Test that your AI API wrapper works (use mock data to avoid real API calls in CI):

    // __tests__/aiService.test.ts
    import { generateResponse } from "../src/services/aiService";
    
    test("handles API errors gracefully", async () => {
      // Mock the API to return an error
      const result = await generateResponse("test prompt", { mockError: true });
      expect(result.error).toBe("AI service temporarily unavailable");
    });
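
    A service wrapper that supports this kind of test might accept a mock option and catch failures internally. A hedged sketch — the option name, result shape, and commented-out SDK call are illustrative, not a real API:

```typescript
// Sketch of an AI service wrapper with a test-only mock mode.
// The { mockError } option lets CI exercise the error path
// without making a real (slow, paid) API call.
type GenerateOptions = { mockError?: boolean };
type GenerateResult = { text?: string; error?: string };

export async function generateResponse(
  prompt: string,
  options: GenerateOptions = {}
): Promise<GenerateResult> {
  try {
    if (options.mockError) {
      // Simulate an upstream failure in tests
      throw new Error("mocked failure");
    }
    // In production this would call your AI provider's SDK, e.g.:
    // const completion = await openai.chat.completions.create({ ... });
    return { text: `(response for: ${prompt})` };
  } catch {
    // Return a stable, user-friendly error instead of throwing
    return { error: "AI service temporarily unavailable" };
  }
}
```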

    Build Tests

    Make sure the app actually builds:

    - name: Build
      run: npm run build
      env:
        NEXT_PUBLIC_APP_URL: https://example.com

    Automated Deployment on Push

    Now let's add automatic deployment. Here are workflows for Vercel and Firebase.

    Deploy to Vercel via GitHub Actions

    The easiest approach is to connect Vercel to GitHub directly (no workflow needed). But if you want more control:

    # .github/workflows/deploy-vercel.yml
    name: Deploy to Vercel
    
    on:
      push:
        branches: [main]
    
    jobs:
      deploy:
        runs-on: ubuntu-latest
    
        steps:
          - name: Checkout code
            uses: actions/checkout@v4
    
          - name: Set up Node.js
            uses: actions/setup-node@v4
            with:
              node-version: "20"
              cache: "npm"
    
          - name: Install dependencies
            run: npm ci
    
          - name: Run tests
            run: npm test
    
          - name: Deploy to Vercel
            uses: amondnet/vercel-action@v25
            with:
              vercel-token: ${{ secrets.VERCEL_TOKEN }}
              vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
              vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
              vercel-args: "--prod"

    Deploy to Firebase via GitHub Actions

    # .github/workflows/deploy-firebase.yml
    name: Deploy to Firebase
    
    on:
      push:
        branches: [main]
    
    jobs:
      build-and-deploy:
        runs-on: ubuntu-latest
    
        steps:
          - name: Checkout code
            uses: actions/checkout@v4
    
          - name: Set up Node.js
            uses: actions/setup-node@v4
            with:
              node-version: "20"
              cache: "npm"
    
          - name: Install dependencies
            run: npm ci
    
          - name: Run tests
            run: npm test
    
          - name: Build
            run: npm run build
    
          - name: Deploy to Firebase Hosting
            uses: FirebaseExtended/action-hosting-deploy@v0
            with:
              repoToken: ${{ secrets.GITHUB_TOKEN }}
              firebaseServiceAccount: ${{ secrets.FIREBASE_SERVICE_ACCOUNT }}
              channelId: live
              projectId: your-project-id

    What to ask your AI: "Set up GitHub Actions to automatically deploy my [framework] app to [Vercel/Firebase] when I push to main. Include tests before deployment."

    Setting Up GitHub Secrets

    Both workflows reference secrets like OPENAI_API_KEY and FIREBASE_SERVICE_ACCOUNT. Here's how to add them:

    1. Go to your GitHub repository
    2. Click Settings → Secrets and variables → Actions
    3. Click "New repository secret"
    4. Add each secret:
    Secret Name                  Where to Get It
    OPENAI_API_KEY               OpenAI dashboard → API keys
    VERCEL_TOKEN                 Vercel dashboard → Settings → Tokens
    VERCEL_ORG_ID                Vercel project settings
    VERCEL_PROJECT_ID            Vercel project settings
    FIREBASE_SERVICE_ACCOUNT     Firebase Console → Project Settings → Service Accounts → Generate Key

    Example: Full CI/CD Workflow

    Here's a complete workflow that tests, builds, and deploys:

    # .github/workflows/ci-cd.yml
    name: CI/CD Pipeline
    
    on:
      push:
        branches: [main]
      pull_request:
        branches: [main]
    
    jobs:
      # Step 1: Test
      test:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-node@v4
            with:
              node-version: "20"
              cache: "npm"
          - run: npm ci
          - run: npm run lint
          - run: npx tsc --noEmit
          - run: npm test
    
      # Step 2: Deploy (only on push to main, after tests pass)
      deploy:
        needs: test
        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-node@v4
            with:
              node-version: "20"
              cache: "npm"
          - run: npm ci
          - run: npm run build
          - name: Deploy
            uses: FirebaseExtended/action-hosting-deploy@v0
            with:
              repoToken: ${{ secrets.GITHUB_TOKEN }}
              firebaseServiceAccount: ${{ secrets.FIREBASE_SERVICE_ACCOUNT }}
              channelId: live
              projectId: your-project-id

    Key details:

    • needs: test — The deploy job waits for the test job to pass
    • if: github.event_name == 'push' — Only deploys on pushes to main (not on pull requests)
    • Pull requests still run tests, giving you feedback before merging

    CI/CD Best Practices for AI Apps

    1. Never test with real AI API calls in CI — Use mocks or stubs. Real API calls are slow, costly, and flaky.
    2. Cache dependencies — Use cache: "npm" to speed up installs.
    3. Keep workflows fast — Aim for under 5 minutes. Slow pipelines slow down your team.
    4. Use branch protection — Require CI to pass before merging pull requests.
    5. Monitor deployment status — GitHub shows deployment status on commits and PRs.
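
    One common way to apply point 1 is dependency injection: your service accepts an AI client interface, production passes a real client, and CI tests pass a stub. A sketch under assumed names (AIClient, StubAIClient, and summarize are illustrative, not a real SDK):

```typescript
// Sketch of injecting a stub AI client so CI never hits the real API.
interface AIClient {
  complete(prompt: string): Promise<string>;
}

// Production would implement AIClient around your provider's SDK:
// class OpenAIClient implements AIClient { ... }

// Test stub: returns a canned answer instantly, at zero cost.
class StubAIClient implements AIClient {
  async complete(prompt: string): Promise<string> {
    return `stubbed answer for: ${prompt}`;
  }
}

// Business logic depends only on the interface, so it is testable offline.
export async function summarize(client: AIClient, text: string): Promise<string> {
  return client.complete(`Summarize: ${text}`);
}
```

    In CI, every test constructs a StubAIClient; only production code ever instantiates the real client.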

    What to ask your AI: "Create mock/stub functions for my AI API calls so I can test without making real API requests in CI."

    What's Next?

    Your app deploys automatically. But how do you know it's working correctly in production? The next tutorial covers monitoring and logging — keeping an eye on your deployed AI app.

    What to ask your AI: "Help me add branch protection rules on GitHub so that CI must pass before code can be merged to main."


    🌐 www.genai-mentor.ai