DevOps for BrokerEdge AI

Role: DevOps Engineer
Duration: 6+ Months
Firebase
Genkit
GCP
GitHub Actions
Vercel
Docker

Summary

BrokerEdge AI is an end-to-end real estate SaaS platform that helps brokers recruit, retain, and scale their operations with a CRM, MLS sync, onboarding workflows, and a built-in AI Copilot. My DevOps role was to architect, automate, and optimize the infrastructure for scalability, cost-efficiency, and secure AI/MLS integration.

Objectives

  • Automate deployments and environment configs
  • Secure AI and MLS data pipelines
  • Optimize Firestore rules and usage
  • Enable scalable AI interactions using Genkit
  • Build a real-time dashboard with monitoring hooks

Technical Challenges & Solutions

Problem 1: Manual Deployments Slowed Down the Team

"Deployments were initially manual via Firebase CLI and Vercel UI. This delayed testing, introduced human error, and lacked rollback visibility."

Solution: CI/CD with GitHub Actions + Vercel

.github/workflows/deploy.yml
name: CI/CD Deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: npm ci
      - run: npm run lint && npm run build
      - name: Deploy to Vercel
        uses: amondnet/vercel-action@v20
        with:
          vercel-token: ${{ secrets.VERCEL_TOKEN }}
          vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
          vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
          scope: personal
Results:
  • Deployment time reduced from ~10 min to ~1.2 min
  • Every push to main auto-deployed to staging or production
  • Built-in validation ensured cleaner, safer code delivery
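The workflow above deploys whatever target the Vercel project settings dictate. One way to gate production explicitly is to pass `--prod` through the action's `vercel-args` input (a sketch using the same amondnet/vercel-action; the branch condition is an assumption to adapt to your branching model):

```yaml
      - name: Deploy to Vercel (production)
        if: github.ref == 'refs/heads/main'
        uses: amondnet/vercel-action@v20
        with:
          vercel-token: ${{ secrets.VERCEL_TOKEN }}
          vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
          vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
          vercel-args: '--prod'
```

Without `--prod`, the same step produces a preview deployment, which is what feature branches should get.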

Problem 2: AI Copilot Was Too Slow on Cold Starts

"Firebase Cloud Functions were experiencing cold start lag on AI generation requests, impacting the Copilot experience."

Solution: Pre-warm AI Copilot Function & Queue System

functions/genkit.ts
import { initializeGenkit } from '@genkit-ai/core';
import { onCall } from 'firebase-functions/v2/https';
// Copilot flow defined elsewhere in the codebase (import path illustrative)
import { myGenkitFlow } from './flows';

initializeGenkit();

export const copilotChat = onCall(async (req) => {
  const { input, userContext } = req.data;
  const result = await myGenkitFlow({ input, userContext });
  return { response: result.output };
});
Results:
  • Added regional warm-up ping every 5 minutes via a scheduled pingCopilot() job
  • Reduced AI response latency to <1.2s avg
  • Implemented retry queue for failed requests using Firestore batching
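The backoff policy behind that retry queue can be sketched independently of Firestore (a minimal sketch; `withRetry` and its defaults are illustrative names, not the production code):

```typescript
// Retry an async operation with exponential backoff: wait base, 2x, 4x, ...
// between attempts, and rethrow the last error once attempts are exhausted.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // No sleep after the final failed attempt
      if (attempt < maxAttempts - 1) {
        await new Promise((resolve) =>
          setTimeout(resolve, baseDelayMs * 2 ** attempt),
        );
      }
    }
  }
  throw lastErr;
}
```

Exponential backoff keeps transient AI-generation failures from hammering the function while it recovers, which pairs well with the scheduled warm-up pings.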

Problem 3: Firestore Costs Were Climbing with Agent Pipelines

"Thousands of Firestore reads were triggered by poorly scoped client queries."

Solution: Query Optimization + Denormalized Structure

firestore-query.js
import { getDocs, collection, query, where, orderBy, limit } from 'firebase/firestore';

// BEFORE (inefficient): fetches every document in the top-level collection
const leads = await getDocs(collection(db, 'clients'));

// AFTER (scoped & indexed): only the current agent's active leads, newest first
const q = query(
  collection(db, `clients/${userId}/leads`),
  where('status', '==', 'active'),
  orderBy('createdAt', 'desc'),
  limit(50)
);
const snapshot = await getDocs(q);
Results:
  • Firestore reads reduced by 72%
  • Monthly cost dropped from ~$180 → ~$45
  • Faster rendering of client pipelines in dashboard
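A query combining an equality filter with an ordering needs a composite index. With the Firebase CLI that lives in firestore.indexes.json; a sketch of the entry the scoped query above would require:

```json
{
  "indexes": [
    {
      "collectionGroup": "leads",
      "queryScope": "COLLECTION",
      "fields": [
        { "fieldPath": "status", "order": "ASCENDING" },
        { "fieldPath": "createdAt", "order": "DESCENDING" }
      ]
    }
  ]
}
```

Keeping index definitions in the repo means `firebase deploy` provisions them alongside the rules, instead of relying on the console's one-off index-creation links.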

Problem 4: Lack of Role-Based Security & Data Isolation

"Without well-defined access rules, junior agents could accidentally access brokerage-wide documents."

Solution: Firebase Security Rules + Claims

firestore.rules
match /clients/{agentId}/leads/{leadId} {
  allow read, write: if request.auth != null && request.auth.uid == agentId;
}

match /adminData/{doc} {
  allow read: if request.auth != null && request.auth.token.role == "admin";
}
Results:
  • Implemented broker/admin/agent roles via Firebase Auth custom claims
  • Isolated views and write access per agent/team
  • Passed internal security audit
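Server-side, the roles land on tokens via the Admin SDK's setCustomUserClaims. The access logic the rules encode can also be mirrored as pure predicates so it is unit-testable outside the emulator (a sketch with illustrative names, not the production code):

```typescript
// Roles attached as Firebase Auth custom claims (names illustrative)
type Role = 'broker' | 'admin' | 'agent';

interface AuthToken {
  uid: string;
  role: Role;
}

// Mirrors the /clients/{agentId}/... rule: agents only touch their own subtree.
function canAccessAgentDocs(token: AuthToken, agentId: string): boolean {
  return token.uid === agentId;
}

// Mirrors the /adminData/{doc} rule: read access is admin-only.
function canReadAdminData(token: AuthToken): boolean {
  return token.role === 'admin';
}
```

Rules remain the enforcement layer; predicates like these just let CI catch role-model regressions before a deploy.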

DevOps Stack

  • Firebase: Hosting, Firestore, Auth, Functions
  • Genkit AI: AI Copilot orchestration
  • GitHub Actions: CI/CD, deployments
  • Vercel: Hosting frontend + dashboard
  • Docker: Dev containers for MLS services
  • Google Cloud IAM: Security + role control
  • Cloud Scheduler: Ping + AI warm-up jobs
  • Sentry + Logs: Error tracking & monitoring

Key Outcomes

  • CI/CD enabled faster testing and zero-downtime deployments
  • AI Copilot response latency dropped to ~1.2s
  • Firebase secured with role-based rules & scoped Firestore queries
  • Reduced Firestore billing by 70% through optimized reads
  • Ready to scale from 10 → 10,000+ users on Firebase + Genkit

Reflection

Working on BrokerEdge AI sharpened my ability to balance DevOps automation, security, and AI orchestration in a fast-moving SaaS environment. This project is a great example of delivering speed, stability, and smart infrastructure at scale.