Automate your Supabase media handling

Introduction

In this guide, you will:

  • Upload media files to Supabase Storage.
  • Trigger Ittybit automations from webhook functions or database triggers.
  • Build automations for different media types (video, image, audio).
  • Process files with descriptions, thumbnails, transcripts, and more.
  • Automatically save processed files back to Supabase Storage.
  • Add generated metadata to your database.

Set up a complete media processing pipeline that automatically handles every file uploaded to your Supabase Storage.

Step 1: Upload to Supabase Storage

Ensure you have already created a Storage bucket in your Supabase project. If you haven't, you can follow the Supabase guide on creating buckets.

First, upload a file to your Storage bucket. You'll use its file path (the object key) to trigger the automation in the next step.

Upload Files via Dashboard

  • Open your ittybit-files bucket (or another existing bucket)
  • Click the Upload files button
  • Select files from your computer to upload
  • Files will appear in the bucket list once uploaded

Get File Path

After upload, note the file path shown in the dashboard. You'll need this for triggering automations.
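
You can also upload from code. A minimal sketch with supabase-js, assuming the ittybit-files bucket from above (the project URL, key, and file path are placeholders):

import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://your-project.supabase.co', 'YOUR_ANON_KEY');

// Upload a file (e.g. from an <input type="file">) and return its object key
async function uploadMedia(file: File): Promise<string> {
  const { data, error } = await supabase.storage
    .from('ittybit-files')
    .upload(`uploads/${file.name}`, file);

  if (error) throw error;
  return data.path; // the key you'll pass to the automation trigger
}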

Step 2: Trigger an Ittybit automation from a webhook function

Create a serverless function that triggers your Ittybit automation when files are uploaded.

Supabase Edge Function Example:

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';

serve(async req => {
  if (req.method !== 'POST') {
    return new Response('Method not allowed', { status: 405 });
  }

  // Object key and bucket, sent by your upload handler
  const { key, bucket } = await req.json();

  // Trigger the Ittybit automation for the uploaded file
  const response = await fetch('https://api.ittybit.com/tasks', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${Deno.env.get('ITTYBIT_API_KEY')}`,
    },
    body: JSON.stringify({
      kind: 'automation',
      automation_id: Deno.env.get('ITTYBIT_AUTOMATION_ID'),
      connection: '{{ project.connections.SUPABASE }}',
      key: key,
      bucket: bucket,
    }),
  });

  if (!response.ok) {
    return new Response(JSON.stringify({ success: false, error: await response.text() }), {
      status: 502,
      headers: { 'Content-Type': 'application/json' },
    });
  }

  return new Response(JSON.stringify({ success: true }), {
    headers: { 'Content-Type': 'application/json' },
  });
});
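
From your app, you can call this function after an upload completes using supabase-js (the function name ittybit-trigger is a placeholder for whatever name you deploy under):

import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://your-project.supabase.co', 'YOUR_ANON_KEY');

// Pass the uploaded object's key and bucket to the Edge Function
const { data, error } = await supabase.functions.invoke('ittybit-trigger', {
  body: { key: 'uploads/video.mp4', bucket: 'ittybit-files' },
});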

Step 3: Optionally, put the trigger directly in a database function

For a more direct approach, use PostgreSQL triggers to automatically call Ittybit.

Database Trigger Function:

-- Requires the pg_net extension (Database → Extensions in the dashboard)
CREATE EXTENSION IF NOT EXISTS pg_net;

CREATE OR REPLACE FUNCTION trigger_ittybit_automation()
RETURNS TRIGGER
LANGUAGE plpgsql
AS $$
BEGIN
  -- Asynchronous HTTP call to the Ittybit tasks endpoint
  PERFORM net.http_post(
    url := 'https://api.ittybit.com/tasks',
    headers := jsonb_build_object(
      'Content-Type', 'application/json',
      -- Replace with your key (better: store it in Supabase Vault than hardcode it)
      'Authorization', 'Bearer YOUR_ITTYBIT_API_KEY'
    ),
    body := jsonb_build_object(
      'kind', 'automation',
      'automation_id', 'YOUR_AUTOMATION_ID',
      'connection', '{{ project.connections.SUPABASE }}',
      'key', NEW.name,         -- the object's path within the bucket
      'bucket', NEW.bucket_id  -- the bucket it was uploaded to
    )
  );
  RETURN NEW;
END;
$$;

-- Fire the function after every new upload to Storage
CREATE TRIGGER on_media_upload
  AFTER INSERT ON storage.objects
  FOR EACH ROW EXECUTE FUNCTION trigger_ittybit_automation();
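
pg_net queues requests asynchronously, so a failed call won't raise an error inside the trigger. While testing, you can inspect recent responses in the net._http_response table that pg_net maintains:

-- Check the status of recent outbound requests
SELECT id, status_code, content
FROM net._http_response
ORDER BY id DESC
LIMIT 5;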

Step 4: Build an automation with code or on Ittybit

Create automations that handle different media types with appropriate processing.

Simple Version - Basic Media Processing:

{
  "name": "Basic Media Processor",
  "trigger": {
    "kind": "event",
    "event": "media.created"
  },
  "workflow": [
    {
      "kind": "description",
      "ref": "media-description"
    },
    {
      "kind": "conditions",
      "conditions": [{ "prop": "kind", "value": "video" }],
      "next": [
        {
          "kind": "thumbnails",
          "count": 1,
          "ref": "video-poster"
        }
      ]
    }
  ]
}
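
You can build this automation in the Ittybit dashboard, or create it from code. A sketch of the API call, assuming automations are created by POSTing the same JSON to an /automations endpoint (check the Ittybit API reference for the exact route and response shape):

// The "Basic Media Processor" definition shown above
const automation = {
  name: 'Basic Media Processor',
  trigger: { kind: 'event', event: 'media.created' },
  workflow: [
    { kind: 'description', ref: 'media-description' },
    {
      kind: 'conditions',
      conditions: [{ prop: 'kind', value: 'video' }],
      next: [{ kind: 'thumbnails', count: 1, ref: 'video-poster' }],
    },
  ],
};

const res = await fetch('https://api.ittybit.com/automations', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${Deno.env.get('ITTYBIT_API_KEY')}`,
  },
  body: JSON.stringify(automation),
});

const created = await res.json();
// Use the returned automation id as ITTYBIT_AUTOMATION_ID in Step 2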

Step 5: Use custom prompts for specialized processing

Leverage AI prompts to build custom functionality for your specific use cases.

Customer Service Call Analysis:

{
  "kind": "prompt",
  "prompt": "Analyze this customer service call and provide feedback on: 1) Agent empathy level 2) Problem resolution effectiveness 3) Communication clarity 4) Areas for improvement",
  "model": "gpt-4",
  "ref": "agent-feedback"
}

Content Moderation:

{
  "kind": "prompt",
  "prompt": "Review this content for: hate speech, inappropriate content, spam, and harassment. Rate severity from 1-5 for each category.",
  "model": "gpt-4",
  "ref": "content-moderation"
}
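
Prompt tasks slot into a workflow like any other step. For example, to run the call-analysis prompt only on audio files, you could gate it behind a conditions step, following the same pattern as Step 4:

{
  "kind": "conditions",
  "conditions": [{ "prop": "kind", "value": "audio" }],
  "next": [
    {
      "kind": "prompt",
      "prompt": "Analyze this customer service call and provide feedback on: 1) Agent empathy level 2) Problem resolution effectiveness 3) Communication clarity 4) Areas for improvement",
      "model": "gpt-4",
      "ref": "agent-feedback"
    }
  ]
}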

Step 6: Upload created files back to Supabase Storage and send webhooks

Configure your automation to save processed files back to Supabase and notify your application.

Enhanced Automation with Webhooks:

{
  "name": "Complete Media Pipeline",
  "trigger": {
    "kind": "event",
    "event": "media.created"
  },
  "workflow": [
    {
      "kind": "description",
      "ref": "media-description"
    },
    {
      "kind": "thumbnails",
      "count": 1,
      "ref": "main-thumbnail"
    },
    {
      "kind": "storage",
      "connection": "{{ env.SUPABASE_STORAGE }}",
      "path": "processed/{{ task.id }}",
      "ref": "save-to-supabase"
    },
    {
      "kind": "webhook",
      "url": "https://your-app.com/api/media-processed",
      "method": "POST",
      "ref": "notify-completion"
    }
  ]
}
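
On the receiving end, your /api/media-processed route just needs to accept the POST and record the results. A minimal receiver sketch as a Supabase Edge Function (the payload field names here are assumptions; log a real delivery to confirm what Ittybit sends):

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts';

serve(async req => {
  if (req.method !== 'POST') {
    return new Response('Method not allowed', { status: 405 });
  }

  const payload = await req.json();

  // Illustrative fields only; verify against an actual webhook body
  console.log('Ittybit task finished:', payload.task_id, payload.status);

  // e.g. write the generated metadata to your database here (see Step 7)

  return new Response('ok');
});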

Step 7: Add extra file metadata to objects table

Extend your Supabase storage objects table to store Ittybit-generated metadata.

Create Metadata Table:

CREATE TABLE IF NOT EXISTS media_metadata (
  id UUID DEFAULT gen_random_uuid() PRIMARY KEY,
  object_id UUID REFERENCES storage.objects(id),
  description TEXT,
  duration FLOAT,
  width INTEGER,
  height INTEGER,
  processing_status TEXT,
  ittybit_task_id TEXT,
  created_at TIMESTAMPTZ DEFAULT NOW()
);


CREATE OR REPLACE FUNCTION update_media_metadata()
RETURNS TRIGGER
LANGUAGE plpgsql
AS $$
BEGIN
  -- Copy Ittybit-generated fields from the object's metadata into media_metadata
  INSERT INTO media_metadata (
    object_id,
    description,
    duration,
    width,
    height,
    processing_status,
    ittybit_task_id
  ) VALUES (
    NEW.id,
    NEW.metadata->>'description',
    (NEW.metadata->>'duration')::FLOAT,
    (NEW.metadata->>'width')::INTEGER,
    (NEW.metadata->>'height')::INTEGER,
    'processed',
    NEW.metadata->>'task_id'
  );
  RETURN NEW;
END;
$$;

-- Attach the trigger; this assumes your webhook handler (or Ittybit itself)
-- writes the generated metadata back onto storage.objects.metadata
CREATE TRIGGER on_metadata_update
  AFTER UPDATE OF metadata ON storage.objects
  FOR EACH ROW EXECUTE FUNCTION update_media_metadata();
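
Your application can then read the generated metadata with a normal query, for example via supabase-js:

import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://your-project.supabase.co', 'YOUR_ANON_KEY');

// Fetch the most recently processed files and their Ittybit metadata
const { data, error } = await supabase
  .from('media_metadata')
  .select('description, duration, width, height, processing_status')
  .order('created_at', { ascending: false })
  .limit(10);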

This complete pipeline automatically processes every file uploaded to Supabase Storage, generates rich metadata and derivatives, saves everything back to your storage, and updates your database, all without manual intervention.