Automate your Supabase media handling


Introduction

In this guide, you will:

  • Upload media files to Supabase Storage.
  • Trigger Ittybit automations from webhook functions or database triggers.
  • Build automations for different media types (video, image, audio).
  • Process files with summaries, thumbnails, transcripts, and more.
  • Automatically save processed files back to Supabase storage.
  • Add generated metadata to your database.

Set up a complete media processing pipeline that automatically handles every file uploaded to your Supabase storage.

Step 1: Upload to Supabase Storage

Ensure you have already created a Storage bucket in your Supabase project. If you haven't, you can follow the Supabase guide on creating buckets.

First, upload a file to your Storage bucket. Its URL is what you'll pass to Ittybit in the next step.

Upload Files via Dashboard

  • Click into your ittybit-files bucket (or an existing bucket)
  • Click the Upload files button
  • Select files from your computer to upload
  • Files will appear in the bucket list once uploaded

Get File Path

After upload, note the file path shown in the dashboard. You'll need this for triggering automations.
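Uploads can also be scripted instead of done through the dashboard. Below is a minimal sketch using the Storage REST API directly with fetch; the bucket name, paths, and helper names are illustrative assumptions, and with supabase-js you would call supabase.storage.from(bucket).upload(...) instead:

```typescript
// Builds the public object URL that Ittybit will ingest from.
function publicUrlFor(projectRef: string, bucket: string, key: string): string {
  return `https://${projectRef}.supabase.co/storage/v1/object/public/${bucket}/${key}`;
}

// Uploads raw bytes to a Storage bucket via the REST API.
// Defined but not invoked here; requires a key with insert rights on the bucket.
async function uploadToBucket(
  projectRef: string,
  apiKey: string,
  bucket: string,
  key: string,
  body: Blob,
): Promise<void> {
  const res = await fetch(
    `https://${projectRef}.supabase.co/storage/v1/object/${bucket}/${key}`,
    {
      method: 'POST',
      headers: { Authorization: `Bearer ${apiKey}` },
      body,
    },
  );
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
}
```

After uploading, publicUrlFor('YOUR_SUPABASE_PROJECT', 'ittybit-files', 'uploads/my-file.mp4') gives the URL to hand to Ittybit in the next step.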

Step 2: Trigger an Ittybit automation from a webhook function

Create a serverless function that triggers your Ittybit automation when files are uploaded.

Supabase Edge Function Example:

// Supabase Edge Functions run on Deno, so Deno.serve needs no import
Deno.serve(async (req) => {
  if (req.method !== 'POST') {
    return new Response('Method not allowed', { status: 405 });
  }

  const { key, bucket } = await req.json();

  const fileUrl = `https://YOUR_SUPABASE_PROJECT.supabase.co/storage/v1/object/public/${bucket}/${key}`;

  const response = await fetch(
    `https://api.ittybit.com/automations/${Deno.env.get('ITTYBIT_AUTOMATION_ID')}/run`,
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${Deno.env.get('ITTYBIT_API_KEY')}`,
      },
      body: JSON.stringify({ url: fileUrl }),
    },
  );

  return new Response(JSON.stringify({ success: response.ok }), {
    status: response.ok ? 200 : 502,
    headers: { 'Content-Type': 'application/json' },
  });
});
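Once deployed (for example with supabase functions deploy trigger-ittybit, where the function name is an assumption), the function can be called with a small JSON payload. A sketch:

```typescript
// Builds the payload the edge function expects: the bucket and object key.
function buildTriggerPayload(bucket: string, key: string): string {
  return JSON.stringify({ bucket, key });
}

// Calls the deployed edge function. Defined but not invoked here;
// the function name "trigger-ittybit" is a placeholder.
async function triggerAutomation(
  projectRef: string,
  anonKey: string,
  bucket: string,
  key: string,
): Promise<unknown> {
  const res = await fetch(
    `https://${projectRef}.supabase.co/functions/v1/trigger-ittybit`,
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${anonKey}`,
      },
      body: buildTriggerPayload(bucket, key),
    },
  );
  if (!res.ok) throw new Error(`Trigger failed: ${res.status}`);
  return res.json();
}
```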

Step 3: Optionally, put the trigger directly in a database function

For a more direct approach, use PostgreSQL triggers to automatically call Ittybit.

Database Trigger Function:

-- Requires the pg_net extension for net.http_post
CREATE OR REPLACE FUNCTION trigger_ittybit_automation()
RETURNS TRIGGER
LANGUAGE plpgsql
AS $$
BEGIN
  PERFORM net.http_post(
    url := 'https://api.ittybit.com/automations/YOUR_AUTOMATION_ID/run',
    headers := jsonb_build_object(
      'Content-Type', 'application/json',
      'Authorization', 'Bearer YOUR_ITTYBIT_API_KEY'
    ),
    body := jsonb_build_object(
      'url', CONCAT('https://YOUR_SUPABASE_PROJECT.supabase.co/storage/v1/object/public/', NEW.bucket_id, '/', NEW.name)
    )
  );
  RETURN NEW;
END;
$$;

-- Create trigger
CREATE TRIGGER on_media_upload
  AFTER INSERT ON storage.objects
  FOR EACH ROW EXECUTE FUNCTION trigger_ittybit_automation();

Step 4: Build an automation with code or in the Ittybit dashboard

Create automations that handle different media types with appropriate processing.

Simple Version - Basic Media Processing:

{
  "name": "Basic Media Processor",
  "workflow": [
    {
      "kind": "summary",
      "metadata": { "ref": "media-summary" }
    },
    {
      "kind": "conditions",
      "conditions": {
        "kind": { "eq": "video" }
      },
      "next": [
        {
          "kind": "thumbnails",
          "options": {
            "count": 1
          },
          "metadata": { "ref": "video-poster" }
        }
      ]
    }
  ]
}
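The JSON above can also be registered programmatically. A hedged sketch, assuming Ittybit's POST /automations endpoint accepts this body shape (check the Ittybit API reference for the exact fields):

```typescript
// Builds the automation body: a summary task for every file,
// plus a conditional thumbnail task that runs only for videos.
function buildAutomation(name: string): { name: string; workflow: any[] } {
  const workflow: any[] = [
    { kind: 'summary', metadata: { ref: 'media-summary' } },
    {
      kind: 'conditions',
      conditions: { kind: { eq: 'video' } },
      next: [
        { kind: 'thumbnails', options: { count: 1 }, metadata: { ref: 'video-poster' } },
      ],
    },
  ];
  return { name, workflow };
}

// Registers the automation with Ittybit. Defined but not invoked here.
async function createAutomation(apiKey: string, name: string): Promise<unknown> {
  const res = await fetch('https://api.ittybit.com/automations', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildAutomation(name)),
  });
  if (!res.ok) throw new Error(`Create failed: ${res.status}`);
  return res.json();
}
```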

Step 5: Use custom prompts for specialized processing

Leverage AI prompts to build custom functionality for your specific use cases.

Customer Service Call Analysis:

{
  "kind": "prompt",
  "options": {
    "prompt": "Analyze this customer service call and provide feedback on: 1) Agent empathy level 2) Problem resolution effectiveness 3) Communication clarity 4) Areas for improvement",
    "model": "gpt-4"
  },
  "metadata": { "ref": "agent-feedback" }
}

Content Moderation:

{
  "kind": "prompt",
  "options": {
    "prompt": "Review this content for: hate speech, inappropriate content, spam, and harassment. Rate severity from 1-5 for each category.",
    "model": "gpt-4"
  },
  "metadata": { "ref": "content-moderation" }
}
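Prompt tasks like these can be generated from a small template helper rather than written by hand. A purely illustrative sketch:

```typescript
// Builds a prompt task object for an Ittybit workflow.
function promptTask(prompt: string, ref: string, model = 'gpt-4') {
  return { kind: 'prompt', options: { prompt, model }, metadata: { ref } };
}

// Example: the content-moderation task from above, built from the helper.
const moderation = promptTask(
  'Review this content for: hate speech, inappropriate content, spam, and harassment. Rate severity from 1-5 for each category.',
  'content-moderation',
);
```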

Step 6: Push processed files back to Supabase storage

To save outputs back to your Supabase bucket, first create a connection with your Supabase S3 credentials. Then use destination on your tasks or add explicit upload tasks in runs.

Using destination on a task:

curl -X POST "https://api.ittybit.com/tasks" \
-H "Authorization: Bearer ITTYBIT_API_KEY" \
-H "Content-Type: application/json" \
-d '{
  "url": "https://YOUR_SUPABASE_PROJECT.supabase.co/storage/v1/object/public/ittybit-files/my-file.mp4",
  "kind": "video",
  "connection_id": "con_your_supabase_connection",
  "destination": "s3://ittybit-files/processed/my-file-720p.mp4",
  "options": {
    "width": 1280,
    "format": "mp4"
  }
}'

Using upload tasks in a run for more control:

{
  "tasks": [
    {
      "kind": "ingest",
      "options": { "url": "https://YOUR_SUPABASE_PROJECT.supabase.co/storage/v1/object/public/ittybit-files/my-file.mp4" },
      "next": [
        {
          "kind": "thumbnails",
          "options": { "count": 1 },
          "next": [
            {
              "kind": "upload",
              "options": {
                "connection_id": "con_your_supabase_connection",
                "bucket": "ittybit-files",
                "key": "processed/thumbnails/thumb.jpg"
              }
            }
          ]
        }
      ]
    }
  ]
}

Step 7: Add extra file metadata to objects table

Extend your Supabase storage objects table to store Ittybit-generated metadata.

Create Metadata Table:

CREATE TABLE IF NOT EXISTS media_metadata (
  id UUID DEFAULT gen_random_uuid() PRIMARY KEY,
  object_id UUID REFERENCES storage.objects(id),
  description TEXT,
  duration FLOAT,
  width INTEGER,
  height INTEGER,
  processing_status TEXT,
  ittybit_task_id TEXT,
  created_at TIMESTAMPTZ DEFAULT NOW()
);


CREATE OR REPLACE FUNCTION update_media_metadata()
RETURNS TRIGGER
LANGUAGE plpgsql
AS $$
BEGIN
  INSERT INTO media_metadata (
    object_id,
    description,
    duration,
    width,
    height,
    processing_status,
    ittybit_task_id
  ) VALUES (
    NEW.id,
    NEW.metadata->>'description',
    (NEW.metadata->>'duration')::FLOAT,
    (NEW.metadata->>'width')::INTEGER,
    (NEW.metadata->>'height')::INTEGER,
    'succeeded',
    NEW.metadata->>'task_id'
  );
  RETURN NEW;
END;
$$;

-- Create trigger so the function runs when Ittybit updates an object's metadata
CREATE TRIGGER on_media_metadata_update
  AFTER UPDATE ON storage.objects
  FOR EACH ROW EXECUTE FUNCTION update_media_metadata();
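On the application side, a webhook handler can translate an Ittybit result into a media_metadata row before inserting it. A sketch, assuming the result payload carries description, duration, width, height, and task_id fields (these names are assumptions; with supabase-js the insert would be supabase.from('media_metadata').insert(row)):

```typescript
// Mirrors the media_metadata table defined above.
interface MediaMetadataRow {
  object_id: string;
  description: string | null;
  duration: number | null;
  width: number | null;
  height: number | null;
  processing_status: string;
  ittybit_task_id: string | null;
}

// Maps an Ittybit result payload onto the media_metadata table shape,
// defaulting missing fields to null.
function toMetadataRow(objectId: string, result: Record<string, unknown>): MediaMetadataRow {
  return {
    object_id: objectId,
    description: (result.description as string) ?? null,
    duration: typeof result.duration === 'number' ? result.duration : null,
    width: typeof result.width === 'number' ? result.width : null,
    height: typeof result.height === 'number' ? result.height : null,
    processing_status: 'succeeded',
    ittybit_task_id: (result.task_id as string) ?? null,
  };
}
```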

This complete pipeline automatically processes every file uploaded to Supabase Storage, generates rich metadata and derivatives, saves everything back to your storage, and updates your database, all without manual intervention.