Shotstack plugin help

Hi there, I have a shotstack.io plugin that creates videos from any number of clips or images with any transitions.

I lost contact with the original plugin author and need someone to add the ability to add overlays.

Hello @jeffmccutcheon

I can help you with this work. Send me a PM and we will discuss everything in detail.

Hi Jeff,
If you are still looking out for some help, feel free to reach out to me as this is well within my area of expertise.
You can reach me on andrewjohnson56782@gmail.com
Best Wishes,
Andrew

Hello @jeffmccutcheon

If you are looking for a plugin creator, I offer a development subscription with unlimited custom plugin development requests. I’ve used the Shotstack API to merge videos before, so I believe I can add overlays with Shotstack in a few days. If you don’t have any more requests afterward, you can cancel the subscription at any time.

Hugo
Developer @ Nalfe

60+ Bubble Templates
10+ Bubble Plugins
Bubble Component Library & Extension
No-code Development Subscriptions
No-code Podcast

Hello,
I’m also looking for a Shotstack specialist … please PM me.

Thanks

Hi, can you help me with the ShotStack plugin or how to utilize this tool for my app?

Hi,

I could not find the plugin anymore, so I want to connect directly through the API Connector plugin to the Shotstack API: Hello World | Shotstack Documentation

How do I get a response from the test endpoints? Right now they return "be sure to call a valid resource" instead of:

{
  "response": {
    "message": "Render Successfully Queued",
    "id": "d2b46ed6-998a-4d6b-9d91-b8cf0193a655"
  }
}

What an awesome generative AI video API. They just have no no-code documentation for Bubble, only Zapier. Thanks in advance!

-jz

Did you get valid API keys and set up your authentication correctly?
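For reference, here is a rough sketch of what a correctly authenticated render call should look like, based on the Shotstack docs (the stage URL and `x-api-key` header are the documented conventions; the key value and the minimal body are placeholders). It only builds the request without sending it:

```python
import json
import urllib.request

# Stage (sandbox) endpoint from the Shotstack docs; the key is a placeholder.
STAGE_RENDER_URL = "https://api.shotstack.io/stage/render"
API_KEY = "YOUR_STAGE_API_KEY"

# Minimal placeholder body — swap in a real timeline before sending.
body = {
    "timeline": {"tracks": [{"clips": []}]},
    "output": {"format": "mp4", "size": {"width": 1280, "height": 720}},
}

# Shotstack authenticates with an x-api-key header, not a Bearer token.
request = urllib.request.Request(
    STAGE_RENDER_URL,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json", "x-api-key": API_KEY},
    method="POST",
)
```

A "be sure to call a valid resource" error usually points at a wrong URL path (e.g. a missing `/stage` or `/render` segment) rather than a malformed body.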

I have experience with the Shotstack plugin. Please send me the details.

Hey @initialsjz :wave:

I got it working pretty quickly. Fortunately, they have pretty good documentation. So for the first call, you need to render something. Like this:

I used this Body from their example to get it to work:

{
  "timeline": {
    "tracks": [
      {
        "clips": [
          {
            "asset": {
              "type": "text",
              "text": "NoCodeMinute.com",
              "font": {
                "family": "Clear Sans",
                "color": "#ffffff",
                "size": 46
              },
              "alignment": {
                "horizontal": "left"
              },
              "width": 566,
              "height": 70
            },
            "start": 4,
            "length": "end",
            "transition": {
              "in": "fade",
              "out": "fade"
            },
            "offset": {
              "x": -0.15,
              "y": 0
            },
            "effect": "zoomIn"
          }
        ]
      },
      {
        "clips": [
          {
            "asset": {
              "type": "video",
              "src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/footage/earth.mp4",
              "trim": 5,
              "volume": 1
            },
            "start": 0,
            "length": "auto",
            "transition": {
              "in": "fade",
              "out": "fade"
            }
          }
        ]
      },
      {
        "clips": [
          {
            "asset": {
              "type": "audio",
              "src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/music/freepd/motions.mp3",
              "effect": "fadeOut",
              "volume": 1
            },
            "start": 0,
            "length": "end"
          }
        ]
      }
    ]
  },
  "output": {
    "format": "mp4",
    "size": {
      "width": 1280,
      "height": 720
    }
  }
}

Then you can check on the status of the render like this:

Then you just need to fill in the data you want by swapping the hard-coded values in the Render endpoint’s body for dynamic parameters, so you can supply the information from a workflow.

I am using the ‘Stage’ endpoint so it doesn’t count against any credits. It will have the watermark; that is the test environment. For the live environment, you can use the real endpoint without the ‘stage’ part in the URL. This could also be done dynamically by taking advantage of the ‘isn’t live version’ value in a workflow.

Does that help a little bit? Let us know where you get stuck and we can help some more. :blush:
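As a rough sketch of that status check (assuming the `GET /render/{id}` path from the Shotstack docs, with stage vs. live selected by swapping the environment segment; the render id and key are placeholders):

```python
import urllib.request

API_KEY = "YOUR_STAGE_API_KEY"  # placeholder
# Placeholder: use the id returned by the render call.
RENDER_ID = "d2b46ed6-998a-4d6b-9d91-b8cf0193a655"

def status_url(render_id: str, live: bool = False) -> str:
    """Build the render-status URL; live drops 'stage' for 'v1',
    which mirrors switching on Bubble's 'isn't live version' value."""
    env = "v1" if live else "stage"
    return f"https://api.shotstack.io/{env}/render/{render_id}"

# The status check is a GET with the same x-api-key header as the render call.
request = urllib.request.Request(
    status_url(RENDER_ID),
    headers={"x-api-key": API_KEY},
)
```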

Wow, thanks! How do I save the response? It would be great if you could screenshot a workflow.

I created a database thing and use it in a workflow, but ‘Make changes to a thing’ does not save the API URL response to my database. Thanks so much!

I don’t have a workflow set up. However, the first ‘render’ endpoint only starts the process. You would use the ‘poll’ method or, even better, the ‘webhook’ method to have the file sent to you once it has completed rendering.

Here is the documentation about it, we just have to figure out how it works: Shotstack Ingestion Webhook | Shotstack Documentation
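In the meantime, here is a minimal sketch of what a handler for that callback payload could look like. The field names (`status`, `url`) are assumptions based on the docs — confirm the real shape with Bubble’s ‘Detect data’ step before relying on them:

```python
from typing import Optional

def handle_render_callback(payload: dict) -> Optional[str]:
    """Return the rendered file URL when the render is done, else None.

    The 'status' and 'url' fields are hypothetical — verify the actual
    payload shape that Shotstack sends to your backend workflow.
    """
    if payload.get("status") == "done":
        return payload.get("url")
    return None
```

In Bubble, the equivalent would be a backend workflow that only runs ‘Make changes to a thing’ when the detected `status` field is `done`.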

Thanks! Hope to get to the next step. I’m still having trouble understanding whether I put the data in the API Connector or the workflow, or both. When I initialize the call in the API Connector, it errors out if there is no data in the body. It also errors out when there is a data [parameter]. Am I supposed to set this in both places? Is the API Connector data ignored after initialization? Thank you!

So here is how you set up a webhook from Shotstack so you will always get the rendered video after it has completed.

  1. Create a backend workflow to catch your webhook.
  2. Choose ‘Detect data’.
  3. Initialize your API Connector ‘Render’ call with your own URL from the backend workflow while ‘Detect data’ is active. It will look something like the URL I put in the code.
  4. For security, drop in your Bubble API token from the settings page: Settings > API > API tokens > Generate a new API token.
  5. Remove the ‘initialize’ setting once the call has been initialized and start making the data dynamic, like the callback URL in the body below.

Check it out:

{
  "callback": "https://nocodeminute.bubbleapps.io/version-test/api/1.1/wf/shotstack?api_token=YOURAPIKEY",
  "timeline": {
    "tracks": [
      {
        "clips": [
          {
            "asset": {
              "type": "text",
              "text": "NoCodeMinute.com",
              "font": {
                "family": "Clear Sans",
                "color": "#ffffff",
                "size": 46
              },
              "alignment": {
                "horizontal": "left"
              },
              "width": 566,
              "height": 70
            },
            "start": 4,
            "length": "end",
            "transition": {
              "in": "fade",
              "out": "fade"
            },
            "offset": {
              "x": -0.15,
              "y": 0
            },
            "effect": "zoomIn"
          }
        ]
      },
      {
        "clips": [
          {
            "asset": {
              "type": "video",
              "src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/footage/earth.mp4",
              "trim": 5,
              "volume": 1
            },
            "start": 0,
            "length": "auto",
            "transition": {
              "in": "fade",
              "out": "fade"
            }
          }
        ]
      },
      {
        "clips": [
          {
            "asset": {
              "type": "audio",
              "src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/music/freepd/motions.mp3",
              "effect": "fadeOut",
              "volume": 1
            },
            "start": 0,
            "length": "end"
          }
        ]
      }
    ]
  },
  "output": {
    "format": "mp4",
    "size": {
      "width": 1280,
      "height": 720
    }
  }
}
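To make the body dynamic (step 5), the idea is to replace the hard-coded values with parameters. A rough illustration in Python — the field names mirror the JSON body above, and the function arguments stand in for what would be dynamic body parameters in Bubble:

```python
import json

def build_render_body(callback: str, text: str, video_src: str) -> str:
    """Build a Shotstack render body; callback, text and video_src
    stand in for Bubble's dynamic body parameters."""
    body = {
        "callback": callback,
        "timeline": {
            "tracks": [
                {"clips": [{
                    "asset": {"type": "text", "text": text,
                              "font": {"family": "Clear Sans",
                                       "color": "#ffffff", "size": 46}},
                    "start": 4, "length": "end",
                }]},
                {"clips": [{
                    "asset": {"type": "video", "src": video_src, "volume": 1},
                    "start": 0, "length": "auto",
                }]},
            ]
        },
        "output": {"format": "mp4", "size": {"width": 1280, "height": 720}},
    }
    return json.dumps(body)

# Example usage with placeholder values:
payload = build_render_body(
    "https://example.bubbleapps.io/version-test/api/1.1/wf/shotstack?api_token=...",
    "NoCodeMinute.com",
    "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/footage/earth.mp4",
)
```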

Does that help? Let me know. :blush:
