Hi there, I have a Shotstack.io plugin that creates videos from any number of clips or images, with any transitions.
I lost touch with the original plugin author and need someone to add the ability to add overlays.
Hello @jeffmccutcheon
I can help you with this work. Send me a PM and we can discuss everything in detail.
Hi Jeff,
If you are still looking for help, feel free to reach out to me, as this is well within my area of expertise.
You can reach me on andrewjohnson56782@gmail.com
Best Wishes,
Andrew
Hello @jeffmccutcheon
If you are looking for a plugin creator, I offer a development subscription with unlimited custom plugin development requests. I've used the Shotstack API to merge videos before, so I believe I can add overlays with Shotstack in a few days. If you don't have any more requests afterward, you can cancel the subscription at any time.
Hugo
Developer @ Nalfe
60+ Bubble Templates
10+ Bubble Plugins
Bubble Component Library & Extension
No-code Development Subscriptions
No-code Podcast
Hello,
I'm also looking for a Shotstack specialist … please PM me.
Thanks
Hi, can you help me with the ShotStack plugin or how to utilize this tool for my app?
Hi,
Could not find the plugin anymore, so I want to connect directly through the API Connector plugin to the Shotstack API (curl: Hello World | Shotstack Documentation).
How do I get a response from the test endpoints? Right now they return "be sure to call a valid resource" instead of:
"response": {
"message": "Render Successfully Queued",
"id": "d2b46ed6-998a-4d6b-9d91-b8cf0193a655"
}
What an awesome AI generative video API. They just have no no-code documentation for Bubble, just Zapier. Thanks in advance!
-jz
Did you get valid API keys and set up your authentication correctly?
I have experience with the Shotstack plugin. Please send me the details.
Hey @initialsjz
I got it working pretty quickly. Fortunately, they have pretty good documentation. So for the first call, you need to render something. Like this:
I used this Body from their example to get it to work:
{
"timeline": {
"tracks": [
{
"clips": [
{
"asset": {
"type": "text",
"text": "NoCodeMinute.com",
"font": {
"family": "Clear Sans",
"color": "#ffffff",
"size": 46
},
"alignment": {
"horizontal": "left"
},
"width": 566,
"height": 70
},
"start": 4,
"length": "end",
"transition": {
"in": "fade",
"out": "fade"
},
"offset": {
"x": -0.15,
"y": 0
},
"effect": "zoomIn"
}
]
},
{
"clips": [
{
"asset": {
"type": "video",
"src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/footage/earth.mp4",
"trim": 5,
"volume": 1
},
"start": 0,
"length": "auto",
"transition": {
"in": "fade",
"out": "fade"
}
}
]
},
{
"clips": [
{
"asset": {
"type": "audio",
"src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/music/freepd/motions.mp3",
"effect": "fadeOut",
"volume": 1
},
"start": 0,
"length": "end"
}
]
}
]
},
"output": {
"format": "mp4",
"size": {
"width": 1280,
"height": 720
}
}
}
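For anyone working outside Bubble's API Connector, the same render call can be sketched in plain Python with only the standard library. The stage endpoint URL and the `x-api-key` header follow Shotstack's documentation; the helper names and the trimmed-down two-track payload are my own illustration, not the exact Body above:

```python
import json
import os
import urllib.request

STAGE_RENDER_URL = "https://api.shotstack.io/stage/render"  # sandbox endpoint

def render_payload(text, video_src):
    """Build a minimal two-track timeline: a text title over a video clip."""
    return {
        "timeline": {
            "tracks": [
                {"clips": [{
                    "asset": {"type": "text", "text": text,
                              "font": {"family": "Clear Sans", "color": "#ffffff", "size": 46}},
                    "start": 4, "length": "end",
                    "transition": {"in": "fade", "out": "fade"},
                }]},
                {"clips": [{
                    "asset": {"type": "video", "src": video_src, "volume": 1},
                    "start": 0, "length": "auto",
                }]},
            ]
        },
        "output": {"format": "mp4", "size": {"width": 1280, "height": 720}},
    }

def queue_render(payload, api_key):
    """POST the edit to the stage render endpoint; returns the parsed JSON reply."""
    req = urllib.request.Request(
        STAGE_RENDER_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    key = os.environ.get("SHOTSTACK_API_KEY")
    if key:  # only hits the network when a key is provided
        body = render_payload(
            "NoCodeMinute.com",
            "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/footage/earth.mp4",
        )
        print(queue_render(body, key))
```

In Bubble you would do the equivalent by pasting the Body into the API Connector call; the sketch is just to make the request shape explicit.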
Then you can check on the status of the render like this:
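As a sketch in Python: the status check is a GET to the render endpoint with the `id` returned when the job was queued. The URL pattern (`stage` for sandbox, `v1` for live) and the `x-api-key` header follow Shotstack's documentation; the helper names are mine:

```python
import json
import os
import urllib.request

def status_url(render_id, stage=True):
    """Build the status-check URL; the 'stage' segment targets the sandbox."""
    base = "https://api.shotstack.io/stage" if stage else "https://api.shotstack.io/v1"
    return f"{base}/render/{render_id}"

def get_status(render_id, api_key, stage=True):
    """GET the render's current state ('queued', 'rendering', 'done', ...)."""
    req = urllib.request.Request(status_url(render_id, stage),
                                 headers={"x-api-key": api_key})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    key = os.environ.get("SHOTSTACK_API_KEY")
    if key:
        print(get_status("d2b46ed6-998a-4d6b-9d91-b8cf0193a655", key))
```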
Then you just need to fill in the data you want by swapping the static values in the Render endpoint's body for dynamic values supplied from a workflow.
I am using the "stage" endpoint so it doesn't count against any credits. It will have the watermark; that is for the test environment. For the live environment, you can use the real endpoint without the "stage" part in the URL. This could also be done dynamically by taking advantage of the "isn't live version" value in a workflow.
Does that help a little bit? Let us know where you get stuck and we can help some more.
Wow, thanks! How do I do the next part (a screenshot of a workflow would be great)?
I created a database thing and use it in a workflow, but "Make changes to a thing" does not save the API URL response to my database. Thanks so much!
I don't have a workflow set up. However, the first "render" endpoint only starts the process. You would use the "poll" method or, even better, the "webhook" method to send the file to you once it has completed rendering.
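To make the "poll" option concrete, here is a hedged Python sketch: keep checking the render's status until it reaches "done" or "failed". The status values follow Shotstack's documentation; the callable-based structure is my own so it can wrap any status-check function:

```python
import time

def poll_until_done(check_status, interval=3.0, timeout=300.0, sleep=time.sleep):
    """Call check_status() until it reports 'done' or 'failed', or timeout expires.

    check_status must return a dict with at least a 'status' key
    (and, per Shotstack's docs, a 'url' key once the render has finished).
    """
    waited = 0.0
    while waited <= timeout:
        result = check_status()
        if result.get("status") in ("done", "failed"):
            return result
        sleep(interval)
        waited += interval
    raise TimeoutError("render did not finish within the timeout")
```

In practice `check_status` would wrap a GET to Shotstack's render status endpoint; the webhook route avoids this loop entirely, which is why it's the better fit for Bubble.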
Here is the documentation about it, we just have to figure out how it works: Shotstack Ingestion Webhook | Shotstack Documentation
Thanks! Hope to get to the next step. I'm still having trouble understanding whether I put the data in the API Connector or the workflow, or both. When I initialize the call in the API Connector, it errors out if there is no data in the body. It also errors out when there is a data parameter. Am I supposed to set this in both places? Is the API Connector data ignored after initialization? Thank you!
So here is how you set up a webhook from Shotstack so you will always get the rendered video after it has completed.
Check it out:
{
"callback": "https://nocodeminute.bubbleapps.io/version-test/api/1.1/wf/shotstack?api_token=YOURAPIKEY",
"timeline": {
"tracks": [
{
"clips": [
{
"asset": {
"type": "text",
"text": "NoCodeMinute.com",
"font": {
"family": "Clear Sans",
"color": "#ffffff",
"size": 46
},
"alignment": {
"horizontal": "left"
},
"width": 566,
"height": 70
},
"start": 4,
"length": "end",
"transition": {
"in": "fade",
"out": "fade"
},
"offset": {
"x": -0.15,
"y": 0
},
"effect": "zoomIn"
}
]
},
{
"clips": [
{
"asset": {
"type": "video",
"src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/footage/earth.mp4",
"trim": 5,
"volume": 1
},
"start": 0,
"length": "auto",
"transition": {
"in": "fade",
"out": "fade"
}
}
]
},
{
"clips": [
{
"asset": {
"type": "audio",
"src": "https://shotstack-assets.s3-ap-southeast-2.amazonaws.com/music/freepd/motions.mp3",
"effect": "fadeOut",
"volume": 1
},
"start": 0,
"length": "end"
}
]
}
]
},
"output": {
"format": "mp4",
"size": {
"width": 1280,
"height": 720
}
}
}
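On the Bubble side, the callback above arrives at the backend workflow URL. To illustrate what the receiving end has to handle, here is a minimal sketch of a callback parser. The payload fields (`id`, `status`, `url`) are my assumption based on Shotstack's webhook documentation, so verify them against the actual request Shotstack sends:

```python
import json

def parse_render_callback(raw_body):
    """Extract the fields a workflow typically needs from a Shotstack callback.

    Assumed payload shape (check Shotstack's webhook docs):
    {"id": "...", "status": "done", "url": "https://.../video.mp4", ...}
    """
    payload = json.loads(raw_body)
    return {
        "render_id": payload.get("id"),
        "succeeded": payload.get("status") == "done",
        "video_url": payload.get("url"),  # only present once rendering completed
    }
```

In Bubble, the backend workflow's parameters would play the role of this parser: detect incoming request data, then use "Make changes to a thing" to save the video URL.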
Does that help? Let me know.