TL;DR:
- You can now use LLM-optimized documentation to quickly integrate Face AR SDK & Video Editor SDK with the help of AI coding assistants;
- This doesn't replace the traditional integration methods, which remain fast and convenient;
- The new addition simply gives developers more options in how to approach their work.
Integrate Banuba products with vibe coding
Vibe coding is one of the major software development trends. Microsoft's CEO Satya Nadella has claimed that up to 30% of the company's current code is AI-generated, and Sundar Pichai has made similar claims about Google. Large companies are clearly pushing this approach, though the results are sometimes questionable.
To avoid errors and make vibe coding easier even for someone who has never programmed "hello world," we have released a set of documents designed to be easily digestible by AI reasoning engines. Each one combines all the necessary information in a single file, with clear formatting optimized for context windows.
In practice, this means you can feed these documents to the agent of your choice and drink a cup of coffee while the AI does the work, without spending hours debugging the results afterwards.
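"Feeding the documents to an agent" usually just means putting the single-file documentation into the model's context alongside your integration task. Here is a minimal sketch of that idea; the file contents, prompt wording, and `build_prompt` helper are illustrative assumptions, not part of Banuba's tooling:

```python
# Sketch: wrap a single-file, LLM-optimized doc and an integration task
# into one prompt for a coding agent. The prompt format is hypothetical.

def build_prompt(doc_text: str, task: str) -> str:
    """Combine the SDK documentation and the integration task into one prompt."""
    return (
        "You are a coding assistant integrating a mobile SDK.\n\n"
        "=== SDK DOCUMENTATION ===\n"
        f"{doc_text}\n\n"
        "=== TASK ===\n"
        f"{task}\n"
    )

if __name__ == "__main__":
    # In real use, doc_text would be read from the downloaded docs file,
    # e.g. open("video_editor_sdk_llm.md").read() (file name is hypothetical).
    doc_text = "Video Editor SDK: add the dependency, set the license token, launch the editor."
    task = "Integrate the Video Editor SDK into an existing Android app."
    print(build_prompt(doc_text, task))
```

Because everything the agent needs sits in one file, there is no crawling or link-chasing involved: the whole reference fits in a single context window.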
You can find the documentation here:
Note that this method doesn’t replace our usual integration flows. If you prefer a more hands-on approach, feel free to follow the regular guides:
After all, the regular flow is also quick: for example, Video Editor SDK can be integrated in under 8 minutes.