
Builder uses AI (Artificial Intelligence) in certain features and workflows to help customers speed up design and development time.

This document covers how Builder incorporates AI and how Builder's AI workflows engage with your data.

Your privacy is of paramount importance. Here's how we approach your data:

  • Data privacy and storage. No customer data is used for training AI models.
  • Minimal data collection. For features that use external Large Language Models (LLMs), we send those LLMs only the data in prompts and, if applicable, the text or content being edited. This data is necessary for the AI to help with your request. Other AI models we use are entirely internal to Builder and do not store customer data. The third-party LLMs we use do not train on your data.
  • Usage. Data is strictly used for providing the requested service. There's no secondary usage.

Current Builder.io privacy policies apply to all Builder features, including AI features. Builder does not retain any AI-generated data except in cases where we need to debug an issue.

Builder uses two types of AI models:

  • Builder's proprietary, privacy-first AI models
  • Third-party Large Language Models (LLMs)

Builder's proprietary AI models are fully built and trained internally and hosted on Google Cloud. As a result, these models never engage with any third party besides Google Cloud, which Builder already uses for all its systems.

Builder uses proprietary AI models to:

  • Prepare your Figma file with the Figma plugin
  • Import your design
  • Make your Figma design responsive in the Visual Editor
  • Generate "fast" code

Some, but not all, of Builder's AI features use fine-tuned external LLMs. Currently, Builder uses LLMs to:

  • Generate images
  • Generate and edit content
  • Generate and edit text
  • Generate "quality" code

Builder currently uses OpenAI and Anthropic Claude. We may also use open-source LLMs, such as Mistral or Llama 2. Enterprise plan customers can choose which LLMs to use, bring their own LLM, or turn off LLM use entirely.
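To make the Enterprise options concrete, here is a minimal sketch of what such an LLM preference could look like. The type and function names are hypothetical illustrations, not Builder's actual settings API:

```typescript
// Hypothetical shape of an Enterprise LLM preference. The real setting
// lives in the Builder UI; these names are illustrative only.
type LlmProvider = "openai" | "anthropic" | "self-hosted" | "disabled";

interface LlmSettings {
  provider: LlmProvider;
  // Endpoint for a customer-hosted model; unused for other providers.
  selfHostedUrl?: string;
}

// Returns true when external LLM calls are allowed at all.
function llmEnabled(settings: LlmSettings): boolean {
  return settings.provider !== "disabled";
}
```

With `provider: "disabled"`, only Builder's proprietary, internally hosted models would ever run.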

By understanding which Builder features use AI, how they use AI, and which models they use, you can make informed decisions and be intentional with your workflow.

Though there's a growing feature set using AI in Builder, you can still create and manage your Builder content without using the AI-enhanced features.

AI features in Builder's Visual Editor have indicators including a magic wand icon and text that states that the feature uses AI.

AI is available in a number of Builder features.

The image below shows a few examples of Builder features that use AI. The examples here are, from the top left: the Visual Editor AI section of the Insert tab, the Visual Editor AI dialogue, the AI Text Editor dialogue, the Builder Figma plugin, and the Generated Code interface.
[Image: Examples of Builder AI features.]

To determine if a feature in the Visual Editor uses AI, you can check for copy that states it's an AI feature or hover your cursor over any field. If AI is available for that field, a magic wand icon appears on hover.

The image below shows several examples of the magic wand icon, which, when clicked, opens the AI Text Editor where you can generate or edit text. Note that this is a hover state; the icon doesn't show unless you hover over the field.

[Image: Screenshots showing the magic wand hover state in various Visual Editor inputs.]

Image generation, text generation and editing, and content generation and editing use a fine-tuned LLM to process requests. The following sections show how a prompt is submitted to the LLM and how the image, text, or content is returned.

When you use the Visual Editor to generate an image, the prompt that you give in the AI Generate dialogue goes to the LLM, which in turn creates the image. No additional data is sent.

The diagram below shows this flow:

[Diagram: The prompt goes to an LLM, which in turn creates an image.]
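The "no additional data" rule for image generation can be sketched as a payload builder that is constructed from the prompt alone. Builder's real request format is not public; the interface and function name here are hypothetical:

```typescript
// Hypothetical sketch of an image-generation request: only the user's
// prompt is sent to the LLM; no workspace content or user data rides along.
interface ImageRequest {
  prompt: string;
}

function buildImagePayload(prompt: string): ImageRequest {
  // Deliberately constructed from the prompt and nothing else.
  return { prompt };
}
```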

When you use the Visual Editor to generate or edit text, the prompt that you give in the Visual Editor AI dialogue goes to the LLM, which in turn creates the copy. If you're editing existing copy, the original text is included in the prompt.

The diagram below shows this flow:

[Diagram: The prompt goes to an LLM, which in turn creates text.]

When you use the Visual Editor to generate or edit content, the prompt that you give in the Visual Editor AI dialogue goes to the LLM, which in turn creates the content. For editing, the specific content being edited is converted to code and sent.

The diagram below shows this flow:

[Diagram: The prompt goes to the LLM, which in turn creates Builder content.]
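The two editing flows described above can be sketched as payload builders. These types and names are hypothetical illustrations (Builder's real request format is not public), and plain JSON serialization stands in for the content-to-code conversion the document describes:

```typescript
// Hypothetical sketch of the data sent for text and content edits.
interface EditRequest {
  prompt: string;          // always sent: the user's instruction
  originalText?: string;   // text edits: the copy being edited
  contentAsCode?: string;  // content edits: the edited block, serialized
}

// Text edit: the prompt plus the original copy, nothing else.
function buildTextEditPayload(prompt: string, originalText: string): EditRequest {
  return { prompt, originalText };
}

// Content edit: only the specific block being edited is serialized and sent
// (JSON.stringify stands in for the real content-to-code conversion).
function buildContentEditPayload(prompt: string, editedBlock: object): EditRequest {
  return { prompt, contentAsCode: JSON.stringify(editedBlock) };
}
```

In both cases the payload stays minimal: only the material needed to fulfill the request leaves Builder.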

Visual Copilot is a workflow that leverages AI to complete labor-intensive tasks in moments. This workflow uses AI in the Builder Figma plugin, in creating a responsive design in Builder's Visual Editor from the Figma import, and in generating quality code.

When you use Visual Copilot, Builder's proprietary AI models and LLMs are involved at several distinct points in the process.

  1. Builder's Figma plugin sends your design to Builder's proprietary AI models. These models are hosted on Google Cloud (as is all of Builder's infrastructure) to ensure privacy and data security.
  2. Your design is imported to Builder.io.

At this point, you can publish your site using Builder's APIs and SDKs. This option relies only on Builder's private, in-house AI models.

Alternatively, you can generate code. When generating code, you have two options:

  • Fast code: this uses the Mitosis compiler, with no AI involved.
  • Quality code (optional): this uses a fine-tuned third-party LLM. The third-party LLMs that Builder uses do not train on your data.

Tip: When the Builder Figma plugin generates code, it bases the code strictly on your Figma file — not on anyone else's code or designs.

The diagram below charts this process:

[Diagram: How Visual Copilot processes your data. 1. Import with the Figma plugin. 2. The design goes to Builder's proprietary AI models, which are privacy-first, never trained on customer data, 100% private, and hosted on Google Cloud. 3. The plugin imports your design into the Builder.io app with responsive styles. Then, Option 1: publish your site with Builder APIs and SDKs, or Option 2: generate "fast" code (Mitosis compiler). Optionally, Visual Copilot then uses a fine-tuned LLM at OpenAI to create "quality" code.]
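The decision points in the Visual Copilot process can be summarized in a short sketch. The function and type names are hypothetical, and the booleans simply restate which steps touch which models, as described above:

```typescript
// Hypothetical summary of Visual Copilot's model usage per run.
type CodeMode = "fast" | "quality";

interface PipelineResult {
  usedProprietaryModels: boolean; // Figma import + responsive styles, always
  usedExternalLlm: boolean;       // only when "quality" code is requested
}

function visualCopilotRun(generateCode: boolean, mode: CodeMode = "fast"): PipelineResult {
  // Steps 1-2: the design goes to Builder's in-house models on Google Cloud,
  // then is imported into the Builder.io app with responsive styles.
  const usedProprietaryModels = true;
  // Publishing via APIs/SDKs or generating "fast" code (Mitosis compiler)
  // involves no LLM; only opting into "quality" code uses a fine-tuned
  // third-party LLM, which does not train on your data.
  const usedExternalLlm = generateCode && mode === "quality";
  return { usedProprietaryModels, usedExternalLlm };
}
```

The key takeaway: an external LLM enters the picture only when you explicitly opt into quality code generation.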

For more information on AI features in Builder, visit Visual Copilot.
