Builder uses AI (Artificial Intelligence) in certain features and workflows to help customers speed up design and development time.

This document covers how Builder incorporates AI and how Builder's AI workflows engage with your data.

Your privacy is of paramount importance. Here's how we approach your data:

  • Data privacy and storage. No customer data is used for training AI models.
  • Minimal data collection. For features that use OpenAI, we send OpenAI only the data in prompts and, if applicable, the text or content being edited. This data is necessary for the AI to help with your request. Other AI models we use are entirely internal to Builder and do not store customer data. OpenAI does not train on your data.
  • Usage. Data is strictly used for providing the requested service. There's no secondary usage.
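The minimal-data principle above can be sketched as a hypothetical payload builder. The `AiRequestPayload` shape and field names below are illustrative assumptions, not Builder's actual API:

```typescript
// Hypothetical sketch of the minimal-data principle: the request sent to
// the model carries only the prompt and, when editing, the content being
// edited -- nothing else. Names and shapes here are illustrative.
interface AiRequestPayload {
  prompt: string;
  editedContent?: string; // present only when editing existing content
}

function buildPayload(prompt: string, editedContent?: string): AiRequestPayload {
  const payload: AiRequestPayload = { prompt };
  if (editedContent !== undefined) {
    payload.editedContent = editedContent;
  }
  return payload;
}
```

Generating from scratch sends only the prompt; editing adds the original content, matching the per-feature flows described later in this document.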

Current Builder.io privacy policies apply to all Builder features, including AI features. Builder does not retain any AI-generated data except in cases where we need to debug an issue.

Builder uses two types of AI models:

  • Builder's proprietary, privacy-first AI models
  • An OpenAI LLM

Builder's proprietary AI models are fully built and trained internally and hosted on Google Cloud. In this way, these models do not engage with any third party besides Google Cloud, which Builder already uses for all Builder systems.

Builder uses proprietary AI models to:

  • Prepare your Figma file with the Figma plugin
  • Import your design
  • Make your Figma design responsive in the Visual Editor
  • Generate "fast" code

Some, but not all, of Builder's AI features use a fine-tuned OpenAI LLM. Currently, Builder uses OpenAI to:

  • Generate images
  • Generate and edit content
  • Generate and edit text
  • Generate "quality" code

By understanding which Builder features use AI, how they use AI, and which models they use, you can make informed decisions and be intentional with your workflow.

Though there's a growing feature set using AI in Builder, you can still create and manage your Builder content without using the AI-enhanced features.

AI features in Builder's Visual Editor have indicators including a magic wand icon and text that states that the feature uses AI.

AI is available in several Builder features. The image below shows a few examples. From the top left, they are: the Visual Editor AI section of the Insert tab, the Visual Editor AI dialogue, the AI Text Editor dialogue, the Builder Figma plugin, and the Generated Code interface.
Screenshots of the Visual Editor AI section of the Insert tab, the Visual Editor AI dialogue, the AI Text Editor dialogue, the Builder Figma plugin, and the Generated Code interface and a header that says "Examples of Builder AI features".

To determine if a feature in the Visual Editor uses AI, you can check for copy that states it's an AI feature or hover your cursor over any field. If AI is available for that field, a magic wand icon appears on hover.

The image below shows several examples of the magic wand icon, which, when clicked, opens the AI Text Editor where you can generate or edit text. Note that this is a hover state: the icon doesn't show unless you hover over the field.

Screenshots that show the magic wand hover state in various Visual Editor inputs.

Image generation, text generation and editing, and content generation and editing use a fine-tuned LLM at OpenAI to process requests. The following sections show how a prompt is submitted to the LLM and how the image, text, or content is returned.

When you use the Visual Editor to generate an image, the prompt that you give in the AI Generate dialogue goes to OpenAI, which in turn creates the image. No additional data is sent.

The diagram below shows this flow:

Diagram of Prompt going to OpenAI, which in turn creates an image.

When you use the Visual Editor to generate or edit text, the prompt that you give in the Visual Editor AI dialogue goes to OpenAI, which in turn creates the copy. If you're editing existing copy, the original text is included in the prompt.

The diagram below shows this flow:

Diagram of Prompt going to OpenAI, which in turn creates text.

When you use the Visual Editor to generate or edit content, the prompt that you give in the Visual Editor AI dialogue goes to OpenAI, which in turn creates the content. For editing, the specific content being edited is converted to code and sent.
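As a rough illustration of this editing flow, the selected content is serialized before it accompanies the prompt. The function name and the use of `JSON.stringify` below are assumptions standing in for Builder's real conversion, not its internals:

```typescript
// Hypothetical illustration of the content-editing flow: the specific
// content being edited is converted to a code/text representation and
// sent with the prompt. JSON.stringify stands in for the real conversion.
function buildContentEditRequest(prompt: string, content: object) {
  const contentAsCode = JSON.stringify(content);
  return { prompt, contentAsCode };
}
```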

The diagram below shows this flow:

Diagram of Prompt going to OpenAI, which in turn creates Builder content.

Visual Copilot is a workflow that leverages AI to complete labor-intensive tasks in moments. This workflow uses AI in the Builder Figma to Code plugin's processing of your design, in creating a responsive design in Builder's Visual Editor from the Figma import, and in generating quality code.

When you use Visual Copilot, Builder's proprietary AI models and OpenAI are each involved at different points in the process.

  1. Builder's Figma plugin sends your design to Builder's proprietary AI models. These models are hosted on Google Cloud (as is all of Builder's infrastructure) to ensure privacy and data security.
  2. Your design is imported to Builder.io.

At this point, you can publish your site using Builder's APIs and SDKs. This option relies only on Builder's private, in-house AI models.

Alternatively, you can generate code. When generating code, you have two options:
  • Fast code: this uses a Mitosis compiler without AI.
  • Quality code (optional): this uses a fine-tuned LLM at OpenAI. OpenAI does not train on your data.
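The two code-generation paths above can be sketched as a small dispatcher. The helper functions `compileWithMitosis` and `refineWithLlm` are placeholder names for illustration, not real APIs:

```typescript
// Hypothetical sketch of the two code-generation paths. The helper
// functions are placeholders standing in for the real Mitosis compile
// step and the optional OpenAI refinement step.
type CodeQuality = "fast" | "quality";

function compileWithMitosis(design: string): string {
  return `/* compiled (no AI) */ ${design}`;
}

function refineWithLlm(code: string): string {
  return `/* refined by fine-tuned LLM */ ${code}`;
}

function generateCode(design: string, quality: CodeQuality): string {
  const fast = compileWithMitosis(design); // "fast" code: Mitosis only, no AI
  return quality === "fast" ? fast : refineWithLlm(fast); // "quality" adds the LLM pass
}
```

Note that the "quality" path builds on the "fast" output, mirroring the diagram below, where the LLM refinement is an optional final step after the Mitosis compile.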

Tip: When the Builder Figma plugin generates code, it bases the code strictly on your Figma file — not on anyone else's code or designs.

The diagram below charts this process:

Diagram of how Visual Copilot processes your data. 1. Imports with the Figma plugin. 2. Goes to Builder's proprietary AI models, which are privacy-first, never trained on customer data, 100% private, and hosted on Google Cloud. 3. The plugin imports your design into the Builder.io app with responsive styles. Then, Option 1: publish your site with Builder APIs and SDKs, or Option 2: generate "fast" code (uses the Mitosis compiler). At your option, Visual Copilot then uses a fine-tuned LLM at OpenAI to create "quality" code.
For more information, read about OpenAI's training process and visit OpenAI's Security Portal.

For more information on AI features in Builder, visit Visual Copilot.

