# A/B Testing Survey

**Note:** A/B Testing is available on the **Growth plan and above**.

### How it works <a href="#how-it-works" id="how-it-works"></a>

You pick two existing surveys (Survey A and Survey B), set a traffic split ratio, choose which channels to test on, and start the test. SEA Survey automatically routes each visitor to one of the two versions based on the ratio you set. Results update in real time.
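
SEA Survey handles this routing for you, so no code is required. For intuition only, the assignment is equivalent to something like the sketch below, which hashes a stable visitor ID into a 0–99 bucket and compares it against the split. The function and field names here are illustrative assumptions, not SEA Survey's internal API:

```typescript
// Illustrative only: SEA Survey performs this routing itself.
// A stable hash keeps each visitor on the same variant across visits.

function hashToBucket(visitorId: string): number {
  // Simple 32-bit FNV-1a hash, reduced to a bucket in the range 0–99.
  let h = 0x811c9dc5;
  for (const ch of visitorId) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 0x01000193);
  }
  return (h >>> 0) % 100;
}

function assignVariant(visitorId: string, percentA: number): "A" | "B" {
  // With percentA = 80, buckets 0–79 see Survey A and 80–99 see Survey B.
  return hashToBucket(visitorId) < percentA ? "A" : "B";
}

console.log(assignVariant("visitor-42", 80)); // "A" or "B", stable per visitor
```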

***

### Set up an A/B test <a href="#set-up-an-ab-test" id="set-up-an-ab-test"></a>

**Step 1 — Go to the A/B Testing section**

In the left navigation, click **A/B testing**.

**Step 2 — Create a new test**

Click **Create new test** and give it a name.

**Step 3 — Select Survey A and Survey B**

Choose the two surveys you want to compare. Both surveys must share the same **page targeting** and **user targeting** settings on the channel you want to test — otherwise that channel will not be available for selection.

**Step 4 — Set the traffic split**

Enter the percentage of traffic going to each version. The two values must add up to 100.

| Example ratio | Effect                                                 |
| ------------- | ------------------------------------------------------ |
| 50 / 50       | Each survey receives equal traffic                     |
| 80 / 20       | Survey A gets most traffic; Survey B is the challenger |
| 70 / 30       | Common when you want to protect a known-good survey    |

**Step 5 — Choose the channels to test**

Select one or more channels: **Site Widget**, **Thank you page**, **Post-purchase email**, **Exit intent**.

Only channels where both surveys share the same targeting settings will appear as options.

**Step 6 — Start or schedule the test**

* Click **Run now** to start immediately.
* Click **Schedule** to set a start date and time.

{% embed url="https://app.arcade.software/share/BG28d4upq68Fq7elPWWR" %}

***

### Track results <a href="#track-results" id="track-results"></a>

#### Impressions over time <a href="#impressions-over-time" id="impressions-over-time"></a>

A line chart shows the number of impressions for Survey A and Survey B over the test period.

#### Engagement metrics <a href="#engagement-metrics" id="engagement-metrics"></a>

For each channel, the dashboard shows:

| Metric              | What it measures                                                          |
| ------------------- | ------------------------------------------------------------------------- |
| **Start rate**      | Percentage of impressions where a customer started answering              |
| **Completion rate** | Percentage of started surveys that were fully completed                   |
| **Dismissal rate**  | Percentage of impressions where the survey was dismissed without starting |
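
If you export raw counts and want to recompute these rates yourself, the arithmetic is straightforward. The sketch below shows one reasonable reading of the definitions above; the field names are assumptions, not an export format SEA Survey guarantees:

```typescript
// Raw per-channel counts; field names are illustrative.
interface ChannelCounts {
  impressions: number; // times the survey was shown
  starts: number;      // customers who began answering
  completions: number; // customers who finished the survey
  dismissals: number;  // surveys dismissed without starting
}

function engagementRates(c: ChannelCounts) {
  const pct = (num: number, den: number) =>
    den === 0 ? 0 : (num / den) * 100;
  return {
    startRate: pct(c.starts, c.impressions),         // starts per impression
    completionRate: pct(c.completions, c.starts),    // completions per start
    dismissalRate: pct(c.dismissals, c.impressions), // dismissals per impression
  };
}

// Example: 1,000 impressions, 300 starts, 210 completions, 500 dismissals
console.log(
  engagementRates({ impressions: 1000, starts: 300, completions: 210, dismissals: 500 })
);
// → { startRate: 30, completionRate: 70, dismissalRate: 50 }
```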

#### Version history <a href="#version-history" id="version-history"></a>

The dashboard logs every change made to either survey during the test. This lets you understand whether a metric shift was caused by a survey edit or by organic traffic changes.
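
The dashboard surfaces this log in the UI; there is no documented export schema. If it helps to picture what each entry captures, a hypothetical record might look like this (every field name below is an assumption):

```typescript
// Hypothetical shape of a version-history entry; not an official schema.
interface VersionHistoryEntry {
  timestamp: string; // when the edit was made (ISO 8601)
  survey: "A" | "B"; // which survey in the test was edited
  field: string;     // what changed, e.g. "question 2 wording"
  before: string;    // value before the edit
  after: string;     // value after the edit
}
```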

***

### Test states <a href="#test-states" id="test-states"></a>

| State         | Description                                               |
| ------------- | --------------------------------------------------------- |
| **Draft**     | Test created but not started yet                          |
| **Scheduled** | Test will start automatically at the set date and time    |
| **Running**   | Traffic is being split and data is being collected        |
| **Completed** | Test has ended; results and version history are preserved |

You can end a running test at any time. Results and the full version history remain accessible after the test ends.
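
The table implies a simple lifecycle. A state-machine sketch makes the allowed moves explicit; the transition set is inferred from this page, not from a published spec:

```typescript
// Inferred lifecycle; ending a test is the manual Running → Completed move.
type TestState = "Draft" | "Scheduled" | "Running" | "Completed";

const allowedTransitions: Record<TestState, TestState[]> = {
  Draft: ["Scheduled", "Running"], // Schedule, or Run now
  Scheduled: ["Running"],          // starts automatically at the set time
  Running: ["Completed"],          // ends on schedule or when you end it
  Completed: [],                   // terminal; results are preserved
};

function canTransition(from: TestState, to: TestState): boolean {
  return allowedTransitions[from].includes(to);
}

console.log(canTransition("Running", "Completed")); // true
console.log(canTransition("Completed", "Running")); // false
```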

***

### Requirements and limitations <a href="#requirements-and-limitations" id="requirements-and-limitations"></a>

* Both surveys must be **Active** before the test can start.
* A channel is only testable if both Survey A and Survey B share the **same page targeting and user targeting** on that channel.
* Each survey can participate in only one running test at a time.
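
Taken together, these rules amount to a per-channel precondition check. Here is a hedged sketch of that check; all type and field names are illustrative, not SEA Survey's data model:

```typescript
// Illustrative preconditions; mirrors the three rules listed above.
interface Survey {
  id: string;
  status: "Active" | "Paused" | "Draft";
  inRunningTest: boolean;
  // Targeting settings keyed by channel name.
  targeting: Record<string, { pageTargeting: string; userTargeting: string }>;
}

function channelIsTestable(a: Survey, b: Survey, channel: string): boolean {
  const ta = a.targeting[channel];
  const tb = b.targeting[channel];
  return (
    a.status === "Active" && b.status === "Active" &&   // both surveys Active
    !a.inRunningTest && !b.inRunningTest &&             // neither in a running test
    ta !== undefined && tb !== undefined &&             // channel configured on both
    ta.pageTargeting === tb.pageTargeting &&            // same page targeting
    ta.userTargeting === tb.userTargeting               // same user targeting
  );
}
```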
