Test Types: Single Session Test

Testers use your product in one sitting (5-60 minutes) and provide feedback through a survey, along with bug/issue reports.

The Single Session Test allows you to provide a link to your product and collect feedback via a survey after testers have used your product in a single usage session. You'll also receive bug/issue reports for any issues testers encounter.

Choose Single Session Test (previously called Single Session Beta Test) as your test type when creating your test.


Test Details

Test Title

Your test title is shown publicly to potential applicants during recruiting. Make it something relevant to your test, for example: "First impressions of a strategy game".

Brief description of your test

In the description, you can describe a bit about what you're researching or who you're targeting. This is also shown to applicants during the recruiting phase.

How are my test title and description shown during recruiting?

When you launch your test, testers will receive an email invite with your Test Title as the email subject and Test Description in the email body. It will look similar to the screenshot below and will ask testers to apply to participate.

In the email below, "We want your feedback on a new packaging design" is the Test Title, and "Help us understand what resonates..." is the Brief Description of Your Test.


Test Instructions

Be sure to include:

  • Link to access your product (e.g. link to your website, TestFlight public beta URL, Play Store, APK, Firebase, prototype URL, etc).

  • Details on what you'd like testers to do. This can be high level, or with step-by-step tasks.

  • Optionally, attach any other documents with instructions. Make sure everything is clear and easy to understand, or you'll get poor results from your test.

Don't include:

  • Don't put questions directly in the instructions (except for a Usability Video test). Instead, include any questions you want answered in the survey itself.

  • Don't expect testers to purchase something or go through a process that requires sensitive personal or financial information.

  • Don't use this test type for a live/group test that occurs at a specific day/time. Instead, use the Multi-Day Test for that.

  • Don't provide a test process that can't be completed within 24 hours.


Timing & Pricing

How long will it take to complete your test?

Provide an estimate for how long the test will take each tester from start to finish. Be sure to include a reasonable estimate for time required to download/access your product, engage with your product, and complete your survey.

How quickly do you want results?

  • With the Standard option, testers have 12 hours to finish your test, regardless of the expected time required. You should get the full results within ~24 hours if you're running a test with Automatic acceptance (rather than Manual Review). If a tester shows as "Active" during your test, it's because their due date has not yet passed.

  • The Expedited option costs $199 / test and allows you to get results within a few hours. If you have a contract with BetaTesting, we can discuss providing this feature for all your tests at no additional cost.

  • The No Rush option allows testers up to 24 hours to submit their results. If your test might require a tester to set aside some unique time during the day, this option would be helpful (e.g. an app that someone can use during a walk or for meal planning).

Price / Tester

BetaTesting uses credits for running tests. You can get volume discounts by purchasing subscriptions or larger credit packages.

The Single Session Test costs 1 credit per tester for the standard test time requirements (30 minutes). Credit costs are inclusive of payments to testers (which means you don't pay anything extra for incentives).

How does my target audience impact pricing?

Targeting business professionals with employment targeting criteria, or defining very niche consumer requirements, increases the credit cost by 2x. This is because business professionals (and other difficult-to-reach audiences) demand higher incentives to participate, and higher rewards lead to better results.

For credit package options, see our pricing page. We also offer a credit calculator to estimate how many credits you'll need for various test configurations.
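The pricing rules above (1 credit per tester at the standard 30-minute length, doubled for hard-to-reach audiences) can be sketched as a quick estimator. This is an illustrative calculation only, not an official BetaTesting tool; the function name and parameters are made up for this example, and our credit calculator remains the authoritative source.

```python
def estimate_credits(testers: int, hard_to_reach: bool = False) -> int:
    """Rough credit estimate for a Single Session Test.

    Baseline: 1 credit per tester (standard ~30-minute test).
    Hard-to-reach audiences (e.g. business professionals or very
    niche consumer targeting) cost 2x per tester.
    """
    per_tester = 2 if hard_to_reach else 1
    return testers * per_tester

# 50 general consumers -> 50 credits
print(estimate_credits(50))
# 50 business professionals -> 100 credits
print(estimate_credits(50, hard_to_reach=True))
```

Note that the $199 Expedited option is a flat per-test fee, not a per-tester credit, so it isn't part of this calculation.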

Custom Incentives Feature

If you need to customize the incentive amount provided to each participant, get in touch to discuss our custom incentives feature. This is useful if you're giving testers free products as part of the test process (e.g. a free TV), testing with your own customers/employees, or need to offer higher incentives for difficult tests.

When using this feature, our system recommends average incentive amounts based on your test type, time expectations, and your target audience. You can learn more about choosing how much to incentivize in this article.


Recruiting & Screening

You can choose your recruitment criteria and screening questions like you normally would on any test. Learn how tester recruiting works here.


Survey Design

After testers complete your tasks, they will be given your survey. You can use our Standard UX Survey, build your own survey, use our Participation Verification survey, or link to an external survey.

Standard UX Survey

Our standard UX survey includes industry-standard qualitative and quantitative questions. This is a safe bet for any UX test, and you can use it as-is or edit it as you desire!

Building your own survey

You can build your own survey by using any of our base question types, or our core question bank. Learn about the various survey question types here and how to use show/hide logic here.

Participation Verification Survey: Linking feedback to your database usage data

BetaTesting does not integrate directly with your product. A Participation Verification survey is useful in the following cases:

  1. If you want to cross reference your user feedback to exact usage data in your database

  2. If you want testers to prove they participated fully and completed your tasks and instructions

The Participation Verification Survey is a simple 3-question standalone survey that asks testers for the following info:

  • Username, email, or phone number used to access your product, so you can cross-reference the data

  • Screenshot to validate that they used the product

  • A few sentences about the testing experience.

You also have the option to add "participation verification" questions directly to any survey that you create. Choose "Verify Participation" within the Core Question Bank, and it will add these 3 questions directly to any survey.

Using an external survey on Qualtrics / SurveyMonkey / Typeform, etc.

You can choose to refer testers to an external survey.

When doing that, you'll need a way to validate that your testers actually completed your survey since we don't integrate directly with third party survey providers. You can do that by providing a completion code at the end of your third-party survey, or by cross-referencing something like name or email (if you collect these in your survey), or asking participants to upload a screenshot.
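One common approach is a per-tester completion code: your third-party survey's end screen shows a code derived from something the tester entered (e.g. their email), and you verify the code they report back. BetaTesting doesn't generate these codes for you, so the sketch below is one hypothetical way to do it yourself using Python's standard library; the secret key and function names are placeholders.

```python
import hashlib
import hmac

# Hypothetical secret known only to you; keep it out of the survey itself.
SECRET = b"replace-with-your-own-secret"

def completion_code(tester_email: str) -> str:
    """Derive a short, per-tester completion code to display on the
    final page of a third-party survey (Qualtrics, Typeform, etc.)."""
    digest = hmac.new(SECRET, tester_email.strip().lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:8].upper()

def verify(tester_email: str, submitted_code: str) -> bool:
    """Check the code a tester submits back on BetaTesting against
    the one their email should have produced."""
    expected = completion_code(tester_email)
    return hmac.compare_digest(expected, submitted_code.strip().upper())
```

Because the code is derived from the tester's email with a secret only you hold, a tester can't produce a valid code without actually reaching the survey's end screen. Cross-referencing names/emails or collecting a screenshot works just as well if you'd rather avoid any setup.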


Reviewing your Survey Feedback Results

After your test launches, you will get survey results that you can review, filter, highlight, segment, and export. Learn about reviewing your results here.


💡 Want to learn more? Book a call/demo with our team or get in touch through our contact form.
