What is an AI prompt?

Think of it as the instructions you give an AI model to generate your desired output. For “see you there”, our prompts instructed the models to generate pictures of Black & Abroad guests at different destinations around the world.
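As a sketch, a text-to-image prompt is just a descriptive string handed to the model. The wording and the `model.generate` call below are illustrative only, not the actual prompts or platform API used in the project:

```python
# Illustrative only: a generative AI prompt is a plain-text description
# of the output you want the model to produce.
prompt = "a photo of a Black & Abroad guest enjoying a sunset in Accra, Ghana"

# A hypothetical image-generation call would receive this string as input,
# e.g. model.generate(prompt) on whichever platform is being used.
print(prompt)
```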

Did you get permission to use people’s photos?

Yes. Each Black & Abroad guest filled out waivers on their past trips with us, clearing their travel images for future use. We only generated personalized photo albums for past guests who opted in to the “see you there” project. Guests’ images were only ever used to generate pictures of them in our upcoming destinations – never shared with third parties during training or image generation.

What generative AI models did you use?

For training, tuning and generation, we used a widely available, powerful generative AI platform that lets users train its models on their own uploaded images. However, we’re not here to single out a specific technology platform for being biased. We believe racial bias is systemic across society, culture and technology. This is well documented elsewhere. The world is biased. The Internet is biased. How could AI models trained on the Internet not also be biased? The work ahead is to work together to identify these phenomena as they occur and address them with speed, agility and vision for a more inclusive and joyful future for all.

How many tests did you run?

We started “see you there” in December 2023 and ended in March 2024. Over those four months, we ran hundreds of generative tests and used the findings for this site.

What factors led to problematic images being generated?

We identified four variables that impacted the accuracy of our generated image outputs: geographic destination (e.g., “Bamako, Mali”), specific activities (e.g., “at a tapas bar”), prompt specificity (e.g., “Black traveler”), and the quality of images in our training data.
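To make the interplay of the first three variables concrete, here is a minimal sketch of how they might be combined into a single text-to-image prompt. The function and wording are hypothetical, not the actual prompts used in the project; the fourth variable, training-image quality, is controlled by curating the uploaded training set rather than by the prompt text.

```python
# Hypothetical sketch: combining three of the four variables we identified
# (destination, activity, subject specificity) into one text-to-image prompt.
def build_prompt(destination: str, activity: str, subject: str) -> str:
    """Assemble a prompt from the three text-controllable variables."""
    return f"a photo of a {subject} {activity} in {destination}"

prompt = build_prompt(
    destination="Bamako, Mali",   # geographic destination
    activity="at a tapas bar",    # specific activity
    subject="Black traveler",     # prompt specificity about the subject
)
print(prompt)
# → a photo of a Black traveler at a tapas bar in Bamako, Mali
```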

How did you change your prompts to achieve accurate photos?

We quickly learned that prompt specificity is key. The more detail we added to our prompts, the more accurate the results tended to be. Here’s an example.
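One way to illustrate the difference specificity makes (the strings below are hypothetical, not the campaign’s actual prompts):

```python
# Hypothetical before/after showing prompt specificity; not the actual
# prompts used for "see you there".
vague_prompt = "a traveler on a beach"
specific_prompt = (
    "a Black traveler with natural hair and gold hoop earrings, "
    "smiling on a white-sand beach at golden hour"
)

# The specific prompt pins down attributes (skin tone, hair, jewelry,
# setting, lighting) that the vague prompt leaves to the model's
# biased defaults.
for p in (vague_prompt, specific_prompt):
    print(p)
```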

If generative AI is so racially biased, why did you use it?

We used generative AI to create “see you there” because it’s an innovative tool with the power to create incredible experiences for our customers. Rather than avoid it, we chose to embrace the new technology and use our discoveries of implicit racial bias as valuable evidence for the need to build and train generative AI models with more inclusive data and guardrails. There’s still work to be done but we truly believe AI can help us achieve our goal of every traveler seeing themselves in an inclusive vision of travel.

Why doesn’t the generated photo look exactly like the real guest?

Even as we worked to overcome biases in the generative AI models, they still tended to standardize body types, remove or alter jewelry, introduce variations in skin tone and hair, and alter appearances in other ways, both small and large. These images are intended to be fun and interesting; they are not meant to be pixel-perfect, 100% accurate depictions of our guests.