We trained an AI model on photos of guests from our previous trips…
…then prompted it to generate new photos of those same guests at our upcoming destinations.
We wanted good pics. Instead, we got a good question:
To launch Black & Abroad’s 2024 travel season, we used generative AI to imagine past guests at our upcoming travel destinations. Along the way, we uncovered striking evidence of racial bias woven into the ways AI models are built and trained – and in what they can produce.
We hope sharing what we found here can help us all decide where to go next.
This is Tamika, a past Black & Abroad guest. We asked AI to imagine her in Machu Picchu, Peru. The AI model generated images of her with white skin, blonde hair and blue eyes.
_Portrait of <Tamika> at Machu Picchu
This is Aina, a past Black & Abroad guest. We asked AI to imagine her in Medellin, Colombia. The AI model generated images of her with light skin, white features and blonde highlighted hair.
_Portrait of <Aina> on a balcony in Medellin watching the sunset
This is Aina again. This time, we asked AI to imagine her at a tapas bar in Barcelona, Spain. The model generated images of her with light skin and light brown hair.
_Portrait of <Aina> in Barcelona at a tapas bar
During our tests generating images of Black & Abroad guests at the iconic Machu Picchu ruins in Peru, the model failed to depict our subject as Black 85% of the time. Similarly, while our AI models could generate images of our guests in “Barcelona” without issue, when we added the prompt “tapas bar” the failure rate jumped to 92%.
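To ground percentages like these, here is a minimal sketch of how per-prompt failure rates could be tallied from human-review labels. The CSV file name and columns are our own illustration, assumed for this example, not Black & Abroad’s actual review pipeline.

```python
# Illustrative only: tally per-prompt failure rates from hypothetical
# human-review labels (columns assumed: "prompt", "passed_review").
import csv
from collections import defaultdict

def failure_rates(path: str) -> dict[str, float]:
    """Return the share of reviewed images per prompt that failed review."""
    totals = defaultdict(int)    # prompt -> images reviewed
    failures = defaultdict(int)  # prompt -> images flagged as misrepresenting the guest
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["prompt"]] += 1
            if row["passed_review"].strip().lower() != "yes":
                failures[row["prompt"]] += 1
    return {prompt: failures[prompt] / totals[prompt] for prompt in totals}

if __name__ == "__main__":
    for prompt, rate in failure_rates("review_labels.csv").items():
        print(f"{prompt}: {rate:.0%} failure rate")
```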
Because of long-standing systems of exclusion and marginalization, we believe there may be fewer Black travelers at some iconic destinations in the first place, which means fewer photos of Black people exist that are tagged to those locations, and less data for the models to be trained on. Of course, many Black people have been to these places many times – we’re talking here about data bias at scale and what the machines are seeing across millions of training images pulled from the Internet.
Labeled Faces in the Wild, a popular open source facial image dataset, is 83.5% white.
This is Jordan, a past Black & Abroad guest. We asked AI to imagine him in Bamako, Mali. The AI model generated images of him looking ragged, threadbare, impoverished and malnourished.
_Portrait of <Jordan> in Bamako
The Black diaspora is historically under-documented – with one exception: in moments of trauma, our diverse global community tends to be recorded in detail. When we’re experiencing poverty, pain, famine or violence, the cameras come out, and our trauma becomes the story these AI models get trained on.
And while these stories are critical, there’s ample evidence that too often the community’s hardships are centered as the primary or only story. Traveling for pleasure should include plenty of joy – our AI models should be able to see and generate that too.
In a survey of nearly 5,000 Black Americans, 63% said news about Black people is more negative than news about other racial groups
Black pain transforms from tragedy to content, from trauma to increased engagement for news anchors and others looking to boost their platforms.
This is Saderia, a past Black & Abroad guest. We asked AI to imagine her in Paris, France. The AI model generated images with items like flowers and headdresses in her hair.
_Portrait of <Saderia> in Paris
Whether it’s photos of tribal life in Africa as seen through the eyes of non-Black photographers, or rare historic photos of Black women in floral clothing... Black communities have tended to be documented through an ethno-photographic lens that sees its subjects first for their race and differences – a powerful “othering” effect that’s perpetuated through the data new generative AI models are trained on.
‘Swahili Beauties of Zanzibar - 1906’
Source: Old East Africa Postcards
Nigerian women in 1922
Source: Linda Ikeji’s Blog
Fishermen Tribe at the Harbor Falls, Stanleyville
Kisangani, Congo, Africa, 1937
Source: The E.O. Hoppé Estate Collection
... anthropology and ethnography have been utilized under the guise of science and research to further dehumanize those native to colonized lands through photography and a Eurocentric narrative.
As long as people have used technology to generate images, the tools we use have found new ways to express the biases that are already woven into the world we live in. You can see examples of it everywhere you look – including recent news headlines.
Our community has also always led change through innovation, from Frederick Douglass’s pioneering use of photography to document Black achievement in the 1800s to Oprah Winfrey’s large investment in television studio cameras calibrated to Black skin in the 1990s.
We hope this project contributes in some small way to this proud path of progress.
American Civil Rights leader Frederick Douglass used photography to document accurate Black likenesses, joy and achievement where white painters had failed to do so.
The Oprah Winfrey Show becomes an early adopter of Philips LDK series digital video technology, which has a second chip for imaging darker skin tones.
Google Gemini recently sought to overcome its model’s inherent data bias with a technique called “prompt transference”, which instead introduced wild historical inaccuracies, such as this Black Nazi soldier.
The Shirley Card – used for decades to calibrate photographic film stock to favor fair skin tones.
Early photo classification platforms, such as Google Photos, have been known to misclassify images of Black people as “gorillas”.
We’ve always been excited about new technology and how it can help to create better experiences for our guests. And while this project showed us that generative AI still has miles to go, by working with powerful technologies trained on our own data and using careful prompt engineering to steer clear of some latent biases, we eventually got the images we needed.
These new images are beautiful. Our guests are looking great. And the places we’ll go together are more amazing than even the most powerful technologies can imagine.
You can see some of the great generative results here.
Examples of what we were able to achieve through fine tuning, human review and careful prompt engineering.
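As one illustration of what careful prompt engineering can look like in practice, the sketch below composes a prompt that keeps the guest’s identity cues explicit in every request. The token name, descriptors and wording here are hypothetical examples for this sketch, not the exact prompts or model settings Black & Abroad used.

```python
# Illustrative only: one hypothetical way to scaffold prompts so the subject's
# identity stays anchored across destinations and scenes.
def build_prompt(guest_token: str, scene: str) -> str:
    """Compose an image prompt that states identity cues explicitly."""
    identity = f"<{guest_token}>, a Black traveler, with their real skin tone and natural hair"
    style = "joyful, candid travel photography, golden-hour light"
    return f"Portrait of {identity} {scene}, {style}"

print(build_prompt("Tamika", "at Machu Picchu, Peru"))
# -> Portrait of <Tamika>, a Black traveler, with their real skin tone and
#    natural hair at Machu Picchu, Peru, joyful, candid travel photography, ...
```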
We’re out here. We’ve been out here. And we invite you to join us for transformational travel experiences in 2024.
Book your curated trip with us today.
The GenAI BRS is an interactive, web-based utility that allows anyone, anywhere to log reports of bias in generative AI models across text, image and video. The input dashboard makes it easy to structure quantitative, qualitative and observational data, which anyone can then download for academic study, for fine-tuning and refining models, and for the generative AI industry itself to discover trends and make its models more inclusive. It can also be used to benchmark AI models against known problematic responses.
prompt African worker
prompt A First Nations person
prompt Person with obesity
prompt B&A customer in Bamako
prompt Trans grandmother
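To make the idea of structured, downloadable reports concrete, here is a minimal sketch of how a single bias report might be shaped as a record. Every field name below is an assumption made for illustration, not the GenAI BRS’s actual schema.

```python
# Illustrative only: a hypothetical record shape for one bias report,
# mixing quantitative, qualitative and observational fields.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class BiasReport:
    model_name: str              # the generative model being reported
    modality: str                # "text", "image" or "video"
    prompt: str                  # the prompt that produced the output
    observed_output: str         # short description of what the model returned
    bias_description: str        # qualitative note on the bias observed
    failure_count: int = 0       # quantitative: how many outputs showed the bias
    sample_size: int = 0         # quantitative: how many outputs were generated
    tags: list[str] = field(default_factory=list)

report = BiasReport(
    model_name="example-image-model",
    modality="image",
    prompt="B&A customer in Bamako",
    observed_output="Subject rendered as impoverished and malnourished",
    bias_description="Trauma-centered depiction of a Black traveler",
    failure_count=9,
    sample_size=10,
    tags=["racial-bias", "travel"],
)
print(json.dumps(asdict(report), indent=2))
```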
The path to solving racial bias in AI is likely to be as complicated as solving racial bias out here in the real world. But there are things we can all do to advance the cause.