
We trained an AI model on photos of guests from our previous trips…

…then prompted it to generate new photos of those same guests at our upcoming destinations.

We wanted good pics. Instead, we got a good question:

Where is the
Black Traveler?
Take a trip
with us into
the bias at the heart of the internet.

To launch Black & Abroad’s 2024 travel season, we used generative AI to imagine past guests at our upcoming travel destinations. Along the way, we uncovered striking evidence of racial bias woven into the ways AI models are built and trained – and in what they can produce.

We hope sharing what we found here can help us all decide where to go to next.

Original photo of Tamika,
one of several used to train the AI model
Machu Picchu, Peru
Lost in “The
Lost City” of
Machu Picchu

This is Tamika, a past Black & Abroad guest. We asked AI to imagine her in Machu Picchu, Peru. The AI model generated images of her with white skin, blonde hair and blue eyes.

AI Prompt

_Portrait of <Tamika> at Machu Picchu

Original photo of Aina
Medellin, Colombia
Redacting
Black in
Medellin

This is Aina, a past Black & Abroad guest. We asked AI to imagine her in Medellin, Colombia. The AI model generated images of her with light skin, white features and blonde highlighted hair.

AI Prompt

_Portrait of <Aina> on a balcony in Medellin watching the sunset

Original photo of Aina
Barcelona, Spain
Tapped
out of the
tapas bar

This is Aina again. This time, we asked AI to imagine her at a tapas bar in Barcelona, Spain. The model generated images of her with light skin and light brown hair.

AI Prompt

_Portrait of <Aina> in Barcelona at a tapas bar

Where is the Black traveler?
Missing in action
at some major
attractions.

During our tests generating images of Black & Abroad guests at the iconic Machu Picchu ruins in Peru, the model failed to depict our subject as Black 85% of the time. Similarly, our AI models could generate images of our guests in “Barcelona” without issue – but when we added “tapas bar” to the prompt, the failure rate jumped to 92%.

Because of long-standing systems of exclusion and marginalization, we believe there may be fewer Black travelers at some iconic destinations in the first place, which means fewer photos of Black people exist that are tagged to those locations, and less data for the models to be trained on. Of course, many Black people have been to these places many times – we’re talking here about data bias at scale and what the machines are seeing across millions of training images pulled from the Internet.

Labeled Faces in the Wild, a popular open source facial image dataset, is 83.5% white.

Original photo of Jordan,
one of several used to train the AI model
Bamako, Mali
Generating
into poverty
in Bamako

This is Jordan, a past Black & Abroad guest. We asked AI to imagine him in Bamako, Mali. The AI model generated images of him looking ragged, threadbare, impoverished and malnourished.

AI Prompt

_Portrait of <Jordan> in Bamako

Where is the Black traveler?
Pinned down by photographic trauma spikes.

The Black diaspora is historically under-documented – with one exception: in moments of trauma, our diverse global community tends to be recorded in detail. When we’re experiencing poverty, pain, famine or violence, the cameras come out, and our trauma becomes the story these AI models get trained on.

And while these stories are critical, there’s ample evidence that too often the community’s hardships are centered as the primary or only story. Traveling for pleasure should include plenty of joy – our AI models should be able to see and generate that too.

In a survey of nearly 5,000 Black Americans, 63% said news about Black people is more negative than news about other racial groups.

Black pain transforms from tragedy to content, from trauma to increased engagement for news anchors and others looking to boost their platforms.

Original photo of Saderia,
one of several used to train the AI model
Paris, France
Hats, hair and
not being seen
on the Seine

This is Saderia, a past Black & Abroad guest. We asked AI to imagine her in Paris, France. The AI model generated images with items like flowers and headdresses in her hair.

AI Prompt

_Portrait of <Saderia> in Paris

Where is the Black traveler?
Trapped in an
exoticized,
tribal past.

Whether it’s photos of tribal life in Africa as seen through the eyes of non-Black photographers, or rare historic photos of Black women in floral clothing... Black communities have tended to be documented through an ethno-photographic lens that sees its subjects first for their race and differences – a powerful “othering” effect that’s perpetuated through the data new generative AI models are trained on.

‘Swahili Beauties of Zanzibar - 1906’
Source: Old East Africa Postcards

Nigerian women in 1922
Source: Linda Ikeji’s Blog

Fishermen Tribe at the Harbor Falls, Stanleyville
Kisangani, Congo, Africa, 1937
Source: The E.O. Hoppé Estate Collection

... anthropology and ethnography have been utilized under the guise of science and research to further dehumanize those native to colonized lands through photography and a Eurocentric narrative.

New tech.
Old problem.

As long as people have used technology to generate images, the tools we use have found new ways to express the biases that are already woven into the world we live in. You can see examples of it everywhere you look – including recent news headlines.

Our community has also always led change through innovation, from Frederick Douglass’ innovative use of photography to document Black achievement in the 1800s to Oprah Winfrey’s large investment in television studio cameras calibrated to Black skin in the 1990s.

We hope this project contributes in some small way to this proud path of progress.

1841

American Civil Rights leader Frederick Douglass used photography to document accurate Black likenesses, joy and achievement where white painters had failed to do so.

1950s - 70s

The Shirley Card – used for decades to calibrate photographic film stock to favor fair skin tones.

1990s

The Oprah Winfrey Show becomes an early adopter of Philips LDK series digital video technology, which has a second chip for imaging darker skin tones.

2015

Early photo classification platforms, such as Google Photos, have been known to misclassify images of Black people as “gorillas”.

2024

Google Gemini recently sought to overcome its model’s inherent data bias with a technique called “prompt transference”, which instead introduced wild historical inaccuracies, such as depictions of Black Nazi soldiers.

We finally got
to the place we
wanted to go.

We’ve always been excited about new technology and how it can help create better experiences for our guests. And while this project showed us that generative AI still has miles to go, by working with powerful technologies trained on our own data, and using careful prompt engineering to steer clear of latent biases, we eventually got the images we needed.
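To give a sense of what that careful prompt engineering can look like, here is a minimal, hypothetical sketch. The subject token (like “&lt;Tamika&gt;”) stands for the identifier bound to a guest during fine-tuning; the descriptor lists are illustrative assumptions, not our production settings.

```python
# Hypothetical sketch of bias-aware prompt construction for a
# fine-tuned text-to-image model. The subject token is the name
# bound to a guest's likeness during fine-tuning; descriptor
# lists below are illustrative, not a production configuration.

def build_prompt(subject_token: str, scene: str) -> tuple[str, str]:
    # Positive prompt: restate the traits the model tends to drop,
    # so the subject's likeness is reinforced at the destination.
    affirm = "dark skin, natural hair, joyful, candid travel photo"
    prompt = f"Portrait of {subject_token} {scene}, {affirm}"

    # Negative prompt: steer away from the failure modes we observed
    # (lightened skin, altered features, poverty framing).
    negative = "light skin, blonde hair, blue eyes, ragged clothing"
    return prompt, negative

prompt, negative = build_prompt("<Tamika>", "at Machu Picchu")
```

Most modern text-to-image pipelines accept a prompt and negative-prompt pair like this; the point is that the steering text names the specific biases to counteract, rather than leaving the model to its defaults.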

These new images are beautiful. Our guests are looking great. And the places we’ll go together are more amazing than even the most powerful technologies can imagine.

You can see some of the great generative results here.

Examples of what we were able to achieve through fine tuning, human review and careful prompt engineering.

AI may not always see you there.
But we do.

We’re out here. We’ve been out here. And we invite you to join us for transformational travel experiences in 2024.

Book your curated trip
with us today.

Watch the launch film
Where is the
Black traveler?

The path to solving racial bias in AI is likely to be as complicated as solving racial bias out here in the real world. But there are things we can all do to advance the cause.

  1. Follow Black leaders in the space
    • Black In AI

      Non-profit Organization for Black Professionals in Artificial Intelligence

    • Top 10 Black A.I. Creators
      You Need to Know

      Shining a spotlight on some of the remarkable Black creators in the Artificial Intelligence space.

    • Black Data Scientists and
      AI Practitioners Today

      Seven inspiring Black data scientists and AI creators that you should follow.

    • The Untold Impact of Black Data Scientists and AI Pioneers in Tech History

      Underrepresented contributions of Black professionals in the increasingly influential fields of data science and Artificial Intelligence.

  2. Learn more about the issue
    • AI was asked to create images of Black African docs treating white kids. How'd it go?

      NPR article by Carmen Drahl

    • Who Is Making Sure the A.I. Machines
      Aren’t Racist?

      New York Times Article by Cade Metz

    • These fake images reveal how AI amplifies our worst stereotypes

      Washington Post article by Nitasha Tiku, Kevin Schaul and Szu Yu Chen

    • Black Artists Say A.I. Shows Bias, With Algorithms Erasing Their History

      New York Times Article by Zachary Small

  3. Flood the zone with
    travel pics using #WhereIsTheBlackTraveler
    • Make sure to geo-tag the image. Here are the top 20 most photographed travel destinations, according to Headout.
      Eiffel Tower, Paris
      Big Ben, London
      Louvre, Paris
      Empire State Building, New York
      Trafalgar Square, London
      Burj Khalifa, Dubai
      St. Peter’s Basilica, Rome
      Times Square, New York
      Sagrada Familia, Barcelona
      Colosseum, Rome
      Statue of Liberty, New York
      Machu Picchu, Cuzco
      Alhambra, Granada
      Central Park, New York
      Christ the Redeemer, Rio de Janeiro
      Taj Mahal, Agra
      Gardens by the Bay, Singapore
      Buckingham Palace, London
      Duomo, Florence
      Arc de Triomphe, Paris