As we get ready for our July 10 webinar on Trust in Design with brand and content strategist Margot Bloomstein, we’re taking a closer look at what it really means to design digital experiences that earn user trust. Don’t forget to register for the webinar to join the conversation live.

With the advent of AI, how we create and consume content is changing rapidly. Meanwhile, people are afraid of being replaced by AI and are losing trust in the content it creates. It’s becoming harder to tell whether content is AI-generated, and realistic deepfake videos of celebrities and politicians make us increasingly skeptical of digital content. Cynicism has become the default, and trust must be earned.

This blog post includes some of my takeaways from three excellent talks from our recent EvolveDrupal Summit in Boston that relate to trust and authenticity.

Shawn presenting, standing in front of a screen that says AI won't kill creativity.
Shawn presented how we still have an advantage over AI—our creativity!

How AI Erodes Trust

Putting out lots of generic AI-generated content that lacks authenticity and an emotional connection can quickly alienate your audience. 

As Shawn Perritt, VP of Brand Strategy and Creative at Acquia, outlined in Brand Trust and Emotional Connections in the Age of AI, there are several ways AI often damages brand credibility, including:

Loss of Authenticity

One of the quickest ways to erode trust is by replacing a brand’s distinct, human voice with bland, templated AI content. Shawn gave the example of Duolingo: when the company announced it was adopting an AI-first content strategy, fans voiced concern that its quirky, relatable tone would be lost. Even loyal audiences can turn when they sense a shift toward soulless automation. Authenticity is a core part of what builds emotional connection with users, and when it disappears, trust often follows.

Poor Output Quality

Errors, hallucinations, or tone-deaf responses that feel cold or wrong can quickly erode trust. Even the most advanced models produce hallucinations (confidently incorrect information) 20 to 30 percent of the time. That’s a significant risk. A striking example of this came from Air Canada, whose customer service chatbot gave a user false information about bereavement discounts. When the company initially refused to honor the AI-generated advice, the situation escalated to court. The ruling? Air Canada was held responsible for what its AI chatbot said. This case sends a clear message: AI is an extension of your brand. If your chatbot or content makes a promise, users will expect you to stand behind it.

Lack of Transparency

Users want to know when and how AI is being used, especially when decisions are being made on their behalf. Facebook’s AI moderation system offers a cautionary tale here. The algorithm was frequently flagging or suppressing legitimate activist content while allowing harmful misinformation to remain online. With no clear explanation or visibility into how decisions were made, users felt censored and powerless. The opacity of the system bred suspicion and diminished trust in the platform as a whole. If users don’t understand why something is happening, they won’t trust the process or the brand behind it.

Bias and Unfairness 

AI can replicate and amplify systemic inequalities, especially when it is trained on flawed, incomplete, or biased data. Amazon experienced this firsthand when it developed an internal AI recruiting tool that showed a clear bias against female applicants. The system had been trained on ten years of past hiring data, which reflected existing gender biases, and it learned to penalize resumes that included words like “women’s.” Once the issue became public, the company had to scrap the tool entirely. This highlights how quickly trust can unravel when users perceive a system as unfair or discriminatory, especially in high-stakes areas like employment. AI must be built and tested with intentional equity, or it risks reinforcing systemic issues and alienating audiences.

Margot standing in front a crowd presenting
Margot taught us how to earn our audience's trust by being consistent in how we represent ourselves and in how much we say, and by being willing to admit our mistakes.

Building Trust

Trust in a brand is not a given and it needs to be earned. There are, however, steps you can take to build trust over time with your users. 

In her presentation Designing for Trust, author, leading content strategist, and keynote speaker Margot Bloomstein explained how the three Vs of voice, volume, and vulnerability are essential to building trust.

Voice

Voice tells your audience who you are and how you communicate. It must be consistent. Ask: What does your organization stand for? What’s your communication goal hierarchy? Are you innovative, smart, wise? Cutting-edge, bleeding-edge, or leading-edge?

Margot brought up Mailchimp as an example. It started as a newsletter tool and has grown into a marketing platform giant. Through it all, the brand has maintained a distinctive voice, using visuals like Cavendish banana yellow and personality like its mascot Freddy. That voice has adapted over time to meet both novice and experienced users, all while staying consistent.

Volume

Volume refers to how much you say and how you say it, including images, copy length, and level of detail. America’s Test Kitchen is a great example. They cater to both novice and experienced cooks with content ranging from simple Instagram recipes to detailed magazine articles. Their chief creative officer, Jack Bishop, emphasizes that they show their work, which helps audiences build confidence.

On the other hand, GOV.UK reduced 75,000 web pages down to 3,000, focusing on content that only it could provide and letting other expert organizations fill in the rest. The result? More trust, easier navigation, and happier users. How much you say can be just as important for establishing trust as how you say it.

Vulnerability

This is where it all comes together. Vulnerability means being transparent and open, and it builds trust when used strategically. Penzeys Spices is a bold example: CEO Bill Penzey openly shared political views on Facebook, aligning the brand with values of inclusivity and social justice. The company lost some customers but gained many more.

At Evolving Web, we aim to work with clients who share our values. We are proud to have had Planned Parenthood Direct as a client, and the team members on the project felt they were doing meaningful work that aligned with their values. That work might not align with everyone’s values, but we are not afraid to partner with organizations that share ours, and we talk about that work openly because it is who we are.

Zoom is another great example of embracing vulnerability. When the pandemic hit and usage skyrocketed, security issues emerged. CEO Eric Yuan responded with a blog post that began: “I am deeply sorry.” Not “Zoom,” not “our company” or “we,” but “I.” He laid out action steps, committed to weekly updates, and showed both humility and leadership. Users stayed because the company owned its mistakes with transparency.

Val smiling and pointing at the screen during her presentation.
Val Yang presented a project that is close to her heart. Her experience staying connected with her grandparents in China inspired a tool to support family caregivers. CuroNow is about making caregiving less stressful and more manageable, helping people feel less alone in the day-to-day of looking after someone they love.

Building an Emotional Connection

AI can’t replicate emotional connection or authenticity. Val Yang, a Senior Product Designer at Acquia, spoke in Designing for Dignity about how she channels the perspective of her grandparents to guide her work on senior-friendly digital interfaces. When you imagine the user as someone you know, you create an authentic emotional connection and naturally want them to have a good experience. Designing for seniors is no longer about checking accessibility boxes; it’s about imagining the actual struggles your own grandparents might have and wanting to make the experience as seamless as possible.

That same personal motivation is what led her to create CuroNow, an app designed to support family caregivers. Inspired by a desire to stay connected with her grandparents in China, she built CuroNow to ease the overwhelming mental load caregivers face every day by organizing daily care needs, appointments, and important conversations in one place, with the help of smart guidance and AI-powered support. Thinking of our users as people and not just numbers helps us relate to them, and that empathy drives a better end-user experience.

Creativity

Original thinking is your last unfair advantage over AI. That’s why creativity should be a daily practice. Like going to the gym, it’s something you need to train. Try a new activity, cook a new cuisine, or explore a different neighbourhood. The more inputs you give your brain, the more unique and trustworthy your outputs will be.

And if you need a jump-start, try the New York Times’ Creativity Challenge or just get outside and move. Physical activity has been shown to boost creativity.


A sketch of takeaways from Designing for Trust, featuring a sketch of Margot in the centre
Jillian King, a Service Designer on the Veterans Affairs Team at MO, let her creativity shine by sketching her notes while listening to sessions at EvolveDrupal Boston. We thank her for sharing her talent and for inspiring us to be more creative in our everyday lives. 

In Conclusion, Be Human

Maybe the challenge isn’t about keeping up with AI, but about staying rooted in what people actually need: honesty, transparency, and care. We can embrace technology while creating work that feels human.


“AI builds on what exists. People imagine what doesn’t.” – Shawn Perritt

If you want to learn how to put these principles into practice, don’t miss our upcoming webinar with Margot Bloomstein on July 10 at 12:30 EDT.