Alliance for Peacebuilding

Digital Trailblazer: Sinead Bovell

June 1, 2021

Cyberspace is not just code, computers, and robots – it’s made up of real people. Lucky for us, some of these people use the Internet to make a difference in our online and offline world, and we like that. To recognize these individuals’ unique contributions, we have officially launched our new series Digital Trailblazers. From cybersecurity experts to online activists to digital artists, we will nominate a handful of amazing individuals who prove the Internet continues to be a place for connection, opportunity, and inspiration. We are excited to celebrate the extremely talented people working hard to protect our cyberspace and use our interconnectivity for good.

We are thrilled to announce that our inaugural Digital Trailblazer honoree is Sinead Bovell. As a futurist, UN speaker, WIRED contributor, and model, Sinead has made it her mission to bridge the gap between the younger generation and the digital future. Sinead is also the founder of WAYE, an education startup that empowers young people to prepare for work and life in the digital age. Like the Digital Peace Now movement, she deeply cares about the state of our shared cyberspace and is working to ensure it is a safe and inclusive space for all.

So, how is youth tech leader Sinead Bovell making tech more inclusive? Find out below:

 

DPN: Tell us more about yourself. How did you become the model who talks tech?

S.B.: Modeling and tech: you don’t usually see those words together. As a true millennial multi-hyphenate, I would say my career is quite unconventional. I grew up very academically focused. I studied finance and chemistry. I also did my MBA. During my MBA, I first studied under a futurist. I learned about strategic foresight and applying statistical models to build future scenarios. So, now that we have these probabilities, societal trends, and technologies, what might the future look like? That was my first glimpse into applying the tools of math and business to future forecasting.

But still, I was headed down a very, very corporate path. And I started to realize that it wasn’t the path for me. This wasn’t the life I wanted to live or the problems I wanted to solve. Simultaneously, I was scouted by a modeling agency. I had never considered modeling before. We didn’t even read magazines in my house until we were 18. However, I saw it as a potential window to create the version of life I might want to live. So, I quit the corporate world, moved to New York City, and started my life as a fashion model.

When I stepped into the world of fashion and entertainment, I immediately recognized that the conversations about artificial intelligence, the future of work, cybersecurity, and so on were not happening in these rooms. It’s not that people in the arts or creative spaces were not interested; no one had invited them to those conversations, and the information isn’t digestible or readily accessible. That’s where the light bulb went off for me: I think I can bridge the gap between these two very different worlds. I found myself in a unique position to merge them. Here we are today.

 

You use the word futurist. What does that word mean to you?

The word futurist can get all sorts of reactions. To me, futurists use forecasting models to build different scenarios of the future, applying both qualitative and quantitative data, usually pertaining to technologies, society, and history. That’s how I practice it. I’m sure different futurists would have different approaches, but that’s how I describe it.

 

What gets you excited about our future in tech? Also, what makes you nervous?

In terms of what gets me excited, there is so much. We’ve started an entirely new journey with how and where we’re going to work, live, and play. Look at the metaverse, space exploration, and artificial intelligence. Right now, it still feels buzzwordy, and people don’t necessarily know how it applies to them. But AI is poised to become a general-purpose technology. What does that mean? It means you could gain access to intelligence with the same ease as you get access to the World Wide Web. It will completely revolutionize how we live.

At the same time, these very technologies can also be used for harm. A lot of the focus on technology gone wrong is on who misuses it or uses it for harm. However, that’s not where it starts. We also need to focus on the people who are building the technology. No, we’re not looking for people in villainous trench coats scheming to bring down the world. It’s more about making sure there are diverse voices in the room. If certain identities have not been considered while a technology is being built, it can be a nightmare for those people.

I also think that with technology, there’s a lot of hype around innovation. And that causes us to move very quickly, especially in a more capitalist-driven market. When profit is the motive, corners get cut, and incentives can be compromised. That’s how we ended up in certain situations, like our current-day data nightmare. The fact that an algorithm can know if you’re about to get divorced, and sway your decisions, is a nightmare. And guess what? From a national security perspective, that’s also a nightmare. So, the speed at which we’re moving and the questions we’re not asking, that’s what makes me nervous.

 

You focus on bringing young people to the table when it comes to tech and the future of tech, particularly diverse, marginalized voices. Can you tell us why that’s important to you?

Quite a few things to unpack there. First, it is just morally wrong to make massive, life-changing decisions that will impact a generation without any representation from that generation. You cannot make decisions about the future without the voices of those who will live in that future. So, that’s why I’m very vocal about advocating for youth in these key decision rooms. If we had taken that approach in decisions like climate policy, things might have been different. It’s incredibly important that young people are in decision-making rooms because they are the ones who are going to have to live with what gets decided.

To the younger generation listening: your voice really matters. Because of technology, you can insert yourself into conversations even if you’re not invited to them. Use the tools around you, whether that’s social media or your networks, to make sure your voice is heard. Your perspective is important. We’ve grown up with technology, which gives us a vantage point that older generations might not have.

 

What stories are you hearing about cybersecurity in your workshops and from your community? What kind of experiences does your community have when it comes to cyber threats or vulnerabilities?

Alarm bells have been going off for decades now. More people are starting to wake up to it. The main conversations are about how our national security and our personal security are very much in the hands of the private sector. This is the first time in history that the biggest button in war can be in the pockets of children. I think there was a massive lapse in recognizing that the private sector didn’t have to build its systems to any specific protocol to ensure that users and their countries were safe with these tools. So, the main conversation is that there needs to be a federal standard: from software companies to the next iPhone, you have to adhere to a certain level of care to keep users safe.

We’re already seeing that the future of war, and the present of war, are largely digital. That means the things we buy in stores are potential attack surfaces. It wasn’t viewed that way prior, and we need to understand all the different touch points that intersect with cybersecurity.

Speaking about warfare in general, governments haven’t established an international understanding of where lines cannot be crossed. For example, hospitals and churches are typically safe havens in times of war. In cyberspace? Not so much. We need to have these conversations concerning digital technologies. We should be discussing whether launching a cyberattack against a hospital should be a war crime. These are the kinds of cybersecurity questions and conversations we’ve been having in my workshops, as well as in my networks and communities more broadly.

 

Was there a particular cyberattack that made you think about how you frame your discussion?

The one that stood out to me, and it was massive, is the Colonial Pipeline attack. If you’re not familiar with it, in May 2021 hackers shut down the largest fuel pipeline in the US, causing massive fuel shortages across the East Coast. By the way, the hackers weren’t incredibly sophisticated. They didn’t crack the system using quantum computing or anything. It was due to a single compromised password, and it became one of the most significant infrastructure attacks to date. The attack hit something as critical as infrastructure, something as important as fuel, and it could have ended up a lot worse than it did. Imagine what it would mean if somebody gained access to our air traffic control systems. So, for me, that attack really stands out because it came down to cyber hygiene. There are some attacks, like zero-day attacks, that nobody can see coming. But Colonial Pipeline was the result of something that happens every day: we get messages that a password has been compromised, that we need to change it and not reuse the same password for multiple accounts. So, that one stands out to me the most.

 

What would you want people to know about security as we continue to see the growth of AI automation and other digital transformations?

The more we deploy AI automation and digital technologies, the more potential attack surfaces we expose ourselves to. Any time we provide a new digital lane into a system, we increase our cyber risks. Every time you interact with cyber technologies, there is risk.

So, it is important to follow simple cyber hygiene steps. Sometimes, people think they don’t have much to hide, and if something gets hacked, it’s not the end of the world. But you don’t necessarily know where that could lead. Perhaps you use the same compromised password for your banking. Now, it’s the end of the world for you.

I think it’s about being alert and recognizing your role as an active user of cyber technologies.

 

We heard WAYE’s Young Leaders Board created an Artificial Intelligence Code of Ethics. Would you mind telling us more about this initiative? Why do people need to worry about ethics when it comes to technology?

I try to practice what I preach. In terms of an AI Code of Ethics, young people will have to live with the systems we are designing today. How do you want them to be? What do you need to know? How can we be a part of artificial intelligence in a way that makes us feel safe? That is where the AI Code of Ethics came from. The Young Leaders worked together to develop a list of twelve critical principles that they believe need to be considered and implemented in AI systems. And why do we need to worry about ethics when it comes to technology? Because technology isn’t just a technical problem – it’s very much a social problem as well. There are massive social, cultural, and political impacts of the technology we use. So, if we don’t take a more holistic approach, we’re going to end up with systems that don’t work for people because those people weren’t considered during their creation. We need an ethical lens to ensure that everybody is considered in what gets deployed: whether it is fair, whether it is equitable, and whether it accounts for all sorts of people.

 

How can the next generation of digital users help keep our Internet a place for connection, opportunity, and inspiration?

If you want to participate in making a future that works for everyone, use the tools you have. Those tools can help you build and join digital communities that interest you, and build networks that help others contribute to the conversation. It is important to know that your voice really does matter, and the tools that you have, like social media or otherwise, can make an impact. They are significant if you use them with those goals in mind. Continue to see yourself as an active player and contributor in making the Internet safer. Don’t be afraid to call out a company that you think is compromising the safety of its users. Don’t be afraid to stand up. There are communities of people online who can support you and who share the same values and perspectives about digital technologies. It doesn’t matter how small you think your voice is. The tools around us are powerful ways to connect with one another, build communities, and contribute as a collective.