Edition 3

Taking back control

Written by Cape Capital | Oct 24, 2023 9:02:17 AM

From deep-fakes to derailing democratic values, technology is being deployed to manipulate and control. Zoe Weinberg has a game-changing idea that brings innovation and capital together to restore human agency in the digital realm.

For Zoe Weinberg, it was a summer spent as an aid worker in Iraq six years ago, when Mosul was being retaken by the Iraqi forces and the American Coalition, that she had a lightbulb moment about technology. ‘I saw over and over again the ways in which technology was being deployed in the course of conflict to surveil and control people, but also the ways that it was being leveraged, from bitcoin miners who set up operations because the Iraqi dinar wasn’t a reliable currency, to encrypted communications and messaging,’ recalls Zoe, who was then studying human rights law and international humanitarian law at Yale Law School. ‘One of the key sources of truth when it came to the Mosul frontline was a Twitter (now X) account called Mosul Eye.’

Of course, it’s not only in war settings where technology can be utilised to both help and hinder society – how we harness new digital innovation affects everything from news feeds to elections and AI. The latter is a subject that Zoe is well versed in, having done stints in both Research and Machine Intelligence at Google and on the National Security Commission on Artificial Intelligence. ‘I did an MBA at Stanford Graduate School and it was while I was there that I started to dig into the implications of AI from a technological and business perspective, and I do think that generative AI in many ways is potentially changing the game,’ she says. 

Although there are often two polar schools of thought around its development – either that it’s detrimental to humanity and we should pause all progress, or that we should let it flourish with no restrictions – Zoe believes that there is an alternative. ‘I think there is a way to forge a third path which is a positive vision of an AI innovation ecosystem that also has ethics and responsibility at its core and I would be interested in helping bring that into reality. There is space for both industry leaders to make commitments to certain guidelines, and a role for governments to intervene in helping shape what we think is ultimately beneficial for society, and what poses too high of a risk or constitutes reckless development.’

However, AI is just one area of Zoe’s expertise. Where she is really set to shift the dial is in her disruptive approach to technology itself. ‘Historically we think about technology in certain sectors or verticals like AI, quantum and blockchain. Thinking about it in terms of the ability to restore human agency and give individuals control is not a lens that I have seen brought to technology or investing before,’ says Zoe, who is working on ex/ante, a project, still in stealth, aimed at advancing free societies and countering digital authoritarianism.

‘I think that democracies are vulnerable and under threat to a degree that hasn’t been seen in a long time, and while physical threats are always going to exist, now conflict can also play out through disinformation, state-sponsored internet blackouts, censorship and enhanced surveillance, all facilitated by digital means,’ says Zoe. ‘Democracy protection is a form of national security and I think it’s important to tackle all these problems. In the past, the intersection of technology, investing and democracy was around government technology and civic technology, such as investing in tools to make it possible for people to contact their local representative or register to vote. I’m interested in technology that gives individuals more control over their data, their assets and the information they consume; fundamentally, that is a form of advancing democratic values, accountability and all of those things we hold dear in a free society.’

To do this, Zoe is helping companies that are building solutions to provide that control – whether through censorship-resistant decentralised storage, identity tools that give users more power over their digital footprint, cybersecurity and enhanced privacy, or deep-fake detection. ‘I’ve been working with a company focused on deep-fake detection. The threat of photorealistic content that has been manipulated is very real, particularly in the context of the upcoming election cycle and a rapidly evolving news cycle,’ she explains.

‘Part of it is the technological challenge of identifying when something is fake and whether there is a way to watermark content or have a forensic tool that allows you to tell if it’s authentic. But there’s also a psychological piece. Studies show that even if people are told content is fake, many will still believe it might be real, because confirmation bias is so strong and we’ve been trained to trust our eyes. While certain things around disinformation, such as propaganda and Photoshop, are not new, the uncanniness of how good the technology is becoming is what changes the equation.’

There’s also a lack of understanding around how immersive experiences such as virtual reality, the metaverse or gaming environments impact our ability to distinguish real from fake. ‘There hasn’t been a ton of research on how disinformation travels throughout a gaming environment or the metaverse,’ says Zoe. ‘We don’t know if seeing an immersive video, as opposed to a feed on your phone, affects your ability to distinguish fiction from reality. You may have a very visceral reaction to something that’s hard to detach from.’

While some companies she is partnering with are developing cutting-edge tools from scratch, others are harnessing existing technology in ways that favour the user. ‘I would be keen to work with someone if what grounds their product, and who they sell it to, is a point of view about how technology should exist in society that ultimately bolsters the world we want to live in.’