5 questions with IBM’s Christina Montgomery

With help from Derek Robertson

Happy Friday! And tomorrow, happy National Data Privacy Day. Today we have Christina Montgomery, IBM’s chief privacy officer, taking on The Future in Five Questions.

The century-old company was a tech giant before there was a tech industry, and these days its data-driven software powers everything from loan approvals to airline reservations — putting the company at the center of the discussion of both consumer data use and AI, which powers many of its products.

Montgomery, who is also chair of the company’s AI ethics board, spoke with us about what a company with IBM’s business model wants to see from federal data privacy regulation; what doesn’t work about the existing proposal for a law; and — looking further ahead — humanity’s role in stewarding emerging technology. Responses have been edited for length and clarity.

What’s one underrated big idea?

The need to differentiate between low-risk and high-risk business models — or precision regulation in the data privacy space.

I spend a lot of time talking about data privacy legislation, particularly now, given the activity over the past year in the US with the ADPPA [American Data Privacy and Protection Act]. One thing I’ve noticed in those conversations is that often every technology for every company gets thrown into one bucket, regardless of our business models.

Not every company is a platform company. Different business models pose very different levels of risk to consumers. I put out a paper in November, calling on policymakers to differentiate between low-risk business models that use data to deliver or improve a company’s operations, products and services (called internal data monetization, or data valorization in the paper) and a higher risk business model, where companies use consumer data as a revenue stream, called external data monetization.

What’s a technology you think is overhyped?

The metaverse, frankly, is overrated.

We spent 2020 through 2022 living in these small virtual worlds, Zooming into family gatherings, work meetings and happy hours with friends. It reminded me that technology, no matter how good it is, is not going to replace human contact.

I do think the metaverse has a really valuable place in applications like gaming and training scenarios — where you learn something. But the way that the metaverse is being marketed as humans living in this virtual world, zooming into their workplace as avatars in a virtual conference room… I don’t really think that that’s where we’re headed as the human race.

What book most shaped your conception of the future?

“Cloud Cuckoo Land” by Anthony Doerr. It’s a saga, connecting multiple storylines spanning hundreds of years — the past, present and future. The timelines are interconnected through a single book that survives throughout the generations due to the care and stewardship of humans.

So, I’m an English major and the book is actually dedicated to librarians, which I think is fascinating. It really focuses on the lasting power of books — that books survive technology and tell our story. To me, that’s a very powerful concept.

I don’t want to give away the ending, but there is an AI in the future called Sybil. And we see the limitations of that technology, juxtaposed with the resilience of humanity.

What could government be doing regarding tech that it isn’t?

We need to focus on passing national privacy legislation that both protects consumers and doesn’t stifle innovation. The rest of the world is passing comprehensive privacy regulations. And by the end of the year, five U.S. states will have enacted privacy legislation covering about 60 million Americans. We will look like an outlier as a country if we can’t get it done now.

The ADPPA is a great starting point. But there are some areas in the bill that have prevented us from outright saying, “We support it.”

Part of the reason the bill didn’t get the bipartisan support it needed to move out of committee and into the Senate is the private right of action [i.e. U.S. citizens’ ability to enforce their rights through lawsuits]. We need to have the ability to enforce data privacy legislation, sure. But a private right of action is not the right path, because you’re going to get a fragmented interpretation of what’s essentially a new regulatory framework, driven case by case by plaintiff lawyers.

What has surprised you most this year?

The democratization of generative AI like ChatGPT, in terms of the tangible ways people can interact with it and the discussions surrounding it.

It has reinforced the things I’ve been talking about for over three years — namely the importance of embedding ethical principles into AI, because AI doesn’t have any kind of moral judgment or compass.

Generative AI will become a tool for people to use as part of the creative process (although I don’t think it will replace creativity). We need to have those conversations now, as well. What does this mean for people who have been contributing their knowledge to the internet over the years? You’ve got an AI that’s scraping the web. So what are the legal rules around web scraping that websites should follow? It’s also raising issues surrounding intellectual property — about crediting journalists and artists, about ownership of your own creations and your pictures.

warren talks tough

In case you missed yesterday’s Morning Money newsletter, POLITICO’s Zach Warmbrodt spoke with another major crypto player on the Hill: Sen. Elizabeth Warren (D-Mass.), who had plenty to say about how she’s ready to crack down on the industry this year.

First she told Zach she intends to re-introduce a bill with Kansas’ Republican Sen. Roger Marshall that would toughen protections against money laundering, something she calls a “main focus.”

“Money laundering is in a different space. It’s not nearly as visible to the public… but its impact on our national security and law enforcement is immense,” Warren told Zach. “The current legal structure essentially holds up a giant sign over crypto that says, money laundering done here.”

One might think that an anti-money laundering bill would be relatively uncontroversial, but nothing ever is in the world of crypto: The blockchain-focused think tank Coin Center called the version introduced in the most recent Congress “the most direct attack on the personal freedom and privacy of cryptocurrency users and developers we’ve yet seen,” focusing on its requirement for blockchain developers to register as financial institutions. — Derek Robertson

nist gets in on ai

Another piece in the puzzle of the U.S.’ approach to guiding AI development just dropped into place.

Yesterday the National Institute of Standards and Technology announced its AI Risk Management Framework, meant to guide AI developers and users through the technology’s various risks and based on an 18-month process that featured comments from hundreds of organizations.

So what’s actually in it? Well, that question isn’t quite so easy to answer: The document doesn’t so much set specific restrictions or recommendations as tell its readers to put them in place themselves, something an AI regulation attorney told VentureBeat makes it “so high-level and generic that it really only serves as a starting point for even thinking about a risk management framework to be applied to a specific product.”

Still, establishing procedural standards in their own right is part of any new industry’s development — what remains to be seen is if such standards can keep up with the dizzying, near-exponential pace of the technology itself. — Derek Robertson

the future in 5 links

  • Amid layoffs, BuzzFeed plans to use ChatGPT to assist with its content creation.
  • Crypto billionaire Jihan Wu is in cost-cutting mode.
  • How are employees bumped out of the crypto biz pitching themselves for new jobs?
  • Learn how your router could unintentionally be spying on you.
  • Autonomous vehicles are causing mayhem with San Francisco’s 911 system.

Stay in touch with the whole team: Ben Schreckinger ([email protected]); Derek Robertson ([email protected]); Steve Heuser ([email protected]); and Benton Ives ([email protected]). Follow us @DigitalFuture on Twitter.

If you’ve had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.