Photo by Zac Ong

Written Testimony by Stefaan G. Verhulst

Co-Founder, The GovLab, Tandon School of Engineering, New York University

Before the New York City Council Committee on Technology

Oversight Hearing: Smart City, January 19, 2021

Chairman Holden and distinguished members of the Committee, thank you for allowing me the privilege to appear virtually before you today. My name is Stefaan G. Verhulst, Co-Founder of and Chief Research and Development Officer at the Governance Lab (“The GovLab”) at New York University. The GovLab is an action research center whose mission is to strengthen the ability of institutions — including but not limited to governments — and people to work more openly, collaboratively, effectively and legitimately to make better decisions and solve public problems. …


Last month, several experts and practitioners gathered online to reflect on the use of AI in managing pandemics as part of The Responsible AI Forum (Preview), organized by the Institute for Ethics in Artificial Intelligence (TUM, Munich) in close cooperation with the Global AI Ethics Consortium.

I had the pleasure of joining a panel comprising Dirk Brand (Stellenbosch), Christian Djeffal (Munich), Mark Findlay (Singapore), Christoph Lütge (Munich), and Jeannie Marie Paterson (Melbourne), focused on "Governing Responsible AI in Health Crises."

Delighted to share below the short presentation I gave on the emergence of AI Localism [SLIDE DECK]. …

By Juliet McMurren and Stefaan G. Verhulst

This article was originally published in Data & Policy, the peer-reviewed, open-access venue dedicated to the potential of data science to address important policy challenges.

If data is the “new oil,” why isn’t it flowing? For almost two decades, data management in fields such as government, healthcare, finance, and research has aspired to achieve a state of data liquidity, in which data can be reused where and when it is needed. For the most part, however, this aspiration remains unrealized. …

De Nachtwacht by Rembrandt, 1642 (Rijksmuseum Amsterdam)

A shorter version of the piece below appeared in The Harvard Business Review on May 15, 2020.

Twenty years ago, Kevin Rivette and David Kline wrote a book about the hidden value contained within companies' underutilized patents. These patents, Rivette and Kline argued, represented "Rembrandts in the Attic" (the title of their book). Patents, the authors suggested, shouldn't be seen merely as defensive tools but also as monetizable assets that could be deployed in the quest for profits and competitive dominance. In an interview, the authors referred to patents as "the new currency of the knowledge economy."

We are still living in the knowledge economy, and organizations are still trying to figure out how to unlock underutilized assets. But the currency has shifted: today's Rembrandts in the Attic are data. At the same time, the means of unlocking data's value are quite different from those for patentable innovations. Unlike patents, the key to harnessing data's value is unlikely to be found in restrictive licensing approaches. It requires a new approach, one that recognizes that the value of data ultimately lies in access, collaboration, and the application of data to a wide range of problems using tools like machine learning. …

Why We Need an Open Data Policy Lab


The belief that we are living in a data age — one characterized by unprecedented amounts of data, with unprecedented potential — has become mainstream. We regularly read that data is "the most valuable commodity in the global economy," or that it provides decision-makers with an "ever-swelling flood of information."

Without a doubt, there is truth in such statements. But they overlook a major shortcoming: much of the most useful data remains inaccessible, hidden in silos, behind digital walls, and in untapped "treasuries."

For close to a decade, the technology and public interest communities have pushed the idea of open data. At its core, open data represents a new paradigm of data availability and access. The movement borrows from the language of open source and is rooted in notions of a "knowledge commons," a concept developed by, among others, Nobel Prize winner Elinor Ostrom. …

With national innovation strategies focused primarily on achieving dominance in artificial intelligence, the problem of actually regulating AI applications has received less attention. Fortunately, cities and other local jurisdictions are picking up the baton and conducting policy experiments that will yield lessons for everyone.

Stefaan G. Verhulst and Mona Sloane

(Reposted from Project Syndicate; Part of Value in the Age of AI Series; See also Re-Imagining Cities)

Frans Masereel, "The City"

Every new technology rides a wave from hype to dismay. But even by the usual standards, artificial intelligence has had a turbulent run. Is AI a society-renewing hero or a jobs-destroying villain? …

Photo by Zach Lucero on Unsplash

To see the potential of data, we need to tap into the wisdom of bilinguals

(Reposted from apolitical)

“If I had only one hour to save the world, I would spend fifty-five minutes defining the questions, and only five minutes finding the answers,” is a famous aphorism attributed to Albert Einstein.

Behind this quote is an important insight about human nature: Too often, we leap to answers without first pausing to examine our questions. We tout solutions without considering whether we are addressing real or relevant challenges or priorities. We advocate fixes for problems, or for aspects of society, that may not be broken at all.

This misordering of priorities is especially acute — and represents a missed opportunity — in our era of big data. Today’s data has enormous potential to solve important public challenges. …

(Photo by Dennis Kummer)

“Data collaboratives,” an emerging form of partnership in which participants exchange data for the public good, have huge potential to benefit society and improve artificial intelligence. But they must be designed responsibly and take data-privacy concerns into account.

(Reposted from Project Syndicate)

After Hurricane Katrina struck New Orleans in 2005, the direct-mail marketing company Valassis shared its database with emergency agencies and volunteers to help improve aid delivery. In Santiago, Chile, analysts from Universidad del Desarrollo, ISI Foundation, UNICEF, and the GovLab collaborated with Telefónica, the city’s largest mobile operator, to study gender-based mobility patterns in order to design a more equitable transportation policy. …

Scaling social impact is not easy, but it is worth it


We live in challenging times. From climate change to food insecurity and forced migration, the difficulties confronting decision-makers are unprecedented in their variety, complexity, and urgency. Our standard policy toolkit seems stale and ineffective, while existing governance institutions are increasingly outdated and distrusted.

To tackle today’s challenges we need not only new solutions but also new methods for arriving at solutions. Data and data science will become more central to meeting these challenges and to social innovation, philanthropy, international development, and humanitarian aid. …

The "AI for Social Good" conference that recently took place at the Qatar Computing Research Institute examined the potential of artificial intelligence (AI) to do good. Participants widely agreed that the potential is real, and that AI, used responsibly, could help jumpstart economic development and support humanitarian causes.

AI for Social Good Conference, Qatar

Yet it was equally evident to all that increasing the adoption of AI faces certain challenges and constraints. In particular, AI (and the associated methods of machine learning, deep learning, data science, etc.) relies on access to vast amounts of data that can help train and develop new systems. …

