Bringing More Humanity to Web3

Mental health, tech, and ontological design

What follows is a passage from an essay I wrote for Crypto, Culture, and Society on the intersection of mental health and Web3.

“If something is a tool, it genuinely is just sitting there, waiting, patiently. If something is not a tool, it is demanding things from you… and we’ve moved away from having a tools-based tech environment to an addiction manipulation-based tech environment. Social media isn’t a tool waiting to be used; it has its own goals and its own means of pursuing them by using your psychology against you.” – Tristan Harris

Most Web2 products suffer from adversarial design because their fundamental business model is audience-as-product. This means that these companies’ goals are diametrically opposed to ours. Where they win (i.e. maximizing time on site), we lose (i.e. decreased mental health and overall well-being). With such a gross misalignment of incentives, how can we possibly justify giving up so much of our lives to these platforms?

Web3 offers hope because it gives us the tools to intentionally architect novel incentives. Doing so gives us greater capacity to build tech tools starting from core principles of ethics, connection, and well-being as opposed to maximizing profit for a few shareholders.

The Hooked Model, Intermittent Rewards, and Other Studies on the Intersection of Tech and Mental Health

Every time you post something on a social media platform, will you get likes (or hearts or retweets), or will it languish with no feedback? The former creates what one Facebook engineer calls “bright dings of pseudo-pleasure,” while the latter feels bad. Either way, the outcome is hard to predict, which, as the psychology of addiction teaches us, makes the whole activity of posting and checking maddeningly appealing. - Adam Alter

The problem with social media is that the best engineers in the world are using leading-edge scientific research on addiction to optimize time on site.

[Image: Using the Hook Model in online learning - Rocket Concepts]

The trigger that gets us to open the app is a red push notification.

Then as we take the action of opening the app, we receive a reward of seeing all the likes, comments, and engagements on our posts.

This then causes us to invest in the app by replying to comments and interacting with the app, which then leads to more triggers in the future and makes the app as sticky as possible. And so it ever goes.

These product engineers draw on tried-and-tested psychological research dating back to 1948, when psychologist B.F. Skinner discovered that intermittent rewards are far more addictive than consistent ones. This is why technology ethicist Tristan Harris describes our phones as slot machines. We never know when we’ll be rewarded with a hit of dopamine from our notifications, which drives us to check our phones ever more often — and to become ever more addicted to them.

In 2020, Jean M. Twenge conducted an eye-opening study on the link between mental health and technology use. She found a sharp spike in depression, self-harm, and suicide attempts among teenagers beginning in 2011, which directly correlates with the rise in smartphone ownership among teenagers. Twenge also found that the amount of time spent in front of a screen was a strong determinant of well-being: heavy tech users (over six hours a day) were twice as likely to have low well-being as light tech users (less than two hours a day).

On top of all of this, tech has downgraded our human capacities in general. We have shorter attention spans, we feel increasingly isolated as a population, and our collective sensemaking is breaking down into polarized mud-slinging contests — and the list goes on.

This raises the question: what kind of world are we moving toward?

Ontological Design and Building Humane Tech

We shape our tools, and thereafter our tools shape us. — John Culkin (writing on Marshall McLuhan)

Ontological design is the design discipline concerned with designing human experience. It does so by operating under one essential assumption: that by designing objects, spaces, tools, and experiences, we are in fact designing the human being itself. - Daniel Fraga

Our environments shape us in profound ways that we are only beginning to understand. As we absorb this insight, we become more aware of the magnitude of the task at hand. The technological environments we create will have effects far greater than we can imagine: they will shape the fundamental psychology, emotional landscape, social interactions, and identities of entire generations to come. What could be more meaningful and impactful than that?

Just as the Greeks invented the social technology called democracy, which now shapes how we organize and go about life so fundamentally that we simply call it normal, the normal of the future is being architected by today's collective tech revolution and the narratives we craft around it.