For over a decade now, the Silicon Valley ideology has been to do two things simultaneously: move fast and break things, and make the world a better place.
Our community of entrepreneurs and investors alike has had a sense of manifest destiny to move humanity forward with our work. Somewhere along the way, however, making the world a better place became more of a punchline than an ethos, and breaking things has become what we are best known for outside Silicon Valley.
To that end, 2017 has been a demoralizing year. While our product innovations have built accessible and affordable content, community, and commerce on top of a thriving, deregulated, and open internet, they have also been weaponized against society by bad actors. Social media has become a vector for bullying and propaganda. Algorithms meant to personalize experiences have ended up amplifying bias. Personal data has been compromised in ways we don't yet fully understand.
What's worse, as our industry has gained influence and power, we have also become plagued by sexual harassment scandals on top of a persistent lack of diversity. This is not unique to our industry, but it adds significantly to the perception that we are all bad actors. It is painful to see, especially when I set foot in other parts of the country and describe what I do.
Further, another dynamic is emerging: Congress is feeling threatened by the growing power of the businesses we bring to the world. There is growing bipartisan support for increased regulation of internet companies. The current administration has tussled with tech companies and investors over the International Entrepreneur Rule and other visa and immigration issues. And the level playing field we have enjoyed under net neutrality is about to disappear under the current FCC leadership.
Right now, the relationship between tech and regulators sits somewhere between strained and adversarial. My worry is that regulation destroys the ability to innovate. We regulated electricity in the early 20th century, and by entrenching incumbent utilities and their ways of generating power, that regulation has significantly contributed to climate change. If we regulate technology the same way (and AI is the electricity of the 21st century in many respects), we will create other large problems for ourselves.
Where do we go from here?
First and foremost, we need to recognize that the stakes are getting higher and higher with our innovations. We have gone from building software companies that provided efficiency for workers in every industry to completely rethinking how to provide better healthcare, education, financial services, transportation, and even work itself. And with enabling technologies like blockchain, CRISPR, 3D printing, AR/VR, and drones right on the horizon, we are going to impact our core values around equality, purpose, and work to a much greater degree.
One thing is for sure: we can't regress in our core values as a community and still expect to take on such great responsibility. I hope we can celebrate the great founders who are building companies with the right values as much as we have ripped apart the bad actors. The next generation of entrepreneurs and investors needs to be inspired to build on that.
For the last several years, I have encouraged many founders to incorporate two intangibles into their own definition of minimum viable product: awareness of regulation and recognition of social impact. With the benefit of time, it has become obvious to me that this is not enough, and that we need to shift our collective mindset from an obsession with the hacker-entrepreneur archetype to that of the empathetic entrepreneur.
We no longer have the luxury of being reactive to the impact technology has on our society and culture. While the hyper-competitive, product-obsessed hackers created the fastest-growing companies and generated the best venture capital returns in the past, this will not be the case going forward: there is far more at stake now, and everyone is scrutinizing our work.
Taking ownership in 2018
Facebook gets a lot of grief for how it handled the 2016 US election, and I know this has been a demoralizing year for its leadership, despite great success as a business. But the company deserves credit for making continued progress on the features it has started to test and roll out: using third-party services and AI to flag and downrank fake news; adding one-tap access to information about a publisher; and creating a tool so users can see whether they were duped by inflammatory ads before the last US election. It's a good start.
In 2018, we will see the major players called upon for more transparency in how their algorithms work. This is something that every startup using any amount of machine learning needs to consider building into its products and platforms, in a way that informs users how their data is being parsed without giving away any secret sauce.
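As a concrete sketch of what that balance could look like (every name, weight, and label below is hypothetical, not any company's actual system), a product might surface only the top-ranked, human-readable factors behind a decision, with direction but no raw scores or weights:

```python
# A minimal sketch of a "why am I seeing this?" feature: expose the top
# human-readable factors behind a ranking decision without revealing the
# model's weights. All names, weights, and labels are hypothetical.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Explanation:
    factor: str     # human-readable description of a signal
    direction: str  # whether the signal raised or lowered the ranking


def explain_ranking(weights: dict[str, float],
                    features: dict[str, float],
                    labels: dict[str, str],
                    top_k: int = 3) -> list[Explanation]:
    """Return the top-k factors behind a linear score, with no numbers exposed."""
    # Contribution of each feature to a linear score is weight * value.
    contributions = {name: weights.get(name, 0.0) * value
                     for name, value in features.items()}
    # Rank by magnitude of contribution and keep only the strongest factors.
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return [Explanation(factor=labels.get(name, name),
                        direction="raised" if c > 0 else "lowered")
            for name, c in ranked[:top_k] if c != 0]


if __name__ == "__main__":
    weights = {"follows_publisher": 1.2, "disputed_by_checkers": -2.5, "recency": 0.4}
    features = {"follows_publisher": 1.0, "disputed_by_checkers": 1.0, "recency": 0.7}
    labels = {
        "follows_publisher": "You follow this publisher",
        "disputed_by_checkers": "Third-party fact-checkers disputed this story",
        "recency": "This story was published recently",
    }
    for e in explain_ranking(weights, features, labels):
        print(f"{e.factor} ({e.direction} this story's ranking)")
```

The point of this design is that users learn which of their signals mattered and in which direction, while the model's actual parameters stay private.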
Companies like Google and Facebook already produce transparency reports detailing their responses to government requests for user data. Why shouldn't they do something similar for content integrity, especially now that we more fully understand the implications when that integrity is not protected?
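To sketch what that could look like (every field name here is hypothetical, not an existing report format), a content-integrity report might mirror the shape of today's government-request reports:

```python
# Hypothetical schema for a quarterly content-integrity transparency report.
# None of these field names come from an existing report; this is a sketch
# of the kind of aggregate figures such a report could disclose.
from dataclasses import dataclass


@dataclass
class ContentIntegrityReport:
    period: str              # reporting window, e.g. "2017-Q4"
    items_reviewed: int      # posts and ads examined for integrity issues
    items_downranked: int    # demoted after third-party or AI review
    items_removed: int       # taken down for policy violations
    appeals_received: int    # user challenges to those decisions
    appeals_overturned: int  # decisions reversed on appeal
```

Publishing aggregates like these would let outsiders track enforcement over time without exposing any individual decision.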
This growth toward openness and transparency has to come from within Silicon Valley. Regulation isn't inherently bad, but it does tend to encourage developing workarounds rather than focusing on true innovation. It is incumbent on us, the technology community, to reach out to regulators and legislators to help them better understand the wider impact of the things we're working on. And it's our responsibility to be transparent and honest with consumers about how we're using their data and what they can expect from us.
One thing we are great at in Silicon Valley is iterating fast to the best answers. I am hopeful that our community will iterate its core values from hacking and growth to responsible innovation as we continue to rewrite major parts of the economy and society as a whole. Here’s to a responsible 2018!