The Value of the Liberal Arts in a “Techie” World

Venture capitalist Scott Hartley credits his wide-ranging liberal arts education as key to his success in tech. Photo courtesy of Scott Hartley.

Scott Hartley is a New York City-based venture capitalist who grew up in Silicon Valley and went on to work at technology giants like Facebook and Google.  But during his undergraduate years at Stanford University, he majored in political science—not computer science.  In fact, Hartley credits his wide-ranging liberal arts education as key to his success in the tech world.

In his best-selling book The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, Hartley debunks the assumption that the liberal arts have no place at innovative companies.  The book’s title refers to the nicknames given to Stanford students and their areas of study: “fuzzies” major in the humanities or social sciences, while “techies” study engineering or math.  But the point is simple: the humanities and the hard sciences are unnecessarily cleaved into separate disciplines, and the liberal arts are critical for driving innovation in an increasingly tech-focused economy.  “If we peer behind the veil of our greatest technology, we will see that it is distinguished by its humanity,” Hartley writes.

In the following Q&A, Hartley speaks about why the liberal arts matter more than ever in the tech world.

How did you become interested in focusing on the liberal arts and how important they are in our digital world?
Part of it was autobiographical.  I was a “fuzzy,” even though I am originally from Palo Alto.  I went to high school across the street from Stanford.  I loved political theory, and I sought out this broad liberal arts education.  I took ancient, medieval, and modern political theory.  I took French, Italian, Spanish.  I was really a “fuzzy” at heart, yet I had grown up in the “techie” world.

Then, I went to work at Google, and my family asked, “What do you possibly do there? You're not an engineer.”  It’s a theme that has stayed with me all the way through joining a venture capital firm, Mohr Davidow Ventures, in 2011, right off Stanford's campus.  There, we were meeting with hundreds of entrepreneurs, and we had this real bias toward technical founders.  There was a strong drumbeat that science, technology, engineering, and math were all going to be the key to successful companies.

Really, this book was my counter-narrative about Silicon Valley: so many of these great leaders, and so many of the people steeped in the problems that really affect human lives, studied all sorts of things.  They’re people who really understood psychology, understood philosophy, and understood some of these broader-level subjects and applied them to the tech world.

Hartley’s best-selling book, The Fuzzy and the Techie. Photo courtesy of Houghton Mifflin Harcourt.

You provide a number of examples in your book of people who studied the humanities or social sciences yet excelled at tech companies.  How do you think they made the leap?
They were technical enough to be dangerous, but they didn't feel locked out of the technology world.  They were able to continually learn and engage with technology, but their comparative advantage wasn't that they were technical.  It was that they understood a particular problem that they learned through their broad exposure in education.

We really can’t forget about training for flexible skills, training for soft skills, training for curiosity, training for just having an empathetic world view—it’s central to success.  That requires really digging into literature, digging into history, digging into subjects and questions that we don't necessarily know the answers to.  We've been asking the same complex questions in philosophy for 3,000 years—and we still don't have the answers.  Those are things that I think help people grapple with ambiguity and grapple with change.  If anything, we live in a world that's rife with change today.

In your book, you cite examples of so many technology company founders who were “fuzzies,” everyone from Alex Karp, CEO and co-founder of the software company Palantir, who earned both a law degree and a PhD in neoclassical social theory, to Salesforce co-founder Parker Harris, who studied English, to Carly Fiorina, the former CEO of Hewlett-Packard, who majored in medieval history.  What do you think draws “fuzzies” to the technology world?
These days, technology is the engine of the generation. So it's no wonder that people from all backgrounds are drawn to it.  But when I look at so many of the leaders, people like Susan Wojcicki, who runs YouTube, or Reid Hoffman, co-founder of LinkedIn, or even Alexis Ohanian from Reddit, who was a history major, I think there's a sort of gravitas in the type of problems these people are focused on.

From an entrepreneurship standpoint, so many tech leaders have majored in philosophy.  Philosophy as a potential major for an entrepreneur is incredible, because there's nothing more unstructured than entrepreneurship. How do you possibly prioritize what to do?  The role of philosophy is trying to create structure, conviction, argumentation, and logic… all out of complete ambiguity, right?  It's no wonder to me that there are so many great entrepreneurs who are philosophers by training, because it allows you to grapple in this gray area of ambiguity and really structure your way forward with conviction.  I think that's exactly what you do as an entrepreneur and as a CEO.

The other thing is curiosity.  This actually echoes something that [Google’s former executive chairman] Eric Schmidt says, which is, "The two characteristics of success at Google are curiosity and persistence."  I think that in a broad-based education like the liberal arts where you're tugging on the mind in different ways, and you're exposing somebody to ten different subjects and ten different methodologies and ideas, you really allow them to explore their own curiosity.  I think that curiosity over time is what allows people to persist and succeed in high positions. 

We live in a time when many parents of college-bound students are really concerned about the kid who wants to be an English or a history major, and when some prominent tech luminaries say that soft skills are not important.  Why should students pursue the liberal arts and the humanities anyway?
Here’s why.  In five years, JavaScript will not be a coding language that anyone cares about. In ten years, it won't even matter.  It's interesting that these strict vocational skills seem like the shiny objects that provide us with job security.  But in fact, it's the soft skills: curiosity, persistence, empathy, collaboration. These are the things that will really sustain us two jobs, three jobs, four jobs out in our careers.  That's what I would say to parents.

Among the most interesting parts of your book were the examples of how these “soft skills” have been crucial to the success of technology companies.  What do people need to know about the importance of soft skills?
Underneath technology are human inputs.  People are choosing which data to pay attention to in big data, and how to organize it.  They're choosing the taxonomies in machine learning.  They're choosing probabilities and sensitivities in artificial intelligence.  They're choosing how to optimize processes.

We need to myth-bust our expectations about these technology terms and understand that underneath each of them are people: when we maneuver our way through an application on our phone, that app was designed by people.  The choices and menus that have been created for us are human creations, optimized for a certain process and built around certain elements of psychology and behavioral design.  So, we can't think of these things as uniformly “techie.”  They're deeply rooted in psychology and in all these other skills as well.

Fei-Fei Li, who is the head of artificial intelligence and machine learning at Google Cloud, had this great line in the New York Times: “there's nothing artificial about this technology [artificial intelligence].” It’s all derivative of human inputs.
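To make that point concrete, here is a minimal, hypothetical sketch (not from the interview, and not any real product's code; the names and values are illustrative) of how human judgments, the label taxonomy, the signals that count as evidence, and the decision threshold end up baked into an “automated” classifier:

```python
# Hypothetical illustration: even a "fully automated" spam filter is shaped by
# human choices about categories, signals, and sensitivity.

SPAM_THRESHOLD = 0.8               # a human judgment: how aggressive should the filter be?
CATEGORIES = ["spam", "not spam"]  # a human-chosen taxonomy
SUSPICIOUS_WORDS = {"winner", "free", "urgent"}  # humans decided these signals matter


def score_message(text: str) -> float:
    """Toy stand-in for a trained model's probability output."""
    hits = sum(word in text.lower() for word in SUSPICIOUS_WORDS)
    return min(1.0, hits / len(SUSPICIOUS_WORDS))


def classify(text: str) -> str:
    # The cutoff encodes a human trade-off between missed spam and
    # wrongly flagged mail: the "sensitivity" someone had to choose.
    return CATEGORIES[0] if score_message(text) >= SPAM_THRESHOLD else CATEGORIES[1]


print(classify("URGENT: claim your free prize, you are a winner"))  # -> spam
print(classify("Lunch at noon tomorrow?"))                          # -> not spam
```

Change the word list or lower the threshold and the “same” automated system behaves differently; that human-chosen sensitivity is exactly the kind of input Hartley and Li are describing.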

Well, what other kinds of areas and roles might humanities folks have in digital fields? In artificial intelligence, for example.
Take Melissa Cefkin. She is an anthropologist and principal researcher in the Human Centered Systems practice at Nissan Research Center, a self-driving car lab in the Bay Area.  We have this sort of end-state vision of the world where it's all autonomous vehicles on the roads at all times, but really, for the foreseeable future, for our lifetimes, it's going to be a mixed-use environment where there are some people driving.  How do you put into computer code the tacit communication challenges that happen across cultures, across ages, across demographics?

It's really an ethnographic research study: how do we take tacit human communication, like a head nod or a wave of the hand, that means one thing in New York or Boston and something totally different in Beijing?  So, if you're Nissan, and you're creating cars that are supposed to drive autonomously through streets all over the world, it's an incredible anthropology challenge.  People forget that, yes, there are people writing the computer code, but the inputs for how they write that code and what they build are really human.  They're human-centric.

Stanford University, where Hartley pursued a liberal arts course of study in political science. Photo: Adobe Stock.

How do we get to a place where, in society and in education, we're not viewing it as STEM versus the humanities?  What can we do to encourage collaboration between the two?
I think that really cuts both ways.  Engineering can be a highly creative field, yet we think of it as sort of rote and robotic in some ways.  Equally, the humanities can be highly quantitative.  They can involve using statistical software to analyze massive data sets in the social sciences.

I talk a lot in the book about the prominence of the humanities in Silicon Valley entrepreneurship, but I think what we also lack is the prominence of technologists in really “fuzzy” areas, like government.  If you look at Mark Zuckerberg's testimony before Congress, you realize that people in government don't understand anything about technology.  So, the same way that we need to get ethicists, philosophers, and people from the humanities into these AI [artificial intelligence] technology conversations, we need to get deeply technical people in Washington to think about some of these policy challenges.

So are we worrying unnecessarily about the “rise of the machines” and about our jobs becoming obsolete?
Technology has created fear in a lot of people.  Really, these technologies that we point to, like machine learning and robotics, are going to optimize the routine parts of our jobs.  They're going to whittle away at the very repeatable tasks that we do.  What that does is actually force us all to be a bit more human, to upskill in the roles that we have, which are going to deal more with human-to-human interaction, more with having empathy for a customer, more with collaborating with colleagues.

There's a professor at the Harvard Graduate School of Education, David Deming, who talks about why soft skills are relevant.  One of the arguments is that if machines are taking over the really routine stuff, then what's left is the more complex, the more improvisational, the more unscripted tasks.  Those things require human specialization.  When we're trading a lot of tasks between people, the things that reduce the friction in that work environment are the soft skills.

If anything, in this more robot-enhanced, machine-learning future, people need to have these soft skills.  They need to be curious.  They need to have the flexibility to keep adapting and to focus on these human skills.