Emily Tavoulareas, a professor at the McCourt School of Public Policy at Georgetown University, is an expert in technology, specifically in how technology is applied in practice by governments, organizations, and individuals. Through her new project, Homescreen, she explores how technology impacts our lives, especially those of children, and how it can be leveraged for social good.

What is your connection to Greece, and how do you feel about your Greek roots?

I love my Greek roots. I am the first generation born here in the United States. My parents came to the United States in the late 1970s, like many Greeks did after the fall of the junta. I was very lucky to grow up in a part of the United States where there is a very strong Greek community, so I grew up with my own little “village,” I would say, in our church and Greek school, with multiple generations around me. Now my children go to school with the children of my friends, and literally, the parking lot looks like a small village.

When you teach at Georgetown University, what do you think is the main concern among your students regarding technology and the future?

There are many concerns, and students differ in how they see and think about technology. The main issues, though, I would say, are jobs, climate, and the environment.

There is also a more generalized feeling that the technologies we use are not designed for our benefit, but, in some way, are being used against us. There is this sense that these technologies constantly promise to be helpful, but they don’t exactly do what we expect from them. I think this shows up in different ways.

I teach in a public policy school (it’s a graduate public policy school). Last year, around this time, DOGE and Elon Musk were in government, cutting things left and right. Honestly, in the classroom, you could really feel the weight of the situation, because students were seeing, in many ways, their professional paths and ambitions being disrupted — at least those focused on work in the federal government.

So I would say that is a very big issue. I also hear many students worrying about the effects of new technology, especially data centers, the need for computational power, and how that affects our environment.

I also think there is a feeling of loss of control. Everything is moving very fast, but it’s like asking yourself: who decided this is the path we are going to follow? Especially for those studying government.

In recent years, we constantly hear about huge investments in artificial intelligence. However, when we hear statements from CEOs like Sam Altman saying that this could potentially lead to the end of the world, it feels like we are not on the same team. How do we reverse this?

It doesn’t feel like we are on the same team because I don’t think we are. I think there is a combination of complex things happening right now.

One is that they run companies that have to make as much money as possible. The second — and perhaps more complicated — is that many of these tech leaders have adopted an ideology that assumes artificial intelligence will dominate the world and essentially replace humans. That is, it’s an ideology.

Something important to keep in mind is that many tech titans, at least in the United States, grew up reading a lot of science fiction. There is, in fact, quite a bit of evidence showing how science fiction has shaped their worldviews, and it is even more interesting how strongly people are drawn to it. We also see a convergence with ideologies that already existed, like effective altruism: highly quantitative approaches to decision-making, ethics, and philosophy that sit uneasily alongside traditional philosophy and religion, and that have gained a lot of influence.

You asked how we can reverse the situation. One reason we are in this situation is that these people have amassed enormous power. And they have amassed it because individual companies have enormous influence. Essentially, these are monopolies. They have monopolized not just the entire industry, but — because much of the world depends on this technology as critical infrastructure — they have gained massive power.

That is why I believe that any response to this situation requires governments, particularly in the countries where these companies are located, to start regulating them in some way. That is, after all, the purpose of public institutions.


You have worked for the Beeck Center for Social Impact and Innovation and the U.S. Digital Service. From your experience, where do you think the government goes wrong regarding the use of technology to serve citizens?

Can new technologies be applied in public administration without reducing the workforce?

My time at the U.S. Digital Service really changed my perspective on many issues related to government and was a transformative experience. To answer your question literally: the government makes many mistakes. There is a mistaken belief that technology will make life easier, whereas, in reality, technology is just a tool for something else. It is not an end in itself. Technology can bring change, but many other things must happen for citizens’ lives to actually become easier.

When we interact with technology, we only see the tip of the iceberg. Technology reflects the foundations it is built upon. In government, these foundations include policies, procedures, and regulations. To improve a citizen’s experience, it’s not enough to improve just the technology; you have to improve the whole environment around it.

Another major mistake the government makes is overreliance on quantified information, like data. Data is only one piece of the picture. You can’t fully understand a problem by looking only at numbers. You have to see people’s real experiences. For example, a few years ago, Europe and Greece announced that the austerity crisis had ended, but walking around my city, I saw that nothing had changed — closed shops, neighbors struggling to pay rent, cafes closing. The numbers told one story, but reality told another. To really solve problems, you have to combine data with observed reality and understand people’s experiences.

Technically, new technologies can be integrated into public administration without impacting jobs, but it must be done carefully. Often, people simply “load” work onto systems to make things easier, but that doesn’t mean it will work as imagined. There are examples of companies that laid off staff expecting AI to replace them, only to have to rehire them later.

Your new project, Homescreen, emphasizes the impact technology has on our lives, especially on children. Recent studies show that over 50% of Gen Z use social media for more than three hours a day, while many struggle with social interactions.

What advice would you give to parents who see their children permanently glued to a screen most of the day? And do you think a social media ban for kids under 18 could bring positive results?

It’s a complex issue. Screens have replaced many of the things we used to do outside, like finding friends and playing together. In the past, we would come home from school and go outside to meet our friends. To get kids off technology, you have to give them alternative activities.

There are many reasons parents today are less willing to let their children go outside. The world outside can seem more dangerous, and life has changed: parents work more, and family routines are different.

So we have to give them back that experience. If we don’t want children on devices, we have to give them something else to do, something that is not a screen. We have to give them independence and the freedom to choose and act. If we want children to become functional members of society, we need to allow them to have that level of independence so they gain the confidence to face different situations.

Another important point is that the way parents use social media affects how children use devices. If parents constantly consume social media content filled with frightening news, they often keep their children inside out of fear for their safety, even though the reality may be much safer. This behavior shapes children’s choices and ultimately increases the time they spend in front of screens.

Regarding a social media ban for children under 18: in schools, it seems to have positive results, but its implementation is complex, because it requires age verification and accountability from big companies. It is a temporary solution, a “band-aid” for a deeper problem, which concerns children’s dependence on screens and the lack of experiences that would give them independence and confidence.

Much of children’s confidence and ability to face the world comes from confronting difficult, unpleasant, or uncomfortable situations. When we remove these experiences and confine them to the “box” of a screen, we protect them from discomfort, but we also increase their anxiety and hinder their growth.

In short, screens are dangerous, but removing independence and experiences is also dangerous. The ideal is to find balance: to give children alternatives, independence, and space to grow so they can use technology consciously and gain life experiences outside of screens.

Do you believe technology is ethically neutral?

No. That is my quickest answer.

All technology is created by humans. And when that happens, it reflects the beliefs, priorities, values, and goals of those humans. There is no way technology can be neutral. This is even more true for algorithm-based technology. Algorithms are just rules for computers. Someone wrote those rules, and those rules reflect the views of an individual or group of people.

There is no way to avoid it. It is not neutral. And the more we treat it as a neutral tool, the more problems we will create.

How do you think global society will evolve over the next ten years?

This is a difficult question. What do I think? Honestly, I don’t think anyone really knows. Everything is changing so fast. When I say “everything,” I mean that the tectonic plates of our system and society are shifting right now: geopolitically, technologically, socially. Everything is changing at the same time, which is why I think it’s very hard to imagine. Whatever the answer is, the world will probably be very different from what we see today — perhaps even unrecognizable. Much depends on the decisions we make now: first, in relation to the climate and how we interact with our environment; and second, in relation to technology. In fact, I believe that decisions we make regarding technology are among the most important.

What is your wish for 2026?

My wish for 2026 is twofold.

First, I hope that strong governments — those with the ability to shape the decisions and priorities of companies — find ways to effectively regulate these companies and ensure that they no longer create products deliberately designed to be addictive to the public. And, on a personal level, I really hope that people around the world “wake up” and remember the power they have. Life existed before these products. We were able to connect with each other, socialize, and exchange information before these platforms existed. I hope that in 2026 we remember this again and go outside more, because right now we are giving too much to these companies at the expense of our own lives. We need to take it back. And I truly hope we see more of that.
