Emory University has increased its focus on artificial intelligence (AI) in recent years, launching an AI minor and the AI.Humanity Initiative. The initiative aims to integrate AI across fields at Emory and “shape the AI revolution” to improve well-being, stimulate economic growth and support diversity, equity and inclusion initiatives. Center for AI Learning Director Joe Sutherland has been at the forefront of Emory’s movement into the AI field. Sutherland sat down with The Emory Wheel to discuss his roles in and outside of the University, the AI.Humanity Initiative and what the future holds for AI.

This Q&A has been edited for clarity and length.

Joe Sutherland is the director of the Emory Center for AI Learning. (Courtesy of Emory University)

The Emory Wheel: Can you tell me a little about your role at Emory and how you came to Emory?

Joe Sutherland: The AI.Humanity Initiative initially focused on faculty recruitment and set a goal to recruit anywhere between 50 and 70 tenure-track faculty. And it was such a successful exercise that the next question was, “Well, how can we continue to build on this?” There were two problems. Number one, we realized that we needed a place for these faculty and the students and the staff interested in artificial intelligence, curricularly and co-curricularly. We wanted them all to be able to gather in one place to build community. And the second thing was we wanted to provide what are called co-curricular learning opportunities for people to gain the skills that they need to be able to succeed in this new AI world. And so my role at the center is to do those two things: to build community and to provide co-curricular learning opportunities for anybody who wants them.

TEW: I saw you’re involved with politics. Can you talk about your roles outside of Emory?

Sutherland: Politically, artificial intelligence is a hot topic at the moment, and through my position as the center director, I’ve helped many state legislators and the Georgia Chamber [of Commerce] shape their thinking on AI and on what we do or don’t want to pass laws about. And certainly, artificial intelligence is an important topic for workforce development because there’s a whole generation of people who feel left behind right now by the technology revolution, but I actually look at AI as an opportunity for the people who were left behind to hop back on the train. All you need to use this stuff is a laptop and a kitchen counter. To me, there’s never been a lower barrier to entry for those people.

I still think there are some people who don’t even have internet, right? … There are issues with connectivity, and those are social coordination problems that we need to think about. How do we actually extend broadband to the most underserved areas to enable people to participate in this revolution? Well, that is something you can’t do on your own. You can’t just go and dig a hole in your backyard and say, “I installed Ethernet and fiber optic cable.” You need the assistance of a larger entity to be able to do that.

TEW: Academically at Emory, what does the future of AI learning look like? I know there’s a minor now, but what do you think the future holds?

Sutherland: The future is bright. … I’m responsible for co-curricular development, which is to say we provide workshops and courses and experiential learning opportunities where students can sign up to work on AI projects through the center. But curricularly, I can speculate. We’ll most likely develop new degree programs that make AI a core part of them. I think we’ll probably focus on, “How do we build the AI leaders of the future?” — not necessarily just the AI technicians, but the people who can think more critically about the use cases to which we must apply AI and the ethical implications of those applications. I could see that happening not only in the computer science department, where they have the minor, but also in the other departments in the college and the other schools, like the business school, the law school and the medical school. I know that there are a number of initiatives underway, and I’m excited as a faculty member to participate in them.

TEW: How does Emory compare to other universities in its investment and its role in AI?

Sutherland: I was actually just at a conference for the [Southern Association of Colleges and Schools Commission on Colleges (SACSCOC)], which is the accrediting organization for many universities, including Emory University. I was on a panel during their Presidents’ Day conference, where I spoke to an audience of maybe 150 to 200 university presidents and provosts from throughout the accrediting network that SACSCOC is responsible for. … I got a really nice introduction to what other schools are doing, and I can tell you that Emory is by far a leader. We’ve invested in more faculty than many other schools. We’ve been running the AI.Humanity Initiative for longer than many other schools. As far as computing resources and other types of capabilities that are becoming available to our faculty, students and staff, other schools are beginning to plan for them but have not necessarily caught up yet. … We’re definitely on the front end of the curve at the moment.

TEW: What do you think the biggest accomplishment so far of the AI.Humanity project has been?

Sutherland: One of the biggest accomplishments was our admission to the U.S. AI Safety Institute Consortium. If you’re not familiar with this program, it’s run by the National Institute of [Standards] and Technology, which was directed by [U.S.] President [Joe] Biden under his Oct. 30, 2023, executive order to develop a point of view on how the United States should progress when it comes to artificial intelligence: where we should invest, what issues we need to be cognizant of, how to coordinate between public and private entities to achieve the future that we care about and what types of sponsored projects we could be interested in to help include artificial intelligence as part of the enhancement of human health. … We actually put together a very broad and cross-functional application to join this safety institute consortium that included input from every single side of Emory. … I was so pleased to see that it prevailed. The next step, now that we’ve been admitted, is to start staffing the working groups, which will focus on questions like: How should we think about U.S. policy when it comes to, for example, synthetic content that’s generated by artificial intelligence models? How should we think about U.S. policy when it comes to misinformation? How should we be thinking about the safety … of these models and the bias that’s entailed in their creation? What are the methodologies we can use to adjust for that? Those are the things I’m most excited about.

TEW: Finally, where do you see AI at Emory in the next five to 10 years?

Sutherland: We have a lot of opportunities ahead of us, but we also have a lot of challenges. I think one of the biggest challenges is that there are not enough people outside of the more familiar academic circles right now who understand these technologies, even for things as simple as using large language models to plan a grocery list, write job descriptions for a small business or review documents that you would normally need a lawyer to look at when negotiating simple contracts. There are so many opportunities for people who aren’t familiar with these technologies to be exposed to them and therefore increase their earning potential, increase their productivity or just increase the number of free hours they have in their day. Something I’m excited about … is that in the summer, we’re rolling out a publicly accessible certification program that anybody can access from anywhere in the state of Georgia or in the United States to learn how to use these technologies in their own day-to-day life. … Doing that will help us craft the future that we want and not the future that is out of our hands.


Spencer Friedland (26C) is from Long Island, New York, and is the Emory Wheel's Managing News Editor. He is a Philosophy, Politics and Law major with a secondary major in Film. Spencer is also a part of the Franklin Fellows program at Emory.