Microsoft CTO Kevin Scott on how AI language models can democratize education: “It creates a bunch of opportunity”
If you’ve been on the internet this week, you’ve likely seen the buzz about ChatGPT, AI firm OpenAI’s advanced, human-sounding chatbot that was released for public use last week. As users marveled at the chatbot’s ability to answer complex questions and even make jokes, many speculated about the implications for higher education as well as the future of work. At Fortune’s Brainstorm AI conference in San Francisco, Microsoft CTO Kevin Scott explained the potential for powerful AI language models to make information more accessible.
Scott emphasized that one important aspect of AI models is their ability to help individuals and businesses create complex code without necessarily having to get a degree in computer science. Microsoft, GitHub, and OpenAI partnered to build GitHub Copilot, a coding assistant. “You don’t need to have a PhD in computer science anymore to build an AI application, which I think is really, really exciting,” he said. “I went back home to rural Central Virginia, where I grew up, and I talked to a bunch of people who had an entrepreneurial mindset, and given these tools, even without computer science degrees, these folks will be able to see the opportunities and they will absolutely be able to incorporate them into their businesses,” he said. “To me, that feels really exciting.”
He also emphasized that the potential of AI language models extends to an array of professional uses beyond coding. “You will have lots and lots of these [models], helping people with a pretty wide range of tasks, whether you are a video editor or trying to pull together a piece of content or you’re a journalist doing research or writing an article.”
While AI language models sound impressively human, Scott emphasized that these models are tools for, not replacements of, human workers and educators. Scott explained that getting useful results from tools like ChatGPT depends on articulating information to the model clearly, a practice he referred to as prompt engineering. He acknowledged that these chatbots can be wrong, and therefore it is up to the user to give the model accurate information to work with.
To Scott, the fact that users must have context and information to prompt the model correctly is a check on students using AI to cheat on essays and other homework assignments as the tools become more advanced. He argued that to create an accurate essay using AI, the student still has to learn the material. “So in a sense, like, nothing really is changing here,” he explained of using AI in education. “You have this tool, and now the student themselves has to become the teacher to the model,” he said.
“I think it will be a much more accessible way for people to get real power out of their technology,” he added.