
Can AI help you learn to code faster than traditional methods can? Anthropic certainly thinks so. The AI outfit has partnered with computer science education org CodePath to get Claude and Claude Code into the hands of students, a time-tested strategy for seeding product interest and building brand loyalty.
The project aims to Claude-ify more than 20,000 students at community colleges, state schools, and Historically Black Colleges and Universities (HBCUs).
According to Anthropic, more than 40 percent of CodePath students come from families earning less than $50,000 a year – a nod to less privileged students who may not be able to afford college without financial assistance.
“We now have the technology to teach in two years what used to take four,” said Michael Ellison, co-founder and CEO of CodePath, in a statement. “But speed for some and not others just widens inequality. Partnering with Anthropic means our students learn to build with Claude from day one, at institutions that have historically been overlooked. This results in better outcomes for our students and a fundamentally different answer to who gets to shape the AI economy.”
We question whether access to Claude will empower economically disadvantaged students to "shape the AI economy." Entering the workforce with some knowledge of Claude should enable participation in the AI economy – certainly a win if the Claude-deprived find jobs scarce. But shaping the AI economy remains the privilege of corporations and billionaires – of those throwing cash at computing infrastructure, politicians, and public relations.
CodePath plans to integrate Claude into various programming courses to give students experience building projects with AI tools and contributing to open source projects – at least the ones that allow AI-generated code submissions.
CodePath students have been pilot testing Claude Code, to good effect, it’s claimed. Anthropic reports that Laney Hood, CodePath student and computer science major at Texas Tech University, had nice things to say about its software.
“Claude Code was instrumental in my learning process, especially since I came into the project with very little experience in the programming languages used in the repository [including TypeScript and Node.js],” said Hood.
At the start of the personal computer revolution in the 1980s, companies like Apple and Microsoft worked to get their products into the hands of students, knowing that early familiarity encourages customer retention.
As web and cloud services began to overshadow traditional operating systems as computing gatekeepers, Google adopted a similar strategy by pushing its Chromebook hardware into schools. More recently, Meta has followed suit, with a mixed and virtual reality offering called Meta for Education.
And now, as AI companies strive to make their models chokepoints for computing services, they too are wooing students in the hope of building lasting brands.
Last year, OpenAI announced that it had teamed up with the American Federation of Teachers, alongside Anthropic and Microsoft, to help launch the National Academy for AI Instruction. Before that, OpenAI debuted ChatGPT Edu. Meta, meanwhile, has been trying to get its Llama model family into schools through a partnership with Blended Labs.
Anthropic insists that its tie-up with CodePath isn't just about modernizing the computer science curriculum. The AI biz says it expects to work with CodePath on public research into how AI is changing education and economic opportunity.
Those opportunities – specifically programming jobs – have declined significantly since 2022, according to the Federal Reserve Bank of St. Louis. Nonetheless, the US Bureau of Labor Statistics says, “Overall employment of software developers, quality assurance analysts, and testers is projected to grow 15 percent from 2024 to 2034, much faster than the average [of 3 percent] for all occupations.”
There is already ample research on the impact AI is having on computer science education. Recent papers on the subject tend to be a mixed bag, finding that AI assistance can help when properly administered, so long as something compensates for the learning lost when cognitive tasks are offloaded. ®