I opened Pandora's Box with my last 8th-grade unit. Eager to explore the possibilities, I investigated what students could do with Python and ChatGPT. Addressing broad topics such as social change and big data, I challenged the students to build a web visualization, much like the ones I had just learned to make in my six-month Data Analytics Bootcamp. The results were terrific.
The outcomes were diverse, reflecting students' varying abilities to process what they read. Some students produced web apps in Dash or on github.io, while others created interactive graphs with dropdown menus, and still others learned how to work with a .csv file of more than 5,000 data points. The students undoubtedly acquired skills throughout the project, but I couldn't help but wonder: did they become better coders because of it?
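To give a sense of the kind of task involved (the dataset and column names here are hypothetical, not my students' actual data), working with a large .csv in Python can be as simple as streaming rows with the standard-library csv module and aggregating as you go:

```python
import csv
import io

# Tiny hypothetical sample standing in for a real 5,000+ row dataset.
RAW = """year,region,value
2020,north,12.5
2020,south,8.1
2021,north,14.0
2021,south,9.6
"""

def average_by_region(csv_text: str) -> dict:
    """Group rows by region and average the 'value' column."""
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals.setdefault(row["region"], []).append(float(row["value"]))
    return {region: sum(vals) / len(vals) for region, vals in totals.items()}

print(average_by_region(RAW))
```

The same pattern scales to thousands of rows because the reader streams one row at a time rather than loading everything into memory.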
As more AI tools powered by models like ChatGPT emerge that enable kids to code simple solutions, discussions about what K-12 coding courses should cover must continue. A recent Forbes article by Nisha Talagala, "Implications For Training And Education," highlights that coding in a professional setting goes beyond what current AI tools can generate. These tools are valuable as co-pilots but do not replace the software engineer. In the context of training and education, however, the situation is murkier.
Traditionally, programming training programs, including those in K-12 settings, have focused on contained problems. My 6th-grade class is no different: small, self-contained coding challenges let students practice foundational skills until they master a topic. When faced with even slight difficulty, however, students may turn to ChatGPT to solve the problem for them. This became evident when I reviewed coding solutions from a 7th-grade class: ChatGPT solves "Rock, Paper, Scissors" very differently than a student who has just learned to code!
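To make that contrast concrete, here is an illustrative sketch in Python (both versions are hypothetical, not actual student or ChatGPT output): a beginner who has just learned conditionals typically spells out every winning case, while a ChatGPT-style answer often collapses the rules into a lookup table.

```python
import random

CHOICES = ("rock", "paper", "scissors")

def judge_beginner(player, computer):
    """A new coder's style: enumerate each case with if/elif."""
    if player == computer:
        return "tie"
    elif player == "rock" and computer == "scissors":
        return "player"
    elif player == "paper" and computer == "rock":
        return "player"
    elif player == "scissors" and computer == "paper":
        return "player"
    else:
        return "computer"

# An LLM-style answer often encodes the rules as data instead of branches.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def judge_compact(player, computer):
    """The same game logic via a dictionary lookup."""
    if player == computer:
        return "tie"
    return "player" if BEATS[player] == computer else "computer"

if __name__ == "__main__":
    computer = random.choice(CHOICES)
    print("computer chose", computer, "->", judge_compact("rock", computer))
```

Both functions behave identically; the difference is entirely in style, which is exactly what makes AI-generated homework easy to spot, and easy to lean on.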
Talagala notes that fresh graduates are not expected to know how complex systems behave, while senior engineers are. The challenge arises when ChatGPT and other AI tools can automate the tasks typically given to entry-level engineers: how will those engineers learn the skills senior engineers possess, and, more importantly, will junior roles be displaced? Much like the basics for fresh graduates, what constitutes essential skills and content for today's K-12 learners, and what does not? Will the basics be displaced as well?
More than ever, education programs need to incorporate project-based learning, where students use trivial code snippets as a means to an end rather than as the end itself. These challenges need to be unique and open-ended. The focus must shift from coding syntax and language proficiency alone to problem-solving strategies and critical thinking. We must encourage students to think about the problems they are solving and teach them to take time to consider multiple approaches and plausible solutions. This form of instruction will demand far more planning and assessment work from teachers, and these concerns apply not only to computer science teachers but to instructors across the curriculum.
Teachers must encourage students to explore creative solutions and think outside the box. We must allow students to express their unique ideas and encourage experimentation. These projects will challenge educators in how they grade: mastery of basic skills lends itself to a straightforward A-F scale, but creative, out-of-the-box solutions are often subjective in their measure of success.
Highlighting collaborative coding and encouraging students to work in teams will also be essential. Promoting communication, idea sharing, and learning from peers simulates professional coding environments and builds a lifelong skill. However, teachers must develop ways to grade students accordingly and accept that students will develop strengths in different areas, so the results will be differentiated, as is often the case in real life.
Furthermore, teachers should actively modify their course content to address the ethical implications of AI-generated code. This includes adding active discussions on privacy, bias, and responsible AI development to cultivate a sense of responsibility among learners, or, as Talagala puts it, being "Effective, Efficient and Ethical" with AI. To achieve this, teachers must stay up to date with emerging technologies and tools like ChatGPT, integrating them into coding courses to enhance learning while still developing coding principles. This will require substantial, ongoing work and training.
And the question still remains: which coding principles are still relevant in this AI world? Teaching students to think algorithmically and develop logical problem-solving skills will clearly remain important, even with the ever-increasing availability of AI tools. Logic remains the foundation of efficient coding and of problem-solving in general. However, this skill is also very systematic, which leads us back into an infinite loop of controversy and the opening of Pandora's Box: like training an AI, developing logic takes a lot of practice with trivial coding challenges, the very thing AI does efficiently. So where do we go from here?
A shift in mindset for students? An understanding that learning, and the love of lifelong learning, is what will keep us human and relevant? More discussion of why we practice what we do is a must, along with an understanding of how AI works and how we can keep our thinking up to date. Or, on the flip side, do we shift completely? Do we accept that computers and AI will outperform us in specific roles and hope that creativity and an understanding of complex situations will consistently beat the machine? Do we focus on teaching the art of creativity, break down the barriers of classrooms, and throw out foundational knowledge and skills? As with the opening of Pandora's Box, which here symbolizes the unintended consequences and complexities of using AI tools in school, we are at a place where both options offer a challenge and a world of possibilities.
Talagala, N. (2023, June 1). Implications For Training And Education. Forbes. Retrieved from https://www.forbes.com/sites/nishatalagala/2023/06/01/is-coding-educationas-we-know-itdead-how-large-language-models-are-changing-programmers/?sh=7ec5ceb43ba9