Cognitive Load or Cognitive Lift? Navigating AI in Undergraduate Learning

By Sydney Brown, Asst. Dir., Center for Transformative Teaching 

Cognitive reserve is your brain’s capacity to solve problems and adapt, a capacity developed over a lifetime of curiosity and learning. Cognitive reserve is why some people showed no outward signs of Alzheimer’s even though, when their brains were examined after death, there were clear indications of the disease.

Figen Mekik discovered that teaching her students about cognitive reserve in her general education class motivated them to engage in STEM activities and learning in a way that emphasizing increased future salaries did not, especially after Covid. She wondered whether their interest was a byproduct of the pandemic and whether they prioritized “lifeload over learning load,” meaning they might put their personal lives and wellbeing ahead of learning commitments.

And speaking of cognitive decline, Chris Westfall outlined workplace concerns about skill loss from over-reliance on AI in December of last year. More recently, at the end of March, New York Times Hard Fork podcast hosts Kevin Roose and Casey Newton asked listeners whether using AI was making them dumber (YouTube, 18:46 min.), particularly in terms of their critical thinking skills.

The first three listeners describe how AI has been integral to their work and has helped them augment their own capabilities, whether by helping a marketer with ADHD organize his brainstorming ideas or by helping a junior-level coder learn about error codes and get code reviewed. However, the final case, that of a graduate student who finds using AI to help with writing to be a slope so slippery that they slide into supervising AI instead of authoring, best illustrates the concerns many instructors have expressed. Acknowledging this risk, the hosts raise two final questions: first, can someone who continues to do all their own cognitive work remain competitive? And second, as businesses and bosses discover the degree to which AI can speed up employee processes, will employees even be given the time to exercise critical thinking?

As teachers, we may wonder which skills and knowledge are most essential for our students, and whether we are making enough time to focus on them.

According to AI visionary Noelle Russell, keynote speaker at this spring’s Emerging Tech Conference, success in an AI-saturated future requires two key things: deep domain knowledge and the ability to communicate well. Domain knowledge empowers us to recognize when AI is wrong or headed down a path of unwanted, or even dangerous, consequences. Moreover, the ability to communicate with clarity will be essential to leveraging our AI tools effectively.

In light of what I’ve been reading, watching, and thinking about, AI use in the classroom needs to be considered from a human empowerment perspective as well as from a domain perspective. Perhaps we should ask: “Does this use of AI make my students stronger in terms of their domain knowledge and skill? In what ways will this use grow my students’ ability to communicate with clarity?”

The answers to these questions will differ with each class: some classes may make heavy use of AI, while others may use device-free environments and flipped approaches to learning in order to hold time sacred for cognitive workouts.

Other heuristics that might be useful in placing the focus on building cognitive skills are as follows: 

  1. Process over product. The pressure to produce a “perfect” product and earn a grade can get in the way of student learning. Is there a way to put the focus on the process of learning, perhaps even to celebrate the difficult, messy process of learning? “Dude, suckin’ at something is the first step to being sorta good at somethin’” (Jake the Dog, Adventure Time, Cartoon Network). 
  2. Cognitive load management. In this instance, would using AI free up resources for students to focus on higher-level thinking? For example, in the podcast mentioned previously, Kevin Roose proposes that there are two reasons you might want to lift heavy things: one, to get things from point A to point B, which might be a “forklift job”; or two, weightlifting, which is about self-improvement. In a classroom, if qualitative data analysis is not the learning goal but generating creative solutions to problems is, then using AI to analyze user data about a problem might be the forklift job, and using design thinking processes to generate solutions might be the weightlifting activity. 
  3. Transparency. In what ways does the assignment support and foster transparency and critique on AI use? 
  4. Scaffold AI use. Where it makes sense to use AI, start students with small, low-stakes applications before moving to more involved ones. This enables instructors to teach both ethical use and evaluative, or critical, use. Currently, AI can give users a false sense of fluency, or expertise, in an area. During the AI Learning Community presentation on April 11, two faculty members gave examples in which they had encouraged students to use AI knowing that the students would fail at the task: first, because they lacked sufficient domain knowledge to understand that AI wasn’t the way to approach the problem, and second, because they wouldn’t be able to judge the quality of the response. The instructors found that having students fail in this way helped open their eyes to other methods and to instructor guidance in the use of AI. 
  5. Consider a combination of AI-resistant and AI-enhanced assignments. Use AI-resistant approaches to give students practice with foundational skills and knowledge. Use AI-enhanced assignments to teach students how to use AI in the domain appropriately and effectively. 

For UNL instructors interested in exploring these ideas in more systematic ways, there is an opportunity through a partnership with OpenAI. Please consider submitting a proposal to the OpenAI Impact Program. This partnership provides 200 UNL users with an enterprise version of ChatGPT.

The advantages of using an enterprise version include: 

  • ChatGPT Plus provides extended messaging limits, file uploads, advanced data analysis, image generation, standard and advanced voice modes, access to deep research and multiple reasoning models (like o3-mini, o3-mini-high, and o1), the ability to create and use projects and custom GPTs, limited access to Sora video generation, and opportunities to test new features 
  • SSO is set up with our OpenAI enterprise system, which means users will use two-factor authentication to access their accounts, making them more secure than standard, non-enterprise ChatGPT accounts. 

Finally, if you would like assistance in developing content and activities employing AI or restricting its use, please contact an instructional designer assigned to your college.