dansmyers | 1 year ago
We've let our kids play with LLMs by having conversations in voice mode and generating images. The youngest one likes doing this, but it's a novelty, not something that he does all the time.
For academic work, we've had success using Perplexity (with parental guidance) for the older kids' projects that require Internet research. The ability to get an overview of a topic at a moderate level of complexity with links to other sources is beneficial. This isn't a substitute for doing in-depth research in the library or with actual peer-reviewed articles, but they're not yet at that level of depth.
At the college level, the most important lesson we're trying to teach is using LLMs as a source of ideas, suggestions, and feedback to advance your work, rather than as a tool for generating finished work. I often phrase this as "collaborating vs. delegating". I want students to think critically about their ideas and iterate with LLMs in the loop to help solve the creative problems they encounter - but without outsourcing their own vision for the project.
My colleagues are seeing good results across multiple disciplines using LLMs for topic development and pre-writing, so I'd encourage leaning into that role, as opposed to jumping straight into text generation.
We've also learned that students benefit from a clear process with specific example prompts. Using AI well requires critical thinking and self-reflection, skills that mature with time and exposure.
If you're interested, here's an example research assignment I've used in my own classes with some specific prompts and suggestions for different phases of the writing process: