In-class exercise: GenAI
High-level goals
The high-level goals of this exercise are to (1) experience using Generative AI (GenAI) coding assistants and (2) critically reflect on the benefits, pitfalls, and future implications of GenAI in software engineering.
The coding task (choose one)
Recall the previous two in-class exercises you’ve completed: Mutation Testing and Delta Debugging.
This time, you will use a GenAI coding assistant to develop an extended component for one of these exercises.
You only need to complete one of the two options below, and you may refine the given specification, for example by adding features or making the coding task more challenging. Note that the tasks are intentionally vague to encourage creativity, exploration with GenAI, and critical reflection.
Option 1: Mutation Testing
Building on the extra-credit question from the Mutation Testing exercise, use GenAI to create a program that does the following (a minimal sketch of one possible approach appears after the list):
- reads a mutant-test detection matrix (killMap.csv),
- builds a Dynamic Mutant Subsumption Graph (DMSG),
- visualizes that graph (any format you like), and
- prints the set of dominator mutants.
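As a rough starting point, the sketch below (in Python) illustrates one way the pieces could fit together. It assumes killMap.csv lists one TestNo,MutantNo pair per row, meaning that test kills that mutant; the function names and the DOT-based visualization are illustrative choices, not requirements, so adapt them to your actual file layout and preferred output format.

```python
import csv
from collections import defaultdict

def read_kill_map(path="killMap.csv"):
    """Map each killed mutant to the set of tests that kill it.
    Assumes one 'TestNo,MutantNo' pair per row with numeric test IDs."""
    kills = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) < 2 or not row[0].strip().isdigit():
                continue  # skip a header or malformed line
            test_id, mutant_id = row[0].strip(), row[1].strip()
            kills[mutant_id].add(test_id)
    return dict(kills)

def subsumption_edges(kills):
    """(m1, m2) is an edge if m1 dynamically subsumes m2, i.e., every
    test that kills m1 also kills m2."""
    return {(m1, m2)
            for m1, t1 in kills.items()
            for m2, t2 in kills.items()
            if m1 != m2 and t1 and t1 <= t2}

def dominator_mutants(kills, edges):
    """Killed mutants that are not strictly subsumed by any other mutant.
    Mutants with identical kill sets subsume each other and are all kept;
    they would share a single node in the DMSG."""
    strictly_subsumed = {m2 for (m1, m2) in edges if kills[m1] != kills[m2]}
    return sorted(set(kills) - strictly_subsumed)

def to_dot(edges):
    """Emit the subsumption edges in Graphviz DOT format, one simple
    visualization option."""
    lines = ["digraph DMSG {"]
    lines += [f'  "{a}" -> "{b}";' for a, b in sorted(edges)]
    lines.append("}")
    return "\n".join(lines)

if __name__ == "__main__":
    kills = read_kill_map()
    edges = subsumption_edges(kills)
    print(to_dot(edges))
    print("Dominator mutants:", dominator_mutants(kills, edges))
```

A fuller solution would typically merge mutants with identical kill sets into a single DMSG node and draw only the transitive reduction of the subsumption relation; refining the sketch that far (or replacing it entirely) is part of the exercise.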
Option 2: Delta Debugging
Drawing on your understanding of delta debugging, use GenAI to create a program that does the following (a sketch of the core algorithm appears after the list):
- implements delta debugging to minimize a failure-inducing input, and
- produces a visualization (e.g., timeline or tree) of the algorithm’s steps.
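As one possible shape for the core algorithm, here is a minimal Python sketch of the ddmin reduction loop (Zeller's delta debugging). The predicate name still_fails and the trace format are illustrative assumptions; the trace simply records each reduction step so you have raw data for a timeline or tree visualization.

```python
def split(items, n):
    """Split items into n roughly equal, contiguous chunks."""
    k, m = divmod(len(items), n)
    return [items[i * k + min(i, m):(i + 1) * k + min(i + 1, m)] for i in range(n)]

def ddmin(inp, still_fails, trace=None):
    """Minimize a failure-inducing input with the ddmin scheme.
    still_fails(chunk) must return True if the chunk still triggers the failure."""
    inp, n = list(inp), 2
    while len(inp) >= 2:
        subsets = split(inp, n)
        # Reduce to a subset: some chunk alone still triggers the failure.
        failing_subset = next((s for s in subsets if still_fails(s)), None)
        if failing_subset is not None:
            inp, n = failing_subset, 2
        else:
            # Reduce to a complement: removing one chunk keeps the failure.
            for i in range(len(subsets)):
                complement = [x for j, s in enumerate(subsets) if j != i for x in s]
                if still_fails(complement):
                    inp, n = complement, max(n - 1, 2)
                    break
            else:
                if n >= len(inp):
                    break                    # granularity 1 reached: 1-minimal
                n = min(n * 2, len(inp))     # otherwise refine the granularity
        if trace is not None:
            trace.append((n, list(inp)))     # log each step for later visualization
    return inp

if __name__ == "__main__":
    steps = []
    # Toy example: the failure is triggered whenever '@' appears in the input.
    minimized = ddmin(list("abc@def"), lambda s: "@" in s, trace=steps)
    print("Minimized input:", "".join(minimized))
    for granularity, state in steps:
        print(f"n={granularity}, input={''.join(state)}")
```

The trace list is only one way to capture the algorithm's steps; emitting DOT, JSON, or an HTML timeline instead would work just as well for the visualization requirement.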
Set-up
Team up in groups of size 2.
Assign yourself to a group in the correct groupset (In-class-GenAI) on Canvas. (If you are in a Canvas group of size 1, you can still submit.)
Instructions
Read the entire assignment and ask any clarifying questions that you might have.
Work with GenAI to complete your chosen task:
- Use your chosen AI assistant(s) to help write the program and generate additional artifacts. You are free to use any GenAI tool or combination of tools however you like (vibe-coding is okay!). Make sure to follow the terms of service.
- Consider design and implementation choices, libraries, algorithms, and output formats. Document your specification and your reasoning for key decisions (bullet points are sufficient).
- Capture evidence: keep screenshots of relevant chat transcripts, code suggestions, and program outputs. (You will attach these excerpts to your reflection.)
Self-evaluate the efficacy of GenAI assistance for your task:
- Run and test your program. Fix issues (with or without GenAI).
- Record how GenAI did: which parts it got right, partially right, or wrong.
Note: Your developed program itself is not graded for correctness; grading focuses on the quality of the reflection on using GenAI (see questions below).
Questions
Submit a PDF file answering all questions. Bullet points are fine.
Which GenAI assistant(s) did you use, and how did you use it to complete your task?
Describe how well GenAI performed on your task. Consider how many prompts it took to get a working solution, how many suggestions were useful, and how many required significant human intervention. Did you have to write any code yourself? If so, how much?
How did GenAI handle the intentionally vague task specification? Compare its approach to ambiguity and implementation choices with how a human teammate might respond.
Describe specific moments where GenAI sped up your development process and moments where it slowed you down. What patterns did you notice about when GenAI was most and least helpful?
Based on this short experience, how might widespread GenAI adoption change software engineering, and what new skills will engineers need?
Extra credit: So far, you have used GenAI to write a program to solve a problem. Now, prompt GenAI to directly solve the problem (i.e., do not develop code, just prompt the GenAI assistant to provide an answer for the given input files). Does it return the correct answer? How does this approach compare to the previous one?
Deliverables
- A PDF file answering the questions above. Please list all group members.
- Evidence (screenshots of chat transcripts and code-suggestion excerpts) embedded in the reflection to support your answers.
- Your program (for Option 1 or 2) with instructions for how to run it (you may use GenAI to create the program documentation).
Steps for turn-in
One team member should upload the deliverables to Canvas.
Hints
If you are uncomfortable using a commercial GenAI tool, you can use an open-source alternative such as ollama to run models locally.
You are free to define the requirements for your program, including the programming language, any particular APIs or libraries, etc.