Designing with AI

Problem

Learning outcomes are essential to degree programs: they serve as evidence of student learning and accomplishment. Many programs struggle to refresh outdated outcomes or need a starting point when writing new ones.

I saw AI as an opportunity to help with this process.

Goal

In two weeks, build a proof of concept that introduces AI into our outcomes assessment product.


Product Demo


Process

This project attracted a lot of attention and feedback. Some collaborators, such as the Executive and Legal teams, were new to me, and working with them was a valuable chance to learn how to prioritize their feedback.

Engineering

This was a strong team effort with our Engineering Lead to ensure high-quality generated content. With a 12-hour time difference, clear communication was essential to our trial-and-error approach. Ultimately, the algorithm considers three sources of content, illustrated in the sketch after this list:

  1. Examples of outcomes from accrediting bodies and other institutions.

  2. Outcomes previously used by the user’s own organization within the university.

  3. Outcomes from other organizations within the university.
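
As an illustration only, here is a minimal Python sketch of how those three sources could be assembled into a single prompt for a generative model. The function name, prompt wording, and example data are my own assumptions, not the production implementation.

    # Hypothetical sketch: combine the three content sources into one prompt.
    # All names and wording here are assumptions for illustration only.
    def build_prompt(program_name, accreditor_examples, org_outcomes, peer_outcomes):
        sections = [
            ("Example outcomes from accrediting bodies and other institutions", accreditor_examples),
            ("Outcomes previously used by this organization", org_outcomes),
            ("Outcomes from other organizations at the university", peer_outcomes),
        ]
        lines = [f"Draft 3-5 learning outcomes for the program: {program_name}."]
        for title, examples in sections:
            if examples:
                lines.append(f"\n{title}:")
                lines.extend(f"- {example}" for example in examples)
        return "\n".join(lines)

    print(build_prompt(
        "B.S. in Biology",
        accreditor_examples=["Apply the scientific method to design and conduct experiments."],
        org_outcomes=["Communicate biological concepts to a general audience."],
        peer_outcomes=["Evaluate primary literature in the life sciences."],
    ))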

Executives

Since this was the company's first AI project, Leadership was more hands-on than usual. Their approval was crucial, so I invited them to client feedback sessions to hear from clients directly. I was also transparent about any technical limitations we faced.

Clients

We engaged our Client Advisory Board to validate the content, comparing it to their existing outcomes during development so we could address any issues. Volunteers then piloted the feature at their institutions, keeping feedback flowing until the general release.

To address data security concerns and faculty doubts about AI, we simplified the workflow to demonstrate its ease of use. Users can still apply all of the normal functions, like editing, versioning, archiving, and deleting, so they are never locked into the generated outcomes.

Legal

Involving the Legal team was another first for this project. They provided much of the language in the Generate Outcomes consent message to make sure we covered our bases from a security standpoint.


Future Versions

Given the two-week time constraint and proof-of-concept scope, there is still plenty of opportunity to enhance this feature:

  • Implement a “Generate Again” experience

  • Let the user configure specific settings before generating (see the sketch after this list)

    • Number of Outcomes

    • Tone of Voice

  • Collect specific content requests to improve the algorithm
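
As an illustration of the settings idea above, those pre-generation options might map to a small configuration object like the one below. The field names and defaults are hypothetical, not the shipped design.

    # Hypothetical sketch of pre-generation settings; field names and defaults
    # are assumptions for illustration only.
    from dataclasses import dataclass

    @dataclass
    class GenerationSettings:
        number_of_outcomes: int = 4
        tone_of_voice: str = "formal"  # e.g. "formal" or "plain language"

    settings = GenerationSettings(number_of_outcomes=6, tone_of_voice="plain language")
    print(settings)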


Outcome

The project has produced positive results and has proven the value of AI at Watermark. Now that users seem open to the use of AI in our products, we are excited to broaden this feature to additional workflows.

“This type of AI feels less scary to me. My Program Chairs will love not having to start from scratch with their outcomes.”
— Director of Institutional Effectiveness