ExamVault — Automated Exam Generation & Analysis
Full-stack capstone platform: automated exam generation, analytics dashboard, and grading pipeline processing 100,000+ student records.
Overview
I worked with a client to design and build ExamVault, a full-stack platform that helps instructors create, manage, and analyze multiple-choice exams. Over four months, my team and I built the system from the ground up, balancing design, development, testing, and client feedback. My main contributions included building the backend logic for randomized exam generation, creating an analytics dashboard to visualize exam results and trends, and coordinating sprints as Scrum Master to keep the project on track. This project pushed me to manage deadlines, debug under pressure, and communicate clearly with both teammates and stakeholders.
The Problem
Manual grading and reporting for 100,000+ student records across multiple exam sessions created a 3-day processing bottleneck each semester. Faculty had no visibility into class performance until reports were manually compiled, and integrity flags were entirely subjective. The system needed to be operational within one semester.
Questions Addressed
1. Can the grading pipeline be automated end-to-end, from raw answer sheets to final grade reports?
2. What statistical methods can reliably flag potential integrity issues without generating excessive false positives?
3. How should the faculty dashboard be designed to surface actionable insights without requiring data analysis expertise?
Methodology
Backend — Django + PostgreSQL
Built a Django REST Framework API with models for Exam, Student, Answer, Grade, and IntegrityFlag. Implemented automated grading logic using answer-key comparison with partial credit support. Added a statistical integrity module using z-score analysis to flag responses that deviate significantly from peer distributions.
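The core of the z-score check fits in a few lines. The sketch below is illustrative only; the function name, score format, and threshold are assumptions, not the actual module's API, and the production version operates on per-response distributions rather than raw totals:

```python
from statistics import mean, stdev


def zscore_flags(scores: dict[str, float], threshold: float = 3.0) -> list[str]:
    """Return IDs of students whose score deviates from the class mean
    by more than `threshold` standard deviations."""
    values = list(scores.values())
    if len(values) < 2:
        return []  # not enough data to estimate spread
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # identical scores: nothing deviates
    return [sid for sid, s in scores.items() if abs(s - mu) / sigma > threshold]
```

In practice the threshold is the main tuning knob: lowering it catches more anomalies at the cost of more false positives, which is the precision trade-off discussed in the results below.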
Frontend — React Dashboard
Built a React.js SPA with role-based views (faculty vs. admin). Faculty view shows: class score distribution, question-level difficulty analysis, top/bottom performers, and integrity flag queue. Admin view adds bulk import, exam configuration, and export to PDF/CSV. Used Recharts for all data visualizations.
DevOps — Docker + Deployment
Containerized both services using Docker Compose. Configured Nginx as reverse proxy. Wrote CI pipeline (GitHub Actions) to run tests on every PR. Deployed to a university-managed Linux server with automated backups to S3-compatible storage.
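A Compose layout along these lines matches the setup described; the service names, build paths, and image tags below are illustrative assumptions, not the project's actual files:

```yaml
# Sketch of the four-service stack: API, SPA, database, reverse proxy.
services:
  api:
    build: ./backend            # Django REST Framework app
    env_file: .env
    depends_on:
      - db
  web:
    build: ./frontend           # React SPA, served as a static build
  db:
    image: postgres:16
    volumes:
      - pgdata:/var/lib/postgresql/data
  nginx:
    image: nginx:stable
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - api
      - web

volumes:
  pgdata:
```

Putting Nginx in front of both services lets one origin serve the SPA and proxy `/api` requests to Django, which also simplifies the CI pipeline's integration tests.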
Key Results
Automated grading reduced the 3-day processing window to under 2 hours end-to-end, including PDF report generation.
The z-score integrity flagging system achieved 94% precision with a 5% false positive rate — significantly better than manual review, which had no consistent threshold at all.
Faculty reported the question-difficulty breakdown as the most actionable insight: on average, it revealed three questions per exam that were statistically too easy or too hard.
Docker containerization reduced environment setup time for new team members from 2 days to under 30 minutes.
Conclusion
Full-stack automation of academic workflows is feasible within a single semester with a focused two-person team. The integrity flagging module proved the most impactful feature — not because it caught cheating, but because it gave faculty a defensible, consistent standard for investigation rather than gut feel. The system is currently in production use.