Andre Paez Castro

About Me

I'm a graduate student at the University of Pittsburgh studying Information Science with a focus on XR and AI. My background is in Psychology, which shapes how I approach this work: XR is an inherently human technology, and understanding the person in the loop matters as much as the tech itself.

What drew me to XR is the idea of genuinely extending human senses: not just screens you wear, but tools that change how you perceive and interact with the world. Right now I'm building TranslationXR, an AR app that overlays the real world with language markers to make learning feel more intuitive and connected to real life.

I'm working toward a career in medical or educational XR, spaces where that human-centered perspective actually matters.

Projects

TranslationXR (In Development)

An AR language learning app for Meta Quest 3S that overlays the real world with translation markers. Point at an object, see its name in another language. The goal is to make vocabulary feel connected to real life rather than like flashcards.

Built with Unity, AR Foundation, and OpenXR. Object recognition is powered by a YOLOv10 model running on device, and a Flask backend connected to a MySQL database handles vocabulary and user profiles.
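
To give a sense of how those pieces fit together, here's a minimal sketch of the kind of Flask endpoint the headset could hit after recognizing an object. The route, table, and column names are illustrative placeholders, not the real schema.

```python
# Minimal sketch of a vocabulary lookup endpoint.
# Route, table, and column names are hypothetical placeholders.
from flask import Flask, jsonify
import mysql.connector

app = Flask(__name__)

def get_db():
    # Connection details are illustrative; the real app would read them from config.
    return mysql.connector.connect(
        host="localhost", user="translationxr",
        password="secret", database="translationxr"
    )

@app.route("/vocab/<label>/<lang>")
def vocab(label, lang):
    # Look up the translation for an object label recognized on device.
    db = get_db()
    cur = db.cursor(dictionary=True)
    cur.execute(
        "SELECT term, translation FROM vocabulary WHERE term = %s AND lang = %s",
        (label, lang),
    )
    row = cur.fetchone()
    db.close()
    if row is None:
        return jsonify(error="unknown term"), 404
    return jsonify(row)

if __name__ == "__main__":
    app.run()
```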

Frontier Hunt (VR Survival Game)

A VR survival game built for Meta Quest 3. You wake up in a forest, work through a tutorial that teaches the core mechanics, then fight your way through wolves and eventually a bear using physics-based interactions and raycast shooting.

What I cared most about was the feeling of being in the world: the weight of the mechanics, the atmosphere. That sense of immersion is what keeps pulling me toward XR.

Built with Unity and C# using OpenXR.

Teak Custom Woodwork & Interiors

A full-stack website I built for my family's custom furniture business. It handles the business's online presence and customer inquiries.

This is my most complete web project. Built with Next.js, React, and Tailwind CSS. Backend data management via Supabase, deployed on Vercel.

Emotion Detection App

A computer vision app that detects emotions from a webcam feed in real time. Built to learn how to deploy vision models on small compute — the architecture lessons directly informed how I approach model integration in TranslationXR.

Face detection runs entirely in the browser using face-api.js, sending cropped face regions over WebSocket to a Flask backend running FER+ inference on Azure.

Built with Python, Flask-SocketIO, ONNXRuntime, face-api.js, Docker, and GitHub Actions.
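
For a rough picture of the server side of that pipeline, here's a minimal sketch of a Socket.IO handler running FER+ through ONNXRuntime. The event names, model filename, and preprocessing details are assumptions for illustration, not the exact production setup.

```python
# Minimal sketch: receive a cropped face over Socket.IO, run FER+ inference,
# and send the predicted emotion back. Event names and model path are illustrative.
import io
import numpy as np
import onnxruntime as ort
from PIL import Image
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

# FER+ from the ONNX model zoo expects a 1x1x64x64 grayscale float tensor.
session = ort.InferenceSession("emotion-ferplus-8.onnx")
input_name = session.get_inputs()[0].name
EMOTIONS = ["neutral", "happiness", "surprise", "sadness",
            "anger", "disgust", "fear", "contempt"]

@socketio.on("face")
def handle_face(img_bytes):
    # Decode the cropped face sent by face-api.js, convert to grayscale,
    # and resize to the model's expected input size.
    face = Image.open(io.BytesIO(img_bytes)).convert("L").resize((64, 64))
    tensor = np.asarray(face, dtype=np.float32).reshape(1, 1, 64, 64)
    scores = session.run(None, {input_name: tensor})[0].flatten()
    # Return the top emotion label to the browser.
    emit("emotion", {"label": EMOTIONS[int(np.argmax(scores))]})

if __name__ == "__main__":
    socketio.run(app)
```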

Contact

andrepaez98@gmail.com

GitHub · LinkedIn