I am a Computer Science graduate (B.Sc.) focused on the intersection of High-Performance Computing, Sequential Data Modeling, and Quantum AI. My work treats complex datasets, from ancient texts to particle physics events, within a single sequence-modeling framework for discovery.
I am currently a candidate for Google Summer of Code 2026 with ML4Sci, focusing on the Quantum Particle Transformer (Q-ParT) project.
- GSoC-2026-Research: My primary research sandbox, containing:
  - Quantum Attention PoC: A JAX + PennyLane implementation of hybrid variational circuits for sequence modeling.
  - Quark-Gluon Classification: High-performance jet classification using the Google/DeepMind stack (JAX/Flax/Optax).
  - Technical Benchmarks: Documentation on mitigating barren plateaus and scaling toward 100+ qubit simulations.
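The core building block of a hybrid variational circuit like the one in the Quantum Attention PoC can be sketched in plain NumPy. This is a minimal illustration of a single two-qubit variational layer (parameterized RY rotations plus a CNOT, with a Pauli-Z readout), not the repository's actual code; in practice a PennyLane QNode on a `lightning.gpu` device would replace the hand-rolled statevector simulation.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control (most-significant wire in kron ordering)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def circuit_expval(params):
    """One variational layer: RY on each wire, then CNOT;
    returns <Z> on wire 0, the usual PennyLane-style readout."""
    state = np.zeros(4)
    state[0] = 1.0                                   # start in |00>
    state = np.kron(ry(params[0]), ry(params[1])) @ state
    state = CNOT @ state
    probs = np.abs(state) ** 2
    # <Z_0> = P(wire 0 = 0) - P(wire 0 = 1)
    return (probs[0] + probs[1]) - (probs[2] + probs[3])

print(circuit_expval([0.0, 0.7]))   # wire 0 left in |0>, so <Z> = 1.0
```

Because the expectation value is a smooth function of `params`, the same layer is differentiable end-to-end, which is what lets a quantum block sit inside a JAX training loop as an attention-like scoring component.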
Cleaned KJV Bible for LLMs (Kaggle)
- Role: Lead Data Engineer.
- Impact: Engineered a structured, high-fidelity dataset of 31,102 verses for LLM research.
- Relevance: This served as my foundational benchmark for large-scale sequence modeling; building it developed the ETL and data-handling skills needed for High Energy Physics (HEP) datasets.
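The kind of ETL step involved in producing a structured verse dataset can be sketched as follows. The input format and the field names (`book`, `chapter`, `verse`, `text`) are hypothetical, chosen for illustration rather than taken from the actual Kaggle schema.

```python
import re

def parse_verse(line):
    """Parse a raw line like 'Genesis 1:1  In the beginning...' into a
    structured record (hypothetical input format, for illustration)."""
    m = re.match(r"^(.+?)\s+(\d+):(\d+)\s+(.*)$", line.strip())
    if m is None:
        return None  # skip headers, blank lines, and other noise
    book, chapter, verse, text = m.groups()
    return {
        "book": book,
        "chapter": int(chapter),
        "verse": int(verse),
        # normalize internal whitespace so downstream tokenizers see clean text
        "text": " ".join(text.split()),
    }

record = parse_verse("Genesis 1:1   In the beginning God created the heaven and the earth.")
print(record)
```

Running such a parser over every raw line, dropping the `None` results, and validating the final count against the expected 31,102 verses is a simple end-to-end integrity check for the cleaned dataset.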
- Frameworks: JAX, Flax, PennyLane, Cirq, TensorFlow, PyTorch.
- Infrastructure: XLA (Accelerated Linear Algebra), GPU-accelerated simulation (lightning.gpu).
- Mathematics: Currently advancing in AP Calculus and Quantum Dynamics to support utility-scale QML research.
- B.Sc. in Computer Science
- Mission: To build AI systems that drive scientific discovery at the LHC while supporting human emotional and spiritual growth.
Connect with me: Email | GitHub | Kaggle | Hugging Face | skill.google | Google Developers | LinkedIn


