Hello!!!

I'm Kushagra Dixit.

Machine Learning Researcher | Software Developer

More About Me
About

Let me introduce myself.

Profile Picture

I am a dedicated computer science enthusiast currently pursuing a Master's in Computer Science at the University of Utah, having previously earned a Bachelor's degree in Instrumentation and Control Engineering. My passion lies at the intersection of machine learning and software development, and I am eager to contribute my skills as both a Machine Learning Engineer and Software Development Engineer. With a solid foundation in programming languages such as C++, Java, and Python, along with proficiency in frameworks like PyTorch and TensorFlow, I am excited to bring my adaptable mindset and proactive problem-solving approach to dynamic teams working on cutting-edge technologies.

Profile

Originally from India, I'm now navigating life in the USA, driven by a passion for computer science. Outside of my academic pursuits, I enjoy playing cricket and table tennis, and I find immense joy in exploring hiking trails.

  • Full Name: Kushagra Dixit
  • Birth Date: March 30, 1999
  • Job: Research and Development Engineer, Software Development Engineer, Research Programmer
  • Email: kushagradixit1[at]gmail[dot]com

Skills

I bring proficiency in programming languages such as C++, Java, and Python, along with expertise in machine learning frameworks including PyTorch, Keras, and TensorFlow. My skill set extends to technologies like Spring Boot, RESTful APIs, AWS, and MongoDB, reflecting versatility across varied technical environments. With a proven ability to innovate and adapt, I am well-equipped to tackle diverse challenges in machine learning and software development.

  • Programming Languages: C++, Java, Python, JavaScript, PHP.
  • Technologies and Frameworks: Spring Boot, Microservice Architecture, RESTful APIs, PostgreSQL, AWS, MongoDB, Git, Apache Kafka, Kubernetes, GDB, Unix.
  • Machine Learning: PyTorch, Keras, TensorFlow, scikit-learn, NumPy, Pandas, Deep Learning, Transformers (Hugging Face), Word Embeddings (Word2Vec, GloVe, BERT), Large Language Models.
Resume

My Journey

Work Experience

ML Research Programmer

May 2024 - Sep 2025

University of Pennsylvania

• Built adaptive multi-agent systems for RAG, selecting reasoning strategies based on context type and complexity.
• Developed inference and evaluation pipelines in Python, LangGraph, scikit-learn, and FAISS for retrieval experiments.
• Optimized inference with parallelism, batching, and MongoDB persistence, cutting runtime from 5+ hours to 20 minutes.
• Built Neo4j knowledge graphs for temporal QA and fine-tuned LLMs with LoRA on 4x A100 GPUs.
Tech: Python, LangGraph, scikit-learn, FAISS, Neo4j, vLLM, Docker.

Research and Development Engineer

Sep 2022 - Aug 2023

Synopsys

• Developed core C++ modules for Design-for-Test (DFT) automation, transforming RTL logic into gate-level netlists for synthesis.
• Implemented RTL-level design rule checking for inserted DFT logic, reducing design iteration cycles by 20%.
• Automated debugging and validation workflows using GDB, Python, shell scripting, and Linux tooling.
Tech: C++, Python, Linux, GDB.

Software Development Engineer

Aug 2021 - Sep 2022

OYO

• Developed Spring Boot microservices (Java + PostgreSQL) handling 90K+ requests/min, deployed with Docker, Kubernetes, and AWS EC2.
• Re-architected the pricing aggregator with Java multithreading, MongoDB caching, and Kafka sync, cutting P99 latency by 1s.
• Built a price prediction pipeline (Python, scikit-learn, XGBoost) with A/B testing, boosting hotel revenue by 15%.
• Designed Spark/SQL ETL pipelines and added Redis caching + SQL indexing, improving query latency by 35% and conversions by 10%.
• Built a React + TypeScript portal with AWS S3 uploads and Kafka-based pricing updates; added Grafana/Prometheus monitoring.
Tech: Java, Python, Spring Boot, AWS, PostgreSQL, MongoDB, Kafka, Docker, Kubernetes.

Education

MS in Computer Science

Aug 2023 - Present

University of Utah

  • Courses: Machine Learning, Natural Language Processing with Deep Learning, Computer Vision, Computer Architecture, Operating Systems.

BE in Instrumentation and Control Engineering

Aug 2017 - May 2021

Netaji Subhas Institute of Technology

  • Courses: Computer Programming, Data Structures and Algorithms, Unix/Linux, AI Techniques and Applications, Robotics, Intelligent Control.
Publications
Temporal reasoning over tabular data presents substantial challenges for large language models (LLMs), as evidenced by recent research. In this study, we conduct a comprehensive analysis of temporal datasets to pinpoint the specific limitations of LLMs. Our investigation leads to enhancements in TempTabQA, a benchmark specifically designed for tabular temporal question answering, and yields critical insights for improving LLM performance on temporal reasoning tasks with tabular data. Furthermore, we introduce a novel approach, C.L.E.A.R., to strengthen LLM capabilities in this domain. Our findings demonstrate that our method improves evidence-based reasoning across various models. Additionally, our experimental results reveal that indirect supervision with auxiliary unstructured data (TRAM) substantially boosts model performance on these tasks. This work contributes to a deeper understanding of LLMs' temporal reasoning abilities over tabular data and promotes advancements in their application across diverse fields.
Temporal tabular question answering presents a significant challenge for Large Language Models (LLMs), requiring robust reasoning over structured data—a task where traditional prompting methods often fall short. These methods face challenges such as memorization, sensitivity to table size, and reduced performance on complex queries. To overcome these limitations, we introduce TEMPTABQA-C, a synthetic dataset designed for systematic and controlled evaluations, alongside a symbolic intermediate representation that transforms tables into database schemas. This structured approach allows LLMs to generate and execute SQL queries, enhancing generalization and mitigating biases. By incorporating adaptive few-shot prompting with contextually tailored examples, our method achieves superior robustness, scalability, and performance. Experimental results consistently highlight improvements across key challenges, setting a new benchmark for robust temporal reasoning with LLMs.
Temporal table reasoning is a critical challenge for Large Language Models (LLMs), requiring effective reasoning to extract relevant insights. Despite the existence of multiple prompting methods, their impact on table reasoning remains largely unexplored. Furthermore, model performance varies drastically across different table and context structures, making it difficult to determine an optimal approach. This work investigates multiple prompting techniques on diverse table types and finds that performance depends on factors such as entity type, table structure, the need for additional context, and question complexity, with no single method consistently outperforming the others. To address this, we introduce SEAR, an adaptive prompting framework inspired by human reasoning that dynamically adjusts to context and integrates structured reasoning. Our results demonstrate that SEAR achieves superior performance across all table types compared to baseline prompting techniques. Additionally, we explore the impact of table structure refactoring, finding that a unified representation enhances model reasoning.
My App: Creda AI

Personal finance, simplified.

  • Split expenses and settle instantly
  • Send and receive payments
  • Track bank transactions and insights
  • Creda AI voice-first assistant
Download My App
Creda AI app home screen
Portfolio

Check Out Some of My Work.

I am highly passionate about delving into Natural Language Processing (NLP) and Computer Vision using deep learning techniques. Below, you can find information about some of my work and projects:

Recommendations

What My Managers Say About Me.

Contact

I'd Love To Hear From You.
