Publications

(*) denotes equal contribution.

For a complete list, visit my Google Scholar profile.

2024

  1. Preprint
    Quadratic Gating Functions in Mixture of Experts: A Statistical Insight
    Pedram Akbarian*, Huy Nguyen*, Xing Han*, and Nhat Ho
    arXiv:2410.11222, 2024
    Under review
  2. Preprint
    Understanding Expert Structures on Minimax Parameter Estimation in Contaminated Mixture of Experts
    Fanqi Yan, Huy Nguyen, Dung Le, Pedram Akbarian, and Nhat Ho
    arXiv:2410.12258, 2024
    Under review
  3. Preprint
    Statistical Advantages of Perturbing Cosine Router in Sparse Mixture of Experts
    Huy Nguyen, Pedram Akbarian*, Trang Pham*, Trang Nguyen, Shujian Zhang, and Nhat Ho
    arXiv:2405.14131, 2024
    Under review
  4. ICML
    Improving Computational Complexity in Statistical Models with Local Curvature Information
    Pedram Akbarian*, Tongzheng Ren*, Jiacheng Zhuo, Sujay Sanghavi, and Nhat Ho
    In International Conference on Machine Learning (ICML), 2024
  5. ICML
    Is Temperature Sample Efficient for Softmax Gaussian Mixture of Experts?
    Huy Nguyen, Pedram Akbarian, and Nhat Ho
    In International Conference on Machine Learning (ICML), 2024
  6. ICML
    A General Theory for Softmax Gating Multinomial Logistic Mixture of Experts
    Huy Nguyen, Pedram Akbarian, TrungTin Nguyen, and Nhat Ho
    In International Conference on Machine Learning (ICML), 2024
  7. ICLR
    Statistical Perspective of Top-K Sparse Softmax Gating Mixture of Experts
    Huy Nguyen, Pedram Akbarian, Fanqi Yan, and Nhat Ho
    In International Conference on Learning Representations (ICLR), 2024

2022

  1. NeurIPS
    Improving Counterfactual Explanations for Time Series Classification Models in Healthcare Settings
    Tina Han, Jette Henderson, Pedram Akbarian, and Joydeep Ghosh
    In NeurIPS Workshop on Learning from Time Series for Health, 2022