Hm Xiong (hmxiong)

Focusing

My research interests lie in the fields of MLLMs, agents, LLM reasoning, and inference optimization. I am currently seeking internship opportunities.


Pinned

  1. paper-reading (Public)

    Forked from mli/paper-reading

    Close, paragraph-by-paragraph readings of deep learning classics and new papers

    2 stars

  2. PathWeave (Public)

    Forked from JiazuoYu/PathWeave

    Code for the paper "LLMs Can Evolve Continually on Modality for X-Modal Reasoning" (NeurIPS 2024)

    Jupyter Notebook · 3 stars

  3. 3UR-LLM (Public)

    Official repo for "3UR-LLM: An End-to-End Multimodal Large Language Model for 3D Scene Understanding" (TMM 2025)

    Python · 13 stars

  4. StreamChat (Public)

    Official repo for "Streaming Video Understanding and Multi-round Interaction with Memory-enhanced Knowledge" (ICLR 2025)

    Python · 101 stars · 3 forks

  5. CUDA-Learn-Note (Public)

    Forked from xlite-dev/LeetCUDA

    CUDA notes / hand-written CUDA kernels for large models / C++ notes, updated occasionally: flash_attn, sgemm, sgemv, warp reduce, block reduce, dot product, elementwise, softmax, layernorm, rmsnorm, hist, etc.

    Cuda · 1 star

  6. Awesome-LLM-Inference (Public)

    Forked from xlite-dev/Awesome-LLM-Inference

    A curated list of awesome LLM/VLM inference papers with code: WINT8/4, Flash-Attention, Paged-Attention, parallelism, etc.

    1 star