First, ensure that the pretrained weights and the dataset files for all entities are properly located in `LANTERN/data`, as described in `LANTERN/data/README.md`.
Please modify `dataset_name`, `path_to_dataset`, and `save_path` according to your experiments.
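As a minimal sketch, the three fields above might be set as follows for the DeepDDI dataset. The exact file format and key names depend on the repository's actual configuration; the values shown here (paths and file name) are assumptions for illustration only:

```yaml
# Hypothetical configuration sketch — key names follow the fields mentioned above;
# paths are examples and should match your local layout under LANTERN/data.
dataset_name: DeepDDI
path_to_dataset: LANTERN/data/DeepDDI
save_path: LANTERN/results/DeepDDI
```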
DDI datasets (DeepDDI):
@article{Ha2025.02.10.637522,
  author = {Ha, Cong Nga and Pham, Phuc and Hy, Truong Son},
  title = {LANTERN: Leveraging Large Language Models and Transformers for Enhanced Molecular Interactions},
  elocation-id = {2025.02.10.637522},
  year = {2025},
  doi = {10.1101/2025.02.10.637522},
  publisher = {Cold Spring Harbor Laboratory},
  abstract = {Understanding molecular interactions such as Drug-Target Interaction (DTI), Protein-Protein Interaction (PPI), and Drug-Drug Interaction (DDI) is critical for advancing drug discovery and systems biology. However, existing methods often struggle with scalability due to the vast chemical and biological space and suffer from limited accuracy when capturing intricate biochemical relationships. To address these challenges, we introduce LANTERN (Leveraging large LANguage models and Transformers for Enhanced moleculaR interactioNs), a novel deep learning framework that integrates Large Language Models (LLMs) with Transformer-based architectures to model molecular interactions more effectively. LANTERN generates high-quality, context-aware embeddings for drug and protein sequences, enabling richer feature representations and improving predictive accuracy. By leveraging a Transformer-based fusion mechanism, our framework enhances scalability by efficiently integrating diverse interaction data while maintaining computational feasibility. Experimental results demonstrate that LANTERN achieves state-of-the-art performance on multiple DTI and DDI benchmarks, significantly outperforming traditional deep learning approaches. Additionally, LANTERN exhibits competitive performance on challenging PPI tasks, underscoring its versatility across diverse molecular interaction domains. The proposed framework offers a robust and adaptable solution for modeling molecular interactions, efficiently handling a diverse range of molecular entities without the need for 3D structural data, making it a promising framework for foundation models in molecular interaction. Our findings highlight the transformative potential of combining LLM-based embeddings with Transformer architectures, setting a new standard for molecular interaction prediction. The source code and relevant documentation are available at: https://github.com/HySonLab/LANTERN},
  URL = {https://www.biorxiv.org/content/early/2025/02/15/2025.02.10.637522},
  eprint = {https://www.biorxiv.org/content/early/2025/02/15/2025.02.10.637522.full.pdf},
  journal = {bioRxiv}
}
About
LANTERN: Leveraging Large Language Models and Transformers for Enhanced Molecular Interactions