LLMs | Installation
  1. Python Environment Setup
  2. Hugging Face CLI
  3. Hugging Face Transformers
  4. llama-cpp-python
  5. LangChain

  1. Python Environment Setup
    See this page for details on installing Python and the minimum libraries required for your development environment:
    Install Python

    Create and activate a Python virtual environment for isolated LLM development:
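    For example, using venv (the environment name llm-env is only an illustration):
      python3 -m venv llm-env
      source llm-env/bin/activate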

    To deactivate the virtual environment, run the following command:
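      deactivate
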
  2. Hugging Face CLI
    See this page for more details: https://huggingface.co/docs/huggingface_hub/main/en/guides/cli

    Hugging Face CLI: Access and download models from Hugging Face Hub.

    Install Hugging Face CLI:
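    The CLI ships with the huggingface_hub package; the [cli] extra pulls in its optional dependencies:
      pip install -U "huggingface_hub[cli]"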

    Verify installation and check version:
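    For example (available subcommands can vary slightly between huggingface_hub releases):
      huggingface-cli version
      huggingface-cli env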

    You can use the CLI to download models:
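    The model ID and target directory below are only examples:
      huggingface-cli download bert-base-uncased --local-dir ./models/bert-base-uncased
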
  3. Hugging Face Transformers
    See this page for more details: https://huggingface.co/docs/transformers/en/installation

    Transformers: Main library for working with transformer models (BERT, GPT, etc.).

    Install transformers:
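      # install the latest release inside the active virtual environment
      pip install transformers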

    To install a CPU-only version of Transformers along with the PyTorch machine learning framework, run the following command:
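    The command below follows the Transformers installation guide linked above; extras and wheel choices can vary by version:
      pip install "transformers[torch]"
      # alternatively, install a CPU-only PyTorch wheel explicitly:
      # pip install torch --index-url https://download.pytorch.org/whl/cpu
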
    To test if the installation was successful, run the following command. It should return a label and a score for the provided text:
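    For example (the first run downloads a small default sentiment-analysis model from the Hub):
      python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I love using Transformers'))"
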
    Example: Running a model using Hugging Face Transformers:
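    A minimal sketch; the file name transformers_example.py and the model ID are illustrative, and the model is downloaded from the Hub on first use:

      # transformers_example.py -- minimal text-classification example (illustrative)
      from transformers import pipeline

      # Build a sentiment-analysis pipeline with an example model from the Hub.
      classifier = pipeline(
          "sentiment-analysis",
          model="distilbert-base-uncased-finetuned-sst-2-english",
      )

      # Classify a sample sentence and print the label/score returned by the pipeline.
      result = classifier("Hugging Face Transformers makes local experimentation easy.")
      print(result)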

    Run the Python script:
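      # file name as used in the sketch above
      python transformers_example.py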

    Output:
  4. llama-cpp-python
    See these pages for more details:
    https://pypi.org/project/llama-cpp-python/
    https://python.langchain.com/docs/integrations/llms/llamacpp/

    llama-cpp-python: Run optimized LLMs locally with efficient inference.

    Install llama-cpp-python:
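      # may build from source; a C/C++ compiler and CMake may be required
      pip install llama-cpp-python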

    Test llama-cpp-python:
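    A quick import check (the __version__ attribute is expected in recent releases):
      python -c "import llama_cpp; print(llama_cpp.__version__)"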

    Download a compatible model:
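    Any GGUF-format model works; the repository and file below are just one common example:
      huggingface-cli download TheBloke/Mistral-7B-Instruct-v0.2-GGUF \
        mistral-7b-instruct-v0.2.Q4_K_M.gguf --local-dir ./models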

    Python code:
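    A minimal sketch, assuming the GGUF file downloaded above; the file name llama_example.py is illustrative:

      # llama_example.py -- minimal llama-cpp-python completion example (illustrative)
      from llama_cpp import Llama

      # Load the local GGUF model; adjust model_path to wherever the file was saved.
      llm = Llama(
          model_path="./models/mistral-7b-instruct-v0.2.Q4_K_M.gguf",
          n_ctx=2048,
      )

      # Run a simple completion and print the generated text.
      output = llm(
          "Q: What is a large language model? A:",
          max_tokens=128,
          stop=["Q:"],
      )
      print(output["choices"][0]["text"])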

    Run the Python script:
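      # file name as used in the sketch above
      python llama_example.py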

    Output:
  5. LangChain
    See this page for more details: https://python.langchain.com/docs/how_to/installation/

    LangChain: Framework for building applications powered by language models.

    To install the main LangChain package:
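    The package names below match the PyPI distributions referenced in the LangChain installation guide:
      pip install langchain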

    To install the LangChain core package:
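      pip install langchain-core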

    To install the LangChain community package:
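      pip install langchain-community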

    To install the LangChain Command Line Interface (CLI) package:
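      pip install langchain-cli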

    To test the installation of the LangChain CLI package:
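    For example (the package installs a langchain-cli entry point; exact flags can vary by version):
      langchain-cli --version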